[IEEE TENCON 2013 - 2013 IEEE Region 10 Conference - Xi'an, China (2013.10.22-2013.10.25)] 2013 IEEE International Conference of IEEE Region 10 (TENCON 2013) - Behavior observation



Behavior Observation Method for the Cricket

Y. Okuda and S. Takahashi Kagawa University

Department of Intelligent Mechanical Systems Engineering 2217-20 Hayashi-cho, Takamatsu, Kagawa 761-0396, Japan

E-mail: [email protected]

K. Kawabata RIKEN(The Institute of Physical and Chemical Research)

RIKEN-XJTU Joint Research Unit, Global Research Cluster 2-1 Hirosawa, Wako, Saitama 351-0198, Japan

E-mail: [email protected]

H. Aonuma Hokkaido University

Complex Systems Group, Research Institute for Electronic Science Kita14, Nishi7, Kita-ku, Sapporo, 060-0812, Japan

E-mail: [email protected]

K. Iwata and Y. Satoh The National Institute of AIST

Intelligent Systems Research Institute 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568, Japan

E-mail: {kenji.iwata, yu.satou}@aist.go.jp

Abstract—Research on the habits and characteristics of creatures is now widely based on image processing. Observing the behavior of the cricket Gryllus bimaculatus is important for understanding its ecology. In this paper, we introduce an image-processing algorithm and construct an image-observation technique that detects crickets based on an advanced statistical reach feature and generates their action trajectories. Through experiments, we demonstrate the effectiveness of the proposed method.

Keywords—cricket Gryllus bimaculatus; advanced statistical reach feature; observation of avoidance behavior

I. INTRODUCTION

Observing the behavior of creatures attracts attention in various fields. For example, insects, which possess only a micro brain, can adapt their actions to the situation. We aim to observe and analyze the behavior of the cricket Gryllus bimaculatus; in particular, we treat the aggressive attacking behavior of males as in [1]. The time cost and misidentification involved in human-eye evaluation are often problematic, so image processing plays a key role in improving creature observation. A method that automatically derives the action trajectories of creatures and supports quantitative observation is therefore important. In [1], a tracking method using a particle filter based on the colors of crickets and robots had difficulty distinguishing crickets of the same color. In this paper, we introduce a new method that detects G. bimaculatus by background subtraction based on the statistical reach feature (SRF) [2] and generates its action trajectory. Because the statistical reach feature is robust to illumination changes, and because our discrimination method exploits the uniformly accelerated linear motion of G. bimaculatus, we can prevent false detection caused by other creatures. Through experiments, we demonstrate the effectiveness of the proposed method.

II. DETECTION OF CRICKET

We describe the detection of target crickets by background subtraction based on the statistical reach feature. The statistical reach feature is stabilized by the probability of the brightness order over a set of images. First, let U = {I_1, I_2, ..., I_M} be a set of images, let Γ be the set of pixel coordinates of an image with brightness values I(i, j), and let p = (i_p, j_p). We then evaluate a point pair (p, q) by the probability of its brightness order over the image set U. Equation (1) defines the probability P⁺(p, q; T_p) that the brightness of p exceeds that of q by at least a threshold T_p:

$$P^{+}(\mathbf{p}, \mathbf{q}; T_p) := \frac{\left|\{\, I \in U : I(\mathbf{p}) - I(\mathbf{q}) \ge T_p \,\}\right|}{|U|} \qquad (1)$$

where |U| is the number of images in the set U. Similarly, the probability P⁻(p, q; T_p) that the brightness of p falls below that of q by at least the threshold T_p is:

$$P^{-}(\mathbf{p}, \mathbf{q}; T_p) := \frac{\left|\{\, I \in U : I(\mathbf{q}) - I(\mathbf{p}) \ge T_p \,\}\right|}{|U|} \qquad (2)$$

In this way we obtain robust point pairs (p, q), with the threshold T_p chosen according to the noise level of the images in U. Further, SRF(p ≻ q) is defined from these probabilities and a threshold T_R (0.5 ≤ T_R < 1) by equation (3):

$$\mathrm{SRF}(\mathbf{p} \succ \mathbf{q}) := \begin{cases} 1 & \left(P^{+}(\mathbf{p},\mathbf{q};T_p) \ge T_R\right) \\ 0 & \left(P^{-}(\mathbf{p},\mathbf{q};T_p) \ge T_R\right) \\ \varphi & (\text{otherwise}) \end{cases} \qquad (3)$$
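As a sketch of how equations (1)–(3) combine, the following Python function classifies a pixel pair over an image set. The function name `srf` and the defaults for `T_p` and `T_R` are our own choices for illustration; this is not the authors' implementation.

```python
def srf(U, p, q, T_p=10, T_R=0.8):
    """Statistical reach feature SRF(p > q) over an image set U (eqs. 1-3).

    U    : list of 2-D brightness arrays (lists of lists), the image set
    p, q : (row, col) pixel coordinates
    Returns 1, 0, or None (None standing in for phi, an unstable pair).
    """
    M = len(U)
    diffs = [I[p[0]][p[1]] - I[q[0]][q[1]] for I in U]
    P_plus = sum(d >= T_p for d in diffs) / M    # eq. (1)
    P_minus = sum(d <= -T_p for d in diffs) / M  # eq. (2)
    if P_plus >= T_R:
        return 1       # p is stably brighter than q
    if P_minus >= T_R:
        return 0       # p is stably darker than q
    return None        # phi: the pair is not stable enough
```

A pair whose brightness order flips across the image set falls into neither probability bucket and is discarded as φ, which is what makes the feature robust to noise.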

978-1-4799-2827-9/13/$31.00 ©2013 IEEE


The background subtraction is based on the statistical reach feature, as in [2] and [3]. We construct the background model from the statistical reach features of a set of images. A set of points Q = {q_1, q_2, ..., q_K} is extracted at random on the image. The local reach graph locRG(p) consists of only those pairs (p, q_k) for which SRF(p ≻ q_k) ≠ φ among the randomly selected points, as in equation (4):

$$\mathrm{locRG}(\mathbf{p}) := \{\, (\mathbf{p}, \mathbf{q}_k) \mid \mathbf{q}_k \in Q,\; \mathrm{SRF}(\mathbf{p} \succ \mathbf{q}_k) \ne \varphi \,\} \qquad (4)$$
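Equation (4) might be realized as follows. This sketch repeats the SRF test inline so that it is self-contained; the function name `loc_rg` and the list-of-(q, value)-pairs representation of the model are our own assumptions.

```python
def loc_rg(U, p, Q, T_p=10, T_R=0.8):
    """Local reach graph locRG(p) (eq. 4): keep only the random reference
    points q in Q for which SRF(p > q) is defined (i.e. not phi).
    Returns a list of (q, srf_value) pairs -- the background model at p."""
    M = len(U)
    graph = []
    for q in Q:
        diffs = [I[p[0]][p[1]] - I[q[0]][q[1]] for I in U]
        if sum(d >= T_p for d in diffs) / M >= T_R:
            graph.append((q, 1))   # SRF(p > q) = 1
        elif sum(d <= -T_p for d in diffs) / M >= T_R:
            graph.append((q, 0))   # SRF(p > q) = 0
    return graph                   # unstable pairs (phi) are dropped
```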

The background model is obtained by computing the local reach graph locRG(p) for every pixel, and it is then used for background subtraction. To perform background subtraction based on the statistical reach feature, equation (5) defines the incremental sign b(p ≻ q) of a point pair (p, q) on the input image J:

$$b(\mathbf{p} \succ \mathbf{q}) := \begin{cases} 1 & \left(J(\mathbf{p}) - J(\mathbf{q}) \ge 0\right) \\ 0 & (\text{otherwise}) \end{cases} \qquad (5)$$

Then we calculate B(p), the correlation between locRG(p) and b(p ≻ q) at the same points, as follows:

$$B(\mathbf{p}) := \frac{\left|\{\, (\mathbf{x}, \mathbf{y}) \in \mathrm{locRG}(\mathbf{p}) \mid \mathrm{SRF}(\mathbf{x} \succ \mathbf{y}) = b(\mathbf{x} \succ \mathbf{y}) \,\}\right|}{|\mathrm{locRG}(\mathbf{p})|} \qquad (6)$$

Binarization is performed with a threshold T_B (0 ≤ T_B < 1) to decide foreground/background. According to (7), the binarized image C(p) is defined by the correlation coefficient B(p) and the threshold T_B:

$$C(\mathbf{p}) := \begin{cases} 1 & \left(B(\mathbf{p}) \ge T_B\right) \\ 0 & (\text{otherwise}) \end{cases} \qquad (7)$$

C(p) = 1 denotes a foreground pixel and C(p) = 0 a background pixel: a target pixel that is correlated with the background model based on the statistical reach feature is classified as foreground, and an uncorrelated pixel as background. For example, Fig. 1(a) shows the input image and Fig. 1(b) the result of background subtraction by the statistical reach feature; the SRF image shows only the detected crickets.
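Equations (5)–(7) can be sketched as a per-pixel decision. The function name and the list-of-pairs model representation are assumptions carried over from the locRG sketch, and the returned value follows the paper's stated convention for C(p); this is not the authors' code.

```python
def classify_pixel(J, p, loc_rg_p, T_B=0.9):
    """Foreground/background decision for pixel p (eqs. 5-7).

    J        : current input frame (2-D list of brightness values)
    loc_rg_p : background model at p, a list of (q, srf_value) pairs
    Returns C(p) = 1 when the observed brightness order agrees with the
    stored SRF values often enough (B(p) >= T_B), else 0.
    """
    if not loc_rg_p:
        return 0
    agree = 0
    for q, srf_pq in loc_rg_p:
        b_pq = 1 if J[p[0]][p[1]] - J[q[0]][q[1]] >= 0 else 0  # eq. (5)
        agree += (b_pq == srf_pq)
    B = agree / len(loc_rg_p)      # eq. (6): correlation coefficient
    return 1 if B >= T_B else 0    # eq. (7): binarization
```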

Fig. 1. Background subtraction based on SRF: (a) input image, (b) SRF image.

Fig. 2. Image position of cricket G. bimaculatus (search area, center-of-gravity area, and center-of-gravity position).

III. OBSERVATION METHOD OF CRICKET

The proposed observation method is based on background subtraction by SRF, as in [4]. It detects cricket positions by setting a search area, applies a discrimination method, and prevents false detection caused by other creatures.

A. Position of Cricket

In order to detect cricket positions in SRF images, we set up a search area of S × S pixels; pixels outside the search area are treated as background. The M × M calculation area that takes the maximum value w_ij within the search area is extracted, and the detected position of a cricket is given by the center of gravity of the extracted area, as in Fig. 2. It is important to extract the area that contains the most foreground pixels. Equation (8) defines the value w_ij as

$$w_{ij} := \left|\{\, (a, b) \in T \mid C(a, b) = 1 \,\}\right| \qquad (8)$$

where T is the set of pixel coordinates in the search area. We then compute the center of gravity of the foreground pixels produced by SRF-based background subtraction within the extracted area, as follows:

$$\bar{I} = \frac{1}{n} \sum_{m=0}^{n-1} I_m, \qquad \bar{J} = \frac{1}{n} \sum_{m=0}^{n-1} J_m, \qquad (I_m, J_m) \in \text{foreground} \quad (m = 0, \ldots, n-1) \qquad (9)$$

Here n is the number of foreground pixels in the extracted area and m indexes the foreground pixels. The detected position is taken as the position of the cricket, and the center of the search area at the next time step is set to this position. By repeating this over the time series, we can track the position of the target cricket.
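The search for the maximum-w_ij window and the center-of-gravity computation of equations (8) and (9) might look like the brute-force sketch below. The function name is ours, and the sketch assumes the search area contains at least one foreground pixel.

```python
def detect_position(C, center, S=61, M=35):
    """Eqs. (8)-(9): inside the S x S search area around `center`, find the
    M x M window with the most foreground pixels (the maximum w_ij), then
    return the centre of gravity of the foreground pixels in that window."""
    H, W = len(C), len(C[0])
    # Clip the S x S search area to the image bounds.
    r0, r1 = max(center[0] - S // 2, 0), min(center[0] + S // 2 + 1, H)
    c0, c1 = max(center[1] - S // 2, 0), min(center[1] + S // 2 + 1, W)
    best, best_win = -1, (r0, c0)
    for i in range(r0, r1 - M + 1):
        for j in range(c0, c1 - M + 1):
            w_ij = sum(C[y][x] for y in range(i, i + M)
                       for x in range(j, j + M))        # eq. (8)
            if w_ij > best:
                best, best_win = w_ij, (i, j)
    i, j = best_win
    # Centre of gravity of foreground pixels in the winning window, eq. (9).
    fg = [(y, x) for y in range(i, i + M)
          for x in range(j, j + M) if C[y][x]]
    n = len(fg)
    return (sum(y for y, _ in fg) / n, sum(x for _, x in fg) / n)
```

The returned point then seeds the search area at the next time step.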

B. Method of Discrimination

In order to discriminate the target from other crickets, we estimate the target position from the cricket's motion. After detecting the position of a cricket as in Section III-A, we estimate the target cricket's position based on uniformly accelerated linear motion. To estimate the target position at time k, we use the detected positions from time k−1 to time k−3. Based on equation (10), we calculate the estimated

Page 3: [IEEE TENCON 2013 - 2013 IEEE Region 10 Conference - Xi'an, China (2013.10.22-2013.10.25)] 2013 IEEE International Conference of IEEE Region 10 (TENCON 2013) - Behavior observation

Fig. 3. Method of discrimination of two G. bimaculatus: the detected position (i_2, j_2) is removed when L_D < C_L and kept when L_D ≥ C_L.

positions (î_pm, ĵ_pm) at time k using the velocities v_i, v_j and accelerations a_i, a_j as follows:

$$\hat{i}_{pm} = i_{k-1} + v_i \Delta T + \frac{1}{2} a_i \Delta T^2, \qquad \hat{j}_{pm} = j_{k-1} + v_j \Delta T + \frac{1}{2} a_j \Delta T^2 \qquad (10)$$

where ΔT is the sampling time. We define D_m as an evaluation value given by equation (11); D_m is the distance between the estimated position (î_pm, ĵ_pm) and each detected cricket position (i_m, j_m):

$$D_m = \sqrt{(\hat{i}_{pm} - i_m)^2 + (\hat{j}_{pm} - j_m)^2} \qquad (11)$$

The candidate that takes the minimum D_m is selected as the detected position of the target cricket. When another cricket approaches the target cricket, the overlapping position is removed based on equation (11), and the new cricket position is re-expressed away from the other cricket. Here the longest axis of the ellipse approximating a cricket is 2C_L, so C_L expresses the cricket's radius. We repeat the position calculation of the redefined cricket within its search area; when the distance D_m is greater than C_L, the calculated position is regarded as that of another cricket.
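The prediction of equation (10) and the minimum-D_m selection of equation (11) can be sketched as follows; the function names and the chronological-list track representation are our own assumptions.

```python
import math

def predict(track, dT=1.0):
    """Eq. (10): estimate the position at time k from the detected positions
    at k-3, k-2, k-1 (track, in chronological order), assuming uniformly
    accelerated linear motion."""
    (i3, j3), (i2, j2), (i1, j1) = track[-3], track[-2], track[-1]
    vi, vj = (i1 - i2) / dT, (j1 - j2) / dT    # velocity at k-1 (first difference)
    ai = (i1 - 2 * i2 + i3) / dT ** 2          # acceleration (second difference)
    aj = (j1 - 2 * j2 + j3) / dT ** 2
    return (i1 + vi * dT + 0.5 * ai * dT ** 2,
            j1 + vj * dT + 0.5 * aj * dT ** 2)

def nearest(pred, candidates):
    """Eq. (11): among candidate detections, pick the one minimising D_m,
    the distance to the predicted position."""
    return min(candidates, key=lambda c: math.hypot(pred[0] - c[0],
                                                    pred[1] - c[1]))
```

Candidates whose D_m exceeds C_L would then be attributed to another cricket, as described above.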

IV. EXPERIMENTAL RESULT

Through experiments, we verified the effectiveness of the proposed method. We obtained 501 time-series images and selected 30 of them to construct the background model. The parameters were: search area S = 61 × 61 pixels, detection area M = 35 × 35 pixels, number of reach candidates K = 32, brightness-difference threshold T_p = 105, foreground/background threshold T_B = 0.9, stability threshold T_R = 0.8, and cricket radius C_L = 25 pixels. We denote the first image as Time = 0 and the last image as Time = 500. Fig. 4(a) shows the search areas (green) and the positions of the crickets, i.e., the centers of the detection areas, in different colors (red, blue, pink, violet); it presents the observed trajectory of each color based on the proposed method. Furthermore, Fig. 4(b) shows that only the crickets are detected using SRF. Fig. 4 demonstrates that the proposed method can calculate the cricket trajectories.

Fig. 4. Trajectories of four crickets at Time k = 0, 250, and 500: (a) time-series images, (b) SRF images.

Fig. 5. Comparison of observation accuracy (trajectories in i-j image coordinates).

TABLE I. RESULT OF OBSERVATION ACCURACY

Cricket              Minimum Error   Maximum Error   Average Error
Cricket 1 (red)      0 pixels        10.63 pixels     4.42 pixels
Cricket 2 (blue)     0 pixels        16.12 pixels     6.68 pixels
Cricket 3 (pink)     0 pixels        22.20 pixels     5.95 pixels
Cricket 4 (violet)   0 pixels        37.11 pixels    14.00 pixels

Fig. 5 shows a comparison between the trajectories of the proposed method and the true trajectories. The true trajectories were


obtained by plotting the center of the target cricket. Table I shows the distance between the calculated and true trajectories. As the results show, the proposed method keeps the average error within 15 pixels. Therefore, the proposed method calculates high-accuracy trajectories even with plural targets, and even when detection by background subtraction based on the statistical reach feature is temporarily difficult.

V. CONCLUSION

In this paper, we proposed a new method to observe crickets. In particular, we considered an observation method that is robust to illumination changes and to the similarity of crickets. For this purpose, we proposed a method based on background subtraction using the statistical reach feature. The calculation area that takes the maximum value in the search area is extracted, and the position of a cricket is detected as the center of gravity of the extracted area. Furthermore, the proposed discrimination method derives correct results even when the target cricket approaches other crickets. When detection by background subtraction based on the statistical reach feature is temporarily difficult, our method expands the search area according to the cricket's motion in order to keep tracking the target. Experiments on cricket observation confirmed that the proposed method is effective and can calculate cricket trajectories. In the future, we will continue experimenting with G. bimaculatus and will consider observation methods for cases in which crickets perform extremely irregular motion, such as fighting.

REFERENCES

[1] K. Kawabata, H. Aonuma, K. Hosoda and J. Xue, “A System for Automated Interaction with the Cricket Utilizing a Micro Mobile Robot”, Journal of Robotics and Mechatronics, Vol.25, No.2, pp.333-339, 2013.

[2] K. Iwata, Y. Satoh, R. Ozaki and K. Sakaue, “Robust Background Subtraction based on Statistical Reach Feature Method”, IEICE Trans. D, Vol.J92-D, No.8, pp.1251-1259, 2009.

[3] K. Iwata, Y. Satoh, R. Ozaki and K. Sakaue, “Robust Background Subtraction by Statistical Reach Feature on Random Reference Points”, Proceedings of the 18th Korea-Japan Joint Workshop on Frontiers of Computer Vision, pp.188-192, 2012.

[4] Y. Okuda, S. Takahashi, K. Minagawa, K. Iwata and Y. Satoh, “Behavior Observation of Periplaneta Americana based on Advanced Statistical Reach Feature”, Proceedings of Dynamic Image processing for real Application (CD-ROM), pp.125-130, 2013.

[5] T. Matsuyama, T. Wada, H. Habe and K. Tanagashi, “Background Subtraction under Varying Illumination”, IEICE Trans. D-Ⅱ, Vol.J84-D-Ⅱ, No.10, pp.2201-2211, 2001.

[6] F. Ullah and S. Kaneko, “Using orientation codes for rotation invariant template matching”, Pattern Recognition, Vol.37, No.2, pp.201-209, 2004.

[7] T. Honda, S. Takahashi, H. Takauji and S. Kaneko, “Robust Tracking Method to Moving Object with Random Walk”, IIEEJ Trans., Vol.41, No.4, pp.360-365, 2012.

[8] Z. Xinyue, Y. Satoh, H. Takauji, S. Kaneko, K. Iwata and R. Ozaki, “Object Detection Based on a Robust and Accurate Statistical Multi-point-pair Model”, Pattern Recognition, Vol.44, No.6, pp.1296-1311, 2011.

[9] N. Wajima, S. Takahashi, M. Itoh, Y. Satoh and S. Kaneko, “Tracking Method of Moving Object by Dynamic Image Processing Based on Radial Reach Filter”, IEEJ Trans. C, Vol.129, No.10, pp.1942-1948, 2009.

[10] N. Goda, H. Kamada, S. Takahashi, H. Takauji and S. Kaneko, “Cell Detection and Tracking Method combined Orientation Code and Particle Filter”, Proceedings of Dynamic Image processing for real Application (CD-ROM), pp.30-35, 2013.