
Autonomous Flight of Small Helicopter with Real-Time Camera Calibration

Takanori Matsukawa, Shogo Arai, Koichi Hashimoto

Abstract— We propose a real-time camera calibration method for autonomous flight of a small helicopter. Our purpose is to control a small helicopter automatically using cameras fixed on the ground. We use calibrated cameras, un-calibrated cameras, and a small helicopter that is not equipped with any sensors. The proposed method finds correspondences between image features in the two images of a calibrated camera and an un-calibrated camera, and estimates the extrinsic parameters of the cameras using a particle filter in real time. We evaluate the utility of the proposed method by experiments. In a flight experiment, we compare real-time calibration by typical bundle adjustment with a Gauss-Newton method to the proposed method. The autonomous flight of the small helicopter can be achieved by the proposed method in a situation where flight cannot be achieved with typical bundle adjustment with a Gauss-Newton method.

I. INTRODUCTION

A multi-camera system is widely used for tracking and measurement [1], [7], [13], [17]. In particular, cameras fixed on the ground are often used to measure the position and orientation of a small unmanned helicopter, since the payload of a small helicopter is too low to carry sensors such as GPS and IMU. The small helicopter has the advantages of low risk when it goes down and the ability to fly slowly and at low altitude, hover, and take off and land vertically. However, flight stability and control require precise measurement of the small helicopter's angular orientation and position, since small disturbances, such as wind, affect the flight adversely. Thus we have to find the relative position and orientation between the cameras for precise measurement. This process is called camera calibration.

Camera calibration determines the intrinsic parameters, such as focal length and distortion, and the extrinsic parameters, such as the position and orientation of the camera. A number of calibration methods have been proposed. Classical camera calibration is performed by observing a calibration object whose geometry in 3D space is known with very good precision [18]. Therefore, this approach requires an expensive calibration apparatus and an elaborate setup. However, autonomous flight cannot be achieved without precise calibration of the multiple cameras.

Camera calibration methods for moving vehicles have been studied. A calibration algorithm that estimates the relative orientations among multiple non-overlapping cameras has been proposed for flight of a small helicopter [12].

T. Matsukawa, S. Arai, and K. Hashimoto are with the Department of System Information Sciences, Tohoku University, Aramaki Aza Aoba 6-6-01, Sendai, Japan {matsukawa, arai, hashimoto}@ic.is.tohoku.ac.jp

This method calibrates the camera positions before the flight and assumes that the relative orientations among the multiple cameras are given. In addition, a camera calibration method with SFM (Structure from Motion) for autonomous flight of a helicopter has been presented in [4]. However, these previous works assume that at least the positions or orientations of the cameras are calibrated before the autonomous flight. Thus the flight range is limited, since a number of calibrated cameras are necessary to expand the range, and this takes much time and effort.

To solve this problem, we propose a real-time calibration method for autonomous flight of a small helicopter. The proposed method calibrates cameras from the images obtained by calibrated and un-calibrated cameras and from the position and orientation of the helicopter, which is computed from the images of the calibrated cameras. It is difficult to identify the intrinsic and extrinsic parameters accurately at the same time [8]. In addition, methods to estimate the intrinsic parameters accurately have been proposed [18]. Thus, we assume that the intrinsic parameters are calibrated and calibrate only the extrinsic parameters. The proposed method consists of two parts. First, the proposed method finds correspondences between image features extracted from the un-calibrated and calibrated cameras by bundle adjustment. Then the method estimates the extrinsic parameters with a particle filter. We show that the computation cost of the proposed method for finding the correspondences is much less than that of the full search method. The experimental result shows that flight of the helicopter can be achieved using the proposed method in a situation where the flight cannot be achieved by bundle adjustment with a Gauss-Newton method alone.

This paper is organized as follows. Section 2 describes bundle adjustment based on a pinhole camera model. Section 3 presents the proposed method. Section 4 shows flight experiments of the unmanned small helicopter with calibration. Finally, Section 5 describes conclusions.

II. BUNDLE ADJUSTMENT

Camera calibration identifies the intrinsic parameters and the extrinsic parameters. Bundle adjustment is often used for multiple-camera calibration and 3D shape reconstruction. We use bundle adjustment in a part of the proposed method. In this section, we describe bundle adjustment.

We now present a pinhole camera model to explain bundle adjustment. The 3D position of image feature j = 1, ..., N and its image feature are denoted by $q_j = [x_j\ y_j\ z_j]^\top$ and $\xi_{ij} = [u_{ij}\ v_{ij}]^\top$, respectively. A pinhole camera model is given by

$$ s \begin{bmatrix} \xi_{ij} \\ 1 \end{bmatrix} = P_i \begin{bmatrix} q_j \\ 1 \end{bmatrix}, \qquad (1) $$

where $P_i$ and $s$ denote the perspective projection matrix of camera i = 1, ..., M and an arbitrary scale factor, respectively. The perspective projection matrix is represented by

$$ P_i := K_i [R_i \mid t_i], \qquad (2) $$

where $K_i$ is the intrinsic matrix of camera i, $R_i$ is the rotation matrix, and $t_i$ is the translation vector. The position and orientation of camera i are determined by $R_i$ and $t_i$.
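For concreteness, the following is a minimal Python/numpy sketch of Eqs. (1) and (2); the intrinsic and extrinsic values in the example are hypothetical, not taken from the paper.

```python
import numpy as np

def project(K, R, t, q):
    """Project a 3D point q to image coordinates via Eqs. (1)-(2)."""
    P = K @ np.hstack([R, t.reshape(3, 1)])  # P_i = K_i [R_i | t_i], Eq. (2)
    xi_h = P @ np.append(q, 1.0)             # s [xi; 1] = P [q; 1], Eq. (1)
    return xi_h[:2] / xi_h[2]                # divide out the scale factor s

# Hypothetical camera: 600 px focal length, 640x480 image, 1 m offset.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 1.0])
print(project(K, R, t, np.array([0.1, 0.05, 1.0])))  # -> [350. 255.]
```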

In bundle adjustment, the camera calibration problem is described as follows.

Problem 1: It is assumed that correspondences of image features among all cameras are given. Let $\hat{K}_i$, $\hat{R}_i$, $\hat{t}_i$, and $\hat{q}_j$ denote estimates of $K_i$, $R_i$, $t_i$, and $q_j$, respectively. A cost function is defined by

$$ J = \sum_{(i,j) \in S} \left[ \left( u_{ij} - u(\hat{K}_i, \hat{R}_i, \hat{t}_i, \hat{q}_j) \right)^2 + \left( v_{ij} - v(\hat{K}_i, \hat{R}_i, \hat{t}_i, \hat{q}_j) \right)^2 \right], \qquad (3) $$

where $u$ and $v$ are functions of $\hat{K}_i$, $\hat{R}_i$, $\hat{t}_i$, and $\hat{q}_j$, and $S$ is the set of index pairs (i, j) of points in each image, excluding points that cannot be observed. The functions $u$ and $v$ give the 2D image coordinates of feature j computed from $\hat{K}_i$, $\hat{R}_i$, $\hat{t}_i$, and $\hat{q}_j$. The cost function J represents the reprojection error. Find

$$ K_i^*, R_i^*, t_i^*, q_j^* = \arg\min_{\hat{K}_i, \hat{R}_i, \hat{t}_i, \hat{q}_j} J. \qquad (4) $$
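As an illustration, the reprojection error J of Eq. (3) can be evaluated as below, reusing project() from the previous sketch; observations is a hypothetical mapping from (i, j) ∈ S to the observed feature [u_ij, v_ij]. Problem 1 itself can then be handed to a nonlinear least-squares solver, e.g. scipy.optimize.least_squares, which offers Levenberg-Marquardt and related Gauss-Newton-style methods.

```python
def reprojection_error(K_hat, R_hat, t_hat, q_hat, observations):
    """Cost J of Eq. (3): sum of squared reprojection errors over S."""
    J = 0.0
    for (i, j), uv in observations.items():
        uv_est = project(K_hat[i], R_hat[i], t_hat[i], q_hat[j])
        J += float(np.sum((np.asarray(uv) - uv_est) ** 2))
    return J
```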

III. PROPOSED METHOD

In this section, we propose a real-time camera calibration method for autonomous flight of a small helicopter. We focus on estimating the extrinsic parameters, since the intrinsic and extrinsic parameters cannot be identified accurately at the same time and methods to estimate the intrinsic parameters accurately have been proposed [18].

We propose a real-time method to calibrate the un-calibrated cameras using the image features extracted from the calibrated and un-calibrated cameras. In most cases, the accuracy of the camera calibration increases with the number of image features. However, the computation cost also increases with the number of image features, since we have to find the optimal correspondence of image features between the calibrated and un-calibrated cameras (Fig. 1 shows an example of correspondences). In particular, the cost of finding the optimal correspondence grows exponentially when we compute the reprojection errors given by Eq. (3) for all possible correspondences; we call this the full search method in this paper.

Fig. 1. Correspondence of features.

Thus we propose an effective correspondence-finding method whose computation cost is much less than that of the full search method. We then show a method to estimate the extrinsic parameters with a particle filter. We use bundle adjustment to find correspondences between image features, since the computation cost of bundle adjustment is usually less than that of the particle filter. Bundle adjustment with a Gauss-Newton method is often used for multiple-camera calibration [14], [16]. However, image noise affects the estimation of the extrinsic parameters adversely. Therefore, we adopt the particle filter to estimate the parameters.

In what follows, we describe the system model, a method to find correspondences of image features, and a method to estimate the extrinsic parameters using the particle filter. In this paper, we estimate only the extrinsic parameters, since the intrinsic parameters of the un-calibrated cameras are assumed to be given.

A. System Model

It is assumed that multiple coplanar features are attached

to the helicopter (see Fig. 5 for an example). Let N denote the number of image features. The numbers of calibrated and un-calibrated cameras are denoted by K and L, respectively. The image features of calibrated camera k ∈ {1, ..., K} are defined by

$$ \xi_{g_k} = [\xi_{1 g_k}\ \xi_{2 g_k}\ \cdots\ \xi_{N g_k}], \qquad (5) $$

and the image features of un-calibrated camera ℓ ∈ {1, ..., L} are defined by

$$ \xi_{\ell} = [\xi_{1\ell}\ \xi_{2\ell}\ \cdots\ \xi_{N\ell}]. \qquad (6) $$

The proposed method is independent of the algorithm used to extract image features; SIFT and SURF are often used for this purpose [2], [15]. Let us describe a state space model of the system. The position and orientation of the helicopter and of un-calibrated camera ℓ are represented by $x \in \mathbb{R}^6$ and $p_\ell$, respectively. The state variable and the observed variable are denoted by

$$ X(t) = [x^\top(t)\ p_1^\top(t)\ p_2^\top(t)\ \cdots\ p_L^\top(t)]^\top \in \mathbb{R}^{6(1+L)} $$

and

$$ \xi(t) = [\xi_1^\top\ \xi_2^\top\ \cdots\ \xi_L^\top\ \xi_{g_1}^\top\ \xi_{g_2}^\top\ \cdots\ \xi_{g_K}^\top]^\top \in \mathbb{R}^{8(L+K)}, $$

respectively. The state space model of the system is given by

$$ X(t) = f(X(t-1)) + w(t), \qquad (7) $$
$$ \xi(t) = H(X(t)) + v(t), \qquad (8) $$


where $w(t) \in \mathbb{R}^{6(1+L)}$ and $v(t) \in \mathbb{R}^{8(L+K)}$ are white noise, and

$$ H = [P_1^\top\ P_2^\top\ \cdots\ P_L^\top\ P_{g_1}^\top\ P_{g_2}^\top\ \cdots\ P_{g_K}^\top]^\top \in \mathbb{R}^{8(L+K) \times 6(1+L)}. $$

Here $P_{g_k}$ is the perspective projection matrix of calibrated camera k. When the un-calibrated cameras are fixed, the system noise for the cameras is zero; in other words, $w(t) = [w_1\ w_2\ \cdots\ w_6\ 0\ 0\ \cdots\ 0]^\top$.
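The following is a minimal sketch of the state layout and the prediction step of Eq. (7), assuming for illustration that f is the identity (a random-walk model) with a hypothetical noise scale; only the helicopter part of the state receives process noise, so the fixed camera poses are left unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(X, L, sigma=0.01):
    """One step of Eq. (7) for X = [x; p_1; ...; p_L] (6 entries per pose).

    Noise enters only the helicopter pose x, matching
    w(t) = [w_1 ... w_6 0 ... 0]^T for fixed un-calibrated cameras."""
    w = np.zeros(6 * (1 + L))
    w[:6] = sigma * rng.standard_normal(6)
    return X + w

x0 = np.zeros(6)                                # helicopter pose
p1 = np.array([1.0, 0.0, 2.0, 0.0, 0.0, 0.0])   # hypothetical camera pose
print(predict(np.concatenate([x0, p1]), L=1))
```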

B. Finding Correspondences between Image Features

In this section, we propose a correspondence-finding method. The proposed method finds correspondences between the image features in two images, one obtained by an un-calibrated camera and one by a calibrated camera, as described in the following steps; a code sketch of the enumeration in steps 1)-5) is given after the list.

1) Extract image features. Label the image features from 1 to R, and define a label set of the image features as I = {1, 2, ..., |I|}. Here, |I| is the number of elements of I. The set I provides the vertex sets of the convex polygons formed below.

2) Set b = 1. Repeat the following operations 2-a), 2-b), and 2-c) until the number of elements of I is less than 3.

a) Form a convex polygon Q_b such that Q_b includes all image features of I and the vertices of Q_b are elements of I. Define the vertex set of Q_b as J_b.

b) Label each vertex of the convex polygon Q_b with novel labels 1_{p_b}, 2_{p_b}, ..., q_{b,p_b} clockwise in the image. Here p represents the label of the vertex attached in operation a) (see Fig. 2 for an example). There are q_b combinations of labeling patterns; define each labeling pattern as O_b^c, c = 1, 2, ..., q_b, and define O_b := {O_b^1, O_b^2, ..., O_b^{q_b}}.

c) Set I ← I \ J_b and b ← b + 1.

3) Label the remaining image features of I with novel labels 1_{p_b}, 2_{p_b}, ..., q_{b,p_b}. There are q_b! combinations of labeling patterns; define each labeling pattern as O_b^c, c = 1, 2, ..., q_b!, and define O_b := {O_b^1, O_b^2, ..., O_b^{q_b!}}.

4) Set L = b. Here, L represents the number of the sets O_b.

5) Define 𝓛 := O_1 × O_2 × ... × O_L.

6) Repeat the following operations 6-a) to 6-d) until 𝓛 = ∅.

a) Choose a labeling pattern L_d ∈ 𝓛.

b) Derive the optimal extrinsic parameters R_d^* and t_d^* for L_d by bundle adjustment; in other words, solve Problem 1 for L_d.

c) Compute the difference between the optimal extrinsic parameters and the initial parameters R_0, t_0, given by

$$ D_e = \| t_d^* - t_0 \| + \| \mathrm{rs}(R_d^*) - \mathrm{rs}(R_0) \|, \qquad (9) $$

where ‖·‖ represents the Euclidean norm.

d) 𝓛 ← 𝓛 \ {L_d}.

7) e^* = arg min_e D_e.

Fig. 2. Example of image feature sorting.

8) Obtain the optimal correspondence between the two images from e^*.
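The enumeration in steps 1)-5) can be sketched as follows, peeling convex-hull layers with scipy and forming the Cartesian product of their cyclic labelings; this is an illustrative reading of the procedure, not the authors' implementation.

```python
import itertools
import numpy as np
from scipy.spatial import ConvexHull

def candidate_labelings(points):
    idx = list(range(len(points)))               # label set I
    layers = []                                  # the sets O_b
    while len(idx) >= 3:                         # vertices assumed non-collinear
        hull = ConvexHull(points[idx])           # convex polygon Q_b
        J_b = [idx[v] for v in hull.vertices]    # vertex set J_b
        # q_b cyclic rotations of the hull's vertex order form O_b
        layers.append([J_b[r:] + J_b[:r] for r in range(len(J_b))])
        idx = [i for i in idx if i not in J_b]   # I <- I \ J_b
    if idx:                                      # step 3: fewer than 3 remain
        layers.append([list(p) for p in itertools.permutations(idx)])
    # step 5: the candidate set L = O_1 x O_2 x ... x O_L
    return [sum(combo, []) for combo in itertools.product(*layers)]

pts = np.random.default_rng(1).random((6, 2))
print(len(candidate_labelings(pts)))   # far fewer than 6! = 720 orderings
```

Each candidate ordering is then scored with bundle adjustment via Eq. (9), and the best one is kept in steps 6)-8).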

Let us now evaluate the computational cost of the proposed correspondence-finding method.

Theorem 1: It is assumed that the image features which are the vertices of any convex polygon Q_b are not collinear in the image. Then the computational cost of the proposed correspondence-finding method is at most $O(L \cdot 3^{N/3})$.

Proof: Let C denote the number of convex polygons. We evaluate the number of elements of the candidate set 𝓛 defined in operation 5), since this determines the repeat count of operation 6), which dominates the computation time of the proposed method. By the inequality of arithmetic and geometric means, the repeat count is bounded by

$$ |J_1||J_2| \cdots |J_C| \le \left[ \frac{|J_1| + |J_2| + \cdots + |J_C|}{C} \right]^C = \left[ \frac{N}{C} \right]^C =: g(C). \qquad (10) $$

The function g(C) is monotonically increasing in C for C ≤ N/3, since N ≥ C ≥ 1 and a convex polygon consists of at least 3 vertices. Thus the maximum value of g(C) is $3^{N/3}$.

A full search method requires L · N! repetitions of operation 6). Thus the computation cost of the proposed correspondence-finding method is much less than that of the full search method. In addition, the proposed method becomes more effective as the number of image features increases.
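As a quick numeric check of this gap (with a hypothetical N = 12 features):

```python
import math

N = 12
print(3 ** (N // 3))       # Theorem 1 bound per camera: 81
print(math.factorial(N))   # full-search orderings per camera: 479001600
```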

C. Particle Filter

The particle filter is a method to estimate a state using likelihoods computed from a large number of random samples generated from an arbitrary probability density function.

The likelihood $\mu_i$ for camera i is computed from the Euclidean distance between the observed image features $\xi_i$ and the estimated image features $\hat{\xi}_i$, using a Gaussian distribution with standard deviation σ and a constant offset a:

$$ \mu_i(t) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left( -\frac{|\xi_i(t) - \hat{\xi}_i(t)|^2}{2\sigma^2} \right) + a. \qquad (11) $$
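A minimal sketch of one particle-filter update using the likelihood of Eq. (11) follows; particles holds hypothetical 6-DOF camera-pose samples and predict_features() stands in for the projection model, so both names are assumptions rather than the paper's code.

```python
import numpy as np

def pf_update(particles, observed_xi, predict_features,
              sigma=2.0, a=1e-6, rng=np.random.default_rng()):
    """Weight particles by the likelihood mu_i of Eq. (11) and resample."""
    dists = np.array([np.linalg.norm(observed_xi - predict_features(p))
                      for p in particles])
    mu = np.exp(-dists**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma) + a
    w = mu / mu.sum()                                      # normalized weights
    idx = rng.choice(len(particles), len(particles), p=w)  # resampling
    return particles[idx]
```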


IV. EXPERIMENT

In this section we explain the experimental setup, procedure, and results. The purpose of the experiment is autonomous flight of a small helicopter using the proposed method.

A. Experimental Setup

The system considered in this section consists of a small helicopter, three stationary cameras, three computers, a DA converter, an electronic circuit, and a transmitter, as illustrated in Fig. 3 (see [9], [10], [17] for multiple-camera and wired versions of this system). The experimental environment is shown in Fig. 4. The helicopter does not have any sensors which measure its position or orientation. It has four small green balls attached to rods connected to the bottom of the helicopter. The green balls are labeled from 1 to 4, and their positions in the image plane of each camera are the image features. For simplicity, we compute the helicopter's position and orientation from the green markers in the image; in the future, we aim to estimate them from helicopter images without using markers. The three cameras are placed on the ground and look downward. The cameras are labeled from 1 to 3.

Fig. 3. System configuration: the calibrated cameras 1 and 2 and the un-calibrated camera 3 are connected to client computers in the experimental field; the server computer at the ground station drives the transmitter through the DA converter and electronic circuit.

Fig. 4. Experimental system: world coordinate axes x, y, z with the calibrated and un-calibrated cameras.

Cameras 1 and 2 are calibrated; camera 3 is not. Two of the three cameras are connected to one computer and the remaining camera to another. The two computers connected to the cameras are called client computers. The computer connected to the DA converter is called the server computer.

The client computers extract image features in real time; extraction takes 16.6 ms. We use fast IEEE 1394 cameras (Dragonfly Express, developed by Point Grey Research Inc.). The values of the extracted image features are sent to the server computer.

The server computer selects correctly extracted image features by the algorithm proposed in [10]. It also derives the required input voltages in the manner described in [10]. The computed control input voltages are supplied from the server computer to the helicopter through the DA converter and the transmitter. The clocks of the computers are not synchronized with each other; the server computer uses the latest data obtained from the client computers. It takes 3 ms to update the input voltages.

The small helicopter used in the experiments is the X.R.B SR SKY ROBO Shuttle developed by HIROBO (see Fig. 5). It has a coaxial rotor configuration: the two rotors share the same axis and rotate in opposite directions. The tail is a dummy. A stabilizer installed on the upper rotor head mechanically keeps the posture horizontal.

Table I summarizes specifications of the system.

B. Experimental Procedure

The autonomous flight of the helicopter requires at least two calibrated cameras, since two cameras are necessary to compute the helicopter's position and orientation. Thus we set up two calibrated cameras and one un-calibrated camera.

The steps for extracting the image features in each camera, which are necessary for computing the helicopter's position and orientation, are as follows.

1) Process at time t = 0: get the background image.

Fig. 5. X.R.B. with four green balls.

TABLE I
SPECIFICATIONS OF THE SYSTEM

Length of the helicopter: 0.40 m
Height of the helicopter: 0.21 m
Rotor length of the helicopter: 0.34 m
Weight of the helicopter: 0.21 kg
Focal length of the lens: 0.0042 m
Camera resolution: 640 × 480 pixels
Pixel size: 7.4 μm × 7.4 μm


Fig. 6. Experimental result: Time profile of the generalized coordinates x and y (reference, experiment A, experiment B).

2) Process at time t = 1, ..., T:

a) Get an image at the current time.

b) Extract the regions that are not in the background image using a background-differencing algorithm.

c) Extract the regions with the color of the image features in HSV color space, which is robust to illumination changes.

d) Remove noise by morphological opening and closing.

e) Label the remaining regions, compute the center of gravity of each label, and take it as an image feature.

We can extract the image features in each image by repeating process 2); a minimal sketch of this pipeline is given below.
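The following OpenCV sketch illustrates process 2); the HSV bounds for the green markers, the difference threshold, and the kernel size are hypothetical values, not the paper's settings.

```python
import cv2
import numpy as np

def extract_features(frame, background, lo=(40, 80, 80), hi=(80, 255, 255)):
    # 2-b) regions that differ from the background image
    diff = cv2.absdiff(frame, background)
    moving = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY) > 20
    # 2-c) regions with the marker color in HSV space
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    green = cv2.inRange(hsv, np.array(lo), np.array(hi)) > 0
    mask = (green & moving).astype(np.uint8) * 255
    # 2-d) remove noise by morphological opening and closing
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # 2-e) label regions; each centroid becomes an image feature
    n, _, _, centroids = cv2.connectedComponentsWithStats(mask)
    return centroids[1:]   # drop the background component
```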

In Experiment A, we calibrated the camera using only bundle adjustment. In Experiment B, we calibrated the camera with the proposed method.

C. Experiment A: Autonomous Flight of the Helicopter Using Calibration by Bundle Adjustment with a Gauss-Newton Method

We computed the position and pose of the helicopter from the extrinsic and intrinsic parameters and the images of the two calibrated cameras for takeoff, hovering, and landing. During the flight, we calibrated the un-calibrated camera in real time using bundle adjustment, from the helicopter state computed from the calibrated cameras and the image features obtained by the un-calibrated camera. After the helicopter was hovering stably, we hid camera 2 completely so that correct image features could not be extracted.

Figs. 6 and 7 show the error between the reference and the measured values of the helicopter's horizontal directions x and y, height direction z, and yaw direction θ. Fig. 8 shows time profiles of the elements of the set of correctly extracted image features. A label number is 4(i − 1) + j − 1, where i = 1, 2, 3 is the label of the camera and j = 1, 2, 3, 4 is the label of the marker. The label of the un-calibrated camera is i = 3 (the same holds in the following experiment). Figs. 9 and 10 show the calibration error of the positions and orientations between the real extrinsic parameters $p_\ell^*$ and the estimates $p_\ell$, and hence the convergence of the calibration, in Experiments A and B. The real extrinsic parameters are calibrated by the method proposed by Zhang [18]. In Experiment A, we hid camera 2 from 13 s.

Fig. 7. Experimental result: Time profile of the generalized coordinates z and θ (reference, experiment A, experiment B).

Fig. 8. Experimental result: The labels of the selected image features for cameras 1-3; the annotation in the figure marks when the helicopter crashed.

D. Experiment B: Autonomous Flight of the Helicopter Using the Proposed Method

The situation of Experiment B was the same as that of Experiment A. In Experiment B, we hid camera 2 with a white board from 13 s until the landing. Fig. 11 shows time profiles of the elements of the set of correctly extracted image features.

E. Discussions

As Experiment A shows, bundle adjustment with a Gauss-Newton method did not estimate the extrinsic parameters precisely. The most likely cause of this result is image noise, which affects the optimization adversely. The helicopter crashed to the ground when camera 2 was hidden in Experiment A. This implies that bundle adjustment with a Gauss-Newton method cannot provide the extrinsic parameters accurately enough for autonomous flight of the helicopter.

Fig. 9. The calibration result: position.


Fig. 10. The calibration result: orientation.

Fig. 11. Experimental result: The labels of the selected image features for cameras 1-3.

In Experiment B, the estimated parameters converged to constant values within some error bound. Nevertheless, the autonomous flight of the helicopter was achieved even though the camera calibrated on-line has some error in the extrinsic parameters. In addition, the flight of the helicopter cannot be achieved with only one camera. This result shows that the accuracy of calibration with the proposed method is sufficient for autonomous flight of the helicopter.

Fig. 11 shows that some image features are still extracted in camera 2 after it is hidden by the white board. This means that the image features cannot be extracted correctly from camera 2. However, the other two cameras can extract image features correctly. This implies that the helicopter can be controlled even if not all image features are extracted correctly.

V. CONCLUSIONS

In this paper, we have proposed a real-time camera calibration method for autonomous flight of a helicopter. The proposed method consists of two parts. First, the proposed method finds correspondences between image features extracted from the un-calibrated and calibrated cameras by bundle adjustment. Then the method estimates the extrinsic parameters with a particle filter. The computation cost of the proposed method for finding the correspondences is much less than that of the full search method. The experimental results show that autonomous flight of the helicopter is achieved with the proposed calibration method. In contrast, the flight cannot be achieved by bundle adjustment with a Gauss-Newton method, which is widely used for camera calibration.

In the experiment, only one camera was calibrated in real time by the proposed method. However, the proposed method can calibrate many un-calibrated cameras in real time, provided that two cameras are calibrated beforehand. Thus we will attempt autonomous flight with a number of un-calibrated cameras using the proposed method.

VI. ACKNOWLEDGMENTS

We thank Associate Professor Shingo Kagami for his very kind advice on this study.

REFERENCES

[1] E. Altug, J. P. Ostrowski, and C. J. Taylor, "Control of a Quadrotor Helicopter Using Dual Camera Visual Feedback", The International Journal of Robotics Research, Vol. 24, No. 5, pp. 329–341, 2005.

[2] H. Bay, T. Tuytelaars, and L. V. Gool, "SURF: Speeded Up Robust Features", European Conference on Computer Vision, 2006.

[3] D. Brown, "The Bundle Adjustment - Progress and Prospect", Congress of the ISPRS, Helsinki, 1976.

[4] A. Y. Chen, M. Matsuoka, and S. P. N. Singh, "Autonomous Helicopter Tracking and Localization Using a Self-Calibrating Camera Array", International Journal of Robotics Research, Vol. 26, pp. 205–215, 2007.

[5] W. Gilks, S. Richardson, and D. Spiegelhalter, "Markov Chain Monte Carlo in Practice", Chapman and Hall, 1996.

[6] R. Graham, "An Efficient Algorithm for Determining the Convex Hull of a Finite Planar Set", Information Processing Letters, Vol. 1, pp. 132–133, 1972.

[7] S. Han, A. D. Straw, M. H. Dickinson, and R. M. Murray, "A Real Time Helicopter Testbed for Insect-Inspired Visual Flight Control", IEEE International Conference on Robotics and Automation, pp. 3078–3083, 2009.

[8] I. Ihrke, L. Ahrenberg, and M. Magnor, "External Camera Calibration for Synchronized Multi-Video Systems", International Conference on Computer Graphics, Visualization and Computer Vision, Vol. 12, pp. 537–544, 2004.

[9] Y. Iwatani, K. Watanabe, and K. Hashimoto, "Automatic Take-off and Landing of an Unmanned Helicopter Using Vision-Based Control with Occlusion Handling", CISM-IFToMM Symposium on Robot Design, Dynamics and Control, pp. 11–18, 2008.

[10] Y. Iwatani, K. Watanabe, and K. Hashimoto, "Visual Tracking with Occlusion Handling for Visual Servo Control", IEEE International Conference on Robotics and Automation, pp. 101–106, 2008.

[11] R. Jarvis, "On the Identification of the Convex Hull of a Finite Set of Points in the Plane", Information Processing Letters, pp. 18–21, 1973.

[12] J. S. Kim, M. Hwangbo, and T. Kanade, "Motion Estimation Using Multiple Non-Overlapping Cameras for Small Unmanned Aerial Vehicles", IEEE International Conference on Robotics and Automation, pp. 3076–3081, 2008.

[13] K. Kim and L. S. Davis, "Multi-camera Tracking and Segmentation of Occluded People on Ground Plane Using Search-Guided Particle Filtering", European Conference on Computer Vision, pp. 98–109, 2006.

[14] D. G. Lowe, "Image Mosaicing Using Sequential Bundle Adjustment", Image and Vision Computing, Vol. 20, pp. 751–759, 2002.

[15] D. G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints", International Journal of Computer Vision, Vol. 60, pp. 91–110, 2004.

[16] D. E. Schinstock, C. Lewis, and C. Buckley, "An Alternative Cost Function to Bundle Adjustment Used for Aerial Photography from UAVs", ASPRS Annual Conference, 2009.

[17] K. Watanabe, Y. Iwatani, K. Nonaka, and K. Hashimoto, "A Visual-Servo-Based Assistant System for Unmanned Helicopter Control", IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 822–827, 2008.

[18] Z. Zhang, "A Flexible New Technique for Camera Calibration", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 11, pp. 1330–1334, 2000.
