
Obstacle Avoidance in Cluttered Environments using Optic Flow

Matthew A. Garratt, University of New South Wales, Australia

[email protected]

Allen Cheung, University of Queensland, Australia

[email protected]

Abstract

Autonomous navigation of a flying vehicle in a cluttered environment has been simulated to demonstrate novel yet simple techniques for avoiding obstacles using optic flow and image loom signals whilst successfully navigating a vehicle towards a goal. The simulation makes use of simulated flow fields calculated on cameras looking downwards, forwards, rearwards and sidewards. This computation could be carried out using a single processor networked with multiple remote cameras or using separate optic flow sensors oriented in each direction.

1 Introduction

A plethora of MAVs have been produced in recent times that can be safely operated in populated areas owing to their small size and low kinetic energy. Such craft can be flown into a building, through a sewer, down a busy street and so on. The applications of MAVs to military operations are many and varied, including surveillance inside buildings, search and rescue, hostage situation monitoring, chemical agent detection and bomb search. The problem of navigating a flying robot through cluttered terrain is of significant interest, since conventional techniques such as GPS tend to fail in urban canyon environments and will not work indoors. Remotely piloted vehicles also become problematic when line-of-sight is lost and telemetry systems start to experience frequent dropouts. Owing to the payload restrictions of MAVs, the only practical sensor for achieving collision avoidance on this scale is a visual sensor. Alternative technologies such as laser range-finding, radar and high-grade inertial systems are simply impractical for small UAVs owing to the physical barriers to miniaturisation that exist. Additional advantages of visual guidance are that it is passive, small and low cost. In our prototype systems, components for a guidance payload could be purchased for as little as US$500, and such a system would weigh less than 50 g.

2 Related Work

Motivation for this project is partly derived from observations from biology. Insects with tiny brains, weighing less than one tenth of a milligram and tens of thousands of times less complex than a human brain, can take off, fly, navigate and land using vision. Because of their small size, autonomy systems for small autonomous vehicles also need to be high-bandwidth and precise, yet low-weight and low-power. Many believe (e.g. [Iida, 2001; Green et al., 2003; Thakoor et al., 2003; Srinivasan et al., 2004]) that these seemingly contradictory requirements can be met through biomimetic sensing, particularly through the use of visual sensors, rather than by conventional techniques. Experiments in biology [Srinivasan et al., 1989] have led researchers to the conclusion that flying creatures use image motion, or optic flow, to determine range to objects and hence avoid obstacles in flight.

To date, optic flow has been used in two main ways for controlling flight vehicles. Firstly, given ground speed from a non-visual sensor such as GPS, optic flow can be used to estimate range. This has been used for the problem of terrain following [Garratt and Chahl, 2008], centering in a natural canyon [Griffiths et al., 2006] and, in conjunction with stereo vision, for guiding a UAV through an urban canyon [Hrabar et al., 2005]. The second way that optic flow has been used is to estimate velocity given a measure of range from another sensor such as stereo vision or a laser rangefinder [Garratt and Chahl, 2007].

An excellent contribution to the solution of urban navigation using vision is [Beyeler et al., 2007], in which a helicopter simulation is combined with graphics generation and optic flow processing to demonstrate obstacle avoidance in an urban environment. In their simulation, the helicopter is steered in a direction that balances the optic flow on the left and right sides of the field of view of a forward-looking camera. When optic loom exceeds a threshold, the helicopter executes a 180° turn. In Beyeler et al.'s work, the helicopter is assumed to only travel forwards,


Figure 1: SIMULINK simulation of the closed-loop helicopter. The block diagram connects a Trajectory Planner, Controller, Servo Dynamics, Helicopter Dynamics, Vision System and State Display; the servo commands comprise pitch cyclic (B1), roll cyclic (A1), collective pitch and tail rotor pitch, and the actual state is fed back to the controller.

so that sideways flight is not permitted. The helicopter also does not work towards a goal, only executing a random flight path, and the simulation requires that the urban environment is surrounded by a perimeter wall to prevent the helicopter from flying outside the virtual environment. In our scheme, we proceed towards a goal and we do not make assumptions about the motion of the helicopter, relying instead on realistic helicopter dynamics, so that lateral motion occurs. To achieve this, we use optic flow from multiple cameras to avoid obstacles on all sides.

3 Simulation Details

A closed-loop micro helicopter simulation has been developed which includes a graphics engine to simulate the view from multiple cameras onboard the UAV. OpenGL software is used to rapidly draw the simulated views from downwards-, upwards- and two sideways-looking cameras. The images are processed using code implemented in MATLAB to produce an optic flow field consisting of a grid of evenly spaced flow vectors. Figure 1 shows the top-level view of our visual guidance simulator implemented in Simulink. The helicopter dynamics block is based on a small electrically propelled autonomous helicopter used by the group and has been validated thoroughly against real flight test data [Garratt, 2007]. The vision sub-systems contain blocks for generating the simulated view from each camera. The image stream from each simulated view is fed into optic flow processing blocks which calculate a vector flow field for each image frame from each camera.

3.1 Helicopter Dynamics Model

Mathematical models for small-scale helicopters have been extensively documented in the literature [Hashimoto et al., 2000; Gavrilets et al., 2001; Civita et al., 2002]. The general approach is to combine a linearised rotor aerodynamic model with the non-linear rigid-body equations (a minimal integration sketch of the rigid-body equations is given at the end of this subsection).

Sometimes a non-linear thrust and rotor inflow model is incorporated, usually involving an iterative scheme to simultaneously solve for the thrust and the induced velocity through the rotor. The non-linear fuselage and tailplane aerodynamic forces are usually only added when forward flight is considered. An underlying theme in simulating small helicopters has been the realisation that, owing to the complexity of the helicopter, minimum-complexity simulations tend to be most practical, so most researchers have chosen to use the minimum level of detail required to capture the key system dynamics. This approach was championed by Heffley, who published a report for NASA in 1988 on a minimum-complexity helicopter model [Heffley, 1988] that is regularly cited in the field. Heffley maintains that adding higher-order effects such as the off-axis moments due to blade flapping does not automatically lead to a better match with flight data. SIMULINK® was chosen as the development environment for convenience, as the graphical drag-and-drop building-block approach speeds up the development cycle and allows the simulation to be quickly adapted to other helicopters. Balancing simulation fidelity against practicality, a simulation has been created that is capable of simulating the following effects:

• Exact non-linear rigid body equations of motion;

• Wind gusts and turbulence;

• First order main rotor flapping dynamics;

• Hover, rearwards, sideways and forward flight;

• Dynamic effects of the Bell-Hiller stabiliser bar;

• Fuselage and tailplane aerodynamic forces;

• Approximate servo dynamics; and

• Sensor lags, filtering, offsets and noise.

The simulation does not model the interaction of rotor downwash with the fuselage, or rotor lead-lag mechanisms. Higher-order flapping modes are only really of interest when studying vibration and aeroelastic problems [Leishman, 2006], so they have also been ignored. Only low rates of descent are allowed on the helicopter so that momentum theory can still be applied. This seems reasonable, noting that all of the simulation scenarios relate to flight close to the ground, so that high descent rates cannot be achieved. A detailed explanation and validation of the helicopter dynamics simulation can be found in [Garratt, 2007].
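As a minimal illustration of the rigid-body part of the model (a stand-alone sketch, not the paper's validated simulation), the exact non-linear equations of motion can be integrated as below. The mass, inertia, placeholder hover thrust and zero moment are illustrative only; in the full simulation the force and moment come from the rotor, flapping, fuselage and tailplane models described above.

% Sketch: exact non-linear rigid-body equations of motion in body axes,
% integrated with forward Euler. All numerical values are illustrative.
m = 8.0;  J = diag([0.30 0.50 0.40]);  g = 9.81;  dt = 0.002;
p = [0; 0; -1];  R = eye(3);      % NED position (m) and body-to-world rotation
v = [0; 0; 0];   w = [0; 0; 0];   % body-frame velocity (m/s) and angular rate (rad/s)
skew = @(a) [0 -a(3) a(2); a(3) 0 -a(1); -a(2) a(1) 0];
for k = 1:5000
    F = R' * [0; 0; m*g] + [0; 0; -m*g];   % gravity in body axes + placeholder hover thrust
    M = [0; 0; 0];                         % placeholder net moment
    vdot = F/m - cross(w, v);              % translational dynamics (body axes)
    wdot = J \ (M - cross(w, J*w));        % rotational dynamics (Euler's equation)
    p = p + R*v*dt;                        % position kinematics
    R = R * expm(skew(w)*dt);              % attitude propagation on SO(3)
    v = v + vdot*dt;  w = w + wdot*dt;
end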

3.2 Graphics and Image Processing

The openGL code is written in C++ and compiled into a dynamic link library (.dll) file that will run under the Windows operating system.


The .dll library routines are called from inside the Simulink program. The routines are called every 0.02 seconds of simulation time to simulate a camera frame rate of 50 frames per second. The .dll function calls are passed input variables corresponding to the UAV position and attitude calculated in the simulation. The openGL executable program outputs each image as a 400×400 pixel 24-bit color image. The color image is converted into an array of 8-bit intensity values by averaging the intensities from the red, green and blue color channels. The images are displayed as the simulation proceeds. In future work, the openGL code has the potential to allow simulation of more complicated environments such as trees, shadows, rocks, multiple light sources and grass. Figure 2 shows the views from multiple cameras when the helicopter is flying inside a simulated city block environment.
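As an illustration, the colour-to-intensity conversion can be sketched in MATLAB as below; the random array is only a stand-in for a frame returned by the .dll.

% Sketch: average the R, G and B channels of a 400x400x3 frame to 8-bit intensity.
rgb  = randi([0 255], 400, 400, 3, 'uint8');  % placeholder for a rendered 24-bit frame
gray = uint8(mean(double(rgb), 3));           % channel average -> 8-bit intensity image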

For this work, we calculate the optic flow vector field using the Image Interpolation Algorithm, abbreviated I2A. The I2A is quick to execute and robust to noise. A full derivation of the I2A can be found in [Srinivasan, 1994]. To achieve the optic flow calculation, a Simulink delay is used to store the previous frame at each sample time. The optic flow block in Simulink is implemented as compiled C code. A graphical menu allows the parameters of the optic flow calculation to be changed; for example, the patch size in pixels, and the spacing and number of flow vectors, can be adjusted.
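A minimal single-patch sketch in the spirit of the I2A is given below: the displacement of a patch between the previous frame f0 and the current frame f1 is estimated by least squares against copies of f0 shifted by ±delta pixels. The function name, patch location and sign conventions are illustrative and follow [Srinivasan, 1994] only loosely; the patch must lie at least delta pixels inside the image border, which is also why flow is not computed at the image edges (see section 4.1).

function d = i2a_patch(f0, f1, r0, c0, psz, delta)
% Estimate the [dx; dy] displacement (pixels) of one psz-by-psz patch whose
% top-left corner is at (r0, c0), by interpolating between shifted references.
rows = r0:r0+psz-1;  cols = c0:c0+psz-1;   % patch support (>= delta from the border)
fxp = double(f0(rows, cols - delta));      % reference shifted by +delta in x
fxm = double(f0(rows, cols + delta));      % reference shifted by -delta in x
fyp = double(f0(rows - delta, cols));      % reference shifted by +delta in y
fym = double(f0(rows + delta, cols));      % reference shifted by -delta in y
gx  = (fxm - fxp) / (2*delta);             % central-difference image gradients
gy  = (fym - fyp) / (2*delta);
e   = double(f1(rows, cols)) - double(f0(rows, cols));   % temporal difference
A   = [sum(gx(:).^2)      sum(gx(:).*gy(:));
       sum(gx(:).*gy(:))  sum(gy(:).^2)];
b   = -[sum(gx(:).*e(:)); sum(gy(:).*e(:))];
d   = A \ b;                               % least-squares patch displacement
end

For example, d = i2a_patch(prev, curr, 100, 100, 64, 8) would return one flow vector for a 64×64 patch (the patch size used in section 4.1); the shift of 8 pixels is an illustrative choice.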

Simulink blocks for calculating the average optic flow in the image and also the image loom have been constructed. Image loom corresponds to image expansion or contraction due to motion perpendicular to the image plane of the camera. The loom value is extracted by differencing the sums of the horizontal flow components on the left- and right-hand sides of the image. The loom calculation can be better understood by considering the equations which relate image plane velocity (Qu and Qv) to vehicle motion, which are given in [Corke, 2004] and reproduced as equation 1.

\begin{bmatrix} Q_u \\ Q_v \end{bmatrix} =
\begin{bmatrix}
\frac{F_x}{Z} & 0 & -\frac{U}{Z} & -\frac{UV}{F_x} & \frac{F_x^2+U^2}{F_x} & -V \\
0 & \frac{F_y}{Z} & -\frac{V}{Z} & -\frac{F_y^2+V^2}{F_y} & \frac{UV}{F_y} & U
\end{bmatrix}
\begin{bmatrix} X & Y & Z & p & q & r \end{bmatrix}^T
\qquad (1)

where Fx and Fy are the horizontal and vertical focal lengths of the camera respectively (equal in the simulation); U and V are the horizontal and vertical pixel locations of features in the field of view; X, Y and Z are the sideways, vertical and normal velocities of the camera; and p, q and r are the rotation rates about the camera axes.

Consider the horizontal component Qu of the optic flow field at a location (U, V) in the image plane. If optic flow is calculated using a symmetrical and even distribution of patches, summed over the left and right sides of the image separately, and the sums are then subtracted, many of the terms in equation 1 cancel out owing to the symmetry. The resulting relationship is:

\sum_{\mathrm{left}} Q_u - \sum_{\mathrm{right}} Q_u = -\frac{\dot{Z}}{Z}\left(\sum_{\mathrm{left}} U - \sum_{\mathrm{right}} U\right) \qquad (2)

The sums on the right-hand side of equation 2 are simply the sums of the spatial coordinates of the centres of the patches used for the optic flow computation and are trivially calculated. The loom term L = \dot{Z}/Z is the inverse of the time to contact and can be calculated explicitly from the measured optic flow field using equation 3, which is derived from equation 2 after noting that the patch centres on the left and right sides are placed symmetrically about the image centre, so that the left and right sums of U are equal and opposite.

L = \frac{\dot{Z}}{Z} = \left(\sum_{\mathrm{left}} Q_u - \sum_{\mathrm{right}} Q_u\right)\left(2\sum_{\mathrm{right}} U\right)^{-1} \qquad (3)
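As an illustration, the loom estimate of equation 3 can be computed from a single row of horizontal flow measurements as in the sketch below; the flow values and patch-centre coordinates are illustrative, and the left and right halves are split at U = 0.

% Sketch: loom from one row of flow vectors, following equation 3.
Qu = [-1.2 -0.8 -0.5 -0.2  0.2  0.5  0.8  1.2];   % horizontal flow at each patch (illustrative)
U  = [-150 -110  -70  -30   30   70  110  150];   % patch-centre coordinates (pixels)
left = U < 0;  right = U > 0;
L = (sum(Qu(left)) - sum(Qu(right))) / (2*sum(U(right)));   % loom, equation 3
% Here L is negative: the image is expanding, so the range Z is decreasing.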

4 Simple Obstacle Avoidance

A simple scheme for controlling the flight of a helicopter in a cluttered environment was tested first. This scheme is sufficient to demonstrate the ability of a vehicle with the dynamics of a small rotorcraft to avoid walls in a rectangular 8m×16m room using vision. The scheme makes use of five sensors which are oriented downwards, sideways (left and right) and fore and aft. (Note: the five sensors could be combined into one panoramic sensor with a 180° field of view using appropriate optics.) To speed up execution of the simulation, optic flow vectors for the sideways, front and rear images were only calculated in a horizontal row of 8 patches. In real flight, many more vectors would be calculated, improving the accuracy of the results significantly. Optic flow on 8×8 patches was calculated for the ground camera video stream.

Figure 2: Example of imagery produced by the openGL software. The following camera views can be seen: left view (top left), right view (top middle), front view (bottom left), downwards view (bottom middle) and rear view (far right).

A simple sensor fusion loop is implemented that fuses measurements from a simulated height sensor with inertial data and the optic flow information from the downwards-looking camera (see [Garratt and Chahl, 2007] for details). The resulting lateral and longitudinal velocity estimates are used in PID-based feedback loops to control the lateral and longitudinal motion of the helicopter. Speed is maintained at 1 m/s in the forward and sideways directions to create a diagonal flight path, where the sense of the forward and sideways velocity components is switched in response to obstacles detected in the flight path. The heading of the helicopter was kept constant throughout using a PID control loop to control yaw. The height of the helicopter above the ground was also held constant at 1 m using a PID control loop.
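One axis of the PID velocity loop described above might be sketched as below; the gains, sample time and signal values are illustrative placeholders rather than the values used in the simulation. The yaw and height loops would take the same form, acting on heading and height errors.

% Sketch: one update of a single-axis PID velocity loop (illustrative values).
Kp = 0.4;  Ki = 0.1;  Kd = 0.05;  dt = 0.02;   % gains and sample time
int_err = 0;  prev_err = 0;                    % integrator state and previous error
v_cmd = 1.0;                                   % commanded forward speed (m/s)
v_est = 0.8;                                   % speed estimate from the sensor fusion loop
err     = v_cmd - v_est;
int_err = int_err + err*dt;
cyclic  = Kp*err + Ki*int_err + Kd*(err - prev_err)/dt;   % longitudinal cyclic pitch command
prev_err = err;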

Owing to the diagonal flight direction, both the front and side cameras tend to observe flow from translation. The exception to this is when the direction of flight is directly towards an obstacle. In this case, the presence and relative range of the obstacle are determined from the expansion of the flow vectors (loom).

If the range calculated from flow (or loom) on either side is less than some threshold, the lateral velocity component of the helicopter is reversed to take the vehicle away from the obstacle. Similarly, if the range calculated from flow (or loom) at the front or back is less than some threshold, the helicopter reverses its longitudinal direction away from the obstacle.
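The switching rule can be sketched as below, assuming range estimates (in metres) have already been recovered from the flow and loom of each horizontal view; the threshold, speeds, example ranges and the sign convention (positive vy taken as towards the right camera) are all illustrative.

% Sketch: threshold-based reversal of the diagonal velocity components.
range_left = 5.0;  range_right = 1.5;  range_front = 6.0;  range_rear = 7.0;  % example ranges (m)
range_thresh = 2.0;                    % illustrative switching threshold (m)
vx = 1.0;  vy = 1.0;                   % diagonal flight at 1 m/s on each axis
side_range = (vy > 0)*range_right + (vy <= 0)*range_left;   % side currently being flown towards
fore_range = (vx > 0)*range_front + (vx <= 0)*range_rear;   % fore/aft direction being flown towards
if side_range < range_thresh, vy = -vy; end   % reverse lateral component away from the obstacle
if fore_range < range_thresh, vx = -vx; end   % reverse longitudinal component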

The resulting path of the helicopter is a zig-zag through the cluttered environment, which results in an exploration of the environment. Figure 3 shows the results of simulations for the helicopter inside the room when the helicopter is commanded to maintain a heading which is (a) initially aligned with the surrounding walls; and (b) offset by 22.5° from the longer walls. In case (a), the loom signal may not be necessary, as the diagonal motion of the helicopter tends to stimulate flow from translation on all cameras at once. As the helicopter nears a wall, an increase in optic flow and loom can be seen for any cameras which image the wall. The combination of loom and optic flow from translation is sufficient to prevent the helicopter from striking the walls.

4.1 Goal Directed Behaviour

The work on simulation of navigation was extended to include a bias towards a particular goal. The approximate positions of the goal and of the current vehicle location are assumed to be known from a means such as GPS. (Alternatively, optic flow odometry could be used to navigate towards a point at a set distance and bearing from the starting position. The basic principle of optic flow odometry is to integrate the optic flow from the sensor over time to compute the distance travelled.) In the latest scheme, velocity commands are sent to the lower-level horizontal controller. The velocity is sensed by a downwards-looking optic flow sensor that measures sideways and longitudinal drift.
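As a simple illustration of optic flow odometry, the sketch below integrates the scaled flow from the downward-looking sensor over time. The focal length, height and flow history are illustrative placeholders, and a constant height and negligible rotation are assumed.

% Sketch: integrating downward-camera flow over time to estimate distance travelled.
F  = 200;   h = 1.0;   dt = 0.02;         % focal length (pixels), height (m), frame period (s)
Qu_down = 400*ones(1, 500);               % illustrative downward-camera flow history (pixels/s)
dist = 0;
for k = 1:numel(Qu_down)
    v_ground = Qu_down(k) * h / F;        % ground speed (m/s) recovered from flow and height
    dist = dist + v_ground * dt;          % integrate speed over time -> distance travelled (m)
end                                       % dist is 20 m here (2 m/s for 10 s)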

In this means of control, we again simulate the use of five cameras looking left, right, forwards, rearwards and down. Each camera has a field of view of 90 degrees and the resolution of each image is 400×400 pixels. This field of view provides complete coverage of the surrounding environment, so that obstacle avoidance can be achieved when flying in any direction.


Figure 3: Trajectory of the helicopter in the horizontal plane (X and Y position, m) within an 8m×16m arena, from start to finish, for (a) heading aligned with the longer walls (t = 35 sec) and (b) heading offset by 22.5° from the longer walls (t = 70 sec).


Optic flow is calculated on the image from each camera using a grid of 16×16 flow vectors, and the average of the horizontal flow component over the grid is then calculated to provide one optic flow scalar value for each camera. Each flow vector is calculated using a 64×64 pixel patch.
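The reduction of one camera's flow field to a single scalar might be sketched as below, assuming the 16×16 grid of flow vectors is stored as a 16×16×2 array of horizontal and vertical components; the random field is only a stand-in for a measured one.

% Sketch: collapse one camera's flow field to a single horizontal-flow scalar.
flow = randn(16, 16, 2);             % placeholder flow field: (:,:,1) horizontal, (:,:,2) vertical
Q = mean(mean(flow(:, :, 1)));       % average horizontal component: one scalar per camera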

The velocity commands are based on empirical formulae which blend the need to avoid obstacles with the desire to head towards the goal. The longitudinal velocity command (V^cmd_x) consists of the sum of:

• A gain k1 multiplied by the difference in average horizontal optic flow (QA and QF) between the aft and front views, divided by the lateral velocity (Vy) of the vehicle. As a low lateral velocity results in a very noisy signal, the lateral velocity is saturated to a minimum magnitude of 0.2 m/s.

• A gain k0 multiplied by the difference in average loom (LA and LF) between the aft and front views, divided by the longitudinal velocity (Vx).

• The longitudinal component of the desired velocity towards the target (V^tgt_x). The desired velocity is simply a fixed velocity of 2 m/s in a direction directly towards the target. In some cases, a slightly curved or spiral path can be used by following a vector which points slightly away from the target. This helps prevent the helicopter from getting trapped in a limit-cycle trajectory where it moves backwards and forwards repeatedly over the same path. Although unlikely in practice, this could theoretically occur in the simulation if an obstacle were blocking the desired trajectory and the surrounding obstacles were arranged perfectly symmetrically.

The lateral velocity command (V^cmd_y) is structured in the same way, except that the flow and loom in the left (QL, LL) and right (QR, LR) views are differenced instead. The equations for the command velocities are given below:

V^{cmd}_x = V^{tgt}_x + k_0\,\frac{L_A - L_F}{V_x} + k_1\,\frac{Q_A - Q_F}{V_x}

V^{cmd}_y = V^{tgt}_y + k_0\,\frac{L_L - L_R}{V_y} + k_1\,\frac{Q_L - Q_R}{V_y}
\qquad (4)

In certain situations, the equations above could lead to low velocities, so that the flow in all directions becomes limited. If this were permitted to continue, it might be possible for the UAV to gradually drift into an obstacle. To eliminate this risk, the velocity commands in equation 4 are normalised to provide a vector magnitude of 2 m/s whilst keeping the same ratio between the lateral and longitudinal velocities. The helicopter is therefore commanded to maintain a 2 m/s velocity in a direction determined to fly towards the target whilst moving away from obstacles. The gains k0 and k1 can be adjusted to regulate how far from obstacles the UAV should fly. If the gains selected are too small, the UAV will fly into obstacles. If the gains are made too high, the commands to the velocity controller can change too rapidly, placing undue strain on the servos. For the experiments herein, we found suitable values of k0 and k1 to be 4.0 and 5.0 respectively.
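The command law of equation 4 and the 2 m/s normalisation can be sketched as below. QF, QA, QL and QR denote the average horizontal flows and LF, LA, LL and LR the loom values from the front, aft, left and right cameras; all numerical values are illustrative placeholders, the velocities are assumed non-zero, and the velocity denominators follow equation 4 as reconstructed above.

% Sketch: velocity commands from equation 4, then normalisation to 2 m/s.
k0 = 4.0;  k1 = 5.0;                        % gains reported in the text
v_tgt = [2.0; 0.0];                         % 2 m/s desired velocity towards the target (illustrative direction)
vx = 1.0;  vy = 0.5;                        % current velocity estimates (m/s)
QF = 0.8;  QA = 0.3;  QL = 1.1;  QR = 0.4;  % average horizontal flow per camera (illustrative)
LF = 0.20; LA = 0.05; LL = 0.30; LR = 0.10; % loom per camera (illustrative)
vx_s = sign(vx)*max(abs(vx), 0.2);          % saturate to a 0.2 m/s minimum magnitude
vy_s = sign(vy)*max(abs(vy), 0.2);
v_cmd = [v_tgt(1) + k0*(LA - LF)/vx_s + k1*(QA - QF)/vx_s;    % equation 4, longitudinal
         v_tgt(2) + k0*(LL - LR)/vy_s + k1*(QL - QR)/vy_s];   % equation 4, lateral
v_cmd = 2.0 * v_cmd / norm(v_cmd);          % normalise to a 2 m/s command vector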

For testing, an urban environment was created in an area of 300×300 m, consisting of random-sized buildings arranged approximately in a grid pattern. The views from the five cameras are shown in figure 2. Outside the building array was open space, i.e. there was no boundary. The helicopter started at a position at the bottom right of the building array (x = -130 m, y = 80 m) and was given a target location of (x = 110 m, y = -110 m). Height, horizontal velocity and heading were controlled using the same scheme described for the simple obstacle avoidance strategy.

Figure 4 shows the trajectory where the heading of the helicopter was maintained parallel to the X-direction. Figure 5 shows the trajectory where the heading of the helicopter was at 22.5° to the right of the X-direction. Some simulations were attempted using a heading of 45°, but this was subject to occasional infringements on the buildings owing to a small blind-spot region on the boundary between the four views. Each camera has a field of view of 90 degrees, but optic flow is not calculated across the edge of the images, as the reference shift pixels would fall off the side of the image. As a result, the effective field of view of each camera was only about 70°. This problem could be prevented using more views or by overlapping the fields of view of the cameras.

In both cases, the helicopter is able to maintain clearance from the buildings. At times the helicopter does come close to the buildings, but it does not cross the wall boundaries. The helicopter tends to 'hug' the walls as it moves. Figure 6 shows the actual and commanded velocities given to the low-level velocity controller. The helicopter responds quickly to the commanded velocities but provides a filtering effect when the commanded velocities fluctuate rapidly. Some pre-filtering of the commanded velocities may improve the performance. Figure 7 shows the optic flow and loom from each camera during the simulation.

5 Conclusion

Simulations of the autonomous navigation of a flying vehicle in a cluttered environment have been presented. The simulations demonstrate novel yet simple techniques for avoiding obstacles using optic flow and image loom signals and can successfully be used to navigate a vehicle towards a goal. The simulation makes use of simulated flow fields calculated on cameras looking downwards, forwards, rearwards and sidewards.


Figure 4: Plan view of the helicopter trajectory (X and Y position, m) in the virtual city with heading parallel to a city street. Rectangles represent buildings.

Figure 5: Plan view of the helicopter trajectory (X and Y position, m) with heading at 22.5° to the street.


Figure 6: Commanded and actual velocities (Vx and Vy, m/s, against time) for the trajectory with heading parallel to the urban streets.

Figure 7: Optic flow and loom (against time) for the left, right, front and rear cameras for the trajectory with heading parallel to the urban streets.


Acknowledgments

This work has been partly supported by seed funding from the US Army International Technology Center Pacific (ITC-PAC).

References

[Barrows, 2002] G. L. Barrows. Future visual microsensors for mini/micro-UAV applications. In Proceedings of the 2002 7th IEEE International Workshop on Cellular Neural Networks and Their Applications, pages 498–506, 2002.

[Beyeler et al., 2007] A. Beyeler, J. Zufferey, and D. Floreano. 3D Vision-based Navigation for Indoor Microflyers. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, pages 1336–1341.

[Chahl, 2004] J. S. Chahl. Terrain following for UAVs using optical flow. In Proceedings of Unmanned Systems North America 2004, Anaheim, CA.

[Civita et al., 2002] M. La Civita, W. C. Messner, and T. Kanade. Modelling of small-scale helicopters with integrated first-principles and system-identification techniques. In Proceedings of the American Helicopter Society 58th Annual Forum, Montreal, Canada, 11-13 June 2002.

[Corke, 2004] P. Corke. An Inertial and Visual Sensing System for a Small Autonomous Helicopter. Journal of Robotic Systems, 21(2):43–51, 2004.

[Garratt, 2007] M. A. Garratt. Biologically Inspired Vision and Control for an Autonomous Helicopter. PhD thesis, Australian National University, 2007.

[Garratt and Chahl, 2007] M. A. Garratt and J. S. Chahl. An optic flow damped hover controller for an autonomous helicopter. In Proceedings of the 22nd International UAV Systems Conference, Bristol, UK.

[Garratt and Chahl, 2008] M. A. Garratt and J. S. Chahl. Vision-Based Terrain Following for an Unmanned Rotorcraft. Journal of Field Robotics, 25(4-5):284–301, 2008.

[Gavrilets et al., 2001] V. Gavrilets, B. Mettler, and E. Feron. Nonlinear model for a small-size acrobatic helicopter. In AIAA Guidance, Navigation, and Control Conference, 6-9 August 2001.

[Green et al., 2003] W. Green, P. Oh, K. Sevcik, and G. Barrows. Autonomous landing for indoor flying robots using optic flow. In Proceedings of the 2003 ASME International Mechanical Engineering Congress, Washington, USA.

[Griffiths et al., 2006] S. Griffiths, J. Saunders, A. Curtis, B. Barber, T. McLain, and R. Beard. Maximizing miniature aerial vehicles: Obstacle and terrain avoidance for MAVs. IEEE Robotics and Automation Magazine, pages 34–43.

[Hashimoto et al., 2000] S. Hashimoto, T. Ogawa, S. Adachi, A. Tan, and G. Miyamori. System identification experiments on a large-scale unmanned helicopter for autonomous flight. In Proceedings of the 2000 IEEE International Conference on Control Applications, pages 850–855, Anchorage, Alaska, 25-27 September 2000.

[Heffley, 1988] R. K. Heffley. Minimum-Complexity Helicopter Simulation Math Model. USAAVSCOM Technical Report 87-A-7, April 1988. Prepared for US Army Research and Technology Activity under contract NAS2-11665.

[Hrabar et al., 2005] S. Hrabar, G. Sukhatme, P. Corke, P. Usher, and J. Roberts. Combined optic-flow and stereo-based navigation of urban canyons for a UAV. In IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 3309–3316.

[Iida, 2001] F. Iida. Goal-Directed Navigation of an Autonomous Flying Robot using Biologically Inspired Cheap Vision. In Proceedings of the 32nd International Symposium on Robotics, 19-21 April 2001.

[Leishman, 2006] J. G. Leishman. Principles of Helicopter Aerodynamics, 2nd edition. Cambridge University Press, New York, 2006.

[Srinivasan, 1994] M. Srinivasan. An image interpolation technique for the computation of optic flow and egomotion. Biological Cybernetics, 71:401–415, 1994.

[Srinivasan et al., 1989] M. Srinivasan, M. Lehrer, S. Zhang, and G. Horridge. How honeybees measure their distance from objects of unknown size. Journal of Comparative Physiology A, 165:605–613, 1989.

[Srinivasan et al., 2004] M. V. Srinivasan, S. W. Zhang, J. S. Chahl, G. Stange and M. A. Garratt. An Overview of Insect Inspired Guidance for application in ground and airborne platforms. Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering, 218:375–388, 2004.

[Thakoor et al., 2003] S. Thakoor, N. Cabrol, N. Lay, and J. S. Chahl. The Benefits and Applications of Bioinspired Flight Capabilities. Journal of Robotic Systems, 2003, pages 687–706.

[Zufferey et al., 2006] J. Zufferey, A. Klaptocz, A. Beyeler, J. Nicoud, and D. Floreano. A 10-gram Microflyer for Vision-based Indoor Navigation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems.
