
Algorithms for Real Time Detection and Depth Calculation of Obstacles by Autonomous Robots

Vikas Singh, Anvaya Rai, Hemanth N., Rohit D.A., Asim Mukherjee
Electronics and Communication Engineering Department

Motilal Nehru National Institute of Technology
Allahabad 211004, U.P., India

vikkifundoo@gmail.com

Abstract: As robotics increases its influence over technological development, the need for autonomous robots capable of performing various tasks has grown. These tasks include operations in which the robots detect obstacles and navigate themselves in an unknown environment. Here we present two algorithms for wireless detection and depth calculation of obstacles. These algorithms are implemented using image processing techniques. The sensor assembly used for wireless sensing of the obstacle comprises a LASER source and a camera.

Keywords: Wireless obstacle detection, sensor assembly, depth calculation, digital image processing, LASER source and camera.

I. INTRODUCTION

With robots being made capable of performing various tasks in which they need to navigate themselves, wireless and efficient detection of obstacles in their path has become an important capability to develop. In this paper we present two different algorithms for efficient real-time detection and depth calculation of obstacles. We designed optical sensors to implement the two algorithms; each comprises a camera and a LASER source. This assembly is connected to a digital signal processor capable of image processing.

The proposal discusses two algorithms, namely:

* Vertical Shift Algorithm
* Horizontal Shift Algorithm

The sensor assemblies are discussed under the respective algorithms. When there is an obstacle, the LASER is obstructed and a LASER point is obtained in the image. The algorithms are based on sensing the shift of this LASER point with the change in the depth of the obstacle from the robot.

II. VERTICAL SHIFT ALGORITHM

In this algorithm the sensor assembly used is shown in Fig. 1. The LASER source and camera are aligned vertically. When there is an obstacle in the path of the robot, the LASER light is obstructed. The camera, which captures frames at regular intervals, records a LASER spot in the image. The centroid of this spot is calculated by the digital signal processor and is used to calculate the distance of the obstacle. With a change in the separation between robot and obstacle, the obtained LASER point shifts in the vertical direction. As shown below, the depth of the obstacle varies inversely with the pixel position of the LASER spot along the Y-axis.

Figure 1. Sensor Assembly for Vertical Shift Algorithm
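The spot-detection step described above can be illustrated with a short sketch. This is a minimal illustration, not the authors' DSP implementation: it assumes a red LASER whose spot is the brightest red region in an RGB frame (the array name `frame` and the threshold values are hypothetical) and returns the spot centroid in pixel coordinates.

```python
import numpy as np

def laser_spot_centroid(frame, red_threshold=200, margin=50):
    """Return (cx, cy), the centroid of the bright LASER spot, or None.

    frame: H x W x 3 uint8 RGB image (hypothetical input array).
    Assumes a red LASER: bright in the red channel, dim in the others.
    """
    red = frame[:, :, 0].astype(np.int16)
    other = np.maximum(frame[:, :, 1], frame[:, :, 2]).astype(np.int16)
    # Keep pixels that are strongly red and clearly redder than they are
    # green or blue, to reject ordinary bright regions.
    mask = (red > red_threshold) & (red - other > margin)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no spot found: the LASER is not being obstructed
    return float(xs.mean()), float(ys.mean())
```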

Referring to Fig. 2 and Fig. 3, where $D_A$ is the actual depth of the obstacle, $D_v$ is the depth of the virtual image plane, and $P_v$ is the half-height of the image in pixels:

$\tan(\theta/2) = P_v / D_v$    (1)

$M = I/O = D_v / D_A = y/h$    (2)

Now, since the resolution of the image is 640 x 480,

$P_v = Y/2 = 480/2 = 240$    (3)

$\tan(\theta/2) = \dfrac{P_v}{M D_A} = \dfrac{P_v\, h}{y\, D_A}$    (4)

$D_A = \dfrac{hY}{2\tan(\theta/2)} \cdot \dfrac{1}{y} = \dfrac{k}{y}$    (5)



Therefore,

$D_A \propto \dfrac{1}{y}$    (6)

where:

$k = \dfrac{hY}{2\tan(\theta/2)}$ is a constant,

$Y$ is the maximum pixel value along the Y-axis, $Y = 480$,

$h$ is the vertical separation between the camera and the LASER,

$\theta$ is the angle of view of the camera,

$y$ is the pixel distance of the obtained LASER spot from the center of the image, and

$D_A$ is the actual depth of the obstacle.

Figure 2. Top view of the image plane for analysis

Figure 3. Front view of the sensor assembly and image plane for analysis
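The derivation above reduces to $D_A = k/y$. The following is a minimal sketch of that computation, assuming hypothetical values for the separation $h$ and the angle of view $\theta$ (the paper does not state the ones used):

```python
import math

def depth_vertical_shift(y_pixels, h_cm, theta_deg, Y=480):
    """Depth of the obstacle from the vertical pixel shift, Eqs. (5)-(6).

    y_pixels: pixel distance of the LASER spot from the image center.
    h_cm: vertical camera-LASER separation (assumed value).
    theta_deg: camera angle of view (assumed value).
    Y: maximum pixel value along the Y-axis (480 for a 640x480 image).
    """
    k = h_cm * Y / (2.0 * math.tan(math.radians(theta_deg) / 2.0))
    return k / y_pixels  # D_A = k / y

# Example with assumed parameters h = 10 cm and a 60 degree angle of view:
# a spot 120 pixels from the image center maps to roughly 34.6 cm.
print(depth_vertical_shift(120, h_cm=10.0, theta_deg=60.0))
```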

III. HORIZONTAL SHIFT ALGORITHM

In this algorithm the sensor assembly used is shown in Fig. 4(a). The LASER source and camera are aligned horizontally. When there is an obstacle in the path of the robot, the LASER light is obstructed. The camera, which captures frames at regular intervals, records a LASER spot in the image. The centroid of this spot is calculated by the digital signal processor and is used to calculate the distance of the obstacle. In this assembly, both the camera and the LASER are capable of rotating.


Figure 4(a). Sensor Assembly for Horizontal Shift Algorithm (camera and LASER mounted on stepper motors)

It can be shown from Fig. 4(b) that the point of intersection of the axes of the camera and LASER follows the trajectory:

$x^2 + y^2 + y\,d\cot(\alpha_L + \alpha_C) - x\,d = 0$    (7)

Figure 4(b). Trajectory of the point of intersection of the axes of the LASER and camera

Now, depending upon the position of the obstacle, three cases arise (the sketch below shows how Eq. (7) separates them):

* Obstacle inside the trajectory.
* Obstacle outside the trajectory.
* Obstacle on the trajectory.
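Equation (7) makes the three cases easy to test numerically. The following is a minimal sketch, assuming planar coordinates with the camera at the origin and the LASER at $(d, 0)$ as in Fig. 4(b); the sign convention (negative left-hand side means inside the circular trajectory) is inferred from the circle geometry rather than stated in the paper.

```python
import math

def classify_obstacle(x, y, d, alpha_l_deg, alpha_c_deg, tol=1e-6):
    """Classify an obstacle at (x, y) against the trajectory of Eq. (7).

    The locus x^2 + y^2 + y*d*cot(alpha_L + alpha_C) - x*d = 0 is a circle
    through the camera (0, 0) and the LASER (d, 0). The left-hand side is
    negative for points inside that circle and positive for points outside.
    """
    cot = 1.0 / math.tan(math.radians(alpha_l_deg + alpha_c_deg))
    f = x * x + y * y + y * d * cot - x * d
    if abs(f) < tol:
        return "on the trajectory"
    return "inside the trajectory" if f < 0 else "outside the trajectory"

# With d = 30 cm and alpha_L = alpha_C = 75 deg, a point midway between
# the camera and the LASER lies inside the trajectory:
print(classify_obstacle(15.0, 10.0, 30.0, 75.0, 75.0))
```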

For the first case, the LASER spot is obtained on the left half of the image plane. This is shown in Fig. 7(c), and it can be shown by triangulation in Fig. 5 that the depth of the obstacle is given as:

$D_A = d\,\dfrac{\sin(\alpha_L)\,\sin(\alpha_C - \beta)}{\sin(\alpha_L + \alpha_C - \beta)}$    (8)

where:

$d$ is the horizontal separation between the camera and LASER,

$\alpha_L$ is the angle which the axis of the LASER makes with the horizontal at any instant,

$\alpha_C$ is the angle which the axis of the camera makes with the horizontal at any instant, and

$\beta$ is the angle which the obtained LASER spot makes with the axis of the camera.

Figure 5. Analysis for obstacle inside the trajectory

For the second case, the LASER spot is obtained on the right half of the image plane. This is shown in Fig. 7(b), and it can be shown by triangulation in Fig. 6 that the depth of the obstacle is given as:

$D_A = d\,\dfrac{\sin(\alpha_L)\,\sin(\alpha_C + \beta)}{\sin(\alpha_L + \alpha_C + \beta)}$    (9)

Figure 6. Analysis for obstacle outside the trajectory

For the third case, the LASER spot is obtained at the center of the image plane. This is shown in Fig. 7(a), and it can be shown by triangulation that the depth of the obstacle is given as:

$D_A = d\,\dfrac{\sin(\alpha_L)\,\sin(\alpha_C)}{\sin(\alpha_L + \alpha_C)}$    (10)

Figure 7. Position of the LASER spot in the image depending upon the depth of the obstacle.
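Equations (8)-(10) differ only in the sign of $\beta$, so they can be implemented as one function. The following is a minimal sketch; treating $\beta$ as signed (negative when the spot is on the left half of the image, zero at the center, positive on the right half) is a compaction of the paper's three unsigned formulas, and the example values are assumptions.

```python
import math

def depth_horizontal_shift(d, alpha_l_deg, alpha_c_deg, beta_deg):
    """Depth of the obstacle by triangulation, Eqs. (8)-(10).

    d: horizontal camera-LASER separation.
    alpha_l_deg, alpha_c_deg: LASER and camera axis angles from horizontal.
    beta_deg: signed spot angle from the camera axis; negative for the
    left half of the image (Eq. 8), zero at the center (Eq. 10), and
    positive for the right half (Eq. 9).
    """
    a_l = math.radians(alpha_l_deg)
    a_cb = math.radians(alpha_c_deg + beta_deg)  # effective camera angle
    return d * math.sin(a_l) * math.sin(a_cb) / math.sin(a_l + a_cb)

# With the setup reported in the results (d = 30 cm, 75 deg axes), a spot
# at the image center gives the on-trajectory depth of Eq. (10):
print(depth_horizontal_shift(30.0, 75.0, 75.0, 0.0))  # about 56 cm
```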

IV. RESULTS

The data from the sensor assembly were fed into a computer and processed using MATLAB. The following results were obtained.

Results of the vertical shift algorithm:

Figure 8. Plot of pixel distance of LASER spot vs. actual depth of object.


Figure 9. Plot of calculated distance of object vs. actual distance of object.

TABLE I. DATA SHOWING THE RANGE OF VALUES OF THE CONSTANT 'k' AND SHOWING THE ERROR IN THE CALCULATED DISTANCE BY USING k = 9055
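Table I indicates that $k$ was measured at several known depths and a single representative value ($k = 9055$) then reused for depth calculation. The following is a minimal sketch of one way such a constant could be fitted, with entirely hypothetical calibration arrays; it is an illustration, not the authors' procedure.

```python
import numpy as np

# Hypothetical calibration data: known depths (cm) and the measured pixel
# offsets y of the LASER spot from the image center at each depth.
depths_cm = np.array([30.0, 45.0, 60.0, 75.0, 90.0])
y_pixels = np.array([300.0, 200.0, 151.0, 122.0, 101.0])

# The model D = k / y is linear in 1/y, so the least-squares estimate of
# k minimizes sum((D_i - k / y_i)^2) and has a closed form:
k = float(np.sum(depths_cm / y_pixels) / np.sum(1.0 / y_pixels ** 2))
print(f"fitted k = {k:.0f} cm*px")

# The fitted constant then converts a new pixel reading into a depth.
print(f"y = 115 px -> depth = {k / 115.0:.1f} cm")
```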

Results of the horizontal shift algorithm:

After locating the centroid of the LASER spot correctly in the image, the next task was to accurately calculate the distance of the object. For this, the value of the constant is calculated first. The table below shows the results obtained for $\alpha_L = 75$ deg, $\alpha_C = 75$ deg, and $d = 30$ cm.

TABLE II. DATA SHOWING THE RANGE OF VALUES OF THE CALCULATED DISTANCE "D" USING THE HORIZONTAL SHIFT ALGORITHM

Now the plot of the above obtained values, as obtained in MATLAB, is as shown:

Figure 10. Plot of calculated distance of object vs. actual distance of object.

Figure 11. Plot of the value of the constant vs. actual distance of object.


V. CONCLUSION

The following comparison can be made between the two algorithms:

* In this paper we have calculated the real-time depth of an obstacle from the robot to an accuracy of 98.93% over the range of 20 cm to 105 cm using the vertical shift algorithm. The range of depth calculation can be increased or decreased by varying the distance between the camera and LASER in the vertical shift sensor assembly, that is, "h" in our case. In the case of the horizontal shift algorithm, however, there is no constraint on the range of depth calculation, without varying any parameters of its sensor assembly.

* The sensor assembly of the horizontal shift algorithm is more complex, since it requires two stepper motors rotating simultaneously at the same revolutions per minute. This is not the case with the vertical shift algorithm, where only one stepper motor is used for calculating the same depth.

Applications of these algorithms other than depth calculation are:

* Autonomous navigation of vehicles.
* Calculation of the speed of an approaching object (see the sketch below).
* Detection and counting of objects.
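As a sketch of the speed-of-approach application: successive depth readings from either algorithm can be differenced over the frame interval. The frame interval and the depth values below are assumptions.

```python
def approach_speed(depths_cm, dt_s):
    """Estimate the approach speed (cm/s) of an object from successive
    depth readings taken dt_s seconds apart (assumed frame interval).

    A positive result means the object is getting closer. Averaging the
    per-frame differences gives a simple smoothed estimate.
    """
    diffs = [(depths_cm[i] - depths_cm[i + 1]) / dt_s
             for i in range(len(depths_cm) - 1)]
    return sum(diffs) / len(diffs)

# Depths shrinking from 100 cm to 84 cm over four 0.2 s intervals:
print(approach_speed([100.0, 96.0, 92.0, 88.0, 84.0], dt_s=0.2))  # 20.0 cm/s
```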

