
Rover Navigation and Visual Odometry: A New Framework for Exploration Activities


Page 1: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Rover Navigation and Visual Odometry: A New Framework for Exploration Activities

Enrica Zereik, Enrico Simetti, Alessandro Sperindé, Sandro Torelli, Fabio Frassinelli, Davide Ducco and Giuseppe Casalino

GRAAL Lab, DIST, University of Genoa

ICRA Planetary Rover Workshop

Anchorage, Alaska, 3 May 2010

Page 2: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• to deal with the underlying specific hardware platform

• to solve problems related to real-time constraints of control systems

• to provide data-unaware communication mechanisms

• to be reused for different control systems in several applications

Develop a software architecture that lets researchers focus their attention on the control algorithm only, without caring about the underlying physical system


Why a Framework?

Page 3: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Minimization of the number of code lines not strictly related to the control algorithm

• Standard communication mechanism between control tasks (minimum impact on the algorithm)

• Capability of coordination between remote frameworks

Main Objectives

• Independence of each control algorithm from the underlying software platform


Page 4: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Minimization of the number of code lines not strictly related to the control algorithm

• Standard communication mechanism between control tasks (minimum impact on the algorithm)

• Capability of coordination between remote frameworks

KAL

Abstraction Levels

• Independence of each control algorithm from the underlying software platform


Page 5: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Capability of coordination between remote frameworks

• Minimization of the number of code lines not strictly related to the control algorithm

• Standard communication mechanism between control tasks (minimum impact on the algorithm)

WF

Abstraction Levels

• Independence of each control algorithm from the underlying software platform


Page 6: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Capability of coordination between remote frameworks

• Minimization of the number of code lines not strictly related to the control algorithm

• Standard communication mechanism between control tasks (minimum impact on the algorithm)

Abstraction Levels

• Independence of each control algorithm from the underlying software platform

BBS


Page 7: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• WorkFrame Name Server: abstraction of the OS resources and services

KAL: Kernel Abstraction Layer
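
As a purely illustrative sketch of what such a kernel abstraction layer can look like (class and method names below are hypothetical and assume a POSIX target, not the actual framework API), control tasks would only see thin wrappers like these, while the OS calls stay confined to their implementation:

#include <pthread.h>

namespace kal {

// Minimal KAL-style wrappers: the control code never calls the OS directly.
class Mutex {
public:
    Mutex()  { pthread_mutex_init(&m_, nullptr); }
    ~Mutex() { pthread_mutex_destroy(&m_); }
    void lock()   { pthread_mutex_lock(&m_); }
    void unlock() { pthread_mutex_unlock(&m_); }
private:
    pthread_mutex_t m_;
};

class Thread {
public:
    // Run a task body in its own thread; swapping the OS only changes this file.
    Thread(void (*body)(void*), void* arg) {
        pthread_create(&thread_, nullptr, &Thread::trampoline, new Call{body, arg});
    }
    void join() { pthread_join(thread_, nullptr); }
private:
    struct Call { void (*body)(void*); void* arg; };
    static void* trampoline(void* p) {
        Call* call = static_cast<Call*>(p);
        call->body(call->arg);
        delete call;
        return nullptr;
    }
    pthread_t thread_;
};

}  // namespace kal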


Page 8: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• System Manager: resource request handling
• Sched: Rel Sched can synchronize frameworks
• Logger: communication toward the user

WF: WorkFrame

Page 9: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Inter-task communication

• Resource access
• Both local and remote tasks
• Shared BlackBoard publishing data

• Local execution of computation involving BB data

BBS: BlackBoard System
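
The blackboard idea can be sketched as a thread-safe store that tasks publish data into and read from. The snippet below is only a minimal local illustration (key names and types are hypothetical); the real BBS also reaches remote tasks and can execute computations directly on the published data:

#include <map>
#include <mutex>
#include <string>
#include <vector>

// Minimal blackboard: tasks publish named data blocks, other tasks read them.
class BlackBoard {
public:
    void publish(const std::string& key, const std::vector<double>& value) {
        std::lock_guard<std::mutex> lock(mtx_);   // short critical section
        data_[key] = value;
    }

    bool read(const std::string& key, std::vector<double>& out) const {
        std::lock_guard<std::mutex> lock(mtx_);
        auto it = data_.find(key);
        if (it == data_.end()) return false;
        out = it->second;
        return true;
    }

private:
    mutable std::mutex mtx_;
    std::map<std::string, std::vector<double>> data_;
};

For instance, a visual odometry task could publish its latest pose estimate under a key such as "vo/pose" while the navigation task reads it at its own rate (the key name is of course only an example).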

Page 10: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

(Diagram: the framework layers, numbered 1 to 4, covering resources and scheduling, device I/O, inter-process data sharing (also with remote tasks), network communication and C++ math routines, with mutually exclusive access where needed.)

Framework Hierarchy

Page 11: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Feature Extraction: in each image of the stereo pair

• Stereo Matching: correspondence search
• Triangulation: computation of the corresponding 3D points

• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

Visual Odometry Module

• Tracking in Time: tracking the same features in the following image acquisition


Page 12: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Visual Odometry Module

• Feature Extraction: in each image of the stereo pair

• Triangulation: computation of the corresponding 3D points

• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

• Tracking in Time: tracking the same features in the following image acquisition

• Stereo Matching: correspondence search

LoG filtering + SURF (robust descriptors)
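
As a rough OpenCV-based illustration of this stage (not the authors' code, and assuming a build that ships SURF in the xfeatures2d module), one possible reading of "LoG filtering + SURF" is to smooth, apply the Laplacian and then detect and describe SURF features:

#include <opencv2/imgproc.hpp>
#include <opencv2/xfeatures2d.hpp>   // SURF lives here in recent OpenCV builds
#include <vector>

// Detect SURF keypoints and descriptors on a LoG-filtered grayscale image.
void extractFeatures(const cv::Mat& gray,
                     std::vector<cv::KeyPoint>& keypoints,
                     cv::Mat& descriptors)
{
    // Laplacian of Gaussian: smooth first, then apply the Laplacian.
    cv::Mat smoothed, logImg;
    cv::GaussianBlur(gray, smoothed, cv::Size(5, 5), 1.5);
    cv::Laplacian(smoothed, logImg, CV_16S);
    cv::convertScaleAbs(logImg, logImg);          // back to 8-bit for the detector

    // SURF provides the robust descriptors used for matching and tracking.
    cv::Ptr<cv::xfeatures2d::SURF> surf = cv::xfeatures2d::SURF::create(400.0);
    surf->detectAndCompute(logImg, cv::noArray(), keypoints, descriptors);
}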


Page 13: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Feature Extraction: in each image of the stereo pair

• Triangulation: computation of the corresponding 3D points

• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

• Tracking in Time: tracking the same features in the following image acquisition

Visual Odometry Module

• Stereo Matching: correspondence search

Epipolar constraint, descriptor-based
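
A minimal sketch of descriptor-based matching pruned by the epipolar constraint, assuming rectified images so that corresponding features lie on (nearly) the same row; the brute-force matcher and the row tolerance are illustrative choices, not necessarily those of the module:

#include <opencv2/features2d.hpp>
#include <cmath>
#include <vector>

// Keep only descriptor matches that respect the (rectified) epipolar constraint:
// same image row up to a tolerance, and positive disparity.
std::vector<cv::DMatch> stereoMatch(const cv::Mat& descLeft,
                                    const cv::Mat& descRight,
                                    const std::vector<cv::KeyPoint>& kpLeft,
                                    const std::vector<cv::KeyPoint>& kpRight,
                                    float rowTolerance = 2.0f)
{
    cv::BFMatcher matcher(cv::NORM_L2);           // L2 norm suits SURF descriptors
    std::vector<cv::DMatch> raw, filtered;
    matcher.match(descLeft, descRight, raw);

    for (const cv::DMatch& m : raw) {
        const cv::Point2f& pl = kpLeft[m.queryIdx].pt;
        const cv::Point2f& pr = kpRight[m.trainIdx].pt;
        bool sameRow = std::abs(pl.y - pr.y) <= rowTolerance;
        bool positiveDisparity = pl.x > pr.x;
        if (sameRow && positiveDisparity)
            filtered.push_back(m);
    }
    return filtered;
}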


Page 14: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Feature Extraction: in each image of the stereo pair

• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

• Tracking in Time: tracking the same features in the following image acquisition

Visual Odometry Module

• Stereo Matching: correspondence search
• Triangulation: computation of the corresponding 3D points

Subject to errors; outliers are rejected
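
For a calibrated, rectified stereo pair the 3D point follows from the standard disparity relation; the sketch below uses placeholder intrinsics (focal length f, baseline b, principal point cx, cy) and leaves the outlier rejection mentioned on the slide to the caller:

#include <opencv2/core.hpp>

// Triangulate a matched feature from a rectified stereo pair.
// (uL, v): pixel in the left image, uR: column of the match in the right image,
// f: focal length [px], b: baseline [m], (cx, cy): principal point [px].
// Matches with near-zero disparity should be rejected as outliers upstream.
cv::Point3d triangulate(double uL, double v, double uR,
                        double f, double b, double cx, double cy)
{
    double disparity = uL - uR;            // > 0 for a valid match
    double Z = f * b / disparity;          // depth along the optical axis
    double X = (uL - cx) * Z / f;
    double Y = (v - cy) * Z / f;
    return cv::Point3d(X, Y, Z);
}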


Page 15: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Feature Extraction: in each image of the stereo pair

• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

• Triangulation: computation of the corresponding 3D points

Visual Odometry Module

• Tracking in Time: tracking the same features in the following image acquisition

No external estimation, descriptor-based

• Stereo Matching: correspondence search


Page 16: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

• Feature Extraction: in each image of the stereo pair

• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

• Stereo Matching: correspondence search
• Triangulation: computation of the corresponding 3D points

• Tracking in Time: tracking the same features in the following image acquisition

Least Squares (outlier rejection, initial estimation) + Maximum Likelihood Estimation

Visual Odometry Module
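
The slide does not spell out the algorithm; a common way to obtain the least-squares initial estimate between the two triangulated point sets is the SVD-based closed form sketched below, with RANSAC-style outlier rejection and the maximum-likelihood refinement wrapped around it. Take it as one plausible realization, not the module's actual code:

#include <opencv2/core.hpp>
#include <vector>

// Least-squares rigid motion (R, t) with q_i ~ R * p_i + t, from matched
// 3D points of two successive stereo acquisitions (closed form via SVD).
void estimateMotionLS(const std::vector<cv::Point3d>& p,
                      const std::vector<cv::Point3d>& q,
                      cv::Mat& R, cv::Mat& t)
{
    const int n = static_cast<int>(p.size());

    // Centroids of the two point clouds.
    cv::Point3d cp(0, 0, 0), cq(0, 0, 0);
    for (int i = 0; i < n; ++i) { cp += p[i]; cq += q[i]; }
    cp *= 1.0 / n;
    cq *= 1.0 / n;

    // Cross-covariance of the centered points.
    cv::Mat H = cv::Mat::zeros(3, 3, CV_64F);
    for (int i = 0; i < n; ++i) {
        cv::Mat a = (cv::Mat_<double>(3, 1) << q[i].x - cq.x, q[i].y - cq.y, q[i].z - cq.z);
        cv::Mat b = (cv::Mat_<double>(3, 1) << p[i].x - cp.x, p[i].y - cp.y, p[i].z - cp.z);
        H += a * b.t();
    }

    // Rotation from the SVD, with the usual reflection correction.
    cv::SVD svd(H);
    R = svd.u * svd.vt;
    if (cv::determinant(R) < 0) {
        cv::Mat D = cv::Mat::eye(3, 3, CV_64F);
        D.at<double>(2, 2) = -1.0;
        R = svd.u * D * svd.vt;
    }

    // Translation from the centroids.
    cv::Mat cpM = (cv::Mat_<double>(3, 1) << cp.x, cp.y, cp.z);
    cv::Mat cqM = (cv::Mat_<double>(3, 1) << cq.x, cq.y, cq.z);
    t = cqM - R * cpM;
}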

Page 17: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Experimental Setup

• Custom mobile platform @ GRAAL
• Tricycle-like structure
• Bumblebee2 stereo camera system

Page 18: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Preliminary Results

Page 19: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

GRAAL Lab, DIST, University of Genoa
European Space Agency
Thales Alenia Space, Italy

Enrica Zereik, Andrea Sorbara, Andrea Merlo, Frederic Didot and Giuseppe Casalino

Robotic Crew Assistant for Exploration Missions: Vision, Force Control and Coordination Strategies


Page 20: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Eurobot Wet Model

Page 21: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

(Photo callouts: JR3 force/torque sensor; 7 d.o.f. arms, one camera on each; four-wheeled rover for autonomous navigation; pan/tilt stereo cameras for rover navigation; pan/tilt stereoscopic head for manipulation; exchangeable end-effector; arm cameras.)

Eurobot Ground Prototype

Page 22: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Coordination
• Coordination of Rover and Arms
• Dynamic Programming-based strategy

Vision
• Object recognition and centering
• ARToolKitPlus and OpenCV support

Force
• Approaching and actual grasping
• Contact detection

EGP - Control Aspects

Page 23: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Dynamic Programming-based
• Coordination of robotic macro-structures
• Independent of the specific system configuration
• Many different control objectives can be required

A_i(\dot y) = \min_{\dot q_i} \left\| \dot x_i - S_i \dot y - J_i \dot q_i \right\|^2 , \qquad \dot q_i^{\,o} = \arg\min_{\dot q_i} \left\| \dot x_i - S_i \dot y - J_i \dot q_i \right\|^2

with \left\| \dot x_i - S_i \dot y - J_i \dot q_i \right\|^2 the velocity control task requirement, A_i the associated cost-to-go and \dot y the moving platform velocity.

General Control Architecture with Priority Tasks


Page 24: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Task i-th:

General Control Architecture with Priority Tasks

(Slide equations: the closed-form solution of the i-th priority task, of the form \dot q_i^{\,o} = h_i + P_i \dot y + Q_i z_i, together with the recursive updates of h_i, P_i and of the null-space projector Q_i in terms of the task Jacobians and their pseudoinverses.)
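
Since the slide's own expressions are not recoverable from this extraction, the following is only the textbook prioritized-task recursion (null-space projection form), given as a reference point; the deck's DP-based formulation addresses the same prioritized inversion problem in a distributed fashion:

\dot q_1 = J_1^{\#} \dot x_1 , \qquad \dot q_i = \dot q_{i-1} + \left( J_i N_{i-1} \right)^{\#} \left( \dot x_i - J_i \dot q_{i-1} \right) , \qquad N_i = N_{i-1} - \left( J_i N_{i-1} \right)^{\#} J_i N_{i-1} , \quad N_0 = I

where J_i^{\#} is the pseudoinverse of the i-th task Jacobian and N_i projects onto the null space of the first i tasks.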


Page 25: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

(Slide equations, backward phase: the kinematic relationships between the arm velocities, v_2 = J_2 \dot q_2, and the platform velocity are used to propagate the costs C_a and C_p down to the platform level, while monitoring the MM tendency toward 0.)

Backward Phase

Page 26: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Remarks
• The risk of MM losses still exists (e.g. if the object must be lifted very high)
• If a MM loss is detected, the last resort solution is modulating the platform velocity
• Implicit Priority Change

(Slide equations, forward phase: the optimal platform velocity v is propagated back through the chain, recovering the arm velocities \dot q_2 and v_2 from the functions stored during the backward phase.)

Forward Phase

Page 27: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

(Slide equations, implicit priority change: the backward phase is repeated at the platform level, and the commanded platform velocity is obtained as a weighted combination of the nominal velocity v^o and the velocity \hat v.)

Implicit Priority Change

Page 28: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Vision-based Recognition of Objects

Marker-based object tracking
• Reliability
• Robustness

Occurring problems
• Lighting conditions
• Complexity of the captured scene
• Distance from which the marker is seen

• Preliminary thresholding
• Image cleaning
• Image zooming
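
A rough OpenCV illustration of those three steps, with Otsu's method standing in for the automatic thresholding, a morphological opening for the cleaning, and a crop-and-resize around the expected marker region for the zooming; the actual ARToolKitPlus-based chain may differ:

#include <opencv2/imgproc.hpp>

// Illustrative pre-processing before marker detection:
// 1) automatic thresholding, 2) cleaning, 3) zooming on a region of interest.
cv::Mat preprocessForMarker(const cv::Mat& gray, const cv::Rect& roi)
{
    // 1) Automatic (Otsu) binarization, robust to global lighting changes.
    cv::Mat binary;
    cv::threshold(gray, binary, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);

    // 2) Morphological opening removes isolated noise blobs.
    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3));
    cv::Mat cleaned;
    cv::morphologyEx(binary, cleaned, cv::MORPH_OPEN, kernel);

    // 3) "Zooming": crop the expected marker region and upsample it, so that
    //    far-away markers still expose enough pixels to the detector.
    cv::Mat zoomed;
    cv::resize(cleaned(roi), zoomed, cv::Size(), 2.0, 2.0, cv::INTER_LINEAR);
    return zoomed;
}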


Page 29: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

(Diagram: image from camera → autothreshold → image cleaning → image zooming → pose estimation → LPF → pose estimator → to E-GNC.)

Image Processing Chain
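
The LPF block could be as simple as a first-order (exponential) filter on the raw pose estimate. The sketch below is an assumption about that block, with a hypothetical smoothing factor alpha and no handling of angle wrap-around:

#include <array>

// First-order low-pass filter on a 6-DOF pose (x, y, z, roll, pitch, yaw).
// alpha in (0, 1]: smaller values smooth more but react more slowly.
class PoseLPF {
public:
    explicit PoseLPF(double alpha) : alpha_(alpha), initialized_(false) {}

    std::array<double, 6> update(const std::array<double, 6>& raw) {
        if (!initialized_) {              // seed the filter with the first sample
            state_ = raw;
            initialized_ = true;
        } else {
            for (int i = 0; i < 6; ++i)
                state_[i] = alpha_ * raw[i] + (1.0 - alpha_) * state_[i];
        }
        return state_;
    }

private:
    double alpha_;
    bool initialized_;
    std::array<double, 6> state_{};
};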

Page 30: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

(Plots: angular error after zooming and after LPF; linear error after zooming and after LPF.)

Implicit Priority Change

Page 31: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Direct Force Control Strategy
• Detect a contact with the object to be grasped
• Compensate residual errors

Pure Force Only at the Palm Level
• felt by the JR3 sensor
• the contact point must belong to the palm surface

(Slide relations: the force f_s and torque \tau_s measured by the sensor are linked to the position r of the contact point on the palm, whose offset from the sensor is known and constant.)

Force-based Approach towards Objects


Page 32: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Velocity Generation
• Contact point estimation
• Velocity assigned to the estimated contact point

• Compute velocity reference with respect to the robot end-effector

Remarks
• Noisy sensor and too long a distance from the palm
• Initial error very small thanks to vision

\dot x_c = K \, (F_s^{*} - F_s), with F_s the force measured by the sensor.

Force-based Approach towards Objects

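
In the spirit of the slide (the estimated contact point is driven with a velocity proportional to the force error), a toy admittance-style sketch could look as follows; the gain and the desired contact force are placeholders, not the EGP values:

#include <array>

// Toy velocity generation at the contact point: v = K * (F_desired - F_measured).
// The result would then be mapped to an end-effector velocity reference.
std::array<double, 3> contactVelocity(const std::array<double, 3>& forceMeasured,
                                      const std::array<double, 3>& forceDesired,
                                      double gain)
{
    std::array<double, 3> v{};
    for (int i = 0; i < 3; ++i)
        v[i] = gain * (forceDesired[i] - forceMeasured[i]);
    return v;
}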

Page 33: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities


Simulation Results

Page 34: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Experimental Results

Video
• 3. EGP Failed Equipment Replacement.avi

Page 35: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

EGP
• Effective and autonomous robotic crew assistant
• Marker removal
• Potentially, a flight model

Planetary Rovers
• Visual Odometry error less than 1%
• 3D reconstruction of the environment
• DEM construction and autonomous navigation

Conclusions and Future Work


Page 36: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

EGP
[1] T. Kröger, D. Kubus and F. M. Wahl, “6D Force and Acceleration Sensor Fusion for Compliant Manipulation Control”, IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October 2006.
[2] B. J. Waibel and H. Kazerooni, “Theory and Experiments on the Stability of Robot Compliance Control”, IEEE Transactions on Robotics and Automation, February 1991, vol. 7, no. 1, pp. 95-104.
[3] G. Bradski and A. Kaehler, “Learning OpenCV: Computer Vision with the OpenCV Library”, O'Reilly.
[4] C. P. Lu, G. D. Hager and E. Mjolsness, “Fast and Globally Convergent Pose Estimation From Video Images”, IEEE Transactions on Pattern Analysis and Machine Intelligence, June 2000, vol. 22, no. 6, pp. 610-622.

References, I

Page 37: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

[5] B. Kainz and M. Streit, “How to Write an Application with Studierstube 4.0”, Technical report, Graz University of Technology, 2006.
[6] J. Cai, “Seminar Report: Augmented Reality: the Studierstube Project”, Seminar report.
[7] E. Zereik, A. Sorbara, G. Casalino and F. Didot, “Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Surface Operations: Force/Vision-Guided Grasping”, International Conference on Recent Advances in Space Technologies, Istanbul, Turkey, June 2009.
[8] E. Zereik, A. Sorbara, G. Casalino and F. Didot, “Force/Vision-Guided Grasping for an Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Space Exploration Missions”, International Conference on Automation Robotics and Control Systems, Orlando, USA, July 2009.

References, II

Page 38: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

[9] G. Casalino and A. Turetta, “Coordination and Control of Multiarm Nonholonomic Mobile Manipulators”, MISTRAL: Methodologies and Integration of Subsystems and Technologies for Robotic Architectures and Locomotion, B. Siciliano, G. Casalino, A. De Luca, C. Melchiorri, Springer Tracts in Advanced Robotics, Springer-Verlag, April 2004.
[10] G. Casalino, A. Turetta and A. Sorbara, “Dynamic Programming based Computationally Distributed Kinematic Inversion Technique”, Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November 2006.
[11] G. Casalino, A. Turetta and A. Sorbara, “DP-Based Distributed Kinematic Inversion for Complex Robotic Systems”, 7th Portuguese Conference on Automatic Control, Lisbon, Portugal, September 2006.
[12] E. Zereik, “Space Robotics Supporting Exploration Missions: Vision, Force Control and Coordination Strategies”, Ph.D. Thesis, University of Genova, 2010.

References, III

Page 39: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Visual Odometry
[1] M. Maurette and E. Baumgartner, “Autonomous Navigation Ability: FIDO Test Results”, 6th ESA Workshop on Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November 2000.
[2] M. Maimone, Y. Cheng and L. H. Matthies, “Two Years of Visual Odometry on the Mars Exploration Rovers”, Journal of Field Robotics, March 2007, vol. 24, no. 3, pp. 169-186.
[3] A. E. Johnson, S. B. Goldberg, Y. Cheng and L. H. Matthies, “Robust and Efficient Stereo Feature Tracking for Visual Odometry”, IEEE International Conference on Robotics and Automation, Pasadena, USA, May 2008.
[4] L. Matthies, “Dynamic Stereo Vision”, Ph.D. Thesis, Carnegie Mellon University.

References, IV

Page 40: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

[5] D. Nistér, O. Naroditsky and J. Bergen, “Visual Odometry”, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, USA, June 2004.
[6] R. Hartley and A. Zisserman, “Multiple View Geometry in Computer Vision”, Cambridge University Press, March 2004.
[7] E. Trucco and A. Verri, “Introductory Techniques for 3-D Computer Vision”, Prentice Hall, 1998.
[8] I. J. Cox, S. L. Hingorani, S. B. Rao and B. M. Maggs, “A Maximum Likelihood Stereo Algorithm”, Journal of Computer Vision and Image Understanding, 1996, vol. 63, no. 3, pp. 542-567.
[9] M. Fischler and R. Bolles, “Random Sample Consensus: a Paradigm for Model Fitting with Application to Image Analysis and Automated Cartography”, Communications of the Association for Computing Machinery, June 1981, vol. 24, pp. 381-395.

References, V

Page 41: Rover  Navigation  and  Visual Odometry : A New  Framework for Exploration Activities

Thank you for your kind attention!
