Rover Navigation and Visual Odometry: A New Framework for Exploration Activities
Enrica Zereik, Enrico Simetti, Alessandro Sperindé, Sandro Torelli, Fabio Frassinelli, Davide Ducco and Giuseppe Casalino
GRAAL Lab, DIST, University of Genoa
ICRA Planetary Rover Workshop
Anchorage, Alaska, 3 May 2010
• to deal with the underlying specific hardware platform
• to solve problems related to real-time constraints of control systems
• to provide data-unaware communication mechanisms
• to be reused for different control systems in several applications

Develop a software architecture that lets researchers focus their attention on the control algorithm only, without caring about the underlying physical system.
Why a Framework?
Main Objectives
• Independence of each control algorithm from the underlying software platform
• Minimization of the number of code lines not strictly related to the control algorithm
• Standard communication mechanism between control tasks (minimum impact on the algorithm)
• Capability of coordination between remote frameworks
Abstraction Levels
• Independence of each control algorithm from the underlying software platform
• Minimization of the number of code lines not strictly related to the control algorithm
• Standard communication mechanism between control tasks (minimum impact on the algorithm)
• Capability of coordination between remote frameworks

Three abstraction levels address these objectives: KAL, WF and BBS.
KAL: Kernel Abstraction Layer
• WorkFrame Name Server: abstraction of the OS resources and services
WF: WorkFrame
• System Manager: resource request handling
• Sched: Rel Sched can synchronize frameworks
• Logger: communication toward the user
BBS: BlackBoard System
• Inter-task communication
• Resource access
• Both local and remote tasks
• Shared BlackBoard publishing data
• Local execution of computation involving BB data
Framework Hierarchy
[Layered diagram of the framework: resources and scheduling, device I/O, mutually exclusive interprocess data sharing (also with remote tasks), network communication, C++ math routines]
Visual Odometry Module
• Feature Extraction: in each image of the stereo pair
• Stereo Matching: correspondence search
• Triangulation: corresponding 3D point computation
• Tracking in Time: tracking the same features in the following image acquisition
• Motion Estimation: estimation of the motion occurred between the two considered stereo image pairs
Visual Odometry Module
• Feature Extraction: LoG filtering + SURF (robust descriptors)
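The LoG filtering step can be sketched in NumPy as below; the kernel size and sigma are illustrative choices, not the values used on the rover, and the naive convolution stands in for whatever optimized filter the module actually uses.

```python
import numpy as np

def log_kernel(size=9, sigma=1.4):
    """Laplacian-of-Gaussian kernel for blob-like feature detection."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x**2 + y**2
    # LoG(x, y) = -1/(pi*sigma^4) * (1 - r^2/(2*sigma^2)) * exp(-r^2/(2*sigma^2))
    k = -(1.0 / (np.pi * sigma**4)) * (1 - r2 / (2 * sigma**2)) * np.exp(-r2 / (2 * sigma**2))
    return k - k.mean()          # zero-mean, so flat image regions give zero response

def filter2d(img, k):
    """Naive 'valid' 2-D filtering (correlation; the LoG kernel is symmetric)."""
    kh, kw = k.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out
```

Extrema of the filtered image mark candidate interest points, which are then described with SURF descriptors.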
Visual Odometry Module
• Stereo Matching: epipolar constraint, descriptor-based
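A minimal sketch of descriptor-based matching under the epipolar constraint: for rectified images the match must lie on (nearly) the same row. The row tolerance and ratio-test threshold are illustrative, and plain vectors stand in for the SURF descriptors.

```python
import numpy as np

def stereo_match(kps_l, desc_l, kps_r, desc_r, row_tol=1.0, ratio=0.8):
    """Match left/right features: epipolar (same-row) constraint plus
    nearest-descriptor search with a Lowe-style ratio test."""
    matches = []
    for i, ((_, yl), dl) in enumerate(zip(kps_l, desc_l)):
        # epipolar constraint: keep only right features on almost the same row
        cand = [j for j, (_, yr) in enumerate(kps_r) if abs(yr - yl) <= row_tol]
        if not cand:
            continue
        dists = sorted((np.linalg.norm(dl - desc_r[j]), j) for j in cand)
        # ratio test rejects ambiguous correspondences
        if len(dists) == 1 or dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches
```

Restricting candidates to the epipolar band both speeds up the search and removes many spurious correspondences before triangulation.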
Visual Odometry Module
• Triangulation: subject to errors; outliers rejected
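For rectified stereo, triangulation reduces to Z = fB/d. The sketch below assumes a pinhole model with focal length f (pixels), baseline B and principal point (cx, cy); the minimum-disparity cutoff is an illustrative rejection rule, since depth error grows rapidly as disparity shrinks.

```python
import numpy as np

def triangulate(uv_left, uv_right, f, baseline, cx, cy, min_disp=0.5):
    """3D points from matched pixel pairs in a rectified stereo rig."""
    pts = []
    for (ul, vl), (ur, _) in zip(uv_left, uv_right):
        d = ul - ur                      # disparity in pixels
        if d < min_disp:                 # reject unusable/outlier disparities
            continue
        z = f * baseline / d             # depth from similar triangles
        x = (ul - cx) * z / f
        y = (vl - cy) * z / f
        pts.append((x, y, z))
    return pts
```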
Visual Odometry Module
• Tracking in Time: no external estimation, descriptor-based
Visual Odometry Module
• Motion Estimation: Least Squares (outlier rejection, initial estimation) + Maximum Likelihood Estimation
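The least-squares stage with outlier rejection can be sketched as below, using Arun's SVD method for the initial rigid fit and a median-based residual test; the ML refinement stage is omitted, and the rejection threshold is an illustrative choice rather than the module's actual rule.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid motion (R, t) with Q ≈ P @ R.T + t (Arun's SVD method)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def estimate_motion(P, Q):
    """Initial LS fit, residual-based outlier rejection, refit on inliers."""
    R, t = rigid_fit(P, Q)
    res = np.linalg.norm(Q - (P @ R.T + t), axis=1)
    inliers = res < max(3.0 * np.median(res), 1e-9)
    return rigid_fit(P[inliers], Q[inliers])
```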
Experimental Setup
• Custom mobile platform @ GRAAL
• Tricycle-like structure
• Bumblebee2 stereo camera system
Preliminary Results
Robotic Crew Assistant for Exploration Missions: Vision, Force Control and Coordination Strategies
Enrica Zereik, Andrea Sorbara, Andrea Merlo, Frederic Didot and Giuseppe Casalino
GRAAL Lab, DIST, University of Genoa; European Space Agency; Thales Alenia Space, Italy
Eurobot Wet Model
Eurobot Ground Prototype
• JR3 force/torque sensor
• 7 d.o.f. arms, one camera on each
• Four-wheeled rover for autonomous navigation
• Pan/tilt stereo cameras for rover navigation
• Pan/tilt stereoscopic head for manipulation
• Exchangeable end-effector
• Arm cameras
EGP - Control Aspects
• Coordination: rover and arms; Dynamic Programming-based strategy
• Vision: object recognition and centering; ARToolKitPlus and OpenCV support
• Force: approaching and actual grasping; contact detection
General Control Architecture with Priority Tasks
Dynamic Programming-based:
• Coordination of robotic macro-structures
• Independent from the specific system configuration
• Many different control objectives can be required
[Slide equation: each velocity control task requirement is minimized in a least-squares sense over the admissible joint velocities, with an associated cost-to-go expressed as a function of the moving platform velocity]
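The slides' distributed DP recursion is not reproduced here; as a reference point, the standard closed-form two-task priority law that such architectures generalize can be sketched as follows (pseudoinverse-plus-null-space form, not the authors' specific algorithm):

```python
import numpy as np

def two_task_priority(J1, x1dot, J2, x2dot):
    """Joint velocities meeting a primary velocity task exactly (when feasible)
    and a secondary task only within the primary task's null space."""
    J1p = np.linalg.pinv(J1)
    q1 = J1p @ x1dot                          # primary task solution
    N1 = np.eye(J1.shape[1]) - J1p @ J1       # null-space projector of task 1
    # secondary task is served without disturbing the primary one
    return q1 + np.linalg.pinv(J2 @ N1) @ (x2dot - J2 @ q1)
```

The DP formulation achieves the same prioritization for chains of subsystems (rover plus arms) while distributing the computation across them.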
General Control Architecture with Priority Tasks
Task i-th:
[Slide equations: recursive expressions propagating each task's cost-to-go and priority terms through the kinematic chain]
Backward Phase
• Use of the kinematic relationships between the subsystems
• Monitoring of the MM (manipulability measure) tendency toward 0
[Slide diagram: backward propagation of the cost-to-go terms from the arms toward the moving platform]
Remarks
• The risk of MM losses still exists (e.g. if the object must be lifted very high)
• If a MM loss is detected, the last resort solution is modulating the commanded velocity
• Implicit Priority Change

Forward Phase
[Slide diagram: forward propagation of the resulting velocities from the platform through the arms]
Implicit Priority Change
• Backward phase repeated at the platform level
[Slide equations: the platform velocity reference is blended between the nominal and the backward-phase value, smoothly trading task priorities when a manipulability loss approaches]
Vision-based Recognition of Objects
Marker-based object tracking:
• Reliability
• Robustness
Occurring problems:
• Lighting conditions
• Complexity of the captured scene
• Distance from which the marker is seen
Countermeasures: preliminary thresholding, image cleaning, image zooming
Image Processing Chain
[Block diagram: image from camera → autothreshold → image cleaning → image zooming → pose estimation → low-pass filter (LPF) → Pose Estimator → to E-GNC]
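The autothreshold block presumably computes a global threshold automatically; a common choice for that (assumed here, not confirmed by the slides) is Otsu's method over the 8-bit histogram:

```python
import numpy as np

def otsu_threshold(img):
    """Global threshold maximizing the between-class variance (Otsu's method)."""
    hist = np.bincount(np.asarray(img, dtype=np.int64).ravel(), minlength=256)
    p = hist / hist.sum()
    omega = np.cumsum(p)                       # probability of class 0 at each t
    mu = np.cumsum(p * np.arange(256))         # cumulative mean of class 0
    mu_t = mu[-1]                              # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0       # empty classes contribute nothing
    return int(np.argmax(sigma_b))             # pixels <= t are background
```

An automatic threshold adapts the marker segmentation to the varying lighting conditions listed among the occurring problems.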
[Plots: angular and linear error, after zooming and after the LPF]
Force-based Approach towards Objects
Direct Force Control Strategy:
• Detect a contact with the object to be grasped
• Compensate residual errors
Pure Force Only at the Palm Level:
• Felt by the JR3 sensor
• The contact point must belong to the palm surface
[Slide equation: the sensed force f_s and torque τ_s at the sensor are mapped to the palm through a known and constant transformation]
Force-based Approach towards Objects
Velocity Generation:
• Contact point estimation
• Velocity assigned to the estimated contact point
• Compute the velocity reference with respect to the robot end-effector
Remarks:
• Noisy sensor and too long a distance from the palm
• Initial error very small thanks to vision
[Slide equation: the contact-point velocity is generated from the force error, ẋ_c = K (F_s* − F_s), with F_s built from the sensed force f_s]
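A minimal sketch of the slide's force-error law ẋ_c = K(F_s* − F_s); the gain and the noise deadband below are illustrative values, not the EGP's tuning, and the deadband is one plausible way to cope with the noisy sensor mentioned in the remarks:

```python
import numpy as np

def contact_velocity(F_meas, F_ref, K=0.002, deadband=0.5):
    """Cartesian velocity reference for the estimated contact point from the
    force error; a deadband suppresses sensor noise near the reference force."""
    err = np.asarray(F_ref, dtype=float) - np.asarray(F_meas, dtype=float)
    err = np.where(np.abs(err) < deadband, 0.0, err)  # ignore noise-level errors
    return K * err
```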
Simulative Results
Experimental Results
Conclusions and Future Work
Video: 3. EGP Failed Equipment Replacement.avi
EGP:
• Effective and autonomous robotic crew assistant
• Marker removal
• Potentially, a flight model
Planetary Rovers:
• Visual Odometry error less than 1%
• 3D reconstruction of the environment
• DEM construction and autonomous navigation
References, I
EGP:
[1] T. Kröger, D. Kubus and F. M. Wahl, "6D Force and Acceleration Sensor Fusion for Compliant Manipulation Control", IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October 2006.
[2] B. J. Waibel and H. Kazerooni, "Theory and Experiments on the Stability of Robot Compliance Control", IEEE Transactions on Robotics and Automation, February 1991, vol. 7, no. 1, pp. 95-104.
[3] G. Bradski and A. Kaehler, "Learning OpenCV: Computer Vision with the OpenCV Library", O'Reilly.
[4] C. P. Lu, G. D. Hager and E. Mjolsness, "Fast and Globally Convergent Pose Estimation From Video Images", IEEE Transactions on Pattern Analysis and Machine Intelligence, June 2000, vol. 22, no. 6, pp. 610-622.
References, II
[5] B. Kainz and M. Streit, "How to Write an Application with Studierstube 4.0", Technical report, Graz University of Technology, 2006.
[6] J. Cai, "Augmented Reality: the Studierstube Project", Seminar report.
[7] E. Zereik, A. Sorbara, G. Casalino and F. Didot, "Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Surface Operations: Force/Vision-Guided Grasping", International Conference on Recent Advances in Space Technologies, Istanbul, Turkey, June 2009.
[8] E. Zereik, A. Sorbara, G. Casalino and F. Didot, "Force/Vision-Guided Grasping for an Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Space Exploration Missions", International Conference on Automation, Robotics and Control Systems, Orlando, USA, July 2009.
References, III
[9] G. Casalino and A. Turetta, "Coordination and Control of Multiarm Nonholonomic Mobile Manipulators", in MISTRAL: Methodologies and Integration of Subsystems and Technologies for Robotic Architectures and Locomotion, B. Siciliano, G. Casalino, A. De Luca and C. Melchiorri (eds.), Springer Tracts in Advanced Robotics, Springer-Verlag, April 2004.
[10] G. Casalino, A. Turetta and A. Sorbara, "Dynamic Programming based Computationally Distributed Kinematic Inversion Technique", Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November 2006.
[11] G. Casalino, A. Turetta and A. Sorbara, "DP-Based Distributed Kinematic Inversion for Complex Robotic Systems", 7th Portuguese Conference on Automatic Control, Lisbon, Portugal, September 2006.
[12] E. Zereik, "Space Robotics Supporting Exploration Missions: Vision, Force Control and Coordination Strategies", Ph.D. Thesis, University of Genova, 2010.
References, IV
Visual Odometry:
[1] M. Maurette and E. Baumgartner, "Autonomous Navigation Ability: FIDO Test Results", 6th ESA Workshop on Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November 2000.
[2] M. Maimone, Y. Cheng and L. H. Matthies, "Two Years of Visual Odometry on the Mars Exploration Rovers", Journal of Field Robotics, March 2007, vol. 24, no. 3, pp. 169-186.
[3] A. E. Johnson, S. B. Goldberg, Y. Cheng and L. H. Matthies, "Robust and Efficient Stereo Feature Tracking for Visual Odometry", IEEE International Conference on Robotics and Automation, Pasadena, USA, May 2008.
[4] L. Matthies, "Dynamic Stereo Vision", Ph.D. Thesis, Carnegie Mellon University.
References, V
[5] D. Nistér, O. Naroditsky and J. Bergen, "Visual Odometry", Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, USA, June 2004.
[6] R. Hartley and A. Zisserman, "Multiple View Geometry in Computer Vision", Cambridge University Press, March 2004.
[7] E. Trucco and A. Verri, "Introductory Techniques for 3-D Computer Vision", Prentice Hall, 1998.
[8] I. J. Cox, S. L. Hingorani, S. B. Rao and B. M. Maggs, "A Maximum Likelihood Stereo Algorithm", Journal of Computer Vision and Image Understanding, 1996, vol. 63, no. 3, pp. 542-567.
[9] M. Fischler and R. Bolles, "Random Sample Consensus: a Paradigm for Model Fitting with Application to Image Analysis and Automated Cartography", Communications of the Association for Computing Machinery, June 1981, vol. 24, pp. 381-395.
Thank you for your kind attention!