Visual Perception and Robotic Manipulation
Springer Tracts in Advanced Robotics

Chapter 3: Shape Recovery Using Robust Light Striping

Geoffrey Taylor and Lindsay Kleeman
Intelligent Robotics Research Centre (IRRC), Department of Electrical and Computer Systems Engineering, Monash University, Australia


Taylor and Kleeman, Visual Perception and Robotic Manipulation, Springer Tracts in Advanced Robotics

Contents

• Motivation for stereoscopic stripe ranging.

• Benefits of our scanner.

• Validation/reconstruction framework.

• Image-based calibration technique.

• Experimental results.

• Conclusions and future work.


Motivation

• Allow robot to model and locate objects in the environment as first step in manipulation.

• Capture registered colour and depth data to aid intelligent decision making.


Conventional Scanner

• Triangulate from two independent measurements: image plane data and laser stripe position.

• Depth image constructed by sweeping stripe.

[Figure: conventional light stripe scanner. The stripe generator and camera are separated by baseline B; the stripe is swept across the scanned object and depth D is recovered by triangulation from the camera image.]
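The triangulation in the bullets above reduces to intersecting the back-projected camera ray with the known stripe plane. A minimal sketch, assuming a hypothetical pinhole geometry with the camera centre at the origin (not the chapter's exact parameterization):

```python
import numpy as np

def triangulate(ray_dir, plane_normal, plane_offset):
    """Intersect the back-projected pixel ray with the light stripe plane.

    ray_dir:      direction of the ray from the camera centre (origin)
                  through the illuminated pixel.
    plane_normal: normal n of the stripe plane, with equation n . X = d.
    plane_offset: the scalar d.
    Returns the 3D point where the ray meets the stripe plane.
    """
    t = plane_offset / np.dot(plane_normal, ray_dir)  # ray parameter at the plane
    return t * np.asarray(ray_dir, dtype=float)

# Stripe plane z = 2 (normal (0,0,1), offset 2); pixel ray direction (0.5, 0, 1)
point = triangulate([0.5, 0.0, 1.0], np.array([0.0, 0.0, 1.0]), 2.0)
# point is (1.0, 0.0, 2.0): depth follows from the known plane geometry
```

Sweeping the stripe simply repeats this intersection with an updated plane for each frame.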


Difficulties

• Light stripe assumed to be the brightest feature:
  – Objects specially prepared (matte white paint)
  – Scans performed in low ambient light
  – Use a high-contrast camera (can't capture colour)

• Noise sources invalidate the brightness assumption:
  – Specular reflections
  – Cross-talk between robots
  – Stripe-like textures in the environment

• For service robots, we need a robust scanner that does not rely on the brightness assumption!


Related Work

• Robust scanners must validate stripe measurements.

• Robust single-camera scanners:
  – Validation from motion: Nygårds et al., 1994
  – Validation from modulation: Haverinen et al., 1998
  – Two intersecting stripes: Nakano et al., 1988

• Robust stereo-camera scanners:
  – Independent sensors: Trucco et al., 1994
  – Known scene structure: Magee et al., 1994

• Existing methods suffer from assumed scene structure, acquisition delay, or a lack of error recovery.


Stereo Scanner

• Stereoscopic light striping approach:
  – Validation through three redundant measurements:
    • the measured stripe location on each stereo image plane
    • the known angle of the light plane
  – Validation/reconstruction constraint: there must be some point on the known light plane that projects to the stereo measurements (within a threshold error) for those measurements to be valid.
  – The reconstructed point is optimal with respect to measurement noise (uniform image plane error).
  – System parameters can be calibrated from a scan of an arbitrary non-planar target.


Validation/Reconstruction

[Figure: validation/reconstruction geometry. The left and right cameras, with projection matrices LP and RP, observe measurements Lx and Rx on the left and right image planes. The laser plane is at a known position and angle (the light plane parameters), and the unknown 3D reconstruction X is constrained to lie on the light plane, on the scanned surface.]


Validation/Reconstruction

• The unknown reconstruction projects to the measurements: Lx = LP X, Rx = RP X.

• Find the reconstruction X that minimizes the image error:

  E = d²(Lx, LP X) + d²(Rx, RP X)

  – subject to the constraint ΩᵀX = 0, i.e. X lies on the laser plane (Ω denotes the light plane parameters)
  – d²(a, b) is the squared Euclidean distance between points a and b

• If E < Eth then (Lx, Rx) are valid measurements and X is the optimal reconstruction.

• The above constrained optimization has an analytical solution in the case of rectilinear stereo.
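The constraint can also be enforced numerically by parameterizing candidate points directly on the light plane, as a sketch of the validation/reconstruction step (the chapter's actual method uses the analytical rectilinear solution; the plane parameterization, function names, and toy numbers below are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import least_squares

def project(P, X_h):
    """Project a homogeneous 3D point with a 3x4 projection matrix."""
    x = P @ X_h
    return x[:2] / x[2]

def validate_pair(xL, xR, PL, PR, plane_point, u, v, E_th):
    """Reconstruct a point constrained to the light plane and validate it.

    The plane is parameterized as X(s, t) = plane_point + s*u + t*v, so the
    plane constraint holds by construction.  Returns (valid, X, E) where
    E = d^2(xL, PL X) + d^2(xR, PR X) is the summed reprojection error.
    """
    def residual(st):
        X_h = np.append(plane_point + st[0] * u + st[1] * v, 1.0)
        return np.concatenate([project(PL, X_h) - xL, project(PR, X_h) - xR])

    sol = least_squares(residual, x0=[0.0, 0.0])
    E = float(np.sum(sol.fun ** 2))
    X = plane_point + sol.x[0] * u + sol.x[1] * v
    return E < E_th, X, E

# Toy rectilinear rig: unit focal length, baseline 0.3, light plane z = 2.
PL = np.array([[1.0, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]])
PR = np.array([[1.0, 0, 0, -0.3], [0, 1, 0, 0], [0, 0, 1, 0]])
valid, X, E = validate_pair(np.array([0.2, 0.1]), np.array([0.05, 0.1]),
                            PL, PR,
                            plane_point=np.array([0.0, 0.0, 2.0]),
                            u=np.array([1.0, 0.0, 0.0]),
                            v=np.array([0.0, 1.0, 0.0]),
                            E_th=1e-6)
# valid is True and X recovers the surface point (0.4, 0.2, 2.0)
```

Measurements that no plane point can explain within E_th are rejected, which is exactly the validation condition stated above.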


Calibration

• System parameters: p = (k1, k2, k3, θx, θz, B0, m, c)
  – k1, k2, k3 relate to the laser position and camera baseline
  – θx, θz, B0 relate to the plane orientation
  – m, c relate the laser encoder count e to the plane angle: θ = me + c

• Take a scan of an arbitrary non-planar scene
  – Initially assume the laser is given by the brightest feature

• Form the total reconstruction error Etot over all points:

  Etot = Σⱼ∈frames Σᵢ∈scanlines E(Lxij, Rxij, ej, p)


Calibration

• Find the p that minimizes the total error (assuming Lxij, Rxij, ej are fixed):

  p* = argminp[Etot(p)]

  – Use Levenberg-Marquardt numerical minimization

• Refinement steps:
  – The above solution will be inaccurate due to incorrect correspondences caused by the brightness assumption.
  – Use the initial p* to validate (Lxij, Rxij) and reject invalid measurements, then recalculate p*.
  – The above solution also assumes no error in the encoder counts ej; this error is removed by iterative refinement of ej and p*.
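The reject-and-refit loop can be sketched on a toy version of the encoder model θ = me + c alone (the real calibration optimizes the full vector p over all scan data; the synthetic data and threshold below are made-up illustrations):

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate(encoder_counts, measured_angles, E_th, iters=3):
    """Toy reject-and-refit loop for the encoder model theta = m*e + c.

    Stand-in for the full calibration: fit the parameters, reject samples
    whose squared residual exceeds the validation threshold E_th (these
    play the role of false stripe measurements), and re-fit on the inliers.
    """
    encoder_counts = np.asarray(encoder_counts, dtype=float)
    measured_angles = np.asarray(measured_angles, dtype=float)
    mask = np.ones(len(encoder_counts), dtype=bool)
    p = np.array([0.0, 0.0])                      # initial (m, c)
    for _ in range(iters):
        def residual(params):
            m, c = params
            return m * encoder_counts[mask] + c - measured_angles[mask]
        p = least_squares(residual, p, method='lm').x  # Levenberg-Marquardt
        m, c = p
        mask = (m * encoder_counts + c - measured_angles) ** 2 < E_th
    return p, mask

# Synthetic data: theta = 0.01*e + 0.5 with one corrupted sample.
e = np.arange(10.0)
theta = 0.01 * e + 0.5
theta[3] += 1.0                                   # outlier (e.g. a reflection)
p, inliers = calibrate(e, theta, E_th=0.04)
# p converges to (0.01, 0.5) and sample 3 is rejected as an outlier
```

The first fit is biased by the outlier, exactly as the brightness-assumption correspondences bias the initial p*; the validation mask then removes it and the re-fit recovers the true parameters.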


Implementation

[Figure: scanner hardware. A stereo rig with left and right cameras, a laser stripe generator, and an optical encoder measuring the stripe angle.]


Image Processing

[Figure: image processing pipeline. The raw stereo image is split into left and right fields; image differencing followed by maxima extraction yields the left and right stripe candidates.]

• When multiple candidates appear on a scan line, calculate the reconstruction error E for every candidate pair and choose the pair with the smallest E < Eth.
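The candidate-pairing rule above can be sketched directly (recon_error stands in for the validation error E of the earlier slides; the names and the toy 1-D error are illustrative assumptions):

```python
import itertools

def match_candidates(left_cands, right_cands, recon_error, E_th):
    """Pick the candidate pair with the smallest reconstruction error E,
    accepting it only if E < E_th; return None when no pair validates,
    so the scan line is rejected rather than mismatched."""
    best = None
    for xL, xR in itertools.product(left_cands, right_cands):
        E = recon_error(xL, xR)
        if E < E_th and (best is None or E < best[0]):
            best = (E, xL, xR)
    return best

# Toy 1-D stand-in: the "error" is just the mismatch between candidates.
best = match_candidates([1.0, 5.0], [1.2, 9.0],
                        recon_error=lambda xL, xR: abs(xL - xR), E_th=0.5)
# best pairs 1.0 with 1.2; the spurious candidates 5.0 and 9.0 never validate
```

Exhaustive pairing is cheap here because only a handful of bright candidates survive per scan line.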


Scanning Process


Results

• Reflection/cross-talk mirror experiment: the mirror generates bright false stripe measurements.


Results

• Laser extracted as the brightest feature on each scan line.

• All bright candidates extracted and matched using the validation condition.


More Results

• Office phone mirror scan results:

Brightest feature without Validation

With Validation


Specular Objects

• Tin can scan with specular reflections


Specular Objects

• Tin can scan results:

Brightest feature without Validation

With Validation


Depth Discontinuities

• Depth discontinuities cause association ambiguity


Depth Discontinuities

• Depth discontinuity scan results:

Without Validation With Validation


Conclusions

• We have developed a mechanism for eliminating:
  – sensor noise
  – cross-talk
  – 'ghost' stripes (reflections, striped textures, etc.)

• Developed an image-based calibration technique requiring only an arbitrary non-planar target.

• Operation in ambient light allows registered range and colour to be captured in a single sensor.

• Experimental results validate the above techniques.


Future Directions

• Use multiple simultaneous stripes to increase the acquisition rate.
  – Multiple stripes can be disambiguated using the same framework that provides validation.

• Perform object segmentation, modeling, and recognition on scanned data to support grasp planning and manipulation on a service robot.