"Using Vision to Enable Autonomous Land, Sea and Air Vehicles," a Keynote Presentation...


Jet Propulsion Laboratory, California Institute of Technology

Using Vision to Enable Autonomous Land, Sea, Air, and Space Vehicles

Larry Matthies, Computer Vision Group
Jet Propulsion Laboratory, California Institute of Technology

lhm@jpl.nasa.gov

[Background image: Gale Crater]

Copyright 2016 California Institute of Technology. U.S. Government sponsorship acknowledged.


Application Domains and Main JPL Themes

Land: all-terrain autonomous mobility; mobile manipulation

Sea: USV escort teams; UUVs for subsurface oceanography

Space: assembling large structures in Earth orbit

Air: Mars precision landing; rotorcraft for Mars and Titan; drone autonomy on Earth


Basic Taxonomy of Perception Capabilities and Challenges

Capabilities
•  Localization
   –  Absolute, relative
•  Obstacle detection
   –  Stationary, moving
   –  Obstacle type
   –  Terrain trafficability
•  Other scene semantics
   –  Landmarks, signs, destinations, etc.
   –  Perceiving people and their activities
•  Perception for grasping

Challenges
•  Sensors for observability
•  Fast motion
•  Lighting conditions
   –  Low light, no light
   –  Very wide dynamic range
•  Atmospheric conditions
   –  Haze, fog, smoke
   –  Precipitation
•  Difficult object/terrain types
   –  Featureless, specular, transparent
   –  Obstacles in grass; water, snow, ice; mud
•  Computational cost vs. processor

Localization

•  Relevant use cases:
   –  Wheeled, tracked, legged vehicles indoors and outdoors
   –  Drones
   –  Mars rovers
   –  Mars landers
•  Key challenges:
   –  Appearance variability: lighting, weather, season
   –  Moving objects
   –  Fail-safe performance
•  Examples I'll describe:
   –  Day/night relative localization with IMU, leg odometry, and NIR active stereo for the DARPA LS3 program
   –  Map-relative localization for Mars precision landing

[Figure: 1976 Viking landing ellipse, 174 x 62 mi]

•  Some central themes:
   –  Fusion of IMU with vision or lidar is now commonplace for relative localization
   –  Map matching is a very active topic for absolute localization


Day/Night LS3 Relative Navigation with Visual, Inertial, and Leg Odometry

•  Array of NIR LEDs with different lenses/power to achieve more uniform illumination with dense stereo out to ~ 4.5 m

Sensors (a toy fusion sketch follows below):
1)  Bumblebee stereo camera (1024x768)
2)  Tactical-grade IMU (600 Hz)
3)  Optional: nav-grade IMU (600 Hz)
4)  Optional: leg odometry (200 Hz)
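
The sensor list above implies the core computation on LS3: propagate pose at IMU rate and correct it with lower-rate visual (and leg) odometry. Below is a minimal Python sketch of that fusion pattern, assuming a constant-gain blend rather than the error-state Kalman filter a real system would use; the class name, rates, and gain are illustrative assumptions, not the LS3 flight code.

```python
# Minimal sketch (not the LS3 flight code): fuse high-rate IMU dead reckoning
# with lower-rate stereo visual odometry using a constant-gain correction.
import numpy as np

class OdometryFuser:
    def __init__(self, blend=0.3):
        self.p = np.zeros(3)   # fused position estimate (m)
        self.v = np.zeros(3)   # velocity estimate (m/s)
        self.blend = blend     # weight given to a VO position fix (assumed)

    def propagate_imu(self, accel_world, dt):
        """High-rate step (e.g., 600 Hz): integrate bias-corrected,
        gravity-compensated acceleration already rotated into the world frame."""
        self.p += self.v * dt + 0.5 * accel_world * dt**2
        self.v += accel_world * dt

    def update_vo(self, p_vo):
        """Low-rate step: blend in a stereo VO position fix. A real system
        would use an error-state Kalman filter with proper covariances;
        a fixed blend keeps the sketch short."""
        self.p = (1.0 - self.blend) * self.p + self.blend * p_vo

fuser = OdometryFuser()
for k in range(600):                              # 1 s of IMU data at 600 Hz
    fuser.propagate_imu(np.array([0.1, 0.0, 0.0]), dt=1/600)
fuser.update_vo(np.array([0.05, 0.0, 0.0]))       # one VO position fix
print(fuser.p)
```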


Day/Night LS3 Relative Navigation with Visual, Inertial, and Leg Odometry

[Test terrains: asphalt, dirt road, snow, forest]

–  Position error measured over a moving 50 m window
–  Position error < 0.5 m in 95% of runs
–  Position error < 0.75 m in 100% of runs


Day/Night LS3 Relative Navigation with Visual, Inertial, and Leg Odometry

•  How far can the lookahead scale with illuminators?

•  Can it work at night with thermal images (stereo or monocular) – with more noise, motion blur, and rolling shutter readout?


Mars Airbag-based Landers (1997, 2004): Horizontal Velocity at Impact

[Diagram: horizontal impact velocity for three configurations: Pathfinder, Pathfinder with TIRS, and Pathfinder with TIRS and DIMES]


Descent Image Motion Estimation System for 2004 Mars Landers

•  Horizontal velocity estimation during last 2 km of descent

[Diagram: three descent images I1, I2, I3 with attitudes I1qG, I2qG, I3qG relative to ground frame G and gravity vector g; feature tracking between image pairs yields horizontal velocity estimates vh11, vh12 and vh21, vh22]

Runs in ~20 sec using 20% of the 20 MHz RAD6000 flight computer. A toy sketch of the idea follows below.
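
As a rough illustration of the DIMES idea (not the flight algorithm), the sketch below tracks one template between two nadir-looking descent images and converts the pixel shift to a horizontal velocity using altitude and focal length. Per the diagram above, the flight system also used attitude and gravity, which this toy version omits; all numeric values are assumed.

```python
# Illustrative sketch of DIMES-style horizontal velocity estimation.
import numpy as np

def track_template(img1, img2, top, left, size=15, search=20):
    """Find the integer shift of a size x size template from img1 in img2
    by exhaustive sum-of-squared-differences over a small search window."""
    t = img1[top:top+size, left:left+size].astype(float)
    best, best_dr, best_dc = np.inf, 0, 0
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = top + dr, left + dc
            if r < 0 or c < 0 or r+size > img2.shape[0] or c+size > img2.shape[1]:
                continue
            ssd = np.sum((img2[r:r+size, c:c+size] - t) ** 2)
            if ssd < best:
                best, best_dr, best_dc = ssd, dr, dc
    return best_dr, best_dc

def horizontal_velocity(shift_px, altitude_m, focal_px, dt_s):
    """Ground-plane velocity: pixel shift scaled by meters-per-pixel at the
    surface (altitude / focal length), divided by the frame interval.
    Assumes a nadir-pointed camera with no attitude change between frames."""
    m_per_px = altitude_m / focal_px
    return np.array(shift_px) * m_per_px / dt_s

rng = np.random.default_rng(0)
img1 = rng.random((200, 200))
img2 = np.roll(img1, (3, -5), axis=(0, 1))        # simulate a known shift
shift = track_template(img1, img2, 80, 80)
print(horizontal_velocity(shift, altitude_m=1500.0, focal_px=700.0, dt_s=4.0))
```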


Map-Relative Localization for Mars Precision Landing

[EDL diagram: backshell separation, powered descent, sky crane, flyaway; prime MLEs, radar data collection, divert maneuver, safe target selection, Lander Vision System]

TRN increases the probability of safe landing.

[Maps: hazards in the landing ellipse without TRN vs. with TRN]


Map-Relative Localization for Mars Precision Landing

[Block diagram: the LVS camera and LVS IMU feed the LVS Compute Element (Virtex 5 FPGA for image processing and sensor interfaces, processor running the navigation filter, memory for the map), which supplies the spacecraft flight computer; images 1-5 are interleaved with IMU data]

•  Coarse landmark matching: removes initial position error (3 km)
•  Fine landmark matching: improves accuracy (40 m)
•  State estimation: fuses inertial measurements with landmark matches; completes in 10 seconds
•  Inputs: spacecraft attitude and altitude; output: map-relative position

A toy sketch of the coarse-to-fine matching step follows below.

Obstacle Detection

•  Relevant use cases:
   –  Land vehicles indoors and outdoors, on-road and off-road
   –  Drones: flying and landing
   –  Boats on and under water
   –  Robot manipulators
•  Key challenges:
   –  Appearance variability: lighting, weather, season; surface reflectance, transparency
   –  Terrain variability
   –  Moving objects
   –  Fail-safe performance

[Diagram: escort team: a high-value unit (HVU) with 8 teleoperated HSMSTs and 5 autonomous USVs]

•  Issues I'll discuss:
   –  Sensors and phenomenology
   –  3-D representations
   –  Land, air, and sea examples

Stereo Vision in a Marina

Stereo + 360-degree Monocular EO/IR

[Sensors: JPL 360 (monocular) and JPL Hammerhead (stereo); EO deck with color cameras and IR deck]

Unmanned Surface Vehicles


Processing Blocks



COLREGS Illustration

[Diagram: encounter geometries between your boat and a traffic boat at 1, 10, and 30 knots: crossing from left, crossing from right, overtaking, head-on]

Almost crossed → the "crossing" rule is not applied; the USV must give way.

Need more than the geometry to determine COLREGS situations (a rule-of-thumb geometric classifier is sketched below).
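
To make the geometry point concrete, here is a minimal rule-of-thumb classifier for the four encounter types from relative bearing and heading alone. The thresholds (COLREGS-style 22.5 degrees abaft the beam, reciprocal-course band) are simplifying assumptions, and, as the slide says, geometry by itself cannot settle the COLREGS situation.

```python
# Illustrative geometry-only COLREGS encounter classification for a USV.
def classify_encounter(own_heading_deg, contact_bearing_deg, contact_heading_deg):
    """Classify head-on / overtaking / crossing from relative geometry.
    Headings and bearings are in degrees, measured clockwise from north."""
    rel_bearing = (contact_bearing_deg - own_heading_deg) % 360.0
    rel_heading = (contact_heading_deg - own_heading_deg) % 360.0

    # Head-on: contact nearly dead ahead on a roughly reciprocal course.
    if (rel_bearing < 15 or rel_bearing > 345) and 165 < rel_heading < 195:
        return "head-on (both alter course to starboard)"
    # Overtaking: contact approaching from more than 22.5 deg abaft the beam.
    if 112.5 < rel_bearing < 247.5:
        return "overtaking (contact gives way)"
    # Crossing: give way to traffic approaching from starboard.
    if rel_bearing <= 112.5:
        return "crossing from starboard (own vessel gives way)"
    return "crossing from port (contact gives way)"

print(classify_encounter(0.0, 45.0, 270.0))   # crossing from starboard
print(classify_encounter(0.0, 2.0, 180.0))    # head-on
```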


Seeing through Atmospheric Obscurants

[Image pairs: fog and smoke from a controlled burn, imaged with a color camera vs. a Photon 640 LWIR camera; visible vs. SWIR in haze]


Uncooled LWIR Stereo: More Challenging

[Disparity maps from three matchers: LBM, SAD5, SGBM]

Night Operation with Thermal Stereo Cameras

Why are Negative Obstacles so Hard to Detect?

[Diagram: sensor at height H viewing, at range R, a positive obstacle of height h and a negative obstacle (hole) of width w]

Angle subtended by a positive obstacle of height h at range R:
$\theta_p \approx \frac{h}{R}$

Angle subtended by a negative obstacle of width w, seen from sensor height H:
$\theta_n \approx \frac{H w}{R (R + w)}$
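
Plugging numbers into the two formulas shows why holes are so much harder: θ_p falls off as 1/R while θ_n falls off roughly as 1/R². The values below (sensor height, obstacle sizes, per-pixel angular resolution) are assumed purely for illustration.

```python
# Quick numeric check of the geometry above: a hole's angular size shrinks
# quadratically with range, so it vanishes into single pixels much sooner
# than a positive obstacle of comparable size.
import math

H = 1.5                     # sensor height (m) - assumed
h = 0.3                     # positive obstacle height (m) - assumed
w = 0.5                     # negative obstacle (hole) width (m) - assumed
ifov = math.radians(0.05)   # per-pixel angular resolution - assumed

for R in (5.0, 10.0, 20.0):
    theta_p = h / R                      # positive obstacle angle (rad)
    theta_n = H * w / (R * (R + w))      # negative obstacle angle (rad)
    print(f"R={R:5.1f} m: positive {theta_p/ifov:6.1f} px, "
          f"negative {theta_n/ifov:6.1f} px")
```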


Heat Transfer Characteristics: Negative Obstacle Detection

[Images: color crosswise and lengthwise views of a trench; MWIR image 1 hr after sundown; weatherproof sensor enclosure]


Heat Transfer Characteristics: After Sundown, Holes Cool More Slowly than the Surface

Heat transfer mechanisms modeled:
•  Radiation
•  Convection
•  Conduction
•  Evapotranspiration (ignored here)

[Diagram: heat fluxes q_negobs1, q_negobs2, and q_terrain between the hole walls (T_side1, T_side2), terrain (T_terrain), air (T_air), and sky (T_sky); diurnal terrain temperature variation ΔT_terrain = 20-50 °C]


Heat Transfer Characteristics: Negative Obstacle Detection

[Figure: trenches 3.8 m and 3.7 m long, 0.53 m wide, oriented north; MWIR images at 5 pm, 9 pm, 10 pm, 7 am, and 9 am]

Detection Results using Thermal Signature

Processing steps:
1.  Rectified thermal infrared intensity image
2.  After intensity-difference thresholding
3.  Closed contours overlaid on the intensity image
4.  After geometry-based filtering

The trench is 3 pixels wide at first detection, at a range of 18.2 m. A minimal rendition of the pipeline is sketched below.
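
Here is a minimal OpenCV rendition of the processing steps listed above, run on a synthetic image; the threshold offset, size, and aspect-ratio tests are stand-in assumptions for the real system's tuned parameters.

```python
# Sketch of the thermal-signature pipeline: thresholding, contour extraction,
# geometry-based filtering. Parameters and the synthetic image are assumed.
import cv2
import numpy as np

img = np.full((120, 160), 90, np.uint8)            # synthetic thermal image
cv2.rectangle(img, (60, 50), (100, 58), 130, -1)   # warm trench-like stripe

# 1) Intensity-difference thresholding: keep pixels much warmer than terrain.
terrain = int(np.median(img))
_, mask = cv2.threshold(img, terrain + 25, 255, cv2.THRESH_BINARY)

# 2) Closed contours of the thresholded regions.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# 3) Geometry-based filtering: keep wide, low-aspect, non-tiny regions.
detections = []
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w >= 3 and w > 2 * h and cv2.contourArea(c) > 20:
        detections.append((x, y, w, h))
print(detections)
```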

Water Detection: Why is it Useful and Why is it Hard?


Water Body Detection with Reflections in Stereo: Works with Visible and Thermal Images

[Example at 15:00: left rectified image, stereo range image, and 100 m map]
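
One cue behind reflection-based detection is that a feature reflected off still water triangulates to a virtual point below the water surface. The sketch below flags stereo points falling well under the local ground plane as water candidates; the depth margin, frame convention, and function name are assumptions, and a fielded detector would combine more cues than this.

```python
# Hedged sketch of one reflection-in-stereo water cue: reflected points
# triangulate below the terrain surface, so deep "underground" points are
# water candidates. Thresholds are assumed.
import numpy as np

def water_candidates(points_xyz, ground_z=0.0, depth_margin=0.3):
    """points_xyz: (N, 3) triangulated points in a world frame with +z up and
    the local terrain at ground_z. Returns a boolean mask of points lying
    more than depth_margin below the terrain surface."""
    return points_xyz[:, 2] < (ground_z - depth_margin)

pts = np.array([
    [5.0, 0.0,  0.02],   # ordinary ground point
    [8.0, 1.0, -1.50],   # virtual point from a reflected tree: water cue
    [9.0, 2.0, -0.05],   # slight noise below ground: ignored
])
print(water_candidates(pts))   # [False  True  False]
```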

Some Water Detection and Mapping Results

Absorption Coefficient of Light in Pure Water


Foliage Classification and Obstacle Detection in Foliage


Polar Grid Maps


Image-based Obstacle Representation

[Figures: C-space-like obstacle expansion of the disparity map (sketched below); architecture for obstacle avoidance]
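
An illustrative version of C-space-like expansion applied directly to a disparity image: each obstacle pixel is grown by the vehicle radius expressed in pixels at that pixel's depth, so nearer obstacles expand more. The camera and vehicle parameters below are assumptions, not values from the original system.

```python
# Sketch of C-space-like obstacle expansion in disparity space.
import numpy as np

def cspace_expand(disparity, focal_px, baseline_m, robot_radius_m):
    h, w = disparity.shape
    out = disparity.copy()
    for r in range(h):
        for c in range(w):
            d = disparity[r, c]
            if d <= 0:
                continue
            depth = focal_px * baseline_m / d                    # meters
            rad = int(round(focal_px * robot_radius_m / depth))  # pixels
            r0, r1 = max(r - rad, 0), min(r + rad + 1, h)
            c0, c1 = max(c - rad, 0), min(c + rad + 1, w)
            # Keep the maximum disparity (= closest obstacle) per pixel.
            out[r0:r1, c0:c1] = np.maximum(out[r0:r1, c0:c1], d)
    return out

disp = np.zeros((40, 60))
disp[20, 30] = 6.0                         # a single obstacle pixel at 5 m
expanded = cspace_expand(disp, focal_px=300.0, baseline_m=0.1,
                         robot_radius_m=0.1)
print(int((expanded > 0).sum()))           # obstacle grown to a 13x13 block
```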


MAV Obstacle Avoidance: Test Site Near JPL


MAV Obstacle Avoidance: Test Results

[Figures: C-space-like obstacle expansion of the disparity map; obstacle points and planned path projected on the ground plane; upright view]


Research Approach: Challenges and Innovation
Getting Very Wide Field of Regard


Egocylinder Representation

Stereo-OF fusion: the "egocylinder" (a projection sketch follows below)
•  Range from stereo and optical flow (OF)
•  Scale propagation from stereo in the overlap region
•  Projection into a common cylinder representation
•  C-space expansion in image space
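
A small sketch of the projection step: body-frame 3-D points are binned into a cylinder around the vehicle that keeps the nearest range per (height, azimuth) cell. Bin counts, height limits, and the function name are arbitrary assumptions for illustration.

```python
# Sketch of projecting 3-D points into an egocylinder range image.
import numpy as np

def to_egocylinder(points_xyz, az_bins=360, z_bins=64, z_min=-2.0, z_max=2.0):
    """points_xyz: (N, 3) points in the body frame (x forward, y left, z up).
    Returns a (z_bins, az_bins) range image; inf where no point projects."""
    ego = np.full((z_bins, az_bins), np.inf)
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    rng = np.hypot(x, y)                               # horizontal range
    az = np.degrees(np.arctan2(y, x)) % 360.0          # azimuth in [0, 360)
    ai = np.clip((az / 360.0 * az_bins).astype(int), 0, az_bins - 1)
    zi = np.clip(((z - z_min) / (z_max - z_min) * z_bins).astype(int),
                 0, z_bins - 1)
    for a, k, d in zip(ai, zi, rng):
        ego[k, a] = min(ego[k, a], d)                  # keep nearest return
    return ego

pts = np.array([[3.0, 0.0, 0.5], [0.0, -2.0, -0.5], [3.1, 0.05, 0.5]])
ego = to_egocylinder(pts)
print(np.isfinite(ego).sum(), ego.min())   # 2 cells filled, nearest = 2.0
```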

Egocylinder with Real Data

Onboard Implementation

Update rates:
•  Stereo: 384x240 @ 5 Hz
•  SFM (LSD-SLAM): 384x240 @ 10 Hz, both cameras
•  Egocylinder: 5 Hz
•  C-space: 5 Hz
•  Planning: 5 Hz updates (1 ms verification, several ms searching)

Onboard Implementation


Autonomous Landing: Problems and Solution Characteristics

[Image: Gale Crater, Mars]

•  Problem characteristics
   –  Variable, potentially complex 3-D structure
   –  Variable appearance
   –  Variable altitude for approach
   –  Need for very lightweight hardware
•  Solution characteristics
   –  Desire dense 3-D perception
   –  Must work from variable altitude
   –  Must work in direct sunlight
   –  Hardware as light as possible: just a camera?


Autonomous Safe Landing: Variable Baseline Dense Motion Stereo
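
The principle in one formula: with rectified-stereo disparity d = fB/Z, the baseline B between two frames of a descending camera comes from the vehicle's own motion and can be chosen to give enough disparity at the current altitude. A tiny numeric sketch with assumed camera values:

```python
# Variable-baseline motion-stereo arithmetic: picking frames farther apart
# increases disparity and improves depth precision at a given altitude.
def disparity_px(depth_m, baseline_m, focal_px):
    """Rectified-stereo relation d = f * B / Z; here B is the translation
    between the two camera positions, estimated by onboard odometry."""
    return focal_px * baseline_m / depth_m

focal_px, altitude = 500.0, 40.0          # assumed; camera looks straight down
for baseline in (0.1, 0.5, 2.0):          # pick frames farther apart in time
    d = disparity_px(altitude, baseline, focal_px)
    # Depth error grows as Z^2 / (f * B) per pixel of matching error.
    dz_per_px = altitude**2 / (focal_px * baseline)
    print(f"B={baseline:4.1f} m: disparity {d:5.2f} px, "
          f"depth error {dz_per_px:6.2f} m/px")
```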


Onboard Implementation with Smartphone Processor


Autonomous Rooftop Landing of Micro Air Vehicles for Recon Applications


Real-time Onboard Mapping for Safe Landing in Unknown Terrain

[Figures: raw image from the camera at ~15 m height; computed elevation map; landing confidence map (dark blue: highest confidence), with the river bed and its edge annotated]
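
Here is a minimal sketch of how an elevation map can be turned into a landing-confidence map: score each cell by local slope and roughness over the lander footprint. The window size, thresholds, and scoring function below are assumptions, not the flight values.

```python
# Sketch of landing-confidence scoring from an elevation map.
import numpy as np

def landing_confidence(elev, win=5, max_slope=0.15, max_rough=0.10):
    """elev: 2-D elevation map (m). Returns confidence in [0, 1] per cell,
    where 1 means locally flat and smooth over a win x win footprint."""
    h, w = elev.shape
    conf = np.zeros_like(elev)
    r = win // 2
    for i in range(r, h - r):
        for j in range(r, w - r):
            patch = elev[i-r:i+r+1, j-r:j+r+1]
            # Slope proxy: mean finite-difference gradient of the footprint.
            gy, gx = np.gradient(patch)
            slope = float(np.hypot(gx.mean(), gy.mean()))
            rough = float(patch.std())        # roughness proxy
            conf[i, j] = max(0.0, 1.0 - slope / max_slope) * \
                         max(0.0, 1.0 - rough / max_rough)
    return conf

elev = np.zeros((20, 20))
elev[10:, :] += 0.5                        # a step, like a river-bed edge
conf = landing_confidence(elev)
print(conf[5, 10], conf[10, 10])           # flat cell ~1.0, edge cell ~0.0
```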


Notional Future Directions for Space Exploration

•  Mars sample return
•  Accessing recurring slope lineae, caves, and vertical/microgravity environments
•  Comet sample return, Ocean Worlds, Titan

Pre-Decisional Information – For Planning and Discussion Purposes Only


Perception and Planning for Robots in Human Environments, Interacting with People

•  Perception and planning for mobile manipulation
•  Deep learning-based object class labeling and pose estimation; human articulated body pose estimation

[Figures: base reachability, arm reachability, and desired end-effector goal; scene with power grasp and pinch grasp opportunities]


Some Thoughts Back on Earth: Better Perception for Human-Robot Interaction

Integrate facial expressions, head pose, and body pose into robot perception of people for more intelligent human-robot interaction.

[Diagram: EMG sensor sites S_1 through S_17 on the forearm sleeve]

Electromyography sleeve with forearm IMU and magnetometer for recognizing arm and hand gestures.

A Few Recent Highlights of Non-NASA Work

Sunset at Gusev Crater, Mars, from Spirit Rover

lhm@jpl.nasa.gov

Questions?
