High-Precision Globally-Referenced Position and Attitude via a Fusion of Visual SLAM, Carrier-Phase-Based GPS, and Inertial Measurements

Daniel Shepard and Todd Humphreys

2014 IEEE/ION PLANS Conference, Monterey, CA | May 8, 2014



Overview

- Globally-Referenced Visual SLAM
- Motivating Application: Augmented Reality
- Estimation Architecture
- Bundle Adjustment (BA)
- Simulation Results for BA


Stand-Alone Visual SLAM

- Produces high-precision estimates of:
  - Camera motion (with ambiguous scale for monocular SLAM)
  - A map of the environment
- Limited in application due to lack of a global reference

[1] G. Klein and D. Murray, “Parallel tracking and mapping for small AR workspaces,” in 6th IEEE and ACM International Symposium on Mixed and Augmented Reality. IEEE, 2007, pp. 225–234.


Visual SLAM with Fiduciary Markers

- Globally-referenced solution if fiduciary markers are globally referenced
- Requires substantial infrastructure and/or mapping effort
  - Microsoft's augmented-reality maps (TED 2010 [2])

[2] B. A. y Arcas, “Blaise Aguera y Arcas demos augmented-reality maps,” TED, Feb. 2010, http://www.ted.com/talks/blaise aguera.html.


Can globally-referenced position and attitude (pose) be recovered from combining visual SLAM and GPS?


Observability of Visual SLAM + GPS

[Table: observability of translation, rotation, and scale as the number of GPS positions grows from none to three]
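As a back-of-the-envelope aside (our reasoning, not text from the slides): the pattern above follows from counting the degrees of freedom in the similarity transform relating the SLAM frame to the global frame against the constraints each GPS-measured position provides.

```latex
\[
\underbrace{3}_{\text{translation}} + \underbrace{3}_{\text{rotation}} + \underbrace{1}_{\text{scale}} = 7 \ \text{DOF},
\qquad
N \ \text{GPS positions} \;\Rightarrow\; 3N \ \text{scalar constraints}.
\]
```

One GPS position can fix only translation; two positions leave the rotation about the line joining them unresolved; three non-collinear positions determine all seven degrees of freedom.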


Combined Visual SLAM and CDGPS

- CDGPS anchors visual SLAM to a global reference frame
- Can add an IMU to improve dynamic performance (not required!)
- Can be made inexpensive
- Requires little infrastructure
- Very accurate!


Motivating Application: Augmented Reality

- Augmenting a live view of the world with computer-generated sensory input to enhance one's current perception of reality [3]
- Current applications are limited by lack of accurate global pose
- Potential uses in construction, real estate, gaming, and social media

[3] M. Graham, M. Zook, and A. Boulton, "Augmented reality in urban places: contested content and the duplicity of code," Transactions of the Institute of British Geographers.


Estimation Architecture Motivation

- Sensors:
  - Camera
  - Two GPS antennas (reference and mobile)
  - IMU
- How can the information from these sensors best be combined to estimate the camera pose and a map of the environment?
  - Real-time operation
  - Computational burden vs. precision


Sensor Fusion Approach

- Tighter coupling = higher precision, but increased computational burden

[Figure: candidate fusion architectures combining the IMU, visual SLAM, and CDGPS with increasing degrees of coupling]


The Optimal Estimator


IMU only for Pose Propagation


Tightly-Coupled Architecture


Loosely-Coupled Architecture


Hybrid Batch/Sequential Estimator

- Only geographically diverse frames (keyframes) are included in the batch estimator
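As a minimal illustration of keyframe selection (our sketch; the paper's actual criterion for "geographically diverse" is not reproduced here, and the 0.25 m spacing is borrowed from the hallway simulation later in the deck):

```python
import numpy as np

def select_keyframes(camera_positions, min_spacing=0.25):
    """Keep a frame as a keyframe only if its camera position is at least
    `min_spacing` meters from the previously selected keyframe."""
    positions = np.asarray(camera_positions, dtype=float)
    keyframes = [0]                                   # always keep the first frame
    for i in range(1, len(positions)):
        if np.linalg.norm(positions[i] - positions[keyframes[-1]]) >= min_spacing:
            keyframes.append(i)
    return keyframes
```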


Bundle Adjustment State and Measurements

- State vector: keyframe camera poses and point feature positions
- Measurement models:
  - CDGPS positions
  - Image feature measurements
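A hedged sketch of what the two measurement models look like; the notation below is ours, not the paper's. The CDGPS measurement observes the mobile antenna's global position, offset from keyframe k's camera position by a lever arm, and each image measurement is the projection of point feature j into keyframe k:

```latex
\begin{align*}
  \mathbf{z}^{\mathrm{GPS}}_{k} &= \mathbf{p}^{G}_{C,k} + R^{G}_{C,k}\,\mathbf{b}^{C} + \mathbf{w}_{k},
      & \mathbf{w}_{k} &\sim \mathcal{N}(\mathbf{0},\,\Sigma_{\mathrm{GPS}}), \\
  \mathbf{z}^{\mathrm{img}}_{kj} &= \pi\!\left( \big(R^{G}_{C,k}\big)^{\mathsf{T}} \big(\mathbf{p}^{G}_{j} - \mathbf{p}^{G}_{C,k}\big) \right) + \mathbf{v}_{kj},
      & \mathbf{v}_{kj} &\sim \mathcal{N}(\mathbf{0},\,\Sigma_{\mathrm{pix}}),
\end{align*}
```

where $\mathbf{p}^{G}_{C,k}$ and $R^{G}_{C,k}$ are keyframe $k$'s camera position and attitude in the global frame, $\mathbf{b}^{C}$ is the camera-to-antenna lever arm, $\mathbf{p}^{G}_{j}$ is point feature $j$, and $\pi(\cdot)$ is the camera projection; all of these symbols are illustrative assumptions.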


Bundle Adjustment Cost Minimization

- Weighted least-squares cost function
  - Employs robust weight functions to handle outliers
- Sparse Levenberg-Marquardt algorithm
  - Computational complexity is linear in the number of point features, but cubic in the number of keyframes
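A toy, self-contained illustration of the robust weighted least-squares machinery (a Huber loss down-weighting outliers); the model here is a simple line fit via SciPy's nonlinear least-squares solver, standing in for the much larger sparse bundle-adjustment problem, so the sparse Levenberg-Marquardt structure itself is not shown:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data: a line with small Gaussian noise plus a few gross outliers.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 60)
y = 2.0 * x - 3.0 + rng.normal(scale=0.1, size=x.size)
y[::9] += 5.0                                   # inject outliers

def residuals(params, x, y, sigma=0.1):
    a, b = params
    return (a * x + b - y) / sigma              # whitened (weighted) residuals

# The Huber loss acts as the robust weight function: large residuals are
# down-weighted instead of dominating the quadratic cost.
sol = least_squares(residuals, x0=[0.0, 0.0], args=(x, y),
                    loss='huber', f_scale=1.0)
print("slope, intercept:", sol.x)               # close to [2, -3] despite outliers
```

In the actual BA problem the residual vector stacks whitened CDGPS and reprojection residuals, and exploiting the sparse block structure of the Jacobian is what yields the linear-in-features, cubic-in-keyframes complexity quoted above.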


Bundle Adjustment Initialization

- Initialize BA from the stand-alone visual SLAM solution and the CDGPS positions
- Determine the similarity transform relating the two coordinate systems
  - Generalized form of Horn's transform [4]
  - Rotation: the rotation that best aligns deviations from the mean camera position
  - Scale: a ratio of metrics describing the spread of camera positions
  - Translation: difference in mean antenna position

[4] B. K. Horn, “Closed-form solution of absolute orientation using unit quaternions,” JOSA A, vol. 4, no. 4, pp. 629–642, 1987.
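A minimal sketch of a Horn/Umeyama-style similarity-transform fit between the stand-alone SLAM camera positions and the CDGPS-derived positions, following the rotation/scale/translation recipe above. This is a generic alignment assuming directly corresponding camera positions, not the paper's generalized form (which must account for the camera-to-antenna offset); all names are illustrative.

```python
import numpy as np

def fit_similarity_transform(p_slam, p_gps):
    """Return (scale, R, t) such that p_gps ~= scale * R @ p_slam + t.

    p_slam, p_gps: (N, 3) arrays of corresponding camera positions in the
    SLAM frame and the global (CDGPS) frame, respectively.
    """
    mu_s, mu_g = p_slam.mean(axis=0), p_gps.mean(axis=0)
    ds, dg = p_slam - mu_s, p_gps - mu_g        # deviations from mean positions

    # Rotation: best alignment of the mean-removed position clouds (via SVD).
    U, _, Vt = np.linalg.svd(dg.T @ ds)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt                              # nearest proper rotation

    # Scale: ratio of metrics describing the spread of the two position sets.
    scale = np.sqrt((dg ** 2).sum() / (ds ** 2).sum())

    # Translation: difference of the mean positions after rotating and scaling.
    t = mu_g - scale * R @ mu_s
    return scale, R, t
```

Applied to the camera trajectories, the recovered (scale, R, t) maps the ambiguous-scale SLAM solution into the global frame and provides the starting point for the full BA iteration.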


Simulation Scenario for BA

- Simulations investigating estimability are included in the paper
- Hallway simulation:
  - Measurement errors: 2 cm std for CDGPS, 1 pixel std for vision
  - Keyframes every 0.25 m (242 keyframes, 1310 point features)
- Three scenarios:
  1. GPS available
  2. GPS lost when the hallway is entered
  3. GPS reacquired when the hallway is exited

[Figure: simulated hallway trajectory with labeled points A, B, C, and D]
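For concreteness, a small sketch of how measurement noise at the stated levels could be injected in such a simulation (illustrative only; the paper's simulation setup and trajectory generation are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)
SIGMA_CDGPS = 0.02    # 2 cm standard deviation on CDGPS positions [m]
SIGMA_PIXEL = 1.0     # 1 pixel standard deviation on image feature measurements

def add_measurement_noise(true_antenna_positions, true_pixel_measurements):
    """Corrupt noise-free simulated measurements with zero-mean Gaussian
    noise at the standard deviations quoted on this slide."""
    noisy_positions = true_antenna_positions + rng.normal(
        scale=SIGMA_CDGPS, size=np.shape(true_antenna_positions))
    noisy_pixels = true_pixel_measurements + rng.normal(
        scale=SIGMA_PIXEL, size=np.shape(true_pixel_measurements))
    return noisy_positions, noisy_pixels
```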


Simulation Results for BA


Summary

- Hybrid batch/sequential estimator for loosely-coupled visual SLAM and CDGPS, with an IMU for state propagation
  - Compared to the optimal estimator
- Outlined the algorithm for BA (batch)
- Presented a novel technique for initialization of BA
- BA simulations:
  - Demonstrated cm-level positioning accuracy and correspondingly high attitude accuracy in areas of GPS availability
  - Attained slow drift during GPS unavailability (0.4% drift over 50 m)


Navigation Filter

- State vector: camera pose, velocity, and accelerometer and gyro biases
- Propagation step: standard EKF propagation using accelerometer and gyro measurements
- Accelerometer and gyro biases modeled as first-order Gauss-Markov processes
- More information in the paper
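A minimal sketch of the bias portion of the propagation step, assuming a discrete-time first-order Gauss-Markov model; the correlation time and steady-state sigma below are illustrative placeholders, not values from the paper:

```python
import numpy as np

def propagate_gauss_markov_bias(bias, P_bias, dt, tau=300.0, sigma_ss=0.01):
    """Propagate a first-order Gauss-Markov bias estimate and covariance.

    bias     : current bias estimate (e.g. 3-vector for accel or gyro bias)
    P_bias   : current 3x3 bias covariance
    dt       : propagation interval [s]
    tau      : correlation time constant [s]          (illustrative value)
    sigma_ss : steady-state bias standard deviation   (illustrative value)
    """
    phi = np.exp(-dt / tau)                        # scalar state-transition factor
    Q = sigma_ss**2 * (1.0 - phi**2) * np.eye(3)   # discrete-time process noise
    return phi * bias, phi**2 * P_bias + Q
```

In the full filter these bias states sit alongside position, velocity, and attitude; the accelerometer and gyro measurements, corrected by the current bias estimates, drive the pose and velocity portion of the standard EKF propagation.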


Navigation Filter (cont.)

- Measurement update step: image feature measurements from all non-keyframes
  - Temporarily augment the state with point feature positions
  - Prior comes from the map produced by BA
  - Must ignore cross-covariances, which leads to filter inconsistency
- Normal equations have a block-diagonal structure similar to BA
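A minimal sketch of the "temporarily augment the state" idea: stack the point-feature priors from the BA map onto the navigation state, run one standard EKF update with the image feature measurements, then keep only the navigation portion. The measurement quantities are assumed inputs, and the block-diagonal prior below is exactly the "ignore cross-covariances" simplification that causes the inconsistency noted above.

```python
import numpy as np

def augmented_feature_update(x_nav, P_nav, feat_prior, P_feat, z, z_pred, H, R):
    """One EKF measurement update with the state temporarily augmented by
    point features from the BA map.

    x_nav, P_nav       : navigation state (pose, velocity, biases) and covariance
    feat_prior, P_feat : point feature positions and covariance from the BA map
    z, z_pred          : stacked image feature measurements and their predictions
    H, R               : measurement Jacobian w.r.t. the augmented state, noise cov.
    """
    n = x_nav.size
    x_aug = np.concatenate([x_nav, feat_prior])
    # Block-diagonal prior: nav-feature cross-covariances are ignored,
    # which is the source of the filter inconsistency mentioned above.
    P_aug = np.block([
        [P_nav, np.zeros((n, feat_prior.size))],
        [np.zeros((feat_prior.size, n)), P_feat],
    ])

    S = H @ P_aug @ H.T + R                      # innovation covariance
    K = P_aug @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_aug = x_aug + K @ (z - z_pred)
    P_aug = (np.eye(x_aug.size) - K @ H) @ P_aug

    # Discard the temporary feature states; keep only the navigation portion.
    return x_aug[:n], P_aug[:n, :n]
```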


Simulation Results for BA (cont.)