Sensor Fusion for Aerial Robots
Shaojie Shen, Assistant Professor
Source: mrsl.grasp.upenn.edu/loiannog/tutorial_ICRA2016/ICRA_shaojie.pdf


Page 1

Sensor Fusion for Aerial Robots

Shaojie Shen, Assistant Professor

Page 2

Why Sensor Fusion?

• Vision/GPS-only state estimation is too noisy, slow, and delayed for feedback control of agile aerial robots

• To improve robustness with multiple sensors and handle sensor failures
• To estimate quantities that are unobservable using single sensors

[Figure: estimated trajectories — red: vision+IMU fusion; blue: vision-only]

Page 3

Design Considerations…

• Accuracy
• Frequency
• Latency
• Sensor synchronization & timestamp accuracy
• Delayed and out-of-order measurements
• Estimator initialization
• Sensor calibration
• Different measurement models with uncertainties
• Robustness to outliers
• Computational efficiency

Page 4

What to Fuse?

• IMU-centric fusion
  – High frequency
  – Low latency
  – (Almost) always available
  – (Usually) large drift
• Absolute measurements
• Relative measurements

[Figure: IMU, GPS velocity, GPS position, pressure altimeter, laser altimeter, cameras, and laser scanner feeding a multi-sensor fusion block]

Page 5

(A Few) Sensor Fusion Methods

• Kalman filtering
  – Loosely-coupled [1]: fuse processed information from individual sensors
  – Tightly-coupled [2]: fuse raw measurements directly
• Optimization-based methods
  – Mostly tightly-coupled [3,4]

[Figure: IMU and visual odometry feeding an EKF/UKF-based multi-sensor fusion block]

[1] D. Scaramuzza et al. Vision-controlled micro flying robots: from system design to autonomous navigation and mapping in GPS-denied environments. IEEE Robot. Autom. Mag., 21(3), 2014.
[2] A. I. Mourikis and S. I. Roumeliotis. A multi-state constraint Kalman filter for vision-aided inertial navigation. In Proc. of the IEEE Intl. Conf. on Robot. and Autom., pages 3565–3572, Roma, Italy, April 2007.
[3] S. Leutenegger et al. Keyframe-based visual-inertial SLAM using nonlinear optimization. In Proc. of Robot.: Sci. and Syst., Berlin, Germany, June 2013.
[4] S. Shen et al. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. In Proc. of the IEEE Intl. Conf. on Robot. and Autom., Seattle, WA, May 2015.

Page 6

Outline


• Loosely-Coupled, Extended Kalman Filtering-Based Multi-Sensor Fusion (“Tutorial”)

• Tightly-Coupled, Optimization-Based, Monocular Visual-Inertial Fusion, with Online Initialization and Camera-IMU Extrinsic Calibration (Brief Intro)

[Figure: IMU and visual odometry feeding an EKF-based multi-sensor fusion block]

Page 7

EKF - Assumption & Model

• The prior state of the robot is represented by a Gaussian distribution
  – $p(x_0) \sim N(\mu_0, \Sigma_0)$
• The continuous-time process model is:
  – $\dot{x} = f(x, u, n)$
  – $n_t \sim N(0, Q_t)$ is Gaussian white noise
  – Can be linearized using one-step Euler integration:
    • $\bar{x}_t \approx x_{t-1} + f(x_{t-1}, u_t, n_t)\,\delta t$
• The measurement model is:
  – $z = h(x, v)$
  – $v_t \sim N(0, R_t)$ is Gaussian white noise
• The process and measurement models are all time-stamped

Page 8

EKF - Process Model Linearization

• Linearize the process model about $x = \mu_{t-1}$, $u = u_t$, $n = 0$:
  – $\dot{x} \approx f(\mu_{t-1}, u_t, 0) + \left.\frac{\partial f}{\partial x}\right|_{\mu_{t-1}, u_t, 0}(x - \mu_{t-1}) + \left.\frac{\partial f}{\partial u}\right|_{\mu_{t-1}, u_t, 0}(u - u_t) + \left.\frac{\partial f}{\partial n}\right|_{\mu_{t-1}, u_t, 0}(n - 0)$
• Let:
  – $A_t = \left.\frac{\partial f}{\partial x}\right|_{\mu_{t-1}, u_t, 0}$
  – $B_t = \left.\frac{\partial f}{\partial u}\right|_{\mu_{t-1}, u_t, 0}$
  – $U_t = \left.\frac{\partial f}{\partial n}\right|_{\mu_{t-1}, u_t, 0}$
• Linear process model:
  – $\dot{x} \approx f(\mu_{t-1}, u_t, 0) + A_t(x - \mu_{t-1}) + B_t(u - u_t) + U_t(n - 0)$

Page 9

EKF – Measurement Model Linearization

• Linearize the measurement model about $x = \bar{\mu}_t$, $v = 0$:
  – $h(x, v) \approx h(\bar{\mu}_t, 0) + \left.\frac{\partial h}{\partial x}\right|_{\bar{\mu}_t, 0}(x - \bar{\mu}_t) + \left.\frac{\partial h}{\partial v}\right|_{\bar{\mu}_t, 0}(v - 0)$
• Let:
  – $C_t = \left.\frac{\partial h}{\partial x}\right|_{\bar{\mu}_t, 0}$
  – $W_t = \left.\frac{\partial h}{\partial v}\right|_{\bar{\mu}_t, 0}$
• Linear measurement model:
  – $z_t = h(x_t, v_t) \approx h(\bar{\mu}_t, 0) + C_t(x_t - \bar{\mu}_t) + W_t v_t$

Page 10

EKF - Summary

• Process Update:
  – $\bar{\mu}_t = \mu_{t-1} + \delta t\, f(\mu_{t-1}, u_t, 0)$
  – $\bar{\Sigma}_t = F_t \Sigma_{t-1} F_t^T + V_t Q_t V_t^T$
  – Assumptions: $\dot{x} = f(x, u, n)$, $n_t \sim N(0, Q_t)$
  – Linearization: $A_t = \left.\frac{\partial f}{\partial x}\right|_{\mu_{t-1}, u_t, 0}$, $U_t = \left.\frac{\partial f}{\partial n}\right|_{\mu_{t-1}, u_t, 0}$
  – Discretization: $F_t = I + \delta t\, A_t$, $V_t = \delta t\, U_t$
• Measurement Update:
  – $K_t = \bar{\Sigma}_t C_t^T \left(C_t \bar{\Sigma}_t C_t^T + W_t R_t W_t^T\right)^{-1}$
  – $\mu_t = \bar{\mu}_t + K_t\,(z_t - h(\bar{\mu}_t, 0))$
  – $\Sigma_t = \bar{\Sigma}_t - K_t C_t \bar{\Sigma}_t$

Notes:
• We may have multiple heterogeneous measurement models as long as they are fused sequentially
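The two-step recursion above can be sketched in a few lines (a minimal NumPy sketch; the function and argument names are illustrative, not from the slides):

```python
import numpy as np

def process_update(mu, Sigma, u, dt, f, A_fn, U_fn, Q):
    """EKF process update with one-step Euler discretization:
    F_t = I + dt * A_t and V_t = dt * U_t, as on the slide."""
    mu_bar = mu + dt * f(mu, u)
    F = np.eye(len(mu)) + dt * A_fn(mu, u)
    V = dt * U_fn(mu, u)
    Sigma_bar = F @ Sigma @ F.T + V @ Q @ V.T
    return mu_bar, Sigma_bar

def measurement_update(mu_bar, Sigma_bar, z, h, C, W, R):
    """EKF measurement update; W maps the noise v ~ N(0, R) into the
    measurement, giving innovation covariance C Sigma C^T + W R W^T."""
    S = C @ Sigma_bar @ C.T + W @ R @ W.T
    K = Sigma_bar @ C.T @ np.linalg.inv(S)
    mu = mu_bar + K @ (z - h(mu_bar))
    Sigma = Sigma_bar - K @ C @ Sigma_bar
    return mu, Sigma
```

Fusing several heterogeneous sensors sequentially, as the note says, just means calling `measurement_update` once per sensor with that sensor's `h`, `C`, `W`, and `R`.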

Page 11

Example 1

• Quadrotor with IMU and:
  – Absolute Pose Sensors (GPS + Pressure Altimeter + Magnetometer)
  – Absolute Velocity Sensor (Optical Flow / GPS using Doppler effect)

Page 12

State

• The IMU provides noisy and biased measurements of linear acceleration and angular velocity
• $\mathbf{x} = \begin{bmatrix} \mathbf{p} \\ \mathbf{q} \\ \dot{\mathbf{p}} \\ \mathbf{b}_g \\ \mathbf{b}_a \end{bmatrix} = \begin{bmatrix} \text{position} \\ \text{orientation} \\ \text{linear velocity} \\ \text{gyroscope bias} \\ \text{accelerometer bias} \end{bmatrix} \in \mathbb{R}^{15}$
• (For now) use Z-X-Y Euler angle parameterization of $SO(3)$ for orientation
  – $\mathbf{q} = [\phi, \theta, \psi]^T = [\text{roll}, \text{pitch}, \text{yaw}]^T$
  – $R = \begin{bmatrix} c_\psi c_\theta - s_\phi s_\psi s_\theta & -c_\phi s_\psi & c_\psi s_\theta + c_\theta s_\phi s_\psi \\ c_\theta s_\psi + c_\psi s_\phi s_\theta & c_\phi c_\psi & s_\psi s_\theta - c_\psi c_\theta s_\phi \\ -c_\phi s_\theta & s_\phi & c_\phi c_\theta \end{bmatrix}$
  – We may use quaternions and an error-state EKF to avoid singularities (not covered here)
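The Z-X-Y convention can be checked numerically: the matrix above is exactly $R_z(\psi)\,R_x(\phi)\,R_y(\theta)$. A small sketch (the helper name `rot_zxy` is illustrative):

```python
import numpy as np

def rot_zxy(phi, theta, psi):
    """Body-to-world rotation for Z-X-Y Euler angles (roll phi, pitch theta,
    yaw psi), written out entry-by-entry as on the slide."""
    c, s = np.cos, np.sin
    return np.array([
        [c(psi)*c(theta) - s(phi)*s(psi)*s(theta), -c(phi)*s(psi), c(psi)*s(theta) + c(theta)*s(phi)*s(psi)],
        [c(theta)*s(psi) + c(psi)*s(phi)*s(theta),  c(phi)*c(psi), s(psi)*s(theta) - c(psi)*c(theta)*s(phi)],
        [-c(phi)*s(theta),                          s(phi),        c(phi)*c(theta)],
    ])
```

Multiplying out the elementary rotations $R_z(\psi) R_x(\phi) R_y(\theta)$ reproduces every entry, and the result is orthonormal, which is a useful sanity check when implementing the Jacobians.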

Page 13

Process Model

• The gyroscope gives a noisy and biased estimate of the angular velocity
  – $\omega_m = \omega + \mathbf{b}_g + \mathbf{n}_g$
• The drift in the gyroscope bias is a Gaussian white noise process
  – $\dot{\mathbf{b}}_g = \mathbf{n}_{bg}$, with $\mathbf{n}_{bg} \sim N(0, Q_g)$
• The accelerometer gives a noisy and biased estimate of the linear acceleration
  – $\mathbf{a}_m = R(\mathbf{q})^T(\ddot{\mathbf{p}} - \mathbf{g}) + \mathbf{b}_a + \mathbf{n}_a$
• The drift in the accelerometer bias is a Gaussian white noise process
  – $\dot{\mathbf{b}}_a = \mathbf{n}_{ba}$, with $\mathbf{n}_{ba} \sim N(0, Q_a)$

Page 14

Process Model

• $\omega_m$ is in the body frame; $\mathbf{q}$ is in the world frame
• The angular velocity in the body frame is given by
  – $\omega = \begin{bmatrix} p \\ q \\ r \end{bmatrix} = \begin{bmatrix} c_\theta & 0 & -c_\phi s_\theta \\ 0 & 1 & s_\phi \\ s_\theta & 0 & c_\phi c_\theta \end{bmatrix} \begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix} = G(\mathbf{q})\,\dot{\mathbf{q}}$
• Process model:
  – $\dot{\mathbf{x}} = \begin{bmatrix} \dot{\mathbf{p}} \\ G(\mathbf{q})^{-1}(\omega_m - \mathbf{b}_g - \mathbf{n}_g) \\ \mathbf{g} + R(\mathbf{q})(\mathbf{a}_m - \mathbf{b}_a - \mathbf{n}_a) \\ \mathbf{n}_{bg} \\ \mathbf{n}_{ba} \end{bmatrix}$
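As a concrete sketch, the zero-noise process model $\dot{\mathbf{x}} = f(\mathbf{x}, u, 0)$ used for mean propagation might look as follows (names are illustrative; gravity is assumed to be $(0,0,-9.81)$ in the world frame):

```python
import numpy as np

def rot_zxy(phi, theta, psi):
    """Z-X-Y Euler (roll, pitch, yaw) body-to-world rotation, R = Rz Rx Ry."""
    c, s = np.cos, np.sin
    Rz = np.array([[c(psi), -s(psi), 0], [s(psi), c(psi), 0], [0, 0, 1]])
    Rx = np.array([[1, 0, 0], [0, c(phi), -s(phi)], [0, s(phi), c(phi)]])
    Ry = np.array([[c(theta), 0, s(theta)], [0, 1, 0], [-s(theta), 0, c(theta)]])
    return Rz @ Rx @ Ry

def g_matrix(phi, theta):
    """G(q): maps Euler-angle rates to the body-frame angular velocity."""
    c, s = np.cos, np.sin
    return np.array([
        [c(theta), 0.0, -c(phi) * s(theta)],
        [0.0,      1.0,  s(phi)],
        [s(theta), 0.0,  c(phi) * c(theta)],
    ])

GRAVITY = np.array([0.0, 0.0, -9.81])

def process_model(x, omega_m, a_m):
    """x_dot = f(x, u, 0) for the 15-dim state [p, q, p_dot, b_g, b_a],
    with the IMU reading (omega_m, a_m) as the input u and noise set to 0."""
    q, v = x[3:6], x[6:9]
    bg, ba = x[9:12], x[12:15]
    phi, theta, psi = q
    q_dot = np.linalg.solve(g_matrix(phi, theta), omega_m - bg)
    v_dot = GRAVITY + rot_zxy(phi, theta, psi) @ (a_m - ba)
    return np.concatenate([v, q_dot, v_dot, np.zeros(3), np.zeros(3)])
```

For a stationary, level platform the accelerometer reads $R^T(\ddot{\mathbf{p}} - \mathbf{g}) = (0, 0, 9.81)$, and the predicted state derivative is zero, which is a convenient unit test.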

Page 15

Absolute Measurement Model

• Position: GPS + Pressure Altimeter
• Orientation: Magnetometer + Accelerometer (not exactly correct…)
• Velocity: Optical flow / GPS with Doppler effect
• $\mathbf{z} = \begin{bmatrix} \mathbf{p} \\ \mathbf{q} \\ \dot{\mathbf{p}} \end{bmatrix} + \mathbf{v} = \begin{bmatrix} I & 0 & 0 & 0 & 0 \\ 0 & I & 0 & 0 & 0 \\ 0 & 0 & I & 0 & 0 \end{bmatrix} \begin{bmatrix} \mathbf{p} \\ \mathbf{q} \\ \dot{\mathbf{p}} \\ \mathbf{b}_g \\ \mathbf{b}_a \end{bmatrix} + \mathbf{v} = C\,\mathbf{x} + \mathbf{v}$
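Because the absolute sensors observe the first nine state components directly, the measurement Jacobian $C$ is a constant selection matrix (a small sketch):

```python
import numpy as np

# z = [p; q; p_dot] + v: the absolute sensors observe position, orientation,
# and velocity directly, so C selects the first nine of the fifteen state
# components; the IMU biases are corrected only through their correlations.
I3, Z3 = np.eye(3), np.zeros((3, 3))
C = np.block([
    [I3, Z3, Z3, Z3, Z3],
    [Z3, I3, Z3, Z3, Z3],
    [Z3, Z3, I3, Z3, Z3],
])
```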

Page 16

Example 2

• Quadrotor with IMU and:
  – 6-DOF Relative Pose Sensor (Stereo Visual Odometry)
  – 3-DOF Relative Pose Sensor (Laser Scan Matching)
  – 1-DOF Relative Pose Sensor (Laser/Sonar Altimeter)
• These are all measurements with respect to a certain keyframe

Page 17

State Augmentation

• The Kalman filter requires all measurements to be related to the current state only
• State augmentation at keyframe changes (Roumeliotis and Burdick, 2002)
  – May augment arbitrary copies of states depending on the availability of relative sensors

[Figure: at a keyframe change, the current state $\mathbf{x}_3$ and its covariance $\mathbf{P}_{33}$ are cloned into an augmented state; a later relative measurement $\mathbf{z}_5$ relates the main state $\mathbf{x}_5$ to the augmented copy $\mathbf{x}_3$ through the joint covariance blocks $\mathbf{P}_{55}, \mathbf{P}_{53}, \mathbf{P}_{35}, \mathbf{P}_{33}$]
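The covariance bookkeeping for cloning a state can be sketched as follows (the helper `augment_state` is illustrative, not the slides' code): the cloned mean is appended, and the matching covariance rows/columns are duplicated so the cross-correlation between current and stored states is tracked.

```python
import numpy as np

def augment_state(mu, Sigma, idx):
    """Clone the sub-state mu[idx] (e.g. the pose at a new keyframe) into the
    state vector, duplicating the matching covariance blocks."""
    idx = np.asarray(idx)
    n, k = len(mu), len(idx)
    mu_aug = np.concatenate([mu, mu[idx]])
    Sigma_aug = np.zeros((n + k, n + k))
    Sigma_aug[:n, :n] = Sigma                      # original covariance
    Sigma_aug[:n, n:] = Sigma[:, idx]              # cross-covariance (cols)
    Sigma_aug[n:, :n] = Sigma[idx, :]              # cross-covariance (rows)
    Sigma_aug[n:, n:] = Sigma[np.ix_(idx, idx)]    # cloned block
    return mu_aug, Sigma_aug
```

Right after cloning, the copy is perfectly correlated with the original; subsequent process updates evolve only the main state, so the correlation decays as the robot moves away from the keyframe.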

Page 18

Relative Measurement Models

• 6-DOF relative pose measurement from stereo visual odometry
• Both $\mathbf{x}_t$ and $\mathbf{x}_{t-k}$ are augmented into a joint state $\mathbf{y}_t = \begin{bmatrix} \mathbf{x}_t \\ \mathbf{x}_{t-k} \end{bmatrix}$
• $\mathbf{z} = \begin{bmatrix} \Delta\mathbf{p}_{t|t-k} \\ \Delta\mathbf{q}_{t|t-k} \end{bmatrix} + \mathbf{v}_t = \begin{bmatrix} R(\mathbf{q}_{t-k})(\mathbf{p}_t - \mathbf{p}_{t-k}) \\ \mathrm{euler}(R(\mathbf{q}_{t-k})^T R(\mathbf{q}_t)) \end{bmatrix} + \mathbf{v}_t = h(\mathbf{x}_t, \mathbf{x}_{t-k}) + \mathbf{v}_t = h(\mathbf{y}_t) + \mathbf{v}_t$

Notes:
• One state augmentation per keyframe change
• Only states that are affected by the relative measurement (pose in the example) need to be augmented
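A sketch of the predicted relative-pose measurement $h(\mathbf{y}_t)$, using rotation matrices in place of the slide's Euler angles and assuming $R$ maps body to world, so $R_{t-k}^T$ expresses the world-frame difference in the keyframe's frame (conventions vary):

```python
import numpy as np

def relative_pose_measurement(p_t, R_t, p_k, R_k):
    """Predicted 6-DOF relative pose between the current pose (p_t, R_t) and
    the augmented keyframe pose (p_k, R_k)."""
    dp = R_k.T @ (p_t - p_k)  # translation expressed in the keyframe frame
    dR = R_k.T @ R_t          # relative rotation
    return dp, dR
```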

Page 19

Sensor synchronization & timestamps

• Best: Sensors perfectly synchronized
• OK: Sensors have the same clock
• Bad: Sensors with different clocks or inaccurate timestamps

[Figure: IMU and vision sample timelines illustrating each of the three cases]

Page 20

Delayed and Out-of-Sequence Measurements

• Handle delayed and out-of-sequence measurements using a fixed-lag priority queue
  – Redo process updates (state only) after each delayed measurement update
  – Covariance update is done only up to the latest measurement
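The buffering part of this idea can be sketched with a small priority queue (names and structure are illustrative, not from the slides): events are held for a fixed lag so stragglers can still be merged in timestamp order, after which the state-only process updates are replayed.

```python
import heapq

class FixedLagQueue:
    """Buffer time-stamped events (IMU inputs and measurements) and release
    them in timestamp order once they are older than the lag window."""

    def __init__(self, lag):
        self.lag = lag      # how long to wait for stragglers, in seconds
        self.queue = []     # min-heap of (timestamp, kind, payload)

    def push(self, t, kind, payload):
        heapq.heappush(self.queue, (t, kind, payload))

    def pop_ready(self, now):
        """Pop all events with timestamp <= now - lag, sorted by timestamp."""
        ready = []
        while self.queue and self.queue[0][0] <= now - self.lag:
            ready.append(heapq.heappop(self.queue))
        return ready
```

A measurement that arrives late but whose timestamp falls inside the lag window is popped in its correct temporal slot, so the filter can apply it and then re-propagate the state forward.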

Page 21

Multi-Sensor Quadrotor

• UKF-based multi-sensor fusion

[Figure: quadrotor platform with stereo cameras, GPS and magnetometer, laser scanner with mirror housing, IMU and pressure altimeter, and an Intel Core i3 computer]

Page 22

Page 23

Page 24

Example 3

• Quadrotor with IMU and:
  – Up-to-Scale Relative Pose Sensor (Monocular Visual Odometry)
  – Very relevant for micro aerial robots!

Page 25

Loosely-Coupled Monocular Visual-Inertial Fusion

• State is modified to explicitly estimate the visual scale:
  – $\mathbf{x} = \begin{bmatrix} \mathbf{p} \\ \mathbf{q} \\ \dot{\mathbf{p}} \\ \mathbf{b}_g \\ \mathbf{b}_a \\ \lambda \end{bmatrix} = \begin{bmatrix} \text{position} \\ \text{orientation} \\ \text{linear velocity} \\ \text{gyroscope bias} \\ \text{accelerometer bias} \\ \text{visual scale} \end{bmatrix} \in \mathbb{R}^{16}$
• Relative measurement model ($\Delta\tilde{\mathbf{p}}$: up-to-scale relative translation):
  – $\mathbf{z} = \begin{bmatrix} \Delta\tilde{\mathbf{p}}_{t|t-k} \\ \Delta\mathbf{q}_{t|t-k} \end{bmatrix} + \mathbf{v}_t = \begin{bmatrix} \lambda \cdot R(\mathbf{q}_{t-k})(\mathbf{p}_t - \mathbf{p}_{t-k}) \\ \mathrm{euler}(R(\mathbf{q}_{t-k})^T R(\mathbf{q}_t)) \end{bmatrix} + \mathbf{v}_t = h(\mathbf{x}_t, \mathbf{x}_{t-k}) + \mathbf{v}_t$
• May suffer from convergence issues if scale initialization is poor
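The only change from the metric relative-pose model is the scale factor on the translation; the rotation is unaffected by scale. A sketch (same body-to-world rotation-matrix assumption as before):

```python
import numpy as np

def up_to_scale_measurement(p_t, R_t, p_k, R_k, lam):
    """Monocular VO measurement with the visual scale lambda in the state:
    the predicted relative translation is scaled, the rotation is not."""
    dp = lam * (R_k.T @ (p_t - p_k))
    dR = R_k.T @ R_t
    return dp, dR
```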

Page 26

Summary of Loosely-Coupled EKF Fusion

• Accuracy
• Frequency
• Latency
• Sensor synchronization & timestamp accuracy
  – Can be done using offline calibration if sensors have the same clock
• Delayed and out-of-order measurements
• Estimator initialization
• Sensor calibration
  – Can be done using offline calibration and online refinement
• Different measurement models with uncertainties
• Robustness to outliers
• Computational efficiency

Page 27

Open Source Packages

• Kalibr - Offline calibration toolbox (ETH Zurich ASL)
  – Multi-camera calibration
  – Camera-IMU calibration
  – Temporal alignment: handles sensor synchronization & timestamp accuracy issues
  – https://github.com/ethz-asl/kalibr
• MSF - Modular framework for multi-sensor fusion based on an EKF (ETH Zurich ASL)
  – https://github.com/ethz-asl/ethzasl_msf

Page 28

Outline


• Loosely-Coupled, Extended Kalman Filtering-Based Multi-Sensor Fusion (“Tutorial”)

• Tightly-Coupled, Optimization-Based, Monocular Visual-Inertial Fusion, with Online Initialization and Camera-IMU Extrinsic Calibration (Brief Intro)

[Figure: IMU and visual odometry feeding an EKF-based multi-sensor fusion block]

Page 29

Challenges

• Scale ambiguity: λ = ?
• Up-to-scale motion estimation and 3D reconstruction (Structure from Motion)

Page 30

Challenges

• With an IMU, scale is observable, but…
  – Requires initial velocity and attitude (gravity)
  – Requires knowledge about camera-IMU calibration
  – Highly nonlinear system – requires initial values to converge

[Figure: short-term integration of the IMU, with unknowns $v_0$, $g_0$, $R_c^b$, $p_c^b$]

Can we operate without initialization or calibration?
Plug-and-play monocular visual-inertial systems!

Page 31

System Pipeline

• Linear camera-IMU rotation calibration ($R_c^b$)
• Linear initialization and camera-IMU translation calibration ($v_0$, $g_0$, $p_c^b$)
• Tightly-coupled nonlinear optimization and calibration refinement
• Calibration convergence identification methods

Page 32

Linear Camera-IMU Rotation Calibration

• Incremental IMU rotations: $R_{b_1}^{b_0}, R_{b_2}^{b_1}, R_{b_3}^{b_2}, \ldots$
• Incremental camera rotations: $R_{c_1}^{c_0}, R_{c_2}^{c_1}, R_{c_3}^{c_2}, \ldots$
• Incrementally compute $R_c^b$ to align the two rotation sequences
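One way to sketch the alignment (not necessarily the slides' exact linear algorithm): corresponding increments satisfy the hand-eye relation $R_b\,R_c^b = R_c^b\,R_c$, so their rotation axes are related by $\mathbf{a}_{\mathrm{imu}} = R_c^b\,\mathbf{a}_{\mathrm{cam}}$, and stacking axis pairs gives an orthogonal Procrustes problem solvable by SVD:

```python
import numpy as np

def rotation_axis(R):
    """Unit rotation axis of R (valid when the angle is not 0 or pi)."""
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return w / np.linalg.norm(w)

def calibrate_rotation(imu_increments, cam_increments):
    """Recover R_c^b from paired incremental rotations by aligning their
    rotation axes (Kabsch/SVD solution of the Procrustes problem)."""
    A = np.stack([rotation_axis(R) for R in imu_increments])  # N x 3
    B = np.stack([rotation_axis(R) for R in cam_increments])  # N x 3
    U, _, Vt = np.linalg.svd(B.T @ A)
    # Sign correction keeps the result a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T
```

At least two increments with non-parallel axes are needed; in practice many pairs are stacked and the solution is refined as new motion arrives.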

Page 33

IMU Pre-Integration

• IMU integration in the body frame of the first pose
  – Nonlinearity from relative rotation only
  – Linear update equations for position, velocity, and gravity
  – IMU integration without initialization
  – Uncertainty propagation on manifold

[Figure: pre-integration between consecutive body frames $B_0$ and $B_1$]
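A bias- and noise-free sketch of the idea (names are illustrative): the pre-integrated quantities accumulate raw IMU samples in the frame of the first pose, so they can be formed before the initial velocity or gravity is known.

```python
import numpy as np

def so3_exp(w):
    """Rotation matrix exp([w]x) via the Rodrigues formula."""
    angle = np.linalg.norm(w)
    if angle < 1e-12:
        return np.eye(3)
    axis = w / angle
    K = np.array([[0, -axis[2], axis[1]], [axis[2], 0, -axis[0]], [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def preintegrate(accels, gyros, dt):
    """Euler-integrate IMU samples in the first body frame B0: returns the
    pre-integrated position alpha, velocity beta, and rotation dR, which
    depend only on the measurements, not on v0 or gravity."""
    dR = np.eye(3)
    alpha, beta = np.zeros(3), np.zeros(3)
    for a, w in zip(accels, gyros):
        alpha = alpha + beta * dt + 0.5 * (dR @ a) * dt ** 2
        beta = beta + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return alpha, beta, dR
```

The state update over the interval then reads $p_1 = p_0 + v_0 T + \tfrac{1}{2}\mathbf{g}T^2 + R_0\,\alpha$, which is linear in $v_0$ and $\mathbf{g}$ once the rotations are known; this is what enables the linear initialization on the next slide.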

Page 34

Linear Sliding Window Initialization and Camera-IMU Translation Calibration

• IMU pre-integration for using the IMU without initialization
• Estimates position, velocity, gravity, camera-IMU translation, and feature depth using graph-based optimization
• Linear formulation enables recovery of initial conditions

[Figure: sliding-window graph linking states $(\mathbf{p}_i, \mathbf{v}_i, \mathbf{g}_i)$, $i = 0, \ldots, 4$, with feature depths $\lambda_0, \lambda_1$ and the camera-IMU translation $\mathbf{p}_c^b$; "Initial Condition" marks the start of the window]

Page 35

On-the-Fly Initialization

Page 36

Camera-IMU Calibration

[Figure: convergence of the camera-IMU rotation and camera-IMU translation estimates]

Page 37

Linear Initialization and Camera-IMU Calibration


Page 38

System Pipeline

• Tightly-coupled nonlinear optimization and calibration refinement

Page 39

Tightly-Coupled Nonlinear Sliding Window Optimization with Calibration Refinement

• Nonlinear graph-based optimization seeded by the linear initialization
  – Optimize position, velocity, rotation, inverse feature depth, and camera-IMU transformation simultaneously
  – Iteratively minimize residuals from all sensors: prior, IMU residual, and reprojection error, relinearizing at each iteration

Page 40

Autonomous Quadrotor Flight


Page 41

Self-Calibrating Multi-Sensor Fusion


Page 42

Summary of Tightly-Coupled Fusion

• Accuracy
• Frequency
• Latency
• Sensor synchronization & timestamp accuracy
• Delayed and out-of-order measurements
• Estimator initialization
• Sensor calibration
• Different measurement models with uncertainties
• Robustness to outliers
• Computational efficiency

Page 43

Conclusion


• Why sensor fusion?
• What methods are available?
  – Filtering-based
  – Optimization-based
  – Loosely-coupled
  – Tightly-coupled
• What are the special design considerations?
  – Multiple measurement models
  – Delayed and out-of-order measurements
  – Sensor calibration
  – Estimator initialization
  – Computational efficiency