
Page 1

Jet Propulsion Laboratory California Institute of Technology

NASA Experience with Automated and Autonomous Navigation in Deep Space

Ed Riedel

Optical Navigation Group Mission Design and Navigation Section

Jet Propulsion Laboratory California Institute of Technology

A description of work with many contributions from:

Andrew Vaughan, Nick Mastrodemos, Shyam Bhaskaran, William M. Owen, Dan Kubitscheck, Bob Werner, Bob Gaskell (all JPL)

Christopher Grasso (BSE)

Page 2

Introductory Remarks


Page 3

What is Optical Navigation?

Optical Navigation: determining the location of a near-field object (e.g. the Moon) relative to a well-known far-field reference (e.g. the background starfield), or relative to well-known camera attitude. (However, with a sufficiently wide-field imager, pointing knowledge can be obtained simultaneously with position knowledge.)


Optical Navigation variously requires:
•  Accurate star catalogs and physical body models, including landmarks.
•  Accurate camera calibrations, including geometric distortions and photometric modeling.
•  Astrometric-quality imaging systems, often with high dynamic range.
•  Filtering and estimation of optical-relevant parameters along with s/c position and attitude.
•  Ground-based optical navigation processing is very similar to radiometric ground processing, with the addition of (sometimes difficult and labor-intensive) image processing.
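The core observable described above can be sketched with a simple pinhole model (a minimal illustration, not flight code; the function name, frame conventions, and the assumption of an already-calibrated, distortion-free camera are mine):

```python
import numpy as np

def pixel_to_inertial_los(px, py, cx, cy, focal_px, C_cam_to_inertial):
    """Convert a target centroid (px, py), in pixels, into an inertial
    unit line-of-sight vector, given the optical center (cx, cy), the
    focal length in pixels, and a known camera-to-inertial rotation.
    Real OpNav also models geometric distortion and photometric effects."""
    v_cam = np.array([(px - cx) / focal_px, (py - cy) / focal_px, 1.0])
    v_cam /= np.linalg.norm(v_cam)        # unit vector in the camera frame
    return C_cam_to_inertial @ v_cam      # rotate into the inertial frame

# A centroid at the optical center maps to the boresight direction:
los = pixel_to_inertial_los(512.0, 512.0, 512.0, 512.0, 8000.0, np.eye(3))
```

In flight, `C_cam_to_inertial` comes either from stars in the same frame (the wide-field case above) or from the attitude control system.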

Page 4

Optical Navigation Syllabus


•  Opti-metrics: “Doppler” (red/blue shift) and range extracted from an optical com link
•  LASER altimetry: such as done by LRO (LOLA)
•  Earth-based astrometry: using telescopes to see the spacecraft, probably via the downlink optical com signal
•  Classical optical navigation
   –  Observing distant point-like objects against stars
   –  Observing large near-field objects and extracting limbs or landmarks (also called Target-Relative Navigation (TRN))
   –  These OpNav pictures can be processed on the ground or onboard
•  AutoNav: a software set that performed classical optical navigation on Deep Space 1, Stardust and Deep Impact (as well as NExT and EPOXI)

Page 5

A Brief History of Deep-Space Optical Navigation

•  1976: Viking orbiters demonstrate optical navigation using Phobos and Deimos.
•  1978-1989: Voyager requires optical navigation to meet mission objectives at Jupiter, Saturn, Uranus and Neptune. The method improves navigation performance over conventional radio by factors of 2 to 20.
•  1991-1996: Galileo utilizes optical navigation to capture images of Gaspra and Ida, and to accurately achieve orbit.
•  1996-1998: Deep Space 1: optical navigation automated and adapted to onboard operation (AutoNav); used in cruise flight in 1999.
•  2000-2001: NEAR orbital operations: JPL uses ground-based optical navigation to orbit and land.
•  2001: Deep Space 1 AutoNav captures images of comet Borrelly.
•  2002-present: Cassini optical navigation.
•  2004: Stardust AutoNav captures images of comet Wild 2.
•  2005: Deep Impact AutoNav impacts comet Tempel 1, and subsequently captures images of the impact crater.
•  2006: MRO performs demonstration navigation with the Mars Optical Navigation Camera (ONC).
•  2010: AutoNav targets Deep Impact (EPOXI) at Hartley 2.
•  2011: AutoNav targets Stardust/NExT at Tempel 1.
•  2011-2012: Optical Nav at Vesta for Dawn.
•  2014: Optical Nav at 67P/Churyumov-Gerasimenko (CG) for Rosetta (done by JPL as ‘shadow Nav’).
•  2015-2016: Optical Nav at Ceres for Dawn.
•  2015: New Horizons at Pluto.

Voyager 1 at Jupiter, 1979

Galileo at Gaspra, 1991

NEAR at (and on) Eros, 2001

Viking at Mars, 1976

Rosetta at 67P, 2014

Page 6

Recent Optical Navigation Experience


Page 7

JPL Experience: Recent Autonomous OpNav and Autonomous Navigation Technology Successes

Stardust AutoNav at Annefrank and Wild 2, Nov. 2002, Jan. 2004.

Deep Impact AutoNav at Tempel 1 July 2005

DS1 AutoNav Deep Cruise Navigation, Sept. 1999

MRO OpNav Camera Validation Feb. 2006

DS1 AutoNav at Borrelly, Sept. 2001

Hayabusa Imaging Science: Itokawa Shape Model, Sept. 2005

Deep Impact AutoNav Hartley 2, Nov. 4, 2010

Stardust AutoNav Tempel 1, Feb 14, 2011

Page 8

Using Optical Navigation and Landmark-based Navigation to Perform the Navigation at Vesta

A control network of tens of thousands of landmarks has been created to navigate Vesta on a daily basis, merging radiometric and optical data. Analysis shows that this work could be done optically only, but the inertial reference (SRU) on the Dawn s/c would need serious analysis and modeling.


Page 9


AutoNav

On July 4, 2005, AutoNav bagged the third of NASA’s first three comet-nucleus encounters (at left); the other two, Borrelly (Sept. 2001) and Wild 2 (Jan. 2004), were also captured with AutoNav. These were followed by Hartley 2 in 2010 and a Tempel 1 revisit in 2011. AutoNav placed optical navigation elements onboard, enabling an otherwise impossible speedy turn-around of navigation operations.

Page 10


AutoNav on the DI Impactor

9P/Tempel 1 at Impact-2 hrs spans 10 pixels; AutoNav begins controlling the Impactor

Page 11


AutoNav on the DI Impactor

9P/Tempel 1 at Impact-2 hrs spans 10 pixels

Page 12


Flyby AutoNav - Deep Impact Encounter MRI & HRI Cameras

64 MRI pixel subframe (0.04º)

512 HRI pixel subframe (0.06º)

Page 13

Landmarks


Page 14


Navigating in proximity to a natural body using landmarks

Step 1: A survey pass or orbits, in which many images are taken with multiple views of distinct topographic areas on the surface. These areas are modeled, including high-precision topocentric location, and become landmarks in the Navigation Landmark Control Network. This requires accurate s/c position determination, usually via the combined use of radio and optical methods.

Step 2: Using the landmark network, the s/c observes the surface, performs precision image-location of the known terrain elements, and - by knowing the location of the elements - determines its own instantaneous position.

The accuracy of terrain relative navigation using landmarks is potentially limited only by the resolution of the imaging system - modest cameras can give very high accuracy.

Step 3: The instantaneous s/c positions are combined in a navigation filter that estimates s/c dynamics (e.g. position, velocity and propulsive events) and body dynamics (e.g. gravity field components).
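Step 2's instantaneous position fix can be sketched as a small least-squares problem (a toy illustration under my own assumptions: known landmark positions, perfect unit line-of-sight measurements, and three or more landmarks in view):

```python
import numpy as np

def position_from_landmarks(landmarks, los_units):
    """Least-squares spacecraft position from unit line-of-sight vectors
    to landmarks with known inertial positions p_i. Each observation
    constrains the position r to lie on the line through p_i along the
    LOS: (I - u u^T)(p_i - r) = 0. Stack the constraints and solve."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(landmarks, los_units):
        P = np.eye(3) - np.outer(u, u)   # projector orthogonal to the LOS
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Three landmarks seen along the +x, +y, +z axes from r = (1, 2, 3):
r = position_from_landmarks(
    [np.array([10.0, 2.0, 3.0]), np.array([1.0, 20.0, 3.0]),
     np.array([1.0, 2.0, 30.0])],
    [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
     np.array([0.0, 0.0, 1.0])])   # recovers (1, 2, 3)
```

In Step 3, fixes like this (or the raw image residuals themselves) feed the filter that also estimates velocity, maneuvers and gravity terms.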


Page 15

Developing Topography and Landmarks from Multiple Stereo Views and Limbs Using Stereophotoclinometry

[Figure: a landmark’s local coordinate system (including the local normal) and local reference plane on the asteroid; stereo views and a limb view of an example 50x50-pixel observation patch; the Digital Terrain Map (DTM) of vectors describing the local terrain about the center of the landmark – the landmark DTM plus albedo map, or LMAP.]

Process:
•  Multiple images (often dozens) of the same location are combined into an estimate of a local Digital Terrain Map (DTM): a set of local altitude vectors.
•  Many landmarks are observed in many images, so the locations of all landmarks can be estimated simultaneously; the global residual “stress” is removed through a numerical relaxation process.
•  The amalgam of DTMs forms a global topography and shape model. Simultaneous estimation of surface reflectance also produces albedo maps.
•  Simultaneous estimation of body gravity parameters and s/c orbit parameters, as well as landmark locations, can be made in the landmark-location estimation process.
•  Solutions for all parameters can use optical and radiometric navigation data in combined solutions.


Stereophotoclinometry potentially turns the entire surface of the body into high accuracy navigation beacons that can be used at all attitudes, ranges and lighting conditions.
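The “clinometry” part of the process can be illustrated with a one-dimensional toy (my own simplification: a Lambertian surface of uniform albedo, with the sun in the plane of the profile at a known elevation; real SPC combines many images, estimates albedo, and works in two dimensions):

```python
import numpy as np

def heights_from_brightness(I, sun_elev, dx):
    """Recover relative heights along a 1-D profile from Lambertian
    brightness I = cos(incidence angle), with the sun at elevation
    sun_elev (radians) in the plane of the profile: brightness gives
    the local slope, and integrating slopes gives the terrain."""
    i = np.arccos(np.clip(I, -1.0, 1.0))   # incidence angle per sample
    slope = i + sun_elev - np.pi / 2       # local slope angle (1-D geometry)
    return np.cumsum(np.tan(slope)) * dx   # integrate dz/dx into heights

# Synthesize brightness from a known slope profile, then recover it:
s_true = np.array([0.10, -0.05, 0.20, 0.00])
I = np.sin(np.pi / 3 - s_true)             # sun at 60 deg elevation
z = heights_from_brightness(I, np.pi / 3, dx=1.0)
```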


Page 16

OpNav at the Moon


Page 17

1024x1024 km Topography Map of the Lunar South Pole Region (centered on Cabeus), Developed Using Stereophotoclinometry from Apollo, LO, Clementine and LRO Images

Map developed by Robert Gaskell; resolution 100-200 m

Page 18

Navigating at the Moon with SPC-Derived Landmarks - LCROSS

LCROSS Image Element

SPC Model

Descent image taken from ~6000km altitude

LCROSS Image


Page 19

Navigating at the Moon with SPC-Derived Landmarks - LCROSS

LCROSS Image Element SPC Model

•  Final vis descent image taken from ~200 km altitude.
•  With 150 m resolution maps, LCROSS navigation was able to resolve positions, relative to the radiometric solutions, to the 50 m level.
•  It is believed that this result is consistent with the ensemble map-registration capability (frame-tie) of SPC-generated maps from multiple sources.

LCROSS Image


Page 20

LCROSS at the Moon


This is a movie of descent imaging from LCROSS during its final approach to the Moon. JPL’s OpNav group performed post-flight OpNav analysis for LCROSS (as part of the JPL Nav function for that mission).

Page 21

10 km

Clementine + LRO + LO + Apollo-based topography map


Page 22

Nominal Target

Radio + Optical Soln.

Final LCROSS Impact Site Reconstruction with OpNav


Page 23

Automated and Autonomous Nav at Small Bodies


Page 24

Using Optical Navigation and Landmark-based Navigation to rendezvous and land at Phobos

A Demonstration and Simulation


To Mars

Trajectory begins 400km uptrack from Phobos

Two revs around Phobos before landing

16 mid-course maneuvers autonomously computed and executed to maintain orbit and landing accuracy.

Final 10m/s burn to inject on terminal landing trajectory

All navigation is entirely with passive optical data, utilizing two cameras, narrow-angle plus wide-angle.

OBIRON is used to extract landmark data, utilizing maps derived from an (assumed) previous mission.

Page 25

Page 26

Comet and Asteroid Sample Return Touch and Go


Comet Sample Return, Approach and Touch and Go (TAG) Geometry and Strategy

Station Keeping Zone

10’s of kilometers, many hours

AutoGNC lock-on, and commit to TAG, at several km altitude

Descent, TAG, Departure, in 10’s of minutes under AutoGN&C Control

Radiation of commit authorization: TAG minus several hours

Page 27

Navigation to a Comet Nucleus With a Flight-like Onboard GN&C System


Page 28

Navigation to a Comet Nucleus With a Flight-like Onboard GN&C System


Page 29

Getting Around the Solar System


Page 30

Interplanetary GPS (courtesy Mike Shao, JPL)

•  A self-contained instrument that provides XYZ position of a spacecraft to ~100-300 m almost anywhere in the solar system.

•  DS1 demonstrated autonomous navigation by looking at asteroids (currently accurate to ~3-10km)

•  Recent progress: in 2015 we have:
   –  Micropixel centroiding (super camera cals)
   –  Synthetic tracking (multiple flash frames)

•  GAIA catalog of millions of main belt asteroid orbits.

•  If a spacecraft has an optical comm telescope that also has a calibrated focal plane, it can do its own very accurate navigation.

•  Accuracy may be lower in the “outer” solar system (beyond 3AU), subject to availability of targets.

•  The main systematic error in asteroid-based navigation is the center-of-light/center-of-mass (CL/CM) offset. A 10 km diameter asteroid will have a CL/CM offset of ~1 km after correcting for the illumination phase, but 100 m is possible by averaging over >100 asteroids. Light curves can also be autonomously estimated onboard.
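The averaging claim in the last bullet can be checked with a quick Monte Carlo (a sanity check under an idealized assumption of zero-mean, uncorrelated 1 km CL/CM errors):

```python
import numpy as np

# If each asteroid's residual CL/CM error is ~1 km, zero-mean and
# uncorrelated, the error of a position fix averaged over N asteroids
# shrinks roughly as 1/sqrt(N): ~1 km at N = 1, ~0.1 km at N = 100.
rng = np.random.default_rng(0)
N = 100
errors_km = rng.normal(0.0, 1.0, size=(20000, N))  # 20000 trials of N fixes
avg_err_km = errors_km.mean(axis=1).std()          # error of the N-average
# avg_err_km comes out near 1/sqrt(100) = 0.1 km
```

Correlated errors (e.g. a shared phase-correction bias) would of course average down more slowly than this idealized case.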

Page 31

What do all these pretty pictures imply?

•  Effective interplanetary navigation using classical passive optical navigation is possible to the level of quality of the ephemeris knowledge – or even much, much better – as demonstrated by Deep Space 1 (which also did it automatically onboard).
•  Proximity operations around a planetary body (of high or low mass) can also be accomplished solely via passive optical navigation, as demonstrated by Deep Impact, and via extrapolation of ground-based OpNav experience from Voyager (1979) to Rosetta (2014).
•  Optical Nav vis-à-vis XNav (aka X-Ray Nav, or Pulsar Nav):
   –  A camera is almost always necessary to perform target-relative navigation in deep space (if you’re going somewhere).
   –  In addition, an optical camera will always be smaller and cheaper than an X-ray telescope (especially one that must make picosecond signal-timing correlations).
   –  Cameras almost always have dual use as Nav and Science instruments.
   –  Projects will always strive to save money, and combining functions is always a good way to save money: combining Science and Nav is good, with the possibility of combining Com as well.
•  Summary: Optical Nav at a target is necessary and sufficient for the vast majority of mission operations.

Page 32

Deep-space Positioning System (DPS)

Spacecraft position and control anywhere in the solar system, at any time.™

[Figure: DPS hardware callouts]
•  32 cm
•  Coelostat/periscope mirror
•  Foldable sun-shade
•  ONC telescope body
•  25° MER NavCam (on ONC secondary reflector)
•  Radiator
•  Periscope azimuth drive
•  MER-derived camera electronics (ONC chassis)
•  Periscope elevation drive
•  High-capability, low-power/mass CubeSat-derived processor, hosting Deep Impact heritage AutoNav
•  X-band patch antenna

X-band patch antenna on back of mirror

Page 33

TOUCAN: Terminal for Optical Unified Communications And Navigation

Basis of design: the MRO OpNav Camera


MRO OpNav Camera (ONC) undergoing tests

Page 34

DPS and TOUCAN

•  DPS/TOUCAN combine Nav with a small OpCom terminal (and possibly even a small radio receiver); it would be a powerful self-contained GN&C and Com device.

•  Such a system would provide low-cost multi-mission support for a wide range of deep-space missions.

DPS integrated h/w and s/w system; 7 kg, 15 W, with a plug-in RS422 simple s/w interface (The Nav equivalent of a Star Tracker)

Spacecraft position and control (and com) anywhere in the solar system, at any time.™

Laser connection via fiber optic connection

Laser communications “slice” inserts into the ONC 1.4° telescope

Page 35

Lessons Learned and the Big Challenges


Page 36

Important Lessons Learned and the Principal Challenge for Autonomous Nav

•  80-90% of what you worry about in developing an automated onboard GN&C system is not the algorithms, or their coding or performance.
•  Most of what you do is…
   –  Work on the method of implementing the autonomy and automation, and the means of managing it onboard
   –  Worry about the performance of, and anomalies in, your instrument
   –  Worry about the performance of, and anomalies in, the attitude control system
   –  Make sure data is properly conditioned (edited/filtered)
   –  Make sure appropriate actions happen, on time, and in the right order
   –  And mostly: deal with what happens, and what should happen, if something goes wrong. Because it will.
•  Conclusion: the principal challenging technology for Automated/Autonomous Navigation is the intelligent robotic control system that will implement the autonomy and automation.


Page 37

Flight “Software” Architecture View (System-wide Hierarchy; implementation of AutoNav)


RTOS Layer (e.g., VxWorks, RTEMS, Xenomai, RTLinux), file system, prioritization, etc.

Flight Service Layer (thruster control (RCS), device handlers, u/l, d/l, UIO, messaging, command dispatch, low-level fault trapping)

Command and Sequence Layer (shared memory, inter-app signaling and capture, conditional commanding) (VML engines)

Application Layer (expert services, ECLS, power, GN&C, science, heavy compute services), e.g., AutoNav

Manager Layer (expert managers, monitors, mid-level FDIR, light compute services), e.g., attitude profiling

Executive and Autonomy Layer (Flight Director, Deputy, and subsystem and instrument directors, high-level FDIR [state machines])

Master Sequence Layer – selection of Flight Directors (e.g., Deploy, Cruise, RPODU, EDL)

[Figure annotations: implementation language per architectural layer – C or machine code for some layers, VML for others – along an axis of increasing code volume, density, opacity, and risk.]

Page 38

The AutoNav Flight Director State Machine (Evolved from Phoenix EDL)

•  Creation of a VML-state-machine Flight Director (FD) allows encapsulation of GN&C behavior to specific states.

•  The FD state machine works in parallel and cooperation with GN&C Manager state-machines to enact GN&C activities in a coordinated fashion.

•  Utilization of VML-based GN&C Exec allows for a concise but very powerful instantiation of Fault Detection, Interception and Recovery (FDIR) processes in VML-based monitors, as was done on Phoenix.
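The encapsulation of behavior into states can be sketched as a table-driven state machine (a minimal Python illustration only; the real Flight Director is written in VML, and the state and event names here are hypothetical):

```python
# Minimal table-driven Flight Director sketch. Monitors raise events;
# the FD maps (state, event) -> next state; entry into a state would
# configure the cooperating GN&C manager state machines accordingly.
class FlightDirector:
    TRANSITIONS = {
        ("CRUISE", "approach_detected"): "APPROACH",
        ("APPROACH", "tag_committed"): "DESCENT",
        ("DESCENT", "touchdown"): "DEPARTURE",
        # FDIR behavior: faults divert to a state-appropriate response
        ("CRUISE", "fault"): "SAFE",
        ("APPROACH", "fault"): "SAFE",
        ("DESCENT", "fault"): "ABORT_ASCENT",
    }

    def __init__(self, state="CRUISE"):
        self.state = state

    def handle(self, event):
        # Unrecognized (state, event) pairs leave the state unchanged
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state

fd = FlightDirector()
fd.handle("approach_detected")   # -> "APPROACH"
fd.handle("fault")               # -> "SAFE"
```

The table form is what makes the FDIR behavior auditable: every fault response is an explicit entry, not a code path.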


Page 39

Some AutoNav Managers

• The VML AutoNav Flight Director (FD), Managers, Monitors (and Fault Detection, Interception and Recovery – FDIR) are the means of control interface between the spacecraft and AutoNav.

• The FD, Managers and Monitors define the behavior of AutoNav, and provide the means of configuring, managing and warranting that behavior.

• These VML constructs are (in a manner of speaking) sequenced blocks that are uploadable at any time, and require no FSW uploads or upgrades.

• Some VML blocks contain global variables that tune the system behavior of the managers, such as a time or altitude for an event, or a limit of a position or attitude bound.

• Despite the fact that blocks can be upgraded quickly and readily, they can and should be established (and left unchanged) well before launch (the Phoenix EDL story).


Page 40

AutoNav Monitor and Fault Timeline for TAG


Page 41

Conclusions

•  Classical optical navigation (passive observations of solar system bodies against an inertial reference – stars) has been used for over 40 years

•  Substantial advances in the techniques, including automated landmark tracking, and onboard automated navigation, have been developed that make the technique even more efficacious

•  Self-contained OpNav systems can be created that integrate OpNav and OpCom into a single h/w and s/w subsystem to save missions mass, cost and complexity.

Page 42

Thank You

I hope this information has had an impact.