Augmented Reality for Robot Development and Experimentation

Authors: Mike Stilman, Philipp Michel, Joel Chestnutt, Koichi Nishiwaki, Satoshi Kagami, James J. Kuffner

Jorge Dávila Chacón


Page 1: Augmented Reality for Robot Development and Experimentation

Augmented Reality for Robot Development and Experimentation

Authors: Mike Stilman, Philipp Michel, Joel Chestnutt, Koichi Nishiwaki, Satoshi Kagami, James J. Kuffner

Jorge Dávila Chacón

Page 2: Augmented Reality for Robot Development and Experimentation

Introduction & Related Work
Overview
Ground Truth Modelling
Evaluation of Sensing
Evaluation of Planning
Evaluation of Control
Discussion

Page 3: Augmented Reality for Robot Development and Experimentation

Introduction

Virtual simulation: Find critical system flaws or software errors.

Testing the various interconnected components for perception, planning, and control becomes increasingly difficult: the vision system must build a model of the environment, the navigation planner may produce an erroneous path, and the controller must properly follow the desired trajectory.

Page 4: Augmented Reality for Robot Development and Experimentation

Objective: To present a ground truth model of the world and to introduce virtual objects into real world experiments.

Establish a correspondence between virtual components (Environment, models, plans, intended robot actions) and the real world.

Visualize and identify system errors prior to their occurrence.

Page 5: Augmented Reality for Robot Development and Experimentation

Related Work

For humanoid robots: Simulation engines model dynamics and test the controllers, kinematics, geometry, higher level planning and vision components.

Khatib: Haptic interaction with the virtual environment. Purely virtual simulations are limited to approximating the real world (rigid-body dynamics and perfect vision).

Page 6: Augmented Reality for Robot Development and Experimentation

Hardware in-the-loop simulation (Aeronautics and space robotics).

Virtual overlays for robot teleoperation: Design and evaluate robot plans.

Speed, robustness and accuracy enhanced by binocular cameras

Hybrid tracking by the use of markers (retroreflective markers, LEDs and/or magnetic trackers).

Page 7: Augmented Reality for Robot Development and Experimentation

Overview

Lab space setting: “Eagle-4” motion analysis system, cameras and furniture objects.

Experiments focus: High-level autonomous tasks for the humanoid robot “HRP-2”: choosing foot locations to avoid obstacles and manipulating obstacles to free its path.

Page 8: Augmented Reality for Robot Development and Experimentation

Technical details

“Eagle-4” system: Eight cameras covering 5 × 5 × 2 m. Distances calculated to 0.3% accuracy. Dual Xeon 3.6 GHz processor. “EVa Real-Time” (EVaRT) software: locates 3D markers at a max. rate of 480 Hz at 1280 × 1024 resolution (min. 60 markers at 60 Hz).

Page 9: Augmented Reality for Robot Development and Experimentation

Virtual chair is overlaid in real-time. Both the chair and the camera are in motion.

Page 10: Augmented Reality for Robot Development and Experimentation

Ground Truth Modeling: Reconstructing Position and Orientation

Individually identified markers attached to an object can be expressed as a set of points {a1, ..., an} in the object’s coordinate frame “F” (the object template).

The displaced marker location “bi” is found from the translation vector “t”, the rotation matrix “R” and the centroid of the markers.

\begin{bmatrix} b_i \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0\;0\;0 & 1 \end{bmatrix} \begin{bmatrix} a_i \\ 1 \end{bmatrix}
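The pose recovery above is the standard least-squares rigid-body fit about the marker centroids; a minimal NumPy sketch (the function name is mine, and the paper does not spell out this exact SVD formulation):

```python
import numpy as np

def fit_rigid_transform(A, B):
    """Recover R and t such that B ≈ A @ R.T + t, i.e. b_i = R a_i + t.

    A, B: (n, 3) arrays of template markers {a_i} and measured markers {b_i}.
    Uses the SVD-based (Kabsch) solution about the marker centroids.
    """
    a_bar, b_bar = A.mean(axis=0), B.mean(axis=0)    # marker centroids
    H = (A - a_bar).T @ (B - b_bar)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = b_bar - R @ a_bar
    return R, t
```

Occluded markers can be handled by simply dropping their rows from A and B before the fit.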

Page 11: Augmented Reality for Robot Development and Experimentation

Markers occluded from motion capture: the algorithm is performed only on the visible markers; their corresponding rows in the matrix must be removed.

The new centroids are the centroids of the visible markers and of their associated template markers.
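In code, this occlusion handling amounts to row masking followed by recentering on the visible centroids; a tiny illustrative sketch (names are mine):

```python
import numpy as np

def visible_rows(template, measured, visible):
    """Drop rows for occluded markers and recenter on the visible centroids.

    template: (n, 3) marker template {a_i}; measured: (n, 3) captured markers;
    visible: boolean mask flagging the markers the capture system still sees.
    """
    A, B = template[visible], measured[visible]
    return A - A.mean(axis=0), B - B.mean(axis=0)
```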

Page 12: Augmented Reality for Robot Development and Experimentation

Reconstructing Geometry and Virtual Cameras

3D triangular surface meshes form environment objects (Manually edited for holes and automatically simplified to reduce the number of vertices).

The position of the robot’s camera, found from ground-truth positioning information, together with the calculation of its optical axis, provides the “Virtual view”.

Page 13: Augmented Reality for Robot Development and Experimentation
Page 14: Augmented Reality for Robot Development and Experimentation

Evaluation of Sensing

Ground-truth positioning information localizes sensors (cameras, range finders).

Build reliable global environment representations (occupancy grids or height maps) for robot navigation planning.

Overlaying them onto projections of the real world lets us evaluate the sensing algorithms used to construct world models.

Page 15: Augmented Reality for Robot Development and Experimentation

Reconstruction by Image Warping

Tracking the camera’s position, using motion capture to recover the projection matrix, enables a 2D homography between the floor and the image plane.

To build a 2D occupancy grid of the environment for biped navigation, we assume that all scene points of interest lie in the z = 0 plane.
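Under the z = 0 assumption, the homography falls directly out of the projection matrix by dropping the column that multiplies z; a sketch, assuming a 3×4 projection matrix P recovered from the motion-capture calibration (function names are mine):

```python
import numpy as np

def floor_homography(P):
    """Homography mapping floor points (x, y, 1) on the z = 0 plane to pixels.

    P: 3x4 camera projection matrix. For z = 0, the third column of P
    contributes nothing, so the remaining 3x3 block is a homography.
    """
    return P[:, [0, 1, 3]]

def pixel_to_floor(H, u, v):
    """Back-project an image pixel onto the floor plane (inverse homography)."""
    x, y, w = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return x / w, y / w
```

Warping every pixel of the camera image through the inverse homography yields the synthesized ground-plane view used for the 2D occupancy grid.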

Page 16: Augmented Reality for Robot Development and Experimentation

Reconstruction from Range Data

The “CSEM Swiss Ranger SR-2” time-of-flight (TOF) range sensor is used to build 2.5D height maps of the environment objects.

Motion-capture based localization lets us convert range measurements into clouds of 3D points in world coordinates in real-time.

Environment height maps can be cumulatively constructed.
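The two steps above can be sketched as follows, assuming the sensor pose (R, t) comes from motion-capture localization; the cell size and the max-height-per-cell rule are my choices for illustration, not the paper's:

```python
import numpy as np

def range_to_world(points_sensor, R_ws, t_ws):
    """Transform a cloud of 3D points from sensor to world coordinates.

    points_sensor: (n, 3) points measured by the range sensor.
    R_ws, t_ws: sensor pose in the world, from motion-capture localization.
    """
    return points_sensor @ R_ws.T + t_ws

def update_height_map(hmap, points_world, cell=0.02, origin=(0.0, 0.0)):
    """Accumulate world points into a 2.5D height map (max height per cell)."""
    ij = np.floor((points_world[:, :2] - np.asarray(origin)) / cell).astype(int)
    for (i, j), z in zip(ij, points_world[:, 2]):
        if 0 <= i < hmap.shape[0] and 0 <= j < hmap.shape[1]:
            hmap[i, j] = max(hmap[i, j], z)
    return hmap
```

Calling `update_height_map` on each new scan is what makes the construction cumulative.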

Page 17: Augmented Reality for Robot Development and Experimentation

Point cloud views of reconstructed box.

Example “box” scene. Raw sensor measurement.

Page 18: Augmented Reality for Robot Development and Experimentation

Registration with Ground Truth

Reconstructing the environment by image warping or from range data allows us to visually evaluate the accuracy of our perception algorithms.

Make parameter adjustments on the fly by overlaying the generated environment maps back onto a camera view of the scene.

Page 19: Augmented Reality for Robot Development and Experimentation

Evaluation Of Planning

Video overlay displays diagnostic information about the planning and control process in physically relevant locations.

The robot plans a safe sequence of actions to convey itself from its current configuration to a goal location. The goal location and obstacles were moved while the robot was walking, requiring constant updates of the plan.

Page 20: Augmented Reality for Robot Development and Experimentation

Example camera image. Synthesized ground plane view. Corresponding environment map.

Page 21: Augmented Reality for Robot Development and Experimentation

Planning algorithm evaluates candidate footstep locations through a cluttered environment.

• Motion capture obstacle recognition.
• Localized sensors.
• Self-contained vision.

Motion capture data is removed completely and the robot uses its own odometry to build maps of the environment.

Page 22: Augmented Reality for Robot Development and Experimentation

Visual Projection: Footstep Plans

For each step, it computes the 3D position and orientation of the foot.

Augmented reality: planned footsteps are overlaid in real-time onto the environment (continuously updated while walking).

This display exposes the planning process, letting us identify errors and gain insight into the performance of the algorithm.
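Overlaying a planned footstep amounts to generating its outline on the floor and projecting the corners into the camera image; a sketch under assumed conventions (the foot dimensions and function names are illustrative, not HRP-2's actual sole geometry):

```python
import numpy as np

def footstep_outline(x, y, yaw, length=0.24, width=0.13):
    """Corners of a planned foot placement on the floor (z = 0), world frame.

    length/width are illustrative foot dimensions, not HRP-2's real sole size.
    """
    c, s = np.cos(yaw), np.sin(yaw)
    local = np.array([[ length/2,  width/2], [ length/2, -width/2],
                      [-length/2, -width/2], [-length/2,  width/2]])
    xy = local @ np.array([[c, s], [-s, c]]) + np.array([x, y])  # rotate, shift
    return np.column_stack([xy, np.zeros(4)])

def project_points(P, pts_world):
    """Project 3D world points into pixel coordinates with a 3x4 matrix P."""
    homog = np.hstack([pts_world, np.ones((len(pts_world), 1))]) @ P.T
    return homog[:, :2] / homog[:, 2:3]
```

Drawing the projected quadrilaterals on each camera frame gives the continuously updated overlay.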

Page 23: Augmented Reality for Robot Development and Experimentation

Occupancy grid generated from the robot’s camera.

Planar projection of an obstacle recovered from range data.

Page 24: Augmented Reality for Robot Development and Experimentation

Temporal Projection: Virtual Robot

The real world is preferred to a completely simulated environment for experimentation: the AVATAR proposal.

Instead of replacing all sensing with perfect ground truth data, we can simulate degrees of realistic sensors.

Page 25: Augmented Reality for Robot Development and Experimentation

Objects And The Robot’s Perception

Slowly increase the realism of the data which the system must handle.

By knowing the locations and positions of all objects as well as the robot’s sensors, we can determine which objects are detectable by the robot at any given point in time.
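One crude way to decide detectability from ground-truth poses is a view-cone and range test; a sketch with parameters of my choosing (occlusion by other objects is ignored here):

```python
import numpy as np

def is_detectable(obj_pos, cam_pos, cam_forward, fov_deg=60.0, max_range=4.0):
    """Visibility test: object center inside the sensor's view cone and range.

    All poses come from ground-truth motion-capture data; fov_deg and
    max_range are assumed sensor parameters, not values from the paper.
    """
    v = obj_pos - cam_pos
    dist = np.linalg.norm(v)
    if dist == 0 or dist > max_range:
        return False
    cos_angle = np.dot(v / dist, cam_forward / np.linalg.norm(cam_forward))
    return cos_angle >= np.cos(np.radians(fov_deg / 2))
```

Feeding only the detectable objects to the robot's world model is one way to "slowly increase the realism" of its sensing.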

Page 26: Augmented Reality for Robot Development and Experimentation

Footstep plan displayed onto the world.

Augmented reality with a simulated robot amongst real obstacles.

Page 27: Augmented Reality for Robot Development and Experimentation

Evaluation Of Control

Objective: To maximize the safety of the robot and the environment.

To accomplish this, we perform hardware-in-the-loop simulations while gradually introducing real components, increasing the “complexity of the plant”.

Page 28: Augmented Reality for Robot Development and Experimentation

Virtual Objects

Simulation: Analyze the interaction of the robot with a virtual object via a geometric and dynamic model of the object.

In case of a failure, we observe and detect virtual collisions without affecting the robot hardware.

Similarly, these concepts can be applied towards grasping and manipulation.
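Detecting a virtual collision can be as simple as a sphere-versus-box proximity test between a robot part and the virtual object's geometric model; a standard sketch (the shapes chosen are mine, not the paper's collision model):

```python
import numpy as np

def sphere_box_collision(center, radius, box_min, box_max):
    """Collision between a sphere (e.g. robot hand) and a virtual box.

    Clamps the sphere center to the box to find the closest point, then
    compares that distance against the radius.
    """
    closest = np.clip(center, box_min, box_max)
    return bool(np.linalg.norm(center - closest) <= radius)
```

Raising a flag when this returns True lets a failure be observed without any contact with real hardware.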

Page 29: Augmented Reality for Robot Development and Experimentation

Precise Localization

Goal: To perform force control on an object during physical interaction.

Alternatives: fixing the initial conditions of the robot and environment, or asking the robot to sense and acquire a world model prior to every experiment.

Page 30: Augmented Reality for Robot Development and Experimentation

The hybrid experimental model avoids the rigidity of the former approach and the overhead time required for the latter.

Virtual optical sensor: Efforts can be focused on algorithms for making contact with the object and evaluating the higher frequency feedback required for force control.

Page 31: Augmented Reality for Robot Development and Experimentation

Gantry Control

The physical presence of the gantry and its operator prevents testing fine manipulation, and navigation in cluttered environments that require close proximity to objects.

To bypass this problem, a ceiling-suspended gantry was implemented that can follow the robot throughout the experimental space.

Page 32: Augmented Reality for Robot Development and Experimentation

Discussion

The paradigm leverages advances in optical motion capture speed and accuracy to enable simultaneous online testing of complex robotic system components.

It promotes rapid development and validation testing of each of the perception, planning and control components.

Page 33: Augmented Reality for Robot Development and Experimentation

Future Work

Automated methods for environment modeling (an object with markers could be inserted into the environment and immediately modeled for the application).

Automatic sensor calibration in the context of a ground truth world model.

Enhanced visualizations by fusing local sensing (gyroscopes and force sensors) into the virtual environment.

Page 34: Augmented Reality for Robot Development and Experimentation

?