Tracking for Scene Augmentation & Visualization. Ulrich Neumann, Computer Science Department, Integrated Media Systems Center, University of Southern California. July 2000


  • Tracking for Scene Augmentation & Visualization
    Ulrich Neumann
    Computer Science Department, Integrated Media Systems Center, University of Southern California
    July 2000

  • Research Goals
    Basic science and engineering needed for wide-area, unencumbered, real-time person tracking
    Why person tracking?
    Position/orientation of people in the field makes them smart sensors
    Augment the observed scene with 3D information gleaned from distributed sources
    Enable a shared command/control visualization of distributed spatial information/sensing sources
    Spatial relationships are critical in sensing, display, and data fusion

  • Person/Head Tracking vs. Object or Vehicle Tracking
    Tracking objects from fixed sensors (e.g., vehicles)
    Objects emit signatures such as sound, light, or force that are detected (illumination is possible)
    Person/head tracking uses body-worn, moving sensors
    Passive sensing of environmental signatures/measures that are difficult to model, sense, or measure
    Vehicle tracking (ground, air, water)
    Similar sensors and fusion ideas (e.g., EKF)
    Inverse component motion rates: translation vs. rotation
    Lack of velocity measures (e.g., wheel rotations, flow)
    Man-portable systems impose severe weight/power constraints

  • Current Outdoor Person Tracking
    S. Feiner @ Columbia - MARS project
    L. Rosenblum @ ONR/NRL: GPS/compass (gyro w/USC)
    E. Foxlin @ Intersense Corp.: compass/gyro (IS-300)
    R. Azuma @ HRL: compass/gyro (vision w/USC)
    U. Neumann, S. You @ USC: gyro/vision, panoramic
    Land Warrior - Army/Point Research Corp.: GPS/compass/accelerometer (body-only)

  • Technical Approach/Strategy
    Estimate real-time 6DOF tracking by fusion of multiple sensor data streams, each possessing variable uncertainty (a minimal fusion sketch follows below)
    Fusion strategies: EKF, data-driven models
    GPS for 3DOF position (intermittent)
    Inertial (MEMS) gyros and accelerometers
    Vision: planar and panoramic projections
    Compass, pedometer, laser range finder, human-aided, ...
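    As a rough illustration of the strategy above, here is a minimal sketch (in Python, with assumed models, rates, and noise values; not the project's actual parameters) of Kalman-style fusion, the linear special case of the EKF: accelerometer input is integrated at a high rate, and intermittent GPS position fixes correct the accumulated drift.

    import numpy as np

    class FusionKF:
        def __init__(self, dt):
            self.dt = dt
            self.x = np.zeros(6)                   # state: position (3) + velocity (3)
            self.P = np.eye(6)                     # state covariance
            self.F = np.eye(6)                     # constant-velocity transition model
            self.F[:3, 3:] = dt * np.eye(3)
            self.Q = 0.01 * np.eye(6)              # process noise (illustrative)
            self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # GPS observes position only
            self.R = 4.0 * np.eye(3)               # GPS noise, ~2 m std (illustrative)

        def predict(self, accel):
            """High-rate step driven by (bias-corrected) accelerometer input."""
            self.x = self.F @ self.x
            self.x[3:] += accel * self.dt          # accelerometer acts as a control input
            self.P = self.F @ self.P @ self.F.T + self.Q

        def correct_gps(self, gps_pos):
            """Low-rate, intermittent correction from a GPS position fix."""
            y = gps_pos - self.H @ self.x          # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
            self.x += K @ y
            self.P = (np.eye(6) - K @ self.H) @ self.P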

  • Proposed Research and Development
    Sensor fusion algorithms
    3DOF orientation from gyro/compass/vision
    3DOF position from GPS/accel/vision/LRF
    Real-time portable prototype
    Outdoor performance/annotation tests and metrics
    Command/control visualization testbed
    Oriented images
    Precise target localization (man-sighted, UAVs)

  • Data Fusion Methods
    Extended Kalman Filters
    Explicit closed-loop models
    Fuzzy models
    Implicit data-driven models of noise, sensor bias, and sensor correlations that are hard to model explicitly
    E.g., device-to-device variations, application usage variations (crawling vs. running), user-to-user variations
    Hybrid combinations of the above (see the bias-learning sketch below)
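    To make the hybrid idea concrete, the following sketch (hypothetical, not from this work) pairs an explicit filter with a data-driven running estimate of a sensor bias, learned from the filter's own innovations; this is the kind of device-to-device or usage-dependent variation that is hard to model explicitly.

    class LearnedBias:
        """Data-driven bias estimate: persistent one-sided innovations
        indicate an unmodeled sensor bias, so absorb them slowly enough
        that real motion is not mistaken for bias."""

        def __init__(self, alpha=0.01):
            self.alpha = alpha        # learning rate (illustrative)
            self.bias = 0.0

        def update(self, innovation):
            self.bias = (1 - self.alpha) * self.bias + self.alpha * innovation

        def correct(self, raw_measurement):
            # Apply before feeding the measurement to the explicit filter.
            return raw_measurement - self.bias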

  • Explicit Closed-Loop Models for Inertial/Vision Data Fusion

  • USC Research Status
    Autocalibration
    Calibrate 3D positions of features (sparse modeling)
    Extend tracking and stabilize pose
    Panoramic imaging
    Track from reference images
    Visualize/monitor a 360° scene
    Gyro/vision fusion
    Stable orientation (3DOF)
    Track through high-speed motions and blur

  • Autocalibration
    Estimate camera pose
    Detect & calibrate new features
    Features (Fc), gyro, accelerometer & other sensors (s)
    Convergence in an iterative Extended Kalman Filter framework supports autocalibration and multiple-sensor fusion (a state-augmentation sketch follows below):

    P  = K(Fc, S, Fn)     pose from calibrated features, integrated sensors, and new features
    S  = ∫ s(t) dt        integrated sensor measurements
    Fn = K(P, fn)         new features calibrated from pose and their observations
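    A minimal sketch of the state augmentation implied by Fn = K(P, fn): the filter state holds the camera pose plus each calibrated feature's 3D position, and a newly detected feature is appended with a rough triangulated guess and large initial uncertainty, converging as it is re-observed. Names and values are illustrative.

    import numpy as np

    class AutocalState:
        def __init__(self):
            self.x = np.zeros(6)      # camera pose: 3 translation + 3 small-angle rotation
            self.P = np.eye(6)        # pose covariance

        def add_feature(self, xyz_guess, sigma=10.0):
            """Append a new feature Fn to the state with large prior uncertainty."""
            self.x = np.concatenate([self.x, xyz_guess])
            n = self.P.shape[0]
            P_new = np.zeros((n + 3, n + 3))
            P_new[:n, :n] = self.P                 # keep existing covariance
            P_new[n:, n:] = sigma**2 * np.eye(3)   # uncertain until re-observed
            self.P = P_new
            return (n - 6) // 3                    # index of the new feature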

  • Autocalibration Demonstration

  • Motion Estimation/Tracking with Panoramic Images
    Panoramic images are more robust to partial occlusions than planar images
    Adapt the iterative EKF for 5DOF motion estimates
    Motion direction and rotation between images
    Good results for small motions (R and T)
    Accuracy similar to the popular 8-point method (sketched below)
    Least-squares solution with more points
    The EKF framework has advantages:
    Sensor fusion framework
    No minimum number of features
    Flexibility in rejecting uncorrelated noise
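    For comparison, a compact sketch of the standard 8-point method mentioned above (textbook formulation, not this project's code), estimating the essential matrix from unit-ray correspondences by SVD:

    import numpy as np

    def eight_point_essential(p1, p2):
        """Estimate E from N >= 8 ray correspondences (N x 3 arrays),
        using the epipolar constraint p2^T E p1 = 0."""
        A = np.stack([np.kron(q, p) for p, q in zip(p1, p2)])  # one row per match
        _, _, Vt = np.linalg.svd(A)
        E = Vt[-1].reshape(3, 3)          # null-space solution, row-major
        # Enforce the essential-matrix property: two equal singular values, one zero.
        U, s, Vt = np.linalg.svd(E)
        sigma = (s[0] + s[1]) / 2.0
        return U @ np.diag([sigma, sigma, 0.0]) @ Vt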

  • Large Motion Estimates
    Large R and T cause motion estimate errors
    Large R or T is desirable for high SNR
    Errors arise in separating R from T
    Recursive Rotation Factorization (RRF)
    Builds on the iEKF framework for small motions
    Exploits the property that feature motions are identical for a given R motion

    Estimate R:           I2 = [T][R] I1
    Factor R from image:  I2' = [R⁻¹] I2 = [T] I1
    Estimate T from:      I2' = [T] I1
    Iterate until R and T converge (see the loop sketch below)
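    A minimal sketch of the RRF loop, with estimate_R and estimate_T standing in for the small-motion iEKF estimators (hypothetical signatures; feature rays are row vectors):

    import numpy as np

    def rrf(I1_rays, I2_rays, estimate_R, estimate_T, iters=10):
        """Alternate: estimate rotation, factor it out of image 2's rays,
        estimate translation direction; repeat until both converge."""
        R = np.eye(3)
        for _ in range(iters):
            dR = estimate_R(I1_rays, I2_rays)   # residual rotation I1 -> I2
            R = dR @ R                          # accumulate total rotation
            I2_rays = I2_rays @ dR              # row form of applying dR^-1 to each ray
            T = estimate_T(I1_rays, I2_rays)    # pure-translation direction estimate
        return R, T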

  • RRF Large-Motion EstimationRRF motion estimation with various noise levels (1 m displacement and 10-90 degree rotation about the up-axis).

    Left and right charts show translation and rotation error, respectively.

    Noise levels are 0.3, 1.5, and 3.0 degrees, from top to bottom.

  • Panoramic 6DOF Tracking
    6DOF tracking of a moving camera (red) is obtained, without requiring any calibrated features in the scene, from multiple 5DOF motion estimates relative to reference images (blue); a triangulation sketch follows below
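    One way to realize this, sketched under simplifying assumptions: each 5DOF estimate yields a scale-free translation direction from a known reference position, and the camera position is triangulated as the point of closest approach of the two rays (rotation comes directly from the 5DOF estimates).

    import numpy as np

    def triangulate_position(ref_a, dir_a, ref_b, dir_b):
        """Position from two reference points and unit translation directions."""
        # Least-squares scalars (ta, tb) so that ref_a + ta*dir_a ≈ ref_b + tb*dir_b.
        A = np.stack([dir_a, -dir_b], axis=1)          # 3 x 2 system
        b = ref_b - ref_a
        (ta, tb), *_ = np.linalg.lstsq(A, b, rcond=None)
        p1 = ref_a + ta * dir_a                        # closest point on ray A
        p2 = ref_b + tb * dir_b                        # closest point on ray B
        return (p1 + p2) / 2.0                         # midpoint of closest approach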

  • 6DOF Tracking SimulationRRF motion estimation is computed and integrated over a sequence of images. The left graph shows the absolute angular error in translation direction and the right graph shows the absolute rotation error.

    The lower figure shows the simulated motion path of the camera. Points A and B are two reference positions. The camera starts at A and moves along the path.

  • Panoramic Images/Video for Visualization
    Panoramic image from conventional video: video processed in the field to produce a panoramic image
    Highly compressed scene capture with no redundancy

    Panoramic video camera (3200x480 @ 24 fps)
    Transmit a 360° field of view from a remote site
    Video motion detection in all directions
    Real-time view for desktop or HMD (a warping sketch follows below)
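    A sketch of how one conventional frame might be warped onto a cylindrical panorama strip, assuming a pinhole camera with known focal length and a yaw estimate from orientation tracking; the vertical scale, field-of-view limit, and nearest-neighbor sampling are all illustrative simplifications.

    import numpy as np

    def warp_to_cylinder(frame, f, strip, yaw):
        """Paint one frame into a cylindrical panorama strip (H x W x 3)."""
        h, w = frame.shape[:2]
        pan_h, pan_w = strip.shape[:2]
        for col in range(pan_w):
            theta = 2 * np.pi * col / pan_w - yaw          # column -> world azimuth
            theta = (theta + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)
            if abs(theta) > np.pi / 4:
                continue                                   # outside this frame's view (assumed FOV)
            x = f * np.tan(theta) + w / 2                  # project azimuth into the frame
            if not (0 <= x < w):
                continue
            for row in range(pan_h):
                v = (row - pan_h / 2) / pan_h              # cylinder height (illustrative scale)
                y = f * v / np.cos(theta) + h / 2
                if 0 <= y < h:
                    strip[row, col] = frame[int(y), int(x)]
        return strip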

  • Panorama from Video

  • Gyro/Vision Orientation Tracking
    Gyro angular-rate sensors: drift and bias (1 kHz)
    Video angular-rate sensing: tracking loss and drift (30 Hz)
    Compass orientation sensor: jitter and magnetic field distortion (100 Hz)

  • Orientation Tracking Test
    Predict 2D feature motion from gyro data (see the sketch below)
    Refine gyro data by vision feature tracking
    Stabilizes gyro drift and makes 2D feature tracking more robust
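    A sketch of the gyro prediction step using first-order rotational image flow (pure rotation needs no scene depth); axis and sign conventions are assumptions that depend on the camera model.

    import numpy as np

    def predict_feature(uv, omega, dt, f):
        """Predict a feature's next image position from gyro rates, to seed
        the vision tracker's search window. uv: pixel offset from the
        principal point; omega: (wx, wy, wz) body rates in rad/s; f: focal
        length in pixels."""
        u, v = uv
        wx, wy, wz = omega * dt                   # small inter-frame rotation angles
        # Standard rotational flow, first order (sign conventions vary):
        du = wx * u * v / f - wy * (f + u * u / f) + wz * v
        dv = wx * (f + v * v / f) - wy * u * v / f - wz * u
        return np.array([u + du, v + dv])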

  • Orientation-Tracking Demonstration

  • Gyro/Vision Fusion Examples
    [Figure: timeline of the video tracking process (ms)]

  • Cooperation within MURI Team
    Algorithms for sensor fusion and uncertainty management
    Portable prototype and testbed for visualization demonstrations and outdoor tracking tests
    Shared visualizations of spatial annotations and panoramic imagery
    Tracking and modeling are strongly related: scene modeling aids tracking, and vice versa