Optical Tracking for VR
Bertus Labuschagne, Christopher Parker, Russell Joffe


Page 1

Optical Tracking for VR

Bertus Labuschagne

Christopher Parker

Russell Joffe

Page 2

Introduction

Page 3

Project Motivation

Inexpensive

Variable-light conditions

Use low-resolution devices

Did we mention inexpensive?

Page 4

Project Breakdown

Layer 1: Low-level image processing (Russell)

Layer 2: Motion prediction & model generation (Bertus)

Layer 3: Movement processing (Christopher & Bertus)

Layer 4: Virtual environment (Christopher)

Page 5

Layer 1

Low-level image processing

Page 6

Overview

Camera
– Distortion example
– Calibration

“Outside-in” model

Marker-based tracking
– Thresholding
– Sub-pixel accuracy
– Search space reduction

Page 7

Camera

Fundamental constraint of project: low cost

Camera choice: Logitech webcam (< R150)

Camera may be prone to distortion, so we need to calibrate

Page 8

Camera: Distortion Example

VRVis Zentrum für Virtual Reality und Visualisierung Forschungs-GmbH

http://www.vrvis.at/2d3d/technology/cameracalibration/cameracalibration.html

Page 9

Camera: Calibration

WHY?
– Important for calculating accurate metric data

HOW?
– Camera calibration toolkit
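As a sketch of what a calibration toolkit estimates, the standard radial lens-distortion model can be written in a few lines; the coefficients and the sample point below are invented for illustration, not measured from our camera:

```python
def radial_distort(x, y, k1, k2):
    """Standard radial distortion model estimated by calibration toolkits:
    (x, y) is a point in normalised image coordinates relative to the
    optical centre; k1 and k2 are the radial distortion coefficients."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# Barrel distortion (negative k1) pulls points toward the image centre:
xd, yd = radial_distort(0.5, 0.5, -0.2, 0.0)
print(f"{xd:.3f}, {yd:.3f}")  # -> 0.450, 0.450
```

Calibration recovers k1 and k2 (plus the intrinsics); undistortion then inverts this mapping so that measured marker positions yield accurate metric data.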

Page 10

“Outside-in” model

Markers are placed on the user
Cameras are fixed in position

Inside-out model: Cameras placed on users

Page 11

Marker-based tracking

Tasks:
– Find position of markers in environment
– Match corresponding markers from cameras
– Extract marker centres

Page 12

Marker-based tracking: Thresholding (1/4)

PURPOSE: Find regions in which markers are most likely to be

METHOD: Partition the image into background and foreground based on intensity threshold.

Problems?
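The partitioning step can be sketched in a few lines; the frame contents and threshold value below are invented for illustration:

```python
import numpy as np

def threshold(gray, t):
    """Partition a grayscale image into foreground (likely marker
    regions) and background based on an intensity threshold."""
    return gray > t

# Synthetic 6x6 frame: one bright 2x2 marker blob plus a dim noise pixel.
frame = np.zeros((6, 6), dtype=np.uint8)
frame[1:3, 1:3] = 220   # bright marker
frame[5, 5] = 90        # background noise

mask = threshold(frame, 128)
print(int(mask.sum()))  # -> 4: only the marker pixels pass
```

Lowering the threshold to, say, 80 would also admit the noise pixel, which is exactly the trade-off the next three slides illustrate.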

Page 13

Marker-based tracking: Thresholding (2/4)

Threshold too high

Localisation of only one marker

Page 14

Marker-based tracking: Thresholding (3/4)

Threshold too low

Localisation of all markers

Extra background noise in foreground

Page 15

Marker-based tracking: Thresholding (4/4)

Threshold just about right

Localisation of all three markers

Minor noise in image

Page 16

Marker-based tracking: Sub-pixel accuracy

After thresholding, a large blob remains

We would like to find the centre of the light source

Naïve method: take the brightest pixel in the region; accurate only to one pixel

Binary centroid: take the average position of all points in the region above the threshold

Weighted centroid: Treat positions of intensities above threshold as a mask and weight the points according to their original intensities
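The binary and weighted centroids can be sketched as follows; the blob intensities below are invented to show how intensity weighting shifts the estimate toward the brighter pixel:

```python
import numpy as np

def binary_centroid(gray, t):
    """Unweighted mean of all above-threshold pixel positions."""
    ys, xs = np.nonzero(gray > t)
    return float(xs.mean()), float(ys.mean())

def weighted_centroid(gray, t):
    """Treat above-threshold positions as a mask and weight each
    position by its original intensity, for sub-pixel accuracy."""
    ys, xs = np.nonzero(gray > t)
    w = gray[ys, xs].astype(float)
    return float((xs * w).sum() / w.sum()), float((ys * w).sum() / w.sum())

# Hypothetical two-pixel blob: the true light-source centre is biased
# toward the brighter pixel at x = 2.
blob = np.zeros((5, 5), dtype=np.uint8)
blob[2, 2] = 250
blob[2, 3] = 150

print(binary_centroid(blob, 100))    # -> (2.5, 2.0)
print(weighted_centroid(blob, 100))  # -> (2.375, 2.0)
```

The weighted estimate lands between pixel positions, which is the sub-pixel accuracy the naïve brightest-pixel method cannot provide.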

Page 17


Marker-based tracking: Search space reduction

Likely 3D position

Page 18

Layer 2

Motion Prediction & Model Generation

Page 19

Overview

Tracking the current location and rotation of the user

Reducing latency in the system by using motion prediction

Ensuring the prediction coincides with the actual motion

Passing the information on to the environment

Page 20

User Tracking

Common problems with user tracking
– Latency: end-to-end delay from capturing data to updating the screen
– Efficiency: of the tracking algorithm
– Accuracy: of detecting changes in position and rotation

Page 21

Motion Prediction I

Motivation
– Reduce the effects of latency
– Allow smooth transitions between frames

Different inputs
– For 2D input devices
– For 3D input devices

Types of algorithms
– Polynomial predictor
– Kalman-based predictor
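A minimal sketch of the polynomial family: fit a quadratic through the last three equally spaced position samples and extrapolate one frame ahead. The sample trajectory below is invented; for genuinely quadratic motion the prediction is exact, while a Kalman-based predictor would additionally model measurement noise:

```python
def predict_next(p0, p1, p2):
    """Second-order polynomial predictor: the quadratic through three
    equally spaced samples, evaluated one step ahead, reduces to
    p_next = 3*p2 - 3*p1 + p0."""
    return 3.0 * p2 - 3.0 * p1 + p0

# A marker accelerating along one axis, sampled at t = 0, 1, 2 with
# position t^2; the true position at t = 3 is 9.
print(predict_next(0.0, 1.0, 4.0))  # -> 9.0 (exact for quadratic motion)
```

Applied per coordinate, this lets the renderer draw the predicted pose instead of the last measured one, hiding some of the capture-to-display latency.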

Page 22

Motion Prediction II

Existing vs. new algorithms
– Existing algorithms might not be suited to our problem and may require modifications
– May require a new algorithm

Testing the efficiency and accuracy of implemented algorithms

Page 23

Layer 3

Movement Processing

Page 24

Layer 4

Virtual Environment

Page 25

Overview

Movement data mapped to VE screen updates

Tracker vs. standard input (keyboard & mouse)

Hypothesis:
– "An optical tracking system works better for navigating through a virtual environment than conventional means"

Page 26

Performance goals

High Accuracy

Low Latency

Speed + Usability

Page 27

2D / 3D Environments

OpenGL
– 2D (non-walking): Pacman-type game
– 3D (with walking): landscape / game (undecided)

CAVEAT

Page 28

Layer 4

User Testing

Page 29

User testing techniques

Questionnaires
– Hypothesis test

Continuous assessment
– Performance statistics

Interviews

Ethnographic Observation

Postural Response

Page 30

Conclusion

Page 31

Conclusions

Project consists of four sections

One section each
– Layer 3 joins Layer 2 and Layer 4

Final Outcome

Lastly, a look at our deliverables

Page 32

Questions?

Page 33

Deliverables

Page 34

Deliverables

20th June 2006: Obtain cameras
30th June 2006: Get images from cameras
20th September: LED system built
20th September: Test centroid-finding algorithms
20th September: Test images for algorithms captured
22nd September: System design complete
25th September: VE design / user test design complete
27th September: 1st implementation of stand-alone algorithms on images
2nd October: 2nd test of algorithms
6th October: All modules completed
10th October: 1st system integrated and running
13th October: Preliminary tests
16th October: Design for 2nd version