
Object Inter-Camera Tracking with non-overlapping views: A new dynamic approach

Trevor Montcalm, Bubaker Boufama

Layout of Today's Presentation

Basics of Object Tracking, bottom-to-top overview
Single Camera and Inter-Camera Tracking
Features used for Object Tracking
Camera Linking
Emphasizing Factors
Experimental Results

What is Object Tracking?

The task of tracking objects as they move within an area under video surveillance
o Objects could be people, cars, anything of interest

How is this accomplished?

Other significant works

Mohammed Ahsan Ali
o Feature-based tracking

Andrew Gilbert
o Matrix-based color transfer functions between cameras

Y. Cai, J. Kang
o Advanced shape and color descriptors used to match objects

Background Subtraction

Subtracts a background model from the current frame to classify each pixel as foreground or background

Foreground pixels are of interest: they correspond to objects in the scene

The Adaptive Gaussian Mixture Model background subtraction algorithm was used
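As a rough illustration (not the authors' exact code), OpenCV's MOG2 subtractor is one common implementation of an adaptive Gaussian mixture background model; the video filename and parameter values below are placeholders:

import cv2

# Adaptive Gaussian Mixture Model background subtraction (OpenCV's MOG2 variant).
# history/varThreshold are illustrative defaults, not values from the presentation.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=True)

capture = cv2.VideoCapture("camera1.avi")  # hypothetical input video
while True:
    ok, frame = capture.read()
    if not ok:
        break
    # apply() updates the per-pixel mixture models and returns a mask:
    # 255 = foreground, 0 = background (127 marks shadows when enabled).
    fg_mask = subtractor.apply(frame)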


Blob Formation

A blob is a group of foreground pixels that might be a real-life object

Blob formation decides which pixel groups are blobs and which are noise

Three steps:
o Smooth the background-subtracted image
o Use Connected Component Analysis to discover groups
o Blob size thresholding, merging blobs if close enough
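A minimal sketch of these three steps using OpenCV; the kernel size, minimum blob size, and the omitted merge step are assumptions, not values from the presentation:

import cv2
import numpy as np

def form_blobs(fg_mask, min_size=150):
    # Step 1: smooth the background-subtracted mask (drop shadow pixels,
    # then morphological open/close to remove speckle noise and fill holes).
    _, mask = cv2.threshold(fg_mask, 200, 255, cv2.THRESH_BINARY)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Step 2: Connected Component Analysis groups the foreground pixels.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)

    # Step 3: size thresholding; groups below min_size are treated as noise.
    blobs = []
    for i in range(1, n):  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_size:
            blobs.append({
                "bbox": (stats[i, cv2.CC_STAT_LEFT], stats[i, cv2.CC_STAT_TOP],
                         stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]),
                "centroid": tuple(centroids[i]),
                "size": int(stats[i, cv2.CC_STAT_AREA]),
            })
    return blobs  # merging of nearby blobs is left out of this sketch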


Single camera object tracking

Matches blobs to the set of known objects in the scene
o Done for each frame of video

Matching is accomplished by comparing the feature vector of each blob and object
o A feature vector is a collection of features
o Each feature describes a property of the object or blob

Occlusions are handled with a Kalman filter
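The presentation only states that a Kalman filter handles occlusions; a constant-velocity filter over the blob centroid, as below, is one plausible sketch (the state model and noise values are assumptions):

import cv2
import numpy as np

# State = (x, y, vx, vy), measurement = centroid (x, y); constant-velocity model.
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

def track_step(measured_centroid=None):
    predicted = kf.predict()                  # predicted position for this frame
    if measured_centroid is not None:         # blob visible: correct the filter
        kf.correct(np.array(measured_centroid, dtype=np.float32).reshape(2, 1))
    # During an occlusion (no measurement) the prediction alone carries the track.
    return predicted[:2].ravel()              # (x, y) estimate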

Inter-Camera Tracking

The specific task of object tracking across camera views that are non-overlapping

Each camera has a separate field of view

Features Used for Object Tracking

Basic Features:
o Location – The current centroid of an object
o Velocity – The object's 2D velocity (pixels/sec)
o Width – Object width
o Height – Object height
o Size – Object size (number of foreground pixels)

Features Used for Object Tracking

Advanced Features:
o Histogram – Color histogram of the object
o Shape – 49 Zernike Moments

All feature values are normalized to facilitate comparison between different cameras
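A sketch of extracting these features for one blob with OpenCV/NumPy; the histogram bin counts are assumptions, and the 49 Zernike moments are left out (a library such as mahotas can compute them):

import cv2
import numpy as np

def extract_features(frame, blob_mask, bbox, prev_centroid=None, fps=30.0):
    # blob_mask: 8-bit mask with the blob's foreground pixels set to 255.
    x, y, w, h = bbox
    ys, xs = np.nonzero(blob_mask)
    centroid = (float(xs.mean()), float(ys.mean()))        # Location
    size = int(xs.size)                                     # Size (# foreground pixels)
    if prev_centroid is not None:                           # Velocity in pixels/sec
        velocity = ((centroid[0] - prev_centroid[0]) * fps,
                    (centroid[1] - prev_centroid[1]) * fps)
    else:
        velocity = (0.0, 0.0)
    # Color histogram over the blob's pixels only (8x8x8 bins assumed).
    hist = cv2.calcHist([frame], [0, 1, 2], blob_mask, [8, 8, 8],
                        [0, 256, 0, 256, 0, 256])
    hist = cv2.normalize(hist, None).flatten()
    return {"location": centroid, "velocity": velocity, "width": w,
            "height": h, "size": size, "histogram": hist}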

Comparing Feature Vectors

Single camera object tracking:
o Differences of all features are averaged for a final difference

Inter-camera object tracking:
o Individual features are emphasized or de-emphasized, depending on circumstances
o This is the new dynamic approach mentioned in the title
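A minimal sketch of both cases, assuming every feature has already been normalized to a comparable scale; the weighted form stands in for the dynamic emphasis described above:

import numpy as np

def feature_difference(vec_a, vec_b, weights=None):
    # Per-feature absolute differences between two normalized feature vectors.
    diffs = np.abs(np.asarray(vec_a, float) - np.asarray(vec_b, float))
    if weights is None:
        return float(diffs.mean())             # single camera: plain average
    weights = np.asarray(weights, float)
    return float(np.dot(weights, diffs) / weights.sum())   # inter-camera: weighted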

Emphasizing Factors

Time: Emphasize more recent appearances
Camera Link Quality: Use previous matching information to the system's advantage
Stability: Emphasize more stable features over unstable ones

Camera Link Quality

Between each pair of cameras is a camera link

Stores a Camera Transfer Function, which translates a feature vector from one camera to another

The idea is to use previous matching history to translate features
o Exploit redundancy in object movement patterns
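The presentation does not give the exact form of the Camera Transfer Function; one plausible sketch is a per-feature linear mapping fitted from previously matched appearance pairs, as below (the fitting method is an assumption):

import numpy as np

class CameraLink:
    """Link between two cameras holding a per-feature transfer function,
    learned from the feature vectors of previously matched objects."""

    def __init__(self, n_features):
        self.slope = np.ones(n_features)       # identity mapping until data arrives
        self.intercept = np.zeros(n_features)
        self.history = []                      # (source_vector, target_vector) pairs

    def add_match(self, src_vec, dst_vec):
        self.history.append((np.asarray(src_vec, float), np.asarray(dst_vec, float)))
        if len(self.history) < 2:
            return
        src = np.stack([p[0] for p in self.history])
        dst = np.stack([p[1] for p in self.history])
        for j in range(src.shape[1]):          # least-squares line per feature
            self.slope[j], self.intercept[j] = np.polyfit(src[:, j], dst[:, j], 1)

    def translate(self, src_vec):
        # Translate a feature vector from the source camera to the target camera.
        return self.slope * np.asarray(src_vec, float) + self.intercept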

Camera Link Quality Example

Building the Matching Feature Vector

An aggregate feature vector used to represent the object in matching
o Aggregation of many appearances

Time: More recent appearances are used
Camera Link Quality: Reliably translated features are emphasized
Stability: More stable features are emphasized

Building the Matching Feature Vector

Each feature vector is translated to a target camera

Using recentness, translation quality, and feature quality, a single matching feature vector is built
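A sketch of the aggregation, assuming an exponential recentness weight and a CameraLink like the earlier sketch; the exact weighting used by the authors is not stated here, and feature stability is handled separately by the per-feature weights in the comparison step:

import numpy as np

def build_matching_vector(appearances, link, now, time_scale=30.0):
    """appearances: list of (timestamp, feature_vector) seen in the source camera.
    Returns one aggregate vector, translated into the target camera's space."""
    translated, weights = [], []
    for t, vec in appearances:
        translated.append(link.translate(vec))           # camera transfer function
        weights.append(np.exp(-(now - t) / time_scale))   # recent appearances count more
    translated = np.stack(translated)
    weights = np.asarray(weights)[:, None]
    return (weights * translated).sum(axis=0) / weights.sum()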

Dynamic Weighting

Describes how to weight each feature in a feature vector comparison, similar to the matching feature vector

Emphasizes robust features when camera link quality is low

After matching data is built up, more general features are weighted in
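A minimal sketch, assuming link quality is summarized as a value in [0, 1] and the blend between robust-only and general weights is linear (both assumptions):

import numpy as np

def dynamic_weights(robust_weights, general_weights, link_quality):
    # link_quality: 0 for a brand-new camera link, 1 once plenty of matching
    # data has been built up. Robust features dominate early; general features
    # are weighted in as the link matures.
    q = float(np.clip(link_quality, 0.0, 1.0))
    return (1.0 - q) * np.asarray(robust_weights, float) \
        + q * np.asarray(general_weights, float)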

Object tracking decision

The best object/blob match is chosen and compared against a threshold

Single camera tracking: Preset threshold
Inter-camera tracking: Dynamic threshold
o At first, a low threshold (0.65)
o After matching data is built up, a more stringent threshold (0.95)
o Change in threshold is linear
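The linear change between the two thresholds could look like the sketch below, assuming the match score is a similarity (higher = better) and link maturity is measured by a hypothetical count of confirmed matches:

def dynamic_threshold(n_matches, n_mature=50, low=0.65, high=0.95):
    # Start lenient (0.65) and rise linearly to the stringent threshold (0.95)
    # as matching data is built up; n_mature is an assumed parameter.
    progress = min(n_matches / float(n_mature), 1.0)
    return low + progress * (high - low)

# e.g. dynamic_threshold(0) -> 0.65, dynamic_threshold(25) -> 0.80, dynamic_threshold(50) -> 0.95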

Experimental Results

Two cameras used: a Sony Cyber-shot DSC-S930 and a Kodak EasyShare C180
o Low-resolution, off-the-shelf cameras with differing color sensitivity

Surveillance videos filmed in two locations:
o A large building hallway
o A domestic house


References

A. Gilbert and R. Bowden. Incremental, scalable tracking of objects inter camera. Computer Vision and Image Understanding, 111(1):43–58, 2008. Special Issue on Intelligent Visual Surveillance (IEEE).

M. Ali. Feature-based tracking of multiple people for intelligent video surveillance. In Masters Abstracts International, volume 45, 2006.

J. Kang, I. Cohen, and G. Medioni. Persistent objects tracking across multiple non overlapping cameras. In Proceedings of the IEEE Workshop on Motion and Video Computing (WACV/MOTION'05), volume 2, pages 112–119, 2005.

Y. Cai, K. Huang, and T. Tan. Human appearance matching across multiple non-overlapping cameras. In 19th International Conference on Pattern Recognition (ICPR 2008), pages 1–4, 2008.