
Approximate Initialization of Camera Sensor Networks


Page 1: Approximate Initialization of  Camera Sensor Networks

Approximate Initialization of Camera Sensor Networks

Purushottam Kulkarni
K.R. School of Information Technology
Indian Institute of Technology, Bombay

Deepak Ganesan, Prashant Shenoy
Department of Computer Science
University of Massachusetts, Amherst

Page 2: Approximate Initialization of  Camera Sensor Networks


Camera Sensor Networks

Wireless network of tetherless imaging sensors
◊ Directional camera sensors

Applications
◊ Ad-hoc surveillance
◊ Environmental and habitat monitoring

Tasks
◊ Object detection, recognition, tracking

[Figure: a camera's field-of-view]

Page 3: Approximate Initialization of  Camera Sensor Networks


Camera Initialization

Prerequisite for application tasks
◊ Localization requires camera coordinates
◊ Duty-cycling requires the set/overlap of neighbors
◊ Tracking requires the location of overlap with neighbors

Initialization parameters:
◊ Extrinsic: location, orientation

◊ Intrinsic: focal length, skew, principal point

◊ Set of neighbors

◊ Degree of overlap

Page 4: Approximate Initialization of  Camera Sensor Networks


Factors Affecting Initialization

Computation Capability

Infrastructure Support
◊ Range estimation
◊ Landmarks

[Figure: Cricket mote performing range estimation using a sync message and a ranging pulse]

Camera Sensor Networks
◊ Landmarks hard to find
◊ Resource constraints

Estimation of accurate parameters not possible

Page 5: Approximate Initialization of  Camera Sensor Networks


Problem Statement

Given a CSN with
◊ Limited computation capability
◊ No/minimal infrastructure support

is it possible to initialize cameras to enable applications?

Proposed solution: Approximate Initialization
◊ Estimate relative relationships between cameras
◊ Use only the picture-taking capability and local processing of the cameras

Page 6: Approximate Initialization of  Camera Sensor Networks


Outline

Introduction & Problem Statement

Approximate Initialization Parameters

Estimation Techniques

Experimental Evaluation

Page 7: Approximate Initialization of  Camera Sensor Networks


Approximate Initialization

Degree of Overlap
◊ Fraction of viewing region that overlaps with neighboring cameras
◊ k-overlap: fraction of viewing region covered by k cameras

Approximates level of sensing redundancy with neighboring cameras

Page 8: Approximate Initialization of  Camera Sensor Networks


Approximate Initialization

Region of Overlap
◊ Spatial volume within the viewing region that overlaps with another camera
◊ Degree of overlap does not estimate which portion overlaps with neighbors

Approximates location of neighbors and spatial region of overlap

Approximate estimates can support application requirements

Page 9: Approximate Initialization of  Camera Sensor Networks


Duty-Cycling

Operate in ON-OFF cycles
d_i: duty-cycling parameter (ON fraction) of camera i
O_i^k: k-overlap of camera i

Parameter set in proportion to the degree of overlap (extent of redundant coverage):

d_i = \sum_{k=1}^{n} \frac{O_i^k}{k}
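A minimal sketch of this duty-cycling rule, assuming the k-overlap fractions O_i^k have already been estimated and are passed in as a dict; the function name and input format are illustrative, not part of the slides. It matches the intuition that a region watched by k cameras needs each of them only about 1/k of the time.

```python
def duty_cycle(k_overlaps):
    """Compute a camera's duty-cycle parameter d_i from its k-overlap
    fractions, following d_i = sum_k (O_i^k / k).

    k_overlaps: dict mapping k (number of cameras covering a region,
    including this one) to the fraction O_i^k of this camera's viewing
    region covered by k cameras.  The fractions should sum to 1.
    """
    return sum(o_k / k for k, o_k in k_overlaps.items())

# Hypothetical example: 50% of the region seen only by this camera,
# 25% shared with one neighbor, 25% shared with two neighbors.
print(duty_cycle({1: 0.5, 2: 0.25, 3: 0.25}))  # 0.5 + 0.125 + 0.0833... ~ 0.71
```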

Page 10: Approximate Initialization of  Camera Sensor Networks


Triggered Wakeup

Wakeup scenarios
◊ Object tracking
◊ Reliable detection

Region of overlap can determine potential cameras

[Figure: an object inside the region of overlap of cameras C1, C2, and C3]

Page 11: Approximate Initialization of  Camera Sensor Networks


Estimating k-overlap

k-overlap estimate: fraction of randomly placed reference objects viewed simultaneously by k cameras

◊ Cameras take pictures of each reference object
◊ Determine if the object can be viewed simultaneously by other cameras

[Figure: reference points scattered across the overlapping viewing regions of Camera 1, Camera 2, and Camera 3]

O_i^k = \frac{r_i^k}{r_i}

r_i: number of reference points viewed at camera i
r_i^k: number of reference points viewed simultaneously by k cameras
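A minimal sketch of this unweighted k-overlap estimate; the representation of visibility (one set of camera ids per reference point) is an assumption made for illustration.

```python
from collections import Counter

def k_overlap(camera_id, visibility):
    """Estimate O_i^k for one camera from reference-point visibility.

    visibility: list of sets, one per reference point, each holding the
    ids of the cameras that simultaneously view that point.
    Returns a dict mapping k to O_i^k = r_i^k / r_i.
    """
    seen = [v for v in visibility if camera_id in v]   # points viewed at camera i
    r_i = len(seen)
    counts = Counter(len(v) for v in seen)             # r_i^k for each k
    return {k: r_ik / r_i for k, r_ik in counts.items()} if r_i else {}

# Hypothetical example with three cameras and four reference points:
vis = [{1}, {1, 2}, {1, 2, 3}, {2, 3}]
print(k_overlap(1, vis))   # O_1^1, O_1^2, O_1^3 each equal 1/3
```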

Page 12: Approximate Initialization of  Camera Sensor Networks


Skewed Distributions

Fraction of points does not represent fraction of overlap
◊ Points in sparse regions actually represent a larger region
◊ Error in estimation due to non-uniform distribution

[Figure: three overlapping cameras with a non-uniform placement of reference points]

Estimated: O_1^1 = 2/3, O_1^2 = 1/9, O_1^3 = 2/9
Exact: O_1^1 = 1/2, O_1^2 = 1/4, O_1^3 = 1/4

Page 13: Approximate Initialization of  Camera Sensor Networks


Handling Skewed Distributions

Assign the area of each Voronoi polygon as the weight of the corresponding reference point
◊ Weight is inversely proportional to the local density of reference points

O_i^k = \frac{w_i^k}{w_i}

w_i: total weight of reference points viewed at camera i
w_i^k: total weight of reference points viewed by k cameras
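The weighted estimate changes only the counting step: each reference point contributes its weight instead of 1. A sketch under the same assumed visibility representation as the unweighted version, with a parallel list of per-point weights.

```python
from collections import defaultdict

def weighted_k_overlap(camera_id, visibility, weights):
    """Estimate O_i^k = w_i^k / w_i using per-point weights
    (e.g. the approximate Voronoi cell volumes from the next slide)."""
    w_i = 0.0                     # total weight of points viewed at camera i
    w_ik = defaultdict(float)     # total weight of those points viewed by k cameras
    for cams, w in zip(visibility, weights):
        if camera_id in cams:
            w_i += w
            w_ik[len(cams)] += w
    return {k: wk / w_i for k, wk in w_ik.items()} if w_i else {}
```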

Page 14: Approximate Initialization of  Camera Sensor Networks


Approximate 3D Voronoi Tessellation

Accurate 3D tessellation
◊ Compute intensive

Approximation
◊ Discretize the volume into cubes
◊ Calculate the closest reference point for each cube
◊ Add the cube's volume to the closest point's weight
◊ Points in sparse regions will have higher weights
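A minimal sketch of this cube-based approximation, assuming reference point locations are known in a cubic region; the bounds and cell size are illustrative parameters.

```python
import numpy as np

def approximate_voronoi_weights(points, bounds=150.0, cell=5.0):
    """Approximate each reference point's Voronoi cell volume by
    discretizing the cubic region [0, bounds]^3 into cells of side
    `cell` and assigning each cell's volume to its closest point."""
    points = np.asarray(points, dtype=float)           # (N, 3) reference points
    weights = np.zeros(len(points))
    centers = np.arange(cell / 2, bounds, cell)        # cell-center coordinates
    cell_volume = cell ** 3
    for x in centers:
        for y in centers:
            for z in centers:
                d2 = np.sum((points - (x, y, z)) ** 2, axis=1)
                weights[np.argmin(d2)] += cell_volume  # add volume to closest point
    return weights
```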

Page 15: Approximate Initialization of  Camera Sensor Networks


Determining Region of Overlap

Where does the overlap between cameras exist?

Region of overlap: the union of grid cells containing the simultaneously visible reference points

[Figure: grid cells marking the region of overlap between cameras C1 and C2]
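A minimal sketch of marking the region of overlap as the union of grid cells holding reference points visible to both cameras; the grid indexing scheme is an assumption.

```python
def region_of_overlap(cam_a, cam_b, points, visibility, cell=5.0):
    """Return the set of grid-cell indices forming the region of overlap
    between cameras cam_a and cam_b: the union of cells that contain a
    reference point simultaneously visible to both."""
    cells = set()
    for p, cams in zip(points, visibility):
        if cam_a in cams and cam_b in cams:
            cells.add(tuple(int(c // cell) for c in p))  # cell containing point p
    return cells
```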

Page 16: Approximate Initialization of  Camera Sensor Networks


Estimating Reference Point Location

◊ Estimate d_r using the object size s, the image size s', and the focal length f
◊ \vec{PO} and \vec{v_r} have the same orientation
◊ Use the unit vector along \vec{PO} and d_r to estimate the location

[Figure: pinhole model with the lens at optical center O, the image plane at distance f, the image point P(-x,-y,-f), and the reference point at unknown location v_r at range d_r along the ray R]

\tan\theta = \frac{s'}{f}, \qquad d_r = \frac{s}{\tan\theta}

\vec{v_r} = d_r \cdot \frac{\vec{PO}}{|\vec{PO}|}
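A minimal sketch of this location estimate under the pinhole model above; it assumes the actual object size s, the image size s', and the focal length f are known, and all names are illustrative.

```python
import numpy as np

def reference_point_location(image_point, s, s_img, f):
    """Estimate the 3D location of a reference point in the camera frame.

    image_point: (x, y) image-plane coordinates of the object's center,
                 so P = (-x, -y, -f) with the optical center O at the origin.
    s: actual object size, s_img: size on the image, f: focal length.
    """
    d_r = s * f / s_img                      # tan(theta) = s_img / f, d_r = s / tan(theta)
    x, y = image_point
    po = np.array([x, y, f], dtype=float)    # vector from P(-x, -y, -f) to O = (0, 0, 0)
    return d_r * po / np.linalg.norm(po)     # v_r = d_r * unit vector along PO
```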

Page 17: Approximate Initialization of  Camera Sensor Networks


Outline

Introduction & Problem Statement

Approximate Initialization Parameters

Estimation Techniques

Experimental Evaluation

Page 18: Approximate Initialization of  Camera Sensor Networks


Experimental Evaluation

Simulation
◊ 150 x 150 x 150 region
◊ Two scenarios: 4 cameras and 12 cameras
◊ Non-uniform distribution: a fraction of objects restricted to a smaller area

Page 19: Approximate Initialization of  Camera Sensor Networks


Experimental Evaluation

Implementation
◊ 8 Cyclops camera sensors
◊ Crossbow Micaz nodes
◊ 8ft x 6ft x 17ft region

[Figure: implementation architecture. The Cyclops runs the image grabber, object detection, and bounding box modules; the host mote triggers the Cyclops, collects view information into a view table, and runs the initialization procedure]

Page 20: Approximate Initialization of  Camera Sensor Networks


Weighted Approximation

Demonstrates the shortcoming of the non-weighted scheme
◊ Performs 4-6 times worse than the weighted scheme

Page 21: Approximate Initialization of  Camera Sensor Networks


Effect of Skew

Weighted scheme corrects for skew better
◊ Non-weighted scheme is worse by a factor of 6

Page 22: Approximate Initialization of  Camera Sensor Networks


Region of overlap

Error decreases with the number of reference points
◊ ~22% with 12 pts/camera
◊ ~10% with 37 pts/camera

Error ~10% in region of overlap estimation

Page 23: Approximate Initialization of  Camera Sensor Networks


Applications

Duty-cycling
◊ Weighted scheme outperforms non-weighted

Triggered wakeup
◊ 80% positive wakeups with 10 pts/camera and 2 triggers

[Figures: duty-cycling and triggered-wakeup results]

Page 24: Approximate Initialization of  Camera Sensor Networks


Implementation Results

k-overlap estimation error: 2-9%

Region of overlap error: 1-11%

Approximate techniques feasible in real deployments (~10% error)

Page 25: Approximate Initialization of  Camera Sensor Networks


Related Work

Camera calibration
◊ Accurate extrinsic and intrinsic parameters [Tsai 86], [Tsai 87], [Zhang 00]

Multimedia Sensor Networks
◊ Panoptes: a vision sensor [Feng 03]

◊ Audio sensors [Raykar 03]

Localization
◊ Sensor localization [He 03], [Savvides 01], [Whitehouse 02]

◊ Active Badge [Harter 94], RADAR [Bahl 00], Cricket [Priyantha 00], Active Bat [Ward 97], GPS

◊ Relative Locationing [Rao 03]

Page 26: Approximate Initialization of  Camera Sensor Networks


Conclusions

Proposed approximate techniques to estimate associations between cameras
◊ Degree and region of overlap

Demonstrated use of the estimates to enable applications
◊ Errors in the estimates are tolerable

http://sensors.cs.umass.edu

Page 27: Approximate Initialization of  Camera Sensor Networks


Technology Trends

Sensors and platforms span a large spectrum
Enables heterogeneous camera networks

[Figure: image sensors (Cyclops, CMUcam, Webcam, PTZ) and sensor platforms (Mote, Telos, XYZ, Stargate) arranged along functionality vs. energy axes]

Page 28: Approximate Initialization of  Camera Sensor Networks


Approximate Initialization

Degree of overlap
◊ Extent of overlapping coverage
◊ k-overlap: fraction of viewing area covered by k cameras

Region of overlap
◊ Where is the overlapping coverage?
◊ Spatial region of overlap with neighboring cameras

Above estimates can support application requirements

Page 29: Approximate Initialization of  Camera Sensor Networks


Triggered Wakeup

Wakeup scenarios
◊ Object tracking
◊ Reliable detection

Determine the best camera (see the sketch below)
◊ Projection line: the object lies along this line
◊ Reference points within a distance threshold of the line
◊ Extent of overlap determines the best camera

[Figure: projection line from the image through the object, with reference points lying within a distance threshold of the line]
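A minimal sketch of the best-camera selection: count, for each candidate camera, the reference points it views that lie within the distance threshold of the projection line, and wake up the camera with the largest count. The geometry helper and data layout are assumptions.

```python
import numpy as np

def point_line_distance(p, origin, direction):
    """Distance from point p to the line through `origin` with unit `direction`."""
    v = np.asarray(p, dtype=float) - origin
    return np.linalg.norm(v - np.dot(v, direction) * direction)

def best_wakeup_camera(origin, direction, points, visibility, threshold):
    """Pick the camera whose visible reference points lie within the
    distance threshold of the projection line most often."""
    direction = np.asarray(direction, float) / np.linalg.norm(direction)
    origin = np.asarray(origin, float)
    counts = {}
    for p, cams in zip(points, visibility):
        if point_line_distance(p, origin, direction) <= threshold:
            for c in cams:
                counts[c] = counts.get(c, 0) + 1
    return max(counts, key=counts.get) if counts else None
```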