
Page 1: Human-computer interface with Kinect

Human-computer interface with Kinect

Institute of Information and Communication Technologies

by Alexander Marinov

Page 2: Human-computer interface with Kinect

My professional work

Page 3: Human-computer interface with Kinect

My scientific work

Page 4: Human-computer interface with Kinect

Motivation

Meet Milo, an on-screen computer character that uses Kinect ("Project Natal") to interact intelligently with humans. Narrated by Peter Molyneux of Lionhead Studios.

Page 5: Human-computer interface with Kinect

Depth cameras

Sensor

• Color and depth-sensing lenses
• Voice microphone array
• Tilt motor for sensor adjustment

Field of view

• Horizontal field of view: 57 degrees
• Vertical field of view: 43 degrees
• Physical tilt range: ±27 degrees
• Depth sensor range: 1.2 m - 3.5 m

Data streams

• 320x240 16-bit depth @ 30 frames/sec
• 640x480 32-bit colour @ 30 frames/sec
• 16-bit audio @ 16 kHz

Page 6: Human-computer interface with Kinect

Depth images

[Figure: a depth image; each pixel position (X, Y) stores a depth value D.]
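Each depth pixel can be turned back into a 3-D point with a pinhole camera model. A minimal sketch; the focal lengths and principal point below are illustrative placeholders, not the Kinect's calibrated intrinsics.

def depth_pixel_to_point(x, y, depth_mm,
                         fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Back-project a depth pixel (x, y, D) to a 3-D point in metres.

    fx, fy, cx, cy are placeholder pinhole intrinsics, not calibrated values.
    """
    z = depth_mm / 1000.0          # depth value D, millimetres -> metres
    return ((x - cx) * z / fx,     # X in camera space
            (y - cy) * z / fy,     # Y in camera space
            z)                     # Z is the measured depth

# Example: pixel at the image centre, 2 m away
print(depth_pixel_to_point(320, 240, 2000))   # -> (0.0, 0.0, 2.0)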

Page 7: Human-computer interface with Kinect

Framework

• Locate people in the scene, ignoring the background

• Locate their limbs and joints, keeping track of which person is which

• Find and track their gestures (see the sketch below)

Demonstration!
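A minimal sketch of these framework steps, assuming a hypothetical read_skeleton_frames() generator that yields, per frame, a mapping {person_id: {joint_name: (x, y, z)}} with the background already removed (roughly what the Kinect skeleton stream provides); the real API calls are not shown.

from collections import defaultdict

TRACKED_JOINTS = ("hand_right", "hand_left")   # joints whose motion we follow

def collect_trajectories(read_skeleton_frames):
    """Accumulate a 3-D trajectory per (person, joint) from skeleton frames."""
    trajectories = defaultdict(list)            # (person_id, joint) -> [(x, y, z), ...]
    for frame in read_skeleton_frames():        # one skeleton frame per tick
        for person_id, joints in frame.items():
            for name in TRACKED_JOINTS:
                if name in joints:
                    trajectories[(person_id, name)].append(joints[name])
    return trajectories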

Page 8: Human-computer interface with Kinect

Problem

• Map the gestures to meaning and commands

• What is a gesture?

• How do we recognize a gesture?

Page 9: Human-computer interface with Kinect

Gestures

• Point-set trajectory of one or more human body parts
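On this definition a gesture is simply an ordered set of 3-D points. A minimal representation, assuming NumPy: one tracked joint's positions as a (T, 3) array, translated so the trajectory starts at the origin and therefore does not depend on where the user stands.

import numpy as np

def as_gesture(points):
    """Turn a list of (x, y, z) joint positions into a gesture trajectory.

    The trajectory is shifted so it starts at the origin, making it
    independent of the person's position in front of the sensor.
    """
    traj = np.asarray(points, dtype=float)   # shape (T, 3)
    return traj - traj[0]                    # translate first sample to origin

wave = as_gesture([(0.10, 1.20, 2.0), (0.15, 1.25, 2.0), (0.20, 1.30, 2.0)])
print(wave.shape)   # (3, 3)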

Page 10: Human-computer interface with Kinect

Gesture recognition

Euclidean Distance: sequences are aligned "one to one".

Dynamic Time Warping: nonlinear alignments are possible.

Gavrila, D. M. & Davis, L. S. (1995). Towards 3-D model-based tracking and recognition of human movement: a multi-view approach. In IEEE IWAFGR.
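For equal-length sequences, the "one to one" alignment is just the Euclidean distance between corresponding samples, which is exactly the restriction DTW removes. A minimal sketch:

import math

def euclidean_distance(q, c):
    """One-to-one alignment: q[i] is compared only with c[i]."""
    if len(q) != len(c):
        raise ValueError("Euclidean distance needs equal-length sequences")
    return math.sqrt(sum((qi - ci) ** 2 for qi, ci in zip(q, c)))

print(euclidean_distance([1, 2, 3], [1, 2, 4]))   # 1.0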

Page 11: Human-computer interface with Kinect

How is DTW Calculated?

Given a warping path W = w_1, w_2, ..., w_K between Q and C, the DTW distance is

DTW(Q, C) = min{ √( Σ_{k=1..K} w_k ) }

and is computed with the cumulative-distance recurrence

γ(i, j) = d(q_i, c_j) + min{ γ(i-1, j-1), γ(i-1, j), γ(i, j-1) }

[Figure: the alignment matrix between sequences Q and C, with the warping path w running through it.]
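A minimal Python sketch of this recurrence. The local distance d(q_i, c_j) is taken here as the absolute difference, which matches the cumulative matrices in the two examples that follow; any other point-wise distance could be substituted.

def dtw(q, c, dist=lambda a, b: abs(a - b)):
    """Dynamic Time Warping via the cumulative-distance recurrence
    gamma(i,j) = d(q_i, c_j) + min(gamma(i-1,j-1), gamma(i-1,j), gamma(i,j-1)).
    Returns the cumulative cost of the optimal warping path."""
    n, m = len(q), len(c)
    INF = float("inf")
    gamma = [[INF] * (m + 1) for _ in range(n + 1)]
    gamma[0][0] = 0.0                                   # boundary condition
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(q[i - 1], c[j - 1])
            gamma[i][j] = cost + min(gamma[i - 1][j - 1],   # match
                                     gamma[i - 1][j],       # insertion
                                     gamma[i][j - 1])       # deletion
    return gamma[n][m]

print(dtw([1, 1, 2, 3, 2, 0], [0, 1, 1, 2, 3, 2, 1]))   # 2.0 -- bottom-right cell of Example 1's matrix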

Page 12: Human-computer interface with Kinect

DTW: Example 1

Q = 1 1 2 3 2 0
C = 0 1 1 2 3 2 1

Cumulative distance matrix γ (rows follow Q, columns follow C; boundary: γ(0,0) = 0, all other border cells ∞):

        C:  0  1  1  2  3  2  1
Q = 1       1  1  1  2  4  5  5
Q = 1       2  1  1  2  4  5  5
Q = 2       4  2  2  1  2  2  3
Q = 3       7  4  4  2  1  2  4
Q = 2       9  5  5  2  2  1  2
Q = 0       9  6  6  4  5  3  2

DTW(Q, C) = √(1+1+1+1+1+1+2) / 7 ≈ 0.404 (the summed values are the matrix cells along the warping path, K = 7)

Page 13: Human-computer interface with Kinect

DTW: Example 2

Q = 1 2 3 2 0 1
C = 0 1 1 2 3 2 1

Cumulative distance matrix γ (boundary: γ(0,0) = 0, all other border cells ∞):

        C:  0  1  1  2  3  2  1
Q = 1       1  1  1  2  4  5  5
Q = 2       3  2  2  1  2  2  3
Q = 3       6  4  4  2  1  2  4
Q = 2       8  5  5  2  2  1  2
Q = 0       8  6  6  4  5  3  2
Q = 1       9  6  6  5  6  4  2

DTW(Q, C) = √(1+1+1+1+1+1+2+2) / 8 ≈ 0.395 (the summed values are the matrix cells along the warping path, K = 8)

Page 14: Human-computer interface with Kinect

DTW: global path constraints

Sakoe-Chiba Band Itakura Parallelogram

r defines the allowed range of warping for a given point in the sequence.
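A sketch of how a Sakoe-Chiba band can be imposed on the DTW recurrence: only cells within r columns of the (length-scaled) diagonal are filled in, so warping is limited and fewer cells are computed. Scaling the diagonal for unequal lengths is one common choice of band, not the only one.

def dtw_band(q, c, r, dist=lambda a, b: abs(a - b)):
    """DTW restricted to a Sakoe-Chiba band of half-width r around the diagonal."""
    n, m = len(q), len(c)
    INF = float("inf")
    gamma = [[INF] * (m + 1) for _ in range(n + 1)]
    gamma[0][0] = 0.0
    for i in range(1, n + 1):
        centre = int(round(i * m / n))            # diagonal, scaled to unequal lengths
        for j in range(max(1, centre - r), min(m, centre + r) + 1):
            cost = dist(q[i - 1], c[j - 1])
            gamma[i][j] = cost + min(gamma[i - 1][j - 1],
                                     gamma[i - 1][j],
                                     gamma[i][j - 1])
    return gamma[n][m]

print(dtw_band([1, 1, 2, 3, 2, 0], [0, 1, 1, 2, 3, 2, 1], r=2))   # 2.0 with a width-2 band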

Page 15: Human-computer interface with Kinect

DTW: Lower Bounds optimization

We can speed up similarity search under DTW by using a lower bounding function.

Algorithm Lower_Bounding_Sequential_Scan(Q)

best_so_far = infinity;
for all sequences in database
    LB_dist = lower_bound_distance(Ci, Q);
    if LB_dist < best_so_far
        true_dist = DTW(Ci, Q);
        if true_dist < best_so_far
            best_so_far = true_dist;
            index_of_best_match = i;
        endif
    endif
endfor
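The same scan written out in Python. lower_bound can be any admissible bound, such as the LB_Kim or LB_Yi measures on the following slides, and dtw is the full distance from the earlier sketch; both are passed in as functions, which is an implementation choice of this sketch.

def lower_bounding_scan(query, database, lower_bound, dtw):
    """Find the database sequence closest to `query` under DTW,
    computing the expensive DTW distance only when the cheap lower
    bound cannot rule the candidate out."""
    best_so_far = float("inf")
    index_of_best_match = None
    for i, candidate in enumerate(database):
        lb_dist = lower_bound(candidate, query)
        if lb_dist < best_so_far:                 # candidate may still win
            true_dist = dtw(candidate, query)
            if true_dist < best_so_far:
                best_so_far = true_dist
                index_of_best_match = i
    return index_of_best_match, best_so_far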

Page 16: Human-computer interface with Kinect

DTW: Lower Bound of Kim et al.

The squared difference between the two sequences' first (A), last (D), minimum (B) and maximum (C) points is returned as the lower bound.

Kim, S., Park, S., & Chu, W. (2001). An index-based approach for similarity search supporting time warping in large sequence databases. In ICDE '01, pp. 607-614.
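A sketch of this bound. Following the usual statement of Kim et al.'s measure, the largest of the four feature distances is returned; the exact per-feature distance (squared vs. absolute) varies between presentations, so treat the choice below as an assumption.

def lb_kim(q, c):
    """LB_Kim: compare only four features of each sequence -- the first point (A),
    the minimum (B), the maximum (C) and the last point (D) -- and take the
    largest of the four squared differences as a cheap lower bound for DTW."""
    features = ((q[0],  c[0]),        # A: first points
                (min(q), min(c)),     # B: minima
                (max(q), max(c)),     # C: maxima
                (q[-1], c[-1]))       # D: last points
    return max((a - b) ** 2 for a, b in features)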

Page 17: Human-computer interface with Kinect

DTW: Lower Bound of Yi et al.

Yi, B., Jagadish, H., & Faloutsos, C. (1998). Efficient retrieval of similar time sequences under time warping. In ICDE '98, pp. 23-27.

The gray lines connect each point of the candidate sequence that lies above max(Q) or below min(Q) to that bound. The sum of their squared lengths is the minimum those points can contribute to the overall DTW distance, and can therefore be returned as the lower bounding measure.
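A sketch of this bound under that reading of the figure: points of the candidate sequence that fall outside [min(Q), max(Q)] each contribute at least their squared distance to that envelope.

def lb_yi(q, c):
    """LB_Yi: points of c outside the range [min(q), max(q)] must each
    contribute at least their squared distance to that range, whatever the
    warping path does; summing these gives a lower bound for DTW."""
    lo, hi = min(q), max(q)
    total = 0.0
    for x in c:
        if x > hi:
            total += (x - hi) ** 2    # squared length of the gray line above max(Q)
        elif x < lo:
            total += (lo - x) ** 2    # squared length of the gray line below min(Q)
    return total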

Page 18: Human-computer interface with Kinect

Summary

• We use the Microsoft® Kinect™ and its SDK to obtain gesture trajectories of human body parts

• We apply the Dynamic Time Warping algorithm to find the closest matching gesture in a database

• The command corresponding to the matched gesture is then triggered on the device (a minimal end-to-end sketch follows below)
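A minimal end-to-end sketch of this summary. The gesture templates, the threshold and the commands below are purely illustrative; dtw and the lower-bound function are assumed to be the sketches from the earlier slides, and each gesture is reduced to a one-dimensional feature sequence for brevity.

# Hypothetical gesture database: name -> (template sequence, command to run).
GESTURES = {
    "swipe_right": ([0, 1, 2, 3, 3], lambda: print("next slide")),
    "swipe_left":  ([3, 3, 2, 1, 0], lambda: print("previous slide")),
}

def recognize_and_trigger(observed, dtw, lower_bound, threshold=2.0):
    """Match an observed trajectory against the template database with DTW
    (pruned by the lower bound) and trigger the command of the closest
    template, if it is close enough."""
    best_name, best_dist = None, float("inf")
    for name, (template, _command) in GESTURES.items():
        if lower_bound(template, observed) >= best_dist:
            continue                              # bound says this one cannot win
        d = dtw(template, observed)
        if d < best_dist:
            best_name, best_dist = name, d
    if best_name is not None and best_dist <= threshold:
        GESTURES[best_name][1]()                  # trigger the mapped command
    return best_name, best_dist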

Page 19: Human-computer interface with Kinect

Thank you!