Processing Sequential Sensor Data
John Krumm, Microsoft Research
Redmond, Washington [email protected]
Interpret a Sequential Signal
[Figure: a 1-D signal plotted over time (seconds)]
Signal is
• Often a function of time (as above)
• Often from a sensor
Pervasive/Ubicomp Examples
Signal sources
• Accelerometer
• Light sensor
• Gyro sensor
• Indoor location
• GPS
• Microphone
• …

Interpretations
• Speed
• Mode of transportation
• Location
• Moving vs. not moving
• Proximity to other people
• Emotion
• …
Goals of this Tutorial
• Confidence to add sequential signal processing to your research
• Ability to assess research with simple sequential signal processing
• Know the terminology
• Know the basic techniques
  • How to implement
  • Where they're appropriate
• Assess numerical results in an accepted way
  • At least give the appearance that you know what you're talking about
Not Covering
Regression – fit a function to data
Classification – classify things based on measured features
Statistical Tests – determine if data support a hypothesis
[Figures: example plots for regression, classification, and statistical tests]
Outline
• Introduction (already done!)
• Signal terminology and assumptions
• Running example
• Filtering
  • Mean and median filters
  • Kalman filter
  • Particle filter
  • Hidden Markov model
• Presenting performance results
Signal Dimensionality
1D: z(t)
2D: z(t) = ( z1(t), z2(t) )ᵀ
(bold means vector)
[Figures: a 1-D signal vs. time (seconds), and a 2-D signal plotted as z1 (meters) vs. z2 (meters)]
Sampled Signal
Cannot measure or store a continuous signal, so take samples instead
[ z(0), z(Δ), z(2Δ), … , z((n−1)Δ) ] = [ z1, z2, z3, … , zn ]
Δ = sampling interval, e.g. 1 second, 5 minutes, …
[Figure: a sampled 1-D signal vs. time (seconds), Δ = 0.1 seconds]
Signal + Noise
z_i = x_i + v_i
• z_i = measurement from noisy sensor
• x_i = actual value, but unknown
• v_i = random number representing sensor noise
[Figure: a noisy 1-D signal vs. time (seconds)]
Noise
• Often assumed to be Gaussian
• Often assumed to be zero mean
• Often assumed to be i.i.d. (independent, identically distributed)
• v_i ~ N(0, σ) for zero mean, Gaussian, i.i.d.; σ is the standard deviation
Running Example
[Figure: actual path and measured locations, x (meters) vs. y (meters), with the start and an outlier marked]
z_i = x_i + v_i,   x_i = ( x_i , y_i )ᵀ
Track a moving person in (x, y)
• 1000 (x, y) measurements
• Δ = 1 second
• z_i = measurement vector
• x_i = actual location
• v_i = noise, zero mean, standard deviation = 3 meters
• v_i = ( v_i(x), v_i(y) )ᵀ ~ ( N(0, 3), N(0, 3) )ᵀ
• Also 10 randomly inserted outliers with N(0, 15)
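As a concrete sketch, here is one way the running example's data could be simulated (Python with numpy assumed; the circular path and random seed are illustrative, not the actual path from the figures):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000          # number of (x, y) measurements, Δ = 1 second
sigma = 3.0       # measurement noise standard deviation (meters)
sigma_out = 15.0  # outlier noise standard deviation (meters)

# Hypothetical actual path (any smooth trajectory works here).
t = np.arange(n)
actual = np.column_stack([50 + 40 * np.cos(2 * np.pi * t / n),
                          50 + 40 * np.sin(2 * np.pi * t / n)])

# Noisy measurements: z_i = x_i + v_i, with v_i ~ N(0, sigma) per axis.
measured = actual + rng.normal(0.0, sigma, size=(n, 2))

# Replace 10 random measurements with outliers drawn from N(0, 15).
idx = rng.choice(n, size=10, replace=False)
measured[idx] = actual[idx] + rng.normal(0.0, sigma_out, size=(10, 2))
```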
11
Outline
ā¢ Introductionā¢ Signal terminology and assumptionsā¢ Running exampleā¢ Filtering
ā¢ Mean and median filtersā¢ Kalman filterā¢ Particle filterā¢ Hidden Markov model
ā¢ Presenting performance results
12
Mean Filter
• Also called "moving average" and "box car filter"
• Apply to x and y measurements separately
[Figure: sliding window over z_x vs. t; the filtered version of a point is the mean of the points in the solid box]
• "Causal" filter because it doesn't look into the future
• Causes lag when values change sharply
  • Can help fix with decaying weights
• Sensitive to outliers, i.e. one really bad point can cause the mean to take on any value
• Simple and effective (I will not vote to reject your paper if you use this technique)
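As a minimal sketch, a causal mean filter might look like this in Python (numpy assumed; the 10-point window matches the plot on the next slide):

```python
import numpy as np

def mean_filter(z, window=10):
    """Causal moving average: each output is the mean of the most
    recent `window` samples (fewer at the start of the signal).
    Works per column, so x and y are filtered separately."""
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    for i in range(len(z)):
        out[i] = z[max(0, i - window + 1):i + 1].mean(axis=0)
    return out

# e.g. smoothed = mean_filter(measured, window=10)
```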
Mean Filter
[Figures: actual path and measured locations, and the mean filter result (10 points in each mean), x (meters) vs. y (meters)]
• Outlier has noticeable impact
• If only there were some convenient way to fix this …
Median Filter
[Figure: sliding window over z_x vs. t; the filtered version of a point is the median (not the mean) of the points in the solid box, making it insensitive to the value of any single outlying point]
median(1, 3, 4, 7, 1 × 10¹⁰) = 4
mean(1, 3, 4, 7, 1 × 10¹⁰) ≈ 2 × 10⁹
The median is much less sensitive to outliers than the mean
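A matching sketch for the causal median filter (same assumptions as the mean filter sketch above):

```python
import numpy as np

def median_filter(z, window=10):
    """Causal median filter: each output is the median of the most
    recent `window` samples, so one wild outlier barely moves it."""
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    for i in range(len(z)):
        out[i] = np.median(z[max(0, i - window + 1):i + 1], axis=0)
    return out
```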
Median Filter
[Figure: median filter result (10 points in each median), x (meters) vs. y (meters)]
[Figure: actual path and measured locations, x (meters) vs. y (meters), outlier marked]
Outlier has noticeably less impact
Mean and Median Filter
[Figure: mean and median filter results compared, x (meters) vs. y (meters)]
Editorial: mean vs. median
The median is almost always better to use than the mean.
Outline
• Introduction
• Signal terminology and assumptions
• Running example
• Filtering
  • Mean and median filters
  • Kalman filter
  • Particle filter
  • Hidden Markov model
• Presenting performance results
Kalman Filter
My favorite book on Kalman filtering
• Mean and median filters assume smoothness
• Kalman filter adds an assumption about the trajectory
[Figure: assumed trajectory is parabolic]
[Figure: balance scale weighing data against dynamics]
Weight data against assumptions about the system's dynamics
Big difference #1: Kalman filter includes (helpful) assumptions about the behavior of the measured process
Kalman Filter
Big difference #2: Kalman filter can include state variables that are not measured directly
Kalman filter separates measured variables from state variables
Measure (running example: noisy (x, y) coordinates):
z_i = ( z_i(x), z_i(y) )ᵀ
Infer state (running example: estimate location and velocity!):
x_i = ( x_i , y_i , v_i(x) , v_i(y) )ᵀ
Kalman Filter Measurements
z_i = H_i x_i + v_i
Measurement vector is related to state vector by a matrix multiplication plus noise.
( z_i(x), z_i(y) )ᵀ = [ 1 0 0 0 ; 0 1 0 0 ] ( x_i , y_i , v_i(x) , v_i(y) )ᵀ + N(0, σ_R)
Sleepy eyes threat level: orange
Running example:
z_i(x) = x_i + N(0, σ_R)
z_i(y) = y_i + N(0, σ_R)
• In this case, measurements are just noisy copies of the actual location
• Makes sensor noise explicit, e.g. GPS has σ of around 5 meters
Kalman Filter Dynamics
Insert a bias for how we think the system will change through time:
x_i = Φ_{i−1} x_{i−1} + w_{i−1}
( x_i , y_i , v_i(x) , v_i(y) )ᵀ = [ 1 0 Δt 0 ; 0 1 0 Δt ; 0 0 1 0 ; 0 0 0 1 ] ( x_{i−1} , y_{i−1} , v_{i−1}(x) , v_{i−1}(y) )ᵀ + ( 0 , 0 , N(0, σ_Q) , N(0, σ_Q) )ᵀ
x_i = x_{i−1} + Δt v_{i−1}(x)   (location is standard straight-line motion)
v_i(x) = v_{i−1}(x) + N(0, σ_Q)   (velocity changes randomly, because we don't have any idea what it actually does)
Kalman Filter Ingredients
H matrix (gives measurements for a given state):
H = [ 1 0 0 0 ; 0 1 0 0 ]
Measurement noise (sensor noise): N(0, σ_R)
Φ matrix (gives time dynamics of the state):
Φ = [ 1 0 Δt 0 ; 0 1 0 Δt ; 0 0 1 0 ; 0 0 0 1 ]
Process noise (uncertainty in the dynamics model): N(0, σ_Q)
Kalman Filter Recipe
x̂_k(−) = Φ_{k−1} x̂_{k−1}(+)
P_k(−) = Φ_{k−1} P_{k−1}(+) Φ_{k−1}ᵀ + Q_{k−1}
K_k = P_k(−) H_kᵀ [ H_k P_k(−) H_kᵀ + R_k ]⁻¹
x̂_k(+) = x̂_k(−) + K_k [ z_k − H_k x̂_k(−) ]
P_k(+) = [ I − K_k H_k ] P_k(−)
• Just plug in measurements and go
• Recursive filter: the current time step uses state and error estimates from the previous time step
Sleepy eyes threat level: red
Big difference #3: Kalman filter gives uncertainty estimate in the form of a Gaussian covariance matrix
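A minimal sketch of this recipe for the running example (Python with numpy assumed; the initial covariance and the process noise σ_Q are illustrative tuning choices, not values from the slides):

```python
import numpy as np

def kalman_track(measurements, dt=1.0, sigma_r=3.0, sigma_q=0.5):
    """Minimal Kalman filter for the running example.
    State: (x, y, vx, vy); measured: noisy (x, y).
    sigma_q (process noise) is a tuning knob; 0.5 is illustrative."""
    Phi = np.array([[1, 0, dt, 0],
                    [0, 1, 0, dt],
                    [0, 0, 1, 0],
                    [0, 0, 0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q = np.diag([0.0, 0.0, sigma_q**2, sigma_q**2])  # process noise cov.
    R = np.eye(2) * sigma_r**2                       # measurement noise cov.
    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = np.eye(4) * 100.0                            # broad initial uncertainty
    estimates = []
    for z in measurements:
        # Predict (-): push state and covariance through the dynamics.
        x = Phi @ x
        P = Phi @ P @ Phi.T + Q
        # Update (+): weight the measurement against the prediction.
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.asarray(z, dtype=float) - H @ x)
        P = (np.eye(4) - K @ H) @ P
        estimates.append(x[:2].copy())
    return np.array(estimates)
```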
Kalman Filter
[Figure: Kalman filter result, x (meters) vs. y (meters)]
Velocity model: v_i(x) = v_{i−1}(x) + N(0, σ_Q)
• Smooth
• Tends to overshoot corners
• Too much dependence on the straight-line velocity assumption
• Too little dependence on data
[Figure: data vs. dynamics balance]
Kalman Filter
[Figure: untuned vs. tuned Kalman filter results, x (meters) vs. y (meters)]
• Hard to pick process noise σ's
• Process noise models our uncertainty in the system dynamics
• Here it accounts for the fact that motion is not a straight line
Velocity model: v_i(x) = v_{i−1}(x) + N(0, σ_Q)
"Tuning" σ's (by trying a bunch of values) gives a better result
Kalman Filter
Editorial: Kalman filter
The Kalman filter was fine back in the old days. But I really prefer more modern methods that are not saddled with the Kalman filter's restriction to continuous state variables and its linearity assumptions.
Outline
• Introduction (already done!)
• Signal terminology and assumptions
• Running example
• Filtering
  • Mean and median filters
  • Kalman filter
  • Particle filter
  • Hidden Markov model
• Presenting performance results
Particle Filter
Dieter Fox et al.
WiFi tracking in a multi-floor building
• Multiple "particles" as hypotheses
• Particles move based on a probabilistic motion model
• Particles live or die based on how well they match the sensor data
Particle Filter
Dieter Fox et al.
• Allows multi-modal uncertainty (Kalman is unimodal Gaussian)
• Allows continuous and discrete state variables (e.g. 3rd floor)
• Allows a rich dynamic model (e.g. must follow the floor plan)
• Can be slow, especially if the state vector dimension is too large
  (e.g. (x, y, identity, activity, next activity, emotional state, …))
Particle Filter Ingredients
p(z_i | x_i)
• z = measurement, x = state, not necessarily the same
• Probability distribution of a measurement given the actual value
• Can be anything, not just Gaussian like Kalman
• But we use Gaussian for the running example, just like Kalman
[Figure: p(z_i | x_i) as a distribution over z_i centered near x_i]
For the running example, the measurement is a noisy version of the actual value
E.g. measured speed (in z) will be slower if the emotional state (in x) is "tired"
Particle Filter Ingredients
p(x_i | x_{i−1})
• Probabilistic dynamics: how the state changes through time
• Can be anything, e.g.
  • Tend to go slower up hills
  • Avoid left turns
  • Attracted to Scandinavian people
• Closed form not necessary
  • Just need a dynamic simulation with a noise component
• But we use Gaussian for the running example, just like Kalman
[Figure: x_i generated from x_{i−1} plus a random vector]
Home Example
Measurements: z = ( (x, y) location in house from WiFi )ᵀ
State (what we want to estimate): x = ( room, activity )
Measurement model p(z_i | x_i), e.g.
• p( (x, y) in kitchen | in bathroom ) = 0
State dynamics p(x_i | x_{i−1}), e.g.
• p( sleeping now | sleeping previously ) = 0.9
• p( cooking now | working previously ) = 0.02
• p( watching TV & sleeping | * ) = 0
• p( bedroom 4 | master bedroom ) = 0
Rich measurement and state dynamics models
Particle Filter Algorithm
Start with N instances of the state vector x_i^(j), i = 0, j = 1 … N
1. i = i + 1
2. Take new measurement z_i
3. Propagate particles forward in time with p(x_i | x_{i−1}), i.e. generate new, random hypotheses
4. Compute importance weights w_i^(j) = p(z_i | x_i^(j)), i.e. how well does the measurement support the hypothesis?
5. Normalize the importance weights so they sum to 1.0
6. Randomly pick new particles based on the importance weights
7. Go to 1
Compute the state estimate as the
• Weighted mean (assumes unimodal)
• Median
(see the sketch below)
Sleepy eyes threat level: orange
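A minimal bootstrap particle filter sketch following the steps above (Python with numpy assumed; σ_Q and the particle count are illustrative choices):

```python
import numpy as np

def particle_filter(measurements, n_particles=1000, dt=1.0,
                    sigma_r=3.0, sigma_q=0.5, rng=None):
    """Minimal bootstrap particle filter for the running example.
    Particle state: (x, y, vx, vy); sigma_q is an illustrative choice."""
    rng = rng or np.random.default_rng(0)
    z0 = np.asarray(measurements[0], dtype=float)
    particles = np.zeros((n_particles, 4))
    particles[:, :2] = z0 + rng.normal(0, sigma_r, size=(n_particles, 2))
    estimates = []
    for z in measurements:
        # Step 3: propagate with p(x_i | x_{i-1}) -- straight-line motion
        # plus a random velocity change, same dynamics as the Kalman filter.
        particles[:, 0] += dt * particles[:, 2]
        particles[:, 1] += dt * particles[:, 3]
        particles[:, 2:] += rng.normal(0, sigma_q, size=(n_particles, 2))
        # Step 4: importance weights from p(z_i | x_i), a Gaussian likelihood.
        d2 = np.sum((particles[:, :2] - np.asarray(z)) ** 2, axis=1)
        w = np.exp(-d2 / (2 * sigma_r**2))
        # Step 5: normalize the weights so they sum to 1.0.
        w /= w.sum()
        # State estimate: weighted mean of the particle locations.
        estimates.append(np.average(particles[:, :2], axis=0, weights=w))
        # Step 6: resample particles in proportion to their weights.
        keep = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[keep]
    return np.array(estimates)
```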
Particle Filter Running Example
[Figure: particle filter results with 1,000 and 1,000,000 particles vs. the actual path, x (meters) vs. y (meters)]
Sometimes increasing the number of particles helps
p(z_i | x_i): the measurement model reflects the true, simulated measurement noise. Same as Kalman in this case.
p(x_i | x_{i−1}): straight-line motion with a random velocity change. Same as Kalman in this case.
x_i = x_{i−1} + Δt v_{i−1}(x)   (location is standard straight-line motion)
v_i(x) = v_{i−1}(x) + N(0, σ_Q)   (velocity changes randomly, because we don't have any idea what it actually does)
Particle Filter Resources
UbiComp 2004
Especially Chapter 1
Particle Filter
Editorial: Particle filter
The particle filter is wonderfully rich and expressive if you can afford the computations. Be careful not to let your state vector get too large.
Outline
• Introduction
• Signal terminology and assumptions
• Running example
• Filtering
  • Mean and median filters
  • Kalman filter
  • Particle filter
  • Hidden Markov model
• Presenting performance results
Hidden Markov Model (HMM)
Big difference from previous: states are discrete, e.g.
• Spoken phoneme
• {walking, driving, biking, riding bus}
• {moving, still}
• {cooking, sleeping, watching TV, playing game, …}
[Photos: Markov (1856–1922); "Hidden" Markov]
(Unhidden) Markov Model
[Figure: three-state diagram over {bus, walk, drive} with transition probabilities 0.9, 0.7, 0.9, 0.1, 0.1, 0.1, 0.2, 0.0, 0.0 on the arrows]
• Move to a new state (or not)
  • at every time click
  • when finished with the current state
• Transition probabilities control the state transitions
Example inspired by: UbiComp 2003
Hidden Markov Model
[Figure: the same {bus, walk, drive} state diagram, now observed only through a noisy accelerometer]
Can "see" the states only via a noisy sensor
HMM: Two Parts
[Figure: HMM trellis: initial state probabilities P(X_0^(j)), then alternating transition probabilities a_jk and observation probabilities P(X_i^(j) | z_i) through time]
Two parts to every HMM:
1) Observation probabilities P(X_i^(j) | z_i): probability of state j given the measurement at time i
2) Transition probabilities a_jk: probability of a transition from state j to state k
• Find the path that maximizes the product of the probabilities (observation & transition)
• Use the Viterbi algorithm to find the path efficiently (see the sketch below)
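A minimal Viterbi sketch (Python with numpy assumed; it works in log space to avoid numeric underflow on long sequences):

```python
import numpy as np

def viterbi(obs_prob, trans, init):
    """Minimal Viterbi sketch.
    obs_prob: (T, S) array, obs_prob[i, j] = P(state j | measurement z_i)
    trans:    (S, S) array, trans[j, k] = a_jk
    init:     (S,) initial state probabilities
    Returns the state sequence maximizing the product of probabilities."""
    T, S = obs_prob.shape
    log_t = np.log(trans + 1e-300)
    score = np.log(init + 1e-300) + np.log(obs_prob[0] + 1e-300)
    back = np.zeros((T, S), dtype=int)
    for i in range(1, T):
        cand = score[:, None] + log_t        # best way into each state
        back[i] = cand.argmax(axis=0)
        score = cand.max(axis=0) + np.log(obs_prob[i] + 1e-300)
    # Trace the best path backward from the best final state.
    path = [int(score.argmax())]
    for i in range(T - 1, 0, -1):
        path.append(int(back[i][path[-1]]))
    return path[::-1]
```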
Smooth Results with HMM
moving vs. still
[Figure: WiFi signal strength vs. time (sec.); a two-state {still, moving} model with a noise-variance distribution per state; self-transition probabilities 0.99989, cross-transition probabilities 0.00011]
Signal strength has higher variance when moving → observation probabilities
Transitions between states are relatively rare (made-up numbers) → transition probabilities
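As a sketch of how the viterbi() function above could be applied to this two-state model (the observation probabilities here are placeholders; a real implementation would derive them from the windowed signal-strength variance):

```python
import numpy as np

# Two states: 0 = still, 1 = moving (transition numbers from the slide).
trans = np.array([[0.99989, 0.00011],
                  [0.00011, 0.99989]])
init = np.array([0.5, 0.5])

# Placeholder observation probabilities: map a hypothetical per-second
# variance estimate to P(moving | z_i); high variance favors "moving".
variance = np.abs(np.random.default_rng(0).normal(0.0, 1.0, 100))
p_moving = variance / (variance + 1.0)
obs_prob = np.column_stack([1.0 - p_moving, p_moving])

states = viterbi(obs_prob, trans, init)  # 0/1 sequence: still vs. moving
```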
Smooth Results with HMM
[Figure: trellis of still/moving states with observation probabilities (from noise variance) and transition probabilities (0.99989 self, 0.00011 cross); the Viterbi algorithm finds the path with the maximum product of observation and transition probabilities]
[Figure: still vs. moving estimate vs. time (seconds): actual, inferred, and inferred-and-smoothed-with-HMM traces]
Results in fewer false transitions between states, i.e. smoother and slightly more accurate
Running Example
[Figure: HMM tracking result, x (meters) vs. y (meters)]
Hidden Markov Model
Discrete states are 10,000 1 m × 1 m squares
Observation probabilities spread as a Gaussian over nearby squares, per the measurement noise model
Transition probabilities go to 8-connected neighbors
0.011762 0.136136 0.011762
0.13964 0.401401 0.13964
0.011762 0.136136 0.011762
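A sketch of the ingredients for this grid HMM (Python with numpy assumed; the grid size and σ follow the running example, and the kernel values are the ones above):

```python
import numpy as np

GRID = 100  # 100 x 100 grid of 1 m x 1 m squares -> 10,000 states

# Transition probabilities to the 8-connected neighborhood (from above).
KERNEL = np.array([[0.011762, 0.136136, 0.011762],
                   [0.139640, 0.401401, 0.139640],
                   [0.011762, 0.136136, 0.011762]])

def observation_probs(z, sigma_r=3.0):
    """Observation probabilities over all squares for one measurement z:
    a Gaussian centered on the measured (x, y), per the noise model."""
    centers = np.arange(GRID) + 0.5  # square centers, in meters
    gx = np.exp(-(centers - z[0])**2 / (2 * sigma_r**2))
    gy = np.exp(-(centers - z[1])**2 / (2 * sigma_r**2))
    p = np.outer(gy, gx)             # rows index y, columns index x
    return p / p.sum()
```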
HMM Reference
• Good description of the Viterbi algorithm
• Also how to learn the model from data
Hidden Markov Model
Editorial: Hidden Markov Model
The HMM is great for certain applications when your states are discrete.
Tracking in (x, y, z) with an HMM?
• Huge state space (→ slow)
• Long dwells
• Interactions with other airplanes
Outline
• Introduction
• Signal terminology and assumptions
• Running example
• Filtering
  • Mean and median filters
  • Kalman filter
  • Particle filter
  • Hidden Markov model
• Presenting performance results
Presenting Continuous Performance Results
[Figure: bar charts of mean and median tracking error (meters) for each filter: Measured, Mean, Median, Kalman (untuned), Kalman (tuned), Particle, HMM]
ε_i = ‖ x̂_i − x_i ‖
• x̂_i = estimated value
• x_i = actual value
• ‖·‖ = Euclidean distance
Plot the mean or median of the Euclidean distance error
• Median is less sensitive to error outliers
Note: Don't judge these filtering methods based on these plots. I didn't spend much time tuning the methods to improve their performance.
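A small sketch for computing these error statistics (numpy assumed):

```python
import numpy as np

def tracking_errors(estimated, actual):
    """Euclidean distance error at each time step, plus its mean and
    median. `estimated` and `actual` are (n, 2) arrays of positions."""
    err = np.linalg.norm(np.asarray(estimated) - np.asarray(actual), axis=1)
    return err, err.mean(), np.median(err)
```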
Presenting Continuous Performance Results
[Figure: cumulative error distribution: fraction vs. error (meters) for the Median, HMM, Kalman (tuned), Particle, Mean, and Kalman (untuned) filters]
Cumulative error distribution
• Shows how the errors are distributed
• More detailed than just a mean or median error
• 50% of the time, the particle filter gives an error of 2 meters or less (median error)
• 95% of the time, the particle filter gives an error of 6 meters or less (95th percentile error)
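And a sketch for the cumulative error distribution plot (numpy and matplotlib assumed):

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_error_cdf(err, label):
    """Empirical cumulative distribution: for each error value, the
    fraction of time steps with an error at or below that value."""
    e = np.sort(err)
    frac = np.arange(1, len(e) + 1) / len(e)
    plt.plot(e, frac, label=label)
    plt.xlabel("Error (meters)")
    plt.ylabel("Fraction")
    plt.legend()
```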
Presenting Discrete Performance Results
Techniques like the particle filter and HMM can classify sequential data into discrete classes
Confusion matrix (rows = actual activities, columns = inferred activities):

                 Sitting  Standing  Walking  Up stairs  Down stairs  Elevator down  Elevator up  Brushing teeth
Sitting            75%      24%       1%       0%         0%           0%             0%            0%
Standing           29%      55%       6%       1%         0%           4%             3%            2%
Walking             4%       7%      79%       3%         4%           1%             1%            1%
Up stairs           0%       1%       4%      95%         0%           0%             1%            0%
Down stairs         0%       1%       7%       0%        89%           2%             0%            0%
Elevator down       0%       2%       1%       0%         8%          87%             1%            0%
Elevator up         0%       2%       2%       6%         0%           3%            87%            0%
Brushing teeth      2%      10%       3%       0%         0%           0%             0%           85%
Pervasive 2006
End
[Figure: tracking error (meters) vs. filter (Mean, Median, Kalman untuned, Kalman tuned, Particle, HMM), mean and median error]
• Introduction
• Signal terminology and assumptions
• Running example
• Filtering
  • Mean and median filters
  • Kalman filter
  • Particle filter
  • Hidden Markov model
• Presenting performance results
[Figure: actual path and measured locations, x (meters) vs. y (meters)]

Ubiquitous Computing Fundamentals, CRC Press, © 2010