Caught in Motion
By: Eric Hunt-Schroeder
EE275 – Final Project, Spring 2012
Kalman Filter
Kalman Filter Equations Summary
U_n – Control Vector, magnitude of any control system's or user's control on the situation
Z_n – Measurement Vector, real-world measurement received
X_n – newest estimate of the current true state
P_n – newest estimate of the average error
A – State Transition Matrix
B – Control Matrix
H – Observation Matrix
Q – Estimated process error covariance
R – Estimated measurement error covariance
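Using the notation above, one predict/update cycle of the discrete Kalman filter can be sketched in Python with NumPy. The variable names mirror the slide's symbols, but the function itself is an illustrative assumption, not code from the original project:

```python
import numpy as np

def kalman_step(x, P, u, z, A, B, H, Q, R):
    """One predict + update cycle of the discrete Kalman filter."""
    # Predict: project the state and error covariance forward.
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update: blend the prediction with the measurement z.
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

With Q = 0 and equal prior and measurement variances, a single scalar update lands the estimate halfway between prediction and measurement, as the gain K = 0.5 suggests.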
Example

Let's suppose we fire a tennis ball at a 45° angle with a velocity of 100 m/s. We take measurements from a camera inside the tennis ball. This camera acts as our sensor, taking a position measurement at each time step (∆t). The camera adds error to our position measurements, while velocity in the x and y directions is known exactly throughout the example.
We therefore have error in position but no error in velocity.
Expected Results Using Newton's Kinematic Equations
Kinematic Equations
x(t) = x_0 + V_0x·t                  x-direction position
V_x(t) = V_0x                        x-direction velocity, assumed constant
y(t) = y_0 + V_0y·t − (1/2)g·t²      y-direction position
V_y(t) = V_0y − g·t                  y-direction velocity

Where: x_0 is the initial displacement and g is the acceleration due to gravity (≈ 9.81 m/s²)
∆t represents a time step of 1
Converting to a Recurrence Relation, discrete time
x_n = x_{n−1} + V_x,{n−1}·∆t                  x-direction position
V_x,n = V_x,{n−1}                             x-direction velocity
y_n = y_{n−1} + V_y,{n−1}·∆t − (1/2)g·∆t²     y-direction position
V_y,n = V_y,{n−1} − g·∆t                      y-direction velocity
g is the acceleration due to gravity (≈ 9.81 m/s²)
∆t represents a time step of 1
Putting into Matrix form
Giving our Kalman Filter some initial information:
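The matrices from the original slides did not survive extraction, but they can be sketched directly from the recurrence relations above. The following reconstruction assumes a state vector [x, V_x, y, V_y]ᵀ and treats gravity as the control input; the exact layout used in the original deck is not recoverable, so the shapes and initial values here are illustrative:

```python
import numpy as np

dt, g = 1.0, 9.81
v0 = 100 / np.sqrt(2)           # 100 m/s launched at 45 degrees

# State vector: [x, Vx, y, Vy]^T
A = np.array([[1, dt, 0,  0],   # x  <- x + Vx*dt
              [0,  1, 0,  0],   # Vx <- Vx
              [0,  0, 1, dt],   # y  <- y + Vy*dt
              [0,  0, 0,  1]])  # Vy <- Vy
B = np.array([[0.0], [0.0], [-0.5 * dt**2], [-dt]])  # gravity affects y, Vy
H = np.array([[1, 0, 0, 0],     # camera measures position only,
              [0, 0, 1, 0]])    # not velocity
u = np.array([[g]])             # control input: gravity magnitude

# Iterating the recurrence in matrix form reproduces the kinematics.
state = np.array([[0.0], [v0], [0.0], [v0]])
trajectory = []
for _ in range(3):
    state = A @ state + B @ u
    trajectory.append((state[0, 0], state[2, 0]))
```

After n steps this yields x = n·V_0x and y = n·V_0y − (1/2)g·n², matching the closed-form kinematic equations from the earlier slide.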
Simulation Results
Introduction – Why Do We Want to Do Motion Tracking?

Track and detect objects moving across a given space
Locate and identify an object – e.g. detecting a robbery in a bank or a car crash in an intersection
Observe the behavior of an animal being studied in the lab
Object Detection - Process
We must differentiate between what is the foreground and background image.
We assign each pixel of an image a distribution of typical values – this forms our background image.
The background should be constantly updated over subsequent frames.
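The per-pixel model and the frame-by-frame update described above can be sketched as a simple running average; the alpha and threshold values here are illustrative assumptions, not parameters from the original project:

```python
import numpy as np

def update_background(background, frame, alpha=0.05, threshold=30.0):
    """Classify foreground pixels, then blend the frame into the background.

    alpha (blend rate) and threshold (outlier cutoff) are illustrative values.
    """
    # A pixel far from its typical value is treated as foreground (outlier).
    diff = np.abs(frame.astype(float) - background)
    foreground = diff > threshold
    # Exponential moving average: the background is an accumulation of
    # previous backgrounds, so gradual changes (sunlight to dusk) are absorbed.
    background = (1 - alpha) * background + alpha * frame
    return foreground, background
```

Each call both detects outliers against the current model and nudges the model toward the new frame, so slow scene changes never accumulate into false detections.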
Problems with Outlier Detection & the Algorithm
Drastic changes to our background have no easy fix.
Gradual changes to the background, on the other hand, can be handled. We let our background be an accumulation of previous backgrounds, which allows minor changes, such as the shift from sunlight to dusk, to be absorbed.
Time for an example!
Example – Background VS Foreground
Background image: with a pedestrian passing through (outlier)
Foreground Image: Pedestrian detected
On the left we see our frame with a background image already developed. We also notice an object, in this case a pedestrian, has entered the frame. This pedestrian causes a drastic change in the pixels we expected and we notice that the frame on the right has detected this change, shown vividly in white (foreground image).
This process is completed by lumping large connected foreground regions into blobs, allowing one to detect an object.
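The blob-lumping step above amounts to connected-component labeling of the foreground mask. A minimal pure-Python sketch (a breadth-first flood fill over 4-connected neighbors; the `min_size` filter is an assumption added to discard single-pixel noise) might look like:

```python
from collections import deque

def find_blobs(mask, min_size=1):
    """Group 4-connected foreground pixels (truthy) into blobs.

    Returns a list of blobs, each a list of (row, col) pixel coordinates.
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:                     # flood-fill one region
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(blob) >= min_size:        # drop tiny noise blobs
                    blobs.append(blob)
    return blobs
```

Raising `min_size` is one crude way to suppress small spurious blobs like the wind-blown street light discussed next.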
Object Detected Incorrectly
Background image: with a pedestrian passing through (outlier)
Foreground Image: Pedestrian detected – Unwanted blob shown
We notice an unwanted blob in the foreground image. Evaluating the original image, we see that this was a street light; strong wind may have moved it, producing an outlier that was incorrectly detected as foreground.
Despite minor errors in object detection, our main goal remains MOTION TRACKING. This involves linking object blobs together across successive image frames, commonly referred to as blob tracking.
Step 2: Object Tracking

We are able to track an object with a bounding box by estimating the trajectories of two (x, y) coordinates at opposite corners using the Kalman filter.
This is quite similar to the example of tracking a tennis ball, except we now keep track of two points.
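One way the two-corner state could be set up is an 8-dimensional constant-velocity model, built by repeating a per-coordinate [position, velocity] block four times (once per corner coordinate). This layout is an assumption for illustration, not necessarily the one used in the project:

```python
import numpy as np

dt = 1.0
# Per-coordinate constant-velocity block: [pos, vel]
F = np.array([[1, dt],
              [0,  1]])
# Track 4 coordinates (x1, y1, x2, y2 of the two opposite corners),
# each with its own velocity: an 8-dimensional state.
A = np.kron(np.eye(4), F)
# The camera observes only the 4 positions, not the velocities.
H = np.kron(np.eye(4), np.array([[1.0, 0.0]]))
```

The same predict/update machinery from the tennis-ball example then applies unchanged; only the matrix sizes grow.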
Another Problem… Occlusion
Another difficulty faced with this motion tracking algorithm is that of occlusion.
When two objects pass each other, we lose track of them.
Future algorithms and research may learn how to better deal with this common problem.
Sources/References
http://en.wikipedia.org/wiki/K-means_clustering
http://greg.czerniak.info/node/5
http://www.cs.berkeley.edu/~flw/tracker/
http://www.cs.ubc.ca/~murphyk/Software/Kalman/kalman.html
http://www.mathworks.com/help/toolbox/control/ref/kalman.html