MSc: Sustainable Energy Technology
Daniel Barzegar Ntovom, Cand. No: 115604
Supervisor: Dr. Daniel Roggen
September 2014
University of Sussex
“Real-time gait and postural transition analysis with Google Glass”
Objectives
Recognize the user’s behavior from the sensors in Glass
Real-time feedback (audio, image)
Applications in sports (e.g. fitness) or rehabilitation (e.g. Parkinson's disease)
Sensing systems
External sensors: mounted at predetermined points of interest – voluntary interactions of the user with the sensors (e.g. smart homes)
• State-change sensors
• Motion sensors
• RFID tags
• Camera
Wearable sensors: devices attached to the user's body
• Use of inertial sensors: accelerometer, gyroscope, GPS
Previous work
• Use of accelerometer & gyroscope sensors mounted on the body
• Mainly under laboratory settings
Recognize gait transitions: ear sensor, 80–90% accuracy
Recognize gait transitions (L): 5 accelerometer sensors, 84% accuracy
Previous work (cont.)
Recognize hand gestures (L): accelerometer & gyroscope sensors, > 80% accuracy
Data collection
• Number of participants: 10
• Data acquisition time: ~45 min
• Activities performed: sitting/standing, walking, ascending/descending stairs
• Venue: University of Sussex campus, under naturalistic settings
• Devices used: Google Glass, plus a smartphone camera to record the whole process
• Software: Android app designed to acquire data from the sensors of the Glass
• Sampling rate: 250 Hz
• Total size of data collected: ~1.5 GB
• Total size of videos recorded: ~40 GB
Source: http://www.sussex.ac.uk/internal/bulletin/archive/11jan08/article6.shtml
Signals from the accelerometer sensor of Glass
Stand-up (normal) transition examples
Annotation
• Anvil software
Figure 1 – Annotation bar
Activity Recognition Chain (ARC)
Figure 3 - The activity recognition chain (ARC) to recognize activities from wearable sensors
Source: https://www.andreas-bulling.de/fileadmin/docs/bulling14_csur.pdf
• Data collection
• Preprocessing: merging, unit conversion, synchronization, resampling (sketched below)
• Segmentation: isolated case / sliding window
• Feature extraction: the N-best of 60 features
• Classification: kNN classifier, L-fold cross-validation
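As a rough illustration of the preprocessing stage, the sketch below resamples an irregularly timestamped sensor stream onto a uniform 250 Hz grid with linear interpolation. This is a minimal sketch in Python/NumPy under my own assumptions; the function and variable names are hypothetical, not the project's actual code.

```python
import numpy as np

def resample_uniform(t, x, rate_hz=250.0):
    """Resample an irregularly sampled signal onto a uniform grid.

    t : 1-D array of timestamps in seconds, monotonically increasing
    x : samples, shape (len(t), n_channels)
    Returns (t_uniform, x_uniform) at the target rate.
    """
    t_uniform = np.arange(t[0], t[-1], 1.0 / rate_hz)
    # Linearly interpolate each channel onto the uniform time grid
    x_uniform = np.column_stack(
        [np.interp(t_uniform, t, x[:, c]) for c in range(x.shape[1])]
    )
    return t_uniform, x_uniform
```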
Data segmentation
Isolated case
Segments are defined by the start and end times at which the activity of interest (Sit/Stand) occurred.
Sliding window
A window of size Ws is moved over the time series of the data, with a step Wstep = 1/3 * Ws.
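A minimal sketch of this sliding-window segmentation, assuming the signal is already a NumPy array sampled at 250 Hz; the window length Ws and the step Wstep = 1/3 * Ws follow the slides, everything else is illustrative.

```python
import numpy as np

def sliding_windows(signal, rate_hz=250.0, ws_sec=0.994904):
    """Yield (start, end) sample indices of windows of length Ws,
    moved over the time series with step Wstep = 1/3 * Ws."""
    ws = int(round(ws_sec * rate_hz))   # window length in samples
    step = max(1, ws // 3)              # Wstep = 1/3 * Ws
    for start in range(0, len(signal) - ws + 1, step):
        yield start, start + ws

# Example: count windows over 45 min of (placeholder) 3-axis data
acc = np.zeros((45 * 60 * 250, 3))
n_windows = sum(1 for _ in sliding_windows(acc))
```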
Feature extraction
Statistical features extracted:
Mean
Median
Std
Skewness
Kurtosis
Min
Max
Max-Min
Mode
RMS (root mean square)
10 statistical features × 3 axes × 2 sensors (acc, gyro) = 60 features
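The ten statistics are straightforward to compute per window; below is a sketch assuming each window arrives as a (samples × 3 axes) NumPy array per sensor. The helper name and the use of SciPy are my choices, not necessarily the project's implementation.

```python
import numpy as np
from scipy.stats import skew, kurtosis, mode

def window_features(w):
    """10 statistical features per axis for one sensor window.

    w : array of shape (n_samples, 3).  Returns 30 values;
    concatenating acc and gyro windows gives the 60-feature vector.
    """
    feats = [
        w.mean(axis=0),                      # mean
        np.median(w, axis=0),                # median
        w.std(axis=0),                       # std
        skew(w, axis=0),                     # skewness
        kurtosis(w, axis=0),                 # kurtosis
        w.min(axis=0),                       # min
        w.max(axis=0),                       # max
        w.max(axis=0) - w.min(axis=0),       # max - min
        np.ravel(mode(w, axis=0).mode),      # mode
        np.sqrt((w ** 2).mean(axis=0)),      # RMS
    ]
    return np.concatenate(feats)
```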
Feature computation
• mRMR algorithm to select the N-best features (N = 1, 2, 3, 5, 10) from a total of 60 features (a minimal sketch follows the figure below)
Isolated case – Sit-Down VS Stand-Up problem
Figure 4 – Frequency / feature
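For illustration, here is a minimal greedy mRMR (difference criterion) built on scikit-learn's mutual-information estimators; the project's actual mRMR variant and implementation may differ.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, n_best=10):
    """Greedily pick features maximizing relevance MI(f; y)
    minus mean redundancy MI(f; already-selected features)."""
    relevance = mutual_info_classif(X, y)
    selected = [int(np.argmax(relevance))]
    while len(selected) < n_best:
        best_j, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            redundancy = np.mean(
                [mutual_info_regression(X[:, [j]], X[:, s])[0] for s in selected]
            )
            if relevance[j] - redundancy > best_score:
                best_j, best_score = j, relevance[j] - redundancy
        selected.append(best_j)
    return selected  # indices of the N-best columns of X
```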
Sliding window – Sit VS Rest
• Different values for the window size are studied (converted to sample counts in the sketch after the list below)
Window sizes:
Ws = 0.994904 sec
Ws1 = 1.1 * Ws   Ws5 = 0.9 * Ws
Ws2 = 1.2 * Ws   Ws6 = 0.8 * Ws
Ws3 = 1.3 * Ws   Ws7 = 0.7 * Ws
Ws4 = 1.4 * Ws
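For concreteness, the candidate sizes translate into sample counts at the 250 Hz sampling rate roughly as follows (a back-of-envelope sketch, assuming windows are rounded to whole samples).

```python
rate_hz = 250
ws = 0.994904                                   # base window size in seconds
for f in (0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4):
    print(f"{f:.1f} * Ws = {f * ws:.3f} s ~ {round(f * ws * rate_hz)} samples")
# e.g. Ws itself ~ 249 samples; 0.7 * Ws ~ 174; 1.4 * Ws ~ 348
```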
Sliding window – Sit VS Rest (cont.)
Figure 5 – Frequency / feature
Training & Classification
• kNN classifier (k = 1, 3, 5, 7), sketched below
• L-fold cross-validation
Isolated case – Sit VS Stand problem
Figure 6 – kNN example
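A minimal scikit-learn sketch of this step. The feature matrix and labels here are random placeholders standing in for the real 60-feature data, and the slides do not state L, so the 10 folds are an assumption.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 60))        # placeholder for the real 60-feature matrix
y = rng.integers(0, 2, size=200)      # placeholder Sit/Stand labels

for k in (1, 3, 5, 7):
    clf = KNeighborsClassifier(n_neighbors=k)
    scores = cross_val_score(clf, X, y, cv=10)   # L-fold CV (L = 10 assumed)
    print(f"k={k}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```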
Isolated case – Sit VS Stand problem (cont.)
The classifier was trained under two protocols:
1) trained on each subject's own activity sequence (user-specific protocol)
2) trained on the activity sequences of all the subjects except one (leave-one-subject-out protocol)
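The second protocol corresponds to what scikit-learn calls leave-one-group-out; a sketch assuming a `groups` array holding the subject ID of every segment (placeholder data and hypothetical names throughout).

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 60))          # placeholder feature matrix
y = rng.integers(0, 2, size=200)        # placeholder Sit/Stand labels
groups = rng.integers(0, 10, size=200)  # subject ID per segment (10 participants)

clf = KNeighborsClassifier(n_neighbors=3)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("accuracy per held-out subject:", np.round(scores, 3))
```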
Sliding window – Sit VS Rest
Perform activity recognition on the selected features using a kNN classifier (see the sketch after the figure below)
• kNN classifier (k = 3)
Figure 8 – Recognition accuracies for the N-best features
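A curve like Figure 8 can be reproduced by scoring the classifier on each N-best subset; this sketch reuses the hypothetical `mrmr_select` helper and placeholder X, y from the sections above.

```python
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# X, y as before; mrmr_select as sketched in the feature-computation section
for n in (1, 2, 3, 5, 10):
    cols = mrmr_select(X, y, n_best=n)           # indices of the N-best features
    clf = KNeighborsClassifier(n_neighbors=3)    # k = 3 per the slide
    acc = cross_val_score(clf, X[:, cols], y, cv=10).mean()
    print(f"N={n}: mean accuracy {acc:.3f}")
```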
Discussion - Conclusion
Only features from the gyro sensor were extracted
Challenges regarding the segmentation method used
The sliding window had low accuracy; however, only the sensors of Glass were used
Different values of k do not significantly affect the accuracy (isolated case)
These results are competitive with prior activity recognition works using other sensors
Data were collected under naturalistic settings
More on-body sensors might increase the accuracy
Thank you!
Questions?