
Activity recognition based on a multi-sensor meta-classifier


DESCRIPTION

Ensuring ubiquity, robustness and continuity of monitoring is of key importance in activity recognition. To that end, multiple sensor configurations and fusion techniques are increasingly used. In this paper we present a multi-sensor meta-classifier that aggregates the knowledge of several sensor-based decision entities to provide a single, reliable activity classification. The model introduces a new weighting scheme that improves the rating of the impact each entity has on the decision fusion process: sensitivity and specificity are used as insertion and rejection weighting metrics, in place of the overall classification accuracy proposed in previous work. For the sake of comparison, both the new and previous weighting models, together with feature fusion models, are tested on an extensive activity recognition benchmark dataset. The results demonstrate that the new weighting scheme enhances decision aggregation, leading to an improved recognition system.

This presentation illustrates part of the work described in the following articles:

* Banos, O., Damas, M., Pomares, H., Rojas, F., Delgado-Marquez, B., Valenzuela, O.: Human activity recognition based on a sensor weighting hierarchical classifier. Soft Computing - A Fusion of Foundations, Methodologies and Applications, vol. 17, pp. 333-343, Springer Berlin / Heidelberg (2013)
* Banos, O., Damas, M., Pomares, H., Rojas, I.: Activity recognition based on a multi-sensor meta-classifier. In: Proceedings of the 2013 International Work-Conference on Artificial Neural Networks (IWANN 2013), Tenerife, Spain, June 12-14 (2013)
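The description above uses sensitivity (insertion) and specificity (rejection) as weighting metrics. As a minimal illustration, both can be derived per class from a confusion matrix; a numpy sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def per_class_se_sp(cm):
    """Per-class sensitivity (SE) and specificity (SP) from a confusion
    matrix cm, where cm[i, j] counts samples of true class i predicted as j."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp   # class-i samples the classifier missed
    fp = cm.sum(axis=0) - tp   # samples wrongly inserted into class i
    tn = cm.sum() - tp - fn - fp
    se = tp / (tp + fn)        # insertion metric: how well the class is accepted
    sp = tn / (tn + fp)        # rejection metric: how well other classes are rejected
    return se, sp

# Toy 3-class confusion matrix (rows: true class, columns: predicted class)
cm = [[8, 1, 1],
      [0, 9, 1],
      [2, 0, 8]]
se, sp = per_class_se_sp(cm)   # se = [0.8, 0.9, 0.8]; sp = [0.9, 0.95, 0.9]
```

Unlike overall accuracy, these two numbers rate a decision entity separately on accepting its own class and on rejecting the others, which is what the asymmetric weighting exploits.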


Page 1: Activity recognition based on a multi-sensor meta-classifier

Activity recognition based on a multi-sensor hierarchical classifier

IWANN 2013, 12-14 June, Tenerife (Spain)

Oresti Baños, Miguel Damas, Héctor Pomares and Ignacio Rojas Department of Computer Architecture and Computer Technology, CITIC-UGR,

University of Granada, SPAIN

DG-Research Grant #228398

Page 2: Activity recognition based on a multi-sensor meta-classifier

Introduction

• Activity recognition concept

– “Recognize the actions and goals of one or more agents from a series of observations on the agents' actions and the environmental conditions”

• Applications (among others)

– eHealth (AAL, telerehabilitation)

– Sports (performance improvement, injury-free pose)

– Industrial (assembly tasks, avoidance of risk situations)

– Gaming (Kinect, Wiimote, PlayStation Move)

• Categorization by sensor modality

– Ambient

– On-body


Page 3: Activity recognition based on a multi-sensor meta-classifier

Sensing Activity


• Ambient sensors

Page 4: Activity recognition based on a multi-sensor meta-classifier

Sensing Activity

• Ambient sensors

Limitations*

Page 5: Activity recognition based on a multi-sensor meta-classifier

Sensing Activity

• On-body sensors

– 1st, 2nd, 3rd generation (and beyond…)

Page 6: Activity recognition based on a multi-sensor meta-classifier

Activity Recognition Chain (ARC)



Page 13: Activity recognition based on a multi-sensor meta-classifier

Activity Recognition Chain (ARC)

Single-sensor chain: S → u → p → s1,s2,…,sk → fℝ(s1,s2,…,sk) → c, with example feature vectors:

[-0.14, 3.41, 4.21, …, 6.11]
[-0.84, 3.21, 4.21, …, 6.11]
[-0.81, 5.71, 4.21, …, 6.22]
[-0.14, 3.92, 4.23, …, 7.82]
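The chain above can be sketched for a single sensor channel: segmentation into fixed-size windows, then a feature vector (s1, …, sk) per window. A minimal sketch, assuming simple statistical features (all names illustrative):

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Segmentation stage: split a signal into fixed-size windows."""
    return [signal[i:i + win] for i in range(0, len(signal) - win + 1, step)]

def extract_features(window):
    """Feature extraction stage: window -> feature vector (s1, ..., sk)."""
    w = np.asarray(window, dtype=float)
    return np.array([w.mean(), w.std(), w.max(), w.min()])

# Stand-in for one preprocessed sensor channel
signal = np.sin(np.linspace(0.0, 10.0, 300))
X = np.array([extract_features(w) for w in sliding_windows(signal, win=50, step=25)])
# Each row of X is one feature vector, ready for the reasoning stage f(s1,...,sk) -> c
```

The reasoning stage fℝ would then map each row of X to a class label c.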


Page 16: Activity recognition based on a multi-sensor meta-classifier

Activity Recognition Chain (ARC)


SENSOR FUSION

Page 17: Activity recognition based on a multi-sensor meta-classifier

ARC Fusion: Feature Fusion

Example feature vectors: [-0.14, 3.41, 4.21, …, 6.11], [-0.84, 3.21, 4.21, …, 6.11], [-0.81, 5.71, 4.21, …, 6.22], [-0.14, 3.92, 4.23, …, 7.82]

S1: u1 → p1 → s11,s12,…,s1k

S2: u2 → p2 → s21,s22,…,s2k

SM: uM → pM → sM1,sM2,…,sMk

Feature fusion: fℝ(s11,s12,…,s1k, s21,s22,…,s2k, …, sM1,sM2,…,sMk) → c
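Feature fusion simply concatenates the per-sensor feature vectors into one joint vector that feeds a single classifier. A minimal sketch (the values echo the slide's example vectors):

```python
import numpy as np

# Illustrative per-sensor feature vectors (s_m1, ..., s_mk) for M = 3 sensors
s1 = np.array([-0.14, 3.41, 4.21, 6.11])
s2 = np.array([-0.84, 3.21, 4.21, 6.11])
s3 = np.array([-0.81, 5.71, 4.21, 6.22])

# Feature fusion: one joint vector for a single classifier
# f(s11,...,s1k, s21,...,s2k, ..., sM1,...,sMk) -> c
fused = np.concatenate([s1, s2, s3])
```

The single classifier sees all sensors at once, at the cost of a feature space that grows linearly with the number of sensors.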

Page 18: Activity recognition based on a multi-sensor meta-classifier

ARC Fusion: Decision Fusion

S1: u1 → p1 → s11,s12,…,s1k → fℝ(s11,s12,…,s1k) → c1

S2: u2 → p2 → s21,s22,…,s2k → fℝ(s21,s22,…,s2k) → c2

SM: uM → pM → sM1,sM2,…,sMk → fℝ(sM1,sM2,…,sMk) → cM

Decision fusion: c = φ(c1, c2, …, cM)
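A common choice for the aggregation function φ is (weighted) majority voting over the per-sensor labels. A minimal sketch, with illustrative names and labels:

```python
from collections import Counter

def phi(decisions, weights=None):
    """Decision fusion c = phi(c1, ..., cM): weighted majority vote over the
    per-sensor class labels; uniform weights give plain majority voting."""
    if weights is None:
        weights = [1.0] * len(decisions)
    tally = Counter()
    for label, w in zip(decisions, weights):
        tally[label] += w
    return max(tally, key=tally.get)

fused_a = phi(["walk", "run", "run"])                     # majority -> "run"
fused_b = phi(["walk", "run", "walk"], [0.9, 0.3, 0.8])   # weights -> "walk"
```

The weighting scheme matters: with non-uniform weights, more reliable sensors can override a numerical majority, which is exactly what the weighting metrics discussed in this work are meant to calibrate.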

Page 19: Activity recognition based on a multi-sensor meta-classifier

Multi-Sensor Hierarchical Classifier

Each sensor Sm (with chain um → pm → sm1,…,smk → fℝ(sm1,…,smk)) feeds N class-level classifiers Cm1, Cm2, …, CmN, each carrying an insertion weight αmn and a rejection weight βmn. At source level the per-sensor outputs are combined with weights γm1,…,mN and δm1,…,mN, and a final fusion stage (∑) delivers the decision.

Class level → Source level → Fusion
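One plausible reading of the asymmetric insertion/rejection weighting can be sketched as follows. This is a simplified illustration under stated assumptions (`mshc_decision`, `votes`, `alpha`, `beta` are invented names; the exact aggregation rule is the one defined in the cited papers and may differ in detail):

```python
import numpy as np

def mshc_decision(votes, alpha, beta):
    """Sketch of asymmetric decision weighting. votes[m, n] is 1 when sensor
    m's class-n classifier accepts the sample. An acceptance adds the
    insertion weight alpha[m, n] (sensitivity-based) to class n; a rejection
    spreads the rejection weight beta[m, n] (specificity-based) over the
    other classes. Highest aggregate score wins."""
    M, N = votes.shape
    scores = np.zeros(N)
    for m in range(M):
        for n in range(N):
            if votes[m, n]:
                scores[n] += alpha[m, n]
            else:
                share = beta[m, n] / (N - 1)
                scores += share
                scores[n] -= share   # the rejected class itself gets nothing
    return int(np.argmax(scores))

# Two sensors, three classes: both class-0 classifiers accept, the rest reject
votes = np.array([[1, 0, 0],
                  [1, 0, 0]])
alpha = np.full((2, 3), 0.8)   # sensitivity-based insertion weights
beta = np.full((2, 3), 0.9)    # specificity-based rejection weights
winner = mshc_decision(votes, alpha, beta)   # -> 0
```

The point of the asymmetry is that a classifier that is good at rejecting (high SP) but poor at accepting (low SE), or vice versa, still contributes exactly where it is reliable.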

Page 20: Activity recognition based on a multi-sensor meta-classifier

Multi-Sensor Hierarchical Classifier

N activities, M sensors: Class level → Source level → Fusion


Page 24: Activity recognition based on a multi-sensor meta-classifier

Experimental setup: dataset

• Fitness benchmark dataset

• Up to 33 activities

• 9 IMUs (XSENS): accelerometer (ACC), gyroscope (GYR), magnetometer (MAG)

• 17 subjects

Baños, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I., Amft, O.: A benchmark dataset to evaluate sensor displacement in activity recognition. In: 14th International Conference on Ubiquitous Computing (Ubicomp 2012), Pittsburgh, USA, September 5-8 (2012)

Page 25: Activity recognition based on a multi-sensor meta-classifier

Results

• Segmentation: sliding window (6 seconds)

• Feature extraction: FS1 = {mean}, FS2 = {mean, std}, FS3 = {mean, std, max, min, cr}

• Classification: decision tree (C4.5), 10-fold cross-validation, 100 repetitions
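The three feature sets are nested, so the richer ones can be sketched as extensions of the simpler ones. A minimal sketch; note that "cr" is read here as the mean-crossing rate, which is an assumption since the slide only gives the abbreviation:

```python
import numpy as np

def fs3(window):
    """FS3 = {mean, std, max, min, cr}. 'cr' is taken as the mean-crossing
    rate -- an assumption, since the slide only abbreviates it."""
    w = np.asarray(window, dtype=float)
    centered = w - w.mean()
    # fraction of consecutive samples whose sign flips around the mean
    cr = np.count_nonzero(np.diff(np.sign(centered)) != 0) / len(w)
    return np.array([w.mean(), w.std(), w.max(), w.min(), cr])

fs2 = lambda w: fs3(w)[:2]   # FS2 = {mean, std}
fs1 = lambda w: fs3(w)[:1]   # FS1 = {mean}

v = fs3([1.0, 2.0, 3.0, 4.0])
```

Computing each window's features once and slicing keeps the three experimental conditions trivially consistent.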

[Figure: Accuracy (%) for FS1, FS2 and FS3 over 10-, 20- and 33-activity scenarios, comparing Feature Fusion, Weighted Majority Voting and the Multi-Sensor Hierarchical Classifier; y-axis 60-100%]

Page 26: Activity recognition based on a multi-sensor meta-classifier

Conclusions

• We propose a multi-sensor hierarchical classifier that allows data fusion of multiple sensors

– Its asymmetric decision weighting (SE for insertions, SP for rejections) leverages each classifier's potential for classification, rejection, or both

– Especially suited for complex scenarios

• Feature Fusion and the MSHC perform comparably overall; however

– Our method outperforms the former when a more informative feature set is used

– Particularly notable for complex recognition scenarios

• Our model is expected to be particularly suited to deal with sensor anomalies (work-in-progress)


Page 27: Activity recognition based on a multi-sensor meta-classifier

Ongoing work…

• Our model is expected to be particularly suited to deal with sensor anomalies (work-in-progress)


[Figure: Accuracy (%) of Feature Fusion (FEAT-FUSION) vs. MSHC under Ideal, Self and Induced conditions; y-axis 0-100%]

Page 28: Activity recognition based on a multi-sensor meta-classifier

Thank you for your attention. Questions?

Oresti Baños Legrán, Dept. of Computer Architecture & Computer Technology

Faculty of Computer & Electrical Engineering (ETSIIT) University of Granada, Granada (SPAIN)

Email: [email protected] Phone: +34 958 241 516 Fax: +34 958 248 993

Work supported in part by the HPC-Europa2 project funded by the European Commission - DG Research in the Seventh Framework Programme under grant agreement no. 228398, the Spanish CICYT Project SAF2010-20558, Junta de Andalucia Project P09-TIC-175476 and the FPU Spanish grant AP2009-2244.
