Systematic Evaluation of Social Behaviour Modelling with a Single Accelerometer

Hayley Hung, Technical University of Delft, The Netherlands, [email protected]
Gwenn Englebienne, University of Amsterdam, The Netherlands, [email protected]

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s). UbiComp'13 Adjunct, September 8–12, 2013, Zurich, Switzerland. ACM 978-1-4503-2215-7/13/09. http://dx.doi.org/10.1145/2494091.2494130

Abstract
We describe our ongoing research on systematically analysing which socially related attributes and behaviours can be estimated automatically in highly social and crowded situations. This is a challenging task because obtaining true labels for social behaviours or attributes is non-trivial in practice. In our set-up, individuals hang a sensing device around their neck that records their acceleration during a social event. We then devise models to estimate their social behaviour or attributes from these measurements and systematically evaluate the feasibility of such a set-up. Given that we use only a single triaxial accelerometer per person, our results are surprisingly accurate and suggest that further socially relevant information could also be extracted. Our systematic evaluations provide a deeper understanding of how to better model socially relevant information in the future.
Author Keywords
Human behavior; social actions; wearable sensors; data mining

ACM Classification Keywords
H.1.2 [Models and Principles]: User/Machine Systems—Human Information Processing; H.3.1 [Information Storage and Retrieval]: Content Analysis and Indexing—Indexing Methods

Poster, Demo, & Video Presentations, UbiComp'13 Adjunct, September 8–12, 2013, Zurich, Switzerland
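The set-up above estimates behaviour from raw acceleration recorded by a single neck-worn triaxial accelerometer. As a purely illustrative sketch of the kind of pre-processing such models might rely on (the paper does not specify its features, window length, or sampling rate; all of those are assumptions here), fixed-length windows of the signal can be summarised by simple per-axis and magnitude statistics:

```python
import numpy as np

def window_features(acc, fs=20, win_s=1.0):
    """Cut a (T, 3) triaxial accelerometer stream into fixed-length
    windows and compute simple per-window statistics.

    acc   : (T, 3) array of x, y, z acceleration samples
    fs    : sampling rate in Hz (assumed; not stated in the paper)
    win_s : window length in seconds (assumed)
    """
    win = int(fs * win_s)
    n = acc.shape[0] // win
    feats = []
    for i in range(n):
        w = acc[i * win:(i + 1) * win]       # one window, shape (win, 3)
        mag = np.linalg.norm(w, axis=1)      # per-sample magnitude
        feats.append(np.concatenate([
            w.mean(axis=0),                  # mean per axis
            w.std(axis=0),                   # standard deviation per axis
            [mag.mean(), mag.std()],         # magnitude statistics
        ]))
    return np.asarray(feats)                 # shape (n, 8)
```

Feature vectors of this kind could then be fed to any per-window action classifier; the statistics chosen here are generic examples, not the authors' actual features.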


as the test data. We performed 10-fold cross-validation ten times on random permutations of the sequences; the classification performance is summarised in Table 2. Speaking was detected most robustly, while gesturing

            gesture   step   drink   laugh   speak
Precision      0.59   1.00    1.00    1.00    0.64
Recall         0.24   0.21    0.21    0.38    0.82
F1             0.34   0.35    0.35    0.56    0.72

Table 2: Average precision, recall and F-measure for the different action categories in our dataset over 10 repetitions of 10-fold cross-validation.

was the most difficult to detect. The remaining behaviours were detected with very high precision but low recall.
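The evaluation protocol described above (per-class precision, recall and F-measure, averaged over ten repetitions of 10-fold cross-validation on random permutations) can be sketched as follows. The classifier itself is omitted, and the function names are illustrative, not from the paper:

```python
import numpy as np

def prf(y_true, y_pred, label):
    """Precision, recall and F1 for one action label."""
    tp = np.sum((y_pred == label) & (y_true == label))
    fp = np.sum((y_pred == label) & (y_true != label))
    fn = np.sum((y_pred != label) & (y_true == label))
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

def repeated_kfold(n, k=10, repeats=10, seed=0):
    """Yield (train_idx, test_idx) pairs: `repeats` random
    permutations of n samples, k folds each."""
    rng = np.random.default_rng(seed)
    for _ in range(repeats):
        order = rng.permutation(n)
        folds = np.array_split(order, k)
        for i in range(k):
            test = folds[i]
            train = np.concatenate(
                [f for j, f in enumerate(folds) if j != i])
            yield train, test
```

Averaging `prf` over all `repeated_kfold` test folds yields per-class scores of the kind reported in Table 2.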

We would like to exploit these behaviours to detect who is speaking with whom. Social scientists have found that people talking together exhibit certain distinctive synchronous behaviours [5]. Preliminary analysis of our data showed that, for people in the same group compared to different groups, gesturing and stepping occur synchronously more frequently, while simultaneous speaking occurs less often.
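A minimal sketch of the kind of synchrony measure such a preliminary analysis might use, assuming each person's behaviour is binary-coded per time window (the paper does not give its exact measure, so this is an assumed simplification):

```python
import numpy as np

def cooccurrence_rate(a, b):
    """Fraction of time windows in which two people perform the same
    binary-coded action simultaneously (e.g. both gesturing).

    a, b : equal-length sequences of 0/1 action indicators,
           one entry per time window.
    """
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return np.mean(a & b)
```

Comparing the mean co-occurrence rate across same-group pairs with that across different-group pairs, per action category, would surface the pattern described above: higher synchrony for gesturing and stepping within groups, lower for speaking.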

Conclusion and Future Work
We have presented systematic evaluations of the estimation of social attributes and behaviour from accelerometer data recorded during crowded social gatherings. Future work will focus on understanding how fusing information from the social attributes and actions could help to improve the estimation performance.

Acknowledgements
We thank researchers at the VU University of Amsterdam (Matthew Dobson, Claudio Martella, and Maarten van Steen) for the use of their wearable sensors and help during the data collection, and the University of Amsterdam (Jeroen Kools and Ben Krose) for their data collection and annotation help.

References
[1] Castellano, G., Kessous, L., and Caridakis, G. Emotion recognition through multiple modalities: face, body gesture, speech. Affect and Emotion in Human-Computer Interaction (2008).

[2] Cattuto, C., Van den Broeck, W., Barrat, A., Colizza, V., Pinton, J., and Vespignani, A. Dynamics of person-to-person interactions from distributed RFID sensor networks. PLOS ONE 5, 7 (2010).

[3] Englebienne, G., and Hung, H. Mining for motivation: using a single wearable accelerometer to detect people's interests. In ACM MM Workshops, ACM (2012).

[4] Hung, H., Englebienne, G., and Kools, J. Classifying social actions with a single accelerometer. In UbiComp (2013).

[5] Kendon, A. Conducting Interaction: Patterns of Behavior in Focused Encounters. Cambridge University Press, 1990.

[6] Kim, T., McFee, E., Olguin, D. O., Waber, B., and Pentland, A. S. Sociometric badges: Using sensor technology to capture new forms of collaboration. Journal of Organizational Behavior (2012).

[7] Laibowitz, M., and Paradiso, J. The UbER-Badge, a versatile platform at the juncture between wearable and social computing. Advances in Pervasive Computing (2004).

[8] Olguin, D. O., Waber, B. N., Kim, T., Mohan, A., Ara, K., and Pentland, A. Sensible organizations: Technology and methodology for automatically measuring organizational behavior. IEEE Transactions on Systems, Man, and Cybernetics, Part B (2009).

[9] Pantic, M., Pentland, A., Nijholt, A., and Huang, T. Human computing and machine understanding of human behavior: a survey. Artificial Intelligence for Human Computing (2007).

[10] Wyatt, D., Choudhury, T., Bilmes, J., and Kitts, J. A. Inferring colocation and conversation networks from privacy-sensitive audio with implications for computational social science. ACM Trans. Intell. Syst. Technol. 2, 1 (Jan. 2011).
