
POSTERS

1. Facial Expressions Interpretation for Human-Robot Interaction
   Vishwas Mruthyunjaya, Carnegie Mellon University, Pittsburgh, United States

2. Provoking Pleo – Child Life Specialists’ Reflections On The Use Of Robotic Playmates In Hospital Settings
   Saskia van Oenen, Rianne Meiring, Wanda van Oostrom, Melissa Wesselius and Marcel Heerink, Windesheim Flevoland University, Robotics research group

3. Cloud robotics in therapy and education: possibilities and challenges
   Ingmar Koningen, Paul Koot, Jelmer Stavenga and Tom Visser, Windesheim Flevoland University, Robotics research group

4. Pleo rb social pet robot as positive emotional state facilitator to facilitate Learning
   Naema Brazal-Alcaide and Nathalie P. Lizeretti, FPCEE Blanquerna, Ramon Llull University, Barcelona, Spain; Olga Sans-Cope, Technical University of Catalonia, Barcelona, Spain; and Jordi Albo-Canals, GRSETAD – La Salle, Ramon Llull University, Barcelona, Spain

5. Social Robotics In Education Involving ASD Children: A Collaborative Design Project
   Saskia van Oenen, Hanno van Keulen and Marcel Heerink, Windesheim Flevoland University, Robotics research group

6. LEGO Robotics activities feeder for Social Robotics through a Cloud-based Architecture
   Frederick Sanson, Gabriel Aguirre, Mario Mejia and Victor Lopez, Technological University of Panama, Panama City; Jordi Albo-Canals, GRSETAD – La Salle, Ramon Llull University, Barcelona, Spain

7. KNXbot, a social robot fully integrated with smart buildings
   Ignacio de Ros Viader, AdR Ingeniería S.L., Barcelona, Spain; Enric Gonzalez and Xavi Burruezo, Dynatech, Barcelona, Spain; Jordi Albo-Canals, GRSETAD – La Salle, Ramon Llull University, Barcelona, Spain

8. Designing Socially Assistive Robot (SAR) for Cognitive Child-Robot Interaction (CCRI) with children with Autism Spectrum Disorder – The case of +me
   Beste Ozcan, Daniele Caligiore, Valerio Sperati and Gianluca Baldassarre, Institute of Cognitive Sciences and Technologies, ISTC-CNR, Rome, Italy; Eduard Fosch-Villaronga, Institute for Governance Studies, University of Twente, Enschede, The Netherlands; Tania Moretta, Department of General Psychology, University of Padova, Italy

9. Social Robots in Education: Towards Versatility
   Wafa Johal, CHILI/LSRO, École Polytechnique Fédérale de Lausanne, Switzerland; Gaëlle Calvary, Nadine Mandan and Sylvie Pesty, Laboratoire d’Informatique de Grenoble, Grenoble-Alps University, France

10. Ethical concerns when developing social robots for care
    Ricardo Machado, Departament de Psicologia Social, Universitat Autònoma de Barcelona, Bellaterra, Spain; Jordi Albo-Canals, GRSETAD – La Salle, Ramon Llull University, Barcelona, Spain


Designing Socially Assistive Robot (SAR) for Cognitive Child-Robot Interaction (CCRI) with children with Autism Spectrum Disorder – The case of “+me”

Beste Ozcan (a), Daniele Caligiore (a), Eduard Fosch-Villaronga (b), Valerio Sperati (a), Tania Moretta (c) and Gianluca Baldassarre (a)

(a) Institute of Cognitive Sciences and Technologies, ISTC-CNR, Rome, Italy
(b) Institute for Governance Studies, University of Twente, Enschede, The Netherlands
(c) Department of General Psychology, University of Padova, Italy

Designing socially assistive robots (SAR) is about designing social robots that work on the cognitive level and that can engage in social interactions which are compelling and familiar to users, in this case users with Autism Spectrum Disorder (ASD). The stress and unpredictability caused by social interaction are largely removed during interaction with a computer, a robot, or a mechatronic device [1]. Therefore, not only the physical embodiment, but also the personality of the robot and its ability to model some of the patient’s motivational states, are dimensions that will be crucial to effectively and positively impact the user’s life [2]. For example, the “+me” prototype is a transitional wearable companion (TWC), an embedded social robot that responds to the user’s manipulations by emitting lights, sounds, or vibrations, usable for multiple purposes such as motivating children to engage and interact socially [3]. The key design dimensions for such social robots (based on how we perceive, engage with and want robots to look; what motivates us when we interact; and on current legal requirements and the use of biofeedback) are defined below:

a) Perception: The lifelikeness of a robot plays a strong role in HRI, especially if the robot is designed to work at the emotional level [4].

b) Emotional attachment: SAR are physical, can behave autonomously, and display social behavior – which can lead children to respond to their cues even though they are not alive [5].

c) Embodiment: The embodiment affects users’ perceptions of the robot’s personality, mind [6] and intention [7] (see perception). SAR should be endowed with behaviors that enrich the interaction with humans, making such interaction natural.

d) Motivation: SAR may act as intrinsically rewarding sidekicks/social partners, especially for children with special needs. SAR can reproduce the social and emotional benefits associated with the interaction and emotional bond between children and companion animals, such as entertainment, relief, support and enjoyment [8].

e) Interaction: Through social interactions, humans are constantly responsive to social cues from others, which tell us how to behave in response to how others are acting and feeling. SAR should be designed with similar social capabilities to be integrated into children’s lives [9].

f) Legal and Ethical: As robots can have moral and ethical implications, there will increasingly be a need to accommodate the design of the robot to ethical and legal considerations. As we have seen, several dimensions need to be taken into account in order to accommodate the use of emotions in cognitive HRI. Since people perceive different robot designs differently, designers need guidance to create appropriate robots for specific purposes – in this case, therapeutic contexts [10].

g) The Use of Biofeedback: Emotion regulation depends critically on the ability to adjust physiological arousal, which is reflected in the capacity to rapidly vary heart rate [11]. At the same time, an increase in the activity of the vagus nerve, measured through heart rate variability (HRV), is associated with social interaction skills and decreased stress [12].
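As a concrete illustration of point (g), and of the sensor-to-actuator loop elaborated in the closing paragraph below, the following minimal Python sketch estimates HRV from inter-beat intervals using RMSSD over a sliding window and maps it to the intensity of a visual feedback channel such as +me’s lights. The sensor interface, window size, mapping thresholds and actuator callback are illustrative assumptions, not part of any actual +me implementation.

# Minimal, illustrative sketch of an HRV feedback loop (all names and
# parameters below are assumptions, not an actual +me or SAR API).
from collections import deque
from math import sqrt


def rmssd(ibis_ms):
    """Root mean square of successive differences of inter-beat intervals (ms)."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs)) if diffs else 0.0


def hrv_to_intensity(hrv_ms, low=20.0, high=80.0):
    """Map an RMSSD value onto a 0..1 actuator intensity (clamped linear scale)."""
    return max(0.0, min(1.0, (hrv_ms - low) / (high - low)))


def feedback_loop(ibi_stream, set_light_intensity, window=30):
    """Consume a stream of inter-beat intervals (ms) and drive a feedback actuator."""
    recent = deque(maxlen=window)          # sliding window of recent IBIs
    for ibi in ibi_stream:
        recent.append(ibi)
        if len(recent) >= 5:               # wait for a minimally stable estimate
            set_light_intensity(hrv_to_intensity(rmssd(list(recent))))


if __name__ == "__main__":
    # Simulated data: slow breathing produces larger IBI swings, hence a higher
    # RMSSD and a brighter, more rewarding light feedback for the child.
    simulated_ibis = [800, 850, 790, 860, 780, 870, 770, 880]   # ms, illustrative only
    feedback_loop(simulated_ibis, lambda x: print(f"light intensity: {x:.2f}"))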

A SAR's mechatronic components could contain non-intrusive biosensors to detect the heart rate and the breathing rate of the ASD child. Biosensors collect data online, thanks to wearable devices worn by the ASD child [13]. Wearable devices could stream heart rate and breathing rate data wirelessly to the SAR, which would rapidly and accurately feed HRV and breathing rate information back to the ASD child through the SAR's actuators (e.g., visual and auditory feedback). The feedback from bio-signals could increase the ASD child’s capacity to maximize HRV by learning to increase the size of heart rate changes in phase with breathing, promoting social skills and child-robot interaction.

Keywords: Social robot design dimensions, cognitive child-robot interaction, autism.

REFERENCES

[1] Farr, W., Yuill, N., Raffle, H. (2010). Social benefits of a tangible user interface for children with autistic spectrum conditions. Autism: Int J Res Pract, 14(3), 237–252.
[2] Matarić, M.J. (2005). The Role of Embodiment in Assistive Interactive Robotics for the Elderly.
[3] Özcan, B., Caligiore, D., Sperati, V., et al. (2016). Int J of Soc Robotics, 8, 471. doi:10.1007/s12369-016-0373-8
[4] Dautenhahn, K. (2004). Robots We Like to Live With?! – A Developmental Perspective on a Personalized, Life-Long Robot Companion. RO-MAN, pp. 17–22.
[5] Barco, A., et al. (2014). Engagement based on a customization of an iPod-LEGO robot for a long-term interaction for an educational purpose. ACM/IEEE HRI, pp. 124–125.

[6] Okita, S.Y. and Schwartz, D.L. (2006). Young children’s understanding of animacy and entertainment robots. IJHR, pp. 393–412.
[7] Broadbent, E., et al. (2013). Robots with Display Screens: A Robot with a More Humanlike Face Display Is Perceived To Have More Mind and a Better Personality. PLoS ONE, 8(8).
[8] Cabibihan, J.J., et al. (2013). Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism. IJSR, 5(4), 593–618.
[9] De Graaf, M.M.A. and Ben Allouch, S. (2016). The Influence of Prior Expectations of a Robot’s Lifelikeness on Users’ Intentions to Treat a Zoomorphic Robot as a Companion. IJSR.
[10] Cavoukian, A. (2011). 7 Foundational Principles of Privacy by Design.
[11] Appelhans, B.M., & Luecken, L.J. (2006). Heart rate variability as an index of regulated emotional responding. Review of General Psychology, 10(3), 229.
[12] Shahrestani, S., Stewart, E.M., Quintana, D.S., Hickie, I.B., & Guastella, A.J. (2015). Heart rate variability during adolescent and adult social interactions: A meta-analysis. Biological Psychology, 105, 43–50.
[13] Vaschillo, E., Lehrer, P., Rishe, N., & Konstantinov, M. (2002). Heart rate variability biofeedback as a method for assessing baroreflex function: A preliminary study of resonance in the cardiovascular system. Applied Psychophysiology and Biofeedback, 27(1), 1–27.
[14] Uddin, A.A., Morita, P.P., Tallevi, K., Armour, K., Li, J., Nolan, R.P., & Cafazzo, J.A. (2016). Development of a Wearable Cardiac Monitoring System for Behavioral Neurocardiac Training: A Usability Study. JMIR mHealth and uHealth, 4(2).