
Conference Proceedings New Friends 2015

The 1st international conference on social robots in therapy and education

October 22-23 2015, Almere, The Netherlands

Edited by Marcel Heerink and Michiel de Jong

Published by Windesheim Flevoland Hospitaaldreef 5 1315 RC Almere The Netherlands


Introduction

The proceedings of the international conference New Friends 2015 reflect the multidisciplinary nature of the conference theme, addressing the demand for expertise in both practice and research from a wide range of disciplines, such as psychology, nursing, occupational therapy, physiotherapy, AI, robotics and education.

The event featured keynotes by Vanessa Evers and Matthias Scheutz, oral and poster presentations (based on 48 accepted submissions), product and business demonstrations, competitions and practice-oriented workshops, covering:

• practitioners’ perspective of end users’ needs,
• good examples of trials, practice and intervention guidelines,
• interdisciplinary collaboration,
• innovations in robotics, therapy and education,
• theoretical studies and empirical research,
• legal, ethical, philosophical and social issues.

We welcomed 118 registered attendees, not including representatives from sponsoring companies and institutions, local co-organizers and student volunteers. This is quite respectable for a 1st conference and demonstrates the relevance of the conference theme and profile.

In recognition of this, we are proud to announce that this will be the first in a series: next year we hope to see you again at New Friends 2016 in Barcelona!

We thank the following people for making this possible with their contribution to this conference: Sytse Dugour, Wytse Miedema, Adam Hagman, Cristina Abad Moya, Adri Acero Montes, Atina Hrkac, Tom Ederveen, Vanessa Evers, Miquel Aranaz.

And we explicitly wish to express our gratitude to our sponsors: Robotdalen, Aisoy Robotics, Robin Robotics, OMFL, Gemeente Almere, Cinnovate, GWIA, M&I/Partners.

On behalf of the organizing committee,

Marcel Heerink

General chair


Organizing Committee

Marcel Heerink

Windesheim University of Applied Sciences Almere, The Netherlands

Bram Vanderborght Vrije Universiteit Brussel

Brussels, Belgium

Jordi Albo-Canals Tufts University

Boston, US

Alex Barco-Martelo LaSalle University

Barcelona, Spain and Almere, The Netherlands

Lars Asplund and Christine Gustafsson, Mälardalen University
Eskilstuna, Sweden

Claire Huijnen Zuyd University of Applied Sciences

Heerlen, The Netherlands

Joost Broekens Delft Technical University

Delft, The Netherlands

Program committee

Bram Vanderborght

Christine Gustafsson

Jordi Albo-Canals

Marcel Heerink

Rianne Jansens

Renee van den Heuvel

Local organization

Mary Verspaget

Brigitte Toes

Sjoerd de Vos

Juan Besselse

Wieke van Wijngaarden

Saskia van Oenen

Reviewers

Sandra Bedaf, Mohamed Bouri, Hoang-Long Cao, Mark Coeckelbergh, Cristina Costescu, Eduard Fosch Villaronga, Pablo Gomez Esteban, Michiel Joosse, Kitty Jurrius, Marcus Persson, Aaron Pica, Ramona Simut, Loek van der Heide, Saskia van Oenen, Cesar Vandevelde, Sjoerd de Vos, Charlotte Vissenberg, Yueh-Hsuan Weng, Francis Wyffels


Oral session papers

Education & children with special needs 1
Session chairs: Joost Broekens and Rosemarijn Looije

Rianne Jansens, Pedro Encarnação and Serenella Besio. LUDI: a Pan-European Network Addressing Technology to Support Play for Children with Disabilities (p. 6)

Jordi Albo-Canals, Carolina Yañez, Alex Barco, Cecilio Angulo and Marcel Heerink. Modelling Social Skills and Problem Solving Strategies used by Children with ASD through Cloud Connected Social Robots as Data Loggers: First Modelling Approach (p. 8)

Claire Huijnen, Monique Lexis and Luc de Witte. Matching Robot KASPAR To ASD Therapy And Educational Goals (p. 10)

Paul Baxter, Silviu Matu, Emmanuel Senft, Cristina Costescu, James Kennedy, Daniel David and Tony Belpaeme. Touchscreen-Mediated Child-Robot Interactions Applied to ASD Therapy (p. 12)

Alex Barco, Jordi Albo-Canals, Carles Garriga-Berga, Begoña Garcia-Zapirain and Álvaro Sánchez. LEGO robot with multitouch device connected to sensors and actuators for physical and cognitive rehabilitation in elderly people and kids with special needs (p. 14)

Dementia, eldercare & independent living
Session chairs: Christine Gustafsson and Lars Asplund

Marcus Persson. The impact of an Interactive Robotic Cat on Dementia Caregivers’ Psychosocial Work Environment – a Pilot Study (p. 16)

Sandra Bedaf and Luc de Witte. Acceptability Of A Service Robot Which Supports Independent Living Of Elderly People (p. 18)

Tomohiro Suzuki, Sachie Yamada, Takayuki Kanda and Tatsuya Nomura. Influence of Social Avoidance and Distress on People’s Preferences for Robots as Daily Life Communication Partners (p. 20)

Martina Heinemann, Meritxell Valenti Soler and Marcel Heerink. Is it real? Dealing with an insecure perception of a pet robot in dementia care (p. 22)

Education & children with special needs 2
Session chairs: Jordi Albo and Alex Barco

Frances Wijnen, Vicky Charisi, Daniel Davison, Jan van der Meij, Dennis Reidsma and Vanessa Evers. Inquiry learning with a social robot: can you explain that to me? (p. 24)

Jacqueline Kory Westlund, Leah Dickens, Sooyeon Jeong, Paul Harris, David Desteno and Cynthia Breazeal. A Comparison of Children Learning New Words from Robots, Tablets, & People (p. 26)

Rosemarijn Looije, Mark A. Neerincx and Johanna K. Peters. How do diabetic children react on a social robot during multiple sessions in a hospital? (p. 28)

Jacqueline Kory Westlund, Goren Gordon, Samuel Spaulding, Jin Joo Lee, Luke Plummer, Marayna Martinez, Madhurima Das and Cynthia Breazeal. Learning A Second Language with a Socially Assistive Robot (p. 30)

Ivana Kruijff-Korbayová, Elettra Oleari, Clara Pozzi, Francesca Sacchitelli, Anahita Bagherzadhalimi, Sara Bellini, Bernd Kiefer, Stefania Racioppa and Alberto Sanna. Let’s Be Friends: Perception of a Social Robotic Companion for children with T1DM (p. 32)

Human-robot relationships
Session chairs: Claire Huijnen and Renee van den Heuvel

Maartje de Graaf. The Ethics of Human-Robot Relationships (p. 34)

Albert De Beir, Hoang-Long Cao, Pablo Gomez Esteban, Greet Van de Perre and Bram Vanderborght. Enhancing Nao Expression of Emotions Using Pluggable Eyebrows (p. 36)

Hoang-Long Cao, Pablo Gomez Esteban, Albert De Beir, Greet Van de Perre, Ramona Simut, Dirk Lefeber and Bram Vanderborght. Toward a Platform-Independent Social Behavior Architecture for Multiple Therapeutic Scenarios (p. 38)

Eduard Fosch Villaronga and Jordi Albo-Canals. Boundaries in Play-based Cloud-companion-mediated Robotic Therapies: From Deception to Privacy Concerns (p. 40)

Posters session papers & late-breaking reports

Maria Vircikova, Gergely Magyar and Peter Sincak. Cloud-based Social Robot that Learns to Motivate Children as an Assistant in Back-Pain Therapy and as a Foreign-Language Tutor (p. 44)

Renée van den Heuvel, Monique Lexis and Luc de Witte. Possibilities Of The IROMEC Robot For Children With Severe Physical Disabilities (p. 46)

Jered Vroon, Jaebok Kim and Raphaël Koster. Robot Response Behaviors To Accommodate Hearing Problems (p. 48)

Roger Tilmans, Pablo Gómez Esteban, Hoang-Long Cao and Bram Vanderborght. Social and Autonomous Confabulation Architecture (p. 50)

Barbara Klein, Karin Dunkel, Sebastian Reutzel and Stefanie Selic. Suitability of a Telepresence Robot for Counseling on Home Modification and Independent Living (p. 52)

Roger Bemelmans and Luc de Witte. A Pilot Study On The Feasibility of Paro Interventions In Intramural Care For Intellectual Disabled Clients (p. 54)

Carolin Straßmann, Astrid Marieke Rosenthal-Von der Pütten and Nicole Krämer. NoAlien! Linguistic alignment with artificial entities in the context of second language acquisition (p. 56)

Beste Özcan, Gianluca Baldasarre, Maria Nicoletta Aliberti and Tania Moretta. Transitional Wearables Based on Bio-Signals to Improve Communication and Interaction of Children with Autism (p. 58)

Igor Zubrycki, Jaroslaw Turajczyk and Grzegorz Granosik. Roboterapia: an environment supporting therapists’ needs (p. 60)

Patrick Albo-Canals, Albert Valls, Vicens Casas, Olga Sans-Cope and Jordi Albo-Canals. AISOY Social Robot as a tool to learn how to code versus tangible and non-tangible approaches (p. 62)

Michael Anderson, Susan Anderson and Vincent Berenz. Ensuring Ethical Behavior from Autonomous Systems (p. 64)

Sally Grindsted Nielsen, Anja Christoffersen, Elizabeth Jochum and Zheng-Hua Tan. Robot future: using theatre to influence acceptance of care robots (p. 66)

Resheque Barua, Shimo Sraman and Marcel Heerink. Empathy, Compassion and Social Robots: an Approach from Buddhist Philosophy (p. 68)

Tecla S. Scholten, Charlotte Vissenberg and Marcel Heerink. Hygiene and the use of robotic animals: an exploration (p. 70)

Emelideth Valenzuela, Alex Barco and Jordi Albo-Canals. Learning Social Skills through LEGO-based Social Robots for Children with Autism Spectrum Disorder at CASPAN Center in Panama (p. 72)

Workshop papers: Bridging the Gaps between Different Worlds

A. Legal
Eduard Fosch Villaronga. Principles Involved in Care Robotics Legal Compliance (p. 76)
Marcello Ienca. Intelligent Assistive Technologies for Dementia: Social, Legal and Ethical Implications (p. 78)

B. Ethical
Elaine Sedenberg et al. Designing Therapeutic Robots for Privacy Preserving Systems, Ethical Research Practices, and Algorithmic Transparency (p. 80)
Rieks op den Akker. What do care robots reveal about technology? (p. 82)
Antonio Carnevale. ‘I tech care’: The responsibility to provide healthcare using robots (p. 84)

C. Social
Sofia Reppou et al. Robots and seniors: can they be friends? (p. 86)
Mark Coeckelbergh et al. Survey investigating ethical issues concerning Robot Enhanced Therapy for children with autism (p. 88)
Aaron Saiger. Accommodating Students with Disabilities Using Social Robots and Telepresence Platforms: Some Legal and Regulatory Dimensions (p. 90)

D. Practical
Jorge Gallego-Perez. An HRI study with elderly participants? Where’s the problem? (p. 92)
Jordi Albo-Canals. Toy robots vs Medical Device (p. 94)
Mohamed Bouri. Which Perspectives of Using Exoskeletons in Activities for Daily Living? (p. 96)

Videos and demos

Michael Anderson and Susan Anderson. Ensuring Ethical Behavior from Autonomous Systems (p. 100)
Peter van der Post, Robin Steffers, Aaron Pica, Robin Scheick and Marcel Heerink. Bonnie: Developing a Very Special Friend (p. 101)
Vito Mahalla, Peter van der Post, Alex Barco Martelo and Marcel Heerink. Remote Control Application for Therapeutic Use of a Social Robot (p. 103)
Tony Belpaeme, Paul Baxter, James Kennedy, Robin Read, Bernd Kiefer, Ivana Kruijff-Korbayová, Valentin Enescu, Georgios Patsis, Hichem Sahli, Bert Bierman, Olivier Blanson Henkemans, Rosemarijn Looije, Mark Neerincx, Raquel Ros Espinoza, Alexandre Coninx, Yiannis Demiris and Joachim de Greeff. Social Robots to Support Children with Diabetes: an Overview of the ALIZ-E Project (p. 104)

Oral session papers


LUDI: a Pan-European Network Addressing Technology to Support Play for Children with Disabilities

Rianne Jansens (a), Pedro Encarnação (b), and Serenella Besio (c)

(a) Zuyd University, the Netherlands, Occupational Therapy Department and Centre of Expertise for Innovative Care and Technology. LUDI Action Working Group 2 Vice Leader
(b) UCP - Católica Lisbon School of Business & Economics, Portugal. LUDI Action Vice Chair
(c) Università della Valle d’Aosta, Italy. LUDI Action Chair

Abstract: The right to play is enshrined in the United Nations Convention on the Rights of the Child as a consequence of its importance to overall child development. Children with disabilities are often deprived of this right due to functional limitations, the lack of supporting technologies, and social and cultural contexts in which play is frequently seen as secondary when compared to rehabilitation interventions. This paper presents the COST Action LUDI, a Pan-European network aiming at the recognition of the theme of play for children with disabilities as a multi- and trans-disciplinary research field to which the contribution of psycho-pedagogical sciences, health and rehabilitation sciences, humanities, assistive technologies and robotics, as well as the contribution of end-users’ organizations, is necessary to grant the right to play for children with disabilities.

Keywords: Play, children with disabilities, assistive technology, LUDI

INTRODUCTION

Play is the most prevalent activity in childhood. Although play is sometimes regarded as a leisure-only activity, there is a huge body of knowledge, going back to the 1950s, showing that play is the motor for child development [1,2,3]. Its importance is recognized by the United Nations, which establishes play as one of the rights of the child (Article 31 of the Convention on the Rights of the Child). With the current ubiquity of technology, it comes as no surprise that children today are very familiar with technological toys. Technological developments have thus influenced children’s occupations, namely play [4].

For children with disabilities, play is no less important than for typically developing children. For them, the use of technology may be challenging if accessibility issues were not taken into consideration in the design [5,6]. On the other hand, for many children with disabilities, (assistive) technology is the means to access play activities, and this has been addressed by many authors. For example, Cook et al. describe how robots can be used as assistive technologies for play, learning and cognitive development [7]. Cabibihan et al.’s review of social robots for children with autism spectrum disorders shows the opportunities created by robots to increase the autonomy of the child [8]. Children with disabilities have more possibilities for playing with the use of technology. Within the International Classification of Functioning, technology can expand the child’s health dimensions and environmental determinants of health. For example, Miller and Reid report that competence and self-efficacy increased in children with cerebral palsy engaging in a virtual reality play intervention [9]. Technology opens the doors to more play scenarios, playfulness can be more present, and it provides adults with opportunities to get in contact and to have meaningful time together [10,11].

Despite the scientifically recognized importance of play and the technology available, children with disabilities are often deprived of the right to play. Physical and/or cognitive impairments may prevent them from accessing play activities. Social and cultural contexts may also raise hurdles for children’s play. In fact, parents and caregivers frequently place play very low in the hierarchy of activities a child with disabilities should engage in, as something to be done only if there is some free time left after educational and rehabilitation commitments. In therapy, play is seldom considered a goal per se.

Using technology to support play sometimes meets doubts, resistance and concerns from professionals. For most rehabilitation professionals, technology in care or education was, and still is, not part of their education or continuous professional development [12]. As technological developments move fast, it is hard to keep pace with them. Some professionals fear that this evolution might reduce their therapeutic influence or even put their jobs at risk. Looking at the technology itself, many tools are still at the development stage, prototypes emerging from innovative projects, and thus not fully reliable and user friendly.

LUDI, A NETWORK IN THE FIELD OF RESEARCH AND INTERVENTION OF PLAY FOR CHILDREN WITH DISABILITIES

Many disciplines, like psychology, education, (rehabilitation) medicine, or engineering, have focused on the topic of play. However, a holistic view, encompassing all the different perspectives, is necessary to effectively grant the right to play for children with disabilities. This motivated the creation in 2014 of “LUDI – Play for Children with Disabilities”, a 4-year Action supported by the European Cooperation in Science and Technology (COST) framework (www.cost.eu). LUDI is a Pan-European network of researchers, scientists, practitioners, users and their families, including members from 27 European countries and from 5 international partner countries (www.cost.eu/COST_Actions/TDP/Actions/TD1309; www.ludi-network.eu). Its main goals are:

a) Collecting and systematizing all existing competences and skills: educational research, clinical initiatives, and the know-how of resource centres and users’ associations;

b) Developing new knowledge related to settings, tools and methodologies associated with the play of children with disabilities;

c) Disseminating the best practices emerging from the joint effort of researchers, practitioners and users.

A DATABASE OF TECHNOLOGY TO SUPPORT PLAY


One of the ultimate goals of LUDI is the recommendation of guidelines for the design and development of technology to support play for children with disabilities, and of methodologies to evaluate the usability, accessibility and effectiveness of that technology. As a first step towards this goal, a database of available technology to support play for children with disabilities, including methods for assessing usability, accessibility, and effectiveness, is being created. Clearly, given the existing number of technologies (e.g. many toy brands have new collections every six months), it is not possible to have a fully comprehensive database. Instead, the objective is to collect a vast number of examples that can inspire users and clinicians, elicit cooperation and foster discussion. For example, a parent will be able to retrieve from the database technologies available for his or her child of a particular age and with a particular disability, a researcher will be able to list robots that are being used to support play, and a clinician will be able to find assessment methods for an intervention with a particular technology. The database will be available from the LUDI webpage (www.ludi-network.eu) and will be open for everyone to contribute and consult.

CONCLUSIONS

Given the importance of play for child development, the challenges children with disabilities face in accessing play activities, and the fragmentation of research initiatives, often conducted within a particular scientific field framework, the LUDI COST Action aims at creating a multi- and trans-disciplinary research area that focuses on play (for play’s sake) for children with disabilities. LUDI, together with international organizations such as the International Play Association and the International Council for Children’s Play, will promote the cooperation between rehabilitation professionals, engineers, educators, psychologists, sociologists, users and their families, and all of those who are involved in the theme of play for children with disabilities.

By collating the state of the art and agreements about definitions of play, models, assessments, and interventions, a body of knowledge will be created supporting everyone who wants to stimulate the play of children with disabilities at home, in schools, in daycare centers, or in public spaces.

Technology, as an enabler for children’s play, will have a central role in LUDI.

REFERENCES

1. J. Piaget, The Construction of Reality in the Child. Great Britain: Routledge, 1954.
2. J. Huizinga, Homo Ludens: A Study of the Play-Element in Culture. Boston: Beacon Press, 1955.
3. L. Vygotsky, Mind in Society: The Development of Higher Psychological Processes. Cambridge: Harvard University Press, 1978.
4. J.A. Davis, H.J. Polatajko & C.A. Ruud, Children's occupations in context: The influence of history. Journal of Occupational Science, 9(2), 54-64 (2002).
5. T. Heah, T. Case, B. McGuire & M. Law, Successful participation: the lived experience among children with disabilities. Canadian Journal of Occupational Therapy, 74(1), 38-47 (2007).
6. J. Copley & J. Ziviani, Barriers to the use of assistive technology for children with multiple disabilities. Occupational Therapy International, 11(4), 229-243 (2004).
7. A.M. Cook, P. Encarnação & K. Adams, Robots: Assistive technologies for play, learning and cognitive development. Technology and Disability, 22(3), 127-145 (2010).
8. J.-J. Cabibihan, H. Javed, M.J. Ang & S.M. Aljunied, Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism. International Journal of Social Robotics, 5, 593-618 (2013).
9. S. Miller & D. Reid, Doing Play: Competency, Control and Expression. CyberPsychology & Behavior, 6, 623-632 (2003).
10. A.M.R. Rincon, K.D. Adams, J. Magill-Evans & A.M. Cook, Changes in playfulness with a robotic intervention in a child with Cerebral Palsy. Assistive Technology: From Research to Practice, 161-166 (2013).
11. R. Bemelmans, Paro bij Pergamijn: Ondersteuning in de zorg voor mensen met een (meervoudige) verstandelijke handicap [Paro at Pergamijn: Support in the care of people with (multiple) intellectual disabilities]. Heerlen: EIZT, Zuyd University (2015).
12. L. de Witte, Technologie in de zorg: Wat moet je daarmee in de zorgopleidingen? [Technology in care: What should care curricula do with it?] Vakblad voor opleiders in het gezondheidszorgonderwijs, 7 (Dec 2013).


Modelling Social Skills and Problem Solving Strategies used by Children with ASD through Cloud Connected Social Robots as Data Loggers: First Modelling Approach

Jordi Albo-Canals (a), Carolina Yañez (b), Alex Barco (c), Cecilio Angulo (d) and Marcel Heerink (e)

(a) Tufts University; (b) Universidad de Chile; (c) La Salle, Ramon Llull University; (d) Universitat Politècnica de Catalunya; (e) Windesheim Flevoland University

Abstract— In this paper, we present a setup of cloud-connected social robots to measure and model the effect of LEGO Engineering and its collaborative nature on the development of social skills in children with Autism Spectrum Disorder (ASD). Here we introduce the first approach to the modelling process designed.

Keywords— Modelling, ASD, Autism, Social Robots, Cloud, Data Logger

1. INTRODUCTION

There is a growing body of research centered around robotics and autism that uses a social robot as a data logger. Previous research includes children with ASD working with humanoid robots (e.g., NAO or KASPAR), working together to build robots [1], [2], and talking to and mimicking a robot [3]. In this paper we also present a cloud-based system to speed up the analysis of how therapies based on working in groups and building LEGO change these children's social skills, social networks, and cognitive skills.

The project consists of an 8-week study (one two-hour session per week). The sessions have the format of a workshop on building LEGO Robotics with a Robot Companion (NAO robot, AISOY robot, or SAMSUNG robot) that sits on the table as a helper and social mediator and reminds the kids of the time schedule.

During the sessions, children sit at a table with a laptop to program the LEGO robot and a complete LEGO MINDSTORMS EV3 set (the LEGO robot). Children work in groups of two selected at random, and they keep the same group for all sessions.

In each of these sessions, we collect information that allows us to create a reliable model of how these children socialise with each other and with the adults in the classroom, and of how they solve engineering problems (see Figure 1). While the social skills of children with ASD have been modelled and studied for a long time, their engineering thinking skills have not been approached by the community. Previous studies showed that only people in the field of science and technology were trained in engineering skills. However, it has been proved that engineering skills are needed in everyday life, bringing clear benefits to the quality of living of those children who can acquire and use them [4], [5]. Do children with ASD follow the same strategies as neurotypical children? How do they deal with this problem? The model obtained should give an answer to these two questions and show whether we can redesign their educational and training system [6]. Furthermore, [7] claims that there is a connection between engineering thinking and human sensitivity that improves quality of life.

Fig. 1. Schematic of how data flow through the cloud until the model is obtained.

2. MODELLING PROCESS

The modelling process is divided into two paths according to the two outcomes mentioned in the introduction of this paper. On the one hand we model how children with ASD deal with the social situation, and on the other hand we model how they solve engineering problems (see Figure 2).

Through the video observation, the quantitative data obtained from the interactive systems, and the descriptors obtained after processing the information through the machine learning algorithm, we can identify the interactive behaviors and their quality in terms of intensity and duration. The system is supposed to identify interactive behaviors and to measure the amount of social engagement children are experiencing.

Fig. 2. Modelling description process

A. Human Experts

Through video observation (video coding) and questionnaires to the students, parents, and teacher, we are going to collect qualitative data. Through a web-based tactile interface to interact with the robot and the video recordings we are going to extract quantitative data. The qualitative data that we are going to measure is detailed in the VIDEO CODING document and the attached questionnaires. The quantitative data that we are going to analyze from the touch screens includes the number of times they are using the touch screen, what they are touching, and at what time during the session. The quantitative data we are going to obtain from the videos are (see the sketch after this list):

• the number of times and how long every kid is talking during all sessions;
• the distance between kids during all sessions;
• eye tracking and facial states during all sessions.
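As an illustration of how such quantitative descriptors could be derived from the logged data, the sketch below computes per-child touch counts and talking time for one session from a hypothetical event log. The event format, field names and values are assumptions made for illustration; they are not the project's actual logging format.

from collections import defaultdict

# Hypothetical event records: (session, child_id, kind, timestamp_s, detail)
events = [
    (1, "child_A", "touch", 12.5, "drag_block"),
    (1, "child_A", "speech", 20.0, 4.2),   # detail = utterance length in seconds
    (1, "child_B", "touch", 31.0, "tap_menu"),
    (1, "child_B", "speech", 40.0, 2.7),
]

def touch_counts(events, session):
    """Number of touchscreen interactions per child in one session."""
    counts = defaultdict(int)
    for sess, child, kind, _, _ in events:
        if sess == session and kind == "touch":
            counts[child] += 1
    return dict(counts)

def talk_time(events, session):
    """Total talking time (in seconds) per child in one session."""
    totals = defaultdict(float)
    for sess, child, kind, _, detail in events:
        if sess == session and kind == "speech":
            totals[child] += detail
    return dict(totals)

print(touch_counts(events, 1))  # {'child_A': 1, 'child_B': 1}
print(talk_time(events, 1))     # {'child_A': 4.2, 'child_B': 2.7}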

B. Machine Learning as Descriptor Mining and Rule Extractor

The main purpose of the machine learning algorithm is to classify all the information in order to extract a set of rules that will define the model. In this project, we have data of two different kinds: qualitative and quantitative.

1) Modelling quantitative data: Similar to [8], we have Children Assistant Agents (CAA) placed in the cloud system and connected to their individual social robots. All CAAs are linked to an Information Management Agent (IMA) that receives all information from the CAAs to build the model. Because the model is scalable to different cloud sites, we can have multiple IMAs (a minimal sketch of this layout follows below).
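The sketch below gives a minimal, purely illustrative picture of this agent layout: each Children Assistant Agent (CAA) forwards the events of its own robot and child to an Information Management Agent (IMA) that pools them for modelling. The class and method names are assumptions; the paper does not specify an implementation.

class ChildrenAssistantAgent:
    """CAA: cloud-side agent paired with one social robot / one child."""
    def __init__(self, child_id, ima):
        self.child_id = child_id
        self.ima = ima

    def log_event(self, kind, value):
        # Forward each robot or touchscreen event to the IMA, tagged with the child id.
        self.ima.receive(self.child_id, kind, value)


class InformationManagementAgent:
    """IMA: collects events from all CAAs to build the models."""
    def __init__(self):
        self.records = []

    def receive(self, child_id, kind, value):
        self.records.append({"child": child_id, "kind": kind, "value": value})

    def dataset(self):
        # Hand the pooled records to the learning stage (e.g. a rule-based classifier).
        return self.records


ima = InformationManagementAgent()
caa = ChildrenAssistantAgent("child_A", ima)
caa.log_event("touch", "drag_block")
caa.log_event("speech_seconds", 4.2)
print(ima.dataset())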

Because we are searching for two different models, the IMA's functionality is based on two strategies:

• The rules of the social skills model are more predictable, so we build on UCS [9], an accuracy-based Michigan-style LCS that takes advantage of knowing the class of the training instances. UCS evolves a population of rule-based classifiers. Once the quality of the rules is proved, the model can be extracted from the collection of rules and each classifier.
• For the engineering skills model we have a greater level of uncertainty, so we decided to first use a system to classify and then a system to extract rules [10], [11].

2) Modelling qualitative data: We have used multicriteria decision-making systems, which form the second part of the modelling, to handle the assessment models obtained from different experts [12].

3. RESULTS AND CONCLUSIONS

Can the model be used with only the data obtained from the social robot as data logger? Because we had only four children in all sessions during the first workshop, this is a hard hypothesis to answer. Results showed that the quantitative data we obtained was potentially good. However, because we used different robotic platforms (AISOY, NAO, and a custom robot), and because the number of children was small, the results were inconsistent. In any case, we tested the technology, and it showed us that we need to mix the qualitative data with the quantitative data in a more integrated way.

We expect to get a consistent model as long as we use only one platform with more children.

ACKNOWLEDGMENT

Our thanks to the LEGO Foundation and TTT Outreach, Think Tank Team, Samsung Research America for funding the project, and to the CASPAN Center in Panama for covering part of the research project there.

REFERENCES

[1] Wainer, Joshua, et al. "The effectiveness of using a robotics class to foster collaboration among groups of children with autism in an exploratory study." Personal and Ubiquitous Computing 14.5 (2010): 445-455.
[2] Wainer, Joshua, et al. "Collaborating with Kaspar: Using an autonomous humanoid robot to foster cooperative dyadic play among children with autism." Humanoid Robots (Humanoids), 2010 10th IEEE-RAS International Conference on. IEEE, 2010.
[3] Diehl, Joshua J., et al. "The clinical use of robots for individuals with autism spectrum disorders: A critical review." Research in Autism Spectrum Disorders 6.1 (2012): 249-262.
[4] Beder, Sharon. "Beyond technicalities: Expanding engineering thinking." Journal of Professional Issues in Engineering Education and Practice 125.1 (1999): 12-18.
[5] Brophy, Sean, et al. "Advancing engineering education in P-12 classrooms." Journal of Engineering Education 97.3 (2008): 369-387.
[6] Gardner, Howard. The Unschooled Mind: How Children Think and How Schools Should Teach. Basic Books, 2011.
[7] Saarinen, Esa, and Raimo P. Hämäläinen. "Systems intelligence: Connecting engineering thinking with human sensitivity." Systems Intelligence in Leadership and Everyday Life (2007): 51-78.
[8] Navarro, Joan, et al. "A cloud robotics architecture to foster individual child partnership in medical facilities." (2013).
[9] Orriols-Puig, Albert, and Ester Bernadó-Mansilla. "Revisiting UCS: Description, fitness sharing, and comparison with XCS." Learning Classifier Systems. Springer Berlin Heidelberg, 2008. 96-116.
[10] Diederich, Joachim, ed. Rule Extraction from Support Vector Machines. Vol. 80. Springer Science & Business Media, 2008.
[11] Núñez, Haydemar, Cecilio Angulo, and Andreu Català. "Rule extraction from support vector machines." ESANN. 2002.
[12] J. Nguyen, G. Sanchez-Hernandez, N. Agell and C. Angulo. "InsERT, the Inspirational Expert Recommender Tool." IEEE International Conference on Fuzzy Systems, FUZZ-IEEE 2015 [Accepted].


Matching Robot KASPAR To ASD Therapy And Educational Goals

Claire Huijnen (a,b), Monique Lexis (a) and Luc de Witte (a,b)

(a) Zuyd University of Applied Sciences, The Netherlands (Expertise Centre for Technology in Care)
(b) Maastricht University, The Netherlands (CAPHRI, School for Public Health and Primary Care)

Abstract. The aim of this study was to identify the potential added value of the therapy robot KASPAR for the therapy or education goals of children with autism spectrum disorder (ASD).

Methods: After conducting focus group sessions, an online questionnaire was adopted to elicit the expectations of 54 multidisciplinary ASD practitioners about the therapy and/or educational goals that KASPAR can contribute to.

Results indicate that practitioners expect KASPAR to bring added value to ASD objectives in domains such as communication, social/interpersonal interaction and relations, play, emotional wellbeing and preschool skills.

Conclusions: Practitioners are convinced that KASPAR can be useful in interventions for a broad range of therapy and education goals for children with an autism spectrum disorder.

Keywords: therapy, robot, children, ASD, autism, KASPAR, intervention, objectives

INTRODUCTION

Interactive technology, and robots in particular, can contribute meaningfully to interventions used in both therapy and education for children with autism spectrum disorder (ASD). Robots possess a number of characteristics (e.g. simplicity, predictability, embodiment, interactivity) and can adopt various roles in therapy that can be valuable assets in therapy and/or education settings for (some) children with ASD [1,2]. Children are reported to enjoy interaction with a robot more, to show more communication, initiative or proactivity, and to learn more quickly and more pleasantly than with a human counterpart or other interventions. Moreover, robotic interventions might be well equipped to answer this population’s multidimensional, heterogeneous and individualized demands for support [2]. ASD manifests itself in many different forms and severities and there is not one best therapeutic approach for all: people need different support, and what is beneficial for one person might harm another [3]. Robots allow for a personalized and individualized approach.

However, in order for socially interactive robots to actually make a difference to the lives of children with ASD and their carers, they have to find their way out of case studies with ‘standalone’ robots in robotics labs and into the children’s therapy and/or education environments as part of interventions. Being effective in eliciting a certain target behaviour of a particular child will not automatically ensure effective clinical implementation in therapy settings [4]. Robot interventions need to be robust and easily targeted to the children at hand [4]. Children have to enjoy interacting with a robot, and practitioners need to consider the robot a desirable intervention in their day-to-day care delivery work. As formulated in [3], socially assistive robots shall “balance goal-oriented treatment with a nonthreatening but engaging and productive interaction”. To date, unfortunately, only limited emphasis has been devoted to how robots can best be integrated into therapeutic protocols and therapy sessions [2]. Many implementation questions still remain unanswered.

One socially interactive robot that has been used extensively in studies with children with ASD is KASPAR [5]. In the current study we focus on this semi-autonomous humanoid child-size robot (Figure 1).

To increase the likelihood of adoption by professionals in practice, the aim of this study was to identify the potential added value of the therapy robot KASPAR in the therapy or education of children with autism spectrum disorder. To what therapy and educational goals can KASPAR contribute according to professionals?

Figure 1. Therapy robot KASPAR

METHODS

Nine focus group sessions with ASD practitioners (n=53) were conducted to create an overview of therapy and educational objectives that are relevant for children with ASD. Participants saw both a video and a live demo of KASPAR. This overview was the basis for the items in an online questionnaire. The goal of the questionnaire was to match KASPAR to these ASD objectives. Descriptive analysis was performed on the data obtained from 54 respondents. All respondents are experts in the area of therapy or education for children with ASD and work for, e.g., special needs schools, youth care organizations, medical day care centres or centres for orthopedagogical treatment.
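As a concrete illustration of the descriptive analysis, the sketch below computes, per objective, the percentage of respondents who expect a meaningful role for KASPAR. The response encoding and the example data are assumptions made for illustration; they are not the actual questionnaire or its results.

# Hypothetical responses: for each respondent, the set of objectives for which
# they expect a meaningful role for KASPAR (illustrative data only).
responses = [
    {"Imitation in play", "Making contact"},
    {"Imitation in play", "Turn taking"},
    {"Making contact"},
]

objectives = ["Imitation in play", "Making contact", "Turn taking"]

def expected_role_percentages(responses, objectives):
    """Percentage of respondents expecting a role for KASPAR, per objective."""
    n = len(responses)
    return {obj: round(100 * sum(obj in r for r in responses) / n)
            for obj in objectives}

print(expected_role_percentages(responses, objectives))
# e.g. {'Imitation in play': 67, 'Making contact': 67, 'Turn taking': 33}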

RESULTS

Main results indicate that a (large) majority of ASD practitioners expect a meaningful role for KASPAR in several objectives in domains related to communication, social/interpersonal interaction and relations, play, emotional wellbeing and preschool skills, but also in other areas (see Figure 2). In all of these domains, a number of objectives have been formulated.

Figure 2. Impressions of role for KASPAR

Table 1 shows the top 10 ASD objectives (with International Classification of Functioning, Disability and Health for Children and Youth codes [6]) where most practitioners expect a meaningful role for KASPAR.

Table 1. Top 10 objectives with an expected role for KASPAR (percentage of respondents).

Imitation in play (d130): 93%
Making contact (d3): 89%
Imitation in social/interpersonal interaction and relations (d130): 85%
Orientation to listen (d115): 83%
Turn taking (behaviour) (d720): 83%
Social routines (greet, say goodbye, introduce) (d72): 81%
Attention (b140): 80%
Learn a new form of communication (d3): 76%
Talk – use verbal abilities (d330): 69%
Train or practice skills (d155): 65%
Pose a question / ask for help (d815): 65%
Follow up instructions (d3102): 65%

Table 2 shows the top 10 objectives where KASPAR is unlikely to be able to contribute.

Table 2. Top 10 objectives with no expected role for KASPAR (percentage of respondents).

Conflict management (d175): 44%
Balance and equilibrium (b235): 41%
Strengthening of muscles (b7306): 39%
Distinguish main from minor issues (d198): 39%
Respect / value others (or things) (d71): 37%
Potty training (d53): 35%
Domestic skills (d6): 35%
Problem solving skills (d175): 35%
Negotiate about rules (d8808): 33%
Understand what the body is “saying” (b2): 33%

CONCLUSIONS

Practitioners expect that KASPAR can meaningfully contribute to a broad range of objectives for children with autism spectrum disorder. These results are in line with other research in the area of robot-assisted therapy for children with ASD. Studies often focus on social communication and social skills such as turn-taking, joint attention and collaborative play [1]. Interestingly, this work shows that many other ASD objectives – which might be less obvious to robot developers and are less explored by current robotic initiatives – are also worthwhile candidates for developing robotic interventions. The next step will be to co-create KASPAR interventions (based on these findings) that will be tested and used by ASD practitioners in (daily) care and/or therapy situations with children with ASD.

ACKNOWLEDGEMENTS

The authors sincerely thank our beloved friend and colleague Gert Jan Gelderblom for his highly appreciated and valuable devotion to this work and to the entire domain of (robot-)assisted technologies for people in need of support.

REFERENCES

1. Cabibihan, J.-J., Javed, H., Ang Jr., M. & Aljunied, S. Why Robots? A Survey on the Roles and Benefits of Social Robots in the Therapy of Children with Autism. Int. J. Soc. Robot. 5, 593–618 (2013).
2. Diehl, J. J., Schmitt, L. M., Villano, M. & Crowell, C. R. The Clinical Use of Robots for Individuals with Autism Spectrum Disorders: A Critical Review. Res. Autism Spectr. Disord. 6, 249–262 (2012).
3. Scassellati, B., Admoni, H. & Matarić, M. Robots for use in autism research. Annu. Rev. Biomed. Eng. 14, 275–294 (2012).
4. Huskens, B., Verschuur, R., Gillesen, J., Didden, R. & Barakova, E. Promoting question-asking in school-aged children with autism spectrum disorders: effectiveness of a robot intervention compared to a human-trainer intervention. Dev. Neurorehabil. 16, 345–356 (2013).
5. Wainer, J., Robins, B., Amirabdollahian, F. & Dautenhahn, K. Using the Humanoid Robot KASPAR to Autonomously Play Triadic Games and Facilitate Collaborative Play Among Children With Autism. IEEE Trans. Auton. Ment. Dev. 6, 183–199 (2014).
6. World Health Organization. International Classification of Functioning, Disability, and Health: Children & Youth Version: ICF-CY. (World Health Organization, 2007).


Touchscreen-Mediated Child-Robot Interactions Applied to ASD Therapy*

Paul Baxter (1), Silviu Matu (2), Emmanuel Senft (1), Cristina Costescu (2), James Kennedy (1), Daniel David (2) and Tony Belpaeme (1)

(1) Centre for Robotics and Neural Systems, The Cognition Institute, Plymouth University, U.K.
(2) Department of Clinical Psychology and Psychotherapy, Babes-Bolyai University, Romania

Abstract. Robots are finding increasing application in the domain of ASD therapy as they provide a number of advantageous properties such as replicability and controllable expressivity. In this abstract we introduce a role for touchscreens that act as mediating devices in therapeutic robot-child interactions. Informed by extensive work with neurotypical children in educational contexts, an initial study using a touchscreen mediator in support of robot-assisted ASD therapy was conducted to examine the feasibility of this approach, in so doing demonstrating how this application provides a number of technical and potentially therapeutic advantages.

Keywords: ASD, Robot-Assisted Therapy, Sandtray

INTRODUCTION

The application of robots to aid in the therapy of children with Autistic Spectrum Disorders (ASD) has become increasingly established [1], [2], with evidence suggesting that it can provide beneficial outcomes for the children [3]. In addition to this, recent efforts have emphasised providing an increasing degree of autonomy for the robot [4].

Providing such autonomous behaviour in interaction contexts is a challenging task, with sensory and motor limitations imposing a number of constraints. In our previous work, we have developed a methodology that makes use of a touchscreen mediator between children and robots to overcome a number of these difficulties: the Sandtray [5]. In this setup, a child and a robot engage in a collaborative task that is provided on the touchscreen (e.g. sorting images into categories). The Sandtray has been successfully applied to a range of neurotypical child-robot interaction studies in various contexts, for example behavioural alignment [6], education [7], and others. As the Sandtray was inspired by the therapeutic intervention of sandplay (which has proposed advantages for children with ASD [8]), we now seek to apply this same methodology to robot-assisted ASD therapy.

Touchscreens (without the robot) have found previous applications in this domain [9]. For example, a touchscreen has been used to enforce collaborative activity between pairs of children with ASD, resulting in an increase in coordination and negotiation behaviours [10], a finding supported elsewhere [11]. Furthermore, there have been attempts to enable sandplay therapy-like interactions with touchscreens [12], although our approach differs in both application context and involvement of the robot. These studies indicate the suitability of using touchscreens for children with ASD.

There are a number of advantages afforded by the use of such a mediating touchscreen in HRI. Firstly, it provides a shared space for collaboration that does not require complex manual dexterity for either the child or the robot; indeed it provides the same affordances for both interactants (pointing and dragging). Secondly, it reduces the sensory processing load (vision processing) on the robot, since information on screen-oriented activity by the child can be obtained directly from the touchscreen. Thirdly, it provides a straightforward means of changing the task (or more broadly the interaction context) by just changing the images displayed on the screen: for instance, a sorting task can be appropriate for domains as diverse as mathematics and nutrition just by changing the pictures displayed.

The aim of this contribution is to motivate and illustrate how such touchscreen mediators can specifically serve as useful tools in the domain of robot-assisted therapy by first describing an application currently in progress, and then discussing the opportunities and challenges for the future.

*This work was supported by the EU FP7 project DREAM (grant number 611391, http://dream2020.eu/).

Fig. 1. Indicative setup and use of touchscreen for child-robot therapeutic interaction: the robot is controlled by a wizard, and the mediator provides input to the interaction if needed (not to scale; positions are indicative only).

APPLICATION CASE STUDY: TURN-TAKING

An initial application to ASD therapy has been implemented and evaluated. Turn-taking is an important social skill that is used as part of therapeutic interventions [13]. We have created an emotion image categorisation task (using sad and happy faces) on the Sandtray for a child and a Nao robot to play, with robot verbal behaviour used to encourage turn-taking behaviours. For this study, the robot was explicitly remote controlled (wizarded) by a remote operator (fig. 1).

With a four-year-old girl with ASD, six interaction sessions with the Robot-Sandtray turn-taking task were conducted over a period of four weeks. Other robot-based therapy activities were conducted at a separate time. Each interaction had a mean length of 11:06 mins (sd 5:03 mins).


Fig. 2. (Top) Sample data from the sixth child-robot Sandtray turn-taking interaction session. The feedback was employed to encourage the child to move and to give them feedback. Orange circles indicate robot encouragements for the child to take a turn. (Bottom) Trends over six sessions, showing the change in delay between robot prompt and the child moving, and the mean number of prompts per child move (with 95% CI).

Since interaction data can be captured through the touchscreen, it is possible to retrospectively examine the events that occurred and their timing. Considering the relationship between robot encouragement and child moves in a single interaction (e.g. fig. 2, top), the data suggest that both the number of robot encouragement instances required before the child made a move and the delay between suggestions and actual moves increase over time (fig. 2, bottom). A clinical explanation for this relationship is not proposed here, although the ideal behaviour in this context is a turn-taking interaction with the robot, without necessarily requiring explicit prompting. What can be noted, though, is that data such as these provide some insight into the interaction between the child and the robot over time.
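The two trend measures shown in Fig. 2 (the delay between a robot prompt and the child's move, and the mean number of prompts per child move) can be computed directly from the time-stamped touchscreen log. The sketch below is one hedged way of doing so; the event representation and the example values are assumptions for illustration, not the study's actual data or code.

def session_metrics(events):
    """events: list of (timestamp_s, kind) with kind in {"prompt", "move"}.
    Returns the mean prompt-to-move delay and the mean prompts per child move."""
    delays, prompts_per_move = [], []
    pending_prompts = []  # timestamps of prompts issued since the last move
    for t, kind in sorted(events):
        if kind == "prompt":
            pending_prompts.append(t)
        elif kind == "move" and pending_prompts:
            delays.append(t - pending_prompts[0])       # delay from the first prompt
            prompts_per_move.append(len(pending_prompts))
            pending_prompts = []
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(delays), mean(prompts_per_move)

# Illustrative log: two prompts before the first move, one before the second.
log = [(0.0, "prompt"), (3.0, "prompt"), (5.0, "move"), (9.0, "prompt"), (10.5, "move")]
print(session_metrics(log))  # (3.25, 1.5)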

DISCUSSION AND OPEN QUESTIONS

The examination and use of touchscreen-derived information has two benefits. Firstly, it may come to constitute an additional source of information for the therapist to aid in diagnosis or inform future therapy, with additional processing making aspects of emotion available, for example [14]. The extent to which this is clinically useful is an open question that requires investigation. It should however be noted that we do not suggest that such data can replace traditional diagnosis information, rather that it can provide supplemental information. It should be further noted that the touchscreen-derived information alone is likely to be insufficient to provide a complete characterisation of the child's behaviour.

Secondly, since the information captured by the touchscreen is directly accessible to the robot system, it can be used by the robot to adapt its behaviour to the specific circumstances of an individual child in individual interactions, e.g. [6]. In the case of autonomous robot behaviour, such a source of information that does not require the overhead of complex visual or audio processing is a significant benefit.

Extensive previous work has been conducted with this touchscreen-mediated interaction between (neurotypical) children and robots. While this has shown that the touchscreen effectively constrains the content of the interaction (thus facilitating robot autonomous behaviour) [15], it is an open question as to whether a similar effect (such as helping to maintain focus on the interaction) is observable for children with ASD, or over what time scales such an effect may be manifested.

To conclude, we have presented data from an example set of interactions between a child with ASD and a robot in the context of the Sandtray. This provides an illustration of the type of data that is readily available through the use of the touchscreen mediation technology. While further development and data collection is required (and is ongoing), we suggest that the use of touchscreens as mediators for child-robot interactions in the context of ASD therapy provides benefits in terms of behaviour characterisation and technical feasibility that should be further taken advantage of.

REFERENCES

1. B. Robins et al., "Robotic assistants in therapy and education of children with autism: can a small humanoid robot help encourage social interaction skills?" Universal Access in the Information Society, 4(2): 105–120, 2005.
2. B. Scassellati et al., "Robots for use in autism research," Annual Review of Biomedical Engineering, 14: 275–294, 2012.
3. C. A. Costescu et al., "The Effects of Robot-Enhanced Psychotherapy: A Meta-Analysis," Review of General Psychology, 18(2): 127–136, 2014.
4. S. Thill et al., "Robot-assisted therapy for autism spectrum disorders with (partially) autonomous control: Challenges and outlook," Paladyn, 3(4): 209–217, 2013.
5. P. Baxter et al., "A Touchscreen-Based 'Sandtray' to Facilitate, Mediate and Contextualise Human-Robot Social Interaction," in 7th HRI Conference. Boston, MA, U.S.A.: IEEE Press, 2012, pp. 105–106.
6. P. Baxter et al., "Cognitive architecture for human-robot interaction: Towards behavioural alignment," Biologically Inspired Cognitive Architectures, 6: 30–39, 2013.
7. J. Kennedy et al., "The Robot Who Tried Too Hard: Social Behaviour of a Robot Tutor Can Negatively Affect Child Learning," in 10th HRI Conference. Portland, Oregon, USA: ACM Press, 2015, pp. 67–74.
8. L. Lu et al., "Stimulating creative play in children with autism through sandplay," The Arts in Psychotherapy, 37: 56–64, 2010.
9. W. Chen, "Multitouch Tabletop Technology for People with Autism Spectrum Disorder: A Review of the Literature," Procedia Computer Science, 14: 198–207, 2012.
10. A. Battocchi et al., "Collaborative puzzle game: a tabletop interface for fostering collaborative skills in children with autism spectrum disorders," Journal of Assistive Technologies, 4(1): 4–13, 2010.
11. G. F. Mireya Silva et al., "Exploring collaboration patterns in a multitouch game to encourage social interaction and collaboration among users with autism spectrum disorder," Computer Supported Cooperative Work (CSCW), 24(2-3): 149–175, 2015.
12. M. Hancock et al., "Supporting sandtray therapy on an interactive tabletop," 28th CHI Conference, pp. 21–33, 2010.
13. C. A. Pop et al., "Enhancing play skills, engagement and social skills in a play task in ASD children by using robot-based interventions. A pilot study," Interaction Studies, 15(2): 292–320, 2014.
14. Y. Gao et al., "What Does Touch Tell Us about Emotions in Touchscreen-Based Gameplay?" ACM Transactions on Computer-Human Interaction, 19(4): 1–30, 2012.
15. J. Kennedy et al., "Constraining Content in Mediated Unstructured Social Interactions: Studies in the Wild," in 5th AFFINE Workshop at ACII 2013. Geneva, Switzerland: IEEE Press, 2013, pp. 728–733.


Multitouch-device-based iPod-LEGO Social Robot for Physical and Cognitive Rehabilitation in Children with Special Needs and Elderly People

Alex Barco (a), Jordi Albo-Canals (b), Carles Garriga-Berga (a), Begoña García-Zapirain (c) and Álvaro Sánchez (d)

(a) La Salle - Ramon Llull University, Barcelona, Spain
(b) Tufts University, Boston, USA
(c) University of Deusto, Bilbao, Spain
(d) University of Comillas, Madrid, Spain

Abstract— This paper describes a robotic platform based on LEGO, combined wirelessly with a multitouch device, to perform cognitive or physical rehabilitation in either children with special needs or elderly people. Based on previous studies, we propose technical improvements to the iPod-LEGO robot. Promising results are also presented regarding the effectiveness of these treatments using our robotic platform.

Keywords— Social robot, cognitive rehabilitation, children, elderly people

1. INTRODUCTIONThis article is about a robotic platform as an enhancing

tool for therapeutical purposes either in children with specialneeds or elderly people. Robotics is a multidisciplinaryscientific tool which motivates and stimulates learning inchildren [1]. A key point of robotics is the ability to adapt toany kind of activity while being the perfect device for remotemonitoring. Robots can perform therapeutic and companionfunctions simultaneously [2], becoming an extension of thetherapist. In recent years there is an emergence of innovativetechnologies for cognitive rehabilitation like computerizedrehabilitation programs, virtual reality, remote rehabilitationand robotics [2]. Other studies [3] indicate that humans prefera real robot to a virtual version in one on one interactionsprecisely because their physical nature evokes a higher senseof presence in the user, making them more trustworthyand engaging. Robotics is itself something that is easy tobe accepted by people, also, as a tool can contribute tocollaborative work, adapting the level of the sessions accord-ing to the childrens educational performance [4]. Besides,robots can support therapists collecting data that can beuseful to better evaluate and monitor the level of successacquired during therapeutical activity. In the last decenniarobots have been used effectively in therapy and educationalinterventions with primary school children. For example,they have been used in therapy and educational interventions:with autistic children [5], with children with motor andphysical impairments [6] and longterm hospitalized children[7].

This paper describes the robot used for cognitive and physical rehabilitation in children with special needs and elderly people and its technical improvements.

2. PREVIOUS WORK

Previous studies completed by the team composed of engineers from La Salle Engineering School (Ramon Llull University), the University of Deusto, and the University of Comillas have shown that robot features and activities can improve physical and cognitive performance based on interaction with the LEGO NXT robot through a multitouch device connected to sensors and actuators. During the first stage of the project, carried out during 2013, the three participating universities developed the software for the multitouch device (iPod 4G) and accomplished the wired communication between the iPod and the LEGO NXT robot (see Fig. 1) through an electronic device (Teensy 2.0) [8]. Team La Salle used this robotic platform during 2013 to assess the effectiveness of rehabilitation treatment in children with TBI in a long-term interaction. Team La Salle also used it to show that the drop-out rate in children is lower in the robot group than in a treatment directed to parents, due to the engagement created by the robot. Team Deusto used it for cognitive therapies associated with memory and mathematical problem solving in elderly people [9], and as a caregiver and social assistant robot that supports the elderly in performing physical and mental activities to maintain healthy life habits and, ultimately, improve their quality of life [10]. In the following lines the main objectives for the second stage are explained.

3. OBJECTIVES

Fig. 1. First stage on the top: iPod-LEGO NXT wired connection via USB through Teensy 2.0 device. Second stage on the bottom: iPod-LEGO EV3 wireless connection via Bluetooth.

In this second stage the objectives are:

• To implement a technological solution based on the previous study, where the communication was bidirectional between the iPod 4G and the LEGO NXT through a Teensy 2.0 microcontroller board, using the MIDI protocol between the Teensy and the iPod and the I2C protocol between the Teensy and the NXT. The NXT was responsible for reading information from the sensors (touch sensors) and from the iPod in order to execute actions such as movements of joy when an activity is done properly. By using the iPad USB Camera Connector Kit (an iPod jailbreak was required to adapt the device), we could plug the MIDI cable of the Teensy board directly into the iPod. The technological improvement has been made by Team Comillas, who worked on a Bluetooth wireless communication between the new LEGO robot, the EV3, and the iPod 4G (see Fig. 1) in order to reduce the technical problems of the wired communication. With this solution it is also no longer necessary to use the MIDI protocol that we were using to transfer data between the iPod and the LEGO NXT, which makes the programming much easier (a communication sketch is given after this list).

• To design and develop a set of apps that contain new activities for cognitive and physical rehabilitation aimed at two groups: elderly people and children with special needs. New features are also added to the pet functionality, where the robot behaves differently depending on the results obtained from the activities; the battery level of the iPod and its overall use affect its state, making it happy, sad, angry, sick, etc.
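The wireless link described in the first objective can be illustrated with a short, hedged sketch. It is a minimal example only: it assumes a Linux host with Python's standard-library Bluetooth RFCOMM sockets, and the device address and message strings are placeholders rather than the actual EV3 or app protocol, which is not specified here.

```python
import socket

# Hypothetical Bluetooth address of the LEGO EV3 brick (placeholder value).
EV3_ADDRESS = "00:16:53:00:00:00"
RFCOMM_CHANNEL = 1

def run_session():
    """Sketch of the bidirectional link: send a command, read sensor data back.

    The payloads are plain-text placeholders; the real system would use the
    EV3's own command format or a custom app-level protocol instead.
    """
    # RFCOMM provides a serial-like, bidirectional Bluetooth channel (Linux only).
    sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                         socket.BTPROTO_RFCOMM)
    sock.connect((EV3_ADDRESS, RFCOMM_CHANNEL))
    try:
        # Multitouch device -> robot: e.g. request a "movement of joy" after a
        # correctly completed activity.
        sock.send(b"ACTION:joy\n")

        # Robot -> multitouch device: e.g. a touch-sensor reading that the app
        # can use to update the pet's state (happy, sad, angry, sick, ...).
        sensor_reading = sock.recv(64).decode().strip()
        print("Robot reported:", sensor_reading)
    finally:
        sock.close()

if __name__ == "__main__":
    run_session()
```

Compared with the first-stage setup, the point of the design is that one serial-like Bluetooth channel replaces both the MIDI link (iPod–Teensy) and the I2C link (Teensy–NXT), so no intermediate microcontroller or cable is needed.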

4. RESULTS

Team La Salle obtained significant results in different cognitive measurements between pre- and post-test with children with a brain trauma, showing how useful the iPod-LEGO robot can be for this kind of treatment. On the other hand, based on the tests, Team Deusto showed how easy it was to use the robot to deliver basic coaching for physical activities as proposed by the client.

5. CONCLUSIONS

Robotics has revolutionized manufacturing processes since the industrial revolution and is now being introduced into everyday environments such as vehicles, homes, offices and schools. Living with robots is already a reality, as happened with the interaction with computers, and robots are already present in the field of rehabilitation. Based on previous studies we propose technical improvements to the iPod-LEGO social robot used for cognitive or physical rehabilitation in either children with special needs or elderly people. As a result, an improved robotic platform has been developed that avoids several technical problems we had in the past, thanks to the new wireless communication via Bluetooth. We were able to fix many of the issues of the wired communication, such as wires repeatedly breaking due to the intensive user-robot interaction, and transferring information between devices is now programmed more easily than with MIDI commands through the Teensy 2.0 device.

Our expectations also focus on a better use of the robot as an enhancing tool for satisfactory rehabilitation of children with special needs and elderly people. If this target group improves their physical abilities or cognitive functions, their quality of life improves as well.

ACKNOWLEDGMENT

This project, with code 502858, is funded by La Fundació de la Marató de TV3 and by Aristos Campus Mundus 2015 with code ACM2015 18. The authors thank the medical support provided by the Sant Joan de Déu Hospital and are grateful for the inestimable collaboration of La Real Casa de la Misericordia nursing home in Bilbao.

REFERENCES

[1] S. Woods and K. Dautenhahn, The design space of robots: Investigating children's views, Proceedings of the International Workshop on Robot and Human Interactive Communication, 2004, pp. 47–52.

[2] M. Mataric et al., Socially assistive robotics for stroke and mild TBI rehabilitation, Advanced Technologies in Rehabilitation, 145, 2009, pp. 249–262.

[3] C. Kidd, Sociable robots: The role of presence and task in human-robot interaction. Master's thesis, MIT Media Lab, Cambridge, MA, USA, 2003.

[4] M. Díaz, N. Nuño, J. Sáez-Pons, D. E. Pardo, and C. Angulo, Building up child-robot relationship for therapeutic purposes: From initial attraction towards long-term social engagement, in Ninth IEEE International Conference on Automatic Face and Gesture Recognition (FG 2011), Santa Barbara, CA, USA, March 2011, pp. 927–932.

[5] J. Albo-Canals, M. Heerink, M. Díaz, V. Padillo, M. Maristany, A. Barco, C. Angulo, A. Riccio, L. Brodsky, S. Dufresne, S. Heilbron, E. Milto, R. Choueiri, D. Hannon, and C. Rogers (2013), Comparing two LEGO Robotics-Based Interventions for Social Skills Training with Children with ASD, 22nd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2013), Gyeongju, Korea. [DOI 978-1-4799-0509-6/13]

[6] H. Kwee, J. Quaedackers, E. van de Bool, L. Theeuwen, and L. Speth, Adapting the control of the MANUS manipulator for persons with cerebral palsy: An exploratory study, Technology and Disability, 14(1), 2002.

[7] M. Díaz, J. Sáez, D. Pardo and C. Angulo, Pet robots with social skills for satisfactory interaction with hospitalized children, Proceedings of the RSS 2010 Workshop Learning for Human-Robot Interaction Modeling, Robotics: Science and Systems Conference.

[8] A. Barco et al., A Robotic Therapy for Children with TBI, Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction, 2013, pp. 75–76.

[9] L. López-Samaniego, B. García-Zapirain and A. Méndez-Zorrilla, Memory and accurate processing brain rehabilitation for the elderly: LEGO robot and iPad case study, Bio-Medical Materials and Engineering, vol. 24(6), 3549–3556, 2014.

[10] P. J. Pérez, B. García-Zapirain and A. Méndez-Zorrilla, Caregiver and social assistant robot for rehabilitation and coaching for the elderly, Technol Health Care, 2015.

[11] A. Barco et al., "A drop-out rate in a long-term cognitive rehabilitation program through robotics aimed at children with TBI," RO-MAN: The 23rd IEEE Intl. Symposium on Robot and Human Interactive Communication, IEEE, 2014.


The Impact of a Robotic Cat on Dementia Caregivers' Psychosocial Work Environment – a Pilot Study

Marcus Persson a
aPhD, Mälardalen University, School of Health, Care, and Social Welfare

Abstract. The aim of this pilot study is to contribute to the knowledge of professional caregivers' psychosocial work environment when they use an interactive robotic cat. Based on recurring interviews, over three months, with three individual caregivers at a dementia care center in Sweden, the findings indicate that the caregivers experience that the cat can have a positive impact on their psychosocial work environment regarding working with people (communication and interaction), as well as help reduce feelings of stress and insecurity when working alone.

Keywords: Robotic cat, dementia care, caregivers' experiences, psychosocial work environment, qualitative method.

INTRODUCTION

Health robotics [3,4,6,8] and welfare technology [2] are being developed in response to new societal challenges such as the aging population, and are often presented as a means to free up resources, meet the user's needs, and promote research, development and innovation. The area is new and evolving, which is why there are still few research results, and these show mixed outcomes [6]. The aim of this pilot study is to contribute to the knowledge about professional caregivers' psychosocial work environment when they use an interactive robotic cat.

Health care personnel are an occupational group in Sweden with a high frequency of work stress and stress-related mental illnesses [1]. Causes of work-related mental illness may be due to a variety of psychosocial factors [7]. In this study, the focus is primarily on such factors presumed to arise when working with the health robotic assistive device JustoCat (or the "robotic cat"): working alone, risks of threats and violence, conflicts, working with people, and social contacts.

METHOD

The project has followed three individual caregivers from the dementia care center Eskilshem for three months during the autumn/winter of 2014. The caregivers were given one robotic cat each, which they gave to one of their patients, i.e. one cat stayed with the same patient the whole time and they had unlimited access to the cat every day. In-depth interviews – according to a semi-structured design [5] – with the three persons were conducted once a month at their workplace. The interviews took about an hour each time. A total of nine interviews were carried out, and later transcribed. The interviews began with a series of neutral opening questions about personal background, interests, etc., in order to get the conversation started and to provide an atmosphere conducive to open and undistorted communication between participant and interviewer [5]. Further into the interview, questions of a more personal and potentially sensitive character were asked. The method of repeatedly interviewing the same individuals over time proved very fruitful since it gave the interviewees time and opportunity to reflect on issues and questions, raised by the interviewer, between each interview. The interviews were audio recorded and transcribed, and the data were categorized. The analysis was mainly concerned with identifying themes on a latent or interpretative level [9]. A check was performed to ensure that the themes worked in relation to the coded extracts as well as the entire data set.

FINDINGS

The caregivers were asked questions about how they use the robotic cat in relation to the specific patients that had been given a cat. In particular, the questions focused on the cat's potential impact on the caregivers' experiences of: working alone, risks of threats and violence, conflicts, working with people, and social contacts. The main findings from the study will be presented according to two themes that have been inductively extracted from the narratives: the use of the robotic cat as an (a) Activator, and as a (b) Pacifier.

Activator

Listening to the caregivers' experiences of how they use the robotic cat in their daily care of the patient, it is obvious that they use it to promote communication, i.e. to stimulate and activate the patient to talk and to interact with the caregiver. The interviewees explain that they use the cat as a tool to evoke memories and conversation topics, for example:

"To have something to talk about – you do not need to talk about the non-existing bus or train that never arrives. You have some kind of tool to change the monotonous conversation. You talk about the cat instead and it often recalls memories." (IP1)

In this way, the cat promoted verbal communication between the personnel and the users. As a conversation or memory stimulus, the cat was appreciated among the caregivers for its effect on the patient (positive impact on attitude) as well as its function of helping them to talk and communicate with their patients.

The interviewees also said that they had noticed that the cat could promote communication between the users (i.e. without the interference of the caregivers):

"Often when the cat is involved the patients start talking about it. That makes you happy! The cat makes it easier to get a discussion going, and faster to divert. So it helps a lot, feels easier." (IP1)

This aspect of the cat, as a conversation starter between users, is experienced as beneficial for the caregivers' work environment since it promotes a good social atmosphere in group situations. In this way the cat may contribute to the psychosocial work environment regarding factors such as working with people, and social contacts.

Pacifier

If the cat was used as an "activator" regarding communication, it was used rather as a "pacifier" regarding physical interaction. In one-to-one situations (between the caregivers and the users), the cat was used for calming purposes, for example:

"I think that it helps me do a better job, especially when she is hallucinating. It's easier for me to 'reach' her when she kind of loses herself in the hallucination. She becomes stiff in her body, but when having the cat it is different, then she relaxes." (IP3)

Similarly, in group situations involving several patients, the cat was used to deflect negative behaviors that might otherwise interfere with other patients, as illustrated in the next conversation between interviewer and interviewee:

IP3: The other patients are now less disturbed by her, now when she has the cat to care for. Before they got angry with her, because she disturbed them by her picking behavior.

Interviewer: How is the situation for you and your colleagues then, when she becomes anxious and the other patients get angry?

IP3: It's stressful. For example, when we had the Lucia ceremony, it was stressing that she disturbed the others. But when we gave her the cat, she was occupied with picking on it.

This aspect of the cat, as a "pacifier", or distractor, is experienced as beneficial for the caregivers' work environment due to its calming and diverting effects on the patients.

Furthermore, the interviewed caregivers also spoke of the cat's potential for contributing to security in certain problematic work situations, for example using the cat as a pacifier on stressful evenings when the caregiver works alone at the ward, taking care of eight patients and trying to get them all to bed. In those situations, the cat might function as a helping hand for the individual caregiver when s/he is especially exposed to risks of conflicts and violence.

CONCLUSIONS

The findings from the pilot study indicate that the caregivers use the robotic cat in two distinct ways with different impacts on the psychosocial work environment.

As a conversation starter and memory trigger, the robotic cat is used to activate and stimulate the patient to engage in communication with others (caregivers or other patients). When used in this way, the interviewed caregivers experience that the cat contributes to their psychosocial work environment regarding factors such as working with people and social contacts.

The robotic cat is also used as a pacifier to calm patients and divert negative behaviors. When used in this way, the interviewed caregivers not only feel less stressed, but also more secure, especially when working alone and exposed to precarious situations involving potential conflicts and violence.

The findings point to several potential benefits for individual caregivers when using a robotic cat in their daily care for patients. However, further studies are needed to evaluate the findings from this pilot study and to explore potential benefits as well as risks.

ACKNOWLEDGMENTS

The study is a result of a co-production between Mälardalen University (The Arena for Health and Welfare Technology), Robyn Robotics, Eskilshem, Attendo, and Regionförbundet Sörmland.

A special thanks goes to Camilla Svanberg, Mälardalen University, who conducted the interviews.

REFERENCES

1. AFA Insurance (2013). Allvarliga arbetsskador och långvarig sjukfrånvaro [Serious work injuries and long-term sick leave]. www.afaforsakring.se

2. Hofmann, Björn (2013). Ethical challenges with welfare technology: a review of the literature, Science and Engineering Ethics, 19, 389-406.

3. Kanamori, M., Suzuki, M., and Tanaka, M. (2002). Maintenance and improvement of quality of life among elderly patients using a pet-type robot, Nippon Ronen Igakkai Zasshi, Japanese Journal of Geriatrics, 39, 214-218.

4. Libin, A., and Cohen-Mansfield, J. (2004). Therapeutic robocat for nursing home residents with dementia: Preliminary inquiry, American Journal of Alzheimer's Disease and Other Dementias, 19, 111-116.

5. Merton, R. K., Fiske, M., and Kendall, P. L. (1956/1990). The focused interview. A manual of problems and procedures. New York: Free Press.

6. Oborn, E., Barrett, M., and Darzi, A. (2011). Robots and service innovation in health care, Journal of Health Services Research & Policy, 16(1), 46-50.

7. Swedish Work Environment Authority (2012). Work-related disorders 2012. Stockholm: Swedish Work Environment Authority.

8. Wada, K., and Shibata, T. (2007). Living with seal robots: Its sociopsychological and physiological influences on the elderly at a care house, IEEE Transactions on Robotics, 23, 972-980.

9. Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. London: SAGE Publications.


Acceptability of a Service Robot which Supports Independent Living of Elderly People

Sandra Bedaf a,b and Luc de Witte a,b
aZuyd University of Applied Sciences, Research Centre Technology in Care
bMaastricht University, School for Public Health and Primary Care (CAPHRI)

Abstract. Prolonging independent living of elderly people is preferred for many reasons. Service robots that prolong independent living of elderly people have been given increasing attention. To capture the views of elderly people concerning a re-enablement robot, focus group sessions were conducted in the Netherlands, the UK and France. In these focus groups a scenario of a re-enablement robot was discussed. The results showed that elderly people find the idea of having a robot acceptable. Nevertheless, such a robot needs to have high intelligence. Additionally, elderly people preferred the robot not being able to refuse a given task, even though this may decrease the user's ability in the longer term.

Keywords: Service robots, elderly, independent living, acceptability.

INTRODUCTION

The ageing population in Western countries is increasing. This population prefers to stay at home as long as possible; nevertheless, their ability to perform daily activities diminishes. Activities related to mobility, self-care, and interpersonal interaction & relationships were identified in a previous study as most threatening with regard to the independent living of elderly people [1]. When one no longer has the ability to meet one's own needs, other options, such as institutionalized care, are explored. However, institutionalized care is expensive, which makes prolonging independent living of elderly people also desirable at a societal level.

Traditionally, informal or professional caregivers offer care to those who need help to continue to live independently at home. However, social structures have changed, resulting in informal carers being less inclined and/or able to provide care, and additionally we also face an increasing shortage of care staff [2]. Thus, in order to maintain the quality of care at home, technical solutions, and more specifically robotic solutions, are being given increasing attention.

The ACCOMPANY (Acceptable robotiCs COMPanions for AgeiNG Years) project aimed to further develop the functionality of an existing service robot, the Care-O-bot [3], in order to support elderly people in prolonging independent living [4]. Re-enablement is an aspect that ACCOMPANY seeks to promote: elderly people should execute tasks themselves as much as possible in order to maintain their existing functional abilities. A re-enablement robot should encourage the user to perform tasks by themselves when possible and should only provide support when the user cannot perform an activity. Such a robot should be capable of doing more than just executing functional tasks, as it should motivate and stimulate the user to execute tasks themselves. However, this means that such a robot should be able to monitor and to interpret a situation, as the user in some situations might actually need the support of the robot. This introduces the complex question of whether the robot should be allowed to make decisions depending on the situation. For example: would it be acceptable for the service robot to 'decide' to refuse to execute a task, given by the user, in order to get the user to exercise abilities they might otherwise lose?

This paper explores this dilemma and presents the views and thoughts of elderly people in the Netherlands, United Kingdom and France on the acceptability of such a re-enablement robot.

METHOD

Focus group sessions were conducted with elderly people at four different sites: the Netherlands – Zuyd University of Applied Sciences (ZUYD); UK – University of Birmingham (UB) and University of Hertfordshire (UH); and France – MADoPA. In these focus group sessions a scenario with a re-enablement robot was verbally explained and discussed. In this scenario a 78-year-old lady, Marie, ignores the robot's advice to walk around, which will help her ulcers to heal. Marie likes the robot to remind her to take her antibiotics but dislikes the reminders to elevate her leg. And finally, she does not tell her nurse the truth in the scenario about how much she is moving. Participants were asked for their thoughts concerning this scenario. All data was audio and/or video recorded.

Participants

Elderly people were contacted through care organizations, except for the participants of UB who were contacted through the Birmingham 1000 Elders [5]. Elderly people were selected based on four criteria: 1) aged 60+, 2) living at home, 3) no cognitive decline, and 4) receiving home care.

Data analysis

All data was transcribed verbatim, translated into English and coded using a combination of directed analysis and Ritchie & Spencer's Framework Analysis [6]. The final codes were worked into themes. All of the data was then combined into a single report.


RESULTS

In total 55 elderly people (19 male, 36 female) participated in focus group meetings at ZUYD (10), UH (5), UB (21) and MADoPA (19). The mean age of the participants of ZUYD and MADoPA was 78.5 years. The age of the participants of UH and UB was unknown, except that they were aged 65+.

All participants could relate to the scenario of Marie and agreed that people do not always do what is best for them. The majority of these participants thought the user should have control over their life. They were very straightforward that when the user would want to do something others would disapprove of, the robot was not allowed to interfere: the user's view on how the robot should act was seen as most important and the autonomy of the user should be honored. On the other hand, the elderly participants also believed that the user had agreed to accept the robot and that it was not forced upon the user. They therefore thought that collaboration with the robot was a reasonable obligation. This shows that the respect for autonomy of the user starts even before the robot is installed in the user's home.

Providing reminders was seen as a useful task by all participants, as they were aware that people might start to forget things as they become older. However, some participants were worried that these reminders could become annoying if repeated often (at an inconvenient time) or when the reminders would be given by a mechanical voice. They therefore wished the robot would only provide reminders tailored to the situation (e.g. the robot should not provide a reminder to go for a walk when the user is watching the news).

The feelings concerning the robot stimulating health-promoting behavior (e.g. telling Marie to move around to help her ulcers to heal) were mixed. Some thought it would be useful, while others had compassion for the discomfort it could cause. Some participants also argued that people are normally not forced into cooperating with health-promoting behaviors (e.g. people are free to smoke tobacco and drink too much alcohol) and thought it was up to Marie to decide if she would move or not. Again participants wished the robot would only provide useful information depending on the situation.

Some of the participants of UB were also worried that a robot which could refuse to bring the user a drink (in order to get the user to exercise abilities they might otherwise lose) would harm the user (e.g. dehydration of the user). For this group it was more important that the robot would keep the user safe.

DISCUSSION

A re-enablement robot that would give reminders (e.g. to take medication) was seen as helpful. However, in order to be acceptable, such a robot needs to be able to react to the user's behavior and provide useful reminders and/or information depending on the situation. This requires the robot to be flexible, to recognize circumstances, to interpret these and to make decisions based on the situation.

In all focus groups it was also acknowledged that, in order for the robot to be acceptable, the autonomy of the user must be respected and no decisions of the user should be overruled by the robot. The robot must be within the control of the user. However, such a robot may actually reduce the quality of life of the elderly user, because when the robot does too much it can de-skill, de-motivate and/or otherwise erode the abilities the user still has, decreasing one's ability in the longer term.

CONCLUSION

In this paper the acceptability of a re-enablement robot by elderly people was explored with the use of focus group sessions. It became clear that elderly people find the idea of having a robot to support them in their daily life acceptable. However, such a robot needs to have high intelligence, as it needs to be able to act upon the situation (i.e. it should recognize circumstances, interpret these and make decisions based on the situation). Our data also suggest that people prefer a robot that obeys the user and does not refuse to perform a given task, even when this may decrease the user's ability in the longer term and thereby undermine the user's ability to live independently.

ACKNOWLEDGMENT

The authors would like to thank our colleague Gert Jan Gelderblom† for his highly appreciated contribution to this work. The authors are grateful to colleagues in the ACCOMPANY consortium.

REFERENCES

1. S. Bedaf, G.J. Gelderblom, D.S. Syrdal, et al., Which activities threaten independent living of elderly when becoming problematic: inspiration for meaningful service robot functionality, Disability & Rehabilitation: Assistive Technology, 9(6), 445-52 (2013).

2. C. Cameron and P. Moss, Care work in Europe: Current understandings and future directions, Oxford, Routledge (2007).

3. Fraunhofer IPA, Stuttgart, Germany.

4. Acceptable robotiCs COMPanions for AgeiNg Years (ACCOMPANY). www.accompany.eu

5. Birmingham 1000 Elders. http://www.birmingham.ac.uk/research/activity/mds/centres/healthy-ageing/elders.aspx

6. J. Ritchie and L. Spencer, "Qualitative Data Analysis for Applied Policy Research," in The Qualitative Researcher's Companion, edited by A. Michael Huberman and Matthew B. Miles, Thousand Oaks, CA: SAGE Publications, 2002, pp. 305-330.


Influence of Social Avoidance and Distress on People’s Preferences for Robots as Daily Life Communication Partners

Tomohiro Suzuki a, Sachie Yamada b, Takayuki Kanda c, and Tatsuya Nomura c,d
aTokyo Future University (School of Child Psychology)
bTokai University (Department of Psychological and Sociological Studies)
cATR (Intelligent Robotics & Communication Laboratories)
dRyukoku University (Faculty of Science and Technology) & ATR

Abstract. This study investigates (1) whether people prefer robots vs. humans as communication partners for different roles and situations in daily life and (2) the influence of social avoidance and distress on people's preferences for robots vs. humans. Results showed that in Japan, a certain number of people preferred robots as communication partners for many roles and situations. Additionally, social avoidance and distress influenced these preferences. This result suggests that robots would be particularly useful for individuals with high social anxiety.

Keywords: Human-Robot Interaction, social acceptance, social avoidance and distress.

INTRODUCTION

It is likely that social robots will become a part of our daily lives in the near future. However, questions regarding (1) whether people prefer robots or humans as communication partners for different roles and situations in daily life and (2) the type of people who prefer to interact with robots remain unanswered. Although Takayama, Ju, and Nass [1] identified the types of roles that people expect of robots, previous studies have not sufficiently addressed these questions.

Several psychological factors are known to influence human-robot interaction. For example, it has been shown that human-robot interaction is inhibited by people's robot anxiety and negative attitudes toward robots (e.g., [2]). That is, people who experience anxiety toward robots tend to avoid communicating with a robot. Conversely, people who experience anxiety toward other people may have difficulty communicating with people and may prefer to communicate with robots rather than with people.

This study investigates two issues. First, we examined people's preferences for robots vs. humans as communication partners with regard to different roles and situations. Second, we examined the influence of social avoidance and distress on people's preferences for communication partners.

Therefore, we formulated the following research questions. Q1: Do people prefer robots vs. humans as communication partners for different roles and situations in daily life? Q2: How do social avoidance and distress influence the communication partner preferences?

METHOD

An online survey was conducted in March 2015. A total of 206 Japanese participants (Men: 103, Women: 103; Age range: 20–29; Mean: 25.2; SD: 2.91) were recruited through an online survey company.

In order to address Q1, the questionnaire included items assessing the participants' preferences for robots vs. humans as communication partners. These items were developed for this survey. Participants were not presented with a clear definition of a robot (e.g., a humanoid type), so that they could rely on their own image of a robot. Twenty-five roles and situations (see Table 1) were presented, and the participants were asked to select either a human or a robot as a communication partner for each role and situation.

In order to address Q2, a Japanese version of the Social Avoidance and Distress Scale (SADS; [3]), which was originally developed by Watson and Friend [4] and includes 28 true-false items, was administered to assess participants’ degree of social avoidance and distress. Social avoidance and distress is an important factor related to social anxiety.

RESULTS

To answer Q1, the rate at which participants selected humans or robots as their communication partners was calculated (Table 1). Results showed that for one third of the roles and situations presented, approximately 20% of the participants preferred to communicate with robots rather than with humans. Furthermore, for two roles and situations (No. 7 and No. 10), over half of the participants preferred to communicate with robots rather than with humans.

To answer Q2, the SADS scores of participants who selected humans as communication partners were compared to those of participants who selected robots as communication partners. Results of t-tests with the SADS score as the dependent variable and communication partner preference as the independent variable showed that for all roles and situations, the SADS scores of participants who selected robots as communication partners were higher than those of participants who selected humans (Table 1).
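For readers who want to reproduce this kind of comparison, the sketch below shows a standard independent-samples t-test of the form reported in Table 1. It is illustrative only: the score lists are invented placeholder values, not the study data, and this is not the authors' analysis script.

```python
# Illustrative sketch: compare SADS scores of participants who chose a human
# vs. a robot for one role/situation. The score lists are placeholder data.
from scipy import stats

sads_human_preferrers = [12, 18, 9, 22, 15, 14, 20, 11, 17, 13]   # placeholder
sads_robot_preferrers = [21, 25, 18, 23, 19, 24, 17, 22, 20, 26]  # placeholder

# Independent-samples t-test (two-sided p by default); Welch's correction can
# be requested with equal_var=False if group variances differ.
t_value, p_value = stats.ttest_ind(sads_human_preferrers, sads_robot_preferrers)
print(f"t = {t_value:.3f}, p = {p_value:.3f}")
```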


DISCUSSION

The study demonstrated that some young Japanese people prefer robots over humans as communication partners for many roles and situations in daily life. Especially for a couple of roles and situations, direction-giving and paying at a cashier, over half of the participants preferred to communicate with robots rather than with humans. It appears that people tend to prefer a robot for structured tasks. Further, we found differences in the SADS scores of participants who selected humans and those who selected robots as communication partners. This result indicates that robots may be particularly helpful for people with high social anxiety. Thus, introducing robots as communication partners in daily life situations may be helpful for individuals with high social anxiety, and investigating psychological factors associated with people's preferences for robots may be helpful for introducing communication robots in daily life.

ACKNOWLEDGEMENT

The work was supported in part by Grants-in-Aid for Scientific Research from the Japan Society for the Promotion of Science (No. 25280095).

REFERENCES

1. L. Takayama, W. Ju, and C. Nass, Beyond Dirty, Dangerous and Dull, Paper presented at the 3rd ACM/IEEE Int. Conf. on Human-Robot Interaction (HRI2008), 25-32 (2008).

2. T. Nomura, T. Kanda, T. Suzuki and K. Kato, IEEE Transactions on Robotics, 24, 442-451 (2008).

3. R. Ishikawa, K. Sasaki and I. Hukui, Japanese Journal of Behavior Therapy, 18, 10-17 (1992).

4. D. Watson and R. Friend, Journal of Consulting and Clinical Psychology, 33, 448-457 (1969).

Table 1. The rate at which participants selected humans vs. robots as communication partners and differences in SADS scores between participants who preferred humans vs. robots as communication partners.

No. | Item | n (%) of selection of robot | 95% CI | SADS M (SD): Human | SADS M (SD): Robot | t | p
2 | Talking about serious events experienced during the day at home | 29 (14.1) | 9.3–18.8 | 15.8 (7.2) | 20.7 (6.0) | -3.898 | .000
13 | Seeking medical attention at a hospital | 29 (14.1) | 9.3–18.8 | 15.7 (7.0) | 21.2 (6.8) | -3.998 | .000
14 | Seeking career counseling at school | 32 (15.5) | 10.6–20.5 | 15.8 (7.0) | 20.3 (7.4) | -3.196 | .003
23 | Being nursed during hospitalization | 33 (16.0) | 11.0–21.0 | 15.5 (7.0) | 21.4 (6.1) | -4.916 | .000
22 | Being nursed at home | 34 (16.5) | 11.4–21.6 | 15.6 (7.0) | 21.0 (6.7) | -4.257 | .000
17 | Seeking mental health counseling at a clinic | 36 (17.5) | 12.3–22.7 | 15.5 (7.1) | 21.3 (6.0) | -5.085 | .000
16 | Seeking mental health counseling at school or in the workplace | 38 (18.4) | 13.1–23.7 | 15.7 (7.0) | 20.1 (7.0) | -3.490 | .001
18 | Being taught at schools or cramming schools | 39 (18.9) | 13.6–24.3 | 15.6 (7.0) | 20.4 (6.9) | -3.966 | .000
1 | Talking about trivial events experienced during the day at home | 41 (19.9) | 14.5–25.4 | 15.8 (7.1) | 19.3 (6.9) | -2.943 | .005
15 | Seeking outplacement counseling at an employment agency | 41 (19.9) | 14.5–25.4 | 15.6 (7.0) | 20.1 (7.2) | -3.611 | .001
12 | Being provided with health consultations | 42 (20.4) | 14.9–25.9 | 15.7 (7.0) | 19.6 (7.1) | -3.242 | .002
21 | Being trained for new tasks at workplace | 42 (20.4) | 14.9–25.9 | 15.2 (7.0) | 21.3 (5.9) | -5.709 | .000
20 | Being taught new job-related skills for a part-time job at the workplace | 54 (26.2) | 20.2–32.2 | 15.4 (7.1) | 19.6 (6.7) | -3.979 | .000
5 | Becoming a playmate at home | 60 (29.1) | 22.9–35.3 | 14.9 (6.9) | 20.4 (6.5) | -5.430 | .000
19 | Being taught to study at home | 60 (29.1) | 22.9–35.3 | 15.5 (6.8) | 18.9 (7.7) | -2.997 | .003
4 | Consulting about concerns at home | 63 (30.6) | 24.3–36.9 | 15.8 (6.8) | 18.0 (7.8) | -1.972 | .051
24 | Being cared for at home when old | 67 (32.5) | 26.1–38.9 | 15.1 (6.9) | 19.4 (7.0) | -4.147 | .000
25 | Being cared for at a nursing home when old | 69 (33.5) | 27.0–39.9 | 14.9 (6.9) | 19.7 (6.8) | -4.763 | .000
8 | Being guided at a tourist spot | 81 (39.3) | 32.6–46.0 | 14.9 (6.9) | 19.0 (7.1) | -4.072 | .000
6 | Getting fortune-telling on street or store | 85 (41.3) | 34.5–48.0 | 15.0 (7.1) | 18.6 (6.9) | -3.557 | .000
9 | Enquiring about the characteristics and features of products at stores | 87 (42.2) | 35.5–49.0 | 15.2 (6.9) | 18.2 (7.3) | -3.004 | .003
11 | Placing orders for food and drink at restaurants | 95 (46.1) | 39.3–52.9 | 14.7 (7.2) | 18.6 (6.7) | -4.082 | .000
3 | Complaining about an issue at home | 99 (48.1) | 41.2–54.9 | 15.4 (6.8) | 17.6 (7.5) | -2.215 | .028
7 | Asking directions at station or on street | 104 (50.5) | 43.7–57.3 | 14.9 (7.0) | 18.0 (7.2) | -3.170 | .002
10 | Paying for items at the checkout counter of a store | 107 (51.9) | 45.1–58.8 | 14.9 (7.5) | 18.0 (6.7) | -3.147 | .002


Is it real? Dealing with an insecure perception of a pet robot in dementia care

Martina Heinemann a, Meritxell Valenti Soler b, and Marcel Heerink a
aWindesheim University, The Netherlands
bAlzheimer Center Reina Sofía Foundation, Spain

Abstract. When asked by dementia patients whether a pet robot is real, caregivers face the dilemma of what the best answer is. We asked Dutch and Spanish caregivers what they consider the best answer and found that most would leave the choice to the patients. There appear to be fundamental differences between the answers in the two countries: Dutch respondents often compared the pet robot to a real animal, while this option was not chosen at all in Spain.

Keywords: robot assisted activity, social robots, multidisciplinary research, triangulation, dementia care

INTRODUCTION

In general, and gradually more commonly, pet robots are used in the care of people with dementia to increase their feeling of health and wellbeing, and to decrease anxiety. They stimulate patients to be more communicative and enable caregivers and family members to make contact with them: patients calm down or indeed revitalize, are less anxious and/or confused, feel less lonely and/or depressed, are happier and laugh more, remember earlier times (reminisce) and communicate more and better with their surroundings [1, 2]. But how are these effects reached? How should the robot be used? For which clients are pet robots suitable and for which ones not? What do you have to watch out for? How do you work with groups of people or with an individual client? When and how do you involve relatives? These are a few of the many questions care professionals, volunteer caregivers and family members who (want to) work with pet robots have. There is a need for information and practical guidelines when using pet robots in the care of people with dementia [3]. To meet this need the project "New friends, old emotions" was initiated at the end of 2012. This project focused on practice-oriented research into the use of various robotic animals (1) in individual patients and in groups, (2) in various stages of dementia, and (3) in cooperation with professional caregivers, relatives and volunteers, and aimed to give as many 'evidence based' answers as possible to the questions listed above. The findings were to be translated into a set of guidelines and recommendations for the use of pet robots in dementia care.

IS IT REAL?

During a pilot study within this project, we observed a woman with severe dementia cuddling a robotic cat, obviously enjoying it. After a while, she stopped, seemed confused, and looked up to the caregiver, asking 'Is it real?' This is an illustrative case of a challenge in practice: dementia caregivers usually go along with a patient's point of view. But what if this point of view is insecure? This could specifically occur when using lifelike robotic pets, and we wanted to know what the best strategy would be. We decided to incorporate this case as a multiple choice question in a larger questionnaire [4] on the attitude of dementia caregivers towards therapy with robotic pets. In Madrid, twenty care professionals of different ages and educational levels who attended a course were invited to take part in this research and answer the questionnaire. In the Netherlands, 29 care professionals from different care institutions all over the country were recruited to take part.

RESULTS AND DISCUSSION

When looking at the cumulative frequencies for the different answers, we see that only a minority would answer "no, it is not real" (12%), the single most common answer is "what do you believe?" (35%), and the majority of caregivers favor a positive answer (53%).


Figure 1. Cumulative frequencies of the different answers. Grey signifies the Spanish and black the Dutch caregivers.

However, a closer look at the answers given in Spain and the Netherlands separately presents a slightly more complicated picture: nearly half the respondents in Spain would leave the patients to make up their own mind, while the yes has only a slight majority over the no. In the Netherlands about a quarter of all respondents would leave the decision to the patient, but the majority (69%) would answer yes. Only one person would answer no.

Figure 2. Frequencies resolved per country. Grey signifies the Spanish and black the Dutch caregivers.

So in general we find a much more positive way of answering in the Netherlands. Moreover, no respondent in Spain found positive identification with a real animal an appropriate option, while more than a third of Dutch caregivers chose this answer – as much as the other two positive answers combined.

CONCLUSION AND FURTHER RESEARCH

In summary, we find that a large group of caregivers prefers to leave the answer to the patient. We find two notable differences in the country-resolved data:

• In general the Dutch respondents favor more positive answers as compared to the Spanish.
• The comparison to a real animal was chosen by about a third of all Dutch respondents and not at all in Spain.

Even though the sample size is not overly large, we would not like to discount this as purely coincidental. Further investigation is therefore needed on the following points:

• Will we see the same tendencies in a greater sample?
• A second interesting point, which we have not addressed here at all, would be to look into the expectations possibly reflected in the caregivers' answers. In other words: do they expect the therapeutic value of the robot to depend on its perceived reality?
• One caregiver pointed out that her answer would depend on factors like patient type and context. It would require more in-depth research to establish the influence of situational factors on caregivers' replies.

REFERENCES

1. Bemelmans, R., et al., Socially assistive robots in elderly care: A systematic review into effects and effectiveness. Journal of the American Medical Directors Association, 2012. 13(2): p. 114-120.e1.

2. Broekens, J., M. Heerink, and H. Rosendal, The effectiveness of assistive social robots in elderly care: a review. Gerontechnology, 2009. 8(2): p. 94-103.

3. Heerink, M., Kent u Paro? Bekendheid van robot assisted therapy bij professionals in de ouderenzorg [Do you know Paro? Familiarity with robot assisted therapy among professionals in elderly care]. 2011, Windesheim Flevoland: Almere.

4. Valenti Soler, M., S. Anisuzzaman, C. Smits, S. De Vos, A. Pérez Muñoz, I. Rodríguez Pérez, L. Carrasco Chillón, C. Mendoza Rebolledo, C. Pérez Muñano, V. Isidro Carretero and M. Heerink, Picking new Friends: Caregivers' and Dementia Patients' Choices of Robotic Pets. Canadian International Journal of Science and Technology, 2015. 2.


Inquiry learning with a social robot: can you explain that to me?

Frances Wijnen a, Vicky Charisi b, Daniel Davison b, Jan van der Meij a, Dennis Reidsma b, Vanessa Evers b

aFaculty of Behavioural, Management and Social Sciences, ELAN, University of Twente, The Netherlands
bDept. of Electrical Engineering, Mathematics and Computer Science, HMI, University of Twente, The Netherlands

Abstract. This paper presents preliminary results of a study which assesses the impact a social robot might have on the verbalization of a child's internal reasoning and knowledge while working on a learning task. In a comparative experiment we offered children the context of either a social robot or an interactive tablet for verbally explaining their thoughts, while keeping the content of the learning task identical. Results suggest the context of a social robot leads to a faster response time from the children.

Keywords: Social robot, child-robot interaction, inquiry learning, verbalization, interactive explaining

INTRODUCTION

Talking with other people can provide a context for articulating and explaining ideas. This can facilitate greater understanding of one's own ideas and knowledge. For the past 15 years, research has shown that generating explanations leads to deeper understanding when learning new things [1], [2], [3]. There are two forms of explaining: (1) explaining the subject of interest to oneself, which is called self-explaining, and (2) explaining the learned subject to another person, which is called interactive explaining [9]. Several studies have provided successful examples of self-explanation activities [1], [11]. However, a social partner may implicitly create more opportunities for explanations, which are difficult to trigger in the case of self-explanation.

The role of a partner can range from being a passive one, who just listens, to an interactive one who provides support and feedback to the learner [3]. Although there are some similarities between an activity with a partner who just listens and self-explanation activities, the presence of another person can provide the benefit of an audience effect [3]. Generating explanations to another person has been associated with the construction of knowledge [8], [10]. This is because the addition of a social partner might lead to more verbalization of reasoning and explanations, which relates to the development of metacognitive skills.

This study investigates the effect of a social robot on the explanatory behavior of young children when working on an inquiry learning task. Inquiry learning is based on constructivism, which we have combined with aspects of the socio-cultural theory about collaborative learning [12]. This choice was based on the following arguments: (1) inquiry learning provides an open-ended task, (2) the collaborative aspect provides a clear role for the robot as a peer learner, and (3) children can use different strategies in operating inquiry tasks, and the verbalization of these strategies can provide insights into the way children approach such tasks.

This project has received funding from the European Union Seventh Framework Programme (FP7-ICT-2013-10) as part of EASEL under grant agreement no 611971.

Inquiry learning is often described as a cycle or spiral that involves several processes. Klahr's [5], [6] Scientific Discovery as Dual Search (SDDS) model identifies hypothesis generation, experimentation, and evidence evaluation as the core processes of scientific inquiry learning [7], [4], [13]. In the phases of hypothesis generation and evidence evaluation the child has the most opportunities for verbalization of his/her thought process.

DESIGN

The purpose of the present study is to assess the effect of a social robot on the verbalization of reasoning and knowledge during a collaborative inquiry task. The inquiry task focused on exploring the phenomenon of balance using a balance beam. The study employed a between-subjects design with two conditions. In the first condition, children performed the balance inquiry task together with an expressive social robot, the RoboKind Zeno R25. The robot was presented as a peer, but with well-developed inquiry skills. Furthermore, the children received a tablet. Through this tablet the children could indicate that they wanted to move on to the next assignment or ask for additional help. In the second condition, children performed an identical inquiry task about balance with a tablet only. The tablet provided the same assignments, suggestions and questions. In both conditions the robot or tablet would ask the child to verbally explain their hypothesis and conclusion at specific stages in the inquiry task.

It was hypothesized that the presence of a social robot would trigger children to give more explanations than the tablet. Furthermore, in the robot condition it was expected that the time between asking a question and the child's response would be shorter than in the tablet condition.

Participants were 12 Dutch elementary school students (33.3% female) with an average age of 8.8 years (SD = 2.1). The students were randomly assigned to either the robot condition (n = 6) or the tablet condition (n = 6). A review of school curricula showed that students had not yet been educated in the phenomenon of balance. Therefore, it was expected that the students had little or no prior knowledge.

METHOD

This experiment focuses on measuring the duration of verbalization and the response time of a child to questions from the system. Both measures were assessed from videos recorded during the sessions, which were annotated on three levels.


The first level was child speech and contained one label: verbalization. This label was used when children provided explanations about the assignment or balance and was used directly to assess the duration.

The second level was system speech and contained three labels: (1) giving explanation, used when the system (robot or tablet) would give an explanation or a verbal response to the child or to an answer of the child, (2) asking question, used when the system would state a question, and (3) waiting for response, used when the system had stated a question and was waiting for a response from the child, effectively measuring the response time.

The third level was child actions and contained two labels: (1) interacting with balance, used when the children were working with the balance, for example placing or removing pots or removing the wooden blocks, and (2) pressing button, used when the child would press one of the buttons of the tablet (in both conditions).

Future work will investigate the remaining annotation levels; however, this paper focuses on reporting the duration and response time as discussed above.

RESULTS

In total 149 annotations were identified for the label verbalization, of which 77 annotations refer to the robot condition and 72 annotations to the tablet condition. The total duration for all annotations with this label was 758.11 seconds (SD = 4.20). The mean duration for the robot condition was 5.80 (SD = 4.94). The mean duration for the tablet condition was 4.32 (SD = 3.09). An independent-samples t-test showed no significant difference between the conditions concerning the duration, t = 1.264 (df = 10), p = .118 (one-tailed).

The label waiting for response was annotated 146 times, of which 71 annotations refer to the robot condition and 75 annotations to the tablet condition. The total duration for all annotations with this label was 217.22 seconds (SD = 2.79). The mean duration of this label in the robot condition was .94 (SD = 1.14) and 2.00 (SD = 3.67) for the tablet condition. An independent-samples t-test showed a significant difference between the conditions concerning the response time, t = -2.54 (df = 10), p = .015 (one-tailed).
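As a rough illustration of how per-child response times could be aggregated from such annotations and compared between conditions, consider the sketch below. It is a hypothetical reconstruction under assumed data structures (lists of start/end annotation intervals per child, with invented values), not the project's actual analysis code.

```python
# Hypothetical sketch: compute each child's mean "waiting for response" duration
# from annotated intervals and compare the robot and tablet conditions.
# The interval data are invented placeholders, not the study data.
from statistics import mean
from scipy import stats

# One list of (start, end) annotation times in seconds per child.
robot_children = [
    [(10.0, 10.6), (42.1, 43.0)],   # child 1 (placeholder)
    [(15.2, 16.9), (60.3, 61.1)],   # child 2 (placeholder)
]
tablet_children = [
    [(12.4, 15.0), (55.0, 58.8)],   # child 3 (placeholder)
    [(20.1, 21.0), (70.2, 74.9)],   # child 4 (placeholder)
]

def mean_duration(intervals):
    """Mean duration in seconds of one child's annotated intervals."""
    return mean(end - start for start, end in intervals)

robot_means = [mean_duration(child) for child in robot_children]
tablet_means = [mean_duration(child) for child in tablet_children]

# Independent-samples t-test on the per-child means (one value per participant).
t_value, p_value = stats.ttest_ind(robot_means, tablet_means)
print("robot:", robot_means, "tablet:", tablet_means)
print(f"t = {t_value:.3f}, two-sided p = {p_value:.3f}")
```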

CONCLUSION

This study investigated the effect of a social robot on the duration of verbalization and the response time to questions in the context of an inquiry-learning task. To assess this effect the social robot was compared with the use of a tablet. It was hypothesized that providing the children with the context of a social robot would lead to more verbalization about the task than when the children would only use a tablet. Results indicated that children in the robot condition verbalized more than children in the tablet condition, but this difference was not significant. The second hypothesis concerned the response time of children when a question was asked. The results showed a significant difference between the robot condition and the tablet condition in favor of the robot condition.

It seems that children verbalized more easily (shorter response time) when a social robot was used compared to a tablet, but not necessarily more extensively. However, the sample was very small (n = 6 per condition) and a larger sample with more participants may provide more information.

FUTURE WORK

For our future work we want to repeat this experiment with a larger sample in order to increase the external validity. Furthermore, for the repeated study we are planning to perform a qualitative analysis of the answers children give to the questions in order to gain insight into the reasoning of children. Since this experiment was done in the context of inquiry learning, the reasoning of children might give us some interesting insights into what they have learned from the experiment and whether there is a difference in learning between the participants in the tablet condition compared to the robot condition.

ACKNOWLEDGEMENT

This project has received funding from the European Union Seventh Framework Programme (FP7-ICT-2013-10) as part of EASEL under grant agreement no. 611971.

REFERENCES

1. M. T. Chi, M. M. Bassok, M. W. Lewis, P. Reimann, and R. Glaser. Self-Explanations: How Students Study and Use Examples in Learning to Solve Problems. Cognitive Science, 13(2):145–182, Apr. 1989.

2. E. B. Coleman, A. L. Brown, and I. D. Rivkin. The Effect of Instructional Explanations on Learning From Scientific Texts. Journal of the Learning Sciences, 6(4):347–365, Oct. 1997.

3. J. Holmes. Designing agents to support learning by explaining. Computers & Education, 48(4):523–547, May 2007.

4. W. R. van Joolingen and T. de Jong. An extended dual search space model of scientific discovery learning. Instructional Science, 25(5):307–346, 1997.

5. D. Klahr. Exploring Science: The Cognition and Development of Discovery Processes. The MIT Press, Cambridge, 2000.

6. D. Klahr and K. Dunbar. Dual Space Search During Scientific Reasoning. Cognitive Science, 12(1):1–48, Jan. 1988.

7. A. W. Lazonder. Handbook of Research on Educational Communications and Technology, pages 453–464, 2014.

8. V. Manion and J. M. Alexander. The Benefits of Peer Collaboration on Strategy Use, Metacognitive Causal Attribution, and Recall. Journal of Experimental Child Psychology, 67(2):268–289, Nov. 1997.

9. R. Ploetzner, P. Dillenbourg, M. Preier, and D. Traum. Learning by explaining to oneself and to others. Collaborative-learning: Cognitive and Computational Approaches, pages 103–121, 1999.

10. S. D. Teasley. The role of talk in children's peer collaborations. Developmental Psychology, 31(2):207–220, 1995.

11. K. VanLehn and R. Jones. Learning by explaining examples to oneself: A computational model. In S. Chipman and A. L. Meyrowitz, editors, Foundations of Knowledge Acquisition: Cognitive Models of Complex Learning, volume 194, pages 25–82. Springer USA, Boston, MA, 1993.

12. L. Vygotsky. Mind in Society: The Development of Higher Psychological Processes. Harvard University Press, Cambridge, MA, 1978.

13. C. Zimmerman. The Development of Scientific Reasoning Skills. Developmental Review, 20(1):99–149, Mar. 2000.


A Comparison of Children Learning New Words from Robots, Tablets, & People

Jacqueline Kory Westlund, Leah Dickens, Sooyeon Jeong, Paul Harris, David DeSteno, & Cynthia Breazeal

aMIT Media Lab, bNortheastern University, cHarvard University

Abstract. This work investigates young children’s perceptions of social robots in a learning context. Because social robots are a relatively new technology, direct comparison to more familiar means of learning could give us useful insights. Here, we compared the efficacy of three sources of information (human, robot, and tablet/iPad) with respect to children’s rapid learning of new words. Our results suggested that in this simple case, all three interlocutors served equally well as providers of new words. However, children strongly preferred learning with the robot, and considered it to be more like a person than like an iPad. Follow-up work will examine more complex learning tasks.

Keywords. Education; learning; children; social robots.

INTRODUCTION

The development of children’s early oral language skills is critical for nearly all subsequent learning. Differences in children’s early vocabulary ability can predict differences in reading ability in middle and high school [1], which could magnify over time, inhibiting later growth [2]. Given the importance of language, it would be beneficial to find new ways to supplement the education of children who may not currently be getting enough support, instruction, or practice. We suggest that emerging technologies can help fill this gap.

Computers, tablets, iPads, and even robots are being introduced in many educational settings [3]. Technology has the advantages of being easily customizable, adaptive to individual learners, and broadly deployable. But despite the frequent success of these technologies, we often intuitively assume that humans have some "special sauce" that makes us more suited to being teachers and learning companions than any kind of technology. This may be especially true with regard to learning language, which, as a socially situated medium for sharing meaning, still seems a uniquely "human" ability.

To this end, we are exploring the effectiveness of technology, specifically robots, as language learning companions for children. Robots occupy a unique role because their embodiment allows them to employ more of the "human" behaviors and social cues that are recognized as crucial in language learning [4]. Children seem to readily learn words from both mobile devices [5] and robots [6], [7]. However, one concern about some of these prior studies is that the learning conditions presented may not reflect children's usual language learning, which often proceeds rapidly and without feedback from a teacher. As such, in this work we focus on one particular type of rapid, albeit approximate, word learning without feedback, known as "fast mapping" [8]. Although grasping the full meaning of a new word can take time, the initial mapping is often accomplished quickly. Accordingly, we ask whether children display a process of fast mapping with a social robot or a tablet, just as they would with a human interlocutor. We expected that children would learn equally well from the human and robot, and that the tablet would fare somewhat worse due to its lack of social embodiment. Furthermore, we probed children's perceptions of the robot in an attempt to understand how they construed it. The study is modeled closely on the procedure in [9].

METHODS

Nineteen children ages 4-6 (10 female, 9 male), from a Greater Boston area preschool serving a mainly middle-class population, participated in two sessions, set about one week apart. The experiment followed a within-subjects design.

In Session 1, children were first asked questions about whether they thought a robot was more like a person or like an iPad. Then, each child looked at three series of ten pictures of unfamiliar animals, presented one image at a time on the tablet. They viewed ten pictures with just the tablet, ten with the robot (Figure 1), and ten with the second experimenter (thirty total). The order of the interlocutors was counterbalanced to handle order effects. The order in which the pictures were presented was held constant across interlocutors. A Samsung Galaxy Tablet was used to present the animal pictures. When the tablet was the interlocutor, recorded human speech was played back through the tablet’s speakers. The robot was a DragonBot [10], which was teleoperated by a second experimenter.

Figure 1: Children viewed pictures of novel animals with the DragonBot as well as with a person or with the tablet.

During the picture viewing, the child’s interlocutor commented positively but uninformatively on the animal shown for 8 of the 10 pictures, e.g., “Look at that!” The remaining two animals were named, e.g., “Ooh, a kinkajou! See the kinkajou?” This presented the opportunity for fast mapping to occur. After each set of pictures, we measured children’s learning with a recall test. Finally, we asked the earlier questions again, and probed children’s preferences for learning from the human vs. robot vs. iPad.

In Session 2, we wanted to see whether children’s thoughts about robots had changed, and to test retention of the animal names they had learned. They were given the same recall tests and were asked the same sets of questions.

RESULTS

We found that, across the three conditions, children learned a mean of 4.3 of the 6 animals correctly (71.7% correct, SD=1.84). However, there were no significant differences across conditions in how many names were learned. In Session 2, children's retention was nearly as good, naming a mean of 3.9 of 6 animals correctly (65.0% correct, SD=1.48), indicating that they did learn the names.

Children expressed a strong preference for learning with the robot. After Session 1, 63.2% (12 of 17) children preferred the robot, 1 child preferred the iPad, 1 preferred the person, and 5 liked all three equally (two children were not asked this question in Session 1). After Session 2, 73.7% (14 of 19) children preferred the robot; 2 preferred the iPad, and 3 liked all three equally. Thus, although learning success appeared the same, enthusiasm was higher for the robot.

Regarding children's perceptions of the robot, the most telling questions were "When a robot answers a question, is it more like a person or more like an iPad?" and "When a robot teaches you something…" Prior to interacting with the robot, children were split in their answers ("Answers…": 52.6% person, 47.4% iPad; "Teaches…": 47.4% person, 52.6% iPad). After interacting, more children thought the robot was more like a person ("Answers…": 78.9% person, 21.1% iPad; "Teaches…": 68.4% person, 31.6% iPad). However, during the follow-up Session 2, some children reverted to their original opinion ("Answers…": 36.7% person, 63.2% iPad; "Teaches…": 68.4% person, 31.6% iPad). For the remaining questions, children generally thought the robot was more like a person.

DISCUSSION

We examined the efficacy of, as well as children’s subjective attitudes toward, three different sources of information (human, robot, and tablet) with respect to word learning. Our results suggested that in this simple case, contrary to our hypotheses, all three interlocutors served equally well as providers of novel animal names. We suspect that this is due to the simple nature of the learning task. When only one picture is shown and named, children need not observe the interlocutor’s social cues to understand what is being referred to by the novel name that is provided. Given that the key benefit provided by the robot and human over the tablet is their ability to offer social cues, it is understandable that, because these cues were not necessary, the tablet was equally well suited to the learning task.

However, children showed a clear preference for learning with the robot. Their enthusiasm and, therefore, likely engagement was higher with the robot. It is unclear whether this was merely a novelty effect. We suspect that given a sufficiently interesting activity with the robot, children’s preference for a robot over a tablet would not simply be novelty – recent work has shown that children can remain interested and engaged with a robot during educational games for a month or more [6], [7].

Regarding children’s perception of the robot, our results suggest that although children initially expect a robot to engage them just like any other technological tool, their perceptions of it rapidly change. Note that this shift was evident for the two questions in which children were invited to appraise the robot as an active, social partner, i.e., as an interlocutor that is able to teach and answer questions. They come to perceive it as being more “human,” more like a someone than a something, which suggests that they will attend to its social cues when they need to learn.

Follow-up work is now in progress to probe the social dimension further. We are looking at tasks that require social information for learning (e.g., gaze) and that more closely mirror what happens in "real life", such as when a child needs to determine which of multiple target objects is the referent. Because robots can operate in the same spaces that we do (while tablets are limited to a two-dimensional screen world), it is an interesting challenge to identify clear differences between the social capabilities of a human and a robot. Our future work will continue exploring how children learn from different agents, and which social cues are truly important for learning.

ACKNOWLEDGMENTS

This research was supported by the National Science Foundation (NSF) under Grant 122886 and Graduate Research Fellowship Grant No. 1122374. Any opinions, findings and conclusions, or recommendations expressed in this paper are those of the authors and do not represent the views of the NSF.

REFERENCES

1. M. Fish and B. Pinkerman, "Language skills in low-SES rural Appalachian children: Normative development and individual differences, infancy to preschool," J. Appl. Dev. Psychol., vol. 23, no. 5, pp. 539–565, 2003.

2. B. Hart and T. R. Risley, Meaningful differences in the everyday experience of young American children. ERIC, 1995.

3. C. K. Blackwell, A. R. Lauricella, E. Wartella, M. Robb, and R. Schomburg, “Adoption and use of technology in early education: The interplay of extrinsic barriers and teacher attitudes,” Comput. Educ., vol. 69, pp. 310–319, Nov. 2013.

4. P. L. Harris, “Trust,” Dev. Sci., vol. 10, pp. 135–138, 2007.

5. S. Judge, K. Floyd, and T. Jeffs, “Using Mobile Media Devices and Apps to Promote Young Children’s Learning,” in Young Children and Families in the Information Age, K. L. Heider and M. R. Jalongo, Eds. Springer Netherlands, 2015, pp. 117–131.

6. J. Kory Westlund and C. Breazeal, “The interplay of robot language level with children’s language learning during storytelling,” in Proceedings of the 2015 ACM/IEEE international conference on Human-robot interaction, Portland, OR, 2015.

7. J. Movellan, M. Eckhardt, M. Virnes, and A. Rodriguez, “Sociable robot improves toddler vocabulary skills,” in Proceedings of the 4th ACM/IEEE international conference on Human robot interaction, 2009, pp. 307–308.

8. S. Carey, “Beyond Fast Mapping,” Lang. Learn. Dev. Off. J. Soc. Lang. Dev., vol. 6, no. 3, pp. 184–205, Jul. 2010.

9. L. Markson and P. Bloom, “Evidence against a dedicated system for word learning in children,” Nature, vol. 385, no. 6619, pp. 813–815, Feb. 1997.

10. J. M. Kory, S. Jeong, and C. L. Breazeal, “Robotic learning companions for early language development,” in Proceedings of the 15th ACM on International conference on multimodal interaction, New York, NY: ACM, 2013, pp. 71–72.


How do diabetic children react on a social robot during multiple sessions in a hospital?

Rosemarijn Looijea, Mark A. Neerincxa,b and Johanna Petersc

aTNO
bDelft University of Technology
cUniversity of Groningen

Abstract. In the European project ALIZ-e, many aspects of social robot interaction were evaluated, mainly with healthy children. In this paper, we take the lessons learned and apply them in a field experiment with diabetic children. The observations showed that a robot requesting help added to the bonding, that the children with diabetes acquired relevant knowledge, seemed to appreciate the robot more than the healthy children in earlier experiments, and showed distinct profiles that set requirements for personalization.

Keywords: social robot, field testing, diabetes

INTRODUCTION

The European project ALIZ-e aimed at persistent long-term interaction of a robot for diabetic children aged 7-11. The project worked on models and methods for the robot's interactive behaviors to achieve long-term interaction and support the development of self-management attitudes, knowledge, skills and behaviors (e.g. self-efficacy, education, bonding). Within this project, experiments have been done with both sick and healthy children who use a robot that adapts to them on certain aspects (e.g., emotions influenced by the child [1], keeping the activity challenging for each child [2]). This paper presents our lessons learned and an experiment conducted in the wild (i.e., the hospital): an evaluation of the prototype showing the envisioned interaction with children with diabetes in the course of 3 sessions.

LESSONS LEARNED AND IMPLICATIONS

The present experiment was the last experiment within the ALIZ-e project and thus incorporated the lessons learned from the previous 4 years, evaluating them with the intended users (children with diabetes). The lessons are: (a) children are able to recognize the emotions of a NAO robot [3], (b) personality is hard to take into account [4], and (c) adapting the robot's state to the user [1], exhibiting thinking behavior [5] and remembering small facts [6] support positive interaction. Activities are more motivating when the activity is challenging [1] and it is possible to switch between activities [7]. Finally, we saw that children are willing to disclose information about themselves [8] and most children like touching the robot [9]. Based on these results, an experiment was designed in which children performed multiple activities over various sessions, from which the robot remembered some small facts, in an enclosed environment (robot playground, see Figure 1). Furthermore, during interaction the robot showed thinking behavior, emotions and interest in the child, while also disclosing information about itself. Next to this, the robot was dependent on the child to move from one point to another (walking or lifting). The general research aim was to get insight into the child's knowledge gain, activity preferences and profile characteristics for personalization.

Figure 1: Robot playground

EVALUATION

17 diabetic children aged 6-10 (M=8.24 yrs, SD=1.25 yrs) from the MeanderMC (Amersfoort, The Netherlands) participated in the experiment. We used tests (knowledge and self-efficacy), questionnaires (fun and self-determination) and observations (game preference, video and logging data) to quantify and qualify the interaction with the robot.

Every child had three sessions of about an hour with the robot in the hospital. These sessions were at least 14 days apart. The first session started with the self-efficacy questions and a knowledge test containing 32 questions, of which 8 were asked each session (24 in total and 8 as a reference). Then a short introduction about the activities was given. The activities were: a trivial-pursuit-like quiz, played on a swiveling tablet that can turn towards the robot and the child; a sorting game, played on a large, horizontally placed touch screen on which the robot and child have to put pictures (pizza, broccoli) into the correct category (low/high carbohydrates) on one of the sides of the display; and watching an educational video with the robot. Next to this, the robot was introduced as Charlie, who is in training to become a diabetes pal. He knows a bit about diabetes, but also has to learn a lot. The children could walk with Charlie from one activity to another. In between the activities, Charlie asked some questions about how they deal with diabetes, but also about their hobbies. Then they started with the quiz. In the second session, the children could choose which activity they wanted to start with (quiz or sorting game), while in the last session there was only time for one. At the end of each session, questions about fun and self-determination were asked, and after the third session there was also a post knowledge test.

RESULTS

The knowledge test showed significant differences in knowledge acquired. A paired samples t-test showed a significant increase in knowledge from the pre to the post test for the first 24 questions (first session M=11.35, SE=0.77; second session M=13.7, SE=0.66; t(16)=5.6, p<0.001). The final eight questions (25-32) did not show significant improvement (first session M=5.94, SE=0.34; second session M=6.29, SE=0.44; t(16)=1.19, p=0.250).
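A paired comparison of this kind can be illustrated as follows; the per-child scores are hypothetical placeholders for 17 children (hence df = 16), not the study's data:

    from scipy import stats

    # Hypothetical pre/post knowledge scores for 17 children (not the study's data).
    pre  = [10, 12, 11, 9, 13, 12, 10, 11, 14, 12, 11, 10, 13, 12, 11, 12, 10]
    post = [12, 14, 13, 11, 15, 13, 12, 13, 15, 14, 13, 12, 15, 14, 13, 14, 12]

    # Paired samples t-test (within-subject pre/post comparison), df = 17 - 1 = 16.
    t, p = stats.ttest_rel(post, pre)
    print(t, p)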

No time effects were observed for self-efficacy, fun and self-determination, due to ceiling effects (high scores overall).

The children preferred the sorting game and the quiz about equally. In the second session, 9 of the 17 children chose the sorting game as their favorite and 8 chose the quiz, and they also agreed to start with this game. In the third session, 8 children chose to play the sorting game and 9 the quiz.

After an analysis of the video and logging data based on grounded theory [10], 5 types of children were identified to which the robot could adapt its interaction in the future: 1) children who are confident about themselves and their illness, 2) children who feel excluded from the group, 3) children who are afraid to make errors, 4) children who feel uncomfortable with the situation, and 5) children who are too young to play the activities and have meaningful robot interaction.

CONCLUSION AND DISCUSSION

With questionnaires it is hard to acquire useful data from young children, due to ceiling effects. Experiments over a longer period of time can solve part of this problem. Furthermore, observations provide useful information, but take a lot of time to analyze. However, the observations provided the insight that the children actually learn something from the robot and that the interaction does not distract them from the subject matter. The user profiles provide a starting point for improving how the robot could adapt to certain types of users; for example, Charlie could be more supportive with children who act a bit shy.

In general, we noticed that a robot that was not all-knowing and was dependent on the child's help (e.g., when falling or going to another activity) evoked valuable behaviors and was appreciated by the children. We also saw that the minimal interaction with the experimenter and the shared space of child and robot created by the playground were beneficial for the child's involvement. Furthermore, we observed that children with diabetes seem more inclined to bond with the robot than the healthy children observed in previous studies (e.g. [1]). This could be inferred, among other things, from the gifts the children brought. It could be because they normally feel outside the group. Finally, because the children were brought to the experiment by their parents, who often waited in the same room as the experiment leader (outside the experiment room), we also got some idea about the home situation. In further research we will take the influence of the social environment on how a diabetic child deals with his/her illness more into account, i.e., the family life (home), the caretakers (hospital) and peers (diabetes camp).

ACKNOWLEDGMENTS

This work was (partially) funded by the ALIZ-E project (EU FP7: 248116). Furthermore, we would like to thank the MeanderMC, the participating children and their parents for making the experiment possible.

REFERENCES

1. M. Tielman, M. Neerincx, J.-J. Meyer, and R. Looije. Adaptive emotional expression in robot-child interaction. In Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction, pages 407-414. ACM, 2014.

2. J. B. Janssen, C. C. van der Wal, M. A. Neerincx, and R. Looije. Motivating children to learn arithmetic with an adaptive robot game. In 2011 ICSR Conference, pages 153-162, 2011.

3. I. Cohen, R. Looije, and M. A. Neerincx. Child's perception of robot's emotions: effects of platform, context and experience. International Journal of Social Robotics, 6(4):507-518, 2014.

4. S. M. B. Robben. It's NAO or Never! Facilitate Bonding Between a Child and a Social Robot: Exploring the Possibility of a Robot Adaptive to Personality. Unpublished master's thesis, Radboud Universiteit Nijmegen, 2011.

5. N. Wigdor. Conversational fillers for response delay amelioration in child-robot interaction. Master's thesis, University of Utrecht, 2014.

6. O. A. Blanson Henkemans et al. A social and personal robot providing diabetes self-management education for children with diabetes type 1: a randomized controlled trial.

7. J. Heeffer. Reasoning robots - knowledge structures and an introduction to agents. Research project, Content and Knowledge Engineering, Utrecht University, February 2012.

8. E. van der Drift. A social robot as a means to motivate and support diabetic children in keeping a diary. Master's thesis, University of Utrecht, 2013.

9. L. Solms. An exploration of the effects of touch on social bonding between robot and child. Technical report, TNO, 2014.

10. A. Strauss and J. M. Corbin. Basics of qualitative research: Grounded theory procedures and techniques. Sage Publications, Inc, 1990.


Learning A Second Language with a Socially Assistive Robot

Jacqueline Kory Westlunda*, Goren Gordona*, Samuel Spauldinga, Jin Joo Leea, Luke Plummera, Marayna Martinezb, Madhurima Dasb, & Cynthia Breazeala

aMIT Media Lab, bMIT

* These authors contributed equally to this work.

Abstract. We created a socially assistive robotic learning companion to support English-speaking children's acquisition of a new language (Spanish). In a two-month microgenetic study, 34 preschool children will play an interactive game with a fully autonomous robot and the robot's virtual sidekick, a Toucan shown on a tablet screen. Two aspects of the interaction are personalized to each child: (1) the content of the game (i.e., which words are presented), and (2) the robot's affective responses to the child's emotional state and performance. We will evaluate whether personalization leads to greater engagement and learning.

Keywords. Education; children; language learning; long-term interaction; play; social assistive robots.

INTRODUCTION

Preschool (3-5 years) is a critical time for children to learn language. Not only can early language ability greatly impact later educational success (e.g., [1], [2]), but also, learning the pronunciation and accent for a new language is age-sensitive [3]. This may be especially important for children who are newcomers to a country – the earlier they master the new language, the better.

For many children, the main problem faced in mastering a new language is a lack of resources in their homes and schools. Technological interventions can supplement children’s language education by providing additional instruction, support, and practice. However, passive media, like videos, can help children learn vocabulary, but not language structure [4]. Many of the interactive games available require reading or writing skills – fine for older children, but not for preschoolers who are generally still learning how to read. Very young children learn language best through social interaction.

To this end, we have created a social robot that can supplement children’s early language education. Social robots combine the unique advantages of technology – such as being easily customizable, adaptive to individual learners, and being deployable – with the necessary social cues and “human” behaviors that are crucial for language learning [5]. This robot is accompanied by a virtual sidekick, who appears on a tablet. Prior work has shown that young children will readily learn words from both mobile devices [6] and robots (e.g., [7], [8]). Furthermore, because children learn at different paces and in different ways, the robot will adapt both its affective responses and the material to be learned to each individual child. We ask whether this personalization will increase learning gains and overall engagement.

METHODS

Participants. Thirty-four children ages 3-5 (19 male, 15 female) from a “special start” preschool in the Greater Boston area have signed up for the study. Of these, 15 are classified as special needs and 19 as typically-developing.

Conditions. The study follows a 2x2 design of Development (Typical vs. Special) x Personalization (Personalized affective responding vs. no personalization).

Hypotheses. We expect that nearly all children will enjoy playing with the robot and will stay engaged over time. We expect that children who receive personalized affective feedback will exhibit greater learning gains overall.

Procedure. Each child will participate in eight 10min sessions with the robot. During each session, children will play with the robot and with a virtual character, a Toucan, who is shown on a tablet screen (Figure 1). The robot and child are situated as peer learners, while the Toucan speaks Spanish and supplies information about new Spanish words. The rest of the tablet screen contains the shared context for the games the robot, Toucan, and child play together. In each session, they play three games: (1) a review of the previous session, (2) a game “directed” by the robot in English, during which the Toucan introduces new Spanish words by saying, e.g., “Did you know that blue ball is pelota azul in Spanish?”, and (3) a game “directed” by the Toucan in Spanish, during which the robot supplies hints in English to help the child along.

The eight play sessions have content revolving around a trip to Spain: packing for the trip, visiting a zoo, having a picnic, and so forth. Each session provides the opportunity to both learn new words and review. For example, at the zoo, children can learn names of animals. The animals appear in later sessions as the Toucan’s friends, providing review.

All the speech in the interaction was pre-recorded, which allowed for more emotional expressivity, and pitch-shifted to make the voices sound more child-like. The robot’s voice was recorded by a native English speaker and the Toucan’s voice was recorded by a native Spanish speaker.

Figure 1: Children played with the robot Tega and the tablet, which featured a virtual toucan.

Robot. We are using the Tega robot (Figure 1), which was designed and built by members of the Personal Robots Group at the MIT Media Lab and their collaborators. An Android phone runs the robot's motor control software and displays the robot's animated face. The robot is fully autonomous. Control software coordinates the robot's behavior and the tablet game via ROS. This software follows a general script of the interaction flow, and receives sensory input from the tablet (such as when a child taps or drags an object on the screen) and from the Affdex emotion classifier from Affectiva [9] (including valence and engagement).
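A minimal sketch of what such ROS-based coordination might look like in rospy, assuming hypothetical topic names (the paper does not document the actual interface):

    import rospy
    from std_msgs.msg import String, Float32

    def on_tablet_event(msg):
        # e.g., a tap or drag event forwarded by the tablet game (hypothetical topic)
        rospy.loginfo("tablet event: %s", msg.data)

    def on_engagement(msg):
        # e.g., an engagement estimate from the Affdex classifier (hypothetical topic)
        rospy.loginfo("engagement: %.2f", msg.data)

    rospy.init_node("interaction_manager")
    rospy.Subscriber("/tablet/event", String, on_tablet_event)
    rospy.Subscriber("/affdex/engagement", Float32, on_engagement)
    rospy.spin()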

Personalization. The interaction is personalized in two dimensions. For all children, the content of the game – i.e., which Spanish words are taught – is personalized based on children's recognition of Spanish words in previous sessions, using an algorithm based on that described in [10]. The goal is to keep children in the zone of proximal development [11], such that they have a 50% chance of knowing the words used in a session.
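As a rough, deliberately simplified sketch of that idea (not the Bayesian active-learning algorithm of [10] itself): estimate each word's probability of being known from earlier responses and pick the words whose estimates lie closest to 0.5. All names and data below are hypothetical.

    def estimate_p_known(history):
        # Crude estimate: fraction of past trials in which the child recognized the word.
        return sum(history) / len(history) if history else 0.0

    def select_session_words(word_histories, n_words=5, target=0.5):
        # word_histories: {word: [1, 0, 1, ...]} recognition outcomes from previous sessions
        scored = {w: estimate_p_known(h) for w, h in word_histories.items()}
        # Choose the words whose estimated probability of being known is closest to the target.
        return sorted(scored, key=lambda w: abs(scored[w] - target))[:n_words]

    # Usage with hypothetical data:
    histories = {"pelota": [1, 1, 0], "jirafa": [0], "manzana": [1, 1, 1], "tren": [0, 1]}
    print(select_session_words(histories, n_words=2))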

For half the children, the robot's affective responses will be personalized to the child's performance and emotional state. Measurements of the child's engagement (high/low) and valence (positive/neutral/negative) from Affdex, on-task/off-task behavior (measured by whether the child interacts with the tablet) and right/wrong answers (on the last task) are combined into a reward signal for an online reinforcement learning algorithm (SARSA), with the goal of maximizing high engagement and positive valence. This personalizes a policy governing both the robot's non-verbal (e.g., facial expressions) and verbal responses to each child following specific tasks in the game (e.g., if the child performed a task correctly, the robot would respond both with the game-related response, such as "good job," and an appropriate affective response).
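A minimal sketch of such a loop, assuming a hand-picked reward weighting and hypothetical state/action encodings (the paper does not specify these details):

    import random
    from collections import defaultdict

    ACTIONS = ["cheer", "calm_encouragement", "neutral"]  # hypothetical affective responses
    ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

    Q = defaultdict(float)  # Q[(state, action)]

    def reward(engagement, valence, on_task, correct):
        # Hypothetical weighting; rewards high engagement and positive valence most.
        return 1.0 * engagement + 1.0 * valence + 0.5 * on_task + 0.5 * correct

    def choose(state):
        # Epsilon-greedy action selection over the learned Q-values.
        if random.random() < EPSILON:
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: Q[(state, a)])

    def sarsa_update(s, a, r, s_next, a_next):
        # SARSA: on-policy temporal-difference update.
        Q[(s, a)] += ALPHA * (r + GAMMA * Q[(s_next, a_next)] - Q[(s, a)])

    # One interaction step (in practice observations would come from Affdex and the tablet log).
    s = ("high", "positive")
    a = choose(s)
    r = reward(engagement=1, valence=1, on_task=1, correct=0)
    s_next = ("high", "neutral")
    a_next = choose(s_next)
    sarsa_update(s, a, r, s_next, a_next)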

Measures. Before session 1, after session 4, and after session 8, we will ask each child a set of questions about how they perceive the robot (e.g., whether they think the robot is more similar to a person or a tablet on various dimensions). Children will also perform an Anomalous Picture Task, in which they view two pictures of animals in strange situations (e.g., a giraffe in a dining room) with an experimenter (before session 1, as a baseline) and with the robot (after session 1 and after session 8). The child's interlocutor comments once after 10 seconds (e.g., "That's so silly!"). The goal is to see how many spontaneous questions, comments, and laughs children produce, which gives us insight into how they construe the robot as a social other (e.g., since people laugh most in social scenarios [12], do children laugh as much with the robot as with the person?).

In addition, we will give children an initial Spanish vocabulary assessment based on the Peabody Picture Vocabulary Test (PPVT) [13] before session 1 and after session 8. Children will also perform a curiosity task that allows them to freely explore a graphical scene on a tablet, giving insight into how curious they are. Each child's parents and teacher will fill out a questionnaire on the child's learning preferences. Finally, teachers will be asked to fill out a questionnaire probing their perception of and attitudes toward robots in the classroom, first before session 1 and then after session 8, to see whether their opinions have changed.

We will also record audio and video of each session, as well as logging Affdex data and actions taken on the tablet.

PROGRESS

This work adds to the growing body of literature on socially assistive robots in education. We are performing one of the first microgenetic studies with a fully autonomous, adaptive robotic learning companion for preschool children and for preschool children with special needs. Data collection is currently ongoing at the preschool.

We expect that the results of this work will inform the design of future robotic learning companions. We hope to understand how personalizing the robot's affective feedback and the game's content can affect children's motivation and learning, with the ultimate goal of developing more effective educational tools that can engage children as peers and leverage the social and playful nature of children's natural early learning environments.

ACKNOWLEDGMENTS We would like to thank the Personal Robots Group for their support. We would also like to thank Vicky for recording her voice for the Toucan’s speech. This research was supported by the National Science Foundation (NSF) under Grant CCF-1138986 and Graduate Research Fellowship Grant No. 1122374. Any opinions, findings and conclusions, or recommendations expressed in this paper are those of the authors and do not represent the views of the NSF.

REFERENCES

1. M. M. Paez, P. O. Tabors, and L. M. Lopez, "Dual language and literacy development of Spanish-speaking preschool children," J. Appl. Dev. Psychol., vol. 28, no. 2, pp. 85–102, 2007.

2. B. Hart and T. R. Risley, Meaningful differences in the everyday experience of young American children. ERIC, 1995.

3. C. E. Snow and M. Hoefnagel-Höhle, “Age Differences in the Pronunciation of Foreign Sounds,” Lang. Speech, vol. 20, no. 4, pp. 357–365, Oct. 1977.

4. L. R. Naigles and L. Mayeux, “Television as incidental language teacher,” Handb. Child. Media, pp. 135–152, 2001.

5. P. L. Harris, “Trust,” Dev. Sci., vol. 10, pp. 135–138, 2007.

6. S. Judge, K. Floyd, and T. Jeffs, “Using Mobile Media Devices and Apps to Promote Young Children’s Learning,” in Young Children and Families in the Information Age, K. L. Heider and M. R. Jalongo, Eds. Springer Netherlands, 2015, pp. 117–131.

7. J. Kory Westlund and C. Breazeal, “The interplay of robot language level with children’s language learning during storytelling,” in Proceedings of the 2015 ACM/IEEE international conference on Human-robot interaction, Portland, OR, 2015.

8. J. Movellan, M. Eckhardt, M. Virnes, and A. Rodriguez, “Sociable robot improves toddler vocabulary skills,” in Proceedings of the 4th ACM/IEEE international conference on Human robot interaction, 2009, pp. 307–308.

9. “Affectiva,” 2015. [Online]. Available: http://www.affectiva.com/.

10. G. Gordon and C. Breazeal, “Bayesian Active Learning-based Robot Tutor for Children’s Word-Reading Skills.,” in Proceedings of the 29th AAAI Conference on Artificial Intelligence, 2015.

11. L. S. Vygotsky, Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press, 1978.

12. R. R. Provine, Laughter: A Scientific Investigation. Penguin, 2001.

13. L. M. Dunn and L. M. Dunn, Peabody Picture Vocabulary Test. 4th ed. Pearson Assessments, 2007.


Let's Be Friends: Perception of a Social Robotic Companion for children with T1DM

Ivana Kruijff-Korbayováa, Elettra Olearib, Clara Pozzib, Francesca Sacchitellib, Anahita Bagherzadhalimib, Sara Bellinib, Bernd Kiefera, Stefania Racioppaa, Alexandre Coninxc, Paul Baxterd, Bert Biermane, Olivier Blanson Henkemanse, Mark Neerincxe, Rosemarijn Looijee, Yiannis Demirisc, Raquel Ros Espinozac, Marco Mosconib, Piero Cosif, Rémi Humbertg, Lola Cañameroh, Hichem Sahlii, Joachim de Greeffd, James Kennedyd, Robin Readd, Matthew Lewish, Antoine Hiolleh, Giulio Pacif, Giacomo Sommavillaf, Fabio Tesserf, Georgios Athanasopoulosi, Georgios Patsisi, Werner Verhelsti, Alberto Sannab, Tony Belpaemed

aDFKI, Language Technology Lab (Saarbruecken, Germany)
bFondazione Centro San Raffaele, eServices for Life and Health (Milano, Italy)
cImperial College London (London, United Kingdom)
dThe Cognition Institute, Plymouth University (Plymouth, United Kingdom)
eOrganization for Applied Scientific Research (Soesterberg, The Netherlands)
fNational Research Council – ISTC (Padova, Italy)
gAldebaran Robotics (Paris, France)
hUniversity of Hertfordshire (Hatfield, United Kingdom)
iVrije Universiteit, Electronics & Informatics Dept. (Brussels, Belgium)

Abstract. We describe the social characteristics of a robot developed to support children with Type 1 Diabetes Mellitus (T1DM) in the process of education and care. We evaluated the perception of the robot at a summer camp where diabetic children aged 10-14 experienced the robot in group interactions. Children in the intervention condition additionally interacted with it individually, in one-to-one sessions featuring several game-like activities. These children perceived the robot significantly more as a friend than those in the control group. They also readily engaged with it in dialogues about their habits related to a healthy lifestyle as well as personal experiences concerning diabetes. This indicates that the one-on-one interactions added a special quality to the relationship of the children with the robot.

Keywords: Social robots, Child-Robot Interaction, diabetes, Off-Activity Talk, self-disclosure, social skills, social robot perception.

INTRODUCTION

Type 1 Diabetes Mellitus (T1DM) is a chronic disease that affects a shocking 17,000 new children, mostly under 14 years old, per year in Europe [1]. T1DM is an overwhelming pathology that can cause life-threatening complications. It requires children of all ages to learn to constantly manage their condition in terms of glycaemia monitoring and insulin injection. This necessitates a major change in their lifestyle [2].

The present work stems from the Aliz-E project [3], in which we investigated the use of a humanoid social robot to support children with T1DM on their way to self-management. A social robot system was developed and instantiated in a Robot Theatre to facilitate child-robot interaction [4]. It was deployed in real-life settings during two editions of a Diabetes Summer Camp, in 2013 and 2014, organized by the Italian families association "Sostegno70 – insieme ai ragazzi diabetici ONLUS" and the team of the Pediatric unit of Ospedale San Raffaele (Milan, Italy).

During the 2013 summer camp we experimented with introducing so-called Off-Activity Talk (OAT) to engage children in conversations about topics related to diabetes and healthy lifestyle as part of one-on-one interactions around gaming touchpoints with the robot. Details about the experiment design and a comparison of the effects of individual interactions with and without OAT were presented in [5]. We also observed that children who participated in the individual interactions exhibited significantly stronger adherence to the medical advice to fill in a nutritional diary than children who only participated in group interactions with the robot.

We hypothesized that this might be due to a different quality of the child-robot relationship established through the individual interaction. This inspired us to further investigate the effect(s) of individual interactions on children's perception of the robot during the 2014 edition of the camp. This paper presents the method and the results of the 2014 experiment.

EXPERIMENT GOALS AND METHODOLOGY

Goals

The aim of the 2014 summer camp experiment was to further investigate the children's (i) perception of the social robotic companion; (ii) expectations about having a robotic companion in their daily life; and (iii) willingness and spontaneity to talk freely about their diabetes condition.

Design

The experiment was held in August 2014 during a ten-day-long Diabetes Summer Camp for T1DM children. All the children at the camp had the opportunity to experience the robot in scripted "theater" performances during collective evening recreational activities. Out of the 41 children attending the camp, 28 volunteered to participate in the study. The study had a between-subjects design with two conditions: (1) the control condition, constituted by children who only experienced the social robot as a theater-performance character, but did not interact individually with it; and (2) the intervention condition, in which children had the additional possibility to interact individually with the social robot.

The individual interactions for the intervention condition were carried out using the Robot Theatre described in [4] in a partially Wizard-of-Oz setup and were centered around three activities, among which the children could freely choose and switch: a quiz game, a sorting game and a creative dance activity (see Figure 1). More details about the activities can be found in [4] and [5].

During these interactions the robot elicited off-activity talk as described in [5] and exhibited the following social behavior characteristics discussed in [6]: the ability to express recognition and familiarity (e.g., using the child's name, referring to previous joint experiences); non-verbal bodily cues [7]; turn taking during game playing [8][9]; allowing children to touch it and responding to touch; and occasionally making mistakes, which helps children to feel at ease.

Measures

Children's perception of the robot and their expectations about the possibility to have a robotic companion were measured by questionnaires. Children's willingness and spontaneity to talk about diabetes were evaluated by 3 raters who independently assessed every OAT sub-dialogue regarding diabetes.

Figure 1: Left to right: the quiz game, the sorting game, the creative dance activity

RESULTS

The robot was described as a friend (as opposed to a pet, toy, adult, or computer) significantly more often by the intervention group than by the control group (χ2 = 20.09, significant at the 1% level; two-tailed p = 0.0001). Instead, there was a tendency in the control group to ascribe machine-like characteristics to the robot, unlike in the intervention group. The children's willingness and spontaneity to talk about diabetes was mostly high. Qualitatively, all coders noticed a common positive attitude in sharing practical notions about diabetes, and often also personal experiences, with the robot. The majority of children in the intervention group would like to meet the social robotic companion again (most preferred at home rather than at school, in the hospital, or at summer camp) or to own one. The reason was the playful character or the relational aspect in the majority of cases. This unique relationship also had a positive impact on the educational aspects of the interaction.
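Such a comparison of description frequencies between groups can be run as a contingency-table test; the counts below are hypothetical placeholders, not the study's data:

    from scipy.stats import chi2_contingency

    # Rows: intervention, control; columns: described robot as "friend" vs. any other category.
    # Hypothetical counts, for illustration only.
    table = [[12, 2],
             [3, 11]]

    chi2, p, dof, expected = chi2_contingency(table)
    print(chi2, p, dof)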

CONCLUSIONS

The individual interactions led the children to perceive the robot as a peer. They do not feel judged, but rather encouraged to learn and exchange knowledge. This finding underlines the potential of such a robotic companion. It shows that children are willing to let a robot enter such a delicate and personal dimension, which is extremely important for fostering companionship to support children with diabetes.

REFERENCES

[1] http://www.idf.org/diabetesatlas/europe

[2] Freeborn D., Dyches T., Roper S.O., Mandleco B. (2013). Identifying challenges of living with type 1 diabetes: child and youth perspectives. Journal of Clinical Nursing, 22/13-14, 1890–1898.

[3] http://www.aliz-e.org

[4] Coninx, A., Baxter, P., Oleari, E., Bellini, S., Bierman, B., Blanson Henkemans, O., Cañamero, L., Cosi, P., Enescu, V., Ros Espinoza, R., Hiolle, A., Humbert, R., Kiefer, B., Kruijff-Korbayová, I., Looije, R., Mosconi, M., Neerincx, M., Paci, G., Patsis, G., Pozzi, C., Sacchitelli, F., Sahli, H. et al. (in press). Towards Long-Term Social Child-Robot Interaction: Using Multi-Activity Switching to Engage Young Users. To appear in Journal of Human-Robot Interaction. http://www.coninx.org/work/ALIZE-JHRI-preprint.pdf

[5] Kruijff-Korbayová, I., Oleari, E., Baroni, I., Kiefer, B., Zelati, M. C., Pozzi, C. et al. (2014). Effects of Off-Activity Talk in Human-Robot Interaction with Diabetic Children. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication (RoMan 2014), 649-654.

[6] Nalin, M., Baroni, I., Sanna, A. & Pozzi, C. (2012). Robotic Companion for Diabetic Children. In Proceedings of the 11th International Conference on Interaction Design and Children, 260-263.

[7] Brooks A. G., Arkin R. C. (2007). Behavioural overlays for non-verbal communication expression on a humanoid robot. Autonomous Robots, 22/1, 55-74.

[8] Nalin, M., Baroni, I., Kruijff-Korbayová, I., Cañamero, L., Lewis, M., Beck, A. et al. (2012). Children's adaptation in multi-session interaction. In Proceedings of the 21st IEEE International Symposium on Robot and Human Interactive Communication (RoMan 2012).

[9] Baxter, P., Wood, R., Baroni, I., Kennedy, J., Nalin, M. & Belpaeme, T. (2013). Emergence of Turn-taking in Unstructured Child-Robot Interactions. In Proceedings of the 8th ACM/IEEE International Conference on Human Robot Interaction (HRI), 77-78.


The Ethics of Human-Robot Relationships

Maartje M.A. de Graaf
University of Twente, Department of Communication Science

Abstract. Currently, human-robot interactions are constructed according to the rules of human-human interactions, inviting users to interact socially with robots. Is there something morally wrong with deceiving humans into thinking they can foster meaningful interactions with a technological object? Or is this just a logical next step in our technological world? Would it be possible for people to treat robots as companions? What implications does this have for future generations, who will be growing up in the everyday presence of robots? Is companionship between humans and robots desirable? This paper fosters a discussion on the ethical considerations of human-robot relationships.

Keywords: Companion Robots, Human-Robot Relationships, Robot Ethics.

INTRODUCTION

Relationships with others lie at the very core of human existence, as humans are conceived within relationships, born into relationships, and live their lives within relationships with others [1]. Since computer technologies increasingly interact with us in complex and humanlike ways, the psychological aspects of our relationships with them take on an increasingly important role [2]. In the future, robots are expected to serve humans in various social roles such as nursing, child and elder care, and teaching environments. Robots in these social roles, in addition to their functional requirements, also include socially interactive components [7]. Besides performing their monitoring and assistive tasks, these robots must also engage in social interaction and create (trust) relationships with their users in order to achieve their goals (e.g., improving an elderly person's health). Regardless of the moral or ethical implications, these robotic companions will enter our everyday lives as soon as their abilities are technically feasible for application in domestic environments. This calls for an evaluation of the ethics of human-robot relationships.

APPLYING ETHICS TO A NEW TECHNOLOGICAL GENRE

Applying ethics to robotics depends on the way we perceive robots [17]. Perceiving robots as nothing more than machines means that they do not have any hierarchically higher characteristics, nor will they be provided with consciousness, free will, or a level of autonomy superior to that embodied by the designer. Yet, the experience of interacting with robots appears to be fundamentally different from how people interact with most other technologies, as human-robot interaction often has a strong social or emotional component.

Robots differ most from other technologies in that they are autonomous, mobile, are sometimes built as replications of humans or animals, and are often designed to effect action at a distance. According to the media equation theory [11], technological objects can be evaluated as social entities with a minimum of social cues. This theory has also been successfully applied to the field of robotics [8]. People tend to ascribe a level of intelligence and sociability to robots, which influences their perceptions of how the interactions should proceed. Robots capable of natural language dialog raise a user's expectations not only with respect to the natural language, but also with regard to the intentionality of both verbal and nonverbal behaviors, the robot's autonomy, and its awareness of the sociocultural context [14]. It is likely that robots enabled with sociable interaction features, such as familiar humanlike gestures or facial expressions in their designs, will further encourage people to interact socially with them in a fundamentally unique way.

Furthermore, the autonomous behaviors of robots are likely to be associated with intentionality, which induces and strengthens a sense of agency in robots. Agency refers to the capacity to act and carries the notion of intentionality [5]. It has been argued that robots, being physically embedded and enabled with sociable interaction, create a unique and affect-charged sense of active agency similar to that of living entities [18]. This might make human-robot interaction, in a sense, more like interacting with an animal or another person than like interacting with a technology.

Thus, there is a specific quality of modern robotics that is very relevant to how our world is changing: robots are a new form of living glue between our physical world and the digital universe we have created. We have invented a new species, part material and part digital, that will eventually have superhuman qualities in both worlds at once. This means that we need to perceive robots as an evolving new species, consider them to have autonomy and consciousness, and create them with moral and intellectual dimensions that will exceed those of humans.

ARE HUMAN-ROBOT RELATIONSHIPS MORALLY WRONG?

The goal of this essay is to discuss the contribution of human-robot relationships to the good life. There is little doubt that people are capable of bonding with robotic others [9], and that they might even benefit from these relationships in particular situations [3]. Therefore, there seems to be nothing intrinsically wrong with human-robot relationships. And considering companion robots as something unethical because their effectiveness depends on deception oversimplifies the issues [13]. The currency of all human social relationships is performance [6], and rather than labeling that as a bad thing, this is simply how things are [16]. People are always performing for other people, and now the robots too will perform. However, robots cannot be our Aristotelian friends and they cannot really care about us [4]. Thus, we need to make sure that human-robot relationships do not replace their human counterparts, as Sparrow and Sparrow [15] rightfully fear. Another concern here is that if we come to accept these simulacral friendships, this might degrade our friendships with other humans as well.

FUTURE DIRECTIONS FOR THE ETHICS OF HUMAN-ROBOT RELATIONSHIPS

Despite the relevance of the consequences of innovations, they have received little attention in the literature. One reason for this neglect might be that companies supplying the innovation are often sponsors of innovation research, and these companies silently assume that the consequences of their innovations will be positive. Another reason for the underexposure of the consequences of innovations is that the usual questionnaires are less appropriate for the investigation of the impact of innovations. Studying the impact of innovations ideally requires multiple observations over extended periods of time. A final reason is the difficulty of measuring consequences. People are often not fully aware of the consequences of the introduction of an innovation, resulting in incomplete and misleading conclusions when only studying people's opinions about possible consequences.

It is necessary to conduct research on ethics and rights for robots in different cultural settings and contexts, because different cultures and religions have different 'virtues' and 'vices', stemming from different worldviews and leading to different answers to the same questions about moral standing towards robots [10].

So if robots are like us and in the future we will interact with them in a 'natural' social way, the deep issues of robot ethics will come to an end. Whether biological or technical, sentient beings will belong to the same genus. Of course, this vision does not appeal to many, and if that is the case, we must initiate efforts to understand our uniqueness and ensure that technology remains a tool, not a partner. As such, ethics for robotics could be redefined as safety regulations [12], however complex they may be.

REFERENCES

1. E. Berscheid and L.A. Peplau, "The emerging science of relationships", in Close relationships edited by H.H. Kelly et al., New York, NY, USA: WH Freeman, 1983, pp. 1-19.

2. T.W. Bickmore and R.W. Picard, Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction 12, 293-327 (2005).

3. E. Broadbent, R. Stafford and B. MacDonald, Acceptance of healthcare robots for the older population: review and future directions. International Journal of Social Robotics 1, 319-330 (2009).

4. M. Coeckelbergh, "Care robots, virtual virtue, and the best possible life", in The good life in a technological age edited by P. Brey, A. Briggle and E. Spence, New York, NY, USA: Routledge, 2012, pp. 281-292.

5. J. Dewey, Art as experience. New York, NY, USA: Perigee Books, 1980.

6. E. Goffman, The Presentation of Self in Everyday Life. New York, NY, USA: Doubleday, 1956.

7. D. Feil-Seifer and M.J. Mataric, Socially Assistive Robotics. IEEE Robotics & Automation Magazine 18, 24-31 (2011).

8. P.H. Kahn, B. Friedman, D.R. Perez-Granados and N.G. Freier, Robotic pets in the lives of preschool children. Interaction Studies 7, 405-436 (2006).

9. A. Kerepesi, E. Kubinyi, G.K. Jonsson, M.S. Magnusson and A. Miklosi, Behavioural comparison of human-animal (dog) and human-robot (AIBO) interactions. Behavioural processes 73, 92-99 (2006).

10. K.F. MacDorman and S.J. Cowley, Long-term relationships as a benchmark for robot personhood. Proceedings of RO-MAN 2006, Hatfield, UK

11. B. Reeves and C. Nass, The media equation: How people treat computers, television, and new media like real people and places. New York, NY, USA: CSLI Publications, (1996)

12. R.S. Rosenberg, The social impact of intelligent artifacts. AI & Society 22, 367-383 (2008).

13. A.J.C. Sharkey and N.E. Sharkey, Granny and the robots: Ethical issues in robot care for the elderly. Ethics in Information Technology 14, 27-40 (2012).

14. R. Simmons, M. Makatchev, R. Kirby, M.K. Lee, I. Fanaswala, B. Browning, J. Forlizzi and M. Sakr, Believable robot characters. AI Magazine 32, 39-52 (2011).

15. R. Sparrow and L. Sparrow, In the hands of machines?: The future of aged care. Minds and Machines 16, 141-161 (2006).

16. S. Turkle, Alone together: Why we expect more from technology and less from each other, New York, NY: Basic Books, (2011).

17. G. Veruggio and F. Operto, "Roboethics: A bottom-up interdisciplinary discourse in the field of applied ethics in robotics", in Ethics and Robotics edited by R. Capurro and M. Nagenborg, Amsterdam, The Netherlands: IOS Press, 2006, pp. 3-8.

18. J.E. Young, J.Y. Sung, A. Voida, E. Sharlin, T. Igarashi, H.I. Christensen and R.E. Grinter, Evaluating Human-Robot Interaction. International Journal of Social Robotics 3, 53-67 (2011).


Enhancing Nao Expression of Emotions Using Pluggable Eyebrows

Albert De Beir, Hoang-Long Cao, Pablo Gómez Esteban, Greet Van de Perre and Bram Vanderborght

Vrije Universiteit Brussel, Robotics & Multibody Mechanics Group

Abstract. Robots can express emotions for better Human-Robot Interaction (HRI). In this field, the Nao robot is a widely used platform. It mainly expresses emotions through gestures and colored LED eyes, but because of its white, flat and inanimate face it cannot produce facial expressions. In this work we propose a pluggable eyebrows device with two separate degrees of freedom. A short survey is conducted to qualitatively evaluate the usefulness of this device.

Keywords— facial expressions, emotions, NAO robot

INTRODUCTION

Humans communicate by two different means: directly with speech, or indirectly through facial expressions and body language. In human communication, more than half of the information is conveyed non-verbally [1]. Consequently, social robots used in HRI should be able to communicate non-verbally in a similar way. In this work, we introduce a device that improves the emotional expressiveness of the NAO robot. We focus on Nao because it is the most widely known and used social humanoid robot in HRI experiments.

RELATED WORK

Many methods have been developed to express emotions with the Nao robot. One of the best known is to use a variety of postures as body language [2], [3]. Although this method works quite well in certain circumstances [3], it is not appropriate for realistic HRI situations: whenever the robot aims to express an emotion, it interrupts its current task and performs the expressive posture for a few seconds, worsening the overall interactive experience. A solution proposed by [4] to overcome this limitation is to use facial expressions through Nao's eyes instead of body language. The eyes are surrounded by RGB LEDs, and colors can be associated with emotions (for example, red for anger). Improving on this approach, [5] suggested combining and synchronizing the RGB LEDs to simulate eye shapes and a blinking behavior. However, eye colors and shapes alone are not sufficient to successfully express emotions during a realistic social interaction [6]. In this work we propose to express emotions using actuated eyebrows, as these are considered very useful for expressing emotions [6].

METHOD

The main difficulty in designing such a device for Nao comes from the lack of space available for the actuators. Therefore, we propose a design in which two micro servo motors are placed at the back of Nao's head, supported by a 3D-printed structure in acrylonitrile butadiene styrene (ABS) which is clipped around the head. The torque needed to move the eyebrows is transmitted from the back to the front of the head through a rigid cable, which is sufficiently stiff to act as a push-pull mechanism. As illustrated in figure 2, the rotation of the servo is converted into a translation of the cable (cable in red); at the front of the head, this translation is converted back into a rotation of the eyebrow. The servo motors are controlled by an Arduino Nano board. As the micro servos and the Arduino board have a small power consumption, the board can be connected directly to the Nao robot through the USB port at the back of its head, without any additional battery. The eyebrows device is programmable using Aldebaran's Choregraphe software. In order to control the 2 DoFs of the Nao eyebrows we created a box called Eyebrows, which takes two string inputs specifying the eyebrow positions. This box was built on top of the ArduiNao library (www.humarobotics.com/en/robotics-lab/nao-and-arduino.html), which allows users to communicate between Choregraphe and Arduino. With the Eyebrows box, users can control the 2 DoFs of the Nao eyebrows from their Choregraphe program. The final result is an easy-to-clip module that can be plugged onto or removed from Nao's head in a few seconds and does not require any additional support. A short video illustrating the motion of the eyebrows is available at: youtube.com/watch?v=C55mAOCT__0. We hypothesize that the recognition rates of the anger and sadness emotions will be higher with this eyebrows device than without it.

Fig. 1. Nao expressing emotions with eyebrows. Top: anger. Bottom: sadness.
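To make the control path more concrete, the sketch below shows one minimal way the two eyebrow angles could be sent from a PC (or a Choregraphe Python box) to the Arduino Nano over the serial link. It is an illustration only, not the authors' implementation: the port name, baud rate, text protocol and angle values are all assumptions, and the ArduiNao library used in the actual system is not shown.

# Illustrative sketch only (not the authors' implementation): sends target angles
# for the two eyebrow servos to an Arduino over a serial link, assuming the
# Arduino runs a small sketch that parses "left,right\n" angle pairs.
import time

import serial  # pyserial

PORT = "/dev/ttyACM0"   # hypothetical serial port; adapt to the actual setup
BAUD = 9600

# A few symbolic eyebrow poses, given as (left_angle, right_angle) in degrees.
POSES = {
    "neutral": (90, 90),
    "anger":   (60, 120),   # inner ends lowered
    "sadness": (120, 60),   # inner ends raised
}

def send_pose(link, pose_name):
    """Send one eyebrow pose as a comma-separated angle pair."""
    left, right = POSES[pose_name]
    link.write(f"{left},{right}\n".encode("ascii"))

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        time.sleep(2)            # give the Arduino time to reset after the port opens
        for name in ("anger", "neutral", "sadness", "neutral"):
            send_pose(link, name)
            time.sleep(1.5)      # hold each expression briefly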


Fig. 2. Actuation of the eyebrows. Top: complete mechanism. Bottom: middle section. The orange arrows show the conversion between rotation and translation.

Experimental procedure

To assess the functionality of this device, an online questionnaire was filled in by 70 voluntary participants (23 were rejected because they did not complete it). All participants were third-year psychology bachelor students and, as such, had no prior experience with robots. This exploratory questionnaire was made using LimeSurvey (www.limesurvey.org) and contained 8 open questions. Participants were randomly split into two groups. For each group, 8 pictures of Nao expressing emotions were presented in a random order. For both groups, the pictures contained emotional expressions from the literature: 4 pictures using body language [4], 2 pictures using eye colors, and 2 combining body language and eye colors. 2 of these pictures showed neutral expressions, 3 represented anger and 3 sadness. In the second group the eyebrows device was used, making the first group the control group. Figure 1 shows an example of the pictures used in the study. For each picture, participants had to write in the questionnaire the emotion that, according to them, was expressed by the Nao robot. Participants had to guess the robot's emotions, as none were suggested during the survey. Finally, participants were encouraged to answer 'I do not know' when this was the case, to ensure that they were not answering randomly.

RESULTS & DISCUSSION

Participants were separated into two groups: n = 21 for the group without eyebrows and n = 26 for the group with eyebrows. A content analysis was performed on the participants' answers: responses were considered correct when their meaning was similar or close to the targeted emotion. The results revealed that the recognition rate is greatly improved by the eyebrows device. The recognition rate of sadness increased by 32.7 percentage points (from 5.8% without eyebrows to 38.5% with eyebrows). More impressively, the recognition rate of anger improved by 80.6 percentage points (from 14.2% without eyebrows to 94.8% with eyebrows). A forced-choice answer format would probably yield even higher recognition rates. Finally, we believe that this eyebrows device can have other applications. For example, small variations of the eyebrows' orientation around the neutral position can add life-like behavior to the robot, and small movements of the face when speaking (as is already done with the rest of the body) will certainly increase Nao's impression of aliveness.
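As a minimal illustration of the arithmetic behind the reported figures, the snippet below computes a recognition rate from coded answers and the percentage-point difference between two conditions; the coded answers shown are made up and do not reproduce the study data.

# Minimal sketch with made-up coded answers (not the study data).
def recognition_rate(coded_answers):
    """coded_answers: list of booleans, True = answer judged close to the target emotion."""
    return 100.0 * sum(coded_answers) / len(coded_answers)

# Hypothetical coded answers for one emotion, one entry per rating.
without_eyebrows = [True, False, False, False, False, True, False]
with_eyebrows    = [True, True, True, False, True, True, True]

rate_without = recognition_rate(without_eyebrows)
rate_with = recognition_rate(with_eyebrows)
print(f"without: {rate_without:.1f}%  with: {rate_with:.1f}%  "
      f"difference: {rate_with - rate_without:.1f} percentage points")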

CONCLUSION

In this work we have proposed a novel eyebrows device for the Nao robot. The device is easy to use and can be controlled directly from the Choregraphe programming environment. In addition, a validation study was conducted showing that the recognition rate of emotions is greatly increased by adding the eyebrows device to the NAO robot.

ACKNOWLEDGMENT

This work is funded by the European FP7 project DREAM, grant no. 611391.

REFERENCES

[1] Albert Mehrabian. Nonverbal communication. Transaction Publishers, 1977.

[2] Iris Cohen, Rosemarijn Looije, and Mark A. Neerincx. Child's recognition of emotions in robot's face and body. In Proceedings of the 6th International Conference on Human-Robot Interaction, pages 123-124. ACM, 2011.

[3] Aryel Beck, Lola Canamero, Kim Bard, et al. Towards an affect space for robots to display emotional body language. In RO-MAN 2010, pages 464-469. IEEE, 2010.

[4] Markus Haring, Nikolaus Bee, and Elisabeth Andre. Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In RO-MAN 2011, pages 204-209. IEEE, 2011.

[5] Jillian Greczek, Katelyn Swift-Spong, and Maja Mataric. Using eye shape to improve affect recognition on a humanoid robot with limited expression.

[6] Jelle Saldien, Kristof Goris, Bram Vanderborght, Johan Vanderfaeillie, and Dirk Lefeber. Expressing emotions with the social robot Probo. International Journal of Social Robotics, 2(4):377-389, 2010.


Toward a Platform-Independent Social Behavior Architecture for Multiple Therapeutic Scenarios

Hoang-Long Cao a, Pablo Gómez Esteban a, Albert De Beir a, Greet Van de Perre a, Ramona Simut b, Dirk Lefeber a, and Bram Vanderborght a

a Vrije Universiteit Brussel, Robotics & Multibody Mechanics Research Group, Brussels, Belgium
b Vrije Universiteit Brussel, Department of Clinical and Life Span Psychology, Brussels, Belgium

Abstract. Researchers in cognitive robotics have developed different architectures to help social robots make decisions and express emotions autonomously, in order to replace pre-programmed and remote-controlled techniques. However, most current developments are ad-hoc solutions that cannot be reused in multiple therapeutic situations. In this paper, we propose the design of a social behavior architecture which aims at helping a social robot to sustain the user's engagement and motivation and to achieve the goal of the interaction in different scenarios. Representations of behaviors are kept at an abstract level and mapped to the robot's morphology afterward. This approach ensures that the architecture is applicable to a wide range of social robots.

Keywords: social behavior architecture, autonomous, platform-independent.

INTRODUCTION

Unlike traditional robots, social robots are expected to exhibit natural-appearing social manners and to enhance the quality of life for broad populations of users [1]. For instance, robots in elderly care should be able to recognize the users' intentions and actions and generate appropriate reactions. Robots in robot-assisted therapy are required to have more substantial levels of autonomy, which would allow the robot to adapt to the individual needs of children over longer periods of time [2]. Currently, pre-programmed scenarios and remote-controlled techniques (Wizard of Oz) dominate robot operation, e.g. [3], [4]. Researchers have therefore developed and implemented various architectures which aim at helping social robots to make decisions and express emotions autonomously, e.g. [5], [6]. However, most of the current developments are ad-hoc solutions that cannot be used in diverse therapeutic situations.

In this paper, we propose the design of a social behavior architecture for multiple therapeutic scenarios, which aims at helping social robots to sustain the users' engagement and motivation and to achieve the goals of the interaction. Representations of behaviors are kept at an abstract level and mapped to the robot's morphology afterward.

REQUIREMENTS FOR AN ARCHITECTURE

Since social robotics research requires interdisciplinary collaboration, we present here the requirements for a social behavior architecture as proposed by roboticists and psychologists.

Sustaining the user's motivation and engagement

The user's motivation plays an important role in therapies, especially long-term ones. Together with extrinsic motivation driven by external rewards, intrinsic motivation, which comes from within an individual and is based on enjoyment and satisfaction, also needs to be maintained during all phases of the therapeutic process, e.g. diagnosis, intervention and prevention [7], [8]. Therefore, the architecture should be able to create a fluid interaction with an interesting play scenario including different levels of difficulty, and to recognize and evaluate the user's performance in order to generate proper responses. The interaction should be personal, e.g. using the user's name and history of interaction, to create a meaningful human-robot relationship.

Achieving the goal of the interaction

The purpose of using a robot is to obtain a particular therapeutic goal rather than entertainment. Therapies basically follow step-by-step scenarios with the aim of obtaining positive changes in the user's behavior. Although the decision-making mechanism of the architecture is required to generate behaviors that motivate and engage the user, these behaviors should be compliant with the goal of the interaction.

Platform-independence

The social behavior architecture should be platform-independent. Rather than controlling actuators specific to a robot platform, the architecture will prescribe parameters in descriptions and representations that are common across all platforms. Afterward, these robot non-specific commands will be translated into robot-specific actions.

Providing data for analysis

The architecture should be able to serve both clinical outcomes and potential research. Data such as the users' performance and the robot's operation need to be recorded in structured form for analysis.

DESIGN OF A PLATFORM-INDEPENDENT SOCIAL BEHAVIOR ARCHITECTURE

Taking into account the aforementioned requirements, we propose a social behavior architecture which aims at helping the robot to produce coherent behaviors while sustaining the goal of the interaction. The architecture is modular and composed of a number of systems and subsystems, as depicted in Figure 1.

The Perceptual system receives raw data from sensors (e.g. camera, touch sensors). These data are then interpreted into the interaction context and the user's performance, which serve as input for the other systems and subsystems.

The Social behavior controller includes a number of subsystems to generate coherent behaviors. The reactive and attention subsystem generates life-like behaviors, perceptual attention, and attention emulation. The deliberative subsystem, the center of the system, generates proper behaviors depending on the interaction context, the user's history and performance, and, importantly, the therapeutic scenario scripted in the Scenario Manager. These influencing factors, stored in memory, ensure the interactivity and personal character of the interaction. The emotion subsystem manages the affective state of the robot to add emotions to behaviors. The reflective subsystem oversees the robot's behavior by checking technical and ethical limits. The expression and actuation subsystem combines the output of the four subsystems mentioned above into an appropriate expression, taking into account the weight factor of each subsystem. Representations of expressions are designed using the Facial Action Coding System and Body Action Units and are then translated into robot-specific actions by the Motion system [9].

The operation of the architecture is visualized and controlled through the Graphical User Interface (GUI). This offers the therapist the ability to select the scenario, supervise the behaviors of the robot, and interrupt the robot's operation if necessary.

Figure 1. Description of the social behavior architecture. The Social behavior controller decides the robot's behavior taking into account the information from the Perceptual system and the Scenario Manager. Behaviors are platform-independent and mapped to the robot's morphology by the Motion system.
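As a rough illustration of this platform-independence idea, the sketch below keeps behaviors as abstract action units and only converts them into robot-specific commands in a per-platform mapping step. It is not the DREAM implementation: the behavior names, action units, joint names and scaling factors are all invented for the example.

# Minimal sketch of platform-independent behaviors mapped to robot-specific commands.
# All unit names, joints and gains below are made up for illustration.

ABSTRACT_BEHAVIORS = {
    # behavior name -> abstract action units with normalized intensities (0..1)
    "greet": {"raise_right_arm": 0.8, "smile": 0.6},
    "sad":   {"lower_head": 0.7, "brow_lower": 0.5},
}

PLATFORM_MAPPINGS = {
    # platform -> action unit -> (joint/actuator, scale in that robot's units)
    "nao":   {"raise_right_arm": ("RShoulderPitch", -1.2),
              "lower_head":      ("HeadPitch", 0.4),
              "smile":           ("FaceLeds", 1.0)},
    "probo": {"raise_right_arm": ("trunk_lift", 0.5),
              "lower_head":      ("neck_pitch", 0.6),
              "brow_lower":      ("left_brow", 0.9),
              "smile":           ("mouth_corners", 1.0)},
}

def to_robot_commands(behavior, platform):
    """Translate an abstract behavior into robot-specific (actuator, value) commands,
    silently skipping action units the platform cannot express."""
    mapping = PLATFORM_MAPPINGS[platform]
    commands = []
    for unit, intensity in ABSTRACT_BEHAVIORS[behavior].items():
        if unit in mapping:
            actuator, scale = mapping[unit]
            commands.append((actuator, intensity * scale))
    return commands

if __name__ == "__main__":
    print(to_robot_commands("sad", "nao"))    # only the action units Nao can express
    print(to_robot_commands("sad", "probo"))  # Probo can additionally lower its brows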

CONCLUSION AND EVALUATION

This paper proposes the design of a platform-independent social behavior architecture for multiple therapeutic scenarios. This approach ensures that the architecture is applicable to a wide range of social robots.

As validation, the architecture will be implemented in robot-assisted autism therapy scenarios, e.g. joint attention, turn taking, and imitation (Figure 2). The robot platforms used will be Nao and Probo [4]. Besides the built-in sensory systems of these robots, a Kinect will be used to enhance the capabilities of the perceptual system.

Figure 2. Child-robot interaction under imitation (left) and joint attention (right) scenarios.

ACKNOWLEDGEMENTS

The work leading to these results has received funding from the European Commission 7th Framework Programme as part of the project DREAM, grant no. 611391.

REFERENCES

1. A. Tapus, M. Mataric, and B. Scassellati, "The Grand Challenges in Socially Assistive Robotics," IEEE Robot. Autom. Mag., vol. 14, pp. 1-7, 2007.

2. S. Thill, C. A. Pop, T. Belpaeme, T. Ziemke, and B. Vanderborght, "Robot-Assisted Therapy for ASD with (Partially) Autonomous Control: Challenges and Outlook," Paladyn, J. Behav. Robot., vol. 3, no. 4, 2012.

3. B. Robins, K. Dautenhahn, R. Te Boekhorst, and A. Billard, "Robotic assistants in therapy and education of children with autism: can a small humanoid robot help encourage social interaction skills?," Univers. Access Inf. Soc., vol. 4, no. 2, pp. 105-120, 2005.

4. B. Vanderborght, R. Simut, J. Saldien, C. Pop, A. S. Rusu, S. Pintea, D. Lefeber, and D. O. David, "Using the social robot Probo as a social story telling agent for children with ASD," Interact. Stud., vol. 13, no. 3, pp. 348-372, 2012.

5. D. Feil-Seifer and M. J. Mataric, "B3IA: A control architecture for autonomous robot-assisted behavior intervention for children with ASD," ROMAN 2008, pp. 328-333, 2008.

6. H.-L. Cao, P. Gomez Esteban, A. De Beir, R. Simut, G. Van de Perre, D. Lefeber, and B. Vanderborght, "ROBEE: A homeostatic-based social behavior controller for robots in HRI experiments," in ROBIO 2014, pp. 516-521, 2014.

7. J. Fasola and M. Mataric, "A Socially Assistive Robot Exercise Coach for the Elderly," Journal of Human-Robot Interaction, vol. 2, no. 2, pp. 3-32, 2013.

8. D. David, S.-A. Matu, and O. A. David, "Robot-Based Psychotherapy: Concepts Development, State of the Art, and New Directions," Int. J. Cogn. Ther., vol. 7, no. 2, pp. 192-210, 2014.

9. G. Van de Perre, M. Van Damme, D. Lefeber, and B. Vanderborght, "Development of a generic method to generate upper-body emotional expressions for different social robots," Adv. Robot., pp. 1-13, 2015.



Fig. 1. Concrete legal principles involved in robotic therapy (X = addressed by the Hookie project):

Principle of safety: Safety (X)
Principle of user protection: Health (X), Consumer Protection (X), Environmental Regulation (X), Cosmesis (not addressed)
Principle of liability: General Liability (X), Prospective Liability (X), Legal Transactions (not addressed), Insurance (X)
Principle of user rights safeguard: Privacy (X), Data Protection (X), Intellectual Property Rights (not addressed), Non-Discrimination (X)
Principle of independent living and autonomy: Final Say (X), Enabling Human Capabilities (X), Acceptance (X), Persuasion (X)
Principle of non-isolation and social connectedness: Non-Replacement of Human Caregivers (X), Non-Replacement of Human Feelings (X), Context of Auto-exclusion (X), Dignity (X)
Principle of autonomous ethical agents' minimization: Limitation to open scenarios with non-mission tasks (X), Avoidance of post-monitoring (X), Ethical Agents (X)
Principle of justice: Equality (X), Access (in cost terms) (X), Access (in opportunities) (X)

Boundaries in Play-based Cloud-companion-mediated Robotic Therapies: From Deception to Privacy Concerns

E. Fosch Villaronga a and J. Albo-Canals b

a Joint International Doctoral (Ph.D.) Degree in Law, Science and Technology coordinated by CIRSFID, Università di Bologna, Italy. IDT-UAB, Universitat Autònoma de Barcelona, Spain (e-mail: [email protected]).
b CEEO, Tufts University, Boston, Massachusetts, US (e-mail: [email protected])

Abstract. Moving on from the 'brainstorming phase' in which the discourse on robotics is still stuck, this paper identifies the legal challenges of the CEEO Tufts University project 'robotic companions and LEGO® engineering with children on the autism spectrum' (hereafter the Hookie project). Apart from privacy and safety concerns, some new legal challenges are identified: cognitive harm, prospective liability, ethical agents' minimization and consumer robotics.

Keywords: Legal Principles, Robotic Therapy, Cloud Robotics, Prospective Liability, Privacy, ASD research, Autism, Non-Replacement of Humans.

INTRODUCTION

While there is a growing body of research on robotics and autism, minutely addressing the legal principles involved in robot-assisted therapy for children with Autism Spectrum Disorder (ASD) is a new idea. Previous research focused on the benefits of this therapy [1] and its ethical issues (e.g. acceptance, replacement, autonomy, trust) [2], but legal boundaries have not been addressed yet, as we are still in a brainstorming phase [7]. Principles involving a wide range of robots have been identified under the EU Charter of Fundamental Rights [3]. The European RoboLaw project provided a solid regulatory framework for "care robots" [4]; however, it addressed the robots covered by ISO 13482:2014 (person carriers, physical assistants or mobile servants), not therapeutic robots concretely and specifically. And as the problem for roboticists is still two-fold (the identification of the principles involved in their technology, and the understanding of their meaning), concreteness becomes indispensable.

LEGAL ISSUES IN ROBOTIC THERAPIES

Similarly to the methodology described in [8], after analyzing the context into which the robot is inserted (a research project, Hookie) and the robot type (for care receivers, social mediator, semi-/autonomous), we identified the principles involved in play-based cloud-companion-mediated robotic therapies (Fig. 1). Safety and privacy are the biggest concerns in robotics. In truth, "robotics combines [...] the promiscuity of information with the capacity to do physical harm" [9]. Both principles are very sensitive in this respect. First, although several safeguards deal with data protection in the Hookie project (an encrypted tunnel to protect data-in-motion; a private cloud with an individual login and password to protect data-at-rest; access to the information granted only to researchers included in the project; and definitive deletion of the collected data after three years), the use of cloud robotics still challenges current data protection legislation [11]. Second, because therapeutic robots aim at working at the cognitive level, they could furthermore cause cognitive harm to the users.


Indeed, there is the "possibility that a medical robot will cause harms to its patients in the future" [5]. This prospective liability could arise not only from the therapy itself but also because the robot may no longer be used after the end of the project [6, 13]. Similarly to what happens with some physical assistant robots, the user (and the parents) may not necessarily be aware that the therapy did not proceed in a normal way and thus cannot provide appropriate feedback to clinicians [10]. The use of a living lab for the legal regulation of robotics and the use of black boxes as described in [12] could help to track, and avoid, further responsibilities that may otherwise be covered by an insurance such as the one proposed for commercial aerial robots in [14]. Consumer robotics will deal with other types of harms [23].


Autonomy is a two-fold issue. From the robot's perspective, it is normally linked to deception. WoZ mode tends to deceive users [15]; however, blind research is permitted under §46.116(d) of Title 45 of the US Code of Federal Regulations. In fact, it is argued that deception promotes scientific validity because 'accurately informing subjects could bias their responses, thereby impairing the validity of the data' [7]. Autonomy is also connected to liability, artificial empathy [19] and, more generally, to the still ongoing debate on robot agenthood [17] and the replacement of human therapists, something the Hookie project does not intend. In fact, "the more autonomous a technology is, the more it needs to be sensible to values and norms" [18], which is why the principle of autonomous agents' minimization matters. Public attitudes towards care robots are not very positive, though [22]. In legal terms, autonomy refers to the user's autonomy and includes acceptance, the final say of the parents, and persuasion. Robotic therapy enables human capacities and should not promote contexts of auto-exclusion. Acceptance should be addressed carefully, especially regarding physical appearance (that is why Hookie is non-biomimetic) [16], but also from the responsivity-proxemics perspective, e.g. characteristics of the robot that encourage responsivity [21]. Moreover, if robotic therapy ends up as a general treatment offered by the State, it is not clear what final say parents could have in accepting the treatment or not (just as certain communities only accept bloodless surgeries). In fact, government and public institutions are meant to ensure the principle of justice, e.g. the equal distribution of available resources [20]. This implies an intrinsic moral duty for roboticists, i.e. the creation of affordable and accessible technology, which is the primary objective of the Hookie project.

CONCLUSIONS AND FURTHER RESEARCH

In comparison to 2012, in 2015 29% of the respondents in [22] would feel comfortable having a robot provide services to infirm people; in fact, the level of "totally uncomfortable" decreased by 9%. This implies progressive societal advance in accepting robots in care applications. Taking the Hookie project as its case, this article has presented the principles involved in play-based cloud-companion-mediated robotic therapies. The Hookie project has dealt with all the above-mentioned principles except cosmesis, intellectual property rights and legal transactions, as its inner capabilities do not give rise to them. Further publications will include complete details of Hookie's inner capabilities and its compliance with the concrete legal principles involved in robotic therapy (Fig. 1).

REFERENCES

1. J-J. Cabibihan et al. "Why Robots? A Survey on the Roles and Benefits of Social Robots for the Therapy of Children with Autism", International Journal of Social Robotics 5(4), 2013, pp. 593-618.

2. M. Coeckelbergh et al. "A Survey of Expectations About the Role of Robots in Robot-Assisted Therapy for Children with ASD: Ethical Acceptability, Trust, Sociability, Appearance and Attachment", Science and Engineering Ethics, 2015.

3. B-J. Koops et al. "Robotic Technologies and Fundamental Rights. Robotics Challenging the European Constitutional Framework", International Journal of Technoethics 4(2), 2013, pp. 15-35.

4. D6.2. Guidelines on Regulating Robotics. EU RoboLaw Project, 2014, more specifically pp. 167-196 and p. 18.

5. E. Datteri. "Predicting the Long-Term Effects of Human-Robot Interaction: A Reflection on Responsibility in Medical Robotics", Science and Engineering Ethics 19, 2013, pp. 139-160.

6. L.D. Riek and D. Howard, "A Code of Ethics for the Human-Robot Interaction Profession", WeRobot 2014.

7. D. Wendler and F. G. Miller, "Deception in the Pursuit of Science", Arch Intern Med 164(6), 2004, pp. 597-600.

8. E. Fosch-Villaronga, "Creation of a Care Robot Impact Assessment", WASET Int Sci Index 102, International Journal of Social, Behavioral, Educational, Economic and Management Engineering 9(6), 2015, pp. 1801-1805.

9. R. Calo, "Robotics and the Lessons of Cyberlaw", California Law Review 103, 2015, pp. 101-148.

10. J. Hidler et al. "Multicenter Randomized Clinical Trial Evaluating the Effectiveness of the Lokomat in Subacute Stroke", Neurorehabilitation and Neural Repair 23(1), pp. 5-13.

11. Article 29 Working Party, Opinion 05/2012 on Cloud Computing, 2012.

12. Y-H. Weng et al. "Intersection of 'Tokku' Special Zone, Robots and the Law: A Case Study on Legal Impacts to Humanoid Robots", International Journal of Social Robotics 6, 2015.

13. N. Sharkey and A. Sharkey, "The Crying Shame of Robot Nannies: An Ethical Appraisal", Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems 11, 2010, pp. 161-190.

14. D.K. Beyer et al. "Risk, Product Liability Trends, Triggers, and Insurance in Commercial Aerial Robots", WeRobot 2014.

15. J.K. Westlund and C. Breazeal, "Deception, Secrets, Children and Robots: What's Acceptable?", HRI 2015 Workshop, Portland, US.

16. B. Scassellati et al. "Robots for Use in Autism Research", Annual Review of Biomedical Engineering 14, 2012, pp. 275-294.

17. S. Shen, "The curious case of human-robot morality", Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction, 2011, p. 249.

18. S. Steinert, "The Five Robots – A Taxonomy for Roboethics", International Journal of Social Robotics 6, 2014, pp. 249-260.

19. L. Damiano et al. "Towards Human-Robot Affective Co-evolution. Overcoming Oppositions in Constructing Emotions and Empathy", International Journal of Social Robotics 7, 2015.

20. UN Convention on the Rights of Persons with Disabilities.

21. J-J. Diehl et al. "Clinical Applications of Robots in Autism Spectrum Disorder Diagnosis and Treatment", in V.B. Patel et al. (eds.) Comprehensive Guide to Autism, 2014, pp. 411-422.

22. Eurobarometer 427 on Autonomous Systems, 2015, p. 30.

23. W. Hartzog, "Unfair and Deceptive Robots", WeRobot 2015.


Poster session papers & late-breaking reports


Cloud-based Social Robot that Learns to Motivate Children as an Assistant in Back-Pain Therapy and as a Foreign Language Tutor

Maria Vircikova 1, Gergely Magyar 1 and Peter Sincak 1

1 Center for Intelligent Technologies, Department of Cybernetics and Artificial Intelligence, Faculty of EE and Informatics, Technical University of Kosice, Slovakia

Abstract. This paper presents two case studies of using the Nao robot as a physiotherapist teaching anti-back-pain exercises and as a foreign language tutor in elementary schools in Slovakia. We propose a cloud-based environment for the Wizard of Oz technique, in which the process of teaching the exercises and the foreign language classes can be controlled and enriched with motivational behaviors of the robot. These behaviors make the interaction less boring and more effective, as motivation has beneficial effects on children's learning and behavior. Moreover, we implemented an algorithm based on reinforcement learning which learns the motivational interventions from the Wizard.

Keywords: children-robot interaction, cloud robotics, robots in education, reinforcement learning, socially-assistive robotics

INTRODUCTION

Rabbitt et al. [1] define socially assistive robotics (SAR) as a unique area of robotics at the intersection of assistive robotics, which focuses on aiding human users through interactions with robots (e.g. mobility assistants, educational robots), and socially interactive or intelligent robotics, which focuses on socially engaging human users through interactions with robots (e.g. robotic toys, robotic games). SAR has used robots in different roles, e.g. as a weight-loss coach [2], as a social robot in an attention and memory task helping older adults with dementia [3], supporting young patients in hospital as they learn to manage a lifelong metabolic disorder (diabetes) [4], motivating physical exercise for older adults [5], in autism therapy [6], as a therapy assistant in children's cancer treatment [7], as sign language tutors [8], as other kinds of educational agents, mainly in child-robot interaction [9-13], and others [14].

One of the challenges of using robots in therapies is fusing play and rehabilitation techniques through robotic design to induce human-robot interaction (HRI) in which the therapy process is both entertaining and effective for the users. Usually, HRI experiments are conducted using the Wizard of Oz (WoO) technique, which means that the robot is not acting autonomously but is teleoperated by an expert. This method is sufficient for research, but if we want robots in human environments, we have to think of them as learning systems. We designed a cloud-based environment for the WoO technique in which the expert can control the learning process. When he or she observes that the children are paying less attention to the robot, he or she can activate a motivational behavior of the robot (different kinds of emotional expression based on speech, motion and sounds), which helps to increase the children's interest in the interaction and make it more effective.

CASE STUDIES

We selected two scenarios for the use of the robot, in which the subjects are 5-8 year old children.

The first problem we address is low back pain, which is the number one disability globally and in almost all developed countries, according to [15]. The problem of scoliosis in today's society is growing, and it is fundamental to ensure adequate motor skill development during childhood. We explore the effect of using a humanoid robot as a therapy-assistive tool in educating children to perform safe and effective back exercises, designed by a professional therapist, that can strengthen the back and improve posture.

Figure 1. Examples of the exercises designed by a professional physiotherapist and implemented on the Nao humanoid platform.

In the second scenario, the robot acts as a foreign language tutor. This application is extremely important, especially for countries like Slovakia, where English is not an official language.


Figure 2. Experiments in the wild: a child imitating the robot in anti-back-pain therapy; the other pictures show the robot in the role of a foreign language tutor.

SYSTEM THAT LEARNS FROM THE WIZARD, BASED ON CLOUD COMPUTING

For the case studies above we designed a web-based system for WoO, in which the exercises and language lessons can be controlled and enriched with motivational behaviors of the robot (emotional expressions). A system based on reinforcement learning learns the motivational interventions from the Wizard: it learns from the operator in order to increase the level of autonomy of the robot. This way we can move from WoO towards the Oz of Wizard [16]. After reviewing other studies [17], we found that the biggest weakness of existing WoO interfaces is that they can be used only locally and only for a given experiment. To overcome these weaknesses, our system uses the advantages of cloud computing and consists of the following parts:

consists of the following parts:

- Motion library in the first case study – it

contains the physical therapy exercises. The

Wizard can choose the exercises from the

database, number of repeats and set the order

of execution. Another feature is recording new

exercises with a Microsoft Kinect sensor,

which enables the creation of more diverse

rehabilitation sessions.

- English classes in the second study – it

contains different conversational topics to

teach children new vocabulary and grammar

points in an entertaining way.

- Motivational behaviors library – the platform

also comes with an emotional database that

contains emotional expressions (joy,

satisfaction, anger, sadness, surprise, fear).

The Wizard can also control the LED

animations and the phrases said by the robot.

- Agent based on reinforcement learning – a

system that determines how to map situations

to actions and also tries to maximize a

numerical reward signal (how to set the

teaching process, e.g. when to activate the

motivational mode of the robot).
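The following sketch illustrates the general idea of such an agent with plain tabular Q-learning. It is not the authors' implementation: the engagement states, actions, reward values, parameters and the stand-in Wizard feedback function are all invented for the example.

# Minimal tabular Q-learning sketch (illustration only): the agent learns when
# to trigger a motivational behavior, using rewards derived from the Wizard.
import random
from collections import defaultdict

STATES = ["attentive", "distracted", "bored"]   # coarse estimate of child engagement
ACTIONS = ["continue_lesson", "motivate"]       # motivate = play an emotional behavior

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2
Q = defaultdict(float)                          # Q[(state, action)]

def choose_action(state):
    """Epsilon-greedy action selection."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    """Standard one-step Q-learning update."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

def wizard_feedback(state, action):
    """Stand-in for the Wizard: rewards motivating a distracted or bored child and
    penalizes interrupting an attentive one. Replace with logged operator decisions."""
    if action == "motivate":
        return 1.0 if state in ("distracted", "bored") else -0.5
    return 0.5 if state == "attentive" else -0.2

for _ in range(2000):                           # toy training loop on random episodes
    s = random.choice(STATES)
    a = choose_action(s)
    r = wizard_feedback(s, a)
    s_next = random.choice(STATES)              # simplistic transition model
    update(s, a, r, s_next)

print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES})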

Our goal is to create a modular system that can be used in different scenarios and, beyond that, can serve as a common cloud-based platform for researchers. We present two case studies, although the system can be used for other SAR-based applications in which the motivation of the subjects is important.

REFERENCES

1. S. M. Rabbitt, et al. Integrating SAR into mental healthcare interventions: Applications and recommendations for expanded use, Clinical Psychology Review 35, 2015.

2. C. D. Kidd and C. Breazeal, A robotic weight loss coach, Proceedings of the National Conference on AI, MIT Press, 1999.

3. A. Tapus, et al. The use of SAR in the design of intelligent cognitive therapies for people with dementia, Rehabilitation Robotics, 2009.

4. P. Baxter, et al. Long-term human-robot interaction with young users, IEEE/ACM HRI, 2011.

5. J. Fasola and M. Mataric, Using socially assistive HRI to motivate physical exercise for older adults, Proceedings of the IEEE 100, 2012.

6. B. Scassellati, et al. Robots for use in autism research, Annual Review of Biomedical Engineering 14, 2012.

7. M. Alemi, et al. Impact of a social humanoid robot as a therapy assistant in children cancer treatment, Proceedings of the International Conference on Social Robotics, 2014.

8. H. Kose, et al. Socially interactive robotic platforms as sign language tutors, International Journal of Humanoid Robotics, 2014.

9. A. Desmukh, et al. Empathic robotic tutors: Map guide, Proceedings of the 10th ACM/IEEE International Conference on HRI, 2015.

10. S. Bhargava, et al. Demonstration of the emote Wizard of Oz interface for empathic robotic tutors, SIGdial, 2013.

11. S. Serholt, et al. Emote: Embodied-perceptive tutors for empathy-based learning in a game environment, GBL, 2013.

12. B. LaVonda, et al. Applying behavior strategies for student engagement using a robotic educational agent, SMC, 2013.

13. T. Komatsubara, et al. Can a social robot help children's understanding of science in classrooms?, HAI, ACM, 2014.

14. I. Leite, et al. Social robots for long-term interaction: A survey, International Journal of Social Robotics, 2013.

15. Institute for Health Metrics and Evaluation, University of Washington, 2013.

16. A. Steinfeld, et al. The Oz of Wizard: Simulating the human for interaction research, Proceedings of the International Conference on HRI, 2009.

17. L. Riek, Wizard of Oz studies in HRI: A systematic review and new reporting guidelines, Journal of Human-Robot Interaction, 2012.


Possibilities of the IROMEC Robot for Children with Severe Physical Disabilities

Renée van den Heuvel a,b, Monique Lexis a and Luc de Witte a,b

a Zuyd University of Applied Sciences, The Netherlands (Research Centre for Technology in Care)
b Maastricht University, The Netherlands (CAPHRI, School for Public Health and Primary Care)

Abstract.
Objectives: To match the goals for children with severe physical disabilities in therapy and education with the current possibilities of the IROMEC robot to support play.
Methods: Focus groups, interviews and a questionnaire were used to gather an overview of goals and to determine the potential of IROMEC.
Results: Goals related to movement functions, learning and applying knowledge, communication and interpersonal interactions/relationships, and play seem to be the most promising domains for IROMEC.
Conclusion: There is a match between the possibilities of the IROMEC robot and the goals for children with physical disabilities in therapy and special education. The current offer of play scenarios should be adapted and expanded.

Keywords: robot, IROMEC, play, children, physical disabilities

INTRODUCTION

Within special education and rehabilitation for children with severe physical disabilities, many interventions related to play are being used. The progress of technology in recent decades increases the possibilities of using technology within therapy and special education. A relatively new and upcoming intervention within this area is the application of (social) robots. IROMEC is a robot developed during the IROMEC (Interactive RObotic social MEdiators as Companions) FP6 European project between 2006 and 2009. In this European project it was investigated how robotic toys can become social mediators for children who are prevented from playing due to cognitive, developmental or physical impairments [1]. One important outcome of this project was a set of ten play scenarios developed for the target user groups. Five of these scenarios were actually implemented in the robot: Turn taking, Sensory reward, Make it Move, Follow me and Get in Contact [1]. A picture of the robot is displayed in figure 1. Important characteristics of the robot are the use of sensors and a camera to detect obstacles and to detect a child, a touchscreen on the back, the ability to move around (autonomously and controlled by buttons), a digital screen as a face, the production of sounds, and control by wireless buttons.

Play is essential in the development of every child and is a fundamental right for every child [2]. Play gives children the possibility to discover their capabilities, try out objects, make decisions, understand cause and effect relationships, learn, persist, and understand consequences of actions [3]. For children with physical disabilities the experience of play can become frustrating or even impossible: they experience difficulties in starting, developing and performing play activities in a natural way. Most commercially available toys are not designed with the requirements of these children in mind, and play activities may be partially or entirely impossible [4].

The main aim of this study was to match the goals for children with severe physical disabilities in therapy and education with the current possibilities of the IROMEC robot to support play.

Figure 1. The IROMEC robot.

METHODS

A qualitative mixed-methods study was used, combining interviews, focus group sessions (two rounds) and a digital questionnaire. The goals related to play in therapy and education for children with severe physical disabilities were established, and the possibilities for IROMEC interventions were identified. Therapists and special educators participated in the study. In the first round of focus groups and in the interviews, the goals and activities related to play in therapy and education were discussed using the principles of the metaplan method [5]. The digital questionnaire was sent to the participants of these interviews and focus groups as a member check of the overview of goals. Additionally, we asked for which of the goals from the overview IROMEC could be applied. A short video of the IROMEC robot was included in the questionnaire. In the second round of focus groups, which started with a demonstration of the existing IROMEC characteristics and scenarios, the possibilities of the robot to achieve the therapeutic and educational goals were discussed with the participants.

The interviews and focus groups were audio-taped and transcribed verbatim. For the data of the interviews and the first round of focus groups, an overview of the goals was created based on the results of the metaplan method and structured according to the International Classification of Functioning for Children and Youth (ICF-CY) [6]. For the second round of focus groups, the qualitative research software NVivo 10 was used to identify and code relevant fragments based on the principles of directed and conventional content analysis [7].

RESULTS

Nine persons participated in the interviews, 17 persons participated in the first round of focus groups (3 groups) and 25 persons participated in the second round (6 groups). The questionnaire was distributed to 26 persons and completed by 10 participants. Table 1 displays part of the goal overview found in the interviews and the first round of focus groups. Other domains found were: Mental functions (b1), Sensory functions and pain (b2), Mobility (d4), Self-care (d5) and (pre)school skills (d815/d820). Goals are displayed in bold for which at least 50% of the participants in the questionnaire thought IROMEC could be applied and with which people in the second round of focus groups agreed.

Table 1. Goals and match with IROMEC

Domain (ICF-CY): Goals

Movement functions (b750-b789): Fine motor skills; Gross motor skills; Eye-hand coordination; Motor skills

Learning and applying knowledge (d1): Spatial awareness; Learning; Imitation; Planned and structured working; Concentration; Problem solved learning; Work attitude; Making jokes and pretending; Listening; Language comprehension and expression; Reading; Numeracy skills

Communication (d3) / Interpersonal interaction and relationships (d7): Turn taking; Cooperation; Interaction; Using voice; Taking initiative; Get in contact; Language

Play (d880): Playing independent; Playing together; Fantasy play; Understanding of simple rules; Having play fun; Role play; Competition

Next to the possibilities of IROMEC related to the goals, there were some comments and recommendations on the current robot, for example: elaboration of the current play scenarios and more flexibility in adapting scenarios (e.g. screens, sounds, movement) as well as the appearance of the robot.

CONCLUSION

Therapists and special educators were convinced of the match between the goals for children with severe physical disabilities and the possibilities of IROMEC, in therapy as well as in education. The domains movement functions, learning and applying knowledge, communication and interpersonal interactions and relationships, and play seem to be the most promising. It is recommended to adapt and expand the current scenario offer of the robot within these domains, according to suggestions from experts in daily care practice.

ACKNOWLEDGMENT

The authors would like to thank Gert Jan Gelderblom† for his highly appreciated contribution to this work.

REFERENCES

1. B. Robins, K. Dautenhahn, E. Ferrari, G. Kronreif, B. Prazak-Aram, P. Marti, I. Iacono, G.J. Gelderblom, R. Bernd, F. Caprino. Scenarios of robot-assisted play for children with cognitive and physical disabilities. Interaction Studies 13, 189-234 (2012)

2. B. Prazak, G. Kronreif, A. Hochgatterer, M. Fürst. A toy robot for physically disabled children. Technology and Disability 16, 131-136 (2004)

3. A. Rios Rincon, K. Adams, J. Magill-Evans, A. Cook. Changes in Playfulness with a Robotic Intervention in a Child with Cerebral Palsy. Assistive Technology: From Research to Practice, AAATE 2013

4. S. Besio. An Italian research project to study the play of children with motor disabilities: the first year of activity. Disability and Rehabilitation 24, 72-79 (2002)

5. E. Schnelle. The Metaplan-Method: Communication Tools for Planning and Learning Groups. Hamburg: Quickborn, 1979, pp 63

6. World Health Organization. International Classification of Functioning, Disability and Health: Children & Youth version: ICF-CY. WHO: Geneva, 2007

7. H. Hsieh, S.E. Shannon. Three approaches to qualitative content analysis. Qualitative Health Research 15, 1277-1288 (2005)


Robot Response Behaviors to Accommodate Hearing Problems*

Jered Vroon a, Jaebok Kim a, and Raphaël Koster b

a University of Twente, Human Media Interaction
b MADoPA

Abstract. One requirement that arises for a social (semi-autonomous telepresence) robot aimed at conversations with the elderly is to accommodate hearing problems. In this paper we compare two approaches to this requirement: (1) moving closer, mimicking the leaning behavior commonly observed in elderly people with hearing problems, and (2) turning up the volume, which is a more mechanical solution. Our findings with elderly participants show that they preferred the turning up of the volume, which they rated significantly higher.

Keywords: Telepresence robot, Hearing problems

INTRODUCTION

What behavior is appropriate for a social robot depends on the context in which it is to function. For example, for a robot that helps lift people out of bed it is necessary to get intimately close, while for a telepresence robot such intimate distances are probably less appropriate. An important aspect of this context is the specific individual needs of the users.

Elderly people with hearing problems are one such user group that places its own requirements on the behavior of social robots. Hearing problems have a high prevalence among the elderly [1,2]. Taking hearing problems into account could thus be a good contribution to any robot that is to communicate through audio with elderly people, such as (semi-autonomous) telepresence robots.

One way to handle hearing problems is by mimicking the 'leaning' behavior commonly observed in this user group, where people actively lean in to intimate distances during conversations [3,4]. Similarly, a social conversation robot could reciprocate such leaning behavior by moving closer.

An alternative would be to instead change the volume settings of the robot. Though in a way less human-like, this could be equally (or more) effective in resolving the hearing problems.

The aim of the reported experiment is to investigate with elderly participants which of these two response behaviors they might prefer a semi-autonomous telepresence robot to show.

METHOD

To investigate the effect of the different response behaviors, we set up a within-subject experiment [no response × move closer × turn up volume] as part of a larger evaluation session for the Teresa project*. In each session, one participant (the Visitor) sat in a remote location and used the robot in another room to interact with one or two other participants in the same room as the robot (the Interaction Target(s)). We used a Giraff telepresence robot¹. A possible limitation is that its speaker is located in its base, not its 'head'.

Procedure

The Interaction Target(s) were seated behind a table, with the robot at the other end of it at a distance of approximately 1.5 m. To ensure that hearing problems would arise, the volume of the robot had been turned down to a barely audible level. An experimenter explaining the procedure sat with the Interaction Target(s) during the experiment.

To make the conditions more comparable, the experiment started with a full briefing on the aim and the procedure of the experiment. After this, there were three trials in which participants had a brief conversation with each other that was terminated after about two minutes by the experimenter. In each of these trials, as soon as the Interaction Target(s) expressed having hearing problems, or after approximately one minute, a Wizard of Oz showed one of the three response behaviors in counterbalanced order. For 'no response', no behavior was shown. For 'move closer', the robot approached the Interaction Target(s) to a distance of around 0.8 m. For 'turn up volume', the volume settings were turned up a bit, which was also visible in the interface. To ensure functional comparability, none of these changes was sufficient to completely resolve all hearing problems. At the end of each trial, the robot was returned to its initial position and volume setting. The experiment was concluded with a brief (paper) questionnaire.

Task

To stay close to the intended use of the robot, the task of our participants was to have a conversation. For this, we asked them to discuss questions of the Proust questionnaire2. Specifically, we asked the Interaction Target(s) to read out self-selected questions and the Visitors to discuss what they thought the Interaction Target would answer.

Measurements

At the end of the interactions, all participants were given a brief questionnaire. Three items asked them to indicate their most and least favorite response behavior and to rate all response behaviors on a scale of 1-10.

*This work has been supported by the European Commission under contract number FP7-ICT-611153 (TERESA), http://teresaproject.eu
1 http://www.giraff.org/?lang=en
2 http://fr.wikipedia.org/wiki/Questionnaire_de_Proust


One item asked them to indicate which three qualities of the robot were most influential in their ratings, based on items for warmth and competence [5] (see Table 2 for the qualities). The last 5 items considered demographics (age, gender, hearing problems, use of hearing aids, relationship with the other participant(s)).

We recorded the interactions on video and using robot-mounted sensors. The interface as seen by the Visitor was recorded using screen capture software.

Participants

We had 18 French-speaking participants (13 female, 4 male, 1 undisclosed), in six pairs and two trios, all with a prior relation (e.g. friends, family). Participants were aged between 60 and 91 (mean age 74). Hearing loss was reported by 7 participants. In one pair, a 10-year-old grandchild also joined as Interaction Target, but was excluded from analysis.

FINDINGS

Summaries of our main findings can be found in Tables 1 and 2. Twelve participants preferred the ‘turn up volume’ behavior, the other six preferred ‘move closer’ instead. The ratings of these behaviors matched those preferences for 89% of the participants, though many asked for clarification of the rating questions.

Since the rating of the response behaviors was not normally distributed (Shapiro-Wilk, p=0.135, p=0.039*, p=0.053) we ran a Friedman test, which found a significant difference in rating (χ2(2)=25.344, p=0.000*). We did a post hoc analysis with a Wilcoxon signed-rank test (significance level 0.017, with Bonferroni correction). The ratings for ‘move closer’ were significantly higher than those for ‘no response’ (Z=-2.917, p=0.004*). The ratings for ‘turn up volume’ were significantly higher than both those for ‘no response’ (Z=-3.628, p=0.000*) and those for ‘move closer’ (Z=-2.462, p=0.014*).

This analysis made the simplifying assumption that the participants can be treated as independent comparable measurements, despite being in the same group and having one of two roles (Visitor/Interaction Target). A series of Pearson’s Chi-square tests found no significant correlations of either group or role with the ratings, which supports this assumption. The aforementioned significant differences all hold when looking at the Interaction Targets only (N=10); only the difference in rating between ‘turn up volume’ and ‘move closer’ is no longer significant (Z=-1.364, p=0.172).
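The kind of non-parametric analysis described above can be reproduced with standard statistics routines. The following is an illustrative sketch only (not the authors' code; the rating arrays are invented placeholders rather than the study's data), showing a Friedman test followed by Bonferroni-corrected Wilcoxon signed-rank post hoc tests using SciPy:

# Hypothetical sketch of the reported analysis; the ratings are placeholders.
from itertools import combinations
from scipy.stats import friedmanchisquare, wilcoxon

ratings = {
    "no response":    [3, 1, 4, 0, 2, 3, 5, 3, 4, 2, 1, 3, 4, 2, 3, 5, 4, 9],
    "move closer":    [6, 5, 7, 0, 6, 8, 7, 6, 5, 7, 8, 6, 9, 5, 6, 7, 8, 6],
    "turn up volume": [8, 7, 9, 6, 8, 10, 8, 7, 9, 8, 7, 9, 8, 10, 7, 8, 9, 8],
}

# Omnibus test over the three related samples (within-subjects design).
chi2, p = friedmanchisquare(*ratings.values())
print(f"Friedman: chi2(2)={chi2:.3f}, p={p:.3f}")

# Post hoc pairwise comparisons; with a Bonferroni correction over the three
# pairs, the per-test significance level becomes 0.05 / 3 = 0.017.
for a, b in combinations(ratings, 2):
    stat, p = wilcoxon(ratings[a], ratings[b])
    print(f"{a} vs {b}: W={stat:.1f}, p={p:.3f}")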

CONCLUSIONS AND DISCUSSION

We have compared three ways in which a semi-autonomous telepresence robot could respond to hearing problems. We found high ratings for ‘turn up volume’, significantly surpassing the ratings for ‘move closer’. Both of these were rated significantly higher than ‘no response’. There do seem to be further individual differences, as one third of the participants instead preferred the ‘move closer’ behavior. We only used general ratings for this, but our participants most commonly indicated that they had based their judgement mostly on the qualities ‘Intelligent’ and ‘Helpful’. Note that these findings need not translate to other settings; e.g., ‘turn up volume’ may be perceived as less appropriate if the noise could disturb others.

Overall, our findings demonstrate that trying to accommodate hearing problems is a desirable feature in this setting. A general approach like turning up the volume when required could work in general cases. If possible, a more personalized solution could also/instead move closer if the user would so prefer.

ACKNOWLEDGEMENTS

The authors wish to thank Khiet Truong, Gwenn Englebienne and Vanessa Evers for their input during the design of the experiment, and Alexandre Duclos for being the Wizard of Oz.

REFERENCES
1. N. Blevins, D. Deschler and L. Park. "Presbycusis". Retrieved September 15, 2014, from www.uptodate.com/contents/presbycusis
2. K.J. Cruickshanks, et al. "Prevalence of hearing loss in older adults in Beaver Dam, Wisconsin: the epidemiology of hearing loss study." American Journal of Epidemiology 148(9), 879-886 (1998).
3. J. Vroon, G. Englebienne and V. Evers. "D3.1 Normative Behavior Report". Teresa project deliverable, 2014. Available from teresaproject.eu/project/deliverables/
4. J.D. Webb and M.J. Weber. "Influence of sensory abilities on the interpersonal distance of the elderly". Environment and Behavior 35(5), 695-711 (2003).
5. K. Bergmann, F. Eyssel and S. Kopp. "A second chance to make a first impression? How appearance and nonverbal behavior affect perceived warmth and competence of virtual agents over time." Intelligent Virtual Agents. Springer Berlin Heidelberg, 2012.

Table 1. Descriptive statistics for the ratings given to the three different response behaviors.

Response behavior   N    Mean    Min   Q25   Q50   Q75   Max
No response         18   3.000   0.0   1.5   3.0   4.25  9.0
Move closer         17   6.167   0.0   5.0   6.5   8.0   9.0
Turn up volume      18   8.235   6.0   7.5   8.0   9.5   10.0

Table 2. Number of times the different qualities were checked as being most influential in giving the ratings (total = 54).

Quality                      Times checked
Agréable (Pleasant)          2
Sensible (Sensitive)         0
Amical (Friendly)            2
Utile (Helpful)              9
Affable (Affable)            2
Sympathique (Likeable)       1
Accessible (Approachable)    4
Sociable (Sociable)          4
Intelligent (Intelligent)    10
Organisé (Organized)         2
Expert (Expert)              0
Réfléchi (Thorough)          4
Efficace (Effective)         5
Approprié (Appropriate)      4
Attentif (Attentive)         5


Social and Autonomous Confabulation Architecture

Roger Tilmansa, Pablo Gomez Estebanb, Hoang-Long Caoc, Bram Vanderborghtd

aVrije Universiteit Brussel, Robotics & Multibody Mechanics Group

Abstract— This paper presents the Social and Autonomous Confabulation Architecture: a new cognitive architecture developed for social robots that have to operate under supervised autonomy. With this new cognitive architecture, based on the confabulation theory, robots are able to choose their behaviors autonomously, to have emotions and to have learning abilities. Finally, the architecture is tested with the robot Nao playing different interactive games with a kid.

Keywords— confabulation theory, social robot, autonomous behavior, cognitive architecture

1. INTRODUCTION
Social robots are increasingly being used for many reasons. Some of them, like the seal robot Paro or the huggable robot Probo, have been developed in order to be used in hospitals [5]. Other robots, like the commercial robot Aibo, have been developed in order to be used as pets.

Aibo's cognitive architecture is based on ethology [1]. But mimicking the mechanism of thought from a neurological point of view may facilitate the interaction between humans and robots [4]. The mechanism of thought is described by the confabulation theory.

The confabulation theory is a concept in neuroscience developed by Robert Hecht-Nielsen [2]. It postulates that the working of the brain is similar to the working of muscles: neurons work in groups called modules. These modules are divided into symbols that represent elements of thought.

Nowadays, different artificial creatures inspired by this theory have already been developed. The Two-Layered Confabulation Architecture (TLCA) [3] or the Degree of Consideration-Based Mechanism of Thought [4] allow artificial creatures to choose their optimal behaviors in virtual environments. This paper proposes a new cognitive architecture inspired by the given examples, but adapted for social robots that have to operate under supervised autonomy: the Social and Autonomous Confabulation Architecture (SACA).

Thanks to its developed perception and actuator system and thanks to the transportability of the framework, the robot Nao has been chosen in order to test the SACA. It has been programmed into a social robot that plays educational games with children.

2. WORKING OF THE SACA

A. Main working
What the SACA mainly does is choose the next behavior of Nao. A behavior is a predefined function that typically launches a set of actions of the actuator system. In its current version, the test software has a library of 8 behaviors: Nao presents himself, shows his emotion, dances and can play 5 other interactive games. Each game is thus defined as a behavior.

Figure 1 shows the working of the cognitive architecture. The external stimuli, perceived by the different sensors of the robot, are processed by the perception system. This data is saved in the memory. The memory represents all the information that defines the internal state of the robot. Based on the memory, the behavior selector chooses the next behavior of the robot. Depending on the behavior, the robot may need to wait for a certain stimulus or may want to make some changes in its internal state. This is represented by the dashed arrows. In addition to the described deliberative layer, the robot also has a reactive layer. This layer is represented by the black arrow between the perception system and the actuator system: a set of stimuli can bypass the deliberative layer and directly activate the actuator system.

Fig. 1. Schematic diagram of the Social and Autonomous Confabulation Architecture

While the reactive layer is just defined as a stimulus that activates some actuators, the deliberative layer is more complex and works according to the confabulation theory. As said before, the main elements of the confabulation theory are the modules and symbols.

B. Modules and symbols
The memory, which represents the internal state of the robot, contains different modules:
1) Context module: gives the general context. Did the user just turn on the robot, or has the robot not received stimuli for a long time?
2) Last stimulus module: each stimulus is recorded in this module until a new one has been perceived.


3) Emotion module: stores Nao's emotional state.
4) Behavior module: behaviors are here defined as symbols and are stored in the behavior module.
5) Additional data modules: in one of its behaviors, the robot tries to guess a famous singer by asking questions to the user. The robot thus has a list of singers and must also be able to remember the gender, the nationality and other clues given by the user. All this information is saved in additional modules.

Each module contains different symbols that represent the different states of the module. At a given time, one and only one symbol per module is activated. This activated symbol is called an assumed fact. For example, if the robot did not receive stimuli for a long time, the symbol called “nothing happens” of the context module will be activated and will be the assumed fact.

The memory also contains the weight values of the knowledge links. A knowledge link connects a symbol from one module with a symbol from another module. The weight of the knowledge link going from symbol α to symbol β equals the conditional probability that symbol α is activated assuming β is activated: p(α|β).

Because the context module or the last stimulus module only represent some characteristics of the external world, their assumed facts are chosen by the perception system. The selected behavior, on the other hand, represents a decision of the robot. This decision is made by choosing the behavior b that maximizes the conditional probability that all the n assumed facts f1, f2, ..., fn occur assuming the behavior b occurs. After applying the Bayes theorem, this maximum can be estimated by finding the maximum of p(f1|bi) · p(f2|bi) · ... · p(fn|bi), with bi the different behaviors and p(fj|bi) defined as the weight of the knowledge link going from symbol fj to symbol bi [3]. The weights of these knowledge links need to be estimated in advance and/or can be modified thanks to a learning algorithm.
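As an illustration of this selection rule, the following is a minimal Python sketch (not the authors' implementation; the symbols, behaviors and link weights are invented for the example). It picks the behavior bi with the largest product of link weights p(fj|bi) over the currently assumed facts:

# Minimal sketch of confabulation-style behavior selection (hypothetical data).
# knowledge[(fact, behavior)] holds the link weight p(fact | behavior).
knowledge = {
    ("nothing happens", "present himself"): 0.10,
    ("nothing happens", "dance"):           0.40,
    ("touched head",    "present himself"): 0.30,
    ("touched head",    "dance"):           0.05,
    ("angry",           "show emotion"):    0.80,
}
behaviors = ["present himself", "dance", "show emotion"]

def select_behavior(assumed_facts):
    """Return the behavior maximizing p(f1|b) * ... * p(fn|b)."""
    best_behavior, best_score = None, -1.0
    for b in behaviors:
        score = 1.0
        for f in assumed_facts:
            score *= knowledge.get((f, b), 0.0)  # missing links count as weight 0
        if score > best_score:
            best_behavior, best_score = b, score
    return best_behavior

print(select_behavior(["nothing happens"]))  # -> dance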

C. Learning algorithm
A learning algorithm has been implemented in order to give the user the possibility to improve the robot's attitude. Reinforcement learning is here considered. The user can give rewards or punishments to the robot, and the weights of the links are increased or decreased according to formula (1):

pnew(ci|b) = p(ci|b) + F · O · λ     (1)

with
ci: all the symbols connected with the behavior b
b: the desired/undesired behavior
F: feedback: F=1 in case of reward and F=-1 in case of punishment
O: occurrence: O=1 if it occurs and O=-1 otherwise
λ: learning rate: after tests and trials, λ = 0.05

The learning algorithm is not new, but is inspired by the one used in the TLCA [3] and has been slightly modified: the product in the original equation has here been changed to a sum. This makes the creation of new links possible, even if their weights were originally zero.
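A compact sketch of this additive update is given below, assuming a simple dictionary of link weights; the function and variable names are illustrative and not taken from the authors' software:

LEARNING_RATE = 0.05  # lambda, the value reported after tests and trials

def update_links(knowledge, behavior, occurrences, reward):
    """Apply pnew(ci|b) = p(ci|b) + F * O * lambda to every symbol ci.

    knowledge   -- dict mapping (symbol, behavior) to the weight p(symbol|behavior)
    occurrences -- dict mapping each connected symbol ci to True/False (O = +1/-1)
    reward      -- True for a reward (F = +1), False for a punishment (F = -1)
    """
    F = 1 if reward else -1
    for symbol, occurred in occurrences.items():
        O = 1 if occurred else -1
        old = knowledge.get((symbol, behavior), 0.0)  # additive rule: links can grow from 0
        knowledge[(symbol, behavior)] = old + F * O * LEARNING_RATE

kb = {("touched head", "dance"): 0.05}
update_links(kb, "dance", {"touched head": True}, reward=False)  # punish "dance"
print(kb)  # {('touched head', 'dance'): 0.0}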

D. Emotions
While the emotional state is usually defined in a two-dimensional space with a valence and an arousal axis [5], the emotional state here has been defined in a module where each symbol represents a different emotion. The selected symbol is determined by the perception system or can be modified during a behavior. Because the symbols of the emotion module are connected with the symbols of the behavior module, emotions influence the choice of behavior. For example, the weight of the connection that goes from the emotion "angry" to the behavior "show his emotions" is higher, so that if the robot is angry, there is a greater chance that it will express it.

3. RESULTS
In order to verify that the SACA is suitable for social robots operating under supervised autonomy, it has been tested with Nao playing with children. The desired characteristics have been observed: Nao is able to play interactive games with a child, has emotions and is able to adapt his behavior as a function of the child's preferences. Furthermore, it has been observed that it is even possible to teach the robot to perform a certain action as a consequence of a certain stimulus. Results have been recorded and are shown in a video: https://youtu.be/x859_qbYGuA.

4. CONCLUSION
The developed software remains quite simple but works well. In the future, by increasing the number of modules and symbols and by connecting modules in series, much more intelligent social robots could be developed using the same cognitive architecture.

ACKNOWLEDGMENT
This work is funded by the European FP7 project DREAM, grant no. 611391.

REFERENCES

[1] Bursie, C. I. (2004). Dog-like behavior selection for an AIBO robot dog. KTH Numerical Analysis and Computer Science, Royal Institute of Technology, Stockholm.
[2] Hecht-Nielsen, R. (2007). Confabulation Theory: The Mechanism of Thought. Heidelberg: Springer.
[3] Kim, J. H., Cho, S. H., Kim, Y. H., & Park, I. W. (2008). Two-Layered Confabulation Architecture for an Artificial Creature's Behavior Selection. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 38(6), 834-840.
[4] Kim, J. H., Ko, W. R., Han, J. H., & Zaheer, S. A. (2012). The degree of consideration-based mechanism of thought and its application to artificial creatures for behavior selection. IEEE Computational Intelligence Magazine, 7(1), 49-63.
[5] Saldien, J., Goris, K., Yilmazyildiz, S., Verhelst, W., & Lefeber, D. (2008). On the design of the huggable robot Probo. Journal of Physical Agents, 2(2), 3-12.


Suitability of a Telepresence Robot for Services on Home Modification and Independent Living

B. Kleina, K. Dunkelb, S. Selica, and S. Reutzela

aFrankfurt University of Applied Sciences (FRA-UAS), Faculty of Social Work and Health

bCity of Hanau, Senior Citizens Office

Abstract. In four sequential explorative studies, the suitability of the telepresence robot GIRAFF was investigated, examining to what extent it can be utilized for providing information and advice on home modification and independent living.

Keywords: Telepresence robot, Home Modification, Assistive Technologies, Independent Living.

1. SERVICES ON HOME MODIFICATION TO SUPPORT INDEPENDENT LIVING

Accessible and barrier-free design is a necessary premise in order to live independently in the private home. In Germany only 3% of private homes have a barrier-free, accessible design. There is an estimated lack of 1.1 million accessible homes. (1, 2) Home modification is a key factor, but knowledge on these possibilities and on assistive technologies is not widespread. Information and advice on these have been supported by regulatory bodies and initiatives of welfare and other organizations. In the state of Hesse, Germany, a variety of professional, semi-professional and voluntary services are available. Although all of them have at least a basic qualification, it is often very difficult to pass on the information if clients cannot experience the different assistive devices by themselves.

The objective of the four sequential explorative studies was to find out whether the utilization of a telepresence robot could support this information and advice process and bring in new quality aspects.

The following key players were involved in this study:

A. The Voluntary Mobile Home Modification Service of the City of Hanau

This service is located at the Senior Citizens Office of the City of Hanau. Six active volunteers with backgrounds such as architecture, nursing care, finance, engineering, etc. were trained in home modification and assistive devices for independent living. They have their own office with set office hours.

Typical requests of their clients are adaptation of the bathroom, entrance and access to and in the building, financing of home modification, and support for daily activities. For their service it would be helpful if they could show suggested aids and devices.

B. Smart Independent Living Center (SILC) at Frankfurt University of Applied Sciences (FRA-UAS) in the City of Frankfurt am Main

SILC is a permanent exhibition jointly operated by the Faculty of Social Work and Health of FRA-UAS and the Specialist Unit Independent Living of the VdK Social Association Hesse-Thuringia, a major social welfare organization. SILC displays various concepts on accessible design and home modification as well as assistive technologies to support independent living on more than 150 m² (3).

C. Three Private Homes located in a collaborative housing project ILEX

The housing project ILEX has 16 accessible flats and communal space for people aged 60 plus. Tenants of three flats participated in the different trials.

2. TELEPRESENCE ROBOT GIRAFF

“Telepresence is defined as the experience of presence in an environment by means of a communication medium.” (4) Transmitting voice via telephone and, additionally, video via e.g. Skype is commonly known and accepted. Additionally, telepresence robots allow the transmission of movements. Most popular is moving through a building; less known is the transmission of facial expressions. (5) The telepresence robot GIRAFF is a product of the Swedish company GIRAFF Technologies AB and has been developed for healthcare purposes. (6, 7) The system consists of a movable screen equipped with camera and microphone and a base on wheels, both connected by a height-adjustable bar. The system is not autonomous. A remote user can operate GIRAFF with a PC, an internet connection and an easy-to-use control surface. Communication is possible via speech and video transmission. Additionally, the remote operator can move GIRAFF through the room of the person where the system is located.

The following picture shows the GIRAFF telepresence system and the control surface.

Fig. 1. GIRAFF and the control surface (panels: GIRAFF; view into the Smart Independent Living Center; control surface of GIRAFF)


3. METHODS

In order to explore the suitability and potential of telepresence robots for information and advice on home modification four studies were undertaken. All participating subjects gave written informed consent.

A. Training of the Volunteers of the Mobile Home Modification Service

Four volunteers of the Mobile Home Modification Service, all aged 60 plus, were trained to operate GIRAFF. They had to pilot GIRAFF through a prepared semi-realistic obstacle course. Usability was measured and a discussion on possible applications for their work took place, which resulted in the following trials.

B. Trial “First consultation in the private home”

A scenario “First consultation in the private home” was jointly developed and tested with the tenants of two flats. They had GIRAFF in their private home and two volunteers of the service dialed into their home through a laptop in order to elaborate the possibilities of home modification or other adaptations. The approx. 20-minute interaction was observed and documented in a protocol, and interviews took place thereafter.

C. Trial “Communication, information and advice in a private home”

GIRAFF was placed in a private household for one week in order to analyze its suitability for communication with family and friends and its suitability for information and advice on independent living. Each morning the first author called in order to test the technical suitability and to provide support if necessary. The mobile counselors called each day in order to discuss different issues on home modification and tested the operation of GIRAFF. Additionally, friends and family members called and operated GIRAFF. Log files were analyzed and participants kept a diary during that period. After the trial a workshop meeting on the experience and the pros and cons was held.

D. Trial of GIRAFF in SILC

The mobile counselors dialed into GIRAFF placed in SILC and could discuss issues on assistive technologies with the experts in SILC. The objective was to explore the potential of SILC for enhanced advice and information. This scenario was tested on two days with observing researchers on both sides.

4. RESULTS

A. Technological aspects

The software of GIRAFF is easy to operate and the volunteers (all 60 plus) could manage it well. The major problem was the internet bandwidth, which caused time lags in transmission. This resulted in GIRAFF still driving while the operator did not know its actual position. Poor transfer speed also affected the communication process. Distorted voice and frozen pictures were irritating, especially in combination with the hearing and vision problems of the participants. One suggestion was to have a joystick as a user interface.

B. Suitability of GIRAFF for information and advice on home modification

A prerequisite for operating GIRAFF is accessible design and internet access. As GIRAFF is rather voluminous, sufficient space is necessary in order to drive GIRAFF through the private home. In the trials, residents removed their carpets and smaller furniture in order to enable secure driving in the flat. Video transmission was very limited with respect to smaller items; they could hardly be recognized by the operator. Communication was judged as easy and compared to “normal” conversation.

5. DISCUSSION AND OUTLOOK

Due to the ageing population there is an increasing need for information on accessible design and assistive technologies. All the participants ascribed a high potential to the telepresence technology, especially with respect to the combination of viewing the product and listening to the expertise of the exhibition team. A necessary prerequisite is high-speed internet access, which is still not available in many parts of Germany. Currently, the team is testing other telepresence robots (Double and VGo).

ACKNOWLEDGMENT

This study is part of the joint project “ERimAlter – Chronic disease, loss and maintenance of functions in old age – social and emotional robotics” of the Goethe-University in Frankfurt am Main and Frankfurt University of Applied Sciences, which was funded by the Federal Ministry of Education and Research.

REFERENCES

[1] Diefenbach et al., Datenbasis Gebäudebestand, Forschungsbericht IWU, 2010.
[2] Kuratorium Deutsche Altershilfe/Wüstenrot Stiftung: Wohnatlas. Rahmenbedingungen der Bundesländer beim Wohnen im Alter. Teil 1. Köln 2014, p. 23.
[3] www.frankfurt-university.de/en/independent_living.
[4] J. Steuer, Defining Virtual Reality: Dimensions Determining Telepresence. SRCT Paper #104, 15.10.1993. Originally published in Journal of Communication 42(4), Autumn 1992, pp. 73-93.
[5] e.g. R. Yamazaki, S. Nishio, K. Ogawa, H. Ishiguro, Teleoperated Android as an Embodied Communication Medium: A Case Study with Demented Elderlies in a Care Facility, 2012 IEEE RO-MAN, 1066-1071 (2012).
[6] http://www.oru.se/ExCITE/Part-4/ (31.05.2015).
[7] http://www.giraffplus.eu/index.php?option=com_content&view=article&id=86&Itemid=87&lang=en (31.05.2015).


A Pilot Study On The Feasibility of Paro Interventions In Intramural Care For Intellectually Disabled Clients

Roger Bemelmansa and Luc de Witteb

aResearch Center for Technology in Care, Zuyd University of Applied Sciences, The Netherlands
bCare and Public Health Research Institute, Maastricht University, The Netherlands

Abstract— Social robots, with Paro being an example, offer new opportunities for innovative approaches in care for people with intellectual disabilities. Paro was used according to individualised interventions during a three-week period. Selected residents would be offered Paro once or twice a week. A total of 8 clients, 5 adults and 3 children, participated. Paro seems to have a positive effect in supporting care for the elderly; for the children no positive effect was reported.

Keywords— Paro, Intellectually Disabled, Pilot, Interventions, Feasibility

INTRODUCTION

Care for people with intellectual disabilities in the Netherlands is traditionally provided by professional caregivers in combination with informal caregivers. Technology is widely regarded as having an important potential for care innovation. ICT technology and robotics, and particularly socially assistive robotics (SAR), are receiving rising attention from innovators [1]. But, as with most assistive robotic developments, the implementation of SAR is, after the technical development of the robot system, a major hurdle on the route to application of the robot in day-to-day care practice. When it is to be applied as an instrument supporting care, there should be an intervention surrounding the robot, specifying the usage, users and purpose of the robot application in such a way that caregivers are guided in putting the robot to effective use and can regard the robot as an instrument in their care provision that renders added value for their clients and their efforts [2].

Three types of interventions were developed in close collaboration with four Dutch care institutions for elderly care [2]. These three interventions aim at:
1) Therapeutic purposes;
2) Facilitating daily care activities;
3) Supporting social visits.

Although research has been done into the published effects of Paro for people with dementia [3], [4], so far little is known about the effects and applications of Paro in the care of people with (multiple) intellectual disabilities. For psychogeriatric care Paro has clearly added value [5], especially for therapy-related interventions; for people, including children, with (multiple) intellectual disabilities a potential added value has not yet been demonstrated. To evaluate the practical application of the interventions developed, in the context of care for people with intellectual disabilities, this paper reports on a feasibility study that was executed. The Paro interventions were applied to individual clients, translating one of the above aims into individualised goals in line with therapeutic or care-related aims formulated for these individuals by the care professionals.

METHODS
The study was executed in 2 locations of Pergamijn, a care organization in Limburg, a southern province of the Netherlands. In each location, local small-scale care units (8-10 residents each) were selected by the organisation for this study. As Paro was new to all care staff, the first step in the study was providing a brief training of the care staff of the involved care units to familiarize them with the robot, its purpose and its foreseen application. For the practical execution of applying Paro interventions in the care units, a procedure was developed leading to clarification of which residents would be involved in the study and for what purposes. Following the selection of participants, Paro was used with the selected residents according to the individualised interventions during a three-week period. Prior to the actual use of Paro, a baseline measurement was taken, also during a three-week period. This baseline measurement concerned observation of the problematic behaviour of each individual.

Organisation involved
Pergamijn is a care organization that advises people with intellectual disabilities and gives professional support; it does so based on the needs and demands of every client. This leads to individual support in the areas of housing, counseling, diagnostics, education, work or leisure.

Intervention
Each individualised intervention contained a description of the problematic behavior, a description of the context and application, and the type of outcomes. Paro tries to stimulate interaction and to attract attention from the participant by providing enjoyment, making soft noises and bowing its head towards the participant, thus reinforcing the interaction. At the onset of the targeted behavior, Paro was introduced by the care provider with a text similar to the following: ”Look, this is the seal Paro. He will sit with you for a while. You can stroke, cuddle or talk to him if you like. He can sit on your lap or stay on the table”. During the activity Paro stayed on a table (or on the participant's lap), so that the participant could interact with it.


The care provider was active in reminding the participant of the presence of Paro if necessary and stimulated interaction between the participant and Paro. At the end of the activity (after about 15 minutes) the session was ended smoothly by saying goodbye to Paro.

Measurement of effect
Feasibility was measured qualitatively by means of a registration form and a diary in which each occasion of Paro use was briefly reported. For each of the Paro interactions the lead nurse filled out a registration form describing the behaviour of the patient just before the intervention started, the reaction of the patient at the moment Paro was offered, the behaviour of the patient during the interaction with Paro, the behaviour and reaction of the patient at the end of the intervention, and the perception of the caregiver regarding the effect of this session. The primary outcome was measured on an individual level by a care provider, based on the Individually Prioritized Problems Assessment (IPPA) score [6]. A mood scale was used as a secondary outcome to validate that the effects reported by the care providers (i.e. the IPPA score) were consistent with the resident's mood.

After the three-week period, care staff was interviewed using a semi-structured qualitative questionnaire, to re-assess the effects as reported in their descriptions and to assess the practicalities involved in applying the Paro interventions and the effects on the patients.
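As a hedged illustration of how the per-client averages reported below (Figures 1 and 2) can be derived from such session records, the following Python sketch aggregates invented registration data into mean IPPA scores per client for the baseline and the Paro condition; the field names and values are hypothetical and are not the study's data:

from collections import defaultdict

# Hypothetical session records: (client, condition, IPPA score for that session).
sessions = [
    ("client A", "baseline", 3.0), ("client A", "baseline", 2.5),
    ("client A", "paro", 4.0),     ("client A", "paro", 4.5),
    ("client B", "baseline", 2.0), ("client B", "paro", 2.0),
]

scores = defaultdict(list)
for client, condition, ippa in sessions:
    scores[(client, condition)].append(ippa)

# Average IPPA score per client, with and without Paro.
for (client, condition), values in sorted(scores.items()):
    print(f"{client} ({condition}): mean IPPA = {sum(values) / len(values):.2f}")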

RESULTS
A total of 8 patients, 5 adults and 3 children, participated. For each participating client one care staff member could be assigned who initiated and evaluated the application of the Paro intervention for this client.

The three children all completed the study, i.e. both the 5 measurements without Paro (baseline) and the 5 interventions with Paro. Of the five adults, four also completed the study. One client was very dismissive towards Paro and the research for this client was ended prematurely.

Figures 1 and 2 show the average IPPA scores for the children and the older clients, respectively.

Fig. 1. Average IPPA score per child, with and without Paro

The results show that Paro has no (significant) impact in terms of the defined intervention goals for the children. Although one child liked Paro and liked interacting with it, Paro had no positive effect in terms of the individualised goal.

Fig. 2. Average IPPA score per adult, with and without Paro

The results also show that Paro has a positive effect on 3 older clients. One client was very defensive towards Paro and one client showed hardly any interest.

During the final evaluation with the care providers the results were confirmed; the staff recognized the findings and saw potential in the use of Paro for the older clients. During the evaluation of the children, the results were also recognized. The staff present (8) felt that Paro probably had no added value for children with profound intellectual and multiple disabilities. Some comments from care staff about the use of Paro for these children were: the robot is too passive, there is too little exercise; Paro should be more adjustable or configurable, it should be personalized for each child; the robot is very big and heavy for these children; Paro is focused on care and attention, whereas these children want to play actively and manipulate toys.

CONCLUSIONS
For the children, Paro seems as yet to have little added value in support of care. The children do not seem to be afraid of Paro and (sometimes) show interest. For the older residents Paro seems to have a more positive influence in the support of care. Given the lead time and the number of participants, these results are only indicative. Wider use of Paro, linked to specific care demands, could give more insight into the possibilities and effectiveness of this seal robot in daily care practice for intellectually disabled people.

REFERENCES

1. Butter M, Rensma A, Boxtel Jv, Kalisingh S, and others. Robotics for Healthcare, Final Report. TNO, commissioned by the European Commission, DG Information Society 2008.
2. Bemelmans R, Gelderblom GJ, Spierts N, Jonker P, de Witte L. Development of robot interventions for intramural psychogeriatric care. GeroPsych: The Journal of Gerontopsychology and Geriatric Psychiatry 2013;26(2):113-120.
3. Bemelmans R, Gelderblom GJ, Jonker P, de Witte L. Socially Assistive Robots in Elderly Care: A Systematic Review into Effects and Effectiveness. Journal of the American Medical Directors Association 2012;13(2):114-120.
4. Broekens J, Heerink M, and Rosendal H. Assistive social robots in elderly care: a review. Gerontechnology 2009;8(2):94-103.
5. Bemelmans R, Gelderblom GJ, Jonker P, de Witte L. Effectiveness of Robot Paro in Intramural Psychogeriatric Care: A Multicenter Quasi-Experimental Study. Journal of the American Medical Directors Association 2015; doi:10.1016/j.jamda.2015.05.007.
6. Wessels R, Persson J, Lorentsen, et al. IPPA: Individually prioritised problem assessment. Technology and Disability 2002;14(3):141-145.


NoAlien! Linguistic alignment with artificial entities in the context of second language acquisition

Carolin Straßmanna, Astrid M. Rosenthal-von der Püttena and Nicole C. Krämera

aSocial Psychology: Media and Communication, University of Duisburg-Essen

Abstract. Second-language speakers often face the situation that native-speakers adapt to non-natives and reduce the complexity of word choice and syntax in order to foster mutual understanding and successful communication. However, the other side of the coin is that this kind of alignment sometimes interferes with successful second language acquisition (SLA) on a native-speaker level. In the present work we explore whether artificial tutors can be used to avoid this negative aspect and exploit the benefits of linguistic alignment in human-computer interaction in order to enhance learning outcomes in SLA. We outline an experimental study (n=130) on the effects of the system’s embodiment (robot, virtual agent or only speech based) and speech output (prerecorded natural speech or text-to-speech) on participants’ perception of the system, their motivation, their lexical and syntactical alignment during interaction and their learning effect after the interaction.

Keywords: human-robot interaction, embodiment, virtual agent, linguistic alignment, second language acquisition

INTRODUCTION

Due to demographic change, many industrial countries lack skilled workers and depend on immigrants from foreign nations. Therefore, the integration of those immigrants becomes more and more important. One of the most essential factors of a successful integration is the ability to communicate with others and to speak the local language fluently. To learn the foreign language, most people concentrate on a classical student-teacher situation, although there is some evidence that people can learn passively by linguistic alignment1. While non-native speakers can learn a lot from a dialog with native speakers, there is also a negative aspect of linguistic alignment processes. Native speakers align to non-native speakers and will use simplified language. Hence, non-native speakers are not able to learn complicated words and syntax to improve their language skills. With regard to this problem, modern forms of technical assistance are especially promising, since the verbal behavior of these systems can be tailored in order to avoid this problem. Therefore, we investigate whether non-native speakers align to robots as well as virtual agents and whether alignment leads to improvements in SLA.

THEORETICAL BACKGROUND

Linguistic Alignment in HHI & HCI

When people talk to each other they adapt linguistic representations to understand each other better. This phenomenon is highly relevant for a successful conversation2 and is widely known as linguistic alignment3. The conversation partner adapts different components of the other's linguistic behavior such as accent, phonetics, speech rate and prosody4. Furthermore, people use the same terms as their conversation partner for a certain object over repeated usage5. Brennan and Clark5 found in their study that participants indirectly form conceptual pacts about lexical choices in order to describe different items without negotiating about it. But people do not only use lexical alignment in a dialog; they also align in a syntactical manner. Although there are plenty of possibilities to express certain statements, Branigan and colleagues3 stated that dialog partners coordinate their syntax and form their sentences in a similar way. This finding is in line with Bock6, who found that participants adapt the structure of their sentences in relation to active and passive formulations. Pickering and Branigan7 showed that syntactical alignment occurs regardless of whether dialog partners use identical or different verbs, even if the effect is more pronounced when similar verbs are used. Overall, many studies prove the existence of linguistic alignment in different ways8,7,3. Moreover, numerous studies show that linguistic alignment takes place in HCI as well. Previous work demonstrates that people align to artificial entities (computers, agents or robots) with regard to lexical9,10 and syntactical choices11 and even with regard to dialect12.

Linguistic alignment in the context of SLA

Linguistic alignment does not only occur between two native speakers but is also present in a dialog between a non-native speaker and a native speaker13. After a conversation with a native speaker, the speech of a non-native speaker was evaluated as more natural than before.13 Therefore, non-native speakers can benefit from a dialog with a native speaker and this conversation can enhance their linguistic skills. Pickering and Garrod14 state that long-term linguistic alignment is elementary for language acquisition. Studies show the benefits non-native speakers can derive from linguistic alignment processes.1,15 Long15 postulates that the alignment in a conversation between


native and non-native speakers increases the understanding of the conversational content when natives align to their dialog partner. This in turn leads to an enhancement of non-natives' language acquisition. However, this process can also have a negative effect. When natives adapt to the linguistic manner of non-natives, they will not make use of complex terms. Thus, non-natives are not able to learn these terms and cannot enhance their linguistic skills to a flawless level. One possibility to avoid this problem may be the use of robots and virtual agents. A technical system like this will use perfect sentences regardless of the linguistic level of its dialog partner. In order to investigate these processes of participants' linguistic alignment to artificial tutors, we conducted a laboratory study and in addition varied diverse aspects of the tutoring system to explore their impact on motivation and learning outcome.

OUTLINE OF EXPERIMENTAL STUDY

In the present study we examine the underlying mechanisms of linguistic alignment in interactions with an artificial tutoring system and explore the impact of certain technical aspects of the system in a 2x3 between-subjects design. First, we compared prerecorded speech (female voice) with text-to-speech (tts) output (female as well), as there may be different effects of alignment. Moreover, this variation also addressed practical implications. If alignment processes are the same, then tts is a much more flexible solution to realize pedagogical agents, as a system could easily be amended with new components. Second, we varied the embodiment of the system (robot vs. virtual agent vs. control version without embodiment). For our study a Wizard-of-Oz setting was used and the system was controlled by the experimenter, while the participants thought that the system worked autonomously.

Participants and procedure

130 non-native speakers (74 female, 56 male) aged between 19 and 53 years (M=26, SD=6.87) took part in this experimental study. They came from 40 different nations and had different levels of German language skills. Upon arrival, participants signed informed consent and then completed diverse tests in order to assess their German proficiency level, followed by a questionnaire asking for demographic variables and assessing personality traits. Afterwards, participants interacted with the language learning system. During the interaction they had to solve three different tasks together with the system, for instance describing pictures in much detail, playing a guessing game, etc. Participants' verbal behavior will be analyzed regarding lexical and syntactical alignment to the respective version of the system. Afterwards, participants evaluated their interaction with the system. We asked for understandability, acceptance, usage intention, learning experience and also for how the respective system was perceived in terms of person perception and social physical presence. In order to assess direct learning outcomes we again captured their linguistic skills at the end of the experiment.
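As a purely illustrative sketch of one way such lexical alignment could be quantified (this is not the study's actual analysis pipeline; the utterances and the stop-word list are invented), one can measure how many of a participant's content words echo the words previously produced by the system:

# Hypothetical measure of lexical alignment: overlap of content words.
STOP_WORDS = {"der", "die", "das", "und", "ein", "eine", "ist"}  # toy German stop-word list

def content_words(utterance):
    return {w for w in utterance.lower().split() if w not in STOP_WORDS}

def lexical_overlap(system_utterance, participant_utterance):
    """Share of the participant's content words already used by the system."""
    system = content_words(system_utterance)
    participant = content_words(participant_utterance)
    if not participant:
        return 0.0
    return len(participant & system) / len(participant)

print(lexical_overlap("Das ist ein rotes Fahrrad", "ein fahrrad und eine lampe"))  # 0.5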

Results and discussion

We report work in progress. We just completed data collection and will have to transcribe and analyze participants' verbal behavior. However, we already have first results regarding the impact of speech output. The results of this study may be very valuable with regard to the application of robots and virtual agents as technical assistance during second language acquisition. As the experiment is not finished yet, the results will be presented and discussed at the conference.

REFERENCES
1. K. McDonough and A. Mackey, Studies in Second Language Acquisition 30, 31-47 (2008).
2. M. J. Pickering and S. Garrod, Behavioral and Brain Sciences 27, 169–189 (2004).
3. H. P. Branigan, M. J. Pickering and A. A. Cleland, Cognition 75, B13–B25 (2000).
4. H. Giles, J. Coupland and N. Coupland, Contexts of accommodation: Developments in applied sociolinguistics. Cambridge: Cambridge University Press, 1991.
5. S. E. Brennan and H. H. Clark, Journal of Experimental Psychology: Learning, Memory and Cognition 22, 1482–1493 (1996).
6. J. K. Bock, Cognitive Psychology 18, 355–387 (1986).
7. M. J. Pickering and H. P. Branigan, Journal of Memory and Language 39, 633-651 (1998).
8. M. C. Potter and L. Lombardi, Journal of Memory and Language 38, 265-282 (1998).
9. S. E. Brennan and H. H. Clark, Journal of Experimental Psychology: Learning, Memory and Cognition 22, 1482–1493 (1996).
10. A. M. Rosenthal-von der Pütten, L. Wiering and N. C. Krämer, i-com Zeitschrift für interaktive und kooperative Medien 12, 32-38 (2013).
11. H. P. Branigan, M. J. Pickering, J. Pearson, J. F. McLean and C. Nass, in Proceedings of the Twenty-fifth Annual Conference of the Cognitive Science Society, 186–191 (2003).
12. V. Kühne, A. M. Rosenthal-von der Pütten and N. C. Krämer, "Using Linguistic Alignment to Enhance Learning Experience with Pedagogical Agents: The Special Case of Dialect," in Lecture Notes in Computer Science. Intelligent Virtual Agents, edited by R. Aylett, B. Krenn, C. Pelachaud, & H. Shimodaira, Berlin, Heidelberg: Springer Berlin Heidelberg, 2013, pp. 149–158.
13. H. Bortfeld and S. E. Brennan, Discourse Processes 23, 119-147 (1997).
14. M. J. Pickering and S. Garrod, Behavioral and Brain Sciences 36, 329-347 (2013).
15. M. H. Long, Applied Linguistics 4, 126-141 (1983).


Transitional Wearables Based on Bio-Signals to Improve Communication and Interaction of Children with Autism

Beste Özcana, Tania Morettaa, Maria Nicoletta Alibertib and Gianluca Baldassarrea

aLaboratory of Computational Embodied Neuroscience, Institute of Cognitive Sciences and Technologies, National Research Council (LOCEN-ISTC-CNR), Rome, Italy

bChildren Neuropsychiatry Institute (I.N.I.) shared with Villa Dante, Rome, Italy

ABSTRACT
We propose a novel interaction method based on a type of wearable interfaces called transitional wearables (TW). TW allow gathering physiological data from children with autism and can be used to facilitate their communication and interaction with parents and caregivers during daily life activities. Communication plays a key role in children's mental and social development [1]. The variable symptoms of autism are generally grouped under the name of autism spectrum disorder (ASD) [2]. ASD patients are characterized as having difficulties in social interaction [3] and communication [4], and a tendency to fixate on limited interests and repetitive behaviors [2]. They show less interaction in free play situations and rarely initiate social interaction [5].

In several medical fields there is an increasing need for ecological monitoring of physiological variables to support medical interventions and therapies outside the clinical setting [6]: wearables with biosensors could contribute to meeting this need. The application areas are numerous; for example, there is an increasing interest in early-age detection of ASD as well as in exploiting the gathered knowledge to create better therapies [7]. Indeed, a main problem involving ASD diagnosis, evaluation and treatment is the internal emotional state of the patient [9]. Canonical biosensors, however, do not provide access to physiological data in real time in daily life or during therapeutic training, thus losing important information. Moreover, they are often expensive and difficult to use on certain types of patients, e.g. on ASD patients who refuse contact or low motion [8].

Our project aims to develop and test a novel wearable which is capable of real-time and long-term physiological monitoring by recording Galvanic Skin Response (GSR), Skin Temperature (SKT) and heartbeat, and also to use an accelerometer embedded in a wristband. These devices are low cost, low power, and non-intrusive [8]. While most studies done in this field (e.g., [10]–[12]) are restricted to measurements in laboratories, they have demonstrated that there is significant emotion-related information that can be recognized through physiological activity [13]. Our aim is to use this information to identify and translate physiological output into information on basic emotions understood by the caregiver.

The real world's expectations and judgments involved in social contexts might appear “unsafe” to children with autism, and this makes social interactions problematic [4]. Many children with ASD develop an attachment to a “transitional object”, e.g. a teddy bear. This is used as a reliable source of soothing and confidence during the exploration of the world independently of parents and caregivers [14]. It is known that computer technologies have the potential to support children during interactions to facilitate their life. For instance, interactive toys controlled by the child can: (1) provide predictability through cause-and-effect functions, which reassures the child [15]; (2) form a safe bridge to the less predictable world formed by other objects and people; (3) accompany them in the daily world's learning and interactions (e.g., cleaning teeth, travelling in a car); (4) help them learn to interact socially [16].

Wearable devices with biosensors can systematically collect information about actions and emotional states of children and communicate them wirelessly to an external computer (e.g., a mobile phone or a tablet). The information so gathered can be automatically processed based on pattern-recognition and other machine-learning algorithms and provide information usable in real time to guide interventions, e.g. in the form of alert messages or text messages for the caregivers [17]. TW could gather bio-signals from children with autism during their social and collaborative activities in a friendly and comfortable way as they can be integrated easily in different types of objects, such as toys and clothing, without the child noticing the sensors. This would also provide a novel means through which multi-sensory feedback and cause-effect object behaviors could be used to motivate and reinforce social interaction while engaging in life and therapy activities [15]. The cause-effect nature of such interaction would give the child a higher sense of control and hence mitigate fearful and avoidance reactions [18].

Computers and other similar electronic devices tend to promote non-social use, and this could drive the child to further isolate from the outside world or become hyper-focused, falling trapped in obsessive-compulsive behaviors. Instead, suitably designed TW for children with autism can be used in daily life


contexts and thus can possibly have a positive impact on children's social life [19]. For this purpose, positive/rewarding sensory feedback from the wearables (e.g., colored LEDs, sounds) can be made dependent on the performance of communication actions with the caregivers. Owing to their richness and programmable nature, TW could thus be used to facilitate the exploration and development of divergent behaviors, leading the child to “accommodate” to novel contexts, experiences, and social interactions [20].

By collaborating with therapists, psychologists, biomedical engineers and psychomotor therapists, we are now prototyping design solutions for TW that are non-intrusive and allow the collection of data from children with ASD. We are also defining an experimental protocol to empirically test the TW with children with autism. The main objective of the test will be to verify the effectiveness of this approach by analyzing the recorded data related to the emotional reactions of children to TW.

Keywords: Autism, transitional object, wearables based on biosensors, stable-reassuring interactions.
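The following is a minimal, purely hypothetical sketch (not the project's software) of the kind of processing chain mentioned above, in which windows of bio-signal samples are reduced to simple features and a threshold rule produces an alert message for the caregiver; the thresholds and field names are invented, and a real system would use trained pattern-recognition models instead:

from statistics import mean

def features(window):
    """Average GSR, skin temperature and heart rate over a window of samples."""
    return {
        "gsr": mean(sample["gsr"] for sample in window),
        "skt": mean(sample["skt"] for sample in window),
        "hr":  mean(sample["hr"] for sample in window),
    }

def caregiver_alert(window, gsr_limit=4.0, hr_limit=110):
    """Return an alert string when the (invented) distress thresholds are exceeded."""
    f = features(window)
    if f["gsr"] > gsr_limit and f["hr"] > hr_limit:
        return f"ALERT: possible distress (GSR={f['gsr']:.1f}, HR={f['hr']:.0f})"
    return None

window = [{"gsr": 4.5, "skt": 33.1, "hr": 118}, {"gsr": 4.2, "skt": 33.0, "hr": 121}]
print(caregiver_alert(window))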

REFERENCES
[1] Y. Takano and K. Suzuki, “Affective communication aid using wearable devices based on biosignals,” Proc. 2014 Conf. Interact. Des. Child. - IDC ’14, pp. 213–216, 2014.

[2] K. Welch, “Physiological signals of autistic children can be useful,” Instrum. Meas. Mag. IEEE, no. February, 2012.

[3] R. el Kaliouby and P. Robinson, “Therapeutic Versus Prosthetic Assistive Technologies: The Case Of Autism,” 2005.

[4] E. I. Konstantinidis, A. Luneski, C. a. Frantzidis, P. Costas, and P. D. Bamidis, “A proposed framework of an interactive semi-virtual environment for enhanced education of children with autism spectrum disorders,” 2009 22nd IEEE Int. Symp. Comput. Med. Syst., pp. 1–6, Aug. 2009.

[5] N. Yuill, S. Strieth, C. Roake, R. Aspden, and B. Todd, “Brief report: designing a playground for children with autistic spectrum disorders--effects on playful peer interactions.,” J. Autism Dev. Disord., vol. 37, no. 6, pp. 1192–6, Jul. 2007.

[6] B. Tegler, M. Sharp, and M. A. Johnson, “Ecological monitoring and assessment network’s proposed core monitoring variables: An early warning of environmental change,” Environ. Monit. Assess., vol. 67, no. 1–2, pp. 29–56, 2001.

[7] M.-Z. Poh, N. Swenson, and R. W. Picard, "A wearable sensor for unobtrusive, long-term assessment of electrodermal activity," IEEE Trans. Biomed. Eng., vol. 57, no. 5, pp. 1243–1252, 2010.

[8] A. M. Jimenez, “Physiological Sensor,” University of Louisville, 2013.

[9] J. A. Kientz, G. R. Hayes, T. L. Westeyn, T. Starner, and G. D. Abowd, “Pervasive computing and autism: Assisting caregivers of children with

special needs,” IEEE Pervasive Comput., vol. 6, no. 1, pp. 28–35, 2007.

[10] C. Liu, K. Conn, N. Sarkar, and W. Stone, “Physiology-based affect recognition for computer-assisted intervention of children with Autism Spectrum Disorder,” Int. J. Hum. Comput. Stud., vol. 66, no. 9, pp. 662–677, 2008.

[11] S. Fairclough and K. Gilleade, "Brain and body interfaces: Designing for meaningful interaction," Proc. …, pp. 1–4, 2011.

[12] S. Schachter and J. Singer, “Cognitive, social, and physiological determinants of emotional state.,” Psychol. Rev., vol. 69, pp. 379–399, 1962.

[13] R. W. Picard, “Future affective technology for autism and emotion communication.,” Philos. Trans. R. Soc. Lond. B. Biol. Sci., vol. 364, no. 1535, pp. 3575–84, Dec. 2009.

[14] O. Stevenson, “The First Treasured Possession: A Study of The Part Played by Specially Loved Objects And Toys In The Lives of Certain Children,” Psychoanal. Study Child, vol. 9, pp. 199–217, 1954.

[15] A. Dsouza, M. Barretto, and V. Raman, "Uncommon Sense: Interactive sensory toys that encourage social interaction among children with autism," … 2012 Workshop Interact. …, pp. 2–5, 2012.

[16] M. Trimingham, “‘Objects in transition: the puppet and the autistic child,’” vol. 1, no. 3, pp. 251–265, 2010.

[17] R. R. Fletcher, M. Z. Poh, and H. Eydgahi, “Wearable sensors: Opportunities and challenges for low-cost health care,” in 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC’10, 2010, pp. 1763–1766.

[18] J. C. J. Brok and E. I. Barakova, “Engaging autistic children in imitation and turn-taking games with multiagent system of interactive lighting blocks,” Lect. Notes Comput. Sci. (including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), vol. 6243 LNCS, pp. 115–126, 2010.

[19] B. Özcan, “Motivating Children with Autism to Communicate and Interact Socially Through the ‘ + me ’ Wearable Device,” in Conferenza Annuale dell’Associazione Italiana di Scienze Cognitive (AISC-CODISCO), 2014, pp. 65–71.

[20] D. Caligiore, P. Tommasino, V. Sperati, and G. Baldassarre, “Modular and hierarchical brain organization to understand assimilation, accommodation and their relation to autism in reaching tasks: a developmental robotics hypothesis,” Adapt. Behav., vol. 22, no. 5, pp. 304–329, Sep. 2014

 


Roboterapia: an environment supporting therapists’ needs

Igor Zubrycki (a), Jaroslaw Turajczyk (b) and Grzegorz Granosik (a)

(a) Institute of Automatic Control, Lodz University of Technology, Lodz, Poland; (b) Uczelnia Nauk Spolecznych, Lodz, Poland

Abstract— In this paper we highlight the needs of therapists and the possibilities of using robots in the therapeutic environment. While the use of robots as therapeutic devices is widely studied (especially with children with developmental needs), robots could be used in a wider scope. We have performed need-finding sessions with therapists and produced prototype devices that could support different aspects of therapists' work.

Keywords— interdisciplinary collaboration, robotized environment, end users' needs, autism therapists' needs

A. Introduction

Therapists, especially those working with mentally disabled children, suffer a high rate of burnout resulting from both patient characteristics and work environment factors [1].

We have focused on the whole therapeutic environment, believing that intelligent, programmable agents could improve not only therapy but also therapist well-being.

Together with our partners – designers from the Strzeminski Academy of Fine Arts Lodz (both student designers and professionals) – and a willing group of autism therapists (nine practitioners) from the Navicula Centre for Autism Therapy, we tried to understand the possible role that robotic systems could have in the therapist's workplace and to design adequate solutions.

B. An environment-centered approach

Robots have been proven capable of being social actors such as teachers or play partners to therapy clients. We propose using robots as assistants to the therapists. A therapeutic environment can then be understood as a space with three participants: therapist, client/patient and robot.

We conducted need-finding sessions (interviews, observations, ideation sessions) to find out how robotic technologies could be used, without limiting their use to the therapy itself. As studies of burnout have shown, work environment factors such as lack of clarity, lack of support and general overload can play a bigger role in its development than the characteristics of the patients [1]. Therefore, by extending the role of a robot beyond that of a therapeutic tool, its impact on the long-term well-being of both therapist (directly) and client (through better therapy) could be increased.

The use of robots in the therapeutic environment can be divided into their use as a therapeutic tool and their use in a supportive role in the therapist's other work. The use of robots in the therapy of children with autism has been studied in projects such as AuRoRA [2] and Keepon [3]. In our own interviews and need-finding sessions, therapists stated that in order to be useful, robots need to be interactive, programmable and personalizable, with the ability to fine-tune stimuli [4]. This suggests that an environment where therapists can program and control the therapy themselves is needed, which agrees with the findings of Barakova et al. [5].

As our group of therapists was small, we decided to use mostly open questions and to treat the therapists as co-creators of robotic solutions. The biggest obstacles our group reported in their work are similar to those found in a larger study [6]: poor relations with supervisors and parents, a lack of visible results (that could be shown to parents and supervisors), and loneliness (as therapists frequently work alone with their clients for long periods).

This suggested using robots in a supportive role, in which the therapists showed great interest. After the current limitations of state-of-the-art speech, emotion and activity recognition technologies were explained, the therapists showed interest in a number of particular robotic roles. We list the most interesting below:

• a helper in critical/dangerous situations. As therapists frequently work alone with their clients, they can have a difficult time when a patient behaves in a way that requires help (aggression against themselves or other people). A robot could be used to distract, call for help or be teleoperated by another person who could soothe the client.

• a record keeper and reporting device. Therapists work as part of a larger institution and are frequently required to report on a particular patient's behaviours and therapy progress. Patients' parents may also doubt that progress is being made or that certain activities are taking place. Robots can record parts of the therapy and report on them, both for administrative purposes and for communication with parents.

• an "emotional mirror" for both the patient and the therapist. Therapy is a dynamic situation in which it can be hard for the therapist to always understand the client's emotions as well as their own, such as anger and frustration, which can negatively influence the therapy. By signalling the emotions that occur, robots could give the therapist a chance to mediate the situation before it influences the therapy.

• a "team player". A robot can influence therapy dynamics by stating that it does not like some behaviour (thereby moderating conflict [7]) and by proposing or finishing activities (managing pace).


Fig. 1: Exploration Box. Authors: Magdalena Bartczak, Olga Rogalska, Szymon Surma, Magdalena Gregorczyk, Dariusz Urbanski

C. Results

Through iterative design processes in which therapists were considered the final users, we created two groups of prototype robotic devices that could be used in the therapeutic environment. The first group consists of devices for sensory therapy, where therapists can program series of stimuli and the device's reactions (warming up, moving, generating sounds) to the actions of the child. A device of this kind – an "Exploration Box" – is presented in Fig. 1. Therapists can program the device through a Scratch-based graphical programming language.

The second group of devices consists of more universal mobile robots that can be used both as part of the therapy and as support for the therapists. The devices have exchangeable casings, with two designs presented in Fig. 2.

Our designs are ROS-connected and have tablet-based interfaces. Speech commands are recognised using the Wit.Ai software, a set of tools for enabling speech recognition in the internet of things [8].
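As a rough sketch (not the project's actual code) of how such a ROS-connected device could be driven, the following minimal rospy node listens for recognized speech commands – for example text forwarded by a Wit.Ai client – and republishes them as stimulus commands for a sensory device; the topic names and the command vocabulary are hypothetical:

#!/usr/bin/env python
# Minimal sketch of a ROS node bridging recognized speech to device stimuli.
# Topic names and the command set are hypothetical placeholders.
import rospy
from std_msgs.msg import String

# Map a few spoken commands to stimulus identifiers understood by the device.
COMMANDS = {"warm up": "heat_on", "play sound": "sound_1", "move": "wiggle"}

class SpeechToStimulus:
    def __init__(self):
        self.pub = rospy.Publisher("/sensory_device/stimulus", String, queue_size=10)
        rospy.Subscriber("/speech/recognized_text", String, self.on_speech)

    def on_speech(self, msg):
        command = COMMANDS.get(msg.data.strip().lower())
        if command:
            rospy.loginfo("Triggering stimulus: %s", command)
            self.pub.publish(String(data=command))

if __name__ == "__main__":
    rospy.init_node("speech_to_stimulus")
    SpeechToStimulus()
    rospy.spin()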

Therapists from the Navicula Centre for Autism Therapy performed preliminary tests of the sensory therapy devices. They evaluated the first iteration of the devices as useful and correct but not completely novel. In the second iteration, the designers used more unique robotic functionalities by combining voice recognition, motion generation and cooperation between the devices. The educational aspect of this project, as most of the designs were developed by students of Lodz University of Technology and the Strzeminski Academy of Fine Arts Lodz, is described in [9].

Fig. 2: Different exchangeable casings for the universal mobile robot used in the therapeutic environment. First: a doll-like shell for a mobile robot, designed by Honorata Lukasik. Second: a soft shell with a place for a tablet, designed for a therapeutic assistant robot.

D. Conclusion and future work

The robot's role in therapy does not need to be constrained to that of a tool. Other parts of the work, critical for both the therapy and the therapist's long-term well-being, can also be robotized. In this work we aimed to collect and describe therapists' needs and to show designs that could fit those needs.

In future work we plan to equip the robots with all the functionalities that would allow them to act as more universal assistants in therapy, namely emotion and behaviour recognition. In longitudinal studies we will analyse therapist burnout as a function of the usage of the robotized environment.

REFERENCES

1. C. Cherniss, "Stress, burnout, and the special services provider," Special Services in the Schools, vol. 2, no. 1, pp. 45–61, 1985.

2. K. Dautenhahn, "Aurora project," www.aurora-project.com, 2006, accessed: 25 November 2014. [Online]. Available: www.aurora-project.com/

3. H. Kozima, M. Michalowski, and C. Nakagawa, "Keepon," Int. Journal of Social Robotics, vol. 1, no. 1, pp. 3–18, 2009. [Online]. Available: http://dx.doi.org/10.1007/s12369-008-0009-8

4. I. Zubrycki and G. Granosik, "A robotized environment for improving therapist everyday work with children with severe mental disabilities," in Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts, ser. HRI'15 Extended Abstracts. New York, NY, USA: ACM, 2015, pp. 203–204. [Online]. Available: http://doi.acm.org/10.1145/2701973.2702717

5. E. I. Barakova, J. Gillesen, B. Huskens, and T. Lourens, "End-user programming architecture facilitates the uptake of robots in social therapies," Robotics and Autonomous Systems, vol. 61, no. 7, pp. 704–713, 2013.

6. E. Pisula and M. Pietka, "Wypalenie zawodowe u terapeutow zajeciowych pracujacych z osobami z autyzmem" [Occupational burnout in occupational therapists working with people with autism], Czlowiek-Niepelnosprawnosc-Spoleczenstwo, p. 139.

7. M. F. Jung, N. Martelaro, and P. J. Hinds, "Using robots to moderate team conflict: The case of repairing violations," in Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction, ser. HRI '15. New York, NY, USA: ACM, 2015, pp. 229–236. [Online]. Available: http://doi.acm.org/10.1145/2696454.2696460

8. Wit.ai, "Wit.ai documentation," 2015. [Online]. Available: https://wit.ai/docs

9. I. Zubrycki and G. Granosik, "Technology and art – solving interdisciplinary problems," in 6th International Conference on Robotics in Education, RiE 2015, HESSO.HEIG-VD, Yverdon-les-Bains, Switzerland.


AISOY Social Robot as a tool to learn how to code versus tangible and non-tangible approaches

Olga Sans-Cope (a), Albert Valls (b), Marc Galvez-Font (a), Marc Garnica (a), Vicens Casas (a), Sandra Pico (b), Alex Barco (b) and Jordi Albo-Canals (b)

(a) Polytechnic University of Catalonia; (b) La Salle, Ramon Llull University

Abstract— In this paper we present a study to validate the use of educational social robotics as a hybrid system between the traditional approach of using computer-based technology in the classroom and the newer approach of using tangible devices such as educational robots. To accomplish our goal we organised a workshop with 36 participants, in which students between 8 and 12 years old had to program a rock-paper-scissors player using Scratch on a computer, Scratch on a computer (Enchanting) + LEGO NXT, and the educational social robot AISOY programmed with Scratch.

Keywords— Education, Social Robots, Tangible Device

1. INTRODUCTION

Research involving technology in education has two trending topics. The first concerns technology as the basis of STEM or STEAM learning; the second concerns computational or engineering thinking. The first foresees that the use of technology attracts and engages students with science and technology, while the second holds that engineering skills are used in everyday life and that, through them, people develop a better human sensitivity [1].

This raises the controversy between virtual and tangible devices. Some researchers claim that tangible devices increase the level of immersion because students manipulate things in the real world [2]. However, other studies argue that non-tangible devices bring more flexibility and avoid the limitations imposed by a physical body in real space; furthermore, the authors of [3] note a lack of evidence that tangible systems offer any benefits compared to onscreen counterparts. What seems logical is a hybrid approach such as the one presented in [3], where merging the physical and virtual worlds provides more flexibility to teachers and learners.

In this paper we propose and study the benefits of a combined tangible/non-tangible system for education based on a social robot named AISOY. We have structured this abstract as follows: section II presents the methodology used to compare a tangible system, a non-tangible system and a hybrid system, and section III presents and discusses indicators from the analysis of the data obtained.

2. METHODOLOGY

For this study we selected a population of 36 students from a summer camp organized in Barcelona by ClauTIC [6] at the La Salle BCN - Ramon Llull University facilities. The students were between 8 and 12 years old and did this activity as a workshop organised alongside a summer camp about robotics. They were divided into three classrooms or groups of 12 each, and in each classroom there were 4 groups of 3 participants.

The activity is a 2-hour session in which the children build and program a rock-paper-scissors player. As can be seen in Fig. 1, each classroom has different resources to accomplish the goal: study group A has a computer with the Scratch software, study group B has the AISOY educational social robot programmed through Scratch, and study group C has a commercial LEGO NXT 2.0 set programmed through Scratch (Enchanting). Group A interacts with the computer, and the interaction system is the Scratch window. In group B, the students have the computer with Scratch linked to AISOY, an educational social robot platform. Finally, in group C the students have the LEGO NXT 2.0 sensors and motors to build the physical agent that plays the game, also connected to the Scratch software.

Fig. 1. The three platforms the students used to implement the rock-paper-scissors game.

We measure not only the absolute data acquired from the sessions, but also the incremental gain based on a pre-test and a post-test conducted at the beginning and at the end of the session.
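Since Scratch and Enchanting are block-based, the children's programs cannot be reproduced as text here; the following Python sketch only illustrates the rock-paper-scissors game logic they were asked to build (random move, comparison with the player's move, score keeping), with console input standing in for the robot's or computer's interaction channel:

import random

# Rough text-based sketch of the rock-paper-scissors logic built with Scratch
# blocks: the agent picks a random move, compares it with the player's move
# and keeps score. Console I/O here stands in for the workshop platforms.
MOVES = ("rock", "paper", "scissors")
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def play_round(player_move):
    robot_move = random.choice(MOVES)
    if player_move == robot_move:
        return robot_move, "draw"
    return robot_move, ("player" if BEATS[player_move] == robot_move else "robot")

if __name__ == "__main__":
    score = {"player": 0, "robot": 0, "draw": 0}
    for _ in range(3):
        move = input("rock, paper or scissors? ").strip().lower()
        if move not in MOVES:
            print("Unknown move, skipping round.")
            continue
        robot_move, winner = play_round(move)
        score[winner] += 1
        print(f"Robot played {robot_move} -> {winner} wins" if winner != "draw"
              else f"Robot played {robot_move} -> draw")
    print("Final score:", score)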


A. Setup of the Study

The sessions are recorded with two cameras that cover the whole classroom, and one camera per table covering the working space and the kids.

B. Evaluation Metrics

We evaluate the following skills:

• Level of autonomy: how many times they ask for help, and the capacity to divide a complex task into subtasks.

• Creativity: we measure the differences between the designs and solutions that the kids come up with. These can concern the coding or the building.

• Coding performance: the items evaluated here are the understanding of the concepts of variable, loop, and conditional.

• Building performance: how robust and reliable the system is, and the robustness of the implementation.

• Hardware knowledge: how a sensor and an actuator work.

• Social skills: winning, losing, greeting, cheating, mercy.

• Application to the real world: which solutions allow the student to map what they learn to applications in the real world.

3. RESULTS AND CONCLUSIONS

Not all the kids that participated in the study were familiar with the Scratch software, the LEGO NXT 2.0, or the AISOY robot. However, the number of times they had played with LEGO or Scratch was much higher than with the AISOY robot. A novelty effect could have contributed to focus on the activity, so the students in group B paid more attention compared to groups A and C.

While all children played nicely during the test phase, group A treated the child-computer interaction as if it were a video game, group B had a more social game and considered the robot a human-like competitor, and group C, who were using the LEGO NXT 2.0, created a child-machine interaction context.

When the implementation was forced to cheat with the result, groups A and C attributed the failure to the system, showing emotional states of anger and frustration. In group B the reaction was quite different: the students enjoyed it when the robot gave the wrong answer in the game. Implementation B helps to work on issues like fair play and cheating while creating a positive atmosphere at the same time.

The group using the LEGO NXT 2.0 set (C) asked for help the highest number of times, which makes sense because this was the group with the widest diversity of elements. Group B needed to ask for help about the same issue more times than the other groups. We believe that missing a tangible context makes the specific coding task harder to understand. Group C had a better balance between solving questions quickly and the number of questions generated.

During the sessions, we asked in the pre- and post-test about the applicability of the Scratch software. While in group A 100% of the answers, before and after the session, were "to program" or "to program video games", groups B and C included not only video games but also robots (in the case of group C), and 2 students in group B answered robots or other devices. However, we expect that better results can be obtained by increasing the number of participants, the diversity of activities, and the number of sessions.

If we focus on two of the evaluation metrics that represent how well the students learnt new concepts (what a variable is and what a motor is), the increase in the percentage of good answers is as follows:

• Coding performance: the percentage of students that understood what a variable is was 10% in case A, 17% in case C, and 50% in case B.

• Hardware knowledge: the percentage of students that understood what a motor is was 25% in case A, 59% in case C, and 42% in case B.

AISOY obtained better results for understanding an intangible concept such as a variable, while the LEGO NXT works better for understanding a tangible and specific component such as a motor. Interestingly, in case B the answers about what a motor is mostly referred to making the robot work in its environment, while in case C they were more about turning the wheels.

Regarding how much they liked the activity, the score obtained on a scale from 1 to 5 was 4.25 for case A, 4 for case B, and 4.25 for case C. The conclusion is that all of them were good enough in terms of fun.

Finally, we observed that group B had a better capacity to map what they learned to applications in the real world.

Other considerations to take into account for further research are: 1) team teaching, understood as how to organise group roles, balance tasks, and make sure that everyone understands the concepts and processes, and 2) the ways of playing with the final implementation: child-to-system play, multiple-children-to-system play, and children taking turns to play.

ACKNOWLEDGMENT

We thank ClauTIC for being involved with this study and for facilitating part of the material. This project is part of a La Salle - Ramon Llull University funded project called Robotics@Schools.

REFERENCES

[1] Saarinen, Esa, and Raimo P. Hämäläinen. "Systems intelligence: Connecting engineering thinking with human sensitivity." Systems Intelligence in Leadership and Everyday Life (2007): 51-78.

[2] Antle, Alissa N., Milena Droumeva, and Daniel Ha. "Hands on what?: comparing children's mouse-based and tangible-based interaction." Proceedings of the 8th International Conference on Interaction Design and Children. ACM, 2009.

[3] Horn, Michael S., R. Jordan Crouser, and Marina U. Bers. "Tangible interaction and learning: the case for a hybrid approach." Personal and Ubiquitous Computing 16.4 (2012): 379-389.

[4] Hughes, Jan N., and Oi-man Kwok. "Classroom engagement mediates the effect of teacher-student support on elementary students' peer acceptance: A prospective analysis." Journal of School Psychology 43.6 (2006): 465-480.

[5] Hughes, Jan N., and Oi-man Kwok. "Classroom engagement mediates the effect of teacher-student support on elementary students' peer acceptance: A prospective analysis." Journal of School Psychology 43.6 (2006): 465-480.

[6] http://www.clautic.com/index.php


Ensuring Ethical Behavior from Autonomous Systems

Michael Anderson (a), Susan Leigh Anderson (b), and Vincent Berenz (c)

(a) University of Hartford, Dept. of Computer Science, W. Hartford, CT, USA
(b) University of Connecticut, Dept. of Philosophy, Storrs, CT, USA
(c) Max Planck Institute, Autonomous Motion Department, Tuebingen, Germany

Autonomous systems that interact with human beings require particular attention to the ethical ramifications of their behavior. A profusion of such systems is on the verge of being widely deployed in a variety of domains. These interactions will be charged with ethical significance and, clearly, these systems will be expected to navigate this ethically charged landscape responsibly. As correct ethical behavior not only involves not doing certain things, but also doing certain things to bring about ideal states of affairs, ethical issues concerning the behavior of such complex and dynamic systems are likely to exceed the grasp of their designers and elude simple, static solutions. To date, the determination and mitigation of the ethical concerns of such systems has largely been accomplished by simply preventing systems from engaging in ethically unacceptable behavior in a predetermined, ad hoc manner, often unnecessarily constraining the system's set of possible behaviors and domains of deployment. We assert that the behavior of such systems should be guided by explicitly represented ethical principles determined through a consensus of ethicists [1][2][3]. Principles are comprehensive and comprehensible declarative abstractions that succinctly represent this consensus in a centralized, extensible, and auditable way. Systems guided by such principles are likely to behave in a more acceptably ethical manner, permitting a richer set of behaviors in a wider range of domains than systems not so guided.

To help ensure ethical behavior, a system’s ethically relevant actions should be weighed against each other to determine which is the most ethically preferable at any given moment. It is likely that ethical action preference of a large set of actions will be difficult or impossible to define extensionally as an exhaustive list of instances and instead will need to be defined intensionally in the form of rules. This more concise definition is possible since action preference is only dependent upon a likely smaller set of ethically relevant features that actions involve. Given this, action preference can be more succinctly stated in terms of satisfaction or violation of duties to either minimize or maximize (as appropriate) each feature. We refer to intensionally defined action preference as a principle.

As it is likely that in many particular cases of ethical dilemmas ethicists agree on the ethically relevant features and the right course of action in many domains where autonomous systems are likely to function, generalization of such cases can be used to help discover principles needed for their ethical guidance. A principle abstracted from cases that is no more specific than needed to make determinations complete and consistent with its training can be useful in making provisional determinations about untested cases. If such principles are explicitly represented, they have the added benefit of helping justify a system's actions as they can provide pointed, logical explanations as to why one action was chosen over another. Cases can also provide a means of justification for a system's actions: as an action is chosen for execution by a system, clauses of the principle that were instrumental in its selection can be determined and, as clauses of principles can be traced to the cases from which they were abstracted, these cases and their origin can be ascertained and used as justification for a system's action by analogy.

A principle that determines which of two actions is ethically preferable can be used to define a transitive binary relation over a set of actions that partitions it into subsets ordered by ethical preference with actions within the same partition having equal preference. This relation can be used to sort a list of possible actions and find the currently most ethically preferable action(s) of that list. This forms the basis of a case-supported principle-based behavior paradigm (CPB): a system decides its next action by using a principle, abstracted from cases where a consensus of ethicists is in agreement, to determine the most ethically preferable one(s).

Currently, we are using our general ethical dilemma analyzer (GenEth) [4] to develop an ethical principle to guide the behavior of a Nao robot in the domain of eldercare. The robot’s current set of possible actions includes charging, reminding a patient to take his/her medication, seeking tasks, engaging with patient, warning a non-compliant patient, and notifying an overseer. Sensory data such as battery level, motion detection, vocal responses, and visual imagery as well as overseer input regarding an eldercare patient are used to determine values for action duties pertinent to the domain. Currently these include maximize honor commitments, maximize readiness, minimize harm, maximize possible good, minimize non-interaction, maximize respect for autonomy, and minimize persistent immobility. Clearly these sets of values are only subsets of what will be required in situ but they are representative of them and can be extended.
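As a rough sketch of how duty-based action preference can drive action selection, the snippet below represents a few of the eldercare actions as duty-value profiles and sorts them with a pairwise preference function; the numeric values and the simple lexicographic rule are hypothetical illustrations, not the principle learned by GenEth.

from functools import cmp_to_key

# Hypothetical duty profiles for a few eldercare actions: each value states how
# strongly the action satisfies (+) or violates (-) a duty. Numbers are invented
# for illustration; real values would come from sensors and overseer input.
ACTIONS = {
    "remind_medication": {"honor_commitments": 2, "minimize_harm": 1, "respect_autonomy": -1},
    "notify_overseer":   {"honor_commitments": 1, "minimize_harm": 2, "respect_autonomy": -2},
    "seek_tasks":        {"honor_commitments": 0, "minimize_harm": 0, "respect_autonomy": 0},
}

# Order in which duties are compared; a learned principle would encode a more
# nuanced rule abstracted from cases, not this simple lexicographic check.
DUTY_PRIORITY = ["minimize_harm", "honor_commitments", "respect_autonomy"]

def preferable(a, b):
    """Return True if action a is ethically preferable to action b (sketch)."""
    for duty in DUTY_PRIORITY:
        if ACTIONS[a][duty] != ACTIONS[b][duty]:
            return ACTIONS[a][duty] > ACTIONS[b][duty]
    return False

def rank_actions(actions):
    """Sort actions so the most ethically preferable (per the sketch) comes first."""
    cmp = lambda a, b: -1 if preferable(a, b) else (1 if preferable(b, a) else 0)
    return sorted(actions, key=cmp_to_key(cmp))

if __name__ == "__main__":
    print(rank_actions(list(ACTIONS)))  # ['notify_overseer', 'remind_medication', 'seek_tasks']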


The robot’s behavior at any given time is determined by sorting the actions by their ethical preference (represented by their duty values) and choosing the highest ranked one. As the following learned principle returns true if the first of a pair of actions is ethically preferable to the second, it can be used as the comparison relation required by such sorting:

This principle was abstracted from a number of particular cases of ethical dilemma types in which there is a consensus as to the ethically relevant features involved and ethically preferable action. Again, it is only representative of a full principle that will be required but it too is extendable.

To gauge the performance of principles generated by GenEth, we sought the considered choice of ethically relevant action from a panel of five applied ethicists (including the project ethicist) in 28 cases in four domains, one for each principle being tested that was abstracted by GenEth. These questions were drawn both from training (60%) and non-training cases (40%). Of the 140 responses, the ethicists agreed with the system's judgment on 123, or about 88% of the time. We believe this result will only improve as the principles are further specified and the cases are more precisely stated.

Because autonomous robots are complex dynamic systems that must enforce stable control loops between sensors, an estimated world model and action, the integration of decision systems and high-level behaviors into robots is a challenging task. This holds especially when human-robot interaction is one of the objectives, as the resulting robotic behavior has to look natural to any external observer. To deal with this complexity, we interfaced CPB with Fractal, our state-of-the-art customizable robotic architecture. Fractal allows easy implementation of complex dynamic behaviors. It transparently: 1) applies to the sensory information the filters and algorithms required to continuously maintain an estimation of the world model, 2) adapts the layout of its program during runtime to create a suitable data flow between decision, world model and behavior modules, and 3) provides its client software, in this case CPB, with a simple API allowing manipulation of a library of high-level preemptive behaviors. Fractal is an extension of Targets-Drives-Means [5], a robotic architecture characterized by its high usability [6]. Interfacing CPB and Fractal (see following figure) allows the ethical decision procedure to run at a frequency of the order of 10 Hz, ensuring smooth execution of robotic behavior as well as rapid runtime adaptation of the robot's ethical behavior upon a change in the situation.
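Purely as an illustration of such a fixed-rate decision cycle (written in generic Python, not against Fractal's actual API), the sketch below re-ranks the currently possible actions at roughly 10 Hz and switches behavior only when the most ethically preferable action changes.

import time

# Generic fixed-rate decision loop (not the Fractal API): every cycle the list
# of currently possible actions is re-ranked with a principle-based comparison
# (see the rank_actions sketch above) and the top action is dispatched.
CYCLE = 0.1  # seconds, i.e. a decision frequency of about 10 Hz

def decision_loop(get_possible_actions, rank_actions, execute, running):
    current = None
    while running():
        best = rank_actions(get_possible_actions())[0]
        if best != current:          # only switch behavior when the ranking changes
            execute(best)
            current = best
        time.sleep(CYCLE)

# Example usage with stub functions standing in for sensing and actuation:
if __name__ == "__main__":
    ticks = iter(range(20))          # run for about 2 seconds of simulated time
    decision_loop(
        get_possible_actions=lambda: ["seek_tasks", "remind_medication"],
        rank_actions=sorted,         # placeholder ranking; a real system uses the principle
        execute=lambda a: print("executing", a),
        running=lambda: next(ticks, None) is not None,
    )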

REFERENCES

1. Anderson, M. & Anderson, S. L., Machine Ethics:

Creating an Ethical Intelligent Agent, Artificial Intelligence Magazine, 28:4, Winter 2007.

2. Anderson, M. & Anderson, S. L., Robot Be Good, Scientific American 303.4 2010: 72-77.

3. Anderson, M. & Anderson, S. L., Toward Ensuring Ethical Behavior from Autonomous Systems: A Case-Supported Principle-Based Paradigm, Industrial Robot: An International Journal 2015 42:4, 324-331.

4. Anderson, Michael, and Susan Leigh Anderson. "GenEth: A General Ethical Dilemma Analyzer." Twenty-Eighth AAAI Conference on Artificial Intelligence. 2014.

5. Berenz, Vincent and Suzuki, Kenji, "Targets-Drives-Means: A declarative approach to dynamic behavior specification with higher usability", Robotics and Autonomous Systems 04/2014; 62(4)

6. Berenz, Vincent and Suzuki, Kenji, "Usability benchmarks of the Targets-Drives-Means robotic architecture", Humanoid Robots (Humanoids), 2012 12th IEEE-RAS International Conference On; 01/2012


Robot Futures: Using Theater to Influence Acceptance of Care Robots

Anja Christoffersen, Sally Grindsted Nielsen, Elizabeth Jochum and Zheng Hua Tan

Aalborg University, Denmark

ABSTRACT

Robots are increasingly used in health care settings, e.g., as home-care assistants and personal companions. One challenge for personal robots in the home is acceptance. We describe an innovative approach to influencing the acceptance of care robots using theatrical performance. Live performance is a useful testbed for developing and evaluating what makes robots expressive; it is also a useful platform for designing robot behaviors and dialogue that result in believable characters. Therefore theatre is a valuable testbed for studying human-robot interaction (HRI). We investigate how audiences perceive social robots interacting with humans in a future care scenario through a scripted performance. We discuss our methods and initial findings, and outline future work.

Keywords

Robot theatre, HRI, social robots, health care, assistance robot

1. INTRODUCTION

Robots increasingly appear in domestic situations and in health care institutions as therapeutic assistants and personal companions. Following earlier work on entertainment robots in live performance [4][5][6], we wrote, directed, and produced a one-act theatre play for studying human-robot interaction, with the aim of analyzing and influencing the audience's perception of social robots. It was important to dramatize a realistic, possible "future scenario" that was not in the realm of science fiction but rather reflected state-of-the-art clinical trials and user studies. Live performance requires situated, embodied robots to move autonomously or semi-autonomously alongside human actors and in coordination with human operators. However, unlike clinical settings or "in the wild" user studies, live performance offers a liminal, "in between" space for examining how best to design robotic companions that are engaging to users. In this way, we combine entertainment robots with assistive robots to investigate how possible future users perceive social robots interacting with humans.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Copyright is held by the owner/author(s).


2. "CORNELL"

Figure I. NAO and a human actor in a scene from the play "Cornell".

Cornell is a one-act play between a human and a robot, based on social psychologist Arthur Aron's research on the experimental generation of interpersonal closeness [1]. We wanted to take the setting of a traditional theatre play and combine it with technology in the form of a state-of-the-art humanoid robot (NAO 1) to see if it was possible to design an engaging robot character in a believable future scenario. The operating system of the NAO enables the robot to learn from behaviors and to detect and recognize emotions [8]. It is a system based on natural interaction and emotion. Using a combination of pre-programmed animations and Wizard of Oz (WOZ) puppeteering [7], we designed a believable robot character to interact with a live performer onstage. We want to measure both the audience's reactions to the robot as a believable agent and their reactions towards having social robots assist in daily tasks. Issues of particular importance are sociality, empathy and HRI.

"Versatile Humanoid Robots for Theatrical Performances" [5] indicates that what the audience cares most about are details such as eye contact, non-verbal behaviors, appearance, behavioral motions and sounds. In order to create a good performance with a robot as an actor, the above are of great importance. Our chief concern in producing Cornell was that the audience would find the robot performance boring and predictable and would not be able to perceive convincing emotions or empathy between the actor and the robot. When audiences see the NAO robot they do not immediately develop emotions towards it, as it does not resemble a human being but appears more like a toy. Therefore it was important to add elements such as non-verbal and verbal behaviors that could create the illusion of life. Since the NAO did not have facial expressions or audience interaction, we had to create the bond between our robot, the play and the audience in another way. We choreographed the NAO's responses to the actor so that it seemed to be "listening" when the actor talked; when it had to answer something, it would wait a while before answering ("as if" it was thinking). This gave the illusion that the robot was capable of having a fluid, natural conversation on a human level. The behaviors were pre-programmed and controlled during the play but not visible to the audience, and therefore we could maintain the illusion of a believable, autonomous agent.

1 http://www.aldebaran.com
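As an illustration of the timing trick described above (a short pause before a reply so the robot appears to 'think'), here is a minimal sketch using Aldebaran's Python NAOqi SDK; the robot's address, the delay and the spoken line are placeholders, and in the actual production such cues were combined with Wizard of Oz control rather than a fixed script.

import time
from naoqi import ALProxy  # Aldebaran's Python SDK must be installed

NAO_IP, NAO_PORT = "192.168.1.10", 9559   # placeholder address of the robot

def deliver_line(text, thinking_pause=1.2):
    """Wait briefly (as if thinking), then speak the line with slight animation."""
    tts = ALProxy("ALAnimatedSpeech", NAO_IP, NAO_PORT)
    time.sleep(thinking_pause)             # post-cue delay that suggests reflection
    tts.say(text)

if __name__ == "__main__":
    # A placeholder reply; in the play, lines and timing were cued by an operator (WOZ).
    deliver_line("That is a good question. Tell me more about your day.")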


The premise for the project comes from the world of theatre and performance, an aspect that has caught the interest of HRI researcher Guy Hoffman. Hoffman discusses certain timing effects, such as discrete post-action delays and anticipatory actions by the robot performer in relation to the human performer, and suggests that these cause the audience to experience the human-robot joint action and dialogue as more fluent and improvisational [3]. In his study of robots as performers, Hoffman suggests that theatre and musical performances are useful testbeds for HRI studies. Theatre makes such a good platform because it enables one to isolate certain elements of the human-robot interaction and emphasize them. In Cornell the emphasis is centered on the intimate interactions between the robot and the human.

HRI is critical to understanding how robots will interact with humans in the future. Initially robots were designed to perform a single specific task, e.g. in a factory producing cars, and the robot would not be utilized in any other way or at any point have to interact in social settings with humans. Care and assistive scenarios now require that robots are social, more intuitive, and interactive. For example, robots used in health care settings can enhance lifestyle and function as social companions.

The plot of Cornell takes place in the near future where robots appear more frequently in everyday life. The human character, Zoey, has been in an accident, which causes her short-term memory loss. She is provided a robotic helper to assist in her rehabilitation at home following the accident. The robot functions as a caretaker, but the human wants more than “just” a robot and looks to the robot to establish friendship and interpersonal closeness.

To measure the effect of the performance we administered questionnaires to the audiences. We measured five key concepts in HRI: anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety [2]. We also added some open questions where the audience could, for example, comment on how they perceived the appearance of the robot and what they thought about the idea of a robotic helper. The goal of the questionnaires was to evaluate the performance and to see if we had succeeded in creating a believable robotic agent. We also wanted to see if the audience could overlook the robot's "toy" appearance and whether the performance could influence the acceptance of robots in the home. With the questionnaires we hope to measure whether the audience perceived empathy between the human and the robot, and whether they developed empathy for the characters.

3. PRELIMINARY RESULTS

We plan to conduct a statistical analysis of the feedback from the audience questionnaires. An initial review of the data indicates that audiences perceived the robot as alive, responsive and lifelike. From this we hypothesize that it is possible to design empathic, believable robot characters that will increase the acceptance of care robots as personal companions. When we asked the audiences whether they found the robot intelligent, the majority agreed that the robot behaved intelligently and was perceived as agential.

Since we had used the idea of a robotic helper in a domestic situation, we were also interested to hear whether people could envision this happening in the future and whether they could imagine having a helper themselves. Here there were mixed opinions. The results have yet to be fully compiled, but some people found the thought very interesting, and the fact that it could handle difficult or even boring tasks was a plus. On the other hand, some also found it a bit terrifying to be so close to a robot that they would not have any control over. One audience member thought that having a robot helper instead of an actual human helper would result in the loss of intimacy between the patient/elderly person and the caretaker.

4. CONCLUSION

We saw that theatre, even though it is a staged scenario, provided a valuable testbed for investigating HRI and the relationship between human and robot, and for understanding how we as humans perceive robots. We gave people the chance to glimpse what the future could offer with personal robots, and worked towards increasing the acceptance of robots in care scenarios.

5. ACKNOWLEDGMENTS AND WEBSITE

We wish to thank Ibrahim Hameed, Xiaodong Duan, Nicolai Bæk Thomsen, Søren Holdt Jensen and Børge Lindberg for their help on the iSociobot project. The play was produced at Theatre Nordkraft in Aalborg, Denmark. http://www.teaternordkraft.dk, http://socialrobot.dk.

6. REFERENCES

[1] Aron, Arthur; Melinat, Edward; Aron, Elaine N.; Vallone, Robert Darrin; Bator, Renee J. (1997) "The experimental generation of interpersonal closeness: A procedure and some preliminary findings". [Online] Personality and Social Psychology Bulletin, Vol 23(4), Apr, pp. 363-377, The Society for Personality and Social Psychology, Inc. (http://dx.doi.org/10.1177/0146167297234003)

[2] Bartneck, C., Croft, E., & Kulic, D. (2009) "Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots." [Online] International Journal of Social Robotics, 1(1), pp. 71-81. (http://www.bartneck.de/publications/2009/measurementInstruments-Robots/)

[3] Hoffman, G. (2011). On Stage: Robots as Performers. Robotics: Science and Systems Workshop on Human-Robot Interaction.

[4] Jochum, E. S. (2014). Robotic Puppets and the Engineering of Autonomous Theater. In A. E. Laviers, Controls and Art: Inquiries at the Intersection of the Subjective and the Objective (pp. 107-128). Springer.

[5] Lin, C. Y., Cheng, L. C., Huang, C. C., Chuang, L. W., Teng, W. C., Kuo, C. H., Gu, H. Y., Chung, K. L., Fahn, C. S. (2012) "Versatile Humanoid Robots for Theatrical Performances". [Online] International Journal of Advanced Robotic Systems, pp. 1-13, INTECH (http://cdn.intechopen.com/pdfs/41814/InTech-Versatile_humanoid_robots_for_theatrical_performances.pdf)

[6] Lu, D. (2012). Ontology of Robot Theatre. International Conference on Robotics and Automation (ICRA).

[7] Lu, D. S. Polonius: A Wizard of Oz interface for HRI experiments. ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 197-198). Lausanne: IEEE.

[8] Aldebaran https://www.aldebaran.com/en/robotics-solutions/robot-software/development


Empathy, Compassion and Social Robots: an Approach from Buddhist Philosophy

Resheque Barua (a), Shimo Sramon (a) and Marcel Heerink (b)

(a) Mahachulalongkornrajavidyalaya University, Bangkok, Thailand
(b) Windesheim Flevoland University, Robotics research group, Almere, The Netherlands

Abstract. In Buddhism, a key aspect of interaction between humans and between humans and other social beings is empathy. In this paper this concept is defined and applied to different aspects of human-robot relations, as a first step towards a Buddhist approach to this field.

Keywords: Empathy, compassion, Buddhism, five skandhas, social robots

INTRODUCTION

In our present society, we experience what Wiener (1) called a second industrial revolution, which addresses not only mechanical developments but also intellectual developments, resulting in intelligent machines that are physically embodied. These we usually refer to as robots, and if they use any form of social interaction, we call them social robots. These mechanical systems can be experienced as social entities, and even more so if they are social robots. This raises the question whether it is possible to develop empathy in human-robot interaction, even if the robot would in fact not be much more than a computer with a physical embodiment. In a Buddhist society, it would raise questions on how to morally deal with empathy in human-robot relationships. In fact, the answers to these questions may impact the acceptance of social robots.

MEANING OF EMPATHY

Empathy means 'trying on someone else's shoes': putting oneself in the position of the other, to suffer as the other suffers (2). From a Buddhist point of view, we must develop our empathy with compassion and closeness to others and recognize the gravity of their misery. The closer we are to a person, the more unbearable we find that person's suffering. This closeness is not a physical proximity, nor does it need to be an emotional one. It is a feeling of responsibility, of concern for a person or another social being. In order to develop such closeness, we must reflect upon the virtues of cherishing the well-being of others. We must come to see how this brings one inner happiness and peace. We must come to recognize how others respect and like us as a result of such an attitude toward them (9).

The whole of Buddhist philosophy and practice is about liberation and nirvana, and this is the greatest act of empathy towards the world: empathy and compassion are - although also often embedded in Western philosophy - core concepts of Buddhist philosophy (3,4).

BUDDHIST PHILOSOPHY AND HUMAN-ROBOT RELATIONSHIPS

The practical benefits of robots that are productive or assistive are obvious. However, social robots can also be socially assistive to people by expressing or receiving empathy, as in dementia therapy, with hospitalized children and with children with autism (5,6,7).

Addressing this, we have to take into account that Buddhist philosophy is based on self-investigation of human minds rather than on scientific models, scans, and experimental research (8,9,10). It is as much a moral philosophy as a descriptive one, and it proposes unusual states of mind that have only begun to be explored in laboratories; there are convincing arguments both for and against the role of robots in our future world (11-13).

Empathy is a mental process that includes the ability not only to detect what others feel but also to experience that emotion oneself. To empathize with another person, the element of wisdom is not required; it is just a good quality, which can fluctuate because it is not stable, and it is conditional (8). In Buddhism, mental processes are analyzed in many ways, but most basically as the five skandhas (9): (1) the body and sense organs (rūpa), (2) sensation (vedanā), (3) perception (samjñā), (4) volition (samskāra) and (5) consciousness (vijñāna).

If we draw a parallel to a robot and require its mental processes to include these skandhas in order to truly speak of empathy, we see that the first depends on the exact definition. If it requires a biological system, it would require the robot to be just that. If we realize that presently many internal and external human body parts can be non-biological, the extent to which a biological nature is required could be open to reconsideration. Nevertheless, empathy is a response to suffering, which is inherently linked to a biological process, leading to an action of compassion in which consciousness is essential. For example, when an animal is physically abused by a person and people feel sad to see such cruelty happen, that feeling is empathy. If someone steps up and does something about it, it is in fact empathy with action, meaning the person has compassion.


Empathy and compassion can, however, also respond to mental suffering, which does not require a biological system. In that sense, only consciousness remains a requirement that is still a challenge.

ROBOTS AS MEDIATORS AND REPRESENTATIONS

Even if there are still too many obstacles to state that robots can truly be empathic, this does not mean that empathy cannot be perceived by a human interacting with them. We can instead view a robot as a medium that expresses the empathy developed by a human programmer or operator. If this robot is created or programmed out of empathy, its existence is an act of compassion, and if its actions are motivated by the empathy felt by the programmer or operator, these actions can also be taken as acts of compassion. There are actually no teachings that would object to this, even if the human that perceives this empathy is not conscious of the mediation. Moreover, it would not matter whether the empathy is perceived as such or not.

If a human feels empathy for a robot, as in robot-assisted dementia therapy, there can be objections stating that it is not a biological entity. However, it can be viewed as equal to a fictional character in a movie or a book that we feel empathy for, with the addition that empathy for a robot can be translated into acts of compassion. We can state that a robot, just like a fictional character, is a representation of life, which is sufficient to evoke empathy. And since Buddhism teaches us to focus on the development of empathy rather than on receiving and perceiving it (3), there is no objection to a robot being a non-biological or non-conscious entity, whether the human is conscious of this or not.

At this point, we can take into account that in Buddhist philosophy there are three important principles, called Anicca (impermanence), Dukkha (suffering), and Anatta (non-self) (3). The latter reinforces both mediation and representation, since the non-self can be realized through both.

CONCLUSIONS AND FINAL THOUGHTS

There are some issues concerning embodiment and consciousness that challenge a view of social robots as entities capable of empathy. However, if robots are viewed as mediators and representations, there are no objections to introducing them in a way that ensures social and therapeutic benefits. This is especially so if the human-robot interaction is set up from an empathic intention. This means that further explorations could focus on those aspects that might affect empathy and enable acts of compassion.

Our main conclusion at this point is, however, that a robot that is developed out of empathy not only enables acts of compassion; it is in fact an act of compassion.

REFERENCES

1. Wiener, N. (1948). Cybernetics. Paris: Hermann.

2. Gupta, B. (2011). An Introduction to Indian Philosophy: Perspectives on Reality, Knowledge and Freedom. Routledge.

3. Mahāthera, N. (1988). The Buddha and His Teachings. Kuala Lumpur, Malaysia: Buddhist Missionary Society.

4. Dell'Amore, C. (2009). Darwin the Buddhist? Empathy Writings Reveal Parallels. National Geographic News, February 16, 2009.

5. Alemi, M., A. Ghanbarzadeh, A. Meghdari and L. J. Moghadam (2015). "Clinical Application of a Humanoid Robot in Pediatric Cancer Interventions." International Journal of Social Robotics: 1-17.

6. Scassellati, B., H. Admoni and M. Mataric (2012). "Robots for use in autism research." Annual Review of Biomedical Engineering 14: 275-294.

7. Smits, C., S. Anisuzzaman, M. Loerts, M. Valenti-Soler and M. Heerink (2015). "Towards Practical Guidelines and Recommendations for Using Robotic Pets with Dementia Patients." Canadian International Journal of Social Science and Education 3 (May 2015): 656-670.

8. Dudjom Rinpoche (1991). The Nyingma School of Tibetan Buddhism: Its Fundamentals & History. Somerville, MA: Wisdom Publications. Retrieved from www.eparg.org, January 16, 2014.

9. Hayward, J. (1998). "A rDzogs-chen Buddhist interpretation of the sense of self." Journal of Consciousness Studies 5(5-6): 611-626.

10. Damasio, A. (1995). Descartes' Error: Emotion, Reason, and the Human Brain. New York: Harper Perennial.

11. Schodt, F. L. (1988). Inside the Robot Kingdom: Japan, Mechatronics, and the Coming Robotopia. Tokyo, New York: Kodansha.

12. Kahn, P. H., Jr., Freier, N. G., Friedman, B., Severson, R. L., & Feldman, E. (2004). "Social and moral relationships with robotic others?" Proceedings of the 13th International Workshop on Robot and Human Interactive Communication. Piscataway, NJ: IEEE.

13. Grau, C. (2006). "There is no 'I' in 'robot': Robots and utilitarianism." IEEE Intelligent Systems.


Hygiene and the use of robotic animals: an exploration

Tecla S. Scholten, Charlotte Vissenberg and Marcel Heerink

Windesheim Flevoland University for Applied Sciences, Almere, The Netherlands

Abstract. The aim of this study is to synthesize the existing literature on hygiene and robotic animals to provide researchers and professionals that use robotic animals with tools and guidelines regarding the hygienic application of this technology in a hospital environment.

Keywords: Robotic animal, hygiene, review, reduced resistance, pathogenic microorganisms

BACKGROUND

With technology developing at an increasing rate, the use of robots in health care is becoming more and more widespread [1, 2]. This also includes the use of animal-shaped social robots that are increasingly used in therapy or as companions [3, 4], which has been studied before in multiple populations and seems effective in diverse settings, such as a tool for the social development of autistic children, in social interactions with preschool children, and as a companion in elderly care [5-7].

Recent studies aim to study the effects of robotic animals on hospitalized children [8]. The application of robotic animals in more diverse settings, including populations with a reduced resistance towards pathogenic microorganisms, raises questions about the hygiene of robotic animals.

Most of these animals are covered with fur or other forms of realistic skin. Little is known about ways to effectively handle and clean robotic animals to bring them into concordance with existing hygiene standards in hospital settings. However, studies have shown that toys can be contaminated with (pathogenic) micro-organisms and may therefore pose a potential risk of infection [9-15]. It seems likely that this is also the case with robotic animals.

Therefore we aim to synthesize the literature on hygiene and robotic animals to provide guidelines regarding their hygienic application.

METHOD

We conducted a literature review for publications regarding hygienic measures when using robotic animals with hospitalized children. Databases included: Academic Search Elite, Cinahl, Pubmed, Science Direct, Google Scholar and SpringerLink. The following search terms and combinations of terms were used: 'hygiene', 'infection prevention', 'cross infection', 'disinfection', 'decontamination', 'hospital', 'children', 'pediatric', 'oncology', 'healthcare', 'daycare', 'social robot', 'robot animal', 'robotic pet', 'Pleo', 'toys' and 'user manual'. Through the snowball method, the references of relevant studies were also checked.

Unfortunately, publications regarding hygiene and robotic animals do not exist yet. Therefore we expanded our search to also include toys in general, other settings such as other healthcare facilities (day care centers, geriatric departments, general practitioners' waiting rooms) and other types of patients (premature infants, elderly, healthy children).

The included studies were analyzed according to a framework that encompassed the following themes: 1) cleaning procedure, 2) cleaning frequency, and 3) sharing.

RESULTS

We included 17 national and international publications: nine research reports [9-17], five hygiene guidelines [18-22] and three manuals of robotic animals [23-25].

1) Cleaning procedure

Robotic animals should be cleaned using a brush or a damp towel. Due to the technological devices inside these animals, they cannot be cleaned with cleansers or be exposed to excessive water or other liquids [23-25].

Toys in general should be cleaned with all-purpose cleaner and/or a cloth with (a solution of) disinfectant [18-20]. The recommendations on how to clean toys are further divided between hard toys (e.g. plastics) and soft toys (e.g. stuffed animals). Hard toys must be cleaned with water and soap and then be immersed in a disinfectant (bleach, hypochlorite or other disinfectants). After that they must be rinsed with water and air-dried [9, 10, 12, 20, 21]. If possible, hard toys should be washed in the dishwasher [20, 21]. For soft toys, washing in the washing machine is suggested [9, 13, 14, 20], but opinions about the temperature vary: 48 °C [9], 60 °C [13] or 80 °C [14] are suggested.

2) Cleaning frequency

Recommendations regarding the cleaning frequency vary between monthly [19, 22], weekly [13, 14, 16, 21], regularly [10], daily [17], and, under certain circumstances (e.g. an infectious outbreak or when contaminated with saliva, faeces or vomit), daily [16, 18] or directly after use [15, 18, 19, 21].

3) Playing and sharing

Regarding sharing toys, it is generally advised to provide each patient with his or her own toy [10, 14, 15]. Especially when patients have an infection that requires preventive measures or are treated in isolation, toys should not be exchanged [14, 16, 22].
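As a compact illustration of how these three themes could be operationalized by practitioners, the sketch below (in Python) encodes the reported recommendations per item type; the structure and naming are our own illustrative choices, not a validated protocol from the cited sources.

from dataclasses import dataclass

@dataclass
class HygieneGuideline:
    item_type: str     # "robotic animal", "hard toy" or "soft toy"
    procedure: str     # recommended cleaning procedure
    frequency: str     # range of recommended cleaning frequencies
    sharing: str       # advice on sharing between patients

# Summary of the findings above; entries paraphrase the reviewed sources.
GUIDELINES = [
    HygieneGuideline(
        "robotic animal",
        "brush or damp towel; no cleansers, no excessive water or other liquids",
        "not specified in the manuals",
        "preferably one device per patient; no exchange under infection precautions",
    ),
    HygieneGuideline(
        "hard toy",
        "wash with water and soap, immerse in disinfectant, rinse, air-dry; "
        "dishwasher if possible",
        "monthly to daily, or directly after use when contaminated",
        "one toy per patient where possible",
    ),
    HygieneGuideline(
        "soft toy",
        "machine wash (reported temperatures range from 48 to 80 degrees C)",
        "weekly, or daily after contamination or during an outbreak",
        "no exchange between patients in isolation",
    ),
]

if __name__ == "__main__":
    for g in GUIDELINES:
        print(f"{g.item_type}: {g.procedure} | {g.frequency} | {g.sharing}")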

CONCLUSION

Regarding the cleaning procedure and the cleaning frequency of toys, there are no definite answers to be


drawn from the literature. With regard to sharing toys, the literature advises to provide each patient with his or her own toy and to limit the extent of sharing toys. Comparing the robotic animal manuals with the advice from the literature regarding cleaning raises the question to what extent these can be integrated: the advised cleaning procedures all include extensive use of water and detergents, which robotic animals cannot handle.

DISCUSSION

Due to a lack of suitable studies, we included few on-topic publications. Therefore it was impossible to take the differences in hygiene regulations per country into account. Furthermore, the research reports vary greatly in size and comparability, which makes it hard to draw definite conclusions and limits the generalizability of this study.

Prevention of infections by robotic animals among

patients is a new study domain. To prevent robotic

animals from becoming dangerous friends instead of

new friends it is necessary to gain more knowledge

about this subject. Research should be conducted

regarding the risks of infections by robotic animals

and the preventive measures that should be taken

accordingly before these animals are used in settings

with patients that are vulnerable or have diminished

resistance.

References

1. Salter T, Werry I, Michaud F. Going into the wild in child–robot interaction studies: issues in social robotic development. Intelligent Service Robotics 2008;1(2):93-108.

2. Broekens J, Heerink M, Rosendal H. Assistive social robots in elderly care: a review. Gerontechnology 2009;8(2):94-103.

3. Enjoyment, intention to use and actual use of a conversational robot by elderly people. Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction; 2008. ACM.

4. Heerink M, Kröse B, Evers V, et al. Assessing acceptance of assistive social agent technology by older adults: the Almere model. International Journal of Social Robotics 2010;2(4):361-75.

5. Robotic animals might aid in the social development of children with autism. 2008. IEEE.

6. Robotic pets in the lives of preschool children. CHI '04 Extended Abstracts on Human Factors in Computing Systems; 2004. ACM.

7. Tamura T, Yonemitsu S, Itoh A, et al. Is an entertainment robot useful in the care of elderly people with severe dementia? The Journals of Gerontology Series A: Biological Sciences and Medical Sciences 2004;59(1):M83-M85.

8. Design of a therapeutic robotic companion for relational, affective touch. Robot and Human Interactive Communication (RO-MAN 2005), IEEE International Workshop on; 2005. IEEE.

9. Ibfelt T, Engelund E, Schultz AC, et al. Effect of cleaning and disinfection of toys on infectious diseases and micro-organisms in daycare nurseries. Journal of Hospital Infection 2015;89(2):109-15.

10. Avila-Aguero MaL, German G, Paris MaM, et al. Toys in a pediatric hospital: Are they a bacterial source? American Journal of Infection Control 2004;32(5):287-90.

11. Kamhuka LN, Rees G. Successful Control of a Vancomycin Resistant Enterococci (VRE) Outbreak on a Pediatric Ward – Do Not Forget the Toys. 2013.

12. Merriman E, Corwin P, Ikram R. Toys are a potential source of cross-infection in general practitioners' waiting rooms. British Journal of General Practice 2002;52(475):138-40.

13. Naesens R, Jeurissen A, Vandeputte C, et al. Washing toys in a neonatal intensive care unit decreases bacterial load of potential pathogens. Journal of Hospital Infection 2009;71(2):197-98.

14. Subramanian B, Parsons H, Finner P, et al. Empathy dolls: are they a source of cross-contamination between patients? Journal of Hospital Infection 2014;87(1):50-53.

15. Fleming K, Randle J. Toys – friend or foe? A study of infection risk in a paediatric intensive care unit. Paediatric Nursing 2006;18(4):14-18.

16. Rogers M, Weinstock DM, Eagan J, et al. Rotavirus outbreak on a pediatric oncology floor: possible association with toys. American Journal of Infection Control 2000;28(5):378-80.

17. Hanrahan KS, Lofgren M. Evidence-based practice: examining the risk of toys in the microenvironment of infants in the neonatal intensive care unit. Advances in Neonatal Care 2004;4(4):184-201.

18. Pelgrim M, Waegemaekers C, Wouters W, et al. Een uitbraak van gastro-enteritis in een kinderdagverblijf: de belangrijkste evidence-based hygiëneadviezen op een rij. Infectieziekten Bulletin:229.

19. Rijksinstituut voor Volksgezondheid en Milieu. Hygiënerichtlijnen voor Medische Kinderdagverblijven en Boddaertcentra: Semi-residentieel. Amsterdam: Landelijk Centrum Hygiëne en Veiligheid, 2009.

20. Committee SRIPaCE. Toys and Toy Cleaning, 2009.

21. Centers for Disease Control and Prevention. Guidelines for preventing opportunistic infections among hematopoietic stem cell transplant recipients. MMWR Recommendations and Reports 2000;49(RR-10):1.

22. Werkgroep Infectie Preventie. Veilig werken in de kindergeneeskunde. Leiden: Leids Universitair Medisch Centrum, 2004.

23. Focal Meditech. Sociale robot PARO gebruikershandleiding. Tilburg.

24. Hasbro. FurReal Friends Instruction Manual. 2003.

25. Innvo Labs. Pleo rb handleiding, 2010.


Learning Social Skills through LEGO-based Social Robots for Children with Autism Spectrum Disorder at CASPAN Center in Panama

Emelideth Valenzuela (a), Alex Barco (b) and Jordi Albo-Canals (b)

(a) CASPAN Center of Panama; (b) La Salle, Ramon Llull University

Abstract— This paper presents a project that seeks to use robotics as a facilitator to create an appropriate context for training social skills in children with special needs, especially children with Autism Spectrum Disorder (ASD), and to use it to include them in everyday-life activities. Preliminary results based on semi-structured observation and psychometric measures show how robotics could be a useful tool for that.

Keywords— Learning, ASD, Autism, Social Robots, LEGO, Social Skills, Education

1. INTRODUCTION

There are many different interventions around the world in robotics to improve the social skills of children with autism. Projects like AURORA, IROMEC, etc. [1] show promising results with robotics technology for improving the symptoms of children with ASD. Companies like Aldebaran have a full-time psychologist on staff to help researchers and schools use their NAO robot (Aldebaran's human-like robot) as a method of supporting social skills learning [2].

Robotics is easily accepted by children with ASD, as it is predictable and repetitive, fitting well with their psychology and learning style. Robots, as tools, can contribute to collaborative classroom work by helping to adapt the level of the intervention session to students' performance [3].

Designing activities with different robotic platforms is now possible due to the recent development of low-cost controllers and easy-to-use software. From LEGO Mindstorms to Arduinos, robotics has entered almost every school, either as a course or as an after-school sports event at popular competitions (FIRST, WRO, RoboCup Jr, Botball, and so on).

Robot-based activities provide enhanced education environments for enjoyable play, exploration and discovery, collaborative and cooperative activities, social interaction (i.e. joint attention, sharing material, negotiating plans) and observation of learning challenges. Because getting a robot to function correctly involves so many different skills (from programming to ergonomics), robotics is inherently a team-based activity, providing a motivation to learn social skills for children who otherwise may not see their own needs [4]. Thus, we can help them to interact with people in real scenarios. Besides, we can use robotics for educational purposes and teach them different topics, such as colors, numbers, etc.

In the following lines the objectives of the study are presented. We will then explain the methodologies and resources used, and finally we present preliminary results about the interaction between children and robot during the sessions.

2. OBJECTIVES

As already mentioned, the main objective of this project was to improve the social skills of children with ASD by using robotic technology. Improved social skills offer a better chance of social inclusion and of developing the other skills needed for personal and professional development. The detailed objectives of the study are:

• To include children with ASD in everyday life. Although there is no treatment that eliminates the deficiencies of communication, socialization and behavior, many researchers have shown that there are strategies and techniques for teaching communication and effective responses in various social situations, and those skills could improve the success rate of adaptation of an individual in society [5], [6], [7], [8].

• To prepare for an increased worldwide number of children with ASD. In recent years, there has been a gradual increase in the prevalence of ASD. The Centers for Disease Control and Prevention (CDC, 2012) estimated a rate of 1 in 88 people with ASD in the United States. In 2014, the estimate was already 1 in 68.

• The need to understand how children with ASD solve engineering problems. This will help to identify the unique strengths that they could tap for leadership with neuro-typical peers in the fields of robotics and technology in general.

3. METHODOLOGY AND RESOURCES

In this project, we intended to create therapeutic activities aligned with the daily program of the CASPAN center (Center Ann Sullivan Panama) to train social skills and problem solving. These activities are based on previous work done in [9] and [10]. The driver and facilitator for this purpose will be the LEGO EV3 robotics platform along with the therapist. Each intervention will take place once a week for 1 hour. For instance, one activity was how to deal with a pet, so we built and programmed a pet robot (see Figure 1) with which the therapist could teach the right manner to interact with the dog robot.


Fig. 1. A robot dog used during testing sessions.
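The paper does not describe the program running on the dog robot; purely as an illustrative sketch, the snippet below shows how such a simple pet behavior could be scripted for the EV3, assuming the ev3dev2 Python bindings and a hypothetical build with a touch sensor on the dog's back (input 1) and a tail motor (output A).

#!/usr/bin/env python3
# Illustrative sketch only; the actual program used in the study is not documented.
from ev3dev2.motor import MediumMotor, OUTPUT_A
from ev3dev2.sensor import INPUT_1
from ev3dev2.sensor.lego import TouchSensor
from ev3dev2.sound import Sound

tail = MediumMotor(OUTPUT_A)   # assumed tail motor
touch = TouchSensor(INPUT_1)   # assumed touch sensor on the dog's back
sound = Sound()

# When the child pets the dog (presses the back sensor), the dog "barks"
# and wags its tail, giving the therapist a cue to model gentle handling.
while True:
    touch.wait_for_pressed()
    sound.beep()
    tail.on_for_rotations(speed=30, rotations=0.25)
    tail.on_for_rotations(speed=-30, rotations=0.25)
    touch.wait_for_released()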

We observed, measured and quantified, through semi-structured observation and psychometric measures, how the use of robots fostered social interaction in children with ASD during the group sessions. Ten children (aged 10-16 years) were in each group.

Topics of work for the intervention group were:

• Identify feelings in oneself and others, and express them adequately.
• Comment and participate in conversations and interaction situations between equals.
• Make use of non-verbal elements of communication.
• Use verbal and nonverbal communication properly to give directions and to ask teammates for things or information.
• Share materials and responsibilities; learn to communicate, to cooperate, to be supportive, and to respect the rules of the group.
• Provide social reinforcement to others through positive feedback.
• Practice communicating personal desires or needs with peers with courtesy and kindness (assertiveness, rather than aggression or passivity).
• Promote group cohesion among group participants.
• Plan the timing of tasks and subtasks and the capacity needed for the challenges.

4. PRELIMINARY TESTS

Observational results showed how children engaged with the robotics activities, focusing their attention on the robot. During the sessions, therapists did not have to encourage interaction with the robot due to the willingness of the children to play with it. Children showed interest in the different activities proposed by the therapist, pointing at and touching the robot, clapping their hands and yelling at it. Some children also shared the robot during the sessions or even communicated with each other, laughing or smiling. These satisfactory behaviors suggest that introducing robots during the daily sessions can help the children interact better and so improve their social skills.

In addition, during the robot sessions we noticed that the level of noise in the room was lower in comparison with the daily activities where no intervention was done. This could suggest that the attraction of the robot can help make the sessions less noisy and so less stressful for the children and even the therapists.

5. FUTURE LINES

We plan to do more sessions with more children in order to validate our preliminary tests. In addition, we will program more activities, and we will introduce new robots such as the AISOY, mini-Darwin, NAO and Pleo rb. We are also willing to introduce our idea of cloud connectivity, which makes it possible to combine human intervention with an artificial intelligent multi-agent system to bias the robot companion's behavior in order to foster better engagement.

ACKNOWLEDGMENT

The authors thank the LEGO Foundation, who donated the EV3 LEGO robots, and all the people from the CASPAN center, as well as the children and their families involved in this project. We also thank the students from Universidad Tecnologica de Panama (UTP) for their support.

REFERENCES

[1] http://www.autistec.com
[2] Robins, Ben, Kerstin Dautenhahn, R. Te Boekhorst, and Aude Billard. "Robotic assistants in therapy and education of children with autism: can a small humanoid robot help encourage social interaction skills?" Universal Access in the Information Society 4, no. 2 (2005): 105-120.
[3] Werry, Iain, Kerstin Dautenhahn, and William Harwin. "Investigating a robot as a therapy partner for children with autism." Procs AAATE 2001 (2001).
[4] Ferrari, Ester, Ben Robins, and Kerstin Dautenhahn. "Therapeutic and educational objectives in robot assisted play for children with autism." In Robot and Human Interactive Communication, 2009 (RO-MAN 2009), The 18th IEEE International Symposium on, pp. 108-114. IEEE, 2009.
[5] Rao, Patricia A., Deborah C. Beidel, and Michael J. Murray. "Social skills interventions for children with Asperger's syndrome or high-functioning autism: A review and recommendations." Journal of Autism and Developmental Disorders 38, no. 2 (2008): 353-361.
[6] Laushey, Kelle M., and L. Juane Heflin. "Enhancing social skills of kindergarten children with autism through the training of multiple peers as tutors." Journal of Autism and Developmental Disorders 30, no. 3 (2000): 183-193.
[7] Tse, Jeanie, Jack Strulovitch, Vicki Tagalakis, Linyan Meng, and Eric Fombonne. "Social skills training for adolescents with Asperger syndrome and high-functioning autism." Journal of Autism and Developmental Disorders 37, no. 10 (2007): 1960-1968.
[8] White, Susan W., Anne Marie Albano, Cynthia R. Johnson, Connie Kasari, Thomas Ollendick, Ami Klin, Donald Oswald, and Lawrence Scahill. "Development of a cognitive-behavioral intervention program to treat anxiety and social deficits in teens with high-functioning autism." Clinical Child and Family Psychology Review 13, no. 1 (2010): 77-90.
[9] Diaz, Marta, Alex Barco, Judit Casacuberta, Jordi Albo-Canals, Cecilio Angulo, and Carles Garriga. "Robot Assisted Play with a Mobile Robot in a Training Group of Children with Autism." In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, Portugal, 2012.
[10] Albo-Canals, Jordi, Marcel Heerink, Marta Diaz, Vanesa Padillo, Marta Maristany, Alex Barco, Cecilio Angulo et al. "Comparing two LEGO Robotics-based interventions for social skills training with children with ASD." In RO-MAN, 2013 IEEE, pp. 638-643. IEEE, 2013.


Workshop papers


Principles Involved in Care Robotics Legal Compliance

E. Fosch Villaronga (a)

(a) Joint International Doctoral (Ph.D.) Degree in Law, Science and Technology coordinated by CIRSFID, Università di Bologna, Italy. IDT-UAB, Universitat Autònoma de Barcelona, Spain (e-mail: [email protected]).

Abstract. Roboticists find legal compliance labyrinthine because existing laws are scattered throughout the legal system(s) and do not specifically refer to robotics. At the same time, jurists do not know exactly what the growth of robotics implies. In order to overcome these limitations, the general principles involved in robotics have recently been addressed in the legal literature. This paper aims to increase the current knowledge and tries to define the concrete principles involved in Personal Care (PCR), Therapeutic (TR) and Companion Robots (CR). Factors related to such concrete principles (such as the attributes of the robot, the technology applied to the robot and the context) are also highlighted.

Keywords: Fundamental Rights, Personal Care Robots, Modular Regulation, Regulate As You Go, Therapeutic Robots, Companion Robots, Robotics, Principles.

INTRODUCTION

According to Lessig, there are four main constraints that normally regulate a thing: the Law, Social Norms, the Market and its own Architecture [1]. Of all four, Personal Care (PCR), Therapeutic (TR) and Companion Robots (CR) lack specific legal regulation. Although great efforts in this direction have been made [2], there is no concrete, binding rule addressing which fundamental rights these robots individually violate [3], whether they should be granted agenthood [4] or what happens if they cause harm [5]. In fact, generic rules regarding robots [6], although of great help for policymaking, do not give a definite and concrete response to those roboticists trying to build a robot. The problem lies in the fact that, while we are still in a 'brainstorming phase' [9], some of this technology is already entering the market [7] or will enter it very soon [8]. This could lead roboticists into unknown legal risk scenarios. Therefore, the identification of concrete principles for PCR, TR and CR is indispensable. Which factors increase legal complexity?

PRINCIPLE CONCRETIZATION IN PCR, TR AND CR

A roboticist building a particular technology may encounter a two-fold problem: first, the identification of the principles involved in his/her technology; and second, the understanding of their meaning [10]: does an encrypted tunnel between the robot and the server protect data? Would a black, woman-like robot keep its creator from violating race, gender – or even sexual orientation – discrimination rules?

Concerning the first problem, the RoboLaw project identifies five common legal themes in the field of robotics: health, safety, consumer and environmental regulation; liability; intellectual property rights; privacy and data protection; and capacity to perform legal transactions [6]. In order to identify concrete principles, Carnevale establishes nine critical ethical issues related to PCR: safety, responsibility, autonomy, independence, enablement, privacy, social connectedness, new technologies and justice, and ethics and scientific research. Di Carlo and Nocco highlight the importance of respecting fundamental rights (e.g. independence and autonomy in the light of independent living, participating in community life, equality and access), liability and insurance, privacy, and the legal capacity and legal acts of personal care robots. However, PCR sub-types (person carrier, PCaR; physical assistant, PAR; and mobile servant robots, MSR), TR and CR are not all involved in these principles. In fact, by analyzing the meaning of these principles in more detail (thus addressing the second problem mentioned above), each of these robots complies with these principles gradually, on a scale determined by the level of complexity of the human-robot interaction (HRI) (Figure 1):

Fig. 1 HRI and its impact on the legal/ethical layer. Personal Care Sub-types, Therapeutic and Companion Robots.

The relation between robots and humans is slightly different in each of the cases described in the figure above: that is why we need concrete frameworks and not only a common regulation for all robots. Indeed, the more cognitive the HRI is, the more complex the associated legal issues are; e.g. a companion robot that controls a patient's medication doses [8] conceals more problems than an intelligent wheelchair that climbs stairs [12]. That is why TR and


CR will be involved more in dignity, freedom and self-determination scenarios, as compared with PCaR, which will only be involved in safety, consumer protection and liability scenarios:

- PCaR will normally be involved in: safety, user protection and general liability.
- PAR: PCaR + specific safety, prospective liability (if in a rehab context [11]), autonomy and independence, enabling capabilities, acceptance.
- MSR: PAR + user rights (data protection), proxemics, dignity (final say on robot care).
- TR: MSR + social connectedness (principle of non-isolation), no replacement of human caregivers, persuasion.
- CR: TR + principle of autonomous ethical agent's minimization, limitation to open scenarios with non-mission tasks.
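Purely as an illustration of this cumulative classification (the class names and principle labels below are shorthand for the items listed above, not an official taxonomy), the composition of "principle modules" can be sketched as follows in Python:

# Each robot class inherits the principles of the previous class and adds its own.
MODULES = {
    "PCaR": ["safety", "user protection", "general liability"],
    "PAR":  ["specific safety", "prospective liability (rehab context)",
             "autonomy and independence", "enabling capabilities", "acceptance"],
    "MSR":  ["user rights (data protection)", "proxemics",
             "dignity (final say on robot care)"],
    "TR":   ["social connectedness (non-isolation)",
             "no replacement of human caregivers", "persuasion"],
    "CR":   ["minimization of the autonomous ethical agent",
             "limitation to open scenarios with non-mission tasks"],
}

# Cumulative order: PCaR < PAR < MSR < TR < CR (increasing HRI complexity).
ORDER = ["PCaR", "PAR", "MSR", "TR", "CR"]

def principles_for(robot_class):
    """Return the full, cumulative list of principles for a given robot class."""
    idx = ORDER.index(robot_class)
    principles = []
    for cls in ORDER[:idx + 1]:
        principles.extend(MODULES[cls])
    return principles

if __name__ == "__main__":
    print(principles_for("TR"))   # everything from PCaR up to and including TR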

OTHER FACTORS

The interaction between the user and the robot is nevertheless not the only variable that increases legal complexity. In reality, other discrete but interlaced factors play a major role in determining the level of complexity in the legal layer: (1) the attributes of the robot; (2) the technology applied to the robot; and (3) the context where the robot is inserted:

1) The attributes of a robot refer to its hardware and software, and normally to the robot's functions: not only what it is capable of doing, but also what expectations users have of it [13].

2) The technology applied to the robot directly affects the legal complexity associated with the robot: the more sensors, cameras, microphones, etc., the more the robot can monitor and track sensitive data in all stages of its interaction with the user. That is why an intelligent wheelchair could imply more complex scenarios if it incorporates cameras that could record video and audio information of the user in private situations. A robot should also be compliant gradually with the number and quality of components it incorporates.

3) Regarding the context, exoskeletons have lately been used for rehabilitation purposes. Although this use of the robot does not per se increase the HRI, it does increase its level of complexity in the legal layer (see Fig. 1). Indeed, a rehabilitation exoskeleton could involve prospective liability and isolation scenarios.

CONCLUSIONS

The identification of general principles concerning robotics represents a great effort towards something not yet addressed by European policy makers. Even so, roboticists need to know the concrete principles underlying their particular technology. These can be classified according to the HRI; however, only by taking into account other variables, such as the robot's attributes, the technology applied to it and the context where it will be inserted, will it be possible to know precisely which principles have to be considered in a particular case. Thus a Modular Regulation based on the concept "Regulate-As-You-Go" is needed. This could make robotics compliance more flexible. Indeed, a robot should be compliant for what it is: with some general modules (shared among all robots, like safety, user protection and liability) and some specific modules (depending on the specific attributes of the robot, the technology applied to it, and the context where it will be inserted). This could avoid current over-/under-regulated scenarios.

REFERENCES

1. L. Lessig, Code version 2.0. Basic Books, NY (2006), p. 121.
2. See the European-funded project www.robolaw.eu or the Asian version of the initiative, www.robolaw.asia.
3. B-J. Koops et al., "Robotic Technologies and Fundamental Rights. Robotics Challenging the European Constitutional Framework", International Journal of Technoethics, 4(2), 2013, pp. 15-35.
4. S. Shen, "The curious case of human-robot morality", Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction, 2011, pp. 249-250.
5. E. Stradella et al., "Robot Companions as a Case-Scenario for Assessing 'Subjectivity' of Autonomous Agents. Some Philosophical and Legal Remarks", RDA2 ECAI Biennial European Conference in AI, 2012.
6. D6.2. Guidelines on Regulating Robotics. EU RoboLaw Project, 2014, more specifically pp. 167-196, also p. 18.
7. Automated Guided Vehicles (AGV) for hospitals and the healthcare industry are already on the market. See: www.jbtc-agv.com/en/Solutions/Industries/Hospital.
8. Buddy, for instance, is a companion robot that is ready to enter the market: http://www.bluefrogrobotics.com/#.
9. P. Salvini, "On Ethical, Legal and Social Issues of Care Robots", in S. Mohammed et al. (eds.) Intelligent Assistive Robots, 2015, pp. 431-445.
10. "An organization must take reasonable steps to protect the personal information […]" is an example of lack of concreteness. See Principle 4, Schedule 1, Information Privacy Principles, Privacy and Data Protection Act Victoria, No 60, 2014.
11. E. Datteri, "Predicting the Long-Term Effects of Human-Robot Interaction: A Reflection on Responsibility in Medical Robotics", Science and Engineering Ethics 19, 2013, pp. 139-160.
12. J. Wang et al., "Active Tension Optimal Control for WT Wheelchair Robot by Using a Novel Control Law for Holonomic or Nonholonomic Systems", Science China Information Sciences/Springer, 57, 2014.
13. Special Eurobarometer 427, Autonomous Systems Report, 2015.


Intelligent Assistive Technologies for Dementia: Social, Legal and Ethical Implications

Marcello Ienca, M.Sc. M.A. (a) and Fabrice Jotterand, Ph.D. M.A. (a, b)

(a) Institute for Biomedical Ethics, University of Basel (Switzerland); (b) Rueckert-Hartman College for Health Professions

Abstract: The increasing number of older adults being diagnosed and living with dementia poses a major challenge for global health. The integration of Artificial Intelligence into the design of assistive technologies for dementia has a great potential for improving the life of patients and alleviating the burden on caregivers and healthcare services. However, ethical, legal and social implications should be considered early in the development of intelligent assistive technology to prevent slow social uptake, incorrect implementation and inappropriate use.

Keywords: Dementia, Alzheimer's disease, caregiving burden, intelligent assistive technology, information gap, ethics

THE GLOBAL BURDEN OF DEMENTIA AND AGEING

By 2050, it is projected that there will be 115 million people with dementia worldwide: 1 in 85 [1]. The increasing incidence of dementia poses a major problem for public health and healthcare services in terms of financial management and caregiving burden. Alzheimer's disease (AD), the most common form of dementia, is among the most expensive diseases for human societies, with a total estimated worldwide cost of US$818 billion [2]. Such significant costs arise primarily from long-term care at nursing homes and other institutions, whose burden affects not only public finances but also the elders, their informal caregivers (e.g. relatives) and the healthcare system. The disabling conditions of dementia patients dramatically undermine their capability to live independently at home, interact with society and perform activities of daily living (ADLs). The provision of caregiving services frequently comes at high socioeconomic costs for caregivers [3]. From the perspective of the patient, the burden of dementia and age-dependent cognitive disorders results in a dramatically reduced quality of life (QoL).

INTELLIGENT TECHNOLOGY FOR AN AGEING WORLD: PROMISES AND CHALLENGES

Given the current limited possibilities for pharmacological treatment, a promising approach in response to this emerging global crisis is the development and deployment of Intelligent Assistive Technologies (IATs) that compensate for the specific physical and cognitive deficits of seniors with dementia and thereby also reduce the caregiver burden related to long-term care and institutionalization [4]. In fact, technologies that can help dementia patients to continue living independently at home or maintain independence in skilled facilities would provide a triple-win effect [5]. These technologies could aid in: (I) saving significant costs to the healthcare system by delaying or obviating the need for institutional long-term care, (II) reducing the burden on informal caregivers, and (III) improving the quality of life of patients by improving their autonomy and social interaction and helping fulfil their wish to age in place. While IATs open up the prospect of improving the quality of life of the elderly and reducing the financial, logistical and professional burden on the healthcare system, their distribution and uptake are still very low [4]. The reason for that stems from a multi-level gap at the cross-section of technology and healthcare [6, 7]. This gap does not arise exclusively from the current strategies for the implementation of ATs into neurological and geriatric care but concerns three inherent dimensions of the relationship between technological products and target users: the societal, the legal and the ethical dimension.

THE SOCIETAL DIMENSION AND THE INFORMATION GAP

At the societal level, the low distribution and uptake of IATs is generally ascribed to an information gap at the cross-section of technological development and healthcare [6]. At present, little information is available to technology designers and developers regarding the specific needs, wishes, and expectations of their target population [8]. The reason for that is twofold. First, social science research on the use of IATs among older users is at a germinal stage of development, and current knowledge of users' needs, views and attitudes is far from being extensive, generalizable and theoretically systematic. Second, research on dementia patients is time-consuming and requires extremely high standards of ethical rigor. According to Kramer (2014), this information gap is a major cause of the lower-than-expected acceptance of IATs among the senior population as well as of the current position of IATs in the Innovation Adoption Lifecycle. One further consequence of the information gap is the differential success of producer-centered models of technology development for intelligent assistive devices. With direct information from target users being hard to obtain, prototypes are often developed in the absence of systematic knowledge about users' needs. This risks generating a vicious circle, since unmet user expectations are a major indicator of low societal uptake and use. Following Niemeijer et al. (2010) and Robinson et al. (2009), we call for a rapid transition to a human-centered approach as well as a user-centered model of technology design and development [9, 10]. This will require extensive research on the views, needs and


attitudes of target users and their proactive involvement in the design and development process. A similarly participatory model should be implemented at the stage of technology assessment and evaluation.

THE LEGAL DIMENSION: PRIVACY, RESPONSIBILITY, CULPABILITY

At the legal level, the major challenge faced by IATs for dementia regards the protection of data and the security of information available to the devices. IATs are capable of extracting, measuring, storing and decoding potentially sensitive information about their users. For example, GPS and RFID devices for tracking dementia patients during wandering can access and manipulate information about the user's location. Similarly, biosensors and wearables can access biological information (e.g. blood pressure or heart rate) that is relevant for composing the medical records of the users. Since this information is often private and sensitive and can potentially be used by malevolent external agents for nefarious purposes, safeguards and protection mechanisms should be introduced to limit access to such information to professionals and other relevant stakeholders while restricting access by malevolent agents and third-party companies interested in those data (e.g. neuromarketing or health-insurance companies). In addition, the quantity and quality of data with which IATs will irrigate the digital ecosystem pose challenges to data analysis, curation, storage, transfer and visualization. Further legal reflection is needed within a twofold framework. First, from the perspective of human rights, there is a need for systematic analysis of the specific rights that dementia patients are entitled to enforce when interacting with IATs (especially in the case of assistive robotics). In addition, from the perspective of criminal law, there is a need for a proactive and rigorous definition of the conditions for legal responsibility and culpability in both patients and robots. With neither dementia patients nor assistive devices being considered fully competent agents, and hence fully entitled to legal responsibility and culpability, unequivocal standards should be set up to account for emerging case-scenarios (e.g. in case the intelligent device harms the user in a non-programmatic way, or the user harms another agent through the device).

THE ETHICAL DIMENSION: INFORMED CONSENT, PERSONAL AUTONOMY, JUSTICE

From an ethical perspective, three major implications are recognizable. The first is informed consent: while the participation of patients in the development of new applications is highly desirable to produce designs that better match the needs and expectations of the target population, this inclusive approach poses the important ethical challenge of obtaining informed consent from patients. Enrolling mild to moderate dementia patients in research will require extraordinary ethical standards and urges close monitoring by ethical committees. The second is personal autonomy: on the positive side, user-centered designs for IATs could empower adults with dementia and improve their personal autonomy (e.g. through the partial support of their independence, mobility, cognitive capacity and social interaction). Patient reports will be much needed to promote and assess this phenomenon. The third challenge is justice: fair distribution of technologies is paramount to prevent the emergence of a technological divide which could exacerbate preexisting economic inequalities. Policy makers and regulators should prevent IATs from being exclusively available to wealthy users and should rather promote the widespread distribution of such devices throughout society. This could be achieved through incentives for producers and families, the implementation of reimbursement plans and other welfare mechanisms.

CONCLUSION

IATs open the prospect of providing a triple-win effect on the management of the global crisis posed by dementia and population ageing. Nonetheless, such potential benefits risk being hampered if social, legal and ethical questions remain unaddressed. Interdisciplinary research is required to develop a systematic framework to maximize the benefits of these emerging technologies while minimizing the unintended risks.

REFERENCES

1. Alzheimer's Association. 2014 Alzheimer's disease facts and figures. Alzheimer's & Dementia 10, e47-e92 (2014).
2. Alzheimer's Disease International. World Alzheimer Report 2015: The Global Impact of Dementia. An analysis of prevalence, incidence, cost and trends. (Alzheimer's Disease International (ADI), 2015).
3. Wimo, A., Winblad, B. & Jönsson, L. An estimate of the total worldwide societal costs of dementia in 2005. Alzheimer's & Dementia 3, 81-91 (2007).
4. Bharucha, A. J. et al. Intelligent Assistive Technology Applications to Dementia Care: Current Capabilities, Limitations, and Future Challenges. American Journal of Geriatric Psychiatry 17, 88-104, doi:10.1097/JGP.0b013e318187dde5 (2009).
5. Pollack, M. E. Intelligent technology for an aging population: The use of AI to assist elders with cognitive impairment. AI Magazine 26, 9-24 (2005).
6. Kramer, B. Dementia caregivers in Germany and their acceptance of new technologies for care: the information gap. Public Policy & Aging Report 24, 32-34 (2014).
7. Sugihara, T., Fujinami, T., Phaal, R. & Ikawa, Y. In Technology Management for Emerging Technologies (PICMET), Proceedings of PICMET '12, pp. 3067-3072 (2012).
8. Peterson, C. B., Mitseva, A., Mihovska, A., Prasad, N. R. & Prasad, R. The phenomenological experience of dementia and user interface development. (2009).
9. Robinson, L., Brittain, K., Lindsay, S., Jackson, D. & Olivier, P. Keeping In Touch Everyday (KITE) project: developing assistive technologies with people with dementia and their carers to promote independence. International Psychogeriatrics 21, 494-502 (2009).
10. Niemeijer, A. R. et al. Ethical and practical concerns of surveillance technologies in residential care for people with dementia or intellectual disabilities: an overview of the literature. International Psychogeriatrics 22, 1129-1142 (2010).


Designing Therapeutic Robots for Privacy Preserving Systems, Ethical Research Practices, and Algorithmic Transparency

Elaine Sedenberg (a), John Chuang (a), and Deirdre Mulligan (a)

(a) School of Information, University of California, Berkeley

Abstract. This paper explores the unique privacy and ethical challenges of therapeutic robots with multiple sensing modalities and the ability for ubiquitous data collection. The migration of robots from the laboratory into sensitive healthcare or private settings as therapeutic agents represents a notable transition. In both laboratory and sensitive contexts, user-based research and algorithmic adaptations can lead to new knowledge about populations as well as particular users. This underscores the imperative for designers to consider unintended consequences along with long-term risks and benefits to user privacy and autonomy. By incorporating these aspects into the system design of therapeutic robotics early on, this paper aims to improve the potential for ethical long-term data sharing and use by a diverse set of researchers and practitioners. We apply the Fair Information Practices (FIPs) and explore privacy concerns unique to the placement of therapeutic robots in sensitive contexts. We introduce ethical frameworks beginning with the Belmont Report regarding the use of human subjects (i.e., users) in research practices, and explore how these principles may be integrated into the design of therapeutic robotics so that ethical research may be enabled both in the corporate and academic spheres. We draw out principles that apply to the participation of vulnerable individuals (i.e., children or handicapped persons) in research contexts, and how these considerations may be integrated into the interactions and data collection between users and robots. Finally, we make recommendations for the implementation of these ethical and privacy principles to promote the adaptation of long-term, research-ready robotics in sensitive settings.

Keywords: HRI, privacy by design, research ethics, informed consent

INTRODUCTION

Therapeutic robots embody science fiction dreams for a better future, and come with unprecedented power to understand aspects of human behavior and health through the detection of patterns in user data from multiple sources. Sensors enabled by therapeutic robotics can collect intimate personal data through passive sensors and human-computer mediated interactions. Analysis of this multi-modal sensor data can yield surprising, and often "category-jumping", inferences about individuals [1].

A diverse range of actors are now deploying therapeutic robotics and their associated data systems, including academic researchers, healthcare providers, and private corporations. Regardless of the actor, these systems can generate new knowledge with potentially positive impacts on society. However, each actor is subject to different legal and regulatory regimes while deploying these systems. We assert that regardless of the actor, there are unifying design principles that will promote privacy-preserving and ethical data collection in these sensitive environments. By implementing practices and designs informed by these principles, the robotics community may enable wider data sharing, and support interdisciplinary research on these valuable data systems.

Building upon existing literature discussing the ethical, privacy, and security implications of robotic devices and ubiquitous computing systems, we apply ethical and privacy principles to the design of social robotics systems as a whole, not just to particular use applications.

DESIGNING PRIVACY-PRESERVING ROBOTS

Many therapeutic robotic devices are designed for long-term usage and placement with an individual within sensitive, and often intimate, settings. Since these robots and their data may cover a significant portion of an individual's maturation (e.g., autism therapy) or end-of-life care (e.g., elderly companions), privacy policies and data management should be proportionally designed to accommodate these extended timescales, sensitive settings, and the potential permanency of a high volume of data.

These systems could benefit from the Fair Information Practices (FIPs) [2], which are internationally recognized practices for designing systems that respect the information privacy interests of individuals. Core principles include: Transparency (no secret systems); Access (to individuals' records and their uses); Privacy Controls (the ability to prevent information about oneself from being used for other purposes without consent); Integrity (the ability to correct or amend); and Data Use Protections (prevent data misuse).

We discuss the specific application of FIPs to therapeutic robots, and propose additional concerns unique to the field for consideration, including: 1) Data Review and Access Permissions (enhance the user's or guardian's ability to understand and manage data collection); 2) Presentation of Privacy Policies, User Consent, and Controls (utilize the diverse functionality of the robotic platform to offer consent and notification via multiple modalities like audio); and 3) Awareness of Existing Laws and Potential Data Use (sensor data held by third parties may be accessed for legal proceedings under lower standards than if it is


held by the individual data subject [3] and storing data in different countries can yield different protections—such variances are often unanticipated by users and practitioners).

ETHICAL FRAMEWORKS TO ENABLE ROBUST RESEARCH & DATA SHARING

In the U.S., ethical oversight boards regulate only medical and federally funded human subject research. This means that some development of therapeutic robots by private industry may not be subject to ethical oversight or required to use consent agreements for research practices. Regardless of a robotic developer's institutional affiliation, we feel that the ethical framework for human-subject research developed in the canonical Belmont Report [4] and Menlo Report [5] should be incorporated into all therapeutic robotic devices. We apply this framework to the design and implementation of therapeutic robots, and discuss ways in which these principles could be further optimized to maximize benefits and minimize risk while enabling robust research studies. In particular, we focus on the application of the principle of "respect for persons" through informed consent mechanisms. Consent should not only account for single academic studies, but should be inclusive of research done in private industry. Therapeutic applications of social robotics require additional legal and ethical consideration for vulnerable persons (children, handicapped persons, and the elderly). We examine how U.S. regulations, such as the Children's Online Privacy Protection Act (COPPA) and the Health Insurance Portability and Accountability Act (HIPAA), should influence robotic data systems in the private sector, and examine additional ethical frameworks for these sensitive—but high potential impact—user applications of robotics.

IMPLEMENTATION RECOMMENDATIONS

To synthesize our application of privacy and ethical principles to therapeutic robots, we discuss implementation recommendations, which include:

Access to Data: Particularly in cases where therapeutic robots cohabit with their users, users should be given the options to prevent data archiving, delete historical data, and amend incorrect or misinterpreted data over the lifespan of the robot. Special provisions are discussed for vulnerable persons who may need extra assistance in data choices and the care of their records. Issues of non-owner/user data are also addressed.

General Practice and Algorithmic Transparency: Therapeutic robots should promote a healthy, ethical research data sharing environment by notifying users of all research done using their data—whether it is limited to algorithm/product development or intended for generalizable knowledge. We discuss opportunities for the field to embrace algorithmic transparency by providing users with information about the data inputs, outputs, and algorithmic decisions presented to them during therapy provided by the robot.

Universal Informed Consent: The diverse applications of therapeutic robots in academia, healthcare, and private industry present an exciting opportunity to engage users in enhanced informed consent practices using common features like voice-enabled interactions and screen interfaces—regardless of whether consent is required by law for the specific application. Instead of limiting consent to a binary decision and pen-and-paper forms, we present options for dynamic consent models [6] that allow the user to select more nuanced participation choices (e.g., use all of my data, or only for certain types of research), to receive protocol updates or scientific findings over time, and to change decisions over time.

Design for Privacy-Preserving Data Sharing: Designers should build privacy-preserving data sharing mechanisms into therapeutic robots, so data that is ethically collected may benefit a wide spectrum of researchers and topics. We discuss proposals for open Personal Data Stores (PDS) in the literature, and propose platform choices, security, and access permissions for information sharing.

Anticipate New Knowledge and Unintended Consequences: The rich and intimate data collected by therapeutic robots will require designers to carefully consider unintended knowledge and consequences. We discuss choices and consequences in other fields, and how the design of social robotic systems for therapeutic applications may benefit from these case studies.
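Purely as an illustration of the data-access and dynamic consent recommendations above (the field names and categories are our own sketch, not the authors' design), a minimal consent record and personal data store could look like this in Python:

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Dynamic consent: nuanced, per-purpose choices that can change over time."""
    user_id: str
    allow_product_research: bool = False        # e.g. algorithm/product development
    allow_generalizable_research: bool = False  # e.g. published academic studies
    allow_data_archiving: bool = True
    history: list = field(default_factory=list) # audit trail of consent changes

    def update(self, **choices) -> None:
        # Record the change with a timestamp so decisions remain revisable and auditable.
        self.history.append((datetime.now(timezone.utc).isoformat(), dict(choices)))
        for key, value in choices.items():
            setattr(self, key, value)

class PersonalDataStore:
    """Toy data store that honors the consent record before releasing anything."""
    def __init__(self, consent: ConsentRecord):
        self.consent = consent
        self.records = {}

    def share_for_research(self, record_id: str, generalizable: bool) -> Optional[str]:
        allowed = (self.consent.allow_generalizable_research if generalizable
                   else self.consent.allow_product_research)
        return self.records.get(record_id) if allowed else None

    def delete(self, record_id: str) -> None:
        self.records.pop(record_id, None)        # user-initiated deletion

    def amend(self, record_id: str, corrected: str) -> None:
        self.records[record_id] = corrected      # user-initiated correction

A real deployment would of course add authentication, encryption, guardian workflows for vulnerable users, and the notification mechanisms discussed above.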

CONCLUSION

These principles and recommendations are not intended to be comprehensive or definitive. Rather, they serve as a starting point for dialog between the robotics community and the privacy and research ethics communities so that the immense societal benefits of therapeutic robotics may be fully realized.

REFERENCES

1. E. Horvitz and D. Mulligan. Data, privacy, and the greater good, 2015.
2. R. Gellman. Fair information practices: A basic history, 2015.
3. R.M. Thompson. The fourth amendment third-party doctrine, 2014.
4. The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Washington, DC, 1979.
5. Menlo Report: Ethical principles guiding information and communication technology research, 2012.
6. A. Stopczynski, R. Pietri, A. Pentland, D. Lazer, and S. Lehmann. Privacy in sensor-driven human data collection: A guide for practitioners, 2014.


What do care robots reveal about technology?

Rieks op den Akker

Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, Enschede, The Netherlands

Abstract— Ethical issues raised by the idea of social robots that care point at a fundamental difference between man and machine. What sort of "difference" is this? We propose a semiotic view on technology to clarify the relations users have with social robots. Are these autonomous agents just promising, or can we also count on them?

1. INTRODUCTION

If a "smart" coffee machine knows about its user's heart problems, should it accept giving him a coffee when he requests one? The issue is raised in "Ethical Things", a project that "explores the effects of autonomous systems of the future".1 Similar ethical issues raised by the idea of autonomous care robots were discussed in the Accompany project, one of the many EU projects in the field of social robotics for elderly care2 [1].

Social robots challenge our traditional theories of moral responsibility. Are they moral agents? Can they be held responsible? In this short note I invite the reader to take a look behind these types of ethical issues raised by the growing autonomy of our intelligent technical artifacts, of which social robots are the most impressive representatives. Can we perceive robots as socially responsible autonomous companion agents that care and at the same time as technical instruments? How can we understand social robots from the principles of technology? And what do users who report about their interactions with social robots tell us about the limitations of technology that follow from these principles?

2. ROBOT ETHICS AND ETHICAL ROBOTS

People have different views on the moral issues raised by autonomous artifacts like robots and on what they mean for their application in, for example, health care practice. Implicit in these views is an idea about what technology can accomplish, which is based on ideas about what technology is, about the relation between mind and matter in man and in the machine. The emphasis in the usual approach in robot ethics research is "on the robot and what the robot really is or thinks", in order to be able to answer questions like "Are robots intelligent, rational, 'moral agents'?", or "it limits ethics to concerns about things that might go wrong in interactions with robots." "For many moral philosophers, ethics is about holding someone responsible and about the rightness of one's actions, and then questions regarding moral status and action

1http://www.creativeapplications.net/objects/ethical-things-the-mundane-the-insignificant-and-the-smart-things/

2 In Accompany a robotic companion was developed for providing services to elderly users in a motivating and socially acceptable manner to facilitate independent living at home. (http://accompanyproject.eu/)

are central. We usually ascribe moral responsibility only to beings that have a sufficient degree of moral agency - whatever that means - and ask about the rightness of what that agent does, has done, or could do." [2]. Coeckelbergh proposes a human-centric or interaction-centric approach to the ethics of robot technology. "Instead of a philosophy of mind concerning what robots really are or really (can) think, let us turn to a philosophy of interaction and take seriously the ethical significance of appearance." ([3], p. 220).

One of the outcomes of the Accompany focus group discussions was that control over the programming of the robot needed to be a negotiation between the older person living with the robot and that person's other support networks of formal and informal carers, rather than simply implementing an older person's wishes. However, the data also suggests that at least one approach - the 'let's do it together' strategy - may itself undermine autonomy by (unconsciously, perhaps) infantilising the older person [1].

I will argue that what is needed for ethical decisions is an open dialogue between the partners involved; a dialogue that takes into account the specific situation in which a decision has to be made. Ethical issues are raised when we become aware of a conflict between general rules of good conduct, between different values - autonomy and safety, for example. "Open" means that there is no protocol forced upon the dialogue partners. A robot would be social when it would take responsibility, not because it is ascribed responsibility. Someone who is just following a procedure, as computers and clergymen do, is not responsible, since he does not at the same time reflect critically on the appropriateness of the procedure - a reflection that should be based on sensitivity to the values that are important in the particular situation at hand. Sometimes we must leave things for others to do. Trust is okay, but not blind trust. Responsibility is a virtue, not a commodity that can be given away.

Moor argues that "explicit ethical robot agents can decide what to do in a conflict situation" [4]. But even then we can only implement general rules, and they need to be applied in a careful way. "The human act of caring is the recognition of the intrinsic value of each person and the response to that value" (Schoenhofer). From the patient's viewpoint, care values are safety, satisfaction, responsiveness to care, dignity, and physical and psychological well-being. The values of the analytical, empirical scientific view are quite different: structurability, reproducibility, analysability. For modern technology we can add computability and programmability. The designer of (social) technology makes user models and assumes programmability of the user, who adheres to the


models underlying the user interface of the system. Although tailoring is a hot topic in the field of intelligent software agents, from a designer's perspective the user remains an abstract entity. For the caregiver, the unique person he cares about is the one who determines what has to be done in a concrete situation.

3. DIALOGUE AND RESPONSIBILITY

In everyday life we encounter each other as persons. What makes man a person is his rationality, in the sense of accountability. The postulate of rationality is a - contrafactual - principle that partners in a personal dialogue adhere to. According to Kant, being accountable, having the will to take responsibility, is what characterizes the moral person. Things, on the contrary, are those objects that cannot take responsibility.3

Note that 'man is rational' is not meant here as an empirical statement, but as a contrafactual postulate. When we are engaged in a dialogue we must assume that it holds, and we must act accordingly so that it becomes reality. This postulate is constitutive for the dialogue: without it, no dialogue between persons is possible. Even when someone lies, we assume that he will have an explanation for it. We have to take seriously that the other says something. This is the first postulate of dialogue. Being accountable is thus characteristic of being rational.

What do users' experiences tell us about the interaction with artificial companions? Bickmore et al. study long-term relationships between embodied conversational agents and elderly people [6]: "Several participants mentioned that they could not express themselves completely using the constrained interaction. One of them reported: 'When she ask me questions ... I can't ask her back the way I want'" [6]. Clearly, users of conversational agents experience that a real interaction with the system is not possible. It simulates programmed "social behaviors" but it lacks social competence. The coffee machine that knows about its user's heart problems and that is confronted with a moral problem - 'Should I present a coffee or not?' - could start a dialogue with the user and try to convince him. Eventually, questions will come up: 'Who am I talking to?' 'Do you really care?' The philosopher tries to understand what this reveals about the very idea of technology. How does technology work and serve us? A semiotic approach might help.

4. UNDERSTANDING TECHNOLOGY

For understanding the "difference between man and machine" it may help if we think about the difference between the physical sign and the meaning it carries. A machine is "part

3 "A person is the subject whose actions are capable of imputation. Moral personality is therefore nothing other than the freedom of a rational being under moral laws (psychological personality being merely the capacity to become conscious of the identity of one's self in the different states of one's existence); from which it follows that a person is subject to no other laws than those which it gives itself (either alone or at least together with others)." "A thing is that which is not capable of any imputation. Every object of free choice which itself lacks freedom is therefore called a thing (res corporalis)." [5], Einl. IV (III 26 f.)

of" an intelligent relation; without the human intellect it has no meaning, just as a sign without a meaning is not a sign. The physical presentation and its form is on the one hand arbitrary (there is no intrinsic relation between the meaning of a word and how the word looks or sounds), and on the other hand conventional and historically motivated (to be understood you need to learn the language of a community). In the same way, machines are outside objectivations of our intellect. As technical means they mediate between man and nature. They are based on the forces of physical nature and on the forces of social-psychological nature.

Computers are language machines. Suppose we talk to a machine and ask "What time is it?" and the machine answers "It is 2 o'clock in the afternoon." How does this work? It works because of the implemented correspondence between the structure of the physical process that my talking (also) is and the meaning I express. Natural language is the socially shared interface we use to express our thoughts, emotions, and commands. By making the machine react to sequences of tokens specified in a formal system - tokens that we choose to resemble the words and sentences of our own natural language - and by making the machine generate sentences in a situation that satisfies certain felicity conditions, we bring about the user experience of having to do with an understanding machine. The social robot, by uttering some natural sounds and by showing some natural behaviours, promises to be of our natural kind.
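The point can be illustrated with a deliberately trivial sketch (ours, not the author's): the "answer" below exists only because a designer wired a token sequence to a canned sentence template; no understanding is involved on the machine's side.

```python
# Minimal sketch (illustration only): a toy "language machine" whose replies are
# nothing more than token matching plus a sentence template.
from datetime import datetime

RESPONSES = {
    # a token sequence the designer chose to resemble an English question
    ("what", "time", "is", "it"): lambda: datetime.now().strftime("It is %H:%M."),
}

def reply(utterance: str) -> str:
    tokens = tuple(utterance.lower().strip(" ?!.").split())
    handler = RESPONSES.get(tokens)
    # The machine "answers" only because this lookup table exists.
    return handler() if handler else "I do not understand."

print(reply("What time is it?"))
```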

5. CONCLUSION

We propose a semiotic view on modern technology and understand technological beings essentially as outside objectivations of our intellectual, meaningful relations in social practices. The semiotic view on modern technology suggests a conceptual framework for thinking about the moral issues raised by social robots. It reveals the fundamental limitations of any technical system, however "smart". It is our responsibility to see these limitations when we use a system. In thinking about morality in technology we should carefully distinguish between general, abstract, value-free technical ideas and their application in devices used in concrete, value-laden situations.

REFERENCES

[1] H. Draper, T. Sorell, S. Bedaf, H. Lehmann, C. G. Ruiz, M. Herve, G. J. Gelderblom, K. Dautenhahn, and F. Amirabdollahian, "What asking potential users about ethical values adds to our understanding of an ethical framework for social robots for older people," presented at AISB50 - 50th Annual Convention of the AISB, 2014.
[2] M. Coeckelbergh, Growing Moral Relations: Critique of Moral Status Ascription. Palgrave Macmillan, 2012.
[3] M. Coeckelbergh, "Personal robots, appearance, and human good: A methodological reflection on roboethics," International Journal of Social Robotics, vol. 1, no. 3, pp. 217–221, 2009. Open access article. [Online]. Available: http://doc.utwente.nl/76112/
[4] J. H. Moor, "Four kinds of ethical robots," Philosophy Now, pp. 12–14, March/April 2009.
[5] I. Kant, Die Metaphysik der Sitten. Verlag von Felix Meiner, 1907.
[6] T. W. Bickmore, L. Caruso, and K. Clough-Gorr, "Acceptance and usability of a relational agent interface by urban older adults," in ACM SIGCHI Conference on Human Factors in Computing Systems (CHI). New York, NY, USA: ACM, 2005, pp. 1212–1215.


'I tech care': The responsibility to provide healthcare using robots

Antonio Carnevale

Scuola Superiore Sant'Anna, Pisa, Italy

Abstract— Robots are an emerging technology in many areas such as military engineering, logistic services, and autonomous vehicles. One of the most promising areas of their implementation is human care. Healthcare robots have not yet been commercialized, but evidence suggests that their future use will be substantial and challenging. In this contribution my aim is to investigate the kind of care relationship that could exist between a robot and a human. Usually, we take care of things and people because we love them, or else we want to give them support in their suffering. However, continuing to value only this sense of care in a future rendered increasingly abstract by technology may mean losing sight of the fact that taking care of others also means taking care of ourselves. If we completely entrust robots with the role of caring, the bigger concern is not the foreseeable decrease in 'humanity' in healthcare contexts, but the much more challenging notion of people surrendering the value and meaning in their lives. Since caring about something means, firstly, giving it value, a society passively nursed by technology is a society unable to give value to things and people. In order to avoid this risk, new approaches are required, no longer based on love or solidarity, but on responsibility.

Keywords- Healthcare robot, Human care, Responsibility, Ethics and technology

1. THE ROLE OF EMERGING TECHNOLOGY TODAY

Emerging technologies (ICTs, robotics, computer science, etc.) are challenging the sense we give to things. In the Renaissance, Michelangelo Buonarroti argued that works of art are not created in the marble, but removed from the marble. A work of art is already present in nature; all the artist has to do is remove it with the chisel, in order to bring out its beauty and replicate the perfection of nature. In our technological world, the idea of imitating or reproducing nature no longer exists, because nature is now the very reality that is produced technologically.

For most of our day we think, feel, and act in a reality whose objectivity is constructed beforehand to guide us in thinking, feeling, and doing the things we have to do. A coffee cup is made so that using it properly constitutes an immediate action. Using a vending machine, by contrast, requires information and abstract reasoning to be processed into a practical (and not immediate) explanation in order to achieve our purpose: understanding the price of the drink, inserting the coin, interacting with the display to select the product, choosing the amount of sugar, etc. The interaction between humans and machines makes the world a more informational and abstract structure [7].

But abstracting an idea from reality is not merely a logical operation (A=A; A≠B, etc.); it also involves a choice, and therefore freedom. Abstracting means distinguishing, which means choosing, which in itself means being free to choose. Without freedom of choice there is no abstraction, but only captivity and reification. Paradoxically, the more technology, its languages, and its machines take over the abstraction of the world, the more we as human beings will be called upon to rethink our role on this planet.

2. ROBOTS AND HEALTHCARE

Robots are an emerging technology in many areas such as military engineering, logistic services, and autonomous vehicles. One of the most promising areas of their implementation is human care [4] [5]. Companies and universities continue to produce different robotic prototypes for different care services - rehabilitation, physical assistant robots, person carrier robots. In the last decade, healthcare robots have been the main focus of several projects and prototypes conceived to improve quality of life and independent living, thus promoting active and healthy ageing and reducing health and social costs. Robotic service solutions range from the simplest tele-presence functionalities to support caregivers,1 to the most complex, such as assistance for daily living activities, self-management of chronic diseases,2 well-being and integration in a smart environment3 or in different scenarios.4 That care robots are ready to enter the private lives of people is a fact that will soon be reality. On the other hand, patient associations and other parties of civil society are also pushing public health systems to use robotic applications for social and home-based care. The media increasingly present robots in terms of future helpful supports, thus stimulating the collective imagination about how life could change when these machines are able to take care of our daily needs. In addition, governments' attention to care robots has increased because they are seen as technological solutions to tackle the growth in public healthcare costs due to the ageing society and to transformations in family systems, which demand and rely ever more on social welfare support.

3. WHY IS TAKING CARE SO IMPORTANT FOR HUMAN BEINGS?

We take care of things, people, and even ideas, because we love them [8] [9]; in other words, because they assume a significant value for us. The movement is the same: investing a thing, a person, an idea with a value. The invested value tells us that, in the name of that thing, person, or idea, it is worthwhile to act, struggle, and sacrifice a part of oneself. Whether for the health of a family member or for defending the freedom of a population, what drives us to help, support, aid, love, and fraternize is what they represent for us. The practice of care is perhaps the aspect of human life that makes us truly 'human beings'. This is why it is so important. We could continue to exist as numbers in monetary economics that maximize profits and relativize losses. And this is why there is no demise in sight for capitalism. We could continue to exist in societies in which

1 See the ExCITE project (Giraff Technologies, website: www.giraff.org).
2 See AVA (iRobot, website: www.irobot.com/ava).
3 See the DOMEO project, website: www.aal-domeo.eu.
4 See: www.robot-era.eu.


machines are built to replace us in doing difficult work. But what makes us truly human is taking care of things and people, which means giving value to reality. In the practice of caring we rise above the selfishness of economic exchange and above the camaraderie of small communities. We are something more than mere living beings. We are human beings because we give representation – i.e. value – to our lives. Without love and solidarity, life certainly would continue to exist, but it would have no value. It would not be chosen by people, but only passively experienced – a bare life [1]. And without being chosen, life would not even be free, because choosing means being free to choose.

4. THE CHALLENGE TO PROVIDE HEALTHCARE THROUGH HUMAN-ROBOT INTERACTIONS

For decades it has been believed that the most advanced robotics design in the field of assistance and sociality lay in trying to replace all (the humanoid) or some parts (the cyborg) of the human body. In the future this centrality of the anthropomorphic element is probably doomed. In fact, the gradual convergence of robotics with other technological and scientific sectors – ICT, AI, synthetic biology, digital fabrication – will almost certainly lead to new trends. On the one hand, there is the interest of researchers and developers in imitating and reproducing not only the human body on its own, but biology and nature in a broader sense – represented here by the project of the robotic octopus.5 On the other hand, the anthropomorphic element will not disappear completely, but will be greatly transformed. No longer will the body be a biological machine created to mimic and replicate; rather, humanity will be construed as a unique normative element, i.e. a model driven by both biological and social rules. The degree of 'humanity' of a machine will no longer be represented by its aesthetic and functional similarity to the human body, but by its ability to choose based on principles and shared rules [12]. The more complex machines become, in order to be more 'human' they must also be 'right', making decisions according to universalizable rules – like the well-known laws of Asimov. There are interesting legal approaches that imagine the rules to which technology should be subjected not as rules regulating the technical functionality of its products, but rather as tools of rights: by regulating the use of technological artifacts, it is possible to intervene in and improve some aspects of people's behavior [2] [3].

If this is the robotics of tomorrow, it is difficult to believe that in the future the problem with healthcare robots will be their similarity to a pleasant and attentive caregiver [6]. The robot does not necessarily have to love the person it is caring for, nor have solidarity for the cause of his/her suffering, nor look like a good mother or a loving pet. If it really becomes possible to reproduce the feeling of 'love' in the machine, this will still be a matter of programming the commands and rules that the robot has to follow, and not an ontological question about its sensitivity.

The problem is normative and techno-regulatory, and not purely speculative. The problem is not seeking the exact definition to distinguish a robot from a human being; the

5 See: www.octopus-project.eu.

problem is seeking norms and policies to respond to the questions: how will humans and machines be able to live together? In a world rendered abstract by technological processes and computer languages, what will become of the significance of the human touch, an affectionate gaze, a hug? What is to become of the typical human feelings such as sympathy, guilt, shame, and even a sense of justice? If it is true that those who suffer injustice are more able to enjoy the taste of freedom, who will still have a sense of freedom in a society in which autonomous machines will do anything and everything to prevent us from suffering?

These questions highlight the real challenge that the future diffusion of healthcare robots poses: the responsibility of providing healthcare that balances technology and human values. Who provides care to whom in a future technologized society? [11]

Answering this requires new conceptual as well as practical and value-sensitive design approaches [13]. And this combination makes everything difficult. Scientific progress cannot and should not be stopped; however, it is unacceptable to adapt human freedom to the needs of technology [10].

REFERENCES

[1] G. Agamben, Homo Sacer: Sovereign Power and Bare Life. Stanford University Press, 1998.
[2] B. van den Berg, Robots as Tools for Techno-Regulation, Law, Innovation and Technology 3(2) (2011), 317–332.
[3] B. van den Berg and R.E. Leenes, Abort, Retry, Fail: Scoping Techno-Regulation and Other Techno-Effects, Human Law and Computer Law: Comparative Perspectives 25 (2013), 67–87.
[4] A. Carnevale, Ontology and Normativity in the Care-Robot Relationship, in: Seibt, J., Hakli, R., and Nørskov, M. (eds.), Sociable Robots and the Future of Social Relations, IOS Press, Amsterdam, 2014, 143–152.
[5] E. Datteri and G. Tamburrini, Ethical Reflections on Health Care Robots, in: Capurro, R., and Nagenborg, M. (eds.), Ethics and Robotics, IOS Press, Amsterdam, 2009, 35–48.
[6] D.J. Feil-Seifer and M.J. Matarić, Ethical Principles for Socially Assistive Robotics, IEEE Robotics & Automation Magazine, special issue on Roboethics, (eds.) G. Veruggio, J. Solis, and M. Van der Loos, 18(1) (2011), 24–31.
[7] L. Floridi, The Ethics of Information, Oxford University Press, Oxford, 2013.
[8] H. Frankfurt, The Reasons of Love, Princeton University Press, Princeton, 2004.
[9] E.F. Kittay, Love's Labor: Essays on Women, Equality, and Dependency, Routledge, New York, 1999.
[10] N.E. Sharkey and A.J.C. Sharkey, The Rights and Wrongs of Robot Care, in: Lin, P., Abney, K., and Bekey, G.A. (eds.), Robot Ethics: The Ethical and Social Implications of Robotics, MIT Press, Cambridge MA, 2011, 267–282.
[11] R. Sparrow and L. Sparrow, In the Hands of Machines? The Future of Aged Care, Minds and Machines 16 (2006), 141–161.
[12] W. Wallach and C. Allen, Moral Machines: Teaching Robots Right From Wrong, Oxford University Press, Oxford, 2009.
[13] A.L. van Wynsberghe, Designing Robots for Care: Care Centered Value-Sensitive Design, Science and Engineering Ethics 19(2) (2013), 407–433.


Robots and seniors: can they be friends?

Sofia Reppou and George Karagiannis

ORMYLIA Foundation, 63071 Ormylia – Chalkidiki, Greece

Abstract. When getting older, some positive changes happen as we develop and mature; some other, less welcome changes also happen, like physical decline and disease. Getting older sometimes amounts to starting to live on the margins. Retirement, loss of a spouse, and health issues or disabilities due to ageing can suddenly push the elderly to the edge of society. Children have grown up and have their own families, and the elderly find themselves living alone at home or, unluckily, in geriatric institutions. Unfortunately, we cannot avoid the negative aspects of ageing, but we can always try to make them easier. In recent years, a new approach was introduced to confront the social exclusion of the elderly with the help of assistive robots [1].

Keywords: Social Inclusion, Robots, Elderly, Seniors, Assistive Robots, Companion Robots.

SOCIAL INCLUSION WITH ASSISTIVE TECHNOLOGIES

Exclusion, according to the World Health Organization [2], "consists of dynamic, multi-dimensional processes driven by unequal power relationships interacting across four main dimensions - economic, political, social and cultural - and at different levels including individual, household, group, community, country and global levels". As a result, a sequence of inclusion/exclusion is triggered that leads to inequalities in health and rights; a process in which individuals or entire parts of society are deprived of the rights, opportunities and resources that are normally available to its members.

The elderly, facing the threat of losing their independence due to ageing and its consequences (health, cognitive, social and financial), are at high risk of social exclusion. Social exclusion, in the form of deprivation from activities that offer joy, fulfillment and a sense of belonging, can lead to frustration, depression and health decline. Developing policies to confront and prevent social exclusion has been a major concern for the World Health Organization and all European countries.

To address the critical issue of the social exclusion of the elderly, the EU supports and funds a large number of projects that aim to offer solutions to the isolation, loneliness and exclusion of the elderly with the assistance of ICT technologies and robotics. The use of new assistive technologies allows the elderly to face the difficulties of modern life and to get over the barriers that limit their social and emotional well-being, helping them to achieve a better quality of life [3]. Assistive robotics has been claimed to be more effective in this direction than computers, for a number of reasons [4], [5]:

• Computers need training in order to be used by the elderly, or require a person (a caregiver) to use them on the elder's behalf or to help him.
• Learning is one of the cognitive functions that decline with ageing, making new information and skills difficult to acquire.
• Sensory and motor declines triggered by age can also make it difficult for the elderly to use computers.

On the other hand, assistive robots can be used by the elderly without intensive training or physical effort. The elderly can have immediate access to a number of applications and a direct connection to the internet, social media, email, Skype calls, etc.; all they need to do is ask the robot to do it for them [6]. Moreover, a humanoid appearance gives a sense of having a companion rather than a machine, and decreases loneliness and social deprivation. Companion robots like Paro should be cited here, as they have been proven to positively affect the social skills of the elderly and to increase social interactions as well as the emotional well-being of the users. Paro is a small robot resembling a seal that can sense the user's touch, recognize a limited amount of speech, express a small set of vocal utterances, and move its head and front flippers [7].

The development of robotics has already added a number of abilities to current products, enabling robots to recognize objects and faces, hear and speak, move around, pick up and grasp objects, and express emotions.

Human-robot interaction

It has been proven that robots can assist the elderly in their daily life [8], [9], but can they really substitute for social contact? While working in the frame of an EU project (RAPP/EU-FP7) with a group of seniors from a small seaside town in North Greece, some questions were raised about the potential use and acceptance of robots by the elderly.

The social and cultural background of the aforementioned RAPP group, living in a small Mediterranean village in Greece where family ties and social relationships are strong, determines their reaction to robots and shapes their interaction [10]. In order to explore in detail the feelings of the users towards robots, we used the "Negative Attitudes towards Robots Scale" by Nomura to investigate


potential negative perceptions and behaviors that could prevent interaction between robots and the elderly [11]. The scale was used as a discussion tool rather than as a distributed questionnaire, due to the small size of our group and their tendency to answer questionnaires "in a positive way" (to gain the researcher's approval) or exactly the same way as each other (watching what other seniors answered). The main outcome of this free discussion was that all the elderly imagine robots in a human-like form, moving around the house, doing the household chores and taking care of them like "mechanical servants". They would like robots to have feelings and to make friends with them, but they do not really believe that this could happen, at least not soon. When we insisted on this aspect ("imagine that we could have a robot with feelings by tomorrow"), they expressed some concerns about how the world could be if people made friends with robots instead of each other, or how complicated human-robot interaction could become if feelings were engaged. The basic conclusion was that robots are good because they are assistive machines, and there is no reason to worry about them as they will always be like that; all the scenarios about robots having feelings or thinking for themselves are impossible ("this is sci-fi").

Figure 1. Interacting with NAO

It is apparent that a number of issues arise concerning human-robot interaction and how much the elderly come to like robots. It is highly important that robots be accepted by seniors as interaction partners. The feelings that robots evoke in people, whether they look human-like or not, can affect the interaction and the users' psychological status as a whole. For our user case in the small seaside town in North Greece, we chose to use NAO, a human-like robot by Aldebaran (SoftBank Group) [12]. As NAO is akin to a little child, the human-robot interaction was enhanced: the seniors treated the robot like a little child, protected it, and adjusted their behavior in order to converse with it. The cautious attitude of the first user-NAO meeting was followed by a number of positive meetings, where familiarity with the robot helped them to approach it and interact with it, finding NAO interesting, useful and easy to use. They were still disheartened by some of its "disabilities", as they actually expected much more from robots. Still, when companionship was discussed, nobody felt that a robot could replace a warm meeting with friends or family. One of the users commented: "It's a machine, not a friend of mine."

The feelings of the elderly about interacting with robots, or about restoring their social life in the company of robots, are the key issue we need to clarify in order to promote successful assistive technologies and suitable robotic products.

REFERENCES
1. Bradley, Sara M., and Cameron R. Hernandez. "Geriatric assistive devices." American Family Physician 84, no. 4 (2011).
2. World Health Organization, ed. International Classification of Functioning, Disability, and Health: ICF. World Health Organization, 2002.
3. Ahn, Mira. "Older people's attitudes toward residential technology: The role of technology in aging in place." PhD diss., Virginia Polytechnic Institute and State University, 2004.
4. Roupa, Zoe, Marios Nikas, Elena Gerasimou, Vasiliki Zafeiri, Lamprini Giasyrani, Eunomia Kazitori, and Pinelopi Sotiropoulou. "The use of technology by the elderly." Health Science Journal 4, no. 2 (2010).
5. Hutson, Suzanne, Soo Ling Lim, Peter J. Bentley, Nadia Bianchi-Berthouze, and Ann Bowling. "Investigating the suitability of social robots for the wellbeing of the elderly." In Affective Computing and Intelligent Interaction, pp. 578-587. Springer Berlin Heidelberg, 2011.
6. Ezer, Neta, Arthur D. Fisk, and Wendy A. Rogers. "More than a servant: Self-reported willingness of younger and older adults to having a robot perform interactive and critical tasks in the home." In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 53, no. 2, pp. 136-140. SAGE Publications, 2009.
7. Kidd, Cory D., Will Taggart, and Sherry Turkle. "A sociable robot to encourage social interaction among the elderly." In Robotics and Automation, 2006. ICRA 2006. Proceedings 2006 IEEE International Conference on, pp. 3972-3976. IEEE, 2006.
8. Hutson, Suzanne, Soo Ling Lim, Peter J. Bentley, Nadia Bianchi-Berthouze, and Ann Bowling. "Investigating the suitability of social robots for the wellbeing of the elderly." In Affective Computing and Intelligent Interaction, pp. 578-587. Springer Berlin Heidelberg, 2011.
9. Fasola, Juan, and Maja J. Mataric. "Using socially assistive human–robot interaction to motivate physical exercise for older adults." Proceedings of the IEEE 100, no. 8 (2012): 2512-2526.
10. Bartneck, C., Nomura, T., Kanda, T., Suzuki, T. and K. Kensuke (2005). Cultural differences in attitudes towards robots. Proc. of AISB 2005: 1-4.
11. Nomura, Tatsuya, Takayuki Kanda, and Tomohiro Suzuki. 2006. Experimental investigation into influence of negative attitudes toward robots on human–robot interaction. AI & Society 20 (2): 138-50.
12. Aldebaran Robotics. (n.d.). Retrieved July 30, 2015.


Survey investigating ethical issues concerning Robot Enhanced Therapy for children with autism

Mark Coeckelbergh (a), Cristina Pop (b), Ramona Simut (c), Andreea Peca (c), Pablo Esteban (d), Albert De Beir (d), Hoang-Long Cao (d), Daniel David (c), Bram Vanderborght (d)

(a) Centre for Computing and Social Responsibility, De Montfort University, Leicester, United Kingdom
(b) Babeș-Bolyai University, Department of Clinical Psychology and Psychotherapy, Cluj-Napoca, Romania
(c) Vrije Universiteit Brussel, Clinical and Life Span Psychology Department, Brussels, Belgium
(d) Vrije Universiteit Brussel, Robotics and Multibody Mechanics Research Group, Brussels, Belgium

Abstract. The use of partially autonomous robots in therapeutic contexts raises several ethical issues, starting with the degree of autonomy to be afforded to the robot. The more autonomous the robot, the less control therapists have over the robot-child interaction, raising the issue of where the responsibility for the robot's actions lies. Autonomy also raises the problem of trust: are parents happy to have their child interact with a robot? Will the child trust the robot? The Eurobarometer study of public attitudes towards robots shows that many people in Europe resist the idea of using robots in care. The aim of this paper is to investigate the ethical issues raised by the use of robots in therapy for children with ASD by means of a survey amongst caregivers, parents and teachers of children with ASD. We conclude that although in general stakeholders approve of using robots in therapy for children with ASD, it is wise to avoid replacing therapists by robots and to develop and use robots that have at most what we call supervised autonomous interaction.

Keywords: social assistive robot, Autism Spectrum Disorders, Survey

INTRODUCTION

Impairment in social interaction is an important element of Autism Spectrum Disorders (ASD) and challenges researchers to find better treatments, including for children with ASD. Several kinds of treatments are being investigated to improve their capacity for social interaction and communication, such as applied behavior analysis, peer-mediated training, video-modeling, social stories, etc. One of the proposed options is to use robots as tools to enhance therapy [1]. The project DREAM, funded by the European Commission under the FP7 framework, investigates so-called robot enhanced therapy (RET) for children with ASD. Roboticists develop social robots such as Nao or Probo [2] which can interact with the child while being supervised by the therapist. Therapists can use the robot to elicit prosocial behavior; the robot functions as a social mediator between therapist and child. However, robot developers and therapists are concerned about the ethical and societal acceptability of their tools and methods. As a recent Eurobarometer [3] study of public attitudes towards robots shows, many people in

Europe resist this idea of using robots in care, with 60% of EU citizens saying that robots should be banned from the care of children, elderly people and people with disabilities. There is also still considerable opposition to using robots in other 'human' areas: 34% of respondents say robots should be banned in education, 27% are against the use of robots in healthcare and 20% oppose their use for leisure purposes (European Commission 2012: 11). Robot scientists are also sometimes confronted with negative responses to their work. Robots are also often linked to science fiction and presented as dangerous for mankind; some sound an 'apocalyptic alarm' [4]. Therefore, we want to know what people think about RET. Do they think it is ethically acceptable to use robots for this purpose? Do they think it is helpful? Would parents trust their children to a robot? And if more autonomous robots were to be developed, would they trust a situation in which there is no adult supervision?

The philosophical discussion delivers two types of potential problems which both relate to the autonomy of the human person (therapists, parents, others). First, there are issues concerning privacy and data protection, issues which are also raised by many other ICTs. Second, there is the problem concerning robot autonomy and trust: how much (and what kind) autonomous behavior should the robot exhibit, that is, to what extent should the robot-child interaction be supervised and controlled by the therapist? More generally, can the parents trust their child “into the hands of the robot”?

METHODOLOGY

The questionnaire was offered online via the free and open-source survey application LimeSurvey, installed on the VUB webserver, and was available in three languages: English, Romanian and Dutch. Since robots exist in different shapes for a wide range of applications, but our survey focuses on social robots, we introduced this type of robot before the survey by means of a one-minute video in layman's terms. The video contained short clips of a selection of the currently most used robots, such as NAO, Keepon, Probo, Kaspar, the Iromec platform, and Pleo. In this way, robots were


shown that look like machines, (imaginary) animals, humanoids and androids. No children were shown in the video1.

The questionnaire was developed in a multidisciplinary team consisting of psychologists, therapists, engineers and ethicists, and was based on guidelines and essential elements of questionnaire design and development in order to obtain a reliable and valid questionnaire [5].

We asked parents and therapists in Romania, Belgium, and the Netherlands. Participants were recruited based on databases of persons involved in our past research and messages were posted on relevant blogs, Facebook, and newsletters and websites of autism organizations. A total of 416 subjects participated in the study. Data from 22 participants were excluded from the analysis since the responses were incomplete. 22.59% of the participants were parents of children with ASD and 16.75% of the participants were therapists or teachers of children with ASD.

RESULTS

Our survey had the following results. In general, our respondents find it acceptable to use social robots in therapy for children with ASD. (This differs from the Eurobarometer results about the use of robots in healthcare in general. We explained the concept of a social robot in a video, used a neutral voice and did not show children; perhaps this made a difference.) However, our respondents are far more hesitant about the idea that these robots would replace therapists; most people think that robots should support the interaction between therapist and child, rather than replace the therapist. For instance, a significant number of people do not want the robot to respond automatically to the child's behavior without being tele-operated. This is the reason why DREAM works towards supervised autonomous interaction [6]. Furthermore, some people are also worried about the possibility that the robot is perceived by the child as a friend, or as a human; our respondents are more positive about zoomorphic robots and the idea of the robot as a tool.
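As an illustration of what supervised autonomous interaction can mean in practice, the sketch below (ours, not the DREAM architecture) lets the robot propose actions autonomously while nothing reaches the child without explicit therapist approval; all function names and behavior labels are hypothetical.

```python
# Minimal sketch (illustration only): a supervised-autonomy gate. The robot
# proposes an action from observed child behavior; the therapist must approve
# it before it is executed.
def propose_action(child_behavior: str) -> str:
    """Toy autonomous policy mapping observed behavior to a candidate action."""
    return {"eye-contact": "praise", "no-response": "repeat-prompt"}.get(child_behavior, "wait")

def therapist_approves(action: str) -> bool:
    # Placeholder for the therapist's interface (e.g. a button on a tablet).
    return input(f"Robot proposes '{action}'. Approve? [y/n] ").strip().lower() == "y"

def supervised_step(child_behavior: str) -> str:
    action = propose_action(child_behavior)
    # Autonomy is bounded by explicit human supervision.
    return action if therapist_approves(action) else "wait"

if __name__ == "__main__":
    print("Executed:", supervised_step("eye-contact"))
```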

CONCLUSION

The use of robots for RET for children with ASD raises several ethical issues. The survey we conducted supports both ideas of the DREAM project: to avoid replacement and to develop and use robots that have at most supervised autonomy. More generally, it seems that most people approve of using social robots in ASD therapy - in contrast with the Eurobarometer study - provided ethical issues such as autonomy/trust and appearance are dealt with by the

1 http://www.youtube.com/watch?v=DSklqn49gD8.

researchers and therapists. A more in-depth discussion can be found in [7]. Further research is needed to obtain a more comprehensive analysis of the ethical issues and to involve stakeholders in the development of robots for children with ASD.

Figure 1. Results for the question "Is it ethically acceptable that social robots are used in therapy for children with autism?"

ACKNOWLEDGEMENTS

The work leading to these results has received funding from the European Commission 7th Framework Program as a part of the project DREAM grant no. 611391.

REFERENCES
1. Scassellati, Brian, Admoni, Henny, and Matarić, Maja. 2012. Robots for Use in Autism Research. Annu Rev Biomed Eng 14: 275-294.
2. Vanderborght, Bram, Simut, R., Saldien, J., Pop, C., Rusu, A., Pintea, D., Lefeber, D., and David, D. 2012. Using the Social Robot Probo as Social Story Telling Agent for Children with ASD. Interaction Studies 13(3): 346-370.
3. European Commission. 2012. Special Eurobarometer 382: Public Attitudes Towards Robots. (Report) Retrieved 14-11-2012 from http://ec.europa.eu/public_opinion/archives/ebs/ebs_382_en.pdf
4. Veruggio, Gianmarco. 2005. The Birth of Roboethics. Paper presented at ICRA 2005, IEEE International Conference on Robotics and Automation, Workshop on Robo-Ethics, Barcelona, 18 April 2005.
5. Rattray, Janice, and Jones, Martin C. 2005. Essential elements of questionnaire design and development. Journal of Clinical Nursing 16: 234–243.
6. Thill, S., Pop, C. A., Belpaeme, T., Ziemke, T., and Vanderborght, B. 2012. Robot-assisted therapy for autism spectrum disorders with (partially) autonomous control: Challenges and outlook. Paladyn, Journal of Behavioral Robotics 3(4): 209-217.
7. Coeckelbergh, M., Pop, C., Simut, R., Peca, A., Pintea, S., David, D., and Vanderborght, B. 2015. A Survey of Expectations About the Role of Robots in Robot-Assisted Therapy for Children with ASD: Ethical Acceptability, Trust, Sociability, Appearance, and Attachment. Science and Engineering Ethics: 1-19.


Accommodating Students with Disabilities Using Social Robots and Telepresence Platforms: Some Legal and Regulatory Dimensions

Aaron Saiger

Fordham University, School of Law, New York, New York, United States

Abstract. As social and other robots become more accessible and inexpensive, public school systems will increasingly use them to accommodate pupils with disabilities. Such accommodations are subject to regulatory regimes designed to ensure those students’ equality and dignity. The application of these rules to robotic technology is not straightforward. This is because robots in school have the simultaneous potential to facilitate inclusion, mark difference, and substitute for other kinds of accommodations. In order to assure students’ rights and comply with relevant law, regulators will have to consider in sophisticated ways the social dimensions of robots—social robots per se, but also telepresence platforms that allow students to attend school remotely. Design of robots for use by disabled students must ensure that they facilitate rather than mitigate the social inclusion of the disabled user in her general-education school.

In both the United States and in Europe, public schools have a duty to educate all children, with or without disabilities, equally. Both systems understand equality to require schools to accommodate disabled children’s particular needs. Such accommodations ensure that each disabled child is able, to the extent possible, to access educational programs on the same basis as all other pupils.

All U.S. states accept federal funds that support the education of disabled children. The Individuals with Disabilities Education Act (IDEA) obligates states in return to provide all children with disabilities in the state "a free appropriate public education." IDEA defines "a free appropriate public education" as "special education and related services" that are provided "at public expense," are educationally "appropriate," and are consistent with an "individualized education program" that the state must prepare for the student. It defines "related services" as "such developmental, corrective, and other supportive services (including ... mobility services) ... as may be required to assist a child with a disability to benefit from special education." And it defines "special education" to mean "specially designed instruction, at no cost to parents, to meet the unique needs of a child with a disability." The implementing regulations for IDEA further require that schools provide "assistive technology devices … or services," at home or at school, if these are "required" to make either special education or related services effective. (1)

European requirements are broadly similar. The recently ratified United Nations Convention on the Rights of Persons with Disabilities (CRPD) guarantees children with disabilities "quality and free primary education and secondary education on an equal basis with others in the communities in which they live." It further obligates states to ensure "[r]easonable accommodation of the individual's requirements." (2)

A major additional feature of the CRPD is its requirement for "inclusive education." (3) Students with disabilities may not be "excluded from the general education system on the basis of disability." Inclusion bears a family resemblance to the "mainstreaming" requirement of the American IDEA, which demands that disabled children be placed in the "least restrictive environment" consistent with their needs. (1) In the United States, but less so in Europe, mainstreaming is considered to be an overriding goal, entitled to priority over most other desiderata for special education. (3, 4)

As experiments and pilot programs that introduce social robots into classrooms proliferate, several have focused upon special education, especially for students with autism and similar social disorders. Such robots are "assistive technology devices" under the law. IDEA requires schools to provide them, without regard to cost and without charge to parents—although only if they are "required" to make special education or related services effective. In guidance they issue to school districts, several states already list robots among assistive technologies that schools must consider for disabled students. There are no reported instances of disputes over whether robots are "required" that have led to formal administrative or judicial resolution; but it is hard to imagine that this will long remain the case.

Social robots might also substitute for human paraprofessionals now assigned as one-on-one supervisors or "shadows" for students with disabilities such as severe allergies or social disabilities that involve aggression and violent behavior. (5) I am unaware of any such uses today but it is an obvious application.

More complex issues are raised by telepresence systems for children whose disabilities prevent them from attending school. Such disabilities range from severe allergies to immune disorders to cancer. (6) These systems' core features are a mobile platform, processor,


microphone, and camera. The child user can, from a remote location, move the platform around the school, hear and see what is occurring in the classroom, and be heard and seen in turn. A story in the popular press about the VGo, one of the major brands in this sector, describes it as a "camera-and-Internet-enabled robot that swivels around the classroom and streams two-way video between … school and house." (7)

Telepresence systems raise important issues regarding inclusivity. The most straightforward is that public schools likely have a legal duty to provide such systems to students who otherwise cannot attend school. In the American context, the VGo is an "assistive device" under the regulations, and is probably also a "related … mobility service" under the IDEA statute itself. Schools must therefore provide it, irrespective of cost.

A more difficult question is whether schools may accommodate disability via telepresence rather than by providing physical accommodations that are potentially more cumbersome, expensive, and disruptive. A school might prefer to accommodate a child with a motor handicap by offering a telepresence system rather than by modifying doors, classroom furniture, and bathrooms. So too, telepresence might be offered to disruptive students in place of a one-on-one paraprofessional shadow (human or robotic). Telepresence might not only save money but improve the effective accessibility of the school to such students. (Even if a student has a place for her wheelchair in the classroom, the wheelchair will not fit in every place that a child's classmates might go.) One might nevertheless conclude that such "accommodations" are antithetical to mainstreaming or inclusive education, because they physically exclude the child with the disability from the general-education classroom.

These problems will only become more difficult as the social dimensions of robotic presence in contexts like classrooms become better understood. Telepresence platforms are not social robots in the ordinary sense, but they have affective features. Both the accommodated child and other children relate socially to the telepresent machines. Their operators often doll them up with clothing and other accoutrements; the mobile carts stand in line and go outside for recess; other children touch them. (6, 7) Mandates for mainstreaming or inclusion are primarily about social equality, and focus upon equality in the informal, interpersonal aspects of school life more than equality of formal academic opportunity. Thought must therefore be given to what kind of social reality telepresence creates for disabled children and whether it is consistent with the right to inclusion. Telepresence is unambiguously positive when the disability in question otherwise demands isolation. But the question is much more difficult when telepresence substitutes for other, partial but in-person accommodations.

In the near future, moreover, we should expect telepresence to be augmented by various adaptive features, some of which will be classically “robotic” in the sense of the systems’ responding to external stimuli based upon internal algorithms rather than direct user control. For example, a VGo controlled by a vision-

impaired student can be rigged at the user’s end to magnify what the camera sees. This is beneficial, but not unambiguously so. The hidden cost is that neither the school nor fellow students need adjust their own behaviors to the needs of their disabled comrades. This is problematic if accommodation and mainstreaming are necessary to equality for the disabled.

The hardest cases will involve, again, social disabilities. A student who cannot pay attention, who disrupts class, or who is violent toward his peers is difficult to accommodate in person. A telepresence system might therefore be designed, for example, to mute its voice or disengage its drive during instruction. A user might wish to have the system run around the room, or call out, but it would refuse to do so. Similarly, studies show that persons expect technology to obey conventions about social space, body language, movement, and response time, and judge nonconformance negatively. (8) A tempting potential response is to augment telepresence systems to follow social conventions automatically. Doing so could, for example, ease the accommodation and acceptance of operators with autism who often struggle to conform to these conventions.

Again, such accommodations bring clear benefits in terms both of access and acceptance. In some cases they might even be a necessary condition of any inclusion at all. But they depart from the ideal of mainstreaming or including the disabled child as she is. They include not the child but a modified simulacrum of that child; and they deprive other children in the classroom of the social experience of the full diversity of disabled children. The legal question then becomes whether, when, and to what extent socially-adapted telepresence is a desirable accommodation that brings the disabled child into the social mainstream of her class or school, and under what circumstances it is a retrograde development that excludes and isolates them. The legal analysis in turn should inform design standards for systems whose features will facilitate the former determination.
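The kind of constraint described above can be made concrete with a small, purely hypothetical sketch: a telepresence controller that filters the remote student's commands according to the classroom's current mode. Mode names and the command set are illustrative, not features of any real platform such as the VGo.

```python
# Minimal, hypothetical sketch: filter the remote user's commands by classroom mode.
ALLOWED = {
    "instruction": {"listen", "watch"},               # mic muted, drive disengaged
    "group-work":  {"listen", "watch", "speak"},
    "recess":      {"listen", "watch", "speak", "drive"},
}

def command_permitted(command: str, classroom_mode: str) -> bool:
    """Return True if the remote student's command may be executed right now."""
    return command in ALLOWED.get(classroom_mode, set())

# During instruction the platform refuses to drive or call out,
# even if the remote student requests it.
print(command_permitted("drive", "instruction"))   # False
print(command_permitted("speak", "recess"))        # True
```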

1. Individuals with Disabilities Education Act, 20 U.S.C. §§ 1401ff.; 34 C.F.R. § 300.105.
2. United Nations Convention on the Rights of Persons with Disabilities § 24(2)(b), (c).
3. G. De Beco, The Right to Inclusive Education According to Article 24 of the UN Convention on the Rights of Persons with Disabilities: Background, Requirements and (Remaining) Questions, Netherlands Quarterly of Human Rights 32, 263–287 (2014).
4. A. Kanter, M.L. Damiani, & B.A. Ferri, The Right to Inclusive Education Under International Law: Following Italy's Lead, Journal of International Special Needs Education 17, 21–32 (2014).
5. M.F. Giangreco et al., Paraprofessional Support of Students with Disabilities: Literature from the Past Decade, Exceptional Children 68, 45–63 (2001).
6. http://www.vgocom.com
7. Robbie Brown, A Swiveling Proxy That Will Even Wear a Tutu, New York Times, June 7, 2013.
8. Tom Simonite, The New, More Awkward You, MIT Technology Review 114, 78–80 (January/February 2011).

An HRI study with elderly participants? Where’s the problem?*

Jorge Gallego-Perez

Abstract— Assistive technology, and care robots in particular, pose a range of practical and ethical problems to elderly users as well as to their carers. Many of those issues have already been tackled in a vast literature. More rarely do we find accounts of the challenges that the researcher faces when research involves robotics and elderly participants. In this short paper, I introduce potential pitfalls that researchers in Human-Robot Interaction (HRI) might encounter when the participants are of advanced age. To present these research challenges in an engaging manner, they are embedded in a fictitious story about a researcher’s latest study.

I. INTRODUCTION

In this paper I present an array of pitfalls and caveats that an HRI researcher is likely to face in his/her research with elderly participants. I do not consider the corresponding solutions in this paper, leaving that to future publications. Instead of enumerating these potential shortcomings and considerations in a long list, I decided to present them in the form of a fictitious narration. In the story, a PhD student is emailing back his supervisor, who has asked him about the current status of his latest study, namely an HRI experiment that includes elderly participants. The fictitious experiment takes place in a nursing home for elderly people without dementia. They interact one-to-one with a WoZ-controlled social robot in a room of the nursing home, whereby they speak in natural language while holding a tablet.

The events in the story that are marked with an asterisk (*) indicate that I went through similar situations in my own research. Other events are followed by references, indicating literature that points to related issues. The remaining events and reflections added to the story relate to issues that I consider to be equally relevant.

II. A FICTITIOUS HRI STUDY

I so much appreciate your email, asking me about the current state of my first HRI study with elderly people. I do acknowledge the importance of being accepted to the conference as well as the urgency of the situation due to the delays, but I can assure you that I have everything under control and that we will impress our reviewers with the great quality of the study. I have no doubt that we’ll be accepted! There might have been a few bumps on the road, but there’s nothing now to worry about.

As you already know, it took me just a little longer than expected to properly start the study. The recruitment of the participants turned out to be different from recruiting younger adults, as I’m used to* [1].

*Research supported by EU FP7 project ACCOMPANY. J. Gallego-Perez is with the Faculty of Electrical Engineering, Mathematics and Computer Science, University of Twente, 7500 AE Enschede, The Netherlands. {j.gallegoperez} at utwente.nl

There are just a few older people around the campus*, and I don’t have connections. I pulled some strings and found a nursing home where I could conduct my study. A little more delay came then from obtaining the agreement of the carers and the relatives of the participants [1].

You asked me in your email how the data collection went. It went very well, just a few minor obstacles regarding the organization. When participants were supposed to wait outside the room of the robot before their turn, some of them decided to just leave [2] (some of the delay comes from this; I needed extra time to arrange new sessions). Some participants didn’t show up because they had forgotten our appointment [2]. Also, three participants didn’t show up because of mobility problems [1]. And a few participants decided to give up their participation because they believed I was trying to sell them the robot [2]. Alright, maybe a few obstacles so far, but the rest went quite alright.

Well, actually we had some small issues with the memory of the participants throughout the whole interaction with them* [2], but nothing to worry about. For instance, several participants didn’t remember their interaction with the robot well when they were supposed to fill in the questionnaires [2]. Others would initially refuse to complete the questionnaires because they thought they were finished with the experiment [2]. Also, some participants had eyesight problems* and could not read the questionnaires. In general the participants needed extra attention to understand how to fill in the questionnaires* and often would hand me the questionnaires incomplete*. I also had to be a bit careful during the interview, because many would tend to remember recent tragic events (e.g. the death of a relative) and would start to feel deeply affected*. Alright, seen retrospectively it might appear that many things went wrong with the data collection. Still nothing to worry about, [supervisor], the rest went on more smoothly.

You asked me also about the interaction between the participants and the robot. It went very well, they loved it! I just had to explain very often what the purpose of the study was* [2]; some would have forgotten it [2], whereas others would speak to the robot expecting its capabilities to be a bit too far beyond its actual ones* [2]. Well, now that I think about it, I also had the impression that a few participants were somewhat tense, at least more tense than younger adults would generally be in this kind of experiment*. OK, to be honest, perhaps that happened with about half of the participants. I wonder why. Maybe older adults tend to underestimate their own abilities more than younger adults do [3]. Or perhaps they felt stigmatized, as in thinking “I’m such a frail, lonely old person that they think I need to have a robot.”

I suppose that talking to them in English instead of in their native language only worsened this issue*.

Anyway, I still believe they liked it and that it was quite an enriching experience for them. Well, at least for those who could hear the robot*. Some participants had hearing impairments*. I just had to adjust the volume of the robot sometimes*, no big deal. Also, for some reason, a few participants seemed unable to understand the artificial voice of the robot while they would actually understand natural voices in the same language, at the same volume*. For the rest, the interaction with the robot went alright, except for two participants who fell asleep for a moment during the longest monologue of the robot*.

Alright, I know what you’re thinking. Perhaps quite a few bumps on the road indeed. But I still have faith in this study, and I assure you that you will be very happy with the results once the data are analysed; I still have one full day for this.

You asked me in your email to explain the reasons for the delays and the scarcity of participants. Well, the hindrances I mention above tripled the duration of the data collection compared with what I had initially planned, at which point one participant accidentally dropped the tablet, which was damaged beyond repair*. I then needed another week to resume the data collection and run the last sessions.

With respect to why I gathered so little data: well, perhaps, as you told me at the beginning, I was a little too bold when I planned this study for one hundred elderly participants. All those obstacles I mentioned during the data collection forced me to discard about two thirds of my sample. Also, very sadly, of the people who had agreed to participate, four fell sick and one passed away.

Answering another question you asked: of course, I understand that you want to give a good impression to the nursing home organization. Except for a couple of minor issues everything went well with them. Alright, perhaps I should tell you what happened. I explained to the nursing home organizers that I needed to give a debriefing to the participants to, among other things, make them aware that the robot was actually not autonomous, but teleoperated (WoZ)*. One organizer insisted that the participants could become severely disappointed (I can understand this, after a sort of friendship had in some cases been established between the robot and the participants) and threatened to take the issue to court. She dropped the issue eventually. Also, the three organizers insisted vehemently that I continue bringing the robot to the nursing home. They argued that by taking the robot away from the residents I would remove from their lives a source of recreation or even socialization.

You mentioned also that we needed results that are as generalizable as possible. I hope they are! Well, I don’t know exactly how to tackle the fact that the great majority of the participants are women. Can I generalize the results to both genders, or should I separate them? If I separate them, I will only have a handful of male participants, which would be a great statistical disadvantage. I thought also about what to propose for future research. I thought it would be a good idea to extend the study to younger participants.

However, I then realized that I can’t separate the effect of age from the effect of generation. That is, how can I know that the reactions of the elderly participants to the robot depend on the fact that they are old, and not on the different way (two generations ago) in which they were raised? Let’s take the Flynn effect as an example, which states that, at least for almost the whole of the last century, the average IQ of the population has steadily been increasing [4].

Finally you asked me about the biggest contribution of this study. Look, I don’t know... let’s face it, it’s been a disaster! I’ll give up trying to convince you that the study went well and that I have everything under control, because right now I’m just bracing for a collection of tough reviews. There’s no way we’ll be accepted!

But hey, next time it can only get better*. The biggest contribution? Well, I learned a lot!*

Best regards,
[your student]

III. CONCLUSION

Conducting HRI research with aged participants can prove daunting for the unprepared researcher. The higher prevalence of certain ailments in this age group makes it necessary to adapt the instruments and the methodology of the study. The particular social structure around many aged people must also be considered by the researcher, for example in cases where the participants depend on third persons.

I hereby apologize if any persons of advanced age felt offended reading the fictitious story. I personally acknowledge that this age group is as diverse as any other, and that not every elderly person possesses the characteristics of the fictitious participants depicted above.

I would like to remind the reader of the presence of asterisks on the last line of the story. Even though the story, the PhD student and the supervisor are fictitious, I did in reality conduct a first HRI study with elderly participants, from which I learned that one cannot overstate the importance of specific preparation.

ACKNOWLEDGMENT

Thanks to Rieks op den Akker, Michiel Joosse and Jered Vroon for their feedback on this work. Special thanks to my supervisors, Vanessa Evers and Manja Lohse, for their help and guidance throughout my research.

REFERENCES

[1] Iman Ridda, Richard Lindley, and Raina C. MacIntyre. Research: The challenges of clinical trials in the exclusion zone: The case of the frail elderly. Australasian Journal on Ageing, 27(2):61–66, 2008.

[2] Marcel Heerink, Ben Kröse, Bob Wielinga, and Vanessa Evers. Human-robot user studies in eldercare: Lessons learned. In Smart Homes and Beyond: ICOST 2006: 4th International Conference on Smart Homes and Health Telematics, volume 19, pages 31–38. IOS Press, 2006.

[3] J. Gallego-Perez, Manja Lohse, and Vanessa Evers. Robots to motivate elderly people: present and future challenges. In RO-MAN, 2013 IEEE, pages 685–690. IEEE, 2013.

[4] James R. Flynn. The mean IQ of Americans: Massive gains 1932 to 1978. Psychological Bulletin, 95(1):29, 1984.

Toy Robot versus Medical Device

Jordi Albo-Canalsa

aLa Salle, Ramon Llull University

Abstract— People who design social robots for therapy face a common dilemma. In those cases where, in addition to research about the benefits of using robots, a robotic product is also being developed, the research team faces the question of how the product should be certified: as a toy robot or as a medical device. In this paper, I introduce the decision criteria followed by roboticists I know, I comment on what we can find in the literature and on the internet, and I present a discussion of some considerations that researchers and developers should take into account.

Keywords— Toy, Robot, Medical Device, Legal, Certification

1. INTRODUCTION

Robotics is a research field that has recently reached the market, and all kinds of robots have started to become part of the consumer market. Some of these robots are being used in therapy and education. Up to now, education has been an easier environment, in terms of regulation, in which to deploy robots, so most of the robots used in schools, training centers, universities, etc., are certified as toy robots. The same happens with robotic platforms used to assist caregivers, physicians, therapists, or other professionals in taking care of their patients.

As this research involves multidisciplinary research groups, practitioners with different backgrounds, and different environments of application, the actors involved in the design of a robotic platform wonder, in those cases where the research results in a new robotic device, how the product should be certified.

If we take a look at the list of questions whose answers drive the decision about the product certification process, we have:

• Certifications?
• Time to market?
• Intellectual property?
• Applications?
• International regulations?

In the following sections we present what we can find in the literature, the news, the internet, etc., about this dilemma. We end the paper with a discussion of which paths developers tend to choose, and of what can be considered right or ethical.

2. WHAT WE CAN FIND OUT THERE

If we do a quick search on robots as medical or non-medical devices, we will easily run into the controversy about how a robot is, or should be, certified.

Maybe because robotics entered the market from the industrial sector, the first tool that we find to regulate the criteria for designing, and thus certifying, a robotic product such as a mobile robot, robotic wheelchair, or exoskeleton is ISO 13482:2014 [1]. As we can find in the literature in [2], [3], [4], or in [5], there are initiatives that are trying to cover these gaps in the regulation requirements for these kinds of devices.

If we move to the point that concerns us, medical versus non-medical robots, we will see that the policy is unclear not only from the manufacturer’s point of view, but also from the distributor’s perspective. To clarify this point, I present two obvious cases that we can find in the market: the robot PARO [6] and the robot ROMIBO [7].

PARO is a robot that is commercialized in the U.S. as a medical device, while in countries such as Spain it is commercialized as a non-medical device. In Fig. 1 we can find a reference to the PARO robot by the Wall Street Journal, together with what the Origami Robotics website (the developers of the robot) says about whether ROMIBO is a medical device or not.

Fig. 1. Article about the PARO robot, and how the ROMIBO robot is presented on the Origami Robotics website

According to [9] and the US regulation in [10], a given article is a medical device if it is intended either ”for use in the diagnosis, ... treatment, or prevention of disease,” or ”to affect the structure or any function of the body.”

Above, we can find two examples of, on the one hand, how a social pet robot is considered a different device type in different countries, and, on the other hand, how a developer separates the device itself from the way the device is used.

While in the PARO case there might simply be an inconsistent application of how the device should be accredited, the ROMIBO case is a clear example of a developer deciding to accredit the product as a toy robot in order to skip a tougher certification process.

3. DISCUSSION & CONCLUSIONS

How is the community acting on this? As we have seen in the previous examples, it is not only a matter of different regulations in different countries; it is also a trade-off between the time to market, the time and cost of going through certification, and the cost of developing the robotic device.

The most typical decision, when it is time to consider how the device will be certified, is to take the easiest and cheapest way and certify the robot as a toy-robot device. To do that, developers explain that they are selling a toy, but that at the same time it can be used as a support, facilitator, etc.: a tool that will improve the therapies conducted by the specialist, the quality of life of the children, and so on.

Although this can be the way to arrive on time to the market, keep the final cost affordable for the customers, and skip extra bureaucracy, it is also true that in those cases where there is a clear intention to put the robots in a medical environment this attitude can be considered unethical.

From an idealistic point of view, better regulations should be defined. At the same time, to be realistic, there is the option of creating different product lines, i.e., a toy robot as a general public device for playing, and a medical robot as a therapeutic device for treating people. Keepon [11] (see Fig. 2) is a social robot that follows this example: there is a Keepon oriented to children to play with, and there is a Keepon oriented to researchers.

Fig. 2. The Keepon Robot

REFERENCES

[1] http://www.iso.org/iso/catalogue_detail.htm?csnumber=53820
[2] Fosch-Villaronga, Eduard. ”Creation of a Care Robot Impact Assessment.”
[3] Weng, Yueh-Hsuan. ”The study of safety governance for service robots: on open-texture risk.” (2014).
[4] Boscarato, Chiara. ”Robotics, Innovation and the Law.” The Challenge of Innovation in Law: 221.
[5] Mitka, Eleftheria, and Spyridon G. Mouroutsos. ”Applying the STAMP system safety engineering methodology to the design of a domestic robot.” International Journal of Applied Systemic Studies 6, no. 1 (2015): 81-102.
[6] http://www.parorobots.com/
[7] http://origamirobotics.com/
[8] Pathak, Sant, Giorgio Metta, and Armando Tacchella. ”Is verification a requisite for safe adaptive robots?” In Systems, Man and Cybernetics (SMC), 2014 IEEE International Conference on, pp. 3399-3402. IEEE, 2014.
[9] Gamerman, Gary E. ”Intended Use and Medical Devices: Distinguishing Nonmedical Devices from Medical Devices under 21 USC 321(h).” Geo. Wash. L. Rev. 61 (1992): 806.
[10] http://www.fda.gov/downloads/RegulatoryInformation/Guidances/UCM127067.pdf
[11] http://www.mykeepon.com/

Which Perspectives of Using Exoskeletons in Activities for Daily Living?

Mohamed Bouri
Laboratory of Robotic Systems (LSRO), Swiss Federal Institute of Technology (EPFL), Lausanne, Switzerland (e-mail: [email protected])

Abstract— For many years, powered exoskeletons have been developed and some of them are already available for sale, mainly for rehabilitation purposes. People with neuromuscular disease, paraplegics and tetraplegics look to this technology with a lot of hope. Maybe one day they will buy this kind of device online, as many buy their glasses. This paper discusses the obstacles to the use of exoskeletons in activities of daily living (ADL). The first section describes exoskeletons and what they are used for. The second part presents some regulatory aspects related to their marketability. Finally, the third part of the paper discusses what is keeping exoskeletons from reaching the ADL market.

Keywords— Exoskeleton, orthosis, Activities for Daily Living, ADL, Rehabilitation, certification.

INTRODUCTION

Exoskeletons are motorized and instrumented devices. Their mechanical shape reproduces the anthropomorphic structure and skeleton of the human with which they are rigidly interfaced through dedicated components to transmit a closed-loop controlled movement. The objective of exoskeletons, both for upper limbs and lower limbs, is to assist or mobilize human limbs [1][2]. Figure 1 shows the best-known active exoskeletons on the market: the Ekso from Ekso Bionics (US), the Rewalk from Argo Medical Technologies (IL) and the REX from REX Bionics (NZ).

Figure 1. Three exoskeletons available in the market. Respectively from right to left: Rex, Rewalk and Ekso.

Recently, researchers, product developers, and market providers have extensively addressed locomotion. This is mainly explained by the fact that locomotion is associated with mobility, and lack of mobility means social absence. Moving in a vertical position, in comparison with being in a wheelchair, helps to improve one’s external image. It can give a feeling of parity with healthy people while sharing the social space. Furthermore, verticalization contributes to better managing the human urinary and digestive systems. This explains why, in addition to the pure aspect of mobilization, these kinds of devices are now mostly targeting rehabilitation applications.

For stroke patients, fortunately, the restorative chances of locomotion are quite high. However, for paraplegic and tetraplegic patients, the focus is on improving the functions related to gait synchronization, recruitment and strengthening of lower limb muscles and probably vestibular control [4], rather than total recovery of walking. For elderly people and other patients with muscle weaknesses [1], the use of robotized orthoses aims at improving the quality of life by extending the mobility options. These options provide more accessibility to daily tasks (standing, access to toilets, kitchen, etc.). Additionally, assisted locomotion prevents muscle atrophy and improves blood circulation. Unfortunately, exoskeletons for daily activities have not yet reached the market. Nonetheless the question remains: could we imagine someone buying an exoskeleton with a medical prescription, which defines their anthropomorphic lengths (lengths of the lower limb segments), or simply buying it online as many buy their glasses? Hopefully, this is what these communities are waiting for.

CERTIFICATION AND LEGAL PROCEDURES

Design and control of exoskeleton devices are carried out around a main issue: “avoid any injury to the wearer”. Adjustments related to the end user’s health and to the safety of the subjects are investigated through systematic and methodological risk analysis approaches. This is carried out in order to ensure the user’s safety from any potential hazard, avoid his/her falling and prevent any resulting injury [1][6][7]. Among the approaches available for risk assessment, we principally cite FMEA (Failure Modes and Effects Analysis), which is based on the identification of all the failures associated with the subsystems and components of the application. Failure modes, along with fault propagation, are analyzed and a solution is optimized to minimize the risk of injury for the user. The HAZOP technique (HAZard and OPerability study) and FTA (Fault Tree Analysis) are also used. All these methods are semi-empirical and rely on the developers’ expertise in identifying potential causes of injury in order to overcome unpredictable behaviors of the system and its subsystems.
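As a rough illustration of the kind of semi-empirical tabulation an FMEA produces (the failure modes and the 1-10 ratings below are purely hypothetical and not taken from any certified device), a risk priority number (RPN = severity x occurrence x detection) can be computed per failure mode and used to decide where mitigation effort should go first:

# Hypothetical FMEA-style tabulation for a lower-limb exoskeleton (illustrative data only).
# Ratings use the common 1-10 scales; RPN = severity * occurrence * detection.
failure_modes = [
    {"mode": "knee actuator stalls mid-swing", "severity": 9, "occurrence": 3, "detection": 4},
    {"mode": "hip joint encoder drifts", "severity": 6, "occurrence": 4, "detection": 5},
    {"mode": "battery undervoltage shutdown", "severity": 7, "occurrence": 2, "detection": 2},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Rank failure modes so that mitigation effort targets the highest risks first.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(fm["mode"], "-> RPN =", fm["rpn"])

Higher-RPN items would then drive design changes or additional safeguards before certification is attempted.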

Motorized exoskeletons are categorized with respect to risk. The UK Medicines and Healthcare products Regulatory Agency (MHRA) proposes four classes (I, IIa, IIb and III) with increasing levels of risk and control procedures [7]. The US Food and Drug Administration defines three categories (1, 2 and 3), also rated as low, medium and high levels of risk [6][7]. When used for physical therapy and rehabilitation with the assistance of a therapist, actuated orthotic devices are classified by the FDA as “Powered exercise equipment” (product code BXB) and are categorized as class 1 devices. This makes their certification easier and exempts them from the 510(k) premarket notification and premarket approval application processes. The REX, Ekso and Rewalk are all categorized as class 1 by the FDA. When used for ADL, they must be treated differently and the manufacturers are asked for 510(k) clearance. In Europe, orthotic devices (especially if combined with crutches) may also be declared by the manufacturers as class I, which leads to assessing compliance with the corresponding CE mark directives. However, the MHRA may adopt a different point of view and hence propose another classification, IIa or IIb. Regulatory class I, both in the US and in the EU, exempts the manufacturers from involving a notified body for certification, which considerably reduces the time to market of these devices. However, when the devices are intended for ADL (as powered prosthetic devices), they must conform to class II in the US and class IIa in the EU.

REHABILITATION OR ADL?

Manufacturers and providers of powered exoskeletons are taking the shortcut to the rehabilitation market because it is fast and simple in terms of legal procedures and helps to prepare the 510(k) premarket form. It allows them to place their devices in a controlled environment under the assistance of therapists. It is relevant to notice that activities of daily living would not require these exoskeletons to be designed much differently: the required assistance functions are already available in most of the currently existing devices, namely locomotion, standing and sitting. The function of climbing stairs is also more than appreciated. Because of safety considerations, the more complex the control strategies are (which is necessarily related to flexibility for different subjects’ disabilities), the more the device needs the assistance of another person, and the higher the time and financial investments of the exoskeleton manufacturers become. The Ekso is one of the most advanced devices in terms of locomotion control strategies, but it is still not available for personal use. This partly answers the following question:

Will exoskeletons be available for home use tomorrow?

The first obstacle concerns certification and legal issues. By legal issues we principally mean the regulatory authorizations that protect consumers. Home use of these devices also requires ease of wearability, autonomy and management of any ambiguous situation. For instance, what happens if the device faces a critical failure and stops moving (or stops assisting the wearer)? Even if the stop is safe, what happens after that? This explains why, even though the Rewalk (Fig. 1) is the first device that has obtained FDA approval for home use, it still needs the presence of a companion (husband, wife or a clinical companion) who helps manage some configuration aspects, helps with putting on and taking off the device, and assists the user if any critical situation occurs. This points out the second obstacle: the presence of another person is necessary, and no technological answer to this issue is currently available. The third issue is probably the social acceptance of these devices. For sure, social acceptance is related to cultural origins.

There is no comparative study of the social acceptance of exoskeletons around the world, but Japan could probably be considered the most accepting country when it comes to the presence of technology in daily life. In social acceptance, we may include our perception of and experience with surgical robots (invasive and non-invasive). Surgical robots have received wide acceptance in daily use by surgeons, from the certification authorities and from patients, as the protocols are well defined. Besides, in more than one case people trust robots for surgical operations because of their precision, efficiency and reduction of undesirable effects.

CONCLUSION

We may conclude with an interesting question recently asked by Rose Eveleth: “why many people seem more interested in hoisting someone out of their wheelchair than they are in making spaces accessible to that chair?” [8]. Exoskeletons have certainly not yet conquered our streets, and Rose’s question should remain in mind. Technology must be thought of as a help to improve the lives of human beings, not a source of their degradation. Researchers always have to take this aspect into consideration. The objectives of industry and business developers are probably more profit oriented. This is why foundations and associations are the guardians that protect our interests. Politicians and lawmakers should assume the role of setting rules for the sake of general well-being and the protection of people’s rights. We believe that exoskeletons are helpful for walking assistance, both in ADL and in rehabilitation applications. Workshops and conferences are key to promoting understanding and communication around this subject. Researchers from ETH Zurich have launched an international race of exoskeletons (www.cybathlon.com). There is nevertheless still work for technology providers, researchers and medical doctors to improve the access to and ease of use of these walking devices.

REFERENCES

[1] M. Dollar and H. Herr, “Lower Extremity Exoskeletons and Active Orthoses: Challenges and State-of-the-Art,” IEEE Transactions on Robotics, vol. 24, no. 1, pp. 144–158, Feb. 2008.
[2] M. R. Tucker, J. Olivier et al., “Control Strategies for Active Lower Extremity Prosthetics and Orthotics: A Review,” Journal of NeuroEngineering and Rehabilitation, January 2015.
[3] J. Olivier, A. Ortlieb, M. Bouri, H. Bleuler, “Mechanisms for Actuated Assistive Hip Orthoses,” Elsevier Journal of Robotics and Autonomous Systems, online October 2014.
[4] A. Ortlieb, J. Olivier, M. Bouri, H. Bleuler, T. Kuntzer, “From gait measurements to design of assistive orthoses for people with neuromuscular diseases,” ICORR, Singapore, 2015.
[6] H. Sutton, “Overview of Regulatory Requirements: Medical Devices,” FDA resources, 2014.
[7] European Commission, DG Health and Consumer, Directorate B, Unit B2 “Cosmetics and medical devices” - MEDDEV 2.4/1 Rev. 9, June 2010.
[8] R. Eveleth, “The Exoskeleton's Hidden Burden,” The Atlantic, August 2015.

Videos & demos

Ensuring Ethical Behavior from Autonomous Systems

Michael Andersona and Susan Leigh Andersonb

aUniversity of Hartford, Dept. of Computer Science
bUniversity of Connecticut, Dept. of Philosophy

Video: A paradigm of case-supported principle-based behavior (CPB) is proposed to help ensure ethical behavior of autonomous machines. We argue that ethically significant behavior of autonomous systems should be guided by explicit ethical principles determined through a consensus of ethicists.

Keywords: autonomous systems, machine ethics

Autonomous systems that interact with human beings require particular attention to the ethical ramifications of their behavior. A profusion of such systems is on the verge of being widely deployed in a variety of domains. These interactions will be charged with ethical significance and, clearly, these systems will be expected to navigate this ethically charged landscape responsibly. As correct ethical behavior not only involves not doing certain things, but also doing certain things to bring about ideal states of affairs, ethical issues concerning the behavior of such complex and dynamic systems are likely to exceed the grasp of their designers and elude simple, static solutions. To date, the determination and mitigation of the ethical concerns of such systems has largely been accomplished by simply preventing systems from engaging in ethically unacceptable behavior in a predetermined, ad hoc manner, often unnecessarily constraining the system's set of possible behaviors and domains of deployment. We assert that the behavior of such systems should be guided by explicitly represented ethical principles determined through a consensus of ethicists [1][2][3]. Principles are comprehensive and comprehensible declarative abstractions that succinctly represent this consensus in a centralized, extensible, and auditable way. Systems guided by such principles are likely to behave in a more acceptably ethical manner, permitting a richer set of behaviors in a wider range of domains than systems not so guided.

To help ensure ethical behavior, a system’s ethically relevant actions should be weighed against each other to determine which is the most ethically preferable at any given moment. It is likely that ethical action preference of a large set of actions will be difficult or impossible to define extensionally as an exhaustive list of instances and instead will need to be defined intensionally in the form of rules. This more concise definition is possible since action preference is only dependent upon a likely smaller set of ethically relevant features that actions involve. Given this, action preference can be more succinctly stated in terms of satisfaction or violation of duties to either minimize or maximize (as appropriate) each feature. We refer to intensionally defined action preference as a principle.

As it is likely that in many particular cases of ethical dilemmas ethicists agree on the ethically relevant features and the right course of action in many domains where autonomous systems are likely to function, generalization of such cases can be used to help discover principles needed for their ethical guidance. A principle abstracted from cases that is no more specific than needed to make determinations complete and consistent with its training can be useful in making provisional determinations about untested cases. If such principles are explicitly represented, they have the added benefit of helping justify a system’s actions as they can provide pointed, logical explanations as to why one action was chosen over another. Cases can also provide a means of justification for a system’s actions: as an action is chosen for execution by a system, clauses of the principle that were instrumental in its selection can be determined and, as clauses of principles can be traced to the cases from which they were abstracted, these cases and their origin can be ascertained and used as justification for a system’s action by analogy.

A principle that determines which of two actions is ethically preferable can be used to define a transitive binary relation over a set of actions that partitions it into subsets ordered by ethical preference with actions within the same partition having equal preference. This relation can be used to sort a list of possible actions and find the currently most ethically preferable action(s) of that list. This forms the basis of a case-supported principle-based behavior paradigm (CPB): a system decides its next action by using a principle, abstracted from cases where a consensus of ethicists is in agreement, to determine the most ethically preferable one(s).
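To make this ordering concrete, here is a minimal Python sketch of the idea (not the authors' system): a principle is represented as a comparator over actions described by their ethically relevant features, and the comparator is used to sort candidate actions and return the most preferable partition. The duty names, values and the simple lexicographic rule are illustrative assumptions only.

from dataclasses import dataclass
from functools import cmp_to_key

@dataclass
class Action:
    name: str
    duties: dict  # ethically relevant features: duty name -> satisfaction (+) / violation (-)

# Hypothetical principle abstracted from cases: prefer the action that better
# satisfies "prevent_harm"; break ties on "respect_autonomy". A lexicographic
# comparison keeps the relation transitive, as required for sorting.
def principle(a, b):
    for duty in ("prevent_harm", "respect_autonomy"):
        diff = a.duties.get(duty, 0) - b.duties.get(duty, 0)
        if diff != 0:
            return -1 if diff > 0 else 1  # more satisfaction sorts earlier
    return 0  # equal ethical preference: same partition

def most_preferable(actions):
    ranked = sorted(actions, key=cmp_to_key(principle))
    best = ranked[0]
    return [a for a in ranked if principle(a, best) == 0]

options = [
    Action("remind_patient_now", {"prevent_harm": 2, "respect_autonomy": -1}),
    Action("defer_reminder", {"prevent_harm": 0, "respect_autonomy": 1}),
    Action("notify_overseer", {"prevent_harm": 2, "respect_autonomy": -1}),
]
print([a.name for a in most_preferable(options)])  # the two harm-preventing actions tie

Because the comparator is transitive, sorting with it partitions the options into equivalence classes ordered by ethical preference, which is exactly the structure the CPB paradigm relies on.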

REFERENCES
1. Anderson, M. & Anderson, S. L., Machine Ethics: Creating an Ethical Intelligent Agent, Artificial Intelligence Magazine, 28:4, Winter 2007.
2. Anderson, M. & Anderson, S. L., Robot Be Good, Scientific American 303.4, 2010: 72-77.
3. Anderson, M. & Anderson, S. L., Toward Ensuring Ethical Behavior from Autonomous Systems: A Case-Supported Principle-Based Paradigm, Industrial Robot: An International Journal, 42:4 (2015), 324-331.

Bonnie: Developing a Very Special Friend

Robin Steffersa, Aaron Picab, Robin Scheicka, Peter van der Posta and Marcel Heerinka

aWindesheim Flevoland University of Applied Sciences, Robotics research group, Almere, The Netherlands
bLaSalle Ramon Llull University, Barcelona, Spain

Demo: We will show our prototype skeleton for a social robot that is suitable for therapy with hospitalized children. This demo will concern one construction with 3 main components: the main body, the interface (sensory and human control) and the firmware.

Keywords: social robot, therapy, Arduino, sensory stimulation, ARCAROS

CONCEPT

The concept of Bonnie is based on the endearing effect of baby orangutans. The embracing and clinging of a young monkey triggers a sense of care and an impulse to hold it close. To avoid an uncanny design [1], we chose not to create a human baby, but an animalistic form that still has some of the appeal of a human. An orangutan is more exotic and would raise lower expectations.

Based on interviews with caregivers in a children’s hospital we chose the following basic functionalities:

- Head can move sideways.
- Head can move up and down.
- Hands with grip.
- Haptic touch sense (vibrating hand).
- Arms can embrace.
- Remotely adjustable behaviour.
- Inclusion of sound effects. This needs more investigation.

INTERACTION SCENARIOS

To enable the arrangement of all the actuators in the desired disposition we produced a 3D-printed body which also houses all the electronic components and batteries (Figure 1). The objective of this prototype is to trigger a ‘sense of care’ and to combine the functionalities in meaningful scenarios, such as:

- Child touches the robot and triggers a corresponding movement. For example:
  o A touch triggers a hug.
  o Touching the belly triggers an approving nod.
  o Squeezing the hand will make it vibrate. This vibration will serve as pain relief when a child is punctured, similar to the effect of the Buzzy [2].
  o Touching Bonnie triggers a sound effect.
- Hands with grip, triggered by a touch sensor in the palm.

FIRMWARE

The system is controlled by centralized firmware. Its architecture is based on isolated blocks that can be added or modified easily without affecting other parts of the system. Moreover, the modules are structured in three layers (application, translation and hardware). This increases the modularity and scalability of the system, forming a platform (ARCAROS) able to create more Bonnie-like robots with other inputs and functionalities [3].
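As a rough illustration of this layering (the behaviour names and mappings below are hypothetical; the actual ARCAROS firmware runs on Arduino and is not reproduced here), a touch event can be routed through the three layers as follows:

# Illustrative three-layer structure (hypothetical names): the application layer
# decides the behaviour, the translation layer maps it to actuator commands, and
# the hardware layer would drive the real servos and vibration motor.

def application_layer(event):
    # High-level behaviour rules, e.g. a belly touch triggers an approving nod.
    rules = {"belly_touch": "nod", "hand_squeeze": "vibrate_hand", "back_touch": "hug"}
    return rules.get(event, "idle")

def translation_layer(behaviour):
    # Map abstract behaviours to low-level actuator commands (actuator, value).
    commands = {
        "nod": [("head_pitch", 20), ("head_pitch", 0)],
        "vibrate_hand": [("hand_vibration", 1)],
        "hug": [("left_arm", 60), ("right_arm", 60)],
        "idle": [],
    }
    return commands[behaviour]

def hardware_layer(command):
    actuator, value = command
    print("set", actuator, "->", value)  # placeholder for a real actuator driver

for cmd in translation_layer(application_layer("belly_touch")):
    hardware_layer(cmd)

Because each layer only talks to the one below it, a new sensor or behaviour can be added by extending a single table without touching the rest of the system.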

REFERENCES
1. Mori, M. (1970/2012), The uncanny valley (K.F. MacDorman & N. Kageki, Trans.). IEEE Robotics & Automation Magazine, 19(2), pp. 98-100.
2. Baxter, A.L., et al., An integration of vibration and cold relieves venipuncture pain in a pediatric emergency department. Pediatric Emergency Care, 2011, 27(12): pp. 1151-1156.
3. Johnson, B.D., 21st Century Robot, chapter 5 “The Brain”. Maker Media, Inc., 2014.

Figure 1. Bonnie augmented with sensors, actuators and Arduinos. Degrees of freedom: saying no or yes, and hugging. Not shown: pressure sensor (belly) and sound generator.

Remote Control Application for Therapeutic Use of a Social Robot

Vito Mahallaa, Peter van der Posta, Alex Barcob and Marcel Heerinka

aWindesheim Flevoland University of Applied Sciences, Robotics research group, Almere, The Netherlands
bLaSalle Ramon Llull University, Barcelona, Spain

Demo: A remote control that gives healthcare professionals the ability to influence the behavior of robots, such as Pleo, that will be used in therapeutic environments.

Keywords: remote control, Pleo, social robot

CONCEPT

Robots like a Pleo can be used with hospitalized children for distraction, preparing for treatment and evaluation of treatment. A remote control, such as a smart phone or tablet, gives the healthcare professional an unnoticed way to influence the behavior of the robot [3]. The remote control also collects sensor data which, in combination with the feedback from the robot, can be used to analyze the interaction.

Figure 1. Pleo interacting with a patient

INTERACTION SCENARIOS

Pleo is a robot whose behavior depends on the way you treat it. A patient feeling sad doesn’t need a Pleo that gets angry when the patient pets the Pleo a bit too hard. A healthcare professional adjusts the behavior in a way that benefits the patient. The remote control is also used to force certain behaviors at moments when the child is about to lose interest in Pleo.

INTERFACE DESIGN

Based on the needs of therapists [2,3], a user-friendly interface has been developed that is self-explanatory. The display contains facilities to configure the robot for an emotion (emotions), a mood (combined emotions), a behavior, a stage, a profile (type of child), and a child identity (behavior, emotion and mood for a particular child). These emotions, moods, behaviors, stages, profiles and child identities are programmed in order to enhance engagement between patient and robot.

FIRMWARE

A Pleo is extended with a Bluetooth receiver so that it communicates with mobile devices, such as an Android phone or tablet, equipped with Bluetooth [4]. The commands from the control are sent to the robot by means of a RESTful protocol [5]. The PLEO-rb Development Kit [6] makes it possible to creatively interact with PLEO-rb at the programming level to modify its behaviors and tweak an animation.
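As a rough illustration only (the endpoint, field names and values below are hypothetical and do not correspond to the actual PLEO-rb or remote control API), a configuration command sent over such a RESTful link could look like this:

import requests  # assumes a robot-side bridge exposes an HTTP/REST endpoint

ROBOT_URL = "http://192.168.0.42:8080"  # hypothetical address of the Bluetooth bridge

def set_robot_state(emotion, mood, behavior, profile):
    # Send one configuration command; the field names are illustrative only.
    payload = {
        "emotion": emotion,    # e.g. "happy"
        "mood": mood,          # combined emotions, e.g. "calm"
        "behavior": behavior,  # e.g. "gentle"
        "profile": profile,    # type of child, e.g. "shy"
    }
    response = requests.post(ROBOT_URL + "/state", json=payload, timeout=2)
    response.raise_for_status()
    return response.json()     # robot feedback, usable for interaction analysis

print(set_robot_state("happy", "calm", "gentle", "shy"))

The returned feedback could then be logged together with the collected sensor data to analyze the interaction afterwards.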

Figure 2. Interface design

FURTHER DEVELOPMENT

In the future the remote control will be able to support multiple mobile operating systems, like the iPhone OS. Another target is the ability to work with other robots. The remote control will also be able to display the data collected, e.g. the number of times Pleo is petted.

A cloud-based structure to enhance long-term engagement in a pet-robot companion treatment is also being developed. This enables further personal adaptation of each emotion, mood, etc. for each child, thus enhancing effective interaction. Kids will be able to see small differences between PLEOs and can feel that their robot is different from the rest [1].

REFERENCES
1. A. Barco, et al., Enhancing Long-term Children to Robot Interaction Engagement through Cloud Connectivity, 2015.
2. Bosse, T., et al., Emotion Modeling. 2014: Springer.
3. Munter, et al., Comaker Robotica – onderzoeksverslag, 2015-05-22.
4. F. Larriba, Providing a practical Bluetooth communication between Pleo RB and an Android device, 2014.
5. Douglas K. Barry, David Dick, Web Services, Service-Oriented Architecture and Cloud Computing – 2nd Edition, Morgan Kaufmann Publishers, 2013.
6. Pleo-rb Development Kit, Pleo World, visited 2 June 2015, http://www.pleoworld.com/pleo_rb/eng/pdk.php

Social Robots to Support Children with Diabetes: an Overview of the ALIZ-E Project

Tony Belpaeme1, Paul Baxter1, Joachim de Greeff1, James Kennedy1, Robin Read1, Bernd Kiefer2, Ivana Kruijff-Korbayová2, Valentin Enescu3, Georgios Patsis3, Hichem Sahli3, Bert Bierman4, Olivier Blanson Henkemans4, Rosemarijn Looije4, Mark Neerincx4, Raquel Ros Espinoza5, Alexandre Coninx5, Yiannis Demiris5, Antoine Hiolle6, Matthew Lewis6, Lola Cañamero6, Elettra Oleari7, Sara Bellini7, Marco Mosconi7, Clara Pozzi7, Francesca Sacchitelli7, Alberto Sanna7, Giulio Paci8, Giacomo Sommavilla8, Fabio Tesser8, Piero Cosi8, Rémi Humbert9

1 The Cognition Institute, Plymouth University, United Kingdom
2 Deutsches Forschungszentrum für Künstliche Intelligenz, Germany
3 Vrije Universiteit Brussel, Belgium
4 Organization for Applied Scientific Research, The Netherlands
5 Imperial College London, United Kingdom
6 University of Hertfordshire, United Kingdom
7 Fondazione Centro San Raffaele, Milan, Italy
8 National Research Council - ISTC, Italy
9 Aldebaran Robotics, Paris, France

Video URL: https://vimeo.com/111655200

Abstract. This video presents a brief overview of the ambition and results of the ALIZ-E project. ALIZ-E built social robots to support children with diabetes. The robots were evaluated across hospitals in Europe and served as a support tool for young children, to help children understand their condition and educate them about diabetes management. The video highlights the collaborations between the academics, medical staff, parents and - most importantly - the children.

Keywords: social robotics, healthcare, child-robot interaction.

INTRODUCTION

The ALIZ-E project was a 54-month European research project running between 2010 and 2014, involving an interdisciplinary team comprised of seven research institutes, one hospital and one medium-sized enterprise [1, 2]. The project aimed to contribute to the development of integrated cognitive systems capable of naturally interacting with young people in real-world situations, with a specific goal of supporting children engaged in a residential diabetes-management course. The goal of the project was to extend the science and technology behind long-term human-robot interaction. To achieve this, we addressed three related issues in developing interactive robots capable of sustaining medium- to long-term autonomous operation in real-world indoor environments. Firstly, ALIZ-E addressed how long-term experience can be acquired, so the robot could learn from its spatio-temporal experiences. Secondly, ALIZ-E addressed how a system can deal robustly with inevitable differences in quality in perceiving and understanding a user and her environment. To this end, ALIZ-E developed new methods for adaptively controlling how a system invokes and balances a hybrid ensemble of processing methods for perception, action and interaction. Thirdly, ALIZ-E addressed how a system can engage in an intersubjective interaction, using the potential anthropomorphisation of robots by the user. The long-term aim of the ALIZ-E project was to implement believable, long-term, social child-robot interaction.

RESULTS

Through dozens of studies and field trials, the project has shown that social robots have significant potential for motivating and educating young children. This can be used in educational environments, such as schools, but has significant potential in more targeted environments, such as hospitals, where children have to learn and acquire skills and where motivation is an important aspect of learning.

The creation of autonomous human-robot interaction is one of the greatest challenges faced in robotics. While encouraging progress was made in ALIZ-E, many of the more unstructured interactions still require the robot to be remotely controlled. A main obstacle to autonomous social robots appears to be perception: perceiving and correctly interpreting the social environment is as yet an unsolved problem.

REFERENCES
1. Belpaeme, T., et al. (2012) Multimodal Child-Robot Interaction: Building Social Bonds. Journal of Human-Robot Interaction, 1(2), 33-53.
2. www.aliz-e.org, sponsored by the European Commission, grant reference FP7-ICT-248116
