Emotion-sensitive Human-Computer Interaction (HCI): State of the art - Seminar paper

Caroline Voeffray
University of Fribourg

1700 Fribourg


ABSTRACT
Emotion-sensitive Human-Computer Interaction (HCI) is a hot topic. HCI is the study, planning and design of the interaction between users and computer systems. Indeed, much research today is moving in this direction. To attract users, more and more developers add an emotional side to their applications. To simplify HCI, systems must be more natural, efficacious, persuasive and trustworthy, and to achieve that a system must be able to sense and respond appropriately to the user's emotional states. This paper presents a short overview of existing emotion-sensitive HCI applications. It focuses on what is done today and brings out the most important features to take into account for emotion-sensitive HCI. The most important finding is that applications can help people feel better with the technologies used in daily life.

General Terms
Emotion in HCI

Keywords
Emotion in HCI, Affective HCI, Affective technology, Affective systems

1. INTRODUCTION
First, affective applications must collect emotional data through facial recognition, voice analysis and the detection of physiological signals. Then the system interprets the collected data and must adapt to the user.

In the past, the study of usability and the study of emotions were separate, but the area of human-computer interaction has evolved: today emotions take more space in our lives, and affective computing has appeared. Affective computing is a popular and innovative research area

    Seminar emotion recognition : http://diuf.unifr.ch/main/diva/teaching/seminars/emotion-recognition

mainly in artificial intelligence. Affective computing consists of recognizing, expressing, modeling, communicating and responding to emotions [4]. Research in this domain contributes to many different fields such as aircraft piloting, professions like lawyers or the police, anthropology, neurology, psychiatry and behavioral science, but this paper focuses especially on HCI. In this paper we approach different elements of emotion-sensitive HCI. Firstly, we briefly discuss the role of emotions and how they can be detected. Then we explain the research areas related to affective computing. Finally, we make a state-of-the-art survey of existing affective applications.

2. ROLE OF EMOTIONS
Emotions are an important factor of life and they play an essential role in understanding users' behavior in computer interaction [1]. In addition, emotional intelligence plays a major role in measuring aspects of success in life [2]. Recent research in human-computer interaction does not focus only on the cognitive approach, but on the emotional part too. Both approaches are very important; indeed, taking the user's emotions into account solves some important aspects of the design of HCI systems. Additionally, human-machine interaction could be better if the machine could adapt its behavior to the user; such a system is seen as more natural, efficacious, persuasive and trustworthy by users [2]. To the question "Which is the connection between emotions and design?", the answer is that "Our feelings strongly influence our perceptions and often frame how we think or how we refer to our experiences at a later date; emotion is the differentiator in our experience" [1]. In our society, positive and negative emotions influence the consumption of products, and to understand the decision process it can be crucial to measure emotional expression [1]. Emotions are an important part of our lives, which is why affective computing was developed. As stated in [3], the goal of affective computing is to develop computer-based systems that recognize and express emotions in the same way humans do.

In human-computer interaction, nonverbal communication plays an important role: we can identify the difficulties that users stumble upon by measuring the emotions in their facial expressions [1]. In [3], a number of studies are discussed that have investigated people's reactions and responses to computers designed to be more human-like. Several studies have reported a positive impact of computers that were designed to flatter and praise users when they did something right. With these systems, users have a better opinion of themselves [3].

By contrast, we have to be careful because the same expression of an emotion can have a different signification in different countries and cultures. A smile in Europe is a sign of happiness, pleasure or irony, but for Japanese people it could simply imply agreement with an applied punishment, or be a sign of indignation towards the person applying the punishment [1]. These cultural differences should make us aware that the same affective system may give different results in different countries.

3. EMOTIONAL CUES
To understand how emotions affect people's behavior, we must understand the relationship between different cues such as facial expressions, body language, gestures, tone of voice, etc. [3]. Before creating an affective system, we must examine how people express emotions and how we perceive other people's feelings.

To have an intelligent HCI system that responds appropriately to the user's affective feedback, the first step is that the system must be able to detect and interpret the user's emotional states automatically [2]. The visual channel (e.g., facial expressions) and the auditory channel (e.g., vocal reactions) are the most important features in the human recognition of affective feedback. But other elements need to be taken into account, for example body movements and physiological reactions. When a person judges the emotional state of someone else, they rely mainly on facial and vocal expressions. However, some emotions are harder to differentiate than others and require considering other types of signals such as gestures, posture or physiological signals. A lot of research has been done in the field of face and gesture recognition, especially to recognize facial expressions. Facial expressions can be seen as communicative signals or as expressions of emotions [6], and they can be associated with basic emotions like happiness, surprise, fear, anger, disgust or sadness [1].
Another tool to detect emotions is emotional speech recognition. In the voice, several factors can vary depending on emotions, such as pitch, loudness, voice quality and rhythm.
In human-computer interaction, we can also detect emotions by monitoring the nervous system, because for some feelings the physiological signs are very marked [1]. We can measure blood pressure, skin conductivity, the rate of breathing or finger temperature. A change in physiological signals means a change in the user's behavior. Unfortunately, physiological signals play a secondary role in the human recognition of affective states; these signals are neglected because to detect someone's clamminess or heart rate, we would need physical contact with the person. The analysis of this tactile channel is harder because the person must be wired to collect data, which is usually perceived as uncomfortable and unpleasant [2]. Several techniques are available to capture these physiological signs: an electromyogram (EMG) for evaluating and recording the electrical activity produced by muscles, an electrocardiogram to measure the activity of the heart with electrodes attached to the skin, or skin conductance sensors [6].
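As a minimal illustration of the vocal channel described above, the following Python sketch computes two crude prosodic features from one audio frame: short-time energy (a stand-in for loudness) and a zero-crossing pitch proxy. The helper name and the feature choices are our own assumptions; real systems use dedicated pitch trackers.

```python
import math

def prosody_features(frame, sample_rate=16000):
    """Rough prosodic cues from one audio frame (hypothetical helper).

    Returns (energy, pitch_estimate): energy approximates loudness,
    and the zero-crossing count approximates the fundamental frequency
    for clean periodic signals only.
    """
    # Short-time energy: mean of squared samples.
    energy = sum(s * s for s in frame) / len(frame)
    # Count sign changes; a periodic signal crosses zero twice per cycle,
    # so crossings / 2 over one second approximates the frequency.
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0))
    pitch_estimate = crossings * sample_rate / (2 * len(frame))
    return energy, pitch_estimate

# A pure 200 Hz tone sampled at 16 kHz for one second.
tone = [math.sin(2 * math.pi * 200 * n / 16000) for n in range(16000)]
energy, pitch = prosody_features(tone)
```

For this synthetic tone the pitch estimate lands close to 200 Hz; on real speech, pitch, loudness, voice quality and rhythm would all be tracked over many frames before any emotional interpretation.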

All these methods of collecting data are very useful, but they do not always seem easy to use. Firstly, the available technologies are restrictive, and some parameters must absolutely be taken into account to obtain valid data: we must differentiate users by gender, age, socio-geographical origin, physical condition or pathology. Moreover, we need to remove the noise from collected data, such as unwanted sound, images shot against the light, or physical characteristics like a beard, glasses, a hat, etc. Secondly, to collect physiological signals we must use somewhat intrusive methods like electrodes, a chest strap or a wearable computer [6].
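Two of the preprocessing steps mentioned above, removing noise and compensating for differences between users, can be sketched as a moving-average filter plus a per-user baseline normalization. This is an illustrative assumption of how such cleaning might look, not a method taken from the cited papers.

```python
def moving_average(signal, window=5):
    """Simple noise smoothing: replace each sample by the mean of its
    neighbourhood (window is clipped at the signal edges)."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half): i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def normalize_to_baseline(signal, baseline):
    """Express samples relative to a user's resting baseline (z-score),
    so readings from users with different physiology become comparable."""
    mean = sum(baseline) / len(baseline)
    var = sum((s - mean) ** 2 for s in baseline) / len(baseline)
    std = var ** 0.5 or 1.0  # guard against a flat baseline
    return [(s - mean) / std for s in signal]

smoothed = moving_average([0, 0, 10, 0, 0])          # the spike is damped
relative = normalize_to_baseline([3, 4], [2, 4])     # scores vs. resting state
```

In practice each user would record a short resting-state baseline before interaction, and the normalized signal, rather than raw sensor values, would be fed to the emotion detector.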

4. EMOTION-SENSITIVE APPLICATIONS
4.1 Research areas
The automatic recognition of human affective states can be used in many domains besides HCI. In fact, the assessment of emotions like annoyance, inattention or stress can be highly valuable in some situations. Affect-sensitive monitoring done by a computer could provide prompts for better performance, especially for jobs like aircraft pilot and air traffic controller, nuclear power plant surveillance, or any job where a vehicle is driven. In these professions, attention to a crucial task is essential. For lawyers, police officers or security agents, monitoring and interpreting affective behavioral signs can be very useful; this information could help in critical situations, for example to assess the veracity of testimonies. Another area where the computer analysis of human emotion can be of benefit is the automatic affect-based indexing of digital visual material. Detection of pain, rage and fear in scenes could provide a good tool for violent-content-based indexing of movies, video material and digital libraries. Finally, machine analysis of human affective states would also facilitate research in anthropology, neurology, psychiatry and behavioral science. In these domains, sensitivity, reliability and precision are recurring problems, and this kind of emotion recognition can help them to advance in their research [2].

In this paper we focus on HCI, and one of the most important goals of HCI applications is to design technologies that can help people feel better, for example an affective system that can calm a crying child or prevent strong feelings of loneliness and negative emotions [3].

A domain where recognition of emotions in HCI is widely used is the evaluation of interfaces. The design of an interface can strongly influence the emotional impact of the system on its users [3]. The appearance of an interface, such as its combination of shapes, fonts, colors, balance, white space and graphical elements, determines the user's first feeling. Moreover, it can have a positive or a negative effect on people's perception of the system's usability. For example, with good-looking interfaces users are more tolerant because the system is more satisfying and pleasant to use: if the waiting time to load a website is long, with a good-looking interface the user is prepared to wait a few more seconds. On the contrary, computer interfaces can cause user frustration and negative reactions like anger or disgust. This happens when something is too complex, when the system crashes or does not work properly, or when the appearance of the interface is not adapted. Therefore the recognition of emotions is important to evaluate the usability and the interface of applications. Emotions felt by users play an important role in the success of a new application.

Another big research area is the creation of intelligent robots and avatars that behave like humans [3]. The goal of this domain is to develop computer-based systems that recognize and express emotions like humans. There are two major sectors: affective interfaces for children, like dolls and animated pets, and intelligent robots developed to interact with and help people.
Safe driving and soldier training are other examples of emotional research areas. Sensing devices can be used to measure different physiological signals to detect stress or frustration. For drivers, panic and sleepiness levels can be measured, and if these signals are too high the system alerts the user to be more careful or advises stopping for a break. The physiological data from soldiers can be used to design a better training plan without frustration, confusion or panic [6]. Finally, sensing devices can also be used to measure the body signals of patients receiving tele-home health care; collected data can be sent to a doctor or a nurse for further decisions [6], [5].

4.2 Applications
In this section, we make a state of the art of the different types of existing affective applications.

4.2.1 Emoticons and companions
The evolution of emotions in technology is explained in [3]. The first expressive interfaces were designed with emoticons, sounds, icons and virtual agents. For example, in the 1980s and 1990s, when an Apple computer was booted the user saw the happy Mac icon on the screen. The meaning of this smiling icon was that the computer was working correctly. Moreover, the smile is a sign of friendliness and may encourage the user to feel good and smile back.

An existing technique to help users is the use of friendly agents in the interface. Such a companion helps users feel better and encourages them to try things out. An example is the infamous Clippy, the paper clip with human-like qualities that Microsoft shipped as part of the Windows 98 operating system. Clippy appears on the user's screen when the system thinks that the user needs help with some task [3].

Later, users also found ways to express emotions through the computer by using emoticons: combinations of keyboard symbols that simulate facial expressions and allow users to express their feelings and emotions [3].

4.2.2 Online chat with animated text
Research has demonstrated that animated text is effective at conveying a speaker's tone of voice, affection and emotion. Therefore, an affective chat system that uses animated text associated with emotional information is proposed in [9]. This chat application focuses only on text messages because text is simpler than video, the size of the data is smaller, and it raises fewer privacy concerns. A GSR (Galvanic Skin Response) sensor was attached to the middle and index fingers of the user's non-dominant hand to collect the necessary affective information, and an animated text is created according to these data. With this method, the peaks and troughs of the GSR data are analyzed to detect emotions in real time, and the result is applied to the online chat interface. Because it is difficult to obtain valence information from physiological sensors, the user specifies the animation tag for the type of emotion manually, whereas the physiological data are used to detect the intensity of the emotion. Twenty types of animations are available to change the speed, size, color and interaction of the text. The user specifies the emotion with a tag before the message. Through this animated chat, a user can determine the affective state of his or her partner.
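The peak-based intensity analysis described above can be sketched in Python as follows. The peak definition, threshold and intensity mapping here are invented for illustration and are not the actual algorithm of [9].

```python
def gsr_intensity(samples, threshold=0.05):
    """Count peaks in a GSR trace and map them to a 0-1 animation intensity.

    A 'peak' here is any sample exceeding both neighbours by more than
    `threshold`; the five-peaks-to-full-intensity mapping is a made-up
    illustration of how arousal could drive the text animation.
    """
    peaks = [
        i for i in range(1, len(samples) - 1)
        if samples[i] - samples[i - 1] > threshold
        and samples[i] - samples[i + 1] > threshold
    ]
    # More frequent skin-conductance peaks -> stronger arousal ->
    # bigger / faster text animation, capped at full intensity.
    intensity = min(1.0, len(peaks) / 5.0)
    return peaks, intensity

# A short trace with two conductance spikes.
trace = [0.1, 0.1, 0.4, 0.1, 0.1, 0.5, 0.1, 0.1]
peaks, intensity = gsr_intensity(trace)
```

The manually chosen emotion tag would then select which of the twenty animation types to apply, while this intensity value scales its speed, size or color.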

4.2.3 Interactive Control of Music
In [8] we discover an interface that lets users mix pieces of music by moving their body in different emotional styles. In this application, the user's body is part of the interface. The system analyzes body motions and produces, in real time, a mix of music that represents the expressed emotions. Machine learning is used to classify body motions into emotions. This approach gives very satisfactory results. To be...
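As a rough illustration of classifying motion into emotional styles, here is a toy nearest-centroid sketch. The two features (average speed, movement extent), the labels and the centroid values are all invented and far simpler than the trained machine-learning model used in [8].

```python
# Hand-crafted class centroids in a 2-D feature space:
# (average speed, movement extent), both normalized to 0-1.
CENTROIDS = {
    "happy": (0.8, 0.9),   # fast, expansive movements
    "sad":   (0.2, 0.3),   # slow, contracted movements
    "angry": (0.9, 0.5),   # fast but less expansive, jerky movements
}

def classify_motion(speed, extent):
    """Assign the emotion whose centroid is closest to the observed features."""
    def squared_distance(label):
        cs, ce = CENTROIDS[label]
        return (speed - cs) ** 2 + (extent - ce) ** 2
    return min(CENTROIDS, key=squared_distance)

slow_label = classify_motion(0.1, 0.2)    # slow, small movements
fast_label = classify_motion(0.85, 0.95)  # fast, expansive movements
```

A real system would extract many more features from motion-capture or camera data and learn the class boundaries from labeled examples instead of fixing centroids by hand.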

