
Affective Computing: a brief introduction to what affective computing is, followed by an overview of its current state.


Page 1: Affective computing

Affective Computing

Saumya Srivastava, M.Tech (HCI)

Page 2: Affective computing

Introduction

Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects.

- Rosalind Picard

Originated with Rosalind Picard's 1995 paper on "Affective Computing".

Motivation for research: the ability to simulate empathy, i.e. the machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions. (Video: Technology to Measure Emotions)

It is empirical research motivated by the theoretical foundations of psychology and neuroscience. (Video: Emotional Technology)

- Eva Hudlicka (author of "Affective Computing: Theory, Methods, and Applications")

Page 3: Affective computing

Objective

To develop computing devices with the capacity to gather cues to user emotion from a variety of sources; in simple words, to produce "emotion-aware machines".

Facial expression, posture, gesture, speech, the force or rhythm of keystrokes, and the temperature change of the hand on a mouse can all signify changes in the user's emotional state, which a computer can detect and interpret.

There exists a limitless range of applications, for example:

E-Learning: the tutor expands an explanation when the user is found to be confused, or adds information when the user appears curious.

E-Therapy: providing psychological health services (i.e. online counseling) that reveal the emotional state as in a real-world session. Through affective computing, the patient's real-world posture, facial expression, and gestures lead to a more accurate evaluation of their psychological state.

Page 4: Affective computing

PSYCHOLOGICAL THEORIES OF EMOTION

Ekman, Friesen and Ellsworth (1972) categorized emotions into six groups, namely fear, surprise, disgust, anger, happiness, and sadness, all of which can be expressed facially.

Ekman and Friesen (1978) developed the Facial Action Coding System (FACS), which uses muscle movements to quantify emotions.

According to Ekman, every primary emotion has an adaptive value irrespective of individual and culture.

An automated version of FACS was given by Bartlett et al. (1999).

Later, Plutchik (1980) argued for 8 basic emotions, arranged in opposing pairs, which can be combined to produce secondary emotions. Drawback: one was forced to choose among the 8 basic emotions.

Russell instead proposed a variation in 2 dimensions, i.e. Valence (X-axis) and Arousal (Y-axis), e.g. Happy + Content = Pleasure / Displeasure:

Happy   Content   Outcome
true    true      Pleasure
true    false     Displeasure
false   true      Displeasure
false   false     Displeasure

[3]
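To make the two-dimensional model concrete, here is a minimal Python sketch (the quadrant labels and thresholds are illustrative assumptions, not from the slides) that maps a valence/arousal reading to a coarse emotion label and implements the Happy + Content table above.

```python
def quadrant_emotion(valence: float, arousal: float) -> str:
    """Map a point in Russell's 2-D space to a coarse quadrant label.

    Valence and arousal are assumed normalized to [-1, 1]; the
    quadrant labels are illustrative, not a standard taxonomy.
    """
    if valence >= 0:
        return "excited/happy" if arousal >= 0 else "content/relaxed"
    return "angry/afraid" if arousal >= 0 else "sad/bored"


def happy_content_outcome(happy: bool, content: bool) -> str:
    """The table above: only Happy AND Content yields Pleasure."""
    return "Pleasure" if happy and content else "Displeasure"


print(quadrant_emotion(0.7, 0.4))          # excited/happy
print(happy_content_outcome(True, False))  # Displeasure
```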

Page 5: Affective computing

PSYCHOLOGICAL THEORIES OF EMOTION (Continued)

Figure: "The Emotion Wheel" by Plutchik. Its labels include the eight primary emotions (joy, acceptance, fear, surprise, sadness, disgust, anger, anticipation) and the secondary emotions formed from adjacent pairs (optimism, love, submission, awe, disappointment, remorse, contempt, aggressiveness).

[3]

Page 6: Affective computing

COMPONENTS OF EMOTIONS

Subjective experience (feeling of fear and so on).

Physiological changes in the Autonomic Nervous System (ANS) and the Endocrine System (glands and the hormones they release), e.g. trembling with fear, which occurs before we can consciously control it.

Behavior evoked (such as running away or fainting due to fear)

[3]

Page 7: Affective computing

SOME THEORIES

JAMES-LANGE THEORY: Introduced in 1890 by James and Lange. Argues that action precedes emotion (the brain interprets the bodily reaction as the emotion).

e.g. something scary moves towards us → our pulse starts rising → we interpret our bodily state → we are afraid (fear).

Flow: perception of emotion-arousing stimulus → visceral and skeletal changes → interpretation (with a feedback loop).

[3]

Page 8: Affective computing
Page 9: Affective computing

SOME THEORIES (Continued)

CANNON-BARD THEORY: Introduced in the 1920s by Cannon and Bard. Argues that the emotion-arousing stimulus triggers the experience of emotion and the bodily changes simultaneously, rather than one causing the other.

Flow: perception of emotion-arousing stimulus → signals sent to the cortex (experience of emotion) and, in parallel, signals sent to the hypothalamus (physiological/bodily changes).

[3]

Page 10: Affective computing
Page 11: Affective computing

SOME THEORIES (Continued)

SCHACHTER-SINGER THEORY: Introduced in 1962. The adrenaline experiment: participants were injected with adrenaline but told it was a vitamin injection, to test whether their vision would be affected.

Group A: given accurate information about the side effects (sweating, tremor, feeling jittery).
Group B: given false information.
Group C: told nothing.
Group D: injected with saline (a placebo with no side effects).

Results: 1. Groups A and D did not report feeling emotionally involved. 2. Groups B and C took on the emotional state of the people around them.

Criticism: 1. There was less ambiguity about what was happening. 2. Emotion is not just based on bodily arousal but also depends on past experience and the source of information. 3. An unexplained state of arousal leads to a negative experience.

[3]

Page 12: Affective computing

SOME THEORIES (Continued)

Flow: perception of emotion-arousing stimulus → thalamus sends impulses to the cortex → physiological (bodily) changes → awareness of the physiological arousal → interpreting the arousal as a particular emotion, given the context.

Further, Lazarus (1982) performed experiments on "cognitive labeling" and proposed the notion of "cognitive appraisal". According to this theory, evaluation of the situation precedes the affective reaction.

Zajonc (1984) argues that the emotional response precedes cognitive processing.

[3]

Page 13: Affective computing
Page 14: Affective computing

Areas of Affective Computing

Detecting Emotional Information (basic capabilities in a computer to discriminate emotions)

Input: gathering a large variety of input signals, e.g. face, hand gestures, posture, gait, respiration, electrodermal response, ECG, temperature, blood pressure, blood volume, electromyogram*.

Pattern Recognition: feature extraction and classification of the signals, e.g. analysis of video motion features to discriminate a frown from a smile. (A minimal classification sketch follows at the end of this slide.)

Reasoning: predicting the underlying emotion, based on knowledge of how emotions are generated and expressed.

Learning: learning the factors that shape an individual's emotions, which helps the system recognize that person's emotions better.

Bias: if the system itself has emotions, recognizing ambiguous emotions becomes easier.

Output: the recognized expression and the likely underlying emotion.

* A test that measures the electrical activity of the muscles.

[3]
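As a concrete illustration of the input, feature-extraction, and classification steps listed above, here is a minimal sketch using synthetic physiological features and an off-the-shelf classifier. It assumes NumPy and scikit-learn are available; the feature names, values, and two-class setup are invented for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for the input signals listed above:
# [heart_rate, skin_conductance, respiration_rate, emg_activity]
n = 400
calm = rng.normal([70, 2.0, 14, 0.2], [5, 0.5, 2, 0.1], size=(n, 4))
stressed = rng.normal([95, 6.0, 20, 0.8], [8, 1.0, 3, 0.2], size=(n, 4))

X = np.vstack([calm, stressed])
y = np.array([0] * n + [1] * n)  # 0 = calm, 1 = stressed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Feature scaling plus a classifier: the "pattern recognition" step.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```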

Page 15: Affective computing

Areas of Affective Computing(Continued)

Recognizing Emotional Information

We need to develop systems that moderate their responses to user frustration, stress, or anxiety once the computer recognizes the emotion.

Exception: this concept cannot be implemented directly in tele-healthcare. Lisetti et al. (2003) designed an application in this regard to resolve the issue.

It helps communication between patient and clinician. Results:

90% success recognizing SADNESS
80% success recognizing FEAR
80% success recognizing ANGER
70% success recognizing FRUSTRATION

Wearable Sensors & other Devices

Embodied Avatars

[3]

Page 16: Affective computing

Areas of Affective Computing(Continued)

AFFECTIVE WEARABLES: Sensors and tools can be used to recognize affective patterns, but these tools require a lot of attention and maintenance.

Figure : Wearer's blood volume pulse measured using photoplethysmography

Figure : Sampling and transmitting biometric data to a larger computer for analysis

[3]
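To illustrate the photoplethysmography figure above: a wearable of this kind can estimate pulse by detecting peaks in the blood-volume signal. A minimal sketch with a synthetic waveform follows (SciPy assumed available; real sensor data is far noisier and needs filtering first).

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)

fs = 50                       # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # ten seconds of signal
bpm_true = 72

# Synthetic blood-volume-pulse waveform: one peak per heartbeat plus noise.
ppg = np.sin(2 * np.pi * (bpm_true / 60) * t) + 0.1 * rng.normal(size=t.size)

# Require peaks to be at least ~0.4 s apart (i.e. below 150 bpm).
peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), height=0.5)

duration_min = t[-1] / 60
print(f"estimated heart rate: {len(peaks) / duration_min:.0f} bpm")
```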

Page 17: Affective computing

Areas of Affective Computing (Continued): Expressing Emotional Information

Why computers need to express emotions:

1. Computers that express emotions can improve the quality and effectiveness of communication between people and technology.
2. How can people communicate with a computer in a way that lets them express their emotions?
3. How can technology stimulate and support new modes of affective communication between people?

Efforts made:

1. Schiano and her colleagues (2000) tested an early prototype of a simple robot. Drawback: it had no emotions.

2. An experimental application at MIT, the 'Relational Agent' (Bickmore, 2003), was designed to sustain long-term relationships. The agent expressed emotions. Drawback: it did not convincingly convey real 'feelings'.

3. By contrast, 'Kismet', an expressive robot at MIT, is equipped with auditory and proprioceptive (touch) sensory inputs. Kismet can express emotion through vocalization, facial expression, and adjustment of gaze direction and head orientation.

[3]

Page 18: Affective computing

Areas of Affective Computing (Continued): Expressing Emotional Information

Figure : MS Office Assistant
Figure : Kismet robot

Evolution over the years

[3]

Page 19: Affective computing

What has been done?

Emotion recognition and synthesis have been the focus of many FP5, FP6, and FP7 projects.

Starting from ERMIS (emotion-aware agents) → HUMAINE (a network of excellence) → CALLAS (emotion in art and entertainment).

What can be done?

Add observable manifestations which provide cues about the user's subjective experience.

A smile may indicate successful completion of a task, or retrieval of what the user was looking for…

…instead of a cryptic "retry" button or asking the user to verify results.

People may frown to indicate displeasure or difficulty reading, nod to agree, shrug their shoulders when indifferent, etc.

[5]

Page 20: Affective computing

How can this be done? We can recognize:

Facial features and cues
Head pose / eye gaze (to estimate attention)
Hand gestures (usually a fixed vocabulary of signs)
Directions and commands (usually a fixed vocabulary)
Anger in speech (useful in call centers)

Affective Interactions

When computers can sense affective cues (a sketch of these rules as code follows after the list):

The user cannot read text off the screen and frowns or approaches the screen? Redraw the text with a larger font!

A call-centre user is angry? Redirect to a human operator!

Users are not familiar with, or cannot use, the mouse and keyboard? Spoken commands and hand gestures are another option!

Users are not comfortable with on-screen text? Use virtual characters and speech synthesis!

[5]
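The adaptations above are essentially condition-to-action rules. Here is a minimal sketch of such a dispatcher; the cue names mirror the bullets, while the detection of the cues themselves (facial analysis, speech analysis, and so on) is assumed to happen elsewhere.

```python
# Condition -> action rules, mirroring the bullets above.
ADAPTATIONS = {
    "frown_and_lean_in": "redraw text with a larger font",
    "anger_in_speech": "redirect the call to a human operator",
    "cannot_use_mouse_keyboard": "enable spoken commands and hand gestures",
    "uncomfortable_with_text": "switch to a virtual character with speech synthesis",
}


def adapt(detected_cues: list[str]) -> list[str]:
    """Return the interface adaptations triggered by the detected cues."""
    return [ADAPTATIONS[cue] for cue in detected_cues if cue in ADAPTATIONS]


print(adapt(["frown_and_lean_in", "anger_in_speech"]))
```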


Page 21: Affective computing

Rosalind Picard, in the volume "Affective Computing & Intelligent Interaction", includes a research paper on "Expressive Face Animation Synthesis Based on Dynamic Mapping Method" [1], which presents a SPEECH-DRIVEN FACE ANIMATION SYSTEM WITH EXPRESSIONS.

Up till now…

Work had been done on lip movement, resulting in inaccuracy and discontinuity.

In speech recognition systems, E. Yamamoto built a phoneme recognition model using a Hidden Markov Model (HMM), directly mapping phonemes to lip shapes.

Drawback: each phoneme set is tied to a language, and since phonemes vary from person to person and region to region, efficiency degraded.
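A rough sketch of this kind of HMM-based mapping follows, using the third-party hmmlearn library. The features, the number of hidden states, and the state-to-lip-shape table are all invented for illustration and are not taken from Yamamoto's system.

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

# Synthetic stand-in for acoustic feature vectors (e.g. MFCC-like frames).
X = rng.normal(size=(300, 6))

# One hidden state per coarse mouth configuration (an assumption).
model = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
model.fit(X)

# Decode the most likely state sequence, then map states to lip shapes.
states = model.predict(X)
LIP_SHAPES = {0: "closed", 1: "open", 2: "rounded", 3: "wide"}
print([LIP_SHAPES[s] for s in states[:10]])
```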

SPEECH DRIVEN FACE ANIMATION

Flow: audio stream → sequence of corresponding face movements

(e.g. multimodal HCI, virtual reality, video phones)

Current State of the Art

[1]

Page 22: Affective computing

Progress: To overcome the drawback, neural networks were then used for audio-visual mapping, together with the Gaussian Mixture Model (GMM). (Demonstration 1) (Demonstration 2)

The GMM, together with a joint probability distribution, explains the relation between neutral facial deformation and expressive facial deformation.

Result: an encouraging quantitative evaluation, with the synthesized face showing realistic quality.

Flow: non-verbal information + verbal information → set of messages conveying the speaker's emotional state

Current State of the Art (Continued)

[1]
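The GMM-based mapping can be sketched as follows: fit a mixture model on joint (audio, visual) feature vectors, then, for a new audio value, take the expected visual value under the conditional distribution (standard GMM regression). This minimal illustration uses scikit-learn and synthetic one-dimensional features; the real system works with much higher-dimensional features.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic training data: the visual feature is a noisy function of audio.
audio = rng.uniform(-2, 2, size=500)
visual = np.tanh(audio) + 0.1 * rng.normal(size=500)
joint = np.column_stack([audio, visual])

gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
gmm.fit(joint)


def audio_to_visual(a: float) -> float:
    """E[visual | audio = a] under the fitted joint GMM."""
    means, covs, weights = gmm.means_, gmm.covariances_, gmm.weights_
    # Responsibility of each component for the observed audio value.
    lik = np.array([w * norm.pdf(a, m[0], np.sqrt(c[0, 0]))
                    for w, m, c in zip(weights, means, covs)])
    resp = lik / lik.sum()
    # Per-component conditional mean of the visual feature.
    cond = np.array([m[1] + c[1, 0] / c[0, 0] * (a - m[0])
                     for m, c in zip(means, covs)])
    return float(resp @ cond)


print(audio_to_visual(1.0))  # should land near tanh(1.0) ~ 0.76
```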

Page 23: Affective computing

Released Applications

Spatio-Temporal Emotional Mapper for Social Systems [Demonstration]

Developed by the Dept. of Informatics Engineering of the Faculty of Science & Technology, University of Coimbra.

This tool gathers, from a society of agents, their emotional arousal and self-rated motivation, as well as their location, in order to plot a map of a city or geographical region with information about the motivation and emotional state of the agents that inhabit it.

It is an open-source application (source code).

[6]
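As an illustration of the kind of map the tool produces, here is a minimal sketch using matplotlib with invented agent data (the coordinates, arousal, and motivation values are synthetic).

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Each agent reports a location plus self-rated arousal and motivation.
n_agents = 200
lon = rng.uniform(-8.45, -8.35, n_agents)  # illustrative coordinates
lat = rng.uniform(40.18, 40.24, n_agents)
arousal = rng.uniform(0, 1, n_agents)
motivation = rng.uniform(0, 1, n_agents)

# Color encodes arousal; marker size encodes motivation.
plt.scatter(lon, lat, c=arousal, s=20 + 80 * motivation, cmap="coolwarm")
plt.colorbar(label="emotional arousal")
plt.xlabel("longitude")
plt.ylabel("latitude")
plt.title("Emotional state of agents across a region (synthetic data)")
plt.show()
```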

Page 24: Affective computing

Research Groups & their Work

AffQuake[2]

AffQuake is an attempt to incorporate signals that relate to the player's emotions while playing.

Quake II is altered so that the gameplay changes with the player's average skin conductance level.

For example, excitement increases the size of the player's avatar, giving it the benefit of seeing farther, but at the same time making it an easier target.
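A minimal sketch of that mapping follows; the baseline, gain, and clamping range are assumptions for illustration, not values from the AffQuake implementation.

```python
def avatar_scale(scl: float, baseline: float, gain: float = 0.5) -> float:
    """Scale the avatar with skin conductance level (SCL) above baseline.

    Higher arousal -> larger avatar (sees farther, but is an easier
    target). The result is clamped to a sane range.
    """
    scale = 1.0 + gain * (scl - baseline) / baseline
    return max(0.5, min(2.0, scale))


print(avatar_scale(scl=6.0, baseline=4.0))   # 1.25: mildly excited player
print(avatar_scale(scl=12.0, baseline=4.0))  # 2.0: clamped at the maximum
```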

Performed at MIT Media Lab, MIT.

Group members: a) Carson J. Reynolds, b) Rosalind W. Picard

Page 25: Affective computing

Research Groups & their Work

Affective Tangibles[2]

Objective: to develop physical objects that can be grasped, squeezed, thrown, or otherwise manipulated as a natural display of affect.

People generally express their frustration through the use of motor skills; in simple words, people often increase the intensity of their muscle movements when experiencing frustrating interactions.

Constructed tangibles include a pressure mouse, affective pinwheels that are mapped to skin conductance, and a voodoo doll that can be shaken to express frustration.

Performed at MIT Media Lab, MIT. Group Members :

a) Carson J. Reynolds, b) Rosalind W. Picard

Figure : Pressure Mouse
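A crude sketch of how sustained grip pressure above a rolling baseline might be flagged as frustration; the window size, threshold factor, and pressure trace are invented for illustration.

```python
from collections import deque


def frustration_events(pressure_stream, window=50, factor=1.8):
    """Yield sample indices where grip pressure spikes well above
    the recent rolling average (a crude frustration heuristic)."""
    recent = deque(maxlen=window)
    for i, p in enumerate(pressure_stream):
        if len(recent) == window and p > factor * (sum(recent) / window):
            yield i
        recent.append(p)


# Simulated pressure trace: steady grip, then a frustrated squeeze.
trace = [1.0] * 100 + [2.5] * 5 + [1.0] * 20
print(list(frustration_events(trace)))  # indices around the squeeze
```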

Page 26: Affective computing

Research Groups & their Work Affective Learning Companion[2]

A powerful research tool, exploring a variety of social-emotional skills in HCI.

The platform enables a computational agent to sense and respond, in real time, to a user's non-verbal emotional cues, using video, postural movements, mouse pressure, physiology, and other behaviors communicated by the user to infer their state.

A recently developed animated agent allows the study of factors that help learners develop the ability to persevere during frustrating learning episodes.

Performed at MIT Media Lab, MIT. Group Members :

a) Selene Atenea Mota, b) Rosalind W. Picard, c) Ashish Kapoor, d) Barry Kort, e) Hyungil Ahn, f) Ken Perlin, g) Winslow Burleson

Page 27: Affective computing

Research Groups & their Work

The Galvactivator[2]

A glove-like wearable device that senses the wearer's skin conductivity and maps its values to a bright LED display.

Increases in skin conductivity across the palm tend to be good indicators of physiological arousal, causing the galvactivator display to glow brightly.

Applications: self-feedback for stress management, facilitation of conversation between two people, and new ways of visualizing mass excitement levels in performance situations or visualizing aspects of arousal and attention in learning situations.
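A minimal sketch of the conductance-to-brightness mapping; the conductance range and the 8-bit PWM output are assumptions, not the Galvactivator's actual calibration.

```python
def led_brightness(conductance_us: float,
                   lo: float = 1.0, hi: float = 20.0) -> int:
    """Map palm skin conductance (microsiemens) to an 8-bit LED level.

    Values at or below `lo` give a dark LED; values at or above `hi`
    give full brightness; in between, the mapping is linear.
    """
    frac = (conductance_us - lo) / (hi - lo)
    return round(255 * min(1.0, max(0.0, frac)))


print(led_brightness(2.0))   # barely glowing
print(led_brightness(15.0))  # bright: high arousal
```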

Performed at MIT Media Lab, MIT. Group Members :

a) Rosalind W. Picard, b) Jonny Farringdon, c) Nancy Tilbury (Philips Research Laboratories), d) Jocelyn Scheirer

Page 28: Affective computing

Research Groups & their Work

Learning & Pattern Recognition[2]

This project developed efficient versions of Bayesian techniques for a variety of inference problems, including curve fitting, mixture-density estimation, principal-components analysis (PCA), automatic relevance determination, and spectral analysis.

Performed at MIT Media Lab, MIT. Group Members :

a) Rosalind W. Picard, b) Thomas Minka, c) Yuan Qi

Page 29: Affective computing

Research Groups & their Work

Robotic Computer[2]

A robotic computer that moves its monitor "head" and "neck," but that has no explicit face, is being designed to interact with users in a natural way for applications such as learning, rapport-building, interactive teaching, and posture improvement.

In all these applications, the robot will need to move in subtle ways that express its state and promote appropriate movements in the user, but that don't distract or annoy.

Goal: to give the system the ability to recognize the user's states and also to produce subtle expressions of its own.

Performed at MIT Media Lab, MIT. Group Members :

a) Carson J. Reynolds, b) Rosalind W. Picard

Note: other publications associated with affective computing are available at http://affect.media.mit.edu/publications.php

Page 30: Affective computing

Research Groups & their Work

Agent-Dysl Project[5]

Problem: children with dyslexia experience problems reading off a computer screen. Common errors: skipping words, changing word or syllable sequence, becoming easily distracted or frustrated.

Solution: screen-reading software which:

Helps them read in the correct order by highlighting words and syllables.
Checks and monitors their progress.
Looks for signs of distraction or frustration.

Figure : User leans towards the screen? Font size increased

Figure : User looks away? Highlighting stops
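A minimal sketch of the adaptation logic shown in the two figures above; the cue names, font-size step, and size cap are assumptions for illustration.

```python
def adapt_reader(cue: str, state: dict) -> dict:
    """Update the screen reader's state from a detected user cue."""
    if cue == "leans_toward_screen":
        state["font_size"] = min(32, state["font_size"] + 4)
    elif cue == "looks_away":
        state["highlighting"] = False  # pause until attention returns
    elif cue == "looks_back":
        state["highlighting"] = True
    return state


state = {"font_size": 16, "highlighting": True}
for cue in ["leans_toward_screen", "looks_away", "looks_back"]:
    state = adapt_reader(cue, state)
print(state)  # {'font_size': 20, 'highlighting': True}
```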

Page 31: Affective computing

Key Issues for Further Research

The critical issues that interactive systems designers are facing:

In which domain does affective capability make a positive difference to HCI, and where is it irrelevant or even obstructive?

How precise do we need to be in identifying human emotions? Perhaps it is enough to identify a general positive or negative feeling. What techniques best detect emotional states for this purpose?

How do we evaluate the contribution of affect to overall success of design?

[3]

Page 32: Affective computing

References

[1] Panrong Yin, Linye Zhao, Lexing Huang, and Jianhua Tao, "Expressive Face Animation Synthesis Based on Dynamic Mapping Method", National Laboratory of Pattern Recognition; Springer-Verlag Berlin Heidelberg, 2011.
[2] Site: http://affect.media.mit.edu
[3] David Benyon (2010), Designing Interactive Systems: A Comprehensive Guide to HCI and Interaction Design, Addison-Wesley, second edition.
[4] Site: http://www.agent-dysl.eu
[5] Dr. Kostas Karpouzis, "Technology Potential: Affective Computing", Image, Video and Multimedia Systems Lab, National Technical University of Athens.
[6] Site: https://github.com/lfac-pt/Spatiotemporal-Emotional-Mapper-for-Social-Systems
[7] Zhihong Zeng (Member, IEEE Computer Society), Maja Pantic (Senior Member, IEEE), Glenn I. Roisman, and Thomas S. Huang (Fellow, IEEE), "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions".

Page 33: Affective computing

Projects

Problem Definition 1 : Designing an interface integrating emotion detection for video surveillance.

Problem Definition 2: A 3-D Avatar reflecting the emotion as per scenario in gaming environment.