

SEMINAR REPORT ON

AFFECTIVE COMPUTING

SUBMITTED BY:

NAVEED S

ROLL NO.-15

S7 CSE

COLLEGE OF ENGINEERING

PERUMON


ABSTRACT

Affective computing is computing that relates to, arises from or deliberately

influences emotions. Neurological studies indicate that the role of emotions in human

cognition is essential and that emotions play a critical role in rational decision-

making, perception, human interaction and human intelligence. In view of increased human-computer interaction (HCI), it has become important that, for proper and full interaction between humans and computers, computers should be able to at least recognize and react to different user emotional states.

Emotion is difficult to classify and study fully, so replicating or detecting emotions in agents is a challenging task. In human-human interaction it is often easy to see whether a person is angry, happy, or frustrated; it is not easy to replicate such an ability in an agent. In this seminar I will deal with different aspects of affective computing, including a brief study of human emotions, theory and practice related to affective systems, challenges to affective computing, and systems which have been developed to support this type of interaction. I will also make a foray into the ethics of this field, as well as the implications of computers that have emotions of their own. Affective computing is an

emerging, interdisciplinary area, addressing a variety of research, methodological, and

technical issues pertaining to the integration of affect into human-computer

interaction.

The specific research areas include recognition of distinct affective states, adaptation of the user interface and integration of functions in response to changes in the user's affective state, and supporting technologies such as wearable computing for improved affective state detection and adaptation.


INTRODUCTION

Affective computing aims at developing computers with understanding capabilities

vastly beyond today’s computer systems. Affective computing is computing that

relates to, or arises from, or deliberately influences emotion. Affective computing also

involves giving machines skills of emotional intelligence: the ability to recognize and

respond intelligently to emotion, the ability to appropriately express (or not express)

emotion, and the ability to manage emotions. The latter ability involves handling both

the emotions of others and the emotions within oneself.

Today, more than ever, the role of computers in interacting with people is of

importance. Most computer users are not engineers and do not have the time or desire

to learn and stay up to date on special skills for making use of a computer’s

assistance. The emotional abilities given to computers are intended for helping

address the problem of interacting with complex systems leading to smoother

interaction between the two. Emotional intelligence – that is, the ability to respond to one's own and others' emotions – is often viewed as more important than mathematical

or other forms of intelligence. Equipping computer agents with such intelligence will

be the keystone in the future of computer agents.

Emotions in people consist of a constellation of regulatory and biasing mechanisms,

operating throughout the body and brain, modulating just about everything a person

does. Emotion can affect the way you walk, talk, type, gesture, compose a sentence,

or otherwise communicate. Thus to infer a person’s emotion, there are multiple

signals you can sense and try to associate with an underlying affective state.

Depending on which sensors are available (auditory, visual, textual, physiological, biochemical, etc.), one can look for different patterns of emotion's influence. The most active areas for machine emotion recognition have been in automating facial

expression recognition, vocal inflection recognition, and reasoning about emotion

given text input about goals and actions. The signals are then processed using pattern

recognition techniques such as hidden Markov models (HMMs), hidden decision trees, auto-regressive HMMs, support vector machines, and neural networks.
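To make the recognition step concrete, the sketch below trains one of the techniques named above, a support vector machine, to map physiological feature vectors to emotion labels. The feature layout, label set, and synthetic data are illustrative assumptions, not taken from any system described in this report.

```python
# Minimal sketch: SVM-based affect classification (illustrative only).
# Assumes each sample is a feature vector extracted from physiological
# signals, e.g. [mean GSR, GSR variance, heart rate, respiration rate].
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
LABELS = ["neutral", "angry", "happy", "frustrated"]  # assumed label set

# Synthetic stand-in for real sensor-derived features: one cluster per label.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(50, 4)) for i in range(4)])
y = np.repeat(np.arange(4), 50)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
print("prediction:", LABELS[clf.predict(X_test[:1])[0]])
```

In a real system the feature vectors would come from the sensing stage described later in this report rather than from synthetic clusters.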

The response of such an affective system is also a very important consideration. It could have a preset response to each user emotional state, or it could learn over time by trying out different strategies on the user to see which are most pleasing. Indeed, a core property of such learning systems is the ability to sense positive or negative feedback – affective feedback – and incorporate it into the learning routine. A wide range of

uses have been determined and implemented for such systems. These range from systems that detect the stress level of car drivers to toys that sense the mood of a child and react accordingly.

A GENERAL OVERVIEW

A general system comprising users who have affect or emotion and the surrounding world can be outlined as follows.

The most important component of the system will be the emotive user. This is any

user or being who has emotions and whose actions and decisions are influenced by his

emotions. The user forms the core of any affective system. This affect enables him to communicate with other humans, with computers, and with himself. Human-to-human affective communication is a widely studied branch of psychology and is one of the base subjects that were explored when affective computing was first considered.

Now in a general way it can be said that a human user will display an emotion. This

emotion will be sensed by any one of the interfaces to the affective application. This

might be a wearable computer or any other device designed for

inputting affective signals. In this way the sensing of the affective signal takes place.

A pattern recognition algorithm is further applied to recognize the affective state of

the user. The affective state is now understood and modeled. This information is now

passed to an affective application or an affective computer, which uses it to

communicate back with the emotive user. Research is also under way on synthesizing affect in computers, which will add a further dimension of originality to human-computer interaction. Each of the dimensions of affective interaction is discussed below.
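Before turning to those dimensions, the following toy sketch shows the overall sense-recognize-respond loop just described. All names, the feature set, and the threshold rule are assumptions for illustration.

```python
# Illustrative sketch of the affective pipeline described above:
# sense an affective signal, recognize the state, respond to the user.
# The stage names and the toy recognizer are assumptions, not the
# report's actual design.
from dataclasses import dataclass

@dataclass
class AffectiveSignal:
    skin_conductance: float  # e.g. from a GSR sensor, in microsiemens
    heart_rate: float        # e.g. derived from a BVP sensor, in bpm

def recognize(signal: AffectiveSignal) -> str:
    """Toy stand-in for a pattern-recognition module."""
    if signal.skin_conductance > 5.0 and signal.heart_rate > 100:
        return "stressed"
    return "calm"

def respond(state: str) -> str:
    """Toy stand-in for an affective application's response."""
    return {"stressed": "offer a break", "calm": "continue normally"}[state]

reading = AffectiveSignal(skin_conductance=6.2, heart_rate=110)
print(respond(recognize(reading)))  # -> offer a break
```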

EMOTIONAL OR AFFECTIVE COMMUNICATION

Affective communication is communicating with someone (or something) either with

or about affect. A crying child, and a parent comforting that child, are both engaged in

affective communication. An angry customer complaining to a customer service

representative, and that representative trying to clear up the problem are both also

engaged in affective communication. We communicate through affective channels

naturally every day. Indeed, most of us are experts in expressing, recognizing and

dealing with emotions. However, affective communication that involves computers

represents a vast but largely untapped research area. What role can computers play in

affective communication? How can they assist us in putting emotional channels back

into communication technologies such as email and online chat where the emotional

content is lost? How can computer technology support us in getting to know our own

bodies, and our own emotions? What role can computers play in helping manage

frustration, especially frustration that arises from using technology? Researchers are

beginning to investigate several key aspects of Affective Communication as it relates

to computers. Affective communication may involve giving computers the ability to

recognize emotional expressions as a step toward interpreting what the user might be

feeling. However, the focus in this area is on communication that involves emotional

expression. Expressions of emotion can be communicated to others without an


intermediate step of recognition; they can simply be "transduced" into a form that can

be digitally transmitted and re-presented at another location. Several devices are being

investigated for facilitating this, under the name of Affective Mediation - using

computers to help communicate emotions to other people through various media.

Affective Mediation

Technology supporting machine-mediated communication continues to grow and

improve, but much of it still remains impoverished with respect to emotional

expression. While much of the current research in the group focuses on sensing and

understanding the emotional state of the user or the development of affective

interfaces, research in Affective Mediation explores ways to increase the "affective

bandwidth" of computer-mediated communication through the use of graphical

visualization. Graphical visualization here means the representation of emotional

information in an easy-to-understand, computer graphics format. Currently the focus

is on physiological information, but it may also include behavioral information (such

as whether someone is typing louder or faster than usual). Building on traditional

representations of physiological signals – continuously updating line graphs – one

approach is to represent the user's physiology in three-dimensional, real-time

computer graphics, and to provide unique, innovative, and unobtrusive ways to collect

the data. This research focuses on using displays and devices in ways which will help

humans to communicate both with themselves and with one another in affect-

enhanced ways.

Human-to-Human Communication

From email to full-body videoconferencing, virtual communication is growing rapidly

in availability and complexity. Although this richness of communication options

improves our ability to converse with others who are far away or not available at the

precise moment that we are, the sense that something is missing continues to plague

users of current methodologies. Affective Communication seeks to provide new

devices and tools for supplementing person-to-person communications media.

Specifically, through the use of graphical displays viewable by any or all members of

a mediated conversation, researchers hope to provide an augmented experience of

affective expression, which supplements but also challenges traditional computer

mediated communication.


Human-to-Self (Reflexive) Communication

Digitized representation of affective responses creates new possibilities for our relationship to our own bodies and affective response patterns. Affective communication with oneself – reflexive communication – explores the exciting

possibilities of giving people access to their own physiological patterns in ways

previously unavailable, or available only to medical and research personnel with

special, complex, or expensive equipment. The graphical approach creates new

technologies with the express goal of allowing the user to gain information and

insight about his or her own responses.

Computer expression of emotion

This work represents a controversial area of human-computer interaction, in part since

attributing emotions and emotional understanding to machines has been identified as a

philosophical problem: what does it mean for a machine to express emotions that it

doesn't feel? What does it mean for humans to feel "empathized with" by machines

that are simply unable to really "feel" what a person is going through? Currently, few

computer systems have been designed specifically to interact on an emotional level.

An example is the smile that Macintosh users are greeted with, indicating that "all is

well" with the boot disk. If there is a problem with the boot disk, the machine displays

the "sad Mac".

Humans are experts at interpreting facial expressions and tones of voice, and making

accurate inferences about others' internal states from these clues. Controversy rages

over anthropomorphism: Should researchers leverage this expertise in the service of

computer interface design, since attributing human characteristics to machines often

means setting unrealistic as well as unfulfillable expectations about the machine's

capabilities? Show a human face, expect human capabilities that far outstrip the

machine? Yet the fact remains that faces have been used effectively to represent a

wide variety of internal states. And with careful design, researchers regard emotional

expression via face and sound as a potentially effective means of communicating a

wide array of information to computer users. As systems become more capable of

emotional communication with users, researchers see systems needing more and more

sophisticated emotionally-expressive capability.


SENSING HUMAN EMOTIONS

Sensors are an important part of an Affective Computing System because they

provide information about the wearer's physical state or behavior. They can gather

data in a continuous way without having to interrupt the user. There are many types of

sensors being developed to accommodate and to detect different types of emotions.

Some are listed below.

The Galvanic Skin Response (GSR) Sensor

Galvanic Skin Response is a measure of the skin's conductance between two

electrodes. Electrodes are small metal plates that apply a safe, imperceptibly tiny

voltage across the skin. The electrodes are typically attached to the subject's fingers or

toes using electrode cuffs or to any part of the body using a Silver-Chloride electrode

patch. To measure the resistance, a small voltage is applied to the skin and the skin's

current conduction is measured. Skin conductance is considered to be a function of

the sweat gland activity and the skin's pore size. An individual's baseline skin

conductance will vary for many reasons, including gender, diet, skin type and

situation. Sweat gland activity is controlled in part by the sympathetic nervous

system. When a subject is startled or experiences anxiety, there will be a fast increase

in the skin's conductance (a period of seconds) due to increased activity in the sweat

glands (unless the glands are saturated with sweat).

After a startle, the skin's conductance will decrease naturally due to reabsorption.

There is a saturation to the effect: when the duct of the sweat gland fills there is no

longer a possibility of further increasing skin conductance. Excess sweat pours out of

the duct. Sweat gland activity increases the skin's capacity to conduct the current

passing through it and changes in the skin conductance reflect changes in the level of

arousal in the sympathetic nervous system.
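A minimal sketch of how such a startle response might be detected in software is given below; the sampling rate, rise threshold, and synthetic trace are assumptions.

```python
# Sketch: flag skin-conductance responses (startle-like events) in a GSR
# trace by looking for fast rises, as described above. Sampling rate,
# threshold, and the synthetic trace are illustrative assumptions.
import numpy as np

FS = 10                 # samples per second (assumed sensor rate)
RISE_THRESHOLD = 0.06   # microsiemens per sample (assumed startle criterion)

rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / FS)
trace = 2.0 + 0.01 * rng.standard_normal(t.size)       # baseline conductance
trace[300:320] += np.linspace(0.0, 1.5, 20)            # fast rise: simulated startle
trace[320:] += 1.5 * np.exp(-(t[320:] - t[320]) / 10)  # slow decay: reabsorption

onsets = np.flatnonzero(np.diff(trace) > RISE_THRESHOLD)
print("startle-like rise detected near t =", onsets[0] / FS, "s")
```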

The Blood Volume Pulse Sensor

The Blood Volume Pulse sensor uses photoplethysmography to detect changes in blood volume in the extremities. Photoplethysmography is a process of applying a light source and measuring the

light reflected by the skin. At each contraction of the heart, blood is forced through

the peripheral vessels, producing engorgement of the vessels under the light source –

thereby modifying the amount of light to the photosensor. The resulting pressure

waveform is recorded. Since vasomotor activity (activity which controls the size of


the blood vessels) is controlled by the sympathetic nervous system, the BVP

measurements can display changes in sympathetic arousal. An increase in the BVP

amplitude indicates decreased sympathetic arousal and greater blood flow to the

fingertips.
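As a rough illustration, heart rate can be estimated from a BVP waveform by detecting one pulse peak per heartbeat; the sampling rate and synthetic waveform below are assumptions.

```python
# Sketch: estimate heart rate from a BVP (photoplethysmography) waveform
# by detecting the pulse peak of each heartbeat. The sampling rate and
# the synthetic waveform are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

FS = 64  # samples per second (assumed)
t = np.arange(0, 10, 1 / FS)
bvp = np.sin(2 * np.pi * 1.2 * t)  # synthetic 1.2 Hz pulse (72 bpm)

peaks, _ = find_peaks(bvp, distance=FS // 3)  # refuse peaks above ~180 bpm
ibi = np.diff(peaks) / FS                     # inter-beat intervals, seconds
print("estimated heart rate: %.0f bpm" % (60 / ibi.mean()))
```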

The Respiration Sensor

The respiration sensor can be placed either over the sternum for thoracic monitoring

or over the diaphragm for diaphragmatic monitoring. In all experiments so far we have

used diaphragmatic monitoring. The sensor consists mainly of a large Velcro belt,

which extends around the chest cavity and a small elastic which stretches as the

subject's chest cavity expands. The amount of stretch in the elastic is measured as a

voltage change and recorded. From the waveform, the depth of the subject's breath and

the subject's rate of respiration can be learned.
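A small sketch of recovering these two quantities from the waveform follows; the sampling rate and the synthetic signal are assumptions.

```python
# Sketch: recover respiration rate and depth from the stretch-sensor
# waveform described above. Sampling rate and signal are assumptions.
import numpy as np

FS = 32                       # samples per second (assumed)
t = np.arange(0, 32, 1 / FS)
resp = 0.8 * np.sin(2 * np.pi * 0.25 * t)  # 0.25 Hz = 15 breaths/min

spectrum = np.abs(np.fft.rfft(resp - resp.mean()))
freqs = np.fft.rfftfreq(resp.size, d=1 / FS)
rate_hz = freqs[spectrum.argmax()]          # dominant breathing frequency
depth = (resp.max() - resp.min()) / 2       # amplitude as a proxy for depth

print("breaths per minute: %.1f, depth (a.u.): %.2f" % (rate_hz * 60, depth))
```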

The Electromyogram (EMG) Sensor

The electromyographic sensors measure the electrical activity produced by a muscle

when it is being contracted, amplify the signal and send it to the encoder. There, a

band pass filter is applied to the signal. For all our experiments, the sensor has used

the 0-400 microvolt range and the 20-500 Hz band-pass filter, which is the most commonly used configuration.
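The band-pass stage could look roughly like the sketch below if implemented in software; the filter order and sampling rate are assumptions.

```python
# Sketch: the 20-500 Hz band-pass stage described above, applied in
# software. Filter order and sampling rate are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000  # samples per second (assumed; must exceed 2 x 500 Hz)
b, a = butter(4, [20, 500], btype="bandpass", fs=FS)

rng = np.random.default_rng(0)
raw_emg = rng.standard_normal(FS)   # 1 s of synthetic raw signal
filtered = filtfilt(b, a, raw_emg)  # zero-phase band-pass filtering
print("kept %.0f%% of signal power" % (100 * filtered.var() / raw_emg.var()))
```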

RECOGNIZING AFFECTIVE INPUT

The research work mainly involves efforts to understand the correlation between emotion and its primarily behavioral and physiological expressions, which can potentially be identified by a computer. Because we can measure physical events but

cannot recognize a person's thoughts, research in recognizing emotion is limited to

correlates of emotional expression that can be sensed by a computer, including such

things as physiology, behavior, and even word selection when talking. Emotion

modulates not just memory retrieval and decision-making (things that are hard for a

computer to know), but also many sense-able actions such as the way you pick up a

pencil or bang on a mouse (things a computer can begin to observe). In assessing a

user's emotion, one can also measure an individual's self-report of how they are

feeling. Many people have difficulty recognizing and/or verbally expressing their

emotions, especially when there is a mix of emotions or when the emotions are

nondescript. In many situations it is also inappropriate to interrupt the user for a self-

report. Nonetheless, researchers think it is important that if a user wants to tell a

system verbally about their feelings, the system should facilitate this. Researchers are


interested in emotional expression through verbal as well as non-verbal means, not

just how something is said, but how word choice might reveal an underlying affective

state.

Our focus begins by looking at physiological correlates, measured both during lab

situations designed to arouse and elicit emotional response, and during ordinary (non-

lab) situations, the latter via affective wearable computing.

Our first efforts toward affect recognition have focused on detecting patterns in

physiology that we receive from sensing devices. To this effect, we are designing and

conducting experiments to induce particular affective responses. One of our primary

goals is to be able to determine which signals are related to which emotional states – in other words, how to find the link between a user's emotional state and its

corresponding physiological state. We are hoping to use, and build upon, some of the

work done by others on coupling physiological information with affective states.

Current efforts that use physiological sensing are focusing on the following signals (a simple feature-extraction sketch follows the list):

• GSR (Galvanic Skin Response),

• ECG (Electrocardiogram),

• EMG (Electromyogram),

• BVP (Blood Volume Pulse),

• Respiration, and

• Temperature.
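As a hypothetical illustration of how windows of these raw signals might be condensed for a recognizer, the sketch below computes a few simple per-signal statistics; the particular statistics chosen are assumptions.

```python
# Sketch: turn windows of raw physiological signals (GSR, EMG, BVP,
# respiration, temperature) into a fixed-length feature vector that a
# recognizer can consume. The statistics chosen here are assumptions.
import numpy as np

def window_features(signals: dict[str, np.ndarray]) -> np.ndarray:
    """Mean, standard deviation, and mean absolute first difference
    of each signal window, concatenated in a fixed key order."""
    feats = []
    for name in sorted(signals):
        x = signals[name]
        feats += [x.mean(), x.std(), np.abs(np.diff(x)).mean()]
    return np.array(feats)

rng = np.random.default_rng(0)
window = {s: rng.standard_normal(256) for s in ("gsr", "emg", "bvp", "resp", "temp")}
print(window_features(window).shape)  # (15,) -> 5 signals x 3 statistics
```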

UNDERSTANDING THE AFFECTIVE INPUT

Once the Sensing and Recognition modules have made their best attempt to translate

user signals into patterns that signify the user's emotional responses, the system may

now be said to be primitively aware of the user's immediate emotional state. But what

can be done with this information? How will applications be able to make sense of

this moment-to-moment update on the user's emotional state, and make use of it? The

Affective Understanding module will use, process, and store this information to build and maintain a model of the user's emotional life at different levels of granularity – from quick, specific combinations of affective responses to meta-patterns of moods

and other emotional responses. This module will communicate knowledge from this

model with the other modules in the system.


The Affective Understanding module will eventually be able to incorporate contextual

information about the user and his/her environment, to generate appropriate responses

to the user that incorporate the user's emotional state, the user's cognitive abilities, and

his/her environmental situation.

The Affective Understanding module may:

• Absorb information, by receiving a constant data stream on the user's current

emotional state from the Recognition module.

• Remember the information, by keeping track of the user's emotional responses via storage in short-, medium-, and long-term memory buffers (a minimal sketch of such buffers follows this list).

• Model the user's current mood, by detecting meta-patterns in the user's emotional

responses over time, comparing these patterns to the user's previously defined moods,

and possibly canonical, universal or archetypal definitions of human moods

previously modeled.

• Model the user's emotional life. Recognize patterns in the way the user's

emotional states may change over time, to generate a model of the user's emotional

states – patterns in the typical types of emotional state that the user experiences, mood variation, degrees of intensity (e.g. mildly put off vs. enraged), and pattern combinations (e.g. tendencies toward a pattern of anger followed by depression).

• Apply the user affect model. This model may help the Affective Understanding

module by informing the actions that this module decides to take--actions that use the

Applications and Interface modules to customize the interaction between user and

system, to predict user responses to system behavior, and to eventually make

predictions about the user's interaction with environmental stimuli.

• Update the user affect model. This model must be inherently dynamic in order to

reflect the user's changing response patterns over time. To this end, the system will be

sensitive to changes in the user's meta-patterns as it begins to receive new kinds of

data from the Recognition module. Similarly, the Understanding module's learning

agents will receive feedback from both Application and Interface modules that will

inform changes to the user model. This feedback may consist of indications of levels

of user satisfaction – whether the user liked or disliked the system's behavior. This

feedback may come either as direct feedback from the user via the interface, or

indirectly by way of inference from how an application was used (e.g. the way that


application X was used and then terminated indicated that the user may have been

frustrated with it).

These user responses will help to modify the Understanding module's model of the

user and, therefore, the recommendations for system behavior that the Understanding

module makes to the rest of the system.

• Build and maintain a user-editable taxonomy of user preferences, for use in

specific circumstances when interacting with the user. For example, instructions not to

attempt to communicate with the user while she/he is extremely agitated, or requests

for specific applications during certain moods--i.e. "Start playing melancholy music

when I've been depressed for x number of hours, and then start playing upbeat music

after this duration." This taxonomy may be eventually incorporated into the user

model; however, ultimately, the user's wishes should be able to override any modeled

preference.

• Feature two-way communication with the system's Recognition module. Not

only will the Recognition module constantly send updates to the Understanding

module, but the Understanding module will also send messages to the Recognition

module. These messages may include alerting the Recognition module to "look out"

for subsequent emotional responses that the Understanding module's model of the

user's meta-patterns predicts. Other kinds of Understanding-module-to-Recognizing-

module messages may include assisting the Recognition module in fine-tuning its

recognition engine by suggesting new combinations of affect response patterns that

the user seems to be displaying. These novel combinations may in turn inform novel

patterns in the user's affect that may be beyond the scope of the Recognition engine to

find on its own.

• Eventually build and maintain a more complete model of the user's behavior.

The more accurate a model of the user's cognitive abilities and processes can be built,

the better the system will be at predicting the user's behavior and providing accurate

information to the other modules within the system.

• Eventually model the user's context. The more information the system has about

the user's outside environment, the more effective the interaction will be, as will be

the benefit to the user. A system that knows that the user is in a conversation with

someone else may not wish to interrupt the user to discuss the user's current affective

response. Similarly, a system that can tell that the user has not slept in several days, is


ill or starving or under deadline pressure, will certainly be able to communicate with

much more sensitivity to the user.

• Provide a basis for the generation of synthetic system affect. A system that can

display emotional responses of its own is a vast, distinct area of research. The

Affective Understanding module described here may be able to inform the design of

such systems. And, once built, such a system could be integrated into the Affective

Understanding module to great effect. For example, a system that is able to display

authentic empathy in its interaction with the user might prove even more effective in

an Active Listening application than a system that shows artificial empathy (looks like

empathy to the user, but the machine doesn't really feel anything).

• Ensure confidentiality and security. The understanding module will build and

maintain a working model and record of the user's emotional life; eventually, this

model may also record other salient, contextual aspects of the user's life. Therefore,

perhaps more so than any other part of the affective computing system, the affective

understanding module will house information that must be kept confidential.
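The sketch below (referenced in the "Remember the information" item above) illustrates only the absorb-and-remember duties: rolling short-, medium-, and long-term buffers of recognized states plus a trivial mood estimate. The buffer lengths and the majority-vote rule are assumptions.

```python
# Sketch of the "absorb and remember" duties of the Understanding module:
# rolling memory buffers of recognized emotional states and a toy mood
# estimate. Buffer lengths and the majority-vote rule are assumptions.
from collections import Counter, deque

class AffectiveUnderstanding:
    def __init__(self):
        self.short = deque(maxlen=10)     # last few seconds of states
        self.medium = deque(maxlen=100)   # recent session history
        self.long = []                    # persistent record of the user

    def absorb(self, state: str) -> None:
        """Receive one recognized state from the Recognition module."""
        for buf in (self.short, self.medium, self.long):
            buf.append(state)

    def current_mood(self) -> str:
        """Meta-pattern stand-in: most common recent state."""
        return Counter(self.medium).most_common(1)[0][0]

m = AffectiveUnderstanding()
for s in ["calm", "calm", "frustrated", "calm"]:
    m.absorb(s)
print(m.current_mood())  # -> calm
```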

SYNTHESIZING EMOTION

The next step is synthesizing emotions in machines: building machines that not only appear to "have" emotions, but actually do have internal mechanisms analogous to human or animal emotions. In a machine (or software agent, or

virtual creature) which "has" emotions, the synthesis model decides which emotional

state the machine (or agent or creature) should be in. The emotional state is then used

to influence subsequent behavior. Some forms of synthesis act by reasoning about

emotion generation. An example of synthesis is as follows: if a person has a big exam

tomorrow, and has encountered several delays today, then he/she might feel stressed

and particularly intolerant of certain behaviors, such as interruptions not related to

helping prepare for the exam. This synthesis model can reason about circumstances

(exam, delays), and suggest which emotion(s) are likely to be present (stress,

annoyance).
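A toy, rule-based version of this appraisal, mirroring the exam example, might look as follows; the circumstance flags and rules are assumptions.

```python
# Toy appraisal-style synthesis, mirroring the exam example above:
# reason from circumstances to likely emotions. Rules are assumptions.
def synthesize_emotion(big_exam_tomorrow: bool, delays_today: int) -> list[str]:
    emotions = []
    if big_exam_tomorrow and delays_today >= 2:
        emotions += ["stress", "annoyance"]  # intolerant of interruptions
    elif delays_today >= 2:
        emotions.append("annoyance")
    return emotions or ["neutral"]

print(synthesize_emotion(big_exam_tomorrow=True, delays_today=3))
# -> ['stress', 'annoyance']
```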

The ability to synthesize emotions via reasoning about them, i.e. to know that certain

conditions tend to produce certain affective states, is also important for emotion

recognition. Recognition is often considered the "analysis" part of modeling

something – analyzing what emotion is present. Synthesis is the inverse of analysis –

constructing the emotion. The two can operate in a system of checks and balances:

Recognition can proceed by synthesizing several possible cases, then asking which


case most closely resembles what is perceived. This approach to recognition is

sometimes called "analysis by synthesis." Synthesis models can also operate without

explicit reasoning. Researchers are exploring the need for machines to "have"

emotions in a bodily sense. The importance of this follows from the work of Damasio

and others who have studied patients who essentially do not have "enough emotions"

and consequently suffer from impaired rational decision making. The nature of their

impairment is oddly similar to that of today's Boolean decision-making machines, and

of AI's brittle expert systems. Recent findings suggest that in humans, emotions are

essential for flexible and rational decision-making. Our hypothesis is that emotional mechanisms will be essential for machines to have flexible and rational

decision making, as well as truly creative thought and a variety of other human-like

cognitive capabilities.
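Returning to the "analysis by synthesis" idea above, a minimal nearest-case sketch is given below; the synthesized templates, features, and distance metric are all assumptions.

```python
# Minimal "analysis by synthesis" sketch: synthesize candidate feature
# templates for several emotions, then recognize by picking the template
# closest to the observation. Templates and features are assumptions
# ([skin conductance, heart rate], both normalized to [0, 1]).
import numpy as np

def synthesize(emotion: str) -> np.ndarray:
    templates = {
        "calm":     np.array([0.2, 0.3]),
        "startled": np.array([0.9, 0.8]),
        "angry":    np.array([0.7, 0.9]),
    }
    return templates[emotion]

def recognize(observed: np.ndarray) -> str:
    cases = ["calm", "startled", "angry"]
    return min(cases, key=lambda e: np.linalg.norm(observed - synthesize(e)))

print(recognize(np.array([0.85, 0.75])))  # -> startled
```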

INTERFACES TO AFFECTIVE SYSTEMS

Once we begin to explore applications for affective systems, interface design

challenges and novel strategies for human-computer interaction immediately begin to

suggest themselves. These design challenges concern both hardware and software. In

terms of software, the human interface to applications can change with the increased

sensitivity that the affective sensing/recognizing/understanding system will bring to

the interaction. In terms of hardware, the design challenges present themselves even

more immediately.

For example, various bio-sensors and other devices such as pressure sensors may be

used as inputs to an affective computing system, perhaps by placing them into mice,

keyboards, chairs, jewelry, or clothing, things a user is naturally in physical contact

with. Sensing may also be done without contact, via cameras and microphones or

other remote sensors. How will these sensors fit into the user's daily life? Will

sensors be embedded in the user's environment, or will they be part of one's personal

belongings, perhaps part of a personal wearable computer system? In the latter case,

how can we design these systems so that they are unobtrusive to the user and/or

invisible to others? In either case, consideration for the user's privacy and other needs

must be addressed. So the design of the interface is very important to all concerned

since the interface to the user will determine the utility more than the actual

complexity of the system.

The best type of interfaces would be using wearable interfaces. Wearable computers

are entire systems that are carried by the user, from the CPU and hard drive, to the


power supply and all input/output devices. The size and weight of these wearable hardware systems are dropping, even as their durability increases.

Researchers are also designing clothing and accessories (such as watches, jewelry,

etc.) into which these devices may be embedded to make them not only unobtrusive

and comfortable to the user, but also invisible to others.

Wearable computers allow researchers to create systems that go where the user goes,

whether at the office, at home, or in line at the bank. More importantly, they provide a

platform that can maintain constant contact with the user in the variety of ways that

the system may require; they provide computing power for all affective computing

needs, from affect sensing to the applications that can interpret, understand and use

the data; and they can store the applications and user input data in on-board memory.

Finally, such systems can link to personal computers and to the Internet, providing the

same versatility of communications and applications as most desktop computers. A

prototype affective computing system, currently being developed at MIT and based on a modified "Lizzy" wearable, is described below. Researchers plan to create a uniform set of affective computing hardware platforms, both to conduct affect sensing/recognizing experiments and to develop eventual end-user systems. The computer module itself is five and a half inches square (about the length of a pen) by three inches deep. It runs the

Linux operating system. The steel casing can protect the computer in falls from

heights up to six feet, even on hard surfaces like concrete. This system is durable

enough that it can withstand occasional blows, knocks, even the user's accidentally

sitting on various parts of the system without damage.

APPLICATIONS OF AFFECTIVE SYSTEMS

Perhaps the most fundamental application of affective computing will be to form

next-generation human interfaces that are able to recognize, and respond to, the

emotional states of their users. Users who are becoming frustrated or annoyed with

using a product would "send out signals" to the computer, at which point the

application might respond in a variety of ways -- ideally in ways that the user would

see as "intuitive".

Beyond this quantum leap in the ability of software applications to respond with

greater sensitivity to the user, the advent of affective computing will immediately lend

itself to a host of applications, a number of which are described below.

Affective Medicine


Affective computing could be a great tool in the field of medicine. Stress is an emotion widely felt by all of us in a world of technology that forces us to keep a pace higher than we can handle. Stress is also a major killer. Studies have indicated that stress is a significant factor affecting health. It has been shown

that people who are more stressed out have lower resistance to diseases than a normal

person. It is also noted that the most stressed out people are those who use high-end

technology. So if computers and other devices were to interact with their users on an

affective level, it might help to bring down and control stress and consequently help

the health of the users.

Another use of affective systems is to train autistic children. Autism is a complex disorder in which children tend to have difficulty with social-emotional cues; they tend to be poor at generalizing what they learn, and learn best from huge numbers of examples, patiently provided. Many autistic people have indicated that they like

interacting with computers, and some have indicated that communicating on the web

“levels the playing field” for them, since emotion communication is limited on the

web for everyone. Current intervention techniques for autistic children suggest that

many of them can make progress recognizing and understanding the emotional

expressions of people if given lots of examples to learn from and extensive training

with these examples.

Another application that has been designed uses affective computers to collect

details about the condition of patients when they visit a physician. Today, physicians

usually have so little time with patients that they feel it is impossible to build rapport

and communicate about anything except the most obviously significant medical

issues. However, given findings such as those highlighted here, emotional factors such

as stress, anxiety, depression, and anger can be highly significant medical factors,

even when the patient might not mention them.

In some cases, patients prefer giving information to a computer instead of to a doctor,

even when they know the doctor will see the information: computers can go more

slowly if the patient wishes, asking questions at the patient’s individual speed, not

rushing, not appearing arrogant, offering reassurance and information, while allowing

the physician more time to focus on other aspects of human interaction. Also, in some

cases, patients have reported more accurate information to computers; those referred

for assessment of alcohol-related illnesses admitted to a 42% higher consumption of


alcohol when interviewed by computer than when interviewed for the same

information by psychiatrists.

Affective Tutor

Another good application for affective computers is imparting education to students. Computers are widely used to impart quality education. But most of these CBTs, or Computer-Based Tutorials, are either linear, that is, they follow a fixed

course, or they are based on the ability of the student which is gauged from the

response of the student to test situations. Even such a response is very limited.

An affective tutor, on the other hand, would be able to gauge the student's understanding as well as whether he or she is bored, confused, strained, or in any other

psychological state which affects his or her studies and consequently change its

presentation or tempo so as to enable the student to adjust just as a human teacher

would do. This would consequently increase the student’s grasp of the subject and

give a better overall output from the system.

Affective DJ

Another application that has been developed is a digital music delivery system that plays music based on the user's current mood and listening preferences. The system can detect that the user is experiencing a feeling of sorrow or loneliness and consequently select a piece of music that it expects will help change that mood. It can also modify the current playlist if it senses that the user is getting bored of it, or that the music has already shifted the user's affect to another state. Another promising development is a video retrieval system that might help identify not just scenes having a particular actor or setting, but scenes having a particular emotional content: fast-forward to the "most exciting" scenes. This would allow the user to watch scenes that feature his or her favorite actors and also suit his or her current mood.
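A toy selection rule of this kind is sketched below; the mood labels, the playlists, and the mood-lifting policy are assumptions, not the actual system's design.

```python
# Toy "affective DJ" selection rule: choose the next track based on the
# detected mood and a target mood one step "up". The labels, playlists,
# and transition policy are illustrative assumptions.
PLAYLISTS = {
    "sad":     ["gentle piano", "soft acoustic"],
    "neutral": ["indie pop", "classic rock"],
    "happy":   ["upbeat dance", "funk"],
}
LIFT = {"sad": "neutral", "neutral": "happy", "happy": "happy"}

def next_track(detected_mood: str, bored: bool) -> str:
    """Nudge the user's mood one step upward; vary the pick when bored."""
    playlist = PLAYLISTS[LIFT[detected_mood]]
    return playlist[1] if bored else playlist[0]

print(next_track("sad", bored=False))  # -> indie pop (one step up from sad)
print(next_track("sad", bored=True))   # -> classic rock (variety when bored)
```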

Affective Toys

In an age when robotic toys are the craze, affective toys will soon enter the toy world to fill a void: robots cannot show or have emotions, which must instead be attributed to them by an imaginative child. Affective toys, on the other hand, will have emotions of their own and will be able to exchange these emotions with the child, as a human playmate would. The most famous affective toy is

The Affective Tigger, which is a reactive expressive toy. The stuffed tiger reacts to a


human playmate with a display of emotion, based on its perception of the mood of

play.

Affective Avatars

Virtual reality avatars that accurately, and in real time, represent the physical manifestations of their users' real-world affective states are a dream of hard-core game players. They would enjoy the game more and feel more a part of the

game if their Avatars would behave just like they would in a similar scenario. They

would like their Avatar to be scared when they are scared, angry when they are angry

and also excited whenever they feel excited. Work has been progressing in this

direction.

For example, AffQuake is an attempt to incorporate signals that relate to a player's

affect into ID Software's Quake II in a way that alters game play. Several

modifications have been made that cause the player's avatar within Quake to alter its

behaviors depending upon one of these signals. For example, in StartleQuake, when a

player becomes startled, his or her avatar also becomes startled and jumps back.

Many other applications have arisen with continuing research in this field. More and

more possibilities are opening up every day.

CONCLUSION

In this seminar I have tried to provide a basic framework of the work done in the field

of affective computing. Over the years, scientists have aimed to make machines that

are intelligent and that help people use their native intelligence. However, they have

almost completely neglected the role of emotion in intelligence, leading to an imbalance in which emotions are almost always ignored. This does not mean

that newer research should aim solely to increase the affective ability of computers. It is widely known that too much emotion is as bad as, and possibly worse than, no emotion. So a lot of research is needed to learn how affect can be used

in a balanced, respectful, and intelligent way; this should be the aim of affective

computing as we develop new technologies that recognize and respond appropriately

to human emotions. The science is still very young but shows great promise and may contribute more to HCI than the advent of the GUI and speech recognition did. If the research bears out, affective computing will become an essential tool in the future.

REFERENCES

• J. Scheirer, R. Fernandez, J. Klein, and R. W. Picard (2002), "Frustrating the User on Purpose: A Step Toward Building an Affective Computer"

• J. Klein, Y. Moon, and R. W. Picard (2002), "This Computer Responds to User Frustration"

• R. W. Picard and J. Klein (2002), "Computers that Recognise and Respond to User Emotion: Theoretical and Practical Implications"

• R. W. Picard and J. Scheirer (2001), "The Galvactivator: A Glove that Senses and Communicates Skin Conductivity"

• C. Reynolds and R. W. Picard (2001), "Designing for Affective Interactions"