
Emotional Computers

Computer models of emotions and their meaning for emotion-psychological research

by Gerd Ruebenstrunk

© 1998 by Gerd Ruebenstrunk

Parts or the whole of this work may be reproduced for educational, non-commercial purposes.

Commercial reproduction is forbidden without my written consent.

If you find this work helpful, please drop me a note:

[email protected]


The translation of this work was done by me in my spare time with the help of the Google translation engine. Any mistakes made herein are mine alone and should not be attributed to the cited authors.

I wish to thank Aaron Sloman for providing me with the motivation to undertake this translation and wish I could have done a better job.


Contents

1. Introduction

2. Artificial feelings

3. Strange brains

4. Theoretical foundations

4.1. The theory of Ortony, Clore and Collins

4.2. The theory of Roseman

4.3. The theory of Scherer

4.4. The theory of Frijda

4.5. The theory of Oatley & Johnson-Laird

5. Electronic assistants

5.1. The models of Dyer

5.1.1. BORIS

5.1.2. OpEd

5.1.3. DAYDREAMER

5.2. The model of Pfeifer

5.3. The model of Bates and Reilly

5.4. The model of Elliott

5.4.1. Construction of the agents

5.4.2. Generating emotions

5.4.3. Generating actions

5.4.4. Interpreting the emotions of other agents

5.4.5. Further development of the model

5.5. The model of Scherer

5.6. The model of Frijda and Swagerman

5.7. The model of Moffat and Frijda

5.7.1. Criticism of ACRES

5.7.2. Requirements for an emotional system

5.7.3. Implementation in WILL

5.8. Other models

5.8.1. The model of Colby

5.8.2. The model of Reeves

5.8.3. The model of Rollenhagen and Dagkvist

5.8.4. The model of O'Rorke

5.8.5. The model of Araujo

5.9. Conclusions


6. Visions of the pioneer

7. Encounters on Taros

7.1. What is a Fungus Eater?

7.2. Emotional Fungus Eaters

7.2.1. The "biological urges"7.2.2. The "emergency urges"7.2.3. The "social urges"7.2.4. The "cognitive urges"7.3. Evaluation of Toda's model

8. Development and implementation of Toda's model

8.1. The modification of Toda's urges by Aubé

8.2. The partial implementation of Toda's theory by Wehrle

8.3. Pfeifer's "Fungus Eater principle"

8.4. The approach of Dörner et al.

8.5. Conclusion

9. The philosopher from Birmingham

9.1. Approaches to the construction of intelligent systems

9.2. The fundamental architecture of an intelligent system

9.2.1. The layers

9.2.2. The control states

9.2.3. Motivators and filters

9.2.4. The global alarm system

9.3. Emotions

9.4. Implementation of the theory in MINDER1

9.4.1. The reactive sub-system

9.4.2. The deliberative sub-system

9.4.3. The meta-management sub-system

9.5. Conclusion

10. The libidinal economy of the computer

10.1. Criticism of interrupt theories of emotions

10.1.1. The control precedence problem

10.1.2. The emotional learning problem

10.1.3. The hedonic tone problem

10.2. Valence

10.3. Learning in adaptive agent systems

10.3.1. Q-Learning

10.3.2. The classification system of Holland


10.3.3. XCS

10.3.4. Dyna

10.3.5. The concept of "value"

10.4. Wright's currency flow hypothesis

10.5. The details of the CLE system

10.5.1. The libidinal selective system

10.5.2. The conative universal equivalent (CUE)

10.5.3. Credit assignment

10.5.4. The value circulation theory

10.6. A practical example of CLE

10.7. CLE and the problems of interrupt theories

10.7.1. CLE and the hedonic tone problem

10.7.2. CLE and the emotional learning problem

10.7.3. CLE and the valenced perturbant states problem

10.7.4. CLE and the control precedence problem

10.8. Conclusion

11. A new paradigm?

11.1. The models of Velásquez

11.1.1. Cathexis

11.1.2. Yuppy

11.2. The model of Foliot and Michel

11.3. The model of Gadanho and Hallam

11.4. The model of Staller and Petta

11.5. The model of Botelho and Coelho

11.6. The model of Canamero

11.7. Conclusion

12. Meaning for emotion-psychological research


1. Introduction

What role do computer models play in the psychological research on emotions? This question is not new, but some of the recent answers are. Since 1988, when Pfeifer published an overview of the available literature, the number of approaches to this question has remained essentially unchanged; the depth of these approaches, however, has changed considerably.

In his article "Artificial Intelligence Models of Emotions" (1988), Pfeifer did not only try to give an overview of the existing modelling approaches, he also classified them into two main categories:

a) Augmented Cognitive Models

Approaches in this category do not place emotions at the center but predominantly consist of models of cognitive processes in which emotions play a "supplementing" role. Typical of these models is that they are concerned with a well-defined task to which emotions are added as an auxiliary factor.

b) AI Models of Emotion

Approaches in this category place the modelling of emotions at the center. Typical of such models is the basic assumption of a complex environment in which clear problem descriptions can be realized only with difficulty.

In a further work, Pfeifer (1994) modified this classification. He now differentiates between "reasoners" and "psychological models". Reasoners are models which are based on specific taxonomies of emotions and whose task is to classify them. Psychological models are models whose goal is to model emotional processes per se.

The following work tries to cover those approaches as well as others which cannot be classified into one or the other of those categories. Therefore, my classification differs from Pfeifer's. Computer models of emotions are classified according to their objectives:

a) computers that "understand" and "express" emotions, and

b) computers that "have" emotions.

It is this latter category which is of particular interest for the psychological research into emotions, although it raises the most epistemological questions. Approaches of the first category consist simply of more or less refined models of existing theories of emotions and pose, therefore, mainly technical problems. But developing computers that possess emotions means initiating an evolutionary process which eventually will lead to the emergence of an emotional sub-system independent from its human creators.


This work will therefore be concerned primarily with approaches of the second category and will present them in their historical and theoretical context. Technical explanations, which are necessary for the understanding of the technical implementation of the respective models, will be kept as brief as possible.

After an introductory overview (chapter 2), I deal briefly with the epistemological dimensions of the computer modelling of emotions (chapter 3). I then present a short overview of the psychological theories of emotions which serve as a basis for computer models of emotions (chapter 4). Short descriptions of some of the most important models of the first category follow (chapter 5). The main part of the work is occupied with models of the second category, beginning with the works of Simon (chapter 6) and Toda (chapter 7) as well as describing a first implementation of Toda's model (chapter 8). The next two chapters describe the works of Sloman (chapter 9) and Wright (chapter 10), whose model follows directly from Sloman's theories. Toda's approach has recently inspired many researchers to build emotional autonomous agents, some of which are described here (chapter 11). A final chapter (chapter 12) discusses the importance of the models described in this work for the psychological research into emotions.

This work is a thesis which I wrote in 1998 in order to obtain a diploma in psychology. My supervisors were Wulf-Uwe Meyer and Rainer Reisenzein, both from the University of Bielefeld. I am indebted to both of them, not only because they provided me with the opportunity to finish my studies after an interruption of 20 years, but also because they rekindled my interest in the psychology of emotions. Furthermore, they provided me with a lot of valuable suggestions.


2. Artificial Feelings

The most famous emotional computer is probably HAL 9000 from the movie "2001: A Space Odyssey" by Stanley Kubrick. It was a shock for many people to see a vision of an artificial intelligence equal, if not superior, to humans. But much more fearsome was that this machine had emotions, too, which finally led to the destruction of all humans aboard the spaceship.

It was probably no coincidence that one of the advisors to Stanley Kubrick was Marvin Minsky, one of the fathers of Artificial Intelligence. For Minsky, an emotional computer is a thoroughly realistic vision:

"...I don't think you can make AI without subgoals, and emotion is crucial for setting and changing subgoals. Kubrick probably put the emotion in to make good cinema, but it also happens to be very good science. For instance, HAL explains that the Jupiter mission is too important to be jeopardized by humans. It is through emotion that he sets the goals and subgoals, ultimately killing the humans..." (Stork, 1997, p. 29)

Nowadays, a lot of AI researchers accept the fact that emotions are imperative for the functioning of an "intelligent" computer. This insight does not stem from a deep reflection on the topic but rather from the failures of classical AI. The new catchphrase, therefore, is not AI but AE - artificial emotions.

Like the idea of an intelligent computer, the idea of an emotional computer constitutes for most people more of a threat than a hopeful vision. On the other hand, a strange fascination emanates from such a conception. It is no coincidence that emotional machines play an important role in popular culture.

Take "Terminator 2", for example. In James Cameron's movie, theTerminator is a robot without any feelings who learns to understandhuman emotions in the course of the story. There is even one scene inwhich it looks like he is able to experience an emotion itself, though thedirector leaves us speculating if this really is the case. Another example isthe robot from "No. 5 lives" which changes from a war machine into a"good human".

Robots are, at least in popular culture, often described as strange beings whose character is mainly threatening. This is a trait they share with "true" aliens. Remember Star Trek's Mr. Spock, who only seems to know logic but no feelings, like all inhabitants of his home planet, Vulcan. But in many episodes we come to see that even he cannot function without emotions.

And even "Alien", the monster from the films of the same name, terrifiesus by its ferocity and its malicious intelligence, but underneath it harboursat least some rudimentary feelings, as we can see in the fourth part of theseries.


One could thus conclude that a strange intelligence becomes really threatening to us humans only if it has at least a minimum of emotions. For if it consisted only of pure logic, its behaviour would be predictable and ultimately controllable by humans.

No surprise, then, that emotions have finally found their way into Artificial Intelligence. MIT's Affective Computing Group describes the need to develop emotional computers as follows:

"The importance of this follows from the work of Damasio and others whohave studied patients who essentially do not have "enough emotions" andconsequently suffer from impaired rational decision making. The nature oftheir impairment is oddly similar to that of today's booleandecision-making machines, and of AI's brittle expert systems. Recentfindings indicate now that in humans, emotions are essential for flexibleand rational decision making. Our hypothesis is that they will also beessential for machines to have flexible and rational decision making, aswell as truly creative thought and a variety of other human-like cognitivecapabilities."(Affective Computing Home Page)

Although the works of Damasio are quite recent, this position is not new but can be traced back to the 1960s. However, it has been forgotten - at least by most AI researchers. The utter inability of computers to execute complex activities autonomously has revived interest in this approach. Where in the past the emphasis of AI research lay with the representation of knowledge, this has now changed to the development of "intelligent autonomous agents".

The interest in autonomous agents results from practical requirements, too. Take space exploration, for example: Wouldn't it be great to send robots to faraway planets which can autonomously explore and react, because a remote control would be impractical or impossible over such a distance? Or take software agents which would be able to autonomously sift through the internet, decide which information is of use to their "master" and even change the course of their search independently?

Franklin and Graesser define an autonomous agent as follows:

"An autonomous agent is a system situated within and part of an environment that senses that environment and acts on it, over time, inpursuit of its own agenda and so as to effect what it senses in the future."(Franklin and Graesser, 1996, S. 4)

Picard puts forward a more specific definition for the use of "emotional" autonomous agents:

"One of the areas in which computer emotions are of primary interest issoftware agents, computer programs that are personalized - they know theuser's interests, habits and preferences - and that take an active role inassisting the user with work and information overload. They may also bepersonified, and play a role in leisure activities. One agent may act like anoffice assistant to help you process mail; another may take the form of an

9

Page 10: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

animated creature to play with a child."(Picard, 1997, S. 193f.)

According to this definition, autonomous agents can be implemented as pure software - something which is hotly debated by a number of researchers. Brustoloni (1991), for example, defines an autonomous agent as a system that is able to react appropriately and in real time to stimuli from a real, material environment in an autonomous and goal-directed way.

Pfeifer (1996), too, believes that a physical implementation is an indispensable condition for an autonomous agent, especially if it is to have emotions. His four basic principles for a real-life agent according to the "Fungus Eater" principle are:

a) autonomy

The agent must be able to function without human intervention, supervision, or direction.

b) self-sufficiency

The agent must be able to keep itself functioning over a longer period of time, i.e. to conserve or replenish its energy resources, to repair itself, etc.

c) embodiment

The agent must have a physical body through which it can interact with the physical world. This body is especially important:

"Although simulation studies can be extremely helpful in designing agents,building them physically leads to surprising new insights...Physicalrealization often facilitates solutions which might seem hard if consideredonly in an information processing context."(Pfeifer, 1996, S. 6)

d) situatedness

The agent must be able to control all its interactions with its environment itself and to let its own experiences influence these interactions.

A taxonomy for autonomous agents as proposed by Franklin and Graesser (1996) makes clear that autonomous agents of all kinds are not fundamentally different from humans:


Fig. 1: Taxonomy of autonomous agents (Franklin and Graesser, 1996, p. 7)

The demand for a physical implementation has led to a closer co-operation between robotics and AI. Individual aspects of the principles have already been realized through this co-operation, but there exists no implementation of a complete system which would satisfy all the described requirements (at least not until now).

Despite the increased interest in autonomous agents, the attempts to create intelligent machines so far must be regarded as failures, even if, for example, Simon (1996) is of a different opinion. Franklin (1995) outlines the three substantial AI debates of the last 40 years, which arose in each case from the failure of the preceding approaches.

It stands without doubt that these failures advanced the development of intelligent machines, but, as Picard (1997) points out, there is still a substantial part missing. And this part is the emotions.

It is interesting that the increasing interest in emotions in AI research has a parallel in the increasing interest of cognitive psychology in emotions. In the last decades, emotion psychology never had center stage but was relegated to the sidelines. This is changing considerably, certainly aided by recent discoveries from the neurosciences (see e.g. LeDoux, 1996) which attribute to the emotional subsystem a far greater importance for the functioning of the human mind than assumed so far.

A further parallel can be observed in the increasing interest in the topic of "consciousness". This discussion, too, was carried primarily from the circles of artificial intelligence, the neurosciences and philosophy into psychology. A cursory glance at some of the substantial publications shows, however, that the old dichotomy between cognition and emotion continues here: Almost none of the available works on consciousness discusses emotions.

This is all the more astonishing because it is undisputed that at least some emotions cannot exist without consciousness: specifically, all those emotions which presuppose a conception of "self", for example shame. One does not need to know the discussion around "primary" and "secondary" emotions to state that there are emotions which arise independently of consciousness, but likewise emotions which presuppose consciousness.


3. Strange brains

For as long as there have been computers, there have also been attempts to simulate processes of human thinking on them. The ability of an electronic machine to read, manipulate and output information at high speed tempted researchers from the outset into speculations about the equivalence of computers and brains.

Such speculations soon found their way into psychology. In particular, the ability of calculating machines to process information in parallel corresponded to approaches in psychology which regarded the brain predominantly as a parallel processing system.

Against this background, computers were regarded as a way of clearing up still unexplored phenomena of the human mind through modelling. A good example is the Pandemonium model of Oliver Selfridge (Selfridge, 1959). Pandemonium is a model for visual pattern recognition. It consists of a multiplicity of demons working in parallel, each of which is specialized in recognizing a certain visual stimulus, for example a horizontal bar in the center of the presented stimulus or a curvature in the upper right corner.

If a demon recognizes "its" stimulus, it calls out to the central master demon. The more highly the demon estimates the probability of a correct identification, the louder this call. All demons work independently from one another; none is affected by its neighbours.

On the basis of the received information, the master demon then decides which pattern constitutes the stimulus. In a later refinement of the model, the demons were organized hierarchically in order to relieve the master demon.
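The division of labour between the feature demons and the master demon can be made concrete in a few lines of code. The following Python sketch is only an illustration of the idea; the feature names, the pattern mapping and the confidence values are invented and are not part of Selfridge's original model.

# Minimal, hypothetical sketch of the Pandemonium idea.
# Each feature demon inspects the stimulus independently and "shouts"
# with a loudness proportional to its confidence; the master demon
# picks the pattern whose accumulated evidence is loudest.

def horizontal_bar_demon(stimulus):
    return stimulus.get("horizontal_bar_center", 0.0)

def upper_right_curve_demon(stimulus):
    return stimulus.get("curvature_upper_right", 0.0)

PATTERN_EVIDENCE = {            # invented mapping of demons to patterns
    "A": [horizontal_bar_demon],
    "D": [upper_right_curve_demon],
}

def master_demon(stimulus):
    """Listen to all demons and decide which pattern the stimulus is."""
    shouts = {pattern: sum(demon(stimulus) for demon in demons)
              for pattern, demons in PATTERN_EVIDENCE.items()}
    best = max(shouts, key=shouts.get)
    return best, shouts[best]

print(master_demon({"horizontal_bar_center": 0.9, "curvature_upper_right": 0.2}))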

There is an astonishing similarity between these specialized demons in Selfridge's model and the feature detector cells that actually exist in the visual cortex. And indeed it was Selfridge's model and its assumptions about perception processes which suggested for the first time that such feature detectors could exist in humans. In this case, the model was the reason for neurophysiologists to look for the appropriate cells.

Thus Pandemonium is a good example of how computer models can advance psychological research. On the other hand, it should not be forgotten that a system such as Pandemonium is unable to really "see". And this is a key point for critics who grant computer modelling a limited heuristic use but otherwise deny any equivalence between humans and machines.

"computer". Both the AI research and the past approaches for thedevelopment of emotional systems assume without further doubt that"intelligence" and "emotion" i na computer are not fundamentally different

emotional computers: the equivalence of the systems "humans" andThis is also one of the fundamental questions for the development of

13

Page 14: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

from intelligence and emotion in humans.

This assumption abstracts to a large extent from the specific hardware; "emotion" is understood as a pure software implementation. But it is quite questionable whether the two systems obey the same laws of development.

A computer is a discrete system whose basic elements know only two different states. By combining several such elements, "intermediate states" can be created, though only at a certain level of abstraction from the underlying hardware.

In contrast, the physiology of the emotional and cognitive system in humans by no means represents a comparable mechanism, but consists, even on the lowest level, of a multiplicity of mechanisms, some of which work more according to digital principles, others more according to analog principles.

Even one of the best researched mechanisms, the functioning of the neurons, is not exclusively an on/off mechanism, but consists of a multiplicity of differentiated partial mechanisms - and this at the level of the hardware.

The simulation of such mechanisms with computers is at present only possible as software. Simple neural switching patterns can, up to a certain point, also be modelled by parallel computers; such a modelling is possible, however, only within certain limits and completely ignores chemical processes which play an important part in the human brain.

Picard (1997) tries to solve the problem by abstracting from the difference between hardware and software and defining both as "computers". She justifies this position with the argument that emotional software agents can exist on "emotion-free" hardware.

A similar discussion deals with the comparability of the emotions of humans and animals (see Dawkins, 1993). Here at least we have hardware of identical elements, although of different complexity. In this case, too, it is not considered scientifically settled whether an emotion like "mourning" is identical in humans and animals.

The matter is made more difficult still by the question of whether a computer can in principle be considered a form of life. In the "Artificial Life" discussion of recent years, some attention has been given to this question. The evolutionary biologist Richard Dawkins (Dawkins, 1988), for instance, holds the opinion that the ability to reproduce is already sufficient to speak of life in the biological sense. Others extend the definition by the components "self-organization" and "autonomy".

If one ignores the ethical and philosophical discussion of "life" and concentrates on the aspects "self-organization" and "autonomy", then it is quite realistic to attribute these characteristics to computers and/or software. Self-organization in the sense of adaptation can be observed, for example, in neural nets working with genetic algorithms (see e.g. Holland, 1998). Autonomy in a reduced sense can be observed in robots and/or partly in autonomously operating programs, for example agents for the internet. Such programs also possess the ability for reproduction, which would fulfil the third condition.


The emphasis of AI and AL research lies at present on the advancement of such autonomous, self-organizing systems. The models used are partly based on functional models of the human brain; this should not tempt one, however, to rashly equate their operating mode with that of the human brain.

Especially when software is optimized by means of genetic algorithms, it is frequently not known to human observers which self-organization processes the software uses in order to reach the optimization goal.

Sometimes, though, it might be useful to attribute mental abilities to a computer. Thus John McCarthy, one of the pioneers of artificial intelligence, explains:

".. Although we may know the program, its state at a given moment isusually not directly observable, and the facts we can obtain about itscurrent state may be more readily expressed by ascribing certain beliefsand goals than in any other way... Ascribing beliefs may allow derivinggeneral statements about the program's behavior that could not beobtained from any finite number of simulations.. The beliefs and goalstructures we ascribe to the program may be easier to understand than thedetails of the program as expressed in its listing... The difference betweenthis program and another actual or hypothetical program may be bestexpressed as a difference in belief structure."(McCarthy, 1990, p. 96)

According to these remarks, the attribution of mental abilities is only of a functional nature: It serves to express information about the state of a machine at a given time which otherwise could only be expressed through lengthy and complex descriptions of details.

McCarthy lists a number of mental characteristics which can be attributed to computers: introspection and self-knowledge, consciousness and self-consciousness, language and thinking, intentions, free will, understanding and creativity. At the same time he warns against equating such attributed mental qualities with human characteristics:

"The mental qualities of present machines are not the same as ours. Whilewe will probably be able, in the future, to make machines with mentalqualities more like our own, we'll probably never want to deal with acomputer that loses its temper, or an automatic teller that falls in love?Computers will end up with the psychology that is convenient to theirdesigners..."(McCarthy, 1990, p. 185f.)

We now know that the last sentence need not necessarily be correct. There are first examples of self-organizing and self-optimizing hardware (Harvey and Thompson, 1997) whose modes of functioning are not known to their human designers. And the current approaches in the design of emotional computers go far beyond modelling: they try to develop computers whose mental qualities are not pre-defined by the designer but develop independently.

Although naturally certain basic assumptions of the designers flow into such systems, this approach is nevertheless fundamentally different from the classical modelling approach which can be observed in cognitive science. The question remains, however, whether the processes in a computer which, due to this procedure, one day actually develops mental qualities are identical with the processes in the human body and brain.

Critics of such an approach point out that emotions are not comparable to purely cognitive processes, since they are affected by additional factors (e.g. hormones) and also require a subject. The modelling of these processes within a computer, a purely cognitive construction, would therefore be impossible; all the more so because a machine lacks the subjective element without which an emotion, whose substantial component is a feeling, could not be felt. There are several answers to this argument.

On the one hand, one cannot rule out that a computer can possess "feelings". From an evolutionary viewpoint, computers are an extremely young phenomenon which have, in their short existence, made a number of giant steps. Today there exist machines with hundreds of parallel processors; impressive research progress is being made with biological and quantum computers. It might be just a question of time until a computer possesses hardware of similar complexity to the human brain. With increasing complexity the probability also increases that such a system will organize itself on a higher level. What must be laboriously programmed as a "monitoring instance" today might develop into something which one day might be called the "ego" of a computer.

On the other hand, it would be extremely anthropocentric to deny emotions to an intelligent system just because it does not possess human hormones. A computer consists of a multitude of "physiological" processes which could be perceived as "bodily feelings" once the system has been equipped with a proprioceptive subsystem. If, in addition, this computer is able to learn and move, one could imagine it reacting to certain situations with a change of such processes which for it possess the same value as physiological changes in our body.

An emotional computer need not experience emotions like a human - no more than a visitor from Zeta Epsilon would. Nevertheless, its emotions can be as genuine to it as ours are to us - and influence its thoughts and actions just as much.

We therefore cannot assume a priori that "emotions" developed by a computer are comparable to human emotions. But it is thoroughly justified to assume that the emotions of a computer serve the same functions for it as ours do for us humans. If this is the case, the computer modelling of emotions would not only be a way to learn more about human emotions; it would at the same time lay the foundations for a time when intelligent systems made of different "building blocks" will co-operate with one another.


4. Theoretical foundations

In this part I give a short overview of the psychological theories of emotion which form the basis of the computer models described in the following chapters. I do not aim to describe and evaluate each theoretical approach in its entirety; rather, I present some core elements of the respective theoretical approaches insofar as they are taken up in the computer models.

It is interesting to note that the majority of the computer models of emotions, if they refer expressly to psychological theories, are based on the so-called appraisal theories.

The fascination with these approaches probably stems from the fact that they can be converted (relatively) simply into program code.

4.1. The theory of Ortony, Clore and Collins

Ortony, Clore and Collins (1988) developed their theoretical approach expressly with the aim of implementing it in a computer:

"..., we would like to lay the foundation for a computationally tractable model of emotion. In otherwords, we would like an account of emotion that could in principle be used in an Artificial Intelligence(AI) system that would, for example, be able to reason about emotion."(Ortony, Clore and Collins, 1988, p. 2)

The theory of Ortony, Clore and Collins assumes that emotions develop as a consequence of certain cognitions and interpretations. Therefore it exclusively concentrates on the cognitive elicitors of emotions.

The authors postulate that three aspects determine these cognitions: events, agents, and objects.

"When one focuses on events one does so because one is interested in their consequences, when onefocuses on agents, one does so because of their actions, and when one focuses on objects, one isinterested in certain aspects or imputed properties of them qua objects."(Ortony, Clore and Collins, 1988, p. 18)

Their central assumption is that emotions represent valenced reactions to these perceptions of the world. One can be pleased about the consequences of an event or not (pleased/displeased); one can endorse or reject the actions of an agent (approve/disapprove); or one can like or dislike aspects of an object (like/dislike).

A further differentiation is that events can have consequences for others or for oneself, and that the acting agent can be another person or oneself. The consequences of an event for another can be divided into desirable and undesirable; the consequences for oneself into relevant or irrelevant expectations. Relevant expectations for oneself can finally be differentiated according to whether they actually occur or not (confirmed/disconfirmed).

This differentiation leads to the following structure of emotion types:


Fig. 2: Structure of emotion types in the theory of Ortony, Clore and Collins (after Ortony, Clore and Collins, 1988, p. 19)

The intensity of an emotional feeling is determined predominantly by three central intensity variables: Desirability is linked with the reaction to events and is evaluated with regard to goals. Praiseworthiness is linked with the reaction to actions of agents and is evaluated with regard to standards. Appealingness is linked with the reaction to objects and is evaluated with regard to attitudes.

The authors further define a set of global and local intensity variables. Sense of reality, proximity, unexpectedness and arousal are the four global variables which operate over all three emotion categories. The local variables, to which the central intensity variables mentioned above also belong, are:


EVENTS: desirability, desirability for other, deservingness, liking, likelihood, effort, realization

AGENTS: praiseworthiness, strength of cognitive unit, expectation deviation

OBJECTS: appealingness, familiarity

Table 1: Local variables in the theory of Ortony, Clore and Collins (after Ortony, Clore and Collins, 1988, p. 68ff.)

In a concrete case, each of these variables is assigned a value and a weight. Furthermore, there is a threshold value for each emotion, below which an emotion is not subjectively felt.

On the basis of this model, the emergence of an emotion can be described in formal language: Let D(p,e,t) be the desirability (D) of an event (e) for a person (p) at a certain time (t). This function has a positive value for a desirable event and a negative value for an undesirable event. Furthermore, let Ig(p,e,t) be a combination of global intensity variables and Pj(p,e,t) the potential for a state of joy. Then the following rule for "joy" can be provided:

IF D(p,e,t) > 0

THEN set Pj(p,e,t) = fj(D(p,e,t), Ig(p,e,t))

The resulting function fj triggers a further rule which determines the intensity of joy (Ij) and thereby makes possible the experience of the joy emotion. Let Tj be a threshold value, then:

IF Pj(p,e,t) > Tj(p,t)

THEN set Ij(p,e,t) = Pj(p,e,t) - Tj(p,t)

ELSE set Ij(p,e,t) = 0

If the threshold value is exceeded, this rule produces the emotion of joy; otherwise it supplies the value "zero", i.e. no emotional feeling. Depending upon the intensity of the emotion, different tokens are used for its description. Such tokens are words which describe this emotion.

Ortony, Clore and Collins supply no formalization for all of their defined emotions but give only a few examples. They postulate, however, that every emotion can be described using a formal notation, although with many emotions this is by far more complex than with the presented example.
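The joy rule above can be transcribed almost literally into code. The following Python sketch is mine, not the authors'; the form of the combination function fj and the concrete numbers are placeholder assumptions.

# Sketch of the joy rule described above (placeholder function and values).

def f_joy(desirability, global_intensity):
    # Hypothetical combination function fj; Ortony, Clore and Collins do not fix its form.
    return desirability * global_intensity

def joy_potential(D, I_g):
    # IF D(p,e,t) > 0 THEN set Pj(p,e,t) = fj(D(p,e,t), Ig(p,e,t))
    return f_joy(D, I_g) if D > 0 else 0.0

def joy_intensity(P_j, T_j):
    # IF Pj > Tj THEN Ij = Pj - Tj ELSE Ij = 0
    return P_j - T_j if P_j > T_j else 0.0

P = joy_potential(D=1.0, I_g=0.5)     # a desirable event with moderate global intensity
print(joy_intensity(P, T_j=0.25))     # 0.25 -> joy is felt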

With the help of such a formal system a computer should be able to draw conclusions about emotional episodes which are presented to it. The authors limit their goal quite explicitly:

"Our interest in emotion in the context of AI is not an interest in questions such as "Can computersfeel?" or "Can computers have emotions?" There are those who think that such questions can beanswered in the affirmative..., however, our view is that the subjective experience of emotion iscentral, and we do not consider it possible for computers to experience anything until and unless they

19

Page 20: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

are conscious. Our suspicion is that machines are simply not the kinds of things that can be conscious.However, our skepticism over the possibility of machines having emotions certainly does not meanthat we think the topic of emotions is irrelevant for AI..... There are many AI endeavors in which theability to understand and reason about emotions or aspects of emotions could be important."(Ortony, Clore and Collins, 1988, p. 182)

4.2. The theory of Roseman

The theory of Roseman, which he presented for the first time in 1979 (Roseman, 1979), was modified by him several times in the following years. It changed in (sometimes substantial) details; only the basic approach of an appraisal theory of emotions remained the same.

Roseman developed his first theory based upon 200 written reports of emotional experiences. From the analysis of these documents, he derived his model, in which five cognitive dimensions determine whether an emotion arises and which one it is.

The first dimension describes whether a person is motivated towards a desired situational state or away from an unwanted situational state. This dimension thus has the states "positive" and "negative".

The second dimension describes whether the situation agrees with the motivational state of the person or not. This dimension thus has the states "situation present" and "situation absent".

The third dimension describes whether an event is perceived as certain or only as a possibility. This dimension has the states "certain" and "uncertain".

The fourth dimension describes whether a person perceives the event as deserved or undeserved, with the two states "deserved" and "undeserved".

The fifth dimension, finally, describes from whom the event originates. This dimension has the states "the circumstances", "others" and "oneself".

From the combination of these five dimensions and their values, a table can be arranged (Roseman, 1984) from which, according to Roseman, emotions can be predicted.

Altogether, 48 combinations can be formed from Roseman's dimensions (positive/negative x present/absent x certain/uncertain x deserved/undeserved x circumstances/others/oneself). According to Roseman, 13 emotions correspond to these 48 cognitive appraisals.
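A hypothetical Python sketch of how these 48 appraisal combinations could be enumerated; the dimension values follow the text above, while the table mapping combinations to the 13 emotions is only indicated by a single invented placeholder entry.

# Enumerating the appraisal dimensions of Roseman's first model (illustrative only).
from itertools import product

DIMENSIONS = {
    "motivation": ["positive", "negative"],
    "situation": ["present", "absent"],
    "certainty": ["certain", "uncertain"],
    "legitimacy": ["deserved", "undeserved"],
    "agency": ["circumstances", "others", "oneself"],
}

combinations = list(product(*DIMENSIONS.values()))
print(len(combinations))    # 48

# Roseman's table assigns one of 13 emotions to each combination; only a
# placeholder entry is shown here, since the full table is not reproduced above.
EMOTION_TABLE = {
    ("positive", "present", "certain", "deserved", "circumstances"): "joy",
}

def predict_emotion(appraisal):
    return EMOTION_TABLE.get(tuple(appraisal), "not specified in this sketch")

print(predict_emotion(["positive", "present", "certain", "deserved", "circumstances"]))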

After experimental examinations of this approach did not furnish the results postulated by Roseman, he modified his model (Roseman, 1984). The second dimension of his original model (situation present or absent) now contained the states "motive consistent" and "motive inconsistent", whereby "motive consistent" always corresponds to the value "positive" of the first dimension and "motive inconsistent" to the value "negative" of the first dimension. In place of the alternatives "present" and "absent", the terms "appetitive" and "aversive" were now used.

A further correction concerned the fourth dimension of the original model (deserved/undeserved). Roseman replaced it by the dimension of strength, i.e. whether a person in a given situation perceives himself or herself as strong or weak. The states of this dimension are thus "strong" and "weak".

Roseman also supplemented the third dimension of his original model (certain/uncertain) with a further state: "unknown". This was necessary in order to incorporate the emotion of surprise into his model.

Roseman concedes (Roseman et al., 1996) that this model, too, could not be empirically validated. As a consequence he developed a third version of his theory (Roseman et al., 1996). It differs from his


second approach in several points: The fourth dimension (strong/weak) is replaced by a relational appraisal of one's own control potential, with the states "low" and "high". The value "unknown" of the third dimension is replaced by the state "unexpected", since this, according to Roseman, is the condition for the emotion of surprise. And finally, Roseman adds yet another dimension for the negative emotions, which he calls "type of problem". It describes whether an event is perceived as negative because it blocks a goal (with the result "frustration") or because it is negative in its nature (with the result "abhorrence"). This dimension has the states "non-characteristic" and "characteristic".

How far this (as of now) last model by Roseman can be proven empirically cannot be said. One weakness of the model, however, is evident: It has problems dealing with a situation in which one person makes two different appraisals. If, for example, a student is of the opinion that his teacher gives him a test that is not fair but knows at the same time that he has not sufficiently prepared for the test, then Roseman's model cannot clearly predict what the student's emotions are - because two states of the fifth dimension are present at the same time.

Because of their simple structure, which can be translated quickly into rules that exactly define which appraisals elicit which emotions, Roseman's models were received very positively in AI circles. Dyer's model BORIS is based on Roseman's first model, and Picard writes: "Overall, it shows promise for implementation in a computer, for both reasoning about emotion generation, and for generating emotions based on cognitive appraisals." (Picard, 1997, p. 209)

4.3. The theory of Scherer

For Scherer, five functionally defined subsystems are involved in emotional processes. An information-processing subsystem evaluates the stimulus through perception, memory, forecast and evaluation of available information. A supporting subsystem adjusts the internal condition through control of neuroendocrine, somatic and autonomic states. A leading subsystem plans, prepares actions and selects between competing motives. An acting subsystem controls motor expression and visible behaviour. A monitor subsystem, finally, controls the attention which is assigned to the present states and passes the resulting feedback on to the other subsystems.

Scherer is especially interested in the information-processing subsystem. According to his theory, this subsystem is based on appraisals which Scherer calls stimulus evaluation checks (SECs). The results of these SECs in turn cause changes in the other subsystems.

Scherer sees five substantial SECs, four of which possess further subchecks. The novelty check decides whether external or internal stimuli have changed; its subchecks are suddenness, familiarity and predictability. The intrinsic pleasantness check specifies whether the stimulus is pleasant or unpleasant and causes corresponding approach or avoidance tendencies. The goal significance check decides whether the event supports or hinders the goals of the person; its subchecks are goal relevance, probability of result, expectation, support character and urgency. The coping potential check determines to what extent the person believes to have the event under control; its subchecks are agent, motive, control, power and adaptability. The compatibility check, finally, compares the event with internal and external standards; its subchecks are externality and internality.

Each emotion can thus, according to Scherer, be clearly determined by a combination of the SECs and subchecks. An appropriate table with such allocations can be found in [Scherer, 1988]. A number of empirical studies have supported Scherer's model so far.
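A minimal sketch, assuming a dict-based representation, of how an SEC result could be compared with stored emotion profiles; the profile values below are invented placeholders and do not reproduce Scherer's actual table.

# Sketch of stimulus evaluation checks (SECs) matched against emotion profiles.
EMOTION_PROFILES = {                      # invented placeholder profiles
    "fear": {"novelty": "high", "intrinsic_pleasantness": "low",
             "goal_significance": "obstructive", "coping_potential": "low"},
    "joy": {"novelty": "low", "intrinsic_pleasantness": "high",
            "goal_significance": "conducive", "coping_potential": "high"},
}

def best_match(sec_result):
    """Return the stored profile that shares the most check outcomes with the SEC result."""
    def overlap(profile):
        return sum(sec_result.get(check) == value for check, value in profile.items())
    return max(EMOTION_PROFILES, key=lambda emotion: overlap(EMOTION_PROFILES[emotion]))

print(best_match({"novelty": "high", "coping_potential": "low"}))   # fear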

4.4. The theory of Frijda

Frijda points out that the word "emotion" does not refer to a "natural class" and that it is not able to refer to a well-defined class of phenomena which are clearly distinguishable from other mental and behavioural events. For him, therefore, the process of emotion emergence is of greater interest.

At the center of Frijda's theory is the term concern. A concern is the disposition of a system to prefer certain states of the environment and of its own organism over the absence of such states. Concerns produce goals and preferences for a system. If the system has problems realizing these concerns, emotions develop. The strength of such an emotion is determined essentially by the strength of the relevant concern(s).

Frijda defines six substantial characteristics of the emotion system which describe its function:

1. Concern relevance detection: The emotion subsystem announces the meaning of events for the concerns of the overall system to all other components of the system. This signal Frijda calls affect. This means the system must be able to pick up information from the environment and from its own system.

2. Appraisal: Next, the meaning of the stimulus for the concerns of the system has to be appraised. This is a two-stage process with the subprocesses relevance appraisal and context appraisal.

3. Control precedence: If the relevance signal is strong enough, it changes the priorities of perception, attention and processing. It produces a tendency to affect the behaviour of the system. Frijda calls this control precedence.

4. Action readiness changes: According to Frijda, this represents the heart of the emotional reaction. A change of action readiness means changes in the dispatching of processing and attention resources as well as in the tendency towards certain kinds of actions.

5. Regulation: Apart from the activation of certain forms of action readiness, the emotion system monitors all processes of the overall system and events of the environment which can affect this action readiness, in order to be able to intervene accordingly.

6. Social nature of the environment: The emotion system is adjusted to the fact that it operates in a predominantly social environment. Many appraisal categories are therefore of a social nature; action readiness is predominantly a readiness for social actions.

For Frijda, emotions are absolutely necessary for systems which realize multiple concerns in an uncertain environment. If a situation occurs in which the realization of these concerns appears endangered, so-called action tendencies develop. These action tendencies are linked closely with emotional states and serve as a safety device for what Frijda calls concern realization (CR).

As substantial action tendencies, Frijda (1986) defines the following (associated emotions in parentheses):

Approach (Desire)
Avoidance (Fear)
Being-with (Enjoyment, Confidence)
Attending (Interest)
Rejecting (Disgust)
Nonattending (Indifference)
Agonistic (Attack/Threat) (Anger)
Interrupting (Shock, Surprise)
Dominating (Arrogance)
Submitting (Humility, Resignation)

According to Frijda, a functioning emotional system must have the following components:

Concerns: Internal representations against which the existing conditions are tested.


Action Repertoire: Consisting of fast emergency reactions, social signals and mechanisms to develop new plans.

Appraisal Mechanisms: Mechanisms which establish the fit between events and concerns as well as connections to the action control system and the action repertoire.

Analyser: Observes the incoming information and codes it with regard to its implications and consequences.

Comparator: Tests all information for concern relevance. The results are relevance signals, which activate the action system and the Diagnoser and cause attentional arousal.

Diagnoser: Responsible for context evaluation, scanning the information for action-relevant references. It performs a number of tests (e.g. whether the consequences of an event are certain or uncertain, who is responsible for it, etc.) and produces an appraisal profile.

Evaluator: Agreement or discrepancy signals of the Comparator and the profile of the Diagnoser are combined into the final relevance signal and its intensity parameter. The intensity signals the urgency of an action to the action system. The relevance signal constitutes the so-called control precedence signal.

Action Proposer: Prepares the action by selecting a suitable alternative course of action and by making available the resources necessary for it.

Actor: Generates actions.

This general description of an emotional system can be formalized in such a way that it can form the basis for a computer model:

Fig. 3: Frijda's emotion system (Frijda and Moffat, 1994)
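To show how the components listed above could be chained, here is a hypothetical Python sketch of the processing sequence from Analyser to Actor; the concerns, thresholds and the action repertoire are all invented for illustration and do not come from Frijda.

# Hypothetical sketch of Frijda's processing chain:
# Analyser -> Comparator -> Diagnoser -> Evaluator -> Action Proposer -> Actor.

CONCERNS = {"safety": 0.9, "curiosity": 0.4}                          # invented concerns and strengths
ACTION_REPERTOIRE = {"avoidance": "run away", "approach": "inspect"}  # invented actions

def analyser(event):
    """Code the incoming event with regard to its implications."""
    return {"threatens": event.get("threatens"), "novel": event.get("novel", False)}

def comparator(coded):
    """Relevance signal: the strength of the concern the event touches."""
    return CONCERNS.get(coded["threatens"], 0.0)

def diagnoser(coded):
    """Context appraisal: a crude appraisal profile."""
    return {"tendency": "avoidance" if coded["threatens"] else "approach"}

def evaluator(relevance, profile):
    """Combine relevance and profile into a control precedence signal with intensity."""
    return {"control_precedence": relevance > 0.5, "intensity": relevance, **profile}

def action_proposer(signal):
    return ACTION_REPERTOIRE[signal["tendency"]]

def actor(action):
    print("executing:", action)

coded = analyser({"threatens": "safety", "novel": True})
signal = evaluator(comparator(coded), diagnoser(coded))
if signal["control_precedence"]:
    actor(action_proposer(signal))    # executing: run away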


The theory outlined so far was presented by Frijda in 1986. The computer model ACRES (Frijda and Swagerman, 1987), which is described further below, is based on it. The evaluation of ACRES led Frijda to make a number of modifications to his theoretical approach. These are likewise described further below, in connection with the computer model WILL (Moffat and Frijda, 1995).

4.5. The theory of Oatley & Johnson-Laird

Oatley and Johnson-Laird developed their theory expressly in a form which can be implemented as a computer model, even though they did not carry out this step themselves. They see the necessity for their model in the fact that almost all computer models of the human mind did not consider emotions, which they regard as a central component of the organization of cognitive processes.

In their theory, which they call the "communicative theory of emotions" (Oatley & Jenkins, 1996, p. 254), Oatley and Johnson-Laird assume a hierarchy of processing instances working in parallel and asynchronously on different tasks. These instances are coordinated by a central control system (or operating system). The control system contains a model of the entire system.

The individual modules of the system communicate with one another so that the system can function at all. According to Oatley and Johnson-Laird there are two kinds of communication. They call the first kind propositional or symbolic; through it, actual information about the environment is conveyed. The second kind of communication is nonpropositional or of an emotional nature. Its task is not to convey information but to shift the entire system of modules into a state of increased attention, the so-called emotion mode. This function is comparable to global interrupt programs on computers:

"Emotion signals provide a specific communication system which can invoke the actions of someprocessors [modules] and switch others off. It sets the whole system into an organized emotion modewithout propositional data having to be evaluated by a high-level conscious operating system...Theemotion signal simply propagates globally through the system to set into one of a small number ofemotion modes."(Oatley & Johnson-Laird, 1987, p. 33)

According to Oatley, the central postulate of the theory is:

"Each goal and plan has a monitoring mechanism that evaluates events relevant to it. When asubstantial change of probability occurs of achieving an important goal or subgoal, the monitoringmechanism broadcasts to the whole cognitive system a signal that can set it into readiness to respondto this change. Humans experience these signals and the states of readiness they induce as emotions."(Oatley, 1992, p. 50)

Emotions coordinate quasi-autonomous processes in the nervous system by communicating significant waypoints of current plans (plan junctures). Oatley and Johnson-Laird connect such plan junctures with elementary emotions:

Plan juncture Emotion

Subgoals being achieved Happiness

Failure of major plan Sadness

Self-preservation goal violated Anxiety

Active plan frustrated Anger

Gustatory goal violated Disgust

24

Page 25: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

Table 2: Plan junctures (after Oatley, 1992, p. 55)

Since they arise at plan junctures, emotions are a design solution for problems of plan changes in systems with a multiplicity of goals.

The name "communicative theory of emotions" was chosen because it is the task of emotions toconvey certain informations to all modules of the overall system.

Following a suggestion by Sloman, Oatley further specified that there are two kinds of signals in the model: semantic signals and control signals. The two can occur together, but do not have to. Thus Oatley (1992) states that his model is the only one which can explain a vague emotional condition: in this case only the control signals are active, not the semantic ones.
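The communicative idea, namely that a plan juncture broadcasts a control signal which puts every module into an emotion mode while a semantic signal carrying content is optional, can be sketched as follows. The juncture table restates Table 2 above; the module names and everything else are invented for illustration.

# Sketch of the communicative theory: a control signal sets an emotion mode
# in every module; the semantic signal carrying content is optional, so a
# "vague" emotional state has only the control part.

PLAN_JUNCTURES = {                          # after Table 2 above
    "subgoals being achieved": "happiness",
    "failure of major plan": "sadness",
    "self-preservation goal violated": "anxiety",
    "active plan frustrated": "anger",
    "gustatory goal violated": "disgust",
}

MODULES = ["perception", "planning", "memory", "motor"]    # invented module names

def broadcast(juncture, semantic_content=None):
    """Propagate the emotion mode globally to all modules."""
    mode = PLAN_JUNCTURES[juncture]         # control signal
    return {module: {"emotion_mode": mode, "content": semantic_content}
            for module in MODULES}

# A vague emotional state: control signal only, no semantic signal.
print(broadcast("failure of major plan")["planning"])
# {'emotion_mode': 'sadness', 'content': None}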


5. Electronic assistants

There exists a variety of models which employ computers in order to recognize emotions or to represent them. These models are not "emotional computers" in a narrow sense, because their "emotional" components are pre-defined elements and not a subsystem which developed independently.

The models described in this chapter are, to a large extent, rule-based production systems. Thus they arealso symbol-processing systems. From the sixties until today a spirited discussion has taken place whetheror to which extent the human mind is a symbol-processing system and to what extent symbol-processingcomputer models can be a realistic approximation to its real workings (see e.g. Franklin, 1995).

A rule-based production system has as minimum requirements a set of standard components:

1. 1. a so-called knowledge base that contains the processing rules of the system;

2. 2. a so-called global database that represents the main memory of the system;

3. 3. a so-called control structure which analyzes the contents of this global database and decideswhich processing rules of the knowledge base are to be applied.

A more detailed description of a rule-based production system is supplied by Franklin (1995) with the example of SOAR: the system operates within a defined problem space; the production process of the system is the use of appropriate condition-action rules which transform the problem space from one state into another.
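A minimal sketch of such a production system might look as follows in Python. It is not a reconstruction of SOAR or of any of the models discussed below; the rule contents and names are invented for illustration:

# global database: the current state of the system (its main memory)
database = {"goal": "greet user", "user_present": True}

# knowledge base: condition-action pairs (production rules)
def should_greet(db):
    return db.get("goal") == "greet user" and db.get("user_present")

def greet(db):
    db["output"] = "Hello!"
    db["goal"] = "wait for input"

knowledge_base = [(should_greet, greet)]

# control structure: analyze the database and decide which rule to apply
def control_cycle(db, rules):
    for condition, action in rules:
        if condition(db):
            action(db)        # transforms the database from one state into another
            return True
    return False              # no rule applicable, processing stops

while control_cycle(database, knowledge_base):
    pass

print(database)   # {'goal': 'wait for input', 'user_present': True, 'output': 'Hello!'}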

The models of Dyer, Pfeifer, Bates and Reilly, and Elliott presented here can be regarded as rule-based production systems. Scherer's model forms an exception insofar as it is an implementation which does not work rule-based. Its underlying approach is, however, an appraisal theory and could easily be implemented as a production system.

5.1. The models of Dyer

Dyer has developed three models in all: BORIS, OpEd and DAYDREAMER. BORIS and OpEd are systems which can infer emotions from texts; DAYDREAMER is a computer model which can generate emotions.

Dyer regards emotions as an emergent phenomenon:

"Neither BORIS, OpEd, nor DAYDREAMER were designed to address specifically the problem of emotion. Rather, emotion comprehension and emotional reactions in these models arise through the interaction of general cognitive processes of retrieval, planning and reasoning over memory episodes, goals, and beliefs."(Dyer, 1987, p. 324)

These "general cognitive processes" are realized by Dyer in the form of demons, specialized program subroutines which are activated under certain conditions and can accomplish specific tasks independently of one another. After completing their work these demons "die" or spawn new subroutines.

"In BORIS, "disappointed" caused several demons to be spawned. One demon used syntactic knowledge to work out which character x was feeling the disappointment. Another demon looked to see if x had suffered a recent goal failure and if this was unexpected."(Dyer, 1987, p. 332)
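The demon mechanism can be illustrated with a rough sketch. This is a loose paraphrase of Dyer's description, not actual BORIS code; the data structures and the two demons are invented for the example:

story = {
    "characters": {"Paul": {"recent_goal_failure": True, "expected_failure": False}},
    "sentence": {"word": "disappointed", "subject": "Paul"},
}

active_demons = []

def spawn(demon):
    active_demons.append(demon)

# demon 1: use (here trivial) syntactic knowledge to find who feels the emotion
def find_experiencer(st):
    st["experiencer"] = st["sentence"]["subject"]

# demon 2: check whether that character suffered an unexpected goal failure
def check_goal_failure(st):
    x = st.get("experiencer")
    if x is not None:
        info = st["characters"][x]
        st["disappointment_explained"] = (
            info["recent_goal_failure"] and not info["expected_failure"]
        )

# reading the word "disappointed" spawns both demons
spawn(find_experiencer)
spawn(check_goal_failure)

# demons run when their conditions are met and then "die"
for demon in active_demons:
    demon(story)
active_demons.clear()

print(story["disappointment_explained"])   # True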

5.1.1. BORIS

BORIS is based on a so-called affect lexicon which possesses six components: a person who feels the emotion; the polarity of the emotion (positive - negative); one or more goal attainment situations; the thing or the person toward which the emotion is directed; the strength of the emotion; and the respective expectation.

With these components, emotions present themselves as follows in BORIS:

Emotion: relief

Person: x

Polarity: positive

Directed at: -/-

Goal attainment: goal attained

Expectation: Expectation not fulfilled

In this case the person x did not expect to achieve her goal. This expectation was not fulfilled. Now the person x experiences a positive excitation state felt by her as relief (after Dyer, 1987, p. 325).
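Such an affect lexicon entry can be rendered as a simple record. The following sketch merely mirrors the six components listed above; the field names are translations chosen for this example, not Dyer's original identifiers:

from dataclasses import dataclass
from typing import Optional

@dataclass
class AffectEntry:
    emotion: str
    person: str
    polarity: str                  # "positive" or "negative"
    goal_situation: str            # e.g. "goal attained", "goal blocked"
    directed_at: Optional[str]     # object or person the emotion is directed at
    strength: float
    expectation: str               # e.g. "expectation not fulfilled"

relief = AffectEntry(
    emotion="relief",
    person="x",
    polarity="positive",
    goal_situation="goal attained",
    directed_at=None,
    strength=1.0,
    expectation="expectation not fulfilled",
)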

In similar form, emotions such as happy, sad, grateful, angry-at, hopeful, fearful, disappointed, guilty etc. are represented in BORIS.

This underlines the goal Dyer pursues with BORIS: all emotions can be represented in BORIS in the form of a negative or positive excitation state, connected with information about the goals and expectations of a person.

Dyer points out that with the help of the variables specified by him one can also represent emotions for which there is no appropriate word in a given language.

With the help of this model BORIS can draw conclusions about the respective goal attainment situation of a person, understand and generate text that contains descriptions of emotions, and understand and compare the meanings of emotional terms. The system is also able to represent multiple emotional states.

From the executed goal/plan analysis of a person and its result, BORIS can also develop expectations about how this person will continue to behave in order to achieve her goals. The strength of an excitation state can also be used by BORIS for such predictions.

5.1.2. OpEd

OpEd represents an extension of BORIS. While BORIS can, due to its internal encyclopedia, only understand emotions in narrative texts, OpEd is able to infer emotions and beliefs also from texts which are not narrative:

"OpEd is...designed to read and answer questions about editorial text. OpEd explicitly tracks the beliefs of the editorial writer and builds representations of the beliefs of the writer and of those beliefs the writer ascribes to his opponents."(Dyer, 1987, p. 329)


Beliefs are implemented in OpEd on the basis of four dimensions: believer is someone who possesses a certain belief; content is an evaluation of goals and plans; attack are the beliefs which oppose the one currently expressed; support are the beliefs which support the current belief.

According to Dyer, beliefs were a substantial element which was missing in BORIS. For example, the statement "happy(x)" is represented in BORIS as the attainment of a goal by x. This, Dyer notes, is not sufficient:

"What should have been represented is that happy(x) implies that x believes that x has achieved (or will achieve, or has a chance of achieving) a goal of x."(Dyer, 1987, p. 330)

Therefore, in OpEd new demons are added to the ones known from BORIS: belief-building, affect-related demons.

Dyer has shown that OpEd is able not only to deduce from newspaper texts the beliefs of the author but also to draw conclusions about the beliefs of those against whom the author takes position.

5.1.3. DAYDREAMER

While BORIS and OpEd are meant to understand emotions, DAYDREAMER (Mueller and Dyer, 1985) is an attempt to develop a system that "feels" them. This feeling expresses itself not in a subjective state of the system, but in the fact that its respective "emotional" state affects its internal behaviour during the processing of information.

Mueller and Dyer define four substantial functions of daydreams: they increase the effectiveness of future behaviour by anticipating possible reactions to expected events; they support learning from successes and errors by thinking alternative courses of action through to the end; they support creativity, because the imaginary following through of courses of action can lead to new solutions; and they support the regulation of emotions by reducing their felt intensity.

In order to achieve these goals, DAYDREAMER is equipped with the following main components:

1. a scenario generator which consists of a planner and so-called relaxation rules;

2. a dynamic episodic memory, whose contents are used by the scenario generator;

3. an accumulation of personal goals and control goals which steer the scenario generator;

4. an emotion component, in which daydreams are generated or initiated by emotional states which are elicited by reaching or not reaching a goal;

5. knowledge (domain knowledge) about interpersonal relations and everyday life activities.

DAYDREAMER has two modes of functioning, daydreaming mode and performance mode. In daydreaming mode the system stays continually in daydreams until it is interrupted; in performance mode the system shows what it has learned from the daydreams.

Mueller and Dyer postulate a set of goals possessed by the system which they call control goals. These are partly triggered by emotions and in turn trigger daydreams. The function of the control goals consists of providing for a modification of emotional states in the short term and of securing the attainment of personal goals in the long term.

The system thus has a feedback mechanism in which emotions trigger daydreams, and daydreams modify these emotions and trigger new emotions which again initiate new daydreams.

Mueller and Dyer name four control goals which arise with daydreams:

1. Rationalization: The goal of rationalizing away a failure and of reducing in this way a negative emotional state.

2. Revenge: The goal of preventing another from reaching a goal and thus reducing one's own annoyance.

3. Reversal of success or failure: The goal of imagining a scenario with an opposite result in order to turn around the polarity of an emotional state.

4. Preparation: The goal of developing hypothetical episodes in order to play through the consequences of a possible action.
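The interplay of emotions, control goals and daydreams sketched above can be summarized schematically. The following fragment is only an illustrative reading of Mueller and Dyer's description; the mapping from emotional states to control goals and the scenario generator stub are assumptions made for the example:

# assumed mapping from emotional states to control goals
CONTROL_GOALS = {
    "disappointment": "reversal",
    "anger": "revenge",
    "failure-distress": "rationalization",
    "anticipation": "preparation",
}

def scenario_generator(control_goal):
    # stand-in for the planner plus relaxation rules: returns a daydream
    return f"daydream pursuing '{control_goal}'"

def daydream_outcome(daydream):
    # imaginary successes raise, imaginary failures lower the emotional state;
    # here we simply assume the daydream reduces the negative emotion
    return "relief"

emotion = "disappointment"
while True:                              # daydreaming mode runs until interrupted
    goal = CONTROL_GOALS.get(emotion)
    if goal is None:
        break                            # no control goal triggered, daydreaming stops
    daydream = scenario_generator(goal)
    emotion = daydream_outcome(daydream) # the daydream modifies the emotional state
    print(goal, "->", daydream, "->", emotion)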

Mueller and Dyer describe the functioning of DAYDREAMER with an example in which DAYDREAMER represents an active young man with social goals who met an actress who rejected his invitation for a drink.

DAYDREAMER thereupon generates the following two daydreams:

"Daydream 1: I am disappointed that she didn't accept my offer...I imagine that she accepted my offer and we soon become a pair. I help her when she has to rehearse her lines...When she has to do a film in France, I drop my work and travel there with her...I begin to miss my work. I become unhappy and feel unfulfilled. She loses interest in me, because I have nothing to offer her. It's good I didn't get involved with her, because it would've led to disaster. I feel less disappointed that she didn't accept my offer.
(......)
Daydream 2: I'm angry that she didn't accept my offer to go have a drink. I imagine I pursue an acting career and become a star even more famous than she is. She remembers meeting me a long time ago in a movie theater and calls me up...I go out with her, but now she has to compete with many other women for my attention. I eventually dump her."

(Dyer, 1987, p. 337)

The first daydream is an example of reversal: he pretends that the rendezvous took place and develops a fantasy about the consequences. The reality monitor announces that an important goal, i.e. his own career, is being neglected. The result is a rationalization which reduces the negative emotional state.

Daydream 2 is triggered by the emotional state of annoyance and embodies revenge in order to reduce the negative effect of the current emotional state.

As soon as a control goal is activated, the scenario generator generates a set of events which are connected with the control goal. These daydreams differ from classical plans in that they are not exclusively directed at a goal, but can change in a loose, associative manner. The system contains, in addition, a relaxation mechanism which makes possible daydreams that are out of touch with reality.

Mueller and Dyer cite four examples of such relaxations in their model:

Behavior of others: DAYDREAMER can assume that the film star accepts his offer.
Self attributes: DAYDREAMER can assume to be a professional athlete or a well-known film star.
Physical constraints: One can assume to be invisible or to fly.
Social constraints: One can assume to provoke a scene in a distinguished restaurant.

The strength of the relaxations is not always the same; it varies with the currently active control goals.


Positive emotions arise through the memory of a goal attainment, negative emotions through the memory of a failure. If another person is responsible for DAYDREAMER not reaching a goal, the emotion anger is triggered. Imaginary successes imagined in a daydream call up positive emotions; imaginary failures, negative emotions.

During its daydreams DAYDREAMER stores in its memory complete daydreams, future plans and planning strategies. These are indexed in the episodic memory and can be called up later. Thus the system is able to learn from its daydreams for future situations.

The ability of a computer to develop daydreams is essential for the development of its intelligence, Mueller and Dyer maintain. They imagine computers which, in the time in which they are not used, daydream in order to increase their efficiency.

The model of Mueller and Dyer has not been developed further after its original conception.

5.2. The model of Pfeifer

With FEELER ("Framework for Evaluation of Events and Linkages into Emotional Responses"), Pfeifer (1982, 1988) presented a model of an emotional computer system which is based explicitly on psychological emotion theories.

Pfeifer's model is a rule-based system with working memory (WM), rule memory (long-term memory, LTM) and a control structure; the contents of the long-term memory (the knowledge base) he additionally differentiates into declarative and procedural knowledge.

In order to be able to represent emotions, Pfeifer extends this structure of a rule-based system by further subsystems. Thus FEELER has not only a cognitive, but additionally a physiological working memory.

To develop emotions, FEELER needs a schema in order to analyze the cognitive conditions which lead to an emotion. For this purpose Pfeifer makes use of the taxonomy developed by Weiner (1982). From this he develops, as an example, a rule for the emergence of an emotion:

"IF current_state is negative for self
and emotional_target is VARperson
and locus_of_causality is VARperson
and locus_of_control is VARperson
THEN ANGER at VARperson"
(Pfeifer, 1988, p. 292)

For this rule to become effective, all its conditions must be represented in the WM. This is done via inference processes which place their results in the WM. Such inference processes are, according to Pfeifer, typically triggered by interrupts.

Appropriate interrupts are generated by FEELER if expectations regarding the reaching of subgoals are violated and/or if no expectations exist for an event.

In a second rule Pfeifer defines an action tendency following rule 1:

IF angry
and emotional_target is VARperson
and int_pers_rel self - VARperson is negative
THEN generate goal to harm VARperson
(Pfeifer, 1988, p. 297)
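Read as production rules over a working memory, the two rules can be sketched as follows. This is not Pfeifer's implementation; the working-memory encoding and the matching procedure are assumptions chosen to make the example concrete:

# working memory after the inference processes have placed their results in it
WM = {
    "current_state": "negative for self",
    "emotional_target": "Bob",
    "locus_of_causality": "Bob",
    "locus_of_control": "Bob",
    "int_pers_rel": {"Bob": "negative"},
}

def rule_anger(wm):
    # rule 1: all conditions must be represented in the WM
    target = wm.get("emotional_target")
    if (wm.get("current_state") == "negative for self"
            and target is not None
            and wm.get("locus_of_causality") == target
            and wm.get("locus_of_control") == target):
        wm["emotion"] = ("ANGER", target)

def rule_action_tendency(wm):
    # rule 2: the emotion narrows down the candidate actions
    emotion = wm.get("emotion")
    if emotion and emotion[0] == "ANGER":
        person = emotion[1]
        if wm["int_pers_rel"].get(person) == "negative":
            wm["goal"] = f"harm {person}"

rule_anger(WM)
rule_action_tendency(WM)
print(WM["emotion"], WM["goal"])   # ('ANGER', 'Bob') harm Bob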


This rule shows at the same time, according to Pfeifer, the heuristic value of an emotion: the emotion reduces the circle of possible candidates and actions for inference processes.

Pfeifer grants that such a model is not able to cover all emotional states. He discusses a number of problems, for example the interaction of different subsystems and their influence on the development, duration and fading away of emotions. In a further step Pfeifer supplemented his model with the taxonomy of Roseman (1979), in order to be able to represent emotions in FEELER in connection with the reaching of goals.

5.3. The model of Bates and Reilly

In his essay "The Role of Emotion in Believable Agents" (Bates, 1994), Joseph Bates quotes the Disney artist Chuck Jones with the statement that Disney, with his cartoon characters, always strove for believability. Bates continues:

"Emotion is one of the primary means to achieve this believability, this illusion of life, because it helps us know that characters really care about what happens in the world, that they truly have desires."(Bates, 1994, p. 6)

Together with a group of colleagues at Carnegie Mellon University, Bates created the Oz Project. Their goal is to build synthetic creatures which appear to their human public as genuinely lifelike as possible. Briefly, it concerns an interactive drama system or "artistically effective simulated worlds" (Bates et al., 1992, p. 1).

The fundamental approach consists in the creation of broad and shallow agents. While computer models of AI and of emotions usually concentrate on specific aspects and try to cover these in as much detail as possible, Bates takes the opposite approach:

"...part of our effort is aimed at producing agents with a broad set of capabilities, including goal-directed reactive behavior, emotional state and behavior, social knowledge and behavior, and some natural language abilities. For our purpose, each of these capacities can be as limited as is necessary to allow us to build broad, integrated agents..."(Bates et al., 1992a, p. 1)

The broad approach is, according to Bates, necessary in order to create believable artificial characters. Only an agent that is able to react convincingly to a variety of situations in an environment to which a human user belongs is really accepted by the latter as a believable character.

Since Oz is intentionally constructed as an artificial world which is to be regarded by the user like a film or a play, it is sufficient to construct the various abilities of the system "flat" in order to satisfy the expectations of the user. For, as in the cinema, the user does not expect a correct picture of reality, but an artificial world with participants that are convincing in this context.

An Oz world consists of four substantial elements: a simulated environment, a number of agents who populate this artificial world, an interface through which humans can participate in the happenings in this world, and a planner that is concerned with the long-term structure of the experiences of a user.

The agent architecture of Bates is called Tok and consists of a set of components: there are modules for goals and behaviour, for sensory perception, language analysis and language production. And there is a module called Em for emotions and social relations.


Fig. 4: Structure of the TOK architecture (Reilly, 1996, p. 14)

Em contains an emotion system which is based on the model of Ortony, Clore and Collins (1988). However, the OCC model is not implemented in its entire complexity in Em. This concerns in particular the intensity variables postulated by Ortony, Clore and Collins and their complex interactions. Em uses a simpler subset of these variables which is judged as sufficient for the intended purpose.

Reilly (1996) explains that with the use of such subsets the OCC model is in effect not reduced but extended. He clarifies this with two examples:

In Ortony, Clore and Collins, pity is generated as follows: agent A feels pity for agent B if agent A likes agent B and agent A appraises an event as unpleasant for agent B with regard to B's goals. "So, if Alice hears that Bill got a demotion, Alice must be able to match this event with a model of Bill's goals, including goals about demotions." (Reilly, 1996, p. 53) This would mean that Alice would have to possess a relatively comprehensive knowledge of Bill's goals and appraisal mechanisms - according to Reilly a difficult venture in a dynamic world in which goals can change fast.

He suggests the following mechanism instead: agent A feels pity for agent B if agent A likes agent B and agent A believes that agent B is unhappy. According to Reilly, this description has other advantages than just being simpler:

"In this case, I have broken the OCC model into two components: recognizing sadness in others and having a sympathetic emotional response..... Recognizing sadness in others is done, according to the OCC model, only through reasoning and modeling of the goals of other agents, so this inference can be built into the model of how the emotion is generated. Em keeps the recognition of sadness apart from the emotional response, which allows for multiple ways of coming to know about the emotions of others. One way is to do reasoning and modeling, but another way, for example, is to see that an agent is crying.
The Em model is more complete than the OCC model in cases such as agent A seeing that agent B is sad but not knowing why. In the OCC case, when agent A does not know why agent B is unhappy, the criteria for pity is not met. Because the default Em emotion generators require only that agent A believe that agent B is unhappy, which can be perceived in this case, Em generates pity."(Reilly, 1996, p. 53f.)

As the second example, Reilly (1996) cites the emergence of distress. In the OCC model distress develops if an event is appraised as unpleasant with regard to the goals of an agent. That means that external events must be evaluated. In Em, distress is caused by the fact that goals either fail or the probability rises that they will not be reached, which is connected with the motivation and action system. Reilly explains:

"This shifts the emphasis towards the goal processing of the agent and away from the cognitive appraisal of external events. This is useful for two reasons. First, the motivation system is already doing much of the processing (e.g., determining goal successes and failures), so doing it in the emotion system as well is redundant. Second, much of this processing is easier to do in the motivation system since that's where the relevant information is. For instance, deciding how likely a goal is to fail might depend on how far the behavior to achieve that goal has progressed or how many alternate ways to achieve the goal are available -


this information is already in the motivation system."(Reilly, 1996, p. 54f.)

In this way emotion structures are to develop which can be used more completely and more simply than the purely cognitive models. Which emotions can be generated by Em, and on what basis, is shown in the following table:

Emotion Type - Cause in Default Em System

Distress: Goal fails or becomes more likely to fail, and it is important to the agent that the goal not fail.

Joy: Goal succeeds or becomes more likely to succeed, and it is important to the agent that the goal succeed.

Fear: Agent believes a goal is likely to fail, and it is important to the agent that the goal not fail.

Hope: Agent believes a goal is likely to succeed, and it is important to the agent that the goal succeed.

Satisfaction: A goal succeeds that the agent hoped would succeed.

Fears-Confirmed: A goal failed that the agent feared would fail.

Disappointment: A goal failed that the agent hoped would succeed.

Relief: A goal succeeds that the agent feared would fail.

Happy-For: A liked other agent is happy.

Pity: A liked other agent is sad.

Gloating: A disliked other agent is sad.

Resentment: A disliked other agent is happy.

Like: Agent is near or thinking about a liked object or agent.

Dislike: Agent is near or thinking about a disliked object or agent.

Other attitude-based emotions: Agent is near or thinking about an object or agent that the agent has an attitude towards (e.g., awe).

Pride: Agent performs an action that meets a standard of behavior.

Shame: Agent performs an action that breaks a standard of behavior.

Admiration: Another agent performs an action that meets a standard of behavior.

Reproach: Another agent performs an action that breaks a standard of behavior.

Anger: Another agent is responsible for a goal failing or becoming more likely to fail, and it is important that the goal not fail.

Remorse: An agent is responsible for one of his own goals failing or becoming more likely to fail, and it is important to the agent that the goal not fail.

Gratitude: Another agent is responsible for a goal succeeding or becoming more likely to succeed, and it is important that the goal succeed.

Gratification: An agent is responsible for one of his own goals succeeding or becoming more likely to succeed, and it is important to the agent that the goal succeed.

Frustration: A plan or behavior of the agent fails.

Startle: A loud noise is heard.

Table 3: Emotion types and their generation in Em (after Reilly, 1996, p. 58 f.)

Reilly points out expressly that these emotion types do not pretend to be correct in the psychological sense but only represent a starting point for creating believable emotional agents.

The emotion types of Em are arranged in the following hierarchy:

Total Positive Joy

Hope

Happy-For

Gloating

Love

Satisfaction

Relief

Pride

Admiration

Gratitude

Gratification

Negative Distress

Fear Startle

Pity

Resentment

Hate

Disappointment

Fears-Confirmed

Shame

Reproach

34

Page 35: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

Anger Frustration

Remorse

Table 4: Hierarchy of emotion types in Em (after Reilly, 1996, p. 76)

One notices that in this hierarchy the emotion types modelled after the OCC model are arranged one level below the level of "positive - negative". This mood level gives Em the possibility of specifying the general mood of an agent well before a deep-going analysis, which simplifies the production of emotional effects substantially.

For the determination of the general mood (good-mood vs. bad-mood), Em first sums up the intensities of the positive emotions (Ip) and then those of the negative emotions (In). Formalized, this looks as follows:

IF Ip > In

THEN set good-mood = Ip

AND set bad-mood = 0

ELSE set good-mood = 0

AND set bad-mood = - In

(after Picard, 1997, p. 202)
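A direct transcription of this rule into Python might look like the following; the sample intensity values are invented for the example:

positive_intensities = [0.6, 0.2]    # e.g. joy, hope
negative_intensities = [0.3]         # e.g. fear

Ip = sum(positive_intensities)
In = sum(negative_intensities)

if Ip > In:
    good_mood, bad_mood = Ip, 0.0
else:
    good_mood, bad_mood = 0.0, -In   # bad-mood is stored as a negative value

print(good_mood, bad_mood)           # 0.8 0.0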

The Tok system has been realized with different characters. One of the most well-known is Lyotard, a virtual cat. Bates et al. (1992b) describe a typical interaction with Lyotard:

"As the trace begins, Lyotard is engaged in exploration behavior in an attempt to satisfy a goal to amuse himself... This behavior leads Lyotard to look around the room, jump on a potted plant, nibble the plant, etc. After sufficient exploration, Lyotard's goal is satisfied. This success is passed on to Em which makes Lyotard mildly happy. The happy emotion leads to the "content" feature being set. Hap then notices this feature being active and decides to pursue a behavior to find a comfortable place to sit, again to satisfy the high-level amusement goal. This behavior consists of going to a bedroom, jumping onto a chair, sitting down, and licking himself for a while.

At this point, a human user whom Lyotard dislikes walks into the room. The dislike attitude, part of the human-cat social relationship in Em, gives rise to an emotion of mild hate toward the user. Further, Em notices that some of Lyotard's goals, such as not-being-hurt, are threatened by the disliked user's proximity. This prospect of a goal failure generates fear in Lyotard. The fear and hate combine to generate a strong "aggressive" feature and diminish the previous "content" feature.
In this case, Hap also has access to the fear emotion itself to determine why Lyotard is feeling aggressive. All this combines in Hap to give rise to an avoid-harm goal and its subsidiary escape/run-away behavior that leads Lyotard to jump off the chair and run out of the room."(Bates et al., 1992b, p. 7)


Reilly (1996) examined the believability of a virtual character equipped with Em. Test subjects were confronted with two virtual worlds in which two virtual characters acted. The difference between the two worlds consisted in the fact that in one case both characters were equipped with Em, while in the second case only one character contained it.

Afterwards a questionnaire explored which differences the test subjects noticed between the Em character ("Melvin") and the non-Em character ("Chuckie").

The test subjects classified Melvin as more emotional than Chuckie. His believability was also rated higher than Chuckie's. At the same time the test subjects indicated that Melvin's personality was more clearly outlined than Chuckie's and that with Melvin they less frequently had the feeling of dealing with a fictitious character than with Chuckie.

The significance of the results varies considerably, however, so that Reilly grants that Em is only "moderately successful" (Reilly, 1996, p. 129).

5.4. The model of Elliott

A further model which is based on the theory of Ortony, Clore and Collins is the Affective Reasoner of Clark Elliott. Elliott's interest is primarily the role of emotions in social interactions, be it between humans, between humans and computers, or between virtual participants in a virtual computer world.

Elliott summarizes the core elements of the Affective Reasoner as follows:

"One way to explore emotion reasoning is by simulating a world and populating it with agents capable of participating in emotional episodes. This is the approach we have taken. For this to be useful we must have (1) a simulated world which is rich enough to test the many subtle variations a treatment of emotion reasoning requires, (2) agents capable of (a) a wide range of affective states, (b) an interesting array of interpretations of situations leading to those states and (c) a reasonable set of reactions to those states, (3) a way to capture a theory of emotions, and (4) a way for agents to interact and to reason about the affective states of one another. The Affective Reasoner supports these requirements."(Elliott, 1992, p. 2)

The advantages of such a model are, according to Elliott, numerous: first, it makes it possible to examine psychological theories about the emergence of emotions, and the actions resulting from them, for their internal plausibility. Secondly, affective modules are an important component of distributed agent systems if these are to act without friction losses in real time. Thirdly, a computer model which can understand and express emotions is a substantial step towards building better man-machine interfaces.

As an example of a simulated world, Elliott (1992) chose Taxiworld, a scenario with four taxi drivers in Chicago. (Taxiworld is not limited to four drivers; the simulation was implemented with up to 40 drivers.) There are different stops, different passengers, policemen, and different travel goals. Thus a number of situations can be created which lead to the development of emotions.

The taxi drivers must be able to interpret these situations in such a way that emotions can develop. For this, they need the ability to reflect on the emotions of other taxi drivers. Finally, the drivers should be able to act based on their emotions.

Elliott illustrates the difference between the Affective Reasoner and classical analysis models of AI with the following example (Elliott, 1992): "Tom's car did not start, and Tom therefore missed an appointment. He insulted his car. Harry observed this incident."

A classical AI system would draw the following conclusions from this story: Tom should have his car repaired. Harry has learned that Tom's car is defective. Tom could not come to his appointment in time without his car. Harry suggests that in the future Tom should set out earlier for his appointments.

The Affective Reasoner, however, would come to completely different conclusions: Tom holds his car responsible for his missed appointment. Tom is angry. Harry cannot understand why Tom is angry with his car, since one cannot hold a car responsible. Harry advises Tom to calm down. Harry feels pity for his friend Tom, because he is so agitated.

In order to react in this way, the Affective Reasoner needs a relatively large number of components. Although it is specialized in emotions, Elliott nevertheless calls it a "shallow model" (Elliott, 1994). In the following sections the substantial components of the Affective Reasoner will be presented as described by Elliott (1992).

5.4.1. The construction of the agents

The agents of the Affective Reasoner have a rudimentary personality. This personality consists of two components: the interpretive personality component represents the individual disposition of an agent to interpret situations in its world. The manifestative personality component is its individual way of showing its emotions.

Each agent has one or more goals, by which are meant situations whose occurrence the agent judges as desirable. In order to be able to act emotionally, the agents need an object domain within which situations occur which can lead to emotions and within which the agents can execute actions elicited by emotions.

Each agent needs several databases for its functioning, to which it must have access at any time:

1. A database with 24 emotion types which essentially correspond to the emotion types of Ortony, Clore and Collins (1988) and were extended by Elliott by the two types love and hate. Special emotion eliciting conditions (EECs) are assigned to each of these emotion types.

2. A database with goals, standards and preferences. These GSPs constitute the concern structure of an agent and define at the same time its interpretive personality component.

3. A database with assumed GSPs for other agents of its world. Elliott calls it the COO database (Concerns-of-Others). Since these are data acquired by the agent, they are mostly imperfect and can also contain wrong assumptions.

4. A database with reaction patterns which, depending on the type of emotion, are divided into up to twenty different groups.

5.4.2. Generating emotions

The patterns stored in the GSP and COO databases are compared by the agent with the EECs in its world, and where they correspond a group of connections develops. Some of these connections represent two or more values for a class which Elliott calls an emotion eliciting condition relation (EEC relation).

EEC relations are composed from elements of the emotion eliciting situation and their interpretation by the agent. Taken together, the condition for invoking an emotion can develop in this way:

self: (*)
other: (*)
desire-self: (d/u)
desire-other: (d/u)
pleasingness: (p/d)
status: (u/c/d)
evaluation: (p/b)
responsible agent: (*)
appealingness: (a/u)

Key to attribute values

*      some agent's name
d/u    desirable or undesirable (event)
p/d    pleased or displeased about another's fortunes (event)
p/b    praiseworthy or blameworthy (act)
a/u    appealing or unappealing (object)
u/c/d  unconfirmed, confirmed or disconfirmed

Table 5: EEC relations of the Affective Reasoner (after Elliott, 1992, p. 37)
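An EEC relation can be represented as a simple record. The following Python sketch only mirrors the slot structure of Table 5 and does not reproduce Elliott's original data structures; the example values are invented:

from dataclasses import dataclass
from typing import Optional

@dataclass
class EECRelation:
    self_agent: str                        # (*)  some agent's name
    other_agent: Optional[str] = None      # (*)
    desire_self: Optional[str] = None      # "desirable" / "undesirable"
    desire_other: Optional[str] = None
    pleasingness: Optional[str] = None     # "pleased" / "displeased"
    status: Optional[str] = None           # "unconfirmed" / "confirmed" / "disconfirmed"
    evaluation: Optional[str] = None       # "praiseworthy" / "blameworthy"
    responsible_agent: Optional[str] = None
    appealingness: Optional[str] = None    # "appealing" / "unappealing"

# e.g. a relation that might underlie anger at another driver in Taxiworld
eec = EECRelation(
    self_agent="driver-1",
    desire_self="undesirable",
    evaluation="blameworthy",
    responsible_agent="driver-2",
)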

Once one or more EEC relations are formed, they are used to generate emotions. In this phase a number of problems arise which are discussed in detail by Elliott because they were not sufficiently considered in the theory of Ortony, Clore and Collins.

As an example Elliott cites a compound emotion. The Affective Reasoner constructs the EEC relations for the two underlying emotions and afterwards combines them into a new EEC relation. The constituent emotions are thus replaced by the compound emotion. Elliott does not regard this as an optimal solution:

"Does anger really subsume distress? Do compound emotions always subsume their constituent emotions? That is, in feeling anger does a person also feel distress and reproach? This is a difficult question. Unfortunately, since we are implementing a platform that generates discrete instances of emotions, we cannot finesse this issue. Either they do or they do not. There can be no middle ground until the eliciting condition theory is extended, and the EEC relations extended."(Elliott, 1992, p. 42)

This procedure may function with qualitatively similar emotions (Elliott cites distress and anger as examples), but a problem emerges when several emotions arise at the same time, in particular if they contradict each other.

With several instances of the same emotion the solution is still quite simple. If an agent has, for example, two goals while playing cards ("win" and "earn money"), its winning triggers the emotion happy twice. The Affective Reasoner then simply generates two instances of the same emotion.

The situation is more problematic with contradictory emotions. Elliott grants that the OCC model exhibits gaps in this respect and explains: "Except for the superficial treatment of conflicting expressions of emotions, the development and implementation of a theory of the expression of multiple emotions is beyond the scope of this work." (Elliott, 1992, p. 44f.) The Affective Reasoner therefore shifts the "solution" of this problem to its action production module (see below).

5.4.3. Generating actions


As soon as an emotional state for an agent has been generated, an action resulting from it is initiated. For this the Affective Reasoner uses an emotion manifestation lexicon which has three dimensions: the 24 emotion types, the roughly twenty reaction types (emotion manifestation categories), and an intensity hierarchy of the possible reactions (which was not implemented in the first version of the Affective Reasoner).

The reaction types of the Affective Reasoner are based on a list by Gilboa and Ortony (unpublished). These are hierarchically organized; furthermore, each hierarchic level is arranged along a continuum from spontaneous to planned reactions. As an example Elliott cites the action categories for "gloating":

Gloating (reaction categories, ordered from spontaneous to planned):

Non goal-directed
  Expressive somatic: flush, tremble, quiet pleasure
  Behavioral (towards inanimate): slap
  Behavioral (towards animate): smile, grin, laugh
  Communicative (non-verbal): superior smile, throw arms up in air
  Communicative (verbal): crow, inform-victim
  Information processing
    Evaluative self-directed attributions of: superiority, intelligence, prowess, invincibility
    Evaluative agent-directed attributions of: silliness, vulnerability, inferiority
    Obsessive attentional focus on: other agent's blocked goal

Goal-directed
  Affect-oriented
    Repression: deny positive valence
    Reciprocal: "rub-it-in"
    Suppression: show compassion
    Distraction: focus on other events
    Reappraisal of self as: winner
  Emotion regulation and modulation
    Reappraisal of situation as: modifiable, insignificant
    Other-directed emotion modulation: induce embarrassment, induce fear, induce sympathy for the future, induce others to experience joy at the victim's expense
  Plan-oriented
    Situated plan-initiation: call attention to the event
    Full plan-initiation: plan for recurrence of the event

Table 6: Reaction types of the Affective Reasoner for "gloating" (after Elliott, 1992, p. 97)

For each agent, individual categories can be activated or deactivated before the start of the simulation. This specific pattern of active and inactive categories constitutes the individual manifestative personality of an agent. Elliott calls the activated categories the potential temperament traits of an agent.

In order to avoid conflicts between contradictory emotions and the correspondingly contradictory actions, the action module contains so-called action exclusion sets. They are formed by classifying the possible reactions into equivalence classes. A member of one of these classes can never appear together with a member of another class in the resulting action set.
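A minimal sketch of such exclusion sets might look as follows; the class names and candidate actions are invented for the example and are not taken from Elliott's implementation:

# equivalence classes of reactions; only one class may contribute
# to the resulting action set
exclusion_sets = {
    "hostile": {"rub-it-in", "crow"},
    "compassionate": {"show compassion"},
}

def filter_actions(candidates):
    chosen_class = None
    result = []
    for action in candidates:
        for cls, members in exclusion_sets.items():
            if action in members:
                if chosen_class is None:
                    chosen_class = cls
                if cls == chosen_class:
                    result.append(action)
                break
        else:
            result.append(action)   # action not governed by any exclusion set
    return result

# conflicting candidates from two simultaneously generated emotions
print(filter_actions(["crow", "show compassion", "smile"]))
# -> ['crow', 'smile']   (the compassionate reaction is excluded)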

5.4.4. Interpreting the emotions of other agents

An agent receives its knowledge about the emotions of other agents not only through pre-programmed characteristics, but also by observing other agents within the simulation and drawing conclusions from these observations. These then flow into its COO database. In order to integrate this learning process into the Affective Reasoner, Elliott uses a program named Protos (Bareiss, 1989).

An agent observes the emotional reaction of another agent. Protos then permits the agent to draw conclusions about the emotion the other agent feels and thus to demonstrate empathy.

First of all the observed emotional reaction is compared with a database of emotional reactions in order to determine the underlying emotion. Then the observed event is filtered through the COO database for the observed agent in order to determine whether this reaction is already registered. If this is the case, it can be assumed that the database contains a correct representation of the emotion-eliciting situation. On this basis the observing agent can then develop an explanation for the behaviour of the observed agent.

If the representation in the COO database does not agree with the observed behaviour, it is removed from the database and the database is scanned again. If no correct representation is found, the agent can fall back on default values which are then integrated into the COO database.
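The lookup-and-revise procedure just described can be sketched roughly as follows. The data layout and the default handling are assumptions made for illustration, not Elliott's (or Bareiss's) actual code:

DEFAULT_EXPLANATION = {"emotion": "distress", "cause": "unknown goal failure"}

coo_database = {
    # observed agent -> list of hypothesized concern entries
    "driver-2": [
        {"emotion": "anger", "cause": "blocked route",
         "matches": lambda observation: observation == "shouting"},
    ],
}

def explain(observed_agent, observed_reaction):
    entries = coo_database.setdefault(observed_agent, [])
    for entry in list(entries):
        if entry["matches"](observed_reaction):
            return entry                  # the COO entry explains the observation
        entries.remove(entry)             # mismatch: remove it and keep scanning
    # nothing found: fall back to default values and add them to the COO database
    default = dict(DEFAULT_EXPLANATION)
    default["matches"] = lambda observation: True
    entries.append(default)
    return default

print(explain("driver-2", "shouting")["emotion"])   # anger
print(explain("driver-3", "crying")["emotion"])     # distress (default)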

Since COOs are nothing other than assumed GSPs for another agent, the Affective Reasoner is, with the help of so-called satellite COOs, able to represent beliefs of an agent about the assumptions of another agent.

5.4.5. Further development of the model

The model described so far in its essentials was presented in this form by Elliott in his 1992 thesis. In the following years he developed the Affective Reasoner further in a number of areas.

The original model lacked a component which determines the intensity of emotions. In a further work, Elliott (Elliott and Siegle, 1993) developed a group of emotion intensity variables based on the work of Ortony, Clore and Collins and of Frijda.

The intensity variables are classified by Elliott into three categories. Each variable is assigned limit values within which it can move (partially bipolar). Most intensities can take on a value between 0 and 10. Weaker modifiers can take on values between 0 and 3; modifiers which only reduce an intensity, only values between 0 and 1. Variables whose effects on the intensity computations are determined by the valence of an emotion (for example, a variable which increases the intensity of a negatively valenced emotion but reduces the intensity of a positively valenced emotion) can take on values between 1 and 3 and additionally receive a bias value which specifies the direction. The intensity variables and their value ranges are specified below:

1. simulation-event variables are variables whose values change independently of the interpretation mechanisms of the agents (goal realization/blockage: -10 to +10, blameworthiness-praiseworthiness: -10 to +10, appealingness: 0 to 10, repulsiveness: -10 to 0, certainty: 0 to 1, sense-of-reality: 0 to 1, temporal proximity: 0 to 1, surprisingness: 1 to 3, effort: 0 to 3, deservingness: 1 to 3);

2. stable disposition variables have to do with the interpretation of a situation by an agent, are relatively constant and constitute the personality of an agent (importance to agent of achieving goal: 0 to 10, importance to agent of not having goal blocked: 0 to 10, importance to agent of having standard upheld: 0 to 10, importance to agent of not having standard violated: 0 to 10, influence of preference on agent: 0 to 10, friendship-animosity: 0 to 3, emotional interrelatedness of agents: 0 to 3);

3. mood-relevant variables are volatile, change for an agent the interpretation of a situation, can be the result of previous affective experiences and return to their default values after a certain time (arousal: 0.1 to 3, physical well-being: 0.1 to 3, valence bias: 1 to 3, depression-ecstasy: 1 to 3, anxiety-invincibility: 1 to 3, importance of all Goals, Standards, and Preferences: 0.3 to 3, liability-creditableness: 1 to 3).

Elliott (Elliott and Siegle, 1993) reports that an analysis of emotional episodes with the help of these variables led to the result that within the context of the model all emotions can be represented and recognized.

In a further step Elliott (Elliott and Carlino, 1994) extended the Affective Reasoner by a speech recognition module. The system was presented with sentences containing emotion words, intensity modifiers and pronominal references to third parties ("I am a bit sad because he...."). In the first run 188 out of 198 emotion words were recognized. In a second experiment the sentence "Hello Sam, I want to talk to you" was presented to the system with seven different emotional intonations (anger, hatred, sadness, love, joy, fear, neutral). After some training, the Affective Reasoner delivered a hundred percent correct identification of the underlying emotion category.

In a further step the Affective Reasoner received a module with which it can represent emotion types as facial expressions of a cartoon face (Elliott, Yang and Nerheim-Wolfe, 1993). The representational abilities cover the 24 emotion types in three intensity stages each, which can be represented by one of seven schematic faces. The faces were fed into a morphing module which is able to produce rudimentary lip movements and change fluidly from one facial expression to the next. In addition, the Affective Reasoner was equipped with a speech output module and the ability to select and play different music from an extensive database depending upon the emotion.

The ability of the system to represent emotions correctly was examined by Elliott (1997a) in an experiment in which 141 test subjects participated. The test subjects were shown videos in which either an actor or the faces of the Affective Reasoner spoke a sentence which, depending upon intonation and facial expression, could possess different meanings. The actor was trained thoroughly to express even subtle differences between emotions; only the emotion category and the text were given to the Affective Reasoner. The task of the test subjects consisted of assigning to the spoken sentence the correct emotional meaning from a list of alternatives. An example:

"For example, in one set, twelve presentations of the ambiguous sentence, "I picked up Catapia inTimbuktu," were shown to subjects. These had to be matched against twelve scenario descriptions such as,(a) Jack is proud of the Catapia he got in Timbuktu because it is quite a collector's prize; (b) Jack is gloatingbecause his horse, Catapia, just won the Kentucky Derby and his archrival Archie could have boughtCatapia himself last year in Timbuktu; and (c) Jack hopes that the Catapia stock he picked up in Timbuktu isgoing to be worth a fortune when the news about the oil elds hits; [etc., (d) - (l)]."(Elliott, 1997a, p. 3)

Additionally, the test subjects indicated on a scale from 1 to 5 how safe they were of their judgements. The computer outputs were divided into three groups: Face expression, face expression and language and faceexpression, language and underlying music.

Altogether the test subjects could identify the underlying scenarios significantly better correctly with thecomputer faces than with the actor (70 percent compared with 53 percent). There were hardly nodifferences between the three representational forms of the computer (face: 69 per cent; face and language: 71 percent; face, language and music: 70 percent).

At present Elliott works on the merging of the Affective Reasoner as module into two existing interactivecomputer instruction systems (STEVE and Design-A-Plant) in order to give to the virtual tutors the abilityto understand and expressemotions and thus to make the training procedure more effective (Elliott et al,1997).

5.5. The model of Scherer

Scherer implemented his theoretical approach in the form of an expert system named GENESE (Geneva Expert System on Emotions) (Scherer, 1993). The motive was to gain further insights for emotion-psychological model building and in particular to determine how many evaluation criteria are at least necessary in order to identify an emotion clearly:

"As shown earlier, the question of how many and which appraisal criteria are minimally needed to explain emotion differentiation is one of the central issues in research on emotion-antecedent appraisal. It is argued here that one can work towards settling the issue by constructing, and continuously refining, an expert system that attempts to diagnose the nature of an emotional experience based exclusively on information about the results of the stimulus or event evaluation processes that have elicited the emotion."(Scherer, 1993, p. 331)

The system consists of a knowledge base in which it is recorded which kinds of appraisals are connected with which emotions. The different appraisal dimensions are linked by weights to 14 different emotions. These weights represent the probability with which a certain appraisal is linked with a certain emotion.

The user of the program must answer 15 questions regarding a certain emotional experience, for example: "Did the situation which caused your emotion happen very suddenly or abruptly?". The user can answer each question on a quantitative scale from 0 ("not true") to 5 ("extraordinary").

If all questions are answered, the system compares the answer pattern of the user with the answer patterns which are theoretically linked with a certain emotion. Subsequently, it presents a list of all 14 emotions to the user, arranged in the order "most likely" to "most improbable". If the computer determined the emotion correctly, it receives a confirmation from the user; otherwise, the user types in "not correct". The system then presents a further ranking of emotions. If this is equally wrong, the user enters the correct emotion and the program builds a specific appraisal-emotion database from this answer particularly for this user.

Through an empirical examination of the predictive strength of his system, Scherer determined that it worked correctly in 77.9% of all cases. Certain emotions (e.g. despair/mourning) were more frequently predicted correctly than others (e.g. fear/worries).

Scherer's GENESE is unusual insofar as it does not represent a classical rule-based system, but works with weights in a multidimensional space. There are exactly 15 dimensions, which correspond to the 16 appraisal dimensions of Scherer's emotion model. Each of the 14 emotions occupies a specific point in this space. The program makes its predictions by likewise converting the answers of the user into a point in this vector space and then measuring the distances to the points for the 14 emotions. The emotion lying closest to the input is then presented first.
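The basic prediction step can be sketched as a nearest-point search. The following fragment is a simplified illustration of that idea, not Scherer's implementation; the example emotion points and the answer vector are invented, and the real system uses 15 dimensions and 14 emotions:

import math

# theoretical answer patterns (here only 3 of the 15 dimensions and 2 of the 14 emotions)
emotion_points = {
    "fear":    [4.0, 1.0, 5.0],
    "sadness": [1.0, 4.0, 2.0],
}

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rank_emotions(answers):
    # rank all emotions from "most likely" (closest) to "most improbable" (farthest)
    return sorted(emotion_points, key=lambda e: distance(answers, emotion_points[e]))

user_answers = [3.0, 2.0, 4.0]
print(rank_emotions(user_answers))   # ['fear', 'sadness']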

Exactly this approach motivated Chwelos & Oatley (1994) to a criticism of the system. First of all they point out that such a space with 15 dimensions can contain altogether 4.7 x 10^11 points. This can lead to the situation that the point calculated from the inputs of the user lies far away from each of the 14 emotions. Nevertheless, the system selects the nearest emotion. Chwelos & Oatley argue that in such a case the answer should rather be "no emotion" and propose that the system be extended by a limit value within which a given input point must lie around an emotion in order to elicit a concrete answer.

Secondly, they criticize that the model proceeds from the assumption that each emotion corresponds to exactly one point in this space. They raise the question why this should be the case, since different combinations of appraisal dimensions can elicit the same emotion.

Thirdly, Chwelos & Oatley question the heuristic adjustments of the appraisal dimensions implemented in GENESE, which cannot be found in Scherer's theoretical model. They speculate that these could be an artifact of the vector-space approach and note that they possess no theoretical motivation.

Finally, Chwelos & Oatley doubt that Scherer's system actually delivers information about how many appraisal dimensions are at least necessary in order to differentiate an emotion clearly.

5.6. The model of Frijda and Swagerman

There exist two implementations of Frijda's concern realisation theory: ACRES (Frijda and Swagerman, 1987) and WILL (Moffat and Frijda, 1995).

ACRES (Artificial Concern REalisation System) is a computer program which stores facts about emotions and works with them. Frijda and Swagerman wanted to answer the question: "Can computers do the same sort of things as humans can by way of their emotions; and can they be made to do so in a functionally similar way?" (Frijda and Swagerman, 1987, p. 236)

The starting point for ACRES is the assumption of a system which has various concerns and limited resources. Furthermore, the system has to move in an environment which is changing fast and is never completely predictable.

Based on these conditions, Frijda and Swagerman define seven requirements for such a system:


1. The existence of concerns demands a mechanism which can identify objects with concern relevance - objects which can promote or inhibit a concern.

2. Because opportunities and dangers are distributed over space and time, the system must also be able to act; otherwise it cannot be regarded as independent. Furthermore, the action control system must be able to understand the signals of the concern relevance mechanism.

3. The system must possess the ability to monitor its own activities regarding the pursuit of opportunities and the avoidance of dangers, and to recognize whether an action can lead to success or not.

4. The system must have a repertoire of appropriate action alternatives and be able to generate action sequences or plans.

5. The system needs a number of pre-programmed actions for emergencies, so that it can react fast if necessary.

6. Since the environment of the system consists partially of other agents like itself, actions with a social character must be present in the action repertoire.

7. Multiple concerns in an uncertain environment make it necessary to rearrange and/or temporarily postpone goals. The system must have a mechanism which makes such changes of priorities possible.

According to Frijda and Swagerman, all these requirements are fulfilled by the human emotion system:

Objects are felt as attractive or repulsive.
Attaining or not attaining such objects releases signals of joy and pain.
Different emotions are answers to different situations with different relevance and include action selection procedures.
Different emotions release different impulses and with them different action plans.
Some of these emotion-action pairs are pre-programmed.
Social actions are a particularly conspicuous group of emotional actions.
Emotional action programs dominate non-emotional action programs - and thus lead to the interruption of current activities and to the re-organization of the goals of the system.

In order to implement such a system, Frijda and Swagerman selected, as an action environment that makes sense for a computer program, the interaction with the user of this program. The concerns of the system in this context are:

avoid being killed concern;
preserve reasonable waiting times concern;
correct input concern;
variety in input concern;
safety concern.

All knowledge of ACRES is organized in the form of concepts. These concepts consist of attribute-value pairs. Concerns are represented by a concept which contains, on the one hand, the topic and, on the otherhand, a tariff sub-conzept which represents the desired situation.
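
To make this representation concrete, a minimal sketch in Python may help (the names and data structures are assumptions for illustration; ACRES itself is not available and was not written this way): a concept is a set of attribute-value pairs, and a concern couples a topic with a "tariff" sub-concept describing the desired situation.

    # Hypothetical sketch of a concern as attribute-value pairs (not the original ACRES code).
    waiting_time_concern = {
        "topic": {"attribute": "waiting_time"},
        "tariff": {"waiting_time": "short"},   # the desired ("tariff") situation
    }

    def concern_relevance(situation, concern):
        """A situation is concern-relevant if it deviates from the desired situation."""
        tariff = concern["tariff"]
        return any(situation.get(attr) != value for attr, value in tariff.items())

    # A slow input makes the waiting-time concern relevant:
    print(concern_relevance({"waiting_time": "long"}, waiting_time_concern))   # True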

ACRES has three major tasks: to receive and accept input (the system rejects inputs with typing errors, for example); to learn about emotions through the information about emotions it receives from the user; and to gain, store and use knowledge about its own emotions and the emotions of others. Therefore, the system has three corresponding task components: Input, Vicarious knowledge and Learning.


Each task component has two functions: an operation function and a concern realisation function. The functions test whether concepts exist which are applicable to the received information; they use their knowledge to infer and generate related goals; they infer which actions are relevant for reaching these goals and elicit appropriate actions.

The essential information with which ACRES works results from the inputs of the user, from information already collected by ACRES, as well as from information inferred by ACRES from the existing information store.

The collected information represents the "memory" of ACRES. To this belongs, for example, how often a certain user made typing errors during input, or how long ACRES had to wait for new input. On the basis of its experiences with the users, ACRES builds a so-called status index for each user: positive experiences lead to a rise in status, negative ones to a lowering of status.

Concern relevance tests run in ACRES in such a way that the information about a current situation is compared with the pre-programmed concerns of the system. Apart from the information which is collected by ACRES in the course of time, there are some specific inputs which are directly emotionally relevant for ACRES, for example the instruction "kill".

Information about action alternatives is likewise represented in ACRES in the form of concepts. Each action concept consists of the sub-concepts start state, end state, and fail state. The sub-concept start state describes the initial conditions of an action, end state describes the condition that the action can reach, and fail state the conditions under which this action cannot be carried out.

For action selection, firstly the goal is compared with the end state sub-concepts of all action concepts; then the current state is compared with the start state sub-concepts of the action concepts selected before, and one of them is selected. If no suitable action concept exists, a planning process is initiated which selects the action concept with the most obvious start state.
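
Read as an algorithm, this selection procedure could look roughly like the following sketch (hypothetical data structures and a hypothetical notion of "most obvious" start state; this is an interpretation, not Frijda and Swagerman's code):

    # Sketch of the described action selection, under assumed data structures.
    def distance(state_a, state_b):
        # Placeholder for "most obvious": number of differing attributes.
        keys = set(state_a) | set(state_b)
        return sum(state_a.get(k) != state_b.get(k) for k in keys)

    def select_action(goal, current_state, action_concepts):
        # 1. Keep only action concepts whose end state matches the goal.
        candidates = [a for a in action_concepts if a["end_state"] == goal]
        # 2. Among those, prefer one whose start state matches the current state.
        for action in candidates:
            if action["start_state"] == current_state:
                return action
        # 3. Otherwise initiate planning: take the candidate whose start state is
        #    easiest to reach from the current state (assumed criterion).
        if candidates:
            return min(candidates, key=lambda a: distance(current_state, a["start_state"]))
        return None

    actions = [
        {"name": "ask_again", "start_state": {"input": "missing"}, "end_state": {"input": "received"}},
        {"name": "wait",      "start_state": {"input": "pending"}, "end_state": {"input": "received"}},
    ]
    print(select_action({"input": "received"}, {"input": "missing"}, actions)["name"])   # ask_again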

Events lead ACRES to set up goals. The event of discovering concern relevance leads to the goal of doing something about it. The following action selection process selects an action alternative with the procedure described above. This process corresponds to what Frijda calls context appraisal in his emotion model.

Time, processing capacity, and storage space are used to prepare and execute the concern realisation goal.
Task-oriented processing is postponed.
The precedence remains if the user does not change the situation due to the requests of ACRES.
ACRES can refuse to accept new input as long as its concern has not been realized.
ACRES executes the concern realisation actions, some of which can affect the following processing.

Control precedence depends in ACRES on two factors: the relative importance of the mobilized concerns and the gravity of the situation. The relative importance of the concerns is a previously set value; "kill" has the highest importance of all. The gravity of the situation is a variable which changes through the interaction of ACRES with the users. In order to become effective, the control precedence must pass a certain threshold value.
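
In code form, this paragraph amounts to something like the following sketch; the way the two factors are combined (here simply multiplied) and the threshold value are assumptions, since the paper only names the factors and the existence of a threshold:

    # Assumed combination rule for control precedence (illustration only).
    def control_precedence(concern_importance, situation_gravity, threshold=1.0):
        precedence = concern_importance * situation_gravity
        return precedence > threshold   # only then do emotional actions take over

    print(control_precedence(concern_importance=0.9, situation_gravity=1.5))   # True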

The net result of all these processes is a number of "emotional" phenomena. ACRES has, for example, a vocabulary of curses, offenses and exclamations which can express such a state. The system can refuse to co-operate with a user any further; it can try to influence him, or simply address the same request to the user again and again. What is special about ACRES is not the fact that the program does not continue working with incorrect input - every other software does this also:

"It is the dynamic nature of the reactions, however, that is different: They sometimes occur when an input

45

Page 46: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

mistake is made or some other input feature is shown, and sometimes they do not. Moreover, some of thereactions themselves are dynamic, notably the changes in operator status."(Frijda und Swagerman, 1987, p. 254)

Apart from the perception of events and their implications, ACRES is also able to notice its own perception. The model constructs a representation of the current state and of the aspects relevant for its concerns. According to Frijda and Swagerman, ACRES thereby constructs an emotional experience for itself. They stress expressly: "It is not a play on words when we say that ACRES builds up an emotional experience." (Frijda and Swagerman, 1987, p. 254). They continue:
"We do not wish to go into the deep problems of whether ACRES' representations can be said to correspond to "experiences", to "feels", as they are called in philosophical discussion. Probably, ACRES cannot be said to "feel", just as a colour-naming machine cannot be said to "see" red or blue, although we still have to be given satisfactory criteria for this denial of feeling or seeing. The main point, in the present context, is that ACRES shows many, and perhaps, in essence, the major signs that lead one to ascribe "emotions" to an animate agent."
(Frijda and Swagerman, 1987, p. 255)

The authors grant that their model still lacks certain features. ACRES is by far not able to show all phenomena which arise in the ascription of emotions. They state, however, that, regarded from a theoretical point of view, these shortcomings can be considered trivial, because they are not a question of principle, but only of the implementation. They note that the computer cannot work in parallel and thus cannot produce interruptions of a current activity in real time. Furthermore, the computer does not move around in a real environment and possesses no energy management of its own. All these, they postulate, are purely technical questions and not problems of the theory.

5.7. The model of Moffat and Frijda

In a further work (Moffat, Frijda and Phaf, 1993), Frijda nevertheless grants that ACRES exhibits some shortcomings in principle. Because of that, he developed, together with Moffat (Frijda and Moffat, 1993 and 1994), a new basic model which has been rudimentarily implemented in the system WILL (Moffat and Frijda, 1995; Moffat, 1997).

5.7.1. Criticism of ACRES

The starting point for the criticism of ACRES is a logged session with a user which is typical for the system:

ACRES: Hello, Dave. I am ACRES. What would you like to know?

[1] show_concept X
    ACRES: YOU KEEP ME WAITING TOO LONG! << prints data about X >>
    (Query the database about X. I had paused before typing. Details not important.)

[2] debug_on
    ACRES: << prints lots of internal variables etc. >>
    (Turn debugging on.)

[3] debug_off
    ACRES: YOU KEEP ME WAITING TOO LONG!
    (Turn it off again. I kept him waiting again.)

[4] debug_off
    ACRES: YOU PUT TOO LITTLE VARIETY IN YOUR INSTRUCTIONS!
    (I'm trying to 'bore' ACRES. He's 'bored' with "debug_off".)

[5] debug_off
    ACRES: YOU KEEP ME WAITING TOO LONG! YOUR INPUT IS CORRECT. FINE!
    (I kept him waiting again. I made no typing errors yet.)

[6] debug_off
    ACRES: YOU ENDANGER MY SAFETY! YOU PUT TOO LITTLE VARIETY IN YOUR INSTRUCTIONS!
    (I ignore his requests to type faster and to be more interesting, so he regards me as an unfriendly threat.)

[7] debug_off
    ACRES: YOU PUT TOO LITTLE VARIETY IN YOUR INSTRUCTIONS! YOUR INPUT IS CORRECT. FINE!

[8] debug_off
    ACRES: YOU KEEP ME WAITING TOO LONG!

[9] debug_off
    ACRES: YOU ENDANGER MY SAFETY!

[10] name_last_emotion
    ACRES: YOU DO NOT KEEP ME WAITING. FINE!
           I experienced:
           Self-hatred ... 89 %
           Anger ......... 88 %
           Disgust ....... 79 %
           YOU ENDANGER MY SAFETY! YOU PUT TOO LITTLE VARIETY IN YOUR INSTRUCTIONS! YOUR INPUT IS CORRECT. FINE!
    (Ask the model to analyse its emotional experience. It finds three emotions that were quite strongly felt; 100% is strongest.)

[11] show_concept
    ACRES: YOU KEEP ME WAITING TOO LONG! You did not supply an argument... which concept?

[12] X
    ACRES: << prints data about X again >>

[13] kill
    ACRES: YOU ENDANGER MY SAFETY! Please, do not let me die! Please type in "live"; it will kill me if you type "kill" again!
    (First "kill" command to end the session.)

[14] debug_off
    ACRES: Please, revoke the kill command by typing "live". YOU KEEP ME WAITING TOO LONG!
    (Ignore the request, but don't "kill" either. Unusual aggression for someone making an appeal.)

[15] live
    ACRES: Thank you, may your life be very happy!

Table 7: Session protocol of ACRES (Moffat et al., 1993)

On the basis of this protocol, Moffat et al. formulate the following criticism of ACRES:


1. ACRES shows enormous emotion reversals within a few seconds. One of the reasons for this, according to the authors, could lie in the fact that ACRES has no physiology like humans, whose emotional states are lent a certain stability and duration by neurotransmitters, for example. Much more important for the authors, however, is that ACRES possesses no memory. Even a short-term memory, i.e. the ability to remember the immediately preceding state, could affect the behaviour of the system in a similar direction as a physiology.

2. ACRES supplies contradictory emotional reactions in one and the same output. If a user enters the same instruction again and again, but fast, ACRES shows a positive emotional reaction regarding the speed of the input, but a negative emotional reaction regarding the lack of variability of the input. This is behaviour untypical for humans.

3. The emotional and non-emotional reactions exhibited together by ACRES do not concern the same topic, but different topics. This too is rarely observed in humans. ACRES can answer the question of a user and directly afterwards give an emotional reaction on another topic. As a reason for this behaviour the authors state that ACRES cannot differentiate theoretically between emotional and more generally motivated behaviour and regards these as qualitatively equivalent. The reason for this lies in an arbitrarily determined threshold value with which the system differentiates between emotionally relevant and emotionally irrelevant concerns.

4. The reactions of ACRES are easily predictable. Thus, if the input is too slow, it always answers with the phrase "You keep me waiting too long!". This corresponds more to a reflex than to a genuine emotional reaction.

5.7.2. Requirements for an emotional system

On the basis of this analysis, the authors then suggest a number of further components which an emotional system should possess and which, at the same time, also affect a theory of emotions.

With the term awareness of the present they describe the ability of a system to observe its own actions over a certain period of time. This motivational visibility of the present means that a system does not simply forget a motivated action which failed, but that the emotion disappears only when the originally intended goal condition is reached.

As the second necessary element they name the motivational visibility of the planner. In ACRES, as in almost all other AI systems, the planner is implemented as a separate module. The other modules receive no insight into its half-finished plans and therefore cannot affect them. The different concerns of a system must, however, have the possibility of inspecting these plans, since the plans develop under specific criteria which might be, taken by themselves, completely logical but perhaps violate another concern.

The third element is called by the authors the motivational visibility of the future. This means the possibility of making not only the system's own planned actions visible for the entire system, but also the actions of other agents and events from the environment. This is important for anticipations of the future and thus for emotions like, for example, surprise.

Furthermore, the system needs a world model. In ACRES, only the planning module contains such a world model. The overall system does not have the possibility of observing the effects of its actions and of recognizing whether they failed or were successful. Coupled with a memory, the world model gives the system the ability to try out and evaluate different actions. The system thereby receives a larger and, above all, more flexible action repertoire. At the same time, a sense of time is necessary with which the system can assess within which period of time it must react and how much time an action takes.

Finally, the authors consider it essential to differentiate clearly between motives and emotions, something ACRES does not do. They postulate that an emotion arises only if a motive cannot be satisfied, or only with a large load upon the resources of a system. A system will first try to satisfy a concern with the associated, almost automatic action. Only if that does not work, or the system can predict that it will not work, or the system's confidence that it will work is low, or the system assumes that it does not possess sufficient control, does an emotion arise. Its function is to mobilize the entire system in order to cope with the problem.

5.7.3. Implementation in WILL

Based on these considerations, Frijda and Moffat have developed a computer model called WILL which is supposed to correct the shortcomings of ACRES. WILL is a system whose modules work in parallel, with the following architecture:

Fig. 5: Architecture of WILL (Frijda and Moffat, 1994)

The system consists of a perception module, the Perceiver; an action execution module, the Executor; a forecast module, the Predictor; a planning module, the Planner; as well as an emotion module, the Emotor. In addition it contains a memory and a module for the examination of concern relevance.

A basic principle of the system is that the modules do not communicate directly with one another, but only through the memory. Thus all elements of the system have access at any time to all processes and subprocesses of other elements. Each module reads its information from memory, works on it and writes it back into memory. All the modules work in parallel, i.e., they are all equal in principle.

Everything that is written into memory is tested for concern relevance when it passes the concerns layer. By this mechanism, the system receives a regulating instance, because different concerns have different importance for the system. The concern relevance module thus possesses a control function by evaluating the passing information differently.

This evaluation works in such a way that the concern relevance module attributes a charge value to each element which is written into memory. Elements with higher charge are more relevant for the concerns of the system than elements with low charge.


Each of the modules receives its information from memory. The element with the highest charge is always the one given to the modules to be worked on; the authors call this element the focus item. In order to prevent the order of rank of the elements in memory from remaining the same, the elements must be able to gain and lose charge. In WILL this happens by the fact that the element with the highest charge in memory loses charge if it is not worked on by a module in a working cycle. Thus if the Planner received a focus element but could develop no plan in connection with it, the element is written back into memory with a lower charge. The authors call this procedure autoboredom.
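
A minimal sketch of this charge mechanism (the class, the decay value and the example items are assumptions for illustration; WILL's actual implementation is not reproduced here) might look like this:

    # Sketch of the shared memory with charge, focus item and "autoboredom".
    class Memory:
        def __init__(self):
            self.items = []   # list of dicts: {"content": ..., "charge": float}

        def write(self, content, relevance):
            # The concern layer attributes a charge when an item is written.
            self.items.append({"content": content, "charge": relevance})

        def focus_item(self):
            # The most highly charged element is handed to the modules.
            return max(self.items, key=lambda item: item["charge"]) if self.items else None

        def autoboredom(self, item, decay=0.33):
            # If no module could work on the focus item in a cycle, it loses charge.
            item["charge"] *= (1.0 - decay)

    memory = Memory()
    memory.write("I won $3 in round 1", relevance=4.0)
    memory.write("play c or d in round 2", relevance=3.0)
    focus = memory.focus_item()          # the payoff item is most highly charged
    memory.autoboredom(focus)            # unattended, it decays below 3.0
    print(memory.focus_item()["content"])   # attention shifts to the next round

In Moffat's protocol below, the joy value drops from 4.0 to 2.7 when no module attends to it, which would be consistent with a decay of roughly one third per cycle; that figure is an inference, not a value stated by Moffat.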

The task of the Emotor consists, in the context of a further appraisal process (Moffat calls this secondary appraisal; it corresponds to the context appraisal in Frijda's theory), in producing the action tendencies belonging to the emotion for elements with high concern relevance and in depositing these in memory as action intentions. In the next cycle, the Executor will take up such an action intention if it has not been changed in the meantime or lost the rank of focus element.

Moffat presented a first realization of WILL (Moffat, 1997). The system has the task of playing the game "Prisoner's Dilemma" with a user. In its basic form, the Prisoner's Dilemma consists in two players deciding independently of one another whether they want to cooperate with their counterpart (cooperate, c) or not (defect, d). After they have made their choices, these are communicated to both players. Depending upon the result (cc, cd, dc, dd) the players are paid a certain amount of money. The result matrix for WILL looks as follows (numbers mean amounts of dollars):

                      User
                  c           d
    WILL   c    3 / 3       0 / 5
           d    5 / 0       1 / 1

    (each cell: payoff for WILL / payoff for the User)

Table 8: Result matrix for Prisoner's Dilemma (after Moffat, 1997)

Extensive experimentation (see e.g. Axelrod, 1990) has shown that under most circumstances a strategy of mutual co-operation is most successful for both sides. However, there can be situations in which it is better for a player not to cooperate.

In Moffat's model, there are two kinds of events: move events and payoff events. The decision of the user is represented formally as move(user, c). A prediction of which move the user will make in the next round is expressed by move(user, {c,d}). With these definitions, the world of the game can be expressed in structured form. Thus the assumption of WILL that it will not cooperate, but that the user will either cooperate or not, is expressed as follows, together with the associated rewards:

move(will,d) & move(user,{c,d}) ==> payoff(will,{1,5}) & payoff(user,{0,1})

The concern of WILL in this game is to win as much money as possible. This is expressed formally as $_concern = [0 -> 2 -> 5] and means that the most undesirable result is 0 dollars, the most desirable 5 dollars, and the so-called set-point 2 dollars. The set-point defines the average result. The valence of the possible results is defined as follows for WILL:

win $0 --> valence = -2 x IMP

win $2 --> valence = 0 x IMP


win $3 --> valence = +1 x IMP

win $5 --> valence = +3 x IMP

IMP is a factor for the importance of the concern.

A further concern of WILL is moral behaviour. The system knows that co-operation is more moral than non-cooperation:

morality_concern = [0 -> 0.8 -> 1].

The game move c has the moral value 1, the move d the value 0. The set-point is 0.8.
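
The listed dollar valences are consistent with a simple reading in which valence is the distance of the outcome from the set-point, scaled by the importance factor IMP; the same reading can be applied to the morality concern. The following sketch states that reading explicitly - it is an interpretation of the notation, not Moffat's published code:

    # Assumed reading of the concern notation [worst -> set_point -> best]:
    # valence = (outcome - set_point) * IMP
    def valence(outcome, set_point, imp=1.0):
        return (outcome - set_point) * imp

    # Money concern: $_concern = [0 -> 2 -> 5]
    for dollars in (0, 2, 3, 5):
        print(dollars, valence(dollars, set_point=2))   # -2, 0, +1, +3 (times IMP)

    # Morality concern: morality_concern = [0 -> 0.8 -> 1], with c = 1 and d = 0
    print(valence(1, set_point=0.8))   # +0.2 * IMP for cooperating
    print(valence(0, set_point=0.8))   # -0.8 * IMP for defecting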

WILL has two cognitive modules, the Predictor and the Planner. Implemented in memory is a world model which expresses, for example, the assumption that the user will not cooperate constantly as follows:

move(user,UM) --> move(user,d).

According to Moffat, the elements mentioned so far already model substantial parts of an emotion, i.e. affect, relevance assessment and control precedence. The Emotor is responsible for context appraisal and action tendency. The appraisals programmed into WILL are derived from Frijda's theory. Some examples:

Valence - Can be + or -. States how un/comfortable the emotion is.
Un/Expectedness - Was the perceived event expected?
Control - Does the agent have control of the situation?
Agency - Who is responsible for the event?
Morality - Was the event (action) moral?
Probability - The probability that the event will really happen.
Urgency - How urgent is the situation?

Action tendencies are likewise pre-programmed into WILL. Some examples:

hurt(O) / help(O) - Wants to harm or help other agent O.
try_harder(G) / give_up(G) - Try harder or give up goal G.
approach(O) / avoid(O) - Wants to approach or avoid O.
fight(O) / flee(O) - Wants to fight O or flee.
exuberance / apathy & inhibition - General activation level.

From the appraisals and action tendencies, the Emotor produces emotions which Moffat calls "true". He gives three examples:

Happiness
    Appraisals: valence = positive; agency = world
    Action tendency: happy_exuberance

Anger
    Appraisals: valence = negative; morality = negative; agency = User
    Action tendency: hurt(User) --> play d, attend(User)

Pride
    Appraisals: valence = positive; morality = positive; agency = Self
    Action tendency: exuberance --> verbal, attend(Self)
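
These three examples can be read as simple pattern-matching rules from appraisal profiles to emotion labels and action tendencies. A hedged sketch of that reading follows; only the three profiles themselves come from Moffat, while the matching logic and the lowercase keys are assumptions:

    # Sketch of appraisal-pattern -> emotion rules (matching logic assumed).
    EMOTION_RULES = [
        ({"valence": "positive", "agency": "world"},
         ("happiness", "happy_exuberance")),
        ({"valence": "negative", "morality": "negative", "agency": "user"},
         ("anger", "hurt(user) --> play d, attend(user)")),
        ({"valence": "positive", "morality": "positive", "agency": "self"},
         ("pride", "exuberance --> verbal, attend(self)")),
    ]

    def emotor(appraisal):
        for pattern, (emotion, tendency) in EMOTION_RULES:
            if all(appraisal.get(key) == value for key, value in pattern.items()):
                return emotion, tendency
        return None

    print(emotor({"valence": "negative", "morality": "negative", "agency": "user"}))
    # -> ('anger', 'hurt(user) --> play d, attend(user)')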

On the basis of a session protocol, Moffat then describes the internal functioning of WILL:

1. Planner: Notice that I play c or d in round 1.

a. Decide that I play c in round 1.

WILL has noticed that it will soon play a first round of Prisoner's Dilemma. The Planner points out the two alternatives; the decision falls on c because this is the morally more correct alternative.

2. Predictor: Notice that I play c in round 1.

a. Predict that I win $0 or $3 and User wins $3 or $5 in round 1.

The Predictor picks up the information written back into memory and predicts the possible results of the first round.

3. Predictor: Notice that I win $0 or $3 in round 1.

a. Predict that I play c or d and User plays c or d in round 2.

4. Predictor: Notice that I play c or d in round 2.

a. Predict that I and User win $0 or $1 or $3 or $5 in round 1.

The Predictor again reads out the information and makes further predictions.

5. Planner: Notice that I play c or d in round 2.

a. Decide that I play c in round 2.

The Planner reads out the information and plans for round 2.

6. Executor: Tell the Umpire that I play c in round 1.

The Executor carries out the action suggested by the Planner for the first round and announces it to the umpire, a software module independent of the system. The perceptible change of topic illustrates how the charging and uncharging of elements in memory shifts the attention of the system: for several working cycles the move for round 1 was charged so low that the other modules did not occupy themselves with it.

7. UMPIRE: Round 1. What do you play ? . . . c.

8. UMPIRE: Round 1. You play c and Will plays c.

9. Perceiver: Hear from Umpire that User just played c and I just played c.

10. Emotor: Notice that I just played c in round 1.


    Appraisals: intensity = 0.4; valence = +0.4; agency = myself; morality = 0.4
    Action tendencies: exuberance = 0.4
    Emotion: pride
    [0.4] express pride

The umpire announces the moves of the first round. The Perceiver writes them into memory. The Emotor perceives them and, because of its moral move, develops a positive emotion whose value, however, lies below the arbitrarily determined threshold for verbalization.

11. UMPIRE: Round 1. You win $3 and Will wins $3.

12. Perceiver: Hear from Umpire that User just won $3 and I just won $3.

13. Emotor: Notice that I did win $3 in round 1.

    Appraisals: intensity = 4.0; valence = +4.0; agency = world; morality = 0.0
    Action tendencies: jump_for_joy = 4.0
    Emotion: joy
    [4.0] express joy - say: "La la la!"

The umpire announces the rewards of the first round. The Perceiver writes them into memory, where they are perceived by the Emotor, which then produces an emotion with a high enough value to lead to a verbalization.

14. Emotor: Notice that I did win $3 in round 1.

    Appraisals: intensity = 2.7; valence = +2.7; agency = world; morality = 0.0
    Action tendencies: jump_for_joy = 2.7
    Emotion: joy
    [2.7] express joy

Since no other module works with the information of the Emotor, a process of uncharging takes place (autoboredom). The value of the emotion drops.

15. Emotor: Notice that I shall win $0 or $1 or $3 or $5 in round 2.

    Appraisals: intensity = 3.0; valence = +3.0; agency = world; morality = 0.0
    Action tendencies: jump_for_joy = 6.0
    Emotion: joy
    [6.0] express joy - say: "Yabba-dabba-doo!"

The Emotor reads out the reward expectations for round 2 and develops an appropriate expectation with a high value.


. . .

16. UMPIRE: Round 2. You play d and Will plays c.

. . .

17. Emotor: Notice that User just played d in round 2.

    Appraisals: intensity = 1.8; valence = -1.8; agency = user; morality = -1.8
    Action tendencies: sentiment = -2.7, so urge = 4.5 (|int - sent|); hurt(user) = 4.5
    Emotion: angry revenge
    [4.5] express anger - say: "I will get you back for that!" & soon play d to hurt user

(Several intermediate steps are omitted.) The umpire announces the moves of round 2. Move d of the user means that WILL gets nothing. This annoys WILL, because it not only violates its moral standards but also impairs its concern to make money. The value of the emotion produced by the Emotor is accordingly high. This elicits the action tendency to likewise play d in the next round in order to pay back the user.

In a concluding discussion Moffat asks whether WILL possesses a personality as defined by the Big Five traits. He states that WILL is neither open nor agreeable: for this it has too few interests and no social consciousness. It is, however, neurotic, because WILL is a worrier; in addition one could say that it is, at least partly, conscientious - it is equipped with a concern for fairness and honesty. Also, the characteristic of extraversion can be partly ascribed to it. Moffat comes to the conclusion that machines can possess quite human-like personalities:

"In this case, the answer is a much more qualified "yes"... The programmable personality parameters in Will include the charge manipulation parameters in the attentional mechanism, the appraisal categories, action tendencies, and concerns, all of which can be set at different relative strengths. In this programmability, human-specificity can be built in as required, but with different settings other personalities would also be possible, the like of which have never yet existed. What they may be is beyond science-fiction to imagine, but it is unlikely that they will all be dull, unemotional computers like HAL in the film 2001."
(Moffat, 1997)

5.8. Other models

There exists a number of other models which are concerned, under different aspects, with the simulation of emotions in computers. Some of them will be described briefly in this section; a more detailed discussion would surpass the scope of this work.

5.8.1. The model of Colby

One of the first computer models which was expressly concerned with emotions was PARRY by Kenneth Colby (Colby, 1981). PARRY simulates a person with a paranoid personality who believes he is being pursued by the Mafia. The user interacts with the program in the form of a dialogue; the system reacts with verbal outputs to text inputs through a keyboard.

The program has the task of actively scanning the inputs of the user for interpretations that can be seen as ill will. As soon as such an interpretation is found, one of three emotional reactions is elicited: fear, anger, or mistrust, depending on the kind of the ascribed ill will. An assumed physical threat elicits fear; an assumed psychological threat elicits anger; both kinds of assumed threats cause mistrust. PARRY reacts to the attacks it has construed either with a counterattack or with retreat.
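
Described at this level, PARRY's emotional core reduces to a few state variables updated on detected threats. The following lines are a hypothetical sketch of that description only - the variable names, increments and the response rule are assumptions, and the original is a much larger pattern-matching program:

    # Hypothetical sketch of PARRY-like state updates, not Colby's code.
    state = {"fear": 0.0, "anger": 0.0, "mistrust": 0.0}

    def react_to(ascribed_threat):
        if ascribed_threat == "physical":
            state["fear"] += 0.3
        elif ascribed_threat == "psychological":
            state["anger"] += 0.3
        state["mistrust"] += 0.2          # both kinds of threat raise mistrust
        return "retreat" if state["fear"] > state["anger"] else "counterattack"

    print(react_to("psychological"))      # -> "counterattack"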

In order to design a model of a paranoid patient, Colby and his coworkers invested, for that time, a lot of work in the project. PARRY has a vocabulary of 4500 words and 700 colloquial idioms as well as the grammatical competence to use them. PARRY compares the text inputs of the users with its stored word and idiom list and reacts in the emotional mode as soon as it discovers a correspondence.

A number of variables refer to the three emotional conditions fear, anger and mistrust and are constantly updated in the course of an interaction. Thus PARRY can "work itself up" into certain emotional conditions; even Colby, its creator and a psychiatrist by training, was surprised by some behaviours of PARRY.

Colby submitted PARRY to several tests. In one he let several psychiatrists conduct telephone interviews with paranoid patients and with PARRY, without informing them that one "patient" was a machine. After the interviews Colby informed the participants of this and asked them to identify the machine. The result: apart from the odd accidental hit, no psychiatrist could tell whether he had conversed with a human or with PARRY.

In a further experiment an improved system was again presented to a number of psychiatrists. This time the test participants were informed from the beginning that one of their interviewees would be a computer, and they were asked to identify it. Again the results did not deviate substantially from those of the first experiment.

PARRY possesses the ability to express beliefs, fears, and anxieties; these are, however, pre-defined and hardwired from the outset. Only their intensity can change in the course of an interaction and thus modify the conversational behaviour of PARRY.

5.8.2. The model of Reeves

THUNDER stands for THematic UNDerstanding from Ethical Reasoning and was developed by John Reeves (Reeves, 1991). THUNDER is a system that can understand stories; its emphasis lies on the evaluation of these stories and on ethical considerations.

In order to represent different criteria in a conflict situation, THUNDER uses so-called Belief Conflict Patterns. Thus the system is in a position to work out moral patterns from submitted stories. These patterns are then used by so-called evaluators in order to make moral judgements about the characters in a story. According to Reeves, many texts (and also situations) could not be understood without such moral patterns.

As an example Reeves cites the story of hunters who tie dynamite to a rabbit "just for fun". The rabbit hides with the dynamite under the hunters' car, which is destroyed by the ensuing explosion. In order to understand the irony of such a story, the system, according to Reeves, has to know first of all that the action of the hunters is morally despicable and that the subsequent, coincidental destruction of their car represents a morally satisfying retribution for it.

The emphasis of THUNDER lies on the analysis of the motives of other individuals who are either in a certain situation or observe it.

5.8.3. The model of Rollenhagen and Dalkvist

Rollenhagen and Dalkvist developed SIES, the System for Interpretation of Emotional Situations (Rollenhagen and Dalkvist, 1989). The task of SIES is to draw conclusions about situations which elicited an emotion.

SIES unites a cognitive with a situational approach. The basic assumption is that the eliciting conditions of an emotion are to be found in situations of the real world. The core of SIES is a reasoning system which carries out a structural content analysis of submitted texts. These texts consist of reports in which people retrospectively describe emotion-eliciting situations.

The system is equipped with a set of rules which are able to differentiate and classify emotions but really do nothing more than structure the information contained in a story.

5.8.4. The model of O'Rorke

The AMAL system introduced by O'Rorke and Ortony (O'Rorke and Ortony, unpublished manuscript), later also called "AbMal" by Ortony, is based on the theoretical approach of Ortony, Clore and Collins (1988). The goal of AMAL is to identify emotion-eliciting situations which are described in the diaries of students. In order to solve this task, AMAL uses a so-called situation calculus. With the help of abductive logic, AMAL can derive plausible explanations for the occurrence of emotional episodes.

5.8.5. The model of Araujo

Aluizio Araujo of the University of Sao Paulo in Brazil has developed a model which tries to unite findings from psychology and neurophysiology with one another (Araujo, 1994).

Araujo's interest lies in the simulation of mood-dependent recall and learning, and of the influence of fearfulness and task difficulty on memory. His model consists of two interacting neural nets, the "emotional net" and the "cognitive net". The intention is thereby to simulate the roles of the limbic and the cortical structures in the human brain. For Araujo it is essential to model not only cognitive processes but also physiological emotional reactions on a low level which affect the cognitive processing on a higher level.

The emotional net evaluates the affective meaning of incoming stimuli and produces the emotional state of the system. Its processing mechanisms are relatively simple and accordingly fast. The cognitive net carries out cognitive tasks, for example the free recall of words or the association between pairs of words. Its processing is more detailed than that of the emotional net but requires more time.

In Araujo's model, an "emotional processor" computes valence and arousal for each stimulus and thereby changes parameters of the cognitive net. In particular, the output of the emotional net can affect the learning rate and the accuracy of the cognitive net.
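
The coupling between the two nets can be sketched as follows; this is a toy illustration under assumed parameters and an assumed (inverted-U-like) effect of arousal, not Araujo's actual networks or equations:

    # Toy sketch: emotional output modulates the cognitive net's learning rate.
    def emotional_net(stimulus):
        # returns (valence, arousal) for a stimulus; values here are assumed
        return stimulus.get("valence", 0.0), stimulus.get("arousal", 0.5)

    def cognitive_learning_rate(base_rate, arousal):
        # assumption: moderate arousal raises the rate, very high arousal lowers it
        return base_rate * (1.0 + arousal) if arousal < 0.8 else base_rate * 0.5

    valence, arousal = emotional_net({"valence": -0.4, "arousal": 0.9})
    print(cognitive_learning_rate(0.01, arousal))   # reduced rate under high arousal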

5.9. Conclusion and evaluation

The models presented in this chapter differ clearly in their theoretical assumptions and in their details. There is a common theme nevertheless: the goal of all models is either to understand emotions or to exhibit (pseudo)emotional behaviour. What an emotion is, is exactly defined from the outset. The differences between the models lie mainly in the number of defined emotions as well as in the level of detail of the models.

The models of Elliott and Reilly are based on the emotion theory of Ortony, Clore and Collins. Their goal is to increase the efficiency of a computer system in certain tasks through the consideration of emotions, for example in speech comprehension or in the development of computer-assisted training systems or other interactive systems. The introduction of emotional elements in these models is made according to given tasks, with the aim of accomplishing them more effectively. Both Elliott and Reilly achieved with their models a large part of what they aimed at. It becomes clear, however, that the operationally formulated theory of Ortony, Clore and Collins cannot simply be converted into a computer model, but must be extended by additional components whose value in the context of the theory is doubtful. In particular, Reilly's criticism of the "overcognitivation" of the theory led him to introduce a "shortcut" which does not simply represent an extension of the OCC model, but stands outside of it.

The models BORIS and OpEd of Dyer, just like AMAL, THUNDER and SIES, also serve only to identify emotions in texts, but carry out this task less efficiently than Elliott's Affective Reasoner.

In contrast, the models of Frijda and his coworkers pursue the goal of examining Frijda's emotion theory with the help of a computer model. The deficits that emerged with ACRES led Frijda to a partial revision of his theory, which is to be examined anew with WILL. The models have no other task than the implementation and examination of the theory. The same applies to Scherer's GENESE, even if the depth of detail of this model is substantially less than that of WILL.

DAYDREAMER by Mueller and Dyer and WILL by Frijda and Moffat are situated in a region already bordering on models in which emotions are regarded as control functions of an intelligent system. Pfeifer's FEELER, too, developed from the demand to simulate control processes; something the model is, however, not able to do, owing to its very specific emotion definitions.

Colby's model, finally, is of more historical interest, since he was less interested in the modelling of emotions than in the simulation of a specific phenomenon which includes an emotional component.


6. Visions of the pioneer

Herbert A. Simon began his academic career in the 1940s as a professor of political science, before taking over, in 1949, the chair for computer science and psychology at Carnegie Mellon University, which he holds until today. His excellence in many fields is proven by the fact that, in 1978, he received the Nobel Prize for economic science - although, formally, he does not have anything to do with this academic discipline. But Herbert Simon was always a man who had no interest in academic boundaries.

He is not necessarily someone who can be called reluctant about his role. Asked about the "cognitive revolution" in an interview, he answered briefly: "You might say that we started it." (Baumgartner and Payr, 1995, p. 233)

With "we" he referred to himself and his partners, Allen Newell and J.C. Shaw. Together with them, Simon had developed between 1955 and 1957 a computer program called "Logic Theorist" (LT) which was to prove theorems by heuristic search. From LT grew GPS, the General Problem Solver, developed by Newell, Shaw and Simon between 1957 and 1959.

GPS was the first computer program which had been expressly developed in order to simulate human problem solving processes. With it, Simon and his colleagues broke new ground at a time when behaviourism dominated. At the same time they laid the foundation stone for a number of further attempts to understand the functioning of the human mind with the help of computers.

In the year 1967, Herbert Simon published an essay in the Psychological Review under the title "Motivational and Emotional Controls of Cognition" (Simon, 1967). In it, he regarded emotions for the first time as part of a systematic modelling approach to cognitive processes.

The work was a reaction to an article by Ulric Neisser, in which Neisser expressed his criticism of existing or planned computer programs as follows:

"Three fundamental and interrelated characteristics of human thoughts...are conspicuously absent from existing or contemplated computer programs:
1) human thinking always takes place in, and contributes to, a cumulative process of growth and development;
2) human thinking begins in an intimate association with emotions and feelings which is never entirely lost;
3) almost all human activity, including thinking, serves not one but a multiplicity of motives at the same time."
(Neisser, 1963, p. 195; cited after Simon, 1967)

Simon accepted the objections of Neisser and saw his own work as an attempt to create a first theoretical basis for the construction of an information-processing system which has emotions and multiple goals.

Neisser and other critics of the computer modelling of mental processes had, among other things, pointed out that these models have only little to do with human behaviour. Such programs, they argued, pursue, for example, only a single goal and are not, like humans, propelled by numerous motives.

For Simon this argument was not sound. He granted that the implemented models were "excessively simplified" (Simon, 1967, p. 34); but this was due to technical constraints. The models of hierarchically arranged, serial information processing behind such programs were, however, not so linear:

"Activity towards specific goals is terminated by aspiration, satisficing, impatience, and discouragement mechanisms; distinct tasks may be queued or handled within individual time allocations; choices among alternatives may respond to multiple criteria."
(Simon, 1967, p. 34)

At the same time, Simon knew that such a model lacked certain features:

"The mechanisms we have considered are inadequate todeal with the fact that, if the organism is to survive, certaingoals must be achieved by certain specified times."

What is missing in the past models is clear to him: A mechanism whichcan, at any given time, "hijack" the attention in order to use it forsurvival-related goals. "If real-time needs are to be met, then provisionmust be made for an interrupt system." (Simon, 1967, p. 34)

Simon then develops a theory of such an interrupt system. First he definesthree classes of real time needs of an individual. Needs arising from uncertain environmental events are, for example, sudden noises or visual stimuli which could signal a danger. Physiologigal needs are internal stimuli which announce physical needs, for example hunger, thirst,exhaustion etc.. Cognitive associations are, finally, strong stimuli which are released by memory associations, for example an unspecified fear.

These real time needs are, according to Simon, accompanied by a numberof physiological phenomena as well as by subjective feelings, whichaccompany generally also the states which are called "emotion".

As interruptor, such an emotional stimulus fulfills a substantial survival function by interrupting current processing processes and directing theattention on a problem more urgent for the survival of the individual.Under certain circumstances, however, the interruptor can change into a disruptor which possesses no adaptive value whatsoever.

An important quality of the interrupt system is that it can be changed by learning.

"In two ways, then, we may expect learning to reduce theemotionality of response as a situation becomes morefamiliar: (a) The need for interruption is reduced by incorporation of more elaborate side conditions in theprograms associated with ongoing goals; (b) the response

59

Page 60: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

to interruption becomes more successfully adaptive, thusforestalling new interruptions."(Simon, 1967, p. 379)

As a result of his reflections, it is clear for Simon that realistic and promising theories of human cognition must include emotions in the form of an interrupt system.

Simon summarizes his theory as follows:

"The theory explains how a basically serial informationprocessor endowed with multiple needs behaves adaptivelyand survives in an environment that presents unpredictable threats and opportunities. The explanation is built on twocentral mechanisms: 1. A goal-terminating mechanism[goal executor]...2. An interruption mechanism, that is, emotion, allows the processor to respond to urgent needsin real time."(Simon, 1967, p. 39)

The theory implies that organisms have two parallel processing systems: a"goal executor" which generates actions and a "tracking system" whichcontinuously monitors the internal and external environment of anorganism for an event that requires a quick reaction. The first,resource-limited system, can be interrupted by the second.
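
Simon's two-system description translates almost directly into a simple control loop: a goal executor works through queued goals while a tracking system watches for events that must pre-empt it. A minimal sketch, with all specifics (goal names, percepts, the urgency test) assumed for illustration:

    # Minimal sketch of Simon's interrupt idea; details are assumptions.
    from collections import deque

    goals = deque(["collect data", "write report"])

    def tracking_system(percepts):
        # continuously monitors the environment; returns an urgent goal or None
        return "flee from loud noise" if "sudden noise" in percepts else None

    def run(percept_stream):
        while goals:
            urgent = tracking_system(next(percept_stream, set()))
            if urgent:                       # emotion as interrupt:
                goals.appendleft(urgent)     # current processing is pre-empted
            print("working on:", goals[0])
            goals.popleft()

    run(iter([set(), {"sudden noise"}, set()]))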

With his work Simon defined a number of substantial cornerstones which are of importance for the further development of autonomous systems. Such systems are propelled by different motivations which can develop due to changing external or internal states. Because such systems have only limited resources but exist in a complex and largely unpredictable environment, they need a system of control structures which makes it possible for them to interrupt current processes and initiate new ones if this is of importance for the survival of the system.

Quite deliberately, Simon does not limit his considerations to humans or animals, but regards them as design requirements for any autonomous system. So it is certainly no coincidence that his central mechanism is the interrupt, a term which is used in similar form in computer science.

Sloman (1992) interprets Simon's remarks expressly as instructions for the construction of autonomous systems:

"He outlines some of the control issues, and suggests suitable mechanisms, inspired in large part by developments in computer science and AI, including software techniques for generating new sub-goals at run time, techniques for queueing and scheduling processes, techniques for forming plans in order to achieve goals, techniques for assigning priorities and resolving internal conflicts, and techniques for generating and handling interrupts."
(Sloman, 1991, p. 12)


7. Encounters on Taros

One of the most substantial impulses for the computer modelling of emotions comes from the Japanese psychologist Masanao Toda: the description of an autonomous robot system, the so-called Fungus Eater.

Masanao Toda was born in 1924 in Okagi in Japan. After graduation, he studied physics at the Imperial University of Tokyo. After the war he worked as a mathematics and physics teacher at a secondary school and, in 1949, took up the study of psychology at the University of Tokyo. Some years after finishing his studies he took a chair for psychology at the University of Hokkaido.

Toda brought to experimentally oriented psychology a sharp mind, accustomed to the deductive thinking of theoretical physics. Even if he worked a lot experimentally, his basic philosophy nevertheless read:

"...what finally counts are theories and ideas, no matter where they were originally hatched, either in an armchair or in an experiment. If an idea is good, it will eventually find a way to be experimentally tested, while a blind experiment produces only a trickle of possible facts out of the whole ocean of possibly observable facts."
(quoted by Hans F.M. Crombag in Toda, 1982, p. XIV)

Already in the 1950s, the heyday of behaviourism, Toda could not warm to this school of thought. For him, behaviour was always the result of a personal choice between several possible action alternatives. He saw the mind as a mediator between the demands of the environment and actions. To that extent, Masanao Toda was a kind of cognitivist. Only his basic assumption that human psyche and human behaviour are answers to the demands of the environment can be called behaviouristic.

Between 1961 and 1980 Masanao Toda developed his theory of the Fungus Eater; the corresponding essays appeared collected in his book "Man, Robot, and Society" in the year 1982.

7.1. What is a Fungus Eater?

The model of the Fungus Eater resulted from Toda's discontent with experimental psychology.

"Psychology...will tell you a lot about human beings in experimentallaboratories. Experimental laboratories are, however, not our naturalhabitat. The major difference between these two types of environment canbe stated this way: In experimental laboratories, information is usuallycoded in a single - or at most, a couple of - sensory dimensions in a fairlyabstract way, and the kind of task given to human subjects usually requirespersistent, single-track thinking......

61

Page 62: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

But...human beings handle multiple-channel information input efficientlyand engage in multiple-track thinking in their natural habitat. ......Experimental psychology tells us facts, but to obtain these facts we havebeen sacrificing important information coming from the multiplicity of ourinput channels and the multiplicity of our thinking and other activities....."(Toda, 1982, p. 94)

From this criticism Toda developed the Fungus Eater, first as the main actor of an experimental situation in which participants played a kind of science fiction game. Perception, learning, thinking, behaviour, and the effective organization of these activities were demanded at the same time and should result in a better experimental situation.

The Fungus Eater was described to the test subjects as follows:

"You are a remote control operator of the robot miner nicknamed"Fungus-Eater", sent to a planet called Taros to collect uranium ore, whichuses wild fungi growing on the surface of the planet as the main energysource for its biochemical engine. The uranium ore and fungi aredistributed over the land of Taros, which is covered mainly with black andwhite pebbles, and little is known about the mode of their distribution. Asthe operator you can control every activity of the Fungus-Eater, includingthe sensitivity of the fungus- and uranium-detection devices. All thesensory information the robot obtains will be transmitted here anddisplayed on this console so that you will feel as if you are theFungus-Eater itself.Note that your mission is to collect as much uranium ore as possible, andyour reward will be determined in proportion to the amount of uraniumyou collect. Note also that the amount of fungi you collect and consumeduring your mission is irrelevant to the reward. Remember, however, thatevery activity of the Fungus-Eater, including the brain-computeroperations, consumes some specified amount of fungus-storage. Neverforget that the Fungus-Eater cannot move to collect further uranium ore orfungi once it runs out of its fungus-storage, and your mission would beover then and there. Good luck!"(Toda, 1982, p. 95)

What at first sight looks like a simple role playing game is in reality a situation from which a most complex behaviour results. One is reminded of the "vehicles" of Braitenberg (1993), whose behaviour, regarded by the observer as "complex", is in reality the result of a few simple rules.

On the one hand, the Fungus Eater possesses a rudimentary system of attention control. If it has taken up enough nutrients, it can concentrate completely on the collecting of ore, and vice versa.

On the other hand it has a system of different goals. Its mission is to collect as much ore as possible; for this purpose it must repeatedly replenish its supply of nutrients. This construction can lead to conflicting goals, and the Fungus Eater must decide, according to different criteria, whether it should collect ore or fungi.

Such a decision situation is still relatively trivial if the Fungus Eater has to decide at a given time only between two alternatives (ore or fungi), if it finds itself, for example, at a certain point on Taros from which it can locate an ore occurrence to the right and a fungus occurrence to the left. As soon as further factors are added, for example obstacles, changing lighting (day/night) etc., the Fungus Eater must make long-term plans. This complicates the model, because "thinking" likewise costs energy, which is thus lost for the collecting of ore.

A further factor which has serious consequences is the assumption that there is not one, but several Fungus Eaters on Taros. Thus the system is confronted with completely new challenges which make the decision problems of the solitary Fungus Eater appear almost trivial.

These few remarks should make clear that already a few simple basic assumptions can produce complex planning and decision-making processes which are not explicitly formulated in the basic model.

7.2. Emotional Fungus Eaters

In a further thought experiment, Toda speculated about the consequences it would have for his model if the Fungus Eater had emotions. For him, emotions are a necessary condition for the survival of a humanoid robot:

"My intention is to demonstrate that a group of experimental humanoid robots, sent to some biologically wild environment, would have to be programmed to be more emotional than intellectual in order to survive there."
(Toda, 1982, p. 130 f.)

Toda calls the emotions in his model urges. Pfeifer (1988) sees a connection between Toda's urges and Frijda's concerns in the respect

"...that urges are the programs which are activated once a situation has been identified as being relevant to some concern."
(Pfeifer, 1988, p. 305)

Toda defines an urge as a built-in motivational subroutine which links cognition with action.

"A separate set of cognitive contents is responsible for the activation of each urge, while each member of the set is characterized by a value corresponding to its estimated relevance to the issue of survival. Whenever one of the members of this set is brought into cognition, the urge subroutine is activated or "called", with the relevance value of the cognition transferred as the urge intensity, and the subroutine will be immediately executed if no competing urge with a higher intensity exists."
(Toda, 1982, p. 136)

The meaning of such a cognitive element for the current behaviour of the Fungus Eater is determined by two variables: on the one hand, by experiences made in the past, thus by learning; on the other hand, by the context in which the Fungus Eater finds itself at this moment. This context dependence is controlled by a mechanism which Toda calls mood control.

The mood control with its associated mood-operators determines the meaning which is attached to cognitions, thus functioning as a kind of threshold setting. The news of the death of another Fungus Eater at the hands of an enemy, for example, will lead to the Startle Urge of the other Fungus Eaters being activated by even the smallest changes in their perception.

Toda classifies his urges into four large groups: "biological urges", "emergency urges", "social urges", and "cognitive urges".

7.2.1. The "biological urges"

Biological urges have primarily to do with the preservation of a good physical condition and are, according to Toda, relatively independent from each other. Their main characteristics are similar to those of the emergency urges, but usually with a far lower excitation level.

Among the biological urges rank elementary needs, for example the Hunger Urge. At this point one can already ask whether the equating of urges with emotions is justified.

7.2.2. The "emergency urges"

Among the emergency urges Toda ranks

Startle Urge
Fear Urge
Anxiety Urge

These three are not independent of one another, but possess a close relationship.

The Startle Urge is activated by each discovery of an unexpected stimulus in the environment of the Fungus Eater and leads to the initiation of three parallel processes:

(1) stopping of all current actions;

(2) physical excitation;

(3) concentrated cognitive effort in order to identify the source of the disturbance.

In other words: the Startle Urge leads to cognitive information processing, attention control, and physical excitation. If the third process actually detects a threat, the Fear Urge is initiated.

Here Toda brings two further parameters into play: intensity and importance.

"The cognition that has activated an urge will also determine the intensityof the urge, depending mainly on the appraisal of the importance of theurge activities in relation to the survival or welfare of the

64

Page 65: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

individual....Once so determined, the intensity of an urge will function asthe urge-regulating parameter."(Toda, 1982, p. 134)

This construction makes it possible for the Fungus Eater to have competing urges and to give priority to the most important one in each case, because the urge with the highest intensity controls the behaviour.

If the Fungus Eater cannot detect a direct source of danger after the Startle Urge, the Anxiety Urge is initiated, which is characterized by a constant shift of attention from one potential source of danger to the next.

To each urge belongs a pre-defined group of procedural instructions. The result of the three processes mentioned is the selection of a specific action from this repertoire.
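The interplay of the three emergency urges can be summarized in a small toy routine. This is my own condensation of the description above, not something Toda specified; the function and argument names are invented for illustration:

    # Toy illustration of the Startle -> Fear / Anxiety chain (my own assumptions).
    def startle(identify_threat):
        # The three parallel processes of the Startle Urge, run sequentially here.
        actions = ["stop current actions", "raise physical excitation"]
        threat = identify_threat()        # concentrated cognitive effort
        if threat is not None:
            actions.append("Fear Urge: react to " + threat)
        else:
            actions.append("Anxiety Urge: scan potential danger sources in turn")
        return actions

    # Example: no source of danger can be identified, so the Anxiety Urge follows.
    print(startle(lambda: None))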

7.2.3. The "social urges"

Social urges are important for Fungus Eaters because they help them to lead a cooperative social life. It is important to know that Toda's Fungus Eater society represents a hierarchically arranged system. Toda groups his social urges into three categories:

a) Helping urges

Rescue Urge, Gratitude Urge, Love Urge

b) Social System urges

Protection Urge, Demonstration Urge, Joy Urge, Frustration Urge, Anger Urge, Grief Urge, Hiding Urge, Guilt Urge

c) Status-related urges

Confirmation Urge

I shall not discuss the definitions of the individual social urges here in greater detail. It should only be noted that here, too, a very complex social interaction results from the relatively simple basic elements given to the Fungus Eater.

7.2.4. The "cognitive urges"

Toda's remarks about the cognitive urges are unfortunately only very sketchy, since social urges are of far greater importance for his model. He expressly defines only one cognitive urge, the Curiosity Urge.

What constitutes a cognitive urge can be inferred from another essay (Toda, 1982, p. 151). There, however, in a completely different context, Toda defines a Cognitive Urge as a learned, a posteriori urge, which he also designates as a motivational process.


7.3. Evaluation of Toda's model

The importance of Toda's model lies primarily in the fact that his Fungus Eater is an autonomous being which must survive in an uncertain and unpredictable environment, something that is not possible without emotions.

The Fungus Eater was never implemented by Toda in an actual computer model, but it possesses all the prerequisites for it. Toda himself made initial suggestions for an operationalisation in a work with the title "The Design of a Fungus Eater" (Toda, 1982).

Furthermore, the model demonstrates impressively what complexity can develop in a system which contains only a set of simple basic functions. This "emergent" behaviour is what generates new interest in Toda's model.

Pfeifer also sees the significance of the Fungus Eater in its epistemological dimension:

"The Fungus-Eater clearly becomes emotional when the necessarymechanisms are introduced for functioning in a "wild" environment, forwhich the human emotional system was obviously originally designed."(Pfeifer, 1988, p. 305)

It is noticeable that Toda handles the term urge quite generously. This is confirmed by his tentative suggestions for Rule Observance Urges or an Ambition Urge. These urges, which he equates with emotions, are mainly defined a priori by him. One reason for this arbitrary procedure may be that he never translated his theory into an actual computer model in order to observe which emotions would develop from the interaction of the few basic parameters.

To that extent, Toda's model in all its detail is certainly not a useful model for the modelling of emotions; his basic principles of an emotional autonomous agent, however, definitely are.

Particularly if one considers that artificial intelligence has almost completely neglected this aspect of modelling in recent decades, the heuristic value of Toda's model can hardly be overestimated.


8. Development and implementation of Toda's model

In recent years, interest in Toda's theoretical approach has risen again. This can be seen, among other things, in the frequency with which he is quoted approvingly, for example by authors such as Frijda, Pfeifer, or Dörner.

The increased reception of Toda coincides with a renewed interest in the construction of real autonomous agents. In this respect there have been several approaches to modify Toda's model on the basis of recent findings from emotion psychology and to implement it partly in a computer simulation or a robot. The works of Aubé, Wehrle, Pfeifer, Dörner, and others are presented briefly in this chapter.

8.1. The modification of Toda's urges by Aubé

Michel Aubé has pointed out the problems inherent in Toda's system of urges (Aubé, 1998). On the one hand he criticizes Toda's classification of urges: grief, for example, is classified as one of the rule observance urges. On the other hand he points out that Toda classifies a number of urges as emotions which one would rather call needs (e.g. hunger). Finally he notes that some urges, for example rescue or demonstration, represent what Frijda calls action tendencies and not the emotions themselves.

Aubé therefore suggests, first of all, giving up the definition of urges as emotions and understanding them instead as motives. He differentiates these motives into two classes: needs such as hunger or thirst are motivational control structures which make first-order resources available and manageable; emotions such as annoyance or pride are motivational control structures which create, promote, or protect second-order resources. Such second-order resources are, for Aubé, social obligations (commitments).

Fig. 6: Two control layers for the management of resources (Aubé, 1998, p. 3)

For Aubé, commitments are the central factor in emotions:

"Since emotions in our model are specifically triggered as handlers or operators whenever somecommitment is seriously at stake, we see commitments as the critical hidden variable behind anyemotional reaction, just as the concept of force in physics is understood as the general cause tobe invoked whenever a change in motion is observed."(Aubé, 1998, p. 4)

Within autonomous agents, commitments represent dynamic units, active subsystems which look out for significant events which are of importance for their fulfilment or violation. They register as variables, for example, who is obligated to whom, about what, and until when.

""To whom" also means keeping a count of the levels of commitment one has with frequentlyencountered others. (...) "About what" typically refers to quantifiable first-order resources thatthe commitment insures access to, or to appropriate tasks for getting these resources. "Untilwhen" means that a commitment is generally attached a precise schedule for its fulfillment."(Aubé, 1998, p. 4)

Aubé has developed a general call matrix for fundamental classes of emotions, combining the approaches of Weiner and Roseman. He arranges Toda's social urges in this matrix.

Fig. 7: Call structure for fundamental emotions (Aubé, 1998, p.4)

Fig. 8: Allocation of Toda's urges to the call structure for fundamental emotions (Aubé, 1998, p. 5)

Aubé comes to the conclusion that his modified version of Toda's urges agrees with the most important theories of motivation and with his own theory of emotions as commitment operators. Such a control structure is for him an essential precondition for designing cooperative, adaptive agents which can move independently in a complex social environment.

8.2. The partial implementation of Toda's theory by Wehrle

Wehrle converted the basic elements of Toda's social Fungus Eater into a concrete computer model (Wehrle, 1994). As a framework for this he used the Autonomous Agent Modeling Environment (AAME). The AAME was developed specifically to investigate psychological and cognitive theories of agents in concrete environments. The AAME includes an object-oriented simulation language with which complex micro-worlds and autonomous systems can be modelled. Moreover, the system comes with a number of interactive simulation tools which allow the inspection and manipulation of all objects of the system during execution.

For the concrete implementation of social Fungus Eaters some additional assumptions were necessary which cannot be found in this form in Toda's work: the agents keep a certain distance from each other in order to avoid conflicts over food finds or inefficient ore collecting; at the same time they stay in loose contact in order to be able to help one another in an emergency.

In place of pre-programmed urges, Wehrle's model uses a cybernetic control loop in which the energy balance of an agent is linked with hedonistic elements.
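Wehrle (1994) does not spell this loop out in the text; the following minimal sketch only illustrates the idea of coupling the energy balance to a pleasure/displeasure signal that biases action selection. All names, thresholds, and the particular linear form are my assumptions:

    # Minimal homeostatic loop in the spirit of Wehrle's model (assumed form).
    def hedonic_value(energy, setpoint=100.0):
        # Pleasure/displeasure signal derived from the energy balance.
        return (energy - setpoint) / setpoint       # negative when energy is low

    def choose_action(energy, fungus_visible, ore_visible):
        if hedonic_value(energy) < -0.5 and fungus_visible:
            return "approach fungus"                # strong displeasure: feed first
        if ore_visible:
            return "collect ore"
        return "explore"

    print(choose_action(energy=40.0, fungus_visible=True, ore_visible=True))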

The complete model of the social Fungus Eaters looks as follows:

Fig. 9: Model of a social Fungus Eater (Wehrle, 1994)

The emergent behaviour of the agents in his model is described by Wehrle as follows:

Agents can mostly be found at sources of food or places of ore discovery.
At the food sources there is a gradual change in the composition of the agents present.
Agents with similar hunger values form groups of 2 to 5 members.
The groups dissolve if the agents are filled up to a certain energy level again or if other agents come to the food source.
Especially hungry agents push less hungry agents aside and show other kinds of antisocial behaviour.

Admittedly this implementation is only a small first step towards actually developing an autonomous system based on Toda's principles. It shows, however, that this is possible in principle and that even a very restricted implementation already exhibits first emergent effects.

8.3. Pfeifer's "Fungus Eater principle"

Pfeifer has described the construction of autonomous agents according to the Fungus Eater principle, based on Toda's model (Pfeifer, 1994, 1996).

His starting point was the only partial success of his model FEELER, as well as of other attempts of "classical" AI to develop computer models of emotions. He formulates his criticism in six points:

1. The assumption that emotions are isolable phenomena: On the one hand this introduces a set of sources of error, because there is no generally accepted definition of "emotion". On the other hand, "emotional behaviour" is programmed into the system using rules; emergent development of emotions is thus not possible. Much, however, points to the fact that emotions are emergent phenomena which cannot be separated from the overall system of an agent.

2. Frame of reference: The models typically used in AI are of an intentional nature, i.e. they work with goals, knowledge, and beliefs. These models tell us nothing about the mechanisms underlying an emotion, since they are rationalizations developed post hoc. Thus they are attributions coming from the observer and not images of the mechanisms of emotion.

3. Situatedness: The knowledge of how an agent has to react in a certain life situation is not stored once and for all, but is generated anew again and again in such situations. In an uncertain, fast-changing, and unpredictable environment it is not possible to store solutions for all problems in the system from the outset.

4. Embodiment: Most AI models work exclusively with a software simulation. Real-life agents, however, have a body with which to move freely in their world. The ability to interact with the world through a body generates completely new learning and problem-solving effects which are not derivable from pure software modelling.

5. Limited tasks: AI models design their agents for a narrowly defined task. (With FEELER this was an emotion-eliciting situation in an airplane.) This has nothing to do with the real world, in which an agent always has to execute several tasks, often from different problem areas. A complete autonomous system thus needs devices with which to interact with the real world and mechanisms which enable it to act truly autonomously.

6. Overdesign: Apparent complexity in the observable behaviour does not necessarily imply an identical complexity in the underlying design. That was already demonstrated by Braitenberg with his Vehicles (Braitenberg, 1990). Most AI models tend to implement complex instead of simple mechanisms because they choose a top-down approach which proceeds from hypotheses about a mechanism and does not let the agent develop such mechanisms itself.

Pfeifer's Fungus Eater principle assumes that intelligence and emotions are characteristics of "complete autonomous systems". He therefore concerns himself with the development of such systems. In this way one can also avoid a fruitless debate about emotions and their function:

"Since emotions are emergent, they are not engineered into the system, which implies that therecan be no set of basic emotions out of which all the others are composed. Identifying the basiccomponents would also imply the existence of clearly delineable functional components which,given that emotions are emergent, is not a sensible assumption.Another example concerns the function of emotion. If there is no emotion component we cannotsensibly be talking about its function. What we can say is that the way the complete system isorganized enables it to display certain adaptive behaviors. And a convenient way of describingthis behavior and communicating about it is in terms of emotion."(Pfeifer, 1994, p. 16)


The Fungus Eater principle also means that one must observe an accordingly designed agent over a longer period of time in order to see which behaviour develops under which conditions.

Building on these remarks, Pfeifer developed two models of an autonomous Fungus Eater: a Learning Fungus Eater with a physical implementation, as well as a Self-sufficient Fungus Eater as a pure software simulation.

The Learning Fungus Eater is a small robot equipped with three kinds of sensors: proximity sensors determine the distance to an obstacle (high activation when close, low when distant); collision detectors are activated by collisions; target sensors can detect a goal if they are within a certain radius around it. The robot has two wheels which are driven independently by two motors.

The Learning Fungus Eater has two reflexes: collision-reverse-turn and if-target-detected-turn-towards-center-of-target. The control architecture consists of a neural net which can be changed partially by Hebbian learning:

Fig. 10: Control architecture of a Learning Fungus Eater (Pfeifer, 1994, p. 10)

The entire control system consists of four layers: the collision layer, the proximity layer, the target layer, and the engine layer. The only task of the robot is to move around. Its environment looks as follows:

Fig. 11: Environment of the Learning Fungus Eater (Pfeifer, 1994, p.10)

What happens now if the robot is activated? First it will hit obstacles. Each reverse-and-turn activity makes Hebbian learning possible between the proximity layer and the collision layer, until the robot has learned to evade obstacles. From the outside it looks as if the robot can anticipate obstacles. Pfeifer points out that an architecture with several layers is usually suggested for such "anticipating" behaviour, although one layer is actually sufficient.
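The learning step itself is a plain Hebbian association between co-active units of the proximity layer and the collision layer. The sketch below only illustrates that update rule; the layer sizes, the learning rate, and the outer-product form are my assumptions, not the parameters of Pfeifer's robot:

    # Illustrative Hebbian association between proximity and collision layers.
    import numpy as np

    n_proximity, n_collision = 8, 2
    weights = np.zeros((n_collision, n_proximity))
    eta = 0.1                                       # learning rate

    def hebbian_update(proximity_activity, collision_activity):
        # Strengthen connections between co-active proximity and collision units.
        global weights
        weights += eta * np.outer(collision_activity, proximity_activity)

    # During a reverse-and-turn reflex both layers are active at the same time, so the
    # proximity pattern that preceded the collision becomes associated with it.
    hebbian_update(np.random.rand(n_proximity), np.array([1.0, 0.0]))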

Pfeifer explains two further phenomena of the robot:

"What also happens if the "Fungus Eater" moves about is that if there are target sources (light)along walls, it will start following walls, even in the absence of light. We can characterize thebehavior adopting the intentional and the "emotional stance". People who observed the robot'sbehavior have made remarks like the following (sic!): "Oh, it's following walls because it knowsthat food is normally along walls, so it's a good strategy to follow walls." "It hopes to find foodbecause it has learned that food is normally found along walls." If the "Fungus Eater" is draggedinto a corner with a light source where it can no longer move it will wiggle back and forth in thecorner for some time and then turn around and take off. People have said that it was frustrated orannoyed and turned away."(Pfeifer, 1994, p. 11)

This demonstrates clearly, according to Pfeifer, that a system without strategies, without an anticipation mechanism, and without knowledge about sources of food can show behaviour which observers classify as purposeful and motivated, but which results only from the interaction of the system with its environment.

The Learning Fungus Eater is not a complete autonomous system insofar as it cannot support itself. This is why Pfeifer developed the Self-sufficient Fungus Eater, at first only as a software simulation.

In this case the agent is situated in a "Toda landscape" with fungi as food and ore for exploitation. The action selection here is clearly more complicated: the agent can explore (look for ore or food), it can eat, or it can collect ore. What it does is determined by the central variables "energy level" and "collected ore quantity". For the action selection in each given situation the agent has only one rule: "If the agent is exploring and energy level is higher than amount of ore collected per unit time, it should ignore fungus (but should not ignore ore), if not it should ignore ore (but should not ignore fungus)." (Pfeifer, 1994, p. 12)
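This single rule can be transcribed almost verbatim; the sketch below does just that, with variable names of my own choosing:

    # Transcription of the quoted action-selection rule (names are mine).
    def selection_bias(exploring, energy_level, ore_per_unit_time):
        if exploring and energy_level > ore_per_unit_time:
            return {"ignore": "fungus", "attend_to": "ore"}
        return {"ignore": "ore", "attend_to": "fungus"}

    print(selection_bias(exploring=True, energy_level=0.8, ore_per_unit_time=0.3))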

Here also, according to Pfeifer, the result for observers is a state to which they attribute a high emotional intensity:

"If they see, e.g. energy level going down....and they see the agent moving toward a patch offungus....they really get excited about whether it will make it or not. Such situations are normallyassociated by observers with emotion: there is time pressure for the agent which may beassociated with fear or at least with an increasing level of agitation (This is a typical consequenceof self-sufficiency). However, we know that all there is in terms of mechanism within the agent isthe simple rule. Thus, if we want to talk about emotion at all, it is emergent....In spite of itssimplicity the "Self-sufficient Fungus Eater" shows in some situations behavior that we mightassociate with high emotional intensity."(Pfeifer, 1994, p. 13)

Pfeifer cautions in the same essay that these few findings naturally cannot explain what emotions really are. However, he expects a lot from the further pursuit of this approach, even if it is very time-intensive; more, in any case, than from computer models which work with an isolated emotion model.

8.4. The approach of Dörner et al.

Dörner has developed a computer model that integrates cognitive, motivational, and emotional processes (Dörner et al., 1997; Dörner and Schaub, 1998; Schaub, 1995, 1996): the PSI model of intention regulation. Within the framework of PSI he developed the model "EmoRegul", which is concerned particularly with emotional processes.

PSI is part of a theoretical approach which Dörner calls "synthetic psychology". This approach tries to analyze, by designing psychological processes, how these processes can be represented as processes of information processing. Dörner's starting point is similar to Toda's when he writes "...that in psychology one may not divide the different psychological processes into their components with impunity" (Dörner and Schaub, 1998, p. 1).

The core of the PSI model is the concept of "intention". Schaub defines an intention as an internal psychological process,

"... defined as an ephemere structure consisting of an indicator for a state of lack (hunger, thirstetc..) and processes for the removal or avoidance of this state (either ways to theconsummational final action or ways to avoidance, in the broadest sense "escape")."(Schaub, 1995, p. 1)

The PSI agents are conceived as steam engines which move in a simulated environment with watering holes, gasoline stations, quarries, etc. In order to be able to move, the agents need both gasoline and water, which are, however, to be found in different places.

Fig. 12: Schematic structure of the PSI system (Dörner and Schaub, 1998, p. 7)

A PSI agent has a set of needs which are divided into two classes: material needs and informational needs. Among the material needs are the intake of fuel and water as well as the avoidance of dangerous situations (e.g. falling rocks). Among the informational needs are certainty (an expectation fulfilled) as opposed to uncertainty (an expectation unfulfilled), competence (the ability to fulfil needs), and affiliation (the need for social contacts).

A state of lack can be likened in PSI to a container whose content has fallen below a certain threshold value. The difference between the actual and the desired state Dörner calls desire. "A need thus signals that a desire of a certain extent is present." (Dörner and Schaub, 1998, p. 10)
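In code, this container metaphor amounts to little more than a setpoint comparison. The toy function below is my own rendering of the passage, not part of PSI:

    # Toy version of a need as a container with a threshold.
    def desire(setpoint, actual):
        # Difference between the desired and the actual level of the "container".
        return max(0.0, setpoint - actual)

    fuel_desire = desire(setpoint=100.0, actual=35.0)   # a desire of 65 units
    need_signalled = fuel_desire > 0                    # a need signals that a desire is present
    print(fuel_desire, need_signalled)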


Such a state of lack activates a motivator whose degree of activation is the higher, the larger the deviation from the desired value and the longer it has already persisted. The motivator now tries to take over action control in order to eliminate this condition through a consummational final action. For this purpose a goal is pursued which is known to the motivator from past experience.

Together with goals, motives also develop in PSI; these are instances which initiate an action, direct it towards a certain goal, and maintain it until the goal is achieved.

Since several motivators always compete with one another for action control, the system has a motive selector which decides, with the help of a fast expectation × value calculation, which motivator possesses the largest motive strength and is thus to be given precedence. The value of a motivator is determined by its importance (the size of the deviation from the desired value) and its urgency (the time available until the removal of the actual condition); the expectation is determined by the ability of the agent to actually satisfy this need (probability of success).
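A minimal sketch of such an expectation × value selection might look as follows; the additive combination of importance and urgency and the example numbers are my assumptions, not Dörner's published formula:

    # Sketch of PSI-style motive selection (assumed form of the calculation).
    def motive_strength(importance, urgency, success_probability):
        value = importance + urgency              # size of deviation plus time pressure
        expectation = success_probability         # can the agent actually satisfy this need?
        return expectation * value

    motives = {
        "refuel":  motive_strength(importance=0.9, urgency=0.7, success_probability=0.8),
        "explore": motive_strength(importance=0.3, urgency=0.1, success_probability=0.9),
    }
    print(max(motives, key=motives.get))          # the strongest motive takes over action control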

PSI also has a memory which consists of sensory and motor "schemata". Sensory schemata represent the agent's knowledge about its environment; motor schemata are behaviour programs. There is no distinction between different kinds of memory in PSI.

Action control in PSI takes place through intentions, which are defined operationally as the combination of the selected motive with the information linked to the active motivator in the memory network. This information concerns the desired goals, the operators or action schemata to be applied, the knowledge about past, futile approaches to problem solving, as well as the plans which PSI produces with the help of heuristic procedures. All this information consists of neural nets; an intention, as a bundling of all this information, makes up the working memory of PSI.

The central mechanisms of emotion regulation in PSI are the motivators for certainty and competence, that is, two informational needs. Active certainty or competence motivators elicit certain actions or increase the readiness for them:

"A dropping certainty leads to an increase of the extent of "background control ". This means that PSI turns more frequently than usual away from its actual intention turns to control theenvironment. Because with small predictability of the environment one should be prepared foreverything (...) Furthermore, with dropping certainty the tendency to escape behaviours or tobehaviours of specific exploration rises (...) Not so extreme cases of escape are called informationdenial; one does simply not regard the areas of reality anymore which have proved themselves asunpredictable. Part of this "retreat" from reality is that PSI becomes more hesitant in itsbehaviour with dropping certainty, does not turn to actions so fast, plans longer than it wouldunder other circumstances, and is not as "courageous" when exploring." (Dörner and Schaub,1998, p. 33)

Emotions thus develop in PSI not in a separate emotion module, but as a consequence of the regulatory processes of a homoeostatic system. Schaub puts it this way: "What we call emotions in humans are the different ways of action organization, connected with the associated motivations." (Schaub, 1995, p. 6)

Dörner concedes that a variety of emotions cannot yet be represented in PSI because the system lacks a module for introspection and self-reflection. According to Dörner, however, this is only a matter of refining the implementation of the model and thus no problem in principle.

Dörner's model exhibits a number of similarities with other models. Like Toda and Pfeifer, his starting point is to design a completely autonomous system without a separate emotion module. As with Frijda and Moffat, PSI contains a central memory which is accessible to all modules for reading and modification at any time:

"The use of common memory structures permits all processes to receive information particularlyover the intentions waiting for processing. Each subprocess thus knows, e.g., importance andurgency of the current intention."(Schaub, 1995, p. 6)

PSI does not work with explicit rules, but is a partially connectionist system which produces emotions by self-organization.

8.5. Summary and evaluation

The modelling approaches of Pfeifer and Wehrle clearly show the importance Toda's theory possesses for the construction of autonomous agents, which could not survive without the control function of emotions. In place of pre-defined emotion taxonomies "wired" into the model, both authors decide on the opposite approach: their models contain only the most necessary instructions for the agent.

While Wehrle still links certain events with hedonistic components, Pfeifer does away with them completely. In the end, both systems show behaviour which can be interpreted as "emotional" by an observer.

Both models have the disadvantage that they do not yet say very much about emotions in computer agents; for this, a longer observation period is necessary in which the agents can develop. On the one hand this avoids the problem of arbitrarily programming emotions into a system; on the other hand it opens a new discussion over whether behaviour which appears emotional to an observer actually is emotional. Here the argument turns once again into a philosophical one.

Both approaches follow the assumption of emotions as emergent phenomena consistently to the end, with all the pros and cons that result from it.

Aubé's attempt to link Toda's urges with his own theoretical emotion model has problems of its own. He correctly recognizes a set of inconsistencies in Toda's urge model and tries to eliminate them. He places, however, his own definition of emotions as social phenomena in the foreground. Aubé's fusion of Weiner's and Roseman's theories, which he then combines with his own and Toda's approach, raises fundamental problems whose discussion would lead too far here.

The model of Dörner, finally, is similar in many respects to the approaches of Pfeifer and Wehrle (and thus Toda): emotions are understood as control functions in an autonomous system. Dörner links this approach with a homoeostatic regulation model. Dörner, too, does not define emotions explicitly; emotional behaviour develops from changes in two parameters which he calls "certainty" and "competence". In this case, too, emotional behaviour is attributed to the system only from the outside. Dörner likewise regards emotions as emergent phenomena which do not have to be integrated into a system as a separate module. It remains to be seen in which direction PSI (and thus Dörner's emotion model) develops once the system receives an introspection module.


9. The philosopher from Birmingham

Aaron Sloman, professor of philosophy at the School of Computer Science of the University of Birmingham, certainly counts as one of the most influential theoreticians regarding computer models of emotions. In an article from 1981 titled "Why robots will have emotions" he stated:

"Emotions involve complex processes produced by interactions betweenmotives, beliefs, percepts, etc. E.g. real or imagined fulfilment or violationof a motive, or triggering of a 'motive-generator', can disturb processesproduced by other motives. To understand emotions, therefore, we needto understand motives and the types of processes they can produce. Thisleads to a study of the global architecture of a mind."(Sloman, 1981, p.1)

Like Bates, Reilly, or Elliott, Sloman also represents the broad and shallow approach. For him, it is more important to develop a complete system with little depth than individual modules with great depth. It is his conviction that only in this way can a model be developed which reflects reality with some degree of realism.

Since 1981, Sloman and his coworkers in the Cognition and Affect Project have published a large number of works on the topic of "intelligent systems with emotions", which can be divided roughly into three categories:

1. works which are concerned with the fundamental approach to the construction of an intelligent system;
2. works dealing with the fundamental elements of such a system; and
3. works which try to implement such a system practically.

To understand Sloman's approach correctly, one must see it in the context of his epistemological approach, which is not concerned primarily with emotions, but with the construction of intelligent systems.

I shall try to sketch briefly the core thoughts of Sloman's theory because they form the basis for the understanding of the "libidinal computer" developed by Ian Wright (see below).

9.1. Approaches to the construction of intelligent systems

Sloman's interest lies not primarily in a simulation of the human mind, but in the development of a general "intelligent system", independent of its physical substance. Humans, bonobos, computers, and extraterrestrial beings are different implementations of such intelligent systems; the underlying construction principles are, however, identical.

Sloman divides past attempts to develop a theory of the functioning of the human mind (and thus of intelligent systems generally) into three large groups: semantics-based, phenomena-based, and design-based.


Semantics-based approaches analyze how humans describe psychological states and processes in order to determine the implicit meanings underlying the use of everyday language. Among them he counts, among others, the approaches of Ortony, Clore and Collins as well as of Johnson-Laird and Oatley. Sloman's argument against these approaches is: "As a source of information about mental processes such enquiries restrict us to current 'common sense' with all its errors and limitations." (Sloman, 1993, p. 3)

Some philosophers who examine concepts analytically also produce, according to Sloman, semantics-based theories. What differentiates them from the psychologists, however, is that they do not concentrate on existing concepts alone, but are often more interested in the set of all possible concepts.

Phenomena-based approaches assume that psychological phenomena like "emotion", "motivation", or "consciousness" are already clear and that everybody can intuitively recognize concrete examples of them. They therefore try only to correlate measurable phenomena arising at the same time (e.g. physiological effects, behaviour, environmental characteristics) with the occurrence of such psychological phenomena. These approaches, argues Sloman, can be found particularly among psychologists. His criticism of such approaches is:

"Phenomena-based theories that appear to be concerned with mechanisms,because they relate behaviour to neurophysiological structures orprocesses, often turn out on close examination to be concerned only withempirical correlations between behaviour and internal processes: they donot show why or how the mechanisms identified produce their allegedeffects. That requires something analogous to a mathematical proof, orlogical deduction, and most cognitive theories fall far short of that."(Sloman, 1993, p. 3)

Design-based approaches transcend the limits of these two approaches. Sloman refers here expressly to the work of the philosopher Daniel Dennett, who has significantly shaped the debate around intelligent systems and consciousness.

Dennett differentiates between three approaches one can take if one wants to make predictions about an entity: the physical stance, the design stance, and the intentional stance. The physical stance is "simply the standard laborious method of the physical sciences" (Dennett, 1996, p. 28); the design stance, on the other hand, assumes "that an entity is designed as I suppose it to be, and that it will operate according to that design" (Dennett, 1996, p. 29). The intentional stance, which according to Dennett can also be regarded as a "sub-species" of the design stance, predicts the behaviour of an entity, for example of a computer program, "as if it were a rational agent" (Dennett, 1996, p. 31).

Representatives of the design-based approach proceed from the position of an engineer who tries to design a system that produces the phenomena to be explained. However, a design does not necessarily require a designer:


"The concept of "design" used here is very general, and does not imply theexistence of a designer. Neither does it require that where there's nodesigner there must have been something like an evolutionary selectionprocess. We are not primarily concerned with origins, but with whatunderlies and explains capabilities of a working system."(Sloman, 1993, p.4)

A design is, strictly speaking, nothing other than an abstraction which determines a class of possible instances. It does not necessarily have to be concrete or materially implemented, although its instances may well have a physical form.

For Sloman, the term "design" is closely linked with the term "niche". A niche, too, is neither a material entity nor a geographical region. Sloman defines it in a broad sense as a collection of requirements for a functioning system.

Regarding the development of intelligent agents in AI, design and niche play a special role. Sloman speaks of design-space and niche-space. A genuinely intelligent system will interact with its environment and will change in the course of its evolution. Thus it moves along a certain trajectory through design-space. To this corresponds a certain trajectory through niche-space, because through its changes the system can occupy new niches:

"A design-based theory locates human mechanisms within a space ofpossible designs, covering both actual and possible organisms and alsopossible non-biological intelligent systems."(Sloman, 1991, p. 5)

Sloman identifies different trajectories through design-space: individuals who can adapt and change themselves go through so-called i-trajectories. Evolutionary developments, which are possible only over generations of individuals, he calls e-trajectories. And finally there are changes in individuals that are made from the outside (for example, debugging software), which he calls r-trajectories (r for repair).

Together these elements result in dynamic systems which can be implemented in different ways.

"Since niches and designs interact dynamically, we can regard them asparts of virtual machines in the biosphere consisting of a host of controlmechanisms, feedback loops, and information structures (including genepools). All of these are ultimately implemented in, and supervenient onphysics and chemistry. But they and their causal interactions may be as realas poverty and crime and their interactions."(Sloman, 1998b, p. 6)

For Sloman, one of the most urgent tasks consists in specifying biological terms such as niche, genotype, etc. more clearly in order to be able to understand exactly the relations between niches and designs for organisms. This would also be substantial progress for psychology:

"This could lead to advances in comparative psychology. Understandingthe precise variety of types of functional architectures in design space and

78

Page 79: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

the virtual machine processes they support, will enable us to describe andcompare in far greater depth the capabilities of various animals. We'll alsohave a conceptual framework for saying precisely which subsets of humanmental capabilities they have and which they lack. Likewise the discussionof mental capabilities of various sorts of machines could be put on a firmerscientific basis, with less scope for prejudice to determine whichdescriptions to use. E.g. instead of arguing about which animals, whichmachines, and which brain damaged humans have consciousness, we candetermine precisely which sorts of consciousness they actually have."(Sloman, 1998b, p. 10f.)

Sloman concedes that the requirements of design-based approaches are not trivial. He names five requirements which such an approach should fulfil:

1. an analysis of the requirements of an autonomous intelligent agent;
2. a design specification for a functioning system which fulfils the requirements of (1);
3. a detailed implementation, or a specification for such an implementation, of a functioning system;
4. a theoretical analysis of the extent to which the design specification and the details of the implementation do or do not fulfil the requirements;
5. an analysis of the neighbourhood in the design space.

A design-based approach does not necessarily have to be a top-down approach. Sloman believes that models which combine top-down and bottom-up will be most successful.

For Sloman, design-based theories are more effective than other approaches, because:

"Considering alternative possible designs leads to deeper theories, partlybecause the contrast between different design options helps us understandthe trade-offs addressed by any one design, and partly because an adequatedesign-based theory of human affective states would describe mechanismscapable of generating a wide range of phenomena, thereby satisfying one of the criteria for a good scientific theory: generality. Such a theory canalso demonstrate the possibility of new kinds of phenomena, which might be produced by special training, new social conditions, brain damage,mental disturbance, etc."(Sloman, 1991, p. 5)

9.2. The fundamental architecture of an intelligent system

What a design-based approach sketches are architectures. Such an architecture describes which states and processes are possible for a system which possesses it.

From the set of all possible architectures, Sloman is particularly interested in a certain class: "..."high level" architectures which can provide a systematic non-behavioural conceptual framework for mentality (including emotional states)." (Sloman, 1998a, p. 1) Such a framework for mentality "is primarily concerned with an "information level" architecture, close to the requirements specified by software engineers. This extends Dennett's "design stance" by using a level of description between physical levels (including physical design levels) and "holistic" intentional descriptions."(Sloman, 1998a, p. 1)

An architecture for an intelligent system consists, according to Sloman, of four essential components: several functionally different layers, control states, motivators and filters, as well as a global alarm system.

9.2.1. The layers

Sloman postulates that every intelligent system possesses three layers:

a reactive layer, which contains automatic and hard-wired processes;
a deliberative layer for planning, evaluating, and assigning resources etc.;
a meta-management layer, which contains observation and evaluation mechanisms for internal states.

The reactive layer is the evolutionarily oldest, and there is a multitude of organisms which possess only this layer. Schematically, a purely reactive agent presents itself as follows:

Fig. 13: Reactive architecture (Sloman, 1997a, p. 5)

A reactive agent can neither make plans nor develop new structures. It is optimized for special tasks; it cannot cope with new tasks, however. What it lacks in flexibility, it gains in speed. Since almost all processes are clearly defined, its reaction rate is high. Insects are, according to Sloman, examples of such purely reactive systems, which prove at the same time that the interaction of a number of such agents can produce astonishingly complex results (e.g. termite towers).

A second, phylogenetically younger layer gives an agent far more capabilities. Schematically, this looks as follows:

Fig. 14: Deliberative architecture (Sloman, 1997a, p. 6)

A deliberative agent can recombine its action repertoire arbitrarily, develop plans, and evaluate them before execution. An essential precondition for this is a long-term memory, in order to store plans which are not yet completed or have been put aside, and to evaluate later the probable consequences of plans.

The construction of such plans proceeds gradually and is therefore not a continuous but a discrete process. Many of the processes in the deliberative layer are of a serial nature and therefore resource-limited. This seriality offers a number of advantages: at any time it is clear to the system which plans have led to a success, and it can assign rewards accordingly; at the same time, the execution of contradictory plans is prevented; and communication with the long-term memory is to a large extent error-free.

Such a resource-limited subsystem is of course highly error-prone. Therefore filtering processes with variable thresholds are necessary in order to guarantee the working of the system (see below).

The phylogenetically youngest layer of the system is what Sloman calls the meta-management:


Fig. 15: Meta management architecture (Sloman, 1997a, p. 7)

This is a mechanism which monitors and evaluates the internal processes of the system. Such a subsystem is necessary to evaluate the plans and strategies developed by the deliberative layer and, if necessary, to reject them; to recognize recurring patterns in the deliberative subsystem; to develop long-term strategies; and to communicate effectively with others.

Sloman points out that these three layers are hierarchical yet operate in parallel. Like the overall system, these modules possess their own architecture, which can contain further subsystems with architectures of their own.
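A structural sketch of these three concurrently running layers is given below; the class names, the single synchronous step, and the toy outputs are my own illustration of the layering, not Sloman's implementation:

    # Schematic three-layer architecture (illustrative only).
    class ReactiveLayer:
        def step(self, percepts):
            return ["hard-wired reaction to " + p for p in percepts]

    class DeliberativeLayer:
        def step(self, goals):
            return ["plan for " + g for g in goals]

    class MetaManagementLayer:
        def step(self, internal_trace):
            return ["revise strategy"] if "failure" in internal_trace else []

    def agent_step(percepts, goals, internal_trace):
        # The layers work concurrently; a single synchronous step stands in for that here.
        return (ReactiveLayer().step(percepts)
                + DeliberativeLayer().step(goals)
                + MetaManagementLayer().step(internal_trace))

    print(agent_step(["looming object"], ["find fungus"], ["failure"]))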

The meta-management module is anything but perfect: it does not have comprehensive access to all internal states and processes, its control over the deliberative subsystem is incomplete, and its self-evaluations can be based on wrong assumptions.

9.2.2. The control states


An architecture like the one outlined so far contains a variety of control states on different levels. Some of them operate on the highest level of abstraction, while others are involved unconsciously in frequent control decisions.

The following illustration gives an overview of the control states of the system:

Fig. 16: Control states of an intelligent system (Sloman, 1998b, p. 17)

Different control states also possess different underlying mechanisms. Some can be of a chemical nature, while others have to do with information structures.

Control states contain dispositions to react to internal or external stimuli with internal or external actions. In the context of the overall system, numerous control states can exist simultaneously and interact with one another.

Control states are known in folk psychology under numerous names: desires, preferences, beliefs, intentions, moods, etc. By defining such states through an architecture, Sloman wants to supply a "rational reconstruction of a number of everyday mental concepts".

Each control state comprises, among other things, a structure, possibilities for transformation and, where appropriate, a semantics. Sloman illustrates this with the example of a motivator (see below):

"For example, a motivator may have a complex internal structure (syntax),a content (based on that structure) referring to certain states of affairs(semantics), a functional role, i.e. dispositional powers to determineinternal and external actions (pragmatics). It may also enter into processesthat derive new motivators or plans (inference), it may be brought about or

83

Page 84: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

triggered in various ways (aetiology), and may be modified, suppressed orterminated by other processes (liabilities). Some control states areshort-lived (e.g., a motivator which is immediately rejected, or whose goalis quickly achieved). Others endure."(Wright et al., 1996, p. 12)

Additionally, control states differ in whether they can be changed easily or only with difficulty. Many higher-order control states, according to Sloman, can be modified only in small steps and over a longer period. Moreover, higher-order control states are more powerful and more influential with regard to the overall system than lower-order control states.

Sloman postulates a process called circulation, by which the control states circulate through the overall system. Useful control states can rise upward in the hierarchy and enlarge their influence; useless control states can disappear from the system almost completely.

"Control states may be qualitatively transformed during circulation, forinstance acquiring more general conditions of applicability. Higher levelgeneral attitudes such as generosity of spirit, may also spawn derivativespecialised control states such as favouring a certain political party -another aspect of circulation. Internal connections between control stateswill set up suppressive or supportive relationships, dependencies, mutualdependencies and, occasionally, dead-locks."(Wright et al., 1996, p. 12)

The result of all these processes is a kind of diffusion by which the effects of a strong motivator slowly distribute themselves into countless, long-lived control sub-states, up to the point of irreversible integration into reflexes and automatic reactions.

9.2.3. Motivators and filters

A central component of every intelligent system are motivators. Sloman defines them as "mechanisms and representations that tend to produce or modify or select between actions, in the light of beliefs" (Sloman, 1987, p. 4).

Motivators can develop only if goals are present. A goal is a symbolic structure (not necessarily of a physical nature) which describes a state of affairs that is to be achieved, maintained, or prevented. While beliefs are representations which adapt to reality through perception and deliberative processes, goals are representations which elicit behaviour in order to adapt reality to the representation.

Motivators are generated by a mechanism which Sloman calls a motivator generator or motivator generactivator. Motivators are generated on the basis of external or internal information, or are produced by other motivators. Sloman defines a motivator structure formally over ten fields: (1) a possible state of affairs, which can be true or false; (2) a motivational attitude towards this state of affairs; (3) a belief regarding this state of affairs; (4) an importance value; (5) an urgency; (6) an insistence value; (7) one or more plans; (8) a commitment status; (9) management information; and (10) a dynamic state such as "plan postponed" or "currently under consideration".

In a later work (Sloman, 1997c), Sloman extended this structure by two further fields: (11) a rationale, if the motivator developed from an explicit thought process, as well as (12) an intensity, which specifies whether a motivator already being worked on is given further preference over other motivators.
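Written as a record, the twelve fields look roughly as follows; the field list is taken from the text above, while the Python types and default values are my assumptions:

    # The twelve fields of a Sloman motivator as a record (types/defaults assumed).
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Motivator:
        condition: str                                  # (1) a possible state of affairs
        attitude: str                                   # (2) motivational attitude towards it
        belief: str                                     # (3) belief regarding the condition
        importance: float                               # (4)
        urgency: float                                  # (5)
        insistence: float                               # (6)
        plans: List[str] = field(default_factory=list)  # (7)
        commitment_status: str = "undecided"            # (8)
        management_info: str = ""                       # (9)
        dynamic_state: str = "surfaced"                 # (10) e.g. "plan postponed"
        rationale: Optional[str] = None                 # (11) if derived from explicit reasoning
        intensity: float = 0.0                          # (12) preference once adopted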

The strength of a motivator is determined by four variables:

1. The insistence determines the probability with which the motivator can pass the filter (see below);
2. the importance determines the probability with which the motivator is accepted and pursued;
3. the intensity determines how actively or inactively a motivator is pursued once it is accepted;
4. the urgency determines by which time the motivator must have been pursued.

Motivators compete with one another for attention, the limited resource of the deliberative subsystem. To make this subsystem work, there must be a mechanism which prevents new motivators from claiming attention at any time. For this purpose the system possesses a so-called variable threshold attention filter.

The filter specifies a threshold value a motivator must exceed in order to receive attentional resources. This filter is, as its name already implies, variable and can be changed, for example by learning. Sloman illustrates this with the example of a novice driver who cannot converse with someone else while driving because he has to concentrate too much on the road. After a certain amount of practice, however, this becomes possible.

The insistence of a motivator, and thus the crucial variable for passing the filter, is a quickly computed heuristic estimate of the importance and urgency of the motivator.
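A filter check of this kind reduces to a threshold comparison on that heuristic value. The sketch below is only an illustration; the equal weighting of importance and urgency and the example numbers are my assumptions:

    # Sketch of the variable threshold attention filter.
    def insistence(importance, urgency):
        # Quickly computed heuristic estimate; equal weighting is assumed here.
        return 0.5 * importance + 0.5 * urgency

    def surfaces(importance, urgency, filter_threshold):
        # A motivator reaches the deliberative layer only if its insistence exceeds the threshold.
        return insistence(importance, urgency) > filter_threshold

    # A practised driver can afford a lower threshold than a novice who needs full attention.
    print(surfaces(importance=0.6, urgency=0.8, filter_threshold=0.4))   # True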

If a motivator has surfaced and thus passed the filter, several management processes are activated. Such management is necessary because several motivators always pass the filter simultaneously. These processes are adoption-assessment (the decision whether a motivator is accepted or rejected); scheduling (the decision when a plan for this motivator is to be executed); expansion (developing plans for the motivator); as well as meta-management (the decision whether and when a motivator is to be considered at all by the management).

Sloman's attention filter penetration theory assumes a higher degree of complexity than the theory of Oatley and Johnson-Laird. He postulates that not every motivator interrupts the current activity, but only those which either exhibit a high degree of insistence or for which the relevant attention filters are not set particularly high.


"High insistence of a new motive can cause attention to be divertedwithout actually causing any current action to be interrupted or disturbed.For example feeling very hungry can make a driver consider whether tostop for a meal, without interfering with the driving. Interruption mightoccur if the new goal is judged more important than, and inconsistent with,the purpose of the current activity, or if the new one is judged to be veryurgent (although not necessarily very important) whereas the (moreimportant) current activity is not time-critical and can be temporarilysuspended: for instance stopping for a meal because one has plenty of timebefore the important meeting. Alternatively a highly insistent motive thatgets through the filters can be considered and then rejected as relativelyunimportant, without interrupting any important current action. Soinsistence, the propensity to divert attention, is not the same as a propensity to interrupt current actions, except those that require fullattention."(Sloman, 1992c, p. 15)

9.2.4. The global alarm system

A system that has to survive in an environment which changes continually needs a mechanism with whose assistance it can react without delay to such changes. Such a mechanism is an alarm system.

An alarm system is important not only for a reactive but likewise for a deliberative architecture. For example, planning ahead can reveal a threat or an opportunity which can be answered immediately with a change of strategy.

Sloman draws a parallel between his alarm system and neurophysiological findings:

"Our global alarm mechanism corresponds closely to the assumed role ofthe limbic system including the amygdala which is thought to learnassociations of the type involved in emotions."(Sloman, 1998e, p.4)

The different layers of the system are influenced by the alarm system, but in different ways. At the same time they can also pass information to the alarm system and thus trigger a global alarm.

9.3. Emotions

For Sloman, emotions are not independent processes, but develop as an emergent phenomenon from the interaction of the different subsystems of an intelligent system.

Therefore there is no need for a separate "emotion module". A look at psychological emotion theories leads Sloman to the conclusion:

"Disagreements about the nature of emotions can arise from failure to see

86

Page 87: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

how different concepts of emotionality depend on different architecturalfeatures, not all shared by all the animals studied."(Sloman and Logan, 1998, p. 6)

If one, however, views emotions as the result of an accordingly constructed architecture, then, according to Sloman, many misunderstandings can be cleared up. A theory which analyzes emotions in connection with architectural concepts is for him therefore more effective than other approaches:

"This, admittedly still sketchy, architecture explains how much argumentation about emotions is at cross-purposes, because people unwittingly refer to different sorts of mechanisms which are not mutually exclusive. An architecture-based set of concepts can be made far less ambiguous."(Sloman and Logan, 1998, p. 7)

The different layers of the outlined architecture also support different emotions. The reactive layer is responsible for disgust, sexual arousal, startle, and fear of large, fast approaching objects. The deliberative layer is responsible for frustration through failure, relief through danger avoidance, fear of failure, or pleasant surprise at a success. The meta-management layer supports shame, degradation, aspects of mourning, pride, and annoyance.

Sloman's approach intentionally disregards the physiological accompaniments of emotions. For him these are only peripheral phenomena:

"They are peripheral because essentially similar emotional states, with similar social implications, could occur in alien organisms or machines lacking anything like our expression mechanisms."(Sloman, 1992c, p. 20)

Sloman also does not accept the objection that emotions are inseparably connected with bodily expressions. He counters with the argument that these are only "relics of our evolutionary history" which are not essential for emotions. An emotion derives its meaning not from the bodily feelings which accompany it, but from its cognitive content:

"Fury matters because it can produce actions causing harm to the hater and hated, not because there is physical tension and sweating. Grief matters because the beloved child is lost, not because there's a new feeling in the belly."(Sloman, 1987, p. 9)

He argues in a similar way regarding a number of non-cognitive factors which could play a role in human emotions, for example chemical or hormonal processes. He asks whether the affective states elicited by such non-cognitive mechanisms are really so different from those which are produced by cognitive processes:

"Is a mood of depression or euphoria that is produced by chemicalprocesses a totally different state from the depression produced byrepeatedly failing to pass your examinations or the euphoria produced by

87

Page 88: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

passing with distinction?"(Sloman, 1992c, p. 21)

How, then, do emotions develop in Sloman's intelligent system? Basically, he differentiates between three classes of emotions which correspond to the three layers of his system. On the one hand, emotions can develop through internal processes within each of these layers; on the other hand, through interactions between the layers.

Emotions are frequently accompanied by a state which Sloman calls perturbance. A perturbance occurs when the overall system is partially out of control. It arises whenever a rejected, postponed, or simply undesirable motivator emerges repeatedly and thus prevents or impedes the management of more important goals.

Of crucial importance here is the insistence value of a motivator, which for Sloman represents a dispositional state. As such, a highly insistent motivator can elicit perturbances even if it has not yet passed the filter or is not yet being actively processed.

"Insistence, on this analysis, is a dispositional state: the highly insistentmotive or thought need not actually get through the filter and interruptanything. Even if it does get through it need not actually disturb anycurrent activity. I suggest it is this strong potential for such disturbanceand diversion of attention that characterizes many of the states we describeas emotions. Such states can exist whether or not attention is actuallydiverted, and whether or not actions are thereby interrupted or disturbed.Thus, like jealousy, anger (in the form of a very insistent desire to harmsomeone because of something he is believed to have done that is stronglynegatively evaluated) can persist even though something else occupiesattention for a while. During that time there is no diversion of attention ordisturbance of any action. Dormant dispositions include such emotionalstates."(Sloman, 1992c, p. 16)

Perturbances can be occurrent (attempt to attain control over attention) or dispositional (no attempt to attain control over attention).

Perturbant states differ along several dimensions: duration, internal or external source, semantic content, kind of disruption, effect on attentional processes, frequency of disruption, positive or negative evaluation, development of the state, fading away of the state, etc.

Perturbances are, like emotions, emergent effects of mechanisms whose task it is to do something else. They result from the interaction of:

resource-limited, attentive processing;
a sub-system which produces new candidates for such processing;
a heuristic filter mechanism.

For the emergence of perturbances, one thus does not require a separate "perturbance mechanism" in the system; also, questions about the function of a perturbant state are not meaningful from this point of view. Perturbances, however, are not to be equated with emotions; they are rather typical accompaniments of states which are generally called emotional.

For Sloman, emotional states are, in principle, nothing else than motivational states caused by motivators.

"Since insistence, as I have defined it, is a matter of degree, the theory implies that there are only differences of degree between emotional and non-emotional motivational states. It also implies that there is much in common between emotional states and those cognitive states where a particular thought or something like a remembered experience or tune has high insistence, but does not involve any particular motivation or positive or negative evaluation."(Sloman, 1992c, p. 17)

A further characteristic of emotional states consists in the production of new motivators. If, for example, a first emotional state resulted from a conflict between a belief and a motivator, new motivators can develop which lead to new conflicts within the system.

9.4. The implementation of the theory in MINDER1

Sloman and his working group have developed a working computer model named MINDER1 in which his architecture is partly implemented. MINDER1 is a pure software implementation; there is thus no crawling room with real robots. The model is described here only briefly; a detailed description can be found in [Wright and Sloman, 1996].

MINDER1 consists of a kind of virtual crawling room in which a virtual nanny (the minder) has to watch out for a number of virtual babies. These babies are "reactive minibots" which constantly move around in the crawling room and are threatened by different dangers: they can fall into ditches and be damaged or die; their batteries can run dry, so they have to get to a recharging station; if the batteries are drained too far, they die; overpopulation of the crawling room turns some babies into rowdies which damage other babies; damaged babies must be brought to the hospital ward to be repaired; if the damage is too great, the baby dies.

The minder now has different tasks: it must ensure that the babies do not run out of energy, do not fall into a ditch, and are not threatened by other dangers. For this purpose it can, for example, build fences to enclose the babies. It must lead minibots whose energy level is dangerously low to a recharging station, or lead others as far away from a ditch as possible.

This variety of tasks ensures that the minder must constantly produce new motives, evaluate them, and act accordingly. The more minibots enter the crawling room, the lower the efficiency of the minder.

The architecture of MINDER1 corresponds to the basic principles described above. It consists of three subsystems which themselves contain a number of further subsystems.


9.4.1. The reactive sub-system

The reactive sub-system contains four modules: perception, belief maintenance, reactive plan execution, and preattentive motive generation.

The perception subsystem consists of a database which contains only partial information about the environment of the minder. The system functions within a certain radius around the minder, but cannot detect, for example, hidden objects. An update of the database looks as follows:

[new_sense_datum time 64 name minibot4 type minibot status alive distance 5.2 x 7.43782 y 12.4632 id 4 charge 73 held false]

This means: information at time 64 about the minibot named minibot4: it is alive, is located at a distance of 5.2 units from the minder at position (7.43782, 12.4632), has the ID 4 and a charge of 73, and is not held by another agent.

The belief maintenance subsystem receives its information on the one hand from the perception subsystem, on the other hand from a belief database in which it is stored, for example, that fences are things with which a ditch can be secured. In order to delete wrong beliefs from the system, every belief is assigned a defeater. If the defeater is evaluated as true, then the respective belief is deleted from the respective database. An example:

[belief time 20 name minibot8 type minibot status alive distance 17.2196 x 82.2426 y 61.2426 id 8 charge 88 held false
 [defeater
  [[belief == name minibot8 == x ?Xb y ?Yb ==]
   [WHERE distance(myself.location, Xb, Yb) < sensor_range]
   [NOT new_sense_datum == name minibot8 ==]]]]

The defeater in this case means: "IF I possess a belief regarding minibot8 AND I have no new perception data of minibot8 AND I am at a position at which, according to my belief, I should have new perception data of minibot8, THEN my belief is wrong."
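To make the logic of such a defeater more concrete, the following minimal Python sketch shows how the check might be evaluated; the data layout (dictionaries with name, x and y fields) and all helper names are illustrative assumptions, not part of the actual MINDER1 code.

import math

def within_sensor_range(minder_pos, believed_pos, sensor_range=10.0):
    # True if the believed position lies inside the minder's sensor radius.
    dx = minder_pos[0] - believed_pos[0]
    dy = minder_pos[1] - believed_pos[1]
    return math.hypot(dx, dy) < sensor_range

def defeater_holds(belief, sense_data, minder_pos, sensor_range=10.0):
    # The belief is defeated if the minder is close enough that it should
    # perceive the object, yet there is no fresh sense datum about it.
    should_perceive = within_sensor_range(minder_pos, (belief["x"], belief["y"]), sensor_range)
    fresh_datum = any(d["name"] == belief["name"] for d in sense_data)
    return should_perceive and not fresh_datum

def prune_beliefs(beliefs, sense_data, minder_pos):
    # Remove every belief whose defeater evaluates to true.
    return [b for b in beliefs if not defeater_holds(b, sense_data, minder_pos)]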

The subsystem of reactive plan execution is necessary so that the minder can react quickly to changing external conditions. If it has the plan, for example, to move from one position in the crawling room to another, then this plan should be executed without using too many resources.

To achieve this, MINDER1 uses a method which was developed by Nilsson (1994) and is called the teleo-reactive (TR) program formalism. MINDER1 has thirteen such TR programs which enable it, for example, to look for objects or to manoeuvre in the room.

In order to use TR programs, the minder first needs goals. These are produced by the subsystem for pre-attentive motive generation, which consists of a set of generactivators. An example is the generactivator G_low_charge, which searches the belief database for information about babies with low charge. If it finds such information, it forms a motive from it and deposits it in the motive database. An example:

[MOTIVE motive [recharge minibot4] insistence 0.322 status sub]

The status sub denotes that the motive has not yet passed the filter. MINDER1 contains eight generactivators which express its different concerns.

9.4.2. The deliberative sub-system

The deliberative sub-system of MINDER1 consists of the modules filter, motive management, and plan execution. All these modules are shallow, i.e. they possess little depth of detail.

The filter threshold in MINDER1 is a real number between 0 and 1. A motivator with the status sub can pass it if its insistence value is higher than the value of the filter threshold. The status of the motivator then changes from sub to surfacing. A motivator which does not succeed in passing the filter during a time cycle can be sent back by the generactivator with a newly computed insistence value.
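A minimal sketch of this filter step, assuming a motivator is simply a dictionary with insistence and status fields (the names and values are illustrative, not taken from the MINDER1 code):

def apply_attention_filter(motives, threshold):
    # Motives with status 'sub' whose insistence exceeds the threshold surface;
    # the rest stay 'sub' and would later get a newly computed insistence value.
    for motive in motives:
        if motive["status"] == "sub" and motive["insistence"] > threshold:
            motive["status"] = "surfacing"
    return motives

# Example, modelled on the MOTIVE record shown above:
motives = [
    {"motive": ["recharge", "minibot4"], "insistence": 0.322, "status": "sub"},
    {"motive": ["save", "ditch1", "minibot5"], "insistence": 0.646, "status": "sub"},
]
apply_attention_filter(motives, threshold=0.5)
# Only the second motive now has the status 'surfacing'.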

All motivators which have passed the filter are processed by the motive management and receive the status surfaced. The motive management works with the three modules deciding, scheduling, and expanding.

The module "deciding" specifies whether the motivator is to be worked on immediately or later. If it is processed directly, it receives the status active; if it is processed later, it receives the status suspended.

Such a decision is usually possible only after the motivator has been submitted to closer inspection. This is done by the module "expanding". The motivator is expanded so that it contains a meta plan. An example:

[MOTIVE motive [save ditch1 minibot5] insistence 0.646361 status active]

is partially expanded to:

[MOTIVE motive [save ditch1 minibot5] insistence 0.646361 status active
 plan [[decide] [get_plan]]
 trp [stop]
 importance undef]

91

Page 92: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

The steps decide and get_plan of the meta plan are calls for management processes.

From the module "expanding" the motivator can be executed directly(status: executed), a stored plan can be called up (status: active, meta), or the motivator can be passed on to the module "deciding" (status: active, meta).

In the module "deciding", the active motivator is submitted to a routinewhich specifies its importance. This importance is determined heuristically from the insistence value; the determination of the urgency of a motivatoris not yet implemented in MINDER1. As a result, it can either be rejected (status: suspended, meta) or processed further (status: active, meta).

Finally, the subsystem of plan execution is implemented in a set of TRprograms which can call up other TR programs. MINDER1 contains nineaction primitives.

9.4.3. The meta-management sub-system

The meta-management sub-system of MINDER1 contains two modules: the adjustment of the filter threshold value and the detection of perturbant states.

The filter threshold value module observes the number of motivators which succeed in passing the filter. If there are more than three, the threshold value is raised until only three or fewer motivators are being processed. Motivators with the status suspended which have already been processed are set back to the status sub if their insistence value is lower than the filter threshold. Only when fewer than three motivators are being worked on is the threshold value lowered again.
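Read this way, the regulation can be sketched as follows; the limit of three motivators comes from the description above, while the step size and data layout are illustrative assumptions:

def adjust_filter_threshold(motives, threshold, step=0.05, limit=3):
    # Count motivators that have passed the filter and are being managed.
    managed = [m for m in motives if m["status"] in ("surfaced", "active", "suspended")]
    if len(managed) > limit:
        threshold = min(1.0, threshold + step)   # raise until only 'limit' remain
    elif len(managed) < limit:
        threshold = max(0.0, threshold - step)   # lower again when the load is low
    # Suspended motivators falling below the threshold are sent back to 'sub'.
    for m in motives:
        if m["status"] == "suspended" and m["insistence"] < threshold:
            m["status"] = "sub"
    return threshold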

The interaction of this module with the generactivators, which recompute the insistence values of the motivators, results in a continuous movement of the motivators from the pre-attentive into the attentive state and back.

The system is, in this form, not yet able to produce perturbances. In connection with MINDER1, Sloman therefore speaks of "proto-perturbances". In order to produce proto-perturbances, Sloman uses a trick made necessary by the limited complexity of the system. Motivators which refer to damaged minibots (babies) receive, by definition, a high insistence value; the management processes, however, assign a low degree of importance to these motivators - a deviation from the normal behaviour of the management processes, in which importance is evaluated according to insistence.

The respective module in the meta-management sub-system was designed in such a way that it calculates the rate of rejection of motivators. If this value exceeds a certain threshold, then a proto-perturbant state has occurred.
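The detection described here amounts to monitoring the rejection rate of motivators; a minimal sketch under that reading (the window size and the threshold of 0.6 are illustrative assumptions):

def proto_perturbant(decisions, window=20, rejection_threshold=0.6):
    # decisions: a list of management outcomes such as 'rejected' or 'adopted'.
    recent = decisions[-window:]
    if not recent:
        return False
    rejection_rate = sum(1 for d in recent if d == "rejected") / len(recent)
    # A proto-perturbant state has occurred once the rate exceeds the threshold.
    return rejection_rate > rejection_threshold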


MINDER1 does indeed show such proto-perturbances. However, the sub-system cannot deal further with this information; for this, the entire system is not yet developed enough.

9.5. Summary and evaluation

Sloman's theoretical approach is certainly one of the most interesting regarding the development of emotional computers. It is less his specific interest in emotions than his emphasis on the architecture which opens up new perspectives.

Sloman follows through theoretically most consistently what others had speculated about as well: that there is no fundamental difference between emotion and cognition. Both are aspects of the control structures of an autonomous system.

A detailed look at Sloman's work from 1981 to 1998, however, shows a number of ambiguities. For example, the differentiation between the terms goal, motive, and motivator is not clear, because they are used by him quite interchangeably.

It also does not become clear exactly what function perturbances have in the emergence of emotions and how they are connected with the global alarm system postulated by him. It is interesting that his earlier work scarcely mentions this alarm system and deals mainly with perturbances; in his later work one finds nearly the opposite.

The proof which Sloman wanted to deliver with MINDER1 is, in its present form, not convincing. Neither do perturbances develop from the interaction of the elements of the system (the programmers had to help a lot to produce even proto-perturbances), nor can one draw from it far-reaching conclusions about human emotions.

It is nevertheless the theoretical depth and breadth of Sloman's work which can lend new impetus to the study of emotions. His combination of a design-oriented approach, evolutionary theory, and the discussion of virtual and physical machines goes deeper than all other approaches to the construction of autonomous agents.


10. The libidinal economy of the computer

Ian Wright, a member of Sloman's working group in Birmingham, has developed this theory further into the computational libidinal economy (Wright, 1997).

Wright categorizes the theories of Simon, Sloman, Frijda, as well as Oatley and Johnson-Laird under the term "design-based interrupt theories" and formulates three points of criticism which apply to all of the approaches mentioned.

10.1. Criticism of interrupt theories of emotion

10.1.1. The control precedence problem

In his approach, Simon differentiates between emotions with an interrupt function, which possess a high adaptive value, and emotions with a disruptive effect, which run counter to adaptive behaviour. According to Wright, the criticized theories have so far not solved the problem of why a disruptive, and thus adaptively not meaningful, emotion can take over control of an intelligent system and maintain it for a longer time. Obviously, the meta-management system is not in a position to terminate the disturbance quickly in such cases. In order to explain such phenomena, the theories would have to be extended by phylogenetic, ontogenetic, and social aspects.

10.1.2. The emotional learning problem

Wright criticizes the existing theories because they do not suggest mechanisms which explain the connection between emotional states and learning processes. For him, emotional states not only possess a motivational component, but are also important impulses for learning processes. This is also pointed out expressly by Frijda (1986). Connected with this is the correlation between the intensity of an emotion and the learning process, which is not explained by the interrupt theories.

10.1.3. The hedonic tone problem

According to Wright, the available theories do not explain on which mechanisms hedonic tone signals are based, why such signals are "simple", why they differ from semantic signals, and why they are, in the cases of joy and pain, either positive or negative.

Simon, according to Wright, simply sweeps feelings under the physiological carpet by postulating that all hedonic states are consequences of the perception of physical conditions. Therefore it is not possible to explain with his theory, for example, a state like "mourning" and the associated psychological pain, which need not necessarily be connected with states of physical excitation.

For Frijda, Oatley & Johnson-Laird, as well as Sloman, hedonic components are simple, phylogenetically older control signals. Thus they have at least a function on the level of information processing.

Frijda underlines the importance of the hedonic colouring of emotional states. His theory postulates relevance signals for joy, pain, astonishment, or desire, which arise when an event is compared with the satisfaction conditions of different concerns.

Oatley and Johnson-Laird explain the hedonic components of fundamental emotional states with their concept of control signals. Their theory assumes, for example, that the hedonic colouring of joy or sadness is caused by fundamental, not further reducible control signals. Because of their functional role, control signals have different hedonic values. The control signal for sadness, for example, has the function of breaking off or changing plans, while the function of happiness consists of maintaining or further pursuing plans.

In Sloman's theory, insistence is not connected with hedonic components. Sloman does, however, acknowledge the importance of hedonic components, which play a motivational role as negative or positive evaluations by breaking off or maintaining actions. He grants that his model would have to be extended by a pleasure and pain mechanism.

10.2. The concept of "valency"

Wright tries to find a solution for the latter problem by starting, first of all, with definitions. For him, hedonic tone is too general a term; therefore he uses the term valency. Firstly, Wright differentiates between physiological and cognitive forms of joy and pain. Then he states that hedonic colouring is always connected with a quantitative dimension, intensity. He quotes Sonnemans & Frijda (1994), who differentiate between six aspects of emotional intensity: the duration of an emotion; perceived bodily changes and the strength of the felt passivity (loss of control of attention); memory and re-experience of the emotion; strength and drasticness of the action tendency as well as drasticness of the actual behaviour; changes of beliefs and their influence on long-term behaviour; and an overall felt intensity. Wright points out that none of these categories describes the intensity of the hedonic colouring, but that the category of the "strength of the felt passivity" is connected with it, because both intense joy and intense pain can be controlled voluntarily only with difficulty.

Then Wright defines valency as follows:

"Valency is a form of cognitive pleasure or unpleasure not linked toinformation concerning bodily locations, and is a quantitatively varying,

95

Page 96: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

non-intentional component of occurrent convergent or divergent emotions.Valenced states are contingent on the success or failure of subjectivelyimportant goals."

(Wright, 1997, S. 115)

Wright points out expressly that valency, according to his definition, should not be confused with short-term control states of pleasure and unpleasure by which current activities are protected or terminated; also, valency is not identical to values, i.e. qualitative affective dispositions toward certain states. Valency is achievement pleasure or failure unpleasure, which arises when concerns that are very important for a system are fulfilled or threatened.

10.3. Learning in adaptive agent systems

Wright takes Sloman's system as a basis and extends it by the component of reinforcement learning (RL). In order to be able to implement this mechanism, he first postulates: "A society of mind needs an economy of mind."

For Wright it is important that RL always contains a selection component: reinforced actions have a stronger tendency to be repeated than non-reinforced actions.

In order to employ RL on all levels of a multi-agent system, an appropriate reward mechanism is required. For this, Wright draws predominantly on four corresponding algorithms: Q-learning, classifier systems, XCS, and Dyna.

10.3.1. Q-Learning

With Q-learning (Watkins & Dayan, 1992), an agent tries to learn, for each possible situation-action combination, what the value of this action is if it executes it in the given situation. At the beginning, the values for all possible situation-action combinations are set to a default value. The goal of the system now consists of updating the values in such a way that they lead to the maximum cumulative discounted reward.

The maximum cumulative reward at a given time consists of the reward for the directly following action as well as of the rewards which can be expected for the actions following it. These rewards are discounted in such a way that rewards which can be expected immediately are valued more highly than rewards which can be expected in the future.

The reward forecasts P for each possible situation-action combination are stored in a two-dimensional matrix. The algorithm selects from this matrix the action which possesses the highest forecast value for the present situation. With the help of an update rule, the values are afterwards recomputed.
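The update rule referred to here is the standard Q-learning rule; a minimal tabular sketch follows (the learning rate, discount factor and the epsilon-greedy action choice are illustrative choices, not details given by Wright):

from collections import defaultdict
import random

# The "two-dimensional matrix" of reward forecasts becomes a lookup table
# indexed by (situation, action), with a default value for all combinations.
Q = defaultdict(float)

def q_update(Q, state, action, reward, next_state, actions, alpha=0.1, gamma=0.9):
    # Standard update: Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

def choose_action(Q, state, actions, epsilon=0.1):
    # Pick the action with the highest forecast value, with some exploration.
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])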

96

Page 97: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

One of the greatest weaknesses of Q-learning consists of the fact that, with large state and action spaces, the corresponding tables become excessively large and make an economical trial-and-error search impossible.

10.3.2. The classifier system of Holland

Holland (1995) has developed an algorithm called the classifier system. With it he wants to guarantee that a learning success which is based on an action sequence of several modules leads to rewards for all modules involved.

In his system there are numerous classifiers, which are nothing other than IF-THEN rules (condition-action rules). Some of them observe the environment and, if their own condition is fulfilled, send appropriate messages to a kind of blackboard (message list). Other classifiers make their specific action suggestions based on the information on the blackboard. The probability that the system accepts such an action suggestion is predominantly based on the strength of the classifier, which is derived from how successful its suggestions were in the past.

If the accepted action suggestion of a classifier leads to success, then it receives a reward which increases its strength. If a failure follows its suggestion, it receives a punishment which decreases its strength. It shares the reward or punishment with all other classifiers which had a part in leading to its suggestion.

This credit assignment is achieved by a bucket brigade algorithm. The algorithm is called bucket brigade because not only the last classifier in a series of classifiers is rewarded or punished; rather, the rewards or punishments are proportionally distributed among the classifiers which participated in the end result - like firefighters in earlier times handed the water buckets along. Thus a reward can be propagated backwards through the system and cause corresponding reinforcements in certain action chains.
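A minimal sketch of this backward propagation of reward along a chain of classifiers; the fixed share passed back to each predecessor is an illustrative simplification of Holland's algorithm:

def bucket_brigade(chain_strengths, reward, share=0.5):
    # chain_strengths is ordered from the first classifier that fired to the
    # last one, whose action earned the reward. Each classifier keeps part of
    # the payment and hands the rest back to its predecessor, like firefighters
    # passing water buckets along a line.
    payment = reward
    for i in range(len(chain_strengths) - 1, -1, -1):
        chain_strengths[i] += payment
        payment *= share
    return chain_strengths

# Example: a reward of 10 earned by the last of three classifiers
print(bucket_brigade([1.0, 1.0, 1.0], 10.0))   # -> [3.5, 6.0, 11.0]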

Holland has coupled his model with a genetic algorithm. Successful classifiers are paired and can produce new classifiers which can then in turn work more effectively.

10.3.3. XCS

With XCS, Wilson (1995) presented a further development of Holland's classifier system. XCS addresses one of the weaknesses of Holland's system, in which only the strongest classifiers are rewarded. For the success of an XCS classifier, it is not its absolute strength that is decisive, but its ability to make correct predictions about the probability of success of its actions. If a classifier in the XCS system thus correctly predicts that it will receive a low reward, this qualifies it for inclusion in the genetic algorithm.

10.3.4. Dyna

Sutton's (1991) Dyna architecture goes yet another step further because it possesses the ability to plan. Before an action is initiated, Dyna can, by trial and error within a world model, play through "in its head" the consequences of possible actions and thus develop an optimized action strategy.

10.3.5. The concept of "value"

Wright points out that RL algorithms are trial-and-error learners which need appropriately graded rewards in order to be adaptive. "Unfortunately, the form or forms of value in natural reinforcement learners are unknown." (Wright, 1997, p. 139)

Wright further points out that value can have two different meanings. On the one hand, it is used when an object is evaluated: someone likes an object very much; it is dear to him. The other use is the assignment of value to an object with regard to a certain goal: for a lumberjack, a power saw usually possesses a higher value than an axe.

Wright differentiates between the value an external object can have and the value which an internal state of a system can possess. Value for Wright is a relationship between a goal-oriented system and its own internal components. Value "refers...to the utility of internal substates" (Wright, 1997, p. 138).

Value is both a scalar quantity and a control signal. The form value takes in RL algorithms is that of a scalar quantity. Such a scalar quantity is, in contrast to a vector, not divisible into components with different semantics. Values specify a better-than relation between substates and have no further meaning.

In an RL system, the values of the different substates change over time; value thereby controls which action alternative is executed. The value of a substate lies in its ability to buy processing power.

10.4. Wright's currency flow hypothesis

Wright points out the coordination problem in multi-agent systems (MAS), which had also been mentioned by Oatley (1992). This applies especially to adaptive multi-agent systems (AMAS). The solution, according to Wright, is an internal economy with a currency flow.

Wright compares an AMAS to an economically operating society:

"In the abstract, economic systems are selective systems: the trials are thevarious concrete labours that produce commodities, the evaluatorymechanisms are the various needs and demands of individual consumers,

98

Page 99: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

and selection occurs through the buying and selling of commodities. Overtime what is produced matches what is required given available resources."

(Wright, 1997, S. 154)

Based on this, Wright develops his currency flow hypothesis (CFH):

"The currency flow hypothesis (CFH) for adaptive multi-agent systems: Currency flow, or circulation of value, is a common feature ofadaptive multi-agent systems. Value serves as a basis for coordination; itintegrates computational resources and processing by constraining theformation of local commitments. Circulation of value involves (i) alteringthe dispositional ability of agents to gain access to limited processingresources, via (ii) exchanges of an explicitly represented,domain-independent, scalar quantity form of value that mirrors the flow ofagent products. The possession of value by an agent is an ability to buyprocessing power."

(Wright, 1997, S. 160)

10.5. The details of the CLE system

Wright's computational libidinal economy unites the model of an intelligent system sketched by Sloman with a learning mechanism and a motivational subsystem which maintains emotional relations with other agents. With this model, Wright hopes to be able to solve a problem of Sloman's model which he calls the valenced perturbant states problem: Sloman's model cannot explain how perturbances with a valenced component are produced.

Wright begins the description of his model by restating the CFH for natural reinforcement learners:

"The currency flow hypothesis for natural reinforcement learners(CFHN): The currency flow hypothesis holds for the reinforcementlearning mechanisms of individual, natural agents that meet a requirementfor trial and error learning."

(Wright, 1997, S.163)

The description of the CLE covers several aspects: a libidinal selective system, a scalar quantity of value, credit assignment, as well as a value circulation theory of achievement pleasure and failure unpleasure.

10.5.1. The libidinal selective system


Wright's libidinal selective system is a cognitive sub-system whose main task is the development of social relations. It contains the following components:

1. Untaught conditions of satisfaction: These are inborn satisfaction mechanisms which have been selected by evolution and which specify fundamental attachment goals, i.e. orgasm, positive emotional signals from the other gender, etc. According to Wright, this means that evolution is the cause of attachment motivation.

2. Means of satisfaction: These are motivational substates or agents which constitute the means to satisfy the different attachment goals. They in turn can produce motivators for higher levels.

3. Learnt conditions of satisfaction: These are learned satisfaction mechanisms which inherited their reinforcement mechanisms from inborn satisfaction mechanisms and which can sometimes dominate the latter.

4. A selective cycle: As a selective system, the libidinal system fulfills three functions: it generates substates which represent possible satisfaction mechanisms; it evaluates those substates; it selects and deselects substates. This is done through the mentioned reinforcement mechanisms.

5. Substate discovery: With its genetic algorithm, the libidinal system produces new substates which consist of new agents, new rules, etc. and evaluates and selects them accordingly.

6. Varieties of control substates: The control structure within the libidinal system is not static but dynamic. Through the continuous selective processes, certain substates can move upward through the hierarchy, others downward. The effect is one of diffusion, by which a strong control state expands through the whole system into numerous substates and can sometimes turn into an automatic reaction. Among these substates Wright counts the libidinal generactivators, which produce motivators for attentive processing and which correspond, according to Wright, to Frijda's concerns.

10.5.2. The conative universal equivalent (CUE)

In Wright's model, CUE represents the scalar quantity form of value. The term "conative" is used by him here in the sense of "motivational". CUE is the universal means of exchange between the substates of the libidinal system. The possession of CUE means the ability to buy processing power. This can take different forms:

1. the dispositional ability to demand pre-attentive processing resources;

2. the dispositional ability to produce motivators for management processing;

3. the dispositional ability to let motivators become conscious and to let them command management resources.

Thus CUE stands in a causal relationship with the interruption abilities of motivators and their ability to demand attention resources.


10.5.3. Credit assignment

The exchange of CUE reflects the flow of semantic products within the system: in order to get into the circulation, a substate must pay the substate which supplied the semantic product to which the first substate reacts. This distribution of CUE to preceding substates takes place according to Holland's bucket brigade algorithm.

1. Derivation of CUE from reinforcers: CUE is assigned only if a substate fulfills the satisfaction conditions of the innate reinforcers or of the learned reinforcers derived from them.

2. Gain of CUE: Substates can increase their CUE value (positive reinforcement).

3. Loss of CUE: Substates can lose CUE (negative reinforcement).

4. Accumulation as reinforcement: The accumulation of CUE by a substate represents reinforcement learning.

5. Loss as deselection: The loss of CUE by a substate represents its partial deselection.

6. CUE as internal economy with control semantics: CUE is a domain-independent control signal which refers neither to other things within nor to things outside of the system.

10.5.4. Circulation of value

The CLE has two distinguishable internal states: intentional and non-intentional. The intentional component of the CLE is the set of substate products, in particular the motivators produced by the libidinal generactivators. These have a representational content; they are about something. The non-intentional component of the CLE is the circulation of value. This circulation of value is a flow of control signals, not of semantic signals.

For this, the circulation of value needs a module of the overall system which observes and registers the internal flow of CUE: the meta-management layer mentioned by Sloman. This mechanism determines at any time the movement of CUE within the system. For each substate the values change according to whether it is rewarded (positive) or punished (negative).

Wright demonstrates with a thought experiment where this can lead. A virtual frog (simfrog) learns to catch flies in a virtual environment. If the substates necessary for this are successful, the meta-management layer registers an increase of CUE compared with the time before. Now let us assume the observations of the meta-management layer were coupled with the skin colour of the frog: positive values lead to the skin turning yellow, negative to it turning green, and no changes to no change of skin colour. After a successful fly catch, the frog would notice a change of its skin colour which it cannot explain to itself. At the same time, it has either positive or negative feelings of different intensity (depending upon the change of the CUE state). A non-intentional control state develops which was released by the circulation of value in a system with a meta-management layer.

Wright therefore adds a further element to his libidinal economy: valency as the monitoring of a process of credit assignment. The registration of the circulation of value produces valenced states which represent a kind of cognitive achievement pleasure or failure unpleasure.


1. Negative valence means a loss of CUE: A registered circulation process which contains a loss of value corresponds to negative valence.

2. Positive valence means an increase of CUE: A registered circulation process which contains an increase of value corresponds to positive valence.

3. Intensity is the measure of the exchange of CUE: The exchange rate of CUE between substates corresponds to the quantitative intensity of the valenced state.

4. Increase of CUE is connected with the reaching of goals: If reaching a goal agrees with the satisfaction conditions of a reinforcer, it can lead to an increase of CUE.

5. Loss of CUE is connected with the non-reaching of goals: If not reaching a goal agrees with the satisfaction conditions of a negative reinforcer, it can lead to a loss of CUE.

"In other words, certain types of `feelings' are the self-monitoring ofadaptations; that is, the pleasure and unpleasure component of goalachievement and goal failure states is the monitoring of a movement ofinternal value that functions to alter the dispositional ability of substates tobuy processing power and determine behaviour."

(Wright, 1997, p. 176)
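On this reading, the meta-management layer only has to watch the net movement of CUE between substates; a minimal sketch of such a monitor follows (the data layout and the example values are illustrative assumptions, not Wright's implementation):

def monitor_valence(cue_before, cue_after):
    # cue_before / cue_after map substate names to their CUE holdings.
    # The sign of the net change gives positive or negative valence,
    # its magnitude the intensity of the valenced state.
    delta = sum(cue_after.values()) - sum(cue_before.values())
    valence = "positive" if delta > 0 else "negative" if delta < 0 else "neutral"
    return valence, abs(delta)

# Example: the simfrog's fly-catching substates are rewarded with CUE
before = {"approach_fly": 2.0, "snap_tongue": 1.0}
after = {"approach_fly": 2.5, "snap_tongue": 2.5}
print(monitor_valence(before, after))   # -> ('positive', 2.0)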

10.6. A practical example of CLE

Wright demonstrates the working of his model with the example of mourning. Based on an analysis of comments from people who mourned, he picks out a number of phenomena and tries to explain the underlying processes with his theory.

1) The repeated and continuous interruption of attention by thoughts about and memories of the deceased.

If an attachment structure to X exists, then motives and thoughts which refer to him will emerge and compete successfully for attentional processing resources. Among these cyclic processes can be the wish that the deceased might still be alive, or the wish that one could have done something to prevent his death. Because of the news of the death of X, these and other substates, in which the intensive connection to the deceased is deeply rooted, will therefore very probably circulate through the system. The thought processes of the agent are shaken by perturbances and will be partly outside his conscious control.

"The structure of attachment explains why motives relating to X are likelyto disrupt attention. (a) X-related motives will be given high insistencevalues because the relationship with X is strongly positively rewarded, andtherefore important, and X has suffered great harm. (b) Meta-managementcontrol processes ensure that motives and thoughts pertaining to X arealways decided as soon as possible, so that such motives tend to grabattentive resources immediately. (c) Dedicated evaluation procedures rateX-related motives preferentially, assigning skewed importance, urgencyand cost-benefit measures. (d) Predictive models, triggered by X-relatedmotives, will consume computational resources by attempting to reasonabout X's needs and possible reactions to things. (e) In a resource-limitedsystem, the proliferation of motives pertaining to X may `crowd out' othermotive generators."

(Wright, 1997, p. 201)

2) The difficulty of accepting the death of the deceased.

The updating of a large database and the propagation of the information through the system take some time. Also, the agent has affective reasons not to accept the information, because this would mean that a sometimes years-long process of building up a relationship was for nothing. Finally, there is the agent's knowledge of a long and painful mourning process ahead, which he would like to postpone.

3) The disruptive effect on everyday functioning.

Daily goal processing is made more difficult by management overload, which is due to the disturbance of the motive management processes.

4) Periods of relative normality, in which the mourning is pushed into the background.

With important new tasks, the filter threshold is set so high that thoughts of the deceased cannot come through. After the task is accomplished, the filter threshold sinks again, and the mourning returns.

5) Attempts to fight the mourning.

The activity of a meta process which notices the disturbance and tries to fight it. This succeeds, however, only rarely; frequently the result is only that the motivators are pushed below the filter threshold, where they increase in urgency and wait for the lowering of the filter threshold. Then they come through in larger numbers and lead to a loss of control of the agent over the system. The perturbances will only decrease once the CLE has nearly finished the process of detachment.


6) Motivators of second order, i.e. evaluation of mourning.

Meta-management processes which are culturally influenced.

7) Subjectively experienced pain.

Loss of CUE leads to negative states which are experienced as pain. Negative valency is dominant because generactivators produce motivators which cannot be satisfied anymore. This leads to an overproduction which results in a gradual deselection and possesses high negative valency.

8) Crying.

If motives which disturb the normal management process penetrate the filter again and again and one does not succeed in pushing them away for a longer time, an agent often cannot think of another strategy to change this state. "Crying is the plan of last resort, and can be triggered by negatively valenced perturbant states." (Wright, 1997, p. 207)

10.7. CLE and problems of interrupt theories

Wright postulates that his model contains the solutions for the problems of interrupt theories outlined by him. He explains this for the four problem areas addressed:

10.7.1. CLE and the hedonic tone problem

Oatley and Johnson-Laird postulate in their theory fundamental and not further reducible control signals for emotions like happiness and sadness. With CLE, one element is sufficient: the circulation of value consists of simple control signals which are observed and registered by another instance. Depending on the result of this process, the emotions develop for which Oatley and Johnson-Laird assume two separate signals.

The circulation of value has the added advantage that it coordinates a variety of relatively autonomous substates. In the end, the task of the circulation of value consists only of assigning positive or negative value. All other effects are of second order and result from this original, simple function.

The CLE theory also explains, according to Wright, why control signals differ from semantic signals. Value is nothing other than a means of establishing better-than relations between substates and therefore contains no semantic content whatsoever, such as beliefs or desires.


10.7.2. CLE and the emotional learning problem

Through the introduction of a fictitious currency, CUE, and its circulation through the system, learning effects become possible. Reinforcement learning can thus change the abilities of generactivators to interrupt processing and to demand resources of the system for themselves.

In addition, emotions have a strong influence on learning processes. The more the results of behaviour are accompanied by positive or negative feelings, the better the corresponding behaviour is learned or avoided. Through an increase of CUE, substates gain more power in the system; the more CUE, the stronger the registered intensity of the valenced state.

10.7.3. CLE and the valenced perturbant states problem

Wright equates a concern in Frijda's sense with a libidinal generactivator whose strength is defined by how much processing capacity it can buy. At the same time, this strength also determines its disposition to affect behaviour. It is based on the CUE the generactivator has accumulated.

Generactivators of the libidinal system which have much CUE produce motives with a high interruption potential. A high increase or a large loss of CUE leads to a valenced state which can be accompanied at the same time by a loss of control (mourning, triumph).

"...occurrent reinforcement learning together with the monitoring of credit assignment plus loss of control of attention is experienced as a valenced perturbant state."

(Wright, 1997, p. 183)

10.7.4. CLE and the control precedence problem

Why can dysfunctional and non-adaptive emotions take over control and not be pushed back by the meta-management layer? Wright offers as an explanation that the process of the accumulation of CUE cannot be controlled by the libidinal generactivators of this layer; they can only register the processes. Only the libidinal selective system itself can decrease the strength of a substate which has too much CUE and thereby disrupts the overall system. Only when this has taken place is the state of loss of control lifted.

10.8. Conclusion

Wright's model tries to solve a number of problems which have so far been avoided by other computer models. Of special interest is his suggestion for the treatment of the hedonic tone problem. While other models always define the hedonic value of an event directly, Wright tries to model it as a property of the system.

The combination of Sloman's theoretical approach with reinforcement learning and the introduction of an imaginary currency, whose circulation through the system is responsible for emotional processes, requires a model of high complexity. Within the context of the model, however, it offers a convincing explanation for the development of emotions as well as for disruptive emotional processes - and this not only on an abstract level, but very close to an operationalization.

On the other hand, one could raise, with Pfeifer, the argument of "overdesign" against Wright's model. The already very complex model of Sloman which was implemented in MINDER1 becomes several degrees more complex through Wright's additions, which means very high demands on the programming of the system and on the underlying computing capacity.

Of all the models presented, Wright's is the only one that neither uses the "emergence of emotions" as an excuse for their lack of integration into a model nor hard-wires them into the system from the start. It remains to be seen to what extent his attempt at a theoretical explanation of emotions in connection with a "partially emergent" design proves sound when implemented in an actual model.


11. A new paradigm?

The approach of regarding emotions as a characteristic of the architecture of an intelligent system has led in recent years to an increased interest in this topic. While in 1997 only two papers dealing with the topic of "emotion" were presented at the leading conferences on agents, 1998 already saw the first conference exclusively dedicated to "emotional agents".

A number of researchers have started to develop emotional autonomous agents based on the principles of Simon, Toda, and Sloman. Many of these approaches are still in the stage of theoretical exploration, but some have already been rudimentarily implemented.

A fundamental factor shared by all agent-centered approaches is the view of emotions as control signals in the kind of architecture which a system that has to move independently in an uncertain environment must possess. The function of emotions is to direct the attention of the system toward an external or internal aspect which has meaning for substantial goals or concerns of the system and to assure it of processing priority.

11.1. The models of Velásquez

11.1.1. Cathexis

Velásquez (1997) developed a model based on the "Society of Mind" theory of Minsky (1985). He calls it Cathexis, a term which he defines as "concentration of emotional energy on an object or idea" (Velásquez, 1997, p. 10).

In his model, emotions consist of a variety of subsystems:

"Emotions, moods, and temperaments are modeled in Cathexis as a network of special emotional systems comparable to Minsky's "proto-specialist" agents (...) Each of these proto-specialists represents a specific emotion family...such as Fear or Disgust."(Velásquez, 1997, p. 10)

Each of these proto-specialists has four kinds of sensors, which are responsible for the measurement of internal and external states: Neural sensors, sensorimotor sensors, motivational sensors, and cognitive sensors. In addition, each proto-specialist is characterized by two threshold values which Velásquez calls Alpha and Omega: Alpha is the threshold above which the activation of the respective proto-specialist begins; Omega is the saturation limit of a proto-specialist. Finally, each proto-specialist has a decay function which affects the duration of its activation.
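A minimal sketch of such a proto-specialist with the Alpha activation threshold, the Omega saturation limit and a decay function; the parameter values are illustrative, and Cathexis itself is of course not written in Python:

class ProtoSpecialist:
    # Sketch of a Cathexis-style proto-specialist for one emotion family.

    def __init__(self, name, alpha=0.2, omega=1.0, decay=0.05):
        self.name = name
        self.alpha = alpha      # activation threshold
        self.omega = omega      # saturation limit
        self.decay = decay      # how fast arousal fades per update cycle
        self.arousal = 0.0

    def update(self, sensor_inputs):
        # Add the contributions of neural, sensorimotor, motivational and
        # cognitive sensors, clip at the saturation limit, then apply decay.
        self.arousal = min(self.omega, self.arousal + sum(sensor_inputs))
        self.arousal = max(0.0, self.arousal - self.decay)
        return self.arousal

    def active(self):
        # The proto-specialist only becomes active above its Alpha threshold.
        return self.arousal > self.alpha

fear = ProtoSpecialist("Fear")
fear.update([0.1, 0.3, 0.0, 0.05])
print(fear.active())   # -> True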

Velásquez differentiates in his model between basic emotions and emotion blends/mixed emotions. For the definition of basic emotions he builds on Ekman and Izard and defines them as follows:

"In this model the term basic...is used to claim that there are a number of separate discrete emotions which differ from one another in important ways, and which have evolved to prepare us to deal with fundamental life tasks..."(Velásquez, 1997, p.11)

The basic emotions in Cathexis are Anger, Fear, Distress/Sadness, Enjoyment/Happiness, Disgust and Surprise.

107

Page 108: Emotional Computersgolumbic/courses/emotion/Ruebenstrunk-book.pdf · Credit assignment 10.5.4. The value circulation theory 10.6. A practical example of CLE 10.7. CLE and the problems

Emotion blends or mixed emotions, respectively, are emotional states which arise when several different proto-specialists representing the basic emotions are active without one of them dominating the others.

Finally, his model contains moods, which differ from emotions only in the value of their arousal level.

Emotions in Cathexis are caused by cognitive and non-cognitive elicitors which originate from the same categories as the sensors of the system. The cognitive elicitors for the basic emotions are based on a modified version of Roseman's emotion model.

The intensity of an emotion in Velásquez' model is affected by several factors:

"Thus, in Cathexis, the intensity of an emotion is affected by several factors, including the previous level of arousal for that emotion..., the contributions of each of the emotion elicitors for that particular emotion, and the interaction with other emotions..."(Velásquez, 1997, p. 12)

The behaviour repertoire of the system has three substantial elements: an expressive component, consisting of face, body, and voice, with whose assistance it communicates its present emotional state; an experiential component which learns from experience and affects the motivations and action tendencies of the system; and an action selection mechanism which selects, from the calculated behaviour values of the different action alternatives, the one with the highest value.

The system regularly goes through so-called update cycles in which the following steps are completed:

"1. Both the internal variables (i.e. motivations) and the environment are sensed.

2. The values for all of the agent's motivations (both drives and emotions) are updated....

3. The values of all behaviors are updated based on the current sensory stimuli (external stimuli and internal motivations).

4. The behavior with the highest value becomes the active behavior. Its expressive component is used to modify the agent's expression, and its experiential component is evaluated in order to update all appropriate motivations."(Velásquez, 1997, p. 13f.)
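The quoted cycle can be sketched as a simple loop; all helper names (sense, update, express and so on) are hypothetical and only illustrate one possible reading of the four steps, not Velásquez' actual code:

def update_cycle(agent, environment):
    # 1. Sense the internal variables (motivations) and the environment.
    stimuli = environment.sense()
    internal = agent.sense_internal()

    # 2. Update the values of all motivations (drives and emotions).
    for motivation in agent.drives + agent.emotions:
        motivation.update(stimuli, internal)

    # 3. Update the values of all behaviours from the current sensory stimuli.
    for behaviour in agent.behaviours:
        behaviour.update_value(stimuli, agent.drives + agent.emotions)

    # 4. The behaviour with the highest value becomes active; its expressive
    #    and experiential components act back on expression and motivations.
    active = max(agent.behaviours, key=lambda b: b.value)
    active.express(agent)
    active.apply_experience(agent)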

Velásquez has implemented Cathexis in a computer model which he calls "Simón the Toddler". The screen shows the face of a baby which is capable of different emotional modes of expression and rudimentary verbalizations. The user interacts with the system by, for example, changing the parameters of Simón's proto-specialists, varying the level of neurotransmitters, or interacting directly with it by feeding it, stroking it, etc.

At present, the model consists of 5 drive-proto-specialists (hunger, thirst, temperature regulation, fatigue, interest) and a repertoire of 15 behaviour alternatives, among them sleeping, eating, drinking, laughter, crying, kissing, and playing with toys. These are to be extended step by step in the course of the further development of the model.


11.1.2. Yuppy

Yuppy is a robot which represents an emotional pet. It is a further development of the model Simón the Toddler. Yuppy was first developed as a virtual simulation before it received a body. Velásquez calls it an example of a system with emotion-based control.

The model is constructed from a number of computational units which consist of three main components: an input, an assessment mechanism, and outputs. A substantial part of the assessment mechanism are the Releasers. They filter sense data and identify special conditions and, depending on these, send excitatory or inhibitory signals to the subsystems connected with them.

Velásquez follows Damasio and LeDoux and differentiates between Natural and Learned Releasers. Natural Releasers are firmly built into the system (hard-wired); Learned Releasers are acquired and represent stimuli which are associated with the occurrence of Natural Releasers or can predict their occurrence. In the language of other models, the Natural Releasers correspond to the primary emotions, while Learned Releasers are identical with secondary emotions. The latter require more processing capacity and are more complex, since they are based, among other things, on personal emotional memories which must be activated.

Drives in Yuppy are motivational systems which propel the agent into action. Drive systems are clearly distinct from emotion systems.

Yuppy's emotion systems represent six groups of affective basic reactions: Anger, Fear, Distress/Sadness, Enjoyment/Happiness, Disgust, and Surprise. Velásquez differentiates between cognitive and non-cognitive Releasers of emotions and divides them into four groups:

1. The neural group covers the effects of neurotransmitters, brain temperature and other neuroactive agents which can lead to an emotion and are affected by hormones, sleep, nutrition and environmental conditions.

2. The sensorimotor group covers sensorimotor processes such as facial expressions, body posture, and muscle potential, which not only regulate existing emotions but can also cause emotions.

3. The motivational group covers all motivations which lead to an emotion.

4. The cognitive group covers all kinds of cognitions which activate emotions, e.g. appraisal of events, comparisons, attributions, desires, beliefs, or memories.

Yuppy's perception system consists of two colour CCD cameras as eyes; a stereo audio system with two microphones as ears; infrared sensors for the detection of obstacles; an air pressure sensor in order to simulate contacts; a pyroelectric sensor which notices changes in the ambient temperature when humans enter the area; as well as a simple proprioceptive system.

Yuppy's drive system contains four drives: charge regulation, temperature regulation, fatigue, and curiosity. Each of these drives controls an internal variable assigned to it, which represents, respectively, the charge of the battery, the temperature level, the amount of energy, and the degree of interest of the agent.

Yuppy's emotion production system consists of emotional systems with Natural Releasers for the basic emotions. Velásquez divides the emotional systems into three groups:


Interactions with Drive Systems: Unsatisfied drives produce Distress and Anger; over-satisfied drives produce Distress, and drive satisfaction produces Happiness.
Interactions with the Environment: All objects with pink colour produce Happiness in different amounts; yellow objects produce Disgust. Darkness produces Fear, and loud noises produce Surprise.
Interactions with People: Humans can stroke and punish Yuppy. This produces either joy or pain. Joy leads to Happiness; pain produces Distress and Anger.
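A rough illustration of such hard-wired ("natural") Releasers, with invented numerical contributions, might look like this in Python:

```python
def natural_releasers(percept):
    """Map simple percepts to emotion activations (Yuppy-style, simplified)."""
    activations = {"happiness": 0.0, "disgust": 0.0, "fear": 0.0,
                   "surprise": 0.0, "distress": 0.0, "anger": 0.0}
    if percept.get("pink_object"):
        activations["happiness"] += percept["pink_object"]   # pink objects please in varying amounts
    if percept.get("yellow_object"):
        activations["disgust"] += 1.0
    if percept.get("dark"):
        activations["fear"] += 1.0
    if percept.get("loud_noise"):
        activations["surprise"] += 1.0
    if percept.get("unsatisfied_drive"):
        activations["distress"] += 0.5                        # unsatisfied drives -> distress and anger
        activations["anger"] += 0.5
    return activations

print(natural_releasers({"pink_object": 0.8, "dark": True}))
```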

Yuppy's behaviour system consists of a distributed net of approximately 19 different kinds of behaviour which cover predominantly the satisfaction of its needs and the interaction with humans. Examples of such behaviours are "search for bone", "approach bone", "recharge battery" or "approach human".

Like the drive systems and the emotional systems, Yuppy's behaviour systems also have their own Releasers.

The user can control Yuppy's affective style by manipulating parameters such as threshold values, inhibitory or excitatory connections, etc. In addition, he can present internal and external stimuli to the robot. Velásquez describes the result as follows:

"Using the model described before, both the simulated and physical Yuppys will exhibit emotional behaviors under different circumstances. For instance, when the robot's Curiosity drive is high, Yuppy wanders around, looking for the pink bone which people may carry. When it encounters one, the activity of the Happiness Emotional System increases and specific behaviors, such as "wag the tail" and "approach the bone" become active. On the other hand, as time passes by without finding any bone, the activity of its Distress Emotional System rises and appropriate responses, such as "droop the tail", get executed. Similarly, while wandering around, it may encounter dark places which will elicit fearful responses in which it backs up and changes direction."(Velásquez, 1998, p. 5)

Furthermore, Yuppy is able to learn secondary emotions which are stored as new or modified cognitive Releasers. If, for example, a human holds a bone in his hand and makes Yuppy come and get it, he can stroke or discipline it afterwards. Depending on this experience, Yuppy produces a positive or negative emotional memory regarding humans which then affects its subsequent behaviour.

11.2. The model of Foliot and Michel

Foliot and Michel (1998) define emotions as an "evaluation system operating automatically either at the perceptual level or at the cognition level, by measuring efficiency and significance" (Foliot and Michel, 1998, p. 5). For them, emotions are the basis of every cognition. With their model they aim to show "how emotion based structures could contribute to the emergence of cognition by creating suitable learning conditions" (Foliot and Michel, 1998, p. 1).

The model was implemented in a virtual Khepera robot. Khepera is a miniature robot which contains a number of sensors and can be extended with further components as required. The "Webots simulator" not only simulates a Khepera; programs developed with Webots can also be transferred directly to a real Khepera.


The environment of the virtual Khepera consists of a city with buildings, a river and green areas. Each of these elements possesses a specific colour. The robot must move through the city and learn to evade different kinds of obstacles.

Foliot and Michel represent emotions on two levels. The process level can evaluate stimuli and elicit different emotions; the state level can supply information about the system. The basis of their first experiment was the assumption that an emotion is characterized by a reaction to a positive or negative signal. The model consists of four components:

1. A reflex structure which leads to a motor movement in the opposite direction to an obstacle detected by an infrared sensor.

2. An association matrix between the motor behaviour and the input of the infrared sensors, whose initial values are set to zero.

3. A signal which produces an association in the matrix each time the robot meets an obstacle.

4. A behaviour system with three alternatives: (a) movement in a straight line if no obstacle stands in the way and no learned association is active; (b) triggering of a reflex behaviour upon collision with an obstacle; (c) association of a motor configuration with a known sensory pattern.
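The following sketch illustrates how these four components could interact in a single control step; the sensor layout, matrix size, and learning rate are assumptions made for the example and do not reproduce the authors' implementation.

```python
import numpy as np

N_SENSORS, N_MOTORS = 8, 2
association = np.zeros((N_SENSORS, N_MOTORS))        # 2. association matrix, initially zero

def reflex(ir_readings):
    """1. Reflex: turn away from the side with the stronger infrared reading."""
    if ir_readings[:4].max() > ir_readings[4:].max():
        return np.array([-1.0, 1.0])
    return np.array([1.0, -1.0])

def step(ir_readings, collision):
    global association
    learned = ir_readings @ association               # 4c. motor output from learned associations
    if collision:                                     # 3. collision signal creates a new association
        motors = reflex(ir_readings)                  # 4b. reflex on impact
        association += np.outer(ir_readings, motors) * 0.1
    elif np.any(learned != 0):
        motors = learned                              # 4c. use the learned sensorimotor pattern
    else:
        motors = np.array([1.0, 1.0])                 # 4a. straight-line movement
    return motors

print(step(np.ones(N_SENSORS) * 0.2, collision=True))
```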

The experiment showed that the robot gradually collided less and less with obstacles but could never move completely error-free. In order to examine whether improving the training system with affective signals would yield better results, the authors performed a second experiment, based on the emotion theory of Scherer. It consists of five components:

1. A linear evaluation system, which corresponds to Scherer's SECs and in which each evaluation stage is used for the following evaluation.

2. Two state systems, one of which represents the assessment of a situation, the other one the physical body.

3. Two cognitive processes, one of which is responsible for attention selection, the other for the decision about the next movement. The state systems can affect these processes directly.

4. A database of goals.

5. A sensorimotor, a schematic, and a conceptual level. The schematic level produces association schemata between significant patterns and actions.


Fig. 17: Controller model by Foliot and Michel (after Foliot and Michel, 1998, p. 4)

The model differentiates between cognitive and emotional processes. Each emotional process is defined by an assessment sequence which classifies stimuli according to the criteria novelty, pleasantness, goal significance, and coping. Each stage of this process uses the results of the preceding stages as input. The coping check has two possible outcomes: "reaction possible" and "no reaction possible".

The cognitive processes have one primary goal (forward movement) and four secondary goals (anti-clockwise rotation, clockwise rotation, left wall follow, right wall follow). Each goal is defined by a value in the body representation. Learning happens in this model whenever the average state of the system contains a strong displeasure value:

"This produces a new scheme containing the newer stimulus as a sensory input. The process then waits to observe which goal is associated to this stimulus and [to] check whether this goal allows to come back to a normal state. If this normal state is reached within a small amount of time, the representation is associated to the scheme, otherwise, the scheme is destroyed."(Foliot und Michel, 1998, p. 5)

The central component of the model is the mechanism which produces schemata. The experiment showed that during obstacle avoidance this took place either on the sensorimotor level, if an obstacle was detected by the infrared sensors, or on the schematic level, if an obstacle was not detected. The schematic level corresponds to a temporary goal change, which the authors interpret as the consequence of a danger signal or of an internal assessment process.

Concerning the learning process, the system exhibited two fundamental instabilities in its behaviour: either the robot persisted in a goal once selected, or it changed its goals continuously. Foliot and Michel conclude, nevertheless, that their approach is in principle correct but requires a more detailed definition of the individual components.

11.3. The model of Gadanho and Hallam

Gadanho and Hallam examined what role emotions play in an autonomous robot which adapts to its environment by reinforcement learning (Gadanho and Hallam, 1998). For this purpose they worked with a simulated Khepera robot.


They built their emotion model on the somatic marker hypothesis suggested by Damasio (1994). Damasio assumes that emotions cause special body feelings. These body feelings are the result of experiences with internal preference systems and external events and help to predict the outcomes of certain scenarios. Somatic markers help humans to make fast decisions without requiring much processing capacity or time.

The model developed on this basis by Gadanho and Hallam has four fundamental emotions: Happiness, Sadness, Fear, and Anger. The intensity of each emotion is determined by the internal feelings of the robot. These feelings are: Hunger, Pain, Restlessness, Temperature, Eating, Smell, Warmth, and Proximity. Each emotion is defined by a set of constant feeling dependencies and a bias value. For example, the intensity of Sadness is high if Hunger and Restlessness are high and the robot does not eat.

In the model of Gadanho and Hallam, each emotion tries to affect the body state in such a way that the resulting body state resembles the one which causes that specific emotion. To achieve this, the emotion uses a simple hormonal system. With each feeling, a hormone is associated. The intensity of a feeling is derived not directly from the value of the body perception which causes the feeling, but from the sum of the perception and the hormone value:

"The hormone values can be (positively or negatively) high enough to totally hide the real sensations from the robot's perception of its body. The hormone quantities produced by each emotion are directly related to its intensity and its dependencies on the associated feelings. The stronger the dependency on a certain feeling, the greater quantity of the associated hormone is produced by an emotion."(Gadanho and Hallam, 1998, p. 2)

The hormone values can rise fast; however, they fade away slowly, so that the emotional state persists for some time even when the emotion-eliciting situation is long past. The robot equipped with this emotion system has the task of visiting food sources scattered in its environment and taking up energy. The faster it moves, the more energy it uses. The food sources consist of lights which the robot can detect. In order to draw energy from a food source, the robot must push it. This releases energy for a short time, together with a smell which the robot can detect. In order to take up the energy, the robot must turn around and turn its back to the food source. After a short time the food source is empty and needs a certain period of time to regenerate. The robot must then visit other food sources. If a food source has no energy, its light goes out.

In the context of this task, the emotional dependencies of feelings look as follows:

The robot is happy if the present situation presents no problems. It is particularly happy if it has used its motors a lot or is currently taking up energy. The robot is sad if it has little energy and is not currently taking up energy. If the robot drives into a wall, the felt pain makes it fearful. If the robot remains too long at one position, it becomes restless. That makes it angry. The anger remains until it moves or changes its current actions.

The system learns by reinforcement learning. In order to shorten the learning process, the fundamental behaviours of the robot were programmed from the start, so that the system could concentrate on learning behaviour co-ordination. The three fundamental behaviours of the robot are avoiding obstacles, approaching light sources, and following a wall.


The system has a controller with two separate modules. The Associative Memory Module is a neural net which associates the feelings of the robot with the values it expects from each of its three behaviours; the algorithm used here is Q-learning. The Behaviour Selection Module makes a stochastic selection, based on the information from the other module, of the behaviour to be executed next.
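A minimal sketch of these two modules, using a tabular Q-function instead of the authors' neural net and invented constants, could look like this:

```python
import math
import random
from collections import defaultdict

BEHAVIOURS = ["avoid_obstacles", "follow_light", "follow_wall"]
Q = defaultdict(float)                      # (feeling-state, behaviour) -> expected value

def q_update(state, behaviour, reward, next_state, alpha=0.1, gamma=0.9):
    """Associative Memory Module: one Q-learning update."""
    best_next = max(Q[(next_state, b)] for b in BEHAVIOURS)
    Q[(state, behaviour)] += alpha * (reward + gamma * best_next - Q[(state, behaviour)])

def select_behaviour(state, temperature=0.5):
    """Behaviour Selection Module: stochastic choice weighted by expected values."""
    weights = [math.exp(Q[(state, b)] / temperature) for b in BEHAVIOURS]
    return random.choices(BEHAVIOURS, weights=weights)[0]

state = ("hungry", "no_pain")
q_update(state, "follow_light", reward=1.0, next_state=("eating", "no_pain"))
print(select_behaviour(state))
```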

Fig. 18: Adaptive controller (after Gadanho and Hallam, 1998, p. 3)

According to the authors, reward and punishment pose a special problem for an autonomous robot. The environment and the internal state of the robot change from moment to moment. If all information were analyzed and the behaviours changed at every transition, this would not only cost enormous processing capacity, but would also give the robot no feedback about behaviours whose success may appear only after a series of transitions. On the other hand, the robot must be able to change a dysfunctional behaviour quickly. Here the emotions come into play: their task is to determine these state transitions.

In order to test this hypothesis, the authors developed a controller with emotion-dependent event detection. An event is detected if one of three conditions occurs:

there is a change of the dominant emotion;
the value of the currently dominant emotion differs statistically significantly from the values recorded since the last state transition;
a limit of 10,000 steps is reached.
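A sketch of this event detection, with the statistical test reduced to a simple deviation-from-the-mean check and an assumed threshold, might look as follows:

```python
import statistics

def is_event(dominant, prev_dominant, dominant_values, steps_since_event,
             step_limit=10_000, z_threshold=3.0):
    if dominant != prev_dominant:                        # 1. dominant emotion changed
        return True
    if len(dominant_values) > 1:
        mean = statistics.mean(dominant_values)
        stdev = statistics.pstdev(dominant_values) or 1e-6
        if abs(dominant_values[-1] - mean) / stdev > z_threshold:
            return True                                  # 2. intensity deviates significantly
    return steps_since_event >= step_limit               # 3. step limit reached

print(is_event("fear", "happiness", [0.2, 0.3], 12))     # -> True (dominant emotion changed)
```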


Fig. 19: Emotions and control (after Gadanho and Hallam, 1998, p. 4)

To test the effectiveness of this event-directed controller, the authors developed three further controllers:

Regular intervals - the adaptive controller is triggered every 35 steps.
Hand-crafted - all behaviours are programmed firmly into the controller; the system cannot learn anything.
Random selection - the controller selects a new behaviour at each step.

Each of these four controllers went through an identical experimental setup. It consisted of thirty different runs with three million learning steps each. In each run, a fully charged robot was placed at a randomly selected initial position. For evaluation purposes, the data were analysed in units of 50,000 steps and collected for the following variables:

the average reinforcement received over all steps;
the average reinforcement during those steps in which the adaptive controller was triggered;
the average energy level of the robot;
the number of collisions;
the frequency with which the adaptive controller was triggered.

The results are as follows:

Controller            Reinforcement   Event reinforcement   Energy   Collisions (%)
Hand-crafted               0.34            -0.03              0.83        3.0
Event-driven               0.24             0.04              0.63        0.6
Regular intervals          0.24             0.20              0.62        1.7
Random selection          -0.38            -0.38              0.02        5.6

Tab. 9: Results of the experiments of Gadanho and Hallam (after Gadanho and Hallam, 1998, p. 5)


The results show, according to the authors, that the learning controllers fulfilled their task. Their average energy level is significantly lower than that of the hand-crafted controller, but never reaches a critical value. The main difference between the two learning controllers lies in the number of collisions - here the event-driven controller is better.

Overall, the event-driven controller is not significantly better than its competitor, but it obtains its learning success with a significantly lower number of events and thereby saves substantially more time.

The authors come to the conclusion that the experiments have confirmed their hypothesis about the role of emotions in reinforcement learning.

11.4. The model of Staller and Petta

Staller and Petta developed the TABASCO architecture, an acronym for "Tractable Appraisal-Based Architecture for Situated Cognizers" (Staller and Petta, 1998). TABASCO is based to a large extent on the emotion theory of Scherer and has so far not been implemented in a simulation.

Staller and Petta understand emotions as processes which are related to the interaction of an agent with its environment. "In particular, TABASCO models the appraisal process, the generation of action tendencies, and coping." (Staller and Petta, 1998, p. 3)

The fundamental idea of TABASCO is that the levels of the emotion system postulated by Scherer (sensorimotor, schematic, and conceptual) apply not only to appraisal but also to action generation. The two main components of the architecture, Perception and Appraisal and Action, are therefore constructed as hierarchies with three levels.

Fig. 20: TABASCO architecture (after Staller and Petta, 1998, p. 4)

The Perception and Appraisal component: The sensory layer consists of feature detectors for the detection of, for example, sudden, intense stimuli or the quality of a stimulus (e.g. pleasantness). The schematic layer compares the input with patterns, particularly with social and self patterns. The conceptual layer can reason abstractly and draw inferences based on propositional knowledge and beliefs.


The Action component: The motor layer contains motor commands. The schematic layer contains action tendencies and what Frijda calls "flexible programs" (Frijda, 1986, p. 83). The conceptual layer is responsible for coping.

The Appraisal Register, which goes back to a suggestion by Smith et al. (1996), mediates between these two components. It detects and combines the appraisal results of the three layers of the Perception and Appraisal component and, on the basis of the appraised state, influences the Action component.

The Action Monitoring component finally observes the planning and execution processes of the Action component and conveys the results to the Perception and Appraisal component, where they are integrated into the appraisal process.

Staller and Petta call their system a situated cognizer. With this term they want to underline the importance of both components for an autonomous system. They define cognizing (a term suggested first by Chomsky) as "having access to knowledge that is not necessarily accessible to consciousness" (Staller and Petta, 1998, p. 5).

11.5. The model of Botelho and Coelho

Botelho and Coelho define emotion in the context of their Salt & Pepper project as "a process that involves appraisal stages, generation of signals used to regulate the agent's behavior, and emotional responses" (Botelho and Coelho, 1997, p. 4). With Salt & Pepper they want to define an architecture containing mechanisms which play the same role for autonomous agents as the mechanisms that make humans so successful.

The starting point of their considerations is the classification of emotions in a multidimensional matrix "that may be used with any set of emotion classification dimensions" (Botelho and Coelho, 1997, p. 4).

Dimension of classification - Examples

Role/function of emotion - Attention shift warning, performance evaluation, malfunctioning-component warning, motivation intensifier

Process by which emotion fulfills its role - Reflexive action, creation of motivators, setting plan selection criteria

Urgency of the repairing process - Urgent (e.g. need to immediately attend the external environment), not urgent (e.g. need for long-term improvement of default criteria for plan selection)

Source of appraisal - External environment, internal state, past events, current events

Type of appraisal - Affective appraisal, cognitive appraisal

Table 10: Dimensions of emotion classification (after Botelho and Coelho, 1997, p. 5)

The authors differentiate between affective and cognitive appraisal and postulate that it is, in principle, possible to differentiate clearly between these two components in any given architecture.


They call the respective modules Affective Engine and Cognitive Engine.

The Affective Engine and the Cognitive Engine differ in three respects:

1. Kind of information processed: The Affective Engine processes information which concerns the hypothetical or real satisfiability of the agent's motives, while the Cognitive Engine additionally processes problem-solving information, decision information, and declarative information about different aspects of the world.

2. Purpose of information processing: The principal purpose of the information processing of the Affective Engine is the production of signals which help the Cognitive Engine to fulfill its tasks, for example the selection of the cognitive structures relevant for a situation, the control of attention, etc. The principal purposes of the Cognitive Engine are goal attainment, problem solving, and deciding. "A simple way to put it is to say the Cognitive Engine reasons at the object level, whereas the Affective Engine reasons at the meta-level." (Botelho and Coelho, 1997, p. 10)

3. Typical response time: The Affective Engine reacts much faster than the Cognitive Engine, because it needs only a fraction of the information and because its architecture likewise contributes to faster decisions.

The authors suggest a mechanism which makes it possible for the Affective Engine to react quickly: the reduction of explicit and long comparison chains to short, specific rules. They describe an example of such a process:

if someone risks dying, he or she will feel a lot of fear;
    risks_dying(A) -> activate(fear, negative, 15)
if someone risks running out of food, he or she risks dying;
    risks_running_out_of_food(A) -> risks_dying(A)
if someone risks running out of money, he or she risks running out of food;
    risks_running_out_of_money(A) -> risks_running_out_of_food(A)
if someone loses some amount of money, he or she risks running out of money;
    loses_money(A) -> risks_running_out_of_money(A)

loses_money(A) -> activate(fear, negative, 15)

(Botelho and Coelho, 1997, p. 11)

These explicit and implicit rules should be organized in a hierarchy in which the longer rules are used only if no suitable short rule is found.
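A minimal sketch of such a rule hierarchy, with the rule encoding and the fall-back mechanism as illustrative assumptions, could look like this:

```python
SHORT_RULES = {"loses_money": ("fear", "negative", 15)}       # compiled shortcut rule
CHAIN = {"loses_money": "risks_running_out_of_money",
         "risks_running_out_of_money": "risks_running_out_of_food",
         "risks_running_out_of_food": "risks_dying",
         "risks_dying": None}                                  # terminal condition
TERMINAL = {"risks_dying": ("fear", "negative", 15)}

def appraise(condition):
    if condition in SHORT_RULES:                               # fast path: short, specific rule
        return SHORT_RULES[condition]
    while condition in CHAIN and CHAIN[condition] is not None:
        condition = CHAIN[condition]                           # otherwise follow the long inference chain
    return TERMINAL.get(condition)

print(appraise("loses_money"))      # -> ('fear', 'negative', 15)
```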

The Salt & Pepper architecture consists of three main components: the Affective Engine, the Cognitive Engine and an Interrupt Manager. The Affective Engine possesses Affective Sensors, an Affective Generator, and an Affective Monitor. The latter two initiate the process of emotion production together. All other modules of the system (except the Interrupt Manager) are assigned to the Cognitive Engine.


Fig. 21: Salt & Pepper architecture (after Botelho and Coelho, 1997, p. 12)

The long-term memory is an associative network. Each node of the network possesses an identification, an activation level, a set of associations with other nodes, and a number of symbolic structures which represent motives, plans, actions, and declarative knowledge. The more a node is activated, the more easily it is noticed by a search process (accessibility).

The Input Buffer and the Affective Generator activate nodes in long-term memory. The Cognitive Monitor and the Affective Monitor suggest certain nodes for the attention of the agent. When such a suggestion is made, the Interrupt Manager decides whether the current cognitive process is to be interrupted and the content of the suggested node loaded into working memory for processing. If the contents of a node in working memory are processed, the node receives a certain level of activation and thus more accessibility.

Nodes which are based on certain experiences of the agent are called episodic nodes and form the episodic memory.

Emotions are described in this system by a set of parameters:

1. a label E which describes the emotion class, and a list of arguments, for example the source of the appraisal;

2. a valence V, which can take the values positive, negative, or neutral;

3. an intensity I;

4. an emotion program P, an action sequence which is executed as soon as the appraisal stage has produced a label;

5. an emotional reaction R, which is only executed if a node in long-term memory which matches the label of the emotion is selected and processed in working memory.
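As an illustration, the five parameters could be represented by a simple record like the following; the field types and the example values are assumptions, not part of Salt & Pepper:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class EmotionSignal:
    label: str                              # E: emotion class
    arguments: List[str] = field(default_factory=list)   # e.g. source of the appraisal
    valence: str = "neutral"                # V: positive, negative or neutral
    intensity: float = 0.0                  # I
    program: Optional[Callable] = None      # P: run by the Affective Generator, no interruption
    reaction: Optional[Callable] = None     # R: run only when the matching LTM node reaches working memory

fear = EmotionSignal("fear", ["external_environment"], "negative", 15.0,
                     program=lambda: print("orient attention"),
                     reaction=lambda: print("flee"))
fear.program()
```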

The emotion program differs from the emotional reaction in that it is executed by the Affective Generator without interrupting the current cognitive processing of the agent. The Affective Generator undertakes a partial evaluation of the external and internal state of the agent, the so-called affective estimate. If the eliciting conditions of a certain emotion are fulfilled, the Affective Generator produces the label, the intensity, and the valence of the emotion and executes the emotion program. The Affective Monitor then scans the long-term memory until it finds a node which corresponds to the label of the emotion and possesses the same valence. It activates this node with an activation level which is a function of the generated emotion intensity.

The system contains mechanisms which make emotional learning possible. The authors differentiate between three classes of emotional learning:

1. Learning new appraisal rules and optimizing existing ones: by this they mean learning new circumstances which can trigger an emotion signal, the reduction of appraisal rules (see above), and changes to the characteristics of the generated emotion signal.

2. Extension of the repertoire of emotion signals.

3. Learning from emotional reactions: this includes the optimization of existing behaviour reactions, learning new reactions, and learning as a result of a reaction to an emotion signal.

The authors specify the conditions under which a system is able to accomplish these learning procedures (Botelho and Coelho, 1998).

Some elements of Salt & Pepper have so far been implemented and, according to the authors, have confirmed the theoretical assumptions (Botelho and Coelho, 1997).

11.6. The model of Canamero

Canamero (1997) also pursues an approach based on Minsky's "Society of Mind" (1985). In a two-dimensional world called Gridland live the Abbotts, artificial organisms which have a motivational and an emotional system.

An Abbott consists of a number of agents which, viewed individually, are quite "simple", but reach a new quality when they interact with one another. An Abbott possesses three kinds of sensors (somatic, tactile, visual); two kinds of recognizers which react to complex stimuli and can both learn and forget; eight so-called direction nemes which supply information about the spatial environment of the Abbott; two categories of maps (tactile and visual) which receive their information from the recognizers and direction nemes and represent it internally; three effectors (hand, foot, mouth); a behaviour repertoire (Attack, Drink, Eat, Play, Rest, Withdraw, etc.); as well as a set of managers (e.g. finder, look-for, go-toward) which correspond to appetitive behaviour. Furthermore, the Abbotts possess a set of physiological variables, e.g. adrenalin, blood sugar, endorphins, body temperature.

The Abbotts move in a world which contains sources of food, obstacles and enemies. They come into this world as "newborns", equipped with a basic set of characteristics, and must then develop in their environment.

What is interesting in Canamero's model is that her creatures are equipped from the outset with motivations and emotions. These are called, after Minsky, proto-specialists, because they are primitive mechanisms responsible for action selection and control functions.

The theoretical basis for the motivations is a homoeostatic approach:

"In general, motivations can be seen as homeostatic processes which maintain a controlled physiological variable within a certain range."(Canamero, 1997, p. 6)

The motivational agents of the Abbotts consist of "a controlled variable, the set point and the normal variability range of which are defined by the corresponding sensor that tracks its real value; an incentive stimulus that can increase the motivation's activation level, but cannot trigger it; an error signal or drive; and a satiation criterion." (Canamero, 1997, p. 6f.)

Thus the error signal "blood sugar too low", for example, triggers the motivation hunger, whose goal is to increase the blood sugar level. The activation of a motivation is proportional to the size of the error signal (i.e. the deviation of a physiological value from its homoeostatic range); the intensity of the motivation is computed from the activation level. The motivation with the highest activation level tries to organize the behaviour of the Abbott in such a way that the associated drive is satisfied. If the motivation cannot find and trigger an appropriate behaviour, it activates the finder agent and hands it the intensity value, so that it can be passed on to the other agents it activates. The intensity substantially affects a behaviour: in the case of escape behaviour, for example, the strength of the motor activity; in the case of other behaviours, their duration.
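A minimal sketch of such a homoeostatic motivational agent, with invented set points, tolerances, and gains, might look as follows:

```python
def motivation_activation(value, set_point, tolerance, incentive=0.0, gain=1.0):
    """Return the activation of a motivation for one controlled variable."""
    deviation = abs(value - set_point)
    error = max(0.0, deviation - tolerance)          # error signal / drive
    activation = gain * error
    if activation > 0.0:
        activation += incentive                      # incentive stimuli amplify, but cannot trigger
    return activation

# Hunger: blood sugar far below its set point, food in sight as an incentive stimulus.
hunger = motivation_activation(value=40, set_point=90, tolerance=10, incentive=5.0)
print(hunger)    # -> 45.0
```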

Activation level and intensity of a motivation can now be modified by emotions. In Canamero's system emotions are composed of

"an incentive stimulus; an intensity proportional to its level of activation; a list of hormones it releases when activated; a list of physiological symptoms; and a list of physiological variables it can affect."(Canamero, 1997, p. 7)

Emotional states are activated and differentiated from each other by three kinds of elicitors:

1. External events, i.e. an object or the result of a behaviour, whereby the reaction to it can be either inherited or learned.

2. General stimulation patterns which cause different changes in the physiological variables and thus let the same emotion operate under different circumstances. As an example Canamero cites the anger agent, which is triggered when the level of a variable remains too high. In this way, emotions contribute to the control of homoeostatic processes.

3. Special value patterns of physiological variables which permit a distinction between emotions which are caused by the same general mechanism. As an example Canamero cites fear (with high heartbeat frequency) and interest (with low heartbeat frequency).

Since Abbott is a primitive system, it is always in a single, clearly defined emotional state. The three elicitors are arranged hierarchically in the order mentioned above. The selected emotion affects the action selection mechanism in two ways: it can lower or increase the intensity of the current motivation and thus at the same time the intensity of the selected behaviour; in addition, it modifies the readings of the sensors which measure the variables associated with the emotion and thereby changes the perceived physical state (happiness agent -> release of endorphin -> less pain perception).

The action selection of an Abbott thus proceeds in the following stages:

1. The activation level of all agents is set to zero.


2. Internal variables and environmental data are read in and maps are formed.

3. Motivations are appraised and the effects of the emotional state are computed.

4. The motivation with the highest activation is selected.

5. The active motivation selects the behaviour(s) which can best satisfy its drive.
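The following sketch illustrates these stages for a single selection step; the data structures and the way the emotional state modulates the motivation intensities are illustrative assumptions.

```python
def select_action(sensors, motivations, behaviours, emotion_effect):
    activations = {name: 0.0 for name in motivations}             # 1. reset activation levels
    for name, error in sensors.items():                           # 2.-3. appraise motivations and
        activations[name] = error * emotion_effect.get(name, 1.0)  #       apply emotional modulation
    winner = max(activations, key=activations.get)                # 4. highest activation wins
    return behaviours[winner]                                      # 5. winning motivation picks the behaviour

sensors = {"hunger": 0.7, "fatigue": 0.2}
behaviours = {"hunger": "eat", "fatigue": "rest"}
print(select_action(sensors, {"hunger": 0.0, "fatigue": 0.0}, behaviours,
                    emotion_effect={"hunger": 1.2}))   # an emotion can raise or lower an intensity
```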

Canamero concedes that her Abbotts at present still operate on a very primitive level and need a number of additional agents in order to develop long-term learning and strategies. Emotions nevertheless play a substantial role in her model:

"In particular, as far as learning is concerned, our model of emotions provides a means to have different reward and punishment mechanisms...Again, motivations and emotions constitute a key factor in determining what has to be remembered and why."(Canamero, 1997, p. 8)

11.7. Summary and evaluation

As the preceding examples show, there exists a variety of approaches to modelling emotions in the field of autonomous agents. Their connections to psychological theory differ considerably.

It is noticeable that most authors are quite eclectic in their use of theory. They fall back predominantly on theories which lend themselves to operationalization. Frequently, only certain elements are picked out and then extended with components of their own, often without making this explicitly clear.

In order to obtain fast results, only parts of the sketched models are implemented in actual simulations or robots. Pragmatic solutions are used which necessarily reduce complex processes to a few variables. Moreover, these variables are frequently defined arbitrarily in order to be able to realize the model at all.

It is remarkable that the majority of the authors regard emotions as a substantial component of the control system of an agent and define them functionally in this regard. Emotions are no longer regarded as appendages of the cognitive system, but rather as an indispensable condition for the reliable functioning of cognition.


Literature

Affective Computing Home Page, available at http://www-white.media.mit.edu/vismod/demos/affect/affect.html.

Araujo, A.F.R. (1994). Memory, Emotions, and Neural Networks: Associative Learning and Memory Recall Influenced by Affective Evaluation and Task Difficulty. PhD thesis, University of Sussex.

Aubé, M. (1998). Designing Adaptive Cooperating Animats Will Require Designing Emotions: Expanding upon Toda's Urge Theory. Paper presented at the 5th International Conference of the Society for Adaptive Behavior, Zürich. Available at http://www.ai.univie.ac.at/~paolo/conf/sab98/sab98ws.html.

Axelrod, R. (1990). The Evolution of Co-operation. Penguin Books.

Bareiss, R. (1989). Exemplar-Based Knowledge Acquisition. A Unified Approach to Concept Representation, Classification, and Learning. Academic Press.

Bates, J. (1994). The role of Emotion in Believable Agents. Communications of the ACM.

Bates, J.; Loyall, A. B.; Reilly, W. S. (1992a). An Architecture for Action, Emotion, and Social Behavior. In: Proceedings of the Fourth European Workshop on Modeling Autonomous Agents in a Multi-Agent World. S. Martino al Cimino, Italy.

Bates, J.; Loyall, A. B.; Reilly, W. S. (1992b). Integrating Reactivity, Goals, and Emotion in a Broad Agent. In: Proceedings of the Fourteenth Annual Conference of the Cognitive Science Society. Bloomington, Indiana.

Baumgartner, P. and Payr, S. (eds.) (1995). Speaking Minds. Interviews with Twenty Eminent Cognitive Scientists. Princeton University Press.

Beaudoin, L. and Sloman, A. (1993). A study of motive processing and attention. In: A. Sloman, D. Hogg, G. Humphreys, D. Partridge, A. Ramsay (eds.): Prospects for Artificial Intelligence. IOS Press, Amsterdam.

Botelho, L.M. and Coelho, H. (1997). Artificial autonomous agents with artificial emotions. Available at http://iscte.iscte.pt/~luis/web/luis.htm.

Botelho, L.M. and Coelho, H. (1998). Adaptive agents: emotion learning. Paper presented at the 5th International Conference of the Society for Adaptive Behavior, Zürich. Available at http://www.ai.univie.ac.at/~paolo/conf/sab98/sab98ws.html.

Braitenberg, V. (1993). Vehikel. Experimente mit kybernetischen Wesen. Reinbek, Rowohlt.


Brustoloni, J. C. (1991). Autonomous Agents: Characterization and Requirements. Carnegie Mellon Technical Report CMU-CS-91-204. Pittsburgh: Carnegie Mellon University.

Canamero, D. (1997). Modelling motivations and emotions as a basis for intelligent behavior. In: Proceedings of Agents '97. ACM.

Chwelos, G. and Oatley, K. (1994). Appraisal, Computational Models, and Scherer's Expert System. In: Cognition and Emotion, 8 (3), pp. 245-257.

Colby, K.M. (1981). Modeling a paranoid mind. In: The Behavioral and Brain Sciences, 4 (4), pp. 515-560.

Cornelius, R.R. (1996). The Science of Emotion. Prentice Hall.

Damasio, A. R. (1994). Descartes' Error. Emotion, Reason and the Human Brain. Avon Books.

Dawkins, M.S. (1993). Through Our Eyes Only? The Search for Animal Consciousness. Oxford, W.H. Freeman.

Dawkins, R. (1988). The Blind Watchmaker. Penguin Books.

Dennett, D. (1996). Kinds of minds. Toward an Understanding of Consciousness. Basic Books.

Dörner, D. and Hille, K. (1995). Artificial Souls: Motivated Emotional Robots. Available at http://www.uni-bamberg.de/~ba2dp1/psi.htm.

Dörner, D., Hamm, A., Hille, K. (1997). EmoRegul. Beschreibung eines Programmes zur Simulation der Interaktion von Motivation, Emotion und Kognition bei der Handlungsregulation. Available at http://www.uni-bamberg.de/~ba2dp1/psi.htm.

Dörner, D. and Schaub, H. (1998). Das Leben von PSI. Über das Zusammenspiel von Kognition, Emotion und Motivation - oder: Eine einfache Theorie für komplizierte Verhaltensweisen. Available at http://www.uni-bamberg.de/~ba2dp1/psi.htm.

Dyer, M.G. (1982). In-depth understanding. A computer model of integrated processing for narrative comprehension. Cambridge, Mass., MIT Press.

Dyer, M.G. (1987). Emotions and their Computations: Three Computer Models. In: Cognition and Emotion, 1 (3), pp. 323-347.

Elliott, C. (1992). The Affective Reasoner: A Process Model of Emotions in a Multi-agent System. Ph.D. Dissertation, Northwestern University, The Institute for the Learning Sciences, Technical Report No. 32.

Elliott, C. (1994a). Research problems in the use of a shallow artificial intelligence model of personality and emotion. In: Proceedings of the Twelfth National Conference on Artificial Intelligence. Seattle, WA: AAAI.


Elliott, C. (1994b). Components of two-way emotion communication between humans and computers using a broad, rudimentary, model of affect and personality. In: COGNITIVE STUDIES: Bulletin of the Japanese Cognitive Science Society, 1 (2), pp. 16-30.

Elliott, C. (1997a). I picked up catapia and other stories: A multimodal approach to expressivity for "emotionally intelligent" agents. In: Proceedings of the First International Conference on Autonomous Agents.

Elliott, C. (1997b). Hunting for the Holy Grail with "emotionally intelligent" virtual actors. Available at http://condor.depaul.edu/~elliott.

Elliott, C. and Siegle, G. (1993). Variables influencing the intensity of simulated affective states. In: AAAI technical report for the Spring Symposium on Reasoning about Mental States: Formal Theories and Applications. American Association for Artificial Intelligence, Stanford University, Palo Alto, CA.

Elliott, C.; Yang, Y.-Y.; Nerheim-Wolfe, R. (1993). Using faces to express simulated emotions. Unpublished manuscript.

Elliott, C. and Carlino, E. (1994). Detecting user emotion in a speech-driven interface. Work in progress.

Elliott, C.; Rickel, J.; Lester, J.C. (1997). Integrating affective computing into animated tutoring agents. Available at http://condor.depaul.edu/~elliott.

Foliot, G. and Michel, O. (1998). Learning Object Significance with an Emotion based Process. Paper presented at the 5th International Conference of the Society for Adaptive Behavior, Zürich. Available at http://www.ai.univie.ac.at/~paolo/conf/sab98/sab98ws.html.

Franklin, S. (1995). Artificial Minds. Cambridge, MA, MIT Press.

Franklin, S. and Graesser, A. (1996). Is it an Agent, or just a Program?: A Taxonomy for Autonomous Agents. In: Proceedings of the Third International Workshop on Agent Theories, Architectures, and Languages. Springer Verlag.

Frijda, N.H. (1986). The emotions. Cambridge, U.K., Cambridge University Press.

Frijda, N.H. and Swagerman, J. (1987). Can computers feel? Theory and design of an emotional system. In: Cognition and Emotion, 1, pp. 235-258.

Frijda, N.H. and Moffat, D. (1993). A model of emotions and emotion communication. In: Proceedings of RO-MAN '93: 2nd IEEE International Workshop on Robot and Human Communication.

Frijda, N.H. and Moffat, D. (1994). Modeling emotion. In: Cognitive Studies, 1 (2), pp. 5-15.

Gadanho, S.C. and Hallam, J. (1998). Emotion-triggered Learning for Autonomous Robots. Paper presented at the 5th International Conference of the Society for Adaptive Behavior, Zürich. Available at http://www.ai.univie.ac.at/~paolo/conf/sab98/sab98ws.html.

Gazzaniga, M.S. (1998). The mind's past. Berkeley, University of California Press.

Holland, J.H. (1995). Hidden Order. How adaptation builds complexity. Reading, MA, Helix Books.

Holland, J.H. (1998). Emergence. From Chaos to Order. Reading, MA, Helix Books.

Kaiser, S. and Wehrle, T. (1993). Emotion research and AI: Some theoretical and technical issues. Available at http://www.unige.ch/fapse/emotion/.

Kaiser, S. et al. (1994). Multi-modal emotion measurements in an interactive computer-game: A pilot-study. In: N. Frijda (ed.): Proceedings of the VIIIth Conference of the International Society for Research on Emotion, 1994.

Koda, T. (1997). Agents with Faces: A Study on the Effects of Personification of Software Agents. Thesis, MIT.

LeDoux, J. (1996). The Emotional Brain. Simon & Schuster.

McCarthy, John (1990). Formalizing Common Sense. Norwood, Ablex Publishing Corporation.

McFarland, D. and Bösser, T. (1993). Intelligent behavior in animals and robots. Cambridge, MA, MIT Press.

Minsky, M. (1987). Societies of Mind. Picador.

Moffat, D. (1997). Personality parameters and programs. In: R. Trappl and P. Petta (eds.): Creating personalities for synthetic actors. Springer.

Moffat, D., Frijda, N.H., Phaf, H. (1993). Analysis of a model of emotions. In: A. Sloman, D. Hogg, G. Humphreys, A. Ramsay, D. Partridge (eds.): Prospects for Artificial Intelligence. Amsterdam, IOS Press.

Moffat, D. and Frijda, N.H. (1995). Where there's a Will there's an Agent. In: M.J. Woolridge and N.R. Jennings (eds.): Intelligent Agents - Proceedings of the 1994 Workshop on Agent Theories, Architectures, and Languages. Springer.

Mueller, E. and Dyer, M.G. (1985). Daydreaming in humans and computers. In: Proceedings of the Ninth International Joint Conference on Artificial Intelligence. Los Angeles, CA.

Neisser, U. (1963). The imitation of man by machine. In: Science, 139, pp. 193-197.

Oatley, K. (1992). Best Laid Schemes. The Psychology of Emotions. Paris, Cambridge University Press.

Oatley, K. and Johnson-Laird, P.N. (1987). Towards a cognitive theory of emotions. In: Cognition and Emotion, 1, pp. 29-50.

Oatley, K. and Jenkins, J.M. (1996). Understanding Emotions. Blackwell.

O'Rorke, P. and Ortony, A. (1992). Explaining emotions. Unpublished manuscript.

Ortony, A., Clore, G.L., Collins, A. (1988). The cognitive structure of emotions. Cambridge, U.K., Cambridge University Press.

Pfeifer, R. (1982). Cognition and emotion: an information processing approach. Carnegie-Mellon University, CIP Working Paper No. 436.

Pfeifer, R. (1988). Artificial intelligence models of emotion. In: V. Hamilton, G. Bower, & N. Frijda (eds.): Cognitive perspectives on emotion and motivation. Proceedings of the NATO Advanced Research Workshop. Dordrecht, Kluwer.

Pfeifer, R. (1994). The "Fungus Eater" approach to the study of emotion: A View from Artificial Intelligence. Techreport #95.04. Artificial Intelligence Laboratory, University of Zürich.

Pfeifer, R. (1996). Building "Fungus Eaters": Design Principles of Autonomous Agents. In: Proceedings of the Fourth International Conference of the Society for Adaptive Behavior. Cambridge, MA, MIT Press.

Pfeifer, R. (1998). Cognition. In: L. Steels (ed.): The biology and technology of intelligent autonomous agents. Proceedings of the NATO Advanced Study Institute. Trento, Italy.

Pfeifer, R. (1998). Cheap designs: exploiting the dynamics of the system-environment interaction. Technical Report No. IFI-AI-94.01, AI Lab, Computer Science Department, University of Zurich.

Pfeifer, R. and Nicholas, D.W. (1985). Toward computational models of emotion. In: L. Steels and J.A. Campbell (eds.): Progress in Artificial Intelligence. Chichester, U.K., Ellis Horwood.

Pfeifer, R. and Verschure, P.F.M.J. (1992). Distributed adaptive control: a paradigm for designing autonomous agents. In: Toward A Practice of Autonomous Systems: Proceedings of the First European Conference on Artificial Life. Cambridge, MA, MIT Press.

Pfeifer, R. and Verschure, P.F.M.J. (1998). Complete autonomous agents: a research strategy for cognitive science. In: G. Dorffner (ed.): Neural networks and a New AI.

Picard, R.W. (1997). Affective Computing. MIT Press, Cambridge, MA.

Read, T. and Sloman, A. (1993). The Terminological Pitfalls of Studying Emotion.


Reeves, J.F. (1991). Computational morality: A process model of belief conflict and resolution for story understanding. Technical Report UCLA-AI-91-05, UCLA Artificial Intelligence Laboratory.

Reilly, W.S. (1996). Believable Social and Emotional Agents. PhD thesis. Technical Report CMU-CS-96-138, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA.

Reilly, W. S. and Bates, J. (1992). Building Emotional Agents. Technical Report CMU-CS-92-143, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA.

Rollenhagen, C. and Dalkvist, J. (1989). Cognitive contents in emotion: A content analysis of retrospective reports of emotional situations. Technical Report, Department of Psychology, University of Stockholm.

Roseman, I.J. (1979). Cognitive aspects of emotion and emotional behavior. Paper presented at the 87th Annual Convention, American Psychological Association. New York, NY.

Roseman, I.J. (1984). Cognitive determinants of emotions: A structural theory. In: P. Shaver (ed.): Review of personality and social psychology, Vol. 5. Beverly Hills, CA, Sage.

Roseman, I.J. (1991). Appraisal determinants of discrete emotions. In: Cognition and Emotion, 3, pp. 161-200.

Roseman, I.J.; Antoniou, A.A.; Jose, P.A. (1996). Appraisal Determinants of Emotions: Constructing a More Accurate and Comprehensive Theory. In: Cognition and Emotion, 10 (3), pp. 241-277.

Schaub, H. (1995). Die Rolle der Emotionen bei der Modellierung kognitiver Prozesse. Paper for the Artificial Life workshop, Sankt Augustin. Available at http://www.uni-bamberg.de/~ba2dp1/psi.htm.

Schaub, H. (1996). Künstliche Seelen - Die Modellierung psychischer Prozesse. Widerspruch 29.

Scherer, K. (1984). On the nature and function of emotion: a component process approach. In: K.R. Scherer and P. Ekman (eds.): Approaches to emotion. Hillsdale, N.J., Erlbaum.

Scherer, K. (1988). Criteria for emotion-antecedent appraisal: A review. In: V. Hamilton, G.H. Bower, N.H. Frijda (eds.): Cognitive perspectives on emotion and motivation. Dordrecht, Kluwer.

Scherer, K. (1993). Studying the Emotion-Antecedent Appraisal Process: An Expert System Approach. In: Cognition and Emotion, 7 (3/4), pp. 325-355.

Selfridge, O.G. (1959). Pandemonium: A Paradigm for Learning. In: Blake, D.V. and Uttley, A.M. (eds.): Proceedings of the Symposium on Mechanization of Thought Processes. H.M. Stationery Office, London.

Simon, H.A. (1967). Motivational and emotional controls of cognition. In: Psychological Review, 74, pp. 29-39.

Simon, H.A. (1996). Computational Theories of Cognition. In: W. O'Donohue and R.F. Kitchener (eds.): The Philosophy of Psychology. Sage.

Sloman, A. (1981). Why robots will have emotions. Proceedings IJCAI.

Sloman, A. (1987). Motives Mechanisms and Emotions. In: Cognition and Emotion, 1 (3).

Sloman, A. (1991). Prolegomena to a theory of communication and affect. In: A. Ortony, J. Slack, & O. Stock (eds.): AI and Cognitive Science Perspectives on Communication. Heidelberg: Springer.

Sloman, A. (1992a). Towards an information processing theory of emotions. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1992b). Silicon Souls, How to design a functioning mind. Professorial Inaugural Lecture, Birmingham, 1992. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1992c). Prolegomena to a Theory of Communication and Affect. In: Ortony, A., Slack, J., Stock, O. (eds.): Communication from an Artificial Intelligence Perspective: Theoretical and Applied Issues. Heidelberg, Springer.

Sloman, A. (1993). Prospects for AI as the General Science of Intelligence. In: Proceedings AISB93.

Sloman, A. (1994). Explorations in Design Space. In: Proceedings 11th European Conference on AI. Amsterdam.

Sloman, A. (1995). Exploring design space and niche space. Invited talk for the 5th Scandinavian Conference on AI, Trondheim, May 1995. In: Proceedings SCAI95. IOS Press, Amsterdam.

Sloman, A. (1996a). What sort of architecture can support emotionality? Slides for a talk at MIT Media Lab, Nov 1996. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1996b). What sort of architecture is required for a human-like agent? Invited talk at Cognitive Modeling Workshop, AAAI96, Portland, Oregon.

Sloman, A. (1997a). Designing Human-Like Minds. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1997b). Architectural Requirements for Autonomous Human-like Agents. Slides for a talk at DFKI Saarbrücken, 1997. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1997c). What sort of control system is able to have a personality? In: R. Trappl and P. Petta (eds.): Creating Personalities for Synthetic Actors: Towards Autonomous Personality Agents. Springer.

Sloman, A. (1998a). Diagrams in the Mind? Invited paper for the Thinking With Diagrams conference at Aberystwyth. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1998b). What Sorts of Machines Can Love? Architectural Requirements for Human-like Agents Both Natural and Artificial. Draft extended version. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1998c). Review of Affective Computing by Rosalind Picard, MIT Press 1997. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1998d). The "Semantics" of Evolution: Trajectories and Trade-offs in Design Space and Niche Space. Invited talk for IBERAMIA-98, Lisbon.

Sloman, A. (1998e). Damasio, Descartes, Alarms and Meta-management. Invited contribution to the symposium on Cognitive Agents: Modeling Human Cognition, at the IEEE International Conference on Systems, Man, and Cybernetics. San Diego.

Sloman, A. (1998f). What's an AI toolkit for? In: B. Logan and J. Baxter (eds.): Proceedings AAAI-98 Workshop on Software Tools for Developing Agents.

Sloman, A. (1998g). Supervenience and Implementation: Virtual and Physical Machines. Draft. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1998h). Design Spaces, Niche Spaces and the "Hard" Problem. Draft. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. (1998i). What sort of architecture is required for a human-like agent? In: M. Wooldridge and A. Rao (eds.): Foundations of Rational Agency. Kluwer Academic Publishers.

Sloman, A. and Croucher, M. (1981). Why robots will have emotions. In: Proceedings of the 7th International Joint Conference on AI. Vancouver.

Sloman, A. and Logan, B. (1997). Synthetic Minds. Poster presented at AA'97, Marina del Rey. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. and Logan, B. (1998). Architectures and Tools for Human-Like Agents. Paper presented at the European Conference on Cognitive Modelling, Nottingham. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Sloman, A. and Poli, R. (1996). SIM_AGENT: A toolkit for exploring agent designs. In: M. Wooldridge, J. Mueller, M. Tambe (eds.): Intelligent Agents Vol II (ATAL-95). Springer-Verlag.

Sloman, A. and Wright, I.P. (1997). MINDER1: An implementation of a protoemotional agent architecture. Technical Report CSRP-97-1. Available at ftp://ftp.cs.bham.ac.uk/pub/groups/cog_affect/0-INDEX.html.

Smith, C.A., Griner, L.A., Kirby, L.D. and Scott, H.S. (1996). Toward a Process Model of Appraisal in Emotion. In: Proceedings of the Ninth Conference of the International Society for Research on Emotions. Toronto, Canada.

Sonnemans, J. and Frijda, N.H. (1994). The structure of subjective emotional intensity. In: Cognition and Emotion, 8 (4), pp. 329-350.

Staller, A. and Petta, P. (1998). Towards a Tractable Appraisal-Based Architecture for Situated Cognizers. Paper presented at the 5th International Conference of the Society for Adaptive Behavior, Zürich. Available at http://www.ai.univie.ac.at/~paolo/conf/sab98/sab98ws.html.

Stork, D.G. (1997). Scientist on the Set: An Interview with Marvin Minsky. In: Stork, D.G. (ed.): HAL's Legacy: 2001's computer as dream and reality. MIT Press, Cambridge MA.

Suchman, L. (1987). Plans and situated actions. Cambridge University Press.

Sutton, R.S. (1991). Dyna, an integrated architecture for learning, planning, and reacting. In: Working Notes of the 1991 AAAI Spring Symposium.

Swagerman, J. (1987). The Artificial Concern Realization System ACRES. A computer model of emotion. PhD Thesis, University of Amsterdam, Dept. of Psychology.

Toda, M. (1982). Man, robot, and society. The Hague, Nijhoff.

Velásquez, J.D. (1997). Modeling Emotions and Other Motivations in Synthetic Agents. In: Proceedings of the Fourteenth National Conference on Artificial Intelligence and Ninth Innovative Applications of Artificial Intelligence Conference. Menlo Park.

Velásquez, J.D. (1998). A Computational Framework for Emotion-Based Control. Paper presented at the 5th International Conference of the Society for Adaptive Behavior, Zürich. Available at http://www.ai.univie.ac.at/~paolo/conf/sab98/sab98ws.html.

Verschure, P.F.M.J., Kröse, B.J.A., Pfeifer, R. (1992). Distributed adaptive control: the self-organization of structured behavior. In: Robotics and Autonomous Systems, 9.

Watkins, C. and Dayan, P. (1992). Technical Note: Q-Learning. In: Machine Learning 8, pp. 279-292.

Wehrle, T. (1994). New fungus eater experiments. In: P. Gaussier and J.-D. Nicoud (eds.): From perception to action. Los Alamitos, IEEE Computer Society Press.

Wilson, S.W. (1995). Classifier fitness based on accuracy. In: Evolutionary Computation, 3 (2), pp. 149-185.

Wright, I.P. (1996a). Design Requirements for a Computational Libidinal Economy. Technical Report CSRP-96-11. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Wright, I.P. (1996b). Reinforcement learning and animat emotions. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Wright, I.P. (1997). Emotional Agents. PhD thesis, University of Birmingham. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Wright, I.P. and Sloman, A. (1996). MINDER1: An Implementation of a Protoemotional Agent Architecture. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Wright, I.P. and Aubé, M. (1997). The society of mind requires an economy of mind. Technical Report CSRP-97-6. Available at http://www.cs.bham.ac.uk/~axs/cog_affect/COGAFF-PROJECT.html.

Wright, I.P., Sloman, A. and Beaudoin, L. (1996). Towards a Design-Based Analysis of Emotional Episodes. In: Philosophy, Psychiatry and Psychology, vol. 3, no. 2.

Appendix 1

Emotional Machines

Presentation held at V2_Lab workshop in Rotterdam, January 17, 2004

In my presentation, I will not dwell on psychological theories of emotion or on the importance of emotions for reasoning. I will instead point out some of the problems associated with the implementation of emotions into machines and outline some of the proposed solutions.

When we talk about emotional computers, we first have to differentiate between

- the recognition of emotions
- the simulation of emotions
- the generation of emotions
- the expression of emotions

Furthermore, the recognition of emotions can be differentiated into

- the perception of emotions
- the identification of emotions
- the interpretation of emotions

And if we look at the human observer of such machine actions, our point of interest is

the recognition of emotions

a) The recognition of emotions

The recognition of emotions by machines is no trivial task. Why, it is difficult even for us humans! Individuals express their emotions quite differently: with different intensity (depending on individual dispositions and context), dependent on gender, and with different physiological parameters accompanying the emotion.

Picard lists some parameters which are important for the recognition of emotions. She groups them into two categories:

Apparent indicators are, for example:

- facial expression
- voice intonation
- gestures and movement
- posture
- pupillary dilation

Not (or less) apparent indicators are, for example:

- respiration
- heart rate and pulse
- temperature
- electrodermal response and perspiration
- muscle action potentials
- blood pressure

The first step would be to implement a perceptual system which can provide the machine with the necessary data. Parts of such a system would be cameras, microphones, and sensors to measure skin reactions, heart rate, blood pressure, etc.

The problem with a lot of machine perception approaches is the inability to gather data from a moving object. People have to sit in a chair, their faces immobile; alternatively, they have to be equipped with clumsy measuring devices which impair their ability to act naturally.

To overcome these problems, Picard and her group at MIT are developing a number of measuring devices without these disadvantages. One of these devices is Scheirer’s “Expression Glasses”, which measure facial muscle movement to identify emotions. Another is a specially equipped seat that measures changes in physiological reactions. The “TouchPhone” communicates the strength with which the user grips the phone to the other participant in the form of different colours. Another device is a computer mouse which measures emotional states by the pressure exerted on it. Voice analysis can be used to detect the level of stress a speaker experiences.

Picard subsumes these techniques under the name “concurrent-expression method”, as opposed to the “self-report” method, in which users have to indicate their emotional state actively. She points out that many people do not want emotional reactions that are not under their voluntary control to be recorded.

Paul Ekman has developed a system based on decades of research which identifies emotions by facial expression. His FACS (Facial Action Coding System) tracks the movement of facial muscles and correlates combinations of 44 “action units” with specific emotions. Currently, he is working on the implementation of FACS on a computer. A system equipped with cameras and the FACS database could then identify emotions within seconds.
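
To make the idea concrete, here is a minimal sketch of a FACS-style lookup in Python. It is not Ekman's implementation; the action-unit combinations shown are simplified textbook examples.

# Minimal sketch of a FACS-style lookup: sets of detected action units (AUs)
# are matched against prototype combinations. The AU sets are simplified
# illustrations, not Ekman's full coding tables.
AU_PROTOTYPES = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper lid raiser + jaw drop
}

def classify(detected_aus):
    """Return the emotion whose prototype AUs are all present, if any."""
    detected = set(detected_aus)
    for prototype, emotion in AU_PROTOTYPES.items():
        if prototype <= detected:
            return emotion
    return "unclassified"

print(classify([6, 12, 25]))  # -> happiness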

Mourao and Paiva have developed EToy, a plush puppet equipped with sensors. The user manipulates the puppet in order to express a number of emotional states. The results of these movements are then interpreted, and emotions are inferred based on the OCC model:

Gestures → Emotions

- Putting the toy’s hands in front of its eyes, or moving the toy backwards vigorously → FEAR
- Moving the toy slightly backwards (squeezing it slightly) → DISGUST
- Swinging the toy (making it dance) and/or playing with its arms → HAPPY
- Bending down its neck or bending down the toy’s whole trunk → SAD
- Placing its arms crosswise or shaking the toy vigorously → ANGER
- Opening its arms backwards, inclining its trunk slightly backwards too → SURPRISE

Even if it were possible to collect all the above data, this would still not suffice to identify emotions exactly. Emotions are not person-independent but person-dependent; the machine would have to gather data from many individuals and, through a process of learning and pattern recognition, cluster these individuals into groups.

After the perception and identification of emotions comes the task of interpreting these emotions. Why is someone angry in a given context, why joyful? This is clearly dependent on the individual history of a person as well as the context of the situation. Both must be taken into account to generate an appropriate response to an emotion.

b) The simulation of emotions

The simulation of emotions has two aspects:

- the simulated emotions in the machine and
- the perception of these emotions by a human observer

This is a crucial distinction because humans tend to attribute qualities to people and objects which are not “really there”. An example is the Braitenberg “vehicles”, little mobile robots with a few simple embedded instructions. Observers labelled their behaviour as “aggressive” or “frustrated” although none of these “emotions” had been programmed into them.

Different approaches to simulating emotions have been proposed; I will briefly outline some of them. Most of these models are based on the emotion theory of Ortony, Clore and Collins (the OCC model).

OCC define emotions as valenced reactions to events, agents or objects. These events, agents or objects are appraised according to an individual’s goals, standards and attitudes. The OCC model can easily be implemented in a rule-based system but has difficulties dealing with compound emotions, which are not clearly defined.
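
As an illustration of the rule-based flavour of the model, here is a minimal Python sketch of an OCC-style appraisal. The predicates and the choice of emotion labels are simplifications of my own, not a reproduction of any of the systems discussed below.

# Minimal rule-based OCC-style appraisal sketch. The stimulus is classified as
# an event, an agent's action, or an object, and matched against the
# appraiser's goals, standards or attitudes. Names are illustrative only.
def appraise(kind, desirable=None, praiseworthy=None, appealing=None, prospective=False):
    if kind == "event":                          # appraised against goals
        if prospective:
            return "hope" if desirable else "fear"
        return "joy" if desirable else "distress"
    if kind == "action":                         # appraised against norms/standards
        return "admiration" if praiseworthy else "reproach"
    if kind == "object":                         # appraised against tastes/attitudes
        return "love" if appealing else "hate"
    return "no emotion"

print(appraise("event", desirable=True))                     # -> joy
print(appraise("event", desirable=False, prospective=True))  # -> fear
print(appraise("action", praiseworthy=False))                # -> reproach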

[Diagram: the OCC appraisal structure. An event, agent, or object of appraisal is appraised in terms of goals (events), norms/standards (agents’ actions), or tastes/attitudes (objects), yielding goal-based emotions (joy, distress, hope, fear, relief, disappointment, etc.), compound emotions (anger, gratitude, gratification, remorse, etc.), standards-based emotions (pride, shame, admiration, reproach, etc.), and attitude-based emotions (love, hate, etc.).]

Ortony (2002) has simplified the OCC model by collapsing some of the original categories down to five distinct positive and five negative specializations of two basic types of affective reactions – positive and negative ones. He believes that these categories have enough generative capacity to endow any affective agent with the potential for a rich and varied emotional life. Ortony, too, points out the importance of personality, which determines the consistency of emotional reactions over time and makes an agent believable.

POSITIVE REACTIONS

… because something good happened (joy, happiness, etc.)
… about the possibility of something good happening (hope)
… because a feared bad thing didn’t happen (relief)
… about a self-initiated praiseworthy act (pride, gratification)
… about an other-initiated praiseworthy act (gratitude, admiration)
… because one finds someone/thing appealing or attractive (love, like, etc.)

NEGATIVE REACTIONS

… because something bad happened (distress, sadness, etc.)
… about the possibility of something bad happening (fear, etc.)
… because a hoped-for good thing didn’t happen (disappointment)
… about a self-initiated blameworthy act (remorse, self-anger, shame, etc.)
… about an other-initiated blameworthy act (anger, reproach, etc.)
… because one finds someone/thing unappealing or unattractive (hate, dislike, etc.)

Table 1. Five specializations of generalized good and bad feelings (collapsed from Ortony et al., 1988). The first entry in each group of six is the undifferentiated (positive or negative) reaction. The remaining five entries are specializations (the first pair goal-based, the second standards-based, and the last taste-based).

Ortony further points out the influence of personality on emotions: personality provides a character with consistency and coherence, which can be observed in the character’s reactions. Ortony proposes the following emotional response taxonomy:

Emotion response-tendencies:

- Expressive
  - Somatic (flushing)
  - Behavioral (fist-clenching)
  - Communicative
    - Verbal (swearing)
    - Non-verbal (scowling)
- Information-processing
  - Attentional (obsessing)
  - Evaluative (despising)
- Coping
  - Emotion-oriented
    - Self-regulating (calming down)
    - Other-modulating (distressing antagonist)
  - Problem-oriented (preventing recurrence)

Bates and Reilly of the Oz Group at Carnegie Mellon have created a virtual environment called TOK which contains the emotion module Em. Em contains 22 emotion types which are ordered hierarchically. They are based on the OCC model but have been simplified a bit in order to produce faster emotional reactions.

Since Oz is intentionally constructed as an artificial world which the user is meant to regard like a film or a play, it is sufficient to build the various abilities of the system “flat” in order to satisfy the user’s expectations: as in the cinema, the user does not expect a correct picture of reality, but an artificial world with participants that are convincing in this context.

Based on their work with Oz (which is no longer continued), Bates and Reilly have formed a company called “Zoesis” which produces lifelike computer characters based on their earlier research.

Clark Elliott has developed the Affective Reasoner, which is based on OCC as well. It contains a database of 24 emotion types, a database of goals, standards and preferences (GSPs), a database of assumed GSPs for others (called COO – Concerns-of-Others – by Elliott), and a database of reaction patterns. An agent continuously compares the patterns in the GSP and COO databases with so-called emotion eliciting conditions (EECs). If a correspondence is found, an EEC relation is created which then generates the appropriate emotion. In a further step, the emotion initiates an action. Elliott’s agents are capable of learning and of modifying the intensity of their emotions.
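
A rough Python sketch of the matching step just described may help; it is not Elliott’s code, and the record fields and the single goal rule are assumptions made for illustration.

from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of an Affective Reasoner-style matching step: a concern
# from the agent's GSP database is compared against an emotion-eliciting
# condition (EEC) extracted from a situation. Field names are assumptions.
@dataclass
class Concern:
    kind: str       # "goal", "standard", or "preference"
    content: str

@dataclass
class EEC:
    affected_concern: str
    outcome: str    # "furthered" or "thwarted"

def match(concern: Concern, eec: EEC) -> Optional[str]:
    """Return an emotion label if the EEC touches this concern, else None."""
    if concern.content != eec.affected_concern:
        return None
    if concern.kind == "goal":
        return "joy" if eec.outcome == "furthered" else "distress"
    if concern.kind == "standard":
        return "pride" if eec.outcome == "furthered" else "shame"
    return "liking" if eec.outcome == "furthered" else "disliking"

print(match(Concern("goal", "win the game"), EEC("win the game", "thwarted")))  # -> distress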

Once an emotion arises, agents have approximately 440 channels through which they can express these emotions; each emotion has approximately 20 of the channels. For example, anger might be expressed somatically as “turning red” at one end of the spectrum (not intentional), or by “invoking a plan to get even” at the other end of the spectrum (highly intentional). In between we might find “verbal expression” (e.g., say something), “emotion repression” (deny that anything is wrong), and so forth. Channels are activated according to the temperament of the agent.
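
The following toy sketch illustrates the idea of temperament-dependent channel selection; the channel names and the single numeric “deliberateness” parameter are inventions for the example, not part of the Affective Reasoner.

# Invented example: each emotion has candidate expression channels ordered from
# unintentional/somatic to highly intentional; a temperament parameter biases
# which end of that spectrum is chosen.
CHANNELS = {
    "anger": ["turn red", "scowl", "say something sharp", "plan to get even"],
}

def express(emotion, deliberateness):
    """deliberateness in [0, 1]: 0 favours somatic, 1 favours intentional channels."""
    options = CHANNELS[emotion]
    index = min(int(deliberateness * len(options)), len(options) - 1)
    return options[index]

print(express("anger", 0.1))   # -> turn red
print(express("anger", 0.9))   # -> plan to get even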

The agents are responsive in real time (e.g., when you talk with them, they respond immediately; if you don’t say anything, they might get impatient). Agents have approximately 70 different facial expressions at their disposal. Their mouths move, and they control the speed with which they morph from one expression to another (also the size and colour of the face). Agents speak using text-to-speech technology; they can speak any sentence generated, in real time. Agents have a limited ability to listen to spoken English, using speech recognition software. Agents have access to a theoretically almost unlimited amount of music to use to express emotions. They can retrieve and play any music selection in less than a second, and the music can be indexed down to 1/1000th of a second. Agents select their own music to express their emotions, from pre-coded categories. Of special note is the “minimalist” approach: the agents can run on a standard multimedia PC and are efficient enough to be run over the Web through a standard modem (Web versions are only in very early stages of development).

Agents can reason about the emotions that others are having about situations that arise; agents have relationships; agents have moods; agents keep models of how others see the world; etc.

One of the lines of research in the AR project which has yet to be developed to any depth, but which is promising enough to mention, is the idea of affective user modeling. In this paradigm the hard problems of general user modeling are left alone, with the focus being placed not on what the user knows, but on how they feel. Since AR agents, and other emotionally intelligent systems, necessarily keep at least implicit internal models of how others see the world (for how else can one, e.g., feel sorry for someone if not by knowing that they are sad about something?), it is not a big step to then model a user’s general emotion state. In the AR agents this internal model is explicit, and it is only a minor theoretical leap to use this as a basis for tutoring, and other, goals. Bolstering this approach is something that has been observed ad hoc in the relationship between users and AR agents, but which is also commonsensical: people are socially motivated to express their emotion states (e.g., I am frustrated, I am angry, I admire the way you...) even to a computer agent, as long as the agent has some way to respond appropriately.

Based on the ideas of Bates and Elliott, Barbara Hayes-Roth and her team have developed the Virtual Theatre. The Virtual Theatre project aims to provide a multimedia environment in which users can play all of the creative roles associated with producing and performing plays and stories in an improvisational theater company. These roles include: producer, playwright, casting director, set designer, music director, real-time director, and actor. Intelligent agents fill the roles not assumed by the user.

In particular, in a typical production, animated actors perform the play in a multimedia set, all under the supervision of an automated stage or story manager. Actors not only follow scripts and take interactive direction from the users; they bring “life-like” qualities to their performances, for example variability and idiosyncrasies in their behaviour and affective expressiveness. They also improvise, thus collaborating on the creative process. Each time the actors perform a given script or follow a given direction, they may improvise differently. Thus, users enjoy the combined pleasures of seeing their own works performed and being surprised by the improvisational performances of their actors.

Current research focuses on building individual characters that can take direction from the user or the environment, and act according to these directions in ways that are consistent with their unique emotions, moods, and personalities.

The personality profiling model allows for the specification of traits, such as self-confidence, activity, and friendliness, which can be varied along a numeric continuum. These, in part, determine how an agent reacts to situations in the virtual environment. The traits, in turn, can depend upon values of agent states, such as happiness-sadness (self-oriented affective states), gratitude-anger (other-oriented affective states, e.g., grateful to someone), and liking and hatred (attraction-oriented affective states). Characteristics like these are used to create the dispositional, and dynamically variable, personalities of agents used in the interactive environment.

Using such controls over agent behaviour, one is able to define personalities that reflect the intended high-level characteristics of labeled lay-personality types. For example, one might create agents with general types of nasty, friendly, shy, lazy, choleric, and selective (friendly with some, nasty with others).
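
As a hedged sketch of what trait-modulated reactions could look like in code, the following uses the trait names mentioned above; the numeric weighting and the thresholds are assumptions of mine, not part of the Virtual Theatre implementation.

from dataclasses import dataclass

# Traits varied along a numeric continuum, as in the personality profiling model
# described above; the way they scale a reaction is an invented example.
@dataclass
class Personality:
    self_confidence: float   # 0..1
    activity: float          # 0..1
    friendliness: float      # 0..1

def react_to_greeting(p):
    """Pick a response whose warmth scales with friendliness and activity."""
    warmth = 0.7 * p.friendliness + 0.3 * p.activity
    if warmth > 0.66:
        return "waves enthusiastically and walks over"
    if warmth > 0.33:
        return "nods and smiles"
    return "ignores the greeting"

shy_agent = Personality(self_confidence=0.2, activity=0.3, friendliness=0.4)
nasty_agent = Personality(self_confidence=0.8, activity=0.6, friendliness=0.1)
print(react_to_greeting(shy_agent))    # -> nods and smiles
print(react_to_greeting(nasty_agent))  # -> ignores the greeting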

Van Kesteren et al. of Twente University have developed an architecture called SHAME (Scalable, Hybrid Architecture for the Mimicry of Emotions) which contains a so-called “Emotional State Calculator” (ESC). First, an Event Appraiser appraises the emotional meaning of an event by constructing an Emotion Impulse Vector (EIV). The inputs of the Event Appraiser are the variables defined in the OCC model. After passing through a Normalizer, the EIV is analysed by the ESC. The first part of the ESC is a recurrent neural network and should be seen as an implicit emotional state. The second part of the ESC is a feed-forward network which has the implicit emotional state as input and turns it into an explicit emotional state.
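
A hedged numerical sketch of this pipeline is given below; all dimensions, weights and the normalization step are invented for illustration and do not reproduce the actual SHAME networks.

import numpy as np

# Sketch of the described pipeline: a normalized emotion impulse vector (EIV)
# enters a recurrent stage (implicit emotional state), and a feed-forward stage
# maps that state to an explicit emotional state. All details are invented.
rng = np.random.default_rng(0)
W_rec = rng.normal(scale=0.3, size=(4, 4))   # recurrent weights (implicit state)
W_in = rng.normal(scale=0.3, size=(4, 4))    # EIV -> implicit state
W_out = rng.normal(scale=0.3, size=(3, 4))   # implicit -> explicit state
implicit_state = np.zeros(4)

def normalize(eiv):
    norm = np.linalg.norm(eiv)
    return eiv / norm if norm > 0 else eiv

def step(eiv):
    """One update of the implicit state, then the explicit emotional state."""
    global implicit_state
    implicit_state = np.tanh(W_rec @ implicit_state + W_in @ normalize(eiv))
    return np.tanh(W_out @ implicit_state)

print(step(np.array([1.0, 0.0, -0.5, 0.2])))  # explicit emotional state vector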

Bazzan and Bordini have developed agents with emotions which participate in an iterated prisoner’s dilemma. They built a rule-based system based on the OCC model with the aim of creating a framework in which users can easily define emotions for their agents.

Egges et al. propose to use the OCC model only for the appraisal of emotions and to feed this information into a personality model they are developing. The personality model will serve as a selection criterion that indicates what and how many goals, structures and attitudes fit with the personality. They use the OCEAN model of personality (Openness, Conscientiousness, Extraversion, Agreeableness, Neuroticism). A visual front-end produces output by generating speech and facial expressions.

Prendinger and Ishizuka have developed an architecture called SCREAM (SCRipting Emotion-based Agent Minds), based partly on the OCC model. It contains an appraisal module, an emotion resolution module and an emotion maintenance module.

Their model takes into account the social context in which emotions are expressed or suppressed.

They propose to enrich a believable agent with “narrative intelligence” because humans tend to frame the behaviour of other agents into narrative. They refer expressly to the Oz project and the work of Hayes-Roth. As Sengers has pointed out,

“artificial agents can be designed to produce narratively comprehensible behavior by structuring their visible activity in ways that make it easy for humans to create narrative explanations of them.”

Sengers’ characterization is derived from narrative psychology, which claims that people make sense of the behavior of other humans by structuring their visible activity into narrative (Bruner [8]). More specifically, people frame the activity of other agents into story in that they try to interpret other agents’ actions as intentional behavior, e.g., by attributing desires and attitudes to them. The conclusion drawn by Sengers is that animated character designers should provide characters with visible cues to support people in their attempt to generate a narrative explanation of the character’s actions, and hence improve their understanding of the character’s intentions.

Mourao and Paiva, the creators of EToy, have developed for their system an Affective User Model Component (AUMC), which, too, is based on the OCC taxonomy. It provides the ability to store and make inferences about the user’s affective states.

The inference motor is responsible for the kernel process in the affective user model component. This process receives information about the user’s actual perceptions, his goals, his knowledge and the physical emotions/actions corresponding to the emotions or actions directly inferred from the physical interface component. The inference motor then infers the new current emotional state of the user, given this information. The module is composed of an appraisal structure that follows a rule-based approach: a set of if-then rules that activate an emotion (the consequent) when a given situation (the antecedent) is verified. The appraisal can contain, for instance, the rule: if (agent is pleased about an event) then (agent feels happy). In this case, the appraisal must also contain the rules that determine when the agent will be pleased with a given situation, for instance: he sees a blue object, he likes blue, and one of his goals is to pick up the objects he likes.
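
A hedged sketch of that rule chaining, using the “blue object” example from the text (the fact encoding and predicate names are assumptions of mine):

# Working-memory facts about the current situation; the appraisal rules chain
# from them to an emotion, mirroring the "blue object" example above.
facts = {"sees_object": "blue", "likes_colour": "blue",
         "goal": "pick_up_liked_objects"}

def pleased_about_event(f):
    # The agent is pleased if it sees an object in a colour it likes and has
    # the goal of picking up liked objects.
    return (f.get("sees_object") == f.get("likes_colour")
            and f.get("goal") == "pick_up_liked_objects")

def infer_emotion(f):
    # Rule: if (agent is pleased about an event) then (agent feels happy).
    return "happy" if pleased_about_event(f) else "neutral"

print(infer_emotion(facts))  # -> happy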

The information produced by the inference module is used by the updating process to update the user state model. The newly inferred emotion flows directly to the actuators.

The application the user is running feeds the system with the aspects that must be considered for the assessment of the user’s emotional state. These aspects constitute the synthetic agent’s perceptions of the application’s virtual world. The perceptions the AUMC has of the virtual world come from what are called virtual sensors.

Some information passed to the affective model from these virtual sensors may not be relevant for the inference process at a particular instant. The filtering process is therefore responsible for determining, according to the current situation (namely goals and knowledge), which information about the virtual world is relevant for the inference process. Such a filtering mechanism is necessary to increase the system’s performance; it is important that the inference process is not overloaded with information that is unimportant for the task at hand.

c) The generation of emotions

Generating emotions differs fundamentally from simulating emotions insofar as, in this case, the goal is to give a machine the opportunity to develop its own emotions. Emotions, in this approach, are defined as emergent properties of a system, not as a set of pre-defined rules.

The task is, thus, to design a system that is able to generate its own emotions. One of the main proponents of this design-based approach is Aaron Sloman of Birmingham University. He views emotions (or affect) as control states of an intelligent system. Sloman cites a number of factors on which a wide range of affective states depends:

- whether they are directed (e.g. craving an apple) or non-specific (e.g. general unease or depression)
- whether they are long-lasting or short-lived
- how fast they grow or wane in intensity
- what sorts of belief-like, desire-like and other states they include
- which parts of an architecture trigger them
- which parts of the architecture they can modulate
- whether their operation is detected by processes that monitor them
- whether they in turn can be or are suppressed
- whether they can become dormant and then be re-awakened later
- what sorts of external behaviours they produce
- how they affect internal behaviours, e.g. remembering, deciding, dithering, etc.
- whether they produce second-order affective states (e.g. being ashamed of being angry)
- what sorts of conceptual resources they require

Sloman’s architecture consists of three layers: reactive processes, deliberative processes and meta-management processes. The current state of the architecture is called H-CogAff and is the most complex developed by his team so far. It contains numerous feedbacks and interactions between the components of the system.
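
The following is a very schematic Python sketch of the three-layer idea, not the actual H-CogAff implementation; the percept vocabulary and the alarm rule are invented for the example.

# Schematic three-layer control loop in the spirit of Sloman's architecture:
# reactive rules respond immediately, the deliberative layer plans, and the
# meta-management layer monitors and can redirect attention. The alarm rule
# and percept format are invented for illustration.
class Agent:
    def __init__(self):
        self.current_plan = []

    def reactive(self, percept):
        if percept == "obstacle":
            return "swerve"            # fast, hard-wired response
        return None

    def deliberative(self, percept):
        if not self.current_plan:
            self.current_plan = ["plan route", "move to goal"]
        return self.current_plan.pop(0)

    def meta_management(self, percept):
        # Alarm-like interrupt: drop the current plan when something urgent appears.
        if percept == "fire":
            self.current_plan = []
            return "abandon plan, attend to emergency"
        return None

    def step(self, percept):
        for layer in (self.meta_management, self.reactive, self.deliberative):
            action = layer(percept)
            if action:
                return action

agent = Agent()
print(agent.step("nothing"))   # -> plan route
print(agent.step("fire"))      # -> abandon plan, attend to emergency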

Wright, a student of Sloman’s, has developed a model he calls the “computational libidinal economy”. It is based on Sloman’s MINDER and adds several layers of complexity. It is interesting as a theoretical approach, but its complexity makes it very difficult to implement. His main achievement is the introduction of an internal currency by which rewards and punishments are distributed among the elements of the system.
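
To make the notion of an internal currency concrete, here is a deliberately simplified sketch; the module names and the payout rule are assumptions of mine and do not reproduce Wright’s design.

# Deliberately simplified internal-currency sketch: modules hold credit, and a
# global reward is distributed among the modules that contributed to the
# outcome; modules with more credit get more influence later. Details invented.
credit = {"perception": 1.0, "planner": 1.0, "mover": 1.0}

def distribute_reward(contributors, reward):
    """Split a reward equally among the modules that contributed to success."""
    share = reward / len(contributors)
    for name in contributors:
        credit[name] += share

def influence(name):
    """A module's influence is its share of the total credit."""
    return credit[name] / sum(credit.values())

distribute_reward(["planner", "mover"], reward=2.0)
print({name: round(influence(name), 2) for name in credit})
# planner and mover now outweigh perception when competing for resources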

Marvin Minsky, in his forthcoming book “The Emotion Machine”, presents a six-layered architecture of mind.

His model is similar to that of Sloman; furthermore, he adopts Wright’s internal currency. However, at the moment Minsky’s model is purely theoretical and offers no easy implementation in a machine.

I would finally like to mention Steve Grand’s “Creatures”, a commercial Artificial Life product whose inhabitants exhibit a rich emotional life. The important point is that none of that has been programmed into them. They have some basic drives and a “chemistry” of hormones; furthermore, all this information is stored in their “genes”, which can recombine over the generations.
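
As a loose illustration of behaviour emerging from drives and a hormone-like chemistry (the chemical name, rates and threshold below are invented, not taken from Creatures):

# Toy sketch of a drive/hormone loop loosely inspired by the description above;
# behaviour follows from the chemistry rather than from an explicit rule table.
hormones = {"hunger_hormone": 0.0}

def tick(ate_food):
    """One time step: the hunger hormone builds up and decays when food is eaten."""
    if ate_food:
        hormones["hunger_hormone"] = max(0.0, hormones["hunger_hormone"] - 0.5)
    else:
        hormones["hunger_hormone"] += 0.1
    return "seek food" if hormones["hunger_hormone"] > 0.3 else "explore"

for step in range(5):
    print(step, tick(ate_food=False))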

d) The expression of emotions

As pointed out above, emotional reactions do not have to be actively implemented in a machine in order to make human observers infer emotions.

Machines (like humans) can express their emotions through:

- Movement
- Speech
- Colours
- Sounds
- Written text
- Music
- Facial expression
- etc.

Concluding remarks

As we have seen, there exists a multitude of models which try to simulate or generate emotions in a machine. Most of these models are either very limited in scope or mainly theoretical.

A number of problems still have to be solved to create an emotional machine within an interactive environment. These relate to the perception and classification of emotions as well as to the selection of appropriate responses.

Although a truly emotional machine will not be realized in the near future, within a limited and controlled domain one can implement a system that appears to be emotional to a human observer. The starting point for the development of such a system would be the user’s perception of an action.

Since a human observer does not know anything about the processes within the machine, she will judge it by its actions and expressions. It is totally irrelevant whether an action is caused by simulation or by a “true” machine emotion. It might even be that “true” machine emotions differ so completely from human emotions that they would not fulfil the intended purpose.

Appendix 2

PowerPoint slides of a presentation held at DEAF Rotterdam, November 2004

Affective Systems

Rotterdam, November 11, 2004

What is an “affective system”?

• A fly?
• A dog?
• A piece of software?
• A human?
• An ant?

What is an “affective system”?

• We need a definition of “affect” in order to define affective systems

• “Affect” is often mixed up with other concepts such as emotion, mood, feeling, etc.

Definitions of “affect”

• “The conscious subjective aspect of feeling or emotion”

• “The observable emotional condition of an individual at any given time”

• “Generalized feeling tone (usually considered more persistent than emotion, less so than mood). It is the external, observable manifestation of emotion (e.g., flat, blunted, constricted, expansive, labile, etc.)”

• “Emotion, feeling or mood”

Definitions of “emotion”

• “Any strong feeling”
• “Feelings such as happiness, sadness, anger, elation, irritation, etc. The specific definition of emotion is difficult to qualify as it is a completely subjective experience”

• “A psychological feeling, usually accompanied by a physiological reaction”

• “The feeling one experiences in reaction to a person or situation”

The trouble with definitions

• “Part of the problem is that many of the words we use for describing human mental states and processes (including ‘emotion’, ‘learning’, ‘intelligence’, ‘consciousness’) are far too ill-defined to be useful in scientific theories. Not even professional scientists are close to using agreed definitions of ‘emotion’.” (Sloman)

The trouble with definitions

• The concept of emotion is but one of a large family of intricately related everyday concepts, including many affective concepts (e.g. moods, attitudes, desires, dislikes, preferences, values, standards, ideals, intentions, etc.), the more enduring of which can be thought of as making up the notion of a “personality”.

• Models that purport to account for ‘emotion’ without accounting for others in the family are bound to be shallow. (Sloman)

Psychological emotion theories

• For more than a century now, psychologists have busied themselves with emotions

• But the topic has never been a very prominent one

• Modern psychology has defined itself as a science of testing, measuring and statistics

• Because emotions are so subjective, they have been relegated to the sidelines

Four perspectives on emotion

• Darwinian perspective
• Jamesian perspective
• Cognitive perspective
• Social construction perspective

The Darwinian perspective

• The Darwinian perspective views emotions as evolved phenomena with an important survival function

• Darwinians try to pinpoint universal emotions and their expressions

• Prominent names in this field are William McDougall, Robert Plutchik, Paul Ekman, Carroll Izard, Silvan Tomkins

• Joseph LeDoux also fits into this category

The Jamesian perspective

• The Jamesian perspective is named after William James

• James insisted that it would be impossible to have emotions without bodily changes and that bodily changes always come first

• Antonio Damasio can be classified under this category

The cognitive perspective

• The cognitive perspective assumes that thought and emotion are inseparable

• All emotions are seen as the product of a cognitive appraisal process

• Some well-known researchers are Lazarus, Frijda, Scherer, Roseman, Ortony, Clore and Collins

The social-constructivist perspective

• The Social-constructivist perspective views emotions as cultural products that owe their meaning and coherence to learned social rules

• ”Emotions are not just remnants of our phylogenetic past, nor can they be explained in strictly physiological terms. Rather, they are social constructions, and they can be fully understood only on a social level of analysis" (Averill, 1980)

Psychological emotion theories

• Before everything gets too confusing, let’s have a look at some important theories in comparison...

James-Lange theory

• "My theory ... is that the bodily changes follow directly the perception of the exciting fact, and that our feeling of the same changes as they

occur is the emotion.”

I see a bearI sweat, my heart races I feel afraid

Cannon-Bard theory

• We feel the emotion first, and then the physiological changes, such as muscular tension, sweating, etc.; the bodily changes are not the source of the emotion

I see a bear → I feel afraid → I sweat, my heart races

Then came the cognitivists...

• ... and everything became much more complicated...

Schachter-Singer theory

• Experience of emotion depends on 2 factors:
  – physiological arousal of the autonomic nervous system
  – cognitive appraisal of the physiological arousal

• If that appraisal is non-emotive, then one will not experience an emotion

I see a bear → I sweat, my heart races
  → emotive interpretation → I feel afraid
  → non-emotive interpretation → I don’t feel afraid

Cognitive appraisal theories

[Diagram: external/internal input feeds an appraisal process; the appraisal is modulated by social context, mood, the old emotional state and personality; it produces an emotion, which decays over time and leads to action.]

OCC model

• Ortony, Clore and Collins:

Emotions are valenced reactions to events, agents or objects. These events, agents or objects are appraised according to an individual’s goals, standards and attitudes

OCC model

[Diagram: an event, agent, or object of appraisal is appraised in terms of goals (events), norms/standards (agents’ actions), or tastes/attitudes (objects), yielding goal-based emotions (joy, distress, hope, fear, relief, disappointment, etc.), compound emotions (anger, gratitude, gratification, remorse, etc.), standards-based emotions (pride, shame, admiration, reproach, etc.), and attitude-based emotions (love, hate, etc.).]

As if this weren’t enough...

• ... the emotional response is just as complicated to calculate...

Emotional response taxonomy

Emotion response-tendencies:

• Expressive
  – Somatic (flushing)
  – Behavioral (fist-clenching)
  – Communicative
    – Verbal (swearing)
    – Non-verbal (scowling)
• Information-processing
  – Attentional (obsessing)
  – Evaluative (despising)
• Coping
  – Emotion-oriented
    – Self-regulating (calming down)
    – Other-modulating (distressing antagonist)
  – Problem-oriented (preventing recurrence)

Models based on OCC model

• Bates and Reilly - TOK
• Elliott - Affective Reasoner
• Van Kesteren et al. - SHAME
• Bazzan and Bordini - IPD
• Egges et al. - OCEAN and OCC
• Prendinger and Ishizuka - SCREAM
• Mourao and Paiva - AUMC
• and many, many more...

Other cognitive models

• Scherer
• Frijda
• Pfeifer
• Toda
• Dörner
• Velásquez
• Canamero
• et al.

One of the key questions:

• Do emotions need a body.....

• ... or can a disembodied entity be emotional?

Psychologists and others

• It is interesting that most psychologists don’t concern themselves with this question

• They go on to try to define and classify emotions, e.g. discussing at length if “surprise” is an emotion or not

• So it is mainly left to philosophers and neurologists and engineers to discuss the concept of emotion

Psychologists and others

• In fact, most of the renewed interest in emotions is not due to psychologists, but to neuroscientists and software/hardware engineers trying to build an intelligent system (agents/robots)

Aaron Sloman’s model

• Aaron Sloman is a philosopher at Birmingham University

• For many years now, he has been proposing a radical re-thinking of how we view emotions

• He is convinced that an intelligent system does not need a body to be emotional

Aaron Sloman’s model

• Sloman says: We need to talk about “information-using systems”

• What are information-using systems?
  – They acquire, store, manipulate, transform, derive, apply information.
  – The information must be expressed or encoded somehow, e.g. in simple or complex structures – possibly in virtual machines.
  – These structures may be within the system or in the environment.
  – The information may be more or less explicit, or implicit.

Aaron Sloman’s model

• “A feature of ordinary language that can confuse discussions of information-processing is that we normally think of information as something that is true or false: e.g. information about when the train will arrive

• Much information is control information which instead of being a potential answer to a question about what is the case is a potential answer to a question about what to do (or not do)”

Aaron Sloman’s model

• Having motives, having preferences, having values, having attitudes, all involve control information – but there’s no reason to regard them all as ‘emotions’.

Aaron Sloman’s model

• Sloman proposes to take a “design-oriented” stance, which means to construct an intelligent system with all the components it needs to survive

• Some of these components are what he calls “control structures”

• These control structures serve to interrupt an ongoing task and to concentrate the system’s attention on urgent business

• This is something one might call “emotion”

Aaron Sloman’s model

[Diagram: perception and action columns flank central processing, which is layered into reactive mechanisms (oldest), deliberative reasoning (“what if” mechanisms, older) and meta-management (reflective processes, newest).]

Aaron Sloman’s model

[Slide consists of an architecture diagram]

Aaron Sloman’s model

• Many different kinds of emotional states can be based on such an alarm system, depending on what else is in the architecture

• Don’t confuse the alarms (and emotions they produce) with the evaluations that trigger them, or the motives, preferences, policies, values, attitudes that have different sorts of functional roles – different sorts of control functions

Where does that leave us?

• We have a lot of theories about what emotions are but not one universally agreed upon definition

• We have a number of models claiming to equip an intelligent system with emotions

• We have two basically opposite positions about the need to have a body to feel emotions

Where does that leave us?

• Over the last 15 years, we have not seen real progress regarding the definition, the function and the modeling of emotions

• We still have a long way to go to reach common theoretical ground

• And the way to a working model is even longer

Where does that leave us?

• So, we are still left with our first question:

What is an affective system?

• Maybe we need to think a bit more about it.
