
Processing emotional facial expressions: The role of anxiety and awareness

ELAINE FOX
University of Essex, Colchester, England

Cognitive, Affective, & Behavioral Neuroscience, 2002, 2(1), 52-63
Copyright 2002 Psychonomic Society, Inc.

In this paper, the role of self-reported anxiety and degree of conscious awareness as determinants of the selective processing of affective facial expressions is investigated. In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field. This pattern was especially strong when the participants were unaware of the presence of the facial stimuli. In Experiment 3, a patient with right-hemisphere brain damage and visual extinction was presented with photographs of faces and fruits on unilateral and bilateral trials. On bilateral trials, it was found that faces produced less extinction than did fruits. Moreover, faces portraying a fearful or a happy expression tended to produce less extinction than did neutral expressions. This suggests that emotional facial expressions may be less dependent on attention to achieve awareness. The implications of these results for understanding the relations between attention, emotion, and anxiety are discussed.

A primary function of attentional mechanisms is to direct cognitive resources toward parts or attributes of the world that are of particular relevance to the organism. These selective attentional processes are likely to be particularly sensitive to biologically significant stimuli. Emotional facial expressions are one set of stimuli that might be expected to have a high degree of behavioral relevance. The face carries important information about many social and biological attributes, such as identity, gender, age, and emotional state. Given that the expression of fear or threat in another individual may signal potential danger in the environment, the rapid detection and facilitated processing of such facial expressions would be expected. Three sets of evidence suggest that facial expressions are indeed a special category of stimuli in terms of attracting processing resources. First, with a visual search paradigm, it has been shown that angry facial expressions are easier to detect than happy or neutral facial expressions. For instance, Hansen and Hansen (1988) presented their participants with a series of photographs containing a number of faces. On half of the trials, there was a single discrepant face. They found that people were fastest to detect discrepant angry faces, relative to discrepant happy faces. Moreover, the time taken to detect an angry face was unaffected by the number of other faces in the display, whereas detection of a happy face was strongly influenced by the size of the display. They argued that processing of angry facial expressions occurs automatically and that a consequence of this automatic processing of threat is the allocation of attentional resources to a preattentively defined location. In contrast, search for happy faces was serial and was considered to require attention. Unfortunately, there were a number of problems in interpreting this early study, in that the results may have been due to slower search through angry distractors, since a discrepant angry face was always embedded in a happy crowd and vice versa. In addition, there were further problems with the nature of the stimuli used (see Fox et al., 2000, and Nothdurft, 1993, for discussions). Nevertheless, these methodological problems have been addressed in more recent research, and it does seem that processing of angry facial expressions is indeed more efficient than that of happy expressions, although the evidence suggests that the search may not be as fully automatic as was originally claimed (Eastwood, Smilek, & Merikle, 2001; Fox et al., 2000; Ohman, Lundqvist, & Esteves, 2001). This research indicates that the mechanisms underlying visual search are particularly sensitive to angry facial expressions.

A second area of research that supports the notion that facial expressions may be processed at an automatic level comes from studies of patients with unilateral neglect. Unilateral neglect is a relatively common disorder following lesions of the right parietal lobe (Vallar, 1993). It is characterized by the failure to notice contralesional stimuli even in the absence of hemianopia. Extinction is a similar impairment and is defined as the failure to detect a contralesional stimulus, but only when another stimulus is simultaneously presented on the ipsilesional side (Rafal, 1996). Unilateral neglect and extinction are widely thought to represent disorders of attentional mechanisms, because the patient seems capable of detecting and responding to stimuli in the neglected field and yet does not. Of particular relevance for the present purposes is a recent study demonstrating that the emotional expressions of schematic faces can influence the amount of extinction observed (Vuilleumier & Schwartz, 2001).

Correspondence concerning this article should be addressed to E. Fox, Department of Psychology, University of Essex, Wivenhoe Park, Colchester CO4 3SQ, England (e-mail: [email protected]).


This study found that faces with an angry or a happy expression resulted in less extinction than did faces with a neutral expression. This provides converging evidence that emotional expressions have a special status for the attentional system. It is interesting, however, that the influence on extinction was not specific to angry expressions but was found equally for angry and happy expressions. In contrast, the visual search experiments discussed above suggest that processing is especially facilitated for angry expressions.

Neurobiological research also supports the notion that emotional facial expressions, especially of negative emotions, may have a special status in attracting processing resources (see Dolan, 2000, and LeDoux, 1996, for reviews). For example, in conditioning experiments, larger galvanic skin responses and greater resistance to extinction are associated with aversively conditioned angry facial expressions, relative to happy or neutral facial expressions (Esteves, Parra, Dimberg, & Ohman, 1994). Likewise, in a positron emission tomography study, it has been shown that the human amygdala is involved in processing the emotional salience of faces, with an apparent specificity of response to fearful facial expressions (Morris et al., 1996). More recent research has shown that even when aversively conditioned angry facial expressions are backward masked, so that observers are unaware of their presentation, the right amygdala is more responsive to these unseen stimuli than to unconditioned angry faces. These studies suggest that the right amygdala can discriminate between stimuli on the basis of their acquired behavioral significance (Morris, Ohman, & Dolan, 1998, 1999). In addition, brain imaging has shown that the amygdala is particularly responsive to masked fearful facial expressions even when the expressions are not conditioned in the experiment (Whalen et al., 1998).

Thus, there has been a general finding from brain-imaging research that emotional facial expressions have a special status in capturing visual attentional processes. However, although the foregoing research has considered the emotionality of the stimuli to be processed in some detail, little attention has been paid to the emotionality of the participant in these studies. In relation to this, a burgeoning behavioral literature has found that selective processing of emotionally relevant stimuli might be modulated by the mood state of the person. For example, several studies have found that people who report high levels of trait anxiety are faster to detect probes occurring in a location previously occupied by an angry facial expression, relative to a neutral or happy facial expression (e.g., Bradley, Mogg, Falla, & Hamilton, 1998; Mogg & Bradley, 1999). Moreover, it has also been found that high state-anxious people take longer to disengage their visual attention from the location of angry facial expressions, whereas this pattern does not occur for happy facial expressions (Fox, Russo, Bowles, & Dutton, 2001). Thus, it seems that the level of an individual's self-reported anxiety may be an important determinant of selective attention to behaviorally salient stimuli, such as facial expressions. It should be noted, however, that two studies have attempted to assess whether detection of angry expressions is enhanced in high trait-anxious (HA) individuals, and both found little evidence for this hypothesis. Byrne and Eysenck (1995) did find that high-anxious people detected an angry face in a neutral crowd more quickly, when compared with low trait-anxious (LA) individuals. However, the HA group was no faster to detect an angry face in a neutral crowd, relative to a happy face in a neutral crowd. Similarly, in a more recent study, it was found across several experiments that the enhanced detection of angry schematic faces did not differentiate between HA and LA people (Fox et al., 2000). Thus, although the evidence suggests an anxiety-related bias to allocate attentional resources toward the location of angry faces in dot-probe paradigms (e.g., Bradley et al., 1998; Mogg & Bradley, 1999), anxiety does not appear to modulate the detection of angry faces in visual search tasks.

One study using the dot-probe paradigm suggested a further specificity of the anxiety-related attentional bias toward angry facial expressions (Mogg & Bradley, 1999). This study found that HA people were faster to respond to probes occurring in the location of angry faces, but this bias was observed only when the angry faces appeared in the left visual field (LVF) and when the faces were backward masked (Mogg & Bradley, 1999). This raises the intriguing possibility that there is a hemispheric bias for processing angry facial expressions, which is primarily apparent when awareness of the faces is restricted. This is consistent with the finding that neglect patients can process affective facial expressions even with severe problems in allocating attention to their LVF. It seems that affective facial expressions (angry and happy) may be processed automatically (Vuilleumier & Schwartz, 2001) and that attention then gets allocated toward the location of the angry expressions (Mogg & Bradley, 1999). The latter study further suggests that the allocation of attention to threat-related stimuli (angry faces) may be modulated by the anxiety level of the observer, as well as by the degree of conscious awareness of the stimuli.

Several studies have attempted to identify the neural mechanisms that might underlie anxious mood. For example, it has been proposed that anxious arousal is associated with greater activity (e.g., increased blood flow) in right-hemisphere than in left-hemisphere regions, whereas anxious apprehension (worry) is associated with greater left-hemisphere activity (Heller, Nitschke, Etienne, & Miller, 1997). In support of this, it has been found that social phobics show large increases in right-sided prefrontal and parietal activation while anticipating the presentation of a public speech, whereas this pattern was not apparent in low anxious individuals (Davidson, Marshall, Tomarken, & Henriques, 2000). Thus, although the evidence is not consistent across studies, there is a strong indication that the right hemisphere might be particularly sensitive to perceiving the emotionality of a stimulus and that this bias might be somewhat stronger in highly anxious people.

The aim of the present study was to further investigate the role of trait anxiety and the role of conscious awareness as determinants of the selective processing of affective facial expressions. In a previous study, there was an indication that angry faces might attract attention to their location in HA people only when awareness of the faces was restricted (Mogg & Bradley, 1999). The present study attempts to further assess this issue with fearful facial expressions. The first two experiments evaluate whether a selective attentional bias for fearful and/or happy faces is particularly salient when stimuli are presented in the LVF, as opposed to the right visual field (RVF), for HA and LA participants. Experiment 2 evaluates whether an anxiety-related attentional bias for fearful and happy faces occurs when the stimuli are backward masked. Finally, Experiment 3 presents fearful, happy, and neutral facial stimuli to the neglected field of a patient with unilateral neglect. Fearful expressions might be expected to be particularly salient stimuli, since they are powerful indicators of potential danger in the environment. To our knowledge, the present study is the first that reports results for which fearful facial expressions were used with the dot-probe task and with a neglect patient.

EXPERIMENT 1

The aim of Experiment 1 was to examine the influence of trait anxiety on the propensity to selectively attend to emotional facial expressions. A pictorial version of the dot-probe task was used in order to assess whether fearful faces attract visual attention to a greater extent than do happy faces. On each trial of the dot-probe task, a pair of faces was presented for 500 msec, immediately followed by a pair of dots (either vertically or horizontally aligned) in the location previously occupied by one of the presented faces. Using the same task, previous research has shown that angry faces appear to capture spatial attention, in that probes appearing in the location of angry faces are detected faster than probes appearing in the location of neutral or happy facial expressions. Moreover, this selective processing of angry faces was found only for those reporting high levels of trait anxiety (Bradley et al., 1998; Bradley, Mogg, & Millar, 2000).

In the present experiment, fearful and happy faces were paired with neutral faces in order to evaluate whether the facial portrayal of fear would capture spatial attention in HA people in a similar way to that of angry facial expressions. In this experiment, three key questions are addressed. First, do fearful facial expressions capture spatial attention to a greater extent than happy facial expressions do? Second, is this bias to attend to fearful expressions particularly salient when the stimuli are presented in the LVF? Third, is this bias particularly apparent in HA individuals?

Method
Participants. The participants were 32 undergraduate students (20 female, 12 male), who were selected from over 120 volunteers who had completed the trait scale of the Spielberger State-Trait Anxiety Inventory (STAI; Spielberger, Gorsuch, Lushene, Vagg, & Jacobs, 1983) early in the academic year. The recruitment procedure favored those with high and low scores on this scale, in order to ensure a wide range of emotionality scores within the sample.

Materials and Apparatus. The face stimuli consisted of pairs of photographs of faces. Each face pair consisted of two pictures of different individuals, with one person portraying an emotional expression (fear or happiness) and the other a neutral expression. Half of the emotional faces were fearful, and half were happy. Half of all the faces were male, and half were female. The 24 faces used were selected from the Pictures of Facial Affect developed by Ekman and Friesen (1975). Eight of the faces portrayed fearful expressions; half of these were male (Ekman codes PE3-21, JJ5-13, EM5-21, and WF3-16), and half were female (Ekman codes PF2-30, MO1-23, C1-21, and MF1-26). Eight of the faces portrayed happy expressions; once again, half of these were male (Ekman codes EM4-07, GS1-08, JB1-09, and WF2-12), and half were female (Ekman codes A1-06, JM1-04, MO1-04, and MF1-06). Eight of the faces portrayed a neutral emotional expression; half of these were male (Ekman codes WF2-05, JB1-03, JJ3-04, and PE2-04), and half were female (Ekman codes A1-02, C2-03, MO1-05, and SW3-03). In the ratings reported by Ekman, each of the emotional faces was rated as portraying the "correct" emotional category (fearful or happy) by over 87% of respondents. Ratings were not available for most of the neutral faces. We conducted an additional rating of all 24 faces with 20 independent raters in the same way as Ekman and Friesen. This confirmed that each of the faces was correctly categorized (fearful, happy, and neutral) by at least 16 out of 20 of the raters (i.e., 80%). Thus, we were confident that the facial expressions were perceived as portraying fearful, happy, or neutral emotional expressions.

All of the photographs were digitized and presented side by side at the center of a computer screen. The display size of each photograph was 4 × 6 cm (120 × 180 pixels), and the centers of the faces were 10 cm apart. The probe target was a pair of dots (either : or ..) that was presented on either the left- or the right-hand side of the screen in the location previously occupied by the center of one of the faces (i.e., 5 cm to the right or left of fixation). Sixteen face pairs were created (emotional face + neutral face) so that each pair contained either two male or two female faces and no pair contained two expressions made by the same person.

The experiment was presented on a Power Macintosh (7200/90) computer, with the screen set at a resolution of 832 × 623 pixels. Stimulus presentation and data collection were controlled by PsyScope software (Cohen, MacWhinney, Flatt, & Provost, 1993). Response times were collected by means of a New Micros PsyScope button box, which gives an accuracy of 1 msec.

Procedure. The participants were welcomed to a quiet lab and were seated approximately 50 cm in front of a computer screen. Full instructions regarding the dot-probe task were presented on the computer screen. In the main dot-probe task, there were 16 practice trials, followed by 128 experimental trials. In the experimental trials, each of the 16 face pairs was presented 8 times. This meant that each of the emotional faces (fearful and happy) was presented 8 times during the experiment, whereas each of the neutral faces was presented 16 times during the experiment. Each face appeared equally often in the LVF and the RVF, and the presentation of type of probe (: or ..) and location of probe (LVF or RVF) was counterbalanced across the experiment. Each trial began with a central fixation cross presented at the center of the computer screen for 500 msec. Upon offset of the fixation cross, the face pair was presented for 500 msec. Immediately following the display of the face pair, the probe target was presented in the location of one of the faces, and the participants were required to press one of two keys on the button box to indicate the type of probe (: or ..) as quickly and as accurately as possible. The left index finger was used for one probe (:), whereas the right index finger was used for the other probe (..). The probe was displayed until a response was made, up to a maximum of 2,000 msec. The intertrial interval was 1,000 msec. The computer emitted a 500-Hz bleep if the participant made a mistake. The trials were presented in a new random order for each participant.
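As a concrete illustration of the counterbalancing just described (16 face pairs, each shown with the emotional face on either side, crossed with both probe locations and both probe types, giving 128 experimental trials in a fresh random order per participant), a minimal sketch is given below. It is not the original PsyScope script; the pair labels and dictionary keys are hypothetical.

```python
import itertools
import random

# Hypothetical labels for the 16 face pairs (one emotional face + one neutral face each).
face_pairs = [f"pair_{i:02d}" for i in range(1, 17)]

def build_trial_list(seed=None):
    """Return the 128-trial list: each combination of face pair, emotional-face
    side, probe side, and probe type occurs exactly once, then is shuffled."""
    trials = [
        {"pair": pair, "emotional_side": emo_side,
         "probe_side": probe_side, "probe_type": probe_type}
        for pair, emo_side, probe_side, probe_type in itertools.product(
            face_pairs, ("LVF", "RVF"), ("LVF", "RVF"), (":", "..")
        )
    ]
    random.Random(seed).shuffle(trials)  # new random order for each participant
    return trials

trials = build_trial_list(seed=1)
assert len(trials) == 128  # 16 pairs x 2 sides x 2 probe locations x 2 probe types
```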


Following the dot-probe task, each participant completed the state anxiety scale of the STAI (Spielberger, Gorsuch, Lushene, Vagg, & Jacobs, 1983).

Results
Group characteristics. The participants were preselected on the basis of their STAI trait anxiety scores, which were collected at a general testing session about 6–8 weeks prior to the dot-probe task. Those scoring 45 or above on the STAI trait anxiety scale were allocated to the HA group (n = 16), whereas those scoring 35 or less on the STAI trait anxiety scale were allocated to the LA group (n = 16). The two groups differed significantly on both trait anxiety [t(30) = 9.3, p < .001; HA, M = 48.9, SD = 5.6; LA, M = 31.5, SD = 4.9] and state anxiety [t(30) = 3.0, p < .005; HA, M = 45.7, SD = 10.1; LA, M = 34.2, SD = 11.7] scores.

Dot-probe task. Trials with errors were discarded, as were any reaction times less than 100 msec. Reaction times greater than two and a half standard deviations (SDs) from a participant's overall mean reaction time were also excluded prior to data analysis. This resulted in the loss of less than 3% of the reaction time data. The mean reaction time data are shown in Table 1. To simplify the analysis, attentional bias scores were calculated from the response time data for each type of emotional face, relative to the neutral face. This bias score was calculated by subtracting the mean response time when the emotional face and the probe were in the same location from the mean response time when the emotional face and the probe location were in different positions. In order to test the specific hypotheses regarding the location of the emotional face (LVF or RVF), separate bias scores were calculated for emotional faces appearing in the LVF and the RVF. Positive values of the bias score for fearful faces reflect faster response times when probes appear in the same location as a fearful face (i.e., vigilance for fearful faces), relative to a neutral face, whereas a negative bias score reflects avoidance of fearful faces. Attentional bias scores were similarly calculated for happy faces. The bias scores for each anxiety group and each visual field are shown in Figure 1.
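The bias-score arithmetic can be made explicit with the cell means in Table 1. The sketch below is only an illustration of the subtraction described above, not the original analysis code; the published scores were computed from trial-level data, so values derived from the table means are only approximately equal to those reported in the text.

```python
def attentional_bias(rt_probe_opposite_emotional, rt_probe_at_emotional):
    """Bias score (msec): mean RT when the probe appears opposite the emotional
    face minus mean RT when the probe appears at the emotional face's location.
    Positive values indicate vigilance for the emotional face; negative values
    indicate avoidance."""
    return rt_probe_opposite_emotional - rt_probe_at_emotional

# Illustration with the HA-group cell means from Table 1 (emotional face in the LVF):
fearful_lvf = attentional_bias(638, 614)  # +24, close to the +23.7 reported below
happy_lvf = attentional_bias(592, 621)    # -29, close to the -28.56 reported below
print(fearful_lvf, happy_lvf)
```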

A 2 × 2 × 2 mixed design analysis of variance (ANOVA) was carried out on the attentional bias score data, with one between-subjects factor of trait anxiety (HA vs. LA) and two within-subjects factors of type of emotional face (fearful vs. happy) and location of emotional face (LVF vs. RVF). The only significant effect was an interaction between trait anxiety group, type of emotional face, and location of the emotional face [F(1,30) = 4.97, p < .03]. To examine the results in more detail, separate 2 × 2 ANOVAs were carried out on bias scores for fearful and happy faces. For the fearful faces, there were no significant effects, whereas for the happy faces, there was a trend toward an interaction between trait anxiety group and location of face [F(1,30) = 3.26, p < .08].

Given the specific hypotheses, planned contrasts were used to compare the groups on their bias scores for each type of emotional face in each location (LVF vs. RVF). With one-tailed tests of significance, the only effect was for the HA group to show avoidance of happy faces appearing in the LVF, whereas the LA group showed vigilance for the same faces [-28.56 vs. +19.87; t(30) = 1.83, p < .038]. Attentional bias scores were also tested against zero (0 = no attentional bias). Once again, with a one-tailed test, there was a trend for the HA group to show vigilance for fearful faces appearing in the LVF [+23.7; t(15) = 1.53, p < .07] and to show avoidance of happy faces appearing in the LVF [-28.56; t(15) = 2.03, p < .03]. No trends or significant effects were found for the LA group (all ts < 1). Pearson correlations were calculated between the index scores for fearful and happy faces and the trait and state anxiety questionnaires. No significant correlations were found between the index scores and the subjective measures.
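For readers who want to see how such planned contrasts are typically computed, a hedged sketch using SciPy (version 1.6 or later, for the `alternative` argument) is given below. The arrays are hypothetical placeholders for per-participant bias scores and are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant bias scores (msec), 16 per group; NOT the study's data.
ha_happy_lvf = np.array([-40, -35, -20, -55, -10, -25, -30, -15,
                         -45, -20, -35, -25, -30, -20, -40, -12])
la_happy_lvf = np.array([30, 10, 25, 15, 20, 35, 5, 28,
                         18, 22, 12, 30, 16, 24, 20, 8])

# Between-group planned contrast (one-tailed): LA vigilance greater than HA avoidance.
between = stats.ttest_ind(la_happy_lvf, ha_happy_lvf, alternative="greater")

# Test of one group's mean bias against zero (0 = no attentional bias), one-tailed.
vs_zero = stats.ttest_1samp(ha_happy_lvf, 0.0, alternative="less")

print(between.statistic, between.pvalue)
print(vs_zero.statistic, vs_zero.pvalue)
```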

Control experiment. Because the different emotional expressions corresponded to different individuals in the experiment, there is a possibility that the faces may have differed on features other than expression and that these artifacts may have influenced attention independently of expression. Thus, we conducted a control experiment with 6 HA participants (trait anxiety scores above 45). All of the stimuli were presented in a dot-probe paradigm exactly as in the main experiment, except that the faces were inverted. It is well known that inversion of faces destroys holistic processing (Tanaka & Farah, 1993), and therefore, if the emotional expressions were the critical factor in influencing the distribution of attention, no effects should be found with inverted faces. In contrast, if some low-level artifact, such as dimples, were causing the effect, a similar pattern should be observed with upright and inverted faces. In the pilot study with 6 HA participants, no attentional bias effects were found for fearful faces in the LVF (+0.23) or the RVF (-1.2) or for happy faces in the LVF (+2.2) or the RVF (+0.56; all ts < 1). This suggests that the pattern found in the main experiment was due to the emotionality of the faces.

Discussion
As was expected, in the analysis of attentional bias scores, the visual field of presentation and the type of emotional face did affect response times differentially for HA and LA participants.

Table 1
Mean Response Times to Probe Targets (in Milliseconds) in Experiment 1

                                               High Trait Anxiety    Low Trait Anxiety
Face Type   Face Location   Probe Location        M        SD           M        SD
Fearful     left            left                614       118         571       105
Fearful     left            right               638       112         575       190
Fearful     right           left                607       196         582       106
Fearful     right           right               615       122         574       185
Happy       left            left                621       196         582       123
Happy       left            right               592       185         601       107
Happy       right           left                605       190         575       110
Happy       right           right               601       191         578       193

Note—Face pairs were presented for 500 msec.


The results demonstrated that there was a tendency for HA participants to be influenced by emotional faces appearing in the LVF, but not by those appearing in the RVF. Importantly, a control experiment with 6 HA participants demonstrated that emotional faces did not influence the distribution of attention when the faces were inverted. However, when the faces were upright, HA participants tended to be vigilant for fearful faces appearing in the LVF but avoidant of happy faces appearing in the LVF. These results support the notion that the right hemisphere of the brain might be particularly sensitive to the emotionality of a face. This pattern was not observed for LA individuals and was not found for faces appearing in the RVF. Thus, the results of this study give tentative support to the notion that the nature of the stimuli (emotional vs. neutral), the visual field of presentation, and the participants' level of trait anxiety can all influence the distribution of spatial attention.

Previous research has indicated that threat-related stimuli may have a stronger effect on attentional processes when awareness of the stimuli is restricted. For instance, one study demonstrated that the selective processing of HA people was disrupted by irrelevant threatening words only when these words were backward masked. The same words produced little disruption on the selection task when they were not masked (Fox, 1996). Likewise, it has been shown that the propensity of angry faces to capture visual attention in HA individuals is apparent only when the faces are masked, and not when they are unmasked (Mogg & Bradley, 1999). Thus, it was decided to evaluate the role of emotional expressions and level of trait anxiety in influencing spatial attention under conditions in which the facial stimuli were masked after a brief period.

EXPERIMENT 2

The aim of Experiment 2 was to evaluate whether HA people allocate attention toward the location of masked fearful faces. On the basis of the results of Experiment 1, as well as of previous research (Mogg & Bradley, 1999), the hypothesis was that this propensity should be most apparent when the fearful face was presented in the LVF. The experiment was identical to Experiment 1, except that the pair of faces was masked after 17 msec (one screen refresh) by a mask made up of jumbled face parts. Immediately after the dot-probe task, a gender discrimination task was presented in order to provide an objective measure of awareness of the masked stimuli (Cheesman & Merikle, 1986; Mogg & Bradley, 1999).

Method
Participants. The participants were 36 undergraduate students (26 female, 10 male), who had not participated in Experiment 1. As before, they were selected from over 120 volunteers who had completed the trait scale of the Spielberger State-Trait Anxiety Inventory (STAI; Spielberger et al., 1983) early in the academic year. The recruitment procedure favored those who obtained high and low scores on this scale in order to ensure a wide range of emotionality scores within the sample.

Materials and Apparatus. The materials and apparatus were identical to those in Experiment 1. The only difference was that a masking stimulus was presented 17 msec after the face pair was presented. The masking stimulus was constructed by cutting up and randomly reassembling a pair of neutral faces, so that the features of each face were jumbled. The display size of each masking photograph was 4 × 6 cm (120 × 180 pixels), and the centers of the mask stimuli were 10 cm apart.

Procedure. The procedure was very similar to that in Experiment 1. The dot-probe task was presented at the start of the session and consisted of 16 practice trials, followed by 128 experimental trials. The face stimuli and their presentation were exactly the same as those in Experiment 1. At the start of each trial, a fixation cross was presented for 500 msec. The face pair was then presented for 17 msec, immediately followed by the mask pair for 17 msec. The probe (: or ..) was then presented immediately in the location of one of the masks. The probe was displayed until a response was made, up to a maximum of 2,000 msec.
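The event sequence on each masked trial can be summarized as a simple timeline, shown in the sketch below. This is a descriptive summary of the timings given in this section, not the original presentation code, and the event labels are arbitrary; the assumption that 17 msec corresponds to roughly one refresh of a 60-Hz display is flagged in a comment.

```python
# Descriptive timeline of one Experiment 2 trial (durations in msec).
# Event labels are arbitrary, not PsyScope identifiers.
TRIAL_TIMELINE = [
    ("fixation_cross", 500),
    ("face_pair", 17),               # emotional + neutral face, backward masked
    ("mask_pair", 17),               # jumbled-face masks at both face locations
    ("probe_until_response", 2000),  # ':' or '..'; terminates on keypress
    ("intertrial_interval", 1000),
]

# Face-to-mask stimulus onset asynchrony equals the face duration here.
face_mask_soa = dict(TRIAL_TIMELINE)["face_pair"]
# 17 msec is roughly one refresh of a ~60-Hz display (1000/60 ~= 16.7 msec);
# the actual refresh rate is not reported, so this is an assumption.
print("face-to-mask SOA:", face_mask_soa, "msec")
```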

Figure 1. Attentional bias scores (in milliseconds) as a function of trait anxiety group, type of emotional face, and location of emotional face in Experiment 1. [Figure not reproduced. Panels: fearful–neutral and happy–neutral face pairs (500-msec presentation); x-axis: location of face (LVF, RVF); y-axis: attentional bias score (msec); separate bars for the HA and LA groups.]


The intertrial interval was 1,000 msec. The computer emitted a 500-Hz bleep if the participant made a mistake. The trials were presented in a new random order for each participant.

Following the dot-probe task, the awareness check was given for the masked faces. All of the face pairs used in the main experiment were presented twice, resulting in a total of 32 trials. The face pairs were presented for 17 msec and then were masked for 17 msec, just as in the dot-probe experiment. For each face pair, the participant was asked to indicate the gender of each face pair by pressing the left-hand button for female and the right-hand button for male. At the end of the session, each participant completed the state anxiety scale of the STAI.

Results
Group characteristics. The participants were preselected on the basis of their STAI trait anxiety scores, which were collected at a general testing session about 6–8 weeks prior to the dot-probe task. Those scoring 45 or above on the STAI trait anxiety scale were allocated to the HA group (n = 18), whereas those scoring 35 or less on the STAI trait anxiety scale were allocated to the LA group (n = 18). None had participated in Experiment 1. The two groups differed significantly on both trait anxiety [t(34) = 11.1, p < .001; HA, M = 50.4, SD = 6.8; LA, M = 29.4, SD = 4.2] and state anxiety [t(34) = 7.1, p < .001; HA, M = 48.7, SD = 9.7; LA, M = 28.6, SD = 7.1] scores.

Dot-probe task. On the awareness check, the mean percentage of correct scores on the gender discrimination task was 50.1% (SD = 2.1%), which is at chance level. There was no difference between the HA (M = 50.09%, range = 46%–54%) and the LA (M = 50.21%, range = 46%–55%) groups on the awareness check. On the dot-probe task, trials with errors were discarded, as were any reaction times less than 100 msec. Reaction times greater than two and a half SDs from a participant's overall mean reaction time were also excluded prior to data analysis. This resulted in the loss of less than 3% of the reaction time data. The mean reaction time data are shown in Table 2. To simplify the analysis, attentional bias scores were calculated from the response time data for each type of emotional face, as in Experiment 1. The bias scores for each anxiety group and each visual field are shown in Figure 2.
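To see why individual accuracies in the 46%–55% range over 32 forced-choice trials are consistent with chance-level performance, one can run a simple binomial test. This is an illustrative check, not an analysis reported in the paper, and it assumes SciPy 1.7 or later for `scipy.stats.binomtest`.

```python
# Illustrative chance-level check for the 32-trial gender-discrimination task.
# Not an analysis from the paper; assumes SciPy >= 1.7 (scipy.stats.binomtest).
from scipy.stats import binomtest

n_trials = 32
best_observed = round(0.55 * n_trials)  # ~55% correct, the top of the observed range

result = binomtest(best_observed, n=n_trials, p=0.5, alternative="greater")
print(f"{best_observed}/{n_trials} correct: p = {result.pvalue:.2f}")
# The p value is far above .05, so even the highest observed accuracy
# does not exceed what guessing alone would be expected to produce.
```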

A 2 × 2 × 2 mixed design ANOVA was carried out on the attentional bias score data with one between-subjects factor of trait anxiety (HA vs. LA) and two within-subjects factors of type of emotional face (fearful vs. happy) and location of emotional face (LVF vs. RVF). There were significant interactions between trait anxiety group and type of emotional face [F(1,34) = 17.6, p < .001] and between trait anxiety group and face location [F(1,34) = 6.4, p < .016]. The predicted interaction between trait anxiety group, type of emotional face, and location of emotional face almost reached significance [F(1,34) = 3.43, p < .073].

Given the specific hypotheses, planned contrasts were used to compare the groups on their bias scores for each type of emotional face in each location (LVF and RVF). With one-tailed tests of significance, it was found that the HA group was more vigilant than the LA group for fearful faces in both the LVF [+40.4 vs. -5.7; t(34) = 4.6, p < .001] and the RVF [+15.8 vs. +1.4; t(34) = 2.5, p < .01]. No between-group differences were found for the index scores for happy faces. Attentional bias scores were also tested against zero (0 = no attentional bias) for each anxiety group separately. Once again, with a one-tailed test, the HA group demonstrated vigilance for fearful faces appearing in the LVF [+40.4; t(17) = 5.7, p < .001] and in the RVF [+15.8; t(17) = 4.8, p < .001]. Further analysis showed that the vigilance for fearful faces was greater when the emotional face was presented in the LVF, relative to the RVF, for this group [t(17) = 3.3, p < .004]. The only effect for the LA participants was a vigilance for happy faces appearing in the RVF [+14.9; t(17) = 2.1, p < .026], which was not predicted. Finally, Pearson correlations were calculated between the trait and state anxiety scales of the STAI and the index scores for fearful and happy faces. This showed that the index for fearful faces was correlated with both trait anxiety (r = .53, p < .001) and state anxiety (r = .37, p < .025), whereas the index for happy faces did not correlate with either of the subjective measures.

Comparison of Experiments 1 and 2. The data from Experiments 1 and 2 were combined in order to assess whether masked fearful faces produced a stronger attentional capture effect in HA people, relative to unmasked faces. These combined data were analyzed in a 2 × 2 × 2 × 2 mixed design ANOVA, with attentional bias scores as the dependent variable. There were two between-subjects factors of masking (masked vs. unmasked) and trait anxiety (HA vs. LA) and two within-subjects factors of type of emotional face (fearful vs. happy) and location of emotional face (LVF vs. RVF). As was expected, there was a significant interaction between trait anxiety, type of emotional face, and location of the emotional face [F(1,64) = 7.9, p < .006]. However, the critical four-way interaction between these three factors and masking failed to reach conventional levels of significance [F(1,64) = 2.6, p < .10]. Because of the a priori predictions, one-tailed t tests were computed on the attentional bias index for fearful and happy facial expressions.

Table 2
Mean Response Times to Probe Targets (in Milliseconds) in Experiment 2

                                               High Trait Anxiety    Low Trait Anxiety
Face Type   Face Location   Probe Location        M        SD           M        SD
Fearful     left            left                611        80         571       105
Fearful     left            right               651        87         575       190
Fearful     right           left                621        70         582       106
Fearful     right           right               606        68         574       185
Happy       left            left                622        67         597       186
Happy       left            right               615        62         604       177
Happy       right           left                615        69         608       182
Happy       right           right               621        72         594       185

Note—Face pairs were presented for 17 msec and then masked.


For HA people, the attentional bias index was larger for masked fearful (+33.4 msec) than for unmasked fearful (+4.1 msec) faces [t(32) = 1.64, p < .055], whereas there was no difference for the happy faces (-20.3 msec vs. -9.6 msec; t < 1). For the LA participants, there was no difference for either the fearful or the happy index scores between the masked and the unmasked conditions.

Discussion
The results of Experiment 2 confirmed that fearful faces had a greater influence on the distribution of spatial attention than did happy faces. Moreover, this result was apparent only for individuals with high levels of trait anxiety and was stronger when the emotional face was presented in the LVF. Thus, under conditions in which the participants were not aware of what stimuli were being presented, an anxiety-related propensity to selectively attend to fearful facial expressions was found. This is very similar to the results previously reported for angry facial expressions (Mogg & Bradley, 1999). It is particularly interesting that Mogg and Bradley also found that selective attention to angry faces was apparent only when the faces were presented in the LVF. The present results extend these findings to facial expressions of fear and suggest that the right hemisphere might be particularly sensitive to negative emotional expressions (anger and fear), but not to happy facial expressions. Moreover, in the present study, a strong correlation was found between the attentional bias index for masked facial expressions of fear and self-report measures of anxiety. In contrast, there was no correlation when the emotional facial expressions were not masked (Experiment 1). Unfortunately, the four-way interaction between masking, trait anxiety, type of emotional expression, and location of face did not reach significance, although the trend (p < .10) was in the right direction. In particular, the attentional bias index for fearful faces was higher for the masked, relative to the unmasked, conditions for HA participants. This supports previous research showing that masking emotional stimuli sometimes leads to stronger anxiety-related effects on spatial attention (Fox, 1996; Mogg & Bradley, 1999). Taken together, the present results, alongside the previous research, suggest that restricting the awareness of emotional stimuli may lead to stronger attentional capture. In the final experiment, we further explored the question of awareness by presenting affective and neutral facial expressions to the LVF of a patient with right parietal lobe damage who demonstrated extinction of objects presented to the LVF. This allowed for assessment of whether the processing of emotional facial expressions might be less dependent on attentional processing.

EXPERIMENT 3

Many studies have demonstrated that neglected stimuli often seem to receive extensive processing despite the patient's reporting that he or she sees nothing. In one study, for example, 4 patients with left neglect and 1 patient with hemianopia were presented with a lexical decision task at fixation. The letter string was preceded 600 msec earlier by two line drawings (one real object, one scrambled object) on either side of fixation (McGlinchey-Berroth, Milberg, Verfaellie, Alexander, & Kilduff, 1993). The critical finding was that lexical decisions were faster if the preceding line drawing was semantically related to the target word for the neglect patients, regardless of the side of presentation of the meaningful drawing. In other words, even when the drawing appeared in the neglected field, this "unseen" object still produced substantial priming. No priming effect was found for the hemianopic patient. This result suggests that despite being neglected, the unattended object received sufficient processing to influence future responding.

Figure 2. Attentional bias scores (in milliseconds) as a function of trait anxiety group, type of emotional face, and location of emotional face in Experiment 2. [Figure not reproduced. Panels: fearful–neutral and happy–neutral face pairs (17-msec masked presentation); x-axis: location of face (LVF, RVF); y-axis: attentional bias score (msec); separate bars for the HA and LA groups.]


More recent studies have shown that real objects or faces produce less extinction than do scrambled objects or scrambled faces (Vuilleumier, 2000; Ward & Goodrich, 1996). These studies support the notion that substantial analysis can take place in the absence of subjective awareness of the stimuli and that attention may then be biased to select more meaningful stimuli. Of more direct relevance for the present study is the recent report that faces with an angry or a happy expression resulted in less extinction than did faces with a neutral expression (Vuilleumier & Schwartz, 2001). In the present experiment, photographs of faces (fearful, happy, and neutral expressions) and of fruits (apple, orange, and banana) were presented to an individual who had experienced a right parietal cerebrovascular accident (CVA) approximately 1 year prior to testing. This patient, J.B., presented with mild unilateral neglect on the left side, and extinction for objects appearing in the LVF was easily demonstrable. It is also of interest that J.B. received some treatment for problems with anxiety and worry related to his illness. Therefore, we can expect that he was relatively highly anxious. The prediction was that faces would produce less contralesional extinction on bilateral trials than would fruits. Second, it was expected that fearful expressions would produce less contralesional extinction on bilateral trials than would either happy or neutral facial expressions.

Method
Participants. J.B. was a 74-year-old right-handed man who had suffered a CVA 13 months prior to testing (in June 1995). The CVA had left him with extensive damage in the right posterior parietal region. On testing, he was alert and cooperative and had intact visual fields on both sides, with corrected-to-normal vision. He showed mild signs of spatial neglect on a standard test of letter cancellation, in which he obtained 60/60 on the right side, as compared with 45/60 on the left side. Left-side extinction was easily demonstrated on bilateral simultaneous stimulation for visual stimuli. He obtained 36 items correct with 14 errors on the National Adult Reading Test, which gave him an estimated IQ of 116. J.B. scored 33 (maximum = 50) on the Taylor Manifest Anxiety Scale (Taylor, 1953), a screening instrument, indicating a high level of worry and anxiety. He was subsequently seen by a hospital psychiatrist, who determined that he was suffering distress, depression, and worry primarily related to his brain injury.

Materials and Procedure. Three male faces were selected from the Pictures of Facial Affect developed by Ekman and Friesen (1975). One of the faces portrayed a fearful expression (PE3-21), one portrayed a happy expression (EM4-07), and one portrayed a neutral emotional expression (WF2-05). These faces were digitized and presented as black-and-white photographs on a Macintosh portable computer. The display size of each photograph was 4 × 6 cm (120 × 180 pixels), and the centers of the faces were 10 cm apart on bilateral trials. In addition, three black-and-white photographs of an apple, an orange, and a banana were digitized and also had display sizes of 4 × 6 cm, with the centers of the fruits 10 cm apart on bilateral trials. The experiment consisted of 120 unilateral and 180 bilateral trials. Each of the six stimuli (three faces, three fruits) appeared equally often on unilateral and bilateral trials, and the computer presented the trials in a random order. All the possible trials are presented in Figure 3. Testing took place across two sessions about 1 week apart. The patient sat about 50 cm in front of the computer screen. Each trial began with a fixation cross at the center of the screen for 500 msec, followed by a stimulus presented in the left, the right, or both hemifields (5.7° of visual angle away from fixation) for 400 msec. Previous calibration had shown that J.B. could do the task with a 400-msec presentation time and that clear evidence of extinction was obtained with this exposure. The task simply required J.B. to say whether a face, a fruit, or both had been presented on each trial. He was told that two items would often appear together (one on the left and one on the right). When J.B. gave his response, this was entered into the computer by the experimenter, and this was followed by a 1,000-msec black screen until the next trial began.
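The 5.7° eccentricity quoted above follows directly from the layout (stimulus centers 5 cm from fixation at a viewing distance of about 50 cm). The sketch below simply reproduces that arithmetic; the stimulus-size angles are an extra illustration that is not reported in the paper.

```python
import math

def visual_angle_deg(extent_cm: float, distance_cm: float) -> float:
    """Angle (degrees) subtended at the eye by an extent viewed at a given distance."""
    return math.degrees(2 * math.atan((extent_cm / 2) / distance_cm))

# Eccentricity of each stimulus center: 5 cm from fixation at ~50 cm viewing distance.
eccentricity_deg = math.degrees(math.atan(5.0 / 50.0))   # ~5.7 deg, as reported
# Size of each 4 x 6 cm photograph (illustration only; not reported in the paper).
width_deg = visual_angle_deg(4.0, 50.0)                  # ~4.6 deg
height_deg = visual_angle_deg(6.0, 50.0)                 # ~6.9 deg
print(round(eccentricity_deg, 1), round(width_deg, 1), round(height_deg, 1))
```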

Results
Table 3 shows the number of stimuli missed in each condition for Patient J.B. The results show that J.B. missed relatively few contralesional stimuli in unilateral trials and that this was comparable for photographs of faces (20%) and photographs of fruits (23%). However, clear contralesional extinction was observed on bilateral trials. As can be seen, J.B. missed approximately 71% of LVF stimuli on bilateral trials, but only 2% in the RVF on bilateral trials. Of more interest, the amount of extinction observed for photographs of faces (56%) seemed to be considerably less than that observed for photographs of fruits (86%). With respect just to the face stimuli, there was some indication that the degree of extinction was less for fearful faces (43%), relative to neutral (73%) and happy (53%) faces. The data were analyzed by means of a 2 × 2 × 2 ANOVA, with type of trial (unilateral vs. bilateral), location of stimulus (LVF vs. RVF), and type of stimulus (faces vs. fruits) as "between-subjects" factors. Each individual trial was treated as a separate subject in order to allow for parametric analyses. The results showed main effects for type of trial [F(1,472) = 60.0, p < .001], such that the percentage of stimuli missed was lower for unilateral (11%) than for bilateral (37%) trials. As was expected, the percentage of stimuli missed was much higher in the LVF (59%) than in the RVF [2%; F(1,472) = 184.5, p < .001]. Of more interest, it was also found that the percentage of faces missed was less (24%) than the percentage of fruits missed [37%; F(1,472) = 7.6, p < .006]. The type of trial × location of stimuli interaction was also significant [F(1,472) = 50.2, p < .001], confirming that the LVF omissions occurred mainly on bilateral trials. There was also a type of trial × type of stimulus interaction, such that the difference between faces and fruits occurred only on bilateral trials.

The next analyses considered the face stimuli only on bilateral trials in order to test the a priori prediction that fearful faces would be extinguished to less of an extent than neutral or happy faces. A 2 × 3 ANOVA for location of face (LVF vs. RVF) and type of face (fearful, happy, neutral) showed a significant main effect for location of the face [F(1,174) = 121.5, p < .001], whereas the main effect for type of face was close to significance [F(2,174) = 2.9, p < .055]. Of more importance, the location × type of face interaction was also almost significant [F(2,174) = 2.9, p < .055]. Planned contrasts showed that for the LVF (on bilateral trials), there was substantially more extinction for neutral faces (73%) than for fearful faces [43%; t(58) = 2.43, p < .009, one-tailed] or for happy faces [53%; t(58) = 1.6, p < .055]. There was no difference in the magnitude of extinction between fearful and happy faces.

Discussion
The results of this single case study confirm recent findings of reduced extinction for faces, relative to shapes or scrambled faces (Vuilleumier, 2000; Ward & Goodrich, 1996). Moreover, the results also confirm the hypothesis that emotionally salient faces produce less extinction than do neutral facial expressions (Vuilleumier & Schwartz, 2001). In Vuilleumier and Schwartz's study, 3 patients with chronic left neglect and visual extinction were presented with schematic angry, happy, and neutral faces, as well as with simple shapes. It was found that the faces resulted in less extinction than did the shapes on bilateral trials for all of the patients (about 39% and 72%, respectively). It was also found that the angry and happy facial expressions produced less extinction (about 33% and 25% for angry and happy faces, respectively) than did the neutral faces (about 58%). The present study was similar in showing less extinction on bilateral trials for photographs of faces (28%), relative to photographs of fruits (45%). When we looked just at the facial stimuli, J.B. was less likely to extinguish fearful (43%) or happy (53%) facial expressions in the LVF on bilateral trials, as compared with neutral (73%) facial expressions. Thus, these results support the previous finding that emotional facial expressions are extinguished less than neutral faces (Vuilleumier & Schwartz, 2001). In the present study, we used photographs of real facial expressions, rather than schematic faces, as in the previous study. In addition, the present results extend the previous findings to the emotional expression of fear. The expectation was that fearful expressions would produce less extinction than would happy expressions. Although the trend was in the right direction (43% vs. 53% for fearful and happy expressions, respectively), this was far from significant. Thus, as in the Vuilleumier study, similar levels of extinction were observed for the negative and the positive emotional expressions. Future research will be needed to investigate in more detail whether negative emotional expressions (e.g., anger, fear) are extinguished to a lesser extent than positive emotional expressions (e.g., happiness) for patients with neglect and extinction.

Table 3
Mean Number of Omissions for Each Condition for Patient J.B. in Experiment 3

                                Unilateral Trials      Bilateral Trials
Type of Stimuli                   LVF      RVF           LVF      RVF
Fearful face                        1        0            13        0
Happy face                          3        0            16        0
Neutral face                        2        0            20        0
Apple                               3        0            26        2
Orange                              1        0            27        0
Banana                              3        0            24        2
No. trials in each condition       10       10            30       30

Figure 3. The diagram shows all possible unilateral and bilateral trials presented to patient J.B. in Experiment 3. The stimuli were black-and-white photographs of faces (fearful, happy, and neutral) and fruits (apple, orange, and banana). The trials were presented in a totally randomized order. LVF, left visual field; RVF, right visual field. [Figure not reproduced.]


The results of behavioral studies would suggest that negative expressions should have a stronger impact on attentional processing than do positive emotional expressions (Fox et al., 2001; Mogg & Bradley, 1999).

GENERAL DISCUSSION

A clear implication of the present results is that the anxiety level reported by a participant can have a profound effect on the distribution of spatial attention to behaviorally relevant stimuli. In Experiments 1 and 2, attentional bias for fearful faces was observed only for those reporting relatively high levels of trait anxiety. This supports a growing research literature in cognition and emotion (Bradley et al., 1998; Bradley et al., 2000; Fox et al., 2001; Mogg & Bradley, 1999) that has not generally been acknowledged in cognitive neuroscience. The present research confirms that although the emotionality of the stimuli used is important in determining the distribution of spatial attention, the emotionality of the individual participating in the task is also of critical importance. An interesting aspect of the present study is that it demonstrates an LVF bias for processing fearful faces in high anxious individuals, but not in low anxious individuals. When face pairs were presented for 500 msec (Experiment 1), HA individuals demonstrated an attentional bias toward fearful expressions and away from happy expressions, but only when the emotional faces were presented in the LVF. When the face pairs were presented for just 17 msec and then backward masked, there was again evidence for an anxiety-related bias to emotional expressions, but this time only for fearful expressions. HA individuals were vigilant for fearful faces appearing in both the LVF and the RVF, but the magnitude of the bias was greater when the emotional faces were presented in the LVF. The present results add to the evidence that the right hemisphere may be especially, but not exclusively, sensitive to the presence of behaviorally relevant stimuli that signal potential danger to the organism. It is of particular interest to determine whether a right hemisphere bias might be especially related to high levels of anxiety. Further evidence consistent with this notion is the previous finding that masked angry faces captured attentional resources in HA individuals, but only when they appeared in the LVF (Mogg & Bradley, 1999).

Is there any neurological evidence that the right hemisphere might be especially sensitive to fear-relevant stimuli in anxious individuals? A brief overview of the literature indicates a rather confusing picture. For instance, some research suggests that nonclinical state anxiety is associated with increased right-hemisphere activity, as measured by regional blood flow (Reivich, Gur, & Alavi, 1983), whereas another study has found that trait anxiety was associated with greater left-hemisphere activation (Tucker, Antes, Stenslie, & Barnhardt, 1978). Consistent with this, higher relative metabolism has been found in the left inferior frontal gyrus for patients with generalized anxiety disorder, as compared with matched controls (Wu et al., 1991).

However, more recent studies have proposed that if anxious arousal (as presumably measured by state anxiety) is distinguished from anxious apprehension (as measured by trait anxiety), the results become more coherent, with relatively greater right-hemisphere activation in anxious arousal and greater left-hemisphere activation in anxious apprehension (Heller & Nitschke, 1998; Heller et al., 1997). Although state anxiety was not directly manipulated in the present study, the results seem inconsistent with this model, in that trait anxiety (apprehension) was a better predictor of attentional bias to fearful faces than was state anxiety (arousal), and this effect was stronger for faces presented in the LVF (i.e., a right-hemisphere bias). Likewise, a recent study has found evidence for greater right-hemisphere involvement in emotional processing for a group of people high in anxious apprehension, which also does not support Heller et al.'s model (Compton, Heller, Banich, Palmieri, & Miller, 2000). Moreover, another study has shown that socially phobic individuals show increased right-hemisphere activation when they are waiting for a potentially stressful event (Davidson et al., 2000). Thus, it may well be that the pattern of brain activity is related to an interaction between trait and state anxiety. There is clearly a need to conduct more research on the neural mechanisms underlying anxiety. In particular, it would be interesting to establish the neural mechanisms that are engaged during attentional biases toward fearful faces in HA and LA people under conditions of high and low stress. One mechanism by which selective attentional biases might work is that the processing of a fear-relevant stimulus (e.g., a fearful or angry expression) might increase the level of autonomic arousal in anxious individuals, which might, in turn, lead to activation of certain regions of the right hemisphere (see Davidson, 1998). Thus, this mechanism would result in an anxiety-related attentional bias to fear-relevant stimuli presented in the LVF, as was found in the present study. It should be noted, however, that some studies have found larger anxiety-related attentional bias effects for threatening faces presented to the RVF, relative to the LVF (Bradley et al., 1998; Bradley et al., 2000). Further work is clearly required to determine the role of each hemisphere in processing emotional expressions, especially in relation to different types of anxiety (Compton et al., 2000). The present results suggest that the nature of the stimuli and the anxiety level of the individual combine to determine the distribution of spatial attention. One potential avenue of research would be to evaluate the responsiveness of the amygdala in HA individuals, since it is known that the amygdala can influence the distribution of spatial attention, as well as modulating levels of arousal (Armony & LeDoux, 2000). The present research suggests that the distribution of spatial attention is the result of a complex interplay between the nature of the stimuli and the anxiety level of the participant.

A second issue addressed in this paper is the anxiety-related attentional bias toward fearful faces that were unseen. It is of particular interest that the attentional bias toward masked stimuli seems to be somewhat stronger than that observed with unmasked stimuli. Although the critical interaction failed to reach significance in the present study, it was nevertheless clear that fearful faces had a stronger effect on attention in the HA group when the faces were masked. For instance, attentional bias was weak and was not predicted by trait or state anxiety scores when stimuli were unmasked (Experiment 1). However, when the stimuli were masked, trait anxiety was a significant predictor of attentional bias for fearful faces (Experiment 2). Likewise, it has been found that when masked stimuli were presented near threshold, so that participants were aware of stimuli being presented some of the time, the attentional bias for angry faces disappeared (Mogg & Bradley, 1999). Similarly, in an earlier study, differential processing of negative emotional words in HA individuals was observed only when the words were masked, and not when they were clearly visible (Fox, 1996). Recent brain-imaging research has also found that masked angry facial expressions produce a stronger activation of the right amygdala, whereas unmasked angry facial expressions produce greater activation of the left amygdala (Vuilleumier, Armony, Driver, & Dolan, 2001). Similarly, it has been found that when aversively conditioned angry facial expressions are backward masked, the right amygdala is most active, with little activity in the left amygdala (Morris et al., 1998, 1999). Morris et al. (1999) suggest that there is a subcortical pathway involving the superior colliculus and pulvinar that provides a route for processing behaviorally relevant unseen visual events. This route seems to be specific to negative emotional expressions, in that it was not activated when participants viewed happy facial expressions. Thus, it seems reasonable to speculate that the right hemisphere may be especially responsive to emotional faces and that the pulvinar route gives an additional advantage to unseen stimuli. An important avenue for future research is to establish why unseen emotional stimuli may have a stronger effect on the distribution of spatial attention than do clearly visible stimuli and why anxiety may modulate this effect. One possibility is that unseen stimuli are processed by the amygdala, which then modulates cortical processing to focus on the potentially dangerous information in more detail. This internal signal may be especially strong for anxious individuals, who have recently been found to show a stronger attentional bias toward internal threatening information, relative to external information (Stegen, van Diest, van de Woestijne, & van den Bergh, 2001).

The results of the first two experiments demonstrate that level of trait anxiety, as well as degree of awareness of visual stimuli, may both be important determinants of attentional bias toward fearful facial expressions. As was noted previously, the backward masking of visual stimuli may correspond fairly closely to the preattentive conditions observed for patients with neglect and extinction. In Experiment 3, a patient with right parietal lobe damage, who exhibits left-sided neglect and extinction, was studied. It is of interest that this individual had received treatment for problems with anxiety and worry, and therefore we can assume that he was a high anxious individual. The results of Experiment 3 demonstrated that significantly less extinction was observed in the LVF on bilateral trials for fearful and happy facial expressions, as compared with neutral expressions. This confirms previous findings that the automatic processing of emotional facial expressions can be revealed by the modulation of extinction for faces presented in the LVF (Vuilleumier & Schwartz, 2001). The present study is the first demonstration of this effect for the emotional expression of fear. It is of interest to note, however, that no difference was found in the amount of extinction observed for fearful and happy facial expressions. Thus, it seems that emotionally salient stimuli produce less extinction, regardless of the valence of the faces (Vuilleumier & Schwartz, 2001; present Experiment 3). In contrast, it can be concluded that behaviorally relevant stimuli that signal potential threat are especially effective in influencing the distribution of visual attention in high anxious people. It is interesting to speculate that emotionally relevant stimuli (positive and negative) may be processed automatically (as in the neglect patient) but that attention then gets allocated only to the potentially threatening stimuli (as in the anxiety-related bias for fearful facial expressions). This is consistent with the recent finding that, for anxious individuals, angry facial expressions tend to hold attention at their location, whereas happy facial expressions do not (Fox et al., 2001). In summary, the present study suggests that the processing of affective facial expressions is not dependent on attentive processes and that this results in an anxiety-related selective processing of behaviorally relevant stimuli, which appears to be restricted to fear-relevant stimuli.

REFERENCES

Armony, J., & LeDoux, J. E. (2000). How danger is encoded: Toward a systems, cellular, and computational understanding of cognitive–emotional interactions in fear. In M. S. Gazzaniga (Ed.), The new cognitive neurosciences (2nd ed., pp. 1067-1079). Cambridge, MA: MIT Press.

Bradley, B. P., Mogg, K., Falla, S. J., & Hamilton, L. R. (1998). Attentional bias for threatening facial expressions in anxiety: Manipulation of stimulus duration. Cognition & Emotion, 12, 737-753.

Bradley, B. P., Mogg, K., & Millar, N. H. (2000). Covert and overt orienting of attention to emotional faces in anxiety. Cognition & Emotion, 14, 789-808.

Byrne, A., & Eysenck, M. W. (1995). Trait anxiety, anxious mood, and threat detection. Cognition & Emotion, 9, 549-562.

Cheesman, J., & Merikle, P. M. (1986). Distinguishing conscious from unconscious perceptual processes. Canadian Journal of Psychology, 40, 343-367.

Cohen, J., MacWhinney, B., Flatt, M., & Provost, J. (1993). PsyScope: An interactive graphic system for designing and controlling experiments in the psychology laboratory using Macintosh computers. Behavior Research Methods, Instruments, & Computers, 25, 257-271.

Compton, R. J., Heller, W., Banich, M. T., Palmieri, P. A., & Miller, G. A. (2000). Responding to threat: Hemispheric asymmetries and interhemispheric division of input. Neuropsychology, 14, 254-264.

Davidson, R. J. (1998). Affective style and affective disorders: Perspectives from affective neuroscience. Cognition & Emotion, 12, 307-330.

Davidson, R. J., Marshall, J. R., Tomarken, A. J., & Henriques, J. B. (2000). While a phobic waits: Regional brain electrical and autonomic activity in social phobics during anticipation of public speaking. Biological Psychiatry, 47, 85-95.

Dolan, R. J. (2000). Emotional processing in the human brain revealed through functional neuroimaging. In M. S. Gazzaniga (Ed.), The new cognitive neurosciences (2nd ed., pp. 1115-1131). Cambridge, MA: MIT Press.

Eastwood, J. D., Smilek, D., & Merikle, P. M. (2001). Differential attentional guidance by unattended faces expressing positive and negative emotion. Perception & Psychophysics, 63, 1004-1013.

Ekman, P., & Friesen, W. V. (1975). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.

Esteves, F., Parra, C., Dimberg, U., & Ohman, A. (1994). Nonconscious associative learning: Pavlovian conditioning of skin conductance responses to masked fear-relevant stimuli. Psychophysiology, 31, 375-385.

Fox, E. (1996). Selective processing of threatening words in anxiety: The role of awareness. Cognition & Emotion, 10, 449-480.

Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition & Emotion, 14, 61-92.

Fox, E., Russo, R., Bowles, R. J., & Dutton, K. (2001). Do threatening stimuli draw or hold visual attention in subclinical anxiety? Journal of Experimental Psychology: General, 130, 681-700.

Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality & Social Psychology, 54, 917-924.

Heller, W., & Nitschke, J. B. (1998). The puzzle of regional brain activity in depression and anxiety: The importance of subtypes and comorbidity. Cognition & Emotion, 12, 421-447.

Heller, W., Nitschke, J. B., Etienne, M. A., & Miller, G. A. (1997). Patterns of regional brain activity differentiate types of anxiety. Journal of Abnormal Psychology, 106, 376-385.

LeDoux, J. E. (1996). The emotional brain: The mysterious underpinnings of emotional life. New York: Simon & Schuster.

McGlinchey-Berroth, R., Milberg, W. P., Verfaellie, M., Alexander, M., & Kilduff, P. T. (1993). Semantic processing in the neglected visual field: Evidence from a lexical decision task. Cognitive Neuropsychology, 10, 79-108.

Mogg, K., & Bradley, B. P. (1999). Orienting of attention to threatening facial expressions presented under conditions of restricted awareness. Cognition & Emotion, 13, 713-740.

Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., & Dolan, R. J. (1996). A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 813-815.

Morris, J. S., Ohman, A., & Dolan, R. J. (1998). Conscious and unconscious emotional learning in the human amygdala. Nature, 393, 467-470.

Morris, J. S., Ohman, A., & Dolan, R. J. (1999). A subcortical pathway to the right amygdala mediating "unseen" fear. Proceedings of the National Academy of Sciences, 96, 1680-1685.

Nothdurft, H. C. (1993). Faces and facial expressions do not pop out. Perception, 22, 1287-1298.

Ohman, A., Lundqvist, D., & Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality & Social Psychology, 80, 381-396.

Rafal, R. (1996). Visual attention: Converging operations from neurology and psychology. In A. F. Kramer, M. G. H. Coles, & G. D. Logan (Eds.), Converging operations in the study of visual selective attention (pp. 139-192). Washington, DC: American Psychological Association.

Reivich, M., Gur, R., & Alavi, A. (1983). Positron emission tomographic studies of sensory stimuli, cognitive processes and anxiety. Human Neurobiology, 2, 25-33.

Spielberger, C. D., Gorsuch, R. L., Lushene, R., Vagg, P. R., & Jacobs, G. A. (1983). Manual for the State-Trait Anxiety Inventory. Palo Alto, CA: Consulting Psychologists Press.

Stegen, K., van Diest, I., van de Woestijne, K. P., & van den Bergh, O. (2001). Do persons with negative affect have an attentional bias to bodily sensations? Cognition & Emotion, 15, 813-829.

Tanaka, J. W., & Farah, M. J. (1993). Parts and wholes in face recognition. Quarterly Journal of Experimental Psychology, 46A, 225-245.

Taylor, J. A. (1953). A personality scale of manifest anxiety. Journal of Abnormal & Social Psychology, 48, 285-290.

Tucker, D. M., Antes, J. R., Stenslie, C. E., & Barnhardt, T. M. (1978). Anxiety and lateral cerebral function. Journal of Abnormal Psychology, 87, 380-383.

Vallar, G. (1993). The anatomical basis of spatial neglect in humans. In I. H. Robertson & J. C. Marshall (Eds.), Unilateral neglect: Clinical and experimental studies (pp. 27-62). Hillsdale, NJ: Erlbaum.

Vuilleumier, P. (2000). Faces call for attention: Evidence from patients with visual extinction. Neuropsychologia, 38, 693-700.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Distinct effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron, 30, 829-841.

Vuilleumier, P., & Schwartz, S. (2001). Emotional facial expressions capture attention. Neurology, 56, 153-158.

Ward, R., & Goodrich, S. (1996). Differences between objects and nonobjects in visual extinction: A competition for attention. Psychological Science, 7, 177-180.

Whalen, P. J., Rauch, S. L., Etcoff, N. L., McInerney, S. C., Lee, M. B., & Jenike, M. A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. Journal of Neuroscience, 18, 411-418.

Wu, J. C., Buchsbaum, M. S., Hershey, T. G., Hazlett, E., Sicotte, N., & Johnson, J. C. (1991). PET in generalized anxiety disorder. Biological Psychiatry, 29, 1181-1199.

(Manuscript received May 23, 2001; revision accepted for publication February 20, 2002.)