


Attentional and Linguistic Interactions in Speech Perception
Merav Sabri*, Jeffrey R. Binder, Rutvik Desai, Mike D. Leitl, David A. Medler, Edward T. Possing & Einat Liebenthal
Medical College of Wisconsin, Milwaukee WI, USA.

INTRODUCTION

There is ample evidence for implicit (automatic) semantic processing following presentation of meaningful linguistic stimuli in both the auditory and visual modalities (e.g., the Stroop effect; MacLeod, 1991).

Nonetheless, whether words are processed obligatorily remains uncertain, mainly due to inconsistent reports regarding differences in brain activation between word and nonword sounds.

Subjects performing a nonlinguistic visual feature detection task produced activations in language areas, suggesting implicit visual word processing (Price et al., 1996).

The current study used fMRI to examine the effects of attention (task demands) and stimulus type (auditory words, nonwords) on brain areas implicated in language processing.

Task demands were manipulated such that subjects were engaged in a bimodal selective attention task in which sounds were either attended to or ignored. To hold attention constant and away from the sounds, in the ignore condition subjects focused on a demanding visual task.

METHODS

Subjects. 28 healthy, right-handed, native English speakers.

Auditory Stimuli.

• Words: 140 concrete monosyllabic nouns

• Pseudowords: 140, matched to words in number of phonemes, phonological neighborhood size, and average biphone log frequency

• Rotated Speech: 70 spectrally rotated words and 70 rotated pseudowords chosen randomly from the lists above (a spectral-rotation sketch follows this list)

• Mean Duration = 667 ms
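Spectral rotation inverts the speech spectrum about a center frequency, rendering the sounds unintelligible while preserving much of their acoustic complexity. As a minimal illustration only (the actual stimuli were presumably created with dedicated audio software; the 4 kHz band edge, 16 kHz sampling rate, and the function below are our assumptions, not details from the poster), a signal band-limited to 4 kHz can be rotated by mirroring its FFT bins:

```python
import numpy as np

def rotate_spectrum(signal, fs, band_hz=4000.0):
    """Mirror the spectrum of a signal (assumed band-limited to band_hz)
    about band_hz / 2. Illustrative sketch, not the authors' procedure."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = freqs <= band_hz
    rotated = np.zeros_like(spec)
    rotated[in_band] = spec[in_band][::-1]  # reverse bin order = spectral inversion
    rotated[0] = 0.0                        # drop the DC term introduced by the flip
    return np.fft.irfft(rotated, n=len(signal))

# Example: rotate 667 ms of noise at a 16 kHz sampling rate (stand-in for a word)
fs = 16000
word_waveform = np.random.randn(int(0.667 * fs))
rotated_waveform = rotate_spectrum(word_waveform, fs)
```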

Visual Stimuli. Single Japanese characters (Kana) presented in the middle of the screen.

Paradigm.

• SOA of sounds = 2, 4, or 6 s. SOA of visual stimuli = 0.5 s. Auditory or visual repetition = 17%.

• Each of 7 runs included 12 randomized blocks of Attend Auditory (ignore visual) and Attend Visual (ignore auditory) conditions.

Task. Participants performed a 1-back matching task in the assigned attended modality. In Attend Auditory, subjects were instructed to keep their eyes focused on the screen while ignoring the visual characters.
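To make the paradigm parameters above concrete, the sketch below generates one hypothetical auditory block: sounds separated by SOAs of 2, 4, or 6 s, with roughly 17% immediate repeats serving as 1-back targets. This is our own reconstruction for illustration; the block length and exact randomization scheme are assumptions.

```python
import random

def make_1back_block(stimuli, n_trials=20, p_repeat=0.17, soas=(2.0, 4.0, 6.0)):
    """Generate one auditory block for a 1-back task: ~17% of trials repeat
    the previous item (targets), and each sound gets a random SOA.
    Hypothetical reconstruction; n_trials is an assumed value."""
    trials, prev = [], None
    for _ in range(n_trials):
        if prev is not None and random.random() < p_repeat:
            item = prev                                   # 1-back repeat (target)
        else:
            item = random.choice([s for s in stimuli if s != prev])
        trials.append({"item": item, "soa_s": random.choice(soas),
                       "is_target": item == prev})
        prev = item
    return trials

block = make_1back_block(["cat", "ship", "fork", "lamp", "horn"])
n_targets = sum(t["is_target"] for t in block)
```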

fMRI Procedures.

• GE Signa 1.5T scanner (GE Medical Systems, Milwaukee).

• Functional data: T2*-weighted, gradient-echo EPI (TR = 2 s, TE = 40 ms, flip angle = 90°, NEX = 1). 21 axial slices, 3.75 x 3.75 x 5.50 mm voxels.

• Anatomical data: 3-D spoiled gradient-echo sequence. Whole brain, 0.94 x 0.94 x 1.2 mm voxels.

• Image analysis used the AFNI software package (Cox, 1996). FWHM = 6 mm. Random-effects analysis.

• Cluster size threshold (541 µl, p < .05 corrected) was applied to the group t-maps thresholded at p < .001.
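For orientation, the cluster criterion can be restated in voxels: one functional voxel is 3.75 x 3.75 x 5.50 mm, about 77.3 μl, so 541 μl corresponds to roughly 7 contiguous voxels. The sketch below shows the general logic of combined voxelwise and cluster-extent thresholding using scipy; it is our illustration of the idea, not the AFNI implementation the study actually used, and the two-tailed threshold and connectivity rule are assumptions.

```python
import numpy as np
from scipy import ndimage
from scipy.stats import t as t_dist

def cluster_threshold(t_map, df, p_voxel=0.001, min_volume_ul=541.0,
                      voxel_dims_mm=(3.75, 3.75, 5.50)):
    """Keep voxels with |t| above the p_voxel threshold that also belong to a
    cluster of at least min_volume_ul. Illustrative only (assumes a two-tailed
    test and face-connected clusters); the study used AFNI."""
    t_crit = t_dist.ppf(1.0 - p_voxel / 2.0, df)         # voxelwise t threshold
    voxel_ul = float(np.prod(voxel_dims_mm))             # 1 mm^3 = 1 microliter
    min_voxels = int(np.ceil(min_volume_ul / voxel_ul))  # ~7 voxels here
    suprathreshold = np.abs(t_map) > t_crit
    labels, n_clusters = ndimage.label(suprathreshold)   # connected components
    keep = np.zeros_like(suprathreshold)
    for i in range(1, n_clusters + 1):
        cluster = labels == i
        if cluster.sum() >= min_voxels:
            keep |= cluster
    return t_map * keep

# Example on a 64 x 64 x 21 grid; df = 27 for 28 subjects (one-sample t-test)
thresholded = cluster_threshold(2 * np.random.randn(64, 64, 21), df=27)
```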

RESULTS

Auditory Behavioral Performance
The auditory matching task was performed well: overall d' = 3.20 (words = 3.35, pseudowords = 3.31, rotated speech = 2.95).

The visual task proved to be very demanding: d' = 2.72.
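For reference, d' combines the hit rate on repeat (target) trials with the false-alarm rate on non-repeat trials as d' = z(hit rate) - z(false-alarm rate). A minimal sketch of the computation follows; the counts in the example are invented for illustration, not the study's data.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with a 1/(2N) correction so
    rates of exactly 0 or 1 do not produce infinite z-scores."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = min(max(hits / n_signal, 0.5 / n_signal), 1 - 0.5 / n_signal)
    fa_rate = min(max(false_alarms / n_noise, 0.5 / n_noise), 1 - 0.5 / n_noise)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Invented example counts (not the study's data):
print(d_prime(hits=30, misses=4, false_alarms=3, correct_rejections=163))
```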

DISCUSSION

Regardless of stimulus type, attended sounds activated the STS, SMG, and dPFC more than unattended sounds. These areas are associated with auditory working memory, which is necessary for performing the 1-back task.

Regardless of task demands (attend, ignore), words and pseudowords did not differ in activation, suggesting similar processing.

Rotated speech activated dorsal temporal areas (HG, PT) more than either words or pseudowords, regardless of task demands. These results suggest additional pre-attentive acoustic analysis of these unfamiliar sounds.

Rotated speech also activated SMG and dPFC more than either words or pseudowords during the attend auditory task. This suggests greater demands on auditory working memory, consistent with the behavioral data showing poorer performance on the 1-back task for rotated speech.

Conversely, attended words and pseudowords activated more ventral temporal areas (STS, MTG, ITG, fusiform), suggesting linguistic/semantic analysis.

A large region in the medial occipital lobe (cuneus, lingual gyrus) showed stronger activation in the attend auditory task than in the attend visual task. We speculate that this pattern may represent active inhibition of the ongoing visual stimulation.

Taken together, these results suggest that attention modulates linguistic processing of words and pseudowords. Rotated speech appears to be processed mainly at a prephonemic auditory level regardless of attentional state. Attentional processing engages auditory working memory, especially for rotated speech.

fMRI

ATTEND AUDITORY > ATTEND VISUAL (ignore auditory)
Words. Right: MTG. Left: dPFC, MFG, IFG, posterior cingulate gyrus. Bilateral: STG, STS, SMG, OFC, precuneus, cuneus, lingual gyrus.

Pseudowords activated very similar areas to those activated by words.

Rotated Speech activated very similar areas to those activated by words and pseudowords. There was somewhat more activation in right frontotemporal areas and less activation in left OFC.

*[email protected]
http://www.neuro.mcw.edu/~msabri

Attend Pseudowords > Attend Rotated Speech. Left: ITG, OFC. Bilateral angular gyrus.
Attend Rotated Speech > Attend Pseudowords. HG, PT, MFG, SMG.

IGNORE AUDITORY: STIMULUS CONTRASTS
Ignore Words > or < Ignore Pseudowords. No above threshold activations.
Ignore Words > Ignore Rotated Speech. No above threshold activations.
Ignore Rotated Speech > Ignore Words. Bilateral HG and PT.

Ignore Rotated Speech > Ignore Pseudowords. Bilateral HG, similar to above.

Supported by 1 F32 DC007030-01 (M. Sabri) and R01-NS33576 (J.R. Binder).

ATTEND AUDITORY: STIMULUS CONTRASTS

Attend Words > Attend Pseudowords. No above threshold activations.

Attend Words > Attend Rotated Speech (in Orange). Bilateral: anterior STS, posterior MTG and lateral occipital gyri, hippocampus, parahippocampus, postcentral gyrus. Left: OFC, SFG. Right: anterior ITG, fusiform gyrus.
Attend Rotated Speech > Attend Words (in Blue). Bilateral: HG, PT, SMG, anterior cingulate, anterior insula. Right: IFG, premotor cortex.

[Figure panels (images not reproduced): Attend Words > Ignore Words; Attend Rotated Speech > Ignore Rotated Speech; Attend Words > Attend Rotated Speech; Attend Pseudowords > Attend Rotated Speech; Ignore Rotated Speech > Ignore Words. R = right.]

Abbreviations: HG=Heschl's gyrus, PT=planum temporale, STG=superior temporal gyrus, STS=superior temporal sulcus, MTG=middle temporal gyrus, ITG=inferior temporal gyrus, SMG=supramarginal gyrus, SFG=superior frontal gyrus, dPFC=dorsolateral prefrontal cortex, MFG=middle frontal gyrus, IFG=inferior frontal gyrus, OFC=orbitofrontal cortex, R=right.