The Picture Superiority Effect in Encoding and Retrieval Processes during Japanese Learning for Chinese Bilinguals

LiPing Mi
Department of Information Science and Intelligent Systems, The University of Tokushima, 770-8506, Tokushima, Japan
[email protected]

Xiangyang Liu
Department of Computer Science, The Shangqiu Normal University, Henan, 476000, China
[email protected]

Fuji Ren
Department of Information Science and Intelligent Systems, The University of Tokushima, 770-8506, Tokushima, Japan
[email protected]

Abstract: In order to investigate the picture superiority effect, we compared the ERPs elicited by picture-word stimuli (a picture combined with its corresponding word) and pure-word stimuli (word) during the study and test phases. During encoding, the FN400 was more negative and lasted longer for picture-words than for words, whereas the late positive component (LPC) was more positive and more broadly distributed for words than for picture-words. During retrieval, the old picture-words elicited a markedly larger FN400 familiarity effect and parietal old/new effect than the old words. We suggest that the simultaneous image and verbal encoding of picture-words elicited better and faster recollection than words during the memory test. Our findings also demonstrate that the picture superiority effect is related to the ability of pictures to enhance encoding and facilitate recollection.

Keywords: ERP; encoding; retrieval; recognition memory; picture superiority effect; familiarity; recollection

1. Introduction

Memory is the process by which information is encoded, stored, and retrieved. Baddeley [1] proposed that working memory is responsible for storing and briefly processing information during complex cognitive activities such as learning and language comprehension. Most research on working memory has concentrated on the lexical system of the native language; few studies have used a foreign language to investigate how bilinguals learn it. In particular, to our knowledge, no study has simultaneously used Japanese words and pictures as stimuli to examine whether pictures facilitate memory for Japanese vocabulary in Chinese bilinguals.

The neural mechanism of memory formation is an important problem in brain science. Short-term storage and rehearsal of information are essential for memory maintenance; only then can information be stored long-term in the brain as knowledge. How to increase the amount of information held in short-term storage and how to reduce memory load effectively have therefore become popular research questions. A Chinese proverb says that "a picture is worth a thousand words." Compared with a pure word, a picture provides abundant perceptual information that is rarely matched by a verbal description, and pictures have been shown to enhance memory relative to words. The picture superiority effect refers to the finding that learners are more likely to remember items presented as pictures than as words [2]. More recently, comparing memory for pictures versus words has been used as a tool in memory research.

In the present study we were interested in better understanding the neural basis of the memorial picture superiority effect, and in particular how pictures affect the event-related potential (ERP) components of recognition memory: familiarity and recollection. The "FN400" is a familiarity-sensitive ERP component that typically occurs at bilateral frontal electrode sites between 300 and 500 ms. Curran [3] found that familiar items elicited a more positive response at frontal electrode sites than did unstudied items. Another component of the old/new effect typically occurs maximally at parietal electrode sites between 500 and 800 ms. This parietal effect is associated with recollection and is enhanced for items correctly identified as previously studied [4]. Accordingly, we speculated that studied items would elicit a more positive FN400 and late positive component (LPC) than unstudied items during retrieval.

Most ERP studies of the picture superiority effect have focused on brain activity recorded during the retrieval phase. However, whether a picture is recollected more accurately and more quickly at retrieval should depend on how its abundant perceptual information is processed at encoding. Two different hypotheses address the encoding of pictures. The dual-coding hypothesis states that distinct but interconnected mechanisms for picture and word processing are responsible for encoding [5]: pictures are represented by an image code and words by a verbal code, but the systems are interconnected such that either modality can evoke either code [6]. Paivio [7, 8] proposed that memory accuracy would be better for pictures because they are more likely to be represented by both image and verbal codes (dual coding), increasing the probability that they will be recollected compared with words. The sensory-semantic hypothesis, in contrast, argues that pictures possess highly distinctive visual information and features that allow unique encoding in memory [9]. This distinctiveness account of the picture superiority effect proposes that a picture may allow faster recollection of a studied item than a word.

Based on these two encoding hypotheses, we designed experimental stimuli that differed from the traditional simple-picture stimuli: the pictorial stimulus in the present study was a picture combined with its corresponding written name (picture-word). To uncover the role that a picture's distinctive image information plays in encoding, we compared the ERP differences between the picture-word and the word during encoding in the study phase. We expected that the simultaneous, more active image and verbal encoding of the picture-word would elicit better and faster recollection than the word during the memory test.

The present study used the ERP technique to investigate the neural correlates of the picture superiority effect in both the encoding and retrieval phases by comparing picture-word and word recognition memory while Chinese bilinguals learned Japanese. The ERP technique provides a moment-by-moment record of relevant neurophysiological activity during encoding and retrieval, as well as information about the scalp distributions of peaks in the ERP waveform. These advantages help us examine the temporal and spatial aspects of the neural mechanisms of encoding and retrieval. Another objective of our study was to suggest engaging and lively Japanese-learning methods for Chinese bilinguals; we propose that Chinese bilinguals can exploit the picture superiority effect to improve recognition memory for words and reduce memory load.

The rest of the paper is organized as follows. The experimental methods are described in Section 2. In Section 3, we present the ERP results. Section 4 provides the discussion, and we conclude in Section 5.

2. Methods

2.1. Participants

Twenty Chinese graduate students studying at Tokushima University (10 males and 10 females, age range 24-29 years, mean 26.1 years, all with intermediate or advanced Japanese proficiency) participated in the experiments. Half of them participated in the picture-word experiment and the other half in the word experiment. All participants were right-handed and healthy and had normal or corrected-to-normal vision; they were screened to exclude current or past neurological or psychiatric disorders. They were paid for their participation. Participants' consent was obtained, following the guidelines of the Institutional Review Board of Tokushima University, before the experiment started.

2.2. Stimuli and design

The current study consisted of two experiments: one investigated memory for a picture combined with its corresponding word (hereafter, picture-word) and the other investigated memory for the pure word (hereafter, word). Each experiment consisted of 12 study-test blocks. In the study phase of the picture-word experiment, 120 unambiguous line drawings of common objects and 120 nouns were presented visually to the participants. The line drawings were selected from the 260 common-object drawings of Snodgrass and Vanderwart [10] and were divided into 12 blocks of 10 items each. To avoid differences in results caused by different materials, the same words were used in the study phases of both experiments; the 120 nouns were likewise divided into 12 blocks of 10 items each in the study phase of the word experiment. To allow a straightforward comparison of the two experiments, only words were presented as memory-test stimuli in the test phase of both experiments. In each test block, 5 old nouns that had been presented in the preceding study phase and 5 new words were presented in random order. In both the study and test phases, each stimulus was presented for 2000 ms, followed by a 1000 ms black screen.

To avoid glare, stimuli were presented in white on a black background. The horizontal visual angles ranged from 2° to 4.7° and the vertical visual angles from 3° to 4°, for the longest and shortest words or the largest and smallest pictures, respectively.
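For concreteness, the following Python sketch illustrates one way the study and test lists described above could be assembled: 12 blocks of 10 studied nouns each, with 5 of the studied nouns and 5 unstudied nouns reused as test items per block. The function name, the seeded random generator, and the assumption that the new words come from the same pool are ours, not details reported in the paper.

import random

def build_blocks(nouns, n_blocks=12, n_study=10, n_old=5, n_new=5, seed=0):
    """Assemble study/test blocks from a pool of nouns (illustrative only)."""
    assert len(nouns) >= n_blocks * (n_study + n_new)
    rng = random.Random(seed)
    pool = nouns[:]
    rng.shuffle(pool)
    blocks = []
    for b in range(n_blocks):
        study = pool[b * n_study:(b + 1) * n_study]        # 10 studied items
        old = rng.sample(study, n_old)                      # 5 old test items
        start = n_blocks * n_study + b * n_new
        new = pool[start:start + n_new]                     # 5 unstudied test items
        test = old + new
        rng.shuffle(test)                                   # randomize test order
        blocks.append({"study": study, "test": test})
    return blocks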

2.3. Procedure

Each participant sat in a comfortable chair in a dimly illuminated, shielded, acoustically isolated chamber, with the eyes level with the stimulus and 150 cm from the computer screen. To limit eye-blink and other EEG artifacts, participants were instructed to move or blink their eyes as infrequently as possible while a stimulus was on the screen; they were allowed to blink quickly while the black screen was presented. To ensure that the experiment ran smoothly, we explained the experimental content, procedure, and task requirements to the participants in advance. In addition, a practice study-test block was performed before the experiment began so that participants were familiar with the procedure. Each study-test block began with the English word "Study" presented at the center of the screen for 2000 ms, followed by a black screen for 1000 ms; this cued the participant that the study phase was about to begin. Then 10 picture-words or 10 words were presented one by one for 2000 ms each, with a 1000 ms interval (black screen) between stimuli. The participant was asked to memorize the presented stimuli for a subsequent memory test (Figure 1).

In the test phase of both experiments, only words were presented as stimuli. Each test block included 5 studied words (old items) that had been presented in the study phase and 5 new words (new items). Each test block started with the English word "Test" for 2000 ms, followed by a black screen for 1000 ms. Subsequently, each word was presented on the screen for 2000 ms, followed by a black screen for 1000 ms. The participant was asked to say "Yes" (if he or she thought the stimulus had been seen during study) or "No" (if he or she thought it had not) while the black screen was presented. Each study-test block lasted 66 seconds in total (3 s for each cue plus ten 3-s study trials and ten 3-s test trials; Figure 1). The next block started after a short rest, and each experiment lasted about 15 minutes. The two experiments were counterbalanced across participants.

Figure 1. The experimental block design of the picture-word condition.
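As a cross-check of the 66-second block duration stated above, the following minimal Python sketch lays out one study-test block as (event, duration) pairs using only the timing parameters given in the text; the event labels and the function name are ours.

def block_timeline(n_items=10, stim_ms=2000, blank_ms=1000):
    """Return one study-test block as a list of (label, duration in ms) events."""
    events = [("cue:Study", stim_ms), ("blank", blank_ms)]
    for i in range(n_items):
        events += [(f"study_{i}", stim_ms), ("blank", blank_ms)]
    events += [("cue:Test", stim_ms), ("blank", blank_ms)]
    for i in range(n_items):
        events += [(f"test_{i}", stim_ms), ("blank", blank_ms)]
    return events

total_s = sum(duration for _, duration in block_timeline()) / 1000.0
print(total_s)  # 66.0, matching the 66-second block duration reported above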

2.4. EEG recording

We recorded the raw electroencephalogram (EEG) with EEG recording software (Kissei Comtec Co., Ltd.) from 19 electrodes located at symmetrical sites on the scalp (Fp1, Fp2, F3, F4, C3, C4, P3, P4, O1, O2, F7, F8, T3, T4, T5, T6, Fz, Cz, and Pz) according to the international 10-20 system. Vertical and horizontal electro-oculograms (VEOG and HEOG) were recorded with two pairs of electrodes, one placed above and below the left eye and the other on the outer canthi of the two eyes. Linked earlobes served as the reference, and the ground electrode was placed midway between Fpz and Fz. The EEG and electro-oculogram (EOG) were both sampled at 250 Hz; the EEG band-pass was set to 0.16-60 Hz and the EOG band-pass to 0.16-15 Hz. Electrode impedance was kept below 5 kΩ. The continuous raw EEG data were stored on disk for later off-line averaging.
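For readers who wish to reproduce comparable preprocessing, here is a minimal SciPy sketch of a 0.16-60 Hz band-pass filter at the 250 Hz sampling rate reported above; the Butterworth filter type, the filter order, and the zero-phase filtering are our assumptions, since the original recording software's filter is not specified in those terms.

import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250.0  # sampling rate in Hz, as reported in the text

def bandpass(eeg, low=0.16, high=60.0, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter for a single EEG channel."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg)

# Example on 10 s of simulated single-channel data.
simulated = np.random.randn(int(10 * FS))
filtered = bandpass(simulated)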

2.5. Statistical analyses

The recorded raw EEG data in each stimulus condition were averaged off-line to obtain grand-averaged ERP waveforms using Kissei Comtec's BIMUTAS II analysis software. Eye movements and other artifacts were rejected off-line. The analysis period for each trial was 1700 ms, including a 200 ms pre-stimulus baseline, and was divided into four time windows: 100-250 ms, 250-400 ms, 400-600 ms, and 600-900 ms. The latency and mean amplitude of the N170, P300, FN400, and LPC in these four windows were calculated for each condition. The behavioral data were analyzed with a two-way ANOVA examining response accuracy for the two conditions (picture-word, word). The significance level was set at P < 0.05.

The data analysis was performed with SPSS 15, and the results were analyzed using three-way ANOVAs. Significant main effects and interactions were followed up with post hoc analyses using the Tukey honestly significant difference (HSD) test. The encoding ERPs were compared between the two conditions (picture-word vs. word); the three factors during encoding were condition (picture-word, word), hemisphere (right, left), and location (frontal, central, parietal). Two kinds of comparisons were performed for retrieval: (1) comparisons between conditions, with the factors condition (old picture-word item vs. old word item), hemisphere, and location; and (2) comparisons within each condition, with the factors item (old vs. new), hemisphere, and location.

For each ERP component (N170, P300, FN400, LPC), a three-way ANOVA was performed to determine whether significant differences in latency or amplitude existed. If significant differences were found, independent-samples t-tests were used to detect the scalp distributions of the differences. The significance level was set at P < 0.05.
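As an illustration of this analysis pipeline, the following Python sketch computes the mean amplitude of an averaged ERP trace in each of the four time windows and runs a condition × hemisphere × location ANOVA on a long-format table with statsmodels. The column names, the ordinary-least-squares ANOVA (rather than the SPSS procedure actually used), and the example data layout are our assumptions.

import numpy as np
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

FS = 250.0          # sampling rate (Hz)
PRESTIM_MS = 200    # pre-stimulus baseline included in each 1700 ms epoch
WINDOWS = {"N170": (100, 250), "P300": (250, 400),
           "FN400": (400, 600), "LPC": (600, 900)}

def window_mean(erp, component):
    """Mean amplitude of an averaged ERP trace in one post-stimulus window."""
    lo, hi = WINDOWS[component]
    start = int((PRESTIM_MS + lo) * FS / 1000)
    stop = int((PRESTIM_MS + hi) * FS / 1000)
    return float(erp[start:stop].mean())

def three_way_anova(df, dv):
    """Condition x hemisphere x location ANOVA on mean amplitudes.

    df is a hypothetical long-format pandas DataFrame with one row per
    participant, condition, hemisphere, and location, and a numeric column dv."""
    model = smf.ols(f"{dv} ~ C(condition) * C(hemisphere) * C(location)",
                    data=df).fit()
    return anova_lm(model, typ=2)

# Example: mean FN400 amplitude of a simulated 1700 ms averaged epoch.
epoch = np.random.randn(int(1.7 * FS))
fn400_mean = window_mean(epoch, "FN400")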

3. Results

3.1. Behavioral data

Response accuracy in the test phase was 91.5% for the picture-word condition and 87.3% for the word condition. Response accuracy was higher for the old items of the picture-word condition than for those of the word condition [t(18) = 2.957, P = 0.008]. The behavioral data suggest that the Chinese bilinguals recognized the studied picture-words more accurately than the studied words during the test phase.
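Since each condition was run on a separate group of ten participants, the accuracy comparison above is an independent-samples t-test with 18 degrees of freedom. A minimal SciPy sketch follows; the per-participant hit rates are invented placeholders (chosen only so that the group means equal the reported 91.5% and 87.3%), because the individual values are not reported in the paper.

import numpy as np
from scipy.stats import ttest_ind

# Hypothetical per-participant hit rates for old items (10 participants per group);
# placeholders only, constructed so the group means match the reported accuracies.
acc_picture_word = np.array([0.95, 0.92, 0.90, 0.93, 0.91, 0.89, 0.94, 0.90, 0.92, 0.89])
acc_word = np.array([0.88, 0.85, 0.90, 0.87, 0.86, 0.89, 0.84, 0.88, 0.87, 0.89])

t_stat, p_value = ttest_ind(acc_picture_word, acc_word)  # df = 10 + 10 - 2 = 18
print(f"t(18) = {t_stat:.3f}, p = {p_value:.3f}")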

3.2. ERP results

3.2.1. Study phase (encoding).

The encoding ERPs during the study phase were compared between the two conditions: picture-word versus word. The latency and mean amplitude of the N170, P300, FN400, and LPC were calculated for each condition, and a three-way ANOVA of condition (picture-word, word) × hemisphere (right, left) × location (frontal, central, parietal) was used to assess effects on the latency and mean amplitude of each ERP component. There were significant main effects of condition on the latency and mean peak amplitude of the FN400 [latency: F(1,36) = 38.34, P < 0.0001; amplitude: F(1,36) = 36.18, P < 0.0001] and the LPC [latency: F(1,36) = 11.64, P = 0.002; amplitude: F(1,36) = 24.75, P < 0.0001]. Post hoc tests indicated significant condition × location interactions for the FN400 and LPC in both latency and amplitude [FN400 latency: F(2,35) = 58.72, P < 0.0001; FN400 amplitude: F(2,35) = 58.83, P < 0.0001; LPC latency: F(2,35) = 26.52, P < 0.0001; LPC amplitude: F(2,35) = 55.973, P < 0.0001]. The ANOVA results revealed that the FN400 was more negative and lasted longer for the picture-word than for the word, showed a frontal-central scalp distribution (frontal > posterior) in both conditions, and was not observed over the occipital sites in either condition (Figure 2).

Figure 2. ERP waveform differences between the two conditions in the study phase.

In contrast, the LPC was more positive and more broadly distributed for the word than for the picture-word condition; it was maximal over the occipital sites, showed a posterior dominance (posterior > central), and was not observed at the frontal electrode sites in either condition (Figure 2). The ANOVA failed to show a significant condition × hemisphere interaction. To capture the encoding ERP differences between the two conditions, t-tests were used to compare the FN400 and LPC amplitudes at each electrode. The t-tests revealed that the significant FN400 amplitude differences between the two conditions were broadly distributed over the frontal-central sites, whereas the significant LPC amplitude differences were restricted to the central-parietal sites Cz, Pz, P3, and P4. Tables 1 and 2 show the significant FN400 and LPC amplitude differences between the two conditions.

Table 1. Amplitude values of the FN400 (µV) for the two conditions and t-test results.

Electrode   P-W     W       t
Fp1        -4.40   -2.16    4.12****
Fp2        -4.30   -2.48    3.65***
F3         -2.84   -0.55    4.23****
F4         -3.28   -0.75    4.41****
C3         -1.88    0.69    4.18****
C4         -2.06    0.51    4.45****
P3          0.31    1.86    2.12**
P4          0.06    2.18    2.98**
F8         -2.96   -0.57    4.27****
T4         -1.53    0.07    3.11**
Fz         -3.72   -0.97    3.46***
Cz         -2.84   -0.03    3.53***
Pz         -0.81    1.53    2.30*

Significance levels: **** P < 0.001; *** P < 0.005; ** P < 0.01; * P < 0.05.

Table 2. Amplitude differences of the LPC between the two conditions in the study phase (bar chart; x-axis: electrodes Cz, P3, P4, Pz, O1, O2; y-axis: voltage in µV; bars compare the picture-word and word conditions). Significance levels: **** P < 0.001; *** P < 0.005; ** P < 0.01; * P < 0.05.

3.2.2. Test phase (retrieval)

The ERPs recorded in the test phase were computed for the old and new (studied and unstudied) items of the picture-word and word conditions. We measured the mean amplitude and latency of the N170, P300, FN400, and LPC in four time windows (100-250 ms, 250-400 ms, 400-600 ms, 600-900 ms). We compared the old/new effects in three cases (within the picture-word condition: old vs. new items; within the word condition: old vs. new items; between conditions: old picture-word items vs. old word items). A three-way ANOVA of condition × hemisphere × location was performed for each time window, and independent-samples t-tests were used to detect the scalp distributions of any significant ERP differences.

3.2.2.1 ERP comparisons of the old items between the two conditions

We compared the ERPs elicited by the old picture-word and old word items during retrieval. A three-way ANOVA of condition × hemisphere × location found a significant main effect of condition on P300 amplitude [F(1,36) = 5.78, P = 0.02]: the P300 amplitude was markedly greater for the old picture-word than for the old word (Figure 3). Independent-samples t-tests revealed that significant differences in P300 amplitude existed only at Cz [t(18) = 4.733, P < 0.0001] and Pz [t(18) = 6.033, P < 0.0001]. These results indicate that the old picture-word elicited a more positive P300 than the old word over the central-parietal electrode sites, reflecting more active reactivation of the perceptual representation of the old picture-word than of the old word.

Figure 3. Topographic distributions at the peak latency of each ERP component in the test phase. Maximal positivity or negativity is indicated by dark red or blue.

The amplitude of the FN400 was greater for the old word than for the old picture-word, which revealed an old/new effect for the old picture-word, although the ANOVA failed to find a significant difference in FN400 amplitude between the two conditions. Post hoc tests indicated significant condition × location interactions for the N170 [F(2,35) = 44.42, P < 0.0001], the P300 [F(2,35) = 29.09, P < 0.0001], the FN400 [F(2,35) = 41.42, P < 0.0001], and the LPC [F(2,35) = 111.2, P < 0.0001], indicating that the scalp distributions of the ERP components differed: the N170 and the FN400 were mainly distributed over the frontal electrode sites, whereas the P300 and the LPC were mainly distributed over the posterior areas, particularly the parietal and occipital sites.

3.2.2.2 ERP comparisons of the old vs. new items within the picture-word condition

We compared the ERPs elicited by the studied (old) and unstudied (new) words during retrieval within the picture-word condition. A three-way ANOVA of item (old, new) × hemisphere × location found significant main effects of item on the amplitudes of the P300 [F(1,36) = 17.28, P < 0.0001], the FN400 [F(1,36) = 25.18, P < 0.0001], and the LPC [F(1,36) = 20.26, P < 0.0001]. The amplitudes of the P300 and the LPC were greater for the old than for the new items (Figure 4). T-tests showed that the significant differences in LPC amplitude were located at Pz [t(18) = 2.38, P = 0.029], O1 [t(18) = 2.341, P = 0.031], and O2 [t(18) = 2.135, P = 0.047]. Table 3 shows the FN400 amplitude values for the old/new items and the t-test results; the FN400 amplitude was much greater for the new than for the old items. The ANOVA also revealed significant main effects of item on the latencies of the FN400 [F(1,36) = 528.8, P < 0.0001] and the LPC [F(1,36) = 202.5, P < 0.0001], reflecting that the FN400 and LPC of the old items started about 100-200 ms earlier (Figure 4). Post hoc tests indicated significant item × location interactions in amplitude for the N170 [F(2,35) = 37.5, P < 0.0001], the P300 [F(2,35) = 34.1, P < 0.0001], the FN400 [F(2,35) = 41.7, P < 0.0001], and the LPC [F(2,35) = 128.4, P < 0.0001]. Together, these results reveal significant old/new effects in the picture-word condition, indicating that participants recognized and recollected the old items better and faster than the new items.

Figure 4. Grand average waveforms elicited by the old/new picture-words in the test phase.

3.2.2.3 ERP comparisons of the old vs. new items within the word condition

We compared the ERPs elicited by the old and new items during retrieval within the word condition. A three-way ANOVA of item (old, new) × hemisphere × location revealed significant main effects of item on the amplitudes of the N170 [F(1,36) = 4.45, P = 0.043] and the P300 [F(1,36) = 17.93, P < 0.0001]. Moreover, post hoc tests indicated significant item × location interactions in amplitude for each ERP component (P < 0.0001). Figure 5 shows the topographic distributions of each ERP component elicited by the old and new items. The N170 and P300 amplitudes were significantly greater for the new items than for the old items. The FN400 and LPC amplitudes for the old items were similar to those for the new items, but the ANOVA revealed a significant main effect of item on FN400 latency [F(1,36) = 234.3, P < 0.0001]. The latencies of the FN400 and the LPC were 100-200 ms earlier for the old than for the new items, reflecting much faster recollection of the old items due to their familiarity.

Figure 5. Topographic distributions at the peak latency of each ERP component elicited by the old and new words in the test phase.

Independent-samples t-tests showed that the N170 amplitude elicited by the new items was more positive than that elicited by the old items, with significant differences at P3 [t(18) = 2.237, P = 0.031] and Pz [t(18) = 2.286, P = 0.035]. The P300 amplitude elicited by the new items was much greater than that elicited by the old items, with significant differences over the central-parietal sites, e.g. C3 [t(18) = 3.27, P = 0.004], P3 [t(18) = 2.74, P = 0.014], O1 [t(18) = 2.24, P = 0.038], Cz [t(18) = 2.54, P = 0.02], and Pz [t(18) = 2.46, P = 0.024]. Moreover, significant differences in LPC amplitude were observed at O1 [t(18) = 2.341, P = 0.031] and O2 [t(18) = 2.135, P = 0.047]; the LPC amplitude was much greater for the old items than for the new items, reflecting a parietal old/new effect.

4. Discussion

In the present study we sought to understand the neural mechanisms of the memorial picture superiority effect by comparing the ERPs elicited by the picture-word and word stimuli in both the encoding and retrieval phases.

4.1. Comparisons of picture-word and word encoding at the study phase

During encoding, although the ANOVA failed to show a statistically significant difference in N170 amplitude between the two conditions, the N170 amplitude was numerically smaller for the picture-word than for the word condition over the right frontal and occipital electrode sites. Because the picture-word stimulus combined a picture with its corresponding word, participants performed image and verbal encoding simultaneously during encoding. The visual areas were more strongly activated by the abundant perceptual information of the picture-word stimulus, and this more active perceptual encoding reduced the visual processing load, inducing a smaller N170 amplitude for the picture-word than for the word. We predicted that the active perceptual encoding of the picture-word would elicit greater familiarity during retrieval than the word.

Comparing picture-word with word encoding, we found that the FN400 was more negative and lasted longer for the picture-word stimulus than for the word stimulus, and showed a frontal-central distribution. We interpret this finding as follows. We told the participants that there would be a subsequent memory test and asked them to remember the presented stimuli during encoding. As soon as they saw a picture-word stimulus, participants could grasp the meaning of the word at once. They performed image and verbal encoding simultaneously and did not need to carry out deep semantic processing of the picture-word stimulus. At the same time, they automatically used the abundant perceptual information of the picture to perform image encoding and employed associative-elaboration memory strategies, remembering the meaning of the word by relying on their memory of the picture's appearance. In this way, the studied picture-word was stored in memory with a more precise meaning and a richer representation, so recollection of the studied picture-word was easier than that of the studied word during subsequent retrieval. Because participants performed image and verbal encoding simultaneously for the picture-word stimuli, the ERPs elicited by the two kinds of stimuli were the same over the ventral stream, which is essential for semantic processing. In contrast, the right frontal lobe, which is dominant for image processing, was more strongly activated by the encoding of the perceptual features of the picture. Hence, in the present study, the FN400 amplitude elicited by the picture-word at the study phase was greater than that elicited by the word. Several studies have reported that activation during the encoding of pictorial stimuli was greater in the right than in the left inferior frontal cortex and could predict later memory: encoding activation for pictures was greater for items that were subsequently remembered than for items that were forgotten [11, 12, 13]. Furthermore, right frontal effects may be correlated with elaborative encoding strategies and associative processes.

The LPC was more positive and more broadly distributed for the word condition than for the picture-word condition, and this difference was mainly distributed over the midline electrode sites. We interpret the larger LPC as possibly reflecting greater semantic processing during word encoding: participants may have used a rote encoding strategy to remember the words for the subsequent memory test, which would increase the encoding load and affect later recognition performance. Based on the above, we suggest that the more active encoding of the picture-word led to better subsequent familiarity and recollection than the word.

4.2. Comparisons of the old/new effect within conditions at the test phase

At the test phase of the picture-word condition, the ANOVA showed that the P300 amplitude elicited by the old items was much greater than that elicited by the new items, the FN400 amplitude elicited by the old items was markedly smaller than that elicited by the new items, and the LPC amplitude elicited by the old items was much greater than that elicited by the new items. In addition, the P300, the FN400, and the LPC appeared earlier for the old items than for the new items. Previous research has reported that the early frontal FN400 effect is likely related to familiarity [3, 4, 14]. In line with that research, we found that the FN400 was more positive and shifted earlier for the old than for the new items, suggesting greater familiarity for the old items. Because both image and verbal encoding were performed for the picture-word, a richer perceptual representation was stored in memory, and this stored representation was reactivated by the repeated old items. The FN400 amplitude was much smaller for the old items than for the new items owing to the familiarity of the old items; the new items, presented for the first time, required semantic processing and therefore elicited a much greater FN400 amplitude. Previous research has also proposed that the parietal effect is associated with recollection [4, 14, 15]. In the present study, t-tests showed that the significant differences in LPC amplitude were distributed only over the parietal-occipital electrodes Pz, O1, and O2. The LPC was markedly greater for the old items than for the new items; moreover, it appeared about 170 ms earlier for the old items, and its duration was much shorter for the old than for the new items. All these results suggest that the old items showed a marked parietal effect, that is, recollection was better for the old than for the new items. Why was the parietal effect briefer and more localized in the present study when recollection was better? One possible explanation is that how long the parietal effect lasts and how localized it is may depend on the amount of time and resources needed to achieve recollection: when recollection is easier, less time and fewer resources are required than when recollection is more difficult. Therefore, we suggest that the old items elicited not only a significant FN400 familiarity effect but also a marked parietal effect. We attribute all of these findings to the facilitative influence of the picture superiority effect during encoding and retrieval, which helped participants recognize or recollect the old items more quickly and easily than the new items.

At the test phase of the word condition, the ANOVA showed significant differences between the old and new items in the amplitudes of the N170 and the P300 and in the latencies of the FN400 and the LPC, but it failed to show statistically significant differences in the amplitudes of the FN400 and the LPC. These results demonstrate that the old items were recollected faster than the new items, but they do not reveal stronger FN400 familiarity activation or parietal recollection activation for the old items in the present study.

4.3. Comparisons of the old/new effect between the two conditions (old picture-word versus old word) at retrieval

To identify the picture superiority effect more clearly, we compared the ERPs between the two conditions during retrieval. We found that the P300 amplitude was greater for the old picture-word than for the old word (Figure 3); t-tests revealed that the amplitude differences were located at Cz and Pz. Some studies have reported that pictures have greater perceptual and conceptual distinctiveness than words [6, 9, 16], and our results support this view. In the present study, the encoding of pictorial and verbal stimuli automatically progressed to a semantic level, leading to the storage of conceptually based information in addition to detailed perceptual information. The availability of both types of information benefits the recognition of pictures, resulting in the picture superiority effect. Thus, the greater P300 amplitude elicited by the old picture-word items during retrieval indicates that pictures had greater perceptual and conceptual distinctiveness than words.

Pictures were remembered better because they were more likely to be represented by both image and verbal codes, increasing the probability that they would be recollected compared with words [7, 8]. In the present study, although the ANOVA failed to show a statistically significant difference in FN400 amplitude between the two conditions, we found that the FN400 amplitude was greater for the old words than for the old picture-words over the frontal-central sites. This demonstrates that the FN400 familiarity effect was more evident in the picture-word condition than in the word condition. The picture superiority effect facilitated better recognition and recollection of the old items of the picture-word condition during the later memory test.

5. Conclusion

In the present study, we compared the ERPs elicited by the picture-word and word conditions during encoding and retrieval to investigate the picture superiority effect. During encoding, the FN400 was more negative and lasted longer for the picture-word than for the word, indicating that the more active encoding of the picture-word led to better subsequent familiarity and recollection than the word. The larger LPC for the word may be related to greater semantic processing during word encoding, which increased the encoding load and affected later recognition performance. During retrieval, the old picture-words elicited a markedly larger FN400 familiarity effect and parietal old/new effect than the old words. We suggest that the simultaneous image and verbal encoding of the picture-words elicited better and faster recollection than the words. Our findings also demonstrate that the picture superiority effect is related to the ability of pictures to enhance encoding and facilitate recollection. To understand the picture superiority effect better, future work will use color and black-and-white photographs as stimuli to investigate the recognition-memory advantage of color.

Acknowledgements

This research has been partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Challenging Exploratory Research, 21650030.

References

[1] A. D. Baddeley, Working memory. Science, 255, 1992, pp. 556-559.

[2] D. L. Nelson, V. S. Reed, and J. R. Walling, Pictorial superiority effect. Journal of Experimental Psychology: Human Learning & Memory, 2, 1976, pp. 523-528.

[3] T. Curran, Brain potentials of recollection and familiarity. Memory & Cognition, 28, 2000, pp. 923-938.

[4] C. C. Woodruff, H. R. Hayama, and M. D. Rugg, Electrophysiological dissociation of the neural correlates of recollection and familiarity. Brain Research, 1100, 2006, pp. 125-135.

[5] A. Paivio, Imagery and Verbal Processes. Holt, Rinehart and Winston, New York, 1971.

[6] M. Z. Mintzer and J. G. Snodgrass, The picture superiority effect: support for the distinctiveness model. American Journal of Psychology, 112, 1999, pp. 113-146.

[7] A. Paivio, Mental Representations: A Dual Coding Approach. Oxford University Press, England, 1986.

[8] A. Paivio, Dual coding theory: retrospect and current status. Canadian Journal of Psychology, 45, 1991, pp. 255-287.

[9] D. L. Nelson, Remembering pictures and words: appearance, significance, and name. In: L. S. Cermak and F. I. M. Craik (Eds.), Levels of Processing in Human Memory. Erlbaum, Hillsdale, NJ, 1979, pp. 45-76.

[10] J. B. Brewer, Z. Zhao, J. E. Desmond, G. H. Glover, and J. D. Gabrieli, Making memories: brain activity that predicts how well visual experience will be remembered. Science, 281(5380), 1998, pp. 1185-1187.

[11] M. D. Rugg, P. C. Fletcher, P. M. Chua, and R. J. Dolan, The role of the prefrontal cortex in recognition memory and memory for source: an fMRI study. NeuroImage, 10(5), 1999, pp. 520-529.

[12] A. D. Wagner, D. L. Schacter, M. Rotte, W. Koutstaal, A. Maril, A. M. Dale, B. R. Rosen, and R. L. Buckner, Building memories: remembering and forgetting of verbal experiences as predicted by brain activity. Science, 281(5380), 1998, pp. 1188-1191.

[13] B. A. Ally and A. E. Budson, The worth of pictures: Using high density event-related potentials to understand the memorial power of pictures and the dynamics of recognition memory. NeuroImage, 35, 2007, pp. 378-395.

[14] D. Friedman and R. Johnson Jr., Event-related potential (ERP) studies of memory encoding and retrieval: a selective review. Microscopy Research and Technique, 51, 2000, pp. 6-28.

[15] G. Stenberg, K. Radeborg, and L. R. Hedman, The picture superiority effect in a cross-format recognition task. Memory & Cognition, 23, 1995, pp. 425-441.

[16] M. S. Weldon and H. L. Roediger, Altering retrieval demands reverses the picture superiority effect. Memory & Cognition, 15, 1987, pp. 269-280.