Supplementary Materials

Section 1. Supplementary results: Coordinate information for three-cluster solution
Suppl. Table S1. Activation coordinates pooled meta-analyses
Suppl. Table S2. Activation coordinates linear contrasts
Section 2. Comparing parameters for image correlation
Suppl. Figure S1. Image correlation values
Section 3. Supplementary results: Additional meta-analytic contrasts
Suppl. Figure S2. Additional contrasts
Section 3. All task descriptions for Theory of Mind task groups
Suppl. Table S3.1. False Belief
Suppl. Table S3.2. Trait Judgments
Suppl. Table S3.3. Strategic Games
Suppl. Table S3.4. Social Animations
Suppl. Table S3.5. Mind in the Eyes
Suppl. Table S3.6. Rational Actions
References for Theory of Mind meta-analyses
Section 4. All task descriptions for empathy task groups
Suppl. Table S4.1. Observing Pain
Suppl. Table S4.2. Observing Emotions
Suppl. Table S4.3. Sharing Emotions or Pain
Suppl. Table S4.4. Evaluating Situated Emotions
Suppl. Table S4.5. Reasoning About Emotions
References for the empathy meta-analyses

Section 1. Supplementary Results Tables

Supplementary Table S1. Activation coordinates for the three-cluster solution. Pooled meta-analyses.

Cluster peak Sub-peaks

Label x y z z-val. vx x y z Label

Cluster 1 (pooled analysis)

1 R Ant. cing. g. 4 44 24 13.43 14961 2 36 -8 R med. front. g.

0 56 10 Med. front. g.

2 -34 34 R mid. cing. g.

-2 -58 34 L precuneus

12 -56 42 R precuneus

2 R mid. temp. g. 52 -62 18 7.01 3429 52 -42 24 R sup. temp. g.

52 -44 38 R supramarg. g.

58 -40 -12 R inf. temp. g.

3 R sup. temp. g. 52 -8 -4 5.76 4362 48 -4 -12 R sup. temp. g.

44 12 -14 R temp. pole

52 -4 -32 R inf. temp. g.

4 L mid. temp. g. -50 -64 24 6.66 2642 -54 -50 12 L mid. temp. g.

-50 -54 42 L inf. par. l.

6 L mid. temp. g. -50 -16 -14 4.08 1328 -60 -18 -22 L mid. temp. g.

-42 2 -26 L mid. temp. g.

7 R caudate/undef. 8 4 4 3.61 544

Cluster 2 (pooled analysis)

1 L mid. temp. g. -54 -58 16 6.50 6496 -64 -44 32 L supramarg. g.

-58 -24 -4 L mid. temp. g.

-38 12 0 L insula

-40 26 0 L inf. front. g. tri.

2 R temp. pole 50 16 -8 7.91 9873 58 -46 14 R sup. temp. g.

54 -42 40 R supramarg. g.

50 44 6 R inf. front. g. tri.

50 44 -10 R inf. front. g. orb.

40 16 0 R insula

3 R med. front. g. 2 38 34 6.99 4699 2 44 50 R med. front. g.

0 6 34 Mid. cing. g.

0 38 -6 L ant. cing. g.

4 L precuneus 0 -48 38 6.40 3358 10 -70 42 R precuneus

-6 -68 22 L calcarine s.

5 R caudate 8 14 0 4.95 885 16 20 8 R caudate

6 L cerebellum -24 -66 -22 5.05 223

7 R cerebellum 28 -72 -18 4.57 30

8 L inf. front. g. op. -52 10 24 4.44 18

9 R pallidum 16 2 0 4.91 15

Cluster 3 (pooled analysis)

1 R inf. front. g. op. 48 20 8 8.93 13961 52 4 8 R rolandic operc.

44 12 -10 R insula

56 -10 36 R postcentr. g.

58 4 26 R precentr. g.

60 -38 26 R supramarg. g.

56 -58 4 R mid. temp. g.

2 L inf. front. g. -48 18 10 7.36 10612 -34 16 -6 R insula

-52 2 36 L precentral

-48 -32 44 L inf. par. l.

-60 -34 22 L sup. temp. g.

-54 -58 4 L mid. temp. g.

3 L suppl. mot. a. -6 4 48 7.14 5604 -2 26 38 L med. front. g.

6 18 54 R suppl. mot. a.

0 -26 36 L mid. cing. g.

4 L inf. occ. g. -28 -88 -10 4.85 419

5 L cerebellum -30 -68 -24 3.85 163

Supplementary Table S2. Activation coordinates for the three-cluster solution. Linear contrasts.

Cluster peak Sub-peaks

Label x y z z-val. vx x y z Label

Cluster 1&2 > 3

1 R precuneus 6 -66 42 6.30 3311 8 -56 40 R precuneus

-6 -62 26 L precuneus

2 L angular g. -52 -60 24 5.05 2143 -42 -68 36 L angular g.

-50 -60 42 L angular g.

3 R mid. temp. g. 56 -10 -22 4.12 1597 54 -2 -26 R mid. temp. g.

52 -10 -16 R mid. temp. g.

4 R angular g. 60 -54 32 5.12 1591 56 -54 28 R angular g.

50 -50 30 R angular g.

5 L med. front. g. 0 44 44 4.02 894 -4 54 34 L med. front. g.

6 L mid. temp. g. -60 -14 -18 3.74 821

7 L cerebellum -24 -86 -36 1.95 95

Cluster 2&3 > 1

1 L insula -36 16 0 4.37 712 -44 30 -2 L inf. front. g. orb.

-52 18 16 L inf. front. g. op.

2 R inf. front. g. op. 56 16 6 3.17 449 48 34 12 R inf. front. g. tri.

3 L suppl. mot. a. -8 6 60 2.83 297 6 8 58

4 L prec. g. -48 8 38 3.14 206 -46 10 32

5 L sup. temp. g. -58 -34 20 2.92 133

R inf. occ. g. 24 -96 0 2.82 109

R inf. temp. g. 52 -64 -4 2.25 47

L mid. temp. g. -54 -62 0 2.58 45

R sup. temp. g. 60 -26 14 2.31 11

Section 2. Comparing parameters for image correlation

We carried out different versions of the image correlation analysis to determine which parameters discriminate best between different maps. More concretely, we assessed how well different parameter settings can distinguish between meta-analysis maps derived from studies belonging to the same task group and maps derived from studies belonging to different task groups. Good discrimination is indicated by high image correlation between maps from the same task group and low correlation between maps from different task groups.

To implement this analysis, we chose the two ToM task groups False Belief and Reading Mind in the Eyes as test cases. As detailed in an earlier meta-analysis (Schurz et al., 2014), these two groups show both overlapping activations (common 'core areas') and distinct areas of activation (e.g., left inferior frontal and precentral gyri for Reading Mind in the Eyes; see also Figure 1 in the main text). We therefore reasoned that they would be good candidates for testing the discriminative success of different parameters. We divided the samples from these two task groups (False Belief n=25, Reading Mind in the Eyes n=15) each into two random sub-samples. To balance sample size, we drew two random sub-samples of n=8 each for the False Belief task group, and divided the Reading Mind in the Eyes task group into two randomized halves of n=7 and n=8. We then ran separate meta-analyses for the four sub-samples and calculated pairwise correlations between sub-samples drawn from either (i) the same task group or (ii) different task groups. We repeated this procedure 100 times with new random splits (i.e., running 400 new meta-analyses) and compared the correlation results for different parameterizations and thresholdings, which we describe next.

First, we compared different types of result maps generated by the Anisotropic Effect-Size Signed Differential Mapping (AES-SDM 4.3) meta-analysis. These were (i) Hedges' g maps (representing the effect size of the brain response to the task, also referred to as d maps), (ii) SDM z maps (representing these effect sizes divided by their standard errors), and (iii) thresholded SDM z maps (in which voxels that did not reach statistical significance were set to zero, as shown for example in Figure 1 in the main text). Second, we compared two correlation methods, (i) Pearson and (ii) Spearman correlation.
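For illustration, the following minimal sketch shows how the pairwise image correlations for one split-half iteration could be computed. It uses generic NumPy/SciPy routines on simulated random maps rather than the AES-SDM software, and all array and variable names are placeholders for the actual sub-sample meta-analysis maps.

```python
# Minimal sketch of one split-half iteration of the image correlation analysis.
# The four sub-sample meta-analysis maps are simulated here as random 1-D
# voxel-value arrays; in the actual analysis they would be AES-SDM output maps
# (Hedges' g / d maps, or unthresholded / thresholded z maps).
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
n_voxels = 10000

# Two sub-sample maps per task group (False Belief: fb1, fb2; Eyes: eye1, eye2).
fb1, fb2 = rng.normal(size=n_voxels), rng.normal(size=n_voxels)
eye1, eye2 = rng.normal(size=n_voxels), rng.normal(size=n_voxels)

# Map pairs drawn from the same task group vs. from different task groups.
same_pairs = [(fb1, fb2), (eye1, eye2)]
diff_pairs = [(fb1, eye1), (fb1, eye2), (fb2, eye1), (fb2, eye2)]

def mean_correlation(pairs, method="pearson"):
    """Average image correlation over a list of map pairs."""
    corr = pearsonr if method == "pearson" else spearmanr
    return float(np.mean([corr(a, b)[0] for a, b in pairs]))

for method in ("pearson", "spearman"):
    r_same = mean_correlation(same_pairs, method)
    r_diff = mean_correlation(diff_pairs, method)
    print(f"{method}: same-group r = {r_same:.3f}, different-group r = {r_diff:.3f}")
```

In the full analysis, this computation was repeated for each of the 100 random splits and for each combination of map type (thresholded z, unthresholded z, unthresholded g) and correlation method.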

For statistical inference, we z-transformed the obtained correlation coefficients and carried out a repeated-measures ANOVA to compare discrimination across the different parameters. We ran a 2 × 6 full-factorial repeated-measures ANOVA including the factors sample type (same task group, different task group) and parameter setting (Pearson thresholded z map, Pearson unthresholded z map, Pearson unthresholded g map, Spearman thresholded z map, Spearman unthresholded z map, and Spearman unthresholded g map), as well as their interaction. We found significant main effects of sample type, F(1,99)=7982.8, p<.001, and parameter setting, F(5,495)=5182.89, p<.001, as well as a significant interaction, F(5,495)=1819.9, p<.001. As shown in Supplementary Figure S1, the interaction reflected a pattern of (z-transformed) correlation values indicating particularly high discriminative performance for the Pearson method (the highest overall z-scores for maps in the section "Pearson correlation - Same task group", together with relatively low z-scores for maps in "Pearson correlation - Different task group"). To identify which map type works particularly well with Pearson correlation, we carried out a second analysis. Here, we first calculated a discrimination score for each of the three map types, defined as the difference between correlations for same and different task groups. We then carried out a one-way repeated-measures ANOVA comparing the discrimination scores between the three map types and found a significant main effect of map type, F(2,198)=277.5, p<.001. The highest discrimination score was found for 'Pearson correlation - Unthresholded d map' (z=1.15 for same task group, z=0.33 for different task group, resulting in a discrimination score of 0.82; see also Supplementary Figure S1). Pairwise comparisons (Bonferroni corrected) showed that this discrimination score was significantly higher than those of the other two map types, ps < .001.
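As a worked illustration of the discrimination score, the sketch below assumes that the z-transform applied to the correlation coefficients is the standard Fisher r-to-z transform (arctanh); the example correlation values are chosen only so that they roughly reproduce the reported z values of 1.15 and 0.33 and are not the actual analysis outputs.

```python
# Sketch of the discrimination-score computation, assuming the z-transform
# applied to the correlation coefficients is the Fisher r-to-z transform.
import numpy as np

def fisher_z(r):
    """Fisher r-to-z transform of a correlation coefficient."""
    return np.arctanh(r)

# Illustrative mean correlations for one map type, chosen only to roughly
# reproduce the reported z values of 1.15 and 0.33 (not actual results).
r_same_group = 0.817   # sub-samples drawn from the same task group
r_diff_group = 0.320   # sub-samples drawn from different task groups

z_same = fisher_z(r_same_group)
z_diff = fisher_z(r_diff_group)

# Discrimination score: difference between the z-transformed correlations for
# same-task-group and different-task-group comparisons.
score = z_same - z_diff
print(f"z(same) = {z_same:.2f}, z(different) = {z_diff:.2f}, score = {score:.2f}")
```

In the analysis above, this score was computed for each iteration and map type before entering the one-way repeated-measures ANOVA.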

Supplementary Figure S1. z-transformed image correlation values (mean ± 1 SD) found for the different methods we tested. The correlated images were meta-analysis maps computed from sub-samples of either the same task group or two different task groups. Bar colors indicate which type of meta-analysis map was used.

Section 3. Supplementary results: Additional meta-analytic contrasts

Supplementary Figure S2. Additional meta-analytic contrasts, showing brain activation for the contrasts opposite to those shown in the main manuscript. Selectively increased activations for the affective cluster are given by the contrast [Affect > Cog]&[Affect > Interm], and for the cognitive cluster by the contrast [Cog > Affect]&[Interm > Affect]. Maps were thresholded at a voxel-wise threshold of p < .005 (uncorrected) and a cluster extent threshold of 10 voxels.
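A rough sketch of how such a conjunction with a voxel-wise and cluster-extent threshold could be computed is given below. It uses plain NumPy/SciPy on simulated z maps and is not the AES-SDM implementation; the z cut-off for p < .005 is assumed here to be the one-tailed normal quantile, and the map names are placeholders.

```python
# Sketch: conjunction of two contrast maps with a voxel-wise p < .005 threshold
# (assumed one-tailed z cut-off) and a 10-voxel cluster-extent threshold.
# Generic NumPy/SciPy illustration on simulated data, not the AES-SDM code.
import numpy as np
from scipy.ndimage import label
from scipy.stats import norm

rng = np.random.default_rng(1)
shape = (40, 48, 40)                       # toy 3-D image grid
affect_gt_cog = rng.normal(size=shape)     # simulated z map for [Affect > Cog]
affect_gt_interm = rng.normal(size=shape)  # simulated z map for [Affect > Interm]

z_cut = norm.isf(0.005)    # one-tailed z cut-off for p < .005 (about 2.58)
min_cluster = 10           # cluster-extent threshold in voxels

# Conjunction: a voxel survives only if it exceeds the cut-off in both maps.
conj = (affect_gt_cog > z_cut) & (affect_gt_interm > z_cut)

# Keep only connected clusters of suprathreshold voxels with >= 10 voxels.
labels, _ = label(conj)
sizes = np.bincount(labels.ravel())
cluster_ids = np.flatnonzero(sizes >= min_cluster)
cluster_ids = cluster_ids[cluster_ids != 0]    # drop the background label 0
thresholded = np.isin(labels, cluster_ids)

print(f"{int(thresholded.sum())} voxels survive the thresholded conjunction")
```

On pure noise, as simulated here, essentially no voxels survive; the sketch only illustrates the mechanics of the two thresholding steps.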

Section 3. All task descriptions for Theory of Mind task groups

Supplementary Table S3.1. False Belief tasks (n = 25).

Authors n Experimental Condition Control Condition

Aichhorn et al., 2009

n=21 Read a short vignette involving a person with a false belief. Predict the behavior of this person based on her belief. e.g., 'Julia sees the ice cream van go to the lake. She doesn't see that the van turns off to the town hall. Therefore, Julia will look for the ice cream van at the …?' (lake or town hall).

Read a short vignette involving a photograph of the past, and a description of how things shown in the photo have since changed. Answer a question about the outdated scene shown in the photo. e.g., 'Julia takes a picture of the ice cream van in front of the pond. The ice cream van then moves to the market place; the picture gets developed. In the picture, the ice cream van is at the ...?' (pond or market place).

Andrews-Hanna et al., 2014

n=35 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Contreras et al., 2013a (exp. 1)

n=25 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Corradi-Dell’Acqua et al., 2014

n=46 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Dodell-Feder et al., 2011

n=32 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Hartwright et al., 2015

n=21 Read a short vignette involving a person with a false belief. Answer a question about her belief. Similar to Saxe & Kanwisher, 2003. Averaged across stories with high and low salience of self-perspective (i.e., participants know all details about the true state of affairs, or some details about reality are unclear).

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. Similar to Saxe & Kanwisher, 2003. Averaged across stories with high and low salience of self-perspective.

Heil et al., 2019

n=22 Read a short vignette involving a person with a false belief. Answer a question about her belief. Similar to Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. Similar to Saxe & Kanwisher, 2003.

Hughes et al., 2019

n=75 Read a short vignette involving a person with a false belief. Answer a question about her belief. After Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. After Saxe & Kanwisher, 2003.

Kliemann et al., 2008

n=26 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Lee et al., 2011

n=13 Read a short vignette involving a person with a false belief. Answer a question about her belief. e.g., 'David knows that Ethan is very scared of spiders. Ethan, alone in the attic, sees a shadow move and thinks it is a burglar. David hears Ethan cry for help. David assumes that Ethan thinks he has seen …?' (a spider or a burglar).

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. e.g., ‘Amy made a drawing of a treehouse three years ago. That was before the storm. We built a new treehouse last summer, but we painted it red instead of blue. The treehouse in Amy’s drawing is …?' (red or blue).

Lee & McCarthy, 2016

n=19 Read a short vignette involving a person with a false belief. Answer a question about her belief. After Dodell-Feder et al., 2011.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. After Dodell-Feder et al., 2011.

Mitchell, 2008

n=20 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Naughtin et al., 2017

n=22 Read a short vignette involving a person with a false belief. Answer a question about her belief. After Dodell-Feder et al., 2011.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. After Dodell-Feder et al., 2011.

Özdem et al., 2017

n=20 Read a short vignette involving a person with a false belief. Answer a question about her belief. After Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. After Saxe & Kanwisher, 2003.

Oliver et al., 2018

n=19 Read a short vignette involving a person with a false belief. Answer a question about her belief. After Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. After Saxe & Kanwisher, 2003.

Perner et al., 2006

n=19 Read a short vignette involving a person with a false belief. Predict the behavior of that person based on her belief. See Aichhorn et al., 2009.

Read a short vignette involving a photograph of the past, and a description of how things shown in the photo have since changed. Answer a question about the outdated scene shown in the photo. See Aichhorn et al., 2009.

Saxe & Kanwisher, 2003

n=21 Read a short vignette involving a person with a false belief. Answer a question about her belief. e.g., 'John told Emily that he had a Porsche. Actually, his car is a Ford. Emily doesn't know anything about cars so she believed John. When Emily sees John's car, she thinks it is a …?’ (Porsche or Ford).

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. e.g., ‘A photograph was taken of an apple hanging on a tree branch. The film took half an hour to develop. In the meantime, a strong wind blew the apple to the ground. The developed photograph shows the apple on the …?' (tree or ground).

Saxe & Powell, 2006

n=12 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Saxe et al., 2006

n=12 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Saxe & Wexler, 2005

n=12 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Young et al., 2007a (exp. 1)

n=10 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Young et al., 2007b (exp. 2)

n=17 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Young & Saxe, 2009

n=14 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Young et al., 2010

n=17 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Young et al., 2011

n=17 Read a short vignette involving a person with a false belief. Answer a question about her belief. See Saxe & Kanwisher, 2003.

Read a false-photograph vignette. Answer a question concerning the outdated content in the photo. See Saxe & Kanwisher, 2003.

Supplementary Table S3.2. Trait Judgments (n = 19).

Authors n Experimental Condition Control Condition

Contreras et al., 2013a (exp. 1)

n=25 View a picture of a group of people, and answer a trait-related question about an individual of that group, e.g., ‘Enjoy a long car ride?’

View a picture of a group of people, and answer a physical question about an individual of that group, e.g., ‘Stay afloat?’ (... this person is wearing a life raft).

Contreras et al., 2013b (exp. 2)

n=13 View a picture of a group of people, and answer a trait-related question about an individual of that group, e.g., ‘Enjoy a long car ride?’

View a picture of a group of people, and answer a physical question about an individual of that group, e.g., ‘Stay afloat?’ (... this person is wearing a life raft).

Craik et al., 1999

n=8 Read a personality trait adjective. Indicate how well it describes a Canadian politician (Brian Mulroney).

Read a personality trait adjective. Indicate if it contains two or more syllables.

Greven et al., 2016

n=21 View bodies or names together with a trait-based statement (e.g., ‘He helped the child look for her puppy’). Respond to varying questions after each statement (e.g., gender, body-orientation, name, trait-valence).

View bodies or names together with a neutral statement (e.g., ‘He sharpened his pencil’). Respond to varying questions after each statement (e.g., gender, body-orientation, name).

Heatherton et al., 2006

n=30 Read a personality trait adjective. Indicate how well it describes your best friend.

Read a personality trait adjective. Indicate if it is written in upper- or lowercase letters.

Lou et al., 2004

n=13 Read a personality trait adjective. Indicate how well it applies to the Danish Queen.

Read a personality trait adjective. Indicate if the number of syllables is odd or even.

Ma et al., 2011a

n=30 Read written statements conveying trait-diagnostic information about persons (describing behavior). Then read a single trait adjective and indicate whether it is consistent with the behavior of that person. e.g., 'Tolvan gave her sister a hug' … consistent with 'friendly'?

Read a written statement about a person doing something. This behavior is neutral and does not convey trait-diagnostic information about the person. Indicate the gender of the person in the sentence. e.g., 'Tolvan gave her mother a bottle' … is Tolvan male or female?

Mitchell et al., 2002

n=19 Read an adjective. Indicate whether it can be true for a hypothetical person. e.g., '"nervous" … can it be true for "David"?'

Read an adjective. Indicate whether it can be true for an object. e.g., '"sundried" … can it be true for "grape"?'

Mitchell et al., 2004

n=17 View faces together with written statements conveying trait diagnostic information. Generate an opinion about the persons described. No overt response required.

View faces together with written statements conveying trait diagnostic information. Encode the order in which the statements are paired with the faces. No overt response required.

Mitchell et al., 2005a

n=19 Read a word describing a mental state: Can it be used for describing a person / a dog? e.g., '"curious" … can it be used to describe a dog/person?'

Read a word describing a body part: Can it be used for describing a person / a dog? e.g., '"liver" … is it a body part of a dog/person?'

Mitchell et al., 2005c

n=14 View faces together with trait diagnostic statements (e.g., ‘promised not to smoke in his apartment since his roommate was trying to quit’). Generate an opinion about the persons described / encode the order in which the statements are paired with the faces (activation collapsed). No overt response required.

View objects (e.g., computer) together with object diagnostic statements (e.g., ‘had coffee spilled on it last month’). Generate an opinion about the object described / encode the order in which the statements are paired with the objects (activation collapsed). No overt response required.

Mitchell et al., 2006

n=15 View faces together with trait diagnostic statements. Generate an opinion about the persons described / encode the order in which the statements are paired with the faces (activation collapsed). No overt response required. e.g., ‘he turned down three parties to study for organic chemistry’.

View faces together with statements conveying no trait diagnostic information. Generate an opinion about the persons described / encode the order in which the statements are paired with the faces (activation collapsed). No overt response required. e.g., ‘he bought a new set of highlighters’.

Modinos et al., 2011

n=18 Read a personality trait adjective and indicate if it correctly describes an acquaintance (e.g., ‘classmate’ or ‘teammate’).

Read a statement describing general knowledge (e.g., 'you need water to live'). Indicate if the statement is correct.

Murphy et al., 2010

n=10 Read a personality trait adjective and indicate if it correctly describes a significant other (e.g., a family member or close friend; identified before scanning). e.g., ‘likeable'.

Read a personality trait adjective and indicate if it has a positive emotional valence. e.g., 'likeable'.

Ochsner et al., 2005

n=17 Read a trait adjective and indicate whether it describes a close/significant other (identified before scanning).

Read a trait adjective and indicate whether it contains two syllables or not.

Schmitz et al., 2004

n=19 Read a personality trait adjective and indicate if it correctly describes a close/significant other (family member or close friend).

Read a personality trait adjective and indicate if it has a positive emotional valence.

Todorov et al., 2007

n=11 View a photograph of a face. Indicate whether the current face is the same or different from the one immediately preceding it (one-back task). The faces were presented prior to the fMRI experiment, together with written statements conveying trait diagnostic information.

Indicate whether the current face is the same or different from the one immediately preceding it (one-back task). The faces were not presented prior to the fMRI experiment.

Van der Cruyssen et al., 2015

n=18 Read a personality trait (e.g., ‘friendly’) and a sentence describing behavior (e.g., ‘smiles at the baby’). Rate how well the behavioral description applies to the trait (4-point scale, button press).

Read a label of a social category (e.g., ‘kindergarten teacher’) and a sentence describing behavior (e.g., ‘smiles at the baby’). Rate how well the behavioral description applies to the social category (4-point scale, button press).

Zhu et al., 2007

n=13 Read a personality trait adjective (e.g., 'brave', 'childish') and indicate if it correctly describes a former American president (Bill Clinton).

Read a personality trait adjective (e.g., 'brave', 'childish') and indicate if it is written in lower- or upper-case.

Supplementary Table S3.3. Strategic Games (n = 13).

Authors n Experimental Condition Control Condition

Assaf et al., 2009

n=19 Play “domino game” with a human opponent (you get feedback about her moves). You and your opponent hold some domino chips in your hands (undisclosed). On each turn, you must play out a domino chip with a particular number to get a game point. You play out your chips face-down (undisclosed), so you can pretend to have the required number even if you don't. After you play out a chip, your opponent can decide whether or not to check the number on it (simplified description).

Play “domino game” with a computer.

Chaminade et al., 2012

n=18 Play “stone, paper, scissors” with a human opponent. Select one option (e.g., ‘stone’). After that, the option of your opponent is shown. The winner gets rewarded (winner is determined by a set of rules, e.g., ‘stone beats scissors’).

Play ‘stone, paper, scissors’ with a computer. You are informed that the computer chooses by a simple algorithm. Select one option.

Fukui et al., 2006

n=16 Play the “game of chicken” with a human opponent. On each trial, you and your opponent play for an amount of money. You must select between an ‘aggressive’ and a ‘reconciliative’ strategy on each trial. If you and your opponent select the same strategy, both of you gain little; if you select different strategies and you chose ‘aggressive’, you gain more than your opponent.

Play the “game of chicken” with a computer.

Gallagher et al., 2002

n=9 Play “stone, paper, scissors” with a human opponent. Select one option (e.g., ‘stone’). After that, the option of your opponent is shown. The winner gets rewarded (winner is determined by a set of rules, e.g., ‘stone beats scissors’).

Play “stone, paper, scissors” with a computer. You are informed that the computer chooses by a simple algorithm. Select one option.

Kircher et al., 2009

n=14 Play the prisoner’s dilemma game (iterated version). You play with a human player for game points. Both players choose a cooperative or defective strategy on each trial. If both players choose defective, they gain almost no game points at all. If both choose cooperative, both gain some game points. If players choose differently, the defective player gains more points.

Play the prisoner’s dilemma game (iterated version). You play with a computer for game points.

Krach et al., 2009

n=14 Play the prisoner’s dilemma game (iterated version). You play with a human opponent. See Kircher, 2009.

Play the prisoner’s dilemma game (iterated version). You play with a computer. See Kircher, 2009.

Riedl et al., 2014

n=18 Play the trust game with human partners. Each round you decide to keep a small amount of money, or ‘invest’ it in the other player. He/she will either pay you back later with a bonus (trustworthy reaction) or keep your money and pay back nothing (untrustworthy reaction).

Play the trust game with virtual avatars.

Rilling et al., 2004a

n=24 Play the prisoner’s dilemma game or the ultimatum game with a human opponent (iterated versions, activation collapsed). See Kircher, 2009 and Sripada, 2009.

Play the prisoner’s dilemma game or the ultimatum game with a computer (iterated versions, activation collapsed). See Kircher, 2009 and Sripada, 2009.

Sripada et al., 2009

n=26 Play a game like the ultimatum game (iterated version) with a human opponent. Two players must share an amount of money. Player 1 makes an offer how to split the amount, and player 2 can accept (both receive their share) or decline (no one receives money).

Play a game like the ultimatum game (iterated version) with a computer.

Suzuki et al., 2011

n=17 Play the prisoner’s dilemma game (iterated version). You play with a human opponent. See Kircher, 2009.

Play the prisoner’s dilemma game (iterated version). You play with a computer. See Kircher, 2009.

Takahashi et al., 2014

n=16 Play the matching pennies game with 5 different partners (a human, three robots of varying human-likeness, and a computer). The matching pennies game asks both players to choose one of two options at the same time. If one player is able to choose the same option as the other on a trial (i.e., anticipate his/her choice), this player wins game points. The experimental condition (implemented as a regression analysis) reflects playing against opponents that participants rate as having high ‘mind-holderness’ (e.g., that are rated as human-like, cute, friendly, warm, biological, conscious, natural, and emotional).

Play the matching pennies game with 5 different partners (human, three different robots of varying human-likeness, a computer). Brain activity for playing against those opponents that participants rate as having low ‘mind-holderness’.

Takahashi et al., 2015

n=17 Play the matching pennies game against a human. The matching pennies game asks both players to choose one out of two options at the same time. If one player is able to choose the same as the other on a trial (i.e., anticipate his/her choice), this player wins game points.

Play the matching pennies game against a computer. The matching pennies game asks both players to choose one out of two options at the same time. If one player is able to choose the same as the other on a trial (i.e., anticipate his/her choice), this player wins game points.

Zhou et al., 2014

n=28 Play a game like the ultimatum game (iterated version) with a human opponent. Two players must share an amount of money. Player 1 makes an offer how to split the amount, and player 2 can accept (both receive their share) or decline (no one receives money).

Play a game like the ultimatum game (iterated version) with a computer.

Supplementary Table S3.4. Social Animations (n = 20).

Authors n Experimental Condition Control Condition

Ammons et al., 2018

n=14 Watch a video animation of two interacting triangles (e.g., ‘bullying’, ‘helping’). Judge whether the characters' actions are socially motivated or random.

Watch a video animation of two randomly moving triangles (e.g., ‘geometric patterns’). Judge whether the characters' actions are socially motivated or random.

Blakemore et al., 2003

n=10 Watch a video animation of two interacting triangles (e.g., ‘surprising one another’). Answer questions concerning the contingency (e.g., ‘is there an intention?’) between movements of the two shapes.

Watch video animation of two interacting triangles (e.g., ‘surprising one another’). Answer questions concerning the physical movement of the first shape (e.g., ‘did the velocity change?’).

Bliksted et al., 2019

n=17 Watch a video animation of two interacting triangles. After the clip, explain verbally what was happening. Answers are recorded and transcribed.

Watch a video animation of two randomly moving triangles (e.g., ‘geometric patterns’). After the clip, explain verbally what was happening. Answers are recorded and transcribed.

Castelli et al., 2000

n=6 Watch a video animation of two interacting triangles (e.g., ‘mother and child are playing’). Explain verbally what was happening (after fMRI).

Watch a video animation of two randomly moving triangles. Explain verbally what was happening (after fMRI).

Das et al., 2012

n=22 Watch a video animation of two interacting triangles (e.g., ‘surprising one another’). Explain verbally what was happening (after fMRI).

Watch a video animation of two randomly moving triangles. Explain verbally what was happening (after fMRI).

Gobbini et al., 2007 (exp 1)

n=12 Watch a video animation of two interacting triangles (e.g., ‘surprising one another’). Select a word as a title for the animation (two options).

Watch video animation of two randomly moving triangles. Select a word as a title for the animation (two options).

Hennion et al., 2016

n=25 Watch a video animation of two interacting triangles, which involve influence on each other's mental states (e.g., ‘coaxing’).

Watch video animation of two triangles, which either interact in a goal-directed manner (e.g., ‘one chasing the other’) or move randomly. Activation is combined across the two control conditions (conjunction analysis).

Jack & Pelphrey, 2015

n=34 Watch video animations of three simple geometric shapes moving in a goal-directed or social way. After the task, give a verbal description of what you have seen in the videos.

Watch video animations of three simple geometric shapes moving randomly (e.g., ‘like billiard balls on the table’). Give a verbal description of the task.

Kana et al., 2009

n=12 Watch a video animation of two interacting triangles (e.g., ‘surprising one another’). Indicate which of 4 words best describes the interaction in the animation.

Watch a video animation of two randomly moving triangles. Indicate which of 4 words best describes the interaction in the animation.

Koelkebeck et al., 2011

n=15 Watch a video animation of two triangles which interact with one another ‘as if they read each other's mind’ (e.g., ‘one is seducing the other’). Explain verbally what is happening (after fMRI).

Watch a video animation of two randomly moving triangles. Explain verbally what is happening (after fMRI).

Malhi et al., 2008

n=20 Watch video animation of two interacting triangles (e.g., ‘surprising one another’). Explain verbally what is happening (after fMRI).

Watch a video animation of two randomly moving triangles. Explain verbally what is happening (after fMRI).

Martin & Weisberg, 2003

n=12 Watch a video animation of simple geometrical shapes depicting a social interaction. Indicate which action was depicted - select a word from several alternatives (e.g., ‘dancing’, ‘fishing’, ‘sharing’…).

Watch a video animation of simple geometrical shapes depicting a mechanical action. Indicate which action was depicted - select a word from several alternatives (e.g., ‘billiards, bowling’…).

McAdams & Krawczyk, 2011

n=17 Watch a video animation of three interacting geometric shapes (e.g., ‘playing hide and seek’). Judge if the geometrical shapes could be described as ‘friends’.

Watch a video animation of three moving geometric shapes. They do not interact but collide with each other on their motion paths. Judge from these collisions if the geometrical shapes could be described as ‘having the same weight’.

Moessnang et al., 2016

n=42 Watch a video animation of two interacting triangles, which involve influence on each other's mental states (e.g., ‘coaxing’). Indicate if a social/goal-directed/random movement was shown and indicate the feeling of both triangles (positive/negative), response via button press to both questions.

Watch video animation of two triangles, which interact in a goal-directed manner (e.g., ‘one chasing the other’). Indicate if a social/goal-directed/random movement was shown and indicate the feeling of both triangles (positive/negative), response via button press to both questions.

Moriguchi et al., 2006

n=38 Watch a video animation of two interacting triangles (e.g., ‘surprising one another’). Explain verbally what was happening (after fMRI).

Watch a video animation of two randomly moving triangles. Explain verbally what was happening (after fMRI).

Otti et al., 2015

n=20 Watch a video animation of two interacting triangles, which involve influence on each other's mental states (e.g., ‘coaxing’). Explain what was happening (after fMRI).

Watch video animations of three simple geometric shapes moving randomly (e.g., ‘like billiard balls on the table’). Explain what was happening (after fMRI).

Ross & Olson, 2010

n=15 Watch a video animation of three interacting geometric shapes (e.g., ‘playing hide and seek’). Judge if the geometrical shapes could be described as ‘friends’.

Watch a video animation of three moving geometric shapes. They do not interact but collide with each other on their motion paths. Judge from these collisions if the geometrical shapes could be described as ‘having the same weight’.

Santos et al., 2010

n=15 Watch a video animation of two three-dimensional objects moving across the screen. In some animations, the objects approach each other and respond to each other’s movement. Rate the animacy of the animation by button press (‘physical’ or ‘personal’). Brain activity for trials with a high animacy rating (e.g., ‘personal’).

Watch a video animation of two three-dimensional objects moving across the screen. In some animations, the objects approach each other and respond to each other’s movement. Rate the animacy of the animation by button press (‘physical’ or ‘personal’). Brain activity for trials with a low animacy rating (e.g., ‘physical’).

Schultz et al., 2004

n=14 Watch a video animation of a disk ‘chasing’ another disk. One disk is moving to the predicted end-point of the movement of the other disk (i.e., using a predictive strategy). Categorize the strategy of the chasing disk / predict the outcome.

Watch video animation of a disk ‘chasing’ another disk. One disk is simply following the other disk. Categorize the strategy of the chasing disk / predict the outcome.

Tavares et al., 2008

n=16 Watch a video animation of two socially interacting circles (e.g., ‘approaching each other and moving in a mutually coordinated way throughout’). Focus on behavioral or spatial aspects of the circles' movements (activation collapsed), and indicate if a statement can be used to describe the animation (e.g., behavioral: ‘An old lady was helped by a friend to carry her bags’; spatial: ‘The circles passed each other only once’).

Watch a video animation of two moving circles not interacting with each other (e.g., ‘passing each other without interacting’). Focus on behavioral or spatial aspects of the circles' movements (activation collapsed) and indicate if a statement can be used to describe the animation.

Supplementary Table S3.5. Reading Mind in the Eyes (n = 15).

Authors n Experimental Condition Control Condition

Adams et al., 2010

n=28 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Baron-Cohen et al., 1999

n=12 View a photograph showing the eye-region of a face. Indicate which of two words (e.g., ‘concerned’ versus ‘unconcerned’) describes the mental state of that person.

View a photograph showing the eye-region of a face. Indicate if the person is male or female.

Bos et al., 2016

n=16 View a photograph showing the eye-region of a face. Indicate if a mental state word presented before (e.g., ‘shy’, ‘hostile’, ‘playful’) matches the photo.

View a photograph showing the eye-region of a face. Indicate if a non-mental state word presented before (e.g., ‘woman’, ‘curly-hair’, ‘heavy eyebrows’) matches the photo.

Castelli et al., 2010

n=12 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

De Achaval et al., 2012

n=14 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Eddy et al., 2017

n=25 View a photograph showing the eye-region of a face. Indicate which of four words describes the mental state of that person.

View a photograph showing the eye-region of a face. Select from several alternatives the age best corresponding to the eyes.

Focquaert et al., 2010

n=24 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Kullmann et al., 2014

n=18 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Mascaro et al., 2013

n=29 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Moor et al., 2012

n=55 View a photograph showing the eye-region of a face. Indicate which of four words describes the mental state of that person.

View a photograph showing the eye-region of a face. Indicate the age and gender of the person (e.g., ‘older male’).

Nolte et al., 2013

n=15 View a photograph showing the eye-region of a face. Indicate which of four words describes the mental state of that person.

View a photograph showing the eye-region of a face. Select from several alternatives the age best corresponding to the eyes.

Platek et al., 2004

n=5 View a photograph showing the eye-region of a face. Think about the mental state of the person depicted (no response).

View a fixation cross (no response).

Russell et al., 2000

n=7 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Schiffer et al., 2013

n=22 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person.

View a photograph showing the eye-region of a face. Indicate if the person is male or female.

Thye et al., 2018a

n=18 View a photograph showing the eye-region of a face. Indicate which of two words describes the mental state of that person. See Baron-Cohen et al., 1999.

View a photograph showing the eye-region of a face. Indicate if the person is male or female. See Baron-Cohen et al., 1999.

Supplementary Table S3.6. Rational Actions (n = 11).

Authors n Experimental Condition Control Condition

Brüne et al., 2008

n=13 View cartoon stories of two interacting persons. Pictures are in correct order. After that, answer silently two questions (written on the screen) concerning intentions and expectations of the persons (no overt response).

View cartoon stories of two interacting persons. Pictures are in jumbled order. After that, answer silently two questions (written on the screen) concerning properties of objects appearing in the scene (no overt response).

Brunet et al., 2000

n=8 View a cartoon story and predict what will happen based on intentions of a character (no false belief). Choose a logical story ending from several options shown in pictures. e.g., 'A prisoner is in his cell. First, he breaks the bars of his prison window. Then he walks to his bed.' Participants must indicate what will happen next … 'the prisoner ties a rope from the sheets on his bed / the prisoner shouts out loud.'

View a cartoon story and predict what will happen based on physical causality. Choose a logical story ending from several options shown in pictures. e.g., 'A person is standing in front of a slide. A large ball is coming down this slide, heading towards the person standing there.' Participants must indicate what will happen next … 'the ball is knocking over the person / the ball is resting on the ground and the person is standing next to it.'

Heleven et al., 2019

n=49 View cartoon stories showing a person over a sequence of events. Order of pictures (i.e., events) is jumbled. Indicate the correct order of pictures based on the (true belief) intentions of the character (button press).

View cartoon stories showing a sequence of events. Order of pictures (i.e., events) is jumbled. Indicate the correct order of pictures based on physical causality (button press).

Kana et al., 2014

n=14 Cartoon stories from Walter et al., 2004 (Brunet et al., 2000). View a sequence of pictures showing a simple story, choose a logical ending based on the actor's intentions.

View a sequence of pictures showing a simple sequence of events, choose a logical ending based on physical causality.

Sebastian et al., 2012b

n=30 (same sample as Sebastian et al., 2012a)

View cartoon stories showing two persons. Predict the story ending based on the intentions of one person. Choose a logical story ending from several options shown in pictures.

View cartoon stories showing two persons. Predict the story ending based on physical causality. Choose a logical story ending from several options shown in pictures.

Thye et al., 2018b

n=18 Cartoon stories from Walter et al., 2004 (Brunet et al., 2000). View a sequence of pictures showing a simple story, choose a logical ending based on the actor's intentions.

View a sequence of pictures showing a simple sequence of events, choose a logical ending based on physical causality.

Voellm et al., 2006b

n=13 (same sample as Voellm et al., 2006a)

View cartoon stories (no social interactions or emotional situations included). Predict the story ending based on the intentions of one person. Choose a logical story ending from several options shown in pictures.

View cartoon stories. Predict the story ending based on physical causality. Choose a logical story ending from several options shown in pictures.

Walter et al., 2004 (exp1)

n=13 View a cartoon story showing two persons. One is making a communicative gesture (e.g., pointing to a bottle to request it). Choose a logical story ending from three options shown in pictures.

View a cartoon story showing some objects making contact because of physical causality. (e.g., a gust of wind blows a ball, so it knocks over several bottles). Choose a logical story ending from three options shown in pictures.

Walter et al., 2004 (exp2)

n=12 View cartoon stories showing one person, who is preparing for a future social interaction - i.e., has a prospective social intention (e.g., a person preparing a romantic dinner). Choose a logical story ending.

View cartoon stories showing objects making contact because of physical causality (e.g., a gust of wind blows a ball, so it knocks over several bottles). Choose a logical story ending.

Walter et al., 2009

n=12 View cartoon stories showing one person performing an action. Choose a logical story ending based on the private intention of the person (e.g., person is changing a broken bulb in order to read a book).

View cartoon stories showing objects making contact because of physical causality (e.g., a gust of wind blows a ball, so it knocks over several bottles). Choose a logical story ending.

Wang et al., 2015

n=56 View cartoon stories (no social interactions or emotional situations included). Predict the story ending based on the intentions of one person. Choose a logical story ending from several options shown in pictures.

View cartoon stories. Predict the story ending based on physical causality. Choose a logical story ending from several options shown in pictures.

References for the ToM meta-analyses

Adams, R.B. Jr., Rule, N.O., Franklin, R.G. Jr., Wang, E., Stevenson, M.T., Yoshikawa, S., Nomura, M., Sato, W., Kveraga, K., & Ambady, N. (2010). Cross-cultural reading the mind in the eyes: An fMRI investigation. Journal of Cognitive Neuroscience, 22(1), 97-108.

Aichhorn, M., Perner, J., Weiss, B., Kronbichler, M., Staffen, W., & Ladurner, G. (2009). Temporo-parietal junction activity in theory-of-mind tasks: Falseness, beliefs, or attention. Journal of Cognitive Neuroscience, 21(6), 1179-1192.

Ammons, C., Doss, C., Bala, D., & Kana, R. (2018). Brain responses underlying anthropomorphism, agency, and social attribution in Autism Spectrum Disorder. The Open Neuroimaging Journal, 12, 16-29.

Andrews-Hanna, J., Saxe, R., & Yarkoni, T. (2014). Contributions of episodic retrieval and mentalizing to autobiographical thought: Evidence from functional neuroimaging, resting-state connectivity, and fMRI meta-analyses. NeuroImage, 91, 324-335.

Assaf, M., Kahn, I., Pearlson, G.D., Johnson, M.R., Yeshurun, Y., Calhoun, V.D., & Hendler, T. (2009). Brain Activity Dissociates Mentalization from Motivation During an Interpersonal Competitive Game. Brain Imaging and Behavior, 3(1), 24-37.

Bahnemann, M., Dziobek, I., Prehn, K., Wolf, I., & Heekeren, H.R. (2010). Sociotopy in the temporoparietal cortex: common versus distinct processes. Social Cognitive and Affective Neuroscience, 5(1), 48-58.

Baron-Cohen, S., Ring, H.A., Wheelwright, S., Bullmore, E.T., Brammer, M.J., Simmons, A., & Williams, S.C. (1999). Social intelligence in the normal and autistic brain: an fMRI study. European Journal of Neuroscience, 11(6), 1891-1898.

Blakemore, S.-J., Boyer, P., Pachot-Clouard, M., Meltzoff, A., Segebarth, C., & Decety, J. (2003). The Detection of Contingency and Animacy from Simple Animations in the Human Brain. Cerebral Cortex, 13(8), 837-844.

Bliksted, V., Frith, C., Videbech, P., Fagerlund, B., Emborg, C., Simonsen, A., … Campbell-Meiklejohn, D. (2019). Hyper- and Hypomentalizing in Patients with First-Episode Schizophrenia: fMRI and Behavioral Studies. Schizophrenia Bulletin, 45(2), 377-385.

Bos, P. A., Hofman, D., Hermans, E. J., Montoya, E. R., Baron-Cohen, S., & van Honk, J. (2016). Testosterone reduces functional connectivity during the ‘Reading the Mind in the Eyes’ Test. Psychoneuroendocrinology, 68, 194-201.

Brüne, M., Lissek, S., Fuchs, N., Witthaus, H., Peters, S., Nicolas, V., Juckel, G., & Tegenthoff, M. (2008). An fMRI study of theory of mind in schizophrenic patients with ‘passivity’ symptoms. Neuropsychologia, 46(7), 1992-2001.

Brunet, E., Sarfati, Y., Hardy-Baylé, M.C., & Decety, J. (2000). A PET investigation of the attribution of intentions with a nonverbal task. NeuroImage, 11(2), 157-166.

Castelli, F., Happé, F., Frith, U., & Frith, C. (2000). Movement and mind: a functional imaging study of perception and interpretation of complex intentional movement patterns. NeuroImage, 12(3), 314-325.

Castelli, I., Baglio, F., Blasi, V., Alberoni, M., Falini, A., Liverta-Sempio, O., Nemni, R., & Marchetti, A. (2010). Effects of aging on mindreading ability through the eyes: an fMRI study. Neuropsychologia. 48(9), 2586-2594.

Chaminade, T., Rosset, D., Da Fonseca, D., Nazarian, B., Lutcher, E., Cheng, G., & Deruelle, C. (2012). How do we think machines think? An fMRI study of alleged competition with an artificial intelligence. Frontiers in Human Neuroscience, 6:103.

Contreras, J. M., Schirmer, J., Banaji, M., & Mitchell, J. (2013a,b). Common brain regions with distinct patterns of neural responses during mentalizing about groups and individuals. Journal of Cognitive Neuroscience, 25(9), 1406-1417.

Corradi-Dell’Acqua, C., Hofstetter, C., & Vuilleumier, P. (2014). Cognitive and affective theory of mind share the same local patterns of activity in posterior temporal but not medial prefrontal cortex. Social Cognitive and Affective Neuroscience, 9, 1175-1184.

Craik, F.I.M., Moroz, T.M., Moscovitch, M., Stuss, D.T., Winocur, G., Tulving, E., & Kapur, S. (1999). In Search of the Self: A Positron Emission Tomography Study. Psychological Science, 10(1), 26-34.


Das, P., Lagopoulus, J., Coulston, C.M., Henderson, A.F., & Malhi, G.S. (2012). Mentalizing impairment in schizophrenia: a functional MRI study. Schizophrenia Research, 134(2-3), 158-164.

De Achával, D., Villareal, M.F., Costanzo, E.Y., Douer, J., Castro, M.N., Mora, M.C., Nemeroff, C.B., Chu, E., Bär, K.J., & Guinjoan, S.M. (2012). Decreased activity in right-hemisphere structures involved in social cognition in siblings discordant for schizophrenia. Schizophrenia Research, 134(2-3), 171-179.

Dodell-Feder, D., Koster-Hale, J., Bedny, M., & Saxe, R. (2011). fMRI item analysis in a theory of mind task. NeuroImage, 55(2), 705-712.

Eddy, C. M., Cavanna, A. E., & Hansen, P. C. (2017). Empathy and aversion: the neural signature of mentalizing in Tourette syndrome. Psychological Medicine, 47(3), 507-517.

Focquaert, F., Steven-Wheeler, M.S., Vanneste, S., Doron, K.W., & Platek, S.M. (2010). Mindreading in individuals with an empathizing versus systemizing cognitive style: An fMRI study. Brain Research Bulletin, 83(5), 214-222.

Fukui, H., Murai, T., Shinozaki, J., Aso, T., Fukuyama, H., Hayashi, T., Hanakawa, T. (2006). The neural basis of social tactics: An fMRI study. NeuroImage, 32(2), 913-920.

Gallagher, H.L., & Frith, C.D. (2004). Dissociable neural pathways for the perception and recognition of expressive and instrumental gestures. Neuropsychologia, 42(13), 1725-1736.

Gallagher, H.L., Jack, A.I., Roepstorff, A., & Frith, C.D. (2002). Imaging the intentional stance in a competitive game. NeuroImage, 16(3), 814-821.

Gobbini, M. I., Koralek, A., Bryan, R., Montgomery, K., & Haxby, J. (2007). Two takes on the social brain: A comparison of theory of mind tasks. Journal of Cognitive Neuroscience, 19(11), 1803-1814.

Greven, I., Downing, P., & Ramsey, R. (2016). Linking person perception and person knowledge in the human brain. Social Cognitive and Affective Neuroscience, 11(4), 641-651.

Hartwright, C., Apperly, I., & Hansen, P. (2015). The special case of self-perspective inhibition in mental, but not non-mental, representation. Neuropsychologia, 67, 183-192.

Heatherton, T.F., Wyland, C.L., Macrae, C.N., Demos, K.E., Denny, B.T. & Kelley, W.M. (2006). Medial prefrontal activity differentiates self from close others. Social Cognitive and Affective Neuroscience, 1(1), 18-25.

Heil, L., Colizoli, O., Hartstra, E., Kwisthout, J., van Pelt, S., van Rooij, I., Bekkering, H. (2019). Processing of prediction errors in mentalizing areas. Journal of Cognitive Neuroscience, 31(6), 900-912.

Heleven, E., van Dun, K., & van Overwalle, F. (2019). The posterior Cerebellum is involved in constructing Social Action Sequences: An fMRI study. Scientific Reports, 9, 11110.

Hennion, S., Delbeuck, X., Koelkebeck, K., Brion, M., Tyvart, L., Plomhause, L., … Szurhaj, W. (2016). A functional magnetic resonance imaging investigation of theory of mind impairments in patients with temporal lobe epilepsy. Neuropsychologia, 93, 271-279.

Hughes, C., Cassidy, B., Faskowitz, J., Avena-Koenigsberger, A., Sporns, O., & Krendl, A. (2019). Age differences in specific neural connections within the Default Mode Network underlie theory of mind. NeuroImage, 191, 269-277.

Jack, A. & Pelphrey, K. (2016). Neural correlates of animacy attribution include neocerebellum in healthy adults. Cerebral Cortex, 25, 4240-4247.

Kana, R.K., Keller, T.A., Cherkassky, V.L., Minshew, N.J., & Just, M.A. (2009). Atypical frontal-posterior synchronization of Theory of Mind regions in autism during mental state attribution. Social Neuroscience, 4(2), 135-152.

Kana, R., Libero, L., Hu, C., Deshpande, H., & Colburn, J. (2014). Functional brain networks and white matter underlying theory-of-mind in autism. Social Cognitive and Affective Neuroscience, 9, 98-105.

Kircher, T., Bluemel, I., Marjoram, D., Lataster, T., Krabbendam, L., Weber, J., van Os, J., & Krach, S. (2009). Online mentalising investigated with functional MRI. Neuroscience Letters, 454, 176-181.

Kliemann, D., Young, L., Scholz, J., & Saxe, R. (2008). The influence of prior record on moral judgment. Neuropsychologia. 46(12), 2949-2957.


Koelkebeck, K., Hirao, K., Kawada, R., Miyata, J., Saze, T., Ubukata, S., … Murai, T. (2011). Transcultural differences in brain activation patterns during theory of mind (ToM) task performance in Japanese and Caucasian participants. Social Neuroscience 6(5-6), 615-626.

Krach, S., Bluemel, I., Marjoram, D., Lataster, T., Krabbendam, L., Weber, J., van Os, J., & Kircher, T. (2009). Are women better mindreaders? Sex differences in neural correlates of mentalizing detected with functional MRI. BMC Neuroscience, 10, 9.

Kullmann, J., Grigoleit, J.-S., Wolf, O., Engler, H., Oberbeck, R., Elsenbruch, S., … Gizewski, E. (2014). Experimental human endotoxemia enhances brain activity during social cognition. Social Cognitive and Affective Neuroscience, 9, 786-793.

Lee, J., Quintana, J., Nori, P., & Green, M.F. (2011). Theory of mind in schizophrenia: exploring neural mechanisms of belief attribution. Social Neuroscience, 6(5-6), 569-581.

Lee, S. & McCarthy, G. (2016). Functional heterogeneity and convergence in the right temporoparietal junction. Cerebral Cortex, 26, 1108-1116.

Lou, H.C., Luber, B., Crupain, M., Keenan, J.P., Nowak, M., Kjaer, T.W., … Lisanby, S.H. (2004). Parietal cortex and representation of the mental Self. Proc. Natl. Acad. Sci. U. S. A. 101(17), 6827-6832.

Ma, N., Vanderkerckhove, M., van Overwalle, F., Seurinck, R., & Fias, W. (2011a). Spontaneous and intentional trait inferences recruit a common mentalizing network to a different degree: spontaneous inferences activate only its core areas. Social Neuroscience, 6(2), 123-138.

Malhi, G.S., Lagopoulos, J., Das, P., Moss, K., Berk, M., & Coulston, C.M. (2008). A functional MRI study of Theory of Mind in euthymic bipolar disorder patients. Bipolar Disorder, 10(8), 943-956.

Martin, A., & Weisberg, J. (2003). Neural foundations for understanding social and mechanical concepts. Cognitive Neuropsychology, 20(3-6), 575-587.

Mascaro, J., Rilling, J., Tenzin Negi, L., & Raison, C. (2013). Compassion meditation enhances empathic accuracy and related neural activity. Social Cognitive and Affective Neuroscience, 8, 48-55.

McAdams, C.J., & Krawczyk, D.C. (2011). Impaired neural processing of social attribution in anorexia nervosa. Psychiatry Research, 194(1), 54-63.

Mitchell, J.P. (2008). Activity in right temporo-parietal junction is not selective for theory-of-mind. Cerebral Cortex. 18(2), 262-271.

Mitchell, J. P., Heatherton, T. F., & Macrae, C. N. (2002). Distinct neural systems subserve person and object knowledge. Proc. Natl. Acad. Sci. U S A. 99, 15238-15243.

Mitchell, J.P., Macrae, C.N., & Banaji, M.R. (2004). Encoding-specific effects of social cognition on the neural correlates of subsequent memory. Journal of Neuroscience, 24(21), 4912-4917.

Mitchell, J.P., Banaji, M.R., & Macrae, C.N. (2005a). General and specific contributions of the medial prefrontal cortex to knowledge about mental states. NeuroImage, 28(4), 757-762.

Mitchell, J.P., Banaji, M.R., & Macrae, C.N. (2005b). The link between social cognition and self-referential thought in the medial prefrontal cortex. Journal of Cognitive Neuroscience, 17(8), 1306-1315.

Mitchell, J.P., Cloutier, J., Banaji, M.R., & Macrae, C.N. (2006). Medial prefrontal dissociations during processing of trait diagnostic and nondiagnostic person information. Social Cognitive and Affective Neuroscience, 1(1), 49-55.

Mitchell, J.P., Macrae, C.N., & Banaji, M.R. (2005c). Forming impressions of people versus inanimate objects: social-cognitive processing in the medial prefrontal cortex. NeuroImage, 26(1), 251-257.

Modinos, G., Renken, R., Ormel, J., & Aleman, A. (2011). Self-reflection and the psychosis-prone brain: an fMRI study. Neuropsychology, 25(3), 295-305.

Moessnang, C., Schäfer, A., Bilek, E., Roux, P., Otto, K., Baumeister, S., … Tost, H. (2016). Specificity, reliability and sensitivity of social brain responses during spontaneous mentalizing. Social Cognitive and Affective Neuroscience, 11(11), 1687-1697.

Moor, B.G., Macks, Z.A., Gueroglu, B., Rombouts, S.A., Molen, M.W., & Crone, E.A. (2012). Neurodevelopmental changes of reading the mind in the eyes. Social Cognitive and Affective Neuroscience, 7(1), 44-52.


Moriguchi, Y., Ohnishi, T., Lane, R.D., Maeda, M., Mori, T., Nemoto, K., Matsuda, H., & Komaki, G. (2006). Impaired self-awareness and theory of mind: an fMRI study of mentalizing in alexithymia. NeuroImage, 32(3), 1472-1482.

Murphy, E.R., Brent, B.K., Benton, M., Pruitt, P., Diwadkar, V., Rajarethinam, R.P., Keshavan, M.S. (2010). Differential processing of metacognitive evaluation and the neural circuitry of the self and others in schizophrenia: a pilot study. Schizophrenia Research, 116(2-3), 252-258.

Naughtin, C., Horne, K., Schneider, D., Venini, D., York, A., & Dux, P. (2017). Do implicit and explicit belief processing share neural substrates? Human Brain Mapping, 38, 4760-4772.

Nolte, T., Bolling, D., Hudac, C., Fonagy, P., Mayes, L., & Pelphrey, K. (2013). Brain mechanisms underlying the impact of attachment-related stress on social cognition. Frontiers in Human Neuroscience, 7, 816.

Ochsner, K.N., Beer, J.S., Robertson, E.R., Cooper, J.C., Gabrieli, J.D., Kihlstrom, J.F., & D’Esposito, M. (2005). The neural correlates of direct and reflected self-knowledge. NeuroImage, 28(4), 797-814.

Özdem, C., Brass, M., van der Cruyssen, L., & van Overwalle, F. (2017). The overlap between false belief and spatial reorientation in the temporo-parietal junction: The role of input modality and task. Social Neuroscience, 12(2), 207-217.

Oliver, L., Vieira, J., Neufeld, R., Dziobek, I., & Mitchell, D. (2018). Greater involvement of action simulation mechanisms in emotional vs cognitive empathy. Social Cognitive and Affective Neuroscience, 13(4), 367-380.

Otti, A., Wohlschlaeger, A., & Noll-Hussong, M. (2015). Is the medial prefrontal cortex necessary for theory of mind? PLoS ONE, 10(8), e0135912.

Perner, J., Aichhorn, M., Kronbichler, M., Staffen, W., & Ladurner, G. (2006). Thinking of mental and other representations: The roles of left and right temporo-parietal junction. Social Neuroscience, 1(3-4), 245-258.

Platek, S.M., Keenan, J.P., Gallup, G.G. Jr., & Mohamed, F.B. (2004). Where am I? The neurological correlates of self and other. Brain Research. Cognitive Brain Research, 19(2), 114-122.

Riedl, R., Mohr, P., Kenning, P., Davis, F., & Heekeren, H. (2014). Trusting humans and avatars: A brain imaging study based on evolution theory. Journal of Management Information Systems, 30(4), 83-114.

Rilling, J.K., Sanfey, A.G., Aronson, J.A., Nystrom, L.E., & Cohen, J.D. (2004). The neural correlates of theory of mind within interpersonal interactions. NeuroImage, 22(4), 1694-1703.

Roser, P., Lissek, S., Tegenthoff, M., Nicolas, V., Juckel, G., & Brüne, M. (2012). Alterations of theory of mind network activation in chronic cannabis users. Schizophrenia Research, 139(1-3), 19-26.

Ross, L.A., & Olson, I.R. (2010). Social cognition and the anterior temporal lobes. NeuroImage, 49, 3452-3462.

Russell, T.A., Rubia, K., Bullmore, E.T., Soni, W., Suckling, J., Brammer, M.J., Simmons, A., Williams, S.C., & Sharma, T. (2000). Exploring the social brain in schizophrenia: left prefrontal underactivation during mental state attribution. American Journal of Psychiatry, 157(12), 2040-2042.

Santos, N.S., Kuzmanovic, B., David, N., Rotarska-Jagiela, A., Eickhoff, S.B., Shah, J.N., Fink, G.R., Bente, G., & Vogeley, K. (2010). Animated brain: a functional neuroimaging study on animacy experience. NeuroImage, 53(1), 291-302.

Saxe, R., & Kanwisher, N. (2003). People thinking about thinking people. The role of the temporo-parietal junction in ‘theory of mind’. NeuroImage, 19(4), 1835-1842.

Saxe, R., & Powell, L. (2006). It's the thought that counts: specific brain regions for one component of theory of mind. Psychological Science, 17(8), 692-699.

Saxe, R., Schulz, L.E., Jiang, Y.V. (2006). Reading minds versus following rules: dissociating theory of mind and executive control in the brain. Social Neuroscience, 1(3-4), 284-298.

Saxe, R., & Wexler, A. (2005). Making sense of another mind: the role of the right temporo-parietal junction. Neuropsychologia, 43(10), 1391-1399.

Schiffer, B., Pawliczek, C., Müller, B., Gizewski, E., & Walter, H. (2013). Why don’t men understand women? Altered neural networks for reading the language of male and female eyes. PLoS ONE, 8(4), e60278.


Schlaffke, L., Lissek, S., Lenz, M., Juckel, G., Schultz, T., Tegenthoff, M., ... & Brüne, M. (2015). Shared and nonshared neural networks of cognitive and affective theory‐of‐mind: A neuroimaging study using cartoon picture stories. Human Brain Mapping, 36(1), 29-39.

Schmitz, T.W., Kawahara-Baccus, T.N., & Johnson, S.C. (2004). Metacognitive evaluation, self-relevance, and the right prefrontal cortex. NeuroImage, 22(2), 941-947.

Schultz, J., Imamizu, H., Kawato, M., & Frith, C.D. (2004). Activation of the human superior temporal gyrus during observation of goal attribution by intentional objects. Journal of Cognitive Neuroscience, 16(10), 1695-1705.

Sebastian, C.L., Fontaine, N.M., Bird, G., Blakemore, S.J., Brito, S.A., McCrory, E.J., & Viding, E. (2012a,b). Neural processing associated with cognitive and affective Theory of Mind in adolescents and adults. Social Cognitive and Affective Neuroscience, 7(1), 52-63.

Sripada, C.S., Angstadt, M., Banks, S., Nathan, P.J., Liberzon, I., & Phan, K.L. (2009). Functional neuroimaging of mentalizing during the trust game in social anxiety disorder. Neuroreport. 20(11), 984-989.

Suzuki, S., Niki, K., Fujisaki, S., & Akiyama, E. (2011). Neural basis of conditional cooperation. Social Cognitive and Affective Neuroscience, 6(3), 338-347.

Takahashi, H., Izuma, K., Matsumoto, M., Matsumoto, K., & Omori, T. (2015). The anterior insula tracks behavioral entropy during an interpersonal competitive game. PLoS ONE, 10(6), e0123329.

Takahashi, H., Terada, K., Morita, T., Suzuki, S., Haji, T., Kozima, H., … Naito, E. (2014). Different impressions of other agents obtained through social interaction uniquely modulate dorsal and ventral pathway activities in the social human brain. Cortex, 58, 289-300.

Tavares, P., Lawrence, A.D., & Barnard, P.J. (2008). Paying attention to social meaning: an FMRI study. Cerebral Cortex, 18(8), 1876-1885.

Thye, M., Murdaugh, D., & Kana, R. (2018a,b). Brain mechanisms underlying reading the mind from eyes, voices, and actions. Neuroscience, 374, 172-186.

Todorov, A., Gobbini, M.I., Evans, K.K., & Haxby, J.V. (2007). Spontaneous retrieval of affective person knowledge in face perception. Neuropsychologia, 45(1), 163-173.

Van Der Cruyssen, L., Heleven, E., Ma, N., Vandekerckhove, M., & Van Overwalle, F. (2015). Distinct neural correlates of social categories and personality traits. NeuroImage, 104, 336-346.

Völlm, B.A., Taylor, A.N., Richardson, P., Corcoran, R., Stirling, J., McKie, S., Deakin, J.F. & Elliott, R. (2006a,b). Neuronal correlates of theory of mind and empathy: a functional magnetic resonance imaging study in a nonverbal task. NeuroImage, 29(1), 90-98.

Walter, H.W., Adenzato, M., Ciaramidaro, A., Enrici, I., Lorenzo, P., & Bara, B.G. (2004a,b). Understanding intentions in social interaction: the role of the anterior paracingulate cortex. Journal of Cognitive Neuroscience, 16(10), 1854-1863.

Walter, H., Ciaramidaro, A., Adenzato, M., Vasic, N., Ardito, R.B., Erk, S., & Bara, B.G. (2009). Dysfunction of the social brain in schizophrenia is modulated by intention type: an fMRI study. Social Cognitive and Affective Neuroscience, 4(2), 166-176.

Wang, Y., Liu, W.-H., Li, Z., Wei, X.-H., Jiang, X.-Q., Neumann, D., … Chan, R. (2015). Dimensional schizotypy and social cognition: an fMRI imaging study. Frontiers in Behavioral Neuroscience, 9, 133.

Wicker, B., Perrett, D.I., Baron-Cohen, S., & Decety, J. (2003). Being the target of another's emotion: a PET study. Neuropsychologia, 41(2), 139-146.

Young, L., Cushman, F., Hauser, M., & Saxe, R. (2007a,b). The neural basis of the interaction between theory of mind and moral judgment. Proc. Natl. Acad. Sci. U S A. 104(20), 8235-8240.

Young, L., & Saxe, R. (2009). An FMRI investigation of spontaneous mental state inference for moral judgment. Journal of Cognitive Neuroscience, 21(7), 1396-1405.

Young, L., Dodell-Feder, D., Saxe, R. (2010). What gets the attention of the temporo-parietal junction? An fMRI investigation of attention and theory of mind. Neuropsychologia. 48(9), 2658-2664.

Young, L., Scholz, J., & Saxe, R. (2011). Neural evidence for ‘intuitive prosecution’: the use of mental state information for negative moral verdicts. Social Neuroscience, 6(3), 302-315.


Zhou, Y., Wang, Y., Rao, L.-L., Yang, L.-Q, & Li, S. (2014). Money talks: neural substrate of modulation of fairness by monetary incentives. Frontiers in Behavioral Neuroscience, 8,150.

Zhu, Y., Zhang, L., Fan, J., & Han, S. (2007). Neural basis of cultural influence on self-representation. NeuroImage, 34(3), 1310-1316.


Section 4. All task descriptions for empathy task groups

Supplementary Table S4.1.Observing Pain (n=21).

Authors n Experimental Condition Control Condition

Akitsuki & Decety 2009

n=26 View animations of hands and feet in painful situations, either intentionally caused by another person or accidentally caused (activation collapsed). No response in scanner.

View animations of hands and feet in non-painful situations either intentionally caused by another person or accidentally caused (activation collapsed). No response in scanner.

Azevedo et al. 2013

n=27 View video of a person's face under painful stimulation (needle). No response in scanner.

View video of a person's face under non-painful stimulation (cotton swab). No response in scanner.

Azevedo et al., 2014

n=12 Watch videos of hands under painful (needle) stimulation. No response in scanner.

Watch videos of hands under non-painful (cotton swab) stimulation. No response in scanner.

Ballotta et al., 2018

n=30 View pictures of painful facial expressions, perform a gender and a temporal-production task (dual task paradigm, both tasks are irrelevant with regard to the faces).

View pictures of neutral facial expressions, perform a gender and a temporal-production task (dual task paradigm, both tasks are irrelevant with regard to the faces).

Bos et al., 2015

n=24 Watch videos of hands under painful (needle) stimulation. Watch attentively and detect changes in fixation cross appearing between videos.

Watch videos of hands under non-painful (cotton swab) stimulation. Watch attentively and detect changes in fixation cross appearing between videos.

Christov-Moore et al., 2017

n=19 Watch video of a hand being pierced by needles. Passive viewing, no response.

Watch video of a hand being touched by a wooden q-tip. Passive viewing, no response.

Decety et al., 2010

n=22 View animations of hands and feet in painful situations either caused by another person intentionally or accidentally caused (activation collapsed). No response in scanner.

View animations of hands and feet in non-painful situations either caused by another person intentionally or accidentally caused (activation collapsed). No response in scanner.

Fan et al., 2014

n=21 View pictures of body parts being injured. No response in scanner.

View pictures of body parts under non-painful stimulation. No response in scanner.

Fujino et al., 2014

n=11 View video of hand under painful stimulation (pierced). Press button when video clip appears.

View video of hand under non-painful stimulation (pierced). Press button when video clip appears.

Gao et al., 2017

n=20 View pictures of body parts being injured. Either just passively observe or indicate if a pain or no-pain picture was shown (response types intermixed).

View pictures of body parts under non-painful stimulation. Either just passively observe or indicate if a pain or no-pain picture was shown (response types intermixed).

Kühn et al., 2018

n=80 View pictures of body parts being injured. No response in the scanner.

View pictures of body parts under non-painful stimulation. No response in the scanner.

Luo et al., 2014

n=36 Watch videos of person receiving painful stimulation (e.g., needle pierced through cheek). Indicate if person was feeling pain (button press).

Watch videos of person receiving non-painful stimulation (e.g., cheek touched by q-tip). Indicate if person was feeling pain (button press).

Ma et al., 2011b

n=33 View videos of faces or hands pierced by needle. Indicate if person in the video experienced pain (button press).

View videos of faces or hands touched by q-tip. Indicate if person in the video experienced pain (button press).

Morrison et al., 2004

n=14 Watch video clips of other person experiencing unpleasant pricks to the fingertips. No response in scanner.

Passive rest.

Ochsner et al., 2008

n=13 Experience pain or see someone in pain (activation collapsed). Self-experience induced through painful thermal stimulation of hand. Other in pain presented by videos of persons suffering injuries in sport events. No response in scanner.

Experience non-painful stimulation or see someone in non-painful activity (activation collapsed). Self-experience induced through warm but non-painful thermal stimulation of hand. Other in non-painful activity presented by videos of persons in sport events (no accidents, no injuries at this time). No response in scanner.

Olsson et al., 2007

n=11 View video of a model undergoing fear conditioning situations (receiving the shock). No response in scanner.

View video of a model undergoing fear conditioning situations (not receiving the shock). No response in scanner.

Quiao-Tasserit

n=41 View a short video-clip extract of negative (e.g., the movie ‘The Shining’), positive (e.g., the movie ‘When Harry Met Sally’), or neutral content. Afterwards, view the picture of a person's hand in a painful context (e.g., a scalpel piercing the skin). By button press, indicate whether the previous picture showed a person's left or right hand.

View a short video-clip extract of negative (e.g., the movie ‘The Shining’), positive (e.g., the movie ‘When Harry Met Sally’), or neutral content. Afterwards, view the picture of a person's hand in a neutral context (e.g., simply holding a scalpel). By button press, indicate whether the previous picture showed a person's left or right hand.

Seara-Cardoso et al., 2015

n=46 View the picture of a person's hands or feet in painful situations. By button press, perform a hand/foot judgment task.

View the picture of a person's hands or feet in non-painful everyday situations. By button press, perform a hand/foot judgment task.

Tei et al., 2014

n=25 Watch videos of hands under painful stimulation (e.g., ice-pick). No response in scanner.

Watch videos of hands under non-painful stimulation (soft brush). No response in scanner.

Ushida et al., 2008

n=15 Watch video clips of hands being punctured by a needle. No response in scanner.

Watch video of static hand without needle. No response in scanner.

Zaki et al., 2007

n=13 Experience pain or see someone in pain (activation collapsed). Self-experience induced through painful thermal stimulation of hand. Other in pain presented in videos of persons suffering injuries in sport events or accidents. No response in scanner.

Experience non-painful stimulation or see someone in non-painful activity (activation collapsed). Self-experience induced through warm but non-painful thermal stimulation of hand. Other in non-painful activity presented by videos of persons in sport or other events (no accidents, no injuries at this time). No response in scanner.


Supplementary Table S4.2.Observing Emotions (n=25).

Authors n Experimental Condition Control Condition

Andersson et al., 2008

n=16 View pictures of faces with emotional expression (fear). Judge if face or house was shown (houses were filler trials), button press.

View pictures of faces with neutral expression. Judge if face or house was shown (houses were filler trials), button press.

Batut et al., 2006

n=15 View pictures of faces with emotional expression (fear). Judge the gender of the face that was shown.

View pictures of faces with neutral expression. Judge the gender of the face that was shown.

Carr et al., 2003

n=11 View pictures of faces displaying basic emotions. Passively observe, no response in scanner.

Passive rest (view blank screen).

Chakrabarti et al., 2006

n=25 Watch video of a person (face) displaying a sad emotion. Press a button when the video starts.

Watch video of a person (face) displaying a neutral affect. Press a button when the video starts.

Cools et al., 2005

n=12 View pictures of faces with emotional expression (fear). Judge the gender of the face that was shown.

View pictures of faces with neutral expression. Judge the gender of the face that was shown.

Critchley et al., 2000

n=9 View pictures of faces with emotional expressions (anger, fear). Judge the gender of the person (button press).

View pictures of faces with neutral expressions. Judge the gender of the person (button press).

Gorno-Tempini et al., 2001

n=10 View pictures of faces with emotional expression (disgust). Judge gender (male/female) of person, button press.

Detect white square in the center of a scrambled face picture.

Habel et al., 2007

n=29 View picture of face with emotional expression (happiness, sadness, anger, fear, and disgust), and judge the age of the person depicted.

View a scrambled face with a fixation cross in the middle.

Hennenlotter et al., 2005

n=12 View video of person (face) who is smiling. Passively observe, no response in scanner.

View video of person (face) with neutral expression. Passively observe, no response in scanner.

Horan et al., 2014

n=23 View pictures of faces with prototypical expressions of four emotions (happy, sad, angry, and afraid). Passively observe, no response.

Passive rest.

Iidaka et al., 2001

n=12 View picture of faces with negative emotional expressions, and judge gender of the person (button press).

View two rectangles and compare their size, press button corresponding to the larger one.

Lange et al., 2003

n=9 View pictures of faces with emotional expression (fear). Judge gender of person shown, button press.

View pictures of faces with neutral expression. Judge gender of person shown, button press.

Moriguchi et al., 2005

n=16 View pictures of faces with negative emotional expression (fear). Passively observe, no response in scanner.

View pictures of faces with neutral expression. Passively observe, no response in scanner.

Phillips et al., 1997

n=7 View picture of faces with negative emotional expressions (fear), and judge gender of the person (button press).

View picture of faces with neutral emotional expressions, and judge gender of the person (button press).

Schroeder et al., 2004

n=20 View pictures of disgusted faces, no response.

View pictures of neutral faces, no response.

Sprengelmeyer et al., 1998

n=6 View pictures of faces with emotional expression (fear). Judge gender of person shown, button press.

View pictures of faces with neutral expression. Judge gender of person shown, button press.

Surguladze et al., 2003

n=9 View pictures of faces with emotional expression (fear). Judge gender (male/female) of person, button press.

View pictures of faces with neutral expression. Judge gender (male/female) of person, button press.

Surguladze et al., 2008

n=29 View pictures of faces with emotional expression (fear). Judge gender (male/female) of person, button press.

View pictures of faces with neutral expression. Judge gender (male/female) of person, button press.

Toller et al., 2015

n=29 View video of face showing fearful expression. Relax and focus on the actor’s eyes.

View video of landscapes. Relax and do nothing.

van der Gaag et al., 2007

n=17 View videos of a person (face) expressing different emotions (e.g., happy, fearful). Passively observe, no response in scanner. Note: Activation from this contrast is inclusively masked with activation for [executing the facial movement as in the video > passive rest].

Passive rest (view fixation cross).

Vuilleumier et al., 2001

n=12 View picture showing two faces (and two houses). Judge if same or different faces are shown (or houses, activation collapsed). Faces show fearful expression.

View picture showing two faces (and two houses). Judge if same or different faces are shown (or houses, activation collapsed). Faces show neutral expression.

Vuilleumier et al., 2004

n=13 View picture showing two faces (and two houses). Judge if same or different faces are shown (or houses, activation collapsed). Faces show fearful expression.

View picture showing two faces (and two houses). Judge if same or different faces are shown (or houses, activation collapsed). Faces show neutral expression.

Wicker, Keysers et al., 2003

n=14 Watch video of a person smelling a disgusting odor – as one can see from her facial expression. Passively observe, no response in the scanner.

Passive rest. No stimulation, no task.

Williams et al., 2006

n=13 View pictures of fearful faces, no response.

View pictures of neutral faces, no response.

Wolfensberger et al., 2008

n=64 View pictures of faces, largely with emotional expressions (angry, fearful, happy, and neutral), and judge the gender of the person.

View scrambled faces, which have two arrows embedded. Indicate if arrows point to the left or right.


Supplementary Table S4.3.Sharing Emotions or Pain (n=12).

Authors n Experimental Condition Control Condition

Ernst et al., 2013

n=18 View picture of face showing emotional affect. Try to empathize with the person and press button if successful (yes/no).

View blurred photo of face (no recognizable content). Try to empathize with the person and press button if successful (yes/no).

De Greck et al., 2012a

n=32 See a picture of a person (face) and empathize with the person. Expressions are neutral or angry. After the picture, rate how strongly you were able to empathize (VAS).

See a picture of a person (face). Expressions are neutral or angry. After the picture, rate the darkness of the skin. (VAS).

De Greck et al., 2012b

n=20 See a picture of a person (face) and empathize with the person. Expressions are neutral or angry. After the picture, rate how strongly you were able to empathize (VAS).

See a picture of a person (face). Expressions are neutral or angry. After the picture, rate the darkness of the skin. (VAS).

Guo et al., 2013

n=40 View pictures of body parts under painful stimulation. Watch attentively and try to experience the feelings of the target person.

View pictures of body parts under non-painful stimulation/touch. Watch attentively and try to experience the feelings of the target person.

Lamm et al., 2010

n=23 View picture of hand/lower arm punctured by needle, share the affect (pain). Rate the amount of pain felt by the target (VAS). Note: We used only data for similar targets.

View picture of hand/lower arm touched by Q-tip, share the affect. Rate the amount of pain felt by the target (VAS). Note: We used only data for similar targets.

Mochizuki et al., 2013

n=18 View pictures of body parts showing marks of hurtful injury (e.g., skin burn). Imagine yourself being injured like that (i.e., being in the same situation). After fMRI, answer several questions about stimuli (e.g., ‘how hard it was to imagine being in the situation?’).

View pictures of regular, unharmed body parts (e.g., healthy skin). Imagine yourself being in the same situation as the model. After fMRI, answer several questions about stimuli (e.g., ‘how hard it was to imagine being in the situation?’).

Nomi et al., 2008

n=14 View picture of a face with emotional expression and empathize with the person (i.e., share the emotion). No response in the scanner.

View picture of a scrambled face. No instruction, no response.

Preis et al., 2013

n=64 View picture of body part (hand) under painful stimulation. Imagine how the person in the picture feels and rate her pain (VAS).

View picture of body part (hand) under non-painful stimulation. Imagine how the person in the picture feels and rate her pain (VAS).

Reniers et al., 2014a

n=15 View picture of a face with sad expression. Imagine what the person in the picture is feeling. No response in the scanner.

View picture of a face with neutral expression. Imagine what the person in the picture is feeling. No response in the scanner.

Schulte-Ruether et al., 2011

n=14 View picture of face with emotional expression. Empathize with the person by 'feeling into' her, and identify the emotional state observed in the face. Button press, select among options (e.g., 'sad', 'happy', ...).

View picture of face with neutral expression. Make a perceptual judgment about width of the face (e.g., 'thin', 'normal', ...). Button press, select among options.

Schulte-Ruether et al., 2014

n=12 View picture of face with emotional expression. Empathize with the person by 'feeling into' her and evaluate your own emotional response to that. Button press, select among options.

View picture of face with neutral expression. Make a perceptual judgment about width of the face (e.g., 'thin', 'normal', ...). Button press, select among options.

van der Heiden et al., 2013

n=18 View pictures showing body parts under painful stimulation. Imagine it is you suffering as seen on the picture, or try to put yourself into the perspective of an observer and imagine watching the other experiencing the pain (activation collapsed over two instructions). Indicate on each trial how successfully you could adopt the respective perspective (VAS).

Pictures showing body parts under painful or non-painful stimulation (activation collapsed over trials). Perceive the stimuli simply as a picture itself, not representing any real-life situation. Press the response button twice (on every trial).


Supplementary Table S4.4.Evaluating Situated Emotions (n=15).

Authors n Experimental Condition Control Condition

Fourie et al., 2017

n=38 Watch videos of persons talking about emotional autobiographic events. Persons are victims (or relatives of victims) of apartheid atrocities in South Africa, recalling their experiences (e.g., the person's son was killed by state security police). Rate your empathic concern after each video block, by indicating how sorry you feel for the person (button press, Likert-scale rating). (Note: Brain activity averaged across South-African subgroups of black and white ethnicity).

Watch neutral videos of persons being interviewed, for example for university curricula. Rate your empathic concern after each video block, by indicating how sorry you feel for the person (button press, Likert-scale rating).

Fourie et al., 2019

n=36 Watch videos of persons talking about emotional autobiographic events. Persons are victims or perpetrators of apartheid atrocities in South Africa, recalling their experiences (e.g., the person's son was killed by state security police). Empathize with persons in the videos and imagine how they feel. After scanning, report personal/empathic concern, moral indignation, guilt, and shame in response to each video.

Watch neutral videos of persons being interviewed, for example for university curricula. Empathize with individuals in each clip and imagine how they felt in their situation.

Harvey et al., 2013

n=15 View videos of person talking about a pos. or neg. autobiographic event. Rate (continuously during video) how pos. or neg. you believe that target feels at each moment while talking.

View videos of person talking about a pos. or neg. autobiographic event. Rate (continuously during video) how far to the left or right the target’s eye gaze is directed relative to the screen.

Immordino-Yang et al., 2009

n=13 Before scanning, read stories about persons in emotion-triggering situations – experiencing ‘social’ pain (e.g., personal loss). In the scanner, see pictures that serve as reminders of these stories. Think about the story and report strength of own emotion via button press.

Before scanning, read stories about persons in non-emotion-triggering situations. In the scanner, see pictures that serve as reminders of these stories. Think about the story and report strength of own emotion via button press.

Kanske et al., 2015a

n=25 (exp 1)

Watch videos of person talking about a negative autobiographical event. Rate how you feel and how much compassion you feel for the person in video.

Watch videos of person talking about a neutral autobiographical event. Rate how you feel and how much compassion you feel for the person in video.

Kanske et al., 2015b

n=178 (exp 2)

Watch videos of person talking about a negative autobiographical event. Rate how you feel and how much compassion you feel for the person in video.

Watch videos of person talking about a neutral autobiographical event. Rate how you feel and how much compassion you feel for the person in video.

Krach et al., 2015

n=16 View a hand-drawn sketch of a protagonist in a socially undesirable situation and read an introductory sentence along with it (e.g., the protagonist gives a speech in front of an audience but forgets what they want to say). Rate how painful the situation was (i.e., vicarious embarrassment).

View hand-drawn sketch of a protagonist in a neutral public situation, accompanied by an introductory sentence (i.e., the protagonist returns a book at the library). Rate how painful the situation was (i.e., vicarious embarrassment).

Mackes et al., 2018

n=47 View videos of person talking about emotional autobiographic events (happy, sad, angry, or frightened). Rate (continuously during video) the emotional intensity of the target at each moment while talking. After the video clip, categorize the emotion of the target, and your own emotion.

View videos of person describing his/her own bedroom (neutral emotion condition). Rate (continuously during video) the emotional intensity of the target at each moment while talking. After the video clip, categorize the emotion of the target, and your own emotion (correct response is neutral).

Pehrs et al., 2017

n=28 Watch a short video depicting a close-up of a character with a moderately sad facial expression (stimuli were piloted, rated and selected for being moderately sad) and no mouth movements. The video was preceded by a short vignette placing the character into a sad context (e.g., the character recently lost a loved person). After the video, rate your current emotional state in terms of compassion and ‘being moved’.

Watch a short video depicting a close-up of a character with a moderately sad facial expression (stimuli were piloted, rated and selected for being moderately sad) and no mouth movements. The video was preceded by a short vignette presenting neutral information (e.g., the character is sitting in a car). After the video, rate your current emotional state in terms of compassion and ‘being moved’.

Rameson et al., 2012

n=32 Read contextual sentence about a sad event (e.g., someone gets fired from a job) and view pictures of persons. Take the person’s perspectives and imagine how each one feels about the event. No response in scanner.

Read contextual sentence about a sad event (e.g., someone gets fired from a job) and view pictures of persons. Respond to pictures ‘naturally’ as if you would see them in a magazine.

Regenbogen et al., 2012

n=27 Watch video of person talking about an emotional autobiographic event. Rate how you or the target person feels (pos. vs. neg.) on VAS after each video (activation collapsed across conditions).

Watch video of person telling about a neutral autobiographic event. Rate how you or the target person feels (pos. vs. neg.) on VAS after each video (activation collapsed across conditions).

Reyes Aguilar et al., 2017

n=34 Read a short vignette about an emotionally negative or positive event, then view a picture of the person to whom this happened. Think about what this person is feeling (no overt response).

Read a short vignette about an emotionally neutral event, then view a picture of the person to whom this happened. Think about what this person is feeling (no overt response).


Schneider et al., 2013

n=28 Watch video of person telling about an emotional autobiographic event. Rate how you or the target person feels (positive versus negative) on VAS after each video (activation collapsed across conditions).

Watch video of person telling about a neutral autobiographic event. Rate how you or the target person feels (positive versus negative) on VAS after each video (activation collapsed across conditions).

Zaki et al., 2009

n=16 View videos of person talking about a pos. or neg. autobiographic event. Rate (continuously during video) how pos. or neg. you believe that target feels at each moment while talking. Brain activation for trials where your rating is highly consistent with self-report of experienced emotion of the person. (Parametric analysis).

View videos of person talking about a pos. or neg. autobiographic event. Rate (continuously during video) how pos. or neg. you believe that target feels at each moment while talking. Brain activation for trials where your rating is rather inconsistent with self-report of experienced emotion of the person. (Parametric analysis).

Zaki et al., 2012

n=16 View videos of person talking about a pos. or neg. autobiographic event. Rate (continuously during video) how pos. or neg. you believe that target feels at each moment while talking.

View videos of person talking about a pos. or neg. autobiographic event. Rate (continuously during video) how far to the left or right the target’s eye gaze is directed relative to the screen.


Supplementary Table S4.5.Reasoning About Emotions (n=12).

Authors n Experimental Condition Control Condition

Hooker et al., 2008

n=20 View a picture showing a scene with two persons; one has a true belief, the other a false belief regarding an emotion-triggering state of affairs. Imagine what the person with a false belief would feel if he/she had more knowledge. Select 1 out of 4 options (e.g., ‘angry’, ‘afraid’, ...). Button press.

View a picture showing a scene with two persons; one has a true belief, the other a false belief regarding an emotion-triggering state of affairs. Imagine what the person with a true belief would feel if he/she had more knowledge (i.e., no change needs to be predicted). Select 1 out of 4 options (e.g., ‘angry’, ‘afraid’, ...). Button press.

Hooker et al., 2010

n=15 View series of pictures with two persons. One person realizes that she mistakenly had a false belief regarding an emotion-triggering state of affairs. That leads to a change of her emotion. Indicate if pictures showed a social (i.e., emotional), a physical, or no change. Button press.

View series of pictures with two persons. One person has a false belief regarding an emotion-triggering state of affairs. Nothing changes over pictures (i.e., so this person does not realize that she had a false belief). Indicate if pictures showed a social, a physical, or no change. Button press.

Kim et al., 2010

n=19 View cartoon stories showing a suffering person (e.g., homeless, injured) and another person. Predict what will happen next (e.g., other person will help), select among alternative pictures. Button press.

View cartoon stories showing one or two persons in emotionally neutral everyday situations. Predict what will happen next based on physical causality, select among alternative pictures. Button press.

Mano et al., 2009

n=18 Read story including a person either witnessing or being ignorant about (conditions combined in conjunction analysis) an emotionally pleasing or unpleasing event. Indicate if the event is positive for that person (button press).

Read story including a person and an unrelated pleasing or unpleasing event happening to another person. Indicate if the event is positive for the first, unrelated person (button press, answer is always no).

Powell et al., 2017

n=12 View a comic strip of two characters interacting and think about what would make the protagonist feel better. By button press, choose which further image best completes the preceding pictures. See also Voellm et al., 2006b.

View comic strip depicting two characters and think about what would be most likely to happen next. By button press, choose which further image best completes the preceding pictures.

Schlaffke et al., 2015

n=39 View cartoon stories of two interacting persons (2/3 of stories feature deception, 1/3 feature cooperation). Pictures are in correct order. After that, answer silently two questions (written on the screen) concerning emotions and feelings of the persons (no overt response).

View cartoon stories of two interacting persons (2/3 of stories feature deception, 1/3 feature cooperation). Pictures are in jumbled order. After that, answer silently two questions (written on the screen) concerning properties of objects appearing in the scene (no overt response).

Schmitgen et al., 2016

n=22 View a false-belief cartoon story in which the protagonist displays an emotional or neutral facial expression (e.g., the protagonist finds that the bird cage's door is open and the bird is nowhere to be seen, but a cat is sitting close by; in the last frame, the protagonist sees the bird, which had been hidden from the protagonist's view, flying away). Indicate by button press whether the protagonist's affective state has changed relative to the previous pictures (worse - equal - better).

View a false-belief cartoon story in which the protagonist displays an emotional or neutral facial expression. Indicate by button press whether the number of living beings in the picture has changed.

Schnell et al., 2011

n=21 View cartoon showing a character in a social scene where she first misinterprets another’s intention, and later comes to understand it correctly. Indicate by button press if the character’s emotional state changed (i.e., put yourself in the shoes of the protagonist), e.g., 'is there a change from afraid to non-afraid?' In another condition, indicate by button press if the number of persons visible to the protagonist has changed. Activation was collapsed across the two conditions.

View cartoon showing a character in a social scene where she first misinterprets another’s intention, and later comes to understand it correctly. Indicate by button press if your emotional state changed, or if the number of persons you can see changed (i.e., no perspective taking required). Activation collapsed across tasks.

Sebastian et al., 2012a

n=30(same sample as Sebastian et al., 2012b)

Cartoon story showing two persons. One is in an emotion-triggering situation. Predict what the other person will do to make her feel better (button press).

Cartoon story showing two persons in neutral everyday situation. Predict what will happen next based on physical causality (button press).

Voellm et al., 2006a

n=13 (same sample as Voellm et al., 2006b)

Cartoon story showing two persons. One is in an emotion-triggering situation. Predict what the other person will do to make her feel better (button press).

Cartoon story showing two persons in neutral everyday situation. Predict what will happen next based on physical causality (button press).

Wang et al., 2015

n=56 Cartoon story showing two persons. One is in an emotion-triggering situation. Predict what the other person will do to make her feel better (button press).

Cartoon story showing two persons in neutral everyday situation. Predict what will happen next based on physical causality (button press).

Willert et al., 2015

n=81 View cartoon stories consisting of three pictures. By button press, judge whether the protagonist's affective state changed from the previous picture (better - equal - worse). See Schnell et al., 2011.

View cartoon stories consisting of three pictures. By button press, indicate whether the number of living beings changed from the previous picture (more - less - equal).


References for the empathy meta-analyses

Akitsuki, Y., & Decety, J. (2009). Social context and perceived agency affects empathy for pain: an event-related fMRI investigation. NeuroImage, 47(2), 722-734.

Andersson, F., Glaser, B., Spiridon, M., Debbané, M., Vuilleumier, P., & Eliez, S. (2008). Impaired activation of face processing networks revealed by functional magnetic resonance imaging in 22q11.2 deletion syndrome. Biological Psychiatry, 63(1), 49-57.

Azevedo, R. T., Macaluso, E., Avenanti, A., Santangelo, V., Cazzato, V., & Aglioti, S. M. (2013). Their pain is not our pain: brain and autonomic correlates of empathic resonance with the pain of same and different race individuals. Human Brain Mapping, 34(12), 3168-3181.

Azevedo, R. T., Macaluso, E., Viola, V., Sani, G., & Aglioti, S. M. (2014). Weighing the stigma of weight: An fMRI study of neural reactivity to the pain of obese individuals. NeuroImage, 91, 109-119.

Ballotta, D., Lui, F., Porro, C., Nichello, P., & Benuzzi, F. (2018). Modulation of neural circuits underlying temporal production by facial expressions of pain. PLoS ONE, 13(2), e0193100.

Batut, A.-C., Gounot, D., Namer, I., Hirsch, E., Kehrli, P., & Metz-Lutz, M.-N. (2006). Neural responses associated with positive and negative emotion processing in patients with left versus right temporal lobe epilepsy. Epilepsy & Behavior, 9, 415-423.

Bos, P. A., Montoya, E. R., Hermans, E. J., Keysers, C., & van Honk, J. (2015). Oxytocin reduces neural activity in the pain circuitry when seeing pain in others. NeuroImage, 113, 217-224.

Carr, L., Iacoboni, M., Dubeau, M. C., Mazziotta, J. C., & Lenzi, G. L. (2003). Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas. Proceedings of the National Academy of Sciences, 100(9), 5497-5502.

Chakrabarti, B., Bullmore, E., & Baron-Cohen, S. (2006). Empathizing with basic emotions: common and discrete neural substrates. Social Neuroscience, 1(3-4), 364-384.

Christov-Moore, L., Conway, P., & Iacoboni, M. (2017). Deontological dilemma response tendencies and sensorimotor representations of harm to others. Frontiers in Integrative Neuroscience, 11, 34.

Cools, R., Calder, A., Lawrence, A., Clark, L., Bullmore, E., & Robbins, T. (2005). Individual differences in threat sensitivity predict serotonergic modulation of amygdala response to fearful faces. Psychopharmacology, 180, 670-679.

Critchley, H., Daly, E., Phillips, M., Brammer, M., Bullmore, E., Williams, S., ... & Murphy, D. (2000). Explicit and implicit neural mechanisms for processing of social information from facial expressions: a functional magnetic resonance imaging study. Human Brain Mapping, 9(2), 93-105.

Decety, J., Echols, S., & Correll, J. (2010). The blame game: the effect of responsibility and social stigma on empathy for pain. Journal of Cognitive Neuroscience, 22(5), 985-997.

Ernst, J., Northoff, G., Böker, H., Seifritz, E., & Grimm, S. (2013). Interoceptive awareness enhances neural activity during empathy. Human Brain Mapping, 34(7), 1615-1624.

Fan, Y. T., Chen, C., Chen, S. C., Decety, J., & Cheng, Y. (2013). Empathic arousal and social understanding in individuals with autism: evidence from fMRI and ERP measurements. Social Cognitive and Affective Neuroscience, 9(8), 1203-1213.

Fourie, M., Stein, D., Solms, M., Gobodo-Madikizela, P., & Decety, J. (2017). Empathy and moral emotions in post-apartheid South Africa: an fMRI investigation. Social Cognitive and Affective Neuroscience, 12(6), 881-892.

Fourie, M., Stein, D., Solms, M., Gobodo-Madikizela, P., & Decety, J. (2019). Effects of early adversity and social discrimination on empathy for complex mental states: An fMRI investigation. Scientific Reports, 9,12959.

Fujino, J., Yamasaki, N., Miyata, J., Kawada, R., Sasaki, H., Matsukawa, N., ... & Aso, T. (2014). Altered brain response to others' pain in major depressive disorder. Journal of Affective Disorders, 165, 170-175.

Gao, X., Pan, W., Li, C., Wng, L., Yao, M., & Chen, A. (2017). Long-time exposure to violent video games does not show desensitization on empathy for pain: An fMRI study. Frontiers in Psychology, 8, 650.


Gorno-Tempini, M. L., Pradelli, S., Serafini, M., Pagnoni, G., Baraldi, P., Porro, C., ... & Nichelli, P. (2001). Explicit and incidental facial expression processing: an fMRI study. NeuroImage, 14(2), 465-473.

de Greck, M., Shi, Z., Wang, G., Zuo, X., Yang, X., Wang, X., ... & Han, S. (2012a). Culture modulates brain activity during empathy with anger. NeuroImage, 59(3), 2871-2882.

de Greck, M., Wang, G., Yang, X., Wang, X., Northoff, G., & Han, S. (2012b). Neural substrates underlying intentional empathy. Social Cognitive and Affective Neuroscience, 7(2), 135-144.

Guo, X., Zheng, L., Wang, H., Zhu, L., Li, J., Wang, Q., ... & Yang, Z. (2013). Exposure to violence reduces empathetic responses to other’s pain. Brain and Cognition, 82(2), 187-191.

Habel, U., Windischberger, C., Derntl, B., Robinson, S., Kryspin-Exner, I., Gur, R., & Moser, E. (2007). Amygdala activation and facial expressions: explicit emotion discrimination versus implicit emotion processing. Neuropsychologia, 45(10), 2369-2377.

Harvey, P. O., Zaki, J., Lee, J., Ochsner, K., & Green, M. F. (2013). Neural substrates of empathic accuracy in people with schizophrenia. Schizophrenia Bulletin, 39(3), 617-628.

Hennenlotter, A., Schroeder, U., Erhard, P., Castrop, F., Haslinger, B., Stoecker, D., ... & Ceballos-Baumann, A. O. (2005). A common neural basis for receptive and expressive communication of pleasant facial affect. NeuroImage, 26(2), 581-591.

Hooker, C. I., Verosky, S. C., Germine, L. T., Knight, R. T., & D’Esposito, M. (2008). Mentalizing about emotion and its relationship to empathy. Social Cognitive and Affective Neuroscience, 3(3), 204-217.

Hooker, C. I., Verosky, S. C., Germine, L. T., Knight, R. T., & D'Esposito, M. (2010). Neural activity during social signal perception correlates with self-reported empathy. Brain Research, 1308, 100-113.

Horan, W., Iacoboni, M., Cross, K., Korb, A., Lee, J., Nori, P., … Green, M. (2014). Self-reported empathy and neural activity during action imitation and observation in schizophrenia. NeuroImage: Clinical, 5, 100-108.

Iidaka, T., Omori, M., Murata, T., Kosaka, H., Yonekura, Y., Okada, T., & Sadato, N. (2001). Neural interaction of the amygdala with the prefrontal and temporal cortices in the processing of facial expressions as revealed by fMRI. Journal of Cognitive Neuroscience, 13(8), 1035-1047.

Immordino-Yang, M. H., McColl, A., Damasio, H., & Damasio, A. (2009). Neural correlates of admiration and compassion. Proceedings of the National Academy of Sciences, 106(19), 8021-8026.

Kanske, P., Böckler, A., Trautwein, F. M., & Singer, T. (2015a,b). Dissecting the social brain: Introducing the EmpaToM to reveal distinct neural networks and brain–behavior relations for empathy and Theory of Mind. NeuroImage, 122, 6-19.

Kim, Y. T., Lee, J. J., Song, H. J., Kim, J. H., Kwon, D. H., Kim, M. N., ... & Chang, Y. (2010). Alterations in cortical activity of male methamphetamine abusers performing an empathy task: fMRI study. Human Psychopharmacology: Clinical and Experimental, 25(1), 63-70.

Krach, S., Kamp-Becker, I., Einhäuser, W., Sommer, J., Frässle, S., Jansen, A., … Paulus, F. (2015). Evidence from pupillometry and fMRI indicates reduced neural response during vicarious social pain but not physical pain in autism. Human Brain Mapping, 36, 4730-4744.

Kühn, S., Kugler, D., Schmalen, K., Weichenberger, M., Witt, C., & Gallinat, J. (2018). The myth of blunted gamers: No evidence for desensitization in empathy for pain after a violent video game intervention in a longitudinal fMRI study on non-gamers. Neurosignals, 26, 22-30.

Lamm, C., Meltzoff, A. N., & Decety, J. (2010). How do we empathize with someone who is not like us? A functional magnetic resonance imaging study. Journal of Cognitive Neuroscience, 22(2), 362-376.

Lange, K., Williams, L. M., Young, A. W., Bullmore, E. T., Brammer, M. J., Williams, S. C., ... & Phillips, M. L. (2003). Task instructions modulate neural responses to fearful facial expressions. Biological Psychiatry, 53(3), 226-232.

Luo, S., Shi, Z., Yang, X., Wang, X., & Han, S. (2013). Reminders of mortality decrease midcingulate activity in response to others’ suffering. Social Cognitive and Affective Neuroscience, 9(4), 477-486.

Ma, Y., Wang, C., & Han, S. (2011b). Neural responses to perceived pain in others predict real-life monetary donations in different socioeconomic contexts. NeuroImage, 57(3), 1273-1280.

Mackes, N., Golm, D., O’Daly, O., Sarkar, S., Sonuga-Barke, E., Fairchild, G., & Mehta, M. (2018). Tracking emotions in the brain – Revisiting the Empathic Accuracy Test. NeuroImage, 178, 677-686.

Mano, Y., Harada, T., Sugiura, M., Saito, D. N., & Sadato, N. (2009). Perspective-taking as part of narrative comprehension: a functional MRI study. Neuropsychologia, 47(3), 813-824.

Mochizuki, H., Baumgärtner, U., Kamping, S., Ruttorf, M., Schad, L., Flor, H., … Treede, R.-D. (2013). Cortico-subcortical activation patterns for itch and pain imagery. Pain, 154, 1989-1998.

Moriguchi, Y., Ohnishi, T., Kawachi, T., Mori, T., Hirakata, M., Yamada, M., ... & Komaki, G. (2005). Specific brain activation in Japanese and Caucasian people to fearful faces. Neuroreport, 16(2), 133-136.

Morrison, I., Lloyd, D., Di Pellegrino, G., & Roberts, N. (2004). Vicarious responses to pain in anterior cingulate cortex: is empathy a multisensory issue? Cognitive, Affective, & Behavioral Neuroscience, 4(2), 270-278.

Nomi, J. S., Scherfeld, D., Friederichs, S., Schäfer, R., Franz, M., Wittsack, H. J., ... & Seitz, R. J. (2008). On the neural networks of empathy: A principal component analysis of an fMRI study. Behavioral and Brain Functions, 4(1), 41.

Ochsner, K. N., Zaki, J., Hanelin, J., Ludlow, D. H., Knierim, K., Ramachandran, T., ... & Mackey, S. C. (2008). Your pain or mine? Common and distinct neural systems supporting the perception of pain in self and other. Social Cognitive and Affective Neuroscience, 3(2), 144-160.

Olsson, A., Nearing, K. I., & Phelps, E. A. (2007). Learning fears by observing others: the neural systems of social fear transmission. Social Cognitive and Affective Neuroscience, 2(1), 3-11.

Pehrs, C., Zaki, J., Schlochtermeier, L., Jacobs, A., Kuchinke, L., & Koelsch, S. (2017). The temporal pole top-down modulates the ventral visual stream during social cognition. Cerebral Cortex, 27, 777-792.

Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., ... & Gray, J. A. (1997). A specific neural substrate for perceiving facial expressions of disgust. Nature, 389(6650), 495.

Powell, J., Grossi, D., Corcoran, R., Gobet, F., & García-Fiñana, M. (2017). The neural correlates of theory of mind and their role during empathy and game of chess: A functional magnetic resonance imaging study. Neuroscience, 355, 149-160.

Preis, M. A., Schmidt-Samoa, C., Dechent, P., & Kroener-Herwig, B. (2013). The effects of prior pain experience on neural correlates of empathy for pain: An fMRI study. Pain, 154(3), 411-418.

Qiao-Tasserit, E., Corradi-Dell’Acqua, C., & Vuilleumier, P. (2018). The good, the bad, and the suffering. Transient emotional episodes modulate the neural circuits of pain and empathy. Neuropsychologia, 116(A), 99-116.

Rameson, L. T., Morelli, S. A., & Lieberman, M. D. (2012). The neural correlates of empathy: experience, automaticity, and prosocial behavior. Journal of Cognitive Neuroscience, 24(1), 235-245.

Regenbogen, C., Schneider, D. A., Gur, R. E., Schneider, F., Habel, U., & Kellermann, T. (2012). Multimodal human communication—targeting facial expressions, speech content and prosody. NeuroImage, 60(4), 2346-2356.

Reniers, R. L., Völlm, B. A., Elliott, R., & Corcoran, R. (2014). Empathy, ToM, and self–other differentiation: An fMRI study of internal states. Social Neuroscience, 9(1), 50-62.

Reyes-Aguilar, A., Fernandez-Ruiz, J., Pasaye, E., & Barrios, F. (2017). Executive mechanisms for thinking about negative situations in both cooperative and non-cooperative contexts. Frontiers in Human Neuroscience, 11, 275.

Schmitgen, M., Walter, H., Drost, S., Rückl, S., & Schnell, K. (2016). Stimulus-dependent amygdala involvement in affective theory of mind generation. NeuroImage, 129, 450-459.

Schneider, K., Regenbogen, C., Pauly, K. D., Gossen, A., Schneider, D. A., Mevissen, L., ... & Schneider, F. (2013). Evidence for gender‐specific endophenotypes in high‐functioning autism spectrum disorder during empathy. Autism Research, 6(6), 506-521.

Schnell, K., Bluschke, S., Konradt, B., & Walter, H. (2011). Functional relations of empathy and mentalizing: an fMRI study on the neural basis of cognitive empathy. NeuroImage, 54(2), 1743-1754.

Schlaffke, L., Lissek, S., Lenz, M., Juckel, G., Schultz, T., Tegenthoff, M., ... & Brüne, M. (2015). Shared and nonshared neural networks of cognitive and affective theory‐of‐mind: A neuroimaging study using cartoon picture stories. Human Brain Mapping, 36(1), 29-39.

Schroeder, U., Hennenlotter, A., Erhard, P., Haslinger, B., Stahl, R., Lange, K., & Ceballos-Baumann, A. (2004). Functional neuroanatomy of perceiving surprised faces. Human Brain Mapping, 23(4), 181-187.

Schulte-Rüther, M., Greimel, E., Markowitsch, H. J., Kamp-Becker, I., Remschmidt, H., Fink, G. R., & Piefke, M. (2011). Dysfunctions in brain networks supporting empathy: an fMRI study in adults with autism spectrum disorders. Social Neuroscience, 6(1), 1-21.

Schulte-Rüther, M., Greimel, E., Piefke, M., Kamp-Becker, I., Remschmidt, H., Fink, G. R., ... & Konrad, K. (2014). Age-dependent changes in the neural substrates of empathy in autism spectrum disorder. Social Cognitive and Affective Neuroscience, 9(8), 1118-1126.

Seara-Cardoso, A., Viding, E., Lickley, R., & Sebastian, C. (2015). Neural responses to others’ pain vary with psychopathic traits in healthy adult males. Cognitive, Affective, and Behavioral Neuroscience, 15, 578-588.

Sebastian, C. L., Fontaine, N. M., Bird, G., Blakemore, S. J., De Brito, S. A., McCrory, E. J., & Viding, E. (2012a,b). Neural processing associated with cognitive and affective Theory of Mind in adolescents and adults. Social Cognitive and Affective Neuroscience, 7(1), 53-63.

Sprengelmeyer, R., Rausch, M., Eysel, U., & Przuntek, H. (1998). Neural structures associated with recognition of facial expressions of basic emotions. Proceedings of the Royal Society B: Biological Sciences, 265, 1927-1931.

Surguladze, S. A., Brammer, M. J., Young, A. W., Andrew, C., Travis, M. J., Williams, S. C., & Phillips, M. L. (2003). A preferential increase in the extrastriate response to signals of danger. NeuroImage, 19(4), 1317-1328.

Surguladze, S., Elkin, A., Ecker, C., Kalidindi, S., Corsico, A., Giampietro, V., … Phillips, M. (2008). Genetic variation in the serotonin transporter modulates neural system-wide response to fearful faces. Genes, Brain and Behavior, 7, 543-551.

Tei, S., Becker, C., Kawada, R., Fujino, J., Jankowski, K. F., Sugihara, G., ... & Takahashi, H. (2014). Can we predict burnout severity from empathy-related brain activity? Translational Psychiatry, 4(6), e393.

Toller, G., Adhimoolam, B., Grunwald, T., Huppertz, H.-J., Kurthen, M., Rankin, K., & Jokeit, H. (2015). Right mesial temporal lobe epilepsy impairs empathy-related brain responses to dynamic fearful faces. Journal of Neurology, 262, 729-741.

Ushida, T., Ikemoto, T., Tanaka, S., Shinozaki, J., Taniguchi, S., Murata, Y., ... & Tamura, Y. (2008). Virtual needle pain stimuli activates cortical representation of emotions in normal volunteers. Neuroscience Letters, 439(1), 7-12.

Van der Gaag, C., Minderaa, R. B., & Keysers, C. (2007). Facial expressions: what the mirror neuron system can and cannot tell us. Social Neuroscience, 2(3-4), 179-222.

van der Heiden, L., Scherpiet, S., Konicar, L., Birbaumer, N., & Veit, R. (2013). Inter-individual differences in successful perspective taking during pain perception mediates emotional responsiveness in self and others: an fMRI study. NeuroImage, 65, 387-394.

Völlm, B. A., Taylor, A. N., Richardson, P., Corcoran, R., Stirling, J., McKie, S., ... & Elliott, R. (2006a,b). Neuronal correlates of theory of mind and empathy: a functional magnetic resonance imaging study in a nonverbal task. NeuroImage, 29(1), 90-98.

Vuilleumier, P., Armony, J. L., Driver, J., & Dolan, R. J. (2001). Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron, 30(3), 829-841.

Vuilleumier, P., Richardson, M. P., Armony, J. L., Driver, J., & Dolan, R. J. (2004). Distant influences of amygdala lesion on visual cortical activation during emotional face processing. Nature Neuroscience, 7(11), 1271.

Wang, Y., Liu, W.-H., Li, Z., Wei, X.-H., Jiang, X.-Q., Neumann, D., & Chan, R. (2015). Dimensional schizotypy and social cognition: an fMRI imaging study. Frontiers in Behavioral Neuroscience, 9, 133.

Wicker, B., Keysers, C., Plailly, J., Royet, J. P., Gallese, V., & Rizzolatti, G. (2003). Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron, 40(3), 655-664.

Willert, A., Mohnke, S., Erk, S., Schnell, K., Romanczuk-Seiferth, N., Quinlivan, E., … Walter, H. (2015). Alterations in neural theory of mind processing in euthymic patients with bipolar disorder and unaffected relatives. Bipolar Disorders, 17, 880-891.

Williams, L., Liddell, B., Kemp, A., Bryant, R., Meares, R., Peduto, A., & Gordon, E. (2006). Amygdala-prefrontal dissociation of subliminal and supraliminal fear. Human Brain Mapping, 27, 652-661.

Wolfensberger, S., Veltman, D., Hoogendijk, W., Boomsma, D., & de Geus, E. (2008). Amygdala responses to emotional faces in twins discordant or concordant for the risk for anxiety and depression. NeuroImage, 41, 544-552.

Zaki, J., Ochsner, K. N., Hanelin, J., Wager, T. D., & Mackey, S. C. (2007). Different circuits for different pain: patterns of functional connectivity reveal distinct networks for processing pain in self and others. Social Neuroscience, 2(3-4), 276-291.

Zaki, J., Weber, J., Bolger, N., & Ochsner, K. (2009). The neural bases of empathic accuracy. Proceedings of the National Academy of Sciences, 106(27), 11382-11387.

Zaki, J., Weber, J., & Ochsner, K. (2012). Task-dependent neural bases of perceiving emotionally expressive targets. Frontiers in Human Neuroscience, 6, 228.
