
Brain-Music Duet: MEG signal complexity and auditory

perception in musicians and nonmusicians

by

Sarah M. Carpentier

A thesis submitted in conformity with the requirements

for the degree of Master of Arts

Department of Psychology

University of Toronto

© Copyright by Sarah M. Carpentier 2011


Brain-Music Duet: MEG signal complexity and auditory perception

in musicians and nonmusicians

Sarah M. Carpentier

Degree of Master of Arts

Department of Psychology

University of Toronto

2011

Abstract

Music training has been suggested to lead to an enhancement in the neural activity associated

with music processing. It has been proposed that brain signal complexity is a reflection of the

functional capacity of that neural system. The present study tested the hypothesis that musicians

have a larger repertoire of brain activity associated with musical perception than nonmusicians.

We used multiscale entropy to capture the complexity of the MEG signal while musicians and

nonmusicians listened to different melodies. We observed that initial melody presentation

elicited higher complexity in musicians compared to nonmusicians. Brain signal complexity

decreased in both groups as a function of stimulus repetition. We propose that the neural

networks that underlie auditory processing have a more diverse range of functioning in

musicians, as compared to nonmusicians. Repetition reduces the amount of information

processing and corresponding brain signal complexity.


Acknowledgements

First and foremost, I have to express my deep gratitude to my supervisor Randy. His

enrichment of my academic career has extended well beyond my expectations, and his scientific

innovation and strength of character are inspirational. By setting such high standards for himself

he encourages all of his students to strive beyond their limits to the best of their abilities.

Thank you, Randy for your guidance, encouragement, and faith in my potential. I consider

myself to be extremely privileged not only to be one of your students, but to know you.

Secondly, I would like to thank the other members of the McIntosh Lab who have

provided various types of feedback and support: Marc Berman, Kelly Shen, Anjali Beharelle,

Zainab Fatima, Erin Gibson, Michele Korostil, Grigori Yourganov, Maria Karacchalios and

Tanya Brown. Special appreciation goes to the following people: Jennifer Heisz – for being a

stupendous and patient sounding board, for providing intellectual and personal support, and for

giving the world a strong female role model; Vasily Vakorin – for helping me make my silly

little ideas a computational reality and for always surpassing the definition of ‘complex’; Natasa

Kovacevic – for making labs across the world wish human cloning was a reality; Bratislav Misic

– for always supplying the best English translation for scientific jargon; Gleb Bezgin – for his

effervescent personality that lights up the office, and for being the person who could probably

answer my ideas better than me; and Hongye Wang – for helping me with every red error

message, and for doing the jobs of three research assistants.

Also, I need to thank Takako Fujioka and Bernhard Ross for being open to new ideas and

generously sharing their data for this project. I admire Takako’s dedication to the study of music,

both in and outside the lab. She has made significant contributions to science, has ameliorated

patients’ lives, and impressively still manages to find spare time to enrich people’s lives with


music. I thank Bernhard for his patience, for sharing his MEG expertise, and for invaluable help

with data pre-processing for this project.

Importantly, I also acknowledge the efforts put forth by my subsidiary advisor, Glenn

Schellenberg, toward reading my drafts and providing helpful feedback. I thank Glenn for

tolerating the relatively micro examinations of music provided by neuroscience, and for the

reminders that music is not an experimental entity that exists in isolation from the rest of the

world. He reminds me that only through collaboration from multiple different approaches can

human behaviour and cognition be fully explained. I hope that someday we can put our different

pieces together and take a look at our completed puzzle. Also, to the other member of my

committee, Jennifer Ryan, thank you for taking your time to read this work, listen to me ramble

on about it, and provide helpful feedback.

Finally, I need to thank all my family and friends who helped me through this process.

My mother is the best teacher I have had the pleasure of knowing, and I thank her for the

countless hours she has spent editing every academic paper of my career. After all the help she

has given my sister and me over the years, in my family we say that my mother has Bachelor’s

degrees from Barnard College, McMaster University, and Queen’s University, a law degree from

the University of Ottawa, and Master’s degrees in Education from OISE, International Relations

from Carleton University, and now Psychology from the University of Toronto. Also, I thank my

father, for never stopping trying to be a better version of himself professionally and personally,

even though he is perfect now, and for trying now to fit as much neuroscience as possible into

his life. Lastly, Jonathan, my partner in adventure, who sets the bar high, maybe, sometimes,

possibly beats me at Trivial Pursuit, and never fails to supply me with the two most valuable

commodities, joy and relaxation.


Table of Contents

Abstract

Acknowledgements

Table of Contents

List of Figures

Introduction

Materials and Methods
    Participants
    Stimuli
        Contour
        Interval
    Procedure
        Audio Presentation
        MEG Overview
        MEG Acquisition
    Data Analysis
        Multiscale Entropy
        Partial Least Squares

Results

Discussion
    Musicians versus nonmusicians
    Unconscious, automatic processing
    Melodic Contour versus Intervals
    Entropy and ERP

Conclusion

Figures

References


List of Figures

Figure 1. Musical stimuli

Figure 2. Results for standard melodies in the contour condition

Figure 3. Results for standard melodies in the interval condition

Figure 4. Results for deviant melodies in the contour condition

Figure 5. Results for deviant melodies in the interval condition


Brain-Music Duet: MEG signal complexity and auditory perception

in musicians and nonmusicians

Musical abilities are arguably some of the most complex of human accomplishments. A

successful concert pianist can bimanually coordinate the production of over 1,500 notes per

minute, and listening to a symphony can involve perception of over 30 different instruments.

Because music is an auditory stimulus that unfolds over time, music perception relies on the

listener’s ability to appreciate the contribution of different instruments separately and together as

they develop. Investigations of musical capabilities typically have divided a whole piece into

parts, and revolve around identifying the neural substrate of each particular component, such as

pitch (Patterson, Uppenkamp, Johnsrude, & Griffiths, 2002; Zatorre, Evans, & Meyer, 1994;

Zatorre & Samson, 1991), timbre (Menon et al., 2002; Samson, 2003), or rhythm (Chen,

Penhune, & Zatorre, 2008; Fujioka, Trainor, Large, & Ross, 2009; Snyder & Large, 2005). When

put back together, the neuroimaging evidence seems to suggest that music listening activates

distributed brain regions. How these regions interact to produce a unified perception remains to

be elucidated.

Each individual has his or her own unique musical experiences, and these experiences

influence perceptions, listening tastes, and abilities to play an instrument. These ‘musical

memories’ guide the processing of new, incoming music (Krumhansl & Castellano, 1983). For

example, a large experience-dependent effect is seen in listeners’ superior ability to process and

remember new music from their own culture (Hannon & Trehub, 2005; Morrison, Demorest,

Aylward, Cramer, & Maravilla, 2003). The neural basis for these ‘memories’ has been the

subject of ample investigation. Musicians provide an opportunity to investigate how the brain

can be altered with experience to support behaviour and cognition. Previous work has


demonstrated that music training is associated with widespread structural (Gaser & Schlaug,

2003a, 2003b; Hutchinson, Lee, Gaab, & Schlaug, 2003; Hyde et al., 2009b; Schlaug, Forgeard,

Zhu, Norton, & Winner, 2009; Schmithorst & Wilke, 2002) and functional (Bhattacharya &

Petsche, 2005; Jentschke & Koelsch, 2009; Koelsch, Schroger, & Tervaniemi, 1999; Trainor,

Shahin, & Roberts, 2009) changes. However, similar to work with nonmusicians, these studies

typically isolate a single component of music and localize the difference in mean activation

between musicians and nonmusicians. While informative, this procedure fails to capture the full

neural dynamics that are responsible for processing intricate pieces of music over time. We

suggest that full processing requires neural activity that fluctuates with the changes in the

developing musical piece. It has been previously suggested that signal complexity serves as a

measure of the functional repertoire of the system: higher complexity indicates a wider range of

configurations (Ghosh, Rho, McIntosh, Kotter, & Jirsa, 2008; McIntosh, Kovacevic, & Itier,

2008). Therefore, higher complexity in the human brain might promote cognition (McIntosh et

al., 2010; Protzner, Valiante, Kovacevic, McCormick, & McAndrews, 2010; Tononi, Edelman,

& Sporns, 1998). The present study proposes that processing the intricate acoustic events in

music requires complex neural network activity that operates at multiple temporal and spatial

scales. The more information that is available in a musical piece, the greater the variations in

brain network activity necessary to capture that information. Accordingly, reflecting their

advanced musical capabilities, we hypothesize that musicians should exhibit higher brain signal

complexity during music perception.

Nonlinear, dynamical systems provide a useful framework for understanding the

information processing capacity of brain networks (Bullmore & Sporns, 2009; Deco, Jirsa, &

McIntosh, 2011). Unlike linear structures, a nonlinear network system is capable of parallel

processing of information at multiple spatial and temporal scales (Honey, Kotter, Breakspear, &


Sporns, 2007; Kelso, 1995). These systems have highly complex activity and process diverse

information simultaneously within segregated local regions, and merge it interactively between

spatially distributed neural populations (Tononi, Sporns, & Edelman, 1994). Anatomical and

functional reports support the view of the brain as a complex, scale-invariant system (Bullmore & Sporns,

2009) that gives rise to consciousness (Tononi, Sporns & Edelman, 1998; Tononi & Edelman,

1998) and cognition (McIntosh et al., 2010). Complexity is supported through small-world type

organizational structure, including dense local connections and sparse long-range connections

(Watts & Strogatz, 1998). A system with sparse, reciprocal connections is able to integrate

more information than a system that is completely interconnected or one in which regions are

arranged hierarchically (Tononi, Sporns & Edelman, 1994; McIntosh, 2000). Small-world

organization is also believed to be an economical option for biological system organization

because longer axonal projections require more materials and energy (Cherniak, 1994).

Anatomical examinations of invertebrates (Watts & Strogatz, 1998), non-human primates

(Passingham, Stephan, & Kotter, 2002), and humans (Bullmore & Sporns, 2009; Hagmann et al.,

2008; Honey et al., 2009) agree that nervous systems at all levels have the structural

organization of small-world networks. Segregated functional specialization is fundamental to

brain organization (Bendor & Wang, 2005; Dobelle, Mladejovsky, & Girvin, 1974; Penfield &

Boldrey, 1937), and lesion studies have improved our understanding of neural functioning

(Goodale, Milner, Jakobson, & Carey, 1991; Scoville & Milner, 1957). However, while

localization can inform about the critical elements for a particular function, the full realization of

a unified consciousness requires coordination of interactions between different parts of the

network (McIntosh, 2000; Tononi & Edelman, 1998). Early connectivity estimates in the non-

human primate visual cortex suggested that only 30-40% of all possible connections are evident

among the different cortical areas (Felleman & Van Essen, 1991). More recently, thousands of


tracer studies of the macaque monkey cortex have been compiled in the CoCoMac database

(http://www.cocomac.org; Kotter, 2004). Meta-analyses of these studies support the proposal of

a balance between segregation and integration, indicative of high complexity and information

processing capacity (Passingham et al., 2002). In humans, a recent DSI study agreed that the

cortex is composed of structurally segregated and functionally specialized regions that are

interconnected by a network of cortico-cortical axonal pathways (Hagmann et al., 2008).

Functional connectivity analysis, defined as temporal relation of activity between neural

populations (Friston, 1994), confirms the multiscale network interactions expected from small-

world architecture and simulations (Honey et al., 2007; Tononi et al., 1994). Cortical regions

dynamically couple to one another forming transient functional networks associated with

perception, cognition, and action (Bressler, 1995; Buzaski & Draguhn, 2004; Nyberg et al.,

2000; Singer & Gray, 1995; Varela, Lachaux, Rodriguez, & Martinerie, 2001). For example, it

was demonstrated that functional connectivity between the fusiform face area (FFA), dorsolateral

and ventrolateral prefrontal cortex (PFC), premotor cortex, intraparietal sulcus, caudate nucleus,

thalamus, hippocampus, and occipital regions is modulated by maintenance of a face image in

working memory (Gazzaley, Rissman, & D'Esposito, 2004). Spontaneous activity in the default

and resting states, where there is no experimental input to the system, also displays distributed

network dynamics (Biswal, Yetkin, Haughton, & Hyde, 1995; Fox et al., 2005; Greicius,

Krasnow, Reiss, & Menon, 2003; Raichle et al., 2001). For example, Fox and colleagues (2005)

reported reliable functional connectivity across participants between the posterior cingulate

cortex (PCC), medial PFC, left dorsolateral PFC, bilateral inferoparietal cortex, left inferolateral

temporal cortex, and left parahippocampal gyrus when participants were awake but with their

eyes closed. When participants switched to listening to narrative text, the ‘default’ PCC activity

(Raichle et al., 2001) became inversely correlated with activity in Broca’s area, which became


positively connected to Wernicke’s area and the left premotor cortex (Hampson, Peterson,

Skudlarski, Gatenby, & Gore, 2002). These studies demonstrate that the anatomical architecture

of the cerebral cortex enables a shift between the transient functional networks necessary for

neurocognitive functioning (McIntosh, 2000).

It has been suggested that brain signal variability, or more formally complexity, can

measure the diversity of functional networks in a given brain (Ghosh et al., 2008; McIntosh et

al., 2008; McIntosh et al., 2010), which in turn serves as an index of the individual’s cognitive

capabilities. In line with this view, increases in complexity are observed from infancy to

adulthood (Anokhin, Birbaumer, Lutzenberger, Nikolaev, & Vogel, 1996; Lippé, Kovacevic, &

McIntosh, 2009; McIntosh et al., 2008; Meyer-Lindenberg, 1996), which likely coincides with

concomitant maturational increases in behavioural repertoire. Furthermore, increased complexity

has been shown to correlate positively with performance accuracy (Garrett, Kovacevic,

McIntosh, & Grady, 2010; McIntosh et al., 2008; Misic, Mills, Taylor, & McIntosh, 2010), and

is also negatively correlated with response time variability (Garrett et al., 2010; Garrett,

Kovacevic, McIntosh, & Grady, 2011; McIntosh et al., 2008) suggesting that greater stability of

cognitive responses is a result of increased neural network complexity. One recent study

demonstrated that EEG complexity increases as a function of previous knowledge of the stimulus

(Heisz, Shedden, & McIntosh, under review). The results revealed that famous faces elicited

higher signal complexity than novel faces as a function of personal familiarity with the famous

face (e.g., familiarity ratings, naming accuracy). The same study also showed that complexity

increases with acquired familiarity in an exposure training paradigm. Taken together, these

results support the hypothesis that a more complex functional neural architecture reflects the

enhanced dynamic repertoire of the system and supports a greater capacity for information

processing (Ghosh et al., 2008; McIntosh et al., 2010).


As already noted, music listening involves widespread neural activity, including the

prefrontal and temporal cortices (Patterson et al., 2002; Peretz & Zatorre, 2005; Zatorre, 1988;

Zatorre et al., 1994; Zatorre & Samson, 1991), premotor and supplementary motor area (SMA)

(Halpern, Zatorre, Bouffard, & Johnson, 2004; Halsband, Tanji, & Freund, 1993) cerebellum

(Ivry & Keele, 1989; Janata & Grafton, 2003), and parahippocampal and paralimbic regions

(Green et al., 2008; Menon & Levitin, 2005; Mizuno & Sugishita, 2007; Salimpoor, Benovoy,

Larcher, Dagher, & Zatorre, 2011). Initially, different components of music, such as pitch,

rhythm, and dynamics (loudness), are processed separately (De Santis, Clarke, & Murray, 2007;

De Santis, Spierer, Clarke, & Murray, 2007; Peretz, 1990), and integrated later to give the

impression of a unified musical piece (Munte, Altenmuller, & Jancke, 2002; Peretz & Kolinsky,

1993). For example, fixed pitches are processed bilaterally in Heschl’s gyrus (Griffiths, 2003;

Patterson et al., 2002), whereas posterior regions of the secondary auditory cortex process pitch

height, and anterior regions process pitch chroma (Warren, Uppenkamp, Patterson, & Griffiths,

2003). The superior temporal gyrus and planum polare are activated by intervals, contour, and

melody (Patterson et al., 2002). Rhythm processing invokes activity in the cerebellum, basal

ganglia (Ivry & Keele, 1989; Janata & Grafton, 2003), premotor cortex, and SMA (Halpern et

al., 2004; Halsband et al., 1993).

Anatomical studies have observed increased grey and white matter volumes in musicians

relative to nonmusicians (Gaser & Schlaug, 2003a; Schlaug, 2001; Schlaug, Jancke, Huang, &

Steinmetz, 1995). In order to distinguish between predisposition and training specific effects,

investigators in a recent study gave different levels of musical instruction to children for 15

months (Hyde et al., 2009a). Consistent with structural differences found between adult

musicians and nonmusicians (Bermudez & Zatorre, 2005; Lee, Chen, & Schlaug, 2003; Schlaug,


Jancke, Huang, & Steinmetz, 1995; Schmithorst & Wilke, 2002; Schneider et al., 2002), the

authors observed increases in size of the right pre-central gyrus, corpus callosum, and Heschl’s

gyrus in the musically trained children compared to the nonmusically trained children. Increased

cerebellar volume is also reported in musicians and has been shown to correlate positively with

lifelong intensity of practice (Hutchinson et al., 2003).

Musicians show additional motor cortex activation compared to nonmusicians during

music perception (Grahn & Brett, 2007). Similarly, fMRI research has found reduced

asymmetrical activity in the motor cortices of pianists, corresponding to the bimanual

coordination required in piano performance (Jancke, Schlaug, & Steinmetz, 1997). The bulk of

neuroimaging studies concerned with the functional differences between musicians and

nonmusicians employ techniques such as electroencephalography (EEG) or

magnetoencephalography (MEG), which forego the spatial accuracy of fMRI in favour of

temporal precision. Evoked event-related potentials (ERPs) are time-locked to a stimulus, unfold over hundreds of milliseconds, and reflect specific physical features of the acoustic environment

(Näätanen & Picton, 1987). One of the first indications of functional reorganization in musicians

was presented by Elbert and colleagues (1995). They found that somatosensory-evoked magnetic

fields were larger for the left-hand fingers of violinists. Pantev and colleagues (1998) found that

musicians show evoked N1 magnetic responses to piano tones that are approximately 25% larger than those of nonmusicians, an enhancement not observed for pure tones, and that this enlargement was correlated with

the age at which music training began. Continued investigation revealed that this effect was most

pronounced for tones from the musicians’ own type of instrument (Pantev, Roberts, Schulz,

Engelien, & Ross, 2001). Similar enhanced electrophysiological responses have been observed

for spatial selectivity of sound sources in conductors compared to pianists and nonmusicians

(Munte, Nager, Beiss, Schroeder, & Altenmuller, 2003).


Studies of violations in musical expectancy yield characteristic ERP responses that are

heightened in musicians. The early right anterior negativity (ERAN) is observed in response to

violations of acoustical grammar or syntax (Koelsch, Jentschke, Sammler, & Mietchen, 2007;

Koelsch, Schroger, & Tervaniemi, 2000; Maess, Koelsch, Gunter, & Friederici, 2001) and is

generally sensitive to the amount of harmonic appropriateness (Koelsch et al., 2001). Typically

this means that a negative spike in amplitude is recorded approximately 500 ms after an

individual is presented with a violation of the regularities of the musical environment they were

raised in. For instance, the dominant-tonic progression is known as the authentic cadence and is

the most common ending of a harmonic progression in Western tonal music. Endings other than

the authentic cadence, considered acoustic ‘odd-balls’, elicit an ERAN response (Koelsch &

Friederici, 2003). Both musicians and nonmusicians display ERAN responses, though the response is

enhanced by musical training (Jentschke & Koelsch, 2009), and it appears to develop by 5 years

of age (Jentschke, Koelsch, Sallat, & Friederici, 2008). MEG analysis suggests that the ERAN is

generated by the inferior frontal cortex (Koelsch & Friederici, 2003). The mismatch negativity

(MMN) is somewhat similar to the ERAN; however, it can be recorded after any change occurs

in a repeated auditory stimulus, even in the absence of attention (Näätanen & Alho, 1995;

Näätanen & Picton, 1987; Picton, Alain, Leun, Ritter, & Achim, 2000), and can therefore be

thought of as more sensitive to the immediate acoustic context. The MMN has been localized

mainly to the supratemporal plane (Alho, 1995; Levanen, Ahonen, Jari, McEvoy, & Sams,

1996). Relative to nonmusicians, musicians display a more pronounced MMN response to

deviations in harmonic chord progressions (Koelsch et al., 1999), pitch sequences (Brattico,

Winkler, Naatanen, Paavilainen, & Tervaniemi, 2002), and complex temporal patterns (Lopez et

al., 2003; Tervaniemi, Ilvonen, Karma, Alho, & Naatanen, 1997). Specifically, professional

musicians show MMN for timing deviations as small as 20 ms, whereas nonmusicians require a


deviation of at least 50 ms in order to exhibit an MMN response (Russeler, Altenmuller, Nager,

Kohlmetz, & Munte, 2001). Furthermore, when compared to woodwind players and

nonmusicians, drummers show a heighted MMN response to manipulated drum beat sequences

(Munte et al., 2003).

The data analyzed in the present study were previously collected and analyzed for

MMNm (magnetic mismatch negativity, as the data were acquired with MEG) responses in

musicians and nonmusicians (Fujioka, Trainor, Ross, Kakigi, & Pantev, 2004). Fujioka and

colleagues observed that musicians had consistent MMNm responses to deviant tones in two

different conditions, in which melodies were learned over the course of the experiment.

Nonmusicians showed MMNm only in one of the two conditions and the amplitude of their

response was significantly attenuated compared to musicians. These results paralleled those of a

behavioural measure. Musicians were at ceiling on a task that required them to detect deviant tones (96.50% and 95.83%, respectively, for the two conditions), and were significantly more

accurate than nonmusicians (63.0% and 86.17%). Importantly, both musicians and nonmusicians

exhibited a significant MMNm response to deviance from a repeated control pure tone,

indicating that musicians’ superior performance relied on consideration of the entire melody and

not on an acoustical advantage.

Despite the collection of evidence that music perception involves distributed brain

regions, direct examinations of the network interactions between regions are limited. Two lines

of evidence stand out: increased corpus callosum size in musicians, and enhanced long-range

oscillatory synchronization. Interhemispheric interaction is believed to be crucial for the

integration necessary for conscious perception (Tononi, 2010; Tononi & Edelman, 1998).

Increased size of the anterior midline corpus callosum is observed consistently in musicians (Lee

et al., 2003; Schlaug, Jancke, Huang, Staiger, & Steinmetz, 1995; Schmithorst & Wilke, 2002).


These corpus callosum differences emerged in children after approximately 29 months of

practicing music, and total weekly music exposure predicted degree of change as well as

improvement (Schlaug et al., 2009). These results loosely support the suggestion that musicians

have the enhanced network integration to support their abilities.

Synchronization of neural oscillations at a specific frequency between different neuronal

populations is frequently proposed as a mechanism for information transfer in the brain

(Bressler, Coppola, & Nakamura, 1993; Buzaski, 2006; Varela et al., 2001). Transient periods of

oscillation synchronization in the gamma frequency band (25-80 Hz), measured as phase-locking

between different neural recording sites, have been theorized to integrate distributed neuronal

sets together into a coherent ensemble that underlies a cognitive act (Canolty et al., 2007; Hipp,

Engel, & Siegel, 2011; Tallon-Baudry & Bertrand, 1999). Rodriguez and colleagues (1999)

observed that perception of faces, but not of meaningless shapes, induced long-distance patterns of

gamma synchronization in EEG scalp recording corresponding to the moment of perception and

an ensuing motor response. In music perception, long-range oscillatory activity is proposed to

bind segregated musical features and to match acoustic information to learned templates in

memory for music (Fujioka et al., 2009). Gamma and beta band synchronization between the

auditory and motor cortices is modulated by beat processing (Fujioka et al., 2009; Snyder &

Large, 2005), and induced gamma activity is enhanced when participants listen to meaningful

stimuli compared with pure tones during a discrimination task (Crone, Boatman, Gordon, & Hao,

2001). Compared to nonmusicians, musicians show increased phase synchrony over distributed

cortical areas, predominately in the gamma band (Bhattacharya & Petsche, 2005; Snyder &

Large, 2005; Sokolov, Pavlova, Lutzenberger, & Birbaumer, 2004). This heightened effect

appears to be music specific because it is reduced to nonmusician levels when musicians listen to

text (Bhattacharya & Petsche, 2005). In an effort to delineate training-specific effects, Trainor


and colleagues (2009) showed that induced oscillatory gamma-band activity is enhanced in musicians relative to nonmusicians, specifically for their instrument of practice, and that it develops in children

after one year of musical training, but not in children without training. Taken together, these

results suggest that music training leads to increased integration of information in neural

networks.

The presented evidence converges with the understanding that the dynamics that emerge

from neural networks are crucial for the manifestation of complex behaviours and cognitions

(McIntosh, 2000). Previous investigations have uncovered widespread brain activity associated

with music perception, yet discussion of interactions among regions remains minimal. The

temporal coordination of activity flow through various music networks can be expected to

influence perception of a complex piece of music, and music training may result in distinct

patterns of functional dynamics. It has been suggested that a brain with higher signal complexity

has a more varied set of functional connections and the ability to perform more operations

(Ghosh et al., 2008; McIntosh et al., 2008; McIntosh et al., 2010; Tononi et al., 1994). The goal

of the present study was to investigate whether increased diversity of neural activity underlies

musicians’ superior perception of musical melodies. Specifically, we hypothesized that

compared to nonmusicians, musicians should exhibit higher brain signal complexity when

listening to music.

Materials and Methods

Participants

Participants were 12 musicians (8 female, ages 19-33 years) and 12 nonmusicians (9

female, ages 19-40 years). The musicians had studied and played two or more instruments


regularly for at least 10 years (M = 14.3 years, range: 10 to 23) with formal education including

musical schools or private lessons. The nonmusicians had almost no formal music training (3 of

12 had 2 years of lessons but quit playing more than 10 years ago) except for what they learned

in school. None of the individuals in either group had absolute pitch. All participants were right

handed (assessed by the Edinburgh handedness test) with normal hearing (range of 250 to 8000

Hz as tested by clinical audiometry), and they were screened for any history of medical,

neurological, psychiatric, and substance-abuse problems. After being informed about the nature

of the study, they provided written consent to participate and the experiment was approved by

The Ethics Commission of the Baycrest Centre for Geriatric Care in accordance with the

Declaration of Helsinki.

Stimuli

In both conditions, stimuli comprised CD-quality standard and deviant melodies, both

with five tones in a digitally recorded piano timbre. Standard melodies were presented 80% of

the time and the duration of each tone was 300 ms.

Contour.

In the contour condition, eight different five-tone ascending melodies were used (see

Figure 1). Each comprised different tones and intervals from the C-major diatonic scale, and

each started on one of five different tones between C5 and G5 (American notation). In the

corresponding deviant melodies, the contour was altered such that the fifth tone was lower in

pitch than the fourth tone. The interval size deviations in the contour melodies ranged from a

minor second to a major third (2 to 4 semitones, respectively). The mean value was a major

second (2 semitones), and this was the same value of deviation used in the interval condition.

Interval.


Standard stimuli in the interval condition consisted of a single five-tone major-key

melody with an up-up-down-up contour (do-re-fa-mi-sol), which was transposed to eight keys

with starting notes in the same range as those of the contour stimuli. For the deviant melodies,

the final note was raised by one whole tone (2 semitones, from sol to la). The altered tone did not

change the contour and it remained within the key of the melody.

Procedure

Audio Presentation.

Hearing thresholds for each participant were determined for the left and right ears for the

B5 piano sound. Stimuli were presented at 60 dB above threshold. Participants were presented

with three successive blocks of 300 trials of each condition, and condition presentation order was

counterbalanced. The order of melody variations was pseudo-random. Each melody was

separated by a 900-ms silent interval.
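As a rough illustration of this presentation schedule, the sketch below builds a pseudo-random trial list for one block from the parameters stated above (300 trials, roughly 80% standards, five 300-ms tones per melody, 900-ms silence). The names are hypothetical; this is not the original presentation script.

```python
import random

# Hypothetical sketch of one block's trial schedule (not the original script).
TONE_MS = 300           # duration of each tone
TONES_PER_MELODY = 5    # five-tone melodies
ISI_MS = 900            # silent interval between melodies
TRIALS_PER_BLOCK = 300  # trials per block

def make_block(p_deviant=0.2, seed=0):
    """Return a pseudo-random sequence of 'standard'/'deviant' trial labels."""
    rng = random.Random(seed)
    n_deviant = round(TRIALS_PER_BLOCK * p_deviant)
    trials = ["deviant"] * n_deviant + ["standard"] * (TRIALS_PER_BLOCK - n_deviant)
    rng.shuffle(trials)
    return trials

trial_ms = TONE_MS * TONES_PER_MELODY + ISI_MS   # 2400 ms per trial
block = make_block()
print(block[:8], f"~{TRIALS_PER_BLOCK * trial_ms / 60000:.0f} min per block")
```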

MEG Overview.

MEG is a passive, non-invasive neurophysiological technique that measures the magnetic

fields generated by neuronal activity of the brain. Different areas of the brain communicate with

electrochemical impulses that generate electric and magnetic fields. MEG detects patterns of

brain activity as a number of sources of extremely minuscule electromagnetic fields, such that

MEG is a ‘direct’ measure of brain activity, compared to fMRI and PET, which measure brain

metabolism. In the MEG equipment, magnetic field sensors rest in a helmet placed on the

individual’s head. Fiducials are placed on the nasion and pre-auricular points for head

localization.

MEG Acquisition.


Magnetic field responses were recorded with a 151-channel whole-cortex magnetometer

system (OMEGA, CTF Systems Inc, Port Coquitlam, Canada) at the Rotman Research Institute

at Baycrest Centre for Geriatric Care. Data were collected at a sampling rate of 312.5 samples/s.

Pre-processing included low-pass filtering at 55 Hz, and eye-blink and movement artifact

rejection at a threshold of 2 pT. The duration of each recording epoch was 2.099 s, including a 0.4-s pre-

stimulus period. Onset of the first tone of each stimulus synchronized the stimulus presentation

and the data acquisition.

The recordings were performed while participants sat upright in an adjustable chair in a

magnetically shielded room. Prior to MEG acquisition, each participant was fitted with three

fiducial localization coils in order to localize the position of the individual’s head relative to the

MEG sensors. Participants watched a soundless movie of their choice and were instructed not to

pay attention to the sound stimuli. The movie was projected onto a screen placed in front of the

MEG chair. Compliance was verified by video monitoring. No explanation about the stimuli was provided.
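For readers who want to reproduce a comparable preprocessing pipeline, the sketch below uses the open-source MNE-Python package rather than the software used in the original study; the dataset path and stimulus channel name are placeholders, and the parameter values follow the description above.

```python
import mne

# Comparable preprocessing with MNE-Python (a sketch, not the original pipeline).
# "subject01_meg.ds" and "UPPT001" are placeholder names.
raw = mne.io.read_raw_ctf("subject01_meg.ds", preload=True)
raw.filter(l_freq=None, h_freq=55.0)              # low-pass filter at 55 Hz

events = mne.find_events(raw, stim_channel="UPPT001")
epochs = mne.Epochs(
    raw, events,
    tmin=-0.4, tmax=1.699,                        # 2.099-s epoch incl. 0.4-s pre-stimulus
    baseline=(None, 0),
    reject=dict(mag=2e-12),                       # drop epochs exceeding 2 pT
    preload=True,
)
data = epochs.get_data(picks="mag")               # trials x channels x samples
```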

Data Analysis

Multiscale Entropy.

Because interactions due to both local dense interconnectivity and sparse long-range

projections give rise to the outputs of neuronal networks (Kelso, 1995; Tononi et al., 1994), the

resulting dynamics could be expected to operate at multiple time scales. MSE calculates sample

entropy of a signal at different time scales (Costa, Goldberger, & Peng, 2002, 2005) and was

used to measure the complexity of the brain signal and music sequences. Sample entropy is the

negative natural logarithm of the conditional probability that two sequences of m consecutive data

points that are similar to each other (within a given tolerance r) will remain similar at the next

point (m+1) in the data set (N), where N is the length of the time series (Richman & Moorman,


2000). MSE was calculated using the algorithm available at www.physionet.org/physiotools/mse

with the pattern length m set to 5 and the similarity criterion r set to 1

(Vakorin et al., 2010). The value r is defined as a proportion of the standard deviation of the

original data (Costa, Goldberger, & Peng, 2004).
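In symbols, this standard definition can be written (following Richman & Moorman, 2000; the notation here is ours, not the thesis's) as

```latex
\mathrm{SampEn}(m, r, N) \;=\; -\ln\!\left(\frac{A}{B}\right),
```

where B is the number of pairs of length-m template vectors in the series whose maximum absolute difference is below the tolerance r, and A is the number of those pairs that remain within r when the templates are extended to length m + 1.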

MSE calculation involves two procedures: (1) coarse-graining of the time series, and (2)

calculating sample entropy for each coarse-grained time series. For scale t, the time series is

constructed by averaging the data points with non-overlapping windows of length t. The number

of scales is determined by the reliability of the entropy estimation of the series and is a function

of the number of data points in the signal. A signal with fewer data points has fewer time scales.

MSE requires a time series of approximately 500 data points to get a stable measure of entropy

by scale, and the MEG epochs of 2.099 s with a sampling rate of 312.5 Hz yielded 656 samples

and fulfilled this requirement. On an MSE curve plot (see Figure 2c), scale 1 is the sample

entropy for the non-averaged, original signal, scale 2 is the entropy from the signal average of 2

adjacent points, scale 3 is the average of 3 adjacent points, and so on. Unlike traditional single-scale entropy measures, MSE distinguishes signals with low complexity, such as random noise or completely deterministic signals, which exhibit a steep decline in the MSE curve with increasing scale, from signals with greater temporal interdependencies, such as 1/f noise or cardiovascular inter-beat intervals, which display a more gradual shift in the curve (Costa et al., 2002, 2004, 2005; Costa & Healey, 2003;

Lippé et al., 2009; Nikulin & Brismar, 2004). This is because 1/f signals contain information

about dependencies within and between timescales (Costa et al., 2005). For each subject in the present analysis, a channel-specific MSE estimate was obtained as the mean across single-trial

entropy measures for timescales 1-10. MSE comparisons are done on the shape of the MSE

curve (when entropy is plotted by the coarse-grain level) or as an area-under-the-curve.
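The study used the C implementation distributed at physionet.org; the simplified Python sketch below is our own reconstruction, not the original code. It illustrates the two procedures, with the tolerance fixed as a proportion of the standard deviation of the original series, as described above.

```python
import numpy as np

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (Costa et al., 2002)."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m, tol):
    """Sample entropy of a 1-D series for pattern length m and tolerance tol."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    def similar_pairs(length):
        # All overlapping templates of the given length (same count for m and m + 1).
        t = np.array([x[i:i + length] for i in range(n - m)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return (np.count_nonzero(d < tol) - len(t)) / 2.0   # exclude self-matches
    b = similar_pairs(m)       # matches of length m
    a = similar_pairs(m + 1)   # matches that persist at length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.nan

def multiscale_entropy(x, scales=range(1, 11), m=5, r=1.0):
    """MSE curve: sample entropy of each coarse-grained series, scales 1-10."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()          # r defined relative to the SD of the original data
    return np.array([sample_entropy(coarse_grain(x, s), m, tol) for s in scales])
```

For one MEG channel of one trial, `multiscale_entropy(epoch_channel)` returns a 10-element curve; in the analysis described above, these single-trial curves were averaged across trials for each channel.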


It is worth noting that a strong correlation between spectral power distribution and MSE

values has been observed (McIntosh, Kovacevic & Itier, 2008). Specifically, simulated data

demonstrated that alterations to Fourier coefficients changed the resulting MSE curve. It was

also demonstrated, however, that although random phase jitter of the time series significantly

impacted MSE, it had no effect on spectral power. Thus, it was concluded from these simulations

that MSE is sensitive to non-linearities that are not captured by spectral power. Other results

confirm that MSE delineates brain patterns not observable from spectral power (Heisz et al.,

under review; Misic et al., 2010).
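A simple way to see this distinction is sketched below, under our own assumptions rather than by reproducing the cited simulations: randomizing the Fourier phases of a signal leaves its power spectrum untouched while scrambling the temporal dependencies that MSE is sensitive to.

```python
import numpy as np

def phase_randomize(x, seed=0):
    """Surrogate with the same power spectrum but randomized Fourier phases."""
    rng = np.random.default_rng(seed)
    spec = np.fft.rfft(np.asarray(x, dtype=float))
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spec.shape)
    phases[0] = 0.0            # keep the DC component real
    if len(x) % 2 == 0:
        phases[-1] = 0.0       # keep the Nyquist component real for even lengths
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))

x = np.cumsum(np.random.default_rng(1).standard_normal(656))   # random-walk test series
surrogate = phase_randomize(x)
# Identical amplitude spectra, but the MSE curves (see multiscale_entropy above)
# will generally differ because the temporal structure has been destroyed.
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(surrogate))))
```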

Partial Least Squares Analysis.

Partial Least Squares (PLS) analysis is a multivariate statistical technique that has been

used in neuroimaging to extract commonalities between brain activity and experimental design

(McIntosh, Bookstein, Haxby, & Grady, 1996; McIntosh & Lobaugh, 2004). Although this

approach is similar to canonical correlation, it maximizes the covariance between two data sets

instead of the correlation. PLS has been validated for analysis of data from PET (McIntosh et al.,

1996), fMRI (Martinez-Montes, Valdes-Sosa, Miwakeichi, Goldman, & Cohen, 2004; McIntosh

& Lobaugh, 2004), ERP (Lobaugh, West, & McIntosh, 2001), and MEG (Misic et al., 2010).

Task PLS (McIntosh & Lobaugh, 2004) specifically analyzes the association between brain

activity (data set X) and experimental design (data set Y).

In the first step, MSE values were arranged into matrix X, with subject observations (ordered by group and condition) in rows, and mean MSE values for each MEG channel at each time scale in columns. To assess the relationship of group and condition with MSE, matrix Y contains dummy codes for the experimental groups and conditions. In the second stage of the analysis, the average of X for each group and condition was computed and stored as matrix M. Each column of M

was then mean-centered by subtracting the mean of the column from each value of that column.


This mean-centered matrix was subjected to singular value decomposition (SVD) to compute an

optimal least-squares fit to the covariance between the original X and Y variables (i.e. MSE

across all channels and group/condition). The decomposition yields a set of orthogonal latent

variables (LVs), each comprising three components: (1) U, weights for the rows, indicating a contrast

that characterizes the differences between groups and/or tasks; (2) V, weights for the columns,

indicating the linear combination of channels maximally related to the contrast; and (3) the

singular value, which is the covariance between the contrast and the MSE weights. In this

analysis, each LV represented one contrast between experimental groups and/or conditions in

relation to a particular pattern of channels and temporal scales. SVD is similar to principal components analysis (PCA), whereby an LV accounts for a proportion of the total variance in the data matrix; an LV corresponds roughly to a factor, and its weights to factor loadings, in PCA. Also, it has been noted

that task PLS is somewhat akin to discriminant analysis (Abdi & Williams, 2010).
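A minimal numerical sketch of these two steps is given below. It is our own illustration, following the description above rather than the PLS toolbox actually used for the analysis.

```python
import numpy as np

def task_pls(X, condition_labels):
    """
    Minimal mean-centered task-PLS sketch (after McIntosh & Lobaugh, 2004).
    X: observations-by-features matrix (here, MSE values for each channel x scale),
       with rows grouped by experimental group/condition.
    condition_labels: one group/condition label per row of X.
    """
    labels = np.asarray(condition_labels)
    conditions = np.unique(labels)
    # Average the rows within each group/condition (matrix M).
    M = np.vstack([X[labels == c].mean(axis=0) for c in conditions])
    # Mean-centre each column of M, then decompose.
    M_centered = M - M.mean(axis=0, keepdims=True)
    U, s, Vt = np.linalg.svd(M_centered, full_matrices=False)
    # U: contrasts over conditions; Vt.T: channel/scale saliences; s: singular values.
    return conditions, U, s, Vt.T
```

In this sketch, each column of U gives a contrast across the group/condition means, the corresponding column of the returned saliences gives the channel-by-scale pattern expressing that contrast, and the singular value gives the covariance they account for.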

The statistical significance of each LV as a whole was determined using permutation tests

(McIntosh et al., 1996; McIntosh & Lobaugh, 2004). The purpose of conducting a permutation

test is to evaluate whether an LV is significantly different from random signal. Permutation tests

consist of randomly reordering the rows (i.e. subject observations) of matrix X, while leaving

matrix Y unchanged. PLS is recomputed on the permuted matrix to obtain a new matrix of

singular values. After repetition of this procedure (500 permutations), the set of all singular values

provides a sampling distribution from which the null hypothesis can be tested. The number of

permutations performed is proportional to the desired precision of the alpha critical value; thus, 500 permutations allow for precision to the third decimal place. P-values are determined by

calculating the proportion of permuted singular values that are equal to or exceed the original

singular value.
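Continuing the sketch above (again an illustration under our own assumptions, reusing the `task_pls` helper), the permutation p-values could be estimated as follows:

```python
import numpy as np

def permutation_pvalues(X, labels, n_perm=500, seed=0):
    """
    Permutation test for the singular values of the task-PLS solution (a sketch;
    500 permutations, as in the analysis described above). Rows of X (subject
    observations) are reshuffled while the design (labels) is left unchanged,
    and PLS is recomputed on each permuted matrix.
    """
    rng = np.random.default_rng(seed)
    _, _, s_obs, _ = task_pls(X, labels)
    exceed = np.zeros_like(s_obs)
    for _ in range(n_perm):
        perm_X = X[rng.permutation(len(X))]        # reorder rows of X only
        _, _, s_perm, _ = task_pls(perm_X, labels)
        exceed += (s_perm >= s_obs)
    return exceed / n_perm                          # p-value per latent variable
```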


The stability of each statistical effect is assessed through bootstrap estimation of standard

error confidence intervals of the weights in U and V. Bootstrap samples are created by sampling

with replacement the observations in X and Y (Efron & Tibshirani, 1986). The standard error

was subsequently estimated from 500 bootstrap samples, and the singular vector weights for each

MSE coefficient were divided by this standard error to yield a bootstrap ratio. The bootstrap ratio

is similar to a z-score if the distribution of singular vector weights is Gaussian (McIntosh &

Lobaugh, 2004). Peak channels with a weight/standard error ratio > 3.5 (99% confidence

interval) were considered to be reliable (Sampson, Streissguth, Barr, & Bookstein, 1989).
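A corresponding sketch of the bootstrap ratios is shown below (same caveats as above; a full implementation would also align the sign and order of the singular vectors across resamples, which is omitted here for brevity).

```python
import numpy as np

def bootstrap_ratios(X, labels, n_boot=500, seed=0):
    """
    Bootstrap estimation of the stability of the channel/scale saliences
    (a sketch reusing the task_pls helper above). Observations are resampled
    with replacement within each group/condition, the saliences are
    re-estimated, and each original salience is divided by its bootstrap
    standard error. Ratios > 3.5 were treated as reliable in the analysis.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    _, _, _, V_obs = task_pls(X, labels)
    boot_V = np.empty((n_boot,) + V_obs.shape)
    for b in range(n_boot):
        idx = np.concatenate([rng.choice(np.where(labels == c)[0],
                                         size=np.sum(labels == c), replace=True)
                              for c in np.unique(labels)])
        _, _, _, V_b = task_pls(X[idx], labels[idx])
        boot_V[b] = V_b
    se = boot_V.std(axis=0)
    return V_obs / se                               # bootstrap ratio (like a z-score)
```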

Results

In both conditions (contour, interval), standard melodies elicited greater sample entropy

in musicians compared to nonmusicians (p < .001, see Figures 2 and 3). In the contour condition,

both groups showed a consistent pattern for both standard (Figure 2) and deviant (Figure 4)

melodies. Musicians had higher MSE than nonmusicians for the first block, followed by a rapid

decrease in the second block, and a return to an intermediate level in the third block.

Nonmusicians displayed a gradual decrease in MSE across all three blocks (p = .001; the

difference between blocks 2 and 3 was non-significant, p = .906).

A similar general trend of decreasing MSE as a function of trial was observed in the

interval condition. Figure 3 illustrates that standard melodies elicited a rapid decrease in MSE

from block 1 to block 2 in musicians, with no significant change between blocks 2 and 3, and a

more gradual decline in MSE across all three blocks in nonmusicians. Deviant melodies in the

interval condition resulted in stable MSE across all three blocks in musicians (i.e., no significant

differences, see Figure 5). Nonmusicians exhibited greater MSE than musicians during the first

block, p < .001, and their MSE rapidly declined to musician levels for the second and third blocks (the musician


vs. nonmusician difference in blocks 2 and 3 was not significant). Bootstrap ratios (Figures 2-5B)

demonstrate that all effects were reliable across most channels and observed at coarse scales.

Discussion

Musicians versus Nonmusicians

As predicted, musical training is associated with increases in brain signal complexity.

Compared to nonmusicians, musicians displayed higher multiscale entropy during initial melody

presentation. Increased signal complexity is indicative of increased diversity of transient

functional networks (McIntosh et al., 2010; Tononi et al., 1998). This highly variable activity

may be ideally suited to capture changes in elaborate harmonies, rhythms, and dynamics of

pieces of music, and may develop through experience to support a more accurate and wider

range of musical behaviours.

In addition to enhanced music performance, musicians also display enhanced

performance on tests of music perception (Peretz & Babai, 1992; Tervaniemi, Just, Koelsch,

Widmann, & Schroger, 2005). In an earlier study with the same sample as the present study,

musicians outperformed nonmusicians in a deviant tone-detection task (Fujioka et al., 2004).

Unfortunately, the behavioural task was not performed at the time of MEG acquisition, which

precludes examination of correlations between brain activity and performance. However, recent

evidence has demonstrated that optimal neural complexity is positively correlated with

performance on tests of face recognition (Heisz et al., under review; McIntosh et al., 2008; Misic

et al., 2010), as well as with performance on tests of perceptual matching, attentional cueing, and

delayed match-to-sample (Garrett et al., 2010). Considered jointly with simulations

demonstrating that high complexity indicates rapid processing of high levels of information that

can flexibly adapt to changes in external input (Tononi, Sporns & Edelman, 1994), our results


provide support for the theory that increased neural complexity supports augmented music

perception. Further investigation involving simultaneous neuroimaging and behavioural testing is

required before this claim is definitive.

Maturational increases in brain signal complexity have been observed in infants exposed

to an auditory stimulus (Lippé et al., 2009), and this trend continues from childhood to adulthood

(McIntosh, Kovacevic & Itier, 2008). Because both groups in the present study were similar in

age, our results suggest that biological maturation cannot explain the increased neural complexity seen among musicians. This account of experience-dependent changes in transient neural activity

is further supported by a previous report of a positive relationship between MSE and stimulus

familiarity (Heisz et al., under review), in which famous faces elicited greater EEG complexity

than novel faces, an effect that was a function of personal familiarity measured with familiarity

ratings and naming. Heisz and colleagues also found that MSE correlated with acquired

familiarity in a multi-day training paradigm. Because the effect was distributed across brain

regions, the authors concluded that the increase in signal complexity indexed high integration of

specialized, segregated functional regions. Similarly, we propose that the heightened signal

complexity recorded in musicians during melody perception developed as a result of training in

order to facilitate efficient processing of elaborate music. Further investigation of the

relationship between the extent of music experience and brain signal complexity is required to

substantiate this claim.

Unconscious, automatic processing

A general decrease of entropy across blocks was observed in both groups but this effect

was more pronounced in musicians. The present experiment was originally designed to

investigate the association between extensive music training and automatic melodic processing

(Fujioka et al., 2004). Recall that participants watched a silent movie during melody presentation


and they were instructed to ignore the sound stimuli. Reduced complexity was also reported

in a similar experimental paradigm that drew attention away from the stimulus (Vakorin et al.,

2010), and during repetition of a familiar stimulus (Heisz et al., under review). Initial

presentation of a novel stimulus attracts attention (Daffner, Mesulam, et al., 2000; Daffner,

Scinto, et al., 2000; Escera, Alho, Winkler, & Näätanen, 1998; Tiitinen et al., 1993), and

transient neural network activity resolves the input information (Honey et al., 2007; McIntosh,

2000). Auditory sensory processing involves integration of information in the input with stored

representations of preceding auditory events (Alain, Woods, & Knight, 1998). Thus, with further

repetition, listeners habituate to innocuous stimuli and minimal sensory processing descends to

an unconscious level (Fantz, 1964; Trainor, McDonald, & Alain, 2002). Decreases in signal

complexity are associated with diminished conscious awareness (Protzner et al., 2010; Shen,

Olbrich, Achermann, & Meier, 2003); therefore, while we did not assess levels of stimulus

awareness, we believe that the observed decreases in MSE may reflect a transition from

conscious processing of informative and novel information to automatic processing of repetitive

information.

Automatic processing is expected to coincide with decreased functional integration between regions of the network and with decreased neural signal complexity (Tononi, 2010). The reductions in sample entropy in our analysis were observed reliably across coarse time scales, which are indicative of long-range temporal correlations. The lack of source analysis in this investigation makes spatial conclusions difficult. Recent analyses of EEG signals demonstrated, however, that local information is typically represented at finer timescales, whereas conduction of distributed information is expressed at coarser timescales (Vakorin, Lippé, & McIntosh, 2011). Furthermore, maturational increases in coarse-scale entropy (McIntosh et al., 2008; Vakorin et al., 2011) parallel developmental increases in integrated long-range connectivity relative to local activity (Fair et al., 2009). Segregated, localized activity is suggested to be associated with automatic or unconscious processes, whereas integration across distributed neuronal groups is necessary for conscious, unified perception (Tononi & Edelman, 1998).
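
For readers unfamiliar with the measure, the following is a minimal Python sketch of the multiscale entropy procedure discussed here, namely coarse-graining followed by sample entropy (Costa et al., 2002; Richman & Moorman, 2000). It assumes a single one-dimensional sensor time series (the variable name sensor_ts is illustrative) and conventional parameter choices (m = 2, r = 0.2 × SD); it is a sketch of the general technique, not the exact analysis pipeline used in this thesis.

    import numpy as np

    def coarse_grain(x, scale):
        # Average consecutive, non-overlapping windows of length `scale` (Costa et al., 2002).
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def sample_entropy(x, m=2, r=None):
        # Sample entropy (Richman & Moorman, 2000): -ln(A/B), where B counts pairs of
        # matching templates of length m and A counts matches of length m + 1.
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * x.std()
        n = len(x) - m                      # number of templates used for both lengths
        def count(mm):
            templates = np.array([x[i:i + mm] for i in range(n)])
            c = 0
            for i in range(n - 1):          # i < j pairs only; self-matches excluded
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                c += np.sum(dist <= r)
            return c
        B, A = count(m), count(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    def multiscale_entropy(x, scales=range(1, 21), m=2):
        # MSE curve: sample entropy of the coarse-grained signal at each time scale,
        # with the tolerance r fixed relative to the SD of the original series.
        x = np.asarray(x, dtype=float)
        r = 0.2 * x.std()
        return np.array([sample_entropy(coarse_grain(x, s), m, r) for s in scales])

    # Illustrative use (sensor_ts is a hypothetical 1-D MEG sensor time series):
    # mse_curve = multiscale_entropy(sensor_ts)

In this scheme, coarse scales correspond to progressively averaged, down-sampled versions of the signal, which is why entropy differences at coarse scales are read as reflecting longer-range temporal dependencies.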

The importance of a balance between segregation and integration can be seen in comparisons of different conscious and unconscious states. Massive brain synchrony and hyper-integration result in seizures, diminished consciousness, and decreased complexity (Protzner et al., 2010). Conversely, slow-wave sleep (Massimini et al., 2007; Massimini et al., 2005) and anaesthetic states (Ferrarelli et al., 2010) are associated with decreased connectivity between brain regions and decreased complexity. During REM sleep, when conscious-like dreams occur (Hobson, 2006; Stickgold, Hobson, Fosse, & Fosse, 2001), wakefulness-like EEG patterns are associated with increased effective connectivity, integration (Ferrarelli et al., 2010), and signal complexity (Shen et al., 2003). Simulations have shown that long-range integration appears when a novel task is introduced and decreases as the task becomes routinized (Dehaene, Kerszberg, & Changeux, 1998). Additionally, specialized, local processing corresponds to lower complexity than collaboration between distributed regions (Tononi et al., 1994). Animal recordings likewise show that activity evoked by a habituated stimulus is restricted to local sensory pathways (Horel et al., 1967), and fMRI has shown that evoked activity is restricted to primary and secondary auditory cortex (BA 41 and 42) following auditory tone habituation (Celsis et al., 1999).

Melodic Contour versus Intervals

Compared with nonmusicians, musicians showed minimal MSE to deviant tones in the interval condition during block 1 (Figure 5). We suggest that this effect arose because the intervallic change carried little information relative to the changes in the contour condition, and consequently could be processed more easily and rapidly by trained musicians. This explanation is in line with previous results showing that brain signal entropy varies with stimulus information content (Heisz et al., under review; Misic et al., 2010). Firstly, the deviant melodies were dispersed infrequently (20% of trials) among the standard melodies; consequences of habituation on neural signal complexity were therefore expected to contribute to a reduction in recorded entropy. Secondly, neural network activity accompanying deviant melodies in the interval condition may have been lower than in the contour condition because these melodies contain less inherent information than the contour melodies, simply by virtue of their design. In the contour condition, all eight melodies were composed of different intervals, and the deviant tones changed the contour as well as the final interval of the melody. For example, in contour melody 2 (Figure 1), the fifth (last) tone is one semitone higher than the fourth tone, whereas the deviant tone is four semitones lower. By contrast, melodies in the interval condition are transpositions of the same sequence, and all deviant tones increased the size of the final interval by one whole tone. Consequently, the melodies in the interval condition are much more statistically stable and carry less novel information than the melodies in the contour condition (Shannon & Weaver, 1949). In a probabilistic sense, the contour melodies involved a wider range of possible outcomes and hence a higher level of uncertainty than the interval melodies.
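
To make the information-theoretic point concrete, the toy calculation below applies the Shannon entropy formula, H = -Σ p log2 p (Shannon & Weaver, 1949), to hypothetical distributions over final intervals. The counts are illustrative only and are not taken from the actual stimulus set; they simply contrast a condition with many equally likely endings against one with a single, fully predictable ending.

    import numpy as np

    def shannon_entropy(counts):
        # Shannon entropy H = -sum(p * log2(p)) of an outcome distribution, in bits.
        p = np.asarray(counts, dtype=float)
        p = p / p.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Hypothetical final-interval counts, chosen only to illustrate the contrast:
    contour_counts = [1, 1, 1, 1, 1, 1, 1, 1]   # eight distinct final intervals
    interval_counts = [8]                       # one final interval, always the same

    print(shannon_entropy(contour_counts))      # 3.0 bits: maximal uncertainty
    print(shannon_entropy(interval_counts))     # 0 bits: fully predictable

On this toy account, a listener who has internalized the statistics of the interval melodies faces essentially no uncertainty about how a standard melody will end, whereas the contour melodies leave several outcomes open.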

It has been suggested that the brain operates in a Bayesian probabilistic manner, generating predictions about the network activity configuration that would be optimal for a given input (McIntosh et al., 2010). Indeed, there is ample evidence of neural mechanisms that generate predictions about regular sequences and that respond even when an expected stimulus is omitted (Rankin, Large, & Fink, 2009; Ritter, Sussman, Deacon, Cowan, & Vaughan, 1999; Russeler et al., 2001; Snyder & Large, 2005; Zanto, Large, Fuchs, & Kelso, 2005). Specific to music, Large and colleagues (Zanto, Snyder, & Large, 2006) observed induced gamma-band activity that reflected temporally precise rhythm predictions, which coincided with participants' anticipation-based synchronized responses to changing rhythms (Large & Jones, 1999; Rankin et al., 2009). In a continually changing environment, activity that flows through variable network configurations enables adaptability of response (Manoel & Connolly, 1995). Conversely, when the temporal regularity of preceding events reduces the number of possible outcomes, as is the case for the interval melodies in the present study, network activity stabilizes because highly complex and variable network activity is no longer necessary, in a probabilistic sense, to ensure a correct response. Although this explanation is plausible, further exploration of the effect of stimulus information on brain signal complexity is required before a definite conclusion can be reached.
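
The stabilizing effect of regularity can be illustrated with a toy Bayesian update in the spirit of the predictive account above; the hypotheses and probability values below are purely hypothetical and are not derived from the present data or from McIntosh et al. (2010).

    import numpy as np

    def bayes_update(prior, likelihood, observation):
        # One step of Bayesian updating: posterior is proportional to likelihood x prior.
        posterior = likelihood[:, observation] * prior
        return posterior / posterior.sum()

    # Two toy hypotheses about a melody's final tone: column 0 = standard ending,
    # column 1 = deviant ending. P(observation | hypothesis) values are illustrative.
    likelihood = np.array([[0.9, 0.1],   # "the sequence is regular" strongly predicts the standard
                           [0.5, 0.5]])  # "anything can happen" is uninformative
    belief = np.array([0.5, 0.5])        # start with no preference between hypotheses

    for _ in range(4):                   # four repetitions of the standard ending
        belief = bayes_update(belief, likelihood, observation=0)
    print(belief)                        # approximately [0.91, 0.09]: uncertainty has collapsed

As the posterior sharpens, fewer candidate outcomes need to be entertained; this is one way to read the observed entropy decrease for the statistically regular interval melodies.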

Entropy and ERP

Fujioka and colleagues (2004) observed larger-amplitude MMNm in musicians for interval compared to contour deviants, and only for interval deviants in nonmusicians. Further analyses of the evolution of this ERP over the course of the experiment were not conducted in the present study, making it difficult to delineate the specific relationship between signal entropy and event-related potentials; indeed, signal-to-noise limitations make tracking the evolution of an ERP across blocks problematic. MSE is computed on neural activity containing both induced and evoked signal components. Importantly, previous analyses observed no change in task-associated MSE values after subtraction of the average evoked response (Misic et al., 2010). Because MMNm evoked responses were more pronounced for the more statistically regular interval melodies, it seems that ERP responses were more sensitive to the impact of a deviant on the overall regularity of the sequence; in other words, the 'deviance' value assigned to a particular tone is taken relative to the variability of the standard sequences. By contrast, our measure of neural complexity appears to be sensitive to the amount of consciously processed information.
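
The control analysis referred to above, recomputing entropy after removing the average evoked response, can be sketched as follows. This is a minimal illustration assuming a (trials × timepoints) array for one sensor; the names sensor_trials and multiscale_entropy refer to a hypothetical data array and to the earlier sketch, and the snippet is not the exact procedure of Misic et al. (2010).

    import numpy as np

    def remove_evoked(trials):
        # Subtract the trial-averaged (evoked) response from every trial, leaving the
        # induced/ongoing component; `trials` has shape (n_trials, n_timepoints).
        evoked = trials.mean(axis=0, keepdims=True)
        return trials - evoked

    # Illustrative use: compare MSE before and after removing the evoked component.
    # residual = remove_evoked(sensor_trials)
    # mse_with_evoked    = np.mean([multiscale_entropy(tr) for tr in sensor_trials], axis=0)
    # mse_without_evoked = np.mean([multiscale_entropy(tr) for tr in residual], axis=0)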

Conclusion

When presented with a new melody, musicians display greater brain signal complexity than age-matched nonmusicians. Such increases in the variability of their brain activity suggest that music training increases the neural resources available for music processing. Highly variable brain activity may be better able to capture the subtle intricacies and the large amount of information in a dynamic piece of music, and may thus underlie the improved behavioural and cognitive skills of musicians. The present study represents an important step in testing this hypothesis.


References

Abdi, H., & Williams, L. (2010). Barycentric discriminant analysis. In N. Salkind, D. Dougherty, & B. Frey (Eds.), Encyclopedia of Research Design (pp. 64-75). Thousand Oaks, CA: Sage.

Alain, C., Woods, D., & Knight, R. (1998). A distributed cortical network for auditory sensory memory in humans. Brain Research, 812(1-2), 23-37.

Alho, K. (1995). Cerebral generators of mismatch negativity (MMN) and its magnetic counterpart (MMNm) elicited by sound changes. Ear and Hearing, 16, 38-51.

Anokhin, A., Birbaumer, N., Lutzenberger, W., Nikolaev, A., & Vogel, F. (1996). Age increases brain complexity. Electroencephalography and Clinical Neurophysiology, 99(1), 63-68.

Bendor, D., & Wang, X. (2005). The neuronal representation of pitch in primate auditory cortex. Nature, 436(7054), 1161-1165. doi: 10.1038/nature03867

Bermudez, P., & Zatorre, R. (2005). Differences in gray matter between musicians and nonmusicians. Neurosciences and Music II: From Perception to Performance, 1060, 395-399. doi: 10.1196/annals.1360.057

Bhattacharya, J., & Petsche, H. (2005). Phase synchrony analysis of EEG during music perception reveals changes in functional connectivity due to musical expertise. Signal Processing, 85(11), 2161-2177. doi: 10.1016/j.sigpro.2005.07.007

Biswal, B., Yetkin, F., Haughton, V., & Hyde, J. (1995). Functional connectivity in the motor cortex of resting human brain using echo-planar MRI. Magnetic Resonance in Medicine, 34(4), 537-541.

Brattico, E., Winkler, I., Näätänen, R., Paavilainen, P., & Tervaniemi, M. (2002). Simultaneous storage of two complex temporal sound patterns in auditory sensory memory. Neuroreport, 13(14), 1747-1751.

Bressler, S. (1995). Large-scale cortical networks and cognition. Brain Research Reviews, 20, 288-304.

Bressler, S., Coppola, R., & Nakamura, R. (1993). Episodic multiregional cortical coherence at multiple frequencies during visual task performance. Nature, 366, 153-156.

Bullmore, E., & Sporns, O. (2009). Complex brain networks: graph theoretical analysis of structural and functional systems. Nature Reviews Neuroscience, 10, 186-198.

Buzsáki, G. (2006). Rhythms of the Brain. New York, NY: Oxford University Press.

Buzsáki, G., & Draguhn, A. (2004). Neuronal oscillations in cortical networks. Science, 304, 1926-1929.

Canolty, R., Soltani, M., Dalal, S., Edwards, E., Dronkers, N., Nagarajan, S., . . . Knight, R. (2007). Spatiotemporal dynamics of word processing in the human brain. Frontiers in Human Neuroscience, 1, 185-196.

Celsis, P., Boulanouar, K., Doyon, B., Ranjeva, J., Berry, I., Nespoulous, J., & Chollet, F. (1999). Differential fMRI responses in the left posterior superior temporal gyrus and left supramarginal gyrus to habituation and change detection in syllables and tones. Neuroimage, 9(1), 135-144.

Chen, J., Penhune, V., & Zatorre, R. (2008). Listening to musical rhythms recruits motor regions of the brain. Cerebral Cortex, 18(12), 2844-2854. doi: 10.1093/cercor/bhn042

Cherniak, C. (1994). Component placement optimization in the brain. Journal of Neuroscience, 14(4), 2418-2427.

Costa, M., Goldberger, A., & Peng, C. (2002). Multiscale entropy analysis of complex physiologic time series. Physical Review Letters, 89(6), 068102. doi: 10.1103/PhysRevLett.89.068102

Costa, M., Goldberger, A., & Peng, C. (2004). Comment on "Multiscale entropy analysis of complex physiologic time series" - Reply. Physical Review Letters, 92(8), 089804. doi: 10.1103/PhysRevLett.92.089804

Costa, M., Goldberger, A., & Peng, C. (2005). Multiscale entropy analysis of biological signals. Physical Review E, 71(2), 021906. doi: 10.1103/PhysRevE.71.021906

Costa, M., & Healey, J. (2003). Multiscale entropy analysis of complex heart rate dynamics: Discrimination of age and heart failure effects. Computers in Cardiology, 30, 705-708.

Crone, N., Boatman, D., Gordon, B., & Hao, L. (2001). Induced electrocorticographic gamma activity during auditory perception. Clinical Neurophysiology, 112(4), 565-582.

Daffner, K., Mesulam, M., Scinto, L., Acar, D., Calvo, V., Faust, R., . . . Holcomb, P. (2000). The central role of the prefrontal cortex in directing attention to novel events. Brain, 123, 927-939.

Daffner, K., Scinto, L., Calvo, V., Faust, R., Mesulam, M., West, W., & Holcomb, P. (2000). The influence of stimulus deviance on electrophysiologic and behavioral responses to novel events. Journal of Cognitive Neuroscience, 12(3), 393-406.

De Santis, L., Clarke, S., & Murray, M. (2007). Automatic and intrinsic auditory "what" and "where" processing in humans revealed by electrical neuroimaging. Cerebral Cortex, 17(1), 9-17. doi: 10.1093/cercor/bhj119

De Santis, L., Spierer, L., Clarke, S., & Murray, M. (2007). Getting in touch: Segregated somatosensory what and where pathways in humans revealed by electrical neuroimaging. Neuroimage, 37(3), 890-903. doi: 10.1016/j.neuroimage.2007.05.052

Deco, G., Jirsa, V., & McIntosh, A. (2011). Emerging concepts for the dynamical organization of resting-state activity in the brain. Nature Reviews Neuroscience, 12, 43-56.

Dehaene, S., Kerszberg, M., & Changeux, J. (1998). A neuronal model of a global workspace in effortful cognitive tasks. Proceedings of the National Academy of Sciences of the United States of America, 95(24), 14529-14534.

Dobelle, W., Mladejovsky, M., & Girvin, J. (1974). Artificial vision for the blind: Electrical stimulation of visual cortex offers hope for a functional prosthesis. Science, 183(4123), 440-444.

Efron, B., & Tibshirani, R. (1986). Bootstrap methods for standard errors, confidence intervals, and other measures of statistical accuracy. Statistical Science, 1(1), 54-75.

Elbert, T., Pantev, C., Wienbruch, C., Rockstroh, B., & Taub, E. (1995). Increased cortical representation of the fingers of the left hand in string players. Science, 270(5234), 305-307.

Escera, C., Alho, K., Winkler, I., & Näätänen, R. (1998). Neural mechanisms of involuntary attention to acoustic novelty and change. Journal of Cognitive Neuroscience, 10(5), 590-604.

Fair, D., Cohen, A., Power, J., Dosenbach, N., Church, J., Miezin, F., . . . Petersen, S. (2009). Functional brain networks develop from a "local to distributed" organization. PLoS Computational Biology, 5(5), e1000381. doi: 10.1371/journal.pcbi.1000381

Fantz, R. (1964). Visual experience in infants: Decreased attention to familiar patterns relative to novel ones. Science, 146(364), 668.

Felleman, D., & Van Essen, D. (1991). Distributed hierarchical processing in the primate cerebral cortex. Cerebral Cortex, 1, 1-47.

Ferrarelli, F., Massimini, M., Sarasso, S., Casali, A., Riedner, B., Angelini, G., . . . Pearce, R. (2010). Breakdown in cortical effective connectivity during midazolam-induced loss of consciousness. Proceedings of the National Academy of Sciences of the United States of America, 107(6), 2681-2686. doi: 10.1073/pnas.0913008107

Fox, M., Snyder, A., Vincent, J., Corbetta, M., Van Essen, D., & Raichle, M. (2005). The human brain is intrinsically organized into dynamic, anticorrelated functional networks. Proceedings of the National Academy of Sciences of the United States of America, 102(27), 9673-9678. doi: 10.1073/pnas.0504136102

Friston, K. (1994). Functional and effective connectivity: A synthesis. Human Brain Mapping, 2(1/2), 56-78.

Fujioka, T., Trainor, L., Large, E., & Ross, B. (2009). Beta and gamma rhythms in human auditory cortex during musical beat processing. Neurosciences and Music III: Disorders and Plasticity, 1169, 89-92. doi: 10.1111/j.1749-6632.2009.04779.x

Fujioka, T., Trainor, L., Ross, B., Kakigi, R., & Pantev, C. (2004). Musical training enhances automatic encoding of melodic contour and interval structure. Journal of Cognitive Neuroscience, 16(6), 1010-1021.

Garrett, D., Kovacevic, N., McIntosh, A., & Grady, C. (2010). Blood oxygen level-dependent signal variability is more than just noise. Journal of Neuroscience, 30(14), 4914-4921. doi: 10.1523/jneurosci.5166-09.2010

Garrett, D., Kovacevic, N., McIntosh, A., & Grady, C. (2011). The importance of being variable. Journal of Neuroscience, 31(12), 4496-4503. doi: 10.1523/jneurosci.5641-10.2011

Gaser, C., & Schlaug, G. (2003a). Brain structures differ between musicians and non-musicians. Journal of Neuroscience, 23(27), 9240-9245.

Gaser, C., & Schlaug, G. (2003b). Gray matter differences between musicians and nonmusicians. Neurosciences and Music, 999, 514-517. doi: 10.1196/annals.1284.062

Gazzaley, A., Rissman, J., & D'Esposito, M. (2004). Functional connectivity during working memory maintenance. Cognitive, Affective & Behavioral Neuroscience, 4(4), 580-599.

Ghosh, A., Rho, Y., McIntosh, A., Kotter, R., & Jirsa, V. (2008). Noise during rest enables the exploration of the brain's dynamic repertoire. PLoS Computational Biology, 4(10), e1000196. doi: 10.1371/journal.pcbi.1000196

Goodale, M., Milner, A., Jakobson, L., & Carey, D. (1991). A neurological dissociation between perceiving objects and grasping them. Nature, 349(6305), 154-156.

Grahn, J., & Brett, M. (2007). Rhythm and beat perception in motor areas of the brain. Journal of Cognitive Neuroscience, 19(5), 893-906.

Green, A., Baerensten, K., Stodkilde-Jorgensen, H., Wallentin, M., Roepstorff, A., & Vuust, P. (2008). Music in minor activates limbic structures: a relationship with dissonance? Neuroreport, 19, 711-715.

Greicius, M., Krasnow, B., Reiss, A., & Menon, V. (2003). Functional connectivity in the resting brain: A network analysis of the default mode hypothesis. Proceedings of the National Academy of Sciences of the United States of America, 100(1), 253-258. doi: 10.1073/pnas.0135058100

Griffiths, T. (2003). Functional imaging of pitch analysis. Annals of the New York Academy of Sciences, 999, 40-49.

Hagmann, P., Cammoun, L., Gigandet, X., Meuli, R., Honey, C., Wedeen, V., & Sporns, O. (2008). Mapping the structural core of human cerebral cortex. PLoS Biology, 6(7), e159. doi: 10.1371/journal.pbio.0060159

Halpern, A., Zatorre, R., Bouffard, M., & Johnson, J. (2004). Behavioral and neural correlates of perceived and imagined musical timbre. Neuropsychologia, 42, 1281-1292.

Halsband, U., Tanji, J., & Freund, H.-J. (1993). The role of premotor cortex and the supplementary motor area in the temporal control of movement in man. Brain, 116, 243-246.

Hampson, M., Peterson, B., Skudlarski, P., Gatenby, J., & Gore, J. (2002). Detection of functional connectivity using temporal correlations in MR images. Human Brain Mapping, 15(4), 247-262. doi: 10.1002/hbm.10022

Hannon, E., & Trehub, S. (2005). Metrical categories in infancy and adulthood. Psychological Science, 16(1), 48-55.

Heisz, J., Shedden, J., & McIntosh, A. (under review). Relating brain signal variability to knowledge representation. The Journal of Neuroscience.

Hipp, J., Engel, A., & Siegel, M. (2011). Oscillatory synchronization in large-scale cortical networks predicts perception. Neuron, 69, 387-396.

Hobson, J. A. (2006). Sleep and dreaming. In Encyclopedia of Cognitive Science. John Wiley & Sons, Ltd.

Honey, C., Kotter, R., Breakspear, M., & Sporns, O. (2007). Network structure of cerebral cortex shapes functional connectivity on multiple time scales. Proceedings of the National Academy of Sciences of the United States of America, 104(24), 10240-10245. doi: 10.1073/pnas.0701519104

Honey, C., Sporns, O., Cammoun, L., Gigandet, X., Thiran, J., Meuli, R., & Hagmann, P. (2009). Predicting human resting-state functional connectivity from structural connectivity. Proceedings of the National Academy of Sciences of the United States of America, 106(6), 2035-2040. doi: 10.1073/pnas.0811168106

Horel, J., Vierck, C., Pribram, K., Spinelli, D., John, E., & Ruchkin, D. (1967). Average evoked responses and learning. Science, 158(3799), 394.

Hutchinson, S., Lee, L., Gaab, N., & Schlaug, G. (2003). Cerebellar volume of musicians. Cerebral Cortex, 13(9), 943-949.

Hyde, K., Lerch, J., Norton, A., Forgeard, M., Winner, E., Evans, A., & Schlaug, G. (2009a). The effects of musical training on structural brain development: A longitudinal study. Neurosciences and Music III: Disorders and Plasticity, 1169, 182-186. doi: 10.1111/j.1749-6632.2009.04852.x

Hyde, K., Lerch, J., Norton, A., Forgeard, M., Winner, E., Evans, A., & Schlaug, G. (2009b). Musical training shapes structural brain development. Journal of Neuroscience, 29(10), 3019-3025. doi: 10.1523/jneurosci.5118-08.2009

Ivry, R., & Keele, S. (1989). Timing functions of the cerebellum. Journal of Cognitive Neuroscience, 1, 136-152.

Janata, P., & Grafton, S. (2003). Swinging in the brain: shared neural substrates for behaviors related to sequencing and music. Nature Neuroscience, 6, 682-687.

Jancke, L., Schlaug, G., & Steinmetz, H. (1997). Hand skill asymmetry in professional musicians. Brain and Cognition, 34(3), 424-432.

Jentschke, S., & Koelsch, S. (2009). Musical training modulates the development of syntax processing in children. Neuroimage, 47(2), 735-744. doi: 10.1016/j.neuroimage.2009.04.090

Jentschke, S., Koelsch, S., Sallat, S., & Friederici, A. (2008). Children with specific language impairment also show impairment of music-syntactic processing. Journal of Cognitive Neuroscience, 20(11), 1940-1951.

Kelso, J. (1995). Dynamic Patterns: The Self-Organization of Brain and Behavior. Cambridge, MA: The MIT Press.

Koelsch, S., & Friederici, A. (2003). Toward the neural basis of processing structure in music: Comparative results of different neurophysiological investigation methods. Neurosciences and Music, 999, 15-28.

Koelsch, S., Gunter, T., Schroger, E., Tervaniemi, M., Sammler, D., & Friederici, A. (2001). Differentiating ERAN and MMN: An ERP study. Neuroreport, 12(7), 1385-1389.

Koelsch, S., Jentschke, S., Sammler, D., & Mietchen, D. (2007). Untangling syntactic and sensory processing: An ERP study of music perception. Psychophysiology, 44(3), 476-490. doi: 10.1111/j.1469-8986.2007.00517.x

Koelsch, S., Schroger, E., & Tervaniemi, M. (1999). Superior pre-attentive auditory processing in musicians. Neuroreport, 10(6), 1309-1313.

Koelsch, S., Schroger, E., & Tervaniemi, M. (2000). Superior pre-attentive and attentive processing of auditory information in musicians: an MMN study. Journal of Psychophysiology, 14(1), 64-65.

Kotter, R. (2004). Online retrieval, processing, and visualization of primate connectivity data from the CoCoMac database. Neuroinformatics, 2(2), 127-144.

Krumhansl, C., & Castellano, M. (1983). Dynamic processes in music perception. Memory & Cognition, 11(4), 325-334.

Large, E., & Jones, M. (1999). The dynamics of attending: How people track time-varying events. Psychological Review, 106(1), 119-159.

Lee, D., Chen, Y., & Schlaug, G. (2003). Corpus callosum: musician and gender effects. Neuroreport, 14(2), 205-209. doi: 10.1097/01.wnr.0000053761.76853.41

Levanen, S., Ahonen, A., Hari, R., McEvoy, L., & Sams, M. (1996). Deviant auditory stimuli activate human left and right auditory cortex differently. Cerebral Cortex, 6, 288-296.

Lippé, S., Kovacevic, N., & McIntosh, A. (2009). Differential maturation of brain signal complexity in the human auditory and visual system. Frontiers in Human Neuroscience, 3, 48. doi: 10.3389/neuro.09.048.2009

Lobaugh, N., West, R., & McIntosh, A. (2001). Spatiotemporal analysis of experimental differences in event-related potential data with partial least squares. Psychophysiology, 38(3), 517-530.

Lopez, L., Jurgens, R., Diekmann, V., Becker, W., Ried, S., Grozinger, B., & Erne, S. (2003). Musicians versus nonmusicians: A neurophysiological approach. Neurosciences and Music, 999, 124-130.

Maess, B., Koelsch, S., Gunter, T., & Friederici, A. (2001). Musical syntax is processed in Broca's area: an MEG study. Nature Neuroscience, 4(5), 540-545.

Manoel, E., & Connolly, K. (1995). Variability and the development of skilled actions. International Journal of Psychophysiology, 19(2), 129-147.

Martinez-Montes, E., Valdes-Sosa, P., Miwakeichi, F., Goldman, R., & Cohen, M. (2004). Concurrent EEG/fMRI analysis by multiway partial least squares. Neuroimage, 22(3), 1023-1034.

Massimini, M., Ferrarelli, F., Esser, S., Riedner, B., Huber, R., Murphy, M., . . . Tononi, G. (2007). Triggering sleep slow waves by transcranial magnetic stimulation. Proceedings of the National Academy of Sciences of the United States of America, 104(20), 8496-8501. doi: 10.1073/pnas.0702495104

Massimini, M., Ferrarelli, F., Huber, R., Esser, S., Singh, H., & Tononi, G. (2005). Breakdown of cortical effective connectivity during sleep. Science, 309(5744), 2228-2232. doi: 10.1126/science.1117256

McIntosh, A. (2000). Towards a network theory of cognition. Neural Networks, 13(8-9), 861-870.

McIntosh, A., Bookstein, F., Haxby, J., & Grady, C. (1996). Spatial pattern analysis of functional brain images using partial least squares. Neuroimage, 3(3), 143-157.

McIntosh, A., Kovacevic, N., & Itier, R. (2008). Increased brain signal variability accompanies lower behavioral variability in development. PLoS Computational Biology, 4(7), e1000106. doi: 10.1371/journal.pcbi.1000106

McIntosh, A., Kovacevic, N., Lippé, S., Garrett, D., Grady, C., & Jirsa, V. (2010). The development of a noisy brain. Archives Italiennes de Biologie, 148(3), 323-337.

McIntosh, A., & Lobaugh, N. (2004). Partial least squares analysis of neuroimaging data: applications and advances. Neuroimage, 23, S250-S263. doi: 10.1016/j.neuroimage.2004.07.020

Menon, V., & Levitin, D. (2005). The rewards of music listening: Response and physiological connectivity of the mesolimbic system. Neuroimage, 28(1), 175-184. doi: 10.1016/j.neuroimage.2005.05.053

Menon, V., Levitin, D., Smith, B., Lembke, A., Krasnow, B., Glazer, D., . . . McAdams, S. (2002). Neural correlates of timbre change in harmonic sounds. Neuroimage, 17(4), 1742-1754. doi: 10.1006/nimg.2002.1295

Meyer-Lindenberg, A. (1996). The evolution of complexity in human brain development: an EEG study. Electroencephalography and Clinical Neurophysiology, 99, 405-411.

Misic, B., Mills, T., Taylor, M., & McIntosh, A. (2010). Brain noise is task dependent and region specific. Journal of Neurophysiology, 104(5), 2667-2676. doi: 10.1152/jn.00648.2010

Mizuno, T., & Sugishita, M. (2007). Neural correlates underlying perception of tonality-related emotional contents. Neuroreport, 18, 1651-1655.

Morrison, S., Demorest, S., Aylward, E., Cramer, S., & Maravilla, K. (2003). FMRI investigation of cross-cultural music comprehension. Neuroimage, 20(1), 378-384. doi: 10.1016/s1053-8119(03)00300-8

Munte, T., Altenmuller, E., & Jancke, L. (2002). The musician's brain as a model of neuroplasticity. Nature Reviews Neuroscience, 3(6), 473-478. doi: 10.1038/nrn843

Munte, T., Nager, W., Beiss, T., Schroeder, C., & Altenmuller, E. (2003). Specialization of the specialized: Electrophysiological investigations in professional musicians. Neurosciences and Music, 999, 131-139.

Näätänen, R., & Alho, K. (1995). Mismatch negativity - a unique measure of sensory processing in audition. International Journal of Neuroscience, 80, 317-337.

Näätänen, R., & Picton, T. (1987). The N1 wave of the human electric and magnetic response to sound: a review and analysis of the component structure. Psychophysiology, 38, 283-299.

Nikulin, V., & Brismar, T. (2004). Comment on "Multiscale entropy analysis of complex physiologic time series". Physical Review Letters, 92(8), 089803. doi: 10.1103/PhysRevLett.92.089803

Nyberg, L., Persson, J., Habib, R., Tulving, E., McIntosh, A., Cabeza, R., & Houle, S. (2000). Large scale neurocognitive networks underlying episodic memory. Journal of Cognitive Neuroscience, 12, 163-173.

Pantev, C., Oostenveld, R., Engelien, A., Ross, B., Roberts, L., & Hoke, M. (1998). Increased auditory cortical representation in musicians. Nature, 392(6678), 811-814.

Pantev, C., Roberts, L., Schulz, M., Engelien, A., & Ross, B. (2001). Timbre-specific enhancement of auditory cortical representations in musicians. Neuroreport, 12(1), 169-174.

Passingham, R., Stephan, K., & Kotter, R. (2002). The anatomical basis of functional localization in the cortex. Nature Reviews Neuroscience, 3(8), 606-616. doi: 10.1038/nrn893

Patterson, R., Uppenkamp, S., Johnsrude, I., & Griffiths, T. (2002). The processing of temporal pitch and melody information in auditory cortex. Neuron, 36(4), 767-776.

Penfield, W., & Boldrey, E. (1937). Somatic motor and sensory representation in the cerebral cortex of man as studied by electrical stimulation. Brain, 60, 389-443.

Peretz, I. (1990). Processing of local and global musical information by unilateral brain-damaged patients. Brain, 113, 1185-1205.

Peretz, I., & Babai, M. (1992). The role of contour and intervals in the recognition of melody parts: Evidence from cerebral asymmetries in musicians. Neuropsychologia, 30, 277-292.

Peretz, I., & Kolinsky, R. (1993). Boundaries of separability between melody and rhythm in music discrimination: A neuropsychological perspective. Quarterly Journal of Experimental Psychology Section A: Human Experimental Psychology, 46(2), 301-325.

Peretz, I., & Zatorre, R. (2005). Brain organization for music processing. Annual Review of Psychology, 56, 89-114.

Picton, T., Alain, C., Otten, L., Ritter, W., & Achim, A. (2000). Mismatch negativity: Different water in the same river. Audiology & Neurotology, 5, 111-139. doi: 10.1159/000013875

Protzner, A., Valiante, T., Kovacevic, N., McCormick, C., & McAndrews, M. (2010). Hippocampal signal complexity in mesial temporal lobe epilepsy: a noisy brain is a healthy brain. Archives Italiennes de Biologie, 148, 289-297.

Raichle, M., MacLeod, A., Snyder, A., Powers, W., Gusnard, D., & Shulman, G. (2001). A default mode of brain function. Proceedings of the National Academy of Sciences of the United States of America, 98(2), 676-682.

Rankin, S., Large, E., & Fink, P. (2009). Fractal tempo fluctuation and pulse prediction. Music Perception, 26(5), 401-413. doi: 10.1525/mp.2009.26.5.401

Richman, J., & Moorman, J. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology, 278(6), H2039-H2049.

Ritter, W., Sussman, E., Deacon, D., Cowan, N., & Vaughan, H. (1999). Two cognitive systems simultaneously prepared for opposite events. Psychophysiology, 36(6), 835-838.

Rodriguez, E., George, N., Lachaux, J., Martinerie, J., Renault, B., & Varela, F. (1999). Perception's shadow: long-distance synchronization of human brain activity. Nature, 397, 430-433.

Russeler, J., Altenmuller, E., Nager, W., Kohlmetz, C., & Munte, T. (2001). Event-related brain potentials to sound omissions differ in musicians and non-musicians. Neuroscience Letters, 308(1), 33-36.

Salimpoor, V., Benovoy, M., Larcher, K., Dagher, A., & Zatorre, R. (2011). Anatomically distinct dopamine release during anticipation and experience of peak emotion to music. Nature Neuroscience, 14(2), 257-262. doi: 10.1038/nn.2726

Sampson, P., Streissguth, A., Barr, H., & Bookstein, F. (1989). Neurobehavioral effects of prenatal alcohol: Part II. Partial least squares analysis. Neurotoxicology and Teratology, 11(5), 477-491.

Samson, S. (2003). Neuropsychological studies of musical timbre. Neurosciences and Music, 999, 144-151.

Schlaug, G. (2001). The brain of musicians. Annals of the New York Academy of Sciences, 930, 281-299.

Schlaug, G., Forgeard, M., Zhu, L., Norton, A., & Winner, E. (2009). Training-induced neuroplasticity in young children. Neurosciences and Music III: Disorders and Plasticity, 1169, 205-208. doi: 10.1111/j.1749-6632.2009.04842.x

Schlaug, G., Jancke, L., Huang, Y., Staiger, J., & Steinmetz, H. (1995). Increased corpus callosum size in musicians. Neuropsychologia, 33(8), 1047.

Schlaug, G., Jancke, L., Huang, Y., & Steinmetz, H. (1995). In-vivo evidence of structural brain asymmetry in musicians. Science, 267(5198), 699-701.

Schmithorst, V., & Wilke, M. (2002). Differences in white matter architecture between musicians and non-musicians: a diffusion tensor imaging study. Neuroscience Letters, 321(1-2), 57-60.

Schneider, P., Scherg, M., Dosch, H., Specht, H., Gutschalk, A., & Rupp, A. (2002). Morphology of Heschl's gyrus reflects enhanced activation in the auditory cortex of musicians. Nature Neuroscience, 5(7), 688-694. doi: 10.1038/nn871

Scoville, W., & Milner, B. (1957). Loss of recent memory after bilateral hippocampal lesions. Journal of Neurology, Neurosurgery and Psychiatry, 20(1), 11-21.

Shannon, C., & Weaver, W. (1949). The Mathematical Theory of Communication. Urbana, IL: University of Illinois Press.

Shen, Y., Olbrich, E., Achermann, P., & Meier, P. (2003). Dimensional complexity and spectral properties of the human sleep EEG. Clinical Neurophysiology, 114(2), 199-209. doi: 10.1016/s1388-2457(02)00338-3

Singer, W., & Gray, C. (1995). Visual feature integration and the temporal correlation hypothesis. Annual Review of Neuroscience, 18, 555-586.

Snyder, J., & Large, E. (2005). Gamma-band activity reflects the metric structure of rhythmic tone sequences. Cognitive Brain Research, 24(1), 117-126. doi: 10.1016/j.cogbrainres.2004.12.014

Sokolov, A., Pavlova, M., Lutzenberger, W., & Birbaumer, N. (2004). Reciprocal modulation of neuromagnetic induced gamma activity by attention in the human visual and auditory cortex. Neuroimage, 22(2), 521-529. doi: 10.1016/j.neuroimage.2004.01.045

Stickgold, R., Hobson, J. A., Fosse, R., & Fosse, M. (2001). Sleep, learning, and dreams: Off-line memory reprocessing. Science, 294(5544), 1052-1057. doi: 10.1126/science.1063530

Tallon-Baudry, C., & Bertrand, O. (1999). Oscillatory gamma activity in humans and its role in object representation. Trends in Cognitive Sciences, 3(4), 151-162.

Tervaniemi, M., Ilvonen, T., Karma, K., Alho, K., & Näätänen, R. (1997). The musical brain: Brain waves reveal the neurophysiological basis of musicality in human subjects. Neuroscience Letters, 226(1), 1-4.

Tervaniemi, M., Just, V., Koelsch, S., Widmann, A., & Schroger, E. (2005). Pitch discrimination accuracy in musicians vs. nonmusicians: an event-related potential and behavioural study. Experimental Brain Research, 161, 1-10.

Tiitinen, H., Alho, K., Huotilainen, M., Ilmoniemi, R., Simola, J., & Näätänen, R. (1993). Tonotopic auditory cortex and the magnetoencephalographic (MEG) equivalent of the mismatch negativity. Psychophysiology, 30(5), 537-540.

Tononi, G. (2010). Information integration: its relevance to brain function and consciousness. Archives Italiennes de Biologie, 148(3), 299-322.

Tononi, G., & Edelman, G. (1998). Consciousness and complexity. Science, 282(5395), 1846-1851.

Tononi, G., Edelman, G., & Sporns, O. (1998). Complexity and coherency: integrating information in the brain. Trends in Cognitive Sciences, 2(12), 474-484.

Tononi, G., Sporns, O., & Edelman, G. (1994). A measure for brain complexity: relating functional segregation and integration in the nervous system. Proceedings of the National Academy of Sciences of the United States of America, 91(11), 5033-5037.

Trainor, L., McDonald, K., & Alain, C. (2002). Automatic and controlled processing of melodic contour and interval information measured by electrical brain activity. Journal of Cognitive Neuroscience, 14(3), 430-442.

Trainor, L., Shahin, A., & Roberts, L. (2009). Understanding the benefits of musical training: Effects on oscillatory brain activity. Neurosciences and Music III: Disorders and Plasticity, 1169, 133-142. doi: 10.1111/j.1749-6632.2009.04589.x

Vakorin, V., Lippé, S., & McIntosh, A. (2011). Variability of brain signals processed locally transforms into higher connectivity with brain development. Journal of Neuroscience, 31(17), 6405-6413. doi: 10.1523/jneurosci.3153-10.2011

Vakorin, V., Ross, B., Krakovska, O., Bardouille, T., Cheyne, D., & McIntosh, A. (2010). Complexity analysis of source activity underlying the neuromagnetic somatosensory steady-state response. Neuroimage, 51(1), 83-90. doi: 10.1016/j.neuroimage.2010.01.100

Varela, F., Lachaux, J., Rodriguez, E., & Martinerie, J. (2001). The brainweb: Phase synchronization and large-scale integration. Nature Reviews Neuroscience, 2(4), 229-239.

Warren, J., Uppenkamp, S., Patterson, R., & Griffiths, T. (2003). Separating pitch chroma and pitch height in the human brain. Proceedings of the National Academy of Sciences of the United States of America, 100(17), 10038-10042. doi: 10.1073/pnas.1730682100

Watts, D., & Strogatz, S. (1998). Collective dynamics of 'small-world' networks. Nature, 393, 440-442.

Zanto, T., Large, E., Fuchs, A., & Kelso, J. (2005). Gamma-band responses to perturbed auditory sequences: Evidence for synchronization of perceptual processes. Music Perception, 22(3), 531-547.

Zanto, T., Snyder, J., & Large, E. (2006). Neural correlates of rhythm expectancy. Advances in Cognitive Psychology, 2(2), 221-231.

Zatorre, R. (1988). Pitch perception of complex tones and human temporal-lobe function. Journal of the Acoustical Society of America, 84, 566-572.

Zatorre, R., Evans, A., & Meyer, E. (1994). Neural mechanisms underlying melodic perception and memory for pitch. Journal of Neuroscience, 14(4), 1908-1919.

Zatorre, R., & Samson, S. (1991). Role of the right temporal neocortex in retention of pitch in auditory short-term memory. Brain, 114, 2403-2417.