
IOP PUBLISHING JOURNAL OF NEURAL ENGINEERING

J. Neural Eng. 6 (2009) 066001 (14pp) doi:10.1088/1741-2560/6/6/066001

Decoding flexion of individual fingers using electrocorticographic signals in humans

J Kubánek1,2,3, K J Miller4,5, J G Ojemann6, J R Wolpaw1 and G Schalk1,7,8,9,10

1 BCI R&D Program, Wadsworth Center, New York State Department of Health, Albany, NY, USA
2 Department of Biomedical Engineering, Washington University, St Louis, MO, USA
3 Department of Anatomy and Neurobiology, Washington University School of Medicine, St Louis, MO, USA
4 Department of Physics, University of Washington, Seattle, WA, USA
5 Department of Medicine, University of Washington, Seattle, WA, USA
6 Department of Neurosurgery, University of Washington School of Medicine, Seattle, WA, USA
7 Department of Neurology, Albany Medical College, Albany, NY, USA
8 Department of Neurosurgery, Washington Univ, St Louis, MO, USA
9 Department of Biomedical Science, State University of New York at Albany, Albany, NY, USA
10 Department of Biomedical Engineering, Rensselaer Polytechnic Institute, Troy, NY, USA

E-mail: [email protected]

Received 21 October 2008
Accepted for publication 27 August 2009
Published 1 October 2009
Online at stacks.iop.org/JNE/6/066001

Abstract
Brain signals can provide the basis for a non-muscular communication and control system, a brain–computer interface (BCI), for people with motor disabilities. A common approach to creating BCI devices is to decode kinematic parameters of movements using signals recorded by intracortical microelectrodes. Recent studies have shown that kinematic parameters of hand movements can also be accurately decoded from signals recorded by electrodes placed on the surface of the brain (electrocorticography (ECoG)). In the present study, we extend these results by demonstrating that it is also possible to decode the time course of the flexion of individual fingers using ECoG signals in humans, and by showing that these flexion time courses are highly specific to the moving finger. These results provide additional support for the hypothesis that ECoG could be the basis for powerful clinically practical BCI systems, and also indicate that ECoG is useful for studying cortical dynamics related to motor function.

This article features online multimedia enhancements

(Some figures in this article are in colour only in the electronic version)

1. Introduction

Brain–computer interfaces (BCIs) use brain signals to communicate a user's intent [1]. Because these systems directly translate brain activity into action, without depending on peripheral nerves and muscles, they can be used by people with severe motor disabilities. Successful translation of BCI technology from the many recent laboratory demonstrations into widespread and valuable clinical applications is currently substantially impeded by the problems of traditional non-invasive or intracortical signal acquisition technologies.

Non-invasive BCIs use electroencephalographic activity (EEG) recorded from the scalp [1–8]. While non-invasive BCIs can support much higher performance than previously assumed, including two- and three-dimensional cursor movement [5, 8–10], their use typically requires extensive user training, and their performance is often not robust. Intracortical BCIs use action potential firing rates or


local field potential activity recorded from individual or small populations of neurons within the brain [11–19]. Signals recorded within cortex may encode more information and might support BCI systems that require less training than EEG-based systems. However, clinical implementations are impeded mainly by the problems in achieving and maintaining stable long-term recordings from individual neurons and by the high variability in neuronal behavior [20–22]. Despite encouraging evidence that BCI technologies can actually be useful for severely disabled individuals [6, 23], these issues of non-invasive and action potential-based techniques in acquiring and maintaining robust recordings and BCI control remain crucial obstacles that currently impede widespread clinical use in humans.

The capacities and limitations of different brain signals for BCI applications are still unclear, in particular in light of the technical and practical constraints imposed by the long-term clinical applications in humans that are the immediate goal of BCI research. The further development of most BCIs based on intracortical recordings is driven by the prevalent conviction [24–27] that only action potentials provide detailed information about specific (e.g. kinematic) parameters of actual or imagined limb movements. It is argued that the use of such specific movement-related parameters can support BCI control that is more accurate or intuitive than that supported by BCIs that do not use detailed movement parameters. This prevalent conviction has recently been challenged in two ways. First, it is becoming increasingly evident that brain signal features other than action potential firing rates also contain detailed information about the plans for and the execution of movements. This possibility was first proposed for subdurally recorded signals by Brindley in 1972 [28]. More recently, several studies [29–31] showed in monkeys that local field potentials (LFPs) recorded using intracortical microelectrodes contain substantial movement-related information and that this information 'discriminated between directions with approximately the same accuracy as the spike rate' [29]. Furthermore, our own results in humans [32–37] showed that electrocorticographic activity (ECoG) recorded from the cortical surface can give both general and specific information about movements, such as hand position and velocity. We also found that the fidelity of this information was within the range of the results reported previously for intracortical microelectrode recordings in non-human primates (table 3 in [36]). These results have recently been replicated and further extended [38, 39]. There has even been some evidence that EEG and MEG carry some information about kinematic parameters [40]. Second, it is still unclear whether information about movement parameters does in fact extend the range of options for BCI research and application. Recent studies using EEG and ECoG in humans ([5, 8–10] and [41], respectively) used brain signals that did not reflect particular kinematic parameters. Nevertheless, these studies demonstrated BCI performance that was amongst the best reported to date. In summary, it has become clear that information about specific movement parameters is accessible to different types of signals recorded by different sensors, and there is currently little evidence that any sensor supports substantially higher BCI performance than the others.

Going forward, a critical challenge in designing BCI systems for widespread clinical application is to identify and optimize a BCI method that combines good performance with ease of use and robustness. Based on the studies listed above and others [32, 41–44], ECoG has emerged as a signal modality that may satisfy these requirements. This notion has been supported in part by our previous study [36], in which we showed that ECoG can be used to decode parameters of hand movements. In the present study, we asked to what extent ECoG also holds information about movements of individual fingers. There is some evidence in the literature that supports this hypothesis. When monkeys were prompted to move individual fingers [45–47], neurons were found to change their activity in widely distributed patterns. In line with this finding, another study did not find a focal anatomical representation of individual fingers in M1 [48]. This widely distributed and thus relatively large representation has also been shown in humans, e.g., in fMRI studies [49, 50], lesion studies [51, 52] and a MEG study [53]. Recent studies have already begun to attempt to differentiate activity associated with particular finger movement tasks: in humans, movements of the index finger could be discriminated from rest using EEG [54]; the laterality of index finger movements could be determined using EEG [55, 56], MEG [57], or ECoG [58]; MEG signals were found to correlate with velocity profiles of movements of one finger [59]; and studies even found some differences between movements of different fingers on the same hand using EEG [60], or more robust differences using ECoG [61, 62] and activity recorded from individual neurons in monkey motor cortex [63–66].

In this study, we set out to determine whether it is possible to faithfully decode the time course of flexion of each finger, i.e. not only the laterality of finger movements or the type of finger, in humans using ECoG. We studied five subjects who were asked to repeatedly flex each of the five fingers in response to visual cues. The principal results show that it is possible to accurately decode the time course of the flexion of each finger individually. Furthermore, they confirm the finding of our previous study [36] that the most useful sources of information that can be recorded with subdural electrodes are the local motor potential (LMP) and ECoG activity in high frequency bands (i.e. from about 70 Hz to about 200 Hz) recorded from hand and other motor cortical representations. These results provide strong additional evidence that ECoG could be used to provide accurate multidimensional BCI control, and also suggest that ECoG is a powerful tool for studying brain function.

2. Methods

2.1. Subjects

The subjects in this study were five patients with intractable epilepsy who underwent temporary placement of subdural electrode arrays to localize seizure foci prior to surgical resection. They included three women (subjects A, C and E) and two men (subjects B and D). (See table 1 for additional information.) All gave informed consent. The study was



Figure 1. Estimated locations of recorded electrodes in the five subjects. Subject B's grid was located on the right hemisphere. We projected this grid on the left hemisphere. The brain template on the right highlights the location of the central sulcus and Sylvian fissure, and also outlines relevant Brodmann areas.

Table 1. Clinical profiles. All subjects had normal cognitive capacity and were literate and functionally independent.

Subject Age Sex Handedness Grid location Seizure focus

A 46 F L Left frontal Left frontal
B 24 M R Right frontal Right medial frontal
C 18 F R Left frontal Left frontal
D 32 M R Left fronto-temporal Left temporal
E 27 F R Left fronto-temporal Left temporal

approved by the Institutional Review Board of the University of Washington School of Medicine. Each subject had a 48- or 64-electrode grid placed over the fronto-parietal-temporal region including parts of sensorimotor cortex (figure 1). These grids consisted of flat electrodes with an exposed diameter of 2.3 mm and an inter-electrode distance of 1 cm and were implanted for about 1 week. Grid placement and duration of ECoG monitoring were based solely on the requirements of the clinical evaluation, without any consideration of this study. Following placement of the subdural grid, each subject had postoperative anterior–posterior and lateral radiographs to verify grid location.

2.2. Experimental paradigm

For this study, each subject was in a semi-recumbent position in a hospital bed about 1 m from a video screen. Subjects were instructed to move specific individual fingers of the hand contralateral to the implant in response to visual cues. The cues were the written names of individual fingers (e.g. 'thumb'). Each cue was presented for 2 s. The subjects were asked to repeatedly flex and extend the requested finger during cue presentation. The subjects typically flexed the indicated finger 3–5 times over a period of 1.5–3 s. Each cue was followed by the word 'rest', which was presented for 2 s. The visual cues were presented in random order. Data were collected for a total period of 10 min, which yielded an average of 30 trials for each finger. The flexion of each finger was measured using a data glove (5DT Data Glove 5 Ultra, Fifth Dimension Technologies) that digitized the flexion of each finger at 12-bit resolution. The effective resolution (i.e. the resolution between full flexion and extension) averaged 9.6 bits for all subjects and fingers. Figure 2 shows an example of flexion patterns for the five fingers acquired using this data glove. These behavioral data were stored along with the digitized ECoG signals as described below and were subsequently normalized by their mean and standard deviation prior to any subsequent analysis.

2.3. Data collection

In all experiments in this study, we presented the stimuli and recorded ECoG from the electrode grid using the general-purpose BCI2000 system [67] connected to a Neuroscan Synamps2 system. Simultaneous clinical monitoring was achieved using a connector that split the cables coming from the subject into one set that was connected to the clinical monitoring system and another set that was connected to the BCI2000/Neuroscan system. All electrodes were referenced to an inactive electrode. The signals were amplified, bandpass filtered between 0.15 and 200 Hz, digitized at 1000 Hz and stored in BCI2000 along with the digitized flexion samples for all fingers [33]. Each dataset was visually inspected and all channels that did not clearly contain ECoG activity (e.g. channels that contained flat signals or noise due to broken connections—we did not notice, and thus did not exclude, channels with interictal activity) were removed prior to analysis, which left 48, 63, 47, 64 and 61 channels (for subjects A–E, respectively) for our analyses.

2.4. 3D cortical mapping

We used lateral skull radiographs to identify the stereotactic coordinates of each grid electrode with software [68] that duplicated the manual procedure described in [69]. We defined cortical areas using Talairach's Co-Planar Stereotaxic Atlas of the Human Brain [70] and a Talairach transformation (http://www.talairach.org). We obtained a template 3D cortical brain model (subject-specific brain models were not available) from the source code provided on the AFNI SUMA website (http://afni.nimh.nih.gov/afni/suma). Finally, we projected each subject's electrode locations on this 3D brain model and generated activation maps using a custom Matlab program.



Figure 2. Example finger flexion. These traces show the behavioral flexion patterns for subject E and each of the five fingers over the first 2 min of data collection. Flexion consisted of movements of individual fingers, accompanied occasionally by minor flexion of other fingers. Horizontal bars give time and vertical bars give units of standard deviation.

2.5. Feature extraction

We first re-referenced the signal from each electrode using a common average reference (CAR) montage. To do this, we obtained the CAR-filtered signal s′_h at channel h using s′_h = s_h − (1/H) Σ_{q=1}^{H} s_q, where H was the total number of channels and s_h was the original signal sample at a particular time.

For each 100 ms time period (overlapping by 50 ms), we then converted the time-series ECoG data into the frequency domain with an autoregressive model [71] of order 20. Using this model, we calculated spectral amplitudes between 0 and 200 Hz in 1 Hz bins. We also derived the LMP, which was calculated as the running average of the raw unrectified time-domain signal at each electrode calculated for each 100 ms time window. Figure 3 demonstrates that there was practically no (linear) relationship between the LMP and amplitudes in different frequencies, and generally little relation between different frequencies. We then averaged the spectral amplitudes in particular frequency ranges (i.e. 8–12 Hz, 18–24 Hz, 75–115 Hz, 125–159 Hz, 159–175 Hz). These bins are similar to what we used in our previous study [36], except that we abandoned the 35–70 Hz frequency range because it has been demonstrated to reflect conflicting spectral phenomena [72] and because its inclusion in preliminary tests decreased performance. Figure 4 shows an example to demonstrate that our selected ECoG features at particular locations typically exhibited correlation with the flexion of individual fingers.

In summary, we extracted up to 384 features (6 features—the LMP and 5 frequency-based features—at up to 64 locations) using 100 ms time windows from the ECoG signals. We then decoded from these features the flexion of the individual fingers. As shown later in this paper, we achieved optimal performance in decoding finger flexion when we used brain activity that preceded the actual movement by 50–100 ms. These features were submitted to a decoding algorithm that is described in more detail below.
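The following minimal sketch illustrates this feature-extraction step, assuming the ECoG is available as a samples-by-channels NumPy array. The function and parameter names (car_filter, extract_features, win_ms, bands) are illustrative, not the authors' code, and FFT amplitudes stand in for the order-20 autoregressive spectral estimator used in the study (with a 100 ms window this stand-in only has 10 Hz resolution):

```python
import numpy as np

def car_filter(sig):
    """Common average reference: subtract the across-channel mean
    from every sample (sig is samples x channels)."""
    return sig - sig.mean(axis=1, keepdims=True)

def extract_features(sig, fs=1000, win_ms=100, step_ms=50,
                     bands=((8, 12), (18, 24), (75, 115), (125, 159), (159, 175))):
    """Per-window LMP and band-amplitude features for each channel.

    sig: CAR-filtered ECoG (samples x channels).
    NOTE: the paper estimated 1 Hz spectral amplitudes with an
    autoregressive model of order 20; plain FFT amplitudes are used
    here as a simplified stand-in.
    """
    win = int(fs * win_ms / 1000)
    step = int(fs * step_ms / 1000)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    feats = []
    for start in range(0, sig.shape[0] - win + 1, step):
        seg = sig[start:start + win]               # (win x channels)
        lmp = seg.mean(axis=0)                     # running average = LMP
        amp = np.abs(np.fft.rfft(seg, axis=0))     # spectral amplitudes
        band_amps = [amp[(freqs >= lo) & (freqs <= hi)].mean(axis=0)
                     for lo, hi in bands]
        feats.append(np.concatenate([lmp] + band_amps))
    return np.asarray(feats)                       # (windows x 6*channels)
```

With 64 channels this yields the up-to-384-dimensional feature vector per 100 ms window described above.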

2.6. Decoding and evaluation

As described above, we first extracted several time- or frequency-domain features from the raw ECoG signals. We


Figure 3. Correlation of features. This figure shows the correlation coefficient r (color coded—see the color bar), calculated for all data between the LMP and amplitudes in different frequency bands. r is very low for the LMP and amplitudes in different frequencies, and generally low for different frequencies.

then constructed one linear multivariate decoder for each finger, where each decoder defined the relationship between the set of features and flexion of a particular finger. Each decoder was computed from the finger movement periods. A movement period was defined as the time from 1000 ms prior to movement onset to 1000 ms after movement offset. Movement onset was specified as the time when the finger's flexion value exceeded an empirically defined threshold. Movement offset was specified as the time when the finger's flexion value fell below that threshold and no movement onset was detected within the next 1200 ms. This procedure yielded 25–48 (31 on average) movement periods for each finger. We discarded all data outside the movement periods so as to provide a balanced representation of movement and rest periods for the construction of the decoder.
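A sketch of this movement-period detection, assuming a normalized flexion trace and an empirically chosen threshold; the function name and arguments (movement_periods, pad_s, gap_s) are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

def movement_periods(flex, fs, thresh, pad_s=1.0, gap_s=1.2):
    """Find movement periods in one finger's flexion trace.

    Onset: flexion rises above `thresh`; offset: flexion falls below
    `thresh` with no new onset within `gap_s` seconds.  Each period is
    then padded by `pad_s` seconds on both sides, as in the paper.
    """
    above = flex > thresh
    rises = np.flatnonzero(~above[:-1] & above[1:]) + 1   # threshold crossings up
    falls = np.flatnonzero(above[:-1] & ~above[1:]) + 1   # threshold crossings down
    periods, gap, pad = [], int(gap_s * fs), int(pad_s * fs)
    i = 0
    while i < len(rises):
        onset = rises[i]
        offs = falls[falls > onset]
        offset = offs[0] if len(offs) else len(flex) - 1
        while True:
            # extend the period if another onset occurs within `gap` samples
            nxt = rises[(rises > offset) & (rises <= offset + gap)]
            if len(nxt) == 0:
                break
            offs = falls[falls > nxt[-1]]
            offset = offs[0] if len(offs) else len(flex) - 1
        periods.append((max(onset - pad, 0), min(offset + pad, len(flex) - 1)))
        i = np.searchsorted(rises, offset, side="right")
    return periods
```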



Figure 4. Example of ECoG features related to finger flexion in subject B. (A) Example traces of the LMP (left) and high gamma (75–115 Hz, right), corresponding to the 20 locations indicated by the black rectangle shown in (B). These traces show the time course of ECoG (top traces) as well as the behavioral trace for thumb flexion (trace on the bottom). The traces only show data for the movement periods, i.e. periods around thumb movements (see scissors symbols). Dashed lines indicate beginning/ends of movement periods. (B) Locations of the electrodes on the grid shown on the 3D template brain model. The locations that showed correlations with thumb flexion in A, i.e. channels 20 and 22, are indicated with symbols. (C) Magnification of the first three movement periods. The solid traces on top and bottom show thumb flexion, and the dotted traces on top and bottom give LMP and high gamma activity, respectively, at channel 20. The LMP and high gamma activity are correlated with flexion of the thumb.

For each finger, the decoder was constructed and evaluated using fivefold cross-validation. To do this, each dataset was divided into five parts, the decoders were determined from 4/5th of the dataset (training set) and tested (i.e. the decoder was applied) on the remaining 1/5th (test set). This procedure was then repeated five times—each time, a different 1/5th of the dataset was used as the test set. The decoder for the flexion of each finger was constructed by determining the linear relationship between ECoG features and finger flexion using the PaceRegression algorithm that is part of the Java-based Weka package [73]. Unless otherwise noted, we utilized feature samples that preceded the flexion traces to be decoded. Specifically, we used a window of 100 ms length that was centered at −50 ms with respect to the flexion trace, i.e. causal prediction. Thus, this procedure could be used to predict the behavioral flexion patterns of each finger in real

time without knowledge of which finger moved or when it moved. (All decoded flexion traces shown in figures 7 and 8, and classification time courses shown in figure 11, are shifted back in time by the 50 ms offset used for decoding to highlight the fact that cortical signals precede the movements.)

We then computed Pearson's correlation coefficient r between the actual and decoded flexion traces for each test set. Finally, we derived a metric of decoding performance for each subject and finger by averaging the r values across the five test sets.
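A minimal sketch of this decoding and evaluation loop, assuming the features have already been shifted to precede the flexion samples they predict. scikit-learn's ridge regression stands in for the Weka PaceRegression algorithm used in the study, and the function name and alpha parameter are illustrative:

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def decode_finger(features, flexion, n_folds=5, alpha=1.0):
    """Fivefold cross-validated linear decoding of one finger's flexion.

    features: (windows x n_features) ECoG features.
    flexion:  (windows,) normalized flexion trace of one finger.
    Returns the correlation r between actual and decoded flexion,
    averaged across the test folds.
    """
    rs = []
    # contiguous, unshuffled folds mirror the five dataset parts in the paper
    for train, test in KFold(n_splits=n_folds).split(features):
        model = Ridge(alpha=alpha).fit(features[train], flexion[train])
        decoded = model.predict(features[test])
        r, _ = pearsonr(flexion[test], decoded)
        rs.append(r)
    return float(np.mean(rs))
```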

3. Results

3.1. Brain signal responses to finger movements

We first characterized the brain signal responses to finger flexion. These responses were assessed by determining the



Figure 5. Brain signal changes between rest and thumb movement for all subjects. Each column (A–E) gives data for the respective subject. Plots on top show brain signal changes in frequencies between 0 and 200 Hz over time (horizontal axis from 600 ms prior to 600 ms after movement onset). These changes are given in color-coded values of r2 (see color bars). Plots on the bottom show changes (expressed as r2 values) in the LMP feature over time. Blue traces on the bottom right of each brain indicate the average actual thumb flexion trace. Insets show magnifications of the areas shown by black rectangles. High frequencies in the ECoG and the LMP give substantial information about the movement, and some of this information appears to precede the actual movement. An MPEG movie of this figure is available from stacks.iop.org/JNE/6/066001

correlation of ECoG features with the task of resting or moving a finger. The rest dataset included all data from 1800 ms to 1200 ms before each movement onset. For each subject, finger and feature, we then calculated the value of r2 between the distribution of feature values during rest and the distribution of feature values for each time point from 600 ms before to 600 ms after movement onset. Figure 5 shows the results for flexion of the thumb for all subjects.
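This rest-versus-movement r2 can be computed as the squared point-biserial correlation between feature value and condition label; a brief illustrative sketch (the function name is an assumption):

```python
import numpy as np

def r_squared(rest_vals, move_vals):
    """r^2 between two distributions of one feature (rest samples vs.
    samples at one time point relative to movement onset)."""
    x = np.concatenate([rest_vals, move_vals])
    y = np.concatenate([np.zeros(len(rest_vals)), np.ones(len(move_vals))])
    r = np.corrcoef(x, y)[0, 1]     # point-biserial correlation
    return r * r
```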

The time–frequency plots on top show that in each subject the amplitudes at particular locations and frequencies (in particular at frequencies >50 Hz) hold substantial information (up to an r2 of 0.66) about whether or not the thumb was flexed. This information was highly statistically significant: when we ran a randomization test, r2 values of less than 0.02 were significant at the 0.01 level for any subject and time–frequency or LMP feature. This information about finger flexion was localized to one or only a few electrode locations. Furthermore, it is also evident from these plots that ECoG changes preceded the actual movement. The traces on the bottom show that the LMP also changed substantially (i.e. up to an r2 of 0.60) with the movement, and that, again, some of the changes preceded the actual movement. Activity changes for different fingers (see supplementary figure 1 available

at stacks.iop.org/JNE/6/066001) showed the same general topographical and spectral characteristics as shown here for the thumb movement. At the same time, each finger showed its own distinctive pattern (e.g. time–frequency changes). As shown later in section 3.6, these different patterns allowed for accurate classification of individual fingers.

Figure 5, supplementary figure 1 and the movie (available from stacks.iop.org/JNE/6/066001) illustrate qualitatively how ECoG signals differ between rest and finger movement. We also quantitatively assessed these changes by determining how many electrodes showed significant activity changes for the high gamma (75–115 Hz) band and the LMP (see footnote 11). We applied a three-way ANOVA to determine which factors

11 To compute significance for each electrode, we ran a randomization test using 100 repetitions of the same procedure that produced figure 5 and supplementary figure 1 (available at stacks.iop.org/JNE/6/066001), which resulted in sets of r2 values for actual and randomized data. We then averaged these r2 values between −600 ms and +600 ms. Separately for the high gamma (75–115 Hz) band and the LMP, we then fit the randomized r2 data with an exponential distribution, using a maximum likelihood estimate of its one parameter. We computed the level of significance α of the actual r2 value as its percentile within the randomized r2 distribution. A LMP or high gamma feature was considered significant at a certain channel when its p-value (p = 1 − α) was smaller than 0.01 after Bonferroni-correcting for the number of channels.



Figure 6. Relationship of brain signals with thumb flexion. This figure shows the spatial distribution of the confidence index ci for the LMP (top left) or 75–115 Hz high gamma band (top right) and for each subject and the LMP or high gamma feature (ten brains on the bottom). The confidence indices are color coded (see color bars).

(subject, finger or feature) were related to the number of channels with significant signal changes (Ns). The results showed that Ns was not related to the subjects (p = 0.20)

or the different fingers (p = 0.77). However, the number of significant electrodes substantially differed for the LMP (17% of all channels) and high gamma features (5% of all channels) (p < 0.0001). Notably, we did not find a difference when we compared Ns for the thumb versus the other fingers (p = 0.21). This suggests that a potential difference in the size of the cortical representation of the thumb versus other fingers is smaller than the spatial resolution (1 cm) available in this study.

In summary, changes in the LMP and amplitudes at frequencies higher than 50 Hz over the hand area of motor cortex are associated with flexion of individual fingers, with different changes for different fingers. Some of these changes preceded finger flexion. Furthermore, the LMP feature is spatially broader than the high gamma feature. The analyses in the next section determined how well each feature was related to the degree of finger flexion.

3.2. Relationship of ECoG signals with finger flexion

To further characterize the relationship of the ECoG features with finger flexion, we correlated, for all movement periods, the time course of the flexion with the time course of each feature for each finger. We then computed one measure of the correlation coefficient r and its significance p (determined using the F-test) for each subject, finger and cross-validation fold. To get these two measures for each finger of each subject, we used the results from the four (out of five)

cross-validation folds with the highest r values. We then converted the associated p values into indices of confidence ci

(ci = −log10(p)) and averaged those across the four cross-validation folds.
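A sketch of this confidence-index computation, assuming per-fold feature and flexion vectors restricted to the movement periods. scipy's correlation p-value is used in place of the quoted F-test (the two are equivalent for a single regressor), and the function name is an illustrative assumption:

```python
import numpy as np
from scipy.stats import pearsonr

def confidence_index(feature_folds, flexion_folds):
    """ci = -log10(p) for one feature and one finger.

    feature_folds, flexion_folds: one (samples,) array per
    cross-validation fold.  The four folds with the highest r are
    averaged, as in the paper.
    """
    stats = [pearsonr(f, y) for f, y in zip(feature_folds, flexion_folds)]
    stats.sort(key=lambda rp: rp[0], reverse=True)   # best r first
    top = stats[:4]
    return float(np.mean([-np.log10(max(p, 1e-300)) for _, p in top]))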

The results of the analyses for thumb flexion are shown in figure 6. They are representative of the results achieved for the other fingers, which indicates that the brain signal changes associated with specific flexion values largely overlap for the different fingers. The two large brain images on top show the confidence indices for the LMP and 75–115 Hz (i.e. high gamma), averaged across all subjects. The ten smaller brain images on the bottom show results for individual subjects A–E and the LMP (upper row) and high gamma (lower row) features.

The high values of the confidence indices over primarily different motor cortical areas and for the LMP and high gamma features again document the important role of these locations and ECoG features, respectively. Similar to our previous study [36], the topographies of the confidence indices differ between the LMP and high gamma (see figure 5 in [36]) and are more diffuse for the LMP. Interestingly, the patterns of confidence indices shown here are spatially much more widespread than the information shown before in figure 5 that differentiates movement from rest. See section 4 for further discussion of this topic. Also, the spatial distribution of the LMP and high gamma features for subjects A and B appears to be more spatially widespread than that for the other three subjects. This is interesting, as subjects A and B used their non-dominant hand for the finger movement task. The next section demonstrates how this information about finger flexion



Figure 7. Examples for actual and decoded movement trajectories. This figure shows, for the best cross-validation fold for each finger and for subject A, the actual (solid) and decoded (dotted) finger flexion traces, as well as the respective correlation coefficients r. These traces demonstrate the generally close concurrence between actual and decoded finger flexion. They also show that this subject used different flexion rates for different fingers and/or cross-validation folds and that the decoded traces reflect these different rates. The horizontal bar gives time and the vertical bar gives units of standard deviation.

Table 2. Decoding performance for all subjects and fingers. This table lists correlation coefficients r, calculated between the actual and decoded flexion time courses for the indicated finger. The three values for each combination of subject and finger represent the mean, minimum and maximum value of r for all cross-validation folds. These results demonstrate that reconstruction of the flexion of each of the five fingers is possible using ECoG signals in humans.

Subjects Thumb Index finger Middle finger Ring finger Little finger Avg.

A 0.75/0.71/0.80 0.61/0.53/0.65 0.49/0.40/0.58 0.54/0.50/0.59 0.43/0.33/0.55 0.57
B 0.66/0.62/0.74 0.61/0.53/0.68 0.51/0.37/0.57 0.58/0.46/0.66 0.53/0.47/0.60 0.58
C 0.60/0.51/0.66 0.62/0.51/0.67 0.60/0.55/0.64 0.49/0.44/0.52 0.39/0.34/0.44 0.53
D 0.37/0.32/0.42 0.55/0.50/0.58 0.54/0.44/0.64 0.36/0.22/0.43 0.23/0.14/0.33 0.41
E 0.42/0.29/0.52 0.60/0.43/0.65 0.56/0.43/0.63 0.54/0.49/0.61 0.54/0.49/0.58 0.53
AVG 0.56 0.60 0.54 0.50 0.42 0.52

supported the successful decoding of individual finger flexion movements.

3.3. Accurate decoding of finger flexion

Table 2 and figure 7 show the principal results of this study. Table 2 gives the correlation coefficients calculated for all finger movement periods (see section 2.6 for a description of how these periods were determined) between actual and decoded flexion traces. The given correlation coefficients represent the mean, minimum and maximum correlation coefficients calculated for a particular subject and finger and across all cross-validation folds. When averaged across subjects, the correlation coefficients range from 0.42 to 0.60 for the different fingers. When averaged across fingers, the correlation coefficients range from 0.41 to 0.58 for the different subjects. As reported in table 2, the average decoding performance (i.e. correlation coefficient averaged across all subjects and fingers) was 0.52. We also computed this average decoding performance for individual features, as well as for the set of all frequency-based features. This resulted in 0.40 (LMP), 0.19 (8–12 Hz), 0.25 (18–24 Hz), 0.45 (75–115 Hz), 0.40 (125–159 Hz), 0.33 (159–175 Hz) and 0.49 (all frequency features). Furthermore, and in contrast to our previous study [36], the application of the CAR filter had only a modest

effect: e.g. for the LMP alone, we achieved 0.40/0.41 (with CAR/without CAR, respectively) and for all features, we achieved 0.52/0.54 (with CAR/without CAR, respectively).

These results demonstrate that it is possible to accurately infer the time course of repeated, rhythmic finger flexion using ECoG signals in humans. Results shown later in this paper also show that these flexion patterns are highly specific to the moving finger. Furthermore, for a particular subject and finger, the correlation coefficients only modestly varied across cross-validation folds. This indicates that the decoding performance is robust during the whole period of data collection. Figure 7 shows examples for actual and decoded flexion traces. Because the calculation of decoding parameters only involved the training dataset, but not the test dataset, similar results can be expected in online experiments, at least for durations similar to those in this study.

3.4. Decoding finger flexion for different flexion speeds

While this study was not designed to systematically vary the speed of the finger flexions, the subjects varied that speed substantially for different fingers and/or at different times (see traces for the thumb and little fingers in figure 7). To study decoding at different flexion speeds, we first estimated the speed for each movement period by calculating the period



Figure 8. Average actual and decoded flexion traces for three different flexion speeds. The traces in A, B and C show actual (blue solid) and decoded (red dashed) flexion traces, averaged across all subjects and fingers, and corresponding to the 0–10th, 45–55th and 90–100th percentile of flexion speeds, respectively. Shaded bands indicate the standard error of the mean. The minimum and maximum amplitudes of the average decoded traces were scaled to match those of the actual traces to facilitate comparison. All traces begin 100 ms prior to detected movement onsets (which are indicated with arrows). The decoded traces are shifted back in time by the 50 ms offset used for decoding to highlight the fact that cortical signals precede the movements.

from the onset of the movement (that was detected as described earlier in the paper) to the offset of the first flexion (which was detected as the first time the flexion value fell below the same threshold that was used for detecting the onset). We then compiled the flexion speeds from all subjects, fingers and movement periods and selected the flexion patterns according to flexion speeds in three categories. These categories corresponded to the 0–10th, 45–55th and 90–100th percentile of estimated flexion speeds. Figures 8(A)–(C) show the averaged flexion traces (solid) and corresponding averaged decoded traces (dashed) for the three different speed bins, respectively. These results demonstrate that the decoded traces accurately track the flexion dynamics at different flexion speeds. They also show that the brain signals precede the actual flexion by approximately 100 ms.
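A short sketch of this percentile binning, assuming the first-flexion durations (the speed proxy used above) have already been measured for each movement period; names are illustrative:

```python
import numpy as np

def percentile_bins(durations, bins=((0, 10), (45, 55), (90, 100))):
    """Group movement-period indices by percentile of first-flexion
    duration, pooled across subjects and fingers.

    Returns one index array per percentile bin (e.g. fastest, median
    and slowest groups when durations are used as the speed proxy).
    """
    durations = np.asarray(durations, dtype=float)
    groups = []
    for lo, hi in bins:
        lo_v, hi_v = np.percentile(durations, [lo, hi])
        groups.append(np.flatnonzero((durations >= lo_v) & (durations <= hi_v)))
    return groups
```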

3.5. Optimal decoding offset

We also studied the effect of the decoding offset, i.e. the temporal offset of the 100 ms window that was used for decoding the flexion of a particular finger at a particular time. Unless otherwise noted, we used an offset of −50 ms for the analyses throughout the paper, i.e. brain signals from a 100 ms window centered at 50 ms prior to time t were used to decode finger flexion at time t. We evaluated the effect of that offset (−300, −250, −200, −150, −100, −50, 0, +50 ms) on decoding performance by calculating the resulting correlation coefficient r for a particular subject, finger and cross-validation fold. We then averaged these correlation coefficients across subjects and cross-validation folds. The color-coded r values are shown in figure 9 for each finger. They show that for the different fingers, decoding performance (i.e. the averaged value of r) peaked around −100 to −50 ms. (In this figure, the values at −50 ms correspond to the values of r averaged across subjects in table 2.) These results provide support for the hypothesis that the ECoG features we assessed were related primarily to movement and not to sensory feedback produced by the moving fingers.

3.6. Information about individual fingers

We finally studied the amount of information that is captured in the ECoG about different fingers because the results shown up to this point (except for the finger-specific ECoG responses shown in supplementary figure 1 (available at


Figure 9. Correlation coefficients for different fingers and decoding offsets. Correlation coefficients r (color coded and averaged across subjects and cross-validation folds) for different decoding offsets and fingers. The highest correlation coefficients are achieved when we used a 100 ms window that preceded the decoded flexion values by 50–100 ms.

stacks.iop.org/JNE/6/066001)) do not exclude the possibility that our decoding results simply detected movement of any finger. We investigated this possibility in two ways.

First, we determined the degree of interdependence of finger flexion for the actual and decoded flexion, respectively. To do this, we calculated the correlation coefficient r for the actual (i.e. behavioral) or decoded movement traces between all pairs of fingers. The results are shown in confusion matrices in figure 10. Each of the color-coded squares gives, for the indicated subject and for actual (top) or decoded (bottom) flexion, the correlation coefficient r calculated for the whole time course of finger flexion for a particular combination of fingers. The thumb and little finger correspond to the left-most and right-most columns, and to the top-most and bottom-most rows, respectively. If the actual or decoded movements of the different fingers were completely independent of each other, one would expect a correlation coefficient of 1 in the diagonal running from top left to bottom right, and a correlation



Figure 10. Interdependence of actual and decoded finger flexion. Each of the ten figures shows color-coded correlation coefficients, calculated for actual and decoded finger flexion time courses for the indicated subject across all finger combinations (thumb: first row/column, little finger: last row/column).


Figure 11. Discrete classification of finger movements. The top row shows the classification accuracy (i.e. the fraction of all finger movements that were correctly classified) for each subject A–E as a function of time relative to the movement onset. Movement onset is marked by the dotted line. Accuracy due to chance (i.e. 20%) is marked by the dashed line. The bottom row shows the confusion matrix of classification at the time of the peak classification accuracy for each subject. In each matrix, each row represents the finger that actually moved (from top to bottom: thumb, index, middle, ring and little finger). Each column represents the finger that we classified (from left to right: thumb, index, middle, ring and little finger). In each cell, the color indicates the fraction of all movements of a particular finger that was assigned to the particular classified finger.

coefficient of 0 in all other cells. The results on top show that there was some degree of interdependence between the actual movements of the different fingers, in particular between index finger and middle finger and between ring finger and little finger. The results for decoded flexion on the bottom show a similar pattern, although the degree of interdependence of the decoded movements was modestly higher than that calculated for actual movements.
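Such an interdependence matrix is simply the pairwise correlation of the five flexion time courses; a brief illustrative sketch (function name assumed):

```python
import numpy as np

def finger_interdependence(traces):
    """Pairwise correlations between finger flexion time courses.

    traces: (samples x 5) actual or decoded flexion, ordered from
    thumb to little finger.  Returns the 5 x 5 correlation matrix.
    """
    return np.corrcoef(traces, rowvar=False)
```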

Second, we also determined to what extent it is possible to determine from the ECoG signals which finger was moving. We did this using a five-step procedure: (1) we first detected movement onsets for each finger using the procedure described earlier in this paper, and labeled the period from 2 s before to 2 s after movement onset as the movement period of that finger. (2) We then decoded flexion traces for each finger using the 50 ms offset used for other analyses in this paper, and normalized the decoded traces for each finger by subtracting each trace's mean and dividing the result by the standard deviation. (3) We then smoothed each trace using a zero-delay averaging filter of width 500 ms. (4) We then classified, at each time during each movement period, the smoothed

decoded flexion values into five finger categories by simply determining which of the five decoded flexion values (i.e. the five values corresponding to the five fingers) was highest at that point in time. (5) We finally determined classification accuracy by calculating the fraction of all trials in which the actual and classified finger matched. The results are shown in figure 11. The maximum classification accuracies (i.e. the classification accuracy at the best time for each subject) were 90.6%, 89.9%, 76.4%, 68.1% and 76.7% for each subject, respectively. The average of those accuracies is 80.3% (accuracy due to chance is 20%). The best times were 0 ms, +150 ms, +50 ms, +0 ms and +500 ms relative to movement onset, for each subject, respectively. When we used the same common optimal offset for each subject (i.e. without optimizing for each subject individually), the mean accuracy was 77.3% at +150 ms and the median accuracy was 76.4% at +50 ms relative to movement onset. The mean accuracy at 0 ms was 77.1%. When we did not smooth the decoded traces prior to classification, the mean accuracy at 0 ms was 67.5%. In summary, our results show conclusive evidence that ECoG signals hold substantial information about the time course of the flexion of each individual finger.
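A sketch of steps (2)–(5), assuming decoded traces for all five fingers, detected onsets and true finger labels are available; the function name, argument names and the sampling rate of the decoded traces are illustrative assumptions:

```python
import numpy as np

def classify_fingers(decoded, onsets, labels, fs, smooth_ms=500):
    """Classify which finger moved from the five decoded flexion traces.

    decoded: (samples x 5) decoded flexion, one column per finger.
    onsets:  sample indices of detected movement onsets.
    labels:  true finger index (0-4) for each onset.
    fs:      sampling rate of the decoded traces.
    Returns classification accuracy as a function of time from
    -2 s to +2 s relative to movement onset.
    """
    z = (decoded - decoded.mean(axis=0)) / decoded.std(axis=0)     # step 2: normalize
    w = max(int(smooth_ms / 1000 * fs), 1)
    kernel = np.ones(w) / w
    smooth = np.column_stack([np.convolve(z[:, f], kernel, mode="same")
                              for f in range(z.shape[1])])         # step 3: zero-delay smoothing
    half = int(2 * fs)
    offsets = np.arange(-half, half + 1)
    hits = np.zeros(len(offsets))
    counts = np.zeros(len(offsets))
    for onset, true_f in zip(onsets, labels):
        idx = onset + offsets
        valid = (idx >= 0) & (idx < len(smooth))
        pred = smooth[idx[valid]].argmax(axis=1)                   # step 4: argmax over fingers
        hits[valid] += (pred == true_f)
        counts[valid] += 1
    return offsets / fs, hits / np.maximum(counts, 1)              # step 5: accuracy over time
```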


4. Discussion

In this study, we showed that ECoG signals can be used to accurately decode the time course of the flexion of individual fingers in humans. The information in the ECoG that was most reflective of the movement preceded it by 50–100 ms. The results presented in this paper also further support the findings of our previous study [36]—in particular, that the LMP and amplitudes in the high gamma range hold substantial movement-related information. This information allowed us to accurately decode the flexion of each finger, even at different flexion speeds, using short windows of 100 ms. These results provide further evidence to support the hypothesis that the movement information encoded in ECoG signals exceeds that provided by EEG (see also figure 7 in [40]). Furthermore, our results suggest that both the fidelity of movement-related information in ECoG and its potential clinical practicality should position ECoG well for neuroprosthetic applications. In particular, previous efforts to classify finger movements used activity recorded sequentially from individual neurons (i.e. in successive recordings) in non-human primates [63–66]. These studies demonstrated robust but only discrete decodings, i.e. inferring which finger flexed or extended but not the temporal flexion patterns of each finger as we showed here. The results in our present study are particularly appealing given that the patients in this study were under the influence of a variety of external factors, and that the experiments did not control for other relevant factors that may increase signal variance, such as eye position, head position, posture, etc.

The successful decoding of finger movements achieved in this study depended substantially on the LMP component and on amplitudes in gamma bands recorded primarily over hand-related and other areas of motor cortex. This is similar to the findings reported in two recent studies that decoded kinematic parameters related to hand movements using ECoG [36, 38]. In contrast to the cosine tuning reported for hand movements in humans using ECoG [36] or EEG/MEG [40], in the present study we assessed the linear relationship of ECoG features with finger flexion. We did this because auxiliary analyses demonstrated that a linear function was better able to explain the relationship of ECoG features with finger flexion compared to a cosine function or finger velocity.

In this study, we found that gamma band activity differed markedly between movement of a particular finger and rest in one or only a few electrodes (e.g. figure 5, supplementary figure 1 available at stacks.iop.org/JNE/6/066001). This is in contrast to the spatial distribution of the LMP feature that tended to be broader. These differences support the hypothesis that the LMP and amplitudes in gamma bands are governed by different physiological processes. The information shown in figure 5 is also different from the spatial distribution of the information related to the degree of finger flexion shown in figure 6. This discrepancy suggests that the brain represents the general state of finger movement (i.e. any flexion or rest) differently from the specific degree of flexion. While the LMP component appears to be a different phenomenon from frequency-related components (see figures 3, 5 and 6 in this paper, and figure 5 in [36]), the physiological origin of the

LMP component is still unclear. Thus, further studies are needed to determine the origin of the LMP and its relationship with existing brain signal phenomena, in particular to activity in the gamma band. While activity in the gamma band has been related to motor or language function in several studies, there still exists considerable debate about its physiological origin [74, 75].

In our experiments, subjects were cued to move only one finger at a time. Thus, aside from residual movements of other fingers that were due to mechanical or neuromuscular coupling [76–78], the subjects moved each finger in isolation. It is not clear whether the results presented in the paper would generalize to more natural and completely self-paced movements. Because ECoG detects mainly neuronal activity close to the cortical surface, rather than subcortical structures or the spinal cord, it is likely that the performance of decoding of particular parameters of movements will depend on to what extent the cortex is involved in preparing for and executing these movements. Thus, successful BCI- or motor-related studies, and practical implementations derived from those studies, will depend on the design of protocols that appropriately engage sizable areas of cortex.

The present paper confirms and significantly extends the results presented in recent studies [32, 36, 38, 39], which demonstrated that ECoG holds information about position and velocity of hand movements, by showing that it is possible to derive detailed information about flexion of individual fingers in humans using ECoG signals. This information could be used either in an open-loop fashion to provide a real-time assessment of actual or potentially even imagined or intended movements, or in a closed-loop fashion to provide the basis for a brain–computer interface system for communication and control.

In this study, we showed that recordings from electrodes with relatively coarse spacing (1 cm) in patients who are under the influence of medications and a variety of other external influences can provide detailed information about highly specific aspects of motor actions in relatively uncontrolled experiments. While such results are appealing, this situation is most likely substantially limiting the potential information content and usefulness of the ECoG platform for BCI purposes. Clinical application of ECoG-based BCI technology will require optimization of several parameters for the BCI purpose and also evaluation of other important questions. Such efforts should include: optimization of the inter-electrode distance (there is evidence that much smaller distances may be optimal [43, 79]) and recording location (i.e. subdural versus epidural, different cortical locations); long-term durability effects of subdural or epidural recordings; and the long-term training effects of BCI feedback.

In summary, the results shown in this paper provide further evidence that ECoG may support BCI systems with finely constructed movements. Further research is needed to determine whether ECoG also gives information about different fingers in more complex and concurrent finger flexion patterns and to determine the optimum for different parameters such as the recording density. We anticipate that such studies will demonstrate that ECoG is a recording platform that


combines high signal fidelity and robustness with clinical practicality.

Acknowledgments

We thank Drs Dennis McFarland and Xiao-mei Pei for helpful comments on the manuscript and Dr Eberhard Fetz for insightful discussions on the LMP. This work was supported in part by grants from NIH (EB006356 (GS), EB000856 (JRW and GS), NS07144 (KJM) and HD30146 (JRW)), the US Army Research Office (W911NF-07-1-0415 (GS) and W911NF-08-1-0216 (GS)), the James S McDonnell Foundation (JRW), NSF (0642848 (JO)) and the Ambassadorial Scholarship of the Rotary Foundation (JK).

References

[1] Wolpaw J R, Birbaumer N, McFarland D J, Pfurtscheller G and Vaughan T M 2002 Brain–computer interfaces for communication and control Electroencephalogr. Clin. Neurophysiol. 113 767–91

[2] Birbaumer N, Ghanayim N, Hinterberger T, Iversen I, Kotchoubey B, Kubler A, Perelmouter J, Taub E and Flor H 1999 A spelling device for the paralysed Nature 398 297–8

[3] Pfurtscheller G, Guger C, Muller G, Krausz G and Neuper C 2000 Brain oscillations control hand orthosis in a tetraplegic Neurosci. Lett. 292 211–4

[4] Millan J, Renkens F, Mourino J and Gerstner W 2004 Noninvasive brain-actuated control of a mobile robot by human EEG IEEE Trans. Biomed. Eng. 51 1026–33

[5] Wolpaw J R and McFarland D J 2004 Control of a two-dimensional movement signal by a noninvasive brain–computer interface in humans Proc. Natl Acad. Sci. USA 101 17849–54

[6] Kubler A, Nijboer F, Mellinger J, Vaughan T M, Pawelzik H, Schalk G, McFarland D J, Birbaumer N and Wolpaw J R 2005 Patients with ALS can use sensorimotor rhythms to operate a brain–computer interface Neurology 64 1775–7

[7] Muller K R and Blankertz B 2006 Toward noninvasive brain–computer interfaces IEEE Signal Process. Mag. 23 126–8

[8] McFarland D J, Krusienski D J, Sarnacki W A and Wolpaw J R 2008 Emulation of computer mouse control with a noninvasive brain–computer interface J. Neural Eng. 5 101–10

[9] Muller K R, Tangermann M, Dornhege G, Krauledat M, Curio G and Blankertz B 2008 Machine learning for real-time single-trial EEG-analysis: from brain–computer interfacing to mental state monitoring J. Neurosci. Methods 167 82–90

[10] McFarland D J, Sarnacki W A and Wolpaw J R 2008 Electroencephalographic (EEG) control of three-dimensional movement Program No. 778.4. 2008 Abstract Viewer/Itinerary Planner (Washington, DC: Society for Neuroscience) Online

[11] Taylor D M, Tillery S I and Schwartz A B 2002 Direct cortical control of 3D neuroprosthetic devices Science 296 1829–32

[12] Serruya M D, Hatsopoulos N G, Paninski L, Fellows M R and Donoghue J P 2002 Instant neural control of a movement signal Nature 416 141–2

[13] Shenoy K V, Meeker D, Cao S, Kureshi S A, Pesaran B, Buneo C A, Batista A P, Mitra P P, Burdick J W and Andersen R A 2003 Neural prosthetic control signals from plan activity Neuroreport 14 591–6

[14] Andersen R A, Musallam S and Pesaran B 2004 Selecting the signals for a brain–machine interface Curr. Opin. Neurobiol. 14 720–6

[15] Lebedev M A, Carmena J M, O’Doherty J E, Zacksenhouse M, Henriquez C S, Principe J C and Nicolelis M A 2005 Cortical ensemble adaptation to represent velocity of an artificial actuator controlled by a brain–machine interface J. Neurosci. 25 4681–93

[16] Santhanam G, Ryu S I, Yu B M, Afshar A and Shenoy K V 2006 A high-performance brain–computer interface Nature 442 195–8

[17] Donoghue J P, Nurmikko A, Black M and Hochberg L R 2007 Assistive technology and robotic control using motor cortex ensemble-based neural interface systems in humans with tetraplegia J. Physiol. 579 603–11

[18] Kim S, Simeral J D, Hochberg L R, Donoghue J P, Friehs G M and Black M J 2007 Multi-state decoding of point-and-click control signals from motor cortical activity in a human with tetraplegia 3rd Int. IEEE/EMBS Conf. on Neural Engineering

[19] Velliste M, Perel S, Spalding M C, Whitford A S and Schwartz A B 2008 Cortical control of a prosthetic arm for self-feeding Nature 453 1098–101

[20] Shain W, Spataro L, Dilgen J, Haverstick K, Retterer S, Isaacson M, Saltzman M and Turner J N 2003 Controlling cellular reactive responses around neural prosthetic devices using peripheral and local intervention strategies IEEE Trans. Neural Syst. Rehabil. Eng. 11 186–8

[21] Donoghue J P, Nurmikko A, Friehs G and Black M 2004 Development of neuromotor prostheses for humans Suppl. Clin. Neurophysiol. 57 592–606

[22] Davids K, Bennett S and Newell K M 2006 Movement System Variability (Champaign, IL: Human Kinetics)

[23] Hochberg L R, Serruya M D, Friehs G M, Mukand J A, Saleh M, Caplan A H, Branner A, Chen D, Penn R D and Donoghue J P 2006 Neuronal ensemble control of prosthetic devices by a human with tetraplegia Nature 442 164–71

[24] Fetz E E 1999 Real-time control of a robotic arm by neuronal ensembles Nat. Neurosci. 2 583–4

[25] Chapin J K 2000 Neural prosthetic devices for quadriplegia Curr. Opin. Neurol. 13 671–5

[26] Nicolelis M A 2001 Actions from thoughts Nature 409 403–7

[27] Donoghue J P 2002 Connecting cortex to machines: recent advances in brain interfaces Nat. Neurosci. 5 (Suppl) 1085–8

[28] Brindley G S and Craggs M D 1972 The electrical activity in the motor cortex that accompanies voluntary movement J. Physiol. 223 28P–29P

[29] Pesaran B, Pezaris J S, Sahani M, Mitra P P and Andersen R A 2002 Temporal structure in neuronal activity during working memory in macaque parietal cortex Nat. Neurosci. 5 805–11

[30] Mehring C, Rickert J, Vaadia E, Cardosa de Oliveira S, Aertsen A and Rotter S 2003 Inference of hand movements from local field potentials in monkey motor cortex Nat. Neurosci. 6 1253–4

[31] Rickert J, Oliveira S C, Vaadia E, Aertsen A, Rotter S and Mehring C 2005 Encoding of movement direction in different frequency ranges of motor cortical local field potentials J. Neurosci. 25 8815–24

[32] Leuthardt E C, Schalk G, Wolpaw J R, Ojemann J G and Moran D W 2004 A brain–computer interface using electrocorticographic signals in humans J. Neural Eng. 1 63–71

[33] Miller K J, Leuthardt E C, Schalk G, Rao R P, Anderson N R, Moran D W, Miller J W and Ojemann J G 2007 Spectral changes in cortical surface potentials during motor movement J. Neurosci. 27 2424–32


[34] Leuthardt E C, Miller K, Anderson N R, Schalk G, Dowling J, Miller J, Moran D W and Ojemann J G 2007 Electrocorticographic frequency alteration mapping: a clinical technique for mapping the motor cortex Neurosurgery 60 260–70, discussion 270–1

[35] Miller K J, den Nijs M, Shenoy P, Miller J W, Rao R P and Ojemann J G 2007 Real-time functional brain mapping using electrocorticography NeuroImage 37 504–7

[36] Schalk G, Kubanek J, Miller K J, Anderson N R, Leuthardt E C, Ojemann J G, Limbrick D, Moran D, Gerhardt L A and Wolpaw J R 2007 Decoding two-dimensional movement trajectories using electrocorticographic signals in humans J. Neural Eng. 4 264–75

[37] Schalk G, Leuthardt E C, Brunner P, Ojemann J G, Gerhardt L A and Wolpaw J R 2008 Real-time detection of event-related brain activity NeuroImage 43 245–9

[38] Pistohl T, Ball T, Schulze-Bonhage A, Aertsen A and Mehring C 2008 Prediction of arm movement trajectories from ECoG-recordings in humans J. Neurosci. Methods 167 105–14

[39] Sanchez J C, Gunduz A, Carney P R and Principe J C 2008 Extraction and localization of mesoscopic motor control signals for human ECoG neuroprosthetics J. Neurosci. Methods 167 63–81

[40] Waldert S, Preissl H, Demandt E, Braun C, Birbaumer N, Aertsen A and Mehring C 2008 Hand movement direction decoded from MEG and EEG J. Neurosci. 28 1000–8

[41] Schalk G, Miller K J, Anderson N R, Wilson J A, Smyth M D, Ojemann J G, Moran D W, Wolpaw J R and Leuthardt E C 2008 Two-dimensional movement control using electrocorticographic signals in humans J. Neural Eng. 5 75–84

[42] Leuthardt E C, Miller K J, Schalk G, Rao R P and Ojemann J G 2006 Electrocorticography-based brain computer interface—the Seattle experience IEEE Trans. Neural Syst. Rehabil. Eng. 14 194–8

[43] Wilson J A, Felton E A, Garell P C, Schalk G and Williams J C 2006 ECoG factors underlying multimodal control of a brain–computer interface IEEE Trans. Neural Syst. Rehabil. Eng. 14 246–50

[44] Felton E A, Wilson J A, Williams J C and Garell P C 2007 Electrocorticographically controlled brain–computer interfaces using motor and sensory imagery in patients with temporary subdural electrode implants. Report of four cases J. Neurosurg. 106 495–500

[45] Schieber M H and Hibbard L S 1993 How somatotopic is the motor cortex hand area? Science 261 489–92

[46] Schieber M H and Poliakov A V 1998 Partial inactivation of the primary motor cortex hand area: effects on individuated finger movements J. Neurosci. 18 9038–54

[47] Poliakov A V and Schieber M H 1999 Limited functional grouping of neurons in the motor cortex hand area during individuated finger movements: a cluster analysis J. Neurophysiol. 82 3488–505

[48] Rathelot J A and Strick P L 2006 Muscle representation in the macaque motor cortex: an anatomical perspective Proc. Natl Acad. Sci. USA 103 8257–62

[49] Sanes J N, Donoghue J P, Thangaraj V, Edelman R R and Warach S 1995 Shared neural substrates controlling hand movements in human motor cortex Science 268 1775–7

[50] Beltramello A, Cerini R, Puppini G, El-Dalati G, Viola S, Martone E, Cordopatri D, Manfredi M, Aglioti S and Tassinari G 1998 Motor representation of the hand in the human cortex: an f-MRI study with a conventional 1.5 T clinical unit Ital. J. Neurol. Sci. 19 277–84

[51] Schieber M H 1999 Somatotopic gradients in the distributed organization of the human primary motor cortex hand area: evidence from small infarcts Exp. Brain Res. 128 139–48

[52] Lang C E and Schieber M H 2004 Reduced muscle selectivity during individuated finger movements in humans after damage to the motor cortex or corticospinal tract J. Neurophysiol. 91 1722–33

[53] Volkmann J, Schnitzler A, Witte O W and Freund H 1998 Handedness and asymmetry of hand representation in human motor cortex J. Neurophysiol. 79 2149–54

[54] Lisogurski D and Birch G E 1998 Identification of finger flexions from continuous EEG as a brain computer interface Proc. 20th Ann. Int. Conf. IEEE Engineering in Medicine and Biology Society (29 Oct.–1 Nov. 1998) vol 4 pp 2004–7

[55] Li Y, Gao X, Liu H and Gao S 2004 Classification of single-trial electroencephalogram during finger movement IEEE Trans. Biomed. Eng. 51 1019–25

[56] Lehtonen J, Jylanki P, Kauhanen L and Sams M 2008 Online classification of single EEG trials during finger movements IEEE Trans. Biomed. Eng. 55 713–20

[57] Kauhanen L, Nykopp T and Sams M 2006 Classification of single MEG trials related to left and right index finger movements Clin. Neurophysiol. 117 430–9

[58] Ball T, Nawrot M P, Pistohl T, Aertsen A, Schulze-Bonhage A and Mehring C 2004 Towards a brain–machine interface based on epicortical field potentials Biomed. Eng. (Berlin) 49 756–9

[59] Kelso J A, Fuchs A, Lancaster R, Holroyd T, Cheyne D and Weinberg H 1998 Dynamic cortical activity in the human brain reveals motor equivalence Nature 392 814–8

[60] Blankertz B, Dornhege G, Krauledat M, Muller K R, Kunzmann V, Losch F and Curio G 2006 The Berlin Brain–Computer Interface: EEG-based communication without subject training IEEE Trans. Neural Syst. Rehabil. Eng. 14 147–52

[61] Zanos S, Miller K J and Ojemann J G 2008 Electrocorticographic spectral changes associated with ipsilateral individual finger and whole hand movement Conf. Proc. IEEE Eng. Med. Biol. Soc. 2008 5939–42

[62] Miller K J, Zanos S, Fetz E E, den Nijs M and Ojemann J G 2009 Decoupling the cortical power spectrum reveals real-time representation of individual finger movements in humans J. Neurosci. 29 3132–7

[63] Georgopoulos A P, Pellizzer G, Poliakov A V and Schieber M H 1999 Neural coding of finger and wrist movements J. Comput. Neurosci. 6 279–88

[64] Ben Hamed S, Schieber M H and Pouget A 2007 Decoding M1 neurons during multiple finger movements J. Neurophysiol. 98 327–33

[65] Acharya S, Tenore F, Aggarwal V, Etienne-Cummings R, Schieber M H and Thakor N V 2008 Decoding individuated finger movements using volume-constrained neuronal ensembles in the M1 hand area IEEE Trans. Neural Syst. Rehabil. Eng. 16 15–23

[66] Aggarwal V, Acharya S, Tenore F, Shin H C, Etienne-Cummings R, Schieber M H and Thakor N V 2008 Asynchronous decoding of dexterous finger movements using M1 neurons IEEE Trans. Neural Syst. Rehabil. Eng. 16 3–14

[67] Schalk G, McFarland D J, Hinterberger T, Birbaumer N and Wolpaw J R 2004 BCI2000: a general-purpose brain–computer interface (BCI) system IEEE Trans. Biomed. Eng. 51 1034–43

[68] Miller K J, Makeig S, Hebb A O, Rao R P, den Nijs M and Ojemann J G 2007 Cortical electrode localization from x-rays and simple mapping for electrocorticographic research: The “Location on Cortex” (LOC) package for MATLAB J. Neurosci. Methods 162 303–8

[69] Fox P T, Perlmutter J S and Raichle M E 1985 A stereotactic method of anatomical localization for positron emission tomography J. Comput. Assist. Tomogr. 9 141–53


[70] Talairach J and Tournoux P 1988 Co-Planar Stereotaxic Atlas of the Human Brain (New York: Thieme Medical Publishers)

[71] Marple S L 1987 Digital Spectral Analysis: With Applications (Englewood Cliffs, NJ: Prentice-Hall)

[72] Miller K J, Shenoy P, den Nijs M, Sorensen L B, Rao R P N and Ojemann J G 2008 Beyond the gamma band: the role of high-frequency features in movement classification IEEE Trans. Biomed. Eng. 55 1635

[73] Witten I H and Frank E 2005 Data Mining: Practical Machine Learning Tools and Techniques 2nd edn (San Francisco: Morgan Kaufmann)

[74] Crone N E, Miglioretti D L, Gordon B and Lesser R P 1998 Functional mapping of human sensorimotor cortex with electrocorticographic spectral analysis: II. Event-related synchronization in the gamma band Brain 121 2301–15

[75] Canolty R T, Soltani M, Dalal S S, Edwards E, Dronkers N F, Nagarajan S S, Kirsch H E, Barbaro N M and Knight R T 2007 Spatiotemporal dynamics of word processing in the human brain Front. Neurosci. 1 185–96

[76] Schieber M H 1991 Individuated finger movements of rhesus monkeys: a means of quantifying the independence of the digits J. Neurophysiol. 65 1381–91

[77] Reilly K T and Schieber M H 2003 Incomplete functional subdivision of the human multitendoned finger muscle flexor digitorum profundus: an electromyographic study J. Neurophysiol. 90 2560–70

[78] Lang C E and Schieber M H 2004 Human finger independence: limitations due to passive mechanical coupling versus active neuromuscular control J. Neurophysiol. 92 2802–10

[79] Freeman W J, Rogers L J, Holmes M D and Silbergeld D L 2000 Spatial spectral analysis of human electrocorticograms including the alpha and gamma bands J. Neurosci. Methods 95 111–21
