

Entropy 2012, 14, 1553-1577; doi:10.3390/e14081553

OPEN ACCESS
entropy
ISSN 1099-4300
www.mdpi.com/journal/entropy

Article

Permutation Entropy and Its Main Biomedical and Econophysics Applications: A Review

Massimiliano Zanin 1,2,3,*, Luciano Zunino 4,5, Osvaldo A. Rosso 6,7 and David Papo 1

1 Centre for Biomedical Technology, Polytechnic University of Madrid, Pozuelo de Alarcón, 28223 Madrid, Spain
2 Faculdade de Ciências e Tecnologia, Departamento de Engenharia Electrotécnica, Universidade Nova de Lisboa, 2829-516 Caparica, Portugal
3 Innaxis Foundation & Research Institute, José Ortega y Gasset 20, 28006 Madrid, Spain
4 Centro de Investigaciones Ópticas (CONICET La Plata - CIC), C.C. 3, 1897 Gonnet, Argentina
5 Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata, Argentina
6 LaCCAN/CPMAT - Instituto de Computação, Universidade Federal de Alagoas, BR 104 Norte km 97, 57072-970 Maceió, Alagoas, Brazil
7 Laboratorio de Sistemas Complejos, Facultad de Ingeniería, Universidad de Buenos Aires, 1063 Av. Paseo Colón 840, Ciudad Autónoma de Buenos Aires, Argentina

* Author to whom correspondence should be addressed; E-Mail: [email protected]; Tel./Fax: +34-913-364-648.

Received: 1 July 2012; in revised form: 10 August 2012 / Accepted: 21 August 2012 / Published: 23 August 2012

Abstract: Entropy is a powerful tool for the analysis of time series, as it describes the probability distribution of the possible states of a system, and therefore the information encoded in it. Nevertheless, important information may also be encoded in the temporal dynamics, an aspect which is not usually taken into account. The idea of calculating entropy based on permutation patterns (that is, permutations defined by the order relations among the values of a time series) has received considerable attention in recent years, especially for the understanding of complex and chaotic systems. Permutation entropy directly accounts for the temporal information contained in the time series; furthermore, it has the virtues of simplicity, robustness and very low computational cost. To celebrate the tenth anniversary of the original work, here we analyze the theoretical foundations of the permutation entropy,


as well as the main recent applications to the analysis of economic markets and to the understanding of biomedical systems.

Keywords: permutation entropy; forbidden patterns; Shannon entropy; econophysics; EEG; epilepsy

1. Introduction

Given a system, be it natural or man-made, and given an observable of such a system whose evolution can be tracked through time, a natural question arises: how much information is this observable encoding about the dynamics of the underlying system? The information content of a system is typically evaluated via a probability distribution function (PDF) P describing the apportionment of some measurable or observable quantity, generally a time series X(t). Quantifying the information content of a given observable is therefore largely tantamount to characterizing its probability distribution. This is often done with the wide family of measures called information entropies [1].

Entropy is a basic quantity with multiple field-specific interpretations: for instance, it has been associated with disorder, state-space volume, or lack of information [2]. When dealing with information content, the Shannon entropy is often considered the foundational and most natural measure [3,4]. Given an arbitrary discrete probability distribution P = {p_i : i = 1, ..., M}, with M degrees of freedom, Shannon's logarithmic information measure reads:

S[P] = −Σ_{i=1}^{M} p_i ln(p_i)    (1)

This can be regarded as a measure of the uncertainty associated with the physical process described by P. For instance, if S[P] = S_min = 0, we are in a position to predict with complete certainty which of the possible outcomes i, whose probabilities are given by p_i, will actually take place. Our knowledge of the underlying process described by the probability distribution is maximal in this instance. In contrast, our knowledge is minimal for the uniform distribution P_e = {p_i = 1/M, ∀ i = 1, ..., M}, and the uncertainty is maximal, i.e., S[P_e] = S_max = ln M.
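These two extremes are easy to verify numerically. The following is a minimal Python sketch (the function and variable names are ours, not from the paper):

```python
from math import log

def shannon_entropy(p):
    """Shannon's measure S[P] = -sum_i p_i ln p_i (Equation 1); 0 ln 0 is taken as 0."""
    return sum(-pi * log(pi) for pi in p if pi > 0)

M = 4
certain = [1.0, 0.0, 0.0, 0.0]   # one outcome is sure: knowledge is maximal
uniform = [1.0 / M] * M          # all outcomes equally likely: knowledge is minimal

print(shannon_entropy(certain))  # 0.0 = S_min
print(shannon_entropy(uniform))  # ln(4) ≈ 1.386 = S_max
```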

Since Shannon's seminal paper [3], his entropy has been used in the characterization of a great variety of systems. Yet, this traditional method presents a number of drawbacks.

First and most importantly, Shannon's and other classical measures neglect temporal relationships between the values of the time series, so that structure and possible temporal patterns present in the process are not accounted for [5]. In other words, if two time series are defined as X1 = {0, 0, 1, 1} and X2 = {0, 1, 0, 1}, it holds that S[P(X1)] = S[P(X2)]. More generally, this occurs when one merely assigns to each time point of the series X a symbol from a given finite alphabet A, thus creating a symbolic sequence that can be regarded as a non-causal coarse-grained description of the time series under consideration. As a consequence, order relations and the time scales of the dynamics are lost. The usual histogram technique corresponds to this kind of assignment. Causal information may be duly


incorporated if information about the past dynamics of the system is included in the symbolic sequence, i.e., if symbols of alphabet A are assigned to a portion of a (phase-space) trajectory.
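The X1/X2 example above can be reproduced directly. The sketch below (our own illustration) shows that any histogram-based, i.e., order-blind, estimator behaves this way:

```python
from collections import Counter
from math import log

def histogram_entropy(series):
    """Shannon entropy of the amplitude histogram: the ordering of values plays no role."""
    n = len(series)
    return sum(-(c / n) * log(c / n) for c in Counter(series).values())

X1 = [0, 0, 1, 1]   # two steps up
X2 = [0, 1, 0, 1]   # alternating
print(histogram_entropy(X1) == histogram_entropy(X2))  # True: the temporal structure is invisible
```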

Second, classical entropy measures suppose some prior knowledge about the system; specifically, in using quantifiers based on Information Theory, a probability distribution associated with the time series under analysis should be provided beforehand. The determination of the most adequate PDF is a fundamental problem, because the PDF P and the sample space Ω are inextricably linked. Many methods have been proposed for a proper selection of the probability space (Ω, P). Among others, we can mention frequency counting [6], procedures based on amplitude statistics [7], binary symbolic dynamics [8], Fourier analysis [9], or wavelet transform [10]. Their applicability depends on particular characteristics of the data, such as stationarity, time series length, variation of the parameters, level of noise contamination, etc. In all these cases the global aspects of the dynamics can be somehow captured, but the different approaches are not equivalent in their ability to discern all the relevant physical details. One must also acknowledge that the above techniques are introduced in a rather ad hoc fashion, and are not directly derived from the dynamical properties of the system under study. Therefore, a question naturally arises: is there a way to define a PDF that is more general and system-independent?

Third, classical methods are often best designed to deal with linear systems, and only poorly describe highly nonlinear chaotic regimes.

Figure 1. Evolution of the number of citations of the Bandt and Pompe seminal paper [11], from 2002 to 2011. Information extracted from the Scopus bibliographic database (http://www.scopus.com/home.url, accessed on 12 March 2012).


Bandt and Pompe [11] addressed these issues by introducing a simple and robust method that takes into account time causality by comparing neighboring values in a time series. The appropriate symbol sequence arises naturally from the time series, with no prior knowledge assumed. "Partitions" are


naturally devised by comparing the order of neighboring relative values, rather than by apportioning amplitudes according to different levels. Based on this symbolic analysis, a permutation entropy is then built. Bandt and Pompe's approach for generating PDFs is a simple symbolization technique that incorporates causality in the evaluation of the PDF associated with a time series. Its use has been shown to yield a clear improvement in the quality of Information Theory-based quantifiers (see, e.g., [12–15] and references therein). The power and usefulness of this approach has been validated in many subsequent papers, as shown by the evolution of the number of citations of the cornerstone paper [11] through time (see Figure 1).

We review the principles behind permutation entropy, present methods derived from Bandt and Pompe's original idea, and describe several applications drawn from the fields of econophysics and biology.

2. The Permutation Entropy

At each time s of a given time series X = {x_t : t = 1, ..., N}, a vector composed of the D subsequent values is constructed:

s → (x_s, x_{s+1}, ..., x_{s+(D−2)}, x_{s+(D−1)})    (2)

D is called the embedding dimension, and determines how much information is contained in each vector. To this vector, an ordinal pattern is associated, defined as the permutation π = (r_0 r_1 ... r_{D−1}) of (0 1 ... D−1) which fulfills

x_{s+r_0} ≤ x_{s+r_1} ≤ ... ≤ x_{s+r_{D−2}} ≤ x_{s+r_{D−1}}    (3)

In other words, the values of each vector are sorted in ascending order, and a permutation pattern π is created with the offsets of the permuted values. A numerical example may help clarify this concept. Take, for example, the time series X = {3, 1, 4, 1, 5, 9}. For D = 3, the vector of values corresponding to s = 1 is (3, 1, 4); the vector is sorted in ascending order, giving (1, 3, 4), and the corresponding permutation pattern is then π = (102). For s = 2, the vector of values is (1, 4, 1), leading to the permutation π = (021). Notice that, if two values are equal (here, the first and the third elements), they are ordered according to the time of their appearance.
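The procedure can be sketched in a few lines of Python (our own illustrative helpers, not code from the paper; the tau argument anticipates the embedding delay of Equation 4, and ties are resolved by order of appearance via a stable sort):

```python
def ordinal_pattern(window):
    """Bandt-Pompe ordinal pattern of a window: the permutation of indices that sorts it
    in ascending order; sorted() is stable, so equal values keep their order of appearance."""
    return tuple(sorted(range(len(window)), key=lambda i: window[i]))

def ordinal_patterns(series, D, tau=1):
    """All ordinal patterns of a series for embedding dimension D and embedding delay tau."""
    return [ordinal_pattern(series[s:s + tau * D:tau])
            for s in range(len(series) - tau * (D - 1))]

X = [3, 1, 4, 1, 5, 9]
print(ordinal_patterns(X, D=3))
# [(1, 0, 2), (0, 2, 1), (1, 0, 2), (0, 1, 2)] -- i.e., the patterns (102), (021), (102), (012)
```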

Graphically, Figure 2 illustrates the construction principle of the ordinal patterns of length D = 2, 3 and 4 [16,17]. Consider the value sequence x_0, x_1, x_2, x_3. For D = 2, there are only two possible directions from x_0 to x_1, up and down. For D = 3, starting from x_1 (up), the third part of the pattern can be above x_1, below x_0, or between x_0 and x_1. A similar situation can be found starting from x_1 (down). For D = 4, for each of the 6 possible positions of x_2, 4 possible locations exist for x_3, giving D! = 4! = 24 different ordinal patterns. In the diagram of Figure 2, full circles and a continuous line represent the sequence x_0 < x_2 < x_3 < x_1, which leads to the pattern π = (0231). A graphical representation of all possible patterns corresponding to D = 3, 4 and 5 can be found in Figure 2 of [17].

Equation 2 can be further extended by considering an embedding delay τ :

s → (x_s, x_{s+τ}, ..., x_{s+τ(D−2)}, x_{s+τ(D−1)})    (4)


When τ is greater than one, the values composing the permutations are taken non-consecutively, thus mapping the dynamics of the system at different temporal resolutions.

Figure 2. Illustration of the construction principle for ordinal patterns of length D [16,17]. For D = 4, full circles and continuous lines represent the sequence values x_0 < x_2 < x_3 < x_1, which lead to the pattern π = (0231).


The idea behind permutation entropy is that patterns may not have the same probability of occurrence, and thus that this probability may unveil relevant knowledge about the underlying system. An extreme situation is represented by the forbidden patterns, that is, patterns that do not appear at all in the analyzed time series.

There are two reasons behind the presence of forbidden patterns. The first, and most trivial one, is the finite length of any real time series, which leads to finite-size effects. Going back to the previous example, the permutation π = (210) does not appear in the sequence {3, 1, 4, 1, 5, 9}, i.e., there is no triplet of consecutive values ordered in descending order. The second reason is related to the dynamical nature of the system generating the time series. If a time series is constructed using a perfect random number generator, all possible sequences of numbers should be expected, and no forbidden pattern should appear. On the contrary, suppose that we are studying the output of the logistic map [18], defined as:

x_{t+1} = α x_t (1 − x_t)    (5)

for all x in the interval [0, 1]. Figure 3 (left) shows the behavior of this map for α = 4, corresponding to chaotic dynamics; the black line represents all the possible initial values x_0 ∈ [0, 1], the red curve the corresponding outputs after one iteration (i.e., x_1), and the green curve those after the second (x_2). The ordering of these curves graphically represents the corresponding permutation pattern; for instance, for x_0 = 0.1, from bottom to top we find the black, red and green curves: thus, x_0 < x_1 < x_2 (that is,


0 < 0.36 < 0.9216), resulting in π = (012). The reader may notice that only 5 different permutations can be generated by this map, identified by the 5 regions enclosed by vertical dashed lines, while the number of possible permutations for an embedding dimension of 3 is 3! = 6: in other words, the permutation π = (210) is forbidden by the very dynamics of the system.

Figure 3. (Left) Behavior of the logistic map [18] with α = 4. The plot represents the evolution of x_1 (red curve) and x_2 (green curve) for all possible values of x_0 (black line); (Right) Number of forbidden patterns, for D = 3, found in different time series (1000 realizations) generated through the logistic map with α = 4, as a function of the length of the series. Each point corresponds to the mean value of forbidden patterns, and vertical bars to the corresponding standard deviation.


In Figure 3 (right), the mean number of forbidden patterns found in a time series created with Equation 5, along with its standard deviation, is represented as a function of the length of the series, thus capturing both factors at the same time.
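A minimal numerical check of this argument (our own sketch; the initial condition x_0 = 0.4 and the series length are arbitrary choices of ours): iterating the map and collecting the D = 3 ordinal patterns reveals exactly one missing pattern, the descending triple π = (210). Indeed, a decreasing step x_{t+1} < x_t requires x_t > 3/4, but then x_{t+1} = 4 x_t (1 − x_t) < 3/4, so two consecutive decreases are impossible.

```python
from itertools import permutations

def logistic_series(x0, n, alpha=4.0):
    """Iterate the logistic map x_{t+1} = alpha * x_t * (1 - x_t) (Equation 5)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(alpha * xs[-1] * (1.0 - xs[-1]))
    return xs

def observed_patterns(series, D):
    """Set of distinct D = len ordinal patterns appearing in the series."""
    pats = set()
    for s in range(len(series) - D + 1):
        w = series[s:s + D]
        pats.add(tuple(sorted(range(D), key=lambda i: w[i])))
    return pats

found = observed_patterns(logistic_series(0.4, 10000), D=3)
missing = set(permutations(range(3))) - found
print(missing)  # {(2, 1, 0)}: the descending pattern is forbidden by the dynamics
```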

The relevance of this method is then clear: by assessing the presence, or absence, of some permutation patterns of the elements of a time series, it is possible to derive information about the dynamics of the underlying system. Even if all patterns eventually appear, the probability with which each one is present can unveil relevant information about this dynamics. More generally, to each time series it is possible to associate a probability distribution Π, whose elements π_i are the frequencies associated with the i possible permutation patterns (therefore, i = 1, ..., D!). The Permutation Entropy, PE, is then defined as the Shannon entropy associated with such a distribution:

PE = −Σ_{i=1}^{D!} π_i ln π_i    (6)

In order to assess the quantity of information encoded by such a distribution, the logarithm is usually taken in base 2. Furthermore, by noticing that PE ∈ [0, log_2 D!], a normalized Permutation Entropy can be defined:

PE_norm = −(1/log_2 D!) Σ_{i=1}^{D!} π_i log_2 π_i    (7)
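Combining the pattern extraction with Equations 6 and 7 gives a compact estimator (an illustrative sketch of ours, not a reference implementation):

```python
import random
from collections import Counter
from math import factorial, log2

def permutation_entropy(series, D, tau=1, normalized=True):
    """Permutation entropy (Equations 6 and 7): base-2 Shannon entropy of the
    ordinal-pattern frequencies, optionally normalized by log2(D!)."""
    pats = []
    for s in range(len(series) - tau * (D - 1)):
        w = series[s:s + tau * D:tau]
        pats.append(tuple(sorted(range(D), key=lambda i: w[i])))
    probs = [c / len(pats) for c in Counter(pats).values()]
    pe = sum(-p * log2(p) for p in probs)
    return pe / log2(factorial(D)) if normalized else pe

print(permutation_entropy(list(range(100)), D=3))   # 0.0: a monotonic series has a single pattern
random.seed(1)
noise = [random.random() for _ in range(100000)]
print(permutation_entropy(noise, D=3) > 0.99)       # True: white noise is close to the maximum 1
```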


A closely related information measure, 1 − PE_norm, called the normalized Kullback–Leibler entropy (KLE), was introduced in [19]. It quantifies the distance between the ordinal pattern probability distribution and the uniform distribution.

Equation 4 indicates that the resulting probability distribution has two main parameters: the embedding dimension D and the embedding delay τ. The former plays an important role in the evaluation of the appropriate probability distribution, since D determines the number of accessible states, given by D!. Moreover, to achieve reliable statistics and proper discrimination between stochastic and deterministic dynamics, it is necessary that N ≫ D! [13,20]. For practical purposes, the authors of [11] suggested working with 3 ≤ D ≤ 7 and a time lag τ = 1. Nevertheless, other values of τ might provide additional information, related to the intrinsic time scales of the system [21–24].

3. Applying Permutation Entropy

3.1. Distinguishing Noise from Chaos

In order to model a system, it is necessary to identify the underlying dynamics. Classifying these dynamics as stochastic or chaotic (deterministic) is essential to achieve the modeling goal.

This is not always an easy task. Consider, for example, the time series generated by the logistic map of Equation 5. The time series takes values in the interval [0, 1], and for α = 4 the dynamics is chaotic [18]. In this regime, the logistic map exhibits an almost flat PDF-histogram, with peaks at x = 0 and x = 1. This histogram-PDF constitutes an invariant measure of the system [18]. Thus, if we use this PDF, we obtain a value for the Shannon entropy close to its corresponding maximum S_max: the logistic map is almost indistinguishable from uncorrelated random noise.

This problem can be solved if the time causality of the series' values is duly taken into account when extracting the associated PDF from the time series, something one gets automatically from the Bandt and Pompe methodology [11]. Specifically, in the case of an unconstrained (uncorrelated) stochastic process, every ordinal pattern has the same probability of appearance [25–28]. That is, if the data set is long enough, all the ordinal patterns will eventually appear.

Amigó and co-workers [25,26] proposed a test that uses this last property, i.e., the number of missing ordinal patterns, in order to distinguish determinism (chaos) from pure randomness in finite time series contaminated with observational white noise (uncorrelated noise). The test is based on two important practical properties: finiteness and noise contamination. These two properties are important because finiteness produces missing patterns in a random sequence without constraints, whereas noise blurs the difference between deterministic and random time series. The methodology proposed by Amigó et al. [26] consists in a graphical comparison between the decay of the missing ordinal patterns (of length D) of the time series under analysis as a function of the series length N, and the decay corresponding to white Gaussian noise.

Stochastic processes could also present forbidden patterns [14]. However, in the case of either uncorrelated or some correlated stochastic processes, it can be numerically ascertained that no forbidden patterns emerge. Moreover, analytical expressions can be derived [29] for some stochastic processes (i.e., fractional Brownian motion for PDFs based on ordinal patterns of length 2 ≤ D ≤ 4). The


methodology of Amigó was recently extended by Carpi et al. [30] to the analysis of such stochastic processes: specifically, fractional Brownian motion (fBm), fractional Gaussian noise (fGn), and noises with f^{−k} power spectrum (k ≥ 0). More precisely, they analyzed the decay rate of missing ordinal patterns as a function of pattern length D (embedding dimension) and of time series length N. Results show that, for a fixed pattern length, the decay of missing ordinal patterns in stochastic processes depends not only on the series length, but also on their correlation structures. In other words, missing ordinal patterns are more persistent in time series with stronger correlation structures. Carpi et al. [30] also showed that the standard deviation of the estimated decay rate of missing ordinal patterns decreases with increasing D. This is due to the fact that longer patterns contain more temporal information, and are therefore more effective in capturing the dynamics of time series with correlation structures.

3.2. The Statistical Complexity and the Complexity-Entropy Plane

It is widely known that an entropic measure does not quantify the degree of structure or patterns present in a process [5]. Moreover, it was recently shown that measures of statistical or structural complexity are necessary for a better understanding of chaotic time series, because they are able to capture their organizational properties [31]. This specific kind of information is not revealed by randomness measures. The opposite extremes, perfect order (like a periodic sequence) and maximal randomness (e.g., a fair coin toss), possess no complex structure and exhibit zero statistical complexity. Between these extremes, a wide range of possible degrees of physical structure exists, which should be quantified by a statistical complexity measure. An effective statistical complexity measure (SCM) was introduced to detect essential details of the dynamics and differentiate different degrees of periodicity and chaos [32]. This measure provides additional insight into the details of the system's probability distribution, which is not discriminated by randomness measures like the entropy [13,31]. It can also help to uncover information related to the correlational structure between the components of the physical process under study [33,34].

This measure is a function of the probability distribution P associated with a time series, and is defined as the product of the normalized Shannon entropy and another term called the disequilibrium. The use of the ordinal patterns PDF naturally results in several advantages, viz. the inclusion of the temporal relationships between the elements of the time series, and the invariance with respect to nonlinear monotonous transformations.

More formally, following [35], the MPR-statistical complexity measure is defined as the product

C_JS[P] = Q_J[P, P_e] · H[P]    (8)

of (i) the normalized Shannon entropy H and (ii) the so-called disequilibrium Q_J, which is defined in terms of the extensive (in the thermodynamical sense) Jensen–Shannon divergence J[P, P_e] that links the two PDFs [36]. The Jensen–Shannon divergence, which quantifies the difference between two (or more) probability distributions, is especially useful to compare the symbol composition of different sequences [37]. Furthermore, the complexity-entropy plane, H × C_JS, which represents the evolution of the complexity of the system as a function of its entropy, has been used to study changes in the dynamics of a system originated by modifications of some characteristic parameters (see, for instance, [12,15,38–40] and references therein). The complexity measure constructed in this way


has the intensive property found in many thermodynamic quantities [32]. We stress that the statistical complexity defined above is a function of two normalized entropies (the Shannon entropy and the Jensen–Shannon divergence), but this function is not trivial, in that it depends on two different probability distributions, i.e., the one corresponding to the state of the system, P, and the uniform distribution, P_e, taken as the reference state.
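As an illustration, Equation 8 can be sketched as follows (our own code, not the authors'; the normalization constant Q_0 follows the expression commonly reported in the literature for the Jensen–Shannon disequilibrium, and should be double-checked against [35,36] before serious use):

```python
from math import log

def shannon(p):
    """Unnormalized Shannon entropy, with 0 ln 0 taken as 0."""
    return sum(-x * log(x) for x in p if x > 0)

def mpr_complexity(p):
    """Sketch of the MPR statistical complexity C_JS[P] = Q_J[P, P_e] * H[P] (Equation 8).
    H is the Shannon entropy normalized by ln M; Q_J is the Jensen-Shannon divergence
    between P and the uniform reference P_e, scaled by a constant Q_0 so that it lies in [0, 1]."""
    M = len(p)
    pe = [1.0 / M] * M                                        # uniform reference distribution P_e
    h = shannon(p) / log(M)                                   # normalized entropy H[P]
    mix = [(a + b) / 2.0 for a, b in zip(p, pe)]
    js = shannon(mix) - shannon(p) / 2.0 - shannon(pe) / 2.0  # Jensen-Shannon divergence
    q0 = -2.0 / (((M + 1.0) / M) * log(M + 1) - 2.0 * log(2.0 * M) + log(M))
    return q0 * js * h

# both extremes exhibit zero statistical complexity:
print(mpr_complexity([1.0, 0.0, 0.0, 0.0]))  # 0.0: perfect order (H = 0)
print(mpr_complexity([0.25] * 4))            # 0.0: maximal randomness (J = 0)
```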

3.3. Identification of Time Scales

Often, when first studying a complex physical or biological system, an almost mandatory step in its investigation involves determining its characteristic time scales.

Classically, this issue has been tackled by means of autocorrelation functions or the Delayed Mutual Information (see, for instance, [41,42]). Recently, the PE has been proposed as an alternative approach. Specifically, the idea is that the entropy associated with a time series should be minimal, that is, the underlying dynamics should be more predictable and simple, when the value of the embedding delay τ (see Equation 4) equals the characteristic time delay of the system.

This approach has been tested by Zunino and coworkers [22] with time series generated by a Mackey–Glass oscillator [43]. Results indicate that the permutation entropy exhibits a minimum in correspondence with the delay of the system, along with secondary minima corresponding to harmonics and subharmonics of this delay. Furthermore, it is also shown that this method is able to recover the characteristics of the system even when more than one delay is used, or when the time series is contaminated by noise. In [23] this technique is also applied to time series generated by chaotic semiconductor lasers with optical feedback, enabling the identification of three important features of the system: feedback time delay, relaxation oscillation period, and pulsing time scale. Similar applications can also be found in [24,44,45].

3.4. Dependences between Time Series

The identification of relationships between the dynamics of two or more time series is a relevant problem in many fields of science, among them economics and biophysics. Several techniques have been proposed in the past, but they usually require previous knowledge of the probability distribution from which the time series have been drawn. The model-independence of PE makes it an ideal tool to tackle this problem.

A test based on permutation patterns for independence between time series was proposed by Matilla-García and Ruiz Marín [46]. Specifically, given a two-dimensional time series W = (X, Y) and an embedding dimension D, to each subset of W a bidimensional permutation pattern π = ((x_0, x_1, ..., x_{D−1}), (y_0, y_1, ..., y_{D−1})) is assigned. Notice that, due to the dimension of the time series, the number of possible patterns grows from D! to D!^2. The appearance probability of all π_i generates a global probability distribution Π. When the two components of the time series are independent, it can be demonstrated that the Shannon entropy calculated on Π asymptotically follows a χ² distribution. Thanks to the use of permutation patterns, this method has the advantages of not requiring any model assumption (i.e., it is nonparametric) and of being suitable for the analysis of nonlinear processes.


Several papers have expanded the work of Matilla-García and Ruiz Marín [46]. For instance, in [47] a method for a spatial independence test is proposed; in [48] the permutation entropy is used to detect spatial structures, and specifically the order of contiguity; also, in [49] the symbolic entropy is used to assess the presence of linear and nonlinear spatial causality.

Cánovas et al. [50] proposed an alternative approach for the analysis of the dependence of two time series, based on the construction of contingency tables, i.e., matrices where the frequency of co-appearance of two patterns in two different time series at the same time is reported. Once a contingency table has been constructed, the independence of both series can be checked with standard statistical tests, including Pearson's chi-square, the G-test, or the Fisher–Freeman–Halton test [51].
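The contingency-table idea can be sketched as follows (our own minimal illustration with a hand-rolled Pearson chi-square statistic; it is not the exact construction of [50], and the dependence between overlapping consecutive patterns is ignored):

```python
import random
from collections import Counter

def patterns(series, D):
    """Ordinal patterns of a series for embedding dimension D (delay 1)."""
    return [tuple(sorted(range(D), key=lambda i: series[s + i]))
            for s in range(len(series) - D + 1)]

def pattern_contingency(x, y, D):
    """Contingency table counting how often pattern px in x co-appears with pattern py
    in y at the same time index, plus a Pearson chi-square statistic from its marginals."""
    pairs = Counter(zip(patterns(x, D), patterns(y, D)))
    rows = sorted({px for px, _ in pairs})
    cols = sorted({py for _, py in pairs})
    table = [[pairs.get((px, py), 0) for py in cols] for px in rows]
    n = sum(map(sum, table))
    row_sums = [sum(r) for r in table]
    col_sums = [sum(c) for c in zip(*table)]
    chi2 = sum((table[i][j] - row_sums[i] * col_sums[j] / n) ** 2
               / (row_sums[i] * col_sums[j] / n)
               for i in range(len(rows)) for j in range(len(cols)))
    return table, chi2

random.seed(7)
x = [random.random() for _ in range(5000)]
y = [random.random() for _ in range(5000)]
table, chi2 = pattern_contingency(x, y, D=3)
print(sum(map(sum, table)))  # 4998 pattern pairs (5000 - D + 1)
# for independent series, chi2 fluctuates around its degrees of freedom
```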

Finally, it is worth noting the work of Bahraminasab et al. [52], in which the permutation entropy is used along with the Conditional Mutual Information for the assessment of causal (or driver-response) relationships between two time series. The method is tested with van der Pol oscillators, demonstrating a good tolerance to external noise.

3.5. Some Improvements on the PE Definition

The original definition of PE presents two main drawbacks, for which solutions have recently been proposed.

First, it is clear that the magnitude of the difference between neighboring values is not taken into consideration when the time series is symbolized using the Bandt–Pompe recipe. Consequently, vectors of very different appearance are mapped to the same symbol. Liu and Wang introduced the fine-grained PE (FGPE), in which a factor is added to the permutation type to discriminate between these different vectors [53]. It is shown that the FGPE allows for a more sensitive identification of dynamical changes in time series, and approximates the Lyapunov exponent more closely for chaotic time series. Obviously, the time needed for estimating the FGPE is slightly larger than that needed for calculating the conventional PE.

Second, by assuming that the time series under study has a continuous distribution, Bandt and Pompe [11] neglected equal values and considered only inequalities between the data. Moreover, these authors proposed to rank possible equalities according to their order of emergence, or to eliminate them by adding small random perturbations to the original time series. Bian et al. [54] have recently proposed the modified permutation entropy (mPE) method to improve the symbolization of equal values. They have shown that the probability of equal values can be very high when the observed time series is digitized with low resolution. In this situation, the original recipe to deal with equalities can introduce some bias in the results. By mapping equal values to the same symbol, the mPE allows for a better characterization of the system states. The complexity of heart rate variability in three different groups (young, elderly and congestive heart failure) is better characterized with this improved version, reaching a clearer discrimination between the groups.
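A tie-aware symbolization in this spirit can be sketched as follows; the exact encoding used in [54] may differ, so this illustrates the idea of mapping equal values to the same symbol rather than reproducing the method:

```python
def mpe_symbol(window):
    """Ordinal symbol that assigns equal values the same rank,
    in the spirit of the mPE of Bian et al. (encoding is illustrative)."""
    rank = {v: r for r, v in enumerate(sorted(set(window)))}
    return tuple(rank[v] for v in window)

print(mpe_symbol([2, 2, 5]))   # (0, 0, 1): the tied values share one rank
print(mpe_symbol([2, 3, 5]))   # (0, 1, 2): distinct values, ordinary pattern
```

Under the original recipe the tied window [2, 2, 5] would be mapped to the same symbol as [2, 3, 5]; here the tie is preserved in the symbol itself.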


Entropy 2012, 14 1563

4. Biomedical Applications

Over the last few years, permutation entropy and related metrics have emerged as particularly appropriate complexity measures in the study of time series from biological systems, such as the brain or the heart. The reasons for this increasing success are manifold.

First, biological systems are typically characterized by complex dynamics, with rich temporal structure even at rest [55]. For instance, spontaneous brain activity encompasses a set of dynamically switching states, which are continuously re-edited across the cortex [56], in a non-random way [57,58]. On the other hand, various pathologies are associated with the appearance of highly stereotyped patterns of activity [59]. For instance, epileptic seizures are typically characterized by ordered sequences of symptoms. Permutation entropy seems particularly well equipped to capture this structure both in healthy systems and in pathological states.

Second, while over the last decades a wealth of linear and, more recently, nonlinear methods for quantifying this structure from time series have been devised [60,61], most of them, in addition to making restrictive hypotheses as to the type of underlying dynamics, are vulnerable to even low levels of noise. Even when mostly deterministic, biological time series typically contain a certain degree of randomness, e.g., in the form of dynamical and observational noise. Therefore, analyzing signals from such systems requires methods that are model-free and robust. Contrary to most nonlinear measures, permutation entropy and derived metrics can be calculated for arbitrary real-world time series and are rather robust to noise sources and artifacts [11,62].

Finally, real-time applications for clinical purposes require computationally parsimonious algorithms that can provide reliable results for relatively short and noisy time series. Most existing methods require long, stationary and noiseless data. Permutation entropy, on the contrary, is extremely fast and robust, and seems particularly advantageous when there are huge data sets and no time for preprocessing and fine-tuning of parameters.

In what follows, we give a brief overview of applications of ordinal time series analysis to studies of brain (electrical) activity, namely in epilepsy research [63–69], anesthesiology [70–74], and cognitive neuroscience [75,76]. All the studies reviewed below investigated brain electrical activity recorded with electroencephalographic (EEG) techniques. We also review studies of heart rate rhythms using various methods derived from Bandt and Pompe's original method [17,19,52,54,77].

4.1. Epilepsy Studies

Epilepsy is one of the most common neurological disorders, with a prevalence of approximately 1% of the world's population. Epilepsy presents itself in seizures, which result from abnormal, hyper-synchronous brain activity. The sudden and often unforeseen occurrence of seizures represents one of the most disabling aspects of the disease. In many patients suffering from epilepsy, seizures are well controlled with anti-epileptic drugs. However, approximately 30% of patients do not respond to available medication. For these patients, neurosurgical resection of epileptogenic brain tissue may represent a solution. Typically, surgeons strive to identify this tissue by implanting intracranial electrodes in the patients' brain. Correctly identifying the presence of epileptic activity, characterizing the spatio-temporal patterns of the corresponding brain activity, and predicting the occurrence of seizures are major challenges, the efficient solution of which could significantly improve the quality of life for epilepsy patients.

4.1.1. Classification

In biomedical studies, it is often very important to be able to classify different conditions, for instance for diagnostic purposes. In the case of epilepsy, discriminating between normal and pathological electroencephalographic recordings often represents a non-trivial task. Ordinal pattern distributions have proven a valuable tool for classifying and discriminating dynamical states of various biological systems. Veisi et al. [65] illustrated the ability of permutation entropy to classify normal and epileptic EEG. The results of classification performed using discriminant analysis indicated that permutation entropy measures can distinguish normal and epileptic EEG signals with an accuracy of more than 97% for clean and more than 85% for highly noisy EEG signals.

4.1.2. Determinism Detection

Often, epileptic seizures manifest in highly stereotypical, ordered sequences of symptoms and signs with limited variability. Schindler et al. [59] conjectured that this stereotypy may imply that ictal neuronal dynamics has deterministic characteristics, and that these would presumably be enhanced in the ictogenic regions of the brain. To test this hypothesis, the authors used the time-varying average number of forbidden patterns of multichannel recordings of periictal EEG activity in 16 patients. Results for intracranial EEG demonstrated a spatiotemporally limited shift of neuronal dynamics toward a more deterministic regime, specifically pronounced during the seizure-onset period. While the mean number of forbidden patterns did not significantly change during seizures, the maximum number of forbidden patterns across electrodes typically increased significantly during the first third of the seizure period and then gradually decreased toward and beyond seizure termination. Interestingly, for patients who became seizure free following surgery, the maximal number of forbidden patterns during seizure onset tended to be recorded from within the seizure-onset zones identified by visual inspection.

4.1.3. Detection of Dynamic Change

Detection of dynamical changes is one of the most important problems in physics and biology. Indeed, in clinical studies, accurate detection of transitions from normal to pathological states may improve diagnosis and treatment. This is particularly evident in the case of epilepsy, as seizure detection is a necessary precondition for diagnosis. During the last two decades, a number of numerical methods have been proposed to detect dynamical changes. However, most of these methods are computationally expensive, as they involve inspecting the underlying dynamics in the system's phase space. Cao et al. [63] used permutation entropy to identify the various phases of epileptic activity in the intracranial EEG signals recorded from three patients suffering from intractable epilepsy. The authors found a sharp PE drop after the seizure, followed by a gradual increase, indicating that the dynamics of the brain first becomes more regular right after the seizure, and that its irregularity then increases as it approaches the normal state. Ouyang et al. [68] calculated the distribution of ordinal patterns for the detection of absence seizures in rats. A dissimilarity measure between two EEG series was then used to distinguish between interictal, preictal and ictal states, i.e., respectively far away from, close to and during an epileptic seizure, leading to the successful detection of the preictal state prior to seizure onset in 109 out of 168 seizures (64.9%). Nicolaou and Georgiou [78] investigated the use of permutation entropy as a feature for automated epileptic seizure detection. A support vector machine was used to classify segments of normal and epileptic EEG, yielding an average sensitivity of 94.38% and an average specificity of 93.23%. Perfect sensitivity and specificity were obtained in single-trial classifications. Finally, a cautionary note on the scope for the use of permutation entropy in seizure detection comes from the study of Bruzzo et al. [69], where scalp EEG data recorded from three epileptic patients were considered. With a receiver operating characteristic analysis, the authors evaluated the separability of amplitude distributions of ordinal patterns resulting from preictal and interictal phases. While a good separability of interictal and preictal phases was found, the changes in permutation entropy values during the preictal phase and at seizure onset coincided with changes in vigilance state, restricting its possible use for seizure prediction on scalp EEG. On the other hand, this finding suggested a possible usefulness for the automated classification of vigilance states.
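The sliding-window use of PE for change detection can be sketched as follows; window and step sizes are arbitrary illustrative choices, not those of [63] or [68]:

```python
import math
import random
from collections import Counter

def permutation_entropy(series, D=3):
    """Normalized Bandt-Pompe permutation entropy of order D."""
    counts = Counter(
        tuple(sorted(range(D), key=lambda i: series[t + i]))
        for t in range(len(series) - D + 1)
    )
    n = sum(counts.values())
    H = -sum((c / n) * math.log(c / n) for c in counts.values())
    return H / math.log(math.factorial(D))

def sliding_pe(series, window=200, step=50, D=3):
    """PE profile over sliding windows; a drop flags a shift toward
    more regular (e.g., post-ictal) dynamics."""
    return [permutation_entropy(series[s:s + window], D)
            for s in range(0, len(series) - window + 1, step)]

# Toy example: noise followed by a regular oscillation.
random.seed(1)
sig = [random.random() for _ in range(600)] + \
      [math.sin(0.3 * t) for t in range(600)]
profile = sliding_pe(sig)
# PE is high in the noisy half and markedly lower in the periodic half.
```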

4.1.4. Prediction

Over and above the very occurrence of epileptic seizures and their frequency, their sudden and uncontrollable character is probably the single most important factor negatively affecting the life of patients. Thus, methods capable of reliably predicting the occurrence of seizures could significantly improve the quality of life for these patients and pave the way for new therapeutic strategies. Li et al. [79] proved, for a population of rats, that permutation entropy can be used not only to track the dynamical changes of EEG data, but also to successfully detect pre-seizure states. A threshold for detecting the pre-ictal state was determined by calculating the mean value and standard deviation of the variations of permutation entropy, and of another commonly used metric, i.e., sample entropy, for the respective rat. The method was successful in detecting 169 out of 314 seizures from 28 rats, with an average anticipation time of approximately 5 seconds, faring better than sample entropy (3.7 seconds). Ouyang et al. [67] studied the statistics of forbidden patterns for the EEG series of genetic absence epilepsy rats. The results showed that the number of forbidden patterns grew significantly from an interictal to an ictal state via a preictal state. In addition to indicating an increase in deterministic dynamics in the transition from interictal to ictal states, these results suggested that forbidden patterns may represent a predictor of absence seizures.
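One plausible reading of the thresholding step is sketched below; the exact rule, the multiplier k, and the direction of the crossing used in [79] may differ, so all names and numbers here are illustrative:

```python
import statistics

def detection_threshold(baseline_pe, k=3.0):
    """Alarm threshold from a baseline record of PE values:
    mean minus k standard deviations (k is an illustrative choice)."""
    mu = statistics.mean(baseline_pe)
    sigma = statistics.stdev(baseline_pe)
    return mu - k * sigma

# Hypothetical baseline PE values from an interictal recording.
baseline = [0.91, 0.93, 0.92, 0.94, 0.92, 0.93]
thr = detection_threshold(baseline)
later = [0.92, 0.93, 0.70, 0.68]           # PE of subsequent windows
alarm = [t for t, v in enumerate(later) if v < thr]
print(alarm)   # → [2, 3]: windows whose PE falls below the threshold
```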

4.1.5. Spatio-Temporal Dynamics

While the emphasis of most studies in epilepsy has long been on the identification of local epileptogenic foci, it is now widely recognized that seizure dynamics is an essentially spatially-extended phenomenon [80]. One fundamental issue is then the assessment of the relationship between dynamics observed in different parts of the system. Keller et al. [64] proposed a method for visualizing time-dependent similarities and dissimilarities between the components of a high-dimensional time series. The method, derived from correspondence analysis, essentially counts pattern type frequencies. At each time, the method quantifies how inhomogeneous the set of time series components is and provides a one-dimensional representation of this system. The method was shown to be able to quantify long-term qualitative changes and local differences in scalp EEG activity for children with epileptic disorders. Similarities and dissimilarities between the channels were calculated in terms of a scaling parameter, allowing discrimination of the components with respect to a specific weighting of the pattern frequencies.

A related issue, when dealing with inherently multivariate data sets, is the evaluation of the coupling direction between subparts of the considered system (see Section 3.4). Staniek et al. [81] combined transfer and permutation entropy to analyze electroencephalographic recordings from 15 epilepsy patients. The results showed that the derived metric could reliably identify the hemisphere containing the epileptic focus, without observing actual seizure activity. Finally, Li et al. [82] proposed a methodology based on permutation analysis and conditional mutual information to estimate the directionality of coupling between two neuronal populations. Simulations showed that this method outperformed conditional mutual information and Granger causality in a neuronal mass model, and in assessing the coupling direction between neuronal populations in a hippocampal rat model of focal epilepsy. This coupling direction estimation method also allowed tracing the propagation direction of seizure events.

In summary, the studies reviewed above point to various ways in which permutation entropy can fruitfully be employed to tackle fundamental theoretical and clinical issues associated with epilepsy. From a theoretical point of view, results using forbidden pattern statistics hint at a deterministic nature of the dynamics associated with epilepsy. From a clinical point of view, these results indicate that permutation entropy, and particularly forbidden pattern statistics, can be used not only to detect seizure onset but also to predict upcoming seizures before they actually occur. Furthermore, in spite of the conceptual similarity between forbidden ordinal patterns and standard EEG analysis based on visual inspection, forbidden ordinal patterns may provide additional information that is difficult to detect by visual inspection alone. Their clinical relevance, particularly in pre-surgical evaluation, is underscored for instance by Schindler et al.'s finding that the maximal number of forbidden patterns tended to occur more rarely in EEG signals recorded from the visually identified seizure-onset zone in patients who were not rendered seizure free by resection of that zone [59].

4.2. Anesthesia

Anesthetic drugs mainly exert their effects on the central nervous system. Thus, EEG technology can be used to assess the effects of anesthesia. Electroencephalogram-based monitors represent a supplement to standard anesthesia monitoring, the main aim of which is to reduce the risk of awareness during surgery. Electroencephalogram-based parameters typically aim at reducing the complex observed electroencephalographic pattern to a single value associated with the anesthetic drug effect and the clinical patient status, e.g., consciousness and unconsciousness. These issues were examined in various studies [70,72–74,79], which consistently showed that permutation entropy can be used to efficiently discriminate between different levels of consciousness during anesthesia, providing an index of the anesthetic drug effect.

Biological systems, such as the brain or the cardio-respiratory system, typically show activity over multiple time scales. Even at rest, the interplay between different regulatory systems ensures constant information exchange across these scales. Thus, a description should be more accurate when accounting for activity not just at one particular scale, but across all or most of the relevant scales at which the system operates. This intuition received important confirmation in two studies assessing the depth of anesthesia. Li et al. [83] proposed a multiscale permutation measure, called composite multi-scale permutation entropy (CMSPE), to quantify the anesthetic drug effect on EEG recordings during sevoflurane anesthesia. Three sets of simulated EEG series during awake, light and deep anesthesia were examined. The results showed that the single-scale permutation entropy was blind to subtle transitions between light and deep anesthesia, while the CMSPE index tracked these changes accurately. Around the time of loss of consciousness, CMSPE responded significantly more rapidly than the single-scale permutation entropy. In addition, the prediction probability was slightly higher for CMSPE and correlated well with the level of anesthesia. These results were consistent with those of a recent study [74], where promising results in terms of evaluation of the depth of anesthesia were found with both single-scale and multiscale permutation entropy.
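A composite multiscale construction can be sketched as follows. The sketch follows the common recipe of averaging PE over the possible coarse-graining offsets at each scale; the exact CMSPE definition in [83] may differ in detail, and all names are ours.

```python
import math
import random
from collections import Counter

def permutation_entropy(series, D=3):
    """Normalized Bandt-Pompe permutation entropy of order D."""
    counts = Counter(
        tuple(sorted(range(D), key=lambda i: series[t + i]))
        for t in range(len(series) - D + 1)
    )
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values()) \
        / math.log(math.factorial(D))

def coarse_grain(series, scale, offset=0):
    """Non-overlapping means of length `scale`, starting at `offset`."""
    return [sum(series[i:i + scale]) / scale
            for i in range(offset, len(series) - scale + 1, scale)]

def cmspe(series, scale, D=3):
    """Composite multiscale PE: PE averaged over the `scale` possible
    coarse-graining offsets (illustrative version of the idea in [83])."""
    return sum(permutation_entropy(coarse_grain(series, scale, o), D)
               for o in range(scale)) / scale

random.seed(3)
noise = [random.random() for _ in range(3000)]
for s in (1, 2, 3, 4):
    print(s, round(cmspe(noise, s), 3))   # stays high at all scales for white noise
```

At scale 1 the measure reduces to the ordinary single-scale PE; structured signals typically show a scale-dependent profile instead of the flat one of white noise.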

4.3. Cognitive Neuroscience

Understanding brain activity has an interest that goes beyond the clinical domain. For instance, cognitive neuroscience studies the biological substrates, mainly brain activity, underlying cognition. A typical cognitive neuroscience study involves averaging the brain (electrical or hemodynamic) response to given stimuli. Extracting the part of the observed response that is stimulus-specific from the inherent variability of brain activity is a challenging task. Schinkel and colleagues [75,76] showed how ordinal pattern methods can be used to achieve better signal-to-noise ratios. This was done using permutation entropy in conjunction with recurrence quantification analysis (RQA). The authors showed that this combination of methods can improve the analysis of event related potentials (ERP), i.e., the trial-averaged EEG signal time-locked to given behavioral events. The resulting technique, termed order patterns recurrence plots (OPRPs), was applied to EEG data recorded during a language processing experiment, resulting in a significant reduction of the number of trials required to extract a task-related ERP.

Ordinal patterns can also be used for classification purposes, e.g., between different trial types in cognitive neuroscience experiments. For instance, permutation entropy was employed to characterize electroencephalographic signals from three subjects performing four different motor imagery tasks, which were then classified using a support vector machine [84]. Subject-specific single-trial classification accuracy higher than that of conventional classifiers could be achieved, occasionally reaching perfect classification.

4.4. Heart Rhythms

Cardiac diseases are often associated with changes in heart rate variability and in characteristic patterns of beat-to-beat intervals (BBI). Discriminating between physiological and pathological BBI patterns represents a key diagnostic tool. Successful classification of time series of BBIs crucially depends on the availability of significant features. Permutation entropy has consistently been shown to greatly improve the ability to distinguish heart rate variability under different physiological and pathological conditions [54]. Ordinal pattern statistics was proven to be more efficient than established heart rate variability indicators at distinguishing patients suffering from congestive heart failure from healthy subjects [17]. Ordinal patterns have also proved to be valuable features for the classification of fetal heart state [19] and could conceivably serve to develop and investigate clustering methods by considering the ordinal structure of a time series. Berg et al. [77] compared a large number of signal features, including conventional heart rate variability parameters and statistics based on ordinal patterns; the aim was to assess their ability to form the basis for suitable signal classifiers. The results for animals and humans suffering from myocardial infarction (MI) suggested that ordinal patterns may represent meaningful features.

The heart continuously interacts with other physiological regulatory mechanisms, and the failure of one system can trigger a breakdown of the entire network. Understanding and quantifying the complex coupling patterns between these systems represents a major theoretical challenge. Two studies suitably modified information-theoretic measures of the directionality of coupling with ordinal pattern statistics. Bahraminasab et al. [52] introduced a permutation entropy-based directionality index, which could distinguish unidirectional from bidirectional coupling, and reveal and quantify asymmetry in bidirectional coupling. The method was tested on cardiorespiratory data from 20 healthy subjects. Consistent with the existing physiological literature, the results from this study showed that respiration drives the heart more than vice versa.

Taken together, these results illustrate the possible use of permutation entropy-based methods to tackle often non-trivial diagnostic problems in the field of cardiology. Conceptual simplicity and computational efficiency render this method of data analysis an excellent one for screening and detecting pathological patterns of physiological activity, both in systems considered in isolation, such as the heart, and in coupled systems.

5. Econophysics Applications

Assessing the efficiency and the potential development of a given market is a fundamental issue in economics, as this has clear implications in terms of political economy. The main problem is that such an assessment can only be performed by means of the analysis of the time series of market indicators, which are usually the only available objective output. The Efficient Market Hypothesis (EMH) stipulates that efficient markets should be perfectly unpredictable, as any deterministic structure can be used to outperform the market. As previously reviewed, PE can be used effectively to discriminate between deterministic chaos and random noise.

A natural way of doing that using the Bandt and Pompe methodology involves quantifying the forbidden patterns in the series, as their presence indicates a deterministic chaotic dynamics (see Section 3.1). This idea was first examined in [85], where real time series of different financial indicators (Dow Jones Industrial Average, Nasdaq Composite, IBM and Boeing NYSE stocks, and the ten-year U.S. Bond interest rate) were shown to have a number of forbidden patterns at least two orders of magnitude higher than that expected if the different time series were random. By employing a rolling sample approach with a sliding window, it was also found that the evolution of the forbidden patterns allowed identifying periods of time where noise prevailed over the deterministic behavior of the financial indicators.
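Counting missing ordinal patterns is straightforward to sketch; the order D = 4, the series lengths, and the logistic-map example below are illustrative choices, not those of [85]:

```python
import random
from itertools import permutations

def forbidden_patterns(series, D=4):
    """Number of order-D ordinal patterns that never occur in the series."""
    seen = {
        tuple(sorted(range(D), key=lambda i: series[t + i]))
        for t in range(len(series) - D + 1)
    }
    return len(list(permutations(range(D)))) - len(seen)

# Deterministic chaos: the fully chaotic logistic map x -> 4x(1-x).
logistic = [0.1]
for _ in range(2000):
    logistic.append(4 * logistic[-1] * (1 - logistic[-1]))

random.seed(0)
noise = [random.random() for _ in range(2000)]

print(forbidden_patterns(logistic))  # > 0: deterministic structure
print(forbidden_patterns(noise))     # 0 with overwhelming probability
```

For white noise every pattern eventually appears, so a persistent count of missing patterns in a long series is a signature of determinism.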


In addition to helping assess the presence or absence of determinism, the number of forbidden patterns can also be used to quantify the amount of market structure, which in turn is an indicator of the level of stock market development. Therefore, statistics about ordinal patterns can be used as a model-independent measure of stock market inefficiency. In [86], the number of forbidden patterns and the normalized PE were estimated for the stock market indices of 32 different countries, including 18 developed and 14 emerging markets. Developed markets had a lower number of forbidden patterns and a higher normalized PE, indicating that they are less predictable.

Expanding on this idea, Zunino and coworkers analyzed the location of stock [38], commodity [39] and sovereign bond [87] markets in the complexity-entropy causality plane (see Section 3.2). The EMH stipulates that efficient markets should be associated with large entropy and low complexity values. The presence of temporal patterns resulted in deviations from the position associated with a totally random process. Consequently, the distance to this random ideal location is used to quantify the inefficiency of the market under analysis. The results from [38,87] showed that the complexity-entropy causality plane could robustly discriminate between emergent and developed stock markets.
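The plane coordinates can be sketched as the normalized Shannon entropy H and the Jensen-Shannon statistical complexity C, computed from the ordinal pattern distribution; the normalization constant below follows the form used in the causality-plane literature (see, e.g., [13,14]), and the implementation is our own illustrative version.

```python
import math
import random
from collections import Counter
from itertools import permutations

def ordinal_distribution(series, D=4):
    """Bandt-Pompe probability distribution over the D! ordinal patterns."""
    counts = Counter(
        tuple(sorted(range(D), key=lambda i: series[t + i]))
        for t in range(len(series) - D + 1)
    )
    n = sum(counts.values())
    return [counts.get(p, 0) / n for p in permutations(range(D))]

def shannon(P):
    return -sum(p * math.log(p) for p in P if p > 0)

def complexity_entropy(series, D=4):
    """Location (H, C) in the complexity-entropy causality plane:
    normalized entropy H and Jensen-Shannon complexity C = Q_J * H."""
    P = ordinal_distribution(series, D)
    N = math.factorial(D)
    H = shannon(P) / math.log(N)
    Pe = [1.0 / N] * N                              # uniform reference
    M = [(p + q) / 2 for p, q in zip(P, Pe)]
    JS = shannon(M) - (shannon(P) + shannon(Pe)) / 2
    Q0 = -2.0 / (((N + 1) / N) * math.log(N + 1)    # max of JS, for normalization
                 - 2 * math.log(2 * N) + math.log(N))
    return H, Q0 * JS * H

random.seed(5)
noise = [random.random() for _ in range(2000)]
H, C = complexity_entropy(noise)
print(round(H, 3), round(C, 3))   # near (1, 0): the ideal random location
```

A market index with temporal structure lands away from this (H, C) corner, and its distance to the random ideal location can serve as the inefficiency measure described above.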

Another common problem in stock market analysis is that of judging the degree of dependency between two or more time series. In [46], the authors proposed a test based on ordinal patterns, which was described in Section 3.4. Furthermore, the method was evaluated against the daily financial returns of the Dow Jones Industrial Average, the S&P 500 and three exchange rate time series (French franc, German mark and Canadian dollar, all against the U.S. dollar). The results indicated that the five time series are not independent, and thus substantially deviate from a random process.

Finally, it has very recently been shown [88] that volatility in energy markets can be effectively quantified by PE and its improved version, the FGPE, defined in Section 3.5. Through numerical examples the authors proved that these two approaches based on ordinal patterns are more appropriate for estimating the uncertainty associated with a time series than conventional measures of dispersion, such as the standard deviation. Moreover, the analysis of some typical electricity markets (Nord Pool, Ontario, Omel and four Australian markets) demonstrated the ability of these measures to detect interesting features, such as the seasonal behavior of the volatilities and relationships between markets.

6. Conclusions

In this work, we have reviewed the technique introduced exactly ten years ago by Bandt and Pompe [11], which is based on the assessment of the frequency of appearance of permutation patterns in a time series. The mathematical foundations of the method have been discussed, and some extensions of the original concept have been described.

The Bandt and Pompe methodology represents an extremely simple technique that, in its basic form, only requires two parameters: the pattern length/embedding dimension D and the embedding delay τ. Its most important merit resides in its ability to extract useful knowledge about the dynamics of a system. The quantity of forbidden patterns is often related to other classical non-linear quantifiers, such as the Lyapunov exponent [89], but can be calculated with minimal computational cost. Furthermore, the PDF associated with the ordinal patterns is invariant with respect to nonlinear monotonous transformations. Accordingly, nonlinear drifts or scalings artificially introduced by a measurement device will not modify the estimation of the quantifiers, a nice property if one deals with experimental data [90]. A further valuable property is its robustness to both observational and dynamical noise [11]. Finally, it is model-free and can be applied to any type of time series, i.e., regular, chaotic, noisy, etc.
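The invariance under monotonous transformations is easy to verify numerically: a strictly increasing transform leaves every ordinal pattern, and hence the PE, unchanged. A small self-contained check, with illustrative names:

```python
import math
import random

def patterns(series, D=3):
    """Sequence of Bandt-Pompe ordinal patterns of order D."""
    return [tuple(sorted(range(D), key=lambda i: series[t + i]))
            for t in range(len(series) - D + 1)]

random.seed(7)
x = [random.gauss(0, 1) for _ in range(500)]
warped = [math.exp(v) for v in x]   # nonlinear but strictly increasing transform

assert patterns(x) == patterns(warped)   # identical pattern sequence, so equal PE
```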

While the original goal of the PE was the discrimination of chaotic from random dynamics, it soon became clear that this method can be used effectively to address a number of important problems in time series analysis: among others, (a) classifying different dynamics; (b) identifying break points in time series; (c) predicting future events; (d) determining time scales; (e) quantifying the dissimilarity between time series; or (f) identifying directionality and causality. Furthermore, while the method was originally designed to deal with simple scalar time series, it has been successfully extended to multi-variate and multi-scale systems.

Several applications to economic and biological systems have been discussed. Yet, countless other examples of applications can be found in the literature: they include the characterization of correlated stochastic processes [91–93] and laser dynamics [94–97], identification of the quantum-classical transition [20,98–100], analysis of solar wind time series [101] and of climate evolution [90], song classification [102], analysis of motor current signals [103], control of machine tool chatter phenomena [104], characterization of gene expression [105], optimization of grid computing [79], characterization and improvement of pseudo-random number generators (PRNGs) [7,106–109], determination of the sampling period for chaotic attractors which preserves the chaotic dynamics [110], and encryption tests of messages in laser signals [111].

Acknowledgments

LZ and OAR were supported by the Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Argentina. O.A. Rosso gratefully acknowledges support from a CNPq fellowship, Brazil.

References

1. Gray, R.M. Entropy and Information Theory; Springer: Berlin/Heidelberg, Germany, 1990.2. Brissaud, J.B. The meaning of entropy. Entropy 2005, 7, 68–96.3. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.4. Shannon, C.; Weaver, W. The Mathematical Theory of Communication; University of Illinois

Press: Champaign, IL, USA, 1949.5. Feldman, D.P.; Crutchfield, J.P. Measures of statistical complexity: Why? Phys. Lett. A 1998,

238, 244–252.6. Rosso, O.A.; Craig, H.; Moscato, P. Shakespeare and other English renaissance authors as

characterized by Information Theory complexity quantifiers. Physia A 2009, 388, 916–926.7. De Micco, L.; Gonzalez, C.M.; Larrondo, H.A.; Martın, M.T.; Plastino, A.; Rosso, O.A.

Randomizing nonlinear maps via symbolic dynamics. Physia A 2008, 387, 3373–3383.8. Mischaikow, K.; Mrozek, M.; Reiss, J.; Szymczak, A. Construction of symbolic dynamics from

experimental time series. Phys. Rev. Lett. 1999, 82, 1144–1147.9. Powell, G.E.; Percival, I.C. A spectral entropy method for distinguishing regular and irregular

motion of hamiltonian systems. J. Phys. A: Math. Gen. 1979, 12, 2053–2071.

Page 19: OPEN ACCESS entropymzanin.com/papers/Permutation_Entropy_Review.pdfMassimiliano Zanin 1 ;2 3 *, Luciano Zunino 4 5, Osvaldo A. Rosso 6;7 and David Papo 1 1 Centre for Biomedical Technology,

Entropy 2012, 14 1571

10. Rosso, O.A.; Blanco, S.; Jordanova, J.; Kolev, V.; Figliola, A.; Schürmann, M.; Başar, E. Wavelet entropy: A new tool for analysis of short duration brain electrical signals. J. Neurosci. Meth. 2001, 105, 65–75.

11. Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2002, 88, 174102:1–174102:4.

12. Martín, M.T.; Plastino, A.; Rosso, O.A. Generalized statistical complexity measures: Geometrical and analytical properties. Phys. A 2006, 369, 439–462.

13. Rosso, O.A.; Larrondo, H.A.; Martín, M.T.; Plastino, A.; Fuentes, M.A. Distinguishing noise from chaos. Phys. Rev. Lett. 2007, 99, 154102:1–154102:4.

14. Rosso, O.A.; Carpi, L.C.; Saco, P.M.; Ravetti, M.G.; Plastino, A.; Larrondo, H.A. Causality and the entropy-complexity plane: Robustness and missing ordinal patterns. Phys. A 2012, 391, 42–55.

15. Rosso, O.A.; De Micco, L.; Larrondo, H.; Martín, M.T.; Plastino, A. Generalized statistical complexity measure. Int. J. Bifurc. Chaos 2010, 20, 775–785.

16. Parlitz, U.; Berg, S.; Luther, S.; Schirdewan, A.; Kurths, J.; Wessel, N. Classifying Cardiac Biosignals Using Pattern Statistics and Symbolic Dynamics. In Proceedings of the 6th ESGCO, Berlin, Germany, 12–14 April 2010.

17. Parlitz, U.; Berg, S.; Luther, S.; Schirdewan, A.; Kurths, J.; Wessel, N. Classifying cardiac biosignals using ordinal pattern statistics and symbolic dynamics. Comput. Biol. Med. 2012, 42, 319–327.

18. Peitgen, H.O.; Jürgens, H.; Saupe, D. Chaos and Fractals, New Frontiers of Science; Springer-Verlag: New York, NY, USA, 1992.

19. Frank, B.; Pompe, B.; Schneider, U.; Hoyer, D. Permutation entropy improves fetal behavioural state classification based on heart rate analysis from biomagnetic recordings in near term fetuses. Med. Biol. Eng. Comput. 2006, 44, 179–187.

20. Kowalski, A.M.; Martín, M.T.; Plastino, A.; Rosso, O.A. Bandt-Pompe approach to the classical-quantum transition. Phys. D 2007, 233, 21–31.

21. Bandt, C. Ordinal time series analysis. Ecol. Model. 2005, 182, 229–238.

22. Zunino, L.; Soriano, M.C.; Fischer, I.; Rosso, O.A.; Mirasso, C.R. Permutation-information-theory approach to unveil delay dynamics from time-series analysis. Phys. Rev. E 2010, 82, 046212:1–046212:9.

23. Soriano, M.C.; Zunino, L.; Rosso, O.A.; Fischer, I.; Mirasso, C.R. Time scales of a chaotic semiconductor laser with optical feedback under the lens of a permutation information analysis. IEEE J. Quantum Electron. 2011, 47, 252–261.

24. Soriano, M.C.; Zunino, L.; Larger, L.; Fischer, I.; Mirasso, C.R. Distinguishing fingerprints of hyperchaotic and stochastic dynamics in optical chaos from a delayed opto-electronic oscillator. Opt. Lett. 2011, 36, 2212–2214.

25. Amigó, J.M.; Kocarev, L.; Szczepanski, J. Order patterns and chaos. Phys. Lett. A 2006, 355, 27–31.

26. Amigó, J.M.; Zambrano, S.; Sanjuán, M.A.F. True and false forbidden patterns in deterministic and random dynamics. Europhys. Lett. 2007, 79, doi:10.1209/0295-5075/79/50001.


27. Amigó, J.M.; Zambrano, S.; Sanjuán, M.A.F. Combinatorial detection of determinism in noisy time series. Europhys. Lett. 2008, 83, doi:10.1209/0295-5075/83/60005.

28. Amigó, J.M. Permutation Complexity in Dynamical Systems; Springer-Verlag: Berlin/Heidelberg, Germany, 2010.

29. Bandt, C.; Shiha, F. Order patterns in time series. J. Time Ser. Anal. 2007, 28, 646–665.

30. Carpi, L.C.; Saco, P.M.; Rosso, O.A. Missing ordinal patterns in correlated noises. Phys. A 2010, 389, 2020–2029.

31. Feldman, D.P.; McTague, C.S.; Crutchfield, J.P. The organization of intrinsic computation: Complexity-entropy diagrams and the diversity of natural information processing. Chaos 2008, 18, doi:10.1063/1.2991106.

32. Lamberti, P.W.; Martín, M.T.; Plastino, A.; Rosso, O.A. Intensive entropic nontriviality measure. Phys. A 2004, 334, 119–131.

33. Rosso, O.A.; Masoller, C. Detecting and quantifying stochastic and coherence resonances via information-theory complexity measurements. Phys. Rev. E 2009, 79, 040106R:1–040106R:4.

34. Rosso, O.A.; Masoller, C. Detecting and quantifying temporal correlations in stochastic resonance via information theory measures. Eur. Phys. J. B 2009, 69, 37–43.

35. López-Ruiz, R.; Mancini, H.L.; Calbet, X. A statistical measure of complexity. Phys. Lett. A 1995, 209, 321–326.

36. Kowalski, A.M.; Martín, M.T.; Plastino, A.; Rosso, O.A.; Casas, M. Distances in probability space and the statistical complexity setup. Entropy 2011, 13, 1055–1075.

37. Grosse, I.; Bernaola-Galván, P.; Carpena, P.; Román-Roldán, R.; Oliver, J.; Stanley, H.E. Analysis of symbolic sequences using the Jensen-Shannon divergence. Phys. Rev. E 2002, 65, doi:10.1103/PhysRevE.65.041905.

38. Zunino, L.; Zanin, M.; Tabak, B.M.; Pérez, D.G.; Rosso, O.A. Complexity-entropy causality plane: A useful approach to quantify the stock market inefficiency. Phys. A 2010, 389, 1891–1901.

39. Zunino, L.; Tabak, B.M.; Serinaldi, F.; Zanin, M.; Pérez, D.G.; Rosso, O.A. Commodity predictability analysis with a permutation information theory approach. Phys. A 2011, 390, 876–890.

40. Masoller, C.; Rosso, O.A. Quantifying the complexity of the delayed logistic map. Philos. Trans. Roy. Soc. A 2011, 369, 425–438.

41. Lepri, S.; Giacomelli, G.; Politi, A.; Arecchi, F.T. High-dimensional chaos in delayed dynamical systems. Phys. D 1994, 70, 235–249.

42. Fraser, A.M.; Swinney, H.L. Independent coordinates for strange attractors from mutual information. Phys. Rev. A 1986, 33, 1134–1140.

43. Mackey, M.C.; Glass, L. Oscillation and chaos in physiological control systems. Science 1977, 197, 287–289.

44. Xiang, S.; Pan, W.; Luo, B.; Yan, L.; Zou, X.; Jiang, N.; Yang, L.; Zhu, H. Conceal time-delay signature of chaotic vertical-cavity surface-emitting lasers by variable-polarization optical feedback. Opt. Commun. 2011, 284, 5758–5765.


45. Wu, J.G.; Wu, Z.M.; Xia, G.Q.; Feng, G.Y. Evolution of time delay signature of chaos generated in a mutually delay-coupled semiconductor lasers system. Opt. Express 2012, 20, 1741–1753.

46. Matilla-García, M.; Ruiz Marín, M. A non-parametric independence test using permutation entropy. J. Econom. 2008, 144, 139–155.

47. López, F.; Matilla-García, M.; Mur, J.; Ruiz Marín, M. A non-parametric spatial independence test using symbolic entropy. Reg. Sci. Urban Econ. 2010, 40, 106–115.

48. Matilla-García, M.; Ruiz Marín, M. Spatial symbolic entropy: A tool for detecting the order of contiguity. Geogr. Anal. 2011, 43, 228–239.

49. Herrera Gómez, M.; Ruiz Marín, M.; Mur, J.; Paelinck, J. A non-parametric approach to spatial causality. In Proceedings of the 9èmes Journées Internationales d'Économétrie et de Statistique Spatiales, Orléans, France, 24–25 June 2010.

50. Cánovas, J.S.; Guillamón, A.; Ruíz, M.C. Using permutations to detect dependence between time series. Phys. D 2011, 240, 1199–1204.

51. Freeman, G.H.; Halton, J.H. Note on an exact treatment of contingency, goodness of fit and other problems of significance. Biometrika 1951, 38, 141–149.

52. Bahraminasab, A.; Ghasemi, F.; Stefanovska, A.; McClintock, P.V.E.; Kantz, H. Direction of coupling from phases of interacting oscillators: A permutation information approach. Phys. Rev. Lett. 2008, 100, 084101:1–084101:4.

53. Liu, X.-F.; Wang, Y. Fine-grained permutation entropy as a measure of natural complexity for time series. Chin. Phys. B 2009, 18, 2690–2695.

54. Bian, C.; Qin, C.; Ma, Q.D.Y.; Shen, Q. Modified permutation-entropy analysis of heartbeat dynamics. Phys. Rev. E 2012, 85, 021906:1–021906:7.

55. Goldberger, A.L.; Amaral, L.A.N.; Hausdorff, J.M.; Ivanov, P.; Peng, C.K.; Stanley, H.E. Fractal dynamics in physiology: Alterations with disease and aging. Proc. Natl. Acad. Sci. USA 2002, 99, 2466–2472.

56. Kenet, T.; Bibitchkov, D.; Tsodyks, M.; Grinvald, A.; Arieli, A. Spontaneously emerging cortical representations of visual attributes. Nature 2003, 425, 954–956.

57. Beggs, J.; Plenz, D. Neuronal avalanches in neocortical circuits. J. Neurosci. 2003, 23, 11167–11177.

58. Dragoi, G.; Tonegawa, S. Preplay of future place cell sequences by hippocampal cellular assemblies. Nature 2011, 469, 397–401.

59. Schindler, K.; Gast, H.; Stieglitz, L.; Stibal, A.; Hauf, M.; Wiest, R.; Mariani, L.; Rummel, C. Forbidden ordinal patterns of periictal intracranial EEG indicate deterministic dynamics in human epileptic seizures. Epilepsia 2011, 52, 1771–1780.

60. Stam, C.J. Nonlinear dynamical analysis of EEG and MEG: Review of an emerging field. Clin. Neurophysiol. 2005, 116, 2266–2301.

61. Pereda, E.; Quiroga, R.Q.; Bhattacharya, J. Nonlinear multivariate analysis of neurophysiological signals. Prog. Neurobiol. 2005, 77, 1–37.

62. Groth, A. Visualization of coupling in time series by order recurrence plots. Phys. Rev. E 2005, 72, 046220:1–046220:8.


63. Cao, Y.; Tung, W.; Gao, J.B.; Protopopescu, V.A.; Hively, L.M. Detecting dynamical changes in time series using the permutation entropy. Phys. Rev. E 2004, 70, doi:10.1103/PhysRevE.70.046217.

64. Keller, K.; Wittfeld, K. Distances of time series components by means of symbolic dynamics. Int. J. Bifurc. Chaos 2004, 14, 693–703.

65. Veisi, I.; Pariz, N.; Karimpour, A. Fast and Robust Detection of Epilepsy in Noisy EEG Signals Using Permutation Entropy. In Proceedings of the 7th IEEE International Conference on Bioinformatics and Bioengineering, Boston, MA, USA, 14–17 October 2007; pp. 200–203.

66. Li, X.; Ouyang, G.; Richards, D.A. Predictability analysis of absence seizures with permutation entropy. Epilepsy Res. 2007, 77, 70–74.

67. Ouyang, G.; Li, X.; Dang, C.; Richards, D.A. Deterministic dynamics of neural activity during absence seizures in rats. Phys. Rev. E 2009, 79, 041146:1–041146:8.

68. Ouyang, G.; Dang, C.; Richards, D.A.; Li, X. Ordinal pattern based similarity analysis for EEG recordings. Clin. Neurophysiol. 2010, 121, 694–703.

69. Bruzzo, A.A.; Gesierich, B.; Santi, M.; Tassinari, C.; Birbaumer, N.; Rubboli, G. Permutation entropy to detect vigilance changes and preictal states from scalp EEG in epileptic patients: A preliminary study. Neurol. Sci. 2008, 29, 3–9.

70. Jordan, D.; Stockmanns, G.; Kochs, E.F.; Pilge, S.; Schneider, G. Electroencephalographic order pattern analysis for the separation of consciousness and unconsciousness: An analysis of approximate entropy, permutation entropy, recurrence rate, and phase coupling of order recurrence plots. Anesthesiology 2008, 109, 1014–1022.

71. Li, X.; Cui, S.M.E.; Voss, L.J. Using permutation entropy to measure the electroencephalographic effect of sevoflurane. Anesthesiology 2008, 109, 448–456.

72. Olofsen, E.; Sleigh, J.W.; Dahan, A. Permutation entropy of the electroencephalogram: A measure of anaesthetic drug effect. Br. J. Anaesth. 2008, 101, 810–821.

73. Silva, A.; Cardoso-Cruz, H.; Silva, F.; Galhardo, V.; Antunes, L. Comparison of anesthetic depth indexes based on thalamocortical local field potentials in rats. Anesthesiology 2010, 112, 355–363.

74. Silva, A.; Campos, S.; Monteiro, J.; Venancio, C.; Costa, B.; Guedes de Pinho, P.; Antunes, L. Performance of anesthetic depth indexes in rabbits under propofol anesthesia. Anesthesiology 2011, 115, 303–314.

75. Schinkel, S.; Marwan, N.; Kurths, J. Order patterns recurrence plots in the analysis of ERP data. Cogn. Neurodyn. 2007, 1, 317–325.

76. Schinkel, S.; Marwan, N.; Kurths, J. Brain signal analysis based on recurrences. J. Physiol. 2009, 103, 315–323.

77. Berg, S.; Luther, S.; Lehnart, S.E.; Hellenkamp, K.; Bauernschmitt, R.; Kurths, J.; Wessel, N.; Parlitz, U. Comparison of Features Characterizing Beat-to-Beat Time Series. In Proceedings of Biosignal, Berlin, Germany, 14–16 July 2010.

78. Nicolaou, N.; Georgiou, J. Detection of epileptic electroencephalogram based on Permutation Entropy and Support Vector Machines. Expert Syst. Appl. 2012, 39, 202–209.


79. Li, H.; Muskulus, M.; Heusdens, R.; Wolters, L. Analysis and Synthesis of Pseudo-Periodic Job Arrivals in Grids: A Matching Pursuit Approach. In Proceedings of the Seventh IEEE International Symposium on Cluster Computing and the Grid, Rio de Janeiro, Brazil, 14–17 May 2007; pp. 183–196.

80. Wendling, F.; Chauvel, P.; Biraben, A.; Bartolomei, F. From intracerebral EEG signals to brain connectivity: Identification of epileptogenic networks in partial epilepsy. Front. Syst. Neurosci. 2010, 4, doi:10.3389/fnsys.2010.00154.

81. Staniek, M.; Lehnertz, K. Symbolic transfer entropy. Phys. Rev. Lett. 2008, 100, 158101:1–158101:4.

82. Li, X.; Ouyang, G. Estimating coupling direction between neuronal populations with permutation conditional mutual information. Neuroimage 2010, 52, 497–507.

83. Li, D.; Li, X.; Liang, Z.; Voss, L.J.; Sleigh, J.W. Multiscale permutation entropy analysis of EEG recordings during sevoflurane anesthesia. J. Neural Eng. 2010, 7, doi:10.1088/1741-2560/7/4/046010.

84. Nicolaou, N.; Georgiou, J. Permutation Entropy: A New Feature for Brain-Computer Interfaces. In Proceedings of the IEEE Biomedical Circuits and Systems Conference, Paphos, Cyprus, 3–5 November 2010; pp. 49–52.

85. Zanin, M. Forbidden patterns in financial time series. Chaos 2008, 18, doi:10.1063/1.2841197.

86. Zunino, L.; Zanin, M.; Tabak, B.M.; Pérez, D.G.; Rosso, O.A. Forbidden patterns, permutation entropy and stock market inefficiency. Phys. A 2009, 388, 2854–2864.

87. Zunino, L.; Bariviera, A.F.; Guercio, M.B.; Martinez, L.B.; Rosso, O.A. On the efficiency of sovereign bond markets. Phys. A 2012, 391, 4342–4349.

88. Ruiz, M.C.; Guillamón, A.; Gabaldón, A. A new approach to measure volatility in energy markets. Entropy 2012, 14, 74–91.

89. Amigó, J.M.; Kennel, M.B.; Kocarev, L. The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems. Phys. D 2005, 210, doi:10.1016/j.physd.2005.07.006.

90. Saco, P.M.; Carpi, L.C.; Figliola, A.; Serrano, E.; Rosso, O.A. Entropy analysis of the dynamics of El Niño/Southern Oscillation during the Holocene. Phys. A 2010, 389, 5022–5027.

91. Rosso, O.A.; Zunino, L.; Pérez, D.G.; Figliola, A.; Larrondo, H.A.; Garavaglia, M.; Martín, M.T.; Plastino, A. Extracting features of Gaussian self-similar stochastic processes via the Bandt-Pompe approach. Phys. Rev. E 2007, 76, 061114:1–061114:6.

92. Zunino, L.; Pérez, D.G.; Martín, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A. Permutation entropy of fractional Brownian motion and fractional Gaussian noise. Phys. Lett. A 2008, 372, 4768–4774.

93. Zunino, L.; Pérez, D.G.; Kowalski, A.; Martín, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A. Fractional Brownian motion, fractional Gaussian noise, and Tsallis permutation entropy. Phys. A 2008, 387, 6057–6068.

94. Tiana-Alsina, J.; Torrent, M.C.; Rosso, O.A.; Masoller, C.; García-Ojalvo, J. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback. Phys. Rev. A 2010, 82, doi:10.1103/PhysRevA.82.013819.


95. Tiana-Alsina, J.; Buldú, J.M.; Torrent, M.C.; García-Ojalvo, J. Quantifying stochasticity in the dynamics of delay-coupled semiconductor lasers via forbidden patterns. Philos. Trans. Roy. Soc. A 2010, 368, 367–377.

96. Xiang, S.; Pan, W.; Yan, L.; Luo, B.; Zou, X.; Jiang, N.; Wen, K. Influence of polarization mode competition on chaotic unpredictability of vertical-cavity surface-emitting lasers with polarization-rotated optical feedback. Opt. Lett. 2011, 36, 310–312.

97. Zunino, L.; Rosso, O.A.; Soriano, M.C. Characterizing the hyperchaotic dynamics of a semiconductor laser subject to optical feedback via permutation entropy. IEEE J. Sel. Top. Quantum Electron. 2011, 17, 1250–1257.

98. Kowalski, A.M.; Martín, M.T.; Plastino, A.; Zunino, L. Information flow during the quantum-classical transition. Phys. Lett. A 2010, 374, 1819–1826.

99. Kowalski, A.M.; Martín, M.T.; Zunino, L.; Plastino, A.; Casas, M. The quantum-classical transition as an information flow. Entropy 2010, 12, 148–160.

100. Kowalski, A.M.; Martín, M.T.; Plastino, A.; Rosso, O.A. Chaos and complexity in the classical-quantum transition. Int. J. Appl. Math. Stat. 2012, 26, 67–80.

101. Suyal, V.; Prasad, A.; Singh, H.P. Hysteresis in a solar activity cycle. Solar Phys. 2012, 276, 407–414.

102. Ribeiro, H.V.; Zunino, L.; Mendes, R.S.; Lenzi, E.K. Complexity-entropy causality plane: A useful approach for distinguishing songs. Phys. A 2012, 391, 2421–2428.

103. Li, X.; Ouyang, G.; Liang, Z. Complexity measure of motor current signals for tool flute breakage detection in end milling. Int. J. Mach. Tools Manuf. 2008, 48, 371–379.

104. Nair, U.; Krishna, B.M.; Namboothiri, V.N.N.; Nampoori, V.P.N. Permutation entropy based real-time chatter detection using audio signal in turning process. Int. J. Adv. Manuf. Technol. 2010, 46, 61–68.

105. Sun, X.; Zou, Y.; Nikiforova, V.; Kurths, J.; Walther, D. The complexity of gene expression dynamics revealed by permutation entropy. BMC Bioinform. 2010, 11, doi:10.1186/1471-2105-11-607.

106. Larrondo, H.A.; González, C.M.; Martín, M.T.; Plastino, A.; Rosso, O.A. Intensive statistical complexity measure of pseudorandom number generators. Phys. A 2005, 356, 133–138.

107. Larrondo, H.A.; Martín, M.T.; González, C.M.; Plastino, A.; Rosso, O.A. Random number generators and causality. Phys. Lett. A 2006, 352, 421–425.

108. De Micco, L.; Larrondo, H.A.; Plastino, A.; Rosso, O. Quantifiers for randomness of chaotic pseudo-random number generators. Philos. Trans. Roy. Soc. A 2009, 367, 3281–3296.

109. De Micco, L.; Petrocelli, R.A.; Rosso, O.A.; Plastino, A.; Larrondo, H.A. Mixing chaotic maps and electromagnetic interference reduction. Int. J. Appl. Math. Stat. 2012, 26, 106–120.

110. De Micco, L.; Fernández, J.G.; Larrondo, H.A.; Plastino, A.; Rosso, O.A. Sampling period, statistical complexity, and chaotic attractors. Phys. A 2012, 391, 2564–2575.


111. Rosso, O.A.; Vicente, R.; Mirasso, C.R. Encryption test of pseudo-aleatory messages embedded on chaotic laser signals: An Information Theory approach. Phys. Lett. A 2008, 372, 1018–1023.

© 2012 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).