
R C Ball, Physics Theory Group and Centre for Complexity Science, University of Warwick

R S MacKay, Maths

M Diakonova, Physics & Complexity

Emergence in Quantitative Systems – towards a measurable definition

Input ideas:

Shannon: Information -> Entropy; transmission -> Mutual Information

Crutchfield: Complexity <-> Information

MacKay: Emergence = system evolves to non-unique state

Emergence in Quantitative Systems – towards a measurable definition

Emergence measure: Persistent Mutual Information across time.

Work in progress …. still mostly ideas.

Emergent Behaviour?

• System + Dynamics
• Many internal d.o.f. and/or observe over long times
• Properties: averages, correlation functions
• Multiple realisations (conceptually)

Emergent properties: behaviour which is predictable (from prior observations) but not foreseeable (from previous realisations).

[Schematic: many realisations plotted against time, converging onto shared statistical properties]

Strong emergence: different realisations (can) differ for ever

MacKay: non-unique Gibbs phase (distribution over configurations for a dynamical system)

Physics example: spontaneous symmetry breaking

system makes/inherits one of many equivalent choices of how to order

This is fine once you have achieved the insight that there is ordering (maybe a heat capacity anomaly?) and know what ordering to look for (no general technique).

Entropy & Mutual Information Shannon 1948

Mutual information as missing entropy:

I_{AB} = S_A + S_B - S_{AB} = \sum_{ij} p^{AB}_{ij} \log\frac{p^{AB}_{ij}}{p^A_i \, p^B_j} = \log\frac{N_A N_B}{N_{AB}}

Entropy as (logarithm of) variety of outcomes

S_A = -\sum_{i \in A} p_i \log(p_i) = \langle \log(1/p) \rangle_A = \log N_A

[Diagram: outcome sets A and B with varying degrees of overlap, illustrating I > 0 versus I = 0]

- reduction of joint possibilities compared to independent case;

- measure of information transmission when A = input, B = output.
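These definitions translate directly into a plug-in estimate from observed symbol counts. A minimal Python sketch (the function names and the toy check are illustrative additions, not from the talk):

```python
import numpy as np
from collections import Counter

def entropy(symbols):
    """Shannon entropy S = sum_i p_i log(1/p_i), in nats, from observed counts."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def mutual_information(a, b):
    """I_AB = S_A + S_B - S_AB, estimated from paired observations of A and B."""
    joint = list(zip(a, b))
    return entropy(a) + entropy(b) - entropy(joint)

# Toy check: if B copies A, I = S_A (= log 2 for a fair bit); if independent, I ~ 0.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 100_000)
b = rng.integers(0, 2, 100_000)
print(mutual_information(a, a))   # ~ 0.693 = log 2
print(mutual_information(a, b))   # ~ 0
```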

MI-based Measures of Complexity

[Diagram: a series in time (or space) split into a past block A and a future block B; each measure below is related to I_AB]

• Entropy density (rate): Shannon?
• Excess Entropy: Crutchfield & Packard 1982
• Persistent Mutual Information: candidate measure of Emergence
• Statistical Complexity: Shalizi et al, PRL 2004 (space)

Measurement of Persistent MI

\mathrm{PMI} = \lim_{t \to \infty} \; \lim_{t_0 \to \infty} \; I\big[\, x(-t_0, 0),\; x(t, t + t_0) \,\big]

• Measurement of I itself requires converting the data to a string of discrete symbols (e.g. bits)

• The above seems the safer order of limits, and is computationally practical

• The outer limit may need more careful definition
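In practice this can be estimated roughly as follows (a sketch only: the symbolisation thresholds, window lengths and gap are illustrative choices, and mutual_information is the estimator from the previous sketch):

```python
import numpy as np

def symbolise(x, thresholds):
    """Convert a real-valued series into discrete symbols by binning at thresholds."""
    return np.digitize(x, thresholds)

def pmi_estimate(realisations, past_len, gap, future_len, thresholds):
    """Estimate persistent MI: mutual information between a past window and a
    future window separated by `gap` steps, pooled over independent realisations."""
    pasts, futures = [], []
    for x in realisations:
        s = symbolise(np.asarray(x), thresholds)
        pasts.append(tuple(s[:past_len]))
        futures.append(tuple(s[past_len + gap: past_len + gap + future_len]))
    # mutual_information() is the plug-in estimator from the previous sketch
    return mutual_information(pasts, futures)
```

With long windows the joint strings become under-sampled and the plug-in estimate is biased upwards, which is the window-length issue illustrated later.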

Examples with PMI

• Oscillation (persistent phase)

• Spontaneous ordering (magnets)

• Ergodicity breaking (spin glasses) – pattern is random but aspects become frozen in over time

Cases without PMI

• Reproducible steady state

• Chaotic dynamics

PMI = 0

Logistic map: x_{n+1} = r x_n (1 - x_n)

[Figure: bifurcation diagram of x against r for the logistic map, annotated with PMI values 0, log 2, log 4, log 8, log 3, log 4, log 2 across the range of r]
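The logistic map gives a concrete test: on a periodic branch the emergent "choice" is the phase of the cycle a realisation freezes into, so PMI between well-separated windows should approach the logarithm of the period. A rough sketch reusing pmi_estimate from above (the value of r, the windows, the gap and the symbol threshold are illustrative):

```python
import numpy as np

def logistic_realisations(r, n_real=5000, transient=1000, length=200, seed=1):
    """Independent realisations of x_{n+1} = r x_n (1 - x_n) from random initial
    conditions, recorded after discarding a transient."""
    rng = np.random.default_rng(seed)
    realisations = []
    for _ in range(n_real):
        x = rng.uniform(0.01, 0.99)
        for _ in range(transient):
            x = r * x * (1.0 - x)
        traj = np.empty(length)
        for n in range(length):
            traj[n] = x
            x = r * x * (1.0 - x)
        realisations.append(traj)
    return realisations

# Period-2 regime (r = 3.2): each realisation freezes into one of two phases of the
# 2-cycle (values ~0.513 and ~0.800), so PMI between well-separated windows should
# approach log 2 ~ 0.69 nats.  The threshold 0.65 separates the two cycle values.
reals = logistic_realisations(r=3.2)
print(pmi_estimate(reals, past_len=2, gap=100, future_len=2, thresholds=(0.65,)))
```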

Issue of time windows and limits

[Figure: estimated PMI / log 2 for the logistic map at r = 3.58 (where PMI / log 2 = 2), as a function of the length of the "present" gap and of the past and future windows; short-time correlations contaminate the estimate at small gaps, and long strings are under-sampled]

[Figure: first direct measurements of PMI / ln 2 as a function of r for the logistic map]

Discrete vs continuous emergent order parameters

Discrete order parameters are well resolved beyond threshold values of the resolution \sigma/\Delta (measurement error relative to the spacing of values).

Resolution of continuous order parameters might improve without limit, e.g. \sigma \propto \tau^{-1/2} (time averaging), giving PMI \simeq \frac{1}{2}\log\tau + \mathrm{const}.

This suggests some need to anticipate “information dimensionalities”
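A back-of-envelope version of the continuous case (assuming, for illustration, a Gaussian measurement error that shrinks with averaging time \tau; \Delta is the range of the order parameter and \sigma_0 the error at \tau = 1):

```latex
% Continuous order parameter of range \Delta, measured after time-averaging over
% a window \tau with Gaussian error \sigma(\tau) = \sigma_0/\sqrt{\tau}.
% The number of distinguishable values grows like \Delta/\sigma(\tau), so
\[
  \mathrm{PMI}(\tau) \simeq \log\frac{\Delta}{\sigma(\tau)}
                     = \frac{1}{2}\log\tau + \log\frac{\Delta}{\sigma_0} ,
\]
% which grows without bound in \tau, whereas a discrete order parameter saturates
% at the logarithm of its number of states.
```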

A definition of Emergence
• System self-organises into a non-trivial behaviour;
• there are different possible instances of that behaviour;
• the choice is unpredictable but
• it persists over time (or other extensive coordinate).

• Quantified by PMI = entropy of choice

Shortcomings

• Assumes system/experiment conceptually repeatable
• Measuring MI requires deep sampling
• Appropriate mathematical limits need careful construction

Generalisations
• Admit PMI as function of timescale probed
• Other extensive coordinates could play the role of time