Quantum Information Theory Patrick Hayden (McGill) 4 August 2005, Canadian Quantum Information Summer School


Page 1:

Quantum Information Theory

Patrick Hayden (McGill)

4 August 2005, Canadian Quantum Information Summer School

Page 2:

Overview

Part I: What is information theory?
- Entropy, compression, noisy coding and beyond
- What does it have to do with quantum mechanics?

Noise in the quantum mechanical formalism
- Density operators, the partial trace, quantum operations

Some quantum information theory highlights

Part II: Resource inequalities
- A skeleton key

Page 3:

Information (Shannon) theory

A practical question: How to best make use of a given communications resource?

A mathematico-epistemological question: How to quantify uncertainty and information?

Shannon solved the first by considering the second: "A mathematical theory of communication" [1948]

Page 4:

Quantifying uncertainty

Entropy: H(X) = −Σₓ p(x) log₂ p(x)
- Proportional to the entropy of statistical physics
- Term suggested by von Neumann (more on him later)
- Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X, Y, etc.

Operational point of view…
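Before taking the operational view, the definition itself is easy to compute. A minimal Python sketch (the function name is mine, not from the lecture), checking the fair-coin value and the additivity axiom for independent variables:

```python
import numpy as np

def shannon_entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # drop zero-probability outcomes
    return float(-np.sum(p * np.log2(p)))

# A fair coin carries exactly one bit of uncertainty
h_coin = shannon_entropy([0.5, 0.5])

# Additivity for independent X, Y: H(X,Y) = H(X) + H(Y)
px, py = [0.5, 0.5], [0.9, 0.1]
pxy = np.outer(px, py).ravel()         # joint distribution of independent X, Y
h_additive = shannon_entropy(pxy)
```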

Page 5:

X₁X₂…Xₙ

Compression

Source of independent copies of X

{0,1}^n: 2^n possible strings

2^{nH(X)} typical strings

If X is binary: 0000100111010100010101100101, with about nP(X=0) 0's and nP(X=1) 1's

Can compress n copies of X to a binary string of length ~nH(X)

Page 6:

Typicality in more detail

Let x^n = x₁,x₂,…,xₙ with each xⱼ ∈ X. We say that x^n is δ-typical with respect to p(x) if:
- For all a ∈ X with p(a) > 0, |(1/n) N(a|x^n) − p(a)| < δ/|X|
- For all a ∈ X with p(a) = 0, N(a|x^n) = 0.

For δ > 0, the probability that a random string X^n is δ-typical goes to 1.

If x^n is δ-typical, then 2^{−n[H(X)+δ]} ≤ p(x^n) ≤ 2^{−n[H(X)−δ]}.

The number of δ-typical strings is bounded above by 2^{n[H(X)+δ]}.
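The frequency conditions above are directly checkable. A sketch (function name and example distribution are mine): a long i.i.d. string is typical with overwhelming probability, while the all-zeros string is not:

```python
import numpy as np

def is_delta_typical(xn, p, delta):
    """Check delta-typicality of sequence xn w.r.t. distribution p over {0,...,len(p)-1}."""
    n = len(xn)
    for a, pa in enumerate(p):
        count = int(np.sum(np.asarray(xn) == a))    # N(a|x^n)
        if pa > 0:
            if abs(count / n - pa) >= delta / len(p):
                return False
        elif count > 0:                              # zero-probability symbol appeared
            return False
    return True

rng = np.random.default_rng(0)
p = [0.75, 0.25]
xn = rng.choice(2, size=10000, p=p)                  # i.i.d. draws from p
typical = is_delta_typical(xn, p, 0.1)
atypical = is_delta_typical([0] * 100, p, 0.1)       # all zeros: frequency 1.0 vs 0.75
```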

Page 7:

Quantifying information

[Venn diagram: circles H(X) and H(Y) overlap; the regions are H(X|Y), I(X;Y), H(Y|X); the union is H(X,Y)]

Information is that which reduces uncertainty.

H(X|Y): uncertainty in X when the value of Y is known.

H(X|Y) = H(X,Y) − H(Y) = E_Y H(X|Y=y)

I(X;Y) = H(X) − H(X|Y) = H(X) + H(Y) − H(X,Y)
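Both identities can be verified mechanically from any joint distribution. A sketch (the 2×2 joint distribution is an invented example):

```python
import numpy as np

def H(p):
    """Shannon entropy (base 2) of any array of probabilities."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution p(x,y): rows index x, columns index y
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_X_given_Y = H(pxy) - H(py)          # H(X|Y) = H(X,Y) - H(Y)
I_XY = H(px) + H(py) - H(pxy)         # I(X;Y) = H(X) + H(Y) - H(X,Y)
```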

Page 8:

Sending information through noisy channels

Statistical model of a noisy channel: p(y|x)

m → Encoding → channel → Decoding → m′

Shannon's noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through the channel is given by the formula

C = max_{p(x)} I(X;Y)

Page 9:

Data processing inequality

[Diagram: Alice and Bob share the pair (X,Y) with mutual information I(X;Y); Alice processes X through p(z|x) to obtain Z, leaving the pair (Z,Y) with mutual information I(Z;Y)]

I(X;Y) ≥ I(Z;Y)

Page 10:

Optimality in Shannon’s theorem

m → Encoding → channel → Decoding → m′

Shannon's noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through the channel is C = max_{p(x)} I(X;Y).

Assume there exists a code with rate R and perfect decoding. Let M be the random variable corresponding to the uniform distribution over messages, X^n the channel inputs, and Y^n the outputs. Then:

nR = H(M)                      (M has nR bits of entropy)
   = I(M;M′)                   (perfect decoding: M = M′)
   ≤ I(M;Y^n)                  (data processing)
   ≤ I(X^n;Y^n)                (data processing)
   ≤ Σ_{j=1}^n I(Xⱼ;Yⱼ)        (some fiddling)
   ≤ n · max_{p(x)} I(X;Y)     (term by term)
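For a concrete channel, the optimization max_{p(x)} I(X;Y) can be done numerically. A sketch for the binary symmetric channel (the flip probability f = 0.11 is an arbitrary choice of mine), where the known answer C = 1 − H(f, 1−f) is achieved by the uniform input:

```python
import numpy as np

def h2(p):
    """Binary entropy H(p, 1-p) in bits."""
    if p <= 0 or p >= 1:
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_mutual_info(q, f):
    """I(X;Y) for a BSC with flip probability f and input distribution P(X=1) = q."""
    py1 = q * (1 - f) + (1 - q) * f    # P(Y=1)
    return h2(py1) - h2(f)             # I(X;Y) = H(Y) - H(Y|X)

f = 0.11
# Brute-force the maximization over input distributions
capacity = max(bsc_mutual_info(q, f) for q in np.linspace(0, 1, 1001))
```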

Page 11:

Shannon theory provides

Practically speaking: a holy grail for error-correcting codes

Conceptually speaking: an operationally motivated way of thinking about correlations

What's missing (for a quantum mechanic)? Features arising from the linear structure: entanglement and non-orthogonality

Page 12:

Quantum Shannon Theory provides

General theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits…

Relies on a major simplifying assumption: computation is free

Minor simplifying assumption: noise and data have regular structure

Page 13:

Before we get going: Some unavoidable formalism

We need quantum generalizations of: Probability distributions (density operators) Marginal distributions (partial trace) Noisy channels (quantum operations)

Page 14:

Mixing quantum states: The density operator

Draw |x⟩ with probability p(x), then perform a measurement {|0⟩, |1⟩}.

Probability of outcome j:

q_j = Σₓ p(x) |⟨j|x⟩|²
    = Σₓ p(x) tr[|j⟩⟨j| |x⟩⟨x|]
    = tr[|j⟩⟨j| ρ],   where ρ = Σₓ p(x) |x⟩⟨x|

The outcome probability is linear in ρ.
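The equality of the two expressions for q_j is easy to check numerically. A sketch (the ensemble {|0⟩ with probability 3/4, |+⟩ with probability 1/4} is an invented example):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

# Ensemble: draw |0> with probability 3/4, |+> with probability 1/4
probs, states = [0.75, 0.25], [ket0, plus]
rho = sum(p * np.outer(v, v) for p, v in zip(probs, states))

# Probability of outcome 0 in the {|0>,|1>} measurement, two equivalent ways
q0_direct = sum(p * abs(np.vdot(ket0, v)) ** 2 for p, v in zip(probs, states))
q0_rho = float(np.trace(np.outer(ket0, ket0) @ rho).real)
```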

Page 15:

Properties of the density operator

ρ is Hermitian: ρ† = [Σₓ p(x) |x⟩⟨x|]† = Σₓ p(x) [|x⟩⟨x|]† = ρ

ρ is positive semidefinite: ⟨φ|ρ|φ⟩ = Σₓ p(x) ⟨φ|x⟩⟨x|φ⟩ ≥ 0

tr[ρ] = 1: tr[ρ] = Σₓ p(x) tr[|x⟩⟨x|] = Σₓ p(x) = 1

Ensemble ambiguity: I/2 = ½[|0⟩⟨0| + |1⟩⟨1|] = ½[|+⟩⟨+| + |−⟩⟨−|]

Page 16:

The density operator: examples

Which of the following are density operators?

Page 17:

The partial trace

Suppose that ρ_AB is a density operator on A ⊗ B.

Alice measures {M_k} on A. The outcome probability is

q_k = tr[(M_k ⊗ I_B) ρ_AB]

Define ρ_A = tr_B[ρ_AB] = Σⱼ ⟨j|_B ρ_AB |j⟩_B. Then q_k = tr[M_k ρ_A].

ρ_A describes the outcome statistics for all possible experiments by Alice alone.
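The formula ρ_A = Σⱼ ⟨j|_B ρ_AB |j⟩_B is a one-liner with a reshape. A sketch (helper name is mine), applied to the maximally entangled two-qubit state, whose reduced state is I/2:

```python
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """rho_A = tr_B[rho_AB] = sum_j <j|_B rho_AB |j>_B."""
    # Reshape to indices (a, b, a', b') and contract b with b'
    return rho_AB.reshape(dA, dB, dA, dB).trace(axis1=1, axis2=3)

# (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi)

rho_A = partial_trace_B(rho_AB, 2, 2)
```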

Page 18:

Purification

Suppose that ρ_A is a density operator on A. Diagonalize ρ_A = Σᵢ λᵢ |i⟩⟨i|.

Let |φ⟩ = Σᵢ λᵢ^{1/2} |i⟩_A |i⟩_B. Note that ρ_A = tr_B[|φ⟩⟨φ|]: we say |φ⟩ is a purification of ρ_A.

Symmetry: ρ_A = tr_B[|φ⟩⟨φ|] and ρ_B = tr_A[|φ⟩⟨φ|] have the same non-zero eigenvalues.
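The construction is directly implementable: diagonalize, then attach a matching basis vector on B for each eigenvector. A sketch (function name mine; the diagonal ρ_A is an invented example):

```python
import numpy as np

def purify(rho_A):
    """Return |phi> on A (x) B with tr_B|phi><phi| = rho_A, via the eigendecomposition."""
    lam, vecs = np.linalg.eigh(rho_A)
    d = rho_A.shape[0]
    phi = np.zeros(d * d, dtype=complex)
    for i in range(d):
        if lam[i] > 1e-12:
            phi += np.sqrt(lam[i]) * np.kron(vecs[:, i], np.eye(d)[:, i])
    return phi

rho_A = np.diag([0.75, 0.25])
phi = purify(rho_A)

# Tracing out B recovers rho_A
rho_back = np.outer(phi, phi.conj()).reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
# Symmetry: rho_B has the same non-zero eigenvalues as rho_A
rho_B = np.outer(phi, phi.conj()).reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)
```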

Page 19:

Quantum (noisy) channels: analogs of p(y|x)

What reasonable constraints might such a channel Λ: A → B satisfy?

1) Take density operators to density operators
2) Convex linearity: a mixture of input states should be mapped to a corresponding mixture of output states

All such maps can, in principle, be realized physically. Requirement 1) must be interpreted very strictly: we require that (Λ ⊗ I_C)(ρ_AC) always be a density operator too.

This doesn't come for free! Let T be the transpose map on A. If |φ⟩ = (|00⟩_AC + |11⟩_AC)/2^{1/2}, then (T ⊗ I_C)(|φ⟩⟨φ|) has negative eigenvalues.

The resulting set of transformations on density operators is known as the trace-preserving, completely positive maps.
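The transpose counterexample is easy to verify numerically. A sketch computing the eigenvalues of (T ⊗ I)(|φ⟩⟨φ|) for the maximally entangled state:

```python
import numpy as np

# |phi> = (|00> + |11>)/sqrt(2) on A (x) C
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)

# Transpose system A only: reorder indices (a, c, a', c') -> (a', c, a, c')
rho4 = rho.reshape(2, 2, 2, 2)
pt = rho4.transpose(2, 1, 0, 3).reshape(4, 4)

eigs = np.linalg.eigvalsh(pt)   # the partial transpose need not be positive
```

The smallest eigenvalue comes out to −1/2, so the transpose, though positive on density operators of A alone, is not completely positive.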

Page 20:

Quantum channels: examples

Adjoining an ancilla: ρ → ρ ⊗ |0⟩⟨0|
Unitary transformations: ρ → UρU†
Partial trace: ρ_AB → tr_B[ρ_AB]

That's it! All channels can be built out of these operations:

[Circuit: adjoin an ancilla |0⟩, apply a joint unitary U, then trace out a subsystem]

Page 21:

Further examples

The depolarizing channel: ρ → (1−p)ρ + p I/2

The dephasing channel: ρ → Σⱼ ⟨j|ρ|j⟩ |j⟩⟨j|

The dephasing channel is equivalent to measuring {|j⟩} and then forgetting the outcome.
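Both channels are two-line functions on 2×2 density matrices. A sketch (function names mine), applying them to |+⟩⟨+|, which the dephasing channel sends to I/2:

```python
import numpy as np

def depolarize(rho, p):
    """(1-p) rho + p I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

def dephase(rho):
    """sum_j <j|rho|j> |j><j|: keep only the diagonal in the computational basis."""
    return np.diag(np.diag(rho))

plus = np.array([[0.5, 0.5],
                 [0.5, 0.5]])   # |+><+|
```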

Page 22:

One last thing you should see...

What happens if a measurement is preceded by a general quantum operation?

Leads to more general types of measurements: positive operator-valued measures (forevermore, POVMs)

{M_k} such that M_k ≥ 0 and Σ_k M_k = I. The probability of outcome k is tr[M_k ρ].

Page 23:

POVMs: What are they good for?

Try to distinguish |ψ₀⟩ = |0⟩ and |ψ₁⟩ = |+⟩ = (|0⟩+|1⟩)/2^{1/2}.

The states are non-orthogonal, so projective measurements won't work.

Let N = 1/(1 + 1/2^{1/2}).

Exercise: M₀ = N|1⟩⟨1|, M₁ = N|−⟩⟨−|, M₂ = I − M₀ − M₁ is a POVM.

Note:
- Outcome 0 implies the state was |ψ₁⟩
- Outcome 1 implies the state was |ψ₀⟩
- Outcome 2 is inconclusive

Instead of imperfect distinguishability all of the time, the POVM provides perfect distinguishability some of the time.
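The exercise can be checked directly: the three operators are positive and sum to the identity, and outcomes 0 and 1 never fire on the wrong state. A sketch:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

N = 1 / (1 + 1 / np.sqrt(2))
M0 = N * np.outer(ket1, ket1)
M1 = N * np.outer(minus, minus)
M2 = np.eye(2) - M0 - M1

def prob(M, psi):
    """tr[M |psi><psi|] for a pure state psi."""
    return float(psi @ M @ psi)

# Conclusive identification rate when the state is |psi_0> = |0>
p_conclusive = prob(M1, ket0)
```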

Page 24:

Notions of distinguishability

Basic requirement: quantum channels do not increase "distinguishability".

Fidelity: F(ρ,σ) = max |⟨φ_ρ|φ_σ⟩|² = [tr(ρ^{1/2} σ ρ^{1/2})^{1/2}]², where the max is over purifications |φ_ρ⟩ of ρ and |φ_σ⟩ of σ
- F = 0 for perfectly distinguishable states; F = 1 for identical states
- F(Λ(ρ), Λ(σ)) ≥ F(ρ,σ)

Trace distance: T(ρ,σ) = ‖ρ − σ‖₁
- T = 2 for perfectly distinguishable states; T = 0 for identical states
- T(ρ,σ) = 2 max |p(k=0|ρ) − p(k=0|σ)|, where the max is over measurements {M_k}
- T(ρ,σ) ≥ T(Λ(ρ), Λ(σ))

Statements made today hold for both measures.
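For pure states both measures are simple to compute, and monotonicity under a channel can be seen concretely. A sketch using |0⟩ versus |+⟩ and the dephasing channel from the previous slide (for pure states, T = 2·sqrt(1 − F)):

```python
import numpy as np

def trace_distance(rho, sigma):
    """T(rho, sigma) = ||rho - sigma||_1: sum of absolute eigenvalues of the difference."""
    return float(np.abs(np.linalg.eigvalsh(rho - sigma)).sum())

def dephase(r):
    return np.diag(np.diag(r))

ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho, sigma = np.outer(ket0, ket0), np.outer(plus, plus)

F = abs(np.vdot(ket0, plus)) ** 2                        # fidelity of two pure states
T = trace_distance(rho, sigma)
T_after = trace_distance(dephase(rho), dephase(sigma))   # a channel can only decrease T
```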

Page 25:

Back to information theory!

Page 26:

Quantifying uncertainty

Let ρ = Σₓ p(x) |x⟩⟨x| be a density operator.

von Neumann entropy: H(ρ) = −tr[ρ log ρ]
- Equal to the Shannon entropy of the eigenvalues of ρ

Analog of a joint random variable: ρ_AB describes a composite system A ⊗ B

H(A)_ρ = H(ρ_A) = H(tr_B ρ_AB)

Page 27:

Quantifying uncertainty: examples

H(|φ⟩⟨φ|) = 0
H(I/2) = 1
H(ρ ⊗ σ) = H(ρ) + H(σ)
H(I/2^n) = n
H(pρ ⊕ (1−p)σ) = H(p, 1−p) + pH(ρ) + (1−p)H(σ)

Page 28:

Compression

Source of independent copies of ρ_AB: [diagram: n copies, systems A A A … over B B B …]

dim(effective supp of ρ_B^⊗n) ~ 2^{nH(B)} (aka the typical subspace)

Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A.

No statistical assumptions: just quantum mechanics! [Schumacher, Petz]

Page 29:

The typical subspace

Diagonalize ρ = Σₓ p(x) |eₓ⟩⟨eₓ|. Then

ρ^⊗n = Σ_{x^n} p(x^n) |e_{x^n}⟩⟨e_{x^n}|

The δ-typical projector Π_t is the projector onto the span of the |e_{x^n}⟩ such that x^n is δ-typical.

tr[ρ^⊗n Π_t] → 1 as n → ∞

Page 30:

Quantifying information

[Venn diagram: circles H(A) and H(B) overlap; the regions are H(A|B) and H(B|A); the union is H(AB)]

Uncertainty in A when the value of B is known?

H(A|B) = H(AB) − H(B)

Example: |φ⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/2^{1/2}, so ρ_B = I/2 and

H(A|B) = 0 − 1 = −1

Conditional entropy can be negative!
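The sign flip is concrete: for the maximally entangled state, H(AB) = 0 while H(B) = 1. A numerical sketch:

```python
import numpy as np

def vn_entropy(rho):
    """H(rho) = -tr[rho log2 rho] = Shannon entropy of the eigenvalues."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

# |phi>_AB = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_AB = np.outer(phi, phi)
rho_B = rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)   # trace out A

H_A_given_B = vn_entropy(rho_AB) - vn_entropy(rho_B)   # H(A|B) = H(AB) - H(B)
```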

Page 31:

Quantifying information

[Venn diagram: circles H(A) and H(B) overlap; the regions are H(A|B), I(A;B), H(B|A); the union is H(AB)]

Information is that which reduces uncertainty.

H(A|B) = H(AB) − H(B)

I(A;B) = H(A) − H(A|B) = H(A) + H(B) − H(AB) ≥ 0

Page 32:

Sending classical information through noisy channels

Physical model of a noisy channel: a trace-preserving, completely positive map 𝒩

m → Encoding (prepare a state) → 𝒩^⊗n → Decoding (measurement) → m′

HSW noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through 𝒩 is given by the (regularization of the) formula

C(𝒩) = max_{{p(x), ρₓ}} I(X;B)

where the mutual information is evaluated on the classical-quantum state σ_XB = Σₓ p(x) |x⟩⟨x|_X ⊗ 𝒩(ρₓ).

Page 33:

Sending classical information through noisy channels

m → Encoding (prepare a state) → Decoding (measurement) → m′

[Packing diagram: the output space B^⊗n has effective dimension 2^{nH(B)}; each codeword X₁, X₂, …, Xₙ produces an output cloud of size 2^{nH(B|A)}, so roughly 2^{n[H(B)−H(B|A)]} = 2^{nI(A;B)} clouds can be packed]

Page 34:

Sending classical information through noisy channels (continued)

As on the previous slide, pack ~2^{nI(A;B)} output clouds of size 2^{nH(B|A)} into the 2^{nH(B)}-dimensional output space, then distinguish them using a well-chosen POVM.

Page 35:

Data processing inequality (strong subadditivity)

[Diagram: Alice and Bob share a state ρ_AB with correlations I(A;B); Alice applies a local channel (a unitary U with an ancilla, then discarding) to obtain ρ_A′B]

I(A;B) ≥ I(A′;B)

Page 36:

Optimality in the HSW theorem

m → Encoding (prepare a state) → 𝒩^⊗n → Decoding (measurement) → m′

Assume there exists a code with rate R and perfect decoding. Let M be the random variable corresponding to the uniform distribution over messages. Then:

nR = H(M)        (M has nR bits of entropy)
   = I(M;M′)     (perfect decoding: M = M′)
   ≤ I(A;B)      (data processing)

where the mutual information is evaluated on the joint state of the message register and the channel outputs, Σₘ 2^{−nR} |m⟩⟨m| ⊗ 𝒩^⊗n(ρₘ).

Page 37:

Sending quantum information through noisy channels

Physical model of a noisy channel: a trace-preserving, completely positive map 𝒩

|ψ⟩ ∈ C^d → Encoding (TPCP map) → 𝒩^⊗n → Decoding (TPCP map)

LSD noisy coding theorem: in the limit of many uses, the optimal rate at which Alice can reliably send qubits to Bob ((1/n) log d) through 𝒩 is given by the (regularization of the) formula

Q(𝒩) = max_ρ [H(B) − H(AB)] = max_ρ [−H(A|B)]

Conditional entropy! (Evaluated on the state (I_A ⊗ 𝒩)(φ_AB), with φ_AB a purification of the input ρ.)

Page 38:

Entanglement and privacy: More than an analogy

x = x₁x₂…xₙ → p(y,z|x) → y = y₁y₂…yₙ (Bob), z = z₁z₂…zₙ (Eve)

How to send a private message from Alice to Bob?

[AC93] Can send private messages at rate I(X;Y) − I(X;Z)

[Diagram: among all strings x, choose a random subset of 2^{n(I(X;Y)−δ)} x's, partitioned into sets of size 2^{n(I(X;Z)+δ)}]

Page 39:

Entanglement and privacy: More than an analogy

|x⟩_A′ → U_{A′→BE}^⊗n → |φₓ⟩_BE = U^⊗n|x⟩

How to send a private message from Alice to Bob?

[D03] Can send private messages at rate I(X;B) − I(X;E)

[Diagram: among all strings x, choose a random subset of 2^{n(I(X;B)−δ)} x's, partitioned into sets of size 2^{n(I(X;E)+δ)}]

Page 40:

Entanglement and privacy: More than an analogy

Σₓ pₓ^{1/2} |x⟩_A |x⟩_A′ → U_{A′→BE}^⊗n → Σₓ pₓ^{1/2} |x⟩_A |φₓ⟩_BE

How to send a private message from Alice to Bob?

[SW97, D03] Can send private messages at rate I(X;B) − I(X;E) = H(B) − H(E)

Since the global state is pure, H(E) = H(AB), so the private rate equals H(B) − H(AB): the coherent information from the LSD theorem.

[Diagram: random subset of 2^{n(I(X;B)−δ)} x's, partitioned into sets of size 2^{n(I(X;E)+δ)}]

Page 41:

Conclusions: Part I

Information theory can be generalized to analyze quantum information processing

Yields a rich theory with surprising conceptual simplicity

An operational approach to thinking about quantum mechanics: compression, data transmission, superdense coding, subspace transmission, teleportation

Page 42:

Some references:

Part I: Standard textbooks:
- Cover & Thomas, Elements of Information Theory.
- Nielsen & Chuang, Quantum Computation and Quantum Information (and references therein).
- Devetak, The private classical capacity and quantum capacity of a quantum channel, quant-ph/0304127.

Part II: Papers available at arxiv.org:
- Devetak, Harrow & Winter, A family of quantum protocols, quant-ph/0308044.
- Horodecki, Oppenheim & Winter, Quantum information can be negative, quant-ph/0505062.

Page 43: