
Recoverability in quantum information theory

Mark M. Wilde

Hearne Institute for Theoretical Physics, Department of Physics and Astronomy, Center for Computation and Technology,

Louisiana State University, Baton Rouge, Louisiana, USA

[email protected]

Tulane Quantum Information Seminar, July 17, 2015, New Orleans, Louisiana, USA

Based on arXiv:1505.04661

Mark M. Wilde (LSU) 1 / 23

Main message

Entropy inequalities established in the 1970s are a mathematical consequence of the postulates of quantum physics

They are helpful in determining the ultimate limits on many physical processes (communication, thermodynamics, uncertainty relations)

Many of these entropy inequalities are equivalent to each other, so we can say that together they constitute a fundamental law of quantum information theory

There has been recent interest in refining these inequalities, trying to understand how well one can reverse an irreversible physical process

This talk discusses progress in this direction

Mark M. Wilde (LSU) 2 / 23


Background — entropies

Umegaki relative entropy [Ume62]

The quantum relative entropy is a measure of dissimilarity between two quantum states. It is defined for a state ρ and a positive semi-definite operator σ as

D(ρ‖σ) ≡ Tr{ρ [log ρ − log σ]}

whenever supp(ρ) ⊆ supp(σ), and +∞ otherwise

Physical interpretation with quantum Stein’s lemma [HP91, NO00]

Given are n quantum systems, all prepared in either the state ρ or the state σ. Under a constraint of ε ∈ (0, 1) on the Type I error of misidentifying ρ, the optimal error exponent for the Type II error of misidentifying σ is D(ρ‖σ).

Mark M. Wilde (LSU) 3 / 23
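As a quick numerical check (a NumPy/SciPy sketch of my own, not part of the slides), the definition above can be evaluated directly for qubit states with full support:

```python
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    """Umegaki relative entropy D(rho||sigma) = Tr{rho [log rho - log sigma]}, in nats.
    Assumes supp(rho) is contained in supp(sigma); otherwise D = +infinity."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Example: a biased qubit state vs. the maximally mixed state.
rho = np.diag([0.9, 0.1])
sigma = np.eye(2) / 2
d = relative_entropy(rho, sigma)
print(d)  # positive, and zero iff rho == sigma
```

For commuting states this reduces to the classical Kullback–Leibler divergence of the eigenvalue distributions, which is what the example above computes.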


Background — entropies

Relative entropy as “mother” entropy

Many important entropies can be written in terms of relative entropy:

H(A)ρ ≡ −D(ρA‖IA) (entropy)

H(A|B)ρ ≡ −D(ρAB‖IA ⊗ ρB) (conditional entropy)

I(A;B)ρ ≡ D(ρAB‖ρA ⊗ ρB) (mutual information)

I(A;B|C)ρ ≡ D(ρABC‖ exp{log ρAC + log ρBC − log ρC}) (cond. MI)

Equivalences

H(A|B)ρ = H(AB)ρ − H(B)ρ

I(A;B)ρ = H(A)ρ + H(B)ρ − H(AB)ρ

I(A;B)ρ = H(B)ρ − H(B|A)ρ

I(A;B|C)ρ = H(AC)ρ + H(BC)ρ − H(ABC)ρ − H(C)ρ

I(A;B|C)ρ = H(B|C)ρ − H(B|AC)ρ

Mark M. Wilde (LSU) 4 / 23
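These identities are easy to check numerically. The following sketch (NumPy; the helper names are mine, not from the talk) verifies H(A|B) = H(AB) − H(B) and I(A;B) = H(A) + H(B) − H(AB) for a classically correlated two-qubit state:

```python
import numpy as np

def entropy(rho):
    """von Neumann entropy H(rho) = -Tr{rho log2 rho}, in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def partial_trace(rho_ab, dA, dB, keep):
    """Partial trace of a (dA*dB)x(dA*dB) density matrix; keep is 'A' or 'B'."""
    r = rho_ab.reshape(dA, dB, dA, dB)
    return np.einsum('ijkj->ik', r) if keep == 'A' else np.einsum('ijil->jl', r)

# Classically correlated state: (|00><00| + |11><11|)/2
rho_ab = np.zeros((4, 4))
rho_ab[0, 0] = rho_ab[3, 3] = 0.5
rho_a = partial_trace(rho_ab, 2, 2, 'A')
rho_b = partial_trace(rho_ab, 2, 2, 'B')

H_cond = entropy(rho_ab) - entropy(rho_b)                 # H(A|B)
I_ab = entropy(rho_a) + entropy(rho_b) - entropy(rho_ab)  # I(A;B)
print(H_cond, I_ab)
```

For this state H(AB) = H(A) = H(B) = 1 bit, so H(A|B) = 0 and I(A;B) = 1 bit, matching the chain-rule identities on the slide.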


Fundamental law of quantum information theory

Monotonicity of quantum relative entropy [Lin75, Uhl77]

Let ρ be a state, let σ be positive semi-definite, and let N be a quantum channel. Then

D(ρ‖σ) ≥ D(N(ρ)‖N(σ))

“Distinguishability does not increase under a physical process.” This characterizes a fundamental irreversibility in any physical process.

Proof approaches

Lieb concavity theorem [L73]

relative modular operator method (see, e.g., [NP04])

quantum Stein’s lemma [BS03]

Mark M. Wilde (LSU) 5 / 23
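A minimal numerical illustration (my own sketch, assuming a qubit depolarizing channel; not part of the slides): the relative entropy between two states can only shrink under the channel's action.

```python
import numpy as np
from scipy.linalg import logm

def rel_ent(rho, sigma):
    """D(rho||sigma) in nats, for states with full support."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

def depolarize(rho, p):
    """Qubit depolarizing channel N(rho) = (1-p) rho + p I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

rho = np.diag([0.95, 0.05])
sigma = np.diag([0.3, 0.7])
before = rel_ent(rho, sigma)
after = rel_ent(depolarize(rho, 0.5), depolarize(sigma, 0.5))
print(before, after)  # monotonicity: before >= after
```

Both outputs are non-negative, and the inequality before ≥ after holds for any choice of states and any channel, per the theorem above.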


Strong subadditivity

Strong subadditivity [LR73]

Let ρABC be a tripartite quantum state. Then

I(A;B|C)ρ ≥ 0

Equivalent statements (by definition)

The entropy sum of two individual systems is at least as large as the entropy sum of their union and intersection:

H(AC)ρ + H(BC)ρ ≥ H(ABC)ρ + H(C)ρ

Conditional entropy does not decrease under the loss of system A:

H(B|C)ρ ≥ H(B|AC)ρ

Mark M. Wilde (LSU) 6 / 23
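One can probe strong subadditivity numerically on random states (a sketch; the random-state construction and seed are my own choices, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(7)

def random_state(d):
    """Random density matrix via the construction G G^dag / Tr{G G^dag}."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

def H(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

rho_abc = random_state(8)                              # three qubits A, B, C
t = rho_abc.reshape(2, 2, 2, 2, 2, 2)                  # legs (a,b,c; a',b',c')
rho_ac = np.einsum('abcdbf->acdf', t).reshape(4, 4)    # trace out B
rho_bc = np.einsum('abcaef->bcef', t).reshape(4, 4)    # trace out A
rho_c = np.einsum('abcabf->cf', t)                     # trace out A and B
cmi = H(rho_ac) + H(rho_bc) - H(rho_abc) - H(rho_c)    # I(A;B|C)
print(cmi)
```

For every sampled state the conditional mutual information comes out non-negative, as strong subadditivity demands.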


Equality conditions [Pet86, Pet88]

When does equality in monotonicity of relative entropy hold?

D(ρ‖σ) = D(N(ρ)‖N(σ)) iff ∃ a recovery map R^P_{σ,N} such that

ρ = (R^P_{σ,N} ◦ N)(ρ),   σ = (R^P_{σ,N} ◦ N)(σ)

This “Petz” recovery map has the following explicit form [HJPW04]:

R^P_{σ,N}(ω) ≡ σ^{1/2} N†((N(σ))^{−1/2} ω (N(σ))^{−1/2}) σ^{1/2}

Classical case: Distributions pX and qX and a channel N(y|x). Then the Petz recovery map R^P(x|y) is given by the Bayes theorem:

R^P(x|y) qY(y) = N(y|x) qX(x)

where qY(y) ≡ Σ_x N(y|x) qX(x)

Mark M. Wilde (LSU) 7 / 23
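In the classical case, the Petz map is exactly Bayesian inversion with respect to qX. A small sketch (NumPy; the particular channel and distribution are illustrative assumptions):

```python
import numpy as np

N = np.array([[0.9, 0.3],    # N[y, x] = N(y|x); each column sums to 1
              [0.1, 0.7]])
qX = np.array([0.6, 0.4])
qY = N @ qX                  # qY(y) = sum_x N(y|x) qX(x)

# Petz/Bayes map: R[x, y] = N(y|x) qX(x) / qY(y)
R = (N * qX[None, :]).T / qY[None, :]

print(R @ qY)         # recovers qX exactly
print(R.sum(axis=0))  # each column of R is a conditional distribution
```

Feeding qY back through R returns qX exactly, mirroring the quantum statement that the Petz map perfectly recovers σ from N(σ).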


More on Petz recovery map

Linear, completely positive by inspection, and trace non-increasing because

Tr{R^P_{σ,N}(ω)} = Tr{σ^{1/2} N†((N(σ))^{−1/2} ω (N(σ))^{−1/2}) σ^{1/2}}
= Tr{σ N†((N(σ))^{−1/2} ω (N(σ))^{−1/2})}
= Tr{N(σ) (N(σ))^{−1/2} ω (N(σ))^{−1/2}}
≤ Tr{ω}

For N(σ) positive definite, the map perfectly recovers σ from N(σ):

R^P_{σ,N}(N(σ)) = σ^{1/2} N†((N(σ))^{−1/2} N(σ) (N(σ))^{−1/2}) σ^{1/2}
= σ^{1/2} N†(I) σ^{1/2}
= σ

Mark M. Wilde (LSU) 8 / 23
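The calculation above can be reproduced in code. The sketch below builds the Petz map for a qubit amplitude-damping channel (my choice of channel, purely illustrative) and checks that it recovers σ from N(σ) exactly:

```python
import numpy as np
from scipy.linalg import sqrtm

gamma = 0.4
kraus = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]]),   # amplitude damping
         np.array([[0, np.sqrt(gamma)], [0, 0]])]

def channel(w):   # N(w) = sum_i K_i w K_i^dag
    return sum(K @ w @ K.conj().T for K in kraus)

def adjoint(w):   # N^dag(w) = sum_i K_i^dag w K_i
    return sum(K.conj().T @ w @ K for K in kraus)

sigma = np.diag([0.7, 0.3])
Nsigma = channel(sigma)
Ns_inv_sqrt = np.linalg.inv(sqrtm(Nsigma))
s_sqrt = sqrtm(sigma)

def petz(w):
    """R^P_{sigma,N}(w) = sigma^{1/2} N^dag(N(sigma)^{-1/2} w N(sigma)^{-1/2}) sigma^{1/2}."""
    return s_sqrt @ adjoint(Ns_inv_sqrt @ w @ Ns_inv_sqrt) @ s_sqrt

recovered = petz(channel(sigma))
print(np.round(np.real(recovered), 6))  # equals sigma up to numerical error
```

The check works because N(σ)^{−1/2} N(σ) N(σ)^{−1/2} = I and N†(I) = I for a trace-preserving channel, just as in the derivation on the slide.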


Functoriality

Normalization [LW14]

For the identity channel, the Petz recovery map is the identity map: R^P_{σ,id} = id. “If there’s no noise, then no need to recover”

Tensorial [LW14]

Given a tensor-product state and channel, the Petz recovery map is a tensor product: R^P_{σ1⊗σ2, N1⊗N2} = R^P_{σ1,N1} ⊗ R^P_{σ2,N2}. “Individual action suffices for ‘pretty good’ recovery of individual states”

Composition [LW14]

Given N2 ◦ N1, then R^P_{σ, N2◦N1} = R^P_{σ,N1} ◦ R^P_{N1(σ),N2}. “To recover ‘pretty well’ overall, recover ‘pretty well’ from the last noise first and the first noise last”

Mark M. Wilde (LSU) 9 / 23
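The composition property can be checked numerically. The sketch below (two qubit amplitude-damping channels with different damping parameters; all helper names are mine) verifies R^P_{σ,N2◦N1} = R^P_{σ,N1} ◦ R^P_{N1(σ),N2} on a test operator:

```python
import numpy as np
from scipy.linalg import sqrtm

def ad_kraus(g):
    """Kraus operators of a qubit amplitude-damping channel with parameter g."""
    return [np.array([[1, 0], [0, np.sqrt(1 - g)]]),
            np.array([[0, np.sqrt(g)], [0, 0]])]

def apply(kraus, w):
    return sum(K @ w @ K.conj().T for K in kraus)

def petz(sigma, kraus):
    """Return the Petz recovery map R^P_{sigma,N} as a function, N given by `kraus`."""
    m = np.linalg.inv(sqrtm(apply(kraus, sigma)))
    s = sqrtm(sigma)
    def R(w):
        inner = sum(K.conj().T @ (m @ w @ m) @ K for K in kraus)  # N^dag(...)
        return s @ inner @ s
    return R

k1, k2 = ad_kraus(0.3), ad_kraus(0.5)
k21 = [K2 @ K1 for K2 in k2 for K1 in k1]   # Kraus operators of N2 ∘ N1
sigma = np.diag([0.6, 0.4])

R_comp = petz(sigma, k21)
R1 = petz(sigma, k1)
R2 = petz(apply(k1, sigma), k2)

w = np.array([[0.55, 0.1], [0.1, 0.45]])
lhs = R_comp(w)
rhs = R1(R2(w))
print(np.allclose(lhs, rhs))
```

The agreement is exact up to round-off: the (N1(σ))^{±1/2} factors between the two stages cancel, leaving σ^{1/2} N1†(N2†(·)) σ^{1/2}, which is the Petz map of the composed channel.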


Petz recovery map for strong subadditivity

Strong subadditivity is a special case of monotonicity of relative entropy with ρ = ωABC, σ = ωAC ⊗ IB, and N = TrA

Then N†(·) = (·) ⊗ IA, and the Petz recovery map is

R^P_{C→AC}(τC) = ω^{1/2}_{AC} (ω^{−1/2}_{C} τC ω^{−1/2}_{C} ⊗ IA) ω^{1/2}_{AC}

Interpretation: If system A is lost but H(B|C)ω = H(B|AC)ω, then one can recover the full state on ABC by performing the Petz recovery map on system C of ωBC, i.e.,

ωABC = R^P_{C→AC}(ωBC)

Mark M. Wilde (LSU) 10 / 23
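For intuition, the equality case is easy to see classically. If p(a, b, c) = p(c) p(a|c) p(b|c) (a Markov chain A – C – B), then I(A;B|C) = 0 and the full distribution is recovered from the (B, C) marginal by acting on C alone (a sketch; the numbers are illustrative assumptions):

```python
import numpy as np

def H(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_c = np.array([0.5, 0.5])
p_a_c = np.array([[0.9, 0.2], [0.1, 0.8]])   # p(a|c), columns indexed by c
p_b_c = np.array([[0.6, 0.3], [0.4, 0.7]])   # p(b|c)

# p(a,b,c) = p(c) p(a|c) p(b|c): A and B conditionally independent given C
p = np.einsum('c,ac,bc->abc', p_c, p_a_c, p_b_c)

cmi = (H(p.sum(axis=1).ravel()) + H(p.sum(axis=0).ravel())
       - H(p.ravel()) - H(p.sum(axis=(0, 1))))   # I(A;B|C)

# Classical Petz recovery acting on C alone: r(a,b,c) = p(b,c) * p(a|c)
p_bc = p.sum(axis=0)
recovered = np.einsum('bc,ac->abc', p_bc, p_a_c)
print(cmi, np.allclose(recovered, p))
```

The conditional mutual information vanishes and the recovery is exact, the classical shadow of the quantum equality condition above.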


Approximate case

An approximate version of these statements would be useful for applications

Approximate case for monotonicity of relative entropy

What can we say when D(ρ‖σ) − D(N(ρ)‖N(σ)) = ε?

Does there exist a CPTP map R that recovers σ perfectly from N(σ) while recovering ρ from N(ρ) approximately? [WL12]

Approximate case for strong subadditivity

What can we say when H(B|C)ω − H(B|AC)ω = ε?

Is ωABC approximately recoverable from ωBC by performing a recovery map on system C alone? [WL12]

Mark M. Wilde (LSU) 11 / 23


Other measures of similarity for quantum states

Trace distance

The trace distance between ρ and σ is ‖ρ − σ‖1, where ‖A‖1 ≡ Tr{√(A†A)}.

It has a one-shot operational interpretation as the bias in the success probability when distinguishing ρ and σ with an optimal quantum measurement.

Fidelity [Uhl76]

The fidelity between ρ and σ is F(ρ, σ) ≡ ‖√ρ√σ‖₁². It has a one-shot operational interpretation as the probability with which a purification of ρ could pass a test for being a purification of σ.

Bures distance [Bur69]

The Bures distance between ρ and σ is DB(ρ, σ) = √(2(1 − √F(ρ, σ))).

Mark M. Wilde (LSU) 12 / 23
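All three measures are straightforward to compute (a NumPy/SciPy sketch; the function names are mine):

```python
import numpy as np
from scipy.linalg import sqrtm

def trace_distance(rho, sigma):
    """||rho - sigma||_1 = sum of absolute eigenvalues of the Hermitian difference."""
    return float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

def fidelity(rho, sigma):
    """F(rho, sigma) = || sqrt(rho) sqrt(sigma) ||_1^2 = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    s = sqrtm(sqrtm(rho) @ sigma @ sqrtm(rho))
    return float(np.real(np.trace(s)) ** 2)

def bures(rho, sigma):
    """Bures distance sqrt(2 (1 - sqrt(F)))."""
    return float(np.sqrt(2 * (1 - np.sqrt(fidelity(rho, sigma)))))

rho = np.diag([0.8, 0.2])
sigma = np.diag([0.6, 0.4])
print(trace_distance(rho, sigma), fidelity(rho, sigma), bures(rho, sigma))
```

For commuting states, as here, the fidelity reduces to the classical Bhattacharyya overlap squared, (Σ_i √(p_i q_i))².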


Breakthrough result of [FR14]

Remainder term for strong subadditivity [FR14]

∃ unitary channels U_C and V_AC such that

H(B|C)ω − H(B|AC)ω ≥ −log F(ωABC, (V_AC ◦ R^P_{C→AC} ◦ U_C)(ωBC))

Nothing is known from [FR14] about these unitaries! However, one can conclude that I(A;B|C) is small iff ωABC is approximately recoverable from system C alone after the loss of system A.

Remainder term for monotonicity of relative entropy [BLW14]

∃ unitary channels U and V such that

D(ρ‖σ) − D(N(ρ)‖N(σ)) ≥ −log F(ρ, (V ◦ R^P_{σ,N} ◦ U)(N(ρ)))

Again, nothing is known from [BLW14] about U and V.

Mark M. Wilde (LSU) 13 / 23


New result of [Wil15]

New Theorem: Let ρ and σ be such that supp(ρ) ⊆ supp(σ), and let N be a quantum channel. Then the following inequality holds:

D(ρ‖σ) − D(N(ρ)‖N(σ)) ≥ − log [ sup_{t∈ℝ} F(ρ, R^{P,t}_{σ,N}(N(ρ))) ],

where R^{P,t}_{σ,N} is the following rotated Petz recovery map:

R^{P,t}_{σ,N}(·) ≡ (U_{σ,t} ∘ R^P_{σ,N} ∘ U_{N(σ),−t})(·),

R^P_{σ,N} is the Petz recovery map, and U_{σ,t} and U_{N(σ),−t} are defined from U_{ω,t}(·) ≡ ω^{it} (·) ω^{−it}, with ω a positive semi-definite operator.

Two tools for the proof: a Renyi generalization of a relative entropy difference and the Hadamard three-line theorem.

Mark M. Wilde (LSU) 14 / 23
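Since the Petz map and its rotated variants are central here, a small numerical sketch may help. This is not from the talk: the completely dephasing ("pinching") channel is chosen only because its adjoint is trivial to write down, and all function names are my own.

```python
import numpy as np

def dag(a):
    return a.conj().T

def powm(a, p):
    # real matrix power of a Hermitian positive definite matrix
    w, v = np.linalg.eigh(a)
    return (v * w**p) @ dag(v)

def unitary_rot(omega, t):
    # omega^{it}, the unitary implementing U_{omega,t}
    w, v = np.linalg.eigh(omega)
    return (v * np.exp(1j * t * np.log(w))) @ dag(v)

def rand_state(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ dag(g) + 0.1 * np.eye(d)   # bump keeps the state full rank
    return m / np.trace(m).real

def N(r):
    # completely dephasing ("pinching") channel; its adjoint equals N itself
    return np.diag(np.diag(r).real)

def petz(sigma, Q):
    # R^P_{sigma,N}(Q) = sigma^{1/2} N†(N(sigma)^{-1/2} Q N(sigma)^{-1/2}) sigma^{1/2}
    ni = powm(N(sigma), -0.5)
    return powm(sigma, 0.5) @ N(ni @ Q @ ni) @ powm(sigma, 0.5)

def rotated_petz(sigma, t, Q):
    # R^{P,t}_{sigma,N} = U_{sigma,t} o R^P_{sigma,N} o U_{N(sigma),-t}
    u_in, u_out = unitary_rot(N(sigma), -t), unitary_rot(sigma, t)
    return u_out @ petz(sigma, u_in @ Q @ dag(u_in)) @ dag(u_out)

def fidelity(r, s):
    m = powm(r, 0.5) @ powm(s, 0.5)
    return float(np.sum(np.linalg.svd(m, compute_uv=False))) ** 2

rng = np.random.default_rng(0)
rho, sigma = rand_state(3, rng), rand_state(3, rng)

# The Petz map is trace preserving and recovers sigma itself perfectly
assert np.allclose(petz(sigma, N(sigma)), sigma)
assert np.isclose(np.trace(petz(sigma, N(rho))).real, 1.0)

# Recovery of rho is generally imperfect; F measures how close we get.
# t = 0 in the scan reduces to the plain Petz map.
F0 = fidelity(rho, petz(sigma, N(rho)))
Fs = [fidelity(rho, rotated_petz(sigma, t, N(rho))) for t in np.linspace(-3, 3, 25)]
assert F0 - 1e-10 <= max(Fs) <= 1.0 + 1e-10
```

The theorem above lower-bounds D(ρ‖σ) − D(N(ρ)‖N(σ)) by −log of the best such fidelity over t.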


Renyi generalizations of a relative entropy difference

Definition from [BSW14, SBW14]

∆_α(ρ, σ, N) ≡ (2/α′) log ‖(N(ρ)^{−α′/2} N(σ)^{α′/2} ⊗ I_E) U σ^{−α′/2} ρ^{1/2}‖_{2α},

where α ∈ (0, 1) ∪ (1, ∞), α′ ≡ (α − 1)/α, and U_{S→BE} is an isometric extension of N.

Important properties

lim_{α→1} ∆_α(ρ, σ, N) = D(ρ‖σ) − D(N(ρ)‖N(σ)).

∆_{1/2}(ρ, σ, N) = − log F(ρ, R^P_{σ,N}(N(ρ))).

Mark M. Wilde (LSU) 15 / 23
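Both properties can be checked numerically. The sketch below is my own (not from the slides): it uses the qubit pinching channel N(ρ) = diag(ρ), whose isometric extension is simply U|i⟩_S = |i⟩_B |i⟩_E, and verifies the α → 1 limit and the α = 1/2 identity against direct computations.

```python
import numpy as np

def dag(a):
    return a.conj().T

def powm(a, p):
    # matrix power of a Hermitian positive definite matrix
    w, v = np.linalg.eigh(a)
    return (v * w**p) @ dag(v)

def rand_state(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ dag(g) + 0.1 * np.eye(d)    # full support
    return m / np.trace(m).real

def rel_ent(r, s):
    wr, vr = np.linalg.eigh(r)
    ws, vs = np.linalg.eigh(s)
    logr = (vr * np.log(wr)) @ dag(vr)
    logs = (vs * np.log(ws)) @ dag(vs)
    return float(np.trace(r @ (logr - logs)).real)

# Pinching channel on a qubit, N(rho) = diag(rho), with isometric
# extension U|i>_S = |i>_B |i>_E (a 4x2 matrix, output ordered B (x) E)
U = np.zeros((4, 2)); U[0, 0] = U[3, 1] = 1.0
def N(r): return np.diag(np.diag(r).real)

def delta(alpha, rho, sigma):
    # the Renyi quantity Delta_alpha(rho, sigma, N) from the slide
    ap = (alpha - 1.0) / alpha
    M = (np.kron(powm(N(rho), -ap / 2) @ powm(N(sigma), ap / 2), np.eye(2))
         @ U @ powm(sigma, -ap / 2) @ powm(rho, 0.5))
    s = np.linalg.svd(M, compute_uv=False)
    return (2.0 / ap) * np.log(np.sum(s ** (2 * alpha)) ** (1.0 / (2 * alpha)))

def fidelity(r, s):
    m = powm(r, 0.5) @ powm(s, 0.5)
    return float(np.sum(np.linalg.svd(m, compute_uv=False))) ** 2

rng = np.random.default_rng(2)
rho, sigma = rand_state(2, rng), rand_state(2, rng)

# alpha -> 1 recovers the relative entropy difference
diff = rel_ent(rho, sigma) - rel_ent(N(rho), N(sigma))
assert abs(delta(1.0001, rho, sigma) - diff) < 1e-2

# alpha = 1/2 gives -log F(rho, Petz recovery applied to N(rho))
ni = powm(N(sigma), -0.5)
petz = powm(sigma, 0.5) @ N(ni @ N(rho) @ ni) @ powm(sigma, 0.5)
assert abs(delta(0.5, rho, sigma) + np.log(fidelity(rho, petz))) < 1e-8
```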


Hadamard three-line theorem

Let S ≡ {z ∈ ℂ : 0 ≤ Re{z} ≤ 1}, and let L(H) be the space of bounded linear operators acting on a Hilbert space H. Let G : S → L(H) be a bounded map that is holomorphic on the interior of S and continuous on the boundary. Let θ ∈ (0, 1) and define p_θ by

1/p_θ = (1 − θ)/p_0 + θ/p_1,

where p_0, p_1 ∈ [1, ∞]. For k = 0, 1 define

M_k = sup_{t∈ℝ} ‖G(k + it)‖_{p_k}.

Then

‖G(θ)‖_{p_θ} ≤ M_0^{1−θ} M_1^θ.

Mark M. Wilde (LSU) 16 / 23
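A toy numerical check of this bound (my own illustration, not from the slides): take G(z) = ρ^{z/2} for a full-rank density matrix ρ, with p_0 = 2 and p_1 = 1, mirroring the choice used in the proof on the next slide.

```python
import numpy as np

def schatten(a, p):
    # Schatten p-norm: the l_p norm of the singular values
    s = np.linalg.svd(a, compute_uv=False)
    return float(np.sum(s**p) ** (1.0 / p))

def mpow(a, z):
    # a^z for Hermitian positive definite a and a complex exponent z
    w, v = np.linalg.eigh(a)
    return (v * np.exp(z * np.log(w))) @ v.conj().T

rng = np.random.default_rng(1)
g = rng.normal(size=(4, 4))
rho = g @ g.T + 0.2 * np.eye(4)   # full-rank positive matrix ...
rho /= np.trace(rho)              # ... normalized to a density matrix

# G(z) = rho^{z/2} on the strip, with p0 = 2 and p1 = 1
ts = np.linspace(-5.0, 5.0, 51)
M0 = max(schatten(mpow(rho, (0 + 1j * t) / 2), 2.0) for t in ts)
M1 = max(schatten(mpow(rho, (1 + 1j * t) / 2), 1.0) for t in ts)

for theta in (0.25, 0.5, 0.75):
    p_theta = 2.0 / (1.0 + theta)        # 1/p = (1-theta)/2 + theta/1
    lhs = schatten(mpow(rho, theta / 2), p_theta)
    assert lhs <= M0 ** (1 - theta) * M1 ** theta + 1e-9
```

Here ρ^{it/2} is unitary, so M_0 = √d exactly, and the boundary suprema are constant in t; the interpolated norm stays below the geometric mean, as the theorem promises.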


Three (or so) line proof

Pick G(z) ≡ ([N(ρ)]^{z/2} [N(σ)]^{−z/2} ⊗ I_E) U σ^{z/2} ρ^{1/2},

p_0 = 2, p_1 = 1, θ ∈ (0, 1) ⇒ p_θ = 2/(1 + θ),

M_0 = sup_{t∈ℝ} ‖(N(ρ)^{it/2} N(σ)^{−it/2} ⊗ I_E) U σ^{it/2} ρ^{1/2}‖_2 ≤ ‖ρ^{1/2}‖_2 = 1,

M_1 = sup_{t∈ℝ} ‖G(1 + it)‖_1 = [sup_{t∈ℝ} F(ρ, R^{P,t}_{σ,N}(N(ρ)))]^{1/2}.

Apply the three-line theorem to conclude that

‖G(θ)‖_{2/(1+θ)} ≤ [sup_{t∈ℝ} F(ρ, R^{P,t}_{σ,N}(N(ρ)))]^{θ/2}.

Take a negative logarithm and the limit as θ ↘ 0 to conclude: note that −(2/θ) log ‖G(θ)‖_{2/(1+θ)} = ∆_{1/(1+θ)}(ρ, σ, N), which tends to the relative entropy difference as θ ↘ 0.

Mark M. Wilde (LSU) 17 / 23


SSA refinement as a special case

Let ρ_ABC be a density operator acting on a finite-dimensional Hilbert space H_A ⊗ H_B ⊗ H_C. Then the following inequality holds:

I(A;B|C)_ρ ≥ − log [ sup_{t∈ℝ} F(ρ_ABC, R^{P,t}_{C→AC}(ρ_BC)) ],

where R^{P,t}_{C→AC} is the following rotated Petz recovery map:

R^{P,t}_{C→AC}(·) ≡ (U_{ρ_AC,t} ∘ R^P_{C→AC} ∘ U_{ρ_C,−t})(·),

the Petz recovery map R^P_{C→AC} is defined as

R^P_{C→AC}(·) ≡ ρ_AC^{1/2} [ρ_C^{−1/2} (·) ρ_C^{−1/2} ⊗ I_A] ρ_AC^{1/2},

and the partial isometric maps U_{ρ_AC,t} and U_{ρ_C,−t} are defined as before.

Mark M. Wilde (LSU) 18 / 23
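To make R^P_{C→AC} concrete, here is a hedged NumPy sketch (my own construction, not from the slides) of the exactly recoverable case: a quantum Markov chain with I(A;B|C) = 0, taken for simplicity to be ρ_A ⊗ ρ_CB and ordered as A ⊗ C ⊗ B so that A and C are adjacent tensor factors. The Petz map rebuilds the global state from ρ_BC alone.

```python
import numpy as np

def dag(a):
    return a.conj().T

def powm(a, p):
    # matrix power of a Hermitian positive definite matrix
    w, v = np.linalg.eigh(a)
    return (v * w**p) @ dag(v)

def rand_state(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ dag(g) + 0.1 * np.eye(d)   # full rank
    return m / np.trace(m).real

def ptrace_last(m, d1, d2):
    # trace out the last tensor factor of a (d1*d2)-dimensional operator
    return np.einsum('ijkj->ik', m.reshape(d1, d2, d1, d2))

def entropy(r):
    w = np.linalg.eigvalsh(r)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

rng = np.random.default_rng(3)
# Systems ordered as A (x) C (x) B, so that A and C sit next to each other
rho_A = rand_state(2, rng)
rho_CB = rand_state(4, rng)              # correlated state on C (x) B
rho_ACB = np.kron(rho_A, rho_CB)         # Markov chain: A indep. of B given C

rho_C = ptrace_last(rho_CB, 2, 2)
rho_AC = ptrace_last(rho_ACB, 4, 2)      # here equal to rho_A (x) rho_C

# I(A;B|C) = H(AC) + H(CB) - H(C) - H(ACB) vanishes for a Markov chain
cmi = entropy(rho_AC) + entropy(rho_CB) - entropy(rho_C) - entropy(rho_ACB)
assert abs(cmi) < 1e-9

# Petz recovery map R^P_{C->AC} applied to rho_CB (system A was lost)
L = np.kron(powm(rho_C, -0.5), np.eye(2))    # rho_C^{-1/2} on C of C (x) B
mid = L @ rho_CB @ L
K = np.kron(powm(rho_AC, 0.5), np.eye(2))    # rho_AC^{1/2} on A (x) C
recovered = K @ np.kron(np.eye(2), mid) @ K  # tensor I_A, then sandwich

assert np.allclose(recovered, rho_ACB)   # exact recovery when I(A;B|C) = 0
```

For states with small but nonzero I(A;B|C), recovery is only approximate, and the bound above controls the achievable fidelity.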


Conclusions

The result of [FR14] already had a number of important implications in quantum information theory.

The new result in [Wil15] applies to relative entropy differences, has a brief proof, and improves our understanding of the input and output unitaries (but see [SFR15] for the special case of SSA).

By building on [SFR15, Wil15], we can now generalize these results: there is a universal recovery map which depends only on σ and N and has the form [SRWW15]

X → ∫ μ(dt) R^{P,t}_{σ,N}(X)

for some probability measure μ.

It is still conjectured that the recovery map can be the Petz recovery map alone (not a rotated Petz map).

Mark M. Wilde (LSU) 19 / 23


References I

[BLW14] Mario Berta, Marius Lemm, and Mark M. Wilde. Monotonicity of quantumrelative entropy and recoverability. December 2014. arXiv:1412.4067.

[BS03] Igor Bjelakovic and Rainer Siegmund-Schultze. Quantum Stein’s lemmarevisited, inequalities for quantum entropies, and a concavity theorem of Lieb. July2003. arXiv:quant-ph/0307170.

[BSW14] Mario Berta, Kaushik Seshadreesan, and Mark M. Wilde. Renyi generalizationsof the conditional quantum mutual information. March 2014. arXiv:1403.6102.

[Bur69] Donald Bures. An extension of Kakutani’s theorem on infinite product measuresto the tensor product of semifinite w∗-algebras. Transactions of the AmericanMathematical Society, 135:199–212, January 1969.

[FR14] Omar Fawzi and Renato Renner. Quantum conditional mutual information andapproximate Markov chains. October 2014. arXiv:1410.0664.

[HJPW04] Patrick Hayden, Richard Jozsa, Denes Petz, and Andreas Winter. Structureof states which satisfy strong subadditivity of quantum entropy with equality.Communications in Mathematical Physics, 246(2):359–374, April 2004.arXiv:quant-ph/0304007.

Mark M. Wilde (LSU) 20 / 23

References II

[HP91] Fumio Hiai and Denes Petz. The proper formula for relative entropy and itsasymptotics in quantum probability. Communications in Mathematical Physics,143(1):99–114, December 1991.

[Lin75] Goran Lindblad. Completely positive maps and entropy inequalities.Communications in Mathematical Physics, 40(2):147–151, June 1975.

[L73] Elliott H. Lieb. Convex trace functions and the Wigner-Yanase-Dyson conjecture. Advances in Mathematics, 11(3):267–288, December 1973.

[LR73] Elliott H. Lieb and Mary Beth Ruskai. Proof of the strong subadditivity ofquantum-mechanical entropy. Journal of Mathematical Physics, 14(12):1938–1941,December 1973.

[LW14] Ke Li and Andreas Winter. Squashed entanglement, k-extendibility, quantumMarkov chains, and recovery maps. October 2014. arXiv:1410.4184.

[NO00] Hiroshi Nagaoka and Tomohiro Ogawa. Strong converse and Stein's lemma in quantum hypothesis testing. IEEE Transactions on Information Theory, 46(7):2428–2433, November 2000. arXiv:quant-ph/9906090.

Mark M. Wilde (LSU) 21 / 23

References III

[NP04] Michael A. Nielsen and Denes Petz. A simple proof of the strong subadditivityinequality. arXiv:quant-ph/0408130.

[Pet86] Denes Petz. Sufficient subalgebras and the relative entropy of states of a vonNeumann algebra. Communications in Mathematical Physics, 105(1):123–131,March 1986.

[Pet88] Denes Petz. Sufficiency of channels over von Neumann algebras. QuarterlyJournal of Mathematics, 39(1):97–108, 1988.

[SBW14] Kaushik P. Seshadreesan, Mario Berta, and Mark M. Wilde. Renyi squashedentanglement, discord, and relative entropy differences. October 2014.arXiv:1410.1443.

[SFR15] David Sutter, Omar Fawzi, and Renato Renner. Universal recovery map forapproximate Markov chains. April 2015. arXiv:1504.07251.

[SRWW15] David Sutter, Renato Renner, Mark M. Wilde, and Andreas Winter.Universal recovery from a decrease of quantum relative entropy. June 2015.arXiv:150?.?????.

Mark M. Wilde (LSU) 22 / 23

References IV

[Uhl76] Armin Uhlmann. The “transition probability” in the state space of a *-algebra.Reports on Mathematical Physics, 9(2):273–279, 1976.

[Uhl77] Armin Uhlmann. Relative entropy and the Wigner-Yanase-Dyson-Lieb concavityin an interpolation theory. Communications in Mathematical Physics, 54(1):21–32,1977.

[Ume62] Hisaharu Umegaki. Conditional expectations in an operator algebra IV (entropyand information). Kodai Mathematical Seminar Reports, 14(2):59–85, 1962.

[Wil15] Mark M. Wilde. Recoverability in quantum information theory. May 2015.arXiv:1505.04661.

[WL12] Andreas Winter and Ke Li. A stronger subadditivity relation? http://www.maths.bris.ac.uk/~csajw/stronger_subadditivity.pdf, 2012.

Mark M. Wilde (LSU) 23 / 23