GENERALIZED HEISENBERG INEQUALITIES
Mariela Portesi (IFLP & DF, La Plata)
In collaboration with: Steeve Zozor (Grenoble), Christophe Vignat (Paris), Jesús S. Dehesa (Granada), Pablo Sánchez-Moreno (Granada)
VI WORKSHOP MECÁNICA ESTADÍSTICA Y TEORÍA DE LA INFORMACIÓN
Hotel Iruña, Mar del Plata, 26-28 May 2010
Motivations
Measures of uncertainty / information:
• measure the dispersion or randomness of a state
• measure the "complexity" of a signal, of a time series, of atomic organization...
• in (tele)communications: measure the information that can be transmitted through a channel
• in time-frequency representations: measure the uncertainty of a signal in the "dual" time and frequency spaces; search for the time-frequency "atoms" that minimize the "joint" uncertainty
Quantum Mechanics' Uncertainty Principle: the precision with which incompatible physical observables can be prepared is limited. There exists an irreducible lower bound for the uncertainty in the result of a simultaneous measurement of non-commuting observables.
$$U(A,B;\psi)\ \ge\ \mathcal{B}(A,B)$$
U: VARIANCES / HIGHER-ORDER MOMENTS
• Heisenberg inequality 1927
• Schrödinger-Robertson generalized relation 1929
• Sánchez-Moreno, González-Férez & Dehesa 2006
• Zozor, MP, Sánchez-Moreno & Dehesa 2010
U: ENTROPIES (Shannon / Rényi / Tsallis)
• Deutsch 1983
• Maassen & Uffink 1988; Uffink thesis 1990
• Bialynicki-Birula 1975; 2007
• MP & Plastino 1995
• Vignat & Zozor 2007
• A. Luis 2007
• Zozor, MP & Vignat 2008
Outline of the talk
Variance formulation of Uncertainty Principle
Entropic formulation of Uncertainty Principle:
• Shannon entropy
• Havrda-Charvat-Daroczy-Tsallis entropy
• Rényi entropy; Rényi entropy power
( Both discrete and continuous cases )
Beyond the variance formulation: higher order moments
Final comments
Variance formulation of UP
Use the product of variances as a measure of dispersion.
Variance of observable C when the system is in state ψ:
gives a measure of uncertainty in the value of C; measures spread or localization for simple (one-hump) probability distributions.
From the random-variable point of view, C corresponds to a random variable X distributed according to |ψ(x)|²; in the time-frequency domain, a signal x(t) can be normalized so that |x(t)|² integrates (sums) to 1.
$$\Delta_\psi C=\sqrt{\big\langle (C-\langle C\rangle)^2\big\rangle}\,,\qquad \langle C\rangle=\langle\psi|C|\psi\rangle$$
Variance formulation of UP
GENERALIZED HEISENBERG(-ROBERTSON) INEQUALITY:
• makes sense provided both variances exist
• does not provide a universal lower bound for uncertainty (unless [A,B] is a c-number: Heisenberg case)
• involves only second moments of A and B
ALTERNATIVE FORMULATIONS: entropies, Fisher information. OTHER ALTERNATIVES: higher-order moments.
Heisenberg, Z.Phys. (1927); Robertson, Phys.Rev. 34 (1929)
$$\Delta A\,\Delta B\ \ge\ \tfrac12\,\big|\langle[A,B]\rangle\big|$$
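As a quick numerical illustration of the variance formulation (a sketch, not part of the talk: it assumes ħ = 1, a 1-D Gaussian wave packet on a uniform grid, and an FFT-based momentum representation; the grid parameters are chosen only for the demonstration), the product ΔX·ΔP comes out at the Heisenberg minimum 1/2:

```python
import numpy as np

# Hypothetical grid parameters for the sketch (hbar = 1)
n, L, delta = 4096, 40.0, 1.3
dx = L / n
x = (np.arange(n) - n // 2) * dx

psi = np.exp(-x**2 / (4 * delta**2))               # Gaussian wf, Delta X = delta
psi = psi / np.sqrt((np.abs(psi)**2).sum() * dx)   # normalize the position density

# momentum representation via the unitary FFT convention
p = 2 * np.pi * np.fft.fftfreq(n, d=dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
dp = 2 * np.pi / L

rho, gam = np.abs(psi)**2, np.abs(phi)**2

def stddev(dens, grid, step):
    m = (grid * dens).sum() * step                 # first moment
    return np.sqrt((grid**2 * dens).sum() * step - m**2)

dX, dP = stddev(rho, x, dx), stddev(gam, p, dp)
product = dX * dP                                  # ~ 1/2 for a Gaussian
```

For the Gaussian the bound saturates; any other state on the same grid gives a larger product.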
Entropic formulation of UP: Shannon
Use the sum of Shannon entropies as a measure of uncertainty.
The Shannon entropy of the probability distribution obtained by projecting ψ on the discrete n-dimensional eigenbasis of C gives the missing (lack of) information.
SHANNON ENTROPIC UNCERTAINTY INEQUALITY:
provides an informative universal lower bound, independent of the A and B eigenvalues; more stringent than the Heisenberg relation. Search for the infimum (or bounds) of
Deutsch, Phys.Rev.Lett. 50 (1983)
$$S(C;\psi)=-\sum_{i=1}^{n}p_{C,i}\,\ln p_{C,i}\,,\qquad p_{C,i}=|\langle c_i|\psi\rangle|^2$$
$$S(A;\psi)+S(B;\psi)\ \ge\ \mathcal{B}(A,B)$$
$$U_1(A,B)=S(A)+S(B)$$
Bounds for the Shannon entropy UP
Let c be the maximum overlap between A and B eigenstates.
Then
Particularly, for complementary observables (corresponding to ψ and its discrete Fourier transform) one has c = 1/√n. Then
Other alternative measures for missing information: extensions of Shannon entropy
Kraus, Phys.Rev.D (1987); Maassen & Uffink, Phys.Rev.Lett. 60 (1988)
$$U_1(A,B)\ \ge\ 2\ln\frac{1}{c}\ \ge\ 2\ln\frac{2}{1+c}$$
$$U_1(A,B)\ \ge\ \ln n$$
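A numerical sanity check of these bounds (a sketch; it assumes complementary bases related by the unitary DFT, so c = 1/√n, and random test states drawn only for the illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
F = np.fft.fft(np.eye(n), norm="ortho")     # unitary DFT: complementary basis
c = 1.0 / np.sqrt(n)                        # maximum overlap for the DFT pair

def shannon(p):
    p = p[p > 1e-15]                        # drop zeros before taking logs
    return -(p * np.log(p)).sum()

worst = np.inf
for _ in range(500):
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi = psi / np.linalg.norm(psi)
    U1 = shannon(np.abs(psi)**2) + shannon(np.abs(F @ psi)**2)
    worst = min(worst, U1)                  # smallest entropy sum found
```

The Maassen-Uffink bound 2 ln(1/c) = ln n is never violated, and it dominates the weaker Deutsch-type bound 2 ln(2/(1+c)).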
Entropic formulation of UP: HCDT
Use the q-generalized sum of q-entropies as a measure of uncertainty.
Havrda-Charvat-Daroczy-Tsallis q-entropy (q>0) for the probability distribution p_{A,i}; same for B with the identical value of the index q.
HCDT q-ENTROPIC UNCERTAINTY INEQUALITY:
provides an informative universal lower bound, independent of the A and B eigenvalues. Search for the infimum (or bounds) of
MP, Thesis UNLP (1995); MP & Plastino, Physica A 225 (1996)
$$S_q^T(A;\psi)=\frac{1}{q-1}\left(1-\sum_{i=1}^{n}p_{A,i}^{\,q}\right)$$
$$S_q^T(A)+S_q^T(B)+(1-q)\,S_q^T(A)\,S_q^T(B)\ \ge\ \mathcal{B}(A,B)$$
$$U_q^T(A,B)=S_q^T(A)\oplus_q S_q^T(B)$$
Bounds for the HCDT entropy UP
Let c be the maximum overlap between A and B eigenstates.
Then
where
Particularly, for complementary observables: c=1/√n . Then
Other alternative measures for (missing) information
MP, Thesis UNLP (1995); MP & Plastino, Physica A 225 (1996)
$$U_q^T(A,B)\ \ge\ \ln_q\frac{1}{c^2}\ \ge\ \ln_q\left(\frac{2}{1+c}\right)^{2},\qquad \tfrac12\le q\le 1$$
$$\ln_q(x)\equiv\frac{x^{\,1-q}-1}{1-q}$$
$$U_q^T(A,B)\ \ge\ \ln_q n$$
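This q-deformed bound can be probed numerically too (a sketch; it assumes the DFT-complementary pair, an index q = 0.7 inside [1/2, 1], and the q-sum U_q = S ⊕_q S from the previous slide; n and the number of trials are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, q = 8, 0.7                                # dimension and entropic index in [1/2, 1]
F = np.fft.fft(np.eye(n), norm="ortho")      # complementary (DFT) basis

def tsallis(p, q):
    return (1.0 - (p**q).sum()) / (q - 1.0)  # HCDT q-entropy

def lnq(x, q):
    return (x**(1.0 - q) - 1.0) / (1.0 - q)  # q-deformed logarithm

worst = np.inf
for _ in range(500):
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi = psi / np.linalg.norm(psi)
    Sa = tsallis(np.abs(psi)**2, q)
    Sb = tsallis(np.abs(F @ psi)**2, q)
    Uq = Sa + Sb + (1 - q) * Sa * Sb         # q-sum of the two entropies
    worst = min(worst, Uq)
```

The bound ln_q(n) is saturated by a Kronecker state (S_q(A) = 0, uniform in the conjugate basis), while random states stay well above it.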
UP in the continuous case
Use the (generalized) sum of (q-)entropies for the probability densities associated to observable A and its conjugate Ã, e.g. X and P in d dimensions, or angular position and angular momentum.
The Heisenberg product saturates for coherent states (d=1 Gaussian wavefunction of width δ):
The Shannon entropic inequality is stronger than Heisenberg's; it saturates for coherent-state or harmonic-oscillator wavefunctions:
The HCDT q-entropic uncertainty for coherent states is δ-independent:
Bialynicki-Birula et al., Comm.Math.Phys. 44 (1975), Phys.Lett.A 108 (1985), Phys.Rev.A 74 (2006)
$$\rho(x)=|\psi(x)|^2\quad\text{and}\quad\gamma(p)=|\tilde\psi(p)|^2\,,\qquad \tilde\psi=\mathcal{F}(\psi)$$
$$\Delta_{G_\delta}X\,\Delta_{G_\delta}P=\delta\cdot\frac{1}{2\delta}=\frac12$$
$$U_1(X,P;\psi)\ \ge\ d\,(1+\ln\pi)$$
$$U_q(X,P;G_\delta)=\frac{1}{q}\,(1+\ln\pi)$$
Shannon entropic UP implies Heisenberg inequality
$$S(\rho)+S(\gamma)\ \ge\ d\,(1+\ln\pi)$$
is equivalent to
$$N(\rho)\,N(\gamma)\ \ge\ \tfrac14$$
for the Shannon entropy power
$$N(\rho)=\frac{1}{2\pi e}\exp\!\left(\frac{2}{d}\,S(\rho)\right)$$
The maximizer of N(ρ) under a variance constraint (⟨r²⟩ fixed) is a Gaussian, and its entropy power is ⟨r²⟩/d; the same holds for N(γ). Then
$$\frac14\ \le\ N(\rho)\,N(\gamma)\ \le\ N_G(\rho)\,N_G(\gamma)=\frac{\langle r^2\rangle}{d}\,\frac{\langle p^2\rangle}{d}$$
from which
$$\langle r^2\rangle\,\langle p^2\rangle\ \ge\ d^2\,N(\rho)\,N(\gamma)\ \ge\ \frac{d^2}{4}$$
Entropic formulation of UP: Rényi
Use the sum of Rényi entropies (or the product of Rényi entropy powers) as a measure of uncertainty.
Rényi α-entropy (α>0; approaches S as α→1) for the probability distribution p_{A,i}; same for B with an arbitrary index β.
Rényi α-entropy power: (N_α = M_α + 1: Maassen & Uffink certainty; N_2: Larsen's purity)
RÉNYI ENTROPIC UNCERTAINTY INEQUALITY:
RÉNYI ENTROPY POWER (EP) UNCERTAINTY INEQUALITY:
Zozor & Vignat, Physica A 375 (2007); A.Luis, Phys.Rev.A 75 (2007); B-B, Found.Prob.& Phys.889 (2007)
$$S_\alpha^R(A;\psi)=\frac{1}{1-\alpha}\ln\left(\sum_{i=1}^{n}p_{A,i}^{\,\alpha}\right)=\frac{2\alpha}{1-\alpha}\ln\|\psi\|_{2\alpha}\,,\qquad N_\alpha(A)=\exp\!\left(\frac{1}{d}\,S_\alpha^R(A)\right)$$
$$S_\alpha^R(A;\psi)+S_\beta^R(B;\psi)\ \ge\ \mathcal{B}_{\alpha,\beta}(A,B)\,,\qquad N_\alpha(A;\psi)\,N_\beta(B;\psi)\ \ge\ \mathcal{C}_{\alpha,\beta}(A,B)$$
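For conjugated indices in the discrete case the Maassen-Uffink bound applies; a numerical sketch (assuming the unitary-DFT complementary pair, so the bound is ln n, and illustrative test parameters α = 2, β = α/(2α-1)):

```python
import numpy as np

rng = np.random.default_rng(2)
n, alpha = 8, 2.0
beta = alpha / (2 * alpha - 1)               # conjugated: 1/(2a) + 1/(2b) = 1
F = np.fft.fft(np.eye(n), norm="ortho")      # complementary (DFT) basis

def renyi(p, a):
    return np.log((p**a).sum()) / (1.0 - a)  # Renyi a-entropy

worst = np.inf
for _ in range(500):
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi = psi / np.linalg.norm(psi)
    tot = renyi(np.abs(psi)**2, alpha) + renyi(np.abs(F @ psi)**2, beta)
    worst = min(worst, tot)                  # smallest Renyi entropy sum found
```

With d = 1 this is equivalent to the entropy-power statement N_α(A) N_β(Ã) ≥ n.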
Rényi EP uncertainty product for A, Ã
Assume A and B = Ã are complementary (or conjugate) observables (e.g., X and P), with associated probability densities $|\psi(x)|^2$ and $|\tilde\psi(p)|^2$, i.e., the corresponding wavefunctions are linked by a Fourier transform. We consider both continuous and discrete state spaces.
Search for a non-trivial infimum (or lower bounds), if one exists, for the Rényi entropy power uncertainty product
$$U_{\alpha,\beta}(A,\tilde A)=N_\alpha(A)\,N_\beta(\tilde A)\ \ge\ ??$$
1. Particular situation: conjugated indices
Property:
Universal lower bounds (i.e., independent of the state of the system). In the C-C case it saturates iff ψ is a Gaussian; in the D-D case it saturates iff ψ is a Kronecker indicator or a constant. Only possibility with the same uncertainty quantifier on both sides: α = β = 1 (Shannon).
Bialynicki-Birula, Phys.Rev.A 74 (2006); Zozor & Vignat, Physica A 375 (2007)
with $p\equiv2\alpha$, $q\equiv2\beta$ and $\frac1p+\frac1q=1$, $p\ge1$.
For $(\alpha,\beta)$ such that $\frac{1}{2\alpha}+\frac{1}{2\beta}=1$, there exists a Rényi EP UP of the form
$$N_\alpha(A)\,N_\beta(\tilde A)\ \ge\ \begin{cases}
n & \text{discrete-discrete case}\\[3pt]
2\pi & \text{discrete-continuous case}\\[3pt]
\pi\left(\dfrac{p}{2}\right)^{1/(p-2)}\left(\dfrac{q}{2}\right)^{1/(q-2)} & \text{continuous-continuous case}
\end{cases}$$
C-C case: hints for the proof
p-norm: $\|\psi\|_p=\left(\int_{\mathbb{R}^d}|\psi(x)|^p\,d^dx\right)^{1/p}$; Fourier transform: $\tilde\psi(p)=\dfrac{1}{(2\pi)^{d/2}}\displaystyle\int \psi(x)\,e^{-i\,x\cdot p}\,d^dx$

Babenko-Beckner inequality:
$$\sup_{\psi\in L^p}\frac{\|\tilde\psi\|_q}{\|\psi\|_p}=\left[\frac{(p/2\pi)^{1/p}}{(q/2\pi)^{1/q}}\right]^{d/2},\qquad \frac1p+\frac1q=1,\quad 1\le p\le 2$$

then $\|\tilde\psi\|_q\le (C_{p,q})^d\,\|\psi\|_p$; finally use p = 2α and q = 2β, and take logarithms to obtain the entropy powers.
2. General situation: arbitrary indices
Gives flexibility to choose the information measures in position and in momentum space, as α and β are unrelated. The same quantifiers, other than Shannon, can also be taken in both spaces.
Now consider arbitrary indices, $\frac{1}{2\alpha}+\frac{1}{2\beta}\neq1$. Define
$$x^*=\frac{x}{2x-1}\quad\text{for}\quad x\ge\tfrac12\qquad(\text{then }2x\text{ and }2x^*\text{ are conjugated})$$
and the index regions
$$\mathcal{C}=\Big\{(\alpha,\beta):\ \tfrac{1}{2\alpha}+\tfrac{1}{2\beta}=1\Big\},\qquad
\mathcal{D}=\Big\{(\alpha,\beta)\in(\mathbb{R}_+^*)^2:\ \tfrac{1}{2\alpha}+\tfrac{1}{2\beta}\ge1\Big\},$$
$$\mathcal{S}=\Big\{(\alpha,\beta):\ \alpha<\tfrac12\ \text{and}\ \beta<\tfrac12\Big\}\subset\mathcal{D},\qquad
\mathcal{D}_0=(\mathbb{R}_+^*)^2\setminus\mathcal{D}.$$
On the hyperbola C (in D), 2α and 2β are conjugated; the Shannon case (α,β) = (1,1) is on C.
arbitrary indices (cont.)
Discrete-discrete case:
Property D-D:
Saturates for a Kronecker-indicator or constant wavefunction.
Discrete-continuous case:
Property D-C:
Sharp lower bound, saturates for Kronecker indicator. Not solved yet in D0.
Continuous-continuous case:
E.g. α = β = 2: N₂(A) N₂(Ã) is not saturated for Gaussians! Indeed, A. Luis found d = 1 exponential states with U₂,₂(E) < 2π = U₂,₂(G).
M & U, Phys.Rev.Lett. 60 (1988); A.Luis, Phys.Rev.A 75 (2007); Zozor, MP & Vignat, Phys.A 387(2008)
For any $(\alpha,\beta)\in(\mathbb{R}_+^*)^2$, there exists a Rényi EP UP of the form
$$N_\alpha(A)\,N_\beta(\tilde A)\ \ge\ \left(\frac{2\,n^{1/2}}{1+n^{1/2}}\right)^{2/d}$$
For any $(\alpha,\beta)\in\mathcal{D}$, there exists a Rényi EP UP of the form
$$N_\alpha(A)\,N_\beta(\tilde A)\ \ge\ 2\pi$$
Property C-C1:
This relation is not sharp and/or not saturated for Gaussians.
Property C-C2:
Example: the 2-parameter Student-t wavefunction gives an arbitrarily small Rényi EP uncertainty product for given values of those parameters.
The case (α, β) = (2,2) corresponds to a point in D0, which explains Luis's results!
Zozor, MP & Vignat, Physica A 387 (2008)
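The Luis-type counterexample can be reproduced numerically (a sketch: d = 1, N_α = exp(S_α^R), and a Lorentzian-shaped wavefunction ψ(x) ∝ 1/(1+x²), whose Fourier transform is the exponential e^{-|p|}; grid parameters are illustrative):

```python
import numpy as np

n, L = 2**14, 400.0
dx = L / n
x = (np.arange(n) - n // 2) * dx
dp = 2 * np.pi / L

def renyi2_power_product(psi):
    """N2(rho) * N2(gamma) with N_2 = exp(S_2) = 1 / int dens^2  (d = 1)."""
    psi = psi / np.sqrt((np.abs(psi)**2).sum() * dx)
    phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)   # unitary FFT convention
    n2 = lambda dens, step: 1.0 / ((dens**2).sum() * step)
    return n2(np.abs(psi)**2, dx) * n2(np.abs(phi)**2, dp)

gauss = np.exp(-x**2 / 4)        # Gaussian: U_{2,2}(G) = 2 pi
luis = 1.0 / (1.0 + x**2)        # exponential momentum profile: U = 8 pi / 5
prod_gauss = renyi2_power_product(gauss)
prod_luis = renyi2_power_product(luis)
```

The product for the Lorentzian-shaped state comes out near 8π/5 ≈ 5.03, strictly below the Gaussian value 2π ≈ 6.28, confirming that (2,2) admits no Gaussian-saturated bound.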
arbitrary indices (cont.)
For any $(\alpha,\beta)\in\mathcal{D}_0$, no Rényi EP UP exists, since $N_\alpha(A)\,N_\beta(\tilde A)$ can be arbitrarily small (while remaining positive).
For any $(\alpha,\beta)\in\mathcal{D}$, there exists a Rényi EP UP of the form
$$N_\alpha(A)\,N_\beta(\tilde A)\ \ge\ \begin{cases}
\max\{B(\alpha),\,B(\beta)\} & \text{in }\mathcal{D}\setminus\mathcal{S}\\[3pt]
2\pi & \text{in }\mathcal{S}
\end{cases}$$
where $B(x)=\pi\,x^{1/(2(x-1))}\,(x^*)^{1/(2(x^*-1))}$ with $x^*=\dfrac{x}{2x-1}$.
Higher-order moments as UP
Use moments higher than the variance as a general measure of dispersion.
Then Heisenberg-like relations:
$$(\Delta_a X)_\psi^2=\langle r^a\rangle^{2/a}\,,\qquad (\Delta_b P)_\psi^2=\langle p^b\rangle^{2/b}\,,\qquad \langle r^a\rangle^{2/a}\,\langle p^b\rangle^{2/b}\ \ge\ ??$$
for any positive a, b. Hint:
$$\langle r^a\rangle^{d/a}\ \ge\ N_\alpha(\rho)\times g(a,\alpha)\,,\ \ldots$$
The case a=b=2 improved for central potentials V(r)
Heisenberg inequality in d dimensions:
improved for d-dimensional central potentials (l = hyperangular quantum number):
Reduces to Heisenberg for s states, but improves for l>0 states (growing as l²).
Saturates for nodeless isotropic HO wavefunctions (g.s.):
For $V(r)=\tfrac12\,\omega^2r^2$, then (n_r = number of radial nodes):
For the Coulomb potential V(r) = -1/r (L = l+(d-3)/2, η = n+(d-3)/2): the uncertainty product is larger than the bound.
$$\langle r^2\rangle\,\langle p^2\rangle\ \ge\ \frac{d^2}{4}$$
$$\langle r^2\rangle\,\langle p^2\rangle\ \ge\ \left(l+\frac{d}{2}\right)^{2}$$
$$\langle r^2\rangle\,\langle p^2\rangle\ =\ \left(2n_r+l+\frac{d}{2}\right)^{2}\qquad\text{(isotropic HO)}$$
$$\langle r^2\rangle\,\langle p^2\rangle\ =\ \tfrac12\left(5\eta^2-3L(L+1)+1\right)\qquad\text{(Coulomb)}$$
Sánchez-Moreno et al., NJP 8 (2006)
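A quick numerical check of the Coulomb case (a sketch: atomic units, d = 3 hydrogenic ground state ψ(r) = e^{-r}/√π with l = 0, so the central-potential bound is (l + d/2)² = 9/4; the radial grid is illustrative):

```python
import numpy as np

# Hydrogenic ground state in d = 3, atomic units: psi(r) = exp(-r)/sqrt(pi)
r = np.linspace(0.0, 50.0, 500001)
dr = r[1] - r[0]
w = np.exp(-2.0 * r)

# <r^2> = int r^2 |psi|^2 4 pi r^2 dr = 3 ;  <p^2> = int |dpsi/dr|^2 4 pi r^2 dr = 1
r2 = (4.0 * r**4 * w).sum() * dr
p2 = (4.0 * r**2 * w).sum() * dr

bound = (0 + 3.0 / 2.0) ** 2          # (l + d/2)^2 = 9/4 (= d^2/4 here, since l = 0)
product = r2 * p2                     # = 3, strictly above the bound
```

The product 3 exceeds the bound 9/4, in line with the slide's statement that the Coulomb uncertainty product is larger than the central-potential bound.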
Final comments
We consider the mathematical formulation of the Uncertainty Principle for two conjugate observables A and Ã (in state space and Fourier-transformed space, with discrete or continuous spectra), in terms of products of Rényi entropy powers:
Previously known results refer mostly to inequalities for pairs of conjugated indices, (α,β) located on the curve C. We extend the use of products of entropy powers to the general situation of arbitrary, positive, non-conjugated indices:
Continuous-continuous case: we show that entropic uncertainty relations exist for (α,β) in D, with different bounds. Conversely, in D0 the positive product Nα(A) Nβ(Ã) can be arbitrarily small (e.g. (2,2)).
NEXT STEPS: deeper study of the absence of restriction in D0; search for the best bounds for entropy products in D; search for states that minimize the entropy product in D. Analysis of Heisenberg-like relations of higher order.
$$\frac{1}{2\alpha}+\frac{1}{2\beta}\neq1\,,\qquad N_\alpha(A)\,N_\beta(\tilde A)\ \ge\ \mathcal{C}_{\alpha,\beta}(A,\tilde A)\,,\qquad N_\alpha(A)=\exp\!\left(\frac{1}{d}\,S_\alpha^R(A)\right)$$