ELEN
Compressive sensing of low-complexity signals: theory, algorithms and extensions
Laurent Jacques March 7, 9, 10, 14, 16 and 18, 2016
9h30 - 12h30 (incl. a 30’ break)
Graduate School in Systems, Optimization, Control and Networks (SOCN)
General content of this doctoral course
‣ Sparse signal models: from Fourier to *-lets transforms (wavelets, curvelets, …)
‣ Beyond sparsity: grouped sparsity and low-rank priors
‣ General overview of compressed sensing theory (aka incoherent sensing)
‣ Random isometries and the Johnson-Lindenstrauss lemma
‣ Signal recovery in compressed sensing: optimization and greedy methods
‣ Quantized compressed sensing and quasi-isometric embeddings
‣ Compressed sensing applications
Main course material
‣ S. Mallat, “A Wavelet Tour of Signal Processing”, 3rd ed., Academic Press, Dec. 2008. ISBN 978-0-1237-4370-1
‣ S. Foucart, H. Rauhut, “A Mathematical Introduction to Compressive Sensing”. ISBN 978-0-8176-4948-7. http://www.maths.manchester.ac.uk/~mlotz/teaching/books/csbook.pdf
‣ plus internet resources:
  ‣ The Numerical Tours of Signal Processing: http://www.numerical-tours.com
  ‣ Rice University's Compressive Sensing Resources: http://dsp.rice.edu/cs
  ‣ The Nuit Blanche blog: http://nuit-blanche.blogspot.be (on CS, machine/deep learning, low-rank minimization, …)
Caveats
‣ The course will focus on a discrete formalism (signals, images, videos, … are vectors!)
‣ Emphasis on both intuition & proofs (when these are short enough ;-) )
‣ Reference books/resources are available for further reading
Part 0: Course overview
Generally, sampling is ...
Human readable signal + Shannon/Nyquist
Example: the Daguerreotype, i.e., a “camera obscura” + photochemical recording
(Boulevard du Temple, Paris, 1839; Wikipedia)
But, new ways to sample signals!
Paradigm shift: “computer readable” sensing + prior information
World → sensing device → signal → human
Optimized blocks! Sampling rate ≈ information!
Prior information? Informative signals are composed of structures ...
Examples: speech signals, 3-D data, astronomy, biology, spherical data, data on graphs.
(Slide background: excerpt from a French thesis page on the continuous wavelet transform on the sphere, shown as an example of spherical data. It states the admissibility condition ∫_{S²} dμ(θ, φ) ψ(θ, φ)/(1 + cos θ) = 0 and the resulting spherical “DOG” (Difference of Gaussians) wavelet; Fig. 2.3 there shows the DOG wavelet for α = 1.25 dilated by a = 0.1.)
Origin: sparse models
‣ Hypothesis: any informative signal can be decomposed in a “sparsity basis” Ψ with few non-zero elements
‣ Ψ can be an ONB (e.g. Fourier, wavelets) or a dictionary (of atoms)
Non-linear approximation: x ≈ Σᵢ αᵢ ψᵢ = Ψα, where each ψᵢ is an atom and α is a sparse vector (mostly zeros); keeping more atoms ⇔ improved quality.
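A minimal numpy sketch of this non-linear approximation, on a hypothetical piecewise-constant signal that is sparse in the (orthonormal) Haar basis; since the basis is orthonormal, the best K-term approximation error equals the ℓ₂ norm of the discarded coefficients:

```python
import numpy as np

def haar(x):
    """Orthonormal Haar transform: recursively split into averages and details."""
    c = np.asarray(x, dtype=float).copy()
    n = len(c)
    while n > 1:
        a = (c[0:n:2] + c[1:n:2]) / np.sqrt(2)  # lowpass (averages)
        d = (c[0:n:2] - c[1:n:2]) / np.sqrt(2)  # highpass (details)
        c[:n // 2], c[n // 2:n] = a, d
        n //= 2
    return c

# Piecewise-constant signal: only a handful of non-zero Haar coefficients
x = np.concatenate([np.full(64, 1.0), np.full(64, -2.0), np.full(128, 0.5)])
alpha = haar(x)

# Best K-term (non-linear) approximation error = l2 norm of discarded coefficients
mags = np.sort(np.abs(alpha))[::-1]
for K in (1, 2, 4):
    print(K, np.sqrt(np.sum(mags[K:] ** 2)))
```

Here 256 samples collapse to two significant coefficients, so the K = 2 error is already (numerically) zero: the "quality vs. number of kept atoms" trade-off in action.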
What are low-complexity (LC) models?
‣ Include the case of sparse models, plus mixed-norm & model-based sparsities, low-rank data models, unions of low-dimensional subspaces, parametric models, manifolds, ….
‣ Intuition:
  ‣ model = small domain, low effective dimension
  ‣ allows, e.g., inverse problem “regularization”
Common applications for LC models
1. Data compression/transmission (by definition)
2. Data restoration (inverse problem solving): e.g., inpainting, deconvolution, matrix completion (not covered here), with priors built on wavelets, curvelets, *-lets, dictionaries, ...
   (Figure: (renormalized) Haar DWT with 3 resolution levels.)
3. Simplified models and interpretation (e.g. in ML)
4. and … Compressed Sensing!
Compressed sensing, in a nutshell: generalize Dirac/Nyquist sampling:
1°) ask few (linear) questions about your informative signal (e.g., sparse, structured, low-rank, ...),
2°) and recover it differently (non-linearly).
Compressed Sensing...
A signal in this discrete world: x ∈ ℝᴺ, with a sparsity prior (here Ψ = Id): most entries of x are zero.
Sensing method: a matrix Φ ∈ ℝ^{M×N}, realized by a sensor; the observations are y ∈ ℝᴹ, with y = Φx + noise, i.e., M questions about x.
Generalized linear sensing! Each question reads yᵢ ≈ ⟨φᵢ, x⟩ = φᵢᵀx, 1 ≤ i ≤ M (e.g., to be realized optically/analogically).
But why does it work? Identifiability of x (sparse) from Φx?
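A toy numpy sketch of this sensing model (all sizes below are illustrative assumptions): each row φᵢ of a Gaussian Φ asks one linear question about a K-sparse signal x, and we record far fewer answers than samples.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1024, 128, 10          # illustrative sizes: N-dim signal, M questions, K-sparse

# K-sparse signal (sparsity basis Psi = Id): K random locations, Gaussian amplitudes
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N))    # sensing matrix: one "question" phi_i per row
noise = 0.01 * rng.standard_normal(M)
y = Phi @ x + noise                  # y_i = <phi_i, x> + n_i

print(y.shape)                       # M measurements, M << N
```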
Take two K-sparse signals x, x′ ∈ Σ_K := {u : ‖u‖₀ := |supp u| ≤ K}.
For many random constructions of Φ (e.g., Gaussian, Bernoulli, structured) and “M ≳ K log(N/K)”, with high probability,
Φx ≈ Φx′ ⇔ x ≈ x′,
i.e., the geometry of Φ(Σ_K) ≈ the geometry of Σ_K: Φ embeds the low-dimensional domain Σ_K ⊂ ℝᴺ in ℝᴹ!
Mathematically, Φ respects the Restricted Isometry Property RIP(K, ρ):
(1 − ρ)‖u‖² ≤ (1/M)‖Φu‖² ≤ (1 + ρ)‖u‖², for all u ∈ Σ_K and some 0 < ρ < 1.
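An empirical sanity check of this embedding (a sketch; the factor 4 in the choice of M is an arbitrary assumption): for a Gaussian Φ scaled by 1/√M and M of order K log(N/K), the distance between random K-sparse pairs is nearly preserved.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 2000, 10
M = int(4 * K * np.log(N / K))                   # M ~ K log(N/K); factor 4 is arbitrary
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # scaling makes E||Phi u||^2 = ||u||^2

def ksparse():
    u = np.zeros(N)
    u[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    return u

# ||Phi(x - x')|| / ||x - x'|| should stay close to 1 (x - x' is at most 2K-sparse)
ratios = []
for _ in range(200):
    d = ksparse() - ksparse()
    ratios.append(np.linalg.norm(Phi @ d) / np.linalg.norm(d))
print(min(ratios), max(ratios))
```

Both extremes stay within a small band around 1, consistent with a RIP-type bound for 2K-sparse vectors.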
Robustness vs. sparse deviation + noise: suppose y = Φx + n, with ‖n‖ ≤ ε. A possible reconstruction (others exist, e.g., greedy) is Basis Pursuit DeNoise [Chen, Donoho, Saunders, 1998]:
x̂ ∈ arg min_{u ∈ ℝᴺ} ‖u‖₁ s.t. ‖y − Φu‖ ≤ ε,
where ‖u‖₁ = Σⱼ |uⱼ| promotes sparsity and ε sets the level of “noise”. If (1/√M)Φ respects the Restricted Isometry Property (RIP) with ρ < √2 − 1 [Candès, 09], then
‖x − x̂‖ ≲ (1/√K)‖x − x_K‖₁ + ε/√M
(with f ≲ g ≡ ∃c > 0 : f ≤ c g; the constant hidden in ≲ depends on ρ). The first term is e₀(K), the error of the sparse model; the second accounts for the noise.
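BPDN requires a convex solver; as a self-contained alternative, here is a sketch of the greedy route mentioned above, Orthogonal Matching Pursuit, on a noiseless toy problem (ε = 0; all sizes are illustrative assumptions):

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal Matching Pursuit: greedily add the column most correlated
    with the residual, then refit by least squares on the selected support."""
    N = Phi.shape[1]
    support, r = [], y.copy()
    for _ in range(K):
        j = int(np.argmax(np.abs(Phi.T @ r)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(2)
N, M, K = 512, 150, 6                          # illustrative sizes
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x = np.zeros(N)
S = rng.choice(N, K, replace=False)
x[S] = rng.standard_normal(K)
x[S] += np.sign(x[S])                          # keep non-zero entries away from zero
y = Phi @ x                                    # noiseless measurements

x_hat = omp(Phi, y, K)
print(np.linalg.norm(x - x_hat))               # near-zero on this easy instance
```

With M comfortably above K log N, the greedy method recovers the support exactly here; in harder regimes (smaller M, noise) its guarantees are weaker than those of ℓ₁ minimization.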
The Power of Random Projections
At the heart of CS: random projections, as realized by random sensing matrices.
e.g., Gaussian: Φ ∈ ℝ^{M×N}, with Φᵢⱼ ~iid N(0, 1).
But also:
‣ random sub-Gaussian ensembles (e.g., Bernoulli, bounded);
or structured sensing matrices:
‣ random Fourier/Hadamard ensembles (e.g., for CT, MRI);
‣ random convolutions, spread-spectrum (e.g., for imaging)
(see, e.g., [Foucart, Rauhut, 2013])
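For structured ensembles the matrix is never stored: a random Fourier ensemble applies a fast unitary transform and keeps M random rows. A matrix-free sketch (the √(N/M) rescaling, an implementation choice here, makes the measurement energy comparable to ‖x‖²):

```python
import numpy as np

rng = np.random.default_rng(3)
N, M = 4096, 256
rows = rng.choice(N, M, replace=False)          # random subset of Fourier frequencies

def sense(x):
    """Matrix-free random partial Fourier sensing: subsample a unitary FFT."""
    return np.sqrt(N / M) * np.fft.fft(x, norm="ortho")[rows]

x = rng.standard_normal(N)
y = sense(x)                                    # M complex measurements; no M x N matrix stored
print(np.linalg.norm(y) / np.linalg.norm(x))    # close to 1 on average
```

This O(N log N) operator is why Fourier/Hadamard ensembles are the practical choice in CT or MRI, where the hardware already measures in a transform domain.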
Related Concepts? Of specific interest beyond CS!
Let M ⊂ ℝᴺ be a low-complexity set (unions of low-dimensional subspaces, manifolds, ... in Hilbert spaces). For many random constructions of Φ (e.g., Gaussian, Bernoulli, structured) and “M ≳ intrinsic dimension of M”, with high probability,
Φx ≈ Φx′ ⇔ x ≈ x′,
i.e., the geometry of Φ(M) ≈ the geometry of M.
Moreover, with f non-linear (e.g., quantization, sign operator),
f(Φx) ≈ f(Φx′) ⇔ x ≈ x′,
i.e., the geometry of f(Φ(M)) ≈ the geometry of M under the map f(Φ ·).
Related Concepts?
‣ Connection with Quantized Compressed Sensing (QCS): recover/estimate x from the quantized measurements
sign(Φx) ∈ {−1, +1}ᴹ (1-bit CS, a bitstream like 01011000111), or Q(Φx) ∈ C ⊂ ℝᴹ, with C a finite codebook.
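A quick numpy illustration of the 1-bit case (sizes are arbitrary assumptions): for a Gaussian Φ, the fraction of sign measurements that differ between x and x′ concentrates around their angular distance θ(x, x′)/π, so even one bit per question encodes geometry.

```python
import numpy as np

rng = np.random.default_rng(4)
N, M = 200, 5000                         # many 1-bit measurements for a stable estimate
Phi = rng.standard_normal((M, N))

x = rng.standard_normal(N)
xp = x + 0.1 * rng.standard_normal(N)    # a nearby signal

bits, bits_p = np.sign(Phi @ x), np.sign(Phi @ xp)
hamming = np.mean(bits != bits_p)        # fraction of flipped sign bits

cosang = x @ xp / (np.linalg.norm(x) * np.linalg.norm(xp))
angle = np.arccos(np.clip(cosang, -1, 1)) / np.pi
print(hamming, angle)                    # the two numbers nearly match
```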
CS Applications? MANY!
e.g., magnetic resonance imaging (proof of concept, 2007), satellite imaging, hyperspectral imaging, the Internet of Things, radio-interferometry, ...
General outline of this course:
‣ Part I: Low-complexity “signal” models
‣ Part II: Compressed Sensing (CS)
‣ Part III: Quantized aspects of CS
‣ Part IV: CS applications
So, let’s start!