COMPRESSED SENSING
Luis Mancera
Visual Information Processing Group
Dep. Computer Science and AI
Universidad de Granada
CONTENTS
1. WHAT? Introduction to Compressed Sensing (CS)
2. HOW? Theory behind CS
3. FOR WHAT PURPOSE? CS applications
4. AND THEN? Active research and future lines
Transmission scheme
[Diagram: Sample (N samples) → Compress (K coefficients, N >> K) → Transmit → Receive → Decompress]
Why so many samples?
Natural signals are sparse/compressible, so most samples can be discarded with no significant perceptual loss
The Shannon/Nyquist theorem is a brick wall to performance: it tells us to sample every 1/(2W) seconds, where W is the highest frequency of the signal
This is a worst-case bound that holds for ANY band-limited signal
Sparse/compressible signals are a favorable case
CS solution: merge sampling and compression
Compressed Sensing (CS)
[Diagram: Measure (M measurements, K < M << N) → Transmit → Receive → Reconstruct]
Recover sparse signals by directly acquiring compressed data
Replace samples with measurements
What do we need for CS to succeed?
We know how to Sense Compressively
"I'm glad this battle is over. Finally my military service is over. I will now go back to Motril and get married, and then I will raise pigs as I have always wanted to."
"Aye! Cool!"
"Do you mean you're glad this battle is over because now you've finished here and you will go back to Motril, get married, and raise pigs as you always wanted to?"
What does CS need?
Nice sensing dictionary
Appropriate sensing
A priori knowledge
Recovery process
Wie lange wird das nehmen? ("How long will that take?")
What?
Saint Roque's dog has no tail
I know this guy so well that I know what he means
Cool!
Words → Idea
CS needs:
A priori knowledge → SPARSENESS
Nice sensing dictionary → INCOHERENCE
Appropriate sensing → RANDOMNESS
Recovery process → OPTIMIZATION
Sparseness: less is more
Idea: a stranger approaching a hut by the only known road, the valley
Dictionary: how to express it? Combining elements…
"He was advancing by the only road that was ever traveled by the stranger as he approached the Hut; or, he came up the valley" [Wyandotte, J.F. Cooper]
E.A. Poe: "Hmmm, you could say the same using fewer words…"
"He was advancing by the valley, the only road traveled by a stranger approaching the Hut" [Poe's comments on Wyandotte]
SPARSER
Sparseness: less is more
Sparseness: the property of being small in numbers or amount, often scattered over a large area [Cambridge Advanced Learner's Dictionary]
[Figure: a certain distribution vs. a sparser distribution]
Sparseness: less is more
[Figure: original Einstein image vs. reconstructions keeping 10% of pixels, 10% of Fourier coefficients, and 10% of wavelet coefficients]
Pixels are not sparse; a new domain can increase sparseness
Sparseness: less is more
Dictionary: X-lets, elementary functions (atoms). How to express it?
[Figure: linear analysis yields a linear subband; non-linear analysis yields a SPARSER non-linear subband]
Analysis-sense sparseness: the response of X-let filters is sparse [Mallat 89, Olshausen & Field 96]. X-let-based representations are compressible, meaning that most of the energy is concentrated in a few coefficients
Synthesis-sense sparseness: we can increase sparseness by non-linear analysis
Sparseness: less is more
Dictionary: X-lets, elementary functions. Idea: combining them another way
[Figure: non-linear analysis yields an even SPARSER non-linear subband, keeping around 3.5% of the total coefficients]
By keeping fewer coefficients we achieve strict sparseness, at the price of only approximating the image (PSNR: 35.67 dB)
Incoherence
Sparse signals in a given dictionary must be dense in another, incoherent one
The sampling dictionary should be incoherent w.r.t. the one where the signal is sparse/compressible
[Figure: a time-sparse signal and its frequency-dense representation]
Measurement and recovery processes
Measurement process: given sparseness + incoherence, random sampling will do
Recovery process: numerical non-linear optimization is able to exactly recover the signal given the measurements
CS relies on:
A priori knowledge: many natural signals are sparse or compressible in a proper basis
Nice sensing dictionary: signals should be dense when expressed in the sampling waveforms
Appropriate sensing: random sampling has been demonstrated to work well
Recovery process: the bounds for exact recovery depend on the optimization method
Summary
CS is a simple and efficient signal acquisition protocol which samples at a reduced rate and later uses computational power for reconstruction from what appears to be an incomplete set of measurements
CS is universal (any sparsifying basis works), democratic (each measurement carries equal weight), and asymmetrical (a simple encoder, a computationally heavy decoder)
CONTENTS
1. WHAT? Introduction to Compressed Sensing (CS)
2. HOW? Theory behind CS
3. FOR WHAT PURPOSE? CS applications
4. AND THEN? Active research and future lines
The sensing problem
x_t: original discrete signal (vector); Φ: sampling dictionary (matrix); y_k: sampled signal (vector)
The sensing problem
Traditional sampling:
y = I x
(y: sampled signal, N × 1; I: sampling dictionary, N × N; x: original signal, N × 1)
The sensing problem
When the signal is sparse/compressible, we can directly acquire a condensed representation with no/little information loss
A random projection will work if M = O(K log(N/K)) [Candès et al., Donoho, 2004]
y = Φ x
(y: M × 1; Φ: M × N; x: N × 1 with K nonzero entries; K < M << N)
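As an illustration, here is a minimal numpy sketch of this measurement step (the parameter values and variable names are our own, chosen for the example):

import numpy as np

rng = np.random.default_rng(0)
N, K = 512, 10                      # signal length and sparsity
M = int(4 * K * np.log(N / K))      # M = O(K log(N/K)) measurements

# Build a K-sparse signal x: K random positions with random amplitudes
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

# Random Gaussian sensing matrix Phi (M x N); measurements y = Phi x
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                         # y has M entries, K < M << N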
Universality
Random measurements can be used if the signal is sparse/compressible in any basis:
y = Φ Ψ a
(y: M × 1; Φ: M × N; Ψ: N × N; a: N × 1 with K nonzero entries; K < M << N)
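A sketch of the same idea with a sparsifying basis, using the DCT as Ψ (our choice for illustration; any orthonormal basis would do):

import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(1)
N, M, K = 512, 128, 10

# A signal sparse in the DCT basis: x = Psi a, with a K-sparse
a = np.zeros(N)
a[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
x = idct(a, norm='ortho')           # synthesis step x = Psi a

# The same random Phi works regardless of the basis Psi: y = Phi Psi a
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x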
Good sensing waveforms?
Φ and Ψ should be incoherent. Measure the largest correlation between any two elements:
μ(Φ, Ψ) = √N · max |⟨φ_k, ψ_j⟩|
Large correlation means low incoherence. Examples:
Spike and Fourier bases (maximal incoherence)
Random and any fixed basis
Solution: sense randomly
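A small numeric check of the spike/Fourier example (the helper name is ours):

import numpy as np

def mutual_coherence(Phi, Psi):
    # mu(Phi, Psi) = sqrt(N) * largest correlation between the rows of
    # Phi and the columns of Psi (both assumed orthonormal)
    N = Psi.shape[0]
    return np.sqrt(N) * np.abs(Phi @ Psi).max()

N = 64
spike = np.eye(N)                             # spike (identity) basis
fourier = np.fft.fft(np.eye(N)) / np.sqrt(N)  # orthonormal Fourier basis
print(mutual_coherence(spike, fourier))       # -> 1.0, maximal incoherence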
Random measurements
[Diagram: Measure (M = O(K log(N/K)) random measurements) → Transmit → Receive → Reconstruct]
We have set up the encoder. Let's now study the decoder.
CS recovery
Assume a is K-sparse and y = Φ Ψ a. We can recover a by solving:
min ||a||_0 subject to Φ Ψ a = y
where the L0 norm counts the number of active coefficients. This is an NP-hard (combinatorial) problem, so we use some tractable approximation.
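To see why the L0 problem is combinatorial, here is a brute-force sketch that tries every size-K support, which is only feasible for tiny N (the function name is ours):

import numpy as np
from itertools import combinations

def l0_recover(A, y, K):
    # Exhaustive search over all N-choose-K supports: exponential cost,
    # which is exactly why L0 minimization is intractable in practice.
    N = A.shape[1]
    for support in combinations(range(N), K):
        cols = A[:, list(support)]
        coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
        if np.allclose(cols @ coef, y):      # exact fit on this support
            a = np.zeros(N)
            a[list(support)] = coef
            return a
    return None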
Robust CS recovery
What if a is only compressible and y = Φ Ψ a + n, with n an unknown error term?
Isometry constant δ_K of A = Φ Ψ: the smallest δ_K such that, for all K-sparse vectors x:
(1 − δ_K) ||x||₂² ≤ ||A x||₂² ≤ (1 + δ_K) ||x||₂²
A obeys a Restricted Isometry Property (RIP) if δ_K is not too close to 1
A obeys a RIP ⇔ any subset of K columns is nearly orthogonal. To recover K-sparse signals we need δ_2K < 1 (unique solution)
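A quick Monte Carlo check of this near-orthogonality for a random Gaussian A (it only lower-bounds the true δ_K; all names are ours):

import numpy as np

rng = np.random.default_rng(2)
M, N, K = 128, 512, 10
A = rng.standard_normal((M, N)) / np.sqrt(M)

# Sample random K-column submatrices and see how far their squared
# singular values stray from 1
worst = 0.0
for _ in range(1000):
    cols = A[:, rng.choice(N, size=K, replace=False)]
    s = np.linalg.svd(cols, compute_uv=False)
    worst = max(worst, abs(s.max()**2 - 1), abs(s.min()**2 - 1))
print(f"empirical lower bound on delta_{K}: {worst:.3f}")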
Recovery techniques
Minimization of the L1-norm
Greedy techniques
Iterative thresholding
Total-variation minimization
…
Recovery by minimizing the L1-norm
min ||a||_1 subject to Φ Ψ a = y, where ||a||_1 = Σ |a_i| (the sum of absolute values)
Convexity: a tractable problem, solvable by linear or second-order cone programming
For some C > 0, â_L1 = â if M ≥ C · μ²(Φ, Ψ) · K · log N
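A sketch of this recovery as a linear program via scipy (the split into positive and negative parts is the standard LP reformulation; the function name is ours):

import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    # min ||a||_1  s.t.  A a = y, written as an LP by splitting
    # a = p - q with p, q >= 0 and minimizing sum(p + q)
    M, N = A.shape
    c = np.ones(2 * N)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
    return res.x[:N] - res.x[N:]

# e.g. a_hat = basis_pursuit(Phi, y) on the measurements from the earlier sketch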
Recovery by minimizing the L1-norm
Noisy data: solve the LASSO problem
min ||a||_1 subject to ||Φ Ψ a − y||_2 ≤ ε
A convex problem solvable via second-order cone programming (SOCP)
If δ_2K < √2 − 1, then the recovery error is bounded by the best K-term approximation error plus a term proportional to ε
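A sketch of the noisy-data version, assuming the cvxpy package is available (its default solver handles this as an SOCP):

import cvxpy as cp

def bpdn(A, y, eps):
    # min ||a||_1  s.t.  ||A a - y||_2 <= eps  (a second-order cone program)
    a = cp.Variable(A.shape[1])
    prob = cp.Problem(cp.Minimize(cp.norm1(a)),
                      [cp.norm2(A @ a - y) <= eps])
    prob.solve()
    return a.value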
Example of L1 recovery
A (120 × 512): random orthonormal matrix. Perfect recovery of x by L1-minimization.
[Figure: the sparse signal x and its measurements y = A x]
Recovery by Greedy Pursuit
Algorithm:
1. New active component: the one whose atom φ_i is most correlated with y
2. Find the best approximation y' to y using the active components
3. Subtract y' from y to form the residual e; set y = e and repeat
Very fast for small-scale problems, but not as accurate/robust for large signals in the presence of noise
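A compact sketch of one such greedy method, Orthogonal Matching Pursuit (our implementation, following the steps above):

import numpy as np

def omp(A, y, K):
    # Greedily pick the atom most correlated with the residual,
    # then refit y on the active set and update the residual.
    N = A.shape[1]
    active, residual = [], y.copy()
    for _ in range(K):
        i = int(np.argmax(np.abs(A.T @ residual)))    # new active component
        active.append(i)
        coef, *_ = np.linalg.lstsq(A[:, active], y, rcond=None)
        residual = y - A[:, active] @ coef            # subtract y' from y
    a = np.zeros(N)
    a[active] = coef
    return a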
Recovery by Iterative Thresholding
Algorithm: iterate between a shrinkage/thresholding operation and a projection onto perfect reconstruction
If soft-thresholding is used, the theory is analogous to L1-minimization
If hard-thresholding is used, the error is within a constant factor of the best attainable estimation error [Blumensath 08]
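A sketch of the hard-thresholding variant (the explicit step size is our addition, for stability):

import numpy as np

def iht(A, y, K, n_iter=200):
    # Iterative Hard Thresholding: a gradient step toward perfect
    # reconstruction, then keep only the K largest-magnitude coefficients.
    mu = 1.0 / np.linalg.norm(A, 2) ** 2    # step size
    a = np.zeros(A.shape[1])
    for _ in range(n_iter):
        a = a + mu * A.T @ (y - A @ a)          # gradient/projection step
        small = np.argsort(np.abs(a))[:-K]      # all but the K largest
        a[small] = 0.0                          # hard threshold
    return a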
Recovery by TV minimization
Sparseness: signals have few "jumps", so we minimize the total variation (the sum of absolute differences between neighboring samples)
Convexity: a tractable problem
Accurate and robust, but can be slow for large-scale problems
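A sketch of this recovery with cvxpy, whose tv atom sums the absolute differences between neighboring entries (our formulation of the slide's idea):

import cvxpy as cp

def tv_recover(A, y):
    # min TV(x)  s.t.  A x = y: sparseness lives in the "jumps" of x
    x = cp.Variable(A.shape[1])
    prob = cp.Problem(cp.Minimize(cp.tv(x)), [A @ x == y])
    prob.solve()
    return x.value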
Example of TV recovery
Φ: Fourier transform. Perfect recovery of x by TV-minimization.
[Figure: original x, its least-squares reconstruction x_LS, and the TV reconstruction, which equals x]
Summary
Sensing: use random sampling in dictionaries with low coherence w.r.t. the one where the signal is sparse. Choose M wisely.
Recovery: a wide range of techniques is available. L1-minimization seems to work well, but choose the one best fitting your needs.
CONTENTS
1. WHAT? Introduction to Compressed Sensing (CS)
2. HOW? Theory behind CS
3. FOR WHAT PURPOSE? CS applications
4. AND THEN? Active research and future lines
Some CS applications
Data compression
Compressive imaging
Detection, classification, estimation, learning…
Medical imaging
Analog-to-information conversion
Biosensing
Geophysical data analysis
Hyperspectral imaging
Compressive radar
Astronomy
Communications
Surface metrology
Spectrum analysis
…
Data compression
The sparse basis may be unknown or impractical to implement at the encoder
A randomly designed Φ can be considered a universal encoding strategy
This may be helpful for distributed source coding in multi-signal settings
[Baron et al. 05, Haupt and Nowak 06,…]
Magnetic resonance imaging
Rice Single-Pixel CS Camera
Rice Analog-to-Information conversion
Converts an analog input signal into discrete digital measurements
An extension of the A2D converter that samples at the signal's information rate rather than its Nyquist rate
CS in Astronomy [Bobin et al 08]
Desperate need for data compression; resolution, sensitivity and photometry are important
Herschel satellite (ESA, 2009): conventional compression cannot be used. CS can help with:
New compressive sensors
A flexible compression/decompression scheme
Lower computational cost: O(t) vs. JPEG 2000's O(t log t)
Decoupling of compression and decompression
CS outperforms conventional compression
CONTENTS
1. WHAT? Introduction to Compressed Sensing (CS)
2. HOW? Theory behind CS
3. FOR WHAT PURPOSE? CS applications
4. AND THEN? Active research and future lines
CS is a very active area
More than seventy 2008 papers in the CS repository. Most active areas:
New applications (de-noising, learning, video, …)
New recovery methods (non-convex, variational, CoSaMP, …)
ICIP 08 paper titles:
COMPRESSED SENSING FOR MULTI-VIEW TRACKING AND 3-D VOXEL RECONSTRUCTION
COMPRESSIVE IMAGE FUSION
IMAGE REPRESENTATION BY COMPRESSED SENSING
KALMAN FILTERED COMPRESSED SENSING
NONCONVEX COMPRESSIVE SENSING AND RECONSTRUCTION OF GRADIENT-SPARSE IMAGES: RANDOM VS. TOMOGRAPHIC FOURIER SAMPLING
…
Conclusions
CS is a new technique for acquiring and compressing images simultaneously
Sparseness + incoherence + random sampling allow perfect reconstruction under some conditions
A wide range of applications is possible
Big research effort now on recovery techniques
Our future lines?
Convex CS: TV-regularization
Non-convex CS: L0-GM for CS; intermediate norms (0 < p < 1) for CS
CS applications: super-resolved sampling? Detection, estimation, classification, …
Thank you
See references and software here:
http://www.dsp.ece.rice.edu/cs/