COMPRESSED SENSING Luis Mancera Visual Information Processing Group Dep. Computer Science and AI Universidad de Granada


Page 1:

COMPRESSED SENSING

Luis Mancera
Visual Information Processing Group
Dep. Computer Science and AI
Universidad de Granada

Page 2:

CONTENTS

1. WHAT? Introduction to Compressed Sensing (CS)

2. HOW? Theory behind CS

3. FOR WHAT PURPOSE? CS applications

4. AND THEN? Active research and future lines

Page 3:

CONTENTS

1. WHAT? Introduction to Compressed Sensing (CS)

2. HOW? Theory behind CS

3. FOR WHAT PURPOSE? CS applications

4. AND THEN? Active research and future lines

Page 4:

Transmission scheme (conventional): Sample (N samples) → Compress (to K coefficients, N >> K) → Transmit (K) → Receive → Decompress

Why so many samples? Natural signals are sparse/compressible, so most samples can be discarded with no significant perceptual loss. Acquiring them all anyway is a brick wall to performance.

Page 5:

Shannon/Nyquist theorem

The Shannon/Nyquist theorem tells us to take a sample every 1/(2W) seconds, where W is the highest frequency of the signal

This is a worst-case bound for ANY band-limited signal

Sparse/compressible signals are a favorable case

The CS solution: merge sampling and compression

Page 6:

Compressed Sensing (CS)

Compressed Sensing: Measure (M measurements, K < M << N) → Transmit (M) → Receive → Reconstruct

Recover sparse signals by directly acquiring compressed data

Replace samples by measurements

What do we need for CS to succeed?

Page 7:

We now know how to Sense Compressively

"I'm glad this battle is over. Finally my military period is over. I will now go back to Motril, get married, and then raise pigs as I have always wanted to do."

"Aye. Cool!"

"Do you mean you're glad this battle is over because now you've finished here, and you will go back to Motril, get married, and raise pigs as you always wanted to?"

Page 8:

What does CS need?

Nice sensing dictionary

Appropriate sensing

A priori knowledge

Recovery process

Wie lange wird das nehmen? (German: "How long will this take?")

What?

Saint Roque’s dog has no tail

I know this guy so well that I know what he means

Cool!

Words Idea

Page 9:

CS needs:

Nice sensing dictionary

Appropriate sensing

A priori knowledge

Recovery process

SPARSENESS

RANDOMNESS

INCOHERENCE

OPTIMIZATION

Page 10:

Sparseness: less is more

A stranger approaching a hut by the only known road: the valley

Dictionary: words. Idea: how to express it?

J.F. Cooper, combining elements (from Wyandotte):
"He was advancing by the only road that was ever traveled by the stranger as he approached the Hut; or, he came up the valley"

E.A. Poe, combining elements another way (from his comments to Wyandotte): "Hmmm, you could say the same using fewer words…"
"He was advancing by the valley, the only road traveled by a stranger approaching the Hut"

SPARSER

Page 11:

Sparseness: less is more

Sparseness: the property of being small in number or amount, often scattered over a large area [Cambridge Advanced Learner's Dictionary]

Figure: a certain distribution vs. a sparser distribution

Page 12:

Sparseness: less is more

Figure: original Einstein image vs. reconstructions from 10% of pixels, 10% of Fourier coeffs., and 10% of wavelet coeffs.

Pixels: not sparse. A new domain can increase sparseness
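This compressibility is easy to check numerically. A minimal sketch (a synthetic 1-D signal, not the Einstein image): keep only the largest 10% of Fourier coefficients and measure the approximation error.

```python
import numpy as np

# Keep only the largest 10% of Fourier coefficients of a smooth signal
# and measure the relative approximation error (illustrative example).
N = 512
t = np.linspace(0, 1, N, endpoint=False)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

X = np.fft.fft(x)
k = N // 10                        # keep 10% of the coefficients
idx = np.argsort(np.abs(X))[::-1]  # sort by magnitude, largest first
X_sparse = np.zeros_like(X)
X_sparse[idx[:k]] = X[idx[:k]]

x_hat = np.real(np.fft.ifft(X_sparse))
rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"relative error keeping 10% of Fourier coeffs: {rel_err:.2e}")
```

For this smooth signal the error is essentially zero; for natural images the same experiment in a wavelet domain gives a small but nonzero error, which is what "compressible" means.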

Page 13:

Sparseness: less is more

Dictionary: X-lets, elementary functions (atoms). How to express it?

Linear analysis yields a linear subband; non-linear analysis yields a non-linear subband, which is SPARSER

Analysis-sense sparseness: the response of X-let filters is sparse [Mallat 89, Olshausen & Field 96]

Synthesis-sense sparseness: we can increase sparseness by non-linear analysis

X-let-based representations are compressible, meaning that most of the energy is concentrated in a few coefficients

Page 14:

Sparseness: less is more

Dictionary: X-lets, elementary functions. Idea: how to express it, combining another way?

Non-linear subband: SPARSER

Taking around 3.5% of the total coefficients: with fewer coefficients we achieve strict sparseness, at the price of only approximating the image

PSNR: 35.67 dB

Page 15:

Incoherence

Sparse signals in a given dictionary must be dense in another, incoherent one

The sampling dictionary should be incoherent w.r.t. the one where the signal is sparse/compressible

Figure: a time-sparse signal and its frequency-dense representation
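This duality can be checked directly. A small sketch (illustrative sizes): a few time-domain spikes spread their energy across essentially all frequencies.

```python
import numpy as np

# A signal that is sparse in time is dense in frequency (and vice versa).
N = 256
x = np.zeros(N)
x[[10, 77, 200]] = 1.0             # 3 spikes: sparse in the time domain

X = np.fft.fft(x)
time_active = int(np.sum(np.abs(x) > 1e-12))
freq_active = int(np.sum(np.abs(X) > 1e-6))
print(time_active, "active samples vs.", freq_active, "active frequencies")
```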

Page 16:

Measurement and recovery processes

Measurement process: sparseness + incoherence; random sampling will do

Recovery process: numerical non-linear optimization is able to exactly recover the signal given the measurements

Page 17:

CS relies on:

A priori knowledge: Many natural signals are sparse or compressible in a proper basis

Nice sensing dictionary: Signals should be dense when using the sampling waveforms

Appropriate sensing: Random sampling has been demonstrated to work well

Recovery process: Bounds for exact recovery depend on the optimization method

Page 18:

Summary

CS is a simple and efficient signal acquisition protocol which samples at a reduced rate and later uses computational power to reconstruct the signal from what appears to be an incomplete set of measurements

CS is universal, democratic and asymmetrical

Page 19:

CONTENTS

1. WHAT? Introduction to Compressed Sensing (CS)

2. HOW? Theory behind CS

3. FOR WHAT PURPOSE? CS applications

4. AND THEN? Active research and future lines

Page 20:

The sensing problem

x: original discrete signal (vector); Φ: sampling dictionary (matrix); y: sampled signal (vector)

Page 21:

The sensing problem

Traditional sampling: y = I·x

where the sampled signal y (N × 1) equals the N × N identity sampling dictionary I times the original signal x (N × 1)

Page 22:

The sensing problem

When the signal is sparse/compressible, we can directly acquire a condensed representation with no/little information loss. Random projection will work if M = O(K log(N/K)) [Candès et al., Donoho, 2004]

y = Φ·x, where y is M × 1, Φ is M × N, and x is N × 1 with K nonzero entries, K < M << N
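A minimal numerical sketch of this sensing step (the sizes and the constant in M = O(K log(N/K)) are illustrative assumptions):

```python
import numpy as np

# Acquire M << N random measurements of a K-sparse signal x with a
# random Gaussian sensing matrix Phi (names are illustrative).
rng = np.random.default_rng(1)
N, K = 512, 10
M = int(4 * K * np.log(N / K))     # M = O(K log(N/K)) measurements

x = np.zeros(N)
support = rng.choice(N, K, replace=False)
x[support] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # random sensing matrix
y = Phi @ x                        # y is M x 1: the compressed measurements
print(f"{M} measurements instead of {N} samples")
```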

Page 23:

Universality

Random measurements can be used if the signal is sparse/compressible in any basis

y = Φ·Ψ·a, where y is M × 1, Φ is M × N, Ψ is N × N, and a is N × 1 with K nonzero entries, K < M << N

Page 24:

Good sensing waveforms?

Φ and Ψ should be incoherent. Measure the largest correlation between any two elements:

μ(Φ, Ψ) = √N · max_{k,j} |⟨φ_k, ψ_j⟩|

A large correlation means low incoherence. Examples of highly incoherent pairs: the spike and Fourier bases (maximal incoherence), and a random basis with any fixed basis
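The spike/Fourier pair can be verified numerically. A small sketch of the coherence computation using the √N·max formula above:

```python
import numpy as np

# Coherence mu(Phi, Psi) = sqrt(N) * max |<phi_k, psi_j>| between the
# spike (identity) basis and the orthonormal Fourier basis.
N = 64
Phi = np.eye(N)                               # spike basis
F = np.fft.fft(np.eye(N)) / np.sqrt(N)        # orthonormal DFT basis
mu = np.sqrt(N) * np.max(np.abs(Phi @ F.conj().T))
print(mu)   # 1.0 (up to floating point): maximal incoherence
```

Every Fourier atom has entries of magnitude 1/√N, so the largest inner product with any spike is 1/√N and μ = 1, the smallest value the coherence can take.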

Page 25:

Solution: sensing randomly

Random measurements (M = O(K log(N/K))) → Transmit (M) → Receive → Reconstruct

We have set up the encoder. Let's now study the decoder

Page 26:

CS recovery

Assume a is K-sparse and y = ΦΨa. We can recover a by solving:

min ||a||₀ subject to y = ΦΨa

where ||a||₀ counts the number of active coefficients. This is an NP-hard (combinatorial) problem, so we use a tractable approximation instead

Page 27:

Robust CS recovery

What if a is only compressible and y = ΦΨa + n, with n an unknown error term?

Isometry constant δ_K of Φ: the smallest δ_K such that, for all K-sparse vectors x:

(1 - δ_K)·||x||₂² ≤ ||Φx||₂² ≤ (1 + δ_K)·||x||₂²

Φ obeys a Restricted Isometry Property (RIP) if δ_K is not too close to 1. If Φ obeys a RIP, any subset of K columns is nearly orthogonal. To recover K-sparse signals we need δ_2K < 1 (unique solution)
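The near-isometry on sparse vectors can be probed empirically. A rough sketch (sizes illustrative; this samples random supports only, so it underestimates the true δ_K):

```python
import numpy as np

# For a random Gaussian Phi (columns scaled by 1/sqrt(M)), ||Phi x||^2
# stays close to ||x||^2 on random K-sparse vectors: the RIP in action.
rng = np.random.default_rng(2)
M, N, K = 128, 512, 8
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

ratios = []
for _ in range(200):
    S = rng.choice(N, K, replace=False)   # random K-sparse support
    x = np.zeros(N)
    x[S] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2)

delta = max(abs(r - 1) for r in ratios)   # crude lower bound on delta_K
print(f"empirical deviation from isometry: {delta:.3f}")
```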

Page 28:

Recovery techniques

Minimization of L1-norm Greedy techniques Iterative thresholding Total-variation minimization …

Page 29:

Recovery by minimizing L1-norm

Sum of absolute values: min ||a||₁ subject to y = ΦΨa

Convexity: a tractable problem, solvable by linear or second-order cone programming. For some constant C > 0, the L1 solution â₁ equals the true â if M ≥ C·μ²(Φ, Ψ)·K·log N
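A minimal basis-pursuit sketch (illustrative sizes; it uses SciPy's generic LP solver rather than a dedicated CS package): the L1 problem is recast as a linear program with a = u - v, u, v ≥ 0.

```python
import numpy as np
from scipy.optimize import linprog

# Recover a K-sparse vector from M < N random measurements by solving
# min ||a||_1 s.t. A a = y as a linear program (a = u - v, u, v >= 0).
rng = np.random.default_rng(3)
N, M, K = 128, 60, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)

a_true = np.zeros(N)
a_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = A @ a_true

c = np.ones(2 * N)                    # objective: sum(u) + sum(v) = ||a||_1
A_eq = np.hstack([A, -A])             # equality constraint: A(u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
a_hat = res.x[:N] - res.x[N:]
print(f"max recovery error: {np.max(np.abs(a_hat - a_true)):.2e}")
```

With M well above the K log(N/K) threshold, the LP returns the sparse vector exactly (up to solver tolerance).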

Page 30:

Recovery by minimizing L1-norm

Noisy data: solve the LASSO-type problem

min ||a||₁ subject to ||y - ΦΨa||₂ ≤ ε

a convex problem solvable via second-order cone programming (SOCP). If δ_2K < √2 - 1, then:

||â - a||₂ ≤ C₀·||a - a_K||₁/√K + C₁·ε

where a_K is the best K-term approximation of a

Page 31:

Example of L1 recovery

A (120 × 512): random orthonormal matrix. Perfect recovery of x by L1-minimization

Figure: the sparse signal x and its measurements y = Ax

Page 32:

Recovery by Greedy Pursuit

Algorithm:
1. New active component: the atom φᵢ most correlated with y
2. Find the best approximation y′ to y using the active components
3. Subtract y′ from y to form the residual e
4. Set y = e and repeat

Very fast for small-scale problems, but not as accurate/robust for large signals in the presence of noise
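The greedy steps above can be sketched in a few lines (a plain matching-pursuit variant; sizes and coefficient values are illustrative):

```python
import numpy as np

# Matching pursuit: greedily pick the dictionary atom most correlated
# with the current residual and subtract its contribution.
def matching_pursuit(A, y, n_iters):
    residual = y.copy()
    coeffs = np.zeros(A.shape[1])
    for _ in range(n_iters):
        corr = A.T @ residual                 # correlate atoms with residual
        i = np.argmax(np.abs(corr))           # most correlated atom
        coeffs[i] += corr[i]                  # update its coefficient
        residual -= corr[i] * A[:, i]         # subtract the approximation
    return coeffs

rng = np.random.default_rng(4)
M, N, K = 64, 128, 4
A = rng.standard_normal((M, N))
A /= np.linalg.norm(A, axis=0)                # unit-norm atoms
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = [3.0, -2.0, 4.0, 1.0]
y = A @ x

x_hat = matching_pursuit(A, y, 50)
print(f"residual norm: {np.linalg.norm(y - A @ x_hat):.2e}")
```

Orthogonal matching pursuit (step 2 above, a least-squares refit over all active atoms) converges faster; this plain variant keeps the code minimal.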

Page 33:

Recovery by Iterative Thresholding

Algorithm: iterate between a shrinkage/thresholding operation and a projection onto perfect reconstruction (consistency with the measurements)

If soft-thresholding is used, the theory is analogous to that of L1-minimization. If hard-thresholding is used, the error is within a constant factor of the best attainable estimation error [Blumensath 08]
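A hard-thresholding sketch of this scheme (step size and problem sizes are illustrative assumptions, not the exact algorithm from the cited paper):

```python
import numpy as np

# Iterative hard thresholding (IHT): alternate a gradient step toward
# data consistency with keeping only the K largest coefficients.
def iht(A, y, K, n_iters=300):
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # stable gradient step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = x + step * (A.T @ (y - A @ x))   # gradient step on ||y - Ax||^2
        small = np.argsort(np.abs(x))[:-K]   # hard threshold: zero all but
        x[small] = 0.0                       # the K largest entries
    return x

rng = np.random.default_rng(5)
M, N, K = 80, 160, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = [5.0, -4.0, 3.0, 6.0, -7.0]
y = A @ x_true

x_hat = iht(A, y, K)
print(f"relative residual: "
      f"{np.linalg.norm(y - A @ x_hat) / np.linalg.norm(y):.2e}")
```

Replacing the hard threshold with soft-thresholding (shrinking all entries toward zero) gives the ISTA family, whose behavior mirrors L1-minimization.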

Page 34:

Recovery by TV minimization

Sparseness: signals have few "jumps". Convexity: tractable problem. Accurate and robust, but can be slow for large-scale problems

Page 35:

Example of TV recovery

Φ: Fourier transform. Perfect recovery of x by TV-minimization

Figure: the original x, the least-squares reconstruction x_LS, and the TV reconstruction, which equals x

Page 36:

Summary

Sensing: use random sampling in dictionaries with low coherence to the one where the signal is sparse. Choose M wisely

Recovery: a wide range of techniques is available. L1-minimization seems to work well, but choose the one that best fits your needs

Page 37:

CONTENTS

1. WHAT? Introduction to Compressed Sensing (CS)

2. HOW? Theory behind CS

3. FOR WHAT PURPOSE? CS applications

4. AND THEN? Active research and future lines

Page 38:

Some CS applications

Data compression, compressive imaging, detection/classification/estimation/learning, medical imaging, analog-to-information conversion, biosensing, geophysical data analysis, hyperspectral imaging, compressive radar, astronomy, communications, surface metrology, spectrum analysis, …

Page 39:

Data compression

The sparse basis may be unknown or impractical to implement at the encoder

A randomly designed Φ can be considered a universal encoding strategy

This may be helpful for distributed source coding in multi-signal settings

[Baron et al. 05, Haupt and Nowak 06,…]

Page 40:

Magnetic resonance imaging

Page 41:

Rice Single-Pixel CS Camera

Page 42:

Rice Analog-to-Information conversion

Converts an analog input signal into discrete digital measurements

An extension of the A/D converter that samples at the signal's information rate rather than its Nyquist rate

Page 43:

CS in Astronomy [Bobin et al 08]

Desperate need for data compression: resolution, sensitivity and photometry are important. On the Herschel satellite (ESA, 2009), conventional compression cannot be used

CS can help with: new compressive sensors; a flexible compression/decompression scheme; computational cost O(t) vs. JPEG 2000's O(t log t); decoupling of compression and decompression

CS outperforms conventional compression

Page 44:

CONTENTS

1. WHAT? Introduction to Compressed Sensing (CS)

2. HOW? Theory behind CS

3. FOR WHAT PURPOSE? CS applications

4. AND THEN? Active research and future lines

Page 45:

CS is a very active area

Page 46:

CS is a very active area

More than seventy 2008 papers in the CS repository. Most active areas: new applications (de-noising, learning, video, …) and new recovery methods (non-convex, variational, CoSaMP, …)

ICIP 08 titles: "Compressed sensing for multi-view tracking and 3-D voxel reconstruction", "Compressive image fusion", "Image representation by compressed sensing", "Kalman filtered compressed sensing", "Nonconvex compressive sensing and reconstruction of gradient-sparse images: random vs. tomographic Fourier sampling"

Page 47:

Conclusions

CS is a new technique for acquiring and compressing images simultaneously

Sparseness + Incoherence + random sampling allows perfect reconstruction under some conditions

A wide range of applications is possible

A big research effort is now focused on recovery techniques

Page 48:

Our future lines?

Convex CS: TV-regularization

Non-convex CS: L0-GM for CS; intermediate Lp norms (0 < p < 1) for CS

CS applications: super-resolved sampling? Detection, estimation, classification, …

Page 49:

Thank you

See references and software here:

http://www.dsp.ece.rice.edu/cs/