
COMPRESSED SENSING
Luis Mancera
Visual Information Processing Group
Dep. Computer Science and AI
Universidad de Granada
CONTENTS
1. WHAT?
 Introduction to Compressed Sensing (CS)
2. HOW?
 Theory behind CS
3. FOR WHAT PURPOSE?
 CS applications
4. AND THEN?
 Active research and future lines
Transmission scheme
[Diagram: the classical transmission pipeline, a brick wall to performance. Sample (N samples) → Compress (keep K coefficients, N >> K) → Transmit → Receive → Decompress (back to N). Why so many samples?]
Natural signals are sparse/compressible, so compression incurs no significant perceptual loss.
Shannon/Nyquist theorem
The Shannon/Nyquist theorem tells us to take a sample every 1/(2W) seconds, where W is the highest frequency of the signal.
This is a worst-case bound valid for ANY bandlimited signal.
Sparse/compressible signals are a favorable case.
CS solution: merge sampling and compression.
Compressed Sensing (CS)
[Diagram: the CS pipeline. Acquire M measurements directly (K < M << N) → Transmit → Receive → Reconstruct (back to N). What do we need for CS to succeed?]
Recover sparse signals by directly acquiring compressed data.
Replace samples with measurements.
We now know how to Sense Compressively
[Cartoon: a soldier's long answer, "I'm glad this battle is over. Finally my military period is over. I will now go back to Motril and get married, and then I will raise pigs as I have always wanted to do", is acquired directly in compressed form by his friend's question: "Do you mean you're glad this battle is over because now you've finished here and you will go back to Motril, get married, and raise pigs as you always wanted to?" "Aye." "Cool!"]
What does CS need?
- A nice sensing dictionary
- Appropriate sensing
- A priori knowledge
- A recovery process
[Cartoon: speech bubbles contrasting words and ideas. One speaker recites "Saint Roque's dog has no tail"; another asks in German "Wie lange wird das nehmen?" ("How long will it take?"), drawing a puzzled "What?"; a third says "I know this guy so well that I know what he means". Prior knowledge lets the idea be recovered from few words.]
CS needs:
- A nice sensing dictionary → INCOHERENCE
- Appropriate sensing → RANDOMNESS
- A priori knowledge → SPARSENESS
- A recovery process → OPTIMIZATION
Sparseness: less is more
Idea: a stranger approaching a hut by the only known road, the valley. How to express it? By combining elements (words)…
J.F. Cooper, Wyandotte: "He was advancing by the only road that was ever traveled by the stranger as he approached the Hut; or, he came up the valley."
E.A. Poe, comments to Wyandotte: "He was advancing by the valley, the only road traveled by a stranger approaching the Hut." (Hummm, you could say the same using fewer words…)
Sparseness: less is more

Sparseness: the property of being small in numbers or amount, often scattered over a large area [Cambridge Advanced Learner's Dictionary]
[Figure: a certain distribution vs. a sparser distribution]
Sparseness: less is more
Pixels: not sparse.
A new domain can increase sparseness.
[Figure: the original Einstein image vs. reconstructions keeping 10% of the pixels, 10% of the Fourier coefficients, and 10% of the wavelet coefficients]
Sparseness: less is more
Dictionary: how to express it? X-lets: elementary functions (atoms), used through linear or non-linear analysis.
Synthesis-sense sparseness: we can increase sparseness by non-linear analysis; X-let-based representations are compressible, meaning that most of the energy is concentrated in few coefficients (non-linear subband).
Analysis-sense sparseness: the response of X-let filters is sparse (linear subband).
[Mallat 89, Olshausen & Field 96]
Sparseness: less is more
Idea: combine the dictionary elements (X-lets) another way, taking around 3.5% of the total coefficients. By taking fewer coefficients we achieve strict sparseness, at the price of only approximating the image (PSNR: 35.67 dB).
[Figure: non-linear subband approximation]
Incoherence
Sparse signals in a given dictionary must be dense in another, incoherent one.
The sampling dictionary should be incoherent w.r.t. the one where the signal is sparse/compressible.
[Figure: a time-sparse signal and its frequency-dense representation]
Measurement and recovery processes

Measurement process: sparseness + incoherence → random sampling will do.
Recovery process: numerical non-linear optimization is able to exactly recover the signal from the measurements.
CS relies on:
- A priori knowledge: many natural signals are sparse or compressible in a proper basis
- Nice sensing dictionary: signals should be dense when expressed using the sampling waveforms
- Appropriate sensing: random sampling has been demonstrated to work well
- Recovery process: bounds for exact recovery depend on the optimization method
Summary
CS is a simple and efficient signal acquisition protocol which samples at a reduced rate and later uses computational power to reconstruct the signal from what appears to be an incomplete set of measurements.
CS is universal, democratic and asymmetrical.
CONTENTS
1. WHAT?
 Introduction to Compressed Sensing (CS)
2. HOW?
 Theory behind CS
3. FOR WHAT PURPOSE?
 CS applications
4. AND THEN?
 Active research and future lines
The sensing problem
x: original discrete signal (vector)
Φ: sampling dictionary (matrix)
y = Φx: sampled signal (vector)
The sensing problem

Traditional sampling: y = Φx with Φ = I.
[Diagram: the sampled signal y (N×1) equals the identity sampling dictionary (N×N) times the original signal x (N×1)]
The sensing problem
When the signal is sparse/compressible, we can directly acquire a condensed representation with no/little information loss.
Random projection will work if M = O(K log(N/K)) [Candès et al., Donoho, 2004].
[Diagram: y (M×1) = Φ (M×N) · x (N×1), where x has K nonzero entries and K < M << N]
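To make the measurement step concrete, here is a minimal sketch in Python (numpy assumed; the sizes N, M and K are illustrative, not prescribed by the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 512, 120, 10                  # K < M << N

# K-sparse signal: K nonzero entries at random positions
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

# Random sensing dictionary Phi (M x N), i.i.d. Gaussian entries
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Compressed measurements: M numbers acquired instead of N samples
y = Phi @ x
print(y.shape)                          # (120,)
```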
Universality

Random measurements can be used if the signal is sparse/compressible in any basis.
[Diagram: y (M×1) = Φ (M×N) · Ψ (N×N) · α (N×1), where α has K nonzero entries and K < M << N]
Good sensing waveforms?
Φ and Ψ should be incoherent.
Measure the largest correlation between any two elements:
μ(Φ, Ψ) = √N · max_{k,j} |⟨φ_k, ψ_j⟩|
Large correlation → low incoherence.
Examples:
- Spike and Fourier basis (maximal incoherence)
- Random and any fixed basis
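As a quick numerical check of the spike/Fourier example, a minimal sketch (numpy assumed) computing the coherence defined above:

```python
import numpy as np

N = 64
Phi = np.eye(N)                           # spike (identity) sensing basis
Psi = np.fft.fft(np.eye(N)) / np.sqrt(N)  # orthonormal Fourier basis

# Coherence: largest correlation between any sensing/sparsity pair,
# scaled by sqrt(N) so that mu ranges in [1, sqrt(N)]
mu = np.sqrt(N) * np.max(np.abs(Phi @ Psi))
print(mu)                                 # ~1.0: maximal incoherence
```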
Solution: sensing randomly, with M = O(K log(N/K)) random measurements.
[Diagram: the encoder takes M random measurements of the length-N signal and transmits them; the receiver reconstructs the signal from the M received measurements]
We have set up the encoder. Let's now study the decoder.
CS recovery
Assume α is K-sparse and y = ΦΨα.
We can recover α by solving:
min ||α||_0 subject to y = ΦΨα
where the L0 "norm" counts the number of active coefficients.
This is an NP-hard (combinatorial) problem, so we use some tractable approximation.
Robust CS recovery
What if α is only compressible and y = Φ(Ψα + n), with n an unknown error term?
Isometry constant of Φ: the smallest δ_K such that, for all K-sparse vectors x:
(1 - δ_K) ||x||² ≤ ||Φx||² ≤ (1 + δ_K) ||x||²
Φ obeys a Restricted Isometry Property (RIP) if δ_K is not too close to 1.
Φ obeys a RIP → any subset of K columns is nearly orthogonal.
To recover K-sparse signals we need δ_2K < 1 (unique solution).
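The near-orthogonality of K-column subsets can be illustrated numerically. A minimal sketch (numpy assumed); sampling random subsets only yields an empirical lower bound on δ_K, not the constant itself:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 120, 512, 10
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# For K-sparse x supported on T, ||Phi x||^2 / ||x||^2 lies between the
# squared extreme singular values of the K-column submatrix Phi[:, T]
worst = 0.0
for _ in range(200):
    T = rng.choice(N, size=K, replace=False)
    s = np.linalg.svd(Phi[:, T], compute_uv=False)
    worst = max(worst, s.max() ** 2 - 1, 1 - s.min() ** 2)
print(worst)   # small: K columns of a random Phi are nearly orthogonal
```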
Recovery techniques
- Minimization of the L1-norm
- Greedy techniques
- Iterative thresholding
- Total-variation minimization
- …
Recovery by minimizing L1-norm
The L1-norm is the sum of absolute values; we solve:
min ||α||_1 subject to y = ΦΨα
Convexity: a tractable problem, solvable by linear or second-order programming.
For some constant C > 0, the L1 solution α̂₁ equals the true α if:
M ≥ C · μ²(Φ, Ψ) · K · log N
Recovery by minimizing L1-norm

Noisy data: solve the LASSO problem
min ||α||_1 subject to ||y - ΦΨα||_2 ≤ ε
This convex problem is solvable via second-order cone programming (SOCP).
If δ_2K < √2 - 1, then:
||α̂ - α||_2 ≤ C₀ ||α - α_K||_1 / √K + C₁ · ε
where α_K is the best K-term approximation of α.
Example of L1 recovery
[Figure: a sparse signal x; its measurements y = Ax, where A (120×512) is a random matrix with orthonormal rows; perfect recovery of x by L1-minimization]
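One standard way to solve the equality-constrained L1 problem is to recast it as a linear program by splitting α into nonnegative parts. A minimal sketch of such a recovery (numpy/scipy assumed; the sizes are illustrative, not those of the figure):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
M, N, K = 60, 128, 5
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
y = A @ x_true

# min ||x||_1 s.t. Ax = y, with x = u - v and u, v >= 0:
# minimize sum(u) + sum(v) subject to [A, -A][u; v] = y
c = np.ones(2 * N)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
print(np.max(np.abs(x_hat - x_true)))   # ~0: exact recovery (w.h.p.)
```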
Recovery by Greedy Pursuit

Algorithm (a sketch follows below):
1. New active component: the one whose corresponding φ_i is most correlated with y
2. Find the best approximation y' to y using the active components
3. Subtract y' from y to form the residual e
4. Set y = e and repeat
Very fast for small-scale problems, but not as accurate/robust for large signals in the presence of noise.
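A minimal sketch of the greedy loop above in the orthogonal-matching-pursuit style, where all active coefficients are refit at every step (numpy assumed; the fixed iteration count is a simplification of the stopping rule):

```python
import numpy as np

def omp(Phi, y, n_iters=10):
    """Greedy recovery: activate the atom most correlated with the
    residual, then refit the active coefficients by least squares."""
    active = []
    e = y.astype(float)
    alpha = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        i = int(np.argmax(np.abs(Phi.T @ e)))   # new active component
        if i not in active:
            active.append(i)
        # Best approximation y' to y using the active components
        coef, *_ = np.linalg.lstsq(Phi[:, active], y, rcond=None)
        e = y - Phi[:, active] @ coef           # residual e = y - y'
    alpha[active] = coef
    return alpha
```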
Recovery by Iterative Thresholding

Algorithm: iterate between a shrinkage/thresholding operation and a projection onto perfect reconstruction (a sketch follows below).
If soft-thresholding is used, the theory is analogous to that of L1-minimization.
If hard-thresholding is used, the error is within a constant factor of the best attainable estimation error [Blumensath 08].
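A minimal iterative hard thresholding sketch (numpy assumed): each pass takes a gradient step toward data consistency and then keeps only the K largest coefficients:

```python
import numpy as np

def iht(Phi, y, K, n_iters=200, step=1.0):
    """Iterative hard thresholding: gradient step on ||y - Phi a||^2,
    then keep the K largest-magnitude entries. For convergence, step
    should satisfy step * ||Phi||_2^2 <= 1 (scale Phi accordingly)."""
    a = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        a = a + step * (Phi.T @ (y - Phi @ a))   # data-consistency step
        small = np.argsort(np.abs(a))[:-K]       # all but the K largest
        a[small] = 0.0                           # hard threshold
    return a
```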
Recovery by TV minimization
Sparseness: signals have few "jumps".
Convexity: a tractable problem.
Accurate and robust, but can be slow for large-scale problems.
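For intuition, the discrete 1-D total variation is the L1 norm of the finite-difference "gradient", so signals with few jumps have small TV. A minimal sketch (numpy assumed):

```python
import numpy as np

def tv(x):
    """Discrete 1-D total variation: sum of absolute jumps."""
    return np.sum(np.abs(np.diff(x)))

x = np.array([0., 0., 0., 3., 3., 3., 1., 1.])   # piecewise constant
print(tv(x))   # 5.0 = |3 - 0| + |1 - 3|: only two jumps
```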
Example of TV recovery
[Figure: a test image x; its Fourier-domain sampling pattern Φ (Φ: Fourier transform); the least-squares reconstruction x_LS = Φ^T Φ x; perfect recovery of x by TV-minimization]
Summary

Sensing: use random sampling in dictionaries with low coherence w.r.t. the one where the signal is sparse, and choose M wisely.
Recovery: a wide range of techniques is available; L1-minimization seems to work well, but choose the one that best fits your needs.
CONTENTS
1. WHAT?
 Introduction to Compressed Sensing (CS)
2. HOW?
 Theory behind CS
3. FOR WHAT PURPOSE?
 CS applications
4. AND THEN?
 Active research and future lines
Some CS applications
- Data compression
- Compressive imaging
- Detection, classification, estimation, learning…
- Medical imaging
- Analog-to-information conversion
- Biosensing
- Geophysical data analysis
- Hyperspectral imaging
- Compressive radar
- Astronomy
- Communications
- Surface metrology
- Spectrum analysis
- …
Data compression
The sparse basis Ψ may be unknown or impractical to implement at the encoder.
A randomly designed Φ can be considered a universal encoding strategy.
This may be helpful for distributed source coding in multi-signal settings.
[Baron et al. 05, Haupt and Nowak 06, …]
Magnetic resonance imaging
Rice Single-Pixel CS Camera
Rice Analog-to-Information conversion
Converts an analog input signal into discrete digital measurements.
An extension of the A/D converter that samples at the signal's information rate rather than its Nyquist rate.
CS in Astronomy [Bobin et al 08]
- Desperate need for data compression
- Resolution, sensitivity and photometry are important
- Herschel satellite (ESA, 2009): conventional compression cannot be used
- CS can help with:
  - New compressive sensors
  - A flexible compression/decompression scheme: computational cost of Φx is O(t) vs. JPEG 2000's O(t log t), and compression is decoupled from decompression
- CS outperforms conventional compression
CONTENTS
1. WHAT?
 Introduction to Compressed Sensing (CS)
2. HOW?
 Theory behind CS
3. FOR WHAT PURPOSE?
 CS applications
4. AND THEN?
 Active research and future lines
CS is a very active area
More than seventy papers from 2008 in the CS repository.
Most active areas:
- New applications (de-noising, learning, video, …)
- New recovery methods (non-convex, variational, CoSaMP, …)
ICIP 08:
- COMPRESSED SENSING FOR MULTI-VIEW TRACKING AND 3-D VOXEL RECONSTRUCTION
- COMPRESSIVE IMAGE FUSION
- IMAGE REPRESENTATION BY COMPRESSED SENSING
- KALMAN FILTERED COMPRESSED SENSING
- NONCONVEX COMPRESSIVE SENSING AND RECONSTRUCTION OF GRADIENT-SPARSE IMAGES: RANDOM VS. TOMOGRAPHIC FOURIER SAMPLING
- …
Conclusions
CS is a new technique for acquiring and compressing images simultaneously.
Sparseness + incoherence + random sampling allow perfect reconstruction under some conditions.
A wide range of applications is possible.
A big research effort is now focused on recovery techniques.
Our future lines?
Convex CS:
- TV-regularization
Non-convex CS:
- L0-GM for CS
- Intermediate norms (0 < p < 1) for CS
CS applications:
- Super-resolved sampling?
- Detection, estimation, classification, …
Thank you
See references and software here:
http://www.dsp.ece.rice.edu/cs/