  1. Sparsity, Randomness and Compressed Sensing. Petros Boufounos, Mitsubishi Electric Research Labs, petrosb@merl.com

  2. Sparsity


  3. Why Sparsity
  • Natural data and signals exhibit structure
  • Sparsity often captures that structure
  • Very general signal model
  • Computationally tractable
  • Wide range of applications in signal acquisition, processing, and transmission


  4. Signal Representations

  5. Signal example: Images
  • 2-D function
  • Idealized view: some function defined over a continuous 2-D space



  6. Signal example: Images
  • 2-D function
  • Idealized view: some function defined over a continuous 2-D space
  • In practice: a discrete matrix of samples


  7. Signal example: Images
  • 2-D function
  • Idealized view: some function defined over a continuous 2-D space
  • In practice: a discrete matrix of samples (each entry a pixel average)


  8. Signal Models
  Classical model: the signal lies in a linear vector space X (e.g., bandlimited functions).
  Sparse model: signals of interest are often sparse or compressible, i.e., very few large coefficients, many close to zero.
  Examples (signal / sparsifying transform): image / wavelet; bat sonar / Gabor chirp (STFT).

  9. Sparse Signal Models
  Sparse signals have few non-zero coefficients: a 1-sparse signal lies on a coordinate axis, a 2-sparse signal on a coordinate plane.
  Compressible signals have few significant coefficients; the sorted coefficients decay as a power law (an ℓp ball with p < 1).

  10. Sparse Approximation

  11. Computational Harmonic Analysis
  • Representation: f = Σ_i a_i b_i, with coefficients a_i in a basis or frame {b_i}
  • Analysis: study f through the structure of its coefficients, which should extract the features of interest
  • Approximation: use just a few terms, exploiting the sparsity of the coefficients

  12. Wavelet Transform Sparsity
  • Many small coefficients (blue in the figure); only a few are large
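
To make the slide's point concrete, here is a minimal sketch (not from the deck; the Haar construction and names are illustrative) showing that a piecewise-constant signal has almost all zero detail coefficients under a single-level Haar transform:

```python
import numpy as np

# Piecewise-constant signal with a single jump between samples 6 and 7
x = np.concatenate([np.full(7, 1.0), np.full(9, 4.0)])

pairs = x.reshape(-1, 2)                      # 8 adjacent sample pairs
averages = pairs.mean(axis=1)                 # coarse approximation
details = (pairs[:, 0] - pairs[:, 1]) / 2.0   # Haar detail coefficients

print(details)  # non-zero only for the pair straddling the jump
```

Only the pair straddling the discontinuity produces a non-zero detail coefficient; piecewise-smooth signals behave similarly at fine scales.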


  13. Sparseness ⇒ Approximation
  Sorted by index, the coefficients show a few big values and many small ones.


  14. Linear Approximation


  15. Linear Approximation
  • K-term approximation: use the "first" K coefficients (a fixed set, chosen independently of the signal)


  16. Nonlinear Approximation
  • K-term approximation: use the K largest coefficients, selected independently for each signal
  • Greedy selection / thresholding of the sorted coefficients (keep the few big ones)
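
A minimal sketch contrasting slides 15 and 16 (illustrative NumPy code, not from the deck): linear approximation keeps a fixed set of coefficients, while nonlinear approximation keeps the largest ones.

```python
import numpy as np

def linear_k_term(coeffs, k):
    """Linear approximation: keep the first k coefficients (fixed set)."""
    out = np.zeros_like(coeffs)
    out[:k] = coeffs[:k]
    return out

def nonlinear_k_term(coeffs, k):
    """Nonlinear approximation: keep the k largest-magnitude coefficients."""
    out = np.zeros_like(coeffs)
    idx = np.argsort(np.abs(coeffs))[-k:]   # locations of the k largest
    out[idx] = coeffs[idx]
    return out

a = np.array([0.1, 5.0, 0.05, -3.0, 0.2])
print(linear_k_term(a, 2))     # [0.1  5.   0.   0.   0. ]
print(nonlinear_k_term(a, 2))  # [0.   5.   0.  -3.   0. ]
```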

  17. Approximation Error Rates
  • Optimize the asymptotic decay rate of the approximation error as K grows
  • Nonlinear approximation works better than linear


  18. Compression is Approximation
  • Lossy compression of an image creates an approximation: choose coefficients in a basis or frame, then quantize them to a total bit budget

  19. Sparse Approximation ≠ Compression
  • Sparse approximation chooses coefficients (by thresholding) but does not quantize them or worry about encoding their locations

  20. Location, Location, Location
  • Nonlinear approximation selects the K largest coefficients to minimize error (easy: threshold)
  • A compression algorithm must encode both a set of coefficient values and their locations (harder)







  21. Exposing Sparsity

  22. Spikes and Sinusoids Example
  Example signal model: a sinusoid with a few spikes. Represent f = B a, where B is the DCT basis.

  23. Spikes and Sinusoids Dictionary
  Represent f = D a, where the dictionary D concatenates the DCT basis with impulses. Uniqueness is lost!

  24. Overcomplete Dictionaries
  Strategy: improve sparse approximation by constructing a large dictionary, f = D a. How do we design a dictionary?
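
A minimal sketch of such a dictionary (assuming NumPy/SciPy; the sizes are illustrative): concatenate an orthonormal DCT basis with the impulse (identity) basis, as in the spikes-and-sinusoids example.

```python
import numpy as np
from scipy.fft import dct

N = 64
B = dct(np.eye(N), axis=0, norm='ortho')  # orthonormal DCT basis as columns
D = np.hstack([B, np.eye(N)])             # N x 2N dictionary: sinusoids + impulses
# A signal f is then approximated as D @ a for a sparse coefficient vector a.
```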

  25. Dictionary Design
  Candidate elements: wavelets; DCT, DFT; edgelets, curvelets, ...; impulse basis; oversampled frames; ... all thrown into the dictionary D. Can we just throw everything we know in the bucket?

  26. Dictionary Design Considerations
  • Dictionary size:
  – Computation and storage increase with size
  • Fast transforms:
  – FFT, DCT, FWT, etc. dramatically decrease computation and storage
  • Coherence:
  – Similarity between elements makes the solution harder


  27. Dictionary Coherence
  Two candidate dictionaries, D1 and D2. Intuition: D2 has too many similar elements; it is very coherent, which is bad. Coherence (similarity) between two elements: ⟨d_i, d_j⟩. Dictionary coherence: μ = max_{i≠j} |⟨d_i, d_j⟩|.
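
A minimal sketch of computing this quantity, assuming unit-norm dictionary columns (illustrative code, not from the deck):

```python
import numpy as np

def coherence(D):
    """Mutual coherence of a dictionary with unit-norm columns."""
    G = np.abs(D.T @ D)       # pairwise inner products (Gram matrix)
    np.fill_diagonal(G, 0.0)  # ignore <d_i, d_i> = 1
    return G.max()

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)  # normalize columns
print(coherence(D))
```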

  28. Incoherent Bases
  • "Mix" the signal components well
  – Impulses and the Fourier basis
  – Anything and a random Gaussian basis
  – Anything and a random 0-1 basis


  29. Computing Sparse Representations

  30. Thresholding
  Compute the full set of coefficients, a = D†f, then zero out the small ones. Computationally efficient; good for small and very incoherent dictionaries.
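
A minimal sketch of the thresholding recipe above (illustrative; D† is computed as a pseudoinverse):

```python
import numpy as np

def threshold_approx(D, f, thresh):
    a = np.linalg.pinv(D) @ f     # all coefficients: a = D†f
    a[np.abs(a) < thresh] = 0.0   # zero out the small ones
    return a
```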

  31. Matching Pursuit
  Measure the image against the dictionary: ρ = D^T f, with ρ_k = ⟨d_k, f⟩. Select the largest correlation ρ_k, add it to the representation (a_k ← a_k + ρ_k), compute the residual f ← f − ρ_k d_k, and iterate using the residual.
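
A minimal sketch of this iteration, assuming unit-norm dictionary columns (variable names are illustrative):

```python
import numpy as np

def matching_pursuit(D, f, n_iters):
    a = np.zeros(D.shape[1])
    r = f.copy()                    # residual
    for _ in range(n_iters):
        rho = D.T @ r               # correlations <d_k, r>
        k = np.argmax(np.abs(rho))  # select the largest correlation
        a[k] += rho[k]              # add to the representation
        r -= rho[k] * D[:, k]       # update the residual
    return a
```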

  32. Greedy Pursuits Family
  • Several variations of MP: OMP, StOMP, ROMP, CoSaMP, Tree MP, ... (you can create an AndrewMP if you work on it...)
  • Some have provable guarantees
  • Some improve the dictionary search
  • Some improve the coefficient selection


  33. CoSaMP (Compressive Sampling MP)
  Measure the image against the dictionary: ρ_k = ⟨d_k, f⟩. Add the locations of the 2K largest correlations to the support set: Ω = supp(ρ|_2K) ∪ T. Invert over the support: b = D_Ω† f. Truncate to the K largest entries: T = supp(b|_K), a = b|_K. Compute the residual r ← f − D a and iterate using the residual.
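
A minimal sketch of this iteration, assuming unit-norm columns and known sparsity K (an illustrative sketch, not a tuned implementation):

```python
import numpy as np

def cosamp(D, f, K, n_iters):
    a = np.zeros(D.shape[1])
    r = f.copy()
    for _ in range(n_iters):
        rho = D.T @ r
        omega = np.argsort(np.abs(rho))[-2 * K:]        # 2K largest correlations
        support = np.union1d(omega, np.flatnonzero(a))  # merge with current support
        b = np.zeros(D.shape[1])
        b[support] = np.linalg.pinv(D[:, support]) @ f  # least squares over support
        a = np.zeros(D.shape[1])
        keep = np.argsort(np.abs(b))[-K:]               # truncate to K largest
        a[keep] = b[keep]
        r = f - D @ a                                   # new residual
    return a
```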

  34. Optimization (Basis Pursuit)
  Sparse approximation: minimize the number of non-zeros in the representation, subject to the representation being close to the signal:
  min ‖a‖_0  s.t.  f ≈ D a
  Here ‖a‖_0 counts the non-zeros (the sparsity measure) and f ≈ D a enforces data fidelity (approximation quality). Combinatorial complexity: a very hard problem!

  35. Optimization (Basis Pursuit)
  Convex relaxation of the combinatorial problem:
  min ‖a‖_0  s.t.  f ≈ D a   →   min ‖a‖_1  s.t.  f ≈ D a
  Polynomial complexity: solved using linear programming.
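
A minimal sketch of the linear-programming route, using the standard split a = u − v with u, v ≥ 0 so that ‖a‖_1 = Σ(u + v) (illustrative code assuming SciPy; the fidelity constraint is taken here as exact equality f = D a):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(D, f):
    """Solve min ||a||_1 s.t. D a = f as a linear program."""
    M, N = D.shape
    c = np.ones(2 * N)         # objective: sum(u) + sum(v) = ||a||_1
    A_eq = np.hstack([D, -D])  # constraint: D u - D v = f
    res = linprog(c, A_eq=A_eq, b_eq=f, bounds=(0, None))
    u, v = res.x[:N], res.x[N:]
    return u - v
```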

  36. Why ℓ1 Relaxation Works
  min ‖a‖_1  s.t.  f ≈ D a
  The ℓ1 "ball" is pointy: growing it until it touches the constraint set f = D a typically makes first contact at a sparse solution.

  37. Basis Pursuits
  • Have provable guarantees
  – Find the sparsest solution for incoherent dictionaries
  • Several variants in formulation: BPDN, LASSO, Dantzig selector, ...
  • Variations on the fidelity term and the relaxation choice
  • Several fast algorithms: FPC, GPSR, SPGL, ...


  38. Compressed Sensing: Sensing, Sampling and Data Processing


  39. Data Acquisition
  • Usual acquisition methods sample signals uniformly
  – Time: A/D with microphones, geophones, hydrophones
  – Space: CCD cameras, sensor arrays
  • Foundation: Nyquist/Shannon sampling theory
  – Sample at twice the signal bandwidth
  – Generally a projection onto a complete basis that spans the signal space


  40. Data Processing and Transmission
  • Data processing steps:
  – Sample densely: signal x, N coefficients
  – Transform to an informative domain (Fourier, wavelet): K << N significant coefficients
  – Process/compress/transmit: set small coefficients to zero (sparsification)
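
A minimal sketch of these three steps (illustrative; the DCT stands in for the "informative domain" and the signal is synthetic):

```python
import numpy as np
from scipy.fft import dct, idct

N, K = 256, 10
t = np.arange(N)
x = np.cos(2 * np.pi * 5 * t / N)    # densely sampled signal (N samples)

c = dct(x, norm='ortho')             # transform: few significant coefficients
c[np.argsort(np.abs(c))[:-K]] = 0.0  # sparsify: keep only the K largest
x_hat = idct(c, norm='ortho')        # signal rebuilt from K coefficients
```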


  41. Sparsity Model
  • Signals can usually be compressed in some basis
  • Sparsity is a good prior for picking one signal out of a lot of candidates


  42. Compressive Sensing Principles
  If a signal is sparse, do not waste effort sampling the empty space. Instead, use fewer samples and allow ambiguity. Use the sparsity model (e.g., 1-sparse or 2-sparse) to reconstruct the signal and uniquely resolve the ambiguity.

  43. Measuring Sparse Signals

  44. Compressive Measurements
  Measurement: a projection y = Φx, where Φ has rank M ≪ N. Reconstruction must resolve the resulting ambiguity. Here N is the signal dimensionality, M the number of measurements (the dimensionality of y), and K the signal sparsity, with N ≫ M ≳ K.
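
A minimal sketch of taking M ≪ N random measurements of a K-sparse signal (dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1000, 50, 5                           # N >> M >= K

x = np.zeros(N)                                 # K-sparse signal
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # random measurement matrix
y = Phi @ x                                     # M measurements, M << N
```

Under suitable conditions on Φ, x can then be recovered from y, for example with the basis-pursuit sketch after slide 35 via x_hat = basis_pursuit(Phi, y).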

  45. One Simple Question

  46. Geometry of Sparse Signal Sets

  47. Geometry: Embedding in R^M

  48. Illustrative Example

  49. Example: 1-sparse signal (N = 3, K = 1, M = 2)
  Bad measurement choices: projecting onto y1 = x2, y2 = x3 misses any signal supported on x1 alone, and a projection with y1 = x1 = x2 cannot distinguish a spike on x1 from a spike on x2. Bad!

  50. Example: 1-sparse signal (N = 3, K = 1, M = 2)
  Good measurement choices: projections not aligned with the coordinate axes keep distinct 1-sparse signals distinguishable. Good! A generic (random) orientation is even better!

  51. Restricted Isometry Property

  52. RIP as a “Stable” Embedding

  53. Verifying RIP

  54. Universality Property

  55. Universality Property
