Joint Sparsity for Target Detection
Nasser M. Nasrabadi
U.S. Army Research Laboratory
Introduction
• Objective: segmentation of HSI into multiple classes (target and background), or classification of individual objects (military targets) from multiple views of the same physical target.
• Assumptions
  – Training data: known spectral characteristics (or images) of the different classes
  – Test data: a sparse linear combination of all training data
  – In HSI, neighboring pixels consist of similar materials
  – Multiple views of a target are similar
• Results are compared against classical SVM classifiers
Hyperspectral Imagery
Pixel-Wise Sparsity Model
• Background pixels approximately lie in a low-dimensional subspace:
  $x_i \approx \alpha_{i,1} a^b_1 + \alpha_{i,2} a^b_2 + \cdots + \alpha_{i,N_b} a^b_{N_b} = A^b \alpha^b_i$
• Target pixels also lie in a low-dimensional subspace:
  $x_i \approx A^t \alpha^t_i$
• A test sample can be sparsely represented by
  $x_i = A^b \alpha^b_i + A^t \alpha^t_i = [A^b \; A^t] \begin{bmatrix} \alpha^b_i \\ \alpha^t_i \end{bmatrix} = A \alpha_i$
  (a small sketch of this union-dictionary structure follows below)
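A minimal sketch of the model above, using synthetic dictionaries and hypothetical sizes (150 bands, $N_b$ background atoms, $N_t$ target atoms); it only illustrates the union-dictionary structure, not any real data:

```python
# Minimal sketch of the pixel-wise model above, with synthetic data and
# hypothetical sizes (150 bands, N_b background atoms, N_t target atoms).
import numpy as np

rng = np.random.default_rng(0)
B, N_b, N_t = 150, 200, 50
A_b = rng.random((B, N_b))        # background sub-dictionary (columns = training spectra)
A_t = rng.random((B, N_t))        # target sub-dictionary
A = np.hstack([A_b, A_t])         # union dictionary A = [A^b  A^t]

alpha = np.zeros(N_b + N_t)
alpha[[3, 17, N_b + 5]] = [0.6, 0.3, 0.1]   # only a few atoms are active
x = A @ alpha                     # synthetic test spectrum x_i = A alpha_i
```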
Illustration: Pixel-Wise Sparse Model
[Figure: a test spectrum $x_i$ (target pixel) plotted over 150 bands together with the background and target sub-dictionaries of the spectral dictionary $A$; only a few entries of the recovered coefficient vector are nonzero, so $x_i = A^b \alpha^b_i + A^t \alpha^t_i$.]
Sparse Recovery
• The sparse coefficient is recovered by
  $\hat{\alpha}_i = \arg\min \|\alpha_i\|_0 \ \text{subject to} \ A\alpha_i = x_i$
• For empirical data:
  $\hat{\alpha}_i = \arg\min \|\alpha_i\|_0 \ \text{subject to} \ \|A\alpha_i - x_i\|_2 \le \sigma$
  or
  $\hat{\alpha}_i = \arg\min \|A\alpha_i - x_i\|_2 \ \text{subject to} \ \|\alpha_i\|_0 \le K_0$
• NP-hard problem
  – Greedy algorithms: MP, OMP, SP, CoSaMP, LARS (an illustrative OMP sketch follows below)
  – Convex relaxation: $\hat{\alpha}_i = \arg\min \|\alpha_i\|_1 \ \text{subject to} \ A\alpha_i = x_i$, solved by Iterative Thresholding, Primal-Dual Interior-Point, Gradient Projection, Proximal Gradient, or Augmented Lagrange Multiplier methods
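The greedy route can be illustrated with a minimal OMP sketch for the $\|\alpha\|_0 \le K$ form of the problem. This is an illustrative implementation of the generic algorithm, assuming column-normalized $A$, not the solver used for the reported results:

```python
# Illustrative Orthogonal Matching Pursuit (OMP) for
#   min ||A @ alpha - x||_2  subject to  ||alpha||_0 <= K,
# assuming column-normalized A; not the exact solver behind the reported results.
import numpy as np

def omp(A, x, K):
    """Greedy recovery of a K-sparse coefficient vector."""
    residual = x.copy()
    support = []
    alpha = np.zeros(A.shape[1])
    coeffs = np.array([])
    for _ in range(K):
        # Select the atom most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares fit on the selected atoms, then update the residual.
        coeffs, *_ = np.linalg.lstsq(A[:, support], x, rcond=None)
        residual = x - A[:, support] @ coeffs
    alpha[support] = coeffs
    return alpha
```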
Classification Based on Residuals
• Recover the sparse coefficient $\hat{\alpha}_i = \begin{bmatrix} \hat{\alpha}^b_i \\ \hat{\alpha}^t_i \end{bmatrix}$
• Compute the residuals (approximation errors w.r.t. the two sub-dictionaries):
  $r^b(x_i) = \|x_i - A^b \hat{\alpha}^b_i\|_2$ and $r^t(x_i) = \|x_i - A^t \hat{\alpha}^t_i\|_2$
• The class of the test pixel $x_i$ is decided by comparing the residuals (see the sketch below)
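A hedged sketch of this residual-based decision rule, reusing the omp() sketch above; the split point between the two coefficient blocks is simply the number of background atoms:

```python
# Hedged sketch of the residual-based decision rule, reusing the omp()
# sketch above; the split point between the two coefficient blocks is the
# number of background atoms A_b.shape[1].
import numpy as np

def classify_pixel(A_b, A_t, x, K=5):
    A = np.hstack([A_b, A_t])
    alpha = omp(A, x, K)
    alpha_b, alpha_t = alpha[:A_b.shape[1]], alpha[A_b.shape[1]:]
    r_b = np.linalg.norm(x - A_b @ alpha_b)   # background residual
    r_t = np.linalg.norm(x - A_t @ alpha_t)   # target residual
    return "target" if r_t < r_b else "background"
```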
Example: Reconstruction
[Figure: recovered sparse coefficients $\hat{\alpha}^b_i$ and $\hat{\alpha}^t_i$, and the original test spectrum compared with its reconstructions $A^b \hat{\alpha}^b_i$ from the background dictionary and $A^t \hat{\alpha}^t_i$ from the target dictionary.]
Joint Sparsity Model (Joint Structural Sparsity Prior)
• Use of contextual information
  – Neighboring pixels: similar spectral characteristics
  – Approximated by the same few training samples, weighted differently
• Consider T pixels in a small neighborhood (see the sketch below for assembling them):
  $X = [x_1 \; x_2 \; \cdots \; x_T] = [A\alpha_1 \; A\alpha_2 \; \cdots \; A\alpha_T] = A[\alpha_1 \; \alpha_2 \; \cdots \; \alpha_T] = AS$
  – $\alpha_i$'s: sparse vectors with the same support but different magnitudes
  – $S$: sparse matrix with only a few nonzero rows
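A small hypothetical helper shows how the T-pixel neighborhood is gathered into the data matrix $X$ whose columns share a common support; it assumes the HSI cube is stored as a (rows, cols, bands) NumPy array and that (r, c) is an interior pixel:

```python
# Hypothetical helper: gather the T = 3x3 neighborhood of pixel (r, c)
# into the data matrix X (bands x T). Assumes the HSI cube is stored as a
# (rows, cols, bands) NumPy array and (r, c) is an interior pixel.
import numpy as np

def neighborhood_matrix(cube, r, c, half=1):
    patch = cube[r - half:r + half + 1, c - half:c + half + 1, :]  # 3 x 3 x bands
    return patch.reshape(-1, patch.shape[-1]).T                    # bands x T
```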
Illustration: T = 3x3 Neighborhood
[Figure: the T = 9 neighboring spectra form the data matrix $X$, which is represented over the spectral dictionary $A$ by a row-sparse coefficient matrix $S$.]
Joint Sparse Recovery
• $S$ is recovered by
  $\hat{S} = \arg\min \|S\|_{\mathrm{row},0} \ \text{subject to} \ AS = X$
• Solved by greedy algorithms (Simultaneous OMP (SOMP), Simultaneous SP (SSP)) or by convex optimization to find the same active set:
  $\hat{S} = \arg\min \|S\|_{1,2} \ \text{subject to} \ AS = X$
• Decision obtained by comparing total residuals (a minimal SOMP sketch follows below)
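An illustrative SOMP sketch, analogous to the OMP sketch earlier: at each iteration the atom with the largest total correlation across all T columns is added, so every pixel in the neighborhood ends up with the same support. Again, this is a sketch rather than the exact implementation behind the results:

```python
# Illustrative Simultaneous OMP (SOMP): at each step the row of A.T @ residual
# with the largest norm (total correlation over all T columns) is selected,
# so every pixel in the neighborhood shares the same support.
import numpy as np

def somp(A, X, K):
    """Recover a row-sparse coefficient matrix S with at most K nonzero rows."""
    residual = X.copy()
    support = []
    S = np.zeros((A.shape[1], X.shape[1]))
    coeffs = np.zeros((0, X.shape[1]))
    for _ in range(K):
        # Atom with the largest total correlation across all measurements.
        idx = int(np.argmax(np.linalg.norm(A.T @ residual, axis=1)))
        if idx not in support:
            support.append(idx)
        # Joint least-squares fit on the selected atoms, then update the residual.
        coeffs, *_ = np.linalg.lstsq(A[:, support], X, rcond=None)
        residual = X - A[:, support] @ coeffs
    S[support, :] = coeffs
    return S
```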
Comparison of the Single-Pixel Sparsity Model vs. the Joint Sparsity Recovery Model (k = 5 active atoms)
• Input a single background pixel $x$:
  $\hat{\alpha} = \arg\min \|\alpha\|_0 \ \text{subject to} \ A\alpha = x$
• Input nine neighboring background pixels $X$:
  $\hat{S} = \arg\min \|S\|_{\mathrm{row},0} \ \text{subject to} \ AS = X$
Results on HYDICE FR-I
• Original image (averaged over 150 bands) and the proposed detector output
Results on FR-I: ROC Curves
Extension to Multiple Classes
• AVIRIS HSI data set with 16 classes, 220 bands, 20-meter pixel resolution
Extension to Multiple Classes
Multi-View Target Classification
• In ATR applications we can have multiple observations of the same physical target, from different platforms or from the same platform at different viewing angles (aspects).
  $\hat{\alpha}_i = \arg\min \|\alpha_i\|_0 \ \text{subject to} \ A\alpha_i = y_i$  (single measurement)
  $\hat{S} = \arg\min \|S\|_{\mathrm{row},0} \ \text{subject to} \ AS = Y$  (multiple measurements)
  (a hedged code sketch of the multi-measurement decision follows below)
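In code, the multi-measurement case is the same row-sparse recovery as before with the M views as columns of $Y$; a hedged sketch of the class decision by total residual is below, where class_slices is a hypothetical mapping from class label to the column range of that class's sub-dictionary in $A$:

```python
# Hedged sketch of multi-view classification: the M views are the columns of Y,
# the same somp() sketch applies, and the class with the smallest total residual
# wins. `class_slices` is a hypothetical mapping from class label to the column
# range of its sub-dictionary in A.
import numpy as np

def classify_views(A, Y, class_slices, K=5):
    S = somp(A, Y, K)
    residuals = {c: np.linalg.norm(Y - A[:, sl] @ S[sl, :], 'fro')
                 for c, sl in class_slices.items()}
    return min(residuals, key=residuals.get)
```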
Experimental Results on Multi-View Target Classification
• The MSTAR SAR database consists of 10 military targets imaged at roughly 1° to 3° azimuth intervals (0° to 360°) at two depression angles, 15° and 17°. Data from 17° is used for training (dictionary design); data from 15° is used for testing.
Experimental Results on Multi-View Target Classification
• Three-class (BMP2, BTR70, T72) target classification, C = 3, with multiple views M = 3. Features are incoherent random projections with dimension ranging from d = 128 to 1024 (an assumed projection sketch follows below).
• Single view: $\hat{\alpha}_i = \arg\min \|\alpha_i\|_0 \ \text{subject to} \ A\alpha_i = x_i$
• Stacked views: $\hat{\alpha} = \arg\min \|\alpha\|_0 \ \text{subject to} \ A\alpha = x$, with the M views concatenated as $x = [x_1^T \cdots x_M^T]^T$
• Joint sparsity: $\hat{S} = \arg\min \|S\|_{\mathrm{row},0} \ \text{subject to} \ AS = X$; note $S = [\alpha_1 \; \cdots \; \alpha_M]$
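The exact projection matrix used in these experiments is not specified here, so the following is only an assumed variant of incoherent random-projection features: each vectorized target chip is multiplied by a random Gaussian matrix to reduce it to d dimensions.

```python
# Assumed variant of the random-projection features: each vectorized target chip
# is projected to d dimensions with a random Gaussian matrix. The exact projection
# used in the experiments is not specified here.
import numpy as np

def random_projection_features(chips, d=128, seed=0):
    """chips: (num_chips, num_pixels) array; returns (num_chips, d) features."""
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((chips.shape[1], d)) / np.sqrt(d)
    return chips @ P
```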
Experimental Results on Number of Views and Angle Size
• Effect of the number of views M
• Effect of the angle size between the views
Experimental Results on Multi-View Target Classification
• 10-class classification results using M = 3 views with a dictionary of size N = 2747, tested on the 15° depression-angle data
Multi-Pose Face Recognition • Scenarios where we have multiple poses of the same face as input to the classifier. • UMIST database consists of 564 images of 20 individuals with a range of poses. • Randomly select 10 poses for each individual to construct the dictionary.
Conclusions
• Formulated target and object recognition as a joint-sparsity underdetermined regression problem.
• Investigated the effect of single vs. multiple measurements.
• Incorporated the joint structured sparsity prior into the regularization part of the optimization.
• Evaluated the classification performance of multiple measurements on several databases.
Thank You