  1. Decomposing a Third-Order Tensor in Rank-(L,L,1) Terms by Means of Simultaneous Matrix Diagonalization
     Dimitri Nion & Lieven De Lathauwer, K.U. Leuven, Kortrijk campus, Belgium
     E-mails: Dimitri.Nion@kuleuven-kortrijk.be , Lieven.DeLathauwer@kuleuven-kortrijk.be
     2009 SIAM Conference on Applied Linear Algebra, Session MS33 "Computational Methods for Tensors", Monterey, USA, October 26-29, 2009

  2. Roadmap
     I.   Introduction: tensor decompositions (PARAFAC, Tucker, Block-Component Decompositions)
     II.  Block-Component Decomposition in Rank-(L,L,1) Terms: definition of the BCD-(L,L,1), uniqueness bound, ALS algorithm
     III. Reformulation of the BCD-(L,L,1) in terms of simultaneous matrix diagonalization: new algorithm, relaxed uniqueness bound
     IV.  An application of the BCD-(L,L,1): blind source separation in telecommunications
     V.   Conclusion and future research

  3. Roadmap (repeat of slide 2)

  4. Tucker/HOSVD and PARAFAC
     Tucker [Tucker, 1966] / HOSVD [De Lathauwer, 2000]: an (I x J x K) tensor T is a core tensor S multiplied along each mode by factor matrices U, V, W:
         T = S ×_1 U ×_2 V ×_3 W
     PARAFAC [Harshman, 1970]: the special case in which the core tensor H is diagonal (h_ijk = 1 if i = j = k, else h_ijk = 0), i.e., a sum of R rank-1 tensors:
         T = a_1 ∘ b_1 ∘ c_1 + ... + a_R ∘ b_R ∘ c_R
     where a_r, b_r, c_r are the r-th columns of the factor matrices A (I x R), B (J x R), C (K x R).
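The PARAFAC sum of rank-1 terms is easy to check numerically; here is a minimal NumPy sketch (the sizes and random factors are arbitrary, not from the slides):

```python
import numpy as np

# Hypothetical sizes; A, B, C are random PARAFAC factor matrices.
I, J, K, R = 4, 5, 6, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Sum of R rank-1 tensors: T = sum_r a_r outer b_r outer c_r
T = np.zeros((I, J, K))
for r in range(R):
    T += np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r])

# Equivalent one-shot contraction over the shared rank index
T2 = np.einsum('ir,jr,kr->ijk', A, B, C)
assert np.allclose(T, T2)
```

The `einsum` form makes explicit that PARAFAC is a Tucker product with a (hyper)diagonal core.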

  5. From PARAFAC/HOSVD to Block-Component Decompositions (BCD) [De Lathauwer and Nion, SIMAX 2008]
     BCD in rank-(L_r, L_r, 1) terms: each term is the outer product of a rank-L_r matrix with a vector:
         X = (A_1 B_1^T) ∘ c_1 + ... + (A_R B_R^T) ∘ c_R,   with A_r (I x L_r), B_r (J x L_r), c_r (K x 1)
     BCD in rank-(L_r, M_r, .) terms: each term is a core tensor C_r (L_r x M_r x K) multiplied by factor matrices in the first two modes:
         X = C_1 ×_1 A_1 ×_2 B_1 + ... + C_R ×_1 A_R ×_2 B_R
     BCD in rank-(L_r, M_r, N_r) terms: each term is a full Tucker product of a small core with three factor matrices:
         X = S_1 ×_1 A_1 ×_2 B_1 ×_3 C_1 + ... + S_R ×_1 A_R ×_2 B_R ×_3 C_R,   with S_r of size (L_r x M_r x N_r)
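The rank-(L_r,L_r,1) case can be constructed directly; a NumPy sketch with arbitrary small dimensions (equal block sizes L_r = L for simplicity):

```python
import numpy as np

I, J, K, R, L = 8, 9, 5, 2, 2
rng = np.random.default_rng(1)
X = np.zeros((I, J, K))
for r in range(R):
    Ar = rng.standard_normal((I, L))   # I x L, generically rank L
    Br = rng.standard_normal((J, L))   # J x L
    cr = rng.standard_normal(K)        # K x 1
    # Term r: outer product of the rank-L matrix A_r B_r^T with the vector c_r
    X += np.einsum('ij,k->ijk', Ar @ Br.T, cr)

# Every frontal slice X[:, :, k] is a linear combination of the R
# rank-L matrices A_r B_r^T, so its rank is at most R * L.
assert np.linalg.matrix_rank(X[:, :, 0]) <= R * L
```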

  6. Roadmap (repeat of slide 2)

  7. The BCD-(L,L,1) as a generalization of PARAFAC [De Lathauwer, de Baynast, 2003]
     BCD-(L,L,1):  X = (A_1 B_1^T) ∘ c_1 + ... + (A_R B_R^T) ∘ c_R
     Generalization of PARAFAC: BCD-(1,1,1) = PARAFAC.
     Unknown matrices: A = [A_1 ... A_R] (I x LR), B = [B_1 ... B_R] (J x LR), C = [c_1 ... c_R] (K x R).
     The BCD-(L,L,1) is said to be essentially unique if the only remaining ambiguities are:
       - an arbitrary permutation of the blocks in A and B and of the corresponding columns of C;
       - rotational freedom within each block (block-wise subspace estimation), plus a scaling ambiguity on the columns of C.
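The within-block rotational freedom is worth seeing concretely: replacing A_r by A_r F and B_r by B_r F^{-T}, for any invertible (L x L) matrix F, leaves the term A_r B_r^T unchanged. A minimal sketch (arbitrary sizes):

```python
import numpy as np

rng = np.random.default_rng(2)
I, J, L = 5, 6, 2
Ar = rng.standard_normal((I, L))
Br = rng.standard_normal((J, L))
F = rng.standard_normal((L, L))          # any invertible L x L matrix

# (A_r F)(B_r F^{-T})^T = A_r F F^{-1} B_r^T = A_r B_r^T
term1 = Ar @ Br.T
term2 = (Ar @ F) @ (Br @ np.linalg.inv(F).T).T
assert np.allclose(term1, term2)
```

This is why only the column space of each A_r (and B_r) can be identified, not the matrices themselves.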

  8. The BCD-(L,L,1) as a constrained Tucker model
     The BCD-(L,L,1) can be seen as a particular case of the Tucker model in which the core tensor is "block-diagonal", with (L x L) blocks on its diagonal:
         X = G ×_1 A ×_2 B ×_3 C,   with A = [A_1 ... A_R] (I x LR), B = [B_1 ... B_R] (J x LR), C (K x R),
     where the (LR x LR x R) core G carries the r-th (L x L) diagonal block in its r-th frontal slice.
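The block-diagonal-core view can be verified numerically. A sketch assuming identity (L x L) blocks in the core (absorbing any block content into A_r, which the rotational freedom permits); all sizes are arbitrary:

```python
import numpy as np

I, J, K, R, L = 6, 7, 5, 2, 2
rng = np.random.default_rng(3)
A = rng.standard_normal((I, L * R))      # [A_1 ... A_R]
B = rng.standard_normal((J, L * R))      # [B_1 ... B_R]
C = rng.standard_normal((K, R))          # [c_1 ... c_R]

# Block-diagonal core: the r-th frontal slice holds an L x L identity block
G = np.zeros((L * R, L * R, R))
for r in range(R):
    G[r*L:(r+1)*L, r*L:(r+1)*L, r] = np.eye(L)

# Tucker product  G x1 A x2 B x3 C
X_tucker = np.einsum('pqs,ip,jq,ks->ijk', G, A, B, C)

# Direct BCD-(L,L,1) sum of terms (A_r B_r^T) outer c_r
X_bcd = sum(np.einsum('ij,k->ijk',
                      A[:, r*L:(r+1)*L] @ B[:, r*L:(r+1)*L].T, C[:, r])
            for r in range(R))
assert np.allclose(X_tucker, X_bcd)
```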

  9. BCD-(L,L,1): existing results on algorithms and uniqueness
     Several algorithms commonly used to compute PARAFAC have been adapted to the BCD-(L,L,1):
       Example 1: ALS algorithm (alternate between least-squares updates of the unknowns A, B and C).
       Example 2: ALS with Enhanced Line Search to speed up convergence.
       Example 3: Gauss-Newton based algorithms (Levenberg-Marquardt).
     First result on essential uniqueness, in the generic sense [De Lathauwer, 2006]:
         min(floor(I/L), R) + min(floor(J/L), R) + min(K, R) >= 2(R + 1)   and   LR <= IJ     (1)
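The generic uniqueness bound [De Lathauwer, 2006] is easy to check numerically. A sketch assuming the reading min(floor(I/L), R) + min(floor(J/L), R) + min(K, R) >= 2(R+1) and LR <= IJ; the function name and the example dimensions are my own:

```python
def bcd_lll1_generic_uniqueness(I, J, K, L, R):
    """Sufficient condition (generic sense) for essential uniqueness of the BCD-(L,L,1)."""
    return (min(I // L, R) + min(J // L, R) + min(K, R) >= 2 * (R + 1)
            and L * R <= I * J)

# I = J = 12, K = 10, L = 2, R = 4:  4 + 4 + 4 = 12 >= 10 and 8 <= 144 -> unique
assert bcd_lll1_generic_uniqueness(12, 12, 10, 2, 4)
# I = J = 4, K = 2, L = 2, R = 4:  2 + 2 + 2 = 6 < 10 -> bound not satisfied
assert not bcd_lll1_generic_uniqueness(4, 4, 2, 2, 4)
```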

  10. Starting point of this work
     In 2005, De Lathauwer showed that, under certain assumptions on the dimensions, PARAFAC can be reformulated as a simultaneous diagonalization (SD) problem. This yields:
       - a very fast and accurate algorithm to compute PARAFAC;
       - a new, relaxed uniqueness bound.
     Is it possible to generalize these results to the BCD-(L,L,1)? If so, does it also yield a fast algorithm and a uniqueness bound more relaxed than the one on the previous slide?
     The answer is YES.

  11. Roadmap (repeat of slide 2)

  12. Reformulation of the BCD-(L,L,1) in terms of SD: overview (1)
     Start from the BCD-(L,L,1), writing X_r = A_r B_r^T (an I x J matrix of rank L):
         X = sum_{r=1..R} X_r ∘ c_r
     Assumption: R <= min(IJ, K), i.e., K has to be a sufficiently long dimension.
     Build Y, the (JI x K) matrix unfolding of X. In matrix format, the BCD-(L,L,1) reads:
         Y = [vec(X_1) ... vec(X_R)] . C^T = Xtilde . C^T     (1)
     Compute the SVD of Y (generically of rank R):
         Y = U . Sigma . V^H = E . V^H     (2)
     Then there exists an invertible W (R x R) such that
         Xtilde = E . W   and   C^T = W^{-1} . V^H
     Goal: find W, i.e., find the linear combinations of the columns of E that yield vectorized rank-L matrices.
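The unfolding-plus-SVD step can be sketched in NumPy (dimensions arbitrary; `vec` taken column-wise):

```python
import numpy as np

I, J, K, R, L = 6, 7, 8, 2, 2
rng = np.random.default_rng(4)
# Rank-L matrices X_r = A_r B_r^T and the factor C
Xr = [rng.standard_normal((I, L)) @ rng.standard_normal((L, J)) for _ in range(R)]
C = rng.standard_normal((K, R))

# Unfolding: Y = [vec(X_1) ... vec(X_R)] . C^T = Xtilde . C^T
Xtilde = np.column_stack([X.reshape(-1, order='F') for X in Xr])  # column-wise vec
Y = Xtilde @ C.T
assert np.linalg.matrix_rank(Y) == R        # generically rank R

# Economy SVD: Y = U Sigma V^H; E = U_R Sigma_R spans the column space of Xtilde
U, S, Vh = np.linalg.svd(Y, full_matrices=False)
E = U[:, :R] * S[:R]

# Hence Xtilde = E W for some invertible R x R matrix W
W, *_ = np.linalg.lstsq(E, Xtilde, rcond=None)
assert np.allclose(E @ W, Xtilde)
```

The SD algorithm's job is to find this W from E alone, using the rank-L structure of the columns.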

  13. Reformulation of the BCD-(L,L,1) in terms of SD: overview (2)
     Note 1: once W is found, the unknown matrices A, B, C of the BCD-(L,L,1) follow:
         Xtilde = E . W,   C* = V . W^{-T}
         Xtilde = [vec(X_1) ... vec(X_R)] = [vec(A_1 B_1^T) ... vec(A_R B_R^T)]
     Matricize each column of Xtilde and estimate A_r and B_r from its best rank-L approximation.
     Note 2: for PARAFAC (i.e., L = 1), we have
         Xtilde = [vec(a_1 b_1^T) ... vec(a_R b_R^T)] = [b_1 ⊗ a_1 ... b_R ⊗ a_R] = B ⊙ A,
     where ⊙ is the Khatri-Rao product. In that case, Xtilde = E . W is a Khatri-Rao structure recovery problem, and can be solved by simultaneous diagonalization [De Lathauwer, 2005].
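The "matricize and take the best rank-L approximation" step amounts to a truncated SVD of each reshaped column. A sketch for one column (sizes arbitrary):

```python
import numpy as np

I, J, L = 6, 7, 2
rng = np.random.default_rng(5)
Ar = rng.standard_normal((I, L))
Br = rng.standard_normal((J, L))
x = (Ar @ Br.T).reshape(-1, order='F')   # vec(A_r B_r^T): one column of Xtilde

# Matricize, then best rank-L approximation via truncated SVD
Xmat = x.reshape(I, J, order='F')
U, S, Vh = np.linalg.svd(Xmat)
Ar_hat = U[:, :L] * S[:L]                # estimate of A_r (up to an L x L rotation)
Br_hat = Vh[:L, :].T                     # estimate of B_r

# The product A_r B_r^T is recovered exactly since Xmat has rank L
assert np.allclose(Ar_hat @ Br_hat.T, Ar @ Br.T)
```

Only the product A_r B_r^T (equivalently, the column spaces) is recoverable, consistent with the rotational ambiguity on slide 7.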

  14. Reformulation of the BCD-(L,L,1) in terms of SD: overview (3)
     Remark on typical matrix factorization problems in signal processing.
     Problem formulation: given only an (M x N) rank-R observed matrix X, find the (M x R) and (R x N) matrices H and S such that X = H . S.
     There is an infinite number of solutions, since X = (H F)(F^{-1} S) for any invertible F, so we need extra constraints. Examples:
       - ICA (Independent Component Analysis): find H that makes the R source signals in S as statistically independent as possible. Blind source separation.
       - FIR filter estimation: H holds the impulse response of an FIR filter, and S is Toeplitz. Blind channel estimation in telecommunications.
       - Source localization: H is Vandermonde and holds the individual responses of the M antennas to the R source signals, each signal impinging with a Direction Of Arrival (DOA).
       - Non-negative matrix factorization.
       - Finite-alphabet projection: S holds numerical symbols.
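The non-uniqueness of an unconstrained factorization is a two-line check (arbitrary sizes):

```python
import numpy as np

M, N, R = 5, 6, 2
rng = np.random.default_rng(6)
H = rng.standard_normal((M, R))
S = rng.standard_normal((R, N))
F = rng.standard_normal((R, R))          # any invertible R x R matrix

# (H F)(F^{-1} S) reproduces X exactly, with entirely different factors
X = H @ S
assert np.allclose((H @ F) @ (np.linalg.inv(F) @ S), X)
```

Hence the structural constraints listed above (independence, Toeplitz, Vandermonde, non-negativity, finite alphabet) are what make each factorization identifiable.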

  15. Reformulation of the BCD-(L,L,1) in terms of SD: overview (4)
     Write Xtilde = E . W column-wise: [vec(X_1) ... vec(X_R)] = [vec(E_1) ... vec(E_R)] . W, i.e.,
         X_r = w_{1r} E_1 + ... + w_{Rr} E_R,   for r = 1, ..., R.
     How to find the coefficients of the linear combinations of the E_r that yield rank-L matrices?
     Tool: a mapping phi_L for rank-L detection. Let X_r in C^(I x J); then phi_L(X_r, X_r, ..., X_r) = 0 if and only if X_r is at most rank L.
     After several algebraic manipulations, one can show that W is the solution of an SD problem:
         Q_1 = W . D_1 . W^T
         Q_2 = W . D_2 . W^T
         ...
         Q_R = W . D_R . W^T
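To see what such a simultaneous congruence problem looks like, here is a toy sketch, not the algorithm of the talk: given exact Q_r = W D_r W^T, a single generalized eigendecomposition of Q_1 Q_2^{-1} = W (D_1 D_2^{-1}) W^{-1} already recovers W up to column permutation and scaling, which the recovered W then diagonalizes all Q_r simultaneously:

```python
import numpy as np

R = 3
rng = np.random.default_rng(7)
W = rng.standard_normal((R, R))
Ds = [np.diag(rng.uniform(1.0, 2.0, R)) for _ in range(R)]
Qs = [W @ D @ W.T for D in Ds]           # exact simultaneous congruence targets

# Eigenvectors of Q_1 Q_2^{-1} = W (D_1 D_2^{-1}) W^{-1} give the columns of W
# up to permutation and scaling (generically, the eigenvalues are distinct).
_, W_hat = np.linalg.eig(Qs[0] @ np.linalg.inv(Qs[1]))
W_hat = np.real(W_hat)

# Check: W_hat^{-1} Q_r W_hat^{-T} is diagonal for every r
Winv = np.linalg.inv(W_hat)
for Q in Qs:
    Dr = Winv @ Q @ Winv.T
    assert np.allclose(Dr - np.diag(np.diag(Dr)), 0, atol=1e-8)
```

With noisy data, one would instead fit W to all R equations jointly (the simultaneous diagonalization solved in [De Lathauwer, 2005]), rather than use only two of them.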
