
Efficient Tensor Decomposition and Its Application (Naoki KAWASHIMA)


  1. TNSAA2018-2019 (Kobe, Dec. 3-6, 2018). Efficient Tensor Decomposition and Its Application. Naoki KAWASHIMA (ISSP), Dec. 3, 2018

  2. Occam's Razor: "Pluralitas non est ponenda sine necessitate." (We should not make more assumptions than necessary.) Be stingy with model parameters! (Portrait: William of Occam, ca. 1285-1349, from Wikipedia.)

  3. Tensor Network State: |Ψ⟩ = Σ_{S1,...,SN} Cont(T1 T2 ⋯ TN) |S1 S2 ⋯ SN⟩, where each tensor Ti carries one physical index Si and virtual indices that are contracted with neighboring tensors. The state is parametrized by only O(N) tensors. Number of parameters: traditional model O(1) << TN model O(N) << exact model O(e^N).
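To make the parameter counting concrete, here is a minimal sketch (my own illustration, not code from the talk) using a matrix-product state, the simplest tensor network: O(N) tensors parametrize all 2^N amplitudes.

```python
import numpy as np

N, d, D = 10, 2, 4          # sites, physical dim, virtual (bond) dim
rng = np.random.default_rng(0)
# One (D_left, d, D_right) tensor per site; boundary bonds have dimension 1.
dims = [(1 if k == 0 else D, d, 1 if k == N - 1 else D) for k in range(N)]
tensors = [rng.standard_normal(s) for s in dims]

def amplitude(spins):
    """Contract the network for one configuration (S1, ..., SN)."""
    m = tensors[0][:, spins[0], :]
    for k in range(1, N):
        m = m @ tensors[k][:, spins[k], :]   # multiply the D x D slices
    return m[0, 0]

n_params = sum(t.size for t in tensors)      # O(N d D^2), not O(2^N)
print(n_params, 2 ** N)                      # 272 parameters vs 1024 amplitudes
```

Here 272 numbers encode all 1024 amplitudes; the gap widens exponentially with N.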

  4. PEPS (or TPS): Y. Hieida, K. Okunishi and Y. Akutsu (1999); T. Nishino, et al. (2001); F. Verstraete and J. I. Cirac (2004). The majority of low-temperature condensed matter physics problems satisfy the "area law", and PEPS satisfies the area law by definition.

  5. Ising Model is a TN

  6. Real Space RG with TN Gu, Levin, Wen: PRB 78 (2008); Schuch, et al: PRL 98 (2007)

  7. Occam's Razor in TRG --- SVD. Singular Value Decomposition (SVD) with truncation: the four-leg tensor, viewed as a matrix, is decomposed as V T W (V, W unitary, T the diagonal matrix of singular values) and truncated to the χ largest singular values, V̂ T̂ Ŵ = U1 U2, which splits it into two three-leg tensors U1 and U2.
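The truncation step can be sketched as follows (a minimal NumPy illustration of my own, not the talk's code); in TRG one would first reshape the four-leg tensor into a matrix by grouping two legs on each side.

```python
import numpy as np

def truncated_svd(M, chi):
    """Keep the chi largest singular values of M = U diag(s) Vt."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    U, s, Vt = U[:, :chi], s[:chi], Vt[:chi, :]
    # Absorb sqrt of the singular values into each factor (one common convention).
    T1 = U * np.sqrt(s)
    T2 = np.sqrt(s)[:, None] * Vt
    return T1, T2

rng = np.random.default_rng(1)
# An exactly rank-8 matrix: truncation at chi = 8 loses nothing.
M = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
T1, T2 = truncated_svd(M, chi=8)
err = np.linalg.norm(M - T1 @ T2) / np.linalg.norm(M)
print(err)   # essentially zero here; for generic M the error is set by chi
```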

  8. How good is it? See Morita's talk on Wednesday. 2D Potts model (L <= 1,048,576), S. Morita and NK: arXiv:1806.10275. HOTRG calculation with χ ~ 50; finite-size scaling; polynomial-time calculation of Tc; the 1st-order nature of the 5-state Potts transition confirmed.

  9. Tensor Network Calculation of an ab-initio model for Na2IrO3 (T. Okubo, U. Tokyo). The experimental observation (zigzag state) is reproduced.

  10. S=1 Bilinear-Biquadratic Model, Hyunyong LEE (ISSP). H.-Y. Lee and NK: PRB 97, 205123 (2018). (Plots: E and dE/dΦ.)

  11. Improvement of TRG. Evenbly and Vidal: Phys. Rev. Lett. 115, 180405 (2015). Optimization condition for u, v and w; the RG transformation can get rid of local entanglement and converges faster when D is increased.

  12. What has been improved? --- Corner Double Line (CDL) Tensor --- It also appears as the fixed point tensor of the TRG procedure in the disordered phase.
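As an illustration (my own construction, not the speaker's code), a CDL tensor can be assembled from four corner matrices: each fat leg is a pair of thin indices shared with the two adjacent corners, so any bipartition of the legs crosses only thin lines and the rank stays small.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2                                    # thin-index dimension; each leg has dim n*n
C = [rng.standard_normal((n, n)) for _ in range(4)]

# T[(i1 i2),(j1 j2),(k1 k2),(l1 l2)] = C0[i2,j1] C1[j2,k1] C2[k2,l1] C3[l2,i1]
T8 = np.einsum('ab,cd,ef,gh->habcdefg', C[0], C[1], C[2], C[3])
T = T8.reshape(n * n, n * n, n * n, n * n)

# Cutting the network into legs (i,j) vs (k,l) severs only two thin lines,
# so the matrix rank is at most n*n, far below the full dimension (n*n)^2.
M = T.reshape((n * n) ** 2, (n * n) ** 2)
rank = np.linalg.matrix_rank(M)
print(rank)   # n*n = 4, while the matrix is 16 x 16
```

This low-rank structure is exactly why TRG's SVD truncation preserves a CDL tensor instead of eliminating it.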

  13. The fate of a local entanglement loop: suppose each tensor is a CDL.

  14. The fate of a local entanglement loop: focus on a plaquette.

  15. The fate of a local entanglement loop: the 1st SVD. The entanglement loop is deformed.

  16. The fate of a local entanglement loop: the network after contraction of the small squares. The green loop is still there.

  17. The fate of a local entanglement loop: the 2nd SVD.

  18. The fate of a local entanglement loop: after the 2nd SVD. The green loop still survives. Some of the entanglement loops have been removed, but at each generation the influence of the original entanglement loop remains, and the expressive capacity of the network is wasted.

  19. Removal of Entanglement Loops. Evenbly and Vidal: Phys. Rev. Lett. 115, 180405 (2015). By pinching the "information path" (it is essential that this line is thin), we can split the remaining loop and remove it at the next contraction.

  20. Another example of loops: Tensor Ring Decomposition (TRD)

  21. TRD in Informatics --- Images. Zhao, Cichocki et al.: arXiv:1606.05535. Columbia Object Image Libraries (COIL)-100 dataset: 100 object image sets; 1 object image set = 72 images; 1 image = 128 x 128 dots; 1 dot = 3 colors. The data tensor T_{xyci} has indices x = 1,...,128 (x-coordinate), y = 1,...,128 (y-coordinate), c = 1,...,3 (color), and i = 1,...,7200 (image ID), carried by the ring cores Z1, Z2, Z3, Z4 respectively.

  22. Application to Classification. Zhao, Cichocki et al.: arXiv:1606.05535. COIL-100 2D image classification task (128 x 128 x 3 x 7200 data tensor). A KNN classifier (K=1) is applied to the image-specifier core (Z4). (Plots: average and maximum score (%) vs. bond dimension, for large and small training sets.) The "ring" decomposition shows better performance than the "open chain" ... but can we do that easily?

  23. Alternating Least Squares (ALS). (1) Start from random initial tensors Z_i. (2) For i = 1, 2, 3, 4, update Z_i by minimizing ||T - Z1 Z2 Z3 Z4||^2 over Z_i. (3) Repeat until the error converges. However, ALS is trapped by local loops.
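The three steps above can be sketched for a 4-core tensor ring as follows (a minimal NumPy implementation of my own; the names tr_als and tr_contract are mine, not the talk's).

```python
import numpy as np

def tr_contract(cores):
    """T[i,j,k,l] = sum_{abcd} Z1[a,i,b] Z2[b,j,c] Z3[c,k,d] Z4[d,l,a]."""
    return np.einsum('aib,bjc,ckd,dla->ijkl', *cores)

def tr_als(T, r, sweeps=100, seed=0):
    rng = np.random.default_rng(seed)
    cores = [rng.standard_normal((r, n, r)) for n in T.shape]  # (1) random init
    for _ in range(sweeps):                                    # (3) repeat
        for k in range(4):                                     # (2) update each Z_k
            perm = [(k + t) % 4 for t in range(4)]
            Tp = np.transpose(T, perm)             # leg k moved to the front
            Za, Zb, Zc = (cores[p] for p in perm[1:])
            # Environment of core k: contract the other three cores.
            E = np.einsum('bjc,ckd,dla->abjkl', Za, Zb, Zc)
            Emat = E.reshape(r * r, -1)            # rows indexed by the two bonds of Z_k
            Tmat = Tp.reshape(Tp.shape[0], -1)
            # Least-squares solve for the unfolded core.
            X, *_ = np.linalg.lstsq(Emat.T, Tmat.T, rcond=None)
            cores[k] = X.reshape(r, r, -1).transpose(0, 2, 1)  # back to (r, n_k, r)
    return cores

# Fit a tensor that is exactly a rank-2 ring; each sweep cannot increase the error.
rng = np.random.default_rng(1)
target = tr_contract([rng.standard_normal((2, 4, 2)) for _ in range(4)])
cores = tr_als(target, r=2)
err = np.linalg.norm(target - tr_contract(cores)) / np.linalg.norm(target)
print(err)
```

Each update exactly minimizes the residual over one core, so the error is monotonically non-increasing, yet, as the slide notes, the iteration can still stall in local minima.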

  24. ALS on CDL. H.-Y. Lee and N.K.: arXiv:1807.03862. sALS: ALS with the initial condition obtained by sequential (open-chain) SVD. ALS on a CDL tensor is either unstable or stuck at a local minimum, at least partially due to the local entanglement loops.

  25. Redundant entanglement loops cause various problems (reduction of expressive power, and obstacles in optimization). Is there any direct method for removing them?

  26. TRD of CDL. H.-Y. Lee and N.K.: arXiv:1807.03862. If we knew U, V, W and x, y, z explicitly, we could find Z1, Z2, Z3 of the TRD very easily ... but how do we find them?

  27. Ring Decomposition by Index Splitting. H.-Y. Lee and N.K.: arXiv:1807.03862. When the given tensor T is a CDL, each of its indices splits as i = I(p, q), and T must take the form of a CDL core dressed by the matrices U, V and W on its legs ... then we can find U, V and W by HOSVD.

  28. Index Splitting. H.-Y. Lee and N.K.: arXiv:1807.03862. I(p, q) is an injection from (p, q) to i, such as, e.g., I(0,0) = 0, I(0,1) = 1, I(1,0) = 2, I(1,1) = 3, so that i = I(p, q).
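In an array language, this index splitting is just a reshape (a tiny illustration of my own, using the injection I(p, q) = 2p + q from the slide's example):

```python
import numpy as np

# The injection table: I[p, q] = 2*p + q maps a pair of 2-valued indices
# to one 4-valued index.
I = np.arange(4).reshape(2, 2)
print(I)          # [[0, 1], [2, 3]]

# Conversely, a leg of dimension 4 splits into two legs of dimension 2
# without touching the data.
T = np.arange(4 * 4).reshape(4, 4)        # a 4x4 tensor
T_split = T.reshape(2, 2, 2, 2)           # T_split[p, q, r, s] = T[I(p,q), I(r,s)]
print(T_split[1, 0, 0, 1])                # equals T[2, 1] = 9
```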

  29. Uniqueness of HOSVD. H.-Y. Lee and N.K.: arXiv:1807.03862. If T is expressed with a core tensor t and unitaries U, V, and W, where the matrix slices of t are mutually orthogonal, such an expression is unique up to permutations within each index and phase factors. A CDL tensor satisfies the mutual orthogonality of the matrix slices of t, so U, V, W can be obtained by HOSVD.
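For reference, here is a minimal HOSVD sketch for a 3-leg tensor (my own code, assuming the standard construction: the mode-k unitary is the left singular basis of the mode-k unfolding, and the core is obtained by applying the conjugate unitaries).

```python
import numpy as np

def hosvd(T):
    """Return (core, [U_1, ..., U_d]) with T = core x1 U_1 x2 U_2 ... xd U_d."""
    Us = []
    for k in range(T.ndim):
        Tk = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)   # mode-k unfolding
        U, _, _ = np.linalg.svd(Tk, full_matrices=False)
        Us.append(U)
    core = T
    for k, U in enumerate(Us):
        # Apply U^T on leg k of the core.
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, k, 0), axes=1), 0, k)
    return core, Us

rng = np.random.default_rng(3)
T = rng.standard_normal((4, 5, 6))
core, (U, V, W) = hosvd(T)
R = np.einsum('abc,ia,jb,kc->ijk', core, U, V, W)   # exact reconstruction
print(np.allclose(T, R))   # True
```

The mode-k slices of the resulting core are mutually orthogonal (the "all-orthogonality" property), which is exactly the condition under which the slide's uniqueness statement applies.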

  30. Ring Decomposition by Index Splitting with Random Noise. H.-Y. Lee and N.K.: arXiv:1807.03862. By working directly with the inner structure, we can avoid the difficulty of local minima in the optimization.

  31. 2D Ising Model above Tc. H.-Y. Lee and N.K.: arXiv:1807.03862. (Figure: tensors with leg dimensions 16 and 4.)

  32. Summary. ■ TN representation makes it possible to handle extremely large systems, frustrated systems, etc. ■ CDL-like structures typical in TN-based RG often cause serious difficulty. ■ Index splitting based on HOSVD may be useful in overcoming the difficulty. (Background word cloud: Quantum Information Processing, MERA, Ring Decomp., TNS, Renormalization Group, mutual information, Entanglement, TN, Optical Lattice, Condensed Matter Theory.)
