
Adaptive Layout Decomposition with Graph Embedding Neural Networks (presentation slides)



  1. Adaptive Layout Decomposition with Graph Embedding Neural Networks. Wei Li¹, Jialu Xia¹, Yuzhe Ma¹, Jialu Li¹, Yibo Lin², Bei Yu¹. ¹The Chinese University of Hong Kong, ²Peking University. 1 / 22

  2. Outline: Background & Introduction, Algorithms, Results, Conclusion. 3 / 22

  3. Outline: Background & Introduction, Algorithms, Results, Conclusion. 4 / 22

  4. Multiple Patterning Lithography Decomposition. (Figure: an example layout (a) and the corresponding decomposition result (b), with patterns a-f assigned to different masks.) 4 / 22

  5. Uncolorable case: Conflict. (Figure: an example of the uncolorable case with patterns p1-p4, shown in (a) and (b).) 5 / 22

  6. One possible solution for the uncolorable case: Stitch. (Figure: an example of the stitch candidate and the inserted stitch, splitting the patterns into parts r1-r4 that may take different colors.) 6 / 22

  7. Problem Formulation. (Figure: layout graph with patterns p1, p2 and stitch candidates r1-r4.)

    min_x  Σ_{e_ij ∈ CE} c_ij + α Σ_{e_ij ∈ SE} s_ij,      (1a)
    s.t.   c_ij = (x_i == x_j),  ∀ e_ij ∈ CE,              (1b)
           s_ij = (x_i ≠ x_j),   ∀ e_ij ∈ SE,              (1c)
           x_i ∈ {0, 1, ..., k}, ∀ x_i ∈ x,                (1d)

  x: color assigned to each node; CE: conflict edge set; SE: stitch edge set. 7 / 22
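The formulation above can be sketched as a cost function plus a brute-force search over color assignments. This is a toy sketch for small graphs given as edge lists; the function and variable names are illustrative, not from the paper:

```python
from itertools import product

def decomposition_cost(colors, conflict_edges, stitch_edges, alpha=0.1):
    """Objective (1a): conflict count plus alpha times stitch count."""
    conflicts = sum(colors[i] == colors[j] for i, j in conflict_edges)  # (1b)
    stitches = sum(colors[i] != colors[j] for i, j in stitch_edges)     # (1c)
    return conflicts + alpha * stitches

def brute_force_decompose(nodes, conflict_edges, stitch_edges, k=3, alpha=0.1):
    """Try every assignment over k colors (1d) and keep the cheapest one.

    Exponential in |nodes|, so only viable for small simplified graphs;
    k=3 corresponds to triple patterning.
    """
    best_cost, best_colors = float("inf"), None
    for assignment in product(range(k), repeat=len(nodes)):
        colors = dict(zip(nodes, assignment))
        cost = decomposition_cost(colors, conflict_edges, stitch_edges, alpha)
        if cost < best_cost:
            best_cost, best_colors = cost, colors
    return best_cost, best_colors
```

For example, a 4-clique of conflict edges with k = 3 is an uncolorable case: the minimum cost is one unavoidable conflict, while a triangle is still 3-colorable at zero cost.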

  8. Integer Linear Programming (ILP)∗

    min  Σ_{e_ij ∈ CE} c_ij + α Σ_{e_ij ∈ SE} s_ij                       (2a)
    s.t. x_i1 + x_i2 ≤ 1,  x_ij ∈ {0, 1},                                (2b)
         x_i1 + x_j1 ≤ 1 + c_ij1,  x_i2 + x_j2 ≤ 1 + c_ij2, ∀ e_ij ∈ CE, (2c)
         (1 − x_i1) + (1 − x_j1) ≤ 1 + c_ij1,  ∀ e_ij ∈ CE,              (2d)
         (1 − x_i2) + (1 − x_j2) ≤ 1 + c_ij2,  ∀ e_ij ∈ CE,              (2e)
         c_ij1 + c_ij2 ≤ 1 + c_ij,  ∀ e_ij ∈ CE,                         (2f)
         |x_j1 − x_i1| ≤ s_ij1,  |x_j2 − x_i2| ≤ s_ij2,  ∀ e_ij ∈ SE,    (2g)
         s_ij ≥ s_ij1,  s_ij ≥ s_ij2,  ∀ e_ij ∈ SE.                      (2h)

  ∗ Bei Yu et al. (Mar. 2015). “Layout Decomposition for Triple Patterning Lithography”. In: IEEE TCAD 34.3, pp. 433–446. 8 / 22

  9. Exact Cover-based algorithm (EC)† (Figure: an example of the exact cover-based algorithm; each row of the cover matrix proposes a coloring for a subset of patterns a, b, c, and the picked rows form an exact cover.) † Hua-Yu Chang and Iris Hui-Ru Jiang (2016). “Multiple patterning layout decomposition considering complex coloring rules”. In: Proc. DAC, 40:1–40:6. 9 / 22
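The engine behind EC is an exact-cover search: pick rows of the matrix so that every element is covered exactly once. A minimal recursive Algorithm X sketch, using plain sets rather than the dancing-links structure of the actual EC decomposer (names are illustrative):

```python
def exact_cover(universe, rows):
    """Return row keys whose subsets exactly partition `universe`, or None.

    rows: dict mapping a row key to the frozenset of elements it covers.
    Plain recursive Algorithm X sketch; real EC also encodes coloring rules.
    """
    universe = frozenset(universe)
    if not universe:
        return []
    # Branch on the element covered by the fewest rows (Knuth's heuristic).
    elem = min(universe, key=lambda e: sum(1 for s in rows.values() if e in s))
    for key, subset in rows.items():
        if elem in subset and subset <= universe:
            rest = exact_cover(universe - subset, rows)
            if rest is not None:
                return [key] + rest
    return None  # no row can cover `elem` without overlap
```

In the decomposition setting, the universe is the set of uncolored patterns and each row corresponds to one candidate coloring of a pattern subset.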

  10. Pros and cons analysis.
  - ILP. Pros: optimal. Cons: poor runtime performance.
  - EC. Pros: high efficiency. Cons: degraded solution quality.
  - Graph matching‡. Pros: good efficiency and quality for small graphs. Cons: the graph library size is limited.
  ‡ Jian Kuang and Evangeline F. Y. Young (2013). “An Efficient Layout Decomposition Approach for Triple Patterning Lithography”. In: Proc. DAC. San Francisco, California, 69:1–69:6. 10 / 22

  11. Graph Embedding. (Figure: an example of graph embeddings of layout graphs, where the graphs are transformed into vector space.) 11 / 22

  12. Graph Convolutional Network (GCN)

    u_i^(l+1) = ReLU( W^(l) ( Σ_{j ∈ N_i} u_j^(l) + u_i^(l) ) ),   (3)

  u^(l): node representation at the l-th layer; N_i: neighbours of node i.
  - Composed of two modules: an aggregator and an encoder
  - Node embedding: the node representation at the final layer
  - Graph embedding: obtained from the node embeddings through operations such as summation or mean
  - Not applicable to heterogeneous graphs
  12 / 22
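Eq. (3) amounts to summing each node's own vector with its neighbours' vectors (the aggregator), then applying a shared linear map plus ReLU (the encoder). A dependency-free sketch with toy dimensions and assumed names:

```python
def gcn_layer(W, features, neighbours):
    """One layer of Eq. (3): u_i^(l+1) = ReLU(W (sum_{j in N_i} u_j + u_i)).

    W: weight matrix as a list of rows; features: one vector per node;
    neighbours: adjacency list. Pure-Python toy, no batching or training.
    """
    def matvec(M, v):
        return [sum(m * x for m, x in zip(row, v)) for row in M]

    out = []
    for i, u_i in enumerate(features):
        agg = list(u_i)                          # self term u_i^(l)
        for j in neighbours[i]:                  # aggregator: sum over N_i
            agg = [a + b for a, b in zip(agg, features[j])]
        out.append([max(0.0, x) for x in matvec(W, agg)])  # encoder + ReLU
    return out
```

Stacking a few such layers and summing the final node vectors yields the graph embedding described on this slide.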

  13. Outline: Background & Introduction, Algorithms, Results, Conclusion. 13 / 22

  14. Framework Overview. (Figure: the online workflow of our framework, combining graph simplification, stitch insertion, RGCN embedding, graph matching (“Graph matched?”), and decomposer selection (“Node num < k?”) before returning the results.)
  - Online: shown in the figure.
  - Offline: model training & graph library construction.
  13 / 22

  15. Relational Graph Convolutional Networks (RGCN)

    u_i^(l+1) = ReLU( Σ_{e ∈ E} Σ_{j ∈ N_i^e} W_e^(l) u_j^(l) + u_i^(l) ),   (4)

  E: {CE (conflict edge set), SE (stitch edge set)}.
  - Neighbours connected by different kinds of edges are assigned to different encoder tracks.
  - Applicable to heterogeneous graphs.
  14 / 22
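Eq. (4) differs from Eq. (3) in keeping one weight matrix per edge type, so conflict and stitch neighbours flow through separate encoder tracks before being summed with the self term. Again a pure-Python toy with assumed names:

```python
def rgcn_layer(weights, features, neighbours):
    """One layer of Eq. (4) with a relation-specific weight W_e per edge type.

    weights: {"CE": matrix, "SE": matrix}; features: one vector per node;
    neighbours: per edge type, an adjacency list. Toy sketch, no
    normalization, batching, or training.
    """
    def matvec(M, v):
        return [sum(m * x for m, x in zip(row, v)) for row in M]

    out = []
    for i, u_i in enumerate(features):
        agg = list(u_i)                               # self term u_i^(l)
        for etype, W in weights.items():              # one track per edge type
            for j in neighbours[etype][i]:
                agg = [a + b for a, b in zip(agg, matvec(W, features[j]))]
        out.append([max(0.0, x) for x in agg])        # ReLU
    return out
```

With identical weights for both edge types this collapses back to an ordinary GCN layer, which is why the homogeneous model on the previous slide is a special case.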

  16. Graph Embedding Workflow by RGCN. (Figure: data preprocessing (graph simplification & stitch insertion) → RGCN node embeddings → summation → graph embedding, with stitch edges and conflict edges distinguished.) 15 / 22

  17. Offline: Graph Library Construction. What do we need?
  - Enumerate all possible graphs under a size constraint
  - Avoid isomorphic graphs
  Rough algorithm:
  1. Enumerate all valid graphs under the given size constraint
  2. For each enumerated graph, calculate its graph embedding and normalize it
  3. Multiply it with the graph embeddings already in the library
  4. If the maximal product is less than one, insert the graph and the corresponding optimal solution (computed by ILP) into the library
  16 / 22
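Because the embeddings are L2-normalized, the dot product of any two of them is at most one, with equality exactly when the vectors coincide; the dedup step above exploits this. A sketch of steps 2-4, where plain embedding vectors stand in for (graph, ILP solution) pairs and the tolerance is an assumption to absorb floating-point error:

```python
import math

def normalize(v):
    """L2-normalize a vector (step 2)."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v] if norm else list(v)

def build_library(embeddings, tol=1e-6):
    """Keep an embedding only if its max dot product with the library is < 1.

    A dot product of (almost) exactly one means the same embedding, i.e.
    a duplicate of an already-stored graph (steps 3-4).
    """
    library = []
    for emb in (normalize(e) for e in embeddings):
        scores = [sum(a * b for a, b in zip(emb, kept)) for kept in library]
        if not scores or max(scores) < 1.0 - tol:
            library.append(emb)
    return library
```

Graphs that only differ by scaling of their raw embeddings collapse to the same normalized vector, so only one representative per embedding survives.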

  18. Online: Graph Matching & Decomposer Selection.
  Graph matching:
  - Similar idea to graph library construction
  - Return the optimal solution of the matched graph whose graph-embedding product is exactly one
  Decomposer selection:

    y = argmax( W_s h + b_s ),   (5)

  W_s, b_s: trainable weight and bias; h: graph embedding. A two-class classification problem: ILP or EC.
  17 / 22
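Eq. (5) is a linear classifier over the graph embedding h: with two rows in W_s, the argmax picks between the ILP and EC decomposers. A minimal sketch (the weight values used below are illustrative, not trained parameters):

```python
def select_decomposer(W_s, b_s, h):
    """Eq. (5): y = argmax(W_s h + b_s); row 0 scores ILP, row 1 scores EC."""
    scores = [sum(w * x for w, x in zip(row, h)) + b
              for row, b in zip(W_s, b_s)]
    return "ILP" if scores[0] >= scores[1] else "EC"
```

Graphs predicted as “ILP” go to the optimal but slower solver; everything else falls through to the fast EC decomposer.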

  19. Outline: Background & Introduction, Algorithms, Results, Conclusion. 18 / 22

  20. Effectiveness of RGCN

    (a) Proposed RGCN              (b) Conventional GCN
                   Label                          Label
    Predicted    ILP     EC        Predicted    ILP     EC
      ILP         13    682          ILP          2    244
      EC           0   5900          EC          11   6338
    Recall     100.0%              Recall      15.4%
    F1-score   0.0367              F1-score    0.0154

  - Classifies all ‘ILP’ cases correctly and thus preserves optimality
  - 2× F1-score and 6× recall compared with the conventional GCN
  18 / 22

  21. Comparison with state-of-the-art. Each method reports st#, cn#, cost, time (s):

    Circuit |        ILP         |        SDP         |         EC         |        RGCN
    C432    | 4 0 0.4 0.486      | 4 0 0.4 0.016      | 4 0 0.4 0.005      | 4 0 0.4 0.007
    C499    | 0 0 0 0.063        | 0 0 0 0.018        | 0 0 0 0.011        | 0 0 0 0.015
    C880    | 7 0 0.7 0.135      | 7 0 0.7 0.021      | 7 0 0.7 0.010      | 7 0 0.7 0.014
    C1355   | 3 0 0.3 0.121      | 3 0 0.3 0.024      | 3 0 0.3 0.011      | 3 0 0.3 0.015
    C1908   | 1 0 0.1 0.129      | 1 0 0.1 0.024      | 1 0 0.1 0.017      | 1 0 0.1 0.031
    C2670   | 6 0 0.6 0.158      | 6 0 0.6 0.044      | 6 0 0.6 0.035      | 6 0 0.6 0.046
    C3540   | 8 1 1.8 0.248      | 8 1 1.8 0.086      | 8 1 1.8 0.032      | 8 1 1.8 0.038
    C5315   | 9 0 0.9 0.226      | 9 0 0.9 0.106      | 9 0 0.9 0.039      | 9 0 0.9 0.049
    C6288   | 205 1 21.5 5.569   | 203 4 24.3 0.648   | 203 5 25.3 0.151   | 205 1 21.5 0.154
    C7552   | 21 1 3.1 0.872     | 21 1 3.1 0.157     | 21 1 3.1 0.071     | 21 1 3.1 0.111
    S1488   | 2 0 0.2 0.147      | 2 0 0.2 0.031      | 2 0 0.2 0.013      | 2 0 0.2 0.016
    S38417  | 54 19 24.4 7.883   | 48 25 29.8 1.686   | 54 19 24.4 0.329   | 54 19 24.4 0.729
    S35932  | 40 44 48 13.692    | 24 60 62.4 5.130   | 46 44 48.6 0.868   | 40 44 48 1.856
    S38584  | 117 36 47.7 13.494 | 108 46 56.8 4.804  | 116 37 48.6 0.923  | 117 36 47.7 1.840
    S15850  | 97 34 43.7 11.380  | 85 46 54.5 4.320   | 100 34 44 0.864    | 97 34 43.7 1.792
    average | 12.893 3.640       | 15.727 1.141       | 13.267 0.225       | 12.893 0.448
    ratio   | 1.000 1.000        | 1.220 0.313        | 1.029 0.062        | 1.000 0.123

  - Obtains the optimal solution on all benchmarks
  - Runtime is reduced to 12.3% of that of the optimal ILP-based algorithm
  19 / 22

  22. Runtime breakdown of our framework. (Pie chart: the ILP/EC decomposer takes 96.23% of the runtime; GCN inference and graph matching account for the remaining 2.14% and 1.63%.)
  - The decomposition runtime of the selected decomposer is the major bottleneck
  - The RGCN inference and graph matching runtime of our framework are negligible
  - Our method has strong scalability and can be applied to select other, more efficient decomposers in the future
  20 / 22

  23. Outline: Background & Introduction, Algorithms, Results, Conclusion. 21 / 22

  24. Conclusion
  - Graph embedding by RGCN:
    - Build the isomorphism-free graph library
    - Match graphs against the library
    - Adaptively select the decomposer
  - The results show that:
    - The obtained graph embeddings have powerful representation capability
    - The framework strikes an excellent balance between decomposition quality and efficiency
    - The framework has strong scalability for future incremental selection
  21 / 22
