

  1. GAN-OPC: Mask Optimization with Lithography-guided Generative Adversarial Nets. Haoyu Yang, Shuhe Li, Yuzhe Ma, Bei Yu and Evangeline F. Y. Young, The Chinese University of Hong Kong. 1 / 16

  2. Lithography Proximity Effect
  ◮ Still hotspots: low-fidelity patterns
  ◮ What you see ≠ what you get
  ◮ Diffraction information loss
  ◮ Worse for designs at 10 nm and beyond
  ◮ Simulations: extremely CPU-intensive
  ◮ RET: OPC, SRAF, MPL
  2 / 16

  3.-5. Optical Proximity Correction (OPC): design target → mask → wafer, comparing the printed result without OPC and with OPC. 3-5 / 16

  6. Previous Work
  Classic OPC
  ◮ Model/Rule-based OPC [Matsunawa+,JM3'16][Choi+,SPIE'16][Kuang+,DATE'15][Awad+,DAC'16][Xu+,ISPD'16][Shim+,APCCAS'16][Su+,ICCAD'16]
    1. Fragmentation of shape edges;
    2. Move fragments for better printability.
  ◮ Inverse Lithography [Gao+,DAC'14][Poonawala+,TIP'07][Ma+,ICCAD'17]
    1. Efficient model that maps a mask to an aerial image;
    2. Continuously update the mask by descending the gradient of the contour error.
  Machine Learning OPC
    1. Edge fragmentation;
    2. Feature extraction;
    3. Model training.
  4 / 16

  7. Preliminaries
  Definition (PV Band): Given the lithography simulation contours under a set of process conditions, the process variation (PV) band is the area between the outer contour and the inner contour.
  The PV band reflects the robustness of the design to process-window variations.
  A PV band example: lithography results of a 2 × 2 via/contact array under different process conditions.
  5 / 16
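The PV band definition above can be computed directly on binary prints: a pixel belongs to the band if it prints under at least one process corner but not under all of them. The helper `pv_band` below is a hypothetical toy, not from the paper:

```python
# Toy PV-band computation on binary 0/1 grids (hypothetical helper).
# A pixel is in the band if some, but not all, process corners print it,
# i.e. it lies between the outer and inner contours.

def pv_band(prints):
    """prints: list of 2D 0/1 grids of identical shape, one per corner."""
    rows, cols = len(prints[0]), len(prints[0][0])
    band = 0
    for i in range(rows):
        for j in range(cols):
            vals = [p[i][j] for p in prints]
            if any(vals) and not all(vals):
                band += 1
    return band

# Two corners: a full 3x3 print and a 1-pixel print nested inside it.
outer = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
inner = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
print(pv_band([outer, inner]))  # the 8 border pixels form the band
```

A smaller band under the same set of corners indicates a more process-robust mask.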

  8. Preliminaries
  Definition (Squared-L2 Error): Let Z_t and Z be the target image and the wafer image, respectively; the squared L2 error of Z is given by ||Z_t − Z||²₂.
  Problem (Mask Optimization): Given a target image Z_t, the objective is to generate the corresponding mask M such that the remaining pattern Z after the lithography process is as close to Z_t as possible, in other words, to minimize the squared L2 error of the lithography image.
  6 / 16
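The squared-L2 objective is just a pixel-wise sum of squared differences; a minimal sketch (the function name is ours, not the paper's):

```python
def squared_l2_error(z_target, z):
    """||Zt - Z||_2^2: sum of squared pixel differences over two 2D grids."""
    return sum((a - b) ** 2
               for row_t, row in zip(z_target, z)
               for a, b in zip(row_t, row))

zt = [[1, 0], [0, 1]]
z  = [[1, 1], [0, 0]]
print(squared_l2_error(zt, z))  # 2: exactly two pixels differ
```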

  9. Lithography Model
  ◮ SVD approximation of a partially coherent system [Cobb,1998]:
    I = Σ_{k=1}^{N²} w_k |M ⊗ h_k|².  (1)
  ◮ Reduced model [Gao+,DAC'14]:
    I = Σ_{k=1}^{N_h} w_k |M ⊗ h_k|².  (2)
  ◮ Etch model:
    Z(x, y) = 1 if I(x, y) ≥ I_th; 0 if I(x, y) < I_th.  (3)
  7 / 16
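Equations (2)-(3) amount to convolving the mask with each optical kernel, summing weighted squared magnitudes, then thresholding at I_th. The sketch below uses a single made-up kernel and weight (real optical kernels come from the SVD of the imaging system and are not reproduced here):

```python
# Toy version of the reduced lithography model (Eqs. (2)-(3)).
# Kernel values, weights, and the threshold are made-up toy numbers.

def conv2d(m, h):
    """'Same' 2D convolution with zero padding (naive toy implementation)."""
    rows, cols = len(m), len(m[0])
    kr, kc = len(h), len(h[0])
    pr, pc = kr // 2, kc // 2
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            s = 0.0
            for a in range(kr):
                for b in range(kc):
                    ii, jj = i + a - pr, j + b - pc
                    if 0 <= ii < rows and 0 <= jj < cols:
                        s += m[ii][jj] * h[a][b]
            out[i][j] = s
    return out

def aerial_image(mask, kernels, weights):
    """I = sum_k w_k |M (x) h_k|^2 (Eq. (2))."""
    rows, cols = len(mask), len(mask[0])
    intensity = [[0.0] * cols for _ in range(rows)]
    for h, w in zip(kernels, weights):
        c = conv2d(mask, h)
        for i in range(rows):
            for j in range(cols):
                intensity[i][j] += w * c[i][j] ** 2
    return intensity

def resist(intensity, i_th):
    """Constant-threshold etch model (Eq. (3))."""
    return [[1 if v >= i_th else 0 for v in row] for row in intensity]

mask = [[0, 1, 0], [1, 1, 1], [0, 1, 0]]
kernels = [[[0.25, 0.25], [0.25, 0.25]]]  # one toy averaging kernel
Z = resist(aerial_image(mask, kernels, [1.0]), i_th=0.5)
```

The dense center of the cross prints while isolated corners do not, illustrating the proximity effect the deck motivates.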

  10. Inverse Lithography Technique (ILT)
  The main objective of ILT is minimizing the lithography error through gradient descent:
    E = ||Z_t − Z||²₂,  (4)
  where Z_t is the target and Z is the wafer image of a given mask. Translated sigmoid functions push the pixel values toward either 0 or 1:
    Z = 1 / (1 + exp[−α (I − I_th)]),  (5)
    M_b = 1 / (1 + exp(−β M)).  (6)
  Combining Equations (1)-(6) with the analysis in [Poonawala+,TIP'07]:
    ∂E/∂M = 2αβ · M_b ⊙ (1 − M_b) ⊙ [ ((Z − Z_t) ⊙ Z ⊙ (1 − Z) ⊙ (M_b ⊗ H*)) ⊗ H + ((Z − Z_t) ⊙ Z ⊙ (1 − Z) ⊙ (M_b ⊗ H)) ⊗ H* ].  (7)
  8 / 16
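To see the mechanics of Eqs. (4)-(7) without the convolutions, consider a single mask pixel with identity "optics" (I = M_b): the full gradient of Eq. (7) collapses to a scalar chain rule. The values of α, β, I_th, the learning rate, and the helper names below are our own toy choices:

```python
import math

# Scalar toy of the ILT relaxation: one mask pixel, identity optics.
# ALPHA, BETA, I_TH are made-up constants, not tuned lithography values.
ALPHA, BETA, I_TH = 50.0, 4.0, 0.5

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def wafer(m):
    m_b = sigmoid(BETA * m)               # Eq. (6): relaxed binary mask
    return sigmoid(ALPHA * (m_b - I_TH))  # Eq. (5): relaxed resist model

def ilt_step(m, z_t, lr=0.1):
    """One descent step on E = (z_t - Z)^2 via the chain rule of Eq. (7)."""
    m_b = sigmoid(BETA * m)
    z = sigmoid(ALPHA * (m_b - I_TH))
    grad = 2.0 * (z - z_t) * ALPHA * z * (1.0 - z) * BETA * m_b * (1.0 - m_b)
    return m - lr * grad

m = 0.0                       # start from a "gray" mask pixel
for _ in range(200):
    m = ilt_step(m, z_t=1.0)  # drive the printed pixel toward the target 1
```

After a few steps the relaxed wafer value saturates near the target, mirroring how the full ILT flow drives the contour error down.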

  11. Generative Adversarial Net (GAN)
  ◮ x: sample from the distribution of the target dataset; z: input of G.
  ◮ Generator G(z; θ_g): a differentiable function represented by a multilayer perceptron with parameters θ_g.
  ◮ Discriminator D(x; θ_d): represents the probability that x came from the data rather than from G.
  1. Train D to maximize the probability of assigning the correct label to both training examples and samples from G.
  2. Train G to minimize log(1 − D(G(z))), i.e. generate fake samples drawn from a distribution similar to p_data(x):
    min_G max_D E_{x∼p_data(x)}[log D(x)] + E_{z∼p_z(z)}[log(1 − D(G(z)))].  (8)
  9 / 16
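The two-player objective in Eq. (8) splits into the per-player losses below; the scores fed in are toy numbers standing in for real network outputs:

```python
import math

def d_loss(d_real, d_fake):
    """Discriminator maximizes log D(x) + log(1 - D(G(z)));
    written here as the negative, to be minimized."""
    return -(math.log(d_real) + math.log(1.0 - d_fake))

def g_loss(d_fake):
    """Generator minimizes log(1 - D(G(z)))."""
    return math.log(1.0 - d_fake)

# A confident discriminator (real -> 0.9, fake -> 0.1) attains a lower
# loss than an undecided one (0.5, 0.5); the generator's loss falls as
# it fools the discriminator more often.
print(d_loss(0.9, 0.1), d_loss(0.5, 0.5))
print(g_loss(0.8), g_loss(0.2))
```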

  12. GAN Architecture
  [Figure: the generator maps an input vector to a sample; the discriminator scores samples, e.g. 0.2 for fake and 0.8 for real.]
  10 / 16

  13. GAN-OPC
  [Figure: the target passes through an encoder-decoder generator to produce a mask; the discriminator takes the target together with a mask and scores it, e.g. 0.2 for a bad mask and 0.8 for a good mask.]
  11 / 16

  14. GAN Training
  Based on the OPC-oriented GAN architecture in our framework, we tweak the objectives of G and D accordingly:
    max_G E_{Z_t∼Z}[log D(Z_t, G(Z_t))],  (9)
    max_D E_{Z_t∼Z}[log D(Z_t, M*)] + E_{Z_t∼Z}[1 − log D(Z_t, G(Z_t))].  (10)
  In addition, to facilitate the training procedure, we minimize the difference between generated masks and reference masks when updating the generator, as in Equation (11):
    min E_{Z_t∼Z} ||M* − G(Z_t)||_n,  (11)
  where ||·||_n denotes the l_n norm. Combining (9), (10) and (11), the objective of our GAN model becomes
    min_G max_D E_{Z_t∼Z}[1 − log D(Z_t, G(Z_t)) + ||M* − G(Z_t)||ⁿ_n] + E_{Z_t∼Z}[log D(Z_t, M*)].  (12)
  12 / 16
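The generator's share of Eq. (12) is the adversarial term plus the l_n distance to the reference mask M* from Eq. (11). A scalar sketch with flattened toy masks (function name and all numeric values are ours, not the paper's):

```python
import math

def generator_loss(d_out, gen_mask, ref_mask, n=2):
    """1 - log D(Zt, G(Zt)) + ||M* - G(Zt)||_n^n on flattened 0/1 masks."""
    adversarial = 1.0 - math.log(d_out)
    regularizer = sum(abs(a - b) ** n for a, b in zip(ref_mask, gen_mask))
    return adversarial + regularizer

# Matching the reference mask zeroes the regularization term; one wrong
# pixel adds exactly 1 under the squared-l2 (n = 2) choice.
perfect = generator_loss(0.9, [1, 0, 1], [1, 0, 1])
off_by_one = generator_loss(0.9, [1, 1, 1], [1, 0, 1])
print(perfect, off_by_one)
```

The regularizer is what lets the generator learn from reference masks directly, rather than only through the discriminator's score.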

  15. ILT-guided Pre-training
  Observing that ILT and neural-network optimization share a similar gradient-descent procedure, we propose a joint training algorithm that takes advantage of the ILT engine, as depicted in Figure (b). We initialize the generator with lithography-guided pre-training so that it converges well in the subsequent GAN optimization flow.
  [Figure: (a) the generator trained against the discriminator (real/fake); (b) the generator trained against a litho-simulator, with feed-forward and back-propagation arrows.]
  13 / 16
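The idea of slide 15 is that the lithography error's gradient can flow through the simulator into the generator's weights, replacing the discriminator during pre-training. A one-parameter scalar sketch (all shapes, constants, and the `pretrain` helper are our own simplifications):

```python
import math

# Toy ILT-guided pre-training: the litho error is back-propagated through
# a one-parameter "generator". ALPHA and I_TH are made-up constants.
ALPHA, I_TH = 20.0, 0.5

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def pretrain(theta, target, epochs=300, lr=0.05):
    for _ in range(epochs):
        mask = sigmoid(theta * target)        # generator forward pass
        z = sigmoid(ALPHA * (mask - I_TH))    # litho-simulator forward pass
        # back-propagate E = (target - z)^2 through both stages
        de_dz = 2.0 * (z - target)
        dz_dmask = ALPHA * z * (1.0 - z)
        dmask_dtheta = mask * (1.0 - mask) * target
        theta -= lr * de_dz * dz_dmask * dmask_dtheta
    return theta

theta = pretrain(0.1, target=1.0)
mask = sigmoid(theta * 1.0)
z = sigmoid(ALPHA * (mask - I_TH))
```

After pre-training, the generator already produces masks that print close to the target, so the adversarial phase starts from a good initialization instead of noise.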

  16. Results
  [Figure: (a) squared L2 error and (b) PV band (nm²) compared across ILT, GAN, and PGAN; (c) example masks from ILT and PGAN.]
  14 / 16

  17. Results
  [Figure: result examples, panels (a)-(e).]
  15 / 16

  18. Thank You 16 / 16
