Data-Driven Multifidelity Methods for Monte Carlo Estimation


  1. Data-Driven Multifidelity Methods for Monte Carlo Estimation
     Benjamin Peherstorfer, Courant Institute of Mathematical Sciences, New York University
     Karen Willcox, Massachusetts Institute of Technology
     Max Gunzburger, Florida State University

  2. Outer-loop applications: control, inference, optimization, visualization, model calibration, multi-discipline coupling, uncertainty quantification

  3. Surrogate models
     Given is a high-fidelity model f^(1) : D → Y
     ◮ Large-scale numerical simulation
     ◮ Achieves required accuracy
     ◮ Computationally expensive
     Additionally, we often have surrogate models f^(i) : D → Y, i = 2, ..., k
     ◮ Approximate the high-fidelity model f^(1)
     ◮ Often orders of magnitude cheaper
     Examples of surrogate models: data-fit models (response surfaces, machine learning), coarse-grid approximations, reduced basis and proper orthogonal decomposition, simplified and linearized models
     [Figure: cost/error trade-off of high-fidelity vs. surrogate models; snapshots u(z_1), ..., u(z_M) of the solution manifold {u(z) | z ∈ D} ⊂ R^N]

  4. Replacing the high-fidelity model with a surrogate
     Replace f^(1) with a surrogate model
     ◮ Costs of the outer loop reduced
     ◮ Often orders-of-magnitude speedups
     The estimate depends on the surrogate accuracy
     ◮ Control with error bounds/estimators
     ◮ Rebuild if accuracy is too low
     ◮ No guarantees without bounds/estimators
     Issues
     ◮ Propagation of the surrogate error onto the estimate
     ◮ Surrogates without error control
     ◮ Costs of rebuilding a surrogate model
     [Figure: outer-loop application mapping input z to output y via the model]


  6. Our approach: multifidelity methods
     Combine high-fidelity and surrogate models
     ◮ Leverage surrogate models for speedup
     ◮ Recourse to the high-fidelity model for accuracy
     Multifidelity guarantees high-fidelity accuracy
     ◮ Occasional recourse to the high-fidelity model
     ◮ The high-fidelity model is kept in the loop
     ◮ Independent of error control for the surrogates
     Multifidelity speeds up computations
     ◮ Adapt, fuse, and filter with surrogate models
     ◮ Balance the number of solves among the models
     [Brandt, 1977], [Hackbusch, 1985], [Bramble et al., 1990], [Booker et al., 1999], [Jones et al., 1998], [Alexandrov et al., 1998], [Christen et al., 2005], [Cui et al., 2014]
     [P., Willcox, Gunzburger, Survey of multifidelity methods in uncertainty propagation, inference, and optimization; SIAM Review, 2018 (to appear)]


  8. Outline
     1. Motivation for multifidelity methods
     2. Multifidelity Monte Carlo estimation (MFMC)
     3. Asymptotic analysis of MFMC
     4. Adaptive surrogates and MFMC
     5. Outlook and conclusions


  10. Uncertainty propagation as an outer-loop application
     High-fidelity ("truth") model f^(1) : D → Y with costs w_1 > 0
     Given a random variable Z, estimate s = E[f^(1)(Z)]
     Monte Carlo estimator with realizations z_1, ..., z_n of Z:
         ȳ_n^(1) = (1/n) Σ_{i=1}^n f^(1)(z_i)
     Uncertainty propagation with Monte Carlo is an outer-loop application
     ◮ Each high-fidelity model solve is computationally expensive
     ◮ Repeated model solves become prohibitive
     [Rozza, Carlberg, Manzoni, Ohlberger, Veroy-Grepl, Willcox, Kramer, Benner, Ullmann, Nouy, Zahm, etc.]
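
The plain Monte Carlo estimator on this slide can be sketched in a few lines. The model f1 and the distribution of Z below are toy placeholders (not the deck's high-fidelity model), chosen so the true expectation is known in closed form:

```python
import numpy as np

def monte_carlo(f, z):
    """Plain Monte Carlo estimator: ybar_n = (1/n) * sum_i f(z_i)."""
    return np.mean(f(z))

# Toy setup (assumption): f1(z) = exp(z) and Z ~ N(0, 1),
# so the true value is s = E[exp(Z)] = exp(1/2).
rng = np.random.default_rng(0)
f1 = np.exp
z = rng.standard_normal(100_000)   # realizations z_1, ..., z_n of Z
s_hat = monte_carlo(f1, z)
```

With n = 100,000 samples the standard error is about sqrt(Var[exp(Z)]/n) ≈ 0.007, so s_hat lands close to exp(1/2) ≈ 1.6487.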

  11. MFMC: Control variates
     Estimate E[A] of a random variable A with the Monte Carlo estimator
         ā_n = (1/n) Σ_{i=1}^n a_i,   a_1, ..., a_n ∼ A
     Unbiased estimator, E[ā_n] = E[A], with mean-squared error (MSE)
         e(ā_n) = Var[A] / n
     Combine ā_n with a Monte Carlo estimator b̄_n of E[B] of a random variable B:
         ŝ_A = ā_n + γ (E[B] − b̄_n),   γ ∈ R
     The control variate estimator ŝ_A is unbiased, E[ŝ_A] = E[A], and with the optimal coefficient γ its MSE is
         e(ŝ_A) = (1 − ρ²) e(ā_n)
     ◮ Correlation coefficient −1 ≤ ρ ≤ 1 of A and B
     ◮ If ρ = 0, same MSE as regular Monte Carlo
     ◮ If |ρ| > 0, lower MSE
     ◮ The higher the correlation, the lower the MSE of ŝ_A
     [Nelson, 1987]
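
The control variate construction can be illustrated numerically. This is a toy example under stated assumptions: A and B are synthetic correlated variables with E[B] known exactly, and γ is set to the sample estimate of the optimal coefficient γ* = Cov[A, B]/Var[B]:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
z = rng.standard_normal(n)
a = z**2 + 0.1 * rng.standard_normal(n)   # samples of A, with E[A] = 1
b = z**2                                  # samples of B, with known E[B] = 1
mean_b = 1.0

# Optimal coefficient gamma* = Cov[A, B] / Var[B], estimated from the samples
gamma = np.cov(a, b)[0, 1] / np.var(b, ddof=1)

s_plain = a.mean()                             # regular Monte Carlo estimate of E[A]
s_cv = a.mean() + gamma * (mean_b - b.mean())  # control variate estimate
rho = np.corrcoef(a, b)[0, 1]                  # high |rho| -> large MSE reduction
```

Because A and B here are almost perfectly correlated (ρ ≈ 0.997), the control variate term cancels nearly all of the sampling noise in ā_n, illustrating the (1 − ρ²) factor on the slide.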

  12. MFMC: Control variates and surrogate models
     Models [P., Willcox, Gunzburger, Optimal model management for multifidelity Monte Carlo estimation; SISC, 2016]
     ◮ High-fidelity model f^(1) : D → Y
     ◮ Surrogates f^(2), ..., f^(k) : D → Y
     Exploit the correlation of f^(1)(Z) and f^(i)(Z) to reduce the MSE:
         ρ_i = Cov[f^(1)(Z), f^(i)(Z)] / sqrt( Var[f^(1)(Z)] Var[f^(i)(Z)] ),   i = 2, ..., k
     Related work: combining multiple models for Monte Carlo estimation
     ◮ Multilevel Monte Carlo [Giles, 2008], [Heinrich, 2001], [Speight, 2009]
     ◮ RBM and control variates [Boyaval et al., 2010, 2012], [Vidal et al., 2015]
     ◮ Data-fit models and control variates [Tracey et al., 2013]
     ◮ Monte Carlo with low-/high-fidelity model [Ng & Eldred, 2012]
     ◮ Two models and control variates [Ng & Willcox, 2012, 2014]
     ⇒ Need for an arbitrary number of surrogates, of any type
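
In practice the correlation coefficients ρ_i are unknown and are estimated from a small number of pilot evaluations (100 samples on the numerical-example slide later). A sketch with toy stand-in models, not the deck's plate models:

```python
import numpy as np

rng = np.random.default_rng(3)
f1 = np.exp                               # stand-in "high-fidelity" model
f2 = lambda z: 1.0 + z + 0.5 * z**2       # stand-in surrogate (Taylor expansion of f1)

z_pilot = rng.standard_normal(100)        # a few pilot realizations of Z
y1, y2 = f1(z_pilot), f2(z_pilot)

# Sample estimate of rho_2 = Cov[f1(Z), f2(Z)] / sqrt(Var[f1(Z)] Var[f2(Z)])
rho2 = np.corrcoef(y1, y2)[0, 1]
```

Only the expensive model's pilot evaluations are costly; the surrogates can be evaluated at the same z_pilot essentially for free.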

  13. MFMC: Multifidelity Monte Carlo estimator
     Take realizations z_1, z_2, z_3, ... of the input random variable Z
     Evaluate model f^(i) at the first m_i realizations z_1, ..., z_{m_i} of Z:
         f^(i)(z_1), ..., f^(i)(z_{m_i}),   i = 1, ..., k
     Multifidelity Monte Carlo (MFMC) estimator
         ŝ = ȳ_{m_1}^(1) + Σ_{i=2}^k γ_i ( ȳ_{m_i}^(i) − ȳ_{m_{i−1}}^(i) )
     where the first term comes from the high-fidelity model and the sum from the surrogates
     ◮ The MFMC estimator ŝ is an unbiased estimator of s = E[f^(1)(Z)]
     ◮ Costs of each model evaluation 0 < w_1, ..., w_k ∈ R give the costs of MFMC
         c(ŝ) = Σ_{i=1}^k m_i w_i
     ◮ How to select the coefficients γ_2, ..., γ_k and the numbers of model evaluations m_1, ..., m_k?
     ◮ How does MFMC compare in terms of costs/MSE to regular Monte Carlo estimation?
     [P., Willcox, Gunzburger, Optimal model management for multifidelity Monte Carlo estimation; SISC, 2016]
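
The estimator itself is only a few lines of code. A sketch for k = 2 with toy models and a hand-picked coefficient γ_2 = 1; the optimal choices of γ_i and m_i are the subject of the next slides:

```python
import numpy as np

def mfmc(models, gammas, m, z):
    """MFMC estimator:
    s_hat = ybar^{(1)}_{m_1} + sum_{i>=2} gamma_i * (ybar^{(i)}_{m_i} - ybar^{(i)}_{m_{i-1}})
    models = [f1, ..., fk], gammas = [g_2, ..., g_k], m_1 <= ... <= m_k,
    and z holds the shared realizations z_1, ..., z_{m_k}."""
    s_hat = np.mean(models[0](z[: m[0]]))          # term from the high-fidelity model
    for i in range(1, len(models)):                # correction terms from the surrogates
        s_hat += gammas[i - 1] * (np.mean(models[i](z[: m[i]]))
                                  - np.mean(models[i](z[: m[i - 1]])))
    return s_hat

# Toy setup (assumption): expensive model f1(z) = exp(z), cheap surrogate
# f2 = its quadratic Taylor expansion, Z ~ N(0, 1), true value exp(1/2).
rng = np.random.default_rng(2)
f1 = np.exp
f2 = lambda z: 1.0 + z + 0.5 * z**2
z = rng.standard_normal(200_000)
s_hat = mfmc([f1, f2], gammas=[1.0], m=[2_000, 200_000], z=z)
```

Note that both terms of each surrogate difference reuse the same realizations z_1, ..., z_{m_i}, which is what makes the correction terms mean-zero and the estimator unbiased.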

  14. MFMC: Balancing work among models
     The variance of the MFMC estimator ŝ is
         e(ŝ) = Var[ŝ] = σ_1²/m_1 + Σ_{i=2}^k ( 1/m_{i−1} − 1/m_i ) ( γ_i² σ_i² − 2 γ_i ρ_i σ_1 σ_i )
     ◮ Variance σ_i² of f^(i)(Z)
     ◮ Correlation coefficient ρ_i between f^(1)(Z) and f^(i)(Z)
     Find m and γ that minimize the MSE for a given computational budget q:
         arg min_{m ∈ R^k, γ_2, ..., γ_k ∈ R}  Var[ŝ]
         subject to  m_{i−1} − m_i ≤ 0,   i = 2, ..., k,
                     −m_1 ≤ 0,
                     w^T m = q

  15. MFMC: Optimal sampling
     Theorem 1 (P., Willcox, Gunzburger, 2016). The optimization problem has a unique (analytic) solution if ρ_1² > ··· > ρ_k² > 0 and
         w_{i−1} / w_i > ( ρ_{i−1}² − ρ_i² ) / ( ρ_i² − ρ_{i+1}² ),   i = 2, ..., k   (1)
     Sketch of proof
     ◮ Establish necessary conditions for local optima with the Karush-Kuhn-Tucker conditions
     ◮ Only one local optimum has m_1 < m_2 < ··· < m_k
     ◮ This local optimum has a smaller objective value than any with "≤"
     Variance reduction of MFMC ŝ with respect to the benchmark Monte Carlo estimator ȳ^(1) with the same budget q:
         e(ŝ) = e(ȳ^(1)) ( Σ_{i=1}^k sqrt( (w_i / w_1) ( ρ_i² − ρ_{i+1}² ) ) )²
     [P., Willcox, Gunzburger, Optimal model management for multifidelity Monte Carlo estimation; SISC, 2016]
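
The analytic solution can be coded directly. The closed-form expressions below (γ_i* = ρ_i σ_1/σ_i and m_i* = m_1* r_i with r_i = sqrt( w_1(ρ_i² − ρ_{i+1}²) / (w_i(1 − ρ_2²)) ), m_1* = q / w^T r, using the conventions ρ_1 = 1 and ρ_{k+1} = 0) follow the SISC 2016 paper cited above; the numbers in the usage line are illustrative, not from the slides:

```python
import numpy as np

def mfmc_allocation(w, rho, sigma, q):
    """Optimal MFMC sample allocation and coefficients.
    w: costs w_1, ..., w_k; rho: correlations with rho[0] = 1;
    sigma: standard deviations of f^(i)(Z); q: computational budget."""
    w, rho, sigma = (np.asarray(x, dtype=float) for x in (w, rho, sigma))
    rho_ext = np.append(rho, 0.0)          # convention rho_{k+1} = 0
    # r_i = sqrt( w_1 (rho_i^2 - rho_{i+1}^2) / (w_i (1 - rho_2^2)) )
    r = np.sqrt(w[0] * (rho_ext[:-1] ** 2 - rho_ext[1:] ** 2)
                / (w * (1.0 - rho[1] ** 2)))
    m1 = q / np.dot(w, r)                  # enforce the budget w^T m = q
    m = m1 * r                             # real-valued; round down in practice
    gamma = rho * sigma[0] / sigma         # gamma_i* = rho_i * sigma_1 / sigma_i
    return m, gamma[1:]

# Illustrative numbers: one surrogate that is 100x cheaper with correlation 0.95
m, gamma = mfmc_allocation(w=[1.0, 0.01], rho=[1.0, 0.95], sigma=[2.0, 1.9], q=100.0)
```

With these numbers the cheap surrogate is sampled roughly 30 times more often than the high-fidelity model, while the full budget is spent exactly.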

  16. MFMC: Numerical example
     Locally damaged plate in bending
     ◮ Inputs: nominal thickness, load, damage
     ◮ Output: maximum deflection of the plate
     ◮ Only the distribution of the inputs is known
     ◮ Estimate the expected deflection
     Six models
     ◮ High-fidelity model: FEM, 300 DoFs
     ◮ Reduced model: POD, 10 DoFs
     ◮ Reduced model: POD, 5 DoFs
     ◮ Reduced model: POD, 2 DoFs
     ◮ Data-fit model: linear interpolation, 256 points
     ◮ Support vector machine: 256 points
     Variances, correlations, and costs estimated from 100 samples
     [Figures: (a) wing panel; (b) damaged plate, thickness field over spatial coordinates x_1 and x_2]
