Multifidelity importance sampling methods for rare event simulation

Benjamin Peherstorfer, University of Wisconsin-Madison
Karen Willcox and Boris Kramer, MIT
Max Gunzburger, Florida State University

July 2017
Introduction

High-fidelity model f^(1): D → Y with costs w_1 ≫ 0

Random variable Z, estimate s = E[f^(1)(Z)]

Monte Carlo estimator of s with samples Z_1, ..., Z_n:
ȳ_n^(1) = (1/n) Σ_{i=1}^{n} f^(1)(Z_i)

Computational costs are high
◮ Many evaluations of the high-fidelity model
◮ Typically 10^3 – 10^6 evaluations
◮ Intractable if f^(1) is expensive

[Diagram: input z → high-fidelity model → output y → uncertainty quantification]
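A minimal sketch of the plain Monte Carlo estimator above; the model, input distribution, and sample size are illustrative assumptions, not taken from the slides:

import numpy as np

def monte_carlo_mean(model, sample_z, n, rng):
    """Plain Monte Carlo estimate of s = E[model(Z)] from n i.i.d. samples."""
    zs = sample_z(n, rng)                     # draw Z_1, ..., Z_n
    ys = np.array([model(z) for z in zs])     # n evaluations of the (expensive) model
    return ys.mean()

# Illustrative stand-in for the high-fidelity model f^(1): D -> Y
expensive_model = lambda z: np.sin(z[0]) + z[1] ** 2
sample_z = lambda n, rng: rng.normal(size=(n, 2))     # assumed input distribution Z ~ N(0, I)
print(monte_carlo_mean(expensive_model, sample_z, n=10_000, rng=np.random.default_rng(0)))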
Surrogate models

Given is a high-fidelity model f^(1): D → Y
◮ Large-scale numerical simulation
◮ Achieves required accuracy
◮ Computationally expensive

Additionally, often have surrogate models f^(i): D → Y, i = 2, ..., k
◮ Approximate the high-fidelity model f^(1)
◮ Often orders of magnitude cheaper

Examples of surrogate models
◮ Data-fit models, response surfaces, machine learning
◮ Coarse-grid approximations
◮ Reduced basis, proper orthogonal decomposition
◮ Simplified models, linearized models

[Diagram: hierarchy of high-fidelity and surrogate models trading costs against error; snapshots u(z_1), u(z_2), ..., u(z_M) of the solution manifold {u(z) | z ∈ D} ⊂ R^N]
Surrogate models in uncertainty quantification

Replace f^(1) with a surrogate model
◮ Costs of uncertainty quantification reduced
◮ Often orders of magnitude speedups

Estimate depends on surrogate accuracy
◮ Control with error bounds/estimators
◮ Rebuild if accuracy too low
◮ No guarantees without bounds/estimators

Issues
◮ Propagation of the surrogate error onto the estimate
◮ Surrogates without error control
◮ Costs of rebuilding a surrogate model

[Diagram: input z → surrogate model → output y → uncertainty quantification]
Our approach: Multifidelity methods

Combine high-fidelity and surrogate models
◮ Leverage surrogate models for speedup
◮ Recourse to the high-fidelity model for accuracy

Multifidelity speeds up computations
◮ Balance the number of model solves among the models
◮ Adapt, fuse, filter with surrogate models

Multifidelity guarantees high-fidelity accuracy
◮ Occasional recourse to the high-fidelity model
◮ High-fidelity model is kept in the loop
◮ Independent of error control for surrogates

[Diagram: input z → high-fidelity model and surrogate models → output y → uncertainty quantification]

[P., Willcox, Gunzburger, Survey of multifidelity methods in uncertainty propagation, inference, and optimization; available online as technical report, MIT, 2016]
Multifidelity Monte Carlo (MFMC)

MFMC uses control variates for variance reduction
◮ Derives control variates from surrogate models
◮ Uses the numbers of model evaluations that minimize the estimator error
[P., Willcox, Gunzburger: Optimal model management for multifidelity Monte Carlo estimation. SIAM Journal on Scientific Computing, 2016]

Multifidelity sensitivity analysis
◮ Identify the parameters of the model with the largest influence on the quantity of interest
◮ Elizabeth Qian (MIT)/Earth Science (LANL)

Asymptotic analysis of MFMC
[P., Gunzburger, Willcox: Convergence analysis of multifidelity Monte Carlo estimation, submitted, 2016]

MFMC with information reuse
[Ng, Willcox: Monte Carlo Information-Reuse Approach to Aircraft Conceptual Design Optimization Under Uncertainty. 2015]

MFMC with optimally-adapted surrogates

[Figure: (MF)MC hydraulic conductivity estimation; variance of the mean estimate versus computational budget (s) for MC and MFMC. Figures: Elizabeth Qian (MIT)]
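A minimal two-model sketch of the control-variate idea behind MFMC. Here the control-variate coefficient is estimated from the shared samples and the sample split (n high-fidelity, m ≥ n surrogate evaluations) is fixed by hand, whereas the MFMC paper derives the cost-optimal coefficient and split. The models below are illustrative assumptions.

import numpy as np

def mfmc_two_model(hifi, surrogate, sample_z, n, m, rng):
    """Two-model multifidelity Monte Carlo estimator in control-variate form.

    Uses n high-fidelity and m >= n surrogate evaluations; the first n inputs are shared.
    """
    zs = sample_z(m, rng)
    y1 = np.array([hifi(z) for z in zs[:n]])      # few expensive evaluations
    y2 = np.array([surrogate(z) for z in zs])     # many cheap evaluations
    cov = np.cov(y1, y2[:n])                      # estimate the control-variate coefficient
    alpha = cov[0, 1] / cov[1, 1]
    return y1.mean() + alpha * (y2.mean() - y2[:n].mean())

# Illustrative models (assumptions for this sketch)
hifi = lambda z: np.exp(np.sin(z)) + 0.1 * z ** 2
surrogate = lambda z: 1.0 + np.sin(z)             # crude but correlated approximation
sample_z = lambda n, rng: rng.normal(size=n)
print(mfmc_two_model(hifi, surrogate, sample_z, n=100, m=10_000, rng=np.random.default_rng(1)))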
Multifidelity rare event simulation based on importance sampling
with Karen Willcox and Boris Kramer
MFCE: Problem setup

Threshold 0 < t ∈ R and random variable Z ~ p

Estimate rare event probability
P_f = P_p[f^(1)(Z) ≤ t]

Can be reformulated as estimating P_f = E_p[I_t^(1)(Z)] with the indicator function
I_t^(1)(z) = 1 if f^(1)(z) ≤ t, and 0 if f^(1)(z) > t

If P_f ≪ 1, it is very unlikely that a sample hits f^(1)(Z) ≤ t

[Figure: density of the outputs f^(1)(z) with realizations; hardly any realizations fall below the threshold t]
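To make "very unlikely" concrete: the plain Monte Carlo estimator of P_f averages the indicator over n samples, and its relative root-mean-square error is sqrt((1 - P_f)/(n P_f)). The short calculation below is a worked illustration, not from the slides:

# Samples needed by plain Monte Carlo for a target relative error:
# relative RMSE = sqrt((1 - P_f) / (n * P_f))  =>  n = (1 - P_f) / (P_f * err**2)
err = 0.1                                    # 10% relative error target
for p_f in (1e-3, 1e-5, 1e-9):
    n_required = (1.0 - p_f) / (p_f * err ** 2)
    print(f"P_f = {p_f:.0e}: about {n_required:.0e} high-fidelity evaluations")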
MFCE: Importance sampling

Consider a biasing density q with supp(p) ⊆ supp(q)

Reformulate the estimation problem
P_f = E_p[I_t^(1)] = E_q[I_t^(1) p/q]

Goal is to construct a suitable q with
Var_q[I_t^(1) p/q] ≪ Var_p[I_t^(1)]

⇒ Use surrogate models

[Figure: nominal density p and biasing density q over the outputs f^(1)(z), with realizations]
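A minimal sketch of the importance sampling estimator E_q[I_t^(1) p/q]; the model, the Gaussian nominal density, and the shifted-Gaussian biasing density are assumptions made purely for illustration:

import numpy as np
from scipy.stats import norm

def importance_sampling_prob(model, t, p_pdf, q_pdf, q_sample, n, rng):
    """Estimate P_f = E_q[ I_t(Z) * p(Z)/q(Z) ] with samples drawn from the biasing density q."""
    zs = q_sample(n, rng)
    indicator = (np.array([model(z) for z in zs]) <= t).astype(float)
    weights = p_pdf(zs) / q_pdf(zs)           # importance weights p(z)/q(z)
    return float(np.mean(indicator * weights))

# Illustrative setup (assumption): f^(1)(z) = z, nominal p = N(0, 1), rare event f^(1)(Z) <= t
model, t = (lambda z: z), -4.0
p_pdf = lambda z: norm.pdf(z)
q_pdf = lambda z: norm.pdf(z, loc=t, scale=1.0)           # biasing density centered at the threshold
q_sample = lambda n, rng: rng.normal(loc=t, scale=1.0, size=n)
estimate = importance_sampling_prob(model, t, p_pdf, q_pdf, q_sample, n=10_000, rng=np.random.default_rng(2))
print(f"{estimate:.2e} (exact: {norm.cdf(t):.2e})")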
MFCE: Literature review

Two-fidelity approaches
◮ Switch between models [Li, Xiu et al., 2010, 2011, 2014]
◮ Reduced basis models with error estimators [Chen and Quarteroni, 2013]
◮ Kriging models and importance sampling [Dubourg et al., 2013]
◮ Subset method with machine-learning-based models [Bourinet et al., 2011], [Papadopoulos et al., 2012]
◮ Surrogates and importance sampling [P., Cui, Marzouk, Willcox, 2016]

Multilevel methods for rare event simulation
◮ Variance reduction via control variates [Elfverson et al., 2014, 2016], [Fagerlund et al., 2016]
◮ Subset method with coarse-grid approximations [Ullmann and Papaioannou, 2015]

Combining multiple general types of surrogates
◮ Importance sampling + control variates [P., Kramer, Willcox, 2017]
MFCE: Direct sampling of surrogate models

Directly sampling surrogate models to construct the biasing density
◮ Reduces the costs per sample
◮ Number of samples to construct the biasing density remains the same
◮ Works well for probabilities > 10^-5
⇒ Insufficient for very rare event probabilities in the range of 10^-9

[Figure: density of realizations of f^(1)(Z) with the threshold t and the mean marked]

[P., Cui, Marzouk, Willcox, 2016], [P., Kramer, Willcox, 2017]
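A sketch of the two-stage idea in [P., Cui, Marzouk, Willcox, 2016]: draw many cheap surrogate samples, fit a Gaussian biasing density to the inputs the surrogate flags as failures, then run importance sampling with the high-fidelity model on far fewer samples. The models, densities, sample sizes, and the spread inflation are assumptions of this sketch.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Illustrative models (assumptions): cheap surrogate f^(2) approximating high-fidelity f^(1)
f_hifi = lambda z: z + 0.01 * np.sin(5.0 * z)
f_surr = lambda z: z
t = -4.0                                                  # failure event: f(z) <= t

# Stage 1: many cheap surrogate evaluations locate the failure region
z_surr = rng.normal(size=1_000_000)
z_fail = z_surr[f_surr(z_surr) <= t]
q_loc = z_fail.mean()                                     # fit a Gaussian biasing density
q_scale = max(1.5 * z_fail.std(), 0.5)                    # inflate the spread (heuristic safeguard)

# Stage 2: few high-fidelity evaluations, reweighted by p/q
n = 2_000
z = rng.normal(loc=q_loc, scale=q_scale, size=n)
w = norm.pdf(z) / norm.pdf(z, loc=q_loc, scale=q_scale)   # importance weights p/q
p_f = np.mean((f_hifi(z) <= t) * w)
print(f"estimated P_f = {p_f:.2e} (approx. reference: {norm.cdf(t):.2e})")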
MFCE: Construct biasing density iteratively

Threshold t controls the "rareness" of the event
I_t^(1)(z) = 1 if f^(1)(z) ≤ t, and 0 if f^(1)(z) > t

Cross-entropy method constructs biasing densities iteratively via intermediate thresholds
t_1 > t_2 > · · · > t

[Figure: density of realizations of f^(1)(Z); the intermediate thresholds t_1, t_2, ... step from the mean toward the target threshold t]

[Rubinstein, 1999], [Rubinstein, 2001]
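A compact sketch of the iterative threshold construction, here in its single-fidelity form with a Gaussian biasing family; the quantile level, sample size, model, and safeguards are standard choices assumed for illustration, not taken from the slides. In the multifidelity variant, these iterations would be driven by the cheap surrogate models, with the high-fidelity model reserved for the final importance sampling estimate.

import numpy as np
from scipy.stats import norm

def cross_entropy_biasing(model, t, n=2_000, rho=0.1, max_iter=20, rng=np.random.default_rng(4)):
    """Build a Gaussian biasing density via intermediate thresholds t_1 > t_2 > ... > t."""
    mu, sigma = 0.0, 1.0                                   # start from the nominal density p = N(0, 1)
    for _ in range(max_iter):
        z = rng.normal(mu, sigma, size=n)
        y = model(z)
        t_k = max(np.quantile(y, rho), t)                  # intermediate threshold, never below the target
        elite = z[y <= t_k]                                # samples that reach the current (easier) event
        w = norm.pdf(elite) / norm.pdf(elite, mu, sigma)   # likelihood ratios p/q_k
        mu = np.sum(w * elite) / np.sum(w)                 # weighted update of the Gaussian parameters
        sigma = max(np.sqrt(np.sum(w * (elite - mu) ** 2) / np.sum(w)), 0.05)  # floor avoids degeneracy
        if t_k <= t:                                       # target threshold reached
            break
    return mu, sigma

# Illustrative use (assumption): f(z) = z, rare event f(Z) <= -4 under Z ~ N(0, 1)
mu, sigma = cross_entropy_biasing(lambda z: z, t=-4.0)
print(f"biasing density N({mu:.2f}, {sigma:.2f}^2)")       # feed into the importance sampling estimator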