A Multifidelity Cross-Entropy Method for Rare Event Simulation
Benjamin Peherstorfer, Courant Institute of Mathematical Sciences, New York University
Karen Willcox and Boris Kramer, MIT
Problem setup

High-fidelity model with costs w_1 > 0:
    f^(1) : D → Y
Threshold 0 < t ∈ R and random variable Z ~ p with density p.
Estimate the rare event probability
    P_f = P_p[ f^(1)(Z) ≤ t ]
Reformulated to P_f = E_p[ I_t^(1)(Z) ] with the indicator function
    I_t^(1)(z) = 1 if f^(1)(z) ≤ t,  and 0 if f^(1)(z) > t.
The coefficient of variation of the standard Monte Carlo estimator grows as P_f → 0 (it scales like sqrt((1 − P_f)/(m P_f)) for m samples).

[Figure: density of realizations over outputs f^(1)(z), with threshold t in the left tail]
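The cost blow-up can be seen in a minimal sketch, assuming a toy one-dimensional stand-in `f_high(z) = z` for the high-fidelity model and Z ~ N(1, 0.5²); the names are illustrative, not from the original:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the high-fidelity model f^(1); the rare event is f(Z) <= t
def f_high(z):
    return z

t = -0.5          # threshold defining the rare event
m = 100_000       # number of plain Monte Carlo samples

Z = rng.normal(loc=1.0, scale=0.5, size=m)
I = (f_high(Z) <= t).astype(float)   # indicator I_t^(1)(Z)

P_hat = I.mean()                     # plain Monte Carlo estimate of P_f
# coefficient of variation sqrt((1 - P_f)/(m * P_f)) blows up as P_f -> 0
cov = np.sqrt((1.0 - P_hat) / (m * P_hat)) if P_hat > 0 else np.inf
print(P_hat, cov)
```

Here P_f = Φ(−3) ≈ 1.35e-3, so roughly a million model solves are needed for a few percent relative error; for probabilities near 10^-9 plain Monte Carlo is hopeless.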
Surrogate models in uncertainty quantification

Replace f^(1) with a surrogate model
◮ Costs of uncertainty quantification reduced
◮ Often orders-of-magnitude speedups
Estimate depends on surrogate accuracy
◮ Control with error bounds/estimators
◮ Rebuild if accuracy too low
◮ No guarantees without bounds/estimators
Issues
◮ Propagation of surrogate error onto the estimate
◮ Surrogates without error control
◮ Costs of rebuilding a surrogate model

[Figure: uncertainty quantification loop, input z → surrogate model → output y]
Our approach: Multifidelity methods

Combine high-fidelity and surrogate models
◮ Leverage surrogate models for speedup
◮ Recourse to the high-fidelity model for accuracy
Multifidelity speeds up computations
◮ Balance #solves among models
◮ Adapt, fuse, filter with surrogate models
Multifidelity guarantees high-fidelity accuracy
◮ Occasional recourse to the high-fidelity model
◮ High-fidelity model is kept in the loop
◮ Independent of error control for surrogates

[Figure: uncertainty quantification loop, input z → high-fidelity model and surrogate models → output y]

[P., Willcox, Gunzburger, Survey of multifidelity methods in uncertainty propagation, inference, and optimization; SIAM Review, 2017 (to appear)]
MFCE: Importance sampling

Consider a biasing density q with supp(p) ⊆ supp(q). Reformulate the estimation problem as
    P_f = E_p[ I_t^(1) ] = E_q[ I_t^(1) p/q ]
1. First, construct a suitable q with
    Var_q[ I_t^(1) p/q ] ≪ Var_p[ I_t^(1) ]
2. Estimate the probability with Monte Carlo
    P̃_f = (1/m) Σ_{i=1}^m I_t^(1)(Z̃_i) p(Z̃_i)/q(Z̃_i),   Z̃_1, ..., Z̃_m ~ q
⇒ Use surrogates for constructing q

[Figure: nominal density p and biasing density q over outputs f^(1)(z)]
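The importance sampling estimator P̃_f above can be sketched for the same toy setup (p = N(1, 0.5²), event Z ≤ t); the biasing density q = N(t, 0.5²), which simply shifts mass toward the rare event, is an illustrative choice, not a construction from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

def norm_pdf(x, mu, sigma):
    # Gaussian density, written out to keep the sketch dependency-free
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

t = -0.5
m = 10_000

# biasing density q = N(t, 0.5^2): shifts mass toward the rare event
mu_q, sigma = t, 0.5
Z = rng.normal(mu_q, sigma, size=m)

I = (Z <= t).astype(float)                              # indicator I_t^(1)
w = norm_pdf(Z, 1.0, sigma) / norm_pdf(Z, mu_q, sigma)  # weights p/q
P_is = np.mean(I * w)                                   # importance sampling estimate
print(P_is)  # near Phi(-3), i.e. about 1.35e-3
```

About half the biased samples now hit the event, so far fewer samples are needed than with plain Monte Carlo for the same accuracy.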
MFCE: Literature review

Two-fidelity approaches
◮ Switch between models [Li, Xiu et al., 2010, 2011, 2014]
◮ Reduced basis models with error estimators [Chen and Quarteroni, 2013]
◮ Kriging models and importance sampling [Dubourg et al., 2013]
◮ Subset method with machine-learning-based models [Bourinet et al., 2011], [Papadopoulos et al., 2012]
◮ Surrogates and importance sampling [P., Cui, Marzouk, Willcox, 2016]
Multilevel methods for rare event simulation
◮ Variance reduction via control variates [Giles et al., 2015], [Elfverson et al., 2014, 2016], [Fagerlund et al., 2016]
◮ Subset method with coarse-grid approximations [Ullmann and Papaioannou, 2015]
Combining multiple general types of surrogates
◮ Importance sampling + control variates [P., Kramer, Willcox, 2017]

[P., Willcox, Gunzburger, Survey of multifidelity methods in uncertainty propagation, inference, and optimization; SIAM Review, 2017 (to appear)]
MFCE: Direct sampling of surrogate models

Directly sampling surrogate models to construct the biasing density
◮ Reduces costs per sample
◮ Number of samples to construct the biasing density remains the same
◮ Works well for probabilities > 10^-5
⇒ Insufficient for very rare event probabilities in the range of 10^-9

[Figure: density of realizations of f^(1)(Z), with threshold t far left of the mean]

[P., Cui, Marzouk, Willcox, 2016], [P., Kramer, Willcox, 2017]
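The two-stage idea behind direct surrogate sampling can be sketched as follows: many cheap surrogate solves build the biasing density, and a small number of high-fidelity solves give the reweighted, unbiased estimate. The models `f_high` and `f_sur` and the Gaussian fit of q are toy stand-ins under assumed settings, not the construction from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(3)

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def f_high(z):                     # stand-in for the expensive model f^(1)
    return z + 0.05 * np.sin(5.0 * z)

def f_sur(z):                      # cheap surrogate, slightly wrong
    return z

t = -3.0
mu0, sig0 = 0.0, 1.0               # nominal density p = N(0, 1)

# Stage 1: many cheap surrogate evaluations to build the biasing density q
Zs = rng.normal(mu0, sig0, size=2_000_000)
hits = Zs[f_sur(Zs) <= t]
mu_q, sig_q = hits.mean(), hits.std()   # fit a Gaussian q to surrogate "failures"

# Stage 2: few expensive high-fidelity evaluations, reweighted by p/q
m = 2_000
Z = rng.normal(mu_q, sig_q, size=m)
w = norm_pdf(Z, mu0, sig0) / norm_pdf(Z, mu_q, sig_q)
P_hat = np.mean((f_high(Z) <= t) * w)   # unbiased w.r.t. the high-fidelity model
print(P_hat)
```

The catch highlighted on this slide: stage 1 still needs on the order of 1/P_f surrogate samples to see any "failures", which is why direct sampling breaks down around 10^-9.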
MFCE: Construct biasing density iteratively

Threshold t controls the "rareness" of the event:
    I_t^(1)(z) = 1 if f^(1)(z) ≤ t,  and 0 if f^(1)(z) > t.
The cross-entropy method constructs biasing densities iteratively for a decreasing sequence of intermediate thresholds
    t_1 > t_2 > · · · > t

[Figure: density of realizations of f^(1)(Z); intermediate thresholds t_1, t_2, ..., t_7 approach t from the right]

[Rubinstein, 1999], [Rubinstein, 2001]
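A common way to pick the decreasing thresholds, which I assume here for illustration (it is the standard quantile rule of the cross-entropy literature, not stated on this slide), is to take t_i as the ρ-quantile of the model outputs under the current biasing density, floored at the target t:

```python
import numpy as np

rng = np.random.default_rng(4)

def next_threshold(outputs, t, rho=0.1):
    # rho-quantile of the outputs, but never below the target threshold t,
    # so the sequence t_1 > t_2 > ... decreases toward t and then stops
    return max(np.quantile(outputs, rho), t)

y = rng.normal(0.0, 1.0, size=10_000)   # outputs f(Z) under the current density
t1 = next_threshold(y, t=-4.0)
print(t1)  # about the 10%-quantile of N(0, 1), roughly -1.28
```

Each intermediate event {f(Z) ≤ t_i} is, by construction, hit by about a fraction ρ of the current samples, so it is never rare relative to the current density.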
MFCE: Cross-entropy method – in each step

Need to find a biasing density in each step i = 1, ..., T
◮ Optimal biasing density that reduces the variance to 0:
    q*_i(z) ∝ I_{t_i}^(1)(z) p(z)
  ⇒ Unknown normalizing constant (the quantity we want to estimate)
◮ Find q_{v_i} ∈ Q = { q_v : v ∈ P } with minimal Kullback-Leibler distance to q*_i:
    min_{v_i ∈ P} D_KL( q*_i || q_{v_i} )
◮ Reformulate as (independent of the normalizing constant of q*_i)
    max_{v_i ∈ P} E_p[ I_{t_i}^(1) log(q_{v_i}) ]
◮ Solve approximately by replacing E_p with a Monte Carlo estimator
    max_{v_i ∈ P} (1/m) Σ_{j=1}^m I_{t_i}^(1)(Z_j) log(q_{v_i}(Z_j)),   Z_1, ..., Z_m ~ p
⇒ Optimization problems affected by rareness of the event I_{t_i}^(1)(Z)

[Rubinstein, 1999], [Rubinstein, 2001]
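For a Gaussian parametric family Q, the per-step maximization above has a closed-form solution: a likelihood-ratio-weighted mean and standard deviation of the samples hitting the intermediate event. A minimal single-fidelity sketch, assuming a toy model `f(z) = z`, p = N(0, 1), and the quantile rule for the thresholds (the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def f(z):                  # toy stand-in for the model
    return z

t, rho, m = -4.0, 0.1, 5_000
mu0, sig0 = 0.0, 1.0       # nominal density p = N(0, 1)
mu, sig = mu0, sig0        # biasing parameters v_i, start at the nominal

for _ in range(50):
    Z = rng.normal(mu, sig, size=m)
    y = f(Z)
    t_i = max(np.quantile(y, rho), rho and t)            # intermediate threshold
    t_i = max(np.quantile(y, rho), t)                    # t_1 > t_2 > ... >= t
    w = norm_pdf(Z, mu0, sig0) / norm_pdf(Z, mu, sig)    # likelihood ratios p/q
    wl = w * (y <= t_i)                                  # weights I_{t_i} * p/q
    mu = np.sum(wl * Z) / np.sum(wl)                     # closed-form CE update
    sig = np.sqrt(np.sum(wl * (Z - mu) ** 2) / np.sum(wl)) + 1e-12
    if t_i <= t:
        break

# final importance sampling estimate of P[f(Z) <= t] with the fitted density
Z = rng.normal(mu, sig, size=m)
w = norm_pdf(Z, mu0, sig0) / norm_pdf(Z, mu, sig)
P_hat = np.mean((f(Z) <= t) * w)
print(P_hat)  # true value is Phi(-4), about 3.17e-5
```

Because samples are drawn from the current biasing density rather than p, the Monte Carlo estimator of E_p[ I_{t_i}^(1) log q_{v_i} ] carries the weights p/q; this is what keeps each per-step optimization well-conditioned even as the target event becomes very rare.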