Two Topics in Parametric Integration Applied to Stochastic Simulation in Industrial Engineering

Jeremy Staum
Department of Industrial Engineering and Management Sciences
Northwestern University
September 15th, 2014

Jeremy Staum — Topics in Simulation Metamodeling in Industrial Engineering
Outline

1. Simulation Metamodeling: introduction and overview
2. Multi-Level Monte Carlo Metamodeling, with Imry Rosenbaum
   http://users.iems.northwestern.edu/~staum/MLMCM.pdf
3. Generalized Integrated Brownian Fields for Simulation Metamodeling, with Peter Salemi and Barry L. Nelson
   http://users.iems.northwestern.edu/~staum/GIBF.pdf
MCQMC / IBC Application Domain

Industrial Engineering & Operations Research: using math to analyze systems and improve decisions.
Stochastic simulation: production, logistics, financial, ...

- integration: µ = E[Y] = ∫ Y(ω) dω
- parametric integration: approximate µ defined by µ(x) = E[Y(x)]
- optimization: min { µ(x) : x ∈ X }
What is Stochastic Simulation Metamodeling?

Stochastic simulation model example: a fuel injector production line.
- System performance measure µ(x) = E[Y(x)]
- x: design of the production line
- Y(x): number of fuel injectors produced
- Simulating each scenario (20 replications) takes 8 hours

Stochastic simulation metamodeling:
- Simulation output Ȳ(x_i) at design points x_i, i = 1, ..., n
- Predict µ(x) by µ̂(x), even without simulating at x
- µ̂(x) is usually a weighted average of Ȳ(x_1), ..., Ȳ(x_n)
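As a minimal sketch of "predict µ(x) by a weighted average of simulation output," the snippet below uses hypothetical one-dimensional design points and an inverse-distance weighting rule; both are illustrative assumptions, not the weighting scheme used later in the talk.

```python
import numpy as np

# Hypothetical sketch: predict mu(x) as a weighted average of the simulation
# output Ybar(x_i) observed at a few design points x_i. The design points,
# outputs, and inverse-distance weights are illustrative assumptions.
def predict(x, design_points, ybar):
    d = np.abs(design_points - x)
    if np.any(d == 0):                  # x coincides with a design point
        return float(ybar[np.argmin(d)])
    w = 1.0 / d                         # inverse-distance weights
    w /= w.sum()                        # normalize to sum to 1
    return float(np.dot(w, ybar))

design_points = np.array([0.0, 1.0, 2.0])
ybar = np.array([10.0, 14.0, 20.0])
```

The key point is that a prediction at an unsimulated x (say x = 0.5) comes at essentially zero cost, whereas simulating a new scenario takes hours.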
Overview of Multi-Level Monte Carlo (MLMC)

Error in stochastic simulation metamodeling:
- prediction µ̂(x) = ∑_{i=1}^k w_i(x) Ȳ(x_i)
- variance: Var[µ̂(x)], caused by variance of the simulation output
- interpolation error (bias): E[µ̂(x)] ≠ µ(x)
Main Idea of Multi-Level Monte Carlo

Ordinary Monte Carlo:
- to reduce variance: large number n of replications per simulation run (design point)
- to reduce bias: large number k of design points (fine grid)
- very large computational effort kn

Multi-Level Monte Carlo:
- to reduce variance: coarser grids, many replications each
- to reduce bias: finer grids, few replications each
- less computational effort / better convergence rate
Our Contributions

Theoretical: mix and match or expand (from Heinrich’s papers); derive desired conclusions under desired assumptions to suit IE goals and applications.
Practical: algorithm design (based on Giles).
Experimental: show how much MLMC speeds up realistic examples in IE.
Heinrich (2001): MLMC for Parametric Integration

Assumptions:
- Approximate µ given by µ(x) = ∫_Ω Y(x, ω) dω over x ∈ X.
- X ⊆ R^d and Ω ⊆ R^{d₂}: bounded, open, Lipschitz boundary.
- With respect to x, Y has weak derivatives up to r-th order.
- Y and its weak derivatives are L_q-integrable in (x, ω).
- Sobolev embedding condition: r/d > 1/q.
- Measure error as (∫_Ω ‖µ̂ − µ‖_q^p dω)^{1/p}, where p = min{2, q}.

Conclusion: there is an MLMC method with the optimal rate. MLMC attains the best rate of convergence in C, the number of evaluations of Y. The error bound is proportional to
- C^{−r/d}            if r/d < 1 − 1/p
- C^{1/p−1} log C     if r/d = 1 − 1/p
- C^{1/p−1}           if r/d > 1 − 1/p.
Assumptions

Smoothness: assume r = 1.
- Stock option: Y(x, ω) = max{ xR(ω) − K, 0 }
- Queueing: waiting time W_{n+1} = max{ W_n + B_n − A_{n+1}, 0 }
- Inventory: S_n = min{ I_n + P_n, D_n }, I_{n+1} = I_n − S_n

Parameter domain: assume X ⊂ R^d is compact (not open).
- cf. Heinrich and Sindambiwe (1999), Daun and Heinrich (2014)
- If X were open, we would have to extrapolate.
- No need to approximate unbounded µ near a boundary of X.

Domain of integration: Ω ⊆ R^{d₂} is not important; d₂ does not appear in the theorem.
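The queueing example above can be made concrete with Lindley's recursion. Treating a service-time scale x as the parameter (B_n = x·E_n for fixed exponential draws E_n) makes Y(x, ω) Lipschitz in x for each fixed random-number stream ω; this parametrization is an illustrative assumption, not the exact example from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Lindley's recursion for customer waiting times:
#   W_{n+1} = max(W_n + B_n - A_{n+1}, 0),
# with B_n = x * E_n so that the path is Lipschitz in the parameter x
# when the underlying random numbers (E, A) are held fixed.
def waiting_times(x, E, A):
    W = np.zeros(len(E) + 1)
    for n in range(len(E)):
        W[n + 1] = max(W[n] + x * E[n] - A[n + 1], 0.0)
    return W

E = rng.exponential(1.0, size=50)   # common random numbers (fixed omega)
A = rng.exponential(1.2, size=51)
y = lambda x: waiting_times(x, E, A).mean()   # Y(x, omega) for this omega
```

Because max{·, 0} is nondecreasing and 1-Lipschitz, the path x ↦ y(x) is nondecreasing and continuous even though it is not differentiable everywhere, which is exactly the r = 1 smoothness regime.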
Changing Perspective

Measure of error: use p = q = 2 to get Root Mean Integrated Squared Error
    ( ∫_Ω ∫_X (µ̂(x) − µ(x))² dx dω )^{1/2}

Sobolev embedding criterion with r = 1, q = 2:
r/d > 1/q becomes 1/d > 1/2, i.e. d = 1!??

Why we don’t need the Sobolev embedding condition:
- Assume the domain X is compact.
- Assume Y(·, ω) is (almost surely) Lipschitz continuous.
- Conclude Y(·, ω) is (almost surely) bounded.
Our Assumptions

On the stochastic simulation metamodeling problem:
- X ⊂ R^d is compact
- Y(x) has finite variance for all x ∈ X
- |Y(x′, ω) − Y(x, ω)| ≤ κ(ω) ‖x′ − x‖ a.s., and E[κ²] < ∞

On the approximation method and MLMC design:
- µ̂(x) = ∑_{i=1}^N w_i(x) Ȳ(x_i), where each w_i(x) ≥ 0
- Total weight on points x_i far from x gets close to 0.
- Total weight on points x_i near x gets close to 1.
- Thresholds for “far”/“near” and “close to” are O(N^{−1/(2φ)}) as the number N of points increases.
- Examples: piecewise linear interpolation on a grid; nearest neighbors, Shepard’s method, kernel smoothing
Approximation Method Used in Examples

Kernel smoothing:
    µ̂(x) = ∑_{i=1}^N w_i(x) Ȳ(x_i)
- weight w_i(x) is 0 if x_i is outside the cell containing x
- otherwise, proportional to exp(−‖x − x_i‖)
- weights are normalized to sum to 1
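The kernel-smoothing predictor described above can be sketched in one dimension; the cell width, design points, and outputs below are illustrative choices, not those of the experiments.

```python
import numpy as np

# Kernel smoothing as described on this slide: a design point x_i gets
# weight 0 if it lies outside the cell containing x, weight proportional
# to exp(-|x - x_i|) otherwise, and weights are normalized to sum to 1.
def kernel_smooth(x, design_points, ybar, cell_width):
    cell = np.floor(x / cell_width)                       # cell containing x
    same_cell = np.floor(design_points / cell_width) == cell
    w = np.where(same_cell, np.exp(-np.abs(x - design_points)), 0.0)
    w /= w.sum()                                          # normalize to sum to 1
    return float(np.dot(w, ybar))

design_points = np.array([0.1, 0.2, 0.6, 0.9])
ybar = np.array([1.0, 2.0, 5.0, 8.0])
```

With cell width 0.5 the points {0.1, 0.2} share a cell and {0.6, 0.9} share another, so a prediction at x = 0.15 mixes only the first pair and a prediction at x = 0.7 mixes only the second.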
Our Conclusions

MLMC performance: as the number N of points used in a level increases, errors due to bias and refinement variance are like O(N^{−1/φ}).
Example: nearest-neighbor approximation on a grid, φ = d/2.

Computational complexity (based on Giles 2013): to attain RMISE < ε, the required number of evaluations of Y is O(ε^{−2(1+φ)}) for standard Monte Carlo, and for MLMC it is
- O(ε^{−2φ})                 if φ > 1
- O((ε^{−1} log ε^{−1})²)    if φ = 1
- O(ε^{−2})                  if φ < 1.
Sketch of Algorithm (based on Giles 2008)

Goal: add levels until the target RMISE < ε is achieved.
1. INITIALIZE level ℓ = 0.
2. SIMULATE at level ℓ:
   a. Run the level-ℓ simulation experiment with M₀ replications.
   b. Observe the sample variance of the simulation output.
   c. Choose the number of replications M_ℓ to control variance; run more replications if needed.
3. TEST CONVERGENCE:
   a. Use Monte Carlo to estimate the size of the refinement ∆µ̂_ℓ: ∫_X (∆µ̂_ℓ(x))² dx.
   b. If refinements are too large compared to the target RMISE, increment ℓ and return to step 2.
4. CLEAN UP: finalize the numbers of replications M₀, ..., M_ℓ to control variance; run more replications at each level if needed.
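The level-adding loop above can be sketched for a scalar E[Y] instead of a full metamodel, so the integral of the squared refinement collapses to a squared mean. The discretization Y_ℓ, pilot size M₀, and all tuning constants below are illustrative assumptions, not Giles's exact rules.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical level-ell approximation whose bias halves per level.
def Y_level(ell, u):
    return u ** 2 + 2.0 ** (-ell) * u

def mlmc(eps, M0=1000, max_level=10):
    ests, level = [], 0
    while level <= max_level:
        # Step 2: pilot run, observe sample variance, choose M_ell.
        u = rng.random(M0)
        d = Y_level(level, u) - (Y_level(level - 1, u) if level else 0.0)
        M = max(M0, int(2.0 * d.var(ddof=1) / eps ** 2))
        u = rng.random(M)
        d = Y_level(level, u) - (Y_level(level - 1, u) if level else 0.0)
        ests.append(d.mean())                       # refinement estimate
        # Step 3: stop once refinements are small relative to target RMISE.
        if level > 0 and abs(ests[-1]) < eps / 2:
            break
        level += 1
    return sum(ests)                                # telescoping sum
```

In the metamodeling setting, each d.mean() would instead be a metamodel refinement ∆µ̂_ℓ and the convergence test would estimate ∫_X (∆µ̂_ℓ(x))² dx.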
Asian Option Example, d = 3

MLMC was up to 150 times better than standard Monte Carlo.
Inventory System Example, d = 2

MLMC was 130–8900 times better than standard Monte Carlo.
Conclusion on Multi-Level Monte Carlo

Celebration: Multi-Level Monte Carlo works for typical IE stochastic simulation metamodeling too!

Future research:
- Handle discontinuities in simulation output.
- Combine with good experiment designs; grids are not good in high dimension.
Introduction: Generalized Integrated Brownian Fields

Kriging / interpolating splines:
- Pretend µ is a realization of a Gaussian random field M with mean function m and covariance function σ².
- Kriging predictor: µ̂(x) = m(x) + σ²(x)ᵀ Σ⁻¹ (Ȳ − m) = m(x) + ∑_i β_i σ²(x, x_i)
- σ²(x) is a vector with i-th element σ²(x, x_i)
- Σ is a matrix with (i, j)-th element σ²(x_i, x_j)
- Ȳ − m is a vector with i-th element Ȳ(x_i) − m(x_i)

Stochastic kriging / smoothing splines:
- µ̃(x) = m(x) + σ²(x)ᵀ (Σ + C)⁻¹ (Ȳ − m) = m(x) + ∑_i β_i σ²(x, x_i)
- C = covariance matrix of the noise, estimated from replications
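A minimal one-dimensional sketch of both predictors, assuming a zero mean function m and a squared-exponential covariance σ²(x, x′) = exp(−(x − x′)²); both choices are illustrative stand-ins, not the generalized integrated Brownian field covariance the talk develops.

```python
import numpy as np

# Covariance function sigma2(x, x') = exp(-(x - x')^2), an illustrative choice.
def sigma2(x, xp):
    return np.exp(-np.subtract.outer(x, xp) ** 2)

# Kriging (noise_var = 0) and stochastic kriging (noise_var > 0) with m = 0:
#   prediction = sigma2(x)^T (Sigma + C)^{-1} Ybar.
def krige(x, design_points, ybar, noise_var=0.0):
    Sigma = sigma2(design_points, design_points)     # (i,j) entry sigma2(x_i, x_j)
    C = noise_var * np.eye(len(design_points))       # noise covariance matrix
    beta = np.linalg.solve(Sigma + C, ybar)          # Ybar - m with m = 0
    return sigma2(np.atleast_1d(x), design_points) @ beta

design_points = np.array([0.0, 0.5, 1.0])
ybar = np.array([1.0, 2.0, 1.5])
```

With C = 0 the predictor interpolates the data exactly at the design points; adding noise variance makes it smooth through them instead, which is the kriging-versus-stochastic-kriging distinction above.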