Stochastic Simulation: Introduction
Bo Friis Nielsen
Applied Mathematics and Computer Science, Technical University of Denmark
2800 Kgs. Lyngby, Denmark
Email: bfn@imm.dtu.dk
Practicalities
• Reading material available online, with some suggestions for further reading
• Course evaluation: passed/not passed, based on lab reports, a report on the final project, and possibly an oral presentation of the project
• Teachers:
⋄ Bo Friis Nielsen, e-mail bfni@dtu.dk
⋄ Clara Brimnes Gardner (s153542@student.dtu.dk), Nikolaj Overgaard Sørensen (s190191@student.dtu.dk), Edward Xu (s181238@student.dtu.dk)
02443 – lecture 1
Significance
• One of the most important (perhaps the most important) Operations Research techniques
• Several modern statistical techniques rely on simulation
What is simulation?
• (From the Concise Oxford Dictionary) To simulate: to pretend, to act like, to mimic, to imitate
• Here: computer experiments with a mathematical model
• Stochastic simulation: to (have a computer) simulate a system which is affected by randomness
⋄ Narrow sense: to generate (pseudo)random numbers from a prescribed distribution (e.g. Gaussian)
• A general engineering technique
• An alternative to analytical/numerical solutions
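As a small illustration of the narrow sense above, the following sketch draws pseudorandom numbers from a prescribed Gaussian distribution and checks them against the known mean and standard deviation; the seed, parameters, and sample size are arbitrary choices for the example.

```python
import random
import statistics

# Narrow-sense stochastic simulation: generate (pseudo)random numbers
# from a prescribed distribution, here the standard Gaussian N(0, 1).
random.seed(1)  # fixed seed gives a reproducible pseudorandom sequence
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# The empirical mean and standard deviation should be close to 0 and 1.
print(statistics.mean(samples))
print(statistics.stdev(samples))
```

With 100 000 draws the standard error of the mean is about 1/√100000 ≈ 0.003, so the printed values land very near 0 and 1.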
Why simulate?
• The real system is expensive
• The mathematical model is too complex
• To get an idea of the dynamic behaviour
Related areas
• Statistics
• Computer science
• Operations research
Target group
• A methodology course of general interest
• Of special importance for students specialising in
⋄ Computer science
⋄ Statistics
⋄ Operations Research
⋄ Planning and management
Course goal
• Topics related to scientific computer experimentation
• Specialised techniques
⋄ Variance reduction methods
⋄ Random number generation
⋄ Random variable generation
⋄ The event-by-event principle
• Simulation-based statistical techniques
⋄ Markov chain Monte Carlo
⋄ Bootstrap
• Validation and verification of models
• Model building
Recommended reading
• Sheldon M. Ross: Simulation, fifth edition, Academic Press, 2013. Available online for DTU students
• Søren Asmussen and Peter W. Glynn: Stochastic Simulation: Algorithms and Analysis, Springer, 2007. Available online for DTU students
• C. P. Robert and G. Casella: Introducing Monte Carlo Methods with R, Springer, 2010
• Reuven Y. Rubinstein and Benjamin Melamed: Modern Simulation and Modelling, John Wiley & Sons, 1998. First 50 pages available at DTU Inside; it is illegal to distribute these notes
• Villy Bæk Iversen: Numerisk Simulation (in Danish), DTU, 2007
Supplementary reading
• Averill M. Law: Simulation Modeling and Analysis, McGraw-Hill, 2015
• Jerry Banks, John S. Carson II, Barry L. Nelson, David M. Nicol: Discrete-Event System Simulation, Prentice Hall, 1999
• Brian Ripley: Stochastic Simulation, John Wiley & Sons, 1987
• Jack P. C. Kleijnen: Statistical Tools for Simulation Practitioners, Marcel Dekker, 1987
Knowledge/science in simulation
• Modelling skill
• Statistical methods: it is necessary to understand statistical methodology
• OR: stochastic processes
• Technical skills
⋄ Random number generation
⋄ Sampling from distributions
⋄ Variance reduction techniques
⋄ Statistical techniques: bootstrap/MCMC
• General-purpose and specialised simulation software
Discrete versus continuous
• Discrete event simulation
• as opposed to continuous simulation
• mixed models
Probability basics
• 0 ≤ P(A) ≤ 1, P(∅) = 0, P(Ω) = 1
• A ∩ B = ∅ ⇒ P(A ∪ B) = P(A) + P(B)
• Complement rule: P(Aᶜ) = 1 − P(A)
• Difference rule, for A ⊂ B: P(B ∩ Aᶜ) = P(B) − P(A)
• Inclusion–exclusion for two events: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
• Conditional probability of A given B (partial information): P(A | B) = P(A ∩ B) / P(B)
• Multiplication rule: P(A ∩ B) = P(B) P(A | B)
• Law of total probability (the B_i form a partition): P(A) = Σ_i P(B_i) P(A | B_i)
• Bayes' theorem (the B_i form a partition): P(B_i | A) = P(A | B_i) P(B_i) / Σ_j P(A | B_j) P(B_j)
• Independence: P(A | B) = P(A | Bᶜ), equivalently P(A ∩ B) = P(A) P(B)
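The law of total probability and Bayes' theorem can both be checked by simulation. The sketch below uses a hypothetical two-machine example (the machine probabilities and defect rates are invented for illustration): machine B1 produces 30% of items with defect rate 0.10, machine B2 produces 70% with defect rate 0.02, and A is the event "item defective".

```python
import random

p_B = [0.3, 0.7]            # P(B_i): which machine made the item (a partition)
p_A_given_B = [0.10, 0.02]  # P(A | B_i): defect rate of each machine

random.seed(2)
n, count_A, count_B1_and_A = 200_000, 0, 0
for _ in range(n):
    i = 0 if random.random() < p_B[0] else 1   # pick the machine
    if random.random() < p_A_given_B[i]:       # is the item defective?
        count_A += 1
        count_B1_and_A += (i == 0)

# Law of total probability: P(A) = sum_i P(B_i) P(A|B_i) = 0.3*0.10 + 0.7*0.02 = 0.044
p_A = sum(b * a for b, a in zip(p_B, p_A_given_B))
# Bayes' theorem: P(B1 | A) = P(A|B1) P(B1) / P(A) = 0.030/0.044 ≈ 0.682
p_B1_given_A = p_A_given_B[0] * p_B[0] / p_A

print(count_A / n, p_A)                        # simulated vs exact P(A)
print(count_B1_and_A / count_A, p_B1_given_A)  # simulated vs exact P(B1|A)
```

The conditional frequency (defectives traced back to B1) converges to the Bayes value as n grows.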
Random variables
• A mapping from the sample space to the real line
• Probabilities defined in terms of the preimage
• Most probabilistic calculations are performed with only slight reference to the underlying sample space
Random variables
• Random variables map outcomes to real values
⋄ Distribution: P(X = x), Σ_x P(X = x) = 1
⋄ Joint distribution: P(X = x, Y = y), Σ_{x,y} P(X = x, Y = y) = 1
⋄ Marginal distribution: P_X(X = x) = Σ_y P(X = x, Y = y)
⋄ Conditional distribution: P(Y = y | X = x) = P(X = x, Y = y) / P_X(X = x)
⋄ Independence: P(X = x, Y = y) = P_X(X = x) P_Y(Y = y) for all (x, y)
• Mean value: E(X) = Σ_x x · P(X = x)
• General expectation: E(g(X)) = Σ_x g(x) · P(X = x)
• Linearity: E(aX + bY + c) = a E(X) + b E(Y) + c
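The linearity property above is easy to verify by simulation. A minimal sketch, using two independent fair six-sided dice (so E(X) = E(Y) = 3.5) and arbitrarily chosen constants a, b, c:

```python
import random
import statistics

# Check linearity E(aX + bY + c) = a E(X) + b E(Y) + c by simulation,
# with X, Y independent fair dice and E(X) = E(Y) = 3.5.
random.seed(3)
a, b, c = 2, 3, 1
vals = [a * random.randint(1, 6) + b * random.randint(1, 6) + c
        for _ in range(200_000)]

exact = a * 3.5 + b * 3.5 + c  # = 18.5
print(statistics.mean(vals), exact)
```

Note that linearity holds even when X and Y are dependent; independence is only used here to make the sampling trivial.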
Continuous random variables
• Uniform distribution over a region D: P((x, y) ∈ C) = A(C) / A(D)
• Continuous random variables
⋄ Density: f(x) ≥ 0, P(X ∈ dx) = f(x) dx, ∫ f(x) dx = 1
⋄ Mean, variance (moments): E(X) = ∫ x f(x) dx, E(g(X)) = ∫ g(x) f(x) dx, E(Xᵏ) = ∫ xᵏ f(x) dx
• Normal distribution: f(x) = (1 / (√(2π) σ)) e^{−½((x−µ)/σ)²}, standardisation Z = (X − µ)/σ
• Joint density: f(x, y) dx dy = P(x ≤ X ≤ x + dx, y ≤ Y ≤ y + dy), f(x, y) ≥ 0
• Joint distribution: F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(u, v) du dv
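The standardisation Z = (X − µ)/σ can be illustrated directly: sampling X ~ N(µ, σ²) and transforming gives a variable with mean ≈ 0 and standard deviation ≈ 1, and empirical probabilities match the normal CDF. The values µ = 10, σ = 2 below are arbitrary example parameters.

```python
import random
import statistics

mu, sigma = 10.0, 2.0
random.seed(4)
xs = [random.gauss(mu, sigma) for _ in range(200_000)]
zs = [(x - mu) / sigma for x in xs]  # standardised variable Z = (X - mu)/sigma

print(statistics.mean(zs), statistics.stdev(zs))  # close to 0 and 1

# Empirical P(X <= mu + sigma) versus the standard normal CDF at 1
frac = sum(x <= mu + sigma for x in xs) / len(xs)
print(frac, statistics.NormalDist().cdf(1.0))     # both close to 0.841
```

`statistics.NormalDist` (Python 3.8+) evaluates the normal CDF without external packages.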
Continuous random variables continued
• Conditional continuous distributions: f_Y(y | X = x) = f(x, y) / f_X(x)
• Integral version of the law of total probability: P(A) = ∫ P(A | X = x) f_X(x) dx
• Conditional expectation: E(Y) = E(E(Y | X))
• Covariance/correlation:
Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X) E(Y)
Corr(X, Y) = Cov(X, Y) / (SD(X) SD(Y))
• (X, Y) independent ⇒ Corr(X, Y) = 0
• Variance of a sum of variables: Var(Σ_{k=1}^{n} X_k) = Σ_{k=1}^{n} Var(X_k) + 2 Σ_{1 ≤ j < k ≤ n} Cov(X_j, X_k)
• Bilinearity of covariance: Cov(Σ_{i=1}^{n} a_i X_i, Σ_{j=1}^{m} b_j Y_j) = Σ_{i=1}^{n} Σ_{j=1}^{m} a_i b_j Cov(X_i, Y_j)