

  1. Generation of Non-Uniform Random Numbers
     Refs: Chapter 8 in Law; book by Devroye (watch for typos)
     Peter J. Haas, CS 590M: Simulation, Spring Semester 2020

  2. Generation of Non-Uniform Random Numbers
     - Acceptance-Rejection
     - Convolution Method
     - Composition Method
     - Alias Method
     - Random Permutations and Samples
     - Non-Homogeneous Poisson Processes

  3. Acceptance-Rejection
     Goal: Generate a random variate X having pdf f_X
     - Avoids computation of F^{-1}(u) as in the inversion method
     - Assumes f_X is easy to calculate
     Special case: f_X(x) > 0 only on [a, b] (finite support)
     - Throw down points uniformly in an enclosing rectangle R, and reject points above the f_X curve
     - Return the x-coordinate of an accepted point
     [Figure: pdf f_X(x) inside the enclosing rectangle [a, b] × [0, m]]

  4. Acceptance-Rejection, Continued
     Claim: the x-coordinate of an accepted point has pdf f_X
     Proof:
     1. Let (Z_1, Z_2) be the (x, y)-coordinates of a random point distributed uniformly in R, and fix x ∈ [a, b]
     2. Then P(Z_1 ≤ x, acceptance) = P(Z_1 ≤ x, Z_2 ≤ f_X(Z_1))
     3. But P(Z_1 ≤ x, Z_2 ≤ f_X(Z_1)) = probability that (Z_1, Z_2) falls in the shaded region under the curve:
          P(Z_1 ≤ x, acceptance) = (∫_a^x f_X(u) du) / Area(R)
          P(acceptance) = (∫_a^b f_X(u) du) / Area(R) = 1 / Area(R)
          P(Z_1 ≤ x | acceptance) = ∫_a^x f_X(u) du = F_X(x)

  5. Acceptance-Rejection, Continued
     Acceptance-Rejection Algorithm (Finite Support)
     1. Generate U_1, U_2 ∼ Uniform(0, 1) (U_1 and U_2 are independent)
     2. Set Z_1 = a + (b - a)U_1 and Z_2 = mU_2 (inversion method)
     3. If Z_2 ≤ f_X(Z_1), return X = Z_1; else go to Step 1
     How many (U_1, U_2) pairs must we generate?
     - N (= number of pairs generated) has a geometric dist'n: P(N = k) = p(1 - p)^{k-1}, where p = 1/(area of R)
     - So E[N] = 1/p = area of R = (b - a)m
     - So make m as small as possible
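The finite-support algorithm above can be sketched in Python. The target density f(x) = 2x on [0, 1] with bound m = 2 is an illustrative choice, not one from the slides:

```python
import random

def accept_reject_finite(f, a, b, m, rng=random):
    """Finite-support acceptance-rejection: throw uniform points into the
    enclosing rectangle [a, b] x [0, m] and return the x-coordinate of the
    first point that lands under the curve f."""
    while True:
        z1 = a + (b - a) * rng.random()   # Z_1 = a + (b - a) U_1
        z2 = m * rng.random()             # Z_2 = m U_2
        if z2 <= f(z1):                   # accept if the point is under the pdf
            return z1

# Illustrative density (assumed): f(x) = 2x on [0, 1], bounded by m = 2
rng = random.Random(42)
xs = [accept_reject_finite(lambda x: 2 * x, 0.0, 1.0, 2.0, rng)
      for _ in range(20000)]
```

Here the area of R is (1 - 0) * 2 = 2, so on average two (U_1, U_2) pairs are consumed per returned variate, matching E[N] = (b - a)m.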

  6. Generalized Acceptance-Rejection: Infinite Support
     Find a density g that majorizes f_X:
     - There exists a constant c such that f_X(x)/c ≤ g(x) for all x
     - The smallest such constant is c = sup_x f_X(x)/g(x); choose c as small as possible
     Generalized Acceptance-Rejection Algorithm
     1. Generate Z ∼ g and U ∼ Uniform(0, 1) (Z, U independent)
     2. If U g(Z) ≤ f_X(Z)/c, return X = Z; else go to Step 1
     Expected number of (Z, U) pairs generated: E[N] = c
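A minimal Python sketch of generalized acceptance-rejection. The target (a half-normal density) and the Exp(1) majorizer are assumed examples, not from the slides; the constant c = sup_x f(x)/g(x) is attained at x = 1:

```python
import math
import random

def gen_accept_reject(f, g_pdf, g_sample, c, rng=random):
    """Generalized A-R: requires f(x) <= c * g(x) for all x.
    Accepts a draw Z ~ g with probability f(Z) / (c g(Z))."""
    while True:
        z = g_sample(rng)                  # Z ~ g
        u = rng.random()                   # U ~ Uniform(0, 1)
        if u * g_pdf(z) <= f(z) / c:       # acceptance test from the slide
            return z

# Assumed example: half-normal pdf on [0, inf), majorized by Exp(1)
f = lambda x: math.sqrt(2.0 / math.pi) * math.exp(-x * x / 2.0)
g_pdf = lambda x: math.exp(-x)
g_sample = lambda rng: -math.log(1.0 - rng.random())   # Exp(1) by inversion
c = math.sqrt(2.0 / math.pi) * math.exp(0.5)           # sup_x f(x)/g(x)
```

Since c ≈ 1.32, roughly 1.32 (Z, U) pairs are generated per accepted variate, consistent with E[N] = c.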

  7. Convolution Method
     Goal: Generate X where X = Y_1 + ... + Y_m and the Y_i's are iid
     - f_X = f_{Y_1} * f_{Y_2} * ... * f_{Y_m}
     - where the convolution f * g is defined by (f * g)(x) = ∫_{-∞}^{∞} f(x - y) g(y) dy
     Convolution Algorithm
     - Generate Y_1, ..., Y_m
     - Return X = Y_1 + ... + Y_m
     Example: Binomial Distribution
     - Suppose X ∼ Binom(m, p)
     - Then X = Y_1 + ... + Y_m where the Y_i ∼ Bernoulli(p)
     - Often part of a more complex algorithm (e.g., do something else if m is large)
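The binomial example above is a one-liner in Python: sum m independent Bernoulli(p) draws.

```python
import random

def binomial_convolution(m, p, rng=random):
    """X = Y_1 + ... + Y_m with the Y_i iid Bernoulli(p), so X ~ Binom(m, p)."""
    return sum(1 for _ in range(m) if rng.random() < p)
```

This costs m uniforms per variate, which is why (as the slide notes) a different method is usually substituted when m is large.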

  8. Composition Method
     Suppose that we can write
     - F_X(x) = p_1 F_{Y_1}(x) + p_2 F_{Y_2}(x) + ... + p_m F_{Y_m}(x), or
     - f_X(x) = p_1 f_{Y_1}(x) + p_2 f_{Y_2}(x) + ... + p_m f_{Y_m}(x)
     where Σ_{i=1}^m p_i = 1 and each p_i > 0
     Composition Method
     1. Generate a discrete RV J where P(J = j) = p_j for 1 ≤ j ≤ m
     2. Generate Y_J from F_{Y_J} or f_{Y_J}
     3. Return X = Y_J
     Can lead to very fast generation algorithms
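The three steps above can be sketched generically in Python. The two-component exponential mixture used to exercise it is a hypothetical example, not from the slides:

```python
import math
import random

def composition_sample(probs, samplers, rng=random):
    """Composition method: pick branch J with P(J = j) = probs[j] by
    inversion on the discrete cdf, then draw from component J."""
    u, cum = rng.random(), 0.0
    for p_j, sampler in zip(probs, samplers):
        cum += p_j
        if u <= cum:
            return sampler(rng)
    return samplers[-1](rng)        # guard against floating-point round-off

# Hypothetical mixture: 0.3 * Exp(1) + 0.7 * Exp(5)
def exp_sampler(rate):
    return lambda rng: -math.log(1.0 - rng.random()) / rate

x = composition_sample([0.3, 0.7], [exp_sampler(1.0), exp_sampler(5.0)])
```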

  9. Composition Method: Example
     [Figure: density f_X on [a, c], decomposed into a rising ramp R_1 on [a, b], a falling ramp R_2 on [b, c], and a remainder region R_3]
     - Let p_i = area of R_i for 1 ≤ i ≤ 3
     - Then f_X = p_1 f_{Y_1} + p_2 f_{Y_2} + p_3 f_{Y_3}, where
         f_{Y_1}(x) = 2(x - a)/(b - a)^2 if a ≤ x ≤ b, and 0 otherwise
         f_{Y_2}(x) = 2(c - x)/(c - b)^2 if b ≤ x ≤ c, and 0 otherwise
         f_{Y_3}(x) = (1/p_3)(f_X(x) - p_1 f_{Y_1}(x) - p_2 f_{Y_2}(x))
     - Easy to generate Y_1 and Y_2 (the "usual case"):
       Y_1 = max(U_1, U_2) and Y_2 = min(U_1, U_2) (suitably shifted and scaled), or use inversion
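The max/min trick for the two ramp components can be sketched as follows; the shift-and-scale to the intervals [a, b] and [b, c] is made explicit:

```python
import random

def ramp_up(a, b, rng=random):
    """Y_1 with pdf 2(x - a)/(b - a)^2 on [a, b]:
    max(U_1, U_2) has pdf 2y on [0, 1], then shift and scale."""
    return a + (b - a) * max(rng.random(), rng.random())

def ramp_down(b, c, rng=random):
    """Y_2 with pdf 2(c - x)/(c - b)^2 on [b, c]:
    min(U_1, U_2) has pdf 2(1 - y) on [0, 1], then shift and scale."""
    return b + (c - b) * min(rng.random(), rng.random())
```

On [0, 1] these have means 2/3 and 1/3 respectively, a quick sanity check on the two pdfs.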

  10. Alias Method for Discrete Random Variables
      Goal: Generate X with P(X = x_i) = p_i for 1 ≤ i ≤ n
      Easy case: p_1 = p_2 = ... = p_n = 1/n
      [Figure: n = 4 equal bars of height 1/4]
      Algorithm
      1. Generate U ∼ Uniform(0, 1)
      2. Return x_J, where J = ⌈nU⌉
      (⌈x⌉ = smallest integer ≥ x, the ceiling function)

  11. Alias Method: General Case
      Example (n = 3), with p = (1/6, 1/2, 1/3):
        i   x_i   p_i   r_i   a_i
        1   x_1   1/6   0.5   x_2
        2   x_2   1/2   1.0   (unused)
        3   x_3   1/3   1.0   (unused)
      [Figure: three columns of height 1/3 each; column 1 holds x_1 (mass 1/6) plus alias mass 1/6 for x_2]
      Alias Algorithm
      1. Generate U_1, U_2 ∼ Uniform(0, 1)
      2. Set I = ⌈nU_1⌉
      3. If U_2 ≤ r_I, return X = x_I; else return X = a_I
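Tables like the one above can be built mechanically. The O(n) construction below is Vose's variant of the alias setup, which the slides do not cover; the resulting (r_i, a_i) assignments may differ from the slide's table, but they reproduce the same p_i exactly:

```python
import random

def build_alias_table(p):
    """Vose-style alias setup: returns (r, alias) such that column i keeps
    x_i with probability r[i] and otherwise yields x_{alias[i]}."""
    n = len(p)
    scaled = [pi * n for pi in p]        # rescale so each column's capacity is 1
    r, alias = [0.0] * n, list(range(n))
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        r[s], alias[s] = scaled[s], l    # column s keeps x_s w.p. scaled[s]
        scaled[l] -= 1.0 - scaled[s]     # donor l gives up the remainder
        (small if scaled[l] < 1.0 else large).append(l)
    for i in small + large:              # leftover columns are exactly full
        r[i] = 1.0
    return r, alias

def alias_sample(r, alias, rng=random):
    """Two uniforms per draw, regardless of n (0-based indices)."""
    n = len(r)
    i = int(n * rng.random())            # column I = ceil(n U_1), 0-based
    return i if rng.random() <= r[i] else alias[i]
```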

  12. Alias Method: Another Example
      Here p = (3/7, 3/7, 1/7), so each column has capacity 1/n = 7/21:
        i   x_i   p_i   r_i   a_i
        1   x_1   3/7   1.0   (unused)
        2   x_2   3/7   5/7   x_1
        3   x_3   1/7   3/7   x_2
      [Figure: bar chart of the p_i (9/21, 9/21, 3/21) next to the three filled columns of height 7/21]

  13. Generation of Non-Uniform Random Numbers
      - Acceptance-Rejection
      - Convolution Method
      - Composition Method
      - Alias Method
      - Random Permutations and Samples
      - Non-Homogeneous Poisson Processes

  14. Random Permutations: Fisher-Yates Shuffle
      Goal: Create a random permutation of an array x of length n
      Fisher-Yates Algorithm
      1. Set i ← n
      2. Generate a random integer N between 1 and i (e.g., as ⌈iU⌉)
      3. Swap x[N] and x[i]
      4. Set i ← i - 1
      5. If i = 1 then exit; else go to Step 2
      Example trace (n = 4): a b c d → a d c b → a d c b → d a c b
      Q: What about this algorithm: For i = 1 to n, swap x[i] with a random entry?
      Other random objects: graphs, matrices, random vectors, ...
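The algorithm above translates directly to Python (with 0-based indices):

```python
import random

def fisher_yates(x, rng=random):
    """In-place random permutation, mirroring the slide's algorithm:
    for i = n-1 down to 1, swap x[i] with x[N] for uniform N in {0, ..., i}."""
    for i in range(len(x) - 1, 0, -1):
        n_idx = rng.randrange(i + 1)     # random position in 0..i inclusive
        x[n_idx], x[i] = x[i], x[n_idx]
    return x
```

Each of the n! permutations is produced with probability 1/n!, which is exactly what the naive "swap x[i] with any random entry" loop in the question fails to achieve (it makes n^n equally likely swap sequences, which cannot split evenly over n! outcomes).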

  15. Sequential Bernoulli Sampling
      - Stream of items x_1, x_2, ...
      - Insert each item into the sample with probability p
      - Expected sample size after the nth item = np
      - Fast implementation: generate the skips directly (they are geometrically distributed)
      Bernoulli Sampling:
        Generate U ∼ Uniform(0, 1)
        Set Δ ← 1 + ⌊ln U / ln(1 - p)⌋   [Geometric on {1, 2, ...}, HW #4]
        Set m ← Δ
        Upon arrival of x_i:
        - if i = m, then
          - Include x_i in the sample
          - Generate U ∼ Uniform(0, 1)
          - Set Δ ← 1 + ⌊ln U / ln(1 - p)⌋
          - Set m ← m + Δ
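A sketch of the skip-based implementation as a Python generator. It uses ln(1 - U) rather than the slide's ln U purely to avoid log(0), since 1 - U is also Uniform(0, 1):

```python
import math
import random

def bernoulli_sample(stream, p, rng=random):
    """Yield each item with probability p, but jump straight to the next
    inclusion via a Geometric{1, 2, ...} skip instead of one coin flip per item."""
    def skip():
        return 1 + int(math.log(1.0 - rng.random()) / math.log(1.0 - p))
    m = skip()                           # index of the first included item
    for i, x in enumerate(stream, start=1):
        if i == m:
            yield x
            m += skip()                  # schedule the next inclusion
```

The per-item work drops from one uniform per item to one uniform per *included* item, a big win when p is small.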

  16. Reservoir Sampling
      - Stream of items x_1, x_2, ...
      - Maintain a uniform random sample of size N without replacement
      Reservoir Sampling: Upon arrival of x_i:
      - if i ≤ N, then include x_i in the sample
      - if i > N, then
        - Generate U ∼ Uniform(0, 1)
        - If U ≤ N/i, then include x_i in the sample, replacing a randomly chosen victim
      - Can generate the skips directly using acceptance-rejection
        [J. S. Vitter, ACM Trans. Math. Softw., 11(1): 37-57, 1985]
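The basic (non-skip) version of the algorithm above can be sketched as:

```python
import random

def reservoir_sample(stream, N, rng=random):
    """Maintain a uniform size-N sample without replacement from a stream."""
    sample = []
    for i, x in enumerate(stream, start=1):
        if i <= N:
            sample.append(x)                 # first N items fill the reservoir
        elif rng.random() <= N / i:          # include x_i with probability N/i
            sample[rng.randrange(N)] = x     # evict a uniformly chosen victim
    return sample
```

After item i, every item seen so far is in the reservoir with probability N/i, which is the invariant the simple example on the next slide verifies for N = 1.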

  17. Reservoir Sampling: Simple Example
      - Sample size = 1
      - S_j = sample state after processing the jth item (called i_j)
      - Accept item i_1 into S_1 with probability 1:
          P(i_1 ∈ S_1) = 1
      - Accept item i_2 into S_2 with probability 1/2:
          P(i_1 ∈ S_2) = P(i_1 ∈ S_1) × P(i_2 ∉ S_2) = (1)(1/2) = 1/2
          P(i_2 ∈ S_2) = 1/2
      - Accept item i_3 into S_3 with probability 1/3:
          P(i_1 ∈ S_3) = P(i_1 ∈ S_2) × P(i_3 ∉ S_3) = (1/2)(2/3) = 1/3
          P(i_2 ∈ S_3) = P(i_2 ∈ S_2) × P(i_3 ∉ S_3) = (1/2)(2/3) = 1/3
          P(i_3 ∈ S_3) = 1/3

  18. Generation of Non-Uniform Random Numbers
      - Acceptance-Rejection
      - Convolution Method
      - Composition Method
      - Alias Method
      - Random Permutations and Samples
      - Non-Homogeneous Poisson Processes

  19. Non-Homogeneous Poisson Processes: Thinning
      Ordinary (Homogeneous) Poisson Process
      - Times between successive events are i.i.d. Exp(λ)
      - Event counts for disjoint time intervals are independent (Markov property) and have a Poisson dist'n
      - P(event in (t, t + Δt]) ≈ λΔt for Δt very small
      - P(T_n > y + z | T_{n-1} = y) = e^{-λz}
      - P(n events in [t, t + Δt]) = (λΔt)^n e^{-λΔt} / n!
      Non-Homogeneous Poisson Process
      - Event counts for disjoint time intervals are independent
      - P(event in (t, t + Δt]) ≈ λ(t)Δt for Δt very small
        [λ(t) is sometimes called a hazard function]
      - P(T_n > y + z | T_{n-1} = y) = exp(-∫_y^{y+z} λ(u) du)
      - Can capture, e.g., time-of-day effects

  20. Thinning, Continued
      Intuition: P(accepted event in (t, t + Δt]) ≈ (λ_max Δt) × λ(t)/λ_max = λ(t)Δt
      Suppose that λ(t) ≤ λ_max for t ∈ [0, τ]
      - Idea: Generate "too many" events according to a Poisson(λ_max) process, then reject some of the events
      Thinning Algorithm:
      1. Set T_0 = 0, V = 0, and n = 0
      2. Set n ← n + 1   [Generate T_n]
      3. Generate E ∼ Exp(λ_max) and U ∼ Uniform(0, 1)
      4. Set V ← V + E   [Proposed event time]
      5. If U ≤ λ(V)/λ_max then set T_n = V; else go to Step 3
      6. If T_n < τ then go to Step 2; else exit
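A Python sketch of the thinning algorithm above. The rate function λ(t) = 1 + sin(t) with λ_max = 2 is an assumed example; as a minor variant of the slide's loop, proposals past τ are simply discarded rather than returned:

```python
import math
import random

def nhpp_thinning(lam, lam_max, tau, rng=random):
    """Event times on [0, tau] of a non-homogeneous Poisson process with
    rate lam(t) <= lam_max, by thinning a Poisson(lam_max) process."""
    times, v = [], 0.0
    while True:
        # Exp(lam_max) inter-event time by inversion (1 - U avoids log(0))
        v += -math.log(1.0 - rng.random()) / lam_max
        if v >= tau:
            return times                       # proposal fell past the horizon
        if rng.random() <= lam(v) / lam_max:   # keep proposal w.p. lam(v)/lam_max
            times.append(v)

# Assumed rate: lam(t) = 1 + sin(t), bounded by lam_max = 2, on [0, 10]
events = nhpp_thinning(lambda t: 1.0 + math.sin(t), 2.0, 10.0, random.Random(1))
```

The expected number of events is ∫_0^τ λ(u) du; for this rate on [0, 10] that is 10 + (1 - cos 10) ≈ 11.8.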

  21. Thinning, Continued
      Ross's Improvement
      - Use a piecewise-constant upper-bounding function to reduce rejections
      - Requires a correction for event times that span segments
      Many other approaches
      - Inversion (see HW #4)
      - Idiosyncratic methods: exploit special properties of the Poisson process
