  1. Approximate Inference: Sampling Methods CMSC 691 UMBC

  2. (Some) Learning Techniques: MAP/MLE (point estimation, basic EM); Variational Inference (functional optimization); Sampling/Monte Carlo (today)

  3. Outline: Monte Carlo methods; Sampling Techniques (Uniform sampling, Importance Sampling, Rejection Sampling, Metropolis-Hastings, Gibbs sampling); Example: Collapsed Gibbs Sampler for Topic Models

  4–9. Two Problems for Sampling Methods to Solve

  Problem 1: generate samples from p, where p(x) = p̃(x)/Z and x ∈ ℝ^D, producing samples x^(1), x^(2), …, x^(R). Running example (ITILA, Fig 29.1): p̃(x) = exp[0.4(x − 0.4)^2 − 0.08x^4].

  Problem 2: estimate the expectation of a function φ under p: Φ = ⟨φ(x)⟩_p = E_{x∼p}[φ(x)] = ∫ p(x) φ(x) dx, using the estimator Φ̂ = (1/R) Σ_r φ(x^(r)).

  Q: Why might sampling from p(x) be hard? A1: Can we evaluate Z? A2: Can we sample without enumerating the space? (Correct samples should fall where p is big.)

  If we could sample from p, Φ̂ would be a consistent estimator, with E[Φ̂] = Φ.
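Not from the slides: a minimal NumPy sketch of the estimator Φ̂ above, assuming we can sample from p directly; the choices p = N(0, 1) and φ(x) = x^2 are placeholders for illustration.

import numpy as np

# Monte Carlo estimator Phi_hat = (1/R) * sum_r phi(x^(r)),
# assuming samples can be drawn from p directly.
# p = N(0, 1) and phi(x) = x**2 are placeholder choices, not from the slides.
rng = np.random.default_rng(0)

def phi(x):
    return x ** 2            # E_{x~N(0,1)}[x^2] = 1, so Phi_hat should be near 1

R = 10_000                   # number of samples
x = rng.standard_normal(R)   # x^(1), ..., x^(R) drawn from p
Phi_hat = phi(x).mean()      # (1/R) * sum_r phi(x^(r))
print(Phi_hat)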

  10. Outline: Monte Carlo methods; Sampling Techniques (Uniform sampling, Importance Sampling, Rejection Sampling, Metropolis-Hastings, Gibbs sampling); Example: Collapsed Gibbs Sampler for Topic Models

  11–15. Goal: Uniform Sampling

  Goal: Φ = ⟨φ(x)⟩_p = E_{x∼p}[φ(x)]. Sample uniformly: x^(1), x^(2), …, x^(R), and estimate Φ̂ = Σ_r φ(x^(r)) p*(x^(r)), where p*(x) = p̃(x)/Z* and Z* = Σ_r p̃(x^(r)).

  This might work if R (the number of samples) sufficiently hits the high-probability regions.

  Ising model example: 2^H states of high probability out of 2^N states total. The chance of a sample landing in the high-probability region is 2^H / 2^N, so the minimum number of samples needed is ~2^(N−H).
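A minimal NumPy sketch of the uniform-sampling estimator above, using the slides' p̃(x) = exp[0.4(x − 0.4)^2 − 0.08x^4]; the sampling interval [−5, 5] and φ(x) = x are assumptions for illustration.

import numpy as np

# Uniform sampling: draw x^(r) uniformly, then weight each sample by
# p*(x^(r)) = p_tilde(x^(r)) / Z*, where Z* = sum_r p_tilde(x^(r)).
# The interval [-5, 5] and phi(x) = x are assumptions, not from the slides.
rng = np.random.default_rng(0)

def p_tilde(x):
    # unnormalized target from the slides (ITILA, Fig 29.1)
    return np.exp(0.4 * (x - 0.4) ** 2 - 0.08 * x ** 4)

def phi(x):
    return x

R = 100_000
x = rng.uniform(-5.0, 5.0, size=R)     # uniform samples x^(1), ..., x^(R)
w = p_tilde(x)                         # p_tilde(x^(r))
Z_star = w.sum()                       # Z* = sum_r p_tilde(x^(r))
Phi_hat = (phi(x) * w / Z_star).sum()  # sum_r phi(x^(r)) p*(x^(r))
print(Phi_hat)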

  16. Outline: Monte Carlo methods; Sampling Techniques (Uniform sampling, Importance Sampling, Rejection Sampling, Metropolis-Hastings, Gibbs sampling); Example: Collapsed Gibbs Sampler for Topic Models

  17–23. Goal: Importance Sampling

  Goal: Φ = ⟨φ(x)⟩_p = E_{x∼p}[φ(x)]. Approximating distribution: Q(x) ∝ q̃(x). Sample from Q: x^(1), x^(2), …, x^(R). (ITILA, Fig 29.5)

  Points x where Q(x) > p(x) are over-represented; points x where Q(x) < p(x) are under-represented. Correct for this with importance weights w(x^(r)) = p̃(x^(r)) / q̃(x^(r)) and estimate Φ̂ = Σ_r φ(x^(r)) w(x^(r)) / Σ_r w(x^(r)).

  Q: How reliable will this estimator be? A: In practice, difficult to say; the weights w(x^(r)) may not be a good indicator.

  Q: How do you choose a good approximating distribution? A: It is task/domain specific.
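A minimal NumPy sketch of the self-normalized importance-sampling estimator above, with the slides' p̃(x); the zero-mean Gaussian proposal with standard deviation 2 and φ(x) = x are assumptions for illustration.

import numpy as np

# Importance sampling: draw x^(r) from a proposal Q, weight by
# w(x^(r)) = p_tilde(x^(r)) / q_tilde(x^(r)), and form the self-normalized
# estimate Phi_hat = sum_r phi(x^(r)) w(x^(r)) / sum_r w(x^(r)).
# The proposal N(0, 2^2) and phi(x) = x are assumptions, not from the slides.
rng = np.random.default_rng(0)

def p_tilde(x):
    # unnormalized target from the slides (ITILA, Fig 29.1)
    return np.exp(0.4 * (x - 0.4) ** 2 - 0.08 * x ** 4)

def q_tilde(x, sigma=2.0):
    # Gaussian proposal density up to a constant; the constant cancels
    # in the self-normalized estimator
    return np.exp(-0.5 * (x / sigma) ** 2)

def phi(x):
    return x

R = 100_000
sigma = 2.0
x = rng.normal(0.0, sigma, size=R)   # x^(1), ..., x^(R) drawn from Q
w = p_tilde(x) / q_tilde(x, sigma)   # importance weights
Phi_hat = (phi(x) * w).sum() / w.sum()
print(Phi_hat)

Because the estimate divides by Σ_r w(x^(r)), both p̃ and q̃ only need to be known up to normalizing constants.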

  24. Importance Sampling: Variance. The estimator may vary. [Figure: importance-sampling estimates over iterations for a Gaussian proposal q(x) and a Cauchy proposal q(x), compared to the true value. ITILA, Fig 29.6]
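To see the run-to-run variation slide 24 refers to, here is a rough sketch (not from the slides) that repeats the importance-sampling estimate with a Gaussian proposal and with a Cauchy proposal, the two proposals named in ITILA Fig 29.6; φ(x) = x, the proposal scales, and the number of repeats are assumptions.

import numpy as np

# Repeat the self-normalized importance-sampling estimate several times with
# two different proposals and report the spread of the resulting estimates.
# phi(x) = x and the proposal scales are assumptions, not from the slides.
rng = np.random.default_rng(0)

def p_tilde(x):
    # unnormalized target from the slides (ITILA, Fig 29.1)
    return np.exp(0.4 * (x - 0.4) ** 2 - 0.08 * x ** 4)

def is_estimate(sample, density):
    x = sample()
    w = p_tilde(x) / density(x)      # importance weights (densities up to constants)
    return (x * w).sum() / w.sum()   # phi(x) = x

R = 2_000
proposals = {
    "Gaussian": (lambda: rng.normal(0.0, 2.0, size=R),
                 lambda x: np.exp(-0.5 * (x / 2.0) ** 2)),
    "Cauchy":   (lambda: rng.standard_cauchy(R),
                 lambda x: 1.0 / (1.0 + x ** 2)),
}

for name, (sample, density) in proposals.items():
    estimates = [is_estimate(sample, density) for _ in range(20)]
    print(name, "mean:", np.mean(estimates), "std:", np.std(estimates))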

  25. Outline: Monte Carlo methods; Sampling Techniques (Uniform sampling, Importance Sampling, Rejection Sampling, Metropolis-Hastings, Gibbs sampling); Example: Collapsed Gibbs Sampler for Topic Models

  26–28. Goal: Rejection Sampling

  Goal: Φ = ⟨φ(x)⟩_p = E_{x∼p}[φ(x)]. Approximating distribution: Q(x) ∝ q̃(x), with a constant c such that c·q̃(x) > p̃(x) for all x. (ITILA, Fig 29.8)

  Sample from Q: x^(1), x^(2), …, x^(R). For each x^(r), select a height uniformly: u^(r) ∼ Unif(0, c·q̃(x^(r))), giving tuples (x^(r), u^(r)).

  If u^(r) ≤ p̃(x^(r)), add x^(r) to the sampled points; otherwise, reject it.
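A minimal NumPy sketch of the rejection sampler above, with the slides' p̃(x); the Gaussian proposal with standard deviation 3 and the constant c = 10 (chosen by hand so that c·q̃(x) > p̃(x) for this particular pair) are assumptions for illustration.

import numpy as np

# Rejection sampling: draw x^(r) from Q, draw u^(r) ~ Unif(0, c * q_tilde(x^(r))),
# and keep x^(r) only when u^(r) <= p_tilde(x^(r)).
# The proposal N(0, 3^2) and c = 10 are assumptions, not from the slides.
rng = np.random.default_rng(0)

def p_tilde(x):
    # unnormalized target from the slides (ITILA, Fig 29.1)
    return np.exp(0.4 * (x - 0.4) ** 2 - 0.08 * x ** 4)

def q_tilde(x, sigma=3.0):
    # Gaussian proposal density up to a constant
    return np.exp(-0.5 * (x / sigma) ** 2)

sigma = 3.0
c = 10.0       # chosen so that c * q_tilde(x) > p_tilde(x) everywhere for this pair
R = 100_000

x = rng.normal(0.0, sigma, size=R)           # x^(r) drawn from Q
u = rng.uniform(0.0, c * q_tilde(x, sigma))  # u^(r) ~ Unif(0, c * q_tilde(x^(r)))
accepted = x[u <= p_tilde(x)]                # accept when u^(r) <= p_tilde(x^(r))

print(len(accepted), "accepted out of", R)
print("sample mean:", accepted.mean())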
