

  1. Metropolis-Hastings Algorithms in Function Space for Bayesian Inverse Problems. Björn Sprungk, Oliver Ernst, Hans-Jörg Starkloff, Daniel Rudolf. Advances in Uncertainty Quantification: Methods, Algorithms and Applications (UQAW 2015), SRI Center for UQ in CS&E, KAUST, Saudi Arabia, January 9, 2015. Fakultät für Mathematik.

  2. Outline: 1 Motivation, 2 MCMC in Function Space, 3 Better Proposals, 4 Convergence, 5 Numerical Examples. Oliver Ernst (TU Chemnitz), Metropolis-Hastings Algorithms, KAUST, January 2015.

  4. Motivation: Uncertainty in groundwater flow modelling. Typical in groundwater flow modelling:
• Limited observational data about the aquifer (hydraulic conductivity)
• Different kinds of measurement: direct and indirect data (transmissivity of the aquifer and, e.g., pressure head)
• Quantity of interest (QoI): functional of flux or pressure (e.g., breakthrough time of pollutant/radionuclides)
• UQ approach: model the uncertainty about the conductivity (and therefore the QoI) stochastically
[Figures: WIPP transmissivity data with site boundary and computational domain (UTM Easting/Northing); ECDF of log travel time from 20,000 MC samples for M = 30, 100, 500, 1000]

  5. Motivation: Uncertainty in groundwater flow modelling.
• Groundwater flow model: $-\nabla \cdot \left( e^{\kappa(x)} \nabla p(x) \right) = 0$, $p|_{\partial D} = p_0$.
• Model $\kappa$ as a random function from a Hilbert space $H$, e.g., $H = L^2(D)$: $\kappa(x, \omega) = \sum_{m \geq 1} \xi_m(\omega) \, \phi_m(x)$, where $\{\phi_m\}_{m=1}^{\infty}$ is a CONS of $H$.
• Employ direct data to fit a Gaussian prior model $(\xi_m)_{m \in \mathbb{N}} = \xi \sim \mu_0 = N(0, C_0)$ on $\ell^2(\mathbb{R})$, resp. $\kappa \sim \mu_0 = N(0, C_0)$.
• Incorporate indirect data $y$ by conditioning the prior $\xi \sim \mu_0$ on $y = G(\kappa(\xi)) + \varepsilon$, where $G$ is the forward operator and $\varepsilon \sim N(0, \Sigma)$ is Gaussian noise.
• Bayes' rule yields the conditional/posterior distribution $\mu$ [Stuart, 2010]: $\mu(d\xi) \propto e^{-\Phi(\xi)} \, \mu_0(d\xi)$, with $\Phi(\xi) = \frac{1}{2} \, \| y - G(\kappa(\xi)) \|^2_{\Sigma^{-1}}$.
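The setup above can be sketched numerically with a truncated expansion and a toy forward operator. In the sketch below, the observation points, the expansion basis, the truncation level M, and the linear "forward operator" G (a stand-in for the PDE solve) are all illustrative assumptions, not the WIPP problem from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

M = 50                                     # truncation level of the expansion (assumed)
decay = np.array([m ** -2.0 for m in range(1, M + 1)])
C0 = np.diag(decay)                        # prior covariance: xi ~ N(0, C0)

def kappa(xi, x):
    """Truncated expansion kappa(x) = sum_m xi_m * phi_m(x), with a sine CONS on (0, 1)."""
    phis = np.array([np.sin(m * np.pi * x) for m in range(1, M + 1)])
    return xi @ phis

def G(xi):
    """Toy 'forward operator': observe kappa at a few points (stand-in for the PDE solve)."""
    obs_pts = np.array([0.2, 0.5, 0.8])
    return np.array([kappa(xi, x) for x in obs_pts])

Sigma = 0.01 * np.eye(3)                   # observational noise covariance (assumed)
Sigma_inv = np.linalg.inv(Sigma)

# Synthetic data: draw a "true" coefficient vector from the prior and add noise
xi_true = rng.multivariate_normal(np.zeros(M), C0)
y = G(xi_true) + rng.multivariate_normal(np.zeros(3), Sigma)

def Phi(xi):
    """Negative log-likelihood Phi(xi) = 0.5 * ||y - G(kappa(xi))||^2 in the Sigma^{-1} norm."""
    r = y - G(xi)
    return 0.5 * r @ Sigma_inv @ r
```

The posterior is then $\mu(d\xi) \propto e^{-\Phi(\xi)} \mu_0(d\xi)$, which the MCMC samplers on the following slides target.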

  7. MCMC in Function Space: Sampling the QoI from the posterior. Consider a Markov chain $(\xi_n)_{n \in \mathbb{N}}$ in $\ell^2(\mathbb{R})$ with transition kernel $Q(\eta, A) := P(\xi_{n+1} \in A \mid \xi_n = \eta)$, $A \in \mathcal{B}(\ell^2(\mathbb{R}))$, which is reversible w.r.t. $\mu$: $Q(\xi, d\eta) \, \mu(d\xi) = Q(\eta, d\xi) \, \mu(d\eta) \;\Rightarrow\; \mu = \mu Q$.
Then, under suitable conditions (given later), we have for a QoI $f$
$$\frac{1}{N} \sum_{n=1}^{N} f(\xi_n) \;\xrightarrow{N \to \infty}\; \int f(\xi) \, \mu(d\xi) = \mathbb{E}_\mu[f].$$
The mean squared error is $\propto N^{-1}$ (so the error decays like $N^{-1/2}$), and the constant is the sum of autocovariances
$$\sum_{k=-\infty}^{\infty} \gamma(k), \qquad \gamma(k) = \mathrm{Cov}\big( f(\xi_1), f(\xi_{1+k}) \big), \quad \xi_1 \sim \mu.$$
Rapid decay of the autocovariance function $\gamma$ $\Rightarrow$ high statistical efficiency.
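The ergodic average and the autocovariance diagnostic can be illustrated on any chain of samples. The AR(1) surrogate chain below is a made-up stand-in for an MCMC trace (its autocovariance decays geometrically, $\gamma(k)/\gamma(0) \approx \rho^k$, so it mimics a well-mixing chain):

```python
import numpy as np

def ergodic_average(fvals):
    """Running MCMC estimate (1/N) * sum_{n<=N} f(xi_n) of E_mu[f]."""
    return np.cumsum(fvals) / np.arange(1, len(fvals) + 1)

def autocovariance(fvals, max_lag):
    """Empirical gamma(k) = Cov(f(xi_1), f(xi_{1+k})) for k = 0..max_lag."""
    f = np.asarray(fvals) - np.mean(fvals)
    n = len(f)
    return np.array([f[: n - k] @ f[k:] / n for k in range(max_lag + 1)])

# AR(1) surrogate chain with correlation rho, stationary variance 1
rng = np.random.default_rng(1)
rho = 0.5
x = np.zeros(100_000)
for n in range(1, len(x)):
    x[n] = rho * x[n - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

gamma = autocovariance(x, 5)
# For this chain, gamma[k] / gamma[0] should be close to rho**k
```

The faster `gamma` decays in `k`, the smaller the sum of autocovariances, hence the smaller the constant in front of $N^{-1}$ in the mean squared error.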

  8. MCMC in Function Space: Metropolis-Hastings MCMC. Focus on Metropolis-Hastings (MH) MCMC, where the step $\xi_n \to \xi_{n+1}$ is as follows:
1 Propose a new state $\eta$ according to the proposal kernel $q(\xi_n, d\eta)$, e.g., $\eta \sim q(\xi_n, \cdot) = N(\xi_n, s^2 C_0)$, with stepsize $s \in \mathbb{R}_+$.
2 Accept the proposal $\eta$ with probability $\alpha(\xi_n, \eta)$: draw $a \sim U[0,1]$ and set $\xi_{n+1} = \eta$ if $a \leq \alpha(\xi_n, \eta)$, and $\xi_{n+1} = \xi_n$ otherwise.
The MH chain then has transition kernel
$$Q(\xi, d\eta) = \alpha(\xi, \eta) \, q(\xi, d\eta) + \underbrace{\left( 1 - \int \alpha(\xi, \zeta) \, q(\xi, d\zeta) \right)}_{\text{rejection prob.}} \delta_\xi(d\eta).$$
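The two-step update above can be written directly as code. This is a generic finite-dimensional sketch with the symmetric Gaussian random-walk proposal $N(\xi_n, s^2 C_0)$ from the slide; the two-dimensional standard-normal target and all parameter values are illustrative choices, not the groundwater posterior:

```python
import numpy as np

def mh_step(xi, log_target, s, C0_chol, rng):
    """One Metropolis-Hastings update with the symmetric proposal N(xi, s^2 C0)."""
    d = len(xi)
    eta = xi + s * C0_chol @ rng.standard_normal(d)   # 1. propose
    log_alpha = log_target(eta) - log_target(xi)      # symmetric q => ratio of target densities
    if np.log(rng.uniform()) <= log_alpha:            # 2. accept with probability alpha
        return eta, True
    return xi, False

# Illustrative target: standard Gaussian in d = 2, so C0 = I and log pi(x) = -||x||^2 / 2
rng = np.random.default_rng(2)
d = 2
C0_chol = np.eye(d)                                   # Cholesky factor of C0 (here trivial)
log_target = lambda x: -0.5 * x @ x

xi = np.zeros(d)
samples = []
for _ in range(20_000):
    xi, _ = mh_step(xi, log_target, s=1.0, C0_chol=C0_chol, rng=rng)
    samples.append(xi)
samples = np.array(samples)
# Sample mean should be near 0 and the marginal variances near 1
```

Comparing in log space (`np.log(rng.uniform()) <= log_alpha`) avoids overflow when the density ratio is large; any `log_alpha >= 0` is always accepted, matching $\alpha = \min\{1, \cdot\}$.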

  9. MCMC in Function Space: MH-MCMC in Hilbert Space. Sufficient for reversibility w.r.t. $\mu$ is the choice
$$\alpha(\xi, \eta) = \min\left\{ 1, \frac{d\nu^\top}{d\nu}(\xi, \eta) \right\}, \quad \text{where } \nu(d\xi, d\eta) := q(\xi, d\eta) \, \mu(d\xi), \quad \nu^\top(d\xi, d\eta) := \nu(d\eta, d\xi).$$
In finite dimensions, $\frac{d\nu^\top}{d\nu}$ is simply a ratio of densities (w.r.t. Lebesgue measure). E.g., if $q$ has density $\rho(|\xi - \eta|)$, then $\frac{d\nu^\top}{d\nu}(\xi, \eta) = \frac{\pi(\eta)}{\pi(\xi)}$, where $\mu(d\xi) \propto \pi(\xi) \, d\xi$.
In infinite dimensions, $\mu_0$-reversibility of the proposal $q$ is sufficient in order that $\frac{d\nu^\top}{d\nu}(\xi, \eta)$ exists, and then
$$\frac{d\nu^\top}{d\nu}(\xi, \eta) = \frac{d\mu}{d\mu_0}(\eta) \left( \frac{d\mu}{d\mu_0}(\xi) \right)^{-1} = e^{\Phi(\xi) - \Phi(\eta)}.$$
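A well-known $\mu_0$-reversible proposal from the function-space MCMC literature is the preconditioned Crank-Nicolson (pCN) update $\eta = \sqrt{1 - s^2}\,\xi + s\,w$, $w \sim N(0, C_0)$, for which the acceptance probability reduces to $\min\{1, e^{\Phi(\xi) - \Phi(\eta)}\}$ exactly as above. A sketch, where the potential $\Phi$, the dimension, and the stepsize are illustrative assumptions (with prior $N(0, I)$ and $\Phi(\xi) = \frac{1}{2}\|\xi - 1\|^2$, the posterior is $N(1/2, I/2)$ coordinate-wise, which makes the sampler easy to check):

```python
import numpy as np

def pcn_step(xi, Phi, s, C0_chol, rng):
    """One MH step with the mu_0-reversible pCN proposal; alpha = min{1, exp(Phi(xi) - Phi(eta))}."""
    d = len(xi)
    w = C0_chol @ rng.standard_normal(d)          # prior sample w ~ N(0, C0)
    eta = np.sqrt(1.0 - s**2) * xi + s * w        # pCN proposal, reversible w.r.t. mu_0
    if np.log(rng.uniform()) <= Phi(xi) - Phi(eta):
        return eta
    return xi

# Illustrative example: prior N(0, I), potential Phi(xi) = 0.5 * ||xi - 1||^2,
# hence posterior N(0.5, 0.5 * I) in each coordinate.
rng = np.random.default_rng(3)
d = 3
C0_chol = np.eye(d)
Phi = lambda xi: 0.5 * np.sum((xi - 1.0) ** 2)

xi = np.zeros(d)
chain = []
for _ in range(30_000):
    xi = pcn_step(xi, Phi, s=0.5, C0_chol=C0_chol, rng=rng)
    chain.append(xi)
chain = np.array(chain)
# Posterior mean in each coordinate should be near 0.5
```

Note that only the likelihood potential $\Phi$ enters the acceptance ratio; the prior is handled exactly by the proposal, which is what makes the scheme well defined in infinite dimensions.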
