
Bayesian Fitting: Probabilistic Morphable Models Summer School, June 2017

> Bayesian Fitting. Probabilistic Morphable Models Summer School, June 2017, Basel. Sandro Schönborn, University of Basel.


  1. Bayesian Fitting
     Probabilistic Morphable Models Summer School, June 2017
     Sandro Schönborn, University of Basel

  2. Uncertainty: Probability Distributions
     • Probabilistic models
     • Uncertain observations (noise, outliers, occlusion, …)
     • Fitting: a model explanation of the observed data. If it is probabilistic, it also tells us about the outcome's certainty!
     [Figure: observations, fit & certainty, ground truth; Bishop, PRML, 2006]

  3. Probability: An Example
     • Dentist example: does the patient have a cavity? Certainty:
       P(cavity) = 0.1
       P(cavity | toothache) = 0.8
       P(cavity | toothache, gum problems) = 0.4
     • But the patient either has a cavity or does not. There is no 80% cavity!
     • Having a cavity should not depend on whether the patient has a toothache or gum problems
     • All these statements do not contradict each other; they summarize the dentist's knowledge about the patient
     (AIMA: Russell & Norvig, Artificial Intelligence: A Modern Approach, 3rd edition)
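The numbers on this slide can be reproduced mechanically from a joint distribution. Below is a minimal Python sketch; the joint table over (cavity, toothache) is a hypothetical choice made so that the marginal and the first conditional match the slide's values, not data from the lecture.

```python
import numpy as np

# Hypothetical joint P(cavity, toothache), chosen so that
# P(cavity) = 0.1 and P(cavity | toothache) = 0.8.
# Rows: cavity in {no, yes}; columns: toothache in {no, yes}.
joint = np.array([
    [0.88, 0.02],   # no cavity
    [0.02, 0.08],   # cavity
])

p_cavity = joint[1].sum()                                    # marginal: 0.10
p_cavity_given_toothache = joint[1, 1] / joint[:, 1].sum()   # conditional: 0.80
print(p_cavity, p_cavity_given_toothache)
```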

  4. Uncertainty: Bayesian Probability
     • How are probabilities to be interpreted? They can seem contradictory: why does the distribution change when we have more data? Shouldn't there be a real distribution P(θ)?
     • Bayesian probabilities rely on a subjective perspective: probability is used to express our current knowledge. It can change when we learn or see more: with more data, we are more certain about our result.
     • Subjectivity: there is no single, real underlying distribution. A probability distribution expresses our knowledge; it is different in different situations and for different observers, since they have different knowledge.
     • Not subjective in the sense that it is arbitrary! There are quantitative rules to follow mathematically.
     • Probability expresses an observer's certainty, often called belief.

  5. Towards Bayesian Inference
     • Posterior models: Gaussian process regression
     • Probabilistic fit: probabilistic interpretation of the data
     • Observed points
     • Update of the prior to a posterior model: Bayesian inference → posterior model
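As a preview of this update from prior to posterior model, here is a minimal Gaussian process regression sketch in plain numpy: a zero-mean prior with a squared-exponential kernel is conditioned on a few observed points. The kernel parameters, noise level, and data are assumed toy values, not the lecture's.

```python
import numpy as np

def rbf(xa, xb, s=1.0, ell=0.3):
    # Squared-exponential kernel matrix between two 1-D input sets
    d = xa[:, None] - xb[None, :]
    return s**2 * np.exp(-0.5 * (d / ell)**2)

x_obs = np.array([0.1, 0.4, 0.7])     # observed inputs (toy)
y_obs = np.array([0.5, -0.2, 0.3])    # observed values (toy)
noise = 0.05                          # assumed observation noise

x_test = np.linspace(0.0, 1.0, 50)

# Gaussian conditioning: prior GP + observed points -> posterior GP
K = rbf(x_obs, x_obs) + noise**2 * np.eye(len(x_obs))
K_star = rbf(x_test, x_obs)
mean_post = K_star @ np.linalg.solve(K, y_obs)
cov_post = rbf(x_test, x_test) - K_star @ np.linalg.solve(K, K_star.T)
```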

  6. Belief Updates
     • Model: face distribution (prior belief)
     • Observation: concrete points, possibly uncertain
     • Posterior: face distribution consistent with the observation (posterior belief, more knowledge)
     • Consistency: laws of probability calculus!

  7. Joint Distribution
     • Probabilistic model: joint distribution of points, P(x₁, x₂)
     • Marginal: distribution of certain points only,
       P(x₁) = ∫ P(x₁, x₂) dx₂
     • Conditional: distribution of points conditioned on known values of others,
       P(x₁ | x₂) = P(x₁, x₂) / P(x₂)
     • Both can be easily calculated for Gaussian models
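For Gaussian models both operations are indeed closed-form, which is what makes them so convenient. A small numpy sketch, with an assumed toy joint over (x₁, x₂):

```python
import numpy as np

# Assumed joint Gaussian over (x1, x2)
mu = np.array([1.0, 2.0])
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])

# Marginal of x1: simply drop the other component
mu1, var1 = mu[0], Sigma[0, 0]

# Conditional x1 | x2 = 3.0, via the standard Gaussian conditioning formulas
x2 = 3.0
mu_cond = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (x2 - mu[1])    # 1.3
var_cond = Sigma[0, 0] - Sigma[0, 1] ** 2 / Sigma[1, 1]       # 0.82
```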

  8. Certain Observation
     • Observations are known values
     • Distribution of x₁ after observing x₂, …, xₙ:
       P(x₁ | x₂, …, xₙ) = P(x₁, x₂, …, xₙ) / P(x₂, …, xₙ)
     • Conditional probability

  9. Towards Bayesian Inference
     • Update the belief about x₁ by observing x₂, …, xₙ:
       P(x₁) → P(x₁ | x₂, …, xₙ)
     • Factorize the joint distribution:
       P(x₁, x₂, …, xₙ) = P(x₂, …, xₙ | x₁) P(x₁)
     • Rewrite the conditional distribution:
       P(x₁ | x₂, …, xₙ) = P(x₁, x₂, …, xₙ) / P(x₂, …, xₙ) = P(x₂, …, xₙ | x₁) P(x₁) / P(x₂, …, xₙ)
     • General: query (Q) and evidence (E):
       P(Q | E) = P(Q, E) / P(E) = P(E | Q) P(Q) / P(E)
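The two routes to the posterior, direct conditioning P(Q, E) / P(E) and the factorized form P(E | Q) P(Q) / P(E), must agree. A quick numerical check on a random discrete joint (a toy setup for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
joint = rng.random((3, 4))       # unnormalised table over (Q, E)
joint /= joint.sum()             # now a proper joint P(Q, E)

p_q = joint.sum(axis=1)                   # prior P(Q)
p_e = joint.sum(axis=0)                   # evidence marginal P(E)
p_e_given_q = joint / p_q[:, None]        # likelihood P(E | Q)

post_direct = joint / p_e[None, :]                        # P(Q, E) / P(E)
post_bayes = p_e_given_q * p_q[:, None] / p_e[None, :]    # Bayes rule
assert np.allclose(post_direct, post_bayes)
```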

  10. Uncertain Observation
      • Observations with uncertainty: the model needs to describe how observations are distributed, via the joint distribution P(Q, E)
      • Still a conditional probability, but the joint distribution is more complex
      • Joint distribution factorized: P(Q, E) = P(E | Q) P(Q)
      • Likelihood: P(E | Q)
      • Prior: P(Q)

  11. Likelihood
      • Joint = likelihood × prior: P(Q, E) = P(E | Q) P(Q)
      • Likelihood × prior: the factorization is more flexible than the full joint
      • Prior: distribution of the core model without observation, P(Q)
      • Likelihood: describes how observations are distributed, P(E | Q)
      • Common example: Gaussian distributed points
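For the Gaussian example, likelihood × prior is again Gaussian, so the posterior has a closed form. A minimal 1-D sketch, with assumed prior, noise model, and observation:

```python
# Prior belief about a point position: x ~ N(mu0, s0^2)   (assumed values)
mu0, s0 = 0.0, 1.0
# Likelihood: noisy observation y | x ~ N(x, sn^2)        (assumed noise)
sn, y = 0.3, 0.8

# Conjugate Gaussian update: precisions add, means combine precision-weighted
var_post = 1.0 / (1.0 / s0**2 + 1.0 / sn**2)
mu_post = var_post * (mu0 / s0**2 + y / sn**2)
```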

  12. Bayesian Inference
      • Conditioning / Bayes rule: the method to update beliefs:
        P(Q | E) = P(E | Q) P(Q) / P(E)
        with likelihood P(E | Q), prior P(Q), posterior P(Q | E), and marginal likelihood P(E)
      • Each observation updates our belief (changes knowledge!):
        P(Q) → P(Q | E) → P(Q | E, F) → P(Q | E, F, G) → ⋯
      • Bayesian inference: how beliefs evolve with observations
      • Recursive: the posterior becomes the prior of the next inference step
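The recursion can be made concrete by running the Gaussian update from the previous sketch in a loop: after each observation, the posterior becomes the prior of the next step (toy observation values assumed):

```python
mu, var = 0.0, 1.0            # initial prior belief N(0, 1)
noise_var = 0.3**2            # assumed observation noise variance

for y in [0.9, 1.1, 0.95]:    # assumed sequential observations
    var_new = 1.0 / (1.0 / var + 1.0 / noise_var)
    mu = var_new * (mu / var + y / noise_var)
    var = var_new             # posterior becomes the next prior
    print(f"belief: N({mu:.3f}, {var:.4f})")
```

Note how the variance shrinks with every observation: more data, more certainty.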

  13. Marginalization
      • Models contain irrelevant/hidden variables, e.g. points on the chin when the nose is queried
      • Marginalize over the hidden variables (H):
        P(Q | E) = Σ_H P(Q, H | E) = Σ_H P(E, H | Q) P(Q) / P(E)
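In discrete state spaces, marginalizing the hidden variable is a single sum over the corresponding axis. A sketch on a random toy joint over (Q, H, E):

```python
import numpy as np

rng = np.random.default_rng(1)
joint = rng.random((2, 3, 4))   # table over (Q, H, E), toy values
joint /= joint.sum()            # proper joint P(Q, H, E)

p_e = joint.sum(axis=(0, 1), keepdims=True)   # P(E)
p_qh_given_e = joint / p_e                    # P(Q, H | E)
p_q_given_e = p_qh_given_e.sum(axis=1)        # sum out H: P(Q | E)

assert np.allclose(p_q_given_e.sum(axis=0), 1.0)   # a distribution per E
```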

  14. General Bayesian Inference
      • Observation of additional variables
      • Common case, e.g. face rendering, landmark locations
      • Coupled to the core model via the likelihood factorization
      • General Bayesian inference case: data D (formerly evidence) and parameters θ (formerly query):
        P(θ | D) = P(D | θ) P(θ) / P(D)
        P(θ | D) ∝ P(D | θ) P(θ)
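The proportional form is what one evaluates in practice: compute likelihood × prior over parameter values and normalize at the end, without ever computing P(D) analytically. A sketch with an assumed toy model, where θ is the unknown mean of Gaussian data with known standard deviation:

```python
import numpy as np
from scipy.stats import norm

data = np.array([0.8, 1.2, 1.0])       # assumed observations
sigma = 0.5                            # known noise level (assumed)
thetas = np.linspace(-2.0, 3.0, 501)   # grid over the parameter

prior = norm.pdf(thetas, loc=0.0, scale=1.0)
# Likelihood P(D | theta): product over independent observations
lik = np.prod(norm.pdf(data[:, None], loc=thetas[None, :], scale=sigma), axis=0)

post = prior * lik                             # proportional to P(theta | D)
post /= post.sum() * (thetas[1] - thetas[0])   # normalise numerically
```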

  15. Example: Bayesian Curve Fitting
      • Curve fitting: data interpretation with a model
      • The posterior distribution expresses certainty
        • in parameter space
        • in the predictive distribution
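Bayesian curve fitting of this kind can be written in a few lines for a polynomial basis with a Gaussian prior on the weights and Gaussian noise, following the standard conjugate formulas (cf. Bishop, PRML, ch. 3). The data, degree, and precisions α (prior) and β (noise) below are assumed toy choices:

```python
import numpy as np

x = np.array([0.0, 0.3, 0.5, 0.8, 1.0])
y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(2).normal(size=x.size)

degree, alpha, beta = 3, 2.0, 25.0
Phi = np.vander(x, degree + 1, increasing=True)   # polynomial design matrix

# Posterior over the weights w is Gaussian N(m, S)
S = np.linalg.inv(alpha * np.eye(degree + 1) + beta * Phi.T @ Phi)
m = beta * S @ Phi.T @ y

# Predictive distribution at a new input x*
x_star = 0.6
phi = np.vander([x_star], degree + 1, increasing=True)[0]
pred_mean = phi @ m
pred_var = 1.0 / beta + phi @ S @ phi   # noise + parameter uncertainty
```

The predictive variance splits into irreducible noise 1/β plus the parameter-uncertainty term φᵀSφ, which is exactly the certainty "in the predictive distribution" that the slide refers to.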

  16. Posterior of Regression Parameters
      [Figure: sequential Bayesian learning of the regression parameter w. No data: prior P(w); N=1: likelihood P(D₁ | w), posterior P(w | D₁); N=2: likelihood P(D₂ | w), posterior P(w | D₁, D₂); …; N=19: P(w | D₁, D₂, …). Bishop, PRML, 2006]

  17. More Bayesian Inference Examples
      • Non-linear curve fitting, e.g. Gaussian process regression
      • Classification, e.g. Bayes classifier
      (Bishop, PRML, 2006)
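The classification example is again Bayes rule: class priors times class-conditional likelihoods, normalized. A minimal Bayes classifier sketch with assumed 1-D Gaussian class models:

```python
from scipy.stats import norm

# Assumed class priors and per-class Gaussian likelihoods (mean, std)
priors = {"a": 0.6, "b": 0.4}
params = {"a": (0.0, 1.0), "b": (2.0, 0.5)}

def class_posterior(x):
    # P(class | x) proportional to P(x | class) P(class), normalised over classes
    unnorm = {c: priors[c] * norm.pdf(x, *params[c]) for c in priors}
    z = sum(unnorm.values())
    return {c: p / z for c, p in unnorm.items()}

print(class_posterior(1.2))
```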

  18. Summary: Bayesian Inference
      • Belief: formal expression of an observer's knowledge
      • Subjective state of knowledge about the world
      • Beliefs are expressed as probability distributions
      • Formally not arbitrary: consistency requires the laws of probability
      • Observations change knowledge and thus beliefs
      • Bayesian inference formally updates prior beliefs to posteriors
      • Conditional probability
      • Integration of observations via the likelihood × prior factorization:
        P(θ | D) = P(D | θ) P(θ) / P(D)
