

1. CS 4495 Computer Vision: Hidden Markov Models. Aaron Bobick, School of Interactive Computing.

2. Administrivia
• PS4: going OK?
• Please share your experiences on Piazza, e.g. if you discovered something subtle about using vl_sift. If you want to talk about what scales worked and why, that's OK too.

3. Outline
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
• Applying HMMs in vision: Gesture
Slides "borrowed" from UMd and elsewhere; material from slides by Sebastian Thrun and Yair Weiss.

4. Audio Spectrum
[Figure: audio spectrum of the song of the Prothonotary Warbler]

5. Bird Sounds
[Audio samples: Prothonotary Warbler; Chestnut-sided Warbler]

6. Questions One Could Ask
• Time series classification: What bird is this?
• Time series prediction: How will the song continue?
• Outlier detection: Is this bird sick?
• Time series segmentation: What phases does this song have?

7. Other Sound Samples

8. Another Time Series Problem
[Figure: stock price series for Cisco, General Electric, Intel, Microsoft]

9. Questions One Could Ask
• Time series prediction: Will the stock go up or down?
• Time series classification: What type of stock is this (e.g., risky)?
• Outlier detection: Is the behavior abnormal?

10. Music Analysis

11. Questions One Could Ask
• Time series classification: Is this Beethoven or Bach?
• Time series prediction/generation: Can we compose more of that?
• Time series segmentation: Can we segment the piece into themes?

12. For vision: Waving, pointing, controlling?

13. The Real Question
• How do we model these problems?
• How do we formulate these questions as inference/learning problems?

14. Outline For Today
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
• Applying HMMs in vision: Gesture
• Summary

15. Weather: A Markov Model (maybe?)
[Diagram: three states, Sunny, Rainy, Snowy, with transition arrows: from Sunny, stay Sunny 80%, go Rainy 15%, go Snowy 5%; from Rainy, go Sunny 38%, stay Rainy 60%, go Snowy 2%; from Snowy, go Sunny 75%, go Rainy 5%, stay Snowy 20%]
Probability of moving to a given state depends only on the current state: 1st-order Markovian.

16. Ingredients of a Markov Model
• States: $\{S_1, S_2, \ldots, S_N\}$
• State transition probabilities: $a_{ij} = P(q_{t+1} = S_j \mid q_t = S_i)$
• Initial state distribution: $\pi_i = P(q_1 = S_i)$
[Diagram: the Sunny/Rainy/Snowy chain from the previous slide]

17. Ingredients of Our Markov Model
• States: $\{S_{\text{sunny}}, S_{\text{rainy}}, S_{\text{snowy}}\}$
• State transition probabilities: $A = \begin{pmatrix} .8 & .15 & .05 \\ .38 & .6 & .02 \\ .75 & .05 & .2 \end{pmatrix}$
• Initial state distribution: $\pi = (.7 \;\; .25 \;\; .05)$
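As an aside not on the slides, this chain is small enough to play with directly. A minimal numpy sketch using the $A$ and $\pi$ above; the state ordering and function name are my own choices:

```python
import numpy as np

states = ["sunny", "rainy", "snowy"]
A = np.array([[0.80, 0.15, 0.05],   # transitions out of sunny
              [0.38, 0.60, 0.02],   # transitions out of rainy
              [0.75, 0.05, 0.20]])  # transitions out of snowy
pi = np.array([0.70, 0.25, 0.05])   # initial state distribution

rng = np.random.default_rng(0)

def sample_sequence(T):
    """Draw q_1 ~ pi, then q_{t+1} ~ A[q_t, :] for t = 1..T-1."""
    q = [rng.choice(3, p=pi)]
    for _ in range(T - 1):
        q.append(rng.choice(3, p=A[q[-1]]))
    return [states[i] for i in q]

print(sample_sequence(7))  # e.g. ['sunny', 'sunny', 'rainy', ...]
```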

18. Probability of a Time Series
• Given the model, what is the probability of the series (sunny, rainy, rainy, rainy, snowy, snowy)?
$P(S_{\text{sunny}}) \cdot P(S_{\text{rainy}} \mid S_{\text{sunny}}) \cdot P(S_{\text{rainy}} \mid S_{\text{rainy}}) \cdot P(S_{\text{rainy}} \mid S_{\text{rainy}}) \cdot P(S_{\text{snowy}} \mid S_{\text{rainy}}) \cdot P(S_{\text{snowy}} \mid S_{\text{snowy}})$
$= 0.7 \cdot 0.15 \cdot 0.6 \cdot 0.6 \cdot 0.02 \cdot 0.2 = 0.0001512$
with $\pi = (.7 \;\; .25 \;\; .05)$ and $A$ as above.
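The same product can be checked in code. A short sketch continuing the arrays from the snippet above; sequence_prob is a hypothetical helper, not something from the course:

```python
def sequence_prob(seq):
    """P(q_1) * prod over t of P(q_t | q_{t-1}) for a list of state names."""
    idx = [states.index(s) for s in seq]
    p = pi[idx[0]]
    for i, j in zip(idx, idx[1:]):   # consecutive (from, to) state pairs
        p *= A[i, j]
    return p

print(sequence_prob(["sunny", "rainy", "rainy", "rainy", "snowy", "snowy"]))
# 0.7 * 0.15 * 0.6 * 0.6 * 0.02 * 0.2 = 0.0001512
```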

19. Outline For Today
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
• Applying HMMs in vision: Gesture
• Summary

20. Hidden Markov Models
[Diagram: the Sunny/Rainy/Snowy weather chain, with the same transition probabilities as before, is NOT OBSERVABLE; each hidden state instead emits an OBSERVABLE symbol with state-dependent probabilities, e.g. Sunny emits the three symbols with probabilities 60%/30%/10%, Rainy with 5%/30%/65%, Snowy with 0%/50%/50%]

21. Probability of a Time Series
• Given the model, what is the probability of this observation series?
$P(O) = P(O_{\text{coat}}, O_{\text{coat}}, O_{\text{umbrella}}, \ldots, O_{\text{umbrella}})$
$= \sum_{\text{all } Q} P(O \mid Q)\, P(Q) = \sum_{q_1, \ldots, q_7} P(O \mid q_1, \ldots, q_7)\, P(q_1, \ldots, q_7)$
$= (0.3^2 \cdot 0.1^4 \cdot 0.6) \cdot (0.7 \cdot 0.8^6) + \ldots$
with $\pi = (.7 \;\; .25 \;\; .05)$, $A$ as above, and $B = \begin{pmatrix} .6 & .3 & .1 \\ .05 & .3 & .65 \\ 0 & .5 & .5 \end{pmatrix}$
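To make the sum concrete, here is a sketch of one term $P(O \mid Q)\,P(Q)$, continuing the earlier arrays. The observation labels (sunglasses/coat/umbrella, in that column order) and the elided middle of $O$ are assumptions; neither survived in the slide:

```python
obs_symbols = ["sunglasses", "coat", "umbrella"]  # hypothetical column labels
B = np.array([[0.60, 0.30, 0.10],   # emissions from sunny
              [0.05, 0.30, 0.65],   # emissions from rainy
              [0.00, 0.50, 0.50]])  # emissions from snowy

# The slide elides the middle of O; filling it with umbrellas is purely
# for illustration.
O = ["coat", "coat"] + ["umbrella"] * 5

def path_term(Q, O):
    """P(O | Q) * P(Q): the contribution of one state path Q to P(O)."""
    q = [states.index(s) for s in Q]
    o = [obs_symbols.index(s) for s in O]
    p = pi[q[0]] * B[q[0], o[0]]
    for t in range(1, len(q)):
        p *= A[q[t - 1], q[t]] * B[q[t], o[t]]
    return p

print(path_term(["sunny"] * 7, O))  # one of the 3**7 terms in the sum
```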

22. Specification of an HMM
• N: number of states
• Q = {q_1, q_2, …, q_T}: sequence of states
• Some form of output symbols:
 • Discrete: a finite vocabulary of symbols of size M; one symbol is "emitted" each time a state is visited (or a transition is taken).
 • Continuous: an output density in some feature space associated with each state, where an output is emitted with each visit.
• For a given observation sequence O:
 • O = {o_1, o_2, …, o_T}, where o_i is the observed symbol or feature at time i

23. Specification of an HMM
• A: the state transition probability matrix, $a_{ij} = P(q_{t+1} = j \mid q_t = i)$
• B: observation probability distribution
 • Discrete: $b_j(k) = P(o_t = k \mid q_t = j)$, $1 \le k \le M$
 • Continuous: $b_j(x) = p(o_t = x \mid q_t = j)$
• π: the initial state distribution, $\pi(j) = P(q_1 = j)$
• A full HMM over a set of states and an output space is thus specified as a triplet: $\lambda = (A, B, \pi)$
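One plausible way to bundle the triplet in code (a sketch, not any particular library's API), reusing the arrays from the earlier snippets:

```python
from typing import NamedTuple
import numpy as np

class HMM(NamedTuple):
    A: np.ndarray    # N x N transitions, a_ij = P(q_{t+1} = j | q_t = i)
    B: np.ndarray    # N x M discrete emissions, b_j(k) = P(o_t = k | q_t = j)
    pi: np.ndarray   # length-N initial distribution, pi(j) = P(q_1 = j)

weather = HMM(A=A, B=B, pi=pi)  # the arrays defined in the sketches above
```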

24. What does this have to do with Vision?
• Given some sequence of observations, what "model" generated it?
• Using the previous example: given some observation sequence of clothing, is this Philadelphia, Boston, or Newark?
• Notice that for Boston vs. Arizona you would not need the sequence!

25. Outline For Today
• Time Series
• Markov Models
• Hidden Markov Models
• 3 computational problems of HMMs
• Applying HMMs in vision: Gesture
• Summary

26. The 3 great problems in HMM modelling
1. Evaluation: Given the model $\lambda = (A, B, \pi)$, what is the probability of occurrence of a particular observation sequence $O = \{o_1, \ldots, o_T\}$, i.e. $P(O \mid \lambda)$?
 • This is the heart of the classification/recognition problem: I have a trained model for each of a set of classes; which one would most likely generate what I saw?
2. Decoding: Find the optimal state sequence to produce an observation sequence $O = \{o_1, \ldots, o_T\}$.
 • Useful in recognition problems; it helps give meaning to states, which is not exactly legal but often done anyway.
3. Learning: Determine the model $\lambda$, given a training set of observations.
 • Find $\lambda$ such that $P(O \mid \lambda)$ is maximal.

27. Problem 1: Naïve solution
• State sequence $Q = (q_1, \ldots, q_T)$
• Assume independent observations:
$P(O \mid q, \lambda) = \prod_{t=1}^{T} P(o_t \mid q_t, \lambda) = b_{q_1}(o_1)\, b_{q_2}(o_2) \cdots b_{q_T}(o_T)$
NB: Observations are mutually independent given the hidden states. That is, if I know the states, then the previous observations don't help me predict a new observation; the states encode *all* the information. Usually only kind-of true; see CRFs.

28. Problem 1: Naïve solution
• But we know the probability of any given sequence of states:
$P(q \mid \lambda) = \pi_{q_1}\, a_{q_1 q_2}\, a_{q_2 q_3} \cdots a_{q_{T-1} q_T}$

29. Problem 1: Naïve solution
• Given:
$P(O \mid q, \lambda) = \prod_{t=1}^{T} P(o_t \mid q_t, \lambda) = b_{q_1}(o_1)\, b_{q_2}(o_2) \cdots b_{q_T}(o_T)$
$P(q \mid \lambda) = \pi_{q_1}\, a_{q_1 q_2}\, a_{q_2 q_3} \cdots a_{q_{T-1} q_T}$
• We get:
$P(O \mid \lambda) = \sum_{q} P(O \mid q, \lambda)\, P(q \mid \lambda)$
NB: The above sum is over all state paths. There are $N^T$ state paths, each 'costing' $O(T)$ calculations, leading to $O(T N^T)$ time complexity.
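The naïve evaluation translates directly into a brute-force enumeration. A sketch reusing states, O, and path_term from the earlier snippets; it is feasible only because $T = 7$ and $N = 3$ here:

```python
from itertools import product

def naive_evaluate(O):
    """P(O | lambda) = sum over every possible state path Q of P(O|Q) P(Q)."""
    return sum(path_term(list(Q), O) for Q in product(states, repeat=len(O)))

print(naive_evaluate(O))  # 3**7 = 2187 paths: fine here, hopeless for large T
```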
