Probability and Time

Probability and Time: Hidden Markov Models (HMMs) - PowerPoint PPT Presentation



  1. Probability and Time: Hidden Markov Models (HMMs) Computer Science CPSC 322, Lecture 32 (Textbook Chpt 6.5.2) April 7, 2010. CPSC 322, Lecture 32, Slide 1

  2. Lecture Overview • Recap • Markov Models • Markov Chain • Hidden Markov Models

  3. Answering Queries under Uncertainty. Probability Theory underlies: Static Belief Networks & Variable Elimination; Dynamic Bayesian Networks; Hidden Markov Models; Markov Chains. Applications: student tracing in tutoring systems, monitoring (e.g. credit cards), robotics, bioinformatics, natural language processing, diagnostic systems (e.g., medicine), email spam filters

  4. Stationary Markov Chain (SMC). A stationary Markov Chain: for all t > 0 • P(S_{t+1} | S_0, …, S_t) = P(S_{t+1} | S_t) (Markov assumption) and • P(S_{t+1} | S_t) is the same for every t (stationarity). We only need to specify P(S_0) and P(S_{t+1} | S_t) • Simple model, easy to specify • Often the natural model • The network can extend indefinitely • Variations of SMC are at the core of most Natural Language Processing (NLP) applications!
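A stationary chain is fully determined by P(S_0) and one shared transition distribution. A minimal sketch, with made-up two-state numbers (not from the slides), showing that sampling any length of trajectory needs only those two ingredients:

```python
import random

# Hypothetical two-state stationary Markov chain: the same
# transition distribution P(S_{t+1} | S_t) is reused at every step.
P0 = {"rain": 0.5, "sun": 0.5}                      # P(S_0)
T = {"rain": {"rain": 0.7, "sun": 0.3},             # P(S_{t+1} | S_t)
     "sun":  {"rain": 0.2, "sun": 0.8}}

def sample(dist):
    """Draw one outcome from a {value: probability} dict."""
    r, acc = random.random(), 0.0
    for value, p in dist.items():
        acc += p
        if r < acc:
            return value
    return value  # guard against floating-point rounding

def sample_chain(steps):
    """Sample S_0, ..., S_steps using only P(S_0) and P(S_{t+1} | S_t)."""
    states = [sample(P0)]
    for _ in range(steps):
        states.append(sample(T[states[-1]]))
    return states

print(sample_chain(5))
```

Because the chain is stationary, the network can extend indefinitely without any new parameters: `sample_chain(5)` and `sample_chain(5000)` use exactly the same two tables.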

  5. Lecture Overview • Recap • Markov Models • Markov Chain • Hidden Markov Models

  6. How can we minimally extend Markov Chains? • Maintaining the Markov and stationary assumptions? A useful situation to model is one in which: • the reasoning system does not have access to the states • but can make observations that give some information about the current state

  7. Hidden Markov Model • A Hidden Markov Model (HMM) starts with a Markov chain, and adds a noisy observation about the state at each time step: • |domain(S)| = k • |domain(O)| = h • P(S_0) specifies initial conditions • P(S_{t+1} | S_t) specifies the dynamics • P(O_t | S_t) specifies the sensor model
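The three distributions on this slide can be written down as arrays whose shapes come directly from k and h. A sketch with illustrative numbers (k = 3, h = 2; the probabilities are invented, not from the slides):

```python
import numpy as np

# A toy HMM with k = 3 hidden-state values and h = 2 observation values.
k, h = 3, 2
P_S0 = np.array([1/3, 1/3, 1/3])          # P(S_0): initial conditions, shape (k,)
P_trans = np.array([[0.8, 0.1, 0.1],      # P(S_{t+1} | S_t): dynamics,
                    [0.1, 0.8, 0.1],      # row i = distribution over S_{t+1}
                    [0.1, 0.1, 0.8]])     # given S_t = i
P_obs = np.array([[0.9, 0.1],             # P(O_t | S_t): sensor model,
                  [0.5, 0.5],             # row i = distribution over O_t
                  [0.1, 0.9]])            # given S_t = i

# Each conditional distribution must normalize over its child variable.
assert np.allclose(P_S0.sum(), 1.0)
assert np.allclose(P_trans.sum(axis=1), 1.0)
assert np.allclose(P_obs.sum(axis=1), 1.0)
```

The stationarity assumption is what lets one (k, k) matrix and one (k, h) matrix cover every time step.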

  8. Example: Localization for a "Pushed around" Robot • Localization (where am I?) is a fundamental problem in robotics • Suppose a robot is in a circular corridor with 16 locations • There are four doors at positions: 2, 4, 7, 11 • The robot initially doesn't know where it is • The robot is pushed around. After a push it can stay in the same location, move left, or move right. • The robot has a noisy sensor telling it whether it is in front of a door

  9. This scenario can be represented as… • Example Stochastic Dynamics: when pushed, it stays in the same location with p = 0.2, and moves left or right with equal probability (0.4 each): P(Loc_{t+1} | Loc_t), plus the initial distribution P(Loc_1)

  10. This scenario can be represented as… Example of a noisy sensor, P(O_t | Loc_t), telling whether the robot is in front of a door. • If it is in front of a door, P(O_t = T) = 0.8 • If not in front of a door, P(O_t = T) = 0.1
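The corridor HMM from these two slides can be encoded directly: a 16x16 dynamics matrix and a length-16 sensor vector. The 0.4/0.4 left-right split is the one implied by "stays with p = 0.2, moves left or right with equal probability":

```python
import numpy as np

# The pushed-around robot: 16 corridor locations, doors at 2, 4, 7, 11.
N = 16
DOORS = {2, 4, 7, 11}

# Dynamics P(Loc_{t+1} | Loc_t); location arithmetic is modulo 16
# because the corridor is circular.
trans = np.zeros((N, N))
for loc in range(N):
    trans[loc, loc] = 0.2               # stay put
    trans[loc, (loc - 1) % N] = 0.4     # pushed left
    trans[loc, (loc + 1) % N] = 0.4     # pushed right

# Sensor model: P(O_t = "door" | Loc_t).
p_door = np.array([0.8 if loc in DOORS else 0.1 for loc in range(N)])

assert np.allclose(trans.sum(axis=1), 1.0)
```

Note the sensor is noisy in both directions: a false-negative rate of 0.2 at doors and a false-positive rate of 0.1 elsewhere.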

  11. Useful inference in HMMs • Localization: the robot starts at an unknown location and is pushed around t times. It wants to determine where it is. • In general: compute the posterior distribution over the current state given all evidence to date, P(S_t | O_0 … O_t)
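This posterior can be maintained incrementally with a predict step (push the belief through the dynamics) and a correct step (reweight by the sensor likelihood, then renormalize). A self-contained filtering sketch for the corridor robot, with a made-up observation sequence:

```python
import numpy as np

# Corridor model (as on the earlier slides): 16 locations, doors at 2, 4, 7, 11.
N = 16
DOORS = {2, 4, 7, 11}

trans = np.zeros((N, N))                 # P(Loc_{t+1} | Loc_t), modulo 16
for loc in range(N):
    trans[loc, loc] = 0.2
    trans[loc, (loc - 1) % N] = 0.4
    trans[loc, (loc + 1) % N] = 0.4

p_door = np.array([0.8 if loc in DOORS else 0.1 for loc in range(N)])

def filter_step(belief, saw_door):
    """One predict + correct update: belief over Loc_t -> belief over Loc_{t+1}."""
    predicted = belief @ trans                        # sum out Loc_t
    likelihood = p_door if saw_door else 1.0 - p_door
    updated = predicted * likelihood                  # weight by the observation
    return updated / updated.sum()                    # renormalize

belief = np.full(N, 1.0 / N)             # initially: no idea where we are
for obs in [True, True, False]:          # hypothetical sensor readings
    belief = filter_step(belief, obs)
print(belief.argmax())                   # currently most probable location
```

This is exactly P(S_t | O_0 … O_t) computed left to right, which is why it exploits the forward direction of time rather than running generic Bnet inference from scratch at every step.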

  12. Example: Robot Localization • Suppose a robot wants to determine its location based on its actions and its sensor readings • Three actions: goRight, goLeft, Stay • This can be represented by an augmented HMM

  13. Robot Localization Sensor and Dynamics Model • Sample Sensor Model (assume the same as for the pushed-around robot) • Sample Stochastic Dynamics: P(Loc_{t+1} | Action_t, Loc_t): P(Loc_{t+1} = L | Action_t = goRight, Loc_t = L) = 0.1; P(Loc_{t+1} = L+1 | Action_t = goRight, Loc_t = L) = 0.8; P(Loc_{t+1} = L+2 | Action_t = goRight, Loc_t = L) = 0.074; P(Loc_{t+1} = L' | Action_t = goRight, Loc_t = L) = 0.002 for all other locations L' • All location arithmetic is modulo 16 • The action goLeft works the same but to the left

  14. Dynamics Model: More Details • Sample Stochastic Dynamics: P(Loc_{t+1} | Action_t, Loc_t): P(Loc_{t+1} = L | Action_t = goRight, Loc_t = L) = 0.1; P(Loc_{t+1} = L+1 | Action_t = goRight, Loc_t = L) = 0.8; P(Loc_{t+1} = L+2 | Action_t = goRight, Loc_t = L) = 0.074; P(Loc_{t+1} = L' | Action_t = goRight, Loc_t = L) = 0.002 for all other locations L'
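In the augmented HMM there is one transition matrix per action. A sketch building the goRight matrix from the numbers on this slide, and goLeft as its stated mirror image; note the rows sum to 1 because 0.1 + 0.8 + 0.074 + 13 x 0.002 = 1.0:

```python
import numpy as np

# goRight dynamics as a 16x16 matrix: stay 0.1, +1 step 0.8,
# +2 steps 0.074, every other location 0.002 (all modulo 16).
N = 16

go_right = np.full((N, N), 0.002)
for loc in range(N):
    go_right[loc, loc] = 0.1
    go_right[loc, (loc + 1) % N] = 0.8
    go_right[loc, (loc + 2) % N] = 0.074

# goLeft "works the same but to the left": same numbers, mirrored offsets.
go_left = np.full((N, N), 0.002)
for loc in range(N):
    go_left[loc, loc] = 0.1
    go_left[loc, (loc - 1) % N] = 0.8
    go_left[loc, (loc - 2) % N] = 0.074

# Each row is a proper distribution over the next location.
assert np.allclose(go_right.sum(axis=1), 1.0)
assert np.allclose(go_left.sum(axis=1), 1.0)
```

Filtering with actions then just selects the matrix for the action taken at step t before the predict step.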

  15. Robot Localization: Additional Sensor • Additional light sensor: there is light coming through an opening at location 10, P(L_t | Loc_t) • Info from the two sensors is combined: "Sensor Fusion"

  16. The robot starts at an unknown location and must determine where it is. The model appears to be too ambiguous: • sensors are too noisy • dynamics are too stochastic to infer anything. But inference actually works pretty well. Let's check: http://www.cs.ubc.ca/spider/poole/demos/localization/localization.html You can use standard Bnet inference; however, you typically take advantage of the fact that time moves forward (not covered in 322)

  17. Sample scenarios to explore in the demo • Keep making observations without moving. What happens? • Then keep moving without making observations. What happens? • Assume you are at a certain position; alternate moves and observations • …

  18. HMMs have many other applications… Natural Language Processing, e.g., Speech Recognition • States: phoneme \ word • Observations: acoustic signal \ phoneme. Bioinformatics: Gene Finding • States: coding / non-coding region • Observations: DNA sequences. For these problems the critical inference is: find the most likely sequence of states given a sequence of observations
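That "most likely sequence" inference is standardly computed with the Viterbi algorithm (not covered on these slides): dynamic programming over the best-scoring path ending in each state, followed by backtracking. A sketch on an invented two-state, gene-finding-flavoured HMM (all numbers are illustrative):

```python
import numpy as np

# Toy HMM: states 0 = non-coding, 1 = coding; observations are
# indices into a 2-symbol alphabet. All probabilities are made up.
P0 = np.array([0.5, 0.5])                 # P(S_0)
trans = np.array([[0.9, 0.1],             # P(S_{t+1} | S_t): sticky states
                  [0.1, 0.9]])
obs_model = np.array([[0.7, 0.3],         # P(O_t | S_t): state 0 favours
                      [0.2, 0.8]])        # symbol 0, state 1 favours symbol 1

def viterbi(observations):
    """Most likely hidden state sequence given the observations."""
    T, k = len(observations), len(P0)
    score = np.zeros((T, k))              # best log-prob of any path ending in s
    back = np.zeros((T, k), dtype=int)    # backpointers
    score[0] = np.log(P0) + np.log(obs_model[:, observations[0]])
    for t in range(1, T):
        for s in range(k):
            cand = score[t - 1] + np.log(trans[:, s])
            back[t, s] = cand.argmax()
            score[t, s] = cand.max() + np.log(obs_model[s, observations[t]])
    # Follow backpointers from the best final state.
    path = [int(score[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1, 1, 1]))  # → [0, 0, 1, 1, 1]
```

Unlike filtering, which returns a distribution over the current state only, this jointly optimizes the whole path, which is what speech recognition and gene finding need.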

  19. Markov Models • Simplest possible Dynamic Bnet: Markov Chains • Add noisy observations about the state at time t: Hidden Markov Model • Add actions and values (rewards): Markov Decision Processes (MDPs)

  20. Learning Goals for today's class. You can: • Specify the components of a Hidden Markov Model (HMM) • Justify and apply HMMs to Robot Localization. Clarification on the second learning goal from last class. You can: • Justify and apply Markov Chains to compute the probability of a Natural Language sentence (NOT to estimate the conditional probs; slide 18)

  21. Next week. Course map (Representation / Reasoning Technique), Environment: Deterministic vs. Stochastic • Static problems: Constraint Satisfaction (Vars + Constraints): Arc Consistency, Search, SLS; Query: Logics (Search), Belief Nets (Var. Elimination), Markov Chains and HMMs • Sequential problems: Planning: STRIPS (Search, Var. Elimination), Decision Nets (Var. Elimination), Markov Decision Processes (Value Iteration)

  22. Next Class • One-off decisions (Textbook 9.2) • Single-Stage Decision networks (9.2.1)
