Modeling time series with hidden Markov models
  1. Modeling time series with hidden Markov models. Advanced Machine Learning 2017. Nadia Figueroa, Jose Medina and Aude Billard

  2. Time series data [Figure: barometric pressure, temperature and humidity plotted against time] What's going on here?

  3. Time series data What's the problem setting? Three cases: we have several trajectories with identical duration (an explicit time dependency works); we don't care about time; or we have unstructured trajectory(ies), for which an explicit time dependency is too complex, so we consider a dependency on the past.

  4. Unstructured time series data How to simplify this problem? [Diagram: Sunny, Cloudy, Rainy states] Consider dependency on the past: the Markov assumption.

  5. Outline First part (10:15 – 11:00): • Recap on Markov chains • Hidden Markov model (HMM) - Recognition of time series - ML parameter estimation Second part (11:15 – 12:00): • Time series segmentation • Bayesian nonparametrics for HMMs https://github.com/epfl-lasa/ML_toolbox

  6. Outline, first part [Diagram: Sunny, Cloudy, Rainy states]

  7. Markov chains [Diagram: Sunny, Cloudy and Rainy states, a transition matrix over (Sunny, Cloudy, Rainy), and initial probabilities over the same states]
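The transition matrix and initial probabilities on this slide can be sketched in code. The Sunny/Cloudy/Rainy labels come from the slide, but all the probability values below are illustrative assumptions (the slide's actual numbers are in the figure):

```python
import random

# Hypothetical weather transition matrix: each row conditions on the
# current state and gives a distribution over the next state.
STATES = ["Sunny", "Cloudy", "Rainy"]
A = {
    "Sunny":  {"Sunny": 0.7, "Cloudy": 0.2, "Rainy": 0.1},
    "Cloudy": {"Sunny": 0.3, "Cloudy": 0.4, "Rainy": 0.3},
    "Rainy":  {"Sunny": 0.2, "Cloudy": 0.3, "Rainy": 0.5},
}
# Initial probabilities over the three states.
PI = {"Sunny": 0.5, "Cloudy": 0.3, "Rainy": 0.2}

def sample(dist, rng):
    """Draw one state from a {state: probability} distribution."""
    r, acc = rng.random(), 0.0
    for state, p in dist.items():
        acc += p
        if r < acc:
            return state
    return state  # guard against floating-point rounding

def simulate(T, rng=None):
    """Sample a length-T state sequence from the chain."""
    rng = rng or random.Random(0)
    seq = [sample(PI, rng)]
    for _ in range(1, T):
        seq.append(sample(A[seq[-1]], rng))
    return seq

print(simulate(5))
```

Each new state depends only on the previous one, which is exactly the Markov assumption introduced on the previous slide.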

  8. Likelihood of a Markov chain [Diagram: a Sunny → Sunny transition, the transition matrix over (Sunny, Cloudy, Rainy), and the initial probabilities]
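The likelihood of an observed state sequence under a Markov chain is the initial probability of the first state times the product of the transition probabilities along the sequence. A minimal sketch, again with illustrative numbers rather than the slide's values:

```python
# Hypothetical transition matrix and initial probabilities over
# Sunny/Cloudy/Rainy; the numbers are made up for illustration.
A = {
    "Sunny":  {"Sunny": 0.7, "Cloudy": 0.2, "Rainy": 0.1},
    "Cloudy": {"Sunny": 0.3, "Cloudy": 0.4, "Rainy": 0.3},
    "Rainy":  {"Sunny": 0.2, "Cloudy": 0.3, "Rainy": 0.5},
}
PI = {"Sunny": 0.5, "Cloudy": 0.3, "Rainy": 0.2}

def chain_likelihood(seq):
    """P(s_1, ..., s_T) = pi(s_1) * prod_t a(s_{t-1}, s_t)."""
    p = PI[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= A[prev][cur]
    return p

# 0.5 * 0.7 * 0.2 = 0.07
print(chain_likelihood(["Sunny", "Sunny", "Cloudy"]))
```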

  9. Learning Markov chains Topologies: periodic, left-to-right, ergodic [Diagrams over Sunny/Cloudy states]

  10. Outline, first part [Diagram: Sunny, Cloudy, Rainy states]

  11. Hidden Markov model [Diagram: transition matrix and initial probabilities over Sunny, Cloudy, Rainy; the state sequence is hidden and only observations are seen]
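An HMM extends the Markov chain with an emission distribution per hidden state. A minimal sketch; the discrete emissions ("walk", "shop", "clean") and all numbers are made up for illustration, they are not taken from the slides:

```python
import random

STATES = ["Sunny", "Cloudy", "Rainy"]
OBS = ["walk", "shop", "clean"]
PI = [0.5, 0.3, 0.2]                  # initial state probabilities
A = [[0.7, 0.2, 0.1],                 # transition matrix
     [0.3, 0.4, 0.3],
     [0.2, 0.3, 0.5]]
B = [[0.7, 0.2, 0.1],                 # emission probabilities per state
     [0.4, 0.4, 0.2],
     [0.1, 0.3, 0.6]]

def draw(probs, rng):
    r, acc = rng.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

def sample_hmm(T, seed=0):
    """Sample a (hidden states, observations) pair of length T."""
    rng = random.Random(seed)
    s = draw(PI, rng)
    states, obs = [s], [draw(B[s], rng)]
    for _ in range(1, T):
        s = draw(A[s], rng)
        states.append(s)
        obs.append(draw(B[s], rng))
    return [STATES[i] for i in states], [OBS[i] for i in obs]

print(sample_hmm(5))
```

In practice only the observation sequence is available; the state sequence is the hidden quantity the later slides infer.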

  12. Likelihood of an HMM [Diagram: transition matrix and initial probabilities over Sunny, Cloudy, Rainy] Forward variable

  13. Likelihood of an HMM [Diagram: transition matrix and initial probabilities over Sunny, Cloudy, Rainy] Forward variable
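The forward variable αₖ(i) = P(o₁…oₖ, sₖ = i) gives the sequence likelihood as P(O) = Σᵢ α_T(i), computed in O(N²T) instead of summing over all Nᵀ state sequences. A sketch of the recursion; PI, A, B are the same kind of illustrative discrete-HMM parameters as above, using state and symbol indices:

```python
PI = [0.5, 0.3, 0.2]
A = [[0.7, 0.2, 0.1], [0.3, 0.4, 0.3], [0.2, 0.3, 0.5]]
B = [[0.7, 0.2, 0.1], [0.4, 0.4, 0.2], [0.1, 0.3, 0.6]]

def forward(obs):
    """Return the forward lattice and the likelihood P(O)."""
    N = len(PI)
    # Initialisation: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [[PI[i] * B[i][obs[0]] for i in range(N)]]
    # Recursion: alpha_{k+1}(j) = (sum_i alpha_k(i) a_ij) * b_j(o_{k+1})
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append([
            sum(prev[i] * A[i][j] for i in range(N)) * B[j][o]
            for j in range(N)
        ])
    # Termination: P(O) = sum_i alpha_T(i)
    return alpha, sum(alpha[-1])

alpha, likelihood = forward([0, 1, 2])
print(likelihood)
```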

  14. Likelihood of an HMM [Diagram: transition matrix and initial probabilities over Sunny, Cloudy, Rainy] Backward variable
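The backward variable βₖ(i) = P(oₖ₊₁…o_T | sₖ = i) runs the same recursion from the end of the sequence; it is needed alongside the forward variable for the Baum-Welch E-step on the following slides. A sketch with the same kind of illustrative parameters:

```python
PI = [0.5, 0.3, 0.2]
A = [[0.7, 0.2, 0.1], [0.3, 0.4, 0.3], [0.2, 0.3, 0.5]]
B = [[0.7, 0.2, 0.1], [0.4, 0.4, 0.2], [0.1, 0.3, 0.6]]

def backward(obs):
    """Return the backward lattice beta, indexed [time][state]."""
    N, T = len(PI), len(obs)
    beta = [[1.0] * N]                 # initialisation: beta_T(i) = 1
    # Recursion: beta_k(i) = sum_j a_ij * b_j(o_{k+1}) * beta_{k+1}(j)
    for k in range(T - 2, -1, -1):
        nxt = beta[0]
        beta.insert(0, [
            sum(A[i][j] * B[j][obs[k + 1]] * nxt[j] for j in range(N))
            for i in range(N)
        ])
    return beta

beta = backward([0, 1, 2])
# The likelihood from the backward pass matches the forward pass:
print(sum(PI[i] * B[i][0] * beta[0][i] for i in range(3)))
```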

  15. Learning an HMM Baum-Welch algorithm (Expectation-Maximization for HMMs) - Iterative solution - Converges to a local optimum of the likelihood Starting from an initial model λ, find a λ̄ such that P(O | λ̄) ≥ P(O | λ). • E-step: Given an observation sequence and a model, find the probabilities of the states having produced those observations. • M-step: Given the output of the E-step, update the model parameters to better fit the observations.

  16. Learning an HMM E-step: compute the probability of being in state i at time k and transitioning to state j, and the probability of being in state i at time k.
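The two E-step quantities named on this slide are usually written as follows, in standard HMM notation (λ the model, O the observation sequence, α and β the forward and backward variables; the slide's own formulas are in its figure):

```latex
\gamma_k(i) = P(s_k = i \mid O, \lambda)
            = \frac{\alpha_k(i)\,\beta_k(i)}{\sum_{j=1}^{N}\alpha_k(j)\,\beta_k(j)}

\xi_k(i,j) = P(s_k = i,\ s_{k+1} = j \mid O, \lambda)
           = \frac{\alpha_k(i)\, a_{ij}\, b_j(o_{k+1})\, \beta_{k+1}(j)}{P(O \mid \lambda)}
```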

  17. Learning an HMM M-step: re-estimate the model parameters from the E-step quantities (the probability of being in state i at time k and transitioning to state j, and the probability of being in state i at time k).
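With γ and ξ as defined in the E-step, the standard M-step re-estimation formulas (for discrete emissions, which is an assumption here) are:

```latex
\hat{\pi}_i = \gamma_1(i), \qquad
\hat{a}_{ij} = \frac{\sum_{k=1}^{T-1} \xi_k(i,j)}{\sum_{k=1}^{T-1} \gamma_k(i)}, \qquad
\hat{b}_j(v) = \frac{\sum_{k \,:\, o_k = v} \gamma_k(j)}{\sum_{k=1}^{T} \gamma_k(j)}
```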

  18. Learning an HMM • The HMM is a parametric technique (fixed number of states, fixed topology), so we need heuristics to determine the optimal number of states. • X: dataset; N: number of datapoints; K: number of free parameters; L: maximum likelihood of the model given K parameters. - Akaike Information Criterion: AIC = −2 ln L + 2K - Bayesian Information Criterion: BIC = −2 ln L + K ln N Choosing AIC versus BIC depends on the application: is the purpose of the analysis to make predictions, or to decide which model best represents reality? A lower BIC implies either fewer explanatory variables, a better fit, or both. As the number of datapoints (observations) increases, BIC assigns more weight to simpler models than AIC does. AIC may have better predictive ability than BIC, but BIC finds a computationally more efficient solution.
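Both criteria are computed directly from the maximized log-likelihood. The sweep below uses made-up log-likelihoods and takes K = n² as a rough stand-in for the free-parameter count of an n-state model (the true count depends on the topology and emission model):

```python
import math

def aic(log_likelihood, K):
    """AIC = -2 ln L + 2 K, with K free parameters."""
    return -2.0 * log_likelihood + 2.0 * K

def bic(log_likelihood, K, N):
    """BIC = -2 ln L + K ln N, with N datapoints."""
    return -2.0 * log_likelihood + K * math.log(N)

# Hypothetical model-selection sweep: log-likelihoods per state count
# (made-up numbers); pick the state count with the lowest BIC.
candidates = {3: -120.0, 4: -110.0, 5: -108.0}
best = min(candidates, key=lambda n: bic(candidates[n], K=n * n, N=500))
print(best)  # BIC's ln N penalty outweighs the small likelihood gains
```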

  19. Applications of HMMs • State estimation: what is the most probable state/state sequence of the system? • Prediction: what are the most probable next observations/state of the system? • Model selection: what is the most likely model that represents these observations?

  20. Examples Speech recognition: • Left-to-right model • States are phonemes • Observations in the frequency domain D.B. Paul, Speech Recognition Using Hidden Markov Models, The Lincoln Laboratory Journal, 1990

  21. Examples Motion prediction: • Periodic model • Observations are observed joints • Simulate/predict walking patterns Karg, Michelle, et al., "Human movement analysis: Extension of the F-statistic to time series using HMM," 2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC), IEEE, 2013

  22. Examples Motion prediction: • Left-to-right models • Autonomous segmentation • Recognition + prediction

  23. Examples Motion prediction: • Left-to-right models • Autonomous segmentation • Recognition + prediction

  24. Examples Motion prediction: • Left-to-right model • Each state is a dynamical system

  25. Examples Toy training set: • 1 player • 7 actions • 1 hidden Markov model per action Motion recognition: • Recognition of the most likely motion and prediction of the next step MATLAB demo

  26. Outline First part (10:15 – 11:00): • Recap on Markov chains • Hidden Markov model (HMM) - Recognition of time series - ML parameter estimation Second part (11:15 – 12:00): • Time series segmentation • Bayesian nonparametrics for HMMs https://github.com/epfl-lasa/ML_toolbox

  27. Time series segmentation Time series = sequence of discrete segments. Why is this an important problem?

  28. Segmentation of speech signals Segmenting a continuous speech signal into sets of distinct words.

  29. Segmentation of speech signals Segmenting a continuous speech signal into sets of distinct words, e.g. "I am on a seafood diet. I see food and I eat it!"

  30. Segmentation of human motion data Segmentation of continuous motion-capture data from exercise routines into motion categories: jumping jacks, knee raises, arm circles, squats. Emily Fox et al., Sharing Features among Dynamical Systems with Beta Processes, NIPS, 2009

  31. Segmentation in human motion data 12 variables: torso position, waist angles (2), neck angle, shoulder angles, … Emily Fox et al., Sharing Features among Dynamical Systems with Beta Processes, NIPS, 2009

  32. Segmentation in robotics Learning complex sequential tasks from demonstration. 7 variables: position, orientation. Segments: trash, reach, grate.

  33. HMM for time series segmentation Assumptions: • The time series has been generated by a system that transitions between a set of hidden states. • At each time step, a sample is drawn from an emission model associated with the current hidden state.

  34. HMM for time series segmentation How do we find these segments?

  35. HMM for time series segmentation Steps for segmentation with an HMM: 1. Learn the HMM parameters (initial state probabilities, transition matrix, emission model parameters) through maximum likelihood estimation (MLE), maximizing the HMM likelihood with the Baum-Welch algorithm (Expectation-Maximization for HMMs): an iterative solution that converges to a local optimum. Hyper-parameter: the number of possible states K.

  36. HMM for time series segmentation Steps for segmentation with an HMM: 2. Find the most probable sequence of states generating the observations through the Viterbi algorithm, maximizing the HMM joint probability distribution.
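The Viterbi algorithm replaces the sum in the forward recursion with a max and keeps backpointers, so the most probable state path (i.e. the segmentation) is recovered in O(N²T). A sketch with the same kind of illustrative discrete-HMM parameters as earlier, not learned values:

```python
PI = [0.5, 0.3, 0.2]
A = [[0.7, 0.2, 0.1], [0.3, 0.4, 0.3], [0.2, 0.3, 0.5]]
B = [[0.7, 0.2, 0.1], [0.4, 0.4, 0.2], [0.1, 0.3, 0.6]]

def viterbi(obs):
    """Return the most probable hidden-state path for obs (index form)."""
    N = len(PI)
    # delta_1(i) = pi_i * b_i(o_1)
    delta = [PI[i] * B[i][obs[0]] for i in range(N)]
    back = []
    for o in obs[1:]:
        prev, step, delta = delta, [], []
        for j in range(N):
            # Best predecessor: argmax_i delta_k(i) * a_ij
            i_best = max(range(N), key=lambda i: prev[i] * A[i][j])
            step.append(i_best)
            delta.append(prev[i_best] * A[i_best][j] * B[j][o])
        back.append(step)
    # Backtrack from the best final state.
    path = [max(range(N), key=lambda j: delta[j])]
    for step in reversed(back):
        path.append(step[path[-1]])
    path.reverse()
    return path

print(viterbi([0, 0, 2, 2]))
```

Runs of identical states in the returned path are exactly the segments sought on this slide.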

  37. HMM for time series segmentation

  38. HMM for time series segmentation

  39. Model selection for HMMs

  40. Model selection for HMMs?
