Nonparametric Filter
Quan Nguyen
November 16, 2015
Outline
1. Hidden Markov Model
2. State estimation
3. Bayes filters
4. Histogram filter
5. Binary filter with static state
6. Particle filter
7. Summary
8. References
1. Hidden Markov Model

Bayesian Network
- Graphical model of conditional probabilistic relations
- Directed acyclic graph (DAG): G = (V, E)
  V: set of random variables
  E: set of conditional dependencies
http://www.intechopen.com/books/current-topics-in-public-health/from-creativity-to-artificial-neural-networks-problem-solving-methodologies-in-hospitals
1. Hidden Markov Model

Hidden Markov Model
- A particular kind of Bayesian Network
- Modelling time-series data
http://sites.stat.psu.edu/~jiali/hmm.html
1. Hidden Markov Model

Hidden Markov Model (example figure)
https://en.wikipedia.org/wiki/Viterbi_algorithm#Example
1. Hidden Markov Model

Hidden Markov Model
Observing a patient for 3 days:
+ Day 1: Cold
+ Day 2: Normal
+ Day 3: Dizzy
Questions:
1) What is the most likely sequence of health conditions of the patient over the last 3 days?
2) What is the most likely health condition of the patient on the 4th day?
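A minimal Viterbi sketch for question 1, as one might write it in Python. The Healthy/Fever states and all probability values below are illustrative assumptions in the spirit of the Wikipedia Viterbi example linked on the previous slide; they are not given in these slides. Question 2 would instead run the forward algorithm and apply the transition model one more step.

```python
# Hypothetical HMM for the 3-day patient example (all numbers are assumptions).
states = ["Healthy", "Fever"]
start_p = {"Healthy": 0.6, "Fever": 0.4}
trans_p = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
           "Fever":   {"Healthy": 0.4, "Fever": 0.6}}
emit_p = {"Healthy": {"Normal": 0.5, "Cold": 0.4, "Dizzy": 0.1},
          "Fever":   {"Normal": 0.1, "Cold": 0.3, "Dizzy": 0.6}}

def viterbi(observations):
    # V[t][s] = probability of the best state sequence ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    back = [{}]
    for t in range(1, len(observations)):
        V.append({}); back.append({})
        for s in states:
            prob, prev = max((V[t - 1][p] * trans_p[p][s] * emit_p[s][observations[t]], p)
                             for p in states)
            V[t][s] = prob
            back[t][s] = prev
    # Backtrack from the most probable final state
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(observations) - 1, 0, -1):
        path.insert(0, back[t][path[0]])
    return path

print(viterbi(["Cold", "Normal", "Dizzy"]))  # most likely 3-day health sequence
```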
2. State estimation

State space
- Quantities that cannot be directly observed but can be inferred from sensor data
- Examples: position and direction of a robot in a room
- Notation: X = x_1, x_2, ..., x_t
  p(X = x_t): the probability that the true state at time t is x_t
2. State estimation

Measurement (observation)
- Environment data provided by the robot's sensors
- Examples: distance to the ground, camera images
- Notation: Z = z_1, z_2, ..., z_t
  p(Z = z_t): the probability that the measurement at time t is z_t
2. State estimation

Control data
- Information about the change of state in the environment
- Examples: velocity of the robot, temperature of a room, an action of the robot on objects in the environment
- Notation: U = u_1, u_2, ..., u_t
  p(U = u_t): the probability that the control at time t is u_t
2. State estimation

Probabilistic Generative Laws
- In general, the state depends on all past states, measurements and controls:
  p(x_t | x_{0:t-1}, z_{1:t-1}, u_{1:t})
- Markov assumption:
  p(x_t | x_{0:t-1}, z_{1:t-1}, u_{1:t}) = p(x_t | x_{t-1}, u_t)
  p(z_t | x_{0:t}, z_{1:t-1}, u_{1:t}) = p(z_t | x_t)
2. State estimation

Belief distribution
- Belief:
  - Internal knowledge of the robot about the true state
  - Assigns a probability to each possible true state
  - Notation: bel(x_t) = p(x_t | z_{1:t}, u_{1:t})
- Prediction: bel_bar(x_t) = p(x_t | z_{1:t-1}, u_{1:t})
- Correction (measurement update): bel(x_t) = F(bel_bar(x_t))
3. Bayes Filter

Bayes filter algorithm (continuous case)
1: Full_Continuous_Bayes_filter(bel(x_{t-1}), u_t, z_t)
2:   for all x_t do
3:     bel_bar(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx_{t-1}
4:     bel(x_t) = normalizer · p(z_t | x_t) · bel_bar(x_t)
5:   endfor
6:   return bel(x_t)
3. Bayes Filter

Bayes filter algorithm (discrete case)
1: Full_Discrete_Bayes_filter({p_{k,t-1}}, u_t, z_t)
2:   for all k do
3:     p_bar_{k,t} = Σ_i p(X_t = x_k | u_t, X_{t-1} = x_i) p_{i,t-1}
4:     p_{k,t} = normalizer · p(z_t | X_t = x_k) · p_bar_{k,t}
5:   endfor
6:   return {p_{k,t}}
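A minimal Python sketch of the discrete Bayes filter above. The two-state door example, its transition matrix and its measurement model are made-up values for illustration, not part of the slides.

```python
import numpy as np

def discrete_bayes_filter(p_prev, u_t, z_t, transition, measurement):
    """p_prev: prior over K states; transition(u): K x K matrix with
    transition(u)[i, k] = p(x_k | u, x_i); measurement(z): length-K vector
    with measurement(z)[k] = p(z | x_k)."""
    p_bar = transition(u_t).T @ p_prev          # line 3: prediction
    p = measurement(z_t) * p_bar                # line 4: correction
    return p / p.sum()                          # normalizer

# Hypothetical example: a door that is either open (state 0) or closed (state 1).
transition = lambda u: np.array([[1.0, 0.0],    # "push" keeps an open door open
                                 [0.8, 0.2]]) if u == "push" else np.eye(2)
measurement = lambda z: np.array([0.6, 0.2]) if z == "sense_open" else np.array([0.4, 0.8])

belief = np.array([0.5, 0.5])
belief = discrete_bayes_filter(belief, "push", "sense_open", transition, measurement)
print(belief)                                   # posterior over {open, closed}
```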
4. Histogram filter

Histogram Filter
- Discrete Bayes filter estimation for continuous state spaces
- State space decomposition:
  range(X_t) = x_{1,t} ∪ x_{2,t} ∪ … ∪ x_{K,t}
  For every i ≠ k: x_{i,t} ∩ x_{k,t} = ∅
- Within each region the posterior is a piecewise constant density:
  For every state x_t belonging to region x_{k,t}: p(x_t) = p_{k,t} / |x_{k,t}|
4. Histogram filter

Histogram filter
- Problem: the probabilities are defined for individual states, not for regions!
  - Refer to lines 3 and 4 of the discrete Bayes filter algorithm
- Solution: approximate each region by a representative state, e.g. its mean:
  x̂_{k,t} = (1 / |x_{k,t}|) ∫_{x_{k,t}} x_t dx_t
4. Histogram filter

Histogram filter
- Approximation of density values for regions:
  p(z_t | x_{k,t}) ≈ p(z_t | x̂_{k,t})
  p(x_{k,t} | u_t, x_{i,t-1}) ≈ normalizer · p(x̂_{k,t} | u_t, x̂_{i,t-1})
- Precondition: all regions must have the same size.
- Now the discrete Bayes filter algorithm is applicable!
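A minimal 1-D histogram-filter sketch, assuming a robot moving along a line with Gaussian motion and sensor noise; the bin layout and noise values are illustrative, not taken from the slides. Each region is represented by its centre, as described above.

```python
import numpy as np

K = 100
edges = np.linspace(0.0, 10.0, K + 1)           # K equally sized regions
centres = 0.5 * (edges[:-1] + edges[1:])        # representative states x_hat_{k,t}
p = np.full(K, 1.0 / K)                         # uniform prior over regions

def predict(p, u, motion_std=0.3):
    """Prediction step: p(x_hat_k | u, x_hat_i) evaluated between bin centres."""
    diff = centres[:, None] - (centres[None, :] + u)      # x_hat_k - (x_hat_i + u)
    trans = np.exp(-0.5 * (diff / motion_std) ** 2)
    trans /= trans.sum(axis=0, keepdims=True)             # normalise over k
    return trans @ p

def correct(p, z, sensor_std=0.5):
    """Correction step: weight each region by p(z | x_hat_k) and renormalise."""
    likelihood = np.exp(-0.5 * ((z - centres) / sensor_std) ** 2)
    p = likelihood * p
    return p / p.sum()

p = correct(predict(p, u=1.0), z=4.2)
print(centres[np.argmax(p)])                    # centre of the most likely region
```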
5. Binary filter with static state

Binary Bayes filter with static state
- The state is static, so the belief is a function of the measurements only:
  bel_t(x) = p(x | z_{1:t}, u_{1:t}) = p(x | z_{1:t})
- General algorithm (in log-odds form):
  1: Full_Binary_Bayes_filter(l_{t-1}, z_t)
  2:   l_t = l_{t-1} + log( p(x | z_t) / (1 - p(x | z_t)) ) - log( p(x) / (1 - p(x)) )
  3:   return l_t
5. Binary filter with static state

- Log-odds ratio:
  l(x) = log( p(x) / (1 - p(x)) )
  - Avoids truncation problems when probabilities are close to 0 or 1
- Inverse measurement model:
  - Reduces complexity by using the probability of the state given the measurement data
  - Example: inferring the state of a door from a camera image is much easier than modelling the distribution over all images of a closed/open door.
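A minimal sketch of the log-odds update above. The inverse measurement model values used in the usage example (a sensor reporting the door open with probability 0.7) are illustrative assumptions.

```python
import math

def log_odds(p):
    return math.log(p / (1.0 - p))

def binary_bayes_filter(l_prev, p_x_given_z, p_x_prior=0.5):
    """One update: l_t = l_{t-1} + log-odds(p(x | z_t)) - log-odds(p(x))."""
    return l_prev + log_odds(p_x_given_z) - log_odds(p_x_prior)

def belief_from_log_odds(l):
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# Two measurements, each suggesting the door is open with probability 0.7.
l = 0.0                                   # prior p(x) = 0.5  ->  log odds 0
for p_x_given_z in (0.7, 0.7):
    l = binary_bayes_filter(l, p_x_given_z)
print(belief_from_log_odds(l))            # belief rises above 0.7 after two updates
```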
5. Binary filter with static state

Example of the binary filter: occupancy grid mapping
- Estimate (generate) a map from (noisy) sensor measurements and robot positions
- General approach: factor the map posterior over grid cells:
  p(Map = m | z_{1:t}, x_{1:t}) = Π_i p(Cell_i is occupied | z_{1:t}, x_{1:t})
  Each p(Cell_i is occupied | z_{1:t}, x_{1:t}) is a binary estimation problem
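A minimal occupancy-grid sketch that keeps one log-odds value per cell and updates it with the binary filter from the previous slide. The inverse sensor model used here (cells along the measurement ray are likely free, the cell at the measured range is likely occupied) and all numeric values are simplifying assumptions for illustration.

```python
import numpy as np

grid = np.zeros((20, 20))                 # log odds; 0.0 means p(occupied) = 0.5
L_OCC, L_FREE = np.log(0.7 / 0.3), np.log(0.3 / 0.7)

def update_ray(grid, robot_cell, hit_cell):
    """Update the cells on a straight ray from the robot to a measured hit."""
    (r0, c0), (r1, c1) = robot_cell, hit_cell
    n = max(abs(r1 - r0), abs(c1 - c0))
    for step in range(1, n):              # cells before the hit: evidence for free
        r = r0 + round(step * (r1 - r0) / n)
        c = c0 + round(step * (c1 - c0) / n)
        grid[r, c] += L_FREE
    grid[r1, c1] += L_OCC                 # cell at the measured range: evidence for occupied
    return grid

grid = update_ray(grid, robot_cell=(0, 0), hit_cell=(0, 5))
prob = 1.0 - 1.0 / (1.0 + np.exp(grid))   # convert log odds back to p(occupied)
print(prob[0, :7])
```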
6. Particle filter

Particle filter algorithm
- Represent the posterior density by a set of weighted random particles
- General algorithm:
  1: Full_Particle_filter(X_{t-1}, u_t, z_t)
  2:   X_bar_t = X_t = ∅
  3:   for m = 1 to M do
  4:     sample x_t^[m] ~ p(x_t | x_{t-1}^[m], u_t)
  5:     w_t^[m] = p(z_t | x_t^[m])
  6:     X_bar_t = X_bar_t + <x_t^[m], w_t^[m]>
  7:   endfor
  8:   for m = 1 to M do
  9:     draw index i with probability proportional to w_t^[i]
  10:    add x_t^[i] to X_t
  11:  endfor
  12:  return X_t
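A minimal Python sketch of the algorithm above for a 1-D state. The Gaussian motion and measurement models are assumptions made for illustration; the algorithm itself is model-agnostic.

```python
import numpy as np

def particle_filter(particles, u_t, z_t, motion_std=0.2, sensor_std=0.5):
    M = len(particles)
    # lines 3-7: sample from the motion model and weight by the measurement model
    proposed = particles + u_t + np.random.normal(0.0, motion_std, size=M)
    weights = np.exp(-0.5 * ((z_t - proposed) / sensor_std) ** 2)
    weights /= weights.sum()
    # lines 8-11: resample with probability proportional to weight
    indices = np.random.choice(M, size=M, p=weights)
    return proposed[indices]

particles = np.random.uniform(0.0, 10.0, size=1000)   # uniform initial belief
for u, z in [(1.0, 4.0), (1.0, 5.1), (1.0, 6.0)]:
    particles = particle_filter(particles, u, z)
print(particles.mean())                                # approximate state estimate
```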
6. Particle filter

Particle filter algorithm (illustration)
http://www.juergenwiki.de/work/wiki/doku.php?id=public:particle_filter
6. Particle filter

Properties of the particle filter algorithm
- Degrees of freedom:
  - Because of normalization one degree of freedom is lost: deg = M - 1
- Identical particles after the resampling phase:
  - Resampling draws particles with probability proportional to their weights, so in every iteration some particles are never drawn while others are duplicated.
6. Particle filter

Properties of the particle filter algorithm
- Deterministic sensor:
  - With a noise-free sensor, the measurement likelihood is zero for most states
    → almost all particle weights become zero.
- Particle deprivation problem:
  - Resampling can wipe out all particles near the true state
    → incorrect states end up with larger weights!
6. Particle filter

Applications of the particle filter
- Tracking the state of a dynamic system modeled by a Bayesian network: robot localization, SLAM, robot fault diagnosis.
- Image segmentation: generate a large number of particles and gradually focus on the particles with the desired properties
  → image processing, medical image analysis