
  1. A Kalman Filter for Robust Outlier Detection Jo-Anne Ting, Evangelos Theodorou, Stefan Schaal Computational Learning & Motor Control Lab University of Southern California IROS 2007 October 31, 2007

  2. Outline • Motivation • Quick review of the Kalman filter • Robust Kalman filtering with Bayesian weights • Experimental results • Conclusions

  3. Motivation • Consider real-time applications where storing data samples may not be a viable option due to the high frequency of sensory data. • In systems where high-quality sensory data is needed, reliable detection of outliers is essential for optimal performance (e.g., legged locomotion). • The Kalman filter (Kalman, '60) is commonly used for real-time tracking, but it is not robust to outliers!

  4. Previous Methods
     1) Use non-Gaussian distributions for random variables (Sorenson & Alspach '71, West '82). Drawback: complicated parameter estimation for systems with transient disturbances.
     2) Model observation & state noise as non-Gaussian, heavy-tailed distributions (Masreliez '75). Drawback: difficult & involved filter implementation.
     3) Use resampling or numerical integration (Kitagawa '87). Drawback: heavy computation, not suitable for real-time applications.
     4) Use a robust least squares approach & model weights with heuristic functions (e.g., Durovic & Kovacevic, '99). Drawback: need to determine the optimal values of open parameters.

  5. Outline • Motivation • Quick review of the Kalman filter • Robust Kalman filtering with Bayesian weights • Experimental results • Conclusions

  6. A Quick Review of the Kalman Filter
     • The system equations for the Kalman filter are as follows:
       $z_k = C \theta_k + v_k$ (observation equation, with observation matrix $C$ and observation noise $v_k \sim \mathrm{Normal}(0, R)$)
       $\theta_k = A \theta_{k-1} + s_k$ (state equation, with state transition matrix $A$ and state noise $s_k \sim \mathrm{Normal}(0, Q)$)
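     To make the model above concrete, here is a minimal Python/NumPy sketch that simulates the linear-Gaussian state-space model $\theta_k = A\theta_{k-1} + s_k$, $z_k = C\theta_k + v_k$. The dimensions and the values of A, C, Q, R are illustrative assumptions, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D state / 1-D observation system (values are illustrative only).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # state transition matrix
C = np.array([[1.0, 0.0]])      # observation matrix
Q = 0.01 * np.eye(2)            # state noise covariance
R = np.array([[0.25]])          # observation noise covariance

def simulate(T=100):
    """Draw a trajectory from theta_k = A theta_{k-1} + s_k,  z_k = C theta_k + v_k."""
    theta = np.zeros(2)
    states, observations = [], []
    for _ in range(T):
        theta = A @ theta + rng.multivariate_normal(np.zeros(2), Q)  # state noise s_k ~ N(0, Q)
        z = C @ theta + rng.multivariate_normal(np.zeros(1), R)      # observation noise v_k ~ N(0, R)
        states.append(theta)
        observations.append(z)
    return np.array(states), np.array(observations)
```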

  7. Standard Kalman Filter Equations
     Propagation:
       $\theta'_k = A \theta_{k-1}$
       $\Sigma'_k = A \Sigma_{k-1} A^T + Q$
     Update:
       $S_k = C \Sigma'_k C^T + R$
       $K_k = \Sigma'_k C^T S_k^{-1}$
       $\theta_k = \theta'_k + K_k (z_k - C \theta'_k)$
       $\Sigma_k = (I - K_k C) \Sigma'_k$
     (A maximum-likelihood framework can be used to estimate the system dynamics; Myers & Tapley, 1976.)
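     As a concrete illustration of the propagation and update equations above, here is a minimal NumPy sketch of one Kalman filter step; the function name kalman_step and its argument layout are my own, not from the slides.

```python
import numpy as np

def kalman_step(theta, Sigma, z, A, C, Q, R):
    """One propagation + update step of the standard Kalman filter (illustrative sketch)."""
    # Propagation: theta'_k = A theta_{k-1},  Sigma'_k = A Sigma_{k-1} A^T + Q
    theta_p = A @ theta
    Sigma_p = A @ Sigma @ A.T + Q
    # Update
    S = C @ Sigma_p @ C.T + R                 # innovation covariance S_k
    K = Sigma_p @ C.T @ np.linalg.inv(S)      # Kalman gain K_k
    theta_new = theta_p + K @ (z - C @ theta_p)
    Sigma_new = (np.eye(len(theta)) - K @ C) @ Sigma_p
    return theta_new, Sigma_new
```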

  8. Outline • Motivation • Quick review of the Kalman filter • Robust Kalman filtering with Bayesian weights • Experimental results • Conclusions

  9. Robust Kalman Filtering with Bayesian Weights
     • Use a weighted least squares approach & learn the optimal weights:
       $z_k \mid \theta_k, w_k \sim \mathrm{Normal}(C \theta_k, R / w_k)$
       $\theta_k \mid \theta_{k-1} \sim \mathrm{Normal}(A \theta_{k-1}, Q)$
       $w_k \sim \mathrm{Gamma}(a_{w_k}, b_{w_k})$
     [Figure: graphical models of the standard Kalman filter and the robust Kalman filter with Bayesian weights; the robust model attaches a weight $w_k$ to each observation $z_k$ and uses per-time-step parameters $A_k$, $C_k$, $Q_k$, $R_k$.]
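     For intuition, the following sketch draws one step from this generative model: because the observation covariance is $R / w_k$, a small weight inflates the observation noise and effectively downweights that measurement. The prior parameters and matrix values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (for illustration only).
a_w0, b_w0 = 1.0, 1.0                    # Gamma prior on the weight w_k
A = np.eye(2)
C = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])

def sample_step(theta_prev):
    """Draw (theta_k, z_k, w_k) from the weighted observation model."""
    theta = rng.multivariate_normal(A @ theta_prev, Q)  # theta_k | theta_{k-1} ~ N(A theta_{k-1}, Q)
    w = rng.gamma(a_w0, 1.0 / b_w0)                     # w_k ~ Gamma(a_w0, b_w0); NumPy uses scale = 1/rate
    z = rng.multivariate_normal(C @ theta, R / w)       # z_k | theta_k, w_k ~ N(C theta_k, R / w_k)
    return theta, z, w
```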

  10. Inference Procedure
     • We can treat this as an EM learning problem (Dempster et al., '77): maximize $\log \prod_{i=1}^{N} p(\theta_{1:k}, z_i, w_{1:k})$.
     • We use a variational factorial approximation of the true posterior distribution,
       $Q(w, \theta) = Q(\theta_0) \prod_{i=1}^{N} Q(\theta_i \mid \theta_{i-1}) \prod_{i=1}^{N} Q(w_i)$,
       to get analytically tractable inference (e.g., Ghahramani & Beal, '00).

  11. Robust Kalman Filter Equations (compare to the standard Kalman filter on slide 7)
     Propagation:
       $\theta'_k = A_k \theta_{k-1}$
       $\Sigma'_k = Q_k$
     Update:
       $S_k = C_k \Sigma'_k C_k^T + R_k / w_k$
       $K_k = \Sigma'_k C_k^T S_k^{-1}$
       $\theta_k = \theta'_k + K_k (z_k - C_k \theta'_k)$
       $\Sigma_k = (I - K_k C_k) \Sigma'_k$
       $w_k = \dfrac{a_{w_{k0}} + \frac{1}{2}}{b_{w_{k0}} + \frac{1}{2} (z_k - C_k \theta_k)^T R_k^{-1} (z_k - C_k \theta_k)}$
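     Below is a hedged NumPy sketch of one robust filter step based on the equations above. The slides do not specify how the coupled quantities $\theta_k$ and $w_k$ are resolved within a time step, so this sketch simply alternates a few fixed-point iterations; the prior values a_w0, b_w0 and the iteration count are illustrative assumptions.

```python
import numpy as np

def robust_kalman_step(theta, z, A_k, C_k, Q_k, R_k, a_w0=1.0, b_w0=1.0, n_iter=3):
    """One step of the outlier-robust Kalman filter (illustrative sketch)."""
    # Propagation: theta'_k = A_k theta_{k-1},  Sigma'_k = Q_k
    theta_p = A_k @ theta
    Sigma_p = Q_k
    I = np.eye(len(theta))
    R_inv = np.linalg.inv(R_k)
    w = 1.0                                    # start from a neutral weight
    for _ in range(n_iter):                    # alternate the state update and the weight update
        S = C_k @ Sigma_p @ C_k.T + R_k / w                # innovation covariance with weighted R
        K = Sigma_p @ C_k.T @ np.linalg.inv(S)             # gain K_k
        theta_k = theta_p + K @ (z - C_k @ theta_p)
        Sigma_k = (I - K @ C_k) @ Sigma_p
        resid = z - C_k @ theta_k
        w = (a_w0 + 0.5) / (b_w0 + 0.5 * float(resid @ R_inv @ resid))  # posterior mean of w_k
    return theta_k, Sigma_k, w
```

     An outlier produces a large residual, which drives $w_k$ toward zero and keeps the gain small, so the outlying measurement barely moves the state estimate.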

  12. Important Things to Note • Our robust Kalman filter: 1) Has the same computational complexity as the standard Kalman filter 2) Is principled & easy to implement (no heuristics) 3) Offers a natural framework to incorporate prior knowledge of the presence of outliers

  13. Outline • Motivation • Quick review of the Kalman filter • Robust Kalman filtering with Bayesian weights • Experimental results • Conclusions

  14. Real-time Outlier Detection on LittleDog
     [Figure: tracking results on LittleDog sensor data, with outliers marked.]
     Our robust KF performs as well as a hand-tuned KF (which required prior knowledge and, hence, is near-optimal).

  15. Outline • Motivation • Quick review of the Kalman filter • Robust Kalman filtering with Bayesian weights • Experimental results • Conclusions

  16. Conclusions • We have introduced an outlier-robust Kalman filter that: 1) Is principled & easy to implement 2) Has the same computational complexity as the Kalman filter 3) Provides a natural framework to incorporate prior knowledge of noise • This framework can be extended to other more complex, nonlinear filters & methods in order to incorporate automatic outlier detection abilities.

  17. Final Posterior EM Update Equations
     E-step:
       $\Sigma_{\theta_k} = \left( w_k C_k^T R_k^{-1} C_k + Q_k^{-1} \right)^{-1}$
       $\theta_k = \Sigma_{\theta_k} \left( Q_k^{-1} A_k \theta_{k-1} + w_k C_k^T R_k^{-1} z_k \right)$
       $w_k = \dfrac{a_{w_{k0}} + \frac{1}{2}}{b_{w_{k0}} + \frac{1}{2} (z_k - C_k \theta_k)^T R_k^{-1} (z_k - C_k \theta_k)}$
     M-step:
       $C_k = \left( \sum_{i=1}^{k} w_i z_i \langle \theta_i \rangle^T \right) \left( \sum_{i=1}^{k} w_i \langle \theta_i \theta_i^T \rangle \right)^{-1}$
       $A_k = \left( \sum_{i=1}^{k} \langle \theta_i \theta_{i-1}^T \rangle \right) \left( \sum_{i=1}^{k} \langle \theta_{i-1} \theta_{i-1}^T \rangle \right)^{-1}$
       $r_{km} = \frac{1}{k} \sum_{i=1}^{k} w_i \left( z_{im} - C_k(m,:)\, \theta_i \right)^2$
       $q_{kn} = \frac{1}{k} \sum_{i=1}^{k} \left( \theta_{in} - A_k(n,:)\, \theta_{i-1} \right)^2$
     The M-step sums need to be written in incremental form; these are computed once for each time step k (e.g., Ghahramani & Hinton, 1996).
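     A hedged NumPy sketch of the E-step above, for a fixed weight $w_k$; the function name and argument order are my own.

```python
import numpy as np

def e_step(theta_prev, z, w, A_k, C_k, Q_k, R_k):
    """Posterior covariance and mean of theta_k for a given weight w (illustrative sketch)."""
    Q_inv = np.linalg.inv(Q_k)
    R_inv = np.linalg.inv(R_k)
    Sigma_k = np.linalg.inv(w * C_k.T @ R_inv @ C_k + Q_inv)                 # covariance update
    theta_k = Sigma_k @ (Q_inv @ A_k @ theta_prev + w * C_k.T @ R_inv @ z)   # mean update
    return theta_k, Sigma_k
```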

  18. Incremental Version of M-step Equations
     • Gather sufficient statistics to re-write the M-step equations in incremental form (i.e., only using values observed or calculated in the current time step k):
     M-step:
       $C_k = \langle w z \theta^T \rangle_k \left( \langle w \theta \theta^T \rangle_k \right)^{-1}$
       $A_k = \langle \theta \theta'^T \rangle_k \left( \langle \theta' \theta'^T \rangle_k \right)^{-1}$
       $r_{km} = \frac{1}{k} \left\{ \langle w z z \rangle_{km} - 2\, C_k(m,:) \langle w z \theta \rangle_{km} + \mathrm{diag}\!\left( C_k(m,:) \langle w \theta \theta^T \rangle_k C_k(m,:)^T \right) \right\}$
       $q_{kn} = \frac{1}{k} \left\{ \langle \theta \theta \rangle_{kn} - 2\, A_k(n,:) \langle \theta \theta' \rangle_{kn} + \mathrm{diag}\!\left( A_k(n,:) \langle \theta' \theta'^T \rangle_k A_k(n,:)^T \right) \right\}$
     Here $\theta'$ denotes the state at the previous time step, and $\langle \cdot \rangle_k$ denotes the corresponding running sum over time steps 1 to k.
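     The incremental idea can be sketched as running sums that are folded in once per time step. This hypothetical class shows only the statistics needed for $C_k$ (those for $A_k$, $r_k$, $q_k$ accumulate in the same fashion); the class and attribute names are my own.

```python
import numpy as np

class SufficientStats:
    """Running sufficient statistics for the incremental M-step (illustrative sketch)."""

    def __init__(self, d_obs, d_state):
        self.wz_theta = np.zeros((d_obs, d_state))         # <w z theta^T>_k : sum of w_i z_i <theta_i>^T
        self.w_theta_theta = np.zeros((d_state, d_state))  # <w theta theta^T>_k : sum of w_i <theta_i theta_i^T>

    def update(self, w, z, theta, Sigma):
        """Fold in the current time step only, using <theta theta^T> = Sigma + theta theta^T."""
        self.wz_theta += w * np.outer(z, theta)
        self.w_theta_theta += w * (Sigma + np.outer(theta, theta))

    def C_k(self):
        """Current estimate C_k = <w z theta^T>_k (<w theta theta^T>_k)^{-1}."""
        return self.wz_theta @ np.linalg.inv(self.w_theta_theta)
```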
