Summary: Bayesian (Multi-)Sensor Tracking
Sensor Data Fusion - Methods and Applications, 2nd Lecture on October 23, 2019



  1.–4. Summary: Bayesian (Multi-)Sensor Tracking
  • Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
  • Objective: Learn as much as possible about the individual target states at each time by analyzing the 'time series' constituted by the sensor data.
  • Problem: The sensor information is imperfect: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets' temporal evolution is usually not well known.
  • Approach: Interpret measurements and target state vectors as random variables (RVs). Describe what is known about them by probability density functions (pdfs).
  • Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for track initiation! In doing so, exploit all available background information! Derive state estimates from the pdfs along with appropriate quality measures!

  5. Elements for multisensor situation pictures: tracks of temporally evolving objects
  Which object properties are of interest? → state $X_k$ at time $t_k$
  • road-moving vehicle, odometer count $x_k$: $X_k = (x_k, \dot{x}_k)$
  • position, speed, acceleration: $X_k = (r_k, \dot{r}_k, \ddot{r}_k)$
  • joint state of several objects: $X_k = (x_k^1, x_k^2, \dots)$
  • attributes, e.g. radar cross section $\sigma_k \in \mathbb{R}^+$: $X_k = (x_k, \sigma_k)$
  • maneuvering phase, object class $i_k \in \mathbb{N}$: $X_k = (x_k, i_k)$

  6. Elements for multisensor situation pictures: tracks of temporally evolving objects
  Which object properties are of interest? → state $X_k$ at time $t_k$
  How to learn the states $X_k$? → from sensor data $Z^k = \{Z_k, Z^{k-1}\}$ and context
  How to deal with imprecise information? → e.g. by conditional pdfs $p(X_k \mid Z^k)$
  What does 'learning' mean in this context? → iterative calculation of $p(X_k \mid Z^k)$

  7. The general tracking equations

  Prediction:
  $$p(X_k \mid Z^{k-1}) \;=\; \int dX_{k-1}\; \underbrace{p(X_k \mid X_{k-1})}_{\text{evolution}}\; \underbrace{p(X_{k-1} \mid Z^{k-1})}_{\text{filtering } t_{k-1}}$$

  Filtering:
  $$p(X_k \mid Z^k) \;=\; \frac{\overbrace{p(Z_k \mid X_k)}^{\text{sensor model}}\; \overbrace{p(X_k \mid Z^{k-1})}^{\text{prediction}}}{\int dX_k\; p(Z_k \mid X_k)\; p(X_k \mid Z^{k-1})}$$

  Retrodiction:
  $$p(X_l \mid Z^k) \;=\; \int dX_{l+1}\; \frac{\overbrace{p(X_{l+1} \mid X_l)}^{\text{evolution}}\; \overbrace{p(X_l \mid Z^l)}^{\text{filtering } t_l}}{\underbrace{p(X_{l+1} \mid Z^l)}_{\text{prediction } t_{l+1}}}\; \underbrace{p(X_{l+1} \mid Z^k)}_{\text{retrodiction } t_{l+1}}$$
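
  The same prediction/filtering recursion can be checked numerically on a discretized one-dimensional state space. The sketch below (Python/NumPy; the random-walk evolution kernel, the Gaussian likelihood, and all numerical values are illustrative assumptions, not part of the lecture) replaces the integrals by sums over a grid:

```python
import numpy as np

# Discretized 1-D state grid; all model choices below are illustrative assumptions.
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

def gauss(u, mean, var):
    return np.exp(-0.5 * (u - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

# Evolution model p(x_k | x_{k-1}): here a random walk with variance q.
q = 0.5
evolution = gauss(x[:, None], x[None, :], q)      # rows: x_k, columns: x_{k-1}

# Sensor model (likelihood) p(z_k | x_k): Gaussian measurement of the state, variance r.
r = 1.0

def predict(prior):
    """Prediction: p(x_k | Z^{k-1}) = ∫ dx_{k-1} p(x_k | x_{k-1}) p(x_{k-1} | Z^{k-1})."""
    return evolution @ prior * dx

def filter_update(predicted, z):
    """Filtering: multiply by the likelihood p(z_k | x_k) and renormalize (Bayes)."""
    posterior = gauss(z, x, r) * predicted
    return posterior / (posterior.sum() * dx)

# One predict/filter cycle starting from a broad initial pdf ("initial ignorance").
p = gauss(x, 0.0, 25.0)
p_pred = predict(p)
p_filt = filter_update(p_pred, z=1.7)             # z = 1.7 is a made-up measurement
print("posterior mean ≈", (x * p_filt).sum() * dx)
```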

  8. Elements for situation pictures: tracks of temporally evolving objects
  Which object properties are of interest? → state $X_k$ at time $t_k$
  How to learn the states $X_k$? → from sensor data $Z^k = \{Z_k, Z^{k-1}\}$ and context
  How to deal with imprecise information? → e.g. by conditional pdfs $p(X_k \mid Z^k)$
  What does 'learning' mean in this context? → iterative calculation of $p(X_k \mid Z^k)$
  What is needed for this? → evolution / sensor models $p(X_k \mid X_{k-1})$, $p(Z_k \mid X_k)$
  How to initiate / terminate tracking processes? → sequential decisions

  9. Why is 'target tracking' a key function?
  Infer secondary quantities from incomplete measurement data.
  Eliminate the fluctuating false-return background (clutter).
  Create a time basis for classification from attribute data.
  …
  The 'shape' of objects / object groups is relevant in many applications.

  10. Track-based inference of object properties
  • Velocity history: vehicle, helicopter, plane
  • Acceleration history: threat: no under-wing weapons
  • Rare events: truck by night on a dirt road near a border
  • Object interrelations: resulting from formation, convoy
  • Object sources / sinks: classification by origin / designation
  • Classification: road-moving vehicle, 'on-road' → 'off-road'

  11. [Figure-only slide: no extractable text.]

  12. How to deal with probability density functions?
  • pdf $p(x)$: Extract probability statements about the RV $x$ by integration!
  • naïvely: positive and normalized functions ($p(x) \ge 0$, $\int dx\, p(x) = 1$)
  • conditional pdf $p(x \mid y) = \frac{p(x,y)}{p(y)}$: Impact of information on $y$ on the RV $x$?
  • marginal density $p(x) = \int dy\, p(x, y) = \int dy\, p(x \mid y)\, p(y)$: Enter $y$!
  • Bayes: $p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)} = \frac{p(y \mid x)\, p(x)}{\int dx\, p(y \mid x)\, p(x)}$: from $p(y \mid x)$ and $p(x)$ to $p(x \mid y)$!
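
  As a quick numerical illustration of these rules (not part of the slides; the joint distribution below is an arbitrary example), marginalization and Bayes' theorem can be verified on a small discrete joint distribution:

```python
import numpy as np

# Arbitrary example: joint probability table p(x, y) over two discrete RVs.
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.40]])          # rows: x, columns: y

p_x = p_xy.sum(axis=1)                   # marginal p(x) = sum_y p(x, y)
p_y = p_xy.sum(axis=0)                   # marginal p(y) = sum_x p(x, y)
p_x_given_y = p_xy / p_y                 # conditional p(x | y) = p(x, y) / p(y)
p_y_given_x = (p_xy.T / p_x).T           # conditional p(y | x) = p(x, y) / p(x)

# Bayes: p(x | y) = p(y | x) p(x) / sum_x p(y | x) p(x)
bayes = (p_y_given_x * p_x[:, None]) / (p_y_given_x * p_x[:, None]).sum(axis=0)
print(np.allclose(bayes, p_x_given_y))   # True: both routes give the same result
```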

  13. Recapitulation: The Multivariate Gaussian Pdf
  – wanted: probabilities 'concentrated' around a center $\bar{x}$
  – quadratic distance: $q(x) = \tfrac{1}{2}(x - \bar{x})^\top P^{-1}(x - \bar{x})$; $q(x)$ defines an ellipsoid around $\bar{x}$, its volume and orientation being determined by a matrix $P$ (symmetric: $P^\top = P$, positive definite: all eigenvalues $> 0$)
  – first attempt: $p(x) = e^{-q(x)} / \int dx\, e^{-q(x)}$ (normalized!)
  – $p(x) = \mathcal{N}(x;\, \bar{x},\, P) = \frac{1}{\sqrt{|2\pi P|}}\, e^{-\frac{1}{2}(x - \bar{x})^\top P^{-1}(x - \bar{x})}$
  – Gaussian mixtures: $p(x) = \sum_i p_i\, \mathcal{N}(x;\, \bar{x}_i,\, P_i)$ (weighted sums)
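
  A short sketch of how these densities can be evaluated in practice (Python/NumPy with SciPy; the specific means, covariances, and mixture weights are made-up examples, not from the lecture):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Example 2-D Gaussian N(x; x_bar, P); numbers are illustrative only.
x_bar = np.array([1.0, 2.0])
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])               # symmetric, positive definite
x = np.array([1.5, 1.5])

# Density via the closed-form expression ...
d = x - x_bar
pdf_manual = np.exp(-0.5 * d @ np.linalg.inv(P) @ d) / np.sqrt(np.linalg.det(2 * np.pi * P))
# ... and via SciPy; both agree.
pdf_scipy = multivariate_normal(mean=x_bar, cov=P).pdf(x)
print(np.isclose(pdf_manual, pdf_scipy))

# Gaussian mixture: weighted sum of Gaussian components, weights summing to one.
weights = [0.3, 0.7]
components = [multivariate_normal(mean=[0.0, 0.0], cov=np.eye(2)),
              multivariate_normal(mean=[3.0, 1.0], cov=0.5 * np.eye(2))]
mixture_pdf = sum(w * c.pdf(x) for w, c in zip(weights, components))
print(mixture_pdf)
```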

  14. [Figure: pdf at $t_{k-1}$, prediction at $t_k$.]
  Exploit imprecise knowledge of the dynamical behavior of the object:
  $$\underbrace{p(x_k \mid Z^{k-1})}_{\text{prediction}} \;=\; \int dx_{k-1}\; \underbrace{\mathcal{N}\!\left(x_k;\, F x_{k-1},\, D\right)}_{\text{dynamics}}\; \underbrace{\mathcal{N}\!\left(x_{k-1};\, x_{k-1|k-1},\, P_{k-1|k-1}\right)}_{\text{old knowledge}}$$
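
  The prediction integral can be sanity-checked by Monte Carlo sampling: drawing $x_{k-1}$ from the old knowledge and adding the process noise of the dynamics model reproduces the predicted mean $F\,x_{k-1|k-1}$ and covariance $F P_{k-1|k-1} F^\top + D$. The sketch below (Python/NumPy; the constant-velocity matrices and all numbers are illustrative choices, not from the lecture) shows this for a 1-D position/velocity state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D position/velocity model: x = (r, r_dot), time step T.
T = 1.0
F = np.array([[1.0, T],
              [0.0, 1.0]])                       # constant-velocity transition
D = 0.1 * np.array([[T**3 / 3, T**2 / 2],
                    [T**2 / 2, T]])              # process noise covariance
x_old = np.array([0.0, 1.0])                     # x_{k-1|k-1}
P_old = np.diag([1.0, 0.25])                     # P_{k-1|k-1}

# Sample from the old knowledge, propagate through the dynamics, add process noise.
samples = rng.multivariate_normal(x_old, P_old, size=200_000)
propagated = samples @ F.T + rng.multivariate_normal(np.zeros(2), D, size=200_000)

# Empirical moments agree with the analytic prediction.
print(propagated.mean(axis=0), "≈", F @ x_old)
print(np.cov(propagated.T), "≈", F @ P_old @ F.T + D)
```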

  15. A Useful Product Formula for Gaussians
  $$\mathcal{N}\!\left(z;\, Fx,\, D\right)\, \mathcal{N}\!\left(x;\, y,\, P\right) \;=\; \underbrace{\mathcal{N}\!\left(z;\, Fy,\, S\right)}_{\text{independent of } x}\, \mathcal{N}\!\left(x;\, y + W\nu,\; P - W S W^\top\right)$$
  with $\nu = z - Fy$, $\quad S = F P F^\top + D$, $\quad W = P F^\top S^{-1}$.
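
  The identity is easy to verify numerically at arbitrary points; the following sketch (Python/NumPy with SciPy, random test matrices, not from the lecture) evaluates both sides for a 2-D state and a 1-D 'measurement' matrix $F$:

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(1)

# Random but valid test quantities (illustrative only).
F = np.array([[1.0, 0.0]])                       # maps 2-D x to a 1-D z
D = np.array([[0.5]])
y = np.array([1.0, -1.0])
A = rng.normal(size=(2, 2)); P = A @ A.T + np.eye(2)   # positive definite
x = np.array([0.3, 0.7]); z = np.array([1.2])

# Quantities appearing on the right-hand side of the product formula.
S = F @ P @ F.T + D
W = P @ F.T @ np.linalg.inv(S)
nu = z - F @ y

lhs = mvn(mean=(F @ x).ravel(), cov=D).pdf(z) * mvn(mean=y, cov=P).pdf(x)
rhs = mvn(mean=(F @ y).ravel(), cov=S).pdf(z) * mvn(mean=y + W @ nu, cov=P - W @ S @ W.T).pdf(x)
print(np.isclose(lhs, rhs))                      # True
```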

  16. Kalman filter: $x_k = (r_k^\top, \dot{r}_k^\top)^\top$, $\quad Z^k = \{z_k, Z^{k-1}\}$

  Initiation: $p(x_0) = \mathcal{N}\!\left(x_0;\, x_{0|0},\, P_{0|0}\right)$, initial ignorance: $P_{0|0}$ 'large'

  Prediction (dynamics model $F_{k|k-1}$, $D_{k|k-1}$):
  $$\mathcal{N}\!\left(x_{k-1};\, x_{k-1|k-1},\, P_{k-1|k-1}\right) \;\longrightarrow\; \mathcal{N}\!\left(x_k;\, x_{k|k-1},\, P_{k|k-1}\right)$$
  $$x_{k|k-1} = F_{k|k-1}\, x_{k-1|k-1}, \qquad P_{k|k-1} = F_{k|k-1}\, P_{k-1|k-1}\, F_{k|k-1}^\top + D_{k|k-1}$$
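
  A compact sketch of these steps in code (Python/NumPy; the matrices and measurements are illustrative assumptions). The slide only shows initiation and prediction; the filtering step below is not from this slide but is included as a plausible continuation using the quantities $\nu$, $S$, $W$ from the product formula of slide 15:

```python
import numpy as np

def kf_predict(x_est, P_est, F, D):
    """Kalman prediction: x_{k|k-1} = F x_{k-1|k-1},  P_{k|k-1} = F P F^T + D."""
    return F @ x_est, F @ P_est @ F.T + D

def kf_filter(x_pred, P_pred, z, H, R):
    """Filtering step as suggested by the Gaussian product formula (slide 15):
    S = H P H^T + R,  W = P H^T S^{-1},  nu = z - H x_pred."""
    S = H @ P_pred @ H.T + R
    W = P_pred @ H.T @ np.linalg.inv(S)
    nu = z - H @ x_pred
    return x_pred + W @ nu, P_pred - W @ S @ W.T

# Illustrative 1-D position/velocity example, x_k = (r_k, r_dot_k).
T = 1.0
F = np.array([[1.0, T], [0.0, 1.0]])
D = 0.05 * np.array([[T**3 / 3, T**2 / 2], [T**2 / 2, T]])
H = np.array([[1.0, 0.0]])                    # the sensor measures position only
R = np.array([[0.8]])

x, P = np.zeros(2), np.diag([100.0, 100.0])   # initiation with 'large' P_{0|0}
for z in ([1.1], [2.0], [3.2]):               # made-up position measurements
    x, P = kf_predict(x, P, F, D)
    x, P = kf_filter(x, P, np.array(z), H, R)
print("estimate:", x)
```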

  17. [Figure: pdf at $t_{k-1}$; at $t_k$: no plot.]
  Missing sensor detection: 'data processing' = prediction (not always: exploitation of 'negative' sensor evidence).

  18. [Figure: pdf at $t_{k-1}$, pdf at $t_k$, prediction at $t_{k+1}$.]
  Missing sensor information: increasing knowledge dissipation.
