Introduction to Sensor Data Fusion Methods and Applications

1. Introduction to Sensor Data Fusion Methods and Applications
• Last lecture: Why Sensor Data Fusion? – Motivation, general context – Discussion of examples
• Today: Steep climb to a first algorithm.
• Oral examination: 6 credit points after the end of the semester
• Prerequisite: participate in the exercises, explain a good program
• Job opportunities as research assistant in ongoing projects, practicum
• Subsequently: bachelor thesis at Fraunhofer FKIE; master / PhD possible
• Slides/script: email to wolfgang.koch@fkie.fraunhofer.de, download
Sensor Data Fusion - Methods and Applications, 2nd Lecture on April 17, 2019

2. Sensor & Information Fusion: Basic Task
Information sources: defined by operational requirements

3. Sensor & Information Fusion: Basic Task
Information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory …

4. Sensor & Information Fusion: Basic Task
Information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory …
Information sources: defined by operational requirements

5. Create your own ground truth generator!
Consider an object that moves in two dimensions on the trajectory
  r(t) = ( x(t), y(t) )⊤ = A ( sin(ωt), sin(2ωt) )⊤,  with A = v²/q and ω = q/(2v),
and speed and acceleration parameters v = 300 m/s, q = 9 m/s².
Exercise 2.1
1. Plot the trajectory. Why is it periodic? What is its period T = T(v, q)?
2. Show for the velocity and acceleration vectors:
   ṙ(t) = v ( cos(ωt)/2, cos(2ωt) )⊤,   r̈(t) = −q ( sin(ωt)/4, sin(2ωt) )⊤
3. Calculate for each instant of time t the tangential and normal vectors in r(t):
   t(t) = ( ẋ(t), ẏ(t) )⊤ / |ṙ(t)|,   n(t) = ( −ẏ(t), ẋ(t) )⊤ / |ṙ(t)|
4. Plot |ṙ(t)|, |r̈(t)|, r̈(t)·t(t) and r̈(t)·n(t) over a period T!
5. Discuss the temporal behaviour based on the trajectory r(t)!
6. What are the maximum speeds and accelerations, v_max, q_max?
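The exercise can be checked numerically before plotting. The sketch below (pure Python, using the lecture's parameters v = 300 m/s and q = 9 m/s²) builds the ground truth generator and verifies the claimed velocity formula against a finite-difference derivative; the helper names are my own choices, not from the slides:

```python
import math

v, q = 300.0, 9.0      # speed and acceleration parameters from the exercise
A = v * v / q          # amplitude A = v^2 / q
w = q / (2.0 * v)      # angular frequency omega = q / (2 v)

def r(t):
    """Trajectory r(t) = A * (sin(w t), sin(2 w t))."""
    return (A * math.sin(w * t), A * math.sin(2 * w * t))

def rdot(t):
    """Claimed analytic velocity: v * (cos(w t) / 2, cos(2 w t))."""
    return (v * math.cos(w * t) / 2.0, v * math.cos(2 * w * t))

def numdiff(f, t, h=1e-4):
    """Central finite-difference derivative of a 2-D curve."""
    (x1, y1), (x0, y0) = f(t + h), f(t - h)
    return ((x1 - x0) / (2 * h), (y1 - y0) / (2 * h))

# Check the analytic velocity against the numerical derivative at a few times.
for t in (0.0, 10.0, 25.0):
    a, b = rdot(t), numdiff(r, t)
    assert abs(a[0] - b[0]) < 1e-3 and abs(a[1] - b[1]) < 1e-3

# Both components repeat after T = 2*pi/w, so the trajectory is periodic.
T = 2 * math.pi / w
```

Since sin(ωt) has period 2π/ω and sin(2ωt) has period π/ω, the full curve (a Lissajous figure-eight) repeats after T = 2π/ω = 4πv/q.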

6. pdf: t_{k−1}
'Probability density functions (pdf)' p(x_{k−1} | Z^{k−1}) represent imprecise knowledge on the 'state' x_{k−1} based on imprecise measurements Z^{k−1}.

7. Characterize an object by quantitatively describable properties: object state
Examples:
– object position x on a straight line: x ∈ ℝ
– kinematic state x = (r⊤, ṙ⊤, r̈⊤)⊤ ∈ ℝ⁹, with position r = (x, y, z)⊤, velocity ṙ, acceleration r̈
– joint state of two objects: x = (x₁⊤, x₂⊤)⊤
– kinematic state x and object extension X, e.g. an ellipsoid: a symmetric, positively definite matrix
– kinematic state x and object class, e.g. bird, sailing plane, helicopter, passenger jet, …
Learn unknown object states from imperfect measurements, and describe imprecise knowledge mathematically precisely by functions p(x)!

8. How to deal with probability density functions?
• pdf p(x): extract probability statements about the RV x by integration!
• naïvely: positive and normalized functions (p(x) ≥ 0, ∫ dx p(x) = 1)

9. pdf: t_{k−1} → prediction: t_k
Exploit imprecise knowledge on the dynamical behavior of the object:
  p(x_k | Z^{k−1}) = ∫ dx_{k−1} p(x_k | x_{k−1}) p(x_{k−1} | Z^{k−1})
  (prediction = dynamics × old knowledge, integrated over the old state)

10. How to deal with probability density functions?
• pdf p(x): extract probability statements about the RV x by integration!
• naïvely: positive and normalized functions (p(x) ≥ 0, ∫ dx p(x) = 1)
• conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
• marginal density p(x) = ∫ dy p(x, y) = ∫ dy p(x|y) p(y): enter y!
  p(x_k | Z^{k−1}) = ∫ dx_{k−1} p(x_k, x_{k−1} | Z^{k−1})
                   = ∫ dx_{k−1} p(x_k | x_{k−1}, Z^{k−1}) p(x_{k−1} | Z^{k−1})
                   = ∫ dx_{k−1} p(x_k | x_{k−1}) p(x_{k−1} | Z^{k−1})
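The marginalization step above can be made concrete with a toy discrete example: on a finite grid, the integral over x_{k−1} becomes a sum. The 5-cell grid, the prior, and the stay-or-step-right dynamics below are invented for illustration and are not from the lecture:

```python
# p(x_{k-1} | Z^{k-1}) on a 5-cell position grid (assumed numbers).
prior = [0.0, 0.1, 0.8, 0.1, 0.0]

def dynamics(j, i):
    """p(x_k = j | x_{k-1} = i): stay put or move one cell right, each with 0.5."""
    if j == i or j == i + 1:
        return 0.5
    return 0.0

# Prediction = marginalizing the joint over the old state:
# p(x_k = j | Z^{k-1}) = sum_i p(x_k = j | x_{k-1} = i) p(x_{k-1} = i | Z^{k-1})
n = len(prior)
predicted = [sum(dynamics(j, i) * prior[i] for i in range(n)) for j in range(n)]
```

The predicted density is still normalized, but its mass has drifted to the right and spread out, exactly the "knowledge dissipation" the following slides describe.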

11. pdf: t_{k−1} → t_k: no plot
Missing sensor detection: 'data processing' = prediction (not always: exploitation of 'negative' sensor evidence)

12. pdf: t_{k−1} → pdf: t_k → prediction: t_{k+1}
Missing sensor information: increasing knowledge dissipation

13. pdf: t_{k−1} → pdf: t_k → t_{k+1}: one plot
Sensor information on the kinematical object state

14. pdf: t_{k−1} → pdf: t_k → prediction: t_{k+1} → likelihood (sensor model)
Bayes' formula:
  p(x_{k+1} | Z^{k+1}) = p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k) / ∫ dx_{k+1} p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k)
  (new knowledge = plot likelihood × prediction, normalized)
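Bayes' formula can be illustrated on a discrete grid, where the normalizing integral becomes a sum. The Gaussian-shaped prediction, the measurement z = 6, and the unit-variance likelihood below are invented numbers, not from the slides:

```python
import math

grid = [float(i) for i in range(11)]                    # candidate positions 0..10

# Prediction p(x_{k+1} | Z^k): bell shape around x = 3 (assumed), normalized.
pred = [math.exp(-0.5 * (x - 3.0) ** 2) for x in grid]
s = sum(pred)
pred = [p / s for p in pred]

# Likelihood p(z_{k+1} | x_{k+1}): sensor reports z = 6 with std sigma = 1 (assumed).
z, sigma = 6.0, 1.0
lik = [math.exp(-0.5 * ((z - x) / sigma) ** 2) for x in grid]

# Bayes: posterior = likelihood * prediction / evidence.
evidence = sum(l * p for l, p in zip(lik, pred))
post = [l * p / evidence for l, p in zip(lik, pred)]
```

The posterior peak lies between the predicted position (3) and the measurement (6): the update is a compromise between old knowledge and new evidence.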

15. How to deal with probability density functions?
• pdf p(x): extract probability statements about the RV x by integration!
• naïvely: positive and normalized functions (p(x) ≥ 0, ∫ dx p(x) = 1)
• conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
• marginal density p(x) = ∫ dy p(x, y) = ∫ dy p(x|y) p(y): enter y!
• Bayes: p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫ dx p(y|x) p(x): obtain p(x|y) from p(y|x) and p(x)!
  since p(x|y) p(y) = p(x, y) = p(y, x) = p(y|x) p(x)

16. pdf: t_{k−1} → pdf: t_k → pdf: t_{k+1} (Bayes)
Filtering = sensor data processing

17. Target or Object Tracking: Basic Idea
Iterative updating of conditional probability densities!
Kinematic target state x_k at time t_k, accumulated sensor data Z^k.
A priori knowledge: target dynamics models, sensor model.
• prediction:   p(x_{k−1} | Z^{k−1}) → p(x_k | Z^{k−1})   (dynamics model)
• filtering:    p(x_k | Z^{k−1}) → p(x_k | Z^k)            (sensor data Z_k, sensor model)
• retrodiction: p(x_{l−1} | Z^k) ← p(x_l | Z^k)            (dynamics model, filtering output)
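To get a first feel for the prediction/filtering recursion, here is a minimal one-dimensional sketch. The random-walk dynamics, the direct position measurements, and all variances are assumptions for illustration, not the lecture's models:

```python
# Assumed models: x_k = x_{k-1} + process noise (variance d),
#                 z_k = x_k + measurement noise (variance r).
def predict(mean, var, d=1.0):
    """Prediction step: the mean persists, the uncertainty grows."""
    return mean, var + d

def filter_step(mean, var, z, r=1.0):
    """Filtering step: pull the prediction toward the measurement z."""
    k = var / (var + r)            # gain: how much to trust the new datum
    return mean + k * (z - mean), (1 - k) * var

mean, var = 0.0, 10.0              # vague initial knowledge p(x_0)
for z in [1.2, 0.9, 1.1, 1.0]:     # accumulated sensor data Z^k
    mean, var = predict(mean, var)
    mean, var = filter_step(mean, var, z)
```

After a few iterations the mean settles near the measurements and the variance shrinks well below its initial value, illustrating how the recursion accumulates evidence.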

18. pdf: t_{k−1} → prediction: t_k
Exploit imprecise knowledge on the dynamical behavior of the object:
  p(x_k | Z^{k−1}) = ∫ dx_{k−1} p(x_k | x_{k−1}) p(x_{k−1} | Z^{k−1})
  (prediction = dynamics × old knowledge, integrated over the old state)

19. The Multivariate Gaussian Pdf
– wanted: probabilities 'concentrated' around a center x̄
– quadratic distance: q(x) = ½ (x − x̄)⊤ P⁻¹ (x − x̄)
  q(x) defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P (symmetric: P⊤ = P, positively definite: all eigenvalues > 0).
– first attempt: p(x) = e^{−q(x)} / ∫ dx e^{−q(x)} (normalized!)
  p(x) = N(x; x̄, P) = e^{−½ (x − x̄)⊤ P⁻¹ (x − x̄)} / √|2πP|
– Gaussian mixtures: p(x) = Σᵢ pᵢ N(x; x̄ᵢ, Pᵢ) (weighted sums)
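As a sanity check, the density formula can be evaluated directly for a two-dimensional state, where the inverse and determinant of P are explicit. The helper `gauss2` and its hard-coded 2-D case are an illustrative assumption, not part of the slides:

```python
import math

def gauss2(x, mean, P):
    """Evaluate N(x; mean, P) for a 2-D state with 2x2 covariance P."""
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    inv = [[ P[1][1] / det, -P[0][1] / det],
           [-P[1][0] / det,  P[0][0] / det]]
    dx = [x[0] - mean[0], x[1] - mean[1]]
    # quadratic distance q(x) = 0.5 * dx' P^{-1} dx
    q = 0.5 * (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
               + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    # |2 pi P| = (2 pi)^2 * det(P) in two dimensions
    return math.exp(-q) / math.sqrt((2 * math.pi) ** 2 * det)
```

For P = I the value at the center is 1/(2π), the familiar height of a standard 2-D Gaussian.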

20. pdf: t_{k−1} → prediction: t_k
Exploit imprecise knowledge on the dynamical behavior of the object:
  p(x_k | Z^{k−1}) = ∫ dx_{k−1} N(x_k; F x_{k−1}, D) N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1})
  (prediction = dynamics × old knowledge, integrated over the old state)

21. A Useful Product Formula for Gaussians
  N(z; F x, D) N(x; y, P) = N(z; F y, S) N(x; y + W ν, P − W S W⊤)
with innovation ν = z − F y, S = F P F⊤ + D, W = P F⊤ S⁻¹.
The first factor on the right-hand side is independent of x.
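The product formula is easy to verify numerically in the scalar case, where F, D, y, P and z are plain numbers; the concrete values below are arbitrary choices for illustration:

```python
import math

def gauss(x, mean, var):
    """Scalar Gaussian density N(x; mean, var)."""
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

F, D = 2.0, 0.5        # 'matrix' F and noise variance D (assumed values)
y, P = 1.0, 1.5        # prior mean and variance (assumed values)
z = 2.2                # observed value (assumed)

nu = z - F * y         # innovation       nu = z - F y
S = F * P * F + D      # innovation var.  S  = F P F' + D
W = P * F / S          # gain             W  = P F' S^{-1}

# Both sides of the identity must agree at every x.
for x in (-1.0, 0.3, 1.7):
    lhs = gauss(z, F * x, D) * gauss(x, y, P)
    rhs = gauss(z, F * y, S) * gauss(x, y + W * nu, P - W * S * W)
    assert abs(lhs - rhs) < 1e-12
```

The factor N(z; Fy, S) carries all the x-independent normalization, which is exactly why this identity turns the Bayesian filtering step into the closed-form Kalman update.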
