
Topics in Brain Computer Interfaces - PowerPoint PPT Presentation



  1. Topics in Brain Computer Interfaces. CS295-7. Professor: Michael Black. TA: Frank Wood. Spring 2005. Probabilistic Inference.

  2. Reza Shadmehr

  3. Reza Shadmehr [figure panels: after 300 trials; forces used; "catch" trials to counter the field]

  4. Kinarm

  5. Kinarm project: • Connect Kinarm with TG2 game software (partially done). • Calibrate Kinarm (partially done). • Develop a high-level way of programming different force fields (not done). • Develop a basic set of fields.

  6. Encoding Review. Moran & Schwartz ('99): $z_k = h_0 + \mathrm{speed}_k\,(h_x \cos\theta_k + h_y \sin\theta_k) = h_0 + h_x v_{x,k} + h_y v_{y,k}$, i.e., linear in velocity, since $v_{x,k} = \mathrm{speed}_k \cos\theta_k$ and $v_{y,k} = \mathrm{speed}_k \sin\theta_k$.
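
A minimal numeric sketch of this tuning model; the coefficient values h0, hx, hy are made up for illustration, not fitted to data:

```python
import numpy as np

# Moran & Schwartz-style tuning: z = h0 + speed * (hx*cos(theta) + hy*sin(theta)).
# h0, hx, hy are made-up illustration values.
h0, hx, hy = 10.0, 4.0, 2.0

theta = np.pi / 3   # movement direction (rad)
speed = 0.5         # hand speed

# Direction/speed form of the model
z_dir = h0 + speed * (hx * np.cos(theta) + hy * np.sin(theta))

# Equivalent linear-in-velocity form, since vx = speed*cos(theta), vy = speed*sin(theta)
vx, vy = speed * np.cos(theta), speed * np.sin(theta)
z_vel = h0 + hx * vx + hy * vy

assert np.isclose(z_dir, z_vel)  # the two parameterizations agree
```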

  7. Encoding Review. Kettner et al. ('88): $z_k = b_0 + b_x x_k + b_y y_k$ (linear in position). Flament et al. ('88): firing rate is also related to hand acceleration.

  8. Notation. Firing rates of $n$ cells at time $k$: $\vec{z}_k = [z_{1,k}, z_{2,k}, \ldots, z_{n,k}]^T$. Hand kinematics at time $k$: $\vec{x}_k = [x_k, y_k, v_{x,k}, v_{y,k}, a_{x,k}, a_{y,k}]^T$ (position, velocity, acceleration). Histories: $Z_k = (\vec{z}_k, \vec{z}_{k-1}, \ldots, \vec{z}_1)$ and $X_k = (\vec{x}_k, \vec{x}_{k-1}, \ldots, \vec{x}_1)$.

  9. Decoding Methods. Direct decoding methods: $\vec{x}_k = f(\vec{z}_k, \vec{z}_{k-1}, \ldots)$. Simple linear regression method: $x_k = f_1^T Z_{k:k-d}$ and $y_k = f_2^T Z_{k:k-d}$, where $Z_{k:k-d}$ stacks the firing rates from the last $d$ time bins.
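
A sketch of the simple linear regression decoder on synthetic data; the sizes, the synthetic firing rates, and the exact layout of the stacked design matrix are illustrative assumptions, not taken from the course data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, n_steps, d = 5, 200, 3          # illustrative sizes

z = rng.normal(size=(n_steps, n_cells))  # synthetic firing rates
true_w = rng.normal(size=n_cells)
x = z @ true_w + 0.1 * rng.normal(size=n_steps)   # synthetic x-position

# Design matrix: row k stacks the last d time bins of all cells, plus a bias term.
rows = np.array([z[k - d + 1:k + 1].ravel() for k in range(d - 1, n_steps)])
Z = np.column_stack([np.ones(len(rows)), rows])

# Least-squares fit of the linear filter f1: x_k ≈ f1^T [1, z_{k-d+1}, ..., z_k]
f1, *_ = np.linalg.lstsq(Z, x[d - 1:], rcond=None)
x_hat = Z @ f1                            # decoded x-positions
```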

  10. Decoding Methods. Direct decoding methods, $\vec{x}_k = f(\vec{z}_k, \vec{z}_{k-1}, \ldots)$, stand in contrast to generative encoding models, $\vec{z}_k = f(\vec{x}_k)$. We need a sound way to exploit generative models for decoding.

  11. Uncertainty. Typically there will be uncertainty in our models: $\vec{z}_k = f(\vec{x}_k) + \text{noise}$. This defines the likelihood of the observations given the state: $p(\vec{z}_k | \vec{x}_k)$. But we want to estimate something about the state $x$ given noisy measurements $z$: $p(\vec{x}_t | Z_{t:t-j}) = p(\vec{x}_t | \vec{z}_t, \ldots, \vec{z}_{t-j})$.

  12. Probability Review. Let $X$ be a random variable that can take on one of the discrete values $X \in [a_1, \ldots, a_7]$. [Figure: histogram of probabilities over $a_1, \ldots, a_7$]

  13. Basic facts. Write $p(X = a_i)$, or just $p(a_i)$. Then $0 \le p(X = a_i) \le 1$ and $\sum_{i=1}^{7} p(a_i) = 1$.

  14. Basic facts. Expected value (expectation) of a random variable: $\mu = E[x] = \sum_x p(x)\,x$. Variance: $\sigma^2 = \mathrm{var}[x] = E[(x - E(x))^2] = \sum_x p(x)\,(x - \mu)^2$.
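
A quick numeric check of these definitions on a made-up discrete distribution over seven values:

```python
import numpy as np

# A discrete distribution over values a_1..a_7 (the probabilities are invented).
a = np.array([1., 2., 3., 4., 5., 6., 7.])
p = np.array([.05, .10, .20, .30, .20, .10, .05])   # sums to 1

mu = np.sum(p * a)                 # E[x]
var = np.sum(p * (a - mu) ** 2)    # E[(x - E[x])^2]
print(mu, var)                     # -> 4.0, 2.1
```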

  15. Joint Probability. $p(X_1 = a_{1,i}, X_2 = a_{2,j}) = p(a_{1,i}, a_{2,j})$, with $\sum_{a_{1,i}} \sum_{a_{2,j}} p(a_{1,i}, a_{2,j}) = 1$. Statistical independence: $p(x, y) = p(x)\,p(y)$; knowing $y$ tells you nothing about $x$.

  16. Conditional Probability. Dependence: knowing the value of one random variable tells us something about the other. $p(A|B) = \frac{p(A,B)}{p(B)}$, so $p(A,B) = p(A|B)\,p(B)$.

  17. Statistical Independence. If $A$ and $B$ are statistically independent, is $p(A|B) = \frac{p(A,B)}{p(B)} = \frac{p(A)\,p(B)}{p(B)} = p(A)$?

  18. Statistical Independence. $p(A|B) = \frac{p(A,B)}{p(B)} = \frac{p(A)\,p(B)}{p(B)} = p(A)$. $A$ and $B$ are statistically independent if and only if $p(A,B) = p(A|B)\,p(B) = p(A)\,p(B)$, i.e., $p(A|B) = p(A)$ and $p(B|A) = p(B)$.
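
A small NumPy check of this equivalence, using a made-up joint table for binary variables that factorizes by construction:

```python
import numpy as np

# A joint table p(A,B) built to satisfy p(A,B) = p(A) p(B).
p_A = np.array([0.3, 0.7])
p_B = np.array([0.6, 0.4])
p_AB = np.outer(p_A, p_B)                  # independent by construction

# p(A|B) = p(A,B)/p(B) equals p(A) for every value of B
p_A_given_B = p_AB / p_B                   # divide each column by p(B)
assert np.allclose(p_A_given_B, p_A[:, None])
```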

  19. Conditional Independence. $A$ is independent of $B$, conditioned on $C$: $p(A,B,C) = p(A,B|C)\,p(C) = p(A|C)\,p(B|C)\,p(C)$. If I know $C$, then knowing $B$ doesn't give me any more information about $A$. This does not mean that $A$ and $B$ are statistically independent.

  20. More generally, $p(A,B|C) = p(A|B,C)\,p(B|C)$. Marginalizing over a random variable: $p(A|C) = \sum_B p(A,B|C) = \sum_B p(A|B,C)\,p(B|C)$.
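
A minimal sketch of marginalizing out $B$, with a made-up conditional joint table for binary $A$ and $B$ at a fixed value of $C$:

```python
import numpy as np

# Invented joint table p(A, B | C); rows index A, columns index B.
p_AB_given_C = np.array([[0.3, 0.1],
                         [0.2, 0.4]])

# Marginalize out B: p(A|C) = sum_B p(A, B | C)
p_A_given_C = p_AB_given_C.sum(axis=1)     # -> [0.4, 0.6]

# Same result via the factorization p(A,B|C) = p(A|B,C) p(B|C)
p_B_given_C = p_AB_given_C.sum(axis=0)
p_A_given_BC = p_AB_given_C / p_B_given_C  # normalize each column
assert np.allclose((p_A_given_BC * p_B_given_C).sum(axis=1), p_A_given_C)
```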

  21. Bayes' Theorem. $p(A,B) = p(A|B)\,p(B) = p(B|A)\,p(A)$, hence $p(A|B) = \frac{p(B|A)\,p(A)}{p(B)}$. (Revd. Thomas Bayes, 1701-1761)
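
A toy numeric instance of Bayes' rule; all probabilities here are invented for illustration:

```python
p_A = 0.3                      # prior p(A)
p_B_given_A = 0.8              # likelihood p(B|A)
p_B_given_notA = 0.2

p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)   # marginalize: p(B)
p_A_given_B = p_B_given_A * p_A / p_B                  # Bayes' rule
print(p_A_given_B)             # 0.24 / 0.38 ≈ 0.632
```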

  22. Bayesian Inference. $p(\text{kinematics} \,|\, \text{firing}) = \frac{p(\text{firing} \,|\, \text{kinematics})\, p(\text{kinematics})}{p(\text{firing})}$. The left side is the posterior (a posteriori probability, after the evidence); $p(\text{firing}|\text{kinematics})$ is the likelihood (evidence); $p(\text{kinematics})$ is the prior (a priori, before the evidence); $p(\text{firing})$ is a normalization constant (independent of kinematics). We infer hand kinematics from uncertain evidence and our prior knowledge of how hands move.

  23. Generative Model. Encoding: $\vec{z}_k = f(\vec{x}_k) + \vec{q}_k$ (1), where $f$ may be linear or non-linear and the noise is, e.g., Normal or Poisson; state dynamics: $\vec{x}_k = f_2(\vec{x}_{k-1}) + \vec{w}_k$ (2). Here $\vec{z}_k$ is the neural firing rate of $N = 42$ cells in $M = 70$ ms bins, and $\vec{x}_k$ is the behavior (e.g., hand position, velocity, acceleration).
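
A minimal simulation of a linear-Gaussian instance of equations (1) and (2); only the 42 cells and the 6-d state come from the slide, while the dynamics matrix A, observation matrix H, and noise scales are made-up illustration values:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, dim, n_steps = 42, 6, 100   # 42 cells, 6-d kinematic state

# Linear instances of the two model equations (illustrative parameters).
A = 0.99 * np.eye(dim)               # state dynamics: x_k = A x_{k-1} + w_k
H = rng.normal(size=(n_cells, dim))  # observation model: z_k = H x_k + q_k

x = np.zeros(dim)
for k in range(n_steps):
    x = A @ x + 0.1 * rng.normal(size=dim)        # eq. (2): kinematics evolve
    z = H @ x + 0.5 * rng.normal(size=n_cells)    # eq. (1): noisy firing rates
```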

  24. [Figure: firing data for "cell 8" and "cell 18"]

  25. $\vec{z}_t = H \vec{x}_t + \text{noise}$ [figure]

  26. $\vec{z}_t = H \vec{x}_t + \text{noise}$, written out componentwise: $\begin{bmatrix} z_{1,t} \\ z_{2,t} \\ \vdots \\ z_{n,t} \end{bmatrix} = \begin{bmatrix} h_{1,x} & h_{1,y} & \cdots & h_{1,a_y} \\ h_{2,x} & h_{2,y} & \cdots & h_{2,a_y} \\ \vdots & & & \vdots \\ h_{n,x} & h_{n,y} & \cdots & h_{n,a_y} \end{bmatrix} \begin{bmatrix} x_t \\ y_t \\ \vdots \\ a_{y,t} \end{bmatrix} + \begin{bmatrix} \eta_1 \\ \eta_2 \\ \vdots \\ \eta_n \end{bmatrix}$

  27. Generative Model. Observation equation: $\vec{z}_{k-j} = H \vec{x}_k + \vec{q}_{k-j}$. Here $\vec{x}_k = (x_k, y_k, v_{x,k}, v_{y,k}, a_{x,k}, a_{y,k})^T$ is the system state vector, $\vec{z}_{k-j} = (z^1_{k-j}, \ldots, z^{42}_{k-j})^T$ is the firing-rate vector (square-root transformed, zero mean), $\vec{q}$ is zero-mean noise, and $H$ is a $42 \times 6$ matrix.

  28. Assumption. Gaussian distribution: $\vec{z}_k \sim N(H\vec{x}_k, Q)$, i.e., $\vec{z}_k - H\vec{x}_k = \vec{q}_k \sim N(\vec{0}, Q)$. Recall the 1-d case: $p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-(x-\mu)^2 / (2\sigma^2)\right)$.

  29. Gaussian. For a single cell: $p(z_{i,t} | \vec{x}_t) = \frac{1}{\sqrt{2\pi}\,\sigma_i} \exp\!\left(-\bigl(z_{i,t} - (h_{i,x} x_t + h_{i,y} y_t + \cdots + h_{i,a_y} a_{y,t})\bigr)^2 / (2\sigma_i^2)\right)$. What about multiple cells? If the firing rates are conditionally independent: $p(z_{1,t}, z_{2,t}, \ldots, z_{n,t} | \vec{x}_t) = \prod_{i=1}^{n} p(z_{i,t} | \vec{x}_t)$. If we know $\vec{x}_t$, then the firing rates of the other cells tell us nothing more about $z_{i,t}$.
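
A sketch of this conditionally independent likelihood: the joint likelihood is the product of the per-cell Gaussian terms. The tuning weights, noise scale, and sizes are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n_cells, dim = 5, 6                      # small illustrative sizes

H = rng.normal(size=(n_cells, dim))      # rows are the per-cell tuning weights h_i
sigma = 0.5                              # shared noise std (invented)
x_t = rng.normal(size=dim)               # a hypothesized kinematic state
z_t = H @ x_t + sigma * rng.normal(size=n_cells)   # observed firing rates

# Per-cell Gaussian likelihoods p(z_{i,t} | x_t)
resid = z_t - H @ x_t
per_cell = np.exp(-resid**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# Conditional independence: the joint likelihood is the product over cells
joint = np.prod(per_cell)
print(joint)
```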

  30. Covariance. First moment: $\vec{\mu} = E[\vec{x}] = [\mu_x, \mu_y]^T$. Second (central) moments: $\sigma_x^2 = \mathrm{var}[x] = E[(x-\mu_x)^2] = \sum_x \sum_y (x-\mu_x)^2\, p(x,y)$ and $\sigma_{xy} = E[(x-\mu_x)(y-\mu_y)] = \sum_x \sum_y (x-\mu_x)(y-\mu_y)\, p(x,y)$. Collected into the covariance matrix: $C = E[(\vec{x}-\vec{\mu})(\vec{x}-\vec{\mu})^T] = \begin{bmatrix} \sigma_{xx} & \sigma_{xy} \\ \sigma_{yx} & \sigma_{yy} \end{bmatrix}$.
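
A quick empirical check of these moment definitions, estimating the mean and covariance matrix from samples; the covariance used to generate the samples is invented:

```python
import numpy as np

rng = np.random.default_rng(3)
# Correlated 2-d samples (made-up covariance for illustration)
xy = rng.multivariate_normal([0., 0.], [[1.0, 0.6], [0.6, 2.0]], size=10_000)

mu = xy.mean(axis=0)              # first moments [mu_x, mu_y]
diff = xy - mu
C = diff.T @ diff / len(xy)       # second moments: E[(x-mu)(x-mu)^T]
# C approximates [[sigma_xx, sigma_xy], [sigma_yx, sigma_yy]]
assert np.allclose(C, np.cov(xy.T, bias=True))
```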

  31. Covariance. Multivariate Gaussian (Normal): $p(\vec{x}) = \frac{1}{(2\pi)^{D/2}\,|C|^{1/2}} \exp\!\left(-\frac{1}{2}(\vec{x}-\vec{\mu})^T C^{-1} (\vec{x}-\vec{\mu})\right)$, where $(\vec{x}-\vec{\mu})^T C^{-1} (\vec{x}-\vec{\mu}) = \Delta^2$ is the squared Mahalanobis distance.
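
A small sketch evaluating this density via the Mahalanobis distance; the example mean and covariance are invented:

```python
import numpy as np

def gaussian_pdf(x, mu, C):
    """Multivariate normal density, computed via the Mahalanobis distance."""
    D = len(mu)
    diff = x - mu
    maha_sq = diff @ np.linalg.solve(C, diff)    # (x-mu)^T C^{-1} (x-mu)
    norm = (2 * np.pi) ** (D / 2) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * maha_sq) / norm

mu = np.array([0.0, 0.0])
C = np.array([[1.0, 0.5], [0.5, 2.0]])
print(gaussian_pdf(np.array([1.0, -0.5]), mu, C))
```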
