
Probability Theory Intro, Jonathan Pillow, Mathematical Tools for Neuroscience (NEU 314) (PowerPoint PPT presentation)



  1. Probability Theory Intro. Jonathan Pillow, Mathematical Tools for Neuroscience (NEU 314), Spring 2016, lecture 12

  2. neural coding problem: stimuli → spike trains • what is the probabilistic relationship between stimuli and spike trains?

  3. neural coding problem: stimuli → spike trains, via a “codebook” • what is the probabilistic relationship between stimuli and spike trains?

  4. neural coding problem: “encoding”: given the “codebook”, what response will a novel stimulus evoke? (Alex Piet, Cosyne 2016)

  5. neural coding problem: “decoding”: given the “codebook” and an observed response, which stimulus produced it (“who was that?”)? Bayes’ Rule: posterior ∝ likelihood × prior, i.e., P(stimulus | response) = P(response | stimulus) P(stimulus) / P(response)
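Bayes’ rule for decoding can be sketched numerically. The two candidate stimuli (“face”, “house”) and all probabilities below are made-up illustrative numbers, not values from the lecture:

```python
# Minimal sketch of Bayes' rule for decoding (hypothetical numbers):
# posterior ∝ likelihood × prior, normalized over candidate stimuli.
prior = {"face": 0.5, "house": 0.5}        # P(stimulus), assumed
likelihood = {"face": 0.8, "house": 0.1}   # P(spikes | stimulus), assumed

unnorm = {s: likelihood[s] * prior[s] for s in prior}
evidence = sum(unnorm.values())            # P(spikes), the normalizer
posterior = {s: unnorm[s] / evidence for s in unnorm}

print(posterior)  # "face" dominates: ~0.889 vs ~0.111
```

The normalizer (the evidence) is exactly the marginal distribution of the response, which slides 18–19 return to.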

  6. Goals for today • basics of probability • probability vs. statistics • continuous & discrete distributions • joint distributions • marginalization • conditionalization • Bayes’ rule (prior, likelihood, posterior)

  7. vocabulary: “probability distribution”, “events”, “random variables”; diagram: model (parameter) → samples

  8. vocabulary: “probability distribution”, “events”, “random variables”; diagram: model (parameter, in parameter space) → samples (in sample space)

  9. vocabulary (continued); examples: 1. coin flipping: X = “H” or “T”; 2. spike counts: parameter = mean spike rate

  10. vocabulary (continued); example: 3. reaction times: X ∈ positive reals; parameter = mean reaction time

  11. Probability vs. Statistics: probability runs the model forward, parameter → samples. Example: coin flipping: T, T, H, T, H, T, T, T, T, …

  12. Probability vs. Statistics: statistics (“inverse probability”) runs it backward, samples → parameter. Example: T, T, H, T, H, T, T, T, T, H, T, H, T, H, H, T, T → ? (parameter)
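The probability-vs-statistics contrast can be sketched in a few lines; the true P(heads) = 0.3 below is an assumed value for illustration:

```python
import random
random.seed(0)

# Probability: model (known parameter) -> samples.
p_heads = 0.3  # assumed true parameter
flips = ["H" if random.random() < p_heads else "T" for _ in range(10000)]

# Statistics ("inverse probability"): samples -> parameter estimate.
p_hat = flips.count("H") / len(flips)
print(p_hat)  # close to 0.3
```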

  13. discrete probability distribution: takes a finite (or countably infinite) number of values (e.g., the integers); probability mass function (pmf) P(x): • positive: P(x) ≥ 0 for all x • sums to 1: Σ_x P(x) = 1
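A quick numerical check of the two pmf properties, using the Poisson distribution (from slide 15) as an example; λ = 4 is an arbitrary choice:

```python
import math

# Poisson pmf, a discrete distribution over k = 0, 1, 2, ...
lam = 4.0  # arbitrary rate parameter for illustration
def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

# Check the two pmf properties on a (truncated) support:
probs = [poisson_pmf(k, lam) for k in range(100)]
assert all(p >= 0 for p in probs)  # positive
print(sum(probs))                  # sums to 1 (up to tiny truncation error)
```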

  14. continuous probability distribution: takes values in a continuous space (e.g., the real line); probability density function (pdf) p(x): • positive: p(x) ≥ 0 • integrates to 1: ∫ p(x) dx = 1

  15. some friendly neighborhood probability distributions (discrete):
     • Bernoulli: coin flipping
     • binomial: P(k; n, p) = (n choose k) p^k (1 − p)^(n − k), the sum of n coin flips
     • Poisson: P(k; λ) = λ^k e^(−λ) / k!, the sum of n coin flips with P(heads) = λ/n, in the limit n → ∞
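The binomial-to-Poisson limit stated on the slide can be verified numerically; λ = 2 and k = 3 are arbitrary choices:

```python
import math

def binom_pmf(k, n, p):
    # (n choose k) p^k (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # lam^k e^{-lam} / k!
    return lam**k * math.exp(-lam) / math.factorial(k)

lam, k = 2.0, 3
for n in (10, 100, 10000):
    print(n, binom_pmf(k, n, lam / n))  # approaches the Poisson value
print(poisson_pmf(k, lam))
```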

  16. some friendly neighborhood probability distributions (continuous):
     • Gaussian: P(x; μ, σ) = (1 / (√(2π) σ)) exp(−(x − μ)² / (2σ²))
     • multivariate Gaussian: P(x; μ, Λ) = (2π)^(−n/2) |Λ|^(−1/2) exp(−½ (x − μ)ᵀ Λ⁻¹ (x − μ))
     • exponential: P(x; a) = a e^(−a x)
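A small sampling sketch for the (univariate) Gaussian: with assumed parameters μ = 1 and σ = 2, the sample mean and standard deviation recover the parameters for large sample sizes:

```python
import random, math
random.seed(0)

# Draw samples from a Gaussian with assumed mu=1, sigma=2 and check that
# the sample mean/std recover the parameters (law-of-large-numbers sketch).
mu, sigma = 1.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(100000)]
mean = sum(xs) / len(xs)
std = math.sqrt(sum((x - mean)**2 for x in xs) / len(xs))
print(mean, std)  # close to 1.0 and 2.0
```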

  17. joint distribution P(x, y): • positive • sums to 1 [figure: samples from a 2D joint distribution]

  18. marginalization (“integration”): P(x) = ∫ P(x, y) dy [figure: joint distribution with y integrated out]

  19. marginalization (“integration”), continued: P(y) = ∫ P(x, y) dx [figure: joint distribution with x integrated out]
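For a discrete joint, the integral becomes a sum over the other variable. A sketch on a toy table (the weather/umbrella entries are made up for illustration):

```python
# Marginalization on a small discrete joint P(x, y): sum over the other variable.
joint = {
    ("rain", "umbrella"): 0.3, ("rain", "none"): 0.1,  # assumed numbers
    ("sun",  "umbrella"): 0.1, ("sun",  "none"): 0.5,
}

p_x = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p  # P(x) = sum_y P(x, y)

print(p_x)  # P(rain) ≈ 0.4, P(sun) ≈ 0.6
```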

  20. conditionalization (“slicing”): P(x | y) = P(x, y) / P(y) (“joint divided by marginal”) [figure: slice of the joint at a fixed y]
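Conditionalization on the same kind of toy table: divide the slice of the joint by the marginal of the conditioning variable (numbers are made up for illustration):

```python
# Conditionalization ("slicing"): P(x | y) = P(x, y) / P(y).
joint = {
    ("rain", "umbrella"): 0.3, ("rain", "none"): 0.1,  # assumed numbers
    ("sun",  "umbrella"): 0.1, ("sun",  "none"): 0.5,
}

y = "umbrella"
p_y = sum(p for (x, yy), p in joint.items() if yy == y)        # marginal P(y)
cond = {x: p / p_y for (x, yy), p in joint.items() if yy == y}  # joint / marginal

print(cond)  # P(rain | umbrella) ≈ 0.75, P(sun | umbrella) ≈ 0.25
```

Dividing by the marginal is what renormalizes the slice so the conditional sums to 1.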
