Probability Theory Intro
Jonathan Pillow
Mathematical Tools for Neuroscience (NEU 314)
Spring 2016, lecture 12
neural coding problem: stimuli ↔ spike trains
• what is the probabilistic relationship between stimuli and spike trains?
• this relationship is the “codebook”
neural coding problem: “encoding”
• given the “codebook”, what spike train will a novel stimulus evoke?
(Alex Piet, Cosyne 2016)
neural coding problem: “decoding”
• given the “codebook” and an observed spike train, which stimulus occurred? (“who was that?”)
• Bayes’ Rule: P(stimulus | spikes) = P(spikes | stimulus) P(stimulus) / P(spikes), i.e., posterior ∝ likelihood × prior
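A minimal numeric sketch of Bayes’ rule for decoding (the two-stimulus setup, prior, and likelihood values below are hypothetical, chosen only to illustrate the computation):

```python
# Bayes' rule: posterior = likelihood × prior / evidence
# Hypothetical decoding problem: which of two stimuli produced an
# observed spike train r?
prior = {"stim_A": 0.5, "stim_B": 0.5}         # P(stimulus), assumed flat
likelihood = {"stim_A": 0.10, "stim_B": 0.02}  # P(r | stimulus), made up

unnorm = {s: likelihood[s] * prior[s] for s in prior}
evidence = sum(unnorm.values())                # P(r), the normalizer
posterior = {s: unnorm[s] / evidence for s in unnorm}
print(posterior)                               # stim_A: ~0.83, stim_B: ~0.17
```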
Goals for today
• basics of probability
• probability vs. statistics
• continuous & discrete distributions
• joint distributions
• marginalization
• conditionalization
• Bayes’ rule (prior, likelihood, posterior)
terminology
• “probability distribution”, “events”, “random variables”
• a model maps a parameter (living in parameter space) to samples (living in sample space)

examples
1. coin flipping: sample space X ∈ {“H”, “T”}; parameter: P(heads)
2. spike counts: sample space X ∈ {0, 1, 2, …}; parameter: mean spike rate
3. reaction times: sample space X ∈ positive reals; parameter: mean reaction time
Probability vs. Statistics
• probability: model (parameter) → samples
  e.g., coin flipping: given P(heads), generate T, T, H, T, H, T, T, T, T, …
• statistics (“inverse probability”): samples → model
  e.g., given T, T, H, T, H, T, T, T, T, H, T, H, T, H, H, T, T, infer P(heads)
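Both directions in a few lines of Python: the forward (probability) direction simulates flips from a known P(heads), and the inverse (statistics) direction estimates it back from the samples; p_true = 0.3 is an arbitrary illustrative value:

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3                          # model parameter (arbitrary choice)

# probability: parameter -> samples
flips = rng.random(1000) < p_true     # True = "H", False = "T"

# statistics ("inverse probability"): samples -> parameter
p_hat = flips.mean()                  # maximum-likelihood estimate of P(heads)
print(p_true, p_hat)                  # p_hat should land near 0.3
```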
discrete probability distribution
• takes a finite (or countably infinite) number of values, e.g., x ∈ {x_1, x_2, …}
• probability mass function (pmf): P(x)
• positive: P(x) ≥ 0
• sums to 1: Σ_x P(x) = 1
continuous probability distribution
• takes values in a continuous space, e.g., x ∈ ℝ
• probability density function (pdf): p(x)
• positive: p(x) ≥ 0
• integrates to 1: ∫ p(x) dx = 1
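A quick numerical check of both normalization conditions, using a Poisson pmf and a Gaussian pdf as stand-ins (scipy is assumed available; any discrete/continuous pair would do):

```python
import numpy as np
from scipy import stats

# discrete: pmf values are positive and sum to 1
k = np.arange(100)
pmf = stats.poisson.pmf(k, mu=5.0)
print(pmf.min() >= 0, pmf.sum())            # True, ~1.0

# continuous: pdf values are positive and integrate to 1
x = np.linspace(-10, 10, 10001)
pdf = stats.norm.pdf(x, loc=0.0, scale=1.0)
print(pdf.min() >= 0, np.trapz(pdf, x))     # True, ~1.0
```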
some friendly neighborhood probability distributions

Discrete
• Bernoulli: coin flipping, P(X = “H”) = p
• binomial: sum of n coin flips, P(k; n, p) = (n choose k) p^k (1 − p)^(n−k)
• Poisson: P(k; λ) = λ^k e^(−λ) / k!, the limit of a binomial with P(heads) = λ/n as n → ∞
Continuous
• Gaussian: P(x; μ, σ) = (1 / √(2πσ²)) exp(−(x − μ)² / (2σ²))
• multivariate Gaussian: P(x; μ, Λ) = (2π)^(−n/2) |Λ|^(−1/2) exp(−½ (x − μ)ᵀ Λ⁻¹ (x − μ))
• exponential: P(x; a) = a e^(−ax)
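All of these are available in scipy.stats; a sketch checking the formulas against the library, including the binomial-to-Poisson limit (parameter values are arbitrary):

```python
import numpy as np
from scipy import stats

# Gaussian pdf written out vs. the library call
x, mu, sigma = 1.0, 0.0, 2.0
by_hand = np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
print(by_hand, stats.norm.pdf(x, loc=mu, scale=sigma))

# Poisson as the limit of a binomial with P(heads) = lam/n, n large
lam, n = 4.0, 100_000
print(stats.binom.pmf(3, n=n, p=lam/n))     # ~ the Poisson value
print(stats.poisson.pmf(3, mu=lam))
```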
joint distribution P(x, y)
• positive: P(x, y) ≥ 0
• sums to 1: Σ_x Σ_y P(x, y) = 1 (or integrates to 1, if continuous)
[figure: contour plot of a 2-D joint distribution over x, y ∈ [−3, 3]]
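A sketch of the pictured joint as a discrete grid: evaluate a 2-D Gaussian density on x, y ∈ [−3, 3] and renormalize so it sums exactly to 1 (the mean and covariance below are arbitrary stand-ins for whatever generated the figure):

```python
import numpy as np
from scipy import stats

xs = np.linspace(-3, 3, 61)
X, Y = np.meshgrid(xs, xs)
density = stats.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]]).pdf(np.dstack([X, Y]))
P = density / density.sum()        # discrete joint: positive, sums to 1
print(P.min() >= 0, P.sum())       # True, 1.0
```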
marginalization (“integration”)
• P(x) = Σ_y P(x, y) (sum, or integrate, the joint over the other variable)
[figure: joint distribution with its marginals along each axis, x, y ∈ [−3, 3]]
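On the same discrete grid, marginalizing is just summing the joint over the other variable (the joint P is rebuilt here so the snippet runs on its own):

```python
import numpy as np
from scipy import stats

xs = np.linspace(-3, 3, 61)
X, Y = np.meshgrid(xs, xs)
density = stats.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]]).pdf(np.dstack([X, Y]))
P = density / density.sum()        # discrete joint P(x, y)

P_x = P.sum(axis=0)                # marginal P(x): sum over y (rows)
P_y = P.sum(axis=1)                # marginal P(y): sum over x (columns)
print(P_x.sum(), P_y.sum())        # both 1.0
```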
conditionalization (“slicing”)
• P(y | x) = P(x, y) / P(x) (“joint divided by marginal”)
[figure: a vertical slice of the joint at fixed x, renormalized, x, y ∈ [−3, 3]]
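And conditionalizing: take the slice of the joint at a fixed x and divide by the marginal P(x), which renormalizes the slice into a proper distribution over y:

```python
import numpy as np
from scipy import stats

xs = np.linspace(-3, 3, 61)
X, Y = np.meshgrid(xs, xs)
density = stats.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]]).pdf(np.dstack([X, Y]))
P = density / density.sum()        # discrete joint P(x, y)

i = np.searchsorted(xs, 1.0)       # condition on x ≈ 1
P_x = P.sum(axis=0)                # marginal P(x)
P_y_given_x = P[:, i] / P_x[i]     # "joint divided by marginal"
print(P_y_given_x.sum())           # 1.0: a valid distribution over y
```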