Sequential Monte Carlo Methods for Bayesian Model Selection in Positron Emission Tomography

Yan Zhou, John A. D. Aston and Adam M. Johansen

6th January 2014
Outline

PET compartmental models
◮ Positron emission tomography (PET)
◮ Linear compartmental models
◮ Plasma input PET compartmental models

Bayesian model selection for PET
◮ Robust modeling of the error structure
◮ Biologically informative priors

Sequential Monte Carlo
◮ Algorithm setting for Bayesian modeling
◮ Computational challenge
◮ Accuracy of estimators
◮ Heterogeneous structure and algorithm tuning
◮ Computational cost and parallel computing

Results / Conclusions / References
◮ Conclusions
◮ References
Positron Emission Tomography (PET)

◮ Uses compounds labeled with positron-emitting radionuclides as molecular tracers to image and measure biochemical processes in vivo.
◮ One of the few methods available to neuroscientists to study living brains.
◮ Supports research into diseases where biochemical changes are known to be responsible for symptomatic changes.
◮ For example, diagnostic procedures for cancer through fluorodeoxyglucose ([18F]-FDG) tracers.
Linear Compartmental models

◮ Comprise a finite number of macroscopic subunits called compartments.
◮ Each compartment is assumed to contain homogeneous and well-mixed material.
◮ Material flows from one compartment to another at a constant rate.
◮ In PET, the total concentration of material is measured.

These models yield systems of ODEs:

\dot{f}(t) = A f(t) + b(t), \qquad f(0) = \xi
Plasma input PET compartmental models

Figure: Diagram of a plasma input compartmental model — the plasma input C_P feeds the tissue compartments C_{T_1}, ..., C_{T_r} (influx rate constant K_1, outflow rate constant k_2); the measured quantity is the total tissue concentration C_T.

N.B. We actually focus on linear compartmental models.
Plasma input PET compartmental models

System,

\dot{\mathbf{C}}_T(t) = A\,\mathbf{C}_T(t) + \mathbf{b}\,C_P(t), \qquad C_T(t) = \mathbf{1}^T \mathbf{C}_T(t), \qquad \mathbf{C}_T(0) = \mathbf{0}

Solution,

C_T(t) = \int_0^t C_P(t - s)\, H_{TP}(s)\, \mathrm{d}s, \qquad H_{TP}(t) = \sum_{i=1}^r \phi_i e^{-\theta_i t}

Parameter of interest,

V_D = \int_0^\infty H_{TP}(t)\, \mathrm{d}t = \sum_{i=1}^r \frac{\phi_i}{\theta_i}
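To make the solution concrete, here is a minimal Python sketch (not from the talk; the parameter values and the plasma input curve are invented for illustration) that evaluates H_TP, obtains C_T by discrete convolution with C_P, and computes V_D = ∑ φ_i/θ_i:

import numpy as np

# Hypothetical two-compartment (r = 2) parameters, for illustration only.
phi = np.array([0.03, 0.01])
theta = np.array([0.05, 0.005])

t = np.linspace(0.0, 3600.0, 3601)       # time grid (seconds)
dt = t[1] - t[0]
C_P = 10.0 * t * np.exp(-t / 60.0)       # toy plasma input curve C_P(t)

# Impulse response H_TP(t) = sum_i phi_i * exp(-theta_i * t)
H_TP = (phi[:, None] * np.exp(-np.outer(theta, t))).sum(axis=0)

# Tissue curve C_T(t) = int_0^t C_P(t - s) H_TP(s) ds, via discrete convolution
C_T = np.convolve(C_P, H_TP)[: t.size] * dt

# Volume of distribution V_D = int_0^inf H_TP(t) dt = sum_i phi_i / theta_i
V_D = np.sum(phi / theta)
print(f"V_D = {V_D:.3f}")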
Bayesian model selection for PET

◮ Determine the number of tissue compartments.
◮ "Mass univariate analysis": each time course C_T(t) is analyzed individually.
◮ There are many of them: about a quarter of a million time series per PET scan.
◮ Data are measured at discrete times t = t_1, ..., t_n,

y_i = C(t_i) + \varepsilon_i \sqrt{\frac{C(t_i)}{t_i - t_{i-1}}}

where the \varepsilon_i are (i.i.d.) errors.
Typical PET Time Courses

Figure: Typical PET time courses — concentration (kBq/mL) against time (sec) for several data sets.
Robust modeling of the error structure

◮ Low signal-to-noise ratio.
◮ Standard approach (in likelihood-based procedures):
  ◮ Use Normal distributions to model the error.
  ◮ Employ weighted non-negative least squares.
  ◮ Assign (arbitrary) small weights to the noisiest data points.
◮ Bayesian modeling:
  ◮ There is no justifiable way to bound the "weights" with Normal errors.
  ◮ A more robust model of the error structure is needed.
◮ Simple solution: use a three-parameter t distribution instead of a Normal.
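As a minimal sketch of this change (assuming a standard location–scale parameterisation with scale λ and degrees of freedom ν; the talk's exact parameterisation is not reproduced here), the robust model simply swaps the Normal log-density for a Student-t log-density:

import numpy as np
from scipy.stats import norm, t as student_t

def gaussian_loglik(y, y_model, sigma):
    # Standard Normal error model: light tails, sensitive to noisy frames.
    return norm.logpdf(y, loc=y_model, scale=sigma).sum()

def robust_loglik(y, y_model, lam, nu):
    # Three-parameter t error model: location y_model, scale lam, df nu.
    # Heavier tails automatically down-weight outlying observations.
    return student_t.logpdf(y, df=nu, loc=y_model, scale=lam).sum()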
Biologically informative priors [Zhou et al., 2013a]

Starting point:
◮ The parameters φ_{1:r} and θ_{1:r} are functions of the rate constants.
◮ The matrix A of rate constants obeys some simple rules.
◮ Rate constants are constrained by biophysical considerations.

Key observations, for θ_1 ≤ θ_2 ≤ · · · ≤ θ_r:
◮ In the linear plasma input model there is only one outflow into the environment, k_2, so θ_1 ≤ k_2.
◮ There is also only one inflow, K_1, so ∑_{i=1}^r φ_i = K_1.

Biophysical knowledge constrains the possible values of φ_{1:r} and θ_{1:r}.
Sequential Monte Carlo [Del Moral et al., 2006]

◮ Iteratively generate importance sampling proposal distributions for a sequence of targets \{\pi_t\}_{t=0}^T.
◮ Use MCMC kernels to propose samples.

1. Generate \{X_0^{(i)}\}_{i=1}^N from \pi_0. Set the importance weights \{W_0^{(i)}\}_{i=1}^N to 1/N.
2. For t = 1, ..., T,
   2.1 Resample if necessary.
   2.2 Generate \{X_t^{(i)}\}_{i=1}^N from K_t(x_{t-1}, x_t), a \pi_t-invariant Markov kernel.
   2.3 Set W_t^{(i)} \propto W_{t-1}^{(i)} \tilde{w}_t^{(i)}, where \tilde{w}_t^{(i)} = \pi_t(X_{t-1}^{(i)}) / \pi_{t-1}(X_{t-1}^{(i)}).
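A compact generic sketch of such an SMC sampler (illustrative Python, not the authors' implementation; the function names and the ESS-based resampling threshold are choices made here). Note that the incremental weights depend only on x_{t-1}, so they can be computed before the move step:

import numpy as np

def smc_sampler(log_pi, sample_pi0, mcmc_kernel, N=1000, T=50, ess_threshold=0.5, rng=None):
    """Generic SMC sampler for a sequence of targets pi_0, ..., pi_T.

    log_pi(t, x)      : log-density of pi_t evaluated at the particles x (vectorised)
    sample_pi0(N)     : draws N particles from pi_0
    mcmc_kernel(t, x) : applies one (or more) pi_t-invariant MCMC moves to x
    """
    rng = np.random.default_rng() if rng is None else rng
    x = sample_pi0(N)
    logw = np.full(N, -np.log(N))          # log importance weights, W_0 = 1/N

    for t in range(1, T + 1):
        # Reweight: w_t = pi_t(x_{t-1}) / pi_{t-1}(x_{t-1})
        logw += log_pi(t, x) - log_pi(t - 1, x)
        logw -= np.logaddexp.reduce(logw)  # normalise in log space

        # Resample if the effective sample size drops too low
        ess = 1.0 / np.exp(np.logaddexp.reduce(2.0 * logw))
        if ess < ess_threshold * N:
            idx = rng.choice(N, size=N, p=np.exp(logw))
            x = x[idx]
            logw = np.full(N, -np.log(N))

        # Move particles with a pi_t-invariant MCMC kernel
        x = mcmc_kernel(t, x)

    return x, np.exp(logw)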
Algorithm setting for Bayesian modeling

Sequence of distributions,

\pi_t(\varphi) \propto \pi_0(\varphi)\,[L(\varphi \mid y_{1:n})]^{\alpha(t/T)}

where \varphi is the parameter vector, \pi_0 is the prior and L is the likelihood function.

Markov kernels,
◮ Update φ_{1:r} with Normal random walks.
◮ Update θ_{1:r} with Normal random walks.
◮ Update λ, the scale parameter of the t-distributed error, with a Normal random walk on log λ.
◮ Update ν, the degrees of freedom of the t-distributed error, with a Normal random walk on log ν.
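A hedged illustration of these two ingredients (the helper names and the dictionary-of-parameters representation are invented; only the tempered-target form and the log-scale random walk, with its Jacobian correction, follow the slide):

import numpy as np

def log_target(alpha_t, log_prior, log_lik, params):
    # Tempered target: log pi_t(phi) = log pi_0(phi) + alpha(t/T) * log L(phi | y_1:n)
    return log_prior(params) + alpha_t * log_lik(params)

def rw_update_log_scale(params, key, step, alpha_t, log_prior, log_lik, rng):
    """One Metropolis random-walk update on log(params[key]), for a positive
    parameter such as the t-error scale lambda or degrees of freedom nu."""
    prop = dict(params)
    prop[key] = params[key] * np.exp(step * rng.standard_normal())
    # Normal random walk on the log scale; the change of variables contributes
    # a Jacobian term log(prop[key]) - log(params[key]) to the acceptance ratio.
    log_accept = (log_target(alpha_t, log_prior, log_lik, prop)
                  - log_target(alpha_t, log_prior, log_lik, params)
                  + np.log(prop[key]) - np.log(params[key]))
    return prop if np.log(rng.uniform()) < log_accept else params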
Computational challenge

◮ Accuracy of estimators
◮ Heterogeneous structure
◮ Computational cost
Improve the accuracy of estimators [Zhou et al., 2013b]

◮ Increase the number of particles.
◮ Increase the number of intermediate distributions.
◮ Fast-mixing Markov kernels:
  ◮ Multiple MCMC passes each iteration.
  ◮ Adaptive proposal scales for random walks.
◮ Better specification of intermediate distributions (see the sketch after this list):
  ◮ Place more distributions where π_t changes fast as α(t/T) increases.
  ◮ Adaptive specification such that the discrepancy between π_t and π_{t-1} remains almost constant.
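One concrete way to realise the adaptive specification (a sketch in the spirit of the CESS-based scheme of Zhou et al. [2013b]; the bisection tolerance and the target value are arbitrary choices here) is to pick each increment of α so that the conditional ESS stays near a fixed target:

import numpy as np

def cess(logw, loglik, d_alpha):
    """Conditional ESS when the annealing parameter increases by d_alpha.

    logw   : current normalised log-weights of the N particles
    loglik : log-likelihood value of each particle
    """
    logu = d_alpha * loglik                        # incremental log-weights
    num = 2.0 * np.logaddexp.reduce(logw + logu)
    den = np.logaddexp.reduce(logw + 2.0 * logu)
    return len(logw) * np.exp(num - den)

def next_alpha(alpha, logw, loglik, target_cess, tol=1e-6):
    """Bisection for the largest increment in alpha such that the CESS
    stays close to target_cess (capped at alpha = 1)."""
    lo, hi = 0.0, 1.0 - alpha
    if cess(logw, loglik, hi) >= target_cess:      # can jump straight to alpha = 1
        return 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cess(logw, loglik, mid) >= target_cess:
            lo = mid
        else:
            hi = mid
    return alpha + lo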
Improve the accuracy of estimators — adaptive specification of the sequence of distributions

Figure: Variation of the distribution specification parameter α(t/T), and its increments α_t − α_{t−1}, when using adaptive algorithms (CESS- and ESS-threshold based methods).
Heterogeneous structure and algorithm tuning

We cannot tune the algorithm for each of 250,000 time series.

Figure: Estimates of V_D using the selected model.

◮ SMC is more robust than (our) MCMC implementation.
◮ Adaptive strategies.