Probabilistic dipole inversion for adaptive quantitative susceptibility mapping

  1. Probabilistic dipole inversion for adaptive quantitative susceptibility mapping Jinwei Zhang 1,2 , Hang Zhang 1,3 , Mert Sabuncu 1,2,3 , Pascal Spincemaille 1 , Thanh Nguyen 1 , Yi Wang 1,2 1 Department of Radiology, Weill Medical College of Cornell University, New York, NY, USA 2 Department of Biomedical Engineering, Cornell University, Ithaca, NY, USA 3 Department of Electrical and Computer Engineering, Cornell University, Ithaca, NY, USA

  2. Quantitative susceptibility mapping (QSM)
     • Notation: $\chi$: tissue susceptibility; $b$: measured local magnetic field; $d$: dipole kernel; $n$: measurement noise
     • Image space: $b = d * \chi + n$
     • K-space: $b = \mathcal{F}^{-1}[D \cdot \mathcal{F}[\chi]] + n$, where $D = \mathcal{F}[d]$
     • [Figure: the zero cone of $D$ in k-space]
     Wang, Yi, and Tian Liu. Magnetic Resonance in Medicine 73.1 (2015): 82-101.
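A minimal NumPy sketch of the forward model on this slide, $b = d * \chi + n$, evaluated in k-space with the standard dipole kernel $D(k) = 1/3 - (k \cdot \hat B_0)^2 / |k|^2$. The function names, default voxel size, and $B_0$ direction are illustrative assumptions, not the paper's code.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0), b0_dir=(0.0, 0.0, 1.0)):
    """k-space dipole kernel D(k) = 1/3 - (k . B0_hat)^2 / |k|^2."""
    ks = [np.fft.fftfreq(n, d=v) for n, v in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    kb = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kb**2 / k2
    D[k2 == 0] = 0.0  # convention for the undefined DC term
    return D

def forward_field(chi, noise_std=0.0):
    """Local field b = F^-1[ D . F[chi] ] + n (k-space form of b = d * chi + n)."""
    D = dipole_kernel(chi.shape)
    b = np.real(np.fft.ifftn(D * np.fft.fftn(chi)))
    if noise_std > 0:
        b = b + noise_std * np.random.randn(*chi.shape)
    return b
```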

  3. COSMOS and MEDI
     • COSMOS: multi-orientation scans, gold-standard QSM. The zeroes of the dipole kernel lie on a pair of cone surfaces that rotate with the object, so combining orientations removes the null space. [Figure: cone surfaces for Rotation 1 and Rotation 2]
       Liu, Tian, et al. Magnetic Resonance in Medicine 61.1 (2009): 196-204.
     • MEDI: single-orientation scan, clinically feasible QSM.
       Likelihood: $p(b \mid \chi) = \mathcal{N}(d * \chi, \Sigma)$
       Prior: $p(\chi) \propto e^{-\lambda \| M \nabla \chi \|_1}$, with a binary-valued weighting matrix $M$ (three spatial directions)
       MAP estimate: $\chi_{\mathrm{MEDI}} = \arg\min_{\chi} \, [-\log p(b \mid \chi) - \log p(\chi)]$
       Liu, Jing, et al. NeuroImage 59.3 (2012): 2560-2568.
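A sketch of the MAP objective $-\log p(b \mid \chi) - \log p(\chi)$ stated above, with the Gaussian data term and the weighted-gradient L1 prior, reusing the `forward_field` sketch from the previous slide. The smoothed L1, the regularization weight, and the helper names are illustrative assumptions, not the MEDI implementation (which solves a constrained version of this problem).

```python
import numpy as np

def spatial_gradient(x):
    """Finite-difference gradients along the three spatial directions."""
    return np.stack([np.diff(x, axis=a, append=x.take([-1], axis=a)) for a in range(3)])

def medi_objective(chi, b, W, M, lam=1e-3, eps=1e-6):
    """-log p(b|chi) - log p(chi) up to constants.

    W: voxel-wise noise weighting; M: binary edge mask per gradient direction.
    """
    data_term = 0.5 * np.sum((W * (forward_field(chi) - b)) ** 2)
    prior_term = lam * np.sum(M * np.sqrt(spatial_gradient(chi) ** 2 + eps))
    return data_term + prior_term
```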

  4. Motivation: fitting susceptibility distributions
     • Given $p(\chi)$ and $p(b \mid \chi)$, how do we obtain the posterior $p(\chi \mid b)$?
     • Traditional approximate inference methods (MCMC, VI) must be re-run on each subject.
     • Can we instead learn a general distribution over $\chi$ for any given field $b$?
     • Introduce a parametrized distribution $q_\theta(\chi \mid b)$ and learn $\theta$ so that $q_\theta(\chi \mid b) \approx p(\chi \mid b)$ (amortized optimization).

  5. COSMOS dataset and modeling
     • Pairs $(\chi^{(1)}, b^{(1)}), \ldots, (\chi^{(N)}, b^{(N)})$ sampled from $p_{\mathrm{data}}(\chi \mid b)$
     • Empirical distribution: $\hat p_{\mathrm{data}}(\chi \mid b) = \frac{1}{N} \sum_{i=1}^{N} \delta[\chi = \chi^{(i)} \mid b = b^{(i)}]$
     • Training objective: $\mathrm{KL}[\hat p_{\mathrm{data}}(\chi \mid b) \,\|\, q_\theta(\chi \mid b)] = -\frac{1}{N} \sum_{i=1}^{N} \log q_\theta(\chi^{(i)} \mid b^{(i)}) + \text{const.}$, i.e. maximum likelihood of the COSMOS labels under $q_\theta$
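A PyTorch sketch of the resulting COSMOS-stage loss: with a diagonal-Gaussian $q_\theta$, minimizing the KL above amounts to the voxel-wise Gaussian negative log-likelihood of the COSMOS labels. `net` stands for any network returning a voxel-wise mean and log-variance (a toy interface is sketched after the PDI network slide below); the names here are assumptions, not the paper's code.

```python
import torch

def cosmos_nll(net, b, chi_cosmos):
    """-log q_theta(chi_cosmos | b) for a diagonal Gaussian q_theta, up to constants."""
    mu, log_var = net(b)  # voxel-wise mean and log-variance
    nll = 0.5 * (log_var + (chi_cosmos - mu) ** 2 / log_var.exp())
    return nll.mean()
```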

  6. MEDI dataset and modeling
     • Only the fields $b^{(1)}, \ldots, b^{(M)}$ are given, together with the model $p(b \mid \chi)$ and prior $p(\chi)$.
     • Variational inference: minimize $\mathrm{KL}[q_\theta(\chi \mid b) \,\|\, p(\chi \mid b)] = \mathrm{KL}[q_\theta(\chi \mid b) \,\|\, p(\chi)] - \mathbb{E}_{q_\theta}[\log p(b \mid \chi)] + \text{const.}$
     • Amortized formulation: $\sum_{i=1}^{M} \big( \mathrm{KL}[q_\theta(\chi \mid b^{(i)}) \,\|\, p(\chi)] - \mathbb{E}_{q_\theta}[\log p(b^{(i)} \mid \chi)] \big)$, where the KL term acts as regularization and the expectation term is the data likelihood
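A PyTorch sketch of one term of the amortized objective above, with the expectation over $q_\theta$ approximated by a single reparameterized sample. `dipole_forward` and `neg_log_prior` stand in for the forward dipole convolution and the $-\log p(\chi)$ prior term; the single-sample estimate and unit noise variance are simplifying assumptions.

```python
import torch

def medi_vi_loss(net, b, dipole_forward, neg_log_prior, noise_var=1.0):
    mu, log_var = net(b)
    # Reparameterized sample chi ~ q_theta(chi | b)
    chi = mu + (0.5 * log_var).exp() * torch.randn_like(mu)
    # -E_q[log p(b|chi)]: Gaussian data consistency with the measured field
    data_term = 0.5 * ((dipole_forward(chi) - b) ** 2 / noise_var).sum()
    # KL[q || p(chi)] = E_q[-log p(chi)] - H[q] (up to constants), one-sample estimate
    kl_term = neg_log_prior(chi) - 0.5 * log_var.sum()
    return data_term + kl_term
```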

  7. Probabilistic Dipole Inversion (PDI) network
     • [Figure: network diagram] The network maps the local field $b$ to the mean $\mu_{\chi \mid b}$ and variance $\Sigma_{\chi \mid b}$ of $q_\theta(\chi \mid b)$.
     • COSMOS input data: trained with the negative log-likelihood $-\log q_\theta(\chi \mid b)$.
     • MEDI input data: trained with $\mathrm{KL}[q_\theta(\chi \mid b) \,\|\, p(\chi \mid b)]$, estimated via Monte Carlo sampling.
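A toy PyTorch sketch of the two-head interface suggested by this diagram: a single backbone producing the mean and log-variance of $q_\theta(\chi \mid b)$. The shallow convolutional body, channel count, and class name are illustrative assumptions; the slide's actual backbone is a deeper 3D network, and only the output interface is shown here.

```python
import torch
import torch.nn as nn

class PDINet(nn.Module):
    def __init__(self, ch=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv3d(1, ch, 3, padding=1), nn.ReLU(),
            nn.Conv3d(ch, ch, 3, padding=1), nn.ReLU(),
        )
        self.mean_head = nn.Conv3d(ch, 1, 3, padding=1)     # mu_{chi|b}
        self.log_var_head = nn.Conv3d(ch, 1, 3, padding=1)  # log Sigma_{chi|b}

    def forward(self, b):
        h = self.body(b)       # b: (batch, 1, D, H, W) local field
        return self.mean_head(h), self.log_var_head(h)
```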

  8. Experimental setups
     • PDI: pre-trained on COSMOS (3D patches)
       § 4 training, 1 validation, 2 test subjects, each with 5 orientations
     • PDI-VI: domain adaptation on MEDI (whole brains)
       § Multiple sclerosis dataset (6 training, 1 validation, 7 test)
       § Hemorrhage dataset (4 training, 1 validation, 2 test)

  9. Healthy subject with COSMOS
     • [Figure: QSM reconstructions from COSMOS, MEDI, QSMnet, FINE, and the PDI mean ($\mu_{\chi \mid b}$), plus the PDI variance map ($\Sigma_{\chi \mid b}$)]
     QSMnet: Yoon, Jaeyeon, et al. NeuroImage 179 (2018): 199-206.
     FINE: Zhang, Jinwei, et al. NeuroImage 211 (2020): 116579.

  10. Healthy subject with COSMOS

  11. Multiple sclerosis patients
     • [Figure: for two subjects, reconstructions from MEDI, QSMnet, FINE, and the PDI and PDI-VI means ($\mu_{\chi \mid b}$), plus the PDI and PDI-VI variance maps ($\Sigma_{\chi \mid b}$)]

  12. Hemorrhagic patient
     • [Figure: reconstructions from MEDI, QSMnet, FINE, and the PDI and PDI-VI means ($\mu_{\chi \mid b}$), plus the PDI and PDI-VI variance maps ($\Sigma_{\chi \mid b}$)]

  13. Discussion: relationship to VAE
     • VAE architecture: encoder $q_\phi(z \mid x)$, decoder $p(x \mid z)$; loss function (negative ELBO): $-\big(\mathbb{E}_{q_\phi(z \mid x)}[\log p(x \mid z)] - \mathrm{KL}[q_\phi(z \mid x) \,\|\, p(z)]\big)$
     • PDI architecture: "encoder" $q_\theta(\chi \mid b)$, "decoder" $p(b \mid \chi)$ (the fixed forward dipole model); loss function: $-\big(\mathbb{E}_{q_\theta(\chi \mid b)}[\log p(b \mid \chi)] - \mathrm{KL}[q_\theta(\chi \mid b) \,\|\, p(\chi)]\big)$
     Kingma, Diederik P., and Max Welling. "Auto-encoding variational Bayes." arXiv preprint arXiv:1312.6114 (2013).
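A short derivation in the slide's notation of why minimizing $\mathrm{KL}[q_\theta(\chi \mid b) \,\|\, p(\chi \mid b)]$ (slide 6) gives exactly the negative-ELBO loss shown here: $\log p(b)$ does not depend on $\theta$, so the two objectives differ only by a constant.

```latex
% log-evidence = ELBO + posterior KL gap
\log p(b) \;=\;
  \underbrace{\mathbb{E}_{q_\theta(\chi \mid b)}\!\left[\log p(b \mid \chi)\right]
    - \mathrm{KL}\!\left[q_\theta(\chi \mid b)\,\middle\|\,p(\chi)\right]}_{\text{ELBO}}
  \;+\; \mathrm{KL}\!\left[q_\theta(\chi \mid b)\,\middle\|\,p(\chi \mid b)\right]
```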

  14. Conclusion
     • Learn a neural-network-parametrized distribution that approximates the posterior distribution of susceptibility given the input local field.
     • Train the network parameters by fitting the empirical distribution defined by the COSMOS dataset.
     • Adapt the pre-trained parameters to different domains using (amortized) variational inference.

  15. Future work
     • More expressive model families for $q_\theta(\chi \mid b)$: invertible neural networks
     • Learn a prior density $p(\chi)$ instead of pre-defining it: autoregressive or VAE density estimation

  16. Thank you. Questions?
