Lecture: Markov Chain Monte Carlo
Scribes: Abdel Rahman, Aminezza, Tuan
Homework 2: out on Wed
Motivating Problem: Hidden Markov Models

Graphical model: latent chain z_1 -> z_2 -> ... -> z_T, with each observation y_t generated from z_t.

Posterior over parameters:
  p(θ | y_{1:T}) = ∫ p(θ, z_{1:T} | y_{1:T}) dz_{1:T}

High dimensional: roughly 3K + K² continuous params, plus T discrete variables z_{1:T} (K^T configurations).
Sequential Monte Carlo (SMC) / Bootstrap Particle Filter

Intuition: Break a high-dimensional sampling problem down into a sequence of lower-dimensional sampling problems.

First step (s = 1, ..., S):
  x_1^s ~ p(x_1)
  w_1^s := p(y_1 | x_1^s)

Subsequent steps (t = 2, ..., T):
  p(x_{1:t-1} | y_{1:t-1}) ≈ Σ_s w̄_{t-1}^s δ_{x_{1:t-1}^s}(x_{1:t-1}),   w̄_{t-1}^s := w_{t-1}^s / Σ_{s'} w_{t-1}^{s'}
  x̃_{1:t-1}^s ~ resample from this weighted particle approximation
  x_t^s ~ p(x_t | x̃_{t-1}^s)
  w_t^s := p(y_t | x_t^s)
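The recipe above maps directly onto code. Below is a minimal Python sketch of a bootstrap particle filter, assuming user-supplied `sample_init`, `sample_trans`, and `log_lik` callables (these names are placeholders, not part of the homework); weights are handled in log space for numerical stability.

```python
import numpy as np

def bootstrap_particle_filter(y, sample_init, sample_trans, log_lik, S=1000, seed=0):
    """Bootstrap particle filter sketch (placeholder callables, log-space weights).

    sample_init  : rng -> one sample from p(x_1)
    sample_trans : (rng, x_prev) -> one sample from p(x_t | x_{t-1})
    log_lik      : (y_t, x_t) -> log p(y_t | x_t)
    Returns the particles and normalized weights at the final time step.
    """
    rng = np.random.default_rng(seed)
    # First step: x_1^s ~ p(x_1), w_1^s := p(y_1 | x_1^s)
    x = np.array([sample_init(rng) for _ in range(S)])
    logw = np.array([log_lik(y[0], xs) for xs in x])
    for t in range(1, len(y)):
        # Resample ancestors with probability proportional to the weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(S, size=S, p=w)
        x = x[idx]
        # Propose from the transition prior, then reweight by the likelihood
        x = np.array([sample_trans(rng, xs) for xs in x])
        logw = np.array([log_lik(y[t], xs) for xs in x])
    w = np.exp(logw - logw.max())
    return x, w / w.sum()
```

Resampling at every step is the simplest choice; it is also what causes the degeneracy illustrated in the example slides that follow.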
Sequential Monte Carlo: Example

[Figure sequence: weighted particles (w_1^s, x_1^s) with x_1^s ~ p(x_1) and w_1^s := p(y_1 | x_1^s); ancestors resampled from Discrete(w̄_1^1, ..., w̄_1^S); new particles x_2^s ~ p(x_2 | x̃_1^s) reweighted by w_2^s := p(y_2 | x_2^s).]
Sequential Monte Carlo: Example (continued)

Bad: in just 2 sampling steps, repeated resampling prunes a diverse particle set near the beginning of the sequence down to a nearly degenerate set.
Sequential Monte Carlo: General Formulation

Assume unnormalized densities γ_t(x_{1:t}), t = 1, ..., T. For the state space model:
  γ_t(x_{1:t}) := p(y_{1:t}, x_{1:t})

Sequential decomposition (Bayes net):
  γ_t(x_{1:t}) = p(y_{1:t}, x_{1:t})
               = p(y_t, x_t | x_{1:t-1}, y_{1:t-1}) p(y_{1:t-1}, x_{1:t-1})
               = p(y_t | x_t) p(x_t | x_{t-1}) γ_{t-1}(x_{1:t-1})

Importance weights for a proposal q(x_{1:t}) = q(x_t | x_{1:t-1}) q(x_{1:t-1}):
  w_t = γ_t(x_{1:t}) / q(x_{1:t})
      = w_{t-1} · γ_t(x_{1:t}) / (γ_{t-1}(x_{1:t-1}) q(x_t | x_{1:t-1})),   t ≥ 2
Sequential Monte Carlo: General Formulation (continued)

Importance Sampling

First step (s = 1, ..., S):
  x_1^s ~ q(x_1)
  w_1^s := γ_1(x_1^s) / q(x_1^s)

Subsequent steps: propose samples from the previous step:
  a_t^s ~ Discrete(w̄_{t-1}^1, ..., w̄_{t-1}^S)        (resample an ancestor)
  x_t^s ~ q(x_t | x_{1:t-1}^{a_t^s}),   x_{1:t}^s := (x_{1:t-1}^{a_t^s}, x_t^s)

Incremental weight:
  w_t^s := γ_t(x_{1:t}^s) / (γ_{t-1}(x_{1:t-1}^{a_t^s}) q(x_t^s | x_{1:t-1}^{a_t^s}))
Running Example: Gaussian Mixture Model (HW2)

Dataset: Iris
Graphical model: [figure]

Generative Model:
  μ_k, Σ_k ~ p(μ, Σ)
  z_n ~ Discrete(π_1, ..., π_K)
  y_n | z_n = k ~ Norm(μ_k, Σ_k)
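As a concrete illustration of the generative model (synthetic data, not the Iris set itself), here is a small Python sketch that samples a dataset; the Normal-Inverse-Wishart draw for (μ_k, Σ_k) and the uniform mixture weights are assumptions made only for this example.

```python
import numpy as np
from scipy.stats import invwishart

def sample_gmm(N=150, K=3, D=2, seed=0):
    """Draw a synthetic dataset from the GMM generative model on the slide:
    mu_k, Sigma_k ~ p(mu, Sigma); z_n ~ Discrete(pi); y_n | z_n = k ~ Norm(mu_k, Sigma_k)."""
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)                                   # mixture weights (assumed uniform)
    Sigma = invwishart.rvs(df=D + 2, scale=np.eye(D), size=K, random_state=rng)
    mu = np.array([rng.multivariate_normal(np.zeros(D), S) for S in Sigma])
    z = rng.choice(K, size=N, p=pi)                            # cluster assignments
    y = np.array([rng.multivariate_normal(mu[k], Sigma[k]) for k in z])
    return y, z, mu, Sigma
```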
Markov Chain Monte Carlo

Idea: Use the previous sample x^{s-1} to propose the next sample x^s.

A Markov chain: a sequence of random variables X^1, ..., X^S (a discrete-time Markov chain when s is discrete) satisfying
  p(x^s | x^{s-1}, ..., x^1) = p(x^s | x^{s-1})      (Markov property)

A Markov chain is homogeneous when at each step we use the same transition distribution:
  p(X^s = x' | X^{s-1} = x) is the same for every s.
Markov Chain Monte Carlo: Convergence

A Markov chain converges to a target density π(x) when
  lim_{s→∞} p(X^s = x) = π(x)

[Figure: π(X = x) is the "frequency" with which the chain visits X = x.]
Markov Chain Monte Carlo: Detailed Balance

A homogeneous Markov chain satisfies detailed balance when
  π(x') M(x | x') = π(x) M(x' | x)

Implication: detailed balance leaves π invariant:
  ∫ dx' π(x') M(x | x') = ∫ dx' π(x) M(x' | x) = π(x)

Invariance: if you have x ~ π(x) and you sample x' ~ M(x' | x), then x' ~ π(x').
Metropolis-Hastings

Idea: starting from the current sample x^s, generate a proposal
  x' ~ q(x' | x^s)
and accept x^{s+1} = x' with probability
  α = min(1, π(x') q(x^s | x') / (π(x^s) q(x' | x^s)))

Perfect proposal: for q(x' | x^s) = π(x') the ratio is 1 (detailed balance holds trivially).

With probability 1 - α, reject the proposal and retain the previous sample: x^{s+1} = x^s.

Exercise: Show that the resulting Markov chain x^s -> x^{s+1} satisfies detailed balance.
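A minimal generic Metropolis-Hastings sketch in Python, assuming user-supplied `log_gamma`, `propose`, and `log_q` callables (placeholder names, not a prescribed interface); it works in log space and follows the accept/retain rule above.

```python
import numpy as np

def metropolis_hastings(log_gamma, propose, log_q, x0, S=10_000, seed=0):
    """Generic Metropolis-Hastings sketch.

    log_gamma : x -> log of the (possibly unnormalized) target density
    propose   : (rng, x) -> proposal x' ~ q(x' | x)
    log_q     : (x_new, x_old) -> log q(x_new | x_old)
    """
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(S):
        x_prop = propose(rng, x)
        # alpha = min(1, gamma(x') q(x | x') / (gamma(x) q(x' | x)))
        log_alpha = (log_gamma(x_prop) + log_q(x, x_prop)
                     - log_gamma(x) - log_q(x_prop, x))
        if np.log(rng.uniform()) < log_alpha:
            x = x_prop                    # accept the proposal
        samples.append(x)                 # otherwise retain the previous sample
    return np.array(samples)
```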
Metropolis-Hastings: Detailed Balance

Define the Metropolis-Hastings acceptance prob (forward acceptance):
  α(x' | x) = min(1, π(x') q(x | x') / (π(x) q(x' | x)))
Sometimes accept when x' is overrepresented; always accept when x' is underrepresented.

Reverse acceptance rate:
  α(x | x') = min(1, π(x) q(x' | x) / (π(x') q(x | x')))

Detailed balance for the kernel M(x' | x) = q(x' | x) α(x' | x):
  π(x) q(x' | x) α(x' | x) = min(π(x) q(x' | x), π(x') q(x | x')) = π(x') q(x | x') α(x | x')
Metropolis-Hastings: Unnormalized Densities

Nice property: can calculate the acceptance prob from unnormalized densities γ(x') and γ(x^s), where π(x) = γ(x) / Z:
  α = min(1, π(x') q(x^s | x') / (π(x^s) q(x' | x^s)))
    = min(1, γ(x') q(x^s | x') / (γ(x^s) q(x' | x^s)))        (the normalizer Z cancels)

E.g. for a Bayes net with γ(x) = p(y, x) and target π(x) = p(x | y):
  α = min(1, p(y, x') q(x^s | x') / (p(y, x^s) q(x' | x^s)))
Metropolis-Hastings: Choosing Proposals

Continuous variables: Gaussian random walk
  q(x' | x) = Norm(x'; x, σ²)

[Figure: 2D target density with Gaussian proposals of different widths.]

Trade-off for the proposal variance: how big are your jumps between samples?
- σ² too small: good acceptance prob, but high correlation between samples
- σ² too large: less correlation between samples, but lower acceptance prob
Tune σ² so the acceptance rate lands around α ≈ 0.5-0.6.
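A short sketch of the Gaussian random-walk proposal, tracking the empirical acceptance rate so σ can be tuned toward the range quoted above; `log_gamma` is again a placeholder for the (unnormalized) log target.

```python
import numpy as np

def random_walk_mh(log_gamma, x0, sigma, S=10_000, seed=0):
    """Random-walk MH with q(x' | x) = Norm(x'; x, sigma^2 I).

    The proposal is symmetric, so the q terms cancel in the acceptance ratio.
    Returns the samples and the empirical acceptance rate (used to tune sigma).
    """
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    samples, accepts = [], 0
    for _ in range(S):
        x_prop = x + sigma * rng.standard_normal(x.shape)
        if np.log(rng.uniform()) < log_gamma(x_prop) - log_gamma(x):
            x, accepts = x_prop, accepts + 1
        samples.append(x)
    return np.array(samples), accepts / S
```

In practice one runs short pilot chains for a few values of σ and keeps the value whose acceptance rate falls in the target range.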
Metropolis-Hastings: Choosing Proposals

Independent MH: sample proposals from the prior
  q(x' | x^s) = p(x')         (independent of the previous sample)

With target π(x) = p(x | y) and γ(x) = p(y, x):
  α = min(1, p(y, x') p(x^s) / (p(y, x^s) p(x')))
    = min(1, p(y | x') / p(y | x^s))

Ratio of likelihoods. Dirt simple: you can always sample from the prior.
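A corresponding sketch for the independent prior proposal, where the acceptance test reduces to the likelihood ratio; `sample_prior` and `log_lik` are placeholder callables for this illustration.

```python
import numpy as np

def prior_mh(log_lik, sample_prior, S=10_000, seed=0):
    """Independent MH with the prior as proposal: q(x') = p(x').

    The prior terms cancel, so the acceptance ratio is just the likelihood
    ratio p(y | x') / p(y | x^s), as on the slide.
    """
    rng = np.random.default_rng(seed)
    x, samples = sample_prior(rng), []
    for _ in range(S):
        x_prop = sample_prior(rng)
        if np.log(rng.uniform()) < log_lik(x_prop) - log_lik(x):
            x = x_prop
        samples.append(x)
    return samples
```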
Gibbs Sampling

Idea: Propose 1 variable at a time from its conditional distribution, holding the other variables constant:
  x_i' ~ p(x_i | y, x_{-i})        e.g. x_1' ~ p(x_1 | y, x_2, ..., x_N)

Can exploit conditional independence: each update is a lower-dimensional problem.

Acceptance ratio: can accept with prob 1 (see the derivation below).
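The claim that the full-conditional proposal is always accepted follows from plugging q(x_i' | x) = p(x_i' | y, x_{-i}) (with x_{-i} left unchanged) into the Metropolis-Hastings ratio:

```latex
\alpha
 = \min\!\left(1,\;
   \frac{p(x_i', x_{-i} \mid y)\, p(x_i \mid y, x_{-i})}
        {p(x_i, x_{-i} \mid y)\, p(x_i' \mid y, x_{-i})}\right)
 = \min\!\left(1,\;
   \frac{p(x_i' \mid y, x_{-i})\, p(x_{-i} \mid y)\, p(x_i \mid y, x_{-i})}
        {p(x_i \mid y, x_{-i})\, p(x_{-i} \mid y)\, p(x_i' \mid y, x_{-i})}\right)
 = 1 .
```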
Gibbs Sampling: Gaussian Mixture (Homework 2)

Gibbs sampler steps:
1. z_n | y, μ, Σ ~ p(z_n | y, μ, Σ)
2. μ, Σ | y, z ~ p(μ, Σ | y, z)

Conditional distributions 1:
  p(z_n = k | y_n, μ, Σ) = p(z_n = k) p(y_n | z_n = k, μ_k, Σ_k) / Σ_l p(z_n = l) p(y_n | z_n = l, μ_l, Σ_l)
(Calculate the joint probs and normalize them.)

Generative Model:
  μ_k, Σ_k ~ p(μ, Σ)
  z_n ~ Discrete(π_1, ..., π_K)
  y_n | z_n = k ~ Norm(μ_k, Σ_k)
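A Python sketch of Gibbs step 1 for the mixture model: compute the joint terms for every cluster, normalize them, and sample one assignment per point. The array shapes and the name `sample_z` are illustrative assumptions, not the homework's required interface.

```python
import numpy as np
from scipy.stats import multivariate_normal

def sample_z(y, pi, mu, Sigma, rng):
    """Gibbs step 1: resample z_n from p(z_n = k | y_n, mu, Sigma) for every n.

    y: (N, D) data, pi: (K,) mixture weights, mu: (K, D), Sigma: (K, D, D).
    """
    N, K = y.shape[0], len(pi)
    # log p(z_n = k) + log p(y_n | z_n = k, mu_k, Sigma_k) for all n, k
    log_joint = np.stack(
        [np.log(pi[k]) + multivariate_normal.logpdf(y, mu[k], Sigma[k]) for k in range(K)],
        axis=1)
    # normalize across clusters, then sample one assignment per data point
    log_joint -= log_joint.max(axis=1, keepdims=True)
    probs = np.exp(log_joint)
    probs /= probs.sum(axis=1, keepdims=True)
    return np.array([rng.choice(K, p=probs[n]) for n in range(N)])
```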
Gibbs Sampling: Gaussian Mixture (Homework 2)

Conditional distributions 2: Normal-Inverse-Wishart prior
  Σ_k ~ Inv-Wishart(ν_0, W_0)
  μ_k | Σ_k ~ Normal(μ_0, Σ_k / λ_0)

Exploit conjugacy:
  Σ_k | y, z ~ Inv-Wishart(ν_k, W_k)
  μ_k | y, z, Σ_k ~ Normal(m_k, Σ_k / λ_k)

Generative Model:
  μ_k, Σ_k ~ p(μ, Σ)
  z_n ~ Discrete(π_1, ..., π_K)
  y_n | z_n = k ~ Norm(μ_k, Σ_k)
Gibbs Sampling (Homework 2): Normal-Inverse-Wishart Conjugacy

Sufficient statistics:
  N_k := Σ_n I[z_n = k]                          (number of points in cluster k)
  ȳ_k := (1 / N_k) Σ_n I[z_n = k] y_n            (mean of points in cluster k)

Conjugate conditionals:
  Σ_k | y, z ~ Inv-Wishart(ν_k, W_k)
  μ_k | y, z, Σ_k ~ Normal(m_k, Σ_k / λ_k)

Updated hyperparameters:
  ν_k = ν_0 + N_k,   λ_k = λ_0 + N_k
  m_k = (λ_0 μ_0 + N_k ȳ_k) / (λ_0 + N_k)        (weighted average of prior mean and cluster mean)
  W_k = W_0 + Σ_n I[z_n = k] (y_n - ȳ_k)(y_n - ȳ_k)^T + (λ_0 N_k / (λ_0 + N_k)) (ȳ_k - μ_0)(ȳ_k - μ_0)^T
        (weighted average between prior and empirical covariance)
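A sketch of Gibbs step 2 using the conjugate updates above; the hyperparameter names follow the slide's notation, while the function name and array shapes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import invwishart

def sample_mu_sigma(y, z, K, mu0, lam0, nu0, W0, rng):
    """Gibbs step 2: sample (mu_k, Sigma_k) from the Normal-Inverse-Wishart
    conditional for each cluster, using the sufficient statistics on the slide."""
    D = y.shape[1]
    mu, Sigma = np.zeros((K, D)), np.zeros((K, D, D))
    for k in range(K):
        yk = y[z == k]
        Nk = len(yk)
        ybar = yk.mean(axis=0) if Nk > 0 else mu0
        S = (yk - ybar).T @ (yk - ybar) if Nk > 0 else np.zeros((D, D))
        lam_k, nu_k = lam0 + Nk, nu0 + Nk
        m_k = (lam0 * mu0 + Nk * ybar) / lam_k           # weighted prior/cluster mean
        d = (ybar - mu0)[:, None]
        W_k = W0 + S + (lam0 * Nk / lam_k) * (d @ d.T)   # weighted prior/empirical scatter
        Sigma[k] = invwishart.rvs(df=nu_k, scale=W_k, random_state=rng)
        mu[k] = rng.multivariate_normal(m_k, Sigma[k] / lam_k)
    return mu, Sigma
```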
Metropolis within Gibbs (Homework 2)

1. z_n | y, μ, Σ ~ p(z_n | y, μ, Σ)
2. μ, Σ | y, z ~ p(μ, Σ | y, z)

Idea: Use Metropolis-Hastings to sample step 2.
Use a proposal q(μ', Σ' | μ^s, Σ^s), with Σ restricted to diagonal form Σ = diag(σ_1², ..., σ_D²).

Accept with prob:
  α = min(1, p(y, z, μ', Σ') q(μ^s, Σ^s | μ', Σ') / (p(y, z, μ^s, Σ^s) q(μ', Σ' | μ^s, Σ^s)))
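One possible sketch of the Metropolis-within-Gibbs update for step 2, using a symmetric Gaussian random walk on μ and on the diagonal covariance entries as the proposal q (so the q terms in the ratio above cancel); `log_joint`, the `step` parameter, and this particular parameterization are assumptions for illustration, not the homework's prescribed proposal.

```python
import numpy as np

def mh_step_mu_sigma(log_joint, mu, Sigma_diag, step, rng):
    """One Metropolis-within-Gibbs update of (mu, diag(Sigma)), replacing Gibbs step 2.

    log_joint  : (mu, Sigma_diag) -> log p(y, z, mu, Sigma) with z held fixed
                 (should return -inf if any proposed variance is non-positive)
    Sigma_diag : (K, D) diagonal covariance entries, per the slide's diagonal form
    step       : scale of the symmetric Gaussian random-walk proposal
    """
    mu_prop = mu + step * rng.standard_normal(mu.shape)
    Sigma_prop = Sigma_diag + step * rng.standard_normal(Sigma_diag.shape)
    # symmetric proposal, so the q terms cancel: alpha = min(1, p(...') / p(...))
    log_alpha = log_joint(mu_prop, Sigma_prop) - log_joint(mu, Sigma_diag)
    if np.log(rng.uniform()) < log_alpha:
        return mu_prop, Sigma_prop        # accept
    return mu, Sigma_diag                 # reject: keep the previous values
```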