Lecture 9: Gibbs Sampling, Expectation Maximization
Scribes: Jered McInerney, Xiongyi Zhang
Homework: Gibbs Sampling in a Gaussian Mixture Model

Generative Model (Graphical Model):
\mu_k, \Sigma_k \sim p(\mu, \Sigma), \qquad k = 1, \ldots, K
z_n \sim \text{Discrete}(\pi), \qquad n = 1, \ldots, N
y_n \mid z_n = k \sim \text{Norm}(\mu_k, \Sigma_k)

Gibbs Sampler Updates:
1) Local variables: p(z_n \mid y_{1:N}, z_{-n}, \mu_{1:K}, \Sigma_{1:K}), \qquad n = 1, \ldots, N
2) Global variables: p(\mu_k, \Sigma_k \mid y_{1:N}, z_{1:N}), \qquad k = 1, \ldots, K
Gibbs Sampling: Conditional Independence

Local variables: given the global variables, z_n is conditionally independent of the remaining assignments and data points,
p(z_n \mid y_{1:N}, z_{-n}, \mu_{1:K}, \Sigma_{1:K}) = p(z_n \mid y_n, \mu_{1:K}, \Sigma_{1:K}), \qquad n = 1, \ldots, N.

Can compute all updates in O(NK):
p(z_n = k \mid y_n, \mu, \Sigma) = \frac{p(y_n, z_n = k \mid \mu, \Sigma)}{\sum_{l=1}^{K} p(y_n, z_n = l \mid \mu, \Sigma)},
\qquad
p(y_n, z_n = k \mid \mu, \Sigma) = \pi_k \, \text{Norm}(y_n; \mu_k, \Sigma_k).
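As a concrete illustration, here is a minimal sketch (not from the lecture) of this local update in Python, assuming data y of shape (N, D), mixture weights pi of shape (K,), means mu of shape (K, D), covariances Sigma of shape (K, D, D), and a numpy random Generator rng (e.g. np.random.default_rng()):

```python
import numpy as np
from scipy.stats import multivariate_normal

def sample_local_assignments(y, pi, mu, Sigma, rng):
    """Sample z_n ~ p(z_n | y_n, mu, Sigma) for all n; O(NK) work."""
    N, K = y.shape[0], pi.shape[0]
    # Joint p(y_n, z_n = k | mu, Sigma) = pi_k * Norm(y_n; mu_k, Sigma_k)
    joint = np.stack(
        [pi[k] * multivariate_normal.pdf(y, mean=mu[k], cov=Sigma[k]) for k in range(K)],
        axis=1,
    )  # shape (N, K)
    probs = joint / joint.sum(axis=1, keepdims=True)  # normalize over k
    # Draw one categorical sample per data point
    return np.array([rng.choice(K, p=probs[n]) for n in range(N)])
```

In a full Gibbs sweep this local update alternates with the global update for (\mu_k, \Sigma_k) described on the next slide.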
Gibbs Sampling: Global Update

Idea: Ensure that the likelihood is conjugate to the prior.

p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\})
= \frac{\prod_{n : z_n = k} p(y_n \mid z_n, \mu_k, \Sigma_k) \; p(\mu_k, \Sigma_k)}{p(\{y_n : z_n = k\})}

(posterior = likelihood \times conjugate prior for cluster k, divided by the marginal likelihood)
Exponential Families

An exponential family distribution has the form
p(x \mid \eta) = h(x) \exp\{\eta^T t(x) - a(\eta)\}
where
- h(x): base measure (e.g. Lebesgue or counting measure), depends only on x
- t(x): sufficient statistics, depend only on x
- \eta: natural parameters
- a(\eta): log normalizer, depends only on \eta
Example: Univariate Gaussian

p(x \mid \eta) = h(x) \exp\{\eta^T t(x) - a(\eta)\}

\text{Norm}(x; \mu, \sigma^2)
= \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x - \mu)^2}{2\sigma^2}\right\}
= \frac{1}{\sqrt{2\pi}} \exp\left\{-\frac{1}{2\sigma^2} x^2 + \frac{\mu}{\sigma^2} x - \frac{\mu^2}{2\sigma^2} - \frac{1}{2}\log\sigma^2\right\}

so that
t(x) = (x, x^2), \qquad
\eta = \left(\frac{\mu}{\sigma^2}, -\frac{1}{2\sigma^2}\right), \qquad
a(\eta) = \frac{\mu^2}{2\sigma^2} + \frac{1}{2}\log\sigma^2, \qquad
h(x) = \frac{1}{\sqrt{2\pi}}.
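As a quick numerical sanity check (a sketch, not part of the notes), the natural-parameter form can be compared against the standard Gaussian density; the parameter values below are arbitrary:

```python
import numpy as np
from scipy.stats import norm

mu, sigma2 = 1.5, 0.7                                   # example mean and variance
eta = np.array([mu / sigma2, -1.0 / (2.0 * sigma2)])    # natural parameters
a_eta = mu**2 / (2.0 * sigma2) + 0.5 * np.log(sigma2)   # log normalizer a(eta)

x = 0.3
t_x = np.array([x, x**2])                               # sufficient statistics t(x)
h_x = 1.0 / np.sqrt(2.0 * np.pi)                        # base measure

p_expfam = h_x * np.exp(eta @ t_x - a_eta)
p_direct = norm.pdf(x, loc=mu, scale=np.sqrt(sigma2))
assert np.isclose(p_expfam, p_direct)
```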
Conjugate Priors

Likelihood:
p(x \mid \eta) = h(x) \exp\{\eta^T t(x) - a(\eta)\}

Conjugate prior, with \lambda := (\lambda_1, \lambda_2):
p(\eta \mid \lambda) = h(\eta) \exp\{\lambda_1^T \eta - \lambda_2 \, a(\eta) - a(\lambda)\}

Joint:
p(x, \eta) = p(x \mid \eta)\, p(\eta \mid \lambda)
= h(x)\, h(\eta) \exp\{(\lambda_1 + t(x))^T \eta - (\lambda_2 + 1)\, a(\eta) - a(\lambda)\}
Conjugate Priors: Joint, Marginal, Posterior

Joint:
p(x, \eta) = p(x \mid \eta)\, p(\eta \mid \lambda)
= h(x)\, h(\eta) \exp\{(\lambda_1 + t(x))^T \eta - (\lambda_2 + 1)\, a(\eta) - a(\lambda)\}

Marginal:
p(x) = \int d\eta \; p(x, \eta) = h(x) \exp\{a(\lambda_1 + t(x), \lambda_2 + 1) - a(\lambda)\}

Posterior:
p(\eta \mid x) = \frac{p(x, \eta)}{p(x)}
= h(\eta) \exp\{(\lambda_1 + t(x))^T \eta - (\lambda_2 + 1)\, a(\eta) - a(\lambda_1 + t(x), \lambda_2 + 1)\}

Conjugacy: the posterior is in the same family as the prior, and its log normalizer can be computed from the marginal.
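A concrete instance (not worked out in the lecture) is the Bernoulli likelihood:
p(x \mid \theta) = \theta^x (1 - \theta)^{1 - x} = \exp\{x\,\eta - \log(1 + e^{\eta})\}, \qquad \eta = \log\tfrac{\theta}{1 - \theta},
so t(x) = x, a(\eta) = \log(1 + e^{\eta}), h(x) = 1. A prior of the conjugate form p(\eta \mid \lambda) \propto h(\eta)\exp\{\lambda_1 \eta - \lambda_2 a(\eta)\} corresponds to a Beta distribution on \theta, and observing x updates the parameters as (\lambda_1, \lambda_2) \to (\lambda_1 + x, \lambda_2 + 1), mirroring the familiar pseudo-count update \text{Beta}(\alpha, \beta) \to \text{Beta}(\alpha + x, \beta + 1 - x).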
Homework: Gibbs Sampling

Idea: Ensure that the likelihood is conjugate to the prior.

p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\})
= \frac{\prod_{n : z_n = k} p(y_n \mid z_n, \mu_k, \Sigma_k) \; p(\mu_k, \Sigma_k)}{p(\{y_n : z_n = k\})}

(posterior = likelihood \times conjugate prior for cluster k, divided by the marginal likelihood)

Derive the posterior p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\}) in the homework.
Moments: Derivatives of the Log Normalizer

p(x \mid \eta) = h(x) \exp\{\eta^T t(x) - a(\eta)\}

\int dx \; p(x \mid \eta) = 1
\quad\Rightarrow\quad
\exp\{a(\eta)\} = \int dx \; h(x) \exp\{\eta^T t(x)\}

\nabla_\eta a(\eta)
= \nabla_\eta \log \int dx \; h(x) \exp\{\eta^T t(x)\}
= \frac{\int dx \; t(x)\, h(x) \exp\{\eta^T t(x)\}}{\int dx \; h(x) \exp\{\eta^T t(x)\}}
\qquad \text{(chain rule)}

= \int dx \; t(x)\, p(x \mid \eta)
= \mathbb{E}_{p(x \mid \eta)}[t(x)]
\qquad \text{(first moment)}
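For example (not in the notes), for the Bernoulli family with a(\eta) = \log(1 + e^{\eta}) and t(x) = x,
\nabla_\eta a(\eta) = \frac{e^{\eta}}{1 + e^{\eta}} = \theta = \mathbb{E}_{p(x \mid \eta)}[x],
which is exactly the first moment of the sufficient statistic.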
Natural Parameters and Moments

p(x \mid \eta) = h(x) \exp\{\eta^T t(x) - a(\eta)\}

Moments of t(x) are computable from derivatives of a(\eta):
\frac{\partial a(\eta)}{\partial \eta_d} = \mathbb{E}_{p(x \mid \eta)}[t_d(x)]

When the statistics t_d(x) are linearly independent, the exponential family is known as minimal. For any minimal family, a(\eta) is convex and
\mu := \nabla_\eta a(\eta) = \mathbb{E}_{p(x \mid \eta)}[t(x)]
defines a one-to-one mapping from \eta to \mu.
Natural Parameters and Moments

Example: Normal distribution
\eta = \left(\frac{\mu}{\sigma^2}, -\frac{1}{2\sigma^2}\right)
\quad\Leftrightarrow\quad
\mathbb{E}[t(x)] = \left(\mathbb{E}[x], \mathbb{E}[x^2]\right) = \left(\mu, \sigma^2 + \mu^2\right)

Example: Discrete distribution
p(z \mid \theta) = \prod_k \theta_k^{\mathbb{I}[z = k]} = \exp\left\{\sum_k \mathbb{I}[z = k] \log \theta_k\right\}
\eta_k = \log \theta_k
\quad\Leftrightarrow\quad
\mathbb{E}[t_k(z)] = \mathbb{E}[\mathbb{I}[z = k]] = \theta_k
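The Gaussian correspondence above can be checked numerically. The sketch below (not from the notes) compares a finite-difference gradient of a(\eta) against the moments (\mu, \sigma^2 + \mu^2) for arbitrary example values:

```python
import numpy as np

def log_normalizer(eta):
    """a(eta) for the univariate Gaussian, eta = (mu/sigma^2, -1/(2 sigma^2))."""
    eta1, eta2 = eta
    sigma2 = -1.0 / (2.0 * eta2)
    mu = eta1 * sigma2
    return mu**2 / (2.0 * sigma2) + 0.5 * np.log(sigma2)

mu, sigma2 = 1.5, 0.7                                   # arbitrary example values
eta = np.array([mu / sigma2, -1.0 / (2.0 * sigma2)])

# Central finite-difference gradient of a(eta)
eps = 1e-6
grad = np.array([
    (log_normalizer(eta + eps * e) - log_normalizer(eta - eps * e)) / (2 * eps)
    for e in np.eye(2)
])

# Should match the moments E[x] = mu and E[x^2] = sigma^2 + mu^2
print(grad, np.array([mu, sigma2 + mu**2]))
```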
Expectation Maximization: Algorithm

Objective:
\mu^*, \Sigma^*, \pi^* = \underset{\mu, \Sigma, \pi}{\text{argmax}} \; \log p(y_{1:N} \mid \mu, \Sigma, \pi)

Repeat until convergence (objective unchanged):

1. For n in 1, \ldots, N and k in 1, \ldots, K:
\gamma_{nk} = \mathbb{E}_{p(z_n \mid y_n, \theta)}[\mathbb{I}[z_n = k]] = p(z_n = k \mid y_n, \theta)

2. For k in 1, \ldots, K:
N_k = \sum_{n=1}^{N} \gamma_{nk} \qquad \text{(number of points in cluster } k\text{)}
\mu_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma_{nk}\, y_n \qquad \text{(empirical mean of cluster } k\text{)}
\Sigma_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma_{nk}\, (y_n - \mu_k)(y_n - \mu_k)^T \qquad \text{(empirical covariance of cluster } k\text{)}
\pi_k = \frac{N_k}{N} \qquad \text{(fraction of points in cluster } k\text{)}
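The updates above translate directly into code. The following is a minimal sketch (not the course's reference implementation), assuming data y of shape (N, D) and using scipy for the Gaussian density; it runs a fixed number of iterations rather than monitoring the objective, and adds a small ridge to keep covariances well conditioned:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(y, K, n_iters=100, seed=0):
    """EM for a K-component Gaussian mixture; y has shape (N, D)."""
    rng = np.random.default_rng(seed)
    N, D = y.shape
    # Initialization: random data points as means, identity covariances, uniform weights
    mu = y[rng.choice(N, size=K, replace=False)]
    Sigma = np.stack([np.eye(D)] * K)
    pi = np.full(K, 1.0 / K)

    for _ in range(n_iters):
        # E-step: responsibilities gamma_nk = p(z_n = k | y_n, theta)
        joint = np.stack(
            [pi[k] * multivariate_normal.pdf(y, mean=mu[k], cov=Sigma[k]) for k in range(K)],
            axis=1,
        )                                         # (N, K)
        gamma = joint / joint.sum(axis=1, keepdims=True)

        # M-step: weighted empirical means, covariances, and mixture fractions
        Nk = gamma.sum(axis=0)                    # (K,)
        mu = (gamma.T @ y) / Nk[:, None]          # (K, D)
        for k in range(K):
            diff = y - mu[k]
            Sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(D)
        pi = Nk / N

    return pi, mu, Sigma, gamma
```

In practice one would track \log p(y_{1:N} \mid \mu, \Sigma, \pi) across iterations and stop once it no longer increases.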
Expectation Maximization Example: Iterations 0 through 6 (plots of the mixture fit at each EM iteration; figures omitted).
Intermezzo: Jensen's Inequality

Convex functions: the area above the curve is a convex set;
f(t x_1 + (1 - t) x_2) \le t f(x_1) + (1 - t) f(x_2)

Concave functions: the area below the curve is a convex set;
f(t x_1 + (1 - t) x_2) \ge t f(x_1) + (1 - t) f(x_2)

Corollary for random variables: for a concave function f,
f(\mathbb{E}[x]) \ge \mathbb{E}[f(x)].
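For a concrete check (not in the notes): if x takes the values 1 and e^2 each with probability 1/2, then \mathbb{E}[\log x] = 1 while \log \mathbb{E}[x] = \log \frac{1 + e^2}{2} \approx 1.43, consistent with \log \mathbb{E}[x] \ge \mathbb{E}[\log x] for the concave function \log.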
Lower Bounds on Likelihoods

Idea: Use Jensen's inequality to define a lower bound on the marginal likelihood. Since \log is concave,
\log \mathbb{E}[x] \ge \mathbb{E}[\log x].

Gaussian mixture model: for any distribution q(z),
\log p(y; \theta)
= \log \sum_z p(y, z; \theta)
= \log \sum_z q(z) \frac{p(y, z; \theta)}{q(z)}
\ge \sum_z q(z) \log \frac{p(y, z; \theta)}{q(z)}
=: \mathcal{L}(\theta, q)

\mathcal{L}(\theta, q) is a lower bound on \log p(y; \theta).
Generalized Expectation Maximization: Algorithm

Objective: \mathcal{L}(\theta, q)

Initialize \theta; define
\mathcal{L}(\theta, q) := \mathbb{E}_{q(z)}\left[\log \frac{p(y, z; \theta)}{q(z)}\right] \le \log p(y; \theta)

Repeat until \mathcal{L}(\theta, q) is unchanged:
1. Expectation step: q = \underset{q}{\text{argmax}} \; \mathcal{L}(\theta, q)
2. Maximization step: \theta = \underset{\theta}{\text{argmax}} \; \mathcal{L}(\theta, q)
Intermezzo: Kullback-Leibler Divergence

KL(q(x) \,\|\, p(x)) := \int dx \; q(x) \log \frac{q(x)}{p(x)} = \mathbb{E}_{q(x)}\left[\log \frac{q(x)}{p(x)}\right]

Measures how much q(x) deviates from p(x).

Properties:
1. Positive semi-definite: KL(q(x) \,\|\, p(x)) \ge 0.
KL(q(x) \,\|\, p(x))
= \mathbb{E}_{q(x)}\left[\log \frac{q(x)}{p(x)}\right]
= -\mathbb{E}_{q(x)}\left[\log \frac{p(x)}{q(x)}\right]
\ge -\log \mathbb{E}_{q(x)}\left[\frac{p(x)}{q(x)}\right]
= -\log \int dx \; q(x) \frac{p(x)}{q(x)}
= -\log 1 = 0
\qquad \text{(by Jensen's inequality)}
2. KL(q(x) \,\|\, p(x)) = 0 \iff q(x) = p(x).
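As a small worked example (not from the lecture), for two Bernoulli distributions with parameters q = 1/2 and p = 1/4,
KL(\text{Bern}(1/2) \,\|\, \text{Bern}(1/4)) = \tfrac{1}{2} \log \tfrac{1/2}{1/4} + \tfrac{1}{2} \log \tfrac{1/2}{3/4} = \tfrac{1}{2} \log 2 - \tfrac{1}{2} \log \tfrac{3}{2} \approx 0.14 \ge 0,
and the divergence vanishes only when q = p.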
Lower Bound vs KL Divergence

\mathcal{L}(\theta, \eta)
= \mathbb{E}_{q(z; \eta)}\left[\log \frac{p(y, z; \theta)}{q(z; \eta)}\right]
= \mathbb{E}_{q(z; \eta)}\left[\log \frac{p(z \mid y; \theta)\, p(y; \theta)}{q(z; \eta)}\right]
\qquad \text{(rewrite using } p(y, z; \theta) = p(z \mid y; \theta)\, p(y; \theta)\text{)}
= \log p(y; \theta) + \mathbb{E}_{q(z; \eta)}\left[\log \frac{p(z \mid y; \theta)}{q(z; \eta)}\right]
= \underbrace{\log p(y; \theta)}_{\text{does not depend on } \eta} - \underbrace{KL(q(z; \eta) \,\|\, p(z \mid y; \theta))}_{\text{depends on } \eta}

Implication: maximizing \mathcal{L}(\theta, \eta) with respect to \eta is equivalent to minimizing KL(q(z; \eta) \,\|\, p(z \mid y; \theta)).
Generalized Expectation Maximization: Algorithm

Initialize \theta; define
\mathcal{L}(\theta, q) := \mathbb{E}_{q(z)}\left[\log \frac{p(y, z; \theta)}{q(z)}\right] \le \log p(y; \theta)

Repeat until unchanged:
1. Expectation: q(z_n) \leftarrow p(z_n \mid y_n; \theta)
\gamma_{nk} := p(z_n = k \mid y_n; \theta) = \frac{p(y_n, z_n = k; \theta)}{\sum_{l=1}^{K} p(y_n, z_n = l; \theta)}
2. Maximization: solve \frac{\partial \mathcal{L}(\theta, q)}{\partial \theta} = 0 for \theta (see next slide).
Generalized EM: Maximization Step

Generalized update for \mu_l, "hard" K-means:
\frac{\partial}{\partial \mu_l} \sum_{n=1}^{N} \log p(y_n, z_n)
= \sum_{n=1}^{N} \mathbb{I}[z_n = l]\, \Sigma_l^{-1} (y_n - \mu_l) = 0

Generalized update for \mu_l, EM:
\frac{\partial}{\partial \mu_l} \mathcal{L}(\theta, q)
= \frac{\partial}{\partial \mu_l} \sum_{n=1}^{N} \mathbb{E}_{q(z_n)}[\log p(y_n, z_n; \mu, \Sigma)]
= \sum_{n=1}^{N} \underbrace{\mathbb{E}_{q(z_n)}[\mathbb{I}[z_n = l]]}_{= \gamma_{nl}} \Sigma_l^{-1} (y_n - \mu_l) = 0
Generalized EM: Maximization Step

Generalized update for \mu_l, "hard" K-means:
\frac{\partial}{\partial \mu_l} \sum_{n=1}^{N} \log p(y_n, z_n)
= \sum_{n=1}^{N} \mathbb{I}[z_n = l]\, \Sigma_l^{-1} (y_n - \mu_l) = 0
\qquad
\text{Solution: } \mu_l = \frac{\sum_{n=1}^{N} \mathbb{I}[z_n = l]\, y_n}{\sum_{n=1}^{N} \mathbb{I}[z_n = l]}

Generalized update for \mu_l, EM:
\frac{\partial}{\partial \mu_l} \mathcal{L}(\theta, q)
= \sum_{n=1}^{N} \gamma_{nl}\, \Sigma_l^{-1} (y_n - \mu_l) = 0
\qquad
\text{Solution: } \mu_l = \frac{1}{N_l} \sum_{n=1}^{N} \gamma_{nl}\, y_n,
\qquad N_l = \sum_{n=1}^{N} \gamma_{nl}
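The two solutions differ only in whether the weights are hard indicators or soft responsibilities. A short sketch (not from the notes), assuming hard assignments z of shape (N,) with values in {0, ..., K-1} and responsibilities gamma of shape (N, K), each cluster receiving at least some weight:

```python
import numpy as np

def hard_means(y, z, K):
    """Hard K-means update: mu_l = sum_n I[z_n = l] y_n / sum_n I[z_n = l]."""
    onehot = np.eye(K)[z]                        # (N, K) indicator matrix
    return (onehot.T @ y) / onehot.sum(axis=0)[:, None]

def soft_means(y, gamma):
    """Generalized EM update: mu_l = sum_n gamma_nl y_n / N_l, N_l = sum_n gamma_nl."""
    Nl = gamma.sum(axis=0)                       # (K,)
    return (gamma.T @ y) / Nl[:, None]
```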
Generalized Expectation Maximization: Algorithm

Initialize \theta; define
\mathcal{L}(\theta, q) := \mathbb{E}_{q(z)}\left[\log \frac{p(y, z; \theta)}{q(z)}\right] \le \log p(y; \theta)

Repeat until unchanged:
1. Expectation: q(z) \leftarrow p(z \mid y; \theta)
\gamma_{nk} := \mathbb{E}_q[\mathbb{I}[z_n = k]] = p(z_n = k \mid y_n; \theta) = \frac{p(y_n, z_n = k; \theta)}{\sum_{l=1}^{K} p(y_n, z_n = l; \theta)}
2. Maximization: solve \frac{\partial \mathcal{L}(\theta, q)}{\partial \theta} = 0:
N_k = \sum_{n=1}^{N} \gamma_{nk},
\qquad
\mu_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma_{nk}\, y_n,
\qquad
\Sigma_k = \frac{1}{N_k} \sum_{n=1}^{N} \gamma_{nk}\, y_n y_n^T - \mu_k \mu_k^T