SLIDE 1

Recap: Bayesian Inference

Prior: $p(x)$
Likelihood: $p(y \mid x)$
Posterior: $p(x \mid y) = \dfrac{p(y \mid x)\, p(x)}{p(y)}$
Evidence: $p(y) = \int \mathrm{d}x\; p(y \mid x)\, p(x)$ (marginal likelihood)
MAP: $\hat{x} = \operatorname*{argmax}_x\, p(x \mid y) = \operatorname*{argmax}_x\, p(y \mid x)\, p(x)$
Predictive: $p(y^* \mid y) = \int \mathrm{d}x\; p(y^* \mid x)\, p(x \mid y)$
Expectation: $\mathbb{E}[f(x) \mid y] = \int \mathrm{d}x\; p(x \mid y)\, f(x)$
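These recap quantities can be checked on a small discrete example, where the evidence integral becomes a sum. A minimal sketch; the candidate values, prior, and observation below are made up for illustration:

```python
# Discrete Bayes update: evidence, posterior, MAP, and predictive for a
# coin whose bias x takes one of a few candidate values (toy numbers).
candidates = [0.2, 0.5, 0.8]             # possible values of x
prior = {x: 1 / 3 for x in candidates}   # p(x): uniform prior

def likelihood(y, x):
    """p(y | x): probability of heads (y=1) or tails (y=0)."""
    return x if y == 1 else 1 - x

y = 1  # observed: one heads

# Evidence p(y) = sum_x p(y|x) p(x)  (the integral becomes a sum)
evidence = sum(likelihood(y, x) * prior[x] for x in candidates)

# Posterior p(x|y) = p(y|x) p(x) / p(y)
posterior = {x: likelihood(y, x) * prior[x] / evidence for x in candidates}

# MAP estimate: argmax_x p(x|y)
x_map = max(posterior, key=posterior.get)

# Predictive p(y*=1 | y) = sum_x p(y*=1|x) p(x|y)
predictive = sum(likelihood(1, x) * posterior[x] for x in candidates)

print(evidence, posterior, x_map, predictive)
```

After seeing one heads, the posterior shifts mass toward the larger bias values, so the MAP estimate is the largest candidate.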
SLIDE 2

Today: Graphical Models

Exploit conditional independence: reduce a single high-dimensional integral into multiple lower-dimensional integrals.
SLIDE 3

Directed vs. Undirected Graphs

[Figure: a directed graph and an undirected graph over the same nodes]

Both can define the same density $p(a, b, c, d, e)$ (more later).

SLIDE 4

Cyclic vs. Acyclic Graphs

[Figure: a directed graph with a cycle vs. a directed acyclic graph (DAG)]

Definition: in a DAG, no node can be visited twice by following directed edges.
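The acyclicity condition can be checked mechanically with a depth-first search: a directed graph has a cycle exactly when DFS meets a node that is still on the current recursion stack. A sketch; the edge-list representation is an assumption, not from the slides:

```python
def is_dag(edges):
    """Return True iff the directed graph given as (parent, child)
    edge pairs contains no directed cycle."""
    nodes = {n for e in edges for n in e}
    children = {n: [b for a, b in edges if a == n] for n in nodes}
    WHITE, GRAY, BLACK = 0, 1, 2        # unvisited / on stack / finished
    color = {n: WHITE for n in nodes}

    def visit(n):
        color[n] = GRAY
        for c in children[n]:
            if color[c] == GRAY:        # back edge: node revisited -> cycle
                return False
            if color[c] == WHITE and not visit(c):
                return False
        color[n] = BLACK
        return True

    for n in nodes:
        if color[n] == WHITE and not visit(n):
            return False
    return True

print(is_dag([("a", "b"), ("b", "c"), ("a", "c")]))  # acyclic
print(is_dag([("a", "b"), ("b", "c"), ("c", "a")]))  # cycle a->b->c->a
```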
SLIDE 5

Connected Graphs

[Figure: a singly connected graph (one path between any two nodes) vs. a multiply connected graph (multiple paths)]
SLIDE 6

Bayesian Networks

Directed Acyclic Graph (DAG). Example: Belief Network.

B: Sherlock Holmes' apartment was burgled
A: the alarm went off
W: Watson heard the alarm
G: Mrs Gibbon heard the alarm

Conditional dependencies:
$p(G, W, A, B) = p(G \mid A)\, p(W \mid A)\, p(A \mid B)\, p(B)$
$p(B \mid W, G) = p(W, G, B) / p(W, G)$ (numerator: marginal over $A$; denominator: marginal over $A$ and $B$)
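The query $p(B \mid W, G)$ can be computed by enumeration over the factorization above. A sketch; all conditional probability values below are invented for illustration, not taken from the slides:

```python
from itertools import product

# Joint from the slide's factorization:
# p(G,W,A,B) = p(G|A) p(W|A) p(A|B) p(B). The numbers are made up.
p_B = {1: 0.01, 0: 0.99}                                 # burglary prior
p_A_given_B = {1: {1: 0.95, 0: 0.05}, 0: {1: 0.01, 0: 0.99}}
p_W_given_A = {1: {1: 0.80, 0: 0.20}, 0: {1: 0.10, 0: 0.90}}
p_G_given_A = {1: {1: 0.70, 0: 0.30}, 0: {1: 0.05, 0: 0.95}}

def joint(g, w, a, b):
    return p_G_given_A[a][g] * p_W_given_A[a][w] * p_A_given_B[b][a] * p_B[b]

# p(B=1 | W=1, G=1) = p(W=1, G=1, B=1) / p(W=1, G=1):
# numerator marginalizes over A, denominator over A and B.
num = sum(joint(1, 1, a, 1) for a in (0, 1))
den = sum(joint(1, 1, a, b) for a, b in product((0, 1), repeat=2))
print(num / den)
```

With these toy numbers, hearing both witnesses raises the burglary probability well above its prior.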
SLIDE 7

Bayesian Networks: structure reflects causal relationships.
SLIDE 8

Bayesian Networks

Example: Naive Bayes (simplified case: two classes)

$Y_i \sim \text{Bernoulli}(\mu)$, $i = 1, \ldots, N$ (# docs; message $i$ is spam)
$X_{ij} \mid Y_i = c \sim \text{Bernoulli}(\theta_{jc})$, $j = 1, \ldots, D$ (# words; word $j$ occurs in message $i$)

Plates in the graph denote replication over $i$ and $j$.
SLIDE 9

Bayesian Networks

Example: Naive Bayes

Conditional independence:
$p(X, Y) = \prod_{i=1}^{N} p(Y_i, X_i)$ (docs are independent)
$p(X_i, Y_i) = p(X_i \mid Y_i)\, p(Y_i)$
$p(X_i \mid Y_i) = \prod_{j=1}^{D} p(X_{ij} \mid Y_i)$ (does not depend on other docs $i' \neq i$)
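The factorization above can be written out directly: the joint is a product over documents, and within a document a product over words given the class. A minimal sketch with invented parameters ($\mu$ and $\theta_{jc}$ values are illustrative only):

```python
# Naive Bayes joint from the slide:
# p(X, Y) = prod_i p(Y_i) prod_j p(X_ij | Y_i). Toy parameters.
mu = 0.4                      # p(Y_i = 1): spam probability
theta = {                     # theta[j][c] = p(X_ij = 1 | Y_i = c)
    0: {0: 0.1, 1: 0.8},      # word 0 is common in spam
    1: {0: 0.6, 1: 0.2},
}

def bern(x, p):
    """Bernoulli pmf: p if x == 1, else 1 - p."""
    return p if x == 1 else 1 - p

def p_xy(X, Y):
    """Joint probability of a binary doc-word matrix X and labels Y."""
    total = 1.0
    for xi, yi in zip(X, Y):              # docs are independent
        total *= bern(yi, mu)
        for j, xij in enumerate(xi):      # words independent given class
            total *= bern(xij, theta[j][yi])
    return total

X = [[1, 0], [0, 1]]
Y = [1, 0]
print(p_xy(X, Y))
```

Because docs factor independently, the joint over two documents equals the product of the two single-document joints.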
SLIDE 10

Bayesian Networks

Example: Naive Bayes (with unknown params)

$Y_i \mid \mu \sim \text{Bernoulli}(\mu)$
$X_{ij} \mid Y_i = c,\ \theta_{jc} \sim \text{Bernoulli}(\theta_{jc})$
$\mu \sim \text{Beta}(\alpha^{\mu}, \beta^{\mu})$
$\theta_{jc} \sim \text{Beta}(\alpha^{\theta}, \beta^{\theta})$

Question: what is $p(X, Y)$?
SLIDE 11

Bayesian Networks

Example: Naive Bayes (with unknown params)

Conditional dependencies:
$p(X, Y) = p(X \mid Y)\, p(Y)$
$p(Y) = \int \mathrm{d}\mu\; p(\mu) \prod_{i=1}^{N} p(Y_i \mid \mu)$
$p(X \mid Y) = \int \mathrm{d}\theta\; \prod_{j,c} p(\theta_{jc}) \prod_{i=1}^{N} \prod_{j=1}^{D} p(X_{ij} \mid \theta_{j Y_i})$
SLIDE 12

Colliders

Definition: a node $c$ is a collider on a path when the path contains $a \to c \leftarrow b$.

[Figure: example paths, one without a collider and one with a collider]
SLIDE 13

D-Separation and D-Connection

Definition: Let $X$, $Y$, and $Z$ be disjoint sets of nodes in a graph $G$. $X$ and $Y$ are d-connected by $Z$ when:
1. there exists an undirected path $U$ between some $x \in X$ and some $y \in Y$;
2. for every collider $c$ on $U$, either $c \in Z$ or a descendant of $c$ is in $Z$;
3. no non-collider on $U$ is in $Z$.

$X$ and $Y$ are d-separated by $Z$ in all other cases.
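The definition can be implemented directly for small graphs by enumerating every undirected path and applying the two blocking rules. A sketch; the edge-list input format is an assumption, and the path enumeration is exponential, so this is only for the toy examples in these slides:

```python
def d_separated(edges, x, y, z):
    """Check whether x and y are d-separated by the set z in a DAG.

    A path is active unless it contains a non-collider in z, or a
    collider with no descendant (itself included) in z; x and y are
    d-separated iff no active path exists.
    """
    nodes = {n for e in edges for n in e}
    children = {n: {b for a, b in edges if a == n} for n in nodes}
    neighbors = {n: children[n] | {a for a, b in edges if b == n} for n in nodes}

    def descendants(n):
        out, stack = {n}, [n]
        while stack:
            for c in children[stack.pop()] - out:
                out.add(c)
                stack.append(c)
        return out

    def paths(cur, visited):           # all simple undirected paths to y
        if cur == y:
            yield [cur]
            return
        for nxt in neighbors[cur] - visited:
            for rest in paths(nxt, visited | {nxt}):
                yield [cur] + rest

    for path in paths(x, {x}):
        active = True
        for prev, node, nxt in zip(path, path[1:], path[2:]):
            is_collider = node in children[prev] and node in children[nxt]
            if is_collider and not descendants(node) & set(z):
                active = False         # unobserved collider blocks the path
            elif not is_collider and node in z:
                active = False         # observed non-collider blocks the path
        if active:
            return False               # found an active path: d-connected
    return True

collider = [("A", "C"), ("B", "C"), ("C", "D")]
print(d_separated(collider, "A", "B", set()))    # blocked collider
print(d_separated(collider, "A", "B", {"C"}))    # observed collider opens it
print(d_separated(collider, "A", "B", {"D"}))    # observed descendant opens it
```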
SLIDE 14

D-Separation and D-Connection

Conditional independence: $X \perp Y \mid Z \iff p(x, y \mid z) = p(x \mid z)\, p(y \mid z)$

This holds whenever $X$ and $Y$ are d-separated by $Z$.

[Figure: example graphs with colliders, asking which independences hold]
SLIDE 15

D-Separation and D-Connection

Collider ($A \to C \leftarrow B$): $A \perp B$, but $A \not\perp B \mid C$:
$p(a, b) = \int \mathrm{d}c\; p(c \mid a, b)\, p(a)\, p(b) = p(a)\, p(b)$
$p(a, b \mid c) = \dfrac{p(c \mid a, b)\, p(a)\, p(b)}{p(c)} \neq p(a \mid c)\, p(b \mid c)$ in general

Common cause ($A \leftarrow C \to B$): $A \not\perp B$, but $A \perp B \mid C$:
$p(a, b) = \int \mathrm{d}c\; p(a \mid c)\, p(b \mid c)\, p(c) \neq p(a)\, p(b)$ in general
$p(a, b \mid c) = \dfrac{p(a \mid c)\, p(b \mid c)\, p(c)}{p(c)} = p(a \mid c)\, p(b \mid c)$
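The collider case can be verified numerically: marginally $A$ and $B$ factorize exactly, but conditioning on $C$ makes them dependent ("explaining away"). A sketch with an invented conditional table for $C$:

```python
from itertools import product

# Collider A -> C <- B over binary variables.
p_a = {0: 0.5, 1: 0.5}
p_b = {0: 0.5, 1: 0.5}
p_c1 = {(0, 0): 0.1, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}  # p(C=1 | a, b)

def joint(a, b, c):
    pc = p_c1[(a, b)] if c == 1 else 1 - p_c1[(a, b)]
    return p_a[a] * p_b[b] * pc

# Marginal: p(a, b) = sum_c joint = p(a) p(b)  ->  A independent of B
p_ab = sum(joint(1, 1, c) for c in (0, 1))
print(p_ab, p_a[1] * p_b[1])          # equal

# Conditional on C=1: p(a, b | c) != p(a|c) p(b|c)  ->  dependence appears
pc1 = sum(joint(a, b, 1) for a, b in product((0, 1), repeat=2))
p_ab_c = joint(1, 1, 1) / pc1
p_a_c = sum(joint(1, b, 1) for b in (0, 1)) / pc1
p_b_c = sum(joint(a, 1, 1) for a in (0, 1)) / pc1
print(p_ab_c, p_a_c * p_b_c)          # differ
```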
SLIDE 16

D-Separation and D-Connection

$A \perp B \mid D$? No (observed descendant of a collider).
$A \perp B \mid C$? Yes (observed non-collider).
SLIDE 17

Exercise: for which examples does $X \perp Y \mid Z$ hold?

[Figure: five example graphs with paths between $X$ and $Y$]

Answers: No, No, No, No, Yes.
SLIDE 18

Exercise: for which examples does $X \perp Y \mid Z$ hold?

Answers: No, No, No, No, Yes.

Trick: cut links to/from unobserved colliders and observed non-colliders, then check whether $X$ and $Y$ are still connected.
SLIDE 19

Mixed-Membership Models

Mixture Models, Latent Dirichlet Allocation, Hidden Markov Models
SLIDE 20

Conditional Independence in Mixture Models

Generative model: $\theta = \{\pi, \mu_{1:K}, \Lambda_{1:K}\} \sim p(\theta)$, with
$\pi \sim p(\pi)$
$\mu_k, \Lambda_k \sim p(\mu, \Lambda)$, $k = 1, \ldots, K$ (will define these later)
$z_n \mid \pi \sim \text{Categorical}(\pi_1, \ldots, \pi_K)$
$x_n \mid z_n = k \sim \text{Normal}(\mu_k, \Lambda_k)$

Conditional independence:
$z_n \not\perp z_{m \neq n} \mid x_{1:N}$
$z_n \perp z_{m \neq n} \mid x_{1:N}, \theta$, so $p(z \mid x, \theta) = \prod_{n} p(z_n \mid x_n, \theta)$
$x_n \perp x_{m \neq n} \mid \theta$
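The factorization $p(z \mid x, \theta) = \prod_n p(z_n \mid x_n, \theta)$ means that, once $\theta$ is known, each responsibility can be computed from its own data point alone. A sketch for a 1-D Gaussian mixture with invented parameter values:

```python
import math

# Given theta = (pi, mu, sigma), each p(z_n | x_n, theta) depends only
# on x_n, per the slide. All parameter values below are toy numbers.
pi = [0.3, 0.7]                 # mixture weights
mu = [-2.0, 3.0]                # component means
sigma = [1.0, 1.5]              # component standard deviations

def normal_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def responsibilities(x):
    """p(z = k | x, theta), proportional to pi_k * Normal(x; mu_k, sigma_k)."""
    w = [pi[k] * normal_pdf(x, mu[k], sigma[k]) for k in range(len(pi))]
    total = sum(w)
    return [wk / total for wk in w]

for xn in [-2.5, 0.0, 3.2]:     # each point is processed independently
    print(xn, responsibilities(xn))
```

Without $\theta$, this per-point factorization would not hold: marginalizing over $\theta$ couples all the $z_n$, which is what makes posterior inference in mixture models nontrivial.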