

  1. Markov Random Fields
     Umamahesh Srinivas
     iPAL Group Meeting, February 25, 2011

  2. Outline
     1. Basic graph-theoretic concepts
     2. Markov chain
     3. Markov random field (MRF)
     4. Gauss-Markov random field (GMRF), and applications
     5. Other popular MRFs

  3. References
     1. Charles Bouman, Markov random fields and stochastic image models. Tutorial presented at ICIP 1995.
     2. Mario Figueiredo, Bayesian methods and Markov random fields. Tutorial presented at CVPR 1998.

  4. Basic graph-theoretic concepts
     A graph G = (V, E) is a finite collection of nodes (or vertices) V = {n_1, n_2, ..., n_N} together with a set of edges E, where each edge is an unordered pair of distinct nodes.
     We consider only undirected graphs.
     Neighbors: two nodes n_i, n_j ∈ V are neighbors if (n_i, n_j) ∈ E.
     Neighborhood of a node: N(n_i) = {n_j : (n_i, n_j) ∈ E}.
     Neighborhood is a symmetric relation: n_i ∈ N(n_j) ⇔ n_j ∈ N(n_i).
     Complete graph: every pair of distinct nodes is connected by an edge, i.e., ∀ n_i ∈ V, N(n_i) = V \ {n_i}.
     Clique: a complete subgraph of G.
     Maximal clique: a clique to which no further node can be added while retaining complete connectedness.


  7. Illustration
     V = {1, 2, 3, 4, 5, 6}
     E = {(1,2), (1,3), (2,4), (2,5), (3,4), (3,6), (4,6), (5,6)}
     N(4) = {2, 3, 6}
     Examples of cliques: {1}, {3, 4, 6}, {2, 5}
     Set of all cliques: the singletons in V, the edges in E, and the triangle {3, 4, 6}
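     A quick way to sanity-check these sets is to build the graph in code. The following is a minimal sketch using the networkx library (an assumption; the slides do not mention any software) to recover the neighborhood of node 4, the maximal cliques, and the full set of cliques:

         import networkx as nx

         # The example graph from this slide
         G = nx.Graph()
         G.add_edges_from([(1, 2), (1, 3), (2, 4), (2, 5),
                           (3, 4), (3, 6), (4, 6), (5, 6)])

         print(sorted(G.neighbors(4)))                   # [2, 3, 6]
         print([sorted(c) for c in nx.find_cliques(G)])  # maximal cliques, including [3, 4, 6]
         print([sorted(c) for c in nx.enumerate_all_cliques(G)])  # singletons, edges, and the triangle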

  8. Separation
     Let A, B, C be three disjoint subsets of V.
     C separates A from B if every path from a node in A to a node in B contains some node in C.
     Example: C = {1, 4, 6} separates A = {3} from B = {2, 5}.
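     Separation is easy to test mechanically: delete the nodes in C and check whether any A-to-B path survives. A minimal sketch, again assuming networkx:

         import networkx as nx

         def separates(G, A, B, C):
             # True iff every path from A to B passes through some node of C
             H = G.copy()
             H.remove_nodes_from(C)
             return not any(nx.has_path(H, a, b) for a in A for b in B)

         G = nx.Graph([(1, 2), (1, 3), (2, 4), (2, 5),
                       (3, 4), (3, 6), (4, 6), (5, 6)])
         print(separates(G, A={3}, B={2, 5}, C={1, 4, 6}))   # True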

  9. Markov chains
     Graphical model: associate each node of a graph with a random variable (or a collection thereof).
     Homogeneous 1-D Markov chain: p(x_n | x_i, i < n) = p(x_n | x_{n-1})
     Probability of a sequence: p(x) = p(x_0) ∏_{n=1}^{N} p(x_n | x_{n-1})
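     To make the factorization concrete, here is a minimal sketch that evaluates p(x) for a binary homogeneous chain; the initial distribution p0 and transition matrix P are made-up illustrative numbers, not values from the talk:

         import numpy as np

         p0 = np.array([0.5, 0.5])              # p(x_0)
         P = np.array([[0.9, 0.1],              # P[i, j] = p(x_n = j | x_{n-1} = i)
                       [0.2, 0.8]])

         def sequence_prob(x, p0, P):
             # p(x) = p(x_0) * prod_{n=1}^{N} p(x_n | x_{n-1})
             prob = p0[x[0]]
             for prev, cur in zip(x[:-1], x[1:]):
                 prob *= P[prev, cur]
             return prob

         print(sequence_prob([0, 0, 1, 1, 1], p0, P))   # 0.5 * 0.9 * 0.1 * 0.8 * 0.8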

  10. 2-D Markov chains
      Advantages: simple expressions for probability; simple parameter estimation.
      Disadvantages: no natural ordering of image pixels; anisotropic model behavior.

  11. Random fields on graphs
      Consider a collection of random variables x = (x_1, x_2, ..., x_N) with associated joint probability distribution p(x).
      Let A, B, C be three disjoint subsets of V. Let x_A denote the collection of random variables in A.
      Conditional independence: A ⊥⊥ B | C ⇔ p(x_A, x_B | x_C) = p(x_A | x_C) p(x_B | x_C)
      Markov random field: an undirected graphical model in which each node corresponds to a random variable (or a collection of random variables), and the edges identify conditional dependencies.


  13. Markov properties
      Pairwise Markovianity: (n_i, n_j) ∉ E ⇒ x_i and x_j are independent when conditioned on all other variables:
          p(x_i, x_j | x_{V\{i,j}}) = p(x_i | x_{V\{i,j}}) p(x_j | x_{V\{i,j}})
      Local Markovianity: given its neighborhood, a variable is independent of the rest of the variables:
          p(x_i | x_{V\{i}}) = p(x_i | x_{N(i)})
      Global Markovianity: let A, B, C be three disjoint subsets of V. If, whenever C separates A from B,
          p(x_A, x_B | x_C) = p(x_A | x_C) p(x_B | x_C),
      then p(·) is global Markov w.r.t. G.
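      Local Markovianity can be verified numerically on a toy model. The sketch below builds a Gibbs distribution for a 3-node chain x_1 - x_2 - x_3 with a made-up pairwise potential and checks that p(x_1 | x_2, x_3) = p(x_1 | x_2), since N(1) = {2}:

          import itertools
          import numpy as np

          def V(a, b):                          # illustrative pairwise potential
              return -1.0 if a == b else 1.0

          # Gibbs distribution over (x1, x2, x3), each x_i in {0, 1}
          states = list(itertools.product([0, 1], repeat=3))
          p = np.array([np.exp(-(V(x1, x2) + V(x2, x3))) for x1, x2, x3 in states])
          p /= p.sum()

          def cond(val, fixed):                 # p(x1 = val | variables in `fixed`)
              match = [all(s[k] == v for k, v in fixed.items()) for s in states]
              num = sum(pi for pi, s, m in zip(p, states, match) if m and s[0] == val)
              den = sum(pi for pi, m in zip(p, match) if m)
              return num / den

          print(cond(1, {1: 0, 2: 1}))          # p(x1=1 | x2=0, x3=1)
          print(cond(1, {1: 0}))                # p(x1=1 | x2=0) -- the same value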


  16. Hammersley-Clifford Theorem
      Consider a random field x on a graph G such that p(x) > 0. Let 𝒞 denote the set of all maximal cliques of the graph.
      If the field has the local Markov property, then p(x) can be written as a Gibbs distribution:
          p(x) = (1/Z) exp( − ∑_{C ∈ 𝒞} V_C(x_C) )
      where the normalizing constant Z is called the partition function, and the V_C(x_C) are the clique potentials.
      Conversely, if p(x) can be written in Gibbs form for the cliques of some graph, then it has the global Markov property.
      Fundamental consequence: every Markov random field can be specified via clique potentials.
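      As a concrete illustration of the Gibbs form, the sketch below builds p(x) from maximal-clique potentials on the 6-node example graph of slide 7, computing the partition function Z by brute-force enumeration; the agreement-rewarding potential is an illustrative choice, not one from the slides:

          import itertools
          import numpy as np
          import networkx as nx

          G = nx.Graph([(1, 2), (1, 3), (2, 4), (2, 5),
                        (3, 4), (3, 6), (4, 6), (5, 6)])
          cliques = [tuple(c) for c in nx.find_cliques(G)]    # maximal cliques

          def V_C(values):
              # Clique potential: low energy when all clique members agree
              return 0.0 if len(set(values)) == 1 else 1.0

          def energy(x):                                      # x maps node -> {0, 1}
              return sum(V_C([x[n] for n in c]) for c in cliques)

          nodes = sorted(G.nodes)
          states = [dict(zip(nodes, s))
                    for s in itertools.product([0, 1], repeat=len(nodes))]
          unnorm = np.array([np.exp(-energy(x)) for x in states])
          Z = unnorm.sum()                                    # partition function
          p = unnorm / Z
          print(Z, p.sum())                                   # p sums to 1 by construction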


  19. Regular rectangular lattices
      V = {(i, j) : i = 1, ..., M, j = 1, ..., N}
      Order-K neighborhood system: N_K(i, j) = {(m, n) : 0 < (i − m)^2 + (j − n)^2 ≤ K}
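      The following minimal sketch enumerates the offsets in an order-K neighborhood; K = 1 yields the familiar 4-neighborhood and K = 2 the 8-neighborhood:

          def neighborhood_offsets(K):
              # Offsets (dm, dn) with 0 < dm^2 + dn^2 <= K
              r = int(K ** 0.5)
              return [(dm, dn) for dm in range(-r, r + 1)
                               for dn in range(-r, r + 1)
                               if 0 < dm * dm + dn * dn <= K]

          print(neighborhood_offsets(1))   # [(-1, 0), (0, -1), (0, 1), (1, 0)]
          print(neighborhood_offsets(2))   # adds the four diagonal offsets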

  20. Auto-models
      Only pairwise interactions; in terms of clique potentials, |C| > 2 ⇒ V_C(·) = 0.
      Simplest possible neighborhood models.

  21. Gauss-Markov Random Fields (GMRF)
      Joint probability density (assuming zero mean):
          p(x) = (2π)^{−n/2} |Σ|^{−1/2} exp( −(1/2) x^T Σ^{−1} x )
      Quadratic form in the exponent: x^T Σ^{−1} x = ∑_i ∑_j x_i x_j (Σ^{−1})_{ij} ⇒ auto-model
      The neighborhood system is determined by the potential matrix Σ^{−1}.
      Local conditionals are univariate Gaussian.
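      To see why local conditionals are univariate Gaussian, note that for a zero-mean GMRF with precision matrix Q = Σ^{−1}, x_i given the rest is Gaussian with mean −(1/Q_ii) ∑_{j≠i} Q_ij x_j and variance 1/Q_ii, so only the neighbors of i (the nonzeros in row i of Q) matter. The sketch below checks this against standard Gaussian conditioning on a made-up tridiagonal (chain-graph) precision matrix:

          import numpy as np

          Q = np.array([[ 2.0, -1.0,  0.0],
                        [-1.0,  2.0, -1.0],
                        [ 0.0, -1.0,  2.0]])    # zeros in Q <=> missing edges
          Sigma = np.linalg.inv(Q)

          i, idx = 1, [0, 2]
          x_rest = np.array([0.7, -0.3])        # observed values of x_0, x_2

          # Conditional via the precision matrix (uses only neighbors of i)
          mean_local = -(Q[i, idx] @ x_rest) / Q[i, i]
          var_local = 1.0 / Q[i, i]

          # Conditional via standard Gaussian conditioning on the joint covariance
          S_ab = Sigma[np.ix_([i], idx)]
          S_bb = Sigma[np.ix_(idx, idx)]
          mean_joint = (S_ab @ np.linalg.solve(S_bb, x_rest)).item()
          var_joint = (Sigma[i, i] - S_ab @ np.linalg.solve(S_bb, S_ab.T)).item()

          print(mean_local, mean_joint)         # identical
          print(var_local, var_joint)           # identical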
