

  1. EM-like algorithms for nonparametric estimation in multivariate mixtures. Didier Chauveau, MAPMO – UMR 6628 – Université d'Orléans. Joint work with D. Hunter & T. Benaglia (Penn State University, USA). COMPSTAT 2010 – Paris, August 24th, 2010.

  2. Outline
     1. Mixture models and EM algorithms: motivations, examples and notation; review of EM algorithm-ology
     2. Multivariate non-parametric "npEM" algorithms: model and algorithm; examples; adaptive bandwidths in the npEM algorithm
     3. Further extensions

  3.–4. Outline: Next up. . . Mixture models and EM algorithms: motivations, examples and notation; review of EM algorithm-ology.

  5.–7. Finite mixture estimation problem
     Multivariate observation x = (x_1, . . . , x_r) ∈ R^r from the mixture
         g(x) = Σ_{j=1}^m λ_j f_j(x)
     Assume independence of x_1, . . . , x_r conditional on the component from which x comes (Hall and Zhou 2003, . . . ):
         g(x) = Σ_{j=1}^m λ_j Π_{k=1}^r f_{jk}(x_k),
     i.e. the dependence between coordinates is induced by the mixture.
     Goal: estimate θ = (λ, f) given an i.i.d. sample from g.
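Not part of the slides: a minimal simulation sketch of this conditional-independence model. All concrete choices below (m = 2 components, r = 3 coordinates, Gaussian f_jk) are illustrative assumptions, not from the talk. The component label is drawn first, then the r coordinates independently given the component, so any marginal correlation between coordinates comes only from the mixture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: m = 2 components, r = 3 coordinates,
# f_jk = N(mu[j, k], 1) chosen purely for this sketch.
lam = np.array([0.4, 0.6])                  # mixing proportions lambda_j
mu = np.array([[0.0, 0.0, 0.0],
               [3.0, 3.0, 3.0]])            # mu[j, k]

def rmix(n):
    """Draw n observations: component label first, then the r
    coordinates independently given the component."""
    z = rng.choice(len(lam), size=n, p=lam)   # latent component labels
    x = rng.normal(loc=mu[z], scale=1.0)      # shape (n, r)
    return x, z

x, z = rmix(5000)
# Within a component the coordinates are independent, yet marginally
# they are correlated: the dependence is induced by the mixture.
print(np.corrcoef(x[:, 0], x[:, 1])[0, 1])
```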

  8.–9. Nonparametric mixture model
     In the parametric case, f_j(·) ≡ f(·; φ_j) ∈ F, a parametric family indexed by a parameter φ ∈ R^d.
     The parameter of the mixture model is θ = (λ, φ) = (λ_1, . . . , λ_m, φ_1, . . . , φ_m).
     Usual example: the univariate Gaussian mixture model, f(x; φ_j) = f(x; (μ_j, σ_j²)) = the pdf of N(μ_j, σ_j²).
     Motivation here: do not assume any parametric form for the f_jk's (e.g., avoid assumptions on tails . . . ).
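As a quick numerical companion (not from the slides), the univariate Gaussian mixture example can be written out directly; the particular values of θ = (λ, φ) below are arbitrary.

```python
import numpy as np

def dnorm(x, mu, sigma):
    """pdf of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def gmix(x, lam, mu, sigma):
    """Gaussian mixture density g(x) = sum_j lam_j * N(x; mu_j, sigma_j^2)."""
    x = np.atleast_1d(x).astype(float)[:, None]   # column of evaluation points
    return (lam * dnorm(x, mu, sigma)).sum(axis=1)

# Arbitrary theta = (lambda, phi), with phi_j = (mu_j, sigma_j^2)
lam = np.array([0.3, 0.7])
mu = np.array([-1.0, 2.0])
sigma = np.array([1.0, 0.5])

# Sanity check: g is a density, so it integrates to 1 (Riemann sum on a grid)
grid = np.linspace(-12.0, 12.0, 24001)
total = gmix(grid, lam, mu, sigma).sum() * (grid[1] - grid[0])
print(total)
```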

  10.–11. Notational convention
     We have: n = # of individuals in the sample, m = # of Mixture components, r = # of Repeated measurements (coordinates).
     Throughout, we use the subscripts 1 ≤ i ≤ n, 1 ≤ j ≤ m, 1 ≤ k ≤ r.
     The log-likelihood given data x_1, . . . , x_n is
         L(θ) = Σ_{i=1}^n log( Σ_{j=1}^m λ_j Π_{k=1}^r f_{jk}(x_ik) )
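The log-likelihood above translates directly into code. The sketch below is not from the slides; it assumes Gaussian f_jk with hypothetical parameter arrays `mu[j, k]` and `sigma[j, k]` only to have something concrete to evaluate.

```python
import numpy as np

def dnorm(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def loglik(x, lam, mu, sigma):
    """Observed-data log-likelihood
    L(theta) = sum_i log( sum_j lam_j * prod_k f_jk(x_ik) ),
    with f_jk = N(mu[j, k], sigma[j, k]^2) for this illustration."""
    # comp[i, j] = prod_k f_jk(x_ik), via broadcasting over (n, m, r)
    comp = dnorm(x[:, None, :], mu[None, :, :], sigma[None, :, :]).prod(axis=2)
    return np.log(comp @ lam).sum()

rng = np.random.default_rng(2)
lam = np.array([0.5, 0.5])
mu = np.array([[0.0, 0.0], [4.0, 4.0]])   # m = 2 components, r = 2 coordinates
sigma = np.ones((2, 2))

# Simulate from the model, then compare the true parameters with a bad guess:
z = rng.choice(2, size=2000, p=lam)
x = rng.normal(mu[z], 1.0)
good = loglik(x, lam, mu, sigma)
bad = loglik(x, lam, mu + 10.0, sigma)
print(good > bad)
```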

  12. Motivating example: water-level data
     Example from Thomas, Lohaus and Brainerd (1993). The task: n = 405 subjects are shown r = 8 vessels, pointing at 1, 2, 4, 5, 7, 8, 10 and 11 o'clock (e.g., a vessel tilted to point at 1:00). They draw the water surface for each vessel. Measure: the (signed) angle formed by the surface with the horizontal.

  13. Outline: Next up. . . Mixture models and EM algorithms: review of EM algorithm-ology.

  14.–15. Review of standard EM for mixtures
     For MLE in finite mixtures, EM algorithms are standard. A "complete" observation (X, Z) consists of:
     the observed, "incomplete" data X;
     the "missing" vector Z, defined for 1 ≤ j ≤ m by Z_j = 1 if X comes from component j, and Z_j = 0 otherwise.
     What does this mean? In simulations, we generate Z first, then X | Z_j = 1 ~ f_j. In real data, Z is a latent variable whose interpretation depends on context.
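The "generate Z first, then X | Z" reading of the missing-data structure can be sketched as follows; the mixing proportions and Gaussian component densities below are arbitrary choices for the sketch, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = np.array([0.25, 0.75])     # illustrative mixing proportions

n = 10000
# One-hot "missing" vectors: Z[i, j] = 1 iff observation i comes from component j
z = rng.choice(2, size=n, p=lam)
Z = np.eye(2, dtype=int)[z]

# Then X | Z_j = 1 ~ f_j (Gaussians chosen only for the sketch)
x = np.where(z == 0, rng.normal(0.0, 1.0, n), rng.normal(5.0, 1.0, n))

# Each Z_i selects exactly one component, and the empirical component
# frequencies estimate lambda
print(Z.sum(axis=0) / n)
```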

  16.–17. Parametric (univariate) EM algorithm for mixtures
     Let θ^t be an "arbitrary" value of θ.
     E-step: amounts to finding the conditional expectation of each Z:
         Z_ij^t := P_{θ^t}[ Z_ij = 1 | x_i ] = λ_j^t f(x_i; φ_j^t) / Σ_{j'} λ_{j'}^t f(x_i; φ_{j'}^t)
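The E-step formula is a normalized weighting of component densities. A sketch for the univariate Gaussian case follows; the current parameter values θ^t are arbitrary and chosen only for illustration.

```python
import numpy as np

def dnorm(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def e_step(x, lam, mu, sigma):
    """Posterior component probabilities
    Z_ij^t = lam_j f(x_i; phi_j) / sum_{j'} lam_{j'} f(x_i; phi_{j'})."""
    w = lam * dnorm(x[:, None], mu, sigma)   # unnormalized weights, shape (n, m)
    return w / w.sum(axis=1, keepdims=True)

# Current parameter value theta^t (arbitrary for the sketch)
lam = np.array([0.4, 0.6])
mu = np.array([0.0, 4.0])
sigma = np.array([1.0, 1.0])

x = np.array([-0.2, 0.1, 3.9, 4.2])
post = e_step(x, lam, mu, sigma)
print(post.round(3))   # each row sums to 1 by construction
```

Points near 0 get posterior weight concentrated on the first component, points near 4 on the second.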
