  1. Workshop 11.2a: Generalized Linear Mixed Effects Models (GLMM) Murray Logan 07 Feb 2017

  2. Section 1 Generalized Linear Mixed Effects Models

  3. Parameter Estimation lm → LMM (integrate the likelihood across all unobserved levels of the random effects)

  4. Parameter Estimation lm → LMM (integrate the likelihood across all unobserved levels of the random effects) glm → GLMM: not so easy - the likelihood must be approximated

  5. Parameter Estimation • Penalized quasi-likelihood • Laplace approximation • Gauss-Hermite quadrature

  6. Penalized quasi-likelihood (PQL) Iterative (re)weighting: • fit an LMM to estimate the vcov structure • estimate the fixed effects by fitting a GLM (incorporating the vcov) • refit the LMM to re-estimate the vcov • cycle until convergence
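The PQL cycle above is what MASS::glmmPQL implements. A minimal sketch, assuming a hypothetical data frame `dat` with a count response `y`, a fixed `Treat` factor, and a random `Site` factor (all names are assumptions, not from the workshop data):

```r
library(MASS)  # provides glmmPQL, which iterates LMM/GLM fits via nlme

## Hypothetical data: dat with columns y (counts), Treat (fixed), Site (random)
fit.pql <- glmmPQL(y ~ Treat, random = ~ 1 | Site,
                   family = poisson, data = dat)
summary(fit.pql)  # Wald tests only - PQL yields no true likelihood (no AIC/LRT)
```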

  7. Penalized quasi-likelihood (PQL) Advantages: • relatively simple • leverages variance-covariance structures for heterogeneity and dependency structures Disadvantages: • biased when expected values are less than 5 • approximates the (quasi-)likelihood (no AIC or LRT)

  8. Laplace approximation Second-order Taylor series expansion - used to approximate the likelihood at the unobserved levels of the random effects

  9. Laplace approximation Second-order Taylor series expansion - used to approximate the likelihood at the unobserved levels of the random effects Advantages: • more accurate

  10. Laplace approximation Second-order Taylor series expansion - used to approximate the likelihood at the unobserved levels of the random effects Advantages: • more accurate Disadvantages: • slower • no way to incorporate vcov structures
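In R, lme4::glmer uses the Laplace approximation by default. A sketch with the same hypothetical `dat` (column names assumed):

```r
library(lme4)

## nAGQ = 1 (the default) requests the Laplace approximation
fit.lap <- glmer(y ~ Treat + (1 | Site), family = poisson,
                 data = dat, nAGQ = 1)
summary(fit.lap)
```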

  11. Gauss-Hermite quadrature (GHQ) • approximates value of integrals at specific points (quadratures) • points (and weights) selected by optimizer

  12. Gauss-Hermite quadrature (GHQ) • approximates the value of integrals at specific points (quadratures) • points (and weights) selected by an optimizer Advantages: • even more accurate

  13. Gauss-Hermite quadrature (GHQ) • approximates the value of integrals at specific points (quadratures) • points (and weights) selected by an optimizer Advantages: • even more accurate Disadvantages: • even slower • no way to incorporate vcov structures
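glmer can switch to adaptive Gauss-Hermite quadrature by setting nAGQ above 1; more quadrature points mean greater accuracy at greater cost, and only a single scalar random effect is supported. A sketch (hypothetical `dat` as before):

```r
library(lme4)

## 25 quadrature points; valid only for models with one scalar random effect
fit.ghq <- glmer(y ~ Treat + (1 | Site), family = poisson,
                 data = dat, nAGQ = 25)
summary(fit.ghq)
```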

  14. Markov Chain Monte Carlo (MCMC) • recreate the likelihood by sampling in proportion to the likelihood

  15. Markov Chain Monte Carlo (MCMC) • recreate the likelihood by sampling in proportion to the likelihood Advantages: • very accurate (not an approximation) • very robust

  16. Markov Chain Monte Carlo (MCMC) • recreate the likelihood by sampling in proportion to the likelihood Advantages: • very accurate (not an approximation) • very robust Disadvantages: • very slow • currently complex
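One MCMC option in R is the MCMCglmm package. A hedged sketch, again with the hypothetical `dat` (the iteration settings shown are MCMCglmm's defaults, written out for clarity):

```r
library(MCMCglmm)

## Bayesian Poisson GLMM sampled via MCMC; inference comes from the
## posterior (credibility intervals), not an approximated likelihood
fit.mcmc <- MCMCglmm(y ~ Treat, random = ~ Site,
                     family = "poisson", data = dat,
                     nitt = 13000, burnin = 3000, thin = 10)
summary(fit.mcmc)
```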

  17. Inference (hypothesis) testing GLMM Depends on: • estimation engine (PQL, Laplace, GHQ) • overdispersion • fixed or random factors

  18. Inference (hypothesis) testing

| Approximation | Characteristics | Associated inference | R function |
|---|---|---|---|
| Penalized quasi-likelihood (PQL) | Fast and simple; accommodates heterogeneity and dependency structures; biased for small samples | Wald tests only | glmmPQL (MASS) |
| Laplace | More accurate (less biased); slower; does not accommodate heterogeneity and dependency structures | LRT | glmer (lme4), glmmadmb (glmmADMB) |
| Gauss-Hermite quadrature | Even more accurate (less biased); slower; does not accommodate heterogeneity and dependency structures; cannot handle more than 1 random effect | LRT | glmer (lme4)?? - does not seem to work |
| Markov Chain Monte Carlo (MCMC) | Bayesian; very flexible and accurate, yet very slow and more complex | Bayesian credibility intervals, Bayes factors | Numerous (see Tutorial 9.2b) |
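For the Laplace/GHQ rows, the LRT compares nested glmer fits; in lme4, anova() on two merMod objects reports the χ² statistic. A sketch with the hypothetical `dat`:

```r
library(lme4)

fit.full <- glmer(y ~ Treat + (1 | Site), family = poisson, data = dat)
fit.null <- update(fit.full, . ~ . - Treat)
anova(fit.null, fit.full)  # likelihood-ratio test of the Treat fixed effect
```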

  19. Inference (hypothesis) testing

| Feature | glmmPQL (MASS) | glmer (lme4) | glmmadmb (glmmADMB) | MCMC |
|---|---|---|---|---|
| Variance and covariance structures | Yes | - | not yet | Yes |
| Overdispersed (quasi) families | Yes | limited | some | - |
| Mixture families | limited | limited | limited | Yes |
| Zero-inflation | - | - | Yes | Yes |
| Residual degrees of freedom | Between-within | - | - | NA |
| Parameter tests | Wald t | Wald Z | Wald Z | UI |
| Marginal tests (fixed effects) | Wald F, χ² | Wald F, χ² | Wald F, χ² | UI |
| Marginal tests (random effects) | - | LRT | LRT | UI |
| Information criterion | - | AIC | AIC | AIC, WAIC |

  20. Inference (hypothesis) testing - model-choice flowchart (reconstructed from the slide's decision tree):
• Normally distributed data? Yes → no random effects: lm() or gls(); random effects: lme()
• Data normalizable (via transformations)? Yes → as above, after transforming
• Otherwise, PQL route: not overdispersed → glmmPQL(); overdispersed → glmmPQL(.., family='quasi..'); clumpiness → glmmPQL(.., family='negative.binomial')
• Laplace or GHQ route: not overdispersed → glmer() or glmmadmb(); overdispersed → glmer(..(1|Obs)); clumpiness → glmer(.., family='negative.binomial') or glmmadmb(.., family='nbinom'); zero-inflation → glmmadmb(.., zeroInflation=TRUE)
• Inference: Wald Z or χ² when expected values > 5, otherwise Wald t or F
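The glmer(..(1|Obs)) branch refers to an observation-level random effect, a common device for absorbing overdispersion in a Poisson GLMM. A sketch with the hypothetical `dat`:

```r
library(lme4)

## One random-effect level per observation soaks up extra-Poisson variation
dat$Obs <- factor(seq_len(nrow(dat)))
fit.olre <- glmer(y ~ Treat + (1 | Site) + (1 | Obs),
                  family = poisson, data = dat)
```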

  21. Additional assumptions • dispersion • (multi)collinearity • design balance and Type III (marginal) SS • heteroscedasticity • spatial/temporal autocorrelation
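A quick, approximate dispersion check for a fitted GLMM: the Pearson χ² divided by the residual degrees of freedom should be near 1. A sketch for a glmer fit (object name `fit` is hypothetical):

```r
## Approximate dispersion statistic for a fitted merMod object `fit`
pearson.resid <- residuals(fit, type = "pearson")
dispersion <- sum(pearson.resid^2) / df.residual(fit)
dispersion  # values substantially > 1 suggest overdispersion
```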

  22. Section 2 Worked Examples

  23. Worked Examples
$$\log(y_{ij}) = \gamma_{\text{Site}_i} + \beta_0 + \beta_1 \text{Treat}_i + \varepsilon_{ij}, \qquad \varepsilon \sim \text{Pois}(\lambda), \qquad \text{where } \textstyle\sum \gamma = 0$$
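The worked-example model (Poisson response, log link, fixed Treat effect, random Site effect) could be fitted by any of the routes above; for instance via the Laplace route with glmer, assuming hypothetical column names matching the equation:

```r
library(lme4)

## Poisson GLMM: Treat is the fixed effect, Site the random effect
## (the fitted Site effects, like the gamma terms, are centred near 0)
fit <- glmer(y ~ Treat + (1 | Site), family = poisson, data = dat)
summary(fit)
```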
