Presentation 7.3a: Multiple linear regression. Murray Logan, 19 Jul 2017. (PowerPoint PPT presentation)

  1. Presentation 7.3a: Multiple linear regression Murray Logan 19 Jul 2017

  2. Section 1 Theory

  3. Multiple Linear Regression (Additive model). growth = intercept + temperature + nitrogen; y_i = β_0 + β_1 x_i1 + β_2 x_i2 + ... + β_j x_ij + ε_i, or equivalently y_i = β_0 + Σ_{j=1}^{N} β_j x_ij + ε_i

  4. Multiple Linear Regression (Additive model). growth = intercept + temperature + nitrogen; y_i = β_0 + β_1 x_i1 + β_2 x_i2 + ... + β_j x_ij + ε_i. Each coefficient is the effect of one predictor holding the other(s) constant.

  5. Multiple Linear Regression (Additive model). growth = intercept + temperature + nitrogen; y_i = β_0 + β_1 x_i1 + β_2 x_i2 + ... + β_j x_ij + ε_i, with example data:

         Y     X1    X2
       3.0   22.7   0.9
       2.5   23.7   0.5
       6.0   25.7   0.6
       5.5   29.1   0.7
       9.0   22.0   0.8
       8.6   29.0   1.3
      12.0   29.4   1.0

  6. Multiple Linear Regression (Additive model). Substituting each observation:

       3.0 = β_0 + (β_1 × 22.7) + (β_2 × 0.9) + ε_1
       2.5 = β_0 + (β_1 × 23.7) + (β_2 × 0.5) + ε_2
       6.0 = β_0 + (β_1 × 25.7) + (β_2 × 0.6) + ε_3
       5.5 = β_0 + (β_1 × 29.1) + (β_2 × 0.7) + ε_4
       9.0 = β_0 + (β_1 × 22.0) + (β_2 × 0.8) + ε_5
       8.6 = β_0 + (β_1 × 29.0) + (β_2 × 1.3) + ε_6
      12.0 = β_0 + (β_1 × 29.4) + (β_2 × 1.0) + ε_7
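These seven simultaneous equations are what least squares solves. As a minimal sketch (variable names x1, x2 assumed), the estimates can be obtained directly from the normal equations and checked against lm():

```r
# Observations from the slide's table
y  <- c(3, 2.5, 6, 5.5, 9, 8.6, 12)
x1 <- c(22.7, 23.7, 25.7, 29.1, 22, 29, 29.4)
x2 <- c(0.9, 0.5, 0.6, 0.7, 0.8, 1.3, 1)

# Design matrix with an intercept column; solve b = (X'X)^(-1) X'y
X <- cbind(1, x1, x2)
b <- solve(t(X) %*% X, t(X) %*% y)

# lm() returns the same estimates
coef(lm(y ~ x1 + x2))
```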

  7. Multiple Linear Regression (Multiplicative model). growth = intercept + temp + nitro + temp × nitro; y_i = β_0 + β_1 x_i1 + β_2 x_i2 + β_3 x_i1 x_i2 + ... + ε_i

  8. Multiple Linear Regression (Multiplicative model). Substituting each observation:

       3.0 = β_0 + (β_1 × 22.7) + (β_2 × 0.9) + (β_3 × 22.7 × 0.9) + ε_1
       2.5 = β_0 + (β_1 × 23.7) + (β_2 × 0.5) + (β_3 × 23.7 × 0.5) + ε_2
       6.0 = β_0 + (β_1 × 25.7) + (β_2 × 0.6) + (β_3 × 25.7 × 0.6) + ε_3
       5.5 = β_0 + (β_1 × 29.1) + (β_2 × 0.7) + (β_3 × 29.1 × 0.7) + ε_4
       9.0 = β_0 + (β_1 × 22.0) + (β_2 × 0.8) + (β_3 × 22.0 × 0.8) + ε_5
       8.6 = β_0 + (β_1 × 29.0) + (β_2 × 1.3) + (β_3 × 29.0 × 1.3) + ε_6
      12.0 = β_0 + (β_1 × 29.4) + (β_2 × 1.0) + (β_3 × 29.4 × 1.0) + ε_7

  9. Section 2 Centering data

  10. Multiple Linear Regression (Centering). [Figure: scatterplot of y against raw x, x ranging roughly 0 to 60]

  11. Multiple Linear Regression (Centering). [Figure: scatterplot zoomed to the observed x range, 47 to 54]

  12. Multiple Linear Regression (Centering). [Figure: same scatterplot over x from 47 to 54]

  13. Multiple Linear Regression (Centering). [Figure: same scatterplot with a second, centred axis running −3 to 4]

  14. Multiple Linear Regression (Centering). [Figure: scatterplot of y against the centred predictor cx1, running −4 to 4]
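Centring in R is just subtracting the mean; a minimal sketch (predictor values assumed from the earlier slide's table):

```r
x1  <- c(22.7, 23.7, 25.7, 29.1, 22, 29, 29.4)
cx1 <- x1 - mean(x1)   # or: scale(x1, center = TRUE, scale = FALSE)
mean(cx1)              # effectively zero
```

After centring, the model's intercept is the expected y at the mean of x rather than at x = 0, which is usually far outside the observed range.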

  15. Section 3 Assumptions

  16. Multiple Linear Regression (Assumptions). Normality, homogeneity of variance, linearity.

  17. Multiple Linear Regression (Assumptions). (Multi)collinearity.

  18. Multiple Linear Regression (Variance inflation). R² measures the strength of a relationship; a relationship is strong when R² ≥ 0.8.

  19. Multiple Linear Regression (Variance inflation).

      var.inf = 1 / (1 − R²)

Predictors are considered collinear when var.inf ≥ 5 (some prefer > 3).
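The formula above can be applied by hand: regress one predictor on the other(s) and plug that regression's R² in. A sketch assuming a data frame `data` containing predictors x1 and x2:

```r
# R^2 from regressing x1 on the remaining predictor(s)
r2 <- summary(lm(x1 ~ x2, data))$r.squared

# Variance inflation factor for x1; should match car::vif()
vif.x1 <- 1 / (1 - r2)
```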

  20. Multiple Linear Regression (Assumptions: (multi)collinearity)

      library(car)
      vif(lm(y ~ cx1 + cx2, data))  # additive model - scaled predictors
           cx1      cx2
      1.743817 1.743817

  21. Multiple Linear Regression (Assumptions: (multi)collinearity)

      library(car)
      vif(lm(y ~ cx1 + cx2, data))  # additive model - scaled predictors
           cx1      cx2
      1.743817 1.743817

      vif(lm(y ~ x1 * x2, data))    # multiplicative model - raw predictors
             x1        x2     x1:x2
       5.913254 16.949468  7.259729

  22. Multiple Linear Regression (Assumptions: (multi)collinearity)

      vif(lm(y ~ x1 * x2, data))    # multiplicative model - raw predictors
             x1        x2     x1:x2
       5.913254 16.949468  7.259729

      vif(lm(y ~ cx1 * cx2, data))  # multiplicative model - scaled predictors
            cx1       cx2   cx1:cx2
       1.769411  1.771994  1.018694

  23. Section 4 Multiple linear models in R

  24. Model fitting. Additive model: y_i = β_0 + β_1 x_i1 + β_2 x_i2 + ε_i

      data.add.lm <- lm(y ~ cx1 + cx2, data)

  25. Model fitting. Additive model: y_i = β_0 + β_1 x_i1 + β_2 x_i2 + ε_i

      data.add.lm <- lm(y ~ cx1 + cx2, data)

      Multiplicative model: y_i = β_0 + β_1 x_i1 + β_2 x_i2 + β_3 x_i1 x_i2 + ε_i

      data.mult.lm <- lm(y ~ cx1 + cx2 + cx1:cx2, data)
      # OR
      data.mult.lm <- lm(y ~ cx1 * cx2, data)
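Once both models are fitted, the interaction term can be tested by comparing them; a sketch using the model objects created above:

```r
summary(data.mult.lm)             # coefficient table and R^2
anova(data.add.lm, data.mult.lm)  # F-test: does the interaction improve fit?
```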

  26. Model evaluation. Additive model:

      plot(data.add.lm)

[Figure: the four lm diagnostic plots: Residuals vs Fitted, Normal Q−Q, Scale−Location, and Residuals vs Leverage with Cook's distance contours; observations 30, 40 and 74 are flagged]
