
Principal Component Analysis (PCA) – Singular Value Decomposition



  1. Model Identification by Gradient Methods
     Dr. Julien Billeter, Laboratoire d'Automatique, École Polytechnique Fédérale de Lausanne (EPFL), MLS-S03 | 2013-2014

  2. Model Identification by Gradient Methods – Outline
     • Dynamic Models: Conservation of Mass (concentration measurements), Conservation of Energy (calorimetry), Beer's Law (spectroscopy)
     • Integration of Dynamic Models: Euler's method, Runge-Kutta methods (RK)
     • Linear Regression (OLS) Problems: calibration-free calorimetry and spectroscopy
     • Gradient-based Nonlinear Regression (NLR) Methods: Steepest Descent method (SD), Newton-Raphson and Newton-Gauss methods (NG), Newton-Gauss-Levenberg-Marquardt method (NGLM)
     • References

  3. Scalar, Vector and Matrix Notation
     • Scalars (1 × 1): a, A, ω, Ω – a number (dimension 1), written in lowercase/UPPERCASE italics
     • Vectors (n × 1): a, ω – an n-dimensional array (column vector), written in lowercase boldface
     • Matrices (n × m): A, Ω – an array of dimensions n (rows) by m (columns), written in UPPERCASE BOLDFACE

  4. Scalar, Vector and Matrix Operations
     • Scalar multiplication: αa, αA
     • Addition: a + b, A + B
     • Multiplication: ab, AB
     • Transposition: a^T, A^T
     • Inverse: A^-1 A = A A^-1 = I (identity matrix)
     • Rank and null space (kernel): rank(A), ker(A) = {x : Ax = 0}
     • Rank-nullity theorem: dim(A) = rank(A) + nullity(A)
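The sketch below is a quick numerical check of the rank, kernel and rank-nullity statements on a small rank-deficient matrix; the matrix itself is an arbitrary example (Python/numpy is used for all code sketches in this transcript):

```python
# A minimal numpy check of rank, kernel and the rank-nullity theorem
# dim(A) = rank(A) + nullity(A) on a small rank-deficient example matrix.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # row 2 = 2 * row 1  ->  rank deficiency
              [1.0, 0.0, 1.0]])

rank = np.linalg.matrix_rank(A)             # rank(A) = 2
_, s, Vt = np.linalg.svd(A)
nullity = int(np.sum(s < 1e-12 * s.max()))  # numerically zero singular values
kernel = Vt[rank:].T                        # orthonormal basis of ker(A)

print(rank, nullity, rank + nullity == A.shape[1])  # 2 1 True
print(np.allclose(A @ kernel, 0.0))                 # A x = 0 on the kernel -> True
```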

  5. Principal Component Analysis (PCA)
     • Singular Value Decomposition (SVD) is a method to decompose a matrix Y into a product of orthonormal column (U) and row (V^T) singular vectors weighted by the singular values (S):
       Y = U S V^T,  with S^2 = Λ
     • Principal Component Analysis (PCA) is a method to reduce the dimensionality of a matrix Y to its number of significant singular values:
       Y ≈ Ȳ = Ū S̄ V̄^T,  with Ȳ = Y − noise
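A minimal sketch of this rank reduction: decompose Y, keep the significant singular values and reconstruct the denoised matrix. The data matrix and the number of significant components n_sig are hypothetical placeholders:

```python
# SVD-based rank reduction (the PCA step above): decompose Y, keep n_sig
# singular values, and reconstruct Y_bar = U_bar S_bar V_bar^T.
import numpy as np

rng = np.random.default_rng(0)
nt, nw, n_sig = 50, 200, 3

# simulated rank-3 data plus noise (a stand-in for Y = C A + noise)
Y = rng.random((nt, n_sig)) @ rng.random((n_sig, nw)) \
    + 1e-3 * rng.standard_normal((nt, nw))

U, s, Vt = np.linalg.svd(Y, full_matrices=False)           # Y = U S V^T
Y_bar = U[:, :n_sig] @ np.diag(s[:n_sig]) @ Vt[:n_sig, :]  # truncated reconstruction

print(np.round(s[:6], 3))            # 3 large singular values, then noise level
print(np.linalg.norm(Y - Y_bar))     # small: only the noise part was removed
```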

  6. Law of Conservation of Mass
     • "Nothing is lost, nothing is created, everything is transformed" – Lavoisier (1743-1794): the reactions neither create nor destroy mass, ṁ_r(t) = 1_S^T M_w ṅ_r(t) = 0
     • Mole balance (S species, R reactions, p inlets, p_m mass transfers):
       ṅ(t) = N^T r_v(t) ± W_m ζ_m(t) + W_in u_in(t) − (u_out(t)/m(t)) n(t),  n(0) = n_0
     • Concentration balance:
       ċ(t) = N^T r(t) ± (W_m ζ_m(t) + W_in u_in(t))/V(t) − ω(t) c(t),  c(0) = c_0
       with ω(t) = V̇(t)/V(t) = ṁ(t)/m(t) − ρ̇(t)/ρ(t),  ρ(t) = m(t)/V(t),
       and ṁ(t) = 1_p^T u_in(t) − u_out(t) ± 1_{p_m}^T ζ_m(t)
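As a toy illustration of the mole balance, the sketch below evaluates ṅ = N^T r_v for a closed (batch) reactor, i.e. dropping the inlet, outlet and mass-transfer terms; the reaction A + B → C, its mass-action rate law and all numerical values are assumed, not taken from the slides:

```python
# Mole balance dn/dt = N^T r_v for a closed (batch) reactor.
import numpy as np

N = np.array([[-1.0, -1.0, 1.0]])   # stoichiometric matrix N (R=1 x S=3)
k, V = 0.05, 1.0                    # rate constant [L mol^-1 s^-1], volume [L]

def n_dot(n):
    """Right-hand side of the mole balance, dn/dt = N^T r_v(n)."""
    c = n / V                        # concentrations [mol/L]
    r = np.array([k * c[0] * c[1]])  # reaction rate r(c) [mol L^-1 s^-1]
    return N.T @ (V * r)             # N^T r_v, with r_v = V r [mol/s]

n0 = np.array([1.0, 0.8, 0.0])       # initial moles of A, B, C
print(n_dot(n0))                     # instantaneous change of the mole vector
```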

  7. Law of Conservation of Energy
     • "Any theory which demands the annihilation of energy is necessarily erroneous" – Joule (1818-1889)
     • Heat balance:
       q_acc(t) = m(t) c_p(t) Ṫ(t) = q_r(t) ± q_m(t) + q_ex(t) + q_in(t) − q_out(t) − q_loss(t),  T(0) = T_0
       with q_r(t) = V(t) (−Δh_r)^T r(t),  q_m(t) = (−Δh_m)^T ζ_m(t),  q_ex(t) = UA (T_j(t) − T(t)),
            q_in(t) = c_{p,in}^T u_in(t) (T_in − T(t)),  q_out(t) = c_p(t) u_out(t) T(t)
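A toy evaluation of the reaction and exchange terms of this heat balance for a jacketed, closed reactor (the mass-transfer, inlet, outlet and loss terms are dropped); every numerical value (Δh_r, UA, temperatures, rate) is an assumed placeholder:

```python
# Reduced heat balance m c_p dT/dt = q_r + q_ex for a jacketed closed reactor.
import numpy as np

V, m, cp = 1.0, 1.0, 4.2e3    # volume [L], mass [kg], heat capacity [J kg^-1 K^-1]
dh_r = np.array([-8.0e4])     # reaction enthalpies Delta h_r [J/mol] (R = 1)
UA = 50.0                     # heat-transfer coefficient x area [W/K]
T, T_j = 298.15, 293.15       # reactor and jacket temperatures [K]
r = np.array([2.0e-3])        # reaction rates [mol L^-1 s^-1]

q_r = V * (-dh_r) @ r         # heat released by the reactions [W]
q_ex = UA * (T_j - T)         # heat exchanged with the jacket [W]
T_dot = (q_r + q_ex) / (m * cp)   # resulting dT/dt [K/s]

print(q_r, q_ex, T_dot)
```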

  8. Beer's Law
     • "The absorbance of a solution is proportional to the product of its concentration and the distance light travels through it" – Beer (1825-1863), Lambert (1728-1777) and Bouguer (1698-1758)
       Y = C A,  with Y (nt × nw), C = [c(t_1), ..., c(t_nt)]^T (nt × S) and A = [a(w_1), ..., a(w_nw)] (S × nw)
     • Units conversion: A = −log_10(T), with T = I/I_0
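A small sketch of Beer's law as a matrix product and of the absorbance/transmittance conversion; the concentration profiles and molar absorptivities are random placeholders:

```python
# Beer's law Y = C A and the units conversion A = -log10(T), T = I/I0.
import numpy as np

rng = np.random.default_rng(1)
nt, S, nw = 5, 2, 4
C = rng.random((nt, S))       # concentrations (nt x S)
A = rng.random((S, nw))       # molar absorptivities (S x nw)

Y = C @ A                     # absorbance data (nt x nw)

I0 = 1.0                      # incident intensity
I = I0 * 10.0 ** (-Y)         # transmitted intensity
T_trans = I / I0              # transmittance
print(np.allclose(Y, -np.log10(T_trans)))   # True: back to absorbance
```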

  9. Numerical Integration of ODEs
     • Euler's method (implicit, explicit) was invented by the Swiss mathematician Euler (1707-1783):
       y_{i+1} = y_i + h ẏ_i + O(h^2),  h: integration stepsize
     • Runge-Kutta methods (RK2, RK4, explicit, implicit) were elaborated by Runge (1856-1927) and Kutta (1867-1944):
       RK2: y_{i+1} = y_i + h ẏ_{i+1/2} + O(h^3),
            with y_{i+1/2} = y_i + (h/2) ẏ(t_i, y_i) and ẏ_{i+1/2} = ẏ(t_{i+1/2}, y_{i+1/2})
       RK4: y_{i+1} = y_i + (h/6)(k_1 + 2 k_2 + 2 k_3 + k_4) + O(h^5),
            with k_1 = ẏ(t_i, y_i),  k_2 = ẏ(t_i + h/2, y_i + (h/2) k_1),
                 k_3 = ẏ(t_i + h/2, y_i + (h/2) k_2),  k_4 = ẏ(t_i + h, y_i + h k_3)
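A minimal sketch of the explicit Euler and classical RK4 update rules above, applied to the arbitrary test problem dy/dt = −2y so the result can be checked against the exact solution exp(−2t):

```python
# Explicit Euler and classical RK4 steps, tested on dy/dt = -2 y.
import numpy as np

def euler_step(f, t, y, h):
    """Explicit Euler: y_{i+1} = y_i + h * f(t_i, y_i)."""
    return y + h * f(t, y)

def rk4_step(f, t, y, h):
    """Classical RK4: y_{i+1} = y_i + h/6 * (k1 + 2 k2 + 2 k3 + k4)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

f = lambda t, y: -2.0 * y
h, y_e, y_rk = 0.1, 1.0, 1.0
for i in range(10):                      # integrate from t = 0 to t = 1
    y_e = euler_step(f, i * h, y_e, h)
    y_rk = rk4_step(f, i * h, y_rk, h)

print(y_e, y_rk, np.exp(-2.0))           # RK4 is far closer to the exact value
```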

  10. Regression Problems
      • A regression problem consists in minimizing an objective (cost) function φ that measures the difference between the measured output variables y(t) and the modeled output variables ŷ(t, p_f, p_g), by postulating a dynamic model f(t, p_f) and an output model g(c(t, p_f), p_g) and adjusting the parameters p_f (and p_g):
        {p_f, p_g}* = arg min_{p_f, p_g} φ( y(t), ŷ(t, p_f, p_g) )
        s.t.  dĉ(t, p_f)/dt = f(t, p_f)
              ŷ(t, p_f, p_g) = g( ĉ(t, p_f), p_g )
      • In least-squares problems, φ is defined as the sum of squared residuals ssq = vec(R)^T vec(R) (with R = Y − Ŷ), and the following matrices are defined:
        Y = [y(t_1), ..., y(t_nt)]^T,  Ŷ = [ŷ(t_1, p_f, p_g), ..., ŷ(t_nt, p_f, p_g)]^T,  C = [c(t_1, p_f), ..., c(t_nt, p_f)]^T
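A minimal sketch of such a least-squares objective for a hypothetical first-order model c(t) = c_0 exp(−k t) that is observed directly (output model g = identity); the data, noise level and parameter values are invented:

```python
# Least-squares objective ssq = vec(R)^T vec(R) with R = Y - Yhat.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 20)
k_true, c0 = 0.3, 1.0
Y = c0 * np.exp(-k_true * t) + 0.01 * rng.standard_normal(t.size)  # "measured" data

def ssq(k):
    """Sum of squared residuals as a function of the rate parameter k."""
    Y_hat = c0 * np.exp(-k * t)          # modeled outputs
    R = Y - Y_hat                        # residuals
    return R @ R                         # vec(R)^T vec(R)

print(ssq(0.2), ssq(0.3), ssq(0.4))      # smallest near k_true = 0.3
```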

  11. Systems of Linear Equations
      • A system of linear equations can be written in matrix form:
        S: a_{1,1} x_1 + ... + a_{1,n} x_n = y_1
           ...
           a_{m,1} x_1 + ... + a_{m,n} x_n = y_m
        i.e. A x = y,  with A (m × n), the regressors x (n × 1) and the regressands y (m × 1)
      • The number of solutions of S is:
        – m < n: underdetermined system (∞ solutions)
        – m = n: determined system (1 solution, x = A^-1 y)
        – m > n: overdetermined system (in general, no exact solution)
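A short sketch contrasting the determined case (m = n, solved exactly) with the overdetermined case (m > n, solved in the least-squares sense); the matrices are arbitrary examples:

```python
# Determined (m = n) versus overdetermined (m > n) linear systems A x = y.
import numpy as np

# m = n = 2: unique solution x = A^-1 y
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([3.0, 5.0])
print(np.linalg.solve(A, y))

# m = 4 > n = 2: no exact solution in general -> least-squares solution
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([1.1, 1.9, 3.2, 3.9])
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
print(x_ls)
```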

  12. Linear Regression (LR, OLS)
      • For univariate data (data organized in a vector y), a linear model relating the n independent variables (regressors, x) to the m > n dependent variables (regressands, y) can be constructed as:
        y = A x,  with A (m × n), x (n × 1) and y (m × 1)
        x* = arg min_x { vec(Ax − y)^T vec(Ax − y) } = A^+ y,  with A^+ = (A^T A)^-1 A^T
        The left pseudo-inverse A^+ exists only if rank(A) = dim(A) = n
      • For multivariate data (data organized in a matrix Y), the linear model relating the n·w regressors X to the m·w regressands Y is built as:
        Y = A X,  with X (n × w) and Y (m × w)
        X* = arg min_X { vec(AX − Y)^T vec(AX − Y) } = A^+ Y
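A minimal sketch of the multivariate OLS solution X* = A^+ Y computed through the left pseudo-inverse and checked against numpy's pinv; A, X and the noise level are arbitrary full-column-rank examples:

```python
# Multivariate OLS through the left pseudo-inverse A^+ = (A^T A)^-1 A^T.
import numpy as np

rng = np.random.default_rng(3)
m, n, w = 30, 3, 5
A = rng.standard_normal((m, n))            # regressor matrix, rank(A) = n
X_true = rng.standard_normal((n, w))
Y = A @ X_true + 0.01 * rng.standard_normal((m, w))

A_plus = np.linalg.inv(A.T @ A) @ A.T      # left pseudo-inverse
X_star = A_plus @ Y                        # OLS estimate X* = A^+ Y

print(np.allclose(A_plus, np.linalg.pinv(A)))   # True for full column rank
print(np.linalg.norm(X_star - X_true))          # small estimation error
```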

  13. Left or Right Pseudo-Inverse?
      • Left pseudo-inverse: A^+ = (A^T A)^-1 A^T, requires rank(A) = dim(A)
        Y = A X  →  X* = arg min_X { vec(AX − Y)^T vec(AX − Y) } = A^+ Y
        Spectroscopy: y = C a  →  a* = C^+ y  and  Y = C A  →  A* = C^+ Y, requires rank(C) = S
        Calorimetry:  q_r = R_v (−Δh_r)  →  (−Δh_r)* = R_v^+ q_r, requires rank(R_v) = R
      • Right pseudo-inverse: X^+ = X^T (X X^T)^-1, requires rank(X) = dim(X)
        Y = A X  →  A* = arg min_A { vec(AX − Y)^T vec(AX − Y) } = Y X^+
        Spectroscopy: Y = C A  →  C* = Y A^+, requires rank(A) = S
      • with Y (nt × nw), C (nt × S), A (S × nw), q_r (nt × 1), R_v (nt × R), Δh_r (R × 1)
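A small sketch of both pseudo-inverses on noise-free Beer's-law data Y = C A: with C known, the left pseudo-inverse recovers A* = C^+ Y; with A known, the right pseudo-inverse recovers C* = Y A^+. The matrices are random full-rank placeholders:

```python
# Left versus right pseudo-inverse on Beer's-law data Y = C A.
import numpy as np

rng = np.random.default_rng(4)
nt, S, nw = 40, 3, 60
C = rng.random((nt, S))                    # concentration profiles (nt x S)
A = rng.random((S, nw))                    # pure-component spectra (S x nw)
Y = C @ A                                  # absorbance data (nt x nw)

C_plus = np.linalg.inv(C.T @ C) @ C.T      # left pseudo-inverse of C
A_star = C_plus @ Y                        # recovers the spectra

A_plus = A.T @ np.linalg.inv(A @ A.T)      # right pseudo-inverse of A
C_star = Y @ A_plus                        # recovers the concentrations

print(np.allclose(A_star, A), np.allclose(C_star, C))   # True True
```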

  14. Explicit vs Implicit Calibration
      • In explicit calibration, a static calibration set is used to construct a calibration model from which concentrations are predicted for dynamic experiments:
        Ŷ = Ĉ A  →  Â = Ĉ^+ Ŷ  →  C = Y Â^+
      • In implicit calibration (i.e. calibration-free), the dynamic experiments themselves are used as an internal calibration set to eliminate the (static) linear counterpart A:
        Y = C A  →  Â = Ĉ^+ Y  →  Ŷ = Ĉ Ĉ^+ Y
      • Implicit calibration can even be used in the case of rank-deficient data, i.e. when rank(C) < S
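A minimal sketch of the implicit (calibration-free) idea: the unknown spectra A are eliminated through the pseudo-inverse of the modeled concentrations Ĉ, and the residual Y − Ĉ Ĉ^+ Y is what a kinetic fit would minimize. The first-order reaction A → B, its spectra and the noise level are all invented:

```python
# Implicit (calibration-free) fitting: eliminate A with the pseudo-inverse of
# the modeled concentrations C_hat and minimize R = Y - C_hat C_hat^+ Y.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 10.0, 50)
k_true = 0.4
A_spec = rng.random((2, 100))                       # pure spectra (S=2 x nw=100)

def C_model(k):
    """Modeled concentration profiles of A -> B for rate constant k (nt x S)."""
    cA = np.exp(-k * t)
    return np.column_stack([cA, 1.0 - cA])

Y = C_model(k_true) @ A_spec + 1e-3 * rng.standard_normal((t.size, 100))

def ssq(k):
    """Residual sum of squares after eliminating A:  R = Y - C C^+ Y."""
    C = C_model(k)
    R = Y - C @ (np.linalg.pinv(C) @ Y)
    return float(np.sum(R ** 2))

print(ssq(0.3), ssq(0.4), ssq(0.5))                 # smallest near k_true = 0.4
```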
