Solving Linear System of Equations: The "Undo" Button for Linear Operations


  1. Solving Linear System of Equations

  2. The "Undo" Button for Linear Operations
     Matrix-vector multiplication: given the data $x$ and the operator $A$, we can find $y$ such that $y = Ax$ (a transformation).
     What if we know $y$ but not $x$? How can we "undo" the transformation, i.e., apply $A^{-1}$?
     Solve $Ax = y$ for $x$.
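
     A minimal sketch of this "undo" step in NumPy (the operator and data below are made-up values, not from the slides): the forward transformation is a matrix-vector product, and the undo step solves a linear system instead of forming $A^{-1}$ explicitly.

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 3.0]])        # small invertible operator (hypothetical)
        x = np.array([1.0, -1.0])         # original data

        y = A @ x                         # forward transformation: y = A x
        x_rec = np.linalg.solve(A, y)     # "undo": solve A x = y for x

        print(np.allclose(x_rec, x))      # True: the data is recovered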

  3. Image Blurring Example
     • The image is stored as a 2D array of real numbers between 0 and 1 (0 represents a white pixel, 1 represents a black pixel).
     • The original image has 40 rows of pixels and 100 columns of pixels.
     • Flatten the 2D array into a 1D array, so that $x$ contains the 1D data with dimension 4000 (a sketch of this step follows below).
     • Apply the blurring operation to the data $x$, i.e. $y = Ax$, where $A$ is the blur operator and $y$ is the blurred image.
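
     A small illustration of the flattening step, assuming a hypothetical 40x100 grayscale image:

        import numpy as np

        img = np.random.rand(40, 100)          # hypothetical image, values in [0, 1]

        x = img.flatten()                      # 1D vector of dimension 4000
        print(x.shape)                         # (4000,)

        img_back = x.reshape(40, 100)          # the 2D layout is recovered by reshaping
        print(np.array_equal(img_back, img))   # True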

  4. Blur Operator
     The blur operator $A$ has shape (4000, 4000); it maps the "original" image $x$, of shape (4000,), to the blurred image $y$, of shape (4000,):
     $y = Ax$

  5. "Undo" Blur to Recover the Original Image
     Assumptions:
     1. We know the blur operator $A$.
     2. The data set $y$ does not have any noise ("clean data").
     Then solve $Ax = y$ for $x$.

  6. "Undo" Blur to Recover the Original Image
     Now perturb the blurred data with small random noise of decreasing magnitude (random entries in $[0, 1]$ scaled by a small factor) and again solve $Ax = y$ for $x$.
     How much noise can we add and still recover meaningful information from the original image? At which point does this inverse transformation fail? We will talk about the sensitivity of the "undo" operation later. (A small experiment in this spirit is sketched below.)
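
     A hedged end-to-end sketch of this experiment. The actual blur operator from the slides is not reproduced here; a simple invertible averaging matrix acting on a smaller flattened image is used as a stand-in, and the relative error of the recovered image is printed for increasing noise levels. A realistic blur operator is far more ill-conditioned than this stand-in, which is why even tiny noise can destroy the recovered image in practice.

        import numpy as np

        rng = np.random.default_rng(0)

        n = 8 * 20                          # small stand-in for the 4000-pixel image
        x = rng.random(n)                   # "original" flattened image (hypothetical)

        # Stand-in blur operator: weighted average of each entry with its
        # neighbors in the flattened vector (diagonally dominant, so invertible).
        A = (0.6 * np.eye(n)
             + 0.2 * np.eye(n, k=1)
             + 0.2 * np.eye(n, k=-1))

        y = A @ x                           # blurred image

        for scale in [0.0, 1e-8, 1e-2]:     # illustrative noise levels
            y_noisy = y + scale * rng.random(n)
            x_rec = np.linalg.solve(A, y_noisy)
            print(scale, np.linalg.norm(x_rec - x) / np.linalg.norm(x))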

  7. Linear System of Equations
     How do we actually solve $Ax = b$? We can start with an "easier" system of equations... Let's consider triangular matrices (lower and upper):

     $$\begin{bmatrix} L_{11} & 0 & \cdots & 0 \\ L_{21} & L_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ L_{n1} & L_{n2} & \cdots & L_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} \qquad \begin{bmatrix} U_{11} & U_{12} & \cdots & U_{1n} \\ 0 & U_{22} & \cdots & U_{2n} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & U_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix}$$

  8. Example: Forward-Substitution for Lower Triangular Systems

     $$\begin{bmatrix} 2 & 0 & 0 & 0 \\ 3 & 2 & 0 & 0 \\ 1 & 2 & 6 & 0 \\ 1 & 3 & 4 & 2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = \begin{bmatrix} 2 \\ 2 \\ 6 \\ 4 \end{bmatrix}$$

     $2x_1 = 2 \;\rightarrow\; x_1 = 1$
     $3x_1 + 2x_2 = 2 \;\rightarrow\; x_2 = (2 - 3)/2 = -0.5$
     $1x_1 + 2x_2 + 6x_3 = 6 \;\rightarrow\; x_3 = (6 - 1 + 1)/6 = 1.0$
     $1x_1 + 3x_2 + 4x_3 + 2x_4 = 4 \;\rightarrow\; x_4 = (4 - 1 + 1.5 - 4)/2 = 0.25$

     Hence $x = (1, -0.5, 1.0, 0.25)$.
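
     The same four steps written out in NumPy (a sketch that simply mirrors the hand computation above):

        import numpy as np

        L = np.array([[2.0, 0.0, 0.0, 0.0],
                      [3.0, 2.0, 0.0, 0.0],
                      [1.0, 2.0, 6.0, 0.0],
                      [1.0, 3.0, 4.0, 2.0]])
        b = np.array([2.0, 2.0, 6.0, 4.0])

        x = np.zeros(4)
        x[0] = b[0] / L[0, 0]
        x[1] = (b[1] - L[1, 0] * x[0]) / L[1, 1]
        x[2] = (b[2] - L[2, 0] * x[0] - L[2, 1] * x[1]) / L[2, 2]
        x[3] = (b[3] - L[3, 0] * x[0] - L[3, 1] * x[1] - L[3, 2] * x[2]) / L[3, 3]

        print(x)    # [ 1.   -0.5   1.    0.25]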

  9. Triangular Matrices
     Consider the upper triangular system $Ux = b$. Recall that we can also write it as a linear combination of the columns of $U$:
     $x_1 U[:,1] + x_2 U[:,2] + \cdots + x_n U[:,n] = b$
     Hence we can obtain the solution starting from the last entry:
     $U_{nn} x_n = b_n$
     $x_1 U[:,1] + \cdots + x_{n-1} U[:,n-1] = b - x_n U[:,n] \;\rightarrow\; U_{n-1,n-1}\, x_{n-1} = b_{n-1} - U_{n-1,n}\, x_n$
     $x_1 U[:,1] + \cdots + x_{n-2} U[:,n-2] = b - x_n U[:,n] - x_{n-1} U[:,n-1]$, and so on.
     Or in general (backward-substitution for upper triangular systems):
     $$x_n = b_n / U_{nn}, \qquad x_j = \frac{b_j - \sum_{k=j+1}^{n} U_{jk} x_k}{U_{jj}}, \quad j = n-1, n-2, \ldots, 1$$
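
     A sketch of backward-substitution as a reusable routine (the function name and test values are my own, not from the slides):

        import numpy as np

        def back_substitution(U, b):
            """Solve U x = b for an upper triangular, nonsingular U."""
            n = len(b)
            x = np.zeros(n)
            for j in range(n - 1, -1, -1):        # j = n-1, ..., 0 (0-based indexing)
                x[j] = (b[j] - U[j, j + 1:] @ x[j + 1:]) / U[j, j]
            return x

        U = np.array([[2.0, 8.0, 4.0],
                      [0.0, -2.0, 1.0],
                      [0.0, 0.0, 3.0]])
        b = np.array([1.0, 1.0, 3.0])
        print(back_substitution(U, b))            # same result as np.linalg.solve(U, b)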

  10. Triangular Matrices
     Forward-substitution for lower triangular systems $Lx = b$:

     $$\begin{bmatrix} L_{11} & 0 & \cdots & 0 \\ L_{21} & L_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ L_{n1} & L_{n2} & \cdots & L_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix}$$

     $$x_1 = b_1 / L_{11}, \qquad x_j = \frac{b_j - \sum_{k=1}^{j-1} L_{jk} x_k}{L_{jj}}, \quad j = 2, 3, \ldots, n$$
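
     The corresponding forward-substitution sketch (again, the function name is my own):

        import numpy as np

        def forward_substitution(L, b):
            """Solve L x = b for a lower triangular, nonsingular L."""
            n = len(b)
            x = np.zeros(n)
            for j in range(n):                    # j = 0, 1, ..., n-1 (0-based indexing)
                x[j] = (b[j] - L[j, :j] @ x[:j]) / L[j, j]
            return x

        L = np.array([[2.0, 0.0, 0.0, 0.0],
                      [3.0, 2.0, 0.0, 0.0],
                      [1.0, 2.0, 6.0, 0.0],
                      [1.0, 3.0, 4.0, 2.0]])
        b = np.array([2.0, 2.0, 6.0, 4.0])
        print(forward_substitution(L, b))         # [ 1.  -0.5  1.   0.25], as in slide 8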

  11. Cost of Solving Triangular Systems
     Backward-substitution, $x_n = b_n / U_{nn}$ and $x_j = \left(b_j - \sum_{k=j+1}^{n} U_{jk} x_k\right) / U_{jj}$ for $j = n-1, \ldots, 1$, requires:
     $n$ divisions, $n(n-1)/2$ subtractions/additions, and $n(n-1)/2$ multiplications, so its computational complexity is $O(n^2)$.
     Forward-substitution, $x_1 = b_1 / L_{11}$ and $x_j = \left(b_j - \sum_{k=1}^{j-1} L_{jk} x_k\right) / L_{jj}$ for $j = 2, \ldots, n$, requires:
     $n$ divisions, $n(n-1)/2$ subtractions/additions, and $n(n-1)/2$ multiplications, so its computational complexity is also $O(n^2)$.
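
     Adding the counts gives the $O(n^2)$ total explicitly (a one-line check, not spelled out on the slide):

     $$\underbrace{n}_{\text{divisions}} + \underbrace{\tfrac{n(n-1)}{2}}_{\text{multiplications}} + \underbrace{\tfrac{n(n-1)}{2}}_{\text{additions/subtractions}} = n^2$$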

  12. Linear System of Equations
     How do we solve $Ax = b$ when $A$ is a non-triangular matrix? We can perform LU factorization: given an $n \times n$ matrix $A$, obtain a lower triangular matrix $L$ and an upper triangular matrix $U$ such that $A = LU$, where we set the diagonal entries of $L$ equal to 1:

     $$\begin{bmatrix} 1 & 0 & \cdots & 0 \\ L_{21} & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ L_{n1} & L_{n2} & \cdots & 1 \end{bmatrix} \begin{bmatrix} U_{11} & U_{12} & \cdots & U_{1n} \\ 0 & U_{22} & \cdots & U_{2n} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & U_{nn} \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} & \cdots & A_{1n} \\ A_{21} & A_{22} & \cdots & A_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ A_{n1} & A_{n2} & \cdots & A_{nn} \end{bmatrix}$$

  13. LU Factorization
     Assuming the LU factorization $A = LU$ is known, we can solve the general system $LUx = b$ by solving two triangular systems:
     1. Solve $Ly = b$ for $y$: forward-substitution, with complexity $O(n^2)$.
     2. Solve $Ux = y$ for $x$: backward-substitution, with complexity $O(n^2)$.
     But what is the cost of the LU factorization itself? Is it beneficial?
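
     A sketch of this two-stage solve using SciPy's triangular solvers in place of the hand-written substitution routines above ($L$, $U$ and $b$ are small made-up values):

        import numpy as np
        from scipy.linalg import solve_triangular

        L = np.array([[1.0, 0.0],
                      [0.5, 1.0]])
        U = np.array([[2.0, 8.0],
                      [0.0, -2.0]])
        b = np.array([3.0, 1.0])

        y = solve_triangular(L, b, lower=True)    # forward-substitution:  L y = b
        x = solve_triangular(U, y, lower=False)   # backward-substitution: U x = y

        print(np.allclose(L @ U @ x, b))          # True: x solves (LU) x = b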

  14. 2x2 LU Factorization (simple example)

     $$\begin{bmatrix} 1 & 0 \\ L_{21} & 1 \end{bmatrix} \begin{bmatrix} U_{11} & U_{12} \\ 0 & U_{22} \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} \quad\Rightarrow\quad \begin{bmatrix} U_{11} & U_{12} \\ L_{21} U_{11} & L_{21} U_{12} + U_{22} \end{bmatrix} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$$

     1) $U_{11} = A_{11}$ and $U_{12} = A_{12}$
     2) $L_{21} = A_{21} / U_{11}$
     3) $U_{22} = A_{22} - L_{21} U_{12}$
     Seems quite simple! Can we generalize this for an $n \times n$ matrix $A$?
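
     The 2x2 recipe in code, on a made-up matrix (a direct transcription of the three steps above):

        import numpy as np

        A = np.array([[2.0, 8.0],
                      [1.0, 2.0]])

        U11, U12 = A[0, 0], A[0, 1]     # step 1: first row of U is the first row of A
        L21 = A[1, 0] / U11             # step 2
        U22 = A[1, 1] - L21 * U12       # step 3

        L = np.array([[1.0, 0.0], [L21, 1.0]])
        U = np.array([[U11, U12], [0.0, U22]])
        print(np.allclose(L @ U, A))    # True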

  15. LU Factorization
     Partition $A$ into blocks: $a_{11}$ is a scalar, $\boldsymbol{a}_{12}$ is a row vector of shape $1 \times (n-1)$, $\boldsymbol{a}_{21}$ is a column vector of shape $(n-1) \times 1$, and $A_{22}$ is an $(n-1) \times (n-1)$ matrix:

     $$\begin{bmatrix} a_{11} & \boldsymbol{a}_{12} \\ \boldsymbol{a}_{21} & A_{22} \end{bmatrix} = \begin{bmatrix} 1 & \boldsymbol{0} \\ \boldsymbol{l}_{21} & L_{22} \end{bmatrix} \begin{bmatrix} u_{11} & \boldsymbol{u}_{12} \\ \boldsymbol{0} & U_{22} \end{bmatrix} = \begin{bmatrix} u_{11} & \boldsymbol{u}_{12} \\ u_{11}\,\boldsymbol{l}_{21} & \boldsymbol{l}_{21}\boldsymbol{u}_{12} + L_{22} U_{22} \end{bmatrix}$$

     1) The first row of $U$ is the first row of $A$: $u_{11} = a_{11}$, $\boldsymbol{u}_{12} = \boldsymbol{a}_{12}$.
     2) The first column of $L$ is the first column of $A$ divided by $u_{11}$: $\boldsymbol{l}_{21} = \boldsymbol{a}_{21} / u_{11}$ (everything on the right-hand side is known!).
     3) $L_{22} U_{22} = A_{22} - \boldsymbol{l}_{21}\boldsymbol{u}_{12}$: we need another factorization!

  16. LU Factorization
     With the same block partition as on the previous slide:
     1) The first row of $U$ is the first row of $A$.
     2) The first column of $L$ is the first column of $A$ divided by $u_{11}$.
     3) $M = L_{22} U_{22} = A_{22} - \boldsymbol{l}_{21}\boldsymbol{u}_{12}$: the $(n-1) \times (n-1)$ block $M$ needs another factorization, so the same three steps are applied to $M$, and so on, until the remaining block is a scalar.
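
     A recursive sketch that follows steps 1) to 3) literally: each call fills one row of $U$ and one column of $L$, then recurses on the remaining block $M$ (illustrative only; the algorithm on slide 19 may be organized differently):

        import numpy as np

        def lu_recursive(A):
            """Return (L, U) with unit-diagonal L, following the block derivation."""
            A = np.asarray(A, dtype=float)
            n = A.shape[0]
            L = np.eye(n)
            U = np.zeros((n, n))
            U[0, :] = A[0, :]                    # step 1: first row of U
            L[1:, 0] = A[1:, 0] / U[0, 0]        # step 2: first column of L
            if n > 1:
                M = A[1:, 1:] - np.outer(L[1:, 0], U[0, 1:])   # step 3: M = A22 - l21 u12
                L[1:, 1:], U[1:, 1:] = lu_recursive(M)         # factor the smaller block
            return L, U

        A = np.array([[2.0, 1.0, 1.0],
                      [4.0, 3.0, 3.0],
                      [8.0, 7.0, 9.0]])          # made-up 3x3 test matrix
        L, U = lu_recursive(A)
        print(np.allclose(L @ U, A))             # True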

  17. Example
     1) The first row of $U$ is the first row of $A$.
     2) The first column of $L$ is the first column of $A$ divided by $u_{11}$.
     3) $M = L_{22} U_{22} = A_{22} - \boldsymbol{l}_{21}\boldsymbol{u}_{12}$
     Apply these steps to

     $$A = \begin{bmatrix} 2 & 8 & 4 & 1 \\ 1 & 2 & 3 & 3 \\ 1 & 2 & 6 & 2 \\ 1 & 3 & 4 & 2 \end{bmatrix}$$

  18. Example
     Step 1: the first row of $U$ is $(2, 8, 4, 1)$; the first column of $L$ is $(2, 1, 1, 1)/2 = (1, 0.5, 0.5, 0.5)$; the remaining block is
     $$M = A_{22} - \boldsymbol{l}_{21}\boldsymbol{u}_{12} = \begin{bmatrix} -2 & 1 & 2.5 \\ -2 & 4 & 1.5 \\ -1 & 2 & 1.5 \end{bmatrix}$$
     Step 2: factor $M$ the same way: the next row of $U$ is $(-2, 1, 2.5)$, the next column of $L$ is $(1, 1, 0.5)$, and the new remaining block is
     $$M = \begin{bmatrix} 3 & -1 \\ 1.5 & 0.25 \end{bmatrix}$$
     Step 3: the next row of $U$ is $(3, -1)$, the next column of $L$ is $(1, 0.5)$, and the remaining block is $M = [\,0.75\,]$, which becomes the last diagonal entry of $U$. Altogether:
     $$L = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0.5 & 1 & 0 & 0 \\ 0.5 & 1 & 1 & 0 \\ 0.5 & 0.5 & 0.5 & 1 \end{bmatrix}, \qquad U = \begin{bmatrix} 2 & 8 & 4 & 1 \\ 0 & -2 & 1 & 2.5 \\ 0 & 0 & 3 & -1 \\ 0 & 0 & 0 & 0.75 \end{bmatrix}$$

  19. Algorithm: LU Factorization of matrix A
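
     The algorithm on this slide was not captured in the transcript; below is a hedged sketch of a non-pivoting LU factorization consistent with the preceding derivation (loop-based rather than recursive), applied to the matrix from slides 17 and 18:

        import numpy as np

        def lu_factorization(A):
            """LU factorization without pivoting: A = L U with diag(L) = 1 (a sketch)."""
            A = np.asarray(A, dtype=float)
            n = A.shape[0]
            L = np.eye(n)
            U = np.zeros((n, n))
            M = A.copy()                              # working block (Schur complement)
            for k in range(n):
                U[k, k:] = M[k, k:]                   # 1) next row of U
                L[k+1:, k] = M[k+1:, k] / U[k, k]     # 2) next column of L
                M[k+1:, k+1:] -= np.outer(L[k+1:, k], U[k, k+1:])   # 3) update the block
            return L, U

        A = np.array([[2.0, 8.0, 4.0, 1.0],
                      [1.0, 2.0, 3.0, 3.0],
                      [1.0, 2.0, 6.0, 2.0],
                      [1.0, 3.0, 4.0, 2.0]])
        L, U = lu_factorization(A)
        print(L)                        # matches the L on slide 18
        print(U)                        # matches the U on slide 18
        print(np.allclose(L @ U, A))    # True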

  20. Side Note: Cost of LU Factorization
     Two sums that are useful for counting the operations:
     $$\sum_{j=1}^{n} j = \frac{n(n+1)}{2} \qquad\qquad \sum_{j=1}^{n} j^2 = \frac{n(n+1)(2n+1)}{6}$$
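
     Using these sums (a standard completion of the side note, not spelled out in the extracted slide): step $k$ of the factorization updates an $(n-k) \times (n-k)$ block with one multiplication and one subtraction per entry, so the dominant cost is
     $$\sum_{k=1}^{n-1} 2(n-k)^2 = 2 \sum_{j=1}^{n-1} j^2 = \frac{(n-1)n(2n-1)}{3} \approx \frac{2}{3} n^3,$$
     i.e. the factorization costs $O(n^3)$, while each subsequent pair of triangular solves costs only $O(n^2)$. The factorization therefore pays off when the same $A$ is reused with many right-hand sides $b$.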
