Lecture 7: More Math + Image Filtering
Administrative: HW0 was due yesterday! HW1 due a week from yesterday.
Cool Talk Today: https://cse.engin.umich.edu/event/numpy-a-look-at-the-past-present-and-future-of-array-computation
Last Time: Matrices, Vectorization, Linear Algebra
Eigensystems
• An eigenvector $v_i$ and eigenvalue $\lambda_i$ of a matrix $A$ satisfy $A v_i = \lambda_i v_i$ ($A v_i$ is scaled by $\lambda_i$; checked numerically below)
• Vectors and values are always paired, and you typically assume $\|v_i\|^2 = 1$
• The biggest eigenvalue of $A$ gives bounds on how much $f(x) = Ax$ stretches a vector $x$
• Hints of what people really mean:
• "Largest eigenvector" = the eigenvector with the largest eigenvalue
• "Spectral" just means there are eigenvectors somewhere
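A quick NumPy sanity check of the defining property (a minimal sketch; the matrix here is just an arbitrary symmetric example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues and a matrix whose columns are the
# corresponding unit-norm eigenvectors
vals, vecs = np.linalg.eig(A)

for lam, v in zip(vals, vecs.T):
    # Defining property: A v = lambda v
    assert np.allclose(A @ v, lam * v)
```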
Suppose I have points in a grid
Now I apply f(x) = Ax to these points. Pointy end of each arrow: Ax; non-pointy end: x.
$A = \begin{bmatrix} 1.1 & 0 \\ 0 & 1.1 \end{bmatrix}$
Red box = unit square, blue box = after f(x) = Ax. What are the yellow lines and why?
$A = \begin{bmatrix} 0.8 & 0 \\ 0 & 1.25 \end{bmatrix}$
Now I apply f(x) = Ax to these points. Pointy end of each arrow: Ax; non-pointy end: x.
$A = \begin{bmatrix} 0.8 & 0 \\ 0 & 1.25 \end{bmatrix}$
Red box = unit square, blue box = after f(x) = Ax. What are the yellow lines and why?
$A = \begin{bmatrix} \cos(t) & -\sin(t) \\ \sin(t) & \cos(t) \end{bmatrix}$
Red box = unit square, blue box = after f(x) = Ax. Can we draw any yellow lines?
Eigenvectors of Symmetric Matrices
• There are always n mutually orthogonal eigenvectors, with n (not necessarily distinct) eigenvalues
• For symmetric $A$, the eigenvector with the largest (smallest) eigenvalue maximizes (minimizes) the ratio $\frac{x^T A x}{x^T x}$
• So for unit vectors (where $x^T x = 1$), that eigenvector maximizes $x^T A x$
• A surprisingly large number of optimization problems rely on (max/min)imizing this; a small numerical check follows below
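A minimal NumPy sketch of the claim (a random symmetric matrix, assumed just for illustration): no unit vector achieves a larger value of $x^T A x$ than the top eigenvector does.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T  # symmetrize

# eigh is the symmetric eigensolver; eigenvalues come back in ascending order
vals, vecs = np.linalg.eigh(A)
top = vecs[:, -1]  # unit eigenvector paired with the largest eigenvalue
assert np.isclose(top @ A @ top, vals[-1])

# Random unit vectors never beat the largest eigenvalue
for _ in range(1000):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)
    assert x @ A @ x <= vals[-1] + 1e-9
```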
Singular Value Decomposition
Can always write an m×n matrix $A$ as $A = U \Sigma V^T$:
• $U$: rotation; its columns are eigenvectors of $A A^T$
• $\Sigma$: scale; a diagonal matrix $\mathrm{diag}(\sigma_1, \sigma_2, \sigma_3, \ldots)$ whose entries are the square roots of the eigenvalues of $A^T A$
• $V^T$: rotation; its rows are eigenvectors of $A^T A$
Singular Value Decomposition
• Every matrix is a rotation, a scaling, and a rotation
• Number of non-zero singular values = rank = number of linearly independent vectors
• Zeroing out the smallest singular values in $U \Sigma V^T$ (e.g., setting $\sigma_3 = 0$) gives the "closest" matrix to $A$ with lower rank (see the sketch below)
• Secretly behind many of the things you do with matrices
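A minimal NumPy sketch of the rank and low-rank-approximation claims (the random matrix is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))

# Thin SVD: A = U @ np.diag(s) @ Vt, singular values sorted descending
U, s, Vt = np.linalg.svd(A, full_matrices=False)
rank = np.sum(s > 1e-10)  # number of non-zero singular values = rank

# Best rank-2 approximation: keep only the two largest singular values
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
assert np.linalg.matrix_rank(A_k) == k
```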
Solving Least-Squares
Start with two points $(x_i, y_i)$:
$y = Av$: $\begin{bmatrix} y_1 \\ y_2 \end{bmatrix} = \begin{bmatrix} x_1 & 1 \\ x_2 & 1 \end{bmatrix} \begin{bmatrix} m \\ b \end{bmatrix} = \begin{bmatrix} m x_1 + b \\ m x_2 + b \end{bmatrix}$
We know how to solve this: invert $A$ and find $v$ (i.e., the $(m, b)$ that fits the points).
Solving Least-Squares
With the same two points $(x_i, y_i)$, look at the squared error:
$\|y - Av\|^2 = (y_1 - (m x_1 + b))^2 + (y_2 - (m x_2 + b))^2$
This is the sum of squared differences between the actual value of y and what the model says y should be.
Solving Least-Squares
Suppose there are n > 2 points:
$y = Av$: $\begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix} \begin{bmatrix} m \\ b \end{bmatrix}$
Compute $\|y - Av\|^2$ again: $\|y - Av\|^2 = \sum_{i=1}^{n} \left( y_i - (m x_i + b) \right)^2$
Solving Least-Squares
Given $y$, $A$, and $v$ with $y = Av$ overdetermined ($A$ tall: more equations than unknowns), we want to minimize $\|y - Av\|^2$, i.e., find
$\arg\min_v \|y - Av\|^2$ (the value of $v$ that makes the expression smallest).
The solution satisfies the normal equations $A^T A v^* = A^T y$, or $v^* = (A^T A)^{-1} A^T y$.
(Don't actually compute the inverse!)
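In practice you hand this to a solver rather than forming the inverse. A minimal sketch with made-up data (the true line $y = 3x + 1$ and the noise level are assumptions for illustration):

```python
import numpy as np

# Noisy points roughly on y = 3x + 1
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3 * x + 1 + 0.5 * rng.standard_normal(50)

A = np.stack([x, np.ones_like(x)], axis=1)  # rows are [x_i, 1]

# Minimizes ||y - A v||^2 without explicitly computing (A^T A)^{-1}
v, residuals, rank, sing_vals = np.linalg.lstsq(A, y, rcond=None)
m, b = v  # recovered slope and intercept, close to 3 and 1
```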
When is Least-Squares Possible?
Given $y$, $A$, and $v$, want $y = Av$.
Square $A$: want n outputs, have n knobs to fiddle with; every knob is useful if $A$ is full rank.
Tall $A$: rows (outputs) > columns (knobs). Thus you can't get the precise output you want (not enough knobs), so settle for the "closest" knob setting.
When is Least-Squares Possible?
Given $y$, $A$, and $v$, want $y = Av$.
Square $A$: want n outputs, have n knobs to fiddle with; every knob is useful if $A$ is full rank.
Wide $A$: columns (knobs) > rows (outputs). Thus any output can be expressed in infinitely many ways.
Homogeneous Least-Squares
Given a set of unit vectors (aka directions) $x_1, \ldots, x_n$, I want the vector $v$ that is as orthogonal to all the $x_i$ as possible (for some definition of orthogonal).
Stack the $x_i$ into the rows of $A$ and compute
$Av = \begin{bmatrix} x_1^T \\ \vdots \\ x_n^T \end{bmatrix} v = \begin{bmatrix} x_1^T v \\ \vdots \\ x_n^T v \end{bmatrix}$, where $x_i^T v \approx 0$ if $v$ is orthogonal to $x_i$.
Then compute $\|Av\|^2 = \sum_i \left( x_i^T v \right)^2$: a sum of how orthogonal $v$ is to each $x_i$.
Homogeneous Least-Squares
• A lot of the time, given a matrix $A$, we want to find the $v$ that minimizes $\|Av\|^2$
• I.e., we want $v^* = \arg\min_v \|Av\|^2$
• What's a trivial solution? Set $v = 0$; then $Av = 0$
• Exclude this by forcing $v$ to have unit norm
Homogeneous Least-Squares
Let's look at $\|Av\|^2$:
$\|Av\|^2 = (Av)^T (Av)$ (rewrite as a dot product)
$= v^T A^T A v$ (distribute the transpose)
We want the vector minimizing this quadratic form. Where have we seen this?
Homogeneous Least-Squares
Ubiquitous tool in vision: $\arg\min_{\|v\|=1} \|Av\|^2$. The solution is:
(1) the "smallest"* eigenvector of $A^T A$
(2) the "smallest" right singular vector of $A$
For min → max, switch smallest → largest.
*Note: $A^T A$ is positive semi-definite, so all its eigenvalues are non-negative.
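A minimal NumPy sketch of recipe (2) (the stacked directions are random, just for illustration): the right singular vector for the smallest singular value beats every random unit vector.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))  # rows are the stacked directions x_i

# Right singular vectors are the rows of Vt; the last one pairs with
# the smallest singular value and minimizes ||A v|| over unit vectors
U, s, Vt = np.linalg.svd(A)
v_star = Vt[-1]

for _ in range(1000):
    v = rng.standard_normal(3)
    v /= np.linalg.norm(v)  # compare only against unit vectors
    assert np.linalg.norm(A @ v_star) <= np.linalg.norm(A @ v) + 1e-9
```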
Derivatives
Derivatives
Remember derivatives? Derivative: the rate at which a function f(x) changes at a point, as well as the direction that increases the function.
Given quadratic function f(x)
$f(x) = (x - 2)^2 + 5$
$g(x)$ is a function too: $g(x) = f'(x)$, aka $g(x) = \frac{d}{dx} f(x)$
Given quadratic function f(x)
$f(x) = (x - 2)^2 + 5$
What's special about x = 2? $f(x)$ is minimized at 2, and $g(x) = f'(x) = 0$ at 2.
$a$ = minimum of $f$ ⟹ $g(a) = 0$. The reverse is not true (e.g., $f(x) = x^3$ has $f'(0) = 0$ but no minimum at 0).
Rates of change
$f(x) = (x - 2)^2 + 5$
Suppose I want to increase f(x) by changing x. Blue area: move left. Red area: move right. The derivative tells you the direction of ascent and the rate; the sketch below checks this numerically.
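A minimal numeric sketch of "the derivative tells you the direction of ascent": moving a small step along the sign of $f'(x)$ increases $f$.

```python
def f(x):
    return (x - 2) ** 2 + 5

def f_prime(x):
    return 2 * (x - 2)  # derivative of (x - 2)^2 + 5

for x in (0.5, 3.5):  # one point left of the minimum, one right
    step = 0.1 if f_prime(x) > 0 else -0.1  # step along the sign of f'
    assert f(x + step) > f(x)  # f increases in that direction
```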
Calculus to Know
• Really need intuition
• Need the chain rule
• The rest you should look up / use a computer algebra system / use a cookbook
• Partial derivatives (and that's it from multivariable calculus)