Singular Value Decomposition (SVD)

Term-document matrix M (rows = documents, columns = terms):

          congress  parliament  the  US  UK
  doc1        1         1        1    1   0
  doc2        1         0        1    0   1
  doc3        1         1        0    1   0
  doc4        1         0        1    0   1

SVD factors M into U x D x V:

  U (documents over components):
    d1  -0.60  -0.39   0.70   0.00
    d2  -0.48   0.50  -0.12  -0.71
    d3  -0.43  -0.58  -0.69   0.00
    d4  -0.48   0.50  -0.12   0.71

  D (diagonal of singular values): 3.06, 1.81, 0.57, 0.00

  V (components over the terms congress, parliament, the, US, UK):
    -0.65  -0.34  -0.51  -0.34  -0.31
     0.02  -0.54   0.34  -0.54   0.56
    -0.42   0.02   0.79   0.02  -0.44
    -0.63   0.27   0.00   0.37   0.63
    -0.04   0.73   0.00  -0.68   0.04

Reading the factors (the annotations across these slides):
  • row d1 of U: doc1 in the new feature space
  • an entry of U (e.g. -0.60): the weight of component 1 for doc1
  • a diagonal entry of D (e.g. 3.06): the weight of component 1 over all the data
  • a row of V (e.g. its first row): component 1
  • an entry of V (e.g. -0.51): the contribution of “the” to component 1
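The worked example above can be reproduced directly; below is a minimal numpy sketch (not part of the original slides). Note that np.linalg.svd returns V already transposed and may flip the sign of whole components, so individual values can differ from the slide by a sign.

```python
import numpy as np

# Term-document matrix from the slide: rows doc1..doc4,
# columns congress, parliament, the, US, UK.
M = np.array([
    [1, 1, 1, 1, 0],   # doc1
    [1, 0, 1, 0, 1],   # doc2
    [1, 1, 0, 1, 0],   # doc3
    [1, 0, 1, 0, 1],   # doc4
], dtype=float)

U, d, Vt = np.linalg.svd(M, full_matrices=False)

print(np.round(U, 2))    # documents in the new component space
print(np.round(d, 2))    # singular values, roughly [3.06, 1.81, 0.57, 0.00]
print(np.round(Vt, 2))   # contribution of each term to each component
                         # (the slide's V also shows a 5th, zero-weight row)

# Sanity check: the factors reconstruct M.
assert np.allclose(U @ np.diag(d) @ Vt, M)
```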
Singular Value Decomposition (SVD)

  M (m x n)  =  U (m x m)  x  D (m x n)  x  V (n x n)
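As a quick shape check of the full decomposition, here is a hedged numpy sketch (the variable names are mine): numpy returns the singular values as a vector, and they have to be placed on the diagonal of an m x n matrix to recover D.

```python
import numpy as np

# Hypothetical shape check for the full decomposition M = U x D x V.
m, n = 4, 5
M = np.random.randn(m, n)
U, d, Vt = np.linalg.svd(M, full_matrices=True)
print(U.shape, d.shape, Vt.shape)          # (4, 4) (4,) (5, 5)

# D is the m x n matrix with the singular values on its diagonal.
D = np.zeros((m, n))
D[:len(d), :len(d)] = np.diag(d)
assert np.allclose(U @ D @ Vt, M)
```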
Truncated Singular Value Decomposition (SVD)

  M (m x n)  ≈  U (m x l)  x  D (l x l)  x  V (l x n)

• keep only the first l components
• “best l-rank approximation of M”: ||M - UDV||_2 is as small as possible
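A small numpy sketch of truncating to the first l components (the helper name and the random test matrix are illustrative, not from the slides); by the Eckart-Young theorem the result is the best rank-l approximation of M in both Frobenius and spectral norm.

```python
import numpy as np

def truncated_svd(M, l):
    """Keep only the first l singular components of M."""
    U, d, Vt = np.linalg.svd(M, full_matrices=False)  # d is sorted descending
    return U[:, :l], np.diag(d[:l]), Vt[:l, :]

# Illustrative matrix; any m x n matrix works.
M = np.random.randn(100, 50)
U_l, D_l, V_l = truncated_svd(M, l=10)
M_approx = U_l @ D_l @ V_l                 # rank-10 approximation of M

# Eckart-Young: no rank-10 matrix is closer to M than M_approx
# (in Frobenius or spectral norm).
print(np.linalg.norm(M - M_approx))
```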
Dimensionality Reduction

• “Low Rank Assumption”: we typically assume that our features contain a large amount of redundant information
• We can throw away a lot of principal components without losing too much of the signal needed for our task (a quick check of this is sketched below)
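One way to check the low-rank assumption on a concrete matrix is to look at how fast the squared singular values decay; here is a hedged numpy sketch with a made-up rank-10-plus-noise matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data matrix: a rank-10 signal plus a little noise,
# i.e. the kind of redundancy the low-rank assumption is about.
M = rng.standard_normal((200, 10)) @ rng.standard_normal((10, 100))
M += 0.1 * rng.standard_normal((200, 100))

# Squared singular values = each component's share of the total
# (squared Frobenius) energy of M.
d = np.linalg.svd(M, compute_uv=False)
energy = np.cumsum(d ** 2) / np.sum(d ** 2)
print(np.round(energy[:15], 3))   # the curve flattens after ~10 components
```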
Clicker Question!

In practice, is this assumption of low rank valid?
  a) Yes
  b) No
  c) Yeah, sure, why not?
Matrices IRL

• Data is noisy, so M is most likely full-rank
• We assume that M is close to a low-rank matrix, and we approximate the matrix it is close to
• Viewed as a “de-noised” version of M
• “Original matrix exhibits redundancy and noise, low-rank reconstruction exploits the former to remove the latter”*

*Matrix and Tensor Factorization Methods for Natural Language Processing (ACL 2015)
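A minimal numpy sketch of the de-noising view (the clean and noisy matrices here are synthetic assumptions, not data from the slides): the noisy observation is full rank, yet its rank-5 reconstruction typically lands much closer to the underlying clean matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "clean" matrix of rank 5, observed through additive noise,
# so the observed M is (almost surely) full rank.
clean = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 40))
M = clean + 0.3 * rng.standard_normal((100, 40))

# Rank-5 reconstruction of the noisy observation.
U, d, Vt = np.linalg.svd(M, full_matrices=False)
denoised = U[:, :5] @ np.diag(d[:5]) @ Vt[:5, :]

# The low-rank reconstruction is typically closer to the clean matrix
# than the noisy observation itself.
print(np.linalg.norm(M - clean))         # error of the raw observation
print(np.linalg.norm(denoised - clean))  # error after de-noising
```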
Matrices IRL

• Data is also often incomplete… missing values, new observations, etc.
• Can we use SVD for this?
• Yes! Though we need to make a few changes…
Matrix Completion

[Slide figure: a user x movie matrix (columns: To All the Boys I Loved Before, Roma, The Ballad of Buster Scruggs…, Okja, Mudbound; rows: user1-user5) with 0/1 entries and many cells left blank. Filling in the blanks gives the “people also liked…” recommendations.]
Matrix Completion

  M (original)  ≈  U D V  =  M' (completed)

problems?
Matrix Completion

  M  =  U x D x V

Exact SVD assumes M is complete… just gradient descent that MF!
MF with Gradient Descent

  M  =  U x D x V   →   M  =  U x V

Not properly SVD (fewer guarantees, e.g. the components are not orthonormal), but good enough
MF with Gradient Descent

  M  =  U x V

  $\min_{U,V} \sum_{ij} (M_{ij} - u_i \cdot v_j)^2$

But! Only consider cases when M_ij is observed!
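A minimal sketch of this objective in numpy, assuming plain full-batch gradient descent over a 0/1 observation mask (the function, learning rate, and toy ratings matrix are all illustrative; practical recommenders add regularization, bias terms, and stochastic updates):

```python
import numpy as np

rng = np.random.default_rng(0)

def mf_gradient_descent(M, mask, l=2, lr=0.01, steps=5000):
    """Fit M ~= U @ V using only the entries where mask is True.

    Minimizes sum over observed (i, j) of (M_ij - u_i . v_j)^2 by
    full-batch gradient descent (the constant factor of 2 is folded into lr).
    """
    m, n = M.shape
    U = 0.1 * rng.standard_normal((m, l))
    V = 0.1 * rng.standard_normal((l, n))
    for _ in range(steps):
        err = mask * (M - U @ V)   # residual, zeroed on unobserved cells
        dU = err @ V.T             # descent direction for U
        dV = U.T @ err             # descent direction for V
        U += lr * dU
        V += lr * dV
    return U, V

# Toy usage: a tiny ratings matrix with np.nan marking missing entries.
M = np.array([[1., 0., np.nan],
              [np.nan, 0., 1.],
              [1., np.nan, 1.]])
mask = ~np.isnan(M)
U, V = mf_gradient_descent(np.nan_to_num(M), mask, l=2)
print(np.round(U @ V, 2))          # completed matrix M' = U x V
```

Only the masked residual enters the gradients, so the unobserved cells never pull U and V anywhere; after training, U @ V fills those cells in.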
Clicker Question!

  $\min_{U,V} \sum_{ij} (M_{ij} - u_i \cdot v_j)^2$

Compute the loss given this setting of U and V…

[Slide figure: a small example matrix M together with candidate factors U and V; the answer slide also shows their product M' = U x V. The exact entries did not survive extraction.]

  a) 14
  b) 10
  c) 6