Neural Collaborative Subspace Clustering




  1. Neural Collaborative Subspace Clustering. Tong Zhang, Pan Ji, Mehrtash Harandi, Wenbing Huang, Hongdong Li. International Conference on Machine Learning (ICML), Long Beach, CA, June 10-15, 2019

  2. Subspace Clustering
  • Cluster data points drawn from a union of low-dimensional subspaces.
  • Applications: image clustering, motion segmentation, etc.

  3. Subspace Clustering Methods
  • State-of-the-art (SOTA) methods consist of two steps:
    1. Construct an affinity matrix for the whole dataset;
    2. Apply normalized cuts or spectral clustering.
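As a concrete reference point, the second step above can be sketched in NumPy (a minimal illustration, not the authors' code): build the symmetric normalized graph Laplacian from an affinity matrix and embed the points with its bottom eigenvectors; the rows of the embedding are then typically clustered with k-means.

```python
import numpy as np

def spectral_embedding(affinity, k):
    """Embed points using the k bottom eigenvectors of the
    symmetric normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = affinity.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    lap = np.eye(len(affinity)) - d_inv_sqrt[:, None] * affinity * d_inv_sqrt[None, :]
    eigvals, eigvecs = np.linalg.eigh(lap)  # eigenvalues in ascending order
    return eigvecs[:, :k]  # n x k spectral embedding

# Toy affinity: two disconnected groups of three points each
A = np.zeros((6, 6))
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
emb = spectral_embedding(A, 2)  # rows within a group coincide
```

This also makes the scalability complaint on the next slide concrete: the eigendecomposition is cubic in the number of points.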

  4. Scalability Issue!!
  • Affinity matrix construction is expensive:
    • Large memory footprint;
    • High complexity in optimization.
  • Spectral clustering is expensive:
    • Computing the SVD of large matrices is demanding.

  5. Can we avoid constructing huge affinity matrices and bypass spectral clustering?

  6. Our Idea
  • Construct the affinity matrix batch by batch;
  • Train a classifier using the affinity matrices.

  7. Our Idea
  • Construct the affinity matrix batch by batch;
  • Train a classifier using the affinity matrices.
  • How?

  8. Affinity from Classification
  • Build a connection between clustering and classification via affinity matrices.
  • Classification affinity: A_cls(i, j) = q_i^T q_j, where q_i is the classifier's soft assignment for sample i.
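One way to realize a classification affinity of this kind (a sketch with toy values; the exact normalization in the paper may differ) is the inner product of the classifier's softmax outputs, so two samples assigned to the same cluster with high confidence get a high affinity:

```python
import numpy as np

# Toy softmax outputs for 4 samples over 3 clusters (rows sum to 1)
Q = np.array([
    [0.90, 0.05, 0.05],
    [0.80, 0.10, 0.10],
    [0.10, 0.85, 0.05],
    [0.05, 0.10, 0.85],
])

# Classification affinity: inner products of soft assignments
A_cls = Q @ Q.T
```

Samples 0 and 1 (both confidently in cluster 0) end up with a much larger affinity than samples 0 and 2.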

  9. Affinity from Subspace
  • Subspace affinity A_sub from self-expressiveness: enforce Z ≈ ZC on the latent features Z.
  • Architecture: input → Encoder → Z → Self-Expressive Layer (ZC) → Decoder → output.
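A minimal NumPy sketch of the self-expressiveness idea, using a ridge-regularized closed-form solve in place of the learned self-expressive layer (an illustrative simplification, not the paper's trained architecture): each column of Z is reconstructed as a combination of the other columns, and the magnitudes of the coefficients give the affinity.

```python
import numpy as np

rng = np.random.default_rng(0)
# Latent features Z: columns are samples drawn from two 1-D subspaces (lines in R^5)
basis1 = rng.standard_normal((5, 1))
basis2 = rng.standard_normal((5, 1))
Z = np.hstack([basis1 @ rng.standard_normal((1, 4)),
               basis2 @ rng.standard_normal((1, 4))])  # 5 x 8

lam = 1e-2
n = Z.shape[1]
# Ridge-regularized self-expression: C = argmin_C ||Z - ZC||^2 + lam * ||C||^2
C = np.linalg.solve(Z.T @ Z + lam * np.eye(n), Z.T @ Z)
A_sub = (np.abs(C) + np.abs(C).T) / 2  # symmetrized subspace affinity
```

With a small regularizer the reconstruction Z ≈ ZC is nearly exact, and coefficients concentrate on samples from the same subspace.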

  10. Collaborative Learning
  • The subspace affinity is more confident at identifying samples from the same class.
  • Positive confidence: A_sub supervises A_cls.

  11. Collaborative Learning (cont'd)
  • The classification affinity is more confident at identifying samples from different classes.
  • Negative confidence: A_cls supervises A_sub.
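The two confidence notions above can be sketched as boolean supervision masks: trust the subspace branch only where it is confidently positive, and the classification branch only where it is confidently negative (the thresholds u and l below are hypothetical illustration values, not from the paper):

```python
import numpy as np

# Illustrative affinities for 4 samples, values in [0, 1];
# samples {0, 1} and {2, 3} form the two underlying clusters
A_sub = np.array([
    [1.00, 0.90, 0.20, 0.10],
    [0.90, 1.00, 0.15, 0.20],
    [0.20, 0.15, 1.00, 0.80],
    [0.10, 0.20, 0.80, 1.00],
])
A_cls = np.array([
    [1.00, 0.80, 0.25, 0.20],
    [0.80, 1.00, 0.20, 0.25],
    [0.25, 0.20, 1.00, 0.75],
    [0.20, 0.25, 0.75, 1.00],
])

u, l = 0.7, 0.3  # hypothetical confidence thresholds
positive = A_sub > u   # subspace branch trusted for "same cluster"
negative = A_cls < l   # classification branch trusted for "different cluster"
# Only confident entries supervise the other branch; ambiguous ones are ignored
supervised = positive | negative
```

Each branch thus trains on the pairs where the other branch is confident, which is the collaborative part of the framework.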

  12. Our Framework: Collaborative Learning
  [Framework diagram: a data batch feeds both affinity branches]

  13. Clustering via Classifier
  • Output the clustering directly from the classification part (bypassing spectral clustering):
    y_i = argmax_h q_i(h), h = 1, ..., K, where K is the number of clusters.
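At inference time the read-out is just an argmax over the classifier's soft assignments; a toy sketch:

```python
import numpy as np

# Soft assignments q_i(h) from the trained classifier (toy values)
Q = np.array([
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.1, 0.7],
])
labels = np.argmax(Q, axis=1)  # y_i = argmax_h q_i(h)
```

No affinity matrix over the full dataset and no eigendecomposition are needed at test time, which is what makes the approach scalable.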

  14. Experiments:
  • MNIST
  • Fashion-MNIST

  15. Conclusion
  • Subspaces are a powerful tool for representing data in high-dimensional spaces.
  • Introduced a collaborative learning paradigm for clustering.
  • Made subspace clustering scalable through batch-wise training.

  16. We're hiring! For more details, visit http://www.nec-labs.com/research-departments/media-analytics.
