Flat Metric Minimization with Applications in Generative Modeling

  1. Flat Metric Minimization with Applications in Generative Modeling. Thomas Möllenhoff, Daniel Cremers.

  2. Motivation
     Latent concepts often induce an orientation of the data. Tangent vectors to the “data manifold”:
     ▸ Stroke thickness or shear of an MNIST digit, X ⊂ R^(28⋅28) (a small numerical sketch follows this slide).
     ▸ Camera position, lighting/material in a 3D scene.
     ▸ Arrow of time (videos, time-series data, ...).
     Contributions:
     ▸ We propose the novel perspective of representing oriented data with k-currents from geometric measure theory.
     ▸ Using this viewpoint within the context of GANs, we learn a generative model that behaves equivariantly with respect to specified tangent vectors.
     (Simard et al. 1992, 1998; Rifai et al. 2011)
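
     The following is a minimal sketch, not taken from the slides or the paper, of how such a tangent vector can be approximated numerically: a central finite difference along a one-parameter shear of the image, in the spirit of Simard et al.'s tangent vectors. The helper names shear and shear_tangent, the step size eps, and the use of scipy.ndimage.affine_transform are illustrative choices.

         # Approximate the "shear" tangent vector of an image x ∈ R^(28⋅28)
         # by a central finite difference along the shear orbit s ↦ shear(x, s).
         import numpy as np
         from scipy.ndimage import affine_transform

         def shear(x, s):
             """Shear a 2D image by factor s, keeping the image center fixed."""
             h, w = x.shape
             center = np.array([h / 2.0, w / 2.0])
             A = np.array([[1.0, s],
                           [0.0, 1.0]])              # shear matrix on pixel coordinates
             offset = center - A @ center            # so that the center stays put
             return affine_transform(x, A, offset=offset, order=1)

         def shear_tangent(x, eps=0.05):
             """Finite-difference tangent vector of the shear orbit at x."""
             return (shear(x, eps) - shear(x, -eps)) / (2.0 * eps)

         x = np.zeros((28, 28))
         x[6:22, 13:15] = 1.0                        # a crude vertical "stroke"
         v = shear_tangent(x)                        # tangent vector, same shape as x
         print(v.shape, float(np.abs(v).max()))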

  3. An invitation to geometric measure theory (GMT)
     ▸ Differential geometry, generalized through measure theory to deal with surfaces that are not necessarily smooth.
     ▸ k-currents ≈ generalized (possibly quite irregular) oriented k-dimensional surfaces in d-dimensional space.
     ▸ The class of currents we consider forms a linear space; it includes oriented k-dimensional surfaces as elements.
     (Morgan 2016; Krantz & Parks 2008; Federer 1969)

  4. Generalizing Wasserstein GANs to k-currents
     [Figure: the generator g_θ : Z → X pushes the latent current S forward to g_θ♯S in data space, which is compared to the data current T via F_λ(g_θ♯S, T).]
     ▸ T and S are 1-currents representing the data and the (partially oriented) latents.
     ▸ The pushforward operator g_θ♯ yields the transformed current g_θ♯S.
     ▸ We propose to use the flat metric F_λ as a distance between g_θ♯S and T (one common formulation of F_λ is recalled after this slide).
     ▸ For k = 0, the flat metric is closely related to the Wasserstein-1 distance, and positive 0-currents with unit mass are probability distributions.
     (Inspired by the optimal transport perspective on GANs: Bottou et al. 2017, Genevay et al. 2017)
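
     For reference, the flat norm from GMT has a dual and a primal form; the scaling convention below, with the parameter λ placed on the constraint for dω, is one common choice and may differ from the exact normalization used in the paper:

         F_λ(T) = sup { T(ω) : ‖ω‖∗ ≤ 1, ‖dω‖∗ ≤ λ }
                = min { M(T − ∂V) + λ·M(V) : V a (k+1)-current },    F_λ(S, T) := F_λ(S − T).

     The dual (sup) form is what a critic can approximate by parametrizing the k-form ω; the primal (min) form conveys the geometric intuition that a discrepancy may either be paid for directly in mass or be “filled in” by a (k+1)-dimensional piece V whose mass is penalized by λ.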

  5. k-dimensional orientation in d-dimensional space
     ▸ Simple k-vectors v = v_1 ∧ ⋯ ∧ v_k ∈ Λ_k R^d describe oriented k-dimensional subspaces together with an area in R^d.
     [Figure: parallelograms illustrating v_1 ∧ v_2, the reversed orientation v_2 ∧ v_1, the sign cancellation (−v_1) ∧ (−v_2), and the area scaling 2v_1 ∧ 2v_2.]
     ▸ The set of simple k-vectors forms a nonconvex cone in the vector space Λ_k R^d.
     ▸ For v = v_1 ∧ ⋯ ∧ v_k and w = w_1 ∧ ⋯ ∧ w_k (with the factors collected as columns of matrices V and W): ⟨v, w⟩ = det(V⊺W), |v| = √⟨v, v⟩ (a numerical check follows this slide).
     (Graßmann 1844)
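
     A minimal numerical check of these formulas (illustrative, not from the paper's code; the helper names and example vectors are arbitrary):

         # ⟨v_1 ∧ ⋯ ∧ v_k, w_1 ∧ ⋯ ∧ w_k⟩ = det(V⊺W) and |v| = √⟨v, v⟩,
         # with the factors v_i, w_i stored as the columns of V and W.
         import numpy as np

         def simple_kvector_inner(V, W):
             return np.linalg.det(V.T @ W)

         def simple_kvector_norm(V):
             return np.sqrt(np.linalg.det(V.T @ V))   # k-volume of the parallelepiped

         V = np.array([[1.0, 0.0],
                       [0.0, 1.0],
                       [0.0, 0.0]])                   # v_1 ∧ v_2: unit square in the x-y plane of R^3
         W = 2.0 * V                                  # w = 2v_1 ∧ 2v_2: four times the area

         print(simple_kvector_inner(V, W))            # 4.0
         print(simple_kvector_norm(W))                # 4.0
         print(simple_kvector_inner(V, W[:, ::-1]))   # -4.0: swapping factors flips orientation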

  6. Oriented manifolds, differential forms and currents
     ▸ Orientation of a k-dimensional manifold M: a continuous simple k-vector map τ_M : M → Λ_k R^d with |τ_M(z)| = 1 and T_z M “spanned” by τ_M(z).
     ▸ Differential form: a k-covector field ω : R^d → Λ^k R^d.
     ▸ Integration of a k-form over an oriented k-dimensional manifold:
       ∫_M ω := ∫_M ⟨ω(z), τ_M(z)⟩ dH^k(z) = ⟦M⟧(ω)
     ▸ ⟦M⟧ is a k-current. In general, k-currents are continuous linear functionals acting on compactly supported smooth k-forms (a discretized example follows this slide).
     [Figure: examples of a 2-current, a discrete 2-current and a discrete 0-current.]
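
     As a concrete illustration (the discretization below is an assumption for exposition, not the paper's construction), the following sketch evaluates the 1-current ⟦M⟧ induced by the counter-clockwise unit circle M ⊂ R^2 on the 1-form ω = x dy; by Green's theorem the exact value is the enclosed area, π.

         # Approximate ⟦M⟧(ω) = ∫_M ⟨ω(z), τ_M(z)⟩ dH^1(z) for M = unit circle, ω = x dy.
         import numpy as np

         n = 2000
         t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
         z = np.stack([np.cos(t), np.sin(t)], axis=1)       # points on the circle
         tau = np.stack([-np.sin(t), np.cos(t)], axis=1)    # unit tangents (counter-clockwise)
         dH1 = 2.0 * np.pi / n                              # arc-length element

         omega = np.stack([np.zeros(n), z[:, 0]], axis=1)   # ω(z) = x dy as a covector (0, x)

         value = np.sum(np.einsum('ij,ij->i', omega, tau)) * dH1
         print(value, np.pi)                                # ≈ 3.1415..., the enclosed area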

  7. Towards a distance between k-currents
     ▸ Mass of a k-current: M(T) = sup_{‖ω‖∗ ≤ 1} T(ω).
     ▸ The boundary operator ∂ maps a k-current to a (k−1)-current: ∂T(ω) = T(dω).
     ▸ Stokes' theorem: ∫_{∂M} ω = ∫_M dω (a numerical check follows this slide).
     ▸ Normal currents T ∈ N_{k,X}(R^d): finite mass and boundary mass, M(T) + M(∂T) < ∞.
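
     A minimal numerical check of the boundary definition ∂T(ω) = T(dω), for the illustrative case of a 1-current T given by an oriented segment from a to b in R^2 and a 0-form (function) f: by the fundamental theorem of calculus, T(df) should equal (δ_b − δ_a)(f) = f(b) − f(a). The example function and the midpoint-rule discretization are assumptions for illustration.

         # Check ∂T(f) = T(df) for T = oriented segment [a, b] in R^2, f a smooth function.
         import numpy as np

         a, b = np.array([0.0, 0.0]), np.array([1.0, 2.0])
         f = lambda p: np.sin(p[..., 0]) * p[..., 1]                  # example 0-form
         df = lambda p: np.stack([np.cos(p[..., 0]) * p[..., 1],      # its differential (gradient)
                                  np.sin(p[..., 0])], axis=-1)

         n = 5000
         s = (np.arange(n) + 0.5) / n                                 # midpoint rule on [0, 1]
         z = a + s[:, None] * (b - a)                                 # sample points on the segment
         tau = (b - a) / np.linalg.norm(b - a)                        # unit orientation
         dH1 = np.linalg.norm(b - a) / n                              # length element

         T_of_df = np.sum(df(z) @ tau) * dH1                          # T(df)
         boundary_T_of_f = f(b) - f(a)                                # (δ_b − δ_a)(f) = ∂T(f)
         print(T_of_df, boundary_T_of_f)                              # both ≈ 2·sin(1) ≈ 1.6829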
