Manifold learning with random errors and inverse problems


1. Manifold learning with random errors and inverse problems. Matti Lassas, in collaboration with Charles Fefferman, Sergei Ivanov, Hariharan Narayanan. Finnish Centre of Excellence in Inverse Modelling and Imaging, 2018-2025.

2. Outline:
◮ Manifold learning problems and inverse problems
◮ Learning a manifold from distances with small noise
◮ Learning a manifold from distances with large random noise

3. Construction of a manifold from discrete data. Let $(X, d_X)$ be a (discrete) metric space. We want to approximate it by a Riemannian manifold $(M_*, g_*)$ so that
◮ $(X, d_X)$ and $(M_*, d_{g_*})$ are almost isometric,
◮ the curvature and the injectivity radius of $M_*$ are bounded.
Note that $X$ is an "abstract metric space", not a set of points in $\mathbb{R}^d$, and we want to learn the intrinsic metric of the manifold.
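
A minimal sketch of the kind of input data this problem starts from: an abstract finite metric space given only as an $N \times N$ matrix of pairwise distances. In this toy example the points are sampled from a circle with its intrinsic arc-length distance, so the data secretly comes from a 1-dimensional manifold, but the learner sees only the distance matrix; all sizes and names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
theta = rng.uniform(0.0, 2.0 * np.pi, size=N)   # sample points on S^1

# Intrinsic circle distance: d(theta_j, theta_k) = min(|dt|, 2*pi - |dt|).
dt = np.abs(theta[:, None] - theta[None, :])
d_X = np.minimum(dt, 2.0 * np.pi - dt)          # the matrix (d_X(x_j, x_k))

# Only d_X is handed to the learner; the coordinates theta are forgotten.
print(d_X.shape, d_X.max())
```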

4. Example 1: Non-Euclidean metrics in data sets. Consider a data set $X = \{x_j\}_{j=1}^N \subset \mathbb{R}^d$. The ISOMAP face data set contains $N = 2370$ images of faces with $d = 2914$ pixels. Question: Define $d_X(x_j, x_k)$ using a Wasserstein distance related to optimal transport. Does $(X, d_X)$ approximate a manifold, and how can this manifold be constructed?
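
The slide leaves the exact optimal-transport construction open. As one hedged illustration, the sketch below compares two images through the 1-dimensional Wasserstein distance between their pixel-intensity distributions, using SciPy's `scipy.stats.wasserstein_distance`; the random "images" and their sizes are placeholders, not the ISOMAP data.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(1)
N, d = 50, 2914                      # N toy "images" with d pixels each
images = rng.random((N, d))

def d_X(u, v):
    # Wasserstein distance between the empirical pixel-value distributions.
    return wasserstein_distance(u, v)

# The matrix (d_X(x_j, x_k)) defines the metric space (X, d_X).
D = np.array([[d_X(images[j], images[k]) for k in range(N)] for j in range(N)])
print(D.shape)
```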

5. Example 2: Travel time distances of points. Surface waves produced by earthquakes travel near the boundary of the Earth. Observations of several earthquakes give information on the travel times $d_T(x, y)$ between points $x, y \in S^2$. Question: Can one determine the Riemannian metric associated to surface waves from travel times that contain measurement errors? (Figure by Su-Woodward-Dziewonski, 1994.)

6. Example 3: An inverse problem for a manifold. Consider the eigenvalues $\lambda_j$ and eigenfunctions $\varphi_j$ satisfying $-\Delta_g \varphi_j = \lambda_j \varphi_j$ on $M$. In the inverse interior spectral problem one is given a ball $B = B_M(p, r) \subset M$, the eigenvalues $\lambda_j$, $j = 1, 2, 3, \ldots$, and the restrictions of the eigenfunctions, $\varphi_j|_B$, $j = 1, 2, 3, \ldots$, and the goal is to determine the isometry type of $(M, g)$.
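
To make the interior spectral data concrete, here is a 1-dimensional caricature, assuming a standard finite-difference discretization: the eigenvalues of $-d^2/dx^2$ on $(0, 1)$ with Dirichlet conditions, together with the eigenfunctions restricted to an observation region $B = (0.4, 0.6)$. The grid size and the choice of $B$ are illustrative assumptions.

```python
import numpy as np

n = 500
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Standard 3-point stencil for -u'' with Dirichlet boundary conditions.
L = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
lam, phi = np.linalg.eigh(L)     # lam[j] ~ (j+1)^2 pi^2; columns are phi_j

mask = (x > 0.4) & (x < 0.6)     # the observation region B
phi_on_B = phi[mask, :]          # the data: restrictions phi_j|_B
print(lam[:3], phi_on_B.shape)   # compare lam[:3] with pi^2, 4 pi^2, 9 pi^2
```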

7. Theorem (Bosi-Kurylev-L. 2017). Let $n \in \mathbb{Z}_+$ and $K, D, i_0, r_0 > 0$. There are $\theta, C_0, \delta_0$ such that for all $\delta < \delta_0$ the following is true: Let $(M, g)$ be a Riemannian manifold such that $\|\mathrm{Ric}(M)\|_{C^3(M)} \le K$, $\mathrm{diam}(M) \le D$, $\mathrm{inj}(M) \ge i_0$. Identify the ball $B_M(p, r_0)$ with $B(r_0) \subset \mathbb{R}^n$ in normal coordinates. Assume that we are given $g^a$, $\varphi_j^a$ and $\lambda_j^a$ such that
i) the metric tensor satisfies $\|g^a - g\|_{L^\infty(B(r_0))} < \delta$,
ii) $|\lambda_j^a - \lambda_j| < \delta$ and $\|\varphi_j^a - \varphi_j\|_{L^2(B(r_0))} < \delta$ when $\lambda_j < \frac{1}{\delta}$.
Then we can construct a metric space $(X, d_X)$ such that
$$d_{GH}(M, X) \le C_0 \Big( \ln \frac{1}{\delta} \Big)^{-\theta} =: \varepsilon,$$
that is, there is an $\varepsilon$-dense subset $\{p_j : j = 1, \ldots, N\} \subset M$ and $X = \{x_j : j = 1, \ldots, N\}$ such that $|d_M(p_j, p_k) - d_X(x_j, x_k)| \le \varepsilon$.
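
The stability estimate is logarithmic, so the data error $\delta$ must shrink extremely fast to improve the reconstruction error $\varepsilon$. The short computation below makes this quantitative; the values $C_0 = 1$ and $\theta = 1/2$ are made-up illustrative constants, not the ones from the theorem.

```python
import numpy as np

C0, theta = 1.0, 0.5     # illustrative placeholders for the theorem's constants
for delta in [1e-2, 1e-4, 1e-8, 1e-16]:
    eps = C0 * np.log(1.0 / delta) ** (-theta)
    print(f"delta = {delta:.0e}  ->  eps = {eps:.3f}")
```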

8. Some earlier methods for manifold learning. Let $\{x_j\}_{j=1}^J \subset \mathbb{R}^d$ be points on a submanifold $M \subset \mathbb{R}^d$, $d > n$.
◮ 'Multi-Dimensional Scaling' (MDS) finds an embedding of the data points into $\mathbb{R}^m$, $n < m < d$, by minimising the cost function
$$\min_{y_1, \ldots, y_J \in \mathbb{R}^m} \sum_{j,k=1}^{J} \big( \|y_j - y_k\|_{\mathbb{R}^m} - d_{jk} \big)^2, \qquad d_{jk} = \|x_j - x_k\|_{\mathbb{R}^d}.$$
◮ 'Isomap' makes a graph of the K nearest neighbours and computes graph distances $d^G_{jk}$ that approximate the distances $d_M(x_j, x_k)$ along the surface. Then MDS is applied.
Note that if there is $F : M \to \mathbb{R}^m$ such that $|F(x) - F(x')| = d_M(x, x')$, then the curvature of $M$ is zero. (Figure by Tenenbaum et al., Science 2000.)
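
Both methods are standard, and the following is a compact textbook-style sketch of them, not the authors' code: classical MDS via the usual double-centering trick, and an Isomap-style pipeline chaining a k-nearest-neighbour graph, graph shortest paths, and MDS. The toy spiral data and all parameter values are illustrative.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def classical_mds(D, m):
    """Embed a J x J distance matrix D into R^m by classical MDS."""
    J = D.shape[0]
    H = np.eye(J) - np.ones((J, J)) / J        # centering matrix
    B = -0.5 * H @ (D ** 2) @ H                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:m]              # m largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def isomap(X, m, k=10):
    """Isomap: shortest paths along the k-NN graph, then classical MDS."""
    G = kneighbors_graph(X, k, mode="distance")
    D_graph = shortest_path(G, method="D", directed=False)
    return classical_mds(D_graph, m)

# Toy example: points on a spiral curve embedded in R^3.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 3 * np.pi, 400))
X = np.stack([t * np.cos(t), t * np.sin(t), rng.normal(0, 0.05, t.size)], axis=1)
Y = isomap(X, m=2)      # should roughly "unroll" the spiral
print(Y.shape)
```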

9. Outline:
◮ Manifold learning problems and inverse problems
◮ Learning a manifold from distances with small noise
◮ Learning a manifold from distances with large random noise

10. Theorem (Fefferman, Ivanov, Kurylev, L., Narayanan 2015). Let $0 < \delta < c_1(n, K)$ and let $M$ be a compact $n$-dimensional manifold with $|\mathrm{Sec}(M)| \le K$ and $\mathrm{inj}(M) > 2(\delta/K)^{1/3}$. Let $X = \{x_j\}_{j=1}^N$ be $\delta$-dense in $M$ and let $\widetilde{d} : X \times X \to \mathbb{R}_+ \cup \{0\}$ satisfy
$$|\widetilde{d}(x, y) - d_M(x, y)| \le \delta, \qquad x, y \in X.$$
Given the values $\widetilde{d}(x_j, x_k)$, $j, k = 1, \ldots, N$, one can construct a compact $n$-dimensional Riemannian manifold $(M_*, g_*)$ such that:
1. There is a diffeomorphism $F : M_* \to M$ satisfying
$$\frac{1}{L} \le \frac{d_M(F(x), F(y))}{d_{M_*}(x, y)} \le L \quad \text{for } x, y \in M_*, \qquad L = 1 + C_n K^{1/3} \delta^{2/3}.$$
2. $|\mathrm{Sec}(M_*)| \le C_n K$.
3. The injectivity radius $\mathrm{inj}(M_*)$ of $M_*$ satisfies $\mathrm{inj}(M_*) \ge \min\{ (C_n K)^{-1/2},\, (1 - C_n K^{1/3} \delta^{2/3})\, \mathrm{inj}(M) \}$.
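
The hypotheses of the theorem are easy to instantiate numerically. The sketch below builds, on $M = S^2$ (unit sphere, so $K = 1$), a sample that is empirically $\delta$-dense together with approximate distances $\widetilde{d}$ that deviate from the true geodesic distances by at most $\delta$; the sample sizes and $\delta$ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, delta = 2000, 0.15
P = rng.normal(size=(N, 3))
P /= np.linalg.norm(P, axis=1, keepdims=True)   # ~uniform points on S^2

# True geodesic (great-circle) distances: d_M(x_j, x_k) = arccos(<x_j, x_k>).
d_M = np.arccos(np.clip(P @ P.T, -1.0, 1.0))

# Approximate distances with |d_tilde - d_M| <= delta; symmetrizing keeps
# the deviation within delta.
d_tilde = d_M + rng.uniform(-delta, delta, size=d_M.shape)
d_tilde = (d_tilde + d_tilde.T) / 2.0

# Empirical check of delta-density: largest distance from a probe point to X.
Q = rng.normal(size=(2000, 3))
Q /= np.linalg.norm(Q, axis=1, keepdims=True)
fill = np.arccos(np.clip(Q @ P.T, -1.0, 1.0)).min(axis=1).max()
print(f"empirical fill distance ~ {fill:.3f} (delta = {delta})")
```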

11. Outline:
◮ Manifold learning problems and inverse problems
◮ Learning a manifold from distances with small noise
◮ Learning a manifold from distances with large random noise

12. Random sample points and random errors. Manifolds with bounded geometry: Let $n \ge 2$ be an integer and $K > 0$, $D > 0$, $i_0 > 0$. Let $(M, g)$ be a compact Riemannian manifold of dimension $n$ such that
i) $\|\mathrm{Sec}_M\|_{L^\infty(M)} \le K$,   (1)
ii) $\mathrm{diam}(M) \le D$,
iii) $\mathrm{inj}(M) \ge i_0$.
We consider measurements at randomly sampled points: let $X_j$, $j = 1, 2, \ldots, N$, be independently sampled from a probability distribution $\mu$ on $M$ such that
$$0 < c_{\min} \le \frac{d\mu}{d\,\mathrm{Vol}_g} \le c_{\max}.$$
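
A sketch of this sampling model on $M = S^2$: rejection sampling against the uniform measure produces i.i.d. points from a density that is pinched between $c_{\min}$ and $c_{\max}$. The particular density below is an invented example, not one from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
c_min, c_max = 0.5, 1.5

def density(p):                    # d(mu)/d(Vol_g), with values in [c_min, c_max]
    return 1.0 + 0.5 * p[:, 2]     # more mass near the north pole

def sample_mu(N):
    out = []
    while len(out) < N:
        p = rng.normal(size=(2 * N, 3))
        p /= np.linalg.norm(p, axis=1, keepdims=True)   # uniform proposals
        keep = rng.uniform(0.0, c_max, size=2 * N) < density(p)
        out.extend(p[keep])
    return np.array(out[:N])

X = sample_mu(1000)
print(X.shape, X[:, 2].mean())     # slight bias toward z > 0
```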

13. Definition. Let $X_j$, $j = 1, 2, \ldots, N$, be independent, identically distributed (i.i.d.) random variables having distribution $\mu$. Let $\sigma > 0$, $\beta > 1$, and let $\eta_{jk}$ be i.i.d. random variables satisfying
$$\mathbb{E}\, \eta_{jk} = 0, \qquad \mathbb{E}(\eta_{jk}^2) = \sigma^2, \qquad \mathbb{E}\, e^{|\eta_{jk}|} = \beta.$$
In particular, Gaussian noise satisfies these conditions. We assume that all the random variables $\eta_{jk}$ and $X_j$ are independent. We consider the noisy measurements
$$D_{jk} = d_M(X_j, X_k) + \eta_{jk}.$$
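
The measurement model is straightforward to simulate. The sketch below draws uniform points on $S^2$, forms the noisy matrix $D_{jk}$ with Gaussian noise (which satisfies the moment conditions above), and estimates $\beta = \mathbb{E}\, e^{|\eta_{jk}|}$ by Monte Carlo; the sample size and $\sigma$ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
N, sigma = 500, 0.3
X = rng.normal(size=(N, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # i.i.d. uniform points on S^2

d_M = np.arccos(np.clip(X @ X.T, -1.0, 1.0))    # true geodesic distances
eta = sigma * rng.standard_normal((N, N))       # E eta = 0, E eta^2 = sigma^2
D = d_M + eta                                   # observed noisy matrix (not
                                                # symmetric: independent noise
                                                # for each ordered pair)

# Monte Carlo check that beta = E exp|eta_jk| is finite for Gaussian noise.
beta = np.exp(np.abs(sigma * rng.standard_normal(10**6))).mean()
print(f"beta ~ {beta:.4f}")
```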

19. Theorem (Fefferman, Ivanov, L., Narayanan 2019). Let $n \ge 2$ and $D, K, i_0, c_{\min}, c_{\max}, \sigma, \beta > 0$ be given. Then there are $\delta_0$, $C_0$ and $C_1$ such that the following holds: Let $\delta \in (0, \delta_0)$, $\theta \in (0, \frac{1}{2})$, and let $(M, g)$ be a compact manifold satisfying the bounds (1). Then, with probability $1 - \theta$, the variance $\sigma^2$ and the noisy distances $D_{jk} = d_M(X_j, X_k) + \eta_{jk}$, $j, k \le N$, of $N$ randomly chosen points, where
$$N \ge C_0\, \frac{1}{\delta^{3n}} \Big( \log^2\big(\tfrac{1}{\theta}\big) + \log^8\big(\tfrac{1}{\delta}\big) \Big),$$
determine a Riemannian manifold $(M_*, g_*)$ such that:
1. There is a diffeomorphism $F : M_* \to M$ satisfying
$$\frac{1}{L} \le \frac{d_M(F(x), F(y))}{d_{M_*}(x, y)} \le L \quad \text{for all } x, y \in M_*,$$
where $L = 1 + C_1 \delta$.
2. The sectional curvature $\mathrm{Sec}_{M_*}$ of $M_*$ satisfies $|\mathrm{Sec}_{M_*}| \le C_1 K$.
3. The injectivity radius $\mathrm{inj}(M_*)$ of $M_*$ is close to $\mathrm{inj}(M)$.
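
To get a feel for the sample-size requirement, one can evaluate the lower bound on $N$ numerically. Note that $C_0$ is an unspecified constant from the proof, so the value $C_0 = 1$ below is purely illustrative.

```python
import numpy as np

C0, n = 1.0, 2      # C0 = 1 is a made-up placeholder for the proof's constant
for delta, theta in [(0.1, 0.01), (0.05, 0.01)]:
    N = C0 * delta ** (-3 * n) * (np.log(1 / theta) ** 2 + np.log(1 / delta) ** 8)
    print(f"delta = {delta}, theta = {theta}:  N >= {N:.3e}")
```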
