
RESTORING LATENTLY-LOST MEANING IN POPULATION-DYNAMICAL GALAXY DECOMPOSITIONS - PowerPoint PPT Presentation



  1. RESTORING LATENTLY-LOST MEANING IN POPULATION-DYNAMICAL GALAXY DECOMPOSITIONS — Prashin Jethwa (University of Vienna), with Ling Zhu (SHAO), Adriano Poci (ESO/Macquarie), Jesus Falcon-Barroso (IAC), Glenn van de Ven (University of Vienna)

  2. RECENT ACCRETION

  3. ANCIENT MERGER OF THE MILKY WAY • Gaia data decompose the inner stellar halo of the MW, revealing a major merger: ancient (7–10 Gyr), massive (M* ~ 10⁹), on a radial orbit • The detection required combining kinematics and stellar populations • Can we extend this beyond the MW? (Belokurov et al. 2017; Gaia Collaboration 2018)

  4. EXTERNAL GALAXIES: POPULATION-DYNAMICAL DECOMPOSITION • Spectral stellar population recovery in an EAGLE simulation [panels: true vs. recovered] (Boeker et al. 2019)

  5. EXTERNAL GALAXIES: POPULATION-DYNAMICAL DECOMPOSITION • Observed kinematics + a library of orbits → recovered orbit distribution (Zhu et al. 2018)

  6. EXTERNAL GALAXIES: POPULATION-DYNAMICAL DECOMPOSITION • The generative model: IFU data y_{λ,x} ~ N(μ_{λ,x}, Σ), with noise covariance Σ, model spectrum S_λ(s) of stellar population s, and model line-of-sight velocity distribution LOSVD(v_x; o) of orbit o: μ_{λ,x} = ∫∫ f(s, o) [S_λ(s) ∗ LOSVD(v_x; o)] ds do • The distribution function f(s, o) encodes all quantities of interest • The inverse problem: recover f from y (a discretized sketch follows below)
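The talk gives no code, so here is a minimal numerical sketch (all array names and sizes are illustrative, not the talk's setup) of how the double integral above becomes a plain matrix-vector product once populations and orbits are binned: each column of the design matrix is a template spectrum convolved with one orbit's LOSVD.

```python
import numpy as np

# Illustrative sketch of the discretized forward model: once populations s and
# orbits o are binned, the double integral becomes a sum, i.e. mu = X @ f.
n_lambda, n_pop, n_orb = 1000, 20, 50        # wavelength pixels, populations, orbits
rng = np.random.default_rng(0)
templates = rng.random((n_pop, n_lambda))    # S_lambda(s): one spectrum per population
losvds = rng.random((n_orb, n_lambda))       # LOSVD kernel of each orbit, on the pixel grid
losvds /= losvds.sum(axis=1, keepdims=True)  # normalise each kernel

# Design matrix: one column per (population, orbit) pair, each column a template
# convolved with that orbit's LOSVD (in practice the convolution is done in
# ln-lambda / velocity space; pixel space is used here only for brevity).
X = np.empty((n_lambda, n_pop * n_orb))
for i, spec in enumerate(templates):
    for j, kern in enumerate(losvds):
        X[:, i * n_orb + j] = np.convolve(spec, kern, mode="same")

f = rng.random(n_pop * n_orb)                # discretised distribution function f(s, o)
mu = X @ f                                   # model spectrum: sum over (s, o) of f(s,o) [S * LOSVD]
```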

  7. OUR CURRENT APPROACH TO POPULATION-DYNAMICAL DECOMPOSITION • From the data y_{λ,x}: (1) extract velocity maps, (2) find orbits, (3) extract observed population maps, (4) find the mapping from orbits to populations • All steps are formulated as linear problems, e.g. step 3: y = X w with X = [S_1, …, S_p] the matrix of template spectra, subject to the constraint w ≽ 0 • Dimensions: dim(X) = (N, p), with both the number of data points N and the number of components p large (a sketch of one such non-negative linear fit follows below)
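A minimal sketch of one such linear step using non-negative least squares (scipy's nnls); X, y, and their sizes below are illustrative, not the real data.

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative sketch of one linear step, e.g. step (3): fit observed spectra as
# a non-negative combination of template columns.
rng = np.random.default_rng(0)
X = rng.random((500, 40))                 # dim(X) = (N data points, p components)
w_true = np.abs(rng.normal(size=40))      # true non-negative weights
y = X @ w_true + 0.01 * rng.normal(size=500)

w_fit, residual = nnls(X, y)              # solves min ||X w - y||_2  subject to  w >= 0
print(w_fit.min() >= 0, residual)
```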

  8. APPLICATION TO NGC 3115 (Poci et al. 2019) • Maps extracted from the IFU data cube: surface density, mean velocity, velocity dispersion, and h3 (~ skewness of the line-of-sight velocity distribution) [the four-step flowchart of slide 7 is repeated in the background]

  9. APPLICATION TO NGC 3115 (Poci et al. 2019) • Observed vs. modelled kinematic maps, and the recovered orbital distribution

  10. APPLICATION TO NGC 3115 (Poci et al. 2019) • Observed maps of stellar age and metallicity

  11. APPLICATION TO NGC 3115 (Poci et al. 2019) • Observed vs. modelled maps of stellar age and metallicity

  12. THE BUILD-UP OF NGC 3115’S STELLAR DISK • Vertical velocity dispersion in the disk (Poci et al. 2019)

  13. MERGED SATELLITES? • The recovered orbital distribution of NGC 3115 is clumpy → merged satellites? • Unclear… the answer depends on the regularization

  14. THE NEED FOR REGULARISATION • Unregularized recovery [figure]

  15. THE NEED FOR REGULARISATION • Regularization trades a better fit to the data against better smoothness of the recovered distribution (a generic sketch follows below)
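The talk does not specify which regulariser is used, so the sketch below shows one common generic choice: a quadratic smoothness penalty, implemented as an augmented non-negative least-squares problem, with α setting the balance between fit and smoothness.

```python
import numpy as np
from scipy.optimize import nnls

# Generic (not the talk's) regularised recovery: minimise
#   ||X w - y||^2 + alpha * ||D w||^2   subject to  w >= 0,
# where D penalises differences between neighbouring weights.
rng = np.random.default_rng(1)
N, p = 300, 50
X = rng.random((N, p))
y = X @ np.abs(np.sin(np.arange(p) / 5.0)) + 0.05 * rng.normal(size=N)

alpha = 10.0
D = (np.eye(p) - np.eye(p, k=1))[:-1]       # first-difference operator (smoothness)
X_aug = np.vstack([X, np.sqrt(alpha) * D])  # stack penalty rows under the data rows
y_aug = np.concatenate([y, np.zeros(D.shape[0])])

w_reg, _ = nnls(X_aug, y_aug)               # smoother than the unregularised solution
```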

  16. POSTERIOR ON DECOMPOSITION WEIGHTS • Panels: truth, best (MAP) recovery, median of the posterior distribution • Time: seconds for the MAP recovery vs. ~30 minutes for the full posterior

  17. THE NEED FOR DIMENSIONALITY REDUCTION • 1. To detect accreted satellites, we need to understand the uncertainties in the decomposition → we want the posterior on the decomposition weights w ∈ ℝ^p → for speed, we must reduce the dimension of the parameter space • 2. It may also help us tackle the full problem, i.e. invert the full generative model rather than the current step-by-step approach • Strategy: go to a low-dimensional latent space → sample the posterior there → de-project back to the original space

  18. LINEAR DIMENSIONALITY REDUCTION (a.k.a. matrix factorization) • Original problem: y ~ N(X w, Σ), with a prior P(w) on the weights • Transformed problem: factorise X = U Vᵀ, so the dimensions go from (N, p) to (N, q)(q, p) with q ≪ p • Underlying probabilistic model: each column x_i of X follows x_i ~ N(U z_i, Ψ), with latent z_i ~ N(0, I) and Ψ = diag(ψ_1, …, ψ_N) • The regression becomes: y ~ N(X w, Σ) → y ~ N(U z, Σ + Ψ) with z = Vᵀ w • Sample the posterior P(z|y), then invert: w = R z, where R is the pseudo-inverse of Vᵀ (a sketch of the factorisation follows below)
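A minimal sketch of the factorisation step under the notation above: a truncated SVD gives X ≈ U Vᵀ, the regression parameter becomes the q-dimensional latent vector z = Vᵀ w, and the truncation error is absorbed into a diagonal extra-noise term Ψ (the construction of Ψ here is a crude illustration, not the talk's prescription; sizes are illustrative).

```python
import numpy as np

# Sketch of the linear dimensionality reduction: factorise X ~= U @ Vt with a
# truncated SVD, so the regression y ~ N(X w, Sigma) becomes
# y ~ N(U z, Sigma + Psi) with z = Vt @ w (q-dimensional instead of p).
rng = np.random.default_rng(2)
N, p, q = 400, 60, 10
X = rng.random((N, p))

U_full, s, Vt_full = np.linalg.svd(X, full_matrices=False)
U = U_full[:, :q] * s[:q]                 # (N, q) -- singular values folded into U
Vt = Vt_full[:q]                          # (q, p)

resid = X - U @ Vt                        # truncation error
Psi = np.diag((resid ** 2).mean(axis=1))  # crude diagonal "extra noise" per data point

w = np.abs(rng.normal(size=p))
z = Vt @ w                                # latent coordinates: the new regression parameters
print(np.abs(X @ w - U @ z).max())        # small when q captures most of the variance in X
```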

  19. LINEAR DIMENSIONALITY REDUCTION (a.k.a. matrix factorization) • Same as the previous slide, adding the reference: Bayesian Latent Factor Regression (West 2003)

  20. LINEAR DIMENSIONALITY REDUCTION (a.k.a. matrix factorization) • Same as the previous slide, with two annotations: because everything is normal, the covariances simply add (Σ + Ψ); and the strategy for picking q is to make it high enough that Ψ ≪ Σ (West 2003)

  21. LINEAR DIMENSIONALITY REDUCTION (a.k.a. matrix factorization) • Same as slide 19: recap of the transformed problem and the inversion w = R z, with R the pseudo-inverse of Vᵀ (West 2003)

  22. LINEAR DIMENSIONALITY REDUCTION (a.k.a. matrix factorization) • Same as the previous slide, with two caveats: our problem has positivity constraints, i.e. the prior is truncated, P(w) ∝ (Gaussian) for w ≽ 0 and 0 otherwise; and there is no guarantee that the least-squares inverse w = R z satisfies these constraints (West 2003)

  23. DEPROJECTING WITH POSITIVITY CONSTRAINTS • Recall: y ~ N(X w, Σ) → y ~ N(U z, Σ + Ψ), z = Vᵀ w • The least-squares inverse is incomplete, i.e. w = R z where R is the pseudo-inverse of Vᵀ • We can add any vector η from the null space of Vᵀ, i.e. w = R z + A η, where the columns of A span the orthogonal complement of R in ℝ^p • To invert, for each z ~ P(z|y) we need to find an η such that w satisfies the positivity constraints • Finding η → solving a quadratic programming problem: argmin_w wᵀw subject to w ≽ 0 and Vᵀ w = z

  24. DEPROJECTING WITH POSITIVITY CONSTRAINTS • Same as the previous slide, with two annotations: we find η rather than sample it; and since there are infinitely many solutions, we pick the one that minimizes the L2 norm (a sketch of this quadratic program follows below)
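A minimal sketch of the deprojection quadratic program for one latent draw z: minimise ||w||² subject to Vᵀ w = z and w ≽ 0. SLSQP from scipy is used here purely for portability; a dedicated QP solver would normally be the better choice, and all array sizes are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative deprojection QP: given a latent sample z, find the
# minimum-L2-norm w satisfying Vt @ w = z and w >= 0.
rng = np.random.default_rng(3)
p, q = 60, 10
Vt = rng.normal(size=(q, p))
w_true = np.abs(rng.normal(size=p))
z = Vt @ w_true                            # a latent draw (here built from a known w)

w0 = np.linalg.pinv(Vt) @ z                # least-squares inverse: may violate w >= 0
res = minimize(
    fun=lambda w: w @ w,                   # minimise ||w||^2 ...
    jac=lambda w: 2 * w,
    x0=np.clip(w0, 0, None),
    bounds=[(0, None)] * p,                # ... subject to w >= 0
    constraints=[{"type": "eq", "fun": lambda w: Vt @ w - z}],
    method="SLSQP",
)
w_deprojected = res.x
print(res.success, np.abs(Vt @ w_deprojected - z).max(), w_deprojected.min())
```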

  25. EXAMPLE • In practice: use the SVD to factorise X (i.e. PCA); pick the latent dimension q so that the variance discarded by the truncation is much smaller than the noise variance, i.e. Ψ ≪ Σ; sample z ~ P(z|y); and for each sampled z, solve the quadratic programming problem to get w (a sketch of the sampling step follows below)
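A sketch of the sampling step in this recipe, under the simplifying assumption that the latent posterior is Gaussian and available in closed form (the talk does not state how P(z|y) is sampled); each draw of z would then be deprojected to w with the quadratic program sketched after slide 24. All arrays are illustrative.

```python
import numpy as np

# Assumed-Gaussian latent posterior: with y ~ N(U z, S) and prior z ~ N(0, I),
#   cov(z|y) = (U^T S^-1 U + I)^-1,   mean(z|y) = cov(z|y) U^T S^-1 y.
rng = np.random.default_rng(4)
N, q = 400, 10
U = rng.normal(size=(N, q))
S = np.diag(np.full(N, 0.05 ** 2))         # combined data noise covariance (Sigma + Psi)
z_true = rng.normal(size=q)
y = U @ z_true + 0.05 * rng.normal(size=N)

S_inv = np.linalg.inv(S)
cov = np.linalg.inv(U.T @ S_inv @ U + np.eye(q))
mean = cov @ U.T @ S_inv @ y
z_samples = rng.multivariate_normal(mean, cov, size=1000)  # posterior draws of z
print(np.abs(mean - z_true).max())         # posterior mean should sit near the truth
```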

  26. POSTERIOR ON DECOMPOSITION WEIGHTS • Panels: truth, best (MAP) recovery, median of the full posterior, median of the latent posterior • Time: seconds (MAP), ~30 minutes (full posterior), 1 minute (latent posterior)

  27. SUMMARY • Population-dynamical galaxy decompositions: a detailed look into the distant pasts of nearby galaxies • Application to NGC 3115 • To make progress, dimensionality reduction: a linear decomposition adapted for posterior sampling, with a deprojection that respects the positivity constraints • Proof of concept of efficacy/efficiency: ~30 minutes → 1 minute
