Aspects of sequential and simultaneous assimilation



  1. Aspects of sequential and simultaneous assimilation (May 29, 2013)

  2. Motivation
     - There exist several algorithms for conditioning a model to data, such as EnKF, RML, ES, EnRML, MDA, ...
     - All of these methods generate approximate samples from the same distribution
     - The methods sample correctly for linear problems
     - The methods give different results for non-linear problems
     - The differences between the methods are determined by a few key characteristics
     - Focus here: sequential vs. simultaneous assimilation of data for updating static parameters

  3-6. Motivation
     - Formal Bayesian expression: seq. data assimilation = sim. data assimilation
     - Approximate methods:
       - Linear forward models: seq. data assimilation = sim. data assimilation (see the sketch below)
       - Non-linear forward models: seq. data assimilation ≠ sim. data assimilation
     [Figure: example posterior parameter fields from the "Seq. scheme" (left) and the "Sim. scheme" (right)]
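The linear half of the statement above can be verified directly. Below is a minimal NumPy sketch of a hypothetical linear-Gaussian toy problem (all matrices and data are made up for illustration): assimilating two independent data sets one after the other with Kalman-type updates gives the same posterior mean and covariance as assimilating them in a single simultaneous update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian toy problem: m ~ N(m0, Cm), d_i = G_i m + noise.
n_m, n_d1, n_d2 = 4, 2, 3
m0, Cm = np.zeros(n_m), np.eye(n_m)
G1, G2 = rng.normal(size=(n_d1, n_m)), rng.normal(size=(n_d2, n_m))
Cd1, Cd2 = 0.1 * np.eye(n_d1), 0.1 * np.eye(n_d2)
d1, d2 = rng.normal(size=n_d1), rng.normal(size=n_d2)

def kalman_update(m, C, G, Cd, d):
    """One linear (Kalman) update of the mean m and covariance C."""
    K = C @ G.T @ np.linalg.inv(G @ C @ G.T + Cd)
    return m + K @ (d - G @ m), C - K @ G @ C

# Sequential assimilation: d1 first, then d2.
m_a, C_a = kalman_update(m0, Cm, G1, Cd1, d1)
m_seq, C_seq = kalman_update(m_a, C_a, G2, Cd2, d2)

# Simultaneous assimilation: stack d1 and d2 into one update.
G = np.vstack([G1, G2])
Cd = np.block([[Cd1, np.zeros((n_d1, n_d2))],
               [np.zeros((n_d2, n_d1)), Cd2]])
m_sim, C_sim = kalman_update(m0, Cm, G, Cd, np.concatenate([d1, d2]))

print(np.allclose(m_seq, m_sim), np.allclose(C_seq, C_sim))  # True True
```

For a non-linear forward model the predicted data and the linearization change after each update, so the sequential and simultaneous schemes no longer coincide.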

  7-14. Analytical strategy
     Goal: understand the importance of seq. and sim. assimilation when combining data with different degrees of non-linearity
     - Note: an analytical result exists for seq. vs. sim. RML with linear data
     - Strategy:
       - Define comparable variants of seq./sim. RML and EnKF/ES
       - Analyze the differences between the methods
       - Extend the linear RML result to the new RML variants for combinations of linear and non-linear data
       - Extend the linear RML result to the variants of EnKF/ES

  15. Characteristics & Algorithms
     - Define variants of EnKF and of the RML method
     - Remove the impact of characteristics other than seq./sim. assimilation by ensuring that each method:
       1. updates based on an ensemble
       2. performs one complete run
       3. focuses on static parameters
     - Choose versions of RML and EnKF honoring points 1-3

  16-21. Characteristics & Algorithms
     - EnKF honors
       - Point 1: updates based on an ensemble
       - Point 2: performs one complete run
     - EnKF does not honor
       - Point 3: focuses on static parameters
     - Solution
       - Restart from the initial time after each assimilation (see the sketch below)
       - EnKF → half-iterative EnKS (Hi-EnKS)
       - If the data are assimilated simultaneously: Hi-EnKS → ES
     - Hi-EnKS: sequential scheme honoring points 1-3
     - ES: simultaneous scheme honoring points 1-3
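A schematic NumPy sketch of the half-iterative EnKS construction described above. The helper names (`forward`, `get_obs`) are hypothetical placeholders, not the authors' implementation: static parameters are updated with the usual ensemble Kalman-type formula, and the model is rerun from the initial time after every assimilation, so no dynamic state variables are carried in the update.

```python
import numpy as np

def enkf_parameter_update(M, D_obs, Cd, forward, t_end):
    """One ensemble update of the static parameters M (n_m x n_e), using data
    predicted by rerunning the model from the initial time up to t_end."""
    n_e = M.shape[1]
    D = np.column_stack([forward(M[:, j], t_end) for j in range(n_e)])  # predicted data g_j
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    Cmg = dM @ dD.T / (n_e - 1)              # ensemble cross-covariance C~_mg
    Cgg = dD @ dD.T / (n_e - 1)              # ensemble data covariance  C~_gg
    K = Cmg @ np.linalg.inv(Cgg + Cd)
    return M + K @ (D_obs - D)               # D_obs: perturbed observations, one column per member

def hi_enks(M, obs_times, get_obs, Cd, forward):
    """Half-iterative EnKS: assimilate the data sequentially in time, restarting
    the forward model from the initial time after each update (the same Cd is
    assumed at every observation time, for simplicity)."""
    for t in obs_times:
        D_obs = get_obs(t)                   # perturbed observations available at time t
        M = enkf_parameter_update(M, D_obs, Cd, forward, t)
    return M
```

If the data from all observation times are stacked into a single vector and assimilated in one such update, the loop collapses to a single step and the scheme reduces to the ES.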

  22-27. Characteristics & Algorithms
     - RML honors
       - Point 3: focuses on static parameters
     - RML does not honor
       - Point 1: updates based on an ensemble
       - Point 2: performs one complete run
     - Solution
       - Update using an ensemble approximation to the gradient: RML → EnRML
       - Minimize using one full Gauss-Newton step: EnRML → GN-EnRML (see the sketch below)
     - Sim. GN-EnRML: simultaneous scheme honoring points 1-3
     - Seq. GN-EnRML: sequential scheme honoring points 1-3
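A schematic NumPy sketch of the two GN-EnRML variants defined above (hypothetical helper names; the update uses the ensemble covariance and the pseudo-inverse ensemble gradient, as in the formulas on the following slides). Each assimilation is a single full Gauss-Newton step; the sequential variant takes one step per data group, rerunning the model in between, while the simultaneous variant stacks all data into one step.

```python
import numpy as np

def gn_enrml_step(M, D_obs, Cd, forward):
    """One full Gauss-Newton step for every ensemble member, with the
    ensemble-estimated sensitivity G~ = dD dM^+ (pseudo-inverse)."""
    n_e = M.shape[1]
    D = np.column_stack([forward(m) for m in M.T])   # predicted data g_j
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    Cm = dM @ dM.T / (n_e - 1)                       # ensemble covariance C~_m
    G = dD @ np.linalg.pinv(dM, rcond=1e-10)         # ensemble gradient G~
    K = Cm @ G.T @ np.linalg.inv(G @ Cm @ G.T + Cd)
    return M + K @ (D_obs - D)

def seq_gn_enrml(M, data_groups, forwards):
    # One Gauss-Newton step per data group, in order; the model is rerun with
    # the updated parameters before the next group is assimilated.
    for (D_obs, Cd), forward in zip(data_groups, forwards):
        M = gn_enrml_step(M, D_obs, Cd, forward)
    return M

def sim_gn_enrml(M, D_obs_all, Cd_all, forward_all):
    # A single Gauss-Newton step with all data stacked into one vector.
    return gn_enrml_step(M, D_obs_all, Cd_all, forward_all)
```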

  28. Analytical strategy (recap of the goal and strategy from slides 7-14)

  29-30. Comparing Hi-EnKS & GN-EnRML
     - GN-EnRML update:
       m_j^a = m_j^f + \tilde{C}_m \tilde{G}^T ( \tilde{G} \tilde{C}_m \tilde{G}^T + C_d )^{-1} ( d_j - g_j )
     - Hi-EnKS parameter update:
       m_j^a = m_j^f + \tilde{C}_{mg} ( \tilde{C}_{gg} + C_d )^{-1} ( d_j - g_j )
     - The two methods are equal if (a numerical check follows below)
       - \tilde{C}_m \tilde{G}^T = \tilde{C}_{mg}
       - \tilde{G} \tilde{C}_m \tilde{G}^T = \tilde{C}_{gg}
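A small numerical check of the equality conditions above, with a hypothetical non-linear toy forward model `g` and synthetic perturbed observations (none of this is data from the presentation). With the pseudo-inverse ensemble gradient and the ensemble covariance, and with fewer ensemble members than parameters (so that, generically, the second condition also holds; compare slide 33), the two parameter updates coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
n_m, n_d, n_e = 10, 3, 5                     # more parameters than ensemble members

def g(m):
    """Hypothetical non-linear forward model, R^10 -> R^3."""
    return np.array([np.sin(m[0]) + m[1] ** 2,
                     m[2] * m[3],
                     np.exp(0.1 * m[4:]).sum()])

M = rng.normal(size=(n_m, n_e))              # prior parameter ensemble m_j^f
D = np.column_stack([g(m) for m in M.T])     # predicted data g_j
Cd = 0.2 * np.eye(n_d)
D_obs = rng.normal(size=(n_d, n_e))          # synthetic perturbed observations d_j

dM = M - M.mean(axis=1, keepdims=True)
dD = D - D.mean(axis=1, keepdims=True)
Cm = dM @ dM.T / (n_e - 1)                   # C~_m
Cmg = dM @ dD.T / (n_e - 1)                  # C~_mg
Cgg = dD @ dD.T / (n_e - 1)                  # C~_gg
G = dD @ np.linalg.pinv(dM, rcond=1e-10)     # G~ = dD dM^+

# GN-EnRML update and Hi-EnKS parameter update, as on the slide above.
M_enrml = M + Cm @ G.T @ np.linalg.inv(G @ Cm @ G.T + Cd) @ (D_obs - D)
M_enks = M + Cmg @ np.linalg.inv(Cgg + Cd) @ (D_obs - D)

print(np.allclose(M_enrml, M_enks))          # True in this setting
```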

  31-32. Comparing Hi-EnKS & GN-EnRML
     - Claim: \tilde{C}_m \tilde{G}^T = \tilde{C}_{mg}
     - Ensemble gradient given by the pseudo-inverse: \tilde{G} = \Delta d \, \Delta m^{\dagger}
     - Ensemble covariance: \tilde{C}_m = \frac{1}{N_e - 1} \Delta m \, \Delta m^T
     - Rewriting:
       \tilde{C}_m \tilde{G}^T = \frac{1}{N_e - 1} \Delta m \, \Delta m^T ( \Delta d \, \Delta m^{\dagger} )^T
       \Rightarrow \tilde{C}_m \tilde{G}^T = \frac{1}{N_e - 1} \Delta m \, \Delta d^T = \tilde{C}_{mg}

  33. Comparing Hi-EnKS & GN-EnRML
     - Claim: \tilde{G} \tilde{C}_m \tilde{G}^T = \tilde{C}_{gg}
     - Inserting for \tilde{G} and \tilde{C}_m (with V_p the right singular vectors of \Delta m associated with its non-zero singular values; a numerical check follows below):
       \tilde{G} \tilde{C}_m \tilde{G}^T = \frac{1}{N_e - 1} \Delta d \, V_p V_p^T \Delta d^T
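A small NumPy check of the rewriting above, with synthetic centred anomaly matrices (nothing here comes from the presentation) and V_p taken as the right singular vectors of Δm associated with its non-zero singular values, so that V_p V_p^T = Δm^† Δm.

```python
import numpy as np

rng = np.random.default_rng(2)
n_m, n_d, n_e = 10, 3, 5
dM = rng.normal(size=(n_m, n_e))
dM -= dM.mean(axis=1, keepdims=True)          # centred parameter anomalies, Delta m
dD = rng.normal(size=(n_d, n_e))
dD -= dD.mean(axis=1, keepdims=True)          # centred data anomalies, Delta d

Cm = dM @ dM.T / (n_e - 1)                    # C~_m
G = dD @ np.linalg.pinv(dM, rcond=1e-10)      # G~ = dD dM^+

# Right singular vectors of dM associated with its non-zero singular values.
U, s, Vt = np.linalg.svd(dM, full_matrices=False)
Vp = Vt[s > 1e-10 * s[0]].T                   # columns of Vp span the row space of dM

lhs = G @ Cm @ G.T
rhs = dD @ Vp @ Vp.T @ dD.T / (n_e - 1)
print(np.allclose(lhs, rhs))                  # True
```

Whether this expression in turn equals C~_gg = (1/(N_e - 1)) Δd Δd^T depends on whether V_p V_p^T acts as the identity on Δd^T, i.e. on whether the data anomalies lie in the row space of the parameter anomalies.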
