Relation between Five Data Assimilation Methods for a Simplistic Weakly Non-linear Problem
Trond Mannseth, Uni Research CIPR
Methods
- Randomized Maximum Likelihood (RML): simultaneous estimation, iterative updates
- Ensemble Smoother (ES): simultaneous estimation, a single update
- Ensemble Smoother with Multiple Data Assimilations (ESMDA): simultaneous estimation, multiple updates with inflated data covariance
- Half-iteration Ensemble Kalman Filter (HIEnKF): sequential estimation, a single update
- Half-iteration Ensemble Kalman Filter with MDA (HIEnKFMDA): sequential estimation, multiple updates with inflated data covariance
Motivation
- The methods are equivalent for Gauss-linear problems
- The methods behave differently for non-linear problems
- RML is iterative, ES is purely non-iterative, while ESMDA, HIEnKF and HIEnKFMDA 'lie somewhere in between'
- Are there systematic differences between the methods for weakly non-linear problems?
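To make the Gauss-linear equivalence concrete, here is a minimal sketch for a scalar linear model $y(x) = gx$, comparing one ES update against a deterministic $A$-step ESMDA recursion with inflation coefficients $\alpha_k = A$ (so that $\sum_k 1/\alpha_k = 1$). All numerical values are illustrative assumptions, and the sketch updates the mean and variance directly rather than an ensemble.

```python
# Scalar Gauss-linear toy model y(x) = g*x; deterministic mean/variance
# updates (no ensemble noise). All numbers below are illustrative assumptions.
g, c_x, c_d = 2.0, 1.0, 0.5   # sensitivity, prior variance, data-error variance
x_pr, d = 1.0, 5.0            # prior mean and observed datum

# Ensemble Smoother: a single update with the full data covariance.
k_es = c_x * g / (g**2 * c_x + c_d)
x_es = x_pr + k_es * (d - g * x_pr)

# ESMDA: A updates with inflated data covariance A*c_d, so sum(1/alpha) = 1.
A = 4
x, c = x_pr, c_x
for _ in range(A):
    k = c * g / (g**2 * c + A * c_d)
    x = x + k * (d - g * x)
    c = c - k * g * c          # variance shrinks consistently at each step
x_esmda = x

print(x_es, x_esmda)           # both equal 7/3 in this Gauss-linear case
```

In the Gauss-linear setting the two final means coincide exactly; for non-linear forward models they do not, which is what the rest of the talk quantifies.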
Investigation
- Compare the methods on a simplistic, weakly non-linear parameter estimation problem
- Focus on differences in data handling; remove other differences between the methods
- Asymptotic calculations (under additional assumptions) to first order in the non-linearity strength
- Numerical calculations with the full methods and relaxed assumptions
Simplistic Weakly Non-linear Parameter Estimation Problem
Estimate $x$ from $d$, where
- $d = (d_1 \, \ldots \, d_D)^T$,
- $d_i = y_i(x_{\mathrm{ref}}) + \epsilon_i$; $\epsilon_i \sim N(0, \sigma_i^2)$,
- $y_i(x) = \sum_{m=1}^{M} c_{im}\, x^{1+n_{im}}$; $|n_{im}|$ 'not too big'
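A minimal sketch of this forward model and data generation, assuming illustrative values for $D$, $M$, $c_{im}$, $n_{im}$, $\sigma_i$ and $x_{\mathrm{ref}}$ (none of these numbers come from the talk):

```python
import numpy as np

# Sketch of the slide's forward model: y_i(x) = sum_m c_im * x**(1 + n_im),
# with noisy data d_i = y_i(x_ref) + eps_i. All values are illustrative.
rng = np.random.default_rng(0)
D, M = 10, 2                            # number of data and of model terms
c = rng.uniform(0.5, 1.5, (D, M))       # coefficients c_im
n = rng.normal(0.0, 0.1, (D, M))        # small exponents n_im ('not too big')
sigma = 0.05 * np.ones(D)               # data-error standard deviations
x_ref = 2.0                             # reference parameter value

def forward(x):
    """y(x): D-vector of model responses for a scalar x > 0."""
    return (c * x ** (1.0 + n)).sum(axis=1)

d = forward(x_ref) + rng.normal(0.0, sigma)   # noisy data d_i
```

With all $n_{im} = 0$ the model reduces to the linear case $y_i(x) = (\sum_m c_{im})\,x$, which is the regime where the five methods agree.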
Focus on Differences in Data Handling – Remove Other Differences
- Equip ES (and ESMDA, HIEnKF, HIEnKFMDA) with local gains
- Kalman gain (global): $K = C_{xy}\left( C_{yy} + C_d \right)^{-1}$
- Replace $C_{xy}$ by $C_x G_e^T$ and $C_{yy}$ by $G_e C_x G_e^T$
- → Local gains: $K_e = C_x G_e^T \left( G_e C_x G_e^T + C_d \right)^{-1}$
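The local-gain formula can be sketched directly for univariate $x$, where $C_x$ is a scalar and $G_e$ is a $D \times 1$ column of local sensitivities. The numbers are illustrative assumptions; the Sherman–Morrison identity is included only as a consistency check on the reconstructed formula.

```python
import numpy as np

# Local gain K_e = C_x G_e^T (G_e C_x G_e^T + C_d)^{-1} for univariate x.
# All numerical values are illustrative assumptions.
c_x = 0.8                                  # prior variance of x
G_e = np.array([[1.0], [2.0], [0.5]])      # local sensitivities, D = 3
C_d = np.diag([0.1, 0.2, 0.1])             # data-error covariance

K_e = (c_x * G_e.T) @ np.linalg.inv(G_e @ (c_x * G_e.T) + C_d)  # 1 x D

# Sherman-Morrison gives the same 1 x D gain without forming the D x D inverse:
w = np.linalg.solve(C_d, G_e)              # C_d^{-1} G_e
K_sm = (c_x * w.T) / (1.0 + c_x * (G_e.T @ w).item())
```

Since the inner matrix is a rank-one update of $C_d$ in the univariate case, the two expressions agree to machine precision.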
Additional Assumptions Facilitating Asymptotic Calculations
I consider the updates of a single (arbitrary) ensemble member.
Assumptions:
- Univariate $x$ → $y_i(x) = x^{1+n_i}$
- Negligible data error → $d_i = y_i(x_{\mathrm{ref}})$
Additional Assumptions . . . (continued)
- So far: $|n_1| \wedge |n_2|$ 'not too big'
- Now assume: $n_1 = n_2 = n$, $|n| \ll 1$
[Figure: the response curves $y_1(x)$, $y_2(x)$ with data $d_1$, $d_2$ at $x_{\mathrm{ref}}$ collapse to a single curve $y(x)$ with a single datum $d$]
Results from Asymptotic Calculations to $O(n)$
$x_{\mathrm{RML}} = d - (d \ln d)\, n + O(n^2)$
Define $\Delta_{\mathrm{method}} = |x_{\mathrm{method}} - x_{\mathrm{RML}}|$ and $Q = |d(\ln d - \ln x_{\mathrm{prior}}) - (d - x_{\mathrm{prior}})|$. Then
- $\Delta_{\mathrm{ES}} = Q n + O(n^2)$
- $\Delta_{\mathrm{ESMDA}} = A^{-1} Q n + O(n^2)$
- $\Delta_{\mathrm{HIEnKF}} = D^{-1} Q n + O(n^2)$
- $\Delta_{\mathrm{HIEnKFMDA}} = (AD)^{-1} Q n + O(n^2)$
Results from Asymptotic Calculations to $O(n)$ (continued)
$\Delta_{\mathrm{ES}} \approx Qn$, $\Delta_{\mathrm{ESMDA}} \approx A^{-1}Qn$, $\Delta_{\mathrm{HIEnKF}} \approx D^{-1}Qn$, $\Delta_{\mathrm{HIEnKFMDA}} \approx (AD)^{-1}Qn$
[Figure: $\Delta_{\mathrm{method}}$ versus $D$ ($D = 0, \ldots, 20$) for ES, HIEnKF, and ESMDA and HIEnKFMDA with $A = 2, 4$]
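The leading RML expansion can be checked directly: with negligible data error and $y(x) = x^{1+n}$, matching the datum exactly gives $x_{\mathrm{RML}} = d^{1/(1+n)}$, whose Taylor expansion in $n$ is $d - (d \ln d)\,n + O(n^2)$. The sketch below (with an illustrative datum $d$) confirms that the remainder shrinks like $n^2$:

```python
import math

# x_RML = d**(1/(1+n)) when the datum is matched exactly; compare against the
# first-order expansion d - (d ln d) n. The datum value is illustrative.
d = 3.0

def x_exact(n):
    return d ** (1.0 / (1.0 + n))

def x_first_order(n):
    return d - d * math.log(d) * n

def err(n):
    return abs(x_exact(n) - x_first_order(n))

ratio = err(0.02) / err(0.01)   # close to 4 = (0.02/0.01)**2, an O(n^2) remainder
print(err(0.01), ratio)
```

Halving $n$ quarters the discrepancy, consistent with the $O(n^2)$ remainder stated on the slide.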
Numerical Results with Full Methods
[Figure: $\Delta_{\mathrm{method}}$ versus $D$ ($D = 0, \ldots, 20$) for ES, HIEnKF, and ESMDA and HIEnKFMDA with $A = 2, 4$]
- 'Ranking' stable for $n \in [-0.5, 5]$
Numerical Calculations with Full Methods and Relaxed Assumptions
- $d = (d_1 \, \ldots \, d_D)^T$, $d_i = y_i(x_{\mathrm{ref}}) + \epsilon_i$; $\epsilon_i \sim N(0, \sigma_i^2)$
- $y_i(x) = \sum_{m=1}^{M} c_{im}\, x^{1+n_{im}}$; $\bar{n}_{im} = 0.4$; $M = 1, 2, 5$
- Draw 300 realizations of $x_{\mathrm{ref}}$, $x_{\mathrm{prior}}$, $n_{im}$, $c_{im}$
Numerical Results with Full Methods and Relaxed Assumptions
[Figures: $\Delta_{\mathrm{method}}$ versus $D$ ($D = 0, \ldots, 20$) for $M = 1, 2, 5$; left panels show an arbitrary realization, right panels the mean over realizations; curves for ES, HIEnKF, and ESMDA and HIEnKFMDA with $A = 2, 4$]
Summary
- Compared five different ways to assimilate data on a simplistic, weakly non-linear parameter estimation problem