Spatio-Temporal Uncertainty Assessment of GHG Emission Inventories with the Specific Focus on Austria and Ukraine: Learning in Space and Time and into the Future. A Metric for the Prognostic Outreach of Scenarios: Learning from the Past to Establish a Standard in Applied Systems Analysis


  1. Spatio-Temporal Uncertainty Assessment of GHG Emission Inventories with the Specific Focus on Austria and Ukraine: Learning in Space and Time and into the Future. A Metric for the Prognostic Outreach of Scenarios: Learning from the Past to Establish a Standard in Applied Systems Analysis. M. Jonas, E. Rovenskaya and P. Żebrowski. Lviv, Ukraine; 13 October 2015. M. Jonas 30 April 2015 – 1

  2. This talk covers 1. Motivation 2. Framing conditions and definitions 3. Why diagnostic and prognostic uncertainty are different and independent 4. Learning in a prognostic context 5. Toward application: an accurate and precise system 6. Learning in a diagnostic context 7. Insights and outlook M. Jonas et al. 14 October 2015 – 2

  3. 1. Motivation. Our motivation is two-fold: 1. to expand Jonas et al. (2014), Uncertainty in an emissions-constrained world, which emerged from the 3rd (2010) Uncertainty Workshop; 2. to contribute to the unresolved question of how limited prognostic scenarios are. We are still moving at a theoretical level, but we already encounter important insights and windfall profits!

  4. 1. Motivation (2). An easy-to-apply metric or indicator is needed that informs non-experts about the time in the future at which a prognostic scenario ceases (for whatever reason) to be in accordance with the system's past. This indicator should be applicable while treating a system / model coherently (from beginning to end)!

  5. 1. Motivation (1). Jonas et al. (2014): The mode of bridging diagnostic and prognostic uncertainty across temporal scales relies on two discrete points in time: 'today' and 2050. Now we want to become continuous ... [Figure: timeline from the past through 2014 into the future to 2050]

  6. 2. Framing conditions and definitions. [Figure: spheres of activity (fossil-fuel industry, biosphere under the Kyoto Protocol, "non-Kyoto" biosphere) and net storage in the atmosphere, shown for the globe, a group of countries, or an individual country. Source: Jonas and Nilsson (2007: Fig. 4); modified] Only F_FF_C, F_terr_C and F_oc_C can be discriminated top-down, and only globally!

  7. 2. Framing conditions and definitions. Bottom-up / top-down (full carbon) accounting is not in place. We cannot yet verify ΔC fluxes at the country scale! [Figure: net flux F_net to the atmosphere between times t_1 and t_2. Source: Jonas and Nilsson (2007: Fig. 6); modified]

  8. 2. Framing conditions and definitions. [Figure: diagnostic vs prognostic uncertainty. Sources: Moss & Schneider (2000: Fig. 5; see also Giles, 2002); IPCC (2006: Vol. 1, Fig. 3.2)]

  9. 3. Diagnostic vs prognostic uncertainty. Diagnostic uncertainty can increase or decrease, depending on whether or not our knowledge of accounted emissions becomes more accurate and precise! Prognostic uncertainty under a prognostic scenario always increases with time!

  10. 3. Diagnostic vs prognostic uncertainty. Meinshausen et al. (2009: Fig. 2)

  11. 3. Diagnostic vs prognostic uncertainty. [Figure: Meinshausen et al. (2009: Fig. 3)]

  12. 3. Diagnostic vs prognostic uncertainty. Probability of exceeding 2 °C: Meinshausen et al. (2009: Tab. 1)

  13. 3. Diagnostic and prognostic uncertainty. [Figure: diagnostic, prognostic, additional undershooting, and combined uncertainty over time to 2050. Source: Massari Coelho et al. (2012: Fig. 10)]

  14. 4. Learning in a prognostic context

  15. 4. Learning in a prognostic context. Task: find the optimum between the order of the signal's dynamics and both the extension and the opening of the uncertainty wedge!
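The trade-off named on this slide can be illustrated numerically: fitting polynomials of increasing order to a historical record shrinks the unexplained residual, but tends to destabilize the extrapolation, i.e. it widens the opening of the wedge. A minimal sketch with invented data; all numbers are illustrative and not from the talk:

```python
import numpy as np

# Synthetic historical record: a linear signal plus noise (invented data).
rng = np.random.default_rng(0)
t_hist = np.arange(1990, 2015)
y_hist = 10.0 + 0.5 * (t_hist - 1990) + rng.normal(0.0, 0.5, t_hist.size)

t_future = np.arange(2015, 2031)
results = {}
for order in (1, 2, 3):
    coeffs = np.polyfit(t_hist, y_hist, order)      # fit of the given order
    resid = y_hist - np.polyval(coeffs, t_hist)
    sigma = float(resid.std(ddof=order + 1))        # in-sample residual spread
    forecast = np.polyval(coeffs, t_future)         # extrapolation
    results[order] = (sigma, float(forecast[-1]))
    print(f"order {order}: residual sigma {sigma:.3f}, "
          f"forecast for 2030 {forecast[-1]:.2f}")
```

Comparing the residual spread against the stability of the 2030 extrapolation across orders is one way to search for the optimum order of the signal's dynamics.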

  16. 4. Learning in a prognostic context. Andriana (2015: Slide 15); modified

  17. 5. Toward application: accurate + precise system. Assume that we have learned from a retrospective learning (RL) exercise: • that each historical data record Y = Y(t) has a memory and exhibits (but not necessarily) a linear dynamics; • that each data record's uncertainty (learning) wedge ΔY ≈ a_Y · t unfolds linearly into the future (until when?); • and that our data records exhibit linear inter-dependencies [eg: T = T(C); C = C(E); E = E(t)].
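Under these linearity assumptions the wedge of the end variable follows by simple first-order propagation: each link of the chain E = E(t), C = C(E), T = T(C) scales the wedge by its (constant) sensitivity. A minimal sketch; every coefficient below is invented for illustration, since the slide gives no numbers:

```python
# Invented sensitivities for the chain named on the slide:
A_E = 0.2      # slope of the uncertainty wedge of E: Delta_E(t) = A_E * t
DC_DE = 0.5    # sensitivity of concentration C to emissions E
DT_DC = 0.01   # sensitivity of temperature T to concentration C

def wedge_T(years_ahead):
    """Half-width of the uncertainty wedge of T, years_ahead from 'today'.

    With purely linear links, the linearly unfolding wedge of E is simply
    scaled by the product of the constant sensitivities.
    """
    delta_E = A_E * years_ahead
    return abs(DT_DC) * abs(DC_DE) * delta_E

for years in (5, 10, 20):
    print(years, wedge_T(years))
```

Because every link is linear, the wedge of T also opens linearly in time, which is what makes a simple outreach metric conceivable in the first place.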

  18. 5. Toward application: accurate + precise system. We merge an accurate and precise system with classical statistics! Δf_E(t) combines uncertainty (learning) + dynamics (memory) knowledge!

  19. 5. Toward application: accurate + precise system. Source: http://en.wikipedia.org/wiki/Propagation_of_uncertainty
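The page referenced here gives the classical first-order formula for independent inputs, sigma_f^2 = sum_i (df/dx_i)^2 sigma_i^2. A small sketch of that formula using numerical derivatives; the example function and all numbers are illustrative only:

```python
import numpy as np

def propagate(f, x, sigmas, h=1e-6):
    """First-order propagated standard uncertainty of f at point x,
    assuming independent inputs with standard uncertainties `sigmas`."""
    x = np.asarray(x, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h
        grad[i] = (f(x + step) - f(x - step)) / (2.0 * h)  # central difference
    return float(np.sqrt(np.sum((grad * sigmas) ** 2)))

# Example: f(E, k) = k * E (e.g. activity data times an emission factor).
f = lambda v: v[0] * v[1]
sigma_f = propagate(f, [100.0, 2.0], [5.0, 0.1])
print(sigma_f)   # sqrt((2*5)^2 + (100*0.1)^2) = sqrt(200)
```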

  20. 5. Toward application: accurate + precise system

  21. 5. Toward application: accurate + precise system. This is a game changer that has so far not been considered!

  22. 5. Toward application: accurate + precise system. Jonas & Nilsson (2007: Fig. 9); modified

  23. 7. Insights and outlook. 1. The risk of exceeding a 2050 global temperature target (eg, 2 °C) appears to be greater than assessed by the IPCC! The correct approach would have been to treat cumulated emissions and removals individually in order to determine their combined risk of exceeding the agreed temperature target. Retrospective learning (RL) allows exactly this to be done: it overcomes this shortfall and captures the effect of learning about emissions and removals individually.

  24. 7. Insights and outlook. 2. We anticipate that, in the case of success, the construction of prognostic models and the conduct of systems analysis will have to meet certain quality standards: • Better diagnostic data handling (retrospective learning)! • Specifying the models' outreach limits! • Safeguarding complex models by means of meta-models which fulfill the above!

  25. 6. Learning in a diagnostic context. [Figure: EU-15 total uncertainty of CO2 emissions (without LULUCF), 1983–2013, as reported to the UNFCCC: initial vs most recent emission estimates (accuracy) and initial vs most recent precision estimates; relative uncertainty declines at about -4.2 %/yr (R² = 0.9345). Source: Hamal (2010: Fig. 9, 12); modified]
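A decline rate such as the -4.2 %/yr quoted in the figure is typically obtained by an ordinary least-squares fit to log-transformed uncertainty estimates. A minimal sketch of that procedure; the data points below are invented and are not the EU-15 values:

```python
import numpy as np

# Invented relative-uncertainty estimates (in %) over time.
years = np.array([1990.0, 1995.0, 2000.0, 2005.0, 2010.0])
unc_pct = np.array([5.5, 4.4, 3.6, 2.9, 2.4])

# OLS on log(uncertainty): the slope gives the continuous decline rate.
slope, intercept = np.polyfit(years, np.log(unc_pct), 1)
rate_per_yr = (np.exp(slope) - 1.0) * 100.0    # % change per year

# Coefficient of determination of the log-linear fit.
pred = slope * years + intercept
ss_res = float(np.sum((np.log(unc_pct) - pred) ** 2))
ss_tot = float(np.sum((np.log(unc_pct) - np.log(unc_pct).mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot
print(f"{rate_per_yr:.2f} %/yr, R^2 = {r2:.4f}")
```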

  26. 6. Learning in a diagnostic context

  27. 6. Learning in a diagnostic context

  28. 6. Learning in a diagnostic context

  29. 6. Learning in a diagnostic context

  30. 6. Learning in a diagnostic context

  31. 6. Learning in a diagnostic context

  32. 6. Learning in a diagnostic context

  33. 7. Insights and outlook. 3. We consider diagnostic learning to be on the right track. Nonetheless: • We see the need for complete emission records/histories. • We see the need to agree on a standard for processing emission data, since we operate at the limits of skillful resolution! • We see an important role for top-down emissions accounting!

  34. References
