Workflow for routine evaluation of CMIP6 models with the ESMValTool

Björn Brötz, Veronika Eyring, Axel Lauer, Mattia Righi
Deutsches Zentrum für Luft- und Raumfahrt (DLR), Institut für Physik der Atmosphäre, Oberpfaffenhofen, Germany

Stephan Kindermann, Carsten Ehbrecht
Deutsches Klimarechenzentrum (DKRZ), Hamburg, Germany

IS-ENES Workshop on Workflow Solutions and Metadata Generation, 28 September 2016, Lisbon, Portugal
DLR.de • Chart 2

Motivation

Difficulties with the workflow for model evaluation during CMIP5:
• Local download of high-volume data => multiple copies at many institutions
  − Time- and resource-intensive
  − Versioning of the data must be managed by non-data specialists
  − Metadata must be preserved in the final result by non-data specialists
• Duplication of effort through uncoordinated development of evaluation routines
• Evaluation by individual scientists (whenever they had time) => delays in the availability of the evaluation results

Envisaged workflow for model evaluation in CMIP6:
• More coordination of software efforts through development of community evaluation tools as open-source software
• Processing capabilities at the ESGF nodes so that the tools can run alongside the ESGF as soon as the output is published
• Traceability & reproducibility of evaluation results ensured
• Support for model development & assessments (via quick and comprehensive feedback)
Routine Benchmarking and Evaluation – A Central Part of CMIP6
• Many aspects of ESM evaluation need to be performed much more efficiently
• The resulting enhanced systematic characterization of models will reveal strengths & weaknesses of the simulations more quickly and more openly to the community

Eyring et al., ESD, in rev. (2016)
Models are Increasing in Complexity and Resolution

From AOGCMs to Earth System Models with biogeochemical cycles, from low resolution to high resolution.

Increase in complexity and resolution:
I. Allows to study processes as horizontal resolution is increased to "weather-resolving" global model resolutions (~25 km or finer)
II. Allows to study new physical & biogeochemical processes & feedbacks (e.g., carbon cycle, chemistry, aerosols, ice sheets)

More (and new) models participating in CMIP6:
• Increase in data volume (from ~2 PB in CMIP5 to ~20-40 PB in CMIP6)
• Large zoo of models in CMIP6

[Figures: model orography at 130 km vs. 25 km resolution; atmospheric chemistry]
How to evaluate the wide variety of models in CMIP6?

Community tools that will be applied for routine evaluation of CMIP6 models:
• Earth System Model Evaluation Tool (ESMValTool, Eyring et al., GMD, 2016b), which includes other software packages such as the NCAR CVDP (Phillips et al., 2014)
• PCMDI Metrics Package (PMP, Gleckler et al., EOS, 2016)
to produce well-established analyses as soon as CMIP model output is available.
ESMValTool integration into the ESGF Infrastructure
• A community diagnostics & performance metrics tool for routine evaluation of ESMs in CMIP: https://www.esmvaltool.org and https://github.com/ESMValTool-Core/ESMValTool
• Community development in a version-controlled repository
  − Currently ~70 scientists from ~30 institutions are part of the development team
  − Allows multiple developers from different institutions to contribute and join
  − Regular releases as open-source software (latest release: version 1.0.1)
• Allows traceability and reproducibility by preserving and logging metadata and details of the analysis software
• Goals:
  − Improve ESM evaluation beyond the state of the art
  − Reproduce well-established and additional analyses
  − Routine evaluation of the CMIP DECK and historical simulations as soon as the output is published to the ESGF
  − Support individual modelling centres:
    o ESMValTool integrated in the local evaluation workflow (e.g. at GFDL)
    o Run the tool locally to compare different model versions or other CMIP models
    o Run the tool locally before publication to the ESGF, as quality control
Software architecture of the ESMValTool
From: Eyring et al., ESMValTool v1.0, GMD, 2016
Example Namelist – Performance Metrics
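The namelist itself appears only as a figure in the original slides. As a hedged illustration of the XML namelist format used by ESMValTool v1, a minimal performance-metrics configuration might look like the sketch below; the paths, the model entry, and the config file names are placeholders, not verbatim content from the slide.

```xml
<namelist>
<GLOBAL>
    <!-- Misc. settings: output format, working directory, etc. -->
    <write_plots type="boolean">  True   </write_plots>
    <output_file_type>            ps     </output_file_type>
    <wrk_dir type="path">         ./work/ </wrk_dir>
</GLOBAL>
<MODELS>
    <!-- project, model, MIP table, experiment, ensemble, start/end year, data path -->
    <model> CMIP5 MPI-ESM-LR Amon historical r1i1p1 2000 2004 ./data/ </model>
</MODELS>
<DIAGNOSTICS>
    <diag>
        <description> Performance metrics for near-surface temperature </description>
        <variable_def_dir> ./variable_defs/ </variable_def_dir>
        <variable> tas </variable>
        <field_type> T2Ms </field_type>
        <diag_script_cfg_dir> ./nml/cfg_perfmetrics/ </diag_script_cfg_dir>
        <diag_script cfg="cfg_perfmetrics_grading.ncl"> perfmetrics_main.ncl </diag_script>
    </diag>
</DIAGNOSTICS>
</namelist>
```

The three sections mirror the workflow described on the reproducibility slide: GLOBAL holds the miscellaneous output settings, MODELS lists the input datasets, and DIAGNOSTICS defines the set of diagnostic scripts to apply.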
Example Namelist: IPCC AR5 Climate Model Evaluation Chapter
Figs. 9.2, 9.4, 9.5, 9.7, 9.10, 9.23, 9.24, 9.32, 9.45
Examples of ESMValTool Namelists implemented so far
Emphasis on diagnostics & metrics with demonstrated importance for ESM evaluation

Physics
• Clouds
• Cloud regime error metric (CREM)
• Diurnal cycle of convection
• Evapotranspiration
• Madden-Julian Oscillation (MJO)
• Performance metrics for essential climate parameters
• South Asian monsoon
• Southern Hemisphere
• Standardized precipitation index (SPI)
• Tropical variability
• NCAR climate variability diagnostics package (CVDP)
• West African monsoon
• Extreme events (in progress)
• Regional diagnostics (in progress)

Atmospheric composition
• Aerosol
• Ozone and associated climate impacts
• Ozone and some precursors

Ocean and carbon cycle
• Land and ocean components of the global carbon cycle
• Emergent constraints on carbon cycle feedbacks
• Marine biogeochemistry
• Southern Ocean

Cryosphere
• Sea ice

Land
• Catchment analysis

General
• IPCC AR5 chapters 9 and 12 (in progress)
Reproducibility & Traceability of evaluation results

Namelist
The evaluation analysis is controlled by the namelist file, which defines the internal workflow for the desired analysis. It defines:
• Input datasets (observations, models)
• Regridding operation (if needed)
• Set of diagnostics
• Misc. (output formats, output folder, etc.)

Logfile
At each execution of the tool a log file is automatically created. It contains:
• The list of all input data that have been used (version, data source, etc.)
• The list of variables that have been processed
• The list of diagnostics that have been applied
• The list of authors of and contributors to the given diagnostic, together with the relevant references and projects
• The software version of the ESMValTool that was used

Output files (NetCDF)
Contain metadata from the input files and metadata generated by the ESMValTool

Observational data
• Well-defined processing chain
• Creation of metadata
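To make the log-file idea concrete, the sketch below shows a minimal provenance record of the kind described above: input data with version and source, processed variables, applied diagnostics, authors and references, and the tool version. This is an illustration only, not the actual ESMValTool implementation; the function name and the JSON layout are invented for the example.

```python
# Illustrative sketch of a per-run provenance log (NOT ESMValTool's real code):
# records input data, variables, diagnostics, references, and the tool version.
import datetime
import json


def write_provenance_log(path, inputs, variables, diagnostics,
                         authors, references, tool_version):
    """Write a minimal JSON provenance record for one evaluation run."""
    record = {
        "created": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "tool_version": tool_version,   # e.g. "1.0.1"
        "input_data": inputs,           # file name, version, data source
        "variables": variables,         # variables that were processed
        "diagnostics": diagnostics,     # diagnostic scripts that were applied
        "authors": authors,             # authors/contributors of the diagnostics
        "references": references,       # relevant references and projects
    }
    with open(path, "w") as fh:
        json.dump(record, fh, indent=2)
    return record


# Example usage with made-up values:
log = write_provenance_log(
    "esmvaltool_run.log.json",
    inputs=[{"file": "ta_Amon_MPI-ESM-LR_historical_r1i1p1_200001-200412.nc",
             "version": "v20120315", "source": "ESGF"}],
    variables=["ta"],
    diagnostics=["perfmetrics_main.ncl"],
    authors=["A. Author"],
    references=["Eyring et al., GMD (2016)"],
    tool_version="1.0.1",
)
```

Because every field needed to rerun the analysis is captured at execution time, the same record also serves the traceability goal: a published figure can be traced back to the exact input files, diagnostics, and software version that produced it.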
ESMValTool version 1.0
• www.esmvaltool.org
• Eyring et al., Geosci. Model Dev., 2016
• www.github.com/ESMValTool-Core/ESMValTool
• doi:10.17874/ac8548f0315
Routine Benchmarking and Evaluation in CMIP6
Due to the high data volume in CMIP6, ESGF replication is likely to be slow (it took months in CMIP5). It was therefore recommended to the ESGF teams that the data used by the CMIP evaluation tools be replicated with higher priority. This should substantially speed up the evaluation of model results after submission of the simulation output to the ESGF.

Eyring et al., ESD, in rev. (2016)
Example of an extended CMIP6 workflow with the ESMValTool at the DKRZ*

[Workflow diagram, three phases:]
• Production phase: the CMIP simulation (MPI-ESM, MPI-ICON, EMAC) writes output at each timestep in native format; monitoring
• Data management phase: create ESGF-compliant output (CMORize, metadata, etc.); technical quality control; scientific quality control; publication to the ESGF
• Routine evaluation: DKRZ routine evaluation against ESGF data (remote and local); web-based visualization

*Defined in the project CMIP6-DICAD; Freva: https://freva.met.fu-berlin.de
ESMValTool Workflow for routine evaluation at the ESGF (CMIP6-DICAD)

[Workflow diagram:] download data to cache; run the ESMValTool; outputs (plots, netCDF files, log file) exposed via web-based visualisation, with step-wise access:
1. ESMValTool core team
2. Modelling groups
3. Public

Derived from: Eyring et al., ESMValTool v1.0, GMD, 2016