Modular AWI-CM: An Earth System Model (ESM) prototype using the esm-interface library for a modular ESM coupling approach

Author(s):  
Nadine Wieters ◽  
Dirk Barbi ◽  
Luisa Cristini

Earth System Models (ESMs) are composed of different components, including submodels as well as whole domain models. Within such an ESM, these model components need to exchange information to account for the interactions between the different compartments. This exchange of data is the purpose of a “model coupler”.

Within the Advanced Earth System Modelling Capacity (ESM) project, one goal is to develop a modular framework that allows for a flexible ESM configuration. One approach is to implement purpose-built model couplers in a more modular way.

For this purpose, we developed the esm-interface library with the following objectives: (i) to obtain a more modular ESM in which model components and model couplers are exchangeable; and (ii) to allow for a more flexible coupling configuration of an ESM setup.

As a first application of the esm-interface library, we implemented it in the AWI Climate Model (AWI-CM) [Sidorenko et al., 2015] as an interface between the model components and the model coupler (OASIS3-MCT; Valcke [2013]). In a second step, we extended the esm-interface library to support a second coupler (YAC; Hanke et al. [2016]).

In this presentation, we will discuss the general idea of the esm-interface library and its implementation in an ESM setup, and show first results from the first modular prototype of AWI-CM.
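The coupler-agnostic design can be sketched as follows. This is a hypothetical Python illustration of the general pattern only; the actual esm-interface library is not written in Python, and none of these names are its API. Model components call a common interface, and the concrete coupler backend (OASIS3-MCT or YAC, stubbed out here) is chosen from the experiment configuration:

```python
from abc import ABC, abstractmethod

class CouplerBackend(ABC):
    """Hypothetical common interface every coupler backend must implement."""

    @abstractmethod
    def put(self, field_name, data):
        """Send a field to the partner component."""

    @abstractmethod
    def get(self, field_name):
        """Receive a field from the partner component."""

class Oasis3MctBackend(CouplerBackend):
    def put(self, field_name, data):
        print(f"[OASIS3-MCT stub] sending {field_name}")

    def get(self, field_name):
        print(f"[OASIS3-MCT stub] receiving {field_name}")

class YacBackend(CouplerBackend):
    def put(self, field_name, data):
        print(f"[YAC stub] sending {field_name}")

    def get(self, field_name):
        print(f"[YAC stub] receiving {field_name}")

# Because components only ever see the common interface, swapping the
# coupler is a configuration change rather than a code change.
BACKENDS = {"oasis3-mct": Oasis3MctBackend, "yac": YacBackend}

coupler = BACKENDS["yac"]()
coupler.put("sea_surface_temperature", data=None)
```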

Author(s):  
Warren M Washington ◽  
Lawrence Buja ◽  
Anthony Craig

The development of climate and Earth system models has had a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. The early researchers were well aware of the long-term goal of building Earth system models that would go beyond what is usually included in climate models by adding interactive biogeochemical interactions. In the early days, progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been greatly improved knowledge, better observations for validation, and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.


2013 ◽  
Vol 9 (3) ◽  
pp. 1111-1140 ◽  
Author(s):  
M. Eby ◽  
A. J. Weaver ◽  
K. Alexander ◽  
K. Zickfeld ◽  
A. Abe-Ouchi ◽  
...  

Abstract. Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The surface air temperature response is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical model carbon–climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.


2018 ◽  
Vol 11 (9) ◽  
pp. 3781-3794 ◽  
Author(s):  
Joy Merwin Monteiro ◽  
Jeremy McGibbon ◽  
Rodrigo Caballero

Abstract. sympl (System for Modelling Planets) and climt (Climate Modelling and Diagnostics Toolkit) are an attempt to rethink climate modelling frameworks from the ground up. The aim is to use expressive data structures available in the scientific Python ecosystem along with best practices in software design to allow scientists to easily and reliably combine model components to represent the climate system at a desired level of complexity and to enable users to fully understand what the model is doing. sympl is a framework which formulates the model in terms of a state that gets evolved forward in time or modified within a specific time by well-defined components. sympl's design facilitates building models that are self-documenting, are highly interoperable, and provide fine-grained control over model components and behaviour. sympl components contain all relevant information about the input they expect and the output that they provide. Components are designed to be easily interchanged, even when they rely on different units or array configurations. sympl provides basic functions and objects which could be used in any type of Earth system model. climt is an Earth system modelling toolkit that contains scientific components built using sympl base objects. These include both pure Python components and wrapped Fortran libraries. climt provides functionality requiring model-specific assumptions, such as state initialization and grid configuration. climt's programming interface is designed to be easy to use and thus appealing to a wide audience. Model building, configuration and execution are performed through a Python script (or Jupyter Notebook), enabling researchers to build an end-to-end Python-based pipeline along with popular Python data analysis and visualization tools.
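To give a flavour of this workflow, here is a minimal sketch based on the component and helper names described in the climt documentation; exact signatures may differ between versions, so treat it as illustrative rather than definitive:

```python
from datetime import timedelta

import climt
from sympl import AdamsBashforth

# Two self-documenting components: each declares the inputs it expects
# and the outputs it provides.
radiation = climt.GrayLongwaveRadiation()
surface = climt.SlabSurface()

# climt supplies a consistent initial model state for these components.
state = climt.get_default_state([radiation, surface])

# A sympl time stepper evolves the state using the components' tendencies.
time_stepper = AdamsBashforth([radiation, surface])
timestep = timedelta(minutes=10)

for _ in range(100):
    diagnostics, next_state = time_stepper(state, timestep)
    state.update(diagnostics)
    state.update(next_state)
```

The point of the design is visible even in this toy loop: the model is an explicit Python object graph, so swapping `GrayLongwaveRadiation` for another radiation component changes one line, not a configuration file buried in a Fortran build system.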


2021 ◽  
Author(s):  
Xavier Yepes-Arbós ◽  
Miguel Castrillo ◽  
Mario C. Acosta ◽  
Kim Serradell

The increase in the capability of Earth System Models (ESMs) is strongly linked to the amount of available computing power, given that the spatial resolution used for global climate experimentation is a limiting factor in correctly reproducing climate mean state and variability. However, higher spatial resolutions require new High Performance Computing (HPC) platforms, where improving the computational efficiency of ESMs will be mandatory. In this context, porting a new ultra-high-resolution configuration to a new and more powerful HPC cluster is a challenging task, requiring technical expertise to deploy such a novel configuration and improve its computational performance.

To take advantage of this foreseeable landscape, the new EC-Earth 4 climate model is being developed by coupling OpenIFS 43R3 and NEMO 4 as atmosphere and ocean components, respectively. An important effort has been made to improve the computational efficiency of this new EC-Earth version, for example by extending the asynchronous I/O capabilities of the XIOS server to OpenIFS.

In order to anticipate the computational behaviour of EC-Earth 4 on new pre-exascale machines such as the upcoming MareNostrum 5 of the Barcelona Supercomputing Center (BSC), the OpenIFS and NEMO models are benchmarked on a petascale machine (MareNostrum 4) to find potential computational bottlenecks introduced by new developments and to investigate whether previously known performance limitations are solved. The outcome of this work can also be used to set up new ultra-high resolutions efficiently from a computational point of view, not only for EC-Earth but also for other ESMs.

Our benchmarking consists of large strong-scaling tests (tens of thousands of cores) running different output configurations, such as changing multiple XIOS parameters and the number of 2D and 3D fields. These very large tests need a huge amount of computational resources (up to 2,595 nodes, 75 % of the supercomputer), so they require a special allocation that can be applied for once a year.

OpenIFS is evaluated at a 9 km global horizontal resolution (Tco1279) using three different output data sets: no output, CMIP6-based fields, and a huge output volume (8.8 TB) to stress the I/O part. In addition, different XIOS parameters, XIOS resources, affinity, MPI-OpenMP hybridisation and MPI libraries are tested. Results suggest that the new features introduced in 43R3 do not represent a performance bottleneck as the model scales. According to the scalability curve, the I/O scheme is also improved when outputting data through XIOS.

NEMO is scaled using a 3 km global horizontal resolution (ORCA36), with and without the sea-ice module. As for OpenIFS, different I/O configurations are benchmarked, such as disabling model output, enabling only 2D fields, or producing 3D variables on an hourly basis. XIOS is also scaled and tested with different parameters. While NEMO scales well during most of the exercise, a severe degradation is observed before the model uses 70 % of the machine's resources (2,546 nodes). The I/O overhead is moderate for the best XIOS configuration, but it demands many resources.
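As an aside for readers unfamiliar with strong-scaling analysis, the quantities such benchmarks report are derived from runtimes at increasing core counts. The following Python sketch uses made-up numbers, not measurements from the MareNostrum 4 campaign described above:

```python
# Illustrative only: compute strong-scaling speedup and parallel efficiency
# from (core count, runtime) pairs. The timings below are invented.
timings = {
    480: 1000.0,    # cores: seconds per simulated period
    1920: 270.0,
    7680: 80.0,
    30720: 35.0,
}

base_cores = min(timings)
base_time = timings[base_cores]

for cores, seconds in sorted(timings.items()):
    speedup = base_time / seconds
    ideal = cores / base_cores          # perfect strong scaling
    efficiency = speedup / ideal        # < 1 once scaling degrades
    print(f"{cores:6d} cores: speedup {speedup:6.1f}x, "
          f"parallel efficiency {efficiency:5.1%}")
```

A "severe degradation" like the one reported for NEMO near 70 % of the machine would show up here as the efficiency column collapsing at the largest core counts.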


2021 ◽  
Author(s):  
Sam Hatfield ◽  
Kristian Mogensen ◽  
Peter Dueben ◽  
Nils Wedi ◽  
Michail Diamantakis

Earth system models traditionally use double-precision, 64-bit floating-point numbers to perform arithmetic. According to orthodoxy, we must use such a relatively high level of precision in order to minimise the potential impact of rounding errors on the physical fidelity of the model. However, given the inherently imperfect formulation of our models, and the computational benefits of lower-precision arithmetic, we must question this orthodoxy. At ECMWF, a single-precision, 32-bit variant of the atmospheric model IFS has been undergoing rigorous testing in preparation for operations for around 5 years. The single-precision simulations have been found to have effectively the same forecast skill as the double-precision simulations while finishing in 40 % less time, thanks to the memory and cache benefits of single-precision numbers. Following these positive results, other modelling groups are now also considering single precision as a way to accelerate their simulations.

In this presentation I will present the rationale behind the move to lower-precision floating-point arithmetic and up-to-date results from the single-precision atmospheric model at ECMWF, which will be operational imminently. I will then provide an update on the development of the single-precision ocean component at ECMWF, based on the NEMO ocean model, including a verification of quarter-degree simulations. I will also present new results from running ECMWF's coupled atmosphere-ocean-sea-ice-wave forecasting system entirely in single precision. Finally, I will discuss the feasibility of even lower levels of precision, like half precision, which are now becoming available through GPU- and ARM-based systems such as Summit and Fugaku, respectively. The use of reduced-precision floating-point arithmetic will be an essential consideration for developing high-resolution, storm-resolving Earth system models.
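To make the precision trade-off concrete (an illustrative aside, not material from the abstract), a small numpy experiment shows the rounding-error scale of each format and how error accumulates in a naive sequential sum; float16 stands in for the half-precision formats mentioned above:

```python
import numpy as np

# Machine epsilon: the relative rounding-error scale of each format.
for dtype in (np.float64, np.float32, np.float16):
    print(np.dtype(dtype).name, np.finfo(dtype).eps)
# roughly 2.2e-16, 1.2e-07, and 9.8e-04 respectively

# Naive sequential accumulation: once the running sum dwarfs each
# addend, single precision visibly drifts from the exact result (1.0).
n = 10_000_000
values = np.full(n, 1e-7)
for dtype in (np.float64, np.float32):
    total = values.astype(dtype).cumsum()[-1]  # strictly sequential sum
    print(np.dtype(dtype).name, float(total), "(exact: 1.0)")
```

Model codes tolerate this drift precisely because, as argued above, formulation error already exceeds float32 rounding error in most components.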


2017 ◽  
Author(s):  
Michail Alvanos ◽  
Theodoros Christoudias

Abstract. This paper presents an application of GPU accelerators in Earth system modelling. We focus on atmospheric chemical kinetics, one of the most computationally intensive tasks in climate–chemistry model simulations. We developed a software package that automatically generates CUDA kernels to numerically integrate atmospheric chemical kinetics in the global climate model ECHAM/MESSy Atmospheric Chemistry (EMAC), used to study climate change and air quality scenarios. A source-to-source compiler outputs a CUDA-compatible kernel by parsing the Fortran code generated by the Kinetic Pre-Processor (KPP) general analysis tool. All Rosenbrock methods that are available in the KPP numerical library are supported. Performance evaluation using Fermi and Pascal CUDA-enabled GPU accelerators shows kernel execution time speedups of 4.5× and 22.4×, respectively. A node-to-node real-world production performance comparison shows a 1.75× speedup over the non-accelerated application using the KPP 3-stage Rosenbrock solver. We provide a detailed description of the code optimizations used to improve the performance, including memory optimizations, control code simplification, and reduction of idle time. The accuracy and correctness of the accelerated implementation are evaluated by comparison with the CPU-only version of the application. The relative difference between the output of the accelerated kernel and the CPU-only code is found to be less than 0.00005 %, within the target level of relative accuracy (relative error tolerance) of 0.1 %. The approach followed, including the computational workload division and the developed GPU solver code, can potentially be used as the basis for hardware acceleration of numerous geoscientific models that rely on KPP for atmospheric chemical kinetics applications.
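For readers unfamiliar with the solver family involved, here is an illustrative Python sketch of one classic two-stage, second-order Rosenbrock scheme (ROS2, one of the methods in the KPP family) applied to a toy two-species kinetics system. The actual KPP-generated Fortran/CUDA code is far more elaborate, with adaptive step control, sparse linear algebra, and hundreds of species:

```python
import numpy as np

# Toy kinetics: A -> B with rate k1, B -> A with rate k2.
k1, k2 = 5.0, 1.0

def f(y):
    a, b = y
    return np.array([-k1 * a + k2 * b, k1 * a - k2 * b])

def jacobian(y):
    return np.array([[-k1, k2], [k1, -k2]])

def ros2_step(y, h, gamma=1.0 + 1.0 / np.sqrt(2.0)):
    """One step of the 2-stage, 2nd-order Rosenbrock (ROS2) method.

    Each step solves two linear systems with the same matrix, which is
    what makes this family attractive for stiff chemistry and for GPUs.
    """
    lhs = np.eye(len(y)) - gamma * h * jacobian(y)
    s1 = np.linalg.solve(lhs, f(y))
    s2 = np.linalg.solve(lhs, f(y + h * s1) - 2.0 * s1)
    return y + 0.5 * h * (3.0 * s1 + s2)

y = np.array([1.0, 0.0])
for _ in range(100):
    y = ros2_step(y, h=0.05)
print(y)  # approaches the equilibrium [k2, k1] / (k1 + k2)
```

The GPU win described above comes from running thousands of such independent per-grid-cell integrations concurrently, one kernel thread per cell.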


2013 ◽  
Vol 10 (6) ◽  
pp. 4189-4210 ◽  
Author(s):  
D. Dalmonech ◽  
S. Zaehle

Abstract. Terrestrial ecosystem models used for Earth system modelling show a significant divergence in future patterns of ecosystem processes, in particular the net land–atmosphere carbon exchanges, despite seemingly common behaviour for the contemporary period. An in-depth evaluation of these models is hence of high importance for better understanding the reasons for this disagreement. Here, we develop an extension to existing benchmarking systems by making use of the complementary information contained in the observational records of atmospheric CO2 and remotely sensed vegetation activity to provide a novel set of diagnostics of ecosystem responses to climate variability in the last 30 yr at different temporal and spatial scales. The selection of observational characteristics (traits) specifically considers the robustness of the information, given that the uncertainty of both data and evaluation methodology is largely unknown or difficult to quantify. Based on these considerations, we introduce a baseline benchmark – a minimum test that any model has to pass – to provide a more objective, quantitative evaluation framework. The benchmarking strategy can be used for any land surface model, either driven by observed meteorology or coupled to a climate model. We apply this framework to evaluate the offline version of the MPI Earth System Model's land surface scheme JSBACH. We demonstrate that the complementary use of atmospheric CO2 and satellite-based vegetation activity data allows specific model deficiencies to be pinpointed that could not be identified from atmospheric CO2 observations alone.
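The "baseline benchmark" idea lends itself to a simple illustration. The following hypothetical Python sketch (not code from the paper, and with made-up numbers) tests one trait, the mean seasonal amplitude of atmospheric CO2, against an observed value with a tolerance standing in for observational uncertainty:

```python
import numpy as np

def seasonal_amplitude(monthly_series):
    """Mean peak-to-trough amplitude of the annual cycle.

    monthly_series: monthly values, length a multiple of 12.
    """
    years = np.asarray(monthly_series, dtype=float).reshape(-1, 12)
    return float(np.mean(years.max(axis=1) - years.min(axis=1)))

def baseline_benchmark(modelled, observed, tolerance):
    """Minimum test a model has to pass: the modelled trait must lie
    within the observational uncertainty band."""
    return abs(modelled - observed) <= tolerance

# Made-up 5-year modelled CO2 series (ppm) for illustration only.
model_co2 = 380 + 8 * np.sin(np.linspace(0, 2 * np.pi * 5, 60))
amp_model = seasonal_amplitude(model_co2)
print(baseline_benchmark(amp_model, observed=16.0, tolerance=2.0))
```

A full benchmarking system would evaluate many such traits across sites and scales; the pass/fail framing is what makes the baseline objective.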


2018 ◽  
Author(s):  
Ufuk Utku Turuncoglu

Abstract. The data volume produced by regional and global multi-component earth system models is rapidly increasing, due to the improved spatial and temporal resolution of the model components, the growing sophistication of the numerical models in terms of represented physical processes, and their non-linear complex interactions. In particular, very short time steps have to be used in multi-component and multi-scale non-hydrostatic modelling systems to represent the evolution of fast-moving processes such as turbulence, extra-tropical cyclones, convective lines, jet streams, internal waves, vertical turbulent mixing and surface gravity waves. Consequently, these small time steps cause extra computation and disk I/O overhead in the modelling system, even on today's most powerful high-performance computing and data storage systems. Analysing the massive volume of data from multiple earth system model components at different temporal and spatial resolutions also poses a challenging problem that the conventional post-processing methods available today cannot handle efficiently. This study aims to explore the feasibility and added value of integrating existing in-situ visualization and data analysis methods with the model coupling framework ESMF, to increase interoperability between multi-component simulation codes and data-processing pipelines by providing an easy-to-use, efficient, generic and standardized modelling environment for earth system science applications. The new data analysis approach enables simultaneous analysis of the vast amount of data produced by multi-component regional earth system models (atmosphere, ocean etc.) while the simulation is running. The methodology aims to create an integrated modelling environment for analysing fast-moving processes and their evolution in both time and space, to support a better understanding of the underlying physical mechanisms. The state-of-the-art approach can also be used to solve common problems in the earth system model development workflow, such as designing new sub-grid-scale parametrizations (convection, air–sea interaction etc.), which requires inspecting the integrated model behaviour at higher temporal and spatial scales during the run, or supporting visual debugging of multi-component modelling systems, which existing model coupling libraries and modelling systems usually do not facilitate.
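The in-situ pattern can be sketched schematically. The Python below is entirely hypothetical: the names and structure are invented for illustration and do not correspond to the ESMF, ParaView Catalyst, or any real in-situ API. The key idea is that the coupled driver hands live model fields to an analysis pipeline every N steps instead of writing everything to disk first:

```python
class InSituPipeline:
    """Consumes live model fields during the run, avoiding disk I/O."""

    def __init__(self, every_n_steps):
        self.every_n_steps = every_n_steps

    def process(self, step, fields):
        if step % self.every_n_steps != 0:
            return
        # A real pipeline would render images, compute statistics, or
        # feed a visualization server here.
        for name, data in fields.items():
            print(f"step {step}: analysing {name} in situ")

def run_coupled_model(n_steps, pipeline):
    fields = {"sea_surface_temperature": None, "surface_wind": None}
    for step in range(n_steps):
        # ... advance atmosphere and ocean components, exchange fields ...
        pipeline.process(step, fields)

run_coupled_model(n_steps=24, pipeline=InSituPipeline(every_n_steps=6))
```

Hooking such a pipeline into the coupling layer, rather than into each component, is what makes the approach generic across modelling systems.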


2020 ◽  
Author(s):  
Dirk Barbi ◽  
Nadine Wieters ◽  
Luisa Cristini ◽  
Paul Gierz ◽  
Sara Khosravi ◽  
...  

Earth system and climate modelling involves the simulation of processes on a large range of scales and within very different components of the earth system. In practice, component models from different institutes are mostly developed independently and then combined using dedicated coupling software.

This procedure not only leads to a wildly growing number of available versions of model components and coupled setups, but also to specific ways of obtaining and operating many of them. This can be a challenging problem (and potentially a huge waste of time), especially for inexperienced researchers or for scientists aiming to switch to a different model system, e.g. for intercomparisons.

In order to define a standard way of downloading, configuring, compiling and running modular ESMs on a variety of HPC systems, AWI and partner institutions develop and maintain the open-source ESM-Tools software (https://www.esm-tools.net). Our aim is to provide standard solutions to typical problems occurring within the workflow of model simulations, such as calendar operations, data post-processing and monitoring, sanity checks, sorting and archiving of output, and script-based coupling (e.g. ice sheet models, isostatic adjustment models). The user only provides a short (30-40 lines) runscript of absolutely necessary experiment-specific definitions, while the ESM-Tools execute the phases of a simulation in the correct order. A user-friendly API ensures that more experienced users have full control over each of these phases and can easily add functionality. A GUI has been developed to provide a more intuitive approach to the modular system, and also to add a graphical overview of the available models and combinations.

Since revision 2 (released on March 19th 2019), the ESM-Tools have been entirely re-written, separating the implementation of actions (written in Python 3) from all the information we have on models, coupled setups, software tools, HPC systems etc., which is kept in clearly structured YAML configuration files. This has been done to reduce maintenance problems and to ensure that even inexperienced scientists can easily edit configurations, or add new models or software, without any programming experience. Since revision 3, the ESM-Tools support four ocean models (FESOM1, FESOM2, NEMO, MPIOM), three atmosphere models (ECHAM6, OpenIFS, ICON), two BGC models (HAMOCC, REcoM), an ice sheet model (PISM) and an isostatic adjustment model (VILMA), as well as standard settings for five HPC systems. For the future, we plan to add interfaces to regional models and soil/hydrology models.

The Tools currently have more than 70 registered users from 5 institutions, and more than 40 authors have contributed to either model configurations or functionality.
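The split between Python actions and YAML configuration can be illustrated with a small sketch. This is not actual ESM-Tools code; the runscript keys, defaults, and phase names below are hypothetical, and only the general pattern (a short user runscript merged over layered defaults, then phases executed in order) reflects the description above:

```python
# Illustrative sketch only: a short YAML runscript holds the
# experiment-specific settings, which the tool merges with model and
# machine defaults before running the simulation phases in order.
import yaml

runscript = """
general:
    setup_name: awicm          # hypothetical coupled setup
    compute_time: "02:00:00"
    initial_date: "2000-01-01"
    final_date: "2000-12-31"
echam:
    scenario: PI-CTRL
"""

user_config = yaml.safe_load(runscript)
defaults = {"general": {"account": "ab0000"}, "echam": {"resolution": "T63"}}

def merge(base, override):
    """User settings take precedence over model/machine defaults."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge(merged[key], value)
        else:
            merged[key] = value
    return merged

config = merge(defaults, user_config)

for phase in ("prepare", "compile_check", "run", "postprocess"):
    print(f"phase {phase}: using setup {config['general']['setup_name']}")
```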


2014 ◽  
Vol 7 (6) ◽  
pp. 7505-7524 ◽  
Author(s):  
M. S. Long ◽  
R. Yantosca ◽  
J. E. Nielsen ◽  
C. A. Keller ◽  
A. da Silva ◽  
...  

Abstract. The GEOS-Chem global chemical transport model (CTM), used by a large atmospheric chemistry research community, has been re-engineered to also serve as an atmospheric chemistry module for Earth System Models (ESMs). This was done using an Earth System Modeling Framework (ESMF) interface that operates independently of the GEOS-Chem scientific code, permitting the exact same GEOS-Chem code to be used as an ESM module or as a stand-alone CTM. In this manner, the continual stream of updates contributed by the CTM user community is automatically passed on to the ESM module, which remains state-of-science and referenced to the latest version of the standard GEOS-Chem CTM. A major step in this re-engineering was to make GEOS-Chem grid-independent, i.e., capable of using any geophysical grid specified at run time. GEOS-Chem data "sockets" were also created for communication between modules and with external ESM code via the ESMF. The grid-independent, ESMF-compatible GEOS-Chem is now the standard version of the GEOS-Chem CTM. It has been implemented as an atmospheric chemistry module in the NASA GEOS-5 ESM. The coupled GEOS-5/GEOS-Chem system was tested for scalability and performance with a tropospheric oxidant-aerosol simulation (120 coupled species, 66 transported tracers) using 48–240 cores and MPI parallelization. Numerical experiments demonstrate that the GEOS-Chem chemistry module scales efficiently over the range of processor counts tested. Although inclusion of atmospheric chemistry in ESMs is computationally expensive, the excellent scalability of the chemistry module means that the relative cost goes down with an increasing number of MPI processes.
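The grid-independence idea can be sketched in miniature. GEOS-Chem itself is Fortran and this hypothetical Python is not its API; it only illustrates the pattern of a science module that receives the grid description at run time through a "socket" instead of hard-coding it:

```python
# Hypothetical sketch: the chemistry operator works for any grid shape
# handed to it by the host (stand-alone CTM driver or ESM coupler).
from dataclasses import dataclass

import numpy as np

@dataclass
class GridSpec:
    """Geophysical grid specified at run time."""
    n_lon: int
    n_lat: int
    n_lev: int

def chemistry_step(grid: GridSpec, concentrations: np.ndarray,
                   dt: float) -> np.ndarray:
    """Placeholder chemistry operator: toy first-order loss, any grid."""
    assert concentrations.shape == (grid.n_lon, grid.n_lat, grid.n_lev)
    loss_rate = 1.0e-5  # s^-1, invented for illustration
    return concentrations * np.exp(-loss_rate * dt)

# The same module serves different hosts simply by being handed
# different grids through the socket:
for grid in (GridSpec(72, 46, 47), GridSpec(360, 181, 72)):
    c = np.ones((grid.n_lon, grid.n_lat, grid.n_lev))
    c = chemistry_step(grid, c, dt=600.0)
    print(grid, float(c.mean()))
```

Keeping the interface layer outside the scientific code, as described above, is what lets community updates to the CTM flow into the ESM module unchanged.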

