Towards a performance portable, architecture agnostic implementation strategy for weather and climate models

2021
Author(s):  
Christian Zeman
Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about the Earth's atmosphere. Many of the changes in today's models relate to seemingly innocuous modifications associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet their verification is challenged by the chaotic nature of the atmosphere: any small change, even a rounding error, can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impact of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess the effect of model changes on almost any output variable over time and can be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
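To make the approach concrete, the following is a minimal, hypothetical sketch in Python (not the authors' implementation and not tied to COSMO output): a reference ensemble and a modified ensemble are compared with a two-sample Kolmogorov-Smirnov test for each output time of one variable, and the rejection rate is reported. A rejection rate close to the significance level is consistent with internal variability alone, whereas a clearly higher rate flags a change that alters the model's behaviour.

```python
# Minimal sketch: ensemble-based verification of a model change with a
# two-sample hypothesis test, applied per output time for one variable.
import numpy as np
from scipy.stats import ks_2samp

def rejection_rate(ref, mod, alpha=0.05):
    """ref, mod: arrays of shape (n_members, n_times) for one output variable
    (e.g. a domain-averaged 2-m temperature). Returns the fraction of output
    times at which the null hypothesis 'both ensembles come from the same
    distribution' is rejected at level alpha."""
    n_times = ref.shape[1]
    p_values = np.array([ks_2samp(ref[:, t], mod[:, t]).pvalue
                         for t in range(n_times)])
    return np.mean(p_values < alpha)

# Synthetic stand-in for ensemble output: 20 members, 48 output times.
rng = np.random.default_rng(0)
ref     = rng.normal(loc=285.0, scale=0.5, size=(20, 48))
mod_ok  = rng.normal(loc=285.0, scale=0.5, size=(20, 48))   # benign change
mod_bad = rng.normal(loc=285.3, scale=0.5, size=(20, 48))   # real change

print("benign change, rejection rate:     ", rejection_rate(ref, mod_ok))
print("significant change, rejection rate:", rejection_rate(ref, mod_bad))
```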


2012
Vol 93 (8)
pp. 1171-1187
Author(s):  
Mitchell W. Moncrieff
Duane E. Waliser
Martin J. Miller
Melvyn A. Shapiro
Ghassem R. Asrar
...  

The Year of Tropical Convection (YOTC) project recognizes that major improvements are needed in how the tropics are represented in climate models. Tropical convection is organized into multiscale precipitation systems with an underlying chaotic order. These organized systems act as building blocks for meteorological events at the intersection of weather and climate (time scales up to seasonal). These events affect a large percentage of the world's population. Much of the uncertainty associated with weather and climate derives from incomplete understanding of how meteorological systems on the mesoscale (~1–100 km), synoptic scale (~1,000 km), and planetary scale (~10,000 km) interact with each other. This uncertainty complicates attempts to predict high-impact phenomena associated with the tropical atmosphere, such as tropical cyclones, the Madden–Julian oscillation (MJO), convectively coupled tropical waves, and the monsoons. These and other phenomena influence the extratropics by migrating out of the tropics and by the remote effects of planetary waves, including those generated by the MJO. The diurnal and seasonal cycles modulate all of the above. It will be impossible to accurately predict climate on regional scales or to comprehend the variability of the global water cycle in a warmer world without comprehensively addressing tropical convection and its interactions across space and time scales.


2014
Vol 1 (2)
pp. 1283-1312
Author(s):  
M. Abbas
A. Ilin
A. Solonen
J. Hakkarainen
E. Oja
...  

Abstract. In this work, we consider the Bayesian optimization (BO) approach for tuning parameters of complex chaotic systems. Such problems arise, for instance, in tuning the sub-grid-scale parameterizations of weather and climate models. For such problems, the tuning procedure is generally based on a performance metric that measures how well the tuned model fits the data, and the tuning is often computationally expensive. We show that BO, as a tool for finding the extrema of computationally expensive objective functions, is well suited to such tuning tasks. In the experiments, we tune the parameters of two systems: a simplified atmospheric model and a low-dimensional chaotic system. We show that BO is able to tune the parameters of both systems with a small number of objective function evaluations and without any gradient information.
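As an illustration of the general recipe (surrogate model plus acquisition function, not the authors' exact setup), the sketch below tunes a single parameter of the Lorenz-63 system by minimizing the mismatch of a long-term statistic against "observations", using a Gaussian-process surrogate with an expected-improvement criterion. The Lorenz-63 stand-in, the summary statistic, and all parameter choices are assumptions made for this example.

```python
# Sketch of Bayesian optimization for tuning a parameter of a chaotic system.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def lorenz63_stats(sigma, rho=28.0, beta=8.0 / 3.0, dt=0.01, n=20000):
    """Integrate Lorenz-63 with explicit Euler and return a summary statistic
    (time-mean of z), mimicking a climate-like tuning target."""
    x, y, z = 1.0, 1.0, 1.0
    zs = np.empty(n)
    for i in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        zs[i] = z
    return zs[n // 2:].mean()                    # discard spin-up

target = lorenz63_stats(10.0)                    # "observations" from sigma = 10

def objective(sigma):
    """Squared mismatch between model statistic and observed statistic."""
    return (lorenz63_stats(sigma) - target) ** 2

# Bayesian optimization: GP surrogate + expected improvement, few evaluations.
bounds = (5.0, 15.0)
rng = np.random.default_rng(1)
X = rng.uniform(*bounds, size=(3, 1))            # small initial design
Y = np.array([objective(s) for s in X.ravel()])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(12):
    gp.fit(X, Y)
    cand = np.linspace(*bounds, 500).reshape(-1, 1)
    mu, sd = gp.predict(cand, return_std=True)
    imp = Y.min() - mu                           # improvement over best so far
    ei = imp * norm.cdf(imp / (sd + 1e-12)) + sd * norm.pdf(imp / (sd + 1e-12))
    x_next = cand[np.argmax(ei)]                 # most promising candidate
    X = np.vstack([X, x_next])
    Y = np.append(Y, objective(x_next[0]))

print("estimated sigma:", X[np.argmin(Y), 0])
```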


2016
Author(s):  
Andrew Dawson
Peter Düben

Abstract. This paper describes the rpe library, which can emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the results of their simulations without having to make extensive code changes or port the model onto specialised hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes that allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for a given application while still achieving results of acceptable quality, computational cost can be reduced, since lower precision may allow higher performance or lower power consumption. For simulations with weather and climate models, savings due to reduced precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members and thereby improve predictions. rpe was developed with a particular focus on the weather and climate modelling community, but the software could be used for numerical simulations in other domains.
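rpe itself is implemented in Fortran, so the snippet below is only a conceptual Python analogue of the basic operation behind such reduced-precision emulation, not the rpe API: it rounds float64 values to a chosen number of explicit significand bits while keeping float64 storage.

```python
# Conceptual analogue of reduced-precision emulation (not the rpe library).
import numpy as np

def reduce_precision(x, sbits):
    """Round float64 values to 'sbits' explicit significand bits, emulating a
    reduced-precision floating-point format in float64 storage.
    (float64 itself carries 52 explicit significand bits.)"""
    x = np.asarray(x, dtype=np.float64)
    # Split into significand and exponent, round the significand to
    # sbits + 1 significant bits (1 implicit + sbits explicit), recombine.
    m, e = np.frexp(x)                 # x = m * 2**e with 0.5 <= |m| < 1
    m = np.round(m * 2.0 ** (sbits + 1)) / 2.0 ** (sbits + 1)
    return np.ldexp(m, e)

x = np.array([3.14159265358979, 2.718281828459045, 1.0e-8])
print(reduce_precision(x, 10))   # roughly half-precision significand
print(reduce_precision(x, 23))   # roughly single-precision significand
```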


2013
Vol 94 (11)
pp. 1691-1706
Author(s):  
A. A. M. Holtslag
G. Svensson
P. Baas
S. Basu
B. Beare
...  

The representation of the atmospheric boundary layer is an important part of weather and climate models and impacts many applications such as air quality and wind energy. Over the years, the performance in modeling 2-m temperature and 10-m wind speed has improved, but errors are still significant. This is in particular the case under clear skies and low wind speeds at night, as well as during winter in stably stratified conditions over land and ice. In this paper, the authors review these issues and provide an overview of the current understanding and model performance. Results from weather forecast and climate models are used to illustrate the state of the art, as well as findings and recommendations from three intercomparison studies held within the Global Energy and Water Exchanges (GEWEX) Atmospheric Boundary Layer Study (GABLS). Within GABLS, the focus has been on examining the representation of the stable boundary layer and the diurnal cycle over land in clear-sky conditions. For this purpose, single-column versions of weather and climate models have been compared with observations, research models, and large-eddy simulations. The intercomparison cases are based on observations taken in the Arctic, in Kansas, and at Cabauw in the Netherlands. From these studies, we find that important parameterization challenges remain even for the noncloudy boundary layer.


2017
Vol 8 (2)
pp. 429-438
Author(s):  
Francine J. Schevenhoven
Frank M. Selten

Abstract. Weather and climate models have improved steadily over time, as witnessed by objective skill scores, although significant model errors remain. Given these imperfect models, predictions might be improved by combining them dynamically into a so-called supermodel. In this paper, a new training scheme to construct such a supermodel is explored using a technique called cross pollination in time (CPT). In the CPT approach, the models exchange states during the prediction. The number of possible predictions grows quickly with time, so a strategy that retains only a small number of predictions, called pruning, needs to be developed. The method is explored using low-order dynamical systems and applied to a global atmospheric model. The results indicate that CPT training is efficient and leads to a supermodel with improved forecast quality compared to the individual models. Due to its computational efficiency, the technique is suited for application to state-of-the-art high-dimensional weather and climate models.
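The CPT idea can be illustrated with a small, hypothetical example (two imperfect Lorenz-63 "models" standing in for the individual weather models; the parameters and the pruning criterion are assumptions for the sketch, not the paper's configuration): at every step, each retained state is propagated by every model, and the resulting set of trajectories is pruned back to those closest to the observed truth, with the selection frequencies suggesting weights for a supermodel.

```python
# Sketch of cross pollination in time (CPT) with pruning on Lorenz-63.
import numpy as np

def lorenz_step(state, sigma, rho, beta, dt=0.01):
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

truth_pars = (10.0, 28.0, 8.0 / 3.0)             # "real atmosphere"
model_pars = [(9.0, 28.0, 8.0 / 3.0),            # imperfect model 1
              (11.0, 28.0, 8.0 / 3.0)]           # imperfect model 2

# Generate an observed trajectory from the truth.
n_steps, keep = 200, 4                           # keep = pruning size
obs = np.empty((n_steps + 1, 3))
obs[0] = np.array([1.0, 1.0, 1.0])
for t in range(n_steps):
    obs[t + 1] = lorenz_step(obs[t], *truth_pars)

# CPT training: from every retained state, step forward with *each* model,
# then prune back to the 'keep' states closest to the observation.
states = [obs[0].copy()]
counts = np.zeros(len(model_pars))               # how often each model is kept
for t in range(n_steps):
    candidates, owners = [], []
    for s in states:
        for m, pars in enumerate(model_pars):
            candidates.append(lorenz_step(s, *pars))
            owners.append(m)
    errs = [np.linalg.norm(c - obs[t + 1]) for c in candidates]
    best = np.argsort(errs)[:keep]
    states = [candidates[i] for i in best]
    counts += np.bincount([owners[i] for i in best], minlength=len(model_pars))

# Relative selection frequencies suggest weights for a weighted supermodel.
print("model selection frequencies:", counts / counts.sum())
```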


Author(s):  
Sarah N
Robert S
Pallav Ray
Katherine Chen
Angie Lassman
...  

2020
Vol 11 (1)
Author(s):  
Nina N. Ridder
Andy J. Pitman
Seth Westra
Anna Ukkola
Hong X. Do
...  

Abstract. Compound events (CEs) are weather and climate events that result from multiple hazards or drivers and have the potential to cause severe socio-economic impacts. Compared with isolated hazards, the multiple hazards/drivers associated with CEs can lead to higher economic losses and death tolls. Here, we provide the first analysis of multiple multivariate CEs potentially causing high-impact floods, droughts, and fires. Using observations and reanalysis data for 1980–2014, we analyse 27 hazard pairs and provide the first spatial estimates of their occurrence on the global scale. We identify hotspots of multivariate CEs, including many socio-economically important regions such as North America, Russia, and western Europe. We analyse the relative importance of different multivariate CEs in six continental regions to highlight the CEs posing the highest risk. Our results provide initial guidance for assessing the regional risk of CEs and an observationally based dataset to aid the evaluation of climate models in simulating multivariate CEs.
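As a simple illustration of the kind of analysis involved (hypothetical data layout and thresholds, not the authors' hazard definitions), the sketch below estimates how often two annual hazard indicators co-occur at each grid cell, which is the basic quantity behind a map of multivariate compound-event hotspots.

```python
# Sketch: per-grid-cell co-occurrence frequency of two hazard indicators.
import numpy as np

def cooccurrence_fraction(hazard_a, hazard_b):
    """hazard_a, hazard_b: boolean arrays of shape (n_years, n_lat, n_lon)
    marking years in which each hazard indicator exceeds its threshold.
    Returns the fraction of years with joint occurrence at each grid cell."""
    return np.logical_and(hazard_a, hazard_b).mean(axis=0)

# Synthetic stand-in for, e.g., annual indicators of extreme precipitation
# and extreme wind derived from reanalysis (1980-2014: 35 years).
rng = np.random.default_rng(0)
precip_extreme = rng.random((35, 90, 180)) > 0.8
wind_extreme   = rng.random((35, 90, 180)) > 0.8

joint = cooccurrence_fraction(precip_extreme, wind_extreme)
print("max joint occurrence fraction:", joint.max())
```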

