Climate Modelling in Low Precision: Effects of Both Deterministic & Stochastic Rounding.

2021 ◽  
pp. 1-43
Author(s):  
E. Adam Paxton ◽  
Matthew Chantry ◽  
Milan Klöwer ◽  
Leo Saffin ◽  
Tim Palmer

Abstract. Motivated by recent advances in operational weather forecasting, we study the efficacy of low-precision arithmetic for climate simulations. We develop a framework to measure rounding error in a climate model which provides a stress-test for a low-precision version of the model, and we apply our method to a variety of models including the Lorenz system; a shallow water approximation for flow over a ridge; and a coarse resolution spectral global atmospheric model with simplified parameterisations (SPEEDY). Although double precision (52 significant bits) is standard across operational climate models, in our experiments we find that single precision (23 sbits) is more than enough and that as low as half precision (10 sbits) is often sufficient. For example, SPEEDY can be run with 12 sbits across the code with negligible rounding error, and with 10 sbits if minor errors are accepted, amounting to less than 0.1 mm/6hr for average grid-point precipitation. Our test is based on the Wasserstein metric and this provides stringent non-parametric bounds on rounding error accounting for annual means as well as extreme weather events. In addition, by testing models using both round-to-nearest (RN) and stochastic rounding (SR) we find that SR can mitigate rounding error across a range of applications, and thus our results also provide some evidence that SR could be relevant to next-generation climate models. Further research is needed to test if our results can be generalised to higher resolutions and alternative numerical schemes. However, the results open a promising avenue towards the use of low-precision hardware for improved climate modelling.
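As an illustration of the two rounding modes compared in this abstract, the sketch below rounds a floating-point value to a chosen number of significant bits with either round-to-nearest or stochastic rounding. It is a minimal Python illustration, not the authors' implementation; the function name and the sbits convention (explicit fraction bits, as in the 52/23/10 counts above) are choices made here.

```python
import math
import random

def round_sbits(x, sbits, mode="sr"):
    """Round x to `sbits` explicit significant (fraction) bits using
    round-to-nearest ("rn") or stochastic rounding ("sr")."""
    if x == 0.0 or not math.isfinite(x):
        return x
    m, e = math.frexp(x)                  # x = m * 2**e with 0.5 <= |m| < 1
    scaled = m * 2.0 ** (sbits + 1)       # integer part now holds 1 implicit + sbits bits
    if mode == "rn":
        kept = round(scaled)              # deterministic: nearest representable value
    else:
        lo = math.floor(scaled)
        frac = scaled - lo                # distance past the lower neighbour, in ulps
        kept = lo + (1 if random.random() < frac else 0)   # round up with probability frac
    return math.ldexp(kept, e - sbits - 1)

x = math.pi
print(round_sbits(x, 10, "rn"))                        # half-precision-like, deterministic
samples = [round_sbits(x, 10, "sr") for _ in range(100_000)]
print(sum(samples) / len(samples) - x)                 # close to 0: SR is unbiased on average
```

Averaging many stochastically rounded copies of a value recovers it in expectation, which is the property that helps SR mitigate accumulated rounding error; a one-dimensional Wasserstein distance between low- and high-precision output samples can then be computed with, for example, scipy.stats.wasserstein_distance.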

2020 ◽  
Author(s):  
Peter Watson ◽  
Sarah Sparrow ◽  
William Ingram ◽  
Simon Wilson ◽  
Drouard Marie ◽  
...  

Multi-thousand member climate model simulations are highly valuable for showing how extreme weather events will change as the climate changes, using a physically-based approach. However, until now, studies using such an approach have been limited to using models with a resolution much coarser than the most modern systems. We have developed a global atmospheric model with 5/6° × 5/9° resolution (~60 km in middle latitudes) that can be run in the climateprediction.net distributed computing system to produce such large datasets. This resolution is finer than that of many current global climate models and sufficient for good simulation of extratropical synoptic features such as storms. It will also allow many extratropical extreme weather events to be simulated without requiring regional downscaling. We will show that this model's simulation of extratropical weather is competitive with that in other current models. We will also present results from the first multi-thousand member ensembles produced at this resolution, showing the impact of 1.5°C and 2°C global warming on extreme winter rainfall and extratropical cyclones in Europe.
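The quoted ~60 km spacing can be checked with a short spherical-Earth calculation. The snippet below assumes the first figure (5/6°) is the zonal spacing and evaluates the grid at 45° latitude; both assumptions are made here for illustration.

```python
import math

R_EARTH_KM = 6371.0                              # mean Earth radius
KM_PER_DEG = 2 * math.pi * R_EARTH_KM / 360.0    # ~111 km per degree along a great circle

def grid_spacing_km(dlon_deg, dlat_deg, lat_deg):
    """Approximate zonal and meridional grid spacing (km) at a given latitude,
    treating the Earth as a sphere."""
    dx = dlon_deg * KM_PER_DEG * math.cos(math.radians(lat_deg))
    dy = dlat_deg * KM_PER_DEG
    return dx, dy

# 5/6 deg x 5/9 deg grid evaluated at 45 N
print(grid_spacing_km(5/6, 5/9, 45.0))   # roughly (66, 62) km -> "~60 km in middle latitudes"
```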


2017 ◽  
Vol 10 (5) ◽  
pp. 1849-1872 ◽  
Author(s):  
Benoit P. Guillod ◽  
Richard G. Jones ◽  
Andy Bowery ◽  
Karsten Haustein ◽  
Neil R. Massey ◽  
...  

Abstract. Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools as they allow for the generation of a larger sample of extreme events, the attribution of recent events to anthropogenic climate change, and the projection of changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing. This is a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of the climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, where subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM temperature biases are overall reduced, in particular the warm bias over eastern Europe, but large biases remain. Precipitation is improved over the Alps in summer, with mixed changes in other regions and seasons. The model is shown to represent the main classes of regional extreme events reasonably well and shows a good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made with regard to impacts, especially at more localized scales.
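The tiling mentioned above can be illustrated with a toy aggregation step: each grid box is split into surface-type tiles that carry their own surface properties and fluxes, and the grid-box value handed back to the atmosphere is the area-weighted sum. This is a minimal sketch with hypothetical tile names and flux values, not the actual land-surface scheme used in the model.

```python
from dataclasses import dataclass

@dataclass
class Tile:
    """One land-surface type within a grid box (hypothetical structure for illustration)."""
    name: str
    fraction: float        # areal fraction of the grid box (fractions sum to 1)
    sensible_flux: float   # W m-2 from this tile's own surface energy balance

def gridbox_flux(tiles):
    """Area-weighted aggregation of tile fluxes to a single grid-box value,
    the basic idea behind a tiled representation of subgrid heterogeneity."""
    assert abs(sum(t.fraction for t in tiles) - 1.0) < 1e-6
    return sum(t.fraction * t.sensible_flux for t in tiles)

tiles = [Tile("broadleaf", 0.3, 120.0), Tile("grass", 0.5, 80.0), Tile("urban", 0.2, 150.0)]
print(gridbox_flux(tiles))   # 106.0 W m-2
```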


2016 ◽  
Author(s):  
Benoit P. Guillod ◽  
Andy Bowery ◽  
Karsten Haustein ◽  
Richard G. Jones ◽  
Neil R. Massey ◽  
...  

Abstract. Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools as they allow for the generation of a larger sample of extreme events, the attribution of recent events to anthropogenic climate change, and the projection of changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing. This is a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of the climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, where subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 km to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM biases are overall reduced, in particular the warm and dry bias over eastern Europe, but large biases remain. The model is shown to represent the main classes of regional extreme events reasonably well and shows a good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made with regard to impacts, especially at more localized scales.


2021 ◽  
Author(s):  
Christian Zeman ◽  
Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about the atmosphere of Earth. Many of the changes in today's models relate to seemingly unsuspicious modifications, associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet the verification of such changes is challenged by the chaotic nature of our atmosphere: any small change, even rounding errors, can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess effects of model changes on almost any output variable over time, and can also be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
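The core of such a verification can be sketched as a two-sample test applied at each output time between a reference ensemble and an ensemble produced after the change. The snippet below uses a Kolmogorov-Smirnov test as a stand-in; the authors' specific test statistic and their treatment of multiple testing are not reproduced here.

```python
import numpy as np
from scipy.stats import ks_2samp

def rejection_rate(ref, new, alpha=0.05):
    """Compare two ensembles of one output variable (shape: members x time) with a
    two-sample Kolmogorov-Smirnov test at each time step. If the two model versions
    are statistically indistinguishable, roughly a fraction `alpha` of the tests
    should reject by chance alone."""
    rejections = [ks_2samp(ref[:, t], new[:, t]).pvalue < alpha for t in range(ref.shape[1])]
    return np.mean(rejections)

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, size=(50, 200))   # reference ensemble (e.g. before an upgrade)
new = rng.normal(0.0, 1.0, size=(50, 200))   # perturbed ensemble (e.g. after an upgrade)
print(rejection_rate(ref, new))              # close to 0.05 when the change is neutral
```

In practice one would also need to account for the many tests across time steps and variables (field significance) before declaring two model versions inconsistent.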


2016 ◽  
Vol 20 (5) ◽  
pp. 2047-2061 ◽  
Author(s):  
Sebastiano Piccolroaz ◽  
Michele Di Lazzaro ◽  
Antonio Zarlenga ◽  
Bruno Majone ◽  
Alberto Bellin ◽  
...  

Abstract. We present HYPERstream, an innovative streamflow routing scheme based on the width function instantaneous unit hydrograph (WFIUH) theory, which is specifically designed to facilitate coupling with weather forecasting and climate models. The proposed routing scheme preserves geomorphological dispersion of the river network when dealing with horizontal hydrological fluxes, irrespective of the computational grid size inherited from the overlying climate model providing the meteorological forcing. This is achieved by simulating routing within the river network through suitable transfer functions obtained by applying the WFIUH theory to the desired level of detail. The underlying principle is similar to the block-effective dispersion employed in groundwater hydrology, with the transfer functions used to represent the effect on streamflow of morphological heterogeneity at scales smaller than the computational grid. Transfer functions are constructed for each grid cell with respect to the nodes of the network where streamflow is simulated, by taking advantage of the detailed morphological information contained in the digital elevation model (DEM) of the zone of interest. These characteristics make HYPERstream well suited for multi-scale applications, ranging from catchment up to continental scale, and for investigating extreme events (e.g., floods) that require an accurate description of routing through the river network. The routing scheme is parsimonious in its parametrization and computationally efficient, leading to a dramatic reduction of the computational effort with respect to fully gridded models at a comparable level of accuracy. HYPERstream is designed with a simple and flexible modular structure that allows for the selection of any rainfall-runoff model to be coupled with the routing scheme and the choice of different hillslope processes to be represented, which makes the framework particularly suitable for massive parallelization, customization according to specific user needs and preferences, and continuous development and improvement.
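The transfer-function idea can be sketched in a few lines: a width function (the distribution of DEM-derived flow-path lengths from a grid cell to a network node) is converted into a discrete unit hydrograph under an assumed constant celerity, and streamflow at the node is the sum of each cell's runoff convolved with its transfer function. This is a bare-bones illustration of the principle, not the HYPERstream code; the data structures, the constant-celerity assumption, and the omission of hillslope processes are simplifications made here.

```python
import numpy as np

def wfiuh(width_function, celerity, dt):
    """Turn a width function, given as an (N, 2) array of (flow-path length m, width),
    into a discrete unit hydrograph, assuming a constant channel celerity (m/s).
    Only geomorphological dispersion is represented."""
    travel_times = width_function[:, 0] / celerity                 # seconds to the node
    weights = width_function[:, 1] / width_function[:, 1].sum()    # normalised contributions
    nbins = int(np.ceil(travel_times.max() / dt)) + 1
    uh = np.zeros(nbins)
    np.add.at(uh, (travel_times // dt).astype(int), weights)       # bin by arrival time
    return uh

def route(runoff_by_cell, unit_hydrographs):
    """Streamflow at a network node: sum over grid cells of each cell's runoff series
    convolved with that cell's transfer function."""
    n = len(next(iter(runoff_by_cell.values())))
    q = np.zeros(n)
    for cell, runoff in runoff_by_cell.items():
        q += np.convolve(runoff, unit_hydrographs[cell])[:n]
    return q

wf = np.array([[2000.0, 1.0], [5000.0, 2.0], [9000.0, 1.0]])   # hypothetical DEM-derived pairs
uh = wfiuh(wf, celerity=1.0, dt=3600.0)                        # hourly unit hydrograph
print(route({"cell_A": np.ones(24)}, {"cell_A": uh}))          # 24 h of unit runoff from one cell
```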


1998 ◽  
Vol 27 ◽  
pp. 565-570 ◽  
Author(s):  
William M. Connolley ◽  
Siobhan P. O'Farrell

We compare observed temperature variations in Antarctica with climate-model runs over the last century. The models used are three coupled global climate models (GCMs), from the UKMO, CSIRO and MPI, forced by the CO2 increases observed over the last century, and an atmospheric model experiment forced with observed sea-surface temperatures and sea-ice extents over the last century. Despite some regions of agreement, in general the GCM runs appear to be incompatible with each other and with the observations, although the short observational record and high natural variability make verification difficult. One of the best places for a more detailed study is the Antarctic Peninsula, where the density of stations is higher and station records are longer than elsewhere in Antarctica. Observations show that this area has seen larger temperature rises than anywhere else in Antarctica. None of the three GCMs simulates such large temperature changes in the Peninsula region, in either climate-change runs radiatively forced by CO2 increases or control runs which assess the level of model variability.


2010 ◽  
Vol 23 (15) ◽  
pp. 4121-4132 ◽  
Author(s):  
Dorian S. Abbot ◽  
Itay Halevy

Abstract Most previous global climate model simulations could only produce the termination of Snowball Earth episodes at CO2 partial pressures of several tenths of a bar, which is roughly an order of magnitude higher than recent estimates of CO2 levels during and shortly after Snowball events. These simulations have neglected the impact of dust aerosols on radiative transfer, which is an assumption of potentially grave importance. In this paper it is argued, using the Dust Entrainment and Deposition (DEAD) box model driven by GCM results, that atmospheric dust aerosol concentrations may have been one to two orders of magnitude higher during a Snowball Earth event than today. It is furthermore asserted on the basis of calculations using NCAR's Single Column Atmospheric Model (SCAM), a radiative–convective model with sophisticated aerosol, cloud, and radiative parameterizations, that when the surface albedo is high, such increases in dust aerosol loading can produce several times more surface warming than an increase in the partial pressure of CO2 from 10⁻⁴ to 10⁻¹ bar. Therefore the conclusion is reached that including dust aerosols in simulations may reconcile the CO2 levels required for Snowball termination in climate models with observations.


2019 ◽  
Vol 13 (11) ◽  
pp. 3023-3043
Author(s):  
Julien Beaumet ◽  
Michel Déqué ◽  
Gerhard Krinner ◽  
Cécile Agosta ◽  
Antoinette Alias

Abstract. Owing to an increase in snowfall, the Antarctic Ice Sheet surface mass balance is expected to increase by the end of the current century. Assuming no associated response of ice dynamics, this will be a negative contribution to sea-level rise. However, the assessment of these changes using dynamical downscaling of coupled climate model projections still bears considerable uncertainties due to poorly represented high-southern-latitude atmospheric circulation and sea surface conditions (SSCs), that is, sea surface temperature and sea ice concentration. This study evaluates the Antarctic surface climate simulated using a global high-resolution atmospheric model and assesses the effects on the simulated Antarctic surface climate of two different SSC data sets obtained from two coupled climate model projections. The two coupled models from which SSCs are taken, MIROC-ESM and NorESM1-M, simulate future Antarctic sea ice trends at the opposite ends of the CMIP5 RCP8.5 projection range. The atmospheric model ARPEGE is used with a stretched grid configuration in order to achieve an average horizontal resolution of 35 km over Antarctica. Over the 1981–2010 period, ARPEGE is driven by the SSCs from MIROC-ESM, NorESM1-M and CMIP5 historical runs and by observed SSCs. These three simulations are evaluated against the ERA-Interim reanalysis for atmospheric general circulation as well as the MAR regional climate model and in situ observations for surface climate. For the late 21st century, SSCs from the same coupled climate models forced by the RCP8.5 emission scenario are used both directly and bias-corrected with an anomaly method, which consists of adding the future climate anomaly from the coupled model projections to the observed SSCs while taking into account the quantile distribution of these anomalies. We evaluate the effects of driving the atmospheric model by the bias-corrected instead of the original SSCs. For the simulation using SSCs from NorESM1-M, no significantly different climate change signals over Antarctica as a whole are found when bias-corrected SSCs are used. For the simulation driven by MIROC-ESM SSCs, a significant additional increase in precipitation and in winter temperatures for the Antarctic Ice Sheet is obtained when using bias-corrected SSCs. For the range of Antarctic warming found (+3 to +4 K), we confirm that the snowfall increase will largely outweigh increases in melt and rainfall. Using the end members of sea ice trends from the CMIP5 RCP8.5 projections, the difference in warming obtained (∼1 K) is much smaller than the spread of the CMIP5 Antarctic warming projections. This confirms that errors in representing the Southern Hemisphere atmospheric circulation in climate models are also a key determinant of the diversity of their projected late-21st-century Antarctic climate change.
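The quantile-aware anomaly method can be sketched roughly as follows: the coupled model's change signal is computed quantile by quantile (future minus historical) and added to each observed value according to that value's empirical quantile. The function below is a simplified illustration of that idea on synthetic SST samples; the paper's exact procedure, including its handling of sea ice concentration, is not reproduced.

```python
import numpy as np

def quantile_anomaly_correction(obs, model_hist, model_fut, n_quantiles=100):
    """Bias-corrected future values: take the coupled model's change signal quantile by
    quantile and add it to the observed distribution at the matching quantile."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    anomaly = np.quantile(model_fut, q) - np.quantile(model_hist, q)   # change per quantile
    obs_rank = np.searchsorted(np.sort(obs), obs) / (len(obs) - 1)     # empirical quantile of each obs value
    return obs + np.interp(obs_rank, q, anomaly)

rng = np.random.default_rng(1)
obs = rng.normal(2.0, 1.0, 1000)      # observed SSTs (degC), hypothetical
hist = rng.normal(1.0, 1.2, 1000)     # coupled model, historical period
fut = rng.normal(3.5, 1.5, 1000)      # coupled model, late-century scenario
print(quantile_anomaly_correction(obs, hist, fut).mean() - obs.mean())   # ~2.5 K warming applied
```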


2020 ◽  
Vol 172 ◽  
pp. 02006
Author(s):  
Hamed Hedayatnia ◽  
Marijke Steeman ◽  
Nathan Van Den Bossche

Understanding how climate change accelerates or slows down the process of material deterioration is the first step towards assessing adaptive approaches for the preservation of historical heritage. Analysis of the effects of climate change on degradation risk assessment parameters such as salt crystallization cycles is of crucial importance when considering mitigating actions. Due to the vulnerability of cultural heritage in Iran to climate change, the impact of this phenomenon on basic climate parameters, as well as on variables more critical to building damage such as the salt crystallization index, needs to be analyzed. Regional climate model projections can be used to assess the impact of climate change on heritage. The output of two different regional climate models, the ALARO-0 model (Ghent University-RMI, Belgium) and the REMO model (HZG-GERICS, Germany), is analyzed to find out which model is better adapted to the region. The focus of this research is therefore on evaluating the reliability of both models over the region. For model validation, a comparison between model data and observations was performed for four different climate zones over 30 years to find out how reliable these models are for applications in building pathology.
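Such a validation typically reduces to a handful of agreement metrics between co-located model and observed series for each climate zone. The snippet below computes generic bias, RMSE, and correlation on synthetic monthly temperatures; it illustrates the kind of comparison described rather than the study's specific protocol or variables.

```python
import numpy as np

def validation_stats(model, obs):
    """Basic agreement metrics between a model time series and co-located observations:
    mean bias, root-mean-square error, and Pearson correlation."""
    model, obs = np.asarray(model), np.asarray(obs)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(model, obs)[0, 1]
    return {"bias": bias, "rmse": rmse, "corr": corr}

# e.g. 30 years of monthly mean temperature for one climate zone (hypothetical data)
rng = np.random.default_rng(2)
obs = 15 + 10 * np.sin(np.linspace(0, 60 * np.pi, 360)) + rng.normal(0, 1, 360)
mod = obs + 1.2 + rng.normal(0, 1.5, 360)          # a model with a warm bias
print(validation_stats(mod, obs))
```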


2001 ◽  
Vol 8 (4/5) ◽  
pp. 201-209 ◽  
Author(s):  
V. P. Dymnikov ◽  
A. S. Gritsoun

Abstract. In this paper we discuss some theoretical results obtained for climate models (theorems for the existence of global attractors and inertial manifolds, estimates of attractor dimension and Lyapunov exponents, and the symmetry property of the Lyapunov spectrum). We define the conditions for "quasi-regular behaviour" of a climate system. Under these conditions, the system behaviour is subject to the Kraichnan fluctuation-dissipation relation. This fact allows us to solve the problem of determining a system's sensitivity to small perturbations of an external forcing. The applicability of the above approach to the analysis of climate system sensitivity is verified numerically with the example of the two-layer quasi-geostrophic atmospheric model.
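In the quasi-Gaussian setting, the fluctuation-dissipation relation lets one estimate the linear response operator directly from a long unperturbed trajectory, as L = ∫ C(τ) C(0)⁻¹ dτ with C(τ) the lag-covariance matrix of the state; the mean response to a small constant forcing δf is then approximately L δf. The sketch below is a straightforward numerical version of that formula, not the authors' code; the truncation of the lag integral and the pseudo-inverse regularisation are choices made here.

```python
import numpy as np

def fdt_response_operator(x, max_lag, dt):
    """Estimate the linear response operator L = sum_tau C(tau) C(0)^{-1} dt from a long
    trajectory x (time x state). Under the quasi-Gaussian fluctuation-dissipation
    assumption, the mean response to a small constant forcing df is roughly L @ df."""
    anom = x - x.mean(axis=0)                             # anomalies about the time mean
    n, d = anom.shape
    c0_inv = np.linalg.pinv(anom.T @ anom / n)            # C(0)^{-1}, pseudo-inverse for stability
    L = np.zeros((d, d))
    for lag in range(max_lag):
        c_lag = anom[lag:].T @ anom[:n - lag] / (n - lag) # lag-covariance C(lag)
        L += c_lag @ c0_inv * dt
    return L
```

Applied to a long trajectory of, for instance, the two-layer quasi-geostrophic model mentioned above, L @ df then predicts the shift of the time-mean state under a small constant forcing df.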

