Metrics and Diagnostics for Precipitation-Related Processes in Climate Model Short-Range Hindcasts

2013 ◽  
Vol 26 (5) ◽  
pp. 1516-1534 ◽  
Author(s):  
H.-Y. Ma ◽  
S. Xie ◽  
J. S. Boyle ◽  
S. A. Klein ◽  
Y. Zhang

Abstract In this study, several metrics and diagnostics are proposed and implemented to systematically explore and diagnose climate model biases in short-range hindcasts and to quantify how quickly hindcast biases approach climate biases, with an emphasis on tropical precipitation and associated moist processes. A series of 6-day hindcasts with the NCAR and U.S. Department of Energy Community Atmosphere Model, version 4 (CAM4) and version 5 (CAM5), was performed, initialized with the ECMWF operational analysis every day at 0000 UTC during the Year of Tropical Convection (YOTC). An Atmospheric Model Intercomparison Project (AMIP)-type ensemble of climate simulations was also conducted for the same period. The analyses indicate that initial drifts in precipitation and associated moisture processes (“fast processes”) can be identified in the hindcasts, and the biases bear a strong resemblance to those in the climate runs. Compared to Tropical Rainfall Measuring Mission (TRMM) observations, the model hindcasts produce too high a probability of low- to intermediate-intensity precipitation at daily time scales during northern summers, consistent with convection being triggered too frequently by the deep convection scheme. For intense precipitation events (>25 mm day−1), however, the model produces a much lower probability, partially because it requires a much higher column relative humidity than observed to produce similar precipitation intensity, as indicated by the proposed diagnostics. Regional analysis of the precipitation bias in the hindcasts is also performed for two selected locations where most contemporary climate models show the same sign of bias. Based on moist static energy diagnostics, the results suggest that biases in the moisture and temperature fields near the surface and in the lower and middle troposphere are primarily responsible for the precipitation biases.
These analyses demonstrate the usefulness of these metrics and diagnostics to diagnose climate model biases.
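
The kind of precipitation–humidity diagnostic described above can be sketched as conditional statistics: bin daily precipitation by column relative humidity and compute, per bin, the mean intensity and the probability of intense events. This is an illustrative reconstruction, not the authors' exact metric; the function name, bin edges, and reuse of the 25 mm day−1 threshold are assumptions.

```python
import numpy as np

def precip_vs_crh(crh, precip, bins=None, intense=25.0):
    """Bin daily precipitation (mm/day) by column relative humidity (0-1)
    and return, per bin, the conditional mean intensity and the
    probability of exceeding the `intense` threshold."""
    crh = np.asarray(crh, dtype=float)
    precip = np.asarray(precip, dtype=float)
    if bins is None:
        bins = np.linspace(0.0, 1.0, 11)   # ten 0.1-wide CRH bins
    idx = np.digitize(crh, bins) - 1       # bin index of each day
    nb = len(bins) - 1
    mean_p = np.full(nb, np.nan)
    p_intense = np.full(nb, np.nan)
    for b in range(nb):
        sel = precip[idx == b]
        if sel.size:
            mean_p[b] = sel.mean()
            p_intense[b] = (sel > intense).mean()
    return mean_p, p_intense
```

Comparing these curves between model hindcasts and TRMM-like observations would reveal, for instance, a model that needs a higher column relative humidity than observed to reach the same rain rate.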

2008 ◽  
Vol 136 (3) ◽  
pp. 808-832 ◽  
Author(s):  
J. Boyle ◽  
S. Klein ◽  
G. Zhang ◽  
S. Xie ◽  
X. Wei

Abstract Short-term (1–10 day) forecasts are made with climate models to assess the parameterizations of the physical processes. The time period for the integrations is that of the intensive observing period (IOP) of the Tropical Ocean Global Atmosphere Coupled Ocean–Atmosphere Response Experiment (TOGA COARE). The models used are the National Center for Atmospheric Research (NCAR) Community Climate Model, version 3.1 (CAM3.1); CAM3.1 with a modified deep convection parameterization; and the Geophysical Fluid Dynamics Laboratory (GFDL) Atmospheric Model, version 2 (AM2). The models were initialized using the state variables from the 40-yr ECMWF Re-Analysis (ERA-40). The CAM deep convective parameterization lacks sufficient sensitivity to the imposed forcing to simulate the precipitation patterns associated with the Madden–Julian oscillations (MJOs) present during the period. AM2 and the modified CAM3.1 exhibit greater correspondence to the observations at the TOGA COARE site, suggesting that convective parameterizations with some type of limiter (as in AM2 and the modified CAM3.1) simulate the MJO rainfall with more fidelity than those without. None of the models is able to fully capture the correct phasing of westerly wind bursts with respect to precipitation in the eastward-moving MJO disturbance. Better representation of the diabatic heating and effective static stability profiles is associated with a better MJO simulation. Because the models’ errors in forecast mode resemble their errors in climate mode in simulating the MJO, the forecasts may offer a better way to dissect the reasons for model error.


2020 ◽  
Vol 33 (5) ◽  
pp. 1915-1933 ◽  
Author(s):  
Jesús Vergara-Temprado ◽  
Nikolina Ban ◽  
Davide Panosetti ◽  
Linda Schlemmer ◽  
Christoph Schär

Abstract The “gray zone” of convection is defined as the range of horizontal grid spacings at which convective processes are partially but not fully resolved explicitly by the model dynamics (typically estimated from a few kilometers to a few hundred meters). The representation of convection at these scales is challenging, as either parameterizing convective processes or relying on the model dynamics to resolve them may cause systematic model biases. Here, a regional climate model over a large European domain is used to study model biases when either using parameterizations of deep and shallow convection or representing convection explicitly. For this purpose, year-long simulations at horizontal resolutions between 50- and 2.2-km grid spacing are performed and evaluated with datasets of precipitation, surface temperature, and top-of-the-atmosphere radiation over Europe. While simulations with parameterized convection perform more favorably than those with explicit convection at around 50-km resolution, at higher resolutions (grid spacing ≤ 25 km) models tend to perform similarly, or even better on certain skill metrics, when deep convection is turned off. At these finer scales, the representation of deep convection has a larger effect on model performance than changes in resolution when looking at hourly precipitation statistics and the representation of the diurnal cycle, especially over nonorographic regions. The shortwave net radiative balance at the top of the atmosphere is the variable most strongly affected by resolution changes, owing to the better representation of cloud dynamical processes at higher resolutions. These results suggest that an explicit representation of convection may be beneficial in representing some aspects of climate over Europe at much coarser resolutions than previously thought, thereby reducing some of the uncertainties that derive from parameterizing deep convection.


2021 ◽  
Author(s):  
Christian Zeman ◽  
Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about Earth's atmosphere. Many of the changes in today's models relate to seemingly innocuous modifications, associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet verifying them is challenged by the chaotic nature of the atmosphere: any small change, even a rounding error, can have a large impact on an individual simulation. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess the effects of model changes on almost any output variable over time and can be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
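
A minimal sketch of this kind of ensemble-based verification, assuming a two-sample Kolmogorov–Smirnov statistic with a permutation-derived p-value (the paper's actual test statistic and implementation may differ): if the ensembles before and after a supposedly neutral change are statistically indistinguishable, the null hypothesis of an unchanged model climate is not rejected.

```python
import numpy as np

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum distance
    between the empirical CDFs of the two ensembles."""
    a, b = np.sort(a), np.sort(b)
    data = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, data, side="right") / len(a)
    cdf_b = np.searchsorted(b, data, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def permutation_pvalue(a, b, n_perm=999, seed=0):
    """p-value for H0 'both ensembles sample the same distribution',
    estimated by randomly reassigning members to the two groups."""
    rng = np.random.default_rng(seed)
    obs = ks_stat(a, b)
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if ks_stat(pooled[:len(a)], pooled[len(a):]) >= obs:
            count += 1
    return (count + 1) / (n_perm + 1)
```

Applied per output variable and time step, a rejection rate far above the chosen significance level would flag a change as non-neutral.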


2018 ◽  
Vol 115 (18) ◽  
pp. 4577-4582 ◽  
Author(s):  
Kathleen A. Schiro ◽  
Fiaz Ahmed ◽  
Scott E. Giangrande ◽  
J. David Neelin

A substantial fraction of precipitation is associated with mesoscale convective systems (MCSs), which are currently poorly represented in climate models. Convective parameterizations are highly sensitive to the assumptions of an entraining plume model, in which high equivalent potential temperature air from the boundary layer is modified via turbulent entrainment. Here we show, using multi-instrument evidence from the Green Ocean Amazon field campaign (2014–2015; GoAmazon2014/5), that an empirically constrained weighting for inflow of environmental air based on radar wind profiler estimates of vertical velocity and mass flux yields a strong relationship between resulting buoyancy measures and precipitation statistics. This deep-inflow weighting has no free parameter for entrainment in the conventional sense, but to a leading approximation is simply a statement of the geometry of the inflow. The structure further suggests the weighting could consistently apply even for coherent inflow structures noted in field campaign studies for MCSs over tropical oceans. For radar precipitation retrievals averaged over climate model grid scales at the GoAmazon2014/5 site, the use of deep-inflow mixing yields a sharp increase in the probability and magnitude of precipitation with increasing buoyancy. Furthermore, this applies for both mesoscale and smaller-scale convection. Results from reanalysis and satellite data show that this holds more generally: Deep-inflow mixing yields a strong precipitation–buoyancy relation across the tropics. Deep-inflow mixing may thus circumvent inadequacies of current parameterizations while helping to bridge the gap toward representing mesoscale convection in climate models.
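
The core idea of deep-inflow weighting can be sketched in a few lines: the plume's thermodynamic properties at each level are an average of the environmental air below, weighted by the local mass-flux increase, so mixing follows from the inflow geometry rather than from a tunable entrainment parameter. This is a simplified illustration under idealized assumptions (discrete layers of equal thickness, equivalent potential temperature as the conserved variable), not the authors' exact formulation.

```python
import numpy as np

def deep_inflow_thetae(dz, dmdz, thetae_env):
    """Plume equivalent potential temperature per level as the
    dM/dz-weighted average of environmental theta-e at and below
    that level (deep-inflow weighting, sketch)."""
    w = np.maximum(np.asarray(dmdz, dtype=float), 0.0) * dz  # inflow per layer
    num = np.cumsum(w * np.asarray(thetae_env, dtype=float))
    den = np.cumsum(w)
    return num / den   # assumes nonzero inflow at the lowest level
```

With a linearly increasing mass flux and a theta-e profile that decreases with height, the plume theta-e decreases more slowly than the environment, as expected for a deeply entraining updraft.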


1998 ◽  
Vol 27 ◽  
pp. 565-570 ◽  
Author(s):  
William M. Connolley ◽  
Siobhan P. O'Farrell

We compare observed temperature variations in Antarctica with climate-model runs over the last century. The models used are three coupled global climate models (GCMs) — the UKMO, the CSIRO, and the MPI — forced by the CO2 increases observed over the last century, and an atmospheric model experiment forced with observed sea-surface temperatures and sea-ice extents over the last century. Despite some regions of agreement, in general the GCM runs appear to be incompatible with each other and with the observations, although the short observational record and high natural variability make verification difficult. One of the best places for a more detailed study is the Antarctic Peninsula, where the density of stations is higher and station records are longer than elsewhere in Antarctica. Observations show that this area has seen larger temperature rises than anywhere else in Antarctica. None of the three GCMs simulates such large temperature changes in the Peninsula region, in either climate-change runs radiatively forced by CO2 increases or control runs that assess the level of model variability.


2010 ◽  
Vol 23 (15) ◽  
pp. 4121-4132 ◽  
Author(s):  
Dorian S. Abbot ◽  
Itay Halevy

Abstract Most previous global climate model simulations could only produce the termination of Snowball Earth episodes at CO2 partial pressures of several tenths of a bar, which is roughly an order of magnitude higher than recent estimates of CO2 levels during and shortly after Snowball events. These simulations have neglected the impact of dust aerosols on radiative transfer, which is an assumption of potentially grave importance. In this paper it is argued, using the Dust Entrainment and Deposition (DEAD) box model driven by GCM results, that atmospheric dust aerosol concentrations may have been one to two orders of magnitude higher during a Snowball Earth event than today. It is furthermore asserted on the basis of calculations using NCAR’s Single Column Atmospheric Model (SCAM)—a radiative–convective model with sophisticated aerosol, cloud, and radiative parameterizations—that when the surface albedo is high, such increases in dust aerosol loading can produce several times more surface warming than an increase in the partial pressure of CO2 from 10−4 to 10−1 bar. Therefore the conclusion is reached that including dust aerosols in simulations may reconcile the CO2 levels required for Snowball termination in climate models with observations.


2018 ◽  
Vol 31 (16) ◽  
pp. 6591-6610 ◽  
Author(s):  
Martin Aleksandrov Ivanov ◽  
Jürg Luterbacher ◽  
Sven Kotlarski

Climate change impact research and risk assessment require accurate estimates of the climate change signal (CCS). Raw climate model data include systematic biases that affect the CCS of high-impact variables such as daily precipitation and wind speed. This paper presents a novel, general, and extensible analytical theory of the effect of these biases on the CCS of the distribution mean and quantiles. The theory reveals that misrepresented model intensities and probabilities of nonzero (positive) events have the potential to distort raw model CCS estimates. We test the analytical description in a challenging application of bias correction and downscaling to daily precipitation over alpine terrain, where the output of 15 regional climate models (RCMs) is downscaled to local weather stations. The theoretically predicted CCS modification approximates the modification by the bias correction method well, even for the station–RCM combinations with the largest absolute modifications. These results demonstrate that the CCS modification by bias correction is a direct consequence of removing model biases. Therefore, provided that application of intensity-dependent bias correction is scientifically appropriate, the CCS modification should be a desirable effect. The analytical theory can be used as a tool to 1) detect model biases with a high potential to distort the CCS and 2) efficiently generate novel, improved CCS datasets. The latter are highly relevant for the development of appropriate climate change adaptation, mitigation, and resilience strategies. Future research needs to focus on developing process-based bias corrections that depend on simulated intensities rather than on preserving the raw model CCS.
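
The effect described here can be illustrated with empirical quantile mapping, a generic intensity-dependent bias correction (not necessarily the method analyzed in the paper): when the model bias depends on intensity, correcting it necessarily modifies the CCS of the mean. All names below are illustrative.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: a transfer function fitted on the
    historical period maps model values onto observed quantiles; the
    same function is then applied to future model values."""
    q = np.linspace(0.0, 1.0, 101)
    mq = np.quantile(model_hist, q)   # must be increasing for np.interp
    oq = np.quantile(obs_hist, q)
    return np.interp(model_fut, mq, oq)

def ccs_mean(hist, fut):
    """Climate change signal of the distribution mean."""
    return fut.mean() - hist.mean()
```

For example, with a purely multiplicative model bias and an additive future change, the corrected CCS of the mean is roughly the raw CCS divided by the bias factor, which is exactly the kind of modification the analytical theory predicts.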


2019 ◽  
Vol 13 (11) ◽  
pp. 3023-3043
Author(s):  
Julien Beaumet ◽  
Michel Déqué ◽  
Gerhard Krinner ◽  
Cécile Agosta ◽  
Antoinette Alias

Abstract. Owing to the projected increase in snowfall, the Antarctic Ice Sheet surface mass balance is expected to increase by the end of the current century. Assuming no associated response of ice dynamics, this would be a negative contribution to sea-level rise. However, the assessment of these changes using dynamical downscaling of coupled climate model projections still bears considerable uncertainties due to poorly represented high-southern-latitude atmospheric circulation and sea surface conditions (SSCs), that is, sea surface temperature and sea ice concentration. This study evaluates the Antarctic surface climate simulated by a global high-resolution atmospheric model and assesses the effects on the simulated Antarctic surface climate of two different SSC data sets obtained from two coupled climate model projections. The two coupled models from which the SSCs are taken, MIROC-ESM and NorESM1-M, simulate future Antarctic sea ice trends at the opposite ends of the CMIP5 RCP8.5 projection range. The atmospheric model ARPEGE is used with a stretched-grid configuration in order to achieve an average horizontal resolution of 35 km over Antarctica. Over the 1981–2010 period, ARPEGE is driven by the SSCs from the MIROC-ESM, NorESM1-M, and CMIP5 historical runs and by observed SSCs. These three simulations are evaluated against the ERA-Interim reanalyses for the atmospheric general circulation, as well as against the MAR regional climate model and in situ observations for the surface climate. For the late 21st century, SSCs from the same coupled climate models forced by the RCP8.5 emission scenario are used both directly and bias-corrected with an anomaly method, which consists of adding the future climate anomaly from the coupled model projections to the observed SSCs while taking into account the quantile distribution of these anomalies. We evaluate the effects of driving the atmospheric model with the bias-corrected rather than the original SSCs.
For the simulation using SSCs from NorESM1-M, no significantly different climate change signal over Antarctica as a whole is found when bias-corrected SSCs are used. For the simulation driven by MIROC-ESM SSCs, a significant additional increase in precipitation and in winter temperatures over the Antarctic Ice Sheet is obtained when bias-corrected SSCs are used. For the range of Antarctic warming found (+3 to +4 K), we confirm that the snowfall increase will largely outweigh the increases in melt and rainfall. Using the end members of the sea ice trends from the CMIP5 RCP8.5 projections, the difference in warming obtained (∼ 1 K) is much smaller than the spread of the CMIP5 Antarctic warming projections. This confirms that errors in representing the Southern Hemisphere atmospheric circulation in climate models are also a determining factor in the diversity of their projected late 21st century Antarctic climate change.
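
A minimal sketch of a quantile-based anomaly method in the spirit described above (the exact scheme of the paper may differ; the function name and the number of quantiles are assumptions): the future-minus-historical change in each quantile of the coupled-model distribution is added to the corresponding quantile of the observations.

```python
import numpy as np

def anomaly_correct(obs_hist, model_hist, model_fut):
    """Return bias-corrected future quantiles: observed quantiles plus
    the coupled-model climate anomaly evaluated quantile by quantile."""
    q = np.linspace(0.0, 1.0, 101)
    anomaly = np.quantile(model_fut, q) - np.quantile(model_hist, q)
    return np.quantile(obs_hist, q) + anomaly
```

By construction, a constant model bias cancels out while the quantile-dependent structure of the projected change is preserved.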


2001 ◽  
Vol 8 (4/5) ◽  
pp. 201-209 ◽  
Author(s):  
V. P. Dymnikov ◽  
A. S. Gritsoun

Abstract. In this paper we discuss some theoretical results obtained for climate models (theorems for the existence of global attractors and inertial manifolds, estimates of attractor dimension and Lyapunov exponents, and the symmetry property of the Lyapunov spectrum). We define the conditions for "quasi-regular behaviour" of a climate system. Under these conditions, the system behaviour is subject to the Kraichnan fluctuation-dissipation relation. This fact allows us to solve the problem of determining a system's sensitivity to small perturbations of an external forcing. The applicability of this approach to the analysis of climate system sensitivity is verified numerically using the example of a two-layer quasi-geostrophic atmospheric model.
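
For a quasi-Gaussian system, the fluctuation-dissipation relation approximates the mean response to a small constant forcing by the time-integrated lagged covariance multiplied by the inverse zero-lag covariance. The sketch below estimates this response operator from an unperturbed time series (an illustrative implementation; the function name and discretization are assumptions, and the paper's quasi-geostrophic application is far richer).

```python
import numpy as np

def fdt_response_operator(x, max_lag, dt=1.0):
    """Estimate R = sum_tau C(tau) @ C(0)^{-1} * dt from a (time, nvar)
    series of anomalies; under the fluctuation-dissipation relation,
    delta<x> ~= R @ delta_f for a small constant forcing delta_f."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean(axis=0)                  # work with anomalies
    n, d = x.shape
    c0_inv = np.linalg.inv(x.T @ x / n)     # inverse zero-lag covariance
    R = np.zeros((d, d))
    for lag in range(max_lag):
        c_lag = x[lag:].T @ x[:n - lag] / (n - lag)   # lagged covariance
        R += c_lag @ c0_inv * dt
    return R
```

For a linear stochastic (Ornstein–Uhlenbeck-like) system the estimate converges to the exact response operator, which is what makes the relation useful for sensitivity analysis without rerunning the model under perturbed forcing.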


2020 ◽  
Author(s):  
Julia Hargreaves ◽  
James Annan

Paleoclimate simulations are widely used to test the ability of climate models to simulate climate states that are substantially different from the present day, and quantitative reconstructions of these climate states are an essential component of model evaluation. With no large network of instrumental observations from these periods, we must rely on inferences from a relatively modest number of unevenly distributed proxy records that are believed to be quantitatively indicative of the climate state. In order to robustly establish climatic conditions over global scales, we require methods for smoothing and interpolating between these sparse and imperfect estimates. In recent years, we have worked on this problem and created a global reconstruction of the Last Glacial Maximum [Annan and Hargreaves, 2013, Climate of the Past] using the data and models that were available at that time. The method uses scaled patterns from the PMIP ensemble of structurally diverse climate simulations, combined with sparse sets of proxy data, to produce spatially coherent and complete data fields for surface air and sea temperatures (potentially including the seasonal cycle), along with uncertainty estimates over the whole field. This approach is more robust than alternative methods, which either perform a purely statistical interpolation of the data or at best combine the data with a single climate model. Here, we aim to improve the method, update the inputs, and apply the same technique to both the Last Glacial Maximum and mid-Pliocene climate intervals. As well as generating spatially complete and coherent maps of climate variables, our approach also generates well-calibrated uncertainty estimates.
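
The core of a scaled-pattern approach can be sketched as a least-squares fit: the ensemble patterns are fitted to the proxy values at the site locations, and the same coefficients then combine the full patterns into a spatially complete field. This is a bare-bones illustration; the actual method also propagates proxy and pattern uncertainties, which are omitted here, and all names are assumptions.

```python
import numpy as np

def pattern_scale_reconstruct(patterns_at_sites, proxy_values, patterns_full):
    """Fit coefficients of the ensemble patterns to the proxy values at
    the site locations by least squares, then combine the full patterns
    with the same coefficients to obtain a complete field."""
    coeff, *_ = np.linalg.lstsq(patterns_at_sites, proxy_values, rcond=None)
    return patterns_full @ coeff
```

In practice the fit would be regularized and weighted by proxy uncertainties, yielding the calibrated uncertainty estimates mentioned above.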

