How Well Can We Represent the Spectrum of Convective Clouds in a Climate Model? Comparisons between Internal Parameterization Variables and Radar Observations

2018 · Vol. 75 (5) · pp. 1509–1524
Author(s): Laurent Labbouz, Zak Kipling, Philip Stier, Alain Protat

Current climate models cannot resolve individual convective clouds, and hence parameterizations are needed. The primary goal of convective parameterization is to represent the bulk impact of convection on the gridbox-scale variables. Spectral convective parameterizations also aim to represent the key features of the subgrid-scale convective cloud field, such as the cloud-top-height distribution and in-cloud vertical velocities, in addition to precipitation rates. Ground-based radar retrievals of these quantities have been made available at Darwin, Australia, permitting direct comparisons with internal parameterization variables and providing new observational references for further model development. A spectral convective parameterization [the convective cloud field model (CCFM)] is discussed, and its internal equation of motion is improved. Results from the ECHAM–HAM model in single-column mode using the CCFM and the bulk mass flux Tiedtke–Nordeng scheme are compared with the radar retrievals at Darwin. The CCFM is found to outperform the Tiedtke–Nordeng scheme for cloud-top-height and precipitation-rate distributions. Radar observations are further used to propose a modified CCFM configuration with an aerodynamic drag term and a reduced entrainment parameter, further improving both the convective cloud-top-height distribution (important for the large-scale impact of convection) and the in-cloud vertical velocities (important for aerosol activation). This study provides a new development of the CCFM, improving the representation of the convective cloud spectrum characteristics observed at Darwin. It is a step toward an improved representation of convection and, ultimately, of aerosol effects on convection. It also shows how long-term radar observations of convective cloud properties can help constrain parameters of convective parameterization schemes.
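For orientation, a generic entraining-parcel vertical-velocity equation of the kind modified here is sketched below; the coefficients and notation are illustrative, not the exact CCFM formulation.

```latex
% Illustrative entraining-parcel equation of motion with an aerodynamic drag
% term; coefficients are generic, not the exact CCFM update.
\frac{1}{2}\,\frac{\mathrm{d} w^{2}}{\mathrm{d} z}
  = a\,B \;-\; \mu\, w^{2} \;-\; \frac{3}{8}\,\frac{C_{D}}{R}\, w^{2},
\qquad
B = g\,\frac{T_{v,\mathrm{cld}} - T_{v,\mathrm{env}}}{T_{v,\mathrm{env}}}
```

Here w is the in-cloud vertical velocity, B the buoyancy, μ the fractional entrainment rate, C_D the drag coefficient and R the cloud radius; the reduced entrainment parameter and the added aerodynamic drag discussed in the abstract act through μ and C_D, respectively.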

2019
Author(s): Zak Kipling, Laurent Labbouz, Philip Stier

Abstract. The interactions between aerosols and convective clouds represent some of the greatest uncertainties in the climate impact of aerosols in the atmosphere. A wide variety of mechanisms have been proposed by which aerosols may invigorate, suppress, or change the properties of individual convective clouds, some of which can be reproduced in high-resolution limited-area models. However, there may also be mesoscale, regional or global adjustments which modulate or dampen such impacts and which cannot be captured in the limited domain of such models. The Convective Cloud Field Model (CCFM) provides a mechanism to explicitly simulate a population of convective clouds within each grid column at resolutions used for global climate modelling, so that a representation of the microphysical aerosol response within each parameterised cloud type is possible. Using CCFM within the global aerosol–climate model ECHAM–HAM, we demonstrate how the parameterised cloud field responds to the present-day anthropogenic aerosol perturbation in different regions. In particular, we show that in regions with strongly forced deep convection and/or significant aerosol effects via large-scale processes, the changes in the convective cloud field due to microphysical effects are rather small; however, in a more weakly forced regime such as the Caribbean, where large-scale aerosol effects are small, a signature of convective invigoration does become apparent.


2020 · Vol. 20 (7) · pp. 4445–4460
Author(s): Zak Kipling, Laurent Labbouz, Philip Stier

Abstract. The interactions between aerosols and convective clouds represent some of the greatest uncertainties in the climate impact of aerosols in the atmosphere. A wide variety of mechanisms have been proposed by which aerosols may invigorate, suppress or change the properties of individual convective clouds, some of which can be reproduced in high-resolution limited-area models. However, there may also be mesoscale, regional or global adjustments which modulate or dampen such impacts and which cannot be captured in the limited domain of such models. The Convective Cloud Field Model (CCFM) provides a mechanism to simulate a population of convective clouds, complete with microphysics and interactions between clouds, within each grid column at resolutions used for global climate modelling, so that a representation of the microphysical aerosol response within each parameterised cloud type is possible. Using CCFM within the global aerosol–climate model ECHAM–HAM, we demonstrate how the parameterised cloud field responds to the present-day anthropogenic aerosol perturbation in different regions. In particular, we show that in regions with strongly forced deep convection and/or significant aerosol effects via large-scale processes, the changes in the convective cloud field due to microphysical effects are rather small; however, in a more weakly forced regime such as the Caribbean, where large-scale aerosol effects are small, a signature of convective invigoration does become apparent.


2021
Author(s): Christian Zeman, Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about the Earth's atmosphere. Many of the changes in today's models relate to seemingly innocuous modifications associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet verifying them is challenged by the chaotic nature of the atmosphere: any small change, even a rounding error, can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess the effects of model changes on almost any output variable over time, and it can be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology is also highly sensitive to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
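As a rough illustration of the ensemble-plus-hypothesis-test idea, here is a minimal sketch assuming two member-by-time arrays of a scalar diagnostic; the Mann–Whitney U test and the 5 % significance level stand in for whatever test and threshold are used operationally.

```python
# Sketch: test whether a minor model/system change leaves the simulated
# climate statistically unchanged, using two ensembles and a hypothesis test.
# Assumptions: 'reference' and 'modified' are (n_members, n_times) arrays of a
# scalar diagnostic (e.g. domain-mean 2 m temperature); the Mann-Whitney U
# test and the 5 % level stand in for the operational choices.
import numpy as np
from scipy.stats import mannwhitneyu

def consistent_fraction(reference, modified, alpha=0.05):
    """Fraction of time steps at which the two ensembles are statistically
    indistinguishable at significance level alpha."""
    n_times = reference.shape[1]
    n_pass = 0
    for t in range(n_times):
        _, p_value = mannwhitneyu(reference[:, t], modified[:, t],
                                  alternative="two-sided")
        if p_value >= alpha:      # fail to reject: ensembles look alike
            n_pass += 1
    return n_pass / n_times

# Toy usage: a 50-member ensemble perturbed only at rounding-error level.
rng = np.random.default_rng(0)
reference = rng.normal(285.0, 0.5, size=(50, 100))
modified = reference + rng.normal(0.0, 1e-6, size=reference.shape)
print(f"consistent at {consistent_fraction(reference, modified):.0%} of time steps")
```

For a benign change such as the rounding-level perturbation above, roughly (1 - alpha) of the time steps should pass; a genuine change in model formulation would drive this fraction down.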


2008 · Vol. 21 (22) · pp. 6052–6059
Author(s): B. Timbal, P. Hope, S. Charles

Abstract. The consistency between rainfall projections obtained from direct climate model output and from statistical downscaling is evaluated. Results are averaged across an area large enough to overcome the difference in spatial scale between these two types of projections and thus make the comparison meaningful. Undertaking the comparison using a suite of state-of-the-art coupled climate models for two forcing scenarios presents a unique opportunity to test whether statistical linkages established between large-scale predictors and local rainfall under the current climate remain valid in future climatic conditions. The study focuses on the southwest corner of Western Australia, a region that has experienced recent winter rainfall declines and for which climate models project, with great consistency, further winter rainfall reductions due to global warming. Results show that, as a first approximation, the magnitude of the modeled rainfall decline in this region is linearly related to the modeled global warming (a reduction of about 9% per degree), thus linking future rainfall declines to future emission paths. Two statistical downscaling techniques are used to investigate the influence of the choice of technique on projection consistency. In addition, one of the techniques was assessed using different large-scale forcings to investigate the impact of large-scale predictor selection. Downscaled and direct model projections are consistent across the large number of models and the two scenarios considered; that is, neither shows a systematic bias, and there is only a small hint that large rainfall declines are reduced in the downscaled projections. Of the two techniques, a nonhomogeneous hidden Markov model provides greater consistency with climate models than an analog approach. The differences were due to the choice of the optimal combination of predictors. Statistically downscaled projections therefore require a careful choice of large-scale predictors in order to be consistent with physically based rainfall projections. In particular, a relative humidity moisture predictor, rather than specific humidity, was needed for downscaled projections to be consistent with direct model output projections.
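Stated as a formula (the notation here is illustrative, not the paper's), the reported first-order scaling is

```latex
% First-order scaling reported in the abstract; symbols are illustrative.
\frac{\Delta P}{P} \;\approx\; -\,0.09\,\mathrm{K^{-1}} \times \Delta T_{\mathrm{global}}
```

where ΔT_global is the model's global-mean warming and ΔP/P the fractional change in southwest Western Australian winter rainfall; a modeled warming of 2 K would thus correspond to roughly an 18% rainfall reduction.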


2021
Author(s): Gunter Stober, Ales Kuchar, Dimitry Pokhotelov, Huixin Liu, Han-Li Liu, ...

Abstract. Long-term and continuous observations of mesospheric/lower thermospheric winds are rare, but they are important for investigating climatological changes at these altitudes on time scales of several years, covering a solar cycle and longer. Such long time series are a natural heritage of the mesosphere/lower thermosphere climate, and they are valuable for evaluating climate models or long-term runs of general circulation models (GCMs). Here we present a climatological comparison of wind observations from six meteor radars at two conjugate latitudes to validate the corresponding mean winds and atmospheric diurnal and semidiurnal tides from three GCMs, namely the Ground-to-Topside Model of Atmosphere and Ionosphere for Aeronomy (GAIA), the Whole Atmosphere Community Climate Model Extension (Specified Dynamics) (WACCM-X(SD)), and the Upper Atmosphere ICOsahedral Non-hydrostatic (UA-ICON) model. Our results indicate that there are interhemispheric differences in the seasonal characteristics of the diurnal and semidiurnal tides, as well as some differences between the mean wind climatologies of the models and the observations. GAIA shows reasonable agreement with the meteor radar observations during the winter season, whereas WACCM-X(SD) reproduces the hemispheric zonal summer wind reversal in better agreement with the radar observations. The free-running UA-ICON tends to show winds and tides similar to those of WACCM-X(SD).
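A minimal sketch of how mean winds and tidal amplitudes can be estimated from hourly radar winds by least squares is given below; the fixed 24 h and 12 h harmonics and the simple fit window are assumptions and do not reproduce the authors' actual processing chain.

```python
# Sketch: least-squares fit of a mean wind plus diurnal (24 h) and
# semidiurnal (12 h) harmonics to hourly zonal winds. The simple global fit
# is an assumption; operational chains use windowed/adaptive fits.
import numpy as np

def fit_tides(t_hours, wind):
    """Fit u(t) = u0 + sum_k [a_k cos(w_k t) + b_k sin(w_k t)] for the
    diurnal and semidiurnal periods. Returns (mean wind, amp_24h, amp_12h)."""
    omegas = [2 * np.pi / 24.0, 2 * np.pi / 12.0]
    cols = [np.ones_like(t_hours)]
    for w in omegas:
        cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, wind, rcond=None)
    amps = [np.hypot(coef[1 + 2 * i], coef[2 + 2 * i]) for i in range(len(omegas))]
    return coef[0], amps[0], amps[1]

# Toy usage: synthetic wind with a 20 m/s diurnal and 35 m/s semidiurnal tide.
t = np.arange(0.0, 24 * 10, 1.0)                       # 10 days, hourly
u = 10 + 20 * np.cos(2 * np.pi * t / 24) + 35 * np.sin(2 * np.pi * t / 12)
print(fit_tides(t, u))   # ~ (10.0, 20.0, 35.0)
```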


2021 · Vol. 17 (4) · pp. 1665–1684
Author(s): Leonore Jungandreas, Cathy Hohenegger, Martin Claussen

Abstract. Global climate models experience difficulties in simulating the northward extension of the monsoonal precipitation over north Africa during the mid-Holocene as revealed by proxy data. A common feature of these models is that they usually operate on grids that are too coarse to explicitly resolve convection, yet convection is the most essential mechanism leading to precipitation in the West African Monsoon region. Here, we investigate how the representation of tropical deep convection in the ICOsahedral Nonhydrostatic (ICON) climate model affects the meridional distribution of monsoonal precipitation during the mid-Holocene by comparing regional simulations of the summer monsoon season (July to September; JAS) with parameterized and explicitly resolved convection. In the explicitly resolved convection simulation, the more localized nature of precipitation and the absence of permanent light precipitation, as compared to the parameterized convection simulation, are closer to expectations. However, in the JAS mean, the parameterized convection simulation produces more precipitation and extends further north than the explicitly resolved convection simulation, especially between 12 and 17° N. The higher precipitation rates in the parameterized convection simulation are consistent with a stronger monsoonal circulation over land. Furthermore, the atmosphere in the parameterized convection simulation is less stably stratified and notably moister. The differences in atmospheric water vapor are the result of substantial differences in the probability distribution function of precipitation and its resulting interactions with the land surface. The parameterization of convection produces light, large-scale precipitation, keeping the soils moist and supporting the development of convection. In contrast, less frequent but locally intense precipitation events lead to high amounts of runoff in the explicitly resolved convection simulation. The stronger runoff inhibits the moistening of the soil during the monsoon season and limits the amount of water available for evaporation in the explicitly resolved convection simulation.


2020 · Vol. 13 (5) · pp. 2355–2377
Author(s): Vijay S. Mahadevan, Iulian Grindeanu, Robert Jacob, Jason Sarich

Abstract. One of the fundamental factors contributing to spatiotemporal inaccuracy in climate modeling is the mapping of solution field data between the different discretizations and numerical grids used in the coupled component models. The typical climate computational workflow involves evaluating and serializing the remapping weights during a preprocessing step; these weights are then consumed by the coupled driver infrastructure during the simulation to compute field projections. Tools like the Earth System Modeling Framework (ESMF) (Hill et al., 2004) and TempestRemap (Ullrich et al., 2013) offer the capability to generate conservative remapping weights, while the Model Coupling Toolkit (MCT) (Larson et al., 2001), which is utilized in many production climate models, exposes functionality to apply these operators in solving the coupled problem. However, such multistep processes present several hurdles in terms of the scientific workflow and impede research productivity. In order to overcome these limitations, we present a fully integrated infrastructure based on the Mesh Oriented datABase (MOAB) (Tautges et al., 2004; Mahadevan et al., 2015) library, which allows for a complete description of the numerical grids and solution data used in each submodel. Through a scalable advancing-front intersection algorithm, the supermesh of the source and target grids is computed and then used to assemble high-order, conservative, and monotonicity-preserving remapping weights between the discretization specifications. The Fortran-compatible interfaces in MOAB are utilized to directly link the submodels in the Energy Exascale Earth System Model (E3SM), enabling online remapping strategies that simplify the coupled workflow. We demonstrate the superior computational efficiency of the remapping algorithms in comparison with other state-of-the-science tools and present strong-scaling results on large-scale machines for computing remapping weights between the spectral-element atmosphere discretization and finite-volume discretizations on polygonal ocean grids.
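Once remapping weights exist, the online coupling step reduces to a sparse matrix-vector product. The sketch below illustrates that step only; the tiny weight matrix is invented for illustration and is not output from MOAB, ESMF, or TempestRemap.

```python
# Sketch: applying precomputed conservative remapping weights W to a source
# field gives the field on the target grid: f_target = W @ f_source.
# The 3-cell weights below are illustrative overlap-area fractions only.
import numpy as np
from scipy.sparse import csr_matrix

rows = np.array([0, 0, 1, 1, 2])             # target cell indices
cols = np.array([0, 1, 1, 2, 2])             # source cell indices
vals = np.array([0.6, 0.4, 0.5, 0.5, 1.0])   # overlap fractions (rows sum to 1)
W = csr_matrix((vals, (rows, cols)), shape=(3, 3))

f_source = np.array([280.0, 290.0, 300.0])   # e.g. surface temperature per cell
f_target = W @ f_source                      # remapped field on the target grid

# With rows summing to 1 a constant field is preserved exactly; conservation
# of integrals additionally requires area-weighted normalisation of W.
print(f_target)
```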


2021
Author(s): Antoine Doury, Samuel Somot, Sébastien Gadat, Aurélien Ribes, Lola Corre

Abstract. Providing reliable information on climate change at the local scale remains a challenge of prime importance for impact studies and policymakers. Here, we propose a novel hybrid downscaling method combining the strengths of both empirical statistical downscaling methods and Regional Climate Models (RCMs). The aim of this tool is to enlarge the size of high-resolution RCM simulation ensembles at low cost. We build a statistical RCM-emulator by estimating the downscaling function included in the RCM. This framework allows us to learn the relationship between large-scale predictors and a local surface variable of interest over the RCM domain in present and future climate. Furthermore, the emulator relies on a neural network architecture, which grants computational efficiency. The RCM-emulator developed in this study is trained to produce daily maps of near-surface temperature at the RCM resolution (12 km). The emulator demonstrates an excellent ability to reproduce the complex spatial structure and daily variability simulated by the RCM, and in particular the way the RCM locally refines the low-resolution climate patterns. Training in future climate appears to be a key feature of our emulator. Moreover, there is a huge computational benefit in running the emulator rather than the RCM: training the emulator takes about 2 hours on a GPU, and prediction is nearly instantaneous. However, further work is needed to improve the way the RCM-emulator reproduces some of the temperature extremes and the intensity of climate change, and to extend the proposed methodology to different regions, GCMs, RCMs, and variables of interest.
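To make the emulator idea concrete, here is a toy PyTorch sketch mapping coarse large-scale predictors to a high-resolution daily temperature map; the architecture, channel counts, and grid sizes are assumptions and are far simpler than the emulator described above.

```python
# Sketch: a toy neural-network "RCM-emulator" that maps coarse large-scale
# predictors to a high-resolution daily near-surface temperature map.
# Dimensions and layers are illustrative only.
import torch
import torch.nn as nn

class ToyEmulator(nn.Module):
    def __init__(self, n_predictors=8, fine=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_predictors, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Upsample(size=(fine, fine), mode="bilinear", align_corners=False),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):        # x: (batch, n_predictors, coarse, coarse)
        return self.net(x)       # (batch, 1, fine, fine) temperature map

# Toy usage: one day of 8 coarse predictors on a 16x16 grid -> 128x128 map.
model = ToyEmulator()
x = torch.randn(1, 8, 16, 16)
print(model(x).shape)            # torch.Size([1, 1, 128, 128])
```

In practice such a model would be trained on pairs of GCM-scale predictors and RCM output, which is what allows it to stand in for additional RCM simulations at negligible cost.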


2012 · Vol. 16 (2) · pp. 305–318
Author(s): I. Haddeland, J. Heinke, F. Voß, S. Eisner, C. Chen, ...

Abstract. Due to biases in the output of climate models, bias correction is often needed to make the output suitable for use in hydrological simulations. In most cases only the temperature and precipitation values are bias corrected, although there are often also biases in other variables such as radiation, humidity and wind speed. In this study we test to what extent these variables also need to be bias corrected. The responses of four large-scale hydrological models to radiation, humidity and wind estimates from two climate models are analysed. For the period 1971–2000 these hydrological simulations are compared to simulations using meteorological data based on observations and reanalysis, i.e. the baseline simulation. In both forcing datasets originating from climate models, precipitation and temperature are bias corrected to the baseline forcing dataset; hence, only the effects of the radiation, humidity and wind estimates are tested here. The direct use of climate model outputs results in substantially different evapotranspiration and runoff estimates compared to the baseline simulations. A simple bias correction method is implemented and tested by rerunning the hydrological models using bias-corrected radiation, humidity and wind values. The results indicate that bias correction can successfully be used to match the baseline simulations. Finally, historical (1971–2000) and future (2071–2100) model simulations using bias-corrected forcings are compared to the results using non-bias-corrected forcings. The relative changes in simulated evapotranspiration and runoff are similar for the bias-corrected and non-bias-corrected hydrological projections, although the absolute evapotranspiration and runoff numbers are often very different. The simulated relative and absolute differences when using bias-corrected versus non-bias-corrected climate model radiation, humidity and wind values are, however, smaller than the differences reported in the literature resulting from using bias-corrected versus non-bias-corrected climate model precipitation and temperature values.
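For illustration, a minimal sketch of one simple bias-correction option (per-calendar-month multiplicative scaling toward a baseline dataset) is shown below; the method, variable, and thresholds are assumptions standing in for whatever the study actually implemented.

```python
# Sketch: per-calendar-month multiplicative bias correction of a daily
# climate-model variable (e.g. radiation or wind speed) toward a baseline
# forcing dataset. This simple scaling stands in for the study's method.
import numpy as np
import pandas as pd

def monthly_scaling(model_hist, baseline, model_future=None):
    """model_hist, baseline: daily pd.Series over the same historical period.
    Returns the corrected historical series (and optionally a corrected
    future series using the same monthly factors)."""
    factors = (baseline.groupby(baseline.index.month).mean()
               / model_hist.groupby(model_hist.index.month).mean())
    corr_hist = model_hist * model_hist.index.month.map(factors).to_numpy()
    if model_future is None:
        return corr_hist
    corr_fut = model_future * model_future.index.month.map(factors).to_numpy()
    return corr_hist, corr_fut

# Toy usage: model radiation 15 % too high relative to the baseline.
idx = pd.date_range("1971-01-01", "2000-12-31", freq="D")
doy = idx.dayofyear.to_numpy()
baseline = pd.Series(200.0 + 80.0 * np.sin(2 * np.pi * doy / 365.25), index=idx)
model_hist = 1.15 * baseline
corrected = monthly_scaling(model_hist, baseline)
print(f"corrected mean {corrected.mean():.1f} vs baseline {baseline.mean():.1f}")
```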


2016 · Vol. 20 (5) · pp. 2047–2061
Author(s): Sebastiano Piccolroaz, Michele Di Lazzaro, Antonio Zarlenga, Bruno Majone, Alberto Bellin, ...

Abstract. We present HYPERstream, an innovative streamflow routing scheme based on the width function instantaneous unit hydrograph (WFIUH) theory, which is specifically designed to facilitate coupling with weather forecasting and climate models. The proposed routing scheme preserves the geomorphological dispersion of the river network when dealing with horizontal hydrological fluxes, irrespective of the computational grid size inherited from the overlying climate model providing the meteorological forcing. This is achieved by simulating routing within the river network through suitable transfer functions obtained by applying the WFIUH theory at the desired level of detail. The underlying principle is similar to the block-effective dispersion employed in groundwater hydrology, with the transfer functions representing the effect on streamflow of morphological heterogeneity at scales smaller than the computational grid. Transfer functions are constructed for each grid cell with respect to the nodes of the network where streamflow is simulated, taking advantage of the detailed morphological information contained in the digital elevation model (DEM) of the zone of interest. These characteristics make HYPERstream well suited for multi-scale applications, ranging from the catchment up to the continental scale, and for investigating extreme events (e.g., floods) that require an accurate description of routing through the river network. The routing scheme is parsimonious in its parameterization and computationally efficient, leading to a dramatic reduction of the computational effort with respect to fully gridded models at a comparable level of accuracy. HYPERstream is designed with a simple and flexible modular structure that allows any rainfall-runoff model to be coupled with the routing scheme and different hillslope processes to be represented. This makes the framework particularly suitable for massive parallelization, customization to specific user needs and preferences, and continuous development and improvement.
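The core routing idea can be sketched as a convolution of each cell's runoff with a precomputed transfer function; the kernels and array names below are illustrative and are not taken from the HYPERstream code, which derives its transfer functions from the DEM width function.

```python
# Sketch of WFIUH-style routing: streamflow at a network node is the sum,
# over contributing grid cells, of each cell's runoff convolved with a
# precomputed transfer function (unit hydrograph) encoding travel time and
# geomorphological dispersion. Kernels below are illustrative only.
import numpy as np

def route_to_node(runoff, transfer_functions):
    """runoff: (n_cells, n_steps) runoff per grid cell;
    transfer_functions: one 1-D kernel per cell, each summing to 1.
    Returns the streamflow time series at the node."""
    n_steps = runoff.shape[1]
    q = np.zeros(n_steps)
    for cell, tf in enumerate(transfer_functions):
        q += np.convolve(runoff[cell], tf)[:n_steps]
    return q

# Toy usage: two cells, the second further from the node (longer, flatter kernel).
runoff = np.zeros((2, 48))
runoff[:, 0] = 10.0                                   # an impulse of runoff
tfs = [np.array([0.5, 0.3, 0.2]),
       np.array([0.1, 0.2, 0.3, 0.25, 0.15])]
print(route_to_node(runoff, tfs))
```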

