A data-driven multi-cloud model for stochastic parametrization of deep convection

Author(s):  
J. Dorrestijn ◽  
D. T. Crommelin ◽  
J. A. Biello ◽  
S. J. Böing

Stochastic subgrid models have been proposed to capture the missing variability and correct systematic medium-term errors in general circulation models. In particular, the poor representation of subgrid-scale deep convection is a persistent problem that stochastic parametrizations are attempting to correct. In this paper, we construct such a subgrid model using data derived from large-eddy simulations (LESs) of deep convection. We use a data-driven stochastic parametrization methodology to construct a stochastic model describing a finite number of cloud states. Our model emulates, in a computationally inexpensive manner, the deep convection-resolving LES. Transitions between the cloud states are modelled with Markov chains. By conditioning the Markov chains on large-scale variables, we obtain a conditional Markov chain, which reproduces the time evolution of the cloud fractions. Furthermore, we show that the variability and spatial distribution of cloud types produced by the Markov chains become more faithful to the LES data when local spatial coupling is introduced in the subgrid Markov chains. Such spatially coupled Markov chains are equivalent to stochastic cellular automata.
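To make the conditional Markov chain idea concrete, here is a minimal sketch (not the authors' code; the state labels, the number of large-scale regimes, and the placeholder transition matrices are all assumptions) of how per-column cloud states can be advanced with a transition matrix selected by a discretized large-scale variable, and how cloud fractions are then diagnosed:

```python
# Minimal sketch of a conditional Markov chain for subgrid cloud states.
# In the paper, the transition probabilities are estimated by counting state
# transitions in the LES data conditioned on the large-scale state; the Dirichlet
# draws below are placeholders standing in for those estimates.
import numpy as np

rng = np.random.default_rng(0)

STATES = ["clear", "shallow", "congestus", "deep", "stratiform"]  # assumed labels
N_STATES = len(STATES)
N_REGIMES = 3  # e.g., weak / moderate / strong large-scale forcing (assumed)

# One row-stochastic transition matrix per large-scale regime.
T = rng.dirichlet(np.ones(N_STATES), size=(N_REGIMES, N_STATES))

def step(states, regime_index):
    """Advance every subgrid column one time step, given the large-scale regime."""
    P = T[regime_index]
    return np.array([rng.choice(N_STATES, p=P[s]) for s in states])

# Example: 100 subgrid columns, all initially clear, under the middle regime.
columns = np.zeros(100, dtype=int)
columns = step(columns, regime_index=1)
cloud_fractions = np.bincount(columns, minlength=N_STATES) / columns.size
print(dict(zip(STATES, cloud_fractions.round(2))))
```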

2010 ◽  
Vol 23 (5) ◽  
pp. 1127-1145 ◽  
Author(s):  
A. Bellucci ◽  
S. Gualdi ◽  
A. Navarra

Abstract The double–intertropical convergence zone (DI) systematic error, affecting state-of-the-art coupled general circulation models (CGCMs), is examined in the multimodel Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) ensemble of simulations of the twentieth-century climate. The aim of this study is to quantify the DI error on precipitation in the tropical Pacific, with a specific focus on the relationship between the DI error and the representation of large-scale vertical circulation regimes in climate models. The DI rainfall signal is analyzed using a regime-sorting approach for the vertical circulation regimes. Through the use of this compositing technique, precipitation events are regime sorted based on the large-scale vertical motions, as represented by the midtropospheric Lagrangian pressure tendency ω500 dynamical proxy. This methodology allows partitioning of the precipitation signal into deep and shallow convective components. Following the regime-sorting diagnosis, the total DI bias is split into an error affecting the magnitude of precipitation associated with individual convective events and an error affecting the frequency of occurrence of single convective regimes. It is shown that, despite the existing large intermodel differences, CGCMs can be ultimately grouped into a few homogeneous clusters, each featuring a well-defined rainfall–vertical circulation relationship in the DI region. Three major behavioral clusters are identified within the AR4 model ensemble: two unimodal distributions, featuring maximum precipitation under subsidence and deep convection regimes, respectively, and one bimodal distribution, displaying both components. Extending this analysis to both coupled and uncoupled (atmosphere only) AR4 simulations reveals that the DI bias in CGCMs is mainly due to the overly frequent occurrence of deep convection regimes, whereas the error on rainfall magnitude associated with individual convective events is overall consistent with errors already present in the corresponding atmosphere stand-alone simulations. A critical parameter controlling the strength of the DI systematic error is identified in the model-dependent sea surface temperature (SST) threshold leading to the onset of deep convection (THR), combined with the average SST in the southeastern Pacific. The models featuring a THR that is systematically colder (warmer) than their mean surface temperature are more (less) prone to exhibit a spurious southern intertropical convergence zone.
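For illustration, a minimal sketch of the regime-sorting decomposition (assumed variable names and toy data, not the study's diagnostics): precipitation is composited in ω500 bins, and a model-minus-observation bias is split into a frequency-of-occurrence term and a per-regime magnitude term (plus their covariation):

```python
# Regime-sort precipitation on the mid-tropospheric pressure velocity omega500 and
# decompose a precipitation bias, in the spirit of the compositing described above.
import numpy as np

def regime_sorted(precip, omega500, bins):
    """Return per-regime frequency f and mean precipitation P for each omega500 bin."""
    idx = np.digitize(omega500, bins) - 1
    n_bins = len(bins) - 1
    f = np.array([(idx == k).mean() for k in range(n_bins)])
    P = np.array([precip[idx == k].mean() if (idx == k).any() else 0.0
                  for k in range(n_bins)])
    return f, P

bins = np.linspace(-200, 200, 21)  # hPa/day, assumed bin edges
# toy data standing in for observed and modelled fields
rng = np.random.default_rng(1)
w_obs, p_obs = rng.normal(10, 40, 10000), rng.gamma(2.0, 2.0, 10000)
w_mod, p_mod = rng.normal(-5, 40, 10000), rng.gamma(2.0, 2.5, 10000)

f_o, P_o = regime_sorted(p_obs, w_obs, bins)
f_m, P_m = regime_sorted(p_mod, w_mod, bins)

# total bias = frequency-of-occurrence error + per-regime magnitude error (+ covariation)
freq_term = np.sum((f_m - f_o) * P_o)
magn_term = np.sum(f_o * (P_m - P_o))
cross_term = np.sum((f_m - f_o) * (P_m - P_o))
print(freq_term + magn_term + cross_term, np.sum(f_m * P_m) - np.sum(f_o * P_o))
```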


2011 ◽  
Vol 68 (2) ◽  
pp. 240-264 ◽  
Author(s):  
Boualem Khouider ◽  
Amik St-Cyr ◽  
Andrew J. Majda ◽  
Joseph Tribbia

Abstract The adequate representation of the dominant intraseasonal and synoptic-scale variability in the tropics, characterized by the Madden–Julian oscillation (MJO) and convectively coupled waves, is still problematic in current operational general circulation models (GCMs). Here results are presented using the next-generation NCAR GCM—the High-Order Methods Modeling Environment (HOMME)—as a dry dynamical core at a coarse resolution of about 167 km, coupled to a simple multicloud parameterization. The coupling is performed through a judicious choice of heating vertical profiles for the three cloud types—congestus, deep, and stratiform—that characterize organized tropical convection. Important control parameters that affect the types of waves that emerge are the background vertical moisture gradient and the stratiform fraction in the multicloud parameterization, which set the strength of large-scale moisture convergence and unsaturated downdrafts in the wake of deep convection, respectively. Three numerical simulations using different moisture gradients and different stratiform fractions are considered. The first experiment uses a large moisture gradient and a small stratiform fraction and provides an MJO-like example. It results in an intraseasonal oscillation of zonal wavenumber 2, moving eastward at a constant speed of roughly 5 m s−1. The second uses a weaker background moisture gradient and a large stratiform fraction and yields convectively coupled Rossby, Kelvin, and two-day waves, embedded in and interacting with each other; and the third experiment combines the small stratiform fraction and the weak background moisture gradient to yield a planetary-scale (wavenumber 1) second baroclinic Kelvin wave. While the first two experiments provide two benchmark examples that reproduce several key features of the observational record, the third is more of a demonstration of a bad MJO model solution that exhibits very unrealistic features.
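As a rough illustration of the coupling described above (an assumption-laden sketch, not the HOMME implementation), the total convective heating can be built from congestus, deep, and stratiform heating rates projected onto the first two baroclinic vertical modes:

```python
# Build a total convective heating profile from the three cloud-type heating rates.
# Deep convection projects onto the first baroclinic mode; congestus heats the lower
# troposphere and cools aloft, stratiform does the opposite, via the second mode.
import numpy as np

H_T = 16.0e3                    # assumed tropopause height [m]
z = np.linspace(0.0, H_T, 65)   # illustrative model levels

def heating_profile(H_c, H_d, H_s):
    """Total heating [K/day] given congestus (H_c), deep (H_d), stratiform (H_s) rates."""
    phi1 = np.sqrt(2.0) * np.sin(np.pi * z / H_T)        # 1st baroclinic mode
    phi2 = np.sqrt(2.0) * np.sin(2.0 * np.pi * z / H_T)  # 2nd baroclinic mode
    return H_d * phi1 + (H_c - H_s) * phi2

Q = heating_profile(H_c=1.0, H_d=5.0, H_s=2.0)
print(f"max heating {Q.max():.1f} K/day at z = {z[Q.argmax()] / 1e3:.1f} km")
```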


2013 ◽  
Vol 9 (6) ◽  
pp. 2741-2757 ◽  
Author(s):  
A. Mairesse ◽  
H. Goosse ◽  
P. Mathiot ◽  
H. Wanner ◽  
S. Dubinkina

Abstract. The mid-Holocene (6 kyr BP; thousand years before present) is a key period to study the consistency between model results and proxy-based reconstruction data as it corresponds to a standard test for models and a reasonable number of proxy-based records is available. Taking advantage of this relatively large amount of information, we have compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern but the models underestimate the magnitude of some observed changes and that large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation using a data-assimilation method based on a particle filter. In one simulation, all the 50 proxy-based records are used, while in the other two only the continental or oceanic proxy-based records constrain the model results. As expected, data assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the westerlies at midlatitudes that warms up northern Europe. Furthermore, the comparison of the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxy-based paleoclimate records whose reconstructed signal is either incompatible with the signal recorded by some other proxy-based records or with model physics.
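A minimal sketch of one particle-filter analysis step of the kind described above (illustrative only; the Gaussian likelihood, the observation operator, and all variable names are assumptions, not the LOVECLIM implementation):

```python
# One reweighting-and-resampling step of a bootstrap particle filter: ensemble members
# are weighted by their fit to proxy-based reconstructions and resampled accordingly.
import numpy as np

rng = np.random.default_rng(2)

def particle_filter_step(particles, H, y_obs, obs_err):
    """particles: (n_particles, n_state); H: observation operator mapping state to
    proxy space; y_obs: reconstructed values; obs_err: reconstruction error std."""
    innovations = y_obs - particles @ H.T                  # misfit of each particle
    log_w = -0.5 * np.sum((innovations / obs_err) ** 2, axis=1)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    # resample particles proportionally to their weights
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], w

# toy example: 96 particles, 10 state variables, 5 proxy sites
particles = rng.normal(size=(96, 10))
H = np.eye(5, 10)                                          # proxies observe 5 state variables
y_obs = rng.normal(size=5)
analysis, weights = particle_filter_step(particles, H, y_obs, obs_err=0.5)
```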


2013 ◽  
Vol 9 (4) ◽  
pp. 3953-3991 ◽  
Author(s):  
A. Mairesse ◽  
H. Goosse ◽  
P. Mathiot ◽  
H. Wanner ◽  
S. Dubinkina

Abstract. The mid-Holocene (6 thousand years before present) is a key period to study the consistency between model results and proxy data as it corresponds to a standard test for models and a reasonable number of proxy records are available. Taking advantage of this relatively large amount of information, we have first compared a compilation of 50 air and sea surface temperature reconstructions with the results of three simulations performed with general circulation models and one carried out with LOVECLIM, a model of intermediate complexity. The conclusions derived from this analysis confirm that models and data agree on the large-scale spatial pattern but underestimate the magnitude of some observed changes and that large discrepancies are observed at the local scale. To further investigate the origin of those inconsistencies, we have constrained LOVECLIM to follow the signal recorded by the proxies selected in the compilation using a data assimilation method based on a particle filter. In one simulation, all the 50 proxies are used while in the other two, only the continental or oceanic proxies constrain the model results. This assimilation improves the consistency between model results and the reconstructions. In particular, this is achieved in a robust way in all the experiments through a strengthening of the westerlies at mid-latitudes that warms up northern Europe. Furthermore, the comparison of the LOVECLIM simulations with and without data assimilation has also objectively identified 16 proxies whose reconstructed signal is either incompatible with the one recorded by some other proxies or with model physics.


2013 ◽  
Vol 141 (3) ◽  
pp. 1099-1117 ◽  
Author(s):  
Andrew Charles ◽  
Bertrand Timbal ◽  
Elodie Fernandez ◽  
Harry Hendon

Abstract Seasonal predictions based on coupled atmosphere–ocean general circulation models (GCMs) provide useful predictions of large-scale circulation but lack the conditioning on topography required for locally relevant prediction. In this study a statistical downscaling model based on meteorological analogs was applied to continental-scale GCM-based seasonal forecasts and high-quality historical site observations to generate a set of downscaled precipitation hindcasts at 160 sites in the South Murray Darling Basin region of Australia. Large-scale fields from the Predictive Ocean–Atmosphere Model for Australia (POAMA) 1.5b GCM-based seasonal prediction system are used for analog selection. Correlation analysis indicates modest levels of predictability in the target region for the selected predictor fields. A single best-match analog was found using model sea level pressure, meridional wind, and rainfall fields, with the procedure applied to 3-month-long reforecasts, initialized on the first day of each month from 1980 to 2006, for each model day of 10 ensemble members. Assessment of the total accumulated rainfall and number of rainy days in the 3-month reforecasts shows that the downscaling procedure corrects the local climate variability with no mean effect on predictive skill, resulting in smaller-magnitude errors. The amount of total rainfall and number of rain days in the downscaled output are significantly improved over the direct GCM output, as measured by the difference in median and tercile thresholds between station observations and downscaled rainfall. Confidence in the downscaled output is enhanced by strong consistency between the large-scale mean of the downscaled and direct GCM precipitation.
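A minimal sketch of the single best-match analog selection (assumed arrays and a simple Euclidean distance over standardized predictors; not the POAMA downscaling code):

```python
# For each forecast day, find the closest historical analog of the large-scale
# predictor fields and take the station rainfall observed on that analog day.
import numpy as np

def downscale(forecast_fields, archive_fields, archive_station_rain):
    """forecast_fields: (n_days, n_features) standardized GCM predictors;
    archive_fields: (n_archive, n_features) same predictors from the historical analysis;
    archive_station_rain: (n_archive, n_sites) observed rainfall on archive days."""
    downscaled = np.empty((len(forecast_fields), archive_station_rain.shape[1]))
    for t, f in enumerate(forecast_fields):
        dist = np.linalg.norm(archive_fields - f, axis=1)  # analog distance
        best = dist.argmin()                               # single best-match analog
        downscaled[t] = archive_station_rain[best]
    return downscaled

# toy example: 90 forecast days, 9000 archive days, 160 sites, 50 predictor features
rng = np.random.default_rng(3)
ds = downscale(rng.normal(size=(90, 50)), rng.normal(size=(9000, 50)),
               rng.gamma(1.5, 2.0, size=(9000, 160)))
print(ds.shape)  # (90, 160)
```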


2015 ◽  
Vol 72 (1) ◽  
pp. 55-74 ◽  
Author(s):  
Qiang Deng ◽  
Boualem Khouider ◽  
Andrew J. Majda

Abstract The representation of the Madden–Julian oscillation (MJO) is still a challenge for numerical weather prediction and general circulation models (GCMs) because of the inadequate treatment of convection and the associated interactions across scales by the underlying cumulus parameterizations. One new promising direction is the use of the stochastic multicloud model (SMCM) that has been designed specifically to capture the missing variability due to unresolved processes of convection and their impact on the large-scale flow. The SMCM specifically models the area fractions of the three cloud types (congestus, deep, and stratiform) that characterize organized convective systems on all scales. The SMCM captures the stochastic behavior of these three cloud types via a judiciously constructed Markov birth–death process using a particle interacting lattice model. The SMCM has been successfully applied to convectively coupled waves in a simplified primitive equation model and validated against radar data of tropical precipitation. In this work, the authors use for the first time the SMCM in a GCM. The authors build on previous work of coupling the High-Order Methods Modeling Environment (HOMME) NCAR GCM to a simple multicloud model. The authors tested the new SMCM-HOMME model in the parameter regime considered previously and found that the stochastic model drastically improves the results of the deterministic model. Clear MJO-like structures with many realistic features from nature are reproduced by SMCM-HOMME in the physically relevant parameter regime, including wave trains of MJOs that organize intermittently in time. Moreover, one caveat of the deterministic simulation, namely the need to double the background moisture, is no longer present.
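A minimal sketch of the lattice birth-death update at the heart of such a stochastic multicloud scheme (illustrative only; the rates below are placeholder constants, whereas in the real scheme they depend on large-scale predictors such as CAPE and mid-tropospheric dryness):

```python
# One update of a lattice birth-death process for the cloud types used by the SMCM.
# Each site switches state with probability 1 - exp(-R*dt); for brevity the switches
# for different target states are drawn independently, which approximates the exact
# update when dt is small. Coarse-grained area fractions are returned to the GCM column.
import numpy as np

rng = np.random.default_rng(4)
CLEAR, CONGESTUS, DEEP, STRATIFORM = 0, 1, 2, 3

# assumed transition rates [1/h] between states (zero = transition not allowed)
RATES = np.zeros((4, 4))
RATES[CLEAR, CONGESTUS] = 0.5
RATES[CONGESTUS, DEEP] = 0.4
RATES[DEEP, STRATIFORM] = 0.3
RATES[CONGESTUS, CLEAR] = 0.2
RATES[DEEP, CLEAR] = 0.1
RATES[STRATIFORM, CLEAR] = 0.25

def update(lattice, dt):
    """Advance every microscopic site one time step of length dt [h]."""
    new = lattice.copy()
    for target in range(4):
        prob = 1.0 - np.exp(-RATES[lattice, target] * dt)  # per-site switch probability
        switch = rng.random(lattice.shape) < prob
        new[switch] = target
    return new

lattice = np.zeros((40, 40), dtype=int)   # 40x40 microscopic sites in one GCM column
for _ in range(24):
    lattice = update(lattice, dt=1.0)
area_fractions = np.bincount(lattice.ravel(), minlength=4) / lattice.size
print(area_fractions)  # [clear, congestus, deep, stratiform] fed back to the GCM column
```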


2007 ◽  
Vol 64 (11) ◽  
pp. 3766-3784 ◽  
Author(s):  
Philippe Lopez

Abstract This paper first reviews the current status, issues, and limitations of the parameterizations of atmospheric large-scale and convective moist processes that are used in numerical weather prediction and climate general circulation models. Both large-scale (resolved) and convective (subgrid scale) moist processes are dealt with. Then, the general question of the inclusion of diabatic processes in variational data assimilation systems is addressed. The focus is on linearity and resolution issues, the specification of model and observation error statistics, the formulation of the control vector, and the problems specific to the assimilation of observations directly affected by clouds and precipitation.
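For context, the strong-constraint 4D-Var cost function that such variational assimilation systems minimize can be written in standard notation (a textbook form, not reproduced from the paper):

```latex
J(\mathbf{x}_0) = \tfrac{1}{2}\,(\mathbf{x}_0 - \mathbf{x}_b)^{\mathrm{T}} \mathbf{B}^{-1} (\mathbf{x}_0 - \mathbf{x}_b)
  + \tfrac{1}{2} \sum_{i=0}^{n} \bigl[H_i(\mathbf{x}_i) - \mathbf{y}_i\bigr]^{\mathrm{T}} \mathbf{R}_i^{-1} \bigl[H_i(\mathbf{x}_i) - \mathbf{y}_i\bigr],
  \qquad \mathbf{x}_i = M_{0 \to i}(\mathbf{x}_0),
```

where B and R_i are the background and observation error covariances, H_i the observation operators, and M the forecast model. The linearity, error-statistics, and control-vector issues discussed in the paper concern the tangent-linear and adjoint versions of H and M and the specification of B and R for cloud- and precipitation-affected observations.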


2021 ◽  
Author(s):  
Anna Mackie ◽  
Michael P. Byrne

Uncertainty in the response of clouds to warming remains a significant barrier to reducing the range in projected climate sensitivity. A key question is to what extent cloud feedbacks can be attributed to changes in circulation, such as the strengthening or weakening of ascent or changes in the areas of convecting vs. subsiding air. Previous research has shown that, in general circulation models (GCMs), the ‘dynamic’ component of the cloud feedback – that which is due to changes in circulation rather than changes in the thermodynamic properties of clouds (Bony et al., 2006) – is generally small (Byrne and Schneider, 2018). An open question, however, is whether this extends to models at cloud-resolving resolutions that explicitly simulate deep convection.

Here, we utilize simulations from the Radiative-Convective Equilibrium Model Intercomparison Project (RCEMIP; Wing et al., 2018, 2020) to quantify the impact of circulation on tropical cloud feedbacks. RCE is a simple idealisation of the tropical atmosphere, and we focus on simulations in a long-channel configuration with uniform sea surface temperatures of 295, 300 and 305 K. The dynamic component of the total cloud feedback is substantial for this suite of cloud-resolving models (CRMs), and is driven by circulation changes and nonlinearity in the climatological relationship between clouds and circulation. The large spread in the dynamic component across models is linked to the extent to which convection strengthens and narrows with warming. This strengthening/narrowing of convective regions is further linked to changes in clear-sky radiative cooling and mid-tropospheric static stability in subsiding regions.
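The dynamic/thermodynamic decomposition referred to above (Bony et al., 2006) is usually written in terms of the probability density P_ω of the ω500 circulation regimes and the cloud-radiative property C_ω within each regime; in its standard form (not copied from this abstract):

```latex
\Delta \overline{C}
  = \underbrace{\sum_{\omega} C_{\omega}\, \Delta P_{\omega}}_{\text{dynamic}}
  + \underbrace{\sum_{\omega} P_{\omega}\, \Delta C_{\omega}}_{\text{thermodynamic}}
  + \underbrace{\sum_{\omega} \Delta P_{\omega}\, \Delta C_{\omega}}_{\text{covariation}}
```

Here P_ω is the area fraction (or PDF) of each ω500 regime and C_ω the mean cloud-radiative effect within it.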


2017 ◽  
Author(s):  
Wilton Aguiar ◽  
Mauricio M. Mata ◽  
Rodrigo Kerr

Abstract. Deep convection in open ocean polynyas is a common source of error in the representation of Antarctic Bottom Water (AABW) formation in ocean general circulation models. Even though those events are well described in non-assimilatory ocean simulations, the recent appearance of an open ocean polynya in the Estimating the Circulation and Climate of the Ocean Phase II (ECCO2) reanalysis product raises the question of whether this spurious event is also found in state-of-the-art reanalysis products. In order to answer this question, we evaluate how three recently released high-resolution ocean reanalyses form AABW in their simulations. We found that two of them (ECCO2 and SoSE) create AABW through open ocean deep convection events in the Weddell Sea, showing that the assimilation of sea ice has not been enough to avoid the appearance of open ocean polynyas. The third reanalysis – My Ocean University Reading – creates AABW by a more dynamically accurate mechanism, depicting both continental shelf convection and the export of Dense Shelf Water to the open ocean. Although the accuracy of AABW formation in this reanalysis represents an advance in the representation of this process, the differences found between the real ocean and the simulated one suggest that ocean reanalyses still need substantial improvements to accurately represent AABW formation.
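A minimal sketch of how such spurious open ocean deep convection events can be flagged in a gridded reanalysis (the thresholds and variable names are assumptions for illustration, not the criteria used in the study):

```python
# Flag grid points where the winter mixed layer is very deep over abyssal ocean
# (i.e. away from the continental shelf) and the sea-ice concentration indicates
# open water within the ice pack (a polynya).
import numpy as np

def open_ocean_convection_mask(mld, sic, bathy,
                               mld_thresh=2000.0,    # [m] deep-convection proxy (assumed)
                               sic_thresh=0.15,      # open water within the pack (assumed)
                               shelf_depth=1000.0):  # exclude continental shelf (assumed)
    """mld: winter mixed layer depth [m]; sic: sea-ice concentration [0-1];
    bathy: ocean depth [m]; all 2-D arrays on the same grid."""
    off_shelf = bathy > shelf_depth
    return (mld > mld_thresh) & (sic < sic_thresh) & off_shelf

# toy fields on a 50x100 grid
rng = np.random.default_rng(5)
mld = rng.uniform(50, 3000, (50, 100))
sic = rng.uniform(0, 1, (50, 100))
bathy = rng.uniform(200, 5000, (50, 100))
mask = open_ocean_convection_mask(mld, sic, bathy)
print(f"{mask.mean():.1%} of grid points flagged")
```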


2020 ◽  
Vol 50 (4) ◽  
pp. 1045-1064 ◽  
Author(s):  
Steven L. Morey ◽  
Ganesh Gopalakrishnan ◽  
Enric Pallás Sanz ◽  
Joao Marcos Azevedo Correia De Souza ◽  
Kathleen Donohue ◽  
...  

Abstract Three simulations of the circulation in the Gulf of Mexico (the “Gulf”) using different numerical general circulation models are compared with results of recent large-scale observational campaigns conducted throughout the deep (>1500 m) Gulf. Analyses of these observations have provided new understanding of large-scale mean circulation features and variability throughout the deep Gulf. Important features include cyclonic flow along the continental slope, deep cyclonic circulation in the western Gulf, a counterrotating pair of cells under the Loop Current region, and a cyclonic cell to the south of this pair. These dominant circulation features are represented in each of the ocean model simulations, although with some obvious differences. A striking difference between all the models and the observations is that the simulated deep eddy kinetic energy under the Loop Current region is generally less than one-half of that computed from observations. A multidecadal integration of one of these numerical simulations is used to evaluate the uncertainty of estimates of velocity statistics in the deep Gulf computed from limited-length (4 years) observational or model records. This analysis shows that the main deep circulation features identified from the observational studies appear to be robust and are not substantially impacted by variability on time scales longer than the observational records. Differences in strengths and structures of the circulation features are identified, however, and quantified through standard error analysis of the statistical estimates using the model solutions.
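A minimal sketch of the kind of statistics involved (standard formulas with toy data, not the study's code): deep eddy kinetic energy from a velocity record, and a standard error of the record mean that accounts for serial correlation through an effective sample size:

```python
# Eddy kinetic energy and an autocorrelation-aware standard error of a record mean.
import numpy as np

def eddy_kinetic_energy(u, v):
    """Mean EKE [m^2/s^2]; anomalies are taken about the record mean."""
    up, vp = u - u.mean(), v - v.mean()
    return 0.5 * np.mean(up**2 + vp**2)

def standard_error(x, dt_days=1.0):
    """Standard error of the mean of x, with effective sample size N* = N*dt/(2*T_int),
    where T_int is the autocorrelation integrated out to its first zero crossing."""
    xa = x - x.mean()
    acf = np.correlate(xa, xa, mode="full")[len(xa) - 1:] / (np.arange(len(xa), 0, -1) * xa.var())
    first_zero = np.argmax(acf <= 0) if np.any(acf <= 0) else len(acf)
    t_int = dt_days * np.sum(acf[:first_zero])
    n_eff = max(len(x) * dt_days / (2.0 * t_int), 1.0)
    return x.std(ddof=1) / np.sqrt(n_eff)

# toy 4-year daily record of deep velocities
rng = np.random.default_rng(6)
u = np.cumsum(rng.normal(0, 0.01, 1460))
u -= u.mean()
v = rng.normal(0, 0.05, 1460)
print(eddy_kinetic_energy(u, v), standard_error(u))
```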

