intermediate complexity
Recently Published Documents

TOTAL DOCUMENTS: 187 (five years: 39)
H-INDEX: 30 (five years: 4)

2021 ◽  
Vol 14 (12) ◽  
pp. 7725-7747
Author(s):  
Alexey V. Eliseev ◽  
Rustam D. Gizatullin ◽  
Alexandr V. Timazhev

Abstract. A stationary, computationally efficient scheme, ChAP 1.0 (Chemical and Aerosol Processes, version 1.0), for the sulfur cycle in the troposphere is developed. This scheme is designed for Earth system models of intermediate complexity (EMICs). The scheme accounts for sulfur dioxide emissions into the atmosphere, its deposition to the surface, its oxidation to sulfates, and the dry and wet deposition of sulfates on the surface. The calculations with the scheme are forced by anthropogenic emissions of sulfur dioxide into the atmosphere for 1850–2000, adopted from the CMIP5 dataset, and by ERA-Interim meteorology, assuming that natural sources of sulfur into the atmosphere remained unchanged during this period. The ChAP output is compared to tropospheric sulfur cycle simulations from the CMIP5 ensemble, the IPCC TAR ensemble, and the ACCMIP phase II simulations. In addition, in regions of strong anthropogenic sulfur pollution, ChAP results are compared to other data, such as the CAMS reanalysis, EMEP MSC-W, and individual model simulations. Our model reasonably reproduces characteristics of the tropospheric sulfur cycle known from these information sources. In our scheme, about half of the emitted sulfur dioxide is deposited to the surface, and the rest is oxidised into sulfates. In turn, sulfates are mostly removed from the atmosphere by wet deposition. The lifetimes of sulfur dioxide and sulfates in the atmosphere are close to 1 and 5 d, respectively. The limitations of the scheme are acknowledged, and prospects for future development are outlined. Despite its simplicity, ChAP may be successfully used to simulate anthropogenic sulfur pollution in the atmosphere at coarse spatial scales and timescales.
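The budget the abstract describes (roughly half of emitted SO2 deposited to the surface, the rest oxidised to sulfate, with lifetimes near 1 d and 5 d) can be illustrated with a minimal two-box mass-balance sketch. This is not the ChAP scheme itself; the function name, parameter values, and forward-Euler integration are illustrative assumptions.

```python
def sulfur_box_model(emission, tau_so2=1.0, tau_so4=5.0, f_oxidized=0.5,
                     dt=0.1, n_days=30):
    """Two-box tropospheric sulfur budget: SO2 -> deposition/oxidation -> SO4.

    emission      : constant SO2 emission rate (mass units per day)
    tau_so2/so4   : atmospheric lifetimes in days (abstract: ~1 d and ~5 d)
    f_oxidized    : fraction of removed SO2 oxidised to sulfate (~0.5)
    """
    so2, so4 = 0.0, 0.0
    for _ in range(int(n_days / dt)):
        so2_loss = so2 / tau_so2          # total SO2 removal (deposition + oxidation)
        so4_loss = so4 / tau_so4          # sulfate removal (mostly wet deposition)
        so2 += (emission - so2_loss) * dt
        so4 += (f_oxidized * so2_loss - so4_loss) * dt
    return so2, so4

# Burdens approach emission*tau_so2 and f_oxidized*emission*tau_so4 at steady state.
so2, so4 = sulfur_box_model(emission=100.0)
```

At steady state the SO2 burden approaches emission × τ_SO2 and the sulfate burden approaches f_oxidized × emission × τ_SO4, which is why the ~1 d and ~5 d lifetimes control the simulated burdens directly.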


2021 ◽  
pp. 1-38
Author(s):  
Ting Liu ◽  
Xunshu Song ◽  
Youmin Tang ◽  
Zheqi Shen ◽  
Xiaoxiao Tan

Abstract. In this study, we conducted an ensemble retrospective prediction from 1881 to 2017 using the Community Earth System Model to evaluate El Niño–Southern Oscillation (ENSO) predictability and its variability on different timescales. To our knowledge, this is the first assessment of ENSO predictability using a long-term ensemble hindcast with a comprehensive coupled general circulation model (CGCM). Our results indicate that both the dispersion component (DC) and the signal component (SC) contribute to the interannual variation of ENSO predictability (measured by relative entropy, RE). Specifically, the SC is more important for ENSO events, whereas the DC is comparably important at short lead times and in weak ENSO signal years. The SC dominates the seasonal variation of ENSO predictability, and an abrupt decrease in signal intensity produces the spring predictability barrier feature of ENSO. At the interdecadal scale, the SC controls the variability of ENSO predictability, while the magnitude of ENSO predictability is determined by the DC. The seasonal and interdecadal variations of ENSO predictability in the CGCM are generally consistent with results based on intermediate-complexity and hybrid coupled models. However, the DC contributes more in the CGCM than in the intermediate-complexity and hybrid coupled models.
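The signal/dispersion decomposition of relative entropy referred to in the abstract has a standard closed form under the common Gaussian approximation (Kleeman, 2002): RE splits exactly into a signal component (shift of the ensemble mean away from climatology) and a dispersion component (change in ensemble spread). The sketch below is a generic illustration of that decomposition, not the authors' code.

```python
import numpy as np

def relative_entropy(pred_ens, clim_mean=0.0, clim_var=1.0):
    """Gaussian relative entropy of a prediction ensemble vs. climatology,
    split into signal (SC) and dispersion (DC) components."""
    mu = np.mean(pred_ens)
    var = np.var(pred_ens, ddof=1)
    # DC: penalty for ensemble spread differing from climatological spread
    dc = 0.5 * (np.log(clim_var / var) + var / clim_var - 1.0)
    # SC: penalty/information from the mean shift relative to climatology
    sc = 0.5 * (mu - clim_mean) ** 2 / clim_var
    return sc + dc, sc, dc

# A sharp, strongly shifted ensemble (strong ENSO signal): SC dominates.
rng = np.random.default_rng(0)
ens = rng.normal(loc=1.5, scale=0.4, size=200)
re, sc, dc = relative_entropy(ens)
```

With a weakly shifted but broadened ensemble the same function shows DC taking over, mirroring the abstract's finding that DC matters most at short lead times and in weak-signal years.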


2021 ◽  
Vol 13 (1) ◽  
Author(s):  
Cristina Ramos-Hernández ◽  
Maribel Botana-Rial ◽  
Rosa Cordovilla-Pérez ◽  
Manuel Núñez-Delgado ◽  
Alberto Fernández-Villar

Abstract Background This was an observational, cross-sectional, multicentre study carried out from October to December 2020 through a survey sent to Spanish Society of Pulmonology and Thoracic Surgery members in public hospitals with different levels of complexity. Our objective was to carry out a national analysis of clinical practice, organisation, infrastructure, the services portfolio, and teaching and research activity related to ultrasound. Results Data from 104 hospitals were analysed. Ultrasound was used in 56.7% of cases, both in the area of bronchopleural techniques and on conventional wards, with no differences between centres. Lung ultrasound (LUS) was performed more often in the procedures area in intermediate-complexity centres than in high- and low-complexity centres (36% vs. 31% and 6.25%, respectively). More high-complexity centres had three or more ultrasound scanners than intermediate-complexity centres (38% vs. 16%), and 43% of low-complexity centres shared their ultrasound equipment with other specialties. Fewer than 6% of centres did not have an ultrasound machine. LUS was most often used during the treatment of pleural effusion (91.3%), in the differential diagnosis of dyspnoea (51.9%), and to rule out iatrogenic pneumothorax (50.9%). Only 5.7% of the centres had a pulmonologist specialised in LUS. Finally, fewer than 35% of the hospitals were teaching centres and fewer than 18% participated in research projects. Conclusions The use and availability of LUS have grown in pulmonology services; however, relatively few pulmonologists are specialised in its use, and teaching and research activity in this field is scarce. Strategies are needed to improve physicians' skill in using LUS and to promote its use, with the ultimate goal of improving healthcare activity.


2021 ◽  
Vol 14 (3) ◽  
pp. 1657-1680
Author(s):  
Johannes Horak ◽  
Marlis Hofer ◽  
Ethan Gutmann ◽  
Alexander Gohm ◽  
Mathias W. Rotach

Abstract. The evaluation of models in general is a nontrivial task and can, due to epistemological and practical reasons, never be considered complete. Due to this incompleteness, a model may yield correct results for the wrong reasons, i.e., via a different chain of processes than found in observations. While guidelines and strategies exist in the atmospheric sciences to maximize the chances that models are correct for the right reasons, these are mostly applicable to full physics models, such as numerical weather prediction models. The Intermediate Complexity Atmospheric Research (ICAR) model is an atmospheric model employing linear mountain wave theory to represent the wind field. In this wind field, atmospheric quantities such as temperature and moisture are advected and a microphysics scheme is applied to represent the formation of clouds and precipitation. This study conducts an in-depth process-based evaluation of ICAR, employing idealized simulations to increase the understanding of the model and develop recommendations to maximize the probability that its results are correct for the right reasons. To contrast the obtained results from the linear-theory-based ICAR model to a full physics model, idealized simulations with the Weather Research and Forecasting (WRF) model are conducted. The impact of the developed recommendations is then demonstrated with a case study for the South Island of New Zealand. The results of this investigation suggest three modifications to improve different aspects of ICAR simulations. The representation of the wind field within the domain improves when the dry and the moist Brunt–Väisälä frequencies are calculated in accordance with linear mountain wave theory from the unperturbed base state rather than from the time-dependent perturbed atmosphere. 
Imposing upper boundary conditions different from the standard zero-gradient boundary condition is shown to reduce errors in the potential temperature and water vapor fields. Furthermore, the results show that there is a lowest possible model top elevation that should not be undercut in order to avoid influences of the model top on cloud and precipitation processes within the domain. The method to determine the lowest model top elevation is applied to both the idealized simulations and the real-terrain case study. Notable differences between the ICAR and WRF simulations are observed across all investigated quantities, such as the wind field, the water vapor and hydrometeor distributions, and the distribution of precipitation. The case study indicates that the precipitation maximum calculated by the ICAR simulation employing the developed recommendations is spatially shifted upwind in comparison to an unmodified version of ICAR. The cause of the shift lies in influences of the model top on cloud formation and precipitation processes in the ICAR simulations. Furthermore, the results show that when model skill is evaluated with statistical metrics based on comparisons to surface observations only, such an analysis may not reflect the skill of the model in capturing atmospheric processes such as gravity waves and cloud formation.
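The first recommendation, calculating the dry and moist Brunt–Väisälä frequencies from the unperturbed base state, can be illustrated for the dry case, where N² = (g/θ) dθ/dz is evaluated on the base-state potential-temperature profile rather than on the time-dependent perturbed atmosphere. The profile shape and lapse rate below are illustrative assumptions, not ICAR code.

```python
import numpy as np

def dry_brunt_vaisala(theta, z, g=9.81):
    """Dry Brunt-Vaisala frequency N from a potential-temperature profile:
    N^2 = (g / theta) * d(theta)/dz, via centered differences."""
    dtheta_dz = np.gradient(theta, z)
    n_squared = g / theta * dtheta_dz
    return np.sqrt(np.clip(n_squared, 0.0, None))  # mask statically unstable layers

# Unperturbed base state: theta increasing 3 K/km from 280 K (statically stable).
z = np.linspace(0.0, 10_000.0, 101)   # height [m]
theta_base = 280.0 + 0.003 * z        # potential temperature [K]
N = dry_brunt_vaisala(theta_base, z)
# Near the surface, N ~ sqrt(9.81 / 280 * 0.003) ~ 0.010 s^-1
```

Evaluating N on the fixed base state keeps the linear-theory wind field consistent with the assumptions of mountain wave theory, which is the point of the recommendation.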


2021 ◽  
Author(s):  
Louis Quéno ◽  
Paul Morin ◽  
Rebecca Mott ◽  
Tobias Jonas

<p>In mountainous terrain, wind-driven transport of deposited snow affects the overall distribution of snow and can have a significant effect on snowmelt patterns, even at coarser resolutions. From an operational modelling perspective, a compromise must be found to represent this complex small-scale process with enough accuracy while mitigating the computational cost of snow cover simulations over large domains. To achieve this compromise, we implemented the SNOWTRAN-3D snow transport module within the FSM intermediate-complexity snow cover model. We included a new layering scheme and a historical variable of past snow wetting, but without resolving the snow microstructure. Simulations are run and evaluated over a small mountain range in the Swiss Alps at 25 to 100 m resolution. Being implemented in the model framework of the SLF operational snow hydrology service (OSHD), the simulations further benefit from snow data assimilation techniques that provide improved estimates of solid precipitation fields. As complex wind patterns in mountains are the key process driving snow transport, we tested statistical and dynamical methods to downscale 1 km resolution COSMO winds to better reflect topographically induced flow patterns. These simulations are a first step towards the integration of wind transport processes over large domains in an intermediate-complexity, intermediate-resolution operational modelling framework.</p>


2021 ◽  
Author(s):  
Nathaelle Bouttes ◽  
Didier Roche ◽  
Fanny Lhardy ◽  
Aurelien Quiquet ◽  
Didier Paillard ◽  
...  

<p>The last deglaciation is a time of large climate transition from a cold Last Glacial Maximum at 21,000 years BP, with extensive ice sheets, to the warmer Holocene from 9,000 years BP onwards, with reduced ice sheets. Despite more and more proxy data documenting this transition, the evolution of climate is not fully understood and is difficult to simulate. The PMIP4 protocol (Ivanovic et al., 2016) indicates which boundary conditions to use in model simulations of this transition. The common boundary conditions should enable consistent multi-model and model-data comparisons. While the greenhouse gas concentration evolution and orbital forcing are well known and easy to prescribe, the evolution of ice sheets is less well constrained, and several choices can be made by modelling groups. First, two ice sheet reconstructions are available: ICE-6G (Peltier et al., 2015) and GLAC-1D (Tarasov et al., 2014). On top of topographic changes, it is left to modelling groups to decide whether to account for the associated bathymetry and land-sea mask changes, which is technically more demanding. These choices could potentially lead to differences in the simulated climate evolution, making model comparisons more complicated.</p><p>We use the iLOVECLIM model of intermediate complexity (Goosse et al., 2010) to evaluate the impact of the different ice sheet reconstructions and the effect of bathymetry changes on the global climate evolution during the last deglaciation. We test the two ice sheet reconstructions (ICE-6G and GLAC-1D) and have implemented changes of bathymetry and land-sea mask. In addition, we also evaluate the impact of accounting for the Antarctic ice sheet evolution compared to the Northern Hemisphere ice sheets only.</p><p>We show that, despite sharing the same long-term changes, the two reconstructions lead to different climate evolutions. The bathymetry plays a role, although only a few changes take place before ~14 ka. Finally, the impact of the Antarctic ice sheet is important during the deglaciation and should not be neglected.</p><p>References</p><p>Goosse, H., et al., Description of the Earth system model of intermediate complexity LOVECLIM version 1.2, Geosci. Model Dev., 3, 603–633, https://doi.org/10.5194/gmd-3-603-2010, 2010</p><p>Ivanovic, R. F., et al., Transient climate simulations of the deglaciation 21–9 thousand years before present (version 1) – PMIP4 Core experiment design and boundary conditions, Geosci. Model Dev., 9, 2563–2587, https://doi.org/10.5194/gmd-9-2563-2016, 2016</p><p>Peltier, W. R., Argus, D. F., and Drummond, R., Space geodesy constrains ice age terminal deglaciation: The global ICE-6G_C (VM5a) model, J. Geophys. Res.-Sol. Ea., 120, 450–487, https://doi.org/10.1002/2014JB011176, 2015</p><p>Tarasov, L., et al., The global GLAC-1c deglaciation chronology, meltwater pulse 1-a, and a question of missing ice, IGS Symposium on Contribution of Glaciers and Ice Sheets to Sea-Level Change, 2014</p>


2021 ◽  
Author(s):  
Dylan Reynolds ◽  
Bert Kruyt ◽  
Ethan Gutmann ◽  
Tobias Jonas ◽  
Michael Lehning ◽  
...  

<p>Snow deposition patterns in complex terrain are heavily dependent on the underlying topography. This topography affects precipitating clouds at the kilometer scale and causes changes to the wind field at the sub-kilometer scale, resulting in altered advection of falling hydrometeors. Snow particles are particularly sensitive to changes in the near-surface flow field due to their low density. Atmospheric models run at the kilometer scale cannot resolve the actual heterogeneity of the underlying terrain, resulting in precipitation maps that do not capture terrain-affected precipitation patterns. Thus, snow-atmosphere interactions such as preferential deposition are often not resolved in the precipitation data used as input to snow models. To bridge this spatial gap and resolve snow-atmosphere interactions at the sub-kilometer scale, we couple the Intermediate Complexity Atmospheric Research (ICAR) model to the COSMO NWP model. Applying this model to sub-kilometer terrain (horizontal resolutions of 50 and 250 m) required changes to ICAR's computational grid, atmospheric dynamics, and boundary layer flow. As a result, the near-surface flow now accounts for surface roughness and topographically induced speed-up. This is achieved using terrain descriptors, calculated once at initialization, which consider a point's exposure or sheltering relative to the surrounding terrain. In particular, the use of a 3-dimensional Sx parameter allows us to simulate areas of stagnation and recirculation in the lee of terrain features. Our approach maintains the accurate large-scale precipitation patterns from COSMO but resolves the dynamics induced by terrain at the sub-kilometer scale without additional computational burden. We find that solid precipitation patterns at the ridge scale, such as preferential deposition of snow, are better resolved in the high-resolution version of ICAR than in the current ICAR or COSMO models.
This updated version of ICAR presents a new tool to dynamically downscale NWP output for snow models and enables future studies of snow-atmosphere interactions at domain scales of hundreds of kilometers.</p>
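The Sx parameter mentioned above generalizes the Winstral Sx sheltering index: for each grid point, the maximum upward-looking slope angle to terrain within a fixed search distance upwind, positive in the lee of a ridge (sheltered, favouring deposition) and negative on windward slopes (exposed). A minimal 1-D transect sketch, with a hypothetical triangular ridge, grid spacing, and search distance, might look like:

```python
import numpy as np

def sx_parameter(elev, dx, dmax=300.0):
    """Sheltering/exposure index along the +x (downwind) direction:
    the maximum slope angle from each point to upwind terrain within dmax.
    Positive -> sheltered (lee), negative -> exposed (windward)."""
    n = elev.size
    nmax = int(dmax / dx)                  # number of upwind cells searched
    sx = np.zeros(n)
    for i in range(n):
        angles = []
        for j in range(max(0, i - nmax), i):
            dist = (i - j) * dx
            angles.append(np.arctan((elev[j] - elev[i]) / dist))
        sx[i] = max(angles) if angles else 0.0
    return np.degrees(sx)

# Idealized transect: wind blows in +x over a triangular ridge with crest at x=500.
x = np.arange(0.0, 1000.0, 50.0)
ridge = np.where(x < 500, x, 1000 - x) * 0.4   # elevation [m]
sx = sx_parameter(ridge, dx=50.0)
# Points on the lee slope see the crest upwind -> positive Sx (sheltered).
```

A 3-dimensional version searches several upwind azimuths around the wind direction and is computed once at initialization, as described in the abstract, so it adds essentially no cost at run time.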


2021 ◽  
Author(s):  
Oliver Mehling ◽  
Elisa Ziegler ◽  
Heather Andres ◽  
Martin Werner ◽  
Kira Rehfeld

<p>The global hydrological cycle is of crucial importance for life on Earth. Hence, it is a focus of both future climate projections and paleoclimate modeling. The latter typically requires long integrations or large ensembles of simulations, and therefore models of reduced complexity are needed to reduce the computational cost. Here, we study the hydrological cycle of the Planet Simulator (PlaSim) [1], a general circulation model (GCM) of intermediate complexity, which includes evaporation, precipitation, soil hydrology, and river advection.</p><p>Using published parameter configurations for T21 resolution [2, 3], PlaSim strongly underestimates precipitation in the mid-latitudes as well as global atmospheric water compared to ERA5 reanalysis data [4]. However, the tuning of PlaSim has so far been limited to optimizing atmospheric temperatures and net radiative fluxes [3].</p><p>Here, we present a different approach by tuning the model's atmospheric energy balance and water budget simultaneously. We argue for the use of the globally averaged mean absolute error (MAE) of 2 m temperature, net radiation, and evaporation in the objective function. To select relevant model parameters, especially with respect to radiation and the hydrological cycle, we perform a sensitivity analysis and evaluate the feature importance using a Random Forest regressor. An optimal set of parameters is obtained via Bayesian optimization.</p><p>Using the optimized set of parameters, the mean absolute error of temperature and cloud cover is reduced on most model levels, and mid-latitude precipitation patterns are improved. In addition to annual zonal-mean patterns, we examine the agreement with the seasonal cycle and discuss regions in which the bias remains considerable, such as the monsoon region over the Pacific.</p><p>We discuss the robustness of this tuning with regard to resolution (T21, T31, and T42), and compare the atmosphere-only results to simulations with a mixed-layer ocean. 
Finally, we provide an outlook on the applicability of our parametrization to climate states other than present-day conditions.</p><p>[1] K. Fraedrich et al., <em>Meteorol. Z.</em> <strong>14</strong>, 299–304 (2005)<br>[2] F. Lunkeit et al., <em>Planet Simulator User's Guide Version 16.0</em> (University of Hamburg, 2016)<br>[3] G. Lyu et al., <em>J. Adv. Model. Earth Syst.</em> <strong>10</strong>, 207–222 (2018)<br>[4] H. Hersbach et al., <em>Q. J. R. Meteorol. Soc.</em> <strong>146</strong>, 1999–2049 (2020)</p>
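The combined objective described above, globally averaged MAE of 2 m temperature, net radiation, and evaporation, can be sketched as an area-weighted sum over zonal-mean fields. The field names, per-variable weights, and dictionary interface below are illustrative assumptions; the abstract's actual tuning minimizes such a cost via Bayesian optimization over PlaSim parameters.

```python
import numpy as np

def tuning_objective(model, ref, lats, weights=(1.0, 1.0, 1.0)):
    """Scalar tuning cost: area-weighted global MAE of 2 m temperature,
    net radiation, and evaporation, combined with per-variable weights.
    model/ref are dicts of zonal-mean fields keyed by variable name."""
    w = np.cos(np.deg2rad(lats))
    w = w / w.sum()                        # normalized area weights by latitude
    total = 0.0
    for key, wk in zip(("t2m", "net_rad", "evap"), weights):
        total += wk * np.sum(w * np.abs(model[key] - ref[key]))
    return total

# Toy check: identical fields give zero cost; a uniform 1 K bias in t2m adds 1.
lats = np.linspace(-87.5, 87.5, 36)
ref = {k: np.zeros(36) for k in ("t2m", "net_rad", "evap")}
biased = dict(ref, t2m=np.ones(36))
```

Because the cost is a single scalar, it slots directly into any black-box optimizer; the cosine-latitude weighting keeps high-latitude grid rows from dominating the global average.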

