Parameterizing deep convection using the assumed probability density function method

2014 ◽  
Vol 7 (3) ◽  
pp. 3803-3849
Author(s):  
R. L. Storer ◽  
B. M. Griffin ◽  
J. Höft ◽  
J. K. Weber ◽  
E. Raut ◽  
...  

Abstract. Due to their coarse horizontal resolution, present-day climate models must parameterize deep convection. This paper presents single-column simulations of deep convection using a probability density function (PDF) parameterization. The PDF parameterization predicts the PDF of subgrid variability of turbulence, clouds, and hydrometeors. That variability is interfaced to a prognostic microphysics scheme using a Monte Carlo sampling method. The PDF parameterization is used to simulate tropical deep convection, the transition from shallow to deep convection over land, and mid-latitude deep convection. These parameterized single-column simulations are compared with 3-D reference simulations. The agreement is satisfactory except when the convective forcing is weak. The same PDF parameterization is also used to simulate shallow cumulus and stratocumulus layers. The PDF method is sufficiently general to adequately simulate these five deep, shallow, and stratiform cloud cases with a single equation set. This raises hopes that it may be possible in the future, with further refinements at coarse time step and grid spacing, to parameterize all cloud types in a large-scale model in a unified way.
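The key step above, interfacing subgrid PDF variability to a microphysics scheme via Monte Carlo sampling, can be illustrated with a minimal sketch. This is not the authors' parameterization (which predicts a multivariate PDF of turbulence, clouds, and hydrometeors): it assumes a single-Gaussian PDF of total water and a hypothetical Kessler-type autoconversion rate, and all names and constants are illustrative.

```python
import numpy as np

def grid_mean_autoconversion(qt_mean, qt_std, q_sat, n_samples=1000, seed=0):
    """Monte Carlo estimate of a grid-mean microphysical tendency.

    A subgrid PDF of total water qt (a single Gaussian here, for
    simplicity) is sampled; each sample is passed through a nonlinear
    process rate, and the sample rates are averaged. This mimics how
    a sampled PDF can be interfaced to a microphysics scheme.
    """
    rng = np.random.default_rng(seed)
    qt = rng.normal(qt_mean, qt_std, n_samples)   # subgrid samples of total water [kg/kg]
    qc = np.maximum(qt - q_sat, 0.0)              # cloud water = excess over saturation
    # Hypothetical Kessler-type autoconversion: zero below a threshold,
    # so the grid-mean rate depends on the PDF, not just the mean state.
    k, qc_crit = 1e-3, 1e-4
    rate = k * np.maximum(qc - qc_crit, 0.0)
    return rate.mean()

# The nonlinearity is the point: evaluating the rate at the (subsaturated)
# mean state would give zero, while the PDF-averaged rate does not.
print(grid_mean_autoconversion(qt_mean=9.0e-3, qt_std=1.0e-3, q_sat=9.5e-3))
```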


2018 ◽  
Vol 41 (4) ◽  
pp. 1110-1122 ◽  
Author(s):  
Geng Jie ◽  
Li Dong ◽  
Du Guang-sheng ◽  
Liu Zheng-gang ◽  
Hu De-cai

Turbulence generated by intrusive structures in pipes introduces uncertainties into measurements made with ultrasonic flowmeters. A numerical model based on large eddy simulation (LES) is developed here to study the characteristics of turbulent errors in a typical U-shaped ultrasonic flowmeter. From the flow structures obtained by LES, a one-dimensional turbulent spectrum and the probability density function of helicity along the effective ultrasonic propagation path are calculated. The turbulent error is dominated by large-scale anisotropic vortices within a specific range of the sampling line, and these anisotropic vortices share a common largest fluctuating scale in the error-dominated region when the Reynolds number (Re) exceeds 15,000. These results suggest that the turbulent error can be reduced by limiting the largest fluctuating scale through structural optimization. The probability density function of streamwise helicity is introduced for the first time to reveal the continuous turbulent development in an ultrasonic flowmeter, and it may prove useful in broader turbulence analyses.
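As a rough sketch of the diagnostic described above (not the authors' code), the snippet below estimates the probability density function of streamwise helicity, h = u_x ω_x, from velocity and vorticity samples along a sampling line; the synthetic Gaussian inputs merely stand in for values extracted from LES snapshots along the ultrasonic path.

```python
import numpy as np

def streamwise_helicity_pdf(u_x, omega_x, bins=50):
    """Estimate the probability density function of streamwise helicity
    h = u_x * omega_x from samples taken along the effective ultrasonic
    propagation path. Returns bin centers and normalized densities."""
    h = u_x * omega_x
    density, edges = np.histogram(h, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, density

# Synthetic stand-ins for LES data sampled along the path.
rng = np.random.default_rng(1)
u_x = rng.normal(2.0, 0.5, 10_000)       # streamwise velocity [m/s]
omega_x = rng.normal(0.0, 30.0, 10_000)  # streamwise vorticity [1/s]
centers, pdf = streamwise_helicity_pdf(u_x, omega_x)
```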


Energies ◽  
2016 ◽  
Vol 9 (2) ◽  
pp. 91 ◽  
Author(s):  
Emilio Gómez-Lázaro ◽  
María Bueso ◽  
Mathieu Kessler ◽  
Sergio Martín-Martínez ◽  
Jie Zhang ◽  
...  

2015 ◽  
Vol 28 (3) ◽  
pp. 1268-1287 ◽  
Author(s):  
A. Gettelman ◽  
H. Morrison

Abstract. Prognostic precipitation is added to a cloud microphysical scheme for global climate models. Results indicate very similar performance to other commonly used mesoscale schemes in an offline driver for idealized warm rain cases, better than the previous version of the global model microphysics scheme with diagnostic precipitation. In the mixed-phase regime, there is significantly more water and less ice, which may address a common bias seen with the scheme in climate simulations in the Arctic. For steady forcing cases, the scheme has limited sensitivity to time step out to the ~15-min time steps typical of global models. The scheme is similar to other schemes with moderate sensitivity to vertical resolution. The limited time step sensitivity bodes well for use of the scheme in multiscale models from the mesoscale to the large scale. The scheme is sensitive to idealized perturbations of cloud drop and crystal number. Precipitation decreases and condensate increases with increasing drop number, indicating substantial decreases in precipitation efficiency. The sensitivity is less than with the previous version of the scheme for low drop number concentrations (Nc < 100 cm⁻³). Ice condensate increases with ice number, with large decreases in liquid condensate as well for a mixed-phase case. As expected with prognostic precipitation, accretion is stronger than with diagnostic precipitation and the accretion-to-autoconversion ratio increases faster with liquid water path (LWP), in better agreement with idealized models and earlier studies than the previous version.
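The accretion-to-autoconversion behavior discussed above can be sketched with the Khairoutdinov–Kogan (2000) warm-rain rates used by many bulk schemes of this family; the mixing-ratio values below are hypothetical, chosen only to show the ratio growing as condensate (and hence LWP) increases when rain is carried prognostically.

```python
def kk2000_rates(qc, qr, nc):
    """Khairoutdinov-Kogan (2000) warm-rain process rates [kg/kg/s].
    qc, qr: cloud/rain water mixing ratios [kg/kg]; nc: drop number [cm^-3].
    """
    autoconversion = 1350.0 * qc**2.47 * nc**(-1.79)
    accretion = 67.0 * (qc * qr)**1.15
    return autoconversion, accretion

# With prognostic precipitation, rain survives between time steps, so qr
# (and with it accretion) can build up as condensate increases; diagnosing
# precipitation anew each step keeps qr small and under-weights accretion.
for qc, qr in [(2e-4, 1e-5), (5e-4, 1e-4), (1e-3, 5e-4)]:
    auto, accr = kk2000_rates(qc, qr, nc=100.0)
    print(f"qc={qc:.0e}  accretion/autoconversion = {accr / auto:.2f}")
```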


Atmosphere ◽  
2021 ◽  
Vol 12 (5) ◽  
pp. 638
Author(s):  
Jiabo Li ◽  
Xindong Peng ◽  
Xiaohan Li ◽  
Yanluan Lin ◽  
Wenchao Chu

Scale-aware parameterizations of subgrid-scale physics are essential for multiscale atmospheric modeling. A single-ice (SI) microphysics scheme and a Gaussian probability-density-function (Gauss-PDF) macrophysics scheme were implemented in the single-column version of the Global-to-Regional Integrated forecast System model (SGRIST) and tested using the Tropical Warm Pool-International Cloud Experiment (TWP-ICE) and the Atmospheric Radiation Measurement Southern Great Plains Experiment in 1997 (ARM97). Their performance was evaluated against observations and reference schemes. The new schemes simulated reasonable precipitation, with proper fluctuations and peaks, and reasonable ice and liquid water contents, especially below 650 hPa during the wet period of TWP-ICE. The root mean square error (RMSE) of the simulated cloud fraction below 200 hPa was 0.10/0.08 in the wet/dry period, an obvious improvement over the 0.11/0.11 of the original scheme. Accumulated ice water content below the melting level decreased by 21.57% with the SI scheme. The average liquid water content of the new scheme matched observations well and was twice as large as that of the reference scheme. In the ARM97 simulations, the SI scheme produced considerable ice water content, especially when convection was active. Low-level cloud fraction and precipitation extremes were improved with the Gauss-PDF scheme, whose cloud-fraction RMSE of 0.02 was only half that of the original scheme. The study indicates that the SI and Gauss-PDF schemes are promising approaches for simplifying the microphysics process and improving low-level cloud modeling.
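The core of a Gauss-PDF macrophysics closure is the textbook result that cloud fraction equals the probability that total water exceeds saturation under the assumed Gaussian. The sketch below is a minimal single-variate version with illustrative numbers, not the SGRIST implementation.

```python
import math

def gauss_pdf_cloud_fraction(qt_mean, qt_std, q_sat):
    """Cloud fraction under an assumed Gaussian subgrid PDF of total water:
        CF = P(qt > q_sat) = 0.5 * erfc((q_sat - qt_mean) / (sqrt(2) * sigma))
    """
    s = (q_sat - qt_mean) / (math.sqrt(2.0) * qt_std)
    return 0.5 * math.erfc(s)

# A grid box whose mean state is slightly subsaturated can still be
# partly cloudy once subgrid variability is accounted for.
print(gauss_pdf_cloud_fraction(qt_mean=9.0e-3, qt_std=0.5e-3, q_sat=9.3e-3))
```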


2021 ◽  
Vol 14 (1) ◽  
pp. 177-204
Author(s):  
Chein-Jung Shiu ◽  
Yi-Chi Wang ◽  
Huang-Hsiung Hsu ◽  
Wei-Ting Chen ◽  
Hua-Lu Pan ◽  
...  

Abstract. Cloud macrophysics schemes are unique parameterizations for general circulation models. We propose an approach based on a probability density function (PDF) that utilizes cloud condensates and saturation ratios to replace the assumption of a critical relative humidity (RH). We test this approach, called the Global Forecast System (GFS) – Taiwan Earth System Model (TaiESM) – Sundqvist (GTS) scheme, as the macrophysics scheme within the Community Atmosphere Model version 5.3 (CAM5.3) framework. In single-column model results, the new approach simulates cloud fraction (CF)–RH distributions that are closer to observations than those of the default CAM5.3 scheme. We also validate the impact of the GTS scheme on global climate simulations with satellite observations. The simulated CF is comparable to CloudSat/Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data. Comparisons of the vertical distributions of CF and cloud water content (CWC), as functions of large-scale dynamic and thermodynamic parameters, with the CloudSat/CALIPSO data suggest that the GTS scheme closely reproduces observations. This is particularly noticeable for thermodynamic parameters such as RH, upper-tropospheric temperature, and total precipitable water, implying that our scheme simulates the variation in CF associated with RH more reliably than the default scheme. Changes in CF and CWC affect climatic fields and large-scale circulation via cloud–radiation interaction. Both the climatological means and annual cycles of many of the GTS-simulated variables are improved compared with the default scheme, particularly the water vapor and RH fields. Different PDF shapes in the GTS scheme also significantly affect the global simulations.
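For contrast, the critical-RH assumption that the GTS scheme replaces is compactly expressed by the classic Sundqvist diagnostic, sketched below with an assumed critical RH of 0.8; cloud fraction depends only on grid-mean RH, with no information from cloud condensate or the shape of the subgrid PDF.

```python
import math

def sundqvist_cloud_fraction(rh, rh_crit=0.8):
    """Sundqvist-type diagnostic cloud fraction from grid-mean RH:
        CF = 1 - sqrt((1 - RH) / (1 - RH_crit))   for RH > RH_crit,
    and CF = 0 otherwise."""
    if rh <= rh_crit:
        return 0.0
    return 1.0 - math.sqrt(max(0.0, (1.0 - rh) / (1.0 - rh_crit)))

for rh in (0.75, 0.85, 0.95, 1.0):
    print(rh, round(sundqvist_cloud_fraction(rh), 3))
```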


2021 ◽  
pp. 107754632110201
Author(s):  
Mohammad Ali Heravi ◽  
Seyed Mehdi Tavakkoli ◽  
Alireza Entezami

In this article, autoregressive time series analysis is used to extract reliable features from vibration measurements of civil structures for damage diagnosis. To guarantee the adequacy and applicability of the time series model, the Leybourne–McCabe hypothesis test is used. Subsequently, the probability density functions of the autoregressive model parameters and residuals are obtained with the aid of a kernel density estimator. These probability density function sets are treated as damage-sensitive features of the structure, and the fast distance correlation method is used to decide whether the structure is damaged. Experimental data from a well-known three-story laboratory frame and a large-scale bridge benchmark structure are used to verify the efficiency and accuracy of the proposed method. Results indicate that the method can identify the location and severity of damage, even under simulated operational and environmental variability.
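The feature-extraction step, an autoregressive fit followed by kernel density estimation of its parameters and residuals, can be sketched as follows. This is a simplified stand-in: a least-squares AR fit on a synthetic signal, with the Leybourne–McCabe adequacy test and the fast distance correlation decision step omitted.

```python
import numpy as np
from scipy.stats import gaussian_kde

def ar_feature_pdfs(signal, order=10):
    """Fit an AR(order) model by least squares and return kernel density
    estimates of its parameters and residuals, i.e. the damage-sensitive
    feature PDFs compared between baseline and test states."""
    n = len(signal)
    # Lagged design matrix: signal[t] ~ sum_k a_k * signal[t - k - 1]
    X = np.column_stack([signal[order - k - 1 : n - k - 1] for k in range(order)])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coeffs
    return gaussian_kde(coeffs), gaussian_kde(residuals)

# Synthetic stand-in for a vibration measurement.
rng = np.random.default_rng(2)
x = np.sin(0.3 * np.arange(2000)) + 0.1 * rng.standard_normal(2000)
kde_params, kde_residuals = ar_feature_pdfs(x)
```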


2018 ◽  
Vol 611 ◽  
pp. A53 ◽  
Author(s):  
S. Jamal ◽  
V. Le Brun ◽  
O. Le Fèvre ◽  
D. Vibert ◽  
A. Schmitt ◽  
...  

Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10⁶) that will require fully automated data-processing pipelines to analyze the data, extract crucial information, and ensure that all requirements are met. A fundamental element in these pipelines is to associate with each galaxy redshift measurement a quality, or reliability, estimate. Aims. In this work, we introduce a new approach to automating the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods. We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework in order to incorporate all sources of information and uncertainty related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features of the redshift posterior PDF and machine learning algorithms. Results. As a working example, public data from the VIMOS VLT Deep Survey are exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but owing to the subjective definition of these flags (classification accuracy ~58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy ~98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions. Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis for an automated reliability assessment of spectroscopic redshift measurements. This newly defined method is very promising for next-generation large spectroscopic surveys from the ground and in space, such as Euclid and WFIRST.
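The idea of summarizing each redshift posterior PDF by a few descriptive features and partitioning the feature space with unsupervised classification can be sketched as follows; the three features and the choice of k-means are illustrative assumptions, not the paper's exact descriptor set or clustering algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

def pdf_features(z_grid, pdf):
    """Descriptive features of a redshift posterior PDF: dispersion,
    entropy, and the strength of the main mode."""
    pdf = pdf / np.trapz(pdf, z_grid)  # normalize to unit probability
    mean = np.trapz(z_grid * pdf, z_grid)
    sigma = np.sqrt(np.trapz((z_grid - mean) ** 2 * pdf, z_grid))
    p = pdf / pdf.sum()
    entropy = -np.sum(p * np.log(p + 1e-12))
    peak_strength = pdf.max() / pdf.sum()
    return [sigma, entropy, peak_strength]

# Hypothetical PDFs standing in for pipeline output, partitioned into
# clusters that would then be inspected and mapped to reliability labels.
rng = np.random.default_rng(3)
z = np.linspace(0.0, 2.0, 500)
pdfs = [np.exp(-0.5 * ((z - rng.uniform(0.2, 1.8)) / rng.uniform(0.01, 0.3)) ** 2)
        for _ in range(200)]
X = np.array([pdf_features(z, p) for p in pdfs])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
```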

