Convective Forcing Fluctuations in a Cloud-Resolving Model: Relevance to the Stochastic Parameterization Problem

2007 ◽  
Vol 20 (2) ◽  
pp. 187-202 ◽  
Author(s):  
G. J. Shutts ◽  
T. N. Palmer

Abstract Idealized cloud-resolving model (CRM) simulations spanning a large part of the tropical atmosphere are used to evaluate the extent to which deterministic convective parameterizations fail to capture the statistical fluctuations in deep-convective forcing, and to provide probability distribution functions that may be used in stochastic parameterization schemes for global weather and climate models. A coarse-graining methodology is employed to deduce an effective convective warming rate appropriate to the grid scale of a forecast model, and a convective parameterization scheme is used to bin these computed tendencies into different ranges of convective forcing strength. The dependence of the probability distribution functions for the coarse-grained temperature tendency on parameterized tendency is then examined. An aquaplanet simulation using a climate model, configured with similar horizontal resolution to that of the coarse-grained CRM fields, was used to compare temperature tendency variation (less the effect of advection and radiation) with that deduced as an effective forcing function from the CRM. The coarse-grained temperature tendency of the CRM is found to have a substantially broader probability distribution function than the equivalent quantity in the climate model. The CRM-based probability distribution functions of precipitation rate and convective warming are related to the statistical mechanics theory of Craig and Cohen and the “stochastic physics” scheme of Buizza et al. It is found that the standard deviation of the coarse-grained effective convective warming is an approximately linear function of its mean, thereby providing some support for the Buizza et al. scheme, used operationally by ECMWF.
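The coarse-graining step described above can be sketched as follows. This is a minimal illustration, not the authors' code: the fine-grid field is synthetic gamma-distributed noise standing in for a CRM convective-heating field, and the block-averaging factor is chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a CRM convective-heating field (K/day) on a fine
# grid; in the paper these fields come from cloud-resolving simulations.
fine = rng.gamma(shape=0.5, scale=4.0, size=(512, 512))

def coarse_grain(field, factor):
    """Block-average a 2-D field over factor x factor boxes, giving an
    effective tendency at a forecast model's grid scale."""
    n, m = field.shape
    return field.reshape(n // factor, factor, m // factor, factor).mean(axis=(1, 3))

coarse = coarse_grain(fine, 32)   # 16 x 16 coarse grid boxes

# Empirical PDF of the coarse-grained tendency, of the kind compared
# against the climate model's parameterized tendency in the paper.
pdf, edges = np.histogram(coarse, bins=20, density=True)
print(coarse.shape, coarse.mean())
```

Block-averaging conserves the domain mean, so the coarse field's mean tendency matches the fine field's; only the fluctuation statistics change with the averaging scale.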


1997 ◽  
Vol 78 (10) ◽  
pp. 1904-1907 ◽  
Author(s):  
Weinan E ◽  
Konstantin Khanin ◽  
Alexandre Mazel ◽  
Yakov Sinai


2021 ◽  
Author(s):  
Hamed Farhadi ◽  
Manousos Valyrakis

Applying an instrumented particle [1-3], the probability density functions of the kinetic energy of a coarse particle (at different solid densities) mobilised over a range of above-threshold flow conditions corresponding to the intermittent transport regime were explored. The experiments were conducted in the Water Engineering Lab at the University of Glasgow on a tilting recirculating flume 800 cm long and 90 cm wide. Twelve flow conditions corresponding to the intermittent transport regime, for the range of particle densities examined herein, were implemented in this research. To ensure fully developed flow conditions, the start of the test section was located 3.2 m upstream of the flume outlet. The bed surface of the flume is flat and made up of well-packed glass beads of 16.2 mm diameter, offering a uniform roughness over which the instrumented particle is transported. MEMS sensors, comprising a 3-axis gyroscope and a 3-axis accelerometer, are embedded within the instrumented particle. At the beginning of each experimental run, the instrumented particle is placed at the upstream end of the test section, fully exposed to the free-stream flow. Its motion is recorded with top and side cameras to enable a deeper understanding of particle transport processes. Using results from sets of instrumented-particle transport experiments with varying flow rates and particle densities, the probability distribution functions (PDFs) of the instrumented particle's kinetic energy were generated. The best-fitting PDFs were selected by applying the Kolmogorov-Smirnov test, and the results were discussed in light of the recent literature on particle velocity distributions.

[1] Valyrakis, M.; Alexakis, A. Development of a "smart-pebble" for tracking sediment transport. In Proceedings of the International Conference on Fluvial Hydraulics (River Flow 2016), St. Louis, MO, USA, 12-15 July 2016.
[2] Al-Obaidi, K., Xu, Y. & Valyrakis, M. 2020, The Design and Calibration of Instrumented Particles for Assessing Water Infrastructure Hazards, Journal of Sensor and Actuator Networks, vol. 9, no. 3, 36.
[3] Al-Obaidi, K. & Valyrakis, M. 2020, A sensory instrumented particle for environmental monitoring applications: development and calibration, IEEE Sensors Journal (accepted).
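The PDF-selection step described above, fitting candidate distributions and keeping the one with the smallest Kolmogorov-Smirnov statistic, can be sketched with SciPy. The kinetic-energy samples below are synthetic placeholders for the instrumented-particle measurements, and the candidate families are illustrative choices, not necessarily those tested in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical kinetic-energy samples (J) standing in for the
# instrumented-particle data recorded in the flume experiments.
ke = rng.gamma(shape=1.2, scale=2e-4, size=500)

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "expon": stats.expon}
results = {}
for name, dist in candidates.items():
    params = dist.fit(ke)                        # maximum-likelihood fit
    d, p = stats.kstest(ke, dist.cdf, args=params)
    results[name] = (d, p)                       # KS statistic and p-value

# "Best-fitting" here means the smallest KS distance between the empirical
# and fitted CDFs.
best = min(results, key=lambda k: results[k][0])
print(best, results[best])
```

Note that fitting parameters to the same sample used in the KS test makes the standard p-values optimistic; they are still usable for ranking candidate families by goodness of fit.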



Author(s):  
D. Xue ◽  
S. Y. Cheing ◽  
P. Gu

This research introduces a new systematic approach to identifying the optimal design configuration and attributes so as to minimize potential construction project changes. The second part of this paper focuses on the attribute-design aspect. In this research, the potential changes of design attribute values are modeled by probability distribution functions. Attribute values of the design whose construction tasks are least sensitive to changes in these attribute values are identified based upon the Taguchi method. In addition, estimation of the potential project change cost due to potential design attribute value changes is also discussed. Case studies in pipeline engineering design and construction have been conducted to show the effectiveness of the introduced approach.
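The change-cost estimation described above, propagating an attribute-change PDF through a cost model, can be sketched by Monte Carlo. Everything here is hypothetical: the wall-thickness attribute, the normal change distribution, and the piecewise cost function are invented stand-ins, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(2)

def change_cost(delta):
    """Assumed rework cost ($) for an attribute change of size delta (mm):
    a linear component plus a fixed penalty when the change exceeds a
    tolerance of 1 mm (both made up for illustration)."""
    return 1000.0 * np.abs(delta) + 5000.0 * (np.abs(delta) > 1.0)

# Potential attribute-value changes modeled by a probability distribution
# function (here, normal with a 0.6 mm standard deviation).
deltas = rng.normal(loc=0.0, scale=0.6, size=100_000)
expected_cost = change_cost(deltas).mean()
print(round(expected_cost, 1))
```

Comparing this expectation across candidate attribute values (each with its own change PDF) is one way to pick the value whose downstream construction tasks are least sensitive to change, in the spirit of the Taguchi-based selection the abstract describes.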



2015 ◽  
Vol 19 (sup8) ◽  
pp. S8-32-S8-37 ◽  
Author(s):  
K. Deng ◽  
J. Deng ◽  
T. Sun ◽  
Y. Guan ◽  
F. Yang ◽  
...  


2014 ◽  
Vol 29 (5) ◽  
pp. 1259-1265 ◽  
Author(s):  
David R. Novak ◽  
Keith F. Brill ◽  
Wallace A. Hogsett

Abstract An objective technique to determine forecast snowfall ranges consistent with the risk tolerance of users is demonstrated. The forecast snowfall ranges are based on percentiles from probability distribution functions that are assumed to be perfectly calibrated. A key feature of the technique is that the snowfall range varies dynamically, with the resultant ranges varying based on the spread of ensemble forecasts at a given forecast projection, for a particular case, for a particular location. Furthermore, this technique allows users to choose their risk tolerance, quantified in terms of the expected false alarm ratio for forecasts of snowfall range. The technique is applied to the 4–7 March 2013 snowstorm at two different locations (Chicago, Illinois, and Washington, D.C.) to illustrate its use in different locations with different forecast uncertainties. The snowfall range derived from the Weather Prediction Center Probabilistic Winter Precipitation Forecast suite is found to be statistically reliable for the day 1 forecast during the 2013/14 season, providing confidence in the practical applicability of the technique.
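The range-construction idea above, taking central percentiles of an ensemble so that the chosen risk tolerance equals the chance of the verifying value falling outside the range, can be sketched as follows. The ensemble members are invented numbers, not values from the WPC forecast suite.

```python
import numpy as np

# Hypothetical ensemble snowfall forecasts (inches) for one location; in
# the paper these come from the WPC Probabilistic Winter Precipitation
# Forecast suite.
members = np.array([4.1, 5.3, 6.0, 6.8, 7.2, 8.0, 8.5, 9.4, 10.1, 12.3])

def snowfall_range(ens, risk_tolerance):
    """Central percentile range with coverage 1 - risk_tolerance.

    risk_tolerance is the accepted probability that the verifying
    snowfall falls outside the forecast range, assuming the ensemble
    PDF is perfectly calibrated."""
    lo = 100.0 * risk_tolerance / 2.0
    hi = 100.0 * (1.0 - risk_tolerance / 2.0)
    return np.percentile(ens, [lo, hi])

# A cautious user (10% risk) gets a wider range than a tolerant one (50%),
# and both ranges widen or narrow with the ensemble spread for each case.
print(snowfall_range(members, 0.10))
print(snowfall_range(members, 0.50))
```

This makes the two key properties of the technique concrete: the range width is driven by ensemble spread rather than fixed categories, and the user's risk tolerance directly sets the expected false alarm ratio.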


