An empirical method for estimating probability density functions of gridded daily minimum and maximum temperature

2013 ◽  
Vol 10 (1) ◽  
pp. 59-64
Author(s):  
C. Lussana

Abstract. This work investigates the probability density functions (PDFs) of gridded daily minimum (TN) and maximum (TX) temperature, with the twin aims of characterising a region and detecting extreme values. The empirical PDF estimation procedure uses the most recent years of gridded temperature analysis fields available at ARPA Lombardia, in Northern Italy. The spatial interpolation is based on an implementation of Optimal Interpolation using observations from a dense surface network of automated weather stations. Care has been taken to identify a time period and spatial areas with stable data density, since an unsettled station distribution would otherwise bias the analysis. The PDF adopted in this study is based on the Gaussian distribution but is given an asymmetrical (skewed) shape so that warming and cooling events can be distinguished. Once the occurrence of extreme events is properly defined, concise local-scale information can be delivered to users directly, such as: TX extremely cold/hot or TN extremely cold/hot.
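
As a hedged illustration of this approach, the sketch below fits a skewed Gaussian to daily TX values and flags tail events; scipy's skew-normal is used as a stand-in, since the paper's exact asymmetric form is not reproduced here, and the data and percentile thresholds are placeholder assumptions.

```python
# A hedged sketch, not the paper's implementation: fit a skew-normal PDF
# (a stand-in for the asymmetric Gaussian described above) to daily TX
# values and flag tail events via percentile thresholds.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic right-skewed daily TX series (deg C); placeholder data.
tx = rng.normal(25.0, 4.0, 3650) + rng.exponential(2.0, 3650)

a, loc, scale = stats.skewnorm.fit(tx)   # shape, location, scale
dist = stats.skewnorm(a, loc, scale)

cold_thr = dist.ppf(0.01)  # "extremely cold" threshold (assumed 1st pct)
hot_thr = dist.ppf(0.99)   # "extremely hot" threshold (assumed 99th pct)

def classify(t):
    """Concise local-scale message of the kind the abstract describes."""
    if t < cold_thr:
        return "TX extremely cold"
    if t > hot_thr:
        return "TX extremely hot"
    return "TX within the usual range"

print(classify(40.0))
```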

2010 ◽  
Vol 23 (24) ◽  
pp. 6605-6623 ◽  
Author(s):  
Jiafu Mao ◽  
Xiaoying Shi ◽  
Lijuan Ma ◽  
Dale P. Kaiser ◽  
Qingxiang Li ◽  
...  

Abstract Using a recently homogenized observational dataset of daily maximum (TMAX) and minimum (TMIN) temperature for China, the extreme temperatures from the 40-yr ECMWF Re-Analysis (ERA-40), the Japanese 25-year Reanalysis (JRA-25), the NCEP/Department of Energy Global Reanalysis 2 (NCEP-2), and ECMWF's ERA-Interim (ERAIn) are assessed for summer (June–August) and winter (December–February) by means of probability density functions over the periods 1979–2001 and 1990–2001. For 1979–2001, no single reanalysis is consistently accurate across the eight areas examined over China. ERA-40 and JRA-25 show similar representations and close skill scores over most regions of China in both seasons. NCEP-2 generally has lower skill scores, especially over regions with complex topography. The regional and seasonal differences identified are commonly associated with geographical location and with the methods used to diagnose these quantities. All of the selected reanalysis products perform better in winter than in summer over most regions of China. The TMAX values from the reanalyses tend to be systematically underestimated, while TMIN is systematically closer to the observed values than TMAX. Comparing the reanalyses' ability to reproduce the 99.7th percentile of TMAX and the 0.3rd percentile of TMIN shows that most reanalyses tend to underestimate the 99.7th percentile of maximum temperature in both summer and winter. For the 0.3rd percentile of TMIN, NCEP-2 is comparatively inaccurate, with a −12°C cold bias over the Qinghai–Tibetan Plateau in winter. ERA-40 and JRA-25 generally overestimate the extreme TMIN, and their extreme percentage differences are quite similar over all of the regions. The results for 1990–2001 are broadly similar, but in contrast to the other three reanalysis products, the newly released ERAIn performs very well, especially for wintertime TMIN, with a skill score greater than 0.83 for each region of China. This demonstrates the great potential of this product for future continental-scale impact assessments based on extreme temperatures.
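
A minimal illustration of the tail-percentile comparison described above, using synthetic placeholder series; the 99.7th and 0.3rd percentiles are those assessed in the study, but the bias magnitude here is invented.

```python
# Illustrative tail-percentile check with synthetic placeholder data:
# compare a reanalysis series against observations at the 99.7th (TMAX)
# percentile discussed above; the 0.3rd (TMIN) case works the same way.
import numpy as np

rng = np.random.default_rng(1)
tmax_obs = rng.normal(30.0, 5.0, 8400)            # observed summer TMAX
tmax_rea = tmax_obs - rng.normal(1.0, 0.5, 8400)  # reanalysis with a cold bias

p997_obs = np.percentile(tmax_obs, 99.7)
p997_rea = np.percentile(tmax_rea, 99.7)
print(f"99.7th-pct bias (reanalysis - obs): {p997_rea - p997_obs:+.2f} degC")
```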


2007 ◽  
Vol 20 (17) ◽  
pp. 4356-4376 ◽  
Author(s):  
S. E. Perkins ◽  
A. J. Pitman ◽  
N. J. Holbrook ◽  
J. McAneney

Abstract The coupled climate models used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change are evaluated over 12 regions of Australia for their daily simulation of precipitation, minimum temperature, and maximum temperature. The evaluation is based on probability density functions, and a simple quantitative measure of how well each climate model captures the observed probability density function for each variable and each region is introduced. Across all three variables, the coupled climate models perform better than expected. Precipitation is simulated reasonably by most models and very well by a small number, although the problem of excessive drizzle is apparent in most. Averaged over Australia, 3 of the 14 climate models capture more than 80% of the observed probability density functions for precipitation. Minimum temperature is simulated well, with 10 of the 13 climate models capturing more than 80% of the observed probability density functions. Maximum temperature is also reasonably simulated, with 6 of 10 climate models capturing more than 80% of the observed probability density functions. An overall ranking of the climate models is presented for each of precipitation, maximum temperature, and minimum temperature, and averaged over these three variables. The climate models that are skillful over Australia are identified, providing guidance on which models should be used in impacts assessments based on precipitation or temperature. These results have no bearing on how well the models perform elsewhere, but the methodology is potentially useful in assessing which of the many climate models should be used by impacts groups.
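
The quantitative measure described above is commonly implemented as the summed overlap of the two binned PDFs; a sketch of that formulation follows, with the bin count and synthetic data as assumptions rather than the paper's settings.

```python
# Sketch of a binned PDF overlap score: bin model and observations on a
# common grid and sum the minimum frequency per bin, so 1.0 means the model
# PDF matches the observed PDF exactly.
import numpy as np

def pdf_skill_score(model, obs, bins=50):
    edges = np.linspace(min(model.min(), obs.min()),
                        max(model.max(), obs.max()), bins + 1)
    fm, _ = np.histogram(model, bins=edges)
    fo, _ = np.histogram(obs, bins=edges)
    fm = fm / fm.sum()  # normalize counts to frequencies
    fo = fo / fo.sum()
    return np.minimum(fm, fo).sum()

rng = np.random.default_rng(2)
obs = rng.normal(18.0, 6.0, 5000)   # synthetic observed daily Tmin
mod = rng.normal(19.0, 7.0, 5000)   # synthetic modelled daily Tmin
# The paper counts a model as capturing the observed PDF above 80% overlap.
print(f"skill score: {pdf_skill_score(mod, obs):.3f}")
```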


Algorithms ◽  
2020 ◽  
Vol 13 (7) ◽  
pp. 164 ◽  
Author(s):  
Wojciech Rafajłowicz

We consider a rather general problem of nonparametric estimation of an uncountable set of probability density functions (p.d.f.'s) of the form f(x; r), where r is a non-random real variable ranging from R1 to R2. We emphasise the algorithmic aspects of this problem, since they are crucial for the exploratory analysis of the big data needed for the estimation. A specialized learning algorithm, based on the 2D FFT, is proposed and tested on observations that allow the p.d.f.'s of jet engine temperatures to be estimated as a function of rotation speed. We also derive theoretical results concerning the convergence of the estimation procedure, which provide guidance on selecting the parameters of the estimation algorithm.
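
A minimal sketch of an FFT-based two-dimensional density estimate over (x, r) pairs, in the spirit of the algorithm described above; the kernel shape, bandwidths, grid sizes, and synthetic jet-engine-style data are all assumptions, not the paper's choices.

```python
# Hedged sketch: estimate the family f(x; r) by binning (x, r) pairs onto a
# 2D grid and smoothing with a separable Gaussian kernel via FFT convolution
# (the FFT-accelerated step), then normalizing each r-slice in x.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(3)
r = rng.uniform(0.0, 1.0, 50_000)          # rotation speed, rescaled to [0, 1]
x = rng.normal(300.0 + 200.0 * r, 20.0)    # synthetic temperature given r

H, x_edges, r_edges = np.histogram2d(x, r, bins=(256, 64))
gx = np.exp(-0.5 * (np.arange(-8, 9) / 2.5) ** 2)   # kernel in x (assumed bandwidth)
gr = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)   # kernel in r (assumed bandwidth)
kernel = np.outer(gx, gr)
kernel /= kernel.sum()
smooth = np.maximum(fftconvolve(H, kernel, mode="same"), 0.0)  # clip FFT roundoff

# Normalize each r-column so every slice integrates to 1 in x: an estimate
# of f(x; r) on the grid.
dx = x_edges[1] - x_edges[0]
f_xr = smooth / (smooth.sum(axis=0, keepdims=True) * dx)
```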


2021 ◽  
Vol 13 (12) ◽  
pp. 2307
Author(s):  
J. Javier Gorgoso-Varela ◽  
Rafael Alonso Ponce ◽  
Francisco Rodríguez-Puerta

The diameter distributions of trees in 50 temporary sample plots (TSPs) established in Pinus halepensis Mill. stands were recovered from LiDAR metrics by using six probability density functions (PDFs): the Weibull (2P and 3P), Johnson's SB, beta, generalized beta and gamma-2P functions. The parameters were recovered from the first and second moments of the distributions (mean and variance, respectively) by using parameter recovery models (PRM). Linear models were used to predict both moments from LiDAR data. In recovering the functions, the location parameters of the distributions were predetermined as the minimum diameter inventoried, and the scale parameters were established as the maximum diameters predicted from LiDAR metrics. The Kolmogorov–Smirnov (KS) statistic (Dn), the number of acceptances by the KS test, the Cramér–von Mises (W2) statistic, bias and mean square error (MSE) were used to evaluate the goodness of fit. The fits for the six recovered functions were compared with the fits to all measured data from 58 TSPs (LiDAR metrics could only be extracted from 50 of the plots). In the fitting phase, the location parameters were fixed at a suitable value determined according to the forestry literature (0.75·dmin). The linear models used to recover the two moments of the distributions and the maximum diameters determined from LiDAR data were accurate, with R2 values of 0.750, 0.724 and 0.873 for dg, dmed and dmax, respectively. Reasonable results were obtained with all six recovered functions. The goodness-of-fit statistics indicated that the beta function was the most accurate, followed by the generalized beta function; the Weibull-3P function provided the poorest fits, and the Weibull-2P and Johnson's SB functions also fitted the data poorly.
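
As a hedged illustration of moment-based parameter recovery, the sketch below solves the two-parameter Weibull's shape and scale from a predicted mean and variance; the paper's exact PRM formulations for all six PDFs are not reproduced, and the input moments are placeholders.

```python
# Moment-based recovery for the two-parameter Weibull: given a mean and
# variance (here, stand-ins for values predicted from LiDAR metrics), solve
# the shape c from the coefficient of variation, then recover the scale b.
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma

def weibull2p_from_moments(mean, var):
    cv2 = var / mean**2
    # CV^2 depends only on c: Gamma(1 + 2/c) / Gamma(1 + 1/c)^2 - 1 = CV^2
    f = lambda c: gamma(1 + 2 / c) / gamma(1 + 1 / c) ** 2 - 1 - cv2
    c = brentq(f, 0.1, 50.0)
    b = mean / gamma(1 + 1 / c)
    return b, c

# Placeholder moments: mean diameter 18 cm, variance 25 cm^2.
b, c = weibull2p_from_moments(18.0, 25.0)
print(f"scale b = {b:.2f} cm, shape c = {c:.2f}")
```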


2021 ◽  
Vol 502 (2) ◽  
pp. 1768-1784
Author(s):  
Yue Hu ◽  
A Lazarian

ABSTRACT The velocity gradients technique (VGT) and the probability density functions (PDFs) of mass density are tools for studying turbulence, magnetic fields, and self-gravity in molecular clouds. However, self-absorption can make the observed intensity differ significantly from the underlying column density structure. In this work, we study the effects of self-absorption on the VGT and on intensity PDFs using three synthetic emission lines of the CO isotopologues 12CO (1–0), 13CO (1–0), and C18O (1–0). We confirm that the performance of the VGT is insensitive to radiative transfer effects, and we show numerically that 3D magnetic field tomography can be constructed through the VGT. We find that, for supersonic turbulence, the intensity PDFs change shape from purely lognormal to a distribution exhibiting a power-law tail, depending on the optical depth. We conclude that changes in the intensity PDFs of the CO isotopologues can occur independently of self-gravity, which makes intensity PDFs less reliable for identifying gravitationally collapsing regions. We compute the intensity PDFs for the star-forming region NGC 1333 and find that the observed change in the intensity PDFs agrees with our numerical results. The synergy of the VGT and the column density PDFs confirms that self-gravitating gas occupies a large volume in NGC 1333.
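
The lognormal-versus-power-law-tail diagnosis mentioned above can be illustrated as follows: build the intensity PDF, fit a lognormal to the logarithmic moments, and look for a high-intensity excess. The synthetic intensity field and binning choices below are assumptions for illustration only, not the paper's simulation data.

```python
# Illustration only: fit a lognormal to a synthetic intensity field and
# look for a high-intensity excess over it, the signature of the power-law
# tail discussed above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
body = rng.lognormal(mean=0.0, sigma=0.5, size=90_000)   # lognormal "body"
tail = (rng.pareto(2.0, size=10_000) + 1.0) * 2.0        # power-law tail
intensity = np.concatenate([body, tail])

logI = np.log(intensity)
mu, sigma = logI.mean(), logI.std()

hist, edges = np.histogram(intensity, bins=np.logspace(-1, 2, 60), density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
model = stats.lognorm.pdf(centers, s=sigma, scale=np.exp(mu))
ratio = np.divide(hist, model, out=np.zeros_like(hist), where=model > 0)

# A ratio far above 1 at high intensity flags the power-law excess.
print(f"excess over lognormal near I = 50: {ratio[np.searchsorted(centers, 50.0)]:.1e}")
```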


Geosciences ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 322
Author(s):  
Evelina Volpe ◽  
Luca Ciabatta ◽  
Diana Salciarini ◽  
Stefania Camici ◽  
Elisabetta Cattoni ◽  
...  

The development of forecasting models for evaluating potential slope instability after rainfall events is an important issue for the scientific community. The topic has gained considerable impetus from the effects of climate change, as several studies demonstrate that global warming can significantly influence landslide activity and the stability conditions of natural and artificial slopes. A consolidated approach to evaluating rainfall-induced landslide hazard integrates rainfall forecasts with physically based (PB) predictive models through deterministic laws. However, given the complex nature of the processes and the high variability of the random quantities involved, probabilistic approaches are recommended in order to obtain reliable predictions. A crucial aspect of the stochastic approach is the definition of appropriate probability density functions (pdfs) to model the uncertainty of the input variables, as this may strongly affect the evaluation of the probability of failure (PoF). The role of the pdf definition in reliability analysis is discussed through a comparison of PoF maps generated using Monte Carlo (MC) simulations performed over a study area in the Umbria region of central Italy. The study revealed that using uniform pdfs for the random input variables, as is often done when a detailed geotechnical characterization of the soil is not available, could be inappropriate.
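
A minimal Monte Carlo sketch of the point about input pdfs: for an illustrative dry, cohesionless infinite-slope model (not the study's PB model or parameter values), switching the friction-angle pdf from uniform to a crudely truncated normal changes the estimated probability of failure appreciably.

```python
# Hedged Monte Carlo sketch: the probability of failure shifts when the
# friction-angle pdf changes, even though the parameter range is the same.
# Slope geometry and ranges are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
beta = np.radians(28.0)   # assumed slope angle

def prob_failure(phi_deg):
    fs = np.tan(np.radians(phi_deg)) / np.tan(beta)  # factor of safety
    return (fs < 1.0).mean()

phi_uniform = rng.uniform(25.0, 35.0, n)                     # uniform pdf
phi_normal = np.clip(rng.normal(30.0, 2.5, n), 25.0, 35.0)   # truncated normal

print(f"PoF, uniform pdf: {prob_failure(phi_uniform):.3f}")
print(f"PoF, normal pdf:  {prob_failure(phi_normal):.3f}")
```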

