Weighted Means of Likelihood Ratio as Indices Measuring Similarity Between Densities

Author(s):  
Hideyoshi Ko

Criteria for similarity between probability density functions are important in areas of statistics such as density estimation. In this short paper, a set of indices measuring similarity between probability densities is proposed using weighted means of the likelihood ratio function. Numerical simulations demonstrate that estimates of these indices are easily obtained from observations and could be useful for both parametric and nonparametric density estimation with numerical optimization.
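The abstract does not reproduce the indices' exact form, so the following is only a minimal sketch of the general idea, assuming the indices are power-weighted means of the likelihood ratio f/g estimated via Gaussian kernel density estimates (the densities, sample sizes, and choice of powers are all hypothetical):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Observations from two densities f and g (hypothetical stand-ins).
xf = rng.normal(0.0, 1.0, 1000)
xg = rng.normal(0.5, 1.2, 1000)

f_hat = gaussian_kde(xf)   # nonparametric density estimates
g_hat = gaussian_kde(xg)

def power_mean_index(p, x):
    """Power-weighted mean of the likelihood ratio f/g over samples x ~ g.

    p = 1/2 gives the Bhattacharyya coefficient E_g[(f/g)^(1/2)] <= 1,
    with equality iff f = g; other powers weight the ratio differently.
    """
    r = f_hat(x) / g_hat(x)
    return np.mean(r ** p) ** (1.0 / p)

for p in (0.5, 2.0, -1.0):
    print(f"p = {p:+.1f}: index = {power_mean_index(p, xg):.3f}")
```

Deviation of any such mean from 1 signals dissimilarity between f and g, which is what makes indices of this kind usable as objective values in numerical optimization.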

Author(s):  
Fred Espen Benth ◽  
Gleda Kutrolli ◽  
Silvana Stefani

In this paper, we introduce a dynamical model for the time evolution of probability density functions that incorporates uncertainty in the parameters. The uncertainty follows stochastic processes, thereby defining a new class of stochastic processes with values in the space of probability densities. The purpose is to quantify uncertainty for use in probabilistic forecasting. Starting from a set of traded prices of equity indices, we conduct empirical studies. We apply our dynamic probabilistic forecasting to option pricing, where our proposed notion of model uncertainty reduces to uncertainty about future volatility. A distribution of option prices follows, reflecting the uncertainty about the distribution of the underlying prices. We associate with prices measures of model uncertainty in the sense of Cont.
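As a rough illustration of how parameter uncertainty propagates to a distribution of option prices (not the authors' dynamical model, which evolves whole densities in time), one can randomize the volatility input of a standard Black–Scholes pricer; all numbers below are assumptions:

```python
import numpy as np
from math import log, sqrt, exp
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

rng = np.random.default_rng(1)
S0, K, T, r = 100.0, 105.0, 0.5, 0.01   # hypothetical contract

# Uncertain future volatility, modelled here as a lognormal parameter draw.
sigmas = rng.lognormal(mean=np.log(0.2), sigma=0.15, size=10_000)
prices = np.array([bs_call(S0, K, T, r, s) for s in sigmas])

# A Cont-style uncertainty measure: spread of prices across the model family.
spread = np.quantile(prices, 0.975) - np.quantile(prices, 0.025)
print(f"mean price {prices.mean():.2f}, uncertainty (95% spread) {spread:.2f}")
```

The quantile spread across the model family plays the role of a model-uncertainty measure: it vanishes when the volatility is known and grows with the dispersion of the parameter distribution.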


2020 ◽  
Author(s):  
Meriem Allali ◽  
Patrick Portecop ◽  
Michel Carles ◽  
Dominique Gibert

We propose a method to detect early-warning information related to subtle changes in the trend of evolution of COVID-19 epidemic time series (e.g. daily new cases). The method is simple and easy to implement on laptop computers, and it is designed to provide reliable results even with very small amounts of data (≈ 10–20 points). The results are given as compact graphics that are easy to interpret. The data are separated into two subsets: the old data, used as control points to statistically define a "trend", and the recent data, which are tested to evaluate their conformity with this trend. The trend is characterised by bootstrapping in order to obtain probability density functions of the expected misfit of each data point. The probability densities are used to compute distance matrices in which data clusters and outliers are easily recognised visually. In addition to detecting very subtle changes in trend, the method is also able to detect outliers. A simulated case is analysed in which R0 is slowly increased (from 1.5 to 2.0 over 20 days) to pass from stable, damped control of the epidemic spread to an exponentially diverging situation. The method gives an early-warning signal at the very beginning of the R0 variation. Application to the data of Guadeloupe shows that a small destabilising event occurred in the data near April 30, 2020. This may be due to an increase of R0 of ≈ 0.7 around April 13–15, 2020.
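A minimal sketch of the bootstrap step described above, assuming a log-linear trend model and synthetic counts (the authors' distance-matrix construction is not reproduced; all names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily new cases: 20 control points plus 5 recent points to test.
days = np.arange(25)
cases = 50 * np.exp(0.03 * days) + rng.normal(0, 3, 25)
old, recent = days[:20], days[20:]

# Bootstrap the trend: refit an exponential (log-linear) model on resampled
# control points and record its prediction at each recent day.
B = 2000
preds = np.empty((B, recent.size))
for b in range(B):
    idx = rng.integers(0, old.size, old.size)
    coef = np.polyfit(old[idx], np.log(cases[:20][idx]), 1)
    preds[b] = np.exp(np.polyval(coef, recent))

# Conformity of each recent observation with the bootstrapped trend:
# the two-sided tail probability of its misfit.
for d, obs, p in zip(recent, cases[20:], preds.T):
    tail = np.mean(p >= obs)
    print(f"day {d}: obs={obs:6.1f}  conformity={2 * min(tail, 1 - tail):.3f}")
```

A recent point whose conformity value collapses toward zero is either an outlier or the first sign of a trend change, which is exactly the early-warning signal described above.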


2007 ◽  
Vol 73 (6) ◽  
pp. 821-830 ◽  
Author(s):  
H. HOMANN ◽  
R. GRAUER ◽  
A. BUSSE ◽  
W. C. MÜLLER

We report on a comparison of high-resolution numerical simulations of Lagrangian particles advected by incompressible turbulent hydro- and magnetohydrodynamic (MHD) flows. Numerical simulations were performed with up to 1024³ collocation points and 10 million particles in the Navier–Stokes case and 512³ collocation points and 1 million particles in the MHD case. In the hydrodynamic case our findings compare with recent experiments by Mordant et al. (2004 New J. Phys. 6, 116) and Xu et al. (2006 Phys. Rev. Lett. 96, 024503). They differ from the simulations of Biferale et al. (2004 Phys. Rev. Lett. 93, 064502) due to differences in the ranges chosen for evaluating the structure functions. In Navier–Stokes turbulence, intermittency is stronger than predicted by the multifractal approach of Biferale et al., whereas in MHD turbulence the predictions of the multifractal approach are more intermittent than what we observe in our simulations. In addition, our simulations reveal that Lagrangian Navier–Stokes turbulence is more intermittent than MHD turbulence, whereas the situation is reversed in the Eulerian case. These findings cannot be described consistently by multifractal modelling. The crucial point is that the geometry of the dissipative structures has different implications for Lagrangian and Eulerian intermittency. Application of the multifractal approach to modelling the acceleration probability density functions works well in the Navier–Stokes case, but in the MHD case only the tails are well described.
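Since the discrepancies above hinge on the τ-range used to evaluate the Lagrangian structure functions, a small sketch of that computation may help; the trajectory data here are a synthetic stand-in and the function name is hypothetical:

```python
import numpy as np

def lagrangian_structure_functions(v, dt, orders=(2, 4), lags=None):
    """S_p(tau) = <|v(t+tau) - v(t)|^p>, averaged over particles and time.

    v: array of shape (n_particles, n_times), one velocity component
       sampled along tracked particle trajectories.
    """
    n_t = v.shape[1]
    if lags is None:
        lags = np.unique(np.logspace(0, np.log10(n_t // 2), 20, dtype=int))
    out = {p: [] for p in orders}
    for lag in lags:
        dv = v[:, lag:] - v[:, :-lag]          # velocity increments at this lag
        for p in orders:
            out[p].append(np.mean(np.abs(dv) ** p))
    return lags * dt, {p: np.array(s) for p, s in out.items()}

# Hypothetical usage with synthetic trajectories:
rng = np.random.default_rng(3)
v = np.cumsum(rng.normal(size=(100, 4096)), axis=1) * 0.01  # stand-in velocities
taus, S = lagrangian_structure_functions(v, dt=1e-3)
# Local scaling exponents zeta_p(tau) = d log S_p / d log tau; the tau-range
# over which these are read off strongly affects the measured intermittency.
zeta2 = np.gradient(np.log(S[2]), np.log(taus))
```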


Geophysics ◽  
1968 ◽  
Vol 33 (1) ◽  
pp. 11-35 ◽  
Author(s):  
R. L. Sengbush ◽  
M. R. Foster

Optimum systems have been developed to correspond to the suboptimum moveout-discrimination systems presented previously by several authors. The seismic data on the lth trace are assumed to be additive signal S with moveout lτ, coherent noise N with moveout lτ_N, and incoherent noise n_l, expressed as x_l(t) = S(t − lτ) + N(t − lτ_N) + n_l(t), where S, N, and n_l are independent, second-order stationary random processes and τ and τ_N are random variables with prescribed probability density functions. The signal estimate Ŝ is produced by filtering each trace with its corresponding filter f_l and summing the outputs: Ŝ(t) = Σ_l (f_l ∗ x_l)(t). We choose the system of filters {f_l} to make the signal estimate optimum in the Wiener sense (minimum mean-square error over the signal ensemble). For the special cases discussed, the moveouts are linear functions of the trace number l, determined by the moveout per trace τ for signal and τ_N for noise. Thus, the optimum system is determined by the probability densities of τ and τ_N together with the noise-to-signal power-spectrum ratios Φ_N/Φ_S and Φ_n/Φ_S. In comparison, suboptimum systems are controlled completely by the cutoff moveout per trace τ_c: events whose moveout per trace falls within ±τ_c of the expected dip moveout per trace are accepted, and those falling outside this range are suppressed. Suboptimum systems can be derived from optimum systems by choosing probability densities for τ and τ_N that are uniform within the above ranges and letting the noise-to-signal ratio become very large. Optimum systems have increased flexibility over suboptimum systems through control over the probability density functions and the power-spectrum ratios, allowing increased noise suppression in selected regions of f–k space.
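A hedged frequency-domain sketch of such an optimum system (the original filters are designed in the time domain; per-frequency normal equations are an equivalent formulation, and all densities, ratios, and sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Per-trace model at angular frequency w:
#   X_l(w) = S(w) exp(-i*w*l*tau) + N(w) exp(-i*w*l*tau_N) + n_l(w),
# with tau and tau_N random moveouts per trace.
L = 12
traces = np.arange(L)
tau_s = rng.uniform(-0.002, 0.002, 20_000)   # signal moveout/trace (s), uniform density
tau_c = rng.uniform(0.004, 0.008, 20_000)    # coherent-noise moveout/trace (s)

def char(u, tau):
    """Characteristic function E[exp(-i*u*tau)], estimated from density samples."""
    return np.mean(np.exp(-1j * u * tau))

def optimum_filters(w, nsr_N=1.0, nsr_n=0.1):
    """Wiener normal equations R F = p at angular frequency w; the spectra
    enter only through the noise/signal ratios Phi_N/Phi_S and Phi_n/Phi_S."""
    R = np.array([[char(w * (m - l), tau_s) + nsr_N * char(w * (m - l), tau_c)
                   for m in traces] for l in traces]) + nsr_n * np.eye(L)
    p = np.array([char(-w * l, tau_s) for l in traces])  # E[S X_l^*] / Phi_S
    return np.linalg.solve(R, p)

F = optimum_filters(2 * np.pi * 25.0)   # complex filter weights at 25 Hz
```

Making nsr_N very large while keeping the moveout densities uniform drives the solution toward the suboptimum pass/reject behaviour described above.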


2018 ◽  
Vol 10 (7) ◽  
pp. 168781401878556 ◽  
Author(s):  
Chunbo Su ◽  
Shui Yu ◽  
Zhonglai Wang ◽  
Zafar Tayyab

This article proposes two strategies for time-dependent probabilistic fatigue analysis that consider stochastic loadings and strength degradation, based on failure transformation and the multi-dimensional kernel density estimation method. The time-dependent safety margin function is first established to describe the limit state underlying the time-dependent failure probability of mechatronic equipment with stochastic loadings and strength degradation. Considering the effective safety margin points and the corresponding numbers of load cycles, two strategies are then proposed for transforming the time-dependent failure probability calculation into a static reliability calculation. The multi-dimensional kernel density estimation method is finally employed to build the probability density functions, and the reliability is estimated from these density functions. An engineering case of a filtering gear reducer is presented to validate the effectiveness of the proposed methods in both computational efficiency and accuracy.
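A minimal sketch of the final step, assuming a one-dimensional safety margin for readability (the paper's KDE is multi-dimensional) and illustrative distributions throughout:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)

# Hypothetical samples at a fixed cycle count: degraded strength minus stress.
# The distributions and the degradation law are assumptions, not the paper's.
n_cycles = 1e5
strength = rng.normal(600.0, 30.0, 20_000) * (1.0 - 1e-7) ** n_cycles  # MPa
stress = rng.lognormal(np.log(450.0), 0.08, 20_000)                    # MPa
margin = strength - stress        # safety margin g = R(n) - S; failure if g <= 0

# Kernel density estimate of the margin PDF, then integrate the failure
# region g <= 0 to obtain the failure probability (reliability = 1 - pf).
kde = gaussian_kde(margin)
pf = kde.integrate_box_1d(-np.inf, 0.0)
print(f"failure probability: KDE {pf:.3e} vs empirical {(margin <= 0).mean():.3e}")
```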

