Distribution Functions of Polymers with and without Interactions. I. The Distribution Function of the Square Distance of the Center of Mass from One Fixed End of a Polymer Chain

1977 ◽  
Vol 9 (3) ◽  
pp. 239-251 ◽  
Author(s):  
Takao Minato ◽  
Akira Hatano
Author(s):  
Stefan Thurner ◽  
Rudolf Hanel ◽  
Peter Klimek

Scaling appears practically everywhere in science; it quantifies how the properties or shapes of an object change with the scale of that object. Scaling laws are always associated with power laws. The scaling object can be a function, a structure, a physical law, or a distribution function that describes the statistics of a system or a temporal process. We focus on scaling laws that appear in the statistical description of stochastic complex systems, where scaling arises in the distribution functions of observable quantities of dynamical systems or processes. These distribution functions exhibit power laws, approximate power laws, or fat-tailed distributions. Understanding their origin, and how power-law exponents can be related to the particular nature of a system, is one of the aims of the book. We also comment on fitting power laws.
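Fitting a power law by least squares on a log-log plot is known to be unreliable; a common alternative is the maximum-likelihood (Hill) estimator for the tail exponent. A minimal illustrative sketch (not from the book), using synthetic data with a known exponent:

```python
import math
import random

def hill_mle_exponent(samples, x_min):
    """Maximum-likelihood (Hill) estimate of alpha for a power-law tail
    p(x) ~ x^(-alpha), restricted to samples with x >= x_min."""
    tail = [x for x in samples if x >= x_min]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)

# Synthetic power-law data with alpha = 2.5 via inverse-transform sampling:
# the CDF F(x) = 1 - (x/x_min)^(-(alpha-1)) inverts to
# x = x_min * (1 - u)^(-1/(alpha - 1)) for uniform u.
random.seed(0)
alpha_true, x_min = 2.5, 1.0
data = [x_min * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
        for _ in range(100_000)]

alpha_hat = hill_mle_exponent(data, x_min)
print(f"estimated exponent: {alpha_hat:.3f}")  # close to 2.5
```

In practice `x_min` itself must be estimated, since real data typically follow a power law only in the tail.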


2020 ◽  
Vol 2020 (9) ◽  
Author(s):  
S. Acharya ◽  
◽  
D. Adamová ◽  
A. Adler ◽  
J. Adolfsson ◽  
...  

Abstract Measurement of Z-boson production in p-Pb collisions at $$ \sqrt{s_{\mathrm{NN}}} $$ = 8.16 TeV and Pb-Pb collisions at $$ \sqrt{s_{\mathrm{NN}}} $$ = 5.02 TeV is reported. It is performed in the dimuon decay channel, through the detection of muons with pseudorapidity −4 < $$ \eta_{\mu} $$ < −2.5 and transverse momentum $$ p_{\mathrm{T}}^{\mu} $$ > 20 GeV/c in the laboratory frame. The invariant yield and nuclear modification factor are measured for opposite-sign dimuons with invariant mass 60 < $$ m_{\mu\mu} $$ < 120 GeV/c² and rapidity 2.5 < $$ y_{\mathrm{cms}}^{\mu\mu} $$ < 4. They are presented as a function of rapidity and, for the Pb-Pb collisions, of centrality as well. The results are compared with theoretical calculations, both with and without nuclear modifications to the parton distribution functions (PDFs). In p-Pb collisions the center-of-mass frame is boosted with respect to the laboratory frame, and the measurements cover the backward (−4.46 < $$ y_{\mathrm{cms}}^{\mu\mu} $$ < −2.96) and forward (2.03 < $$ y_{\mathrm{cms}}^{\mu\mu} $$ < 3.53) rapidity regions. For the p-Pb collisions, the results are consistent within experimental and theoretical uncertainties with calculations that include both free-nucleon and nuclear-modified PDFs. For the Pb-Pb collisions, a 3.4σ deviation from calculations based on the free-nucleon PDFs is seen in the integrated yield, while good agreement is found once nuclear modifications are considered.
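For readers unfamiliar with the observable, the nuclear modification factor compares a heavy-ion yield to the expectation from a superposition of independent nucleon-nucleon collisions. A minimal sketch with hypothetical illustrative numbers (not ALICE data):

```python
def nuclear_modification_factor(yield_aa, n_coll, yield_pp):
    """R_AA = (per-event yield in AA) / (<N_coll> * per-event yield in pp).
    R_AA = 1 means the nucleus behaves like independent nucleon-nucleon
    collisions; deviations signal nuclear effects such as modified PDFs."""
    return yield_aa / (n_coll * yield_pp)

# Hypothetical numbers for illustration only:
r_aa = nuclear_modification_factor(yield_aa=120.0, n_coll=100.0, yield_pp=1.5)
print(r_aa)  # 0.8 -> suppression relative to the free-nucleon expectation
```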


2020 ◽  
Vol 49 (1) ◽  
pp. 1-23
Author(s):  
Shunpu Zhang ◽  
Zhong Li ◽  
Zhiying Zhang

Estimation of distribution functions has many real-world applications. We study kernel estimation of a distribution function when the density function has compact support. We show that, for densities taking the value zero at the endpoints of the support, the kernel distribution estimator needs no boundary correction; otherwise, boundary correction is necessary. In this paper, we propose a boundary distribution kernel estimator that is free of the boundary problem and provides non-negative, non-decreasing distribution estimates between zero and one. Extensive simulation results show that the boundary distribution kernel estimator provides better distribution estimates than the existing boundary correction methods. For practical application of the proposed methods, a data-dependent method for choosing the bandwidth is also proposed.
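The boundary problem the authors address can be seen directly in the basic (uncorrected) kernel distribution estimator; a minimal sketch, with an illustrative bandwidth and sample size:

```python
import math
import random

def kernel_cdf(x, data, h):
    """Basic kernel distribution estimator with a Gaussian kernel:
    F_hat(x) = (1/n) * sum Phi((x - X_i)/h), Phi the standard normal CDF.
    Near a support endpoint where the density is positive, the kernel
    spills probability mass past the endpoint -- the boundary problem."""
    n = len(data)
    return sum(0.5 * (1.0 + math.erf((x - xi) / (h * math.sqrt(2.0))))
               for xi in data) / n

random.seed(1)
sample = [random.random() for _ in range(5000)]  # Uniform(0,1): density > 0 at both endpoints

# Away from the boundary the estimate tracks the true CDF F(x) = x:
print(kernel_cdf(0.5, sample, 0.05))
# At the boundary x = 0 the true CDF is 0, but the estimator returns a
# positive value of order h -- the bias a boundary correction removes:
print(kernel_cdf(0.0, sample, 0.05))
```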


Filomat ◽  
2018 ◽  
Vol 32 (17) ◽  
pp. 5931-5947
Author(s):  
Hatami Mojtaba ◽  
Alamatsaz Hossein

In this paper, we propose a new transformation of circular random variables based on circular distribution functions, which we call the inverse distribution function (idf) transformation. We show that the Möbius transformation is a special case of our idf transformation. Very general results are provided for the properties of the proposed family of idf transformations, including their trigonometric moments, maximum entropy, random variate generation, finite mixtures and modality properties. In particular, we focus our attention on the subfamily obtained when the idf transformation is based on the cardioid circular distribution function. Modality and shape properties are investigated for this subfamily. In addition, we obtain further statistical properties for the resulting distribution by applying the idf transformation to a random variable following a von Mises distribution. In fact, we introduce the Cardioid-von Mises (CvM) distribution and estimate its parameters by the maximum likelihood method. Finally, an application of the CvM family and its inferential methods is illustrated using a real data set containing times of gun crimes in Pittsburgh, Pennsylvania.
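Among the listed properties, random variate generation from a circular distribution function is straightforward to illustrate. A sketch for the standard cardioid distribution (not the paper's CvM formulas), sampling by numerically inverting its monotone CDF:

```python
import math
import random

def cardioid_cdf(theta, mu=0.0, rho=0.25):
    """CDF on [0, 2*pi) of the cardioid distribution with density
    f(t) = (1 + 2*rho*cos(t - mu)) / (2*pi), requiring |rho| < 1/2."""
    return (theta + 2.0 * rho * (math.sin(theta - mu) + math.sin(mu))) / (2.0 * math.pi)

def cardioid_sample(mu=0.0, rho=0.25, tol=1e-10):
    """Inverse-transform sampling: bisect the strictly increasing CDF
    (its derivative, the density, is positive since |2*rho| < 1)."""
    u = random.random()
    lo, hi = 0.0, 2.0 * math.pi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cardioid_cdf(mid, mu, rho) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(2)
draws = [cardioid_sample() for _ in range(2000)]
# The first trigonometric moment of the cardioid is rho in the direction mu,
# so the sample mean direction should sit near mu = 0:
C = sum(math.cos(t) for t in draws) / len(draws)
S = sum(math.sin(t) for t in draws) / len(draws)
print(math.atan2(S, C))
```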


2005 ◽  
Vol 23 (6) ◽  
pp. 429-461
Author(s):  
Ian Lerche ◽  
Brett S. Mudford

This article derives an estimation procedure to evaluate how many Monte Carlo realisations are needed to achieve prescribed accuracies in the estimated mean value, and also in the cumulative probabilities of achieving values greater than, or less than, a particular value as that value is allowed to vary. In addition, by inverting the argument and asking what accuracies result from a prescribed number of Monte Carlo realisations, one can assess the computer time that would be involved should one choose to carry out the realisations. The arguments and numerical illustrations are carried through in detail for four distributions: lognormal, binomial, Cauchy, and exponential; the procedure itself is valid for any choice of distribution function. The general method given in Lerche and Mudford (2005) is thus not a coincidence owing to the nature of the Gaussian distribution but is of universal validity. This article provides (in the Appendices) the general procedure for obtaining equivalent results for any distribution and shows quantitatively how the procedure operates for the four specific distributions. Some distributions need more than two parameters to be precisely defined. Estimates of the mean value and of the standard error around the mean determine only two parameters per distribution, so any distribution with more than two parameters has degrees of freedom that either must be constrained from other information or remain unknown and can be freely specified. That fluidity in such distributions allows a similar fluidity in the estimates of the number of Monte Carlo realisations needed to achieve prescribed accuracies, as well as in the estimates of accuracy achievable with a prescribed number of realisations.
Without some way to control the free parameters in such distributions one will, presumably, always have such dynamic uncertainties. Even when the free parameters are known precisely, considerable uncertainty remains in determining the number of Monte Carlo realisations needed to achieve prescribed accuracies, and in the accuracies achievable with a prescribed number of Monte Carlo realisations, because of the different functional forms of probability distribution from which the realisations may be drawn. Without knowledge of the underlying distribution functions appropriate to a given problem, the choices made in numerically implementing the basic procedure will presumably bias the estimates of achievable accuracy and of the number of Monte Carlo realisations one should undertake. The cautionary note, which is the main point of this article and is exhibited sharply with numerical illustrations, is that one must specify precisely which distributions are being used and which free parameter values have been chosen (and why) when assessing the accuracy achievable and the number of Monte Carlo realisations needed. Without such information it is not a very useful exercise to undertake Monte Carlo realisations, because other investigations, using other distributions and other values of the free parameters, will arrive at very different conclusions.
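For the simplest case, the mean value: with the population standard deviation σ known (here taken from a two-parameter lognormal purely as an illustration), the number of realisations needed for a prescribed confidence half-width follows from the σ/√N standard error of the sample mean. A sketch of both directions of the argument, not the article's exact procedure:

```python
import math

def realisations_needed(sigma, epsilon, z=1.96):
    """Smallest N such that the z*sigma/sqrt(N) half-width of the
    confidence interval for the mean is at most epsilon:
    N >= (z * sigma / epsilon)**2."""
    return math.ceil((z * sigma / epsilon) ** 2)

def accuracy_achieved(sigma, n, z=1.96):
    """Inverse question: half-width achievable with a prescribed N."""
    return z * sigma / math.sqrt(n)

# Illustrative lognormal with log-mean 0 and log-sd s = 1:
# variance = (e^{s^2} - 1) * e^{2*mu + s^2}.
s = 1.0
sigma = math.sqrt((math.exp(s * s) - 1.0) * math.exp(s * s))

print(realisations_needed(sigma, epsilon=0.05))
print(accuracy_achieved(sigma, n=10_000))
```

Note that a heavy-tailed choice such as the Cauchy distribution has no finite variance at all, so this simple argument fails there; that sensitivity to the assumed distribution is exactly the article's cautionary point.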


Author(s):  
Dariusz Jacek Jakóbczak

The proposed method, called Probabilistic Nodes Combination (PNC), is a method of 2D curve interpolation and extrapolation using a set of key points (knots or nodes). Nodes can be treated as characteristic points of the data for modeling and analysis. The model of the data is built by choosing a probability distribution function and a nodes combination. PNC modeling via a nodes combination and the parameter α as a probability distribution function enables value anticipation in risk analysis and decision making. A two-dimensional curve is interpolated and extrapolated via a nodes combination and various discrete or continuous probability distribution functions: polynomial, sine, cosine, tangent, cotangent, logarithm, exponent, arc sine, arc cosine, arc tangent, arc cotangent, or power function. The novelty of the paper consists of two generalizations: a generalization of the previous MHR method with various nodes combinations, and a generalization of linear interpolation with different (non-basic) probability distribution functions and nodes combinations.
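The generalization of linear interpolation can be sketched as follows: replace the linear blending parameter between consecutive nodes with a chosen monotone function of it (a distribution-function-like shape). This is an illustrative reading with a hypothetical function name, not the paper's exact PNC formulas:

```python
import math

def blended_interpolate(nodes, t, weight=lambda a: a):
    """Piecewise interpolation between consecutive nodes (x_i, y_i).
    The blending parameter alpha in [0, 1] is passed through a chosen
    'weight' function: the identity recovers ordinary linear
    interpolation, while any monotone map of [0, 1] onto [0, 1]
    (e.g. a distribution-function shape) reshapes the curve segment."""
    for (x0, y0), (x1, y1) in zip(nodes, nodes[1:]):
        if x0 <= t <= x1:
            a = (t - x0) / (x1 - x0)
            w = weight(a)
            return (1.0 - w) * y0 + w * y1
    raise ValueError("t outside node range")

pts = [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0)]
print(blended_interpolate(pts, 0.5))                        # linear: 1.0
print(blended_interpolate(pts, 0.5, weight=lambda a: a * a))  # reshaped: 0.5
print(blended_interpolate(pts, 0.5, weight=lambda a: math.sin(a * math.pi / 2)))
```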

