Non-universal fluctuations of the empirical measure for isotropic stationary fields on S² × ℝ

2021 ◽  
Vol 31 (5) ◽  
Author(s):  
Domenico Marinucci ◽  
Maurizia Rossi ◽  
Anna Vidotto
1979 ◽  
Vol 16 (3) ◽  
pp. 665-670 ◽  
Author(s):  
Burt V. Bronk

Some inequalities for moments and coefficients of variation of probability densities over the positive real line are obtained by means of simple geometrical relationships. As an illustrative application, rigorous bounds are obtained for the ratio of the weight-average to the number-average molecular weight for a large class of distributions of macromolecules, giving a more precise characterization of this empirical measure of heterogeneity.
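The bound on the weight-average to number-average ratio rests on the standard identity Mw/Mn = 1 + CV², where CV is the coefficient of variation of the number distribution of molecular weights. The minimal Python sketch below checks the identity numerically on an invented discrete distribution; the numbers are placeholders, not data from the paper.

```python
import numpy as np

# Hypothetical discrete molecular-weight distribution: n[i] molecules of mass M[i].
M = np.array([1.0e4, 5.0e4, 1.0e5, 2.0e5])   # molecular weights (g/mol)
n = np.array([400.0, 300.0, 200.0, 100.0])   # number of molecules in each class

Mn = np.sum(n * M) / np.sum(n)               # number-average molecular weight
Mw = np.sum(n * M**2) / np.sum(n * M)        # weight-average molecular weight

# Coefficient of variation of the number distribution of molecular weights.
var = np.sum(n * (M - Mn)**2) / np.sum(n)
cv = np.sqrt(var) / Mn

# Mw/Mn = 1 + CV**2, so Mw/Mn >= 1, with equality only for a monodisperse sample.
print(Mw / Mn, 1.0 + cv**2)
```

For a monodisperse sample CV = 0 and the ratio equals 1; any heterogeneity pushes it above 1.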


2011 ◽  
Vol 41 (3) ◽  
pp. 271-281 ◽  
Author(s):  
Rachael-Anne Knight

Despite the current popularity of rhythm metrics, there has been relatively little work aimed at establishing their validity or reliability, important characteristics of any empirical measure. The current paper focuses on the stability, or temporal reliability, of rhythm metrics by establishing whether they give consistent results for the same speakers, in the same task, on successive occasions. Four speakers of Southern British English were recorded reading 'The North Wind and the Sun' (NWS) passage on three consecutive days. Results indicated that some measures correlate more highly across time than others, and the choice of a measure that is both reliable and valid is discussed. It is suggested that the metric that best fits these criteria is the one formulated in terms of the proportion of vowels within an utterance (%V).
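%V is computed as the share of an utterance's total duration occupied by vocalic intervals. A minimal Python sketch follows, using an invented segmentation; real studies derive interval durations from annotated recordings, and segmentation conventions vary.

```python
# A minimal sketch of the %V rhythm metric: the proportion of an utterance's
# duration occupied by vocalic intervals. The labels and durations (seconds)
# below are invented for illustration.
intervals = [
    ("V", 0.12), ("C", 0.08), ("V", 0.15), ("C", 0.20),
    ("V", 0.10), ("C", 0.05), ("V", 0.18),
]

vocalic = sum(d for label, d in intervals if label == "V")
total = sum(d for _, d in intervals)
percent_v = 100.0 * vocalic / total
print(f"%V = {percent_v:.1f}")
```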


2021 ◽  
Author(s):  
Qi Chen ◽  
Katherine Schipper ◽  
Ning Zhang

We develop and validate an empirical measure of the informativeness of accounting assets in measuring firm-specific economic capital, an important determinant of both cash flows and intrinsic values. Our validation tests show that the asset informativeness measure is sensitive to differences in both accounting methods and implementation decisions at the firm level, and that it corresponds to the way equity investors use the information in accounting assets. We find that accounting assets contain substantial information about firms' productive capacity (economic capital) and that this information is not summarized by several earnings attributes often associated with earnings quality.


2017 ◽  
Vol 49 (2) ◽  
pp. 549-580 ◽  
Author(s):  
Bertrand Cloez

We consider a continuous-time particle system: a discrete population with spatial motion and nonlocal branching. The offspring's positions and their number may depend on the mother's position. Our setting captures, for instance, processes indexed by a Galton–Watson tree. Using a size-biased auxiliary process for the empirical measure, we determine the asymptotic behaviour of the particle system. We also obtain a large-population approximation as a weak solution of a growth-fragmentation equation. Several examples illustrate our results. The main one describes the behaviour of a mitosis model in which the population is size-structured: cell sizes grow linearly and, when a cell dies, it divides into two descendants.
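As an illustration of this kind of size-structured mitosis model, the following Python sketch simulates cells whose sizes grow linearly and which split into two equal halves at a constant exponential rate; both the constant division rate and the equal split are simplifying assumptions made here, not the paper's exact hypotheses.

```python
import random

# Rough simulation of a mitosis-type growth-fragmentation particle system:
# each cell's size grows linearly at rate `growth`, and every cell divides
# at exponential rate `rate` into two cells of half its size.
def simulate(t_max, growth=1.0, rate=0.7, x0=1.0, seed=0):
    rng = random.Random(seed)
    cells = [x0]                                     # current cell sizes
    t = 0.0
    while True:
        total_rate = rate * len(cells)               # all cells divide at the same rate
        dt = rng.expovariate(total_rate)
        if t + dt > t_max:
            dt = t_max - t
            return [x + growth * dt for x in cells]  # final sizes at t_max
        t += dt
        cells = [x + growth * dt for x in cells]     # linear growth between divisions
        i = rng.randrange(len(cells))                # pick the dividing cell uniformly
        x = cells.pop(i)
        cells.extend([x / 2.0, x / 2.0])             # binary fission into two halves

sizes = simulate(t_max=5.0)
# The normalized empirical measure of sizes, summarized here by its mean.
print(len(sizes), sum(sizes) / len(sizes))
```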


2018 ◽  
Vol 50 (3) ◽  
pp. 983-1004 ◽  
Author(s):  
Tanguy Cabana ◽  
Jonathan D. Touboul

We continue the analysis of large deviations for randomly connected neural networks used as models of the brain. The originality of the model lies in the fact that the directed impact of one particle on another depends on the state of both particles, and the interactions have random Gaussian amplitudes with mean and variance scaling as the inverse of the network size. As in the spatially extended case (see Cabana and Touboul (2018)), we show that, under sufficient regularity assumptions, the empirical measure satisfies a large deviations principle with a good rate function achieving its minimum at a unique probability measure, implying, in particular, its convergence in both the averaged and quenched cases, as well as a propagation of chaos property (in the averaged case only). The class of models we consider notably includes a stochastic version of the Kuramoto model with random connections.
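A rough Python sketch of a stochastic Kuramoto network with Gaussian random couplings whose mean and variance scale like 1/N is given below, simulated by Euler–Maruyama; the interaction form, noise, and parameter values are assumptions chosen to echo the abstract, not the authors' precise model.

```python
import numpy as np

# Stochastic Kuramoto network with random Gaussian couplings J[i, j] whose
# mean and variance both scale as 1/N, integrated by Euler-Maruyama.
rng = np.random.default_rng(0)
N, T, dt = 200, 10.0, 0.01
sigma = 0.3                                               # noise amplitude
omega = rng.normal(0.0, 1.0, size=N)                      # intrinsic frequencies
J = rng.normal(1.0 / N, np.sqrt(1.0 / N), size=(N, N))    # random couplings

theta = rng.uniform(0.0, 2.0 * np.pi, size=N)
for _ in range(int(T / dt)):
    # Interaction felt by oscillator i: sum_j J[i, j] * sin(theta_j - theta_i).
    interaction = (J * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += (omega + interaction) * dt + sigma * np.sqrt(dt) * rng.normal(size=N)

# The empirical measure of phases, summarized by the Kuramoto order parameter.
r = np.abs(np.exp(1j * theta).mean())
print(r)
```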


2011 ◽  
Vol 62 (3) ◽  
Author(s):  
Jerome L. Stein

The financial crisis was precipitated by the mortgage crisis. A whole structure of financial derivatives was built upon the ultimate debtors, the mortgagors. Insofar as the mortgagors were unable to service their debts, the values of the derivatives fell. The financial intermediaries whose assets and liabilities were based upon the value of these derivatives were very highly leveraged, so changes in their net worth were large multiples of changes in asset values. A cascade was precipitated by the mortgage defaults, and in this manner the mortgage debt crisis turned into a financial crisis. The crucial variable is the optimal debt of the real estate sector, which depends upon the capital gain and the interest rate. I apply Stochastic Optimal Control (SOC) analysis to derive the optimal debt. Two models of the stochastic processes for the capital gain and the interest rate are presented; each implies a different value of the optimal debt/net worth ratio. I derive an upper bound of the optimal debt ratio based upon the alternative models. An empirical measure of excess debt, the actual ratio less the upper bound of the optimal ratio, is shown to be an early warning signal (EWS) of the debt crisis.
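The early warning signal itself is simple once the bound is available: excess debt is the actual debt/net-worth ratio minus the upper bound of the optimal ratio, and a sustained positive excess signals danger. The toy Python sketch below uses invented numbers; in the paper the bound comes from the stochastic optimal control analysis, not from an assumed constant.

```python
import numpy as np

# Toy illustration of the early-warning signal: excess debt is the actual
# debt/net-worth ratio minus an upper bound on the optimal ratio.
actual_ratio = np.array([0.8, 0.9, 1.1, 1.4, 1.7, 2.0])   # hypothetical debt/net worth over time
optimal_upper_bound = 1.2                                  # hypothetical SOC-derived bound

excess_debt = actual_ratio - optimal_upper_bound
warning = excess_debt > 0.0        # EWS fires when excess debt turns positive
print(excess_debt, warning)
```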


2017 ◽  
Vol 48 (3) ◽  
pp. 608-641 ◽  
Author(s):  
Akos Rona-Tas ◽  
Antoine Cornuéjols ◽  
Sandrine Blanchemanche ◽  
Antonin Duroy ◽  
Christine Martin

Recently, both the sociology of science and policy research have shown increased interest in scientific uncertainty. To contribute to these debates and create an empirical measure of scientific uncertainty, we inductively devised two systems of classification, or ontologies, to describe scientific uncertainty in a large corpus of food safety risk assessments with the help of machine learning (ML). We ask three questions: (1) Can ML assist with coding complex documents such as food safety risk assessments on a difficult topic like scientific uncertainty? (2) Can ML be used to assess the quality of the ontologies we devised? (3) Finally, does the quality of our ontologies depend on social factors? We found that ML can do surprisingly well in its simplest form at identifying complex meanings, and that it does not benefit from adding certain types of complexity to the analysis. Our ML experiments show that in one ontology, a simple typology, semantic opposites attract each other, against expectations, and that they support the taxonomic structure of the other ontology. Finally, we found some evidence that institutional factors influence how well our taxonomy of uncertainty performs, but its ability to capture meaning does not vary greatly across the time periods, institutional contexts, and cultures we investigated.
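"ML in its simplest form" here amounts to coding text into categories with a basic supervised classifier. The Python sketch below, using scikit-learn's TF-IDF features and logistic regression, shows the general shape of such machine-assisted coding; the two labels and the training sentences are invented stand-ins, and the paper's ontologies are far richer than this binary example.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training sentences and a two-label stand-in for an uncertainty ontology.
texts = [
    "The exposure estimate is highly uncertain due to missing consumption data.",
    "Measurement error in the residue analysis could not be quantified.",
    "The hazard characterization is well established in the literature.",
    "Dose-response data are consistent across all available studies.",
]
labels = ["uncertain", "uncertain", "certain", "certain"]

# TF-IDF features plus logistic regression: about as simple as supervised text coding gets.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)
print(model.predict(["The intake estimate relies on several unverified assumptions."]))
```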


2020 ◽  
Vol 24 ◽  
pp. 435-453
Author(s):  
Mickael Albertus

The raking-ratio method is a statistical and computational method that adjusts the empirical measure to match the true probabilities of the sets of a finite partition. The asymptotic behaviour of the raking-ratio empirical process indexed by a class of functions is studied when the auxiliary information is given by estimates. These estimates are assumed to result from learning the probabilities of the sets of the partitions from a second sample, larger than the statistician's own, as in two-stage sampling surveys. Under a metric entropy hypothesis and conditions on the size of the auxiliary sample, the strong approximation of this process, and in particular its weak convergence, are established. Under these conditions, the asymptotic behaviour of the new process is the same as that of the classical raking-ratio empirical process. Some possible statistical applications of these results are also given, such as strengthened versions of the Z-test and the chi-squared goodness-of-fit test.
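A single raking-ratio step reweights the empirical measure so that the weighted frequencies of the partition cells match auxiliary cell probabilities, here estimated from a larger second sample as in the two-stage setting above. The Python sketch below uses simulated data and an arbitrary partition of the real line purely for illustration.

```python
import numpy as np

# One raking-ratio step: adjust the empirical measure so its cell frequencies
# match cell probabilities estimated from a larger auxiliary sample.
rng = np.random.default_rng(1)
x = rng.normal(size=500)                      # statistician's sample
big = rng.normal(size=5000)                   # larger auxiliary sample

edges = np.array([-np.inf, -1.0, 0.0, 1.0, np.inf])    # finite partition of the line
cell = np.digitize(x, edges[1:-1])                      # cell index of each observation
p_aux = np.array([np.mean((big > lo) & (big <= hi))
                  for lo, hi in zip(edges[:-1], edges[1:])])   # estimated cell probabilities

w = np.full(len(x), 1.0 / len(x))             # initial empirical-measure weights
p_hat = np.array([w[cell == k].sum() for k in range(len(p_aux))])
w *= p_aux[cell] / p_hat[cell]                # raking-ratio adjustment

# The adjusted measure now reproduces the auxiliary cell probabilities exactly.
print(np.array([w[cell == k].sum() for k in range(len(p_aux))]), p_aux)
```

With two or more partitions the same step is iterated over the partitions in turn, which is the classical raking (iterative proportional fitting) scheme.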


2002 ◽  
Vol 77 (s-1) ◽  
pp. 35-59 ◽  
Author(s):  
Patricia M. Dechow ◽  
Ilia D. Dichev

This paper suggests a new measure of one aspect of the quality of working capital accruals and earnings. One role of accruals is to shift or adjust the recognition of cash flows over time so that the adjusted numbers (earnings) better measure firm performance. However, accruals require assumptions and estimates of future cash flows. We argue that the quality of accruals and earnings is decreasing in the magnitude of estimation error in accruals. We derive an empirical measure of accrual quality as the residuals from firm-specific regressions of changes in working capital on past, present, and future operating cash flows. We document that observable firm characteristics can be used as instruments for accrual quality (e.g., volatility of accruals and volatility of earnings). Finally, we show that our measure of accrual quality is positively related to earnings persistence.
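The measure can be sketched as follows: for each firm, regress the change in working capital on lagged, current, and future operating cash flow, and take the volatility of the residuals as the (inverse) accrual quality. The Python example below uses simulated series as placeholders; it shows the shape of the firm-specific regression, not the paper's sample or estimation details.

```python
import numpy as np

# Firm-specific regression of the change in working capital (d_wc) on lagged,
# current, and future operating cash flow (CFO); accrual quality is proxied
# (inversely) by the volatility of the residuals. Series are simulated placeholders.
rng = np.random.default_rng(2)
T = 20
cfo = rng.normal(1.0, 0.3, size=T + 2)                   # CFO covering t-1 .. t+1 windows
d_wc = 0.4 * cfo[:-2] + 0.3 * cfo[1:-1] - 0.5 * cfo[2:] + rng.normal(0, 0.1, size=T)

X = np.column_stack([np.ones(T), cfo[:-2], cfo[1:-1], cfo[2:]])   # [1, CFO_{t-1}, CFO_t, CFO_{t+1}]
beta, *_ = np.linalg.lstsq(X, d_wc, rcond=None)
residuals = d_wc - X @ beta

accrual_quality = residuals.std(ddof=1)       # larger residual volatility = lower accrual quality
print(beta, accrual_quality)
```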

