sufficient statistics
Recently Published Documents


TOTAL DOCUMENTS

361
(FIVE YEARS 56)

H-INDEX

27
(FIVE YEARS 4)

Author(s):  
П.В. Полухин

Filtering algorithms are used to assess the state of dynamic systems when solving various practical problems, such as voice synthesis, geolocation, and monitoring the movement of objects. In the case of complex hierarchical dynamic systems with a large number of time slices, the process of calculating probabilistic characteristics becomes very time-consuming due to the need to generate a large number of samples. The essence of the optimization is to reduce the number of samples generated by the filter, increase their consistency, and speed up computational operations. The paper offers mathematical tools based on sufficient statistics and sample decomposition, in combination with distributed computing algorithms, that can significantly improve the efficiency of the filtering procedure.
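As a rough illustration of the filtering task this abstract describes (not the paper's optimized, distributed algorithm), a minimal bootstrap particle filter for a one-dimensional random-walk state observed in Gaussian noise can be sketched as follows; all model parameters here are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy state-space model: 1-D random-walk state, observed in Gaussian noise.
T, N = 50, 500                                  # time slices, particles
q, r = 0.5, 1.0                                 # process / observation noise std
x_true = np.cumsum(rng.normal(0, q, T))         # latent trajectory
y = x_true + rng.normal(0, r, T)                # noisy observations

particles = np.zeros(N)
estimates = []
for t in range(T):
    particles = particles + rng.normal(0, q, N)       # propagate through the dynamics
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)  # observation likelihoods
    w /= w.sum()
    estimates.append(np.dot(w, particles))            # posterior-mean estimate
    particles = particles[rng.choice(N, N, p=w)]      # multinomial resampling

rmse = float(np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))
print(f"filter RMSE: {rmse:.3f}")
```

The cost the abstract targets is visible here: each time slice regenerates and reweights all N samples, which is what sample-reduction and distributed evaluation aim to cheapen.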


2021 ◽  
Author(s):  
Antoine Ferey ◽  
Benjamin Lockwood ◽  
Dmitry Taubinsky

Author(s):  
Jianhai Zhang ◽  
Zhiyong Feng ◽  
Yong Su ◽  
Meng Xing

Owing to the merits of high-order statistics and Riemannian geometry, the covariance matrix has become a generic feature representation for action recognition. An independent action can be represented by an empirical statistic over all of its pose samples. Covariance has two major problems: (1) it is prone to be singular, so that actions fail to be represented properly, and (2) it lacks global action/pose-aware information, so that its expressive and discriminative power is limited. In this article, we propose a novel Bayesian covariance representation with a prior regularization method to solve the preceding problems. Specifically, the covariance is viewed as a parametric maximum likelihood estimate of a Gaussian distribution over local poses from an independent action. Then, a Global Informative Prior (GIP) is generated over global poses with sufficient statistics to regularize the covariance. In this way, (1) singularity is greatly relieved due to the sufficient statistics, and (2) the global pose information of the GIP makes the Bayesian covariance theoretically equivalent to a saliency-weighted covariance over global action poses, so that the discriminative characteristics of actions can be represented more clearly. Experimental results show that our Bayesian covariance with GIP efficiently improves the performance of action recognition. In some databases, it outperforms the state-of-the-art variant methods that are based on kernels, temporal-order structures, and saliency weighting attentions, among others.
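The regularization idea can be illustrated with a conjugate-style shrinkage sketch: pooling an action's local covariance with a full-rank covariance over global poses removes singularity. This is a toy version under assumed Gaussian pose features; the prior strength `n0` and the pooling formula are illustrative assumptions, not the paper's exact GIP construction:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_local, n_global = 10, 5, 500      # fewer local poses than dimensions

# Stand-ins: global pose features, and one action's local subset of them.
global_poses = rng.normal(size=(n_global, d))
local_poses = global_poses[:n_local]

S = np.cov(local_poses, rowvar=False, bias=True)       # empirical covariance: singular
prior = np.cov(global_poses, rowvar=False, bias=True)  # global informative prior

n0 = 20.0  # assumed prior strength (pseudo-count)
# Conjugate-style shrinkage: a weighted pool of local evidence and global prior.
S_reg = (n_local * S + n0 * prior) / (n_local + n0)
```

With only 5 local poses in 10 dimensions, `S` is rank-deficient, while `S_reg` is positive definite and therefore usable on the Riemannian manifold of covariance matrices.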


2021 ◽  
pp. 1-52
Author(s):  
Yu Li ◽  
Hu Wang ◽  
Biyu Li ◽  
Jiaquan Wang ◽  
Enying Li

Abstract The purpose of this study is to obtain a margin of safety for material and process parameters in sheet metal forming. Commonly applied forming criteria are difficult to use for a direct, comprehensive evaluation of forming quality. Therefore, an image-driven criterion is suggested for uncertainty parameter identification in sheet metal forming. In this way, more useful characteristics, such as material flow and the distributions of safe and crack regions, can be considered. Moreover, to improve the efficiency of obtaining sufficient statistics for Approximate Bayesian Computation (ABC), a manifold learning-assisted ABC uncertainty inverse framework is proposed. Based on the framework, the design parameters of two sheet metal forming problems, an air conditioning cover and an engine inner hood, are identified.
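A minimal ABC rejection sampler illustrates the general mechanism the framework builds on. Here a hypothetical Gaussian forward model stands in for the forming simulation, a hand-picked two-number summary stands in for the learned manifold coordinates, and the prior and tolerance are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical forward model standing in for the forming simulation:
# data generated from a Gaussian whose mean theta we want to recover.
theta_true = 2.0
observed = rng.normal(theta_true, 1.0, size=100)

def summary(x):
    # Low-dimensional summary, standing in for learned manifold coordinates.
    return np.array([x.mean(), x.std()])

s_obs = summary(observed)

# ABC rejection: keep prior draws whose simulated summaries land near s_obs.
accepted = []
for _ in range(20000):
    theta = rng.uniform(-5.0, 5.0)                  # draw from the prior
    sim = rng.normal(theta, 1.0, size=100)          # run the forward model
    if np.linalg.norm(summary(sim) - s_obs) < 0.3:  # tolerance (assumed)
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
```

The accepted draws approximate the posterior over the uncertain parameter; the quality of the approximation hinges on how informative the summary is, which is exactly why a richer, image-derived summary is attractive.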


2021 ◽  
pp. 1-46
Author(s):  
Michael Geruso ◽  
Timothy J. Layton ◽  
Grace McCormack ◽  
Mark Shepard

Abstract Insurance markets often feature consumer sorting along both an extensive margin (whether to buy) and an intensive margin (which plan to buy). We present a new graphical theoretical framework that extends a workhorse model to incorporate both selection margins simultaneously. A key insight from our framework is that policies aimed at addressing one margin of selection often involve an economically meaningful trade-off on the other margin in terms of prices, enrollment, and welfare. Using data from Massachusetts, we illustrate these trade-offs in an empirical sufficient statistics approach that is tightly linked to the graphical framework we develop.


Author(s):  
I. S. Pulkin ◽  
A. V. Tatarintsev

Estimating the parameters of the Pareto distribution, first of all its shape parameter, from a given sample is a relevant task. This article establishes that for this estimate it is sufficient to know the product of the sample elements, and it is proved that this product is a sufficient statistic for the Pareto distribution parameter. The shape parameter is estimated on the basis of the maximum likelihood method; it is proved that this estimate is biased, and a formula eliminating the bias is justified. For the product of the sample elements, considered as a random variable, the distribution function and probability density are found; the mathematical expectation, higher moments, and differential entropy are calculated, and the corresponding graphs are built. In addition, it is noted that any one-to-one function of this product is a sufficient statistic, in particular the geometric mean. For the geometric mean, also considered as a random variable, the distribution function, probability density, and mathematical expectation are found; the higher moments and the differential entropy are also calculated, and the corresponding graphs are plotted. It is also proved that the geometric mean of the sample is, from a practical point of view, a more convenient sufficient statistic than the product of the sample elements. On the basis of the Rao–Blackwell–Kolmogorov theorem, efficient estimates of the Pareto distribution parameter are constructed. In conclusion, as an example, the technique developed here is applied to the exponential distribution, for which both the sum and the arithmetic mean of the sample can serve as sufficient statistics.
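The key facts in this abstract are easy to check numerically: with a known scale x_m, the Pareto likelihood depends on the data only through the product of the sample elements (equivalently, through the sum of log(x_i/x_m)); the maximum likelihood estimate n / Σ log(x_i/x_m) is biased upward by the factor n/(n-1); and replacing n with n-1 removes the bias. A small Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true, x_m, n, trials = 3.0, 1.0, 20, 5000

mle, corrected = [], []
for _ in range(trials):
    # Inverse-CDF sampling: X = x_m * U**(-1/alpha) is Pareto(alpha, x_m).
    x = x_m * rng.random(n) ** (-1.0 / alpha_true)
    # The likelihood depends on the sample only through sum(log(x_i / x_m)),
    # i.e. through the product of the sample elements: a sufficient statistic.
    t = np.sum(np.log(x / x_m))
    mle.append(n / t)              # MLE, biased: E = alpha * n / (n - 1)
    corrected.append((n - 1) / t)  # bias-corrected estimator

print(np.mean(mle), np.mean(corrected))
```

With n = 20 the MLE overshoots by roughly 5 percent on average, while the corrected estimator centers on the true value.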


Author(s):  
Narges Rezvani Majid ◽  
Michael Röckner

This paper is about the structure of all entrance laws (in the sense of Dynkin) for time-inhomogeneous Ornstein–Uhlenbeck processes with Lévy noise in Hilbert state spaces. We identify the extremal entrance laws with finite weak first moments through an explicit formula for their Fourier transforms, generalizing corresponding results by Dynkin for Wiener noise and nuclear state spaces. We then prove that an arbitrary entrance law with finite weak first moments can be uniquely represented as an integral over extremals. It is proved that this can be derived from Dynkin’s seminal work “Sufficient statistics and extreme points” in Ann. Probab. 1978, which contains a purely measure theoretic generalization of the classical analytic Krein–Milman and Choquet Theorems. As an application, we obtain an easy uniqueness proof for [Formula: see text]-periodic entrance laws in the general periodic case. A number of further applications to concrete cases are presented.


2021 ◽  
Vol 13 (1) ◽  
Author(s):  
Henrik J. Kleven

This article reviews and generalizes the sufficient statistics approach to policy evaluation. The idea of the approach is that the welfare effect of policy changes can be expressed in terms of estimable reduced-form elasticities, allowing for policy evaluation without estimating the structural primitives of fully specified models. The approach relies on three assumptions: that policy changes are small, that government policy is the only source of market imperfection, and that a set of high-level restrictions on the environment and on preferences can be used to reduce the number of elasticities to be estimated. We generalize the approach in all three dimensions. It is possible to develop transparent sufficient statistics formulas under very general conditions, but the estimation requirements increase greatly. Starting from such general formulas elucidates that feasible empirical implementations are in fact structural approaches. Expected final online publication date for the Annual Review of Economics, Volume 13 is August 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.


2021 ◽  
Vol 111 ◽  
pp. 272-276
Author(s):  
David Baqaee ◽  
Emmanuel Farhi

The COVID-19 crisis is a seemingly all-encompassing shock to supply and demand. These negative shocks affected industries differently: some switched to remote work, maintaining employment and production, while others reduced capacity and shed workers. We consider a stripped-down version of the model in Baqaee and Farhi (2020). The model allows for an arbitrary input-output network, complementarities, incomplete markets, downward wage rigidity, and a zero lower bound. Nevertheless, the model has a stark property: factor income shares at the initial equilibrium are global sufficient statistics for the production network, clarifying assumptions that must be broken if the network is to matter.


Author(s):  
Yuri L. Koziratsky ◽  
Roman G. Khilchenko ◽  
Ruslan E. Merkulov ◽  
Anton A. Koziratsky

An algorithm is developed for estimating the position of a laser beam from its component scattered in the atmosphere, as recorded by a matrix photodetector against a background of Gaussian noise. The algorithm is based on the Lehmann–Scheffé theorem, which allows one to obtain efficient estimates of the distribution parameters using complete sufficient statistics computed from samples of the observed realization of the scattered laser radiation. The obtained estimates provide the best quality of parameter estimation for any finite sample size and do not require additional studies.
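As a generic illustration of the underlying principle (not the paper's algorithm): under i.i.d. Gaussian noise, the per-pixel sample mean over repeated frames is a complete sufficient statistic for the mean image, so position estimates should be computed from it rather than from individual frames. A sketch with an assumed Gaussian spot model and an assumed background threshold:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed model: a Gaussian beam spot on a 64x64 detector, plus Gaussian noise.
size, cx, cy, width = 64, 40.0, 25.0, 5.0
yy, xx = np.mgrid[0:size, 0:size]
spot = np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * width ** 2))
frames = spot + rng.normal(0, 0.1, size=(200, size, size))  # 200 noisy frames

# The per-pixel sample mean over frames is a complete sufficient statistic
# for the mean image under i.i.d. Gaussian noise.
mean_img = frames.mean(axis=0)
w = np.where(mean_img > 0.05, mean_img, 0.0)  # suppress background noise
x_hat = float((w * xx).sum() / w.sum())       # intensity-weighted centroid
y_hat = float((w * yy).sum() / w.sum())
print(x_hat, y_hat)
```

Averaging frames before locating the centroid uses all the information in the data about the mean image, which is the sense in which sufficient-statistic-based estimates need no further refinement.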

