Squeezing metrology: a unified framework

Quantum, 2020, Vol. 4, pp. 292
Author(s): Lorenzo Maccone, Alberto Riccardi

Quantum metrology theory has up to now focused on the resolution gains obtainable thanks to the entanglement among N probes. Typically, a quadratic gain in resolution is achievable, going from the 1/√N of the central limit theorem to the 1/N of the Heisenberg bound. Here we focus instead on quantum squeezing and provide a unified framework for metrology with squeezing, showing that, similarly, one can generally attain a quadratic gain when comparing the resolution achievable by a squeezed probe to the best N-probe classical strategy achievable with the same energy. Namely, here we give a quantification of the Heisenberg squeezing bound for arbitrary estimation strategies that employ squeezing. Our theory recovers known results (e.g. in quantum optics and spin squeezing), but it uses the general theory of squeezing and holds for arbitrary quantum systems.
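For orientation, the quadratic gain refers to the standard scaling of the uncertainty of an estimated parameter with the resource count N (a textbook summary in generic notation, not a formula from the paper):

$$
\Delta\phi_{\text{classical}} \sim \frac{1}{\sqrt{N}}
\qquad\longrightarrow\qquad
\Delta\phi_{\text{Heisenberg}} \sim \frac{1}{N},
$$

where the left-hand side is the shot-noise (central-limit) scaling of N independent probes and the right-hand side is the Heisenberg bound; the paper establishes an analogous gain when a squeezed probe is compared with the best classical strategy of the same energy.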

Mathematics, 2021, Vol. 9 (8), pp. 880
Author(s): Igoris Belovas

In this research, we continue studying limit theorems for combinatorial numbers satisfying a class of triangular arrays. Using the general results of Hwang and Bender, we obtain a constructive proof of the central limit theorem, specifying the rate of convergence to the limiting (normal) distribution, as well as a new proof of the local limit theorem for the numbers of the tribonacci triangle.
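A minimal numerical sketch of the phenomenon the abstract addresses, in Python. It uses the trinomial triangle T(n, k), the coefficients of (1 + x + x²)ⁿ, as an illustrative stand-in triangular array (the paper's tribonacci triangle is a different object); the local limit theorem predicts that the normalized row entries approach a Gaussian profile.

# Illustrative sketch, not the paper's construction: empirical local-limit
# behaviour for a combinatorial triangle. We use the trinomial triangle,
# T(n, k) = T(n-1, k-2) + T(n-1, k-1) + T(n-1, k), i.e. the coefficients
# of (1 + x + x^2)^n, as a stand-in triangular array.
import math

def trinomial_row(n):
    """Return row n of the trinomial triangle (2n + 1 entries)."""
    row = [1]
    for _ in range(n):
        padded = [0, 0] + row + [0, 0]
        row = [padded[k] + padded[k + 1] + padded[k + 2]
               for k in range(len(padded) - 2)]
    return row

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

n = 200
row = trinomial_row(n)
total = sum(row)                      # row sum is 3**n
# Row n, normalized, is the law of a sum of n iid steps uniform on {0, 1, 2},
# so its mean is n and its variance is 2n/3; the local limit theorem predicts
# pointwise convergence of T(n, k) / 3**n to the matching normal density.
mu, sigma = n, math.sqrt(2 * n / 3)
max_gap = max(abs(c / total - normal_pdf(k, mu, sigma)) for k, c in enumerate(row))
print(f"n = {n}: max pointwise gap to the normal density = {max_gap:.2e}")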


2021, Vol. 36 (2), pp. 243-255
Author(s): Wei Liu, Yong Zhang

Abstract: In this paper, we investigate the central limit theorem and the invariance principle for linear processes generated by a new notion of independent and identically distributed (IID) random variables for sub-linear expectations, initiated by Peng [19]. It turns out that these theorems are natural and fairly neat extensions of the classical Kolmogorov central limit theorem and invariance principle to the case where probability measures are no longer additive.
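For reference, the classical statements being extended are, in standard notation (not the paper's): for i.i.d. random variables X₁, X₂, … with E X_i = 0 and Var X_i = σ² < ∞,

$$
\frac{1}{\sqrt{n}}\sum_{i=1}^{n} X_i \;\xrightarrow{\;d\;}\; \mathcal{N}(0,\sigma^{2}),
\qquad
\left(\frac{1}{\sqrt{n}}\sum_{i=1}^{\lfloor nt \rfloor} X_i\right)_{t\in[0,1]} \;\Longrightarrow\; \big(\sigma W_t\big)_{t\in[0,1]},
$$

where W is a standard Brownian motion (Donsker's invariance principle); the paper shows how analogues of these statements hold for linear processes when the underlying probability is replaced by a sub-linear expectation.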


2021, Vol. 22 (1), pp. 285-334
Author(s): Eric Kades

Abstract: There are powerful fairness and efficiency arguments for making charitable donations to soup kitchens 100% deductible. These arguments have no purchase for donations to fund opulent church organs, yet these too are 100% deductible under the current tax code. This stark dichotomy is only the tip of the iceberg. Looking at a wider sampling of charitable gifts reveals a charitable continuum. Sliding scales for efficiency, multiple theories of fairness, pluralism, institutional competence, and social welfare dictate that charitable deductions should in most cases be fractions between zero and one. Moreover, the Central Limit Theorem strongly suggests that combining this welter of largely independent criteria with the wide variety of charitable gifts results in a classic bell-shaped normal curve of optimal deductions, with a peak at some central value that decays quickly to zero at the extremes of 0% and 100%. Given that those extremes are the only two options under the current tax code, the existing charitable deduction regime inevitably makes large errors in most cases. Actually calculating a precise optimal percentage for each type of charitable donation is of course impractical. This Article suggests, however, that we can do much better than the systematically erroneous current charitable deduction. Granting a 100% deduction only for donations to the desperately poor, along with 50%, 25%, and 0% for gifts yielding progressively fewer efficiency, fairness, pluralism, and institutional competence benefits, promises to deliver a socially more desirable charitable deduction.
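A toy simulation of the article's central-limit intuition, in Python. Everything here is hypothetical (the number of criteria, their independence, the uniform scoring, and the averaging rule are modelling assumptions for illustration only): averaging several largely independent criterion scores pushes the distribution of "optimal" deductions toward a bell shape concentrated away from the 0% and 100% extremes.

# Toy simulation of the CLT intuition; all parameters are hypothetical.
import random
import statistics

random.seed(0)
NUM_CRITERIA = 5          # hypothetical count of independent criteria
NUM_GIFT_TYPES = 10_000   # hypothetical sample of charitable-gift types

def optimal_deduction():
    # Each criterion scores the gift in [0, 1]; purely for illustration,
    # the "optimal" deduction is modelled as the mean of the scores.
    return statistics.mean(random.random() for _ in range(NUM_CRITERIA))

deductions = [optimal_deduction() for _ in range(NUM_GIFT_TYPES)]
print(f"mean optimal deduction: {statistics.mean(deductions):.3f}")
print(f"standard deviation:     {statistics.stdev(deductions):.3f}")
# Crude text histogram: mass concentrates near the middle, with little
# weight at the 0% and 100% extremes that current law forces gifts into.
for i in range(10):
    lo = i / 10
    count = sum(lo <= d < lo + 0.1 for d in deductions)
    print(f"{lo:.1f}-{lo + 0.1:.1f} | {'#' * (count // 100)}")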


Author(s): Felix Herold, Daniel Hug, Christoph Thäle

Abstract: Poisson processes in the space of $(d-1)$-dimensional totally geodesic subspaces (hyperplanes) in a $d$-dimensional hyperbolic space of constant curvature $-1$ are studied. The $k$-dimensional Hausdorff measure of their $k$-skeleton is considered. Explicit formulas for first- and second-order quantities restricted to bounded observation windows are obtained. The central limit problem for the $k$-dimensional Hausdorff measure of the $k$-skeleton is approached in two different set-ups: (i) for a fixed window and growing intensities, and (ii) for fixed intensity and growing spherical windows. While in case (i) the central limit theorem is valid for all $d \ge 2$, it is shown that in case (ii) the central limit theorem holds for $d \in \{2,3\}$ and fails if $d \ge 4$ and $k = d-1$, or if $d \ge 7$ for general $k$. Rates of convergence are also studied, and multivariate central limit theorems are obtained. Moreover, the situation in which the intensity and the spherical window grow simultaneously is discussed. In the background are the Malliavin–Stein method for normal approximation and the combinatorial moment structure of Poisson U-statistics, as well as tools from hyperbolic integral geometry.
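Schematically, writing $F_W$ for the $k$-dimensional Hausdorff measure of the $k$-skeleton inside an observation window $W$ (our notation, summarizing the abstract), the central limit problem asks when

$$
\frac{F_W - \mathbb{E}\,F_W}{\sqrt{\operatorname{Var} F_W}} \;\xrightarrow{\;d\;}\; \mathcal{N}(0,1),
$$

either as the intensity of the hyperplane process grows with $W$ fixed (set-up (i)), or as a spherical window grows with the intensity fixed (set-up (ii)).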


2011, Vol. 26 (24), pp. 1771-1782
Author(s): H. C. Eggers, M. B. de Kock, J. Schmiegel

Lowest-order cumulants provide important information on the shape of the emission source in femtoscopy. For the simple case of noninteracting identical particles, we show how the fourth-order source cumulant can be determined from measured cumulants in momentum space. The textbook Gram–Charlier series is found to be highly inaccurate, while the related Edgeworth series provides increasingly accurate estimates. Ordering of terms compatible with the Central Limit Theorem appears to play a crucial role even for non-Gaussian distributions.
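A generic numerical illustration of the difference between the two expansions, in Python (not the paper's femtoscopy setting; the target density, the value of n, and the grid are illustrative choices). The target is the standardized sum of n iid Exp(1) variables, i.e. a standardized Gamma(n, 1), whose exact density is known; the Edgeworth truncation keeps the skewness² He₆ term that its ordering in powers of 1/√n places at the same order as the kurtosis term, which the plain Gram–Charlier truncation omits.

# Illustrative comparison, not the paper's analysis: truncated Gram-Charlier
# vs. Edgeworth approximations to the density of a standardized Gamma(n, 1)
# (the centred, scaled sum of n iid Exp(1) variables).
import math

def phi(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def he3(x): return x**3 - 3 * x
def he4(x): return x**4 - 6 * x**2 + 3
def he6(x): return x**6 - 15 * x**4 + 45 * x**2 - 15

def exact_density(x, n):
    # Density of (G - n) / sqrt(n) with G ~ Gamma(n, 1), evaluated in log space.
    y = n + x * math.sqrt(n)
    if y <= 0:
        return 0.0
    return math.exp(0.5 * math.log(n) + (n - 1) * math.log(y) - y - math.lgamma(n))

def gram_charlier(x, skew, exkurt):
    # Gram-Charlier A series truncated at the fourth Hermite polynomial.
    return phi(x) * (1 + skew / 6 * he3(x) + exkurt / 24 * he4(x))

def edgeworth(x, skew, exkurt):
    # Edgeworth ordering adds the skew^2 * He6 term at the same order (1/n).
    return gram_charlier(x, skew, exkurt) + phi(x) * skew**2 / 72 * he6(x)

n = 20
skew, exkurt = 2 / math.sqrt(n), 6 / n   # cumulants of the standardized sum
grid = [i / 10 for i in range(-30, 31)]
err_gc = max(abs(gram_charlier(x, skew, exkurt) - exact_density(x, n)) for x in grid)
err_ew = max(abs(edgeworth(x, skew, exkurt) - exact_density(x, n)) for x in grid)
print(f"max |error|, Gram-Charlier truncation: {err_gc:.4f}")
print(f"max |error|, Edgeworth truncation:     {err_ew:.4f}")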

