statistical sampling
Recently Published Documents

TOTAL DOCUMENTS: 313 (five years: 43)
H-INDEX: 22 (five years: 1)

Symmetry, 2022, Vol 14 (1), pp. 163
Author(s): Karl Hess

This review is related to the Einstein-Bohr debate and to the Einstein-Podolsky-Rosen (EPR) and Bohm (EPRB) Gedanken-experiments, as well as their realization in actual experiments. I examine a significant number of papers from my minority point of view and conclude that the well-known theorems of Bell and of Clauser, Horne, Shimony and Holt (CHSH) deal with mathematical abstractions that have only a tenuous relation to quantum theory and to the actual EPRB experiments. It is also shown that, therefore, Bell-CHSH cannot be used to assess the nature of quantum entanglement, nor can physical features of entanglement be used to prove Bell-CHSH. Their proofs are, among other factors, based on a statistical sampling argument that is invalid for general physical entities and processes and applicable only to finite “populations”, not to elements of physical reality that are linked, for example, to a time-like continuum. Bell-CHSH have, furthermore, neglected the subtleties of the theorem of Vorob’ev, which includes their theorems as special cases. Vorob’ev found that certain combinatorial-topological cyclicities of classical random variables form a necessary and sufficient condition for the constraints now known as the Bell-CHSH inequalities. These constraints, however, must not be linked to the observables of quantum theory nor to the actual EPRB experiments, for a variety of reasons including the existence of continuum-related variables and appropriate considerations of symmetry.
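The sampling argument at issue can be illustrated with a minimal sketch (a hypothetical toy model, not taken from the paper): when all four CHSH correlations are estimated from one common sample of a classical hidden variable — i.e., all observables live on a single probability space, which is the cyclicity condition Vorob'ev identified — the CHSH combination cannot exceed 2.

```python
import math
import random

# Hypothetical classical local model: both outcomes are fixed by a shared
# hidden angle `lam`; each side outputs the sign of cos(setting - lam).
def outcome(setting, lam):
    """Deterministic +/-1 outcome for a detector setting and hidden variable."""
    return 1 if math.cos(setting - lam) >= 0 else -1

def correlation(a, b, n=20_000, seed=0):
    """Estimate E(a, b); the fixed seed reuses one common sample of lam,
    so all four correlations below share a single probability space."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        lam = rng.uniform(0.0, 2.0 * math.pi)
        total += outcome(a, lam) * outcome(b, lam)
    return total / n

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b') at settings
# where the classical bound |S| <= 2 is saturated.
a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, 3 * math.pi / 4
S = (correlation(a, b) - correlation(a, bp)
     + correlation(ap, b) + correlation(ap, bp))
print(S)
```

Because the same hidden-variable sample enters all four terms, each per-sample contribution is algebraically ±2 and the average can never leave [-2, 2]; quantum experiments, by contrast, report |S| up to 2√2.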


2022, Vol 2 (1)
Author(s): Hailey M. Cambra, Naren P. Tallapragada, Prabhath Mannam, David T. Breault, Allon M. Klein

2021, Vol 38 (12), pp. 121401
Author(s): Zhu-Fang Cui, Daniele Binosi, Craig D. Roberts, Sebastian M. Schmidt

Using a procedure based on interpolation via continued fractions supplemented by statistical sampling, we analyze proton magnetic form factor data obtained via electron+proton scattering on Q² ∈ [0.027, 0.55] GeV², with the goal of determining the proton magnetic radius. The approach avoids assumptions about the functional form used for data interpolation and the ensuing extrapolation onto Q² ≃ 0 for extraction of the form factor slope. In this way, we find r_M = 0.817(27) fm. Regarding the difference between proton electric and magnetic radii calculated in this way, extant data are seen to be compatible with the possibility that the slopes of the proton Dirac and Pauli form factors, F_{1,2}(Q²), are not truly independent observables; to wit, the difference F′_1(0) − F′_2(0)/κ_p = [1 + κ_p]/[4m_p²], viz., the proton Foldy term.
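The slope-to-radius step can be sketched as follows (illustrative only: synthetic dipole data and a low-order polynomial fit stand in for the paper's continued-fraction interpolation, and the dipole scale Λ² = 0.71 GeV² is an assumption). The radius follows from the slope at Q² = 0 via r² = −6 G′(0)/G(0), and refitting noise replicas gives a sampling-based uncertainty.

```python
import numpy as np

hbarc = 0.19733          # GeV·fm, converts GeV^-1 to fm
Lam2 = 0.71              # GeV^2, standard dipole mass scale (assumption)
rng = np.random.default_rng(1)

# Synthetic dipole form factor G(Q^2) = (1 + Q^2/Lam2)^-2 on the data window.
Q2 = np.linspace(0.03, 0.55, 40)                           # GeV^2
G = (1.0 + Q2 / Lam2) ** -2
G_data = G * (1.0 + 0.005 * rng.standard_normal(Q2.size))  # 0.5% noise

# Statistical sampling: refit noise replicas, read slope/intercept at Q^2 = 0.
radii = []
for _ in range(500):
    replica = G_data * (1.0 + 0.005 * rng.standard_normal(Q2.size))
    c = np.polynomial.polynomial.polyfit(Q2, replica, 3)   # c[0] + c[1] Q^2 + ...
    r2_fm = -6.0 * c[1] / c[0] * hbarc**2                  # fm^2
    radii.append(np.sqrt(r2_fm))
r, dr = float(np.mean(radii)), float(np.std(radii))
print(f"r = {r:.3f} +/- {dr:.3f} fm")
```

For the dipole input the true radius is √12/Λ · ħc ≈ 0.811 fm, so the recovered central value lands near that; the paper's point is precisely that the answer should not depend on the interpolant chosen here.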


2021, Vol 0 (0)
Author(s): Yuri Biondi

Abstract Infection, hospitalization and mortality statistics have played a pivotal role in forming social attitudes and support for policy decisions about the 2020-21 SARS-CoV-2 (COVID-19) pandemic. This article raises questions about some of the most widely used indicators derived from these statistics, such as the case fatality rate, and recommends replacing them with information based on regular stratified statistical sampling coupled with diagnostic assessment. Some implications for public health policies and pandemic management are developed, contrasting individualistic and holistic approaches.
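A minimal sketch of the recommended approach, with hypothetical numbers (the strata, sizes, and prevalences below are illustrative assumptions, not data from the article): sample a fixed number of people per stratum, test each, and weight the per-stratum rates by population share.

```python
import random

random.seed(7)
strata = {                 # name: (stratum size, true prevalence) -- illustrative
    "young":  (600_000, 0.02),
    "adult":  (300_000, 0.05),
    "senior": (100_000, 0.10),
}
total = sum(size for size, _ in strata.values())

def stratified_prevalence(n_per_stratum=2_000):
    """Diagnostically test a random sample in each stratum, then
    combine the stratum rates weighted by population share."""
    est = 0.0
    for size, p in strata.values():
        positives = sum(random.random() < p for _ in range(n_per_stratum))
        est += (size / total) * (positives / n_per_stratum)
    return est

true_prev = sum(size * p for size, p in strata.values()) / total  # 0.037
est = stratified_prevalence()
print(est, true_prev)
```

Unlike a case fatality rate computed from whoever happens to get tested, this estimator targets the whole population and its precision is controlled by the chosen sample sizes.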


Author(s): Xiao-Kan Guo

In this paper, we study the construction of classical geometry from the quantum entanglement structure by using information geometry. In the information geometry of classical spacetime, the Fisher information metric is related to a blurred metric of a classical physical space. We first show that a local information metric can be obtained from the entanglement contour in a local subregion. This local information metric measures the fine structure of entanglement spectra inside the subregion, which suggests a quantum origin of the information-geometric blurred space. We study both the continuous and the classical limits of the quantum-originated blurred space using techniques from statistical sampling algorithms, the sampling theory of spacetime, and the projective limit. A scheme for going from a blurred space with quantum features to a classical geometry is also explored.
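For orientation, the Fisher information metric itself can be estimated by statistical sampling. The sketch below is a standard textbook example (the Gaussian family, not the paper's entanglement-contour construction): the metric is the expected outer product of the score vector, which Monte Carlo sampling approximates and which analytically equals diag(1/σ², 2/σ²) in (μ, σ) coordinates.

```python
import numpy as np

def fisher_metric_mc(mu, sigma, n=100_000, seed=3):
    """Monte Carlo estimate of the Fisher metric g_ij = E[s_i s_j]
    for the Gaussian family N(mu, sigma), s = grad of log-likelihood."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, n)
    score_mu = (x - mu) / sigma**2                        # d log p / d mu
    score_sigma = (x - mu) ** 2 / sigma**3 - 1.0 / sigma  # d log p / d sigma
    s = np.stack([score_mu, score_sigma])
    return s @ s.T / n                                    # sample E[s s^T]

g = fisher_metric_mc(0.0, 2.0)
print(g)   # close to [[0.25, 0], [0, 0.5]] for sigma = 2
```

The blurring the abstract refers to enters because the Fisher metric measures distinguishability of nearby distributions, not distances between sharp points.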


2021, pp. 97-109
Author(s): Scott A. Chamberlin

Author(s): Michael Sant'Ambrogio, Adam S. Zimmerman

This chapter considers how administrative agencies in different countries use aggregate procedures to hear common claims brought by large groups of people. In many countries, administrative agencies promise each individual a ‘day in court’ to appear before a neutral decision-maker and receive a reasoned decision based on the factual record they develop. A handful of US and other countries’ administrative hearing programmes, however, have quietly bucked this trend, using class actions, statistical sampling, agency restitution, public inquiries, ‘test case’ proceedings, and other forms of mass adjudication to resolve disputes involving large groups of people. This chapter examines how administrative agencies can more effectively resolve common disputes with aggregate procedures. Aggregate procedures offer administrative agencies several benefits, including: 1) efficiently creating ways to pool information about recurring problems and enjoin systemic harms; 2) achieving greater equality in outcomes than individual adjudication; and 3) securing legal and expert assistance at critical stages in the process. By charting how administrative systems in different countries aggregate cases, we hope to show that collective hearing procedures can form an integral part of the adjudicatory process, while serving several different models of administrative justice.


2021, Vol 14 (8), pp. 384
Author(s): Michele Caraglio, Fulvio Baldovin, Attilio L. Stella

A definition of time based on the assumption of scale invariance may enhance and simplify the analysis of historical series with cyclically recurrent patterns and seasonalities. By enforcing simple-scaling and stationarity of the distributions of returns, we identify a successful protocol of time definition in finance, valid from tens of minutes to a few days. Within this time definition, the significant reduction of cyclostationary effects allows analyzing the structure of the stochastic process underlying the series on the basis of statistical sampling sliding along the whole time series. At the same time, the duration of periods in which markets remain inactive is properly quantified by the novel clock, and the corresponding returns (e.g., overnight or weekend) can be consistently taken into account in financial applications. The method is applied to the S&P500 index recorded at a 1 min frequency between September 1985 and June 2013.
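The flavor of such a time change can be sketched on synthetic data (an illustrative toy, not the authors' protocol: here the clock simply advances with cumulative absolute return, so equal ticks of the new time carry equal trading activity and the intraday cycle is damped).

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
# Synthetic minute returns with a U-shaped intraday activity cycle (390 min).
intraday = 1.0 + 0.5 * np.cos(2 * np.pi * np.arange(n) / 390)
log_ret = 0.001 * intraday * rng.standard_normal(n)
price = 100.0 * np.exp(np.cumsum(log_ret))

# The "activity clock": new time advances with cumulative absolute return.
activity = np.cumsum(np.abs(log_ret))
ticks = np.linspace(0.0, activity[-1], 200)       # 200 equal-activity intervals
idx = np.searchsorted(activity, ticks[1:-1])      # physical times of clock ticks
clock_prices = price[np.r_[0, idx, n - 1]]
clock_ret = np.diff(np.log(clock_prices))         # returns per clock tick
print(clock_ret.std())
```

Sampling returns at clock ticks rather than at fixed minutes makes their dispersion roughly homogeneous across the cycle; inactive periods (overnight, weekends) simply advance the clock slowly instead of being discarded.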

