statistical precision
Recently Published Documents

TOTAL DOCUMENTS: 131 (five years: 36)
H-INDEX: 22 (five years: 4)

2021 ◽  
Vol 111 (12) ◽  
pp. 2157-2166
Author(s):  
Samuel H. Zuvekas ◽  
David Kashihara

The COVID-19 pandemic caused substantial disruptions in the field operations of all 3 major components of the Medical Expenditure Panel Survey (MEPS). The MEPS is widely used to study how policy changes and major shocks, such as the COVID-19 pandemic, affect insurance coverage, access, and preventive and other health care utilization and how these relate to population health. We describe how the MEPS program successfully responded to these challenges by reengineering field operations, including survey modes, to complete data collection and maintain data release schedules. The impact of the pandemic on response rates varied considerably across the MEPS. Investigations to date show little effect on the quality of data collected. However, lower response rates may reduce the statistical precision of some estimates. We also describe several enhancements made to the MEPS that will allow researchers to better understand the impact of the pandemic on US residents, employers, and the US health care system. (Am J Public Health. 2021;111(12):2157–2166. https://doi.org/10.2105/AJPH.2021.306534)


2021 ◽  
Vol 136 (12) ◽  
Author(s):  
Paolo Azzurri

Abstract: The FCC-ee physics program will deliver two complementary top-notch precision determinations of the W boson mass and width. The first and main measurement relies on the rapid rise of the W-pair production cross section near its kinematic threshold. This method is extremely simple and clean, involving only the selection and counting of events in all the different decay channels. An optimal threshold-scan strategy with a total integrated luminosity of $$12\,\mathrm{ab}^{-1}$$ shared among energy points between 157 and 163 GeV will provide a statistical uncertainty on the W mass of 0.5 MeV and on the W width of 1.2 MeV. For these measurements, the goal of keeping the impact of systematic uncertainties below the statistical precision will be demanding, but feasible. The second method exploits the W-pair final-state reconstruction and kinematic fit, making use of events with either four jets, or two jets, one lepton and missing energy. The projected statistical precision of the second method is similar to the first method’s, with uncertainties of $$\sim 0.5$$ (1) MeV for the W mass (width), employing W-pair data collected at the production threshold and at 240–365 GeV. For the kinematic reconstruction method, the final impact of systematic uncertainties is currently less clear, in particular for uncertainties connected to the modelling of the W hadronic decays. The use and interplay of Z$$\gamma$$ and ZZ events, reconstructed and fitted with the same techniques as the WW events, will be important for extracting W mass measurements from the data at the higher 240 and 365 GeV energies.
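The counting-based threshold measurement lends itself to a quick back-of-the-envelope check. The sketch below shows how a Poisson counting uncertainty on the W-pair cross section propagates to a mass uncertainty through the threshold slope; the cross section, luminosity, and slope values are illustrative placeholders, not FCC-ee design numbers.

```python
import math

# Hedged sketch: propagate a counting-statistics uncertainty on the W-pair
# cross section into a W-mass uncertainty via the threshold-scan slope.
# All numbers below are illustrative placeholders, not FCC-ee values.

def mass_uncertainty(sigma_pb, lumi_abinv, dsigma_dm_pb_per_mev):
    """Statistical mass uncertainty (MeV) at a single scan point.

    sigma_pb: W-pair cross section at the scan energy (pb)
    lumi_abinv: integrated luminosity (ab^-1); 1 ab^-1 = 1e6 pb^-1
    dsigma_dm_pb_per_mev: sensitivity of the cross section to m_W (pb/MeV)
    """
    n_events = sigma_pb * lumi_abinv * 1e6          # expected event count
    delta_sigma = sigma_pb / math.sqrt(n_events)    # Poisson error on sigma
    return delta_sigma / abs(dsigma_dm_pb_per_mev)  # error propagation

# Illustrative: 4 pb near threshold, 6 ab^-1 at this point, 1e-3 pb/MeV slope
print(round(mass_uncertainty(4.0, 6.0, 1e-3), 2))
```

With these toy inputs the result lands below 1 MeV, the same ballpark as the 0.5 MeV statistical uncertainty quoted in the abstract.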


2021 ◽  
Vol 12 ◽  
Author(s):  
Ana C. Holt ◽  
William G. Hopkins ◽  
Robert J. Aughey ◽  
Rodney Siegel ◽  
Vincent Rouillard ◽  
...  

Purpose: Instrumentation systems are increasingly used in rowing to measure training intensity and performance but have not been validated for measures of power. In this study, the concurrent validity of Peach PowerLine (six units), Nielsen-Kellerman EmPower (five units), Weba OarPowerMeter (three units), Concept2 model D ergometer (one unit), and a custom-built reference instrumentation system (Reference System; one unit) was investigated. Methods: Eight female and seven male rowers [age, 21 ± 2.5 years; rowing experience, 7.1 ± 2.6 years, mean ± standard deviation (SD)] performed a 30-s maximal test and a 7 × 4-min incremental test once per week for 5 weeks. Power per stroke was extracted concurrently from the Reference System (via chain force and velocity), the Concept2 itself, Weba (oar shaft-based), and either Peach or EmPower (oarlock-based). Differences from the Reference System in the mean (representing potential error) and the stroke-to-stroke variability (represented by its SD) of power per stroke for each stage and device, and between-unit differences, were estimated using general linear mixed modeling and interpreted using rejection of non-substantial and substantial hypotheses. Results: Potential error in mean power was decisively substantial for all devices (Concept2, −11 to −15%; Peach, −7.9 to −17%; EmPower, −32 to −48%; and Weba, −7.9 to −16%). Between-unit differences (as SD) in mean power lacked statistical precision but were substantial and consistent across stages (Peach, ∼5%; EmPower, ∼7%; and Weba, ∼2%). Most differences from the Reference System in stroke-to-stroke variability of power were possibly or likely trivial or small for Peach (−3.0 to −16%), likely or decisively substantial for EmPower (9.7–57%), and mostly decisively substantial for Weba (61–139%) and the Concept2 (−28 to 177%). Conclusion: Potential negative error in mean power was evident for all devices and units, particularly EmPower.
Stroke-to-stroke variation in power showed a lack of measurement sensitivity (apparent smoothing) that was minor for Peach but larger for the Concept2, whereas EmPower and Weba added random error. Peach is therefore recommended for measurement of mean and stroke power.
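The device errors quoted above come down to comparing per-stroke power readings against the reference system. A minimal sketch of that comparison, with invented power values standing in for real stroke data:

```python
# Hedged sketch: percent difference in mean stroke power between a device
# and a reference system, the basic comparison behind the errors quoted
# above (the study itself used general linear mixed modeling).
# Power values are invented for illustration.

def percent_error(device_powers, reference_powers):
    """Mean percent difference of device vs. reference, stroke by stroke."""
    diffs = [100.0 * (d - r) / r for d, r in zip(device_powers, reference_powers)]
    return sum(diffs) / len(diffs)

reference = [310.0, 305.0, 320.0, 315.0]   # reference-system power per stroke (W)
oarlock   = [280.0, 276.0, 288.0, 284.0]   # hypothetical oarlock readings (W)
print(round(percent_error(oarlock, reference), 1))
```

A consistently negative result, as here, corresponds to the systematic under-reading found for all devices.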


2021 ◽  
Vol 136 (11) ◽  
Author(s):  
Patrizia Azzi ◽  
Emmanuel Perez

Abstract: Circular colliders have the advantage of delivering collisions to multiple interaction points, which allow different detector designs to be studied and optimised—up to four for FCC-ee. On the one hand, the detectors must satisfy the constraints imposed by the invasive interaction region layout. On the other hand, the performance of heavy-flavour tagging, of particle identification, of tracking and particle-flow reconstruction, and of lepton, jet, missing energy and angular resolution, need to match the physics programme and the exquisite statistical precision offered by FCC-ee. During the FCC feasibility study (2021–2025), benchmark physics processes will be used to determine, via appropriate simulations, the requirements on the detector performance or design that must be satisfied to ensure that the systematic uncertainties of the measurements are commensurate with their statistical precision. The usage of the data themselves, in order to reach the challenging goals on the stability and on the alignment of the detector, in particular for the programme at and around the Z peak, will also be studied. In addition, the potential for discovering very weakly coupled new particles, in decays of Z or Higgs bosons, could motivate dedicated detector designs that would increase the efficiency for reconstructing the unusual signatures of such processes. These studies are crucial input to the further optimisation of the two concepts described in the FCC-ee conceptual design report, CLD and IDEA, and to the development of new concepts which might actually prove to be better adapted to the FCC-ee physics programme, or parts thereof.


Author(s):  
Montes-Perez Ruben ◽  
Lopez-Coba Ermilo ◽  
Pacheco-Sierra Gualberto ◽  
May-Cruz Christian ◽  
Sierra-Gomez Andrés III

Aims: Estimate the population density of deer in the municipality of Tzucacab, Yucatán, in the periods 2003-2004, 2007-2008, and 2008-2009; determine these populations' use of the habitat; and assess the sustainability of the deer harvest from the estimated population densities. Study Design: A descriptive, longitudinal study of free-living deer populations was carried out in southern Yucatán, Mexico, over a three-year period. Methodology: The map of the municipality of Tzucacab was zoned into quadrants of 36 km2, for a total of 36 quadrants; unrestricted random sampling was applied to select seven quadrants in the 2003-2004 period and 18 in each annual period between 2007 and 2009. Population samplings were carried out using three estimation methods: direct sighting along a 5-km linear transect, track counts along transects (except in 2003-2004), and faecal pellet-group counts in plots. Habitat use was evaluated using Bonferroni intervals computed from the faecal pellet counts, and the deer harvest was evaluated using the sustainable harvest model. Results: The population densities differed by method; the density from pellet-group counts was 4.63 ± 2.49 deer/km2 in 2003-2004, 0.294 ± 0.198 deer/km2 in 2007-2008, and 0.419 ± 0.0000085 deer/km2 in 2008-2009. Habitat use in 2007-2008 and 2008-2009 was higher than expected in the tropical forest, lower in agricultural land, and similar to expected in secondary-succession forest (acahual). Taking the plot-based pellet-count density as the reference because it showed the highest statistical precision, the harvest was sustainable in 2003-2004 but not in the 2007-2009 periods. Conclusion: The population densities of deer (O. virginianus and M. americana) in Tzucacab, estimated by the pellet-count method, have decreased significantly. The preferred habitat is the tropical forest. The deer harvest in the period from 2007 to 2009 is not sustainable.
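The pellet-count density that anchors the harvest assessment follows the standard pellet-group formula: groups counted per unit plot area, divided by the deposition period and a per-deer defecation rate. A hedged sketch with illustrative inputs (the deposition period and defecation rate below are assumptions, not the study's field values):

```python
# Hedged sketch of the standard faecal-pellet-group density estimate.
# All inputs are illustrative assumptions, not the study's field data.

def deer_density(pellet_groups, total_plot_area_km2, deposition_days,
                 groups_per_deer_per_day):
    """Deer per km^2 from pellet-group counts in sample plots."""
    groups_per_km2 = pellet_groups / total_plot_area_km2
    return groups_per_km2 / (deposition_days * groups_per_deer_per_day)

# Illustrative: 50 groups over 0.04 km^2 of plots, 100 days of deposition,
# 13 pellet groups per deer per day
print(round(deer_density(50, 0.04, 100, 13), 2))
```

The estimate scales linearly with the counted groups and inversely with the assumed deposition time and defecation rate, which is why those two assumed parameters dominate the method's accuracy.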


2021 ◽  
Vol 81 (10) ◽  
Author(s):  
Ryuta Kiuchi ◽  
Yanxi Gu ◽  
Min Zhong ◽  
Lingteng Kong ◽  
Alex Schuy ◽  
...  

Abstract: The precision of the yield measurement of the Higgs boson decaying into a pair of Z bosons at the Circular Electron Positron Collider is evaluated. Including the recoil Z boson associated with the Higgs production (Higgsstrahlung), a total of three Z bosons are involved in this channel, from which final states characterized by the presence of a pair of leptons, quarks, and neutrinos are chosen as the signal. Two analysis approaches are compared, and the final statistical precision of $$\sigma_{\mathrm{ZH}}\cdot$$BR($$H \rightarrow ZZ^{*}$$) is estimated to be 6.9% using a multivariate analysis technique based on boosted decision trees. The relative precision of the Higgs boson width, using this $$H \rightarrow ZZ^{*}$$ decay topology, is estimated by combining the obtained result with the precision of the inclusive ZH cross-section measurement.
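The width extraction combines the two measured precisions. Using the standard relation $$\Gamma_H \propto \sigma_{\mathrm{ZH}}^2 / (\sigma_{\mathrm{ZH}}\cdot\mathrm{BR}(H\rightarrow ZZ^{*}))$$ and assuming uncorrelated uncertainties, the combination is a quadrature sum; the 0.5% figure used below for the inclusive ZH cross section is a placeholder assumption, not a number from this analysis.

```python
import math

# Hedged sketch: combine the sigma_ZH * BR(H->ZZ*) precision with the
# inclusive ZH cross-section precision into a Higgs-width precision,
# using Gamma_H ~ sigma_ZH^2 / (sigma_ZH * BR) with uncorrelated errors.
# The 0.5% inclusive-ZH figure is an assumed placeholder.

def width_precision(rel_err_sigma_zh, rel_err_sigma_br):
    """Relative uncertainty on Gamma_H from the two input precisions."""
    # sigma_ZH enters squared in the numerator, hence the factor of 2
    return math.sqrt((2.0 * rel_err_sigma_zh) ** 2 + rel_err_sigma_br ** 2)

print(round(width_precision(0.005, 0.069), 4))
```

Because the 6.9% term dominates the quadrature sum, the width precision from this channel is essentially set by the $$\sigma_{\mathrm{ZH}}\cdot$$BR measurement itself.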


2021 ◽  
Vol 2021 (10) ◽  
Author(s):  
Tyler D. Blanton ◽  
Andrew D. Hanlon ◽  
Ben Hörz ◽  
Colin Morningstar ◽  
Fernando Romero-López ◽  
...  

Abstract: We study two- and three-meson systems composed either of pions or kaons at maximal isospin using Monte Carlo simulations of lattice QCD. Utilizing the stochastic LapH method, we are able to determine hundreds of two- and three-particle energy levels, in nine different momentum frames, with high precision. We fit these levels using the relativistic finite-volume formalism based on a generic effective field theory in order to determine the parameters of the two- and three-particle K-matrices. We find that the statistical precision of our spectra is sufficient to probe not only the dominant s-wave interactions, but also those in d waves. In particular, we determine for the first time a term in the three-particle K-matrix that contains two-particle d waves. We use three Nf = 2 + 1 CLS ensembles with pion masses of 200, 280, and 340 MeV. This allows us to study the chiral dependence of the scattering observables, and compare to the expectations of chiral perturbation theory.


2021 ◽  
Vol 10 (6) ◽  
Author(s):  
Anja Butter ◽  
Sascha Diefenbacher ◽  
Gregor Kasieczka ◽  
Benjamin Nachman ◽  
Tilman Plehn

A critical question concerning generative networks applied to event generation in particle physics is whether the generated events add statistical precision beyond the training sample. We show, for a simple example with increasing dimensionality, how generative networks indeed amplify the training statistics. We quantify their impact through an amplification factor or equivalent numbers of sampled events.
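The "equivalent number of sampled events" can be illustrated with the simplest possible estimator, the sample mean, whose statistical error shrinks as $$1/\sqrt{n}$$; the paper works with full densities, so this toy is only an analogy, not the authors' procedure.

```python
import random
import statistics

# Hedged toy analogy for "equivalent sample size": compare the RMS error of
# a mean estimated from a small "training" sample with that from a larger
# sample, the error level a well-trained generator could approach.
# Purely illustrative; the paper quantifies amplification on full densities.

random.seed(0)

def estimator_error(n, trials=2000):
    """RMS error of the sample mean of n standard-normal draws."""
    errs = [statistics.fmean(random.gauss(0.0, 1.0) for _ in range(n))
            for _ in range(trials)]
    return statistics.pstdev(errs)

err_small = estimator_error(100)    # error at the "training sample" size
err_large = estimator_error(400)    # error at 4x the sample size
# For the sample mean, error scales as 1/sqrt(n): quadrupling n halves it.
print(err_large < err_small)
```

In this language, a generator whose samples reach `err_large` starting from 100 training events would have an amplification factor of 4.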


2021 ◽  
Vol 8 ◽  
Author(s):  
Cristina Paissoni ◽  
Carlo Camilloni

The reliability and usefulness of molecular dynamics simulations of equilibrium processes rest on their statistical precision and on their capability to generate conformational ensembles in agreement with the available experimental knowledge. Metadynamics Metainference (M&M), coupling molecular dynamics with the enhanced sampling ability of Metadynamics and with the ability of Metainference to integrate experimental information, can in principle achieve both goals. Here we show that three different Metadynamics setups provide converged estimates of the populations of the three states populated by a model peptide. Errors are estimated correctly by block averaging, but higher precision is obtained by performing independent replicates. One effect of Metadynamics is that of dramatically decreasing the number of effective frames resulting from the simulations; this is relevant for M&M, where the number of replicas should be large enough to capture the conformational heterogeneity behind the experimental data. Our simulations also allow us to propose that monitoring the relative error associated with conformational averaging can help determine the minimum number of replicas to be simulated in the context of M&M simulations. Altogether, our data provide useful indications on how to generate sound conformational ensembles in agreement with experimental data.
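Block averaging, the error estimate discussed above, can be sketched in a few lines: split a correlated time series into blocks longer than the correlation time and take the standard error of the block means. The AR(1)-style toy series below stands in for simulation output; it is not data from this study.

```python
import math
import random
import statistics

# Hedged sketch of block averaging on a synthetic correlated series.

random.seed(1)

def block_error(series, n_blocks):
    """Standard error of the mean estimated from block averages."""
    block_len = len(series) // n_blocks
    means = [statistics.fmean(series[i * block_len:(i + 1) * block_len])
             for i in range(n_blocks)]
    return statistics.stdev(means) / math.sqrt(n_blocks)

# Correlated toy series: each frame remembers 90% of the previous one,
# as successive molecular dynamics frames do
x, series = 0.0, []
for _ in range(10_000):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    series.append(x)

# For correlated data the block estimate exceeds the naive i.i.d. standard
# error, which underestimates the true uncertainty of the mean.
naive = statistics.stdev(series) / math.sqrt(len(series))
print(block_error(series, 20) > naive)
```

The gap between the block estimate and the naive standard error is exactly the "number of effective frames" issue the abstract raises: correlation shrinks the effective sample well below the frame count.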


PLoS ONE ◽  
2021 ◽  
Vol 16 (5) ◽  
pp. e0251558
Author(s):  
Mihai Stelian Rusu

Recent scholarship in critical toponymy studies has refashioned the understanding of street names from innocent labels into nominal loci of historical memory and vectors of collective identity that are embroiled in power relations. Urban nomenclatures consist of more than mere linguistic signposts deployed onto space to facilitate navigation: street names are also powerful markers of the political regime and its socio-cultural values. Drawing on these theoretical insights, this paper focuses on Sibiu (Romania) and explores the city’s shifting namescape in a longitudinal perspective spanning a century and a half of modern history (1875–2020). The analysis is based on a complete dataset of street names and street name changes registered across five political regimes (Habsburg Empire, Kingdom of Romania, Romanian People’s Republic, Socialist Republic of Romania, and post-socialist Romania). A series of multiple logistic regression models was fitted to determine the factors that influence toponymic change. The statistical results point to several significant predictors of street renaming: (1) the street’s toponymic character (politicized or neutral name); (2) artery rank (public squares and large avenues vs. ordinary streets and alleys); and (3) topographic features (a street’s size and centrality). Such a quantitative approach, coupled with a longitudinal perspective, contributes to the scholarly literature on place-naming practices in three major ways: firstly, by advancing an innovative methodological framework and analytical model for the study of street name changes; secondly, by delineating with statistical precision the factors that shape toponymic change; and thirdly, by embedding the renaming practices observed after significant power shifts in the broader historical context of the changes to the city’s street nomenclature.
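The multiple logistic regression described above models the probability of renaming as a logistic function of the predictors. A sketch with invented coefficients (not the paper's fitted values) shows the shape of such a model:

```python
import math

# Hedged sketch of a logistic model for street renaming: probability as a
# function of binary predictors (politicized name, major artery, central
# location). Coefficients are invented for illustration only.

def renaming_probability(politicized, major_artery, central,
                         intercept=-2.0, b_pol=2.5, b_artery=1.0, b_central=0.8):
    """Logistic model: p = 1 / (1 + exp(-(intercept + sum of effects)))."""
    logit = (intercept + b_pol * politicized
             + b_artery * major_artery + b_central * central)
    return 1.0 / (1.0 + math.exp(-logit))

# A politicized name on a central main artery vs. a neutral side street
print(round(renaming_probability(1, 1, 1), 2),
      round(renaming_probability(0, 0, 0), 2))
```

With these toy coefficients a politicized central avenue is far more likely to be renamed than a neutral alley, mirroring the direction of the paper's reported predictors.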

