Sampling statistics: Recently Published Documents

Total documents: 100 (last five years: 15)
H-index: 13 (last five years: 2)

2021 · Vol. 14 (12) · pp. 7681-7691
Author(s): Karlie N. Rees, Timothy J. Garrett

Abstract. Due to the discretized nature of rain, the measurement of a continuous precipitation rate by disdrometers is subject to statistical sampling errors. Here, Monte Carlo simulations are employed to obtain the precision of rain detection and rate as a function of disdrometer collection area, and the results are compared with World Meteorological Organization guidelines for a 1 min sample interval and 95 % probability. To meet these requirements, simulations suggest that measurements of light rain with rain rates R ≤ 0.50 mm h⁻¹ require a collection area of at least 6 cm × 6 cm, and for R = 1 mm h⁻¹, the minimum collection area is 13 cm × 13 cm. For R = 0.01 mm h⁻¹, a collection area of 2 cm × 2 cm is sufficient to detect a single drop. Simulations are compared with field measurements using a new hotplate device, the Differential Emissivity Imaging Disdrometer. The field results suggest an even larger plate may be required to meet the stated accuracy, likely in part due to non-Poissonian hydrometeor clustering.
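The core of the sampling-area argument can be reproduced with a few lines of simulation. The sketch below assumes Poisson drop arrivals and, purely for illustration, a single drop diameter instead of a full drop-size distribution, so its numbers will not match the paper's results; the function names and parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_drop_count(rain_rate_mm_h, area_cm2, dt_s=60.0, drop_diam_mm=1.0):
    """Expected number of drops crossing a horizontal collection area during dt_s,
    assuming (for illustration only) that every drop has the same diameter."""
    drop_volume_mm3 = np.pi / 6.0 * drop_diam_mm**3      # volume of one spherical drop
    area_mm2 = area_cm2 * 100.0                          # 1 cm^2 = 100 mm^2
    depth_mm = rain_rate_mm_h * dt_s / 3600.0            # rain depth accumulated in dt_s
    return depth_mm * area_mm2 / drop_volume_mm3         # total water volume / volume per drop

def detection_probability(rain_rate_mm_h, area_cm2, n_trials=100_000):
    """Probability of catching at least one drop in a 1 min sample (Poisson arrivals)."""
    lam = expected_drop_count(rain_rate_mm_h, area_cm2)
    return np.mean(rng.poisson(lam, size=n_trials) > 0)

def relative_rate_error(rain_rate_mm_h, area_cm2, n_trials=100_000):
    """Relative spread of the retrieved 1 min rain rate from counting statistics alone."""
    lam = expected_drop_count(rain_rate_mm_h, area_cm2)
    counts = rng.poisson(lam, size=n_trials)
    return np.std(counts / lam)                          # std of (retrieved rate / true rate)

for side_cm in (2, 6, 13):
    area = side_cm ** 2
    print(f"{side_cm:>2} cm plate:  P(detect | R=0.01) = {detection_probability(0.01, area):.2f}, "
          f"relative error at R=1 mm/h = {relative_rate_error(1.0, area):.2f}")
```

Repeating such trials while varying the plate size is, in essence, how a minimum collection area for a given precision target can be read off.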


2021 · Vol. 169 (1-2)
Author(s): Theodore G. Shepherd

Abstract. The treatment of uncertainty in climate-change science is dominated by the far-reaching influence of the ‘frequentist’ tradition in statistics, which interprets uncertainty in terms of sampling statistics and emphasizes p-values and statistical significance. This is the normative standard in the journals where most climate-change science is published. Yet a sampling distribution is not always meaningful (there is only one planet Earth). Moreover, scientific statements about climate change are hypotheses, and the frequentist tradition has no way of expressing the uncertainty of a hypothesis. As a result, in climate-change science there is generally a disconnect between physical reasoning and statistical practice. This paper explores how the frequentist statistical methods used in climate-change science can be embedded within the more general framework of probability theory, which is based on very simple logical principles. In this way, the physical reasoning represented in scientific hypotheses, which underpins climate-change science, can be brought into statistical practice in a transparent and logically rigorous way. The principles are illustrated through three examples of controversial scientific topics: the alleged global warming hiatus, Arctic-midlatitude linkages, and extreme event attribution. These examples show how the principles can be applied in order to develop better scientific practice.
“La théorie des probabilités n’est que le bon sens réduit au calcul.” (“Probability theory is nothing but common sense reduced to calculation.”) (Pierre-Simon Laplace, Essai philosophique sur les probabilités, 1819)
“It is sometimes considered a paradox that the answer depends not only on the observations but on the question; it should be a platitude.” (Harold Jeffreys, Theory of Probability, 1st edition, 1939)
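The distinction the paper draws between the probability of data under a null hypothesis and the probability of a hypothesis itself can be seen in a toy calculation. The numbers, the Gaussian model, and the flat prior below are illustrative assumptions, not anything taken from the paper.

```python
import numpy as np
from scipy import stats

# A made-up trend estimate and its sampling standard error (e.g. K per decade).
trend_hat, se = 0.10, 0.06

# Frequentist: two-sided p-value for the null hypothesis "true trend = 0".
p_value = 2 * stats.norm.sf(abs(trend_hat / se))

# Probabilistic statement about the hypothesis "true trend > 0",
# under a Gaussian likelihood and a flat prior on the trend.
prob_trend_positive = stats.norm.sf(0.0, loc=trend_hat, scale=se)

print(f"p-value under H0:           {p_value:.3f}")             # about the data
print(f"P(trend > 0 | data, prior): {prob_trend_positive:.3f}")  # about the hypothesis
```

The first number speaks about the data under a fixed hypothesis; the second speaks about the hypothesis itself, which is the kind of statement the paper argues climate-change science usually needs.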


BMC Biology · 2021 · Vol. 19 (1)
Author(s): Guillaume Le Goc, Julie Lafaye, Sophia Karpenko, Volker Bormuth, Raphaël Candelier, ...

Abstract
Background: Variability is a hallmark of animal behavior. It contributes to survival by endowing individuals and populations with the capacity to adapt to ever-changing environmental conditions. Intra-individual variability is thought to reflect both endogenous and exogenous modulations of the neural dynamics of the central nervous system. However, how variability is internally regulated and modulated by external cues remains elusive. Here, we address this question by analyzing the statistics of spontaneous exploration of freely swimming zebrafish larvae and by probing how these locomotor patterns are impacted by changes in water temperature within an ethologically relevant range.
Results: We show that, for this simple animal model, five short-term kinematic parameters (interbout interval, turn amplitude, travelled distance, turn probability, and orientational flipping rate) together control the long-term exploratory dynamics. We establish that the bath temperature consistently impacts the means of these parameters but leaves their pairwise covariance unchanged. These results indicate that the temperature merely controls the sampling statistics within a well-defined kinematic space delineated by this robust statistical structure. At a given temperature, individual animals explore the behavioral space over a timescale of tens of minutes, suggestive of a slow internal state modulation that could be externally biased through the bath temperature. By combining these various observations into a minimal stochastic model of navigation, we show that this thermal modulation of locomotor kinematics results in a thermophobic behavior, complementing direct gradient-sensing mechanisms.
Conclusions: This study establishes the existence of a well-defined locomotor space accessible to zebrafish larvae during spontaneous exploration, and quantifies self-generated modulation of locomotor patterns. Intra-individual variability reflects a slow diffusive-like probing of this space by the animal. The bath temperature in turn restricts the sampling statistics to sub-regions, endowing the animal with basic thermophobicity. This study suggests that in zebrafish, as well as in other ectothermic animals, ambient temperature could be used to efficiently manipulate internal states in a simple and ethological way.
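A minimal version of the kind of bout-based stochastic navigation model described above can be sketched as follows; the parameter names, values, and distributional choices are illustrative assumptions, not the fitted model from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_trajectory(n_bouts, dt_bout, d_bout, p_turn, turn_amp_deg, p_flip):
    """Bout-based random walk: at each bout the fish advances d_bout along its
    heading; with probability p_turn it turns by a random amplitude whose sign
    follows a persistent left/right state that flips with probability p_flip."""
    heading, side = 0.0, 1                       # heading (rad) and turning side (+1 / -1)
    xy = np.zeros((n_bouts + 1, 2))
    for i in range(n_bouts):
        if rng.random() < p_flip:
            side = -side
        if rng.random() < p_turn:
            heading += side * abs(rng.normal(0.0, np.deg2rad(turn_amp_deg)))
        xy[i + 1] = xy[i] + d_bout * np.array([np.cos(heading), np.sin(heading)])
    t = np.arange(n_bouts + 1) * dt_bout
    return t, xy

# "Cold" vs "warm" here only means two illustrative parameter sets.
_, xy_cold = simulate_trajectory(500, dt_bout=1.5, d_bout=2.0, p_turn=0.4, turn_amp_deg=30, p_flip=0.1)
_, xy_warm = simulate_trajectory(500, dt_bout=0.8, d_bout=3.0, p_turn=0.5, turn_amp_deg=40, p_flip=0.2)
print("cold net displacement:", np.linalg.norm(xy_cold[-1]))
print("warm net displacement:", np.linalg.norm(xy_warm[-1]))
```

Making these five parameters depend on bath temperature, as the measurements indicate, is what yields the temperature-dependent exploration statistics described in the abstract.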


2021 · Vol. 36 (1) · pp. 187-190
Author(s): Esther Silviya, Dr. S. Rabiyathul Basariya

Employee retention is one of the key challenges faced by IT companies in India. There is strong demand for IT professionals both within India and abroad, which has led many professionals to leave their employers in search of greener and wider pastures. In the current context of an uncertain economy, growing competition, and a shortage of skilled workers, IT companies cannot afford to lose their essential employees, as this would notably affect their bottom lines. These pressures create the need for effective retention strategies. The purpose of the present study is to examine factors such as salary, superior-subordinate relationships, growth opportunities, facilities, policies and procedures, recognition, appreciation, suggestions, and co-workers, in order to understand attrition levels within firms and the factors involved in retaining employees. The study also helps identify where companies are lagging in retention. The researcher collected sampling data from 130 respondents in the software industry.


Metals · 2021 · Vol. 11 (5) · p. 774
Author(s): Chris A. Simpson, David M. Knowles, Mahmoud Mostafavi

Accurate residual lattice strain measurements are highly dependent upon the precision of the diffraction peak location and the suitability of the underlying microstructure. The suitability of the microstructure is related to the requirement for valid powder diffraction sampling statistics and the associated number of appropriately orientated illuminated grains. In this work, these two sources of uncertainty are separated, and a method is given both for the quantification of errors associated with insufficient grain sampling statistics and for minimization of the total lattice strain measurement uncertainty. It is possible to reduce the total lattice strain measurement uncertainty by leveraging diffraction peak measurements made at multiple azimuthal angles. Lattice strain measurement data acquired during eight synchrotron X-ray diffraction experiments, monochromatic and energy dispersive, have been assessed with this approach, with microstructural suitability seen to dominate the total measurement uncertainty when the number of illuminated grains was <10⁶. More than half of the studied experimental data fell into this category, and a severe underestimation of total strain measurement uncertainty is possible when microstructural suitability is not considered. To achieve a strain measurement uncertainty under 10⁻⁴, approximately 3×10⁵ grains must be within the sampled gauge volume, with this value varying with the multiplicity of the family of lattice planes under study. Where additional azimuthally arrayed data are available, an in-plane lattice strain tensor can be extracted. This improves overall strain measurement accuracy, and an uncertainty under 10⁻⁴ can then be achieved with just 4×10⁴ grains.
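As a rough illustration of why the grain-sampling term matters, the sketch below assumes that the sampling contribution to the strain uncertainty falls off as 1/sqrt(N) with the number of illuminated grains and adds it in quadrature to a fixed peak-fitting precision; both numerical constants are placeholders chosen only to land near the order of magnitude quoted above, not values from the paper.

```python
import numpy as np

def grain_sampling_uncertainty(n_grains, grain_scatter=5e-2):
    """Sampling contribution to the lattice-strain uncertainty, assuming independent
    grain-to-grain scatter so that it falls off as 1/sqrt(N). `grain_scatter` is a
    placeholder for the effective single-grain spread, not a value from the paper."""
    return grain_scatter / np.sqrt(n_grains)

def total_uncertainty(n_grains, fit_precision=5e-5, grain_scatter=5e-2):
    """Combine peak-fitting precision and grain-sampling scatter in quadrature
    (an assumed independence approximation)."""
    return np.hypot(fit_precision, grain_sampling_uncertainty(n_grains, grain_scatter))

for n_grains in (1e4, 4e4, 3e5, 1e6):
    print(f"N = {n_grains:8.0f} grains  ->  strain uncertainty ~ {total_uncertainty(n_grains):.1e}")
```

The azimuthal-array case corresponds, loosely, to averaging several such measurements when fitting the in-plane strain tensor, which is why the same target can be met with fewer grains.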


2021
Author(s): Guillaume Le Goc, Sophia Karpenko, Volker Bormuth, Raphael Candelier, Georges Debregeas

Variability is a hallmark of animal behavior. It endows individuals and populations with the capacity to adapt to ever-changing conditions. How variability is internally regulated and modulated by external cues remains elusive. Here we address this question by focusing on the exploratory behavior of zebrafish larvae as they freely swim at different, yet ethologically relevant, water temperatures. We show that, for this simple animal model, five short-term kinematic parameters together control the long-term exploratory dynamics. We establish that the bath temperature consistently impacts the means and variances of these parameters but leaves their pairwise covariance unchanged. These results indicate that the temperature merely controls the sampling statistics within a well-defined accessible locomotor repertoire. At a given temperature, the exploration of the behavioral space is found to take place over tens of minutes, suggestive of a slow internal state modulation that could be externally biased through the bath temperature. By combining these various observations into a minimal stochastic model of navigation, we show that this thermal modulation of locomotor kinematics results in a thermophobic behavior, complementing direct gradient-sensing mechanisms.


Forests · 2020 · Vol. 11 (12) · p. 1364
Author(s): Andrew J. Lister, Hans Andersen, Tracey Frescino, Demetrios Gatziolis, Sean Healey, ...

Globally, forests are a crucial natural resource, and their sound management is critical for human and ecosystem health and well-being. Efforts to manage forests depend upon reliable data on the status of and trends in forest resources. When these data come from well-designed natural resource monitoring (NRM) systems, decision makers can make science-informed decisions. National forest inventories (NFIs) are a cornerstone of NRM systems, but require capacity and skills to implement. Efficiencies can be gained by incorporating auxiliary information derived from remote sensing (RS) into ground-based forest inventories. However, it can be difficult for countries embarking on NFI development to choose among the various RS integration options, and to develop a harmonized vision of how NFI and RS data can work together to meet monitoring needs. The NFI of the United States, which has been conducted by the USDA Forest Service’s (USFS) Forest Inventory and Analysis (FIA) program for nearly a century, uses RS technology extensively. Here we review the history of the use of RS in FIA, beginning with general background on NFI, FIA, and sampling statistics, followed by a description of the evolution of RS technology usage, beginning with paper aerial photography and ending with present-day applications and future directions. The goal of this review is to offer FIA’s experience with NFI-RS integration as a case study for other countries wishing to improve the efficiency of their NFI programs.


Complexity · 2020 · Vol. 2020 · pp. 1-11
Author(s): Qiudi Zhao, Yaohuan Huang, Yesen Liu

The spatial and temporal distribution of the higher-education population (HEP) is a fundamental characteristic of the development level of higher education in a region or a country. Based on annual population sampling statistics from 2000 to 2015, the spatiotemporal evolution pattern of the HEP in China is systematically analyzed. Nine municipal-level driving factors covering natural and socioeconomic conditions are constructed for 2000 and 2010: average slope, average elevation, city location, city size, high-speed railways, highways, gross domestic product (GDP) density, nonagricultural population, and population density. The factors driving the distribution of the HEP are then quantitatively analyzed using the geodetector model. The results show that the centroid of the HEP shifted from the northeast to the southwest from 2000 to 2010, a trajectory markedly different from that of the total population from 2000 to 2015 in China. Despite their different moving directions, the distance between the two centroids is decreasing, indicating both significant regional differences in the HEP in China and a narrowing gap between the HEP and the total population in recent years. The results of the factor detector for 2000 and 2010 suggest that the proportion of the nonagricultural population and the city location are the main driving factors of the distribution of the HEP, with driving forces between 0.494 and 0.627, followed by city size, highways, and GDP density, with driving forces between 0.199 and 0.302. This indicates that urbanization levels and urban locations are the main factors affecting the spatial distribution of the HEP. The results of the interaction detection reveal that the interaction of the nonagricultural population and the GDP density can explain 92.7% of the spatial variation of the HEP in 2000, while that of the nonagricultural population and the population density can explain 97.6% of the spatial variation of the HEP in 2010, which reflects a more balanced development of the HEP. In addition, a large proportion of the HEP has shifted from economically developed areas to densely populated areas.
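The factor detector at the heart of the geodetector model reduces to a stratified variance ratio, q = 1 − SSW/SST: the share of the total variance of the target variable explained by partitioning the study units along one factor. The sketch below shows that calculation on made-up municipal data; the variable names and numbers are illustrative only.

```python
import numpy as np

def factor_detector_q(values, strata):
    """Geodetector factor-detector statistic q = 1 - SSW/SST, where SSW is the
    within-stratum sum of squares and SST the total sum of squares of `values`."""
    values = np.asarray(values, dtype=float)
    strata = np.asarray(strata)
    sst = values.size * values.var()
    ssw = sum(values[strata == s].size * values[strata == s].var()
              for s in np.unique(strata))
    return 1.0 - ssw / sst

# Made-up example: higher-education population density stratified by city-size class.
hep_density = [12, 15, 14, 10, 40, 45, 38, 42, 90, 85, 95, 88]
city_size = ["S", "S", "S", "S", "M", "M", "M", "M", "L", "L", "L", "L"]
print("q(city size) =", round(factor_detector_q(hep_density, city_size), 3))  # close to 1
```

The interaction detector reported in the abstract applies the same q-statistic to the cross-classification of two factors and compares it with the single-factor values.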


2020 · Vol. 148 (6) · pp. 2623-2643
Author(s): Toshiyuki Ishibashi

Abstract. In data assimilation for NWP, accurate estimation of error covariance matrices (ECMs) and their use are essential to improve NWP accuracy. The objective of this study is to estimate the ECMs of all observations and background variables using sampling statistics, and to improve global NWP accuracy by using them. This study presents the first results of such all-ECM refinement. ECM diagnostics combining multiple methods, and analysis and forecast cycle experiments, were performed on the JMA global NWP system, in which the diagonal components of all ECMs and the off-diagonal components of radiance observation ECMs were refined. The ECM diagnostic results are as follows: 1) the diagnosed error standard deviations (SDs) are generally much smaller than those of the JMA operational system (CNTL); 2) interchannel correlations in humidity-sensitive radiance errors are much larger than 0.2; and 3) horizontal correlation distances of AMSU-A are ~50 km, excluding channel 4. The experimental results include the following: 1) the diagnosed ECMs generally improve forecast accuracy over CNTL even without additional tuning; 2) a supplemental tuning parameter, a deflation factor (0.6 in SD) applied to the estimated ECMs of nonsatellite conventional data and GPS radio occultation data, yields a statistically significant further improvement in forecast accuracy; 3) this value of 0.6 is set equal to the ratio of the estimated background error SD to that in CNTL; 4) high-density (10 times) assimilation of AMSU-A is better than CNTL but not better than 5-times density; and 5) ECMs estimated using boreal summer data can improve forecast accuracy in winter, which indicates their robustness.
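One standard way to obtain such sampled error covariances is a Desroziers-style diagnostic, which estimates the observation-error covariance from paired background and analysis departures; the sketch below applies it to synthetic data for three notional radiance channels. This is only one of several possible diagnostics and is not claimed to be the exact combination of methods used in the study; all inputs are synthetic.

```python
import numpy as np

def desroziers_R(d_ob, d_oa):
    """Sample estimate of the observation-error covariance, R ~= E[d_oa d_ob^T],
    where d_ob = y - H(x_b) are background departures and d_oa = y - H(x_a) are
    analysis departures, both arrays of shape (n_samples, n_channels)."""
    d_ob = d_ob - d_ob.mean(axis=0)
    d_oa = d_oa - d_oa.mean(axis=0)
    r = d_oa.T @ d_ob / d_ob.shape[0]
    return 0.5 * (r + r.T)                       # symmetrize the noisy sample estimate

def correlation(cov):
    sd = np.sqrt(np.diag(cov))
    return cov / np.outer(sd, sd)

# Synthetic departures for three notional radiance channels (no physical meaning).
rng = np.random.default_rng(0)
true_cov = np.array([[1.0, 0.4, 0.1],
                     [0.4, 1.0, 0.4],
                     [0.1, 0.4, 1.0]])
d_ob = rng.multivariate_normal(np.zeros(3), true_cov, size=20_000)
d_oa = 0.5 * d_ob + 0.1 * rng.standard_normal(d_ob.shape)   # stand-in analysis departures

R_hat = desroziers_R(d_ob, d_oa)
print("estimated error SDs:", np.sqrt(np.diag(R_hat)).round(2))
print("estimated interchannel correlations:\n", correlation(R_hat).round(2))
```

In a real system the departures come from the assimilation cycle itself, and the resulting SDs and interchannel correlations are then compared with the operational settings, as in points 1)-3) of the abstract.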

