ON THE PRECISION OF THE NUCLEATOR

2017 ◽  
Vol 36 (2) ◽  
pp. 123 ◽  
Author(s):  
Javier González-Villa ◽  
Marcos Cruz ◽  
Luis M. Cruz-Orive

The nucleator is a design-unbiased method of local stereology for estimating the volume of a bounded object. The only information required lies in the intersection of the object with an isotropic random ray emanating from a fixed point (called the pivotal point) associated with the object. For instance, the volume of a neuron can be estimated from a random ray emanating from its nucleolus. The nucleator is extensively used in the biosciences because it is efficient and easy to apply. The estimator variance can be reduced by increasing the number of rays. In an earlier paper a systematic sampling design was proposed, and theoretical variance predictors were derived, for the corresponding volume estimator. Since these are the only variance predictors hitherto available for the nucleator, our basic goal was to check their statistical performance by means of Monte Carlo resampling on computer reconstructions of real objects. As a plus, the empirical distribution of the volume estimator revealed statistical properties of practical relevance.
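For an object that is star-shaped with respect to the pivotal point, the one-ray nucleator estimate is v̂ = (4π/3)r³, with r the distance from the pivotal point to the boundary along an isotropic ray; averaging over rays recovers the volume. A minimal Monte Carlo sketch of this unbiasedness (the test ellipsoid, its semi-axes and the sample size are illustrative assumptions, not from the paper):

```python
import numpy as np

# Semi-axes of a test ellipsoid centred at the pivotal point (illustrative).
A, B, C = 1.0, 1.5, 2.0
TRUE_VOLUME = 4.0 / 3.0 * np.pi * A * B * C

rng = np.random.default_rng(0)

def isotropic_directions(n):
    """Uniform random directions on the unit sphere."""
    z = rng.uniform(-1.0, 1.0, n)            # cos(colatitude) is uniform
    phi = rng.uniform(0.0, 2.0 * np.pi, n)
    s = np.sqrt(1.0 - z * z)
    return np.column_stack((s * np.cos(phi), s * np.sin(phi), z))

def nucleator_estimates(n_rays):
    """One-ray nucleator estimates v = (4*pi/3) * r**3, where r is the
    distance from the pivotal point to the boundary along the ray."""
    u = isotropic_directions(n_rays)
    # For an ellipsoid centred at the origin, the boundary distance along
    # direction u solves (r*ux/A)**2 + (r*uy/B)**2 + (r*uz/C)**2 = 1.
    r = 1.0 / np.sqrt((u[:, 0] / A) ** 2 + (u[:, 1] / B) ** 2 + (u[:, 2] / C) ** 2)
    return 4.0 / 3.0 * np.pi * r ** 3

est = nucleator_estimates(200_000)
print(est.mean(), TRUE_VOLUME)   # the empirical mean approaches the true volume
```

Increasing the number of rays per object shrinks the spread of `est` around `TRUE_VOLUME`, which is exactly the variance-reduction question the predictors address.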

2002 ◽  
Vol 34 (3) ◽  
pp. 469-483
Author(s):  
Ximo Gual-Arnau ◽  
Luis M. Cruz-Orive

Geometric sampling, and local stereology in particular, often require observations at isotropic random directions on the sphere, and some sort of systematic design on the sphere becomes necessary on grounds of efficiency and practical applicability. Typically, the relevant probes are of nucleator type, in which several rays may be contained in a sectioning plane through a fixed point (e.g. through a nucleolus within a biological cell). The latter requirement considerably reduces the choice of design in practice; in this paper, we concentrate on a nucleator design based on splitting the sphere into regions of equal area, but not of identical shape; this design is pseudosystematic rather than systematic in a strict sense. Firstly, we obtain useful exact representations of the variance of an estimator under pseudosystematic sampling on the sphere. Then we adopt a suitable covariogram model to obtain a variance predictor from a single sample of arbitrary size, and finally we examine the prediction accuracy by way of simulation on a synthetic particle model.
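The equal-area splitting can be sketched as follows: partition z = cos(colatitude) into equal slabs (equal areas, by Archimedes' hat-box theorem) and longitude into equal arcs, then draw one direction uniformly within each cell. This is an illustration with an arbitrary test function, not the authors' exact construction:

```python
import numpy as np

rng = np.random.default_rng(1)

def pseudosystematic_directions(n_bands, n_arcs):
    """One direction drawn uniformly from each cell of an equal-area
    partition of the sphere: n_bands slabs in z (zones of equal area)
    crossed with n_arcs equal arcs in longitude. Cells have equal area
    but not identical shape, hence 'pseudosystematic'."""
    dirs = []
    for i in range(n_bands):
        for j in range(n_arcs):
            z = rng.uniform(-1 + 2 * i / n_bands, -1 + 2 * (i + 1) / n_bands)
            phi = rng.uniform(2 * np.pi * j / n_arcs, 2 * np.pi * (j + 1) / n_arcs)
            s = np.sqrt(1 - z * z)
            dirs.append((s * np.cos(phi), s * np.sin(phi), z))
    return np.array(dirs)

# Estimate the spherical average of f(u) = u_z**2 (true value 1/3).
u = pseudosystematic_directions(16, 8)       # 128 equal-area cells
estimate = (u[:, 2] ** 2).mean()
print(estimate)
```

Because each cell contributes exactly one observation, slowly varying functions are sampled far more evenly than with independent isotropic directions, which is the variance advantage the covariogram model is then used to predict.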


2008 ◽  
Vol 40 (2) ◽  
pp. 454-472 ◽  
Author(s):  
Ivan Gentil ◽  
Bruno Rémillard

While the convergence properties of many sampling selection methods can be proven, there is one particular sampling selection method introduced in Baker (1987), closely related to ‘systematic sampling’ in statistics, that has been exclusively treated on an empirical basis. The main motivation of the paper is to start to study formally its convergence properties, since in practice it is by far the fastest selection method available. We will show that convergence results for the systematic sampling selection method are related to properties of peculiar Markov chains.
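The selection method of Baker (1987), usually called systematic resampling in the particle-filter literature, can be sketched as follows (variable names are mine, not the paper's):

```python
import numpy as np

def systematic_resampling(weights, rng):
    """Systematic resampling: a single uniform draw places n equally
    spaced pointers on the cumulative-weight scale, so every index i is
    copied either floor(n*w_i) or ceil(n*w_i) times. One random number
    per generation is what makes the method so fast."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    n = len(w)
    positions = (rng.uniform() + np.arange(n)) / n   # one random offset
    return np.searchsorted(np.cumsum(w), positions)

rng = np.random.default_rng(2)
w = [0.5, 0.25, 0.125, 0.125]
idx = systematic_resampling(w, rng)
print(idx)
```

Here index 0 (weight 0.5) is copied exactly 2 = 4 × 0.5 times for every value of the offset; the dependence between the copies is what places the method outside the standard i.i.d. convergence proofs and leads to the Markov-chain analysis.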


2013 ◽  
Vol 13 (3) ◽  
pp. 737-754 ◽  
Author(s):  
Y. Paudel ◽  
W. J. W. Botzen ◽  
J. C. J. H. Aerts

Abstract. This study applies Bayesian Inference to estimate flood risk for 53 dyke ring areas in the Netherlands, and focuses particularly on the data scarcity and extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on these results, flood insurance premiums are estimated using two different practical methods that each account in different ways for an insurer's risk aversion and the dispersion rate of loss data. This study is of practical relevance because insurers have been considering the introduction of flood insurance in the Netherlands, which is currently not generally available.
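Two standard actuarial premium principles that account for an insurer's risk aversion and the dispersion of losses can be sketched as follows; the lognormal loss model and the loading parameters are illustrative stand-ins, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated annual flood losses (a heavy-tailed lognormal stands in for
# the Monte Carlo damage distribution; not the paper's model).
losses = rng.lognormal(mean=10.0, sigma=1.5, size=100_000)

expected_loss = losses.mean()

# Expected-value principle: premium = (1 + theta) * E[L],
# with theta a loading factor expressing risk aversion.
theta = 0.3
premium_ev = (1 + theta) * expected_loss

# Standard-deviation principle: premium = E[L] + alpha * SD[L],
# which loads explicitly for the dispersion of the loss data.
alpha = 0.5
premium_sd = expected_loss + alpha * losses.std()

print(expected_loss, premium_ev, premium_sd)
```

For heavy-tailed catastrophe losses the standard-deviation principle reacts strongly to the tail, which is why the choice of principle matters for the premiums reported.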


2013 ◽  
Vol 10 (3) ◽  
pp. 348-356
Author(s):  
Yolanda Jordaan

Within the current privacy-sensitive environment, an understanding of consumers’ information privacy concerns is critical. The objective of the study is to establish whether there is a difference between victims and non-victims of information privacy invasion, and whether this has an influence on their privacy concerns and protective behaviour. A probability (systematic) sampling design was used to draw a representative sample of 800 households, after which 800 telephone interviews were conducted with adults from these households. The findings show that victims had increased concern about information misuse by, and solicitation practices of, organisations, and they exhibited more protective behaviour than non-victims. This suggests that organisations should recognise that consumers believe they have ownership of their personal information. Furthermore, organisations should share consumers’ information in a way that is respectful, relevant and beneficial.
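A systematic sampling design of the kind described draws every k-th unit from the sampling frame after a random start; a minimal sketch (the frame size of 24,000 households is an assumed figure, not from the study):

```python
import random

def systematic_sample(frame_size, sample_size, rng):
    """Probability (systematic) sampling: pick a random start within the
    first interval, then take every k-th element of the sampling frame."""
    k = frame_size // sample_size              # sampling interval
    start = rng.randrange(k)                   # random start in [0, k)
    return [start + i * k for i in range(sample_size)]

rng = random.Random(4)
# e.g. drawing 800 households from an assumed frame of 24,000
sample = systematic_sample(24_000, 800, rng)
print(len(sample), sample[:3])
```

Because every unit has the same inclusion probability sample_size/frame_size, the design supports the representativeness claim made for the 800 interviews.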


1994 ◽  
Vol 26 (1) ◽  
pp. 1-12 ◽  
Author(s):  
E. B. Vedel Jensen ◽  
K. Kiêu

Unbiased stereological estimators of d-dimensional volume in ℝ^n are derived, based on information from an isotropic random r-slice through a specified point. The content of the slice can be subsampled by means of a spatial grid. The estimators depend only on spatial distances. As a fundamental lemma, an explicit formula is given for the probability that an isotropic random r-slice in ℝ^n through O hits a fixed point in ℝ^n.
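For the case n = 3, r = 2 (an isotropic plane L through O), the volume estimator takes the distance-weighted form V̂ = 2 ∫_{Y∩L} |x| dx, and the slice content can be subsampled on a spatial grid as the abstract describes. A sketch under illustrative assumptions (test ellipsoid, grid resolution, number of planes are mine):

```python
import numpy as np

A, B, C = 1.0, 1.5, 2.0                       # illustrative ellipsoid semi-axes
TRUE_VOLUME = 4.0 / 3.0 * np.pi * A * B * C
rng = np.random.default_rng(5)

def plane_basis(rng):
    """Orthonormal basis (e1, e2) of an isotropic plane through the origin."""
    n = rng.normal(size=3)
    n /= np.linalg.norm(n)                    # isotropic normal direction
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    e1 = np.cross(n, a); e1 /= np.linalg.norm(e1)
    e2 = np.cross(n, e1)
    return e1, e2

def slice_estimate(h=0.02, half=2.2):
    """V-hat = 2 * integral of |x| over the plane section (n = 3, r = 2),
    with the integral approximated by midpoint grid subsampling."""
    e1, e2 = plane_basis(rng)
    g = np.arange(-half, half, h) + h / 2.0
    s, t = np.meshgrid(g, g)
    x = s[..., None] * e1 + t[..., None] * e2          # grid points in the plane
    inside = (x[..., 0] / A) ** 2 + (x[..., 1] / B) ** 2 + (x[..., 2] / C) ** 2 <= 1.0
    return 2.0 * np.sum(np.linalg.norm(x, axis=-1) * inside) * h * h

est = np.mean([slice_estimate() for _ in range(400)])
print(est, TRUE_VOLUME)
```

The |x| weight is the Jacobian that makes the plane estimator unbiased (for a ball of radius R each section contributes 2·2πR³/3 = 4πR³/3 exactly), and the grid spacing h controls the subsampling error within the slice.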


2004 ◽  
Vol 38 ◽  
pp. 351-356 ◽  
Author(s):  
A. N. Bozhinskiy

Abstract. The statistical modelling of gravitational avalanche-type processes is carried out using the Monte Carlo method. The process of snow avalanche origin is described with the model of stress state and stability of snow cover on a slope. The statistical simulation of the stress state of a snow slab is performed for avalanche site No. 22 (Khibiny, Russia). The strength characteristics of the snow slab are considered as random variables. The influence of the first moments of the distributions of the slab-strength parameters on the probability of avalanche release is studied. Using a hydraulic model of a dense flow avalanche, the statistical modelling of avalanche dynamics for the avalanche site “Domestic” (Elbrus region, Russia) is carried out. The coefficients of dry and turbulent friction and snow entrainment are considered as random parameters of the model. The histograms and distribution functions of the run-out distance, thickness and volume of avalanche depositions are obtained. The model and empirical distribution functions of the avalanche run-out distance are compared. Statistical simulation of slushflow dynamics (basin of Bear brook, Khibiny, Russia) is performed. The two-layer deterministic model of slushflow is used. The random parameters of the model assumed are the water inflow on the “tail” of the flow and the coefficient of dry friction for slush. The histograms and distribution functions of the dynamic characteristics of the flow are obtained. The model outcomes are compared with field data.
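The release-probability part of such a simulation can be sketched with a toy limit-state model: the slab releases when a randomly drawn shear strength falls below the driving stress. All parameter values below are illustrative, not those of the Khibiny site:

```python
import math
import random

rng = random.Random(6)

# Toy stability model (illustrative units): release occurs when the
# random shear strength is less than the fixed driving shear stress.
shear_stress = 1.0
strength_mean, strength_sd = 1.4, 0.3      # first moments of the strength

n = 100_000
releases = sum(1 for _ in range(n)
               if rng.gauss(strength_mean, strength_sd) < shear_stress)
p_release = releases / n

# Analytic check for the normal strength model:
# P(release) = Phi((stress - mean) / sd)
phi = 0.5 * (1.0 + math.erf((shear_stress - strength_mean)
                            / (strength_sd * math.sqrt(2.0))))
print(p_release, phi)
```

Shifting the mean or widening the spread of the strength distribution moves `p_release` directly, which is the sensitivity to the first moments that the study examines.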


1985 ◽  
Vol 42 (11) ◽  
pp. 1806-1814 ◽  
Author(s):  
J. F. Schweigert ◽  
C. W. Haegele ◽  
M. Stocker

Three estimators for two-stage sampling designs assuming unequal-sized primaries (transects) were compared. The ratio estimator was found to provide the most consistent estimates of the mean and variance and so was used for estimating optimal sample design. Preliminary results from some biased sampling during 1976 and 1978 provided guidelines for the 1981 study designed to derive an optimal sampling design. Inconsistent results from the two areas surveyed during 1981 prevented general conclusions, but a corroborating resurvey of one area in 1983 suggested that a sampling intensity of five samples per 100 m of transect and transects every 250–400 m along the length of the spawn should result in estimates of the mean egg density with a standard error no greater than 25%. Systematic sampling is logistically preferable to random sampling and can be incorporated into the two-stage design described herein, which should be used in future spawn surveys designed to estimate spawning escapement.
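The ratio estimator for a two-stage design with unequal-sized primaries divides the sum of the transect totals by the sum of the transect sizes; a minimal sketch with made-up counts (not the survey data):

```python
# Ratio estimator for a two-stage design with unequal-sized primary
# units (transects): r_hat = (sum of transect egg totals) /
# (sum of transect lengths), i.e. mean egg density per metre.
transect_lengths = [120.0, 300.0, 80.0, 150.0]       # primary unit sizes (m)
transect_totals = [3600.0, 10200.0, 2000.0, 4800.0]  # eggs counted per transect

n = len(transect_lengths)
r_hat = sum(transect_totals) / sum(transect_lengths)

# Approximate variance of the ratio estimator (finite-population
# correction ignored): based on residuals y_i - r_hat * m_i.
m_bar = sum(transect_lengths) / n
resid = [y - r_hat * m for y, m in zip(transect_totals, transect_lengths)]
var_hat = sum(d * d for d in resid) / ((n - 1) * n * m_bar * m_bar)
print(r_hat, var_hat ** 0.5)
```

Weighting totals by transect length rather than averaging per-transect densities is what keeps the estimator consistent when primaries differ in size.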


2009 ◽  
Vol 9 (1) ◽  
pp. 56-66
Author(s):  
Dennis Peque

This paper presents adaptive cluster sampling (ACS) as a method of assessing forest biodiversity. In this study, ACS was used to estimate the abundance of an ecologically sparse population of Diospyros philippinensis (Desrousseaux) within the Visayas State University Forest Reserve. Its statistical efficiency was analyzed by comparison with the conventional systematic sampling (Syst) estimator. Results indicated that ACS plots captured more trees into the sample than systematic sampling plots. In addition, ACS estimates of the mean and total numbers of individuals per ha were higher than the systematic sampling estimates, and ACS gave a substantially lower variance than systematic sampling. However, the ratio of the adjusted SE of ACS to the adjusted SE of systematic sampling, for each species and for the combined data of the two species, was generally greater than 1, which means that ACS was not a better design than systematic sampling.
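A Hansen–Hurwitz-type ACS estimator replaces each initially sampled unit's value by the mean of its network, the connected cluster of units meeting the abundance condition; a sketch on a toy one-dimensional population (values, threshold and neighbourhood structure are illustrative):

```python
import random

def acs_estimate(values, neighbours, initial, threshold):
    """Hansen-Hurwitz-type ACS estimator: average, over the initial
    sample, of each unit's network mean. Units at or below the
    threshold form networks of size one."""
    def network(i):
        if values[i] <= threshold:
            return [i]
        seen, stack = {i}, [i]
        while stack:                       # grow the cluster adaptively
            j = stack.pop()
            for k in neighbours[j]:
                if k not in seen and values[k] > threshold:
                    seen.add(k)
                    stack.append(k)
        return sorted(seen)

    means = [sum(values[j] for j in network(i)) / len(network(i))
             for i in initial]
    return sum(means) / len(means)

# Toy 1-D population of 10 plots; neighbours are adjacent plots.
values = [0, 0, 5, 7, 0, 0, 0, 2, 0, 0]
neighbours = {i: [j for j in (i - 1, i + 1) if 0 <= j < 10] for i in range(10)}

rng = random.Random(7)
initial = rng.sample(range(10), 4)
print(acs_estimate(values, neighbours, initial, threshold=0))
```

Hitting any plot of the cluster {2, 3} pulls the whole cluster into the sample, which is why ACS captures more trees for sparse, clumped species, even though the effort-adjusted standard error may still favour systematic sampling.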

