Population size estimation based upon zero-truncated, one-inflated and sparse count data

Author(s):  
Dankmar Böhning ◽  
Herwig Friedl

Estimating the size of a hard-to-count population is a challenging matter, in particular when only few observations of the population to be estimated are available. The matter gets even more complex when one-inflation occurs. This situation is illustrated with the help of two examples: the size of a dice snake population in Graz (Austria) and the number of flare stars in the Pleiades. The paper discusses how one-inflation can be easily handled in likelihood approaches, and how variances and confidence intervals can be obtained by means of a semi-parametric bootstrap. A Bayesian approach is mentioned as well, and all approaches result in similar estimates of the hidden size of the population. Finally, a simulation study is provided which shows that both the unconditional likelihood approach and the Bayesian approach using Jeffreys' prior perform favorably.
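As a concrete illustration of the modeling idea, here is a minimal sketch (not the authors' code) of fitting a one-inflated, zero-truncated Poisson model by maximum likelihood and turning the fit into a Horvitz-Thompson-type size estimate; the data, the parameterization, and the plain Poisson base distribution are illustrative assumptions.

```python
# Sketch: one-inflated zero-truncated Poisson (OIZTP) fit by maximum
# likelihood, followed by a Horvitz-Thompson-type population size estimate.
# Illustrative assumptions: Poisson base distribution, toy data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

def oiztp_negloglik(params, counts):
    """Negative log-likelihood of the OIZTP; counts are observed values >= 1."""
    lam = np.exp(params[0])                    # Poisson rate > 0
    omega = 1.0 / (1.0 + np.exp(-params[1]))   # one-inflation weight in (0, 1)
    ztp = poisson.pmf(counts, lam) / (1.0 - poisson.pmf(0, lam))
    p = (1.0 - omega) * ztp + omega * (counts == 1)
    return -np.sum(np.log(p))

def estimate_population_size(counts):
    res = minimize(oiztp_negloglik, x0=[0.0, 0.0], args=(counts,))
    lam = np.exp(res.x[0])
    p0 = np.exp(-lam)          # estimated probability of never being observed
    return len(counts) / (1.0 - p0)

# toy data: an excess of singletons suggests one-inflation
counts = np.array([1] * 20 + [2] * 5 + [3] * 2)
print(round(estimate_population_size(counts)))
```

A semi-parametric bootstrap, as discussed in the abstract, would resample counts from the fitted model and re-run the estimator to obtain variances and confidence intervals.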

2009 ◽  
Vol 101 (4) ◽  
pp. 2186-2193 ◽  
Author(s):  
Sam Behseta ◽  
Tamara Berdyyeva ◽  
Carl R. Olson ◽  
Robert E. Kass

When correlation is measured in the presence of noise, its value is decreased. In single-neuron recording experiments, for example, the correlation of selectivity indices in a pair of tasks may be assessed across neurons, but, because the number of trials is limited, the measured index values for each neuron will be noisy. This attenuates the correlation. A correction for such attenuation was proposed by Spearman more than 100 years ago, and more recent work has shown how confidence intervals may be constructed to supplement the correction. In this paper, we propose an alternative Bayesian correction. A simulation study shows that this approach can be far superior to Spearman's, both in the accuracy of the correction and in the coverage of the resulting confidence intervals. We demonstrate the usefulness of this technology by applying it to a set of data obtained from the frontal cortex of a macaque monkey performing serial order and variable reward saccade tasks. There, the correction results in a substantial increase in the correlation across neurons in the two tasks.
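For context, this is the classical correction the Bayesian method improves upon; a minimal sketch of Spearman's disattenuation formula, with the per-neuron noise variances (e.g., squared standard errors from limited trials) as assumed inputs:

```python
# Sketch of Spearman's classical disattenuation: divide the observed
# correlation by the geometric mean of the two reliabilities.
import numpy as np

def spearman_disattenuation(x, y, noise_var_x, noise_var_y):
    """x, y: measured index values per neuron (equal-length arrays);
    noise_var_*: per-neuron measurement-noise variances."""
    r_obs = np.corrcoef(x, y)[0, 1]
    # reliability = true-score variance / total variance
    rel_x = 1.0 - np.mean(noise_var_x) / np.var(x, ddof=1)
    rel_y = 1.0 - np.mean(noise_var_y) / np.var(y, ddof=1)
    return r_obs / np.sqrt(rel_x * rel_y)
```

Note that with very noisy data the estimated reliabilities can approach zero or turn negative, in which case the corrected value is unstable or undefined; this instability is part of what motivates a Bayesian alternative.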


2001 ◽  
Vol 58 (8) ◽  
pp. 1663-1671 ◽  
Author(s):  
Milo D Adkison ◽  
Zhenming Su

In this simulation study, we compared the performance of a hierarchical Bayesian approach for estimating salmon escapement from count data with that of separate maximum likelihood estimation of each year's escapement. We simulated several contrasting counting schedules resulting in data sets that differed in information content. In particular, we were interested in the ability of the Bayesian approach to estimate escapement and timing in years where few or no counts are made after the peak of escapement. We found that the Bayesian hierarchical approach was much better able to estimate escapement and escapement timing in these situations. Separate estimates for such years could be wildly inaccurate. However, even a single postpeak count could dramatically improve the estimability of escapement parameters.
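To make the "separate estimation" baseline concrete, here is a minimal sketch (illustrative assumptions, not the authors' model) of a single year's escapement fitted by maximum likelihood, with daily passage modeled as a Gaussian run-timing curve scaled by total escapement:

```python
# Sketch: single-year escapement MLE with a Gaussian run-timing curve.
# Assumed model: daily counts ~ Poisson(E * N(day; mu, sigma)).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, poisson

def neg_loglik(params, days, counts):
    log_E, mu, log_sigma = params
    E, sigma = np.exp(log_E), np.exp(log_sigma)
    expected = E * norm.pdf(days, mu, sigma)  # expected passage per survey day
    return -np.sum(poisson.logpmf(counts, expected))

days = np.array([5, 10, 15, 20])              # survey days (toy data)
counts = np.array([40, 310, 420, 90])         # fish counted on each day
res = minimize(neg_loglik,
               x0=[np.log(counts.sum()), 15.0, np.log(5.0)],
               args=(days, counts), method="Nelder-Mead")
print("estimated total escapement:", np.exp(res.x[0]))
```

With no post-peak counts, E, mu, and sigma are barely identifiable from such a fit, which is exactly where the hierarchical Bayesian approach helps: sharing run-timing information across years constrains mu and sigma in data-poor years.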


2021 ◽  
Vol 5 (2) ◽  
pp. 139-154
Author(s):  
Warisa Thangjai ◽  
Sa-Aat Niwitpong ◽  
Suparat Niwitpong

Herein, we propose a Bayesian approach for constructing confidence intervals for both the coefficient of variation of a log-normal distribution and the difference between the coefficients of variation of two log-normal distributions. For the first case, the Bayesian approach was compared with the large-sample, Chi-squared, and approximate fiducial approaches via Monte Carlo simulation. For the second case, the Bayesian approach was compared with the method of variance estimates recovery (MOVER), modified MOVER, and approximate fiducial approaches, again using Monte Carlo simulation. The results show that the Bayesian approach was the best for constructing confidence intervals in both cases. To illustrate their performance on real data, the approaches were applied to PM10 datasets from the Nan and Chiang Mai provinces in Thailand; the results agree with those of the simulation study. DOI: 10.28991/esj-2021-01264
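To show the shape of such a procedure, here is a minimal simulation-based sketch (using the standard noninformative posterior for the log-scale variance, which is not necessarily the paper's exact prior) of a credible interval for the coefficient of variation of a log-normal distribution:

```python
# Sketch: credible interval for the CV of a log-normal distribution.
# Uses sigma^2 | data ~ (n-1) s^2 / chi2_{n-1} (noninformative prior),
# and the identity CV = sqrt(exp(sigma^2) - 1).
import numpy as np

rng = np.random.default_rng(1)

def lognormal_cv_interval(x, n_draws=100_000, level=0.95):
    log_x = np.log(x)
    n, s2 = len(log_x), np.var(log_x, ddof=1)
    sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=n_draws)  # posterior draws
    cv = np.sqrt(np.exp(sigma2) - 1.0)
    return np.percentile(cv, [100 * (1 - level) / 2, 100 * (1 + level) / 2])

x = rng.lognormal(mean=3.0, sigma=0.4, size=30)   # toy PM10-like data
print(lognormal_cv_interval(x))
```

For the two-sample case, one would draw posterior samples of each sigma^2 independently and take percentiles of the difference of the implied CVs.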


Methodology ◽  
2010 ◽  
Vol 6 (2) ◽  
pp. 71-82 ◽  
Author(s):  
Byron J. Gajewski ◽  
Diane K. Boyle ◽  
Sarah Thompson

We demonstrate the utility of a Bayesian approach for calculating intervals for Cronbach's alpha from a psychological instrument having ordinal responses with a dynamic scale. A small number of response options on an instrument will cause traditional interval estimates to be biased. Ordinal-based solutions are problematic because there is no clear mechanism for handling the dynamic scale. One way to remedy the bias is a Bayesian adjustment. The Bayesian approach corrects the bias and allows theoretically simple calculations of Cronbach's alpha and its intervals. We demonstrate the calculations of the Bayesian approach while offering a comparison to more traditional methods using both credible (or confidence) intervals and mean squared error. Practical advice is offered.
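As an illustration of the computational idea, here is a minimal sketch (one common Bayesian formulation under a noninformative prior on the item covariance matrix, not necessarily the authors' model for ordinal responses) of a credible interval for Cronbach's alpha:

```python
# Sketch: credible interval for Cronbach's alpha via posterior draws of the
# item covariance matrix. Assumption: Sigma | data ~ InvWishart(n-1, scatter).
import numpy as np
from scipy.stats import invwishart

def cronbach_alpha(cov):
    k = cov.shape[0]
    return (k / (k - 1)) * (1.0 - np.trace(cov) / cov.sum())

def alpha_credible_interval(X, n_draws=5000, level=0.95, seed=0):
    """X: n_subjects x k_items response matrix (n_subjects must exceed k_items)."""
    n = X.shape[0]
    scatter = (n - 1) * np.cov(X, rowvar=False)
    draws = invwishart(df=n - 1, scale=scatter).rvs(n_draws, random_state=seed)
    alphas = np.array([cronbach_alpha(S) for S in draws])
    return np.percentile(alphas, [100 * (1 - level) / 2, 100 * (1 + level) / 2])

# toy data: 100 subjects, 4 correlated items on a 5-point scale
rng = np.random.default_rng(0)
z = rng.normal(size=(100, 1)) + 0.8 * rng.normal(size=(100, 4))
X = np.clip(np.round(z + 3), 1, 5)
print(alpha_credible_interval(X))
```

Treating the rounded ordinal responses as approximately continuous is itself one of the biases the paper addresses; the sketch only shows how posterior draws translate into an interval for alpha.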


2021 ◽  
Vol 14 (2) ◽  
pp. 231-232
Author(s):  
Adnan Kastrati ◽  
Alexander Hapfelmeier

2021 ◽  
pp. 263208432199622
Author(s):  
Tim Mathes ◽  
Oliver Kuss

Background: Meta-analysis of systematically reviewed studies on interventions is the cornerstone of evidence-based medicine. We introduce the common-beta beta-binomial (BB) model for meta-analysis with binary outcomes and elucidate its equivalence to panel count data models.

Methods: We present a variation of the standard "common-rho" BB model (BBST) for meta-analysis, namely a "common-beta" BB model. This model has an interesting connection to fixed-effect negative binomial regression models (FE-NegBin) for panel count data. Using this equivalence, it is possible to estimate an extension of the FE-NegBin with an additional multiplicative overdispersion term (RE-NegBin) while preserving a closed-form likelihood. An advantage of the connection to econometric models is that the models can be implemented easily, because standard statistical software for panel count data can be used. We illustrate the methods with two real-world example datasets. Furthermore, we show the results of a small-scale simulation study that compares the new models to the BBST; the input parameters of the simulation were informed by actually performed meta-analyses.

Results: In both example datasets, the NegBin models, in particular the RE-NegBin, showed a smaller effect and had narrower 95% confidence intervals. In our simulation study, median bias was negligible for all methods, but the upper quartile of the median bias suggested that the BBST is most affected by positive bias. Regarding coverage probability, the BBST and the RE-NegBin model outperformed the FE-NegBin model.

Conclusion: For meta-analyses with binary outcomes, the considered common-beta BB models may be valuable extensions to the family of BB models.
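To sketch the flavor of the likelihood involved (under assumed details: a plausible arm-level "common-beta" parameterization, which may differ from the paper's exact model), here is a beta-binomial meta-analysis likelihood with a shared beta and study/arm-specific alpha, fitted by maximum likelihood:

```python
# Sketch: beta-binomial likelihood for arm-level binary meta-analysis data
# with a common (shared) beta. Parameterization is illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln, gammaln

def bb_logpmf(y, n, a, b):
    """Beta-binomial log-pmf: log C(n, y) + log B(y+a, n-y+b) - log B(a, b)."""
    return (gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
            + betaln(y + a, n - y + b) - betaln(a, b))

def neg_loglik(params, y, n, trt, study):
    theta = params[0]                      # treatment effect on log(alpha)
    b = np.exp(params[1])                  # common beta, shared by all studies
    mu = np.asarray(params[2:])            # per-study baseline log(alpha)
    a = np.exp(mu[study] + theta * trt)
    return -np.sum(bb_logpmf(y, n, a, b))

# toy data: two studies, each with a control and a treatment arm
y = np.array([3, 7, 5, 12]); n = np.array([50, 50, 80, 80])
trt = np.array([0, 1, 0, 1]); study = np.array([0, 0, 1, 1])
res = minimize(neg_loglik, x0=np.zeros(4), args=(y, n, trt, study),
               method="Nelder-Mead")
print("estimated treatment effect:", res.x[0])
```

Because the likelihood is in closed form, standard optimizers (or, as the abstract notes, off-the-shelf panel count data routines for the equivalent NegBin models) suffice; no numerical integration over random effects is needed.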

