The impact of ocean sound speed variability on the uncertainty of geoacoustic parameter estimates.

2009 ◽  
Vol 125 (4) ◽  
pp. 2590-2590
Author(s):  
Ross Chapman ◽  
Yongmin Jiang
Methodology ◽  
2015 ◽  
Vol 11 (3) ◽  
pp. 89-99 ◽  
Author(s):  
Leslie Rutkowski ◽  
Yan Zhou

Abstract. Given a consistent interest in comparing achievement across sub-populations in international assessments such as TIMSS, PIRLS, and PISA, it is critical that sub-population achievement is estimated reliably and with sufficient precision. To that end, we systematically examine the limitations of the estimation methods currently used by these programs. Using a simulation study along with empirical results from the 2007 cycle of TIMSS, we show that a combination of missing and misclassified data in the conditioning model induces biases in sub-population achievement estimates, the magnitude of which can be readily explained by data quality. Importantly, estimated biases in sub-population achievement are limited to the conditioning variable with poor-quality data, while other sub-population achievement estimates are unaffected. Findings are generally in line with theory on missing and error-prone covariates. The current research adds to a small body of literature that has noted some of the limitations of sub-population estimation.
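As a toy illustration of the mechanism this abstract describes, the sketch below (all numbers hypothetical; a single grouping variable stands in for a full conditioning model) misclassifies a share of one sub-population and shows that only the estimates for the affected groups shift, while the untouched group is unaffected:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 30_000
true_group = rng.integers(0, 3, n)            # true sub-population labels
group_means = np.array([500.0, 520.0, 540.0]) # hypothetical achievement means
score = group_means[true_group] + rng.normal(0, 30, n)

# Misclassify 20% of group 0 as group 1 (poor-quality conditioning data).
obs_group = true_group.copy()
flip = (true_group == 0) & (rng.random(n) < 0.2)
obs_group[flip] = 1

# Group 1's observed mean is pulled toward group 0's; group 2 is untouched.
est = np.array([score[obs_group == g].mean() for g in range(3)])
print(est)
```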


2021 ◽  
Vol 45 (3) ◽  
pp. 159-177
Author(s):  
Chen-Wei Liu

Missing not at random (MNAR) modeling for non-ignorable missing responses usually assumes that the latent variable distribution is a bivariate normal distribution. This assumption is rarely verified, yet it is often treated as a standard in practice. Recent studies of “complete” item responses (i.e., no missing data) have shown that ignoring a nonnormal distribution of a unidimensional latent variable, especially a skewed or bimodal one, can yield biased estimates and misleading conclusions. However, handling a bivariate nonnormal latent variable distribution in the presence of MNAR data has not been investigated. This article proposes extending the unidimensional empirical histogram and Davidian curve methods to deal simultaneously with a nonnormal latent variable distribution and MNAR data. A simulation study is carried out to demonstrate the consequences of ignoring a bivariate nonnormal distribution for parameter estimates, followed by an empirical analysis of “don’t know” item responses. The results presented in this article show that examining the assumption of a bivariate normal latent variable distribution should be considered routine for MNAR data, to minimize the impact of nonnormality on parameter estimates.


2021 ◽  
pp. 001316442199240
Author(s):  
Chunhua Cao ◽  
Eun Sook Kim ◽  
Yi-Hsin Chen ◽  
John Ferron

This study examined the impact of omitting a covariate interaction effect on parameter estimates in multilevel multiple-indicator multiple-cause (MIMIC) models, as well as the sensitivity of fit indices to model misspecification when the between-level, within-level, or cross-level interaction effect was left out of the models. The parameter estimates produced by the correct and the misspecified models were compared under varying conditions of cluster number, cluster size, intraclass correlation, and the magnitude of the interaction effect in the population model. Results showed that the two main effects were overestimated by approximately half of the size of the interaction effect, and the between-level factor mean was underestimated. None of the comparative fit index, Tucker–Lewis index, root mean square error of approximation, or standardized root mean square residual was sensitive to the omission of the interaction effect. The sensitivity of information criteria varied depending primarily on the magnitude of the omitted interaction, as well as on its location (i.e., at the between, within, or cross level). Implications and recommendations based on the findings are discussed.
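The "main effects absorb about half the omitted interaction" result has a simple single-level analogue that can be checked directly. The sketch below (hypothetical numbers; ordinary regression rather than a multilevel MIMIC model) omits the interaction of two independent 0/1 covariates, so each main effect picks up roughly half of the interaction coefficient:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x1 = rng.integers(0, 2, n).astype(float)   # two independent binary covariates
x2 = rng.integers(0, 2, n).astype(float)
gamma = 0.4                                # interaction effect in the population
y = 0.5 * x1 + 0.5 * x2 + gamma * x1 * x2 + rng.normal(0, 1, n)

# Fit the misspecified model with the interaction term omitted.
X = np.column_stack([np.ones(n), x1, x2])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
# For independent Bernoulli(0.5) covariates, the projection of x1*x2 onto
# [1, x1, x2] is 0.5*x1 + 0.5*x2 - 0.25, so each main effect gains gamma/2.
print(coef[1], coef[2])
```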


2005 ◽  
Vol 52 (10-11) ◽  
pp. 503-508 ◽  
Author(s):  
K. Chandran ◽  
Z. Hu ◽  
B.F. Smets

Several techniques have been proposed for biokinetic estimation of nitrification. Recently, an extant respirometric assay has been presented that yields kinetic parameters for both nitrification steps with minimal physiological change to the microorganisms during the assay. Herein, we evaluate the ability of biokinetic parameter estimates from the extant respirometric assay to describe concurrently obtained NH4+-N and NO2−-N substrate depletion profiles. In general, the substrate depletion profiles yielded a higher estimate of the maximum specific growth rate coefficient, μmax, for both NH4+-N to NO2−-N oxidation and NO2−-N to NO3−-N oxidation than did the extant respirograms. The trends in the kinetic parameter estimates from the different biokinetic estimation techniques are paralleled in the nature of the substrate depletion profiles obtained from best-fit parameters. Based on visual inspection, best-fit parameters from optimally designed complete respirograms generally provided a better description of the substrate depletion profiles than estimates from isolated respirograms. Nevertheless, the sum of squared errors for the best-fit respirometry-based parameters was outside the 95% joint confidence region computed for the best-fit substrate-depletion-based parameters. Notwithstanding the differences in kinetic parameter estimates determined in this study, the estimates from the different biokinetic estimation techniques are still close to those reported in the literature. Additional identifiability and sensitivity analyses of parameters from substrate depletion assays revealed high parameter precision and high parameter correlation. Although biokinetic estimation via automated extant respirometry is far more facile than via manual substrate depletion measurements, additional sensitivity analyses are needed to test the impact of differences in the resulting parameter values on continuous reactor performance.


BMJ Open ◽  
2022 ◽  
Vol 12 (1) ◽  
pp. e031573
Author(s):  
Sam Abbott ◽  
Hannah Christensen ◽  
Ellen Brooks-Pollock

Objectives: In 2005, England and Wales switched from universal BCG vaccination against tuberculosis (TB) disease for school-age children to targeted vaccination of neonates. We aimed to recreate and re-evaluate a previously published model, the results of which informed this policy change.
Design: We recreated an approach for estimating the impact of ending the BCG schools scheme, correcting a methodological flaw in the model, updating the model with parameter uncertainty, and improving parameter estimates where possible. We investigated scenarios for the assumed annual decrease in TB incidence rates considered by the UK’s Joint Committee on Vaccination and Immunisation and explored alternative scenarios using notification data.
Setting: England and Wales.
Outcome measures: The number of vaccines needed to prevent a single notification, and the average annual additional notifications caused by the policy change.
Results: The previously published model was found to contain a methodological flaw and to be spuriously precise. It greatly underestimated the impact of ending school-age vaccination compared with our updated, corrected model. The updated model produced predictions with wide CIs when parameter uncertainty was included. Model estimates based on an assumed annual decrease in TB incidence rates of 1.9% were closest to those estimated using notification data. Using this assumption, we estimate that 1600 (2.5th–97.5th quantiles: 1300, 2000) vaccines would have been required to prevent a single notification in 2004.
Conclusions: The impact of ending the BCG schools scheme was found to be greater than previously thought when notification data were used. Our results highlight the importance of independent evaluations of modelling evidence, including uncertainty, and of evaluating multiple scenarios when forecasting the impact of changes in vaccination policy.


2016 ◽  
Vol 23 (2) ◽  
pp. 448-459 ◽  
Author(s):  
Richard T. Melstrom

This article presents an exponential model of tourist expenditures estimated by a quasi-maximum likelihood (QML) technique. The advantage of this approach is that, unlike conventional OLS and Tobit estimators, it produces consistent parameter estimates under conditions of a corner solution at zero and heteroscedasticity. An application to sportfishing evaluates the role of socioeconomic demographics and species preferences on trip spending. The bias from an inappropriate estimator is illustrated by comparing the results from QML and OLS estimation, which shows that OLS significantly overstates the impact of trip duration on trip expenditures compared with the QML estimator. Both sets of estimates imply that trout and bass anglers spend significantly more on their fishing trips compared with other anglers.


2021 ◽  
pp. 107699862199436
Author(s):  
Yue Liu ◽  
Hongyun Liu

The prevalence and serious consequences of noneffortful responses from unmotivated examinees are well known in educational measurement. In this study, we propose an iterative purification process based on a response time residual method with fixed item parameter estimates to detect noneffortful responses. The proposed method is compared with the traditional residual method and with a noniterative method with fixed item parameters in two simulation studies, in terms of noneffort detection accuracy and parameter recovery. The results show that when the severity of noneffort is high, the proposed method leads to a much higher true positive rate with only a small increase in the false discovery rate. In addition, parameter estimation is significantly improved by the strategies of fixing item parameters and iterative cleansing. These results suggest that the proposed method is a potential solution for reducing the impact of data contamination due to severely low test-taking effort and for obtaining more accurate parameter estimates. An empirical study is also conducted to show the differences in detection rates and parameter estimates among the approaches.
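A stripped-down sketch of the iterative purification idea, under simplifying assumptions not taken from the paper (a lognormal response-time model, per-item means in place of fitted item parameters, and a fixed −2 residual cutoff): flag responses with large negative time residuals, re-estimate item time intensities from the remaining responses, and repeat until the flag set stabilizes.

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 500, 20
beta = rng.normal(4.0, 0.3, n_items)        # item time intensities (log-seconds)
tau = rng.normal(0.0, 0.3, n_persons)       # person speed
log_t = beta - tau[:, None] + rng.normal(0, 0.3, (n_persons, n_items))

# Contaminate 10% of responses with rapid guessing (much shorter times).
noneffort = rng.random((n_persons, n_items)) < 0.10
log_t[noneffort] = rng.normal(1.0, 0.3, noneffort.sum())

# Iterative purification: estimate item intensities from responses currently
# deemed effortful, flag large negative standardized residuals, repeat.
flagged = np.zeros_like(noneffort)
for _ in range(10):
    clean = np.where(flagged, np.nan, log_t)
    b_hat = np.nanmean(clean, axis=0)
    s_hat = np.nanstd(clean, axis=0)
    z = (log_t - b_hat) / s_hat
    new_flagged = z < -2.0
    if (new_flagged == flagged).all():
        break
    flagged = new_flagged

recall = (flagged & noneffort).sum() / noneffort.sum()
print(f"true-positive rate: {recall:.2f}")
```

The purification step matters because contaminated responses inflate the residual variance on the first pass, masking some of the noneffortful responses; re-estimating after removal tightens the residual scale and catches them.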


2020 ◽  
Vol 14 (1) ◽  
pp. 28-61 ◽  
Author(s):  
Masudul Hasan Adil ◽  
Neeraj Hatekar ◽  
Pravakar Sahoo

Traditional money demand functions are often criticized for persistent over-prediction, implausible parameter estimates, highly serially correlated errors and unstable money demand. This study argues that some of these problems may have emerged from the failure to factor financial innovation into the money demand function. The study estimates money demand for India during the post-reform period, from 1996:Q2 to 2016:Q3. The money demand function is estimated with the linear ARDL approach to cointegration developed by Pesaran, Shin, & Smith (2001, “Bounds testing approaches to the analysis of level relationships”, Journal of Applied Econometrics, 16(3), 289–326), after employing various proxies for financial innovation. The study finds a stable long-run relationship among real money balances and the scale and opportunity cost variables. In a nutshell, the study assesses the relative importance of financial innovation variables in the money demand equation, and finds that financial innovation plays a very significant role in the money demand specification and its stability. JEL Classification: E41, E44, E42, E52, O16, O53


2020 ◽  
pp. 001316442093963
Author(s):  
Stefanie A. Wind ◽  
Randall E. Schumacker

Researchers frequently use Rasch models to analyze survey responses because these models provide accurate parameter estimates for items and examinees when there are missing data. However, researchers have not fully considered how missing data affect the accuracy of dimensionality assessment in Rasch analyses such as principal components analysis (PCA) of standardized residuals. Because adherence to unidimensionality is a prerequisite for the appropriate interpretation and use of Rasch model results, insight into the impact of missing data on the accuracy of this approach is critical. We used a simulation study to examine the accuracy of standardized residual PCA with various proportions of missing data and multidimensionality. We also explored an adaptation of modified parallel analysis in combination with standardized residual PCA as a source of additional information about dimensionality when missing data are present. Our results suggested that missing data impact the accuracy of PCA on standardized residuals, and that the adaptation of modified parallel analysis provides useful supplementary information about dimensionality when there are missing data.
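The dimensionality check this abstract studies can be sketched in a few lines: compute standardized residuals from the Rasch model probabilities and inspect the first eigenvalue of their inter-item correlation matrix. The sketch below simplifies by using the true generating parameters rather than estimates and by using complete data; for truly unidimensional data the first contrast eigenvalue stays near the chance level:

```python
import numpy as np

rng = np.random.default_rng(2)
n_persons, n_items = 1000, 20
theta = rng.normal(0, 1, n_persons)         # person abilities
b = rng.normal(0, 1, n_items)               # item difficulties

# Unidimensional Rasch data; the sketch uses the true parameters for the
# model probabilities instead of estimating them.
p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
x = (rng.random((n_persons, n_items)) < p).astype(float)

# Standardized residuals and PCA of their inter-item correlations.
z = (x - p) / np.sqrt(p * (1 - p))
eigvals = np.linalg.eigvalsh(np.corrcoef(z, rowvar=False))[::-1]
print("first contrast eigenvalue:", eigvals[0])
```

With missing data, the residual correlations would have to be computed pairwise (or after imputation), which is exactly where the accuracy questions examined in the study arise.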

