Tuning of jackknife estimator

Author(s):  
Sarjinder Singh ◽  
Stephen A. Sedory ◽  
Maria del Mar Rueda ◽  
Antonio Arcos ◽  
Raghunath Arnab
1998 ◽  
Vol 06 (04) ◽  
pp. 357-375
Author(s):  
Gabriela Ciuperca

In this paper we present a method for estimating the parameters of models described by a nonlinear system of differential equations. We study the maximum likelihood estimator and the jackknife estimator for the parameters of the system and for the covariance matrix of the state variables, and we seek possible linear relations between parameters. We take into account the difficulty posed by the small number of observations. The optimal experimental design for this kind of problem is determined. We give an application of this method to the glucose metabolism of goats.
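For readers who want to experiment, a minimal delete-one jackknife sketch in Python (NumPy only) is given below. The paper fits a nonlinear ODE system by maximum likelihood; here fit_model is a hypothetical stand-in for any routine that maps the observations to a parameter vector. The sketch returns the Quenouille bias-corrected estimate together with the jackknife covariance of the parameter estimates.

import numpy as np

def jackknife(data, fit_model):
    """Delete-one jackknife around an arbitrary fitting routine.

    `fit_model` is a hypothetical user-supplied function (not from the paper)
    that returns a parameter vector for a given data array.
    """
    n = len(data)
    theta_full = np.asarray(fit_model(data))                     # estimate on all n observations
    theta_loo = np.array([fit_model(np.delete(data, i, axis=0))  # leave-one-out estimates
                          for i in range(n)])
    theta_bar = theta_loo.mean(axis=0)
    theta_jack = n * theta_full - (n - 1) * theta_bar            # Quenouille bias-corrected estimate
    centered = theta_loo - theta_bar
    cov_jack = (n - 1) / n * centered.T @ centered               # jackknife covariance of the estimator
    return theta_jack, cov_jack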


1980 ◽  
Vol 37 (12) ◽  
pp. 2346-2351 ◽  
Author(s):  
S. J. Smith

Two methods of estimating the variance of the estimate of catch per unit effort are compared empirically here. The second method, an application of the Jackknife estimator, proved to be the best overall technique because of its generality. In addition, the Jackknife method provides a means of estimating confidence intervals.
Key words: catch per unit effort, variance estimation, Jackknife estimator
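As a hedged illustration (not necessarily the paper's exact estimator), the sketch below treats catch per unit effort as the ratio of total catch to total effort over sampled tows and applies the delete-one jackknife via pseudo-values to obtain a variance estimate and a normal-approximation 95% confidence interval; the data arrays are hypothetical.

import numpy as np

def cpue(catch, effort):
    # Catch per unit effort as a ratio estimator: total catch / total effort
    return catch.sum() / effort.sum()

def jackknife_cpue(catch, effort):
    n = len(catch)
    full = cpue(catch, effort)
    loo = np.array([cpue(np.delete(catch, i), np.delete(effort, i))
                    for i in range(n)])          # leave-one-out ratios
    pseudo = n * full - (n - 1) * loo            # pseudo-values
    est = pseudo.mean()                          # jackknife point estimate
    var = pseudo.var(ddof=1) / n                 # jackknife variance of the estimate
    half = 1.96 * np.sqrt(var)                   # normal-approximation 95% interval
    return est, var, (est - half, est + half)

catch = np.array([12.0, 7.5, 20.1, 3.2, 15.4])   # hypothetical catch per tow
effort = np.array([1.0, 0.8, 1.5, 0.5, 1.2])     # hypothetical effort per tow
print(jackknife_cpue(catch, effort))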


2006 ◽  
Vol 15 (3) ◽  
Author(s):  
Christina D. Smith ◽  
Jeffrey S. Pontius

1983 ◽  
Vol 27 (3) ◽  
pp. 329-337
Author(s):  
P.N. Kokic ◽  
N.C. Weber

The limiting behaviour of the J∞ jackknife estimator for parameters associated with stochastic processes is shown to depend on the nature of the underlying process through the asymptotic behaviour of the estimator being jackknifed. In particular, the jackknifed versions of certain estimators associated with renewal processes are shown to have an asymptotic normal distribution.
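A small simulation sketch, under assumptions of our own (exponential interarrivals and the plug-in rate estimator 1/mean; not the paper's construction), illustrates the kind of asymptotic normality claimed: across replications the jackknifed rate estimates should be approximately normal.

import numpy as np

rng = np.random.default_rng(0)

def jackknifed_rate(x):
    """Delete-one jackknife of the plug-in rate estimator 1/mean(x)."""
    n = len(x)
    full = 1.0 / x.mean()
    loo = np.array([1.0 / np.delete(x, i).mean() for i in range(n)])
    return n * full - (n - 1) * loo.mean()

reps, n, true_rate = 2000, 50, 2.0               # hypothetical simulation settings
estimates = np.array([jackknifed_rate(rng.exponential(1.0 / true_rate, n))
                      for _ in range(reps)])
z = (estimates - estimates.mean()) / estimates.std(ddof=1)
print(np.mean(np.abs(z) < 1.96))                 # near 0.95 if the estimator is roughly normal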


Biometrics ◽  
1996 ◽  
Vol 52 (1) ◽  
pp. 291 ◽  
Author(s):  
Stuart R. Lipsitz ◽  
Michael Parzen

2012 ◽  
Vol 4 (2) ◽  
Author(s):  
Gareth D. Liu-Evans ◽  
Garry D. A. Phillips

We compare a number of bias-correction methodologies in terms of mean squared error and remaining bias, including the residual bootstrap, the relatively unexplored Quenouille jackknife, and methods based on analytical approximation of moments. We introduce a new higher-order jackknife estimator for the AR(1) with constant. Simulation results are presented for four different error structures, including GARCH. We include results for a relatively extreme situation where the errors are highly skewed and leptokurtic. It is argued that the bootstrap and analytical-correction (COLS) approaches are to be favoured overall, though the jackknife methods are the least biased. We find that COLS tends to have the lowest mean squared error, though the bootstrap also does well.
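As a point of reference, here is a minimal sketch of the classic Quenouille half-sample jackknife for the AR(1) coefficient with constant (the paper's new higher-order jackknife is not reproduced here): the full-sample and half-sample OLS estimates are combined to remove the leading bias term. The simulated series is hypothetical.

import numpy as np

def ar1_ols(y):
    # OLS estimate of rho in y_t = c + rho * y_{t-1} + e_t
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    return np.linalg.lstsq(X, y[1:], rcond=None)[0][1]

def quenouille_ar1(y):
    # Combine full-sample and half-sample estimates to remove the O(1/T) bias term
    T = len(y)
    rho_full = ar1_ols(y)
    rho_first, rho_second = ar1_ols(y[:T // 2]), ar1_ols(y[T // 2:])
    return 2.0 * rho_full - 0.5 * (rho_first + rho_second)

rng = np.random.default_rng(1)
rho, T = 0.9, 60                                 # hypothetical true coefficient and sample size
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 + rho * y[t - 1] + rng.standard_normal()
print(ar1_ols(y), quenouille_ar1(y))             # uncorrected vs. bias-corrected estimate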

