The challenge of estimating a residual spatial autocorrelation from forest inventory data

2017 ◽  
Vol 47 (11) ◽  
pp. 1557-1566 ◽  
Author(s):  
Steen Magnussen ◽  
Johannes Breidenbach ◽  
Francisco Mauro

Estimates of stand averages are needed by forest management for planning purposes. In forest enterprise inventories supported by remotely sensed auxiliary data, these estimates are typically derived exclusively from a model that does not consider stand effects in the study variable. Variance estimators for these means may seriously underestimate uncertainty, and confidence intervals may be too narrow, when the model used for computing a stand mean omits a nontrivial stand effect in one or more of the model parameters, a nontrivial distance-dependent spatial autocorrelation in the model residuals, or both. In simulated sampling from 36 populations with stands of different sizes, differing with respect to (i) the correlation between a study variable (Y) and two auxiliary variables (X), (ii) the magnitude of stand effects in the intercept of a linear population model linking X to Y, and (iii) a first-order autoregression in Y and X, we found that none of the tested designs provided reliable estimates of the within-stand autocorrelation among model residuals. More reliable estimates were possible from stand-wide predictions of Y. The anticipated bias in an estimated autoregression parameter had only a modest influence on variance estimates and on the coverage of nominal 95% confidence intervals for a synthetic stand mean.
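
As a rough illustration of why within-stand residual autocorrelation is hard to estimate from plot data, the sketch below simulates one stand with a random intercept and AR(1) residuals, fits an ordinary regression that ignores the stand effect, and computes a naive lag-1 autocorrelation from the residuals. The model form and all parameter values are invented for illustration and are not those of the study.

```python
# Hedged sketch: simulate within-stand AR(1) residuals and estimate the lag-1
# autocorrelation from OLS residuals. All values are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def simulate_stand(n_plots=25, rho=0.6, stand_effect_sd=1.0, sigma=2.0):
    """One stand: plots on a transect, AR(1) residuals with parameter rho."""
    x = rng.normal(10.0, 2.0, n_plots)            # auxiliary variable X
    e = np.empty(n_plots)                          # AR(1) residual series
    e[0] = rng.normal(0.0, sigma / np.sqrt(1 - rho**2))
    for i in range(1, n_plots):
        e[i] = rho * e[i - 1] + rng.normal(0.0, sigma)
    u = rng.normal(0.0, stand_effect_sd)           # random stand intercept
    y = 5.0 + u + 1.5 * x + e                      # study variable Y
    return x, y

x, y = simulate_stand()
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]        # model ignoring the stand effect
r = y - X @ beta                                   # model residuals
rho_hat = np.corrcoef(r[:-1], r[1:])[0, 1]         # naive lag-1 estimate
print(f"estimated lag-1 residual autocorrelation: {rho_hat:.2f}")
```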

Marketing ZFP ◽  
2019 ◽  
Vol 41 (4) ◽  
pp. 33-42
Author(s):  
Thomas Otter

Empirical research in marketing is often, at least in part, exploratory. The goal of exploratory research, by definition, extends beyond the empirical calibration of parameters in well-established models and includes the empirical assessment of different model specifications. In this context, researchers often rely on the statistical information about parameters in a given model to learn about likely model structures. An example is the search for the 'true' set of covariates in a regression model based on confidence intervals of regression coefficients. The purpose of this paper is to illustrate and compare different measures of statistical information about model parameters in the context of a generalized linear model: classical confidence intervals, bootstrapped confidence intervals, and Bayesian posterior credible intervals from a model that adapts its dimensionality as a function of the information in the data. I find that inference from the adaptive Bayesian model dominates inference based on classical and bootstrapped intervals in a given model.
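
Two of the interval types being compared can be sketched in code. The example below contrasts classical Wald intervals with nonparametric bootstrap percentile intervals for the coefficients of a Poisson GLM; the data-generating process is invented, and the paper's adaptive-dimension Bayesian model is not reproduced here.

```python
# Hedged sketch: classical Wald intervals vs. bootstrap percentile intervals
# for Poisson GLM coefficients. Data-generating process is illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
X = sm.add_constant(rng.normal(size=(n, 2)))
y = rng.poisson(np.exp(X @ np.array([0.5, 0.8, 0.0])))  # second covariate irrelevant

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print("classical 95% CIs:\n", fit.conf_int())

boot = []
for _ in range(1000):                                    # nonparametric bootstrap
    idx = rng.integers(0, n, n)
    boot.append(sm.GLM(y[idx], X[idx], family=sm.families.Poisson()).fit().params)
print("bootstrap 95% CIs:\n", np.percentile(boot, [2.5, 97.5], axis=0).T)
```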


2020 ◽  
Vol 2020 (12) ◽  
Author(s):  
Francesco Bigazzi ◽  
Alessio Caddeo ◽  
Aldo L. Cotrone ◽  
Angel Paredes

Abstract Using the holographic correspondence as a tool, we study the dynamics of first-order phase transitions in strongly coupled gauge theories at finite temperature. Considering an evolution from the high-temperature to the low-temperature phase, we compute the nucleation rate of bubbles of true vacuum in the metastable phase. For this purpose, we find the relevant configurations (bounces) interpolating between the vacua and compute the related effective actions. We start by revisiting the compact Randall-Sundrum model at high temperature. Using holographic renormalization, we compute the derivative term in the effective bounce action, which was missing in the literature. Then, we address the full problem within the top-down Witten-Sakai-Sugimoto model. It displays both a confinement/deconfinement and a chiral symmetry breaking/restoration phase transition which, depending on the model parameters, can happen at different critical temperatures. For the confinement/deconfinement case, we perform a numerical analysis of an effective description of the transition and also provide analytic expressions in the thick-wall and thin-wall approximations. For the chiral symmetry transition, we implement a variational approach that allows us to address the challenging non-linear problem stemming from the Dirac-Born-Infeld action.
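
For orientation, the standard field-theory thin-wall expressions for a thermal first-order transition are reproduced below (textbook formulas, not the holographic results of the paper): the O(3)-symmetric bounce action in terms of the bubble-wall tension σ and the free-energy density difference ΔF between the phases, together with the usual dimensional estimate of the nucleation rate per unit volume.

```latex
% Textbook thin-wall expressions, not the paper's holographic results.
% \sigma: bubble-wall tension; \Delta F: free-energy density difference between phases.
\begin{align}
  S_3 \simeq \frac{16\pi}{3}\,\frac{\sigma^3}{(\Delta F)^2},
  \qquad
  \Gamma \sim T^4 \left(\frac{S_3}{2\pi T}\right)^{3/2} e^{-S_3/T}.
\end{align}
```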


2019 ◽  
Vol 292 ◽  
pp. 01063
Author(s):  
Lubomír Macků

An alternative method of determining exothermic reactor model parameters, including the first-order reaction rate constant, is described in this paper. The method is based on the known in-reactor temperature development and is suitable for processes with changing quality of input substances. It allows us to evaluate changes in the composition of the reacting substances and is also capable of determining the reaction rate constant (the parameters of the Arrhenius equation). The method can be used in exothermic batch or semi-batch reactors running processes based on a first-order reaction. An example of such a process is given, and the problem is illustrated on its mathematical model with the help of simulations.
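
A minimal sketch of the general idea, with everything invented for illustration (the temperature trajectory, reactor model, and numerical values are not from the paper): given a known temperature development T(t), the Arrhenius parameters of a first-order reaction can be recovered by fitting the stepwise-integrated rate law to observed conversion.

```python
# Hedged sketch: recover Arrhenius parameters of a first-order reaction from a
# known temperature trajectory and simulated concentration data. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

R = 8.314                                     # gas constant, J/(mol K)
t = np.linspace(0.0, 600.0, 121)              # time, s
T = 300.0 + 40.0 * (1 - np.exp(-t / 200.0))   # assumed measured temperature, K

def conc(t, log10A, E, c0=1.0):
    """Integrate dc/dt = -A*exp(-E/(R*T(t)))*c stepwise along the T record."""
    A = 10.0 ** log10A
    c = np.empty_like(t)
    c[0] = c0
    for i in range(1, len(t)):
        k = A * np.exp(-E / (R * T[i - 1]))
        c[i] = c[i - 1] * np.exp(-k * (t[i] - t[i - 1]))
    return c

rng = np.random.default_rng(1)
c_obs = conc(t, 8.0, 6.0e4) + rng.normal(0.0, 0.005, t.size)   # "measured" data

(log10A_hat, E_hat), _ = curve_fit(conc, t, c_obs, p0=(7.0, 5.0e4))
print(f"A ≈ {10**log10A_hat:.3g} 1/s, E ≈ {E_hat:.3g} J/mol")
```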


2007 ◽  
Vol 73 (8) ◽  
pp. 2468-2478 ◽  
Author(s):  
Bernadette Klotz ◽  
D. Leo Pyle ◽  
Bernard M. Mackey

ABSTRACT A new primary model based on a thermodynamically consistent first-order kinetic approach was constructed to describe non-log-linear inactivation kinetics of pressure-treated bacteria. The model assumes a first-order process in which the specific inactivation rate changes inversely with the square root of time. The model gave reasonable fits to experimental data over six to seven orders of magnitude. It was also tested on 138 published data sets and provided good fits in about 70% of cases in which the shape of the curve followed the typical convex upward form. In the remainder of the published examples, curves contained additional shoulder regions or extended tail regions. Curves with shoulders could be accommodated by including an additional time-delay parameter, and curves with tails could be accommodated by omitting points in the tail beyond the point at which survival levels remained more or less constant. The model parameters varied regularly with pressure, which may reflect a genuine mechanistic basis for the model. This property also allowed the calculation of (a) parameters analogous to the decimal reduction time D and the z value (the temperature increase needed to change the D value by a factor of 10) in thermal processing, and hence the processing conditions needed to attain a desired level of inactivation, and (b) the apparent thermodynamic volumes of activation associated with the lethal events. The hypothesis that inactivation rates changed as a function of the square root of time would be consistent with a diffusion-limited process.
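
A brief numerical illustration of the core assumption (invented survival data, not from the paper): if the specific inactivation rate decreases as 1/√t, integration gives log10 survivors linear in √t, so a single slope can be fitted and used to back out processing times.

```python
# Hedged sketch: a rate varying as 1/sqrt(t) implies log10(N/N0) = -b*sqrt(t).
# The survival data below are invented for illustration.
import numpy as np

t = np.array([1, 2, 4, 6, 8, 10, 15, 20], dtype=float)      # minutes under pressure
log10_surv = np.array([-0.8, -1.2, -1.7, -2.1, -2.4, -2.7, -3.3, -3.8])

b = -np.sum(np.sqrt(t) * log10_surv) / np.sum(t)             # least squares through origin
print(f"b = {b:.3f} per sqrt(min); "
      f"time for a 5-log reduction ≈ {(5.0 / b) ** 2:.1f} min")
```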


Author(s):  
Yasuhiro Saito ◽  
Tadashi Dohi

A non-homogeneous gamma process (NHGP) is characterized by an arbitrary trend function and a gamma renewal distribution. In this paper, we estimate confidence intervals for the model parameters of an NHGP via two parametric bootstrap methods: a simulation-based approach and a resampling-based approach. For each bootstrap method, we apply three methods of constructing the confidence intervals. Through simulation experiments, we investigate each parametric bootstrap method and each interval-construction method in terms of estimation accuracy. Finally, we identify the best combination for estimating the model parameters of the trend function and the gamma renewal distribution in the NHGP.
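
A minimal sketch of the simulation-based parametric bootstrap with percentile intervals, shown here for a gamma renewal distribution alone (the trend function of the full NHGP is omitted for brevity); the data and settings are invented for illustration.

```python
# Hedged sketch: simulation-based parametric bootstrap with percentile intervals
# for the shape and scale of a gamma renewal distribution. Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = stats.gamma.rvs(a=2.0, scale=1.5, size=100, random_state=rng)   # observed renewals

a_hat, _, scale_hat = stats.gamma.fit(data, floc=0)                    # MLE, location fixed at 0

boot = []
for _ in range(2000):                                                  # parametric bootstrap
    sim = stats.gamma.rvs(a=a_hat, scale=scale_hat, size=data.size, random_state=rng)
    a_b, _, s_b = stats.gamma.fit(sim, floc=0)
    boot.append((a_b, s_b))

lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(f"shape 95% CI: ({lo[0]:.2f}, {hi[0]:.2f}), scale 95% CI: ({lo[1]:.2f}, {hi[1]:.2f})")
```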


1985 ◽  
Vol 17 (9) ◽  
pp. 13-21 ◽  
Author(s):  
W K. H. Kinzelbach

At present, chlorinated hydrocarbon solvents rank among the major pollutants found in groundwater. In the interpretation of field data and the planning of decontamination measures, numerical transport models can be a valuable tool for the environmental engineer. The applicability of one such model is tested on a case of groundwater pollution by 1,1,1-trichloroethane. The model is composed of a horizontal 2-D flow model and a 3-D 'random-walk' transport model. It takes into account convective and dispersive transport as well as linear adsorption and a first-order decay reaction. Under certain simplifying assumptions the model allows an adequate reproduction of observed concentrations. Due to uncertainty in the data and the limited comparability of simulated and measured concentrations, the model parameters can only be estimated within bounds. The decay rate of 1,1,1-trichloroethane is estimated to lie between 0 and 0.0005 1/d.
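
The core of such a random-walk transport model can be sketched in a few lines: each particle is advected with the retarded pore velocity, displaced randomly according to the dispersivities, and its mass is reduced by first-order decay. All numerical values below are illustrative, not those of the case study.

```python
# Hedged sketch of a random-walk transport step with linear retardation and
# first-order decay. Velocities, dispersivities, and decay rate are illustrative.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
x = np.zeros((n, 3))                       # particle positions, m
mass = np.full(n, 1.0)                     # particle masses (relative units)

v = np.array([0.5, 0.0, 0.0])              # pore velocity, m/d
aL, aT = 10.0, 1.0                         # longitudinal/transverse dispersivity, m
R = 2.0                                    # retardation factor (linear adsorption)
lam = 0.0003                               # first-order decay rate, 1/d
dt = 1.0                                   # time step, d

D = np.array([aL, aT, aT]) * np.linalg.norm(v) / R   # retarded dispersion coefficients
for _ in range(365):                                  # one year of transport
    x += (v / R) * dt + rng.normal(0.0, np.sqrt(2 * D * dt), size=(n, 3))
    mass *= np.exp(-lam * dt)                         # first-order decay

print(f"plume centroid after 1 yr: {x.mean(axis=0).round(1)} m, "
      f"remaining mass fraction: {mass.sum() / n:.3f}")
```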


2020 ◽  
Vol 9 (1) ◽  
pp. 156-168
Author(s):  
Seyed Mahdi Mousavi ◽  
Saeed Dinarvand ◽  
Mohammad Eftekhari Yazdi

Abstract The unsteady convective boundary-layer flow of a nanofluid along a permeable shrinking/stretching plate under suction and second-order slip effects is studied. Buongiorno's two-component nonhomogeneous equilibrium model is implemented to take the effects of Brownian motion and thermophoresis into consideration. We emphasize that our two-phase nanofluid model, with a slip condition on the concentration at the wall, captures the physics better than assuming a constant volume concentration at the wall. The similarity transformation method (STM) allows us to reduce the nonlinear governing PDEs to nonlinear dimensionless ODEs, which are then solved numerically by the Keller-box method (KBM). The graphical results portray the effects of the model parameters on the boundary-layer behavior. Moreover, the results are validated in terms of the skin friction and the reduced Nusselt number. We find that the shrinking-plate case is a key factor in the non-uniqueness of the solutions, and that the range of the shrinking parameter for which a solution exists increases with the first-order slip parameter, the absolute value of the second-order slip parameter, and the transpiration rate parameter. Furthermore, second-order slip at the interface decreases the rate of heat transfer in the nanofluid. Finally, the analyses for no-slip and first-order slip boundary conditions can be retrieved as special cases of the present model.
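
As a simplified illustration of how a second-order slip condition enters a similarity problem, the sketch below solves only the momentum equation of a generic shrinking-sheet flow with suction, using solve_bvp rather than the Keller-box scheme of the paper; the governing equation, boundary conditions, and parameter values are illustrative stand-ins, not the paper's nanofluid model.

```python
# Hedged sketch: momentum equation of a generic shrinking-sheet similarity flow
# with suction and second-order slip, f''' + f f'' - f'^2 = 0. Illustrative only.
import numpy as np
from scipy.integrate import solve_bvp

s, eps = 2.5, -1.0            # suction parameter, shrinking parameter
g1, g2 = 0.5, -0.2            # first- and second-order slip coefficients
eta_inf = 10.0

def odes(eta, y):             # y = [f, f', f'']
    f, fp, fpp = y
    return np.vstack([fp, fpp, -f * fpp + fp**2])

def bcs(ya, yb):
    fppp0 = -ya[0] * ya[2] + ya[1]**2               # f'''(0) from the ODE itself
    return np.array([ya[0] - s,                     # f(0) = s  (suction)
                     ya[1] - (eps + g1 * ya[2] + g2 * fppp0),  # second-order slip
                     yb[1]])                        # f'(inf) -> 0

eta = np.linspace(0.0, eta_inf, 200)
y0 = np.vstack([s + eps * (1 - np.exp(-eta)),       # initial guess
                eps * np.exp(-eta),
                -eps * np.exp(-eta)])
sol = solve_bvp(odes, bcs, eta, y0, max_nodes=20000)
print(f"converged: {sol.success}, skin friction f''(0) = {sol.y[2, 0]:.4f}")
```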


2019 ◽  
Vol 489 (4) ◽  
pp. 4690-4704 ◽  
Author(s):  
Jong-Ho Shinn

ABSTRACT We have revisited the target EON_10.477_41.954 in order to determine more accurately the uncertainties in the model parameters that are important for target classification (i.e. galaxies with or without substantial extraplanar dust). We performed a Markov chain Monte Carlo (MCMC) analysis for the 15 parameters of the three-dimensional radiative-transfer galaxy model we used previously for target classification. To investigate the convergence of the MCMC sampling – which is usually neglected in the literature but should not be – we monitored the integrated autocorrelation time (τint), and we achieved effective sample sizes >5650 for all the model parameters. The confidence intervals are unstable at the beginning of the iterations where the values of τint are increasing, but they become stable in later iterations where those values are almost constant. The final confidence intervals are ∼5–100 times larger than the nominal uncertainties used in our previous study (the standard deviation of three best-fitting results). Thus, those nominal uncertainties are not good proxies for the model-parameter uncertainties. Although the position of EON_10.477_41.954 in the target-classification plot (the scale height to diameter ratio of dust versus that of light source) decreases by about 20–30 per cent when compared to our previous study, its membership in the ‘high-group’ – i.e. among galaxies with substantial extraplanar dust – nevertheless remains unchanged.
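
The convergence monitoring described here can be illustrated on a toy problem: the sketch below runs an ensemble sampler on a two-parameter Gaussian posterior, tracks the integrated autocorrelation time, and reports effective sample sizes and percentile intervals (the 15-parameter radiative-transfer model is of course not reproduced).

```python
# Hedged sketch: integrated autocorrelation time and effective sample size
# with emcee on a toy two-parameter Gaussian posterior. Illustrative only.
import numpy as np
import emcee

def log_prob(theta):
    return -0.5 * np.sum(theta**2)          # standard-normal toy posterior

ndim, nwalkers, nsteps = 2, 32, 5000
p0 = np.random.default_rng(5).normal(size=(nwalkers, ndim))

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, nsteps, progress=False)

tau = sampler.get_autocorr_time(tol=0)      # integrated autocorrelation times
ess = nwalkers * nsteps / tau               # effective sample size per parameter
chain = sampler.get_chain(discard=int(5 * tau.max()), flat=True)
ci = np.percentile(chain, [2.5, 97.5], axis=0)
print(f"tau = {tau.round(1)}, ESS = {ess.round(0)}, 95% intervals:\n{ci.T}")
```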


2014 ◽  
Vol 2014 ◽  
pp. 1-5 ◽  
Author(s):  
Yunusa Olufadi ◽  
Cem Kadilar

We suggest an estimator using two auxiliary variables for estimating the unknown population variance. The bias and the mean square error of the proposed estimator are obtained to the first order of approximation. In addition, the problem is extended to a two-phase sampling scheme. After theoretical comparisons, a numerical comparison is carried out, as an illustration, to examine the performance of the suggested estimator against several existing estimators.
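
To make the idea of variance estimation with auxiliary information concrete, the sketch below compares, by Monte Carlo, the plain sample variance with a generic ratio-type estimator that uses a single auxiliary variable; this is not the two-auxiliary estimator proposed in the paper, and the population is simulated.

```python
# Hedged sketch: Monte Carlo comparison of the usual sample variance with a
# generic ratio-type variance estimator using one auxiliary variable.
import numpy as np

rng = np.random.default_rng(11)
N, n, reps = 10_000, 100, 5000
x = rng.gamma(shape=4.0, scale=2.0, size=N)        # auxiliary variable, known for the population
y = 2.0 * x + rng.normal(0.0, 3.0, N)              # study variable, correlated with x
Sy2, Sx2 = y.var(ddof=1), x.var(ddof=1)            # population variances

est_plain, est_ratio = [], []
for _ in range(reps):
    idx = rng.choice(N, n, replace=False)          # simple random sample without replacement
    sy2, sx2 = y[idx].var(ddof=1), x[idx].var(ddof=1)
    est_plain.append(sy2)
    est_ratio.append(sy2 * Sx2 / sx2)              # ratio-type estimator of S_y^2

mse = lambda e: np.mean((np.array(e) - Sy2) ** 2)
print(f"MSE plain: {mse(est_plain):.1f}, MSE ratio-type: {mse(est_ratio):.1f}")
```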


Author(s):  
James J. Higgins

The first-order autoregressive model is proposed as a robust model for estimating and testing means in single-subject experiments. It has the advantage of mathematical simplicity, and it provides good approximations to a number of other models of the type typically encountered in behavioral research. Practical considerations in the use of the model are discussed, including tests of hypotheses and confidence intervals, sample size requirements, normal approximations, and advantages of the model over the independent-error-term model. Inferences for means and differences of means are considered.
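
A small sketch of the practical consequence (simulated single-subject data, illustrative values): under an AR(1) error model the variance of the mean is inflated by roughly (1 + ρ)/(1 − ρ) relative to the independent-errors formula, which widens the confidence interval accordingly.

```python
# Hedged sketch: confidence interval for a single-subject mean under AR(1) errors,
# using the large-sample variance inflation (1 + rho)/(1 - rho). Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n, mu, rho, sigma = 60, 50.0, 0.4, 5.0

e = np.empty(n)                                        # simulate AR(1) errors
e[0] = rng.normal(0.0, sigma / np.sqrt(1 - rho**2))
for t in range(1, n):
    e[t] = rho * e[t - 1] + rng.normal(0.0, sigma)
y = mu + e

r = y - y.mean()
rho_hat = np.sum(r[:-1] * r[1:]) / np.sum(r**2)        # lag-1 autocorrelation estimate
var_mean = y.var(ddof=1) / n * (1 + rho_hat) / (1 - rho_hat)
half = stats.t.ppf(0.975, df=n - 1) * np.sqrt(var_mean)
naive = stats.t.ppf(0.975, df=n - 1) * y.std(ddof=1) / np.sqrt(n)
print(f"mean = {y.mean():.1f}, naive CI half-width = {naive:.2f}, "
      f"AR(1)-adjusted half-width = {half:.2f}")
```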

