Minimax robust designs for wavelet estimation of nonparametric regression models with autocorrelated errors

Author(s):  
Selvakkadunko Selvaratnam ◽  
Alwell Julius Oyet

We discuss the construction of designs for the estimation of nonparametric regression models with autocorrelated errors when the mean response is to be approximated by a finite-order linear combination of dilated and translated versions of the Daubechies wavelet bases with four vanishing moments. We assume that the parameters of the resulting model will be estimated by weighted least squares (WLS) or by generalized least squares (GLS). The bias induced by the unused components of the wavelet bases in the linear approximation then inflates the natural variation of the WLS and GLS estimates. We therefore construct our designs by minimizing the maximum value of the average mean squared error (AMSE); such designs are said to be robust in the minimax sense. Our illustrative examples are constructed by using the simulated annealing algorithm to select an optimal [Formula: see text]-point design, with integer-valued points, from a grid of possible values of the explanatory or design variable [Formula: see text]. We found that the integer-valued designs we constructed based on GLS estimation have smaller minimum loss than the designs for WLS or ordinary least squares (OLS) estimation, except when the correlation parameter [Formula: see text] approaches 1.
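The design-selection step above can be sketched in miniature. The following is an illustrative simulated annealing search for an integer-valued n-point design; the quadratic basis, the A-optimality loss trace((X'X)^{-1}) (standing in for the paper's AMSE criterion), the grid, and the cooling schedule are all assumptions for the sketch, not the authors' wavelet setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def design_matrix(points):
    # Stand-in basis: constant, linear, quadratic terms (the paper instead
    # uses Daubechies wavelets with four vanishing moments).
    x = np.asarray(points, dtype=float) / 10.0
    return np.column_stack([np.ones_like(x), x, x**2])

def loss(points):
    # A-optimality criterion as a stand-in for the minimax AMSE loss.
    X = design_matrix(points)
    return np.trace(np.linalg.inv(X.T @ X))

def anneal(grid, n, steps=2000, t0=1.0, cooling=0.995):
    current = list(rng.choice(grid, size=n, replace=False))
    cur_loss = loss(current)
    best, best_loss, t = list(current), cur_loss, t0
    for _ in range(steps):
        cand = list(current)
        i = rng.integers(n)  # swap one design point for a fresh grid value
        cand[i] = int(rng.choice([g for g in grid if g not in cand]))
        cand_loss = loss(cand)
        # Accept improvements always, worse moves with Boltzmann probability.
        if cand_loss < cur_loss or rng.random() < np.exp((cur_loss - cand_loss) / t):
            current, cur_loss = cand, cand_loss
            if cur_loss < best_loss:
                best, best_loss = list(current), cur_loss
        t *= cooling
    return sorted(best), best_loss

grid = list(range(-10, 11))  # integer grid of candidate design points
design, val = anneal(grid, n=6)
```

In the paper the loss would additionally include the squared-bias term induced by the truncated wavelet components, maximized over the contamination class, before the search is run.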

1985 ◽  
Vol 15 (2) ◽  
pp. 331-340 ◽  
Author(s):  
T. Cunia ◽  
R. D. Briggs

To construct biomass tables for various tree components that are consistent with each other, one may use linear regression techniques with dummy variables. When the biomass of these components is measured on the same sample trees, one should also use the generalized rather than the ordinary least squares method. A procedure is shown which allows the estimation of the covariance matrix of the sample biomass values and circumvents the problem of storing and inverting large covariance matrices. Applied to 20 sets of sample tree data, the generalized least squares regressions generated estimates which, on average, were slightly higher (about 1%) than the sample data. The confidence and prediction bands about the regression function were wider, sometimes considerably so, than those estimated by ordinary weighted least squares.


Author(s):  
Anatolii Omelchenko ◽  
Oleksandr Vinnichenko ◽  
Pavel Neyezhmakov ◽  
Oleksii Fedorov ◽  
Volodymyr Bolyuh

Abstract In order to develop optimal data processing algorithms for ballistic laser gravimeters operating under correlated interference, the method of generalized least squares is applied. The interference is described by a mathematical model of an autoregressive process, whose inverse correlation matrix is banded and is expressed through the autoregression coefficients. To convert the “path-time” data at the output of the coincidence circuit of a ballistic laser gravimeter into a process uniform in time, local quadratic interpolation is used. Data processing algorithms for a ballistic gravimeter developed from the method of weighted least squares with orthogonal Hahn polynomials are also considered; to implement a symmetric measurement method, the symmetric Hahn polynomials, characterized by a single parameter, are used. Mathematical modelling is used to study the gain in accuracy of measuring the gravitational acceleration achieved by the synthesized algorithms in comparison with the algorithm based on the ordinary method of least squares. It is shown that auto-seismic interference in ballistic laser gravimeters with a symmetric measurement method can be significantly reduced by using a second-order autoregressive model in the method of generalized least squares. A comparative analysis of the characteristics of the algorithms developed using generalized least squares, weighted least squares and ordinary least squares is carried out.
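The banded inverse correlation matrix of an autoregressive interference model can be illustrated directly. This is a minimal sketch, not the gravimeter algorithm itself: the AR(2) coefficients, the unit innovation variance, and the straight-line signal model are illustrative assumptions, and edge effects at the first two samples are ignored.

```python
import numpy as np

def ar2_whitener(n, a1, a2):
    # L maps an AR(2) sequence y[t] = a1*y[t-1] + a2*y[t-2] + e[t]
    # to (approximately) white noise, ignoring edge effects at t = 0, 1.
    L = np.eye(n)
    for t in range(1, n):
        L[t, t - 1] = -a1
        if t >= 2:
            L[t, t - 2] = -a2
    return L

n, a1, a2 = 8, 0.6, -0.2
L = ar2_whitener(n, a1, a2)
P = L.T @ L  # inverse covariance (up to the innovation variance)

# P is a band matrix: entries vanish beyond the second off-diagonal,
# and its elements are polynomial in the AR coefficients a1, a2.
assert np.allclose(np.triu(P, 3), 0)

# GLS fit of a straight line through AR(2)-correlated observations:
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
y = X @ np.array([1.0, 0.5])
beta = np.linalg.solve(X.T @ P @ X, X.T @ P @ y)
```

Because the inverse correlation matrix is banded, the normal equations X'PX can be accumulated in O(n) operations, which is what makes the GLS approach practical for long gravimeter records.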


2010 ◽  
Vol 62 (4) ◽  
pp. 875-882 ◽  
Author(s):  
A. Dembélé ◽  
J.-L. Bertrand-Krajewski ◽  
B. Barillon

Regression models are among the most frequently used models to estimate pollutant event mean concentrations (EMCs) in wet weather discharges in urban catchments. Two main questions concerning the calibration of EMC regression models are investigated: i) the sensitivity of the models to the size and content of the data sets used for their calibration; ii) the change in modelling results when the models are recalibrated as data sets grow and change over time with newly collected experimental data. Based on an experimental data set of 64 rain events monitored in a densely urbanised catchment, four TSS EMC regression models (two log-linear and two linear) with two or three explanatory variables have been derived and analysed. Model calibration with the iteratively reweighted least squares method is less sensitive and leads to more robust results than the ordinary least squares method. Three calibration options have been investigated: two accounting for the chronological order of the observations, and one using random samples of events from the whole available data set. Results obtained with the best performing nonlinear model clearly indicate that the model is highly sensitive to the size and content of the data set used for its calibration.
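The robustness of iteratively reweighted least squares (IRLS) over OLS can be sketched as follows. This is an illustrative example with Huber weights, not the study's EMC models: the simulated data, the tuning constant c = 1.345, and the MAD-based scale estimate are all assumptions for the sketch.

```python
import numpy as np

def irls(X, y, c=1.345, iters=50):
    # Huber-weighted IRLS: start from OLS, then downweight large residuals.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12  # robust scale (MAD)
        u = r / (s * c)
        w = np.where(np.abs(u) <= 1, 1.0, 1.0 / np.abs(u))  # Huber weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
X = np.column_stack([np.ones_like(x), x])
y = 2.0 + 0.8 * x + rng.normal(0, 0.1, x.size)
y[5] += 30.0  # one gross outlier, e.g. a faulty sample

beta_irls = irls(X, y)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

The outlier pulls the OLS slope well away from 0.8, while the IRLS fit assigns it a near-zero weight and stays close to the true line, which mirrors the study's finding that IRLS calibration is less sensitive to the content of the data set.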


Author(s):  
Warha, Abdulhamid Audu ◽  
Yusuf Abbakar Muhammad ◽  
Akeyede, Imam

Linear regression measures the relationship between two or more variables, known as the dependent and independent variables. The classical least squares method for estimating regression models consists of minimising the sum of the squared residuals. Among the assumptions of the ordinary least squares (OLS) method is that there is no correlation (multicollinearity) between the independent variables. Violation of this assumption arises often in regression analysis and can lead to inefficiency of the least squares method. This study therefore determined the more efficient estimator between Least Absolute Deviation (LAD) and Weighted Least Squares (WLS) in multiple linear regression models at different levels of multicollinearity in the explanatory variables. Simulation experiments were conducted using the R statistical software to investigate the performance of the two estimators when the assumption of no multicollinearity is violated, and their performances were compared at different sample sizes. Finite-sample criteria, namely mean absolute error, absolute bias and mean squared error, were used to compare the methods; the best estimator was selected based on the minimum value of these criteria at a specified level of multicollinearity and sample size. The results showed that LAD was best at all levels of multicollinearity considered and is recommended as an alternative to OLS under this condition. The performance of both estimators deteriorated as the level of multicollinearity increased.
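The inefficiency that multicollinearity induces in least squares can be demonstrated with a small simulation. This is an illustrative sketch of the problem the study varies, not its LAD/WLS comparison or its R code; the sample size, replication count, and correlation levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def slope_variance(rho, n=100, reps=500):
    # Empirical variance of the first OLS coefficient when the two
    # explanatory variables have correlation rho.
    cov = np.array([[1.0, rho], [rho, 1.0]])
    betas = []
    for _ in range(reps):
        X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        y = X @ np.array([1.0, 1.0]) + rng.normal(0, 1, n)
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        betas.append(b[0])
    return np.var(betas)

v_low = slope_variance(0.1)    # mild multicollinearity
v_high = slope_variance(0.95)  # severe multicollinearity
```

Theory predicts the coefficient variance grows like 1/(1 - rho^2), so v_high should be roughly ten times v_low here, consistent with the study's observation that estimator performance deteriorates as multicollinearity increases.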


2020 ◽  
Author(s):  
Mohitosh Kejriwal ◽  
Xuewen Yu

Summary This paper develops a new approach to forecasting a highly persistent time series that employs feasible generalized least squares (FGLS) estimation of the deterministic components in conjunction with Mallows model averaging. Within a local-to-unity asymptotic framework, we derive analytical expressions for the asymptotic mean squared error and one-step-ahead mean squared forecast risk of the proposed estimator and show that the optimal FGLS weights are different from their ordinary least squares (OLS) counterparts. We also provide theoretical justification for a generalized Mallows averaging estimator that incorporates lag order uncertainty in the construction of the forecast. Monte Carlo simulations demonstrate that the proposed procedure yields a considerably lower finite-sample forecast risk relative to OLS averaging. An application to U.S. macroeconomic time series illustrates the efficacy of the advocated method in practice and finds that both persistence and lag order uncertainty have important implications for the accuracy of forecasts.
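The FGLS estimation of deterministic components in a persistent series can be sketched with a two-step, Cochrane-Orcutt style procedure. This is a minimal illustration under assumed data-generating values, not the paper's procedure: the Mallows model-averaging weights and lag-order averaging that are central to the paper are omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)
n, rho_true = 200, 0.9

# Highly persistent AR(1) errors around a deterministic linear trend.
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho_true * e[t - 1] + rng.normal()
t_idx = np.arange(n, dtype=float)
y = 1.0 + 0.05 * t_idx + e
X = np.column_stack([np.ones(n), t_idx])

# Step 1: OLS on the trend, then estimate rho from the residuals
# (this is what makes the GLS "feasible").
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
r = y - X @ b_ols
rho_hat = (r[:-1] @ r[1:]) / (r[:-1] @ r[:-1])

# Step 2: quasi-difference the data and refit by OLS, which equals
# GLS with the estimated rho (initial observation dropped).
ys = y[1:] - rho_hat * y[:-1]
Xs = X[1:] - rho_hat * X[:-1]
b_fgls = np.linalg.lstsq(Xs, ys, rcond=None)[0]
```

In the paper's local-to-unity framework, rho is modelled as drifting toward 1 with the sample size, which is why the optimal averaging weights for the FGLS estimator differ from their OLS counterparts.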


1985 ◽  
Vol 15 (1) ◽  
pp. 23-28 ◽  
Author(s):  
T. Cunia ◽  
R. D. Briggs

The generalized least squares procedure is applied to sample tree data for which additive biomass tables are required. This procedure is proposed as an alternative to ordinary weighted least squares in order to account for the fact that several biomass components are measured on the same sample trees. The biomass tables generated by the generalized and the ordinary least squares methods are very similar; the confidence intervals are sometimes wider and sometimes narrower, but the prediction intervals are always narrower for the generalized least squares method.


2021 ◽  
Vol 6 (11) ◽  
pp. 11850-11878
Author(s):  
SidAhmed Benchiha ◽  
◽  
Amer Ibrahim Al-Omari ◽  
Naif Alotaibi ◽  
Mansour Shrahili ◽  
...  

Recently, a new lifetime distribution known as the generalized Quasi Lindley distribution (GQLD) was suggested. In this paper, we modify the GQLD and suggest a two-parameter lifetime distribution called the weighted generalized Quasi Lindley distribution (WGQLD). The main mathematical properties of the WGQLD, including the moments, coefficient of variation, coefficient of skewness, coefficient of kurtosis, stochastic ordering, median deviation, harmonic mean, and reliability functions, are derived. The model parameters are estimated using the ordinary least squares, weighted least squares, maximum likelihood, maximum product of spacings, Anderson-Darling and Cramér-von Mises methods. The performances of the proposed estimators are compared based on numerical calculations for various values of the distribution parameters and sample sizes in terms of the mean squared error (MSE) and estimated values (Es). To demonstrate the applicability of the new model, four real data sets, consisting of COVID-19 infection cases in Algeria and Saudi Arabia, carbon fibers, and rainfall, are analyzed for illustration. It turns out that the WGQLD fits these data empirically better than the other competing distributions considered in this study.


Author(s):  
Parisa Torkaman

The generalized inverted exponential distribution is introduced as a lifetime model with good statistical properties. In this paper, the estimation of its probability density function and cumulative distribution function with five different estimation methods, namely the uniformly minimum variance unbiased (UMVU), maximum likelihood (ML), least squares (LS), weighted least squares (WLS) and percentile (PC) estimators, is considered. The performance of these estimation procedures is compared by numerical simulations based on the mean squared error (MSE). The simulation studies show that the UMVU estimator performs better than the others, and that when the sample size is large enough the ML and UMVU estimators are almost equivalent and more efficient than the LS, WLS and PC estimators. Finally, a real data set is analyzed for illustration.


2008 ◽  
Vol 24 (5) ◽  
pp. 1456-1460 ◽  
Author(s):  
Hailong Qian

In this note, based on the generalized method of moments (GMM) interpretation of the usual ordinary least squares (OLS) and feasible generalized least squares (FGLS) estimators of seemingly unrelated regressions (SUR) models, we show that the OLS estimator is asymptotically as efficient as the FGLS estimator if and only if the cross-equation orthogonality condition is redundant given the within-equation orthogonality condition. Using the condition for redundancy of moment conditions of Breusch, Qian, Schmidt, and Wyhowski (1999, Journal of Econometrics 99, 89–111), we then derive the necessary and sufficient condition for the equal asymptotic efficiency of the OLS and FGLS estimators of SUR models. We also provide several useful sufficient conditions for the equal asymptotic efficiency of OLS and FGLS estimators that can be interpreted as various mixings of the two famous sufficient conditions of Zellner (1962, Journal of the American Statistical Association 57, 348–368).
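One of Zellner's classic sufficient conditions for OLS and (F)GLS to coincide in SUR models can be checked numerically: when every equation shares the same regressor matrix, stacked GLS reduces exactly to equation-by-equation OLS. This is an illustrative verification with assumed simulated data and a known error covariance, not the note's GMM redundancy argument.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 30

# Same regressor matrix X in both equations (Zellner's condition).
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Sigma = np.array([[1.0, 0.7], [0.7, 2.0]])  # cross-equation error covariance
E = rng.multivariate_normal([0.0, 0.0], Sigma, size=n)
Y = np.column_stack([X @ [1.0, 2.0], X @ [3.0, -1.0]]) + E

# Equation-by-equation OLS (one column of coefficients per equation).
b_ols = np.linalg.lstsq(X, Y, rcond=None)[0]

# Stacked GLS with known Sigma: Omega = Sigma kron I_n for
# equation-major stacking of the observations.
Xs = np.kron(np.eye(2), X)          # block-diagonal stacked design
ys = Y.T.reshape(-1)                # stack equation 1, then equation 2
Om_inv = np.kron(np.linalg.inv(Sigma), np.eye(n))
b_gls = np.linalg.solve(Xs.T @ Om_inv @ Xs, Xs.T @ Om_inv @ ys)
```

With distinct regressor matrices across equations (and non-diagonal Sigma) the two estimators would generally differ, which is where the redundancy-of-moment-conditions characterization in the note becomes the sharper tool.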


2018 ◽  
Vol 15 (4) ◽  
pp. 356-372 ◽  
Author(s):  
Marcia Martins Mendes De Luca ◽  
Paulo Henrique Nobre Parente ◽  
Emanoel Mamede Sousa Silva ◽  
Ravena Rodrigues Sousa

Purpose - Following the tenets of the resource-based view, the present study aims to investigate the effect of a creative corporate culture, according to the competing values framework model, on the level of corporate intangibility and its repercussions on performance.

Design/methodology/approach - The sample included 117 non-USA foreign firms traded on the New York Stock Exchange (NYSE) which issued annual financial reports between 2009 and 2014 using the 20-F form. To meet the study objectives, in addition to descriptive and comparative analyses, the authors performed regression analyses with panel data, estimating generalized least squares, two-stage least squares and ordinary least squares models.

Findings - Creative culture had a negative effect on the level of intangibility and on corporate performance, while the level of intangibility did not appear to influence corporate performance. When combined, creative culture and intangibility had a potentially negative effect on corporate results. In conclusion, a creative corporate culture, characterized by elements such as experimentation and innovation, had a negative effect on performance even in firms with higher levels of intangibility.

Originality/value - Although the study hypotheses were ultimately rejected, the analyses are relevant to both academia and the market because of the organizational and institutional aspects evaluated, especially in relation to intangibility and creative culture, and in view of the unique cross-cultural approach adopted. Within the corporate setting, the study provides stakeholders with tools to identify the profile of foreign firms traded on the NYSE.

