Improved Dividend Estimation from Intraday Quotes

Entropy ◽  
2022 ◽  
Vol 24 (1) ◽  
pp. 95
Author(s):  
Pontus Söderbäck ◽  
Jörgen Blomvall ◽  
Martin Singull

Liquid financial markets, such as the options market of the S&P 500 index, create vast amounts of data every day, so-called intraday data. However, this highly granular data is often reduced to a single daily observation when used to estimate financial quantities. This under-utilization of the data may reduce the quality of the estimates. In this paper, we study the impact on estimation quality when using intraday data to estimate dividends. The methodology is based on earlier linear regression (ordinary least squares) estimates, which have been adapted to intraday data. The method is also generalized in two respects. First, the dividends are expressed as present values of future dividends rather than as dividend yields. Second, to account for heteroscedasticity, the estimation is formulated as a weighted least squares problem, where the weights are determined from the market data. This method is compared with a traditional method on out-of-sample S&P 500 European options market data. The results show that estimates based on intraday data are, with statistical significance, of higher quality than the corresponding single-time estimates. Additionally, the two generalizations of the methodology are shown to improve the estimation quality further.
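As a rough illustration of the weighted least squares step described above, the sketch below recovers the present value of dividends from put-call parity, C - P = (S - PV(D)) - e^(-rT) K, by regressing the call-put price difference on strike across a set of quotes. The function name and the use of inverse squared bid-ask spreads as weights are illustrative assumptions, not the authors' exact weighting scheme.

```python
import numpy as np

def implied_dividend_wls(call_mid, put_mid, strikes, spot, spreads):
    """Recover PV of dividends from put-call parity,
        C - P = (S - PV(D)) - D_T * K,
    where D_T = exp(-r*T) is the discount factor.  Weighting by
    inverse squared bid-ask spreads is a hypothetical choice."""
    y = call_mid - put_mid                          # parity left-hand side
    X = np.column_stack([np.ones_like(strikes), -strikes])
    w = 1.0 / spreads**2                            # tighter quotes get more weight
    Xw = X * w[:, None]                             # fold weights into the design
    intercept, discount = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return spot - intercept, discount               # intercept = S - PV(D)
```

Repeating the fit on every intraday snapshot, rather than one end-of-day quote set, is the kind of richer use of the data the paper evaluates.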

1986 ◽  
Vol 16 (2) ◽  
pp. 249-255 ◽  
Author(s):  
Edwin J. Green ◽  
William E. Strawderman

A Stein-rule estimator, which shrinks least squares estimates of regression parameters toward their weighted average, was employed to estimate the coefficient in the constant form factor volume equation for 18 species simultaneously. The Stein-rule procedure was applied to ordinary least squares estimates and weighted least squares estimates. Simulation tests on independent validation data sets revealed that the Stein-rule estimates were biased, but predicted better than the corresponding least squares estimates. The Stein-rule procedures also yielded lower estimated mean square errors for the volume equation coefficient than the corresponding least squares procedure. Different methods of withdrawing sample data from the total sample available for each species revealed that the superiority of Stein-rule procedures over least squares decreased as the sample size increased and that the Stein-rule procedures were robust to unequal sample sizes, at least on the scale studied here.
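For readers unfamiliar with shrinkage toward a weighted average, the sketch below shows a generic positive-part James-Stein rule of that shape, applied to per-species coefficient estimates. It is illustrative only, under textbook assumptions, and not the authors' exact estimator for the volume equation.

```python
import numpy as np

def stein_shrink(beta, var):
    """Positive-part James-Stein shrinkage of k independent least
    squares estimates toward their precision-weighted average.
    Generic textbook rule (requires k >= 4); not the authors'
    exact estimator."""
    beta, var = np.asarray(beta, float), np.asarray(var, float)
    k = beta.size
    w = 1.0 / var                                   # precision weights
    center = np.sum(w * beta) / np.sum(w)           # shrinkage target
    resid = beta - center
    s2 = np.sum(w * resid**2)                       # standardized spread
    shrink = max(0.0, 1.0 - (k - 3) / s2)           # positive-part factor
    return center + shrink * resid
```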


Author(s):  
Michael A. Gebers

Since 1964 the California Department of Motor Vehicles has issued several monographs on driver characteristics and accident risk factors as part of a series of analyses known as the California driver record study. A number of regression analyses were conducted on driving-record variables measured over a 6-year period (1986 to 1991). The techniques presented consist of ordinary least squares, weighted least squares, Poisson, negative binomial, linear probability, and logistic regression models. The objective of the analyses was to compare the results obtained from several different regression techniques under consideration for use in the in-progress California driver record study. The results are informative in determining whether the various regression methods produce similar results for different sample sizes and in exploring whether reliance on ordinary least squares techniques in past California driver record study analyses has produced biased significance levels and parameter estimates. The results indicate that, for these data, the use of the different regression techniques does not lead to any greater increase in individual accident prediction beyond that obtained through application of ordinary least squares regression. The methods produce almost identical results in terms of the relative importance and statistical significance of the independent variables. It therefore appears safe to employ ordinary least squares multiple regression techniques on driver accident count distributions of the type represented by California driver records, at least when sample sizes are large.
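The kind of comparison the abstract describes can be reproduced in spirit with statsmodels. The sketch below fits OLS, Poisson, and negative binomial models to simulated count data (the study's driver records are not public, so the covariates here are hypothetical) and prints the coefficient and p-value patterns one would compare across methods.

```python
import numpy as np
import statsmodels.api as sm

# Simulated accident-count data stand in for the (non-public) driver
# records; three hypothetical covariates plus an intercept.
rng = np.random.default_rng(0)
n = 5000
X = sm.add_constant(rng.normal(size=(n, 3)))
y = rng.poisson(np.exp(X @ np.array([-1.5, 0.4, 0.2, 0.1])))

fits = {
    "OLS": sm.OLS(y, X).fit(),
    "Poisson": sm.GLM(y, X, family=sm.families.Poisson()).fit(),
    "NegBin": sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit(),
}
# With a large sample, the sign and significance pattern of the
# coefficients tends to agree across models, echoing the finding above.
for name, fit in fits.items():
    print(name, np.round(fit.params, 3), np.round(fit.pvalues, 3))
```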


2009 ◽  
Vol 12 (03) ◽  
pp. 297-317 ◽  
Author(s):  
ANOUAR BEN MABROUK ◽  
HEDI KORTAS ◽  
SAMIR BEN AMMOU

In this paper, fractional integration dynamics in the return and volatility series of stock market indices are investigated. The investigation is conducted using wavelet ordinary least squares, wavelet weighted least squares and the approximate maximum likelihood estimator. It is shown that the long memory property in stock returns is associated mainly with emerging markets rather than developed ones, while strong evidence of long-range dependence is found for all volatility series. The relevance of the wavelet-based estimators, especially the approximate maximum likelihood and the weighted least squares techniques, is demonstrated in terms of stability and estimation accuracy.
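A common wavelet OLS estimator of the long-memory parameter d regresses the log of the wavelet-coefficient variance on scale. The Haar-based sketch below illustrates that regression; it is a minimal stand-in, assuming detail variance at level j grows like 2^(j(2d-1)), and is not the authors' exact estimator.

```python
import numpy as np

def haar_step(a):
    """One Haar DWT level: returns (approximation, detail)."""
    a = a[: a.size // 2 * 2]
    return (a[0::2] + a[1::2]) / np.sqrt(2.0), (a[0::2] - a[1::2]) / np.sqrt(2.0)

def wavelet_ols_d(x, levels=6):
    """OLS estimate of the long-memory parameter d from the slope of
    log2 detail variance against level j, assuming that variance
    grows like 2**(j * (2*d - 1)).  Minimal Haar sketch, not the
    authors' exact wavelet estimator."""
    a = np.asarray(x, float)
    js, logs = [], []
    for j in range(1, levels + 1):
        a, d = haar_step(a)
        if d.size < 2:
            break
        js.append(j)
        logs.append(np.log2(np.mean(d**2)))
    slope = np.polyfit(js, logs, 1)[0]              # OLS in log-scale space
    return (slope + 1.0) / 2.0                      # invert slope = 2d - 1
```

The wavelet weighted least squares variant replaces the plain `np.polyfit` step with a fit that downweights the coarse levels, where fewer coefficients make the variance estimates noisier.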


2009 ◽  
Vol 2009 ◽  
pp. 1-8 ◽  
Author(s):  
Janet Myhre ◽  
Daniel R. Jeske ◽  
Michael Rennie ◽  
Yingtao Bi

A heteroscedastic linear regression model is developed from plausible assumptions that describe the time evolution of performance metrics for equipment. The inherent motivation for the related weighted least squares analysis of the model is an essential and attractive selling point to engineers with an interest in equipment surveillance methodologies. A simple test for the significance of the heteroscedasticity suggested by a data set is derived, and a simulation study is used to evaluate the power of the test and compare it with several other applicable tests that were designed in different contexts. Tolerance intervals within the context of the model are derived, thus generalizing well-known tolerance intervals for ordinary least squares regression. Use of the model and its associated analyses is illustrated with an aerospace application in which hundreds of electronic components are continuously monitored by an automated system that flags components suspected of unusual degradation patterns.
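The paper derives its own heteroscedasticity test; as a familiar point of reference among the "other applicable tests" it is compared with, the sketch below implements the standard Breusch-Pagan LM test, which regresses scaled squared OLS residuals on the covariates.

```python
import numpy as np
from scipy import stats

def breusch_pagan(y, X):
    """Breusch-Pagan LM test for heteroscedasticity: regress scaled
    squared OLS residuals on the covariates and refer half the
    explained sum of squares to a chi-square.  X is assumed to
    include a constant column."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS fit
    e2 = (y - X @ beta) ** 2
    g = e2 / e2.mean()                              # scaled squared residuals
    gamma = np.linalg.lstsq(X, g, rcond=None)[0]    # auxiliary regression
    lm = 0.5 * np.sum((X @ gamma - g.mean()) ** 2)  # LM statistic
    return lm, stats.chi2.sf(lm, p - 1)             # df = number of slopes
```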


2014 ◽  
Vol 71 (1) ◽  
Author(s):  
Bello Abdulkadir Rasheed ◽  
Robiah Adnan ◽  
Seyed Ehsan Saffari ◽  
Kafi Dano Pati

In a linear regression model, the ordinary least squares (OLS) method is considered the best method for estimating the regression parameters when its assumptions are met. However, if the data do not satisfy the underlying assumptions, the results will be misleading. The violation of the constant-variance assumption in least squares regression is caused by the presence of outliers and heteroscedasticity in the data. This assumption of constant variance (homoscedasticity) is very important in linear regression, since under it the least squares estimators enjoy the property of minimum variance. A robust regression method is therefore required to handle outliers in the data. This research uses weighted least squares (WLS) techniques to estimate the regression coefficients when the assumption of constant error variance is violated. WLS estimation is equivalent to carrying out OLS on transformed variables, but it can easily be affected by outliers. To remedy this, we suggest a robust technique for estimating the regression parameters in the presence of heteroscedasticity and outliers. We apply robust M-estimation using iteratively reweighted least squares (IRWLS) with the Huber and Tukey bisquare functions, together with the resistant least trimmed squares estimator, to estimate the model parameters for state-wide crime data of the United States in 1993. The outcomes of the study indicate that the estimators obtained from the M-estimation techniques and the least trimmed squares method are more effective than those obtained from OLS.
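The IRWLS scheme with a Huber weight function can be sketched in a few lines. The version below uses the median absolute deviation for the robust scale and the conventional tuning constant c = 1.345; it is a minimal sketch of one branch of the study, not the full pipeline (the Tukey bisquare and least trimmed squares variants are handled analogously).

```python
import numpy as np

def huber_irwls(X, y, c=1.345, iters=50, tol=1e-8):
    """M-estimation by iteratively reweighted least squares with
    Huber weights and a MAD scale estimate.  Minimal sketch; the
    Tukey bisquare weight function would slot in the same way."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS starting values
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale
        u = r / (s + 1e-12)
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))  # Huber weights
        Xw = X * w[:, None]
        beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y)    # one WLS step
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```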


1985 ◽  
Vol 15 (2) ◽  
pp. 331-340 ◽  
Author(s):  
T. Cunia ◽  
R. D. Briggs

To construct biomass tables for various tree components that are consistent with each other, one may use linear regression techniques with dummy variables. When the biomass of these components is measured on the same sample trees, one should also use the generalized rather than the ordinary least squares method. A procedure is shown which allows the estimation of the covariance matrix of the sample biomass values and circumvents the problem of storing and inverting large covariance matrices. Applied to 20 sets of sample tree data, the generalized least squares regressions generated estimates which, on average, were slightly higher (about 1%) than the sample data. The confidence and prediction bands about the regression function were wider, sometimes considerably wider, than those estimated by ordinary weighted least squares.
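Given an estimated covariance matrix V of the component biomass measurements, a minimal generalized least squares fit can be written via Cholesky whitening, which also avoids forming V inverse explicitly, in the spirit of the storage and inversion concern raised above. The covariance estimation procedure that is the paper's actual contribution is not reproduced here.

```python
import numpy as np

def gls(X, y, V):
    """Generalized least squares via Cholesky whitening: OLS on
    L^-1 X and L^-1 y, where V = L L'.  Equivalent to minimizing
    (y - X b)' V^-1 (y - X b) without forming V^-1."""
    L = np.linalg.cholesky(V)
    Xw = np.linalg.solve(L, X)                      # whitened design
    yw = np.linalg.solve(L, y)                      # whitened response
    return np.linalg.lstsq(Xw, yw, rcond=None)[0]
```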


2007 ◽  
Vol 40 (4) ◽  
pp. 694-701
Author(s):  
Thaung Lwin

Knudsen [X-ray Spectrom. (1981), 10, 54–56] proposed and demonstrated a least-squares approach to estimating the unknown parameters of a system of equations required for calibration in X-ray diffraction analysis. The approach is an ordinary least-squares approach which does not incorporate information on the errors of the measured intensities for a set of samples used as standards. The purpose of the present paper is to show that a functional relationship model can be applied to the problem to account for all the variation due to sampling and measurement error in the peak intensities. It is also shown that Knudsen's calibration estimator can be regarded as an approximation to a more general and potentially more efficient weighted least-squares estimator derived from the functional relationship model. The closeness of the approximation depends on the nature of the covariance structure of the intensity measurements.
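A weighted least-squares calibration fit of the kind discussed above has a familiar closed form. The sketch below weights each standard by the inverse variance of its measured peak intensity; this diagonal weighting is a simplification, since the full functional relationship model also accounts for the covariance structure of the measurements.

```python
import numpy as np

def wls_calibration(concentration, intensity, var_intensity):
    """Fit the calibration line by weighted least squares, weighting
    each standard by the inverse variance of its measured peak
    intensity.  Simplified: the full functional relationship model
    also uses the covariance structure of the measurements."""
    X = np.column_stack([np.ones_like(concentration), concentration])
    w = 1.0 / var_intensity                         # inverse-variance weights
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ intensity)  # (intercept, slope)
```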

