Progress towards a rigorous error propagation for total least-squares estimates

2020, Vol. 14 (2), pp. 159-166
Author(s): Burkhard Schaffrin, Kyle Snow

Abstract: After several attempts at a formal derivation of the dispersion matrix for Total Least-Squares (TLS) estimates within an Errors-In-Variables (EIV) Model, a refined approach is presented here that makes rigorous use of the nonlinear normal equations, though at this point assuming a Kronecker product structure for both observational dispersion matrices. In this way, iterative linearization of a model (that can be established as being equivalent to the original EIV-Model) is avoided, which may be preferable since such techniques are based on the last iteration step only and, therefore, produce dispersion matrices for the estimated parameters that are generally too optimistic. Here, the error propagation is based on the (linearized total differential of the) exact nonlinear normal equations, which should lead to more trustworthy measures of precision.
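
A first-order sketch of the idea, in generic notation (the symbols below are illustrative and not taken from the paper): if the estimate \hat{\xi} is defined implicitly by nonlinear normal equations n(\hat{\xi}, \ell) = 0 in the observations \ell with dispersion \Sigma_{\ell\ell}, the linearized total differential yields

\[
0 \approx \frac{\partial n}{\partial \hat{\xi}}\, d\hat{\xi} + \frac{\partial n}{\partial \ell}\, d\ell
\quad\Rightarrow\quad
d\hat{\xi} \approx -\Big(\frac{\partial n}{\partial \hat{\xi}}\Big)^{-1} \frac{\partial n}{\partial \ell}\, d\ell ,
\]
\[
D\{\hat{\xi}\} \approx
\Big(\frac{\partial n}{\partial \hat{\xi}}\Big)^{-1} \frac{\partial n}{\partial \ell}\,
\Sigma_{\ell\ell}\,
\Big(\frac{\partial n}{\partial \ell}\Big)^{\!\mathsf T}
\Big(\frac{\partial n}{\partial \hat{\xi}}\Big)^{\!-\mathsf T},
\]

with all partial derivatives evaluated at the converged solution, so that no single intermediate iteration step is singled out.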

1965, Vol. 19 (1), pp. 78-83
Author(s): Peter Wilson

Several methods for solving the normal equations in least-squares solutions are explained, and the variance-covariance matrix is developed from the law of error propagation.
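
As context, a minimal sketch of the textbook relations the abstract refers to (generic code, not taken from the article): the normal equations are solved and the observational errors are propagated into a variance-covariance matrix of the estimated parameters.

import numpy as np

# Observation equations y = A x + e with weight matrix P; textbook relations only.
def least_squares_with_covariance(A, y, P):
    N = A.T @ P @ A                              # normal-equation matrix
    c = A.T @ P @ y                              # right-hand side
    x_hat = np.linalg.solve(N, c)                # estimated parameters
    e_hat = y - A @ x_hat                        # residuals
    n_obs, n_par = A.shape
    sigma0_sq = (e_hat @ P @ e_hat) / (n_obs - n_par)   # a-posteriori variance factor
    Q_xx = sigma0_sq * np.linalg.inv(N)          # covariance via error propagation
    return x_hat, Q_xx

# Example: fitting y = a + b*t to noisy data with equal weights.
t = np.linspace(0.0, 1.0, 20)
A = np.column_stack([np.ones_like(t), t])
y = 1.0 + 2.0 * t + 0.01 * np.random.randn(t.size)
x_hat, Q_xx = least_squares_with_covariance(A, y, np.eye(t.size))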


Mathematics, 2020, Vol. 8 (9), pp. 1450
Author(s): Georgios Malissiovas, Frank Neitzel, Sven Weisbrich, Svetozar Petrovic

In this contribution the fitting of a straight line to 3D point data is considered, with Cartesian coordinates x_i, y_i, z_i as observations subject to random errors. A direct solution for the case of equally weighted and uncorrelated coordinate components was already presented almost forty years ago. For more general weighting cases, iterative algorithms, e.g., by means of an iteratively linearized Gauss–Helmert (GH) model, have been proposed in the literature. In this investigation, a new direct solution for the case of pointwise weights is derived. In the terminology of total least squares (TLS), this solution is a direct weighted total least squares (WTLS) approach. For the most general weighting case, considering a full dispersion matrix of the observations that can even be singular to some extent, a new iterative solution based on the ordinary iteration method is developed. The latter is a new iterative WTLS algorithm, since no linearization of the problem by Taylor series is performed at any step. Using a numerical example it is demonstrated how the newly developed WTLS approaches can be applied for 3D straight line fitting considering different weighting cases. The solutions are compared with results from the literature and with those obtained from an iteratively linearized GH model.
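
For orientation, the classic direct solution for the equally weighted, uncorrelated special case mentioned above can be sketched as follows (centroid plus dominant singular vector); this is not the authors' weighted TLS algorithms, only the simple baseline case.

import numpy as np

# Direct fit of a 3D straight line to equally weighted, uncorrelated points:
# the line passes through the centroid along the direction of largest spread.
def fit_line_3d(points):
    """points: (n, 3) array of x_i, y_i, z_i; returns (point on line, unit direction)."""
    centroid = points.mean(axis=0)
    # Best-fit direction = right singular vector with the largest singular value.
    _, _, Vt = np.linalg.svd(points - centroid, full_matrices=False)
    direction = Vt[0]
    return centroid, direction

# Synthetic example with small isotropic noise.
rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 50)
true_dir = np.array([1.0, 2.0, -1.0]) / np.linalg.norm([1.0, 2.0, -1.0])
pts = np.array([0.5, -1.0, 2.0]) + np.outer(t, true_dir) + 0.05 * rng.standard_normal((50, 3))
p0, d = fit_line_3d(pts)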


Author(s): Burkhard Schaffrin, Sibel Uzun

Reliability has been quantified in a simple Gauss–Markov model (GMM) by Baarda (1968) for the application to geodetic networks as the potential to detect outliers – with a specified significance and power – by testing the least-squares residuals for their zero expectation property after an adjustment assuming “no outliers”. It was shown that, under homoscedastic conditions, the so-called “redundancy numbers” could very well serve as indicators for the “local reliability” of an (individual) observation. In contrast, the maximum effect of any undetectable outlier on the estimated parameters would indicate “global reliability”. This concept had been extended successfully to the case of correlated observations by Schaffrin (1997) quite a while ago. However, no attempt has been made so far to extend Baarda’s results to the (homoscedastic) errors-in-variables (EIV) model for which Golub and van Loan (1980) had found their – now famous – algorithm to generate the total least-squares (TLS) solution, together with all the residuals. More recently, this algorithm has been generalized by Schaffrin and Wieser (2008) to the case where a truly – not just elementwise – weighted TLS solution can be computed when the covariance matrix has the structure of a Kronecker–Zehfuss product. Here, an attempt will be made to define reliability measures within such an EIV-model, in analogy to Baarda’s original approach.
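
As a reminder of the starting point, Baarda-style redundancy numbers in an ordinary GMM can be computed as below (textbook relation only; the paper's contribution is carrying such measures over to the EIV/TLS setting, which is not reproduced here).

import numpy as np

# Redundancy numbers r_i = diag(I - A (A^T P A)^{-1} A^T P); they sum to n - u.
def redundancy_numbers(A, P):
    N_inv = np.linalg.inv(A.T @ P @ A)
    R = np.eye(A.shape[0]) - A @ N_inv @ A.T @ P
    return np.diag(R)

# Small homoscedastic example: r_i near 1 means the observation is well checked
# by the others, r_i near 0 means an outlier there would be hard to detect.
A = np.column_stack([np.ones(5), np.arange(5.0)])
P = np.eye(5)
r = redundancy_numbers(A, P)
assert np.isclose(r.sum(), A.shape[0] - A.shape[1])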


2008, Vol. 13 (1), pp. 55-66
Author(s): J. Lampe, H. Voss

The total least squares (TLS) method is a successful approach for linear problems if both the matrix and the right-hand side are contaminated by some noise. In a recent paper Sima, Van Huffel and Golub suggested an iterative method for solving regularized TLS problems, where in each iteration step a quadratic eigenproblem has to be solved. In this paper we prove its global convergence, and we present an efficient implementation using an iterative projection method with thick updates.
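
For context, the basic (unregularized) TLS solution can be obtained directly from the SVD of the augmented matrix [A | b]; the sketch below shows only this baseline and not the regularized quadratic-eigenproblem solver discussed in the paper.

import numpy as np

# Basic TLS via the SVD: take the right singular vector belonging to the
# smallest singular value of [A | b] and rescale it.
def tls(A, b):
    """Solve A x ~ b when both A and b carry noise (plain TLS, no regularization)."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]), full_matrices=False)
    v = Vt[-1]                      # singular vector of the smallest singular value
    if np.isclose(v[n], 0.0):
        raise ValueError("generic TLS solution does not exist")
    return -v[:n] / v[n]

# Example with noise in both the matrix and the right-hand side.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(30)
x_tls = tls(A + 0.01 * rng.standard_normal(A.shape), b)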

