Integral constraints on the accuracy of least-squares estimation
Written by Brett Ninness
- Publisher
- Elsevier Science
- Year
- 1996
- Language
- English
- Weight
- 823 KB
- Volume
- 32
- Category
- Article
- ISSN
- 0005-1098
Synopsis
It is common to need to estimate the frequency response of a system from observed input-output data. This paper uses integral constraints to characterise the undermodelling-induced errors involved in solving this problem via parametric least-squares methods. This is achieved by exploiting the Hilbert-space structure inherent in the least-squares solution in order to provide a geometric interpretation of the nature of frequency-domain errors. The result is that an intuitive process can be applied in which, for a given data collection method and model structure, one identifies the sides of a right triangle, and then, by noting the hypotenuse to be the longest side, integral constraints on the magnitude estimation error are obtained. By also noting that the triangle sides both lie in a particular plane, integral constraints on phase estimation error are derived. This geometric approach is in contrast to earlier work in this area, which has relied on algebraic manipulation.
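To make the setting concrete, here is a rough numerical sketch (not the paper's own derivation): a least-squares fit of a deliberately undermodelled FIR model to input-output data, followed by evaluation of the resulting frequency-domain error. The system coefficients, model order, and data length are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" system: a 4-tap FIR impulse response.
g_true = np.array([0.5, 0.3, 0.15, 0.05])

# Simulated input-output data (white-noise input, no measurement noise).
N = 500
u = rng.standard_normal(N)
y = np.convolve(u, g_true)[:N]

# Undermodelled fit: least-squares estimate of only a 2-tap FIR model.
m = 2
Phi = np.column_stack(
    [np.concatenate([np.zeros(k), u[:N - k]]) for k in range(m)]
)
g_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Frequency responses on a grid; the gap between them is the
# undermodelling-induced estimation error discussed in the synopsis.
w = np.linspace(0.0, np.pi, 256)

def freq_resp(g):
    k = np.arange(len(g))
    return np.array([np.sum(g * np.exp(-1j * k * wk)) for wk in w])

err = np.abs(freq_resp(g_true) - freq_resp(g_hat))
print(f"max frequency-domain magnitude error: {err.max():.3f}")
```

With a white input, the least-squares fit recovers the first two taps almost exactly, and the frequency-domain error is dominated by the truncated tail of the impulse response, i.e. by the undermodelling rather than by noise.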
SIMILAR VOLUMES
Abstract: Every linear parameter estimation problem gives rise to an overdetermined set of linear equations AX ≈ B which is usually solved with the ordinary least squares (LS) method. Often, both A and B are inaccurate. For these cases, a more general fitting technique, called total least squares (TLS) …
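The classical TLS solution alluded to in this abstract can be computed from the SVD of the augmented matrix [A | B]. The sketch below uses the standard block-of-V formula; the toy data and variable names are my own, not from the cited work.

```python
import numpy as np

def tls(A, B):
    """Total least squares solution of AX ≈ B via the SVD of [A | B].

    Unlike ordinary least squares, TLS perturbs both A and B (minimally,
    in Frobenius norm) to make the overdetermined system consistent.
    """
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, B]))
    V = Vt.T
    V12 = V[:n, n:]   # top-right block of V
    V22 = V[n:, n:]   # bottom-right block of V
    return -V12 @ np.linalg.inv(V22)

# Toy check (hypothetical data): with exact data, TLS recovers X.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
B = (A @ x_true).reshape(-1, 1)
x_tls = tls(A, B)
```

When A and B are exact, the smallest singular value of [A | B] is zero and the corresponding right singular vector is proportional to [X; -1], so the formula returns X itself; with noisy data it returns the errors-in-variables fit instead of the ordinary LS fit.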
We consider the problem of estimating the sum of squared error loss \(L=\|\beta-\hat{\beta}\|^{2}\) of the least-squares estimator \(\hat{\beta}\) for \(\beta\), the regression coefficient. The standard estimator \(\ell_{0}\) is the expected value of \(L\). Here the error variance is assumed to be known …