On the role of orthonormality of sensitivity functions in parameter optimization problems
By S.P. Bingulac
- Publisher
- Elsevier Science
- Year
- 1969
- Language
- English
- File size
- 332 KB
- Volume
- 5
- Category
- Article
- ISSN
- 0005-1098
Synopsis
This paper compares the application of two algorithms, the classical gradient and the Gauss-Newton, to parameter optimization, and investigates the influence of the form of the sensitivity functions on the convergence of these algorithms. It is shown that the cases where the classical gradient algorithm fails correspond to the cases where the adjustable parameters are selected in such a manner that the sensitivity functions used in the optimization approach linearly dependent functions. Hence, a suitable scalar measure of the "degree" of linear dependence of the sensitivity functions is defined. By examining the value of this scalar quantity, it is possible in each particular case to conclude in advance whether the convergence of the classical gradient algorithm will be satisfactory.
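The synopsis does not reproduce the paper's scalar measure, but one natural way to quantify the "degree" of linear dependence of a set of sampled sensitivity functions is the determinant of their normalized Gram matrix, which is 1 for mutually orthogonal functions and approaches 0 as they become linearly dependent. The sketch below is an illustration under that assumption (the function names and the choice of determinant are ours, not the paper's):

```python
import numpy as np

def dependence_measure(S):
    """Illustrative 'degree of linear dependence' for sampled functions.

    S: (m, p) array whose p columns are sensitivity functions sampled
    at m points. Returns det of the normalized Gram matrix, in [0, 1]:
    1 means mutually orthogonal, near 0 means nearly dependent.
    """
    # Scale each column to unit norm so the measure is amplitude-free.
    Sn = S / np.linalg.norm(S, axis=0)
    G = Sn.T @ Sn          # normalized Gram matrix (unit diagonal)
    return np.linalg.det(G)

t = np.linspace(0.0, 1.0, 200)

# Orthogonal pair over one full period: measure close to 1.
independent = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])

# Nearly proportional pair: measure close to 0, signalling that a
# classical gradient scheme built on these sensitivities may stall.
nearly_dep = np.column_stack([t, t + 1e-3 * t**2])

print(dependence_measure(independent))
print(dependence_measure(nearly_dep))
```

In this reading, evaluating such a quantity before running the optimization plays the role the synopsis describes: a small value warns in advance that the classical gradient algorithm's convergence is likely to be poor for that parameterization.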
SIMILAR VOLUMES
When a multivariate elliptical distribution is used as the basis in multivariate analysis all fourth-order cumulants are expressed in terms of a single kurtosis parameter. This and other well-known properties place unrealistic restrictions on the distribution of the covariance matrix. In this paper …
Structures are often characterized by parameters, such as mass and stiffness, that are spatially distributed. Parameter identification of distributed structures is subject to many of the difficulties involved in the modelling problem, and the choice of the model can greatly affect the results of the …