Fast adaptive algorithms and networks for class-separability features
Authors: H. Abrishami Moghaddam; Kh. Amiri Zadeh
- Publisher: Elsevier Science
- Year: 2003
- Language: English
- File size: 159 KB
- Volume: 36
- Category: Article
- ISSN: 0031-3203
Synopsis
In this article, we introduce accelerated algorithms for linear discriminant analysis (LDA) and feature extraction from unimodal multiclass Gaussian data. Current adaptive methods based on the gradient descent optimization technique use a fixed or a monotonically decreasing step size in each iteration, which results in a slow convergence rate. Here, we use a variable step size, optimally computed in each iteration using the steepest descent method, in order to accelerate the convergence of the algorithm. Based on the new adaptive algorithm, we present a self-organizing neural network for adaptive computation of the square root of the inverse covariance matrix (Σ^{-1/2}) and use it (i) in a network for optimal feature extraction from Gaussian data and (ii) in cascaded form with a principal component analysis network for LDA. Experimental results demonstrate fast convergence and high stability of the algorithm and justify its advantages for on-line pattern recognition applications with stationary and non-stationary input data.
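To make the idea concrete, below is a minimal Python sketch of adaptive Σ^{-1/2} estimation with a per-iteration step-size search. The stochastic update W ← W + η(I − W x xᵀ W), whose fixed point satisfies W Σ W = I (hence W = Σ^{-1/2} for symmetric positive-definite W), is the classical fixed-step recursion the paper accelerates. The synopsis does not give the paper's closed-form steepest-descent step, so the grid search over η below is an illustrative stand-in, and the synthetic data and all names are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical zero-mean Gaussian source with a known SPD covariance.
d = 4
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)  # well-conditioned SPD matrix

# Reference solution Sigma^{-1/2} via eigendecomposition, used only to check convergence.
evals, evecs = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T

W = np.eye(d) / np.sqrt(np.trace(Sigma) / d)  # rough symmetric initial guess
Sigma_hat = np.eye(d)                         # running covariance estimate

# Candidate step sizes for the per-iteration 1-D search (eta = 0 guards against bad steps).
etas = np.concatenate([[0.0], np.linspace(0.001, 0.5, 50)])

for k in range(1, 5001):
    x = rng.multivariate_normal(np.zeros(d), Sigma)
    Sigma_hat += (np.outer(x, x) - Sigma_hat) / k      # online covariance estimate
    G = np.eye(d) - W @ np.outer(x, x) @ W             # stochastic direction toward W Sigma W = I
    # Variable step size: pick the eta that most reduces the residual ||I - W Sigma_hat W||_F,
    # mimicking the optimal steepest-descent step the synopsis describes.
    eta = min(etas, key=lambda e: np.linalg.norm(
        np.eye(d) - (W + e * G) @ Sigma_hat @ (W + e * G)))
    W = W + eta * G

# Frobenius-norm error; should be small after enough samples.
print(np.linalg.norm(W - Sigma_inv_sqrt))
```

In the paper itself, the optimal step size is computed analytically from the steepest-descent criterion in each iteration; the grid search here merely mimics that behavior for demonstration.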
SIMILAR VOLUMES
We introduce and discuss new accelerated algorithms for linear discriminant analysis (LDA) in unimodal multiclass Gaussian data. These algorithms use a variable step size, optimally computed in each iteration using (i) the steepest descent, (ii) conjugate direction, and (iii) Newton-Raphson methods.