𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Algorithms and networks for accelerated convergence of adaptive LDA

✍ Scribed by H. Abrishami Moghaddam; M. Matinfar; S.M. Sajad Sadough; Kh. Amiri Zadeh


Publisher: Elsevier Science
Year: 2005
Tongue: English
Weight: 331 KB
Volume: 38
Category: Article
ISSN: 0031-3203


✦ Synopsis


We introduce and discuss new accelerated algorithms for linear discriminant analysis (LDA) in unimodal multiclass Gaussian data. These algorithms use a variable step size, optimally computed in each iteration using (i) the steepest descent, (ii) conjugate direction, and (iii) Newton–Raphson methods, in order to accelerate the convergence of the algorithm. Current adaptive methods based on the gradient-descent optimization technique use a fixed or a monotonically decreasing step size in each iteration, which results in a slow convergence rate; moreover, the convergence of these algorithms depends on an appropriate choice of the step size. The new algorithms have the advantage of automatic optimal selection of the step size using the current data samples. Based on the new adaptive algorithms, we present self-organizing neural networks for adaptive computation of Σ^{-1/2} (the inverse square root of the covariance matrix) and use them in cascaded form with a PCA network for LDA. Experimental results demonstrate fast convergence and robustness of the new algorithms and justify their advantages for on-line pattern recognition applications with stationary and non-stationary multidimensional input data.
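To make the step-size idea concrete, here is a minimal NumPy sketch, not the authors' code: it assumes a Chatterjee-style stochastic recursion for Σ^{-1/2}, W_{k+1} = W_k + η_k (I − W_k x_k x_kᵀ W_k), whose fixed point satisfies W Σ W = I. It contrasts the fixed-step version the synopsis criticizes with a variable-step variant that picks η_k per iteration by a one-dimensional search on a surrogate residual. The surrogate criterion, step grid, and function names are illustrative assumptions, not the paper's exact steepest-descent, conjugate-direction, or Newton–Raphson rules.

```python
import numpy as np

def inv_sqrt_cov_fixed_step(samples, eta=0.01):
    """Fixed-step recursion W <- W + eta (I - W x x^T W).
    Its fixed point satisfies W Sigma W = I, so W tends (in expectation)
    toward the symmetric inverse square root Sigma^{-1/2}."""
    d = samples.shape[1]
    W, I = np.eye(d), np.eye(d)
    for x in samples:
        x = x[:, None]                       # column vector
        W = W + eta * (I - W @ x @ x.T @ W)
    return W

def inv_sqrt_cov_line_search(samples, etas=np.linspace(0.001, 0.2, 40)):
    """Variable-step variant: at each iteration, pick the step that most
    reduces the surrogate residual ||I - W x x^T W||_F along the current
    update direction (a stand-in for the paper's optimal step-size rules)."""
    d = samples.shape[1]
    W, I = np.eye(d), np.eye(d)
    for x in samples:
        x = x[:, None]
        D = I - W @ x @ x.T @ W              # update (descent) direction
        # 1-D search over candidate steps along D
        best = min(etas, key=lambda e: np.linalg.norm(
            I - (W + e * D) @ x @ x.T @ (W + e * D)))
        W = W + best * D
    return W

# quick check against the closed-form Sigma^{-1/2} on toy 2-D Gaussian data
rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
X = rng.multivariate_normal(np.zeros(2), Sigma, size=5000)
vals, vecs = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
print(np.round(inv_sqrt_cov_line_search(X), 3))
print(np.round(Sigma_inv_sqrt, 3))
```

On this toy example, the line-search variant approaches the closed-form Σ^{-1/2} without any hand-tuned η, which is the practical point the synopsis makes: computing the step from the current samples removes the tuning burden and speeds convergence on stationary and non-stationary streams.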


📜 SIMILAR VOLUMES


Exponential convergence of products of r
✍ George V. Moustakides 📂 Article 📅 1998 🏛 John Wiley and Sons 🌐 English ⚖ 164 KB 👁 1 view

We introduce a novel methodology for analysing well known classes of adaptive algorithms. Combining recent developments concerning geometric ergodicity of stationary Markov processes and long existing results from the theory of Perturbations of Linear Operators we first study the behaviour and conve