Hebbian learning subspace method: A new approach
By M. Prakash; M. Narasimha Murty
- Publisher
- Elsevier Science
- Year
- 1997
- Language
- English
- File size
- 697 KB
- Volume
- 30
- Category
- Article
- ISSN
- 0031-3203
Synopsis
In this paper, we propose a new learning algorithm for the Subspace Pattern Recognition Method (SPRM) called the Hebbian Learning Subspace Method (HLSM). It uses the notion of a weighted squared orthogonal projection distance, which gives different weightages to different basis vectors in the computation of the orthogonal projection distance. The principle applied during learning is the same as that used in the earlier Learning Subspace Method (LSM): the projection on the wrong subspace is always decreased and the one on the correct subspace is always increased. We also propose a neural implementation for the HLSM. Experiments have been conducted on an extensive set of handprinted numeric characters involving 16659 samples using the SPRM, the HLSM and the Averaged LSM. Excellent results have been obtained using all the subspace methods, thus demonstrating the suitability of subspace methods for this application.
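The weighted squared orthogonal projection distance described in the synopsis can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the weight vector `w`, the `classify` helper, and the toy subspaces are assumptions made here for clarity; with all weights equal to one the measure reduces to the ordinary squared orthogonal projection distance of the SPRM.

```python
import numpy as np

def weighted_projection_distance(x, U, w):
    """Weighted squared orthogonal projection distance of x from the
    subspace spanned by the orthonormal columns of U:
        ||x||^2 - sum_j w_j * (u_j . x)^2
    where w_j is the weightage given to basis vector u_j (assumed form)."""
    coeffs = U.T @ x                       # projection coefficients u_j . x
    return float(x @ x - np.sum(w * coeffs ** 2))

def classify(x, subspaces):
    """Assign x to the class whose (weighted) projection distance is smallest."""
    dists = [weighted_projection_distance(x, U, w) for U, w in subspaces]
    return int(np.argmin(dists))

# Two toy class subspaces in R^3, each spanned by one standard basis vector,
# with all weightages set to 1 (the unweighted SPRM case).
U1 = np.array([[1.0], [0.0], [0.0]])
U2 = np.array([[0.0], [1.0], [0.0]])
subspaces = [(U1, np.array([1.0])), (U2, np.array([1.0]))]

x = np.array([1.0, 0.1, 0.0])
print(classify(x, subspaces))              # x lies almost along e1 -> class 0
```

Larger weightages on a basis vector make deviations from that direction count more heavily toward class membership, which is what lets the HLSM tune the influence of individual basis vectors during learning.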
SIMILAR ARTICLES
A Lyapunov function is constructed for the unsupervised learning equations of a large class of neural networks. These networks have a single layer of adjustable connections; units in the output layer are recurrently connected with fixed symmetric weights. The constructed function is similar in form to
The goal of this article is to propose a new cognitive model that focuses on bottom-up learning of explicit knowledge (i.e., the transformation of implicit knowledge into explicit knowledge). This phenomenon has recently received much attention in empirical research that was not accompanied by a cor