Speedy alternatives to back propagation
By John Moody; Chris Darken
- Publisher: Elsevier Science
- Year: 1988
- Language: English
- File size: 57 KB
- Volume: 1
- Category: Article
- ISSN: 0893-6080
Synopsis
We propose three new neurally-inspired learning algorithms which offer much greater speed and greater biological plausibility than Back Propagation. These algorithms include "Learning With Receptive Fields", a new Self-Organizing Associative Memory, and a new variant of the Cerebellar Model Articulation Controller (CMAC). These new algorithms share one critical feature in common: they utilize only one layer of internal units. Furthermore, the Self-Organizing Associative Memory and the CMAC models require supervised learning of only the output weights. These features result in increased speed.
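The synopsis's key speed claim — one layer of locally-tuned internal units, with supervised learning applied only to the output weights — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual method: the Gaussian receptive-field form, the shared width, and the use of random subsampling in place of an unsupervised clustering step are all modeling choices made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) on [0, 2*pi].
X = rng.uniform(0, 2 * np.pi, size=(200, 1))
y = np.sin(X[:, 0])

# Unsupervised step: place receptive-field centers on the data
# (simple random subsampling stands in for a clustering procedure).
n_units = 20
centers = X[rng.choice(len(X), n_units, replace=False)]
width = 0.5  # shared receptive-field width (an assumption for this sketch)

def activations(X):
    # Gaussian response of each locally-tuned internal unit.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

# Supervised step: fit ONLY the output weights, by linear least squares.
# No gradient descent through hidden layers is needed, which is the
# source of the speedup the synopsis describes.
H = activations(X)
w, *_ = np.linalg.lstsq(H, y, rcond=None)

# Evaluate on fresh points.
X_test = np.linspace(0, 2 * np.pi, 50)[:, None]
pred = activations(X_test) @ w
print(float(np.abs(pred - np.sin(X_test[:, 0])).mean()))
```

Because the only learned parameters enter linearly, training reduces to a single least-squares solve rather than an iterative multi-layer gradient descent, which is why such architectures train far faster than back propagation.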
SIMILAR VOLUMES
The back-propagation method encounters two problems in practice: slow learning progress and convergence to a false local minimum. The present study addresses the latter problem and proposes a modified back-propagation method. The basic idea of the method is to keep the sigmoid derivative relat…