𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Speedy alternatives to back propagation

โœ Scribed by John Moody; Chris Darken


Publisher
Elsevier Science
Year
1988
Tongue
English
Weight
57 KB
Volume
1
Category
Article
ISSN
0893-6080


✦ Synopsis


We propose three new neurally-inspired learning algorithms which offer much greater speed and greater biological plausibility than Back Propagation. These algorithms include "Learning With Receptive Fields", a new Self-Organizing Associative Memory, and a new variant of the Cerebellar Model Articulation Controller (CMAC). These new algorithms share one critical feature in common: they utilize only one layer of internal units. Furthermore, the Self-Organizing Associative Memory and the CMAC models require supervised learning of only the output weights. These features result in increased speed.
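The speed advantage described above comes from the architecture: a single layer of locally-tuned internal units whose responses are fixed (or set without supervision), so supervised training reduces to a linear fit of the output weights. A minimal sketch of that idea, assuming Gaussian receptive fields at preset centers and a least-squares output fit (the function names `fit_local_rf` and `predict` are illustrative, not from the paper):

```python
import numpy as np

def fit_local_rf(X, y, centers, width):
    """Fit only the output weights of a one-hidden-layer
    receptive-field network (no back propagation)."""
    # Gaussian receptive-field activations: one layer of internal units,
    # each tuned to a local region around its center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    H = np.exp(-d2 / (2.0 * width ** 2))
    # Supervised learning touches only the output weights:
    # a single linear least-squares solve.
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w

def predict(X, centers, width, w):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    H = np.exp(-d2 / (2.0 * width ** 2))
    return H @ w

# Usage: approximate sin(x) on [0, 2*pi] with 12 local units.
X = np.linspace(0, 2 * np.pi, 200)[:, None]
y = np.sin(X[:, 0])
centers = np.linspace(0, 2 * np.pi, 12)[:, None]
w = fit_local_rf(X, y, centers, width=0.8)
pred = predict(X, centers, 0.8, w)
```

Because the only trained parameters enter linearly, one matrix solve replaces many gradient-descent epochs, which is the source of the speedup the synopsis claims.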


📜 SIMILAR VOLUMES


A modified back-propagation method to av
โœ Yutaka Fukuoka; Hideo Matsuki; Haruyuki Minamitani; Akimasa Ishida ๐Ÿ“‚ Article ๐Ÿ“… 1998 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 218 KB

The back-propagation method encounters two problems in practice, i.e., slow learning progress and convergence to a false local minimum. The present study addresses the latter problem and proposes a modified back-propagation method. The basic idea of the method is to keep the sigmoid derivative relat