
On the overtraining phenomenon of backpropagation neural networks

โœ Scribed by S.G. Tzafestas; P.J. Dalianis; G. Anthopoulos


Book ID: 103897724
Publisher: Elsevier Science
Year: 1996
Tongue: English
Weight: 981 KB
Volume: 40
Category: Article
ISSN: 0378-4754


✦ Synopsis


The study of neural networks' capabilities is an important subject for their consolidation. This paper examines the relationships between network size, training set size, and generalization capability. The phenomenon of overtraining in backpropagation networks is discussed, and an extension to an existing algorithm is described. The extended algorithm introduces a new energy function; its advantages, such as improved plasticity and performance, and its dynamic properties are explained. The algorithm is applied to several common problems (XOR, numeric character recognition, and function approximation), and simulation results are presented and discussed.


📜 SIMILAR VOLUMES


Theory of the backpropagation neural net
โœ Robert Hecht-Nielsen ๐Ÿ“‚ Article ๐Ÿ“… 1988 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 61 KB

Of all the neural networks being applied to real world problems, the backpropagation neural network has proven to be the most useful. The backpropagation network seems to have been originally invented by Paul Werbos in his 1974 Harvard Ph.D. dissertation, and subsequently reinvented in 1982 by David Parker.