𝔖 Bobbio Scriptorium
✦   LIBER   ✦

An analysis of premature saturation in back propagation learning

✍ Scribed by Youngjik Lee; Sang-Hoon Oh; Myung Won Kim


Book ID
104348583
Publisher
Elsevier Science
Year
1993
Tongue
English
Weight
672 KB
Volume
6
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


The back propagation (BP) algorithm is widely used for finding optimum weights of multilayer neural networks in many pattern recognition applications. However, the critical drawbacks of the algorithm are its slow learning speed and convergence to local minima. One of the major reasons for these drawbacks is "premature saturation," a phenomenon in which the error of the neural network stays at a significantly high constant value for some period of time during learning. It is known to be caused by an inappropriate set of initial weights. In this paper, the probability of premature saturation at the beginning epoch of the learning procedure in the BP algorithm has been derived in terms of the maximum value of initial weights, the number of nodes in each layer, and the maximum slope of the sigmoidal activation function; it has been verified by Monte Carlo simulation. Using this result, premature saturation can be avoided with proper initial weight settings.
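The synopsis ties premature saturation to the initial-weight range, the layer widths, and the sigmoid's slope. The sketch below is not the authors' derivation; it is a minimal Monte Carlo illustration (with assumed node counts, input distribution, and a hypothetical `saturation_rate` helper) of the underlying mechanism: when weights are drawn from a wide uniform range, hidden-node net inputs land in the flat tails of the sigmoid, where the derivative, and hence the BP weight update, is nearly zero.

```python
import random
import math

def sigmoid(x):
    """Standard logistic activation; its maximum slope is 1/4 at x = 0."""
    return 1.0 / (1.0 + math.exp(-x))

def saturation_rate(w_max, n_inputs=30, n_hidden=10, trials=2000, eps=0.05):
    """Fraction of hidden-node outputs starting within eps of 0 or 1
    when weights are drawn uniformly from [-w_max, w_max].
    All parameter values here are illustrative assumptions."""
    saturated = 0
    total = 0
    for _ in range(trials):
        # Random bipolar input pattern (an assumed input distribution).
        x = [random.choice([-1.0, 1.0]) for _ in range(n_inputs)]
        for _ in range(n_hidden):
            w = [random.uniform(-w_max, w_max) for _ in range(n_inputs)]
            y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            if y < eps or y > 1.0 - eps:
                saturated += 1
            total += 1
    return saturated / total

# Larger initial-weight ranges drive more nodes into the sigmoid's
# flat regions before learning begins -- the precondition the paper
# identifies for premature saturation.
for w_max in (0.1, 1.0, 5.0):
    print(f"w_max={w_max}: saturation rate ~ {saturation_rate(w_max):.2f}")
```

Running the sketch shows the saturation rate climbing with `w_max`, which is the qualitative content of the paper's recommendation to keep initial weights small.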


📜 SIMILAR VOLUMES


Analysis of the error back-propagation l
✍ Qi Jia; Katsuyuki Hagiwara; Shiro Usui; Naohiro Toda 📂 Article 📅 1995 🏛 John Wiley and Sons 🌐 English ⚖ 768 KB

Abstract: As a method to accelerate learning by error back-propagation, several studies have been proposed in which a parameter called gain is introduced. In those studies, however, the acceleration effect is evaluated only numerically, and there is no theoretical analysis of the effect o