An analysis of premature saturation in back propagation learning
- Authors
- Youngjik Lee; Sang-Hoon Oh; Myung Won Kim
- Book ID
- 104348583
- Publisher
- Elsevier Science
- Year
- 1993
- Language
- English
- Size
- 672 KB
- Volume
- 6
- Category
- Article
- ISSN
- 0893-6080
Synopsis
The back propagation (BP) algorithm is widely used for finding optimum weights of multilayer neural networks in many pattern recognition applications. However, the critical drawbacks of the algorithm are its slow learning speed and convergence to local minima. One of the major reasons for these drawbacks is "premature saturation," a phenomenon in which the error of the neural network stays at a significantly high constant level for some period of time during learning. It is known to be caused by an inappropriate set of initial weights. In this paper, the probability of premature saturation at the beginning epoch of the learning procedure in the BP algorithm has been derived in terms of the maximum value of the initial weights, the number of nodes in each layer, and the maximum slope of the sigmoidal activation function; it has been verified by Monte Carlo simulation. Using this result, premature saturation can be avoided with proper initial weight settings.
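The effect described in the synopsis can be illustrated with a small Monte Carlo experiment of one's own (this is not the paper's derivation; the layer sizes, weight-initialization range, and saturation threshold below are illustrative assumptions): drawing initial weights uniformly from a wider interval drives more sigmoid hidden-node activations toward 0 or 1 before any training occurs.

```python
# Minimal sketch: estimate, by Monte Carlo sampling, the fraction of sigmoid
# hidden activations that start out saturated as a function of the maximum
# magnitude w_max of uniformly drawn initial weights. All parameter values
# (n_in, n_hidden, thresh) are illustrative assumptions, not the paper's.
import numpy as np

def saturated_fraction(w_max, n_in=30, n_hidden=20, n_trials=1000,
                       thresh=0.99, seed=0):
    """Fraction of hidden activations a = sigmoid(net) with |2a - 1| > thresh,
    given weights ~ U(-w_max, w_max) and inputs ~ U(-1, 1)."""
    rng = np.random.default_rng(seed)
    saturated = 0
    total = 0
    for _ in range(n_trials):
        W = rng.uniform(-w_max, w_max, size=(n_hidden, n_in))  # initial weights
        x = rng.uniform(-1.0, 1.0, size=n_in)                  # random input
        net = W @ x                                            # net input per node
        act = 1.0 / (1.0 + np.exp(-net))                       # sigmoid activation
        saturated += int(np.sum(np.abs(2.0 * act - 1.0) > thresh))
        total += n_hidden
    return saturated / total

if __name__ == "__main__":
    for w_max in (0.1, 1.0, 5.0):
        print(f"w_max = {w_max}: saturated fraction = "
              f"{saturated_fraction(w_max):.3f}")
```

With small w_max the net inputs stay near zero, so almost no node is saturated; with large w_max a substantial fraction of nodes start in the flat tails of the sigmoid, where the small gradient stalls learning, which is the mechanism behind keeping initial weights appropriately small.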
SIMILAR VOLUMES
## Abstract As a method to accelerate learning by error back-propagation, several studies have been proposed in which a parameter called gain is introduced. In those studies, however, the acceleration effect is evaluated only numerically, and there is no theoretical analysis of the effect o