Fast generating algorithm for a general three-layer perceptron
✍ Authors: R. Zollner; H.J. Schmitz; F. Wünsch; U. Krey
- Publisher
- Elsevier Science
- Year
- 1992
- Language
- English
- File size
- 513 KB
- Volume
- 5
- Category
- Article
- ISSN
- 0893-6080
✦ Synopsis
A fast iterative algorithm is proposed for constructing and training a neural network for a classification task, with an input layer, one intermediate layer, and an output layer. The network is able to learn an arbitrary training set. The algorithm does not depend on a special learning scheme (e.g., the couplings can be determined by modified Hebbian prescriptions or by more complex learning procedures). During the process, the intermediate units are constructed systematically by collecting the patterns into smaller subsets. For simplicity, only the case of one output neuron is considered, but this restriction is not actually necessary.
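The paper itself does not spell out the construction in this synopsis, so the following is only a minimal sketch of the general idea of growing intermediate units by splitting the training set into smaller subsets: each new hidden unit is a perceptron trained on the patterns not yet handled, and the patterns it classifies correctly are removed before the next unit is added. All function names and details here are illustrative assumptions, not the authors' prescription.

```python
import numpy as np

def train_perceptron(X, y, epochs=200):
    """Plain perceptron with a bias input; targets are +/-1.
    Returns a weight vector of length d+1 (last entry = bias weight)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, t in zip(Xb, y):
            if np.sign(x @ w) != t:  # note: sign(0) == 0 counts as an error
                w += t * x           # classic perceptron update
                errors += 1
        if errors == 0:              # converged on this subset
            break
    return w

def grow_hidden_layer(X, y, max_units=10):
    """Constructive sketch (an assumption, not the paper's exact algorithm):
    train a unit on the remaining patterns, drop the patterns it already
    classifies correctly, and repeat on the shrinking subset."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    units, remaining = [], np.arange(len(X))
    while len(remaining) and len(units) < max_units:
        w = train_perceptron(X[remaining], y[remaining])
        correct = np.sign(Xb[remaining] @ w) == y[remaining]
        if not correct.any():        # no progress: stop rather than loop
            break
        units.append(w)
        remaining = remaining[~correct]  # next unit handles a smaller subset
    return np.array(units)
```

On a linearly separable set (e.g., AND with ±1 targets) a single unit suffices and the loop stops immediately; on a non-separable set like XOR, several units are grown, each responsible for a smaller subset. How the hidden activations are then combined by the output neuron is precisely what the paper's algorithm specifies and is not reproduced here.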
📜 SIMILAR VOLUMES
## Abstract The back propagation of error in multi‐layer perceptrons, when used for supervised training, is a non‐local algorithm in space; that is, it needs knowledge of the network topology. On the other hand, learning rules in biological systems with many hidden units seem to be local in both
In this paper, we propose two fast codebook generation algorithms by making use of the information in the iterative process. Comparing to the conventional full search method (the LBG … The pairwise Nearest Neighbor (PNN) algorithm is a new … Some alternative methods are also available [14].