A novel learning algorithm which improves the partial fault tolerance of multilayer neural networks
By Salvatore Cavalieri; Orazio Mirabella
- Publisher
- Elsevier Science
- Year
- 1999
- Language
- English
- File size
- 494 KB
- Volume
- 12
- Category
- Article
- ISSN
- 0893-6080
Synopsis
The paper deals with the problem of fault tolerance in multilayer perceptron networks. Although such a network already possesses a reasonable degree of fault tolerance, it may be insufficient in particularly critical applications. Studies carried out by the authors have shown that the traditional backpropagation learning algorithm can produce a certain number of weights with much higher absolute values than the others, and that faults in these weights are the main cause of deterioration in network performance. In other words, the main cause of incorrect network functioning when a fault occurs is the non-uniform distribution of the absolute values of the weights within each layer. The paper proposes a learning algorithm which updates the weights so as to distribute their absolute values as uniformly as possible in each layer. Tests performed on benchmark data sets show a considerable increase in fault tolerance with the proposed approach as compared with the traditional backpropagation algorithm and with some of the most efficient fault tolerance approaches found in the literature.
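The synopsis does not reproduce the authors' exact update rule, but the idea it describes can be sketched. The following is a minimal illustration, not the paper's algorithm: a penalty term that drives the absolute value of each weight in a layer toward the layer-wide mean magnitude, so that no single weight dominates (the failure mode the synopsis identifies). The penalty and its gradient here are this sketch's own assumptions.

```python
import numpy as np

def dispersion_penalty(W):
    """Mean squared deviation of each |w| from the layer's mean magnitude.

    Zero when all weights in the layer share the same absolute value,
    i.e. when the magnitude distribution is perfectly uniform.
    """
    a = np.abs(W)
    return ((a - a.mean()) ** 2).mean()

def dispersion_grad(W):
    """Gradient of dispersion_penalty w.r.t. W.

    Because sum_i (|w_i| - mean) = 0, the mean's own dependence on W
    cancels, leaving (2/n) * (|w| - mean) * sign(w).
    """
    a = np.abs(W)
    return (2.0 / W.size) * (a - a.mean()) * np.sign(W)

# Demonstration: one dominant weight, as in the paper's diagnosis.
rng = np.random.default_rng(0)
W = rng.normal(0.0, 1.0, size=(4, 4))
W[0, 0] = 5.0

spread_before = np.abs(W).std()

# Gradient descent on the penalty alone evens the magnitudes out.
# In a real training loop this gradient would be added, scaled by a
# small coefficient, to the ordinary backpropagation gradient.
for _ in range(200):
    W -= 0.1 * dispersion_grad(W)

spread = np.abs(W).std()
```

In practice the penalty gradient would be combined with the task-loss gradient during backpropagation, trading a little accuracy for a flatter per-layer magnitude profile; how that trade-off is managed is exactly what the paper's algorithm addresses.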