Development of a generalized neural network
- Authors: Greger G Andersson; Peter Kaufmann
- Publisher
- Elsevier Science
- Year
- 2000
- Language
- English
- File size
- 162 KB
- Volume
- 50
- Category
- Article
- ISSN
- 0169-7439
Synopsis
Interest in neural networks has grown concomitantly with the increased awareness of the ubiquity of non-linear systems. The main focus of improvements in this field has been on the development of different algorithms that either speed up the convergence rate and/or avoid entrapment in local minima. In this work, a different approach is utilized, where the existence of local minima is regarded as an exploitable advantage, since they can be considered as corresponding to different descriptions of the information content. This study focuses on a method to combine these different descriptions, obtained from several optimized neural networks, into a generalized neural network. The development of generalized neural networks is illustrated using two real-life data sets. The results show that the generalized neural networks improve the estimated Mean Squared Error (MSE) by at least 23%. Furthermore, the generalized neural network does not overfit the calibration set, as the Mean Squared Error of Calibration (MSEC) is in close agreement with the MSE of the independent test set.
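The combination scheme described above can be sketched as an ensemble: several networks are trained from different random starting points, each settling in its own local minimum, and their outputs are pooled into a single generalized predictor. Below is a minimal NumPy sketch assuming the pooled output is a plain average of the member outputs (the paper's actual combination method may differ); the toy data set, network size, and all names are illustrative:

```python
import numpy as np

# Toy nonlinear calibration problem: y = sin(3x) on [0, 1].
x = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = np.sin(3.0 * x)

def train_mlp(x, y, hidden=8, epochs=2000, lr=0.1, seed=0):
    """One-hidden-layer network trained by plain gradient descent.
    Each seed gives a different random start and hence, typically,
    a different local minimum of the MSE surface."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0.0, 1.0, (1, hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)              # forward pass
        pred = h @ W2 + b2
        err = pred - y                        # gradient of 0.5*MSE
        gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda xx: np.tanh(xx @ W1 + b1) @ W2 + b2

# Several independently optimized networks (different local minima) ...
members = [train_mlp(x, y, seed=s) for s in range(5)]

# ... combined into a "generalized" network by averaging their outputs.
def generalized(xx):
    return np.mean([m(xx) for m in members], axis=0)

def mse(f):
    return float(np.mean((f(x) - y) ** 2))

for i, m in enumerate(members):
    print(f"member {i}: MSE = {mse(m):.5f}")
print(f"combined : MSE = {mse(generalized):.5f}")
```

For squared error, Jensen's inequality guarantees that the MSE of the averaged output never exceeds the mean MSE of the individual members, which is one reason output averaging is a natural choice for this kind of combination.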
SIMILAR VOLUMES
This article presents a new algorithm for the automatic generation of neural architectures and supervised learning. Given a set of examples, the algorithm generates an architecture and synaptic weights that are adapted to the sampled problem. The algorithm supports any type and amount of numerical
The Cerebellar Model Articulation Controller (CMAC) is a simple and fast neural network: these characteristics have extended its successful applications, while the analysis of its representation capabilities, as for many other neural networks, did not follow a similar development. In this article w