Optimization of the hidden unit function in feedforward neural networks
Written by Osamu Fujita
- Publisher
- Elsevier Science
- Year
- 1992
- Language
- English
- Size
- 790 KB
- Volume
- 5
- Category
- Article
- ISSN
- 0893-6080
Synopsis
A novel objective function is proposed for optimizing the hidden unit function in feedforward neural networks. This objective function represents the performance of the hidden unit at minimizing the least squared output errors of the linear output unit. It is derived from the decrease in the output errors due to the addition of the hidden units. The optimized output state vectors of the hidden units span a proper state space, which includes the desired output vectors for the network. The optimization (maximization of the objective function) is equivalent to minimizing the angle between the desired output vector and the projection of the hidden unit's output state vector onto the orthogonal complement of the subspace spanned by the other state vectors. The approximate solution can be obtained using the gradient ascent algorithm. This optimization method is useful in constructing fully connected feedforward neural networks and for minimizing the size of layered networks.
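The geometric idea in the synopsis can be sketched numerically: when a candidate hidden unit's output state vector is projected onto the orthogonal complement of the subspace spanned by the existing units' state vectors, the resulting squared-cosine alignment with the desired output vector equals the drop in the linear output unit's least-squares error. The matrices and vectors below (`H`, `h`, `y`) are illustrative assumptions, not the paper's notation, and the check uses a standard least-squares identity rather than the paper's own derivation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 3
H = rng.standard_normal((n, k))   # output state vectors of existing hidden units (columns)
h = rng.standard_normal(n)        # candidate hidden unit's output state vector
y = rng.standard_normal(n)        # desired output vector of the network

def sse(A, y):
    """Residual sum of squares of the linear output unit fit by least squares."""
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return r @ r

# Project h onto the orthogonal complement of span(H).
P = H @ np.linalg.pinv(H)         # orthogonal projector onto span(H)
h_perp = h - P @ h

# Gain term of the objective: squared alignment of y with h_perp
# (maximizing it minimizes the angle between y and h_perp).
gain = (y @ h_perp) ** 2 / (h_perp @ h_perp)

before = sse(H, y)
after = sse(np.column_stack([H, h]), y)
print(np.isclose(before - after, gain))  # prints True: error decrease equals the gain term
```

In a constructive setting one would evaluate (or gradient-ascend) this gain over candidate hidden unit functions and add the maximizer, which is why the objective directly tracks the decrease in output error per added unit.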
SIMILAR VOLUMES
The size and training parameters of artificial neural networks have a critical effect on their performance. This paper presents the application of the Taguchi Design of Experiments (DoEs) off-line quality control method in the optimization of the design parameters of a neural network. Being a 'paral
We propose in this paper a novel prescriptive solution to decide the optimum number of neurons in the hidden-layer of multilayer feedforward neural networks. Our approach uses the unconstrained mixed integer nonlinear multicriteria optimization technique. We validate the algorithm using numerical ex
For three-layer artificial neural networks (TANs) that take binary values, the number of hidden units is considered regarding two problems: One is to find the necessary and sufficient number to make mapping between the binary output values of TANs and learning patterns (inputs) arbitrary; and the ot