𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Optimization of the hidden unit function in feedforward neural networks

โœ Scribed by Osamu Fujita


Publisher
Elsevier Science
Year
1992
Tongue
English
Weight
790 KB
Volume
5
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


A novel objective function is proposed for optimizing the hidden unit function in feedforward neural networks. This objective function represents the performance of the hidden unit at minimizing the least squared output errors of the linear output unit. It is derived from the decrease in the output errors due to the addition of the hidden units. The optimized output state vectors of the hidden units span a proper state space, which includes the desired output vectors for the network. The optimization (maximization of the objective function) is equivalent to minimizing the angle between the desired output vector and the projection of the hidden unit's output state vector onto the orthogonal complement of the subspace spanned by the other state vectors. The approximate solution can be obtained using the gradient ascent algorithm. This optimization method is useful in constructing fully connected feedforward neural networks and for minimizing the size of layered networks.
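The geometric criterion in the synopsis can be sketched numerically: the quality of a candidate hidden unit is the squared cosine of the angle between the desired output vector and the component of the unit's state vector lying outside the span of the existing units' state vectors. A minimal NumPy illustration, with function and variable names chosen here for clarity (they are not from the paper):

```python
import numpy as np

def hidden_unit_gain(d, H, h):
    """Squared cosine of the angle between the desired output vector d
    and the projection of the candidate state vector h onto the
    orthogonal complement of span(H), where the columns of H are the
    output state vectors of the other hidden units."""
    Q, _ = np.linalg.qr(H)          # orthonormal basis for span(H)
    h_perp = h - Q @ (Q.T @ h)      # part of h not already covered by H
    num = (d @ h_perp) ** 2
    den = (d @ d) * (h_perp @ h_perp)
    return num / den

# Illustrative example: one existing unit along the second axis.
d = np.array([1.0, 0.0, 0.0])       # desired output vector
H = np.array([[0.0], [1.0], [0.0]]) # existing unit's state vector

hidden_unit_gain(d, H, np.array([1.0, 0.0, 0.0]))  # aligned with d
hidden_unit_gain(d, H, np.array([0.0, 0.0, 1.0]))  # orthogonal to d
```

Maximizing this quantity over the unit's weights (e.g., by gradient ascent, as the synopsis notes) selects the candidate whose novel component points most nearly along the desired output, which is what drives the per-unit reduction in the least squared output error.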


📜 SIMILAR VOLUMES


Statistical estimation of the number of
โœ Osamu Fujita ๐Ÿ“‚ Article ๐Ÿ“… 1998 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 273 KB

The number of required hidden units is statistically estimated for feedforward neural networks that are constructed by adding hidden units one by one. The output error decreases with the number of hidden units by an almost constant rate, if each appropriate hidden unit is selected out of a great num

Optimizing the parameters of multilayere
โœ M. S. Packianather; P. R. Drake; H. Rowlands ๐Ÿ“‚ Article ๐Ÿ“… 2000 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 252 KB ๐Ÿ‘ 1 views

The size and training parameters of artificial neural networks have a critical effect on their performance. This paper presents the application of the Taguchi Design of Experiments (DoEs) off-line quality control method in the optimization of the design parameters of a neural network. Being a 'paral

A novel multicriteria optimization algor
โœ Krishna Kottathra; Yianni Attikiouzel ๐Ÿ“‚ Article ๐Ÿ“… 1996 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 195 KB

We propose in this paper a novel prescriptive solution to decide the optimum number of neurons in the hidden-layer of multilayer feedforward neural networks. Our approach uses the unconstrained mixed integer nonlinear multicriteria optimization technique. We validate the algorithm using numerical ex

Bounds on the number of hidden units in
โœ Masahiko Arai ๐Ÿ“‚ Article ๐Ÿ“… 1993 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 461 KB

For three-layer artificial neural networks (TANs) that take binary values, the number of hidden units is considered regarding two problems: One is to find the necessary and sufficient number to make mapping between the binary output values of TANs and learning patterns (inputs) arbitrary; and the ot