𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Optimization of feedforward neural networks

โœ Scribed by Jun Han; Claudio Moraga; Stefan Sinne


Publisher
Elsevier Science
Year
1996
Tongue
English
Weight
885 KB
Volume
9
Category
Article
ISSN
0952-1976

No coin nor oath required. For personal study only.


📜 SIMILAR VOLUMES


Optimizing the parameters of multilayere
โœ M. S. Packianather; P. R. Drake; H. Rowlands ๐Ÿ“‚ Article ๐Ÿ“… 2000 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 252 KB ๐Ÿ‘ 1 views

The size and training parameters of artificial neural networks have a critical effect on their performance. This paper presents the application of the Taguchi Design of Experiments (DoE) off-line quality control method in the optimization of the design parameters of a neural network. Being a 'paral
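The Taguchi approach sketched in this abstract screens design parameters with an orthogonal array rather than a full grid. A minimal illustration, assuming an L4(2^3) array and invented factor names and levels (none are taken from the paper itself):

```python
# Hypothetical sketch of Taguchi-style screening of neural-network design
# parameters. The L4 orthogonal array covers 3 two-level factors in only
# 4 balanced trials (each level of each factor appears exactly twice).
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

# Candidate levels for each design parameter (assumed, illustrative values).
factors = {
    "hidden_units": (8, 16),
    "learning_rate": (0.01, 0.1),
    "epochs": (100, 500),
}

def trial_configs():
    """Expand the orthogonal array into concrete trial configurations."""
    names = list(factors)
    return [
        {name: factors[name][level] for name, level in zip(names, row)}
        for row in L4
    ]

for i, cfg in enumerate(trial_configs(), 1):
    print(f"trial {i}: {cfg}")
```

Each of the four configurations would then be trained and scored; the balanced array lets the main effect of each factor be estimated from far fewer runs than the 2^3 = 8 of a full grid.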

A novel multicriteria optimization algor
โœ Krishna Kottathra; Yianni Attikiouzel ๐Ÿ“‚ Article ๐Ÿ“… 1996 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 195 KB

We propose in this paper a novel prescriptive solution to decide the optimum number of neurons in the hidden-layer of multilayer feedforward neural networks. Our approach uses the unconstrained mixed integer nonlinear multicriteria optimization technique. We validate the algorithm using numerical ex
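The abstract poses hidden-layer sizing as a multicriteria problem (accuracy versus network size). A toy sketch of that trade-off, using a simple weighted scalarization rather than the paper's mixed integer nonlinear technique, and a mock error curve in place of a trained network:

```python
# Illustrative only: approximates the accuracy-vs-size trade-off by
# scalarizing the two objectives. The error curve is mock data, not the
# output of any trained network.

def mock_validation_error(n_hidden):
    # Hypothetical curve: tiny networks underfit (1/n term), oversized
    # ones slowly degrade again (linear overfitting penalty).
    return 1.0 / n_hidden + 0.005 * n_hidden

def select_hidden_size(candidates, size_penalty=0.01):
    """Pick the size minimizing error + size_penalty * size."""
    return min(candidates, key=lambda n: mock_validation_error(n) + size_penalty * n)

best = select_hidden_size(range(1, 65))
print("selected hidden units:", best)
```

With these assumed coefficients the combined objective is minimized at a small hidden layer; the paper's contribution is to make such a choice prescriptive rather than trial-and-error.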

Support vector machine based training of
โœ Wei-Qi Lin; Jian-Hui Jiang; Yan-Ping Zhou; Hai-Long Wu; Guo-Li Shen; Ru-Qin Yu ๐Ÿ“‚ Article ๐Ÿ“… 2006 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 398 KB ๐Ÿ‘ 1 views

Multilayer feedforward neural networks (MLFNNs) are important modeling techniques widely used in QSAR studies for their ability to represent nonlinear relationships between descriptors and activity. However, the problems of overfitting and premature convergence to local optima still pos