𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Classes of feedforward neural networks and their circuit complexity

✍ Scribed by John S. Shawe-Taylor; Martin H.G. Anthony; Walter Kern


Publisher
Elsevier Science
Year
1992
Tongue
English
Weight
594 KB
Volume
5
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


This paper aims to place neural networks in the context of boolean circuit complexity. We define appropriate classes of feedforward neural networks with specified fan-in, accuracy of computation and depth, and using techniques of communication complexity proceed to show that the classes fit into a well-studied hierarchy of boolean circuits. Results cover both classes of sigmoid activation function networks and linear threshold networks. This provides a much needed theoretical basis for the study of the computational power of feedforward neural networks.
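The abstract contrasts linear threshold gates with sigmoid-activation gates computed to a specified accuracy. As a minimal sketch of that relationship (an illustration, not code from the paper; the function names and the gain parameter are hypothetical choices), a sigmoid gate with a sufficiently large gain approximates a linear threshold gate on boolean inputs to any desired accuracy:

```python
import math

def threshold_gate(weights, theta, x):
    """Linear threshold gate: output 1 iff the weighted sum reaches theta."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0

def sigmoid_gate(weights, theta, x, gain):
    """Sigmoid gate: smooth version of the same gate; larger gain
    pushes the output closer to 0/1 for inputs away from the threshold."""
    s = sum(w * xi for w, xi in zip(weights, x)) - theta
    return 1.0 / (1.0 + math.exp(-gain * s))

# MAJORITY on 3 boolean inputs: all weights 1, threshold 2.
w, theta = [1, 1, 1], 2
for x in [(0, 0, 0), (0, 1, 1), (1, 1, 1)]:
    hard = threshold_gate(w, theta, x)
    # Centre the sigmoid between the integer sums 1 and 2 so boolean
    # inputs sit at least 0.5 away from the decision boundary.
    soft = sigmoid_gate(w, theta - 0.5, x, gain=20.0)
    print(x, hard, round(soft, 4))
```

With gain 20 and a separation of 0.5, the sigmoid output differs from the threshold output by less than 10^-4 on every boolean input; driving the gain higher tightens the accuracy further, which is the intuition behind relating the two network classes.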


πŸ“œ SIMILAR VOLUMES


Optimizing the parameters of multilayered…
✍ M. S. Packianather; P. R. Drake; H. Rowlands πŸ“‚ Article πŸ“… 2000 πŸ› John Wiley and Sons 🌐 English βš– 252 KB πŸ‘ 1 views

The size and training parameters of artificial neural networks have a critical effect on their performance. This paper presents the application of the Taguchi Design of Experiments (DoE) off-line quality control method to the optimization of the design parameters of a neural network. Being a 'paral…

Influences of variable scales and activation…
✍ Gao Daqi; Yang Genxing πŸ“‚ Article πŸ“… 2003 πŸ› Elsevier Science 🌐 English βš– 310 KB

This paper gives insight into methods for improving the learning capabilities of multilayer feedforward neural networks with linear basis functions when the number of training patterns is limited, following the basic principles of the support vector machine (SVM), namely how to get the opt…