Classes of feedforward neural networks and their circuit complexity
Authors: John S. Shawe-Taylor; Martin H.G. Anthony; Walter Kern
- Publisher
- Elsevier Science
- Year
- 1992
- Language
- English
- File size
- 594 KB
- Volume
- 5
- Category
- Article
- ISSN
- 0893-6080
Synopsis
This paper aims to place neural networks in the context of boolean circuit complexity. We define appropriate classes of feedforward neural networks with specified fan-in, accuracy of computation and depth, and, using techniques of communication complexity, proceed to show that the classes fit into a well-studied hierarchy of boolean circuits. Results cover both classes of sigmoid activation function networks and linear threshold networks. This provides a much needed theoretical basis for the study of the computational power of feedforward neural networks.
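As an illustrative sketch (not taken from the paper), the basic unit of the linear threshold networks the synopsis mentions is a threshold gate: it outputs 1 exactly when the weighted sum of its boolean inputs reaches a threshold. The function and weights below are hypothetical examples chosen for clarity.

```python
def threshold_gate(weights, threshold, inputs):
    """Return 1 if sum(w_i * x_i) >= threshold, else 0."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# Example: 3-input MAJORITY as a single threshold gate
# (all weights 1, threshold 2): output is 1 iff at least two inputs are 1.
for x in [(0, 0, 0), (1, 0, 1), (1, 1, 1)]:
    print(x, threshold_gate([1, 1, 1], 2, x))
# (0, 0, 0) -> 0
# (1, 0, 1) -> 1
# (1, 1, 1) -> 1
```

Circuits built from such gates, with bounded fan-in and depth, form the boolean circuit hierarchy into which the paper places its neural network classes.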
SIMILAR VOLUMES
The size and training parameters of artificial neural networks have a critical effect on their performance. This paper presents the application of the Taguchi Design of Experiments (DoEs) off-line quality control method in the optimization of the design parameters of a neural network. Being a 'paral
This paper gives insight into the methods about how to improve the learning capabilities of multilayer feedforward neural networks with linear basis functions in the case of limited number of patterns according to the basic principles of support vector machine (SVM), namely, about how to get the opt