New study on neural networks: the essential order of approximation
by Jianjun Wang; Zongben Xu
- Publisher
- Elsevier Science
- Year
- 2010
- Language
- English
- File size
- 374 KB
- Volume
- 23
- Category
- Article
- ISSN
- 0893-6080
Synopsis
For the nearly exponential type of feedforward neural networks (neFNNs), the essential order of their approximation is revealed. It is proven that for any continuous function defined on a compact set of R^d, there exist three-layer neFNNs with a fixed number of hidden neurons that attain the essential order. Under certain assumptions on the neFNNs, ideal upper and lower bound estimates on the approximation precision of the neFNNs are provided. The obtained results not only characterize the intrinsic approximation property of the neFNNs, but also reveal the implicit relationship between the precision (speed) and the number of hidden neurons of the neFNNs.
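The synopsis ties approximation precision to the number of hidden neurons. As a minimal numerical sketch of that relationship (not the paper's construction), the following fits a three-layer network, i.e. one hidden layer of n sigmoidal units, to a continuous function on a compact interval and reports the sup-norm error. The target function, the sigmoid activation, and the random-feature least-squares fit are all illustrative assumptions.

```python
# Sketch only: a three-layer network (input -> n hidden sigmoidal units -> output)
# approximating a continuous function on the compact set [0, 1]. Hidden weights
# are fixed at random and the output layer is solved by least squares; these are
# assumed choices for illustration, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_three_layer(f, n_hidden, x_train):
    """Fix random hidden weights/biases, then solve output weights by least squares."""
    w = rng.normal(scale=5.0, size=n_hidden)   # hidden weights (assumed scale)
    b = rng.uniform(-5.0, 5.0, size=n_hidden)  # hidden biases (assumed range)
    H = sigmoid(np.outer(x_train, w) + b)      # hidden-layer activations
    c, *_ = np.linalg.lstsq(H, f(x_train), rcond=None)  # output-layer weights
    return lambda x: sigmoid(np.outer(x, w) + b) @ c

f = lambda x: np.sin(3 * np.pi * x) * np.exp(-x)  # an illustrative continuous target
x_train = np.linspace(0.0, 1.0, 400)
x_test = np.linspace(0.0, 1.0, 2000)

for n in (4, 8, 16, 32, 64):
    net = fit_three_layer(f, n, x_train)
    err = np.max(np.abs(net(x_test) - f(x_test)))  # sup-norm error on the compact set
    print(f"n_hidden = {n:3d}   sup-norm error = {err:.3e}")
```

On a typical run the printed sup-norm error shrinks as n grows, which is the precision-versus-hidden-neuron trade-off that the paper's upper and lower bounds make precise.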
SIMILAR VOLUMES
In this paper, we prove that any continuous mapping can be approximately realized by Rumelhart-Hinton-Williams' multilayer neural networks with at least one hidden layer whose output functions are sigmoid functions. The starting point of the proof for the one hidden layer case is an integral formula
Cryptosystems rely on the assumption that a number of mathematical problems are computationally intractable, in the sense that they cannot be solved in polynomial time. Numerous approaches have been applied to address these problems. In this paper, we consider artificial neural networks and study th…