𝔖 Bobbio Scriptorium
✦   LIBER   ✦

New study on neural networks: the essential order of approximation

โœ Scribed by Jianjun Wang; Zongben Xu


Publisher
Elsevier Science
Year
2010
Tongue
English
Weight
374 KB
Volume
23
Category
Article
ISSN
0893-6080


✦ Synopsis


For nearly exponential feedforward neural networks (neFNNs), the essential order of approximation is revealed. It is proven that for any continuous function defined on a compact subset of R^d, there exist three-layer neFNNs with a fixed number of hidden neurons that attain this essential order. Under certain assumptions on the neFNNs, ideal upper- and lower-bound estimates of their approximation precision are provided. The results not only characterize the intrinsic approximation capability of neFNNs, but also reveal the implicit relationship between approximation precision (speed) and the number of hidden neurons.
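The trade-off the synopsis describes, approximation precision improving as hidden neurons are added, can be illustrated numerically. The sketch below is not the paper's construction: it fits a plain one-hidden-layer sigmoid network to a continuous target by drawing random inner weights and solving the linear read-out by least squares; all names, the target function, and the weight scales are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_network(x, y, n_hidden, seed=0):
    """Fit a one-hidden-layer sigmoid network (illustrative sketch,
    not the neFNN construction from the paper): inner weights are
    random, the output layer is solved exactly by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=4.0, size=n_hidden)        # hidden weights
    b = rng.uniform(-4.0, 4.0, size=n_hidden)       # hidden biases
    H = sigmoid(np.outer(x, W) + b)                 # (samples, n_hidden)
    c, *_ = np.linalg.lstsq(H, y, rcond=None)       # read-out weights
    return lambda t: sigmoid(np.outer(t, W) + b) @ c

# Continuous target on a compact set, here sin(x) on [0, pi].
x = np.linspace(0.0, np.pi, 200)
y = np.sin(x)

for n in (2, 8, 32):
    net = fit_network(x, y, n)
    err = np.max(np.abs(net(x) - y))                # sup-norm error on the grid
    print(f"hidden neurons = {n:2d}, max error = {err:.4f}")
```

With more hidden neurons the sup-norm error on the grid shrinks sharply, mirroring the precision-versus-width relationship the paper quantifies exactly.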


📜 SIMILAR VOLUMES


On the approximate realization of continuous mappings by neural networks
✍ Ken-Ichi Funahashi 📂 Article 📅 1989 🏛 Elsevier Science 🌐 English ⚖ 733 KB

In this paper, we prove that any continuous mapping can be approximately realized by Rumelhart-Hinton-Williams' multilayer neural networks with at least one hidden layer whose output functions are sigmoid functions. The starting point of the proof for the one hidden layer case is an integral formula…

Studying the performance of artificial neural networks on problems related to cryptography
✍ E.C. Laskari; G.C. Meletiou; D.K. Tasoulis; M.N. Vrahatis 📂 Article 📅 2006 🏛 Elsevier Science 🌐 English ⚖ 168 KB

Cryptosystems rely on the assumption that a number of mathematical problems are computationally intractable, in the sense that they cannot be solved in polynomial time. Numerous approaches have been applied to address these problems. In this paper, we consider artificial neural networks and study th…