Approximation capabilities of multilayer feedforward networks
By Kurt Hornik
- Publisher
- Elsevier Science
- Year
- 1991
- Language
- English
- Size
- 700 KB
- Volume
- 4
- Category
- Article
- ISSN
- 0893-6080
Synopsis
We show that standard multilayer feedforward networks with as few as a single hidden layer and arbitrary bounded and nonconstant activation function are universal approximators with respect to Lp(μ) performance criteria, for arbitrary finite input environment measures μ, provided only that sufficiently many hidden units are available. If the activation function is continuous, bounded and nonconstant, then continuous mappings can be learned uniformly over compact input sets. We also give very general conditions ensuring that networks with sufficiently smooth activation functions are capable of arbitrarily accurate approximation to a function and its derivatives.
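The synopsis claims that a single hidden layer with a bounded, nonconstant activation suffices for uniform approximation on compact sets. A minimal numerical sketch of that claim, assuming sigmoid activations and a simplified training scheme (random hidden weights, output layer solved by least squares — an illustration, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_one_hidden_layer(x, y, n_hidden):
    """Fit a single-hidden-layer network with sigmoid hidden units.

    Hidden weights and biases are drawn at random; only the linear
    output layer is fit, via least squares. Hypothetical setup chosen
    for brevity -- it still shows that enough sigmoid units can drive
    the sup error on a compact interval down.
    """
    w = rng.normal(scale=4.0, size=n_hidden)   # hidden-layer weights
    b = rng.uniform(-8.0, 8.0, size=n_hidden)  # hidden-layer biases
    # Sigmoid: continuous, bounded, nonconstant, as the theorem requires.
    h = 1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))
    c, *_ = np.linalg.lstsq(h, y, rcond=None)  # output weights
    return lambda t: (1.0 / (1.0 + np.exp(-(np.outer(t, w) + b)))) @ c

# Target: sin on the compact set [0, 2*pi].
x = np.linspace(0.0, 2.0 * np.pi, 400)
y = np.sin(x)
for n in (5, 50, 500):
    f = fit_one_hidden_layer(x, y, n)
    print(n, "units -> max error", float(np.max(np.abs(f(x) - y))))
```

Increasing `n_hidden` should shrink the maximum error on the grid, mirroring the "sufficiently many hidden units" condition in the theorem.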
SIMILAR VOLUMES
Approximation of real functions by feedforward networks of the usual kind is shown to be based on the fundamental principle of approximation by piecewise-constant functions.