Approximation capability of interpolation neural networks
By Feilong Cao; Shaobo Lin; Zongben Xu
- Publisher
- Elsevier Science
- Year
- 2010
- Language
- English
- File size
- 194 KB
- Volume
- 74
- Category
- Article
- ISSN
- 0925-2312
SIMILAR VOLUMES
In this paper, we construct two types of feed-forward neural networks (FNNs) which can approximately interpolate, with arbitrary precision, any set of distinct data in a metric space. First, for an analytic activation function, an approximate interpolation FNN is constructed in the metric space, and …
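The approximate-interpolation idea can be sketched numerically. The snippet below is an illustrative stand-in, not the paper's explicit construction: it fixes random tanh hidden units and fits only the output weights by least squares, so with more hidden units than samples the network reproduces distinct data points to near machine precision. All names and parameter choices are assumptions.

```python
import numpy as np

# Illustrative sketch only: random tanh hidden units with least-squares
# output weights approximately interpolate a set of distinct samples.

rng = np.random.default_rng(1)

def approx_interpolant(x, y, n_hidden=40):
    """One-hidden-layer tanh network (approximately) interpolating (x, y)."""
    w = rng.normal(scale=2.0, size=n_hidden)   # fixed random input weights
    b = rng.uniform(-4.0, 4.0, size=n_hidden)  # fixed random biases
    H = np.tanh(np.outer(x, w) + b)            # hidden-layer outputs
    a, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights
    return lambda t: np.tanh(np.outer(np.atleast_1d(t), w) + b) @ a

# Eight distinct samples; 40 hidden units give an (almost) exact fit.
x = np.linspace(-1.0, 1.0, 8)
y = np.array([0.3, -1.2, 0.5, 2.0, -0.7, 1.1, 0.0, -0.4])
net = approx_interpolant(x, y)
residual = np.max(np.abs(net(x) - y))
```

Because the 8×40 feature matrix generically has full row rank, the least-squares solution drives the residual at the data points essentially to zero.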
We show that standard multilayer feedforward networks with as few as a single hidden layer and an arbitrary bounded and nonconstant activation function are universal approximators with respect to L^p(μ) performance criteria, for arbitrary finite input environment measures μ, provided only that sufficiently many hidden units are available.
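A minimal numerical illustration of this claim (an assumed setup, not taken from the paper): one hidden layer with the bounded, nonconstant activation tanh, random hidden parameters, and least-squares output weights approximates f(x) = sin(x) on [0, 2π], and the empirical L² error shrinks as hidden units are added.

```python
import numpy as np

# Sketch of universal approximation in an L^2 sense: more hidden units
# give a smaller empirical L^2 error. Parameter choices are illustrative.

rng = np.random.default_rng(0)

def l2_error(n_hidden, x, y):
    """Empirical L^2 error of a random-feature tanh network fit to (x, y)."""
    w = rng.normal(scale=3.0, size=n_hidden)   # random input weights
    b = rng.uniform(-3.0, 3.0, size=n_hidden)  # random biases
    H = np.tanh(np.outer(x, w) + b)            # hidden activations
    a, *_ = np.linalg.lstsq(H, y, rcond=None)  # least-squares output weights
    return np.sqrt(np.mean((H @ a - y) ** 2))

x = np.linspace(0.0, 2.0 * np.pi, 400)
y = np.sin(x)
err_small = l2_error(5, x, y)    # few hidden units: coarse fit
err_large = l2_error(200, x, y)  # many hidden units: near-zero L^2 error
```

Here the measure μ is simply the empirical distribution of the 400 grid points; the theorem's content is that the large-network error can be made arbitrarily small.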
The cellular neural network is able to perform different image-processing tasks depending on the template values, i.e. the network parameters, used. In the case of linear templates the parameter space is divided into different regions by hyperplanes. Every region is associated with a task, such that …
Let D ⊂ R^d be a compact set and let Φ be a uniformly bounded set of functions from D to R. For a given real-valued function f defined on D and a given natural number n, we are looking for a good uniform approximation to f of the form ∑_{i=1}^{n} a_i φ_i, with φ_i ∈ Φ, a_i ∈ R. Two main cases are considered: (1) …