Limit problems for interpolation by analytic radial basis functions
By Robert Schaback
- Publisher
- Elsevier Science
- Year
- 2008
- Language
- English
- File size
- 267 KB
- Volume
- 212
- Category
- Article
- ISSN
- 0377-0427
Synopsis
Interpolation problems for analytic radial basis functions such as the Gaussian and the inverse multiquadrics can degenerate in two ways: the radial basis functions can be scaled to become increasingly flat, or the data points can coalesce in the limit while the radial basis functions stay fixed. Both cases call for a careful regularization, which, if carried out explicitly, yields a preconditioning technique for the degenerating linear systems behind these interpolation problems. This paper deals with both cases. For the increasingly flat limit, we recover results by Larsson and Fornberg together with Lee, Yoon, and Yoon concerning convergence of interpolants towards polynomials. With slight modifications, the same technique can also handle scenarios with coalescing data points for fixed radial basis functions. The results show that the degenerating local Lagrange interpolation problems converge towards certain Hermite-Birkhoff problems. This is an important prerequisite for dealing with approximation by radial basis functions adaptively, using freely varying data sites.
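The degeneration in the flat limit can be seen directly in the conditioning of the interpolation matrix. The following sketch (not from the paper; the data sites, shape-parameter values, and the Gaussian kernel choice are illustrative assumptions) builds the Gaussian interpolation matrix A with entries exp(-(eps*|x_i - x_j|)^2) on a few one-dimensional sites and reports its condition number as the shape parameter eps shrinks:

```python
import numpy as np

def gaussian_rbf_matrix(x, eps):
    """Interpolation matrix A_ij = exp(-(eps * |x_i - x_j|)^2)."""
    r = np.abs(x[:, None] - x[None, :])
    return np.exp(-(eps * r) ** 2)

# Five equispaced data sites on [0, 1] (an illustrative choice).
x = np.linspace(0.0, 1.0, 5)

# As eps shrinks (increasingly flat Gaussians), every matrix entry
# approaches 1, the matrix degenerates toward the rank-one all-ones
# matrix, and the condition number blows up -- the degenerating
# linear system the synopsis refers to.
for eps in (2.0, 1.0, 0.5, 0.25, 0.125):
    cond = np.linalg.cond(gaussian_rbf_matrix(x, eps))
    print(f"eps = {eps:6.3f}   cond(A) = {cond:.3e}")
```

Solving these systems directly becomes numerically hopeless for small eps, which is why the explicit regularization/preconditioning studied in the paper is needed; the sketch only exhibits the symptom, not the cure.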
SIMILAR VOLUMES
We consider error estimates for interpolation by a special class of compactly supported radial basis functions. These functions consist of a univariate polynomial within their support and are of minimal degree depending on space dimension and smoothness. Their associated "native" Hilbert spaces ar…
This paper discusses approximation errors for interpolation in a variational setting which may be obtained from the analysis given by Golomb and Weinberger. We show how this analysis may be used to derive the power function estimate of the error as introduced by Schaback and Powell. A simple error t…