Worst Case Complexity of Problems with Random Information Noise
By Leszek Plaskota
- Publisher
- Elsevier Science
- Year
- 1996
- Language
- English
- Size
- 326 KB
- Volume
- 12
- Category
- Article
- ISSN
- 0885-064X
Synopsis
We study the worst case complexity of solving problems for which information is partial and contaminated by random noise. It is well known that if information is exact then adaption does not help for solving linear problems, i.e., for approximating linear operators over convex and symmetric sets. On the other hand, randomization can sometimes help significantly. It turns out that for noisy information, adaption may lead to much better approximations than nonadaption, even for linear problems. This holds because, in the presence of noise, adaption is equivalent to randomization. We present sharp bounds on the worst case complexity of problems with random noise in terms of the randomized complexity with exact information. The results obtained are applied to the d-variate integration and L∞-approximation of functions belonging to Hölder and Sobolev classes. Information is given by function evaluations with Gaussian noise of variance σ². For exact information, the two problems are intractable since the complexity is proportional to (1/ε)^q, where q grows linearly with d. For noisy information the situation is different. For integration, the complexity is of order σ²/ε² as ε goes to zero. Hence the curse of dimensionality is broken due to random noise. For approximation, the complexity is of order σ²(1/ε)^(q+2) ln(1/ε), and the problem is intractable also with random noise.
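The contrast between the stated complexity bounds can be made concrete with a small numerical sketch. This is purely illustrative: the paper gives orders of magnitude, not constants, so all constants below are set to 1, and the exponent q + 2 for the approximation bound is taken as read from the synopsis.

```python
import math

def exact_complexity(eps, q):
    """Exact information: complexity proportional to (1/eps)^q, q growing linearly with d."""
    return (1.0 / eps) ** q

def noisy_integration_complexity(eps, sigma):
    """Noisy information, integration: of order sigma^2 / eps^2 -- no dependence on q or d."""
    return sigma ** 2 / eps ** 2

def noisy_approximation_complexity(eps, sigma, q):
    """Noisy information, approximation: of order sigma^2 (1/eps)^(q+2) ln(1/eps)."""
    return sigma ** 2 * (1.0 / eps) ** (q + 2) * math.log(1.0 / eps)

# Illustrative parameters (hypothetical choices, not taken from the paper):
eps, sigma, q = 0.01, 1.0, 10

print(exact_complexity(eps, q))                        # exponential in q: curse of dimensionality
print(noisy_integration_complexity(eps, sigma))       # independent of q: the curse is broken
print(noisy_approximation_complexity(eps, sigma, q))  # still exponential in q: intractable
```

The key qualitative point survives the crude constants: the noisy integration bound does not involve q at all, while both the exact bound and the noisy approximation bound blow up exponentially as q (and hence d) grows.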