Improved Bounds on the Sample Complexity of Learning
Authors: Yi Li; Philip M. Long; Aravind Srinivasan
- Publisher: Elsevier Science
- Year: 2001
- Language: English
- File size: 137 KB
- Volume: 62
- Category: Article
- ISSN: 0022-0000
Synopsis
We present a new general upper bound on the number of examples required to estimate all of the expectations of a set of random variables uniformly well. The quality of the estimates is measured using a variant of the relative error proposed by Haussler and Pollard. We also show that our bound is within a constant factor of the best possible. Our upper bound implies improved bounds on the sample complexity of learning according to Haussler's decision theoretic model.
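The kind of uniform estimation the synopsis describes can be illustrated with a small simulation: estimate the expectations of several random variables from a single sample, and score every estimate with a relative-error-style distance. The sketch below is my own illustration, not from the paper; in particular the distance $d_\nu(r,s) = |r-s|/(\nu + r + s)$ is an assumption about the "variant of the relative error" the synopsis refers to, and the Bernoulli family and sample size are hypothetical.

```python
import random

def d_nu(r, s, nu):
    """Relative-error-style distance |r - s| / (nu + r + s).
    Assumed form of the metric; see the paper for the exact variant."""
    return abs(r - s) / (nu + r + s)

def empirical_means(variables, m, rng):
    """Estimate E[X] for each 0/1 random variable by averaging m draws."""
    return [sum(draw(rng) for _ in range(m)) / m for draw in variables]

# Hypothetical family: Bernoulli variables with different biases.
biases = [0.1, 0.5, 0.9]
variables = [lambda rng, p=p: 1 if rng.random() < p else 0 for p in biases]

rng = random.Random(0)
estimates = empirical_means(variables, m=10_000, rng=rng)

# "Uniformly well" means even the worst variable is estimated accurately.
worst = max(d_nu(p, est, nu=0.05) for p, est in zip(biases, estimates))
print(worst)  # small once m exceeds the required sample complexity
```

The paper's contribution is a tight bound on how large `m` must be so that `worst` is below a target accuracy for every variable in the class simultaneously, not just on average.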
SIMILAR VOLUMES
In this note we consider the zero-finding problem for a homogeneous polynomial system. The well-determined ($m = n$) and underdetermined ($m < n$) cases are considered together. We also let $D = \max_i d_i$, $d = (d_1, \ldots, d_m)$, and … The projective Newton method has been introduced by Shub in [6] and is defined
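The snippet above is cut off before the projective Newton iteration is defined, but the classical affine Newton step it builds on, $x \leftarrow x - Df(x)^{-1} f(x)$, is easy to sketch for the well-determined case. The toy $m = n = 2$ polynomial system below is my own illustration, not from that note:

```python
def newton_step(x, y):
    """One affine Newton step for the toy system
    f1 = x^2 + y^2 - 1, f2 = x - y, solving the 2x2 linear
    system J * delta = f by Cramer's rule."""
    f1 = x * x + y * y - 1.0
    f2 = x - y
    a, b = 2.0 * x, 2.0 * y   # Jacobian row for f1
    c, d = 1.0, -1.0          # Jacobian row for f2
    det = a * d - b * c
    dx = (f1 * d - b * f2) / det
    dy = (a * f2 - c * f1) / det
    return x - dx, y - dy

x, y = 1.0, 0.5  # starting guess
for _ in range(20):
    x, y = newton_step(x, y)
print(x, y)  # converges to (sqrt(2)/2, sqrt(2)/2)
```

The projective variant discussed in the note normalizes the iterate on the sphere at each step, which is what makes homogeneous systems tractable; that refinement is omitted here.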
## Abstract After giving a new proof of a well-known theorem of Dirac on critical graphs, we discuss the elegant upper bounds of Matula and Szekeres–Wilf which follow from it. In order to improve these bounds, we consider the following fundamental coloring problem: given an edge-cut $(V_1, V$
## Abstract We draw the $n$-dimensional hypercube in the plane with $\frac{5}{32}4^{n}-\left\lfloor\frac{n^{2}+1}{2}\right\rfloor 2^{n-2}$ crossings, which improves the previous best estimate and coincides with the long-conjectured upper bound of Erdős and Guy. © 2008 Wiley Periodicals, Inc. J Graph
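The closed-form bound in the last snippet is straightforward to evaluate. The short check below is my own, using exact rational arithmetic because $\frac{5}{32}4^{n}$ is not an integer for $n = 2$:

```python
from fractions import Fraction

def hypercube_crossing_bound(n):
    """Evaluate (5/32)*4^n - floor((n^2 + 1)/2) * 2^(n-2) exactly."""
    return Fraction(5, 32) * 4**n - ((n * n + 1) // 2) * 2 ** (n - 2)

for n in range(2, 8):
    print(n, hypercube_crossing_bound(n))
```

For $n = 3$ the bound evaluates to 0, consistent with the planarity of $Q_3$, and for $n = 4$ it gives 8.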