In the recently published article cited above, an error was found in the caption for Figure 6. The correct caption is published below. FIGURE 6 | Soft-margin L1 support vector machine decision boundary estimate based on 600 training data points with C = 100.
Support vector machine regularization
By D. M. Reeves and G. M. Jacyna
- Publisher
- Wiley (John Wiley & Sons)
- Year
- 2011
- Language
- English
- File size
- 558 KB
- Volume
- 3
- Category
- Article
- ISSN
- 0163-1829
- DOI
- 10.1002/wics.149
Abstract
Finding the best decision boundary for a classification problem involves covariance structures, distance measures, and eigenvectors. This article considers how eigenstructures are an inherent part of the support vector machine (SVM) functional basis that encodes the geometric features of a separating hyperplane. SVM learning capacity involves an eigenvector set that spans the parameter space being learned. The linear SVM has been shown to have insufficient learning capacity when the number of training examples exceeds the dimension of the feature space. For this case, an incomplete eigenvector set spans the observation space. SVM architectures based on insufficient eigenstructures lack sufficient learning capacity for good separating hyperplanes. However, proper regularization ensures that two essential types of "biases" are encoded within SVM functional mappings: an appropriate set of algebraic (and thus geometric) relationships and a sufficient eigenstructure set. WIREs Comp Stat 2011, 3: 204–215. DOI: 10.1002/wics.149
This article is categorized under:
Statistical Learning and Exploratory Methods of the Data Sciences > Support Vector Machines
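The corrected Figure 6 caption refers to a soft-margin L1 SVM, i.e., a linear SVM whose slack variables enter the objective through an L1 (hinge-loss) penalty weighted by C, here C = 100. As a minimal, hypothetical sketch of that objective only (not the article's implementation, and with toy data rather than the article's 600 training points), a stochastic subgradient solver in pure Python:

```python
import random

def svm_subgradient(X, y, C=100.0, lr=0.001, epochs=200, seed=0):
    """Linear soft-margin (L1 hinge-loss) SVM trained by stochastic
    subgradient descent on (1/2)||w||^2 + C * sum_i max(0, 1 - y_i (w.x_i + b))."""
    random.seed(seed)
    n, d = len(X), len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        order = list(range(n))
        random.shuffle(order)
        for i in order:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            if margin < 1:
                # Hinge term active: subgradient includes -C * y_i * x_i.
                w = [wj - lr * (wj / n - C * y[i] * xj) for wj, xj in zip(w, X[i])]
                b += lr * C * y[i]
            else:
                # Only the (1/2)||w||^2 regularization term contributes.
                w = [wj - lr * (wj / n) for wj in w]
    return w, b

def predict(w, b, x):
    """Classify a point by the sign of the decision function w.x + b."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical, linearly separable toy data (labels in {-1, +1}).
X = [(2.0, 1.0), (1.5, -0.5), (2.5, 0.0),
     (-2.0, 0.5), (-1.5, -1.0), (-2.5, 0.3)]
y = [1, 1, 1, -1, -1, -1]
w, b = svm_subgradient(X, y, C=100.0)
print([predict(w, b, x) for x in X])
```

A large C, as in the figure, makes hinge violations expensive relative to the margin-width term, so the fitted boundary tolerates few training errors; the abstract's point is that the regularization term is what supplies the remaining "bias" when the data alone under-determine the hyperplane.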