Density-induced margin support vector machines
- Authors: Li Zhang; Wei-Da Zhou
- Publisher
- Elsevier Science
- Year
- 2011
- Language
- English
- File size
- 503 KB
- Volume
- 44
- Category
- Article
- ISSN
- 0031-3203
No payment or registration required. For personal study only.
Synopsis
This paper proposes a new classifier called density-induced margin support vector machines (DMSVMs). DMSVMs belong to a family of SVM-like classifiers and thus inherit good properties from support vector machines (SVMs), e.g., a unique and global solution and a sparse representation of the decision function. For a given data set, DMSVMs require extracting relative density degrees for all training data points; these density degrees can be taken as the relative margins of the corresponding training points. We also propose a method for estimating relative density degrees using the K-nearest-neighbor method, and derive and prove an upper bound on the leave-one-out error of DMSVMs for binary classification. Promising results are obtained on both toy and real-world data sets.
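The K-nearest-neighbor idea mentioned in the synopsis can be sketched as follows: a point in a dense region has a small average distance to its K nearest neighbors, so its inverse average distance serves as a crude density estimate that can be normalized into a relative degree. The exact formula used in the paper may differ; the function and its normalization below are illustrative assumptions, not the authors' definition.

```python
import math

def knn_density_degrees(points, k=3):
    """Estimate a relative density degree in (0, 1] for each point.

    Smaller average distance to the K nearest neighbors means higher
    density; degrees are normalized so the densest point gets 1.0.
    (An illustrative sketch, not the DMSVM paper's exact formula.)
    """
    n = len(points)
    densities = []
    for i in range(n):
        # Sorted distances from point i to every other point.
        dists = sorted(math.dist(points[i], points[j])
                       for j in range(n) if j != i)
        avg = sum(dists[:k]) / k          # mean distance to K nearest neighbors
        densities.append(1.0 / max(avg, 1e-12))
    top = max(densities)
    return [d / top for d in densities]  # densest point maps to 1.0

# A tight cluster plus one outlier: the outlier receives a low degree.
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (5.0, 5.0)]
degrees = knn_density_degrees(pts, k=2)
```

In a DMSVM-style setup, such degrees would then act as per-point relative margins, letting points in dense regions influence the decision boundary differently from isolated ones.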
SIMILAR VOLUMES
The support vector machine (SVM) has been very successful in pattern recognition and function estimation problems. In this paper, we introduce the use of SVM for multivariate fuzzy linear and nonlinear regression models. Using the basic idea underlying SVM for multivariate fuzzy regressions gives comput…
We show that support vector machines of the 1-norm soft margin type are universally consistent provided that the regularization parameter is chosen in a distinct manner and the kernel belongs to a specific class, the so-called universal kernels, which has recently been considered by the author. In par…