k-nearest neighbor (k-NN) classification is a well-known decision rule that is widely used in pattern classification. However, the traditional implementation of this method is computationally expensive. In this paper we develop two effective techniques, namely, template condensing and preprocessing, to
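The decision rule mentioned in the abstract can be illustrated with a minimal brute-force sketch (this is only the plain k-NN rule, not the paper's condensing or preprocessing techniques; all names below are illustrative):

```python
# Minimal k-NN classifier: brute-force Euclidean distance plus majority vote.
# Illustrative sketch only; real implementations use indexing structures
# to avoid the O(n) scan per query that the abstract calls expensive.
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    # Distance from the query to every training point.
    dists = [(math.dist(x, query), y) for x, y in zip(train_X, train_y)]
    dists.sort(key=lambda t: t[0])
    # Majority vote among the k nearest neighbors.
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]
```

The full scan over the training set on every query is exactly the cost that condensing (shrinking the template set) and preprocessing aim to reduce.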
LIBER
Improving nearest neighbor classification with cam weighted distance
Authors: Chang Yin Zhou; Yan Qiu Chen
- Book ID: 108234328
- Publisher: Elsevier Science
- Year: 2006
- Language: English
- Size: 428 KB
- Volume: 39
- Category: Article
- ISSN: 0031-3203
No payment or registration required. For personal study only.
SIMILAR VOLUMES
Improved k-nearest neighbor classification
- Authors: Yingquan Wu; Krassimir Ianakiev; Venu Govindaraju
- Category: Article
- Year: 2002
- Publisher: Elsevier Science
- Language: English
- Size: 113 KB
Overfit prevention in adaptive weighted
- Authors: Elham Parvinnia; Mohammad R. Moosavi; Mansoor Z. Jahromi; Koorush Ziarati
- Category: Article
- Year: 2011
- Publisher: Elsevier
- Language: English
- Size: 202 KB
Improving nearest neighbor rule with a simple adaptive distance measure
- Authors: Jigang Wang; Predrag Neskovic; Leon N. Cooper
- Category: Article
- Year: 2007
- Publisher: Elsevier Science
- Language: English
- Size: 164 KB
Nearest-neighbor classification with categorical variables
- Authors: Samuel E. Buttrey
- Category: Article
- Year: 1998
- Publisher: Elsevier Science
- Language: English
- Size: 736 KB
A technique is presented for adapting nearest-neighbor classification to the case of categorical variables. The set of categories is mapped onto the real line in such a way as to maximize the ratio of total sum of squares to within-class sum of squares, aggregated over classes. The resulting real va
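The criterion described in this abstract can be sketched directly: given a candidate assignment of categories to real values, score it by the ratio of total sum of squares to within-class sum of squares (this sketch only evaluates the criterion for a given mapping; it is not the paper's optimization procedure, and the function and variable names are illustrative):

```python
# Score a category-to-real-line mapping by the ratio of total sum of
# squares to within-class sum of squares of the mapped values.
# Higher ratios mean the mapping spreads classes apart relative to
# their internal scatter. Illustrative sketch of the criterion only.
from collections import defaultdict

def ss_ratio(mapping, categories, classes):
    vals = [mapping[c] for c in categories]
    mean = sum(vals) / len(vals)
    total_ss = sum((v - mean) ** 2 for v in vals)

    # Group mapped values by class label, then sum squared deviations
    # from each class mean.
    by_class = defaultdict(list)
    for v, cls in zip(vals, classes):
        by_class[cls].append(v)
    within_ss = sum(
        sum((v - sum(g) / len(g)) ** 2 for v in g)
        for g in by_class.values()
    )
    # Perfectly class-separated mappings have zero within-class scatter.
    return total_ss / within_ss if within_ss > 0 else float("inf")
```

A search over candidate mappings (the part the abstract leaves to the paper) would then pick the assignment maximizing this ratio before running ordinary nearest-neighbor on the mapped values.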
Nearest Neighbor Classification with Exc
- Authors: L. Ketskeméty
- Category: Article
- Year: 2002
- Publisher: Springer US
- Language: English
- Size: 102 KB
Combining atlas based segmentation and i
- Authors: Michaël Sdika
- Category: Article
- Year: 2010
- Publisher: Elsevier Science
- Language: English
- Size: 753 KB