Classificatory filtering in decision systems
✍ Authors: Hui Wang; Ivo Düntsch; Günther Gediga
- Book ID
- 104347995
- Publisher
- Elsevier Science
- Year
- 2000
- Language
- English
- File size
- 209 KB
- Volume
- 23
- Category
- Article
- ISSN
- 0888-613X
✦ Synopsis
Classificatory data filtering is concerned with reducing data in size while preserving classification information. Düntsch and Gediga [I. Düntsch, G. Gediga, International Journal of Approximate Reasoning 18 (1998) 93–106] presented a first approach to this problem. Their technique collects values of a single feature into a single value. In this paper we present a novel approach to classificatory filtering, which can be regarded as a generalisation of the approach in the above-mentioned paper. This approach is aimed at collecting values of a set of features into a single value. We look at the problem abstractly in the context of lattices. We focus on hypergranules (arrays of sets) in a problem domain, and it turns out that the collection of all hypergranules can be made into a lattice. Our solution (namely, the LM algorithm) is formulated to find a set of maximal elements for each class, which covers all elements in a given dataset and is consistent with the dataset. This is done through the lattice sum operation. In terms of decision systems, LM collects attribute values while preserving classification structure.
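To make the abstract's construction concrete, the following is a minimal sketch, assuming the lattice on hypergranules is the product of power-set lattices, so the lattice sum is the componentwise union of attribute value sets. The greedy merge loop below (`lm`) is an illustrative simplification of the LM idea described above, not the paper's exact algorithm: each data row becomes an elementary hypergranule, and granules within a class are summed as long as the result remains consistent (covers no elementary granule of another class). All names (`gr_sum`, `covers`, `lm`) are hypothetical.

```python
from typing import List, Tuple, Dict

# A hypergranule is an array of sets: one set of values per attribute.
Hypergranule = List[frozenset]

def gr_sum(g: Hypergranule, h: Hypergranule) -> Hypergranule:
    """Lattice sum: componentwise union of the attribute value sets."""
    return [a | b for a, b in zip(g, h)]

def covers(g: Hypergranule, h: Hypergranule) -> bool:
    """g covers h if each component of h is contained in g's component."""
    return all(b <= a for a, b in zip(g, h))

def lm(dataset: List[Tuple[tuple, str]]) -> Dict[str, List[Hypergranule]]:
    """Greedy LM-style filtering (illustrative sketch).

    dataset: list of (attribute-value tuple, class label) pairs.
    Returns, per class, a list of merged hypergranules that together
    cover the class's rows and stay consistent with the dataset.
    """
    elems = [([frozenset([v]) for v in row], label) for row, label in dataset]
    result: Dict[str, List[Hypergranule]] = {}
    for g, label in elems:
        merged = False
        for i, h in enumerate(result.get(label, [])):
            s = gr_sum(h, g)
            # Consistency check: the sum must not cover any elementary
            # hypergranule belonging to a different class.
            if not any(covers(s, e) for e, l in elems if l != label):
                result[label][i] = s
                merged = True
                break
        if not merged:
            result.setdefault(label, []).append(g)
    return result
```

For example, rows `(0,0)` and `(0,1)` of class `a` merge into the single hypergranule `[{0}, {0,1}]`, because that sum does not cover the class-`b` row `(1,1)`; the reduction in row count is what the paper's reduction ratio tracks.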
To use the filtered data for classification, we present and justify two measures (C0 and C1) for the relationship between two hypergranules. Based on the measures, we propose an algorithm (C2) for classification.
Both algorithms are evaluated using real-world datasets and are compared with C4.5. The results are analysed using statistical tests, and it turns out that there is no statistical difference between the two. Regression analysis shows that the reduction ratio is a strong indicator of prediction success.