Discriminant feature extraction using empirical probability density estimation and a local basis library
✍ Authors: Naoki Saito; Ronald R. Coifman; Frank B. Geshwind; Fred Warner
- Publisher
- Elsevier Science
- Year
- 2002
- Language
- English
- File size
- 158 KB
- Volume
- 35
- Category
- Article
- ISSN
- 0031-3203
✦ Synopsis
The authors previously developed the so-called local discriminant basis (LDB) method for signal and image classification problems. The original LDB method relies on differences in the time-frequency energy distribution of each class: it selects the subspaces where these energy distributions are well separated by some measure such as the Kullback-Leibler divergence. Through our experience and experiments on various datasets, however, we realized that the time-frequency energy distribution is not always the best quantity to analyze for classification. In this paper, we propose to base the discrimination of coordinates, instead, on empirical probability densities. That is, we estimate the probability density of each class in each coordinate of the wavelet packet/local trigonometric bases after expanding signals into such bases. We then evaluate the discriminant power of each subspace by selecting the m most discriminant coordinates in terms of the "distance" among the corresponding densities (e.g., the Kullback-Leibler divergence among the densities). This information is then used to select a basis for classification. We demonstrate the capability of this algorithm using both synthetic and real datasets.
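To illustrate the coordinate-ranking step described in the synopsis, the sketch below estimates a histogram density per coordinate for each of two classes and ranks coordinates by a symmetrised Kullback-Leibler divergence between those densities. This is a simplified two-class illustration under assumed names (`coordinate_kl_scores`, `top_m_coordinates`), not the authors' implementation; in the actual method the coordinates would be coefficients of a wavelet packet or local trigonometric expansion, and the scores would drive the basis selection.

```python
import numpy as np

def coordinate_kl_scores(X0, X1, n_bins=16, eps=1e-10):
    """Score each coordinate by the symmetrised KL divergence between
    the empirical (histogram) densities of the two classes.

    X0, X1 : arrays of shape (n_samples, n_coords), one per class.
    """
    n_coords = X0.shape[1]
    scores = np.empty(n_coords)
    for j in range(n_coords):
        # Shared bin edges so the two histograms are comparable.
        lo = min(X0[:, j].min(), X1[:, j].min())
        hi = max(X0[:, j].max(), X1[:, j].max())
        edges = np.linspace(lo, hi, n_bins + 1)
        p, _ = np.histogram(X0[:, j], bins=edges)
        q, _ = np.histogram(X1[:, j], bins=edges)
        # Normalise to empirical densities; eps avoids log(0).
        p = p / p.sum() + eps
        q = q / q.sum() + eps
        # Symmetrised KL divergence D(p||q) + D(q||p).
        scores[j] = np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))
    return scores

def top_m_coordinates(X0, X1, m):
    """Return the indices of the m most discriminant coordinates."""
    scores = coordinate_kl_scores(X0, X1)
    return np.argsort(scores)[::-1][:m]

# Toy example: classes differ only in coordinate 3 (a mean shift).
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(200, 8))
X1 = rng.normal(0.0, 1.0, size=(200, 8))
X1[:, 3] += 3.0
best = top_m_coordinates(X0, X1, m=2)
```

In this toy setup the ranking recovers coordinate 3 as the most discriminant, since its class densities barely overlap while all other coordinates share the same distribution.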