The eigenspace separation transform for neural-network classifiers
Author: Don Torrieri
- Publisher
- Elsevier Science
- Year
- 1999
- Language
- English
- File size
- 156 KB
- Volume
- 12
- Category
- Article
- ISSN
- 0893-6080
No payment or registration required. For personal study only.
Synopsis
This paper presents a linear transform that compresses data in a manner designed to improve the performance of a neural network used as a binary classifier. The classifier is intended to accommodate data distributions that may be non-normal, may have equal class means, may be multimodal, and may have unknown a priori probabilities for the two classes. The transform, which is called the eigenspace separation transform, allows the size of a neural network to be reduced while enhancing its generalization accuracy as a binary classifier. Published by Elsevier Science Ltd.
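The synopsis describes the transform only at a high level. As a hedged sketch of the general idea (the matrix definitions, the eigenvalue-sign selection rule, and all function and variable names below are assumptions for illustration, not the paper's exact construction), an eigenspace separation transform can be built by eigendecomposing the difference of the two classes' correlation matrices and projecting onto the dominant-sign eigenvectors:

```python
import numpy as np

def eigenspace_separation_transform(X1, X2, k=None):
    """Sketch of an eigenspace separation transform (assumed form).

    X1, X2 : arrays of shape (n_samples, n_features), one per class.
    k      : number of eigenvectors to keep (None keeps all selected).
    Returns a (n_features, k) projection matrix.
    """
    # Per-class (uncentred) correlation matrices
    R1 = X1.T @ X1 / X1.shape[0]
    R2 = X2.T @ X2 / X2.shape[0]
    # Difference matrix is symmetric, so its eigenvalues are real
    M = R1 - R2
    eigvals, eigvecs = np.linalg.eigh(M)
    # Keep the eigenvectors whose eigenvalue sign dominates in total magnitude
    pos_sum = eigvals[eigvals > 0].sum()
    neg_sum = -eigvals[eigvals < 0].sum()
    if pos_sum >= neg_sum:
        order = np.argsort(eigvals)[::-1]   # largest positive first
    else:
        order = np.argsort(eigvals)          # most negative first
    if k is not None:
        order = order[:k]
    return eigvecs[:, order]

# Usage: compress both classes before feeding a small neural classifier
rng = np.random.default_rng(0)
X1 = rng.normal(size=(100, 8))
X2 = rng.normal(scale=2.0, size=(100, 8))
T = eigenspace_separation_transform(X1, X2, k=3)
Y1 = X1 @ T   # reduced-dimension features, shape (100, 3)
```

Because the projection is linear and low-dimensional, the downstream network needs fewer input weights, which is the size reduction the synopsis refers to.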
SIMILAR VOLUMES
A feedforward backpropagation neural network is formed to identify the stability characteristic of a high speed rotordynamic system. The principal focus resides in accounting for the instability due to the bearing clearance effects. The abnormal operating condition of 'normal-loose' Coulomb rub …
We introduce a novel neural network architecture, referred to as the normalizing neural network (NNN), where the propagated signals take the form of finite probability distributions. Appropriately tuned NNN can be applied as the compound voting measure while classifying new cases …