
The eigenspace separation transform for neural-network classifiers

✍ Scribed by Don Torrieri


Publisher: Elsevier Science
Year: 1999
Language: English
File size: 156 KB
Volume: 12
Category: Article
ISSN: 0893-6080


✦ Synopsis


This paper presents a linear transform that compresses data in a manner designed to improve the performance of a neural network used as a binary classifier. The classifier is intended to accommodate data distributions that may be non-normal, may have equal class means, may be multimodal, and have unknown a priori probabilities for the two classes. The transform, which is called the eigenspace separation transform, allows the reduction of the size of a neural network while enhancing its generalization accuracy as a binary classifier. Published by Elsevier Science Ltd.
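As a rough sketch of the idea (not the paper's exact algorithm), the transform can be approximated as follows: form the correlation matrix of each class, eigendecompose the difference of the two matrices, and retain the eigenvector group (positive- or negative-eigenvalue) with the larger total eigenvalue magnitude; projecting onto those eigenvectors compresses the data before it is fed to the network. The function name and the selection rule below are assumptions for illustration.

```python
import numpy as np

def eigenspace_separation_transform(X0, X1):
    """Hypothetical sketch of the eigenspace separation transform.

    X0, X1: (n_samples, n_features) arrays for the two classes.
    Returns a projection matrix T; apply it as X @ T before the classifier.
    """
    # Per-class correlation matrices (rows are samples)
    R0 = X0.T @ X0 / X0.shape[0]
    R1 = X1.T @ X1 / X1.shape[0]
    M = R0 - R1                          # symmetric difference matrix
    eigvals, eigvecs = np.linalg.eigh(M)
    pos = eigvals > 0
    # Keep whichever eigenvalue-sign group carries more total magnitude
    # (assumed selection criterion), so the two classes differ in the
    # magnitude of their projected outputs.
    if eigvals[pos].sum() >= -eigvals[~pos].sum():
        return eigvecs[:, pos]
    return eigvecs[:, ~pos]

# Example: two classes with different second-order statistics
rng = np.random.default_rng(0)
X0 = rng.normal(size=(200, 6))
X1 = rng.normal(scale=2.0, size=(200, 6))
T = eigenspace_separation_transform(X0, X1)
Z0, Z1 = X0 @ T, X1 @ T   # reduced-dimension inputs for the network
```

Because the retained eigenvectors are a subset of an orthonormal eigenbasis, the projection is an orthonormal linear map, and its output dimension is at most the input dimension, which is what permits a smaller network.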


📜 SIMILAR VOLUMES


A classifier neural network for rotordyn
✍ R. Ganesan; Jin Jionghua; T.S. Sankar 📂 Article 📅 1995 🏛 Elsevier Science 🌐 English ⚖ 633 KB

A feedforward backpropagation neural network is formed to identify the stability characteristic of a high speed rotordynamic system. The principal focus resides in accounting for the instability due to the bearing clearance effects. The abnormal operating condition of 'normal-loose' Coulomb rub …

Neural Network Architecture for Synthesi
✍ Dominik; Jakub Wróblewski; Marcin Szczuka 📂 Article 📅 2003 🏛 Elsevier Science 🌐 English ⚖ 707 KB

We introduce a novel neural network architecture, referred to as the normalizing neural network (NNN), where the propagated signals take the form of finite probability distributions. Appropriately tuned NNN can be applied as the compound voting measure while classifying new cases on the basis of …