## Feedforward neural networks for principal components extraction
- Author: Sandro Nicole
- Publisher: Elsevier Science
- Year: 2000
- Language: English
- Size: 233 KB
- Volume: 33
- Category: Article
- ISSN: 0167-9473
## Synopsis
In recent times, an upsurge of interest has been observed in the study of artificial neural networks capable of computing a principal component extraction. The present work is devoted to the description and performance analysis (by means of computer simulations) of some neural networks of this kind. The main conclusion reached is that, while the first principal component is almost always efficiently obtained, lesser components tend to be sloppily approximated. All the nets considered here share the interesting feature of being endowed with a feedforward connectivity, together with a Hebbian law of synaptic weight adjustment. Their potential usefulness as modular tools, to be inserted in more complex models of psychological and neurological functions, has been suggested. There is, however, for the time being, no clear evidence supporting any real biological implementation of these simple computational architectures.
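The synopsis does not specify which networks the paper simulates, so as an illustration only, here is a minimal sketch of one well-known network of the kind described — a single feedforward layer trained with Sanger's generalized Hebbian rule, which drives the weight rows toward the leading principal components. The function name, learning rate, and toy data are all assumptions, not the paper's setup.

```python
import numpy as np

def gha(X, n_components, lr=0.01, epochs=200, seed=0):
    """Generalized Hebbian algorithm (Sanger's rule): a feedforward
    layer whose weight rows converge to the leading principal
    components of the (centred) input data."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(n_components, X.shape[1]))
    Xc = X - X.mean(axis=0)          # PCA assumes centred data
    for _ in range(epochs):
        for x in Xc:
            y = W @ x                # feedforward pass
            # Hebbian outer-product term, minus a lower-triangular
            # decorrelation term that orders the components
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# Toy data with one dominant direction of variance
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3)) * np.array([3.0, 1.0, 0.3])
W = gha(X, n_components=2)

# Compare with exact PCA via SVD: |cosine| with the true first
# principal direction should approach 1
_, _, Vt = np.linalg.svd(X - X.mean(axis=0))
print(abs(W[0] / np.linalg.norm(W[0]) @ Vt[0]))
```

In line with the paper's main conclusion, the first row of `W` typically aligns well with the leading principal direction, while later rows are noticeably less accurate.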
## Similar volumes
This paper describes a method of extracting diagnostic rules from trained diagnostic feedforward neural nets that are constructed to recognise different mechanical faults using automated weight and structure learning algorithms. The rule extracting method is based on an interpretation that considers