
Consensus Principal Components

Author: L. P. Lefkovitch


Publisher: John Wiley and Sons
Year: 1993
Language: English
File size: 592 KB
Volume: 35
Category: Article
ISSN: 0323-3847



SIMILAR VOLUMES


Nonlinear principal components
Victor J. Yohai; Werner Ackermann; Cristina Haigh · Article · 1985 · Springer Netherlands · English · 794 KB
Discriminant principal components analysis
Peter W. Yendle; Halliday J. H. MacFie · Article · 1989 · John Wiley and Sons · English · 705 KB
Recursive principal components analysis
Thomas Voegtlin · Article · 2005 · Elsevier Science · English · 273 KB

A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series.
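The abstract above refers to Oja's constrained Hebbian rule, which drives a linear unit's weight vector toward the leading principal direction of its input while keeping its norm near one. A minimal NumPy sketch of the (non-recurrent) rule follows; the synthetic data, step size, and number of passes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data with one dominant direction (illustrative assumption):
# independent columns with standard deviations 2.0, 1.0, 0.5, 0.3, 0.1.
X = rng.normal(size=(2000, 5)) * np.array([2.0, 1.0, 0.5, 0.3, 0.1])

w = rng.normal(size=5)
w /= np.linalg.norm(w)          # random unit-norm initial weights
eta = 0.005                     # fixed learning rate (assumed value)

for _ in range(5):              # a few passes over the data
    for x in X:
        y = w @ x               # unit output (Hebbian "activity")
        # Oja's rule: Hebbian term y*x, with -y^2*w constraining ||w||.
        w += eta * y * (x - y * w)

# w should align with the top eigenvector of the sample covariance.
top = np.linalg.eigh(np.cov(X.T))[1][:, -1]
alignment = abs(w @ top)
print(alignment)                # should be close to 1
```

The subtracted `y * w` term is what distinguishes Oja's rule from plain Hebbian learning: it acts as an online normalization, so the weights converge to a unit vector instead of growing without bound.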

Stability of principal components
A. H. Al-Ibrahim; Noriah M. Al-Kandari · Article · 2007 · Springer · English · 193 KB
Principal component analysis
Hervé Abdi; Lynne J. Williams · Article · 2010 · Wiley (John Wiley & Sons) · English · 564 KB

Abstract: Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table and to represent it as a set of new orthogonal variables called principal components.
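The extraction of orthogonal variables described in that abstract can be sketched in a few lines of NumPy: center the table, take its singular value decomposition, and project onto the leading right singular vectors. The function name `pca` and the random example table are illustrative assumptions.

```python
import numpy as np

def pca(X, k):
    """Return the top-k component scores and the orthogonal axes (loadings)."""
    Xc = X - X.mean(axis=0)                        # center each variable
    # SVD of the centered table; rows of Vt are the orthogonal axes.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))                      # example data table
scores, axes = pca(X, 2)
# The new variables are uncorrelated: off-diagonal covariance is ~0.
print(np.round(np.cov(scores.T), 8))
```

Because the axes come from an SVD of the centered data, they are orthonormal and the resulting component scores are mutually uncorrelated, which is exactly the "set of new orthogonal variables" the abstract describes.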