𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Extension of mixture-of-experts networks for binary classification of hierarchical data

✍ Scribed by Shu-Kay Ng; Geoffrey J. McLachlan


Book ID
113469543
Publisher
Elsevier Science
Year
2007
Tongue
English
Weight
320 KB
Volume
41
Category
Article
ISSN
0933-3657



πŸ“œ SIMILAR VOLUMES


Improved learning algorithms for mixture of experts in multiclass classification
✍ K. Chen; L. Xu; H. Chi πŸ“‚ Article πŸ“… 1999 πŸ› Elsevier Science 🌐 English βš– 328 KB

Mixture of experts (ME) is a modular neural network architecture for supervised learning. A double-loop Expectation-Maximization (EM) algorithm has been introduced into the ME architecture to adjust its parameters, and the iteratively reweighted least squares (IRLS) algorithm is used to perform maximization in the inner loop.
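The ME architecture summarized above combines a gating network with several expert networks. As a rough illustration only, the following sketch shows a forward pass with softmax gating over logistic-regression experts for binary classification; the EM/IRLS fitting procedure from the abstract is not implemented, and all function names and weights here are illustrative assumptions, not the authors' code.

```python
import math

def softmax(scores):
    # numerically stable softmax over the gating scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def me_predict(x, gate_weights, expert_weights):
    """Mixture-of-experts forward pass (illustrative sketch).

    x              : input feature vector
    gate_weights   : one weight vector per expert for the gating network
    expert_weights : one weight vector per logistic-regression expert
    Returns P(y = 1 | x) as a gate-weighted mixture of expert outputs.
    """
    # gating network: linear scores -> softmax mixing proportions
    gates = softmax([sum(w * xi for w, xi in zip(wv, x)) for wv in gate_weights])
    # each expert outputs its own P(y = 1 | x) via a logistic model
    experts = [sigmoid(sum(w * xi for w, xi in zip(wv, x))) for wv in expert_weights]
    # mixture output: convex combination of expert probabilities
    return sum(g * e for g, e in zip(gates, experts))
```

With all-zero weights the gates are uniform and each expert returns 0.5, so the mixture prediction is 0.5; in general the gating network learns which expert to trust in which region of the input space.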