𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Mixture of Experts Classification Using a Hierarchical Mixture Model

✍ Scribed by Titsias, Michalis K.; Likas, Aristidis


Book ID
121244072
Publisher
MIT Press
Year
2002
Tongue
English
Weight
304 KB
Volume
14
Category
Article
ISSN
0899-7667

No coin nor oath required. For personal study only.


📜 SIMILAR VOLUMES


Towards dynamic catchment modelling: a B
✍ Lucy Marshall; David Nott; Ashish Sharma 📂 Article 📅 2007 🏛 John Wiley and Sons 🌐 English ⚖ 386 KB

Abstract: Despite the abundance of existing hydrological models, there is no single model that has been identified as performing consistently over the range of possible catchment types and catchment conditions. An attractive alternative to selecting a single model is to combine the results from s…

Face Detection Using Mixture of MLP Expe
✍ Reza Ebrahimpour; Ehsanollah Kabir; Mohammad Reza Yousefi 📂 Article 📅 2007 🏛 Springer US 🌐 English ⚖ 690 KB
Mixture of experts: a literature survey
✍ Masoudnia, Saeed; Ebrahimpour, Reza 📂 Article 📅 2012 🏛 Springer Netherlands 🌐 English ⚖ 383 KB
Improved learning algorithms for mixture
✍ K. Chen; L. Xu; H. Chi 📂 Article 📅 1999 🏛 Elsevier Science 🌐 English ⚖ 328 KB

Mixture of experts (ME) is a modular neural network architecture for supervised learning. A double-loop Expectation-Maximization (EM) algorithm has been introduced to the ME architecture for adjusting the parameters, and the iteratively reweighted least squares (IRLS) algorithm is used to perform maximization…
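The ME forward pass described in this abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the dimensions, weight matrices, and function names (`moe_predict`, `gate_W`, `expert_Ws`) are illustrative assumptions. A softmax gating network weights each expert's class posterior, and the mixture output is their convex combination.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(x, gate_W, expert_Ws):
    """Minimal mixture-of-experts classifier forward pass (illustrative).

    g = softmax(gate_W @ x) gives one weight per expert; the mixture
    output is sum_j g_j * softmax(W_j @ x), a valid class distribution.
    """
    g = softmax(gate_W @ x)                                      # gating probabilities, shape (n_experts,)
    expert_out = np.stack([softmax(W @ x) for W in expert_Ws])   # shape (n_experts, n_classes)
    return g @ expert_out                                        # convex combination of expert posteriors

# Toy dimensions: 3 input features, 2 experts, 4 classes (assumed for illustration)
rng = np.random.default_rng(0)
x = rng.normal(size=3)
gate_W = rng.normal(size=(2, 3))
expert_Ws = [rng.normal(size=(4, 3)) for _ in range(2)]
p = moe_predict(x, gate_W, expert_Ws)
print(p.sum())  # the mixture output sums to 1, since it is a convex combination of distributions
```

In the double-loop EM scheme the abstract mentions, the outer loop computes expected expert responsibilities given such a forward pass, while the inner loop (IRLS) maximizes the resulting weighted likelihood with respect to the gating and expert parameters.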