Mixture of Experts Classification Using a Hierarchical Mixture Model
Authors: Titsias, Michalis K.; Likas, Aristidis
- Book ID
- 121244072
- Publisher
- MIT Press
- Year
- 2002
- Language
- English
- File Size
- 304 KB
- Volume
- 14
- Category
- Article
- ISSN
- 0899-7667
## Similar Volumes
- Despite the abundance of existing hydrological models, no single model has been identified that performs consistently across the range of possible catchment types and conditions. An attractive alternative to selecting a single model is to combine the results from s…
- Mixture of experts (ME) is a modular neural network architecture for supervised learning. A double-loop Expectation-Maximization (EM) algorithm has been introduced for adjusting the parameters of the ME architecture, and the iteratively reweighted least squares (IRLS) algorithm is used to perform max…
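The ME training scheme summarized above (an outer EM loop computing expert responsibilities, with per-expert weighted fits inside) can be sketched roughly as follows. This is a simplified illustration, not the paper's method: it uses hypothetical toy data, logistic experts, and a single gradient step in place of the exact IRLS inner loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two 1-D clusters with binary labels.
X = np.vstack([rng.normal(-2, 1, (50, 1)), rng.normal(2, 1, (50, 1))])
y = np.concatenate([np.zeros(50), np.ones(50)])

K = 2  # number of experts

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Each expert is a logistic model (W[k], b[k]);
# the gating network is a linear-softmax model (V, c).
W = rng.normal(size=(K, 1)); b = np.zeros(K)
V = rng.normal(size=(K, 1)); c = np.zeros(K)

lr = 0.1
for _ in range(200):
    g = softmax(X @ V.T + c)                    # gate: P(expert k | x)
    p = sigmoid(X @ W.T + b)                    # expert output: P(y=1 | x, k)
    lik = np.where(y[:, None] == 1, p, 1 - p)   # per-expert likelihood of y

    # E-step: posterior responsibility of each expert for each point.
    h = g * lik
    h /= h.sum(axis=1, keepdims=True)

    # M-step: one gradient ascent step per parameter
    # (a stand-in for the exact IRLS inner loop).
    err = (y[:, None] - p) * h                  # responsibility-weighted residuals
    W += lr * (err.T @ X) / len(X)
    b += lr * err.mean(axis=0)
    V += lr * ((h - g).T @ X) / len(X)
    c += lr * (h - g).mean(axis=0)

# Mixture prediction: gate-weighted average of the expert outputs.
pred = (softmax(X @ V.T + c) * sigmoid(X @ W.T + b)).sum(axis=1)
acc = ((pred > 0.5) == y).mean()
```

The E-step responsibilities `h` play the role of the hidden indicator variables in the EM formulation; replacing the single gradient step with repeated Newton updates on the same weighted objective would recover the IRLS-style inner loop the abstract refers to.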