𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Incremental learning with multi-level adaptation

โœ Scribed by Abdelhamid Bouchachia


Publisher
Elsevier Science
Year
2011
Tongue
English
Weight
847 KB
Volume
74
Category
Article
ISSN
0925-2312

No coin nor oath required. For personal study only.

✦ Synopsis


Self-adaptation is an inherent part of any natural and intelligent system. Specifically, it is the ability of a system to reconcile its requirements, or its goal of existence, with the environment it interacts with by adopting an optimal behavior. Self-adaptation becomes crucial when the environment changes dynamically over time. In this paper, we investigate the self-adaptation of classification systems at three levels: (1) natural adaptation of the base learners to changes in the environment, (2) contributive adaptation when combining the base learners in an ensemble, and (3) structural adaptation of the combination as a form of dynamic ensemble. The present study focuses on neural network classification systems and on a special facet of self-adaptation, namely incremental learning (IL). With IL, the system self-adjusts to accommodate new and possibly non-stationary data samples arriving over time. The paper discusses various IL algorithms, shows how the three adaptation levels are inherent in the proposed system architecture, and demonstrates how efficiently this architecture deals with dynamic change in the presence of various types of data drift when these IL algorithms are applied.

© 2011 Elsevier B.V. All rights reserved.

The architecture provides adaptivity at three levels:

1. Adaptivity due to the nature of the classifiers: the base classifiers are self-adaptive by construction.
2. Adaptivity due to the proportional (weighted) contribution of each classifier to the ensemble decision.
3. Adaptivity due to structural updates (a dynamically changing structure) of the ensemble.
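The three levels above can be sketched in a few lines of code. This is an illustrative toy, not the paper's algorithm: `MajorityLearner`, the weight-decay rule, and the replacement threshold are all assumptions chosen to make the three levels visible, with an incremental base learner standing in for the paper's neural network classifiers.

```python
class MajorityLearner:
    """Toy incremental base learner: predicts the most frequent label
    seen so far (a stand-in for an incremental neural classifier)."""
    def __init__(self):
        self.counts = {}

    def partial_fit(self, x, y):
        self.counts[y] = self.counts.get(y, 0) + 1

    def predict(self, x):
        # Default to label 0 before any data has been seen.
        return max(self.counts, key=self.counts.get) if self.counts else 0


class WeightedIncrementalEnsemble:
    """Sketch of the three adaptation levels:
    (1) every member learns from each new sample (natural adaptation),
    (2) members vote with accuracy-tracking weights (contributive adaptation),
    (3) persistently weak members are replaced (structural adaptation)."""
    def __init__(self, make_learner, size=3, decay=0.5, drop_below=0.05):
        self.make_learner = make_learner          # factory for a fresh base learner
        self.members = [make_learner() for _ in range(size)]
        self.weights = [1.0] * size               # contributive voting weights
        self.decay = decay                        # forgetting factor for old accuracy
        self.drop_below = drop_below              # structural-replacement threshold

    def predict(self, x):
        votes = {}
        for member, w in zip(self.members, self.weights):
            label = member.predict(x)
            votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get)

    def learn(self, x, y):
        for i, member in enumerate(self.members):
            # Level 2: exponentially weighted running accuracy on the stream.
            correct = 1.0 if member.predict(x) == y else 0.0
            self.weights[i] = self.decay * self.weights[i] + (1 - self.decay) * correct
            # Level 1: each member adapts to the new sample.
            member.partial_fit(x, y)
        # Level 3: replace the weakest member if its weight has collapsed,
        # letting the ensemble recover after a concept drift.
        worst = min(range(len(self.members)), key=self.weights.__getitem__)
        if self.weights[worst] < self.drop_below:
            self.members[worst] = self.make_learner()
            self.weights[worst] = 0.5
```

On a stream whose label distribution flips mid-way, the weights of stale members collapse and those members are swapped out for fresh ones, so the ensemble tracks the drift at all three levels at once.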

📜 SIMILAR VOLUMES


A dynamic optimization approach for adap
โœ Marcelo N. Kapp; Robert Sabourin; Patrick Maupin, ๐Ÿ“‚ Article ๐Ÿ“… 2011 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 448 KB

A fundamental problem when performing incremental learning is that the best set of a classification system's parameters can change with the evolution of the data. Consequently, unless the system self-adapts to such changes, it will become obsolete, even if the application environment seems to be sta

Vector-field-smoothed Bayesian learning
โœ Jun-ichi Takahashi; Shigeki Sagayama ๐Ÿ“‚ Article ๐Ÿ“… 1997 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 366 KB

This paper describes an on-line adaptation method that combines maximum a posteriori (MAP) estimation for intra-class training (the training scheme incorporates new training samples with prior information) with vector field smoothing (VFS) for inter-class smoothing. Results of experiments comparing

Multi-level adaptive segmentation of mul
โœ A. Zavaljevski; A.P. Dhawan; M. Gaskil; W. Ball; J.D. Johnson ๐Ÿ“‚ Article ๐Ÿ“… 2000 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 709 KB

MR brain image segmentation into several tissue classes is of significant interest to visualize and quantify individual anatomical structures. Traditionally, the segmentation is performed manually in a clinical environment that is operator dependent and may be difficult to reproduce. Though several

Active learning with adaptive regulariza
โœ Zheng Wang; Shuicheng Yan; Changshui Zhang ๐Ÿ“‚ Article ๐Ÿ“… 2011 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 352 KB