๐”– Bobbio Scriptorium
โœฆ   LIBER   โœฆ

Adaptive incremental learning in neural networks

โœ Scribed by Abdelhamid Bouchachia; Nadia Nedjah


Publisher
Elsevier Science
Year
2011
Tongue
English
Weight
81 KB
Volume
74
Category
Article
ISSN
0925-2312

No coin nor oath required. For personal study only.


📜 SIMILAR VOLUMES


Adaptive optimization in neural networks
โœ K.Y.M. Wong; D. Sherrington ๐Ÿ“‚ Article ๐Ÿ“… 1992 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 202 KB

We apply the principle of adaptation to optimize the performance of neural networks with (i) noisy retrieval and (ii) disruptive dilution. Learning in neural networks can be described as a search procedure in the space of the adjustable synaptic weights. A performance function

Incremental learning with multi-level adaptation
✍ Abdelhamid Bouchachia 📂 Article 📅 2011 🏛 Elsevier Science 🌐 English ⚖ 847 KB

Self-adaptation is an inherent part of any natural and intelligent system. Specifically, it is about the ability of a system to reconcile its requirements or goal of existence with the environment it is interacting with, by adopting an optimal behavior. Self-adaptation becomes crucial when the envir

Automatic learning in chaotic neural networks
✍ Masataka Watanabe; Kazuyuki Aihara; Shunsuke Kondo 📂 Article 📅 1996 🏛 John Wiley and Sons 🌐 English ⚖ 445 KB 👁 1 view

Learning from hints in neural networks
✍ Yaser S. Abu-Mostafa 📂 Article 📅 1990 🏛 Elsevier Science 🌐 English ⚖ 383 KB

A dynamic optimization approach for adaptive incremental learning
✍ Marcelo N. Kapp; Robert Sabourin; Patrick Maupin 📂 Article 📅 2011 🏛 John Wiley and Sons 🌐 English ⚖ 448 KB

A fundamental problem when performing incremental learning is that the best set of a classification system's parameters can change with the evolution of the data. Consequently, unless the system self-adapts to such changes, it will become obsolete, even if the application environment seems to be sta