𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Self-organization and dynamics reduction in recurrent networks: stimulus presentation and learning

โœ Scribed by Emmanuel Dauce; Mathias Quoy; Bruno Cessac; Bernard Doyon; Manuel Samuelides


Publisher: Elsevier Science
Year: 1998
Tongue: English
Weight: 409 KB
Volume: 11
Category: Article
ISSN: 0893-6080


✦ Synopsis


Freeman's investigations of the olfactory bulb of the rabbit showed that its signal dynamics were chaotic, and that recognition of a learned stimulus is linked to a reduction in the dimension of the attractor of the dynamics. In this paper we address the question of whether this behavior is specific to that particular architecture or whether it is a general property. We study the dynamics of a non-convergent recurrent model, the random recurrent neural network, in which a mean-field theory can be used to analyze the autonomous dynamics. We extend this approach with various observations on significant changes in the dynamical regime when static random stimuli are presented. We then propose a Hebb-like learning rule, viewed as a self-organizing dynamical process that induces a specific reactivity to one random stimulus. We numerically demonstrate the dynamics reduction during the learning and recognition processes, and analyze it in terms of the distribution of local neural activity.
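For intuition, here is a minimal numerical sketch of the kind of model the synopsis describes: a random recurrent network of tanh units driven by a static random stimulus, with an illustrative Hebb-like coupling update and a crude attractor-dimension proxy (the PCA participation ratio). The network size, gain `g`, and learning rate `alpha` below are assumptions for illustration, not the paper's actual values or its exact learning rule.

```python
import numpy as np

def simulate(J, stim, T=200, g=1.5, rng=None):
    """Iterate x(t+1) = tanh(g * J @ x(t) + stim) and return the trajectory."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = J.shape[0]
    x = rng.uniform(-0.1, 0.1, n)          # small random initial state
    traj = np.empty((T, n))
    for t in range(T):
        x = np.tanh(g * J @ x + stim)
        traj[t] = x
    return traj

def effective_dimension(traj):
    """Participation ratio of the PCA spectrum: a rough attractor-dimension proxy."""
    ev = np.linalg.eigvalsh(np.cov(traj.T))
    ev = np.clip(ev, 0.0, None)
    denom = (ev ** 2).sum()
    return 0.0 if denom < 1e-12 else ev.sum() ** 2 / denom

rng = np.random.default_rng(1)
n = 100
J = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))   # random Gaussian couplings
stim = rng.normal(0.0, 0.5, n)                  # static random stimulus

traj = simulate(J, stim, rng=rng)

# Hebb-like update driven by the stimulus-evoked activity (illustrative only)
alpha = 0.01
x_bar = traj[-50:].mean(axis=0)                 # mean late activity under the stimulus
J_learned = J + alpha * np.outer(x_bar, x_bar)

# Compare dimension proxies before and after learning (transients discarded)
d_before = effective_dimension(simulate(J, stim, rng=rng)[50:])
d_after = effective_dimension(simulate(J_learned, stim, rng=rng)[50:])
```

Whether `d_after` actually drops below `d_before` depends on the gain, stimulus strength, and number of learning steps; the paper's point is precisely that such a reduction emerges during learning and recognition in this class of models.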


📜 SIMILAR VOLUMES


Motor primitive and sequence self-organi
โœ Rainer W. Paine; Jun Tani ๐Ÿ“‚ Article ๐Ÿ“… 2004 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 709 KB

This study describes how complex goal-directed behavior can be obtained through adaptation processes in a hierarchically organized recurrent neural network using a genetic algorithm (GA). Our experiments, using a simulated Khepera robot, showed that different types of dynamic structures self-organiz