𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Feed-forward chains of recurrent attractor neural networks with finite dilution near saturation

✍ Scribed by F.L. Metz; W.K. Theumann


Publisher
Elsevier Science
Year
2006
Tongue
English
Weight
268 KB
Volume
368
Category
Article
ISSN
0378-4371


✦ Synopsis


The stationary-state replica analysis of a dual neural network model that interpolates between a fully recurrent symmetric attractor network and a strictly feed-forward layered network, studied by Coolen and Viana, is extended in this work to account for finite dilution of the recurrent Hebbian interactions between binary Ising units within each layer. Gradual dilution is found to suppress some of the phase transitions that arise from the competition between the recurrent and feed-forward operation modes of the network. Nevertheless, a long chain of layers still performs relatively well under finite dilution when the ratio between inter-layer and intra-layer interactions is balanced.
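To make the ingredients of the synopsis concrete, here is a minimal sketch of a single diluted recurrent layer of the kind described: binary Ising units, Hebbian couplings, and random symmetric dilution of a fraction of the intra-layer interactions. This is an illustrative toy, not the authors' model; the sizes, the dilution fraction `c`, and the zero-temperature synchronous dynamics are assumptions chosen for a quick demonstration of pattern retrieval surviving dilution.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 5   # units in the layer, stored patterns (illustrative, far below saturation)
c = 0.5         # dilution: fraction of intra-layer couplings retained (assumed value)

# Random binary (Ising) patterns xi^mu_i in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) sum_mu xi^mu_i xi^mu_j, no self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Symmetric random dilution: keep each coupling pair with probability c,
# rescaling by 1/c so the mean coupling strength is preserved
mask = np.triu(rng.random((N, N)) < c, 1)
mask = mask | mask.T
J = np.where(mask, J, 0.0) / c

# Zero-temperature retrieval dynamics from a noisy copy of pattern 0
# (10% of the spins flipped), using synchronous sign updates
s = xi[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
for _ in range(20):
    s = np.sign(J @ s)
    s[s == 0] = 1

# Overlap m = (1/N) sum_i s_i xi^0_i measures retrieval quality
overlap = float(s @ xi[0]) / N
print(overlap)
```

At this low storage load the diluted layer still retrieves the pattern with an overlap close to one, which is the single-layer intuition behind the paper's finding that performance degrades only gradually with dilution.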