A class of convergent neural network dynamics
✍ Authors: Bernold Fiedler; Tomáš Gedeon
- Publisher: Elsevier Science
- Year: 1998
- Language: English
- File size: 340 KB
- Volume: 111
- Category: Article
- ISSN: 0167-2789
No payment or registration required. For personal study only.
✦ Synopsis
We consider a class of systems of differential equations in ℝⁿ which exhibit convergent dynamics. We find a Lyapunov function and show that every bounded trajectory converges to the set of equilibria. Our result generalizes the results of Cohen and Grossberg (1983) for convergent neural networks: it replaces the symmetry assumption on the matrix of weights by an assumption on the structure of the connections in the neural network.
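For context, a minimal sketch of the earlier Cohen-Grossberg (1983) setup that this paper generalizes is given below, in standard illustrative notation (the symbols a_i, b_i, c_ij, d_j are generic, not taken from this paper). The classical result obtains convergence under a symmetry assumption on the weight matrix, which the present paper replaces by a structural condition on the connections.

```latex
% Classical Cohen--Grossberg system (illustrative notation):
\begin{align}
  \dot{x}_i &= a_i(x_i)\Big[\, b_i(x_i) - \sum_{j=1}^{n} c_{ij}\, d_j(x_j) \Big],
  \qquad i = 1,\dots,n, \\
  V(x) &= -\sum_{i=1}^{n} \int_{0}^{x_i} b_i(s)\, d_i'(s)\, ds
          + \tfrac{1}{2} \sum_{j,k=1}^{n} c_{jk}\, d_j(x_j)\, d_k(x_k).
\end{align}
% With a_i > 0, d_j nondecreasing, and C = (c_{ij}) symmetric, V is nonincreasing
% along trajectories, so every bounded trajectory converges to the set of equilibria.
% The paper above proves convergence without requiring C to be symmetric.
```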
We prove the convergence result also for a large class of Lotka-Volterra systems. These are naturally defined on the closed positive orthant. We show that there are no heteroclinic cycles on the boundary of the positive orthant for the systems in this class.
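The Lotka-Volterra systems referred to are, in standard illustrative notation, of the following form on the closed positive orthant; the growth rates r_i and interaction coefficients a_ij below are generic symbols, not the paper's specific class.

```latex
% Standard Lotka--Volterra form on the closed positive orthant (illustrative notation):
\begin{equation}
  \dot{x}_i = x_i \Big( r_i + \sum_{j=1}^{n} a_{ij}\, x_j \Big),
  \qquad x_i \ge 0, \quad i = 1,\dots,n.
\end{equation}
% Each coordinate hyperplane \{x_i = 0\} is invariant, so the closed positive orthant
% is forward invariant; the absence of heteroclinic cycles among boundary equilibria
% is the property established for the class considered in the paper.
```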
📜 SIMILAR VOLUMES
In this paper, we investigate the asymptotic behavior of solutions to a class of recurrent neural network models with delays. Without assuming an M-matrix condition, it is shown that every solution of the network tends to an equilibrium point as t → ∞. Our results improve and extend some corresponding o…
In this paper, we focus on the convergence of a stochastic neural process. In this process, a "physiologically plausible" Hebb's learning rule gives rise to a self-organization phenomenon. Some preliminary results concern the asymptotic behaviour of the network given that the update of neurons is eit…