
Absolute stability of neural networks

✍ Kiyotoshi Matsuoka


Publisher: John Wiley and Sons
Year: 1992
Language: English
File size: 541 KB
Volume: 23
Category: Article
ISSN: 0882-1666


✦ Abstract

A sufficient condition for the state of a recurrent neural network to converge stably to an equilibrium state is symmetry of the connection weights between constituent units. However, this symmetry generally imposes a strong restriction on the network's capability. Although several stability conditions have been proposed for asymmetric recurrent networks, they are too strict to be useful for actual neural networks. Six new stability conditions are derived herein by using two types of Lyapunov functions. Some of them provide milder constraints on the connection weights than the conventional results, and others are particularly useful when the mutual connections between units have weights of opposite signs.
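The abstract contrasts stability under symmetric weights with sufficient conditions for asymmetric networks. As an illustrative sketch (not taken from the paper), the following uses one classic sufficient condition, a spectral-norm bound on the weight matrix, to certify global stability of an asymmetric recurrent network of the form dx/dt = -x + W g(x) with a 1-Lipschitz activation:

```python
import numpy as np

def spectral_norm_condition(W):
    """Return True if ||W||_2 < 1, a well-known sufficient condition for
    global stability of dx/dt = -x + W g(x) when g is 1-Lipschitz (e.g. tanh).
    Note this requires no symmetry of W."""
    return np.linalg.norm(W, 2) < 1.0

def simulate(W, x0, steps=5000, dt=0.01):
    """Euler-integrate dx/dt = -x + W tanh(x) and return the final state."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += dt * (-x + W @ np.tanh(x))
    return x

# An asymmetric weight matrix: W[0, 1] != W[1, 0].
W = np.array([[0.2, -0.6],
              [0.3,  0.1]])
assert spectral_norm_condition(W)  # ||W||_2 ~ 0.63 < 1

# With no external input the origin is the unique equilibrium, so the
# trajectory should decay toward zero from any initial state.
x_final = simulate(W, np.array([2.0, -1.5]))
```

This contraction-style bound is stricter than the conditions derived in the paper, whose point is precisely to obtain milder constraints, but it shows the shape of such a test: a cheap check on W that guarantees convergence without simulating the dynamics.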


✦ SIMILAR VOLUMES


New necessary and sufficient conditions
✍ Tianguang Chu; Cishen Zhang 📂 Article 📅 2007 🏛 Elsevier Science 🌐 English ⚖ 341 KB

This paper presents new necessary and sufficient conditions for absolute stability of asymmetric neural networks. The main result is based on a solvable Lie algebra condition, which generalizes existing results for symmetric and normal neural networks. An exponential convergence estimate of the neur…

Comments on "Absolute exponential stabil…
✍ Da-Wei Chang 📂 Article 📅 2007 🏛 Elsevier Science 🌐 English ⚖ 137 KB

Recently, Zhang, Suda and Iwasa [Zhang, Jiye, Suda, Yoshihiro, & Iwasa, Takashi (2004). Absolute exponential stability of a class of neural networks with unbounded delay, Neural Networks, 17, 391-397] established the following: Result. Suppose that T be an n × n matrix, and D be an n × n positive de…