Absolute stability of neural networks
By Kiyotoshi Matsuoka
- Publisher: John Wiley and Sons
- Year: 1992
- Language: English
- File size: 541 KB
- Volume: 23
- Category: Article
- ISSN: 0882-1666
Abstract
A sufficient condition for the state of a recurrent neural network to converge stably to an equilibrium state is the symmetry of the weights of connections between constituent units. However, generally, it imposes a strong restriction on the capability of the network. Although several stability conditions have been proposed for asymmetric recurrent networks, they are too strict and not useful for actual neural networks. Six new stability conditions are derived herein by using two types of Lyapunov functions. Some of them provide milder constraints on the connection weights than the conventional results, and others are particularly useful when the mutual connections between units have opposite signs of weights.
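The symmetry condition mentioned in the abstract is the classical one from Hopfield-type energy arguments. As a rough illustration only (not taken from the paper; the additive model form, the tanh activation, and all parameter values below are assumptions), the following sketch simulates a recurrent network with a symmetric weight matrix and checks numerically that the usual energy function behaves as a Lyapunov function:

```python
# Rough illustration (not from the paper): a standard additive recurrent
# network du/dt = -u + W g(u) + b with g = tanh.  When the connection
# matrix W is symmetric, the classical Hopfield energy is a Lyapunov
# function, so it should not increase along trajectories (up to the small
# error of the forward-Euler discretization used here).
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
W = (A + A.T) / 2            # symmetric weights: the sufficient condition
b = rng.standard_normal(n)   # constant external inputs

def energy(u):
    """Hopfield energy for g = tanh; a Lyapunov function when W = W.T."""
    v = np.tanh(u)
    # closed form of sum_i  \int_0^{v_i} arctanh(s) ds
    integral = np.sum(v * np.arctanh(v) + 0.5 * np.log1p(-v**2))
    return -0.5 * v @ W @ v - b @ v + integral

u = rng.standard_normal(n)   # arbitrary initial state
dt, prev, worst = 0.01, None, -np.inf
for _ in range(5000):
    u = u + dt * (-u + W @ np.tanh(u) + b)   # forward-Euler step
    e = energy(u)
    if prev is not None:
        worst = max(worst, e - prev)         # largest single-step increase
    prev = e

print("final output g(u):", np.round(np.tanh(u), 4))
print("largest energy increase per step:", worst)   # ~0 for symmetric W
```

Dropping the symmetrization step (leaving W asymmetric) removes this energy argument, which is exactly the gap that asymmetric stability conditions such as the six derived in this paper are meant to address.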
SIMILAR VOLUMES
Recently, Zhang, Suda and Iwasa [Zhang, J., Suda, Y., & Iwasa, T. (2004). Absolute exponential stability of a class of neural networks with unbounded delay. Neural Networks, 17, 391-397] established the following result. Suppose that T is an n × n matrix and D is an n × n positive de
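The excerpt above is cut off. For context only (an assumption about the general setting, not a restatement of the cited theorem), absolute-stability results of this kind are typically formulated for additive networks of the form

```latex
\frac{dx_i(t)}{dt} = -d_i\, x_i(t) + \sum_{j=1}^{n} t_{ij}\, g_j\!\big(x_j(t-\tau_{ij}(t))\big) + I_i,
\qquad i = 1,\dots,n,
```

where D = diag(d_1, ..., d_n) collects the positive self-decay rates, T = (t_ij) is the connection-weight matrix, the g_j are activation functions drawn from an admissible class, and absolute (exponential) stability means the unique equilibrium is globally exponentially stable for every activation in that class and every constant input I_i.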