New necessary and sufficient conditions for absolute stability of neural networks
By Tianguang Chu; Cishen Zhang
- Publisher
- Elsevier Science
- Year
- 2007
- Language
- English
- Size
- 341 KB
- Volume
- 20
- Category
- Article
- ISSN
- 0893-6080
Synopsis
This paper presents new necessary and sufficient conditions for absolute stability of asymmetric neural networks. The main result is based on a solvable Lie algebra condition, which generalizes existing results for symmetric and normal neural networks. An exponential convergence estimate for the neural networks is also obtained. Further, it is demonstrated how to generate, through simple procedures, larger sets of weight matrices that guarantee absolute stability from known normal weight matrices. The approach is nontrivial in that the resulting weight matrix sets may contain non-normal matrices. The results also provide a finite checking procedure for robust stability of neural networks in the presence of parameter uncertainties.
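As a point of reference for the normality condition that the paper's solvable Lie algebra criterion generalizes, the following sketch checks whether a weight matrix is normal (i.e., commutes with its transpose). This is only an illustration of the classical prerequisite, not the paper's own test; the matrices are made-up examples.

```python
import numpy as np

def is_normal(T, tol=1e-9):
    """Check the classical normality condition T @ T.T == T.T @ T,
    one of the known sufficient conditions for absolute stability
    that the solvable Lie algebra criterion generalizes."""
    return np.allclose(T @ T.T, T.T @ T, atol=tol)

# A symmetric weight matrix is automatically normal...
T_sym = np.array([[0.0, 1.0],
                  [1.0, 0.5]])

# ...while a generic asymmetric matrix need not be.
T_asym = np.array([[0.0, 1.0],
                   [0.0, 0.0]])

print(is_normal(T_sym))   # True
print(is_normal(T_asym))  # False
```

Matrices failing this check are exactly the non-normal cases that the paper's larger weight matrix sets can still cover.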
Similar volumes
In this paper, we have derived some sufficient conditions for the existence and uniqueness of an equilibrium and for global exponential stability in delayed Hopfield neural networks, using an approach different from the usual one in which existence, uniqueness of equilibrium, and stability are proved …
A sufficient condition for the state of a recurrent neural network to converge stably to an equilibrium state is symmetry of the connection weights between constituent units. Generally, however, this imposes a strong restriction on the capability of the network. Although several …