Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays
By Boshan Chen and Jun Wang
- Publisher
- Elsevier Science
- Year
- 2007
- Language
- English
- File size
- 624 KB
- Volume
- 20
- Category
- Article
- ISSN
- 0893-6080
Synopsis
The paper presents theoretical results on the global exponential periodicity and global exponential stability of a class of recurrent neural networks with various general activation functions and time-varying delays. The general activation functions include monotone nondecreasing functions, globally Lipschitz continuous and monotone nondecreasing functions, semi-Lipschitz continuous mixed monotone functions, and Lipschitz continuous functions. For each class of activation functions, testable algebraic criteria for ascertaining the global exponential periodicity and global exponential stability of the networks are derived using the comparison principle and the theory of monotone operators. Furthermore, the rate of exponential convergence and bounds on the attractive domains of periodic oscillations or equilibrium points are estimated. The convergence analysis under these generalized activation functions widens the scope of neural network models to which the results apply, and the analytical method enriches the toolbox for the qualitative analysis of neural networks.
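For orientation, delayed recurrent neural networks of the kind analyzed in this literature are commonly written in the following form; the exact model, symbols, and assumptions in the paper itself may differ, so this is offered only as a standard sketch:

```latex
% State equation of a recurrent neural network with time-varying delays
% (a generic formulation, not necessarily the paper's exact model):
\dot{x}_i(t) = -d_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} b_{ij}\, f_j\bigl(x_j(t - \tau_{ij}(t))\bigr)
  + u_i(t), \qquad i = 1, \dots, n,
```

where \(x_i(t)\) is the state of neuron \(i\), \(d_i > 0\) is a self-decay rate, \(a_{ij}\) and \(b_{ij}\) are the instantaneous and delayed connection weights, \(f_j\) is the activation function (one of the classes listed in the synopsis), \(\tau_{ij}(t) \ge 0\) is a bounded time-varying delay, and \(u_i(t)\) is an external input (periodic when periodicity is studied). Global exponential stability of an equilibrium \(x^*\) then means there exist constants \(M \ge 1\) and \(\varepsilon > 0\) such that \(\lVert x(t) - x^* \rVert \le M\, \sup_{s \le 0}\lVert \phi(s) - x^* \rVert\, e^{-\varepsilon t}\) for every initial function \(\phi\); the \(\varepsilon\) here is the convergence rate the paper estimates.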
SIMILAR VOLUMES
This paper proposes a more general class of recurrent neural network models with functional delay, which has been found more suitable for direct application. Simple and easily checkable conditions for the existence, uniqueness, and global exponential stability of the periodic solution for the…
In this paper, conditions ensuring the existence, uniqueness, and global exponential stability of the equilibrium point of a class of neural networks with variable delays are studied. Without assuming global Lipschitz conditions on the activation functions, applying the idea of vector Lyapunov functions…
In this paper, by constructing an extended impulsive delayed Halanay inequality and by Lyapunov functional methods, we analyze the global exponential stability and global attractivity of impulsive Hopfield neural networks with time delays. Some new sufficient conditions ensuring exponential…