Global exponential stability of recurrent neural networks with time-varying delays in the presence of strong external stimuli
By Zhigang Zeng; Jun Wang
- Publisher
- Elsevier Science
- Year
- 2006
- Language
- English
- File size
- 1020 KB
- Volume
- 19
- Category
- Article
- ISSN
- 0893-6080
Synopsis
This paper presents new theoretical results on the global exponential stability of recurrent neural networks with bounded activation functions and bounded time-varying delays in the presence of strong external stimuli. It is shown that the Cohen-Grossberg neural network is globally exponentially stable if the absolute value of the input vector exceeds a certain threshold. As special cases, the Hopfield neural network and the cellular neural network are examined in detail. In addition, it is shown that the criteria herein, if partially satisfied, can still be used in combination with existing stability conditions. Simulation results are also discussed in two illustrative examples.
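The qualitative behavior described in the synopsis can be illustrated numerically. Below is a minimal sketch (not the paper's own simulation) of a two-neuron Hopfield-type network with a constant delay, x'(t) = -x(t) + W·tanh(x(t - τ)) + u, integrated with a simple Euler scheme. The weight matrix, input vector, delay, and step size are illustrative assumptions; the point is that under a strong external input u the bounded activation saturates and trajectories from different initial histories converge to the same equilibrium.

```python
import numpy as np

def simulate_delayed_hopfield(x0, W, u, tau=1.0, dt=0.01, T=50.0):
    """Euler integration of x'(t) = -x(t) + W @ tanh(x(t - tau)) + u
    with a constant initial history x(t) = x0 for t <= 0."""
    n_delay = int(tau / dt)          # delay expressed in time steps
    steps = int(T / dt)
    # history buffer seeded with the constant initial function
    hist = [np.array(x0, dtype=float)] * (n_delay + 1)
    for _ in range(steps):
        x = hist[-1]                 # current state x(t)
        x_del = hist[-1 - n_delay]   # delayed state x(t - tau)
        hist.append(x + dt * (-x + W @ np.tanh(x_del) + u))
    return hist[-1]

# Illustrative parameters (assumptions, not taken from the paper)
W = np.array([[0.5, -0.3],
              [0.2,  0.4]])
u = np.array([5.0, -5.0])            # strong external stimulus

# Two very different initial histories converge to the same point
a = simulate_delayed_hopfield([ 2.0, -1.0], W, u)
b = simulate_delayed_hopfield([-3.0,  4.0], W, u)
print(np.allclose(a, b, atol=1e-5))
```

Because tanh saturates for large inputs, a sufficiently strong stimulus drives the delayed feedback term into a nearly constant regime, which is the intuition behind the input-magnitude criterion the paper formalizes.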
SIMILAR VOLUMES
The stability of a class of stochastic recurrent neural networks with time-varying delays is investigated in this paper. With the help of the Lyapunov function and the Dini derivative of the expectation of V(t, X(t)) along the solution X(t) of the model, a set of novel sufficient conditions o
This paper deals with the problem of stability analysis for a class of discrete-time bidirectional associative memory (BAM) neural networks with time-varying delays. By employing the Lyapunov functional and linear matrix inequality (LMI) approach, new sufficient conditions are proposed for the glob
The paper presents theoretical results on the global exponential periodicity and global exponential stability of a class of recurrent neural networks with various general activation functions and time-varying delays. The general activation functions include monotone nondecreasing functions, globally