𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Global exponential stability of generalized recurrent neural networks with discrete and distributed delays

✍ Scribed by Yurong Liu; Zidong Wang; Xiaohui Liu


Publisher
Elsevier Science
Year
2006
Tongue
English
Weight
159 KB
Volume
19
Category
Article
ISSN
0893-6080


✦ Synopsis


This paper is concerned with the analysis of global exponential stability for a class of recurrent neural networks (RNNs) with mixed discrete and distributed delays. We first prove the existence and uniqueness of the equilibrium point under mild conditions, assuming neither differentiability nor strict monotonicity of the activation function. Then, by employing a new Lyapunov-Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the RNNs to be globally exponentially stable. The global exponential stability of the delayed RNNs can therefore be checked easily with the numerically efficient MATLAB LMI toolbox, and no tuning of parameters is required. A simulation example illustrates the usefulness of the derived LMI-based stability conditions.
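To give a flavor of what an LMI-based stability check looks like in practice, here is a minimal sketch. It does not reproduce the paper's conditions; instead it verifies, for fixed candidate matrices, a classical delay-independent sufficient condition for a delayed network x'(t) = -C x(t) + B f(x(t - tau)) with Lipschitz activation bounds L. All matrix values below are illustrative assumptions, and a real check would search for P and Q with an SDP/LMI solver rather than fix them by hand.

```python
import numpy as np

# Classical delay-independent sufficient condition (illustrative, not the
# paper's exact LMI): the network is globally exponentially stable if there
# exist P > 0, Q > 0 making the block matrix
#   [[ -P C - C^T P + L^T Q L ,  P B ],
#    [         B^T P          ,  -Q  ]]
# negative definite. Here we only *verify* it for hand-picked P, Q on a
# 2-neuron example.

C = 2.0 * np.eye(2)                              # positive self-feedback rates
B = 0.5 * np.array([[1.0, 0.2], [-0.3, 1.0]])    # delayed connection weights
L = np.eye(2)                                    # Lipschitz bounds of f
P = np.eye(2)                                    # candidate Lyapunov matrices
Q = np.eye(2)

M = np.block([[-P @ C - C.T @ P + L.T @ Q @ L, P @ B],
              [B.T @ P, -Q]])

lam_max = np.max(np.linalg.eigvalsh(M))          # M is symmetric by construction
print("largest eigenvalue:", lam_max)
print("stable" if lam_max < 0 else "inconclusive")
```

With these values the block matrix is strictly negative definite, so the sufficient condition holds; the appeal of the LMI formulation, as the paper notes, is that such feasibility checks run without any parameter tuning.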


πŸ“œ SIMILAR VOLUMES


Exponential stability preservation in di
✍ Sannay Mohamad πŸ“‚ Article πŸ“… 2008 πŸ› Elsevier Science 🌐 English βš– 280 KB

This paper demonstrates that there is a discrete-time analogue which does not require any restriction on the size of the time-step in order to preserve the exponential stability of an artificial neural network with distributed delays. The analysis exploits an appropriate Lyapunov sequence and a disc