𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Embedding recurrent neural networks into predator–prey models

✍ Scribed by Yves Moreau; Stéphane Louiès; Joos Vandewalle; Léon Brenig


Publisher
Elsevier Science
Year
1999
Tongue
English
Weight
180 KB
Volume
12
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


We study changes of coordinates that allow the embedding of ordinary differential equations describing continuous-time recurrent neural networks into differential equations describing predator–prey models, also called Lotka–Volterra systems. We transform the equations for the neural network first into quasi-monomial form (Brenig, L. (1988). Complete factorization and analytic solutions of generalized Lotka–Volterra equations. Physics Letters A, 133(7–8), 378–382), where we express the vector field of the dynamical system as a linear combination of products of powers of the variables. In practice, this transformation is possible only if the activation function is the hyperbolic tangent or the logistic sigmoid. From this quasi-monomial form, we can directly transform the system further into Lotka–Volterra equations. The resulting Lotka–Volterra system is of higher dimension than the original system, but the behavior of its first variables is equivalent to the behavior of the original neural network. We expect that this transformation will permit the application of existing techniques for the analysis of Lotka–Volterra systems to recurrent neural networks. Furthermore, our results show that Lotka–Volterra systems are universal approximators of dynamical systems, just as are continuous-time neural networks.
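The target of the embedding described above is the generic Lotka–Volterra form dy_i/dt = y_i(λ_i + Σ_j M_ij y_j). As a minimal illustration of that form (not the paper's construction, and with parameter values chosen only for the example), the sketch below integrates the classical two-species predator–prey case with a fixed-step fourth-order Runge–Kutta scheme and checks its well-known conserved quantity:

```python
import math

def lotka_volterra(state, alpha=1.0, beta=0.5, delta=0.5, gamma=1.0):
    # Classical two-species Lotka-Volterra vector field:
    #   dx/dt = x * (alpha - beta * y)   (prey)
    #   dy/dt = y * (delta * x - gamma)  (predator)
    x, y = state
    return (x * (alpha - beta * y), y * (delta * x - gamma))

def rk4_step(f, state, h):
    # One fixed-step fourth-order Runge-Kutta step.
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + (h / 6.0) * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def invariant(state, alpha=1.0, beta=0.5, delta=0.5, gamma=1.0):
    # Conserved quantity of the classical system:
    #   V = delta*x - gamma*ln(x) + beta*y - alpha*ln(y)
    x, y = state
    return delta * x - gamma * math.log(x) + beta * y - alpha * math.log(y)

state = (2.0, 1.0)          # initial prey/predator populations
v0 = invariant(state)
for _ in range(10000):      # integrate to t = 10 with h = 0.001
    state = rk4_step(lotka_volterra, state, 0.001)

assert state[0] > 0 and state[1] > 0          # populations stay positive
assert abs(invariant(state) - v0) < 1e-6      # V is numerically conserved
```

The conserved quantity V makes the closed orbits of the classical system easy to verify numerically; the higher-dimensional Lotka–Volterra systems obtained from the paper's embedding would be simulated in the same way, with the matrix M supplied by the quasi-monomial transformation.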


📜 SIMILAR VOLUMES


Recurrent neural network modeling of nearshore sandbar behavior
✍ Leo Pape; B.G. Ruessink; Marco A. Wiering; Ian L. Turner 📂 Article 📅 2007 🏛 Elsevier Science 🌐 English ⚖ 811 KB

The temporal evolution of nearshore sandbars (alongshore ridges of sand fringing coasts in water depths less than 10 m and of paramount importance for coastal safety) is commonly predicted using process-based models. These models are autoregressive and require offshore wave characteristics as input, …

How embedded memory in recurrent neural network architectures helps learning long-term temporal dependencies
✍ Tsungnan Lin; Bill G. Horne; C.Lee Giles 📂 Article 📅 1998 🏛 Elsevier Science 🌐 English ⚖ 202 KB

Learning long-term temporal dependencies with recurrent neural networks can be a difficult problem. It has recently been shown that a class of recurrent neural networks called NARX networks perform much better than conventional recurrent neural networks for learning certain simple long-term dependencies …