Hamiltonian dynamics of neural networks
by Ulrich Ramacher
- Publisher
- Elsevier Science
- Year
- 1993
- Language
- English
- File size
- 874 KB
- Volume
- 6
- Category
- Article
- ISSN
- 0893-6080
Synopsis
The activation and weight dynamics of artificial neural networks are derived from a partial differential equation (PDE) that may incorporate weights either as parameters or as variables. It is shown that a single first-order Hamilton-Jacobi "parametrical" PDE suffices to derive the various neurodynamical paradigms in use today. When weights are taken as variables, a new type of neurodynamics emerges: a Hamilton function is derived such that the weights obey a second-order ordinary differential equation (ODE). Because this ODE models the forces experienced by the weights in the presence of a generalized error potential, it is called a learning law. Results obtained for the association of time-varying patterns, using parametrical as well as dynamical weights, show that learning rules can be replaced by learning laws at equal performance.
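The distinction between a learning rule (first-order update) and a learning law (second-order dynamics in an error potential) can be illustrated with a minimal sketch. This is not the paper's derivation: the error potential, step sizes, and damping term below are illustrative assumptions, using a toy quadratic potential in place of a generalized error potential.

```python
def grad_E(w, target=2.0):
    """Gradient of a toy quadratic error potential E(w) = 0.5 * (w - target)**2."""
    return w - target

def learning_rule(w0, lr=0.1, steps=200):
    """Learning rule: first-order update w <- w - lr * dE/dw (gradient descent)."""
    w = w0
    for _ in range(steps):
        w -= lr * grad_E(w)
    return w

def learning_law(w0, dt=0.1, damping=0.5, steps=400):
    """Learning law: the weight behaves like a particle in the error potential,
    obeying the second-order ODE  w'' = -dE/dw - damping * w',
    integrated here with semi-implicit Euler (update velocity, then position)."""
    w, v = w0, 0.0
    for _ in range(steps):
        v += dt * (-grad_E(w) - damping * v)
        w += dt * v
    return w
```

For this convex toy potential both dynamics settle at the minimum of E; the second-order trajectory oscillates around it before the damping term dissipates the "kinetic energy" of the weight.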
SIMILAR VOLUMES
Persistence is one of the most common characteristics of real-world time series. In this work we investigate the process of learning persistent dynamics by neural networks. We show that for chaotic time series the network can get stuck for long training periods in a trivial minimum of the error function.