Adaptive optimization in neural networks
By K.Y.M. Wong; D. Sherrington
- Publisher
- Elsevier Science
- Year
- 1992
- Language
- English
- File size
- 202 KB
- Volume
- 185
- Category
- Article
- ISSN
- 0378-4371
Synopsis
We apply the principle of adaptation to optimize the performance of neural networks with (i) noisy retrieval and (ii) disruptive dilution.
1. Introduction
Learning in neural networks can be described as a search procedure in the space of the adjustable synaptic weights. A performance function of the information to be stored is prescribed, and the goal of the search is to find the network configuration that maximizes it.
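To make this search picture concrete, the sketch below (not taken from the paper) performs gradient ascent on one possible performance function, chosen here as a saturating function of the aligning fields of the stored patterns. The tanh reward per pattern, the gradient-ascent rule, and the learning rate are all illustrative assumptions rather than the construction used in this paper.

```python
# A minimal sketch of learning viewed as a search in weight space:
# ascend a prescribed performance function of the stored patterns.
# The performance function g(kappa) = tanh(kappa) and the gradient-ascent
# search are illustrative assumptions, not the authors' algorithm.
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 20                                # neurons and stored patterns
xi = rng.choice([-1.0, 1.0], size=(P, N))     # random binary patterns
y = rng.choice([-1.0, 1.0], size=P)           # target outputs for one neuron
J = rng.normal(size=N)                        # adjustable synaptic weights

def performance(J):
    """Sum over patterns of a function of the aligning fields (stabilities)."""
    kappa = y * (xi @ J) / np.linalg.norm(J)
    return np.tanh(kappa).sum()               # example choice of reward per pattern

# Search: simple gradient ascent on the performance function.
eta = 0.05
for _ in range(500):
    norm = np.linalg.norm(J)
    kappa = y * (xi @ J) / norm
    # d kappa_mu / d J = y_mu xi_mu / |J| - kappa_mu J / |J|^2
    grad = ((1 - np.tanh(kappa) ** 2)[:, None] *
            (y[:, None] * xi / norm - np.outer(kappa, J) / norm ** 2)).sum(axis=0)
    J += eta * grad

print("final performance:", performance(J))
```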
Although early work [1] focused on performance functions defined for errorless inputs presented to the network, we are more often interested in situations that depend on the operating conditions after the learning stage has been completed. A network that operates best in an unperturbed environment may not do so in a disruptive one. Perturbations to the network may take the form of ambient noise in the retrieval dynamics, or the disruptive cutting of synaptic weights. In this paper we consider two examples of optimizing the performance of neural networks (a schematic illustration of both operating conditions follows below):
(i) How should we optimize the final overlaps and basins of attraction in attractor neural networks with noisy dynamics of retrieval?
(ii) How should we optimize the robustness against disruptive cutting of the synaptic weights?
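The following sketch is a self-contained illustration of these two operating conditions, not of the paper's optimization procedure: Hebbian couplings stand in for the learned weights, (i) is modelled by Glauber (thermal) retrieval dynamics at temperature T, and (ii) by randomly cutting a fraction of the synapses. All numerical values are placeholder assumptions.

```python
# Hedged illustration of the two operating conditions:
# (i) stochastic (noisy) retrieval dynamics and (ii) random dilution of synapses.
# Hebbian couplings and the temperature/dilution values are placeholders
# used only to make the example self-contained.
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 10
xi = rng.choice([-1.0, 1.0], size=(P, N))
J = (xi.T @ xi) / N                           # stand-in (Hebbian) couplings
np.fill_diagonal(J, 0.0)

def retrieve(J, s0, T=0.3, sweeps=20):
    """(i) Noisy retrieval: asynchronous Glauber dynamics at temperature T."""
    s = s0.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = J[i] @ s                      # local field on neuron i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
            s[i] = 1.0 if rng.random() < p_up else -1.0
    return s

def dilute(J, fraction=0.5):
    """(ii) Disruptive dilution: cut a random fraction of the synapses."""
    mask = rng.random(J.shape) > fraction
    return J * mask

# Start from a corrupted version of pattern 0 and measure the final overlap.
s0 = xi[0] * np.where(rng.random(N) < 0.85, 1.0, -1.0)   # initial overlap ~ 0.7
m_noisy   = retrieve(J, s0) @ xi[0] / N
m_diluted = retrieve(dilute(J), s0) @ xi[0] / N
print(f"overlap, noisy retrieval: {m_noisy:.2f}; after 50% dilution: {m_diluted:.2f}")
```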
2. The principle of adaptation
Below we illustrate how the principle of adaptation enables us to find the optimal solution. In fact, this principle has a potentially much wider scope of application.
SIMILAR VOLUMES
Resource/equipment allocation applications are becoming increasingly difficult to manage because of the routing/control complexity and contention conflicts, i.e. the amount of equipment operating simultaneously. The technical diversity of this equipment is also growing. Architectures that once inclu