Learning of modular structured networks
Written by Masumi Ishikawa
- Publisher
- Elsevier Science
- Year
- 1995
- Language
- English
- File size
- 755 KB
- Volume
- 75
- Category
- Article
- ISSN
- 0004-3702
Free of charge; no registration required. For personal study only.
Synopsis
Learning of large-scale neural networks suffers from high computational cost and the local minima problem. One solution to these difficulties is the use of modular structured networks. Proposed here is the learning of modular networks using structural learning with forgetting. It enables the formation of modules, as well as the automatic utilization of appropriate modules from among those previously learned. This not only achieves efficient learning, but also makes the resulting network understandable owing to its modular character.
In the learning of a Boolean function, the present module acquires information from its subtask module without any supervision. In the parity problem, a previously learned lower-order parity module is automatically used. The geometrical transformation of figures can be realized by a sequence of elementary transformations, and this sequence can likewise be discovered by the learning of multi-layer modular networks. These examples clearly demonstrate the effectiveness of modular structured networks constructed by structural learning with forgetting.
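The core mechanism the synopsis refers to, structural learning with forgetting, augments ordinary gradient descent with a small constant decay term proportional to the sign of each weight, so that connections the task does not need fade toward zero and a sparse, module-like skeleton remains. A minimal sketch of this idea on a toy XOR task is given below; the network size, learning rate, and forgetting strength are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of structural learning with forgetting: backpropagation on the
# squared error, plus a decay term eps * sign(w) applied to every weight.
# The decay "forgets" connections that the error gradient does not defend,
# leaving a skeletal network. All hyperparameters here are assumptions.

rng = np.random.default_rng(0)

# Toy task: XOR with a 2-4-1 sigmoid network (some hidden units redundant).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4))   # input -> hidden
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))   # hidden -> output
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta, eps = 0.5, 1e-3                # learning rate, forgetting strength

def step():
    """One gradient step with the forgetting term; returns the MSE."""
    global W1, b1, W2, b2
    h = sigmoid(X @ W1 + b1)        # forward pass
    out = sigmoid(h @ W2 + b2)
    err = out - y
    d_out = err * out * (1 - out)   # backward pass (MSE, sigmoid units)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Usual gradient update plus the constant decay toward zero.
    W2 -= eta * (h.T @ d_out) + eps * np.sign(W2)
    b2 -= eta * d_out.sum(axis=0)
    W1 -= eta * (X.T @ d_h) + eps * np.sign(W1)
    b1 -= eta * d_h.sum(axis=0)
    return float((err ** 2).mean())

loss0 = step()
for _ in range(5000):
    loss = step()
print(f"loss: {loss0:.3f} -> {loss:.3f}")
```

After training, near-zero weights can be pruned, and the surviving connections form the interpretable, modular skeleton the synopsis describes; a previously trained module can then be attached as a frozen subnetwork when a new, related task is learned.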
SIMILAR VOLUMES
This is the second paper of a series of two about the structural properties that influence the asymptotic dynamics of random boolean networks. Here we study the functionally independent clusters in which the relevant elements, introduced and studied in our first paper [U. Bastolla, G. Parisi, Physic
For recognition and control of multiple manipulated objects, we present two learning schemes for neural-network controllers based on feedback-error-learning and modular architecture. In both schemes, the network consists of a recognition network and modular control networks. In the first scheme, a
Echo State Networks (ESNs) have been shown to be effective for a number of tasks, including motor control, dynamic time series prediction, and memorizing musical sequences. However, their performance on natural language tasks has been largely unexplored until now. Simple Recurrent Networks (SRNs) ha