For three-layer artificial neural networks (TANs) that take binary values, the number of hidden units is considered regarding two problems: One is to find the necessary and sufficient number to make mapping between the binary output values of TANs and learning patterns (inputs) arbitrary; and the ot
Bounds on the number of hidden units of Boltzmann machines
By Eduard Moser; Tiko Kameda
- Book ID
- 104348557
- Publisher
- Elsevier Science
- Year
- 1992
- Language
- English
- File size
- 911 KB
- Volume
- 5
- Category
- Article
- ISSN
- 0893-6080
Synopsis
It is known that any given probability distribution of the states of the observable units of a Boltzmann machine can be realized if no limit is imposed on the number of hidden units. But very little is known about the number of hidden units necessary for such realization. We consider Boltzmann machines as associative memories and show that there exist vector sets whose memorization on a Boltzmann machine requires a number of hidden units which is exponential in the size of the vectors (i.e., the number of components in each vector). Additional results give tight bounds on the number of hidden units needed in terms of the vector set size (i.e., the number of vectors in the set). Furthermore, we show how to construct Boltzmann machines which realize negation, intersection, and composition of the vector sets memorized by given Boltzmann machines.
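To make the synopsis concrete, the sketch below computes, by exhaustive enumeration, the exact distribution that a tiny Boltzmann machine induces over its observable (visible) units, marginalizing out the hidden units. The weight matrix `W`, bias vector `b`, and unit counts are illustrative assumptions, not taken from the paper; the energy convention used (binary 0/1 units, symmetric weights, zero diagonal) is one common formulation.

```python
import itertools

import numpy as np


def boltzmann_visible_distribution(W, b, n_visible, n_hidden):
    """Exact visible-state distribution of a small Boltzmann machine.

    Units are binary (0/1). W is a symmetric weight matrix over all
    n_visible + n_hidden units (zero diagonal); b is the bias vector.
    The probability of a visible configuration v is obtained by summing
    the Boltzmann weights exp(-E(s)) over every hidden configuration
    and normalizing by the partition function Z.
    """
    n = n_visible + n_hidden
    probs = {}
    Z = 0.0
    for state in itertools.product([0, 1], repeat=n):
        s = np.array(state, dtype=float)
        # Energy E(s) = -1/2 s^T W s - b^T s for this convention.
        energy = -0.5 * s @ W @ s - b @ s
        weight = np.exp(-energy)
        Z += weight
        v = state[:n_visible]  # marginalize: accumulate over hidden states
        probs[v] = probs.get(v, 0.0) + weight
    return {v: p / Z for v, p in probs.items()}


# Example: 2 visible units and 1 hidden unit with random symmetric weights.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
W = (A + A.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(size=3)

dist = boltzmann_visible_distribution(W, b, n_visible=2, n_hidden=1)
print(dist)  # probabilities over the four visible configurations
```

Enumeration costs 2^(n_visible + n_hidden) energy evaluations, which is exactly why the paper's exponential lower bound on the number of hidden units matters: for some vector sets, no small hidden layer suffices, regardless of how the weights are chosen.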
SIMILAR VOLUMES
This report presents a back-propagation algorithm that varies the number of hidden units. This algorithm is expected to escape local minima and makes it no longer necessary to decide the number of hidden units. We tested this algorithm on two examples. One was exclusive-OR learning and the othe
Multilayer perceptrons can compute arbitrary dichotomies of a set of \(N\) points of \([0,1]^{d}\). The minimal size of such networks was studied by Baum (1988, J. Complexity 4, 193-215) using the parameter \(N\). In this paper, we show that this question can be addressed using another parameter, th
The number of required hidden units is statistically estimated for feedforward neural networks that are constructed by adding hidden units one by one. The output error decreases with the number of hidden units at an almost constant rate, if each appropriate hidden unit is selected out of a great num