Theory of the backpropagation neural network
by Robert Hecht-Nielsen
- Book ID
- 103926321
- Publisher
- Elsevier Science
- Year
- 1988
- Language
- English
- File size
- 61 KB
- Volume
- 1
- Category
- Article
- ISSN
- 0893-6080
Synopsis
Of all the neural networks being applied to real-world problems, the backpropagation neural network has proven to be the most useful. The backpropagation network appears to have been originally invented by Paul Werbos in his 1974 Harvard Ph.D. dissertation, and subsequently reinvented in 1982 by David Parker of the Stanford Linear Accelerator Center (and later reinvented multiple times by others). Backpropagation was developed into a usable neural network by David Rumelhart and the PDP group, based at UCSD.

The title of Werbos' Ph.D. dissertation, "Beyond Regression," aptly captures the essential importance of the backpropagation neural network. Viewed as a new approach to regression analysis, the backpropagation neural network and its amazing capabilities come into sharp focus. Traditional statistical regression techniques (which form the basis of central results in subjects as diverse as pattern recognition, control theory, speech synthesis, signal processing and forecasting) require the user to select an appropriate functional form, which is then fit to the data by the regression procedure. For example, if the data is known to be periodic, a sine-wave form can be chosen and Fourier analysis (one particular brand of regression analysis) can be applied. However, Fourier analysis is numerically inappropriate if the functional form to be fit is, for example, exponential. One of the severe limitations of traditional regression analysis is that, for the most interesting problems, there is no usable technique for discovering an appropriate functional form. This is particularly true for high-dimensional function or mapping approximation problems in pattern recognition, control theory, database analysis, and time series prediction.

Backpropagation goes beyond regression by handling both aspects of regression automatically: discovering an appropriate functional form and fitting it to the data. Because of the extreme generality of the functional form used in the backpropagation network (namely, hierarchical sigmoided linear combinations of sigmoids), the network can produce virtually any functional form needed (sine waves, exponentials, logarithms, polynomials, etc.) while simultaneously fitting this form to the data. Given the insights that have been gained from experimental work with backpropagation, several new theoretical results and conjectures about the network have emerged in the last few months. Since the theme of using neural networks for "super-regression" will probably extend eventually to an entire class of networks in the mapping neural network category, these developments promise to be of great importance to the future of neurocomputing. This talk will review the basic theory of the backpropagation neural network and then discuss these new theorems and conjectures. The implications of the backpropagation network for neurocomputing applications will also be discussed.
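As a rough illustration of the "beyond regression" point above, the following minimal sketch (not from the paper; the model sizes, learning rate, and variable names are illustrative assumptions) fits a one-hidden-layer network of sigmoids, trained by plain gradient-descent backpropagation, to noisy sine-wave samples without ever telling the model that the data is periodic.

# Minimal sketch, assuming NumPy: a linear combination of sigmoids fitted by
# gradient-descent backpropagation to noisy sine-wave data. The output layer
# is left linear here for simple regression, a common simplification of the
# fully sigmoided network described in the synopsis.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training data: y = sin(x) plus a little noise, x in [-pi, pi].
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x) + 0.05 * rng.standard_normal(x.shape)

# Parameters of f(x) = W2 @ sigmoid(W1 x + b1) + b2, with 20 hidden units.
hidden = 20
W1 = rng.standard_normal((1, hidden))
b1 = np.zeros(hidden)
W2 = 0.1 * rng.standard_normal((hidden, 1))
b2 = np.zeros(1)

lr = 0.5
for step in range(10000):
    # Forward pass.
    h = sigmoid(x @ W1 + b1)      # hidden activations, shape (200, hidden)
    pred = h @ W2 + b2            # network output, shape (200, 1)
    err = pred - y

    # Backward pass: gradients of mean squared error for each parameter.
    n = x.shape[0]
    dpred = 2.0 * err / n
    dW2 = h.T @ dpred
    db2 = dpred.sum(axis=0)
    dh = dpred @ W2.T
    dz = dh * h * (1.0 - h)       # derivative of the sigmoid
    dW1 = x.T @ dz
    db1 = dz.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final mean squared error:", float(np.mean(err ** 2)))

With enough hidden units and training steps the fitted curve tracks the sine wave, and swapping the target for an exponential or a polynomial requires no change to the model, which is the sense in which the network discovers the functional form rather than requiring the user to choose it.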
SIMILAR VOLUMES
A very important subject for consolidating the field of neural networks is the study of their capabilities. In this paper, the relationships between network size, training set size, and generalization capability are examined. The phenomenon of overtraining in backpropagation networks is discussed and an