## Aspects of network training and validation on noisy data: Part 2. Validation aspects
By E.P.P.A. Derks; L.M.C. Buydens
- Publisher
- Elsevier Science
- Year
- 1998
- Language
- English
- File size
- 142 KB
- Volume
- 41
- Category
- Article
- ISSN
- 0169-7439
## Synopsis
This paper focuses on the validation of multi-layered feed-forward (MLF) neural network models with respect to their predictive ability. Two distinct approaches for computing prediction intervals on neural network outputs are applied and compared using simulated and experimental data. First, bootstrap resampling is applied and the results are discussed. The use of resampling techniques for variance estimation of network outputs is greatly facilitated by the efficient Levenberg-Marquardt training method discussed in Part 1 of this paper. Next, the delta method, based on the linearization of nonlinear functions, is applied and discussed. Both the bootstrap and the delta method are used to construct prediction intervals on neural network estimates. Finally, practical aspects of both methods are outlined and the major conclusions are drawn.
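The bootstrap approach described in the synopsis can be sketched as follows: refit the network on resampled versions of the training set and take percentiles of the resulting predictions. This is a minimal illustration, not the paper's implementation: it uses a tiny one-hidden-layer network trained by plain gradient descent (the paper uses Levenberg-Marquardt), a hypothetical toy data set, and percentile intervals that reflect only model variance, not the additional noise variance a full prediction interval would include.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy calibration data: a noisy nonlinear relation (hypothetical example)
x = np.linspace(-2, 2, 60).reshape(-1, 1)
y = np.sin(x) + rng.normal(0, 0.1, x.shape)

def train_mlp(x, y, hidden=8, epochs=2000, lr=0.05, seed=0):
    """Fit a one-hidden-layer tanh MLF network by gradient descent.

    A simple stand-in for the Levenberg-Marquardt training used in the
    paper; it keeps the example self-contained and dependency-free."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)          # hidden-layer activations
        err = (h @ W2 + b2) - y           # residuals of the squared-error loss
        # Backpropagate to all weights and biases
        gW2 = h.T @ err / len(x); gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h**2)
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda xq: np.tanh(xq @ W1 + b1) @ W2 + b2

# Bootstrap: refit on data resampled with replacement, collect predictions
x_new = np.array([[0.5]])
preds = []
for b in range(30):
    idx = rng.integers(0, len(x), len(x))   # bootstrap resample of the cases
    f_b = train_mlp(x[idx], y[idx], seed=b)
    preds.append(f_b(x_new).item())

lo, hi = np.percentile(preds, [2.5, 97.5])  # 95% percentile interval
print(f"bootstrap interval at x=0.5: [{lo:.3f}, {hi:.3f}]")
```

The delta method discussed in the paper would instead linearize the trained network around its weight estimates and propagate the weight covariance to the output, avoiding the repeated refits at the cost of a local linear approximation.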
## Similar volumes
Abstract: The distribution of unsaturations in the prepolymer of a typical unsaturated polyester (UP) resin (maleic anhydride, phthalic anhydride and 1,2-propylene glycol) has been shown to influence the kinetics of the cure process with styrene monomer. Segments containing double bonds in close