
Variable selection for neural networks in multivariate calibration

✍ Scribed by Frédéric Despagne; Désiré-Luc Massart


Publisher: Elsevier Science
Year: 1998
Tongue: English
Weight: 397 KB
Volume: 40
Category: Article
ISSN: 0169-7439


✦ Synopsis


This paper discusses the problem of variable selection for neural network modeling. Two methods that gave the best results in a previous comparative study are presented: one is a modified version of Hinton diagrams; the other, part of the Optimal Brain Surgeon algorithm for pruning unimportant weights in a neural network, is based on saliency estimation. We also propose two new methods based on estimating the contribution of each input variable to the variance of the predicted response. These new methods are designed for situations where the input variables are orthogonal, such as the PC scores often used in multivariate calibration. The four methods are tested on synthetic examples and on real industrial data sets for multivariate calibration, and the main characteristics of each method are discussed. In particular, we underline the strong theoretical and experimental limitations of weight-magnitude methods such as the modified Hinton diagrams. We also demonstrate that although the saliency estimation approach is theoretically more stringent, it gives unstable results over repeated trials. The advantage of the two variance-based approaches is that they depend much less on the initial weight randomization than the other two methods, so the results they produce are more stable and reliable.
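
To make the contrast concrete, here is a minimal Python sketch of two generic importance measures for a toy one-hidden-layer network: a weight-magnitude ranking in the spirit of Hinton-diagram inspection, and a clamping estimate of each input's share of the prediction variance, which is meaningful when the inputs are orthogonal (as PC scores are), since the prediction variance then decomposes approximately as Var(y) ≈ Σ_j (∂y/∂x_j)² Var(x_j). The toy network, the random data, and the clamping estimator are illustrative assumptions, not the paper's exact algorithms; the Optimal Brain Surgeon saliency, s_q = w_q² / (2 [H⁻¹]_qq), additionally requires the inverse Hessian of the training error and is omitted here.

    # Hedged sketch: two generic input-importance measures for a small MLP.
    # Everything here (network size, random "trained" weights, clamping
    # estimator) is an illustrative assumption, not the paper's own code.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: orthogonal inputs (stand-ins for PC scores) and a
    # one-hidden-layer MLP whose weights we treat as already trained.
    n_samples, n_inputs, n_hidden = 200, 5, 4
    X = rng.standard_normal((n_samples, n_inputs))
    W1 = rng.standard_normal((n_inputs, n_hidden))   # input-to-hidden weights
    b1 = rng.standard_normal(n_hidden)
    W2 = rng.standard_normal(n_hidden)               # hidden-to-output weights
    b2 = 0.1

    def predict(X):
        # Forward pass: tanh hidden layer, linear output.
        return np.tanh(X @ W1 + b1) @ W2 + b2

    # Measure 1: weight-magnitude importance (Hinton-diagram style).
    # Rank each input by the summed magnitude of its outgoing weights.
    # As the synopsis stresses, this criterion is theoretically weak:
    # large weights do not necessarily mean large influence on the output.
    magnitude_importance = np.abs(W1).sum(axis=1)

    # Measure 2: variance-based contribution (for orthogonal inputs).
    # Clamp each input to its mean and record how much the variance of
    # the predicted response drops.
    total_var = predict(X).var()
    variance_contribution = np.empty(n_inputs)
    for j in range(n_inputs):
        X_clamped = X.copy()
        X_clamped[:, j] = X[:, j].mean()             # remove input j's variation
        variance_contribution[j] = total_var - predict(X_clamped).var()

    for j in range(n_inputs):
        print(f"input {j}: |w| sum = {magnitude_importance[j]:.3f}, "
              f"variance share = {variance_contribution[j] / total_var:.3f}")

On real calibration data the trained weights would replace the random ones above. Because the variance shares are tied to the predicted response rather than to raw weight values, they are far less sensitive to the initial weight randomization, which is the stability property the synopsis highlights.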


📜 SIMILAR VOLUMES


Random correlation in variable selection
✍ D. Jouan-Rimbaud; D.L. Massart; O.E. de Noord 📂 Article 📅 1996 🏛 Elsevier Science 🌐 English ⚖ 549 KB

The importance of the validation step in multiple linear regression of near-infrared spectroscopic data, after selection of wavelengths by a genetic algorithm, is investigated with the use of random variables. It is shown that in spite of a careful validation procedure, the GA can still select irrel…

Model selection in neural networks
✍ Ulrich Anders; Olaf Korn 📂 Article 📅 1999 🏛 Elsevier Science 🌐 English ⚖ 418 KB

In this article, we examine how model selection in neural networks can be guided by statistical procedures such as hypothesis tests, information criteria and cross validation. The application of these methods in neural network models is discussed, paying attention especially to the identification pr…

Selection of calibration mixtures and wa…
✍ F. Navarro-Villoslada; L.V. Pérez-Arribas; M.E. Léon-González; L.M. Polo-Díez 📂 Article 📅 1995 🏛 Elsevier Science 🌐 English ⚖ 601 KB

A comparative study to select calibration mixtures and wavelengths in multivariate calibration methods was made. The methods studied were classical least squares (CLS), inverse least squares (ILS), partial least squares (PLS), principal component regression (PCR) and Kalman filter. For each method t…

Cubic approximation neural network for m…
✍ Doron Stein; Arie Feuer 📂 Article 📅 1998 🏛 Elsevier Science 🌐 English ⚖ 374 KB

This paper introduces a novel neural network architecture, the cubic approximation neural network (CANN), capable of local approximation of multivariate functions. It is particularly simple in concept and in structure. Its simplicity enables a quantitative evaluation of its approximation capabilities, na…