Evolutionary q-Gaussian radial basis function neural networks for multiclassification
✍ Authors: Francisco Fernández-Navarro; César Hervás-Martínez; P.A. Gutiérrez; M. Carbonero-Ruz
- Publisher
- Elsevier Science
- Year
- 2011
- Language
- English
- File size
- 564 KB
- Volume
- 24
- Category
- Article
- ISSN
- 0893-6080
Free to read; no payment or registration required. For personal study only.
✦ Synopsis
This paper proposes a radial basis function neural network (RBFNN), called the q-Gaussian RBFNN, that reproduces different radial basis functions (RBFs) by means of a real parameter q. The architecture, weights and node topology are learnt through a hybrid algorithm (HA). In order to test the overall performance, an experimental study with sixteen data sets taken from the UCI repository is presented. The q-Gaussian RBFNN was compared to RBFNNs with Gaussian, Cauchy and inverse multiquadratic RBFs in the hidden layer and to other probabilistic classifiers, including different RBFNN design methods, support vector machines (SVMs), a sparse classifier (sparse multinomial logistic regression, SMLR) and a non-sparse classifier (regularized multinomial logistic regression, RMLR). The results show that the q-Gaussian model can be considered very competitive with the other classification methods.
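The key idea in the synopsis is that a single real parameter q lets one basis function family reproduce the Gaussian, Cauchy and inverse multiquadratic RBFs as special cases. The sketch below illustrates this with the standard q-Gaussian form, phi(d) = [1 - (1 - q) d²/r²]^(1/(1-q)) truncated at zero; the exact parameterisation and function name are assumptions, not taken from the paper itself:

```python
import numpy as np

def q_gaussian_rbf(x, center, radius, q):
    """q-Gaussian RBF sketch (hypothetical parameterisation).

    As q -> 1 it tends to the Gaussian exp(-d^2 / r^2);
    q = 2 gives the Cauchy RBF 1 / (1 + d^2 / r^2);
    q = 3 gives 1 / sqrt(1 + 2 d^2 / r^2), an inverse-multiquadratic shape.
    """
    d2 = np.sum((np.asarray(x) - np.asarray(center)) ** 2)
    if np.isclose(q, 1.0):                    # limit case: plain Gaussian
        return float(np.exp(-d2 / radius**2))
    base = 1.0 - (1.0 - q) * d2 / radius**2
    # Truncate where the bracket goes negative so the power stays real.
    return float(base ** (1.0 / (1.0 - q))) if base > 0 else 0.0

# The same input evaluated under the RBF shapes that q interpolates between:
x, c, r = [1.0, 0.0], [0.0, 0.0], 2.0
for q in (1.0, 2.0, 3.0):
    print(q, q_gaussian_rbf(x, c, r, q))
```

In the paper's hybrid algorithm, q would be tuned per hidden node alongside the centres and radii, so each node can adopt the basis-function shape that best fits its local region of the data.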
📜 SIMILAR VOLUMES
This article presents a new family of reformulated radial basis function (RBF) neural networks that employ adjustable weighted norms to measure the distance between the training vectors and the centers of the radial basis functions. The reformulated RBF model introduced in this article incorporates …
Selective model structure and parameter updating algorithms are introduced for both the online estimation of NARMAX models and training of radial basis function neural networks. Techniques for on-line model modification, which depend on the vector-shift properties of regression variables in linear m…