Hybridized particle swarm algorithm for adaptive structure training of multilayer feed-forward neural network: QSAR studies of bioactivity of organic compounds
Authors: Qi Shen; Jian-Hui Jiang; Chen-Xu Jiao; Wei-Qi Lin; Guo-Li Shen; Ru-Qin Yu
- Publisher: John Wiley and Sons
- Year: 2004
- Language: English
- File size: 145 KB
- Volume: 25
- Category: Article
- ISSN: 0192-8651
## Abstract
The multilayer feed-forward artificial neural network (ANN) is an important modeling technique used in QSAR studies. Training of an ANN is usually carried out only to optimize the weights of the network, without paying attention to the network topology. Other strategies for training ANNs first discover an optimum structure for the network and then find weights for that already defined structure. These methods tend to converge to local optima and may also lead to overfitting. In this article, a hybridized particle swarm optimization (PSO) approach was applied to neural network structure training (HPSONN). The continuous version of PSO was used for the weight training of the ANN, and a modified discrete PSO was applied to find an appropriate network architecture. The network structure and connectivity are trained simultaneously, so the two versions of PSO can jointly search for the globally optimal ANN architecture and weights. A new objective function is formulated to determine the appropriate network architecture and the optimum values of the weights. The proposed HPSONN algorithm was used to predict the carcinogenic potency of aromatic amines and the biological activity of a series of distamycin and distamycin-like derivatives. The results were compared with those obtained by PSO and GA training in which the network architecture was kept fixed. The comparison demonstrated that HPSONN is a useful tool for training ANNs: it converges quickly toward the optimal position and can avoid overfitting to some extent. © 2004 Wiley Periodicals, Inc. J Comput Chem 25: 1726–1735, 2004
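The abstract's core idea of using continuous PSO to train feed-forward network weights can be illustrated with a minimal sketch. This is not the paper's actual HPSONN implementation (which also evolves the architecture with a discrete PSO and a custom objective function); the network size (2-4-1), swarm size, and PSO constants below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x):
    """Tiny fixed 2-4-1 tanh network; w is a flat vector of 17 parameters."""
    W1 = w[:8].reshape(2, 4)        # input -> hidden weights
    b1 = w[8:12]                    # hidden biases
    W2 = w[12:16].reshape(4, 1)     # hidden -> output weights
    b2 = w[16]                      # output bias
    h = np.tanh(x @ W1 + b1)
    return (h @ W2 + b2).ravel()

def mse(w, X, y):
    return np.mean((forward(w, X) - y) ** 2)

def pso_train(X, y, n_particles=30, n_iter=200, dim=17,
              inertia=0.7, c1=1.5, c2=1.5):
    """Standard continuous PSO over the flat weight vector."""
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([mse(p, X, y) for p in pos])
    g = pbest[np.argmin(pbest_err)].copy()   # global best position
    g_err = pbest_err.min()
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive + social components
        vel = (inertia * vel
               + c1 * r1 * (pbest - pos)
               + c2 * r2 * (g - pos))
        pos = pos + vel
        err = np.array([mse(p, X, y) for p in pos])
        improved = err < pbest_err
        pbest[improved] = pos[improved]
        pbest_err[improved] = err[improved]
        if err.min() < g_err:
            g_err = err.min()
            g = pos[np.argmin(err)].copy()
    return g, g_err

# Toy nonlinear regression target standing in for a QSAR response
X = rng.uniform(-1, 1, (50, 2))
y = X[:, 0] * X[:, 1]
w_best, err_best = pso_train(X, y)
```

Because no gradients are used, the same loop would work for any differentiable or non-differentiable fitness; the paper's contribution is to run a discrete PSO over the connectivity mask alongside this continuous one, so structure and weights are searched jointly rather than in two separate stages.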