Training neural networks with heterogeneous data
Authors: John A. Drakopoulos; Ahmad Abdulkader
- Book ID
- 103853745
- Publisher
- Elsevier Science
- Year
- 2005
- Language
- English
- File size
- 140 KB
- Volume
- 18
- Category
- Article
- ISSN
- 0893-6080
Synopsis
Data pruning and ordered training are two methods, and the results of a small theory, that attempt to formalize neural network training with heterogeneous data. Data pruning is a simple process that attempts to remove noisy data. Ordered training is a more complex method that partitions the data into a number of categories and assigns training times to them, assuming that data size and training time have a polynomial relation. Both methods derive from a set of premises that form the 'axiomatic' basis of our theory. Both methods have been applied to a time-delay neural network, which is one of the main learners in Microsoft's Tablet PC handwriting recognition system. Their effect is presented in this paper along with a rough estimate of their impact on the overall multi-learner system. The handwriting data and the chosen language are Italian.
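The ordered-training idea described above (training-time budgets allocated per category under a polynomial size/time relation) can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the exponent `p`, the budget, and the category sizes are all hypothetical assumptions.

```python
# Hedged sketch of ordered training's time allocation: the synopsis says
# training time and data size are assumed to have a polynomial relation,
# so each category's share of the budget is made proportional to
# size ** p. The exponent p=1.5 is an illustrative choice only.

def allocate_training_time(category_sizes, total_time, p=1.5):
    """Split a total training-time budget across data categories,
    with each share proportional to the category size raised to p."""
    weights = [n ** p for n in category_sizes]
    total = sum(weights)
    return [total_time * w / total for w in weights]

# Hypothetical category sizes (number of samples per category).
sizes = [1000, 4000, 500]
times = allocate_training_time(sizes, total_time=100.0)
```

Larger categories receive super-linearly more time than smaller ones when `p > 1`, which is one plausible reading of the polynomial assumption.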
SIMILAR VOLUMES
Abstract: For neural networks to develop good internal representations for pattern mapping, noise in the training set data must be controlled. Because of the many difficulties associated with manually validating training data, we have focused on using decision table techniques as a practical, domain-
Neural networks (NNs) are often used as black-box techniques for the modelling of system relations. Standard NNs are static models, whereas in practice one often has to deal with dynamic systems or processes. In such cases, dynamic neural networks (DNNs) may be better suited. We will argue that the
This study investigates the emerging possibilities of combining unsupervised and supervised learning in neural network ensembles. Such strategy is used to get an efficient partition of a noisy input data set in order to focus the training of neural networks on the most complex and informative domain