Training recurrent neural networks to perform certain tasks is known to be difficult. Adding synaptic delays to the network makes the training task more difficult still. However, this disadvantage of a tougher training procedure is offset by the improved network performance.
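The synaptic delays mentioned in the abstract can be illustrated with a minimal time-delay layer: each output at time t combines the current input with D delayed copies of it, i.e. a 1-D convolution over the sequence. This is a generic sketch of the idea, not the specific architecture or training procedure of the paper; all names and sizes below are illustrative.

```python
import numpy as np

def time_delay_layer(x, weights, bias):
    """Minimal time-delay layer (illustrative, not the paper's method).

    x: (T, n_in) input sequence; weights: (D+1, n_in, n_out), one weight
    matrix per delay tap; bias: (n_out,). Returns a (T - D, n_out)
    sequence of tanh activations.
    """
    delays = weights.shape[0] - 1
    T = x.shape[0]
    outputs = []
    for t in range(delays, T):
        # sum contributions from the current input and each delayed copy
        pre = bias.copy()
        for d in range(delays + 1):
            pre = pre + x[t - d] @ weights[d]
        outputs.append(np.tanh(pre))
    return np.stack(outputs)

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 3))          # sequence of length 10, 3 features
w = rng.standard_normal((3, 3, 2)) * 0.1  # current input + 2 delay taps
b = np.zeros(2)
y = time_delay_layer(x, w, b)
print(y.shape)  # (8, 2): two time steps consumed by the delay window
```

Training such a layer means adjusting one weight matrix per delay tap, which is what enlarges the parameter search space relative to a delay-free network.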
LIBER
Efficient training of Time Delay Neural Networks for sequential patterns
By Rossella Cancelliere; Roberto Gemello
- Book ID: 113399531
- Publisher: Elsevier Science
- Year: 1996
- Language: English
- File size: 608 KB
- Volume: 10
- Category: Article
- ISSN: 0925-2312
No payment or registration required. For personal study only.
SIMILAR VOLUMES
Efficient Training of Recurrent Neural N
Barak Cohen; David Saad; Emanuel Marom
Article · 1997 · Elsevier Science · English · 671 KB
An efficient global algorithm for superv
K.K. Shukla; Raghunath
Article · 1999 · Elsevier Science · English · 285 KB
Delay-dependent stability analysis for r
Lu, C.-Y.; Su, T.-J.; Huang, S.-C.
Article · 2008 · The Institution of Engineering and Technology · English · 151 KB
Stability analysis for recurrent neural
Yuan-Yuan Wu; Yu-Qiang Wu
Article · 2009 · Institute of Automation, Chinese Academy of Scienc · English · 175 KB
Efficient Partition of Learning Data Set
Igor V. Tetko; Alessandro E.P. Villa
Article · 1997 · Elsevier Science · English · 641 KB
This study investigates the emerging possibilities of combining unsupervised and supervised learning in neural network ensembles. Such a strategy is used to obtain an efficient partition of a noisy input data set in order to focus the training of neural networks on the most complex and informative domain
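The partition-then-train idea in that abstract can be sketched in a few lines: an unsupervised step (a tiny hand-rolled k-means here) splits the input set, then one supervised learner is fitted per partition so each focuses on its own sub-domain. This is a toy illustration under assumed data (two separated clusters with different linear targets) and least-squares "experts", not the ensemble method of the cited paper.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Tiny k-means for illustration: returns cluster labels and centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

rng = np.random.default_rng(1)
# toy data: two well-separated input clusters with different linear targets
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
y = np.where(X[:, 0] < 2.5, X @ [1.0, 2.0], X @ [-1.0, 0.5])

# unsupervised step: partition the inputs
labels, centers = kmeans(X, 2)

# supervised step: fit one linear "expert" per partition
experts = []
for j in range(2):
    Xj, yj = X[labels == j], y[labels == j]
    w, *_ = np.linalg.lstsq(Xj, yj, rcond=None)
    experts.append(w)

# route a query to the expert of its nearest cluster center
q = np.array([0.1, 0.2])
j = int(np.argmin(((q - centers) ** 2).sum(-1)))
pred = q @ experts[j]
```

Because each expert only ever sees its own partition, its fit is not disturbed by the other sub-domain, which is the benefit the abstract describes for noisy, heterogeneous training sets.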
Conditions of asymptotic stability for c
Zhongfu Wu; Xiaofeng Liao; Juebang Yu
Article · 2000 · SP Science Press · English · 295 KB