## Abstract This paper presents a method for adapting the stepsize in on-line neural network training. The proposed technique derives from the stochastic gradient descent of Almeida et al. [On-line Learning in Neural Networks, 111-134, Cambridge University Press, 1998]. The new aspect of our approach c
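As a rough illustration of the family of methods the abstract refers to, here is a minimal sketch of an Almeida-style adaptive-stepsize rule for on-line gradient descent. The exact update in the paper is not reproduced here; the sketch assumes one common form of the rule, in which each per-parameter stepsize is multiplied by a factor u > 1 when successive gradient components agree in sign and divided by u when they disagree. The function name, the factor u, and the toy objective are all illustrative choices, not taken from the source.

```python
import numpy as np

def adaptive_sgd_step(w, grad, prev_grad, step, u=1.2):
    """One on-line update with per-parameter adaptive stepsizes.

    Sketch of an Almeida-style rule (assumed form): grow a stepsize
    when successive gradient components agree in sign, shrink it when
    they disagree, then take an ordinary gradient step.
    """
    agree = grad * prev_grad                       # > 0 where signs agree
    step = np.where(agree > 0, step * u, step / u) # adapt each stepsize
    w = w - step * grad                            # on-line gradient step
    return w, step

# Toy usage: minimize f(w) = 0.5 * ||w||^2 on-line (gradient of f is w)
w = np.array([2.0, -3.0])
step = np.full_like(w, 0.1)
prev_grad = np.zeros_like(w)
for _ in range(50):
    grad = w.copy()
    w, step = adaptive_sgd_step(w, grad, prev_grad, step)
    prev_grad = grad
```

On this quadratic the stepsizes grow until they overshoot, then oscillate near the stable value while the iterate contracts toward the minimum, which is the qualitative behaviour sign-agreement rules are designed to produce.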
Adaptive Training of Neural Networks for Automatic Seismic Phase Identification
by J. Wang
- Book ID: 105762084
- Publisher: Springer
- Year: 2002
- Language: English
- File size: 370 KB
- Volume: 159
- Category: Article
- ISSN: 0033-4533
SIMILAR VOLUMES
Adaptive stepsize algorithms for on-line
G.D. Magoulas; V.P. Plagianakos; M.N. Vrahatis
Article, 2001, Elsevier Science, English, 312 KB
Cross validation of neural network appli
H. Cenk Özmutlu; Fatih Çavdur; Amanda Spink; Seda Özmutlu
Article, 2006, Wiley (John Wiley & Sons), English, 970 KB
## Abstract There are recent studies in the literature on automatic topic-shift identification in Web search engine user sessions; however, most of this work applied its topic-shift identification algorithms to data logs from a single search engine. The purpose of this study is to provide the cros
Cross-validation of neural network appli
H. Cenk Özmutlu; Fatih Çavdur; Seda Özmutlu
Article, 2008, John Wiley and Sons, English, 524 KB
An adaptive conjugate gradient learning
H. Adeli; S.L. Hung
Article, 1994, Elsevier Science, English, 1003 KB
An improved SPSA algorithm for system id
Ahmad T. Abdulsadda; Kamran Iqbal
Article, 2011, Institute of Automation, Chinese Academy of Sciences, English, 751 KB
Automatic training of a min-max neural n
R. K. Brouwer
Article, 2004, Springer, English, 298 KB