Learning rates of gradient descent algorithm for classification
Authors: Xue-Mei Dong; Di-Rong Chen
- Book ID
- 104005899
- Publisher
- Elsevier Science
- Year
- 2009
- Language
- English
- File size
- 702 KB
- Volume
- 224
- Category
- Article
- ISSN
- 0377-0427
Synopsis
In this paper, a stochastic gradient descent algorithm is proposed for binary classification problems based on general convex loss functions. It has a computational advantage over existing algorithms when the sample size is large. Under some reasonable assumptions on the hypothesis space and the underlying distribution, a learning rate for the algorithm is established that is faster than those of closely related algorithms.
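The setting described in the synopsis can be illustrated with a minimal sketch: stochastic gradient descent on a convex surrogate loss (here the logistic loss) for labels in {-1, +1}, with a decaying step size. This is an assumed, generic formulation for illustration only, not the paper's exact algorithm (which works in a general hypothesis space); each update touches a single example, which is the source of the per-step cost advantage for large samples.

```python
import numpy as np

def sgd_binary_classifier(X, y, n_epochs=20, eta0=1.0, lam=0.01, seed=0):
    """SGD for binary classification with the logistic loss, a convex surrogate.

    Illustrative sketch only: a linear model with L2 regularization and a
    1/sqrt(t) step-size decay, not the paper's specific construction.
    Labels y must be in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            t += 1
            eta = eta0 / np.sqrt(t)  # decaying step size
            margin = y[i] * (X[i] @ w)
            # gradient of log(1 + exp(-margin)) w.r.t. w, plus regularization
            grad = -y[i] * X[i] / (1.0 + np.exp(margin)) + lam * w
            w -= eta * grad
    return w

# Usage on separable toy data: two Gaussian clusters with opposite labels.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(2.0, 1.0, (50, 2)), rng.normal(-2.0, 1.0, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
w = sgd_binary_classifier(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

Note the design choice implied by the synopsis: each iteration costs O(d) regardless of the sample size n, whereas batch methods pay O(nd) per gradient step.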
SIMILAR VOLUMES
## Abstract MRI gradient coil design is a type of nonlinear constrained optimization. A practical problem in transverse gradient coil design using the conjugate gradient descent (CGD) method is that wire elements move at different rates along orthogonal directions (__r, φ, z__) and tend to cross, …
In this paper, we try to analyze several conventional neuro-fuzzy learning algorithms, which are widely used in recent fuzzy applications for tuning fuzzy rules, and give a detailed summary of their properties. Some of these properties show that the uses of the conventional neuro-fuzzy learning …