𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Learning rates of gradient descent algorithm for classification

โœ Scribed by Xue-Mei Dong; Di-Rong Chen


Book ID
104005899
Publisher
Elsevier Science
Year
2009
Tongue
English
Weight
702 KB
Volume
224
Category
Article
ISSN
0377-0427

No coin nor oath required. For personal study only.

✦ Synopsis


In this paper, a stochastic gradient descent algorithm is proposed for binary classification problems based on general convex loss functions. It is computationally superior to existing algorithms when the sample size is large. Under mild assumptions on the hypothesis space and the underlying distribution, a learning rate for the algorithm is established that is faster than those of closely related algorithms.
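The synopsis describes the general setting rather than the paper's specific scheme, but the core idea — stochastic gradient descent on a convex surrogate loss, processing one sample per update so that the per-step cost is independent of the sample size — can be sketched as follows. This is an illustrative sketch only, not the authors' algorithm: the linear hypothesis space, the logistic loss, and the step-size schedule `1/(1+t)` are all assumptions chosen for the example.

```python
import numpy as np

def sgd_binary_classifier(X, y, loss_grad, step=lambda t: 1.0 / (1.0 + t),
                          epochs=5, seed=0):
    """Plain SGD for a linear binary classifier (illustrative sketch).

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    loss_grad(m) returns phi'(m) for a convex loss phi of the margin m.
    One sample per update, so each step costs O(d) regardless of n --
    the computational advantage the synopsis refers to for large samples.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w)
            # chain rule: grad_w phi(y <w, x>) = phi'(margin) * y * x
            w -= step(t) * loss_grad(margin) * y[i] * X[i]
            t += 1
    return w

# One convex choice: logistic loss phi(m) = log(1 + exp(-m)),
# whose derivative is phi'(m) = -1 / (1 + exp(m)).
logistic_grad = lambda m: -1.0 / (1.0 + np.exp(m))
```

For example, on linearly separable synthetic data, `sgd_binary_classifier(X, y, logistic_grad)` quickly recovers a separating direction; other convex losses (hinge, squared hinge) drop in by swapping `loss_grad`.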


📜 SIMILAR VOLUMES


Momentum-weighted conjugate gradient des
โœ Hanbing Lu; Andrzej Jesmanowicz; Shi-Jiang Li; James S. Hyde ๐Ÿ“‚ Article ๐Ÿ“… 2003 ๐Ÿ› John Wiley and Sons ๐ŸŒ English โš– 164 KB

MRI gradient coil design is a type of nonlinear constrained optimization. A practical problem in transverse gradient coil design using the conjugate gradient descent (CGD) method is that wire elements move at different rates along orthogonal directions (r, φ, z), and tend to cross,

Some considerations on conventional neur
โœ Yan Shi; Masaharu Mizumoto ๐Ÿ“‚ Article ๐Ÿ“… 2000 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 232 KB

In this paper, we try to analyze several conventional neuro-fuzzy learning algorithms, which are widely used in recent fuzzy applications for tuning fuzzy rules, and give a summarization of their properties in detail. Some of these properties show that the uses of the conventional neuro-fuzzy learni