New step lengths in conjugate gradient methods
By Yunda Dong
- Publisher
- Elsevier Science
- Year
- 2010
- Language
- English
- File size
- 306 KB
- Volume
- 60
- Category
- Article
- ISSN
- 0898-1221
Synopsis
Consider any conjugate gradient method for finding a zero of a given gradient whose underlying function is implicit. We propose two different types of conditions for selecting the step length using gradient information only. One is used to re-prove known convergence results under the usual gradient-Lipschitz assumption; moreover, if the gradient is merely continuous, we can still obtain some interesting convergence results. The other also guarantees convergence of the resulting conjugate gradient methods, with an application to the convergence analysis of the Fletcher-Reeves conjugate gradient method. Preliminary numerical experiments show the efficiency of the proposed step-length rules in practice.
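The synopsis refers to Fletcher-Reeves directions combined with step lengths computed from gradient information only. The sketch below is a minimal illustration of that general setup, not the paper's actual rule: it pairs the standard Fletcher-Reeves update with the classical gradient-only step length alpha_k = -g_k^T d_k / (L ||d_k||^2), which requires an assumed Lipschitz constant L for the gradient. The function name `fr_cg` and the choice of step rule are assumptions for this sketch.

```python
import numpy as np

def fr_cg(grad, x0, L, tol=1e-8, max_iter=200):
    """Fletcher-Reeves CG with a simple gradient-only step length.

    grad : callable returning the gradient at x
    L    : assumed Lipschitz constant of the gradient
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Illustrative gradient-only step length (not the paper's rule):
        # uses only g, d, and the Lipschitz bound L, never function values.
        alpha = -g.dot(d) / (L * d.dot(d))
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

For a strongly convex quadratic f(x) = (1/2) x^T A x - b^T x, one would call `fr_cg(lambda v: A @ v - b, x0, L)` with L equal to the largest eigenvalue of A; note that the step rule never evaluates f itself, which is the point of gradient-only step-length conditions.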
SIMILAR VOLUMES
To obtain an efficient parallel algorithm to solve sparse linear systems with the preconditioned conjugate gradient (PCG) method, two types of parallel preconditioners are introduced. The first is a polynomial preconditioner type based on a multisplitting of the matrix system, and the second one is obta
In this paper, a new spectral PRP conjugate gradient algorithm has been developed for solving unconstrained optimization problems, where the search direction is a combination of the gradient and the previously obtained direction, and the step length is obtained by a Wolfe-type inexact line search. I