New results on the convergence of the conjugate gradient method
By R. Bouyouli, G. Meurant, L. Smoch and H. Sadok
- Publisher
- John Wiley and Sons
- Year
- 2009
- Language
- English
- File size
- 120 KB
- Volume
- 16
- Category
- Article
- ISSN
- 1070-5325
- DOI
- 10.1002/nla.618
Free to read; no registration required. For personal study only.
Abstract
This paper is concerned with proving theoretical results related to the convergence of the conjugate gradient (CG) method for solving positive definite symmetric linear systems. Considering the inverse of the projection of the inverse of the matrix, new relations for ratios of the A-norm of the error and the norm of the residual are provided, starting from some earlier results of Sadok (Numer. Algorithms 2005; 40:201-216). The proofs of our results rely on the well-known correspondence between the CG method and the Lanczos algorithm. Copyright © 2008 John Wiley & Sons, Ltd.
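To make the quantities in the abstract concrete, the following is a minimal sketch (not the paper's code) of plain CG for a symmetric positive definite system, tracking at each iteration the residual norm ||r_k|| and the A-norm of the error ||x* - x_k||_A, the two quantities whose ratios the paper studies. The matrix, right-hand side, and monitoring via a precomputed exact solution are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Plain CG for SPD A; returns the iterate and a per-step history of
    (residual norm, A-norm of the error). Illustrative sketch only."""
    n = len(b)
    max_iter = max_iter or n
    x_exact = np.linalg.solve(A, b)  # used only to monitor the A-norm of the error
    x = np.zeros(n)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    history = []
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)      # step length
        x = x + alpha * p
        r_new = r - alpha * Ap          # updated residual
        e = x_exact - x                 # current error
        history.append((np.linalg.norm(r_new), np.sqrt(e @ (A @ e))))
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)  # conjugacy coefficient
        p = r_new + beta * p
        r = r_new
    return x, history

# Example: a small random SPD system (hypothetical test data)
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))
A = M @ M.T + 8 * np.eye(8)   # symmetric positive definite by construction
b = rng.standard_normal(8)
x, hist = conjugate_gradient(A, b)
```

A well-known property visible in `hist` is that the A-norm of the error decreases monotonically, whereas the residual norm need not; the paper's results concern relations between ratios of exactly these two sequences.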
SIMILAR VOLUMES
In this paper, we consider the rate of convergence of the parameter estimation error and the cost function for the stochastic gradient-type algorithm. The problem is solved in the case of the minimum-variance stochastic adaptive control. It is proven that the cost function has the rate of convergence...