In this paper, we propose a new nonmonotone line search technique for unconstrained optimization problems. Using this new technique, we establish global convergence under conditions weaker than those of existing nonmonotone line search techniques.
A derivative-free nonmonotone line-search technique for unconstrained optimization
✍ By M.A. Diniz-Ehrhardt; J.M. Martínez; M. Raydan
- Publisher
- Elsevier Science
- Year
- 2008
- Language
- English
- File size
- 210 KB
- Volume
- 219
- Category
- Article
- ISSN
- 0377-0427
Free to read; no registration required. For personal study only.
✦ Synopsis
A tolerant derivative-free nonmonotone line-search technique is proposed and analyzed. Several consecutive increases in the objective function, and also nondescent directions, are admitted for unconstrained minimization. To exemplify the power of this new line search, we describe a direct search algorithm in which the directions are chosen randomly. The convergence properties of this random method rely exclusively on the line-search technique. We present numerical experiments to illustrate the advantages of using a derivative-free nonmonotone globalization strategy with approximate-gradient-type methods and also with the inverse SR1 update, which can produce nondescent directions. In all cases we use a local-variation finite-difference approximation to the gradient.
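The idea in the synopsis can be sketched in code: sample a random direction, and accept a trial point when it improves on the maximum of the last few function values (a nonmonotone reference), backtracking the step size otherwise. This is a minimal illustrative sketch in the spirit of the paper, not the authors' exact tolerant acceptance rule; all parameter names and defaults below are assumptions.

```python
import numpy as np

def nonmonotone_random_search(f, x0, max_iter=500, memory=10,
                              alpha0=1.0, sigma=0.5, gamma=1e-4, tol=1e-8):
    """Sketch of a derivative-free nonmonotone search with random directions.

    A trial point x + alpha*d is accepted when
        f(x + alpha*d) <= max(last `memory` f-values) - gamma * alpha**2,
    a common nonmonotone sufficient-decrease rule; the paper's tolerant
    rule differs in detail (it also admits small increases).
    """
    x = np.asarray(x0, dtype=float)
    history = [f(x)]                        # recent function values
    for _ in range(max_iter):
        d = np.random.randn(x.size)
        d /= np.linalg.norm(d)              # random unit direction
        f_ref = max(history[-memory:])      # nonmonotone reference value
        alpha = alpha0
        accepted = False
        while alpha > tol and not accepted:
            # try both senses of the direction, since d may be nondescent
            for trial in (x + alpha * d, x - alpha * d):
                ft = f(trial)
                if ft <= f_ref - gamma * alpha**2:
                    x, accepted = trial, True
                    history.append(ft)
                    break
            alpha *= sigma                  # backtrack the step size
        if not accepted:                    # direction exhausted; resample
            history.append(history[-1])
    return x, history[-1]
```

Because acceptance is measured against the worst recent value rather than the current one, the iterates may climb for several steps, which is exactly the nonmonotone behavior the paper exploits to globalize methods whose directions are not guaranteed to be descent directions.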
📜 SIMILAR ARTICLES
In this paper, we present a new algorithm using the nonmonotone second-order Wolfe line search. By using negative curvature information from the Hessian, we prove that the generated sequence converges to stationary points that satisfy the second-order optimality conditions. We also report …
In this paper, we present a nonmonotone conic trust region method based on a line search technique for unconstrained optimization. The new algorithm can be regarded as a combination of the nonmonotone technique, line search technique and conic trust region method. When a trial step is not accepted, the me…