𝔖 Bobbio Scriptorium
✦   LIBER   ✦

A superlinearly convergent method of quasi-strongly sub-feasible directions with active set identifying for constrained optimization

โœ Scribed by Jin-bao Jian; Yi Liu


Publisher
Elsevier Science
Year
2011
Tongue
English
Weight
297 KB
Volume
12
Category
Article
ISSN
1468-1218


✦ Synopsis


Combining the norm-relaxed sequential quadratic programming (SQP) method and the idea of the method of quasi-strongly sub-feasible directions (MQSSFD) with an active set identification technique, a new SQP algorithm for solving nonlinear inequality constrained optimization is proposed. Unlike previous work, at each iteration of the proposed algorithm the norm-relaxed quadratic programming (QP) subproblem contains only the constraints corresponding to an active identification set. Moreover, the high-order correction direction (used to avoid the Maratos effect) is obtained by solving a system of linear equations (SLE) that likewise includes only the constraints and their gradients from the active identification set; the scale and computational cost of the high-order correction directions are therefore further reduced. The arc search in the algorithm effectively combines the initialization process with the optimization process, and the iteration points enter the feasible set after a finite number of iterations. Furthermore, the arc search conditions are weaker than in previous work, which further reduces the computational cost. Global convergence is proved under the Mangasarian–Fromovitz constraint qualification (MFCQ). If the strong second-order sufficient conditions are satisfied, then the active constraints are exactly identified by the identification set, and superlinear convergence is obtained without strict complementarity. Finally, some elementary numerical results are reported.
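The central device the synopsis describes — restricting the QP subproblem and the correction SLE to an identification set of nearly active constraints — can be illustrated with a minimal sketch. The function name, the fixed tolerance rule, and the sample values below are illustrative assumptions only; the paper's actual identification function uses a more refined, KKT-residual-based radius:

```python
def identify_active_set(g_vals, eps=1e-2):
    """Hypothetical identification rule (a sketch, not the paper's exact
    rule): flag an inequality constraint g_i(x) <= 0 as 'active' when its
    value at the current iterate lies within eps of the boundary."""
    return [i for i, gi in enumerate(g_vals) if gi >= -eps]

# Constraint values g_i(x) at some iterate x (illustrative numbers):
g_vals = [-0.003, -1.7, 0.0008, -0.04]
active = identify_active_set(g_vals)
print(active)  # -> [0, 2]

# Only constraints 0 and 2 would enter the norm-relaxed QP subproblem
# and the high-order-correction SLE; constraints far from their
# boundary (indices 1 and 3) are dropped, shrinking both problems.
```

The payoff claimed in the synopsis is exactly this shrinkage: when few constraints are near their boundary, the per-iteration subproblems are much smaller than ones built over all constraints.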


📜 SIMILAR VOLUMES


A generalized super-memory gradient proj
โœ Jin-Bao Jian; You-Fang Zeng; Chun-Ming Tang ๐Ÿ“‚ Article ๐Ÿ“… 2007 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 362 KB

In this work, combining the properties of the generalized super-memory gradient projection methods with the ideas of the strongly sub-feasible directions methods, we present a new algorithm with strong convergence for nonlinear inequality constrained optimization. At each iteration, the proposed algorithm…