Introduction to Continuous Optimization (Springer Optimization and Its Applications, 172)
✍ Roman A. Polyak
- Publisher
- Springer
- Year
- 2021
- Language
- English
- Pages
- 552
- Edition
- 1st ed. 2021
- Category
- Library
✦ Synopsis
This self-contained monograph presents the reader with an authoritative view of Continuous Optimization, an area of mathematical optimization that has experienced major developments during the past 40 years. The book contains results that have not yet been covered in a systematic way, as well as a summary of results on NR theory and methods developed over the last several decades. It is aimed at graduate students in applied mathematics, computer science, and economics, as well as researchers working in optimization and those applying optimization methods to real-life problems. Exercises throughout make the book suitable for a two-semester graduate course in Continuous Optimization.
The topical coverage includes interior point methods, self-concordance theory and related complexity issues, first- and second-order methods with accelerated convergence, and nonlinear rescaling (NR) theory and exterior point methods, to mention a few. The book presents a unified approach to both interior and exterior point methods, with emphasis on the crucial role of duality. One of its main achievements is showing what makes exterior point methods numerically attractive, and why.
The book is composed of five parts. The first part covers the basics of calculus and convex analysis, elements of unconstrained optimization, and classical results in linear and convex optimization. The second part covers the basics of self-concordance theory and interior point methods, including complexity results for LP, QP, QP with quadratic constraints, semidefinite programming, and conic programming. In the third part, the NR and Lagrangian transformation theories are developed and exterior point methods are described. Three important equilibrium problems are considered in the fourth part. In the fifth and final part, several important applications arising in economics, structural optimization, medicine, statistical learning theory, and more are detailed. Numerical results, obtained by solving a number of real-life and test problems, are also provided.
✦ Table of Contents
Preface
Contents
1 Introduction
2 Elements of Calculus and Convex Analysis
2.0 Introduction
2.1 Elements of Calculus
2.1.1 Differentiation of Scalar Functions
2.1.2 Differentiation of Vector Functions
2.1.3 Second Derivatives
2.1.4 Convex Functions in Rn
2.1.5 Strictly and Strongly Convex Functions in Rn
2.2 Convex Sets
2.2.1 Open and Closed Sets
2.2.2 Convex Sets
2.2.3 Affine Sets
2.2.4 Cones
2.2.5 Recession Cones
2.2.6 Polyhedrons and Polytopes
2.3 Closed Convex Functions
2.3.1 Operations on Closed Convex Functions
2.3.2 Projection on a Closed Convex Set
2.3.3 Separation Theorems
2.3.4 Some Properties of Convex Functions
2.3.4.1 Continuity of Convex Functions
2.3.4.2 Differentiability of Convex Functions
2.3.5 Subgradients
2.3.6 Support Functions
2.4 The Legendre–Fenchel Transformation
2.4.1 Basic LF Transformation Property
2.4.2 The LF Identity and the LF Invariant
3 Few Topics in Unconstrained Optimization
3.0 Introduction
3.1 Optimality Conditions
3.1.1 First-Order Necessary Condition
3.1.2 Second-Order Necessary Condition
3.1.3 Second-Order Sufficient Condition
3.2 Nondifferentiable Unconstrained Minimization
3.2.1 Subgradient Method
3.3 Gradient Methods
3.3.1 Gradient Method
3.3.2 Fast Gradient Method
3.3.3 Gradient Method for Strongly Convex Functions
3.4 Method Regularization
3.5 Proximal Point Method
3.6 Newton's Method and Regularized Newton Method
3.6.1 Introduction
3.6.2 Newton's Method
3.6.3 Local Quadratic Convergence of Newton's Method
3.6.4 Damped Newton Method
3.6.5 Global Convergence of DNM and Its Complexity
3.6.6 Regularized Newton Method
3.6.7 Local Quadratic Convergence Rate of RNM
3.6.8 Damped Regularized Newton Method
3.6.9 The Complexity of DRNM
3.6.10 Newton's Method as an Affine Invariant
4 Optimization with Equality Constraints
4.0 Introduction
4.1 Lagrangian and First-Order Optimality Condition
4.2 Second-Order Necessary and Sufficient Optimality Condition
4.3 Optimality Condition for Constrained Optimization Problems with Both Inequality Constraints and Equations
4.4 Duality for Equality-Constrained Optimization
4.5 Courant's Penalty Method as Tikhonov's Regularization for the Dual Problem
4.6 Gradient Methods for ECO
4.7 Newton's Method for Nonlinear System of Equations
4.8 Newton's Method for ECO
4.9 Augmented Lagrangian
4.10 The Multipliers Method and the Dual Quadratic Prox
4.11 Primal–Dual AL Method for ECO
5 Basics in Linear and Convex Optimization
5.0 Introduction
5.1 Linear Programming
5.1.1 Primal and Dual LP Problems
5.1.2 Optimality Condition for LP Problem
5.1.3 Farkas Lemma
5.2 The Karush–Kuhn–Tucker's Theorem
5.3 The KKT's Theorem for Convex Optimization with Linear Constraints
5.4 Duality in Convex Optimization
5.5 Wolfe's Duality
5.6 LP Duality
5.7 Some Structural LP Properties
5.8 Simplex Method
5.9 Interior Point Methods
5.9.1 Newton Log-Barrier Method for LP
5.9.2 Primal–Dual Interior Point Method
5.9.3 Affine Scaling Method
5.10 SUMT as Dual Interior Regularization
5.10.1 Log–Barrier Method and Its Dual Equivalent
5.10.2 Hyperbolic Barrier as Dual Parabolic Regularization
5.10.3 Exponential Penalty as Dual Regularization with Shannon's Entropy Function
5.10.4 Log–Sigmoid Method as Dual Regularization with Fermi–Dirac's Entropy Function
5.10.5 Interior Distance Functions
5.11 Primal–Dual IPM for Convex Optimization
5.12 Gradient Projection Method
5.12.1 Convergence of the GP Method
5.12.2 Fast GP Method
5.12.3 GP Method for Strongly Convex Function
5.13 Quadratic Programming
5.13.1 Dual GP Method for QP
5.13.2 Dual Fast Gradient Projection Method
5.14 Quadratic Programming Problems with Quadratic Constraints
5.15 Conditional Gradient Method
5.16 Primal–Dual Feasible Direction Method
6 Self-Concordant Functions and IPM Complexity
6.0 Introduction
6.1 LF Invariant and SC Functions
6.2 Basic Properties of SC Functions
6.3 Newton's Method for Minimization of SC Functions
6.4 SC Barrier
6.5 Path-Following Method
6.6 Applications of ν-SC Barrier. IPM Complexity for LP and QP
6.6.1 Linear and Quadratic Optimization
6.6.2 The Lorentz Cone
6.6.3 Semidefinite Optimization
6.7 Primal–Dual Predictor–Corrector for LP
7 Nonlinear Rescaling: Theory and Methods
7.0 Introduction
7.1 Nonlinear Rescaling
7.1.1 Preliminaries
7.1.2 Constraints Transformation and Lagrangian for the Equivalent Problem: Local Properties
7.1.3 Primal Transformations and Dual Kernels
7.1.4 NR Method and Dual Prox with φ-Divergence Distance
7.1.5 Q-Linear Convergence Rate
7.1.6 Stopping Criteria
7.1.7 Newton NR Method and "Hot" Start Phenomenon
7.2 NR with "Dynamic" Scaling Parameters
7.2.0 Introduction
7.2.1 Nonlinear Rescaling as Interior Quadratic Prox
7.2.2 Convergence of the NR Method
7.2.3 Rate of Convergence
7.2.4 Nonlinear Rescaling for LP
7.3 Primal–Dual NR Method for Convex Optimization
7.3.0 Introduction
7.3.1 Local Convergence of the PDNR
7.3.2 Global Convergence of the PDNR
7.4 Nonlinear Rescaling and Augmented Lagrangian
7.4.0 Introduction
7.4.1 Problem Formulation and Basic Assumptions
7.4.2 Lagrangian for the Equivalent Problem
7.4.3 Multipliers Method
7.4.4 NRAL and the Dual Prox
8 Realizations of the NR Principle
8.0 Introduction
8.1 Modified Barrier Functions
8.1.0 Introduction
8.1.1 Logarithmic MBF
8.1.2 Convergence of the Logarithmic MBF Method
8.1.3 Convergence Rate
8.1.4 MBF and Duality Issues
8.2 Exterior Distance Functions
8.2.0 Introduction
8.2.1 Exterior Distance
8.2.2 Exterior Point Method: Convergence and Convergence Rate
8.2.3 Stopping Criteria
8.2.4 Modified Interior Distance Functions
8.2.5 Local MIDF Properties
8.2.6 Modified Center Method
8.2.7 Basic Theorem
8.3 Nonlinear Rescaling vs. Smoothing Technique
8.3.0 Introduction
8.3.1 Log-Sigmoid Transformation and Its Modification
8.3.2 Equivalent Problem and LS Lagrangian
8.3.3 LS Multipliers Method as Interior Prox with Fermi–Dirac Entropy Distance
8.3.4 Convergence of the LS Multipliers Method
8.3.5 The Upper Bound for the Number of Steps
8.3.6 Asymptotic Convergence Rate
8.3.7 Generalization and Extension
8.3.8 LS Multipliers Method for Linear Programming
9 Lagrangian Transformation and Interior Ellipsoid Methods
9.0 Introduction
9.1 Lagrangian Transformation
9.2 Bregman's Distance
9.3 Primal LT and Dual Interior Quadratic Prox
9.4 Convergence Analysis
9.5 LT with Truncated MBF and Interior Ellipsoid Method
9.6 Lagrangian Transformation and Dual Affine Scaling Method for LP
10 Finding Nonlinear Equilibrium
10.0 Introduction
10.1 General NE Problem and the Equivalent VI
10.2 Problems Leading to NE
10.2.1 Convex Optimization
10.2.2 Finding a Saddle Point
10.2.3 Matrix Game
10.2.4 J. Nash Equilibrium in n-Person Concave Game
10.2.5 Walras–Wald Equilibrium
10.3 NE for Optimal Resource Allocation
10.3.1 Introduction
10.3.2 Problem Formulation
10.3.3 NE as a VI
10.3.4 Existence and Uniqueness of the NE
10.4 Nonlinear Input–Output Equilibrium
10.4.1 Introduction
10.4.2 Preliminaries
10.4.3 Problem Formulation
10.4.4 NIOE as a VI
10.4.5 Existence and Uniqueness of the NIOE
10.5 Finding NE for Optimal Resource Allocation
10.5.1 Introduction
10.5.2 Basic Assumptions
10.5.3 Pseudo-gradient Projection Method
10.5.4 Extra Pseudo-gradient Method for Finding NE
10.5.5 Convergence Rate
10.5.6 Bound for the Lipschitz Constant
10.5.7 Finding NE as a Pricing Mechanism
10.6 Finding Nonlinear Input–Output Equilibrium
10.6.1 Introduction
10.6.2 Basic Assumptions
10.6.3 PGP Method for Finding NIOE
10.6.4 EPG Method for Finding NIOE
10.6.5 Convergence Rate and Complexity of the EPG Method
10.6.6 Lipschitz Constant
10.7 Finding J. Nash Equilibrium in n-Person Concave Game
10.7.1 Projection Onto Probability Simplex
10.7.2 Algorithm for Projection onto PS
11 Applications and Numerical Results
11.0 Introduction
11.1 Truss Topology Design
11.2 Intensity-Modulated Radiation Therapy Planning
11.3 QP and Its Applications
11.3.1 Non-negative Least Squares
11.3.2 Support Vector Machines
11.3.3 Fast Gradient Projection for Dual QP. Numerical Results
11.4 Finding Nonlinear Equilibrium
11.5 The "Hot" Start Phenomenon
Concluding Remarks
Appendix
References
📜 SIMILAR VOLUMES
Focusing on the study of nonsmooth vector functions, this book presents a comprehensive account of the calculus of generalized Jacobian matrices and their applications to continuous nonsmooth optimization problems, as well as variational inequalities in finite dimensions.
The satellite range scheduling (SRS) problem, an important operations research problem in the aerospace industry consisting of allocating tasks among satellites and Earth-bound objects, is examined in this book. SRS principles and solutions are applicable to many areas.
This book examines the main methodological and theoretical developments in stochastic global optimization. It is designed to inspire readers to explore various stochastic methods of global optimization by clearly explaining the main methodological principles and features of the methods.
This book concerns the theory of optimal transport (OT) and its applications to solving problems in geometric optics. It is a self-contained presentation including a detailed analysis of the Monge problem, the Monge-Kantorovich problem, the transshipment problem, and the network flow problem.
Polynomial optimization is a fascinating field of study that has revolutionized the way we approach nonlinear problems described by polynomial constraints. The applications of this field range from production planning processes to transportation, energy consumption, and resource control.