An Introduction to Applied Optimal Control (Mathematics in Science and Engineering, Vol. 159)
by Greg Knowles
- Publisher: Academic Press
- Year: 1982
- Language: English
- Pages: 191
- Category: Library
Synopsis
Published by Academic Press.
Table of Contents
Front Page
An Introduction to Applied Optimal Control
Copyright Page
Contents
Preface
Chapter I. Examples of Control Systems; the Control Problem
General Form of the Control Problem
Chapter II. The General Linear Time Optimal Problem
1. Introduction
2. Applications of the Maximum Principle
3. Normal Systems—Uniqueness of the Optimal Control
4. Further Examples of Time Optimal Control
5. Numerical Computation of the Switching Times
References
Chapter III. The Pontryagin Maximum Principle
1. The Maximum Principle
2. Classical Calculus of Variations
3. More Examples of the Maximum Principle
References
Chapter IV. The General Maximum Principle; Control Problems with Terminal Payoff
1. Introduction
2. Control Problems with Terminal Payoff
3. Existence of Optimal Controls
References
Chapter V. Numerical Solution of Two-Point Boundary-Value Problems
1. Linear Two-Point Boundary-Value Problems
2. Nonlinear Shooting Methods
3. Nonlinear Shooting Methods: Implicit Boundary Conditions
4. Quasi-Linearization
5. Finite-Difference Schemes and Multiple Shooting
6. Summary
References
Chapter VI. Dynamic Programming and Differential Games
1. Discrete Dynamic Programming
2. Continuous Dynamic Programming—Control Problems
3. Continuous Dynamic Programming—Differential Games
References
Chapter VII. Controllability and Observability
1. Controllable Linear Systems
2. Observability
References
Chapter VIII. State-Constrained Control Problems
1. The Restricted Minimum Principle
2. Jump Conditions
3. The Continuous Wheat Trading Model without Shortselling
4. Some Models in Production and Inventory Control
References
Chapter IX. Optimal Control of Systems Governed by Partial Differential Equations
1. Some Examples of Elliptic Control Problems
2. Necessary and Sufficient Conditions for Optimality
3. Boundary Control and Approximate Controllability of Elliptic Systems
4. The Control of Systems Governed by Parabolic Equations
5. Time Optimal Control
6. Approximate Controllability for Parabolic Problems
References
Appendix I. Geometry of Rⁿ
Appendix II. Existence of Time Optimal Controls and the Bang-Bang Principle
Appendix III. Stability
Index
SIMILAR VOLUMES
- Combining control theory and modeling, this textbook introduces and builds on methods for simulating and tackling concrete problems in a variety of applied sciences. Emphasizing "learning by doing," the authors focus on examples and applications to real-world problems. An elementary present…
- Even with the advances in signal processing and digital communications, robustness to uncertain channel statistics continues to be a fundamental issue in the design and performance analysis of today's communications, radar, and sonar systems. The variability of digital communications systems c…
- Geared toward advanced undergraduate and graduate engineering students, this text introduces the theory and applications of optimal control. It serves as a bridge to the technical literature, enabling students to evaluate the implications of theoretical control work, and to judge the merits of…