
Optimal Control of Dynamic Systems Driven by Vector Measures: Theory and Applications

✍ Scribed by N. U. Ahmed, Shian Wang


Publisher
Springer
Year
2021
Tongue
English
Leaves
328
Edition
1
Category
Library


✦ Synopsis


This book is devoted to the development of optimal control theory for finite-dimensional systems governed by deterministic and stochastic differential equations driven by vector measures. The book deals with a broad class of controls, including regular controls (vector-valued measurable functions), relaxed controls (measure-valued functions), and controls determined by vector measures; both fully and partially observed control problems are considered.

In the past few decades, there have been remarkable advances in the field of systems and control theory thanks to the unprecedented interaction between mathematics and the physical and engineering sciences. Recently, optimal control theory for dynamic systems driven by vector measures has attracted increasing interest. This book presents this theory for dynamic systems governed by both ordinary and stochastic differential equations, including extensive results on the existence of optimal controls and necessary conditions for optimality. Computational algorithms are developed based on the optimality conditions, with numerical results presented to demonstrate the applicability of the theoretical results developed in the book.

This book will be of interest to researchers in optimal control or applied functional analysis interested in applications of vector measures to control theory, stochastic systems driven by vector measures, and related topics. In particular, this self-contained account can be a starting point for further advances in the theory and applications of dynamic systems driven and controlled by vector measures.
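To give a concrete feel for the kind of system the book treats, the sketch below simulates a linear system dx = A x dt + B μ(dt) driven by a purely atomic vector measure μ: between atoms the state follows the ODE, and at each atom (t_k, v_k) the state jumps by B v_k. This is an illustrative toy written for this page, not an algorithm from the book; all names and parameter values are assumptions.

```python
import numpy as np

def simulate_impulsive(A, B, x0, impulses, T, dt=1e-3):
    """Euler integration of dx = A x dt between impulse times.

    `impulses` is a list of atoms (t_k, v_k) of the driving measure;
    at each atom the state jumps by B @ v_k.  Returns the trajectory
    as a list of (t, x) pairs.
    """
    x = np.array(x0, dtype=float)
    t = 0.0
    impulses = sorted(impulses)          # atoms ordered by time t_k
    traj = [(t, x.copy())]
    idx = 0
    while t < T:
        # apply any atoms located at (or just passed by) the current time
        while idx < len(impulses) and impulses[idx][0] <= t:
            x = x + B @ np.asarray(impulses[idx][1], dtype=float)
            idx += 1
        x = x + dt * (A @ x)             # drift step between atoms
        t += dt
        traj.append((t, x.copy()))
    return traj

# usage: scalar decay dx = -x dt from x(0) = 0 with a unit impulse at t = 0.5;
# the state stays at 0, jumps to 1, then decays toward exp(-0.5) by t = 1
A = np.array([[-1.0]])
B = np.array([[1.0]])
traj = simulate_impulsive(A, B, [0.0], [(0.5, [1.0])], T=1.0)
```

The jump-plus-drift structure mirrors the measure-driven (impulsive) system models of Chapters 2 and 3, where the absolutely continuous part of the dynamics acts between the atoms of the driving measure.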

✦ Table of Contents


Preface
Contents
1 Mathematical Preliminaries
1.1 Introduction
1.2 Vector Space
1.3 Normed Space
1.4 Banach Space
1.5 Measures and Measurable Functions
1.6 Modes of Convergence and Lebesgue Integral
1.6.1 Modes of Convergence
1.6.2 Lebesgue Integral
1.7 Selected Results From Measure Theory
1.8 Special Hilbert and Banach Spaces
1.8.1 Hilbert Spaces
1.8.2 Special Banach Spaces
1.9 Metric Space
1.10 Banach Fixed Point Theorems
1.11 Frequently Used Results From Analysis
1.12 Bibliographical Notes
2 Linear Systems
2.1 Introduction
2.2 Representation of Solutions for TIS
2.2.1 Classical System Models
2.2.2 Impulsive System Models
2.3 Representation of Solutions for TVS
2.3.1 Classical System Models
2.3.2 Measure Driven System Models
2.3.3 Measure Induced Structural Perturbation
2.3.4 Measure Driven Control Systems
2.4 Bibliographical Notes
3 Nonlinear Systems
3.1 Introduction
3.2 Fixed Point Theorems for Multi-Valued Maps
3.3 Regular Systems (Existence of Solutions)
3.4 Impulsive Systems (Existence of Solutions)
3.4.1 Classical Impulsive Models
3.4.2 Systems Driven by Vector Measures
3.4.3 Systems Driven by Finitely Additive Measures
3.5 Differential Inclusions
3.6 Bibliographical Notes
4 Optimal Control: Existence Theory
4.1 Introduction
4.2 Regular Controls
4.3 Relaxed Controls
4.4 Impulsive Controls I
4.5 Impulsive Controls II
4.6 Structural Control
4.7 Differential Inclusions (Regular Controls)
4.8 Differential Inclusions (Measure-Valued Controls)
4.9 Systems Controlled by Discrete Measures
4.10 Existence of Optimal Controls
4.11 Bibliographical Notes
5 Optimal Control: Necessary Conditions of Optimality
5.1 Introduction
5.2 Relaxed Controls
5.2.1 Discrete Control Domain
5.3 Regular Controls
5.4 Transversality Conditions
5.4.1 Necessary Conditions Under State Constraints
5.5 Impulsive and Measure-Valued Controls
5.5.1 Signed Measures as Controls
5.5.2 Vector Measures as Controls
5.6 Convergence Theorem
5.7 Implementability of Necessary Conditions of Optimality
5.7.1 Discrete Measures
5.7.2 General Measures
5.8 Structural Controls
5.9 Discrete Measures with Variable Supports as Controls
5.10 Bibliographical Notes
6 Stochastic Systems Controlled by Vector Measures
6.1 Introduction
6.2 Conditional Expectations
6.3 SDE Based on Brownian Motion
6.3.1 SDE Driven by Vector Measures (Impulsive Forces)
6.4 SDE Based on Poisson Random Processes
6.5 Optimal Relaxed Controls
6.5.1 Existence of Optimal Controls
6.5.2 Necessary Conditions of Optimality
6.6 Regulated (Filtered) Impulsive Controls
6.6.1 Application to Special Cases
6.7 Unregulated Measure-Valued Controls
6.7.1 An Application
6.8 Fully Observed Optimal State Feedback Controls
6.8.1 Existence of Optimal State Feedback Laws
6.8.2 Necessary Conditions of Optimality
6.9 Partially Observed Optimal Feedback Controls
6.9.1 Existence of Optimal Feedback Laws
6.9.2 Necessary Conditions of Optimality
6.10 Bellman's Principle of Optimality
6.11 Bibliographical Notes
7 Applications to Physical Examples
7.1 Numerical Algorithms
7.1.1 Numerical Algorithm I
7.1.2 Numerical Algorithm II
7.2 Examples of Physical Systems
7.2.1 Cancer Immunotherapy
7.2.2 Geosynchronous Satellites
7.2.3 Prey-Predator Model
7.2.4 Stabilization of Building Maintenance Units
7.2.5 An Example of a Stochastic System
Bibliography
Index


πŸ“œ SIMILAR VOLUMES



Optimal Control of Distributed Systems:
✍ A. V. Fursikov πŸ“‚ Library πŸ“… 2000 πŸ› American Mathematical Soc. 🌐 English

This volume presents the analysis of optimal control problems for systems described by partial differential equations. The book offers simple and clear exposition of main results in this area. The methods proposed by the author cover cases where the controlled system corresponds to well-posed or ill

Optimization and control of bilinear sys
✍ Panos M. Pardalos, Vitaliy A. Yatsenko πŸ“‚ Library πŸ“… 2008 πŸ› Springer 🌐 English

Covers developments in bilinear systems theory. Focuses on the control of open physical processes functioning in a non-equilibrium mode. Emphasis is on three primary disciplines: modern differential geometry, control of dynamical systems, and optimization theory. Includes

Optimization and control of bilinear sys
✍ P M Pardalos; Vitaliy Yatsenko πŸ“‚ Library πŸ“… 2008 πŸ› Springer 🌐 English

The purpose of this book is to acquaint the reader with the developments in bilinear systems theory and its applications. Bilinear systems can be used to represent a wide range of physical, chemical, biological, and social systems, as well as manufacturing processes, which cannot be effectively mode
