Exact solutions of the Hamilton-Jacobi-Bellman equation for problems of optimal correction with a constrained overall control resource
By A.S. Bratus' and K.A. Volosov
- Publisher: Elsevier Science
- Year: 2004
- Language: English
- File size: 641 KB
- Volume: 68
- Category: Article
- ISSN: 0021-8928
Synopsis
The problem of controlling the oscillations of a mathematical pendulum is considered. The overall control resource is subject to an integral constraint: the modulus of the control function, raised to an arbitrary power greater than or equal to unity, must be a summable function over a specified time interval. The purpose of the control is to minimize a specified function of the phase variables at a fixed instant of time (Mayer's problem). Alongside the deterministic case, a stochastic case is studied in which the system is subject to random perturbations in the form of Gaussian white noise. In this case, it is required to minimize the mathematical expectation of specified functionals, or to maximize the probability that a phase coordinate falls within a specified domain by a fixed instant of time. It is well known [1,2] that the problem of constructing an optimal feedback control can be reduced to solving a Cauchy problem in an unbounded domain for the corresponding Hamilton-Jacobi-Bellman equation. It is proved that this problem is equivalent to a Cauchy problem for a linear parabolic equation. Exact solutions of this problem are found for the class of optimal control problems being considered. The case of a pulse correction, when the value of the integral of the modulus of the control function is bounded, is considered separately. The results obtained are extended to the case of an arbitrary number of phase variables if the control functions are square integrable.
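The synopsis states that the Hamilton-Jacobi-Bellman problem is equivalent to a Cauchy problem for a linear parabolic equation. The paper's own reduction is not reproduced here, but for orientation, the classical mechanism behind such equivalences is a logarithmic (Cole-Hopf-type) substitution, which for a quadratic control penalty removes the nonlinear gradient term. A generic sketch, with illustrative scalar dynamics $f$, noise intensity $\sigma$, and penalty weight $\gamma$ (all assumptions, not taken from the article):

```latex
% Stochastic HJB equation with quadratic control cost:
%   V_t + \min_u \left[ u\,V_x + \tfrac{|u|^2}{2\gamma} \right]
%       + f\,V_x + \tfrac{\sigma^2}{2} V_{xx} = 0 .
% The minimizing control is u^* = -\gamma V_x, giving the nonlinear equation
\[
  V_t + f\,V_x - \frac{\gamma}{2} V_x^2 + \frac{\sigma^2}{2} V_{xx} = 0 .
\]
% The logarithmic substitution
\[
  V(t,x) = -\frac{\sigma^2}{\gamma}\,\ln \psi(t,x)
\]
% cancels the quadratic term V_x^2 against the cross term produced by V_{xx},
% leaving a *linear* parabolic (backward Kolmogorov-type) equation for \psi:
\[
  \psi_t + f\,\psi_x + \frac{\sigma^2}{2}\,\psi_{xx} = 0 ,
\]
% which is a Cauchy problem once terminal data \psi(T,x) = e^{-\gamma \varphi(x)/\sigma^2}
% are prescribed from the Mayer cost \varphi.
```

This linearization is standard for square-integrable controls (power two in the integral constraint), consistent with the synopsis's final remark; the article's treatment of general powers and pulse corrections requires its own construction.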