Risk sensitive control of Markov processes in countable state space
✍ By Daniel Hernandez-Hernández; Steven I. Marcus
- Publisher: Elsevier Science
- Year: 1996
- Language: English
- File size: 450 KB
- Volume: 29
- Category: Article
- ISSN: 0167-6911
✦ Synopsis
In this paper we consider infinite horizon risk-sensitive control of Markov processes with discrete time and denumerable state space. This problem is solved by proving, under suitable conditions, that there exists a bounded solution to the dynamic programming equation. The dynamic programming equation is transformed into an Isaacs equation for a stochastic game, and the vanishing discount method is used to study its solution. In addition, we prove that the existence conditions are also necessary.
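The dynamic programming equation for this kind of problem is multiplicative: the exponential of the optimal ergodic cost times the value function equals the minimum over actions of the exponentiated one-step cost times the expected value at the next state. The sketch below illustrates this on a hypothetical 3-state, 2-action chain (the transition matrices, costs, and the small risk factor are invented for illustration, not taken from the paper); the fixed point is found by iterating the multiplicative Bellman operator with a power-iteration-style normalization.

```python
import math

# Hypothetical 3-state, 2-action controlled Markov chain (illustrative only).
# P[a][x][y] = transition probability from x to y under action a.
P = [
    [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.4, 0.5]],
    [[0.1, 0.6, 0.3],
     [0.4, 0.2, 0.4],
     [0.3, 0.3, 0.4]],
]
# c[a][x] = one-step cost of action a in state x.
c = [
    [1.0, 2.0, 0.5],
    [1.5, 0.5, 1.0],
]
gamma = 0.1  # small risk-sensitivity factor, matching the small-risk regime

def bellman_step(phi):
    """One application of the multiplicative Bellman operator:
    (T phi)(x) = min_a exp(gamma * c(x,a)) * sum_y P(y|x,a) phi(y)."""
    new = []
    for x in range(3):
        vals = [math.exp(gamma * c[a][x]) *
                sum(P[a][x][y] * phi[y] for y in range(3))
                for a in range(2)]
        new.append(min(vals))
    return new

phi = [1.0, 1.0, 1.0]  # initial guess for the (positive) value function
lam = 0.0
for _ in range(500):
    nxt = bellman_step(phi)
    norm = nxt[0]                 # normalize as in power iteration
    lam = math.log(norm) / gamma  # candidate optimal ergodic rate
    phi = [v / norm for v in nxt]

print(lam)   # approximate risk-sensitive ergodic cost
print(phi)   # normalized value function with phi[0] = 1
```

At the fixed point, applying the operator once more multiplies `phi` by `exp(gamma * lam)` and leaves its shape unchanged, which is the discrete analogue of the bounded solution to the dynamic programming equation established in the paper.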
📜 SIMILAR VOLUMES
Dynamic programming for piecewise deterministic Markov processes is studied, where only the jumps, but not the deterministic flow, can be controlled. One can then dispense with relaxed controls, and there exists an optimal stationary policy in feedback form. Further, a piecewise deterministic Markov model …