Discrete–Time Stochastic Control and Dynamic Potential Games: The Euler–Equation Approach
✍ Authors: David González-Sánchez, Onésimo Hernández-Lerma (auth.)
- Publisher: Springer International Publishing
- Year: 2013
- Language: English
- Pages: 81
- Series: SpringerBriefs in Mathematics
- Edition: 1
- Category: Library
✦ Synopsis
There are several techniques for studying noncooperative dynamic games, such as dynamic programming and the maximum principle (also called the Lagrange method). It turns out, however, that one way to characterize dynamic potential games requires analyzing inverse optimal control problems, and this is where the Euler-equation approach comes in, because it is particularly well suited to solving inverse problems. Despite the importance of dynamic potential games, no systematic study of them has been available. This monograph is the first attempt to provide a systematic, self-contained presentation of stochastic dynamic potential games.
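For readers unfamiliar with the approach, the following sketch shows the Euler equation in a standard deterministic discrete-time setting (illustrative notation only, not necessarily the formulation used in the book):

```latex
% Benchmark problem (illustrative): maximize over feasible sequences (x_t)
%   \sum_{t=0}^{\infty} \beta^t \, u(x_t, x_{t+1}), \qquad 0 < \beta < 1.
% An interior optimal path satisfies, for each t, the first-order condition
% obtained by differentiating with respect to x_t:
\[
  u_2(x_{t-1}, x_t) + \beta\, u_1(x_t, x_{t+1}) = 0, \qquad t = 1, 2, \dots
\]
% Here u_1, u_2 denote partial derivatives of u with respect to its first and
% second arguments. In the stochastic case the forward-looking term is taken
% in conditional expectation:
\[
  u_2(x_{t-1}, x_t) + \beta\, \mathbb{E}_t\!\left[ u_1(x_t, x_{t+1}) \right] = 0.
\]
```

Solving an inverse problem then amounts to recovering an objective u whose Euler equation reproduces given equilibrium dynamics, which is why this characterization is convenient for identifying dynamic potential games.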
✦ Table of Contents
Front Matter....Pages i-xiv
Introduction and Summary....Pages 1-10
Direct Problem: The Euler Equation Approach....Pages 11-34
The Inverse Optimal Control Problem....Pages 35-47
Dynamic Games....Pages 49-60
Conclusions and Suggestions for Future Research....Pages 61-63
Back Matter....Pages 65-69
✦ Subjects
Systems Theory, Control; Probability Theory and Stochastic Processes
📜 SIMILAR VOLUMES
This book provides a comprehensive introduction to stochastic control problems in discrete and continuous time. The material is presented logically, beginning with the discrete-time case before proceeding to the stochastic continuous-time models. Central themes are dynamic programming in discr…
*Discrete-time Stochastic Systems* gives a comprehensive introduction to the estimation and control of dynamic stochastic systems and provides complete derivations of key results such as the basic relations for Wiener filtering. The book covers both state-space methods and those based o…
This research monograph is the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the treatment of the intricate measure-theoretic issues.