[Communications and Control Engineering]
Chang, Hyeong Soo; Hu, Jiaqiao; Fu, Michael C.; Marcus, Steven I.
Article
2013
Springer London
English
599 KB
Markov decision process (MDP) models are widely used for modeling sequential decision-making problems that arise in engineering, economics, computer science, and the social sciences. Many real-world problems modeled by MDPs have huge state and/or action spaces, giving an opening to the curse of dimensionality.