Markov decision processes: a tool for sequential decision making under uncertainty