Structured Reachability Analysis for Markov Decision Processes

Craig Boutilier
Department of Computer Science
University of British Columbia
Vancouver, BC, CANADA, V6T 1Z4
email: cebly@cs.ubc.ca

Ronen I. Brafman
Department of Math and Computer Science
Ben-Gurion University
Beer Sheva, ISRAEL 84105
email: brafman@cs.bgu.ac.il

Christopher Geib
Honeywell Technology Center
MN65-2600, 3660 Technology Dr.
Minneapolis, MN, USA 55418
email: geib@htc.honeywell.com

Abstract
Recent research in decision-theoretic planning has focused on making the solution of Markov decision processes (MDPs) more feasible. We develop a family of algorithms for structured reachability analysis of MDPs that are suitable when an initial state (or set of states) is known. Using compact, structured representations of MDPs (e.g., Bayesian networks), our methods, which vary in the tradeoff between complexity and accuracy, produce structured descriptions of (estimated) reachable states. These descriptions can be used to eliminate variables or variable values from the problem description, reducing the size of the MDP and making it easier to solve. One contribution of our work is the extension of ideas from GraphPlan to deal with the distributed nature of the action representations typically embodied within Bayes nets and with the problem of correlated action effects. We also demonstrate that our algorithm can be made more complete by using k-ary rather than binary constraints. Another contribution is an illustration of how the compact representation of reachability constraints can be exploited by several existing (exact and approximate) abstraction algorithms for MDPs.

To appear, UAI-98
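
As an illustration only (the paper's algorithms operate on Bayes-net action representations and handle correlated effects, which the sketch below does not), the following Python fragment assumes a simplified, hypothetical STRIPS-like encoding of actions and computes, by a GraphPlan-style fixpoint, which values of each state variable are reachable from a given initial state when interactions between variable values are ignored. The function name, action encoding, and toy domain are all invented for this sketch; any variable value that never appears in the result could then be pruned from the MDP description.

# Hedged sketch, not the paper's algorithm: value-level forward reachability
# over a factored state space, ignoring interactions between values (so the
# result over-approximates the truly reachable set).

def reachable_values(init_state, actions):
    """For each variable, return the set of values reachable from init_state.

    init_state: dict mapping variable -> initial value
    actions: list of (preconditions, effects) pairs, each a dict mapping
             variable -> value (a hypothetical STRIPS-like encoding)
    """
    reachable = {var: {val} for var, val in init_state.items()}
    changed = True
    while changed:
        changed = False
        for preconds, effects in actions:
            # Optimistic applicability test: every precondition value must be
            # individually reachable; joint reachability is not checked.
            if all(val in reachable.get(var, set())
                   for var, val in preconds.items()):
                for var, val in effects.items():
                    if val not in reachable.setdefault(var, set()):
                        reachable[var].add(val)
                        changed = True
    return reachable

if __name__ == "__main__":
    # Hypothetical toy domain: coffee can only be picked up at the shop.
    init = {"loc": "office", "has_coffee": False}
    acts = [
        ({"loc": "office"}, {"loc": "shop"}),
        ({"loc": "shop"}, {"loc": "office"}),
        ({"loc": "shop", "has_coffee": False}, {"has_coffee": True}),
    ]
    # In this toy example every value of both variables turns out reachable.
    print(reachable_values(init, acts))

Because interactions between values are ignored, a pass like this can only over-estimate the set of reachable values; the binary and k-ary constraints mentioned in the abstract address exactly this kind of looseness.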
