Decentralized Markov Decision Processes with Event-Driven Interactions

Raphen Becker, Shlomo Zilberstein, and Victor Lesser. Decentralized Markov Decision Processes with Event-Driven Interactions. Proceedings of the Third International Joint Conference on Autonomous Agents and Multi Agent Systems (AAMAS), 302-309, New York City, 2004.

Abstract

Decentralized MDPs provide a powerful formal framework for planning in multi-agent systems, but the complexity of the model limits its usefulness. In this paper we study a class of DEC-MDPs that restricts the interactions between the agents to a structured, event-driven dependency. Such dependencies can model locking a shared resource or temporal enabling constraints, both of which arise frequently in practice. We show that the complexity of this class of problems is no harder than exponential in the number of states and doubly exponential in the number of dependencies. Since for many problems the number of dependencies is much smaller than the number of states, this is significantly better than the doubly exponential (in the state space) complexity of DEC-MDPs. We also demonstrate how an algorithm we previously developed can be used to solve problems in this class both optimally and approximately. Experimental results indicate that this solution technique is significantly faster than a naive policy search approach.
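To make the notion of an event-driven dependency concrete, the following minimal sketch (not the paper's formal model; all names and structures here are illustrative assumptions) shows a temporal enabling constraint between two agents: an event produced by one agent's activity enables a task for the other.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dependency:
    """Agent `source` producing `event` enables `task` for agent `target`."""
    source: int
    event: str
    target: int
    task: str

class EventDrivenWorld:
    """Tracks which events have occurred and which tasks they enable."""

    def __init__(self, dependencies):
        self.dependencies = dependencies
        self.occurred = set()  # (agent, event) pairs seen so far

    def record_event(self, agent, event):
        self.occurred.add((agent, event))

    def is_enabled(self, agent, task):
        # A task is enabled once every dependency pointing at it is satisfied.
        return all((d.source, d.event) in self.occurred
                   for d in self.dependencies
                   if d.target == agent and d.task == task)

# Agent 2's task "b" is blocked until agent 1 signals "done_a".
deps = [Dependency(source=1, event="done_a", target=2, task="b")]
world = EventDrivenWorld(deps)
assert not world.is_enabled(2, "b")   # temporal enabling constraint unmet
world.record_event(1, "done_a")
assert world.is_enabled(2, "b")       # event occurred; task now enabled
```

The same structure can stand in for resource locking: the "event" would be one agent releasing a shared resource, which enables another agent's use of it. The key point matching the abstract is that the joint problem's coupling lives entirely in these few dependencies, not in the full joint state space.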

Bibtex entry:

@inproceedings{BZLaamas04,
  author    = {Raphen Becker and Shlomo Zilberstein and Victor Lesser},
  title     = {Decentralized {M}arkov Decision Processes with Event-Driven
               Interactions},
  booktitle = {Proceedings of the Third International Joint Conference on
               Autonomous Agents and Multi Agent Systems},
  year      = {2004},
  pages     = {302--309},
  address   = {New York City},
  url       = {http://rbr.cs.umass.edu/shlomo/papers/BZLaamas04.html}
}

shlomo@cs.umass.edu
UMass Amherst