Distributed constraint optimization

Connecting BnB-ADOPT with Soft Arc Consistency: Initial Results

Conference Paper

Source:

Lecture Notes in Computer Science, Springer, Volume 6384, p.19-37 (2011)

Abstract:

Distributed constraint optimization problems with finite domains can be solved by asynchronous procedures. ADOPT is the reference algorithm for this kind of problem. Several versions of this algorithm have been proposed; one of them, BnB-ADOPT, changes the nature of the original algorithm from best-first to depth-first search. With BnB-ADOPT, we can in some cases assure that the value of a variable will not be used in the optimal solution, so this value can be deleted unconditionally. The contribution of this work is to propagate these unconditionally deleted values using soft arc consistency techniques, so that they become known to other variables that share cost functions. Propagating these unconditional deletions may generate new deletions, which are also propagated. The global effect is that we search in a smaller space, which improves performance. The effect of the propagation is evaluated on several benchmarks.
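The cascade of unconditional deletions described in this abstract can be illustrated with a small centralized sketch (not the paper's distributed algorithm; variables, costs, and the bound below are invented for illustration): deleting a value of one variable forces its neighbours to be re-checked, which may trigger further deletions.

```python
# Hedged sketch, not the paper's algorithm: cascading unconditional value
# deletions via a worklist, in the spirit of soft-arc-consistency
# propagation. All names and costs are illustrative.

TOP = 4  # any assignment costing TOP or more is provably non-optimal here

# unary costs: variable -> {value: cost}
unary = {
    "x": {0: 0, 1: 5},
    "y": {0: 0, 1: 0},
}
# binary cost functions: (var_i, var_j) -> {(value_i, value_j): cost}
binary = {
    ("x", "y"): {(0, 0): 0, (0, 1): 6, (1, 0): 0, (1, 1): 0},
}

def prune_and_propagate(unary, binary, top):
    """Delete value a of x when every extension to a neighbour already
    costs at least top; revisit neighbours of any variable that lost a
    value, so deletions cascade."""
    queue = list(unary)
    while queue:
        x = queue.pop()
        for (i, j), costs in binary.items():
            if x not in (i, j):
                continue
            y = j if x == i else i
            for a in list(unary[x]):
                best = min(
                    unary[x][a]
                    + (costs[(a, b)] if x == i else costs[(b, a)])
                    + unary[y][b]
                    for b in unary[y]
                )
                if best >= top:
                    del unary[x][a]   # unconditional deletion
                    queue.append(y)   # neighbour must be re-checked
    return unary

prune_and_propagate(unary, binary, TOP)
# deleting x=1 (cost at least 5 >= TOP) makes y=1 deletable in turn
```

In the distributed setting, the `queue.append(y)` step corresponds to sending the deletion to the agents that share a cost function with the deleting agent.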

Global Constraints in Distributed Constraint Satisfaction

Conference Paper

Source:

11th International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS 2012), Valencia, Spain, p.1263-1264 (2012)

Abstract:

Global constraints have been crucial for the advancement of centralized constraint processing. Here, we propose the inclusion of global constraints in distributed constraint satisfaction. We detail how this inclusion can be done, considering different decompositions for global constraints. In the context of the ABT algorithm, we provide experimental evidence of their benefits on several benchmarks.

Conference Paper

Source:

11th International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS 2012), Valencia, Spain, p.273-280 (2012)

Abstract:

Several multiagent tasks can be formulated and solved as DCOPs. BnB-ADOPT$^+$-AC is one of the most efficient algorithms for optimal DCOP solving. It is based on BnB-ADOPT, removing redundant messages and maintaining soft arc consistency during search. In this paper, we present several improvements for this algorithm, namely (i) a better implementation, (ii) processing simultaneous deletions exactly, and (iii) searching on arc-consistent cost functions. We present empirical results showing the benefits of these improvements on several benchmarks.

Decomposing Global Cost Functions

Conference Paper

Source:

CP 2011 workshop: Preferences and Soft Constraints, Perugia, Italy, p.16-30 (2011)

Abstract:

Similarly to what has been done with Global Constraints in Constraint Programming, several results on Global Cost Functions in weighted CSPs have recently been published, laying the premises of a Cost Function Programming paradigm.
In this paper, in the spirit of Berge-acyclic decompositions of global constraints such as Regular, we explore the possibility of decomposing Global Cost Functions in such a way that enforcing soft local consistencies on the decomposed cost function offers guarantees on the level of consistency enforced on the original global cost function. We show that an extension of Directional Arc Consistency to arbitrary arities and Virtual Arc Consistency offer specific guarantees. We conclude with preliminary experiments on WeightedRegular decompositions showing that decompositions can be very useful for easily integrating global cost functions into existing solvers with good efficiency.
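The core decomposition idea can be sketched in a toy setting (this is not the paper's construction; the automaton and costs below are invented): a Regular-like global cost over a word is rewritten as a sum of small ternary cost functions linked by explicit state variables, and minimising the decomposed sum recovers the original global cost.

```python
import itertools

# Hedged toy of the decomposition idea: a WeightedRegular-like global
# cost is expressed as a sum of local ternary functions over explicit
# state variables q0..qn. The automaton is invented for illustration:
# it pays cost 1 whenever a symbol repeats its predecessor.

INF = float("inf")
STATES = (0, 1)

def trans(q, s):
    """Deterministic transition: next state is the symbol just read,
    with cost 1 if it repeats the current state."""
    return s, (1 if q == s else 0)

def global_cost(word):
    """Direct evaluation of the global cost function."""
    q, total = 0, 0
    for s in word:
        q, c = trans(q, s)
        total += c
    return total

def local(q, s, q2):
    """Ternary cost function over (state, symbol, next state)."""
    nq, c = trans(q, s)
    return c if q2 == nq else INF

def decomposed_cost(word):
    """Minimise the sum of the local functions over the state
    variables; agreement with global_cost is exactly the kind of
    guarantee one wants from a decomposition."""
    n = len(word)
    best = INF
    for qs in itertools.product(STATES, repeat=n):
        states = (0,) + qs  # q0 is the fixed initial state
        best = min(best, sum(local(states[i], word[i], states[i + 1])
                             for i in range(n)))
    return best
```

For instance, `global_cost((1, 1, 0, 0))` and `decomposed_cost((1, 1, 0, 0))` both evaluate to 2, one unit per repeated symbol. The paper's question is then what soft local consistency enforced on the small `local` functions guarantees about the original global function.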

Distributed Constraint Optimization Problems related with Soft Arc Consistency (Extended Abstract)

Conference Paper

Source:

22nd International Joint Conference on Artificial Intelligence (IJCAI 2011), Barcelona, Spain, p.2812-2813 (2011)

Abstract:

Distributed Constraint Optimization Problems (DCOPs) can be optimally solved by distributed search algorithms, such as ADOPT and BnB-ADOPT. In centralized solving, maintaining soft arc consistency during search has proved to be beneficial for performance. In this thesis we aim to explore the maintenance of different levels of soft arc consistency in distributed search when solving DCOPs.

Multi-Agent Coordination: DCOPs and Beyond

Conference Paper

Authors:

Marc Pujol-Gonzalez

Source:

Twenty-Second International Joint Conference on Artificial Intelligence, AAAI Press, Volume 3, Barcelona, p.2838-2839 (2011)

ISBN:

978-1-57735-515-1

URL:

http://ijcai.org/papers11/contents.php

Abstract:

Distributed constraint optimization problems (DCOPs) are a model for representing multi-agent systems in which agents cooperate to optimize a global objective. The DCOP model has two main advantages: it can represent a wide range of problem domains, and it supports the development of generic algorithms to solve them. Firstly, this paper presents some advances in both complete and approximate DCOP algorithms. Secondly, it explains that the DCOP model makes a number of unrealistic assumptions that severely limit its range of application. Finally, it outlines how such limitations can be tackled.

Improving function filtering for computationally demanding DCOPs

Conference Paper

Source:

Workshop on Distributed Constraint Reasoning at IJCAI 2011, Barcelona, p.99-111 (2011)

Abstract:

In this paper we focus on solving DCOPs in computationally demanding scenarios. GDL solves DCOPs optimally, but requires exponentially large cost functions, making it impractical in such settings. Function filtering is a technique that reduces the size of cost functions. We improve the effectiveness of function filtering to reduce the amount of resources required to optimally solve DCOPs. As a result, we enlarge the range of problems solvable by algorithms employing function filtering.
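The basic filtering idea can be sketched as follows (a minimal illustration, not the paper's exact definitions; the table, bound, and names are invented): an entry of a cost function whose cost, plus an optimistic bound on the rest of the problem, already reaches a known upper bound can never improve the best solution, so it is dropped before the function is communicated.

```python
# Hedged sketch of the function-filtering idea. Names, the bound
# function, and the upper bound are illustrative.

def filter_function(cost_table, bound_on_rest, ub):
    """Keep tuple t only if cost(t) + optimistic bound on the rest of
    the problem is still below the known upper bound ub."""
    return {t: c for t, c in cost_table.items()
            if c + bound_on_rest(t) < ub}

f = {(0, 0): 3, (0, 1): 1, (1, 0): 6, (1, 1): 2}
# suppose the rest of the problem costs at least 2 whatever we choose
g = filter_function(f, lambda t: 2, ub=6)
# (1, 0) is dropped: 6 + 2 >= 6; the filtered function is smaller
```

Smaller tables mean smaller messages and less memory, which is precisely what makes GDL-style algorithms viable on demanding instances.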

Enforcing Soft Local Consistency on Multiple Representations for DCOP Solving

Conference Paper

Source:

CP 2010 workshop: Preferences and Soft Constraints, St. Andrews, Scotland, p.98-113 (2010)

Abstract:

Connecting soft arc consistency with distributed search in DCOP solving has been very beneficial for performance. However, including higher levels of soft arc consistency breaks usual privacy requirements. To avoid this issue, we propose to keep different representations of the same problem on each agent, on which soft arc consistencies are enforced respecting privacy. Deletions caused in one representation can be legally propagated to others. Experimentally, this yields significant benefits.

Conference Paper

Source:

22nd International Joint Conference on Artificial Intelligence (IJCAI 2011), Barcelona, Spain, p.554-559 (2011)

Abstract:

ADOPT and BnB-ADOPT are two optimal DCOP search algorithms that are similar except for their search strategies: the former uses best-first search and the latter uses depth-first branch-and-bound search. In this paper, we present a new algorithm, called ADOPT($k$), that generalizes them. Its behavior depends on the $k$ parameter. It behaves like ADOPT when $k=1$, like BnB-ADOPT when $k=\infty$ and like a hybrid of ADOPT and BnB-ADOPT when $1 < k < \infty$. We prove that ADOPT($k$) is a correct and complete algorithm and experimentally show that ADOPT($k$) outperforms ADOPT and BnB-ADOPT on several benchmarks across several metrics.
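A centralized toy can convey how a single parameter interpolates between the two search strategies (this is only an analogue; the distributed algorithm's actual criterion is different, and the tree and bounds below are invented): keep going deeper while the most recently generated node's lower bound is within k of the best bound on the frontier.

```python
import math

# Hedged centralized toy of the k parameter: with integer bounds,
# k = 1 reproduces best-first order and k = math.inf reproduces
# depth-first order, loosely mirroring the paper's convention.

def expansion_order(children, lb, root, k):
    """Return the order in which nodes are expanded."""
    order, frontier = [], []
    node = root
    while True:
        order.append(node)
        frontier.extend(children.get(node, []))
        if not frontier:
            return order
        best = min(frontier, key=lambda n: lb[n])
        newest = frontier[-1]  # deepest, most recently generated node
        # go deeper only while the newest node is within k of the best
        node = newest if lb[newest] < lb[best] + k else best
        frontier.remove(node)

children = {"r": ["a", "b"], "a": ["c", "d"], "b": ["e"]}
lb = {"r": 0, "a": 3, "b": 1, "c": 4, "d": 5, "e": 2}

expansion_order(children, lb, "r", 1)         # best-first order
expansion_order(children, lb, "r", math.inf)  # depth-first order
```

Intermediate values of k trade the memory-friendly persistence of depth-first search against the bound quality of best-first search, which is the hybrid regime the abstract refers to.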

BnB-ADOPT+ with Several Soft AC Levels

Conference Paper

Source:

European Conference on Artificial Intelligence (ECAI-10), Lisbon, Portugal, p.67-72 (2010)

Keywords:

Distributed constraint optimization; soft constraints; arc consistency

Abstract:

Distributed constraint optimization problems can be solved by BnB-ADOPT$^+$, a distributed asynchronous search algorithm. In the centralized case, local consistency techniques applied to constraint optimization have proved very beneficial for performance. In this paper, we combine BnB-ADOPT$^+$ with different levels of soft arc consistency, propagating unconditional deletions caused either by the enforced local consistency or by distributed search. The new algorithm maintains BnB-ADOPT$^+$ optimality and termination. In practice, this approach substantially decreases the communication cost and computation effort of BnB-ADOPT$^+$ on commonly used benchmarks.
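One of the simplest deletion rules in this family can be sketched as follows (a node-consistency-style test given for illustration; variable names, costs, and bounds are invented): a value whose unary cost plus the global lower bound already reaches the upper bound cannot appear in any solution cheaper than that bound, so it is deleted unconditionally.

```python
# Hedged sketch of a node-consistency-style deletion rule of the kind
# soft arc consistency levels build on. All names are illustrative.

def nc_prune(unary, c_phi, top):
    """Drop value a of x whenever c_phi + unary[x][a] >= top:
    c_phi is a global lower bound already paid by every assignment,
    top is the cost of the best solution known so far."""
    return {x: {a: c for a, c in dom.items() if c_phi + c < top}
            for x, dom in unary.items()}

domains = nc_prune({"x": {0: 0, 1: 7}, "y": {0: 2, 1: 0}}, c_phi=3, top=10)
# x = 1 is deleted: 3 + 7 >= 10
```

Stronger levels of soft arc consistency first move cost from binary functions into the unary ones (and into the global lower bound) by projection, which makes this same deletion test fire more often.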