Conference paper - Year: 2013

Bandit-based Search for Constraint Programming

Abstract

Constraint Programming (CP) solvers classically explore the solution space using tree-search-based heuristics. Monte-Carlo Tree Search (MCTS) is a method aimed at optimal sequential decision making under uncertainty. It estimates node values (with respect to some reward function) by Monte-Carlo trials and uses them to bias exploration towards the most promising regions of the tree, borrowing the multi-armed-bandit decision rule. At the crossroads of CP and MCTS, this paper presents the Bandit Search for Constraint Programming (BASCOP) algorithm, adapting MCTS to the specifics of CP search trees. These adaptations concern i) the design of a generic reward function suited to CP problems; ii) the replacement of Monte-Carlo trials by iterations of a complete depth-first-search procedure; iii) the ability to take into account an existing value-ordering heuristic; iv) the aggregation of statistics in order to handle multiple restarts. BASCOP, using Gecode as the underlying constraint solver, shows significant improvements over the depth-first-search baseline on some CP benchmark suites, demonstrating its potential as a generic yet robust search method for CP.
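The multi-armed-bandit decision rule referred to in the abstract is commonly instantiated by a UCB1-style score that balances a node's average reward against an exploration bonus. The sketch below is a generic, hedged illustration of that selection rule, not BASCOP's exact reward or update scheme; the dictionary field names and the `exploration_c` parameter are assumptions made for the example.

```python
import math

def ucb_select(children, exploration_c=1.0):
    """Pick the child (candidate value assignment) maximizing a UCB1-style
    score: mean reward plus an exploration bonus that shrinks as the child
    is visited more often. Generic sketch, not the paper's exact rule."""
    # Unvisited children are tried first so every branch gets an estimate.
    for c in children:
        if c["visits"] == 0:
            return c
    total_visits = sum(c["visits"] for c in children)

    def score(c):
        mean_reward = c["reward_sum"] / c["visits"]
        bonus = exploration_c * math.sqrt(math.log(total_visits) / c["visits"])
        return mean_reward + bonus

    return max(children, key=score)

# Example: three candidate branches with accumulated statistics.
children = [
    {"value": 0, "visits": 10, "reward_sum": 4.0},
    {"value": 1, "visits": 3,  "reward_sum": 2.1},
    {"value": 2, "visits": 1,  "reward_sum": 0.9},
]
print(ucb_select(children)["value"])
```

In an MCTS-style CP search, such a rule would be applied at each node touched during the descent, with the statistics updated after each (here, depth-first-search) iteration returns its reward.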

Dates and versions

hal-00863451, version 1 (18-09-2013)

Identifiers

  • HAL Id: hal-00863451, version 1

Cite

Manuel Loth, Michèle Sebag, Youssef Hamadi, Marc Schoenauer. Bandit-based Search for Constraint Programming. International Conference on Principles and Practice of Constraint Programming, Sep 2013, Uppsala, Sweden. pp. 464-480. ⟨hal-00863451⟩