Conference Papers, Year: 2012

Online Sparse bandit for Card Games

Abstract

Finding an approximation of a Nash equilibrium in matrix games is an important topic that reaches beyond the strict application to matrix games. A bandit algorithm commonly used to approximate a Nash equilibrium is EXP3 [?]. However, the solution to many problems is often sparse, yet EXP3 inherently fails to exploit this property. To the knowledge of the authors, there exists only an offline truncation proposed by [?] to tackle this issue. In this paper, we propose a variation of EXP3 that exploits the sparsity of the solution by dynamically removing arms; the resulting algorithm empirically performs better than previous versions. We apply the resulting algorithm to an MCTS program for the card game Urban Rivals.
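The abstract refers to EXP3 and a variant that dynamically removes arms. The sketch below is only an illustration of that idea, not the paper's algorithm: it implements standard EXP3 and adds a hypothetical pruning step that discards arms whose sampling probability stays below a fixed threshold after a warm-up period. The function name, parameters (gamma, prune_threshold, warmup), and the pruning criterion itself are assumptions made for this example.

```python
import math
import random

def exp3_with_pruning(reward_fns, horizon, gamma=0.1,
                      prune_threshold=0.01, warmup=200):
    """Standard EXP3 plus an illustrative arm-removal heuristic.

    reward_fns: list of callables, each returning a reward in [0, 1].
    The pruning rule below is a placeholder, not the paper's criterion.
    """
    arms = list(range(len(reward_fns)))   # indices of still-active arms
    weights = {a: 1.0 for a in arms}
    for t in range(1, horizon + 1):
        k = len(arms)
        total = sum(weights[a] for a in arms)
        # EXP3 mixes the exponential-weights distribution with uniform exploration.
        probs = {a: (1 - gamma) * weights[a] / total + gamma / k for a in arms}
        arm = random.choices(arms, weights=[probs[a] for a in arms])[0]
        reward = reward_fns[arm]()
        # Importance-weighted exponential update (standard EXP3 step).
        weights[arm] *= math.exp(gamma * reward / (probs[arm] * k))
        # Hypothetical sparsity heuristic: after a warm-up phase, drop arms
        # whose sampling probability has collapsed below the threshold.
        if t > warmup and k > 1:
            kept = [a for a in arms if probs[a] >= prune_threshold]
            if kept:
                arms = kept
    return arms, weights

# Toy usage: three Bernoulli arms; the weak arms should eventually be pruned.
if __name__ == "__main__":
    arm_fns = [lambda p=p: float(random.random() < p) for p in (0.7, 0.3, 0.1)]
    active, w = exp3_with_pruning(arm_fns, horizon=5000)
    print("active arms:", active)
```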
Main file: onlinesparse.pdf (274.46 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01116714, version 1 (17-02-2015)

Identifiers

  • HAL Id: hal-01116714, version 1

Cite

David Saint-Pierre, Quentin Louveaux, Olivier Teytaud. Online Sparse bandit for Card Games. Advances in Computer Games, Nov 2011, Tilburg, Netherlands. ⟨hal-01116714⟩