Conference paper, Year: 2004

Accelerated greedy mixture learning

Jan Nunnink
Jakob Verbeek
Nikos Vlassis

Abstract

Mixture probability densities are popular models used in several data mining and machine learning applications, e.g., clustering. A standard algorithm for learning such models from data is the Expectation-Maximization (EM) algorithm. However, EM can be slow on large datasets, and therefore approximation techniques are needed. In this paper we propose a variational approximation to the greedy EM algorithm which offers speedups that are at least linear in the number of data points. Moreover, by strictly increasing a lower bound on the data log-likelihood in every learning step, our algorithm guarantees convergence. We demonstrate the proposed algorithm on a synthetic experiment where satisfactory results are obtained.
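
For context, the sketch below illustrates the plain (unaccelerated) greedy EM scheme that the abstract refers to: a mixture is grown one component at a time and the enlarged mixture is refined with EM after each insertion. It is a minimal sketch in Python/NumPy under assumed details (1-D Gaussian components, a crude random-data-point insertion heuristic, names such as greedy_em and em_step are illustrative); it is not the paper's variational, accelerated algorithm, whose lower bound on the data log-likelihood is what yields the reported speedups.

    # Minimal, illustrative sketch of plain greedy EM mixture learning for
    # 1-D Gaussian mixtures (NumPy). NOT the paper's accelerated variational
    # algorithm; component insertion heuristic and names are assumptions.
    import numpy as np

    def log_gauss(x, mu, var):
        # Log-density of N(mu, var) at each point of x (broadcasts over components).
        return -0.5 * (np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

    def em_step(x, w, mu, var):
        # One standard EM iteration: responsibilities, then parameter updates.
        log_r = np.log(w) + log_gauss(x[:, None], mu, var)        # shape (n, k)
        log_r -= log_r.max(axis=1, keepdims=True)                 # numerical stabilisation
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        return w, mu, var

    def log_likelihood(x, w, mu, var):
        # Data log-likelihood under the current mixture (log-sum-exp per point).
        lp = np.log(w) + log_gauss(x[:, None], mu, var)
        m = lp.max(axis=1, keepdims=True)
        return float((m[:, 0] + np.log(np.exp(lp - m).sum(axis=1))).sum())

    def greedy_em(x, k_max=3, em_iters=25, rng=None):
        # Greedy loop: start with one component, insert a candidate component,
        # then refine the enlarged mixture with a few EM iterations.
        rng = np.random.default_rng() if rng is None else rng
        w, mu, var = np.array([1.0]), np.array([x.mean()]), np.array([x.var() + 1e-6])
        for _ in range(1, k_max):
            cand_mu, cand_var = rng.choice(x), x.var() / 4.0      # crude candidate
            w = np.append(0.9 * w, 0.1)                           # keep weights normalised
            mu = np.append(mu, cand_mu)
            var = np.append(var, cand_var)
            for _ in range(em_iters):
                w, mu, var = em_step(x, w, mu, var)
        return w, mu, var

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 1.0, 500)])
        w, mu, var = greedy_em(data, rng=rng)
        print("weights:", w, "means:", mu, "log-lik:", log_likelihood(data, w, mu, var))

In this baseline, every EM iteration touches every data point; the accelerated variant described in the abstract instead increases a variational lower bound on the log-likelihood, which is what makes the per-step cost scale favourably with the number of data points.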
Files

  • verbeek04bnl.pdf (107.57 KB), main file
  • NVV04.png (11.79 KB), figure/image
  • Origin: Files produced by the author(s)

Dates and versions

inria-00321482, version 1 (02-02-2011)
inria-00321482, version 2 (05-04-2011)

Identifiers

  • HAL Id: inria-00321482, version 2

Cite

Jan Nunnink, Jakob Verbeek, Nikos Vlassis. Accelerated greedy mixture learning. Benelearn: Annual Machine Learning Conference of Belgium and the Netherlands, Jan 2004, Brussels, Belgium. ⟨inria-00321482v2⟩
117 views
166 downloads
