Variational inference for Dirichlet process mixtures
Top 1% of 2006 papers
Abstract
Dirichlet process (DP) mixture models are the cornerstone of nonparametric Bayesian statistics, and the development of Markov chain Monte Carlo (MCMC) sampling methods for DP mixtures has enabled the application of nonparametric Bayesian methods to a variety of practical data analysis problems. However, MCMC sampling can be prohibitively slow, and it is important to explore alternatives. One class of alternatives is provided by variational methods, a class of deterministic algorithms that convert inference problems into optimization problems (Opper and Saad 2001; Wainwright and Jordan 2003). Thus far, variational methods have mainly been explored in the parametric setting, in particular within the formalism of the exponential family (Attias 2000; Ghahramani and Beal 2001; Blei et al. 2003). In this paper, we present a variational inference algorithm for DP mixtures. We present experiments that compare the algorithm to Gibbs sampling algorithms for DP mixtures of Gaussians and present an application to a large-scale image analysis problem.
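As background for the model class the abstract describes, a DP mixture can be simulated via its truncated stick-breaking construction. The sketch below is illustrative only and is not the paper's variational algorithm; the truncation level, the Gaussian base measure `N(0, 5^2)`, and the unit observation noise are assumptions chosen for the example.

```python
import numpy as np

def sample_dp_mixture(n, alpha, truncation=50, seed=0):
    """Draw n points from a truncated stick-breaking DP mixture of Gaussians.

    Illustrative sketch: base measure and noise scale are arbitrary choices,
    not values from the paper.
    """
    rng = np.random.default_rng(seed)
    # Stick-breaking weights: v_k ~ Beta(1, alpha),
    # pi_k = v_k * prod_{j<k} (1 - v_j)
    v = rng.beta(1.0, alpha, size=truncation)
    pi = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    pi /= pi.sum()  # renormalize the small mass lost to truncation
    # Component means drawn from an assumed Gaussian base measure G0 = N(0, 5^2)
    means = rng.normal(0.0, 5.0, size=truncation)
    # Assign each observation to a component, then add unit-variance noise
    z = rng.choice(truncation, size=n, p=pi)
    x = rng.normal(means[z], 1.0)
    return x, z

x, z = sample_dp_mixture(500, alpha=1.0)
```

With a small concentration parameter `alpha`, most of the stick-breaking mass falls on the first few components, so the 500 draws typically occupy only a handful of clusters; the paper's variational algorithm exploits the same truncated representation on the inference side.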
Related Papers
- Gibbs sampling and other econometric applications of Markov chains (1995)
- → Markov Chain Monte Carlo: The Gibbs Sampler and the Metropolis Algorithm (1993)
- → Monte Carlo Approach: Gibbs Sampling (2014)
- → Gibbs Sampling and Bayesian Inference (2006)
- → Use of Gibbs sampling in variance component estimation and breeding value prediction (2014)