Bayesian Estimation of Beta Mixture Models with Variational Inference
Top 1% of 2011 papers
Abstract
Bayesian estimation of the parameters in beta mixture models (BMM) is analytically intractable. Numerical solutions that simulate the posterior distribution are available, but they incur high computational cost. In this paper, we introduce an approximation to the prior/posterior distribution of the parameters of the beta distribution and propose an analytically tractable (closed-form) Bayesian approach to parameter estimation. The approach is based on the variational inference (VI) framework. Following the principles of VI and utilizing the relative convexity bound, the extended factorized approximation method is applied to approximate the distribution of the parameters in the BMM. In a fully Bayesian model, where all of the parameters of the BMM are treated as random variables and assigned proper distributions, our approach can asymptotically find the optimal estimate of the posterior distribution of the parameters. Moreover, the model complexity can be determined from the data. A closed-form solution is derived, so no iterative numerical calculation is required. At the same time, our approach avoids the overfitting drawback of the conventional expectation maximization (EM) algorithm. The good performance of this approach is verified by experiments on both synthetic and real data.
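The abstract contrasts the proposed variational Bayesian approach with the conventional EM algorithm for beta mixture models. As context, the following is a minimal illustrative sketch of that EM baseline on synthetic data; the two-component setup and the moment-matching M-step (used because the exact beta MLE has no closed form) are assumptions for illustration, not the paper's method.

```python
# Illustrative maximum-likelihood EM for a two-component beta mixture model
# (the baseline the paper's variational Bayesian approach improves upon).
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

# Synthetic data from a known two-component beta mixture.
x = np.concatenate([
    rng.beta(2.0, 8.0, size=300),   # low-mean component
    rng.beta(8.0, 2.0, size=200),   # high-mean component
])

K = 2
a = np.array([1.0, 5.0])   # shape parameters alpha_k (initial guesses)
b = np.array([5.0, 1.0])   # shape parameters beta_k (initial guesses)
w = np.full(K, 1.0 / K)    # mixing weights

for _ in range(100):
    # E-step: responsibilities r[n, k] proportional to w_k * Beta(x_n | a_k, b_k).
    dens = np.stack([w[k] * beta.pdf(x, a[k], b[k]) for k in range(K)], axis=1)
    r = dens / dens.sum(axis=1, keepdims=True)

    # M-step: update mixing weights; update (a_k, b_k) by matching the
    # responsibility-weighted sample mean and variance of each component
    # (a common closed-form approximation to the beta MLE).
    w = r.mean(axis=0)
    for k in range(K):
        m = np.average(x, weights=r[:, k])
        v = np.average((x - m) ** 2, weights=r[:, k])
        common = m * (1.0 - m) / v - 1.0
        a[k] = m * common
        b[k] = (1.0 - m) * common

print("alpha:", np.round(a, 1), "beta:", np.round(b, 1), "weights:", np.round(w, 2))
```

Unlike the Bayesian treatment in the paper, this maximum-likelihood EM fixes the number of components K in advance and can overfit when K is misspecified, which is exactly the drawback the abstract highlights.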
Related Papers
- Posterior predictive checks can and should be Bayesian: Comment on Gelman and Shalizi, 'Philosophy and the practice of Bayesian statistics' (2012)
- A Bayesian Analysis of the Linear Calibration Problem (1981)
- Bayesian analysis for proportions with an independent background effect (2006)
- Paired Comparison Model: A Bayesian Non-Informative Analysis (2015)
- Characterization of the Bayesian Posterior Distribution in Terms of Self-information (2017)