Model-Based Learning Using a Mixture of Mixtures of Gaussian and Uniform Distributions
Abstract
We introduce a mixture model whereby each mixture component is itself a mixture of a multivariate Gaussian distribution and a multivariate uniform distribution. Although this model could be used for model-based clustering (model-based unsupervised learning) or model-based classification (model-based semi-supervised learning), we focus on the more general model-based classification framework. In this setting, we fit our mixture models to data where some of the observations have known group memberships and the goal is to predict the memberships of observations with unknown labels. We also present a density estimation example. A generalized expectation-maximization algorithm is used to estimate the parameters and thereby give classifications in this mixture of mixtures model. To simplify the model and the associated parameter estimation, we suggest holding some parameters fixed; this leads to the introduction of more parsimonious models. A simulation study is performed to illustrate how the model allows for bursts of probability and locally higher tails. Two further simulation studies illustrate how the model performs on data simulated from multivariate Gaussian distributions and on data from multivariate t-distributions. This novel approach is also applied to real data, and its performance under the various restrictions is discussed.
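To give a concrete picture of the density described above, the sketch below evaluates a mixture in which each component is itself a convex combination of a multivariate Gaussian and a multivariate uniform distribution. This is a minimal illustration rather than the paper's implementation: the uniform supports are assumed here to be axis-aligned boxes, and the names (`component_density`, `mixture_density`, `alpha`, `lower`, `upper`) are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal


def component_density(x, mu, Sigma, lower, upper, alpha):
    """Density of one component: a convex combination of a multivariate
    Gaussian and a multivariate uniform distribution.

    alpha        : weight on the Gaussian part (1 - alpha on the uniform part)
    lower, upper : bounds of an axis-aligned box supporting the uniform
                   (an illustrative assumption about the uniform's support)
    """
    gauss = multivariate_normal.pdf(x, mean=mu, cov=Sigma)
    in_box = np.all((x >= lower) & (x <= upper), axis=-1)
    volume = np.prod(upper - lower)
    unif = in_box / volume
    return alpha * gauss + (1.0 - alpha) * unif


def mixture_density(x, pis, params):
    """Overall density: sum over components g of pi_g * component_density_g(x)."""
    return sum(pi * component_density(x, **p) for pi, p in zip(pis, params))


# Example: two components in two dimensions
params = [
    dict(mu=np.zeros(2), Sigma=np.eye(2),
         lower=np.array([-3.0, -3.0]), upper=np.array([3.0, 3.0]), alpha=0.9),
    dict(mu=np.array([4.0, 4.0]), Sigma=0.5 * np.eye(2),
         lower=np.array([2.0, 2.0]), upper=np.array([6.0, 6.0]), alpha=0.8),
]
pis = [0.6, 0.4]
print(mixture_density(np.array([0.0, 0.0]), pis, params))
```

The uniform part places extra probability uniformly over its support, which is what allows the model to capture bursts of probability and locally heavier tails than a pure Gaussian mixture; in the full model the parameters would be estimated by a generalized EM algorithm rather than set by hand as in this example.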