pith. machine review for the scientific record.

arXiv: 2404.16746 · v2 · submitted 2024-04-25 · 📊 stat.ME · math.ST · stat.ML · stat.TH

Recognition: unknown

Estimating the Number of Components in Finite Mixture Models via Variational Approximation

Authors on Pith: no claims yet
classification: 📊 stat.ME · math.ST · stat.ML · stat.TH
keywords: components · model · number · approximation · ELBO · FMMs · mixture · variational
original abstract

This work introduces a new method for selecting the number of components in finite mixture models (FMMs) using variational Bayes, inspired by the large-sample properties of the Evidence Lower Bound (ELBO) derived from a mean-field (MF) variational approximation. Specifically, we establish matching upper and lower bounds for the ELBO without assuming conjugate priors, which suggests that model selection for FMMs by maximizing the ELBO is consistent. As a by-product of our proof, we demonstrate that the MF approximation inherits the stable behavior of the posterior distribution (a benefit of model singularity), which tends to eliminate the extra components when the number of mixture components is over-specified. This stable behavior also yields an $n^{-1/2}$ convergence rate for parameter estimation, up to a logarithmic factor, under such overspecification. Empirical experiments validate our theoretical findings and compare the method with other state-of-the-art approaches for selecting the number of components in FMMs.
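As a rough illustration of the selection rule the abstract describes (fit a mean-field variational approximation for each candidate K and keep the K that maximizes the ELBO), here is a minimal sketch using scikit-learn's BayesianGaussianMixture, whose lower_bound_ attribute exposes a mean-field ELBO. Note that scikit-learn uses conjugate priors, whereas the paper's bounds do not assume conjugacy, and the synthetic data are invented for the demo; this sketches the recipe, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data from a 3-component Gaussian mixture (ground truth K = 3).
X = np.concatenate([
    rng.normal(-4.0, 1.0, 300),
    rng.normal(0.0, 0.7, 300),
    rng.normal(5.0, 1.2, 300),
]).reshape(-1, 1)

# Fit a mean-field variational GMM for each candidate K and record the ELBO.
# Caveat: sklearn's lower_bound_ is its per-sample mean-field ELBO under
# conjugate (Dirichlet / Normal-Wishart) priors; the paper's theory does not
# require conjugacy, so this only illustrates the ELBO-maximization rule.
candidates = range(1, 8)
elbos = []
for k in candidates:
    vb = BayesianGaussianMixture(
        n_components=k,
        weight_concentration_prior_type="dirichlet_distribution",
        max_iter=500,
        n_init=5,          # restarts to reduce sensitivity to initialization
        random_state=0,
    ).fit(X)
    elbos.append(vb.lower_bound_)

k_hat = list(candidates)[int(np.argmax(elbos))]
print(f"selected number of components: {k_hat}")
```

On data like the above, the ELBO typically peaks at the true K; the paper's interest is in proving this selection is consistent and that over-specified extra components are suppressed.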

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. On Bayesian Softmax-Gated Mixture-of-Experts Models

    stat.ML 2026-04 unverdicted novelty 7.0

    Bayesian softmax-gated mixture-of-experts models achieve posterior contraction for density estimation and parameter recovery under Voronoi losses, along with two strategies for choosing the number of experts.

  2. PAC-Bayes Bounds for Gibbs Posteriors via Singular Learning Theory

    stat.ML 2026-04 unverdicted novelty 6.0

    PAC-Bayes bounds for Gibbs posteriors are obtained via singular learning theory, producing explicit and tighter posterior-averaged risk bounds that adapt to data structure in overparameterized models.