Pith: machine review for the scientific record

arXiv: 1804.11130 · v4 · submitted 2018-04-30 · cs.LG · cs.AI · stat.ML


Competitive Training of Mixtures of Independent Deep Generative Models

keywords: training · data · distribution · generative · independent · models · capture · competitive
Abstract

A common assumption in causal modeling posits that the data is generated by a set of independent mechanisms, and algorithms should aim to recover this structure. Standard unsupervised learning, however, is often concerned with training a single model to capture the overall distribution or aspects thereof. Inspired by clustering approaches, we consider mixtures of implicit generative models that "disentangle" the independent generative mechanisms underlying the data. Relying on an additional set of discriminators, we propose a competitive training procedure in which the models only need to capture the portion of the data distribution from which they can produce realistic samples. As a by-product, each model is simpler and faster to train. We empirically show that our approach splits the training distribution in a sensible way and increases the quality of the generated samples.
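The competitive-partitioning idea in the abstract can be made concrete with a toy sketch. This is NOT the paper's algorithm (which trains implicit generative models against a set of discriminators); here, hypothetical explicit Gaussian "models" stand in for the generators, purely to illustrate the mechanism: each data point is claimed by the model that scores it best, and each model is then refit only on the portion of the data it won.

```python
# Toy illustration of competitive training of a mixture (an assumption-laden
# sketch, not the paper's discriminator-based procedure): two simple Gaussian
# models compete for data points; each model only needs to capture the part
# of the distribution it wins, mirroring the clustering-inspired idea above.
import math
import random
import statistics

random.seed(0)

# Data drawn from two well-separated modes.
data = ([random.gauss(-4.0, 1.0) for _ in range(200)] +
        [random.gauss(4.0, 1.0) for _ in range(200)])

# Initialize k = 2 competing models, each a (mean, std) pair.
means = [-1.0, 1.0]
stds = [1.0, 1.0]

for _ in range(10):
    # Each model scores every point by its Gaussian log-density (up to a constant).
    buckets = [[], []]
    for x in data:
        scores = [-0.5 * ((x - m) / s) ** 2 - math.log(s)
                  for m, s in zip(means, stds)]
        # Competitive assignment: the point goes to the best-scoring model.
        buckets[scores.index(max(scores))].append(x)
    # Each model is refit only on the portion of the data it captured.
    for k in range(2):
        if buckets[k]:
            means[k] = statistics.fmean(buckets[k])
            stds[k] = statistics.pstdev(buckets[k]) + 1e-6

print(sorted(round(m, 1) for m in means))  # each model settles on one mode
```

After a few rounds the two models specialize, one per mode, so each fits a simpler distribution than the full mixture. This mirrors the by-product noted in the abstract: each individual model's task is easier than modeling the whole data distribution.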

This paper has not been read by Pith yet.

discussion (0)
