Pith: machine review for the scientific record

arXiv: 1505.05770 · v6 · submitted 2015-05-21 · 📊 stat.ML · cs.AI · cs.LG · stat.CO · stat.ME

Recognition: unknown

Variational Inference with Normalizing Flows

Authors on Pith: no claims yet
classification: 📊 stat.ML · cs.AI · cs.LG · stat.CO · stat.ME
keywords: variational inference · posterior · approximations · flows · normalizing · simple · approaches
Original abstract

The choice of approximate posterior distribution is one of the core problems in variational inference. Most applications of variational inference employ simple families of posterior approximations in order to allow for efficient inference, focusing on mean-field or other simple structured approximations. This restriction has a significant impact on the quality of inferences made using variational methods. We introduce a new approach for specifying flexible, arbitrarily complex and scalable approximate posterior distributions. Our approximations are distributions constructed through a normalizing flow, whereby a simple initial density is transformed into a more complex one by applying a sequence of invertible transformations until a desired level of complexity is attained. We use this view of normalizing flows to develop categories of finite and infinitesimal flows and provide a unified view of approaches for constructing rich posterior approximations. We demonstrate that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.
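The abstract's core idea, transforming a simple initial density through a sequence of invertible maps while tracking the change of variables, can be sketched with the paper's planar flow. This is a minimal NumPy illustration, not trained code: the parameters `w`, `u`, `b` and the flow depth `K = 4` are hand-picked assumptions chosen only so the transformation stays invertible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Planar flow: f(z) = z + u * h(w.z + b) with h = tanh.
# Hand-picked illustrative parameters (w.u = 1 >= -1 keeps f invertible).
d = 2
w = np.array([2.0, 0.0])
u = np.array([0.5, 0.5])
b = 0.0

def planar_forward(z):
    """Apply one planar transformation; return (f(z), log|det Jacobian|)."""
    a = z @ w + b                                # pre-activation, shape (n,)
    f = z + np.outer(np.tanh(a), u)              # f(z) = z + u h(w.z + b)
    psi = (1 - np.tanh(a) ** 2)[:, None] * w     # psi(z) = h'(w.z + b) w
    logdet = np.log(np.abs(1 + psi @ u))         # |det J| = |1 + u.psi(z)|
    return f, logdet

# Start from a standard-normal base density and push samples through K flows.
z = rng.standard_normal((1000, d))
log_q = -0.5 * np.sum(z**2, axis=1) - 0.5 * d * np.log(2 * np.pi)
for _ in range(4):                               # K = 4 flow steps
    z, logdet = planar_forward(z)
    log_q -= logdet                              # change-of-variables update
```

After the loop, `z` holds samples from the transformed (more complex) density and `log_q` their exact log-density, which is what makes the approximation usable inside a variational objective.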

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 10 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Neural Ordinary Differential Equations

    cs.LG 2018-06 accept novelty 8.0

    Neural networks are redefined as continuous dynamical systems by learning the derivative of the hidden state with a neural network and integrating it with an ODE solver.

  2. Density estimation using Real NVP

    cs.LG 2016-05 accept novelty 8.0

    Real NVP uses affine coupling layers to create invertible transformations that support exact density estimation, sampling, and latent inference without approximations.

  3. Testing machine-learned distributions against Monte Carlo data for the QCD chiral phase transition

    hep-lat 2026-05 unverdicted novelty 7.0

    Conditional MAFs interpolate QCD chiral phase structure across coupling, mass, and volume, reproducing reweighting while cutting required ensembles despite bias near transitions.

  4. Information as Maximum-Caliber Deviation: A bridge between Integrated Information Theory and the Free Energy Principle

    q-bio.NC 2026-05 unverdicted novelty 6.0

    Information defined as maximum-caliber deviation derives IIT 3.0 cause-effect repertoires from constrained entropy maximization and equates to prediction error under CLT and LDT.

  5. ML for the hKLM at the 2nd Detector

    physics.ins-det 2026-04 unverdicted novelty 6.0

    Graph neural networks trained on simulated hits outperform classical methods for energy resolution, timing, and particle identification in an iron-scintillator sampling calorimeter, with an integrated multi-objective ...

  6. HuggingFace's Transformers: State-of-the-art Natural Language Processing

    cs.CL 2019-10 accept novelty 6.0

    Hugging Face releases an open-source Python library that supplies a unified API and pretrained weights for major Transformer architectures used in natural language processing.

  7. Pre-localization of Massive Black Hole Binaries in the Millihertz Band

    gr-qc 2026-04 unverdicted novelty 5.0

    A neural spline flow pipeline performs amortized inference on millihertz MBHB signals, delivering ~20 deg² pre-merger sky localizations in ~1 minute while matching PTMCMC sky modes and parameter uncertainties.

  8. Machine Learning Techniques for Astrophysics and Cosmology: Simulation-Based Inference

    astro-ph.CO 2026-05 unverdicted novelty 2.0

    Simulation-based inference uses neural networks trained on simulations to enable parameter inference in cosmology and astrophysics where traditional likelihood calculations are intractable.

  9. Application of Machine Learning to 21 cm Cosmology

    astro-ph.CO 2026-05 unverdicted novelty 2.0

    Machine learning helps 21 cm cosmology most when it preserves physically relevant structure and propagates uncertainty explicitly instead of replacing the forward model.

  10. Application of Machine Learning to 21 cm Cosmology

    astro-ph.CO 2026-05 unverdicted novelty 1.0

    Machine learning can address data contamination, accelerate modeling, and aid inference in 21 cm cosmology when it preserves physical structure and uncertainty rather than acting as an opaque replacement.
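The Real NVP entry in the citation list above hinges on affine coupling layers being exactly invertible with a cheap log-determinant. This is a minimal NumPy sketch of that construction; the scale and shift functions `s` and `t` are toy stand-ins for the small neural networks Real NVP would actually learn.

```python
import numpy as np

# Affine coupling (Real NVP style): split z = (z1, z2);
#   y1 = z1,  y2 = z2 * exp(s(z1)) + t(z1).
# s and t below are toy functions, not learned networks.
def s(z1):
    return np.tanh(z1)        # toy "scale" network

def t(z1):
    return 0.5 * z1           # toy "shift" network

def coupling_forward(z):
    z1, z2 = z[:, :1], z[:, 1:]
    y2 = z2 * np.exp(s(z1)) + t(z1)
    logdet = np.sum(s(z1), axis=1)   # log|det J| = sum of log-scales
    return np.concatenate([z1, y2], axis=1), logdet

def coupling_inverse(y):
    y1, y2 = y[:, :1], y[:, 1:]
    z2 = (y2 - t(y1)) * np.exp(-s(y1))   # exact inverse, no approximation
    return np.concatenate([y1, z2], axis=1)

z = np.random.default_rng(1).standard_normal((4, 2))
y, logdet = coupling_forward(z)
z_back = coupling_inverse(y)
```

Because only `z2` is transformed and the scale depends only on `z1`, the Jacobian is triangular, so both the inverse and the log-determinant come for free, which is what enables the exact density estimation and sampling the summary describes.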