pith. machine review for the scientific record.

arxiv: 1703.09194 · v3 · submitted 2017-03-27 · 📊 stat.ML · cs.LG

Recognition: unknown

Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference

Authors on Pith: no claims yet
classification: 📊 stat.ML · cs.LG
keywords: gradient · variational · estimator · approaches · posterior · simple · analyze · approximate
Original abstract

We propose a simple and general variant of the standard reparameterized gradient estimator for the variational evidence lower bound. Specifically, we remove a part of the total derivative with respect to the variational parameters that corresponds to the score function. Removing this term produces an unbiased gradient estimator whose variance approaches zero as the approximate posterior approaches the exact posterior. We analyze the behavior of this gradient estimator theoretically and empirically, and generalize it to more complex variational distributions such as mixtures and importance-weighted posteriors.
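The abstract's idea can be illustrated on a toy problem. Below is a minimal numpy sketch (my own illustrative construction, not the paper's experiments) for a 1-D Gaussian variational family against a standard-normal "posterior": the standard reparameterized estimator keeps the total derivative through the variational parameters, while the "sticking the landing" variant drops the direct score-function term, keeping only the gradient that flows through the sample `z`. At the optimum `q = p`, the path-only gradient is identically zero for every sample.

```python
import numpy as np

# Toy setup (illustrative assumption, not from the paper):
#   target "posterior"  p(z) = N(0, 1)
#   variational family  q(z; mu, log_sigma) = N(mu, sigma^2),  z = mu + sigma * eps

def grad_log_p(z):                    # d/dz log N(z; 0, 1)
    return -z

def grad_log_q_z(z, mu, sigma):       # d/dz log N(z; mu, sigma^2)
    return -(z - mu) / sigma**2

def estimators(mu, log_sigma, eps):
    """Per-sample reparameterized ELBO gradients w.r.t. (mu, log_sigma).

    Returns ((std_mu, std_ls), (stl_mu, stl_ls)): the standard
    total-derivative estimator and the path-only (STL) estimator.
    """
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps

    # Path derivative only -- this is the STL estimator: the variational
    # parameters appearing inside log q are treated as constants
    # (the stop-gradient / detach trick in an autodiff framework).
    path = grad_log_p(z) - grad_log_q_z(z, mu, sigma)
    stl_mu = path * 1.0               # dz/dmu = 1
    stl_ls = path * sigma * eps       # dz/dlog_sigma = sigma * eps

    # The standard estimator additionally differentiates the parameters
    # directly inside log q (the score-function term). For a Gaussian:
    #   d log q / d mu        = (z - mu) / sigma^2
    #   d log q / d log_sigma = (z - mu)^2 / sigma^2 - 1
    score_mu = (z - mu) / sigma**2
    score_ls = (z - mu)**2 / sigma**2 - 1.0
    std_mu = stl_mu - score_mu
    std_ls = stl_ls - score_ls
    return (std_mu, std_ls), (stl_mu, stl_ls)

rng = np.random.default_rng(0)
eps = rng.standard_normal(100_000)

# At the optimum q = p (mu = 0, sigma = 1), the STL gradient is exactly
# zero for every sample, while the standard estimator keeps sampling noise.
(std_mu, std_ls), (stl_mu, stl_ls) = estimators(0.0, 0.0, eps)
print(np.var(std_mu), np.var(stl_mu))
```

Both estimators are unbiased (the removed score term has zero expectation), but only the path-only variant's variance vanishes as the approximate posterior approaches the exact one, which is the abstract's central claim.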

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. DreamFusion: Text-to-3D using 2D Diffusion

    cs.CV · 2022-09 · accept · novelty 7.0

    Optimizes a Neural Radiance Field via probability density distillation from a 2D diffusion model to produce text-conditioned 3D scenes viewable from any angle.