pith. machine review for the scientific record.

arxiv: 1811.08723 · v1 · submitted 2018-11-21 · 📊 stat.ML · cs.LG

Recognition: unknown

Sequential Neural Methods for Likelihood-free Inference

keywords: inference, neural, approaches, approximate, likelihood, likelihood-free, methods, simulations
original abstract

Likelihood-free inference refers to inference when a likelihood function cannot be explicitly evaluated, which is often the case for models based on simulators. Most of the literature is based on sample-based 'Approximate Bayesian Computation' methods, but recent work suggests that approaches based on deep neural conditional density estimators can obtain state-of-the-art results with fewer simulations. The neural approaches vary in how they choose which simulations to run and what they learn: an approximate posterior or a surrogate likelihood. This work provides some direct controlled comparisons between these choices.
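To make the contrast in the abstract concrete, here is a minimal sketch of the sample-based rejection-ABC baseline the neural methods are compared against. The simulator, prior, summary statistic, and tolerance are all illustrative choices, not taken from the paper; a Gaussian simulator is used only so the result is easy to sanity-check.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    # Toy simulator: n draws from N(theta, 1). In real likelihood-free
    # settings this would be a black box with no tractable density.
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    # Summary statistic: the sample mean (an illustrative choice).
    return x.mean()

def rejection_abc(observed, prior_sampler, eps=0.1, n_sims=20000):
    # Sample-based ABC: draw parameters from the prior, simulate, and
    # keep only draws whose simulated summary lands within eps of the
    # observed summary. The accepted draws approximate the posterior.
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_sims):
        theta = prior_sampler()
        if abs(summary(simulator(theta)) - s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)

observed = rng.normal(2.0, 1.0, size=50)  # data from true theta = 2
posterior = rejection_abc(observed, lambda: rng.uniform(-5.0, 5.0))
```

Note the inefficiency that motivates the neural alternatives: most of the 20,000 simulations are discarded, whereas a neural conditional density estimator would use every simulation to fit either an approximate posterior or a surrogate likelihood.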

This paper has not been read by Pith yet.


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Overcoming Selection Bias in Statistical Studies With Amortized Bayesian Inference

    stat.ML · 2026-04 · unverdicted · novelty 6.0

    Embedding selection mechanisms into generative simulators enables amortized Bayesian inference to produce debiased, well-calibrated posteriors without tractable likelihoods.