Diffusive Nested Sampling
Original abstract
We introduce a general Monte Carlo method based on Nested Sampling (NS), for sampling complex probability distributions and estimating the normalising constant. The method uses one or more particles, which explore a mixture of nested probability distributions, each successive distribution occupying ~e^-1 times the enclosed prior mass of the previous distribution. While NS technically requires independent generation of particles, Markov Chain Monte Carlo (MCMC) exploration fits naturally into this technique. We illustrate the new method on a test problem and find that it can achieve four times the accuracy of classic MCMC-based Nested Sampling, for the same computational effort; equivalent to a factor of 16 speedup. An additional benefit is that more samples and a more accurate evidence value can be obtained simply by continuing the run for longer, as in standard MCMC.
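The abstract's level-construction rule (each successive distribution encloses ~e^-1 of the previous level's prior mass, explored with constrained MCMC) can be sketched in a few lines. The following is a minimal, illustrative toy, not the paper's algorithm: it builds levels sequentially for a 1D standard-normal likelihood under a uniform prior on [-5, 5] (so the true evidence is ~0.1), whereas the full Diffusive Nested Sampling method has particles diffusing through a weighted mixture of all levels. The function names, the per-level step-size schedule, and all tuning constants are invented for this sketch.

```python
import math
import random

random.seed(42)

LO, HI = -5.0, 5.0  # uniform prior support for the toy problem (assumption)

def loglike(x):
    # Standard-normal log-likelihood; with the uniform prior above,
    # the analytic evidence is ~0.1 on this support.
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

def explore(x, threshold, steps, scale):
    # Metropolis walk w.r.t. the (uniform) prior, constrained to
    # loglike > threshold -- the "MCMC exploration" the abstract mentions.
    for _ in range(steps):
        y = x + random.gauss(0.0, scale)
        if LO <= y <= HI and loglike(y) > threshold:
            x = y
    return x

# Level construction: each new threshold is the (1 - e^-1) quantile of
# likelihoods sampled above the current one, so each level encloses
# ~e^-1 of the previous level's prior mass.
n_levels, per_level = 10, 2000
thresholds = [-math.inf]
level_logls = []
x = random.uniform(LO, HI)
for j in range(n_levels):
    scale = (HI - LO) * math.exp(-j)            # crude per-level step tuning (assumption)
    x = explore(x, thresholds[-1], 200, scale)  # burn-in at the new level
    logls = []
    for _ in range(per_level):
        x = explore(x, thresholds[-1], 20, scale)
        logls.append(loglike(x))
    logls.sort()
    thresholds.append(logls[int((1.0 - math.exp(-1.0)) * per_level)])
    level_logls.append(logls)

# Evidence: with X_j ~ e^-j the prior mass enclosed by level j,
# Z ~ sum_j (X_j - X_{j+1}) * <L over the shell between levels j and j+1>.
Z = 0.0
for j, logls in enumerate(level_logls):
    shell = [math.exp(l) for l in logls if l < thresholds[j + 1]]
    if shell:
        Z += (math.exp(-j) - math.exp(-(j + 1))) * sum(shell) / len(shell)

print(f"estimated Z = {Z:.4f}  (analytic value is ~0.1)")
```

On this toy problem the estimate lands near the analytic evidence; the paper's reported accuracy gains come from the mixture-of-levels exploration, which this sequential sketch omits.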
Forward citations
Cited by 3 Pith papers
- emcee: The MCMC Hammer
  emcee delivers a stable Python implementation of the affine-invariant ensemble MCMC algorithm that requires minimal hand-tuning and supports easy parallelization.
- dynesty: A Dynamic Nested Sampling Package for Estimating Bayesian Posteriors and Evidences
  dynesty is an open-source Python package for dynamic nested sampling that improves efficiency in Bayesian posterior and evidence estimation compared to MCMC on certain problems.
- NCCR PlanetS: Observational and computational characterization of exoplanet atmospheres
  The paper reviews physical processes, modeling techniques, retrieval methods, and observational strategies for characterizing exoplanet atmospheres, emphasizing Swiss research progress.