TensorFlow Distributions
6 Pith papers cite this work.
abstract
The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation. Building on two basic abstractions, it offers flexible building blocks for probabilistic computation. Distributions provide fast, numerically stable methods for generating samples and computing statistics, e.g., log density. Bijectors provide composable volume-tracking transformations with automatic caching. Together these enable modular construction of high dimensional distributions and transformations not possible with previous libraries (e.g., pixelCNNs, autoregressive flows, and reversible residual networks). They are the workhorse behind deep probabilistic programming systems like Edward and empower fast black-box inference in probabilistic models built on deep-network components. TensorFlow Distributions has proven an important part of the TensorFlow toolkit within Google and in the broader deep learning community.
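As a rough illustration of the two abstractions, here is a minimal sketch using TensorFlow Probability, the present-day packaging of this library. The Normal, Shift, Scale, Chain, and TransformedDistribution names come from the current tfp API, which postdates the interface described in the paper, so treat the exact spellings as assumptions rather than the paper's own code.

```python
import tensorflow_probability as tfp

tfd = tfp.distributions
tfb = tfp.bijectors

# Distribution: fast sampling plus numerically stable statistics (log_prob, etc.).
base = tfd.Normal(loc=0.0, scale=1.0)

# Bijectors compose; Chain applies right-to-left, so y = 2*x + 1 here.
bijector = tfb.Chain([tfb.Shift(1.0), tfb.Scale(2.0)])

# TransformedDistribution tracks the change of volume automatically via the
# bijector's inverse log-det-Jacobian; bijector caching lets log_prob of a
# freshly drawn sample reuse the cached pre-image instead of inverting.
dist = tfd.TransformedDistribution(distribution=base, bijector=bijector)

y = dist.sample(3)      # draw from base, push forward through the bijector
lp = dist.log_prob(y)   # pull back through the inverse, correct the density
```

Stacking many such bijectors over structured base distributions is how the higher-dimensional constructions the abstract mentions (e.g., autoregressive flows) are assembled from the same two building blocks.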
representative citing papers
- NumPyro delivers a JIT-compilable iterative NUTS sampler by composing Pyro effect handlers with JAX transformations, achieving faster performance than prior implementations.
- QumVQD enables excited-state quantum chemistry calculations on bosonic qumode hardware by enforcing particle-number symmetry and using Hamiltonian fragmentation, achieving chemical accuracy on H2 and spectroscopic accuracy on vibrational modes with far fewer entangling gates than qubit equivalents.
- Dreamer learns to control from images by imagining and optimizing behaviors in a learned latent world model, outperforming prior methods on 20 visual tasks in data efficiency and final performance.
- RG-inspired lattice models for piecewise GLMs provide explicit, interpretable partitions and a replica-analysis-derived scaling law for regularization that allows increasing complexity without an expected rise in generalization loss.
- TFMPE combines likelihood factorisation with tokenised flow matching to enable efficient hierarchical SBI from single-site simulations, producing well-calibrated posteriors at lower computational cost on a new benchmark and real models.
- MissBGM jointly models data generation and missingness in a Bayesian neural generative framework to produce consistent imputations with principled posterior uncertainty.
citing papers explorer
- Composable Effects for Flexible and Accelerated Probabilistic Programming in NumPyro
  NumPyro delivers a JIT-compilable iterative NUTS sampler by composing Pyro effect handlers with JAX transformations, achieving faster performance than prior implementations (a minimal sketch of the effect-handler mechanism follows this list).
- Excited-State Quantum Chemistry on Qumode-Based Processors via Variational Quantum Deflation
  QumVQD enables excited-state quantum chemistry calculations on bosonic qumode hardware by enforcing particle-number symmetry and using Hamiltonian fragmentation, achieving chemical accuracy on H2 and spectroscopic accuracy on vibrational modes with far fewer entangling gates than qubit equivalents.
- Dream to Control: Learning Behaviors by Latent Imagination
  Dreamer learns to control from images by imagining and optimizing behaviors in a learned latent world model, outperforming prior methods on 20 visual tasks in data efficiency and final performance.
- A renormalization-group inspired lattice-based framework for piecewise generalized linear models
  RG-inspired lattice models for piecewise GLMs provide explicit, interpretable partitions and a replica-analysis-derived scaling law for regularization that allows increasing complexity without an expected rise in generalization loss.
- Tokenised Flow Matching for Hierarchical Simulation Based Inference
  TFMPE combines likelihood factorisation with tokenised flow matching to enable efficient hierarchical SBI from single-site simulations, producing well-calibrated posteriors at lower computational cost on a new benchmark and real models.
- Missingness-aware Data Imputation via AI-powered Bayesian Generative Modeling
  MissBGM jointly models data generation and missingness in a Bayesian neural generative framework to produce consistent imputations with principled posterior uncertainty.
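To make the NumPyro entry above concrete, here is a minimal sketch of the effect-handler mechanism it refers to. The seed and trace handlers are part of the public numpyro.handlers API; the toy model itself is invented purely for illustration and is not from the cited paper.

```python
import numpyro
import numpyro.distributions as dist
from numpyro import handlers

def model():
    # Two sample sites: a latent variable x and a dependent draw y.
    x = numpyro.sample("x", dist.Normal(0.0, 1.0))
    return numpyro.sample("y", dist.Normal(x, 1.0))

# Effect handlers reinterpret the sample statements without editing the model:
# seed threads a JAX PRNG key through it, and trace records every site's
# name, distribution, and drawn value as the model runs.
seeded = handlers.seed(model, rng_seed=0)
exec_trace = handlers.trace(seeded).get_trace()

for name, site in exec_trace.items():
    print(name, site["value"])
```

Because the handlers wrap pure JAX operations, the density computations they produce remain compatible with JAX transformations such as jax.jit, which is the composition the paper's summary describes.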