pith. machine review for the scientific record.

Building Normalizing Flows with Stochastic Interpolants

36 Pith papers cite this work. Polarity classification is still indexing.
abstract

A generative model based on a continuous-time normalizing flow between any pair of base and target probability densities is proposed. The velocity field of this flow is inferred from the probability current of a time-dependent density that interpolates between the base and the target in finite time. Unlike conventional normalizing flow inference methods based on the maximum likelihood principle, which require costly backpropagation through ODE solvers, our interpolant approach leads to a simple quadratic loss for the velocity itself, expressed in terms of expectations that are readily amenable to empirical estimation. The flow can be used to generate samples from either the base or the target, and to estimate the likelihood at any time along the interpolant. In addition, the flow can be optimized to minimize the path length of the interpolant density, thereby paving the way for building optimal transport maps. In situations where the base is a Gaussian density, we also show that the velocity of our normalizing flow can be used to construct a diffusion model to sample the target as well as estimate its score. However, our approach shows that we can bypass this diffusion completely and work at the level of the probability flow with greater simplicity, opening an avenue for methods based solely on ordinary differential equations as an alternative to those based on stochastic differential equations. Benchmarking on density estimation tasks illustrates that the learned flow can match and surpass conventional continuous flows at a fraction of the cost, and compares well with diffusions on image generation on CIFAR-10 and ImageNet 32×32. The method scales ab-initio ODE flows to previously unreachable image resolutions, demonstrated up to 128×128.
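As a minimal illustration of the quadratic loss the abstract describes (not the paper's code), the linear interpolant x_t = (1−t)·x0 + t·x1 has time derivative x1 − x0, and the velocity loss E‖b(t, x_t) − ∂_t x_t‖² can be estimated empirically from samples. The Gaussian base and shifted-Gaussian target below, and the constant candidate velocity `b_const`, are hypothetical stand-ins for a trained network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from a Gaussian base and a (stand-in) target density.
x0 = rng.standard_normal((4096, 2))        # base: N(0, I)
x1 = rng.standard_normal((4096, 2)) + 3.0  # target: N(3, I), hypothetical

def interpolant(t, x0, x1):
    # Linear stochastic interpolant x_t = (1 - t) x0 + t x1.
    return (1.0 - t) * x0 + t * x1

def quadratic_loss(b, t, x0, x1):
    # Empirical estimate of E |b(t, x_t) - dx_t/dt|^2;
    # for the linear interpolant, dx_t/dt = x1 - x0.
    xt = interpolant(t, x0, x1)
    return np.mean(np.sum((b(t, xt) - (x1 - x0)) ** 2, axis=-1))

# Hypothetical candidate velocity: constant drift toward the target mean.
b_const = lambda t, x: np.full_like(x, 3.0)

loss = quadratic_loss(b_const, 0.5, x0, x1)
print(loss)
```

In practice `b` is a neural network and the loss is averaged over t ∼ U[0, 1] as well; no backpropagation through an ODE solver is needed, which is the point of the interpolant construction.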


co-cited works

roles: method (1)

polarities: use method (1)

representative citing papers

Generative Modeling with Flux Matching

cs.LG · 2026-05-08 · unverdicted · novelty 8.0

Flux Matching generalizes score-based generative modeling by using a weaker objective that admits infinitely many non-conservative vector fields leaving the data distribution stationary, enabling new design choices beyond traditional score matching.

ReConText3D: Replay-based Continual Text-to-3D Generation

cs.CV · 2026-04-15 · conditional · novelty 8.0

ReConText3D is the first replay-memory framework for continual text-to-3D generation that prevents catastrophic forgetting on new textual categories while preserving quality on previously seen classes.

Sampling from Flow Language Models via Marginal-Conditioned Bridges

cs.LG · 2026-05-13 · unverdicted · novelty 7.0

Marginal-conditioned bridges enable training-free sampling from Flow Language Models by drawing clean one-hot endpoints from factorized posteriors and using Ornstein-Uhlenbeck bridges, preserving token marginals and reducing denoising error versus conditional-mean bridges.

One-Step Generative Modeling via Wasserstein Gradient Flows

cs.LG · 2026-05-12 · conditional · novelty 7.0

W-Flow achieves state-of-the-art one-step ImageNet 256x256 generation at 1.29 FID by training a static neural network to follow a Wasserstein gradient flow that minimizes Sinkhorn divergence, delivering roughly 100x faster sampling than comparable multi-step models.

Flow Matching on Symmetric Spaces

cs.LG · 2026-05-05 · unverdicted · novelty 7.0

A general framework reduces flow matching on symmetric spaces to flow matching on a Lie algebra subspace, linearizing geodesics.

Discrete Flow Matching: Convergence Guarantees Under Minimal Assumptions

cs.LG · 2026-05-09 · unverdicted · novelty 6.0

Discrete flow matching on Z_m^d achieves non-asymptotic KL bounds for early-stopped targets and explicit TV convergence to the true target under an approximation error assumption, with improved scaling in dimension d and vocabulary size m.

Conservative Flows: A New Paradigm of Generative Models

cs.LG · 2026-05-07 · unverdicted · novelty 6.0

Conservative flows generate by running probability-preserving stochastic dynamics initialized at data points rather than noise, using corrected Langevin or predictor-corrector mechanisms on top of any pretrained flow model and showing gains on Swiss-roll, ImageNet-256 and Oxford Flowers-102.

SDFlow: Similarity-Driven Flow Matching for Time Series Generation

cs.AI · 2026-05-07 · unverdicted · novelty 6.0 · 2 refs

SDFlow learns a global transport map via similarity-driven flow matching in VQ latent space, using low-rank manifold decomposition and a categorical posterior to handle discreteness, yielding SOTA long-horizon performance and inference speedups.

Free Energy Surface Sampling via Reduced Flow Matching

cs.LG · 2026-05-01 · unverdicted · novelty 6.0

FES-FM applies reduced flow matching with a Hessian-derived prior to directly sample free energy surfaces in collective variable space, claiming lower computational cost and higher accuracy per unit time than standard methods.

Quantum Dynamics via Score Matching on Bohmian Trajectories

quant-ph · 2026-04-28 · unverdicted · novelty 6.0

Neural networks learn the score of the probability density on Bohmian trajectories to recover exact Schrödinger dynamics via self-consistent minimization for nodeless wave functions, demonstrated on double-well splitting and Morse chain vibrations.

Fisher Decorator: Refining Flow Policy via a Local Transport Map

cs.LG · 2026-04-20 · unverdicted · novelty 6.0

Fisher Decorator refines flow policies in offline RL via a local transport map and Fisher-matrix quadratic approximation of the KL constraint, yielding controllable error near the optimum and SOTA benchmark results.
