pith. machine review for the scientific record.

arXiv: 2508.06614 · v2 · submitted 2025-08-08 · 💻 cs.LG · cond-mat.stat-mech · quant-ph

Recognition: unknown

Local Diffusion Models and Phases of Data Distributions

Authors on Pith: no claims yet
classification 💻 cs.LG · cond-mat.stat-mech · quant-ph
keywords data · local · phase · diffusion · distributions · models · denoisers · networks
original abstract

As a class of generative artificial intelligence frameworks inspired by statistical physics, diffusion models have shown extraordinary performance in synthesizing complicated data distributions through a denoising process gradually guided by score functions. Real-life data, like images, is often spatially structured in low-dimensional spaces. However, ordinary diffusion models ignore this local structure and learn spatially global score functions, which are often computationally expensive. In this work, motivated by recent advances in non-equilibrium statistical physics, we develop a generic framework for defining phases of data distributions and use it to analyze the locality requirements of denoisers in diffusion models. We define two distributions as belonging to the same data distribution phase if they can be mutually connected via spatially local operations such as local denoisers, along the same evolution path as the diffusion. We demonstrate that the reverse denoising process consists of an early trivial phase and a late data phase, sandwiching a rapid phase transition where local denoisers must fail. We further demonstrate that the performance of local denoisers is closely tied to spatial Markovianity, which provides an operational criterion for diagnosing such phase transitions. We validate this criterion through numerical experiments on real-world datasets. Our work suggests guidance for simpler and more efficient architectures of diffusion models: far from the phase transition point, we can use small local neural networks to compute the score function; global neural networks are only necessary around the narrow time interval of phase transitions. This result also opens up new directions for studying phases of data distributions, the broader science of generative artificial intelligence, and guiding the design of neural networks inspired by physics concepts.
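The abstract's central notion — that a spatially local denoiser can only see a bounded neighborhood of each site — can be illustrated with a toy sketch. The moving-average kernel and the `radius` parameter below are illustrative assumptions, not the paper's construction; the point is only the receptive-field property that the paper's phase criterion relies on: perturbing the input far outside a site's neighborhood cannot change the local output.

```python
import numpy as np

def local_denoise(x, radius=2):
    """Toy spatially local denoiser: each output entry depends only on
    input sites within `radius`. A simple moving average stands in for
    a learned local score network (an illustrative assumption)."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    # Edge-pad so the "same"-size convolution keeps the signal length fixed.
    padded = np.pad(x, radius, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

rng = np.random.default_rng(0)
x = rng.normal(size=64)
y = local_denoise(x, radius=2)

# Locality check: perturbing site 40 is outside the receptive field of
# output site 10 (which sees only sites 8..12), so y[10] is unchanged,
# while y[40] itself does change.
x_perturbed = x.copy()
x_perturbed[40] += 100.0
y_perturbed = local_denoise(x_perturbed, radius=2)
print(np.allclose(y[10], y_perturbed[10]))  # → True
print(np.allclose(y[40], y_perturbed[40]))  # → False
```

A global score network has no such constraint: every output site may depend on the whole input. The paper's claim is that this extra expressivity is only needed in a narrow time window around the phase transition.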

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 3 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Concurrence of Symmetry Breaking and Nonlocality Phase Transitions in Diffusion Models

cs.LG · 2026-05 · unverdicted · novelty 7.0

    Symmetry breaking and nonlocality phase transitions occur nearly simultaneously during diffusion model generation in modern transformers.

  2. Learning and Generating Mixed States Prepared by Shallow Channel Circuits

quant-ph · 2026-04 · unverdicted · novelty 7.0

    Any mixed state in the trivial phase can be efficiently learned and approximately generated by a shallow local channel circuit from polynomial measurements, without access to the original circuit.

  3. Learning and Generating Mixed States Prepared by Shallow Channel Circuits

quant-ph · 2026-04 · unverdicted · novelty 6.0

    Mixed states in the trivial phase can be approximately generated by a learned shallow local channel circuit from measurement copies alone, with polynomial sample and runtime complexity.