pith. machine review for the scientific record.

arxiv: 2604.00843 · v2 · submitted 2026-04-01 · 🧮 math.AP · cs.NA · math.NA · math.PR · math.ST · stat.TH

Recognition: no theorem link

Sharp local sparsity of regularized optimal transport

Authors on Pith: no claims yet

Pith reviewed 2026-05-13 22:16 UTC · model grok-4.3

classification 🧮 math.AP · cs.NA · math.NA · math.PR · math.ST · stat.TH
keywords optimal transport · entropy regularization · support sparsity · conditional measures · strong convexity · convergence rates · multivariate transport

The pith

The support of conditional measures in entropy-regularized optimal transport shrinks locally like balls of radius ε^{1/(d(p-1)+2)}.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This paper establishes a sharp local rate for how the support of the regularized optimal transport plan concentrates as the regularization parameter ε tends to zero. It shows that, away from boundaries, the conditional measures are supported on sets that behave like balls whose radius scales precisely as ε^{1/(d(p-1)+2)}. This local sparsity result then implies that the associated potentials are uniformly strongly convex, and it provides an explicit rate at which these potentials approach their unregularized counterparts. The findings extend previous work, limited to one dimension or to self-transport, to the full multivariate setting.
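To give a feel for the exponent, the following sketch tabulates 1/(d(p-1)+2) for a few illustrative choices of d and p; the specific values are examples chosen here, not cases singled out by the paper.

```python
def support_exponent(d: int, p: float) -> float:
    """Predicted local support-radius exponent: radius ~ eps^(1/(d(p-1)+2))."""
    return 1.0 / (d * (p - 1) + 2)

# Larger dimension d or entropy exponent p shrinks the exponent,
# so the conditional supports contract more slowly in eps.
for d in (1, 2, 3):
    for p in (2, 3):
        print(f"d={d}, p={p}: radius ~ eps^{support_exponent(d, p):.4f}")
```

For the quadratic entropy p = 2 in dimension d = 1 this gives the exponent 1/3 familiar from the one-dimensional results the paper generalizes.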

Core claim

We prove that the supports supp(π_ε(· | x)) of the conditional measures π_ε(· | x) behave like balls of radius ε^{1/(d(p-1)+2)}. This local sparsity allows us to establish uniform strong convexity of the regularized potentials and to derive their rate of convergence to the unregularized limit.

What carries the argument

The conditional support radius scaling as ε^{1/(d(p-1)+2)}, which quantifies the local sparsity of the regularized coupling.
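One way to see where the exponent comes from is a heuristic mass-entropy-cost balance (a back-of-the-envelope sketch under simplifying assumptions, not the paper's argument): a conditional density of height h on a ball of radius r carries unit mass, so h r^d ≈ 1 and hence h ≈ r^{-d}; the L^p entropy term and the quadratic cost deviation then balance as follows.

```latex
% Heuristic balance (sketch, not the paper's proof): with h \sim r^{-d},
\[
  \underbrace{\varepsilon\, r^{d} h^{p}}_{L^p\text{ entropy}} \sim \varepsilon\, r^{-d(p-1)},
  \qquad
  \underbrace{c\, r^{2}}_{\text{cost deviation}},
  \qquad
  \varepsilon\, r^{-d(p-1)} \sim r^{2}
  \;\Longrightarrow\;
  r \sim \varepsilon^{\frac{1}{d(p-1)+2}} .
\]
```

This is the same volume-times-height^p versus r^2 balance the referee report describes; the paper's contribution is making it rigorous with matching upper and lower bounds.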

If this is right

  • The regularized potentials become uniformly strongly convex for small ε.
  • The convergence rate of the potentials to their unregularized versions can be quantified explicitly.
  • The results apply to multivariate cases and general costs beyond self-transport.
  • The support of the full coupling π_ε shrinks to that of the unregularized plan at this local rate.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • These support estimates could inform the design of numerical schemes that adaptively refine grids based on the predicted sparsity scale.
  • Similar scaling might appear in other regularized variational problems with entropic penalties.
  • Extending the analysis to include boundary effects would require new techniques for handling the interaction with domain edges.

Load-bearing premise

The underlying measures and cost function must satisfy the conditions that make the unregularized optimal transport problem well-posed, at least locally away from boundaries.

What would settle it

Numerical computation of the support radius of conditional measures for decreasing values of ε in a simple multivariate transport problem should match the predicted exponent 1/(d(p-1)+2).
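The check described above amounts to a log-log fit. Computing supp(π_ε(·|x)) requires a regularized OT solver, which is out of scope here; the sketch below instead substitutes synthetic radii that follow the predicted law (an assumption, clearly marked) purely to show how the exponent would be read off from measured radii.

```python
import numpy as np

# Quadratic entropy (p = 2) in dimension d = 2: predicted exponent 1/4.
d, p = 2, 2
predicted = 1.0 / (d * (p - 1) + 2)

eps = np.logspace(-6, -2, 20)
# Synthetic stand-in for measured support radii r(eps); in a real test
# these would come from a regularized OT solver, not from the predicted law.
radii = 0.7 * eps**predicted

# Estimate the exponent as the slope of log r(eps) against log eps.
slope, _ = np.polyfit(np.log(eps), np.log(radii), 1)
print(f"predicted exponent {predicted:.3f}, fitted {slope:.3f}")
```

A fitted slope matching 1/(d(p-1)+2) across decreasing ε would corroborate the theorem; systematic deviation near the boundary would delimit its local character.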

read the original abstract

In recent years, the use of entropy-regularized optimal transport with $L^p$-type entropies has become increasingly popular. In this setting, the solutions are sparse, in the sense that the support of the regularized optimal coupling, $\mathrm{supp}(\pi_\varepsilon)$, shrinks to the support of the original optimal transport problem as $\varepsilon \to 0$. The main open question concerns the rate of this convergence. In this paper, we obtain sharp local results away from the boundary. We prove that the supports $\mathrm{supp}(\pi_\varepsilon(\cdot \mid x))$ of the conditional measures, $\pi_\varepsilon(\cdot \mid x)$, behave like balls of radius $\varepsilon^\frac 1 {d(p-1)+2}$. This allows us to show that the regularized potentials are uniformly strongly convex and to derive the rate of convergence of these potentials toward their unregularized limit. Our results generalize the results of (Gonz\'alez-Sanz and Nutz, SIAM J.~Math.~Anal.) and (Wiesel and Xu, Ibid.) to the multivariate case and beyond the case of self-transport.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper proves sharp local sparsity results for entropy-regularized optimal transport with L^p entropies. Away from the boundary, the conditional supports supp(π_ε(·|x)) are shown to be asymptotically equivalent to balls of radius ε^{1/(d(p-1)+2)}. This scaling is used to establish uniform strong convexity of the regularized potentials and to obtain convergence rates of these potentials to their unregularized counterparts. The results extend prior work of González-Sanz-Nutz and Wiesel-Xu to the multivariate setting and to non-self-transport problems.

Significance. If the local scaling holds with uniform constants, the work supplies the first explicit quantitative description of sparsity rates in the multivariate regularized OT setting. The explicit exponent and the passage from local support control to uniform strong convexity and potential convergence rates constitute a clear technical advance over the cited one-dimensional and self-transport results.

major comments (2)
  1. [§3] §3 (main theorem on support radius): the lower and upper bounds on r_ε(x) are obtained by balancing the L^p entropy contribution (volume times height^p) against the quadratic cost deviation (r^2). The argument invokes a positive lower bound on the densities of both marginals and a uniform lower bound on the Hessian of the cost in a neighborhood of (x,y). No quantitative control is given showing that these constants remain bounded away from zero uniformly for x ranging over compact subsets of the interior of supp(μ). Without such uniformity the claimed radius scaling cannot be inserted into the strong-convexity argument of §4.
  2. [§4] §4, proof of uniform strong convexity: the passage from the local ball-radius estimate to a uniform lower bound on the Hessian of the regularized potential requires the implicit constants in the radius to be independent of x. The manuscript only states the local result and does not supply a covering argument or modulus of continuity that would guarantee uniformity on compact interior sets.
minor comments (2)
  1. [Abstract] The abstract cites “González-Sanz and Nutz, SIAM J. Math. Anal.” and “Wiesel and Xu, Ibid.”; the second reference should be expanded to the full journal name and year for clarity.
  2. [§2] Notation for the conditional measures π_ε(·|x) is introduced without an explicit definition of the disintegration; a short sentence recalling the disintegration theorem would help readers unfamiliar with the measure-theoretic setting.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the careful reading and constructive comments on the local sparsity results and their application to strong convexity. We address each major comment below and will revise the manuscript accordingly to incorporate the requested uniformity arguments.

read point-by-point responses
  1. Referee: [§3] §3 (main theorem on support radius): the lower and upper bounds on r_ε(x) are obtained by balancing the L^p entropy contribution (volume times height^p) against the quadratic cost deviation (r^2). The argument invokes a positive lower bound on the densities of both marginals and a uniform lower bound on the Hessian of the cost in a neighborhood of (x,y). No quantitative control is given showing that these constants remain bounded away from zero uniformly for x ranging over compact subsets of the interior of supp(μ). Without such uniformity the claimed radius scaling cannot be inserted into the strong-convexity argument of §4.

    Authors: We agree that uniformity of the implicit constants must be established explicitly for the radius scaling to be usable on compact interior sets. In the current proof of Theorem 3.1, the lower bounds on marginal densities and on the minimal eigenvalue of the cost Hessian are taken positive in a neighborhood of each fixed (x,y). Since the interior of supp(μ) is open and the densities are assumed continuous and strictly positive there, any compact K ⊂ int(supp(μ)) admits a uniform positive lower bound on the densities by compactness. The same holds for the Hessian lower bound by uniform continuity of the Hessian on the compact set of pairs (x,y) with x ∈ K and y in a fixed neighborhood of the support. We will add a short lemma (or remark) after the statement of Theorem 3.1 that records this uniformity via compactness, together with the explicit dependence of the constants on the compact set K. This makes the radius r_ε(x) ∼ ε^{1/(d(p-1)+2)} uniform on K and directly usable in §4. revision: yes

  2. Referee: [§4] §4, proof of uniform strong convexity: the passage from the local ball-radius estimate to a uniform lower bound on the Hessian of the regularized potential requires the implicit constants in the radius to be independent of x. The manuscript only states the local result and does not supply a covering argument or modulus of continuity that would guarantee uniformity on compact interior sets.

    Authors: We concur that the transition from the local radius control to a uniform Hessian lower bound for the regularized potential requires an explicit uniformity argument. With the uniform radius estimate now available on any compact K ⊂ int(supp(μ)) (as added in the revision to §3), the strong-convexity proof in §4 proceeds by integrating the local quadratic lower bound over the conditional support ball of uniform radius. To make this rigorous we will insert a brief covering argument: cover K by finitely many small balls on which the local estimates hold with the same constants, then patch the local Hessian lower bounds together. The resulting uniform strong convexity constant depends only on K, the entropy exponent p, and the cost, as claimed. We will expand the relevant paragraph in §4 to include this step. revision: yes

Circularity Check

0 steps flagged

No circularity; direct analytic estimates on regularized OT

full rationale

The paper derives the support radius scaling by balancing the local L^p entropy contribution (volume times height^p) against quadratic cost deviation (scaling with r^2) under local interior density and cost convexity assumptions. This is a self-contained analytic argument with no fitted parameters renamed as predictions, no self-definitional loops, and no load-bearing self-citations that reduce the central claim to prior unverified inputs. Generalization of González-Sanz-Nutz and Wiesel-Xu is cited for context but the multivariate extension proceeds independently via the paper's own estimates. No step reduces by construction to its own inputs.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

The central claim rests on standard properties of entropy-regularized optimal transport and local analysis of the associated Euler-Lagrange equations; no new free parameters or invented entities are introduced.

axioms (1)
  • [standard math] Standard existence, uniqueness, and regularity properties of entropy-regularized optimal transport plans for L^p entropies.
    Invoked throughout to guarantee the conditional measures are well-defined and the support analysis applies.

pith-pipeline@v0.9.0 · 5517 in / 1145 out tokens · 50433 ms · 2026-05-13T22:16:11.916357+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

5 extracted references · 5 canonical work pages

  • [BEZ25] Erhan Bayraktar, Stephan Eckstein, and Xin Zhang, Stability and sample complexity of divergence regularized optimal transport, Bernoulli 31 (2025), no. 1, 213–239.
  • [Caf92] Luis A. Caffarelli, The regularity of mappings with a convex potential, J. Amer. Math. Soc. 5 (1992), no. 1, 99–104.
  • [CLW21] Shibing Chen, Jiakun Liu, and Xu-Jia Wang, Global regularit…
  • [EN24] Stephan Eckstein and Marcel Nutz, Convergence rates for regularized optimal transport via quantization, Math. Oper. Res. 49 (2024), no. 2, 1223–1240. MR 4755769
  • [Eva98] Lawrence C. Evans, Partial differential equations, Graduate Studies in Mathematics, vol. 19, American Mathematical Society, Providence, RI, 1998.
  • [GK26] Rishabh S. Gvalani and Lukas Koch, Sparsity and uniform regularity for regularised optimal transport, arXiv preprint (2026).
  • [GSdBN25] Alberto González-Sanz, Eustasio del Barrio, and Marcel Nutz, Sample complexity of quadratically regularized optimal transport, arXiv:2511.09807 (2025).
  • [GSEN25] Alberto González-Sanz, Stephan Ec…
  • [Nut25] Marcel Nutz, Quadratically regularized optimal transport: existence and multiplicity of potentials, SIAM J. Math. Anal. 57 (2025), no. 3, 2622–2649. MR 4907548
  • [Roc70] R. Tyrrell Rockafellar, Convex analysis, Princeton Mathematical Series, vol. 28, Princeton University Press, Princeton, NJ, 1970.
  • [WX25] Johannes Wiesel and Xingyu Xu, Sparsity of quadratically regularized optimal transport: bounds on concentration and bias, SIAM J. Math. Anal. 57 (2025), no. 6, 6498–6521. MR 4986738
  • [ZMMS23] Stephen Zhang, Gilles Mordant, Tetsuya Matsumoto, and Geoffrey Schiebinger, Manifold learning with sparse regularised optimal transport, arXiv:2307.09…