pith. machine review for the scientific record.

arxiv: 2605.03588 · v1 · submitted 2026-05-05 · 💻 cs.LG · cs.AI

Recognition: unknown

Flow Matching on Symmetric Spaces

Authors on Pith: no claims yet

Pith reviewed 2026-05-07 17:06 UTC · model grok-4.3

classification 💻 cs.LG cs.AI
keywords flow matching · Riemannian symmetric spaces · Lie algebra · Grassmannians · generative models · manifold learning · Riemannian geometry

The pith

Flow matching on Riemannian symmetric spaces can be reformulated as linear flow matching on a subspace of the Lie algebra.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces a general framework for training flow matching models on Riemannian symmetric spaces such as the sphere, hyperbolic space, and Grassmannians. It exploits the algebraic structure of these manifolds to rewrite the flow matching objective as an equivalent problem on a linear subspace of the Lie algebra of the isometry group. This reformulation linearizes the task and simplifies geodesic computations while preserving the original geometry and probability measure. A reader would care because it makes generative modeling practical on curved spaces that are otherwise difficult to handle directly.
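To make the "linearized" target concrete: once the problem lives on a linear subspace, training reduces to standard Euclidean conditional flow matching, where the conditional path is a straight line and the target velocity is constant. A minimal numpy sketch of that Euclidean objective (variable names ours; a least-squares fit stands in for the neural vector field):

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_pair(x0, x1, t):
    """Linear conditional path x_t = (1 - t) x0 + t x1 and its
    constant target velocity u_t = x1 - x0 (standard Euclidean CFM)."""
    xt = (1.0 - t) * x0 + t * x1
    ut = x1 - x0
    return xt, ut

# Toy regression: fit a linear model v(x, t) ~ W @ [x, t, 1] to the
# CFM target by least squares, as a stand-in for a trained network.
d = 2
x0 = rng.normal(size=(512, d))          # source (noise) samples
x1 = rng.normal(size=(512, d)) + 3.0    # "data" samples: shifted Gaussian
t = rng.uniform(size=(512, 1))
xt, ut = cfm_pair(x0, x1, t)
feats = np.hstack([xt, t, np.ones((512, 1))])
W, *_ = np.linalg.lstsq(feats, ut, rcond=None)
mse = np.mean((feats @ W - ut) ** 2)
```

The paper's contribution is the claim that flow matching on a symmetric space can be transported into exactly this form on a subspace of the Lie algebra, with geodesics handled by the exponential map rather than a manifold-specific solver.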

Core claim

We introduce a general framework for training flow matching models on Riemannian symmetric spaces, a large class of manifolds that includes the sphere, hyperbolic space and Grassmannians. We exploit their algebraic structure to reformulate flow matching on symmetric spaces as flow matching on a subspace of the Lie algebra of their isometry group, thus linearizing the problem and greatly simplifying the handling of geodesics. As an application, we showcase our framework on the real Grassmannians SO(n) / SO(k) × SO(n-k).

What carries the argument

Reformulation of flow matching on symmetric spaces as flow matching on a subspace of the Lie algebra of their isometry group

Load-bearing premise

The algebraic structure of Riemannian symmetric spaces permits an exact reformulation of the flow-matching objective as an equivalent problem on a linear subspace of the Lie algebra without loss of the original manifold geometry or probability measure.

What would settle it

An experiment showing that samples or geodesics produced by the Lie-algebra subspace model fail to match the target distribution or paths on the original symmetric space.
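One concrete shape such a falsification test could take is a two-sample statistic comparing model samples against target samples in the ambient embedding. The statistic below (the energy distance) is our choice for illustration, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

def energy_distance(x, y):
    """Energy distance between two samples (0 iff same distribution,
    in the population limit): 2 E||X-Y|| - E||X-X'|| - E||Y-Y'||."""
    def mean_pdist(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return d.mean()
    return 2 * mean_pdist(x, y) - mean_pdist(x, x) - mean_pdist(y, y)

# Matched distributions score near zero; a shifted one scores clearly higher.
x = rng.normal(size=(400, 3))
y = rng.normal(size=(400, 3))
z = rng.normal(size=(400, 3)) + 2.0
assert energy_distance(x, y) < energy_distance(x, z)
```

A large value of such a statistic between Lie-algebra-model samples (mapped back to the manifold) and target samples would be the kind of evidence that settles the claim negatively.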

Figures

Figures reproduced from arXiv: 2605.03588 by Ferdinando Zanchetta, Francesco Ruscelli, Rita Fioresi.

Figure 1: Local and global geodesic symmetries on a Riemannian symmetric space. view at source ↗
Figure 2: Flow matching trajectories on S^2 = SO(3)/SO(2), stereographically projected to R^2. view at source ↗
Original abstract

We introduce a general framework for training flow matching models on Riemannian symmetric spaces, a large class of manifolds that includes the sphere, hyperbolic space and Grassmannians. We exploit their algebraic structure to reformulate flow matching on symmetric spaces as flow matching on a subspace of the Lie algebra of their isometry group, thus linearizing the problem and greatly simplifying the handling of geodesics. As an application, we showcase our framework on the real Grassmannians $\operatorname{SO}(n) / \operatorname{SO}(k) \times \operatorname{SO}(n-k)$.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 3 minor

Summary. The paper introduces a framework for flow matching on Riemannian symmetric spaces (including spheres, hyperbolic space, and Grassmannians) by exploiting their algebraic structure—specifically the involution and Cartan decomposition g = k ⊕ p—to reformulate the problem as standard Euclidean flow matching on a linear subspace of the Lie algebra of the isometry group. This linearization simplifies geodesic handling via the exponential map. The approach is applied to real Grassmannians SO(n)/SO(k) × SO(n-k).
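The Cartan picture the summary invokes can be made concrete on the smallest example, S^2 = SO(3)/SO(2): k is the span of the generator of rotations fixing the basepoint e3, p is the span of the other two generators and is identified with the tangent plane at e3, and geodesics are exp(tK)·e3 for K in p. A hedged numpy sketch (our notation, not the paper's code), with the matrix exponential computed by Rodrigues' formula:

```python
import numpy as np

def hat(w):
    """so(3) hat map: vector w in R^3 -> 3x3 skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def expm_so3(K):
    """Rodrigues' formula for the exponential of a skew-symmetric 3x3 matrix."""
    theta = np.sqrt(0.5 * np.trace(K.T @ K))  # = norm of the axis vector
    if theta < 1e-12:
        return np.eye(3) + K
    return (np.eye(3) + (np.sin(theta) / theta) * K
            + ((1.0 - np.cos(theta)) / theta**2) * (K @ K))

# Basepoint o = e3. Its stabilizer SO(2) is generated by hat(e3) (the
# subalgebra k); p = span{hat(e1), hat(e2)} is identified with T_o S^2.
o = np.array([0.0, 0.0, 1.0])
v = np.array([0.3, -0.4, 0.0])          # tangent vector at o
K = hat(np.array([-v[1], v[0], 0.0]))   # the element of p with K @ o = v
gamma_1 = expm_so3(K) @ o               # geodesic gamma(t) = exp(tK) o at t = 1
```

The point of the reformulation is that the flow-matching regression happens in the linear coordinates of p, while exp(tK) guarantees the resulting path stays on the manifold.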

Significance. If the claimed exact reformulation holds without loss of geometry or probability measure, the work would provide a practical simplification for generative modeling on a broad class of non-Euclidean manifolds, reducing the need for manifold-specific geodesic solvers and enabling reuse of Euclidean flow-matching implementations.

minor comments (3)
  1. The abstract and framework description would benefit from an explicit statement (perhaps in §3 or §4) confirming that the transported vector fields and conditional paths remain equivalent under the isometry group action, including any necessary Jacobian or measure corrections.
  2. In the Grassmannian application, clarify how the subspace identification interacts with the quotient structure to ensure the generated samples lie on the manifold after the exponential map.
  3. Add a short discussion of computational complexity: does the Lie-algebra reformulation reduce the cost of sampling or training compared to direct Riemannian flow matching?
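On the second comment: the paper's own preprocessing uses a QR decomposition to extract an orthonormal frame, since the first k columns of Q span the same subspace as the columns of X. A hypothetical numpy illustration of that recipe (the sign fix is our addition, for continuity of the frame):

```python
import numpy as np

rng = np.random.default_rng(1)

def grassmann_frame(X):
    """Orthonormal frame for span(X) via reduced QR: the columns of Q
    span the same k-dimensional subspace as the columns of X."""
    Q, R = np.linalg.qr(X)              # Q is n x k with Q^T Q = I
    # Fix the column-sign ambiguity so the frame varies continuously in X.
    return Q * np.sign(np.diag(R))

n, k = 5, 2
X = rng.normal(size=(n, k))
Q = grassmann_frame(X)

# Q is orthonormal and represents the same point of the Grassmannian:
# the orthogonal projectors onto the two column spaces agree.
P_X = X @ np.linalg.solve(X.T @ X, X.T)
P_Q = Q @ Q.T
```

A projector-level check like `P_X == P_Q` is one way the revision could verify that generated samples land on the manifold after the exponential map.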

Simulated Author's Rebuttal

0 responses · 0 unresolved

We thank the referee for the positive summary of our work and for recommending minor revision. We are glad the referee sees value in the algebraic reformulation that reduces flow matching on symmetric spaces to standard Euclidean flow matching on a Lie algebra subspace, simplifying geodesic computations via the exponential map. Since the report raises no major objections, we respond briefly to the three minor comments: the revision will state explicitly that the transported vector fields and conditional paths are equivalent under the isometry group action, including any required measure corrections; it will clarify how the frame choice on the Grassmannian interacts with the quotient structure so that generated samples remain on the manifold after the exponential map; and it will add a short discussion of computational cost relative to direct Riemannian flow matching.

Circularity Check

0 steps flagged

No significant circularity

full rationale

The paper's central derivation uses the standard Cartan decomposition g = k ⊕ p of the Lie algebra of the isometry group and the identification of the tangent space at the basepoint with the subspace p to transport the flow-matching vector field and conditional paths from the symmetric space to an equivalent Euclidean problem on p. This is a direct algebraic reformulation that preserves the Riemannian metric and probability measure by construction via the exponential map; it does not rely on fitted parameters, self-referential definitions, or load-bearing self-citations. The Grassmannian application follows immediately from the general case without additional assumptions that close a loop back to the inputs.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

The central claim rests on standard facts from Riemannian geometry and Lie theory that are not derived inside the paper.

axioms (1)
  • domain assumption Riemannian symmetric spaces admit a transitive action by their isometry group whose Lie algebra contains a linear subspace that faithfully represents the tangent space geometry for flow purposes.
    Invoked when the authors state that the algebraic structure allows reformulation as flow matching on a subspace of the Lie algebra.

pith-pipeline@v0.9.0 · 5379 in / 1225 out tokens · 31834 ms · 2026-05-07T17:06:06.172868+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

72 extracted references · 5 canonical work pages · 1 internal anchor

  1. Martin R. Bridson and André Haefliger. Metric Spaces of Non-Positive Curvature, volume 319 of Grundlehren der mathematischen Wissenschaften. Springer, Berlin, Heidelberg, 1999.
  2. Élie Cartan. Sur une classe remarquable d'espaces de Riemann. Bulletin de la Société Mathématique de France, 54:214–264, 1926. doi:10.24033/bsmf.1105.
  3. Élie Cartan. Sur une classe remarquable d'espaces de Riemann. II. Bulletin de la Société Mathématique de France, 55:114–134, 1927. doi:10.24033/bsmf.1113.
  4. Patrick B. Eberlein. Geometry of Nonpositively Curved Manifolds. University of Chicago Press, Chicago, 1996.
  5. Sigurdur Helgason. Differential Geometry, Lie Groups, and Symmetric Spaces, volume 34 of Graduate Studies in Mathematics. American Mathematical Society, Providence, RI, 2001.
  6. Juan A. Valiente Kroon. Conformal Methods in General Relativity. Cambridge University Press, 2023.
  7. Michael S. Albergo and Eric Vanden-Eijnden. Building Normalizing Flows with Stochastic Interpolants. arXiv:2209.15571, 2022.
  8. Heli Ben-Hamu, Samuel Cohen, Joey Bose, Brandon Amos, Maximilian Nickel, Aditya Grover, Ricky T. Q. Chen, and Yaron Lipman. Matching Normalizing Flows and Probability Paths on Manifolds. In International Conference on Machine Learning, 2022.
  9. Joey Bose, Ariella Smofsky, Renjie Liao, Prakash Panangaden, and Will Hamilton. Latent Variable Modelling with Hyperbolic Normalizing Flows. In Proceedings of the 37th International Conference on Machine Learning, PMLR 119, pages 1045–1055, 2020.
  10. Ricky T. Q. Chen and Yaron Lipman. Flow Matching on General Geometries. In The Twelfth International Conference on Learning Representations, 2024.
  11. Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, and David Duvenaud. Neural Ordinary Differential Equations. In Proceedings of the 32nd International Conference on Neural Information Processing Systems, pages 6572–6583. Curran Associates Inc., 2018.
  12. Floor Eijkelboom, Grigory Bartosh, Christian A. Naesseth, Max Welling, and Jan-Willem van de Meent. Variational Flow Matching for Graph Generation. In The Thirty-eighth Annual Conference on Neural Information Processing Systems, 2024.
  13. Luca Falorsi. Continuous Normalizing Flows on Manifolds. arXiv:2104.14959, 2021.
  14. Mevlana Gemici, Danilo Jimenez Rezende, and Shakir Mohamed. Normalizing Flows on Riemannian Manifolds. arXiv:1611.02304, 2016.
  15. Will Grathwohl, Ricky T. Q. Chen, Jesse Bettencourt, and David Duvenaud. Scalable Reversible Generative Models with Free-form Continuous Dynamics. In International Conference on Learning Representations, 2019.
  16. Jonathan Ho, Ajay Jain, and Pieter Abbeel. Denoising Diffusion Probabilistic Models. In Proceedings of the 34th International Conference on Neural Information Processing Systems. Curran Associates Inc., 2020.
  17. Chin-Wei Huang, Milad Aghajohari, Avishek Joey Bose, Prakash Panangaden, and Aaron Courville. Riemannian Diffusion Models. In Proceedings of the 36th International Conference on Neural Information Processing Systems. Curran Associates Inc., 2022.
  18. Leon Klein, Andreas Krämer, and Frank Noé. Equivariant Flow Matching. In Thirty-seventh Conference on Neural Information Processing Systems, 2023.
  19. Jonas Köhler, Leon Klein, and Frank Noé. Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities. In Proceedings of the 37th International Conference on Machine Learning, PMLR 119, pages 5361–5370, 2020.
  20. Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, and Matthew Le. Flow Matching for Generative Modeling. In The Eleventh International Conference on Learning Representations, 2023.
  21. Xingchao Liu, Chengyue Gong, and Qiang Liu. Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow. In International Conference on Learning Representations (ICLR), 2023.
  22. Aaron Lou, Derek Lim, Isay Katsman, Leo Huang, Qingxuan Jiang, Ser-Nam Lim, and Christopher M. De Sa. Neural Manifold Ordinary Differential Equations. In Advances in Neural Information Processing Systems (NeurIPS), 2020.
  23. Emile Mathieu and Maximilian Nickel. Riemannian Continuous Normalizing Flows. In Proceedings of the 34th International Conference on Neural Information Processing Systems. Curran Associates Inc., 2020.
  24. Kirill Neklyudov, Daniel Severo, and Alireza Makhzani. Action Matching: A Variational Method for Learning Stochastic Dynamics from Samples. 2023.
  25. Danilo Jimenez Rezende, George Papamakarios, Sébastien Racanière, Michael Albergo, Gurtej Kanwar, Phiala Shanahan, and Kyle Cranmer. Normalizing Flows on Tori and Spheres. In Proceedings of the 37th International Conference on Machine Learning, PMLR 119, 2020.
  26. Noam Rozen, Aditya Grover, Maximilian Nickel, and Yaron Lipman. Moser Flow: Divergence-based Generative Modeling on Manifolds. In Advances in Neural Information Processing Systems, 2021.
  27. Flow Matching on Homogeneous Spaces. arXiv preprint, 2026.
  28. Victor Garcia Satorras, Emiel Hoogeboom, Fabian Bernd Fuchs, Ingmar Posner, and Max Welling. E(n) Equivariant Normalizing Flows. In Advances in Neural Information Processing Systems, 2021.
  29. Finn M. Sherry and Bart M. N. Smets. Flow Matching on Lie Groups. In Geometric Science of Information, pages 54–62. Springer Nature Switzerland, Cham, 2026.
  30. Yang Song and Stefano Ermon. Generative Modeling by Estimating Gradients of the Data Distribution. In Proceedings of the 33rd International Conference on Neural Information Processing Systems. Curran Associates Inc., 2019.
  31. Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, and Ben Poole. Score-Based Generative Modeling through Stochastic Differential Equations. In International Conference on Learning Representations (ICLR), 2021.
  32. Yang Song, Conor Durkan, Iain Murray, and Stefano Ermon. Maximum Likelihood Training of Score-Based Diffusion Models. In Proceedings of the 35th International Conference on Neural Information Processing Systems, 2021.
  33. Alexander Tong, Kilian Fatras, Nikolay Malkin, Guillaume Huguet, Yanlei Zhang, Jarrid Rector-Brooks, Guy Wolf, and Yoshua Bengio. Improving and Generalizing Flow-based Generative Models with Minibatch Optimal Transport. 2024.
  34. Ryoma Yataka, Kazuki Hirashima, and Masashi Shiraishi. Grassmann Manifold Flows for Stable Shape Generation. 2023.
  35. Olga Zaghen, Floor Eijkelboom, Alison Pouplin, and Erik J. Bekkers. Towards Variational Flow Matching on General Geometries. In ICLR 2025 Workshop on Deep Generative Model in Machine Learning: Theory, Principle and Efficacy, 2025.
  36. Qinsheng Zhang and Yongxin Chen. Fast Sampling of Diffusion Models with Exponential Integrator. In The Eleventh International Conference on Learning Representations, 2023.
  37. Frank Noé, Simon Olsson, Jonas Köhler, and Hao Wu. Boltzmann Generators: Sampling Equilibrium States of Many-Body Systems with Deep Learning. Science, 365(6457):eaaw1147, 2019.