Flow Matching on Symmetric Spaces
Pith reviewed 2026-05-07 17:06 UTC · model grok-4.3
The pith
Flow matching on Riemannian symmetric spaces can be reformulated as linear flow matching on a subspace of the Lie algebra.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
We introduce a general framework for training flow matching models on Riemannian symmetric spaces, a large class of manifolds that includes the sphere, hyperbolic space and Grassmannians. We exploit their algebraic structure to reformulate flow matching on symmetric spaces as flow matching on a subspace of the Lie algebra of their isometry group, thus linearizing the problem and greatly simplifying the handling of geodesics. As an application, we showcase our framework on the real Grassmannians SO(n) / SO(k) × SO(n-k).
What carries the argument
Reformulation of flow matching on symmetric spaces as flow matching on a subspace of the Lie algebra of their isometry group
Load-bearing premise
The algebraic structure of Riemannian symmetric spaces permits an exact reformulation of the flow-matching objective as an equivalent problem on a linear subspace of the Lie algebra without loss of the original manifold geometry or probability measure.
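As a concrete instance of this premise (our illustrative sketch, not code from the paper), take the sphere S² = SO(3)/SO(2): geodesics from the base point are exponentials of elements of the subspace p ⊂ so(3), so the conditional path is linear in Lie-algebra coordinates and the geometry is preserved by construction:

```python
import numpy as np
from scipy.linalg import expm

# Base point o on S^2; the subspace p consists of skew-symmetric matrices
# whose so(2) block (rotations fixing o) vanishes.
o = np.array([0.0, 0.0, 1.0])

def p_element(a, b):
    """Element of p subset so(3): the generators that move the base point."""
    return np.array([[0.0, 0.0, a],
                     [0.0, 0.0, b],
                     [-a, -b, 0.0]])

# Geodesic from o with direction (a, b): gamma(t) = exp(t X) . o.
X = p_element(0.3, -0.5)
for t in (0.0, 0.5, 1.0):
    x_t = expm(t * X) @ o                         # point on the sphere at time t
    assert np.isclose(np.linalg.norm(x_t), 1.0)   # stays on the manifold

# The conditional path is linear in Lie-algebra coordinates, t -> t * X,
# so flow matching can regress the constant Euclidean velocity X on p.
```

Because exp(tX) is orthogonal for skew-symmetric X, the path never leaves the sphere, which is the sense in which the linearization loses no manifold geometry.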
What would settle it
An experiment showing that samples or geodesics produced by the Lie-algebra subspace model fail to match the target distribution or paths on the original symmetric space.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper introduces a framework for flow matching on Riemannian symmetric spaces (including spheres, hyperbolic space, and Grassmannians) by exploiting their algebraic structure—specifically the involution and Cartan decomposition g = k ⊕ p—to reformulate the problem as standard Euclidean flow matching on a linear subspace of the Lie algebra of the isometry group. This linearization simplifies geodesic handling via the exponential map. The approach is applied to real Grassmannians SO(n)/SO(k) × SO(n-k).
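The Grassmannian case the summary mentions can be sketched directly (a hedged illustration under the stated Cartan decomposition; the projection-matrix representation and all names here are our choices, not the paper's):

```python
import numpy as np
from scipy.linalg import expm

# Gr(k, n) = SO(n) / (SO(k) x SO(n-k)). Points are rank-k orthogonal
# projection matrices; the base point P0 projects onto the first k axes.
n, k = 5, 2
P0 = np.diag([1.0] * k + [0.0] * (n - k))

def p_element(B):
    """Cartan subspace p: block off-diagonal skew-symmetric matrices."""
    X = np.zeros((n, n))
    X[:k, k:] = B
    X[k:, :k] = -B.T
    return X

rng = np.random.default_rng(0)
X = p_element(rng.normal(size=(k, n - k)))

# Geodesic through P0: gamma(t) = exp(tX) P0 exp(tX)^T; check t = 1.
P1 = expm(X) @ P0 @ expm(X).T
assert np.allclose(P1, P1.T)         # symmetric
assert np.allclose(P1 @ P1, P1)      # idempotent: still a projection
assert np.isclose(np.trace(P1), k)   # rank k: still on Gr(k, n)
```

Since exp(X) is orthogonal, conjugating P0 always yields another rank-k projection, so exponentiating elements of p lands back on the quotient by construction.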
Significance. If the claimed exact reformulation holds without loss of geometry or probability measure, the work would provide a practical simplification for generative modeling on a broad class of non-Euclidean manifolds, reducing the need for manifold-specific geodesic solvers and enabling reuse of Euclidean flow-matching implementations.
minor comments (3)
- The abstract and framework description would benefit from an explicit statement (perhaps in §3 or §4) confirming that the transported vector fields and conditional paths remain equivalent under the isometry group action, including any necessary Jacobian or measure corrections.
- In the Grassmannian application, clarify how the subspace identification interacts with the quotient structure to ensure the generated samples lie on the manifold after the exponential map.
- Add a short discussion of computational complexity: does the Lie-algebra reformulation reduce the cost of sampling or training compared to direct Riemannian flow matching?
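Once targets are mapped into p via the log map, the training objective the report alludes to reduces to plain Euclidean regression of a constant velocity; the following toy sketch (our construction, with a hypothetical stand-in linear model) shows the shape of one step:

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 6                                   # dim p, e.g. k*(n-k) for Gr(k, n)

def model(x_t, t, W):
    """Stand-in linear velocity field v_theta(x_t, t) for illustration."""
    return W @ np.concatenate([x_t, [t]])

W = rng.normal(size=(dim, dim + 1)) * 0.01
X = rng.normal(size=dim)                  # log-map coordinates of a target sample
t = rng.uniform()
x_t = t * X                               # linear conditional path in p
loss = np.mean((model(x_t, t, W) - X) ** 2)   # regress the constant velocity X
assert loss >= 0.0
```

The point of the sketch is that the per-step cost is one log map plus a Euclidean regression, with no geodesic ODE solve; whether this beats direct Riemannian flow matching in practice is exactly the complexity question raised above.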
Simulated Author's Rebuttal
We thank the referee for their positive summary of our work and for recommending minor revision. We are glad that the referee sees value in the algebraic reformulation that reduces flow matching on symmetric spaces to standard Euclidean flow matching on a Lie algebra subspace, thereby simplifying geodesic computations via the exponential map. As the report raises no major objections, we will address the three minor comments in the revised manuscript: we will state explicitly that the transported vector fields and conditional paths remain equivalent under the isometry group action (including any measure corrections), clarify how the subspace identification respects the Grassmannian quotient structure so that generated samples lie on the manifold, and add a brief discussion of computational cost relative to direct Riemannian flow matching.
Circularity Check
No significant circularity
full rationale
The paper's central derivation uses the standard Cartan decomposition g = k ⊕ p of the Lie algebra of the isometry group and the identification of the tangent space at the basepoint with the subspace p to transport the flow-matching vector field and conditional paths from the symmetric space to an equivalent Euclidean problem on p. This is a direct algebraic reformulation that preserves the Riemannian metric and probability measure by construction via the exponential map; it does not rely on fitted parameters, self-referential definitions, or load-bearing self-citations. The Grassmannian application follows immediately from the general case without additional assumptions that close a loop back to the inputs.
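Written out in standard symmetric-space notation (our rendering of the facts the rationale invokes, not an excerpt from the paper), the transport rests on:

```latex
\mathfrak{g} = \mathfrak{k} \oplus \mathfrak{p},
\qquad T_o M \;\cong\; \mathfrak{p},
\qquad \gamma_X(t) = \exp(tX)\cdot o \quad (X \in \mathfrak{p}),
```

so the geodesic conditional path $\gamma_X(t)$ on $M$ pulls back to the straight line $t \mapsto tX$ in $\mathfrak{p}$, where the flow-matching target velocity is the constant $X$; no fitted quantity enters this identification.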
Axiom & Free-Parameter Ledger
axioms (1)
- Domain assumption: Riemannian symmetric spaces admit a transitive action by their isometry group whose Lie algebra contains a linear subspace that faithfully represents the tangent-space geometry for flow purposes.