PINNs in More General Geometry
Pith reviewed 2026-05-07 17:54 UTC · model grok-4.3
The pith
PINNs can address differential geometry problems by training neural networks to minimize functionals that encode geometric conditions, as shown through summaries of three related studies.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
Many constructions in differential geometry can be framed as the minimisation of a differential functional; such functionals can be encoded directly as loss functions, aligning the network's loss-minimisation objective with the solution of the geometric problem.
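A toy instance of this claim, as a hedged sketch: the code below minimises a discretised 1-D Dirichlet energy by gradient descent on grid values with fixed boundary data. In an actual PINN the grid values would be replaced by the outputs of a neural network u_θ(x) and the same functional would serve as the training loss; the problem, grid, and all names here are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimise the discretised 1-D Dirichlet energy
#   E[u] = (1/2) \int_0^1 (u')^2 dx,  subject to u(0) = 0, u(1) = 1,
# by gradient descent on the grid values of u (a stand-in for network weights).

N = 21                       # number of grid points (illustrative)
h = 1.0 / (N - 1)
u = np.zeros(N)
u[-1] = 1.0                  # Dirichlet boundary conditions u(0)=0, u(1)=1

def dirichlet_energy(u):
    """Discrete energy E[u] = sum_i (u_{i+1} - u_i)^2 / (2h)."""
    return np.sum(np.diff(u) ** 2) / (2 * h)

lr = h / 2                   # step size; this choice reproduces Jacobi iteration
for _ in range(2000):
    grad = np.zeros(N)
    # dE/du_i at interior points; boundary entries stay zero, so the
    # boundary conditions are preserved exactly during descent
    grad[1:-1] = (2 * u[1:-1] - u[:-2] - u[2:]) / h
    u -= lr * grad

# The minimiser of the Dirichlet energy with these boundary values is the
# harmonic (here: linear) function u(x) = x.
print(np.max(np.abs(u - np.linspace(0.0, 1.0, N))))
```

The geometric statement "harmonic functions minimise Dirichlet energy" becomes, after discretisation, an ordinary loss-minimisation problem, which is exactly the alignment the claim describes.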
Load-bearing premise
That differential functionals arising in geometry can be directly and effectively encoded as neural network loss functions without significant loss of geometric fidelity or introduction of training artifacts.
Original abstract
Neural architectures trained with losses inspired by differential conditions are the basis for PINN models. Since many constructions in differential geometry may be framed as minimisation of a differential functional, these functionals can be coded as loss functions to align the AI loss-minimisation goal with that of solving the geometric problem. This contribution to the Recent Progress in Computational String Geometry workshop proceedings introduces the PINN architecture defining principles, motivates how they are well suited for problems in differential geometry, and demonstrates their use via summaries of three works at this intersection.
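The "losses inspired by differential conditions" that the abstract describes can be sketched minimally: a PDE residual sampled at collocation points becomes the training loss for a differentiable ansatz. Below, a tiny Fourier ansatz stands in for a neural network, and the problem, ansatz, and parameter values are illustrative assumptions, not the paper's own setup.

```python
import numpy as np

# Hedged sketch of a PINN-style residual loss. Illustrative problem:
#   u'' = -pi^2 sin(pi x) on (0,1), u(0) = u(1) = 0,
# with exact solution u(x) = sin(pi x).

K = 3
theta = np.zeros(K)                      # "network weights" of the ansatz
x = np.linspace(0.05, 0.95, 19)          # interior collocation points

def u(theta, x):
    """Fourier ansatz u_theta(x); satisfies the boundary conditions exactly."""
    return sum(theta[k] * np.sin((k + 1) * np.pi * x) for k in range(K))

def u_xx(theta, x):
    """Analytic second derivative of the ansatz (a real PINN would use autodiff)."""
    return sum(-((k + 1) * np.pi) ** 2 * theta[k] * np.sin((k + 1) * np.pi * x)
               for k in range(K))

f = -np.pi ** 2 * np.sin(np.pi * x)      # source term at the collocation points

def loss(theta):
    # Mean squared PDE residual; the boundary penalty vanishes for this ansatz.
    return np.mean((u_xx(theta, x) - f) ** 2)

# Gradient descent with a numerically estimated gradient, purely to keep the
# sketch dependency-free; a real PINN differentiates the loss automatically.
lr, eps = 1e-4, 1e-6
for _ in range(2000):
    g = np.array([(loss(theta + eps * np.eye(K)[k]) -
                   loss(theta - eps * np.eye(K)[k])) / (2 * eps)
                  for k in range(K)])
    theta -= lr * g

err = np.max(np.abs(u(theta, x) - np.sin(np.pi * x)))
```

Driving the residual loss to zero recovers the solution of the differential condition, which is the defining principle of the PINN architecture referred to above.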
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper claims that PINN models, trained with losses inspired by differential conditions, are suitable for differential geometry problems because many geometric constructions can be framed as minimization of differential functionals, which can be coded as loss functions. It introduces PINN defining principles, motivates the alignment with geometric problems, and demonstrates the approach by summarizing three existing works in this intersection, as a contribution to the Recent Progress in Computational String Geometry workshop proceedings.
Significance. This expository paper provides a clear conceptual bridge between physics-informed neural networks and variational problems in differential geometry. By highlighting how loss minimization in AI can align with functional minimization in geometry, it may encourage applications in computational string geometry and related fields. The manuscript's strength is in its motivational framework and reference to existing applications, though it does not introduce new technical results or verifications.
Simulated Author's Rebuttal
We thank the referee for the positive assessment of our manuscript and the recommendation for minor revision. The paper is an expository contribution to the workshop proceedings, intended to introduce PINN principles and motivate their suitability for differential geometry problems via alignment of loss minimization with variational functionals, illustrated through summaries of three existing studies. We are pleased that the conceptual bridge is viewed as a strength.
Reference graph
Works this paper leans on
- [1] A. L. Besse, Einstein Manifolds. Springer-Verlag, Berlin, Heidelberg, New York, 1987.
- [2] L. Nirenberg, "The Weyl and Minkowski problems in differential geometry in the large," Communications on Pure and Applied Mathematics 6 no. 3 (1953) 337–394.
- [3] T. J. Willmore, "Note on embedded surfaces," An. Ştiinţ. Univ. "Al. I. Cuza" Iaşi Secţ. I a Mat. (N.S.) 11B (1965) 493–496.
- [4] M. Raissi, P. Perdikaris, and G. E. Karniadakis, "Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations," J. Comput. Phys. 378 (2019) 686–707.
- [5] G. E. Karniadakis, I. G. Kevrekidis, L. Lu, P. Perdikaris, S. Wang, and L. Yang, "Physics-informed machine learning," Nat. Rev. Phys. 3 no. 6 (2021) 422–440.
- [6] A. G. Baydin, B. A. Pearlmutter, A. A. Radul, and J. M. Siskind, "Automatic differentiation in machine learning: a survey," J. Mach. Learn. Res. 18 no. 153 (2018) 1–43.
- [7] E. Hirst, T. S. Gherardini, and A. G. Stapleton, "AInstein: Numerical Einstein Metrics via Machine Learning," AI Sci. 1 no. 2 (2025) 025001, arXiv:2502.13043 [hep-th].
- [8] G. Cortés, M. Esteban-Casadevall, Y. Feng, J. Henkel, E. Hirst, T. S. Gherardini, and A. G. Stapleton, "A Machine Learning Approach to the Nirenberg Problem," arXiv:2602.12368 [cs.LG].
- [9] M. R. Douglas, S. Lakshminarasimhan, and Y. Qi, "Numerical Calabi–Yau metrics from holomorphic networks," Proceedings of Machine Learning Research 145 (2020) 223–252, arXiv:2012.04797 [hep-th].
- [10] M. Larfors, A. Lukas, F. Ruehle, and R. Schneider, "Learning Size and Shape of Calabi-Yau Spaces," November 2021.
- [11] P. Berglund, G. Butbaia, T. Hübsch, V. Jejjala, D. Mayorga Peña, C. Mishra, and J. Tan, "Machine-learned Calabi–Yau metrics and curvature," Adv. Theor. Math. Phys. 27 no. 4 (2023) 1107–1158, arXiv:2211.09801 [hep-th].
- [12] M. R. Douglas, D. Platt, and Y. Qi, "Harmonic 1-forms on real loci of Calabi–Yau manifolds," May 2024. https://arxiv.org/abs/2405.19402
- [13] E. Heyes, E. Hirst, H. N. S. Earp, and T. S. R. Silva, "Neural and numerical methods for G2-structures on contact Calabi-Yau 7-manifolds," arXiv:2602.12438 [math.DG].
- [14] S. Zhou and X. Ye, "Approximating high-dimensional minimal surfaces with physics-informed neural networks," 2023.
- [15] K. Hashimoto, K. Kyo, M. Murata, G. Ogiwara, and N. Tanahashi, "Physics-informed neural network solves minimal surfaces in curved spacetime," Mach. Learn. Sci. Tech. 7 no. 1 (2026) 015013, arXiv:2509.10866 [hep-th].
- [16] J. Gómez-Serrano, "Computer-assisted proofs in PDE: a survey," Notices of the AMS 66 no. 3 (2019) 298–310.
- [17] Y. Wang, C.-Y. Lai, J. Gómez-Serrano, and T. Buckmaster, "Asymptotic self-similar blow-up profile for three-dimensional axisymmetric Euler equations using neural networks," Phys. Rev. Lett. 130 (2023) 244002. https://link.aps.org/doi/10.1103/PhysRevLett.130.244002
- [18] D. Platt, "Non-uniqueness and symmetries for the Nirenberg problem using computer assistance," 2026. https://arxiv.org/abs/2603.29544
- [19] E. Hirst, H. N. S. Earp, and T. S. R. Silva, "Minimising Willmore Energy via Neural Flow," arXiv:2604.04321 [math.DG].
- [20] G. Cybenko, "Approximation by superpositions of a sigmoidal function," Math. Control Signals Systems 2 no. 4 (1989) 303–314.
- [21] K. Hornik, "Approximation capabilities of multilayer feedforward networks," Neural Netw. 4 no. 2 (1991) 251–257.
- [22] J. L. Kazdan and F. W. Warner, "Curvature functions for compact 2-manifolds," Annals of Mathematics 99 no. 1 (1974) 14–47.
- [23] J. L. Kazdan and F. W. Warner, "Scalar curvature and conformal deformation of Riemannian structure," Journal of Differential Geometry 10 no. 1 (1975) 113–134.
- [24] S.-Y. A. Chang and P. C. Yang, "Prescribing Gaussian curvature on S2," Acta Mathematica 159 no. 1 (1987) 215–259.
- [25] F. C. Marques and A. Neves, "Min-max theory and the Willmore conjecture," Annals of Mathematics 179 no. 2 (2014) 683–782.
- [26] R. Kusner, "Comparison surfaces for the Willmore problem," Pacific J. Math. 138 no. 2 (1989) 317–345.
- [27] H. B. Lawson, "Complete minimal surfaces in S3," Ann. of Math. (2) 92 no. 3 (1970) 335–374.