pith. machine review for the scientific record.

arxiv: 2605.09058 · v1 · submitted 2026-05-09 · ⚛️ physics.comp-ph · cs.LG

Recognition: 2 Lean theorem links

Nonlinear GENERIC Informed Neural Networks (N-GINNs): learning GENERIC dynamics with non-quadratic dissipation potentials

Celia Reina, Michal Pavelka, Vojtěch Votruba, Weilun Qiu, Zequn He

Authors on Pith: no claims yet

Pith reviewed 2026-05-12 02:32 UTC · model grok-4.3

classification ⚛️ physics.comp-ph cs.LG
keywords nonlinear GENERIC · neural networks · dissipation potentials · thermodynamic consistency · model discovery · gradient flows · non-equilibrium thermodynamics

The pith

Neural networks learn GENERIC dynamics with non-quadratic dissipation while exactly obeying the first and second laws of thermodynamics.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces a neural-network framework that recovers, from observed trajectories, the evolution equations of systems whose dynamics combine reversible Hamiltonian motion with irreversible dissipation. Unlike earlier methods limited to quadratic dissipation, the new approach handles any convex dissipation potential and builds the network so that energy is conserved and entropy is non-decreasing by construction. A reader would care because many real systems in mechanics, chemistry, and materials science exhibit precisely this mixture of conservative and dissipative behavior, and learned models that automatically respect the thermodynamic laws avoid unphysical predictions. The authors demonstrate the framework on a damped oscillator, an idealized chemical motor, and a viscoplastic solid, showing that the recovered operators reproduce the expected trajectories.

Core claim

By reparameterizing both the bivector operator and the dissipation potential, neural networks can be trained to identify the full nonlinear GENERIC structure from data, thereby recovering dynamics that exactly satisfy the first and second laws for arbitrary convex dissipation potentials.

What carries the argument

The nonlinear GENERIC formalism, consisting of a Hamiltonian bivector for reversible flow superimposed with a generalized gradient flow driven by a convex dissipation potential, with reparameterizations that enforce thermodynamic consistency at the level of the network architecture.
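In the notation standard in the GENERIC literature (a sketch of the formalism this review describes, not an equation copied from the paper), the state $x$ evolves as

```latex
\dot{x} \;=\; L(x)\,\frac{\partial E}{\partial x}
\;+\; \left.\frac{\partial \Xi^{*}}{\partial x^{*}}(x, x^{*})\right|_{x^{*} = \frac{\partial S}{\partial x}},
\qquad L = -L^{\mathsf{T}}, \qquad L\,\frac{\partial S}{\partial x} = 0,
```

where $E$ is the energy, $S$ the entropy, and $\Xi^{*}$ a dissipation potential convex in the conjugate variable $x^{*}$ with its minimum at $x^{*}=0$. Skew symmetry of $L$, the degeneracy conditions (including the complementary degeneracy of the dissipative term with respect to $\partial E/\partial x$), and convexity of $\Xi^{*}$ are what make $\dot{E}=0$ and $\dot{S}\ge 0$ hold by construction; the quadratic special case $\Xi^{*}=\tfrac12\,x^{*}\!\cdot M x^{*}$ recovers the linear-dissipation models the paper generalizes.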

If this is right

  • The method recovers accurate models for a harmonic oscillator coupled to a heat bath.
  • It identifies the dynamics of an idealized chemical motor with nonlinear dissipation.
  • It learns a one-dimensional viscoplastic model of Perzyna type from data.
  • Thermodynamic structure is guaranteed without any post-training corrections or penalty terms.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same reparameterization strategy could be tried on other structured dynamical systems that admit convex potentials, such as certain rate-dependent plasticity models in higher dimensions.
  • If the assumption of exact GENERIC structure is relaxed, the framework might still serve as a regularizer that keeps learned models approximately consistent with the first and second laws.
  • Testing the approach on noisy experimental rather than simulated data would reveal how sensitive the recovered operators are to measurement error.

Load-bearing premise

The observed trajectories must be generated by a system that exactly obeys the nonlinear GENERIC structure with a convex dissipation potential, and the chosen network parameterization must be sufficiently expressive to recover the true operators from finite data.

What would settle it

If the trained network, when applied to fresh initial conditions from one of the three test systems, produces trajectories whose total energy drifts or whose entropy decreases, the claim of exact thermodynamic enforcement would be falsified.
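That test can be run numerically. The sketch below uses a hand-written damped oscillator coupled to a heat bath (a hypothetical stand-in for the paper's first example, not its trained network): state $(q, p, s)$ with $E = p^2/2 + q^2/2 + s$ at unit temperature, so any energy drift or entropy decrease along an integrated trajectory flags a violation.

```python
import numpy as np

# Toy GENERIC-consistent damped oscillator in a heat bath (unit mass,
# stiffness, and temperature; gamma is the damping coefficient).
GAMMA = 0.5

def rhs(x):
    q, p, s = x
    # q' = p, p' = -q - gamma*p, s' = gamma*p^2 / T with T = 1,
    # so dE/dt = q*q' + p*p' + s' = 0 exactly and ds/dt >= 0.
    return np.array([p, -q - GAMMA * p, GAMMA * p**2])

def rk4_step(x, dt):
    k1 = rhs(x); k2 = rhs(x + 0.5 * dt * k1)
    k3 = rhs(x + 0.5 * dt * k2); k4 = rhs(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(x):
    q, p, s = x
    return 0.5 * p**2 + 0.5 * q**2 + s

x = np.array([1.0, 0.0, 0.0])
e0, entropies = energy(x), [x[2]]
for _ in range(20000):          # integrate to t = 20
    x = rk4_step(x, 1e-3)
    entropies.append(x[2])

drift = abs(energy(x) - e0)
monotone = all(b >= a - 1e-12 for a, b in zip(entropies, entropies[1:]))
print(drift, monotone)          # drift near machine precision, monotone True
```

A learned model substituted for `rhs` would face exactly the falsification criterion above: nonzero drift or decreasing entropy on fresh initial conditions would refute the exact-enforcement claim.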

Figures

Figures reproduced from arXiv: 2605.09058 by Celia Reina, Michal Pavelka, Vojtěch Votruba, Weilun Qiu, Zequn He.

Figure 1. Schematic of N-GINNs. From the state variables …
Figure 2. Schematic of the harmonic oscillator in a heat bath (Example 1). A particle of mass …
Figure 3. Numerical results for the harmonic oscillator in a heat bath. (a)–(c) Comparison of the learned GENERIC prediction (solid) with …
Figure 4. Schematic of the idealized chemical motor (Example 2). A spring attached to the piston couples mechanical motion (…
Figure 5. Numerical results for the idealized chemical motor. (a)–(f) Comparison of the learned GENERIC prediction (solid) with the reference …
Figure 6. Numerical results for 1D Perzyna viscoplasticity on a representative test trajectory. (a)–(c) Displacement …
Figure 7. Error statistics for 1D Perzyna viscoplasticity across the full test set. (a)–(d) Mean absolute error fields for (a) displacement …
read the original abstract

We introduce Nonlinear GENERIC Informed Neural Networks (N-GINNs), a deep learning framework for discovering evolution equations of systems governed by the nonlinear GENERIC formalism (General Equation for Non-Equilibrium Reversible-Irreversible Coupling). Such systems exhibit coupled conservative and dissipative dynamics, and can be described via the superposition of a Hamiltonian flow and a generalized gradient flow. In contrast to existing approaches, our formulation incorporates generalized gradient flows via convex dissipation potentials, enabling the identification of a broader class of thermodynamically consistent dynamics, including systems with non-quadratic dissipation potentials. Thermodynamic structure is strongly enforced by construction through suitable reparameterizations of both the bivector operator and the dissipation potential, ensuring exact compliance with the first and second laws of thermodynamics. We validate the proposed approach on three representative examples: a harmonic oscillator coupled to a heat bath, an idealized chemical motor, and a one-dimensional viscoplastic model of Perzyna type. These results demonstrate the method's ability to accurately infer thermodynamically consistent models from data for systems incorporating both conservative and nonlinear dissipative dynamics.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper introduces Nonlinear GENERIC Informed Neural Networks (N-GINNs) to discover evolution equations for systems governed by the nonlinear GENERIC formalism from data. The framework parameterizes the Hamiltonian, skew-symmetric bivector operator, and a convex dissipation potential with neural networks, using reparameterizations to enforce exact thermodynamic consistency (first and second laws). It claims to handle a broader class of dynamics than prior quadratic-dissipation approaches and validates the method on three synthetic examples: a harmonic oscillator coupled to a heat bath, an idealized chemical motor, and a one-dimensional viscoplastic Perzyna-type model.

Significance. If the central claims hold, the work offers a structure-preserving neural architecture for learning non-equilibrium dynamics with nonlinear dissipation, which is valuable for physics-informed machine learning in thermodynamics and continuum mechanics. The explicit reparameterization approach to enforce thermodynamic laws by construction is a clear technical strength that could reduce the need for soft constraints in related methods.
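The reparameterization idea the report credits can be illustrated for the reversible part alone (a generic sketch of the standard trick, not the paper's exact construction, which also enforces degeneracy conditions): mapping an unconstrained network output A to L = A − Aᵀ makes L skew-symmetric for every A, so the Hamiltonian term contributes exactly zero power to the energy, with no penalty term needed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unconstrained output A (stand-in for a neural-network head) is mapped
# to L = A - A^T, skew-symmetric by construction for any A.
A = rng.normal(size=(4, 4))
L = A - A.T

grad_E = rng.normal(size=4)      # stand-in for the learned energy gradient
power = grad_E @ L @ grad_E      # dE/dt contribution of the reversible term

# Skew symmetry forces g^T L g = 0 identically, up to floating-point noise.
print(np.allclose(L, -L.T), abs(power) < 1e-12)
```

The same pattern, with convexity replacing skew symmetry, carries the constraint on the dissipation potential.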

major comments (2)
  1. The three validation examples (harmonic oscillator, chemical motor, viscoplastic model) are all generated synthetically from the exact nonlinear GENERIC equations with convex dissipation potentials assumed by the model. This provides no test of recovery when the data-generating process deviates from the assumed structure or under realistic noise/model mismatch, which is load-bearing for the claim that the method identifies a broader class of thermodynamically consistent dynamics.
  2. The manuscript asserts that the chosen reparameterizations of the bivector operator and dissipation potential ensure exact compliance with the first and second laws, but does not include an explicit verification (e.g., numerical check of energy balance or entropy production over long trajectories) that the neural-network outputs remain within the convex cone for the dissipation potential across the training domain.
minor comments (2)
  1. Quantitative metrics (e.g., relative L2 errors on operators or trajectories, comparison to baselines such as standard PINNs or quadratic-dissipation GINNs) are referenced in the abstract but should be reported with error bars and tables in the results section for each example.
  2. Notation for the dissipation potential and its convexity constraint should be clarified with an explicit functional form or architecture diagram to aid reproducibility.

Simulated Authors' Rebuttal

2 responses · 0 unresolved

We thank the referee for their thoughtful and constructive comments. We address each major comment below and indicate the revisions we will incorporate.

read point-by-point responses
  1. Referee: The three validation examples (harmonic oscillator, chemical motor, viscoplastic model) are all generated synthetically from the exact nonlinear GENERIC equations with convex dissipation potentials assumed by the model. This provides no test of recovery when the data-generating process deviates from the assumed structure or under realistic noise/model mismatch, which is load-bearing for the claim that the method identifies a broader class of thermodynamically consistent dynamics.

    Authors: We agree that the current validation uses noise-free data generated exactly from the assumed nonlinear GENERIC structure, which limits direct evidence for performance under mismatch or noise. These examples were chosen to isolate and verify the method's capacity to recover non-quadratic dissipation potentials when the structure is present. In the revised manuscript we will add two new experiments: (i) training and prediction on data corrupted by moderate Gaussian noise, and (ii) a controlled mismatch case in which the true dynamics are generated from a dissipation potential outside the exact class assumed by the model. These additions will quantify robustness while preserving the original demonstrations of exact structure recovery. revision: yes

  2. Referee: The manuscript asserts that the chosen reparameterizations of the bivector operator and dissipation potential ensure exact compliance with the first and second laws, but does not include an explicit verification (e.g., numerical check of energy balance or entropy production over long trajectories) that the neural-network outputs remain within the convex cone for the dissipation potential across the training domain.

    Authors: The reparameterizations guarantee the required properties analytically: the bivector is constructed to be exactly skew-symmetric for all inputs, and the dissipation potential is parameterized so that its Hessian is positive semi-definite by design (via a convex neural-network representation). We nevertheless acknowledge that an explicit numerical audit strengthens the claim. In the revision we will add (i) long-horizon trajectory plots confirming exact energy conservation and non-negative entropy production, and (ii) domain-wide checks (sampled points and Hessian eigenvalue plots) verifying that the learned dissipation potential remains convex throughout the training region. revision: yes
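One standard way to realize the convex representation the response invokes is an input-convex neural network in the style of Amos et al. (an illustrative architecture choice; the paper's exact parameterization is not reproduced here): nonnegative weights on the hidden-to-hidden path composed with convex nondecreasing activations make the scalar output convex in the input by construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def softplus(z):
    return np.logaddexp(0.0, z)  # convex, nondecreasing, numerically stable

class ICNN:
    """Scalar input-convex network: convex in its input by construction."""
    def __init__(self, dim, width):
        self.W0 = rng.normal(size=(width, dim))               # free first layer
        self.Wz = softplus(rng.normal(size=(width, width)))   # nonnegative path
        self.Wx = rng.normal(size=(width, dim))               # free skip layer
        self.wout = softplus(rng.normal(size=width))          # nonnegative head

    def __call__(self, x):
        z = softplus(self.W0 @ x)                 # convex components of x
        z = softplus(self.Wz @ z + self.Wx @ x)   # nonneg weights keep convexity
        return float(self.wout @ z)

phi = ICNN(dim=2, width=16)

# Spot-check convexity along random chords: phi(midpoint) <= average value.
ok = True
for _ in range(100):
    a, b = rng.normal(size=2), rng.normal(size=2)
    ok &= phi(0.5 * (a + b)) <= 0.5 * (phi(a) + phi(b)) + 1e-9
print(ok)
```

With such a representation the Hessian positive semi-definiteness the authors cite holds everywhere, and the proposed domain-wide audit reduces to a sanity check rather than a constraint.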

Circularity Check

0 steps flagged

No significant circularity; structure enforcement is explicit design, not hidden reduction

full rationale

The paper's core contribution is a constrained parameterization (reparameterizations of the bivector and dissipation potential) that forces compliance with GENERIC and the first/second laws by construction. This is presented as a deliberate feature of N-GINNs rather than a derived result. Validation uses synthetic data generated from the exact assumed structure, which tests recovery under the model's own assumptions but does not create a circular derivation—the learned operators are still fitted to data within the enforced class. No self-citations, uniqueness theorems, or renamings reduce the central claim to its inputs. The derivation chain (Hamiltonian + convex dissipation flow, NN approximation, reparam) remains independent and self-contained.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

The central claim rests on the assumption that the target systems obey the nonlinear GENERIC formalism with convex dissipation potentials. No free parameters or invented entities are mentioned in the abstract.

axioms (1)
  • domain assumption Target systems obey the nonlinear GENERIC formalism with convex dissipation potentials.
    This is the foundational premise stated in the abstract for the entire approach.

pith-pipeline@v0.9.0 · 5508 in / 1226 out tokens · 42943 ms · 2026-05-12T02:32:03.353402+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

Reference graph

Works this paper leans on

77 extracted references · 77 canonical work pages

  1. [1]

    Discovering governing equations from data by sparse identification of nonlinear dynamical systems.Proceedings of the National Academy of Sciences, 113(15):3932– 3937, 2016

    Steven L Brunton, Joshua L Proctor, and J Nathan Kutz. Discovering governing equations from data by sparse identification of nonlinear dynamical systems.Proceedings of the National Academy of Sciences, 113(15):3932– 3937, 2016

  2. [2]

    Neural ordinary differential equa- tions.Advances in Neural Information Processing Systems, 31, 2018

    Ricky TQ Chen, Yulia Rubanova, Jesse Bettencourt, and David K Duvenaud. Neural ordinary differential equa- tions.Advances in Neural Information Processing Systems, 31, 2018

  3. [3]

    Maziar Raissi, Paris Perdikaris, and George E Karniadakis. Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations.Journal of Computational Physics, 378:686–707, 2019. 21

  4. [4]

    Physics- informed machine learning.Nature Reviews Physics, 3(6):422–440, 2021

    George Em Karniadakis, Ioannis G Kevrekidis, Lu Lu, Paris Perdikaris, Sifan Wang, and Liu Yang. Physics- informed machine learning.Nature Reviews Physics, 3(6):422–440, 2021

  5. [5]

    Structure-preserving deep learning.European Journal of Applied Mathematics, 32(5):888– 936, 2021

    Elena Celledoni, Matthias J Ehrhardt, Christian Etmann, Robert I McLachlan, Brynjulf Owren, C-B Schonlieb, and Ferdia Sherry. Structure-preserving deep learning.European Journal of Applied Mathematics, 32(5):888– 936, 2021

  6. [6]

    Hamiltonian neural networks.Advances in Neural Information Processing Systems, 32, 2019

    Samuel Greydanus, Misko Dzamba, and Jason Yosinski. Hamiltonian neural networks.Advances in Neural Information Processing Systems, 32, 2019

  7. [7]

    Lagrangian Neural Networks

    Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, and Shirley Ho. Lagrangian Neural Networks. InICLR 2020 Workshop on Integration of Deep Neural Models and Differential Equations, 2019

  8. [8]

    SympNets: intrinsic structure- preserving symplectic networks for identifying Hamiltonian systems.Neural Networks, 132:166–179, 2020

    Pengzhan Jin, Zhen Zhang, Aiqing Zhu, Yifa Tang, and George Em Karniadakis. SympNets: intrinsic structure- preserving symplectic networks for identifying Hamiltonian systems.Neural Networks, 132:166–179, 2020

  9. [9]

    Symplectic ODE-Net: learning Hamiltonian dynamics with control

    Yaofeng Desmond Zhong, Biswadip Dey, and Amit Chakraborty. Symplectic ODE-Net: learning Hamiltonian dynamics with control. InInternational Conference on Learning Representations, 2020

  10. [10]

    Learning Poisson systems and trajectories of autonomous systems via Poisson neural networks.IEEE Transactions on Neural Networks and Learning Systems, 34(11):8271–8283, 2022

    Pengzhan Jin, Zhen Zhang, Ioannis G Kevrekidis, and George Em Karniadakis. Learning Poisson systems and trajectories of autonomous systems via Poisson neural networks.IEEE Transactions on Neural Networks and Learning Systems, 34(11):8271–8283, 2022

  11. [11]

    Direct Poisson neural networks: learning non- symplectic mechanical systems.Journal of Physics A: Mathematical and Theoretical, 56(49):495201, 2023

    Martin Šípka, Michal Pavelka, O ˘gul Esen, and Miroslav Grmela. Direct Poisson neural networks: learning non- symplectic mechanical systems.Journal of Physics A: Mathematical and Theoretical, 56(49):495201, 2023

  12. [12]

    Lie–Poisson neural net- works (LPNets): data-based computing of Hamiltonian systems with symmetries.Neural Networks, 173:106162, 2024

    Christopher Eldred, François Gay-Balmaz, Sofiia Huraka, and Vakhtang Putkaradze. Lie–Poisson neural net- works (LPNets): data-based computing of Hamiltonian systems with symmetries.Neural Networks, 173:106162, 2024

  13. [13]

    From Lagrangian mechanics to nonequilibrium thermodynamics: a variational perspective.Entropy, 21(1):8, 2018

    François Gay-Balmaz and Hiroaki Yoshimura. From Lagrangian mechanics to nonequilibrium thermodynamics: a variational perspective.Entropy, 21(1):8, 2018

  14. [14]

    Variational modeling and complex fluids.Handbook of Mathe- matical Analysis in Mechanics of Viscous Fluids, pages 1–41, 2017

    Mi-Ho Giga, Arkadz Kirshtein, and Chun Liu. Variational modeling and complex fluids.Handbook of Mathe- matical Analysis in Mechanics of Viscous Fluids, pages 1–41, 2017

  15. [15]

    Formulation of thermoelastic dissipative material behavior using GENERIC.Continuum Mechanics and Thermodynamics, 23(3):233–256, 2011

    Alexander Mielke. Formulation of thermoelastic dissipative material behavior using GENERIC.Continuum Mechanics and Thermodynamics, 23(3):233–256, 2011

  16. [16]

    GENERIC framework for reactive fluid flows.ZAMM- Journal of Applied Mathematics and Mechanics/Zeitschrift für Angewandte Mathematik und Mechanik, 103(7):e202100254, 2023

    Andrea Zafferi, Dirk Peschka, and Marita Thomas. GENERIC framework for reactive fluid flows.ZAMM- Journal of Applied Mathematics and Mechanics/Zeitschrift für Angewandte Mathematik und Mechanik, 103(7):e202100254, 2023

  17. [17]

    Dynamics and thermodynamics of complex fluids

    Miroslav Grmela and Hans Christian Öttinger. Dynamics and thermodynamics of complex fluids. I. Development of a general formalism.Physical Review E, 56(6):6620, 1997

  18. [18]

    Dynamics and thermodynamics of complex fluids

    Hans Christian Öttinger and Miroslav Grmela. Dynamics and thermodynamics of complex fluids. II. Illustrations of a general formalism.Physical Review E, 56(6):6633, 1997

  19. [19]

    Poisson brackets in condensed matter physics.Annals of Physics, 125(1):67–97, 1980

    IE Dzyaloshinskii and GE V olovick. Poisson brackets in condensed matter physics.Annals of Physics, 125(1):67–97, 1980

  20. [20]

    Particle and bracket formulations of kinetic equations.Contemp

    Miroslav Grmela. Particle and bracket formulations of kinetic equations.Contemp. Math, 28:125–132, 1984

  21. [21]

    Bracket formulation for irreversible classical fields.Physics Letters A, 100(8):423–427, 1984

    Philip J Morrison. Bracket formulation for irreversible classical fields.Physics Letters A, 100(8):423–427, 1984. 22

  22. [22]

    Dissipative Hamiltonian systems: a unifying principle.Physics Letters A, 100(8):419–422, 1984

    Allan N Kaufman. Dissipative Hamiltonian systems: a unifying principle.Physics Letters A, 100(8):419–422, 1984

  23. [23]

    Walter de Gruyter GmbH & Co KG, 2018

    Michal Pavelka, Václav Klika, and Miroslav Grmela.Multiscale Thermo-Dynamics: Introduction to GENERIC. Walter de Gruyter GmbH & Co KG, 2018

  24. [24]

    John Wiley & Sons, 2005

    Hans Christian Öttinger.Beyond Equilibrium Thermodynamics. John Wiley & Sons, 2005

  25. [25]

    Structure-preserving neural networks.Journal of Computational Physics, 426:109950, 2021

    Quercus Hernández, Alberto Badías, David González, Francisco Chinesta, and Elías Cueto. Structure-preserving neural networks.Journal of Computational Physics, 426:109950, 2021

  26. [26]

    Machine learning structure preserving brackets for forecasting irreversible processes.Advances in Neural Information Processing Systems, 34:5696–5707, 2021

    Kookjin Lee, Nathaniel Trask, and Panos Stinis. Machine learning structure preserving brackets for forecasting irreversible processes.Advances in Neural Information Processing Systems, 34:5696–5707, 2021

  27. [27]

    Zhen Zhang, Yeonjong Shin, and George Em Karniadakis. GFINNs: GENERIC formalism informed neural networks for deterministic and stochastic dynamical systems.Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 380(2229), 2022

  28. [28]

    Thermodynamics-informed graph neural networks.IEEE Transactions on Artificial Intelligence, 5(3):967–976, 2022

    Quercus Hernández, Alberto Badías, Francisco Chinesta, and Elías Cueto. Thermodynamics-informed graph neural networks.IEEE Transactions on Artificial Intelligence, 5(3):967–976, 2022

  29. [29]

    Reversible and irreversible bracket-based dynamics for deep graph neural networks.Advances in Neural Information Processing Systems, 36:38454–38484, 2023

    Anthony Gruber, Kookjin Lee, and Nathaniel Trask. Reversible and irreversible bracket-based dynamics for deep graph neural networks.Advances in Neural Information Processing Systems, 36:38454–38484, 2023

  30. [30]

    Efficiently parameterized neural metriplectic systems

    Anthony Gruber, Kookjin Lee, Haksoo Lim, Noseong Park, and Nathaniel Trask. Efficiently parameterized neural metriplectic systems. InThe Thirteenth International Conference on Learning Representations, 2025

  31. [31]

    C., Arratia, P

    Quercus Hernandez, Max Win, Thomas C. O’Connor, Paulo E. Arratia, and Nathaniel Trask. Data-driven par- ticle dynamics: structure-preserving coarse-graining for emergent behavior in non-equilibrium systems.arXiv preprint arXiv:2508.12569, 2025

  32. [32]

    Energetically consistent model reduction for metriplectic systems.Computer Methods in Applied Mechanics and Engineering, 404:115709, 2023

    Anthony Gruber, Max Gunzburger, Lili Ju, and Zhu Wang. Energetically consistent model reduction for metriplectic systems.Computer Methods in Applied Mechanics and Engineering, 404:115709, 2023

  33. [33]

    Port-metriplectic neural net- works: thermodynamics-informed machine learning of complex physical systems.Computational Mechanics, 72(3):553–561, 2023

    Quercus Hernández, Alberto Badías, Francisco Chinesta, and Elías Cueto. Port-metriplectic neural net- works: thermodynamics-informed machine learning of complex physical systems.Computational Mechanics, 72(3):553–561, 2023

  34. [34]

    Shenglin Huang, Zequn He, and Celia Reina. Variational Onsager neural networks (VONNs): a thermodynamics-based variational learning strategy for non-equilibrium PDEs.Journal of the Mechanics and Physics of Solids, 163:104856, 2022

  35. [35]

    Shenglin Huang, Zequn He, Nicolas Dirr, Johannes Zimmer, and Celia Reina. Statistical-physics-informed neural networks (Stat-PINNs): a machine learning strategy for coarse-graining dissipative dynamics.Journal of the Mechanics and Physics of Solids, 194:105908, 2025

  36. [36]

    Onsager’s variational principle in soft matter.Journal of Physics: Condensed Matter, 23(28):284118, 2011

    Masao Doi. Onsager’s variational principle in soft matter.Journal of Physics: Condensed Matter, 23(28):284118, 2011

  37. [37]

    Onsager’s variational principle in soft matter: introduction and application to the dynamics of adsorption of proteins onto fluid membranes

    Marino Arroyo, Nikhil Walani, Alejandro Torres-Sánchez, and Dimitri Kaurin. Onsager’s variational principle in soft matter: introduction and application to the dynamics of adsorption of proteins onto fluid membranes. In The role of mechanics in the study of lipid bilayers, pages 287–332. Springer, 2017

  38. [38]

    Automated discovery of generalized standard material models with EUCLID.Computer Methods in Applied Mechanics and Engineering, 405:115867, 2023

    Moritz Flaschel, Siddhant Kumar, and Laura De Lorenzis. Automated discovery of generalized standard material models with EUCLID.Computer Methods in Applied Mechanics and Engineering, 405:115867, 2023

  39. [39]

    Convex neural networks learn generalized standard material models.Journal of the Mechanics and Physics of Solids, 200:106103, 2025

    Moritz Flaschel, Paul Steinmann, Laura De Lorenzis, and Ellen Kuhl. Convex neural networks learn generalized standard material models.Journal of the Mechanics and Physics of Solids, 200:106103, 2025. 23

  40. [40]

    Data-driven anisotropic finite viscoelasticity using neural ordinary differential equations.Computer Methods in Applied Mechanics and Engineering, 411:116046, 2023

    Vahidullah Taç, Manuel K Rausch, Francisco Sahli Costabal, and Adrian Buganza Tepole. Data-driven anisotropic finite viscoelasticity using neural ordinary differential equations.Computer Methods in Applied Mechanics and Engineering, 411:116046, 2023

  41. [41]

    Theory and implementation of inelastic constitutive artificial neural networks.Computer Methods in Applied Mechanics and Engineering, 428:117063, 2024

    Hagen Holthusen, Lukas Lamm, Tim Brepols, Stefanie Reese, and Ellen Kuhl. Theory and implementation of inelastic constitutive artificial neural networks.Computer Methods in Applied Mechanics and Engineering, 428:117063, 2024

  42. [42]

    On the relation between gradient flows and the large-deviation principle, with applications to Markov chains and diffusion.Potential Analysis, 41(4):1293– 1327, 2014

    Alexander Mielke, Mark A Peletier, and DR Michiel Renger. On the relation between gradient flows and the large-deviation principle, with applications to Markov chains and diffusion.Potential Analysis, 41(4):1293– 1327, 2014

  43. [43]

    A framework of nonequilibrium statistical mechanics

    Hans Christian Öttinger, Mark A Peletier, and Alberto Montefusco. A framework of nonequilibrium statistical mechanics. I. Role and types of fluctuations.Journal of Non-Equilibrium Thermodynamics, 46(1):1–13, 2021

  44. [44]

    Fluctuation symmetry leads to GENERIC equations with non-quadratic dissipation.Stochastic Processes and their Applications, 130(1):139– 170, 2020

    Richard C Kraaij, Alexandre Lazarescu, Christian Maes, and Mark Peletier. Fluctuation symmetry leads to GENERIC equations with non-quadratic dissipation.Stochastic Processes and their Applications, 130(1):139– 170, 2020

  45. [45]

    Markus Hütter and Bob Svendsen. Quasi-linear versus potential-based formulations of force–flux relations and the GENERIC for irreversible processes: comparisons and examples.Continuum Mechanics and Thermody- namics, 25(6):803–816, 2013

  46. [46]

    On the combined use of friction matrices and dissipation potentials in thermodynamic modeling.Journal of Non-Equilibrium Thermodynamics, 44(3):295–302, 2019

    Hans Christian Öttinger. On the combined use of friction matrices and dissipation potentials in thermodynamic modeling.Journal of Non-Equilibrium Thermodynamics, 44(3):295–302, 2019

  47. [47]

    A framework of nonequilibrium statistical mechanics

    Alberto Montefusco, Mark A Peletier, and Hans Christian Öttinger. A framework of nonequilibrium statistical mechanics. II. Coarse-graining.Journal of Non-Equilibrium Thermodynamics, 46(1):15–33, 2021

  48. [48]

    Number 36

    Antony N Beris and Brian J Edwards.Thermodynamics of Flowing Systems: With Internal Microstructure. Number 36. Oxford University Press, 1994

  49. [49]

    GENERIC guide to the multiscale dynamics and thermodynamics.Journal of Physics Com- munications, 2(3):032001, 2018

    Miroslav Grmela. GENERIC guide to the multiscale dynamics and thermodynamics.Journal of Physics Com- munications, 2(3):032001, 2018

  50. [50]

    Cambridge university press, 2006

    Marián Fecko.Differential geometry and Lie groups for physicists. Cambridge university press, 2006

  51. [51]

    Reciprocal relations in irreversible processes

    Lars Onsager. Reciprocal relations in irreversible processes. I.Physical Review, 37(4):405, 1931

  52. [52]

    Reciprocal relations in irreversible processes

    Lars Onsager. Reciprocal relations in irreversible processes. II.Physical review, 38(12):2265, 1931

  53. [53]

    Non-convex dissipation potentials in multiscale non-equilibrium thermody- namics.Continuum Mechanics and Thermodynamics, 30(4):917–941, 2018

    Adam Jane ˇcka and Michal Pavelka. Non-convex dissipation potentials in multiscale non-equilibrium thermody- namics.Continuum Mechanics and Thermodynamics, 30(4):917–941, 2018

  54. [54]

    Multiscale equilibrium and nonequilibrium thermodynamics in chemical engineering

    Miroslav Grmela. Multiscale equilibrium and nonequilibrium thermodynamics in chemical engineering. In Advances in Chemical Engineering, volume 39, pages 75–129. Elsevier, 2010

  55. [55]

    On the role of geometry in statistical mechanics and thermo- dynamics

    O ˘gul Esen, Miroslav Grmela, and Michal Pavelka. On the role of geometry in statistical mechanics and thermo- dynamics. I. Geometric perspective.Journal of Mathematical Physics, 63(12), 2022

  56. [56]

    Fluctuations in extended mass-action-law dynamics.Physica D: Nonlinear Phenomena, 241(10):976–986, 2012

    Miroslav Grmela. Fluctuations in extended mass-action-law dynamics.Physica D: Nonlinear Phenomena, 241(10):976–986, 2012

  57. [57]

    Weilun Qiu, Shenglin Huang, and Celia Reina. Bridging statistical mechanics and thermodynamics away from equilibrium: a data-driven approach for learning internal variables and their dynamics.Journal of the Mechanics and Physics of Solids, 203:106211, 2025. 24

  58. [58]

    Time-structure invariance criteria for closure approximations

    Brian J Edwards and Hans Christian Öttinger. Time-structure invariance criteria for closure approximations. Physical Review E, 56(4):4097, 1997

  59. [59]

    Rheological modeling with GENERIC and with the Onsager principle.Journal of Non- Equilibrium Thermodynamics, 2026

    Miroslav Grmela. Rheological modeling with GENERIC and with the Onsager principle.Journal of Non- Equilibrium Thermodynamics, 2026

  60. [60]

    A comparison of single and double generator formalisms for thermodynamics-informed neural networks.Computational Mechanics, 75(6):1769–1785, 2025

    Pau Urdeitx, Icíar Alfaro, David González, Francisco Chinesta, and Elías Cueto. A comparison of single and double generator formalisms for thermodynamics-informed neural networks.Computational Mechanics, 75(6):1769–1785, 2025

  61. [61]

    Oğul Esen, Ghose Anindya Choudhury, and Partha Guha. On integrals, Hamiltonian and metriplectic formulations of polynomial systems in 3D. Theoretical and Applied Mechanics, 44(1):15–34, 2017

  62. [62]

    Hans Christian Öttinger. GENERIC integrators: structure preserving time integration for thermodynamic systems. Journal of Non-Equilibrium Thermodynamics, 43(2):89–100, 2018

  63. [63]

    Xiaocheng Shang and Hans Christian Öttinger. Structure-preserving integrators for dissipative systems based on reversible–irreversible splitting. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 476(2234), 2020

  64. [64]

    Herbert B. Callen. Thermodynamics: An Introduction to the Physical Theories of Equilibrium Thermostatics and Irreversible Thermodynamics. John Wiley & Sons, New York, 1960

  65. [65]

    Ernst Hairer, Christian Lubich, and Gerhard Wanner. Geometric Numerical Integration: Structure-Preserving Algorithms for Ordinary Differential Equations, volume 31 of Springer Series in Computational Mathematics. Springer Berlin, Heidelberg, 2nd edition, 2006

  66. [66]

    Jacob Lubliner. Plasticity theory. Courier Corporation, 2008

  67. [67]

    Juan C Simo and Thomas JR Hughes. Computational inelasticity. Springer, 1998

  68. [68]

    Zequn He and Celia Reina. EVODMs: variational learning of PDEs for stochastic systems via diffusion models with quantified epistemic uncertainty. Journal of Computational Physics, page 114722, 2026

  69. [69]

    Zequn He and Celia Reina. SPIEDiff: robust learning of long-time macroscopic dynamics from short-time particle simulations with quantified epistemic uncertainty. arXiv preprint arXiv:2505.13501, 2025

  70. [70]

    Burigede Liu, Eric Ocegueda, Margaret Trautner, Andrew M Stuart, and Kaushik Bhattacharya. Learning macroscopic internal variables and history dependence from microscopic models. Journal of the Mechanics and Physics of Solids, 178:105329, 2023

  71. [71]

    Max Rosenkranz, Karl A Kalina, Jörg Brummund, WaiChing Sun, and Markus Kästner. Viscoelasticity with physics-augmented neural networks: model formulation and training methods without prescribed internal variables. Computational Mechanics, 74(6):1279–1301, 2024

  72. [72]

    Quercus Hernandez, Alberto Badias, David Gonzalez, Francisco Chinesta, and Elias Cueto. Deep learning of thermodynamics-aware reduced-order models from data. Computer Methods in Applied Mechanics and Engineering, 379:113763, 2021

  73. [73]

    Jun Sur Richard Park, Siu Wun Cheung, Youngsoo Choi, and Yeonjong Shin. tLaSDI: thermodynamics-informed latent space dynamics identification. Computer Methods in Applied Mechanics and Engineering, 429:117144, 2024

  74. [74]

    Xiaolong He, Yeonjong Shin, Anthony Gruber, Sohyeon Jung, Kookjin Lee, and Youngsoo Choi. Thermodynamically Consistent Latent Dynamics Identification for Parametric Systems. Transactions on Machine Learning Research, 2026. J2C Certification

  75. [75]

    Hendrik Brugt Gerhard Casimir. On Onsager's principle of microscopic reversibility. Reviews of Modern Physics, 17(2-3):343, 1945

  76. [76]

    Nikhil Vyas, Depen Morwani, Rosie Zhao, Itai Shapira, David Brandfonbrener, Lucas Janson, and Sham M. Kakade. SOAP: Improving and Stabilizing Shampoo using Adam for Language Modeling. In The Thirteenth International Conference on Learning Representations, 2025

  77. [77]

    Sifan Wang, Ananyae Kumar Bhartari, Bowen Li, and Paris Perdikaris. Gradient alignment in physics-informed neural networks: a second-order optimization perspective. In The Thirty-ninth Annual Conference on Neural Information Processing Systems, 2025