Nonlinear GENERIC Informed Neural Networks (N-GINNs): learning GENERIC dynamics with non-quadratic dissipation potentials
Pith reviewed 2026-05-12 02:32 UTC · model grok-4.3
The pith
Neural networks learn GENERIC dynamics with non-quadratic dissipation while exactly obeying the first and second laws of thermodynamics.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
By reparameterizing both the bivector operator and the dissipation potential, neural networks can be trained to identify the full nonlinear GENERIC structure from data, thereby recovering dynamics that exactly satisfy the first and second laws for arbitrary convex dissipation potentials.
What carries the argument
The nonlinear GENERIC formalism: a Hamiltonian bivector generating the reversible flow, superimposed on a generalized gradient flow driven by a convex dissipation potential, with reparameterizations that enforce thermodynamic consistency directly in the network architecture.
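The structural split described here can be made concrete with a toy instance. Below is a minimal sketch, not the paper's architecture, of a GENERIC system with quadratic dissipation: a damped oscillator carrying an entropy variable, where an exactly skew-symmetric bivector and a degenerate friction matrix enforce the two laws by construction. The friction matrix stands in for the more general convex-potential gradient flow the paper learns; all names and constants are illustrative.

```python
import numpy as np

# Toy GENERIC system: state z = (q, p, s),
# E = p^2/2 + q^2/2 + s (unit temperature), S = s.

GAMMA = 0.5  # assumed friction coefficient

def grad_E(z):
    q, p, s = z
    return np.array([q, p, 1.0])

def grad_S(z):
    return np.array([0.0, 0.0, 1.0])

# Poisson bivector: exactly skew-symmetric by construction
L = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])

def M(z):
    # Positive semi-definite friction matrix built so that M @ grad_E = 0,
    # which is what makes the dissipative part conserve energy exactly
    q, p, s = z
    return GAMMA * np.array([[0.0, 0.0, 0.0],
                             [0.0, 1.0, -p],
                             [0.0, -p, p * p]])

z = np.array([1.0, 0.5, 0.0])
rev = L @ grad_E(z)        # reversible (Hamiltonian) part
irr = M(z) @ grad_S(z)     # irreversible (gradient-flow) part

print(np.allclose(L, -L.T))             # bivector is skew
print(abs(grad_E(z) @ irr) < 1e-12)     # first law: dissipation conserves E
print(grad_S(z) @ rev == 0.0, grad_S(z) @ irr >= 0.0)  # second law
```

The combined field L∇E + M∇S reproduces the familiar damped oscillator (q̇ = p, ṗ = −q − γp) plus entropy production ṡ = γp², so the dissipated kinetic energy reappears, exactly, as entropy.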
If this is right
- The method recovers accurate models for a harmonic oscillator coupled to a heat bath.
- It identifies the dynamics of an idealized chemical motor with nonlinear dissipation.
- It learns a one-dimensional viscoplastic model of Perzyna type from data.
- Thermodynamic structure is guaranteed without any post-training corrections or penalty terms.
Where Pith is reading between the lines
- The same reparameterization strategy could be tried on other structured dynamical systems that admit convex potentials, such as certain rate-dependent plasticity models in higher dimensions.
- If the assumption of exact GENERIC structure is relaxed, the framework might still serve as a regularizer that keeps learned models approximately consistent with the first and second laws.
- Testing the approach on noisy experimental rather than simulated data would reveal how sensitive the recovered operators are to measurement error.
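The regularizer idea in the second bullet can be sketched as a soft penalty on first- and second-law violation, added to an ordinary trajectory-fitting loss. The function names and the toy vector field below are hypothetical, not from the paper.

```python
import numpy as np

# Soft thermodynamic-consistency penalty: instead of enforcing the laws
# by construction, penalize the energy rate (should be 0) and negative
# entropy rate (should be >= 0) of a learned vector field f.

def thermo_penalty(f, grad_E, grad_S, z_batch):
    """Squared energy-rate plus hinge on entropy-rate, averaged over a batch."""
    pen = 0.0
    for z in z_batch:
        dz = f(z)
        e_rate = grad_E(z) @ dz          # first law: should vanish
        s_rate = grad_S(z) @ dz          # second law: should be nonnegative
        pen += e_rate ** 2 + max(0.0, -s_rate) ** 2
    return pen / len(z_batch)

# A field that exactly satisfies both laws incurs (numerically) zero penalty:
grad_E = lambda z: np.array([z[0], z[1], 1.0])
grad_S = lambda z: np.array([0.0, 0.0, 1.0])
f = lambda z: np.array([z[1], -z[0] - 0.5 * z[1], 0.5 * z[1] ** 2])
batch = [np.array([1.0, 0.5, 0.0]), np.array([-0.3, 2.0, 1.0])]
print(thermo_penalty(f, grad_E, grad_S, batch))
```

A model trained with such a penalty is only approximately consistent, which is exactly the trade-off this bullet contemplates relative to the paper's hard enforcement.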
Load-bearing premise
The observed trajectories must be generated by a system that exactly obeys the nonlinear GENERIC structure with a convex dissipation potential, and the chosen network parameterization must be sufficiently expressive to recover the true operators from finite data.
What would settle it
If the trained network, when applied to fresh initial conditions from one of the three test systems, produces trajectories whose total energy drifts or whose entropy decreases, the claim of exact thermodynamic enforcement would be falsified.
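The audit described here is easy to state in code: integrate from fresh initial conditions and measure energy drift and entropy monotonicity along the trajectory. The toy right-hand side below, a damped oscillator with an entropy variable, stands in for a trained N-GINN; the integrator, tolerances, and constants are illustrative.

```python
import numpy as np

# Falsification audit: long-horizon energy drift and entropy monotonicity.
GAMMA = 0.5

def rhs(z):
    q, p, s = z
    return np.array([p, -q - GAMMA * p, GAMMA * p * p])

def energy(z):
    q, p, s = z
    return 0.5 * p * p + 0.5 * q * q + s

def rk4_step(z, dt):
    # classical fourth-order Runge-Kutta step
    k1 = rhs(z); k2 = rhs(z + 0.5 * dt * k1)
    k3 = rhs(z + 0.5 * dt * k2); k4 = rhs(z + dt * k3)
    return z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

z = np.array([1.0, 0.0, 0.0])
E0, traj_s = energy(z), [z[2]]
for _ in range(2000):            # integrate to t = 20 with dt = 0.01
    z = rk4_step(z, 0.01)
    traj_s.append(z[2])

drift = abs(energy(z) - E0)      # should stay at integrator accuracy
entropy_ok = all(b >= a - 1e-12 for a, b in zip(traj_s, traj_s[1:]))
print(drift < 1e-5, entropy_ok)
```

Note the residual drift here comes from the integrator, not the model; a fair audit of a trained network would compare its drift against this discretization baseline.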
Original abstract
We introduce Nonlinear GENERIC Informed Neural Networks (N-GINNs), a deep learning framework for discovering evolution equations of systems governed by the nonlinear GENERIC formalism (General Equation for Non-Equilibrium Reversible-Irreversible Coupling). Such systems exhibit coupled conservative and dissipative dynamics, and can be described via the superposition of a Hamiltonian flow and a generalized gradient flow. In contrast to existing approaches, our formulation incorporates generalized gradient flows via convex dissipation potentials, enabling the identification of a broader class of thermodynamically consistent dynamics, including systems with non-quadratic dissipation potentials. Thermodynamic structure is strongly enforced by construction through suitable reparameterizations of both the bivector operator and the dissipation potential, ensuring exact compliance with the first and second laws of thermodynamics. We validate the proposed approach on three representative examples: a harmonic oscillator coupled to a heat bath, an idealized chemical motor, and a one-dimensional viscoplastic model of Perzyna type. These results demonstrate the method's ability to accurately infer thermodynamically consistent models from data for systems incorporating both conservative and nonlinear dissipative dynamics.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper introduces Nonlinear GENERIC Informed Neural Networks (N-GINNs) to discover, from data, evolution equations for systems governed by the nonlinear GENERIC formalism. The framework parameterizes the Hamiltonian, the skew-symmetric bivector operator, and a convex dissipation potential with neural networks, using reparameterizations to enforce exact thermodynamic consistency (first and second laws). It claims to handle a broader class of dynamics than prior quadratic-dissipation approaches and validates the method on three synthetic examples: a harmonic oscillator coupled to a heat bath, an idealized chemical motor, and a one-dimensional viscoplastic Perzyna-type model.
Significance. If the central claims hold, the work offers a structure-preserving neural architecture for learning non-equilibrium dynamics with nonlinear dissipation, which is valuable for physics-informed machine learning in thermodynamics and continuum mechanics. The explicit reparameterization approach to enforce thermodynamic laws by construction is a clear technical strength that could reduce the need for soft constraints in related methods.
major comments (2)
- The three validation examples (harmonic oscillator, chemical motor, viscoplastic model) are all generated synthetically from the exact nonlinear GENERIC equations with convex dissipation potentials assumed by the model. This provides no test of recovery when the data-generating process deviates from the assumed structure or under realistic noise/model mismatch, which is load-bearing for the claim that the method identifies a broader class of thermodynamically consistent dynamics.
- The manuscript asserts that the chosen reparameterizations of the bivector operator and dissipation potential ensure exact compliance with the first and second laws, but does not include an explicit verification (e.g., numerical check of energy balance or entropy production over long trajectories) that the neural-network outputs remain within the convex cone for the dissipation potential across the training domain.
minor comments (2)
- Quantitative metrics (e.g., relative L2 errors on operators or trajectories, comparison to baselines such as standard PINNs or quadratic-dissipation GINNs) are referenced in the abstract but should be reported with error bars and tables in the results section for each example.
- Notation for the dissipation potential and its convexity constraint should be clarified with an explicit functional form or architecture diagram to aid reproducibility.
Simulated Author's Rebuttal
We thank the referee for their thoughtful and constructive comments. We address each major comment below and indicate the revisions we will incorporate.
Point-by-point responses
Referee: The three validation examples (harmonic oscillator, chemical motor, viscoplastic model) are all generated synthetically from the exact nonlinear GENERIC equations with convex dissipation potentials assumed by the model. This provides no test of recovery when the data-generating process deviates from the assumed structure or under realistic noise/model mismatch, which is load-bearing for the claim that the method identifies a broader class of thermodynamically consistent dynamics.
Authors: We agree that the current validation uses noise-free data generated exactly from the assumed nonlinear GENERIC structure, which limits direct evidence for performance under mismatch or noise. These examples were chosen to isolate and verify the method's capacity to recover non-quadratic dissipation potentials when the structure is present. In the revised manuscript we will add two new experiments: (i) training and prediction on data corrupted by moderate Gaussian noise, and (ii) a controlled mismatch case in which the true dynamics are generated from a dissipation potential outside the exact class assumed by the model. These additions will quantify robustness while preserving the original demonstrations of exact structure recovery. revision: yes
Referee: The manuscript asserts that the chosen reparameterizations of the bivector operator and dissipation potential ensure exact compliance with the first and second laws, but does not include an explicit verification (e.g., numerical check of energy balance or entropy production over long trajectories) that the neural-network outputs remain within the convex cone for the dissipation potential across the training domain.
Authors: The reparameterizations guarantee the required properties analytically: the bivector is constructed to be exactly skew-symmetric for all inputs, and the dissipation potential is parameterized so that its Hessian is positive semi-definite by design (via a convex neural-network representation). We nevertheless acknowledge that an explicit numerical audit strengthens the claim. In the revision we will add (i) long-horizon trajectory plots confirming exact energy conservation and non-negative entropy production, and (ii) domain-wide checks (sampled points and Hessian eigenvalue plots) verifying that the learned dissipation potential remains convex throughout the training region. revision: yes
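The promised convexity audit could look like the following sketch: a tiny input-convex parameterization (nonnegative outer weights over a convex activation) stands in for the learned dissipation potential, and its Hessian is checked for positive semi-definiteness at sampled points via finite differences. The architecture, sizes, and tolerances are assumptions, not the paper's.

```python
import numpy as np

# Input-convex sketch of a dissipation potential psi(xi), plus a
# domain-wide Hessian eigenvalue audit of its convexity.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 3))        # inner weights: sign-unconstrained
b1 = rng.normal(size=8)
w2 = np.abs(rng.normal(size=8))     # outer weights >= 0 ensure convexity

def softplus(x):
    return np.logaddexp(0.0, x)     # convex, smooth activation

def psi(xi):
    # Nonnegative combination of convex functions of affine arguments
    # is convex, so psi is convex by construction.
    return w2 @ softplus(W1 @ xi + b1)

def hessian_fd(f, xi, h=1e-4):
    # Central-difference Hessian (symmetric by construction of the stencil)
    n = xi.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.eye(n)[i] * h, np.eye(n)[j] * h
            H[i, j] = (f(xi + e_i + e_j) - f(xi + e_i - e_j)
                       - f(xi - e_i + e_j) + f(xi - e_i - e_j)) / (4 * h * h)
    return H

# Smallest Hessian eigenvalue over 50 random sample points
min_eig = min(np.linalg.eigvalsh(hessian_fd(psi, rng.normal(size=3))).min()
              for _ in range(50))
print(min_eig > -1e-6)              # PSD up to finite-difference error
```

The same loop, run over the training domain rather than random samples, is the kind of check the rebuttal commits to adding.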
Circularity Check
No significant circularity; structure enforcement is explicit design, not hidden reduction
full rationale
The paper's core contribution is a constrained parameterization (reparameterizations of the bivector and dissipation potential) that forces compliance with GENERIC and the first/second laws by construction. This is presented as a deliberate feature of N-GINNs rather than a derived result. Validation uses synthetic data generated from the exact assumed structure, which tests recovery under the model's own assumptions but does not create a circular derivation—the learned operators are still fitted to data within the enforced class. No self-citations, uniqueness theorems, or renamings reduce the central claim to its inputs. The derivation chain (Hamiltonian + convex dissipation flow, NN approximation, reparam) remains independent and self-contained.
Axiom & Free-Parameter Ledger
axioms (1)
- Domain assumption: target systems obey the nonlinear GENERIC formalism with convex dissipation potentials.
Lean theorems connected to this paper
- `IndisputableMonolith/Cost/FunctionalEquation.lean`, theorem `washburn_uniqueness_aczel`, echoes: "our formulation incorporates generalized gradient flows via convex dissipation potentials... Thermodynamic structure is strongly enforced by construction through suitable reparameterizations of both the bivector operator and the dissipation potential"
- `IndisputableMonolith/Foundation/AbsoluteFloorClosure.lean`, theorem `absolute_floor_iff_bare_distinguishability`, echoes: "Ξ(x, x*) = Ξ̃(x, P(x)x*) − Ξ̃(x, 0) − ... enforces Ξ(x, 0) = 0, convexity, and degeneracy (ii)"
Reference graph
Works this paper leans on
- [1] Steven L Brunton, Joshua L Proctor, and J Nathan Kutz. Discovering governing equations from data by sparse identification of nonlinear dynamical systems. Proceedings of the National Academy of Sciences, 113(15):3932–3937, 2016
- [2] Ricky TQ Chen, Yulia Rubanova, Jesse Bettencourt, and David K Duvenaud. Neural ordinary differential equations. Advances in Neural Information Processing Systems, 31, 2018
- [3] Maziar Raissi, Paris Perdikaris, and George E Karniadakis. Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378:686–707, 2019
- [4] George Em Karniadakis, Ioannis G Kevrekidis, Lu Lu, Paris Perdikaris, Sifan Wang, and Liu Yang. Physics-informed machine learning. Nature Reviews Physics, 3(6):422–440, 2021
- [5] Elena Celledoni, Matthias J Ehrhardt, Christian Etmann, Robert I McLachlan, Brynjulf Owren, C-B Schönlieb, and Ferdia Sherry. Structure-preserving deep learning. European Journal of Applied Mathematics, 32(5):888–936, 2021
- [6] Samuel Greydanus, Misko Dzamba, and Jason Yosinski. Hamiltonian neural networks. Advances in Neural Information Processing Systems, 32, 2019
- [7] Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, and Shirley Ho. Lagrangian neural networks. In ICLR 2020 Workshop on Integration of Deep Neural Models and Differential Equations, 2020
- [8] Pengzhan Jin, Zhen Zhang, Aiqing Zhu, Yifa Tang, and George Em Karniadakis. SympNets: intrinsic structure-preserving symplectic networks for identifying Hamiltonian systems. Neural Networks, 132:166–179, 2020
- [9] Yaofeng Desmond Zhong, Biswadip Dey, and Amit Chakraborty. Symplectic ODE-Net: learning Hamiltonian dynamics with control. In International Conference on Learning Representations, 2020
- [10] Pengzhan Jin, Zhen Zhang, Ioannis G Kevrekidis, and George Em Karniadakis. Learning Poisson systems and trajectories of autonomous systems via Poisson neural networks. IEEE Transactions on Neural Networks and Learning Systems, 34(11):8271–8283, 2022
- [11] Martin Šípka, Michal Pavelka, Oğul Esen, and Miroslav Grmela. Direct Poisson neural networks: learning non-symplectic mechanical systems. Journal of Physics A: Mathematical and Theoretical, 56(49):495201, 2023
- [12] Christopher Eldred, François Gay-Balmaz, Sofiia Huraka, and Vakhtang Putkaradze. Lie–Poisson neural networks (LPNets): data-based computing of Hamiltonian systems with symmetries. Neural Networks, 173:106162, 2024
- [13] François Gay-Balmaz and Hiroaki Yoshimura. From Lagrangian mechanics to nonequilibrium thermodynamics: a variational perspective. Entropy, 21(1):8, 2018
- [14] Mi-Ho Giga, Arkadz Kirshtein, and Chun Liu. Variational modeling and complex fluids. Handbook of Mathematical Analysis in Mechanics of Viscous Fluids, pages 1–41, 2017
- [15] Alexander Mielke. Formulation of thermoelastic dissipative material behavior using GENERIC. Continuum Mechanics and Thermodynamics, 23(3):233–256, 2011
- [16] Andrea Zafferi, Dirk Peschka, and Marita Thomas. GENERIC framework for reactive fluid flows. ZAMM-Journal of Applied Mathematics and Mechanics/Zeitschrift für Angewandte Mathematik und Mechanik, 103(7):e202100254, 2023
- [17] Miroslav Grmela and Hans Christian Öttinger. Dynamics and thermodynamics of complex fluids. I. Development of a general formalism. Physical Review E, 56(6):6620, 1997
- [18] Hans Christian Öttinger and Miroslav Grmela. Dynamics and thermodynamics of complex fluids. II. Illustrations of a general formalism. Physical Review E, 56(6):6633, 1997
- [19] IE Dzyaloshinskii and GE Volovick. Poisson brackets in condensed matter physics. Annals of Physics, 125(1):67–97, 1980
- [20] Miroslav Grmela. Particle and bracket formulations of kinetic equations. Contemp. Math, 28:125–132, 1984
- [21] Philip J Morrison. Bracket formulation for irreversible classical fields. Physics Letters A, 100(8):423–427, 1984
- [22] Allan N Kaufman. Dissipative Hamiltonian systems: a unifying principle. Physics Letters A, 100(8):419–422, 1984
- [23] Michal Pavelka, Václav Klika, and Miroslav Grmela. Multiscale Thermo-Dynamics: Introduction to GENERIC. Walter de Gruyter GmbH & Co KG, 2018
- [24] Hans Christian Öttinger. Beyond Equilibrium Thermodynamics. John Wiley & Sons, 2005
- [25] Quercus Hernández, Alberto Badías, David González, Francisco Chinesta, and Elías Cueto. Structure-preserving neural networks. Journal of Computational Physics, 426:109950, 2021
- [26] Kookjin Lee, Nathaniel Trask, and Panos Stinis. Machine learning structure preserving brackets for forecasting irreversible processes. Advances in Neural Information Processing Systems, 34:5696–5707, 2021
- [27] Zhen Zhang, Yeonjong Shin, and George Em Karniadakis. GFINNs: GENERIC formalism informed neural networks for deterministic and stochastic dynamical systems. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 380(2229), 2022
- [28] Quercus Hernández, Alberto Badías, Francisco Chinesta, and Elías Cueto. Thermodynamics-informed graph neural networks. IEEE Transactions on Artificial Intelligence, 5(3):967–976, 2022
- [29] Anthony Gruber, Kookjin Lee, and Nathaniel Trask. Reversible and irreversible bracket-based dynamics for deep graph neural networks. Advances in Neural Information Processing Systems, 36:38454–38484, 2023
- [30] Anthony Gruber, Kookjin Lee, Haksoo Lim, Noseong Park, and Nathaniel Trask. Efficiently parameterized neural metriplectic systems. In The Thirteenth International Conference on Learning Representations, 2025
- [31] Quercus Hernandez, Max Win, Thomas C. O'Connor, Paulo E. Arratia, and Nathaniel Trask. Data-driven particle dynamics: structure-preserving coarse-graining for emergent behavior in non-equilibrium systems. arXiv preprint arXiv:2508.12569, 2025
- [32] Anthony Gruber, Max Gunzburger, Lili Ju, and Zhu Wang. Energetically consistent model reduction for metriplectic systems. Computer Methods in Applied Mechanics and Engineering, 404:115709, 2023
- [33] Quercus Hernández, Alberto Badías, Francisco Chinesta, and Elías Cueto. Port-metriplectic neural networks: thermodynamics-informed machine learning of complex physical systems. Computational Mechanics, 72(3):553–561, 2023
- [34] Shenglin Huang, Zequn He, and Celia Reina. Variational Onsager neural networks (VONNs): a thermodynamics-based variational learning strategy for non-equilibrium PDEs. Journal of the Mechanics and Physics of Solids, 163:104856, 2022
- [35] Shenglin Huang, Zequn He, Nicolas Dirr, Johannes Zimmer, and Celia Reina. Statistical-physics-informed neural networks (Stat-PINNs): a machine learning strategy for coarse-graining dissipative dynamics. Journal of the Mechanics and Physics of Solids, 194:105908, 2025
- [36] Masao Doi. Onsager's variational principle in soft matter. Journal of Physics: Condensed Matter, 23(28):284118, 2011
- [37] Marino Arroyo, Nikhil Walani, Alejandro Torres-Sánchez, and Dimitri Kaurin. Onsager's variational principle in soft matter: introduction and application to the dynamics of adsorption of proteins onto fluid membranes. In The Role of Mechanics in the Study of Lipid Bilayers, pages 287–332. Springer, 2017
- [38] Moritz Flaschel, Siddhant Kumar, and Laura De Lorenzis. Automated discovery of generalized standard material models with EUCLID. Computer Methods in Applied Mechanics and Engineering, 405:115867, 2023
- [39] Moritz Flaschel, Paul Steinmann, Laura De Lorenzis, and Ellen Kuhl. Convex neural networks learn generalized standard material models. Journal of the Mechanics and Physics of Solids, 200:106103, 2025
- [40] Vahidullah Taç, Manuel K Rausch, Francisco Sahli Costabal, and Adrian Buganza Tepole. Data-driven anisotropic finite viscoelasticity using neural ordinary differential equations. Computer Methods in Applied Mechanics and Engineering, 411:116046, 2023
- [41] Hagen Holthusen, Lukas Lamm, Tim Brepols, Stefanie Reese, and Ellen Kuhl. Theory and implementation of inelastic constitutive artificial neural networks. Computer Methods in Applied Mechanics and Engineering, 428:117063, 2024
- [42] Alexander Mielke, Mark A Peletier, and DR Michiel Renger. On the relation between gradient flows and the large-deviation principle, with applications to Markov chains and diffusion. Potential Analysis, 41(4):1293–1327, 2014
- [43] Hans Christian Öttinger, Mark A Peletier, and Alberto Montefusco. A framework of nonequilibrium statistical mechanics. I. Role and types of fluctuations. Journal of Non-Equilibrium Thermodynamics, 46(1):1–13, 2021
- [44] Richard C Kraaij, Alexandre Lazarescu, Christian Maes, and Mark Peletier. Fluctuation symmetry leads to GENERIC equations with non-quadratic dissipation. Stochastic Processes and their Applications, 130(1):139–170, 2020
- [45] Markus Hütter and Bob Svendsen. Quasi-linear versus potential-based formulations of force–flux relations and the GENERIC for irreversible processes: comparisons and examples. Continuum Mechanics and Thermodynamics, 25(6):803–816, 2013
- [46] Hans Christian Öttinger. On the combined use of friction matrices and dissipation potentials in thermodynamic modeling. Journal of Non-Equilibrium Thermodynamics, 44(3):295–302, 2019
- [47] Alberto Montefusco, Mark A Peletier, and Hans Christian Öttinger. A framework of nonequilibrium statistical mechanics. II. Coarse-graining. Journal of Non-Equilibrium Thermodynamics, 46(1):15–33, 2021
- [48]
- [49] Miroslav Grmela. GENERIC guide to the multiscale dynamics and thermodynamics. Journal of Physics Communications, 2(3):032001, 2018
- [50] Marián Fecko. Differential Geometry and Lie Groups for Physicists. Cambridge University Press, 2006
- [51] Lars Onsager. Reciprocal relations in irreversible processes. I. Physical Review, 37(4):405, 1931
- [52] Lars Onsager. Reciprocal relations in irreversible processes. II. Physical Review, 38(12):2265, 1931
- [53] Adam Janečka and Michal Pavelka. Non-convex dissipation potentials in multiscale non-equilibrium thermodynamics. Continuum Mechanics and Thermodynamics, 30(4):917–941, 2018
- [54] Miroslav Grmela. Multiscale equilibrium and nonequilibrium thermodynamics in chemical engineering. In Advances in Chemical Engineering, volume 39, pages 75–129. Elsevier, 2010
- [55] Oğul Esen, Miroslav Grmela, and Michal Pavelka. On the role of geometry in statistical mechanics and thermodynamics. I. Geometric perspective. Journal of Mathematical Physics, 63(12), 2022
- [56] Miroslav Grmela. Fluctuations in extended mass-action-law dynamics. Physica D: Nonlinear Phenomena, 241(10):976–986, 2012
- [57] Weilun Qiu, Shenglin Huang, and Celia Reina. Bridging statistical mechanics and thermodynamics away from equilibrium: a data-driven approach for learning internal variables and their dynamics. Journal of the Mechanics and Physics of Solids, 203:106211, 2025
- [58] Brian J Edwards and Hans Christian Öttinger. Time-structure invariance criteria for closure approximations. Physical Review E, 56(4):4097, 1997
- [59] Miroslav Grmela. Rheological modeling with GENERIC and with the Onsager principle. Journal of Non-Equilibrium Thermodynamics, 2026
- [60] Pau Urdeitx, Icíar Alfaro, David González, Francisco Chinesta, and Elías Cueto. A comparison of single and double generator formalisms for thermodynamics-informed neural networks. Computational Mechanics, 75(6):1769–1785, 2025
- [61] Oğul Esen, Ghose Anindya Choudhury, and Partha Guha. On integrals, Hamiltonian and metriplectic formulations of polynomial systems in 3D. Theoretical and Applied Mechanics, 44(1):15–34, 2017
- [62] Hans Christian Öttinger. GENERIC integrators: structure preserving time integration for thermodynamic systems. Journal of Non-Equilibrium Thermodynamics, 43(2):89–100, 2018
- [63] Xiaocheng Shang and Hans Christian Öttinger. Structure-preserving integrators for dissipative systems based on reversible–irreversible splitting. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 476(2234), 2020
- [64] Herbert B. Callen. Thermodynamics: An Introduction to the Physical Theories of Equilibrium Thermostatics and Irreversible Thermodynamics. John Wiley & Sons, New York, 1960
- [65] Ernst Hairer, Christian Lubich, and Gerhard Wanner. Geometric Numerical Integration: Structure-Preserving Algorithms for Ordinary Differential Equations, volume 31 of Springer Series in Computational Mathematics. Springer Berlin, Heidelberg, 2nd edition, 2006
- [66] Jacob Lubliner. Plasticity Theory. Courier Corporation, 2008
- [67] Juan C Simo and Thomas JR Hughes. Computational Inelasticity. Springer, 1998
- [68] Zequn He and Celia Reina. EVODMs: variational learning of PDEs for stochastic systems via diffusion models with quantified epistemic uncertainty. Journal of Computational Physics, page 114722, 2026
- [69] Zequn He and Celia Reina. SPIEDiff: robust learning of long-time macroscopic dynamics from short-time particle simulations with quantified epistemic uncertainty. arXiv preprint arXiv:2505.13501, 2025
- [70] Burigede Liu, Eric Ocegueda, Margaret Trautner, Andrew M Stuart, and Kaushik Bhattacharya. Learning macroscopic internal variables and history dependence from microscopic models. Journal of the Mechanics and Physics of Solids, 178:105329, 2023
- [71] Max Rosenkranz, Karl A Kalina, Jörg Brummund, WaiChing Sun, and Markus Kästner. Viscoelasticity with physics-augmented neural networks: model formulation and training methods without prescribed internal variables. Computational Mechanics, 74(6):1279–1301, 2024
- [72] Quercus Hernandez, Alberto Badias, David Gonzalez, Francisco Chinesta, and Elias Cueto. Deep learning of thermodynamics-aware reduced-order models from data. Computer Methods in Applied Mechanics and Engineering, 379:113763, 2021
- [73] Jun Sur Richard Park, Siu Wun Cheung, Youngsoo Choi, and Yeonjong Shin. tLaSDI: thermodynamics-informed latent space dynamics identification. Computer Methods in Applied Mechanics and Engineering, 429:117144, 2024
- [74] Xiaolong He, Yeonjong Shin, Anthony Gruber, Sohyeon Jung, Kookjin Lee, and Youngsoo Choi. Thermodynamically Consistent Latent Dynamics Identification for Parametric Systems. Transactions on Machine Learning Research, 2026
- [75] Hendrik Brugt Gerhard Casimir. On Onsager's principle of microscopic reversibility. Reviews of Modern Physics, 17(2-3):343, 1945
- [76] Nikhil Vyas, Depen Morwani, Rosie Zhao, Itai Shapira, David Brandfonbrener, Lucas Janson, and Sham M. Kakade. SOAP: improving and stabilizing Shampoo using Adam for language modeling. In The Thirteenth International Conference on Learning Representations, 2025
- [77] Sifan Wang, Ananyae Kumar Bhartari, Bowen Li, and Paris Perdikaris. Gradient alignment in physics-informed neural networks: a second-order optimization perspective. In The Thirty-ninth Annual Conference on Neural Information Processing Systems, 2025