Recognition: 2 theorem links
MatterSim: A Deep Learning Atomistic Model Across Elements, Temperatures and Pressures
Pith reviewed 2026-05-17 00:06 UTC · model grok-4.3
The pith
MatterSim predicts Gibbs free energies of inorganic solids at near first-principles accuracy across wide temperatures and pressures.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
MatterSim acts as a machine-learning force field that not only predicts ground-state material structures and energetics but also simulates their behavior under realistic temperatures and pressures. This enables computation of lattice dynamics and of mechanical and thermodynamic properties at accuracy comparable to first-principles methods. In particular, it predicts Gibbs free energies for a wide range of inorganic solids with near-first-principles accuracy, reaching 15 meV/atom resolution against experimental data for temperatures up to 1000 K. This capability allows experimental phase diagrams to be predicted at minimal computational cost. Additionally, the model serves as a platform for continuous learning and customization through the integration of domain-specific data.
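The Gibbs free energy claim rests on the standard route of combining a ground-state energy with a vibrational free energy from phonon modes. A minimal, self-contained sketch of that route follows; the function names are hypothetical, and in the actual pipeline the phonon energies would come from the force field's lattice dynamics rather than being supplied by hand.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def vibrational_free_energy(phonon_energies_eV, temperature_K):
    """Harmonic vibrational free energy (eV): zero-point energy plus
    the thermal term sum_i kT ln(1 - exp(-hw_i / kT))."""
    if temperature_K == 0:
        return sum(e / 2 for e in phonon_energies_eV)
    kT = K_B * temperature_K
    return sum(e / 2 + kT * math.log(1.0 - math.exp(-e / kT))
               for e in phonon_energies_eV)

def gibbs_free_energy(e0_eV, phonon_energies_eV, temperature_K,
                      pressure_eV_per_A3=0.0, volume_A3=0.0):
    """G ~ E0 + F_vib(T) + pV within the harmonic approximation at
    fixed volume (1 GPa corresponds to about 0.00624 eV/A^3)."""
    return (e0_eV
            + vibrational_free_energy(phonon_energies_eV, temperature_K)
            + pressure_eV_per_A3 * volume_A3)
```

At T = 0 this reduces to E0 plus the zero-point energy; the thermal log terms are negative, so G decreases monotonically with temperature, as expected for a harmonic solid.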
What carries the argument
The deep learning neural network trained actively on extensive first-principles data, serving as a versatile machine learning force field that generalizes across elements, temperatures from 0 to 5000 K, and pressures up to 1000 GPa.
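Active learning of this kind typically scores candidate structures by model uncertainty and sends the most uncertain ones to the first-principles "supervisor" for labeling. The toy loop below illustrates the selection step using ensemble disagreement as the uncertainty proxy; the function names are illustrative, not MatterSim's actual API.

```python
def ensemble_uncertainty(predictions):
    """Spread (population std. dev.) across ensemble members'
    energy predictions for one candidate structure."""
    mean = sum(predictions) / len(predictions)
    return (sum((p - mean) ** 2 for p in predictions)
            / len(predictions)) ** 0.5

def select_for_labeling(candidate_preds, k):
    """Pick the k candidates whose ensemble members disagree most;
    these would be labeled by first-principles calculations and
    added to the training set in the next active-learning round."""
    scored = sorted(candidate_preds.items(),
                    key=lambda kv: ensemble_uncertainty(kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]
```

High-disagreement structures are exactly those the current model represents worst, so labeling them first concentrates expensive first-principles calls where they improve coverage most.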
If this is right
- Computes lattice dynamics and thermodynamic properties at first-principles accuracy but with much higher efficiency.
- Enables prediction of material phase diagrams at low computational cost.
- Supports customization through fine-tuning with domain-specific data, reducing required data by up to 97%.
- Facilitates simulations of materials under extreme conditions up to 5000 K and 1000 GPa.
- Provides a base for continuous learning and integration into materials design workflows.
Where Pith is reading between the lines
- If correct, this approach could extend to predicting other complex properties not directly trained on, such as electronic or optical behaviors through transfer learning.
- High data efficiency suggests potential for rapid adaptation to new levels of theory or specific material classes with minimal additional computations.
- Could accelerate the discovery of materials for applications in energy, electronics, or structural uses by allowing broader exploration of the design space.
Load-bearing premise
The first-principles training data covers enough of the chemical space and conditions to allow the model to generalize accurately without large errors on unseen materials or extreme conditions.
What would settle it
Running MatterSim on a new composition, or at a temperature or pressure outside the training range, and comparing its predicted Gibbs free energy or other properties directly against experimental measurements or high-accuracy first-principles results; discrepancies well beyond 15 meV/atom would falsify the generalization claim.
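The falsification test described above is mechanical once predictions and references are in hand: compute per-structure errors in meV/atom and check them against the 15 meV/atom headline figure. A hedged sketch (the helper name is hypothetical):

```python
def generalization_check(pred_meV_atom, ref_meV_atom, tol_meV=15.0):
    """Compare predicted vs. reference free energies (meV/atom) on an
    out-of-distribution set. Returns (worst_abs_error, passes): the
    generalization claim is falsified if the worst error exceeds the
    tolerance (15 meV/atom, the paper's headline resolution)."""
    errors = [abs(p - r) for p, r in zip(pred_meV_atom, ref_meV_atom)]
    worst = max(errors)
    return worst, worst <= tol_meV
```

Note that the max error is a stricter criterion than the mean; a mean-based check could hide large outliers on precisely the unseen chemistries the claim is about.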
read the original abstract
Accurate and fast prediction of materials properties is central to the digital transformation of materials design. However, the vast design space and diverse operating conditions pose significant challenges for accurately modeling arbitrary material candidates and forecasting their properties. We present MatterSim, a deep learning model actively learned from large-scale first-principles computations, for efficient atomistic simulations at first-principles level and accurate prediction of broad material properties across the periodic table, spanning temperatures from 0 to 5000 K and pressures up to 1000 GPa. Out-of-the-box, the model serves as a machine learning force field, and shows remarkable capabilities not only in predicting ground-state material structures and energetics, but also in simulating their behavior under realistic temperatures and pressures, signifying an up to ten-fold enhancement in precision compared to the prior best-in-class. This enables MatterSim to compute materials' lattice dynamics, mechanical and thermodynamic properties, and beyond, to an accuracy comparable with first-principles methods. Specifically, MatterSim predicts Gibbs free energies for a wide range of inorganic solids with near-first-principles accuracy and achieves a 15 meV/atom resolution for temperatures up to 1000K compared with experiments. This opens an opportunity to predict experimental phase diagrams of materials at minimal computational cost. Moreover, MatterSim also serves as a platform for continuous learning and customization by integrating domain-specific data. The model can be fine-tuned for atomistic simulations at a desired level of theory or for direct structure-to-property predictions, achieving high data efficiency with a reduction in data requirements by up to 97%.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. MatterSim is a deep learning atomistic model actively learned from large-scale first-principles computations. It functions as a machine-learning force field for efficient simulations across the periodic table, with claimed applicability from 0 to 5000 K and pressures up to 1000 GPa. The model is reported to deliver up to ten-fold precision gains over prior best-in-class approaches for ground-state structures, energetics, lattice dynamics, mechanical properties, and thermodynamic quantities. A central result is the prediction of Gibbs free energies for inorganic solids at near-first-principles accuracy, achieving 15 meV/atom resolution versus experiment for temperatures up to 1000 K, thereby enabling low-cost computation of experimental phase diagrams. The work also positions the model as a platform for continuous learning and fine-tuning with reduced data requirements (up to 97%).
Significance. If the generalization and accuracy claims are substantiated, MatterSim would represent a meaningful advance in universal machine-learning potentials for materials science. It could accelerate high-throughput exploration of thermodynamic stability and phase behavior under realistic operating conditions without repeated DFT calculations. The active-learning strategy and support for domain-specific fine-tuning are positive features that align with practical needs in the field. Credit is due for targeting a broad temperature-pressure range and for framing the model as extensible rather than a one-off fit.
major comments (2)
- [Dataset construction] Dataset construction section: the manuscript provides no quantitative description of the training distribution, such as the number of unique compositions or elements covered, temperature/pressure histograms, or the fraction of data at high-T/high-P regimes. This information is load-bearing for the central generalization claim that the 15 meV/atom Gibbs free energy accuracy extends across unseen inorganic solids up to 1000 K and 1000 GPa.
- [Thermodynamic properties results] Thermodynamic properties results: the reported 15 meV/atom resolution for Gibbs free energies versus experiment is given without error bars on the model predictions, without explicit train/test split details, and without confirmation that the experimental comparison set is disjoint from the actively learned first-principles data. These omissions prevent assessment of whether the accuracy reflects true out-of-distribution performance or interpolation on well-sampled subsets.
minor comments (2)
- [Abstract] Abstract: the phrase 'up to ten-fold enhancement in precision' should specify the baseline model, the exact metric (e.g., force or energy RMSE), and the conditions under which the factor is measured.
- [Figures] Figure captions throughout: parity plots and error distributions would benefit from explicit statements of the number of structures or conditions included and whether any data points were excluded as outliers.
Simulated Author's Rebuttal
We thank the referee for their thorough review and valuable feedback on our manuscript. We address each of the major comments below and have updated the manuscript to incorporate the suggested improvements where appropriate.
read point-by-point responses
-
Referee: [Dataset construction] Dataset construction section: the manuscript provides no quantitative description of the training distribution, such as the number of unique compositions or elements covered, temperature/pressure histograms, or the fraction of data at high-T/high-P regimes. This information is load-bearing for the central generalization claim that the 15 meV/atom Gibbs free energy accuracy extends across unseen inorganic solids up to 1000 K and 1000 GPa.
Authors: We agree with the referee that a quantitative description of the training distribution is important to support the generalization claims. In the revised manuscript, we have expanded the Dataset construction section to include the number of unique compositions and elements covered, as well as histograms showing the distribution of temperatures and pressures in the training data. We have also specified the fraction of data in the high-temperature (above 1000 K) and high-pressure (above 100 GPa) regimes. These additions provide the necessary context for evaluating the model's performance across the claimed range. revision: yes
-
Referee: [Thermodynamic properties results] Thermodynamic properties results: the reported 15 meV/atom resolution for Gibbs free energies versus experiment is given without error bars on the model predictions, without explicit train/test split details, and without confirmation that the experimental comparison set is disjoint from the actively learned first-principles data. These omissions prevent assessment of whether the accuracy reflects true out-of-distribution performance or interpolation on well-sampled subsets.
Authors: We appreciate this comment and have revised the Thermodynamic properties results section accordingly. We have added error bars to the reported Gibbs free energy predictions to indicate the uncertainty in the model outputs. Additionally, we have provided explicit details on the train/test splits used in the active learning process and confirmed that the set of materials used for experimental comparison is disjoint from the first-principles data employed in training the model. This clarification demonstrates that the 15 meV/atom accuracy is achieved on out-of-distribution examples, supporting the generalization capability. revision: yes
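The disjointness the authors assert here can be verified mechanically: reduce every training and test composition to a canonical formula and require that the two sets share none. A self-contained sketch (helper names are hypothetical, not from the paper):

```python
from functools import reduce
from math import gcd

def reduced_formula(composition):
    """Canonical reduced formula from an element -> count dict,
    e.g. {'Mg': 2, 'O': 2} -> 'MgO'."""
    g = reduce(gcd, composition.values())
    parts = []
    for el in sorted(composition):
        n = composition[el] // g
        parts.append(el if n == 1 else f"{el}{n}")
    return "".join(parts)

def splits_disjoint(train_compositions, test_compositions):
    """True iff no reduced formula appears in both splits, i.e. the
    experimental comparison set shares no composition with training."""
    train = {reduced_formula(c) for c in train_compositions}
    test = {reduced_formula(c) for c in test_compositions}
    return train.isdisjoint(test)
```

Composition-level disjointness is a necessary but not sufficient condition; polymorphs of the same formula would still slip through and would need a structure-level check.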
Circularity Check
No significant circularity; claims rest on held-out tests and external experimental benchmarks
full rationale
The paper trains MatterSim on first-principles data via active learning and reports accuracy on Gibbs free energies versus independent experiments (15 meV/atom up to 1000 K). This comparison uses external data outside the training distribution. No equations or steps reduce by construction to the inputs (e.g., no fitted parameters renamed as predictions, no self-definitional loops, no load-bearing self-citations). The derivation chain is self-contained against external benchmarks, consistent with standard ML force-field validation.
Axiom & Free-Parameter Ledger
free parameters (1)
- neural network weights
axioms (1)
- domain assumption: DFT calculations provide sufficiently accurate reference data for the target properties across the claimed range of elements, temperatures, and pressures.
Lean theorems connected to this paper
-
IndisputableMonolith/Cost/FunctionalEquation.lean · washburn_uniqueness_aczel · unclear
The relation between the paper passage and the cited Recognition theorem is unclear.
MatterSim employs an active learning approach... deep graph neural networks, uncertainty-aware sampling... first-principles supervisor... M3GNet and Graphormer backbones
-
IndisputableMonolith/Foundation/BlackBodyRadiationDeep.lean · blackBodyRadiationDeepCert · unclear
The relation between the paper passage and the cited Recognition theorem is unclear.
predicts Gibbs free energies... 15 meV/atom resolution for temperatures up to 1000 K
What do these tags mean?
- matches
- The paper's claim is directly supported by a theorem in the formal canon.
- supports
- The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
- extends
- The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
- uses
- The paper appears to rely on the theorem as machinery.
- contradicts
- The paper's claim conflicts with a theorem or certificate in the canon.
- unclear
- Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.
Forward citations
Cited by 20 Pith papers
-
Lang2MLIP: End-to-End Language-to-Machine Learning Interatomic Potential Development with Autonomous Agentic Workflows
Lang2MLIP is an LLM multi-agent framework that automates end-to-end development of machine learning interatomic potentials from natural language input for heterogeneous materials systems.
-
Micro-environment of the Eu interstitial in $\beta$-SiAlON:Eu$^{2+}$ green phosphor
First-principles ΔSCF and Huang-Rhys calculations validate the Eu-N₉ coordination model in β-SiAlON:Eu²⁺ at low z and explain the red-shift of emission via zero-phonon line trends, modest Huang-Rhys increases, and con...
-
Inverse Materials Design via Joint Generation of Crystal Structures and Local Electronic Descriptors
A joint diffusion framework for crystal structures and local electronic descriptors improves inverse materials design success rates and structural quality over structure-only models under band-gap and formation-energy...
-
Fast and Accurate Prediction of Lattice Thermal Conductivity via Machine Learning Surrogates
Machine learning models, especially certain deep neural networks, can predict lattice thermal conductivity with useful accuracy across different generalization tests while being orders of magnitude faster than first-p...
-
Intervention-Based Time Series Causal Discovery via Simulator-Generated Interventional Distributions
SVAR-FM uses simulator clamping to produce interventional distributions and flow matching to identify time series causal structures, with an error bound that predicts sign reversal of causal effects below a simulator ...
-
CrystalREPA: Transferring Physical Priors from Universal MLIPs to Crystal Generative Models
CrystalREPA closes the representation gap between crystal generators and universal MLIPs via contrastive alignment, yielding more stable and valid generated crystals while revealing that MLIP teacher quality is better...
-
Compact SO(3) Equivariant Atomistic Foundation Models via Structural Pruning
Structural pruning of SO(3) equivariant atomistic models from large checkpoints yields 1.5-4x fewer parameters and 2.5-4x less pre-training compute than small models trained from scratch, while outperforming them on m...
-
MatterSim-MT: A multi-task foundation model for in silico materials characterization
MatterSim-MT is a foundation model pretrained on over 35 million first-principles structures that predicts material structure, dynamics, and thermodynamics while enabling multi-task simulations of phonon splitting, fe...
-
Generative structure search for efficient and diverse discovery of molecular and crystal structures
GSS unifies diffusion generation and random structure search into a single sampling process using learned scores and physical forces, recovering diverse metastable structures at over tenfold lower cost than pure RSS w...
-
Agentic Fusion of Large Atomic and Language Models to Accelerate Superconductor Discovery
An agentic framework fusing large atomic and language models rediscovers 66 known superconductors and guides experimental verification of four new ones with transition temperatures from 2.5 K to 6.5 K.
-
Iterative learning scheme for crystal structure prediction with anharmonic lattice dynamics
An iterative scheme using foundation models and SSCHA enables efficient crystal structure prediction with anharmonic effects, shown to match DFT benchmarks on the H3S system from 50 to 200 GPa.
-
An experimentally validated end-to-end framework for operando modeling of intrinsically complex metallosilicates
An end-to-end framework combining domain separation, lightweight ML potentials, and de novo in silico synthesis enables quantitative atomistic modeling of mesoporous metallosilicates that matches experimental densitie...
-
Open Materials 2024 (OMat24) Inorganic Materials Dataset and Models
OMat24 provides over 110 million DFT calculations and EquiformerV2 models that reach state-of-the-art performance on material stability and formation energy prediction.
-
GEWUM: General Exploration Workflow for the Utopia of Materials: A Unified Platform for Automated Structure Generation, Selection, and Validation
GEWUM is a unified open-source platform that combines selective random structure search with universal machine learning interatomic potentials to automate materials structure prediction, selection, and validation.
-
Finetuning-Free Diffusion Model with Adaptive Constraint Guidance for Inorganic Crystal Structure Generation
A finetuning-free diffusion model uses adaptive constraint guidance to generate thermodynamically plausible inorganic crystal structures that meet targeted geometric and chemical constraints.
-
Comparing the latent features of universal machine-learning interatomic potentials
Different uMLIPs encode chemical space in distinct ways, with high cross-model feature reconstruction errors, and fine-tuning preserves strong pre-training bias in the latent features.
-
Assessing foundational atomistic models for iron alloys under Earth's core conditions
Foundational atomistic models reproduce some structural and dynamical properties of iron alloys under core conditions but none consistently match first-principles benchmarks due to missing explicit treatment of therma...
-
Quasiparticle Dynamics in the 4d-4f Ising-like Double Perovskite Ba2DyRuO6 studied using Neutron Scattering and Machine-Learning Framework
Ba2DyRuO6 undergoes simultaneous Ru5+ and Dy3+ antiferromagnetic ordering at 47 K into a collinear Ising state with magnons below 10 meV and CEF levels at 46.5 and 71.8 meV.
-
Learning Structure, Energy, and Dynamics: A Survey of Artificial Intelligence for Protein Dynamics
A review summarizing AI techniques for protein conformation generation, trajectory modeling, Boltzmann generators, machine learning potentials, and related challenges in scalability and physical consistency.
-
Inverse Design of Inorganic Compounds with Generative AI
A review of generative AI for inverse design of inorganic compounds, analyzing adaptations for their complexity in composition, geometry, symmetry, and electronic structure, with discussion of future benchmarks and sy...
Reference graph
Works this paper leans on
- [1]
-
[2]
T. Li, G. Galli, Electronic properties of MoS2 nanoparticles. The Journal of Physical Chemistry C 111(44), 16192–16196 (2007)
work page 2007
-
[3]
K. Mizushima, P. Jones, P. Wiseman, J.B. Goodenough, LixCoO2 (0 < x ≤ 1): A new cathode material for batteries of high energy density. Materials Research Bulletin 15(6), 783–789 (1980)
work page 1980
-
[4]
G. Ceder, Y.M. Chiang, D. Sadoway, M. Aydinol, Y.I. Jang, B. Huang, Identification of cathode materials for lithium batteries guided by first-principles calculations. Nature 392(6677), 694– 696 (1998)
work page 1998
-
[5]
M.W. Tibbitt, C.B. Rodell, J.A. Burdick, K.S. Anseth, Progress in material design for biomedical applications. Proceedings of the National Academy of Sciences 112(47), 14444–14451 (2015)
work page 2015
-
[6]
X. Li, J. Xie, C. Jiang, J. Yu, P. Zhang, Review on design and evaluation of environmental photocatalysts. Frontiers of Environmental Science & Engineering 12, 1–32 (2018)
work page 2018
-
[7]
X. Hu, G. Li, J.C. Yu, Design, fabrication, and modification of nanostructured semiconductor materials for environmental and energy applications. Langmuir 26(5), 3031–3039 (2010)
work page 2010
-
[8]
S. Curtarolo, G.L. Hart, M.B. Nardelli, N. Mingo, S. Sanvito, O. Levy, The high-throughput highway to computational materials design. Nature materials 12(3), 191–201 (2013)
work page 2013
-
[9]
K. Choudhary, B. DeCost, C. Chen, A. Jain, F. Tavazza, R. Cohn, C.W. Park, A. Choudhary, A. Agrawal, S.J. Billinge, et al., Recent advances and applications of deep learning methods in materials science. npj Computational Materials 8(1), 59 (2022)
work page 2022
- [10]
-
[11]
A. Merchant, S. Batzner, S.S. Schoenholz, M. Aykol, G. Cheon, E.D. Cubuk, Scaling deep learning for materials discovery. Nature pp. 1–6 (2023)
work page 2023
-
[12]
C. Chen, D.T. Nguyen, S.J. Lee, N.A. Baker, A.S. Karakoti, L. Lauw, C. Owen, K.T. Mueller, B.A. Bilodeau, V. Murugesan, et al., Accelerating computational materials discovery with artificial intelligence and cloud high-performance computing: from large-scale screening to experimental validation. arXiv preprint arXiv:2401.04070 (2024)
-
[13]
R.K. Lindsey, L.E. Fried, N. Goldman, ChIMES: A force matched potential with explicit three-body interactions for molten carbon. Journal of Chemical Theory and Computation 13(12), 6222–6229 (2017)
work page 2017
-
[14]
K. Schütt, P.J. Kindermans, H.E. Sauceda Felix, S. Chmiela, A. Tkatchenko, K.R. Müller, SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in Neural Information Processing Systems 30 (2017)
work page 2017
-
[15]
A. Musaelian, S. Batzner, A. Johansson, L. Sun, C.J. Owen, M. Kornbluth, B. Kozinsky, Learning local equivariant representations for large-scale atomistic dynamics. Nature Communications 14(1), 579 (2023)
work page 2023
-
[16]
S. Batzner, A. Musaelian, L. Sun, M. Geiger, J.P. Mailoa, M. Kornbluth, N. Molinari, T.E. Smidt, B. Kozinsky, E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022)
work page 2022
-
[17]
C. Chen, W. Ye, Y. Zuo, C. Zheng, S.P. Ong, Graph networks as a universal machine learning framework for molecules and crystals. Chemistry of Materials 31(9), 3564–3572 (2019)
work page 2019
-
[18]
K. Choudhary, B. DeCost, Atomistic line graph neural network for improved materials property predictions. npj Computational Materials 7(1), 185 (2021)
work page 2021
- [19]
-
[20]
B. Deng, P. Zhong, K. Jun, J. Riebesell, K. Han, C.J. Bartel, G. Ceder, Chgnet as a pretrained universal neural network potential for charge-informed atomistic modelling. Nature Machine Intelligence 5(9), 1031–1041 (2023)
work page 2023
-
[21]
I. Batatia, P. Benner, Y. Chiang, A.M. Elena, D.P. Kovács, J. Riebesell, X.R. Advincula, M. Asta, W.J. Baldwin, N. Bernstein, et al., A foundation model for atomistic materials chemistry. arXiv preprint arXiv:2401.00096 (2023)
- [22]
- [23]
- [24]
-
[25]
A. Dunn, Q. Wang, A. Ganose, D. Dopp, A. Jain, Benchmarking materials property prediction methods: the matbench test set and automatminer reference algorithm. npj Computational Materials 6(1), 138 (2020)
work page 2020
- [26]
- [27]
-
[28]
P. Hohenberg, W. Kohn, Inhomogeneous electron gas. Physical review 136(3B), B864 (1964)
work page 1964
- [29]
-
[30]
V.I. Anisimov, J. Zaanen, O.K. Andersen, Band theory and mott insulators: Hubbard u instead of stoner i. Physical Review B 44(3), 943 (1991)
work page 1991
- [31]
-
[32]
J.E. Saal, S. Kirklin, M. Aykol, B. Meredig, C. Wolverton, Materials design and discovery with high-throughput density functional theory: the open quantum materials database (oqmd). Jom 65, 1501–1509 (2013)
work page 2013
-
[33]
S. Kirklin, J.E. Saal, B. Meredig, A. Thompson, J.W. Doak, M. Aykol, S. Rühl, C. Wolverton, The open quantum materials database (OQMD): assessing the accuracy of DFT formation energies. npj Computational Materials 1(1), 1–15 (2015)
work page 2015
-
[34]
J. Schmidt, N. Hoffmann, H.C. Wang, P. Borlido, P.J. Carriço, T.F. Cerqueira, S. Botti, M.A. Marques, Machine-learning-assisted determination of the global zero-temperature phase diagram of materials. Advanced Materials 35(22), 2210788 (2023)
work page 2023
-
[35]
C. Ying, T. Cai, S. Luo, S. Zheng, G. Ke, D. He, Y. Shen, T.Y. Liu, Do transformers really perform badly for graph representation? Advances in neural information processing systems 34, 28877–28888 (2021)
work page 2021
- [36]
-
[37]
J. Riebesell, H. Yang, R. Goodall, S.G. Baird, Pymatviz: visualization toolkit for materials informatics (2022). https://doi.org/10.5281/zenodo.7486816, https://github.com/janosh/pymatviz
-
[38]
M. Horton. Add strict anions option to MaterialsProject2020Compatibility by mkhorton (2024). Accessed May 07, 2024
work page 2024
- [39]
- [40]
-
[41]
C.J. Pickard, R. Needs, Ab initio random structure searching. Journal of Physics: Condensed Matter 23(5), 053201 (2011)
work page 2011
-
[42]
J. Schmidt, H.C. Wang, T.F. Cerqueira, S. Botti, M.A. Marques, A dataset of 175k stable and metastable materials calculated with the pbesol and scan functionals. Scientific Data 9(1), 64 (2022)
work page 2022
-
[43]
J. Schmidt, N. Hoffmann, H.C. Wang, P. Borlido, P.J. Carriço, T.F. Cerqueira, S. Botti, M.A. Marques, Large-scale machine-learning-assisted exploration of the whole materials space. arXiv preprint arXiv:2210.00579 (2022)
-
[44]
G. Bergerhoff, I. Brown, F. Allen, et al., Crystallographic databases. International Union of Crystallography, Chester 360, 77–95 (1987)
work page 1987
-
[45]
F. Bloch, Über die Quantenmechanik der Elektronen in Kristallgittern. Zeitschrift für Physik 52(7), 555–600 (1929)
work page 1929
- [46]
- [47]
-
[48]
P. Giannozzi, S. De Gironcoli, P. Pavone, S. Baroni, Ab initio calculation of phonon dispersions in semiconductors. Physical Review B 43(9), 7231 (1991)
work page 1991
- [49]
-
[50]
H. Yang, M. Govoni, A. Kundu, G. Galli, Combined first-principles calculations of electron– electron and electron–phonon self-energies in condensed systems. Journal of Chemical Theory and Computation 17(12), 7468–7476 (2021)
work page 2021
-
[51]
H. Yang, M. Govoni, A. Kundu, G. Galli, Computational protocol to evaluate electron–phonon interactions within density matrix perturbation theory. Journal of Chemical Theory and Computation 18(10), 6031–6042 (2022)
work page 2022
-
[52]
S. Fang, M. Geiger, J.G. Checkelsky, T. Smidt, Phonon predictions with E(3)-equivariant graph neural networks (2024)
work page 2024
-
[53]
A. Togo. URL https://doi.org/10.48505/nims.4197
-
[54]
K. Tolborg, J. Klarbring, A.M. Ganose, A. Walsh, Free energy predictions for crystal stability and synthesisability. Digital Discovery 1(5), 586–595 (2022)
work page 2022
-
[55]
C.J. Bartel, S.L. Millican, A.M. Deml, J.R. Rumptz, W. Tumas, A.W. Weimer, S. Lany, V. Stevanović, C.B. Musgrave, A.M. Holder, Physical descriptor for the Gibbs energy of inorganic crystalline solids and temperature-dependent materials chemistry. Nature Communications 9(1), 4168 (2018)
work page 2018
-
[56]
N. Dubrovinskaia, S. Petitgirard, S. Chariton, R. Tucoulou, J. Garrevoet, K. Glazyrin, H.P. Liermann, V.B. Prakapenka, L. Dubrovinsky, B1-B2 phase transition in MgO at ultra-high static pressure. arXiv preprint arXiv:1904.00476 (2019)
work page 2019
- [57]
-
[58]
R.S. McWilliams, D.K. Spaulding, J.H. Eggert, P.M. Celliers, D.G. Hicks, R.F. Smith, G.W. Collins, R. Jeanloz, Phase transformations and metallization of magnesium oxide at high pressure and temperature. Science 338(6112), 1330–1333 (2012)
work page 2012
-
[59]
O.T. Unke, S. Chmiela, H.E. Sauceda, M. Gastegger, I. Poltavsky, K.T. Schütt, A. Tkatchenko, K.R. Müller, Machine learning force fields. Chemical Reviews 121(16), 10142–10186 (2021)
work page 2021
- [60]
-
[61]
V.L. Deringer, M.A. Caro, G. Csányi, A general-purpose machine-learning force field for bulk and nanostructured phosphorus. Nature Communications 11(1), 5461 (2020)
work page 2020
- [62]
-
[63]
L.B. Skinner, C. Benmore, J.C. Neuefeind, J.B. Parise, The structure of water around the compressibility minimum. The Journal of chemical physics 141(21) (2014)
work page 2014
-
[64]
W. Chen, F. Ambrosio, G. Miceli, A. Pasquarello, Ab initio electronic structure of liquid water. Physical review letters 117(18), 186401 (2016)
work page 2016
- [65]
-
[66]
R.A. DiStasio, B. Santra, Z. Li, X. Wu, R. Car, The individual and collective effects of exact exchange and dispersion interactions on the ab initio structure of liquid water. The Journal of chemical physics 141(8) (2014)
work page 2014
-
[67]
B. Cheng, E.A. Engel, J. Behler, C. Dellago, M. Ceriotti, Ab initio thermodynamics of liquid and solid water. Proceedings of the National Academy of Sciences 116(4), 1110–1115 (2019)
work page 2019
-
[68]
B. Monserrat, J.G. Brandenburg, E.A. Engel, B. Cheng, Liquid water contains the building blocks of diverse ice phases. Nature Communications 11(1), 5757 (2020)
work page 2020
- [69]
- [70]
-
[71]
P.P. De Breuck, M.L. Evans, G.M. Rignanese, Robust model benchmarking and bias-imbalance in data-driven materials science: a case study on modnet. Journal of Physics: Condensed Matter 33(40), 404002 (2021)
work page 2021
-
[72]
S. Chmiela, V. Vassilev-Galindo, O.T. Unke, A. Kabylda, H.E. Sauceda, A. Tkatchenko, K.R. Müller, Accurate global machine learning force fields for molecules with hundreds of atoms. Science Advances 9(2), eadf0873 (2023)
work page 2023
-
[73]
T.W. Ko, J.A. Finkler, S. Goedecker, J. Behler, Accurate fourth-generation machine learning potentials by electrostatic embedding. Journal of Chemical Theory and Computation 19(12), 3567–3579 (2023)
work page 2023
-
[74]
J. Ansel, E. Yang, H. He, N. Gimelshein, A. Jain, M. Voznesensky, B. Bao, P. Bell, D. Berard, E. Burovski, G. Chauhan, A. Chourdia, W. Constable, A. Desmaison, Z. DeVito, E. Ellison, W. Feng, J. Gong, M. Gschwind, B. Hirsh, S. Huang, K. Kalambarkar, L. Kirsch, M. Lazos, M. Lezcano, Y. Liang, J. Liang, Y. Lu, C. Luk, B. Maher, Y. Pan, C. Puhrsch, M. Reso...
-
[75]
T.W. Ko, M. Nassar, S. Miret, E. Liu, J. Qi, S.P. Ong. Materials Graph Library (2021). https://doi.org/10.5281/zenodo.8025189
-
[76]
T. Chen, S. Luo, D. He, S. Zheng, T.Y. Liu, L. Wang, Geomformer: A general architecture for geometric molecular representation learning (2023)
work page 2023
-
[77]
A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A.N. Gomez, L. Kaiser, I. Polosukhin, Attention is all you need. Advances in Neural Information Processing Systems 30 (2017)
work page 2017
-
[78]
I. Loshchilov, F. Hutter, Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
work page 2017
- [79]
-
[80]
A.M. Krajewski, J.W. Siegel, J. Xu, Z.K. Liu, Extensible structure-informed prediction of formation energy with improved accuracy and usability employing neural networks. Computational Materials Science 208, 111254 (2022)
work page 2022
discussion (0)