pith. machine review for the scientific record.

arxiv: 2510.04015 · v2 · submitted 2025-10-05 · ❄️ cond-mat.mtrl-sci · math-ph · math.MP

Recognition: unknown

Atomistic Machine Learning with Irreducible Cartesian Natural Tensors

Authors on Pith: no claims yet
classification: ❄️ cond-mat.mtrl-sci · math-ph · math.MP
keywords: cartesian · atomistic · tensors · learning · machine · natural · accurate · carnet
read the original abstract

Atomistic machine learning (ML) is a powerful tool for accurate and efficient investigation of material behavior at the atomic scale. While such models have been constructed within Cartesian space to harness geometric information and preserve intuitive physical representations, they face challenges in providing a systematic, irreducible-representation-based formalism analogous to the spherical-tensor machinery widely used in equivariant networks. We address these challenges by proposing Cartesian Natural Tensor Networks (CarNet) as a general framework for atomistic ML. We present the theory of irreducible representations using Cartesian natural tensors, covering their construction and their products, and further develop a systematic scheme for the decomposition and reconstruction of high-rank physical tensors. Leveraging this machinery, we then develop an equivariant Cartesian model and demonstrate its strong performance across diverse atomistic ML tasks. CarNet delivers machine learning interatomic potentials (MLIPs) for both materials and molecular systems, with performance on par with leading spherical-tensor models. Furthermore, it enables the construction of accurate and efficient structure-property relationships for tensorial quantities ranging from simple properties like the dipole moment to high-rank tensors with complex symmetries, such as the elastic constant tensor. This work strengthens Cartesian approaches for advanced atomistic ML in the understanding and design of new materials.
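To make the abstract's central idea concrete, here is a minimal sketch (not the paper's code) of the textbook irreducible decomposition of a rank-2 Cartesian tensor into its isotropic (l = 0), antisymmetric (l = 1), and symmetric-traceless (l = 2) parts; the paper's framework generalizes this kind of decomposition and reconstruction to high-rank natural tensors.

```python
import numpy as np

# Decompose an arbitrary rank-2 Cartesian tensor T into its three
# irreducible parts, each of which transforms independently under rotation:
#   l = 0: isotropic (trace) part, 1 component
#   l = 1: antisymmetric part, 3 components
#   l = 2: symmetric traceless part, 5 components
rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))

iso = np.trace(T) / 3.0 * np.eye(3)       # l = 0
antisym = 0.5 * (T - T.T)                 # l = 1
sym_traceless = 0.5 * (T + T.T) - iso     # l = 2

# The parts reconstruct T exactly, and the l = 2 part is traceless.
assert np.allclose(iso + antisym + sym_traceless, T)
assert np.isclose(np.trace(sym_traceless), 0.0)
```

The component counts (1 + 3 + 5 = 9) match the dimensions of the spherical-tensor irreducible representations, which is the correspondence the Cartesian natural-tensor formalism makes systematic at higher ranks.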

This paper has not been read by Pith yet.

discussion (0)
