pith. machine review for the scientific record.

arxiv: 1802.08219 · v3 · submitted 2018-02-22 · 💻 cs.LG · cs.AI · cs.CV · cs.NE

Recognition: unknown

Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds

Authors on Pith: no claims yet
classification: 💻 cs.LG · cs.AI · cs.CV · cs.NE
keywords: networks, field, tensor, layer, neural, accepts, arbitrary, augmentation
0 comments
read the original abstract

We introduce tensor field neural networks, which are locally equivariant to 3D rotations, translations, and permutations of points at every layer. 3D rotation equivariance removes the need for data augmentation to identify features in arbitrary orientations. Our network uses filters built from spherical harmonics; due to the mathematical consequences of this filter choice, each layer accepts as input (and guarantees as output) scalars, vectors, and higher-order tensors, in the geometric sense of these terms. We demonstrate the capabilities of tensor field networks with tasks in geometry, physics, and chemistry.
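The equivariance property the abstract describes can be checked numerically. The sketch below is not code from the paper: `toy_tfn_layer` is a hypothetical, minimal NumPy stand-in for one tensor-field-network layer, using a radially weighted l=1 filter (the degree-1 spherical harmonics are proportional to the unit direction vector). Rotating the input point cloud and then applying the layer gives the same result as applying the layer and then rotating the output, and translations cancel because only point differences enter.

```python
import numpy as np

def toy_tfn_layer(points):
    """Toy equivariant layer: for each point, sum a radial weight times
    the unit direction to every neighbor (an l=1 spherical-harmonic filter)."""
    n = len(points)
    out = np.zeros((n, 3))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = points[j] - points[i]       # translation-invariant difference
            d = np.linalg.norm(r)
            out[i] += np.exp(-d ** 2) * (r / d)  # radial weight * Y_1 direction
    return out

rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 3))

# Random rotation: orthogonalize a Gaussian matrix, force det = +1.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

rotate_then_apply = toy_tfn_layer(pts @ Q.T)
apply_then_rotate = toy_tfn_layer(pts) @ Q.T
assert np.allclose(rotate_then_apply, apply_then_rotate, atol=1e-10)
```

Because the outputs match to numerical precision for an arbitrary rotation, no rotated copies of the data are needed at training time, which is the data-augmentation point made in the abstract.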

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 14 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Fast contracted Clebsch-Gordan tensor products for equivariant graph neural networks

    physics.comp-ph 2026-05 unverdicted novelty 7.0

    An O(L^3) algorithm computes contracted Clebsch-Gordan tensor products for equivariant ML potentials using a structured angular grid and spherical Poisson bracket to handle parity-odd terms at fixed CP rank.

  2. Chem-GMNet: A Sphere-Native Geometric Transformer for Molecular Property Prediction

    cs.LG 2026-05 unverdicted novelty 7.0

    Chem-GMNet uses sphere-native embeddings, DualSKA attention, and SH-FFN layers to match or beat ChemBERTa-2 on MoleculeNet tasks with fewer parameters and sometimes no pretraining.

  3. Symmetry-Protected Basin Localization in Variational Quantum Eigensolvers

    quant-ph 2026-05 unverdicted novelty 7.0

    A symmetry-constrained preconditioner maps molecular geometry to VQE circuit parameters in the correlated ground-state basin, cutting initialization errors by 38x to 6250x on stretched molecules.

  4. Diagnosing Spectral Ceilings in Equivariant Neural Force Fields

    cs.LG 2026-05 unverdicted novelty 7.0

    A new diagnostic reveals that L=2 equivariant force field backbones preserve frequencies up to l=4 but collapse at l=5 on aspirin, consistent with a finite-degree span theorem and controls.

  5. Improving Molecular Force Fields with Minimal Temporal Information

    physics.chem-ph 2026-04 unverdicted novelty 7.0

    FRAMES training with minimal temporal information from MD trajectory pairs improves energy and force prediction accuracy over Equiformer on MD17 and ISO17 benchmarks.

  6. Compact SO(3) Equivariant Atomistic Foundation Models via Structural Pruning

    cs.LG 2026-05 unverdicted novelty 6.0

    Structural pruning of SO(3) equivariant atomistic models from large checkpoints yields 1.5-4x fewer parameters and 2.5-4x less pre-training compute than small models trained from scratch, while outperforming them on m...

  7. PRIME: Protein Representation via Physics-Informed Multiscale Equivariant Hierarchies

    cs.LG 2026-05 unverdicted novelty 6.0

    PRIME is a multiscale graph framework that connects five physics-grounded protein structure levels with bidirectional operators and reports large gains on fold classification and reaction prediction benchmarks.

  8. PRIME: Protein Representation via Physics-Informed Multiscale Equivariant Hierarchies

    cs.LG 2026-05 unverdicted novelty 6.0

    PRIME is a five-level hierarchical equivariant graph model for proteins that uses physics-informed deterministic operators to exchange information across scales and achieves state-of-the-art results on fold classifica...

  9. Agentic Fusion of Large Atomic and Language Models to Accelerate Superconductor Discovery

    cs.LG 2026-04 unverdicted novelty 6.0

    An agentic framework fusing large atomic and language models rediscovers 66 known superconductors and guides experimental verification of four new ones with transition temperatures from 2.5 K to 6.5 K.

  10. Metriplector: From Field Theory to Neural Architecture

    cs.AI 2026-03 unverdicted novelty 6.0

    Metriplector treats neural computation as coupled metriplectic field dynamics whose stress-energy tensor readout achieves competitive results on vision, control, Sudoku, language modeling, and pathfinding with small p...

  11. Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges

    cs.LG 2021-04 accept novelty 6.0

    Geometric deep learning provides a unified mathematical framework based on grids, groups, graphs, geodesics, and gauges to explain and extend neural network architectures by incorporating physical regularities.

  12. A Systematic Survey and Benchmark of Deep Learning for Molecular Property Prediction in the Foundation Model Era

    cs.LG 2026-04 accept novelty 5.0

    A systematic survey and benchmark of four deep learning paradigms for molecular property prediction that organizes the field, critiques current data practices, and outlines three future directions.

  13. Machine-learning modeling of magnetization dynamics in quasi-equilibrium and driven metallic spin systems

    cond-mat.str-el 2026-04 unverdicted novelty 5.0

    Generalized ML force fields reproduce non-collinear magnetic orders on lattices and predict voltage-driven domain-wall motion in itinerant magnets using extensions to nonequilibrium torques.

  14. Neuro-evolutionary stochastic architectures in gauge-covariant neural fields

    cs.NE 2026-04 unverdicted novelty 4.0

    Only the fully symmetry-constrained Ginibre U(1) evolutionary model in gauge-covariant neural fields robustly reaches a narrow near-marginal regime and reproduces the predicted low-frequency finite-width spectral behavior.