2 Pith papers cite this work; polarity classification is still indexing.
Citation roles: background (1)
Citation years: 2026 (2)
Citing papers:

- Towards foundation-style models for energy-frontier heterogeneous neutrino detectors via self-supervised pre-training
  Self-supervised pre-training on multimodal neutrino detector simulations produces reusable representations that improve downstream classification, regression, and data efficiency over training from scratch.

- Structure-Centric Graph Foundation Model via Geometric Bases
  SCGFM creates transferable graph representations by aligning heterogeneous topologies to shared learnable geometric bases via Gromov-Wasserstein distances and re-encoding features accordingly.
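As a hedged illustration of the pre-train-then-reuse recipe the first summary describes (not that paper's actual architecture), here is a minimal masked-reconstruction sketch in plain NumPy: a linear encoder is trained on the self-supervised pretext task of reconstructing randomly masked inputs, and its output can then be reused as a representation for downstream tasks. All names, sizes, and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for detector events: 200 samples, 8 correlated features.
X = rng.normal(size=(200, 8))
X = X @ rng.normal(size=(8, 8)) * 0.5

def pretrain_masked_autoencoder(X, dim=3, lr=0.05, epochs=300, mask_p=0.3):
    """Pretext task: reconstruct the full X from a randomly masked copy."""
    n, d = X.shape
    W_enc = rng.normal(scale=0.1, size=(d, dim))
    W_dec = rng.normal(scale=0.1, size=(dim, d))
    losses = []
    for _ in range(epochs):
        mask = rng.random(X.shape) > mask_p      # keep ~70% of entries
        Xm = X * mask                            # masked input
        H = Xm @ W_enc                           # latent representation
        Xhat = H @ W_dec                         # reconstruction
        err = Xhat - X
        losses.append(float(np.mean(err ** 2)))
        # Gradients of mean squared reconstruction error.
        g_dec = H.T @ err * (2 / err.size)
        g_enc = Xm.T @ (err @ W_dec.T) * (2 / err.size)
        W_dec -= lr * g_dec
        W_enc -= lr * g_enc
    return W_enc, losses

W_enc, losses = pretrain_masked_autoencoder(X)
# The frozen encoder yields reusable features for a downstream head.
Z = X @ W_enc
```

A downstream classifier or regressor would be trained on `Z` (optionally fine-tuning `W_enc`), rather than from randomly initialized weights.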
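SCGFM's summary mentions aligning topologies via Gromov-Wasserstein distances. As a minimal sketch of what a GW coupling between two graph topologies looks like (an entropic projected-gradient scheme in plain NumPy, following the standard Peyré-style iteration, not SCGFM's implementation), the example below couples two small metric graphs given only their intra-graph distance matrices. Graph sizes, `eps`, and iteration counts are illustrative assumptions.

```python
import numpy as np

def sinkhorn(K, p, q, iters=500):
    """Scale kernel K into a coupling with row marginals p and column marginals q."""
    u = np.ones_like(p)
    v = np.ones_like(q)
    for _ in range(iters):
        v = q / (K.T @ u)
        u = p / (K @ v)
    return u[:, None] * K * v[None, :]

def entropic_gw(C1, C2, p, q, eps=1.0, outer=50):
    """Entropic Gromov-Wasserstein coupling for squared loss:
    alternate between the GW cost gradient and an entropic projection."""
    T = np.outer(p, q)
    const = (C1 ** 2 @ p)[:, None] + (C2 ** 2 @ q)[None, :]
    for _ in range(outer):
        G = const - 2.0 * C1 @ T @ C2.T   # linearized GW cost at T
        K = np.exp(-(G - G.min()) / eps)  # shift exponent for stability
        T = sinkhorn(K, p, q)
    return T

# Two 4-node path graphs, described only by shortest-path distances.
C = np.abs(np.subtract.outer(np.arange(4.0), np.arange(4.0)))
p = q = np.full(4, 0.25)
T = entropic_gw(C, C, p, q)
```

The coupling `T` matches node neighborhoods across the two graphs without requiring any shared node features, which is the property that makes GW-style alignment attractive for heterogeneous topologies.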