ChatSR: Multimodal Large Language Models for Scientific Formula Discovery
Current multimodal large language models (MLLMs) mainly focus on understanding and processing perceptual modalities such as images and videos, while their capability for scientific data understanding remains insufficient. To this end, we propose ChatSR, a novel multimodal large language model tailored for scientific data understanding. ChatSR treats scientific data as a new modality analogous to visual content and, through carefully designed encoders and modality alignment mechanisms, maps scientific data into a representation space that large language models can process, enabling the model to grasp the structural characteristics and underlying regularities of scientific data. Building on this foundation, ChatSR further exploits the rich domain knowledge and strong reasoning abilities of large language models to emulate a knowledgeable human scientist: given user-specified prior constraints and preferences (such as requirements on periodicity or symmetry), it automatically generates mathematical formulas that not only accurately fit the observed data but also conform to domain priors, thereby characterizing the latent laws embodied in scientific data and promoting the automation of scientific discovery. Experiments on 13 datasets show that ChatSR achieves state-of-the-art performance on traditional symbolic regression benchmarks. In addition, ChatSR exhibits a promising zero-shot ability to understand and utilize types of prior knowledge that are not present in its training data.
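The abstract's core idea of prior-constrained formula discovery can be illustrated with a minimal sketch. This is not ChatSR's actual method (which uses an MLLM to generate formulas); it is a hypothetical toy in which candidate formulas proposed by some search procedure are scored by fit to data, and only those satisfying a user-stated prior (here, even symmetry f(x) = f(-x)) are admissible. All names and the candidate set are illustrative assumptions.

```python
import math

# Toy illustration (NOT ChatSR's method): rank candidate formulas by
# mean squared error, but admit only those that satisfy a user-stated
# prior constraint -- here, even symmetry f(x) == f(-x).

# Observed data: y = x^2 (noise-free for clarity).
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [x * x for x in xs]

# Candidate formulas a search procedure might propose (hypothetical).
candidates = {
    "x**2":   lambda x: x ** 2,
    "x**3":   lambda x: x ** 3,       # good partial fit, but violates the prior
    "abs(x)": lambda x: abs(x),       # satisfies the prior, worse fit
    "sin(x)": lambda x: math.sin(x),  # odd function, violates the prior
}

def mse(f):
    # Mean squared error of formula f against the observed data.
    return sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def is_even(f, probes=(0.5, 1.5, 2.5), tol=1e-9):
    # Prior constraint check: f(x) == f(-x) at a few probe points.
    return all(abs(f(x) - f(-x)) < tol for x in probes)

admissible = {name: mse(f) for name, f in candidates.items() if is_even(f)}
best = min(admissible, key=admissible.get)
print(best)  # -> x**2
```

The point of the sketch is the filtering step: the prior prunes formulas like x**3 before fit quality is even compared, which is the behavior the abstract attributes to user-specified constraints.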
Forward citations
Cited by 4 Pith papers
- GESR: A Genetic Programming-Based Symbolic Regression Method with Gene Editing
  GESR uses two BERT models to intelligently guide mutations and crossovers in genetic programming for symbolic regression, claiming better efficiency than standard GP.
- GESR: A Genetic Programming-Based Symbolic Regression Method with Gene Editing
  GESR uses BERT models as guided "gene editors" within genetic programming to direct mutations and crossovers, yielding higher efficiency and competitive performance on symbolic regression benchmarks.
- GESR: A Genetic Programming-Based Symbolic Regression Method with Gene Editing
  GESR uses two BERT models to intelligently direct mutations and crossovers inside genetic programming, yielding higher efficiency and competitive accuracy on symbolic regression benchmarks.
- Leveraging Mathematical Reasoning of LLMs for Efficient GPU Thread Mapping
  Large language models derive exact analytical GPU thread mappings for complex 2D/3D domains and fractals via in-context learning, outperforming symbolic regression and enabling up to thousands-fold speedups and energy...