Semantic Step Prediction: Multi-Step Latent Forecasting in LLM Reasoning Trajectories via Step Sampling

Applying STP at consecutive semantic reasoning steps yields 168x more accurate multi-step latent prediction on ProcessBench than frozen baselines; the latent trajectories trace smooth curves that are best captured by non-linear predictors.
Ranganath Krishnan, Piyush Khanna, and Omesh Tickoo
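To make the headline claim concrete, here is a minimal sketch of step-level latent forecasting under stated assumptions: hidden states are read at semantic step boundaries, a small non-linear predictor maps each step's latent to the next and is rolled forward for multi-step forecasts, and a latent that never moves stands in for a frozen baseline. The dimensions, the MLP, and the synthetic smooth trajectories (in place of ProcessBench latents) are all illustrative assumptions, not the authors' code.

```python
# Sketch: non-linear one-step predictor in latent space, rolled out multi-step.
# All shapes and names are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn

torch.manual_seed(0)
D = 64  # latent dimension (assumed)

def synthetic_trajectories(n=256, steps=8):
    """Smooth latent curves standing in for step-boundary hidden states."""
    t = torch.linspace(0, 1, steps).view(1, steps, 1)
    a, b = torch.randn(n, 1, D), torch.randn(n, 1, D)
    return a * torch.sin(3 * t) + b * t**2       # (n, steps, D), smooth in t

class StepPredictor(nn.Module):
    """Small MLP mapping the latent at step t to the latent at step t+1."""
    def __init__(self, d):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d, 256), nn.GELU(), nn.Linear(256, d))
    def forward(self, z):
        return self.net(z)

traj = synthetic_trajectories()
x, y = traj[:, :-1].reshape(-1, D), traj[:, 1:].reshape(-1, D)

model = StepPredictor(D)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(500):                              # teacher-forced one-step training
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

# Multi-step rollout: feed predictions back in; compare against a stand-in
# "frozen" baseline that never updates the latent (my simplification).
with torch.no_grad():
    z = traj[:, 0]
    for k in range(1, traj.shape[1]):
        z = model(z)
        err_mlp = nn.functional.mse_loss(z, traj[:, k]).item()
        err_frozen = nn.functional.mse_loss(traj[:, 0], traj[:, k]).item()
        print(f"step {k}: mlp={err_mlp:.4f}  frozen={err_frozen:.4f}")
```

Swapping the MLP for a single nn.Linear is the quick way to probe the non-linearity claim: on curved trajectories, a linear rollout should degrade faster with each step.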
5 Pith papers cite this work. Polarity classification is still indexing.
Citation summary (role and polarity):
  years: 2026 (5)
  verdicts: UNVERDICTED (5)
  roles: background (1)
  polarities: background (1)
Citing papers
- Consistent Geometric Deep Learning via Hilbert Bundles and Cellular Sheaves. HilbNets discretize Hilbert bundle convolutions through Hilbert Cellular Sheaves whose Laplacians converge to the continuous connection Laplacian, enabling consistent learning across samplings (see the sheaf Laplacian sketch after this list).
- Representational Curvature Modulates Behavioral Uncertainty in Large Language Models. Contextual curvature of LLM representational trajectories correlates with, and causally modulates, next-token entropy (see the curvature sketch after this list).
- The Topological Trouble With Transformers. Transformers face a topological obstruction in dynamic state tracking: the feedforward architecture pushes evolving state representations into ever deeper layers until depth is exhausted, which argues for recurrent architectures with implicit activation dynamics.
- How Psychological Learning Paradigms Shaped and Constrained Artificial Intelligence. AI's compositional reasoning failures trace back to the psychological learning paradigms that shaped its architectures; the ReSynth trimodular framework is proposed to embed systematicity structurally.
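For the Hilbert-bundle entry, a minimal sketch of the discrete object in play: a cellular sheaf Laplacian with orthogonal restriction maps, i.e. a connection-style Laplacian on a small cycle graph. This illustrates the kind of operator whose continuum limit the summary refers to; the graph, stalk dimension, and restriction maps are assumptions, not HilbNets' construction.

```python
# Sketch: sheaf Laplacian L0 = delta^T delta for a cellular sheaf with
# d-dimensional stalks and orthogonal restriction maps (a discrete connection
# Laplacian). Illustrative only; not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3                                   # nodes, stalk dimension (assumed)
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]

def random_rotation(d):
    q, _ = np.linalg.qr(rng.normal(size=(d, d)))
    return q

# One restriction map F_{v <= e} per (edge, endpoint); orthogonal maps give a
# connection sheaf.
F = {(e, v): random_rotation(d) for e in range(len(edges)) for v in edges[e]}

# Coboundary delta: C^0 (node stalks) -> C^1 (edge stalks),
# (delta x)_e = F_{u<=e} x_u - F_{w<=e} x_w for oriented edge e = (u, w).
delta = np.zeros((len(edges) * d, n * d))
for e, (u, w) in enumerate(edges):
    delta[e*d:(e+1)*d, u*d:(u+1)*d] = F[(e, u)]
    delta[e*d:(e+1)*d, w*d:(w+1)*d] = -F[(e, w)]

L = delta.T @ delta                           # sheaf Laplacian on 0-cochains
evals = np.linalg.eigvalsh(L)
print("PSD:", bool(evals.min() > -1e-9), "smallest eigenvalues:", np.round(evals[:3], 4))
```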
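And for the curvature entry, a sketch of one plausible discrete curvature, the turning angle between successive latent deltas, set against next-token entropy. The metric choice and the synthetic data are my assumptions; actual use would take hidden states and logits from an LLM forward pass.

```python
# Sketch: discrete curvature of a hidden-state trajectory vs. next-token entropy.
# The turning-angle definition is an assumed stand-in, not the paper's metric.
import numpy as np

def turning_angles(h):
    """h: (T, D) hidden states along a context; returns (T-2,) angles in radians."""
    d = np.diff(h, axis=0)                        # successive latent deltas
    u = d / np.linalg.norm(d, axis=1, keepdims=True)
    cos = np.clip(np.sum(u[:-1] * u[1:], axis=1), -1.0, 1.0)
    return np.arccos(cos)

def token_entropy(logits):
    """logits: (T, V); Shannon entropy of the softmax distribution per position."""
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z)
    p /= p.sum(axis=1, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=1)

# Toy check on synthetic data; real use would read h and logits from a model.
rng = np.random.default_rng(0)
h = np.cumsum(rng.normal(size=(32, 16)), axis=0)  # random-walk "trajectory"
logits = rng.normal(size=(32, 100))
kappa, H = turning_angles(h), token_entropy(logits)
print(np.corrcoef(kappa, H[2:])[0, 1])            # correlate roughly aligned positions
```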