A mathematical theory of semantic development in deep neural networks
2 Pith papers cite this work (fields: cs.NE; years: 2026; verdicts: unverdicted). Polarity classification is still being indexed.
Citing papers explorer
- Unifying Dynamical Systems and Graph Theory to Mechanistically Understand Computation in Neural Networks
  RNN computation is recovered from multi-hop graph pathways, and constraining these pathways via resolvent regularization yields improved temporal sparsity and task performance over standard L1. A hedged sketch of such a pathway penalty follows this list.
- Attention to task structure for cognitive flexibility
  Task connectivity in graph-structured multi-task environments enhances generalization and stability, with stronger benefits for attention models than MLPs.
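The first summary contrasts a pathway-level penalty with standard L1. As a rough illustration of the idea, and not the cited paper's actual method: for |z| larger than the spectral radius of W, the resolvent (zI - W)^{-1} equals sum_k W^k / z^{k+1}, so each of its entries aggregates the discounted strength of all multi-hop pathways between a pair of units. Penalizing those entries therefore constrains pathways of every length at once, whereas L1 only shrinks individual 1-hop weights. A minimal NumPy sketch, with z and the toy weight matrix chosen purely for illustration:

```python
import numpy as np

def l1_penalty(W):
    # Standard L1: sums absolute values of individual weights,
    # i.e. it only "sees" direct 1-hop connections.
    return np.abs(W).sum()

def resolvent_penalty(W, z=2.0):
    # Hypothetical pathway penalty. For |z| above the spectral radius
    # of W, (zI - W)^{-1} = sum_k W^k / z^{k+1}, so each entry
    # aggregates the discounted strength of all multi-hop pathways
    # between a pair of units. Penalizing these entries constrains
    # pathways of every length, not just single weights.
    n = W.shape[0]
    R = np.linalg.inv(z * np.eye(n) - W)
    return np.abs(R).sum()

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((8, 8))   # toy recurrent weight matrix
print("L1:", l1_penalty(W))
print("resolvent:", resolvent_penalty(W))
```

In training, such a penalty would be added to the task loss like any other regularizer; the matrix inverse makes it costlier to evaluate than L1, but it remains differentiable in W.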