Teaching Arithmetic to Small Transformers
3 Pith papers cite this work.
Citation-role summary: unverdicted (3). Citation-polarity summary: background (1).
Citing papers:

- Globally Optimal Training of Spiking Neural Networks via Parameter Reconstruction
  A new parameter reconstruction method achieves globally optimal training for spiking neural networks by convexifying parallel recurrent threshold networks that include SNNs as a special case.
- From Implicit to Explicit: Token-Efficient Logical Supervision for Mathematical Reasoning in LLMs
  FSLR explicitly supervises the initial logical planning step in math problems, boosting LLM accuracy by 3-5% while using 80% fewer training tokens than standard CoT fine-tuning.
- LiveCodeBench: Holistic and Contamination Free Evaluation of Large Language Models for Code
  LiveCodeBench collects 400 recent contest problems to create a contamination-free benchmark evaluating LLMs on code generation and related capabilities like self-repair and execution.