DCT-SNN: Using DCT to Distribute Spatial Information over Time for Low-Latency Spiking Neural Networks
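The title suggests encoding a static image into a temporal stream by splitting its 2D DCT spectrum across timesteps, so spatial detail arrives progressively over time. Below is a minimal illustrative sketch of that idea, not the paper's actual method: the function name `dct_temporal_encode` and the diagonal frequency-band schedule are assumptions made for this example.

```python
# Illustrative sketch (not the DCT-SNN authors' code): encode a static image
# into T timesteps by reconstructing one DCT frequency band per timestep.
import numpy as np
from scipy.fft import dctn, idctn

def dct_temporal_encode(img: np.ndarray, T: int) -> np.ndarray:
    """Split the 2D DCT spectrum of `img` into T diagonal frequency bands
    and reconstruct one band per timestep (the banding rule is an assumption)."""
    h, w = img.shape
    coeffs = dctn(img, norm="ortho")                       # full 2D DCT-II spectrum
    u, v = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    band = ((u + v) * T // (h + w - 1)).clip(max=T - 1)    # diagonal band index per coeff
    frames = np.empty((T, h, w))
    for t in range(T):
        masked = np.where(band == t, coeffs, 0.0)          # keep only band t
        frames[t] = idctn(masked, norm="ortho")            # spatial content of band t
    return frames                                          # low frequencies arrive first

img = np.random.rand(32, 32)
frames = dct_temporal_encode(img, T=8)
assert np.allclose(frames.sum(0), img)  # bands partition the spectrum, so frames sum back to the image
```

Because the bands partition the spectrum and the inverse DCT is linear, summing all T frames exactly reconstructs the input, which the final assertion checks.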
1 Pith paper cites this work. Polarity classification is still indexing.

Fields: cs.NE
Years: 2026
Verdicts: UNVERDICTED
Representative citing papers: 1
Citing papers explorer

- Breaking Global Self-Attention Bottlenecks in Transformer-based Spiking Neural Networks with Local Structure-Aware Self-Attention
LSFormer uses local structure-aware spiking self-attention and spiking response pooling to cut global attention bottlenecks, delivering accuracy gains of 4.3% on Tiny-ImageNet and 8.6% on N-CALTECH101 over prior transformer-based SNNs.
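As a rough illustration of the local-window idea (not LSFormer's actual layer, and omitting spiking response pooling), the sketch below restricts spiking self-attention to non-overlapping windows over binary spike tensors, so cost scales with window size rather than full sequence length. All projections, the 0.5 firing threshold, and the window size are assumptions for this example.

```python
# Illustrative sketch: self-attention over binary spike tensors, computed
# inside non-overlapping local windows instead of over the whole sequence.
import torch

torch.manual_seed(0)

def heaviside(x: torch.Tensor) -> torch.Tensor:
    """Forward-pass spike function: emit 1 where the input exceeds 0."""
    return (x > 0).float()

def local_spiking_attention(spikes: torch.Tensor, w: int) -> torch.Tensor:
    """spikes: (B, N, D) binary inputs; w: window length dividing N.
    The Q/K/V projections and thresholds are illustrative choices."""
    B, N, D = spikes.shape
    Wq, Wk, Wv = (torch.randn(D, D) * D**-0.5 for _ in range(3))
    q = heaviside(spikes @ Wq)                 # spiking queries
    k = heaviside(spikes @ Wk)                 # spiking keys
    v = heaviside(spikes @ Wv)                 # spiking values
    # Fold the sequence into windows: (B, N//w, w, D)
    q, k, v = (t.view(B, N // w, w, D) for t in (q, k, v))
    attn = q @ k.transpose(-2, -1)             # spike-overlap scores, per window only
    out = attn @ v / w                         # aggregate values inside each window
    return heaviside(out - 0.5).view(B, N, D)  # re-binarize to spikes

x = (torch.rand(2, 64, 32) > 0.8).float()      # random input spike trains, ~20% firing
y = local_spiking_attention(x, w=8)
print(y.shape, y.unique())                     # torch.Size([2, 64, 32]) tensor([0., 1.])
```

Note the design point the summary alludes to: because queries only attend to keys in the same window of length w, the score matrix is (w x w) per window rather than (N x N) globally, which is the bottleneck the paper targets.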