Cumulative state updates in CMRU restore gradient flow through time in quantized bistable RNNs, yielding more stable convergence and competitive or superior performance versus LRUs and minGRUs on long-range sequence tasks.
Cited by 1 Pith paper (cs.LG, 2026; verdict unclassified). Citation role: background. Polarity classification is still indexing.

Representative citing paper:
Improving the Performance and Learning Stability of Parallelizable RNNs Designed for Ultra-Low Power Applications
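The summary's claim that cumulative state updates restore gradient flow can be illustrated with a toy calculation. This is not the paper's CMRU implementation; the gate values and both update forms below are illustrative assumptions. In a minGRU-style replacement update, the per-step Jacobian with respect to the previous state is `(1 - z_t)`, so gradients shrink multiplicatively over time; in an additive (cumulative) update, that Jacobian is the identity, so the gradient through the state path is preserved:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200                                   # sequence length
z = rng.uniform(0.1, 0.9, size=T)         # hypothetical gate activations per step

# Replacement-style update (minGRU-like): h_t = (1 - z_t) * h_{t-1} + z_t * h_cand_t
# dh_t/dh_{t-1} = (1 - z_t), so dh_T/dh_0 = prod_t (1 - z_t) -> vanishes for long T
grad_replacement = np.prod(1.0 - z)

# Cumulative update (illustrative): h_t = h_{t-1} + z_t * h_cand_t
# dh_t/dh_{t-1} = 1, so dh_T/dh_0 = 1 regardless of T -> gradient preserved
grad_cumulative = 1.0

print(grad_replacement)   # effectively zero at T = 200
print(grad_cumulative)    # exactly 1
```

Under these assumptions the replacement-style gradient underflows to essentially zero by T = 200 while the cumulative gradient stays at 1, which is consistent with the summary's "restore gradient flow through time" framing.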