SplitLoRA: Balancing stability and plasticity in continual learning through gradient space splitting. arXiv preprint arXiv:2505.22370.
1 Pith paper cites this work. Polarity classification is still indexing.
Fields: cs.LG. Years: 2026. Verdicts: UNVERDICTED (1).
Representative citing paper:
Hidden Failure Modes of Gradient Modification under Adam in Continual Learning, and Adaptive Decoupled Moment Routing as a Repair
Gradient modifications applied before Adam inflate effective learning rates along old-task directions through the second-moment term; routing the modification solely into the first moment, with adaptive strength, prevents this collapse and yields gains of 3.8-4.8 points over baselines on 8- and 16-domain continual learning.
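To make the claimed mechanism concrete, here is a minimal sketch of one Adam-style update with the moment routing the summary describes, written in PyTorch. Everything here is an illustrative assumption rather than the paper's exact algorithm: the names `moment_routed_adam_step` and `modified_grad` are hypothetical, the gradient modification itself (e.g. a projection removing old-task directions) is assumed to be computed elsewhere, and the scalar `strength` stands in for whatever adaptive rule the paper uses. The key point the sketch shows is that only the first moment sees the modified gradient, while the second moment accumulates the raw gradient.

```python
import torch

def moment_routed_adam_step(param, grad, modified_grad, state,
                            lr=1e-3, beta1=0.9, beta2=0.999,
                            eps=1e-8, strength=1.0):
    """One Adam-style step with decoupled moment routing (sketch).

    grad          -- raw task gradient
    modified_grad -- gradient after the continual-learning modification
                     (e.g. projected off old-task subspaces); assumed
                     computed elsewhere
    strength      -- scalar in [0, 1] blending raw and modified gradient;
                     a hypothetical stand-in for the adaptive strength rule
    """
    state["step"] += 1
    m, v, t = state["m"], state["v"], state["step"]

    # Route the modification into the FIRST moment only.
    routed = grad + strength * (modified_grad - grad)
    m.mul_(beta1).add_(routed, alpha=1 - beta1)

    # The SECOND moment sees the raw gradient. If it saw the modified
    # gradient instead, v would shrink along the removed (old-task)
    # directions, and 1/sqrt(v) would inflate the effective learning
    # rate exactly there -- the failure mode described above.
    v.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)

    m_hat = m / (1 - beta1 ** t)  # standard Adam bias correction
    v_hat = v / (1 - beta2 ** t)
    param.add_(m_hat / (v_hat.sqrt() + eps), alpha=-lr)


# Usage on a toy parameter vector:
w = torch.zeros(4)
state = {"m": torch.zeros_like(w), "v": torch.zeros_like(w), "step": 0}
g = torch.randn(4)
g_mod = g.clone()
g_mod[0] = 0.0  # toy "modification": zero out one direction
moment_routed_adam_step(w, g, g_mod, state)
```

Feeding the raw gradient to `v` keeps the denominator honest: directions the modification removes still accumulate curvature information, so the effective step size along old-task directions is not silently amplified.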