Detecting Invariant Manifolds in ReLU-Based RNNs
1 paper cites this work. Polarity classification is still indexing.
abstract
Recurrent Neural Networks (RNNs) have found widespread application in machine learning for time series prediction and dynamical systems reconstruction, and have experienced a recent renaissance through improved training algorithms and architectural designs. Understanding why and how trained RNNs produce their behavior is important for scientific and medical applications, and for explainable AI more generally. An RNN's dynamical repertoire depends on the topological and geometrical properties of its state space. Stable and unstable manifolds of periodic points play a particularly important role: they dissect a dynamical system's state space into different basins of attraction, and their intersections lead to chaotic dynamics with fractal geometry. Here we introduce a novel algorithm for detecting these manifolds, with a focus on piecewise-linear RNNs (PLRNNs) employing rectified linear units (ReLUs) as their activation function. We demonstrate how the algorithm can be used to trace the boundaries between different basins of attraction, and hence to characterize multistability, a computationally important property. We further show its utility in finding so-called homoclinic points, the intersections between stable and unstable manifolds, and thus establish the existence of chaos in PLRNNs. Finally, we show for an empirical example, electrophysiological recordings from a cortical neuron, how insights into the underlying dynamics can be gained through our method.
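The piecewise-linear structure the abstract exploits can be sketched concretely. In each ReLU sign region the PLRNN map is affine, so region-wise fixed points and their stable/unstable eigendirections (the local linearizations of the invariant manifolds) can be computed in closed form. The update form z_{t+1} = A z_t + W relu(z_t) + h and the region-wise fixed-point formula follow the standard PLRNN literature; the matrices, the brute-force region enumeration, and the function names below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

# PLRNN update: z_{t+1} = A z_t + W relu(z_t) + h
def plrnn_step(z, A, W, h):
    return A @ z + W @ np.maximum(z, 0.0) + h

# Within a fixed ReLU sign pattern (mask[i] = 1 iff z_i > 0), the map is
# affine: z_{t+1} = (A + W D) z + h with D = diag(mask). A candidate
# fixed point solves (I - A - W D) z* = h; it is a genuine fixed point
# only if z* actually lies in that region ("consistency").
def region_fixed_point(mask, A, W, h):
    D = np.diag(mask.astype(float))
    J = A + W @ D                          # Jacobian of the affine piece
    z_star = np.linalg.solve(np.eye(len(h)) - J, h)
    consistent = np.all((z_star > 0) == mask.astype(bool))
    return z_star, J, consistent

rng = np.random.default_rng(0)
n = 3
A = 0.5 * np.eye(n)                        # illustrative parameters
W = 0.1 * rng.standard_normal((n, n))
h = rng.standard_normal(n)

# Enumerate all 2^n ReLU configurations (feasible only for small n).
for bits in range(2 ** n):
    mask = np.array([(bits >> i) & 1 for i in range(n)])
    z_star, J, ok = region_fixed_point(mask, A, W, h)
    if ok:
        eigvals, eigvecs = np.linalg.eig(J)
        # Eigenvectors with |eigenvalue| < 1 span the local stable
        # manifold at z_star; the others span the unstable one.
        stable = np.abs(eigvals) < 1
        print(mask, np.round(z_star, 3), stable)
```

Tracing a manifold then amounts to propagating points along these eigendirections forward or backward under the piecewise-affine map, switching the region Jacobian whenever a ReLU boundary is crossed.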
fields: cs.LG (1)
years: 2026 (1)
verdicts: UNVERDICTED (1)
representative citing papers
citing papers explorer
- Teacher Forcing as Generalized Bayes: Optimization Geometry Mismatch in Switching Surrogates for Chaotic Dynamics
ITF inflates curvature in switching AL-RNNs by conditioning on one regime path while marginal likelihood reduces curvature with a missing-information correction for plausible switches, and evidence fine-tuning can degrade dynamical QoIs despite better held-out evidence.