Residual Reservoir Memory Networks
We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). A ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter is based on residual orthogonal connections along the temporal dimension for enhanced long-term propagation of the input. The resulting reservoir state dynamics are studied through the lens of linear stability analysis, and we investigate diverse configurations for the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over conventional RC models.
💡 Research Summary
The paper introduces Residual Reservoir Memory Networks (ResRMNs), a novel class of untrained recurrent neural networks within the Reservoir Computing (RC) paradigm that explicitly tackles the long-term dependency problem. A ResRMN consists of two hierarchically coupled reservoirs:

1. A **linear memory reservoir** with state update m(t) = V_m m(t-1) + V_x x(t). V_m is initialized as a cyclic orthogonal matrix, guaranteeing a spectral radius of one and a uniform eigenvalue distribution on the unit circle, which endows the module with strong memory capacity.
2. A **non-linear reservoir** implemented as a Residual Echo State Network (ResESN), with update rule h(t) = α O h(t-1) + β tanh(W_h h(t-1) + W_m m(t) + W_x x(t) + b_h), where O is an orthogonal matrix that provides "temporal residual connections". Three configurations of O are explored: a random orthogonal matrix (O_R), a cyclic orthogonal matrix (O_C) that mirrors the linear module's structure, and the identity matrix (O_I). The scaling coefficients α and β balance the residual (linear) and non-linear pathways.
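The two update rules above can be sketched in NumPy as follows. This is a minimal illustration, not the authors' implementation: all dimensions, input-weight scalings, and the values of α and β are assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not the paper's settings)
n_in, n_mem, n_res = 1, 50, 100

def cyclic_orthogonal(n):
    """Cyclic permutation matrix: orthogonal, spectral radius exactly 1,
    with eigenvalues (roots of unity) uniformly spread on the unit circle."""
    P = np.zeros((n, n))
    P[np.arange(n), (np.arange(n) + 1) % n] = 1.0
    return P

def random_orthogonal(n, rng):
    """Random orthogonal matrix via QR decomposition (the O_R configuration)."""
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return Q

# Linear memory reservoir: m(t) = V_m m(t-1) + V_x x(t)
V_m = cyclic_orthogonal(n_mem)
V_x = rng.uniform(-1, 1, (n_mem, n_in))

# Non-linear residual reservoir (ResESN):
# h(t) = alpha * O h(t-1) + beta * tanh(W_h h(t-1) + W_m m(t) + W_x x(t) + b_h)
alpha, beta = 0.9, 0.5                 # assumed branch-scaling values
O = random_orthogonal(n_res, rng)      # O_C (cyclic) or O_I (identity) also possible
W_h = 0.1 * rng.uniform(-1, 1, (n_res, n_res))
W_m = 0.1 * rng.uniform(-1, 1, (n_res, n_mem))
W_x = rng.uniform(-1, 1, (n_res, n_in))
b_h = 0.1 * rng.uniform(-1, 1, n_res)

def step(m, h, x):
    """One ResRMN state update: memory module first, then the ResESN."""
    m_new = V_m @ m + V_x @ x
    h_new = alpha * (O @ h) + beta * np.tanh(W_h @ h + W_m @ m_new + W_x @ x + b_h)
    return m_new, h_new

# Drive the untrained reservoirs with a short random input sequence
m, h = np.zeros(n_mem), np.zeros(n_res)
for x in rng.standard_normal((20, n_in)):
    m, h = step(m, h, x)
```

In the RC paradigm only a linear readout on the collected states h(t) (and possibly m(t)) would be trained; the matrices above stay fixed after initialization.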
The authors perform a rigorous linear stability analysis by stacking the two state vectors into a global vector H(t) = [m(t)ᵀ, h(t)ᵀ]ᵀ and studying the spectrum of the resulting linearized state-update map.
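The stacked-state view can be illustrated by forming the Jacobian of the coupled update at the origin, where tanh′(0) = 1. Because the memory module does not depend on h, the Jacobian is block triangular, so its spectrum is the union of the spectra of V_m and of αO + βW_h. The sketch below is an assumption-laden illustration (sizes and scalings are made up), not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_mem, n_res = 20, 40          # small illustrative sizes
alpha, beta = 0.9, 0.5         # assumed scaling coefficients

# Example matrices playing the same roles as in the update rules
V_m = np.roll(np.eye(n_mem), 1, axis=1)                   # cyclic orthogonal
O, _ = np.linalg.qr(rng.standard_normal((n_res, n_res)))  # random orthogonal
W_h = 0.1 * rng.uniform(-1, 1, (n_res, n_res))
W_m = 0.1 * rng.uniform(-1, 1, (n_res, n_mem))

# Jacobian of the stacked state H(t) = [m(t); h(t)] at the origin:
#   d m(t)/d m(t-1) = V_m                  d m(t)/d h(t-1) = 0
#   d h(t)/d m(t-1) = beta * W_m @ V_m     d h(t)/d h(t-1) = alpha*O + beta*W_h
J = np.block([
    [V_m,               np.zeros((n_mem, n_res))],
    [beta * W_m @ V_m,  alpha * O + beta * W_h],
])

# Spectral radius of the linearized dynamics; the orthogonal memory block
# pins it at >= 1, placing the system at the edge of stability.
rho = np.max(np.abs(np.linalg.eigvals(J)))
print(f"spectral radius of linearized ResRMN: {rho:.3f}")
```

Since V_m is orthogonal, its eigenvalues have modulus exactly one, so the stacked system inherits a spectral radius of at least one regardless of how the non-linear block is scaled.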