📝 Original Info
- Title: CogniSNN: Enabling Neuron-Expandability, Pathway-Reusability, and Dynamic-Configurability with Random Graph Architectures in Spiking Neural Networks
- ArXiv ID: 2512.11743
- Date: 2025-12-12
- Authors: Yongsheng Huang, Peibo Duan, Yujie Wu, Kai Sun, Zhipeng Liu, Changsheng Zhang, Bin Zhang, Mingkun Xu
📄 Full Content
CogniSNN: Enabling Neuron-Expandability, Pathway-Reusability, and Dynamic-Configurability with Random Graph Architectures in Spiking Neural Networks
Yongsheng Huang (a,b), Peibo Duan (a,*), Yujie Wu (c), Kai Sun (d), Zhipeng Liu (a), Changsheng Zhang (a), Bin Zhang (a), and Mingkun Xu (b,*)
(a) School of Software, Northeastern University, Shenyang, 110000, China
(b) Guangdong Institute of Intelligence Science and Technology, Zhuhai, 519000, China
(c) Department of Computing, The Hong Kong Polytechnic University, Hong Kong, China
(d) Department of Data Science and AI, Monash University, Melbourne, 3000, Australia
ARTICLE INFO
Keywords: Spiking neural networks; Spiking residual learning; Random graph theory; Robustness; Neuromorphic object recognition
ABSTRACT

Spiking neural networks (SNNs), regarded as the third generation of artificial neural networks, are expected to bridge the gap between artificial intelligence and computational neuroscience. However, most mainstream SNN research directly adopts the rigid, chain-like hierarchical architecture of traditional artificial neural networks (ANNs), ignoring key structural characteristics of the brain. Biological neurons are stochastically interconnected, forming complex neural pathways that exhibit Neuron-Expandability, Pathway-Reusability, and Dynamic-Configurability. In this paper, we introduce a new SNN paradigm, named Cognition-aware SNN (CogniSNN), by incorporating Random Graph Architecture (RGA). Furthermore, we address the issues of network degradation and dimensional mismatch in deep pathways by introducing an improved pure spiking residual mechanism alongside an adaptive pooling strategy. Then, we design a Key Pathway-based Learning without Forgetting (KP-LwF) approach, which selectively reuses critical neural pathways while retaining historical knowledge, enabling efficient multi-task transfer. Finally, we propose a Dynamic Growth Learning (DGL) algorithm that allows neurons and synapses to grow dynamically along the internal temporal dimension. Extensive experiments demonstrate that CogniSNN achieves performance comparable to, or even surpassing, current state-of-the-art SNNs on neuromorphic datasets and Tiny-ImageNet. The Pathway-Reusability enhances the network's continuous learning capability across different scenarios, while the dynamic growth algorithm improves robustness against interference and mitigates the fixed-timestep constraints during neuromorphic chip deployment. This work demonstrates the potential of SNNs with random graph structures in advancing brain-inspired intelligence and lays the foundation for their practical application on neuromorphic hardware. The code is available at https://github.com/Yongsheng124/CogniSNN.
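The abstract's "pure spiking residual mechanism" targets a known difficulty: a naive float addition in a residual shortcut breaks the binary spike format. The paper's specific mechanism is not reproduced here; as background only, a minimal leaky integrate-and-fire (LIF) neuron and a spike-element-wise residual merge (in the style of prior SEW spiking ResNets) can be sketched as below. The values of `tau` and `v_th` and the OR-style merge are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

def lif_forward(x_seq, tau=2.0, v_th=1.0):
    """Leaky integrate-and-fire neuron unrolled over T timesteps.

    x_seq: array of shape (T, N) holding input currents.
    Returns a (T, N) array of binary spikes.
    """
    T, N = x_seq.shape
    v = np.zeros(N)                            # membrane potential
    spikes = np.zeros((T, N))
    for t in range(T):
        v = v + (x_seq[t] - v) / tau           # leaky integration
        s = (v >= v_th).astype(float)          # fire on threshold crossing
        v = v * (1.0 - s)                      # hard reset where a spike fired
        spikes[t] = s
    return spikes

def sew_residual(branch_spikes, shortcut_spikes):
    # Spike-element-wise residual: OR-combine the branch output with the
    # shortcut so the result stays binary (no floating-point accumulation).
    return np.maximum(branch_spikes, shortcut_spikes)
```

Because the merge stays in {0, 1}, the residual path remains a valid spike train that downstream spiking layers can consume directly.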
1. Introduction

Originally envisioned to simulate biological firing processes, Spiking Neural Networks (SNNs), owing to their event-driven nature, ultra-low energy consumption, and rich spatio-temporal dynamics, have garnered significant attention in recent years (Wu et al., 2022). However, in a relentless pursuit of performance metrics (Deng et al., 2020b), exemplified by direct training approaches (Wu et al., 2019; Zheng et al., 2021), current mainstream SNNs predominantly adopt architectures derived from traditional Artificial Neural Networks (ANNs), such as Spiking ResNet (Hu et al., 2021), Spiking Transformer (Lu et al., 2025), and Spiking Mamba (Li et al., 2024), thereby increasingly deviating from their brain-inspired origins. While these equivalents often achieve performance parity with ANNs in static tasks (He et al., 2020), they fall short of the expectations placed on SNNs as the intersection of computational neuroscience and artificial intelligence on the path toward Artificial General Intelligence (AGI) (Deng et al., 2020a; Xu et al., 2023).
∗Corresponding author.
duanpeibo@swc.neu.edu.cn (P. Duan); xumingkun@gdiist.cn (M. Xu)
ORCID(s): 0009-0001-6620-4343 (Y. Huang)
Specifically, existing architectures excel at single-task processing, fueling applications such as autonomous driving and ChatGPT, but struggle significantly in real-world, multi-task scenarios (Tyagi and Rekha, 2020). They suffer from catastrophic forgetting and exhibit weak robustness against interference, capabilities in which the biological brain far surpasses artificial systems. These challenges necessitate revisiting the structural principles of the brain to explore novel paradigms for SNN design.
The brain consists of a vast number of neurons with stochastic connections, and this connectivity can be abstracted as a Random Graph Architecture (RGA) with small-world properties (Bullmore and Sporns, 2009). However, most models employ rigid, chain-like hierarchical architectures (Xie et al., 2019), which fail to reflect the complex topology of biological networks. While prior works (Xie et al., 2019; Yan et al., 2024) have utilized random graphs for Network Architecture Search (NAS), they primarily view the random structure as a se
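The RandWire-style use of random graphs cited above (Xie et al., 2019) maps a small-world graph to a feed-forward architecture by orienting each edge from its lower- to its higher-indexed node, which guarantees an acyclic wiring. A minimal sketch of that pipeline follows, with a hand-rolled Watts-Strogatz generator; the parameters `n`, `k`, `p` and the simplified rewiring are illustrative assumptions, not the paper's construction.

```python
import random

def watts_strogatz_edges(n, k, p, seed=0):
    """Ring lattice of n nodes, each tied to its k nearest neighbours;
    every edge is then rewired to a random endpoint with probability p."""
    rng = random.Random(seed)
    edges = set()
    for u in range(n):
        for j in range(1, k // 2 + 1):
            v = (u + j) % n
            edges.add((min(u, v), max(u, v)))
    rewired = set()
    for u, v in sorted(edges):
        if rng.random() < p:
            w = rng.randrange(n)
            # Accept the rewire only if it creates no self-loop or duplicate.
            if w != u and (min(u, w), max(u, w)) not in rewired:
                v = w
        rewired.add((min(u, v), max(u, v)))
    return rewired

def to_dag(edges):
    # Orient each undirected edge from the lower to the higher node index,
    # yielding an acyclic graph readable as a feed-forward wiring
    # (the RandWire-style mapping from random graph to architecture).
    return sorted(edges)
```

Node 0 then acts as the input stage and the highest-indexed node as the output stage, with intermediate nodes aggregating whatever spike trains their in-edges deliver.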