All in one timestep: Enhancing Sparsity and Energy efficiency in Multi-level Spiking Neural Networks

Reading time: 2 minutes

📝 Original Info

  • Title: All in one timestep: Enhancing Sparsity and Energy efficiency in Multi-level Spiking Neural Networks
  • ArXiv ID: 2510.24637
  • Date: 2025-10-28
  • Authors: Not listed in the provided paper metadata.

📝 Abstract

Spiking Neural Networks (SNNs) are among the most promising bio-inspired neural network models and have drawn increasing attention in recent years. The event-driven communication mechanism of SNNs allows for sparse and theoretically low-power operations on dedicated neuromorphic hardware. However, the binary nature of instantaneous spikes also leads to considerable information loss in SNNs, resulting in accuracy degradation. To address this issue, we propose a multi-level spiking neuron model that provides both low quantization error and minimal inference latency while approaching the performance of full-precision Artificial Neural Networks (ANNs). Experimental results with popular network architectures and datasets show that multi-level spiking neurons provide better information compression, therefore allowing a reduction in latency without performance loss. Compared to binary SNNs in image classification scenarios, multi-level SNNs reduce energy consumption by a factor of 2 to 3, depending on the number of quantization intervals. On neuromorphic data, our approach drastically reduces the inference latency to a single timestep, which corresponds to a compression factor of 10 compared to previously published results. At the architectural level, we propose a new residual architecture that we call Sparse-ResNet. Through a careful analysis of spike propagation in residual connections, we highlight a spike avalanche effect that affects most spiking residual architectures. With our Sparse-ResNet architecture, we achieve state-of-the-art accuracy in image classification while reducing network activity by more than 20% compared to previous spiking ResNets.
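
The abstract gives no implementation details, so the following is a minimal, hypothetical sketch of the core idea: a leaky integrate-and-fire neuron that emits one of several discrete spike magnitudes per timestep instead of a binary spike, lowering quantization error per timestep. All names, the quantization scheme, and the soft-reset rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def multi_level_lif_step(v, x, threshold=1.0, n_levels=4, leak=0.9):
    """One timestep of a hypothetical multi-level LIF neuron.

    Instead of a binary spike, the neuron emits one of n_levels
    discrete spike magnitudes, so more information is carried per
    timestep. Parameter names and the rounding scheme are assumptions
    made for illustration, not the paper's actual model.
    """
    v = leak * v + x                     # leaky integration of the input current
    # Quantize the membrane potential into discrete spike levels,
    # from 0 (no spike) up to n_levels (saturated spike).
    level = np.clip(np.floor(v / threshold), 0, n_levels)
    spike = level * threshold            # graded spike amplitude
    v = v - spike                        # soft reset: subtract what was emitted
    return v, spike

# Example: a single neuron driven by a constant input over a few timesteps.
v = 0.0
for t in range(5):
    v, s = multi_level_lif_step(v, x=1.7)
    print(f"t={t}  spike={s:.1f}  v={v:.2f}")
```

Note that setting n_levels = 1 in this sketch recovers a standard binary LIF neuron, which makes the trade-off described in the abstract concrete: more quantization intervals carry more information per spike, at a per-spike energy cost the paper reports as still 2 to 3 times cheaper overall than the binary baseline.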

💡 Deep Analysis

[Figure 1]

📄 Full Content

📸 Image Gallery

  • cifar_10_top1.png
  • graded_neuron.drawio.png
  • gradients_resnets.png
  • multi_level_quantization.png
  • resnet18_accuracy_sparsity.drawio.png
  • sparsity_resnets.png
  • spike_avalanche-Page-1.drawio.png
  • spiking_resnets.drawio.png
  • surrogate_function.png
  • vgg16_activity_energy.drawio.png
  • vloss_resnets.png

Reference

This content is AI-processed based on open access ArXiv data.
