Academic Paper

Are SNNs Really More Energy-Efficient Than ANNs? an In-Depth Hardware-Aware Study
Document Type
Periodical
Source
IEEE Transactions on Emerging Topics in Computational Intelligence, 7(3):731-741, Jun. 2023
Subject
Computing and Processing
Neurons
Synapses
Energy consumption
Computer architecture
Hardware
Computational modeling
Membrane potentials
Spiking neural networks
neuromorphic hardware
deep neural network accelerators
energy efficiency
Language
English
ISSN
2471-285X
Abstract
Spiking Neural Networks (SNNs) hold the promise of lower energy consumption in embedded hardware than traditional Artificial Neural Networks (ANNs), thanks to their spike-based computations. However, the relative energy efficiency of this emerging technology compared to traditional digital hardware has not been fully explored: many studies neglect memory accesses, which account for a significant fraction of the energy consumption, rely on naive ANN hardware implementations, or lack generality. In this paper, we compare the relative energy efficiency of classical digital implementations of ANNs with novel event-based SNN implementations based on variants of the Integrate and Fire (IF) model. We provide a theoretical upper bound on the relative energy efficiency of ANNs by computing the maximum possible benefit from ANN data reuse and sparsity, and we use the Eyeriss ANN accelerator as a case study. We show that the simpler IF model is more energy-efficient than the Leaky IF and temporal continuous synapse models. Moreover, SNNs with the IF model can compete with efficient ANN implementations only under very high spike sparsity, i.e., between 0.15 and 1.38 spikes per synapse per inference, depending on the ANN implementation. Our analysis shows that hybrid ANN-SNN architectures, leveraging an SNN event-based approach in layers with high sparsity and ANN parallel processing for the others, are a promising new path for further energy savings.
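The abstract's break-even argument can be illustrated with a back-of-the-envelope energy model: a classical ANN pays one multiply-accumulate (MAC) per synapse per inference, while an event-based IF-model SNN pays one synaptic operation per spike. The sketch below is a minimal illustration of that reasoning; all function names and energy constants are hypothetical placeholders, not figures from the paper, and memory-access energy is modeled only as an optional additive term.

```python
# Hypothetical back-of-the-envelope ANN vs. event-based SNN energy comparison.
# All constants are illustrative assumptions, not measurements from the paper.

def ann_energy(num_macs, e_mac=1.0, mem_accesses=0, e_mem=5.0):
    """Classical ANN inference: one MAC per synapse, plus memory traffic."""
    return num_macs * e_mac + mem_accesses * e_mem

def snn_if_energy(num_synapses, spikes_per_synapse, e_synop=1.0,
                  mem_accesses=0, e_mem=5.0):
    """Event-based IF-model SNN inference: work scales with spike count,
    not with the total number of synapses."""
    return num_synapses * spikes_per_synapse * e_synop + mem_accesses * e_mem

# A layer with 1e6 synapses: the ANN executes one MAC per synapse, while the
# SNN only pays for synaptic operations actually triggered by spikes.
synapses = 1_000_000
dense = ann_energy(num_macs=synapses)
sparse = snn_if_energy(num_synapses=synapses, spikes_per_synapse=0.15)

# At well under one spike per synapse per inference, the SNN does less work.
assert sparse < dense
```

Under these toy assumptions, the SNN wins whenever spikes per synapse per inference fall below the ratio of MAC to synaptic-operation energy, which mirrors the paper's 0.15-1.38 break-even range across ANN implementations.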