Academic Article

Improving the Robustness of Neural Networks to Noisy Multi-Level Non-Volatile Memory-based Synapses
Document Type
Conference
Source
2023 International Joint Conference on Neural Networks (IJCNN), pp. 1-8, Jun. 2023
Subject
Components, Circuits, Devices and Systems
Computing and Processing
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Training
Nonvolatile memory
Neuromorphics
Error analysis
Artificial neural networks
Robustness
Hardware
neuromorphic computing
non-volatile memories
robustness
fault tolerance
spiking neural networks
convolutional neural networks
recurrent neural networks
Language
English
ISSN
2161-4407
Abstract
The implementation of Artificial Neural Networks (ANNs) using analog Non-Volatile Memories (NVMs) for synaptic weight storage promises improved energy efficiency and higher density compared to fully-digital implementations. However, NVMs are prone to variability, which degrades ANN accuracy. In this paper, a general methodology to evaluate and enhance the accuracy of neural networks implemented with non-ideal multi-level NVMs is presented. A hardware fault model that captures NVM variability by distinguishing two types of errors, static and dynamic, is proposed. Considering various neural networks, it is shown that error-aware training greatly increases robustness to errors compared to standard, error-agnostic training. Moreover, Recurrent NNs (RNNs) and Spiking NNs (SNNs) are found to be inherently more robust to dynamic errors than Convolutional NNs (CNNs). In addition, new insights on the adaptability of neural networks to noisy multi-level NVMs are presented, which could further improve their robustness in this context. The methodology aims to provide tools for hardware-software co-design, paving the way for a broader use of multi-level NVM-based synapses.
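To make the two-error fault model concrete, below is a minimal PyTorch sketch (not the authors' code) of a linear layer whose weights are stored on noisy multi-level NVM cells. The level count (N_LEVELS) and noise scales (SIGMA_STATIC, SIGMA_DYNAMIC) are illustrative assumptions: the static error is sampled once at construction, modeling programming variability, while the dynamic error is redrawn at every forward pass, modeling read noise. Training a network with such a layer in the loop is one plausible form of the error-aware training the abstract describes.

```python
# Minimal sketch of the static/dynamic NVM fault model (illustrative, not
# the paper's implementation). All constants below are assumed values.
import torch

N_LEVELS = 16          # assumed number of NVM conductance levels
SIGMA_STATIC = 0.02    # assumed std. dev. of programming (static) error
SIGMA_DYNAMIC = 0.01   # assumed std. dev. of read (dynamic) error

def quantize(w: torch.Tensor) -> torch.Tensor:
    """Map weights onto N_LEVELS evenly spaced conductance levels."""
    w_max = w.abs().max().clamp(min=1e-8)
    step = 2 * w_max / (N_LEVELS - 1)
    return torch.round(w / step) * step

class NoisyNVMLinear(torch.nn.Linear):
    """Linear layer with weights stored on noisy multi-level NVM cells."""
    def __init__(self, in_features: int, out_features: int):
        super().__init__(in_features, out_features)
        # Static error: sampled once and frozen, modeling device-to-device
        # variability introduced when the weights are programmed.
        self.register_buffer(
            "static_err",
            SIGMA_STATIC * torch.randn(out_features, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w_q = quantize(self.weight)
        # Dynamic error: resampled on every read, modeling read noise.
        dynamic_err = SIGMA_DYNAMIC * torch.randn_like(w_q)
        w_noisy = w_q + self.static_err + dynamic_err
        # Straight-through estimator: gradients flow to the ideal weights,
        # so standard training becomes error-aware, favoring weights that
        # stay accurate under quantization and both noise sources.
        w_eff = self.weight + (w_noisy - self.weight).detach()
        return torch.nn.functional.linear(x, w_eff, self.bias)
```

Evaluating the same trained weights with the noise terms switched on or off would then separate the accuracy loss caused by static programming errors from that caused by dynamic read noise, in the spirit of the evaluation methodology the abstract outlines.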