Academic Paper

Subthreshold operation of SONOS analog memory to enable accurate low-power neural network inference
Document Type
Conference
Source
2022 International Electron Devices Meeting (IEDM), pp. 21.7.1-21.7.4, Dec. 2022
Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Fields, Waves and Electromagnetics
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
SONOS devices
Machine learning algorithms
Neural networks
Life estimation
In-memory computing
Inference algorithms
Energy efficiency
Language
English
ISSN
2156-017X
Abstract
Hardware accelerators that exploit analog in-memory computing offer an energy-efficient edge deployment solution for machine learning algorithms. We give an overview of the device requirements and hardware-software co-design principles for these systems to achieve efficient and accurate deep neural network (DNN) inference. We designed and fabricated a 40nm test chip with a $1024 \times 1024$ SONOS (silicon-oxide-nitride-oxide-silicon) charge trapping memory array for DNN inference. Operating the SONOS memory in the subthreshold regime suppresses the effects of device variability on algorithm accuracy. We experimentally demonstrate accurate DNN inference using the test chip on CIFAR-100 image classification and project a chip-level efficiency of >50 TOPS/W for the SONOS inference accelerator, a $10 \times$ advantage over state-of-the-art digital inference accelerators.
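The core operation behind such analog in-memory accelerators is a matrix-vector multiply performed inside the memory array: each cell's programmed state sets a read current, and bitline currents sum the per-row contributions. The Python sketch below is purely illustrative and not the paper's implementation; the parameter values (I0, n, VREAD, I_unit) and the non-negative weight encoding are assumptions chosen for a minimal example of subthreshold current summation.

    import numpy as np

    # Illustrative sketch only (assumed parameters, not from the paper):
    # each SONOS cell's programmed threshold voltage Vth sets a subthreshold
    # read current, and the bitline sums those currents to approximate a
    # dot product between a binary input vector and a weight column.

    I0 = 1e-7       # assumed subthreshold pre-factor current (A)
    n = 1.5         # assumed subthreshold slope factor
    VT = 0.0258     # thermal voltage at room temperature (V)
    VREAD = 0.0     # assumed gate read bias (V)

    def cell_current(vth):
        """Subthreshold drain current I = I0 * exp((VREAD - vth) / (n * VT))."""
        return I0 * np.exp((VREAD - vth) / (n * VT))

    rng = np.random.default_rng(0)
    rows, cols = 8, 4
    # Non-negative target weights; signed weights would need differential cells.
    w_target = rng.uniform(0.1, 1.0, size=(rows, cols))

    # Program each cell's Vth so its read current equals w * I_unit.
    I_unit = 1e-9   # assumed read current per unit weight (A)
    vth = VREAD - n * VT * np.log(w_target * I_unit / I0)

    # Binary input activations gate which wordlines (rows) contribute.
    x = rng.integers(0, 2, size=rows)

    # Bitline current = Kirchhoff sum of activated cell currents per column.
    i_bitline = (x[:, None] * cell_current(vth)).sum(axis=0)

    print(i_bitline / I_unit)   # analog column sums, in weight units
    print(x @ w_target)         # ideal digital dot product x^T W

Because the programmed current, not the threshold voltage itself, encodes the weight, the exponential subthreshold characteristic maps cell-to-cell Vth variation into a relative current error, which is the kind of variability effect the paper reports suppressing at the algorithm level.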