Academic Paper

Low-current, highly linear synaptic memory device based on MoS2 transistors for online training and inference
Document Type
Conference
Source
2022 IEEE 4th International Conference on Artificial Intelligence Circuits and Systems (AICAS), pp. 1-4, Jun. 2022
Subject
Components, Circuits, Devices and Systems
Training
Backpropagation
Performance evaluation
MOSFET
Linearity
Logic gates
Sulfur
MoS2
synaptic device
in-memory computing
neural networks
deep learning
Language
English
Abstract
In-memory computing (IMC) is attracting strong interest for hardware accelerators of neural networks in artificial intelligence (AI) applications. To this end, high-density memory arrays are used as artificial synaptic arrays, storing the weights of the neural network and performing the matrix-vector multiplication (MVM) involved in network operation. Within these implementations, in-situ update of the weights can be achieved during network training, thus avoiding power-hungry data movement. For training applications, a key requirement for synaptic devices is the capability to operate at low current, to avoid a large area of the peripheral circuitry and excessive current-resistance (IR) drop. High linearity of the weight update is also necessary to accelerate the outer product used in online training by backpropagation. To meet all these demands, in this work we present a novel synaptic memory device based on interface-state trapping in MOS transistors with a 2D MoS2 channel. By operating the device in the deep subthreshold regime, a very low (few nS) conductance that can be updated linearly with pulses of equal amplitude is demonstrated. Simulations of neural network training show an accuracy of 96.8% on MNIST, close to the floating-point accuracy of 97.8%.
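
The following is a minimal NumPy sketch of the training scheme the abstract describes: weights stored as synaptic conductances, the forward pass computed as an analog MVM, and the backpropagation outer product applied as equal-amplitude programming pulses. All names and numeric parameters here (G_MIN, G_MAX, N_LEVELS, the learning rate) are illustrative assumptions for the sketch, not values reported in the paper.

    import numpy as np

    # Assumed device parameters (illustrative only, not from the paper).
    G_MIN, G_MAX = 0.5e-9, 5.0e-9    # conductance window, a few nS as in the abstract
    N_LEVELS = 256                   # assumed number of programmable levels
    DG = (G_MAX - G_MIN) / N_LEVELS  # conductance step per identical pulse (linear update)

    rng = np.random.default_rng(0)
    G = rng.uniform(G_MIN, G_MAX, size=(10, 784))  # one conductance per synapse (10x784 layer)

    def mvm(G, v):
        # Analog MVM: each output current is a sum of conductance * input voltage.
        return G @ v

    def update(G, delta_w):
        # Linear in-situ update: map the desired weight change to an integer
        # count of equal-amplitude pulses, each shifting the conductance by DG.
        pulses = np.round(delta_w / DG)
        return np.clip(G + pulses * DG, G_MIN, G_MAX)

    # One illustrative training step.
    x = rng.uniform(0.0, 1.0, size=784)   # input voltages
    i_out = mvm(G, x)                     # forward pass as output currents
    err = rng.standard_normal(10)         # placeholder backpropagated error
    lr = 1e-12                            # assumed learning rate (siemens per unit gradient)
    G = update(G, lr * np.outer(err, x))  # outer-product weight update

Because every pulse moves the conductance by the same step DG, the applied weight change is proportional to the pulse count; this is the linearity property the abstract identifies as the requirement for accelerating the outer-product update in online backpropagation training.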