Academic Paper

A 44.2-TOPS/W CNN Processor With Variation-Tolerant Analog Datapath and Variation Compensating Circuit
Document Type
Periodical
Author
Source
IEEE Journal of Solid-State Circuits, 59(5):1603-1611, May 2024
Subject
Components, Circuits, Devices and Systems
Engineered Materials, Dielectrics and Plasmas
Computing and Processing
Convolutional neural networks
Energy efficiency
Data conversion
Capacitors
Analog circuits
Simulation
Random access memory
Analog computing
convolutional neural network (CNN)
hardware accelerators
machine learning
multiply-and-accumulate (MAC)
Language
ISSN
0018-9200
1558-173X
Abstract
Convolutional neural network (CNN) processors that exploit analog computing for high energy efficiency suffer from two major issues. First, frequent data conversions between layers limit energy efficiency. Second, computing errors arise in the analog circuits, which are vulnerable to process, voltage, and temperature (PVT) variations. In this article, a CNN processor featuring a variation-tolerant analog datapath with analog memory (AMEM) is proposed so that data conversion is not needed. To minimize computing errors, both the AMEM and the ANU are designed such that their performance is not affected by PVT variations. In addition, a variation compensating circuit is proposed. A prototype implemented in 28-nm complementary metal-oxide-semiconductor (CMOS) technology achieves an energy efficiency of 437.9 TOPS/W in the analog datapath and 44.2 TOPS/W for the total system, and maintains its classification accuracy to within 0.5%p across variations of ±10% in supply voltage and −20 °C to 85 °C in temperature.
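For context, the operation the analog datapath accelerates is the multiply-and-accumulate (MAC) at the heart of CNN inference. The sketch below (illustrative only; it does not model the paper's analog circuitry or its AMEM-based dataflow) shows a 2-D convolution expressed as nested MAC operations:

```python
def mac_convolve(feature_map, kernel):
    """2-D 'valid' convolution expressed as nested MAC operations.

    Each output element is produced by one accumulator that sums
    kernel-weight x input-activation products -- the workload that
    an analog MAC datapath computes without per-layer data conversion.
    """
    fh, fw = len(feature_map), len(feature_map[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(fh - kh + 1):
        row = []
        for j in range(fw - kw + 1):
            acc = 0  # accumulator: one MAC per kernel element
            for ki in range(kh):
                for kj in range(kw):
                    acc += feature_map[i + ki][j + kj] * kernel[ki][kj]
            row.append(acc)
        out.append(row)
    return out

fmap = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
kern = [[1, 0], [0, 1]]
print(mac_convolve(fmap, kern))  # [[6, 8], [12, 14]]
```

In a digital accelerator each `acc +=` step is a digital multiply-add; in the proposed design the equivalent accumulation happens in the analog domain, which is why avoiding analog-to-digital conversion between layers matters for energy efficiency.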