Academic Paper

Reliable Binarized Neural Networks on Unreliable Beyond Von-Neumann Architecture
Document Type
Periodical
Source
IEEE Transactions on Circuits and Systems I: Regular Papers, 69(6):2516-2528, Jun. 2022
Subject
Components, Circuits, Devices and Systems
Logic gates
FeFETs
Transistors
Integrated circuit modeling
Neural networks
Computational modeling
Nonvolatile memory
FeFET
logic-in-memory
neural networks
error tolerance
approximate computing
latency
Language
English
ISSN
1549-8328
1558-0806
Abstract
Specialized hardware accelerators beyond von Neumann, which offer processing capability where the data resides without moving it, have become inevitable in data-centric computing. Emerging non-volatile memories, such as the Ferroelectric Field-Effect Transistor (FeFET), enable compact Logic-in-Memory (LiM). In this work, we investigate the probability of error (Perror) in FeFET-based XNOR LiM, demonstrating a new trade-off between speed and reliability. Using our reliability model, we show how Binarized Neural Networks (BNNs) can be proactively trained in the presence of XNOR-induced errors to obtain robust BNNs at design time. Furthermore, leveraging the trade-off between Perror and speed, we present a run-time adaptation technique that selectively trades off Perror and XNOR speed for every BNN layer. Our results demonstrate that when a small loss (e.g., 1%) in inference accuracy can be accepted, our design-time and run-time techniques provide error-resilient BNNs that exhibit 75% and 50% (FashionMNIST) and 38% and 24% (CIFAR10) XNOR speedups, respectively.
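The error-injection idea behind the abstract can be sketched as follows: a BNN dot product is computed as XNOR plus popcount, and each XNOR output is flipped independently with probability Perror to model the unreliable LiM array. This is a minimal illustrative sketch (the function name `noisy_xnor_popcount` and the independent bit-flip error model are assumptions, not the paper's exact model):

```python
import numpy as np

def noisy_xnor_popcount(w, x, p_error, rng):
    """Binarized dot product via XNOR + popcount, where each XNOR
    output is flipped independently with probability p_error
    (a hypothetical model of an unreliable FeFET LiM array)."""
    # w, x are +/-1 vectors; XNOR of their sign bits is True
    # exactly where the elementwise product is positive.
    xnor = (w * x) > 0
    # Inject hardware-induced bit flips into the XNOR outputs.
    flips = rng.random(xnor.shape) < p_error
    xnor = xnor ^ flips
    # Popcount-style accumulation: (#matches) - (#mismatches).
    return 2 * np.count_nonzero(xnor) - xnor.size

rng = np.random.default_rng(0)
w = rng.choice([-1, 1], size=256)
x = rng.choice([-1, 1], size=256)
exact = int(w @ x)
noisy = noisy_xnor_popcount(w, x, 0.0, rng)
assert noisy == exact  # p_error = 0 reproduces the exact dot product
```

Training with such a layer (sampling fresh flips each forward pass) is one way to realize the design-time robustness the abstract describes; raising `p_error` per layer mimics the run-time speed/reliability trade-off.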