Academic Paper

Zero-Overhead Protection for CNN Weights
Document Type
Conference
Source
2021 IEEE International Symposium on Defect and Fault Tolerance in VLSI and Nanotechnology Systems (DFT), pp. 1-6, Oct. 2021
Subject
Aerospace
Components, Circuits, Devices and Systems
Computing and Processing
Sensitivity
Network topology
Bit error rate
Memory management
Fault tolerant systems
Very large scale integration
Robustness
Language
English
ISSN
2765-933X
Abstract
The numerical format used for representing weights and activations plays a key role in the computational efficiency and robustness of CNNs. Recently, a 16-bit floating-point format called Brain-Float 16 (bf16) has been proposed and implemented in hardware accelerators. However, the robustness of accelerators implemented with this format has not yet been studied. In this paper, we compare the robustness of state-of-the-art CNNs implemented with 8-bit integer, Brain-Float 16 and 32-bit floating-point formats. We also introduce an error detection and masking technique, called opportunistic parity (OP), which can detect and mask errors in the weights with zero storage overhead. With this technique, the robustness of floating-point weights to bit-flips can be improved by up to three orders of magnitude.
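
The abstract does not spell out how opportunistic parity (OP) is implemented. The sketch below illustrates one plausible reading, assuming the parity bit is stored in the least-significant mantissa bit of each 16-bit weight (so no extra storage is needed) and that a weight failing the check on read is masked to zero; the function names encode_weight and decode_weight and the masking policy are illustrative assumptions, not the paper's confirmed scheme.

def encode_weight(bits16: int) -> int:
    # Compute even parity over the upper 15 bits of a 16-bit (e.g. bf16)
    # weight and overwrite the least-significant bit with it, so the
    # parity costs no additional storage.
    upper = bits16 >> 1
    parity = bin(upper).count("1") & 1
    return (upper << 1) | parity

def decode_weight(bits16: int) -> int:
    # Re-check the parity on read; if it fails, mask the weight to zero
    # rather than let a corrupted value propagate through the network.
    upper = bits16 >> 1
    parity = bin(upper).count("1") & 1
    return bits16 if parity == (bits16 & 1) else 0

# Usage: protect the bf16 pattern for 1.0, simulate a bit-flip in memory,
# and observe that the corrupted weight is detected and masked.
w = 0x3F80                      # bf16 encoding of 1.0
stored = encode_weight(w)       # parity folded into the LSB
corrupted = stored ^ (1 << 9)   # single bit-flip in the exponent field
print(hex(decode_weight(stored)))     # 0x3f81 (value nearly unchanged)
print(hex(decode_weight(corrupted)))  # 0x0    (error detected, weight masked)

Under these assumptions, overwriting one mantissa bit perturbs each weight by at most one unit in the last place, which is typically negligible for CNN accuracy, while any single bit-flip in a stored weight is detected and the weight is suppressed instead of corrupting the inference.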