Academic Paper

On the Dependable Operation of Bidirectional Encoder Representations from Transformers (BERT) in the Presence of Soft Errors
Document Type
Conference
Source
2023 IEEE 23rd International Conference on Nanotechnology (NANO), pp. 582-586, Jul. 2023
Subject
Bioengineering
Components, Circuits, Devices and Systems
Engineered Materials, Dielectrics and Plasmas
Fields, Waves and Electromagnetics
General Topics for Engineers
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Computer vision
Sensitivity
Memory management
Force
Bidirectional control
Transformers
Encoding
Language
ISSN
1944-9380
Abstract
Transformers are widely used in Natural Language Processing (NLP) and computer vision; Bidirectional Encoder Representations from Transformers (BERT) is one of the most popular pre-trained models for NLP applications. This paper considers the dependable operation of transformers using BERT and studies the impact of soft errors; as a case study, single-precision floating-point weights are considered for emotion classification in text. Simulation by error injection is conducted to assess the impact of errors on different parts of the BERT model as well as on the bits of the weights. The analysis of the results leads to the following findings: 1) there is a Critical Bit (CB) on which errors significantly affect the performance of the model; 2) errors that degrade the performance in many cases lead to a single class being output, so that BERT appears to generate a predetermined result regardless of the input. The impact of errors is also analyzed on a per-layer, per-type, and per-head basis to evaluate sensitivity, and initial studies on error propagation and multiple bit flips are presented.
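As a rough illustration of the kind of fault-injection experiment the abstract describes, the sketch below flips a single bit of an IEEE 754 single-precision (float32) weight. The function name, the example values, and the choice of which bits to flip are illustrative assumptions, not details taken from the paper; the abstract does not state which bit position is the Critical Bit.

```python
# Minimal sketch of single-bit soft-error injection into a float32 weight.
# Assumption: errors are modeled as a bit flip in the IEEE 754 binary
# representation of the weight, as is common in this kind of study.
import struct


def flip_bit(weight: float, bit: int) -> float:
    """Return `weight` with bit `bit` (0 = LSB of the mantissa,
    31 = sign bit) of its float32 representation inverted."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", weight))
    flipped = as_int ^ (1 << bit)
    return struct.unpack("<f", struct.pack("<I", flipped))[0]


if __name__ == "__main__":
    w = 0.0123  # an illustrative weight value
    # Flipping a low mantissa bit changes the weight only slightly.
    print(flip_bit(w, 3))
    # Flipping a high exponent bit (here bit 30) can change the magnitude
    # by many orders of magnitude; a "Critical Bit" analysis would single
    # out such positions by comparing model accuracy after each flip.
    print(flip_bit(w, 30))
```

In a full error-injection campaign, such flips would be applied to selected weights of the BERT model (by layer, parameter type, or attention head) and the classification accuracy re-measured to map out which bits and components are most sensitive.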