Academic Paper

MEET: A Multi-Band EEG Transformer for Brain States Decoding
Document Type
Periodical
Source
IEEE Transactions on Biomedical Engineering, 71(5):1442-1453, May 2024
Subject
Bioengineering
Computing and Processing
Components, Circuits, Devices and Systems
Communication, Networking and Broadcast Technologies
Electroencephalography
Brain modeling
Transformers
Feature extraction
Biological system modeling
Deep learning
Data models
EEG
multi-band fusion
transformer
brain function dynamics
Language
English
ISSN
0018-9294
1558-2531
Abstract
Objective: Electroencephalography (EEG) is among the most widely used and inexpensive neuroimaging techniques. Compared with CNN- or RNN-based models, Transformers can better capture the temporal information in EEG signals and focus more on the global features of the brain's functional activities. Importantly, given the multiscale nature of EEG signals, it is crucial to incorporate the multi-band concept into the design of an EEG Transformer architecture. Methods: We propose a novel Multi-band EEG Transformer (MEET) to represent and analyze the multiscale temporal time series of human brain EEG signals. MEET mainly comprises three parts: 1) transforming the EEG signals into multi-band images while preserving the 3D spatial information between electrodes; 2) a Band Attention Block that computes attention maps over the stacked multi-band images and infers the fused feature maps; and 3) Temporal Self-Attention and Spatial Self-Attention modules that extract spatiotemporal features for the characterization and differentiation of multi-frame dynamic brain states. Results: The experimental results show that: 1) MEET outperforms state-of-the-art methods on multiple open EEG datasets (SEED, SEED-IV, WM) for brain-state classification; 2) five-band fusion is the best integration strategy; and 3) MEET identifies interpretable brain attention regions. Significance: MEET is an interpretable and universal model based on the multiband, multiscale characteristics of EEG. Conclusion: The innovative combination of band attention and temporal/spatial self-attention mechanisms in MEET achieves promising data-driven learning of the temporal dependencies and spatial relationships of EEG signals across the entire brain in a holistic and comprehensive fashion.
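The pipeline described in the abstract (fuse stacked per-band images with a band-level attention, then apply self-attention over the frame sequence) can be illustrated with a minimal NumPy sketch. This is not the paper's actual architecture: the band-scoring rule (a global mean descriptor per band), the single-head attention, and all shapes here are simplifying assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def band_attention(band_images):
    # band_images: (bands, H, W) stacked per-band topographic images.
    # Hypothetical scoring: summarize each band by its global mean,
    # turn scores into attention weights, and fuse as a weighted sum.
    scores = band_images.mean(axis=(1, 2))               # (bands,)
    weights = softmax(scores)                            # attention over bands
    fused = np.tensordot(weights, band_images, axes=1)   # (H, W) fused map
    return fused, weights

def self_attention(x):
    # x: (T, d) sequence of per-frame features.
    # Single-head scaled dot-product self-attention over the T frames.
    d = x.shape[-1]
    attn = softmax(x @ x.T / np.sqrt(d), axis=-1)        # (T, T)
    return attn @ x

# Toy input: 8 frames, 5 frequency bands, 16x16 topographic images.
rng = np.random.default_rng(0)
frames = rng.standard_normal((8, 5, 16, 16))

# Band attention fuses the 5 bands of each frame into one feature vector,
# then temporal self-attention mixes information across the 8 frames.
fused = np.stack([band_attention(f)[0].ravel() for f in frames])  # (8, 256)
out = self_attention(fused)                                       # (8, 256)
print(out.shape)
```

In the paper, a Spatial Self-Attention module plays the analogous role over electrode positions within a frame; the same `self_attention` function applied to a (channels, d) array would sketch that step.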