Journal Article

EEG Classification of Covert Speech Using Regularized Neural Networks
Document Type
Periodical
Source
IEEE/ACM Transactions on Audio, Speech, and Language Processing, 25(12):2292-2300, Dec. 2017
Subject
Signal Processing and Analysis
Computing and Processing
Communication, Networking and Broadcast Technologies
General Topics for Engineers
Biology
Signal processing
Neural networks
Electroencephalography
Artificial neural networks
Multilayer perceptrons
Speech synthesis
brain-computer interface (BCI)
covert speech
electroencephalography (EEG)
multilayer perceptrons
wavelet transforms
Language
English
ISSN
2329-9290
2329-9304
Abstract
Communication using brain–computer interfaces (BCIs) can be non-intuitive, often requiring the performance of a conversation-irrelevant task such as hand motor imagery. In this paper, the reliability of electroencephalography (EEG) signals in discriminating between different covert speech tasks is investigated. Twelve participants, across two sessions each, were asked to perform multiple iterations of three differing mental tasks for 10 s each: unconstrained rest or the mental repetition of the words “yes” or “no.” A multilayer perceptron (MLP) artificial neural network (ANN) was used to classify all three pairwise combinations of “yes,” “no,” and rest trials and also for ternary classification. An average accuracy of 75.7% ± 9.6% was reached in the classification of covert speech trials versus rest, with all participants exceeding chance level (57.8%). The classification of “yes” versus “no” yielded an average accuracy of 63.2% ± 6.4% with ten participants surpassing chance level (57.8%). Finally, the ternary classification yielded an average accuracy of 54.1% ± 9.7% with all participants exceeding chance level (39.1%). The proposed MLP network provided significantly higher accuracies compared to some of the most common classification techniques in BCI. To our knowledge, this is the first report of using ANN for the classification of EEG covert speech across multiple sessions. Our findings support further study of covert speech as a BCI activation task, potentially leading to the development of more intuitive BCIs for communication.
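The record contains only the abstract, so the sketch below is not the authors' pipeline; it is a minimal illustration of the kind of workflow the abstract and keywords describe: wavelet-derived features from multichannel EEG epochs fed to a regularized MLP for pairwise or ternary classification. The feature statistics, network size, the helper extract_wavelet_features, and the synthetic data are all illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the paper's actual configuration):
# wavelet sub-band statistics per EEG channel -> regularized MLP classifier.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def extract_wavelet_features(epoch, wavelet="db4", level=4):
    """Concatenate simple statistics of DWT sub-band coefficients per channel."""
    feats = []
    for channel in epoch:                      # epoch shape: (n_channels, n_samples)
        coeffs = pywt.wavedec(channel, wavelet, level=level)
        for c in coeffs:                       # one approximation + `level` detail bands
            feats.extend([np.mean(np.abs(c)), np.std(c)])
    return np.asarray(feats)

# Synthetic stand-in for segmented trials: 60 epochs, 8 channels, 10 s at 100 Hz.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, 8, 1000))
labels = rng.integers(0, 3, size=60)           # 0 = rest, 1 = "yes", 2 = "no"

X = np.array([extract_wavelet_features(e) for e in epochs])

# Small fully connected network; the paper regularizes its MLP, so an L2 penalty
# (alpha) stands in here for whichever regularization scheme was actually used.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), alpha=1e-2,
                    max_iter=1000, random_state=0)
print("ternary CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

Note that the chance levels quoted in the abstract (57.8% for binary and 39.1% for ternary classification) exceed the naive 1/2 and 1/3 presumably because they are upper confidence bounds on chance performance for a finite number of trials, as is common practice in BCI studies.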