Academic Paper

Electromyography-Based, Robust Hand Motion Classification Employing Temporal Multi-Channel Vision Transformers
Document Type
Conference
Source
2022 9th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob), pp. 1-8, Aug. 2022
Subject
Robotics and Control Systems
Performance evaluation
Deep learning
Radio frequency
Biological system modeling
Predictive models
Feature extraction
Transformers
Language
English
ISSN
2155-1782
Abstract
With the increasing use of robotic and bionic devices for the execution of complex, everyday-life tasks, Electromyography (EMG)-based interfaces are being explored as candidate technologies for facilitating an intuitive interaction with such devices. However, EMG-based interfaces typically require appropriate features to be extracted from the raw EMG signals using a plethora of feature extraction methods to achieve excellent performance in practical applications. Selecting a feature set that leads to significant EMG-based decoding performance requires a deep understanding of the available methods and of the human musculoskeletal system. To overcome this issue, researchers have proposed the use of deep learning methods for automatically extracting complex features directly from the raw EMG data. In this work, we propose Temporal Multi-Channel Vision Transformers as a deep learning technique that has the potential to achieve dexterous control of robots and bionic hands. The performance of this method is evaluated and compared with other well-known methods, employing the open-access Ninapro dataset.
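To make the general idea described in the abstract concrete, the following is a minimal PyTorch sketch of a ViT-style transformer classifier operating on windows of raw multi-channel EMG. It is not the authors' Temporal Multi-Channel Vision Transformer; the window length, channel count, patch size, number of classes, and all hyperparameters are illustrative assumptions chosen to resemble a Ninapro-like setup.

# Minimal, self-contained sketch: transformer classifier over raw multi-channel
# EMG windows (PyTorch). All sizes below are assumptions, not the paper's values.
import torch
import torch.nn as nn


class EMGTransformerClassifier(nn.Module):
    def __init__(self, n_channels=12, window_len=200, patch_len=20,
                 d_model=64, n_heads=4, n_layers=4, n_classes=17):
        super().__init__()
        assert window_len % patch_len == 0
        self.n_patches = window_len // patch_len
        # Each temporal patch (patch_len samples x n_channels) is flattened
        # and linearly projected to a d_model-dimensional token.
        self.patch_embed = nn.Linear(patch_len * n_channels, d_model)
        # Learnable [CLS] token and positional embeddings, as in ViT-style models.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.pos_embed = nn.Parameter(torch.zeros(1, self.n_patches + 1, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, window_len, n_channels), a window of raw EMG samples
        b, t, c = x.shape
        # Split the window into non-overlapping temporal patches and flatten them.
        patches = x.reshape(b, self.n_patches, -1)   # (b, n_patches, patch_len * c)
        tokens = self.patch_embed(patches)            # (b, n_patches, d_model)
        cls = self.cls_token.expand(b, -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        encoded = self.encoder(tokens)
        # Classify the hand motion from the [CLS] token representation.
        return self.head(encoded[:, 0])


if __name__ == "__main__":
    model = EMGTransformerClassifier()
    dummy = torch.randn(8, 200, 12)   # batch of 8 hypothetical EMG windows
    logits = model(dummy)
    print(logits.shape)               # torch.Size([8, 17])

Because the model ingests windows of raw EMG directly, no hand-crafted feature extraction stage is needed; the patch embedding and self-attention layers stand in for the feature engineering step that the abstract argues is difficult to design manually.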