Academic Paper

Multitask Learning for Simultaneous Gesture and Force Level Recognition Toward Prosthetic Hand Interaction
Document Type
Periodical
Source
IEEE Sensors Journal, 24(7):11759-11769, Apr. 2024
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Force
Task analysis
Prosthetic hand
Sensors
Convolution
Multitasking
Computational modeling
Attention mechanism
force determination
gesture recognition
multitask learning (MTL)
surface electromyographic (sEMG)
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
In the field of intelligent prosthetics (IPs), establishing natural interaction between prosthetic hands and amputees is essential for restoring hand functionality, improving quality of life, and supporting daily activities and social engagement. Prior work on surface electromyographic (sEMG)-controlled IPs has concentrated predominantly on gesture recognition, often neglecting the equally important dimension of force level. This study proposes a control strategy built on a multitask learning (MTL) model that recognizes gestures and force levels simultaneously. The MTL model combines shared convolutional blocks with self-attention and multihead attention layers, improving prosthetic hand control for seamless user-device interaction. Experiments on datasets from diverse participants validate the approach, which recognizes both gestures and force levels accurately and consistently. Comparative evaluations confirm the advantage of the MTL approach, particularly in real-time testing scenarios. These findings highlight the potential of the proposed myoelectric control strategy to give prosthetic users prompt, precise, and intuitive responses, substantially improving their autonomy and quality of life.
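The sketch below illustrates one plausible reading of the architecture the abstract describes: shared convolutional blocks followed by multihead self-attention, branching into separate gesture and force-level heads trained jointly. This is a minimal PyTorch sketch, not the authors' implementation; all layer sizes, class counts, input dimensions, and names (SharedEncoder, MTLNet) are illustrative assumptions.

```python
# Hypothetical multitask model for sEMG windows: shared conv blocks and
# multihead self-attention feed two task heads (gesture, force level).
# Layer sizes, class counts, and names are assumptions for illustration.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Shared 1-D convolutional blocks plus self-attention over time."""
    def __init__(self, in_channels: int = 8, hidden: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
        )
        # Multihead self-attention over the time dimension.
        self.attn = nn.MultiheadAttention(embed_dim=hidden, num_heads=4,
                                          batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples)
        h = self.conv(x).transpose(1, 2)   # (batch, time, hidden)
        h, _ = self.attn(h, h, h)          # self-attention over time steps
        return h.mean(dim=1)               # pooled shared representation

class MTLNet(nn.Module):
    """Two task-specific heads on top of the shared representation."""
    def __init__(self, n_gestures: int = 6, n_force_levels: int = 3):
        super().__init__()
        self.encoder = SharedEncoder()
        self.gesture_head = nn.Linear(64, n_gestures)
        self.force_head = nn.Linear(64, n_force_levels)

    def forward(self, x: torch.Tensor):
        z = self.encoder(x)
        return self.gesture_head(z), self.force_head(z)

# Joint training step: sum of per-task cross-entropy losses.
model = MTLNet()
x = torch.randn(4, 8, 200)                 # 4 windows, 8 sEMG channels
gesture_logits, force_logits = model(x)
loss = nn.functional.cross_entropy(gesture_logits, torch.tensor([0, 1, 2, 3])) \
     + nn.functional.cross_entropy(force_logits, torch.tensor([0, 1, 2, 0]))
loss.backward()
```

In this reading, both tasks share the convolutional and attention layers, so gradients from the gesture and force losses jointly shape one sEMG representation; only the two linear heads are task-specific. The equal weighting of the two losses is another assumption, since the abstract does not specify how the tasks are balanced.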