Academic paper

Lower Limb Motion Intent Recognition Based on Sensor Fusion and Fuzzy Multitask Learning
Document Type
Periodical
Source
IEEE Transactions on Fuzzy Systems, 32(5):2903-2914, May 2024
Subject
Computing and Processing
Robot sensing systems
Electromyography
Task analysis
Muscles
Fuzzy systems
Multitasking
Bladder
Fuzzy system
gait recognition
locomotion recognition
multitask learning
pressure mechanomyography (PMMG)
sensor fusion
Language
English
ISSN
1063-6706 (Print)
1941-0034 (Electronic)
Abstract
Lower limb motion intent recognition is a crucial aspect of wearable robot control and human–machine collaboration. Among the various sensors used for this purpose, the electromyogram (EMG) sensor remains one of the most widely employed. However, EMG signals are highly susceptible to electrical noise, motion artifacts, and perspiration, which can compromise their quality. To address these challenges, we designed an air-pressure mechanomyography (PMMG) sensor and developed a wearable multimodal sensor system that incorporates a PMMG thigh ring, an inertial measurement unit, and a force-sensitive resistor. To enhance gait phase and locomotion mode recognition performance, we proposed a gate multitask Takagi–Sugeno–Kang fuzzy inference system (GMT-TSK-FIS) algorithm that handles multiple recognition tasks simultaneously. On this basis, we developed a lower limb motion intent recognition system that recognizes gait phase and locomotion mode at the same time. The experimental results showed that the accuracy of gait phase and locomotion mode recognition was 98.28% and 99.96%, respectively. Furthermore, the study demonstrated that multimodal sensor fusion outperformed single-modal sensing, and that multitask recognition outperformed single-task recognition.
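The abstract describes a TSK fuzzy inference system in which two recognition tasks (gait phase and locomotion mode) are handled simultaneously. The sketch below is not the authors' GMT-TSK-FIS, whose rule structure and gating mechanism are not given here; it is a minimal first-order TSK classifier with a shared Gaussian rule base and per-task gate weights over the rules, written to illustrate the general multitask-TSK idea. All dimensions, parameter names, and the uniform gates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
N_FEATURES = 6   # e.g., fused PMMG + IMU + FSR features
N_RULES = 4      # shared fuzzy rules
N_PHASES = 4     # hypothetical gait-phase classes
N_MODES = 3      # hypothetical locomotion-mode classes

# Gaussian antecedent parameters, shared by both tasks.
centers = rng.normal(size=(N_RULES, N_FEATURES))
widths = np.full((N_RULES, N_FEATURES), 1.0)

def firing_strengths(x):
    """Normalized rule firing strengths: product of per-feature Gaussian memberships."""
    mu = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))  # (rules, features)
    w = mu.prod(axis=1)                                       # (rules,)
    return w / (w.sum() + 1e-12)

# First-order TSK consequents: one affine model per (class, rule) for each task.
theta_phase = rng.normal(size=(N_PHASES, N_RULES, N_FEATURES + 1))
theta_mode = rng.normal(size=(N_MODES, N_RULES, N_FEATURES + 1))

# Task-specific gates over the shared rules (uniform here; learned in practice).
gate_phase = np.ones(N_RULES) / N_RULES
gate_mode = np.ones(N_RULES) / N_RULES

def tsk_task_output(x, theta, gate):
    """Gate-weighted TSK output: class scores as a rule-weighted sum of affine models."""
    w = firing_strengths(x) * gate
    w = w / (w.sum() + 1e-12)
    xe = np.append(x, 1.0)        # affine consequent input [x; 1]
    y_rules = theta @ xe          # (classes, rules)
    return y_rules @ w            # (classes,)

def predict(x):
    """Simultaneous prediction of both tasks from one multimodal feature vector."""
    phase = int(tsk_task_output(x, theta_phase, gate_phase).argmax())
    mode = int(tsk_task_output(x, theta_mode, gate_mode).argmax())
    return phase, mode
```

Because the antecedents are shared and only the gates and consequents are task-specific, both tasks are evaluated from a single pass over the rule base, which is the structural property the multitask formulation exploits.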