Journal Article

Multimodal Sensing and Interaction for a Robotic Hand Orthosis
Document Type
Periodical
Source
IEEE Robotics and Automation Letters, 4(2):315-322, Apr. 2019
Subject
Robotics and Control Systems
Computing and Processing
Components, Circuits, Devices and Systems
Robot sensing systems
Electromyography
Tendons
Multimodal sensors
Pressure sensors
Wearable robots
Prosthetics and exoskeletons
Rehabilitation robotics
Language
English
ISSN
2377-3766
2377-3774
Abstract
Wearable robotic hand rehabilitation devices can allow greater freedom and flexibility than their workstation-like counterparts. However, the field generally lacks effective methods by which the user can operate the device: such controls must be effective, intuitive, and robust to the wide range of possible impairment patterns. Even when focusing on a specific condition, such as stroke, the variety of encountered upper limb impairment patterns means that a single sensing modality, such as electromyography (EMG), might not be sufficient to enable controls for a broad range of users. To address this significant gap, we introduce a multimodal sensing and interaction paradigm for an active hand orthosis. In our proof-of-concept implementation, EMG is complemented by other sensing modalities, such as finger bend and contact pressure sensors. We propose multimodal interaction methods that use this sensory data as input, and we show that they can enable tasks for stroke survivors who exhibit different impairment patterns. We believe that robotic hand orthoses developed as multimodal sensory platforms will help address some of the key challenges of physical interaction with the user.
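
Illustrative sketch: the abstract describes fusing EMG with finger-bend and contact-pressure sensing to drive the orthosis, without specifying the fusion logic. The Python snippet below is a minimal, purely hypothetical sketch of what such rule-based multimodal fusion could look like; the channel names, units, thresholds, and the fusion rule itself are assumptions for illustration, not the authors' method.

# Hypothetical rule-based multimodal intent detector for an active
# hand orthosis. All thresholds, names, and rules are illustrative
# assumptions; the paper's actual interaction methods are not given
# in the abstract.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    emg_envelope: float      # rectified, low-pass-filtered EMG (a.u.)
    bend_angle: float        # finger flexion from a bend sensor (degrees)
    contact_pressure: float  # fingertip contact pressure (kPa)

# Hypothetical thresholds; in practice these would be calibrated per user.
EMG_OPEN_THRESHOLD = 0.4
BEND_FLEXED_THRESHOLD = 45.0
PRESSURE_CONTACT_THRESHOLD = 5.0

def detect_intent(frame: SensorFrame) -> str:
    """Fuse the three modalities into a discrete orthosis command.

    The idea is that no single modality is trusted alone: EMG signals a
    volitional opening attempt, while bend and pressure confirm the
    hand's current state so the device does not fight a stable grasp.
    """
    holding_object = frame.contact_pressure > PRESSURE_CONTACT_THRESHOLD
    hand_flexed = frame.bend_angle > BEND_FLEXED_THRESHOLD
    emg_active = frame.emg_envelope > EMG_OPEN_THRESHOLD

    if holding_object:
        return "hold"      # maintain grasp; ignore spurious EMG
    if emg_active and hand_flexed:
        return "extend"    # assist finger extension (hand opening)
    return "idle"

# Example: a flexed, empty hand with strong EMG activity -> "extend"
print(detect_intent(SensorFrame(emg_envelope=0.7, bend_angle=60.0,
                                contact_pressure=1.2)))

One design point this sketch tries to capture from the abstract: because a single modality such as EMG may be unreliable for some impairment patterns, the complementary channels act as state checks rather than redundant triggers.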