Journal Article

A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors
Document Type
Periodical
Source
IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 41(6):1064-1076, Nov. 2011
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Robotics and Control Systems
Power, Energy and Industry Applications
General Topics for Engineers
Electromyography
Hidden Markov models
Gesture recognition
Sensors
Acceleration
Decision trees
electromyography
hand gesture recognition
hidden Markov models (HMMs)
Language
English
ISSN
1083-4427 (Print)
1558-2426 (Electronic)
Abstract
This paper presents a framework for hand gesture recognition based on the information fusion of a three-axis accelerometer (ACC) and multichannel electromyography (EMG) sensors. In our framework, the start and end points of meaningful gesture segments are detected automatically from the intensity of the EMG signals. A decision tree and multistream hidden Markov models are then used for decision-level fusion to obtain the final recognition results. For sign language recognition (SLR), experimental results on the classification of 72 Chinese Sign Language (CSL) words demonstrate the complementary functionality of the ACC and EMG sensors and the effectiveness of our framework. Additionally, the recognition of 40 CSL sentences is implemented to evaluate our framework for continuous SLR. For gesture-based control, a real-time interactive system is built as a virtual Rubik's cube game with 18 kinds of hand gestures used as control commands. With ten subjects playing the game, the performance is also examined for both user-specific and user-independent classification. Our proposed framework facilitates intelligent and natural control in gesture-based interaction.
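The abstract describes detecting gesture start and end points automatically from the intensity of the EMG signals. The sketch below is not the authors' implementation; it is a minimal illustration of one common way such intensity-based segmentation can be done, assuming a moving-average of the mean absolute EMG value compared against a threshold. The window length and threshold values are placeholders, not parameters from the paper.

```python
# Hypothetical sketch of EMG-intensity-based gesture segmentation
# (illustrative only; not the method's actual parameters or code).
import numpy as np

def detect_gesture_segments(emg, window=64, threshold=0.1):
    """Return (start, end) sample indices of candidate gesture segments.

    emg       : array of shape (n_samples, n_channels), raw multichannel EMG
    window    : moving-average window length in samples (assumed value)
    threshold : intensity threshold; in practice it would be calibrated,
                e.g. from a rest-state recording (assumed value here)
    """
    # Mean absolute value across channels as a simple intensity measure.
    intensity = np.abs(emg).mean(axis=1)

    # Smooth with a moving average to suppress spurious threshold crossings.
    kernel = np.ones(window) / window
    smoothed = np.convolve(intensity, kernel, mode="same")

    # A gesture segment is a maximal run of samples above the threshold.
    active = smoothed > threshold
    edges = np.diff(active.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if active[0]:
        starts = np.r_[0, starts]
    if active[-1]:
        ends = np.r_[ends, len(active)]
    return list(zip(starts, ends))

if __name__ == "__main__":
    # Synthetic 4-channel EMG: quiet baseline with one burst of activity.
    rng = np.random.default_rng(0)
    emg = 0.02 * rng.standard_normal((2000, 4))
    emg[800:1200] += 0.5 * rng.standard_normal((400, 4))
    print(detect_gesture_segments(emg))  # roughly one segment near (800, 1200)
```

In the paper's pipeline, segments found this way would then be passed to the ACC/EMG feature extraction and the decision tree plus multistream HMM fusion stage for classification; that stage is not reproduced here.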