Academic Paper

Classifying Hand Gestures using Artificial Neural Networks for a Robotic Application
Document Type
Conference
Source
2019 SoutheastCon, Apr. 2019, pp. 1-5
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Fields, Waves and Electromagnetics
General Topics for Engineers
Geoscience
Nuclear Engineering
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Electromyography
Biological neural networks
Prosthetics
Neurons
Sensors
Wrist
Robots
Artificial Neural Networks
Electromyography (EMG)
Robotics
Myo Armband
Language
English
ISSN
1558-058X
Abstract
This project serves to design and fabricate a robotic arm that imitates the movements of a biological human arm. An open-source design was modified, and individual parts were 3D printed for assembly. Servo motors act as the muscles, pulling nylon strings connected to the fingers to perform hand gestures. The Myo Armband is used to collect electromyographic (EMG) signals from the forearm of the test subject to train an artificial neural network (ANN) with 35 classes corresponding to American Sign Language gestures. Once trained, the ANN was used for real-time classification to generate predictions for the robotic arm. Using a two-layer feed-forward network, offline training reached a recognition rate of 94.7 percent. Previous prosthetic advances have been too expensive for the general population; our goal is to build an inexpensive alternative.
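For orientation, the classification setup described in the abstract could be sketched roughly as below. This is a minimal illustration, not the authors' code: the feature choice (one value per EMG channel per window), the 100-unit hidden layer, and the synthetic placeholder data are assumptions; only the 8 EMG channels of the Myo Armband, the 35 gesture classes, and the two-layer feed-forward structure come from the paper.

```python
# Minimal sketch of a two-layer feed-forward classifier for Myo EMG gestures.
# Assumptions (not from the paper): one feature per EMG channel per window,
# a 100-unit hidden layer, and synthetic placeholder data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

N_CHANNELS = 8   # the Myo Armband provides 8 surface EMG channels
N_CLASSES = 35   # ASL gesture classes, as described in the abstract

# Placeholder data: one feature vector per EMG window (replace with real features).
rng = np.random.default_rng(0)
X = rng.normal(size=(3500, N_CHANNELS))
y = rng.integers(0, N_CLASSES, size=3500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# One hidden layer plus the output layer gives a two-layer feed-forward network.
clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("offline accuracy:", clf.score(X_test, y_test))
```

With real windowed EMG features in place of the random arrays, the trained model's `predict` method could then be called on each incoming window for the real-time classification step the abstract describes.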