Academic Paper

Realtime Hand Landmark Tracking to Aid Development of a Prosthetic Arm for Reach and Grasp Motions
Document Type
Conference
Source
2021 IEEE International Symposium on Robotic and Sensors Environments (ROSE), pp. 1-7, Oct. 2021
Subject
Robotics and Control Systems
Solid modeling
Three-dimensional displays
Tracking
Shape
Neural networks
Tactile sensors
Arms
prosthetic arm
computer vision
gesture recognition
grip pattern
neural network
Language
English
Abstract
A prosthetic device, also known as a prosthesis, can aid rehabilitation when an arm or other limb is severed or lost. An upper-limb prosthesis helps restore motor skills; however, the amputee is deprived of the tactile sensation normally used for grip control. As a result, it is a common assumption that restoring force feedback will improve management of prosthetic gripping force. This paper presents a vision-based analysis that collates data from neural-network hand landmarks to distinguish between the various grip patterns that will aid the prosthesis. The prosthetic arm is developed using 3D printing technology, an efficient, low-cost approach that can be personalized for each user in terms of size, shape, or color. The computer-vision hand-landmark technique uses a machine learning (ML) pipeline that encompasses several models, such as BlazePalm, which infers 21 3D landmarks within a bounding box of a hand from a single image. Alongside vision data, information about grasp motion is best relayed through tactile feedback. In this project, comparing hand landmarks together with pressure data from tactile sensors placed on the prosthetic arm under the same settings revealed distinct patterns that can be used to distinguish different grasp motions.
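The abstract describes distinguishing grip patterns from the 21 3D hand landmarks produced by the BlazePalm-based pipeline. A minimal sketch of how such landmarks could feed a grip classifier is shown below; the landmark indexing follows the MediaPipe Hands convention (0 = wrist, 9 = middle-finger MCP, 4/8/12/16/20 = fingertips), and the extension metric and threshold are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: classify an open hand vs. a grasp from 21 (x, y, z) landmarks.
# The indices follow the MediaPipe Hands convention; the normalized-extension
# threshold below is an illustrative assumption, not a value from the paper.
import math

FINGERTIPS = (4, 8, 12, 16, 20)   # thumb, index, middle, ring, pinky tips
WRIST, MIDDLE_MCP = 0, 9          # wrist and middle-finger knuckle

def classify_grip(landmarks, threshold=1.3):
    """Return 'open' or 'grasp' based on mean fingertip extension.

    landmarks: sequence of 21 (x, y, z) tuples.
    The mean wrist-to-fingertip distance is normalized by the palm
    length (wrist to middle MCP) so the result is scale-invariant.
    """
    palm = math.dist(landmarks[WRIST], landmarks[MIDDLE_MCP])
    mean_ext = sum(
        math.dist(landmarks[tip], landmarks[WRIST]) for tip in FINGERTIPS
    ) / len(FINGERTIPS)
    return "open" if mean_ext / palm > threshold else "grasp"
```

Normalizing by palm length keeps the metric independent of how far the hand is from the camera; a real system would likely use per-finger joint angles and more grip classes, but the same landmark geometry drives the decision.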