Academic Article

Explorations of Autonomous Prosthetic Grasping via Proximity Vision and Deep Learning
Document Type
Periodical
Source
IEEE Transactions on Medical Robotics and Bionics, 6(2):685-694, May 2024
Subject
Bioengineering
Robotics and Control Systems
Computing and Processing
Sensors
Radar
Prosthetic hand
Instruments
Cameras
Medical robotics
Biomimetics
Autonomous
computer vision
deep learning
grasping
hand prosthesis
inertial
prosthetics
proximity
sensors
Language
English
ISSN
2576-3202
Abstract
The traumatic loss of a hand is usually followed by significant psychological, functional, and rehabilitation challenges. Although much progress has been made in recent decades, the prosthetic goal of restoring human hand functionality is still far from being achieved. Autonomous prosthetic hands have shown promising results and a wide potential benefit that has yet to be fully explored and deployed. Here, we hypothesized that the combination of a radar sensor and a low-resolution time-of-flight camera can be sufficient for object recognition in both static and dynamic scenarios. To test this hypothesis, we used deep learning algorithms to analyze HANDdata, a human-object interaction dataset with a particular focus on reach-to-grasp actions. Inference testing was also performed on purposely acquired unseen data. The analyses reported here, broken down into gradually increasing levels of complexity, show the great potential of such proximity sensors as an alternative or complementary solution to standard camera-based systems. In particular, integrated low-power radar could be a key technology for next-generation intelligent and autonomous prostheses.
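To make the fusion idea in the abstract concrete, the following is a minimal sketch of a two-branch network that combines a low-resolution time-of-flight depth frame with a radar range profile for object classification. Everything here is an illustrative assumption rather than the paper's actual architecture: the class ProximityFusionNet, the 8x8 depth resolution, the 64 radar range bins, the 5 object classes, and all layer sizes are hypothetical and are not drawn from the paper or from HANDdata.

# Hypothetical sketch of the radar + time-of-flight fusion idea described in
# the abstract. All shapes, layer sizes, and class counts are illustrative
# assumptions, not taken from the paper or from the HANDdata dataset.
import torch
import torch.nn as nn

class ProximityFusionNet(nn.Module):
    def __init__(self, radar_bins: int = 64, num_classes: int = 5):
        super().__init__()
        # Branch for an 8x8 time-of-flight depth frame (assumed resolution,
        # typical of low-resolution multizone ToF sensors).
        self.tof_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),                       # 32 * 8 * 8 = 2048 features
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
        )
        # Branch for a 1-D radar range profile (assumed 64 range bins).
        self.radar_branch = nn.Sequential(
            nn.Linear(radar_bins, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Late fusion: concatenate the two embeddings, then classify.
        self.head = nn.Linear(64 + 64, num_classes)

    def forward(self, tof: torch.Tensor, radar: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.tof_branch(tof), self.radar_branch(radar)], dim=1)
        return self.head(z)

if __name__ == "__main__":
    model = ProximityFusionNet()
    tof = torch.randn(4, 1, 8, 8)      # batch of 4 synthetic depth frames
    radar = torch.randn(4, 64)         # batch of 4 synthetic range profiles
    print(model(tof, radar).shape)     # torch.Size([4, 5]) class logits

Late fusion (separate per-sensor encoders joined before the classifier) is used here only because it is a common baseline for heterogeneous sensors; the paper may well organize its models differently.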