Academic Article

GP-Net: Flexible Viewpoint Grasp Proposal
Document Type
Conference
Source
2023 21st International Conference on Advanced Robotics (ICAR), pp. 317-324, Dec. 2023
Subject
Robotics and Control Systems
Transportation
Keywords
Grasping
Benchmark testing
Manipulators
6-DOF
Proposals
Convolutional neural networks
grasping
robotics
neural networks
6-DoF grasps
mobile manipulator
ROS
Language
English
ISSN
2572-6919
Abstract
We present the Grasp Proposal Network (GP-net), a Convolutional Neural Network model that can generate 6-DoF grasps from flexible viewpoints, e.g., as experienced by mobile manipulators. To train GP-net, we synthetically generate a dataset containing depth images and ground-truth grasp information. In real-world experiments, we use the EGAD evaluation benchmark to evaluate GP-net against two commonly used algorithms, the Volumetric Grasping Network (VGN) and the Grasp Pose Detection package (GPD), on a PAL TIAGo mobile manipulator. In contrast to state-of-the-art methods in robotic grasping, GP-net can be used for grasping objects from flexible, unknown viewpoints without the need to define the workspace, and achieves a grasp success rate of 54.4% compared to 51.6% for VGN and 44.2% for GPD. We provide a ROS package along with our code and pre-trained models at https://aucoroboticsmu.github.io/GP-net/.
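The abstract describes a pipeline that maps a single depth image, taken from an arbitrary viewpoint, to ranked 6-DoF grasp proposals via a convolutional network. The sketch below illustrates that general idea in PyTorch; it is a minimal, hypothetical example, and the class `GraspHead`, the function `decode_grasps`, the output heads, and the camera intrinsics are assumptions for illustration, not GP-net's actual architecture or its published ROS interface.

```python
# Hypothetical sketch of a depth-image -> 6-DoF grasp proposal pipeline.
# All names and layer choices are illustrative, not the GP-net implementation.
import numpy as np
import torch
import torch.nn as nn


class GraspHead(nn.Module):
    """Toy fully-convolutional head: per-pixel grasp quality, orientation, and width."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        )
        self.quality = nn.Conv2d(16, 1, 1)    # grasp success score per pixel
        self.rotation = nn.Conv2d(16, 4, 1)   # unnormalised quaternion per pixel
        self.width = nn.Conv2d(16, 1, 1)      # gripper opening width per pixel

    def forward(self, depth):
        f = self.backbone(depth)
        return torch.sigmoid(self.quality(f)), self.rotation(f), torch.sigmoid(self.width(f))


def decode_grasps(depth, quality, rotation, width, intrinsics, threshold=0.7):
    """Lift high-quality pixels to 6-DoF grasp poses using the camera intrinsics."""
    fx, fy, cx, cy = intrinsics
    q = quality[0, 0].numpy()
    grasps = []
    for v, u in zip(*np.where(q > threshold)):
        z = float(depth[0, 0, v, u])
        if z <= 0:
            continue
        # Back-project the pixel to a 3-D point in the camera frame.
        position = np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])
        quat = rotation[0, :, v, u].numpy()
        quat = quat / (np.linalg.norm(quat) + 1e-8)
        grasps.append({
            "position": position,                  # metres, camera frame
            "orientation": quat,                   # unit quaternion
            "width": float(width[0, 0, v, u]),     # normalised opening width
            "score": float(q[v, u]),
        })
    return sorted(grasps, key=lambda g: -g["score"])


if __name__ == "__main__":
    model = GraspHead().eval()
    depth_image = torch.rand(1, 1, 480, 640)       # stand-in for a real depth frame
    with torch.no_grad():
        quality, rotation, width = model(depth_image)
    proposals = decode_grasps(depth_image, quality, rotation, width,
                              intrinsics=(525.0, 525.0, 319.5, 239.5))
    print(f"{len(proposals)} grasp proposals above threshold")
```

In a deployment such as the one evaluated on the TIAGo, a node along these lines would subscribe to the camera's depth topic, transform the returned camera-frame poses into the robot's planning frame, and pass the top-ranked grasp to a motion planner; the specifics of the authors' ROS package may differ.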