Academic Paper

Dual-Neighborhood Deep Fusion Network for Point Cloud Analysis
Document Type
Conference
Source
2022 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), pp. 1-6, Jul. 2022
Subject
Computing and Processing
Point cloud compression
Deep learning
Representation learning
Three-dimensional displays
Shape
Convolution
Conferences
deep neural network
shape descriptors
non-idealized point cloud
deep fusion network
adaptive neighborhood
Language
English
Abstract
Recently, deep neural networks have made remarkable achievements in 3D point cloud analysis. However, current shape descriptors remain inadequate for capturing point cloud information thoroughly. To address this problem, a feature representation learning method, named Dual-Neighborhood Deep Fusion Network (DNDFN), is proposed to serve as an improved point cloud encoder for point cloud analysis tasks. Specifically, the traditional local neighborhood ignores long-distance dependencies, so DNDFN employs an adaptive key neighborhood replenishment mechanism to overcome this limitation. Furthermore, because the transmission of information between points depends on the unique latent relationship between them, a convolution for capturing this relationship is proposed. Extensive experiments on existing benchmarks, especially non-idealized datasets, verify the effectiveness of DNDFN, which achieves state-of-the-art performance.
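The abstract names two ingredients: a dual neighborhood that supplements the usual local kNN with long-range "key" neighbors, and a convolution driven by the relationship between points. The sketch below is only an illustration of that general idea under stated assumptions; the class and parameter names (DualNeighborhoodBlock, k_local, k_key), the choice of PyTorch, and the feature-space selection of key neighbors are assumptions, not the authors' published implementation.

```python
# Illustrative sketch, not the paper's code. Assumed design: local kNN in
# coordinate space plus long-range "key" neighbors chosen by feature-space
# distance, fused by a relation-aware convolution that weights each neighbor
# via a learned function of the feature difference.
import torch
import torch.nn as nn


def knn_indices(query, reference, k):
    """Indices of the k nearest reference points for each query point."""
    dist = torch.cdist(query, reference)                 # (B, N, M) pairwise distances
    return dist.topk(k, dim=-1, largest=False).indices   # (B, N, k)


class DualNeighborhoodBlock(nn.Module):
    def __init__(self, in_dim, out_dim, k_local=16, k_key=8):
        super().__init__()
        self.k_local, self.k_key = k_local, k_key
        # Relation function: maps a feature difference to a message vector.
        self.relation = nn.Sequential(
            nn.Linear(in_dim, out_dim), nn.ReLU(),
            nn.Linear(out_dim, out_dim),
        )
        self.update = nn.Linear(in_dim + out_dim, out_dim)

    def forward(self, xyz, feat):
        # xyz: (B, N, 3) coordinates, feat: (B, N, C) per-point features
        idx_local = knn_indices(xyz, xyz, self.k_local)   # geometric neighbors
        idx_key = knn_indices(feat, feat, self.k_key)     # long-range neighbors (feature space)
        idx = torch.cat([idx_local, idx_key], dim=-1)     # dual neighborhood (B, N, K)

        B, N, K = idx.shape
        batch = torch.arange(B, device=feat.device).view(B, 1, 1)
        neigh = feat[batch, idx]                          # (B, N, K, C) gathered neighbor features
        diff = neigh - feat.unsqueeze(2)                  # relative features encode the relationship
        msg = self.relation(diff).max(dim=2).values       # aggregate over both neighborhoods
        return self.update(torch.cat([feat, msg], dim=-1))


if __name__ == "__main__":
    block = DualNeighborhoodBlock(in_dim=32, out_dim=64)
    xyz, feat = torch.rand(2, 1024, 3), torch.rand(2, 1024, 32)
    print(block(xyz, feat).shape)  # torch.Size([2, 1024, 64])
```

Selecting key neighbors by feature similarity is one plausible way to realize the "adaptive key neighborhood replenishment" described above, since it lets distant but semantically related points contribute to a point's representation.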