Academic paper

Robust graph neural networks with Dirichlet regularization and residual connection
Document Type
Original Paper
Source
International Journal of Machine Learning and Cybernetics. 15(9):3733-3743
Subject
Graph neural networks
Dirichlet regularization
Residual connection
Self-supervised learning
Mask strategy
Language
English
ISSN
1868-8071
1868-808X
Abstract
Graph neural networks (GNNs) have attracted considerable research interest for a variety of graph data modeling tasks. Most GNNs require efficient and sufficient label information during the training phase. In open environments, however, the performance of existing GNNs degrades sharply when graph data (structure, attributes and labels) are missing or noisy. Several recent attempts have been made to improve the performance and robustness of GNNs, most of which rely on contrastive learning-based or autoencoder-based strategies. In this paper, a semi-supervised learning framework for graph modeling tasks is proposed, namely the robust graph neural network with Dirichlet regularization and Residual connection (DRGNN). Specifically, both the structure and the features of the original graph are masked to generate a masked graph, which is fed into the graph representation learning block (encoder) to learn latent node representations. Additionally, an initial residual connection is introduced into the graph representation learning block to transmit the original node features directly to the last layer, retaining the inherent information of each node. Finally, the whole network is jointly optimized by the structure reconstruction loss, the feature reconstruction loss and the classification loss. Note that a Dirichlet regularization constraint is introduced into the learning objective to drive the latent node representations toward local smoothness, which better conforms to the manifold assumption of graph representation learning. Extensive experiments on benchmark datasets demonstrate the state-of-the-art accuracy and robustness of the proposed DRGNN.
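
The abstract describes the joint objective only at a high level. The sketch below is a minimal illustration of that kind of objective, not the authors' implementation: it assumes a dense binary adjacency matrix, a two-layer GCN-style encoder with an initial residual connection, an inner-product structure decoder, and the graph Dirichlet energy tr(Z^T L Z) as the regularizer. The helper modules dec_feat and clf, the mask rates, and the loss weights lam_* are illustrative assumptions.

    # Minimal sketch of a DRGNN-style joint objective (assumptions noted above).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Encoder(nn.Module):
        """GCN-style encoder; each layer adds back a projection of the raw
        features (initial residual connection) so the inherent node
        information is retained in the last layer."""
        def __init__(self, in_dim, hid_dim, alpha=0.1):
            super().__init__()
            self.lin1 = nn.Linear(in_dim, hid_dim)
            self.lin2 = nn.Linear(hid_dim, hid_dim)
            self.proj0 = nn.Linear(in_dim, hid_dim)  # raw features -> hidden space
            self.alpha = alpha                       # residual mixing weight (assumed)

        def forward(self, x, a_norm):
            h0 = self.proj0(x)                       # initial residual term
            h = F.relu(a_norm @ self.lin1(x))
            h = (1 - self.alpha) * h + self.alpha * h0
            h = a_norm @ self.lin2(h)
            return (1 - self.alpha) * h + self.alpha * h0

    def dirichlet_energy(z, lap):
        """tr(Z^T L Z): small when connected nodes have similar representations."""
        return torch.trace(z.t() @ lap @ z) / z.shape[0]

    def drgnn_loss(encoder, clf, dec_feat, x, adj, labels, train_mask,
                   feat_mask_rate=0.3, edge_mask_rate=0.3,
                   lam_feat=1.0, lam_struct=1.0, lam_dir=0.1):
        n = x.shape[0]

        # Mask features and structure to build the masked graph.
        feat_mask = (torch.rand(n, 1) < feat_mask_rate).float()
        x_masked = x * (1 - feat_mask)
        edge_mask = (torch.rand_like(adj) < edge_mask_rate).float()
        adj_masked = adj * (1 - edge_mask)

        # Symmetric normalization of the masked adjacency (with self-loops).
        a = adj_masked + torch.eye(n)
        d_inv_sqrt = a.sum(1).clamp(min=1e-6).pow(-0.5)
        a_norm = d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]

        z = encoder(x_masked, a_norm)                # latent node representations

        # Feature and structure reconstruction losses.
        loss_feat = F.mse_loss(dec_feat(z), x)
        adj_rec = torch.sigmoid(z @ z.t())
        loss_struct = F.binary_cross_entropy(adj_rec, adj)

        # Classification loss on labeled nodes only (semi-supervised).
        logits = clf(z)
        loss_clf = F.cross_entropy(logits[train_mask], labels[train_mask])

        # Dirichlet regularization on the original graph Laplacian L = D - A.
        lap = torch.diag(adj.sum(1)) - adj
        loss_dir = dirichlet_energy(z, lap)

        return (loss_clf + lam_feat * loss_feat
                + lam_struct * loss_struct + lam_dir * loss_dir)

In this sketch the Dirichlet term penalizes large differences between the representations of adjacent nodes, which is one standard way to encode the local-smoothness (manifold) assumption mentioned in the abstract.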