Academic Paper

Multi-scale Heterogeneous Graph Contrastive Learning*
Document Type
Conference
Source
2023 IEEE International Conference on Big Data (BigData), pp. 4432-4441, Dec. 2023
Subject
Bioengineering
Computing and Processing
Geoscience
Robotics and Control Systems
Signal Processing and Analysis
Representation learning
Semantics
Self-supervised learning
Semisupervised learning
Big Data
Graph neural networks
Task analysis
heterogeneous graph
multi-scale
contrastive learning
meta-path
Abstract
In recent years, heterogeneous graph neural networks have become the mainstream approach for handling heterogeneous graph data. However, because labels are sparse, most existing heterogeneous graph neural network methods rely on semi-supervised learning, which limits their practical applicability. To address this issue, we propose a self-supervised heterogeneous graph representation learning method, namely Multi-scale Heterogeneous Graph Contrastive Learning (MHGCL). The approach decodes the encoded representations from two views, meta-paths and network patterns, in a multi-scale fashion. Its loss function maximizes the similarity between positive pairs at different scales and minimizes the similarity between negative pairs, pulling related nodes and edges close together in the embedding space while pushing unrelated ones farther apart. Experimental results demonstrate that MHGCL comprehensively captures semantic information between nodes at different scales and achieves excellent performance on node classification tasks, validating its effectiveness for heterogeneous graph node embedding learning.