What is it about?
In this paper, we propose ProtoMGAE, a graph representation learning method that combines prototype-aware contrastive learning with masked graph auto-encoding. Our model leverages three complementary objectives, namely masked feature reconstruction, clustering consistency, and representation contrasting, to capture graph information and learn node representations from macro, meso, and micro perspectives.
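As a rough illustration, the minimal PyTorch sketch below shows one way the three objectives could be combined into a single training loss. All names (the loss functions, `prototypes`, the loss weights) are our own illustrative assumptions, not the exact formulation from the paper.

```python
# Minimal sketch of combining the three objectives (illustrative, not the
# paper's exact formulation). Shapes: features x [N, d_in], representations
# z_* [N, d], shared prototypes [K, d].
import torch
import torch.nn.functional as F

def masked_reconstruction_loss(x_rec, x, mask):
    """Reconstruct only the features of nodes that were masked out."""
    return F.mse_loss(x_rec[mask], x[mask])

def clustering_consistency_loss(z_online, z_target, prototypes, temp=0.1):
    """Soft cluster assignments over shared prototypes should agree
    between the online and target views (cross-entropy form)."""
    log_p_online = F.log_softmax(z_online @ prototypes.t() / temp, dim=-1)
    p_target = F.softmax(z_target @ prototypes.t() / temp, dim=-1)
    return -(p_target.detach() * log_p_online).sum(dim=-1).mean()

def representation_contrast_loss(z_online, z_target):
    """Pull each node's online representation toward its target counterpart
    (BYOL-style cosine loss with a stop-gradient on the target branch)."""
    z1 = F.normalize(z_online, dim=-1)
    z2 = F.normalize(z_target, dim=-1)
    return (2.0 - 2.0 * (z1 * z2.detach()).sum(dim=-1)).mean()

# Hypothetical weighted sum of the three terms:
# loss = rec + lambda_c * consistency + lambda_r * contrast
```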
Why is it important?
Enhanced representations. We employ a masked graph modeling strategy to accommodate incomplete graphs with missing node features. Moreover, the contrastive objective of the online-target network pulls positive pairs close together while encouraging the representations to be uniformly distributed on the unit hypersphere. Together, these strategies help us learn more robust and discriminative node representations.

Performance improvement. Extensive experiments on several datasets demonstrate that the proposed method achieves significantly better or competitive performance on downstream tasks, especially graph clustering, compared with state-of-the-art methods, showcasing its strength in graph representation learning.
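To make the two ideas above concrete, here is a short sketch of (a) random feature masking for incomplete graphs and (b) the alignment/uniformity view of contrastive learning (Wang and Isola, 2020) that the hypersphere claim appeals to. The function names and the `mask_ratio` and `t` parameters are illustrative assumptions, not values from the paper.

```python
# Sketch of feature masking plus alignment/uniformity losses (illustrative).
import torch
import torch.nn.functional as F

def random_feature_mask(x, mask_ratio=0.5):
    """Hide a random fraction of node feature rows (zeroed out), mimicking
    missing features; the model is trained to reconstruct them."""
    mask = torch.rand(x.size(0), device=x.device) < mask_ratio
    x_masked = x.clone()
    x_masked[mask] = 0.0
    return x_masked, mask

def align_loss(z_online, z_target, alpha=2):
    """Alignment: positive pairs should lie close on the unit hypersphere."""
    z1 = F.normalize(z_online, dim=-1)
    z2 = F.normalize(z_target, dim=-1)
    return (z1 - z2).norm(p=2, dim=-1).pow(alpha).mean()

def uniform_loss(z, t=2):
    """Uniformity: representations should spread evenly over the
    hypersphere, measured via a Gaussian potential on pairwise distances."""
    z = F.normalize(z, dim=-1)
    return torch.pdist(z, p=2).pow(2).mul(-t).exp().mean().log()
```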
Read the Original
This page is a summary of: ProtoMGAE: Prototype-Aware Masked Graph Auto-Encoder for Graph Representation Learning, ACM Transactions on Knowledge Discovery from Data, April 2024, ACM (Association for Computing Machinery). DOI: 10.1145/3649143.