What is it about?
In recent years, Graph Neural Networks (GNNs) have demonstrated great success in Knowledge Graph Completion (KGC) by modeling how entities and relations interact. However, most of them are designed to learn only from the observed graph structure. Motivated by the causal relationships among entities on a knowledge graph, we explore this limitation through a counterfactual question: “Would the relation still exist if the neighborhood of the entities became different from what was observed?” In this paper, using a carefully designed instantiation of a causal model on the knowledge graph, we generate counterfactual relations to answer this question by regarding the representation of an entity pair given a relation as the context, the structural information of the relation-aware neighborhood as the treatment, and the validity of the composed triplet as the outcome.
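To make that context/treatment/outcome mapping concrete, the following is a minimal, purely illustrative Python sketch, not the paper's implementation: the toy knowledge graph, the function names (relation_aware_neighborhood, intervene, validity_score), and the edge-dropping intervention are hypothetical simplifications, whereas the actual method learns GNN-based representations and a learned validity score.

# Illustrative sketch of the causal framing (hypothetical names throughout):
#   context   -> the entity pair (h, t) under a query relation r
#   treatment -> the relation-aware neighborhood structure around the pair
#   outcome   -> a validity score for the composed triplet (h, r, t)
import random
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (head, relation, tail)

def relation_aware_neighborhood(kg: List[Triple], entity: str, relation: str) -> List[Triple]:
    # Treatment variable: edges around `entity` that share the query relation.
    return [(h, r, t) for (h, r, t) in kg if r == relation and (h == entity or t == entity)]

def intervene(neighborhood: List[Triple], drop_prob: float = 0.5) -> List[Triple]:
    # Counterfactual treatment: perturb the observed neighborhood (here by
    # randomly dropping edges) to ask "would the relation still hold?".
    return [edge for edge in neighborhood if random.random() > drop_prob]

def validity_score(context: Tuple[str, str], relation: str, neighborhood: List[Triple]) -> float:
    # Outcome variable: a toy stand-in for a learned validity score; here we
    # simply let more supporting edges yield a higher score.
    return len(neighborhood) / (len(neighborhood) + 1)

if __name__ == "__main__":
    kg = [("alice", "works_at", "acme"),
          ("bob", "works_at", "acme"),
          ("alice", "knows", "bob")]
    context = ("alice", "acme")           # entity pair, held fixed
    observed = relation_aware_neighborhood(kg, "alice", "works_at")
    counterfactual = intervene(observed)  # altered neighborhood (treatment)
    print("factual outcome:       ", validity_score(context, "works_at", observed))
    print("counterfactual outcome:", validity_score(context, "works_at", counterfactual))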
Why is it important?
Motivated by the need for data augmentation on knowledge graphs, we propose the first instantiation of a causal model for KGs that answers counterfactual questions while taking relation types into account. We present a framework that uses counterfactual relations to augment representation learning on KGs, with special attention to the imbalanced relation distribution. Our findings further show that counterfactual relations also improve interpretability through path-based explanations of predictions.
Read the Original
This page is a summary of: Knowledge Graph Completion with Counterfactual Augmentation, April 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3543507.3583401.