What is it about?

Link prediction on dynamic graphs is an important task in graph mining. Existing approaches based on dynamic graph neural networks (DGNNs) typically require a significant amount of historical data (interactions over time), which is not always available in practice. Missing links over time, a common phenomenon in graph data, further aggravate the issue, producing graphs that are both extremely sparse and dynamic. To address this problem, we propose a novel method based on the neural process, called the Graph Sequential Neural ODE Process (GSNOP). Specifically, GSNOP combines the advantages of the neural process and the neural ordinary differential equation to model link prediction on dynamic graphs as a dynamically changing stochastic process. By defining a distribution over functions, GSNOP introduces uncertainty into its predictions, allowing it to generalize to more situations instead of overfitting to the sparse data. GSNOP is also agnostic to model structure: it can be integrated with any DGNN to incorporate both chronological and geometrical information for link prediction. Extensive experiments on three dynamic graph datasets show that GSNOP significantly improves the performance of existing DGNNs and outperforms other neural process variants.
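To make the two ingredients concrete, here is a minimal, illustrative sketch of the general idea, not the authors' implementation: a latent state is evolved through time with an ODE solver (the "neural ODE" part), and predictions are made by averaging over decoder functions sampled from a latent-conditioned Gaussian (the "distribution over functions" of a neural process). All function names, the Euler solver, the single-layer drift, and the bilinear decoder are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def ode_drift(z, t, W):
    # Hypothetical drift network f(z, t); here just one tanh layer.
    return np.tanh(W @ z)

def evolve_latent(z0, t0, t1, W, steps=20):
    # Euler integration of dz/dt = f(z, t), standing in for a neural ODE
    # that carries the graph's latent state forward in time.
    z, t = z0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        z = z + dt * ode_drift(z, t, W)
        t += dt
    return z

def sample_link_prob(h_u, h_v, z, V, n_samples=50):
    # Neural-process-style prediction: the latent z parameterizes a
    # Gaussian over decoder weights; averaging the sampled decoders'
    # outputs yields an uncertainty-aware link probability.
    mu = V @ z
    log_sigma = np.zeros_like(mu)  # fixed unit variance for simplicity
    probs = []
    for _ in range(n_samples):
        w = mu + np.exp(log_sigma) * rng.standard_normal(mu.shape)
        score = w @ (h_u * h_v)  # simple bilinear-style link decoder
        probs.append(1.0 / (1.0 + np.exp(-score)))
    return float(np.mean(probs))

# Demo with random node embeddings (h_u, h_v would come from a DGNN).
d = 8
W = rng.standard_normal((d, d)) * 0.1
V = rng.standard_normal((d, d)) * 0.1
z_t = evolve_latent(rng.standard_normal(d), 0.0, 1.0, W)
p = sample_link_prob(rng.standard_normal(d), rng.standard_normal(d), z_t, V)
```

Because the method is agnostic to the DGNN backbone, the embeddings `h_u` and `h_v` here could come from any dynamic graph encoder; only the latent dynamics and the sampled decoder are specific to the neural-process view.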


Why is it important?

Existing approaches to link prediction on dynamic graphs typically require a significant amount of historical data. Missing links over time, a common phenomenon in graph data, further aggravate the issue, producing graphs that are both extremely sparse and dynamic. How to enable effective link prediction on dynamic and sparse graphs remains a significant challenge in this area.

Read the Original

This page is a summary of: Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs, February 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3539597.3570465.
You can read the full text via the DOI above.
