Poster
in
Workshop: Temporal Graph Learning Workshop

Learning Dynamic Graph Embeddings Using Random Walk With Temporal Backtracking

Chenghan Huang · Lili Wang · Xinyuan Cao · Weicheng Ma · Soroush Vosoughi

Keywords: [ Graph Representation Learning ] [ temporal graphs ] [ graph retrieval ] [ Dynamic Graph Embedding ]


Abstract:

Representation learning on graphs (also referred to as network embedding) can be done at different levels of granularity, from the node level to the graph level. The majority of work on graph representation learning focuses on the former, and while there has been some work on graph-level embedding, it typically deals with static networks. However, learning low-dimensional graph-level representations for dynamic (i.e., temporal) networks is important for downstream graph retrieval tasks such as temporal graph similarity ranking, temporal graph isomorphism, and anomaly detection. In this paper, we propose a novel temporal graph-level embedding method to fill this gap. Our method first builds a multilayer graph and then uses a novel modified random walk with temporal backtracking to generate temporal contexts for the nodes in the graph. Finally, a "document-level" language model is learned from these contexts to generate graph-level embeddings. We evaluate our model on five publicly available datasets for two commonly used tasks: graph similarity ranking and anomaly detection. Our results show that our method achieves state-of-the-art performance, outperforming all prior baselines.
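The abstract outlines a pipeline of (1) building a multilayer graph with one layer per timestamp, (2) running random walks that can backtrack to earlier layers, and (3) feeding the resulting node contexts to a document-level language model. As a rough illustration of step (2) only, here is a minimal sketch of one plausible reading of "random walk with temporal backtracking"; the toy `layers` structure, the `p_backtrack` parameter, and the transition rule are all assumptions, not the authors' actual algorithm or API.

```python
import random

# Toy multilayer temporal graph: one adjacency dict per snapshot (layer),
# index t = timestamp. Purely illustrative data.
layers = [
    {0: [1], 1: [0, 2], 2: [1]},      # t = 0
    {0: [1, 2], 1: [0], 2: [0]},      # t = 1
    {0: [2], 1: [2], 2: [0, 1]},      # t = 2
]

def temporal_walk(layers, start_node, length, p_backtrack=0.3, seed=None):
    """Random walk over a multilayer (temporal) graph.

    At each step the walk either advances to the next time layer or,
    with probability p_backtrack, "backtracks" to the previous layer,
    then moves to a random neighbour in the chosen layer. This is one
    plausible interpretation; the paper's exact rule may differ.
    """
    rng = random.Random(seed)
    t, node = 0, start_node
    context = [f"{node}@t{t}"]  # tokens recording node and time layer
    for _ in range(length):
        if t > 0 and rng.random() < p_backtrack:
            t -= 1                       # temporal backtracking step
        elif t < len(layers) - 1:
            t += 1                       # otherwise advance in time
        neighbours = layers[t].get(node, [])
        if not neighbours:
            break                        # dead end in this layer
        node = rng.choice(neighbours)
        context.append(f"{node}@t{t}")
    return context

# One walk per starting node; the collected contexts would then be
# treated as a "document" per graph by a doc2vec-style model (step 3).
walks = [temporal_walk(layers, n, 10, seed=n) for n in layers[0]]
```

The walks from many graphs, each grouped as one "document", would then be the input to the document-level language model that produces the final graph-level embeddings.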