Temporal Graph Learning Workshop @ NeurIPS 2023
Shenyang Huang · Farimah Poursafaei · Kellin Pelrine · Julia Gastinger · Emanuele Rossi · Michael Bronstein · Reihaneh Rabbany
Room 203 - 205
Temporal graph learning is an emerging area of research in graph representation learning, motivated by the prevalence of evolving, dynamic, interconnected data across domains and applications. This workshop, the second on temporal graph learning, brings together researchers working on relevant areas to exchange ideas on different aspects of the field, including datasets for discrete- and continuous-time graphs, evaluation strategies, theoretical foundations, and the use of temporal graph learning paradigms in real-world applications.
Schedule
Sat 6:15 a.m. - 6:30 a.m. | Opening Remarks
Sat 6:30 a.m. - 6:45 a.m. | Spotlight Talk: Continuous-time Graph Representation with Sequential Survival Process
Sat 6:45 a.m. - 7:00 a.m. | Spotlight Talk: Deep Graph Kernel Point Processes
Sat 7:00 a.m. - 7:30 a.m. | Keynote: Daniele Zambon
Sat 7:30 a.m. - 8:00 a.m. | Keynote: Ingo Scholtes
Sat 8:00 a.m. - 8:30 a.m. | Coffee Break
Sat 8:30 a.m. - 8:45 a.m. | Spotlight Talk: SAUC: Sparsity-Aware Uncertainty Calibration for Spatiotemporal Prediction with Graph Neural Networks
Sat 8:45 a.m. - 9:00 a.m. | Spotlight Talk: GenTKG: Generative Forecasting on Temporal Knowledge Graph
Sat 9:00 a.m. - 10:00 a.m. | Poster Session
Sat 10:00 a.m. - 11:30 a.m. | Lunch Break
Sat 11:30 a.m. - 12:00 p.m. | Keynote: Rex Ying
Sat 12:00 p.m. - 12:30 p.m. | Keynote: Marinka Zitnik
Sat 12:30 p.m. - 1:00 p.m. | Keynote: Kelsey Allen
Sat 1:00 p.m. - 1:30 p.m. | Coffee Break
Sat 1:30 p.m. - 2:15 p.m. | Round Table Discussion
Sat 2:15 p.m. - 3:15 p.m. | Panel Discussion
Sat 3:15 p.m. - 3:30 p.m. | Closing Remarks
Posters

Effective Non-Dissipative Propagation for Continuous-Time Dynamic Graphs (Poster)
Recent research on Deep Graph Networks (DGNs) has broadened the domain of learning on graphs to real-world systems of interconnected entities that evolve over time. This paper addresses prediction problems on graphs defined by a stream of events, possibly irregularly sampled over time, generally referred to as Continuous-Time Dynamic Graphs (C-TDGs). While many predictive problems on graphs may require capturing interactions between nodes at different distances, existing DGNs for C-TDGs are not designed to propagate and preserve long-range information, resulting in suboptimal performance. In this work, we present the Continuous-Time Graph Anti-Symmetric Network (CTAN), a DGN for C-TDGs designed within the ordinary differential equations framework that enables efficient propagation of long-range dependencies. We show that our method robustly performs stable and non-dissipative information propagation over dynamically evolving graphs, where the number of ODE discretization steps allows scaling the propagation range. We empirically validate the proposed approach on several real and synthetic graph benchmarks, showing that CTAN leads to improved performance while enabling the propagation of long-range information.
Alessio Gravina · Giulio Lovisotto · Claudio Gallicchio · Davide Bacciu · Claas Grohnfeldt
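The stability claim above comes from the anti-symmetric parameterization of the ODE. As a rough illustration (not the paper's CTAN layer, which also aggregates neighbor messages), the sketch below shows one explicit-Euler step of an anti-symmetric ODE; `W`, `eps`, and the feature vector are placeholder assumptions:

```python
import numpy as np

def antisymmetric_step(x, W, eps=0.1):
    """One explicit-Euler step of x' = tanh(A x) with A anti-symmetric.

    A = W - W^T has purely imaginary eigenvalues, which keeps the
    dynamics stable and non-dissipative; taking more discretization
    steps extends how far information propagates.
    """
    A = W - W.T
    return x + eps * np.tanh(A @ x)
```

Stacking such steps propagates information without the exponential decay that a generic (dissipative) weight matrix would introduce.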
Graph-based Time Series Clustering for End-to-End Hierarchical Forecasting (Poster)
Existing relationships among time series can be exploited as inductive biases in learning effective forecasting models. In hierarchical time series, relationships among subsets of sequences induce hard constraints (hierarchical inductive biases) on the predicted values. In this paper, we propose a graph-based methodology to unify relational and hierarchical inductive biases in the context of deep learning for time series forecasting. In particular, we model both types of relationships as dependencies in a pyramidal graph structure, with each pyramidal layer corresponding to a level of the hierarchy. By exploiting modern, trainable graph pooling operators, we show that the hierarchical structure, if not available as a prior, can be learned directly from data, thus obtaining cluster assignments aligned with the forecasting objective. A differentiable reconciliation stage is incorporated into the processing architecture, allowing hierarchical constraints to act both as an architectural bias and as a regularization element for predictions. Simulation results on representative datasets show that the proposed method compares favorably against the state of the art.
Andrea Cini · Danilo Mandic · Cesare Alippi
Predicting COVID-19 pandemic by spatio-temporal graph neural networks: A New Zealand's study (Poster)
Modeling and simulation of pandemic dynamics play an essential role in understanding and addressing the spread of highly infectious diseases such as COVID-19. In this work, we propose a novel deep learning architecture named Attention-based Multiresolution Graph Neural Networks (ATMGNN) that learns to combine spatial graph information, i.e. geographical data, with temporal information, i.e. time-series data on the number of COVID-19 cases, to predict the future dynamics of the pandemic. The key innovation is that our method can capture the multiscale structures of the spatial graph via a learning-to-cluster algorithm in a data-driven manner. This allows our architecture to learn to pick up either local or global signals of a pandemic, and to model both long-range spatial and temporal dependencies. Importantly, we collected and assembled a new dataset for New Zealand. We established a comprehensive benchmark of statistical methods, temporal architectures, and graph neural networks, along with our spatio-temporal model. We also incorporated socioeconomic cross-sectional data to further enhance our predictions. Our proposed model has shown highly robust predictions and outperformed all other baselines on various metrics for our new New Zealand dataset. Our data and source code are publicly available at https://github.com/HySonLab/pandemic_tgnn
Bach Nguyen · Truong Son Hy · Long Tran-Thanh · Nhung Nghiem
DspGNN: Bringing Spectral Design to Discrete Time Dynamic Graph Neural Networks for Edge Regression (Poster)
We introduce the Dynamic Spectral-Parsing Graph Neural Network (DspGNN), a novel model that incorporates spectral-designed graph convolution for representation learning and edge regression on Discrete Time Dynamic Graphs (DTDGs). Our first major contribution is the adaptation and optimization of spectral-designed methods to better capture evolving spectral information on DTDGs. Second, to address the computational challenge of performing eigendecomposition on large DTDGs, we propose a novel technique, Active Node Mapping, that proves to be both simple and effective. Our model consistently outperforms baseline methods on three publicly available datasets for edge regression tasks. Finally, we discuss future challenges and prospects in this under-explored field.
Leshanshui Yang · Clement Chatelain · Sébastien Adam
Continuous-time Graph Representation with Sequential Survival Process (Poster)
Over the past two decades, there has been tremendous growth in representation learning methods for graphs, with numerous applications across various fields, including bioinformatics, chemistry, and the social sciences. However, current dynamic network approaches focus on discrete-time networks or treat links in continuous-time networks as instantaneous events. These approaches are therefore limited in capturing the persistence or absence of links that continuously emerge and disappear over time for particular durations. To address this, we propose a novel stochastic process relying on survival functions to model the durations of links and of their absences over time. This forms a generic new likelihood specification explicitly accounting for intermittent edge-persistent networks, namely GRaS2P: Graph Representation with Sequential Survival Process. We apply the developed framework to a recent continuous-time dynamic latent distance model characterizing network dynamics in terms of a sequence of piecewise-linear movements of nodes in latent space. We quantitatively assess the framework on various downstream tasks, such as link prediction and network completion, demonstrating that accounting for link persistence and absence tracks the intrinsic trajectories of nodes in latent space well and captures the underlying characteristics of the evolving network structure.
Abdulkadir Celikkanat · Nikolaos Nakis · Morten Mørup
Using Causality-Aware Graph Neural Networks to Predict Temporal Centralities in Dynamic Graphs (Poster)
Node centralities play a pivotal role in network science, social network analysis, and recommender systems. In temporal data, static path-based centralities like closeness or betweenness can give misleading results about the true importance of nodes in a temporal graph. To address this issue, temporal generalizations of betweenness and closeness have been defined that are based on the shortest time-respecting paths between pairs of nodes. However, a major issue with those generalizations is that computing such paths is expensive. Addressing this issue, we study the application of De Bruijn Graph Neural Networks (DBGNN), a causality-aware graph neural network architecture, to predict temporal path-based centralities in time series data. We experimentally evaluate our approach on 13 temporal graphs from biological and social systems and show that it considerably improves the prediction of both betweenness and closeness centrality compared to a static Graph Convolutional Neural Network.
Franziska Heeg · Ingo Scholtes
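To make the notion of time-respecting paths concrete, here is a minimal sketch (not the DBGNN code) of the quantity underlying temporal closeness: earliest-arrival times computed in one pass over a time-ordered contact list. The event format and the closeness definition used here are illustrative assumptions:

```python
from collections import defaultdict
import math

def earliest_arrival(events, source, start=0):
    """Earliest-arrival times from `source` over time-respecting paths.

    `events` is a list of directed contacts (u, v, t); a path is
    time-respecting if consecutive contacts have non-decreasing times.
    """
    arrival = defaultdict(lambda: math.inf)
    arrival[source] = start
    # One pass over time-ordered events suffices for earliest arrival.
    for u, v, t in sorted(events, key=lambda e: e[2]):
        if arrival[u] <= t and t < arrival[v]:
            arrival[v] = t
    return dict(arrival)

def temporal_closeness(events, source, nodes, start=0):
    """Sum of inverse earliest-arrival durations to all other nodes."""
    arrival = earliest_arrival(events, source, start)
    return sum(1.0 / (arrival.get(v, math.inf) - start)
               for v in nodes
               if v != source and arrival.get(v, math.inf) > start)
```

Repeating this for every node and timestamp is what makes exact temporal centralities expensive, and what motivates predicting them instead.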
Fast Temporal Wavelet Graph Neural Networks (Poster)
Spatio-temporal signal forecasting plays an important role in numerous domains, especially neuroscience and transportation. The task is challenging due to the highly intricate spatial structure, as well as the non-linear temporal dynamics, of the network. To facilitate reliable and timely forecasts for the human brain and traffic networks, we propose the Fast Temporal Wavelet Graph Neural Network (FTWGNN), which is both time- and memory-efficient for learning tasks on time-series data with an underlying graph structure, thanks to multiresolution analysis and wavelet theory on discrete spaces. We employ Multiresolution Matrix Factorization (MMF) (Kondor et al., 2014) to factorize the highly dense graph structure and compute the corresponding sparse wavelet basis, which allows us to construct a fast wavelet convolution as the backbone of our novel architecture. Experimental results on the real-world PEMS-BAY and METR-LA traffic datasets and the AJILE12 ECoG dataset show that FTWGNN is competitive with the state of the art while maintaining a low computational footprint. Our PyTorch implementation is publicly available at https://github.com/HySonLab/TWGNN
Duc Thien Nguyen · Tuan Nguyen · Truong Son Hy · Risi Kondor
Adaptive Message Passing Sign Algorithm (Poster)
A new algorithm named the Adaptive Message Passing Sign (AMPS) algorithm is introduced for online prediction, missing-data imputation, and impulsive-noise removal in time-varying graph signals. This work investigates the potential of message passing on spectral adaptive graph filters to define online localized node aggregations. AMPS updates a sign error derived from $l_1$-norm optimization between observation and estimation, leading to fast and robust predictions in the presence of impulsive noise. The combination of adaptive spectral graph filters with message passing offers a new perspective on message passing, and vice versa. Tested on a real-world network formed by a map of nationwide weather stations, the AMPS algorithm accurately forecasts time-varying temperatures.
Changran Peng · Yi Yan · Ercan KURUOGLU
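The robustness to impulsive noise comes from updating with the sign of the error, the gradient of an l1 cost, rather than the error itself. The sketch below is a generic sign-error adaptive step on a graph signal, not the authors' AMPS implementation; the aggregation operator `S` and step size `mu` are illustrative assumptions:

```python
import numpy as np

def sign_update(x_hat, y_obs, mask, S, mu=0.1):
    """One sign-error adaptive step on a graph signal.

    x_hat : current estimate of the signal on all nodes
    y_obs : noisy observations (valid where `mask` is True)
    S     : one-hop message-passing (aggregation) operator
    Taking the sign of the error (the l1-norm gradient) bounds the
    influence of any single impulsive outlier to +-mu per step.
    """
    err = np.where(mask, y_obs - x_hat, 0.0)
    # Propagate the sign error to neighbors, then take a bounded step.
    return x_hat + mu * (S @ np.sign(err))
```

Note that a huge outlier (e.g. an observation of 100) moves the estimate by exactly mu, the same as a small error, which is the point of the l1-derived update.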
GenTKG: Generative Forecasting on Temporal Knowledge Graph (Poster)
The rapid advancement of large language models (LLMs) has ignited interest in the temporal knowledge graph (TKG) domain, where carefully designed embedding-based and rule-based models have conventionally dominated. It remains an open question whether pre-trained LLMs can understand structured temporal relational data and replace these models as the foundation for temporal relational forecasting. We therefore bring temporal knowledge forecasting into the generative setting. However, challenges arise from the chasm between the complex graph data structure and the sequential natural-language expressions LLMs can handle, and between the enormous data volume of TKGs and the heavy computational cost of fine-tuning LLMs. To address these challenges, we propose GenTKG, a novel retrieval-augmented generation framework combining a temporal logical rule-based retrieval strategy with lightweight few-shot parameter-efficient instruction tuning. Extensive experiments show that GenTKG is a simple but effective, efficient, and generalizable approach that outperforms conventional methods on temporal relational forecasting with extremely limited computation. Our work opens a new frontier for the temporal knowledge graph domain.
Ruotong Liao · Xu Jia · Yunpu Ma · Volker Tresp
Large-scale Graph Representation Learning of Dynamic Brain Connectome with Transformers (Poster)
Graph Transformers have recently been successful in various graph representation learning tasks, providing a number of advantages over message-passing Graph Neural Networks. Utilizing Graph Transformers to learn representations of the brain functional connectivity network is also gaining interest. However, studies to date have overlooked the temporal dynamics of functional connectivity, which fluctuates over time. Here, we propose a method for learning representations of dynamic functional connectivity with Graph Transformers. Specifically, we define a connectome embedding that holds the position, structure, and time information of the functional connectivity graph, and use Transformers to learn its representation across time. We perform experiments with over 50,000 resting-state fMRI samples obtained from three datasets, the largest amount of fMRI data used in such studies to date. The experimental results show that our proposed method outperforms other competitive baselines on gender classification and age regression tasks based on the functional connectivity extracted from the resting-state fMRI data.
Byung-Hoon Kim · Jungwon Choi · EungGu Yun · Kyungsang Kim · Xiang Li · Juho Lee
SAUC: Sparsity-Aware Uncertainty Calibration for Spatiotemporal Prediction with Graph Neural Networks (Poster)
Quantifying uncertainty is essential for achieving robust and reliable predictions. However, existing spatiotemporal models predominantly predict deterministic values, often overlooking the uncertainty in their forecasts. High-resolution spatiotemporal datasets in particular are rich in zeros, posing further challenges for quantifying the uncertainty of such sparse and asymmetrically distributed data. This paper introduces a novel post-hoc Sparsity-Aware Uncertainty Calibration (SAUC) method that calibrates the uncertainty in both zero and non-zero values. We modify state-of-the-art deterministic spatiotemporal Graph Neural Networks (GNNs) into probabilistic ones as the synthetic models in the pre-calibration phase. Applied to two real-world spatiotemporal datasets of varied granularities, extensive experiments demonstrate SAUC's capacity to adeptly calibrate uncertainty, effectively fitting the variance of zero values and exhibiting robust generalizability. Specifically, our empirical experiments show a 20% reduction in calibration errors on zero entries for sparse traffic accident and urban crime prediction. These results validate our method's theoretical and empirical value, providing calibrated results that offer reliable safety guidance and bridging a significant gap in uncertainty quantification (UQ) for sparse spatiotemporal data.
Dingyi Zhuang · Yuheng Bu · Guang Wang · Shenhao Wang · Jinhua Zhao
Leveraging Temporal Graph Networks Using Module Decoupling (Poster)
Modern approaches for learning on dynamic graphs have adopted the use of batches instead of applying updates one by one. Batches make these techniques useful in streaming scenarios where updates to graphs arrive at extreme speeds. Using batches, however, forces the models to update infrequently, which degrades their performance. In this work, we suggest a decoupling strategy that enables models to update frequently while using batches. By decoupling the core modules of temporal graph networks and implementing them with a minimal number of learnable parameters, we have developed the Lightweight Decoupled Temporal Graph Network (LDTGN), an exceptionally efficient model for learning on dynamic graphs. LDTGN was validated on various dynamic graph benchmarks, providing comparable or state-of-the-art results with significantly higher throughput than prior art. Notably, our method outperforms previous approaches by more than 20% on benchmarks that require rapid model update rates, such as USLegis or UNTrade. The code to reproduce our experiments is available at https://github.com/TPFI22/MODULES-DECOUPLING
Or Feldman · Chaim Baskin
Exploring Graph Structure in Graph Neural Networks for Epidemic Forecasting (Poster)
Graph neural networks (GNNs) that incorporate cross-location signals can capture spatial patterns during infectious disease epidemics, potentially improving forecasting performance. However, these models may be susceptible to biases arising from mis-specification, particularly regarding the level of connectivity within the graph (i.e., the graph structure). In this paper, we investigate the impact of graph structure on GNNs for epidemic forecasting. Multiple graph structures are defined and analyzed based on several characteristics, e.g., dense or sparse, geography-based or learned attention. We design a comprehensive ablation study and conduct experiments on real-world data. A major finding is that sparse graphs built from geographical information achieve strong performance and generalize better across tasks than more complex attention-based adjacency matrices.
Sai Supriya Varugunda · ChingHao Fan · Lijing Wang
Gen-T: Reduce Distributed Tracing Operational Costs Using Generative Models (Poster)
Distributed tracing (DT) is an important aspect of modern microservice operations. It allows operators to troubleshoot problems by modeling the sequence of services a specific request traverses in the system. However, transmitting traces incurs significant costs, forcing operators to use coarse-grained pre-filtering or sampling techniques and creating undesirable tradeoffs between cost and fidelity. We propose to circumvent these issues by using generative modeling to capture the semantic structure of collected traces in a lossy yet succinct way. Realizing this potential in practice, however, is challenging: naively extending ideas from the literature on deep generative models for time-series or graph generation can result in poor cost-fidelity tradeoffs. In designing and implementing Gen-T, we tackle key algorithmic and systems challenges to make deep generative models practical for DT. We design a hybrid generative model that separately models different components of DT data and conditionally stitches them together. Our system Gen-T, which has been integrated with the widely used OpenTelemetry framework, achieves a level of fidelity comparable to that of 1:15 sampling, more fine-grained than the default 1:20 sampling in the OpenTelemetry documentation, while maintaining a cost profile equivalent to that of 1:100 lossless-compressed sampling (i.e., a 7x volume reduction).
Saar Tochner · Giulia Fanti · Vyas Sekar
Do Temporal Knowledge Graph Embedding Models Learn or Memorize (Poster)
Temporal Knowledge Graph Embedding (TKGE) models predict missing facts in temporal knowledge graphs. Previous work on static knowledge graph embedding (KGE) models has revealed that KGE models exploit shortcuts from test set leakage to achieve high performance. In this work, we show that a similar test set leakage problem exists in the widely used temporal knowledge graph datasets ICEWS14 and ICEWS05-15. We propose a naive rule-based model that achieves state-of-the-art results on both datasets without any deep learning. Based on these findings, we construct two more challenging test subsets for the evaluation of TKGE models.
Jiaxin Pan · Mojtaba Nayyeri · Yinan Li · Steffen Staab
Marked Neural Spatio-Temporal Point Process Involving a Dynamic Graph Neural Network (Poster)
Spatio-Temporal Point Processes (STPPs) have recently become increasingly interesting for learning on dynamic graph data, since many scientific fields, ranging from mathematics, biology, the social sciences, and physics to computer science, are naturally relational and dynamic. While training Recurrent Neural Networks or solving PDEs to represent temporal data is expensive, Temporal Point Processes (TPPs) offer a good alternative. The drawback is that constructing an appropriate TPP for modeling temporal data requires assuming a particular temporal behavior of the data. To overcome this problem, Neural TPPs have been developed that enable learning the parameters of the TPP. However, this line of research is relatively young for modeling dynamic graphs, and only a few TPPs have been proposed to handle edge-dynamic graphs. To allow learning on a fully dynamic graph, we propose the first Marked Neural Spatio-Temporal Point Process (MNSTPP), which leverages a Dynamic Graph Neural Network to learn spatio-temporal point processes that model and predict any event in a graph stream. In addition, our model can be updated efficiently by considering single events for local retraining.
Silvia Beddar-Wiesing · Alice Moallemy-Oureh · Rüdiger Nather · Josephine Thomas
Temporal graph models fail to capture global temporal dynamics (Poster)
We analyze the recently released Temporal Graph Benchmark in the context of dynamic link property prediction. We outline our observations and propose a trivial, optimization-free baseline of "recently popular nodes" that outperforms other methods on the medium and large datasets in the Temporal Graph Benchmark. We propose two measures based on the Wasserstein distance that quantify the strength of short-term and long-term global dynamics in datasets. By analyzing our unexpectedly strong baseline, we show how standard negative-sampling evaluation can be unsuitable for datasets with strong temporal dynamics. We also show how simple negative sampling can lead to model degeneration during training, resulting in fully saturated, impossible-to-rank predictions from temporal graph networks. We propose improved negative-sampling schemes for both training and evaluation and demonstrate their usefulness. We also compare with a model trained non-contrastively, without negative sampling. Our results provide a challenging baseline and indicate that temporal graph network architectures need deep rethinking for use in problems with significant global dynamics, such as social media, cryptocurrency markets, or e-commerce. We open-source the code for the baselines, measures, and proposed negative-sampling schemes.
Michal Daniluk · Jacek Dabrowski
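The "recently popular nodes" idea is simple enough to sketch. The snippet below is a hedged illustration, not the authors' released code: it scores a candidate destination by how often it appeared as a target in the last `window` interactions, with the window size an arbitrary assumption:

```python
from collections import Counter, deque

class RecentlyPopular:
    """Optimization-free baseline for dynamic link prediction.

    A destination's score is simply how many times it appeared as a
    target within the last `window` observed interactions, so the
    baseline tracks global (not per-source) popularity over time.
    """

    def __init__(self, window=1000):
        self.window = window
        self.recent = deque()
        self.counts = Counter()

    def update(self, dst):
        """Record one observed interaction target."""
        self.recent.append(dst)
        self.counts[dst] += 1
        if len(self.recent) > self.window:
            old = self.recent.popleft()
            self.counts[old] -= 1

    def score(self, dst):
        """Higher means more recently popular."""
        return self.counts[dst]
```

That a learning-free counter of this kind can beat trained temporal graph networks is what motivates the paper's critique of current evaluation protocols.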
Graph Kalman Filters (Poster)
The well-known Kalman filter models dynamical systems via state-space representations, with the next state updated, and its uncertainty controlled, by fresh information associated with newly observed system outputs. This paper generalizes, for the first time in the literature, Kalman and extended Kalman filters to discrete-time settings where inputs, states, and outputs are represented as attributed graphs whose topology and attributes can change over time. The setup also covers cases where the output is a vector or a scalar (node- or graph-level tasks). Within the proposed theoretical framework, the unknown state transition and readout are learned end-to-end along with the downstream prediction task.
Daniele Zambon · Cesare Alippi
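For reference, these are the classic linear Kalman recursions that the paper generalizes to attributed graphs. In the sketch below the matrices F, H, Q, R are fixed placeholder inputs; in the graph setting the state and observations become graphs and the transition and readout maps are learned:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the classic linear Kalman filter.

    x, P : prior state mean and covariance
    z    : new observation
    F, H : state-transition and readout (observation) matrices
    Q, R : process and observation noise covariances
    """
    # Predict the next state and its covariance.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the Kalman gain: uncertainty shrinks as fresh
    # information arrives, exactly the mechanism the abstract describes.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```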
A Generative Self-Supervised Framework using Functional Connectivity in fMRI Data (Poster)
Deep neural networks trained on Functional Connectivity (FC) networks extracted from functional Magnetic Resonance Imaging (fMRI) data have gained popularity due to the increasing availability of data and advances in model architectures, including Graph Neural Networks (GNNs). Recent research on applying GNNs to FC suggests that exploiting the time-varying properties of FC could significantly improve the accuracy and interpretability of model predictions. However, the high cost of acquiring high-quality fMRI data and corresponding phenotypic labels hinders their application in real-world settings: a model naïvely trained in a supervised fashion can suffer from poor performance or a lack of generalization on small amounts of data. In addition, most Self-Supervised Learning (SSL) approaches for GNNs to date adopt a contrastive strategy, which tends to lose appropriate semantic information when the graph structure is perturbed, or does not leverage spatial and temporal information simultaneously. In light of these challenges, we propose a generative SSL approach tailored to effectively harness the spatio-temporal information within dynamic FC. Our empirical results, obtained on large-scale (>50,000) fMRI datasets, demonstrate that our approach learns valuable representations and enables the construction of accurate and robust models when fine-tuned for downstream tasks.
Jungwon Choi · Seongho Keum · EungGu Yun · Byung-Hoon Kim · Juho Lee
Deep graph kernel point processes (Poster)
Point process models are widely used for continuous asynchronous event data, where each data point includes time and additional information called "marks", which can be locations, nodes, or event types. In this paper, we present a novel point process model for discrete event data over graphs, where event interactions occur within a latent graph structure. Our model builds upon the classic influence-kernel formulation that Hawkes introduced in the original work on self-exciting point processes to capture the influence of historical events on future event occurrences. The key idea is to represent the influence kernel with Graph Neural Networks (GNNs), capturing the underlying graph structure while harnessing the strong representational power of GNNs. Compared with prior works that directly model the conditional intensity function with neural networks, our kernel representation captures repeated event-influence patterns more effectively by combining statistical and deep models, achieving better model estimation/learning efficiency and superior predictive performance. Our work significantly extends the existing deep spatio-temporal kernel for point process data, which is inapplicable to our setting because its observation space is Euclidean rather than a graph. We present comprehensive experiments on synthetic and real-world data to show the superior performance of the proposed approach against the state of the art in predicting future events and uncovering the relational structure among data.
Zheng Dong · Matthew Repasky · Xiuyuan Cheng · Yao Xie
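The Hawkes-style formulation the abstract builds on can be illustrated with a fixed kernel. The sketch below uses an exponential kernel weighted by a graph adjacency; the paper's contribution is to replace this hand-picked kernel with one parameterized by a GNN, and all parameter values here are arbitrary assumptions:

```python
import math

def intensity(node, t, history, adj, mu=0.1, alpha=0.5, beta=1.0):
    """Hawkes-style conditional intensity at `node` and time `t`.

    history : list of (node_i, t_i) past events with t_i < t
    adj     : adj[i][j] weights how much events at node i excite node j
    The influence kernel here is a fixed exponential
    alpha * exp(-beta * dt); the paper instead learns the kernel
    with a graph neural network.
    """
    lam = mu  # baseline event rate
    for i, t_i in history:
        if t_i < t:
            # Each past event adds a decaying, graph-weighted excitation.
            lam += adj[i][node] * alpha * math.exp(-beta * (t - t_i))
    return lam
```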
Topological and Temporal Data Augmentation for Temporal Graph Networks (Poster)
Temporal graphs are extensively employed to represent evolving networks, with applications across diverse fields such as transportation systems, social networks, and biological networks. Temporal Graph Networks (TGNs) build upon these graphs to model and learn from temporal dependencies in dynamic networks. A significant aspect of enhancing TGN performance lies in effective data augmentation, which helps capture the underlying patterns within temporal graphs while ensuring robustness to variations. However, existing data augmentation strategies for temporal graphs are largely heuristic and hand-crafted, and may alter the inherent semantics of temporal graphs, degrading the performance of downstream tasks. To address this, we propose two simple yet effective data augmentation strategies, tailored to the representation space of TGNs and targeting both the graph topology and the temporal axis. Through experiments on future link prediction and node classification tasks, we demonstrate that integrating our proposed augmentation methods significantly amplifies the performance of TGNs, outperforming state-of-the-art methods.
Haoran Liu · Jianling Wang · Kaize Ding · James Caverlee
-
|
Spatial-Temporal DAG Convolutional Networks for End-to-End Joint Effective Connectivity Learning and Resting-State fMRI Classification
(
Poster
)
>
link
SlidesLive Video Building comprehensive brain connectomes has proved of fundamental importance in resting-state fMRI (rs-fMRI) analysis. Based on the foundation of brain network, spatial-temporal-based graph convolutional networks have dramatically improved the performance of deep learning methods in rs-fMRI time series classification. However, existing works either pre-define the brain network as the correlation matrix derived from the raw time series or jointly learn the connectome and model parameters without any topology constraint. These methods could suffer from degraded classification performance caused by the deviation from the intrinsic brain connectivity and lack biological interpretability of demonstrating the causal structure (i.e., effective connectivity) among brain regions. Moreover, most existing methods for effective connectivity learning are unaware of the downstream classification task and cannot sufficiently exploit useful rs-fMRI label information. To address these issues in an end-to-end manner, we model the brain network as a directed acyclic graph (DAG) to discover direct causal connections between brain regions and propose Spatial-Temporal DAG Convolutional Network (ST-DAGCN) to jointly infer effective connectivity and classify rs-fMRI time series by learning brain representations based on nonlinear structural equation model. The optimization problem is formulated into a continuous program and solved with score-based learning method via gradient descent. We evaluate ST-DAGCN on two public rs-fMRI databases. Experiments show that ST-DAGCN outperforms existing models by evident margins in rs-fMRI classification and simultaneously learns meaningful edges of effective connectivity that help understand brain activity patterns and pathological mechanisms in brain disease. |
Rui Yang · Wenrui Dai · Huajun She · Yiping Du · Dapeng Wu · Hongkai Xiong 🔗 |
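The "continuous program" for enforcing acyclicity is not spelled out in the abstract. Score-based DAG learning methods of this family (e.g., NOTEARS) commonly replace the combinatorial DAG constraint with the smooth penalty h(W) = tr(exp(W ∘ W)) - d, which vanishes exactly when the weighted adjacency matrix W encodes a DAG. A minimal sketch of that penalty (our illustration, not the authors' code):

```python
import numpy as np
from scipy.linalg import expm

def acyclicity(W: np.ndarray) -> float:
    """NOTEARS-style smooth acyclicity penalty.

    h(W) = tr(exp(W * W)) - d is zero exactly when the weighted
    adjacency matrix W (d x d) corresponds to a DAG; gradient descent
    can then drive a learned W toward acyclic solutions.
    """
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)

# A 2-node DAG (0 -> 1) incurs no penalty ...
dag = np.array([[0.0, 0.8],
                [0.0, 0.0]])
# ... while a 2-cycle (0 <-> 1) does.
cyclic = np.array([[0.0, 0.8],
                   [0.5, 0.0]])

print(abs(acyclicity(dag)) < 1e-8)   # True
print(acyclicity(cyclic) > 0.0)      # True
```

In the score-based setting, this penalty is added (via an augmented Lagrangian) to the classification or reconstruction loss so that the learned effective-connectivity matrix stays acyclic.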
-
|
Hierarchical Joint Graph Learning and Multivariate Time Series Forecasting
(
Poster
)
>
link
Multivariate time series are prevalent in many scientific and industrial domains. Modeling multivariate signals is challenging due to their long-range temporal dependencies and intricate interactions, both direct and indirect. To confront these complexities, we introduce a method that represents multivariate signals as nodes in a graph, with edges indicating the interdependencies between them. Specifically, we leverage graph neural networks (GNNs) and attention mechanisms to efficiently learn the underlying relationships within the time series data. Moreover, we employ hierarchical signal decompositions running over the graphs to capture multiple spatial dependencies. The effectiveness of our proposed model is evaluated across various real-world benchmark datasets designed for long-term forecasting tasks. The results consistently showcase the superiority of our model, achieving an average 23% reduction in mean squared error (MSE) compared to existing models. |
JuHyeon Kim · HyunGeun Lee · Seungwon Yu · Ung Hwang · Wooyul Jung · Miseon Park · Kijung Yoon 🔗 |
-
|
Exploring Time Granularity on Temporal Graphs for Dynamic Link Prediction in Real-world Networks
(
Poster
)
>
link
SlidesLive Video Dynamic Graph Neural Networks (DGNNs) have emerged as the predominant approach for processing dynamic graph-structured data. However, the influence of temporal information on model performance and robustness remains insufficiently explored, particularly regarding how models address prediction tasks with different time granularities. In this paper, we explore the impact of time granularity when training DGNNs on dynamic graphs through extensive experiments. We examine graphs derived from various domains and compare three different DGNNs to a baseline model across four time granularities. We mainly consider the interplay between time granularities, model architectures, and negative sampling strategies to obtain general conclusions. Our results reveal that a sophisticated memory mechanism and a proper time granularity are crucial for a DGNN to deliver competitive and robust performance on the dynamic link prediction task. We also discuss drawbacks of the considered models and datasets and propose promising directions for future research on the time granularity of temporal graphs. |
Xiangjian Jiang · Yanyi Pu 🔗 |
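As a toy illustration of what varying the time granularity means in practice (our own sketch, not the paper's pipeline): coarsening a timestamped edge list discretizes interactions into wider or narrower time bins before a DGNN ever sees them.

```python
from collections import defaultdict

def coarsen(edges, granularity):
    """Bucket (src, dst, t) interactions into time bins of width
    `granularity`, deduplicating repeated edges within a bin.

    A coarser granularity merges distinct interactions into the same
    snapshot, discarding fine-grained ordering information.
    """
    snapshots = defaultdict(set)
    for src, dst, t in edges:
        snapshots[t // granularity].add((src, dst))
    return dict(snapshots)

edges = [(0, 1, 3), (0, 1, 7), (1, 2, 12), (2, 0, 25)]
print(coarsen(edges, 10))   # three bins; the repeated (0, 1) edge is merged
print(coarsen(edges, 100))  # everything collapses into one bin
```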
-
|
Inductive Link Prediction in Static and Temporal Graphs for Isolated Nodes
(
Poster
)
>
link
SlidesLive Video Link prediction is a vital task in graph machine learning, involving the anticipation of connections between entities within a network. In the realm of drug discovery, link prediction takes the form of forecasting interactions between drugs and target genes. Likewise, in recommender systems, link prediction entails suggesting items to users. In temporal graphs, link prediction ranges from friendship recommendation to introducing new devices in wireless networks and dynamic routing. However, a prevailing challenge in link prediction lies in the reliance on topological neighborhoods and the lack of informative node metadata for making predictions. Consequently, predictions for nodes with low degrees, and especially for newly introduced nodes with no neighborhood data, tend to be inaccurate and misleading. State-of-the-art models frequently fall short when tasked with predicting interactions between a novel drug and an unexplored disease target or suggesting a new product to a recently onboarded user. In temporal graphs, link prediction models often misplace a newly introduced entity in the evolving network. This paper examines observation bias arising from the inequity of data availability across entities in a network and the unavailability of informative node metadata, and it explores how contemporary models struggle to make inductive link predictions for low-degree and previously unseen isolated nodes. Additionally, we harness informative node attributes generated by unsupervised pre-training on corpora different from, and with significantly more entities than, the observed graphs to enhance the overall generalizability of link prediction models. |
Ayan Chatterjee · Robin Walters · Giulia Menichetti · Tina Eliassi-Rad 🔗 |
-
|
BitGraph: A Framework For Scaling Temporal Graph Queries on GPUs
(
Poster
)
>
link
SlidesLive Video
Graph query languages have become the standard among data scientists analyzing large, dynamic graphs, allowing them to structure their analysis as SQL-like queries. One of the challenges in supporting graph query languages is that, unlike SQL queries, graph queries nearly always involve aggregation of sparse data, making it challenging to scale graph queries without heavy reliance on expensive indices. This paper introduces the first major release of $\textit{BitGraph}$, a graph query processing engine that uses GPU-acceleration to quickly process Gremlin graph queries with minimal memory overhead, along with its supporting stack, $\textit{Gremlin++}$, which provides query language support in C++, and $\textit{Maelstrom}$, a lightweight library for compute-agnostic, accelerated vector operations built on top of $\textit{Thrust}$. This paper also analyzes the performance of BitGraph compared to existing CPU-only backends applied specifically to temporal graph queries, demonstrating BitGraph's superior scalability and speedup of up to 35x over naive CPU implementations.
|
Alexandria Barghi 🔗 |
-
|
TBoost: Gradient Boosting Temporal Graph Neural Networks
(
Poster
)
>
link
SlidesLive Video Fraud prediction, compromised account detection, and attrition signaling are vital problems in the financial domain. Generally, these tasks are temporal classification problems, as their labels exhibit temporal dependence and change over time. Each financial transaction contains heterogeneous data such as account number, merchant, amount, and decline status. A financial dataset contains chronological transactions. This data possesses three distinct characteristics: heterogeneity, relational structure, and temporal nature. Previous efforts fall short of modeling all these characteristics in a unified way. Gradient-boosted decision trees (GBDTs) are used to tackle heterogeneity. Graph Neural Networks (GNNs) are employed to model relational information. Temporal GNNs account for temporal dependencies in the data. In this paper, we propose a novel unified framework, TBoost, which combines GBDTs and temporal GNNs to jointly model the heterogeneous, relational, and temporal characteristics of the data. It leverages both node- and edge-level dynamics to solve temporal classification problems. To validate the effectiveness of TBoost, we conduct extensive experiments, demonstrating its superiority in handling the complexities of financial data. |
Pritam Kumar Nath · Govind Waghmare · Nancy Agrawal · Nitish Kumar · Siddhartha Asthana 🔗 |
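The abstract does not detail how GBDTs and temporal GNNs are wired together. One common pattern for combining the two families, shown purely as an illustration and not as TBoost itself, is to let a GBDT embed heterogeneous tabular features and then smooth those embeddings over the graph with a round of neighborhood aggregation:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))          # heterogeneous per-node features
y = np.array([0, 0, 0, 1, 1, 1])     # node labels (e.g., fraud / not fraud)
A = np.eye(6)                        # adjacency with self-loops
A[0, 1] = A[1, 0] = A[3, 4] = A[4, 3] = 1.0

# 1) GBDT turns raw tabular features into class-probability embeddings,
#    handling heterogeneity without manual feature scaling.
gbdt = GradientBoostingClassifier(n_estimators=10).fit(X, y)
H = gbdt.predict_proba(X)            # shape (6, 2)

# 2) One round of mean aggregation over neighbors (GNN-style message
#    passing) injects relational structure into the GBDT embeddings.
H_smoothed = (A @ H) / A.sum(axis=1, keepdims=True)
print(H_smoothed.shape)              # (6, 2)
```

A temporal GNN would replace step 2 with time-aware aggregation over chronologically ordered transactions; the sketch only shows the static hand-off between the two model families.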
-
|
Towards predicting future time intervals on Temporal Knowledge Graphs
(
Poster
)
>
link
SlidesLive Video Temporal Knowledge Graphs (TKGs), a temporal extension of Knowledge Graphs in which facts are contextualized by time information, have received increasing attention in the temporal graph learning community. In this paper we focus on TKGs where the temporal contexts are time intervals, and we address the time prediction problem in the forecasting setting. We propose both a system for addressing the task and a benchmark construction methodology. |
Roxana Pop · Egor Kostylev 🔗 |
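Evaluating interval-valued time prediction requires a measure of agreement between predicted and gold intervals; a simple candidate (the paper's benchmark may use a different metric) is intersection-over-union on intervals:

```python
def interval_iou(pred, gold):
    """IoU of two closed time intervals given as (start, end).

    Returns 1.0 for identical intervals, 0.0 for disjoint ones, and a
    value in between for partial overlap.
    """
    (ps, pe), (gs, ge) = pred, gold
    inter = max(0.0, min(pe, ge) - max(ps, gs))
    union = (pe - ps) + (ge - gs) - inter
    return inter / union if union > 0 else 1.0

print(interval_iou((2000, 2004), (2002, 2008)))  # 0.25
print(interval_iou((2000, 2005), (2010, 2015)))  # 0.0
```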
-
|
STGraph: A Framework for Temporal Graph Neural Networks
(
Poster
)
>
link
SlidesLive Video Real-life graphs from various application domains like social networks, transportation networks, and citation networks evolve over time. These evolving graphs can be modeled as (i) interactions between two nodes in a graph and (ii) interactions associated with a single node. Deep learning techniques using Graph Neural Networks (GNNs) are used for analyzing spatial and temporal properties of graphs from these application domains. Analyzing temporal graphs is challenging in comparison to static graphs, hence warranting a GNN variant named Temporal Graph Neural Networks (TGNNs). We propose STGraph, a framework to program TGNNs. The proposed framework extends Seastar, a vertex-centric programming model for training static GNNs on GPUs. STGraph supports TGNNs for static-temporal and discrete-time dynamic graphs (DTDGs). Existing TGNN frameworks store DTDGs as separate snapshots, incurring high memory overhead. As an improvement, STGraph constructs each snapshot on demand during training. This is achieved by integrating the system with dynamic graph data structures capable of building graph snapshots from temporal updates. Additionally, we present improvements to the Seastar design for easier maintenance and greater software portability. STGraph is benchmarked against PyTorch Geometric Temporal (PyG-T) on an NVIDIA GPU. For static-temporal graphs, STGraph shows up to 1.22× speedup and up to 2.14× memory improvement over PyG-T. For DTDGs, STGraph exhibits up to 1.70× speedup and 1.52× memory improvement over PyG-T. |
Nithin Manoj · Joel Mathew Cherian · Kevin Concessao · Unnikrishnan Cheramgalath 🔗 |
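The on-demand snapshot construction described above can be illustrated abstractly: rather than materializing every DTDG snapshot, store only the per-step edge updates and replay them when a snapshot is needed. A toy sketch (the class and method names are ours, not STGraph's API):

```python
class OnDemandDTDG:
    """Stores per-snapshot edge insertions/deletions and rebuilds each
    snapshot by replaying updates, instead of keeping every snapshot in
    memory at once (the source of the memory overhead noted above)."""

    def __init__(self, updates):
        # updates: list of (added_edges, removed_edges), one pair per step
        self.updates = updates

    def snapshots(self):
        edges = set()
        for added, removed in self.updates:
            edges |= set(added)
            edges -= set(removed)
            yield frozenset(edges)   # current snapshot, built on demand

g = OnDemandDTDG([
    ([(0, 1), (1, 2)], []),      # t=0: two edges appear
    ([(2, 3)], [(0, 1)]),        # t=1: one appears, one disappears
])
for t, snap in enumerate(g.snapshots()):
    print(t, sorted(snap))
```

Only the update deltas are retained; memory cost scales with the number of changes per step rather than with the number of snapshots times graph size.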
-
|
Anomaly Detection in Continuous-Time Temporal Provenance Graphs
(
Poster
)
>
link
SlidesLive Video Recent advances in Graph Neural Networks (GNNs) have matured the field of learning on graphs, making GNNs essential for prediction tasks in complex, interconnected, and evolving systems. In this paper, we focus on self-supervised, inductive learning for continuous-time dynamic graphs. Without compromising generality, we propose an approach to learn representations and mine anomalies in provenance graphs, which are a form of large-scale, heterogeneous, attributed, continuous-time dynamic graphs used in the cybersecurity domain, syntactically resembling complex temporal knowledge graphs. We adapt the Temporal Graph Network (TGN) framework to heterogeneous input data and directed edges, refining it specifically for inductive learning on provenance graphs. We present and release two pioneering large-scale, continuous-time temporal, heterogeneous, attributed benchmark graph datasets. The datasets incorporate expert-labeled anomalies, promoting subsequent research on representation learning and anomaly detection on intricate real-world networks. Comprehensive experimental analyses of modules, datasets, and baselines underscore the effectiveness of TGN-based inductive learning, affirming its practical utility in identifying semantically significant anomalies in real-world systems. |
Jakub Reha · Giulio Lovisotto · Michele Russo · Alessio Gravina · Claas Grohnfeldt 🔗 |
-
|
DURENDAL: Graph deep learning framework for temporal heterogeneous networks
(
Poster
)
>
link
SlidesLive Video Temporal heterogeneous networks (THNs) are evolving networks that characterize many real-world applications such as citation and event networks, recommender systems, and knowledge graphs. Although different Graph Neural Networks (GNNs) have been successfully applied to dynamic graphs, most of them only support homogeneous graphs or suffer from model designs heavily influenced by specific THN prediction tasks. Furthermore, there is a lack of temporal heterogeneous networked data in current standard graph benchmark datasets. Hence, in this work, we propose DURENDAL, a graph deep learning framework for THNs. DURENDAL makes it easy to repurpose any heterogeneous graph learning model for evolving networks by combining design principles from snapshot-based and multirelational message-passing graph learning models. We introduce two different schemes to update embedding representations for THNs, discussing the strengths and weaknesses of both strategies. We also extend the set of benchmarks for THNs by introducing two novel high-resolution temporal heterogeneous graph datasets derived from an emerging Web3 platform and a well-established e-commerce website. Overall, we conduct an experimental evaluation of the framework over four temporal heterogeneous network datasets on future link prediction tasks, in an evaluation setting that takes into account the evolving nature of the data. Experiments show the predictive power of DURENDAL compared to current solutions for evolving and dynamic graphs, and the effectiveness of its model design. |
Manuel Dileo · Matteo Zignani · Sabrina Gaito 🔗 |
-
|
Learning Temporal Higher-order Patterns to Detect Anomalous Brain Activity
(
Poster
)
>
link
Due to recent advances in machine learning on graphs, representing the connections of the human brain as a network has become one of the most pervasive analytical paradigms. However, most existing graph machine learning-based methods suffer from a subset of five critical limitations: they are (1) designed for simple pair-wise interactions, while recent studies on the human brain show the existence of higher-order dependencies among brain regions, (2) designed to operate on networks pre-constructed from time-series data, which limits their generalizability, (3) designed for classifying brain networks, limiting their ability to reveal underlying patterns that might cause the symptoms of a disease or disorder, (4) designed for learning static patterns, missing the dynamics of human brain activity, and (5) designed in a supervised setting, making their performance reliant on the existence of labeled data. To address these limitations, we present HADiB, an end-to-end anomaly detection model that automatically learns the structure of the hypergraph representation of the brain from neuroimage data. HADiB uses a tetra-stage message-passing mechanism along with an attention mechanism that learns the importance of higher-order dependencies of brain regions. We further present a new adaptive hypergraph pooling to obtain brain-level representations, enabling HADiB to detect the neuroimages of people living with a specific disease or disorder. Our experiments on Parkinson’s Disease, Attention Deficit Hyperactivity Disorder, and Autism Spectrum Disorder show the efficiency and effectiveness of our approach in detecting anomalous brain activity. |
Ali Behrouz · Farnoosh Hashemi 🔗 |
-
|
Mitigating Cold-start Problem using Cold Causal Demand Forecasting Model
(
Poster
)
>
link
SlidesLive Video Forecasting multivariate time series data, which involves predicting future values of variables over time using historical data, has significant practical applications. Although deep learning-based models have shown promise in this field, they often fail to capture the causal relationships between dependent variables, leading to less accurate forecasts. Additionally, these models cannot handle the cold-start problem in time series data, where certain variables lack historical data, posing challenges in identifying dependencies among variables. To address these limitations, we introduce the Cold Causal Demand Forecasting (CDF-cold) framework, which integrates causal inference with deep learning-based models to enhance the forecasting accuracy of multivariate time series data affected by the cold-start problem. To validate the effectiveness of the proposed approach, we collect 15 multivariate time-series datasets containing the network traffic of different Google data centers. Our experiments demonstrate that the CDF-cold framework outperforms state-of-the-art forecasting models in predicting future values of multivariate time series data suffering from the cold-start problem. |
Zahra Fatemi · Minh Huynh · Elena Zheleva · Zamir Syed · Xiaojun Di 🔗 |
-
|
An Information-Theoretic Analysis on Temporal Graph Evolution
(
Poster
)
>
link
SlidesLive Video In this paper, we present a novel model termed Network Evolution Chains for simulating the temporal dynamics of networks. Our model's design is tailored to enable comprehensive analysis through information theory. We establish that this model creates a stationary and ergodic stochastic process, thus facilitating the application of the asymptotic equipartition property. This breakthrough paves the way for a thorough information-theoretic investigation into network behavior, encompassing the definition of typical sequences, future state prediction, and beyond. |
Amirmohammad Farzaneh 🔗 |
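The asymptotic equipartition property applies here because the process is stationary and ergodic; for a finite-state ergodic Markov chain the entropy rate has the closed form H = -Σ_i π_i Σ_j P_ij log2 P_ij, where π is the stationary distribution. A small numeric illustration of that formula (our sketch, not the paper's model):

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate (bits/step) of an ergodic Markov chain with
    transition matrix P, weighted by its stationary distribution pi."""
    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    pi = pi / pi.sum()
    # log2 P, with 0 * log 0 treated as 0.
    logs = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return float(-(pi[:, None] * P * logs).sum())

# Fair-coin chain: every step is an independent fair bit -> 1 bit/step.
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])
print(round(entropy_rate(P), 6))   # 1.0
```

A deterministic chain (permutation matrix) gives an entropy rate of 0, since each future state is fully predictable from the present one.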
-
|
Todyformer: Towards Holistic Dynamic Graph Transformers with Structure-Aware Tokenization
(
Poster
)
>
link
Temporal Graph Neural Networks have garnered substantial attention for their capacity to model evolving structural and temporal patterns while exhibiting impressive performance. However, it is known that these architectures are encumbered by issues that constrain their performance, such as over-squashing and over-smoothing. Meanwhile, Transformers have demonstrated exceptional computational capacity to effectively address challenges related to long-range dependencies. Consequently, we introduce Todyformer, a novel Transformer-based neural network tailored for dynamic graphs. It unifies the local encoding capacity of Message-Passing Neural Networks (MPNNs) with the global encoding of Transformers through i) a novel patchifying paradigm for dynamic graphs to improve over-squashing, ii) a structure-aware parametric tokenization strategy leveraging MPNNs, iii) a Transformer with temporal positional encoding to capture long-range dependencies, and iv) an encoding architecture that alternates between local and global contextualization, mitigating over-smoothing in MPNNs. Experimental evaluations on public benchmark datasets demonstrate that Todyformer consistently outperforms state-of-the-art methods on the downstream tasks. Furthermore, we illustrate how the proposed model effectively captures extensive temporal dependencies in dynamic graphs. |
Mahdi Biparva · Raika Karimi · Faezeh Faez · Yingxueff Zhang 🔗 |
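The patchifying idea, splitting a time-ordered event stream into contiguous temporal patches that are encoded locally before a Transformer attends across them, can be sketched in a few lines (a generic illustration of the idea, not Todyformer's implementation):

```python
def patchify(events, num_patches):
    """Split a time-ordered stream of (src, dst, t) events into
    `num_patches` contiguous chunks. In a patch-based architecture,
    each chunk would be encoded locally (e.g., by an MPNN) and the
    resulting patch tokens attended over globally by a Transformer,
    limiting how far messages must travel within any one patch."""
    events = sorted(events, key=lambda e: e[2])   # sort by timestamp
    size = -(-len(events) // num_patches)         # ceiling division
    return [events[i:i + size] for i in range(0, len(events), size)]

stream = [(0, 1, 5), (1, 2, 1), (2, 3, 9), (3, 0, 3), (0, 2, 7)]
patches = patchify(stream, 2)
print([len(p) for p in patches])   # [3, 2]
```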