

Fast and Flexible Temporal Point Processes with Triangular Maps

Oleksandr Shchur · Nicholas Gao · Marin Biloš · Stephan Günnemann

Poster Session 5 #1369

Keywords: [ Algorithms ] [ Few-Shot Learning ] [ Meta-Learning ]


Temporal point process (TPP) models combined with recurrent neural networks provide a powerful framework for modeling continuous-time event data. While such models are flexible, they are inherently sequential and therefore cannot benefit from the parallelism of modern hardware. By exploiting recent developments in the field of normalizing flows, we design TriTPP, a new class of non-recurrent TPP models in which both sampling and likelihood computation can be done in parallel. TriTPP matches the flexibility of RNN-based methods but permits several orders of magnitude faster sampling. This enables us to use the new model for variational inference in continuous-time discrete-state systems. We demonstrate the advantages of the proposed framework on synthetic and real-world datasets.
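The parallel-sampling idea can be illustrated with a minimal sketch. By the time-rescaling theorem, a monotone map (the compensator Λ) transforms the events of a target TPP into a unit-rate Poisson process, so sampling reduces to drawing Poisson points and pushing them all through Λ⁻¹ in one vectorized call, with no sequential recurrence. The piecewise-constant rates, knot locations, and function names below are illustrative assumptions, not the paper's implementation; TriTPP itself composes learnable block-diagonal triangular maps, of which this elementwise (diagonal) map is only the simplest special case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inhomogeneous Poisson process on [0, T] with a
# piecewise-constant rate; its compensator Lambda is piecewise linear.
T = 10.0
knots = np.linspace(0.0, T, 6)               # segment boundaries
rates = np.array([0.5, 2.0, 1.0, 3.0, 0.8])  # assumed per-segment rates
cum = np.concatenate([[0.0], np.cumsum(rates * np.diff(knots))])

def compensator(t):
    """Lambda(t): integrated rate up to t (vectorized over all t)."""
    t = np.atleast_1d(t)
    seg = np.clip(np.searchsorted(knots, t, side="right") - 1, 0, len(rates) - 1)
    return cum[seg] + rates[seg] * (t - knots[seg])

def compensator_inv(s):
    """Inverse of Lambda, also vectorized -> parallel sampling."""
    s = np.atleast_1d(s)
    seg = np.clip(np.searchsorted(cum, s, side="right") - 1, 0, len(rates) - 1)
    return knots[seg] + (s - cum[seg]) / rates[seg]

# Parallel sampling: draw a unit-rate Poisson process on [0, Lambda(T)],
# then map every point through Lambda^{-1} in a single vectorized call.
total = compensator(T)[0]
n = rng.poisson(total)
u = np.sort(rng.uniform(0.0, total, size=n))
events = compensator_inv(u)   # all event times computed at once
```

Because `compensator_inv` acts elementwise on the whole array, the cost of sampling is independent of sequence order, which is the property an RNN-based TPP lacks.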
