

Poster

Wasserstein Learning of Deep Generative Point Process Models

Shuai Xiao · Mehrdad Farajtabar · Xiaojing Ye · Junchi Yan · Xiaokang Yang · Le Song · Hongyuan Zha

Pacific Ballroom #106

Keywords: [ Time Series Analysis ] [ Adversarial Networks ] [ Generative Models ]


Abstract:

Point processes are becoming very popular for modeling asynchronous sequential data due to their sound mathematical foundation and their strength in modeling a variety of real-world phenomena. Currently, they are often characterized via an intensity function, which limits the model's expressiveness because of the unrealistic parametric assumptions used in practice. Furthermore, they are learned via a maximum likelihood approach, which is prone to failure when the distribution of sequences is multi-modal. In this paper, we propose an intensity-free approach to point process modeling that transforms a nuisance process into the target one. Furthermore, we train the model in a likelihood-free manner, leveraging a Wasserstein distance between point processes. Experiments on various synthetic and real-world datasets substantiate the superiority of the proposed point process model over conventional ones.
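To make the idea concrete, below is a minimal, hypothetical sketch of the kind of training loop the abstract describes: a recurrent generator transforms inter-arrival times of a "nuisance" unit-rate Poisson process into generated event gaps, and a recurrent critic is trained in a WGAN style so that its objective approximates a Wasserstein-type distance between real and generated sequences. This is not the authors' implementation; the class names, weight-clipping regularization, and hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch (not the authors' code) of WGAN-style training for a
# generative point process: generator maps Poisson noise gaps to event gaps,
# critic scores sequences; all names and hyperparameters are assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps inter-arrival times of a unit-rate Poisson process to generated gaps."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(1, hidden, batch_first=True)
        self.out = nn.Sequential(nn.Linear(hidden, 1), nn.Softplus())  # positive gaps

    def forward(self, noise_gaps):             # noise_gaps: (batch, seq_len, 1)
        h, _ = self.rnn(noise_gaps)
        return self.out(h)                     # generated inter-arrival times

class Critic(nn.Module):
    """Scores a sequence of inter-arrival times with a single scalar."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(1, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, gaps):
        _, h = self.rnn(gaps)                  # h: (num_layers, batch, hidden)
        return self.score(h[-1]).squeeze(-1)   # one scalar per sequence

def train_step(gen, critic, real_gaps, opt_g, opt_c, clip=0.01):
    batch, seq_len, _ = real_gaps.shape
    # Nuisance process: unit-rate Poisson, i.e. Exp(1) inter-arrival times.
    noise = torch.distributions.Exponential(1.0).sample((batch, seq_len, 1))

    # Critic update: maximize E[f(real)] - E[f(fake)]; weight clipping as in WGAN.
    fake = gen(noise).detach()
    loss_c = critic(fake).mean() - critic(real_gaps).mean()
    opt_c.zero_grad(); loss_c.backward(); opt_c.step()
    for p in critic.parameters():
        p.data.clamp_(-clip, clip)

    # Generator update: minimize -E[f(fake)].
    loss_g = -critic(gen(noise)).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_c.item(), loss_g.item()
```

In the paper, the distance is defined directly between point processes on event sequences; here a standard WGAN critic with weight clipping stands in for that choice as a simplification.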
