Probabilistic Transformer For Time Series Analysis

Binh Tang · David S Matteson

Keywords: [ Transformers ] [ Deep Learning ] [ Generative Model ] [ Vision ]

Abstract
Thu 9 Dec 4:30 p.m. PST — 6 p.m. PST


Generative modeling of multivariate time series has remained challenging partly due to the complex, non-deterministic dynamics across long-distance timesteps. In this paper, we propose deep probabilistic methods that combine state-space models (SSMs) with transformer architectures. In contrast to previously proposed SSMs, our approaches use an attention mechanism to model non-Markovian dynamics in the latent space and avoid recurrent neural networks entirely. We also extend our models to include several layers of stochastic variables organized in a hierarchy for further expressiveness. Compared to transformer models, ours are probabilistic, non-autoregressive, and capable of generating diverse long-term forecasts with uncertainty estimates. Extensive experiments show that our models consistently outperform competitive baselines on various tasks and datasets, including time series forecasting and human motion prediction.
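The core idea, as the abstract describes it, is to replace the Markovian latent transition of a classical SSM with attention over all past timesteps, then parameterize a stochastic (Gaussian) latent at each step. The paper's actual architecture is not reproduced here; the following is a minimal numpy sketch of that idea under assumed toy shapes, with made-up weight names (`Wq`, `Wk`, `Wv`, `Wmu`, `Wsig`) for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    # Scaled dot-product attention over all timesteps: each position's
    # context depends on the whole history, i.e. non-Markovian dynamics,
    # in contrast to an SSM's one-step transition.
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

T, d = 12, 8  # toy sequence length and model width (assumed)
H = rng.normal(size=(T, d))  # stand-in encodings of the observed series
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Wmu, Wsig = rng.normal(size=(d, d)), rng.normal(size=(d, d))

C = self_attention(H, Wq, Wk, Wv)               # per-step context vectors
mu = C @ Wmu                                    # Gaussian latent mean
sigma = np.exp(0.5 * (C @ Wsig))                # positive std via exp
z = mu + sigma * rng.normal(size=mu.shape)      # reparameterized sample

print(z.shape)
```

Because each latent `z` is sampled per timestep from attention-derived parameters rather than generated token-by-token, repeated draws yield diverse trajectories, which is the non-autoregressive, probabilistic behavior the abstract claims.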
