Adversarial Sparse Transformer for Time Series Forecasting
Sifan Wu · Xi Xiao · Qianggang Ding · Peilin Zhao · Ying Wei · Junzhou Huang

Mon Dec 07 09:00 PM -- 11:00 PM (PST) @ Poster Session 0 #122

Many approaches have been proposed for time series forecasting, in light of its significance in a wide range of applications including business demand prediction. However, existing methods suffer from two key limitations. First, most point-prediction models output only a single value per time step, which can hardly capture the stochasticity of the data; probabilistic prediction via likelihood estimation suffers from the same problem. Second, most of them use an auto-regressive generation scheme, in which the ground truth is provided during training but replaced by the network's own one-step-ahead output during inference, so errors accumulate and the models may fail to forecast over long time horizons. To address these issues, in this paper we propose a new time series forecasting model -- Adversarial Sparse Transformer (AST), based on Generative Adversarial Networks (GANs). Specifically, AST adopts a Sparse Transformer as the generator to learn a sparse attention map for time series forecasting, and uses a discriminator to improve prediction performance at the sequence level. Extensive experiments on several real-world datasets demonstrate the effectiveness and efficiency of our method.
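The "sparse attention map" mentioned above is typically obtained by replacing the softmax in the attention layer with a sparse normalizer from the entmax family, which can assign exactly zero weight to irrelevant time steps. As a minimal illustration (not the paper's implementation), here is a NumPy sketch of sparsemax (Martins & Astudillo, 2016), the α = 2 member of that family:

```python
import numpy as np

def sparsemax(z):
    """Sparsemax: Euclidean projection of the score vector z onto the
    probability simplex. Unlike softmax, it can return exact zeros,
    yielding a sparse distribution over attended time steps."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]               # scores in descending order
    cssv = np.cumsum(z_sorted)                # cumulative sums of sorted scores
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cssv         # indices kept in the support
    k_z = k[support][-1]                      # size of the support
    tau = (cssv[k_z - 1] - 1) / k_z           # threshold
    return np.maximum(z - tau, 0.0)

# Softmax over [2, 0, -1] gives every step nonzero weight;
# sparsemax zeroes out the low-scoring steps entirely.
print(sparsemax([2.0, 0.0, -1.0]))  # → [1. 0. 0.]
print(sparsemax([1.0, 1.0]))        # → [0.5 0.5]
```

In an attention layer, `z` would be the (scaled) query-key scores for one query; the output still sums to 1, so it is a drop-in replacement for softmax wherever fully dense attention is undesirable.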

Author Information

Sifan Wu (Tsinghua University)
Xi Xiao (Tsinghua University)
Qianggang Ding (Tsinghua University)
Peilin Zhao (Tencent AI Lab)
Ying Wei (Tencent AI Lab)
Junzhou Huang (University of Texas at Arlington / Tencent AI Lab)