Poster
Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting
Yong Liu · Haixu Wu · Jianmin Wang · Mingsheng Long

Transformers have shown great power in time series forecasting due to their global-range modeling ability. However, their performance can degrade severely on non-stationary real-world data, in which the joint distribution changes over time. Previous studies primarily adopt stationarization to attenuate the non-stationarity of the original series for better predictability. But a stationarized series, deprived of its inherent non-stationarity, can be less instructive for forecasting real-world bursty events. This problem, termed over-stationarization in this paper, leads Transformers to generate indistinguishable temporal attentions for different series and impedes the predictive capability of deep models. To tackle the dilemma between series predictability and model capability, we propose Non-stationary Transformers as a generic framework with two interdependent modules: Series Stationarization and De-stationary Attention. Concretely, Series Stationarization unifies the statistics of each input and restores them on the output for better predictability. To address the over-stationarization problem, De-stationary Attention is devised to recover the intrinsic non-stationary information into temporal dependencies by approximating the distinguishable attentions learned from raw series. Our Non-stationary Transformers framework consistently boosts mainstream Transformers by a large margin, reducing MSE by 49.43% on Transformer, 47.34% on Informer, and 46.89% on Reformer, making them state-of-the-art in time series forecasting. Code is available at this repository: https://github.com/thuml/Nonstationary_Transformers.
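
To make the two modules concrete, below is a minimal PyTorch sketch of the ideas the abstract describes, not the authors' code. It assumes inputs of shape (batch, length, channels); the function and argument names here are illustrative, and the de-stationary factors tau and delta (which the paper learns from the raw series via small projector networks) are passed in as arguments for brevity. See the linked repository for the official implementation.

import torch
import torch.nn as nn


class SeriesStationarization(nn.Module):
    """Per-series normalization whose statistics are restored on the output."""

    def normalize(self, x):
        # x: (batch, length, channels); statistics taken over the time axis
        self.mean = x.mean(dim=1, keepdim=True)
        self.std = torch.sqrt(x.var(dim=1, keepdim=True, unbiased=False) + 1e-5)
        return (x - self.mean) / self.std

    def denormalize(self, y):
        # y: (batch, pred_length, channels); broadcasting restores the
        # statistics that were removed from the input
        return y * self.std + self.mean


def destationary_attention(q, k, v, tau, delta):
    # q, k, v: (batch, heads, length, d_k), computed from the stationarized series.
    # tau: (batch, 1, 1, 1) scale; delta: (batch, 1, 1, length) bias.
    # Both are learned from the raw (unnormalized) series, so the rescaled
    # attention map approximates the one the model would produce on raw data.
    d_k = q.size(-1)
    scores = (tau * (q @ k.transpose(-2, -1)) + delta) / d_k ** 0.5
    return torch.softmax(scores, dim=-1) @ v

The design point this sketch highlights is the division of labor: normalization keeps the model's inputs in a stable distribution for predictability, while tau and delta re-inject the removed non-stationary information inside attention so that different series no longer yield indistinguishable attention maps.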

Author Information

Yong Liu (Tsinghua University)

I'm currently a PhD student (since Fall 2021) at the School of Software of Tsinghua University and a member of THUML, advised by Prof. Mingsheng Long. My research interests cover deep learning and transfer learning. I am currently working on deep model applications for time series forecasting. Previously, I did research on the transferability measurement of pre-trained models (PTMs). The pursuit of my research is to apply deep learning methodology to valuable real-world applications. For more information, you may take a look at my publications.

Haixu Wu (Tsinghua University)
Jianmin Wang (Tsinghua University)
Mingsheng Long (Tsinghua University)