Time series are widely used as signals in many classification/regression tasks, and they frequently contain missing values. Given multiple correlated time series, how can we fill in the missing values and predict their class labels? Existing imputation methods often impose strong assumptions on the underlying data-generating process, such as linear dynamics in the state space. In this paper, we propose BRITS, a novel method based on recurrent neural networks for missing value imputation in time series data. Our method learns the missing values directly in a bidirectional recurrent dynamical system, without any such assumption. The imputed values are treated as variables of the RNN graph and are updated during backpropagation. BRITS has three advantages: (a) it handles multiple correlated missing values in time series; (b) it generalizes to time series with underlying nonlinear dynamics; (c) it provides a data-driven imputation procedure and applies to general settings with missing data. We evaluate our model on three real-world datasets: an air quality dataset, a health-care dataset, and a human-activity localization dataset. Experiments show that our model outperforms state-of-the-art methods in both imputation and classification/regression accuracy.
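The core mechanism described above, imputed values fed back into the recurrence and updated through backpropagation, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: it is unidirectional, omits the temporal-decay and feature-based-regression components of the full BRITS model, and the names (RecurrentImputer, regress, n_hidden) are illustrative.

    import torch
    import torch.nn as nn

    class RecurrentImputer(nn.Module):
        # Sketch: at each step the RNN predicts the current observation from its
        # hidden state; missing entries are replaced by that prediction, so the
        # imputations sit inside the computation graph and receive gradients.
        # (Simplification: one direction only, no temporal decay, no feature-based
        # regression as in the full BRITS model.)
        def __init__(self, n_features, n_hidden=64):
            super().__init__()
            self.rnn_cell = nn.LSTMCell(n_features, n_hidden)
            self.regress = nn.Linear(n_hidden, n_features)  # hidden state -> estimate of x_t

        def forward(self, x, mask):
            # x:    (batch, time, features), missing entries set to 0
            # mask: same shape, 1 = observed, 0 = missing
            batch, time, _ = x.shape
            h = x.new_zeros(batch, self.rnn_cell.hidden_size)
            c = torch.zeros_like(h)
            imputed, loss = [], x.new_zeros(())
            for t in range(time):
                x_t, m_t = x[:, t], mask[:, t]
                x_hat = self.regress(h)                              # estimate x_t from history
                loss = loss + ((x_hat - x_t).abs() * m_t).sum() / (m_t.sum() + 1e-5)
                x_c = m_t * x_t + (1 - m_t) * x_hat                  # fill gaps with the estimate
                h, c = self.rnn_cell(x_c, (h, c))
                imputed.append(x_c)
            return torch.stack(imputed, dim=1), loss

    # Usage on synthetic data with roughly 30% of the values masked out.
    model = RecurrentImputer(n_features=5)
    values = torch.randn(8, 20, 5)
    mask = (torch.rand(8, 20, 5) > 0.3).float()
    imputed, loss = model(values * mask, mask)
    loss.backward()

Because the filled-in values are produced inside the forward pass, the reconstruction loss on the observed entries drives the imputation, which is the data-driven behavior the abstract refers to; the full model additionally runs the recurrence in both directions and enforces consistency between the two passes.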
Author Information
Wei Cao (Tsinghua University)
Dong Wang (Duke University)
Jian Li (Tsinghua University)
Hao Zhou (ByteDance AI Lab)
Lei Li (ByteDance AI Lab)
Yitan Li (ByteDance Inc.)