Workshop
NIPS 2017 Time Series Workshop
Vitaly Kuznetsov · Oren Anava · Scott Yang · Azadeh Khaleghi

Fri Dec 08 08:00 AM -- 06:30 PM (PST) @ Grand Ballroom A
Event URL: https://sites.google.com/site/nipsts2017/home

Data in the form of time-dependent sequential observations emerge in many key real-world problems, ranging from biological data and financial markets to weather forecasting and audio/video processing. Despite the ubiquity of such data, however, most mainstream machine learning algorithms have been developed primarily for settings in which sample points are drawn i.i.d. from some (usually unknown) fixed distribution. While there exist algorithms designed to handle non-i.i.d. data, these typically assume a specific parametric form for the data-generating distribution. Such assumptions may fail to capture the complexity of modern data, which can possess long-range dependency patterns that we now have the computing power to discern. At the other extreme lie online learning algorithms, which consider a more general framework without any distributional assumptions. By being purely agnostic, however, common online algorithms may not fully exploit the stochastic aspect of time-series data.

This is the third installment of the time series workshop at NIPS, building on the success of the previous events: the NIPS 2015 Time Series Workshop and the NIPS 2016 Time Series Workshop.

The goal of this workshop is to bring together theoretical and applied researchers interested in the analysis of time series and the development of new algorithms for processing sequential data. This includes algorithms for time series prediction, classification, clustering, anomaly and change point detection, correlation discovery, and dimensionality reduction, as well as a general theory for learning and comparing stochastic processes. We invite researchers from the related areas of batch and online learning, reinforcement learning, data analysis and statistics, econometrics, and many others to contribute to this workshop.

Fri 9:00 a.m. - 9:15 a.m. [iCal]
Introduction to Time Series Workshop (Opening remarks)
Fri 9:15 a.m. - 10:00 a.m. [iCal]

In this talk I will present a modification of the dynamic time warping distance which, unlike the original quantity, is differentiable in all of its inputs. As a result, this alternative distance can be used naturally as a loss for learning with datasets of time series, to produce means, clusters, or structured predictions where the goal is to forecast entire time series.
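As a rough illustration of the idea (a sketch, not the speaker's implementation): replacing the hard minimum in the classical DTW recurrence with a smoothed log-sum-exp minimum makes the resulting discrepancy differentiable in the input sequences, so it can serve as a training loss.

```python
import numpy as np

def soft_min(values, gamma):
    # Smoothed minimum: -gamma * log(sum(exp(-v / gamma))).
    # Recovers the hard minimum as gamma -> 0.
    z = -np.asarray(values, dtype=float) / gamma
    zmax = z.max()
    return -gamma * (zmax + np.log(np.exp(z - zmax).sum()))

def soft_dtw(x, y, gamma=1.0):
    """Soft (differentiable) DTW discrepancy between 1-D sequences
    x and y, with a squared Euclidean ground cost."""
    n, m = len(x), len(y)
    r = np.full((n + 1, m + 1), np.inf)
    r[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # Same recurrence as classical DTW, with min replaced
            # by its smoothed counterpart.
            r[i, j] = cost + soft_min(
                [r[i - 1, j], r[i, j - 1], r[i - 1, j - 1]], gamma)
    return r[n, m]
```

For small `gamma` this closely tracks the classical DTW distance, while remaining smooth enough to backpropagate through when fitting barycenters or forecasting models.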

Fri 10:00 a.m. - 10:15 a.m. [iCal]
Learning theory and algorithms for shapelets and other local features. Daiki Suehiro, Kohei Hatano, Eiji Takimoto, Shuji Yamamoto, Kenichi Bannai and Akiko Takeda. (Contributed talk)
Fri 10:15 a.m. - 10:30 a.m. [iCal]
Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting. Yaguang Li, Rose Yu, Cyrus Shahabi and Yan Liu. (Contributed talk)
Yaguang Li
Fri 10:30 a.m. - 11:00 a.m. [iCal]
Morning Coffee Break (Break)
Fri 11:00 a.m. - 11:45 a.m. [iCal]
Panel discussion featuring Marco Cuturi (ENSAE / CREST), Claire Monteleoni (GWU), Karthik Sridharan (Cornell), Firdaus Janoos (Two Sigma) and Matthias Seeger (Amazon) (Panel Discussion)
Fri 11:45 a.m. - 12:30 p.m. [iCal]

Feel free to enjoy posters at lunch time as well!

Víctor Campos, Brendan Jou, Xavier Giró-I-Nieto, Jordi Torres and Shih-Fu Chang. Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks.

Yao-Hung Hubert Tsai, Han Zhao, Nebojsa Jojic and Ruslan Salakhutdinov. Discovering Order in Unordered Datasets: Generative Markov Networks.

Yaguang Li, Rose Yu, Cyrus Shahabi and Yan Liu. Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting.

Alex Tank, Emily Fox and Ali Shojaie. An Efficient ADMM Algorithm for Structural Break Detection in Multivariate Time Series.

Hossein Soleimani, James Hensman and Suchi Saria. Scalable Joint Models for Reliable Event Prediction.

Daiki Suehiro, Kohei Hatano, Eiji Takimoto, Shuji Yamamoto, Kenichi Bannai and Akiko Takeda. Learning theory and algorithms for shapelets and other local features.

Tao-Yi Lee, Yuh-Jye Lee, Hsing-Kuo Pao, You-Hua Lin and Yi-Ren Yeh. Elastic Motif Segmentation and Alignment of Time Series for Encoding and Classification.

Yun Jie Serene Yeo, Kian Ming A. Chai, Weiping Priscilla Fan, Si Hui Maureen Lee, Junxian Ong, Poh Ling Tan, Yu Li Lydia Law and Kok-Yong Seng. DP Mixture of Warped Correlated GPs for Individualized Time Series Prediction.

Anish Agarwal, Muhammad Amjad, Devavrat Shah and Dennis Shen. Time Series Forecasting = Matrix Estimation.

Rose Yu, Stephan Zheng, Anima Anandkumar and Yisong Yue. Long-term Forecasting using Tensor-Train RNNs.

Pranamesh Chakraborty, Chinmay Hegde and Anuj Sharma. Trend Filtering in Network Time Series with Applications to Traffic Incident Detection.

Jaleh Zand and Stephen Roberts. MiDGaP: Mixture Density Gaussian Processes.

Dimitrios Giannakis, Joanna Slawinska, Abbas Ourmazd and Zhizhen Zhao. Vector-Valued Spectral Analysis of Space-Time Data.

Ruofeng Wen, Kari Torkkola and Balakrishnan Narayanaswamy. A Multi-Horizon Quantile Recurrent Forecaster.

Alessandro Davide Ialongo, Mark van der Wilk and Carl Edward Rasmussen. Closed-form Inference and Prediction in Gaussian Process State-Space Models.

Hao Liu, Haoli Bai, Lirong He and Zenglin Xu. Structured Inference for Recurrent Hidden Semi-markov Model.

Petar Veličković, Laurynas Karazija, Nicholas Lane, Sourav Bhattacharya, Edgar Liberis, Pietro Lio, Angela Chieh, Otmane Bellahsen and Matthieu Vegreville. Cross-modal Recurrent Models for Weight Objective Prediction from Multimodal Time-series Data.

Kun Tu, Bruno Ribeiro, Ananthram Swami and Don Towsley. Temporal Clustering in time-varying Networks with Time Series Analysis.

Shaojie Bai, J. Zico Kolter and Vladlen Koltun. Convolutional Sequence Modeling Revisited.

Apurv Shukla, Se-Young Yun and Daniel Bienstock. Non-Stationary Streaming PCA.

Kun Zhao, Takayuki Osogami and Rudy Raymond. Fluid simulation with dynamic Boltzmann machine in batch manner.

Anderson Zhang, Miao Lu, Deguang Kong and Jimmy Yang. Bayesian Time Series Forecasting with Change Point and Anomaly Detection.

Akara Supratak, Steffen Schneider, Hao Dong, Ling Li and Yike Guo. Towards Desynchronization Detection in Biosignals.

Rudy Raymond, Takayuki Osogami and Sakyasingha Dasgupta. Dynamic Boltzmann Machines for Second Order Moments and Generalized Gaussian Distributions.

Itamar Ben-Ari and Ravid Shwartz-Ziv. Sequence modeling using a memory controller extension for LSTM.

Neil Dhir and Adam Kosiorek. Bayesian delay embeddings for dynamical systems.

Aleksander Wieczorek and Volker Roth. Time Series Classification with Causal Compression.

Daniel Hernandez, Liam Paninski and John Cunningham. Variational inference for latent nonlinear dynamics.

Alex Tank, Ian Covert, Nick Foti, Ali Shojaie and Emily Fox. An Interpretable and Sparse Neural Network Model for Nonlinear Granger Causality Discovery.

John Alberg and Zachary Lipton. Improving Factor-Based Quantitative Investing by Forecasting Company Fundamentals.

Achintya Kr. Sarkar and Zheng-Hua Tan. Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification.

Ankit Gandhi, Vineet Chaoji and Arijit Biswas. Modeling Customer Time Series for Age Prediction.

Zahra Ebrahimzadeh and Samantha Kleinberg. Multi-Scale Change Point Detection in Multivariate Time Series.

Jaleh Zand, Kun Tu, Michael (Tao-Yi) Lee, Ian Covert, Daniel Hernandez, Shina Ebrahimzadeh, Joanna Slawinska, Akara Supratak, Miao Lu, John Alberg, Dennis Shen, Serene Yeo, Hsing-Kuo K Pao, Kian Ming Adam Chai, Anish Agarwal, Dimitrios Giannakis, Muhammad Amjad
Fri 12:30 p.m. - 2:30 p.m. [iCal]

Lunch on your own

Fri 2:30 p.m. - 2:45 p.m. [iCal]
Discovering Order in Unordered Datasets: Generative Markov Networks. Yao-Hung Hubert Tsai, Han Zhao, Nebojsa Jojic and Ruslan Salakhutdinov. (Contributed talk)
Fri 2:45 p.m. - 3:30 p.m. [iCal]
Vitaly Kuznetsov: Kaggle web traffic time series forecasting competition: results and insights (Talk)
Fri 3:30 p.m. - 4:00 p.m. [iCal]
Afternoon Coffee Break (Break)
Fri 4:00 p.m. - 4:15 p.m. [iCal]
Skip RNN: Learning to Skip State Updates in Recurrent Neural Networks. Víctor Campos, Brendan Jou, Xavier Giró-I-Nieto, Jordi Torres and Shih-Fu Chang. (Contributed talk)
Fri 4:15 p.m. - 5:00 p.m. [iCal]

Online learning is a framework that makes minimal assumptions about the sequence of instances provided to a learner. This makes online learning an excellent framework for dealing with sequences of instances that vary with time. In this talk, we will look at inherent connections between online learning, certain probabilistic inequalities, and the so-called Burkholder method. We will see how one can derive new, optimal, adaptive online learning algorithms using the Burkholder method via the connection with probabilistic inequalities. We will use this insight to move a step closer to what I shall term Plug-&-Play ML, that is, towards building machine learning systems automatically.

Fri 5:00 p.m. - 5:15 p.m. [iCal]
Scalable Joint Models for Reliable Event Prediction. Hossein Soleimani, James Hensman and Suchi Saria. (Contributed talk)
Fri 5:15 p.m. - 6:00 p.m. [iCal]

Climate Informatics is emerging as a compelling application of machine learning. This is due in part to the urgent nature of climate change and its many remaining uncertainties (e.g. how will a changing climate affect severe storms and other extreme weather events?). Meanwhile, progress in climate informatics is made possible in part by the public availability of vast amounts of data, both simulated by large-scale physics-based models and observed. Not only are time series at the crux of the study of climate science, but also, by definition, climate change implies non-stationarity. In addition, much of the relevant data is spatiotemporal, varying over both location and time. In this talk, I will discuss our work on learning in the presence of spatial and temporal non-stationarity, and on exploiting local dependencies in time and space. Along the way, I will highlight open problems in which machine learning, including deep learning methods, may prove fruitful.

Fri 6:00 p.m. - 6:15 p.m. [iCal]
An Efficient ADMM Algorithm for Structural Break Detection in Multivariate Time Series. Alex Tank, Emily Fox and Ali Shojaie. (Contributed talk)
Alex Tank
Fri 6:15 p.m. - 6:20 p.m. [iCal]
Conclusion and Awards (Concluding remarks and Awards)

Author Information

Vitaly Kuznetsov (Google Research)

Vitaly Kuznetsov is a research scientist at Google. Prior to joining Google Research, Vitaly received his Ph.D. in mathematics from the Courant Institute of Mathematical Sciences at New York University. Vitaly has contributed to a number of different areas in machine learning, in particular the development of the theory and algorithms for forecasting non-stationary time series. At Google, his work is focused on the design and implementation of large-scale machine learning tools and algorithms for time series modeling, forecasting and anomaly detection. His current research interests include all aspects of applied and theoretical time series analysis, in particular, in non-stationary environments.

Oren Anava (Technion)
Scott Yang (D. E. Shaw & Co.)
Azadeh Khaleghi (Mathematics & Statistics, Lancaster University)
