Towards Efficient Foundation Model: A Novel Time Series Embedding
Jessy Xinyi Han · Arth Dharaskar · Nathaniel Lanier · Abdullah Alomar · Aditya Agrawal · Angela Yuan · Jocelyn Hsieh · Ishan Shah · Muhammad Amjad · Devavrat Shah
Abstract
A Time Series Foundation Model (TSFM) learns appropriate embeddings from pre-training data and uses them to embed the input time series for in-context learning to produce forecasts. A TSFM requires a rich pre-training dataset and large computational resources to learn effective embeddings. In contrast, the traditional time series modeling paradigm generates a forecast for a given time series by fitting each of several pre-determined models and using the best of them to produce the forecast. Though resource efficient, it suffers from an inability to utilize pre-training data, along with the challenges involved in selecting the best model. In this work, we are motivated to bring the best of both worlds together to enable a resource-efficient TSFM approach. Towards that, we introduce a novel embedding of time series of any length and scale by mapping them to the unit square (i.e., $[0, 1]^2$), or equivalently a 2D image. To evaluate its efficacy compared to embeddings from a TSFM, we consider the task of model identification or classification on datasets where each time series is generated from one of many pre-determined model classes. We find that the performance of the proposed embeddings is comparable to that of embeddings from a pre-trained TSFM, but at a fraction of the resource requirement. This suggests an alternative architectural possibility for a compute-efficient TSFM.
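To make the idea of mapping an arbitrary-length, arbitrary-scale time series onto the unit square concrete, the sketch below shows one plausible construction: min-max normalize the time and value axes to $[0, 1]$, then rasterize the resulting points into a fixed-size 2D image. This is only an illustrative sketch; the function name `embed_to_unit_square`, the `grid_size` parameter, and the choice of min-max normalization are assumptions of this example, not details taken from the paper.

```python
import numpy as np

def embed_to_unit_square(series, grid_size=64):
    """Map a 1D time series of any length/scale onto the unit square [0, 1]^2,
    rendered as a grid_size x grid_size binary image.

    Hypothetical sketch: the paper's exact embedding construction may differ.
    """
    series = np.asarray(series, dtype=float)
    n = len(series)

    # Normalize the time axis to [0, 1].
    t = np.linspace(0.0, 1.0, n)

    # Normalize the value axis to [0, 1]; a constant series maps to 0.5.
    lo, hi = series.min(), series.max()
    v = np.full(n, 0.5) if hi == lo else (series - lo) / (hi - lo)

    # Rasterize the (t, v) points into a 2D image over the unit square.
    image = np.zeros((grid_size, grid_size))
    cols = np.minimum((t * grid_size).astype(int), grid_size - 1)
    rows = np.minimum((v * grid_size).astype(int), grid_size - 1)
    image[rows, cols] = 1.0
    return image

# Example: series of different lengths and scales land in the same embedding space.
img_a = embed_to_unit_square(100.0 * np.sin(np.linspace(0, 10, 500)))
img_b = embed_to_unit_square(np.random.randn(37))
```

Because every series is normalized to the same square and rendered at a fixed resolution, such an embedding is length- and scale-invariant by construction and can be fed to a standard image classifier, e.g., for the model-identification task described in the abstract.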