
Workshop: 4th Workshop on Self-Supervised Learning: Theory and Practice

Soft Contrastive Learning for Time Series

Seunghan Lee · Taeyoung Park · Kibok Lee


In contrastive learning for time series, contrasting similar time series instances, or values from adjacent timestamps within a time series, ignores their inherent correlations and degrades the quality of the learned representations. To address this issue, we propose SoftCLT, a simple yet effective soft contrastive learning strategy for time series. It introduces instance-wise and temporal contrastive losses with soft assignments. Specifically, we define soft assignments for 1) the instance-wise contrastive loss by the distance between time series in the data space, and 2) the temporal contrastive loss by the difference between timestamps. SoftCLT is a plug-and-play method for time series contrastive learning that improves the quality of learned representations. In experiments, we demonstrate that SoftCLT consistently improves performance on various downstream tasks.
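To make the two kinds of soft assignment concrete, here is a minimal sketch in NumPy. The sigmoid mapping, the temperature `tau`, and the L1 distance are illustrative assumptions, not the paper's exact formulation; the abstract only specifies that instance-wise assignments depend on data-space distance and temporal assignments on timestamp difference.

```python
import numpy as np

def soft_instance_assignments(X, tau=0.5):
    """Soft assignments between time series instances from pairwise
    distances in the data space (assumption: L1 distance mapped
    through a scaled sigmoid; not necessarily the paper's choice).
    X: (n_instances, seq_len) array. Returns (n, n) weights in (0, 1],
    with 1 on the diagonal and larger weights for closer pairs."""
    D = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=-1)
    # 2 * sigmoid(-tau * D): equals 1 when D == 0, decays toward 0.
    return 2.0 / (1.0 + np.exp(tau * D))

def soft_temporal_assignments(seq_len, tau=1.0):
    """Soft assignments between timestamps within one series from the
    absolute timestamp difference |t - t'| (same illustrative mapping)."""
    t = np.arange(seq_len)
    diff = np.abs(t[:, None] - t[None, :])
    return 2.0 / (1.0 + np.exp(tau * diff))
```

These weight matrices would then scale the positive/negative terms in the instance-wise and temporal contrastive losses, so that nearby instances and nearby timestamps are penalized less when treated as negatives.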
