SynBench: Task-Agnostic Benchmarking of Pretrained Representations using Synthetic Data
Ching-Yun Ko · Pin-Yu Chen · Jeet Mohapatra · Payel Das · Luca Daniel

Fri Dec 02 08:30 AM -- 08:32 AM (PST)
Event URL: https://openreview.net/forum?id=YVwQvDKFYNs

The recent success of fine-tuning large models, pretrained on broad data at scale, for downstream tasks has driven a significant paradigm shift in deep learning: from task-centric model design to task-agnostic representation learning followed by task-specific fine-tuning. As the representations of pretrained models serve as a foundation for diverse downstream tasks, this paper proposes a new task-agnostic framework, SynBench, to measure the quality of pretrained representations using synthetic data. The framework applies to a wide range of pretrained models that take continuous data inputs and is independent of downstream tasks and datasets. Evaluated on several pretrained vision transformer models, the experimental results show that the SynBench score closely matches the actual linear-probing performance of the pretrained model, and can inform the design of robust linear probing on pretrained representations to mitigate the robustness-accuracy tradeoff in downstream tasks.
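To make the linear-probing evaluation mentioned in the abstract concrete, here is a minimal, hypothetical sketch: it stands in for pretrained representations with synthetic class-conditional Gaussian features and fits a linear probe (logistic regression via gradient descent) on the frozen features. This is only an illustration of the general linear-probing idea, not the authors' actual SynBench procedure or scoring; all names and parameters below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for pretrained representations:
# synthetic class-conditional Gaussian features of dimension d,
# with class means separated along the first coordinate.
d, n = 16, 1000
mu = np.zeros(d)
mu[0] = 2.0
X_pos = rng.normal(loc=mu, scale=1.0, size=(n, d))
X_neg = rng.normal(loc=-mu, scale=1.0, size=(n, d))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(n), -np.ones(n)])

# Linear probe: logistic regression trained by full-batch gradient
# descent on the frozen features (labels in {-1, +1}).
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(200):
    z = X @ w + b
    grad = -y / (1.0 + np.exp(y * z))  # d(logistic loss)/dz per sample
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

acc = np.mean(np.sign(X @ w + b) == y)
print(f"linear-probe accuracy: {acc:.3f}")
```

With this mean separation the two classes are nearly linearly separable, so the probe should reach high accuracy; varying the separation gives a simple knob for how representation quality translates into probing performance.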

Author Information

Ching-Yun Ko (MIT)
Pin-Yu Chen (IBM Research)
Jeet Mohapatra (MIT)
Payel Das (IBM Research)
Luca Daniel (Massachusetts Institute of Technology)
