78 Results
- Affinity Workshop | DistillEmb: Distilling Word Embeddings via Contrastive Learning | Amanuel Mersha · Stephen Wu
- Poster | Teach Less, Learn More: On the Undistillable Classes in Knowledge Distillation | Yichen Zhu · Ning Liu · Zhiyuan Xu · Xin Liu · Weibin Meng · Louis Wang · Zhicai Ou · Jian Tang
- Poster | Wed 9:00 | Distilled Gradient Aggregation: Purify Features for Input Attribution in the Deep Neural Network | Giyoung Jeon · Haedong Jeong · Jaesik Choi
- Poster | Learning Generalizable Models for Vehicle Routing Problems via Knowledge Distillation | Jieyi Bi · Yining Ma · Jiahai Wang · Zhiguang Cao · Jinbiao Chen · Yuan Sun · Yeow Meng Chee
- Poster | Wed 14:00 | Fairness without Demographics through Knowledge Distillation | Junyi Chai · Taeuk Jang · Xiaoqian Wang