Search All 2023 Events
17 Results — Page 2 of 2
Poster
Wed 15:00 NEO-KD: Knowledge-Distillation-Based Adversarial Training for Robust Multi-Exit Neural Networks
Seokil Ham · Jungwuk Park · Dong-Jun Han · Jaekyun Moon
Workshop
Data Distillation for Neural Network Potentials toward Foundational Dataset
Gang Seob Jung · Sangkeun Lee · Jong Choi
Poster
Wed 15:00 Does Graph Distillation See Like Vision Dataset Counterpart?
Beining Yang · Kai Wang · Qingyun Sun · Cheng Ji · Xingcheng Fu · Hao Tang · Yang You · Jianxin Li
Poster
Tue 15:15 Accelerating Molecular Graph Neural Networks via Knowledge Distillation
Filip Ekström Kelvinius · Dimitar Georgiev · Artur Toshev · Johannes Gasteiger
Workshop
Mixup-Based Knowledge Distillation with Causal Intervention for Multi-Task Speech Classification
Kwangje Baeg · Hyeopwoo Lee · Yeomin Yoon · Jongmo Kim