26 Results
Poster | Wed 15:00 | NEO-KD: Knowledge-Distillation-Based Adversarial Training for Robust Multi-Exit Neural Networks | Seokil Ham · Jungwuk Park · Dong-Jun Han · Jaekyun Moon
Workshop | What is Lost in Knowledge Distillation? | Manas Ranjan Mohanty · Tanya Roosta · Peyman Passban
Workshop | Scene-adaptive Knowledge Distillation for Sequential Recommendation via Differentiable Architecture Search | Lei Chen
Workshop | Toward Student-oriented Teacher Network Training for Knowledge Distillation | Chengyu Dong · Liyuan Liu · Jingbo Shang
Poster | Wed 15:00 | Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering | Yijun Dong · Kevin Miller · Qi Lei · Rachel Ward
Poster | Tue 8:45 | Knowledge Diffusion for Distillation | Tao Huang · Yuan Zhang · Mingkai Zheng · Shan You · Fei Wang · Chen Qian · Chang Xu
Poster | Thu 15:00 | Knowledge Distillation Performs Partial Variance Reduction | Mher Safaryan · Alexandra Peste · Dan Alistarh
Poster | Wed 8:45 | What Knowledge Gets Distilled in Knowledge Distillation? | Utkarsh Ojha · Yuheng Li · Anirudh Sundara Rajan · Yingyu Liang · Yong Jae Lee
Expo Demonstration | Sun 9:00 | Fast and Accurate Inference of LLaMA2-Chat 7B on a Smartphone via Quantization Aware Training and Speculative Decoding with Knowledge Distillation | Ron Tindall
Workshop | What Does Knowledge Distillation Distill? | Cindy Wu · Ekdeep S Lubana · Bruno Mlodozeniec · Robert Kirk · David Krueger
Poster | Wed 8:45 | Propagating Knowledge Updates to LMs Through Distillation | Shankar Padmanabhan · Yasumasa Onoe · Michael Zhang · Greg Durrett · Eunsol Choi