Search All 2022 Events

78 Results

Page 2 of 7
Poster
Tue 9:00 XTC: Extreme Compression for Pre-trained Transformers Made Simple and Efficient
Xiaoxia Wu · Zhewei Yao · Minjia Zhang · Conglong Li · Yuxiong He
Poster
Wed 14:00 Structural Knowledge Distillation for Object Detection
Philip de Rijk · Lukas Schneider · Marius Cordts · Dariu Gavrila
Poster
Thu 14:00 Efficient Dataset Distillation using Random Feature Approximation
Noel Loo · Ramin Hasani · Alexander Amini · Daniela Rus
Poster
Respecting Transfer Gap in Knowledge Distillation
Yulei Niu · Long Chen · Chang Zhou · Hanwang Zhang
Poster
Tue 14:00 Decomposing NeRF for Editing via Feature Field Distillation
Sosuke Kobayashi · Eiichi Matsumoto · Vincent Sitzmann
Poster
Tue 9:00 Dataset Distillation using Neural Feature Regression
Yongchao Zhou · Ehsan Nezhadarya · Jimmy Ba
Poster
Tue 9:00 What Makes a "Good" Data Augmentation in Knowledge Distillation - A Statistical Perspective
Huan Wang · Suhas Lohit · Michael Jones · Yun Fu
Poster
Tue 14:00 DENSE: Data-Free One-Shot Federated Learning
Jie Zhang · Chen Chen · Bo Li · Lingjuan Lyu · Shuang Wu · Shouhong Ding · Chunhua Shen · Chao Wu
Poster
Tue 9:00 Knowledge Distillation: Bad Models Can Be Good Role Models
Gal Kaplun · Eran Malach · Preetum Nakkiran · Shai Shalev-Shwartz
Poster
Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again
Xin-Chun Li · Wen-Shu Fan · Shaoming Song · Yinchuan Li · Bingshuai Li · Yunfeng Shao · De-Chuan Zhan
Poster
Thu 14:00 SeqPATE: Differentially Private Text Generation via Knowledge Distillation
Zhiliang Tian · Yingxiu Zhao · Ziyue Huang · Yu-Xiang Wang · Nevin L. Zhang · He He
Poster
Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks
Lirong Wu · Haitao Lin · Yufei Huang · Stan Z. Li