Poster

Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher

Guangda Ji · Zhanxing Zhu
2020 Poster

Abstract
