Workshop | Constructing Memory: Consolidation as Teacher-Student Training of a Generative Model | Eleanor Spens · Neil Burgess
Poster | Tue 9:00 | Knowledge Distillation: Bad Models Can Be Good Role Models | Gal Kaplun · Eran Malach · Preetum Nakkiran · Shai Shalev-Shwartz
Poster | Thu 9:00 | An Analytical Theory of Curriculum Learning in Teacher-Student Networks | Luca Saglietti · Stefano Mannelli · Andrew Saxe
Poster | Thu 14:00 | Training Spiking Neural Networks with Local Tandem Learning | Qu Yang · Jibin Wu · Malu Zhang · Yansong Chua · Xinchao Wang · Haizhou Li
Poster | Tue 9:00 | What Makes a "Good" Data Augmentation in Knowledge Distillation - A Statistical Perspective | Huan Wang · Suhas Lohit · Michael Jones · Yun Fu
Workshop | RoTaR: Efficient Row-Based Table Representation Learning via Teacher-Student Training (Short Paper) | Zui Chen · Lei Cao · Samuel Madden
Workshop | Honest Students from Untrusted Teachers: Learning an Interpretable Question-Answering Pipeline from a Pretrained Language Model | Jacob Eisenstein · Daniel Andor · Bernd Bohnet · Michael Collins · David Mimno