

Contributed Talk in Workshop: Machine Learning with Guarantees

James Lucas, "Information-theoretic limitations on novel task generalization"



Abstract:

Machine learning models have traditionally been developed under the assumption that the training and test distributions match exactly. However, recent successes in few-shot learning and related problems are encouraging signs that these models can be adapted to more realistic settings where the training and test distributions differ. Unfortunately, theoretical support for these algorithms is severely limited, and little is known about the difficulty of these problems. In this work, we provide novel information-theoretic lower bounds on minimax rates of convergence for algorithms that are trained on data from multiple sources and tested on novel data. Our bounds depend intuitively on the information shared between sources of data and characterize the difficulty of learning in this setting for arbitrary algorithms.
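For orientation only: the abstract does not state the bound itself, so the following is a generic sketch of what a minimax lower bound in this multi-source setting typically looks like, with the environment $\mathcal{E}$, sample counts $n$ and $t$, risk $R$, and rate function $\phi$ all assumed placeholders rather than the talk's actual quantities.

\[
% Illustrative sketch only; \mathcal{E}, n, t, R, and \phi are assumed placeholders.
\mathcal{R}^*(n, t) \;=\; \inf_{\hat{f}} \,\sup_{\mathcal{E}}\, \mathbb{E}\Big[ R_{P_{\mathrm{novel}}}(\hat{f}) \;-\; \inf_{f} R_{P_{\mathrm{novel}}}(f) \Big] \;\geq\; c\,\phi(n, t),
\]

where $\hat{f}$ ranges over algorithms trained on $n$ samples from each of $t$ source distributions drawn from an environment $\mathcal{E}$, and risk is evaluated on a novel distribution $P_{\mathrm{novel}}$ drawn from the same environment. An information-theoretic lower bound of this form says no algorithm can beat the rate $\phi(n, t)$, which, per the abstract, would here depend on the information shared between the sources.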
