A Novel Stochastic Gradient Descent Algorithm for Learning Principal Subspaces
Charline Le Lan · Joshua Greaves · Jesse Farebrother · Mark Rowland · Fabian Pedregosa · Rishabh Agarwal · Marc Bellemare
Event URL: https://openreview.net/forum?id=i1h0gZ0KTxZ

In this paper, we derive an algorithm that learns a principal subspace from sample entries, can be applied when the approximate subspace is represented by a neural network, and hence can be scaled to datasets with an effectively infinite number of rows and columns. Our method consists in defining a loss function whose minimizer is the desired principal subspace, and constructing a gradient estimate of this loss whose bias can be controlled.
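The abstract's recipe, a loss whose minimizer is the principal subspace, optimized by stochastic gradients, can be illustrated with a minimal sketch. The code below is not the paper's estimator: it runs plain SGD on the classical reconstruction loss L(U) = E‖x − UUᵀx‖², whose minimizers span the top-k principal subspace, using full sampled rows rather than the paper's sample-entry, bias-controlled gradient estimates. All dimensions and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: anisotropic Gaussian with a clear top-k principal subspace.
d, k, n = 10, 3, 2000
basis = np.linalg.qr(rng.normal(size=(d, d)))[0]      # random orthonormal basis
scales = np.array([3.0, 2.0, 1.5] + [0.1] * (d - k))  # large gap after component k
X = (rng.normal(size=(n, d)) * scales) @ basis.T      # rows are samples

def recon_err(U):
    """Mean reconstruction error E||x - U U^T x||^2 over the dataset."""
    R = X - X @ U @ U.T
    return float(np.mean(np.sum(R * R, axis=1)))

# SGD on L(U) = E||x - U U^T x||^2; its minimizers are (orthonormal bases of)
# the top-k principal subspace -- an Oja-style objective, not the paper's loss.
U = 0.1 * rng.normal(size=(d, k))
init_err = recon_err(U)
lr = 1e-3
for step in range(30_000):
    x = X[rng.integers(n)]
    r = x - U @ (U.T @ x)  # residual after projecting x onto span(U)
    # Gradient of ||x - U U^T x||^2 with respect to U: -2 (r x^T U + x r^T U).
    grad = -2.0 * (np.outer(r, U.T @ x) + np.outer(x, U.T @ r))
    U -= lr * grad
final_err = recon_err(U)

# Alignment check: singular values of Q^T B are the cosines of the principal
# angles between span(U) and the true top-k subspace (all near 1 if aligned).
Q, _ = np.linalg.qr(U)
cosines = np.linalg.svd(Q.T @ basis[:, :k], compute_uv=False)
```

The paper's contribution lies elsewhere: its gradient estimate is built from individual matrix entries (so the matrix never needs to be materialized) and its bias can be controlled, whereas this sketch assumes access to whole sample rows.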

Author Information

Charline Le Lan (University of Oxford)
Joshua Greaves (Google)
Jesse Farebrother (Mila / McGill University)
Mark Rowland (DeepMind)
Fabian Pedregosa (Google AI)
Rishabh Agarwal (Google Research, Brain Team)

My research mainly revolves around deep reinforcement learning (RL), often with the goal of making RL methods suitable for real-world problems; this work includes an outstanding paper award at NeurIPS.

Marc Bellemare (Google Brain)