Poster
Stochastic Multiple Choice Learning for Training Diverse Deep Ensembles
Stefan Lee · Senthil Purushwalkam · Michael Cogswell · Viresh Ranjan · David Crandall · Dhruv Batra

Tue Dec 06 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #130

Many practical perception systems exist within larger processes which often include interactions with users or additional components that are capable of evaluating the quality of predicted solutions. In these contexts, it is beneficial to provide these oracle mechanisms with multiple highly likely hypotheses rather than a single prediction. In this work, we pose the task of producing multiple outputs as a learning problem over an ensemble of deep networks -- introducing a novel stochastic gradient descent-based approach to minimize the loss with respect to an oracle. Our method is simple to implement, agnostic to both architecture and loss function, and parameter-free. Our approach achieves lower oracle error compared to existing methods on a wide range of tasks and deep architectures. We also show qualitatively that solutions produced from our approach often provide interpretable representations of task ambiguity.
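As a rough illustration of the oracle-loss idea described in the abstract, the sketch below routes each training example's gradient to the ensemble member that currently achieves the lowest loss on it, so that members specialize on different plausible outputs. It assumes a PyTorch classification setup; the names `smcl_step`, `models`, and `optimizer` are placeholders, and this is an interpretation of the abstract's description rather than the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def smcl_step(models, optimizer, x, y):
    """One 'winner-take-gradient' update: each example only trains the
    ensemble member that currently assigns it the lowest loss."""
    # Per-member, per-example losses, shape (num_members, batch_size)
    losses = torch.stack(
        [F.cross_entropy(m(x), y, reduction="none") for m in models]
    )

    # Best member for each example (argmin is non-differentiable, so the
    # assignment itself carries no gradient)
    best = losses.argmin(dim=0)
    mask = F.one_hot(best, num_classes=len(models)).T.float()

    # Oracle loss: only the winning member's loss counts for each example
    oracle_loss = (losses * mask).sum(dim=0).mean()

    optimizer.zero_grad()
    oracle_loss.backward()
    optimizer.step()
    return oracle_loss.item()
```

In this sketch a single optimizer would hold the parameters of all members, e.g. `torch.optim.SGD([p for m in models for p in m.parameters()], lr=0.1)`; on any given example, only the winning member receives a nonzero gradient.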

Author Information

Stefan Lee (Indiana University)
Senthil Purushwalkam (Carnegie Mellon)
Michael Cogswell (Virginia Tech)
Viresh Ranjan (Virginia Tech)
David Crandall (Indiana University)
Dhruv Batra (FAIR (Meta) / Georgia Tech)
