Spotlight
Constant Regret, Generalized Mixability, and Mirror Descent
Zakaria Mhammedi · Robert Williamson
We consider the setting of prediction with expert advice; a learner makes predictions by aggregating those of a group of experts. Under this setting, and for the right choice of loss function and ``mixing'' algorithm, it is possible for the learner to achieve a constant regret regardless of the number of prediction rounds. For example, a constant regret can be achieved for \emph{mixable} losses using the \emph{aggregating algorithm}. The \emph{Generalized Aggregating Algorithm} (GAA) is a name for a family of algorithms parameterized by convex functions on simplices (entropies), which reduce to the aggregating algorithm when using the \emph{Shannon entropy} $\operatorname{S}$. For a given entropy $\Phi$, losses for which a constant regret is possible using the \textsc{GAA} are called $\Phi$-mixable. Which losses are $\Phi$-mixable was previously left as an open question. We fully characterize $\Phi$-mixability and answer other open questions posed by \cite{Reid2015}. We show that the Shannon entropy $\operatorname{S}$ is fundamental in nature when it comes to mixability; any $\Phi$-mixable loss is necessarily $\operatorname{S}$-mixable, and the lowest worst-case regret of the \textsc{GAA} is achieved using the Shannon entropy. Finally, by leveraging the connection between the \emph{mirror descent algorithm} and the update step of the GAA, we suggest a new \emph{adaptive} generalized aggregating algorithm and analyze its performance in terms of the regret bound.
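The constant-regret phenomenon the abstract describes can be illustrated with a minimal sketch of the classical aggregating algorithm for the log loss, which is mixable (with learning rate $\eta = 1$): the learner predicts the weighted mixture of expert forecasts and updates the weights multiplicatively. The data, variable names, and round/expert counts below are hypothetical illustrations, not taken from the paper; the sketch shows only that the resulting regret is bounded by $\ln N$, independent of the number of rounds $T$.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 5  # rounds and number of experts (arbitrary choices)

# Hypothetical experts: each predicts a probability for a binary outcome.
expert_preds = rng.uniform(0.05, 0.95, size=(T, N))
outcomes = rng.integers(0, 2, size=T)

def log_loss(p, y):
    """Log loss of predicted probability p for binary outcome y."""
    return -np.log(p) if y == 1 else -np.log(1.0 - p)

weights = np.ones(N) / N      # uniform prior over experts
learner_loss = 0.0
expert_losses = np.zeros(N)

for t in range(T):
    preds = expert_preds[t]
    # Aggregating-algorithm prediction for log loss (eta = 1):
    # the weight-averaged mixture of the experts' forecasts.
    mixture = float(weights @ preds)
    learner_loss += log_loss(mixture, outcomes[t])

    losses = np.array([log_loss(p, outcomes[t]) for p in preds])
    expert_losses += losses
    # Multiplicative ("Bayesian") weight update: w_i <- w_i * exp(-loss_i).
    weights *= np.exp(-losses)
    weights /= weights.sum()

regret = learner_loss - expert_losses.min()
print(f"regret = {regret:.4f}  vs  ln N = {np.log(N):.4f}")
```

For log loss the cumulative mixture loss telescopes to the negative log evidence of the uniform mixture, so the regret against the best expert is at most $\ln N$ no matter how large $T$ grows; this is the constant regret that mixability guarantees.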
Author Information
Zakaria Mhammedi (The Australian National University)
Robert Williamson (Australian National University & Data61)
Related Events (a corresponding poster, oral, or spotlight)

- 2018 Poster: Constant Regret, Generalized Mixability, and Mirror Descent
  Wed Dec 5th 03:45 -- 05:45 PM, Room 210
More from the Same Authors

- 2020 Poster: PAC-Bayesian Bound for the Conditional Value at Risk
  Zakaria Mhammedi · Benjamin Guedj · Robert Williamson
- 2020 Poster: Learning the Linear Quadratic Regulator from Nonlinear Observations
  Zakaria Mhammedi · Dylan Foster · Max Simchowitz · Dipendra Misra · Wen Sun · Akshay Krishnamurthy · Alexander Rakhlin · John Langford
- 2020 Spotlight: PAC-Bayesian Bound for the Conditional Value at Risk
  Zakaria Mhammedi · Benjamin Guedj · Robert Williamson
- 2019 Poster: PAC-Bayes Un-Expected Bernstein Inequality
  Zakaria Mhammedi · Peter Grünwald · Benjamin Guedj
- 2019 Poster: A Primal-Dual link between GANs and Autoencoders
  Hisham Husain · Richard Nock · Robert Williamson
- 2018 Poster: A loss framework for calibrated anomaly detection
  Aditya Menon · Robert Williamson
- 2018 Spotlight: A loss framework for calibrated anomaly detection
  Aditya Menon · Robert Williamson
- 2017 Poster: f-GANs in an Information Geometric Nutshell
  Richard Nock · Zac Cranko · Aditya K Menon · Lizhen Qu · Robert Williamson
- 2017 Spotlight: f-GANs in an Information Geometric Nutshell
  Richard Nock · Zac Cranko · Aditya K Menon · Lizhen Qu · Robert Williamson
- 2015 Poster: Learning with Symmetric Label Noise: The Importance of Being Unhinged
  Brendan van Rooyen · Aditya Menon · Robert Williamson
- 2015 Spotlight: Learning with Symmetric Label Noise: The Importance of Being Unhinged
  Brendan van Rooyen · Aditya Menon · Robert Williamson
- 2014 Poster: From Stochastic Mixability to Fast Rates
  Nishant Mehta · Robert Williamson
- 2014 Oral: From Stochastic Mixability to Fast Rates
  Nishant Mehta · Robert Williamson
- 2012 Poster: Mixability in Statistical Learning
  Tim van Erven · Peter Grünwald · Mark Reid · Robert Williamson
- 2011 Workshop: Relations between machine learning problems - an approach to unify the field
  Robert Williamson · John Langford · Ulrike von Luxburg · Mark Reid · Jennifer Wortman Vaughan
- 2011 Poster: Composite Multiclass Losses
  Elodie Vernet · Robert Williamson · Mark Reid
- 2009 Workshop: Clustering: Science or art? Towards principled approaches
  Margareta Ackerman · Shai Ben-David · Avrim Blum · Isabelle Guyon · Ulrike von Luxburg · Robert Williamson · Reza Zadeh