Poster
Pointwise Bounds for Distribution Estimation under Communication Constraints
Wei-Ning Chen · Peter Kairouz · Ayfer Ozgur
We consider the problem of estimating a $d$-dimensional discrete distribution from its samples observed under a $b$-bit communication constraint. In contrast to most previous results that largely focus on the global minimax error, we study the local behavior of the estimation error and provide \emph{pointwise} bounds that depend on the target distribution $p$. In particular, we show that the $\ell_2$ error decays with $O\left(\frac{\lVert p\rVert_{1/2}}{n2^b}\vee \frac{1}{n}\right)$ when $n$ is sufficiently large, hence it is governed by the \emph{half-norm} of $p$ instead of the ambient dimension $d$. For the achievability result, we propose a two-round sequentially interactive estimation scheme that achieves this error rate uniformly over all $p$. Our scheme is based on a novel local refinement idea: we first use a standard global minimax scheme to localize $p$, and then use the remaining samples to locally refine our estimate. We also develop a new local minimax lower bound with (almost) matching $\ell_2$ error, showing that any interactive scheme must admit an $\Omega\left( \frac{\lVert p \rVert_{{(1+\delta)}/{2}}}{n2^b}\right)$ $\ell_2$ error for any $\delta > 0$ when $n$ is sufficiently large. The lower bound is derived by first finding the best parametric submodel containing $p$, and then upper bounding the quantized Fisher information under this model. Our upper and lower bounds together indicate that $\mathsf{H}_{1/2}(p) = \log(\lVert p \rVert_{{1}/{2}})$ bits of communication is both sufficient and necessary to achieve the optimal (centralized) performance, where $\mathsf{H}_{{1}/{2}}(p)$ is the R\'enyi entropy of $p$ of order ${1}/{2}$. Therefore, under the $\ell_2$ loss, the correct measure of the local communication complexity at $p$ is its R\'enyi entropy.
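To see why the half-norm $\lVert p\rVert_{1/2} = \left(\sum_i \sqrt{p_i}\right)^2$ rather than the ambient dimension $d$ governs the rate, a minimal numeric sketch (the helper names below are our own, not from the paper): for the uniform distribution the half-norm equals $d$, recovering the global minimax rate, while for a rapidly decaying distribution it stays bounded even as $d$ grows.

```python
import math

def half_norm(p):
    """||p||_{1/2} = (sum_i sqrt(p_i))^2; lies between 1 and len(p)."""
    return sum(math.sqrt(x) for x in p) ** 2

def renyi_half_entropy(p):
    """Renyi entropy of order 1/2 in nats: H_{1/2}(p) = log ||p||_{1/2}."""
    return math.log(half_norm(p))

d = 1024
uniform = [1.0 / d] * d
# geometric-like distribution: mass 2^{-(i+1)} on symbol i,
# with the leftover mass placed on the last symbol
geo = [2.0 ** -(i + 1) for i in range(d - 1)]
geo.append(1.0 - sum(geo))

print(half_norm(uniform))  # equals d = 1024: worst case over the simplex
print(half_norm(geo))      # stays small (~5.8) despite d = 1024
```

Under the paper's bound, the pointwise $\ell_2$ error at the geometric-like $p$ above scales with $\approx 6/(n2^b)$ rather than $d/(n2^b)$, which is the sense in which the local complexity is an entropy of $p$, not a property of the alphabet size.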
Author Information
Wei-Ning Chen (Stanford University)
Peter Kairouz (Google)
Ayfer Ozgur (Stanford University)
More from the Same Authors

2021 : Communication Efficient Federated Learning with Secure Aggregation and Differential Privacy »
Wei-Ning Chen · Christopher Choquette-Choo · Peter Kairouz 
2021 Poster: The Skellam Mechanism for Differentially Private Federated Learning »
Naman Agarwal · Peter Kairouz · Ken Liu 
2021 Poster: Batched Thompson Sampling »
Cem Kalkanli · Ayfer Ozgur 
2020 Tutorial: (Track1) Federated Learning and Analytics: Industry Meets Academia Q&A »
Peter Kairouz · Brendan McMahan · Virginia Smith 
2020 Poster: Breaking the Communication-Privacy-Accuracy Trilemma »
Wei-Ning Chen · Peter Kairouz · Ayfer Ozgur