Poster
Pointwise Bounds for Distribution Estimation under Communication Constraints
Wei-Ning Chen · Peter Kairouz · Ayfer Ozgur
Abstract:
We consider the problem of estimating a $d$-dimensional discrete distribution from its samples observed under a $b$-bit communication constraint. In contrast to most previous results that largely focus on the global minimax error, we study the local behavior of the estimation error and provide \emph{pointwise} bounds that depend on the target distribution $p$. In particular, we show that the $\ell_2$ error decays with $O\left(\frac{\|p\|_{1/2}}{n2^b} \vee \frac{1}{n}\right)$ when $n$ is sufficiently large, hence it is governed by the \emph{half-norm} of $p$ instead of the ambient dimension $d$. For the achievability result, we propose a two-round sequentially interactive estimation scheme that achieves this error rate uniformly over all $p$. Our scheme is based on a novel local refinement idea, where we first use a standard global minimax scheme to localize $p$, and then use the remaining samples to locally refine our estimate. We also develop a new local minimax lower bound with (almost) matching $\ell_2$ error, showing that any interactive scheme must admit an $\Omega\left(\frac{\|p\|_{(1+\delta)/2}}{n2^b}\right)$ error for any $\delta > 0$ when $n$ is sufficiently large. The lower bound is derived by first finding the best parametric sub-model containing $p$, and then upper bounding the quantized Fisher information under this model. Our upper and lower bounds together indicate that $b = H_{1/2}(p)$ bits of communication are both sufficient and necessary to achieve the optimal (centralized) performance, where $H_{1/2}(p)$ is the R\'enyi entropy of $p$ of order $1/2$. Therefore, under the $\ell_2$ loss, the correct measure of the local communication complexity at $p$ is its R\'enyi entropy.
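The two local complexity measures in the abstract are linked by the identity $\|p\|_{1/2} = \left(\sum_i \sqrt{p_i}\right)^2 = 2^{H_{1/2}(p)}$, which is why the half-norm in the error rate and the R\'enyi entropy in the communication budget measure the same thing. A minimal sketch verifying this numerically (function names are ours, not from the paper):

```python
import math

def half_norm(p):
    # ||p||_{1/2} = (sum_i sqrt(p_i))^2
    return sum(math.sqrt(pi) for pi in p) ** 2

def renyi_entropy_half(p):
    # H_{1/2}(p) = 2 * log2(sum_i sqrt(p_i)): Renyi entropy of order 1/2, in bits
    return 2 * math.log2(sum(math.sqrt(pi) for pi in p))

# Uniform distribution over d symbols: half-norm d, entropy log2(d),
# recovering the global minimax (dimension-dependent) regime.
d = 8
uniform = [1.0 / d] * d
print(half_norm(uniform), renyi_entropy_half(uniform))

# A skewed distribution has half-norm far below the ambient dimension d,
# which is exactly when the pointwise bound improves on the global one.
skewed = [0.9] + [0.1 / (d - 1)] * (d - 1)
assert abs(half_norm(skewed) - 2 ** renyi_entropy_half(skewed)) < 1e-9
```

For the uniform distribution the two quantities evaluate to $d$ and $\log_2 d$, matching the classical $d$-dependent minimax rate; for skewed $p$ both shrink together.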