

Poster

Estimation of Entropy in Constant Space with Improved Sample Complexity

Maryam Aliakbarpour · Andrew McGregor · Jelani Nelson · Erik Waingarten

Hall J (level 1) #819

Keywords: [ Sample Complexity ] [ Shannon Entropy ] [ Data streams ]


Abstract: Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution D over an alphabet of size k up to ±ε additive error by streaming over (k/ε³)·polylog(1/ε) i.i.d. samples and using only O(1) words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to (k/ε²)·polylog(1/ε). We conjecture that this is optimal up to polylog(1/ε) factors.
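For context, the classical baseline that such streaming schemes improve upon is the plug-in (empirical) entropy estimator, which stores a count per symbol and therefore uses O(k) words of memory rather than O(1). A minimal sketch (the function name `empirical_entropy` is my own; this is not the paper's algorithm):

```python
import math
from collections import Counter

def empirical_entropy(samples):
    """Plug-in estimator: H(p_hat) = -sum over x of p_hat(x) * ln(p_hat(x)).

    Stores one count per distinct symbol, so memory grows with the
    alphabet size k -- unlike the O(1)-word streaming schemes discussed
    in the abstract. Returns the entropy estimate in nats.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A fair coin has entropy ln 2 ~= 0.693 nats; with exactly balanced
# samples the plug-in estimate recovers it exactly.
print(round(empirical_entropy([0, 1] * 5000), 3))  # → 0.693
```

The plug-in estimator is also biased for small sample sizes, which is part of why achieving ±ε additive error with few samples and little memory is nontrivial.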
