Abstract:
We revisit the fundamental problem of learning Axis-Aligned Rectangles over a finite grid $X^d \subseteq \mathbb{R}^d$ with differential privacy. Existing results show that the sample complexity of this problem is at most $\min\left\{ d\cdot\log|X|,\; d^{1.5}\cdot(\log^*|X|)^{1.5} \right\}$. That is, existing constructions either require sample complexity that grows linearly with $\log|X|$, or else it grows super-linearly with the dimension $d$. We present a novel algorithm that reduces the sample complexity to only $\tilde{O}\!\left(d\cdot(\log^*|X|)^{1.5}\right)$, attaining a dimensionality-optimal dependency without requiring the sample complexity to grow with $\log|X|$. The technique used to attain this improvement involves the deletion of "exposed" data points on the go, in a fashion designed to avoid the cost of the adaptive composition theorems. The core of this technique may be of independent interest, introducing a new method for constructing statistically efficient private algorithms.
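To make the gap between these bounds concrete, here is a minimal illustrative sketch (not from the paper; the dimension $d$, the grid sizes, and the use of base-2 logarithms are assumptions chosen for illustration) comparing how the three expressions grow as the grid $X$ gets finer. The point is that the iterated logarithm $\log^*|X|$ is essentially constant for any realistic grid size, while $\log|X|$ keeps growing.

```python
import math

def log_star(x: float) -> int:
    """Iterated logarithm: how many times log2 must be applied before x <= 1."""
    count = 0
    while x > 1.0:
        x = math.log2(x)
        count += 1
    return count

d = 100  # dimension (illustrative choice, not from the paper)

# |X| = 2**bits, so log2|X| = bits; try increasingly fine grids.
for bits in (10, 100, 1000):
    ls = log_star(2.0 ** bits)
    prior_linear = d * bits          # O(d * log|X|): linear in log|X|
    prior_super  = d**1.5 * ls**1.5  # O(d^1.5 * (log*|X|)^1.5): super-linear in d
    new_bound    = d * ls**1.5       # O~(d * (log*|X|)^1.5): the paper's bound
    print(f"log|X|={bits:>4}  log*|X|={ls}  "
          f"d*log|X|={prior_linear:>6}  d^1.5*(log*)^1.5={prior_super:>8.0f}  "
          f"d*(log*)^1.5={new_bound:>7.1f}")
```

Even as $\log|X|$ grows from 10 to 1000, $\log^*|X|$ stays at 4 or 5, so the new bound remains essentially linear in $d$ alone.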