Poster
Non-parametric classification via expand-and-sparsify representation
Kaushik Sinha
Abstract:
In *expand-and-sparsify* (EaS) representation, a data point in $\mathcal{S}^{d-1}$ is first randomly mapped to a higher dimension $\mathbb{R}^m$, where $m/d \to \infty$, followed by a sparsification operation in which the informative $k \ll m$ of the $m$ coordinates are set to one and the rest are set to zero. We propose two algorithms for non-parametric classification using such an EaS representation. For our first algorithm, we use a *winners-take-all* operation for the sparsification step, show that the proposed classifier admits the form of a locally weighted average classifier, and establish its consistency via Stone's theorem. Further, assuming that the conditional probability function is Hölder continuous, and for the optimal choice of $m$, we show that the convergence rate of this classifier is minimax-optimal. For our second algorithm, we use an *empirical $k$-thresholding* operation for the sparsification step, and under the assumption that the data lie on a low-dimensional manifold of dimension $d_0 \ll d$, we show that the convergence rate of this classifier depends only on $d_0$ and is again minimax-optimal. Empirical evaluations performed on real-world datasets corroborate our theoretical results.
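The first algorithm can be sketched in a few lines of NumPy: expand a point with a random projection, sparsify via winners-take-all (set the $k$ largest coordinates to one), and classify by a locally weighted average whose weights are the overlaps between sparse codes. Everything below — the function names, the toy labeled data, and the parameter values ($d=5$, $m=200$, $k=10$) — is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

def eas_representation(x, W, k):
    """Expand-and-sparsify with winners-take-all: project x to m dimensions,
    set the k largest coordinates of Wx to one and the rest to zero."""
    z = W @ x
    code = np.zeros_like(z)
    code[np.argsort(z)[-k:]] = 1.0
    return code

rng = np.random.default_rng(0)
d, m, k = 5, 200, 10                 # expansion m >> d, sparsity k << m (toy values)
W = rng.standard_normal((m, d))      # random expansion matrix

# Toy training set on the unit sphere; labels depend on the first coordinate.
X = rng.standard_normal((100, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = (X[:, 0] > 0).astype(float)

codes = np.array([eas_representation(x, W, k) for x in X])

def predict(x_query):
    """Locally weighted average classifier: weight each training label by the
    overlap between its EaS code and the query's code, then threshold at 1/2."""
    q = eas_representation(x_query, W, k)
    w = codes @ q                       # code overlaps act as local weights
    eta_hat = w @ y / max(w.sum(), 1.0)  # estimate of the conditional probability
    return float(eta_hat > 0.5)
```

Because two nearby points on the sphere tend to activate the same top-$k$ projection directions, the overlap `codes @ q` behaves like a local kernel, which is what makes the classifier a locally weighted average.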