

Poster

A Unified Near-Optimal Estimator For Dimension Reduction in $l_\alpha$ ($0<\alpha\leq 2$) Using Stable Random Projections

Ping Li · Trevor Hastie


Abstract: Many tasks (e.g., clustering) in machine learning only require the $l_\alpha$ distances instead of the original data. For dimension reduction in the $l_\alpha$ norm ($0<\alpha\leq 2$), the method of {\em stable random projections} can efficiently compute the $l_\alpha$ distances in massive datasets (e.g., the Web or massive data streams) in one pass over the data. The estimation task for {\em stable random projections} has been an interesting topic. We propose a simple estimator based on the {\em fractional power} of the samples (projected data), which is surprisingly near-optimal in terms of the asymptotic variance. In fact, it achieves the Cram\'er-Rao bound when $\alpha = 2$ and $\alpha = 0+$. This new result will be useful when applying {\em stable random projections} to distance-based clustering, classification, kernels, massive data streams, etc.
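A minimal sketch of the idea for the special case $\alpha = 2$, where the stable distribution is Gaussian and the fractional-power estimator reduces to the sample mean of squared projected differences (the case where the abstract notes the Cram\'er-Rao bound is attained). The data sizes, seed, and variable names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
D, k = 10000, 200          # original dimension D, number of projections k (illustrative)

# Two high-dimensional points whose l_2 distance we want to recover.
x1 = rng.standard_normal(D)
x2 = rng.standard_normal(D)

# For alpha = 2, the alpha-stable projection matrix is simply Gaussian.
R = rng.standard_normal((D, k))

# Projected data (in streaming settings this can be maintained in one pass).
v1 = x1 @ R
v2 = x2 @ R

# Estimate the squared l_2 distance from the k projected differences:
# each entry of (v1 - v2) is N(0, d) with d = ||x1 - x2||^2, so the
# mean of squares is an unbiased estimator of d.
diff = v1 - v2
d_est = np.mean(diff ** 2)
d_true = np.sum((x1 - x2) ** 2)
```

With $k = 200$ projections the relative error of `d_est` is on the order of $\sqrt{2/k} \approx 10\%$; for general $\alpha < 2$ the same pipeline applies with $\alpha$-stable projection entries and a fractional-power moment of `diff` in place of the mean of squares.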
