Poster
A Theory-Based Evaluation of Nearest Neighbor Models Put Into Practice
Hendrik Fichtenberger · Dennis Rohde
Room 210 #91
Keywords: [ Learning Theory ] [ Classification ] [ Computational Complexity ]
Abstract:
In the k-nearest neighborhood model (k-NN), we are given a set of points P, and we answer queries q by returning the k nearest neighbors of q in P according to some metric. This concept is crucial in many areas of data analysis and data processing, e.g., computer vision, document retrieval and machine learning. Many k-NN algorithms have been published and implemented, but often the relation between parameters and accuracy of the computed k-NN is not explicit. We study property testing of k-NN graphs in theory and evaluate it empirically: given a point set P and a directed graph G, is G a k-NN graph, i.e., does every point have outgoing edges to its k nearest neighbors, or is it ε-far from being a k-NN graph? Here, ε-far means that one has to change more than an ε-fraction of the edges in order to make G a k-NN graph. We develop a randomized algorithm with one-sided error that decides this question, i.e., a property tester for the k-NN property, with complexity measured in terms of the number of vertices and edges it inspects, and we prove a lower bound for this problem. We evaluate our tester empirically on the k-NN models computed by various algorithms and show that it can be used to detect k-NN models with bad accuracy in significantly less time than the building time of the k-NN model.
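To make the tested property concrete, here is a minimal Python sketch of a randomized spot-checker with one-sided error for the k-NN graph property. It is not the authors' tester and makes no claim about their complexity bounds: it simply samples vertices, recomputes their true k nearest neighbors by brute force, and rejects only when it finds a concrete violation, so a correct k-NN graph is always accepted. The sample-size choice is a heuristic assumption for illustration.

```python
# Hypothetical sketch (not the paper's algorithm): a naive randomized
# spot-checker with one-sided error for the k-NN graph property.
import math
import random


def is_knn_at(points, graph, v, k):
    """Do v's outgoing edges point only to points within its k-th nearest distance?"""
    dists = sorted(
        (math.dist(points[v], points[u]), u)
        for u in range(len(points)) if u != v
    )
    kth_dist = dists[k - 1][0]
    return all(math.dist(points[v], points[u]) <= kth_dist for u in graph[v])


def spot_check_knn(points, graph, k, eps, seed=None):
    """Accept if no violation is found among sampled vertices; reject otherwise.

    One-sided error: a correct k-NN graph is never rejected, because the
    checker rejects only after finding an explicit violating vertex.
    """
    rng = random.Random(seed)
    n = len(points)
    samples = min(n, max(1, math.ceil(2 / eps)))  # heuristic sample size
    for v in rng.sample(range(n), samples):
        if len(graph[v]) != k or not is_knn_at(points, graph, v, k):
            return False  # concrete violation found
    return True


# Example: a correct 1-NN graph on three points in the plane.
pts = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
g = {0: [1], 1: [0], 2: [1]}
print(spot_check_knn(pts, g, k=1, eps=0.5))  # True
```

Note that this brute-force check inspects all points for each sampled vertex; the point of the paper's property tester is to decide the question while inspecting far fewer vertices and edges.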