

Poster

Entropic Graph Regularization in Non-Parametric Semi-Supervised Classification

Amarnag Subramanya · Jeffrey A Bilmes


Abstract:

We prove certain theoretical properties of a graph-regularized transductive learning objective that is based on minimizing a Kullback-Leibler divergence-based loss. These include showing that the iterative alternating minimization procedure used to minimize the objective converges to the correct solution, and deriving a test for convergence. We also propose a graph node ordering algorithm that is cache-cognizant and leads to a linear speedup in parallel computations, which ensures that the algorithm scales to large data sets. Through empirical evaluation on the TIMIT and Switchboard I corpora, we show that this approach outperforms other state-of-the-art SSL approaches. In one instance, we solve a problem on a 120 million node graph.
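To make the alternating minimization concrete, the sketch below implements one plausible instance of such a KL-divergence-based graph-regularized objective: a labeled-node KL term, a pairwise KL smoothness term weighted by the graph, and an entropy regularizer, minimized by alternating closed-form updates over two coupled sets of node distributions. The function name `measure_propagation_sketch`, the parameter names `mu` and `nu`, and the exact form of the objective and updates are assumptions made for illustration, not the authors' precise formulation or code.

```python
import numpy as np

def measure_propagation_sketch(W, R, labeled_mask, mu=1.0, nu=0.01, n_iters=50):
    """Alternating-minimization sketch for a KL-based graph-regularized SSL
    objective (an assumed, simplified form; constants and notation are not
    taken from the paper).

    W            : (n, n) symmetric non-negative graph weight matrix
    R            : (n, C) reference label distributions (rows of unlabeled nodes ignored)
    labeled_mask : (n,) boolean, True for labeled nodes
    Returns      : (n, C) predicted label distributions for all nodes
    """
    n, C = R.shape
    delta = labeled_mask.astype(float)[:, None]     # 1 for labeled nodes, else 0
    p = np.full((n, C), 1.0 / C)                    # primary node distributions
    q = np.full((n, C), 1.0 / C)                    # auxiliary node distributions
    deg = W.sum(axis=1, keepdims=True)              # weighted degrees

    for _ in range(n_iters):
        # q-step: each q_i is the normalized weighted average of its label
        # distribution (if labeled) and its neighbors' current p_j.
        q = delta * R + mu * (W.T @ p)
        q /= q.sum(axis=1, keepdims=True)

        # p-step: p_i is proportional to the exponentiated, degree- and
        # entropy-scaled weighted average of log q_j over neighbors j.
        logits = mu * (W @ np.log(q + 1e-12)) / (nu + mu * deg)
        logits -= logits.max(axis=1, keepdims=True)  # numerical stabilization
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)

    return p

if __name__ == "__main__":
    # Toy example: a 5-node chain with the two endpoints labeled differently.
    W = np.zeros((5, 5))
    for i in range(4):
        W[i, i + 1] = W[i + 1, i] = 1.0
    R = np.zeros((5, 2))
    R[0, 0] = 1.0   # node 0 labeled class 0
    R[4, 1] = 1.0   # node 4 labeled class 1
    labeled = np.array([True, False, False, False, True])
    print(measure_propagation_sketch(W, R, labeled))
```

Each alternating step here has a closed-form solution over the probability simplex, which is what makes this style of objective attractive for large graphs; the convergence guarantees and the cache-cognizant node ordering discussed in the abstract address, respectively, the correctness and the parallel scalability of iterating such updates.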
