Poster
On the Algorithmics and Applications of a Mixed-norm based Kernel Learning Formulation
Saketha Nath Jagarlapudi · Dinesh Govindaraj · Raman S · Chiranjib Bhattacharyya · Aharon Ben-Tal · K. R. Ramakrishnan
Abstract:
Motivated by real-world problems such as object categorization, we study a particular mixed-norm regularization for Multiple Kernel Learning (MKL). It is assumed that the given set of kernels is grouped into distinct components, each of which is crucial for the learning task at hand. The formulation therefore employs $l_\infty$ regularization to promote combinations at the component level and $l_1$ regularization to promote sparsity among the kernels within each component. While previous attempts have formulated this as a non-convex problem, the formulation given here is an instance of a non-smooth convex optimization problem, which admits an efficient Mirror-Descent (MD) based procedure. The MD procedure optimizes over a product of simplices, a case that has not been well studied in the literature. Results on real-world datasets show that the new MKL formulation is well suited for object categorization tasks and that the MD based algorithm outperforms state-of-the-art MKL solvers such as \texttt{simpleMKL} in terms of computational effort.
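For intuition, the sketch below shows generic entropic mirror descent over a product of simplices, the geometry the MD procedure optimizes over: the entropic MD step on a simplex reduces to a multiplicative (exponentiated-gradient) update followed by per-block renormalization. This is not the authors' implementation; the `grad_fn` interface and the toy quadratic objective are illustrative assumptions standing in for the actual MKL dual gradient.

```python
import numpy as np

def mirror_descent_product_simplex(grad_fn, blocks, n_iters=200, step=0.5):
    """Entropic mirror descent over a product of simplices.

    blocks  : list of block sizes; the iterate concatenates one
              probability vector per block.
    grad_fn : returns a (sub)gradient of the objective at x.
              (Hypothetical interface; in the paper's setting the
              gradient would come from the MKL dual problem.)
    """
    # Start from the uniform distribution on each simplex.
    x = np.concatenate([np.full(b, 1.0 / b) for b in blocks])
    offsets = np.cumsum([0] + list(blocks))
    for t in range(1, n_iters + 1):
        g = grad_fn(x)
        # Entropic MD step: multiplicative update with a decaying step size.
        x = x * np.exp(-(step / np.sqrt(t)) * g)
        # Renormalize each block, i.e. project back onto each simplex.
        for lo, hi in zip(offsets[:-1], offsets[1:]):
            x[lo:hi] /= x[lo:hi].sum()
    return x

# Toy usage: minimize 0.5 * x^T A x over two simplices of sizes 3 and 2.
if __name__ == "__main__":
    A = np.diag([3.0, 1.0, 2.0, 0.5, 4.0])
    w = mirror_descent_product_simplex(lambda x: A @ x, blocks=[3, 2])
    print(w)  # mass concentrates on the smallest diagonal entry per block
```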