Poster

A Risk Minimization Principle for a Class of Parzen Estimators

Kristiaan Pelckmans · Johan Suykens · Bart De Moor


Abstract: This paper explores the use of a Maximal Average Margin (MAM) optimality principle for the design of learning algorithms. It is shown that applying this risk minimization principle yields a class of computationally simple learning machines similar to the classical Parzen window classifier. A direct relation with Rademacher complexities is established, thereby facilitating analysis and providing a notion of certainty of prediction. This analysis is related to Support Vector Machines by means of a margin transformation. The power of the MAM principle is further illustrated by its application to ordinal regression tasks, resulting in an O(n) algorithm able to process large datasets in reasonable time.
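As a rough illustration of the connection stated in the abstract, the following sketch shows how maximizing the average margin over a unit-norm hypothesis in a kernel-induced feature space leads to a Parzen-window-style decision rule: the classifier reduces to the sign of a kernel-weighted average of the training labels. This is a hedged, generic reconstruction of that textbook relationship, not the authors' actual implementation; the function names, the RBF kernel choice, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def mam_parzen_predict(X_train, y_train, X_test, gamma=1.0):
    """Parzen-window-style classifier obtained from the maximal average
    margin principle: maximizing (1/n) * sum_i y_i <w, phi(x_i)> subject
    to ||w|| <= 1 gives w proportional to sum_i y_i phi(x_i), so the
    decision rule is the sign of a kernel-weighted label average.
    Evaluation costs O(n) kernel computations per test point."""
    K = rbf_kernel(X_test, X_train, gamma)        # shape (m, n)
    scores = K @ y_train / len(y_train)           # average margin per test point
    return np.sign(scores), scores

# Illustrative usage on two well-separated synthetic Gaussian blobs
# (assumed data, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2.0, 0.5, (50, 2)),
               rng.normal(-2.0, 0.5, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
preds, scores = mam_parzen_predict(X, y, X, gamma=0.5)
```

The magnitude of `scores` can serve as the "notion of certainty of prediction" mentioned in the abstract: test points whose kernel-weighted label average is close to zero sit near the decision boundary.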