Poster
A Risk Minimization Principle for a Class of Parzen Estimators
Kristiaan Pelckmans · Johan Suykens · Bart De Moor
This paper explores the use of a Maximal Average Margin (MAM) optimality principle for the design of learning algorithms. It is shown that applying this risk minimization principle yields a class of computationally simple learning machines similar to the classical Parzen window classifier. A direct relation with Rademacher complexities is established, facilitating analysis and providing a notion of certainty of prediction. This analysis is related to Support Vector Machines by means of a margin transformation. The power of the MAM principle is further illustrated by application to ordinal regression tasks, resulting in an $O(n)$ algorithm able to process large datasets in reasonable time.
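The Parzen-window-type rule the abstract alludes to can be sketched as follows: the classifier predicts the sign of the average label-weighted kernel evaluation over the training set, and the magnitude of that average can serve as a rough certainty score. This is a minimal illustrative sketch, not the paper's exact estimator; the RBF kernel, the bandwidth value, and the toy data are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(X, Z, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mam_predict(X_train, y_train, X_test, bandwidth=1.0):
    """Parzen-window-style classifier: sign of the average
    label-weighted kernel evaluation over the training set.
    Labels in y_train are assumed to be in {-1, +1}.
    Cost is O(n) kernel evaluations per test point."""
    K = rbf_kernel(X_test, X_train, bandwidth)      # (n_test, n_train)
    scores = K @ y_train / len(y_train)             # average margin per test point
    return np.sign(scores), scores

# Toy usage: two well-separated Gaussian blobs (illustrative data only).
rng = np.random.default_rng(0)
X_pos = rng.normal(loc=+2.0, size=(50, 2))
X_neg = rng.normal(loc=-2.0, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(50), -np.ones(50)])

preds, scores = mam_predict(X, y, np.array([[2.0, 2.0], [-2.0, -2.0]]))
```

Because prediction is a single pass over the training set, the rule needs no optimization step, which is consistent with the computational simplicity the abstract claims; the absolute value of `scores` gives the hedged-certainty reading mentioned there.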
Author Information
Kristiaan Pelckmans (Uppsala University)
Johan Suykens (KU Leuven)
Bart De Moor
More from the Same Authors
- 2022 Poster: On the Double Descent of Random Features Models Trained with SGD »
  Fanghui Liu · Johan Suykens · Volkan Cevher
- 2020 Poster: A Theoretical Framework for Target Propagation »
  Alexander Meulemans · Francesco Carzaniga · Johan Suykens · João Sacramento · Benjamin F. Grewe
- 2020 Spotlight: A Theoretical Framework for Target Propagation »
  Alexander Meulemans · Francesco Carzaniga · Johan Suykens · João Sacramento · Benjamin F. Grewe
- 2010 Workshop: Tensors, Kernels, and Machine Learning »
  Tamara G Kolda · Vicente Malave · David F Gleich · Johan Suykens · Marco Signoretto · Andreas Argyriou
- 2008 Workshop: New Challenges in Theoretical Machine Learning: Data Dependent Concept Spaces »
  Maria-Florina F Balcan · Shai Ben-David · Avrim Blum · Kristiaan Pelckmans · John Shawe-Taylor