We propose DirectBoost, a boosting method based on a greedy coordinate descent algorithm that builds an ensemble of weak classifiers by directly minimizing the empirical classification error over labeled training examples. Once the training classification error is reduced to a local coordinatewise minimum, DirectBoost runs a greedy coordinate ascent algorithm that continues to add weak classifiers to maximize any targeted, arbitrarily defined margins until it reaches a local coordinatewise maximum of the margins in a certain sense. Experimental results on a collection of machine-learning benchmark datasets show that DirectBoost gives consistently better results than AdaBoost, LogitBoost, LPBoost with column generation, and BrownBoost, and that it is noise tolerant when it maximizes an nth-order bottom sample margin.
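To make the two-phase procedure concrete, here is a minimal sketch of a greedy coordinate search in the spirit of the abstract: a descent phase on the empirical 0-1 error followed by an ascent phase on the bottom (minimum) sample margin, the simplest case of the nth-order bottom sample margin. The function name `direct_boost_sketch`, the precomputed weak-classifier output matrix `H`, and the grid-based line search over `steps` are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def direct_boost_sketch(H, y, max_rounds=200, steps=np.linspace(0.05, 1.0, 20)):
    """Two-phase greedy coordinate search (illustrative sketch only).

    H : (n_samples, n_weak) matrix of weak-classifier outputs in {-1, +1}
    y : (n_samples,) labels in {-1, +1}
    Phase 1: greedy coordinate descent on the empirical 0-1 training error.
    Phase 2: greedy coordinate ascent on the minimum normalized sample margin.
    The coordinatewise line search is approximated by a grid over `steps`,
    which is a simplification of the exact search described in the paper.
    """
    n_samples, n_weak = H.shape
    alpha = np.zeros(n_weak)          # ensemble weights (the coordinates)
    score = H @ alpha                 # current ensemble score for each sample

    def zero_one_error(s):
        # empirical classification error of sign(s) against the labels
        return np.mean(np.sign(s) != y)

    def bottom_margin(s, a):
        # smallest normalized sample margin min_i y_i f(x_i) / ||a||_1
        z = a.sum()
        return np.min(y * s) / z if z > 0 else -np.inf

    # Phase 1: reduce training classification error coordinate by coordinate.
    for _ in range(max_rounds):
        best = (zero_one_error(score), None, None)
        for j in range(n_weak):
            for eta in steps:
                err = zero_one_error(score + eta * H[:, j])
                if err < best[0]:
                    best = (err, j, eta)
        if best[1] is None:           # local coordinatewise minimum reached
            break
        alpha[best[1]] += best[2]
        score += best[2] * H[:, best[1]]

    # Phase 2: keep adding weak classifiers to push up the bottom margin.
    for _ in range(max_rounds):
        best = (bottom_margin(score, alpha), None, None)
        for j in range(n_weak):
            for eta in steps:
                a_try = alpha.copy()
                a_try[j] += eta
                m = bottom_margin(score + eta * H[:, j], a_try)
                if m > best[0]:
                    best = (m, j, eta)
        if best[1] is None:           # local coordinatewise maximum reached
            break
        alpha[best[1]] += best[2]
        score += best[2] * H[:, best[1]]

    return alpha
```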
Author Information
Shaodan Zhai (Wright State University)
Tian Xia (Wright State University)
Ming Tan (Wright State University)
Shaojun Wang (Wright State University)
More from the Same Authors
- 2009 Poster: A Rate Distortion Approach for Semi-Supervised Conditional Random Fields
  Yang Wang · Gholamreza Haffari · Shaojun Wang · Greg Mori
- 2006 Poster: Learning to Model Spatial Dependency: Semi-Supervised Discriminative Random Fields
  Chi-Hoon Lee · Shaojun Wang · Feng Jiao · Dale Schuurmans · Russell Greiner
- 2006 Poster: Implicit Online Learning with Kernels
  Li Cheng · S. V. N. Vishwanathan · Dale Schuurmans · Shaojun Wang · Terry Caelli