Poster
Relaxed Clipping: A Global Training Method for Robust Regression and Classification
Yao-Liang Yu · Min Yang · Linli Xu · Martha White · Dale Schuurmans

Tue Dec 07 12:00 AM -- 12:00 AM (PST)

Robust regression and classification are often thought to require non-convex loss functions that prevent scalable, global training. However, such a view neglects the possibility of reformulated training methods that can yield practically solvable alternatives. A natural way to make a loss function more robust to outliers is to truncate loss values that exceed a maximum threshold. We demonstrate that a relaxation of this form of "loss clipping" can be made globally solvable and applicable to any standard loss while guaranteeing robustness against outliers. We present a generic procedure that can be applied to standard loss functions and demonstrate improved robustness in regression and classification problems.
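To make the notion of loss clipping concrete, the sketch below illustrates the naive truncated loss the abstract describes: per-example losses are capped at a threshold so a single outlier cannot dominate training. This is only the non-convex starting point, not the paper's relaxed, globally solvable formulation; the squared loss and the threshold value tau are illustrative choices, not taken from the paper.

```python
import numpy as np

def clipped_loss(residuals, tau=1.0):
    """Naive loss clipping: cap each example's squared loss at tau.

    This is the truncation the abstract refers to; the paper's
    contribution is a relaxation of it that admits global training.
    Both the squared loss and tau=1.0 are illustrative assumptions.
    """
    losses = 0.5 * residuals ** 2       # standard squared loss
    return np.minimum(losses, tau)      # truncate values above tau

# An outlier contributes at most tau, so it cannot dominate the objective.
residuals = np.array([0.1, -0.3, 50.0])  # last entry is an outlier
print(clipped_loss(residuals))           # -> [0.005 0.045 1.   ]
```

Note that the elementwise minimum makes the clipped loss non-convex even when the underlying loss is convex, which is exactly the obstacle the paper's relaxation is designed to remove.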

Author Information

Yao-Liang Yu (University of Waterloo)
Min Yang (University of Alberta)
Linli Xu (University of Science and Technology of China)
Martha White (University of Alberta)
Dale Schuurmans (Google Brain & University of Alberta)
