

Poster

Smooth And Consistent Probabilistic Regression Trees

Sami Alkhoury · Emilie Devijver · Marianne Clausel · Myriam Tami · Eric Gaussier · Georges Oppenheim

Poster Session 5 #1614

Abstract:

We propose a generalization of regression trees, referred to as Probabilistic Regression (PR) trees, that adapts to the smoothness of the prediction function relating input and output variables while preserving the interpretability of the prediction and remaining robust to noise. In PR trees, an observation is associated with all regions of a tree through a probability distribution that reflects how far the observation is from each region. We show that such trees are consistent, meaning that their error tends to 0 as the sample size tends to infinity, a property that has not been established for similar previous proposals such as Soft trees and Smooth Transition Regression trees. We further explain how PR trees can be used in different ensemble methods, namely Random Forests and Gradient Boosted Trees. Lastly, we assess their performance through extensive experiments that illustrate their benefits in terms of accuracy, interpretability, and robustness to noise.
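To make the core idea concrete, here is a minimal illustrative sketch (not the authors' exact formulation) of a "soft" one-dimensional tree: each leaf is an interval with a mean prediction, an observation is assigned to every leaf with a probability that decays with its distance to that leaf's region (a Gaussian kernel and the bandwidth `sigma` are assumptions), and the prediction is the probability-weighted average of leaf means.

```python
import numpy as np

# Hypothetical example: three leaves of a 1-D tree, each an interval
# with its own mean prediction. Intervals and means are made up.
leaves = [
    {"interval": (-np.inf, 0.0), "mean": 1.0},
    {"interval": (0.0, 2.0), "mean": 3.0},
    {"interval": (2.0, np.inf), "mean": 5.0},
]

def distance_to_interval(x, lo, hi):
    """Distance from scalar x to the interval [lo, hi]; 0 if x is inside."""
    if x < lo:
        return lo - x
    if x > hi:
        return x - hi
    return 0.0

def soft_tree_predict(x, leaves, sigma=0.5):
    """Weight each leaf by a probability that decays with the
    observation's distance to the leaf's region, then average
    the leaf means under those probabilities."""
    weights = np.array([
        np.exp(-distance_to_interval(x, *leaf["interval"]) ** 2
               / (2.0 * sigma ** 2))
        for leaf in leaves
    ])
    weights /= weights.sum()  # normalize into a probability distribution
    means = np.array([leaf["mean"] for leaf in leaves])
    return float(weights @ means)
```

Unlike a hard tree, whose prediction jumps at region boundaries, this soft assignment produces a prediction that varies smoothly in `x`, which is the property the smoothness adaptation and consistency analysis build on.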
