

Poster

Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting

Jun Shu · Qi Xie · Lixuan Yi · Qian Zhao · Sanping Zhou · Zongben Xu · Deyu Meng

East Exhibition Hall B + C #48

Keywords: [ Object Recognition ] [ Algorithms -> Classification ] [ Algorithms -> Meta-Learning ] [ Applications ] [ Algorithms ] [ Semi-Supervised Learning ]


Abstract:

Current deep neural networks (DNNs) can easily overfit biased training data containing corrupted labels or class imbalance. Sample re-weighting strategies are commonly used to alleviate this issue: a weighting function mapping training loss to sample weight is designed, and training then iterates between weight recalculation and classifier updating. Current approaches, however, require manually pre-specifying the weighting function as well as its additional hyper-parameters, which makes them fairly hard to apply in practice, since the proper weighting scheme varies significantly with the investigated problem and training data. To address this issue, we propose a method capable of adaptively learning an explicit weighting function directly from data. The weighting function is an MLP with one hidden layer, constituting a universal approximator of almost any continuous function, which enables the method to fit a wide range of weighting function forms, including those assumed in conventional research. Guided by a small amount of unbiased meta-data, the parameters of the weighting function are updated simultaneously with the learning process of the classifier. Synthetic and real experiments substantiate the capability of our method to learn proper weighting functions in class-imbalance and noisy-label cases, fully complying with the common settings in traditional methods, as well as in more complicated scenarios beyond conventional cases. This naturally leads to better accuracy than other state-of-the-art methods.
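The sketch below illustrates the two ideas the abstract describes: a one-hidden-layer MLP mapping each sample's loss to a weight, and one meta-update of that MLP guided by a small unbiased meta set. It is a minimal sketch assuming PyTorch (>= 2.0 for torch.func); the names MetaWeightNet and meta_step, the hidden size, and the inner learning rate are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

class MetaWeightNet(nn.Module):
    """One-hidden-layer MLP mapping a per-sample loss to a weight in (0, 1).

    Hidden size of 100 is an illustrative choice, not from the abstract.
    """
    def __init__(self, hidden=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, loss):  # loss: (batch, 1)
        return self.net(loss)

def meta_step(classifier, weight_net, theta_opt,
              x_tr, y_tr, x_meta, y_meta, inner_lr=0.1):
    """One meta-update of the weighting function (a sketch of the bi-level step).

    Assumes `classifier` has no buffers (e.g. a plain MLP); BatchNorm buffers
    would need to be passed to functional_call as well.
    """
    # (1) Per-sample training losses and their learned weights.
    #     The loss fed to the weight net is detached, so gradients reach
    #     the weight net only through the weights it outputs.
    losses = F.cross_entropy(classifier(x_tr), y_tr, reduction="none")
    w = weight_net(losses.detach().unsqueeze(1)).squeeze(1)

    # (2) One differentiable (pseudo) SGD step on the classifier, keeping
    #     the graph (create_graph=True) so the meta-gradient can flow back.
    weighted_loss = (w * losses).mean()
    params = dict(classifier.named_parameters())
    grads = torch.autograd.grad(weighted_loss, params.values(),
                                create_graph=True)
    fast = {name: p - inner_lr * g
            for (name, p), g in zip(params.items(), grads)}

    # (3) Meta-loss on the small unbiased meta set, evaluated at the
    #     pseudo-updated classifier; backprop updates the weight net only.
    meta_logits = functional_call(classifier, fast, (x_meta,))
    meta_loss = F.cross_entropy(meta_logits, y_meta)
    theta_opt.zero_grad()
    meta_loss.backward()
    theta_opt.step()
    # Clear any gradients this backward pass accumulated on the classifier,
    # since its own update happens separately with refreshed weights.
    classifier.zero_grad()
```

After each such meta step, the classifier itself would be updated with sample weights recomputed from the refreshed weight net, so that the two networks are trained in alternation, matching the iterate-between-weighting-and-classifier-updating scheme the abstract describes.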
