Poster

DropMax: Adaptive Variational Softmax

Hae Beom Lee · Juho Lee · Saehoon Kim · Eunho Yang · Sung Ju Hwang

Room 210 #76

Keywords: [ CNN Architectures ] [ Object Recognition ] [ Supervised Deep Networks ] [ Variational Inference ]


Abstract:

We propose DropMax, a stochastic version of the softmax classifier that, at each iteration, drops non-target classes according to dropout probabilities adaptively decided for each instance. Specifically, we overlay binary masking variables over class output probabilities, which are input-adaptively learned via variational inference. This stochastic regularization has the effect of building an ensemble classifier out of exponentially many classifiers with different decision boundaries. Moreover, learning the dropout rates of the non-target classes for each instance allows the classifier to focus on classification against the most confusing classes. We validate our model on multiple public classification datasets, on which it obtains significantly improved accuracy over the regular softmax classifier and other baselines. Further analysis of the learned dropout probabilities shows that our model indeed selects confusing classes more often when it performs classification.
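To make the mechanism concrete, below is a minimal sketch of such an input-adaptive masked softmax, assuming a PyTorch setup. The module and layer names, and the use of a binary Concrete (relaxed Bernoulli) sample for the masks, are illustrative assumptions, not the authors' exact implementation or variational objective.

import torch
import torch.nn as nn

class DropMaxSketch(nn.Module):
    # Sketch of an adaptive stochastic softmax in the spirit of the abstract:
    # a second head predicts per-instance retain probabilities rho for each
    # class, non-target classes are stochastically masked, and the softmax is
    # taken over the surviving logits.
    def __init__(self, in_dim, num_classes, tau=0.5):
        super().__init__()
        self.score = nn.Linear(in_dim, num_classes)   # class logits o(x)
        self.retain = nn.Linear(in_dim, num_classes)  # per-instance retain probs
        self.tau = tau                                # relaxation temperature

    def forward(self, h, target=None):
        o = self.score(h)                             # (B, C) class logits
        rho = torch.sigmoid(self.retain(h))           # (B, C) retain probabilities
        if self.training:
            # Relaxed Bernoulli (binary Concrete) sample so gradients can flow
            # through the masks; a hard Bernoulli draw is not differentiable.
            u = torch.rand_like(rho).clamp(1e-6, 1 - 1e-6)
            logit = (rho + 1e-6).log() - (1 - rho + 1e-6).log()
            z = torch.sigmoid((logit + u.log() - (1 - u).log()) / self.tau)
            if target is not None:
                # Never drop the target class of each instance.
                z = z.scatter(1, target.unsqueeze(1), 1.0)
        else:
            z = rho  # at test time, use the expected masks instead of samples
        # Masked softmax over the retained classes (max-subtracted for stability).
        w = z * torch.exp(o - o.max(dim=1, keepdim=True).values) + 1e-20
        return w / w.sum(dim=1, keepdim=True)

With hypothetical features of width 512 and 10 classes, probs = DropMaxSketch(512, 10)(features, labels) yields class probabilities in which low-rho non-target classes are stochastically excluded during training, so each forward pass samples one classifier from the implicit ensemble.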
