

Poster

Rankmax: An Adaptive Projection Alternative to the Softmax Function

Weiwei Kong · Walid Krichene · Nicolas E Mayoraz · Steffen Rendle · Li Zhang

Poster Session 0 #0

Keywords: [ Learning Theory ] [ Theory ] [ Algorithms ] [ Unsupervised Learning ]


Abstract:

Several machine learning models involve mapping a score vector to a probability vector. Usually, this is done by projecting the score vector onto a probability simplex, and such projections are often characterized as Lipschitz continuous approximations of the argmax function, whose Lipschitz constant is controlled by a parameter that is similar to a softmax temperature. The aforementioned parameter has been observed to affect the quality of these models and is typically either treated as a constant or decayed over time. In this work, we propose a method that adapts this parameter to individual training examples. The resulting method exhibits desirable properties, such as sparsity of its support and numerically efficient implementation, and we find that it can significantly outperform competing non-adaptive projection methods.
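The abstract contrasts dense, softmax-style maps (where a temperature-like parameter controls how closely the projection approximates argmax) with projections onto the probability simplex that can have sparse support. The paper's own Rankmax projection is not reproduced here; the following minimal NumPy sketch only illustrates the general setup the abstract describes, pairing a temperature-controlled softmax with a Euclidean (sparsemax-style) simplex projection. Function names and the example scores are illustrative assumptions, not part of the paper.

```python
import numpy as np

def softmax(scores, temperature=1.0):
    """Temperature-controlled softmax: a dense map onto the simplex.

    Lower temperatures sharpen the output toward argmax, but every
    entry stays strictly positive.
    """
    z = scores / temperature
    z = z - z.max()                # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def project_to_simplex(scores):
    """Euclidean projection of a score vector onto the probability simplex.

    Unlike softmax, this sparsemax-style projection can assign exactly
    zero probability to low-scoring entries, i.e. it has sparse support.
    """
    z = np.sort(scores)[::-1]                  # scores sorted descending
    cssv = np.cumsum(z) - 1.0                  # cumulative sums minus the simplex constraint
    ind = np.arange(1, len(scores) + 1)
    cond = z - cssv / ind > 0                  # entries that remain in the support
    rho = ind[cond][-1]                        # size of the support
    tau = cssv[cond][-1] / rho                 # threshold shifting scores onto the simplex
    return np.maximum(scores - tau, 0.0)

scores = np.array([2.0, 1.0, 0.1, -1.0])
print(softmax(scores, temperature=0.5))        # dense: all entries positive
print(project_to_simplex(scores))              # sparse: [1., 0., 0., 0.]
```

Both outputs sum to one; the second shows the sparsity-of-support property the abstract highlights, while the temperature argument of the first plays the role of the parameter that the proposed method adapts per training example.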
