Poster

On the Downstream Performance of Compressed Word Embeddings

Avner May · Jian Zhang · Tri Dao · Christopher Ré

East Exhibition Hall B, C #185

Keywords: [ Embedding Approaches ] [ Deep Learning ] [ Algorithms -> Classification ] [ Algorithms -> Kernel Methods ] [ Algorithms -> Regression ] [ Applications -> Hardware and Systems ]


Abstract:

Compressing word embeddings is important for deploying NLP models in memory-constrained settings. However, understanding what makes compressed embeddings perform well on downstream tasks is challenging: existing measures of compression quality often fail to distinguish between embeddings that perform well and those that do not. We thus propose the eigenspace overlap score as a new measure. We relate the eigenspace overlap score to downstream performance by developing generalization bounds for the compressed embeddings in terms of this score, in the context of linear and logistic regression. We then show that we can lower bound the eigenspace overlap score for a simple uniform quantization compression method, helping to explain the strong empirical performance of this method. Finally, we show that by using the eigenspace overlap score as a selection criterion between embeddings drawn from a representative set we compressed, we can efficiently identify the better performing embedding with up to 2x lower selection error rates than the next best measure of compression quality, and avoid the cost of training a separate model for each task of interest.
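As a rough illustration of the two ingredients the abstract mentions, the sketch below uniformly quantizes an embedding matrix and computes an eigenspace overlap score between the original and compressed matrices. It is a minimal sketch under stated assumptions: it takes the score to be the normalized squared Frobenius norm ||U^T Ū||_F^2 / max(d, d̄), with U and Ū the left singular vectors of the uncompressed and compressed matrices, and the `uniform_quantize` helper is a generic, hypothetical quantization scheme rather than the authors' exact method.

```python
import numpy as np

def uniform_quantize(X, num_bits=2):
    """Uniformly quantize a matrix onto 2**num_bits evenly spaced levels.

    A generic sketch of uniform quantization; the paper's exact clipping
    and scaling choices may differ.
    """
    levels = 2 ** num_bits
    lo, hi = X.min(), X.max()
    step = (hi - lo) / (levels - 1)
    # Snap each entry to the nearest grid point in [lo, hi].
    return lo + step * np.round((X - lo) / step)

def eigenspace_overlap(X, X_bar):
    """Overlap between the column spans of X and X_bar, in [0, 1].

    Uses the left singular vectors of each matrix and returns
    ||U^T U_bar||_F^2 / max(d, d_bar).
    """
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    U_bar, _, _ = np.linalg.svd(X_bar, full_matrices=False)
    d, d_bar = U.shape[1], U_bar.shape[1]
    return np.linalg.norm(U.T @ U_bar, ord="fro") ** 2 / max(d, d_bar)

# Toy usage: random "GloVe-like" embeddings, quantized to 2 bits per entry.
rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 300))
X_q = uniform_quantize(X, num_bits=2)
print(eigenspace_overlap(X, X_q))
```

The score equals 1 when the two column spans coincide, so a value near 1 suggests the compressed embeddings preserve the subspace that a downstream linear or logistic regression model can exploit.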
