Poster

Generative Neural Machine Translation

Harshil Shah · David Barber

Room 210 #17

Keywords: [ Latent Variable Models ] [ Natural Language Processing ] [ Representation Learning ] [ Semi-Supervised Learning ] [ Variational Inference ]


Abstract:

We introduce Generative Neural Machine Translation (GNMT), a latent variable architecture designed to model the semantics of the source and target sentences. We modify an encoder-decoder translation model by adding a latent variable as a language-agnostic representation that is encouraged to capture the meaning of the sentence. GNMT achieves competitive BLEU scores on pure translation tasks, and is superior when there are missing words in the source sentence. We augment the model to facilitate multilingual translation and semi-supervised learning without adding parameters. This framework significantly reduces overfitting when limited paired data is available, and is effective for translating between pairs of languages not seen during training.
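As a rough illustration of the kind of architecture the abstract describes (not the authors' implementation), the sketch below shows one way a latent-variable encoder-decoder could be set up in PyTorch: a Gaussian latent z is inferred from the sentence pair by an amortized inference network, both the source and target sentences are reconstructed conditioned on z, and a KL term regularizes z toward a standard-normal prior. All class names, layer sizes, and the exact factorization here are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class LatentTranslationSketch(nn.Module):
    """Hypothetical sketch of a latent-variable translation model:
    a shared latent z acts as a language-agnostic representation
    from which both sentences are decoded. Illustrative only."""

    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512, latent_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Inference network q(z | x, y): amortized Gaussian posterior.
        self.q_mu = nn.Linear(2 * hidden_dim, latent_dim)
        self.q_logvar = nn.Linear(2 * hidden_dim, latent_dim)
        # Both decoders condition on z at every step.
        self.src_decoder = nn.GRU(embed_dim + latent_dim, hidden_dim, batch_first=True)
        self.tgt_decoder = nn.GRU(embed_dim + latent_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src, tgt):
        # Encode source and target sentences into fixed summaries.
        _, h_x = self.encoder(self.embed(src))
        _, h_y = self.encoder(self.embed(tgt))
        h = torch.cat([h_x[-1], h_y[-1]], dim=-1)
        mu, logvar = self.q_mu(h), self.q_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # Broadcast z across the decoding steps of both sentences.
        z_src = z.unsqueeze(1).expand(-1, src.size(1), -1)
        z_tgt = z.unsqueeze(1).expand(-1, tgt.size(1), -1)
        src_logits = self.out(self.src_decoder(torch.cat([self.embed(src), z_src], -1))[0])
        tgt_logits = self.out(self.tgt_decoder(torch.cat([self.embed(tgt), z_tgt], -1))[0])
        # KL(q(z|x,y) || N(0, I)) pulls the latent toward the prior.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return src_logits, tgt_logits, kl
```

Under this sketch, training would maximize an evidence lower bound: the reconstruction log-likelihoods of both sentences minus the KL term. At translation time, one common choice (again an assumption here) is to infer z from the source sentence alone and decode the target from it.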