Poster
Neural Machine Translation with Soft Prototype
Yiren Wang · Yingce Xia · Fei Tian · Fei Gao · Tao Qin · Cheng Xiang Zhai · Tie-Yan Liu

Thu Dec 12th 05:00 -- 07:00 PM @ East Exhibition Hall B + C #138

Neural machine translation models usually use the encoder-decoder framework and generate translations from left to right (or right to left) without fully utilizing the target-side global information. A few recent approaches seek to exploit the global information through two-pass decoding, yet have limitations in translation quality and model efficiency. In this work, we propose a new framework that introduces a soft prototype into the encoder-decoder architecture, which allows the decoder to have indirect access to both past and future information, such that each target word can be generated based on a better global understanding. We further provide an efficient and effective method to generate the prototype. Empirical studies on various neural machine translation tasks show that our approach brings significant improvement in generation quality over the baseline model, with little extra cost in storage and inference time, demonstrating the effectiveness of our proposed framework. Specifically, we achieve state-of-the-art results on the WMT 2014, 2015, and 2017 English-to-German translation tasks.
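The soft prototype described in the abstract can be illustrated with a minimal sketch. Assuming (as a hedged interpretation, not the paper's exact formulation) that a generator network produces a probability distribution over the target vocabulary at each position, the "soft" prototype is the probability-weighted sum of target word embeddings, rather than a hard pick of the most likely word. All names and shapes below are illustrative:

```python
import numpy as np

# Hypothetical dimensions (illustrative only).
vocab_size, emb_dim, tgt_len = 8, 4, 5

rng = np.random.default_rng(0)

# Target-side word embedding matrix E: one row per vocabulary word.
E = rng.normal(size=(vocab_size, emb_dim))

# Assumed generator output: unnormalized scores over the vocabulary
# for each target position (in the paper this would come from a
# first-pass model; here it is random for demonstration).
scores = rng.normal(size=(tgt_len, vocab_size))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# G[t, v] = probability that position t is word v.
G = softmax(scores)

# Soft prototype: expected embedding at each position, G @ E.
# A "hard" prototype would instead embed argmax(G, axis=1); the soft
# version keeps information from all candidate words, giving the
# decoder an inexpensive global view of the target sentence.
prototype = G @ E

assert prototype.shape == (tgt_len, emb_dim)
```

The decoder would then attend over this prototype representation in addition to the source encoding, which is how it gains indirect access to future target-side information without a full second decoding pass.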

Author Information

Yiren Wang (University of Illinois at Urbana-Champaign)
Yingce Xia (Microsoft Research Asia)
Fei Tian (Facebook)
Fei Gao (University of Chinese Academy of Sciences)
Tao Qin (Microsoft Research)
Cheng Xiang Zhai (University of Illinois at Urbana-Champaign)
Tie-Yan Liu (Microsoft Research)
