We propose InsNet, an expressive insertion-based text generator that supports efficient training and flexible decoding (parallel or sequential). Most existing insertion-based generators must re-encode the decoding context after every insertion operation and are therefore inefficient to train. In contrast, InsNet needs only one pass of context encoding over the entire insertion sequence during training, using a novel insertion-oriented position encoding that enables computation sharing across insertion steps. Furthermore, InsNet offers a controllable switch between parallel and sequential decoding, so it can handle highly parallelizable tasks such as machine translation with efficient decoding, as well as less parallelizable tasks such as lexically constrained text generation, where sequential decoding guarantees high-quality outputs. Experiments on two unsupervised lexically constrained text generation datasets and three machine translation datasets demonstrate InsNet's advantages over previous insertion-based methods in training speed, inference efficiency, and generation quality.
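To make the computation-sharing idea concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released implementation). It shows how a single transformer pass can score every step of an insertion sequence: tokens are laid out in insertion order, a causal mask over that order restricts each step to the partial canvas built so far, and position embeddings carry each token's slot in the finished sentence. The class name `InsertionOrderEncoder`, the toy dimensions, and the use of absolute final-slot position embeddings are all illustrative assumptions; the paper's insertion-oriented position encoding is a relative scheme.

```python
# Hypothetical sketch: one-pass training over an insertion sequence.
# Tokens appear in insertion order; a causal mask over that order lets a
# single transformer pass encode every insertion step, instead of
# re-encoding the growing context after each insertion.
import torch
import torch.nn as nn

class InsertionOrderEncoder(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, nhead=4, nlayers=2, max_len=128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        # Position embedding indexed by the token's slot in the *final*
        # sequence, so the left-to-right order among already-inserted tokens
        # is consistent at every step. (Illustrative; the paper uses a
        # relative, insertion-oriented encoding instead.)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)

    def forward(self, tokens, final_positions):
        # tokens:          (B, T) token ids listed in insertion order
        # final_positions: (B, T) each token's index in the completed sentence
        T = tokens.size(1)
        x = self.tok_emb(tokens) + self.pos_emb(final_positions)
        # Causal mask over insertion order: step t attends only to tokens
        # inserted at steps <= t, i.e. the partial canvas at that step.
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        return self.encoder(x, mask=mask)

# Toy usage: "a cat sat" built by inserting "cat" (slot 1), then "a" (slot 0),
# then "sat" (slot 2); all three steps are encoded in one forward pass.
model = InsertionOrderEncoder()
tokens = torch.tensor([[17, 5, 42]])          # ids in insertion order
final_positions = torch.tensor([[1, 0, 2]])   # slots in the finished sentence
hidden = model(tokens, final_positions)       # (1, 3, 64): one state per step
```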
Author Information
Sidi Lu (University of California, Los Angeles)
Tao Meng (University of California, Los Angeles)
Nanyun Peng (University of California, Los Angeles)
More from the Same Authors
- 2022 : Parameter-Efficient Low-Resource Dialogue State Tracking by Prompt Tuning »
  Mingyu Derek Ma · Jiun-Yu Kao · Shuyang Gao · Arpit Gupta · Di Jin · Tagyoung Chung · Nanyun Peng
- 2023 Poster: DesCo: Learning Object Recognition with Rich Language Descriptions »
  Liunian Li · Zi-Yi Dou · Nanyun Peng · Kai-Wei Chang
- 2022 Poster: Coarse-to-Fine Vision-Language Pre-training with Fusion in the Backbone »
  Zi-Yi Dou · Aishwarya Kamath · Zhe Gan · Pengchuan Zhang · Jianfeng Wang · Linjie Li · Zicheng Liu · Ce Liu · Yann LeCun · Nanyun Peng · Jianfeng Gao · Lijuan Wang
- 2022 Poster: Controllable Text Generation with Neurally-Decomposed Oracle »
  Tao Meng · Sidi Lu · Nanyun Peng · Kai-Wei Chang
- 2020 : Invited talk - Creative Language Generation: Stories, Sarcasms, and Similes - Nanyun Peng »
  Nanyun Peng