
Controllable Text Generation with Neurally-Decomposed Oracle
Tao Meng · Sidi Lu · Nanyun Peng · Kai-Wei Chang

Tue Nov 29 02:00 PM -- 04:00 PM (PST) @ Hall J #425

We propose a general and efficient framework to control auto-regressive generation models with a NeurAlly-Decomposed Oracle (NADO). Given a pre-trained base language model and a sequence-level boolean oracle function, we aim to decompose the oracle function into token-level guidance that steers the base model during text generation. Specifically, the token-level guidance is provided by NADO, a neural model trained on examples sampled from the base model, requiring no additional labeled data. Based on posterior regularization, we present the closed-form optimal solution for incorporating the decomposed token-level guidance into the base model for controllable generation. We further discuss how the neural approximation affects the quality of the solution. Experiments on two applications, (1) text generation with lexical constraints and (2) machine translation with formality control, demonstrate that our framework efficiently guides the base model toward the given oracle while maintaining high generation quality.
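The closed-form combination described above can be illustrated with a minimal sketch. Here, `base_probs` stands in for the base model's next-token distribution and `nado_r` for NADO's learned estimate of the probability that a completion from a given prefix satisfies the oracle; both names, and the toy dictionary representation, are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of token-level reweighting in the style of NADO.
# Assumed interface (illustrative, not from the paper's code):
#   base_probs: dict mapping each candidate token to p(x_t | x_<t)
#   nado_r(seq): estimated probability that a completion of seq
#                satisfies the sequence-level oracle
def reweight_next_token(base_probs, prefix, nado_r):
    """Combine base model and decomposed oracle guidance:
    p'(x_t | x_<t) ∝ p(x_t | x_<t) * R(x_<t + x_t) / R(x_<t)."""
    r_prefix = nado_r(prefix)
    scores = {
        tok: p * nado_r(prefix + [tok]) / r_prefix
        for tok, p in base_probs.items()
    }
    z = sum(scores.values())  # renormalize over the vocabulary
    return {tok: s / z for tok, s in scores.items()}


if __name__ == "__main__":
    # Toy example: the oracle "wants" sequences containing token "a".
    base = {"a": 0.5, "b": 0.5}
    r = lambda seq: 0.9 if "a" in seq else 0.1
    out = reweight_next_token(base, [], r)
    print(out)  # probability mass shifts toward "a"
```

In this toy run, tokens that make the oracle more likely to be satisfied are upweighted relative to the base distribution, which is the core effect of the decomposition.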

Author Information

Tao Meng (University of California, Los Angeles)
Sidi Lu (University of California, Los Angeles)
Nanyun Peng (University of California, Los Angeles)
Kai-Wei Chang (UCLA)
