Poster
Learning Sparse Prototypes for Text Generation
Junxian He · Taylor Berg-Kirkpatrick · Graham Neubig

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #184

Prototype-driven text generation uses non-parametric models that first choose from a library of sentence "prototypes" and then modify the prototype to generate the output text. While effective, these methods are inefficient at test time because they must store and index the entire training corpus. Further, existing methods often require heuristics to identify which prototypes to reference at training time. In this paper, we propose a novel generative model that automatically learns a sparse prototype support set and nonetheless achieves strong language modeling performance. This is achieved by (1) imposing a sparsity-inducing prior on the prototype selection distribution, and (2) utilizing amortized variational inference to learn a prototype retrieval function. In experiments, our model outperforms previous prototype-driven language models while achieving up to a 1000x memory reduction and a 1000x speed-up at test time. More interestingly, we show that the learned prototypes capture semantics and syntax at different granularities as we vary the sparsity of prototype selection, and that certain sentence attributes can be controlled by specifying the prototype for generation.
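To make the two ingredients above concrete, the sketch below pairs a sparsity-inducing Dirichlet prior over prototype usage with an amortized retrieval network q(z | x). This is a minimal illustration, not the authors' released code: the `Retriever` module, the encoding dimension, the library size, and the 0.1 concentration parameter are all illustrative assumptions.

```python
# Minimal sketch of sparse prototype selection (assumed, illustrative names).
import torch
import torch.nn as nn

NUM_PROTOTYPES = 1000   # size of the candidate prototype library (assumed)
ENC_DIM = 256           # sentence-encoding dimension (assumed)

# A Dirichlet prior with concentration < 1 concentrates mass near the
# corners of the simplex, encouraging only a few prototypes to be used.
prior = torch.distributions.Dirichlet(torch.full((NUM_PROTOTYPES,), 0.1))

class Retriever(nn.Module):
    """Amortized inference network q(z | x): scores every prototype
    for an encoded input sentence, replacing heuristic matching."""
    def __init__(self):
        super().__init__()
        self.scorer = nn.Linear(ENC_DIM, NUM_PROTOTYPES)

    def forward(self, sent_enc):
        # Categorical posterior over prototype indices.
        return torch.distributions.Categorical(logits=self.scorer(sent_enc))

retriever = Retriever()
sent_enc = torch.randn(4, ENC_DIM)   # a batch of 4 encoded sentences
q_z = retriever(sent_enc)
z = q_z.sample()                     # retrieved prototype indices, shape (4,)

# During training, a KL term pulls the retrieval distribution toward the
# sparse prior over prototype usage, pruning the support set.
theta = prior.sample()               # global prototype-usage probabilities
p_z = torch.distributions.Categorical(probs=theta)
kl = torch.distributions.kl_divergence(q_z, p_z.expand(q_z.batch_shape))
```

In this reading, test-time efficiency comes from the prior driving most prototype-usage probabilities toward zero, so only the surviving prototypes need to be stored and searched rather than the full training corpus.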

Author Information

Junxian He (Carnegie Mellon University)
Taylor Berg-Kirkpatrick (University of California San Diego)
Graham Neubig (Carnegie Mellon University)
