Towards Grounded Natural Language Proof Generation
Sean Welleck · Jiacheng Liu · Yejin Choi

When a student is working on a mathematical proof, it is often helpful to receive suggestions about how to proceed. To this end, we provide an initial study of two generation tasks in natural mathematical language: suggesting the next step in a proof, and full-proof generation. As proofs are grounded in past results (e.g., theorems and definitions), we study knowledge-grounded generation methods and find that conditioning on retrieved or ground-truth knowledge greatly improves generations. We characterize error types and provide directions for future research.

Author Information

Sean Welleck (University of Washington)
Jiacheng Liu (Department of Computer Science, University of Washington)
Yejin Choi (University of Washington)