Towards Grounded Natural Language Proof Generation
Jiacheng Liu
Event URL: https://neurips.cc/virtual/2021/workshop/21828

When a student is working on a mathematical proof, it is often helpful to receive suggestions about how to proceed. To this end, we provide an initial study of two generation tasks in natural mathematical language: suggesting the next step in a proof, and full-proof generation. As proofs are grounded in past results (e.g., theorems and definitions), we study knowledge-grounded generation methods, and find that conditioning on retrieved or ground-truth knowledge greatly improves generations. We characterize error types and provide directions for future research.
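To make the grounding setup concrete, here is a minimal sketch of what "conditioning generation on retrieved knowledge" can look like in practice: retrieve the prior results most relevant to the partial proof, then prepend them to the generator's input. The TF-IDF retriever, the toy knowledge base, and the prompt markup below are illustrative assumptions for exposition only, not the retrieval method or model used in the paper.

```python
# Illustrative sketch of retrieval-grounded next-step generation.
# The retriever, corpus, and prompt format are placeholders,
# not the paper's actual pipeline.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy knowledge base of prior results (theorems/definitions).
knowledge_base = [
    "Theorem (triangle inequality): for all real x, y, |x + y| <= |x| + |y|.",
    "Definition: a function f is even if f(-x) = f(x) for all x.",
    "Theorem (AM-GM): for nonnegative a, b, (a + b) / 2 >= sqrt(a * b).",
]

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k corpus entries most similar to the query (TF-IDF cosine)."""
    vectorizer = TfidfVectorizer()
    vectors = vectorizer.fit_transform(corpus + [query])
    scores = cosine_similarity(vectors[-1], vectors[:-1]).ravel()
    top = scores.argsort()[::-1][:k]
    return [corpus[i] for i in top]

def build_grounded_input(proof_so_far: str, corpus: list[str]) -> str:
    """Prepend retrieved knowledge to the partial proof before generation."""
    references = retrieve(proof_so_far, corpus)
    context = "\n".join(f"[REF] {r}" for r in references)
    return f"{context}\n[PROOF] {proof_so_far}\n[NEXT STEP]"

if __name__ == "__main__":
    partial_proof = "Let a and b be nonnegative reals. We bound (a + b) / 2 from below."
    model_input = build_grounded_input(partial_proof, knowledge_base)
    print(model_input)  # This string would be fed to a sequence-to-sequence generator.
```

In this framing, swapping the grounding source (retrieved vs. ground-truth references) only changes what is placed in the `[REF]` context, which is what the abstract's comparison of retrieved versus ground-truth knowledge conditions on.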

Author Information

Jiacheng Liu (Department of Computer Science, University of Washington)
