
NaturalProver: Grounded Mathematical Proof Generation with Language Models
Sean Welleck · Jiacheng Liu · Ximing Lu · Hannaneh Hajishirzi · Yejin Choi

Tue Nov 29 09:00 AM -- 11:00 AM (PST) @ Hall J #513

Theorem proving in natural mathematical language (the mixture of symbolic and natural language used by humans) plays a central role in mathematical advances and education, and tests aspects of reasoning that are core to intelligence. Yet it has remained underexplored with modern generative models. We study large-scale language models on two new generation tasks: suggesting the next step in a mathematical proof, and full proof generation. We develop NaturalProver, a language model that generates proofs by conditioning on background references (e.g., theorems and definitions that are either retrieved or human-provided), and optionally enforces their presence with constrained decoding. On theorems from the NaturalProofs benchmark, NaturalProver improves the quality of next-step suggestions and generated proofs over fine-tuned GPT-3, according to human evaluations from university-level mathematics students. NaturalProver is capable of proving some theorems that require short (2-6 step) proofs, and providing next-step suggestions that are rated as correct and useful over 40% of the time, which is to our knowledge the first demonstration of these capabilities using neural language models.
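The abstract's idea of "enforcing the presence" of background references during decoding can be sketched in miniature. The snippet below is a toy illustration, not the paper's actual decoding algorithm: it assumes candidate proofs are plain strings and constraints are reference names, and it reranks candidates by how many required references they mention, breaking ties with the model's own score.

```python
def reference_coverage(candidate: str, references: list[str]) -> float:
    """Fraction of the required reference names that appear in the candidate text."""
    if not references:
        return 1.0
    hits = sum(1 for ref in references if ref in candidate)
    return hits / len(references)


def rerank_by_coverage(candidates: list[str],
                       references: list[str],
                       base_scores: list[float]) -> str:
    # Prefer candidates that cover more required references;
    # among equal coverage, fall back to the language model's score.
    return max(
        zip(candidates, base_scores),
        key=lambda cs: (reference_coverage(cs[0], references), cs[1]),
    )[0]
```

In the paper's full system, constraints are instead applied during beam search itself (rather than as a post-hoc rerank), but the objective is the same: steer generation toward proofs grounded in the provided theorems and definitions.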

Author Information

Sean Welleck (University of Washington)
Jiacheng Liu (Department of Computer Science, University of Washington)
Ximing Lu (Department of Computer Science, University of Washington)
Hannaneh Hajishirzi (University of Washington)
Yejin Choi (University of Washington)
