Plan Before You Write: Improving LLM Writing by Planning
Abstract
We present a framework that generates and executes writing plans for LLMs. The framework decomposes the writing task into multiple sub-tasks, including novel reflection steps that iteratively refine the plan, yielding a structured approach to content generation. It can be applied both to execute writing plans through multiple LLM calls and to generate training data for supervised fine-tuning. To evaluate the effectiveness of our framework, we developed automated raters trained on guideline-based, human-filtered questions to assess writing quality. We show that the writing outputs produced with our framework are superior to those generated by LLMs without a planning component. Human raters corroborated these results, with 79.4% of participants ranking our framework's outputs as more interesting.
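To make the decomposition concrete, the following is a minimal sketch of a plan-reflect-write loop of the kind the abstract describes. The `call_llm` helper, the prompt wording, and the two-round reflection schedule are illustrative assumptions, not the exact procedure used in the paper.

```python
# Illustrative sketch only: `call_llm` is a hypothetical stand-in for any
# chat-completion client; prompts, sub-task structure, and the number of
# reflection rounds are assumptions, not the paper's exact method.

def call_llm(prompt: str) -> str:
    """Placeholder for a call to an LLM API of your choice."""
    raise NotImplementedError("Wire this to your LLM client.")


def plan_write_reflect(topic: str, num_reflections: int = 2) -> str:
    # 1. Draft an initial writing plan (an outline of sub-tasks).
    plan = call_llm(f"Write a step-by-step outline for an article about: {topic}")

    # 2. Iteratively refine the plan with reflection steps.
    for _ in range(num_reflections):
        critique = call_llm(
            f"Critique this outline and list concrete improvements:\n{plan}"
        )
        plan = call_llm(
            f"Revise the outline to address the critique.\n"
            f"Outline:\n{plan}\nCritique:\n{critique}"
        )

    # 3. Execute the plan item by item, then assemble the final text.
    sections = [
        call_llm(f"Write the section for this outline item:\n{step}\nFull outline:\n{plan}")
        for step in plan.splitlines()
        if step.strip()
    ]
    return "\n\n".join(sections)
```

Because planning, reflection, and execution are separate calls, the intermediate traces can also be collected as training data for supervised fine-tuning, as the abstract notes.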