Poster

Demo2Code: From Summarizing Demonstrations to Synthesizing Code via Extended Chain-of-Thought

Yuki Wang · Gonzalo Gonzalez-Pumariega · Yash Sharma · Sanjiban Choudhury

Great Hall & Hall B1+B2 (level 1) #436

Abstract:

Language instructions and demonstrations are two natural ways for users to teach robots personalized tasks. Recent progress in Large Language Models (LLMs) has shown impressive performance in translating language instructions into code for robotic tasks. However, translating demonstrations into task code remains a challenge: the length and complexity of both demonstrations and code make learning a direct mapping intractable. This paper presents Demo2Code, a novel framework that generates robot task code from demonstrations via an extended chain-of-thought, defining a common latent specification that connects the two. Our framework employs a robust two-stage process: (1) a recursive summarization technique that condenses demonstrations into concise specifications, and (2) a code synthesis approach that expands each function recursively from the generated specifications. We conduct an extensive evaluation on various robot task benchmarks, including Robotouille, a novel game benchmark designed to simulate diverse cooking tasks in a kitchen environment.
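The two-stage process described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the pairwise summarization schedule, and the toy stand-ins for the LLM calls are all assumptions made for illustration. Stage 1 repeatedly condenses demonstration chunks until one specification remains; stage 2 recursively expands a specification into code, descending into sub-specifications until each one maps to a concrete call.

```python
# Hypothetical sketch of Demo2Code's two-stage pipeline. `summarize` and
# `synthesize` stand in for LLM calls; their behavior here is a toy.

def recursive_summarize(chunks, summarize):
    """Stage 1: condense demonstration chunks into one concise specification.

    Repeatedly merges adjacent chunks and summarizes each merged pair,
    halving the list each round until a single specification remains.
    """
    while len(chunks) > 1:
        merged = []
        for i in range(0, len(chunks), 2):
            pair = chunks[i:i + 2]          # last chunk may be unpaired
            merged.append(summarize(" ".join(pair)))
        chunks = merged
    return chunks[0]


def expand(spec, synthesize):
    """Stage 2: recursively expand a specification into task code.

    `synthesize` either returns final code (a string) or decomposes the
    specification into a list of sub-specifications, which are expanded
    in turn.
    """
    result = synthesize(spec)
    if isinstance(result, str):
        return result
    return "\n".join(expand(sub, synthesize) for sub in result)


# Toy stand-ins for the LLM (purely illustrative):
def toy_summarize(text):
    return text  # a real summarizer would condense; identity keeps the demo simple

def toy_synthesize(spec):
    if " and " in spec:
        return spec.split(" and ")           # decompose compound steps
    return f"robot.{spec.replace(' ', '_')}()"  # leaf step -> one call
```

For example, `expand("pick onion and cook patty", toy_synthesize)` produces two leaf calls, `robot.pick_onion()` and `robot.cook_patty()`, one per line.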
