We train hierarchical Transformers on the task of synthesizing hardware circuits directly from high-level logical specifications in linear-time temporal logic (LTL). The LTL synthesis problem is a well-known algorithmic challenge with a long history, and an annual competition is organized to track the improvement of algorithms and tooling over time. Machine-learning approaches could open up new possibilities in this area, but they suffer from the lack of sufficient amounts of training data. In this paper, we present a method to generate large amounts of additional training data, i.e., pairs of specifications and circuits implementing them. We ensure that this synthetic data is sufficiently close to human-written specifications by mining common patterns from the specifications used in the synthesis competitions. We show that hierarchical Transformers trained on this synthetic data solve a significant portion of problems from the synthesis competitions, and even out-of-distribution examples from a recent case study.
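To make the data-generation idea concrete, the following is a minimal sketch of how synthetic specifications might be assembled from mined patterns. The pattern set, placeholder names, and `sample_spec` helper are hypothetical illustrations, not the paper's actual pipeline; in the full approach, a classical synthesis tool would then be run on each generated specification to produce the paired circuit.

```python
import random

# Hypothetical LTL specification patterns, in the spirit of patterns mined
# from synthesis-competition benchmarks. Placeholders {i}/{o}/{o2} are filled
# with input and output atomic propositions.
PATTERNS = [
    "G ({i} -> F {o})",    # every request is eventually granted
    "G ({o} -> !{o2})",    # two outputs never hold simultaneously
    "G F {o}",             # an output holds infinitely often
    "G ({i} -> X {o})",    # response at the very next step
]

def sample_spec(inputs, outputs, n_conjuncts=3, rng=random):
    """Assemble a synthetic specification as a conjunction of
    randomly instantiated patterns (illustrative sketch)."""
    conjuncts = []
    for _ in range(n_conjuncts):
        pattern = rng.choice(PATTERNS)
        conjuncts.append(pattern.format(
            i=rng.choice(inputs),
            o=rng.choice(outputs),
            o2=rng.choice(outputs),
        ))
    return " & ".join(f"({c})" for c in conjuncts)

if __name__ == "__main__":
    spec = sample_spec(inputs=["r0", "r1"], outputs=["g0", "g1"])
    print(spec)  # e.g. (G (r0 -> F g1)) & (G F g0) & (G (g1 -> X g0))
```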
Frederik Schmitt (Saarland University)
Christopher Hahn (CISPA Helmholtz Center for Information Security)
Markus Rabe (Google Research)
Bernd Finkbeiner (CISPA Helmholtz Center for Information Security, Saarland University, Saarland Informatics Campus)