Designing Optimal Computation Protocols from Fluctuation-Response Relations
Abstract
The energy cost of computation has emerged as a central topic at the intersection of physics and computer science. Recent advances in statistical physics, particularly in stochastic thermodynamics, enable precise characterizations of work, heat, and entropy production in computational systems driven far from equilibrium by time-dependent protocols. A key open question is how to design protocols that minimize thermodynamic cost while still ensuring correct outcomes. In this work, we propose a unified framework for designing optimal protocols using Fluctuation-Response Relations (FRRs) and machine learning. Unlike previous approaches that optimize distributions or protocols separately, our method unifies both through FRR-derived gradients. We construct loss functions for various computational objectives and use FRRs, combined with gradient-based optimization, to efficiently locate the corresponding optimal protocols. We demonstrate this framework on bit erasure in a double-well potential, constructing loss functions that trade off energy cost against task error. Extending to underdamped systems, we show how momentum memory enables more complex operations such as bit flips, which are infeasible in purely one-dimensional Markovian settings. In both examples, we achieve work costs on the same order as Landauer's bound. Beyond these examples, our approach provides a general foundation for designing low-cost protocols in physical computing systems, with potential applications ranging from robust quantum gates under noise to energy-efficient control of chemical and synthetic biological networks.
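To fix ideas, the snippet below is a minimal sketch of the optimization loop the abstract describes: a piecewise-constant erasure protocol for an overdamped bit in a double-well potential, trained by gradient descent on a loss that combines mean work with erasure error. The quartic potential, the control parametrization, all numerical values, and the `simulate`/`loss` helpers are illustrative assumptions, and finite differences with common random numbers stand in for the FRR-derived gradients developed in the paper.

```python
import numpy as np

def simulate(protocol, n_traj=300, steps=600, dt=2e-3, beta=1.0, seed=0):
    """Overdamped Langevin dynamics in the double well V(x) = x^4 - b x^2 + c x.

    `protocol` has shape (K, 2); row k holds the controls (b_k, c_k) for
    steps // K time steps.  Returns (mean work, erasure error = P(x_T > 0))."""
    rng = np.random.default_rng(seed)
    hold = steps // len(protocol)
    # The bit to be erased: start in either well with equal probability.
    x = rng.choice([-1.0, 1.0], size=n_traj) + 0.1 * rng.standard_normal(n_traj)
    work = np.zeros(n_traj)
    b_old, c_old = 2.0, 0.0                      # assumed resting control values
    for b, c in protocol:
        # Work is paid when the controls jump: W += V(x; new) - V(x; old).
        work += (b_old - b) * x**2 + (c - c_old) * x
        b_old, c_old = b, c
        for _ in range(hold):
            force = -(4 * x**3 - 2 * b * x + c)  # F = -dV/dx
            x += force * dt + np.sqrt(2 * dt / beta) * rng.standard_normal(n_traj)
    # Return the controls to their resting values and charge the final jump.
    work += (b_old - 2.0) * x**2 - c_old * x
    return work.mean(), np.mean(x > 0)

def loss(protocol, alpha=20.0, seed=0):
    w, err = simulate(protocol, seed=seed)
    return w + alpha * err        # the work/error trade-off from the abstract

# Gradient descent on a piecewise-constant protocol.  Finite differences with
# common random numbers stand in for the paper's FRR-derived gradients.
protocol = np.tile([2.0, 0.0], (6, 1))           # 6 segments of controls (b, c)
eps, lr = 0.05, 0.02
for it in range(40):
    base = loss(protocol, seed=it)
    grad = np.zeros_like(protocol)
    for idx in np.ndindex(*protocol.shape):
        trial = protocol.copy()
        trial[idx] += eps
        grad[idx] = (loss(trial, seed=it) - base) / eps  # same noise as `base`
    protocol -= lr * grad
    if it % 10 == 0:
        print(f"iter {it:2d}: loss = {base:.3f}")  # compare to kT ln 2 ~ 0.693
```

Fixing the random seed within each iteration keeps the stochastic loss comparable between the base and perturbed protocols, which is what makes the finite-difference gradient usable here; the FRR approach of the paper would replace this inner loop with response-based gradient estimates from the sampled trajectories.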