Poster in Workshop: Machine Learning with New Compute Paradigms

PhyFF: Physical forward forward algorithm for in-hardware training and inference

Ali Momeni · Babak Rahmani · Matthieu MallĂ©jac · Philipp del Hougne · Romain Fleury

Sat 16 Dec 9:25 a.m. PST — 10:30 a.m. PST

Abstract:

Training of digital deep learning models relies primarily on backpropagation, which is difficult to implement in physical hardware because it requires precise knowledge of the computations performed in the network's forward pass. To address this issue, we propose a physical forward-forward training algorithm (phyFF) inspired by the original forward-forward algorithm. This approach enables direct training of deep physical neural networks composed of layers of diverse physical nonlinear systems, without requiring complete knowledge of the underlying physics. We demonstrate the superiority of this method over current hardware-aware training techniques: it achieves faster training, reduces digital computational requirements, and lowers the power consumption of training in physical systems.
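The phyFF update rule itself is not given on this page, but the digital forward-forward algorithm it builds on (Hinton, 2022) trains each layer locally: a layer's "goodness" (e.g. the sum of squared activations) is pushed above a threshold for positive data and below it for negative data, so no gradient ever flows backward through preceding layers. Below is a minimal PyTorch sketch of that layer-wise rule; the class name, threshold, learning rate, and toy data are illustrative assumptions, and in phyFF the digital layer would be replaced by a physical nonlinear system.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FFLayer(nn.Module):
    """One locally trained layer; in phyFF this role is played by a physical system."""
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):  # hyperparameters are assumptions
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Length-normalize so the next layer cannot infer goodness from input scale.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.fc(x))

    def train_step(self, x_pos, x_neg):
        # "Goodness" = sum of squared activations. Push it above the threshold
        # for positive data and below it for negative data, updating only this
        # layer's parameters -- no gradient flows into earlier layers.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        loss = F.softplus(torch.cat([self.threshold - g_pos,
                                     g_neg - self.threshold])).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach outputs so the next layer treats them as fixed inputs.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Toy usage on random stand-in data (real positive/negative samples would embed
# the label into the input, e.g. by overwriting the first pixels of an image).
layers = [FFLayer(784, 256), FFLayer(256, 256)]
x_pos, x_neg = torch.rand(64, 784), torch.rand(64, 784)
for _ in range(10):
    h_pos, h_neg = x_pos, x_neg
    for layer in layers:
        h_pos, h_neg = layer.train_step(h_pos, h_neg)
```

Because each layer needs only its own forward output, this scheme avoids the backward pass through the hardware that makes backpropagation impractical for physical systems, which is the property phyFF exploits.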
