Poster
in
Workshop: Machine Learning with New Compute Paradigms

Beyond Digital: Harnessing Analog Hardware for Machine Learning

Marvin Syed · Kirill Kalinin · Natalia Berloff

Sat 16 Dec 9:25 a.m. PST — 10:30 a.m. PST

Abstract:

The surge in the use of large deep-learning models has yielded state-of-the-art results across a variety of tasks. Recent model sizes often exceed billions of parameters, underscoring the importance of fast and energy-efficient processing. The significant costs associated with training and inference stem primarily from the constrained memory bandwidth of current hardware and the computationally intensive nature of these models. Historically, the design of machine learning models has predominantly been guided by the operational parameters of classical digital devices. In contrast, analog computation has the potential to offer vastly improved power efficiency for both inference and training. This work details several machine-learning methodologies that could leverage existing analog hardware infrastructures. To foster the development of analog hardware-aware machine learning techniques, we explore both optical and electronic hardware configurations suitable for executing the fundamental mathematical operations inherent to these models. Integrating analog hardware with innovative machine learning approaches may pave the way for cost-effective AI systems at scale.
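The dominant operation that analog accelerators (photonic meshes, resistive crossbars, and similar devices) target is the matrix-vector product. A minimal sketch of a hardware-aware simulation is given below; the noise level, quantization resolution, and noise model are illustrative assumptions chosen for demonstration, not measurements of any device discussed in the poster.

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_matvec(weights, x, noise_std=0.01, bits=8):
    """Toy model of a matrix-vector product on analog hardware:
    weights are quantized to an assumed programming resolution and
    the result is corrupted by additive Gaussian read noise.
    All non-ideality parameters here are hypothetical."""
    # Quantize weights to the assumed device resolution.
    scale = float(np.max(np.abs(weights))) or 1.0
    levels = 2 ** (bits - 1) - 1
    w_q = np.round(weights / scale * levels) / levels * scale
    # Ideal analog accumulation plus additive read noise.
    y = w_q @ x
    return y + rng.normal(0.0, noise_std * np.max(np.abs(y)), size=y.shape)

# Compare the simulated analog result with the exact digital one.
W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)
y_digital = W @ x
y_analog = analog_matvec(W, x)
print(np.max(np.abs(y_digital - y_analog)))  # small but nonzero deviation
```

Training a model through such a noisy forward pass (hardware-in-the-loop or noise-injected training) is one way machine learning methods can be adapted to analog substrates.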
