Beyond Scaling: Adaptive Sensing for Neural Networks in the Physical World
Abstract
Recent advances in AI have been driven largely by scaling—bigger models trained on ever-larger datasets—to achieve generalization and robustness. While successful, this paradigm is increasingly costly and brittle, especially in the era of physical AI, where models must operate on data shaped by real-world sensors and complex, shifting environments. In contrast, biological sensory systems adapt dynamically at the input—adjusting pupil size, refocusing gaze, or reallocating attention—rather than relying solely on ever more powerful downstream processing. Motivated by this perspective, I will argue that adaptive sensing should be treated as a first-class principle in AI systems. By proactively modulating sensor parameters (e.g., exposure, sensitivity, and multimodal configurations) and closing the loop between perception and data acquisition, we can substantially mitigate covariate shifts, improve robustness, and reduce computational and energy costs. In this talk, I will present a series of our recent works that instantiate this vision.
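To make the idea of closing the loop between perception and data acquisition concrete, the following is a minimal conceptual sketch, not the speaker's actual system: a hypothetical camera whose exposure is adjusted by simple proportional feedback so that frames stay well-exposed under a drifting environment before any downstream model sees them. All function names, parameters, and the simulated sensor model are illustrative assumptions.

```python
# Conceptual sketch of adaptive sensing (illustrative only, not the speaker's method):
# a feedback controller adapts a simulated camera's exposure so captured frames
# remain well-exposed as the scene brightens, mitigating covariate shift at the input
# rather than relying solely on a more powerful downstream model.
import numpy as np

rng = np.random.default_rng(0)

def capture_frame(scene_luminance: np.ndarray, exposure: float) -> np.ndarray:
    """Simulated sensor: scale scene luminance by exposure, add noise, clip to [0, 1]."""
    noise = rng.normal(0.0, 0.01, scene_luminance.shape)
    return np.clip(scene_luminance * exposure + noise, 0.0, 1.0)

def adapt_exposure(exposure: float, frame: np.ndarray,
                   target_mean: float = 0.5, gain: float = 0.5) -> float:
    """Proportional feedback: nudge exposure toward a target mean frame intensity."""
    error = target_mean - frame.mean()
    return float(np.clip(exposure * (1.0 + gain * error), 0.01, 10.0))

# A drifting environment: the scene gradually brightens over time (covariate shift).
exposure = 1.0
for t in range(20):
    scene = np.full((32, 32), 0.1 + 0.04 * t)    # hypothetical scene, brighter each step
    frame = capture_frame(scene, exposure)
    # ... a downstream model would consume `frame` here ...
    exposure = adapt_exposure(exposure, frame)   # close the perception-acquisition loop
    if t % 5 == 0:
        print(f"t={t:2d}  scene={scene.mean():.2f}  frame mean={frame.mean():.2f}  exposure={exposure:.2f}")
```

In this toy setting the frame statistics seen by the downstream model stay roughly stable even as the scene changes, which is the intuition behind treating sensor adaptation, rather than only model scaling, as a lever for robustness.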