Poster

Detecting and Adapting to Irregular Distribution Shifts in Bayesian Online Learning

Aodong Li · Alex Boyd · Padhraic Smyth · Stephan Mandt

Virtual

Keywords: [ Continual Learning ] [ Online Learning ] [ Self-Supervised Learning ]


Abstract:

We consider the problem of online learning in the presence of distribution shifts that occur at an unknown rate and with unknown intensity. We derive a new Bayesian online inference approach that simultaneously infers these distribution shifts and adapts the model to the detected changes by integrating ideas from change point detection, switching dynamical systems, and Bayesian online learning. Using a binary ‘change variable,’ we construct an informative prior such that, if a change is detected, the model partially erases the information of past model updates by tempering, facilitating adaptation to the new data distribution. Furthermore, the approach uses beam search to track multiple change-point hypotheses and selects the most probable one in hindsight. Our proposed method is model-agnostic, applicable in both supervised and unsupervised learning settings, suitable for environments with concept drift or covariate shift, and yields improvements over state-of-the-art Bayesian online learning approaches.
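To make the tempering idea concrete, below is a minimal, self-contained sketch of Bayesian online learning of a Gaussian mean in which a detected change partially resets the posterior toward the prior. It is an illustrative assumption, not the authors' implementation: the change test here is a simple predictive-likelihood threshold rather than the paper's inferred binary change variable, and the beam search over multiple change-point hypotheses is omitted. All names and parameter values (`beta`, the threshold of `-8.0`, etc.) are hypothetical.

```python
import numpy as np

# Minimal sketch (not the paper's exact method): online conjugate updates of a
# Gaussian mean, where a suspected distribution shift "tempers" the running
# posterior, i.e. partially flattens it back toward the prior so the model can
# adapt to the new regime.

prior_mu, prior_var = 0.0, 10.0   # broad prior over the latent mean
obs_var = 1.0                     # known observation noise
beta = 0.3                        # tempering factor in (0, 1]; smaller = more forgetting

mu, var = prior_mu, prior_var     # running posterior N(mu, var)

def predictive_logpdf(x, mu, var):
    """Log density of x under the posterior predictive N(mu, var + obs_var)."""
    s = var + obs_var
    return -0.5 * (np.log(2 * np.pi * s) + (x - mu) ** 2 / s)

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0.0, 1.0, 50),    # regime 1
                         rng.normal(5.0, 1.0, 50)])   # regime 2 (distribution shift)

for x in stream:
    # Crude change heuristic: a very unlikely observation suggests a shift.
    # (The paper instead treats the change as a latent binary variable and
    # tracks multiple change-point hypotheses with beam search.)
    change = predictive_logpdf(x, mu, var) < -8.0

    if change:
        # Tempering: raising the Gaussian posterior to the power beta and
        # renormalizing inflates its variance, a partial reset to the prior.
        var = min(var / beta, prior_var)

    # Standard conjugate Bayesian update using the (possibly tempered) posterior.
    precision = 1.0 / var + 1.0 / obs_var
    mu = (mu / var + x / obs_var) / precision
    var = 1.0 / precision

print(f"final posterior mean ~ {mu:.2f} (true second-regime mean is 5.0)")
```

In this toy setting, the tempered posterior recovers the second-regime mean quickly, whereas a plain conjugate update would be slowed by the accumulated (now stale) evidence from the first regime.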
