

Poster in Workshop: Tackling Climate Change with Machine Learning

Paraformer: Parameterization of Sub-grid Scale Processes Using Transformers

Shuochen Wang · Nishant Yadav · Auroop Ganguly


Abstract:

One of the major sources of uncertainty in the current generation of Global Climate Models (GCMs) is the simulation of sub-grid scale physical processes. Over the years, enabled by significantly improved computational performance, a series of Deep Learning (DL) parameterization schemes have been developed and incorporated into GCMs. However, these schemes rely on classic architectures, and the attention mechanism has not yet been widely investigated for this task. We propose a "memory-aware" Transformer-based model trained on ClimSim, the largest dataset to date for climate parameterization. Our results show that the attention mechanism successfully captures the complex non-linear dependencies among sub-grid scale variables and reduces prediction error.
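The abstract does not spell out the architecture or the "memory-aware" component, so the following is only an illustrative sketch of the general idea: a plain Transformer encoder that attends across the vertical levels of an atmospheric column to predict per-level sub-grid tendencies. All shapes, level counts, input fields, and layer sizes here are assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class ColumnTransformer(nn.Module):
    """Illustrative Transformer for sub-grid parameterization.

    Treats each vertical level of an atmospheric column as a token, so
    self-attention can model non-local dependencies between levels.
    All dimensions below are hypothetical, not taken from the paper.
    """

    def __init__(self, n_levels=60, n_in=6, n_out=2,
                 d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(n_in, d_model)  # lift per-level features
        # Learned positional encoding over vertical levels
        self.pos = nn.Parameter(torch.zeros(1, n_levels, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_out)   # per-level tendency outputs

    def forward(self, x):                        # x: (batch, n_levels, n_in)
        h = self.embed(x) + self.pos             # add vertical position info
        h = self.encoder(h)                      # attention across levels
        return self.head(h)                      # (batch, n_levels, n_out)

# Usage: 8 columns, 60 vertical levels, 6 input fields per level
model = ColumnTransformer()
x = torch.randn(8, 60, 6)
y = model(x)
print(y.shape)  # torch.Size([8, 60, 2])
```

In a real scheme, the inputs would be large-scale state variables (e.g. temperature and humidity profiles) and the targets the sub-grid tendencies diagnosed from ClimSim; the "memory-aware" mechanism described in the abstract is not reproduced here.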
