In climate models, the stratospheric ozone layer is commonly represented only in a highly simplified form that falls short of the current understanding of stratospheric ozone chemistry. Climate projections would benefit from including the mutual interactions between stratospheric ozone, temperature, and atmospheric dynamics in order to accurately represent radiative forcing.

Our aim with this paper was to replace the ozone layer in climate models with a machine-learned implicit neural representation that provides a particularly fast, yet accurate and stable, simulation.

First, we explored correlations and causalities to create a comprehensive benchmark data set from simulations of the ATLAS chemistry and transport model. Given this data set, we experimented with different variants of multilayer perceptrons suited to physical problems to learn an implicit neural representation of our latent function. We optimised our model through an extensive Bayesian hyperparameter search. For validation, we coupled the model into the ATLAS chemistry and transport model and benchmarked computation time, accuracy, and stability against the full chemistry model.

The resulting implicit neural representation (Neural-SWIFT) allowed us to perform long-term simulations with high accuracy, without significant error accumulation, and at a factor of 700 faster than the baseline model. Our model surpassed the accuracy of the previous polynomial approach and accurately reproduces the regimes of chemical production and loss, as well as the seasonality in both hemispheres, when compared to the full chemistry model.

Neural-SWIFT enables mutual interactions between stratospheric ozone, temperature, and atmospheric dynamics and performs comparably to a full chemistry model, but much faster and more simply; it is thus intended for use in climate models.
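The surrogate described above can be illustrated with a minimal sketch: a small multilayer perceptron that maps a vector of local state variables to an ozone tendency. The input features, layer sizes, and activation below are illustrative assumptions, not the paper's actual Neural-SWIFT configuration.

```python
import numpy as np

# Illustrative sketch of an MLP-based implicit neural representation for an
# ozone surrogate. Inputs (e.g. ozone mixing ratio, temperature, overhead
# ozone column, solar zenith angle) are hypothetical placeholders, not the
# feature set used by Neural-SWIFT.
rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights and biases for a fully connected network."""
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Forward pass: tanh hidden activations, linear output layer."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

# 4 hypothetical state inputs -> 1 predicted ozone tendency
params = init_mlp([4, 32, 32, 1])
state = rng.standard_normal((8, 4))  # batch of 8 grid points
tendency = forward(params, state)
print(tendency.shape)                # (8, 1)
```

In a coupled setting, such a network would be evaluated once per grid point and time step in place of the full chemistry solver, which is where the reported speed-up originates.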