Refining low-resolution (LR) spatial fields with high-resolution (HR) information, often known as statistical downscaling, is challenging as the diversity of spatial datasets often prevents direct matching of observations. Yet, when LR samples are modeled as aggregate conditional means of HR samples with respect to a mediating variable that is globally observed, recovering the underlying fine-grained field can be framed as taking an "inverse" of the conditional expectation, namely a deconditioning problem. In this work, we propose a Bayesian formulation of deconditioning which naturally recovers the initial reproducing kernel Hilbert space formulation of Hsu and Ramos (2019). We extend deconditioning to a downscaling setup and devise an efficient conditional mean embedding estimator for multiresolution data. By treating conditional expectations as inter-domain features of the underlying field, we establish a posterior over the latent field as the solution to the deconditioning problem. Furthermore, we show that this solution can be viewed as a two-stage vector-valued kernel ridge regressor and that it attains a minimax optimal convergence rate under mild assumptions. Lastly, we demonstrate its proficiency on a synthetic and a real-world atmospheric field downscaling problem, showing substantial improvements over existing methods.
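
To make the two-stage structure concrete, below is a minimal numerical sketch of deconditioning on a toy 1D problem: stage one estimates the conditional mean embedding of the HR inputs given the mediating variable from HR pairs, and stage two ridge-regresses the LR aggregates onto the resulting inter-domain features to recover the latent field. Everything here (Gaussian kernels, regularisation values, the synthetic bagging scheme, and names such as rbf, y_tilde, f_hat) is an illustrative assumption, not the paper's exact estimator or code.

    # Toy sketch of deconditioning as a two-stage kernel ridge regression.
    # Illustrative only: kernels, regularisation and data generation are assumptions.
    import numpy as np

    def rbf(A, B, lengthscale=0.3):
        # Gaussian (RBF) kernel matrix between 1D input vectors A and B.
        d2 = (A[:, None] - B[None, :]) ** 2
        return np.exp(-0.5 * d2 / lengthscale ** 2)

    rng = np.random.default_rng(0)

    # High-resolution inputs x with a globally observed mediating variable y.
    N = 200
    x = rng.uniform(0, 1, N)
    y = x + 0.05 * rng.normal(size=N)          # mediating variable, noisy proxy of x

    # Latent fine-grained field f(x); only aggregated observations of it are seen.
    f = lambda t: np.sin(2 * np.pi * t)

    # Low-resolution data: bag centres y_tilde with targets z_j ~ E[f(X) | Y = y_tilde_j].
    M = 20
    y_tilde = np.linspace(0.05, 0.95, M)
    z = np.array([f(x[np.abs(y - c) < 0.05]).mean() for c in y_tilde])

    lam, eps = 1e-3, 1e-3                      # stage-1 / stage-2 ridge parameters (assumed)

    # Stage 1: conditional mean embedding weights of X | Y from the HR pairs (x_i, y_i).
    Kyy = rbf(y, y)
    W = np.linalg.solve(Kyy + N * lam * np.eye(N), rbf(y, y_tilde))   # (N, M) CME weights

    # Stage 2: ridge regression of the LR targets on the embedded (inter-domain) features.
    Kxx = rbf(x, x)
    G = W.T @ Kxx @ W                          # Gram matrix of the CME features
    alpha = np.linalg.solve(G + M * eps * np.eye(M), z)

    # Reconstructed fine-grained field at new locations x_star.
    x_star = np.linspace(0, 1, 100)
    f_hat = rbf(x_star, x) @ W @ alpha

    print(np.abs(f_hat - f(x_star)).mean())    # crude reconstruction error

The key point of the sketch is that the LR targets never index the HR locations directly; they only enter through the mediating variable, and the field is recovered by inverting the conditional averaging encoded in the stage-one weights.
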
Author Information
Siu Lun Chau (University of Oxford)
Shahine Bouabid (University of Oxford)
Dino Sejdinovic (University of Oxford)
More from the Same Authors
- 2020: Predicting Landsat Reflectance with Deep Generative Fusion
  Shahine Bouabid
- 2022: Bayesian inference for aerosol vertical profiles
  Shahine Bouabid · Duncan Watson-Parris · Dino Sejdinovic
- 2022 Poster: Giga-scale Kernel Matrix-Vector Multiplication on GPU
  Robert Hu · Siu Lun Chau · Dino Sejdinovic · Joan Glaunès
- 2022 Poster: Explaining Preferences with Shapley Values
  Robert Hu · Siu Lun Chau · Jaime Ferrando Huertas · Dino Sejdinovic
- 2022 Poster: RKHS-SHAP: Shapley Values for Kernel Methods
  Siu Lun Chau · Robert Hu · Javier González · Dino Sejdinovic
- 2021 Poster: BayesIMP: Uncertainty Quantification for Causal Data Fusion
  Siu Lun Chau · Jean-Francois Ton · Javier González · Yee Teh · Dino Sejdinovic
- 2019 Poster: Hyperparameter Learning via Distributional Transfer
  Ho Chung Law · Peilin Zhao · Leung Sing Chan · Junzhou Huang · Dino Sejdinovic
- 2018 Poster: Causal Inference via Kernel Deviance Measures
  Jovana Mitrovic · Dino Sejdinovic · Yee Whye Teh
- 2018 Spotlight: Causal Inference via Kernel Deviance Measures
  Jovana Mitrovic · Dino Sejdinovic · Yee Whye Teh
- 2018 Poster: Variational Learning on Aggregate Outputs with Gaussian Processes
  Ho Chung Law · Dino Sejdinovic · Ewan Cameron · Tim Lucas · Seth Flaxman · Katherine Battle · Kenji Fukumizu
- 2018 Poster: Hamiltonian Variational Auto-Encoder
  Anthony Caterini · Arnaud Doucet · Dino Sejdinovic
- 2017 Poster: Testing and Learning on Distributions with Symmetric Noise Invariance
  Ho Chung Law · Christopher Yau · Dino Sejdinovic