Importance Sampling (IS) is a method for approximating expectations under a target distribution using independent samples from a proposal distribution and the associated importance weights. In many applications, the target distribution is known only up to a normalization constant, in which case self-normalized IS (SNIS) is used. While self-normalization can have a positive effect on the dispersion of the estimator, it introduces bias. In this work, we propose a new method, BR-SNIS, whose complexity is essentially the same as that of SNIS and which significantly reduces bias. The method is a wrapper, in the sense that it uses the same proposal samples and importance weights as SNIS but makes clever use of iterated sampling-importance-resampling (i-SIR) to form a bias-reduced version of the estimator. We furnish the proposed algorithm with rigorous theoretical results, including novel bias, variance, and high-probability bounds, and we illustrate our findings with numerical examples.
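For concreteness, below is a minimal NumPy sketch of the plain SNIS estimator that BR-SNIS wraps; the function names and the toy Gaussian example are our own illustration, not code from the paper. The bias that BR-SNIS targets enters through the random normalization in the last step.

```python
import numpy as np

def snis_estimate(f, log_target, sample_proposal, log_proposal, n, seed=0):
    """Plain SNIS estimate of E_pi[f(X)], where pi is known only up to a
    normalizing constant via `log_target`. Additive constants in both
    log-densities cancel in the self-normalization step, so they may be
    omitted."""
    rng = np.random.default_rng(seed)
    x = sample_proposal(rng, n)              # i.i.d. draws from the proposal
    log_w = log_target(x) - log_proposal(x)  # unnormalized log-weights
    w = np.exp(log_w - log_w.max())          # subtract max for numerical stability
    w /= w.sum()                             # self-normalization: the source of bias
    return float(np.sum(w * f(x)))

# Toy check (hypothetical example): unnormalized target N(1, 1),
# proposal N(0, 2); the estimate of E[X] should be close to 1.
est = snis_estimate(
    f=lambda x: x,
    log_target=lambda x: -0.5 * (x - 1.0) ** 2,
    sample_proposal=lambda rng, n: rng.normal(0.0, 2.0, size=n),
    log_proposal=lambda x: -0.5 * (x / 2.0) ** 2,
    n=100_000,
)
print(est)
```

Because the normalizing sum `w.sum()` is itself random, this ratio estimator is biased for finite n. Per the abstract, BR-SNIS recycles these same proposal samples and importance weights through i-SIR iterations to reduce that bias at essentially the same computational cost.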
Author Information
Gabriel Cardoso (Ecole Polytechnique)
Sergey Samsonov (National Research University Higher School of Economics)
Achille Thin (AgroParisTech)
Eric Moulines (Ecole Polytechnique)
Jimmy Olsson
More from the Same Authors
- 2022: Distributional deep Q-learning with CVaR regression
  Mastane Achab · REDA ALAMI · YASSER ABDELAZIZ DAHOU DJILALI · Kirill Fedyanin · Eric Moulines · Maxim Panov
- 2023 Poster: First Order Methods with Markovian Noise: from Acceleration to Variational Inequalities
  Aleksandr Beznosikov · Sergey Samsonov · Marina Sheshukova · Alexander Gasnikov · Alexey Naumov · Eric Moulines
- 2023 Poster: Model-free Posterior Sampling via Learning Rate Randomization
  Daniil Tiapkin · Denis Belomestny · Daniele Calandriello · Eric Moulines · Remi Munos · Alexey Naumov · Pierre Perrault · Michal Valko · Pierre Ménard
- 2022 Spotlight: Optimistic Posterior Sampling for Reinforcement Learning with Few Samples and Tight Guarantees
  Daniil Tiapkin · Denis Belomestny · Daniele Calandriello · Eric Moulines · Remi Munos · Alexey Naumov · Mark Rowland · Michal Valko · Pierre Ménard
- 2022 Poster: Optimistic Posterior Sampling for Reinforcement Learning with Few Samples and Tight Guarantees
  Daniil Tiapkin · Denis Belomestny · Daniele Calandriello · Eric Moulines · Remi Munos · Alexey Naumov · Mark Rowland · Michal Valko · Pierre Ménard
- 2022 Poster: Local-Global MCMC kernels: the best of both worlds
  Sergey Samsonov · Evgeny Lagutin · Marylou Gabrié · Alain Durmus · Alexey Naumov · Eric Moulines
- 2022 Poster: FedPop: A Bayesian Approach for Personalised Federated Learning
  Nikita Kotelevskii · Maxime Vono · Alain Durmus · Eric Moulines
- 2021 Poster: Federated-EM with heterogeneity mitigation and variance reduction
  Aymeric Dieuleveut · Gersende Fort · Eric Moulines · Geneviève Robin
- 2021 Poster: NEO: Non Equilibrium Sampling on the Orbits of a Deterministic Transform
  Achille Thin · Yazid Janati El Idrissi · Sylvain Le Corff · Charles Ollion · Eric Moulines · Arnaud Doucet · Alain Durmus · Christian X Robert
- 2021 Poster: Tight High Probability Bounds for Linear Stochastic Approximation with Fixed Stepsize
  Alain Durmus · Eric Moulines · Alexey Naumov · Sergey Samsonov · Kevin Scaman · Hoi-To Wai
- 2020 Poster: A Stochastic Path Integral Differential EstimatoR Expectation Maximization Algorithm
  Gersende Fort · Eric Moulines · Hoi-To Wai
- 2019 Poster: On the Global Convergence of (Fast) Incremental Expectation Maximization Methods
  Belhal Karimi · Hoi-To Wai · Eric Moulines · Marc Lavielle
- 2018 Poster: Low-rank Interaction with Sparse Additive Effects Model for Large Data Frames
  Geneviève Robin · Hoi-To Wai · Julie Josse · Olga Klopp · Eric Moulines
- 2018 Spotlight: Low-rank Interaction with Sparse Additive Effects Model for Large Data Frames
  Geneviève Robin · Hoi-To Wai · Julie Josse · Olga Klopp · Eric Moulines
- 2018 Poster: The promises and pitfalls of Stochastic Gradient Langevin Dynamics
  Nicolas Brosse · Alain Durmus · Eric Moulines