Statistical bounds for entropic optimal transport: sample complexity and the central limit theorem
Gonzalo Mena · Jonathan Niles-Weed

Thu Dec 12 04:15 PM -- 04:20 PM (PST) @ West Ballroom A + B

We prove several fundamental statistical bounds for entropic optimal transport (OT) with the squared Euclidean cost between subgaussian probability measures in arbitrary dimension. First, through a new sample complexity result, we establish the rate of convergence of entropic OT for empirical measures. Our analysis improves exponentially on the bound of Genevay et al. (2019) and extends their work to unbounded measures. Second, we establish a central limit theorem for entropic OT, based on techniques developed by Del Barrio and Loubes (2019). Previously, such a result was known only for finite metric spaces. As an application of our results, we develop and analyze a new technique for estimating the entropy of a random variable corrupted by Gaussian noise.
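To make the object of study concrete: the quantity whose empirical convergence the abstract discusses is the entropic OT cost between two measures, which for empirical samples can be computed with Sinkhorn iterations. The sketch below is not the authors' method, only a minimal illustration of the estimator setting (the function name, the choice of regularization parameter `eps`, and the fixed iteration count are our assumptions):

```python
import numpy as np

def entropic_ot_cost(x, y, eps=1.0, n_iter=300):
    """Transport cost <P, C> at the entropic optimal plan between the
    empirical measures on samples x (n x d) and y (m x d), with squared
    Euclidean cost, computed via plain Sinkhorn iterations.
    Illustrative sketch only; eps and n_iter are arbitrary choices."""
    n, m = len(x), len(y)
    a = np.full(n, 1.0 / n)          # uniform weights on the x-samples
    b = np.full(m, 1.0 / m)          # uniform weights on the y-samples
    # Squared Euclidean cost matrix C[i, j] = ||x_i - y_j||^2
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-C / eps)             # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iter):          # alternating marginal projections
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # entropic transport plan
    return (P * C).sum()
```

Running this on two independent samples from the same subgaussian law and letting the sample size grow is exactly the regime the sample complexity result addresses: how fast the empirical cost approaches its population counterpart.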

Author Information

Gonzalo Mena (Harvard)
Jonathan Niles-Weed (NYU)
