Poster
Estimation of Information Theoretic Measures for Continuous Random Variables
Fernando Perez-Cruz

Mon Dec 08 08:45 PM -- 12:00 AM (PST)

We analyze the estimation of information-theoretic measures of continuous random variables, such as differential entropy, mutual information, and the Kullback-Leibler divergence. The objective of this paper is twofold. First, we prove that the information-theoretic measure estimates based on k-nearest-neighbor density estimation with fixed k converge almost surely, even though the fixed-k density estimate itself does not converge to the true density. Second, we show that these estimates do not converge when k grows linearly with the number of samples. Nevertheless, the nonconvergent estimates can still be used to solve the two-sample problem and to assess whether two random variables are independent. We show that the two-sample and independence tests based on these nonconvergent estimates compare favorably with the maximum mean discrepancy test and the Hilbert-Schmidt independence criterion, respectively.
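The abstract describes estimators built on fixed-k nearest-neighbor density estimation. As a rough illustration of that idea (not necessarily the paper's exact construction), the sketch below implements the classical Kozachenko-Leonenko fixed-k nearest-neighbor estimator of differential entropy in Python; the function name knn_entropy and the default k=3 are illustrative choices of my own.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats).

    x : array of shape (n_samples, n_dims)
    k : number of nearest neighbors (fixed, independent of n_samples)
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each point to its k-th nearest neighbor,
    # excluding the point itself (hence k + 1 in the query).
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # Note: duplicate points give eps = 0 and would break the log.
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.standard_normal((5000, 1))
    # True differential entropy of N(0, 1) is 0.5 * log(2 * pi * e) ~= 1.4189 nats.
    print(knn_entropy(sample, k=3))
```

On a standard normal sample the estimate should approach 0.5·log(2πe) ≈ 1.4189 nats as the sample size grows, consistent with the fixed-k almost-sure convergence the abstract claims.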

Author Information

Fernando Perez-Cruz (Swiss Data Science Center (ETH Zurich and EPFL))
