Neural Methods for Point-wise Dependency Estimation
Yao-Hung Hubert Tsai, Han Zhao, Makoto Yamada, LP Morency, Russ Salakhutdinov
Spotlight presentation: Orals & Spotlights Track 01: Representation/Relational
on 2020-12-07 from 19:30 to 19:40 (PST)
Abstract: Since its inception, the neural estimation of mutual information (MI) has demonstrated empirical success in modeling the expected dependency between high-dimensional random variables. However, MI is an aggregate statistic and cannot be used to measure point-wise dependency between different events. In this work, instead of estimating the expected dependency, we focus on estimating point-wise dependency (PD), which quantitatively measures how likely two outcomes are to co-occur. We show that PD can be obtained naturally when optimizing MI neural variational bounds. However, optimizing these bounds is challenging in practice due to their large variance. To address this issue, we develop two methods that are free of optimizing MI variational bounds: Probabilistic Classifier and Density-Ratio Fitting. We demonstrate the effectiveness of our approaches on 1) MI estimation, 2) self-supervised representation learning, and 3) cross-modal retrieval.
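The following is a minimal sketch (not the authors' reference code) of the probabilistic-classifier idea for point-wise dependency estimation: train a binary classifier to distinguish paired samples drawn from the joint distribution p(x, y) from shuffled pairs drawn from the product of marginals p(x)p(y). With balanced classes, the classifier's log-odds approximate log PD(x, y) = log [p(x, y) / (p(x)p(y))]. The toy data, network sizes, and hyperparameters below are illustrative assumptions.

```python
import torch
import torch.nn as nn

dim, n = 8, 4096

# Toy correlated Gaussian data: y = x + noise, so x and y are dependent.
x = torch.randn(n, dim)
y = x + 0.5 * torch.randn(n, dim)

# Classifier outputs the logit of q(joint | x, y).
critic = nn.Sequential(
    nn.Linear(2 * dim, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, 1),
)
opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(500):
    perm = torch.randperm(n)
    joint = torch.cat([x, y], dim=1)        # label 1: samples from p(x, y)
    marg = torch.cat([x, y[perm]], dim=1)   # label 0: samples from p(x)p(y)
    inputs = torch.cat([joint, marg], dim=0)
    labels = torch.cat([torch.ones(n, 1), torch.zeros(n, 1)], dim=0)
    loss = bce(critic(inputs), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    # Under equal class priors, logit = log [q(1|x,y) / q(0|x,y)] ~= log PD(x, y).
    log_pd = critic(torch.cat([x, y], dim=1)).squeeze(1)
    # Averaging log PD over joint samples gives a plug-in MI estimate.
    print("estimated MI (nats):", log_pd.mean().item())
```

A usage note: because this formulation only solves a standard binary classification problem, it avoids directly optimizing an MI variational bound, which is the source of the large-variance issue the abstract refers to.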