Poster
Fisher Efficient Inference of Intractable Models
Song Liu · Takafumi Kanamori · Wittawat Jitkrittum · Yu Chen

Tue Dec 10 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #48

Maximum Likelihood Estimators (MLEs) have many desirable properties. For example, the asymptotic variance of an MLE attains the asymptotic Cramér-Rao lower bound (the efficiency bound), which is the minimum possible variance for an unbiased estimator. However, obtaining an MLE requires evaluating the likelihood function, which may be intractable due to the normalization term of the density model. In this paper, we derive a Discriminative Likelihood Estimator (DLE) from a Kullback-Leibler divergence minimization criterion, implemented via density ratio estimation and a Stein operator. We study the problem of model inference using DLE. We prove its consistency and show that the asymptotic variance of its solution attains the efficiency bound under mild regularity conditions. We also propose a dual formulation of DLE which can be easily optimized. Numerical studies validate our asymptotic theorems, and we give an example where DLE successfully estimates an intractable model built on a pre-trained deep neural network.
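
For readers unfamiliar with the efficiency bound referenced above, the classical Cramér-Rao inequality can be stated as follows (generic notation, not the paper's):

\[
  \mathrm{Var}\big(\hat{\theta}\big) \;\ge\; I(\theta)^{-1},
  \qquad
  I(\theta) = \mathbb{E}_{x \sim p(x;\theta)}\!\left[\left(\tfrac{\partial}{\partial\theta}\log p(x;\theta)\right)^{2}\right],
\]

where \(\hat{\theta}\) is an unbiased estimator of the parameter \(\theta\) and \(I(\theta)\) is the Fisher information of the model \(p(x;\theta)\). The paper shows that DLE can match this bound asymptotically, as MLE does, without requiring the normalization term of the density model.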

Author Information

Song Liu (University of Bristol)

I am a lecturer at the University of Bristol. Previously, I was a Project Assistant Professor at The Institute of Statistical Mathematics, Japan. I received my Doctor of Engineering degree from Tokyo Institute of Technology, supervised by Prof. [Masashi Sugiyama](http://www.ms.k.u-tokyo.ac.jp/sugi/), and was awarded the DC2 Fellowship from the Japan Society for the Promotion of Science.

Takafumi Kanamori (Tokyo Institute of Technology/RIKEN)
Wittawat Jitkrittum (Max Planck Institute for Intelligent Systems)
Yu Chen (University of Bristol)