Poster

Fisher Efficient Inference of Intractable Models

Song Liu · Takafumi Kanamori · Wittawat Jitkrittum · Yu Chen

East Exhibition Hall B + C #48

Keywords: [ Information Theory ] [ Optimization -> Convex Optimization ] [ Probabilistic Methods ] [ Theory -> Frequentist Statistics ] [ Algorithms ] [ Density Estimation ]


Abstract:

The Maximum Likelihood Estimator (MLE) has many desirable properties. For example, the asymptotic variance of the MLE attains the Cramér-Rao lower bound (efficiency bound), the minimum possible variance for an unbiased estimator. However, obtaining the MLE requires evaluating the likelihood function, which may be intractable due to the normalization term of the density model. In this paper, we derive a Discriminative Likelihood Estimator (DLE) from a Kullback-Leibler divergence minimization criterion, implemented via density ratio estimation and a Stein operator. We study the problem of model inference using DLE, prove its consistency, and show that the asymptotic variance of its solution attains the efficiency bound under mild regularity conditions. We also propose a dual formulation of DLE that can be easily optimized. Numerical studies validate our asymptotic theorems, and we give an example where DLE successfully estimates an intractable model constructed from a pre-trained deep neural network.
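The Stein operator is what makes the normalization term avoidable: for a (possibly unnormalized) model density q, the Langevin Stein operator (T_q f)(x) = f'(x) + f(x) d/dx log q(x) has zero expectation under q for suitable test functions f, and the score d/dx log q(x) is unaffected by the unknown normalizing constant. The following is a minimal sketch (not the authors' code) that verifies this zero-mean property numerically for a 1-D Gaussian; the choice of test function (tanh) and all names are illustrative assumptions.

```python
# Minimal sketch: the Langevin Stein operator has zero mean under q,
# and computing it never requires q's normalizing constant.
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized model: q(x) ∝ exp(-(x - 1)^2 / 2), i.e. N(1, 1) up to a constant.
def score(x):
    # d/dx log q(x); the unknown normalizer drops out of the derivative.
    return -(x - 1.0)

def stein_op(f, df, x):
    # Langevin Stein operator: (T_q f)(x) = f'(x) + f(x) * d/dx log q(x).
    return df(x) + f(x) * score(x)

f = np.tanh                               # bounded test function (assumption)
df = lambda x: 1.0 / np.cosh(x) ** 2      # its derivative

x = rng.normal(1.0, 1.0, size=200_000)    # samples from the true N(1, 1)
print(np.mean(stein_op(f, df, x)))        # ≈ 0, up to Monte Carlo error
```

Because E_q[(T_q f)(X)] = 0 only when the model q matches the data distribution, statistics built from the Stein operator can drive parameter estimation for unnormalized models, which is the role it plays in the DLE construction above.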
