Poster

Global Analysis of Expectation Maximization for Mixtures of Two Gaussians

Ji Xu · Daniel Hsu · Arian Maleki

Area 5+6+7+8 #183

Keywords: [ (Other) Statistics ] [ Learning Theory ]


Abstract:

Expectation Maximization (EM) is among the most popular algorithms for estimating the parameters of statistical models. However, EM, an iterative algorithm based on the maximum likelihood principle, is generally only guaranteed to find stationary points of the likelihood objective, and these points may be far from any maximizer. This article addresses this disconnect between the statistical principles behind EM and its algorithmic properties. Specifically, it provides a global analysis of EM for specific models in which the observations comprise an i.i.d. sample from a mixture of two Gaussians. This is achieved by (i) studying the sequence of parameters from an idealized execution of EM in the infinite sample limit, and fully characterizing the limit points of the sequence in terms of the initial parameters; and then (ii) based on this convergence analysis, establishing statistical consistency (or lack thereof) for the actual sequence of parameters produced by EM.
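As a rough illustration of the algorithm the abstract analyzes, the sketch below runs EM for a balanced mixture of two unit-variance Gaussians, estimating only the two component means. The simplifications (known balanced weights, known unit variance) and the helper name em_two_gaussians are assumptions made for illustration, not necessarily the exact model treated in the paper.

```python
import numpy as np

def em_two_gaussians(x, mu0, mu1, n_iter=200, tol=1e-8):
    """EM for the balanced mixture 0.5*N(mu0, 1) + 0.5*N(mu1, 1).

    Only the two means are updated; the mixing weights and the unit
    variance are held fixed. (Illustrative sketch, not the paper's code.)
    """
    for _ in range(n_iter):
        # E-step: posterior probability that each point came from component 1.
        # With equal priors and unit variances, this is a logistic function
        # of the log-likelihood ratio between the two components.
        log_ratio = -0.5 * ((x - mu1) ** 2 - (x - mu0) ** 2)
        w = 1.0 / (1.0 + np.exp(-log_ratio))
        # M-step: each mean becomes the responsibility-weighted sample mean.
        new_mu0 = np.sum((1.0 - w) * x) / np.sum(1.0 - w)
        new_mu1 = np.sum(w * x) / np.sum(w)
        converged = max(abs(new_mu0 - mu0), abs(new_mu1 - mu1)) < tol
        mu0, mu1 = new_mu0, new_mu1
        if converged:
            break
    return mu0, mu1

# Usage: a sample from 0.5*N(-2, 1) + 0.5*N(+2, 1).
rng = np.random.default_rng(0)
z = rng.integers(0, 2, size=10_000)
x = np.where(z == 0, rng.normal(-2.0, 1.0, 10_000), rng.normal(2.0, 1.0, 10_000))
print(em_two_gaussians(x, mu0=-0.5, mu1=0.5))  # converges near (-2, 2)
```

Which fixed point the iteration reaches depends on the initial pair (mu0, mu1); trying an initialization with both means on the same side of zero shows the dependence on the starting parameters that the paper's global analysis characterizes in the infinite sample limit.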
