Poster
Learning from the Wisdom of Crowds by Minimax Entropy
Denny Zhou · John C Platt · Sumit Basu · Yi Mao

Thu Dec 6th 02:00 -- 06:00 PM @ Harrah’s Special Events Center 2nd Floor

An important way to build large training sets is to gather noisy labels from crowds of non-experts. We propose a minimax entropy principle to improve the quality of these labels. Our method assumes that labels are generated by a probability distribution over workers, items, and labels. By maximizing the entropy of this distribution, the method naturally infers item confusability and worker expertise. We infer the ground truth by minimizing the entropy of this distribution, which we show minimizes the Kullback-Leibler (KL) divergence between the probability distribution and the unknown truth. We show that a simple coordinate descent scheme can optimize minimax entropy. Empirically, our results are substantially better than previously published methods for the same problem.
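To make the coordinate-descent idea concrete, here is a minimal sketch of alternating label aggregation in Python. It is a simplification, not the paper's method: it models only per-worker confusion matrices (Dawid-Skene style) and alternates between fitting those matrices from the current label posteriors and re-estimating the posteriors from the matrices, whereas the full minimax entropy model also carries per-item confusability parameters. The function name `aggregate_labels` and the `(worker, item, label)` triple encoding are illustrative assumptions.

```python
import numpy as np

def aggregate_labels(responses, n_labels, n_iters=20):
    """Simplified coordinate-descent label aggregation (worker side only).

    responses: iterable of (worker_id, item_id, label) integer triples.
    Returns (predicted_labels, label_posteriors).
    """
    responses = np.asarray(responses)
    n_workers = responses[:, 0].max() + 1
    n_items = responses[:, 1].max() + 1

    # Initialize label posteriors q[i, k] from per-item vote counts.
    q = np.zeros((n_items, n_labels))
    for w, i, l in responses:
        q[i, l] += 1.0
    q /= q.sum(axis=1, keepdims=True)

    for _ in range(n_iters):
        # (a) Given q, fit each worker's confusion matrix:
        #     conf[w, k, l] ~ P(worker w reports l | true label is k),
        #     with a small additive smoothing term.
        conf = np.full((n_workers, n_labels, n_labels), 0.01)
        for w, i, l in responses:
            conf[w, :, l] += q[i]
        conf /= conf.sum(axis=2, keepdims=True)

        # (b) Given the confusion matrices, re-estimate the posteriors
        #     by accumulating per-item log-likelihoods (uniform prior).
        logq = np.zeros((n_items, n_labels))
        for w, i, l in responses:
            logq[i] += np.log(conf[w, :, l])
        logq -= logq.max(axis=1, keepdims=True)  # numerical stability
        q = np.exp(logq)
        q /= q.sum(axis=1, keepdims=True)

    return q.argmax(axis=1), q

# Toy example: two reliable workers and one consistently flipped worker.
truth = [0, 1, 0, 1]
votes = [(w, i, t if w < 2 else 1 - t)
         for i, t in enumerate(truth) for w in range(3)]
pred, posterior = aggregate_labels(votes, n_labels=2)
```

Because the flipped worker's confusion matrix becomes anti-diagonal, its votes are inverted rather than discarded, which is the kind of worker-expertise inference the abstract describes.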

Author Information

Denny Zhou (Microsoft Research Redmond)
John C Platt (Microsoft Research)
Sumit Basu (Microsoft Research)
Yi Mao (Microsoft)
