

Poster

Learning Mixed Multinomial Logits with Provable Guarantees

Yiqun Hu · David Simchi-Levi · Zhenzhen Yan

Hall J (level 1) #307

Keywords: [ Provable algorithms ] [ Non-parametric estimation ] [ Conditional gradient (Frank-Wolfe) ] [ Mixed Multinomial Logits (MMNL) ] [ Sample Complexity ] [ Statistical Learning ]


Abstract:

A mixture of multinomial logits (MMNL) generalizes the single logit model, which is commonly used for predicting the probabilities of different outcomes. While many algorithms have been developed in the literature to learn MMNL models, theoretical results remain limited. Building on the Frank-Wolfe (FW) method, we propose a new algorithm that learns both the mixture weights and the component-specific logit parameters with provable convergence guarantees for an arbitrary number of mixtures. Our algorithm uses historical choice data to generate a set of candidate choice probability vectors, each close to the ground truth with high probability. We further provide a sample complexity analysis showing that only a polynomial number of samples is required to secure the performance guarantee of our algorithm. Finally, we conduct simulation studies to evaluate its performance and demonstrate how to apply the algorithm to real-world applications.
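To make the ingredients named in the abstract concrete, here is a minimal NumPy sketch of (i) the MMNL choice probability computation and (ii) a conditional-gradient (Frank-Wolfe) fit of mixture weights over a fixed set of candidate choice probability vectors. All names, the squared loss, and the fixed candidate set are illustrative assumptions for exposition, not the authors' exact algorithm, which also learns the component-specific logit parameters.

```python
import numpy as np

def mmnl_choice_probs(betas, weights, X):
    """Choice probabilities under a mixed multinomial logit (MMNL).

    betas:   (K, d) logit parameters of the K mixture components.
    weights: (K,) mixture weights on the probability simplex.
    X:       (n, d) feature vectors of the n alternatives.
    Returns an (n,) vector of choice probabilities.
    """
    utilities = X @ betas.T                                   # (n, K)
    expu = np.exp(utilities - utilities.max(axis=0, keepdims=True))
    component_probs = expu / expu.sum(axis=0, keepdims=True)  # per-component logit probs
    return component_probs @ weights                          # mixture over components

def frank_wolfe_weights(P, y, n_iters=200):
    """Fit mixture weights over fixed candidate choice-probability vectors.

    P: (n, K) matrix whose columns are candidate choice probability vectors.
    y: (n,) empirical choice frequencies from historical data.
    Minimizes ||P w - y||^2 over the simplex via conditional gradient.
    """
    K = P.shape[1]
    w = np.full(K, 1.0 / K)                   # start at the barycenter
    for t in range(n_iters):
        grad = 2.0 * P.T @ (P @ w - y)        # gradient of the squared loss
        s = np.zeros(K)
        s[np.argmin(grad)] = 1.0              # simplex vertex minimizing <grad, s>
        gamma = 2.0 / (t + 2.0)               # standard FW step size
        w = (1.0 - gamma) * w + gamma * s     # convex step keeps w on the simplex
    return w
```

Because each FW iterate is a convex combination of simplex vertices, the weights remain a valid probability distribution without any projection step, which is one reason conditional-gradient methods are a natural fit for learning mixture weights.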
