Learning Supervised PageRank with Gradient-Based and Gradient-Free Optimization Methods
Lev Bogolubsky · Pavel Dvurechenskii · Alexander Gasnikov · Gleb Gusev · Yurii Nesterov · Andrei M Raigorodskii · Aleksey Tikhonov · Maksim Zhukovskii

Mon Dec 05 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #16

In this paper, we consider the non-convex loss-minimization problem of learning Supervised PageRank models, which can account for features of nodes and edges. We propose gradient-based and random gradient-free methods to solve this problem. Our algorithms are based on the concept of an inexact oracle, and, unlike the state-of-the-art gradient-based method, we provide theoretical convergence rate guarantees for both of them. Finally, we compare the performance of the proposed optimization methods with the state of the art on a ranking task.
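To give a feel for the "random gradient-free" idea the abstract refers to, here is a minimal, hypothetical sketch (not the authors' code) of a two-point zeroth-order step: the method queries only function values along a random direction, which is the core mechanism of gradient-free optimization with an inexact (zeroth-order) oracle. The function `f`, the step sizes, and the toy quadratic loss below are all illustrative stand-ins for the actual ranking loss of the paper.

```python
import numpy as np

def gradient_free_step(f, x, step=0.05, mu=1e-5, rng=None):
    """One two-point random gradient-free update for minimizing f.

    Uses only two function evaluations to estimate a directional
    derivative along a random Gaussian direction u, then steps
    against that estimate. mu is the finite-difference smoothing
    parameter; step is the step size (illustrative values).
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal(x.shape)           # random search direction
    g = (f(x + mu * u) - f(x)) / mu * u        # unbiased-ish gradient estimate
    return x - step * g

# Toy usage: minimize a simple quadratic (a stand-in for the ranking loss).
f = lambda x: float(np.sum(x ** 2))
x = np.ones(5)
rng = np.random.default_rng(0)
for _ in range(2000):
    x = gradient_free_step(f, x, rng=rng)
```

After enough iterations the iterate drifts toward the minimizer even though no gradient is ever computed, which is the property the paper's gradient-free method exploits when exact gradients of the PageRank-based loss are expensive.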

Author Information

Lev Bogolubsky (Yandex, Moscow State University)
Pavel Dvurechenskii (Weierstrass Institute for Applied Analysis and Stochastics)

Since 2015: Research assistant, Research Group 6 "Stochastic Algorithms and Nonparametric Statistics", Weierstrass Institute for Applied Analysis and Stochastics, Berlin
2014-2015: Research assistant, Institute for Information Transmission Problems, Moscow, Russia
2009-2015: Junior researcher, Moscow Institute of Physics and Technology, Moscow, Russia
2013: Ph.D., Moscow Institute of Physics and Technology, Moscow, Russia
2010: Master's Diploma, Moscow Institute of Physics and Technology, Moscow, Russia
2008: Bachelor's Diploma, Moscow Institute of Physics and Technology, Moscow, Russia

Alexander Gasnikov (Moscow Institute of Physics and Technology)
Gleb Gusev (Yandex LLC)
Yurii Nesterov (Catholic University of Louvain (UCL))

Yurii Nesterov is a professor at the Center for Operations Research and Econometrics (CORE) at the Catholic University of Louvain (UCL), Belgium. He received his Ph.D. in Applied Mathematics in 1984 from the Institute of Control Sciences, Moscow, and has worked at CORE since 1993. His research interests concern complexity issues and efficient methods for solving various optimization problems. His main results are in convex optimization (optimal methods for smooth problems, polynomial-time interior-point methods, the smoothing technique for structural optimization, cubic regularization of Newton's method, and optimization methods for huge-scale problems). He is the author of 4 monographs and more than 80 refereed papers in leading optimization journals. He received the Dantzig Prize from SIAM and the Mathematical Programming Society in 2000, the von Neumann Theory Prize from INFORMS in 2009, and a SIAM Outstanding Paper Award in 2014.

Andrei M Raigorodskii (Moscow Institute of Physics and Technology)
Aleksey Tikhonov (Yandex Technologies GmbH)
Maksim Zhukovskii