Poster
Learning Positive Functions with Pseudo Mirror Descent
Yingxiang Yang · Haoxiang Wang · Negar Kiyavash · Niao He

Tue Dec 10 05:30 PM -- 07:30 PM (PST) @ East Exhibition Hall B + C #55

The nonparametric learning of positive-valued functions arises widely in machine learning, especially in estimating intensity functions of point processes. Yet existing approaches either require expensive projections or semidefinite relaxations, or lose convexity and theoretical guarantees once nonlinear link functions are introduced. In this paper, we propose a novel algorithm, pseudo mirror descent, that efficiently estimates positive functions within a Hilbert space without expensive projections. The algorithm guarantees positivity by performing mirror descent with an appropriately selected Bregman divergence, and adopts a pseudo-gradient to speed up gradient evaluation in practice. We analyze both the asymptotic and nonasymptotic convergence of the algorithm. Through simulations, we show that pseudo mirror descent outperforms state-of-the-art benchmarks for learning intensities of Poisson and multivariate Hawkes processes, in terms of both computational efficiency and accuracy.
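
The key mechanism described in the abstract, guaranteeing positivity through the choice of Bregman divergence rather than through projection, can be illustrated in a simple finite-dimensional setting. The sketch below is not the paper's Hilbert-space algorithm and does not include its pseudo-gradient; it assumes the negative-entropy mirror map (so the update is multiplicative and iterates stay strictly positive), and the Poisson-style loss, step size, and data are illustrative assumptions.

# Illustrative sketch only: finite-dimensional mirror descent with the
# negative-entropy mirror map (KL Bregman divergence). The update
#   x_{t+1} = x_t * exp(-eta * grad(x_t))
# is multiplicative, so iterates remain strictly positive with no projection.
# This is not the paper's Hilbert-space pseudo mirror descent; the
# Poisson-style loss, step size, and data below are assumptions.
import numpy as np

def entropy_mirror_descent(grad, x0, steps=500, eta=0.05):
    """Minimize a loss over the positive orthant via exponentiated updates."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))  # stays > 0 for any gradient value
    return x

# Toy example: recover a positive "intensity" vector from averaged Poisson
# counts by minimizing L(lam) = sum_i (lam_i - c_i * log(lam_i)),
# whose gradient is 1 - c / lam and whose minimizer is lam = c.
rng = np.random.default_rng(0)
lam_true = rng.uniform(0.5, 3.0, size=5)
c = rng.poisson(lam_true * 1000) / 1000.0  # empirical rates, approx lam_true

lam_hat = entropy_mirror_descent(lambda lam: 1.0 - c / lam, np.ones(5))
print("estimate:", np.round(lam_hat, 3))
print("truth:   ", np.round(lam_true, 3))

With the entropy mirror map, positivity is a structural consequence of the update itself, which is the property the paper exploits; the pseudo-gradient it introduces to cheapen gradient evaluation has no analogue in this toy.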

Author Information

Yingxiang Yang (University of Illinois at Urbana-Champaign)
Haoxiang Wang (University of Illinois at Urbana-Champaign)

First-year PhD student at UIUC working on machine learning. Spotlight paper at NeurIPS 2019: https://papers.nips.cc/paper/9563-learning-positive-functions-with-pseudo-mirror-descent CV: https://www.dropbox.com/s/1w7iaw1h1x8yhap/Haoxiang_Wang_UIUC-CV.pdf?dl=0

Negar Kiyavash (EPFL)
Niao He (UIUC)
