

Poster in Workshop: Machine Learning for Engineering Modeling, Simulation and Design

Scalable Combinatorial Bayesian Optimization with Tractable Statistical Models

Aryan Deshwal · Syrine Belakaria · Janardhan Rao Doppa


Abstract:

We study the problem of optimizing expensive black-box functions over combinatorial spaces (e.g., sets, sequences, trees, and graphs). BOCS is a state-of-the-art Bayesian optimization method for tractable statistical models, which performs semi-definite programming (SDP) based acquisition function optimization (AFO) to select the next structure for evaluation. Unfortunately, BOCS scales poorly to a large number of binary and/or categorical variables. Building on recent advances in submodular relaxation for solving Binary Quadratic Programs, we study an approach referred to as Parametrized Submodular Relaxation (PSR) with the goal of improving the scalability and accuracy of solving AFO problems for the BOCS model. Experiments on diverse benchmark problems, including real-world applications in communications engineering and electronic design automation, show significant improvements with PSR for the BOCS model.
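For binary variables, the AFO step mentioned above can be viewed as a Binary Quadratic Program (BQP) over weights learned by the surrogate model. The following is a minimal sketch, assuming a generic quadratic acquisition of the form x^T A x + b^T x with hypothetical A and b; the brute-force enumeration only illustrates the problem structure and the scalability bottleneck, and is not the SDP or PSR solver studied in the paper.

```python
import itertools
import numpy as np

def bqp_objective(x: np.ndarray, A: np.ndarray, b: np.ndarray) -> float:
    """Quadratic acquisition value for a binary vector x (assumed form x^T A x + b^T x)."""
    return float(x @ A @ x + b @ x)

def brute_force_afo(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Exact minimizer by enumeration over {0,1}^n.
    Feasible only for small n, which is exactly the scalability issue
    that relaxation-based solvers such as PSR aim to address."""
    n = len(b)
    best_x, best_val = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits, dtype=float)
        val = bqp_objective(x, A, b)
        if val < best_val:
            best_x, best_val = x, val
    return best_x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 10                      # toy size; realistic problems have many more binary variables
    A = rng.normal(size=(n, n))
    A = (A + A.T) / 2           # symmetric pairwise-interaction weights (illustrative)
    b = rng.normal(size=n)      # linear weights (illustrative)
    print(brute_force_afo(A, b))
```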
