Poster

Triangulation candidates for Bayesian optimization

Robert Gramacy · Annie Sauer · Nathan Wycoff

Hall J (level 1) #934

Keywords: [ Active Learning ] [ Gaussian process ] [ convex hull ] [ surrogate modeling ] [ Delaunay triangulation ] [ sequential design ] [ space-filling design ]


Abstract:

Bayesian optimization involves "inner optimization" over a new-data acquisition criterion that is non-convex or highly multi-modal, may be non-differentiable, or may otherwise thwart local numerical optimizers. In such cases it is common to replace continuous search with a discrete one over random candidates. Here we propose using candidates based on a Delaunay triangulation of the existing input design. We detail the construction of these "tricands" and demonstrate empirically how they outperform both numerically optimized acquisitions and random candidate-based alternatives, and are well-suited for hybrid schemes, on benchmark synthetic and real simulation experiments.
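To illustrate the core idea, here is a minimal sketch of building triangulation-based candidates with `scipy.spatial.Delaunay`. It uses simplex centroids as interior candidate points; this is an assumption for illustration only, and the paper's full "tricands" construction (including its treatment of the convex hull boundary) may differ.

```python
import numpy as np
from scipy.spatial import Delaunay

def triangulation_candidates(X):
    """Candidate points from a Delaunay triangulation of the design X.

    Illustrative sketch only: takes the centroid of each Delaunay
    simplex as an interior candidate. The published "tricands"
    construction may use a richer set of points.
    """
    tri = Delaunay(X)
    # tri.simplices has shape (n_simplices, d+1), indexing rows of X;
    # averaging each simplex's vertices gives its centroid.
    return X[tri.simplices].mean(axis=1)

# Usage: a 2-d design of 10 random points in the unit square
rng = np.random.default_rng(0)
X = rng.uniform(size=(10, 2))
cands = triangulation_candidates(X)
```

A discrete acquisition search would then evaluate the criterion (e.g., expected improvement under a Gaussian process surrogate) at each row of `cands` and select the maximizer, in place of a continuous numerical optimization.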
