

Poster in Workshop: OPT 2022: Optimization for Machine Learning

Quantization based Optimization: Alternative Stochastic Approximation of Global Optimization

Jinwuk Seok · Changsik Cho


Abstract:

In this study, we propose a global optimization algorithm based on quantizing the energy level of an objective function in an NP-hard problem. Under the white noise hypothesis for a quantization error with a dense and uniform distribution, the quantization error can be regarded as i.i.d. white noise. Stochastic analysis shows that the proposed algorithm converges weakly under only a Lipschitz continuity condition, without requiring local convergence properties such as a Hessian constraint on the objective function. This implies that the proposed algorithm ensures global optimization by Laplace's condition. Numerical experiments show that the proposed algorithm outperforms conventional learning methods in solving NP-hard optimization problems such as the traveling salesman problem.
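To make the idea concrete, the sketch below illustrates one way quantization of the objective can drive a global search: energy values are rounded to a grid whose resolution shrinks over time, so early on distinct basins look equal (letting the search cross barriers, with the rounding error acting as the white-noise perturbation), while later the finer grid recovers ordinary descent. This is a minimal illustration under assumed hyperparameters, not the authors' implementation; the names quantize and quantized_search and all parameter values are hypothetical.

    import numpy as np

    def quantize(value, step):
        # Round an energy level to a grid of resolution `step`;
        # the rounding error plays the role of additive white noise.
        return step * np.round(value / step)

    def quantized_search(f, x0, step0=1.0, decay=0.99, n_iters=5000,
                         sigma=0.1, seed=0):
        # Hypothetical sketch: compare *quantized* objective values and
        # shrink the quantization step over time, analogous to an
        # annealing temperature schedule.
        rng = np.random.default_rng(seed)
        x = np.array(x0, dtype=float)
        fx = f(x)
        step = step0
        for _ in range(n_iters):
            cand = x + sigma * rng.standard_normal(x.shape)
            f_cand = f(cand)
            # Coarse quantization early: candidates in the same energy
            # bin are accepted, so the walker can cross small barriers.
            if quantize(f_cand, step) <= quantize(fx, step):
                x, fx = cand, f_cand
            step *= decay  # refine the quantization grid over time
        return x, fx

    # Usage on a multimodal test function (Rastrigin-like)
    f = lambda x: np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)
    x_best, f_best = quantized_search(f, x0=np.array([3.0, -4.0]))
    print(x_best, f_best)

The geometric decay of the quantization step is one assumed schedule; the weak-convergence analysis in the paper concerns how such a schedule must behave, requiring only Lipschitz continuity of the objective.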
