# Jialin Li: Convergence Rate Results in Black-Box Optimization with Surrogate Models

## Abstract

The optimization of complex systems often relies on purely empirical observations of performance through experiments, with no explicit mathematical expression for the underlying function. One can record the performance of a particular configuration of input parameters, but such experiments can be very expensive. Thus, one has to use a small number of experiments to accurately predict the outcomes for other parameter values that cannot be tested. In this talk, I will present two works in which prediction is accomplished by constructing a surrogate model (an interpolation of the observations), with efficiency quantified by convergence rates. The first approach designs experiments with epsilon-greedy, a randomized algorithm that searches for solutions either near the current best or over the entire domain. The rate at which the estimate converges to the global optimum is studied; the order of this rate improves when the width of the local search neighborhood shrinks over time at a suitably chosen speed. Alternatively, the decision-maker can take a Bayesian viewpoint on the underlying black-box function, which yields exponential decay rates for the probabilities of deviation errors. In both works, the convergence rates connect the optimization error to a measure of how well the design points fill the domain.
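The epsilon-greedy search strategy described above can be sketched in a few lines. This is a minimal illustration, not the speaker's actual algorithm: it assumes a one-dimensional toy objective, a fixed exploration probability `eps`, and a local neighborhood whose width shrinks like `1/sqrt(t)` (one concrete choice of the shrinking schedule mentioned in the abstract).

```python
import random

def epsilon_greedy_search(f, lower, upper, n_iters=200, eps=0.1, width0=0.5, seed=0):
    """Minimize a 1-D black-box function f on [lower, upper].

    With probability eps, sample uniformly over the whole domain
    (global exploration); otherwise sample near the current best
    point (local search). The local neighborhood width shrinks over
    time, mirroring the shrinking-width idea in the abstract.
    """
    rng = random.Random(seed)
    best_x = rng.uniform(lower, upper)
    best_y = f(best_x)
    for t in range(1, n_iters + 1):
        if rng.random() < eps:
            # Global step: uniform draw over the full domain.
            x = rng.uniform(lower, upper)
        else:
            # Local step: uniform draw in a shrinking neighborhood,
            # clipped back into the domain.
            w = width0 / t ** 0.5
            x = min(max(best_x + rng.uniform(-w, w), lower), upper)
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Toy usage: minimize (x - 0.3)^2 on [0, 1]; the global optimum is x = 0.3.
x_star, y_star = epsilon_greedy_search(lambda x: (x - 0.3) ** 2, 0.0, 1.0)
```

In practice the candidate points would be scored with a surrogate model fitted to the expensive observations rather than by direct evaluation; the sketch only shows the explore/exploit sampling logic.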

## Speaker

Visiting Assistant Professor

Department of Mathematics and Statistics, University of Massachusetts Amherst