Bayesian optimization in effective dimensions via kernel-based sensitivity indices
Abstract
A determining factor in the utility of optimization algorithms is their cost. One strategy to contain this cost is to reduce the dimension of the search space by detecting the most important variables and optimizing over them only. Recently, sensitivity measures based on the Hilbert-Schmidt Independence Criterion (HSIC), adapted to optimization variables, have been proposed. In this work, the HSIC sensitivities are used within a new Bayesian global optimization algorithm to reduce the dimension of the problem. At each iteration, the activation of the optimization variables is challenged in either a deterministic or a probabilistic manner. Several strategies are proposed for filling in the variables that are dropped out. Numerical tests carried out at a low number of function evaluations confirm the computational gains brought by the HSIC variable selection and point to the complementarity of the variable selection and fill-in strategies.
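To make the sensitivity-based variable selection concrete, below is a minimal sketch of the classical (unconditioned) HSIC estimator of Gretton et al. (2005), used here to rank input variables by their dependence with the objective values. The paper's adaptation of HSIC to optimization variables (e.g., focusing on low objective values) is not reproduced; the Gaussian kernel lengthscales, the sample sizes, and the toy objective are illustrative assumptions, not choices from the paper.

```python
import numpy as np

def rbf_kernel(a, lengthscale):
    # Gaussian (RBF) kernel matrix for a 1-D sample vector `a`.
    d2 = (a[:, None] - a[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def hsic(x, y, lx=1.0, ly=1.0):
    # Biased estimator of HSIC between two 1-D samples
    # (Gretton et al., 2005): HSIC = tr(K H L H) / (n - 1)^2,
    # where H is the centering matrix.
    n = len(x)
    K = rbf_kernel(x, lx)
    L = rbf_kernel(y, ly)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy illustration: rank variables by their HSIC with the objective.
# Variables with low indices would be candidates for deactivation
# at the next optimization iteration.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 5))                         # 200 points, 5 variables
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)   # only x0 matters
scores = [hsic(X[:, j], y) for j in range(X.shape[1])]
print(np.argsort(scores)[::-1])                        # x0 should rank first
```

In a Bayesian optimization loop, such indices would be recomputed from the accumulated evaluations at each iteration, and the set of active variables updated accordingly, either by thresholding the indices (deterministic) or by sampling activations with probabilities derived from them (probabilistic); the dropped variables must then be assigned values by one of the fill-in strategies mentioned in the abstract.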