Bayesian optimization in effective dimensions via kernel-based sensitivity indices

Abstract: A determining factor in the utility of optimization algorithms is their cost. One strategy to contain this cost is to reduce the dimension of the search space by detecting the most important variables and optimizing over them only. Recently, sensitivity measures based on the Hilbert-Schmidt Independence Criterion (HSIC), adapted to optimization variables, have been proposed. In this work, the HSIC sensitivities are used within a new Bayesian global optimization algorithm to reduce the dimension of the problem. At each iteration, the activation of optimization variables is challenged in a deterministic or probabilistic manner. Several strategies for filling in the variables that are dropped are proposed. Numerical tests carried out at low numbers of function evaluations confirm the computational gains brought by the HSIC variable selection and point to the complementarity of the variable-selection and fill-in strategies.
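The abstract gives no implementation details; the Python sketch below only illustrates the general kind of kernel-based sensitivity computation it refers to. It estimates an empirical HSIC between each input variable and an indicator of low objective values, keeps the variables whose score exceeds a relative threshold, and freezes the dropped variables at the current best point. The helper names (gaussian_gram, select_active_variables), the quantile-based indicator, the threshold rule, and this particular fill-in choice are illustrative assumptions, not the exact procedure of the presented algorithm, whose acquisition and Gaussian-process steps are omitted here.

import numpy as np

def gaussian_gram(z, sigma):
    # Gaussian (RBF) kernel Gram matrix for a sample of scalars or vectors.
    z = np.asarray(z, dtype=float).reshape(len(z), -1)
    sq = np.sum(z ** 2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * z @ z.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma_x=1.0, sigma_y=1.0):
    # Biased empirical HSIC estimator between two samples of equal size.
    n = len(x)
    K = gaussian_gram(x, sigma_x)
    L = gaussian_gram(y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def select_active_variables(X, f_values, quantile=0.25, threshold_ratio=0.1):
    # Rank input variables by their HSIC with an indicator of "good" points
    # (objective below a quantile). The indicator target and the relative
    # threshold are assumptions made for this sketch.
    good = (f_values <= np.quantile(f_values, quantile)).astype(float)
    scores = np.array([hsic(X[:, j], good) for j in range(X.shape[1])])
    return scores >= threshold_ratio * scores.max(), scores

# Toy usage: only the first two of five variables matter.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(80, 5))
f = X[:, 0] ** 2 + (X[:, 1] - 0.5) ** 2 + 0.01 * rng.normal(size=80)

active, scores = select_active_variables(X, f)
x_best = X[np.argmin(f)]

# One possible fill-in strategy: inactive variables are frozen at the current
# best point while the active ones would be optimized by the acquisition.
candidate = x_best.copy()
candidate[active] = rng.uniform(-1.0, 1.0, size=active.sum())  # placeholder for acquisition maximization
print("HSIC scores:", np.round(scores, 4))
print("Active variables:", np.where(active)[0])

In a full Bayesian optimization loop, this selection would be re-evaluated at each iteration, so a variable dropped early can become active again as more data accumulate.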
Metadata

https://hal-emse.ccsd.cnrs.fr/emse-02295164
Contributor: Florent Breuil
Submitted on: Tuesday, September 24, 2019 - 8:54:10 AM
Last modification on: Thursday, October 17, 2019 - 12:36:13 PM

Identifiers

  • HAL Id: emse-02295164, version 1

Citation

Adrien Spagnol, Rodolphe Le Riche, Sébastien da Veiga. Bayesian optimization in effective dimensions via kernel-based sensitivity indices. 30th European Conference on Operational Research (EURO 2019), June 2019, Dublin, Ireland. ⟨emse-02295164⟩
