Bayesian optimization in effective dimensions via kernel-based sensitivity indices

Abstract: A determining factor in the utility of optimization algorithms is their cost. One strategy to contain this cost is to reduce the dimension of the search space by detecting the most important variables and optimizing over them only. Recently, sensitivity measures based on the Hilbert-Schmidt Independence Criterion (HSIC), adapted to optimization variables, have been proposed. In this work, the HSIC sensitivities are used within a new Bayesian global optimization algorithm to reduce the dimension of the problem. At each iteration, the activation of optimization variables is challenged in a deterministic or probabilistic manner. Several strategies are proposed for filling in the variables that are dropped out. Numerical tests carried out with a low number of function evaluations confirm the computational gains brought by the HSIC variable selection and point to the complementarity of the variable selection and fill-in strategies.
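The variable selection described above ranks inputs by their HSIC dependence with the objective. As a minimal sketch (not the paper's exact formulation, which adapts HSIC to optimization variables), the following computes the standard biased V-statistic HSIC estimator with Gaussian kernels; the function names and the fixed lengthscale are illustrative assumptions.

```python
import numpy as np

def rbf_gram(x, lengthscale=1.0):
    # Gaussian (RBF) Gram matrix for a 1-D sample x.
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def hsic(x, y, lengthscale=1.0):
    # Biased V-statistic estimator: HSIC(x, y) = trace(K H L H) / n^2,
    # where H = I - (1/n) 11^T centers the Gram matrices K and L.
    # Larger values indicate stronger dependence between x and y.
    n = len(x)
    K = rbf_gram(x, lengthscale)
    L = rbf_gram(y, lengthscale)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2
```

In a dimension-reduction loop, one would compute such an index between each optimization variable and (an indicator of) good objective values, then deactivate variables with low sensitivity and fill them in with one of the proposed strategies.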

https://hal-emse.ccsd.cnrs.fr/emse-02133923
Contributor: Florent Breuil
Submitted on: Monday, May 20, 2019 - 10:26:51 AM
Last modification on: Tuesday, May 21, 2019 - 1:37:33 AM

Identifiers

  • HAL Id: emse-02133923, version 1

Citation

Adrien Spagnol, Rodolphe Le Riche, Sébastien da Veiga. Bayesian optimization in effective dimensions via kernel-based sensitivity indices. 13th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP13), May 2019, Seoul, South Korea. ⟨emse-02133923⟩
