Conference Papers Year : 2023

An Empirical Case of Gaussian Processes Learning in High Dimension: the Likelihood versus Leave-One-Out Rivalry

Abstract

Gaussian Processes (GPs) are semi-parametric models commonly employed in various real-world applications such as statistical modeling and sensitivity analysis. They play an instrumental role in Bayesian optimization since they determine the sequence of iterates where the objective function is evaluated. GPs are particularly useful in the context of small data, when there are of the order of a hundred to a thousand points to learn from. However, GPs suffer particularly from the curse of dimensionality [2]: at a fixed number of data points, their predictive capability may decrease dramatically in high dimension (d ≳ 40) [3]. In this talk, we investigate such a phenomenon in detail. We illustrate this loss of performance with increasing dimension on simple functions and analyze its underlying symptoms, in particular a tendency to become constant away from the data points. We show that the fundamental problem is one of learning and not one of representation capacity: maximum likelihood, the dominant loss function for such models, can miss regions of optimality of the GP hyperparameters. Failure of maximum likelihood is related to statistical model inadequacy [1]: a model with a constant trend is sensitive to dimensionality when fitting quadratic functions, while it handles dimension growth much better for linear functions or Gaussian trajectories generated with the right covariance. Our experiments also show that the leave-one-out loss function is less prone to the curse of dimensionality, even for inadequate statistical models.
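The two competing loss functions mentioned in the abstract can be written down concretely. Below is a minimal sketch, not the authors' experimental setup: it assumes a squared-exponential kernel with a zero constant trend, uses the standard negative log marginal likelihood and the standard closed-form leave-one-out predictive log probability, and evaluates both on an illustrative quadratic test function in dimension 10. The dataset size, lengthscale grid, and noise level are all arbitrary choices for illustration.

```python
import numpy as np

def sq_exp_kernel(X1, X2, lengthscale, variance):
    """Squared-exponential (RBF) covariance between two sets of points."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_marginal_likelihood(X, y, lengthscale, variance, noise=1e-4):
    """Negative log marginal likelihood of a zero-mean GP."""
    n = len(y)
    K = sq_exp_kernel(X, X, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    # 0.5 y^T K^{-1} y + 0.5 log|K| + (n/2) log 2*pi
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * n * np.log(2 * np.pi)

def neg_loo_log_prob(X, y, lengthscale, variance, noise=1e-4):
    """Negative leave-one-out predictive log probability (closed form via K^{-1})."""
    n = len(y)
    K = sq_exp_kernel(X, X, lengthscale, variance) + noise * np.eye(n)
    Kinv = np.linalg.inv(K)
    diag = np.diag(Kinv)
    mu_i = y - (Kinv @ y) / diag      # LOO predictive mean at each left-out point
    var_i = 1.0 / diag                # LOO predictive variance at each point
    logp = -0.5 * np.log(var_i) - (y - mu_i) ** 2 / (2 * var_i) \
           - 0.5 * np.log(2 * np.pi)
    return -np.sum(logp)

# Illustrative setup: a quadratic function, the case where the abstract
# reports that a constant-trend GP struggles as dimension grows.
rng = np.random.default_rng(0)
d, n = 10, 30
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sum(X ** 2, axis=1)
y = (y - y.mean()) / y.std()          # center and scale the observations

# Compare which lengthscale each loss function selects on a coarse grid.
grid = np.linspace(0.5, 10.0, 40)
nll = [neg_log_marginal_likelihood(X, y, l, 1.0) for l in grid]
loo = [neg_loo_log_prob(X, y, l, 1.0) for l in grid]
print("lengthscale minimizing ML  loss:", grid[int(np.argmin(nll))])
print("lengthscale minimizing LOO loss:", grid[int(np.argmin(loo))])
```

In a real study the hyperparameters (lengthscale, variance, noise) would be optimized jointly with a gradient-based method rather than scanned on a grid; the grid keeps the comparison between the two objective surfaces easy to inspect.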

Dates and versions

emse-04344701, version 1 (14-12-2023)

Identifiers

  • HAL Id: emse-04344701, version 1

Cite

David Gaudrie, Rodolphe Le Riche, Tanguy Appriou. An Empirical Case of Gaussian Processes Learning in High Dimension: the Likelihood versus Leave-One-Out Rivalry. PGMO DAYS 2023, Nov 2023, Paris-Saclay, France. ⟨emse-04344701⟩