Fast Gaussian Process Regression at Extreme Scale

  • Anthony P Stephenson

Student thesis: Doctoral Thesis, Doctor of Philosophy (PhD)

Abstract

The overarching aim of this thesis is to extend the state of the art in fast and scalable Bayesian inference techniques, with a focus on efficient application to large datasets in complex, non-linear modelling situations, whilst preserving the crucial properties we associate with Bayesian methods, namely well-calibrated uncertainty estimation.
A natural starting point in this context is the Gaussian process (GP), which defines probability distributions over function spaces and enables highly flexible, accurate regression models with well-calibrated uncertainty quantification. However, GP regression quickly becomes computationally infeasible on large datasets due to its cubic complexity. Consequently, the literature abounds with approximations to GPs that aim to address this issue.
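As an illustrative sketch (not drawn from the thesis itself), the cubic complexity mentioned above arises because exact GP regression requires factorising the n x n kernel matrix. A minimal NumPy implementation, assuming a standard squared-exponential kernel and Gaussian observation noise, makes the bottleneck explicit:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(a, b) = variance * exp(-|a - b|^2 / (2 * lengthscale^2))
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def gp_predict(X, y, X_star, noise=1e-2):
    # Exact GP regression. The Cholesky factorisation of the n x n kernel
    # matrix is the O(n^3) step that makes this infeasible at large n.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)                             # O(n^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # (K + noise*I)^{-1} y
    K_star = rbf_kernel(X_star, X)
    mean = K_star @ alpha                                 # posterior mean
    v = np.linalg.solve(L, K_star.T)
    var = np.diag(rbf_kernel(X_star, X_star)) - (v ** 2).sum(0)  # posterior variance
    return mean, var

# Toy example: noisy observations of sin(x)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
mean, var = gp_predict(X, y, X_star=np.array([[2.5]]))
```

Doubling n here roughly octuples the cost of the Cholesky step, which is precisely why the scalable approximations surveyed and developed in the thesis trade exactness for lower complexity.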
In this work, we develop methods to carefully assess and benchmark the performance of competing GP approximations. We propose novel, highly effective large-scale GP predictive algorithms, including a general scalable framework that can accommodate many pre-existing approximations. Finally, we offer a strategy for selecting suitable approximation methods via elegant, resource-efficient heuristics.
Date of Award: 10 Dec 2024
Original language: English
Awarding Institution
  • University of Bristol
Supervisors: Robert F Allison & Edward Pyzer-Knapp

Keywords

  • Machine Learning
  • Gaussian Process
  • Regression
  • Statistics
  • Computational
