Asymptotic Analysis of an Adaptive Stochastic Gradient Descent: Non-convexity and Markovian Dynamics

  • Al Zakaria

Student thesis: Doctoral Thesis › Doctor of Philosophy (PhD)

Abstract

This thesis introduces a modified AdaGrad-Norm stepsize scheme for stochastic gradient descent that generalises the scaling factor of the standard AdaGrad scheme. An asymptotic analysis of this scheme is undertaken in the setting of Markovian dynamics with a non-convex objective function. Under assumptions common in the literature, I demonstrate the stability of the modified AdaGrad-Norm scheme and its almost-sure local convergence. Under modified assumptions, a convergence rate is established for the scheme, and this rate is used to demonstrate the asymptotic normality of the algorithm iterates.
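For context, the standard AdaGrad-Norm scheme the thesis builds on adapts a single scalar stepsize from the running sum of squared gradient norms. The sketch below is a minimal illustration of that standard update, not the thesis's modified scheme; the function name, parameters, and test objective are illustrative choices.

```python
import numpy as np

def adagrad_norm_sgd(grad, x0, eta=1.0, b0=1.0, steps=100):
    """Run SGD with the standard AdaGrad-Norm stepsize:
    x_{t+1} = x_t - (eta / sqrt(b0^2 + sum_i ||g_i||^2)) * g_t.
    `grad` returns a (possibly stochastic) gradient estimate."""
    x = np.asarray(x0, dtype=float)
    b2 = b0 ** 2  # accumulator for squared gradient norms
    for _ in range(steps):
        g = np.asarray(grad(x), dtype=float)
        b2 += np.dot(g, g)            # grow the accumulator
        x = x - (eta / np.sqrt(b2)) * g  # scalar adaptive stepsize
    return x
```

The thesis's modification generalises the scaling factor in the denominator; with a deterministic gradient of a simple quadratic, the standard scheme above drives the iterate toward the minimiser as the accumulated norm grows.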
Date of Award: 4 Feb 2025
Original language: English
Awarding Institution
  • University of Bristol
Supervisors: Vladislav Tadic & Christophe Andrieu
