This thesis introduces a modified AdaGrad-Norm stepsize scheme for stochastic gradient descent that generalises the scaling factor of the standard AdaGrad-Norm scheme. An asymptotic analysis of this stepsize scheme is undertaken in the setting of Markovian dynamics with a non-convex objective function. Under assumptions common in the literature, I demonstrate the stability of the modified AdaGrad-Norm scheme and its almost sure local convergence. Under a modified set of assumptions, a convergence rate is established for the scheme, and this rate is then used to demonstrate the asymptotic normality of the algorithm iterates.
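For orientation, the sketch below shows the standard AdaGrad-Norm update that the thesis builds on: a single scalar accumulator of squared gradient norms scales every step, rather than a per-coordinate accumulator. The abstract does not specify the thesis's generalised scaling factor, so this is a minimal illustration of the baseline scheme only; the function name, default values of `eta` and `b0`, and the `grad` interface are all hypothetical choices for the example.

```python
import numpy as np

def adagrad_norm_sgd(grad, x0, eta=1.0, b0=1e-2, n_steps=1000, rng=None):
    """SGD with the baseline AdaGrad-Norm stepsize (illustrative sketch).

    grad(x, rng) should return a stochastic gradient estimate at x.
    eta and b0 are hypothetical defaults chosen for illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    b2 = b0 ** 2  # squared scalar accumulator b_k^2
    for _ in range(n_steps):
        g = grad(x, rng)
        b2 += np.dot(g, g)                # b_k^2 = b_{k-1}^2 + ||g_k||^2
        x = x - (eta / np.sqrt(b2)) * g   # x_{k+1} = x_k - (eta / b_k) g_k
    return x

# Usage example: a noisy quadratic objective f(x) = ||x||^2 / 2,
# whose stochastic gradient is x plus Gaussian noise.
if __name__ == "__main__":
    noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
    x_final = adagrad_norm_sgd(noisy_grad, x0=np.ones(5))
    print(x_final)
```

Because the accumulator `b2` is nondecreasing, the effective stepsize `eta / sqrt(b2)` shrinks automatically as gradients are observed, which is the mechanism whose generalised form the thesis analyses.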
Asymptotic Analysis of an Adaptive Stochastic Gradient Descent: Non-convexity and Markovian Dynamics
Zakaria, A. (Author). 4 Feb 2025
Student thesis: Doctoral Thesis › Doctor of Philosophy (PhD)