Abstract
The asymptotic behavior of the stochastic gradient algorithm with biased gradient estimates is analyzed. Relying on arguments from dynamical systems theory (chain-recurrence) and differential geometry (the Yomdin theorem and Lojasiewicz inequalities), upper bounds on the asymptotic bias of this algorithm are derived. The results hold under mild conditions and cover a broad class of algorithms used in machine learning, signal processing and statistics.
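The phenomenon the paper studies can be illustrated with a minimal sketch (a hypothetical toy example, not the authors' construction): run stochastic gradient descent on f(θ) = ½θ² where each gradient estimate carries zero-mean noise plus a persistent bias term. The iterates then settle near −bias instead of the true minimizer 0, so the asymptotic bias of the iterates scales with the bias of the gradient estimates.

```python
import random

def sgd_with_biased_gradients(theta0, steps, bias, seed=0):
    """Toy illustration: minimize f(theta) = 0.5 * theta**2 using
    gradient estimates grad = theta + noise + bias, where `bias` is a
    persistent (non-vanishing) error. With diminishing step sizes
    gamma_n = 1/n, the iterates converge toward -bias rather than the
    true minimizer 0."""
    rng = random.Random(seed)
    theta = theta0
    for n in range(1, steps + 1):
        gamma = 1.0 / n                        # diminishing step size
        noise = rng.gauss(0.0, 1.0)            # zero-mean stochastic noise
        grad_estimate = theta + noise + bias   # true gradient of f is theta
        theta -= gamma * grad_estimate
    return theta
```

For example, with `bias=0.5` the iterates approach −0.5, while with `bias=0.0` they approach the true minimizer 0; the distance between the limit set and the set of minimizers is controlled by the size of the bias, which is the kind of upper bound the paper makes precise.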
| Original language | English |
|---|---|
| Pages (from-to) | 3255-3304 |
| Number of pages | 50 |
| Journal | Annals of Applied Probability |
| Volume | 27 |
| Issue number | 6 |
| Early online date | 15 Dec 2017 |
| DOIs | |
| Publication status | Published - Dec 2017 |
Keywords
- Biased gradient estimation
- Chain-recurrence
- Lojasiewicz inequalities
- Stochastic gradient search
- Yomdin theorem
Profiles
- Dr Vladislav Tadic
  - Statistical Science
  - Probability, Analysis and Dynamics
  - School of Mathematics - Senior Lecturer in Statistics
  - Statistics
  - Person: Academic, Member