Asymptotic bias of stochastic gradient search

Vladislav B. Tadic, Arnaud Doucet

Research output: Contribution to journal › Article (Academic Journal) › peer-review

32 Citations (Scopus)
293 Downloads (Pure)


The asymptotic behavior of the stochastic gradient algorithm with biased gradient estimates is analyzed. Relying on arguments from dynamical systems theory (chain recurrence) and differential geometry (the Yomdin theorem and Łojasiewicz inequalities), upper bounds on the asymptotic bias of this algorithm are derived. The results hold under mild conditions and cover a broad class of algorithms used in machine learning, signal processing and statistics.
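To illustrate the phenomenon the abstract describes, here is a minimal sketch (not the paper's analysis) of stochastic gradient descent on a toy objective where the gradient estimate carries a constant bias: the iterates then settle near a point offset from the true minimizer, and the offset scales with the bias. All names and the toy objective are hypothetical.

```python
import random

def biased_sgd(theta0, bias, noise_scale, steps, seed=0):
    """Run SGD on the toy objective f(theta) = theta**2 using a
    gradient estimate 2*theta + bias + noise, i.e. an estimate that
    carries a constant bias plus zero-mean Gaussian noise.
    (Hypothetical illustration, not the algorithm analyzed in the paper.)"""
    rng = random.Random(seed)
    theta = theta0
    for n in range(1, steps + 1):
        grad_est = 2.0 * theta + bias + rng.gauss(0.0, noise_scale)
        theta -= (1.0 / n) * grad_est  # decreasing step sizes 1/n
    return theta
```

With an unbiased estimate the iterates approach the minimizer 0; a constant bias c shifts the limit to roughly -c/2, the point where the biased "gradient" 2*theta + c vanishes. Bounding the distance between such limit sets and the true critical points, under far weaker assumptions, is the kind of question the paper addresses.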
Original language: English
Pages (from-to): 3255-3304
Number of pages: 50
Journal: Annals of Applied Probability
Issue number: 6
Early online date: 15 Dec 2017
Publication status: Published - Dec 2017


  • Biased gradient estimation
  • Chain-recurrence
  • Łojasiewicz inequalities
  • Stochastic gradient search
  • Yomdin theorem

