Asymptotic bias of stochastic gradient search

Vladislav B. Tadic, Arnaud Doucet

Research output: Contribution to journal › Article (Academic Journal) › peer-review


Abstract

The asymptotic behavior of the stochastic gradient algorithm with biased gradient estimates is analyzed. Relying on arguments from dynamical systems theory (chain recurrence) and differential geometry (the Yomdin theorem and Łojasiewicz inequalities), upper bounds on the asymptotic bias of this algorithm are derived. The results hold under mild conditions and cover a broad class of algorithms used in machine learning, signal processing and statistics.
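The setting of the abstract can be illustrated with a minimal sketch (not the paper's method): plain stochastic gradient descent driven by a gradient estimate corrupted by both zero-mean noise and a constant additive bias. The objective, step sizes, bias level and noise scale below are hypothetical choices for illustration only; the sketch shows how the iterates settle near a point whose distance from the true minimizer is of the order of the bias.

```python
import random

def biased_sgd(grad, x0, bias, noise_scale, n_iters, seed=0):
    """Run SGD where each gradient estimate is grad(x) + bias + noise.

    Uses decreasing step sizes a_n = 1/(n+1); all choices here are
    illustrative assumptions, not taken from the paper.
    """
    rng = random.Random(seed)
    x = x0
    for n in range(n_iters):
        step = 1.0 / (n + 1)
        g = grad(x) + bias + rng.gauss(0.0, noise_scale)
        x -= step * g
    return x

# Minimize f(x) = x^2, so grad f(x) = 2x and the minimizer is x* = 0.
# With a constant bias b, the mean-field fixed point solves 2x + b = 0,
# i.e. x = -b/2: the asymptotic error scales with the estimate's bias.
x_unbiased = biased_sgd(lambda x: 2 * x, x0=5.0, bias=0.0,
                        noise_scale=0.1, n_iters=20000)
x_biased = biased_sgd(lambda x: 2 * x, x0=5.0, bias=0.5,
                      noise_scale=0.1, n_iters=20000)
```

With `bias=0.0` the iterates approach the true minimizer 0, while with `bias=0.5` they approach roughly -0.25, consistent with an asymptotic bias proportional to the gradient-estimate bias.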
Original language: English
Pages (from-to): 3255-3304
Number of pages: 50
Journal: Annals of Applied Probability
Volume: 27
Issue number: 6
Early online date: 15 Dec 2017
DOIs
Publication status: Published - Dec 2017

Keywords

  • Biased gradient estimation
  • Chain-recurrence
  • Łojasiewicz inequalities
  • Stochastic gradient search
  • Yomdin theorem
