Guiding Time-Varying Generative Models by Natural Gradients on Exponential Family Manifold

Song Liu, Leyang Wang, Yakun Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Optimising probabilistic models is a well-studied problem in statistics. However, its connection with the training of generative models remains largely under-explored. In this paper, we show that the evolution of time-varying generative models can be projected onto an exponential family manifold, naturally creating a link between the parameters of a generative model and those of a probabilistic model. We then train the generative model by moving its projection on the manifold according to the natural gradient descent scheme. This approach also allows us to approximate the natural gradient of the KL divergence efficiently without relying on MCMC for intractable models. Furthermore, we propose particle versions of the algorithm, which feature closed-form update rules for any parametric model within the exponential family. Through toy and real-world experiments, we validate the effectiveness of the proposed algorithms.
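For context on the natural-gradient scheme the abstract refers to, below is a minimal sketch of natural gradient descent on an exponential family, here a one-dimensional Gaussian with sufficient statistics T(x) = (x, x^2). It relies only on two standard identities: for an exponential family the Fisher information equals the covariance of T(x) under the model, and the gradient of KL(p_data || p_theta) in the natural parameters is the gap between model and data moments. This is not the paper's projection or particle algorithm; the target distribution, step size, and iteration count are illustrative assumptions.

# Minimal sketch (not the paper's algorithm): natural gradient descent on a
# 1-D Gaussian exponential family p(x | theta) propto exp(theta . T(x) - A(theta))
# with T(x) = (x, x^2). For exponential families the Fisher information is
# Cov_theta[T(x)], and grad_theta KL(p_data || p_theta) = E_theta[T] - E_data[T].
import numpy as np

def mean_params(theta):
    """Return (E[T(x)], mean, variance) for natural parameters
    theta = (m / s2, -1 / (2 * s2)) of N(m, s2)."""
    t1, t2 = theta
    s2 = -1.0 / (2.0 * t2)                 # variance (requires t2 < 0)
    m = t1 * s2                            # mean
    return np.array([m, m**2 + s2]), m, s2

def fisher(m, s2):
    """Fisher information = covariance of T(x) = (x, x^2) under N(m, s2)."""
    return np.array([[s2, 2.0 * m * s2],
                     [2.0 * m * s2, 2.0 * s2**2 + 4.0 * m**2 * s2]])

# Illustrative target: moments estimated from samples of N(1, 0.8^2).
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=0.8, size=10_000)
mu_star = np.array([x.mean(), (x**2).mean()])

theta = np.array([0.0, -0.5])              # initialise at N(0, 1)
for _ in range(300):
    mu, m, s2 = mean_params(theta)
    nat_grad = np.linalg.solve(fisher(m, s2), mu - mu_star)
    theta = theta - 0.5 * nat_grad         # step size 0.5 is an assumption

_, m, s2 = mean_params(theta)
print(f"fitted mean {m:.3f}, variance {s2:.3f}")   # approx. 1.0 and 0.64

Because the Jacobian of the mean parameters with respect to the natural parameters is exactly the Fisher matrix, a natural-gradient step in theta is, to first order, a plain gradient step in the moment space, which is why the iteration above converges quickly without MCMC for this tractable example.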
Original language: English
Title of host publication: Proceedings of the 41st Conference on Uncertainty in Artificial Intelligence
Publisher: Proceedings of Machine Learning Research
Publication status: Accepted/In press - 2 Jun 2025
Event: The 41st Conference on Uncertainty in Artificial Intelligence - Rio de Janeiro, Brazil
Duration: 21 Jul 2025 - 25 Jul 2025
https://www.auai.org/uai2025/

Conference

Conference: The 41st Conference on Uncertainty in Artificial Intelligence
Abbreviated title: UAI 2025
Country/Territory: Brazil
City: Rio de Janeiro
Period: 21/07/25 - 25/07/25
Internet address: https://www.auai.org/uai2025/
