Flexible Tails for Normalizing Flows

Dennis Prangle, Tennessee W Hickling*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Normalizing flows are a flexible class of probability distributions, expressed as transformations of a simple base distribution. A limitation of standard normalizing flows is representing distributions with heavy tails, which arise in applications to both density estimation and variational inference. A popular current solution to this problem is to use a heavy-tailed base distribution. We argue this can lead to poor performance due to the difficulty of optimising neural networks, such as normalizing flows, under heavy-tailed input. We propose an alternative, “tail transform flow” (TTF), which uses a Gaussian base distribution and a final transformation layer which can produce heavy tails. Experimental results show this approach outperforms current methods, especially when the target distribution has large dimension or tail weight.
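As an illustrative sketch of the idea in the abstract (not the paper's exact TTF layer), a final elementwise transform can turn Gaussian tails into heavy, power-law tails: map each Gaussian output through its CDF and then through a Student-t quantile function. The tail parameter `nu` and the function name below are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

def tail_transform(z, nu=2.0):
    """Map standard-normal draws to Student-t draws (heavy tails)
    via the probability integral transform: x = F_t^{-1}(Phi(z))."""
    u = stats.norm.cdf(z)        # Gaussian CDF -> uniform on (0, 1)
    return stats.t.ppf(u, df=nu)  # Student-t quantile -> heavy tails

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)   # light-tailed base samples
x = tail_transform(z, nu=2.0)      # heavy-tailed transformed samples

# The transform stretches extreme quantiles far beyond the Gaussian ones.
print(np.quantile(np.abs(z), 0.999), np.quantile(np.abs(x), 0.999))
```

Because the transform is a fixed monotone bijection with a tractable derivative, it can in principle be appended to any flow while keeping the log-density computation exact.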
Original language: English
Title of host publication: International Conference on Machine Learning, 13-19 July 2025, Vancouver, Canada
Editors: Neil Lawrence
Pages: 23155-23178
Number of pages: 24
Volume: 267
Publication status: Published - 14 Oct 2025
Event: The 42nd International Conference on Machine Learning (ICML 2025) - Vancouver Convention Center, Vancouver, Canada
Duration: 13 Jul 2025 - 19 Jul 2025
https://icml.cc/Conferences/2025

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
ISSN (Electronic): 2640-3498

Conference

Conference: The 42nd International Conference on Machine Learning (ICML 2025)
Abbreviated title: ICML 2025
Country/Territory: Canada
City: Vancouver
Period: 13/07/25 - 19/07/25
Internet address: https://icml.cc/Conferences/2025

Bibliographical note

Copyright 2025 by the author(s).
