Abstract
We present a novel approach to Bayesian inference and general Bayesian computation that is defined through a sequential decision loop. Our method constructs a recursive partitioning of the sample space. It neither relies on gradients nor requires any problem-specific tuning, and it is asymptotically exact for any density function with a bounded domain. The output is an approximation of the whole density function, including the normalisation constant, via partitions organised in efficient data structures. Such approximations can be used not only for evidence estimation and fast posterior sampling, but also as building blocks for a larger class of estimation problems. The algorithm performs competitively with recent state-of-the-art methods on synthetic and real-world problems, including parameter inference for gravitational-wave physics.
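To make the idea of a partition-based density approximation concrete, the sketch below illustrates one possible greedy, midpoint-based variant in Python. It is not the algorithm proposed in the paper; the refinement rule (split the partition with the largest estimated mass along its longest side), the heap-ordered partition store, and the `approximate_density` / `sample_posterior` helpers are all assumptions made for this example.

```python
import heapq
import numpy as np

def approximate_density(log_density, lower, upper, budget=2000):
    """Approximate an unnormalised density on a bounded box by repeatedly
    splitting the partition that currently carries the largest estimated mass.
    Returns the partitions (mass, lower corner, upper corner) and the
    estimated normalisation constant.  Illustrative sketch only."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)

    def make_partition(lo, hi):
        centre = 0.5 * (lo + hi)
        volume = float(np.prod(hi - lo))
        mass = float(np.exp(log_density(centre))) * volume  # one density evaluation
        return (-mass, tuple(lo), tuple(hi))                 # negated mass -> max-heap

    heap = [make_partition(lower, upper)]
    evaluations = 1
    while evaluations < budget:
        _, lo, hi = heapq.heappop(heap)                      # largest-mass partition
        lo, hi = np.array(lo), np.array(hi)
        axis = int(np.argmax(hi - lo))                       # split the longest side
        mid = 0.5 * (lo[axis] + hi[axis])
        hi_left, lo_right = hi.copy(), lo.copy()
        hi_left[axis], lo_right[axis] = mid, mid
        heapq.heappush(heap, make_partition(lo, hi_left))
        heapq.heappush(heap, make_partition(lo_right, hi))
        evaluations += 2

    partitions = [(-neg_mass, np.array(lo), np.array(hi))
                  for neg_mass, lo, hi in heap]
    evidence = sum(mass for mass, _, _ in partitions)        # estimate of Z
    return partitions, evidence

def sample_posterior(partitions, evidence, n, rng=None):
    """Draw approximate posterior samples: pick a partition with probability
    proportional to its mass, then sample uniformly inside it."""
    rng = rng or np.random.default_rng(0)
    weights = np.array([mass for mass, _, _ in partitions]) / evidence
    idx = rng.choice(len(partitions), size=n, p=weights)
    return np.array([rng.uniform(partitions[i][1], partitions[i][2]) for i in idx])

# Example: unnormalised standard Gaussian restricted to [-5, 5]^2.
log_density = lambda x: -0.5 * np.sum(x**2)
partitions, evidence = approximate_density(log_density, [-5.0, -5.0], [5.0, 5.0])
print(evidence)                 # approaches 2*pi ~ 6.283 as the budget grows
samples = sample_posterior(partitions, evidence, n=1000)
```

Keeping the partitions in a max-heap keyed by estimated mass is one simple way to realise the "efficient data structures" mentioned in the abstract: the partition with the largest mass can be retrieved and refined in logarithmic time, and the final collection directly yields both the evidence estimate and cheap posterior samples.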
Original language | English |
---|---|
Title of host publication | Proceedings of the 38th International Conference on Machine Learning, PMLR |
Publisher | ML Research Press |
Pages | 1015-1025 |
Volume | 139 |
Publication status | Published - 24 Jul 2021 |
Event | International Conference on Machine Learning - Virtual, Duration: 18 Jul 2021 → 24 Jul 2021 |
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Publisher | ML Research Press |
Volume | 139 |
ISSN (Print) | 2640-3498 |
Conference
Conference | International Conference on Machine Learning |
---|---|
Period | 18/07/21 → 24/07/21 |
Bibliographical note
International Conference on Machine Learning (ICML) 2021
Keywords
- stat.ML
- cs.LG