Abstract
The introduction of neural networks and increasingly powerful density estimation methods has enabled flexible estimation of unknown distributions. These advances have been applied across Bayesian inference, for estimating posterior distributions and intractable likelihoods. This thesis builds on these advances while addressing issues of reliability, which are crucial for applying such methods in scientific domains. First, robust neural posterior estimation is introduced, a method that promotes reliable simulation-based inference under model misspecification while enabling model criticism. Next, soft contrastive variational inference is introduced, a novel reframing of variational inference as a classification problem, which is shown to yield mass-covering objectives that are stable to train. This method is then demonstrated in a simulation-based inference context, alongside the fitting of a surrogate simulator. Finally, FlowJAX is presented, a software package for distributions and normalising flows, and some novel normalising flow developments are discussed. Together, these contributions facilitate neural network-based Bayesian inference by providing robust methodologies and software tools applicable across many scientific domains.

| Date of Award | 13 May 2025 |
|---|---|
| Original language | English |
| Awarding Institution | |
| Supervisor | Mark A Beaumont (Supervisor) & Matteo Fasiolo (Supervisor) |