Global consensus Monte Carlo

Lewis J. Rendell*, Adam M. Johansen, Anthony Lee, Nick Whiteley

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review


Abstract

To conduct Bayesian inference with large data sets, it is often convenient or necessary to distribute the data across multiple machines. We consider a likelihood function expressed as a product of terms, each associated with a subset of the data. Inspired by global variable consensus optimisation, we introduce an instrumental hierarchical model associating auxiliary statistical parameters with each term, which are conditionally independent given the top-level parameters. One of these top-level parameters controls the unconditional strength of association between the auxiliary parameters. This model leads to a distributed Markov chain Monte Carlo (MCMC) algorithm on an extended state space yielding approximations of posterior expectations. A trade-off between computational tractability and fidelity to the original model can be controlled by changing the association strength in the instrumental model. We further propose the use of a sequential Monte Carlo (SMC) sampler with a sequence of association strengths, allowing both the automatic determination of appropriate strengths and the application of a bias correction technique. In contrast to similar distributed Monte Carlo algorithms, this approach requires few distributional assumptions. The performance of the algorithms is illustrated with a number of simulated examples.
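To make the construction concrete, the following is a minimal sketch of the instrumental model and the resulting distributed Gibbs-type sampler, assuming Gaussian association kernels, a toy Gaussian block likelihood, and a flat prior so that both conditional distributions are available in closed form. All names, the conjugate updates, and the choice of kernel are illustrative assumptions for exposition, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: b data blocks, each y_j i.i.d. N(theta_true, sigma2).
# (Illustrative only; any block likelihood f_j(y_j | z_j) could be used.)
b, n_per_block, sigma2, theta_true = 4, 250, 1.0, 2.0
blocks = [rng.normal(theta_true, np.sqrt(sigma2), n_per_block) for _ in range(b)]

def gcmc_gibbs(blocks, lam, n_iter=5000, sigma2=1.0):
    """Gibbs sampler targeting the instrumental distribution
    pi_lam(theta, z_1:b) propto p(theta) prod_j N(z_j; theta, lam) f_j(y_j | z_j),
    with flat prior p(theta) and Gaussian block likelihoods, so both
    conditionals are Gaussian. lam controls the association strength."""
    b = len(blocks)
    theta = 0.0
    z = np.zeros(b)
    thetas = np.empty(n_iter)
    for t in range(n_iter):
        # Local updates, one per block: the z_j are conditionally
        # independent given theta, so each update can run on the
        # machine holding its block of data.
        for j, y in enumerate(blocks):
            prec = len(y) / sigma2 + 1.0 / lam
            mean = (y.sum() / sigma2 + theta / lam) / prec
            z[j] = rng.normal(mean, np.sqrt(1.0 / prec))
        # Global update: theta | z_1:b ~ N(mean(z), lam / b) under a flat prior.
        theta = rng.normal(z.mean(), np.sqrt(lam / b))
        thetas[t] = theta
    return thetas

# Smaller lam gives higher fidelity to the original posterior but stronger
# coupling between local and global updates (typically slower mixing).
for lam in (1.0, 0.1, 0.01):
    draws = gcmc_gibbs(blocks, lam)
    print(f"lam={lam:5.2f}  E[theta] approx {draws[1000:].mean():.3f}")
```

In this sketch, decreasing the association strength lam brings the marginal of theta under the instrumental model closer to the original posterior, at the cost of stronger coupling between the local and global updates; the SMC sampler proposed in the paper moves through such a sequence of strengths, determining them automatically.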
Original language: English
Number of pages: 12
Journal: Journal of Computational and Graphical Statistics
DOIs
Publication status: Published - 16 Oct 2020

Keywords

  • Bayesian inference
  • Distributed inference
  • Markov chain Monte Carlo
  • Sequential Monte Carlo
