An Algorithm for Approximating the Second Moment of the Normalizing Constant Estimate from a Particle Filter

Svetoslav Kostov*, Nick Whiteley

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review

Abstract

We propose a new algorithm for approximating the non-asymptotic second moment of the marginal likelihood estimate, or normalizing constant, provided by a particle filter. The computational cost of the new method is O(M) per time step, independently of the number of particles N in the particle filter, where M is a parameter controlling the quality of the approximation. This is in contrast to O(MN) for a simple averaging technique using M i.i.d. replicates of a particle filter with N particles. We establish that the approximation delivered by the new algorithm is unbiased, strongly consistent and, under standard regularity conditions, increasing M linearly with time is sufficient to prevent growth of the relative variance of the approximation, whereas for the simple averaging technique it can be necessary to increase M exponentially with time in order to achieve the same effect. This makes the new algorithm useful as part of strategies for estimating Monte Carlo variance. Numerical examples illustrate performance in the context of a stochastic Lotka–Volterra system and a simple AR(1) model.
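The abstract is not accompanied by the paper's algorithmic details, so the sketch below illustrates only the simple averaging baseline it mentions: run M i.i.d. replicates of a particle filter with N particles each, and average the squared normalizing constant estimates to approximate the second moment, at O(MN) cost per time step. This is a minimal Python sketch under assumed details; the linear-Gaussian AR(1) observation model, the parameter values, and the function names (bootstrap_filter_likelihood, second_moment_by_averaging) are illustrative choices, not taken from the paper, and the new O(M) algorithm itself is not reproduced here.

```python
import numpy as np

def bootstrap_filter_likelihood(y, n_particles, phi=0.9, sigma_x=1.0, sigma_y=1.0, rng=None):
    """Bootstrap particle filter for an illustrative AR(1)-plus-Gaussian-noise model.
    Returns the unbiased estimate of the normalizing constant p(y_{1:T})."""
    rng = np.random.default_rng() if rng is None else rng
    # sample x_1 from the stationary distribution of the AR(1) state
    x = rng.normal(0.0, sigma_x / np.sqrt(1.0 - phi**2), size=n_particles)
    log_z = 0.0
    ancestors = np.arange(n_particles)
    for t, yt in enumerate(y):
        if t > 0:
            # multinomial resampling, then propagation through the AR(1) dynamics
            x = phi * x[ancestors] + sigma_x * rng.normal(size=n_particles)
        # Gaussian observation density y_t | x_t ~ N(x_t, sigma_y^2), in log scale
        logw = -0.5 * ((yt - x) / sigma_y) ** 2 - np.log(sigma_y * np.sqrt(2.0 * np.pi))
        m = logw.max()
        w = np.exp(logw - m)
        log_z += m + np.log(w.mean())  # accumulate log of the running estimate
        ancestors = rng.choice(n_particles, size=n_particles, p=w / w.sum())
    return np.exp(log_z)

def second_moment_by_averaging(y, n_particles, n_replicates, rng=None):
    """Simple averaging baseline: mean of \\hat{Z}^2 over M i.i.d. replicate
    particle filters, i.e. O(MN) cost per time step."""
    rng = np.random.default_rng() if rng is None else rng
    z = np.array([bootstrap_filter_likelihood(y, n_particles, rng=rng)
                  for _ in range(n_replicates)])
    return np.mean(z ** 2)

# Example usage on simulated data (illustrative only)
rng = np.random.default_rng(0)
y = rng.normal(size=50)
print(second_moment_by_averaging(y, n_particles=200, n_replicates=100, rng=rng))
```

The abstract's point is that this baseline can require M to grow exponentially with the time horizon to keep the relative variance of the second-moment approximation under control, whereas the proposed algorithm needs only linear growth of M under standard regularity conditions.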

Original language: English
Pages (from-to): 799–818
Number of pages: 20
Journal: Methodology and Computing in Applied Probability
Volume: 19
Issue number: 3
Early online date: 27 Sep 2016
DOIs
Publication status: Published - Sep 2017

Keywords

  • marginal likelihood
  • normalizing constant
  • hidden Markov model
  • particle filter
