Abstract
Abstract

The thesis presents the main definitions and concepts of hidden Markov models (HMMs), focusing on how they are built and on the issues that arise when scaling up to high-dimensional settings. Novel approximate algorithms for computing filtering and smoothing distributions are proposed; these overcome computational problems and improve the applicability of HMMs at large scale.

Firstly, an approximate forward-filtering backward-smoothing algorithm for factorial HMMs [Ghahramani and Jordan, 1997] is developed. The approximation discards likelihood factors according to a notion of locality in the factor graph associated with the emission distribution, which avoids the exponential-in-dimension cost of exact filtering and smoothing. The approximation error, measured in a local total variation norm, is proved to be dimension-free, in the sense that the error bounds do not necessarily degrade as the overall dimension of the model increases. The method can be applied to data with known spatial or network structure, for instance in traffic models.
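As a toy illustration of the locality idea (an assumed setup for exposition, not the thesis algorithm verbatim): take d independent binary chains, with each observation y_i depending only on the neighbouring chains i-1, i, i+1. A factorised filter can then update each chain using only the likelihood factors within graph distance r, so the cost grows linearly in d rather than exponentially.

```python
import numpy as np

rng = np.random.default_rng(0)

d, T, r = 10, 5, 1        # number of chains, time steps, locality radius
p_flip = 0.1              # per-chain transition flip probability
sigma = 0.5               # emission noise std deviation

# Simulate d independent binary chains; observation y[i] is a noisy sum of
# x[i-1], x[i], x[i+1] -- a local factor in the emission factor graph.
x = rng.integers(0, 2, size=d)
Y = []
for t in range(T):
    flips = rng.random(d) < p_flip
    x = np.where(flips, 1 - x, x)
    y = np.array([x[max(i - 1, 0):i + 2].sum() for i in range(d)], dtype=float)
    y += sigma * rng.standard_normal(d)
    Y.append(y)

def local_filter(Y, r):
    """Factorised approximate filter: chain i's update keeps only the
    likelihood factors j with |j - i| <= r, plugging in current marginal
    beliefs for the other chains appearing in each factor."""
    A = np.array([[1 - p_flip, p_flip], [p_flip, 1 - p_flip]])
    q = np.full((d, 2), 0.5)          # one marginal filter per chain
    for y in Y:
        q = q @ A                     # predict each independent chain
        new_q = np.empty_like(q)
        for i in range(d):
            logw = np.zeros(2)
            for xi in (0, 1):
                for j in range(max(i - r, 0), min(i + r + 1, d)):
                    # mean of factor j under the current factorised beliefs,
                    # with chain i pinned to the candidate value xi
                    mean = sum(q[k, 1]
                               for k in range(max(j - 1, 0), min(j + 2, d))
                               if k != i)
                    mean += xi        # with r = 1, factor j always involves i
                    logw[xi] += -0.5 * ((y[j] - mean) / sigma) ** 2
            w = np.exp(logw - logw.max()) * q[i]
            new_q[i] = w / w.sum()
        q = new_q
    return q

q = local_filter(Y, r)
print(np.round(q[:, 1], 2))   # filtered P(chain i on) for each chain
```

Increasing r trades computation for accuracy; the dimension-free error bounds concern how the discarded far-away factors affect each local marginal.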
Secondly, a Bayesian neural network that evolves in time, called a hidden Markov neural network, is proposed. The weights of a feed-forward neural network are modelled as the hidden process of an HMM, whose observed process is given by the available data. An approximate filtering algorithm is used to learn a variational approximation to the time-evolving filtering distribution over the weights. Training proceeds through a sequential version of Bayes by Backprop [Blundell et al., 2015], enriched with a regularisation technique called variational DropConnect.
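A minimal sketch of the filtering-over-weights idea (an assumed toy setup, not the thesis model): a single network weight follows a Gaussian random walk, and a Gaussian variational approximation to its filtering distribution is updated with alternating predict and update steps. In this linear-Gaussian toy the variational optimum coincides with the exact Kalman step; for nonlinear networks the thesis replaces it with a gradient-based sequential Bayes by Backprop update.

```python
import numpy as np

rng = np.random.default_rng(1)

q_w, q_obs = 0.05, 0.1     # transition and observation noise variances
mu, s2 = 0.0, 1.0          # Gaussian variational filter N(mu, s2), prior init
w = 0.5                    # latent "weight" to be tracked
for t in range(50):
    # latent weight drifts; data y_t = w_t * x_t + noise
    w += np.sqrt(q_w) * rng.standard_normal()
    x = rng.standard_normal()
    y = w * x + np.sqrt(q_obs) * rng.standard_normal()
    # predict: the random-walk transition inflates the variance
    s2 += q_w
    # update: exact Gaussian (Kalman) step, standing in for the
    # gradient-based variational update used with real networks
    k = s2 * x / (x ** 2 * s2 + q_obs)
    mu, s2 = mu + k * (y - mu * x), (1 - k * x) * s2
print(round(mu, 3), round(s2, 4))   # filtered mean and variance of the weight
```

The point of the recursion is that the posterior over the weights is never static: the predict step forgets old data at a controlled rate, letting the network track drifting relationships.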
Finally, the thesis also develops a new method for inference in stochastic epidemic models, which uses recursive multinomial approximations to the filtering/smoothing distributions over unobserved variables and thereby circumvents likelihood intractability. The method applies to a class of discrete-time finite-population compartmental models with partial, randomly under-reported or missing count observations. The algorithm is tested on real and synthetic data.
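A minimal SIS-type sketch of the recursive approximation idea (an assumed two-compartment model, illustrating the flavour rather than the thesis algorithm verbatim): the filtering distribution over the infected count is kept in binomial form, the special two-compartment case of a multinomial. The predict step pushes the approximating probability through the mean dynamics; the update step conditions on an under-reported count and projects back to binomial form by matching the posterior mean, using the fact that if I ~ Bin(N, p) and y | I ~ Bin(I, rho), then the unreported count I - y is exactly Bin(N - y, p(1 - rho)/(1 - p*rho)).

```python
import numpy as np

rng = np.random.default_rng(2)

N, T, rho = 1000, 30, 0.4    # population, time steps, reporting probability
beta, gamma = 0.3, 0.1       # infection and recovery rates

I = 50                       # latent infected count
p = 0.05                     # filter approximation: I_t ~ Bin(N, p)
for t in range(T):
    # simulate latent epidemic and under-reported case counts
    frac = I / N
    I = rng.binomial(N, frac + beta * frac * (1 - frac) - gamma * frac)
    y = rng.binomial(I, rho)
    # predict: push the approximating probability through mean dynamics
    p = p + beta * p * (1 - p) - gamma * p
    # update: unreported count is Bin(N - y, q) exactly; project back to
    # Bin(N, p') by matching the posterior mean of I = y + (I - y)
    q = p * (1 - rho) / (1 - p * rho)
    p = (y + (N - y) * q) / N
print(round(p, 3), I / N)    # filtered infection probability vs. truth
```

Because each recursion step stays within the (multi)nomial family, the filter never has to enumerate the full state space of compartment counts, which is what makes the method tractable for realistic population sizes.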
| Date of Award | 28 Sept 2021 |
|---|---|
| Original language | English |
| Awarding Institution | |
| Supervisor | Nick Whiteley (Supervisor) |
Keywords
- HMM
- approximate filtering
- high-dimensional
- Machine Learning
- computational statistics