Abstract
This paper presents a method for approximating posterior distributions over the parameters of a given PRISM program. A sequential approach is taken in which the distribution is updated one datapoint at a time, making the method suitable for online learning settings where data arrive over time. The method applies whenever the prior is a mixture of products of Dirichlet distributions; in this case the true posterior is a mixture of very many such products. An approximation is obtained by merging products of Dirichlet distributions, and an analysis of the quality of this approximation is presented. Due to the heavy computational burden of this approach, the method has been implemented in the Mercury logic programming language. Initial results using a hidden Markov model are presented.
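To illustrate the general idea (not the paper's actual algorithm or its Mercury implementation), the following minimal Python sketch performs one sequential Bayesian update of a mixture of Dirichlet distributions over a single categorical parameter, then collapses two components with a crude moment-matching merge. All function names, the single-parameter simplification, and the merging heuristic are assumptions made for illustration; the paper works with products of Dirichlets over all PRISM switch parameters and uses its own merging criterion.

```python
import numpy as np

def update_mixture(weights, alphas, k):
    """One sequential Bayes step: observe category k under a mixture of
    Dirichlet priors on a single categorical parameter vector.
    Component i has predictive probability alphas[i][k] / alphas[i].sum(),
    and its posterior is the same Dirichlet with alphas[i][k] incremented."""
    new_weights = np.array([w * a[k] / a.sum() for w, a in zip(weights, alphas)])
    new_weights /= new_weights.sum()
    new_alphas = [a + np.eye(len(a))[k] for a in alphas]
    return new_weights, new_alphas

def merge_pair(w1, a1, w2, a2):
    """Collapse two Dirichlet components into one by matching the mixture
    mean and averaging the precisions (pseudocount totals) -- a simple
    heuristic stand-in for a principled merging step."""
    w = w1 + w2
    mean = (w1 * a1 / a1.sum() + w2 * a2 / a2.sum()) / w
    precision = (w1 * a1.sum() + w2 * a2.sum()) / w
    return w, mean * precision

# Example: two-component prior over a 3-valued switch, one observation of value 0.
weights = np.array([0.5, 0.5])
alphas = [np.array([1.0, 1.0, 1.0]), np.array([2.0, 1.0, 1.0])]
weights, alphas = update_mixture(weights, alphas, k=0)
w, a = merge_pair(weights[0], alphas[0], weights[1], alphas[1])
print(w, a)  # merged weight and merged pseudocount vector
```

In the full setting each datapoint admits several explanations, so every component splits into one component per explanation; merging after each step is what keeps the mixture size, and hence the computation, bounded.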
| Original language | English |
|---|---|
| Pages (from-to) | 279-297 |
| Number of pages | 19 |
| Journal | Machine Learning |
| Volume | 89 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - Dec 2012 |