Variational inference for Student-t MLP models

Nguyen Thi-hang, Ian T. Nabney

    Research output: Contribution to journal › Article (Academic Journal) › peer-review


    Abstract

    This paper presents a novel methodology for inferring the parameters of probabilistic models whose output noise follows a Student-t distribution. The method extends earlier work on models that are linear in their parameters to nonlinear multi-layer perceptrons (MLPs). It combines an EM algorithm with a variational approximation, the evidence procedure, and an optimisation algorithm. The technique was tested on two regression applications: the first uses a synthetic dataset, and the second uses gas forward contract prices from the UK energy market. The results show that forecasting accuracy is significantly improved by using Student-t noise models.
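    The EM treatment described in the abstract rests on the standard scale-mixture view of the Student-t distribution: a Student-t likelihood is a Gaussian whose precision is scaled by a Gamma-distributed latent variable, so the E-step yields per-point weights that downweight outliers and the M-step reduces to weighted least squares. A minimal sketch of this idea for a model that is linear in its parameters (the setting the paper extends to MLPs) is below; the fixed degrees of freedom `nu` and the synthetic data are illustrative assumptions, not the paper's experimental setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic regression data with heavy-tailed (Student-t) output noise.
    X = np.column_stack([np.ones(200), rng.uniform(-1, 1, 200)])
    true_w = np.array([0.5, 2.0])
    y = X @ true_w + 0.1 * rng.standard_t(df=3, size=200)

    nu = 3.0            # degrees of freedom, assumed known and fixed here
    w = np.zeros(2)     # model parameters
    sigma2 = 1.0        # noise scale

    for _ in range(50):
        r = y - X @ w
        # E-step: expected latent precision scale per point; points with
        # large residuals (outliers) receive small weights.
        lam = (nu + 1.0) / (nu + r**2 / sigma2)
        # M-step: weighted least squares for w, then update the scale.
        Xw = lam[:, None] * X
        w = np.linalg.solve(X.T @ Xw, X.T @ (lam * y))
        sigma2 = np.mean(lam * (y - X @ w) ** 2)

    print(w)  # close to true_w despite the heavy-tailed noise
    ```

    The paper's contribution is to make this tractable when the mean function is a nonlinear MLP, where the M-step no longer has a closed form and a variational approximation with the evidence procedure is used instead.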
    Original language: English
    Pages (from-to): 2989-2997
    Number of pages: 9
    Journal: Neurocomputing
    Volume: 73
    Issue number: 16-18
    DOI: 10.1016/j.neucom.2010.07.009
    Publication status: Published - 1 Oct 2010

    Bibliographical note

    10th Brazilian Symposium on Neural Networks (SBRN2008). NOTICE: this is the author’s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Thi-hang, Nguyen and Nabney, Ian T. (2010). Variational inference for Student-t MLP models. Neurocomputing, 73 (16-18), pp. 2989-2997. DOI 10.1016/j.neucom.2010.07.009

    Keywords

    • variational inference, Student-t noise, multi-layer perceptrons, EM algorithm, forecast

