
Efficient approximate representations of computationally expensive features

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Original language: English
Title of host publication: ESANN 2018 proceedings
Subtitle of host publication: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Place of publication: Bruges
Accepted/In press: 25 Jan 2018
Published: 25 Apr 2018


High computational complexity is often a barrier to achieving desired representations in resource-constrained settings. This paper introduces a simple and computationally cheap method of approximating complex features. We do so by carefully constraining the architecture of Neural Networks (NNs) and regressing from raw data to the intended feature representation. Our analysis focuses on spectral features, and demonstrates how low-capacity networks can capture the end-to-end dynamics of cascaded composite functions. Not only do approximating NNs simplify the analysis pipeline, but our approach also produces feature representations up to 20 times faster. In our experimental analysis the feature approximations achieve excellent fidelity, and we report nearly indistinguishable predictive performance between exact and approximate representations.
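The core idea of the abstract can be sketched in a few lines: treat an expensive spectral transform as a black-box target and train a deliberately small network to regress from raw input to that target. The sketch below is an illustration only, not the authors' implementation; the choice of magnitude spectrum as the "expensive" feature, the network size, and the training settings are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "expensive" spectral feature: the magnitude spectrum.
# (The actual features in the paper may differ; this is an assumption.)
def target_feature(x):
    return np.abs(np.fft.rfft(x, axis=-1))

# Toy data: raw signals paired with their exact spectral features.
n, d = 256, 32
X = rng.standard_normal((n, d))
Y = target_feature(X)                       # shape (n, d // 2 + 1)

# Low-capacity approximator: a single small hidden layer, i.e. the
# "carefully constrained architecture" idea in miniature.
h = 16
W1 = rng.standard_normal((d, h)) * 0.1
b1 = np.zeros(h)
W2 = rng.standard_normal((h, Y.shape[1])) * 0.1
b2 = np.zeros(Y.shape[1])

def forward(X):
    A = np.tanh(X @ W1 + b1)                # hidden activations
    return A, A @ W2 + b2                   # approximate features

_, P0 = forward(X)
loss0 = np.mean((P0 - Y) ** 2)              # error before training

# Plain gradient descent on mean-squared error.
lr = 0.01
for _ in range(500):
    A, P = forward(X)
    G = 2 * (P - Y) / n                     # dLoss / dP
    gW2, gb2 = A.T @ G, G.sum(0)
    GA = (G @ W2.T) * (1 - A ** 2)          # backprop through tanh
    gW1, gb1 = X.T @ GA, GA.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, P = forward(X)
loss = np.mean((P - Y) ** 2)                # error after training
```

At inference time only the cheap forward pass is needed, which is where the reported speed-up over recomputing the exact transform would come from.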



  • Full-text PDF (final published version)

    Rights statement: This is the final published version of the article (version of record). Please refer to any applicable terms of use of the publisher.

    Final published version, 4 MB, PDF document

    Licence: CC BY-NC-SA

