Efficient approximate representations of computationally expensive features

Raul Santos-Rodriguez, Niall Twomey

Research output: Conference Contribution (Chapter in Book/Report/Conference Proceeding)



High computational complexity is often a barrier to achieving desired representations in resource-constrained settings. This paper introduces a simple and computationally cheap method of approximating complex features. We do so by carefully constraining the architecture of Neural Networks (NNs) and regressing from raw data to the intended feature representation. Our analysis focuses on spectral features and demonstrates how low-capacity networks can capture the end-to-end dynamics of cascaded composite functions. Not only do the approximating NNs simplify the analysis pipeline; our approach also produces feature representations up to 20 times faster. Our experimental analysis achieves excellent feature fidelity with the approximations, and we also report nearly indistinguishable predictive performance between exact and approximate representations.
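The idea described above can be illustrated with a minimal sketch: a small one-hidden-layer network, trained by plain gradient descent, regresses raw signal windows to a spectral feature. The specific choices here (FFT magnitude as the "expensive" target, window length, hidden size, tanh activations, full-batch training) are illustrative assumptions, not the paper's actual architecture or features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: the "expensive" target feature is the FFT magnitude of a
# short raw window; the approximator is a low-capacity one-hidden-layer NN.
N, D, H = 512, 32, 64        # samples, window length, hidden units
F = D // 2 + 1               # number of spectral bins from rfft

X = rng.standard_normal((N, D))
Y = np.abs(np.fft.rfft(X, axis=1))  # exact (cascaded, nonlinear) feature

# Small network: tanh hidden layer + linear readout.
W1 = rng.standard_normal((D, H)) * 0.1
b1 = np.zeros(H)
W2 = rng.standard_normal((H, F)) * 0.1
b2 = np.zeros(F)

def forward(X):
    Hid = np.tanh(X @ W1 + b1)
    return Hid, Hid @ W2 + b2

def loss(Yhat, Y):
    return np.mean((Yhat - Y) ** 2)

_, Y0 = forward(X)
initial_loss = loss(Y0, Y)

# Full-batch gradient descent on squared error.
lr = 0.05
for _ in range(500):
    Hid, Yhat = forward(X)
    G = 2.0 * (Yhat - Y) / (N * F)         # dL/dYhat
    gW2, gb2 = Hid.T @ G, G.sum(axis=0)
    Ghid = (G @ W2.T) * (1.0 - Hid ** 2)   # backprop through tanh
    gW1, gb1 = X.T @ Ghid, Ghid.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, Yfit = forward(X)
final_loss = loss(Yfit, Y)
```

Once trained, a single forward pass replaces the full feature pipeline, which is where the speed-up reported in the abstract comes from: the network's cost is two matrix multiplies regardless of how many cascaded stages the exact feature requires.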
Original language: English
Title of host publication: ESANN 2018 proceedings
Subtitle of host publication: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Place of publication: Bruges
Publication status: Published - 25 Apr 2018

