High computational complexity is often a barrier to obtaining desired feature representations in resource-constrained settings. This paper introduces a simple and computationally cheap method for approximating complex features: we carefully constrain the architecture of Neural Networks (NNs) and regress from raw data directly to the intended feature representation. Our analysis focuses on spectral features and demonstrates how low-capacity networks can capture the end-to-end dynamics of cascaded composite functions. Approximating NNs not only simplify the analysis pipeline, they also produce feature representations up to 20 times faster. In our experiments, the feature approximations achieve excellent fidelity, and predictive performance with exact and approximate representations is nearly indistinguishable.
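The idea above — training a deliberately small network to regress from raw input frames to a spectral feature map — can be sketched as follows. This is an illustrative NumPy mock-up, not the authors' implementation: the target feature (a log-power spectrum via `rfft`), the frame length, the network sizes, and all hyperparameters are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

FRAME = 32             # raw samples per frame (illustrative)
HIDDEN = 64            # deliberately low capacity
N_FEAT = FRAME // 2 + 1  # number of rfft bins

def exact_features(frames):
    """The 'expensive' exact feature: log-power spectrum of each frame."""
    return np.log1p(np.abs(np.fft.rfft(frames, axis=1)) ** 2)

# Stand-in training data: random signals in place of real raw frames.
X = rng.standard_normal((4096, FRAME))
Y = exact_features(X)

# Low-capacity approximator: one tanh hidden layer, linear output.
W1 = rng.standard_normal((FRAME, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, N_FEAT)) * 0.1
b2 = np.zeros(N_FEAT)

lr = 1e-2
losses = []
for step in range(500):
    # Forward pass: predicted features for every frame.
    H = np.tanh(X @ W1 + b1)
    P = H @ W2 + b2
    err = P - Y
    # Loss: per-sample squared error summed over bins, averaged over frames.
    losses.append(np.mean(np.sum(err ** 2, axis=1)))
    # Backprop through the two layers (gradient of the loss above).
    dP = 2 * err / len(X)
    dW2 = H.T @ dP
    db2 = dP.sum(axis=0)
    dH = (dP @ W2.T) * (1 - H ** 2)
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)
    # Plain gradient-descent updates.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"regression loss: {losses[0]:.2f} -> {losses[-1]:.2f}")
```

In this setting the small network is trained once against the exact (and comparatively expensive) feature extractor, after which a single cheap forward pass replaces the exact computation at inference time.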
Title of host publication: ESANN 2018 proceedings
Subtitle of host publication: European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning
Place of publication: Bruges
Publication status: Published - 25 Apr 2018