Big–Little Adaptive Neural Networks on Low-Power Near-Subthreshold Processors

Zichao Shen*, Neil Howard, Jose L Nunez-Yanez

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review

3 Citations (Scopus)
30 Downloads (Pure)


This paper investigates the energy savings that near-subthreshold processors can obtain in edge AI applications and proposes strategies to improve them while maintaining application accuracy. The selected processors deploy adaptive voltage scaling techniques in which the frequency and voltage levels of the processor core are determined at run time. In these systems, embedded RAM and flash memory size is typically limited to less than 1 megabyte to save power. This limited memory imposes restrictions on the complexity of the neural network models that can be mapped to these devices and forces trade-offs between accuracy and battery life. To address these issues, we propose and evaluate alternative ‘big–little’ neural network strategies to improve battery life while maintaining prediction accuracy. The strategies are applied to a human activity recognition application selected as a demonstrator, which shows that, compared to the original network, the best configurations obtain a measured energy reduction of 80% while maintaining the original level of inference accuracy.
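The ‘big–little’ strategy described in the abstract can be illustrated with a minimal sketch: a small, cheap network handles most inferences, and a larger, more accurate network is invoked only when the small network's confidence falls below a threshold. The models and the threshold below are hypothetical placeholders, not the paper's actual HAR networks:

```python
CONFIDENCE_THRESHOLD = 0.9  # assumed tuning knob, not a value from the paper


def little_model(sample):
    # Placeholder for the small low-power network: returns (label, confidence).
    score = sum(sample) / len(sample)
    return ("active" if score > 0.5 else "idle", abs(score - 0.5) * 2)


def big_model(sample):
    # Placeholder for the larger, more accurate network, run only on demand.
    score = sum(sample) / len(sample)
    return ("active" if score > 0.5 else "idle", 0.99)


def adaptive_infer(sample):
    """Run the little network first; escalate to the big network
    only when the little network's confidence is too low."""
    label, conf = little_model(sample)
    if conf >= CONFIDENCE_THRESHOLD:
        return label, "little"  # little network was confident enough
    label, _ = big_model(sample)
    return label, "big"  # fall back to the big network
```

Because the big network runs only on ambiguous inputs, the average energy per inference approaches that of the little network, which is the mechanism behind the reported savings.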
Original language: English
Article number: 28
Journal: Journal of Low Power Electronics and Applications
Issue number: 2
Publication status: Published - 18 May 2022

Bibliographical note

Funding Information:
Funding: This work was partially funded by the Royal Society INF/R2/192044 Machine Intelligence at the Network Edge (MINET) fellowship.

Publisher Copyright:
© 2022 by the authors. Licensee MDPI, Basel, Switzerland.


  • near-subthreshold processor
  • energy efficient
  • neural network
  • adaptive computing


