Cortical microcircuits as gated-recurrent neural networks

Rui Ponte Costa, Yannis M. Assael, Brendan Shillingford, Nando de Freitas, Tim P. Vogels

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

33 Citations (Scopus)
137 Downloads (Pure)


Cortical circuits exhibit intricate recurrent architectures that are remarkably similar across different brain areas. Such stereotyped structure suggests the existence of common computational principles. However, such principles have remained largely elusive. Inspired by gated-memory networks, namely long short-term memory networks (LSTMs), we introduce a recurrent neural network in which information is gated through inhibitory cells that are subtractive (subLSTM). We propose a natural mapping of subLSTMs onto known canonical excitatory-inhibitory cortical microcircuits. Our empirical evaluation across sequential image classification and language modelling tasks shows that subLSTM units can achieve performance similar to that of LSTM units. These results suggest that cortical circuits can be optimised to solve complex contextual problems, and they propose a novel view of their computational function. Overall, our work provides a step towards unifying recurrent networks as used in machine learning with their biological counterparts.
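To make the subtractive-gating idea concrete, the sketch below contrasts it with a standard LSTM step: the gates are computed as usual, but inhibition is subtracted from the cell input and output rather than multiplied in. The weight shapes, gate ordering, and helper names here are illustrative assumptions based on the abstract, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    # Logistic squashing used for all gate activations.
    return 1.0 / (1.0 + np.exp(-x))

def sublstm_step(x, h_prev, c_prev, W, R, b):
    """One subLSTM step (illustrative sketch, not the paper's exact equations).

    Gating is subtractive: inhibitory gates i and o are subtracted from the
    cell input and output, mirroring subtractive inhibition in cortical
    microcircuits, instead of the multiplicative gating of a standard LSTM.

    Shapes (assumed): x (D,), h_prev/c_prev (H,), W (4H, D), R (4H, H), b (4H,).
    """
    # All four gate pre-activations in one affine map, then squashed.
    gates = sigmoid(W @ x + R @ h_prev + b)
    z, i, f, o = np.split(gates, 4)  # input signal, input-inhibition, forget, output-inhibition
    c = f * c_prev + z - i           # subtractive input gating (LSTM would use z * i)
    h = sigmoid(c) - o               # subtractive output gating (LSTM would use tanh(c) * o)
    return h, c

# Minimal usage example with small random weights.
rng = np.random.default_rng(0)
D, H = 3, 4
x = rng.standard_normal(D)
h0, c0 = np.zeros(H), np.zeros(H)
W = 0.1 * rng.standard_normal((4 * H, D))
R = 0.1 * rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h1, c1 = sublstm_step(x, h0, c0, W, R, b)
```

Because every gate is a sigmoid and the inhibitory terms are subtracted, the hidden state stays in (-1, 1), and the only change relative to an LSTM cell is swapping two multiplications for subtractions.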
Original language: English
Title of host publication: Neural Information Processing Systems 30 (NIPS 2017)
Subtitle of host publication: Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4-9 December 2017, Long Beach, CA, USA.
Place of publication: Curran Associates Inc, NY
Publisher: Neural Information Processing Systems (NIPS)
Number of pages: 12
ISBN (Print): 9781510860964
Publication status: Published - 7 Nov 2017


  • q-bio.NC
  • cs.NE
  • stat.ML

