Adding Biological Constraints to Deep Neural Networks Reduces their Capacity to Learn Unstructured Data

Chris I Tsvetkov, Gaurav Malhotra, Benjamin D Evans, Jeffrey S Bowers

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

1 Citation (Scopus)


Deep neural networks (DNNs) are becoming increasingly popular as a model of the human visual system. However, they show behaviours that are uncharacteristic of humans, including the ability to learn arbitrary data, such as images with pixel values drawn randomly from a Gaussian distribution. We investigated whether this behaviour is due to the learning and memory capacity of DNNs being too high for the training task. We reduced the capacity of DNNs by incorporating biologically motivated constraints – an information bottleneck, internal noise and sigmoid activations – in order to diminish the learning of arbitrary data without significantly degrading performance on natural images. Internal noise reliably produced the desired behaviour, while a bottleneck had limited impact. Combining all three constraints yielded an even greater reduction in learning capacity. Furthermore, we tested whether these constraints contribute to a network's ability to generalize by helping it develop more robust internal representations. However, none of the methods could consistently improve generalization.
Original language: English
Title of host publication: Proceedings of the 42nd Annual Conference of the Cognitive Science Society 2020
Place of publication: Toronto, Canada
Publication status: Published - 2020

