Familiarity discrimination algorithm inspired by computations of the perirhinal cortex

Rafal Bogacz, Malcolm Brown, Christophe Giraud-Carrier, Stefan Wermter, Jim Austin, David Willshaw

Research output: Chapter in Book/Report/Conference proceeding › Chapter in a book


Familiarity discrimination, i.e. the ability to recognise previously experienced objects, is important to the survival of animals, but it may also find practical applications in information technology. This paper describes the Familiarity discrimination based on Energy algorithm (FamE), inspired by the presumed computations of the perirhinal cortex, the area of the brain involved in familiarity discrimination. In FamE, information about occurrences of familiar records is encoded in the weights of a neural network. Using the network, FamE can discriminate whether a given record belongs to the set of familiar ones, but it cannot retrieve the record. With this restriction, the network achieves a much higher storage capacity for familiarity discrimination than other neural networks achieve for recall. Therefore, for a given number of familiar records, the description of the network's weights occupies much less space in memory than the database containing the records themselves. Furthermore, FamE can still classify a record as familiar even if it differs in a substantial proportion of its bits from its previous representation. FamE is also very fast. Preliminary simulation results demonstrate that the algorithm may be applied to real-world problems.
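The core idea described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a Hopfield-style network in which familiar records are superimposed into a Hebbian weight matrix and familiarity is judged by thresholding the network's energy (stored records yield low energy, novel ones do not). The pattern sizes, threshold rule, and variable names are all hypothetical.

```python
# Hedged sketch of an energy-based familiarity discriminator in the spirit
# of FamE. Assumptions (not taken from the abstract): Hebbian storage and a
# Hopfield-style energy E(x) = -x^T W x / 2, with familiar records yielding
# markedly lower energy than novel ones.
import numpy as np

rng = np.random.default_rng(0)
N = 200   # bits per record, coded as +1/-1
M = 30    # number of familiar records

familiar = rng.choice([-1.0, 1.0], size=(M, N))

# Hebbian weights: all stored records are superimposed, so individual
# records cannot be read back out; only their familiarity can be tested.
W = familiar.T @ familiar
np.fill_diagonal(W, 0.0)

def energy(x):
    return -0.5 * x @ W @ x

def is_familiar(x, threshold):
    return energy(x) < threshold

# Place the threshold between the typical energies of novel and
# familiar records (an illustrative choice of decision rule).
novel = rng.choice([-1.0, 1.0], size=(M, N))
e_fam = np.array([energy(x) for x in familiar])
e_nov = np.array([energy(x) for x in novel])
threshold = 0.5 * (e_fam.mean() + e_nov.mean())

# A familiar record is still recognised when 10% of its bits are flipped,
# illustrating the noise tolerance described in the abstract.
probe = familiar[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
print(is_familiar(probe, threshold))     # corrupted familiar record -> True
print(is_familiar(novel[0], threshold))  # novel record -> False
```

Because only the N×N weight matrix is kept, the memory cost is independent of the number of stored records, which is the source of the storage advantage over recall networks that the abstract describes.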
Translated title of the contribution: Familiarity discrimination algorithm inspired by computations of the perirhinal cortex
Original language: English
Title of host publication: Emergent Neural Computational Architectures based on Neuroscience
Publication status: Published - 2001

Bibliographical note

Other page information: 435-448
Other identifier: 2000060


