Echo state networks with filter neurons and a delay&sum readout

Georg Holzmann, Helmut Hauser

Research output: Contribution to journal › Article (Academic Journal) › peer-review

93 Citations (Scopus)


Echo state networks (ESNs) are a novel approach to recurrent neural network training with the advantage of a very simple and linear learning algorithm. It has been demonstrated that ESNs outperform other methods on a number of benchmark tasks. Although the approach is appealing, there are still some inherent limitations in the original formulation. Here we suggest two enhancements of this network model. First, the previously proposed idea of filters in neurons is extended to arbitrary infinite impulse response (IIR) filter neurons. This enables such networks to learn multiple attractors and signals at different timescales, which is especially important for modeling real-world time series. Second, a delay&sum readout is introduced, which adds trainable delays in the synaptic connections of output neurons and therefore vastly improves the memory capacity of echo state networks. It is shown, on commonly used benchmark tasks and real-world examples, that this new structure significantly outperforms standard ESNs and other state-of-the-art models for nonlinear dynamical system modeling.
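To make the two enhancements concrete, the following is a minimal sketch of an ESN whose neurons apply a first-order low-pass IIR filter (a simple special case of the paper's general IIR filter neurons, here via per-neuron leak rates) and whose readout delays each reservoir signal before the usual linear regression. All sizes, scalings, and the cross-correlation delay search are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                            # reservoir size (illustrative choice)
leak = rng.uniform(0.2, 1.0, N)    # per-neuron low-pass cutoffs (IIR special case)

W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1
w_in = rng.uniform(-0.5, 0.5, N)

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state matrix."""
    x = np.zeros(N)
    X = np.empty((len(u), N))
    for t, ut in enumerate(u):
        pre = np.tanh(W @ x + w_in * ut)
        x = (1 - leak) * x + leak * pre   # first-order IIR low-pass per neuron
        X[t] = x
    return X

def best_delay(xi, y, max_delay=20):
    """Pick the delay maximizing |cross-correlation| between a state and the target."""
    scores = [abs(np.dot(xi[:len(xi) - d], y[d:])) for d in range(max_delay)]
    return int(np.argmax(scores))

def train_delay_sum(u, y, max_delay=20, ridge=1e-6):
    """Delay&sum readout: choose one delay per neuron, then solve ridge regression."""
    X = run_reservoir(u)
    delays = np.array([best_delay(X[:, i], y, max_delay) for i in range(N)])
    # Delayed design matrix: row t uses x_i(t - d_i) for each neuron i.
    Xd = np.stack([X[max_delay - delays[i]: len(u) - delays[i], i]
                   for i in range(N)], axis=1)
    yt = y[max_delay:]
    w = np.linalg.solve(Xd.T @ Xd + ridge * np.eye(N), Xd.T @ yt)
    return w, delays, Xd
```

As a usage example, a target that is simply the input delayed by ten steps illustrates why the trainable delays improve memory capacity: the readout can shift each reservoir signal toward the lag where it correlates best with the target, instead of relying on the reservoir's fading memory alone.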

Original language: English
Pages (from-to): 244-56
Number of pages: 13
Journal: Neural Networks: the official journal of the International Neural Network Society
Issue number: 2
Publication status: Published - Mar 2010


Keywords:
  • Algorithms
  • Computer Simulation
  • Humans
  • Learning
  • Linear Models
  • Memory
  • Music
  • Neural Networks (Computer)
  • Neurons
  • Nonlinear Dynamics
  • Periodicity
  • Synapses
  • Time Factors


