Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets

Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Robert Legenstein, Wolfgang Maass

Research output: Working paper › Preprint


How recurrently connected networks of spiking neurons in the brain acquire powerful information processing capabilities through learning has remained a mystery. This lack of understanding is linked to a lack of learning algorithms for recurrent networks of spiking neurons (RSNNs) that are both functionally powerful and can be implemented by known biological mechanisms. Since RSNNs are simultaneously a primary target for implementations of brain-inspired circuits in neuromorphic hardware, this lack of algorithmic insight also hinders technological progress in that area. The gold standard for learning in recurrent neural networks in machine learning is back-propagation through time (BPTT), which implements stochastic gradient descent with regard to a given loss function. But BPTT is unrealistic from a biological perspective, since it requires a transmission of error signals backwards in time and in space, i.e., from post- to presynaptic neurons. We show that an online merging of locally available information during a computation with suitable top-down learning signals in real time provides highly capable approximations to BPTT. For tasks where information on errors arises only late during a network computation, we enrich locally available information through feedforward eligibility traces of synapses that can easily be computed in an online manner. The resulting new generation of learning algorithms for recurrent neural networks provides a new understanding of network learning in the brain that can be tested experimentally.
In addition, these algorithms provide efficient methods for on-chip training of RSNNs in neuromorphic hardware. In this version 2 of the paper we changed the name of the new learning algorithms to e-prop, corrected minor errors, added details, especially for the resulting new rules for synaptic plasticity, edited the notation, and included new results for TIMIT.
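The abstract's central idea — combining a locally computable eligibility trace per synapse with a top-down learning signal, online, instead of propagating errors backwards in time — can be illustrated with a minimal sketch. This is a hypothetical toy, not the authors' e-prop implementation: it uses a leaky linear unit (no spikes, no recurrent weights) so that the eligibility trace, a leaky filter of presynaptic activity, is exactly the derivative of the hidden state with respect to each input weight. All sizes, rates, and the random-feedback matrix `B` are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions and hyperparameters
n_in, n_rec, n_out, T = 3, 5, 2, 20
alpha = 0.9          # leak factor of the hidden units
eta = 1e-2           # learning rate

# Parameters: trained input weights, fixed readout, feedback weights
W_in = rng.normal(0, 0.3, (n_rec, n_in))
W_out = rng.normal(0, 0.3, (n_out, n_rec))
B = W_out.T          # broadcasts the output error back as a per-unit learning signal

# Random input and target sequences, just to drive the loop
x = rng.normal(0, 1, (T, n_in))
y_target = rng.normal(0, 1, (T, n_out))

h = np.zeros(n_rec)                  # hidden state
e = np.zeros((n_rec, n_in))          # one eligibility trace per synapse
dW = np.zeros_like(W_in)             # accumulated weight update

for t in range(T):
    # Leaky linear dynamics: h_t = alpha * h_{t-1} + W_in x_t
    h = alpha * h + W_in @ x[t]
    # Eligibility trace = dh/dW_in, computable forward in time:
    # e_t = alpha * e_{t-1} + x_t (outer structure: same trace for every row)
    e = alpha * e + x[t][None, :]
    # Online top-down learning signal from the instantaneous readout error
    err = W_out @ h - y_target[t]
    L = B @ err                      # per-unit learning signal, shape (n_rec,)
    # e-prop-style update: learning signal times eligibility trace, no backprop in time
    dW += L[:, None] * e

W_in -= eta * dW
```

Because both `h` and `e` are updated strictly forward in time, every quantity in the update is available at time `t` — this is the "online merging of locally available information with top-down learning signals" the abstract describes, in its simplest possible setting.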
Original language: English
Number of pages: 37
Publication status: Published - 25 Jan 2019

Publication series: e-Print archive
Publisher: Cornell University Library


  • cs.NE


