Reservoirs learn to learn

Anand Subramoney, Franz Scherr, Wolfgang Maass*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

The common procedure in reservoir computing is to take a “found” reservoir, such as a recurrent neural network with randomly chosen synaptic weights or a complex physical device, and to adapt the weights of linear readouts from this reservoir for a particular computing task. We address the question of whether the performance of reservoir computing can be significantly enhanced if one instead optimizes some (hyper)parameters of the reservoir, not for a single task but for the range of all possible tasks in which one is potentially interested, before the weights of the linear readouts are optimized for a particular computing task. After all, networks of neurons in the brain are also known not to be randomly connected. Rather, their structure and parameters emerge from complex evolutionary and developmental processes, arguably in a way that enhances the speed and accuracy of subsequent learning of any concrete task that is likely to be essential for the survival of the organism. We apply the Learning-to-Learn (L2L) paradigm to mimic this two-tier process: a set of (hyper)parameters of the reservoir is optimized for a whole family of learning tasks. We find that this substantially enhances the performance of reservoir computing for the families of tasks that we considered. Furthermore, L2L enables a new form of reservoir learning that tends to be even faster, where not even the weights of the readouts need to be adjusted for learning a concrete task. We present demos and performance results of these new forms of reservoir computing for reservoirs that consist of networks of spiking neurons, which are hence of particular interest from the perspective of neuroscience and of implementations in spike-based neuromorphic hardware. We leave it as an open question what performance advantage the proposed methods provide for other types of reservoirs.
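
The two-tier procedure described in the abstract can be sketched in a few lines: an inner loop that fits only a linear readout for one concrete task, and an outer loop that tunes the reservoir's (hyper)parameters across a whole family of tasks. The sketch below is a minimal, illustrative assumption on our part, not the chapter's implementation: it uses a rate-based echo state network, a family of delayed-memory tasks, ridge-regression readouts, and plain random search in the outer loop, instead of the spiking networks and optimizers used by the authors; all function names, task choices, and parameter ranges are hypothetical.

```python
# Minimal sketch of the two-tier L2L idea, assuming a rate-based echo state
# network and a delayed-memory task family (illustrative, not the chapter's code).
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n=200, spectral_radius=0.9, input_scale=0.5):
    """Random recurrent weights, rescaled to a target spectral radius."""
    W = rng.normal(size=(n, n))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.normal(scale=input_scale, size=(n,))
    return W, W_in

def run_reservoir(W, W_in, u, leak=0.3):
    """Drive the reservoir with input sequence u; return the state trajectory."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

def sample_task(delay):
    """One task from the family: reproduce the input signal `delay` steps later."""
    u = rng.uniform(-1, 1, size=300)
    y = np.roll(u, delay)
    y[:delay] = 0.0
    return u, y

def inner_loop(W, W_in, leak, delay, reg=1e-4):
    """Per-task learning: fit only a linear readout by ridge regression."""
    u, y = sample_task(delay)
    X = run_reservoir(W, W_in, u, leak)
    w_out = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)
    u_te, y_te = sample_task(delay)           # fresh instance of the same task
    X_te = run_reservoir(W, W_in, u_te, leak)
    return np.mean((X_te @ w_out - y_te) ** 2)

# Outer loop: optimize reservoir hyperparameters for the whole task family
# (delays 1..10), here by simple random search.
best_err, best_hp = np.inf, None
for _ in range(20):
    hp = dict(spectral_radius=rng.uniform(0.5, 1.4),
              input_scale=rng.uniform(0.1, 1.0),
              leak=rng.uniform(0.1, 1.0))
    W, W_in = make_reservoir(spectral_radius=hp['spectral_radius'],
                             input_scale=hp['input_scale'])
    err = np.mean([inner_loop(W, W_in, hp['leak'], d) for d in range(1, 11)])
    if err < best_err:
        best_err, best_hp = err, hp
print('best family-averaged test MSE:', best_err, 'with', best_hp)
```

Random search stands in here for the evolutionary or gradient-based outer-loop optimizers typically used in L2L frameworks; the structural point carried over from the abstract is that the outer loop only ever sees family-averaged performance, while per-task learning stays restricted to the linear readout.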

Original language: English
Title of host publication: Reservoir Computing
Subtitle of host publication: Theory, Physical Implementations, and Applications
Editors: Kohei Nakajima, Ingo Fischer
Publisher: Springer Singapore
Pages: 59-76
Number of pages: 18
ISBN (Electronic): 978-981-13-1687-6
ISBN (Print): 978-981-13-1686-9
DOIs
Publication status: Published - 2021

Publication series

Name: Natural Computing Series
ISSN (Print): 1619-7127

Keywords

  • Learning to learn
  • Meta-learning
  • Optimized reservoir
  • Reservoir computing
  • Spiking neural networks

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
