Reservoirs learn to learn

Anand Subramoney, Franz Scherr, Wolfgang Maass

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

The common procedure in reservoir computing is to take a "found" reservoir, such as a recurrent neural network with randomly chosen synaptic weights or a complex physical device, and to adapt the weights of linear readouts from this reservoir for a particular computing task. We address the question of whether the performance of reservoir computing can be significantly enhanced if one instead optimizes some (hyper)parameters of the reservoir, not for a single task but for the whole range of tasks in which one is potentially interested, before the weights of the linear readouts are optimized for a particular computing task. After all, networks of neurons in the brain are also known not to be randomly connected. Rather, their structure and parameters emerge from complex evolutionary and developmental processes, arguably in a way that enhances the speed and accuracy of subsequent learning of any concrete task that is likely to be essential for the survival of the organism. We apply the Learning-to-Learn (L2L) paradigm to mimic this two-tier process: a set of (hyper)parameters of the reservoir is optimized for a whole family of learning tasks. We found that this substantially enhances the performance of reservoir computing for the families of tasks that we considered. Furthermore, L2L enables a new form of reservoir learning that is typically even faster, in which not even the weights of the readouts need to be adjusted for learning a concrete task. We present demos and performance results of these new forms of reservoir computing for reservoirs that consist of networks of spiking neurons, and that are hence of particular interest from the perspective of neuroscience and of implementations in spike-based neuromorphic hardware. We leave it as an open question what performance advantage the proposed methods provide for other types of reservoirs.
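
For concreteness, the two-tier procedure described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it substitutes a rate-based echo state network for the spiking reservoirs of the chapter, uses a hypothetical task family (predicting phase-shifted sinusoids of random frequency), and runs plain random search as the outer-loop optimizer; all names and parameter ranges are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sample_task():
    """One task from an assumed family: predict a shifted sinusoid of random frequency."""
    freq = rng.uniform(0.5, 2.0)
    t = np.linspace(0, 10, 500)
    u = np.sin(freq * t)           # input signal
    y = np.sin(freq * (t + 0.1))   # target: phase-shifted copy
    return u, y

def run_reservoir(u, spectral_radius, input_scaling, n=200):
    """Drive a random rate-based reservoir and return its state trajectory."""
    W = rng.standard_normal((n, n)) / np.sqrt(n)
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    w_in = input_scaling * rng.standard_normal(n)
    x = np.zeros(n)
    states = []
    for ut in u:
        x = np.tanh(W @ x + w_in * ut)
        states.append(x.copy())
    return np.array(states)

def inner_loop(u, y, spectral_radius, input_scaling):
    """Task-specific learning: fit only a linear readout (ridge regression)."""
    X = run_reservoir(u, spectral_radius, input_scaling)
    w_out = np.linalg.solve(X.T @ X + 1e-4 * np.eye(X.shape[1]), X.T @ y)
    return np.mean((X @ w_out - y) ** 2)

# Outer loop: optimize reservoir hyperparameters for the whole task family
# (here by simple random search; the chapter uses more powerful optimizers).
best = (np.inf, None)
for _ in range(20):
    sr, scale = rng.uniform(0.1, 1.5), rng.uniform(0.1, 2.0)
    loss = np.mean([inner_loop(*sample_task(), sr, scale) for _ in range(5)])
    if loss < best[0]:
        best = (loss, (sr, scale))
print("best hyperparameters (spectral radius, input scaling):", best[1])

The essential structure is that the outer loop scores each hyperparameter setting by the average loss achieved after inner-loop readout training across several sampled tasks, so the reservoir is shaped for the task family as a whole rather than for any single task.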
Original language: English
Title of host publication: Reservoir Computing
Subtitle of host publication: Theory, Physical Implementations, and Applications
Editors: Kohei Nakajima, Ingo Fischer
Publisher: Springer Singapore
Publication status: Published - 2020
