Special Session on Reservoir Computing: theory, models, and applications
18-23 June 2023, Gold Coast Convention and Exhibition Centre, Queensland, Australia
Organizers: Andrea Ceni (University of Pisa, Italy), Claudio Gallicchio (University of Pisa, Italy), Gouhei Tanaka (University of Tokyo, Japan).
Reservoir Computing (RC) is a popular approach for efficiently training Recurrent Neural Networks
(RNNs), based on (i) constraining the recurrent hidden layers to develop stable dynamics,
and (ii) restricting the training algorithms to operate solely on an output (readout) layer.
Over the years, the field of RC has attracted considerable research attention, for several reasons. Besides the striking efficiency of its training algorithms, RC neural networks are distinctively amenable to hardware implementations (including unconventional neuromorphic substrates, like those studied in photonics and materials science), enable clean mathematical analysis (rooted, e.g., in the field of random matrix theory), and find natural engineering applications in resource-constrained contexts, such as edge AI systems. Moreover, in the broader picture of Deep Learning development, RC is a breeding ground for testing innovative ideas, e.g., biologically plausible training algorithms beyond gradient back-propagation. Notably, although established in the Machine Learning field, RC lends itself naturally to interdisciplinary research, in which ideas and inspiration from diverse areas such as computational neuroscience, complex systems, and non-linear physics can lead to further developments and new applications.
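The two-step recipe described above, (i) a fixed recurrent reservoir with stable dynamics and (ii) training restricted to a linear readout, can be sketched as a minimal Echo State Network. This is an illustrative sketch only; the hyperparameters (reservoir size, spectral radius, ridge coefficient) and the toy task are our own choices, not part of this call:

```python
import numpy as np

rng = np.random.default_rng(0)

# (i) Fixed random reservoir, rescaled so its spectral radius is below 1,
#     a common heuristic for keeping the recurrent dynamics stable.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the (untrained) reservoir with an input sequence; collect states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

# (ii) Train only the linear readout, here by ridge regression.
def fit_readout(states, targets, ridge=1e-6):
    return np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                           states.T @ targets)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 20, 400)).reshape(-1, 1)
S = run_reservoir(u[:-1])
W_out = fit_readout(S, u[1:])
pred = S @ W_out
```

Note that the recurrent weights `W` are never updated: all learning happens in the single linear solve for `W_out`, which is what makes training an RC model so cheap compared to back-propagation through time.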
This special session is intended to be a hub for discussion and collaboration within the Neural Networks community. We therefore invite researchers to submit papers on all aspects of RC, covering theory, new models, and emerging applications.
Relevant topics for this session include, but are not limited to, the following:
- New Reservoir Computing models and architectures, including Echo State
Networks and Liquid State Machines
- Hardware, physical and neuromorphic implementations of
Reservoir Computing systems
- Learning algorithms in
Reservoir Computing
- Reservoir Computing in Computational Neuroscience
- Reservoir Computing for edge systems
- Novel learning algorithms rooted in
Reservoir Computing concepts
- Novel applications of
Reservoir Computing, e.g., to images, video and structured data
- Federated and Continual Learning
in Reservoir Computing
- Deep Reservoir Computing neural networks
- Theory of complex and dynamical systems
in Reservoir Computing
- Extensions of the Reservoir Computing framework, such as Conceptors
Important Dates
- Paper submission deadline: January 31, 2023
- Decision notification: March 31, 2023
Submission Guidelines and Instructions
Paper submission for this Special Session follows the same process as the regular sessions of IJCNN 2023, which uses EDAS as the submission system.
The review
process for IJCNN 2023 will be double-blind.
Prospective authors must therefore anonymize their manuscripts. Each paper should have 6 to a maximum of 8 pages, including figures, tables, and references. Please refer to the Submission Guidelines at https://2023.ijcnn.org/authors/paper-submission for full information.
Papers that explicitly or implicitly reveal the authors' identities may be rejected.