Reservoir Computing (RC) denotes a class of recurrent neural models whose internal dynamics are left untrained after initialization, with learning confined to a simple readout layer. The approach is appealing for several reasons, including fast training, a natural suitability for edge computing, and strong theoretical foundations with implications for the basic properties of recurrent neural networks in general.
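To make the idea concrete, here is a minimal, hypothetical sketch of an Echo State Network (the best-known RC model): the input and recurrent weights are drawn at random and then frozen, and only a linear readout is fitted by ridge regression. All names, sizes, and the toy sine-prediction task below are illustrative assumptions, not part of any specific system described here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100  # assumed toy dimensions

# Fixed random input and recurrent weights (never adapted after init).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(0, 50, 0.1)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])
y = u[1:, 0]

washout = 50  # discard initial transient states
X, y = X[washout:], y[washout:]

# Train only the readout, via closed-form ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

Because only the linear readout is solved for, training reduces to a single least-squares problem, which is the source of RC's characteristic speed advantage over backpropagation through time.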
Echo State Networks and Liquid State Machines
Hybrids of fully-trained and RC models
Deep Reservoir Computing
Reservoir Computing for structured data (trees, graphs, networks, …)
Ensemble learning and Reservoir Computing
Trustworthy AI concepts for Reservoir Computing
Reservoir dimensionality reduction, efficient reservoir hyper-parameter search and learning
Reservoir Computing in Neuroscience
Theoretical analysis of Reservoir Computing
Statistical Learning Theory of Reservoir Computing networks
Reservoir Computing for AI applications (e.g., vision, natural language processing, health, bioinformatics, etc.)
Deadline for rebuttal: 31 May 2024
Final notification of acceptance or rejection after rebuttal: 10 June 2024
Camera-ready paper upload: 20 June 2024
Deadline for author registration and early-rate registration: 30 June 2024