Connectionists: [Call for Papers] Special Session on Reservoir Computing - IJCNN 2023 - Gold Coast Australia
Claudio Gallicchio
claudio.gallicchio at unipi.it
Fri Jan 13 09:48:58 EST 2023
[Apologies for any cross-postings]
Special Session on Reservoir Computing: theory, models, and applications
18-23 June 2023, Gold Coast Convention and Exhibition Centre, Queensland, Australia
Paper submission deadline: 31 January 2023
More info at: IEEE Task Force on Reservoir Computing - IJCNN 2023 - Special Session<https://sites.google.com/view/reservoir-computing-tf/activities/ijcnn-2023-special-session?pli=1>
Paper submission guidelines: International Joint Conference on Neural Networks 2023 (ijcnn.org)<https://2023.ijcnn.org/authors/paper-submission>
Organisers
Andrea Ceni (University of Pisa, Italy), Claudio Gallicchio (University of Pisa, Italy), Gouhei Tanaka (University of Tokyo, Japan).
Description
Reservoir Computing (RC) is a popular approach for efficiently training Recurrent Neural Networks (RNNs), based on (i) constraining the recurrent hidden layers to develop stable dynamics, and (ii) restricting the training algorithms to operate solely on an output (readout) layer.
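For readers less familiar with the framework, the sketch below illustrates points (i) and (ii) with a minimal Echo State Network in Python/NumPy. It is purely illustrative and not tied to any specific model from this session: the reservoir size, spectral radius, washout length, and ridge regularization are arbitrary choices, and the toy sine-prediction task is made up for the example. The recurrent weights are generated randomly, rescaled for stable dynamics, and left untrained; only the linear readout is fit.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, n_out = 1, 100, 1

W_in = rng.uniform(-0.1, 0.1, size=(n_res, n_in))   # fixed input weights
W = rng.uniform(-1.0, 1.0, size=(n_res, n_res))      # fixed recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # rescale to spectral radius 0.9 (stability)

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ u + W @ x)                # untrained recurrent dynamics
        states.append(x.copy())
    return np.array(states)

# Toy task (hypothetical): one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u_seq = np.sin(t)[:, None]
y_seq = np.sin(t + 0.1)[:, None]

X = run_reservoir(u_seq)
washout = 100                                        # discard initial transient
X_tr, Y_tr = X[washout:], y_seq[washout:]

# Train only the readout, here via ridge regression (point (ii) above).
reg = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + reg * np.eye(n_res), X_tr.T @ Y_tr).T

y_pred = X_tr @ W_out.T                              # readout predictions
print("train MSE:", np.mean((y_pred - Y_tr) ** 2))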
Over the years, the field of RC has attracted considerable research attention, for several reasons. Besides the striking efficiency of its training algorithms, RC neural networks are distinctively amenable to hardware implementations (including unconventional neuromorphic substrates, such as those studied in photonics and material science), enable clean mathematical analysis (rooted, e.g., in random matrix theory), and find natural engineering applications in resource-constrained contexts, such as edge AI systems. Moreover, in the broader picture of Deep Learning development, RC is a breeding ground for testing innovative ideas, e.g., biologically plausible training algorithms beyond gradient back-propagation. Notably, although established in the Machine Learning field, RC lends itself naturally to interdisciplinary research, in which ideas and inspiration from diverse areas such as computational neuroscience, complex systems, and non-linear physics can lead to further developments and new applications.
This special session is intended to be a hub for discussion and collaboration within the Neural Networks community, and therefore invites contributions on all aspects of RC, from theory to new models to emerging applications.
We invite researchers to submit papers on all aspects of RC research, spanning theory, models, and applications.
Topics of Interest
Relevant topics for this session include, but are not limited to, the following:
* New Reservoir Computing models and architectures, including Echo State Networks and Liquid State Machines
* Hardware, physical and neuromorphic implementations of Reservoir Computing systems
* Learning algorithms in Reservoir Computing
* Reservoir Computing in Computational Neuroscience
* Reservoir Computing on edge systems
* Novel learning algorithms rooted in Reservoir Computing concepts
* Novel applications of Reservoir Computing, e.g., to images, video and structured data
* Federated and Continual Learning in Reservoir Computing
* Deep Reservoir Computing neural networks
* Theory of complex and dynamical systems in Reservoir Computing
* Extensions of the Reservoir Computing framework, such as Conceptors
Important Dates
* Paper submission deadline: January 31, 2023
* Decision notification: March 31, 2023
Submission Guidelines and Instructions
Paper submission for this Special Session follows the same process as for the regular sessions of IJCNN 2023, which uses EDAS as its submission system.
The review process for IJCNN 2023 will be double-blind. Prospective authors must therefore anonymize their manuscripts. Each paper should have a minimum of 6 and a maximum of 8 pages, including figures, tables, and references. Please refer to the Submission Guidelines at https://2023.ijcnn.org/authors/paper-submission for full information.
Submit your paper at the following link https://edas.info/N30081 and choose the track "Special Session: Reservoir Computing: theory, models, and applications", or use the direct link: https://edas.info/newPaper.php?c=30081&track=116064.
Note that anonymizing your paper is mandatory, and papers that explicitly or implicitly reveal the authors' identities may be rejected.
Sincerely,
The Organizing Team