Connectionists: New JETCAS Special Issue: Low-Power, Adaptive Neuromorphic Systems: Devices, Circuits, Architectures and Algorithms

Arindam Basu arindam.basu at ntu.edu.sg
Sat Jan 21 03:09:57 EST 2017


Dear all,



** Apologies for cross-posting



I would like to draw your attention to a new special issue of JETCAS on "Low-Power, Adaptive Neuromorphic Systems: Devices, Circuits, Architectures and Algorithms" that I am co-editing with colleagues. The manuscript submission deadline is April 30, 2017. We look forward to receiving your contributions.


IEEE JOURNAL ON
EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS

CALL for PAPERS
Low-Power, Adaptive Neuromorphic Systems: Devices, Circuits, Architectures and Algorithms
Guest Editors

Arindam Basu* (arindam.basu at ntu.edu.sg), Nanyang Technological University
Tanay Karnik (tanay.karnik at intel.com), Intel
Hai Li (hai.li at duke.edu), Duke University
Elisabetta Chicca (chicca at cit-ec.uni-bielefeld.de), Bielefeld University
Meng-Fan Chang (mfchang at mx.nthu.edu.tw), National Tsing Hua University
Jae-sun Seo (jaesun.seo at asu.edu), Arizona State University

(* Corresponding guest editor)

Scope and Purpose
The recent success of deep neural networks (DNNs) has renewed interest in bio-inspired machine learning algorithms. A DNN is a neural network with multiple layers of neurons interconnected by tunable weights. Though these architectures are not new, the availability of large datasets and massive computing power, together with new training techniques that prevent the networks from over-fitting (such as unsupervised initialization, rectified linear units as the neuronal nonlinearity, and regularization via dropout or sparsity), has led to their recent success. DNNs have been applied to a variety of fields, including object and face recognition in images, word recognition in speech, and natural language processing, and their success stories continue to grow.
However, the common training methods in deep learning, such as back-propagation, tune the weights of a neural network based on the gradient of an error function, which requires a known output value for every input. Such supervised learning methods are difficult to apply to real-time sensory input data, which are mostly unlabeled. In addition, the training and classification phases of deep neural networks are typically separated: training occurs in the cloud or on high-end graphics processing units, while the weights or synapses remain fixed during deployment for classification. This makes it difficult for the network to adapt continuously to input or environment changes in real-world applications. By adopting the unsupervised and semi-supervised learning rules found in biological nervous systems, we anticipate enabling adaptive neuromorphic systems for many real-time applications with large amounts of unlabeled data, similar to how humans analyze and associate sensory input. Energy-efficient hardware implementation of such adaptive neuromorphic systems is particularly challenging because of the intensive computation, memory, and communication required for online, real-time learning and classification. Cross-layer innovations spanning algorithms, architectures, circuits, and devices are needed to enable adaptive intelligence, especially on embedded systems with severe power and area constraints.
Topics of Interest
This special issue invites submissions on all aspects of adaptive neuromorphic systems across algorithms, devices, circuits, and architectures. Scalability toward human brain-scale computing with energy-efficient online learning is especially desired. Submissions are welcome on the following and other related topics:

- Spin-mode adaptive neuromorphics using devices such as spin-transfer nano-oscillators, domain-wall memory, tunneling magnetoresistance, the inverse spin Hall effect, etc.

- Memristive-technology-based learning synapses and neurons

- Neuromorphic implementations of synaptic plasticity, short-term adaptation, and homeostatic mechanisms

- Self-learning synapses (STDP and variants) and self-adaptive neuromorphic systems

- High-fan-in, scalable interconnect fabrics mimicking brain-scale networks

- Circuits and systems for efficient interfacing with post-CMOS memory-based learning synapses

- Design methodologies and tools for adaptive neuromorphic systems with post-CMOS devices

- Algorithm, device, circuit, and architecture co-design for energy-efficient adaptive neuromorphic hardware
Important Dates

1. Manuscript submissions due: April 30, 2017

2. First decision: July 15, 2017

3. Revised manuscripts due: August 15, 2017

4. Final decision: October 15, 2017

5. Final manuscripts due: November 15, 2017
Request for Information
Arindam Basu (arindam.basu at ntu.edu.sg)

https://mc.manuscriptcentral.com/jetcas




Best regards,
Arindam

http://www3.ntu.edu.sg/home/arindam.basu/


