Connectionists: Call for Papers: Special Issue of TPAMI on Learning Deep Architectures

Hugo Larochelle hugo.larochelle at usherbrooke.ca
Sun Jan 8 17:47:48 EST 2012


Call For Papers: Special Issue on Learning Deep Architectures

To be published in IEEE Transactions on Pattern Analysis and Machine Intelligence

http://www.computer.org/cms/Computer.org/transactions/cfps/cfp_tp_lda.pdf

Topic Description:

In recent years, there has been growing interest in architectures, algorithms, and signal/information processing techniques that learn to transform data through multiple layers of nonlinearities, hence the concept of deep architectures. Several approaches to learning deep architectures have been developed, including unsupervised feature learning, sparse coding, deep Boltzmann machines, stacked auto-encoders, deep belief networks, deep multilayer perceptrons, convolutional architectures, recursive compositional models, and various hierarchical generative models, all of which have been successfully applied to a variety of tasks in computer vision, speech recognition/understanding, audio processing, natural language processing, information retrieval, and robotics. These developments cut across many traditional and new research areas in machine learning, machine intelligence, pattern analysis, and signal/information processing.

This special issue invites submissions on the most recent developments in learning deep architectures and their relation to unsupervised feature learning and hierarchical learning algorithms, theoretical foundations, inference and optimization, semi-supervised and transfer learning, and applications to real-world tasks. We also welcome survey and overview papers in these areas. Topics of interest include, but are not limited to:

- deep learning architectures and algorithms
- unsupervised feature learning algorithms with deep architectures
- semi-supervised and transfer learning algorithms with deep architectures
- inference and optimization relevant to learning deep architectures
- theoretical foundations of unsupervised feature learning with deep
  architectures
- theoretical foundations of deep learning
- applications of unsupervised feature learning and supervised learning with
  deep architectures

Paper submission and review:

Papers must be submitted online, selecting the option that indicates this special issue. We will accept original research papers and overview/survey papers. Peer review will follow the standard IEEE review process. For research papers, priority will be given to those with high novelty and originality; for survey/overview papers, to those with high potential impact. Complete, full-length manuscripts are expected, following the TPAMI guidelines at

    http://www.computer.org/portal/web/peerreviewjournals/author.

If you are uncertain whether the topic of your paper is a good fit for this special issue, you are welcome to contact the guest editors listed below before writing the full-length manuscript.

Submission site: https://mc.manuscriptcentral.com/tpami-cs

Dates:

Submissions accepted until: April 1, 2012
First review results:       June 15, 2012
Second review results:      August 15, 2012
Final manuscripts due:      September 1, 2012
Publication date:           December 1, 2012

Guest editors:

Samy Bengio (bengio at google.com)
Li Deng (deng at microsoft.com)
Hugo Larochelle (hugo.larochelle at usherbrooke.ca)
Honglak Lee (honglak at eecs.umich.edu)
Ruslan Salakhutdinov (rsalakhu at utstat.toronto.edu)
Max Welling (welling at ics.uci.edu)
