Connectionists: Review: Reservoir computing & recurrent neural networks / unsupervised RNN / IDSIA PhD fellowships

Schmidhuber Juergen juergen at idsia.ch
Mon Sep 7 05:09:57 EDT 2009


Thanks for this survey!

Please allow me to point to a few selected earlier papers (below) on
unsupervised training procedures for recurrent neural networks (RNNs) -
in this new terminology you might call them reservoirs. We like to
think these were pretty much the first unsupervised RNNs. Please tell
us if you know of earlier ones - also for the upcoming RNN book:
http://www.idsia.ch/~juergen/rnnbook.html

Note also the connection to the hot topic of unsupervised deep nets
with many layers - RNNs are deep by nature, since unfolding them in
time yields many layers.
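
A minimal sketch (not from the original post; the function and weight
names are illustrative) of why RNNs are deep by nature: unrolling a
single recurrent layer over a sequence of length T gives a computation
graph that is T nonlinear layers deep, with shared weights.

import numpy as np

def rnn_unrolled(xs, W_in, W_rec, h0):
    # Plain tanh RNN: every time step adds one more nonlinear layer
    # to the unrolled computation graph, so a sequence of length T
    # corresponds to a depth-T feedforward net with tied weights.
    h = h0
    for x in xs:
        h = np.tanh(W_in @ x + W_rec @ h)
    return h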

IDSIA is still offering a few PhD fellowships along these lines:
http://www.idsia.ch/~juergen/sinergia2008.html
http://www.idsia.ch/~juergen/eu2009.html

Cheers,
Juergen
http://www.idsia.ch/~juergen/whatsnew.html

4. M. Klapper-Rybicka, N. N. Schraudolph, J. Schmidhuber. Unsupervised  
Learning in LSTM Recurrent Neural Networks. In G. Dorffner, H.  
Bischof, K. Hornik, eds., Proc. ICANN'01, Vienna, LNCS 2130, pages  
684-691, Springer, 2001.

3. S. Hochreiter and J. Schmidhuber. Flat Minima. Neural Computation,
9(1):1-42, 1997.
(Uses the flat minimum search (FMS) algorithm to find simple,
low-complexity RNNs among the many RNNs that can solve a given task -
another type of unsupervised learning, although some would file this
under "regularizers for RNNs".)

2. J. Schmidhuber. Learning unambiguous reduced sequence
descriptions. In J. E. Moody, S. J. Hanson, and R. P. Lippmann,
editors, Advances in Neural Information Processing Systems 4 (NIPS 4),
pages 291-298. San Mateo, CA: Morgan Kaufmann, 1992.
(Unsupervised compact sequence encoding to facilitate subsequent  
supervised learning.)

1. J. Schmidhuber. Learning complex, extended sequences using the  
principle of history compression. Neural Computation, 4(2):234-242,  
1992.
(Uses self-prediction to compactly encode sequences in unsupervised  
fashion. This facilitates subsequent supervised learning.)
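
A hedged toy illustration (not code from the papers) of the
history-compression principle behind items 1 and 2: a lower-level
predictor tries to guess the next input; only the inputs it fails to
predict, together with their time steps, are passed on, so a higher
level receives a much shorter description of the sequence. The
table-based predictor below stands in for the RNN predictor used in
the papers.

def compress(sequence):
    predictor = {}          # maps previous symbol -> predicted next symbol
    compressed = []         # (time step, unexpected symbol) pairs
    prev = None
    for t, x in enumerate(sequence):
        if predictor.get(prev) != x:   # prediction failed: keep this event
            compressed.append((t, x))
        predictor[prev] = x            # update the toy predictor
        prev = x
    return compressed

# Long predictable stretches collapse to a few surprising events:
print(compress("aaaaaaab"))   # [(0, 'a'), (1, 'a'), (7, 'b')]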


On Aug 27, 2009, at 5:08 PM, Mantas Lukosevicius wrote:
> A comprehensive survey article "Reservoir computing approaches to  
> recurrent neural network training" has been published in Computer  
> Science Review by M. Lukosevicius and H. Jaeger.
> Preprint:
> http://www.faculty.jacobs-university.de/hjaeger/pubs/2261_LukoseviciusJaeger09.pdf
> Article:
> http://dx.doi.org/10.1016/j.cosrev.2009.03.005
