Connectionists: handwriting - fast deep nets & recurrent nets / formal theory of creativity / Nobels

Schmidhuber Juergen juergen at idsia.ch
Mon Oct 4 06:17:58 EDT 2010


Neural networks have achieved the best known performance in several recent handwriting recognition contests.

(1) For isolated digits we use deep feedforward neural nets trained by an ancient algorithm: backprop. No fashionable unsupervised pre-training is necessary! But graphics cards are used to accelerate learning by a factor of 50. This is sufficient to clearly outperform numerous previous, more complex machine learning methods on the famous MNIST benchmark.
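For readers who want the core algorithm in code: below is a toy sketch of plain backprop (full-batch gradient descent through sigmoid layers) on the XOR problem. All names and hyperparameters are my own choices; the MNIST nets mentioned above were of course much deeper, larger, and GPU-trained.

```python
import numpy as np

def train_xor(epochs=3000, lr=0.5, seed=0):
    """Plain backprop on a tiny 2-8-1 sigmoid net (toy sketch only)."""
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])            # XOR targets
    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)     # input -> hidden
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)     # hidden -> output
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    losses = []
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                          # forward pass
        out = sig(h @ W2 + b2)
        losses.append(float(np.mean((out - y) ** 2)))
        # backward pass: chain rule through the sigmoids and matmuls
        d_out = 2.0 * (out - y) / len(X) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)            # uses pre-update W2
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    return losses
```

The same mechanics, repeated over more layers and run on a GPU, are all the MNIST result needs.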

(2) For connected handwriting we use our bi-directional or multi-dimensional LSTM recurrent neural networks, which learn to maximize the probabilities of label sequences, given raw training sequences. This method won several handwriting competitions at ICDAR 2009.
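The label-sequence objective here is that of Connectionist Temporal Classification (CTC): sum, over all frame-level alignments, the probability that the net's per-frame outputs spell the target labels. A minimal sketch of the CTC forward recursion, with per-frame probabilities assumed given (function and variable names are mine):

```python
import numpy as np

def ctc_forward(probs, labels, blank=0):
    """P(labels | probs), summed over all alignments (CTC forward pass).

    probs  -- (T, K) per-frame label probabilities (K includes the blank)
    labels -- target label sequence without blanks, e.g. [1, 2]
    """
    T = probs.shape[0]
    ext = [blank]                     # interleave blanks: [1,2] -> [0,1,0,2,0]
    for l in labels:
        ext += [l, blank]
    S = len(ext)
    alpha = np.zeros((T, S))
    alpha[0, 0] = probs[0, ext[0]]
    if S > 1:
        alpha[0, 1] = probs[0, ext[1]]
    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1, s]
            if s > 0:
                a += alpha[t - 1, s - 1]
            # Skipping a blank is allowed only between distinct labels.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a += alpha[t - 1, s - 2]
            alpha[t, s] = a * probs[t, ext[s]]
    # Paths may end on the last label or on the trailing blank.
    return alpha[T - 1, S - 1] + (alpha[T - 1, S - 2] if S > 1 else 0.0)
```

The training gradient follows from a matching backward recursion; maximizing this probability is what the bi-directional and multi-dimensional LSTM nets above are trained to do.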

There is an overview web page with papers in Neural Computation, IEEE Transactions on PAMI, NIPS, and ICDAR:

http://www.idsia.ch/~juergen/handwriting.html

Do the recent results herald a renaissance of good old-fashioned neural networks?

---

Also available: a survey in IEEE TAMD (just published) on the formal theory of creativity and what's driving science / art / music / humor - the simple algorithmic principles of artificial scientists & artists. Here is the overview web page (with a video including attempts at applying the new theory of humor):

http://www.idsia.ch/~juergen/creativity.html

Cheers,

JS

PS: Also available for those interested in the history of science: Evolution of National Nobel Prize Shares in the 20th Century:
http://www.idsia.ch/~juergen/nobelshare.html
Other recent events: http://www.idsia.ch/~juergen/whatsnew.html

