Connectionists: Feedback nets for control, prediction, classification

Juergen Schmidhuber juergen at idsia.ch
Tue Oct 4 09:20:17 EDT 2005


8 new papers on evolution-based & gradient-based recurrent
neural nets, with Alex Graves, Daan Wierstra, Matteo Gagliolo,
Santiago Fernandez, Nicole Beringer, Faustino Gomez, in
Neural Networks, IJCAI 2005, ICANN 2005, GECCO 2005
(with a best paper award):

EVolution of recurrent systems with Optimal LINear Output
(EVOLINO). Basic idea: evolve a population of RNNs. To compute
an RNN's fitness, feed the training sequences into the RNN;
this yields sequences of hidden unit activations. Compute an
optimal linear mapping from these hidden activations to the target
trajectories. The fitness of the recurrent hidden units is the RNN's
performance on a validation set, given this mapping. Evolino-based LSTM nets
learn to solve several previously unlearnable time series
prediction tasks, and form a basis for the first recurrent support
vector machines: http://www.idsia.ch/~juergen/evolino.html
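The fitness recipe above can be sketched in a few lines of Python. This is only a toy illustration under my own assumptions (a plain tanh RNN instead of the LSTM cells used in the papers, a sine-prediction task, least-squares readout via NumPy, and simple truncation selection), not the published implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_states(Wh, Wx, xs):
    """Run a simple tanh RNN over input sequence xs; return hidden states."""
    h = np.zeros(Wh.shape[0])
    states = []
    for x in xs:
        h = np.tanh(Wh @ h + Wx @ x)
        states.append(h.copy())
    return np.array(states)

def fitness(Wh, Wx, train, valid):
    """EVOLINO-style fitness: fit an optimal linear readout on the
    training sequence, then score the RNN on the validation sequence."""
    xs_tr, ys_tr = train
    H = rnn_states(Wh, Wx, xs_tr)
    # Optimal linear mapping from hidden activations to targets (least squares).
    W_out, *_ = np.linalg.lstsq(H, ys_tr, rcond=None)
    xs_va, ys_va = valid
    Hv = rnn_states(Wh, Wx, xs_va)
    err = np.mean((Hv @ W_out - ys_va) ** 2)
    return -err  # higher fitness = lower validation error

# Toy task: predict a slightly time-shifted sine wave.
t = np.linspace(0, 8 * np.pi, 400)
xs = np.sin(t)[:, None]
ys = np.sin(t + 0.1)[:, None]
train = (xs[:300], ys[:300])
valid = (xs[300:], ys[300:])

n_hidden = 8
# Evolve a population of RNN weight matrices by truncation selection:
# keep the best 5, refill the population with mutated copies.
pop = [(rng.normal(0, 0.5, (n_hidden, n_hidden)),
        rng.normal(0, 0.5, (n_hidden, 1))) for _ in range(20)]
for gen in range(10):
    elite = sorted(pop, key=lambda w: fitness(*w, train, valid),
                   reverse=True)[:5]
    pop = elite + [(Wh + rng.normal(0, 0.05, Wh.shape),
                    Wx + rng.normal(0, 0.05, Wx.shape))
                   for Wh, Wx in elite for _ in range(3)]

best = max(pop, key=lambda w: fitness(*w, train, valid))
print("best fitness:", fitness(*best, train, valid))
```

Note that only the recurrent weights are evolved; the output weights are never mutated, since the optimal linear readout is recomputed analytically for each candidate.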

COEVOLVING RECURRENT NEURONS LEARN TO CONTROL
FAST WEIGHTS. For example, 3 co-evolving RNNs compute
quickly changing weight values for 3 fast weight networks steering
the 3 wheels of a mobile robot in a confined space in a realistic
3D physics simulation.  Without a teacher, the system also learns to
balance two poles connected by a joint: http://www.idsia.ch/~juergen/rnnevolution.html
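The fast-weight idea can be sketched as follows: a slow recurrent net emits, at each time step, the weight values of a small "fast weight" network, which then maps the current observation to a control output. A minimal sketch with hypothetical shapes and random (unevolved) slow weights, purely to show the wiring, not the papers' evolved controllers:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: 4 observations, 6 slow hidden units, a 4-in/1-out fast net.
n_obs, n_hid, n_fast_in, n_fast_out = 4, 6, 4, 1
Wh = rng.normal(0, 0.3, (n_hid, n_hid))   # slow net: recurrent weights
Wx = rng.normal(0, 0.3, (n_hid, n_obs))   # slow net: input weights
# The slow net's readout produces the fast network's entire weight matrix.
Wf = rng.normal(0, 0.3, (n_fast_out * n_fast_in, n_hid))

def control_step(h, obs):
    h = np.tanh(Wh @ h + Wx @ obs)                     # slow recurrent dynamics
    fast_W = (Wf @ h).reshape(n_fast_out, n_fast_in)   # quickly changing weights
    action = np.tanh(fast_W @ obs)                     # fast net computes control
    return h, action

h = np.zeros(n_hid)
for step in range(5):
    obs = rng.normal(size=n_obs)
    h, action = control_step(h, obs)
print("final action:", action)
```

In the announced work the slow recurrent neurons are coevolved (e.g. one per robot wheel), so the fitness of the whole team, not gradient descent, shapes how the fast weights change over time.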

EVOLUTION MAIN PAGE (links to work since 1987):
http://www.idsia.ch/~juergen/evolution.html

NEW RESULTS on bidirectional gradient-based RNNs for
phoneme recognition etc: http://www.idsia.ch/~juergen/rnn.html

Juergen Schmidhuber
TUM & IDSIA


