NIPS 96 Workshop Announcement: Dynamical Recurrent Networks, Day 2

Stefan C. Kremer kremer at running.dgcd.doc.ca
Mon Nov 18 13:33:17 EST 1996



NIPS 96 Workshop Announcement:
==============================

                    Dynamical Recurrent Networks
                      Post Conference Workshop
                                Day 2

              Organized by John Kolen and Stefan Kremer                    
                    Saturday, December 7, 1996
                         Snowmass, Colorado



Introduction:  

There has been significant interest in recent years in
dynamic recurrent neural networks and their application to control, system
identification, signal processing, and time series analysis and
prediction. Much of this work simply extends techniques that work well
for feedforward networks to recurrent networks. However, when
dynamics are added to a system there are many complex issues which are not
relevant to the study of feedforward nets, such as the existence of
attractors and questions of stability, controllability, and observability. 
In addition, the architectures and learning algorithms that work well for
feedforward systems are not necessarily useful or efficient in recurrent
systems.  

The first day of the workshop highlights the use of traditional
results from systems theory and nonlinear dynamics to analyze the behavior
of recurrent networks. The aim of the workshop is to expose recurrent
network designers to the traditional frameworks available in these well
established fields. A clearer understanding of the known results and open
problems in these fields, as they relate to recurrent networks, will
hopefully enable people working with recurrent networks to design more
robust systems which can be more efficiently trained.  This session will
overview known results from systems theory and nonlinear dynamics which
are relevant to recurrent networks, discuss their significance in the
context of recurrent networks, and highlight open problems.  (More
information about Day 1 of the workshop can be found at:
http://flute.lanl.gov/NIS-7_home_pages/jhowse/talk_abstracts.html).  

The second day of the workshop addresses the issues of designing and
selecting architectures and algorithms for dynamic recurrent networks.
Unlike previous workshops, which have typically focused on reporting the
results of applying specific network architectures to specific problems,
this session is intended to assist both users and developers of recurrent
networks to select appropriate architectures and algorithms for specific
tasks. In addition, this session will provide a backward flow of
information -- a forum where researchers can listen to the needs of
application developers. The wide variety, rapid development and diverse
applications of recurrent networks are sure to make for exciting and
controversial discussions. 




Day 1, Friday, Dec. 6, 1996 
===========================

More information about Day 1 of the workshop can be found at: 
http://flute.lanl.gov/NIS-7_home_pages/jhowse/talk_abstracts.html


Day 2, Saturday, Dec. 7, 1996
=============================

Target Audience: 

This workshop is targeted at two groups. First, application developers
faced with the task of selecting an appropriate tool for their problem
will find this workshop invaluable.  Second, researchers interested in
studying and extending the capabilities of dynamic recurrent networks
will wish to communicate their findings and observations to other
researchers in the area. In
addition, these researchers will have an opportunity to listen to the
needs of their technology's users. 

Format:

The format of the second day is designed to encourage open discussion. 
Presenters have provided 1 or 2 references to electronically accessible
papers. At the workshop itself, the presenter will be asked to briefly (10
minutes) discuss highlights, conclusions, controversial issues or open
problems of their research. This presentation will be followed by a 20
minute discussion period during which the expression of contrary opinions,
related problems and speculation regarding solutions to open problems will
be encouraged. The workshop will conclude with a one hour panel
discussion. 

Important Note to People Attending this Workshop: 

The goal of this workshop is to offer an opportunity for an open
discussion of important issues in the area of dynamic networks. To achieve
this goal, the presenters have been asked not to give a detailed
description of their work but rather to give only a very brief synopsis in
order to maximize the available discussion time. Attendees will get the
most from this workshop if they are already familiar with the details of
the work to be discussed. To make this possible, the presenters have made
papers relevant to the discussions available electronically via the links
on the workshop's web page. Attendees who are not already familiar with
the work of the presenters at the workshop are encouraged to examine the
workshop's web page (at: "http://running.dgcd.doc.ca/NIPS96/"), and 
to retrieve and examine the papers prior to attending the workshop. 

List of Talk Titles and Speakers:

Learning Markovian Models for Sequence Processing 
Yoshua Bengio, University of Montreal / AT&T Labs - Research

Guessing Can Outperform Many Long Time Lag Algorithms
Jürgen Schmidhuber, Istituto Dalle Molle di Studi sull'Intelligenza 
		Artificiale. 
Sepp Hochreiter, Fakultät für Informatik, Technische Universität München. 

Optimal Learning of Data Structure.
Marco Gori, Universita' di Firenze

How Embedded Memory in Recurrent Neural Network Architectures Helps
Learning Long-term Temporal Dependencies. 
T. Lin, B. Horne & C. Lee Giles, NEC Research Institute, Princeton, NJ. 

Discovering the time scale of trends and periodic structure.
Michael Mozer and Kelvin Fedrick, University of Colorado.

Title to be announced. 
Lee Feldkamp, Ford Motor Co. Labs. 

Representation and learning issues for RNNs learning context free languages
Janet Wiles and Brad Tonkes, Departments of Computer Science and Psychology,
       University of Queensland

Title to be announced. 
Speaker to be Announced

Long Short Term Memory. 
Sepp Hochreiter, Fakultät für Informatik, Technische Universität München. 
Jürgen Schmidhuber, Istituto Dalle Molle di Studi sull'Intelligenza 
	Artificiale. 


Web page:

Please note:  more detailed and up to date information regarding this
workshop, as well as the reference papers described above can be found
at the Workshop's web page located at:

	http://running.dgcd.doc.ca/NIPS96/

--
Dr. Stefan C. Kremer, Research Scientist, Artificial Neural Systems
Communications Research Centre, 3701 Carling Ave.,
P.O. Box 11490, Station H, Ottawa, Ontario   K2H 8S2

WWW: http://running.dgcd.doc.ca/~kremer/index.html
Tel: (613)990-8175  Fax: (613)990-8369 E-mail: Stefan.Kremer at crc.doc.ca 