Reprint Announcement

john kolen kolen-j at cis.ohio-state.edu
Wed Oct 6 12:01:50 EDT 1993


This is an announcement of a newly available paper in neuroprose:

			      RECURRENT NETWORKS:
		 STATE MACHINES OR ITERATED FUNCTION SYSTEMS?
				 John F. Kolen
			  Laboratory for AI Research
		Department of Computer and Information Science
			   The Ohio State University
			      Columbus, OH  43210
			  kolen-j at cis.ohio-state.edu

Feedforward neural networks process information by performing fixed
transformations from one representation space to another. Recurrent networks,
on the other hand, process information quite differently. To understand
recurrent networks, one must confront the notion of state, since recurrent
networks perform iterated transformations on state representations. Many
researchers
have recognized this difference and have suggested parallels between recurrent
networks and various automata. First, I will demonstrate how the common notion
of deterministic information processing does not necessarily hold for
deterministic recurrent neural networks whose dynamics are sensitive to initial
conditions. Second, I will link the mathematics of recurrent neural network
models with that of iterated function systems. This link points to
model-independent constraints on the recurrent network state dynamics that
explain universal behaviors of recurrent networks, such as internal state
clustering.
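The iterated-function-system view can be made concrete with a short sketch.
The Python code below is not from the paper; the two contractive affine maps
are illustrative assumptions. Each input symbol selects one map on a
one-dimensional hidden state, and iterating the selected maps over a random
input sequence drives the visited states into a few narrow bands, i.e. the
internal states cluster.

# A minimal sketch (not the paper's model): a recurrent state update viewed
# as an iterated function system.  Each input symbol picks one contractive
# affine map s -> a*s + b on the scalar state; the maps are illustrative.
import random

maps = {
    '0': lambda s: 0.4 * s + 0.0,   # map selected by input symbol '0'
    '1': lambda s: 0.4 * s + 0.6,   # map selected by input symbol '1'
}

def run(symbols, s0=0.5):
    """Iterate the input-selected maps from initial state s0."""
    states = [s0]
    for sym in symbols:
        states.append(maps[sym](states[-1]))
    return states

random.seed(0)
sequence = [random.choice('01') for _ in range(2000)]
states = run(sequence)

# Histogram of visited states: the mass concentrates in a few narrow bands
# (a Cantor-like attractor), so the states cluster even though the driving
# input sequence is random.
bins = [0] * 20
for s in states[100:]:              # discard the initial transient
    bins[min(int(s * 20), 19)] += 1
for i, count in enumerate(bins):
    print("[%.2f,%.2f)  %s" % (i / 20, (i + 1) / 20, '#' * (count // 20)))

Because both maps here are contractive, nearby initial states converge rather
than diverge; the sensitivity to initial conditions mentioned above only
appears when the state update is not a contraction everywhere, which can
happen in sigmoid recurrent networks with sufficiently large weights.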

This paper will appear in The Proceedings of the 1993 Connectionist Models
Summer School.

************************ How to obtain a copy ************************

Via Anonymous FTP:

unix> ftp archive.cis.ohio-state.edu
Name: anonymous
Password: (type your email address)
ftp> cd pub/neuroprose
ftp> binary
ftp> get kolen.rnifs.ps.Z
ftp> quit
unix> uncompress kolen.rnifs.ps.Z
unix> lpr kolen.rnifs.ps   (or however you normally print PostScript files)


