Why does the error rise in an SRN?

mav@cs.uq.oz.au
Wed Apr 1 18:49:46 EST 1992


I have been working with the Simple Recurrent Network (Elman style)
and variants thereof for some time. Something which happens with
surprising frequency is that the error decreases for a period and
then starts to increase again. I have seen the same phenomenon using
both root-mean-square and d-prime error measures, and it often
unfolds over quite long time periods (several thousand epochs). The
task that I have studied most carefully is episodic recognition. A
list (usually very short) of words is given, then a recognize
symbol, and then a probe item which either was or was not in the
list; the task is to decide which. Following is a set of example
inputs and outputs, with a rough sketch of the training setup after
the example:

	Input		Output

	table		blank
	beach		blank
	king		blank
	recognize	blank
	beach		yes

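To make the setup concrete, here is a rough NumPy sketch of the kind
of network and training loop I have in mind. It is an illustration,
not my actual simulator: the vocabulary, list length of three, hidden
layer size, learning rate, and the single yes/no output unit (which
collapses "blank" and "no" into a zero target) are all arbitrary
placeholders.

import numpy as np

# Illustrative vocabulary and sizes (placeholders, not the real experiment).
WORDS = ["table", "beach", "king", "apple", "recognize"]
IDX = {w: i for i, w in enumerate(WORDS)}

rng = np.random.default_rng(0)
n_in, n_hid, n_out = len(WORDS), 10, 1   # one output unit: 1 = "yes", 0 = "blank"/"no"

W_ih = rng.normal(0.0, 0.1, (n_hid, n_in))    # input   -> hidden
W_hh = rng.normal(0.0, 0.1, (n_hid, n_hid))   # context -> hidden (copy of previous hidden state)
W_ho = rng.normal(0.0, 0.1, (n_out, n_hid))   # hidden  -> output

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_trial():
    # One trial: a three-word study list, "recognize", then an old or new probe.
    study = list(rng.choice(WORDS[:-1], size=3, replace=False))
    old = rng.random() < 0.5
    probe = rng.choice(study) if old else [w for w in WORDS[:-1] if w not in study][0]
    seq = study + ["recognize", probe]
    targets = [0.0] * 4 + [1.0 if old else 0.0]
    return seq, targets

def run_epoch(lr=0.1, n_trials=50):
    # Elman-style training: the context is a copy of the previous hidden state,
    # and the error is not propagated back through that copy (no full BPTT).
    global W_ih, W_hh, W_ho
    sq_err, n = 0.0, 0
    for _ in range(n_trials):
        seq, targets = make_trial()
        context = np.zeros(n_hid)
        for word, t in zip(seq, targets):
            x = np.zeros(n_in)
            x[IDX[word]] = 1.0
            h = sigmoid(W_ih @ x + W_hh @ context)
            y = sigmoid(W_ho @ h)
            err = float(y[0]) - t
            sq_err += err ** 2
            n += 1
            # One-step backprop; the context is treated as a fixed extra input.
            d_out = (y - t) * y * (1.0 - y)
            d_hid = (W_ho.T @ d_out) * h * (1.0 - h)
            W_ho -= lr * np.outer(d_out, h)
            W_ih -= lr * np.outer(d_hid, x)
            W_hh -= lr * np.outer(d_hid, context)
            context = h
    return np.sqrt(sq_err / n)   # RMS error over the epoch

for epoch in range(3000):
    rms = run_epoch()
    if epoch % 100 == 0:
        print(epoch, rms)

The per-epoch RMS printed here stands in for the quantity I see
turning back upward after several thousand epochs.
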
Questions:

(1) Has anyone else noticed this? 
(2) Is it task dependent?
(3) Why does it happen?


Simon Dennis.


