hypertransfer and interference
Jaap Murre
jaap.murre at mrc-apu.cam.ac.uk
Tue Dec 13 11:59:33 EST 1994
DeLiang Wang asks whether any major effort has been undertaken to
investigate and possibly remedy the problem of catastrophic interference.
A fairly detailed analysis of this problem can be found in:
Murre, J.M.J. (1992b). Categorization and learning in modular neural
networks. Hemel Hempstead: Harvester Wheatsheaf. Co-published
by Lawrence Erlbaum in the USA and Canada (Hillsdale, NJ).
I have recently completed a paper that shows that backpropagation not
only suffers from 'catastrophic interference' but also from
'hypertransfer', i.e., in some circumstances performance on a set A actually
*improves* when learning a second set B. Relative to human learning data,
these transfer effects are either catastrophic (excessive negative
transfer) or hyper (excessive positive transfer). The
paper also shows that two-layer networks do not suffer from excessive
transfer and are in fact in very close accordance with the human
interference data as summarized in the classic paper by Osgood (1949):
Murre, J.M.J. (in press). Transfer of learning in backpropagation
networks and in related neural network models. To appear in
Levy, Bairaktaris, Bullinaria, & Cairns (Eds.), Connectionist
Models of Memory and Language. London: UCL Press. (In ftp
directory: hyper1.ps)
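For readers who want to see the basic effect, here is a minimal sketch
(not code from the paper; the network sizes, patterns, and learning
parameters are my own illustrative choices): a tiny backpropagation
network is trained on a pattern set A, then sequentially on a set B,
and its error on A is measured before and after. Error on A typically
rises sharply after learning B, which is the interference effect under
discussion.

```python
# Illustrative sketch of sequential interference in a backprop network.
# All sizes and patterns are arbitrary choices, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """One-hidden-layer network trained with plain batch backprop."""
    def __init__(self, n_in, n_hid, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))

    def forward(self, X):
        self.h = sigmoid(X @ self.W1)
        return sigmoid(self.h @ self.W2)

    def train(self, X, T, epochs=2000, lr=1.0):
        for _ in range(epochs):
            y = self.forward(X)
            # Deltas for sigmoid units under squared error.
            dy = (y - T) * y * (1.0 - y)
            dh = (dy @ self.W2.T) * self.h * (1.0 - self.h)
            self.W2 -= lr * self.h.T @ dy
            self.W1 -= lr * X.T @ dh

def error(net, X, T):
    return float(np.mean((net.forward(X) - T) ** 2))

# Set A and set B: disjoint one-hot inputs with different target mappings.
# The hidden-to-output weights are shared, so learning B disturbs A.
A_X, A_T = np.eye(4)[:2], np.array([[1.0, 0.0], [0.0, 1.0]])
B_X, B_T = np.eye(4)[2:], np.array([[1.0, 1.0], [0.0, 0.0]])

net = MLP(4, 3, 2)
net.train(A_X, A_T)
err_A_before = error(net, A_X, A_T)  # low: A has been learned
net.train(B_X, B_T)                  # sequential learning of set B
err_A_after = error(net, A_X, A_T)   # typically much higher: interference
print(err_A_before, err_A_after)
```

Whether one observes interference or facilitation depends on the overlap
between the two sets' input and output patterns, which is exactly the
dimension the Osgood (1949) summary organizes.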
I have put the paper in our anonymous ftp directory:
ftp ftp.mrc-apu.cam.ac.uk
cd /pub/nn/murre
bin
get hyper1.ps