catastrophic forgetting
David ELIZONDO
elizondo at axone.u-strasbg.fr
Thu Sep 3 03:36:04 EDT 1998
The Recursive Deterministic Perceptron (RDP) is an example of a
neural network model that does not suffer from catastrophic
interference. This feedforward multilayer neural network is a
generalization of the single layer perceptron topology (SLPT) that
can handle both linearly separable and non-linearly separable
problems.
Due to the incremental learning nature of the RDP neural networks,
the problem of catastrophic interference does not arise with this
learning method: the topology is built one step at a time by adding
an intermediate neuron (IN) to the topology, and once a new IN is
added, its weights are frozen.
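The following sketch (in Python, using hypothetical names such as
RDPSketch and train_slpt; this is not the authors' code) illustrates
the incremental idea under a simplifying assumption: the linearly
separable subsets are supplied by hand, whereas the papers below
describe methods for selecting them automatically. Each IN is an
SLPT trained to pick out one such subset; it is then frozen and its
output is appended to the input as an extra dimension before the
next neuron is trained.

    import numpy as np

    def train_slpt(X, y, epochs=1000, lr=0.1):
        """Perceptron rule for a single layer perceptron topology (SLPT).
        Returns the weight vector (bias first) and whether X was separated."""
        w = np.zeros(X.shape[1] + 1)
        for _ in range(epochs):
            errors = 0
            for xi, yi in zip(X, y):
                pred = 1 if w[0] + xi @ w[1:] > 0 else 0
                if pred != yi:
                    w[0] += lr * (yi - pred)
                    w[1:] += lr * (yi - pred) * xi
                    errors += 1
            if errors == 0:
                return w, True
        return w, False

    class RDPSketch:
        """Incremental construction: each intermediate neuron (IN) is
        trained, frozen, and its output appended as an extra input."""

        def __init__(self):
            self.frozen = []      # weights of frozen intermediate neurons
            self.output_w = None  # weights of the final output neuron

        def _augment(self, X):
            # Append one column per frozen IN; each IN sees the inputs plus
            # the outputs of the INs added before it, so dimensions match.
            Xa = np.asarray(X, dtype=float)
            for w in self.frozen:
                h = (w[0] + Xa @ w[1:] > 0).astype(float).reshape(-1, 1)
                Xa = np.hstack([Xa, h])
            return Xa

        def add_intermediate_neuron(self, X, s_mask):
            # Train an SLPT to output 1 on the chosen linearly separable
            # subset (s_mask == 1) and 0 elsewhere, then freeze its weights.
            w, ok = train_slpt(self._augment(X), s_mask)
            if ok:
                self.frozen.append(w)
            return ok

        def fit_output(self, X, y):
            # The output neuron is an SLPT trained on the augmented inputs.
            self.output_w, ok = train_slpt(self._augment(X), y)
            return ok

        def predict(self, X):
            Xa = self._augment(X)
            return (self.output_w[0] + Xa @ self.output_w[1:] > 0).astype(int)

    # XOR is not linearly separable; with one frozen IN it becomes so.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0])
    rdp = RDPSketch()
    rdp.add_intermediate_neuron(X, s_mask=np.array([0, 0, 0, 1]))  # subset {(1,1)}
    rdp.fit_output(X, y)
    print(rdp.predict(X))  # -> [0 1 1 0]

Because each IN is frozen as soon as it is added, later training can
never disturb what earlier neurons have learned, which is why
catastrophic interference does not arise.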
Here are two references describing this model:
M. Tajine and D. Elizondo.
The recursive deterministic perceptron neural network.
Neural Networks (Pergamon Press).
Acceptance date: March 6, 1998
M. Tajine and D. Elizondo.
Growing methods for constructing recursive deterministic perceptron
neural networks and knowledge extraction.
Artificial Intelligence (Elsevier).
Acceptance date: May 6, 1998
A limited number of pre-print hard copies are available.