Paper on overfitting ... and learning verb past tense
Charles X Ling
ling at csd.uwo.ca
Tue Jun 21 03:43:03 EDT 1994
Hi. A few months ago I posted some questions on the overfitting effect
in neural-network learning, and I got some very helpful replies
from many people. Thanks a million! After much more work,
I have just finished a short paper (to be submitted) which contains
clear results on the overfitting issue. Your comments
and suggestions on the paper will be highly appreciated!
***********
Overfitting in Neural-Network Learning
of Discrete Patterns
Charles X. Ling
Abstract
Weigend reports that the presence or absence of overfitting
in neural networks depends on how the testing error is measured,
and that there is no overfitting in terms of the classification error.
In this paper, we show that, in terms of the classification error,
overfitting can be very evident depending on the representation
used to encode the attributes.
We design a simple learning problem involving a small Boolean function,
explain the rationale behind its construction, and present experimental
results to support our claims.
We verify our findings in the task of learning the past tense of
English verbs.
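To make the measurement issue concrete, here is a rough sketch (my own
illustration, not code from the paper) of the two ways of scoring a trained
network on held-out data: the squared error, under which Weigend reports
overfitting, and the 0/1 classification error. The function and variable
names are hypothetical.

import numpy as np

def squared_error(targets, outputs):
    # Mean squared error of the network's real-valued outputs on the test set.
    return np.mean((targets - outputs) ** 2)

def classification_error(targets, outputs, threshold=0.5):
    # Fraction of test examples whose thresholded output disagrees with
    # the target class (0/1 loss).
    predictions = (outputs >= threshold).astype(int)
    return np.mean(predictions != targets.astype(int))

Tracking both measures epoch by epoch on the same test set is what allows
overfitting to appear under one measure but not the other.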
Instructions for obtaining by anonymous ftp:
% ftp ftp.csd.uwo.ca
Name: anonymous
Password: <your email address>
ftp> cd pub/SPA/papers
ftp> get overfitting.ps
The paper is approx 280K and prints on 19 pages.
**********
While I am here, I'd like to ask if anyone is working on
connectionist models for learning the past tense of English
verbs, which I have worked on with the symbolic system SPA (see two papers
in the same directory as the overfitting paper). To ensure a more
direct comparison, if running SPA is needed, I'd be very happy to
assist and collaborate.
Regards,
Charles