Technical Report Series in Neural and Computational Learning

John Shawe-Taylor john at dcs.rhbnc.ac.uk
Mon Jan 23 10:47:32 EST 1995


The European Community ESPRIT Working Group in Neural and Computational 
       Learning Theory (NeuroCOLT): two new reports available

----------------------------------------
NeuroCOLT Technical Report NC-TR-94-014:
----------------------------------------
Learning Minor Closed Graph Classes with Membership and Equivalence Queries
by John Shawe-Taylor, Dept of Computer Science, Royal Holloway, U. of London
   Carlos Domingo, Department of Software, U. Politècnica de Catalunya
   Hans Bodlaender, Dept of Computer Science, Utrecht University
   James Abello, Computer Science Dept, Texas A&M University

Abstract: 
The paper considers the problem of learning classes of graphs closed
under taking minors. It is shown that any such class can be properly
learned in polynomial time using membership and equivalence queries.
The representation of the class is in terms of a set of minimal
excluded minors (obstruction set).

----------------------------------------
NeuroCOLT Technical Report NC-TR-94-016:
----------------------------------------
On-line learning with minimal degradation in feedforward networks
by V Ruiz de Angulo, Institute for System Engineering and Informatics,
CEC Joint Research Center, Ispra, Italy and Carme Torras, CSIC-UPC,
Barcelona, Spain

Abstract: Dealing with non-stationary processes requires quick
adaptation while at the same time avoiding catastrophic forgetting. A
neural learning technique that satisfies these requirements, without
sacrificing the benefits of distributed representations, is presented.
It relies on a formalization of the problem as the minimization of the
error over the previously learned input-output (i-o) patterns, subject
to the constraint of perfect encoding of the new pattern. Then this
constrained optimization problem is transformed into an unconstrained
one with hidden-unit activations as variables. This new formulation
naturally leads to an algorithm for solving the problem, which we call
Learning with Minimal Degradation (LMD). Some experimental comparisons
of the performance of LMD with back-propagation are provided which,
besides showing the advantages of using LMD, reveal the dependence of
forgetting on the learning rate in back-propagation. We also explain why
overtraining affects forgetting and fault-tolerance, which are seen as
related problems.
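The constrained formulation described in the abstract can be illustrated in
miniature. The sketch below is a hypothetical simplification, not the authors'
LMD algorithm (which operates on the hidden-unit activations of a feedforward
net): it uses a single linear layer W, for which "minimise the error over the
old input-output patterns subject to perfect encoding of the new pattern" has
a closed-form KKT solution. All variable names and sizes are invented for the
example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out, n_old = 8, 3, 20
X_old = rng.standard_normal((n_in, n_old))   # previously learned inputs (one per column)
Y_old = rng.standard_normal((n_out, n_old))  # their target outputs
x_new = rng.standard_normal(n_in)            # new pattern to be encoded exactly
y_new = rng.standard_normal(n_out)

# Each output row w solves:  min ||X_old^T w - y_old||^2  s.t.  x_new^T w = y_i.
# The KKT conditions give  A w = b + lam * x_new,  with  A = X_old X_old^T.
A = X_old @ X_old.T + 1e-8 * np.eye(n_in)    # tiny ridge keeps A invertible
Ainv_x = np.linalg.solve(A, x_new)

W = np.empty((n_out, n_in))
for i in range(n_out):
    b = X_old @ Y_old[i]
    w_free = np.linalg.solve(A, b)           # unconstrained least-squares row
    lam = (y_new[i] - x_new @ w_free) / (x_new @ Ainv_x)
    W[i] = w_free + lam * Ainv_x             # correction along A^{-1} x_new

assert np.allclose(W @ x_new, y_new)         # new pattern encoded exactly
print(np.linalg.norm(W @ X_old - Y_old))     # residual degradation on old patterns
```

The printed residual is the "minimal degradation" of this toy version: the
smallest extra error on the old patterns compatible with fitting the new one
exactly.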

-----------------------
The Report NC-TR-94-014 can be accessed and printed as follows 

% ftp cscx.cs.rhbnc.ac.uk  (134.219.200.45)
Name: anonymous
password: your full email address
ftp> cd pub/neurocolt/tech_reports
ftp> binary
ftp> get nc-tr-94-014.ps.Z
ftp> bye
% zcat nc-tr-94-014.ps.Z | lpr -l

Similarly for the other technical report.

Uncompressed versions of the postscript files have also been
left for anyone not having an uncompress facility.

A full list of the currently available Technical Reports in the Series
is held in the file `abstracts' in the same directory.

Best wishes
John Shawe-Taylor





More information about the Connectionists mailing list