Computational Issues in Neural Network Training

Wray Buntine wray at ptolemy.arc.nasa.gov
Thu Dec 17 20:40:49 EST 1992


First, thanks to Scott Markel for producing this summary.
It's rapid dissemination of important information like this
to non-participants that lets the field progress as a whole!!!

>  SQP on a Test Problem
>  ---------------------
>  Scott Markel (David Sarnoff Research Center - smarkel at sarnoff.com)
>  
>  I followed Roger's presentation with a short set of slides showing actual
>  convergence of a neural network training problem where SQP was the training
>  algorithm.  Most of the workshop participants had not seen this kind of
>  convergence before.  Yann Le Cun noted that with such sharp convergence
>  generalization would probably be pretty bad.  

I'd say not necessarily.  If you use a good regularization method,
then sharp convergence shouldn't harm generalization at all.

Of course, this raises the question:  what is a good
regularizing/complexity/prior/MDL term?
   (choose your own term depending on which regularizing fashion you follow.)
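To make the point concrete, here is a minimal sketch of the kind of term
under discussion, using a plain L2 (weight-decay) penalty on a linear
model.  All names, the lambda value, and the toy data are illustrative
assumptions, not anything from the post or the workshop; any of the
other fashions (MDL, Bayesian priors, etc.) would slot into the same
place in the loss.

```python
import numpy as np

def regularized_loss(w, X, y, lam=0.01):
    """Squared error on the data plus an L2 complexity penalty."""
    residual = X @ w - y
    data_term = 0.5 * np.mean(residual ** 2)
    complexity_term = 0.5 * lam * np.sum(w ** 2)  # the regularizer
    return data_term + complexity_term

def gradient(w, X, y, lam=0.01):
    """Gradient of the regularized loss with respect to w."""
    residual = X @ w - y
    return X.T @ residual / len(y) + lam * w

# Tiny demonstration: gradient descent on the penalized loss converges
# sharply, while the penalty keeps the weights from growing unchecked.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=50)

w = np.zeros(3)
for _ in range(500):
    w -= 0.1 * gradient(w, X, y)
```

The penalty slightly shrinks the fitted weights toward zero; the strength
of that shrinkage (lambda here) is exactly the "good term" the question
above is asking how to choose.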

Wray Buntine
NASA Ames Research Center                 phone:  (415) 604 3389
Mail Stop 269-2                           fax:    (415) 604 3594
Moffett Field, CA, 94035 		  email:  wray at kronos.arc.nasa.gov
