Paper available

barberd barberd at helios.aston.ac.uk
Wed Nov 15 13:40:48 EST 1995


The following paper (a version of which was submitted to Europhysics
Letters) is available by anonymous ftp (instructions below).


              FINITE SIZE EFFECTS IN ON-LINE LEARNING
                  OF MULTI-LAYER NEURAL NETWORKS


            David Barber{2}, Peter Sollich{1} and David Saad{2}

{1} Department of Physics, University of Edinburgh, EH9 3JZ, UK
{2} Neural Computing Research Group, Aston University,
    Birmingham B4 7ET, United Kingdom

                  email: D.Barber at aston.ac.uk



                           Abstract


We complement the recent progress in thermodynamic limit analyses of
mean on-line gradient descent learning dynamics in multi-layer
networks by calculating the fluctuations exhibited by
finite-dimensional systems. Fluctuations from the mean dynamics are
largest at the onset of specialisation, as student hidden unit weight
vectors begin to imitate specific teacher vectors, and increase with
the degree of symmetry of the initial conditions. Including a term to
stimulate asymmetry in the learning process typically significantly
decreases finite size effects and training time.

Ftp instructions

ftp cs.aston.ac.uk
User: anonymous
Password: (type your e-mail address)
ftp> cd neural/barberd
ftp> binary
ftp> get online.ps.Z
ftp> quit
unix> uncompress online.ps.Z


       **PLEASE DO NOT REPLY DIRECTLY TO THIS MESSAGE** 

