2 papers: 1. Phase transitions in multilayered nets, 2. Self-averaging and online learning

Michael Biehl biehl at physik.uni-wuerzburg.de
Wed May 13 09:42:21 EDT 1998


FTP-host:       ftp.physik.uni-wuerzburg.de 
FTP-filename:   /pub/preprint/1998/WUE-ITP-98-014.ps.gz
FTP-filename:   /pub/preprint/1998/WUE-ITP-98-002.ps.gz  

The following two manuscripts are now available via anonymous ftp,
see below for the retrieval procedure.  Alternatively, they can be 
obtained from the Wuerzburg Theoretical Physics preprint server on
the WWW: 

   http://theorie.physik.uni-wuerzburg.de/~publications.shtml

------------------------------------------------------------------
Ref.  WUE-ITP-98-014                                     [8 pages]

        Phase transitions in soft-committee machines
            M. Biehl, E. Schloesser, and M. Ahr

Equilibrium statistical physics is applied to layered neural networks 
with differentiable activation functions.  A first analysis of off-line 
learning in soft-committee machines with N input and K hidden units 
learning a perfectly matching rule is performed.
Our results are exact in the limit of high training temperatures. For
K=2 we find a second order phase transition from  unspecialized to 
specialized student configurations at a critical size P of the training 
set, whereas for K > 2 the transition is first order. 
The limit K to infinity can be performed analytically; the transition 
occurs after presenting on the order of N K examples. However, an
unspecialized metastable state persists up to P = O(N K^2).
Monte Carlo simulations indicate that our results remain qualitatively
valid at moderately low temperatures.

------------------------------------------------------------------
Ref.  WUE-ITP-98-002                                     [10 pages]

        Self-averaging and On-line Learning
           G. Reents and R. Urbanczik           
         (to appear in Phys. Rev. Letters)

Conditions are given under which one may prove that the stochastic
dynamics of on-line learning can be described by the deterministic
evolution of a finite set of order parameters in the thermodynamic
limit. A global constraint on the average magnitude of the increments 
in the stochastic process is necessary to ensure self-averaging. In the
absence of such a constraint, convergence may only be in probability.
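The self-averaging property described in the abstract can be illustrated
with a small numerical experiment. The sketch below is not the paper's
formalism; it assumes a much simpler setting (Hebbian on-line learning of
a teacher perceptron, with illustrative parameter choices) and tracks one
order parameter, the teacher-student overlap R = w.B/(|w||B|). Its
run-to-run spread shrinks as the input dimension N grows, which is what
self-averaging means: the stochastic trajectory concentrates on a
deterministic evolution in the thermodynamic limit.

```python
import math
import random

def overlap_after_training(N, alpha, seed):
    """Train a student perceptron on P = alpha*N random examples from a
    teacher B via Hebbian on-line updates; return the normalized
    teacher-student overlap R = w.B / (|w| |B|)."""
    rng = random.Random(seed)
    B = [1.0] * N                       # teacher weights, |B|^2 = N
    w = [0.0] * N                       # student starts from zero
    for _ in range(int(alpha * N)):
        x = [rng.gauss(0.0, 1.0) for _ in range(N)]
        sigma = 1.0 if sum(b * xi for b, xi in zip(B, x)) >= 0.0 else -1.0
        for i in range(N):              # w <- w + sigma * x / sqrt(N)
            w[i] += sigma * x[i] / math.sqrt(N)
    wnorm = math.sqrt(sum(wi * wi for wi in w))
    return sum(wi * bi for wi, bi in zip(w, B)) / (wnorm * math.sqrt(N))

def mean_and_spread(N, alpha=3.0, runs=6):
    """Mean of R and its max-min range over independent runs; the range
    is a rough measure of sample-to-sample fluctuations."""
    Rs = [overlap_after_training(N, alpha, seed) for seed in range(runs)]
    return sum(Rs) / runs, max(Rs) - min(Rs)

if __name__ == "__main__":
    for N in (20, 100, 400):
        mean_R, dR = mean_and_spread(N)
        print(f"N={N:4d}  mean R={mean_R:.3f}  spread={dR:.3f}")
```

The spread of R decays roughly like 1/sqrt(N), so with growing N a single
run suffices to read off the "typical" learning curve; the paper's point
is to state conditions under which this concentration can actually be
proven for general on-line rules.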
          

-------------------------------------------------------------------

Retrieval procedure via anonymous ftp: 

     unix> ftp  ftp.physik.uni-wuerzburg.de
     Name: anonymous  Password: {your e-mail address}
     ftp>  cd pub/preprint/1998
     ftp>  binary
     ftp>  get WUE-ITP-98-XXX.ps.gz               (*)   
     ftp>  quit
     unix> gunzip WUE-ITP-98-XXX.ps.gz
e.g. unix> lp -odouble WUE-ITP-98-XXX.ps                        

(*) can be replaced by "get WUE-ITP-98-XXX.ps". The file will then
    be uncompressed before transmission (slow!). 
___________________________________________________________________ 

  e-mail :   biehl at physik.uni-wuerzburg.de
            reents at physik.uni-wuerzburg.de 


