Preprint announcement: Online-Gibbs Learning

Jai Won Kim jkim at FIZ.HUJI.AC.IL
Mon Nov 27 14:38:09 EST 1995


Dear Jordan Pollack,
   
  We would like to post an announcement of a new preprint
on your network.  The title, authors, and abstract of the
paper are attached below.
 
Subject:  announcement of a new preprint: On-line Gibbs Learning

FTP-host: keter.fiz.huji.ac.il
FTP-file: pub/ON-LINE-LEARNING/online_gibbs.ps.Z

The length of the paper: 4 pages.

Thank you in advance for your help.
               
             Regards,
             Haim Sompolinsky and Jaiwon Kim
             e-mails : haim at fiz.huji.ac.il,  jkim at fiz.huji.ac.il 
     
_______________________________________________________________________________



                    On-line Gibbs Learning

                 J. W. Kim and H. Sompolinsky
      Racah Institute of Physics and Center for Neural Computation,
            Hebrew University, Jerusalem 91904, Israel 
         e-mails: jkim at fiz.huji.ac.il ; haim at fiz.huji.ac.il	

             (Submitted to Physical Review Letters, Nov 95)

            
                           ABSTRACT

   We propose a new model of on-line learning that is appropriate for
learning realizable and unrealizable functions, smooth as well as
threshold. Following each presentation of an example, the new weights
are chosen from a Gibbs distribution with an on-line energy that
balances the need to minimize the instantaneous error against the
need to minimize the change in the weights. We show that this algorithm
finds the weights that minimize the generalization error in the limit
of an infinite number of examples. The asymptotic rate of convergence is
similar to that of batch learning.
  

