Batch vs. Pattern Backprop

qin@turtle.fisher.com qin at turtle.fisher.com
Fri Jan 28 08:49:46 EST 1994


From:	UUCP%"ds2100!galab3.mh.ua.edu!brown" 28-JAN-1994 06:35:06.88
To:	cs.cmu.edu!Connectionists
CC:	
Subj:	Batch Backprop versus Incremental

>From: ds2100!galab3.mh.ua.edu!brown
>Message-Id: <9401271800.AA15191 at galab3.mh.ua.edu>
>Subject: Batch Backprop versus Incremental
>To: cs.cmu.edu!Connectionists
>Date: Thu, 27 Jan 1994 12:00:18 -0600 (CST)
>X-Mailer: ELM [version 2.4 PL22]
>Content-Type: text
>Content-Length: 1329      

>Dear Connectionists,
>	I attended the 5th International Conference on Genetic Algorithms
>this summer, and in one of the sessions on combinations of genetic 
>algorithms (GAs) and Neural Nets (ANNs) a gentleman from the U.K.
>suggested that Batch mode learning could possibly be unstable in
>the long term for backpropagation.  I did not know the gentleman
>and when I asked for a reference he could not provide one.

>Does anyone have any kind of proof stating that one method is better 
>than another?  Or that possibly batch backprop is unstable in <<Some>>
>sense?

>Any and all responses are thanked for in advance,
>Brown Cribbs


===============================================================
Brown,

I suggest the following paper for reference:

S. Z. Qin, et al. (1992). Comparison of four neural net learning methods for
system identification. IEEE Transactions on Neural Networks, vol. 3, no. 1,
pp. 122-130.


It is proven in the paper that the batch and pattern (incremental) learning
methods are equivalent for small learning rates. There is no result showing
that batch learning is more unstable. However, one simulation in the paper
shows that, at large learning rates, batch learning exhibits ripples while
pattern learning does not: the batch learning error decreases, then increases,
and then decreases again. Initially batch learning was a bit faster than
pattern learning, but it showed these ripples. My guess is that this behavior
is due to the particular error surface.

In summary, there is no significant difference for small learning rates,
but there is a difference for large learning rates. Although one simulation
example shows pattern learning to be more stable than batch learning,
this may not be generally true.
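The small-learning-rate equivalence is easy to see numerically. Below is a
minimal sketch (my own toy example, not from the paper; the data, learning
rate, and epoch count are all illustrative assumptions) comparing one batch
update per epoch against one pass of per-pattern updates on a tiny linear
least-squares fit:

```python
# Toy comparison of batch vs. pattern (incremental) gradient descent on a
# one-parameter least-squares fit. Illustrative only; not from the paper.
import random

random.seed(0)
# Toy data: y = 2*x plus a little uniform noise.
data = [(0.1 * i, 2.0 * (0.1 * i) + random.uniform(-0.1, 0.1))
        for i in range(10)]

def batch_step(w, lr):
    # Batch mode: sum the gradient over all patterns, then update once.
    grad = sum(2.0 * (w * x - y) * x for x, y in data)
    return w - lr * grad

def pattern_step(w, lr):
    # Pattern mode: update the weight after each individual pattern.
    for x, y in data:
        w -= lr * 2.0 * (w * x - y) * x
    return w

def train(step, lr, epochs=200):
    w = 0.0
    for _ in range(epochs):
        w = step(w, lr)
    return w

lr = 0.01  # small learning rate
w_batch = train(batch_step, lr)
w_pattern = train(pattern_step, lr)
print(w_batch, w_pattern)  # both settle near the true slope of 2
```

With a small learning rate the two weights end up essentially at the same
least-squares solution; the interesting differences only appear once the
learning rate is pushed large enough that the batch trajectory overshoots.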

S.Joe Qin
Fisher-Rosemount Systems, Inc.
1712 Centre Creek Drive
Austin, TX 78754
512-832-3635

