Summary (long): pattern recognition comparisons

Gale Martin galem at mcc.com
Sun Aug 5 17:48:25 EDT 1990


Leonard Uhr states (about NN learning) "to make learning work, 
we need to cut down and direct explosive search at least as much 
as using any other approach."  

Certainly there is reason to agree with this in the general case, but I
doubt its validity in important specific cases.  I've spent the past couple
of years working on backprop-based handwritten character recognition and
find almost no supporting evidence of a need to explicitly cut down
on explosive search through the use of heuristics in these
SPECIFIC cases and circumstances.

We varied the input character array size (10x16, 15x24, 20x32) presented to
backprop nets and found no difference in the number of training samples required
to achieve a given level of generalization performance for hand-printed
letters. In nets with one hidden layer, we increased the number
of hidden nodes from 50 to 383 and found no increase in the number of
training samples needed to achieve high generalization (in fact, generalization
is worse in the 50-hidden-node case).  We also experimented extensively with
locally-connected nets in this domain and found similarly little evidence
to support the need for such heuristics.  These results hold across two
different handwritten character recognition tasks (hand-printed letters
and digits).
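For readers who want the flavor of these comparisons, a minimal sketch of a
one-hidden-layer backprop net with the input and hidden sizes left as free
parameters follows.  Everything here (sigmoid units, squared error, the
learning rate, the function names) is an illustrative choice, not the
configuration used in the MCC experiments.

```python
import numpy as np

def make_net(n_in, n_hidden, n_out, seed=0):
    """One-hidden-layer net with small random initial weights."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.normal(0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(net, X):
    """X: (n_samples, n_in) array, e.g. flattened character images."""
    h = sigmoid(X @ net["W1"] + net["b1"])
    y = sigmoid(h @ net["W2"] + net["b2"])
    return h, y

def train_step(net, X, T, lr=0.5):
    """One batch gradient-descent step on squared error (plain backprop)."""
    h, y = forward(net, X)
    err = y - T
    dy = err * y * (1 - y)              # delta at the output layer
    dh = (dy @ net["W2"].T) * h * (1 - h)  # delta backpropagated to hidden layer
    net["W2"] -= lr * h.T @ dy / len(X)
    net["b2"] -= lr * dy.mean(axis=0)
    net["W1"] -= lr * X.T @ dh / len(X)
    net["b1"] -= lr * dh.mean(axis=0)
    return 0.5 * np.mean(err ** 2)      # current squared-error loss
```

Nets built with, say, make_net(10*16, 50, 26) and make_net(10*16, 383, 26)
can then be trained on the same samples and compared sample-for-sample,
which is the style of comparison described above.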
 
This domain/case-specific robustness across architectural parameters and 
input size is one way to characterize the generality of a learning algorithm 
and may recommend one algorithm over another for specific problems.

Gale Martin

Martin, G. L., & Pittman, J. A. (1990). Recognizing hand-printed letters and
	digits. In D. S. Touretzky (Ed.), Advances in Neural Information
	Processing Systems 2.
Martin, G. L., Leow, W. K., & Pittman, J. A. (1990). Function complexity
	effects on backpropagation learning. MCC Tech Report ACT-HI-062-90.
	