backprop for classification

Russell Leighton russ at dash.mitre.org
Mon Aug 20 07:17:38 EDT 1990


We have found backprop VERY sensitive to the probability of
occurrence of each class. As long as you are aware of this,
you can use it to your advantage. For example, if
false alarms are a big concern, then by training
with large amounts of "noise" you can bias the system
to reduce the Pfa.

This effect has been quantified analytically and experimentally
for systems with no hidden layers in a paper being compiled now.
The bottom line is that a no-hidden-layer system implements
a classical Mini-Max test if the signal classes are represented
equally in the training set. By varying the composition
of the training sets, the network can be designed relative to
a known maximum false alarm probability independent of signal-to-noise
ratio. This work continues for multi-layer systems.
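The effect is easy to reproduce in a small sketch (my own illustration, not
taken from the MITRE report): a single sigmoid unit trained by gradient
descent (i.e., a backprop net with no hidden layer) on synthetic Gaussian
"signal" vs. "noise" data. Training once with balanced classes and once with
noise over-represented shifts the learned decision threshold and lowers the
estimated false-alarm probability. The class means, sample sizes, and
learning-rate settings below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n_signal, n_noise, sep=1.5):
    # 1-D features: noise ~ N(0,1), signal ~ N(sep,1)
    x = np.concatenate([rng.normal(0.0, 1.0, n_noise),
                        rng.normal(sep, 1.0, n_signal)])
    y = np.concatenate([np.zeros(n_noise), np.ones(n_signal)])
    return x, y

def train(x, y, lr=0.1, epochs=2000):
    # single sigmoid unit trained with cross-entropy gradient descent:
    # the "no hidden layer" backprop system discussed above
    w, b = 0.0, 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        g = p - y                      # gradient of cross-entropy loss
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return w, b

def pfa(w, b, n_test=20000):
    # estimated false-alarm probability: fraction of pure-noise
    # test samples the unit classifies as signal
    noise = rng.normal(0.0, 1.0, n_test)
    p = 1.0 / (1.0 + np.exp(-(w * noise + b)))
    return float(np.mean(p > 0.5))

pfa_balanced = pfa(*train(*make_data(1000, 1000)))  # equal class priors
pfa_biased   = pfa(*train(*make_data(1000, 5000)))  # noise over-represented

# over-representing noise in training pushes the threshold up,
# so the biased net should show the lower Pfa
print(pfa_balanced, pfa_biased)
```

Note that the reduced Pfa comes purely from the training-set composition,
not from any change to the network or the loss function.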

An experimental account of how to exploit this effect for
signal classification can be found in:

Wieland, et al., `An Analysis of Noise Tolerance for a Neural
Network Recognition System', Mitre Tech. Rep. MP-88W00021, 1988

and

Wieland, et al., `Shaping Schedules as a Method of Accelerated
Learning', Proceedings of the first INNS Meeting, 1988

Russ.


NSFNET: russ at dash.mitre.org

Russell Leighton
MITRE Signal Processing Lab
7525 Colshire Dr.
McLean, Va. 22102
USA
