backprop for classification
xiru@Think.COM
Fri Aug 17 16:48:58 EDT 1990
While training a standard backprop network for a classification task
(one output unit per class), we found that when the classes are not
evenly distributed in the training set, e.g., 50% of the training data
belong to one class, 10% belong to another, ... etc., the network was
always biased towards the classes with a higher percentage in the
training set.
Thus, we had to post-process the output of the network, giving more weight
to the classes that occur less frequently (in inverse proportion to their
frequency in the training set).
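A minimal sketch of this kind of post-processing (the class frequencies and
network outputs below are made-up illustrative numbers, not from our
experiments): each output is divided by its class's training-set frequency
and the scores are renormalized before picking the winning class.

```python
import numpy as np

# Hypothetical class frequencies in the training set (50%, 20%, 20%, 10%).
class_freq = np.array([0.5, 0.2, 0.2, 0.1])

# Hypothetical raw network outputs for one input (one unit per class).
outputs = np.array([0.45, 0.25, 0.20, 0.10])

# Reweight each output in inverse proportion to its class frequency,
# then renormalize so the adjusted scores sum to 1.
adjusted = outputs / class_freq
adjusted /= adjusted.sum()

# The raw outputs favor class 0; after reweighting, the rarer class 1 wins.
predicted_class = int(np.argmax(adjusted))
```

Note how the raw argmax (class 0, the majority class) differs from the
adjusted argmax, which is the point of the correction.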
I wonder if other people have encountered the same problem, and whether
there are better ways to deal with it.
Thanks in advance for any replies.
- Xiru Zhang
Thinking Machines Corp.