Taxonomy of Neural Network Optimality
Scott.Fahlman@SEF-PMAX.SLISP.CS.CMU.EDU
Tue Jul 30 12:33:03 EDT 1991
PERFORMANCE:
1. Most accurate on training set.
2. Most accurate on test set.
3. Best at generalization.
What does this mean if it is not the same as 2? Also, "most accurate" might
mean the number of cases wrong or something like sum-squared error, depending
on the problem.
4. Performance independent of starting weights.
5. Performance independent of training exemplar order.
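The ambiguity noted under item 3 can be made concrete: counting cases wrong and summing squared error are different measures and can rank two networks differently. A minimal sketch (the function names and toy data are illustrative, not from the original post):

```python
def cases_wrong(targets, outputs, threshold=0.5):
    # Classification view: count exemplars where the thresholded
    # output disagrees with the target class.
    return sum(1 for t, o in zip(targets, outputs)
               if (o >= threshold) != (t >= threshold))

def sum_squared_error(targets, outputs):
    # Continuous view: penalizes near-misses as well as outright errors.
    return sum((t - o) ** 2 for t, o in zip(targets, outputs))

targets = [1.0, 0.0, 1.0, 0.0]
outputs = [0.9, 0.4, 0.45, 0.1]
# Only one case is wrong here, yet the sum-squared error is dominated
# by the near-miss at 0.45 -- so "most accurate" depends on the measure.
```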
TRAINING:
6. Trains in fewest epochs.
Some problems and algorithms just don't fit into epochs. Probably better
to use "pattern presentations", but some algorithms don't even fit into
that.
7. Trains in fewest Floating Point/Integer Operations.
8. Trains in least clock time.
Machine-dependent, of course, so it says very little about the algorithm.
9. Trains in fewest exemplars.
10. Uses least memory.
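Items 6, 7, and 9 can all be instrumented in the same training loop: count epochs, pattern presentations, and (roughly) arithmetic operations, so that batch and online algorithms stay comparable. A sketch using a simple perceptron on the AND function; the counters and the operation estimates are illustrative assumptions, not a standard accounting:

```python
def train_perceptron(data, lr=0.1, max_epochs=100):
    # Train a threshold unit, reporting the costs from items 6, 7, and 9:
    # epochs, pattern presentations, and an estimate of arithmetic operations.
    n = len(data[0][0])
    w = [0.0] * n
    b = 0.0
    presentations = ops = 0
    for epoch in range(1, max_epochs + 1):
        errors = 0
        for x, t in data:
            presentations += 1
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            ops += 2 * n + 1          # multiply-adds for the dot product
            y = 1 if s >= 0 else 0
            if y != t:
                errors += 1
                for i in range(n):
                    w[i] += lr * (t - y) * x[i]
                b += lr * (t - y)
                ops += 2 * n + 2      # weight-update arithmetic
        if errors == 0:               # converged: a full pass with no errors
            break
    return w, b, epoch, presentations, ops

# AND function: linearly separable, so the perceptron converges.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b, epochs, presentations, ops = train_perceptron(data)
```

Note that `presentations == epochs * len(data)` only because this loop makes full passes; an online algorithm that samples patterns would report presentations but have no natural notion of an epoch, which is the point of the comment under item 6.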
TOPOLOGY:
11. Has fewest layers.
"Layers" may be ill-defined. Maybe look instead at the longest path from
input to output.
12. Has fewest nodes.
13. Has fewest interconnects.
14. Distributed representation (fault tolerant).
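The "longest path from input to output" suggested under item 11 is easy to compute for a feed-forward (acyclic) net, and the same connection list gives the interconnect count of item 13. A sketch over a hypothetical net with a layer-skipping connection (the unit names and topology are made up for illustration):

```python
# Connections run from each unit to the units it feeds.
# "in1" feeds the output directly, skipping the hidden units,
# which is what makes "layers" ill-defined here.
edges = {
    "in1": ["h1", "out"],
    "in2": ["h1"],
    "h1":  ["h2", "out"],
    "h2":  ["out"],
    "out": [],
}

def longest_path(unit, edges):
    # Longest chain of connections from `unit` to any terminal unit.
    # Valid only for acyclic nets; a cycle would recurse forever.
    succs = edges[unit]
    if not succs:
        return 0
    return 1 + max(longest_path(s, edges) for s in succs)

depth = max(longest_path(u, edges) for u in ("in1", "in2"))   # item 11
n_connects = sum(len(v) for v in edges.values())              # item 13
```

For this net the depth is 3 (in1 -> h1 -> h2 -> out) even though no consistent layer assignment exists, which is why path length is the more robust measure.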
A few others:
15. How hard is it to partition/parallelize the algorithm?
16. How many parameters must the user adjust, and how critical are the adjustments?
17. Related to 16: What chance of immediate success on a new problem?
18. Range of problems covered:
Discrete vs. analog inputs and outputs
Can it handle time series?
    Can it handle noisy data? (i.e., misclassifying a few training points
    leads to better generalization)
19. {Debugged, supported, portable, free} implementation available?
20. If not, how hard is the algorithm to implement?
21. Biologically plausible?
22. How does it scale with problem size?
-- Scott