Technical Report Available.

Jerome H. Friedman jhf at playfair.Stanford.EDU
Mon Aug 12 16:00:26 EDT 1996


                  *** Technical Report Available ***



                        LOCAL LEARNING BASED ON
                          RECURSIVE COVERING

                          Jerome H. Friedman
                          Stanford University
                      (jhf at playfair.stanford.edu)



                              ABSTRACT


Local learning methods approximate a global relationship between an output
(response) variable and a set of input (predictor) variables by establishing
a set of "local" regions that collectively cover the input space, and
modeling a different (usually simple) input-output relationship in each one.
Predictions are made by using the model associated with the particular
region in which the prediction point is most centered. Two widely applied
local learning procedures are K-nearest neighbor methods and decision tree
induction algorithms (CART, C4.5). The former induce a large number of
highly overlapping regions based only on the distribution of training input
values. By contrast, the latter (recursively) partition the input space into a
relatively small number of highly customized (disjoint) regions using the
training output values as well. Recursive covering unifies these two
approaches in an attempt to combine the strengths of both. A large number of
highly customized overlapping regions are produced based on both the
training input and output values. Moreover, the data structure representing
this cover permits rapid search for the prediction region given a set of
(future) input values.
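
For readers who want a concrete feel for the scheme before fetching the
report, here is a rough, illustrative Python sketch of the general idea as
stated in the abstract. It is NOT the procedure developed in the report:
the overlap rule, the "most centered" measure, the split candidates, and all
parameter values below are arbitrary assumptions made only for illustration.

    # Illustrative sketch only: NOT the algorithm from the report.  It mimics
    # the general scheme described in the abstract: regions are grown by
    # recursive, output-guided splitting, child regions are allowed to
    # overlap, each leaf region keeps a simple local model (here just the
    # mean response), and a prediction descends to the region in which the
    # query point is most centered.  Parameter choices (overlap fraction,
    # candidate split quantiles, depth limit) are arbitrary assumptions.
    import numpy as np

    class Region:
        def __init__(self, X, y, depth=0, max_depth=4, min_size=10, overlap=0.2):
            self.lo, self.hi = X.min(axis=0), X.max(axis=0)  # bounding box
            self.model = y.mean()                            # simple local model
            self.children = None
            if depth >= max_depth or len(y) < 2 * min_size:
                return
            # pick the (feature, threshold) split that most reduces squared
            # error, i.e. use the training OUTPUT values to shape the regions
            best = None
            for j in range(X.shape[1]):
                for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
                    left, right = y[X[:, j] <= t], y[X[:, j] > t]
                    if len(left) < min_size or len(right) < min_size:
                        continue
                    sse = ((left - left.mean()) ** 2).sum() + \
                          ((right - right.mean()) ** 2).sum()
                    if best is None or sse < best[0]:
                        best = (sse, j, t)
            if best is None:
                return
            _, j, t = best
            margin = overlap * (self.hi[j] - self.lo[j])
            # unlike a decision tree, the two children OVERLAP around the split
            left = X[:, j] <= t + margin
            right = X[:, j] >= t - margin
            self.children = [
                Region(X[left], y[left], depth + 1, max_depth, min_size, overlap),
                Region(X[right], y[right], depth + 1, max_depth, min_size, overlap),
            ]

        def centeredness(self, x):
            # 1.0 at the center of the bounding box, 0.0 at its boundary
            center = (self.lo + self.hi) / 2.0
            half = np.maximum((self.hi - self.lo) / 2.0, 1e-12)
            return 1.0 - np.max(np.abs(x - center) / half)

        def predict(self, x):
            if self.children is None:
                return self.model
            # rapid search: descend into the child in which x is most centered
            return max(self.children, key=lambda c: c.centeredness(x)).predict(x)

    # toy usage on a two-dimensional input
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(500, 2))
    y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=500)
    root = Region(X, y)
    print(root.predict(np.array([0.3, -0.5])))

The report itself presumably defines "most centered" and the construction of
the overlapping cover far more carefully; the sketch is meant only to make
the abstract's description concrete.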


Available by ftp from:
"ftp://playfair.stanford.edu/pub/friedman/dart.ps.Z"

Note: this PostScript file does not display properly in some versions of
Ghostview. It seems to print fine on nearly all PostScript printers.




