A neat idea from L. Breiman
Ken Laws
LAWS at ai.sri.com
Tue Jun 6 06:52:25 EDT 2006
L. Breiman's "back fitting" sounds very much like the search
strategy in some of the fancier stepwise multiple regression
programs. At each step, the best remaining variable is added
to the regression equation. Then other variables in the
equation are tested to see if any can be dropped out. The
"repeat until quiesence" search isn't usually performed, but
I suppose that it could have its uses.
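For the flavor of it, here is a minimal sketch of that add-then-prune
loop in Python, run until it settles. It isn't Breiman's procedure or
any particular regression package; the R-squared scoring and the
enter/drop thresholds are placeholders of my own:

    import numpy as np

    def r2(X, y, cols):
        # Coefficient of determination for a least-squares fit on the chosen columns.
        if not cols:
            return 0.0
        A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        tss = (y - y.mean()) @ (y - y.mean())
        return 1.0 - (resid @ resid) / tss

    def stepwise(X, y, enter=0.01, drop=0.005, max_sweeps=100):
        chosen = []
        for _ in range(max_sweeps):       # cap the sweeps; see the cycle warning below
            changed = False
            # Forward step: add the single best remaining variable if it helps enough.
            rest = [c for c in range(X.shape[1]) if c not in chosen]
            if rest:
                best = max(rest, key=lambda c: r2(X, y, chosen + [c]))
                if r2(X, y, chosen + [best]) - r2(X, y, chosen) > enter:
                    chosen.append(best)
                    changed = True
            # Backward step: drop any variable whose removal costs almost nothing.
            for c in list(chosen):
                if r2(X, y, chosen) - r2(X, y, [d for d in chosen if d != c]) < drop:
                    chosen.remove(c)
                    changed = True
            if not changed:               # quiescence: no add and no drop this sweep
                break
        return chosen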
There are also clustering algorithms that have this flavor,
notably ISODATA. As clusters are grown or merged, it's possible
for data points to drop out.
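This isn't ISODATA itself, just a toy sketch of that flavor, assuming
a points array, integer cluster labels, a centers array, and a
max_dist threshold of my own choosing:

    import numpy as np

    def merge_and_prune(points, labels, centers, a, b, max_dist):
        # Merge cluster b into cluster a, recompute a's center, then set aside
        # ("drop out") any point now farther than max_dist from its cluster center.
        labels = np.where(labels == b, a, labels)
        centers = centers.copy()
        centers[a] = points[labels == a].mean(axis=0)
        dists = np.linalg.norm(points - centers[labels], axis=1)
        kept = dists <= max_dist
        return labels, centers, kept      # kept == False marks the dropped points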
I've also used such an algorithm in image segmentation.
I look for pairs of regions that can be merged, and for
any region that can be split. The two searches are
alternated, approaching a global optimum (one hopes).
It's quite different from the usual split-merge
algorithm.
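Roughly, the control loop looks like this. It is a sketch only:
can_merge, can_split, and the region bookkeeping stand in for the
actual segmentation tests, which are not reproduced here.

    def segment(regions, can_merge, do_merge, can_split, do_split):
        # Alternate a merge pass and a split pass until neither finds work.
        # can_merge(r1, r2) and can_split(r) are predicates supplied by the caller;
        # do_merge and do_split return the updated region list.
        # Note: this loop can cycle -- see the warning below.
        while True:
            changed = False
            # Merge pass: look for a pair of regions that may be merged.
            for r1, r2 in pairs(regions):
                if can_merge(r1, r2):
                    regions = do_merge(regions, r1, r2)
                    changed = True
                    break
            # Split pass: look for a single region that should be split.
            for r in regions:
                if can_split(r):
                    regions = do_split(regions, r)
                    changed = True
                    break
            if not changed:
                return regions

    def pairs(regions):
        # All unordered pairs; a real implementation would restrict to adjacent regions.
        for i in range(len(regions)):
            for j in range(i + 1, len(regions)):
                yield regions[i], regions[j]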
If you try Breiman's back fitting, watch out for cycles.
In my segmentation application, I ran into cycles containing
more than a hundred steps.
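One cheap guard is to remember a key for every configuration seen so
far and stop (or perturb the search) when one repeats. A sketch,
assuming the state can be reduced to something hashable:

    def run_until_quiescent_or_cycle(state, step, key=frozenset):
        # step(state) returns the next state (the same state when nothing changes).
        # key(state) must produce a hashable summary of the configuration.
        seen = {key(state)}
        while True:
            new_state = step(state)
            if key(new_state) == key(state):
                return new_state, "quiescent"
            if key(new_state) in seen:
                return new_state, "cycle detected"   # cycles can be long (100+ steps)
            seen.add(key(new_state))
            state = new_state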
-- Ken Laws