No subject

Bill Macready wgm at santafe.edu
Wed Aug 16 16:03:37 EDT 1995


In his recent posting, Kevin Lang continues a contest with John Koza
of the "my search algorithm beats your search algorithm according to
the following performance measure for the following (contrived)
fitness function" variety.

Interested readers of Connectionists should know that over the space
of all fitness functions, no matter what the performance measure, any
two search algorithms have exactly the same expected performance.
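
As a quick toy illustration of that claim (a sketch of ours, not code
from either paper announced below), the following Python fragment
enumerates every fitness function on a four-point search space, runs
two different non-revisiting search algorithms for two evaluations
each (one blind, one "informed"), and confirms that the total, over
all functions, of a performance measure (best value found) is
identical for the two.

from itertools import product

X = range(4)   # search points
Y = range(3)   # possible fitness values
M = 2          # number of evaluations allowed

def fixed_order_search(f, m):
    """Algorithm A: sample points 0, 1, 2, ... in a fixed, blind order."""
    return [f[x] for x in list(X)[:m]]

def adaptive_search(f, m):
    """Algorithm B: start at point 0, then pick each next unvisited
    point according to the last value seen (a crude 'informed' rule)."""
    visited, values = [0], [f[0]]
    while len(visited) < m:
        candidates = [x for x in X if x not in visited]
        # jump to the far end if the last value was large, else move locally
        nxt = candidates[-1] if values[-1] >= 1 else candidates[0]
        visited.append(nxt)
        values.append(f[nxt])
    return values

def performance(values):
    return min(values)   # performance measure: best (lowest) value found

total_a = total_b = 0
for f in product(Y, repeat=len(X)):   # all 3**4 = 81 fitness functions
    total_a += performance(fixed_order_search(f, M))
    total_b += performance(adaptive_search(f, M))

print(total_a, total_b)   # the two totals come out equal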

This is shown in the first paper announced below. That paper goes on
to discuss other, more interesting issues, like head-to-head minimax
distinctions between search algorithms, the information-theoretic and
geometric aspects of search, time-varying fitness functions, etc.

Interested readers should also know that there was a (very) lengthy
thread on this topic on the ga-list several months ago. In particular,
some articles following up on the paper mentioned below are announced
there, for example an article by Radcliffe and Surry, and an article
by Macready and Wolpert on intrinsically hard fitness functions (as
opposed to functions that are hard with respect to some particular
search algorithm).

***

We are also currently involved in research with a student to
exhaustively characterize the set of fitness functions for which
search algorithm A does much better than algorithm B, and how that set
differs from the set for which B outperforms A. In particular, we are
doing this for the case where the algorithms are hill-climbers and
GAs, as in Lang's debate with Koza.

In addition, search is (from a formal perspective) almost identical to
active learning, optimal experimental design, and (a less exact match)
control theory. We are also currently investigating what those other
fields have to offer for search.

Anyone interested in receiving the results of that work as it gets
written up can email us at wgm at santafe.edu or dhw at santafe.edu.

Bill Macready

***

Here are the original paper announcements:

ftp-file-name: nfl.ps

    No Free Lunch Theorems for Search

       D.H. Wolpert, W.G. Macready

We show that all algorithms that search for an extremum of a cost
function perform exactly the same, when averaged over all possible
cost functions. In particular, if algorithm A outperforms algorithm
B on some cost functions, then loosely speaking there must exist
exactly as many other functions where B outperforms A. Starting from
this we analyze a number of the other a priori characteristics of
the search problem, like its geometry and its information-theoretic
aspects.  This analysis allows us to derive mathematical benchmarks
for assessing a particular search algorithm's performance.  We also
investigate minimax aspects of the search problem, the validity of
using characteristics of a partial search over a cost function to
predict future behavior of the search algorithm on that cost
function, and time-varying cost functions. We conclude with some
discussion of the justifiability of biologically-inspired search
methods.
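
Roughly stated (the precise definitions, conditions, and notation are
in the paper): for any two search algorithms a_1 and a_2 that never
revisit points, any number m of distinct cost evaluations, and any
sample c of cost values,

	\sum_f P(c | f, m, a_1) = \sum_f P(c | f, m, a_2),

where the sum runs over all cost functions f. Since a performance
measure depends only on the sampled values c, its average over all f
is therefore the same for a_1 and a_2.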

-------------------------------------------------------------

ftp-file-name: hard.ps

   What Makes an Optimization Problem Hard?
  
        W.G. Macready, D.H. Wolpert

We address the question, ``Are some classes of combinatorial
optimization problems intrinsically harder than others, without regard
to the algorithm one uses, or can difficulty only be assessed relative
to particular algorithms?''  We provide a measure of the hardness of a
particular optimization problem for a particular optimization
algorithm.  We then present two algorithm-independent quantities that
use this measure to provide answers to our question. In the first of
these we average hardness over all possible algorithms for the
optimization problem at hand. We show that according to this quantity,
there is no distinction between optimization problems, and in this
sense no problems are intrinsically harder than others. For the second
quantity, rather than average over all algorithms we consider the
level of hardness of a problem (or class of problems) for the
algorithm that is optimal for that problem (or class of
problems). Here there are classes of problems that are intrinsically
harder than others.
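
A toy illustration of the first, algorithm-averaged quantity (again a
sketch of ours, not the measure defined in the paper): take the
"algorithms" to be the 4! fixed visiting orders of a four-point search
space, and the "hardness" of a function for an order to be the number
of evaluations until the global minimum is first sampled. Averaged
over all orders, every function with a unique global minimum comes out
equally hard, whatever its structure; distinguishing problems requires
something like the second, optimal-algorithm quantity.

from itertools import permutations

def hardness(f, order):
    """Evaluations needed before the global minimum of f is sampled."""
    target = min(f)
    for step, x in enumerate(order, start=1):
        if f[x] == target:
            return step

def average_hardness(f):
    orders = list(permutations(range(len(f))))
    return sum(hardness(f, o) for o in orders) / len(orders)

smooth = (0, 1, 2, 3)   # a "nice" monotone landscape
rugged = (2, 0, 3, 1)   # a "rugged" landscape built from the same values
print(average_hardness(smooth), average_hardness(rugged))   # both 2.5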

To obtain an electronic copy of these papers:

	ftp ftp.santafe.edu
	login: anonymous
	password: <your email address>
	cd /pub/wgm
	get <ftp-file-name>
	quit

Then at your system:

	lpr -P<printer-name> <ftp-file-name> 

If you have trouble getting any of these papers electronically, you
can request a hard copy from publications (wp at santafe.edu), Santa Fe
Institute, 1399 Hyde Park Road, Santa Fe, NM, USA, 87501.



