NFL and practice
zhuh
zhuh at helios.aston.ac.uk
Mon Dec 18 08:11:50 EST 1995
I accidentally sent my reply to Joerg Lemm instead of to the Connectionists
list. Since he replied to the list, I'll reply here as well, and
include my original posting at the end.
I quite agree with Joerg's observation about learning algorithms in
practice and the priors they use. The key question is
    Is it legitimate to be vague about the prior?
Or, put another way,
    Do you claim the algorithm can pick up whatever prior holds
    automatically, instead of having it specified beforehand?
My answer is NO to both questions, because for an algorithm to be good on
any prior is exactly the same as for an algorithm to be good without a
prior, as NFL tells us.
For purely cosmetic reasons, it might be helpful to translate the
useless "No Free Lunch theorem" :-)
    Without a particular prior specified, any algorithm is only as
    good as random guessing,
into the equivalent, but infinitely more useful, "You Have to Pay for
Lunch theorem" :-)
    For an algorithm to perform better than random guessing, a
    particular prior must be specified.
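The "pay for lunch" point can be checked directly on a toy problem (my own
illustration, not taken from the NFL papers): take all Boolean target
functions on a five-point domain, fix any deterministic learner, and average
its off-training-set error over all targets. Whatever the learner does, the
average comes out to exactly 0.5, i.e. random guessing.

```python
from itertools import product

X = list(range(5))          # a five-point input domain
train_idx = [0, 1, 2]       # points the learner sees
test_idx = [3, 4]           # off-training-set points

def majority(train_labels):
    # predict the majority training label at every off-sample point
    guess = int(sum(train_labels) * 2 > len(train_labels))
    return [guess] * len(test_idx)

def always_zero(train_labels):
    # a deliberately dumb learner: always predict 0
    return [0] * len(test_idx)

def mean_ots_error(algorithm):
    # average off-training-set error over all 2^5 = 32 target functions
    total = 0.0
    for f in product([0, 1], repeat=len(X)):
        preds = algorithm([f[i] for i in train_idx])
        errs = [p != f[i] for p, i in zip(preds, test_idx)]
        total += sum(errs) / len(errs)
    return total / 2 ** len(X)

print(mean_ots_error(majority))     # -> 0.5
print(mean_ots_error(always_zero))  # -> 0.5
```

Both learners, clever or dumb, average to 0.5: with a uniform "prior" over
targets, nothing the training data says constrains the off-sample labels.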
On a more practical level,
> E.g. in nearly all cases functions are somewhat smooth.
Do you specify the scale on which it is smooth?
> This is a prior which exists in reality (for example because
> of input noise in the measuring process).
If you average smoothness over all scales in a certain uniform way, you get
a prior which contains no smoothness at all. If you average them in a
non-uniform way, you actually specify a non-uniform prior, which is the
crucial piece of information for any algorithm to work at all.
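To see why "somewhat smooth" without a scale pins nothing down, here is a
small sketch (my own toy example, using a Gaussian-kernel Nadaraya-Watson
smoother as the stand-in smoothness assumption): the same data, under the
same "smooth function" assumption but at two different length scales, yields
very different predictions at the same test point.

```python
import math

xs = [0.0, 1.0, 2.0, 3.0]   # training inputs
ys = [0.0, 1.0, 0.0, 1.0]   # training targets

def nw_predict(x, bandwidth):
    # Nadaraya-Watson smoother: Gaussian-kernel weighted average of targets.
    # The bandwidth is the length scale on which we assume smoothness.
    w = [math.exp(-((x - xi) / bandwidth) ** 2 / 2) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# "Smooth on a scale of 0.1": prediction at x=0.4 hugs the nearest point,
# so it is close to y(0)=0.
print(nw_predict(0.4, 0.1))

# "Smooth on a scale of 10": all points count equally, so the prediction
# is close to the global mean 0.5.
print(nw_predict(0.4, 10.0))
```

Both runs assume "smoothness", yet the answers disagree badly; the scale is
the part of the prior that actually does the work.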
> And the situation would be hopeless
> if we would not use this fact in practice.
It would still be hopeless if we only used the fact of "somewhat smooth",
instead of specifying how smooth. See the following for theory and examples:
Zhu, H. and Rohwer, R.:
Bayesian regression filters and the issue of priors, 1995. To appear in
Neural Computing and Applications.
ftp://cs.aston.ac.uk/neural/zhuh/reg_fil_prior.ps.Z
My original posting is included below:
----- Begin Included Message -----