
Juergen Schmidhuber juergen at idsia.ch
Mon May 20 03:10:26 EDT 1996


Richard Long writes:

  There may be another reason for the brain to construct
  networks that are 'minimal', having to do with Chaitin and
  Kolmogorov computational complexity.  If a minimal network corresponds
  to a 'minimal algorithm' for implementing a particular computation, then
  that particular network must utilize all of the symmetries and
  regularities contained in the problem, or else these symmetries could be
  used to reduce the network further.  Chaitin has shown that no algorithm
  for finding this minimal algorithm in the general case is possible.
  However, if an evolutionary programming method is used in which the
  fitness function is both 'solves the problem' and 'smallest size' (i.e.
  Occam's razor), then it is possible that the symmetries and regularities
  in the problem would be extracted as smaller and smaller networks are
  found.  I would argue that such networks would compute the solution less
  by rote or brute force, and more from a deep understanding of the problem.
  I would like to hear anyone else's thoughts on this.
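(To make this concrete, here is a toy sketch of such a dual fitness; the
bit-string encoding, the trivial interpreter, and the weight on size are
hypothetical choices for illustration only, not taken from any particular
method:)

# Toy sketch (hypothetical encoding and weight): a candidate scores higher
# the more test cases it solves and the shorter its description is.

def occam_fitness(candidate_bits, test_cases, evaluate, size_weight=0.01):
    """candidate_bits : bit string encoding a network or program (hypothetical)
    test_cases     : list of (input, target) pairs
    evaluate       : maps (candidate_bits, input) -> output
    Returns a score combining 'solves the problem' and 'smallest size'."""
    solved = sum(evaluate(candidate_bits, x) == y for x, y in test_cases)
    return solved / len(test_cases) - size_weight * len(candidate_bits)

# Example with a trivial interpreter (the candidate just encodes one output bit):
tests = [("x1", "1"), ("x2", "1")]
print(occam_fitness("1", tests, lambda bits, x: bits[0]))    # 0.99
print(occam_fitness("100", tests, lambda bits, x: bits[0]))  # 0.97: same accuracy, longer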

Some comments:
Apparently, Kolmogorov was the first to show the impossibility of finding
the minimal algorithm in the general case (but Solomonoff also mentions
it in his early work). The reason is the halting problem, of course - you
don't know the runtime of the minimal algorithm. For all practical 
applications, runtime has to be taken into account. Interestingly, there
is an ``optimal'' way of doing this, namely Levin's universal search algorithm,
which tests solution candidates in order of their Levin complexities (roughly,
program length plus the logarithm of runtime):

L. A. Levin. Universal sequential search problems. Problems of Information
Transmission, 9(3):265-266, 1973.
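
A toy sketch of the idea (the bit-string program language and the phase
budget below are hypothetical simplifications, not the exact procedure of
the papers cited here): in phase i, every program p of length l(p) <= i is
run for about 2^(i - l(p)) steps, so candidates are effectively tried in
order of length plus log runtime.

from itertools import product

def run_toy_program(program, max_steps):
    # Hypothetical toy interpreter: the program is a bit string that is
    # copied to the output, one bit per step.  Returns the output if the
    # program "halts" within max_steps steps, else None.
    return program if len(program) <= max_steps else None

def levin_search(is_solution, max_phase=20):
    # In phase i, a program of length l gets about 2**(i - l) steps, which
    # amounts to trying candidates in order of their Levin complexities.
    for phase in range(1, max_phase + 1):
        for length in range(1, phase + 1):
            steps = 2 ** (phase - length)
            for bits in product("01", repeat=length):
                program = "".join(bits)
                output = run_toy_program(program, max_steps=steps)
                if output is not None and is_solution(output):
                    return program       # first solver found under this ordering
    return None                          # nothing found within the budget

# Example: find a program whose output is the bit string "1011".
print(levin_search(lambda out: out == "1011"))   # -> "1011", found in phase 6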

For finding Occam's razor neural networks with minimal Levin complexity, see 
J. Schmidhuber. Discovering solutions with low Kolmogorov complexity
and high generalization capability. In A. Prieditis and S. Russell, editors,
Machine Learning: Proceedings of the 12th International Conference, pages 488-496.
Morgan Kaufmann Publishers, San Francisco, CA, 1995.

For Occam's razor solutions of non-Markovian reinforcement learning tasks, see 
M. Wiering and J. Schmidhuber. Solving POMDPs using Levin search and EIRA.
In Machine Learning: Proceedings of the 13th International Conference.
Morgan Kaufmann Publishers, San Francisco, CA, 1996. To appear.

---

Juergen Schmidhuber, IDSIA
http://www.idsia.ch/~juergen

