Compressibility and Generalization

Juergen Schmidhuber juergen at idsia.ch
Tue Dec 5 12:50:01 EST 1995


Shahab Mohaghegh requested a definition of 
``compressibility of the history of a universe''. 

Let S(t) denote the state of a computable universe
at discrete time step t, and suppose S(t) can be
described by n bits. The literal history of the
universe between time step 1 (the big bang) and time
step t then occupies tn bits. The history is
compressible if it can be computed by an algorithm
whose size is substantially less than tn bits.
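To make this concrete, here is a toy sketch in Python
(everything in it -- the update rule, the parameter
values, and the use of zlib as a stand-in compressor --
is an illustrative assumption of mine, not part of the
definition). The whole tn-bit history is generated by a
program of a few lines, so it is compressible; zlib
merely gives a crude, checkable upper bound on its
algorithmic complexity:

    import zlib

    # Toy universe (illustrative assumption): an n-bit
    # state S(t), updated by a fixed, simple rule --
    # here a counter mod 2^n.
    n, t = 64, 1000
    state = 1                          # S(1), the big bang
    history = bytearray()
    for _ in range(t):
        history += state.to_bytes(n // 8, 'big')
        state = (state + 1) % (1 << n) # fixed update rule

    raw_bits = t * n                   # literal history size
    compressed_bits = 8 * len(zlib.compress(bytes(history)))
    print(raw_bits, compressed_bits)   # compressed << raw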

Given a particular computing device, most histories
are incompressible: there are 2^tn possible histories,
but fewer than (1/2)^c * 2^tn algorithms with fewer
than tn-c bits (where c is a positive constant). For
most possible universes, the mutual algorithmic
information between past and future is therefore zero,
and previous experience won't help an observer
generalize well in the future.
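The counting itself is elementary; the following lines
(the values of t, n, c are arbitrary examples of mine)
just spell it out:

    # Count bit strings shorter than tn-c bits and
    # compare with the number of possible tn-bit histories.
    t, n, c = 100, 8, 10               # arbitrary example values
    histories = 2 ** (t * n)
    # bit strings of length 0, 1, ..., tn-c-1:
    short_programs = 2 ** (t * n - c) - 1
    print(short_programs / histories)  # fraction < (1/2)**c ~ 0.001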

There are comparatively few compressible or ``regular''
universes, however. To use ML terminology, some of them allow for
``generalization by analogy''. Some of them allow for 
``generalization by chunking''. Some of them allow for
``generalization by exploiting invariants''. Etc. It
would be nice to have a method that can generalize well
in *arbitrary* regular universes.

Juergen Schmidhuber
IDSIA



