

When this book was conceived ten years ago,
few scientists realized the wide scope and broad applicability
of the central ideas. Partly because of
the enthusiastic reception of the first edition,
open problems have been solved and new applications have been
developed. We have added new material on the relation between
data compression and  minimum description length induction,
computational learning, and universal prediction; circuit theory; distributed
algorithmics; instance complexity; CD complexity;
computational complexity; Kolmogorov random graphs;
shortest encoding of routing tables in communication networks;
resource-bounded computable universal distributions; average-case properties;
the equality of statistical entropy and expected Kolmogorov complexity;
and so on. Apart from being used by researchers and
as a reference work, the book is now commonly used for graduate courses
and seminars. In recognition of this fact, the second
edition has been produced in textbook style. We have
preserved as much as possible the ordering of
the material as it was in the first edition.
The many exercises bunched together at the ends of
some chapters have been moved to the appropriate sections.
The comprehensive bibliography on Kolmogorov complexity
at the end of the book has been updated, as have
the ``History and References'' sections of the chapters.
Many readers were kind enough to express their appreciation
for the first edition and to send notice of typos and errors,
along with comments. They are too numerous to thank individually,
so we thank them all collectively.


BLURB:

Written by two experts in the field, this is the only
comprehensive and unified treatment of the
central ideas of Kolmogorov complexity---the theory dealing with the
quantity of information in individual objects---and their applications.
Kolmogorov complexity is known variously as `algorithmic
information', `algorithmic entropy', `Kolmogorov-Chaitin
complexity', `descriptional complexity', `shortest program length',
and `algorithmic randomness', among other names.
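
As a rough, informal sketch (a standard textbook definition, not quoted from
this announcement): the Kolmogorov complexity of a finite binary string x,
relative to a fixed universal Turing machine U, is the length of a shortest
program that makes U output x,

    C_U(x) = \min\{\, \ell(p) : U(p) = x \,\}.

By the invariance theorem (Section 2.1 of the book), replacing U by another
universal machine changes C_U(x) by at most an additive constant, so the
quantity is machine-independent up to O(1).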

The book is ideal for advanced undergraduate students, graduate students,
and researchers in computer science, mathematics, cognitive sciences,
artificial intelligence, philosophy, statistics, and physics.
The book is self-contained in the sense that it covers the required background
in computability theory, probability theory, information theory, and coding.
Also included are numerous problem sets, comments, source references, and hints
to the solutions of problems, course outlines for classroom use, and a
great deal of new material not included in the first edition.

CONTENTS:

   Preface to the First Edition  v 
   How to Use This Book  viii 
   Acknowledgments  x 
   Preface to the Second Edition  xii 
   Outlines of One-Semester Courses  xii 
   List of Figures  xix 

   1 Preliminaries  1 
   1.1  A Brief Introduction   1 
   1.2  Prerequisites and Notation   6 
   1.3  Numbers and Combinatorics   8 
   1.4  Binary Strings   12 
   1.5  Asymptotic Notation   15 
   1.6  Basics of Probability Theory   18 
   1.7  Basics of Computability Theory   24 
   1.8  The Roots of Kolmogorov Complexity   47 
   1.9  Randomness   49 
   1.10  Prediction and Probability   59 
   1.11  Information Theory and Coding   65 
   1.12  State × Symbol Complexity   84 
   1.13  History and References   86 

   2 Algorithmic Complexity  93 
   2.1  The Invariance Theorem   96 
   2.2  Incompressibility   108 
   2.3  C as an Integer Function   119 
   2.4  Random Finite Sequences   127 
   2.5  *Random Infinite Sequences   136 
   2.6  Statistical Properties of Finite Sequences   158 
   2.7  Algorithmic Properties of C   167 
   2.8  Algorithmic Information Theory   179 
   2.9  History and References   185 

   3 Algorithmic Prefix Complexity  189 
   3.1  The Invariance Theorem   192 
   3.2  *Sizes of the Constants   197 
   3.3  Incompressibility   202 
   3.4  K as an Integer Function   206 
   3.5  Random Finite Sequences   208 
   3.6  *Random Infinite Sequences   211 
   3.7  Algorithmic Properties of K   224 
   3.8  *Complexity of Complexity   226 
   3.9  *Symmetry of Algorithmic Information   229 
   3.10  History and References   237 

   4 Algorithmic Probability  239 
   4.1  Enumerable Functions Revisited   240 
   4.2  Nonclassical Notation of Measures   242 
   4.3  Discrete Sample Space   245 
   4.4  Universal Average-Case Complexity   268 
   4.5  Continuous Sample Space   272 
   4.6  Universal Average-Case Complexity, Continued   307 
   4.7  History and References   307 

   5 Inductive Reasoning  315 
   5.1  Introduction   315 
   5.2  Solomonoff's Theory of Prediction   324 
   5.3  Universal Recursion Induction   335 
   5.4  Simple PAC-Learning   339 
   5.5  Hypothesis Identification by Minimum Description Length   351 
   5.6  History and References   372 

   6 The Incompressibility Method  379 
   6.1  Three Examples   380 
   6.2  High-Probability Properties   385 
   6.3  Combinatorics   389 
   6.4  Kolmogorov Random Graphs   396 
   6.5  Compact Routing   404 
   6.6  Average-Case Complexity of Heapsort   412 
   6.7  Longest Common Subsequence   417 
   6.8  Formal Language Theory   420 
   6.9  Online CFL Recognition   427 
   6.10  Turing Machine Time Complexity   432 
   6.11  Parallel Computation   445 
   6.12  Switching Lemma   449 
   6.13  History and References   452 

   7 Resource-Bounded Complexity  459 
   7.1  Mathematical Theory   460 
   7.2  Language Compression   476 
   7.3  Computational Complexity   488 
   7.4  Instance Complexity   495 
   7.5  Kt Complexity and Universal Optimal Search   502 
   7.6  Time-Limited Universal Distributions   506 
   7.7  Logical Depth   510 
   7.8  History and References   516 

   8 Physics, Information, and Computation  521 
   8.1  Algorithmic Complexity and Shannon's Entropy   522 
   8.2  Reversible Computation   528 
   8.3  Information Distance   537 
   8.4  Thermodynamics   554 
   8.5  Entropy Revisited   565 
   8.6  Compression in Nature   583 
   8.7  History and References   586 

   References  591 
   Index  618 

If you are seriously interested in using the text in a course,
contact Springer-Verlag's Editor for Computer Science, Martin
Gilchrist, for a complimentary copy.

     Martin Gilchrist                   marting@springer-sc.com
     Suite 200, 3600 Pruneridge Ave.    (408) 249-9314
     Santa Clara, CA 95051

If you are interested in the text but won't be teaching a course,
we understand that Springer-Verlag sells the book as well.
To order, call toll-free 1-800-SPRINGER (1-800-777-4643), 8:30 am to
5:30 pm ET; N.J. residents call 201-348-4033.
For U.S.A./Canada/Mexico, e-mail orders@springer-ny.com or fax an
order form to 201-348-4505. For orders outside U.S.A./Canada/Mexico,
e-mail orders@springer.de.
For information regarding examination copies for course adoptions,
write to Springer-Verlag New York, Inc., 175 Fifth Avenue, New York, NY 10010.
You can also order through the Web site: "http://www.springer-ny.com/"

Visit your local scientific bookstore. Mail payments may be made by check, 
purchase order, or credit card. Prices are payable in U.S. 
currency or its equivalent and are subject to change without notice. Remember, 
your 30-day return privilege is always guaranteed!

Your complete address is necessary to fulfill your order.





