Information Theory, Probability and Neural Networks

David J.C. MacKay mackay at mrao.cam.ac.uk
Wed Apr 9 16:46:00 EDT 1997


The following *draft* book is available by anonymous ftp and on the web.

Feedback from the information theory and neural networks communities
would be warmly welcomed.

========================================================================
    "Information Theory, Probability and Neural Networks"
	
               by David J.C. MacKay
-------------------------------------------------------------------------

     An undergraduate / graduate textbook. 
     This book will feature:

		* lots of figures and demonstrations.
		* more than one hundred exercises with worked solutions.
		* up-to-date exposition of:
		   .  source coding 
			- including arithmetic coding, `bits back' coding
		   .  channel coding 
			- including Gallager codes, turbo codes
		   .  neural networks
			- including Gaussian processes
		   .  Monte Carlo methods
			- including Hybrid Monte Carlo, Overrelaxation

     The current draft (April 9th 1997) is Draft 1.2.3 (308 pages),
     estimated to be about 70% complete.

=================== COMPLETED CHAPTERS ===============================

     1.  Introduction to Information Theory

--------- Data Compression -------------------------------------------

     2.  The Source Coding Theorem
     3.  Data Compression II: Symbol Codes
     4.  Data Compression III: Stream Codes

--------- Noisy Channel Coding ---------------------------------------

     5.  Communication over a Noisy Channel
     6.  The Noisy-Channel Coding Theorem
     7.  Error-Correcting Codes & Real Channels

--------- Probabilities ----------------------------------------------

     8.  Bayesian Inference
     9.  Ising Models
     10. Variational Methods
     11. Monte Carlo Methods

--------- Neural Networks --------------------------------------------

     12. Introduction to Neural Networks
     13. The Single Neuron as a Classifier
     14. Capacity of a Single Neuron
     15. Learning as Inference
     16. The Hopfield Network
     17. From Hopfield Networks to Boltzmann Machines
     18. Supervised Learning in Multilayer Networks

==================== INCOMPLETE CHAPTERS ==============================

------- Unsupervised Learning -----------------------------------------

      Clustering
      Independent Component Analysis
      Helmholtz Machines
      A Single Neuron as an Unsupervised Learning Element

------- Probability, Data Modelling and Supervised Neural Networks ----

      Laplace's Method
      Graphical Models and Belief Propagation
      Complexity Control and Model Comparison
      Gaussian Processes

------- Unifying Chapters ---------------------------------------------

      Hash Codes: Codes for Efficient Information Retrieval
      `Bits Back' Source Coding
      Low-Density Parity-Check Codes
      Turbo Codes

==================== DOWNLOADING INSTRUCTIONS ==========================
  The book (1.1 Mbytes) can be downloaded from this web page in Cambridge, England:
     http://wol.ra.phy.cam.ac.uk/mackay/itprnn/#book
  or from this MIRROR in Toronto, Canada:
     http://www.cs.toronto.edu/~mackay/itprnn/#book

If you prefer to use ftp,
     ftp wol.ra.phy.cam.ac.uk (131.111.48.24)
     anonymous                  (login name)
     your email address         (password)
     cd pub/mackay/itprnn
     binary
     get book.ps2.gz (tree-saving 2-up version: two pages per sheet)
OR   get book.ps.gz  (ordinary version)
     quit
     gunzip book.*
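
If you would rather script the download, here is a minimal Python sketch
using the standard ftplib and gzip modules. The host, directory and
filenames are taken from the instructions above; the email address used
as the anonymous-ftp password is only a placeholder.

     import ftplib
     import gzip
     import shutil

     # connect to the Cambridge ftp server and log in anonymously;
     # by convention the password is your email address (placeholder here)
     ftp = ftplib.FTP("wol.ra.phy.cam.ac.uk")
     ftp.login("anonymous", "your.name@example.com")
     ftp.cwd("pub/mackay/itprnn")

     # fetch the ordinary version in binary mode
     with open("book.ps.gz", "wb") as f:
         ftp.retrbinary("RETR book.ps.gz", f.write)
     ftp.quit()

     # decompress (the equivalent of `gunzip book.ps.gz')
     with gzip.open("book.ps.gz", "rb") as src, open("book.ps", "wb") as dst:
         shutil.copyfileobj(src, dst)

Swap book.ps.gz for book.ps2.gz in both places to fetch the tree-saving
two-pages-per-sheet version instead.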

==========================================================================
David J.C. MacKay        email: mackay at mrao.cam.ac.uk                     
                           www: http://wol.ra.phy.cam.ac.uk/mackay/
Cavendish Laboratory,      tel: (01223) 339852 fax: 354599  home: 276411
Madingley Road,                 international code: +44 1223
Cambridge CB3 0HE. U.K.   room: 982 Rutherford Building

