Thesis in neuroprose

Petri Myllymaki Petri.Myllymaki at cs.Helsinki.FI
Tue Feb 15 04:52:42 EST 1994


FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/Thesis/myllymaki.thesis.ps.Z

The following report has been placed in the neuroprose archive.

-----------------------------------------------------------------------
Bayesian Reasoning by Stochastic Neural Networks

Petri Myllymaki

Ph.Lic. Thesis
Department of Computer Science, University of Helsinki
Report C-1993-67, Helsinki, December 1993
78 pages

This work has been motivated by problems in several research areas:
expert system design, uncertain reasoning, optimization theory, and
neural network research. From the expert system design point of view,
our goal was to develop a generic expert system shell capable of
handling uncertain data. The theoretical framework used here for
handling uncertainty is probabilistic reasoning, in particular the
theory of Bayesian belief network representations. The probabilistic
reasoning task we are interested in is, given a Bayesian network
representation of a probability distribution on a set of discrete
random variables, to find a globally maximal probability state
consistent with given initial constraints. To solve this NP-hard
problem approximately, we use an iterative stochastic method, Gibbs
sampling.  As this method can be quite inefficient when implemented on
a conventional sequential computer, we show how to construct a Gibbs
sampling process for a given Bayesian network on a massively parallel
architecture, a harmony neural network, which is a special case of the
Boltzmann machine architecture.
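
As a rough illustration (a sketch in Python, not code from the thesis;
the tiny network, probabilities, and cooling schedule are invented for
the example), a sequential Gibbs sampler with simulated annealing for
this kind of maximum-probability search might look as follows:

  # Minimal sketch: annealed Gibbs sampling for MAP search on a tiny
  # Bayesian network A -> B -> C with binary variables.  All names and
  # probability values below are hypothetical.
  import random

  P_A = {1: 0.3, 0: 0.7}                     # P(A)
  P_B_given_A = {1: 0.9, 0: 0.2}             # P(B=1 | A)
  P_C_given_B = {1: 0.8, 0: 0.1}             # P(C=1 | B)

  def joint_prob(state):
      """Joint probability P(A, B, C) of a full assignment."""
      a, b, c = state["A"], state["B"], state["C"]
      pa = P_A[a]
      pb = P_B_given_A[a] if b == 1 else 1.0 - P_B_given_A[a]
      pc = P_C_given_B[b] if c == 1 else 1.0 - P_C_given_B[b]
      return pa * pb * pc

  def gibbs_map(evidence, sweeps=2000, t0=2.0):
      """Resample each unclamped variable from its conditional
      distribution (tempered by a falling temperature T) and keep the
      most probable state seen."""
      state = {v: random.randint(0, 1) for v in ("A", "B", "C")}
      state.update(evidence)                 # clamp the initial constraints
      best, best_p = dict(state), joint_prob(state)
      for sweep in range(sweeps):
          T = max(0.05, t0 * (1.0 - sweep / sweeps))   # cooling schedule
          for v in ("A", "B", "C"):
              if v in evidence:
                  continue
              weights = []
              for val in (0, 1):
                  state[v] = val
                  weights.append(joint_prob(state) ** (1.0 / T))
              state[v] = random.choices((0, 1), weights=weights)[0]
          p = joint_prob(state)
          if p > best_p:
              best, best_p = dict(state), p
      return best, best_p

  # Most probable completion of (A, B) given the constraint C = 1:
  print(gibbs_map({"C": 1}))

The thesis maps this kind of sampling process onto a massively parallel
harmony network rather than the sequential sweep shown here.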

To test the method empirically, we implemented a hybrid
neural-symbolic expert system shell, NEULA. The symbolic part of the
system consists of a high-level conceptual description language and a
compiler, which can be used for constructing Bayesian networks and
providing them with the corresponding parameters (conditional
probabilities).  As the number of parameters needed for a given network
may generally be quite large, we restrict ourselves to Bayesian
networks having a special hierarchical structure.  The neural part of
the system consists of a neural network simulator which performs
massively parallel Gibbs sampling.  The performance of the NEULA system
was tested empirically on a small artificial example.
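
As a back-of-the-envelope illustration of the parameter-count issue
(again a sketch, not taken from the thesis; the function and numbers
are invented), the size of a node's conditional probability table grows
exponentially with its number of parents, which is what the hierarchical
restriction keeps in check:

  from math import prod

  def cpt_parameters(node_cardinality, parent_cardinalities):
      """Free parameters in one node's conditional probability table:
      (d - 1) independent probabilities per parent configuration."""
      return (node_cardinality - 1) * prod(parent_cardinalities)

  print(cpt_parameters(2, [2] * 10))   # binary node, 10 binary parents -> 1024
  print(cpt_parameters(2, [2, 2]))     # at most two parents -> 4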

Computing Reviews (1991) Categories and Subject Descriptors:
G.3     [Probability and statistics]: Probabilistic algorithms
F.1.1   [Models of computation]: Neural networks
G.1.6   [Optimization]: Constrained optimization
I.2.5   [Programming languages and software]: Expert system tools and
techniques

General Terms: Algorithms, Theory.

Additional Key Words and Phrases:
Monte Carlo algorithms, Gibbs sampling, simulated annealing,
Bayesian belief networks, connectionism, massive parallelism

-----------------------------------------------------------------------
To obtain a copy:

  ftp archive.cis.ohio-state.edu
  login: anonymous
  password: <your email address>
  cd pub/neuroprose/Thesis
  binary
  get myllymaki.thesis.ps.Z
  quit

Then at your system:

  uncompress myllymaki.thesis.ps.Z
  lpr myllymaki.thesis.ps

-----------------------------------------------------------------------
Petri Myllymaki                          Petri.Myllymaki at cs.Helsinki.FI
Department of Computer Science           Int.+358 0 708 4212 (tel.)
P.O.Box 26 (Teollisuuskatu 23)           Int.+358 0 708 4441 (fax)
FIN-00014 University of Helsinki, Finland
-----------------------------------------------------------------------

