4 reports available

Pierre Bessiere bessiere at imag.fr
Thu Oct 10 12:48:37 EDT 1991


The following four papers/reports have been placed in the neuroprose
archive:

- Bessiere, P.; "Toward a synthetic cognitive paradigm: Probabilistic 
  Inference"; Conference COGNITIVA90, Madrid, Spain, 1990

      Neuroprose file name:  bessiere.cognitiva90.ps.Z

- Talbi, E-G. & Bessiere, P.; "A parallel genetic algorithm for the graph 
  partitioning problem"; ACM-ICS91 (Conference on Super Computing), Cologne,
  Germany, 1991

      Neuroprose file name:  bessiere.acm-ics91.ps.Z

- Bessiere, P., Chams, A. & Muntean, T.; "A virtual machine model for 
  artificial neural network programming"; INNC90 (International Neural
  Networks Conference), Paris, France, 1990

      Neuroprose file name:  bessiere.innc90.ps.Z

- Bessiere, P., Chams, A. & Chol, P.; "MENTAL: a virtual machine approach to
  artificial neural networks programming"; ESPRIT B.R.A. project NERVES (3049),
  final report, 1991

The abstracts of the papers and FTP instructions follow:

                                ----------

        TOWARD A SYNTHETIC COGNITIVE PARADIGM: PROBABILISTIC INFERENCE

Cognitive science is a very active field of scientific interest. 
It turns out to be a "melting pot" of ideas coming from very 
different areas. One of the principal hopes is that some synthetic 
cognitive paradigms will emerge from this interdisciplinary 
"brain storming". The goal of this paper is to answer the question:
"Given the state of the art, are there any hints indicating the 
emergence of such synthetic paradigms?" The main thesis of 
the paper is that there is a good candidate, namely, the probabilistic 
inference paradigm.

In support of the above thesis the structure of the paper is as follows:
	- in the first part, we identify five criteria that a synthetic 
cognitive paradigm must meet (validity, self-consistency, 
competence, feasibility and mimetic power);
	- in the second part, the principles of probabilistic 
inference are reviewed and the validity and self-consistency 
of this paradigm are justified (Marr's computational level);
	- then, the competence criterion is discussed, considering 
the efficiency of probabilistic inference for dealing with the different 
classical cognitive riddles and analyzing the relationships of 
probabilistic inference with several of the usual connectionist 
formalisms (Marr's algorithmic level);
	- the criteria of feasibility (condition of computer implementation) 
and mimetic power (adequacy to what is known of the architecture 
of the nervous system) are finally considered in the fourth 
part (Marr's implementation level).

In conclusion, probabilistic inference appears to be at 
least a very interesting framework for getting a synthetic overview of a 
number of works in the area and for identifying and formalizing the most 
puzzling questions. Some of these questions are listed. 
Finally, probabilistic inference appears to be
able to play the same role for computational cognitive science 
that formal logic has played for classical symbolic Artificial 
Intelligence: a sound mathematical foundation serving as a guideline, 
as a constant reference and as a source of inspiration.
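To make the paradigm concrete (this sketch is mine, not taken from the paper),
the elementary operation of probabilistic inference is Bayes' rule, which
inverts a causal model to infer a hidden cause from an observation:

```python
# Minimal Bayesian inference sketch (illustrative only; not from the paper).
# A hidden binary cause C produces a binary observation O. Given the prior
# P(C=1) and the likelihoods P(O=1|C), Bayes' rule yields the posterior.

def posterior(prior_c, p_o_given_c, p_o_given_not_c):
    """Return P(C=1 | O=1) via Bayes' rule."""
    evidence = prior_c * p_o_given_c + (1 - prior_c) * p_o_given_not_c
    return prior_c * p_o_given_c / evidence

# Example: rare cause (10% prior), strong sensor (90% hit, 20% false alarm).
print(round(posterior(0.1, 0.9, 0.2), 3))  # 0.09 / (0.09 + 0.18) = 0.333
```

Even this toy case shows the point of the paradigm: the observation raises the
belief in the cause from 0.1 to 1/3, by a calculation that is mechanical once
the model is written down.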

                                ----------

     A PARALLEL GENETIC ALGORITHM FOR THE GRAPH PARTITIONING PROBLEM

Genetic algorithms are stochastic search and optimization techniques 
which can be used for a wide range of applications. This paper 
addresses the application of genetic algorithms to the graph 
partitioning problem. Standard genetic algorithms with large 
populations suffer from lack of efficiency (quite high execution time). 
A massively parallel genetic algorithm is proposed; an implementation on 
a SuperNode of Transputers and results of various benchmarks are given. 

The parallel algorithm shows a superlinear speed-up, in the sense 
that when the number of processors is multiplied by p, the time 
spent to reach a solution with a given score is divided by kp (k > 1). 

A comparative analysis of our approach with hill-climbing algorithms 
and simulated annealing is also presented. Experimental 
measurements show that our algorithm gives better results for both 
the quality of the solution and the time needed to reach it.
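For readers unfamiliar with the approach, a serial genetic algorithm for
2-way graph partitioning can be sketched as below (this is a generic
illustration; the paper's massively parallel variant and its operators differ):

```python
import random

# Toy genetic algorithm for 2-way graph partitioning (illustrative sketch).
# A chromosome assigns each vertex to part 0 or 1; the score penalizes
# cut edges and unbalanced parts (lower is better).

def fitness(chrom, edges):
    cut = sum(1 for u, v in edges if chrom[u] != chrom[v])
    imbalance = abs(sum(chrom) - len(chrom) // 2)
    return cut + imbalance

def evolve(n, edges, pop_size=20, generations=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, edges))
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)           # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1        # point mutation
            children.append(child)
        pop = parents + children                # elitist replacement
    return min(pop, key=lambda c: fitness(c, edges))

# Two triangles joined by one bridge edge: the best cut separates them,
# giving fitness 1 (one cut edge, perfectly balanced parts).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
best = evolve(6, edges)
print(fitness(best, edges))
```

The parallel version described in the paper distributes such a population
over many processors, which is where the reported superlinear speed-up arises.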

                                ----------

    A VIRTUAL MACHINE MODEL FOR ARTIFICIAL NEURAL NETWORK PROGRAMMING

This paper introduces the model of a virtual machine 
for A.N.N. (Artificial Neural Networks).

The context of this work is a collaborative project to study new 
V.L.S.I. implementations and new architectures for neuronal machines. The 
work consists of the specification and a prototype implementation of 
a description language for A.N.N., of the associated virtual 
machine, of the compiler between them, and of the compilers mapping 
the virtual machine onto different highly parallel computers.

In this short paper we present the virtual machine model which 
combines the features of various parallel programming paradigms. 
Our model allows, in particular, the same A.N.N. program to 
run on both synchronous and asynchronous machines. In 
this framework a parallel architecture (S.M.A.R.T.) and a dynamically 
reconfigurable parallel machine of Transputers (SuperNode) are 
considered as target machines.
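The point about synchronous versus asynchronous targets can be illustrated in
spirit (the names below are invented here, not the MENTAL interface): one
network description, two interchangeable update disciplines.

```python
import random

# Hypothetical sketch, not the paper's model: a binary threshold network
# described once, executed under two update disciplines.

def step_synchronous(state, weights):
    """All units compute from the previous state, then update together."""
    return [int(sum(w * s for w, s in zip(row, state)) > 0) for row in weights]

def step_asynchronous(state, weights, rng):
    """Units update one at a time, in random order, seeing fresh values."""
    state = list(state)
    for i in rng.sample(range(len(state)), len(state)):
        state[i] = int(sum(w * s for w, s in zip(weights[i], state)) > 0)
    return state

# The same 2-unit network runs unchanged under either discipline.
weights = [[0, 1], [1, 0]]
print(step_synchronous([1, 0], weights))   # -> [0, 1]
print(step_asynchronous([1, 0], weights, random.Random(0)))
```

Note that the discipline matters: under synchronous update this little network
oscillates between [1, 0] and [0, 1], while asynchronous update settles to a
fixed point ([0, 0] or [1, 1], depending on the visit order) — which is why a
virtual machine hiding this choice from the programmer is non-trivial.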

                                ----------

  MENTAL: A VIRTUAL MACHINE APPROACH TO ARTIFICIAL NEURAL NETWORKS PROGRAMMING
                          (ATTENTION: 100 pages)

This report treats (extensively) the same subject as the short paper
described just above. Some parts are extracted from the three previously
presented papers.

                                ----------

These reports may be retrieved by anonymous FTP either from the Neuroprose
archive or from my own server (IMAG):

How to get files from the Neuroprose archives?
______________________________________________

Anonymous ftp on:
	- archive.cis.ohio-state.edu (128.146.8.52)

mymachine>ftp archive.cis.ohio-state.edu
Name: anonymous
Password: yourname@youraddress
ftp>cd pub/neuroprose
ftp>binary
ftp>get bessiere.foo.ps.Z
ftp>quit
mymachine>uncompress bessiere.foo.ps.Z

How to get files from IMAG?
___________________________

Anonymous ftp on:
	- 129.88.32.1

mymachine>ftp 129.88.32.1
Name: anonymous
Password: yourname@youraddress
ftp>cd pub/SYMPA/NNandGA
ftp>binary
ftp>get bessiere.foo.ps.Z
ftp>quit
mymachine>uncompress bessiere.foo.ps.Z



-- 

Pierre BESSIERE
***************

IMAG/LGI                                  phone:
BP 53X                                    Work: 33/76.51.45.72
38041 Grenoble Cedex                      Home: 33/76.51.16.15
FRANCE                                    Fax:  33/76.44.66.75
                                          Telex:UJF 980 134 F

E-Mail: bessiere at imag.imag.fr

Kipling's austere counsel suits the modern scientist more than anyone else:
"If you can watch the work of your life suddenly collapse, and set yourself
back to work; if you can suffer, struggle, and die without complaint, you will
be a man, my son." Only in the work of science can one love what one destroys,
continue the past while denying it, and venerate one's master while
contradicting him.       GASTON BACHELARD

