2 papers available

rachida chentouf at kepler.inpg.fr
Tue Jan 30 11:29:26 EST 1996


First paper:

Combining Sigmoids and Radial Basis Functions in Evolutive Neural Architectures.

available at:

ftp://tirf.inpg.fr/pub/HTML/chentouf/esann96_chentouf.ps.gz

ABSTRACT

An incremental algorithm is proposed for supervised learning of noisy data
using two-layer neural networks with linear output units and a mixture of
sigmoids and radial basis functions in the hidden layer (2-[S,RBF]NN).
Each time the network has to be extended, we compare different estimates
of the residual error: the one provided by a sigmoidal unit responding to
the overall input space, and those provided by a number of RBFs responding
to localized regions. The unit that provides the best estimate is selected
and installed in the existing network. The procedure is repeated until the
error is reduced to the noise level in the data. Experimental results show
that the incremental algorithm using 2-[S,RBF]NN is considerably faster
than the one using only sigmoidal hidden units. It also leads to a less
complex final network and avoids being trapped in spurious minima.
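To make the selection step concrete, here is a minimal sketch (not the
paper's actual training procedure) of the greedy idea: at each extension,
a global sigmoid candidate and several localized RBF candidates are fitted
to the current residual with a least-squares output weight, and the one
that best reduces the residual error is installed. The toy data, the fixed
RBF centres/widths, and the random sigmoid direction are all assumptions
for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem with additive noise.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)

def sigmoid_unit(X, w, b):
    # A sigmoid responding to the overall input space.
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

def rbf_unit(X, c, s):
    # A Gaussian RBF responding to a localized region around centre c.
    return np.exp(-np.sum((X - c) ** 2, axis=1) / (2.0 * s ** 2))

def fit_output_weight(h, residual):
    # Least-squares output weight for one candidate hidden unit.
    return (h @ residual) / (h @ h)

residual = y.copy()      # empty network: residual error = target
hidden_outputs = []      # outputs of installed hidden units

for step in range(5):
    # Candidate 1: a sigmoid over the whole input space
    # (random direction here; the paper trains the candidate).
    w, b = rng.standard_normal(1), rng.standard_normal()
    candidates = [sigmoid_unit(X, w, b)]
    # Candidates 2..n: RBFs centred on localized regions (assumed centres).
    for c in (-2.0, 0.0, 2.0):
        candidates.append(rbf_unit(X, np.array([c]), 0.8))

    # Select the candidate whose estimate best reduces the residual error.
    best_h, best_a, best_err = None, 0.0, np.inf
    for h in candidates:
        a = fit_output_weight(h, residual)
        err = np.mean((residual - a * h) ** 2)
        if err < best_err:
            best_h, best_a, best_err = h, a, err

    # Install the winning unit and update the residual.
    hidden_outputs.append(best_h)
    residual = residual - best_a * best_h

print("residual MSE after 5 units:", np.mean(residual ** 2))
```

Because each installed unit is fitted to the residual by least squares,
the training MSE is non-increasing at every step; in the paper the loop
would stop once this error reaches the noise level in the data.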

This paper has been accepted for publication in the European Symposium on
Artificial Neural Networks, Bruges, Belgium, April 1996.

===========================================================

The second paper is an extended abstract (the final version is in preparation):

DWINA: Depth and Width Incremental Neural Algorithm.

available at:

ftp://ftp.tirf.inpg.fr/pub/HTML/chentouf/icnn96_chentouf.ps.gz

ABSTRACT

This paper presents DWINA: an algorithm for designing both the depth and
the width of neural architectures in the case of supervised learning with
noisy data. Each new unit is trained to learn the error of the existing
network and is connected to it in such a way that it does not affect the
network's previous performance. Criteria for choosing between increasing
the width and increasing the depth are proposed, and the connection
procedure for each case is described. The stopping criterion is very
simple: it consists of comparing the residual error signal to the noise
signal. Preliminary experiments demonstrate the efficacy of the algorithm,
especially in avoiding spurious minima and in designing a network of
well-suited size. The complexity of the algorithm (number of operations)
is on average the same as that needed for a convergent run of the BP
algorithm on a static architecture with the optimal number of parameters.
Moreover, no significant difference is found between networks having the
same number of parameters but different structures. Finally, the algorithm
exhibits an interesting behaviour: the MSE on the training set tends to
decrease continuously during the process, converging directly and reliably
to the solution of the mapping problem.
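The two central ideas above can be sketched in a few lines: each new unit
is trained only on the residual error of the existing network (so previous
performance is untouched), and growth stops once the residual error drops
to the noise level. This is only an illustrative sketch, not DWINA itself:
it grows width only (no depth criterion), uses random-feature tanh units
with least-squares output weights, and assumes the noise power is known.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: linear target plus Gaussian noise of known variance.
noise_std = 0.1
X = rng.standard_normal((300, 1))
y = 2.0 * X[:, 0] + noise_std * rng.standard_normal(300)

def add_unit_on_residual(X, residual):
    """Train one new unit on the residual of the existing network.
    Since only the new unit's output weight is fitted, the mapping
    already learned by the existing network is left unchanged."""
    h = np.tanh(X @ rng.standard_normal(1) + rng.standard_normal())
    a = (h @ residual) / (h @ h)        # least-squares output weight
    return residual - a * h

residual = y.copy()
noise_power = noise_std ** 2            # assumed known (or estimated)
units = 0
# Stopping criterion: compare residual error signal to noise signal
# (with a cap on growth for this toy run).
while np.mean(residual ** 2) > noise_power and units < 50:
    residual = add_unit_on_residual(X, residual)
    units += 1

print("units added:", units, "residual MSE:", np.mean(residual ** 2))
```

As in the abstract, the training MSE decreases monotonically during
growth, since every new unit can only remove a component of the residual.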

This paper has been accepted for publication in the IEEE International
Conference on Neural Networks, Washington, June 1996.

     __      ______  __  ________  _______        __ __      __ ________ ______
    / /     /_  __/ / / / ____  / / _____/       / // /\    / // ____  // ____/
   / /       / /   / / / /___/ / / /___   ____  / // /\ \  / // /___/ // / ____
  / /_____  / /   / / /    ___/ / _____/ /___/ / // /  \ \/ // /_____// /_/ __/
 /_______/ /_/   /_/ /_/\__\   /_/            /_//_/    \_\//_/      /_____/
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
||		Mrs Rachida CHENTOUF					     ||
||		LTIRF-INPG						     ||
||		46, AV Felix Viallet 					     ||
||		38031 Grenoble - FRANCE					     ||
||					Tel : (+33) 76.57.43.64		     ||
||					Fax : (+33) 76.57.47.90              ||
||									     ||
||		WWW: ftp://tirf.inpg.fr/pub/HTML/chentouf/rachida.html	     ||
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=