two papers

B Garner bmg at numbat.cs.rmit.edu.au
Fri Jan 9 06:38:03 EST 1998


A number of people have requested that the following published papers 
be made available in PostScript format.

They are now available on the WWW at:
http://yallara.cs.rmit.edu.au/~bmg/algA.ps
http://yallara.cs.rmit.edu.au/~bmg/algB.ps

I apologize if you have received multiple copies of this posting.
**************************************************************************

A symbolic solution for adaptive feedforward neural networks found with a 
new training algorithm

B. M. Garner, Department of Computer Science, RMIT, Melbourne, Australia.

ABSTRACT
Traditional adaptive feedforward neural network (NN) training algorithms 
find numerical values for the weights and thresholds. In this paper it is 
shown that an NN composed of linear threshold gates (LTGs) can function as 
a fully trained neural network without finding numerical values for the 
weights and thresholds. This surprising result is demonstrated by presenting 
a new training algorithm for this type of NN that resolves the network into 
constraints which describe all the numeric values the NN's weights and 
thresholds can take. The constraints do not need to be solved numerically 
for the network to function as a fully trained NN which can generalize. The 
solution is said to be symbolic as a numerical solution is not required.
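
For readers unfamiliar with the constraint view of LTGs, here is a minimal 
sketch of the general idea only, not the algorithm from the paper: a single 
LTG on binary inputs is represented by the inequalities its training 
examples impose on the weights and threshold, and a linear-programming 
feasibility test decides whether those inequalities already force the 
output for a new input. The ConstrainedLTG class, the MARGIN trick for 
strict inequalities, and the use of scipy's linprog are all assumptions 
made for illustration.

# Sketch only: one LTG kept as a set of linear constraints on (w, T),
# never as numeric weights. Not the paper's training algorithm.
import numpy as np
from scipy.optimize import linprog

MARGIN = 1.0  # strict inequality w.x < T is approximated as w.x <= T - MARGIN

class ConstrainedLTG:
    def __init__(self, n_inputs):
        self.n = n_inputs          # variables are (w_1..w_n, T)
        self.A = []                # rows of A z <= b, with z = (w, T)
        self.b = []

    def add_example(self, x, y):
        x = np.asarray(x, dtype=float)
        if y == 1:                 # require w.x >= T  ->  -w.x + T <= 0
            self.A.append(np.append(-x, 1.0)); self.b.append(0.0)
        else:                      # require w.x <  T  ->   w.x - T <= -MARGIN
            self.A.append(np.append(x, -1.0)); self.b.append(-MARGIN)

    def _feasible(self, extra_row, extra_b):
        A = self.A + [extra_row]
        b = self.b + [extra_b]
        res = linprog(c=np.zeros(self.n + 1), A_ub=np.array(A), b_ub=np.array(b),
                      bounds=[(None, None)] * (self.n + 1), method="highs")
        return res.status == 0     # 0 = solved, i.e. the constraint set is feasible

    def predict(self, x):
        """Return 1 or 0 if the constraint set forces that output, else None."""
        x = np.asarray(x, dtype=float)
        can_be_1 = self._feasible(np.append(-x, 1.0), 0.0)       # w.x >= T possible?
        can_be_0 = self._feasible(np.append(x, -1.0), -MARGIN)   # w.x <  T possible?
        if can_be_1 and not can_be_0:
            return 1
        if can_be_0 and not can_be_1:
            return 0
        return None                # output not determined by the training data

# Learn AND of two binary inputs purely as constraints on (w1, w2, T).
gate = ConstrainedLTG(2)
for x, y in [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]:
    gate.add_example(x, y)
print(gate.predict((1, 1)))        # 1 - forced by the constraints, no weights chosen
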
***************************************************************************

A training algorithm for Adaptive Feedforward Neural Networks that 
determines its own topology

B. M. Garner, Department of Computer Science, RMIT, Melbourne, Australia.

ABSTRACT 
There has been some interest in developing neural network training 
algorithms that determine their own architecture. A training algorithm 
for adaptive feedforward neural networks (NNs) composed of linear 
threshold gates (LTGs) is presented here that determines its own 
architecture and trains in a single pass. This algorithm produces what 
is said to be a symbolic solution, as it resolves the relationships 
between the weights and the thresholds into constraints which do not 
need to be solved numerically. The network has been shown to behave 
as a fully trained neural network which generalizes, and the possibility 
that the algorithm has polynomial time complexity is discussed. The 
algorithm uses binary data during training.
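
Again purely as illustration, and with no claim to match the paper's 
method, the sketch below shows one plausible reading of "determines its 
own architecture and trains in a single pass": each binary example is 
seen once, its inequality is attached to an existing LTG if the enlarged 
constraint set stays feasible, and a new LTG is allocated otherwise. The 
growth rule, the LP feasibility test, and the MARGIN constant are my own 
assumptions (the same constraint encoding as the previous sketch); the 
abstract does not say how allocated gates are combined.

# Sketch only: a single-pass, self-sizing loop over binary examples.
import numpy as np
from scipy.optimize import linprog

MARGIN = 1.0

def feasible(A, b, n_vars):
    """True if the system A z <= b has a solution for z = (w_1..w_n, T)."""
    res = linprog(np.zeros(n_vars), A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(None, None)] * n_vars, method="highs")
    return res.status == 0

def constraint(x, y):
    """Inequality row for one example: w.x >= T if y == 1, else w.x < T."""
    x = np.asarray(x, dtype=float)
    return (np.append(-x, 1.0), 0.0) if y == 1 else (np.append(x, -1.0), -MARGIN)

def single_pass_train(examples, n_inputs):
    gates = []                              # each gate is (A_rows, b_vals)
    for x, y in examples:                   # one pass over the data
        row, rhs = constraint(x, y)
        for A, b in gates:                  # try to place it on an existing LTG
            if feasible(A + [row], b + [rhs], n_inputs + 1):
                A.append(row); b.append(rhs)
                break
        else:                               # no gate can absorb it: grow the net
            gates.append(([row], [rhs]))
    return gates

# XOR cannot be carried by one LTG, so a second gate is allocated mid-pass.
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(len(single_pass_train(xor, 2)))       # 2 under this toy growth rule
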


Bernadette

=============================================================================
Bernadette Garner          He shall fall down a pit called Because, and there
bmg at numbat.cs.rmit.edu.au             he shall perish with the dogs of Reason
http://yallara.cs.rmit.edu.au/~bmg/                        - Aleister Crowley
=============================================================================

