Record/archive of debate?
B Garner
bmg at cs.rmit.edu.au
Sun Aug 23 10:49:54 EDT 1998
*
* It would be nice if some sort of a record of the "Connectionist
* Symbol Processing" debate were to be produced and archived for
* the benefit of the community.
*
I think this would be a good idea, because so many interesting ideas
were expressed.
I recently published two training algorithms which I call symbolic.
They can be found at
http://yallara.cs.rmit.edu.au/~bmg/algA.ps and algB.ps
I have included the abstracts below.
I say these algorithms are symbolic because they train the networks
without finding a numerical solution: instead, sets of constraints are
derived during training. These constraints show that the weights and
the threshold at each 'neuron' are all in relationship to each other.
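To give a flavour of what I mean, here is a toy sketch (in Python) for
a single LTG over binary inputs. This is only my illustration of the
general idea, not the algorithm from the papers, and the names
Constraint and derive_constraints are invented for the example.

from dataclasses import dataclass

@dataclass(frozen=True)
class Constraint:
    coeffs: tuple   # binary input x; w.x is the sum of w_i where x_i = 1
    relation: str   # ">=" means w.x >= T, "<" means w.x < T

def derive_constraints(samples):
    """Turn (input, target) pairs into constraints relating w and T."""
    constraints = set()
    for x, target in samples:
        constraints.add(Constraint(tuple(x), ">=" if target == 1 else "<"))
    return constraints

def show(constraints):
    for c in sorted(constraints, key=lambda c: c.coeffs):
        terms = [f"w{i}" for i, xi in enumerate(c.coeffs) if xi == 1]
        print(f"{' + '.join(terms) if terms else '0'} {c.relation} T")

# Example: the AND function on two inputs.
show(derive_constraints([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]))
# Prints:
#   0 < T
#   w1 < T
#   w0 < T
#   w0 + w1 >= T

Any (w0, w1, T) satisfying these four constraints behaves as a trained
AND gate, so no particular numerical solution ever needs to be chosen.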
I have thought a lot about what 'symbol' means, and I have largely
decided that a symbol is something that takes its meaning from the
symbols "around" it. Perhaps there are better definitions, since this
one is self-referential, but this idea of a symbol is apparently close
to that of structural linguistics. Conveniently, you might think, this
idea also supports the results of my training algorithms.
I have read part of the argument in this debate where someone said
that the topology of the input space needs to be examined. With my
second algorithm, once you know the topology of the input space the
problem can be transformed and learnt very simply. Even problems
such as the twin spiral problem can be learnt with one hidden layer.
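As a toy illustration of this point (not taken from the paper, and
assuming the spirals' parametric form is known in advance), a
polar-coordinate transform of the two-spiral inputs reduces the task
to a single threshold on one derived feature:

import math

C = 2.0 * 2 * math.pi  # spiral tightness: theta = C * r on the class-0 arm

def make_spirals(n=50):
    """The classic two-spiral data in Cartesian coordinates."""
    pts = []
    for i in range(1, n + 1):
        r = i / n
        theta = C * r
        pts.append(((r * math.cos(theta), r * math.sin(theta)), 0))
        pts.append(((-r * math.cos(theta), -r * math.sin(theta)), 1))
    return pts

def spiral_feature(x, y):
    """Angular offset of (x, y) from the class-0 spiral arm."""
    r = math.hypot(x, y)
    phi = math.atan2(y, x)
    return math.cos(phi - C * r)  # > 0 near class 0, < 0 near class 1

data = make_spirals()
correct = sum((spiral_feature(x, y) > 0) == (label == 0)
              for (x, y), label in data)
print(f"{correct}/{len(data)} correct with a single threshold unit")

Once the inputs are re-expressed in coordinates that respect the
topology, a single threshold (let alone one hidden layer) suffices for
this noise-free version of the problem.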
These algorithms are very simple, but I haven't yet finished writing
up all my results.
Here are the abstracts:
A symbolic solution for adaptive feedforward neural networks found
with a new training algorithm
B. M. Garner, Department of Computer Science, RMIT, Melbourne, Australia.
ABSTRACT
Traditional adaptive feedforward neural network (NN) training
algorithms find numerical values for the weights and thresholds. In
this paper it is shown that a NN composed of linear threshold gates
(LTGs) can function as a fully trained neural network without
numerical values being found for the weights and thresholds. This
surprising result is demonstrated by presenting a new training
algorithm for this type of NN that resolves the network into
constraints which describe all the numeric values the NN's weights
and thresholds can take. The constraints do not require a numerical
solution for the network to function as a fully trained NN which can
generalize. The solution is said to be symbolic as a numerical
solution is not required.
**************************************************************************
A training algorithm for adaptive feedforward neural networks that
determines its own topology
B. M. Garner, Department of Computer Science, RMIT, Melbourne, Australia.
ABSTRACT
There has been some interest in developing neural network training
algorithms that determine their own architecture. A training algorithm
for adaptive feedforward neural networks (NNs) composed of linear
threshold gates (LTGs) is presented here that determines its own
architecture and trains in a single pass. This algorithm produces what
is said to be a symbolic solution, as it resolves the relationships
between the weights and the thresholds into constraints which do not
need to be solved numerically. The network has been shown to behave as
a fully trained neural network which generalizes, and the possibility
that the algorithm has polynomial time complexity is discussed. The
algorithm uses binary data during training.
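To make the single-pass idea concrete, here is a schematic sketch of
the general shape of such a learner. This is emphatically not the
algorithm from the paper: the brute-force numeric feasibility search
below is only a stand-in for the symbolic constraint handling described
in the abstract, and the names feasible and train_single_pass are
invented for the example.

from itertools import product

def feasible(constraints, n_inputs, span=range(-2, 3)):
    """Does some small integer (w, T) satisfy every (input, target) pair?"""
    for *w, T in product(span, repeat=n_inputs + 1):
        if all((sum(wi * xi for wi, xi in zip(w, x)) >= T) == (target == 1)
               for x, target in constraints):
            return True
    return False

def train_single_pass(samples, n_inputs):
    """One pass over the data; allocate a new LTG whenever no existing
    unit's constraint set can absorb the new example."""
    units = []  # each unit is a list of (input, target) constraints
    for x, target in samples:
        for unit in units:
            if feasible(unit + [(x, target)], n_inputs):
                unit.append((x, target))
                break
        else:
            units.append([(x, target)])  # grow the architecture
    return units

# XOR cannot be captured by one LTG, so a second unit is allocated mid-pass.
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(f"{len(train_single_pass(xor, n_inputs=2))} LTG(s) allocated")  # 2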