Connectionist symbol processing: any progress?

Jim Austin austin at minster.cs.york.ac.uk
Thu Sep 10 13:15:55 EDT 1998


Another outline of symbolic/neural work taking place at York, UK, that
may be of interest to the debate.

Jim Austin


\author{Victoria J. Hodge and Jim Austin}
e-mail: vicky,austin at cs.york.ac.uk

We propose a unified, connectionist, distributed system.  The system is
currently theoretical: we define a logical architecture that we aim to map onto
the AURA \cite{Austin_96}, \cite{AURA_web} modular, distributed neural-network
methodology.

We posit a flexible, generic hierarchical topology with three fundamental
layers: features, cases, and classes.  There is no repetition: each concept is
represented by a single node in the hierarchy, which maintains consistency and
allows multifarious data types to be represented, thus permitting generic
domains to be accommodated.
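
The single-node-per-concept constraint can be sketched as follows. This is a
minimal illustration of the logical structure only; the class and attribute
names are our own, not part of AURA:

```python
# Minimal sketch of the three-layer hierarchy: features, cases, classes.
# Names and structure are illustrative, not the actual AURA implementation.

class Node:
    def __init__(self, name, layer):
        self.name = name          # unique: one node per concept
        self.layer = layer        # "feature", "case", or "class"
        self.links = {}           # weighted connections to other nodes

class Hierarchy:
    def __init__(self):
        self.nodes = {}           # name -> Node; enforces no repetition

    def add_node(self, name, layer):
        # A concept is represented by a single node: re-use if present.
        if name not in self.nodes:
            self.nodes[name] = Node(name, layer)
        return self.nodes[name]

    def connect(self, child, parent, weight=1.0):
        child.links[parent.name] = weight

h = Hierarchy()
f = h.add_node("has_wheels", "feature")
c = h.add_node("bicycle", "case")
k = h.add_node("vehicle", "class")
h.connect(f, c, 0.8)
h.connect(c, k, 1.0)
assert h.add_node("bicycle", "case") is c   # no duplicate nodes
```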

The front-end is an implicit, self-organising, unsupervised approach
similar to the Growing Cell Structures of Fritzke
\cite{Fritzke_93:TR}, but it constructs a hierarchy on top of the clusters and
generates feature descriptions and weighted connections for all nodes.  This
hierarchy will be mapped onto the binary representations of AURA and input to a
hierarchically arranged CMM topology representing features, cases and classes,
all partitioned into CMMs according to the natural partitions inherent in the
hierarchy.
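
For readers unfamiliar with binary Correlation Matrix Memories, the following
is a minimal sketch of the kind of CMM AURA builds on: training ORs the outer
product of binary input/output vectors into the matrix, and recall sums the
selected rows and thresholds at the number of set input bits (a Willshaw-style
threshold).  The dimensions and patterns are illustrative only:

```python
import numpy as np

# Illustrative binary Correlation Matrix Memory (CMM): training ORs in
# the outer product of binary input/output vectors; recall accumulates
# evidence per output bit and applies a Willshaw threshold.

class CMM:
    def __init__(self, n_in, n_out):
        self.W = np.zeros((n_in, n_out), dtype=np.uint8)

    def train(self, x, y):
        self.W |= np.outer(x, y).astype(np.uint8)

    def recall(self, x):
        s = x @ self.W                          # integer evidence per output bit
        return (s >= x.sum()).astype(np.uint8)  # threshold at no. of set input bits

cmm = CMM(6, 4)
x = np.array([1, 0, 1, 0, 0, 1], dtype=np.uint8)
y = np.array([0, 1, 0, 1], dtype=np.uint8)
cmm.train(x, y)
assert (cmm.recall(x) == y).all()   # stored pattern is recovered
```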

A constraint-elimination process will be applied first, discarding
implausible concepts and context-sensitively reducing the search space.  A
spreading activation (SA)-type process (purely theoretical at present) will be
initiated on the required features (i.e., sub-conceptually) and allowed to
spread via the weighted links throughout the hierarchy.  SA is postulated as
psychologically plausible and can implement context effects (semantically
focussing retrieval) and priming of recently retrieved concepts.  The highest
activated case(s) and class(es) will be retrieved as the best match.
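
The SA-style retrieval might be sketched as follows; the propagation rule,
decay factor and example graph are assumptions for illustration, not the
proposed implementation:

```python
# Sketch of spreading-activation retrieval: activation is injected at
# the queried feature nodes and propagated upwards along weighted links;
# the highest-activated case is retrieved as the best match.
# The decay factor and graph are illustrative assumptions.

def spread(links, sources, decay=0.5, steps=2):
    """links: {node: [(neighbour, weight), ...]}; sources: queried features."""
    act = {n: 1.0 for n in sources}
    for _ in range(steps):
        nxt = dict(act)
        for node, a in act.items():
            for nb, w in links.get(node, []):
                nxt[nb] = nxt.get(nb, 0.0) + decay * a * w
        act = nxt
    return act

links = {
    "has_wheels": [("bicycle", 0.8), ("car", 0.9)],
    "pedals":     [("bicycle", 1.0)],
    "bicycle":    [("vehicle", 1.0)],
    "car":        [("vehicle", 1.0)],
}
act = spread(links, ["has_wheels", "pedals"])
best_case = max(["bicycle", "car"], key=lambda n: act.get(n, 0.0))
```

Both features support "bicycle", so it accumulates more activation than "car",
and activation also reaches the class node "vehicle" via the case layer.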

New classes, cases and features can be aggregated into the hierarchy at any
time, simply and efficiently, by incorporating new nodes and connections.  We
also aim to implement a deletion procedure that will keep the hierarchy
within finite bounds (i.e., asymptotically limited).  When a predetermined
size is reached, the nodes with least utility, i.e., those covered by other
nodes and least frequently accessed, are generalised.  This allows forgetting,
as a new addition results in the generalisation of older concepts.
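
The utility-based forgetting policy could look roughly like this; the utility
score (coverage plus access frequency) and the size bound are illustrative
assumptions:

```python
# Sketch of the bounded-size forgetting policy: once the hierarchy
# exceeds a predetermined size, the node with least utility (covered by
# a more general node and least frequently accessed) is generalised away.
# The utility scoring and MAX_NODES bound are illustrative assumptions.

MAX_NODES = 4

def prune(nodes):
    """nodes: {name: {"accesses": int, "covered_by": name or None}}"""
    while len(nodes) > MAX_NODES:
        # Only nodes covered by a more general node are candidates.
        candidates = [n for n, d in nodes.items() if d["covered_by"]]
        if not candidates:
            break
        victim = min(candidates, key=lambda n: nodes[n]["accesses"])
        del nodes[victim]          # its covering node generalises it
    return nodes

nodes = {
    "vehicle":  {"accesses": 9, "covered_by": None},
    "bicycle":  {"accesses": 5, "covered_by": "vehicle"},
    "tricycle": {"accesses": 1, "covered_by": "bicycle"},
    "car":      {"accesses": 7, "covered_by": "vehicle"},
    "van":      {"accesses": 2, "covered_by": "car"},
}
prune(nodes)   # "tricycle" is least used and covered, so it is forgotten
```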

Future investigation includes: structured concepts; richer class structure
(including hierarchically divided classes); weight adaptation, where the
weights in the hierarchy are adjusted if the retrieved case(s) or class(es)
are a poor match; solution adaptation, allowing solutions to be generated for
new cases, possibly assembled from subsections of other solutions; and an
explanation procedure.

@misc{AURA_web,
	title	= {{The AURA Homepage}},
	howpublished	= {\emph{http://www.cs.york.ac.uk/arch/nn/aura.html}}
	}

@Article{Austin_96,
	author 	= {Austin, J.},
	title 	= {{Distributed associative memories for high speed symbolic
				reasoning}},
	journal = {Fuzzy Sets and Systems},
	volume 	= 82,
	pages 	= {223--233},
	year 	= 1996
	}

@Techreport{Fritzke_93:TR,
	author 	= {Fritzke, Bernd},
	title 	= {{Growing Cell Structures - a Self-organizing Network for
		Unsupervised and Supervised Learning}},
	institution	= {International Computer Science Institute},
	address = {Berkeley, CA},
	number 	= {TR-93-026},
	year 	= 1993,
	}

-- 

Dr. Jim Austin, Senior Lecturer, Department of Computer Science, University of York, York,
YO1 5DD, UK.
Tel : 01904 43 2734
Fax : 01904 43 2767
web pages: http://www.cs.york.ac.uk/arch/


