Connectionist symbol processing: any progress?

Jim Austin austin at minster.cs.york.ac.uk
Mon Sep 7 14:22:00 EDT 1998



It's time to add one more (!) line of work to the symbolic neural networks
debate. We have been working on these systems for about five years, in the
context of binary neural networks. We have concentrated on the use
of outer-product methods for storing and accessing information in
many (commercially relevant) problems, rather than on their
cognitive significance (although I believe what we have done has some). We
exploit them for the following reasons:

1) Outer-product based methods are very fast in both learning and
access; this comes from the sparse representations we use, along with the
simple learning rules.

2) They are very memory efficient if used with distributed representations.

3) They have the ability to offer any-time properties (by this I mean
they can give a sensible result at any time after a fixed initial
processing time has passed).

4) They can be used in a modular way.

5) They are fault tolerant (probably).

6) They map to hardware very efficiently.
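To make points 1) and 3) concrete, here is a minimal sketch of outer-product
learning and access in a binary correlation matrix memory. The sparse random
codes, memory size and Willshaw-style thresholding are illustrative
assumptions, not our exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(n_bits, n_set):
    # Random sparse binary code with exactly n_set bits on.
    v = np.zeros(n_bits, dtype=np.uint8)
    v[rng.choice(n_bits, n_set, replace=False)] = 1
    return v

N, K = 256, 4                         # code length, bits set per code
M = np.zeros((N, N), dtype=np.uint8)  # the CMM itself

pairs = [(sparse_code(N, K), sparse_code(N, K)) for _ in range(50)]

# Learning: one binary outer product per pair, OR-ed into the memory.
for x, y in pairs:
    M |= np.outer(x, y)

# Access: a single matrix-vector product, then a threshold at the cue's
# weight K (Willshaw thresholding) recovers the stored binary code.
x0, y0 = pairs[0]
recalled = (x0 @ M >= K).astype(np.uint8)
assert np.all(recalled >= y0)  # every stored bit of y0 is recovered
```

Learning and access are each a single pass over the matrix, which is why
both are fast and why a usable (if noisy) answer is available as soon as the
product and threshold have been computed.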

Our main work has been on their use in search engines, rule-based systems
and image analysis.

The approach is based on the use of tensor products for binding data
before presentation to a CMM (an outer-product based memory). The other
features are:
The use of binary distributed representations of data.
The use of threshold logic to select interesting matches.
The use of pre-processing to orthogonalise data for efficient representation.
The use of superposition to maintain fixed-length representations.
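As a rough illustration of the binding, superposition and threshold-logic
steps (the role/filler names and code sizes are invented for this example,
and this is a sketch rather than our implementation): two symbol pairs are
bound by binary outer products, superimposed by bitwise OR into a trace of
fixed size, and a filler is recovered by thresholding at the role's weight:

```python
import numpy as np

rng = np.random.default_rng(1)

def code(n, k):
    # Random sparse binary code with k bits on.
    v = np.zeros(n, dtype=np.uint8)
    v[rng.choice(n, k, replace=False)] = 1
    return v

N, K = 64, 3
roles   = {"colour": code(N, K), "shape":  code(N, K)}
fillers = {"red":    code(N, K), "square": code(N, K)}

# Tensor-product binding: each role/filler pair becomes a binary outer
# product; superposition (bitwise OR) keeps the combined trace at a
# fixed N x N size no matter how many pairs are stored.
trace = (np.outer(roles["colour"], fillers["red"])
         | np.outer(roles["shape"], fillers["square"]))

# Unbinding: probe with a role and apply threshold logic at the role's
# weight K to select the matching filler bits.
s = roles["colour"] @ trace
red_hat = (s >= K).astype(np.uint8)
assert np.all(red_hat >= fillers["red"])  # the bound filler is recovered
```

Because superposition is a bitwise OR, the trace never grows with the number
of bound pairs; the price is occasional "ghost" bits at higher loadings,
which the threshold logic keeps in check.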

Our work has looked at the use of large numbers of CMM systems for
solving hard problems. For example, we have applied them to molecular
databases, where they are used for structure matching, and to
recognising (simple) images, where they can be made to generalise over
translation and changes of scale.

We have built hardware to perform binary outer-products and other operations,
and are now scaling up the architecture to support large numbers of CMMs
using these cards.

The largest CMM we have used (i.e. an outer-product based representation) is
750Mb, on a text database.

Details of this work can be found on our web pages and in the following
papers:

The basic methods of tensor binding:
%A    J Austin
%E    N Kasabov
%J    International Journal on Fuzzy Sets and Systems
%T    Distributed Associative Memories for High Speed Symbolic Reasoning
%P 223-233
%V 82
%D 1996


Some more of the same:
%A J Austin
%B Connectionist Symbolic Integration
%D 1997
%E Ron Sun
%E Frederic Alexandre
%I Lawrence Erlbaum Associates
%C 15
%P 265-278
%T A Distributed Associative Memory for High Speed Symbolic Reasoning
%O ISBN 0-8058-2348-4


The properties of a CMM that we use to store the bindings:
%A Mick Turner
%A Jim Austin
%T Matching Performance of Binary Correlation Matrix Memories
%J Neural Networks
%D 1997
%I Elsevier Science
%P 1637-1648
%N 9
%V 10

The use of the symbolic methods to do molecular database searching
(a new paper on this has just been submitted to Neural Networks):
%A M Turner
%A J Austin
%T A neural network technique for chemical graph matching
%D July 1997
%E M Niranjan
%I IEE
%B Fifth International Conference on Artificial Neural Networks, Cambridge, UK

Image understanding architecture:
%A C Orovas
%A J Austin
%T A cellular system for pattern recognition using associative neural networks
%J 5th IEEE Int. Workshop on Cellular Neural Networks and their Applications
%D 14-17 April 1998

The hardware:
%A J Austin
%A J Kennedy
%T PRESENCE, a hardware implementation of binary neural networks
%J International Conference on Artificial Neural Networks
%C Sweden
%D 1998





Jim

-- 

Dr. Jim Austin, Senior Lecturer, Department of Computer Science, University of York, York,
YO1 5DD, UK.
Tel : 01904 43 2734
Fax : 01904 43 2767
web pages: http://www.cs.york.ac.uk/arch/


