Connectionist symbol processing: any progress?

Dmitri Rachkovskij ml_conn at infrm.kiev.ua
Wed May 5 11:39:50 EDT 1999


Keywords: distributed representation, sparse coding, binary coding,
binding, variable binding, thinning, representation of structure,
structured representation, recursive representation, nested representation,
compositional representation, connectionist symbol processing,
associative-projective neural networks.

Dear Colleagues,

The following paper draft (abstract enclosed), inspired by last year's
debate, is available at

http://cogprints.soton.ac.uk/abs/comp/199904008

Dmitri A. Rachkovskij & Ernst M. Kussul
"Binding and Normalization of Binary Sparse Distributed Representations
by Context-Dependent Thinning".

Comments are welcome!

Thank you and best regards,
Dmitri Rachkovskij


Abstract:

    Distributed representations have often been criticized as inappropriate
for encoding data with complex structure. However, Plate's Holographic
Reduced Representations and Kanerva's Binary Spatter Codes are recent
schemes that allow on-the-fly encoding of nested compositional structures
by real-valued or dense binary vectors of fixed dimensionality.
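(For readers new to these schemes, here is a minimal runnable sketch of
Kanerva-style spatter-code binding. The dimensionality, the tie-breaking
rule, and the variable names are illustrative choices, not taken from the
paper.)

    import numpy as np

    rng = np.random.default_rng(0)
    N = 10000  # illustrative dimensionality

    def random_code():
        # Dense binary codevector, p(1) = 0.5.
        return rng.integers(0, 2, N, dtype=np.uint8)

    def bind(x, y):
        # Spatter-code binding is elementwise XOR; XOR is its own
        # inverse, so bind(bind(x, y), y) recovers x exactly.
        return x ^ y

    def superpose(vectors):
        # Chunking by elementwise majority vote, ties broken at random.
        s = np.sum(vectors, axis=0, dtype=np.int32)
        tie = rng.integers(0, 2, N, dtype=np.int32)
        return ((2 * s + tie) > len(vectors)).astype(np.uint8)

    # Role-filler record: unbinding with a role gives a noisy copy of
    # its filler, cleaned up by nearest-neighbour search in item memory.
    role, filler = random_code(), random_code()
    record = superpose([bind(role, filler),
                        bind(random_code(), random_code())])
    print(np.mean(bind(record, role) == filler))  # ~0.75; chance is 0.5
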
    In this paper we consider the Context-Dependent Thinning procedures,
which were developed for the representation of complex hierarchical items
in the architecture of Associative-Projective Neural Networks. These
procedures bind items represented by sparse binary codevectors (with a
low probability of 1s). Such an encoding is biologically plausible and
yields a high information capacity in the distributed associative memory
where the codevectors may be stored.
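(The abstract does not spell out the thinning mechanism itself, so the
following Python sketch shows only one plausible permutation-based
variant: superimpose the sparse constituents by OR, then AND the
superposition with permuted copies of itself until roughly
single-codevector density is restored. The dimensionality N, the density
M, and the number of permutations are assumptions made for illustration;
see the paper for the actual procedures.)

    import numpy as np

    rng = np.random.default_rng(1)
    N = 10000   # illustrative dimensionality
    M = 100     # illustrative number of 1s per sparse codevector

    def random_sparse_code():
        # Sparse binary codevector with exactly M ones (density 1%).
        v = np.zeros(N, dtype=np.uint8)
        v[rng.choice(N, M, replace=False)] = 1
        return v

    # Fixed random permutations, shared by every thinning operation so
    # that the same constituents always thin to the same result.
    PERMS = [rng.permutation(N) for _ in range(50)]

    def cdt_bind(codevectors):
        # Superimpose the constituents by elementwise OR.
        z = np.zeros(N, dtype=np.uint8)
        for v in codevectors:
            z |= v
        # Thin: AND the superposition with permuted copies of itself,
        # accumulating surviving 1s until roughly single-codevector
        # density is restored. Which 1s survive depends on the whole
        # superposition, hence "context-dependent".
        y = np.zeros(N, dtype=np.uint8)
        for p in PERMS:
            y |= z & z[p]
            if int(y.sum()) >= M:
                break
        return y
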
    Unlike known binding procedures, Context-Dependent Thinning maintains
the same low density (or sparseness) of the bound codevector for a varying
number of constituent codevectors. Moreover, a bound codevector is not
only similar to other bound codevectors with similar constituents (as in
other schemes), but is also similar to the constituent codevectors
themselves. This allows structural similarity to be estimated simply from
the overlap of codevectors, without retrieving the constituents, and it
also makes retrieval of the constituent codevectors easy.
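(Continuing the sketch above, these similarity claims can be checked
directly: the thinned codevector is a subset of the OR of its
constituents, so it overlaps each constituent far above chance, and two
bindings that share constituents overlap each other.)

    a, b, c, d = (random_sparse_code() for _ in range(4))
    y_abc = cdt_bind([a, b, c])
    y_abd = cdt_bind([a, b, d])

    # Chance overlap of two unrelated codevectors is ~1 bit.
    print(int((y_abc & a).sum()))      # ~30 bits: a is a constituent
    print(int((y_abc & d).sum()))      # ~1 bit: d is unrelated to y_abc
    print(int((y_abc & y_abd).sum()))  # ~20 bits: two shared constituents
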
    Examples of algorithmic and neural-network implementations of the
thinning procedures are considered. We also present examples of
representing various types of nested structured data (propositions using
role-filler and predicate-argument representations, trees, directed
acyclic graphs) with sparse codevectors of fixed dimension. Such
representations may provide a fruitful alternative to the symbolic
representations of traditional AI, as well as to localist and
microfeature-based connectionist representations.
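
(As a hypothetical illustration of the role-filler idea in this sparse
setting, again continuing the sketch above with invented predicate and
role names, a proposition can be encoded by nested thinning, with the
dimension and density staying fixed at every level of nesting.)

    give = random_sparse_code()
    agent, obj, recipient = (random_sparse_code() for _ in range(3))
    john, book, mary = (random_sparse_code() for _ in range(3))

    # "John gives a book to Mary": thin each role with its filler,
    # then thin the role-filler pairs together with the predicate.
    proposition = cdt_bind([give,
                            cdt_bind([agent, john]),
                            cdt_bind([obj, book]),
                            cdt_bind([recipient, mary])])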

*************************************************************************
Dmitri A. Rachkovskij, Ph.D.                       Net: dar at infrm.kiev.ua
Senior Researcher,
V.M.Glushkov Cybernetics Center,                   Tel: 380 (44) 266-4119
Pr. Acad. Glushkova 40,
Kiev 22, 252022, UKRAINE                           Fax: 380 (44) 266-1570
*************************************************************************




