Connectionist symbol processing: any progress?

Dmitri Rachkovskij ml_conn at infrm.kiev.ua
Fri Jul 2 05:49:31 EDT 1999


Keywords: compositional distributed representations, sparse coding,
binary coding, binding, representation of structure, recursive representation,
nested representation, connectionist symbol processing, long-term memory,
associative-projective neural networks, analogical reasoning.

Dear Colleagues,

The following paper draft (abstract enclosed) is available at:
http://cogprints.soton.ac.uk/abs/comp/199907001

Dmitri A. Rachkovskij
"Representation and Processing of Structures
with Binary Sparse Distributed Codes".

You may also be interested in a related paper (abstract enclosed):
http://cogprints.soton.ac.uk/abs/comp/199904008

Dmitri A. Rachkovskij & Ernst M. Kussul
"Binding and Normalization of Binary Sparse Distributed Representations
by Context-Dependent Thinning".

Comments are welcome!

Thank you and best regards,
Dmitri Rachkovskij

Encl:

Representation and Processing of Structures with Binary Sparse Distributed Codes

Abstract:
    The schemes for compositional distributed representation include
those that allow on-the-fly construction of fixed-dimensionality
codevectors encoding structures of varying complexity. The similarity
of such codevectors reflects both the structural and the semantic
similarity of the represented structures. In this paper we give a
comparative description of the sparse binary distributed
representations developed within the framework of the
Associative-Projective Neural Network architecture and of the
better-known Holographic Reduced Representations of Plate and Binary
Spatter Codes of Kanerva.
    The key procedure in Associative-Projective Neural Networks is
Context-Dependent Thinning, which binds codevectors while maintaining
their sparseness. The codevectors are stored in a structured memory
array that can be realized as a distributed auto-associative memory.
Examples of distributed representations of structured data are given.
Fast estimation of the similarity of analogical episodes by the
overlap of their codevectors is used to model analogical reasoning,
both for retrieval of analogs from memory and for analogical mapping.
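
    As a toy illustration of the last point, structures encoded as
superpositions of sparse binary codevectors can be compared simply by
counting shared 1-bits. The sketch below is illustrative only: the
dimensionality, density, item vocabulary, and the plain OR
superposition are assumptions for demonstration, not the paper's
exact encoding (which also binds constituents by Context-Dependent
Thinning):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 10_000, 100  # assumed dimensionality and number of 1s per item

def codevector():
    """Random sparse binary codevector with M ones out of N bits."""
    v = np.zeros(N, dtype=bool)
    v[rng.choice(N, size=M, replace=False)] = True
    return v

def overlap(a, b):
    """Similarity estimate: number of shared 1-bits."""
    return int((a & b).sum())

# hypothetical item codevectors
dog, cat, fish, bone, chases = (codevector() for _ in range(5))

# episodes encoded as plain OR-superpositions (binding step omitted)
episode1 = dog | chases | cat
episode2 = dog | chases | fish   # shares two constituents with episode1
episode3 = bone | fish           # shares none with episode1

# shared constituents produce a much larger bit overlap than chance,
# so analogs can be retrieved by comparing overlaps directly
print(overlap(episode1, episode2), overlap(episode1, episode3))
```

Because similar episodes share constituent 1-bits, their overlap far
exceeds the chance-level overlap of unrelated episodes, which is what
allows fast retrieval of analogs without decoding the codevectors.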

--------
Binding and Normalization of Binary Sparse Distributed Representations
by Context-Dependent Thinning

Abstract:
    Distributed representations have often been criticized as
inappropriate for encoding data with complex structure. However,
Plate's Holographic Reduced Representations and Kanerva's Binary
Spatter Codes are recent schemes that allow on-the-fly encoding of
nested compositional structures by real-valued or dense binary
vectors of fixed dimensionality.
    In this paper we consider the Context-Dependent Thinning
procedures, which were developed for representing complex
hierarchical items in the architecture of Associative-Projective
Neural Networks. These procedures bind items represented by sparse
binary codevectors (with a low probability of 1s). Such an encoding
is biologically plausible and achieves a high information capacity in
the distributed associative memory where the codevectors may be
stored.
    Unlike previously known binding procedures, Context-Dependent
Thinning maintains the same low density (sparseness) of the bound
codevector for a varying number of constituent codevectors. Moreover,
a bound codevector is similar not only to other bound codevectors
with similar constituents (as in other schemes), but also to the
constituent codevectors themselves. This makes it possible to
estimate structural similarity simply by the overlap of codevectors,
without retrieving the constituents, and it also makes retrieval of
the constituent codevectors easy.
    Examples of algorithmic and neural-network implementations of the
thinning procedures are considered. We also present examples of
representing various types of nested structured data (propositions
using role-filler and predicate-argument representations, trees,
directed acyclic graphs) with sparse codevectors of fixed dimension.
Such representations may provide a fruitful alternative to the
symbolic representations of traditional AI, as well as to localist
and microfeature-based connectionist representations.
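
    A minimal sketch of one conjunctive flavor of such a thinning
step, under assumed parameters (dimensionality, density, number of
permutations) and with fixed random permutations standing in for the
paper's specific implementations: superimpose the constituents by OR,
then keep only those 1-bits of the superposition that coincide with
its own permuted copies, adding permutations until roughly the target
density is restored. The bound result is a subset of the
superposition, so it remains similar to each constituent:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 10_000, 100   # assumed dimensionality and 1s per item
K = 50               # max number of fixed permutations (assumed)
perms = [rng.permutation(N) for _ in range(K)]

def codevector():
    """Random sparse binary codevector with M ones out of N bits."""
    v = np.zeros(N, dtype=bool)
    v[rng.choice(N, size=M, replace=False)] = True
    return v

def cdt_bind(vectors, target_ones=M):
    """Toy additive Context-Dependent Thinning: OR the constituents,
    then accumulate those bits of the superposition that survive
    conjunction with its own permuted copies, stopping once about
    target_ones bits have been kept."""
    s = np.logical_or.reduce(vectors)   # superposition of constituents
    z = np.zeros(N, dtype=bool)
    for p in perms:
        z |= s & s[p]                   # thinned contribution
        if z.sum() >= target_ones:
            break
    return z

a, b, c = codevector(), codevector(), codevector()
bound = cdt_bind([a, b, c])
# bound keeps roughly M ones regardless of how many constituents
# were superimposed, and overlaps each constituent codevector
```

Because the result is drawn from the 1-bits of the superposition, it
overlaps each constituent well above chance, while the stopping rule
keeps the density near that of a single item, illustrating the two
properties the abstract highlights.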

*************************************************************************
Dmitri A. Rachkovskij, Ph.D.                       Net: dar at infrm.kiev.ua
Senior Researcher,
V.M.Glushkov Cybernetics Center,                   Tel: 380 (44) 266-4119
Pr. Acad. Glushkova 40,
Kiev 22, 252022, UKRAINE                           Fax: 380 (44) 266-1570
*************************************************************************
