book on hybrid connectionist language processing

Stefan Wermter wermter at nats2.informatik.uni-hamburg.de
Tue Dec 13 06:04:05 EST 1994



BOOK ANNOUNCEMENT
-----------------

The following book has been available since the beginning of December 1994.





Title:      Hybrid connectionist natural language processing

Date:       1995

Author:     Stefan Wermter
            Dept. of Computer Science
            University of Hamburg
            Vogt-Koelln-Str. 30
            D-22527 Hamburg
            Germany

            wermter at informatik.uni-hamburg.de

Series:     Neural Computing Series 7

Publisher:  Chapman & Hall Inc
            2-6 Boundary Row
            London SE1 8HN
            England



(Order information at the end of this message)



Description
-----------

The objective of this book is to describe a new approach to hybrid
connectionist natural language processing which bridges the gap between
strictly symbolic and connectionist systems. This objective is tackled
in two ways: the book gives an overview of hybrid connectionist
architectures for natural language processing, and it demonstrates that
a hybrid connectionist architecture can be used for learning real-world
natural language problems. The book is primarily intended for
scientists and students interested in artificial intelligence, neural
networks, connectionism, natural language processing, hybrid symbolic
connectionist architectures, parallel distributed processing, machine
learning, automatic knowledge acquisition, and computational
linguistics. Furthermore, it may be of interest to scientists and
students in information retrieval and cognitive science, since the book
points out interdisciplinary relationships to these fields.

We develop a systematic spectrum of hybrid connectionist architectures,
ranging from completely symbolic architectures through separated and
integrated hybrid connectionist architectures to completely
connectionist architectures. Within this spectrum we have designed the
system SCAN, with two separated and two integrated hybrid connectionist
architectures for a scanning understanding of phrases. A scanning
understanding is a relation-based flat understanding, in contrast to
traditional symbolic in-depth understanding. Hybrid connectionist
representations consist either of a combination of connectionist and
symbolic representations or of different connectionist representations.
In particular, we focus on important tasks such as structural
disambiguation and semantic context classification. We show that a
parallel modular, constraint-based, plausibility-based, and learned use
of multiple hybrid connectionist representations provides powerful
architectures for learning a scanning understanding. Specifically, the
combination of a direct encoding of domain-independent structural
knowledge and the connectionist learning of domain-dependent semantic
knowledge, as realized by the scanning understanding in SCAN, leads to
flexible, adaptable, and transportable architectures for different
domains.
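
As a rough illustration of the separated hybrid style sketched above,
consider prepositional-phrase attachment: a symbolic component
enumerates the possible structural readings, and a small connectionist
"plausibility" network scores each candidate. The sketch below is not
code from the book; the function names, the toy lexicon, and the
untrained weights are all hypothetical, and a real plausibility network
would of course be trained on labelled phrase data.

    # Illustrative sketch of a *separated* hybrid architecture (hypothetical
    # code, not from the book): a symbolic component enumerates structural
    # candidates, and a small "plausibility" network scores each one.

    import numpy as np

    # Symbolic side: enumerate attachment candidates for a V-NP-PP phrase.
    def attachment_candidates(verb, noun, prep, pnoun):
        """Return the two structural readings of a V-NP-PP phrase."""
        return [
            ("verb-attach", (verb, prep, pnoun)),   # e.g. "eat (with a fork)"
            ("noun-attach", (noun, prep, pnoun)),   # e.g. "(salad with croutons)"
        ]

    # Connectionist side: a tiny feed-forward plausibility network.
    rng = np.random.default_rng(0)
    VOCAB = ["eat", "salad", "fork", "croutons", "with"]   # toy lexicon
    W1 = rng.normal(scale=0.1, size=(3 * len(VOCAB), 8))   # untrained weights;
    W2 = rng.normal(scale=0.1, size=(8, 1))                # trained in practice

    def encode(triple):
        """Concatenate one-hot vectors for the (head, prep, dependent) triple."""
        vec = np.zeros(3 * len(VOCAB))
        for slot, word in enumerate(triple):
            vec[slot * len(VOCAB) + VOCAB.index(word)] = 1.0
        return vec

    def plausibility(triple):
        """Feed-forward pass producing a plausibility score in (0, 1)."""
        h = np.tanh(encode(triple) @ W1)
        return 1.0 / (1.0 + np.exp(-(h @ W2).item()))

    # Hybrid combination: symbolic candidates ranked by learned scores.
    candidates = attachment_candidates("eat", "salad", "with", "fork")
    best = max(candidates, key=lambda c: plausibility(c[1]))
    print("preferred reading:", best[0])

In SCAN itself the symbolic side is an active chart parser (Chapter 4)
and the connectionist side consists of trained feed-forward and
recurrent plausibility networks (Chapter 3); the sketch is only meant
to show the division of labour between the two kinds of components.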




Table of Contents
-----------------

1 Introduction
  1.1 Learning a Scanning Understanding
  1.2 The General Approach
  1.3 Towards a Hybrid Connectionist Memory Organization
  1.4 An Overview of the SCAN Architecture
  1.5 Organization and Reader's Guide

2 Connectionist and Hybrid Models for Language Understanding
  2.1 Foundations of Connectionist and Hybrid Connectionist Approaches
  2.2 Connectionist Architectures
    2.2.1 Representation of Language in Parallel Spatial Models
      Early Pattern Associator for Past Tense Learning
      Pattern Associator for Semantic Case Assignment
      Pattern Associator with Sliding Window
      Time Delay Neural Networks
    2.2.2 Representation of Language in Recurrent Models
      Recurrent Jordan Network for Action Generation
      Simple Recurrent Network for Sequence Processing
      Recursive Autoassociative Memory Network
    2.2.3 Towards Modular and Integrated Connectionist Models
      Cascaded Networks
      Sentence Gestalt Model
      Grounding Models
  2.3 Hybrid Connectionist Architectures
    2.3.1 Sentence Analysis in Hybrid Models
      Hybrid Interactive Model for Constraint Integration
      Hybrid Model for Sentence Analysis
    2.3.2 Inferencing in Hybrid Models
      Symbolic Marker Passing and Localist Networks
      Symbolic Reasoning with Connectionist Models
    2.3.3 Architectural Issues in Hybrid Connectionist Systems
      Symbolic Neuroengineering and Symbolic Recirculation
      Modular Model for Parsing
  2.4 Summary and Discussion

3 A Hybrid Connectionist Scanning Understanding of Phrases
  3.1 Foundations of a Hybrid Connectionist Architecture
    3.1.1 Motivation for a Hybrid Connectionist Architecture
    3.1.2 The Computational Theory Level for a Scanning Understanding
    3.1.3 Constraint Integration
    3.1.4 Plausibility View
    3.1.5 Learning
    3.1.6 Subtasks of Scanning Understanding at the Computational Theory Level
    3.1.7 The Representation Level for a Scanning Understanding
  3.2 Corpora and Lexicon for a Scanning Understanding
    3.2.1 The Underlying Corpora
    3.2.2 Complex Phrases
    3.2.3 Context and Ambiguities of Phrases
    3.2.4 Organization of the Lexicon
  3.3 Plausibility Networks
    3.3.1 Learning Semantic Relationships and Semantic Context
    3.3.2 The Foundation of Plausibility Networks
    3.3.3 Plausibility Networks for Noun-Connecting Semantic Relationships
    3.3.4 Learning in Plausibility Networks
    3.3.5 Recurrent Plausibility Networks for Contextual Relationships
    3.3.6 Learning in Recurrent Plausibility Networks
  3.4 Summary and Discussion

4 Structural Phrase Analysis in a Hybrid Separated Model
  4.1 Introduction and Overview
  4.2 Constraints for Coordination
  4.3 Symbolic Representation of Syntactic Constraints
    4.3.1 A Grammar for Complex Noun Phrases
    4.3.2 The Active Chart Parser and the Syntactic Constraints
  4.4 Connectionist Representation of Semantic Constraints
    4.4.1 Head-noun Structure for Semantic Relationships
    4.4.2 Training and Testing Plausibility Networks with NCN-relationships
    4.4.3 Learned Internal Representation
  4.5 Combining Chart Parser and Plausibility Networks
  4.6 A Case Study
  4.7 Summary and Discussion

5 Structural Phrase Analysis in a Hybrid Integrated Model
  5.1 Introduction and Overview
  5.2 Constraints for Prepositional Phrase Attachment
  5.3 Representation of Constraints in Relaxation Networks
    5.3.1 Integrated Relaxation Network
    5.3.2 The Relaxation Algorithm
    5.3.3 Testing Relaxation Networks
  5.4 Representation of Semantic Constraints in Plausibility Networks
    5.4.1 Training and Testing Plausibility Networks with NPN-Relationships
    5.4.2 Learned Internal Representation
  5.5 Combining Relaxation Networks and Plausibility Networks
    5.5.1 The Interface between Relaxation Networks and Plausibility Networks
    5.5.2 The Dynamics of Processing in a Relaxation Network
  5.6 A Case Study
  5.7 Summary and Discussion

6 Contextual Phrase Analysis in a Hybrid Separated Model
  6.1 Introduction and Overview
  6.2 Towards a Scanning Understanding of Semantic Phrase Context
    6.2.1 Superficial Classification in Information Retrieval
    6.2.2 Skimming Classification with Symbolic Matching
  6.3 Constraints for Semantic Context Classification of Noun Phrases
  6.4 Syntactic Condensation of Phrases to Compound Nouns
    6.4.1 Motivation of Symbolic Condensation
    6.4.2 Condensation Using a Symbolic Chart Parser
  6.5 Plausibility Networks for Context Classification of Compound Nouns
    6.5.1 Training and Testing the Recurrent Plausibility Network
    6.5.2 Learned Internal Representation
  6.6 Summary and Discussion

7 Contextual Phrase Analysis in a Hybrid Integrated Model
  7.1 Introduction and Overview
  7.2 Constraints for Semantic Context Classification of Phrases
  7.3 Plausibility Networks for Context Classification of Phrases
    7.3.1 Training and Testing with Complete Phrases
    7.3.2 Training and Testing with Phrases without Insignificant Words
    7.3.3 Learned Internal Representation
  7.4 Semantic Context Classification and Text Filtering
  7.5 Summary and Discussion

8 General Summary and Discussion
  8.1 The General Framework of SCAN
  8.2 Analysis and Evaluation
    8.2.1 Evaluating the Problems
    8.2.2 Evaluating the Methods
    8.2.3 Evaluating the Representations
    8.2.4 Evaluating the Experiment Design
    8.2.5 Evaluating the Experiment Results
  8.3 Extensions of a Scanning Understanding
    8.3.1 Extending Modular Subtasks
    8.3.2 Extending Interactions
  8.4 Contributions and Conclusions

9 Appendix
  9.1 Hierarchical Cluster Analysis
  9.2 Implementation
  9.3 Examples of Phrases for Structural Phrase Analysis
  9.4 Examples of Phrases for Contextual Phrase Analysis

References
Index







Order information
-----------------

            ISBN: 0 412 59100 6
            Pages: 190
            Figures: 56

            Price: 29.95 pounds sterling, 52.00 US dollars
            Credit cards: all major credit cards accepted by Chapman & Hall


            Please order from:

        --  Pam Hounsome
            Chapman & Hall
            Cheriton House
            North Way
            Andover
            Hants SP10 5BE
            England
 
            UK orders:
               Tel: 01264 342923 Fax: 01264 364418
            Overseas orders:
               Tel: +44 1264 342830 Fax: +44 1264 342761
 

            (Sister company in the US)
        --  Peter Clifford
            Chapman & Hall Inc 
            One Penn Plaza
            41st Floor
            New York NY 10119
            USA
 
            Tel: 212 564 1060
            Fax: 212 564 1505

 
        --  or by e-mail: order at Chaphall.com




















