Cognitive Modelling Workshop (Virtual and Physical)

Janet Wiles janet at psy.uq.oz.au
Thu Feb 1 02:46:37 EST 1996


                              CALL FOR PAPERS

                       We are pleased to announce a 

                       COGNITIVE MODELLING WORKSHOP 

                            in conjunction with
           the Australian Conference on Neural Networks (ACNN'96) 
            and the electronic cognitive science journal Noetica 


This announcement can be accessed on the WWW at URL:
http://psy.uq.edu.au/CogPsych/acnn96/cfp.html

--------------------------------------------------------------------
       Noetica/ACNN'96 Cognitive Modelling Workshop: Call For Papers

                              VIRTUAL WORKSHOP
                        February 1 to March 26, 1996

                              PHYSICAL WORKSHOP
                             Canberra Australia
                                April 9, 1996

                Memory, time, change and structure in ANNs:
        Distilling cognitive models into their functional components

                Organisers: Janet Wiles and J. Devin McAuley
               Departments of Computer Science and Psychology
                 University of Queensland QLD 4072 Australia
                  janet at psy.uq.edu.au devin at psy.uq.edu.au

                          Call For Papers Web Page
               http://psy.uq.edu.au/CogPsych/acnn96/cfp.html

                              Workshop Web Page
             http://psy.uq.edu.au/CogPsych/acnn96/workshop.html

Aim

The Workshop aim is to identify the functional roles of artificial neural
network (ANN) components and to understand how they combine to explain
cognitive phenomena. Existing ANN models will be distilled into their
functional components through case study analysis, targeting three
traditional strengths of ANNs (mechanisms for memory, time and change) and
one area of weakness (mechanisms for structure); see below for details of
these four areas.

Workshop Format

The Workshop will be held in two parts: a virtual workshop via the World
Wide Web followed by the physical workshop at ACNN'96.

February - March 1996 (Part 1 - Virtual Workshop): All members of the
cognitive modelling and ANN research communities are invited to submit case
studies of neural-network-based cognitive models (their own or established
models from the literature) and commentaries on the workshop issues and case
studies. Submissions judged appropriate for the workshop will be posted to
the Workshop Web Page as they arrive, and will be collated into a Special
Issue of the electronic journal Noetica. Multiple case studies of a single
ANN model may be accepted if they address different cognitive phenomena.
Participants may take part in the virtual workshop without attending the
physical workshop.

April 9, 1996 (Part 2 - Physical Workshop): A physical workshop will be held
as part of the 1996 Australian Conference on Neural Networks (ACNN'96) in
Canberra Australia. At the workshop, the collection of case studies and
commentaries will be available in hard copy form.

The physical workshop will be 90 minutes long, beginning with an
introduction (review of the issues); then presentation of submitted and
invited Case Studies; and closing with a discussion of what's missing from
the list of available mechanisms. A summary of the issues raised in the
discussion will be compiled afterwards, and made available via the workshop
web page. Further details on presentations will be announced closer to the
date of the workshop.

Rationale

For many ANN models of cognitive phenomena, interesting behaviour appears to
arise from the model as a total package, and it is often a challenge to
understand how aspects of the behaviour are supported by components of the
ANN. The goal of this workshop is to further such understanding:
specifically, to identify the functional roles of ANN components, the links
from components to overall behaviour, and how these components combine to
explain cognitive phenomena.

The four target areas (memory, time, change and structure) are not disjoint,
but rather provide overlapping viewpoints from which to examine models. In
essence, we believe these target areas are where to look for the "sources
of power" in a model.

We use the term "distillation" to refer to the process of identifying the
functional components of a model with respect to the four target areas. The
task of distillation requires exploring the details of a model in order to
clarify its source of power, stripping away other aspects. It focuses on the
computational properties of the model formalism, providing a method for: (1)
understanding the computational components of a newly presented model, and
how they give rise to its behaviour, (2) discerning novel computational
components that may prove useful in model development, and (3) comparing ANN
models that target similar cognitive tasks.

The workshop is specifically intended for cognitive modellers who use ANNs,
but we anticipate that it will be of interest to the wider ANN community.
The case study format grounds the analysis of the functional components of
ANNs in the cognitive modelling literature, and focuses on phenomena that do
admit a computational explanation. The workshop is intended to be as much a
learning experience as a communicative one.

----------------------------------------------------------------------------

Submission Details

Each Case Study should be based on a published ANN simulation in an area of
cognitive science. It should address all four target areas of memory, time,
change and structure, with the order of sections and choice of sub-headings
up to the individual. Some models may have little to say about one or more
of these areas; make this explicit. Include simulation details relevant to
understanding the functional components of the model, but note that
complete replication details are not necessary.

Issues beyond the scope of the functional components and the behaviour they
support should not be included in the case study itself, but may be
appropriate as commentary.

Maximum length is 2500 words including references. Where possible, papers
should be submitted in HTML format (ASCII and PostScript will also be
accepted).

The URL for each submission or the source document can be emailed to the
organisers at janet at psy.uq.edu.au or devin at psy.uq.edu.au between Feb 1 and
March 26, 1996.

Case Study Format

See Case Study #1, "Elman's SRN and the discovery of lexical classes", as a
guide:
      http://psych.psy.uq.oz.au/CogPsych/acnn96/case1.html

Target paper: Give the full reference to the original paper and relevant
simulation.

Introduction: In this section introduce the task addressed by the model and
the ANN used. Distinguish between the cognitive task of the model and its
instantiation in the input/output task of the network. Is there a gap
between the cognitive task and the input/output task of the ANN? For
some studies this mismatch may be intentional, as the cognitive task can be
viewed as a by-product of another process (e.g., discovery of lexical
classes via the prediction task in Elman's SRN). In others, the mismatch
between cognitive task and input/output task may be less benign, obscuring
the contribution of the model towards understanding the phenomenon. For the
ANN task, consider the following questions: What are the inputs and outputs
of the model? How is the task information encoded in the input
representation? How is the model's response encoded by the output
representation? How does the network address the cognitive task?
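As a concrete illustration of the gap between cognitive task and input/output task, consider a minimal sketch of a simple recurrent network (SRN) of the kind Elman used: the network's input/output task is next-item prediction, while the cognitive claim concerns the hidden (context) representations, which cluster by lexical class. This sketch is our illustration, not part of the call; all layer sizes, weight scales and names are arbitrary, and the weights are untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 5, 8  # one-hot "word" inputs; hidden/context size (illustrative)
W_ih = rng.normal(0, 0.1, (n_hid, n_in))   # input -> hidden
W_hh = rng.normal(0, 0.1, (n_hid, n_hid))  # context -> hidden (the recurrence)
W_ho = rng.normal(0, 0.1, (n_in, n_hid))   # hidden -> output (prediction)

def srn_step(x, context):
    """One SRN step: hidden state mixes the current input with prior context."""
    h = np.tanh(W_ih @ x + W_hh @ context)
    y = np.exp(W_ho @ h)
    y /= y.sum()  # softmax over next-item predictions (the input/output task)
    return y, h

# Run a short sequence of one-hot "words". The hidden states, not the
# predictions, are what a lexical-class analysis would cluster.
context = np.zeros(n_hid)
hiddens = []
for idx in [0, 3, 1]:
    x = np.eye(n_in)[idx]
    y, context = srn_step(x, context)
    hiddens.append(context)
```

The point of the sketch is that the cognitive phenomenon (discovery of lexical classes) is a by-product of the prediction task, visible only in the hidden-state trajectory.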

Memory: Identify the information to be stored in memory, then describe the
mechanisms. A range of mechanisms has been proposed for storing and
retrieving memories in neural networks: implicit long-term coding of
memories distributed in the weights; memory as an attractor; short-term
memory as transient decay of activations; limit-cycle encoding of memories
with synchrony as a method of retrieval. Consider the what and how of memory
storage and retrieval: What information needs memory? How is it stored and
retrieved?
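To make one of these mechanisms concrete, here is a minimal sketch (our illustration, not drawn from any particular case study) of memory as an attractor: a Hopfield-style network stores patterns implicitly in its weights, and retrieval is the relaxation of activations to a stored fixed point, so a corrupted probe is completed to the nearest stored pattern.

```python
import numpy as np

# Two bipolar patterns stored in the weights via the Hebbian outer-product rule.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
W = patterns.T @ patterns / patterns.shape[1]
np.fill_diagonal(W, 0)  # no self-connections

def retrieve(probe, steps=10):
    """Retrieval as relaxation: iterate the update until a fixed point."""
    s = probe.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1
    return s

# Probe with the first pattern corrupted in one bit; relaxation restores it.
noisy = np.array([1, -1, 1, -1, 1, 1])
recalled = retrieve(noisy)
```

Here the "what" of memory is the stored bipolar patterns, and the "how" splits cleanly into storage (weights) and retrieval (activation dynamics).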

Time: Describe how time is treated in the data, processing and parameters of
the network. For example, does the network consider time as an absolute
measure in which events in the input are time-stamped with reference to an
external clock, as a sequence in which only the order of events is
specified, or as a relative measure in which durations are ratios of one
another? Methods for processing temporal information with neural networks
have included: using a fixed or sliding time window which maps time into
space; learning of time delays in the network weights; sequential processing
of time slices; and encoding time as the phase angle of an oscillator. What
measures of time are used by the network? How are they represented and what
are the underlying mechanisms?
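The first of the listed methods can be sketched in a few lines (again, our illustrative example): a fixed-width sliding window maps time into space, so that a feedforward network sees temporal order only as position within the window, and absolute time-stamps and durations are discarded.

```python
import numpy as np

def sliding_windows(sequence, width):
    """Map a 1-D sequence of length T into (T - width + 1) spatial vectors."""
    seq = np.asarray(sequence)
    return np.stack([seq[t:t + width] for t in range(len(seq) - width + 1)])

x = [0.1, 0.2, 0.3, 0.4, 0.5]
windows = sliding_windows(x, width=3)
# Each row is one network input; only order within the window survives.
```

In the terms of the questions above, this treats time purely as a sequence: the network's measure of time is ordinal position, represented spatially.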

Change: In this section, consider the types of changes that occur in the
neural network, parameters, data, etc, over a range of timescales:

   * evolutionary change such as a genetic algorithm operating over network
     parameters;
   * generational change such as networks training the next generation of
     networks;
   * development and aging such as adding or removing units;
   * learning such as changing weights based on training data;
   * transient behaviour such as activation equation dynamics.

What types of changes occur in the selected model and how are they
implemented in the network's mechanisms? (Note that few of these aspects are
expected to apply to any one model, with many case studies focusing on
change as learning.)
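Since many case studies will focus on change as learning, here is a minimal sketch (ours, with an arbitrary illustrative task) separating two of the timescales above: activations change transiently within a single forward pass, while weights change more slowly, over the course of training, under the delta rule.

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(0, 0.1, 2)  # weights: change slowly, via learning

# Illustrative task: learn the (exactly linear) target t = x1 + x2.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 2.])

lr = 0.1
for _ in range(200):                    # learning timescale: epochs
    for x, target in zip(X, t):
        y = w @ x                       # transient timescale: activation
        w += lr * (target - y) * x      # delta rule: weight change
```

After training, the weights approach [1, 1]; the only "change" mechanisms in this model are activation dynamics and weight learning, with no evolutionary, generational or developmental change.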

Structure: In this section, consider how structured information in the
environment is represented as structured information in the network (e.g.,
an implicit grammar in training data can be encoded in the hidden-unit space
of a recurrent net; and higher-order bindings can be stored using tensors or
phase synchrony). What structure is coded directly into the architecture of
the network? Is the network partitioned into modules to directly encode
structure? What generalization is the network capable of? Is this
generalization due to direct coding of structure, or is it learned from the
training data? There has been an ongoing debate in the ANN literature on
the generalization abilities of networks. Are there important aspects of
structure that cannot be represented, learned or generalized by the network?
Where possible, identify structure in the environment that may be expected
to be reflected in the ANN model but is not.
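One of the binding mechanisms mentioned above can be sketched directly (our illustration, with hypothetical role and filler names): tensor-product binding stores role/filler structure as a sum of outer products, and unbinding with a role vector recovers its filler exactly when the role vectors are orthonormal.

```python
import numpy as np

# Orthonormal role vectors and arbitrary filler vectors (illustrative names).
role_agent = np.array([1., 0.])
role_patient = np.array([0., 1.])
filler_john = np.array([1., 0., 0.])
filler_mary = np.array([0., 1., 0.])

# Bind "John loves Mary" as a sum of role (x) filler outer products.
binding = (np.outer(role_agent, filler_john)
           + np.outer(role_patient, filler_mary))

# Unbind: project with a role vector to ask "who is the agent?"
agent = role_agent @ binding
```

Here structure is coded directly into the representational scheme rather than learned, which is exactly the kind of distinction the questions above are meant to draw out.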

Discussion and Conclusions: In the final section, discuss how the functional
components reviewed in the previous sections combine to explain the
cognitive phenomena targeted by the case study.

Commentary Format

The commentary section of the workshop is provided as an outlet for
interpretation, elaboration, and substantive criticism of case studies. It
is included as part of the workshop format to complement the case studies,
which are intended to be compact and focussed on the workshop aim of
distilling ANN models into their functional components. Each commentary
should discuss one or more case studies and have a maximum length of 1000
words including references.

----------------------------------------------------------------------------
