NIPS*2000 workshop information

Rich Caruana caruana+ at cs.cmu.edu
Wed Oct 4 10:37:21 EDT 2000



          * * *      Post-NIPS*2000 Workshops      * * *
          * * *       Breckenridge, Colorado       * * *
          * * *         December 1-2, 2000         * * *


The NIPS*2000 Workshops will be held Friday and Saturday, December 1
and 2, in Breckenridge, Colorado, after the NIPS conference in Denver,
Monday-Thursday, November 27-30.  This year there are 18 workshops:

 - Affective Computing
 - Algorithms and Technologies for Neuroprosthetics and Neurorobotics
 - Computation in the Cortical Column
 - Computational Molecular Biology
 - Computational Neuropsychology
 - Cross-Validation, Bootstrap, and Model Selection
 - Data Fusion -- Theory and Applications
 - Data Mining and Learning on the Web
 - Explorative Analysis and Data Modeling in Functional Neuroimaging
 - Geometric Methods in Learning Theory
 - Information and Statistical Structure in Spike Trains
 - Learn the Policy or Learn the Value-Function?
 - New Perspectives in Kernel-based Learning Methods
 - Quantum Neural Computing
 - Representing the Structure of Visual Objects
 - Real-Time Modeling for Complex Learning Tasks
 - Software Support for Bayesian Analysis Systems
 - Using Unlabeled Data for Supervised Learning

All workshops are open to all registered attendees.  Many workshops
also invite submissions.  Submissions, and questions about individual
workshops, should be directed to each workshop's organizers.  Included
below is a short description of each workshop.  Additional information
is available at the NIPS*2000 Workshop Web Page:

   http://www.cs.cmu.edu/Groups/NIPS/NIPS2000/Workshops/ 

Information about registration, travel, and accommodations for the
main conference and the workshops is available at:

   http://www.cs.cmu.edu/Web/Groups/NIPS/

Breckenridge is a ski resort a few hours' drive from Denver.  The daily
workshop schedule is designed to allow participants to ski half days
or enjoy other extracurricular activities.  Some may wish to extend
their visit to take advantage of the relatively low pre-season rates.

We look forward to seeing you in Breckenridge.

Rich Caruana and Virginia de Sa
   NIPS Workshops Co-chairs


* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Affective Computing

Workshop Co-Chairs: 
Javier R. Movellan, Institute for Neural Computation, UCSD
Marian Bartlett, Institute for Neural Computation, UCSD
Gary Cottrell, Computer Science, UCSD
Rosalind W. Picard, Media Lab, MIT 

Description:
The goal of this workshop is to explore and discuss the idea of
affective computers, i.e., computers that have the ability to express
emotions, recognize emotions, and whose behavior is modulated by
emotional states.  Emotions are a fundamental part of human
intelligence. It may be argued that emotions provide an "operating
system" for autonomous agents that need to handle the uncertainty of
natural environments in a flexible and efficient manner. Connectionist
models of emotion dynamics have been developed (Velasquez, 1996)
providing examples of computational systems that incorporate emotional
dynamics and that are being used in actual autonomous agents (e.g.,
robotic pets). Emotional skills, especially the ability to recognize
and express emotions, are essential for natural communication between
humans, and until recently, have been absent from the computer side of
the human-computer interaction. For example, autonomous teaching
agents and pet robots would greatly benefit from detecting affective
cues from the users (curiosity, frustration, insight, anger) and
adjusting to them, and also from displaying emotion appropriate to the
context. The workshop will bring together leaders in the main research
areas of affective computing: emotion recognition, emotion synthesis,
emotion dynamics, applications. Speakers from industry will discuss
current applications of affective computing, including synthesizing
facial expressions in the entertainment industry, increasing the
appeal of the pet robots through emotion recognition and synthesis,
and measuring galvanic skin response through the mouse to determine
user frustration.

Format:
This will be a one-day workshop. The speakers will be encouraged to
talk about challenges and controversial topics both in their prepared
talks and in the ensuing discussions. Since one of the goals of the
workshop is to facilitate communication between researchers in
different subfields, ample time will be given to questions. The last
part of the workshop will be devoted to a discussion of the most
promising approaches and ideas that will have emerged during the
workshop.

Contact Info:
Javier R. Movellan 
Institute for Neural Computation 
University of California San Diego 
La Jolla, CA 92093-0515 
movellan at inc.ucsd.edu 

Marian Stewart Bartlett 
Institute for Neural Computation 
University of California San Diego 
La Jolla, CA 92093-0515 
marni at inc.ucsd.edu

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Algorithms, Technologies, and Neural Representations for
Neuroprosthetics and Neurorobotics

Organizers: Simon F Giszter and Karen A Moxon

http://neurobio.mcphu.edu/GiszterWeb/nips_ws_2k.html

Goals and objectives:
The goal of the workshop is to bring together researchers interested
in neuroprosthetics and neurorobotics and intermediate representations
in the brain with a view to generating a lively discussion of the
design principles for a brain to artificial device interface.

Speakers will be charged to address (favourably or unfavourably) the
idea that the nervous system is built around, or dynamically
organizes, low dimensional representations which may be used in (or
need to be designed into) neuroprosthetic interfaces and controllers.

Some current prosthetics are built around explicit motor
representations, e.g., kinematic plans. Though controversial, the
notion of primitives and low dimensional representations of input and
output is gaining favor. These may or may not contain or be used in
explicit plans.  It is very likely that the appropriate choices of
sensory and motor representations and motor elements are critical for
the design of an integrated sensorimotor prosthesis that enables
rapid adaptive learning and the creative construction, planning, and
execution of new motions. With a burgeoning interest in neuroprosthetics
it is therefore timely to address how the interfaces to
neuroprostheses should be conceptualized: what representations should
be extracted, what control elements should be provided, and how these
should be integrated.

We hope to engage a wide range of perspectives to address needs for
research and the possibilities enabled by neuroprosthetics. We intend
to assemble presentations and discussions from the perspectives of
both neural data and theory, of new technologies and algorithms, and
of applications or experimental approaches enabled by new and current
technologies.

Anticipated or fully confirmed speakers/participants:

John Chapin: neurorobotics
Nikos Hatzopoulos: neural coding in cortex of primates
Scott Makeig or Terry Sejnowski: EEG-based controllers and representations
James Abbas: peripheral FES with CPG models
Warren Grill and Michel Lemay: intraspinal FES and force-fields
Karen Moxon: sensory prostheses
Gerry Loeb: intramuscular prostheses and spinal controls
Simon Giszter: spinal primitives and interfaces
Emo Todorov: cortical encoding and representation
Igo Krebs: rehabilitation with robots

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

COMPUTATION IN THE CORTICAL COLUMN

Organizers:  Henry Markram and Jennifer Linden
URL: http://www.keck.ucsf.edu/~linden/ColumnWorkshop.html

Understanding computation in the cortical column is a holy grail for
both experimental and theoretical neuroscience.  The basic six-layered
neocortical columnar microcircuit, implemented most extensively (and
perhaps in its most sophisticated form) in the human brain, supports a
huge variety of sensory, cognitive, and motor functions. The secret
behind the incredible flexibility and power of cortical columns has
remained elusive, but new insights are emerging from several
different areas of research.  It is a great time for cortical
anatomists, physiologists, modellers, and theoreticians to join forces
in attempting to decipher computation in the cortical column.

In this workshop, leading experimental and theoretical neuroscientists
will present their own visions of computation in the cortical column,
and will debate their views with an interdisciplinary audience.
During the morning session, speakers and panel members will analyze
columnar computation from their perspectives as authorities on the
anatomy, physiology, evolution, and network properties of cortical
microcircuitry.  Speakers and panelists in the afternoon session will
consider the functional significance of the cortical column in light
of their expert knowledge of two columnar systems which have attracted
intensive experimental attention to date: the visual cortex of cats
and primates, and the barrel cortex of rodents.

The goal of the workshop will be to define answers to four questions.
ANATOMY: Does a common denominator, a repeating microcircuit element,
exist in all neocortex?  PHYSIOLOGY: What are the electrical dynamics,
the computations, of the six-layered cortical microcircuit?  FUNCTION:
How do cortical columns contribute to perception?  EVOLUTION: How does
the neocortex confer such immense adaptability?

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Computational Molecular Biology

Organizers:

   Tommi Jaakkola, MIT
   Nir Friedman, Hebrew University

For more information contact the workshop organizers at:

   tommi at ai.mit.edu
   nir at cs.huji.ac.il

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

NIPS*2000 Workshop on Computational Neuropsychology

Workshop Organizers:  Sara Solla
	              Northwestern University

	              Michael Mozer
	              University of Colorado

	              Martha Farah
	              University of Pennsylvania

The 1980's saw two important developments in the sciences of the mind:  The
development of neural network models in cognitive psychology, and the rise of
cognitive neuroscience. In the 1990's, these two separate approaches
converged, and one of the results was a new field that we call "Computational
Neuropsychology." In contrast to traditional cognitive neuropsychology,
computational neuropsychology uses the concepts and methods of computational
modeling to infer the normal cognitive architecture from the behavior of
brain-damaged patients. In contrast to traditional neural network modeling in
psychology, computational neuropsychology derives constraints on network
architectures and dynamics from functional neuroanatomy and neurophysiology.
Unfortunately, work in computational neuropsychology has had relatively little
contact with the Neural Information Processing Systems (NIPS) community.  Our
workshop aims to expose the NIPS community to the unusual patient cases in
neuropsychology and the sorts of inferences that can be drawn from these
patients based on computational models, and to expose researchers in
computational neuropsychology to some of the more sophisticated modeling
techniques and concepts that have emerged from the NIPS community in recent
years.

We are interested in speakers from all aspects of neuropsychology, including: 
* attention (neglect) 
* visual and auditory perception (agnosia) 
* reading (acquired dyslexia) 
* face recognition (prosopagnosia) 
* memory (Alzheimer's, amnesia, category-specific deficits) 
* language (aphasia) 
* executive function (schizophrenia, frontal deficits). 

Further information about the workshop can be obtained at:
	http://www.cs.colorado.edu/~mozer/nips2000workshop.html
Contact Sara Solla (solla at nwu.edu) or Mike Mozer (mozer at colorado.edu) 
if you are interested in speaking at the workshop.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Call for Papers
NIPS*2000 Workshop: Cross-Validation, Bootstrap and Model Selection
   
Organizers:

  Rahul Sukthankar
      Compaq CRL and Robotics Institute, Carnegie Mellon 
  Larry Wasserman
      Department of Statistics, Carnegie Mellon 
  Rich Caruana
      Center for Automated Learning and Discovery, Carnegie Mellon 

Electronic Submission Deadline: October 18, 2000 (extended abstracts)

Description

   Cross-validation and bootstrap are popular methods for estimating
   generalization error based on resampling a limited pool of data,
   and have become widely used for model selection. The aim of this
   workshop is to bring together researchers from both machine learning
   and statistics in an informal setting to discuss current issues in
   resampling-based techniques. These include:
     * Improving theoretical bounds on cross-validation, bootstrap or
       other resampling-based methods;
     * Empirical or theoretical comparisons between resampling-based
       methods and other forms of model selection;
     * Exploring the issue of overfitting in sampling-based methods;
     * Efficient algorithms for estimating generalization error;
     * Novel resampling-based approaches to model selection.
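
   For concreteness, here is a minimal sketch of k-fold cross-validation
   used for model selection (Python/NumPy; the toy data, the ridge models,
   and all function names are illustrative only and not part of any
   required submission format):

    import numpy as np

    def k_fold_cv_error(X, y, fit, predict, k=5, seed=0):
        """Estimate generalization error by k-fold cross-validation."""
        n = len(y)
        idx = np.random.RandomState(seed).permutation(n)
        folds = np.array_split(idx, k)
        errors = []
        for i in range(k):
            test = folds[i]
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            model = fit(X[train], y[train])
            errors.append(np.mean((predict(model, X[test]) - y[test]) ** 2))
        return np.mean(errors)

    def make_ridge(lam):
        """Return fit/predict callables for ridge regression with penalty lam."""
        def fit(X, y):
            return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
        def predict(w, X):
            return X @ w
        return fit, predict

    # Model selection: pick the ridge penalty with the lowest CV error.
    X = np.random.randn(100, 10)
    y = X @ np.random.randn(10) + 0.1 * np.random.randn(100)
    scores = {lam: k_fold_cv_error(X, y, *make_ridge(lam))
              for lam in (0.01, 0.1, 1.0, 10.0)}
    best_lam = min(scores, key=scores.get)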

   The format for this one-day workshop consists of invited talks, a
   panel discussion, and short presentations from accepted submissions.

   Participants are encouraged to submit extended abstracts describing
   their current research in this area. Results presented at other
   conferences are eligible, provided that they are of broad interest
   to the community and clearly identified as such. Submissions for
   workshop presentations must be received by October 18, 2000, and
   should be sent to rahuls=nips at cs.cmu.edu. Extended abstracts should
   be in Postscript or Acrobat format and 1-2 pages in length.

Contact Information

   The workshop organizers can be contacted by email at
   <rahuls=nips at cs.cmu.edu>, or at the phone/fax numbers listed below.

     Organizer           Email              Phone           Fax
   Rahul Sukthankar rahuls at cs.cmu.edu  +1-617-551-7694 +1-617-551-7650
   Larry Wasserman  larry at stat.cmu.edu +1-412-268-8727 +1-412-268-7828
   Rich Caruana     caruana at cs.cmu.edu +1-412-268-7664

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Workshop Title:  

         Data Fusion -- Theory and Applications

Multisensor data fusion refers to the acquisition, processing, and
synergistic combination of information gathered by various knowledge
sources and sensors to provide a better understanding of the phenomenon
under consideration. The concept of fusion underlies many information
processing mechanisms in machines and in biological systems.
In biological/perceptual systems, information fusion seems to account
for remarkable performance and robustness when confronted with a variety of
uncertainties. The complexity of fusion processes is due to many
factors, including the uncertainties associated with different
information sources and the complementarity of individual sources. For
example, modeling, processing, fusion, and interpretation of diverse
sensor data for knowledge assimilation and inferencing pose challenging
problems, especially when the available information is incomplete,
inconsistent, and/or imprecise.

The potential for significantly enhanced performance and robustness
has motivated vigorous ongoing research in both biological and
artificial multisensor data fusion algorithms, architectures, and
applications.  Such efforts deal with fundamental issues including the
modeling process, architectures and algorithms, information extraction,
the fusion process, optimization of fused performance, and real-time
(dynamic) fusion.
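
As one of the simplest concrete instances of fusion under uncertainty,
the sketch below (Python/NumPy, illustrative only; the sensor readings
and noise levels are made up) combines two noisy estimates of the same
quantity by inverse-variance weighting, assuming independent, unbiased
Gaussian sources:

    import numpy as np

    def fuse(estimates, variances):
        """Minimum-variance linear fusion of independent, unbiased estimates."""
        w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
        fused = np.sum(w * np.asarray(estimates)) / np.sum(w)
        fused_var = 1.0 / np.sum(w)   # never larger than the best single sensor
        return fused, fused_var

    # Two sensors observing the same temperature with different noise levels.
    value, var = fuse(estimates=[20.3, 21.1], variances=[0.5, 2.0])
    # The fused value is pulled toward the more reliable (lower-variance) sensor.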

The goal of this workshop is to bring together researchers from
diverse fields (learning, human-computer interaction, vision, speech,
neurobiology, etc.) to discuss both theoretical and application issues
that are relevant across different fields. It also aims to make the NIPS
community aware of the various aspects and current status of this field,
as well as the problems that remain unsolved.


We are calling for participation. Submissions should be sent to
the workshop organizers.


Workshop Organizers:

         Misha Pavel

               pavel at ece.ogi.edu
               (503)748-1155 (o)
               Dept. of Electrical and Computer Engineering
               Oregon Graduate Institute of Science and Technology
               20000 NW Walker Road
               Beaverton, OR 97006

         Xubo Song

               xubosong at ece.ogi.edu
               (503) 748-1311 (o)
               Dept. of Electrical and Computer Engineering
               Oregon Graduate Institute of Science and Technology
               20000 NW Walker Road
               Beaverton, OR 97006

Workshop Web Page:
             http://www.ece.ogi.edu/~xubosong/FusionWorkshop.html

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Data Mining and Learning on the Web

Organizers: Gary William Flake (flake at research.nj.nec.com),
	    Frans Coetzee (coetzee at research.nj.nec.com), and
	    David Pennock (dpennock at research.nj.nec.com)

No doubt about it, the web is big.  So big, in fact, that many
classical algorithms for databases and graphs cannot scale to the
distributed multi-terabyte anarchy that is the web.  How, then, do we
best use, mine, and model this rich collection of data?  Arguably, the
best approach to developing scalable non-trivial applications and
algorithms is to exploit the fact that, both as a graph and as a
database, the web is highly non-random.  Furthermore, since the web is
mostly created and organized by humans, the graph structure (in the
form of hyperlinks) encodes aspects of the content, and vice-versa.

We will discuss methods that exploit these properties of the web.  We
will further consider how many of the classical algorithms, which were
formulated to minimize worst-case performance over all possible
problem instances, can be adapted to the more regular structure of the
web.  Finally, we will attempt to identify major open research
directions.
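
To make the notion of exploiting link structure concrete, here is a
small, hedged sketch (Python/NumPy) of power iteration on a toy
hyperlink graph, in the spirit of PageRank-style link analysis; the
graph, the damping value, and the function name are made up and not
drawn from any of the talks:

    import numpy as np

    # Toy hyperlink graph: adjacency[i, j] = 1 if page i links to page j.
    adjacency = np.array([[0, 1, 1, 0],
                          [0, 0, 1, 0],
                          [1, 0, 0, 1],
                          [0, 0, 1, 0]], dtype=float)

    def link_scores(adjacency, damping=0.85, iters=50):
        """Power iteration on a damped link-transition matrix."""
        n = adjacency.shape[0]
        out_degree = adjacency.sum(axis=1, keepdims=True)
        transition = adjacency / np.where(out_degree == 0, 1, out_degree)
        scores = np.full(n, 1.0 / n)
        for _ in range(iters):
            scores = (1 - damping) / n + damping * transition.T @ scores
        return scores / scores.sum()

    print(link_scores(adjacency))   # pages with more incoming links score higher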

This workshop will be organized into three mini-sessions: (1)
Systematic Web Regularities, (2) Web Mining Algorithms, and (3)
Inferable Web Regularities.  All speakers are invited and a partial
list of confirmed speakers includes: Albert-László Barabási, Justin
Boyan, Rich Caruana, Soumen Chakrabarti, Monika Henzinger, Ravi Kumar,
Steve Lawrence, and Andrew McCallum.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

EXPLORATIVE ANALYSIS AND DATA MODELING IN FUNCTIONAL NEUROIMAGING:
Arriving at, not starting with a hypothesis

Advanced analysis methods and models of neuroimaging data are only
marginally represented at the big international brain mapping
meetings. This contrasts with the broad belief in the neuroimaging
community that these approaches are crucial to the further development
of the field. The purpose of this NIPS workshop is to bring together
theoreticians developing and applying new methods of neuroimaging data
interpretation.

The workshop focuses on explorative analysis (a) and modeling
(b) of neuroimaging data:

a) Higher-order explorative analysis (for example: ICA/PCA, clustering 
   algorithms)  can reveal data properties in a data-driven, not 
   hypothesis-driven manner.

b) Models for neuroimaging data can guide the data interpretation. 
   The universe of possible functional hypotheses can be constrained 
   by models linking the data to other empirical data, for instance 
   anatomical connectivity (pathway analysis to yield effective 
   connectivity), encephalography data, or to behavior (computational 
   function).  
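
To make the data-driven analyses in (a) concrete, here is a minimal
PCA-by-SVD sketch on a simulated time-by-voxel data matrix (Python/NumPy;
purely illustrative, and of course real neuroimaging pipelines involve
far more preprocessing than shown here):

    import numpy as np

    # Simulated data: 120 time points x 500 voxels with one hidden component.
    rng = np.random.RandomState(0)
    t = np.arange(120)
    component = np.sin(2 * np.pi * t / 30.0)              # hidden time course
    spatial_map = rng.randn(500) * (rng.rand(500) < 0.1)  # sparse "active" voxels
    data = np.outer(component, spatial_map) + 0.5 * rng.randn(120, 500)

    # PCA via SVD of the temporally centered data.
    centered = data - data.mean(axis=0)
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    explained = S**2 / np.sum(S**2)      # fraction of variance per component
    time_courses = U[:, :3] * S[:3]      # leading temporal components
    spatial_modes = Vt[:3]               # corresponding spatial maps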

The talks will introduce the new approaches; in the discussions we hope to
address the benefits and problems of the various methods and of their
possible combination. We intend to discuss not only the theory
behind the various approaches, but also their value for improved
data interpretation. Therefore, we strongly encourage the participation
of neuroimaging experimenters concerned with functional paradigms
suggesting the use of nonstandard data interpretation methods.

More information can be found on the workshop webpage:

  http://www.informatik.uni-ulm.de/ni/staff/FSommer/workshops/nips_ws00.html

For any requests please contact the workshop organizers:

  Fritz Sommer and Andrzej Wichert
  Department of Neural Information Processing
  University of Ulm
  D-89069 Ulm
  Germany
  Tel. 49(731)502-4154
       49(731)502-4257
  FAX  49(731)502-4156

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Geometric and Quantum Methods in Learning
Organization: Shunichi Amari, Amir Assadi (Chair), and Tomaso Poggio

Description. The purpose of this workshop is to attract the attention of 
the learning community to geometric methods and to take on an endeavor to:
1. lay out a geometric paradigm for formulating profound ideas in learning;
2. facilitate the development of geometric methods suitable for the
investigation of new ideas in learning theory. Today's continuing advances 
in computation make it possible to infuse geometric ideas into learning 
that would otherwise have been computationally prohibitive. Quantum 
computation has created great excitement, offering a broad spectrum of new 
ideas for discovery of parallel-distributed algorithms, a hallmark of 
learning theory. In addition, geometry and quantum computation together 
offer a more profound picture of the physical world, and how it interacts 
with the brain, the ultimate learning system. Among the discussion topics, 
we envision the following: Information geometry, differential topological 
and quantum methods for turning local estimates into global quantities and 
invariants, Riemannian geometry and Feynman path integration as a framework 
to explore nonlinearity, and information theory of massive data sets. We 
will also examine the potential impact of learning theory on future 
development of geometry, and examples of how quantum computation has opened 
new vistas on design of parallel-distributed algorithms. The participants 
of the Workshop on Quantum Computation will find this workshop's geometric 
ideas beneficial for the theoretical aspects of quantum algorithms and 
quantum information theory.    We plan to prepare a volume based on the 
materials for the workshops and other contributions to be proposed to the 
NIPS Program Committee.

Contact Information
Amir Assadi
University of Wisconsin-Madison. URL: www.cms.wisc.edu/~cvg
E-mail: ahassadi at facstaff.wisc.edu

Partial List of Speakers and Panelists
Shun-Ichi Amari
Amir Assadi
Zubin Ghahramani
Geoffrey Hinton
Tomaso Poggio
Jose Principe
Scott Makeig
Naoki Saito (tentative)

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Information and Statistical Structure in Spike Trains:
  How can we calculate what we really want to know?

Organizer: Jonathan D. Victor
jdvicto at med.cornell.edu

Advances in understanding how neurons represent and manipulate information
in their spike trains will require a combination of appropriate
theoretical, computational, and experimental strategies. The workshop has
several aims: (1) By presenting currently available methods in a
tutorial-like fashion, we hope to lower the energy barrier to
experimentalists who are interested in using information-theoretic and
related approaches, but have not yet done so. The presentation  of current
methods is to be done in a manner that emphasizes the theoretical
underpinnings of different strategies and the assumptions and tradeoffs
that they make. (2) By providing a forum for open discussion among current
practitioners, we hope to make progress towards understanding the
relationships of the available techniques, guidelines for their
application, and the basis of the differences in findings across
preparations. (3) By presenting the (not fully satisfactory) state of the
art to an audience that includes theorists, we hope to spur progress
towards the development of better techniques, with a particular emphasis on
exploiting more refined hypotheses for spike train structure, and
developing techniques that are applicable to multi-unit recordings.
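
For concreteness, here is a minimal sketch of the simplest quantity in
this family: a plug-in (maximum-likelihood) entropy estimate of a
spike-count distribution (Python/NumPy; the trial counts are made up).
The plug-in estimator is known to be biased for limited data, which is
exactly the kind of issue the workshop will discuss; mutual information
estimates are then built from differences of such entropies.

    import numpy as np

    def plugin_entropy_bits(counts):
        """Naive (plug-in) entropy estimate of a discrete response distribution."""
        counts = np.asarray(counts, dtype=float)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Spike counts observed in repeated trials of one stimulus (made-up numbers).
    trial_counts = [0, 1, 1, 2, 0, 3, 1, 2, 1, 0]
    values, freq = np.unique(trial_counts, return_counts=True)
    h = plugin_entropy_bits(freq)   # entropy of the spike-count histogram, in bits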

A limited number of slots are available for contributed presentations.
Individuals interested in presenting a talk (approximately 20 minutes, with
10 to 20 minutes for discussion) should submit a title and abstract,
200-300 words, to the organizer by October 22, 2000. Please indicate
projection needs (overheads, 2x2 slides, LCD
data projector). 

For further information, please see
http://www-users.med.cornell.edu/~jdvicto/nips2000.html

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Title:
======
Reinforcement Learning: Learn the Policy or Learn the Value-Function?

Organising Committee
====================

Peter Bartlett  (Peter.Bartlett at anu.edu.au)
Jonathan Baxter (jbaxter at whizbang.com)
David McAllester (dmac at research.att.com)

Home page
=========
http://csl.anu.edu.au/~bartlett/rlworkshop

Workshop Outline
================

There are essentially three main approaches to reinforcement learning
in large state spaces:

1) Learn an approximate value function and use that to generate a
   policy,

2) Learn the parameters of the policy directly, typically using a
   Monte-Carlo estimate of the performance gradient, and

3) "Actor-Critic" methods that seek to combine the best features of 1)
   and 2).
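
As a deliberately minimal illustration of approach 2, the sketch below
(Python/NumPy; the toy bandit problem, step size, and variable names are
illustrative only, not taken from any submission) computes
REINFORCE-style Monte-Carlo policy-gradient updates for a softmax
policy on a two-armed bandit:

    import numpy as np

    rng = np.random.RandomState(0)
    true_rewards = np.array([0.2, 0.8])   # expected reward per action (unknown)
    theta = np.zeros(2)                   # softmax policy parameters

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    # REINFORCE: theta += alpha * reward * grad log pi(action)
    alpha = 0.1
    for _ in range(2000):
        probs = softmax(theta)
        action = rng.choice(2, p=probs)
        reward = float(rng.rand() < true_rewards[action])   # Bernoulli reward
        grad_log_pi = -probs
        grad_log_pi[action] += 1.0          # gradient of log softmax probability
        theta += alpha * reward * grad_log_pi

    print(softmax(theta))   # probability mass shifts toward the better action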

There has been a recent revival of interest in this area, with many
new algorithms being proposed in the past two years. It seems the time
is right to bring together researchers for an open discussion of the
three different approaches.

Submissions are sought on any topic of related interest, such as new
algorithms for reinforcement learning, but we are particularly keen to
solicit contributions that shed theoretical or experimental light on
the relative merits of the three approaches, or that provide a
synthesis or cross-fertilization between the different disciplines.

Format
======
There will be invited talks and a series of short contributed talks
(15 minutes), with plenty of discussion time. If you are interested in
presenting at the workshop, please send a title and short abstract to
jbaxter at whizbang.com

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

New Perspectives in Kernel-based Learning Methods
Nello Cristianini, John Shawe-Taylor, Bob Williamson
http://www.cs.rhbnc.ac.uk/colt/nips2000.html

Abstract:

The aim of the workshop is to present new perspectives and new
directions in kernel methods for machine learning.

Recent theoretical advances and experimental results have drawn
considerable attention to the use of kernel functions in learning
systems.  Support Vector Machines, Gaussian Processes, kernel PCA,
kernel Gram-Schmidt, Bayes Point Machines, and Relevance and Leverage
Vector Machines are just some of the algorithms that make crucial use
of kernels for problems of classification, regression, density
estimation, novelty detection and clustering.

At the same time as these algorithms have been under development, novel
techniques specifically designed for kernel-based systems have resulted
in methods for assessing generalisation, implementing model selection,
and analysing performance.  The choice of model may be simply determined
by parameters of the kernel, as for example the width of a Gaussian
kernel. More recently, however, methods for designing and combining
kernels have created a toolkit of options for choosing a kernel in a
particular application.  These methods have extended the applicability
of the techniques beyond the natural Euclidean spaces to more general
discrete structures.

The field is witnessing growth on a number of fronts, with the
publication of books, editing of special issues, and organization of
special sessions and web-sites.  Moreover, a convergence of ideas and
concepts from different disciplines is occurring. The growth is
concentrated in four main directions:
1) design of novel kernel-based algorithms 
2) design of novel types of kernel functions 
3) development of new learning theory concepts 
4) application of the techniques to new problem areas 
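
For readers new to the area, here is a minimal, self-contained sketch
(Python/NumPy, purely illustrative) of two ingredients most of the
algorithms above share: a Gaussian (RBF) kernel Gram matrix and a simple
kernel ridge regressor built on it. The function names and toy data are
assumptions of this example, not part of any particular system.

    import numpy as np

    def rbf_kernel(X, Z, width=1.0):
        """Gaussian kernel matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 width^2))."""
        sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2.0 * width ** 2))

    def kernel_ridge_fit(X, y, width=1.0, ridge=1e-2):
        """Dual coefficients alpha solving (K + ridge * I) alpha = y."""
        K = rbf_kernel(X, X, width)
        return np.linalg.solve(K + ridge * np.eye(len(y)), y)

    def kernel_ridge_predict(X_train, alpha, X_test, width=1.0):
        return rbf_kernel(X_test, X_train, width) @ alpha

    # Toy regression problem: learn sin(x) from noisy samples.
    rng = np.random.RandomState(0)
    X = rng.uniform(-3, 3, size=(40, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.randn(40)
    alpha = kernel_ridge_fit(X, y, width=0.7)
    y_hat = kernel_ridge_predict(X, alpha, np.linspace(-3, 3, 9)[:, None], width=0.7)

Note how the model here is controlled entirely through the kernel width
and the ridge parameter, the kind of choice the model-selection methods
mentioned above are designed to address.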


Extended abstracts may be submitted before October 30th to
nello at dcs.rhbnc.ac.uk

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Quantum Neural Computing

http://web.physics.twsu.edu/behrman/NIPS.htm

Recently there has been a resurgence of interest in quantum computers
because of their potential for being very much smaller and faster than
classical computers, and because of their ability in principle to do
heretofore impossible calculations, such as factorization of large
numbers in polynomial time. This workshop will explore ways to implement
quantum computing in network topologies, thus exploiting both the
intrinsic advantages of quantum computing and the adaptability of neural
computing.
Aspects/approaches to be explored will include: quantum hardware, e.g.,
NMR, quantum dots, and molecular computing; theoretical and practical
limits to quantum and quantum neural computing, e.g., noise and
measurability; and simulations. Targeted groups: computer scientists,
physicists and mathematicians interested in quantum computing and
next-generation computing hardware.

Invited speakers will include:

Paul Werbos, NSF Program Director, Control, Networks & Computational
Intelligence Program, Electrical and Communications Systems Division,
who will keynote the workshop.

Thaddeus Ladd, Stanford, "Crystal lattice quantum computation."

Mitja Perus, Institute BION, Stegne 21, SI-1000 Ljubljana, Slovenia,
"Quantum associative nets: A new phase processing model"

Ron Spencer, Texas A&M University, "Spectral associative memories."

E.C. Behrman, J.E. Steck, and S.R. Skinner, Wichita State University,
"Simulations of quantum neural networks."

Dan Ventura, Penn State, "Linear optics implementation of quantum
algorithms."

Ron Chrisley, TBA

Send contributed papers, by October 20th,  to:

Co-chairs:  Elizabeth C. Behrman behrman at wsuhub.uc.twsu.edu
            James E. Steck steck at bravo.engr.twsu.edu

This Workshop is partially supported by the National Science Foundation,
Grant #ECS-9820606.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Workshop title:	Representing the Structure of Visual Objects

Web page:	http://kybele.psych.cornell.edu/~edelman/NIPS00/index.html

Organizers: Nathan Intrator   (Nathan_Intrator at brown.edu)
	    Shimon Edelman    (se37 at cornell.edu)

Confirmed invited speakers:

          Ron Chrisley (Sussex)
	  John Hummel (UCLA)			
	  Christoph von der Malsburg (USC)	
	  Pietro Perona (Caltech)			
	  Tomaso Poggio (MIT)			
	  Greg Rainer (Tuebingen)			
	  Manabu Tanifuji (RIKEN)			
	  Shimon Ullman (Weizmann)	

Description:

The focus of theoretical discussion in visual object processing has
recently started to shift from problems of recognition and
categorization to the representation of object structure. The main
challenges there are productivity and systematicity, two traits
commonly attributed to human cognition.  Intuitively, a cognitive
system is productive if it is open-ended, that is, if the set of
entities with which it can deal is, at least potentially,
infinite. Systematicity, even more than productivity, is at the crux
of the debate focusing on the representational theory of mind.  A
visual representation could be considered systematic if a well-defined
change in the spatial configuration of the object (e.g., swapping top
and bottom parts) were to cause a principled change in the
representation (the representations of top and bottom parts are
swapped).  In vision, this issue (as well as compositionality,
commonly seen as the perfect means of attaining systematicity) is, at
present, wide open. The workshop will start with an introductory
survey of the notions of productivity, systematicity and
compositionality, and will consist of presentations by the proponents
of some of the leading theories in the field of structure
representation, interspersed with open-floor discussion.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Title: Real-Time Modeling for Complex Learning Tasks

The goal of this workshop is to develop a better understanding of how to create
new statistical learning techniques that can deal with complex, high-dimensional
data sets, where (possibly redundant and/or irrelevant) data is received
continuously from sensors and needs to be incorporated in learning models that may
have to change their structure during learning under real-time constraints.

The workshop aims at bringing together researchers from various theoretical
learning frameworks (Bayesian methods, nonparametric statistics, kernel methods,
Gaussian processes, etc.) and application domains to discuss future research
directions for principled approaches towards real-time learning. For further
details, please refer to the URL: http://www-slab.usc.edu/events/NIPS2000
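
As one hedged illustration of the kind of incremental, constant-cost-per-sample
update such settings call for, the sketch below implements recursive least
squares with a forgetting factor (Python/NumPy; an assumed example, not a
description of any participant's method):

    import numpy as np

    class RecursiveLeastSquares:
        """Incremental linear regression: O(d^2) work per sample, no stored data."""
        def __init__(self, dim, forgetting=0.99):
            self.w = np.zeros(dim)
            self.P = np.eye(dim) * 1e3   # inverse-covariance-like matrix
            self.lam = forgetting        # < 1 discounts old samples (non-stationarity)

        def update(self, x, y):
            Px = self.P @ x
            gain = Px / (self.lam + x @ Px)
            self.w += gain * (y - x @ self.w)
            self.P = (self.P - np.outer(gain, Px)) / self.lam

    # Streaming use: one update per incoming sensor sample.
    rng = np.random.RandomState(0)
    model = RecursiveLeastSquares(dim=3)
    true_w = np.array([1.0, -2.0, 0.5])
    for _ in range(500):
        x = rng.randn(3)
        model.update(x, true_w @ x + 0.05 * rng.randn())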

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Software Support for Bayesian Analysis Systems

URL: http://ase.arc.nasa.gov/nips2000

Bayesian analysis is an established technique for many data analysis
applications. The development of application software for any specific
analysis problem, however, is a difficult and time-consuming task.
Programs must be tailored to the specific problem, need to represent
the given statistical model correctly, and should preferably run efficiently.
Over the last years, a variety of libraries, shells, and synthesis
systems for Bayesian data analysis have been implemented that are
intended to simplify application software development. The goal of this
workshop is to bring the developers of such generic Bayesian software
packages and tools (e.g., JavaBayes, AutoClass, BayesPack, BUGS, BayesNet
Toolbox, PDP++) together with the developers of generic algorithm schemas
(more recent schemas amenable to automation include structural EM, the
Fisher kernel method, mean-field methods, etc.) and with software
engineering experts. It is intended as a forum to discuss and exchange
the different technical approaches, for example the usage of libraries,
the interpretation of statistical models (e.g., Gibbs sampling), or
software synthesis based on generic algorithm schemas. The workshop aims
to discuss the potential and problems of generic tools for the
development of efficient Bayesian data analysis software tailored
towards specific applications.
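
As a small, hedged example of the kind of algorithm schema such tools
instantiate, here is a hand-written Gibbs sampler for the mean and
precision of a normal model with conditionally conjugate priors
(Python/NumPy; all prior settings and data are made up, and this is
exactly the sort of problem-specific code one would prefer to generate
automatically or obtain from a library):

    import numpy as np

    rng = np.random.RandomState(0)
    y = rng.normal(loc=2.0, scale=0.5, size=50)   # observed data
    n = len(y)

    # Priors: mu ~ N(m0, s0^2), tau ~ Gamma(a0, rate=b0), where tau = 1/variance.
    m0, s0, a0, b0 = 0.0, 10.0, 1.0, 1.0

    mu, tau = 0.0, 1.0
    samples = []
    for it in range(5000):
        # Conditional for tau given mu: Gamma(a0 + n/2, rate = b0 + 0.5*sum((y-mu)^2))
        rate = b0 + 0.5 * np.sum((y - mu) ** 2)
        tau = rng.gamma(a0 + n / 2.0, 1.0 / rate)   # numpy uses scale = 1/rate
        # Conditional for mu given tau: normal with precision-weighted mean
        prec = 1.0 / s0**2 + n * tau
        mean = (m0 / s0**2 + tau * y.sum()) / prec
        mu = rng.normal(mean, np.sqrt(1.0 / prec))
        if it >= 1000:                              # discard burn-in samples
            samples.append((mu, tau))

    mu_hat = np.mean([s[0] for s in samples])       # posterior mean estimate of mu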

If you are planning to attend this workshop as a participant and/or are
interested in presenting your work, please send a short (1-4 pages) system
description, technical paper, or position paper to fisch at ptolemy.arc.nasa.gov
no later than Wednesday, October 18, 2000.

Preliminary PC:                 Organizers: 
L. Getoor, Stanford             W. Buntine, Dynaptics
P. Smyth, UC Irvine             B. Fischer, RIACS/NASA Ames 
M. Turmon, JPL                  J. Schumann, RIACS/NASA Ames 
K. Murphy, UC Berkeley

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Using Unlabeled Data for Supervised Learning: NIPS 2000 Workshop and Competition!

We are pleased to announce the NIPS 2000 Unlabeled Data Supervised
Learning Competition!  This competition is designed to compare algorithms
and architectures that use unlabeled data to help supervised
learning, and will culminate in a NIPS workshop, where approaches and
results will be compared.

Round three begins soon, so don't delay (it is also still possible to
submit results for rounds 1 and 2).

More details are now available at the competition web-site:

        http://q.cis.uoguelph.ca/~skremer/NIPS2000/

May the best algorithm win!

        Stefan, Deb, and Kristin

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *



