From schmidhu at tumult.informatik.tu-muenchen.de Tue May 1 04:15:12 1990 From: schmidhu at tumult.informatik.tu-muenchen.de (Juergen Schmidhuber) Date: Tue, 1 May 90 10:15:12 +0200 Subject: New FKI-Reports Message-ID: <9005010815.AA23674@kiss.informatik.tu-muenchen.de> Two new reports on spatio-temporal credit assignment in neural networks for adaptive control are available. LEARNING TO GENERATE FOCUS TRAJECTORIES FOR ATTENTIVE VISION FKI-REPORT 128-90 Juergen Schmidhuber and Rudolf Huber One motivation of this paper is to replace the often unsuccessful and inefficient purely static `neural' approaches to visual pattern recognition by a more efficient sequential approach. The latter is inspired by the observation that biological systems employ sequential eye-movements for pattern recognition. The other motivation is to demonstrate that there is at least one principle which can lead to the LEARNING of dynamic selective spatial attention. A system consisting of an adaptive `model network' interacting with a dynamic adaptive `control network' is described. The system LEARNS to generate focus trajectories such that the final position of a moving focus corresponds to a target to be detected in a visual scene. The difficulty is that no teacher provides the desired activations of `eye-muscles' at various times. The only goal information is the desired final input corresponding to the target. Thus the task involves a complex temporal credit assignment problem, as well as an attention shifting problem. It is demonstrated experimentally that the system is able to learn correct sequences of focus movements involving translations and rotations. The system also learns to track a moving target. Some implications for attentive systems in general are discussed. For instance, one can build a `mental focus' which operates on the set of internal representations of a neural system. It is suggested that self-referential systems which model the consequences of their own `mental focus shifts' open the door for introspective learning in neural networks. TOWARDS COMPOSITIONAL LEARNING IN NEURAL NETWORKS FKI-REPORT 129-90 Juergen Schmidhuber None of the existing learning algorithms for neural networks with internal and/or external feedback addresses the problem of learning by composing subprograms, of learning `to divide and conquer'. In this work it is argued that algorithms based on pure gradient descent or on temporal difference methods are not suitable for large scale dynamic control problems, and that there is a need for algorithms that perform `compositional learning'. Some problems associated with compositional learning are identified, and a system is described which attacks at least one of them. The system learns to generate sub-goals that help to achieve its main goals. This is done with the help of `time-bridging' adaptive models that predict the effects of the system's sub-programs. A simple experiment is reported which demonstrates the feasibility of the method. To obtain copies of these reports, write to Juergen Schmidhuber Institut fuer Informatik, Technische Universitaet Muenchen Arcisstr. 21 8000 Muenchen 2 GERMANY or send email to schmidhu at lan.informatik.tu-muenchen.dbp.de Only if this does not work for some reason, try schmidhu at tumult.informatik.tu-muenchen.de Please let your message look like this: subject: FKI-Reports FKI-128-90, FKI-129-90 Physical address (not more than 33 characters per line) DO NOT USE REPLY! 
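The credit-assignment scheme sketched in FKI-128-90 above - a control network trained through an adaptive model network when only the desired final input is known - can be boiled down to a few lines of linear algebra. In the toy version below, every shape, learning rate, and the linear plant are invented for illustration (the report's networks and the vision task are far richer): a model is first fitted to the plant by least squares, and the controller is then trained by passing the mismatch between the model's predicted outcome and the target back through the frozen model, since no teacher ever supplies the correct action.

    import numpy as np

    rng = np.random.default_rng(0)
    A = 0.3 * rng.normal(size=(4, 4))      # unknown "plant": x' = x + A @ u

    def plant(x, u):
        return x + A @ u

    # 1) Fit a differentiable model of the plant from random interaction.
    M = np.zeros((4, 4))                   # model network's estimate of A
    for _ in range(2000):
        x, u = rng.normal(size=4), rng.normal(size=4)
        err = plant(x, u) - (x + M @ u)
        M += 0.05 * np.outer(err, u)       # LMS update

    # 2) Train the control network through the frozen model: only the
    #    desired final input (the target) is given, never the correct action.
    W = np.zeros((4, 4))                   # controller: u = W @ (target - x)
    for _ in range(2000):
        x, target = rng.normal(size=4), rng.normal(size=4)
        e = target - x
        u = W @ e
        pred = x + M @ u                   # model's prediction of the outcome
        grad_u = M.T @ (pred - target)     # error passed back through the model
        W -= 0.1 * np.outer(grad_u, e)

    x, target = np.zeros(4), np.ones(4)
    print(np.linalg.norm(plant(x, W @ (target - x)) - target))   # near 0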
From khaines at GALILEO.ECE.CMU.EDU Tue May 1 10:48:03 1990 From: khaines at GALILEO.ECE.CMU.EDU (Karen Haines) Date: Tue, 1 May 90 10:48:03 EDT Subject: INNC - Call for Volunteers Message-ID: <9005011448.AA11682@galileo.ece.cmu.edu> *************************************************************************** INNC REQUEST FOR VOLUNTEERS July 9-13, 1990 Paris, France *************************************************************************** This is the final call for volunteers to help at the INNC conference, to be held at the Palais des Congres in Paris, France, on July 9-13, 1990. Full admittance to the conference and a copy of the proceedings are offered in exchange for your assistance throughout the conference. In general, each volunteer is expected to work one shift each day of the conference. Hours are approximately: AM shift - 7:00 am - 1:00 pm PM shift - Noon - 6:00 pm In addition, assistance may be required for the social events. Below is a description of the available positions. If you are interested in volunteering, please send me the following information: Name Address Phone number Country electronic mail address shift preference Positions are being filled on a first-come, first-served basis. If you have further questions, please feel free to contact me. Karen Haines Dept. of ECE Carnegie Mellon University Pittsburgh, PA 15213 message: (412) 362-8675 (tell me where you are calling from) email: khaines at galileo.ece.cmu.edu PLEASE NOTE!!!!!!!!!!!!!!!!!! THERE IS NO FUNDING AVAILABLE FROM THE CONFERENCE TO COVER TRAVELING/LODGING EXPENSES. Thank you, Karen Haines INNC Volunteer Coordinator Volunteer Positions (volunteers needed) - Description (please note that hours are subject to change) --------------------------------------------------------- Exhibits - Stuffing Proceedings (8) - These volunteers will be required to work Sunday 9am-6pm, Monday 8am-6pm, and Tuesday 8am-12pm. Sunday and Monday will be used to stuff proceedings into the bags. Monday/Tuesday they will double in the exhibits area, assisting the Exhibits Chair and exhibitors. Poster Session (8) - These volunteers will be responsible for assisting the presenters in putting up/taking down their posters. Shifts are AM or PM, Tues thru Thurs. (Hours - General) Conference Sessions (16) - Four technical sessions will run each morning and afternoon of the conference. Two volunteers will be used to check badges at the door for each technical session. Volunteers working the technical sessions will be assigned mornings or afternoons in groups of two. Note that they will be working with the same person each day throughout the conference. Shifts are AM or PM, Tues-Fri. (Hours - General) Exhibit Area II (4) - Two volunteers will be used to check badges at the door. Volunteers will be assigned mornings or afternoons. Shifts are AM or PM, Tues-Fri. (Hours - General) Message Center (4) - Volunteers will be responsible for the message center. Two volunteers in the morning, two in the afternoon. Shifts are AM or PM, Mon-Fri. (Hours - General) Reception at the Hotels (24) - Volunteers will be posted at 6 hotels to provide directions to the conference. Working in teams of 2, these volunteers will be required to work Sunday 9am-9pm, Monday 9am-9pm.
From noel%cs.exeter.ac.uk at nsfnet-relay.AC.UK Tue May 1 10:28:46 1990 From: noel%cs.exeter.ac.uk at nsfnet-relay.AC.UK (Noel Sharkey) Date: Tue, 1 May 90 10:28:46 BST Subject: weight spaces In-Reply-To: ray@au.oz.su.cs.cluster's message of Mon, 30 Apr 1990 19:09:43 +1000 <9004300912.26100@munnari.oz.au> Message-ID: <17159.9005010928@entropy.cs.exeter.ac.uk> I will get cracking on the intro as well - did you get abstracts? I would like to set a real deadline for going to the publisher on May 12. I think you should tell Stenning that the papers will go to the publishers on May 10th and we have to write an intro, so he is really too late. I didn't realise this when I spoke to him. (Yes, I now think you are right about the fast one.) Let's give him no choice. noel From David.Servan-Schreiber at A.GP.CS.CMU.EDU Wed May 2 12:25:46 1990 From: David.Servan-Schreiber at A.GP.CS.CMU.EDU (David.Servan-Schreiber@A.GP.CS.CMU.EDU) Date: Wed, 02 May 90 12:25:46 EDT Subject: TR available Message-ID: <6169.641665546@A.GP.CS.CMU.EDU> A Parallel Distributed Processing Approach to Behavior and Biology in Schizophrenia. Jonathan D. Cohen and David Servan-Schreiber Technical Report AIP-100 Department of Psychology and School of Computer Science Carnegie Mellon Pittsburgh, PA 15213 ABSTRACT In this paper, we illustrate the use of connectionist models to explore the relationship between biological variables and cognitive deficits in schizophrenia. In the first part of the paper, we describe schizophrenic cognitive deficits in three experimental tasks that tap attention and language processing abilities. We also review biological disturbances that have been reported involving the frontal lobes and the mesocortical dopamine system. In the second part of the paper we present three computer models, each of which simulates normal performance in one of the cognitive tasks described initially. These models were developed within the connectionist (or parallel distributed processing) framework. At the behavioral level, the models suggest that a disturbance in the processing of context can account for schizophrenic patterns of performance in both attention and language-related tasks. At the same time, the models incorporate a mechanism for processing context that can be identified with the function of prefrontal cortex, and a parameter that corresponds to the neuromodulatory effects of dopamine. A disturbance in this parameter in the component of the model corresponding to function of prefrontal cortex is sufficient to account for schizophrenic patterns of performance in all three of the cognitive tasks simulated. Thus, the models offer an explanatory mechanism linking performance deficits to a disturbance in the processing of context which, in turn, is attributed to a reduction of dopaminergic activity in prefrontal cortex. In the General Discussion, we consider the implications that these models have for our understanding of both normal and schizophrenic cognition. We conclude with a discussion of some general issues concerning the use of computer simulation models in research. This report is available at no charge. Please send requests to jc5e at andrew.cmu.edu.
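In the published model the dopamine parameter enters as a gain on the units' logistic activation function, so a single number controls how sharply a unit responds to its net input. A minimal illustration of that mechanism (illustrative values only; this is not code from the report):

    import numpy as np

    # Gain-modulated logistic unit: reduced gain flattens the response
    # curve, blunting the unit's sensitivity to context signals.
    def activation(net, gain=1.0, bias=-1.0):
        return 1.0 / (1.0 + np.exp(-(gain * net + bias)))

    net = np.linspace(-4.0, 4.0, 9)
    print(activation(net, gain=1.0))   # intact neuromodulation
    print(activation(net, gain=0.6))   # reduced gain, flatter response

Lowering the gain only in the module playing the role of prefrontal cortex is the single manipulation the abstract describes as sufficient to reproduce the schizophrenic performance patterns.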
From noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK Wed May 2 11:31:50 1990 From: noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK (Noel Sharkey) Date: Wed, 2 May 90 11:31:50 BST Subject: weight spaces In-Reply-To: Noel Sharkey's message of Tue, 1 May 90 10:28:46 BST <17159.9005010928@entropy.cs.exeter.ac.uk> Message-ID: <17550.9005021031@entropy.cs.exeter.ac.uk> I apologise for the appearance of personal mail on the mailing list yesterday. I have no idea how it happened. From David.Servan-Schreiber at A.GP.CS.CMU.EDU Wed May 2 10:58:37 1990 From: David.Servan-Schreiber at A.GP.CS.CMU.EDU (David.Servan-Schreiber@A.GP.CS.CMU.EDU) Date: Wed, 02 May 90 10:58:37 EDT Subject: Recurrent Linguistic Domain Papers? In-Reply-To: Your message of Fri, 27 Apr 90 01:35:00 -0500. Message-ID: <4508.641660317@A.GP.CS.CMU.EDU> Tom, Axel Cleeremans, Jay McClelland and I have also worked on simple recurrent networks (SRNs) and their ability to discover finite state and recurrent grammars from exemplars. We have shown that, during training with exemplars generated from a finite state grammar, an SRN progressively encodes more and more temporal context. We also explored the conditions under which the network can carry information about distant sequential contingencies across intervening elements to distant elements. Such information is retained with relative ease if it is relevant at each intermediate step of a sequence; it tends to be lost when intervening elements do not depend on it. However, in a more complex simulation, we showed that long distance sequential contingencies can be encoded by an SRN even if only subtle statistical properties of embedded strings depend on the early information. Our interpretation of this phenomenon is that the network encodes long-distance dependencies by *shading* internal representations that are responsible for processing common embeddings in otherwise different sequences. This ability to represent simultaneously similarities and differences between several sequences relies on the graded nature of representations used by the network, which contrast with the finite states of traditional automata. For this reason, in our more recent work we have started to call such networks *Graded State Machines*. Axel and Jay have also shown that learning and processing in such graded state machines accounts nicely for the way in which human subjects improve and perform in an implicit finite-state grammar learning experiment. Finally, in addition to Jeff Elman's and Jordan Pollack's work, Bob Allen has also done some interesting experiments with recurrent networks and discovery of sequential structures. Unfortunately I cannot put my hands on the appropriate references just now but he can be contacted at RBA at flash.bellcore.com. Cleeremans A, and McClelland J (submitted to Cognitive Science) Learning the Structure of Event Sequences. Available from the first author, Dept. of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213 Cleeremans A, Servan-Schreiber D, and McClelland J (1989) Finite State Automata and Simple Recurrent Networks. Neural Computation 1:372-381 Servan-Schreiber D, Cleeremans A, and McClelland J (1988) Encoding Sequential Structure in Simple Recurrent Networks. Technical Report CMU-CS-183, Carnegie Mellon University (orders taken by copetas at cs.cmu.edu, no charge)
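In miniature, the mechanism behind these results: an Elman-style SRN feeds a copy of its previous hidden state back in as a context input, so the hidden layer can come to encode a graded summary of the sequence seen so far. A toy sketch of the dynamics with a delta-rule update on the output weights (sizes and the random symbol stream are invented; the papers above describe the real training setups):

    import numpy as np

    rng = np.random.default_rng(0)
    n_sym, n_hid = 5, 8
    W_in  = rng.normal(scale=0.5, size=(n_hid, n_sym))
    W_ctx = rng.normal(scale=0.5, size=(n_hid, n_hid))   # context weights
    W_out = rng.normal(scale=0.5, size=(n_sym, n_hid))

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    h = np.zeros(n_hid)                        # context: previous hidden state
    x = np.eye(n_sym)[rng.integers(n_sym)]     # current symbol, one-hot
    for _ in range(200):
        nxt = np.eye(n_sym)[rng.integers(n_sym)]          # next symbol to predict
        h = sigmoid(W_in @ x + W_ctx @ h)                 # graded internal state
        y = sigmoid(W_out @ h)                            # prediction of next symbol
        W_out += 0.1 * np.outer((nxt - y) * y * (1 - y), h)   # delta rule
        x = nxt

It is this graded state h, rather than a discrete automaton state, that gets *shaded* to carry long-distance information through embedded subsequences.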
From mel at aurel.cns.caltech.edu Wed May 2 12:36:15 1990 From: mel at aurel.cns.caltech.edu (Bartlett Mel) Date: Wed, 2 May 90 09:36:15 PDT Subject: TR available Message-ID: <9005021636.AA08904@aurel.cns.caltech.edu> **********DO NOT FORWARD TO OTHER BBOARDS************** **********DO NOT FORWARD TO OTHER BBOARDS************** **********DO NOT FORWARD TO OTHER BBOARDS************** The following TR is now available. A postscript version can be gotten by the usual anonymous ftp (see below). If you can't use the postscript version, you can get a hardcopy by sending a postcard to: C. Hochenedel Division of Biology Caltech, 216-76 Pasadena, CA 91125 ________________________________________ THE SIGMA-PI COLUMN: A MODEL OF ASSOCIATIVE LEARNING IN CEREBRAL NEOCORTEX Bartlett W. Mel Computation and Neural Systems Program, 216-76 California Institute of Technology Pasadena, California 91125 mel at aurel.cns.caltech.edu ABSTRACT In this paper we present a model of associative learning in cerebral neocortex. The extrinsically-projecting pyramidal cells of layers 2, 3, and 5 of association cortex are modeled as {\sl sigma-pi} units, where a {\sl sigma-pi} unit computes its activation level as a sum of contributions from a set of multiplicative (or locally-thresholded) clusters of synapses distributed throughout its dendritic tree. The model demonstrates how a broad class of biologically-relevant nonlinear associative learning problems can be solved in this system by modifying only a single layer of excitatory synapses under the control of a Hebb-type learning rule. The model also accounts for a variety of features of cortical anatomy, physiology, and biophysics whose relations to learning have remained poorly understood. These include: (1) three learning-related roles for the {\sc nmda} channel, one of them new, (2) the gross asymmetry in number and patterns of termination of excitatory vs. inhibitory synapses onto cortical pyramidal cells, as well as the apparent lack of plasticity at inhibitory synapses, (3) the replication of like-activated neurons beneath a single point in cerebral cortex, and in particular the clumping of apical dendrites of pyramidal cells on their rise to the cortical surface, (4) the complex 3-dimensional arborizations of axons and dendrites in layer 1, which give rise to a rich ``combinatorial'' association interface crucial to the current model, and (5) putative rules for activity-dependent axon growth and synaptogenesis during associative learning. __________________________ The postscript file for this manuscript was very large, so it was broken into three smaller files. Here is what you need to do to get them:
unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62)
Name: anonymous
Password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get (remote-file) mel.sigmapi1.ps.Z (local-file) foo1.ps.Z
ftp> get (remote-file) mel.sigmapi2.ps.Z (local-file) foo2.ps.Z
ftp> get (remote-file) mel.sigmapi3.ps.Z (local-file) foo3.ps.Z
ftp> quit
unix> uncompress foo1.ps.Z foo2.ps.Z foo3.ps.Z
unix> lpr -Pxx foo1.ps foo2.ps foo3.ps
(xx is the name of your local postscript printer.)
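Read concretely, the unit defined in the abstract can be prototyped in a few lines. In this toy version (cluster sizes, weights, and the Hebb-type step are invented for illustration), activation is a weighted sum over synaptic clusters, each cluster contributing the product, or a local threshold, of its inputs:

    import numpy as np

    rng = np.random.default_rng(1)
    n_inputs, n_clusters, k = 12, 4, 3
    clusters = rng.choice(n_inputs, size=(n_clusters, k))   # synapse groups
    w = rng.random(n_clusters)                 # one excitatory weight per cluster

    def sigma_pi(x, thresholded=False):
        contrib = np.prod(x[clusters], axis=1)      # multiplicative clusters
        if thresholded:
            contrib = (contrib > 0.5).astype(float) # locally-thresholded variant
        return w @ contrib

    x = rng.random(n_inputs)
    y = sigma_pi(x)
    w += 0.05 * y * np.prod(x[clusters], axis=1)    # Hebb-type change, one layer only
    print(y, sigma_pi(x, thresholded=True))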
_______________________________________________________________________ From slehar at bucasb.bu.edu Wed May 2 08:23:47 1990 From: slehar at bucasb.bu.edu (slehar@bucasb.bu.edu) Date: Wed, 2 May 90 08:23:47 EDT Subject: INNC - Call for Volunteers In-Reply-To: connectionists@c.cs.cmu.edu's message of 1 May 90 23:45:40 GM Message-ID: <9005021223.AA16219@thalamus.bu.edu> Please consider me for volunteering at the INNC conference. Steven Lehar 350 Marlborough St, Boston MA 02115 USA slehar at bucasb.bu.edu AM shift NOTE: I will be giving a presentation, I'm not sure exactly when, but my schedule must of course conform with my presentation. From rr%cstr.edinburgh.ac.uk at NSFnet-Relay.AC.UK Thu May 3 10:22:02 1990 From: rr%cstr.edinburgh.ac.uk at NSFnet-Relay.AC.UK (Richard Rohwer) Date: Thu, 3 May 90 10:22:02 BST Subject: Recurrent Linguistic Domain Papers? Message-ID: <10665.9005030922@cstr.ed.ac.uk> D. Servan-Schreiber writes, regarding simple recurrent networks... > [...] > We also explored the conditions under which the network can carry > information about distant sequential contingencies across intervening > elements to distant elements. Such information is retained with relative > ease if it is relevant at each intermediate step of a sequence; it tends to > be lost when intervening elements do not depend on it. [...] The `Moving Targets' training algorithm (which is a bit of a pig in a lot of ways) can deal with situations in which information from the distant past is required in order to make a correct decision in the present. It can do this because error information is communicated through time by additive tradeoffs in the cost function (which has contributions from hidden nodes as well as target nodes), rather than by the multiplicative processes derived from the chain rule. I make no claims about biological plausibility. References: R. Rohwer, "The `Moving Targets' Training Algorithm" in Proc. EURASIP Workshop on Neural Networks, Springer-Verlag Lecture Notes in Computer Science No. 412. (1990). R. Rohwer, "The `Moving Targets' Training Algorithm" to appear in Proc. NIPS 1989
Richard Rohwer
Centre for Speech Technology Research
Edinburgh University
80, South Bridge
Edinburgh EH1 1HN, Scotland
JANET: rr at uk.ac.ed.cstr
ARPA: rr%ed.cstr at nsfnet-relay.ac.uk
BITNET: rr at cstr.ed.ac.uk, rr%cstr.ed.UKACRL
UUCP: ...!{seismo,decvax,ihnp4}!mcvax!ukc!cstr!rr
From rich at gte.com Thu May 3 12:12:19 1990 From: rich at gte.com (Rich Sutton) Date: Thu, 3 May 90 12:12:19 -0400 Subject: Preprint announcement Message-ID: <9005031612.AA18444@bunny.gte.com> How could a connectionist network _plan_ a sequence of actions before doing them? The following preprint describes one answer. --------------- INTEGRATED ARCHITECTURES FOR LEARNING, PLANNING, AND REACTING BASED ON APPROXIMATING DYNAMIC PROGRAMMING Richard S. Sutton GTE Labs Abstract This paper extends previous work with Dyna, a class of architectures for intelligent systems based on approximating dynamic programming methods. Dyna architectures integrate trial-and-error (reinforcement) learning and execution-time planning into a single process operating alternately on the world and on a learned model of the world. In this paper, I present and show results for two Dyna architectures. The Dyna-PI architecture is based on dynamic programming's policy iteration method and can be related to existing AI ideas such as evaluation functions and universal plans (reactive systems).
Using a navigation task, results are shown for a simple Dyna-PI system which simultaneously learns by trial and error, learns a world model, and plans optimal routes using the evolving world model. The Dyna-Q architecture is based on Watkins's Q-learning, a new kind of reinforcement learning. Dyna-Q uses a less familiar set of data structures than does Dyna-PI, but is arguably simpler to implement and use. We show that Dyna-Q architectures are easy to adapt for use in changing environments. --------------- This paper will appear in the proceedings of the Seventh International Conference on Machine Learning, to be held June, 1990. For copies, send a request with your US MAIL address to: clc2 at gte.com From tsejnowski at UCSD.EDU Sat May 5 19:53:44 1990 From: tsejnowski at UCSD.EDU (Terry Sejnowski) Date: Sat, 5 May 90 16:53:44 PDT Subject: Neural Computation 2:1 Message-ID: <9005052353.AA16019@sdbio2.UCSD.EDU> Reviews: Generalized Deformable Models, Statistical Physics, and Matching Problems Alan L. Yuille Letters: An Optoelectronic Architecture for Multilayer Learning in a Single Photorefractive Crystal Carsten Peterson, Stephen Redfield, James D. Keeler, and Eric Hartman VLSI Implementation of Neural Classifiers Arun Rao, Mark R. Walker, Lawrence T. Clark, L. A. Akers, R. O. Grondin Coherent Compound Motion: Corners and Nonrigid Configurations Steven W. Zucker, Lee Iverson, and Robert A.
Hummel A Complementarity Mechanism for Enhanced Pattern Processing James L. Adams Hebb-Type Dynamics Is Sufficient to Account for the Inverse Magnification Rule in Cortical Somatotopy Kamil A. Grajski and Michael M. Merzenich Optimal Plasticity from Matrix Memories: What Goes Up Must Come Down David Willshaw and Peter Dayan A Syntactically Structured Associative Memory DeLiang Wang, Joachim Buhmann, and Christoph von der Malsburg A Neural Net Associative Memory for Real-Time Applications Gregory L. Heileman, George M. Papadourakis, and Michael Georgiopoulos Gram-Schmidt Neural Networks Sophocles J. Orfanidis The Perceptron Algorithm Is Fast for Nonmalicious Distributions Eric B. Baum SUBSCRIPTIONS: Volume 2 ______ $35 Student ______ $50 Individual ______ $100 Institution Add $12. for postage outside USA and Canada surface mail. Add $18. for air mail. (Back issues of volume 1 are available for $25 each.) MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. (617) 253-2889. ----- From jose at learning.siemens.com Mon May 7 08:38:53 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Mon, 7 May 90 07:38:53 EST Subject: NIPS UPDATE Message-ID: <9005071238.AA11377@learning.siemens.com.siemens.com> *************NIPS UPDATE***************** Note there are less than two weeks left for your submission of NIPS abstracts. Please send your abstracts by MAY 17th.
Mail Submissions To:
John Moody
NIPS*90 Submissions
Department of Computer Science
Yale University
P.O. Box 2158 Yale Station
New Haven, Conn. 06520
Mail Requests For Registration Material To:
Kathie Hibbard
NIPS*90 Local Committee
Engineering Center
University of Colorado
Campus Box 425
Boulder, CO 80309-0425
DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 17, 1990 (see big green poster for more detail on NIPS topics for abstracts and summaries) please tell your friends --Steve *************NIPS UPDATE***************** From nips-90 at CS.YALE.EDU Tue May 8 15:49:33 1990 From: nips-90 at CS.YALE.EDU (nips90) Date: Tue, 8 May 90 15:49:33 EDT Subject: NIPS SUBMISSIONS AND REGISTRATION Message-ID: <9005081949.AA03897@CASPER.SUN2.CS.YALE.EDU> This message corrects a previous message sent out yesterday. *************NIPS SUBMISSIONS AND REGISTRATION***************** Note there is only about one week left for your submission to NIPS. Please send six copies of both your 50-100 word abstracts and 1000 word summaries by MAY 17th to: John Moody, NIPS*90 Submissions, Department of Computer Science, Yale University, P.O. Box 2158 Yale Station, New Haven, Conn. 06520
****ALL SUBMITTING AUTHORS WILL BE SENT REGISTRATION MATERIALS AUTOMATICALLY!**** DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 17, 1990 (see big green poster for more detail on NIPS topics for abstracts and summaries) *************NIPS REGISTRATION ONLY!***************** If you are not sending in a submission for NIPS, but would still like to attend, please request registration materials from: Kathie Hibbard NIPS*90 Local Committee Engineering Center University of Colorado Campus Box 425 Boulder, CO 80309-0425 -- John Moody Program Chairman ------- From weili at wpi.wpi.edu Tue May 8 17:54:09 1990 From: weili at wpi.wpi.edu (Wei Li) Date: Tue, 8 May 90 16:54:09 EST Subject: neural networks apply to ATN Message-ID: <9005082154.AA09434@wpi.wpi.edu> Hi, we are very interested in knowing whether neural networks can solve ATN (Augmented Transition Network) problems. If so, can a neural network do it as a recursive process or some other type of process? (ATN for natural language processing.) weili at wpi.wpi.edu or apache!weil at uunet.uu.net Thanks for any information. From Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU Tue May 8 19:49:28 1990 From: Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave.Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Tue, 08 May 90 19:49:28 EDT Subject: NIPS proceedings Message-ID: <3746.642210568@DST.BOLTZ.CS.CMU.EDU> The proceedings of the 1989 NIPS conference have started arriving in people's mailboxes. They were supposed to be out a few weeks ago, but there was a problem with the quality of the binding, so Morgan Kaufmann sent the whole batch back to the bindery to be redone. The second time around they got it perfect. If you are an author or co-author of a paper in the volume, OR if you attended the conference, you should receive a copy of the proceedings. If you don't get yours some time this week, call Morgan Kaufmann on Monday to check on it. Their number is 415-578-9911; ask for Shirley Jowell. If you would like to order extra copies of the proceedings, they are available from: Morgan Kaufmann Publishers 2929 Campus Drive, Suite 260 San Mateo, CA 94403 tel. 415-965-4081 (order department) fax: 415-578-0672. Enclose a check for $35.95 per copy, plus shipping charge of $3.50 for the first copy and $2.50 for each additional copy. California residents must add sales tax. There are higher shipping charges for air mail or international orders; contact the publisher for information. Note: the catalog code for this volume is "100-7"; include that in your order. An example of proper citation format for the volume is: Cowan, J. D. (1990) Neural networks: the early days. In D. S. Touretzky (ed.), Advances in Neural Information Processing Systems 2, pp. 828-842. San Mateo, CA: Morgan Kaufmann. -- Dave From noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK Wed May 9 13:33:07 1990 From: noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK (Noel Sharkey) Date: Wed, 9 May 90 13:33:07 BST Subject: psychologists Message-ID: <19754.9005091233@entropy.cs.exeter.ac.uk> ******************** CALL FOR PAPERS ****************** CONNECTION SCIENCE SPECIAL ISSUE CONNECTIONIST MODELLING OF PSYCHOLOGICAL PROCESSES EDITOR Noel Sharkey SPECIAL BOARD Jim Anderson Andy Barto Thomas Bever Glyn Humphries Walter Kintsch Dennis Norris Ronan Reilly Dave Rumelhart The journal Connection Science would like to encourage submissions from researchers modelling psychological data or conducting experiments comparing models within the connectionist framework.
Papers of this nature may be submitted to our regular issues or to the special issue. Authors wishing to submit papers to the special issue should mark them SPECIAL PSYCHOLOGY ISSUE. Good quality papers not accepted for the special issue may appear in later regular issues. DEADLINE FOR SUBMISSION 12th October, 1990. Notification of acceptance or rejection will be by the end of December/beginning of January. From lyn%cs.exeter.ac.uk at NSFnet-Relay.AC.UK Wed May 9 13:45:18 1990 From: lyn%cs.exeter.ac.uk at NSFnet-Relay.AC.UK (Lyn Shackleton) Date: Wed, 9 May 90 13:45:18 BST Subject: Special issue for Connection Science Message-ID: <19910.9005091245@exsc.cs.exeter.ac.uk> ******************** CALL FOR PAPERS ****************** CONNECTION SCIENCE SPECIAL ISSUE CONNECTIONIST MODELLING OF PSYCHOLOGICAL PROCESSES EDITOR Noel Sharkey SPECIAL BOARD Jim Anderson Andy Barto Thomas Bever Glyn Humphries Walter Kintsch Dennis Norris Ronan Reilly Dave Rumelhart The journal Connection Science would like to encourage submissions from researchers modelling psychological data or conducting experiments comparing models within the connectionist framework. Papers of this nature may be submitted to our regular issues or to the special issue. Authors wishing to submit papers to the special issue should mark them SPECIAL PSYCHOLOGY ISSUE. Good quality papers not accepted for the special issue may appear in later regular issues. DEADLINE FOR SUBMISSION 12th October, 1990. Notification of acceptance or rejection will be by the end of December/beginning of January. Submissions should be sent to:
lyn shackleton
Centre for Connection Science
Dept. Computer Science
University of Exeter
Exeter EX4 4PT
Devon
U.K.
JANET: lyn at uk.ac.exeter.cs
UUCP: !ukc!expya!lyn
BITNET: lyn at cs.exeter.ac.uk.UKACRL
From jfeldman%icsib2.Berkeley.EDU at jade.berkeley.edu Wed May 9 12:18:52 1990 From: jfeldman%icsib2.Berkeley.EDU at jade.berkeley.edu (Jerry Feldman) Date: Wed, 9 May 90 09:18:52 PDT Subject: ICSI Deputy Ad Message-ID: <9005091618.AA01252@icsib2.berkeley.edu.> We are starting a search for a full-time Deputy Director for the International Computer Science Institute (ICSI). We would highly appreciate any help you can give us in this search. The enclosed ad describes the position. Please feel free to distribute it electronically to anybody who might be interested. Thank you in advance, and best regards. Jerry Feldman Domenico Ferrari PS: If you need more information about duties and perks, please let us know. =============================================================== DEPUTY DIRECTOR International Computer Science Institute Nominations and Applications are solicited for the position of Deputy Director of the International Computer Science Institute. The Institute is an independent basic research laboratory affiliated with and physically near the University of California at Berkeley. Support comes from U.S. sources and sponsor nations, currently Germany, Italy and Switzerland. The Deputy Director will have the primary responsibility for the internal administration of the Institute and its post-doctoral and exchange programs with sponsor nations. There are also many opportunities for new initiatives. The position is like the chair of a research oriented computer science department and the ideal candidate would have such experience. ICSI is also expanding its research staff and welcomes applications from outstanding scientists at any post-doctoral level. Please respond to: Dr.
Domenico Ferrari Deputy Director International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704-1105 From hinton at ai.toronto.edu Wed May 9 16:09:56 1990 From: hinton at ai.toronto.edu (Geoffrey Hinton) Date: Wed, 9 May 1990 16:09:56 -0400 Subject: image compression Message-ID: <90May9.161014edt.8256@ephemeral.ai.toronto.edu> We are doing some work on lossless image compression. (i.e. the image must be transmitted using as few bits as possible, but must be perfectly reconstructed after transmission). 1. Does anyone know of any neural network work on lossless image compression? (We know about vector quantization, autoencoders, etc, but these are lossy techniques that wouldn't be too good for compressing your disk files etc.) 2. Does anyone have an image on which other lossless techniques have been tried so that we can compare our technique? Thanks Geoff From harnad at clarity.Princeton.EDU Wed May 9 16:06:51 1990 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Wed, 9 May 90 16:06:51 EDT Subject: Optimality: BBS Call for Commentators Message-ID: <9005092006.AA02403@reason.Princeton.EDU> Below is the abstract of a forthcoming target article to appear in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. To be considered as a commentator or to suggest other appropriate commentators, please send email to: harnad at clarity.princeton.edu or write to: BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771] Please specify the aspect of the article that you are qualified and interested to comment upon. If you are not a current BBS Associate, please send your CV and/or the name of a current Associate who would be prepared to nominate you. ____________________________________________________________________ The Quest for Optimality: A Positive Heuristic of Science? Paul J. H. Schoemaker Center for Decision Research Graduate School of Business University of Chicago Chicago, IL 6063 Abstract This paper examines the strengths and weaknesses of one of science's most pervasive and flexible metaprinciples: Optimality is used to explain utility maximization in economics, least effort principles in physics, entropy in chemistry, and survival of the fittest in biology. Fermat's principle of least time involves both teleological and causal considerations, two distinct modes of explanation resting on poorly understood psychological primitives. The rationality heuristic in economics provides an example from social science of the potential biases arising from the extreme flexibility of optimality considerations, including selective search for confirming evidence, ex post rationalization, and the confusion of prediction with explanation. Commentators are asked to reflect on the extent to which optimality is (1) an organizing principle of nature, (2) a set of relatively unconnected techniques of science, (3) a normative principle for rational choice and social organization, (4) a metaphysical way of looking at the world, or (5) something else still. Key Words: Optimization, Variational Principles, Rationality, Explanation, Evolution, Economics, Adaptation, Causality, Heuristics, Biases, Sociobiology, Control Theory, Homeostasis, Entropy, Regulation.
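On Geoff Hinton's lossless-compression query above: the generic recipe that makes any predictive model, neural or otherwise, lossless is to transmit only prediction residuals, since a decoder running the identical predictor can undo the subtraction exactly. A schematic (the trivial left-neighbor predictor below is a stand-in for whatever network one would actually train; the entropy coder that would compress the residuals is omitted):

    import numpy as np

    def encode(row):
        pred = np.concatenate(([0], row[:-1]))   # predict each pixel from its left neighbor
        return row - pred                        # residuals: these get entropy-coded

    def decode(residuals):
        row = np.zeros_like(residuals)
        for i, r in enumerate(residuals):
            pred = row[i - 1] if i > 0 else 0    # same predictor as the encoder
            row[i] = pred + r
        return row

    row = np.array([100, 101, 103, 103, 104, 110], dtype=np.int32)
    assert np.array_equal(decode(encode(row)), row)   # perfect reconstruction

All the compression comes from the residuals being cheaper to entropy-code than raw pixels, which holds exactly to the degree that the predictor is good.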
From weissg at lan.informatik.tu-muenchen.dbp.de Wed May 9 20:35:00 1990 From: weissg at lan.informatik.tu-muenchen.dbp.de (Gerhard Weiss) Date: 09 May 90 20:35 GMT-0200 Subject: reports available Message-ID: <9005091635.AA09834@tumult.informatik.tu-muenchen.de> *** Do not use 'REPLY' *** The following two reports are available. COMBINING NEURAL AND EVOLUTIONARY LEARNING: ASPECTS AND APPROACHES Report FKI-132-90 Gerhard Weiss This report focusses on the intersection of neural and evolutionary learning and shows basic aspects of and current approaches to the combination of these two learning paradigms. Advantages and difficulties of such a combination are described. Approaches from both the field of artificial intelligence and the neurosciences are surveyed. A number of related works as well as extensive references to further literature are presented. Contents: - Hybrid approaches in artificial intelligence . Evolutionary design of artificial neural networks . Evolutionary training of artificial neural networks . Further hybrid approaches and related works - Selective theories in the neurosciences . The evolutionary selection circuits model of learning (Conrad et al.) . The theories of selective stabilization of synapses and pre-representations (Changeux et al.) . The theory of neuronal group selection (Edelman) ARTIFICIAL NEURAL LEARNING Report FKI-127-90 Gerhard Weiss This report provides an introductory overview of the foundations and the principles of learning in artificial neural networks. Contents: - General aspects (artificial neural nets / adaptation rules / gradient-following / ...) - Supervised learning (perceptron convergence procedure / backprop / Boltzmann learning) - Associative reinforcement learning (associative reward-penalty algorithm / reinforcement comparison procedures / REINFORCE algorithms) - Unsupervised learning (topology-preserving feature maps / adaptive resonance theory / development of feature analyzing cells)
REQUESTS FOR COPIES: weissg at lan.informatik.tu-muenchen.dbp.de
-> Please use subject: FKI-127 or FKI-132 or FKI-127+132
-> Please leave only your physical address
-> Those who already asked for copies will receive them without any further request
OTHER CORRESPONDENCE: weissg at tumult.informatik.tu-muenchen.de or Gerhard Weiss Institut fuer Informatik -H2- Technische Universitaet Muenchen Arcisstrasse 21 D-8000 Muenchen 2 Fed.Rep.Germany From mark at cis.ohio-state.edu Thu May 10 08:13:19 1990 From: mark at cis.ohio-state.edu (Mark Jansen) Date: Thu, 10 May 90 08:13:19 -0400 Subject: image compression Message-ID: <9005101213.AA29337@giza.cis.ohio-state.edu> There is a professor here at OSU in the Department of Electrical Engineering, I believe his name is Aho, who is working on image compression using neural nets, but it is not lossless compression. From bms%dcs.leeds.ac.uk at NSFnet-Relay.AC.UK Thu May 10 11:44:31 1990 From: bms%dcs.leeds.ac.uk at NSFnet-Relay.AC.UK (B M Smith) Date: Thu, 10 May 90 11:44:31 BST Subject: Item for Distribution Message-ID: <10920.9005101044@csuna6.dcs.leeds.ac.uk> CALL FOR PAPERS AISB'91 8th SSAISB CONFERENCE ON ARTIFICIAL INTELLIGENCE University of Leeds, UK 16-19 April, 1991 The Society for the Study of Artificial Intelligence and Simulation of Behaviour (SSAISB) will hold its eighth biennial conference at Bodington Hall, University of Leeds, from 16 to 19 April 1991. There will be a Tutorial Programme on 16 April followed by the full Technical Programme.
The Programme Chair will be Luc Steels (AI Lab, Vrije Universiteit Brussel). Scope: Papers are sought in all areas of Artificial Intelligence and Simulation of Behaviour, but especially on the following AISB91 special themes: * Emergent functionality in autonomous agents * Neural networks and self-organisation * Constraint logic programming * Knowledge level expert systems research Papers may describe theoretical or practical work but should make a significant and original contribution to knowledge about the field of Artificial Intelligence. A prize of 500 pounds for the best paper has been offered by British Telecom Computing (Advanced Technology Group). It is expected that the proceedings will be published as a book. Submission: All submissions should be in hardcopy in letter quality print and should be written in 12 point or pica typewriter face on A4 or 8.5" x 11" paper, and should be no longer than 10 sides, single-spaced. Each paper should contain an abstract of not more than 200 words and a list of up to four keywords or phrases describing the content of the paper. Five copies should be submitted. Papers must be written in English. Authors should give an electronic mail address where possible. Submission of a paper implies that all authors have obtained all necessary clearances from the institution and that an author will attend the conference to present the paper if it is accepted. Papers should describe work that will be unpublished on the date of the conference. Dates: Deadline for Submission: 1 October 1990 Notification of Acceptance: 7 December 1990 Deadline for camera ready copy: 16 January 1991 Information: Papers and all queries regarding the programme should be sent to Judith Dennison. All other correspondence and queries regarding the conference to the Local Organiser, Barbara Smith.
Ms. Judith Dennison
Cognitive Sciences
University of Sussex
Falmer
Brighton BN1 9QN
UK
Tel: (+44) 273 678379
Email: judithd at cogs.sussex.ac.uk
Dr. Barbara Smith
Division of AI
School of Computer Studies
University of Leeds
Leeds LS2 9JT
UK
Tel: (+44) 532 334627
FAX: (+44) 532 335468
Email: aisb91 at ai.leeds.ac.uk
From dario%TECHUNIX.BITNET at vma.CC.CMU.EDU Thu May 10 10:54:42 1990 From: dario%TECHUNIX.BITNET at vma.CC.CMU.EDU (Dario Ringach) Date: Thu, 10 May 90 17:54:42 +0300 Subject: image compression In-Reply-To: Geoffrey Hinton "image compression" (May 9, 4:09pm) Message-ID: <9005101454.AA10878@techunix.bitnet> I wouldn't expect NNs to perform better on error-free encoding than any of the standard algorithms (Lempel-Ziv for instance)... As far as the testing picture you are after, I think most of the vision community will agree that the famous "Lena" is the usual picture used for comparison... If you can tolerate errors in the reconstruction, then there are a couple of nice works I'm aware of: Daugman used a net to find the Gabor expansion of a picture and then compress it, and Sanger used Oja/Kohonen units + an orthogonalization procedure to obtain convergence to the first eigenfunction/eigenvalues of the Karhunen-Loeve expansion, and of course used it to compress the picture. If anyone is interested I can look for the exact references. -- Dario. From kammen at aurel.cns.caltech.edu Thu May 10 14:33:32 1990 From: kammen at aurel.cns.caltech.edu (Dan Kammen) Date: Thu, 10 May 90 11:33:32 PDT Subject: No subject Message-ID: <9005101833.AA19470@aurel.cns.caltech.edu> TOPIC: PAPER FOR DISSEMINATION WE HAVE RECENTLY COMPLETED AND SUBMITTED (N.
NETWORKS) THE FOLLOWING PAPER WHICH SHOULD BE OF INTEREST BOTH TO PERSONS MODELING NEUROBIOLOGICAL NETWORKS AND THOSE DESIGNING SELF-ORGANIZING ALGORITHMS: CORRELATIONS IN HIGH DIMENSIONAL OR ASYMMETRIC DATA SETS: HEBBIAN NEURONAL PROCESSING WILLIAM R. SOFTKY and DANIEL M. KAMMEN Computation and Neural Systems Program California Institute of Technology Pasadena, CA 91125 ABSTRACT The Hebbian neural learning algorithm that implements Principal Component Analysis (PCA) can be extended for the analysis of more realistic forms of neural data by including higher than 2-channel correlations and non-Euclidean (l_P; l-sub-P) metrics. Maximizing a D-th rank tensor form which correlates D channels is equivalent to raising the exponential order of variance correlation from 2 to D in the algorithm that implements PCA. Simulations suggest that a generalized version of Oja's PCA neuron can detect such a D-th order principal component. Arguments from biology and pattern-recognition suggest that neural data in general is not symmetric about its mean; performing PCA with an implicit l_1 metric rather than the Euclidean metric weights exponentially distributed vectors according to their probability, as does a highly nonlinear Hebb rule. The correlation order D and the l_P metric exponent P were each roughly constant for each of several Hebb rules simulated. We propose and discuss a number of these generalized correlation algorithms in terms of natural (biological) and artificial network implementations. Keywords: Principal Component Analysis, Hebbian learning, self-organization, correlation functions, multi-dimensional analysis, non-Euclidean metrics, information theory, asymmetric coding. Address correspondence or preprint requests to: Dr. D. M. KAMMEN: Division of Biology, 216-76 California Institute of Technology Pasadena, CA 91125 USA kammen at aurel.cns.caltech.edu KAMMEN at CALTECH.BITNET From BARB at REAGAN.AI.MIT.EDU Thu May 10 15:36:00 1990 From: BARB at REAGAN.AI.MIT.EDU (Barbara K. Moore) Date: Thu, 10 May 90 15:36 EDT Subject: image compression In-Reply-To: <9005101454.AA10878@techunix.bitnet> Message-ID: <19900510193614.3.BARB@PENGUIN.AI.MIT.EDU> Sorry for being slightly off-track, but I think this is important: (In response to the suggestion to Geoff that he use "Lena" as an example for his image compression algorithm.) For years I have been bothered by the "pretty woman looking seductive" pictures used all too frequently as examples in machine vision ("Marilyn", "Lena", etc.). I realize that they are often used, but I think it's time for a change. How about a still life, or an animal, or just... people? Barbara Moore (barb at ai.mit.edu) From pkube at UCSD.EDU Fri May 11 01:56:07 1990 From: pkube at UCSD.EDU (Paul Kube) Date: Thu, 10 May 90 22:56:07 PDT Subject: image compression In-Reply-To: Your message of Thu, 10 May 90 15:36:00 EDT. <19900510193614.3.BARB@PENGUIN.AI.MIT.EDU> Message-ID: <9005110556.AA00312@kokoro.ucsd.edu> Barbara Moore is right; the Lena picture offends and should no longer be used as a benchmark image in research publications. A commonly used alternative is the "mandrill picture", obtainable from various places. (Mail me if you need it.) --Paul Kube at ucsd.edu From HKF218%DJUKFA11.BITNET at vma.CC.CMU.EDU Fri May 11 10:02:26 1990 From: HKF218%DJUKFA11.BITNET at vma.CC.CMU.EDU (Gregory Kohring) Date: Fri, 11 May 90 10:02:26 MES Subject: Preprint Message-ID: The following preprint is currently available. -- G.A.
Kohring Finite-State Neural Networks: A Step Towards the Simulation of Very Large Systems G.A. Kohring HLRZ an der KFA Julich (Supercomputing Center at the KFA Julich) Abstract Neural networks composed of neurons with Q_N states and synapses with Q_J states are studied analytically and numerically. Analytically it is shown that these finite-state networks are up to 25 times more efficient at information storage than networks with continuous synapses. In order to take the utmost advantage of networks with finite-state elements, a multi-neuron and multi-synapse coding scheme is introduced which allows the simulation of networks having over one billion couplings at a speed of 7.1 billion coupling evaluations per second on a single processor of the Cray-YMP. A local learning algorithm is also introduced which allows for the efficient training of large networks with finite-state elements. Key Words: Neural Networks, Multi-Spin Coding, Replica Method, Finite-State Networks, Learning Algorithms HLRZ-33/90 Send Correspondence and request for preprints to: G.A. Kohring HLRZ an der KFA Julich Postfach 1913 D-5170 Julich, West Germany From pittman at mcc.com Fri May 11 09:42:26 1990 From: pittman at mcc.com (pittman@mcc.com) Date: Fri, 11 May 90 06:42:26 -0700 Subject: seductive compression Message-ID: <9005111342.AA02870@gluttony.aca.mcc.com> Barbara Moore complains about the "pretty woman looking seductive" and suggests an alternate image of "just... people". Perhaps we could get the producers of "The Bob Newhart Show" to submit a shot of Larry, Darrell, and Darrell. We might even get them to look seductive. I don't (at this time) wish to discuss the relative merits of the mandrill over LD&D. Seriously, if you want to appeal to the general public (and therefore also public officials), I suggest you stick with what has funded the photo-developing industry: cute pictures of small children with big smiles on their faces. Jay Pittman (jay.pittman at mcc.com) From sstone%weber at ucsd.edu Fri May 11 13:23:40 1990 From: sstone%weber at ucsd.edu (Allucquere Rosanne Stone) Date: Fri, 11 May 90 10:23:40 pdt Subject: Image compression Message-ID: <9005111723.AA17488@weber.ucsd.edu> I heartily agree with Barbara Moore's suggestion that the stereotype soft-core photographs of women have seen their day and should be retired. Hopefully, by this time not only are there enough women in the profession who find this sort of thing demeaning, but there are enough men who are able to see how softcore photos perpetuate the idea of women as objects. Let's keep our objects within our programming languages. From dlovell at s1.elec.uq.OZ.AU Sat May 12 03:07:06 1990 From: dlovell at s1.elec.uq.OZ.AU (dlovell@s1.elec.uq.OZ.AU) Date: Sat, 12 May 1990 17:07:06 +1000 Subject: No subject Message-ID: <9005120708.1057@munnari.oz.au> From dlovell at s1.elec.uq.oz Sat May 12 18:06:05 1990 From: dlovell at s1.elec.uq.oz (dlovell@s1.elec.uq.oz) Date: Sat, 12 May 90 17:06:05 EST Subject: image compression and mandrils. Message-ID: > > Barbara Moore is right; the Lena picture offends and should no longer > be used as a benchmark image in research publications. A commonly > used alternative is the "mandrill picture", obtainable from various places. > (Mail me if you need it.) > > --Paul Kube at ucsd.edu > > Would that be a soft focus picture of a seductive looking primate perhaps?
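Back on Kohring's report above: the point of the multi-synapse (multi-spin) coding it mentions is that two-state couplings pack many to a machine word, so one AND plus a population count evaluates a whole block of synapses at once. In schematic form (a toy with 6-bit words; the report's Cray-YMP implementation is of course far more elaborate):

    # Pack 0/1 synapse or neuron states into the bits of one integer word.
    def pack(bits):
        word = 0
        for i, b in enumerate(bits):
            word |= b << i
        return word

    def popcount(x):
        return bin(x).count("1")

    def local_field(w_plus, w_minus, s):
        # +1 couplings add, -1 couplings subtract; one AND plus a
        # popcount replaces a loop over individual synapses.
        return popcount(w_plus & s) - popcount(w_minus & s)

    s       = pack([1, 0, 1, 1, 0, 1])   # packed neuron states
    w_plus  = pack([1, 1, 0, 1, 0, 0])   # excitatory couplings
    w_minus = pack([0, 0, 1, 0, 1, 1])   # inhibitory couplings
    print(local_field(w_plus, w_minus, s))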
From koch%HAMLET.BITNET at vma.CC.CMU.EDU Sat May 12 01:50:42 1990 From: koch%HAMLET.BITNET at vma.CC.CMU.EDU (Christof Koch) Date: Fri, 11 May 90 22:50:42 PDT Subject: image compression In-Reply-To: Your message <9005110556.AA00312@kokoro.ucsd.edu> dated 10-May-1990 Message-ID: <900511225018.2260196b@Hamlet.Caltech.Edu> Paul Kube is entirely right. Not only should we not use the Lena picture as a bench mark, but we should also boycott books and museums which display the "Mona Lisa", Botticelli's "Venus" or Gauguin's "Tahiti Nudes". They are all honorable, sorry, offensive pictures. Christof From schraudo%cs at ucsd.edu Sat May 12 18:23:36 1990 From: schraudo%cs at ucsd.edu (Nici Schraudolph) Date: Sat, 12 May 90 15:23:36 PDT Subject: image compression Message-ID: <9005122223.AA07628@beowulf.ucsd.edu> > From: Christof Koch > > Paul Kube is entirely right. Not only should we not use the Lena picture > as a bench mark, but we should also boycott books and museums which > display the "Mona Lisa", Botticelli's "Venus" or Gauguin's "Tahiti Nudes". This comparison demonstrates ignorance of the cultural context in which Da Vinci, Botticelli and Gauguin created their masterpieces. But even if you choose to consider these works as sexist this neither diminishes their cultural value, nor does it excuse you from the social responsibilities of our time.
-- Nici Schraudolph, C-014
University of California, San Diego
La Jolla, CA 92093
nschraudolph at ucsd.edu
nschraudolph at ucsd.bitnet
...!ucsd!nschraudolph
From dario%TECHUNIX.BITNET at vma.CC.CMU.EDU Sun May 13 00:46:59 1990 From: dario%TECHUNIX.BITNET at vma.CC.CMU.EDU (Dario Ringach) Date: Sun, 13 May 90 07:46:59 +0300 Subject: image compression In-Reply-To: Paul Kube "Re: image compression" (May 10, 10:56pm) Message-ID: <9005130446.AA02576@techunix.bitnet> I want to apologize if I've offended anyone suggesting "Lena"... Now, regarding the references for image compression I mentioned in my previous mail, here they are: T. Sanger, 'Optimal Unsupervised Learning in a Single-Layer Feedforward Neural Network', Neural Networks, Vol. 2, pp. 459-73, 1989. [He has also some extensions of this work as internal MIT publications]. G. Cottrell et al., 'Principal Component Analysis of Images via Back Propagation', SPIE Proc. Visual Communications and Image Processing '88, pp. 1070-77, 1988. J. Daugman, 'Complete Discrete 2-D Gabor Transforms by Neural Networks for Image Analysis and Compression', IEEE Trans. ASSP, Vol. 36, No. 7, pp. 1169-79, 1988. N. Nasrabadi et al., 'Vector Quantization Based Upon the Kohonen Self-Organizing Feature Map', IEEE Conf. on NNs, pp. I-101-8, 1988. The list is surely incomplete... -- Dario Ringach From hbs at lucid.com Sun May 13 04:44:47 1990 From: hbs at lucid.com (Harlan Sexton) Date: Sun, 13 May 90 01:44:47 PDT Subject: image compression In-Reply-To: Nici Schraudolph's message of Sat, 12 May 90 15:23:36 PDT <9005122223.AA07628@beowulf.ucsd.edu> Message-ID: <9005130844.AA00207@kent-state> Do we need to discuss this any further? I think that the position that a more neutral set of benchmark pictures is desirable has been generally accepted (or at least understood), and the discussion seems to be wandering from this point onto topics that belong in some "readnews" category or in private correspondence. --Harlan From hgigley at note.nsf.gov Tue May 15 09:57:30 1990 From: hgigley at note.nsf.gov (Helen M.
Gigley) Date: Tue, 15 May 90 09:57:30 EDT Subject: NSF offers access to Japanese data bases Message-ID: <9005150957.aa21494@Note.NSF.GOV> ------- Forwarded Message From jea%BKLYN.BITNET at VMA.CC.CMU.EDU Tue May 15 10:34:00 1990 From: jea%BKLYN.BITNET at VMA.CC.CMU.EDU (Jonathan E. Adler) Date: Tue, 15 May 90 10:34 EDT Subject: Optimality: BBS Call for Commentators In-Reply-To: Message of Wed, 9 May 90 16:06:51 EDT from Stevan Harnad Message-ID: I decline the commentary, but recommend Philip Kitcher, Philosophy, U.C. San Diego, and Benaolette Guimberteau, School of Education, U.C. Berkeley. From vg at psyche.inria.fr Tue May 15 13:56:57 1990 From: vg at psyche.inria.fr (Thierry BERNARD) Date: Tue, 15 May 90 19:56:57 +0200 Subject: image compression Message-ID: <9005151756.AA29413@psyche> As NNs are usually meant to yield suboptimal answers to difficult problems, I am surprised that they can be used for LOSSLESS image compression. Anyway, if losing some information is acceptable, maybe our work is of some interest. For image processing purposes within smart sensors, we have designed a neural technique for image analog-to-binary conversion, which we actually call "neural halftoning". We treat this conversion as an optimization problem subject to a fidelity criterion. The neural approach turns out to be so well adapted that: - the conversion quality is better than in any other halftoning technique. - a 100x100 pixel/neuron array can easily fit inside a standard CMOS chip. Anyone interested can read two recent papers of ours: [1] A neural halftoning algorithm suiting VLSI implementation. T.Bernard, P.Garda, B.Zavidovique. IEEE ICASSP April 90 [2] About the use of the adjective "neural", when applied to smart sensors. T.Bernard, B.Zavidovique. IEEE ICPR June 90 -- Thierry Bernard (vg at etca.fr) From nips-90 at CS.YALE.EDU Tue May 15 13:52:22 1990 From: nips-90 at CS.YALE.EDU (nips90) Date: Tue, 15 May 90 13:52:22 EDT Subject: FedEx address for NIPS Submissions Message-ID: <9005151752.AA29294@CASPER.NA.CS.YALE.EDU> Fellow Colleagues: For those of you feverishly trying to make the deadline for NIPS*90 (Neural Information Processing Systems, Natural and Synthetic), the correct street address for using Fed Ex or other express delivery services is John Moody NIPS*90 Submissions Department of Computer Science Yale University 51 Prospect St. New Haven, CT 06520 US Postal Service Express Mail can be sent to John Moody NIPS*90 Submissions Department of Computer Science PO Box 2158 Yale Station New Haven, CT 06520 I will accept any submission with express postmark as late as May 17. Remember to include six copies of both abstract and 1000 word summary. Incomplete submissions will be returned. Lastly, contributing authors will be automatically sent registration materials, so there is no need to request them separately. Happy writing! --John ------- From Alex.Waibel at SPEECH2.CS.CMU.EDU Tue May 15 18:12:59 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Tue, 15 May 90 18:12:59 EDT Subject: FedEx address for NIPS Submissions Message-ID: Prospective NIPS'90 attendees, Please note: Due date for proposals for the NIPS'90 postconference workshops is also May 17th.
From nips-90 at CS.YALE.EDU Tue May 15 13:52:22 1990
From: nips-90 at CS.YALE.EDU (nips90)
Date: Tue, 15 May 90 13:52:22 EDT
Subject: FedEx address for NIPS Submissions
Message-ID: <9005151752.AA29294@CASPER.NA.CS.YALE.EDU>

Fellow Colleagues: For those of you feverishly trying to make the deadline for NIPS*90 (Neural Information Processing Systems, Natural and Synthetic), the correct street address for using Fed Ex or other express delivery services is

John Moody
NIPS*90 Submissions
Department of Computer Science
Yale University
51 Prospect St.
New Haven, CT 06520

US Postal Service Express Mail can be sent to

John Moody
NIPS*90 Submissions
Department of Computer Science
PO Box 2158 Yale Station
New Haven, CT 06520

I will accept any submission with an express postmark as late as May 17. Remember to include six copies of both abstract and 1000-word summary. Incomplete submissions will be returned. Lastly, contributing authors will be automatically sent registration materials, so there is no need to request them separately. Happy writing!

--John

-------

From Alex.Waibel at SPEECH2.CS.CMU.EDU Tue May 15 18:12:59 1990
From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU)
Date: Tue, 15 May 90 18:12:59 EDT
Subject: FedEx address for NIPS Submissions
Message-ID:

Prospective NIPS'90 attendees,

Please note: the due date for proposals for the NIPS'90 postconference workshops is also May 17th. To ensure proper and timely consideration of your workshop proposal, however, please be sure to send it directly to:

Alex Waibel
attn.: NIPS'90 Workshops
School of Computer Science
Carnegie Mellon University
Pittsburgh, PA 15213

---------------------------------------------------------------------

Following the regular NIPS program, workshops on current topics in Neural Information Processing will be held on November 30 and December 1, 1990, at a ski resort near Denver. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Past topics have included: Rules and Connectionist Models; Speech; Vision; Neural Network Dynamics; Neurobiology; Computational Complexity Issues; Fault Tolerance in Neural Networks; Benchmarking and Comparing Neural Network Applications; Architectural Issues; Fast Training Techniques; VLSI; Control; Optimization; Statistical Inference; Genetic Algorithms.

The format of the workshops is informal. Beyond reporting on past research, their goal is to provide a forum for scientists actively working in the field to freely discuss current issues of concern and interest. Sessions will meet in the morning and in the afternoon of both days, with free time in between for ongoing individual exchange or outdoor activities. Specific open or controversial issues are encouraged and preferred as workshop topics. Individuals interested in chairing a workshop must propose a topic of current interest and must be willing to accept responsibility for their group's discussion. Discussion leaders' responsibilities include arranging brief informal presentations by experts working on the topic, moderating or leading the discussion, and reporting its high points, findings, and conclusions to the group during evening plenary sessions and in a short (2-page) summary.

Submission Procedure: Interested parties should submit a short proposal for a workshop of interest by May 17, 1990. Proposals should include a title and a short description of what the workshop is to address and accomplish. It should state why the topic is of interest or controversial, why it should be discussed, and what the targeted group of participants is. In addition, please send a brief resume of the prospective workshop chair, a list of publications, and evidence of scholarship in the field of interest. Name, mailing address, phone number, and e-mail net address (if applicable) must be on all submissions.

---------------------------------------------------------------------

From marcus at aurel.cns.caltech.edu Wed May 16 03:07:42 1990
From: marcus at aurel.cns.caltech.edu (Marcus Quintana Mitchell)
Date: Wed, 16 May 90 00:07:42 PDT
Subject: mailing list
Message-ID: <9005160707.AA09427@aurel.cns.caltech.edu>

To whom it may concern: I would like to be placed on the connectionists mailing list. Thank you

M. Q. Mitchell
164-30 California Inst. of Technology
marcus at aurel.cns.caltech.edu

From jose at learning.siemens.com Thu May 17 08:06:40 1990
From: jose at learning.siemens.com (Steve Hanson)
Date: Thu, 17 May 90 07:06:40 EST
Subject: NIPS NOTE
Message-ID: <9005171206.AA24748@learning.siemens.com.siemens.com>

LAST MINUTE NOTE (RE: Cognitive Science/AI) as you are doing your last-minute details prior to mailing... Remember there is a new submission category this year. Anyone submitting summaries relevant to COGNITIVE SCIENCE or AI, please indicate this on your summary/abstract.
Steve

From PI05%primeb.dundee.ac.uk at NSFnet-Relay.AC.UK Thu May 17 16:24:46 1990
From: PI05%primeb.dundee.ac.uk at NSFnet-Relay.AC.UK (PI05%primeb.dundee.ac.uk@NSFnet-Relay.AC.UK)
Date: Thu, 17 May 90 16:24:46
Subject: scheduling
Message-ID:

Does anyone have references to work on using connectionist techniques for solving scheduling problems - particularly timetabling?

David Pickles.

From steeg at ai.toronto.edu Thu May 17 13:53:33 1990
From: steeg at ai.toronto.edu (Evan W. Steeg)
Date: Thu, 17 May 1990 13:53:33 -0400
Subject: scheduling
Message-ID: <90May17.135338edt.8329@ephemeral.ai.toronto.edu>

>Does anyone have references to work on using connectionist techniques for
>solving scheduling problems - particularly timetabling?
>
> David Pickles.

The following was announced a while ago:

----------------------------------------------

October 1989  LU TP 89-19

"TEACHERS AND CLASSES" WITH NEURAL NETWORKS
Lars Gislen, Carsten Peterson and Bo Soderberg
Department of Theoretical Physics, University of Lund
Solvegatan 14A, S-22362 Lund, Sweden

Submitted to International Journal of Neural Systems

Abstract: A convenient mapping and an efficient algorithm for solving scheduling problems within the neural network paradigm is presented. It is based on a reduced encoding scheme and a mean field annealing prescription, which was recently successfully applied to TSP. Most scheduling problems are characterized by a set of hard and soft constraints. The prime target of this work is the hard constraints. In this domain the algorithm persistently finds legal solutions for quite difficult problems. We also make some exploratory investigations by adding soft constraints with very encouraging results. Our numerical studies cover problem sizes up to O(5*10^4) degrees of freedom with no parameter sensitivity. We stress the importance of adding certain extra terms to the energy functions which are redundant from the encoding point of view but beneficial when it comes to ignoring local minima and to stabilizing the good solutions in the annealing process.

For copies of this report send requests to: THEPCAP at SELDC52.

NOTICE: Those of you who requested our previous report, "A New Way of Mapping Optimization..." (LU TP 89-1), will automatically receive this one, so no request is necessary.

-------------------------------------------------------

-- Evan

Evan W. Steeg  (416) 978-7321            steeg at ai.toronto.edu (CSnet,UUCP,Bitnet)
Dept of Computer Science                 steeg at ai.utoronto (other Bitnet)
University of Toronto,                   steeg at ai.toronto.cdn (EAN X.400)
Toronto, Canada M5S 1A4                  {seismo,watmath}!ai.toronto.edu!steeg
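For readers unfamiliar with the mean-field-annealing recipe mentioned in that abstract: each discrete assignment is replaced by a soft probability vector, repeatedly pushed through a softmax of the local constraint energy while a temperature parameter is lowered. Below is a minimal generic sketch of that recipe in Python; it is not the Gislen et al. reduced encoding, and the conflict matrix, cooling schedule, and constants are assumptions for illustration.

import numpy as np

def mean_field_timetable(conflicts, n_slots, T0=2.0, cooling=0.9, steps=60):
    # conflicts[i, j] = 1 if events i and j must not share a slot
    # (zero diagonal assumed); returns one slot index per event.
    n = conflicts.shape[0]
    rng = np.random.default_rng(0)
    v = rng.dirichlet(np.ones(n_slots), size=n)  # soft assignments v[i, a]
    T = T0
    for _ in range(steps):
        for i in range(n):
            e = conflicts[i] @ v                 # cost of each slot for event i
            w = np.exp(-e / T)
            v[i] = w / w.sum()                   # mean-field (softmax) update
        T *= cooling                             # anneal toward hard assignments
    return v.argmax(axis=1)

# Three mutually conflicting events, three slots: they should end up
# in three distinct slots.
C = np.ones((3, 3)) - np.eye(3)
print(mean_field_timetable(C, 3))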
From sg at corwin.ccs.northeastern.edu Sat May 19 17:22:01 1990
From: sg at corwin.ccs.northeastern.edu (steve gallant)
Date: Sat, 19 May 90 17:22:01 EDT
Subject: TR: Representing Context and Word Disambiguation
Message-ID: <9005192122.AA11525@corwin.CCS.Northeastern.EDU>

The following TR is available:

A Practical Approach for Representing Context And for Performing Word Sense Disambiguation Using Neural Networks

Stephen I. Gallant

ABSTRACT

Representing and manipulating context information is one of the hardest problems in natural language processing. This paper proposes a method for representing some context information so that the correct meaning for a word in a sentence can be selected. The approach is based upon work by Waltz & Pollack, who emphasized neurally plausible systems. By contrast, this paper focuses upon computationally feasible methods applicable to full-scale natural language processing systems.

There are two key elements: a collection of context vectors defined for every word used by a natural language processing system, and a context algorithm that computes a dynamic context vector at any position in a body of text. Once the dynamic context vector has been computed, it is easy to choose among competing meanings for a word. This choice of definitions is essentially a neural network computation, and neural network learning algorithms should be able to improve the system's choices.

Although context vectors do not represent all context information, their use should improve those full-scale systems that have avoided context as being too difficult to deal with. Good candidates for full-scale context vector implementations are machine translation systems and text retrieval systems. A main goal of this paper is to encourage such large scale implementations and tests of context vector approaches.

A variety of interesting directions for research in natural language processing and machine learning will be possible once a full set of context vectors has been created. In particular, the development of more powerful context algorithms will be an important topic for future research.

-----------------

Copies are available by Email only. To obtain a Latex copy, send mail to `sg at corwin.ccs.northeastern.edu'. Please do not post to other lists. Please be careful not to reply to the entire connectionist list!

From aibanez at iai.es Mon May 21 11:31:00 1990
From: aibanez at iai.es (Alberto Ibáñez Rodríguez)
Date: 21 May 90 16:31 +0100
Subject: Linear separability
Message-ID: <2*aibanez@iai.es>

About a couple of months ago we sent a question to the list concerning the existence of a fast method to determine whether two subsets of the set of vertices of a hypercube are linearly separable (every vertex in the hypercube falls in one of the subsets). Most of the mails that were sent dealt with linear programming and perceptrons, and some with the Walsh transform, convex polytopes, or k-summability. Thank you to everyone for the interest and for dedicating time to that interesting discussion.

The question arose from a simple method we arrived at, which looks like it works. However, we have not been able to prove that it will keep working in high-dimensional hypercubes. The idea is as follows: a subset of vertices (as defined formerly) is linearly separable from the other iff there exists a hyperplane perpendicular to the segment that joins the barycentres of the subsets in which the hypercube has been divided. If this proposition is correct, we can project every point in the subsets on this segment and find out if both sets of projections are separated. If so, the hyperplane equation can easily be calculated. We would appreciate any comments, and especially if someone finds out how to prove or disprove it.

Thank you very much

Alberto Ibáñez et al.
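As a concrete reading of the proposed test, here is a minimal Python sketch: compute the centroid of each class, project every labelled vertex onto the centroid-difference direction, and check whether the two sets of projections are separated. It implements the criterion exactly as stated, for experimentation only; it neither proves nor disproves the conjecture, and it assumes both classes are non-empty.

import numpy as np
from itertools import product

def centroid_projection_separable(labels):
    # labels: dict mapping each vertex (a 0/1 tuple) of the n-cube to True/False
    pos = np.array([v for v, c in labels.items() if c], dtype=float)
    neg = np.array([v for v, c in labels.items() if not c], dtype=float)
    d = pos.mean(axis=0) - neg.mean(axis=0)  # direction joining the centroids
    p, q = pos @ d, neg @ d                  # projections of each class
    return p.min() > q.max() or q.min() > p.max()

# AND on the 2-cube is linearly separable; XOR is not.
cube = list(product((0, 1), repeat=2))
print(centroid_projection_separable({v: (v[0] & v[1]) == 1 for v in cube}))  # True
print(centroid_projection_separable({v: (v[0] ^ v[1]) == 1 for v in cube}))  # False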
From MURRE%HLERUL55.BITNET at VMA.CC.CMU.EDU Tue May 22 12:29:00 1990
From: MURRE%HLERUL55.BITNET at VMA.CC.CMU.EDU (MURRE%HLERUL55.BITNET@VMA.CC.CMU.EDU)
Date: Tue, 22 May 90 12:29 MET
Subject: request positions for practical work
Message-ID: <8F1858EAE8BF000B99@HLERUL55.BITNET>

Request for positions for practical work in connectionist modelling

We have several good students who are interested in periods of practical work abroad (i.e., outside the Netherlands). These periods are usually three to six months, and the student is expected to take part in some ongoing research project. Most students are in their final year (close to getting their drs. degree, comparable to a USA master's degree) in experimental psychology, and they are particularly interested in connectionist modelling of cognitive processes. All of them have followed at least one intensive course in connectionism, and they are all experienced with psychological experiments. The practical work could be in the area of modelling or experimentation. The students are expected to provide their own financing (travel, housing, etc.), but they are not supposed to pay tuition.

If you consider having one of the students of our connectionist group for practical work, please write me or send an e-mail to the address below. If you too have students that want to spend some time abroad, we might think of some sort of exchange.

Jaap M.J. Murre
Leiden University
Unit of Experimental and Theoretical Psychology
P.O. Box 9555
2300 RB Leiden
The Netherlands
tel.: 31-71-273631
fax : 31-71-273619
E-Mail: MURRE at HLERUL55.Bitnet

From oruiz at fi.upm.es Tue May 22 12:00:00 1990
From: oruiz at fi.upm.es (Oscar Ruiz)
Date: 22 May 90 18:00 +0200
Subject: mensa.
Message-ID: <11*oruiz@fi.upm.es>

I am very interested in the dynamical behavior of neurons, and especially in the influence of the decay parameter that appears in their dynamic equation. I would like information about any work developed with this kind of neuron in pattern recognition, mainly in sequences of patterns, in feedforward networks or recurrent networks. At the moment I am trying to obtain the following three papers:

G. Kuhn. "Connected Recognition with a Recurrent Network". Proceedings NEUROSPEECH, 18 May 1989, special issue of Speech Communication, v 9, num. 2 (1990).

W.S. Stornetta, T. Hogg and B.A. Huberman. "A Dynamical Approach to Temporal Pattern Processing", 1988, Neural Information Processing Systems, Editor A. Anderson. New York: American Institute of Physics.

J.L. Elman, "Finding structure in time", CRL Technical Report 8801, University of California, San Diego, Center for Research in Language, La Jolla, 1988.

If anybody has information about these papers, please contact me. I want to know what they are about, and perhaps a brief abstract would suffice for me. Thank you in advance.

From paul at NMSU.Edu Tue May 22 14:48:39 1990
From: paul at NMSU.Edu (paul@NMSU.Edu)
Date: Tue, 22 May 90 12:48:39 MDT
Subject: No subject
Message-ID: <9005221848.AA08906@NMSU.Edu>

PLEASE DISTRIBUTE THE FOLLOWING ANNOUNCEMENT IN YOUR DEPARTMENT/LABORATORY:

Cut---------------------------------------------------------------------------

PRAGMATICS IN ARTIFICIAL INTELLIGENCE
5th Rocky Mountain Conference on Artificial Intelligence (RMCAI-90)
Science Hall and Music Center Auditorium
New Mexico State University
Las Cruces, New Mexico, USA, June 28-30, 1990

PRAGMATICS PROBLEM: The problem of pragmatics in AI is one of developing theories, models, and implementations of systems that make effective use of contextual information to solve problems in changing environments.

CONFERENCE GOAL: This conference will provide a forum for researchers from all subfields of AI to discuss the problem of pragmatics in AI. The implications that each area has for the others in tackling this problem are of particular interest.
COOPERATION:
American Association for Artificial Intelligence (AAAI)
IEEE Computer Society

SPONSORSHIP:
Association for Computing Machinery (ACM)
Computing Research Laboratory (CRL), NMSU
Special Interest Group on Artificial Intelligence (SIGART)
U S WEST Advanced Technologies
and the Rocky Mountain Society for Artificial Intelligence (RMSAI)

INVITED SPEAKERS: The following researchers are invited to present papers at the conference:
*Martin Casdagli, Los Alamos National Laboratory, Los Alamos USA
*Arthur Cater, University College Dublin, Ireland EC
*Jerry Feldman, University of California at Berkeley, Berkeley USA & International Computer Science Institute, Berkeley USA
*Barbara Grosz, Harvard University, Cambridge USA
*James Martin, University of Colorado at Boulder, Boulder USA
*Derek Partridge, University of Exeter, United Kingdom EC
*Roger Schank, Northwestern University, Illinois, USA
*Philip Stenton, Hewlett Packard, United Kingdom EC
*Robert Wilensky, University of California at Berkeley, Berkeley USA

SUBMITTED PAPERS: In addition, over 40 papers on pragmatics in AI have been accepted for the conference.

THE LAND OF ENCHANTMENT: Las Cruces lies in THE LAND OF ENCHANTMENT (New Mexico), USA, and is situated in the Rio Grande Corridor with the scenic Organ Mountains overlooking the city. The city is close to Mexico, Carlsbad Caverns, and White Sands National Monument. There are a number of Indian Reservations and Pueblos in the Land Of Enchantment, and the cultural and scenic cities of Taos and Santa Fe lie to the north. New Mexico has an interesting mixture of Indian, Mexican and Spanish culture. There is quite a variation of Mexican and New Mexican food to be found here too.

GENERAL INFORMATION: The Rocky Mountain Conference on Artificial Intelligence is a major regional forum in the USA for scientific exchange and presentation of AI research. The conference emphasizes discussion and informal interaction as well as presentations. The conference encourages the presentation of completed research, ongoing research, and preliminary investigations. Researchers from both within and outside the region are invited to participate.

DEADLINES:
Pre-registration: June 1st, 1990
Final papers due: June 1st, 1990

TRANSPORT: Las Cruces, New Mexico is located one hour from El Paso, Texas on I-10 West. Participants can fly into El Paso International Airport, and transport will be provided from and to the airport.

SOCIALS: The conference will include a registration reception buffet, a going-away party full buffet, a banquet with banquet speaker (+ $25.00), and numerous refreshments.

HOTELS: The Las Cruces Hilton has rooms for $47.00 per night. (Call 1-800-284-0616; the cutoff date is June 13th.) Accommodation is also available in other hotels and motels.

REGISTRATION:
Pre-Registration: Professionals: $50.00; Students $30.00 (Pre-Registration cutoff date is June 1st 1990)
Registration: Professionals: $70.00; Students $50.00 (at the conference)
(Copied proof of student status is required.)

Registration form (IN BLOCK CAPITALS). Enclose payment made out to New Mexico State University. (ONLY checks in US dollars will be accepted.) Send to the following address (MARKED REGISTRATION):

Local Arrangements Chairperson, RMCAI-90
Computing Research Laboratory
Dept. 3CRL, Box 30001, NMSU
Las Cruces, NM 88003-0001, USA.
Name: _______________________________  E-mail: _____________________________
Phone: __________________________
Affiliation: ____________________________________________________
Fax: ____________________________________________________
Address: ____________________________________________________
         ____________________________________________________
         ____________________________________________________
COUNTRY: __________________________________________

LOCAL ARRANGEMENTS: Local Arrangements Chairperson, RMCAI-90 (same postal address as above).

INQUIRIES: Inquiries regarding conference brochure and registration form should be addressed to the Local Arrangements Chairperson. Inquiries regarding the conference program should be addressed to the Program Chairperson.

Local Arrangements Chairperson: E-mail: INTERNET: rmcai at nmsu.edu; Phone: (+1 505)-646-5466; Fax: (+1 505)-646-6218.

Program Chairperson: E-mail: INTERNET: paul at sparta.nmsu.edu; Phone: (+1 505)-646-5109; Fax: (+1 505)-646-6218. Paul Mc Kevitt, Program Chairperson, RMCAI-90, Computing Research Laboratory (CRL), Dept. 3CRL, Box 30001, New Mexico State University, Las Cruces, NM 88003-0001, USA.

TOPICS OF INTEREST: You are invited to submit a research paper addressing Pragmatics in AI, with any of the following orientations:
Philosophy, Foundations and Methodology
Knowledge Representation
Neural Networks and Connectionism
Genetic Algorithms, Emergent Computation, Nonlinear Systems
Natural Language and Speech Understanding
Problem Solving, Planning, Reasoning
Machine Learning
Vision and Robotics
Applications

TENTATIVE CONFERENCE SCHEDULE:

RMCAI-90 CONFERENCE SCHEDULE

WEDNESDAY 27th June 1990:
6:00 pm - 10:00 pm: Registration and Reception, Double Eagle, Old Mesilla

THURSDAY 28th June 1990:
8:50 am: Yorick Wilks and Paul Mc Kevitt: Welcome
9:00 am: Invited talk: Jerry Feldman, UC Berkeley
  Miniature Language Acquisition: A Paradigm problem and some approaches
10:00 am: Coffee
10:30 am - 12:30 pm: Three tracks of submitted papers.

TRACK A:
PRACMA: Processing Arguments between Controversially-Minded Agents
  Jurgen Allgayer : Alfred Kobsa : Carola Reddig : Norbert Reithinger
Relevant Beliefs
  Afzal Ballim : Yorick Wilks
Speech Acts and Mental States
  Robbert-Jan Beun
Extensions of Constraints on Speech Act Ambiguity
  Elizabeth A. Hinkelman

TRACK B:
Dynamic Route Planning
  E. Cortes-Rello : F. Golshani
Strategic Planning System (SPS)
  Mitchell Smith : Peter Briggs : Edward Freeman
Re-planning a Route - A Pragmatic Approach
  Wai-Kiang Yeap
Evaluation of Pragmatics Processing in a Direction Finding Domain
  Deborah A. Dahl

TRACK C:
Computing with Fast Modulation: Experiments with Biologically Realistic Model Neurons
  Mark DeYong : Randall Findley : Chris Fields
Competition and Selection in Neural Networks with Distributed Representations
  Kankanahalli Srinivas : John Barnden
Using Genetic Algorithms as a Post-Processor for Improving Vehicle Routing Solutions
  Nagesh Kadaba : Kendall E. Nygard
An Application of Neural Networks in Robotics
  Dr. Behzad Ghavimi

12:30 pm - 2:00 pm: Lunch
2:00 pm: Invited talk: Robert Wilensky, UC Berkeley, USA
3:00 pm - 3:30 pm: Coffee
3:30 pm - 4:30 pm: Invited talk: Phil Stenton, HP Laboratories, Bristol, UK
  Putting NL to work: A dialogue modeling approach
4:30 pm - 5:30 pm: Three tracks of submitted talks

TRACK A:
Using relational knowledge structures to handle null value situations in natural language interfaces
  Nick Cercone : Dan Fass : Chris Groeneboer : Gary Hall : Mimi Kao : Paul McFetridge : Fred Popowich
A Classification of User-System Interactions in Natural Language with Special Reference to
  Dan Fass : Nick Cercone : Gary Hall : Chris Groeneboer : Paul McFetridge : Fred Popowich

TRACK B:
Problem Solving Experience and Problem Solving Knowledge
  Stephen W. Smoliar
An Abstraction-Partitioned Model for Reactive Planning
  Lee Spector : James A. Hendler

TRACK C:
A Graph Theoretic Basis for Problem Solving
  Daniel P. Eshner : Heather D. Pfeiffer
Meta-Structures: Intelligent Structures for Inference Control
  Daniel J. Goter : David E. Monarchi

FRIDAY 29th June 1990:
9:00 am: Invited talk: Barbara Grosz, Harvard University
  Collaborative Planning for Discourse
10:00 am: Coffee
10:30 am - 12:30 pm: Three tracks of submitted papers

TRACK A:
Why Does Language Matter to Artificial Intelligence
  Marcelo Dascal
Pragmatics of Postdeterminers, Non-restrictive Modifications & Wh-phrases
  Frens J.H. Dols
Pragmatics and Natural Language Processing
  Eduard H. Hovy
On the Semantics of the Conjunction "but"
  Wlodek Zadrozny : Karen Jensen

TRACK B:
How to Become Immune to Facts
  M.J. Coombs : R.T. Hartley : W.B. Kilgore : H.D. Pfeiffer
Constrained Rational Agency
  Bruce D'Ambrosio : Tony Fountain : Lothar Kaul
Abductive Inference in AI: Potential Unifications
  Venugopala Rao Dasigi
A Prolog Implementation of the Stable Model TMS
  Stephen Pimentel : John L. Cuadrado

TRACK C:
Multiple Level Island Search
  Peter C. Nelson : John F. Dillenburg
Efficient Learning with Representative Presentations
  Xiaofeng (Charles) Ling
User Modelling in a Knowledge-Based Environment for European Learning
  Michael F. McTear : Norman Creaney : Weiru Liu
Training a Neural Network to be a Context Sensitive Grammar
  Robert F. Simmons : Yeong-Ho Yu

12:30 pm - 2:00 pm: Lunch
2:00 pm: Invited talk: Roger Schank, Northwestern University
3:00 pm - 3:30 pm: Coffee
3:30 pm - 4:30 pm: Invited talk: Arthur Cater, University College Dublin, Ireland
4:30 pm - 5:30 pm: Three tracks of submitted papers

TRACK A:
Towards Empirically Derived Semantic Classes
  Brian M. Slator : Shahrzad Amirsoleymani : Sandra Andersen : Kent Braaten : John Davis : Rhonda Ficek : Hossein Hakimzadeh : Lester McCann : Joseph Rajkumar : Sam Thangiah : Daniel Thureen
Using Words
  Louise Guthrie : Paul Mc Kevitt : Yorick Wilks

TRACK B:
An Expert Tool for Digital Circuit Design
  F.N. Sibai : K. L. Watson
Explaining Control Strategy in Second Generation Expert Systems
  Xuejun Tong

TRACK C:
A New Approach to Analyzing Aerial Photographics
  Dwayne Phillips
Acquiring Categorical Aspects: A Connectionist Account of Figurative Noun Semantics
  Susan Hollbach Weber

6:00 pm - 9:00 pm: Japanese Buffet in Garden Center (Budagher's)

SATURDAY 30th June 1990:
9:00 am: Invited talk: Derek Partridge, University of Exeter, UK
10:00 am: Coffee
10:30 am - 11:30 am: Two tracks of submitted papers

TRACK A:
An Experiment on Technical Text Reproduction
  Wanying Jin
Explanation Dialogues: Interpreting Real Life Questions & Explanations
  Efstratios Sarantinos : Peter Johnson
Modeling of mind and its application to image sequence understanding
  Naoyuki Okada

TRACK B:
Communication and Belief Changes in a Society of Agents
  Graca Gaspar
An Interval Calculus Based Finite Domain Constraint and its Implementation in Prolog
  Jin-Kao Hao : Jean-Jacques Chabrier
Dynamic Context Diagrams: the pragmatics of social interaction in KBS development
  Simon P.H. Morgan

11:30 am - 1:30 pm: Lunch
1:30 pm - 2:30 pm: Invited talk: James Martin, University of Colorado at Boulder
  A Unified Approach To Conventional Non-Literal Language
2:30 pm - 3:30 pm: Invited talk: Martin Casdagli, Los Alamos National Laboratories
  Pragmatic Artificial Neural Nets for the Nonlinear Prediction of Time Series
3:00 pm - 3:30 pm: Coffee
6:00 pm - 9:00 pm: Banquet (Double Eagle)

*****************************

PROGRAM COMMITTEE:
*John Barnden, New Mexico State University (Connectionism, Beliefs, Metaphor processing)
*Hans Brunner, U S WEST Advanced Technologies (Natural language interfaces, Dialogue interfaces)
*Martin Casdagli, Los Alamos National Laboratory (Dynamical systems, Artificial neural networks, Applications)
*Mike Coombs, New Mexico State University (Problem solving, Adaptive systems, Planning)
*Dan Eshner, University of Maryland (Planning, Search, Knowledge Representation)
*Thomas Eskridge, Lockheed Missile and Space Co. (Analogy, Problem solving)
*Chris Fields, New Mexico State University (Neural networks, Nonlinear systems, Applications)
*Roger Hartley, New Mexico State University (Knowledge Representation, Planning, Problem Solving)
*Victor Johnson, New Mexico State University (Genetic Algorithms)
*Paul Mc Kevitt, New Mexico State University (Natural language interfaces, Dialogue modeling)
*Joe Pfeiffer, New Mexico State University (Computer Vision, Parallel architectures)
*Keith Phillips, University of Colorado at Colorado Springs (Computer vision, Mathematical modelling)
*Roger Schvaneveldt, New Mexico State University (Knowledge representation, Knowledge elicitation, Cognitive modeling)
*Brian Slator, North Dakota State University (Natural language processing, Knowledge acquisition)
*Yorick Wilks, New Mexico State University (Natural language processing, Knowledge representation)
*Scott Wolff, U S WEST Advanced Technologies (Intelligent tutoring, User interface design, Cognitive modeling)

Organizing Committee RMCAI-90:
Paul Mc Kevitt, Research Scientist, CRL
Yorick Wilks, Director, CRL, and Professor, Computer Science, NMSU

cut------------------------------------------------------------------------

From aibanez at iai.es Wed May 23 05:44:00 1990
From: aibanez at iai.es (Alberto Ibáñez Rodríguez)
Date: 23 May 90 10:44 +0100
Subject: Linear separability
Message-ID: <5*aibanez@iai.es>

After sending to the list the mail concerning the existence of a criterion to prove the linear separability of two subsets of the set of vertices of a given n-dimensional hypercube, we have realized that the word 'barycentre' is not correct (it does not appear in Webster's). We apologize for this slight mistake (barycentric does appear). The correct term would be 'centroid', 'centre of gravity' or 'centre of mass'.

Alberto Ibáñez et al.

From YAEGER.L at AppleLink.Apple.COM Wed May 23 08:29:00 1990
From: YAEGER.L at AppleLink.Apple.COM (Yaeger, Larry)
Date: 23 May 90 12:29 GMT
Subject: character recognition
Message-ID: <1481495@AppleLink.Apple.COM>

There have been a number of published works on the use of neural networks for handprinted character recognition. All the work of this type that I recall hearing or reading about has been based on either pixel maps or feature vectors derived from pixel maps. Segmentation was either provided a priori, or was a separate step in the recognition process that did not utilize a connectionist approach (and was usually the weakest link in the process). Is there any published (or unpublished) work on a connectionist approach to handprinted character recognition that utilizes stroke data -- time-sequences of input-device position -- as direct input to the network (whether predicting segmentation or having it supplied)? Is there an existing database of such stroke data (whether gathered for connectionist models or not)?

Thanks in advance for any information you might be able to pass on. If there is an outpouring of references on the subject I'll collect them and post to this list.

- larryy at apple.com [please use this address regardless of what your mailer says]

P.S. Be careful not to reply to the entire list unless you intend to...

From Bill_McKellin at mtsg.ubc.ca Wed May 23 12:17:44 1990
From: Bill_McKellin at mtsg.ubc.ca (Bill_McKellin@mtsg.ubc.ca)
Date: Wed, 23 May 90 09:17:44 PDT
Subject: metaphor
Message-ID: <2251711@mtsg.ubc.ca>

I am trying to locate material on connectionist approaches to metaphor and analogical forms of discourse processing.
If you are working in these areas (or know of others who are), would you please contact me with a description of your research and references to appropriate articles.

From fritz_dg%ncsd.dnet at gte.com Wed May 23 17:58:51 1990
From: fritz_dg%ncsd.dnet at gte.com (fritz_dg%ncsd.dnet@gte.com)
Date: Wed, 23 May 90 17:58:51 -0400
Subject: requests for information
Message-ID: <9005232158.AA05552@bunny.gte.com>

Briefly, I see a fair number of ISOs without a return address, leaving no alternative but to respond to the entire list. Explicit inclusion of an email address would be helpful.

Dave F. fritz_dg%ncsd at gte.com

From AMR at IBM.COM Thu May 24 14:06:39 1990
From: AMR at IBM.COM (AMR@IBM.COM)
Date: Thu, 24 May 90 14:06:39 EDT
Subject: Formal Properties
Message-ID:

Some time ago I requested references (together with hard copies if possible) to work on formal properties (such as Turing equivalence or otherwise) of models used in connectionist research. I would like to renew that request now, since there must be more out there than what I have received to date (while thanking those who have already responded).

From oruiz at fi.upm.es Thu May 24 12:30:00 1990
From: oruiz at fi.upm.es (Oscar Ruiz)
Date: 24 May 90 18:30 +0200
Subject: paper request
Message-ID: <18*oruiz@fi.upm.es>

I am interested in finding the following article by Cybenko, G. (1988): "Approximation by Superpositions of a Sigmoidal Function", and another about the implementation of functions with neural networks. I am also trying to find a probably unpublished paper of E.F. Moore: "Counter-Example to a Conjecture of McCluskey and Paull" (1957).

Miguel A. Lerma

P.S.: I am sharing this email account with Oscar Ruiz and Javier Segovia.

From issnnet at bucasb.bu.edu Fri May 25 13:29:35 1990
From: issnnet at bucasb.bu.edu (issnnet@bucasb.bu.edu)
Date: Fri, 25 May 90 13:29:35 EDT
Subject: ISSNNet meeting/dinner at IJCNN
Message-ID: <9005251729.AA18951@thalamus.bu.edu>

The International Student Society for Neural Networks (ISSNNet) will hold a meeting on Monday night, June 18, during the IJCNN conference in San Diego. All interested parties are welcome to join us. We are planning to organize a cheap, quick meal right before or after the meeting, so participants may attend the evening plenary talks. We also expect to get a lot of people together after the plenaries and head over to some local establishment (you do not need to be a member to join us there :-). Exact details will be available at registration or at the ISSNNet booth during the conference. For more information send email to: issnnet at bucasb.bu.edu

From biafore%cs at ucsd.edu Mon May 28 23:51:47 1990
From: biafore%cs at ucsd.edu (Louis Steven Biafore)
Date: Mon, 28 May 90 20:51:47 PDT
Subject: IJCNN Reminder
Message-ID: <9005290351.AA01290@beowulf.ucsd.edu>

............................................................

International Joint Conference on Neural Networks
San Diego, CA. - June 17-21, 1990

The 1990 IJCNN is sponsored by the IEEE Council on Neural Networks and the International Neural Network Society (INNS). The IJCNN will cover the full spectrum of neural computing, from theory such as neurodynamics to applications such as machine vision. Meet leading experts and practitioners during the largest conference in the field. For further information contact Nomi Feldman, Meeting Management, 5665 Oberlin Dr., Suite 110, San Diego, CA 92121. Telephone (619) 453-6222.
Registration - The conference registration fee includes admission to all sessions, the exhibit area, the Sunday Welcome Reception and the Wednesday Party. TUTORIALS ARE NOT INCLUDED. The registration fee is $280. Single-day registration is available for $110 (proceedings not included). Full-time students may attend for $50, proceedings and Wednesday Party not included.

Schedule of Events
Sunday 17 June: TUTORIALS (8 am - 6 pm), RECEPTION (6 pm - 8 pm), INDUSTRY PANEL (8 pm - 10 pm)
Monday 18 June: TECHNICAL SESSIONS (8 am - 5 pm), BIOENGINEERING PANEL (12 pm - 1:30 pm), PLENARY SESSIONS (8 pm - 10 pm)
Tuesday 19 June: TECHNICAL SESSIONS (8 am - 5 pm), PLENARY SESSIONS (8 pm - 10 pm)
Wednesday 20 June: TECHNICAL SESSIONS (8 am - 5 pm), PARTY (6 pm - 8 pm), GOVERNMENT PANEL (8 pm - 10 pm)
Thursday 21 June: TECHNICAL SESSIONS (8 am - 5 pm)

Tutorials - Thirteen tutorials are planned for Sunday 17 June:
Adaptive Sensory-Motor Control - Stephen Grossberg
Associative Memory - Bart Kosko
Chaos for Engineers - Leon Chua
Dynamical Systems Review - Morris Hirsch
LMS Techniques in Neural Networks - Bernard Widrow
Neural Network Applications - Robert Hecht-Nielsen
Neurobiology I: Neurons and Simple Networks - Walter Freeman
Neurobiology II: Advanced Networks - Allen Selverston
Optical Neurocomputers - Demetri Psaltis
Reinforcement Learning - Andrew Barto
Self-Organizing Feature Maps - Teuvo Kohonen
Vision - John Daugman
VLSI Technology and Neural Network Chips - Lawrence Jackel

Exhibits - Exhibitors will present innovations in neural networks, including neurocomputers, VLSI neural networks, implementations, software systems and applications. IJCNN is the neural network industry's largest tradeshow. Vendors may contact Richard Rea at (619) 222-7447 for additional information.

Accommodations - IJCNN 90 will be held at the San Diego Marriott Hotel on San Diego Bay, (619) 234-1500.

............................................................

Please direct questions to the appropriate individual as specified above (please don't send questions to me).

S. Biafore - UCSD

From oruiz at fi.upm.es Tue May 29 09:47:00 1990
From: oruiz at fi.upm.es (Oscar Ruiz)
Date: 29 May 90 15:47 +0200
Subject: counter-example
Message-ID: <25*oruiz@fi.upm.es>

I still don't have an answer to my request for a counter-example to a conjecture of McCluskey and Paull. Roughly, the conjecture was the following: an n-argument truth function is linearly separable if and only if there exist no four vertices of the n-cube that form a parallelogram, one pair of whose diagonal points are true vertices and the other pair false vertices. It is easy to show that the condition is necessary, but E.F. Moore proved in 1957 that it is not sufficient. I would like to know Moore's counter-example, or any other one.

Miguel A. Lerma
Sancho Davila 18
28028 MADRID
SPAIN
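For anyone who wants to hunt for such a counter-example by machine, here is a small brute-force Python sketch. It tests the parallelogram condition as stated above (four distinct vertices a, b, c, d with a + c = b + d componentwise, the diagonal pair {a, c} true and {b, d} false) against actual linear separability checked by linear programming. The use of scipy's linprog and the unit-margin convention are implementation choices, not part of the conjecture, and the loop may well find nothing for small n.

import numpy as np
from itertools import product, combinations
from scipy.optimize import linprog

def linearly_separable(true_set, false_set, n):
    # LP feasibility: find w, t with w.x >= t+1 on true and w.x <= t-1 on false
    rows = [list(-np.array(x, float)) + [1.0] for x in true_set] + \
           [list(np.array(x, float)) + [-1.0] for x in false_set]
    res = linprog(np.zeros(n + 1), A_ub=np.array(rows),
                  b_ub=-np.ones(len(rows)), bounds=(None, None))
    return res.success

def violates_parallelogram(true_set, false_set):
    # Is there a parallelogram (a + c == b + d) whose diagonal {a, c} is
    # true while the other diagonal {b, d} is false?
    for a, c in combinations(true_set, 2):
        for b, d in combinations(false_set, 2):
            if tuple(np.add(a, c)) == tuple(np.add(b, d)):
                return True
    return False

n = 3   # cost grows very fast with n; Moore's example lives in higher dimension
cube = list(product((0, 1), repeat=n))
for bits in product((0, 1), repeat=len(cube)):
    true_set = [v for v, s in zip(cube, bits) if s]
    false_set = [v for v, s in zip(cube, bits) if not s]
    if not true_set or not false_set:
        continue
    if not violates_parallelogram(true_set, false_set) \
            and not linearly_separable(true_set, false_set, n):
        print("counter-example:", true_set)
        break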
From LUBTODI%YALEVM.BITNET at vma.CC.CMU.EDU Tue May 29 14:24:00 1990
From: LUBTODI%YALEVM.BITNET at vma.CC.CMU.EDU (LUBTODI%YALEVM.BITNET@vma.CC.CMU.EDU)
Date: Tue, 29 May 90 14:24 EDT
Subject: bibliog on high-level tasks
Message-ID:

I have prepared a bibliography on the application of connectionist models to high-level cognitive tasks. High-level cognitive tasks include: analogical thinking, evidential reasoning/decision making, and complex (multistage) problem solving. The bibliography also includes some related work on schemata, scripts, sequential information processing, mental models, and rule-like processing. I thank members of this mailing list who suggested references for this bibliography.

If you would like a copy, please send me your US MAIL address.

Todd Lubart, Yale Univ., Dept. of Psychology, Box 11A Yale Station, New Haven CT 06520
bitnet: LUBTODI at YALEVM

From jacobs at gluttony.cs.umass.edu Wed May 30 10:26:52 1990
From: jacobs at gluttony.cs.umass.edu (jacobs@gluttony.cs.umass.edu)
Date: Wed, 30 May 90 10:26:52 EDT
Subject: new technical report
Message-ID: <9005301426.AA01120@ANW.edu>

The following technical report is now available:

Task Decomposition Through Competition in a Modular Connectionist Architecture

Robert A. Jacobs
Department of Computer & Information Science
University of Massachusetts
Amherst, MA 01003

COINS Technical Report 90-44

Abstract
--------
A novel modular connectionist architecture is presented in which the networks composing the architecture compete to learn the training patterns. As a result of the competition, different networks learn different training patterns and, thus, learn to compute different functions. The architecture performs task decomposition in the sense that it learns to partition a task into two or more functionally independent tasks and allocates distinct networks to learn each task. In addition, the architecture tends to allocate to each task the network whose topology is most appropriate to that task, and tends to allocate the same network to similar tasks and distinct networks to dissimilar tasks. Furthermore, it can be easily modified so as to learn to perform a family of tasks by using one network to learn a shared strategy that is used in all contexts along with other networks that learn modifications to this strategy that are applied in a context sensitive manner. These properties are demonstrated by training the architecture to perform object recognition and spatial localization from simulated retinal images, and to control a simulated robot arm to move a variety of payloads, each of a different mass, along a specified trajectory. Finally, it is noted that function decomposition is an underconstrained problem and, thus, different modular architectures may decompose a function in different ways. A desirable decomposition can be achieved if the architecture is suitably restricted in the types of functions that it can compute. Finding appropriate restrictions is possible through the application of domain knowledge. A strength of the modular architecture is that its structure is well-suited for incorporating domain knowledge.

Please note that this technical report is my Ph.D. thesis and, thus, is considerably longer than the typical technical report (125 pages). If possible, please obtain a postscript version of this technical report from the pub/neuroprose directory at cheops.cis.ohio-state.edu.
a) Here are the directions:

unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): neuron
ftp> cd pub/neuroprose
ftp> type binary
ftp> get (remote-file) jacobs.thesis1.ps.Z (local-file) foo1.ps.Z
ftp> get (remote-file) jacobs.thesis2.ps.Z (local-file) foo2.ps.Z
ftp> get (remote-file) jacobs.thesis3.ps.Z (local-file) foo3.ps.Z
ftp> get (remote-file) jacobs.thesis4.ps.Z (local-file) foo4.ps.Z
ftp> quit
unix> uncompress foo1.ps.Z
unix> uncompress foo2.ps.Z
unix> uncompress foo3.ps.Z
unix> uncompress foo4.ps.Z
unix> lpr -P(your_local_postscript_printer) foo1.ps
unix> lpr -P(your_local_postscript_printer) foo2.ps
unix> lpr -P(your_local_postscript_printer) foo3.ps
unix> lpr -P(your_local_postscript_printer) foo4.ps

If your printer dies because the size of a file exceeds the printer's memory capacity, then please try the -s option to the lpr command (see the manual page for the lpr command).

b) You can also use the Getps script posted on the connectionist mailing list a few months ago.

If you do not have access to a postscript printer, copies of this technical report can be obtained by sending requests to Connie Smith at smith at cs.umass.edu. Remember to ask for COINS Technical Report 90-44.
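As a rough illustration of the competitive mechanism the abstract above describes, here is a minimal Python sketch in the spirit of what this line of work developed into (mixtures of competing expert networks): several small networks propose outputs, a softmax over their negative errors assigns responsibility, and each network's weights are updated in proportion to its responsibility, so networks specialise on different patterns. All details below (linear experts, the temperature, the learning rate) are assumptions for illustration, not Jacobs' actual architecture.

import numpy as np

rng = np.random.default_rng(0)

class LinearExpert:
    def __init__(self, n_in, n_out):
        self.W = rng.normal(0, 0.1, (n_out, n_in))
    def __call__(self, x):
        return self.W @ x

def competitive_step(experts, x, y, lr=0.1, temp=0.1):
    # One training step: the experts compete for the pattern (x, y).
    outs = [e(x) for e in experts]
    errs = np.array([np.sum((y - o) ** 2) for o in outs])
    resp = np.exp(-errs / temp)
    resp /= resp.sum()                       # soft responsibility per expert
    for e, o, r in zip(experts, outs, resp):
        e.W += lr * r * np.outer(y - o, x)   # the winner learns the most,
                                             # and so claims similar patterns
    return resp

# Example usage on one random pattern:
experts = [LinearExpert(4, 2) for _ in range(3)]
print(competitive_step(experts, rng.normal(size=4), np.array([1.0, -1.0])))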
From reynolds at bucasb.bu.edu Wed May 30 13:44:35 1990
From: reynolds at bucasb.bu.edu (reynolds@bucasb.bu.edu)
Date: Wed, 30 May 90 13:44:35 EDT
Subject: bibliog on high-level tasks
In-Reply-To: connectionists@c.cs.cmu.edu's message of 30 May 90 07:40:56 GM
Message-ID: <9005301744.AA16598@thalamus.bu.edu>

I would be quite interested in seeing your bibliography. My mailing address is:

John Reynolds
Cognitive and Neural Systems
111 Cummington Street
Boston University
Boston, MA 02215

Thank you, John

From R0MJW at IBM.COM Wed May 30 15:25:05 1990
From: R0MJW at IBM.COM (Michael Witbrock)
Date: Wed, 30 May 90 14:25:05 -0500
Subject: bibliog on high-level tasks
In-Reply-To: Your message of Wed, 30 May 90 13:44:35 EDT. <9005301744.AA16598@thalamus.bu.edu>
Message-ID: <9005301825.AA05423@mjw.watson.ibm.com>

You sent your request to everyone. I'm a former maintainer of ``connectionists''. The fact that your message has an 'In-reply-to' field indicates that you used the reply function of your mailer. Because of the possibility of inadvertently mailing your notes around the world, ``Reply'' is not recommended with connectionists.

thanks \michael

From al at gmdzi.uucp Thu May 31 07:31:58 1990
From: al at gmdzi.uucp (Alexander Linden)
Date: Thu, 31 May 90 09:31:58 -0200
Subject: Special Issue on Neural Networks
Message-ID: <9005310731.AA08372@gmdzi.UUCP>

Special Issue on Neural Networks in Parallel Computing (To appear in August)

This special issue focuses on the third generation of neural networks, which can be characterized as being heterogeneous, modular and asynchronous.

Contents:

H. Muehlenbein: Limitations of Multilayer Perceptrons: Towards Genetic Neural Networks
F. Smieja, H. Muehlenbein: The Geometry of Multilayer Perceptron Solutions
J. Kindermann, A. Linden: Inversion of Neural Networks by Gradient Descent
T. E. Lange: Simulation of Heterogeneous Neural Networks on Serial and Parallel Machines
A. Singer: Implementations of Artificial Neural Networks on the Connection Machine
X. Zhang et al.: The Backpropagation Algorithm on Grid and Hypercube Architectures
M. Witbrock, M. Zagha: An Implementation of Backpropagation Learning on GF11, a Large SIMD Parallel Computer
D. Whitley et al.: Genetic Algorithms and Neural Networks: Optimizing Connections and Connectivity
M. Tenorio: Topology Synthesis Networks: Self Organization of Structure and Weight Adjustment as a Learning Paradigm
K. Obermayer: Large Scale Simulations of Self-Organizing Neural Networks on Parallel Computers: Application for Biological Modelling
R. Kentridge: Neural Networks for Learning in the Real World: Representation, Reinforcement and Dynamics

-------------------------------------------------------------------------

HOW TO ORDER: The publisher is offering a special service. Copies of this issue at a price of $25 can be obtained from

Dr. F. van Drunen
Elsevier Science Publishers
Mathematics and Computer Science Section
P.O. BOX 103
1000 AC Amsterdam
The Netherlands
FAX: +31-10-5862-616

-------------------------------------------------------------------------

Copies of the first three papers can be obtained from

GMD
c/o Sekretariat Z1.HLRZ
P.O. BOX 1240
D-5205 Sankt Augustin 1
West Germany
FAX +49 - 2241 - 142618

Heinz Muehlenbein
From noel%cs.exeter.ac.uk at nsfnet-relay.AC.UK Tue May 1 10:28:46 1990
From: noel%cs.exeter.ac.uk at nsfnet-relay.AC.UK (Noel Sharkey)
Date: Tue, 1 May 90 10:28:46 BST
Subject: weight spaces
In-Reply-To: ray@au.oz.su.cs.cluster's message of Mon, 30 Apr 1990 19:09:43 +1000 <9004300912.26100@munnari.oz.au>
Message-ID: <17159.9005010928@entropy.cs.exeter.ac.uk>

I will get cracking on the intro as well - did you get abstracts? I would like to set a real deadline for going to the publisher on may 12. I think you should tell stenning that the papers will go to the publishers on may 10th and we have to write an intro. so he is really to late. i didn't realise this when I spoke to him. (yes i now think you are right about the fast one.). lets give him no choice.

noel

From David.Servan-Schreiber at A.GP.CS.CMU.EDU Wed May 2 12:25:46 1990
From: David.Servan-Schreiber at A.GP.CS.CMU.EDU (David.Servan-Schreiber@A.GP.CS.CMU.EDU)
Date: Wed, 02 May 90 12:25:46 EDT
Subject: TR available
Message-ID: <6169.641665546@A.GP.CS.CMU.EDU>

A Parallel Distributed Processing Approach to Behavior and Biology in Schizophrenia.

Jonathan D. Cohen and David Servan-Schreiber
Technical Report AIP-100
Department of Psychology and School of Computer Science
Carnegie Mellon
Pittsburgh, PA 15123

ABSTRACT

In this paper, we illustrate the use of connectionist models to explore the relationship between biological variables and cognitive deficits in schizophrenia. In the first part of the paper, we describe schizophrenic cognitive deficits in three experimental tasks that tap attention and language processing abilities. We also review biological disturbances that have been reported involving the frontal lobes and the mesocortical dopamine system. In the second part of the paper we present three computer models, each of which simulates normal performance in one of the cognitive tasks described initially. These models were developed within the connectionist (or parallel distributed processing) framework. At the behavioral level, the models suggest that a disturbance in the processing of context can account for schizophrenic patterns of performance in both attention and language-related tasks. At the same time, the models incorporate a mechanism for processing context that can be identified with the function of prefrontal cortex, and a parameter that corresponds to the neuromodulatory effects of dopamine. A disturbance in this parameter in the component of the model corresponding to function of prefrontal cortex is sufficient to account for schizophrenic patterns of performance in all three of the cognitive tasks simulated.
Thus, the models offer an explanatory mechanism linking performance deficits to a disturbance in the processing of context which, in turn, is attributed to a reduction of dopaminergic activity in prefrontal cortex. In the General Discussion, we consider the implications that these models have for our understanding of both normal and schizophrenic cognition. We conclude with a discussion of some general issues concerning the use of computer simulation models in research.

This report is available at no charge. Please send requests to jc5e at andrew.cmu.edu.

From noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK Wed May 2 11:31:50 1990
From: noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK (Noel Sharkey)
Date: Wed, 2 May 90 11:31:50 BST
Subject: weight spaces
In-Reply-To: Noel Sharkey's message of Tue, 1 May 90 10:28:46 BST <17159.9005010928@entropy.cs.exeter.ac.uk>
Message-ID: <17550.9005021031@entropy.cs.exeter.ac.uk>

I apologise for the appearance of personal mail on the mailing list yesterday. I have no idea how it happened.

From David.Servan-Schreiber at A.GP.CS.CMU.EDU Wed May 2 10:58:37 1990
From: David.Servan-Schreiber at A.GP.CS.CMU.EDU (David.Servan-Schreiber@A.GP.CS.CMU.EDU)
Date: Wed, 02 May 90 10:58:37 EDT
Subject: Recurrent Linguistic Domain Papers?
In-Reply-To: Your message of Fri, 27 Apr 90 01:35:00 -0500.
Message-ID: <4508.641660317@A.GP.CS.CMU.EDU>

Tom,

Axel Cleeremans, Jay McClelland and I have also worked on simple recurrent networks (SRNs) and their ability to discover finite state and recurrent grammars from exemplars. We have shown that, during training with exemplars generated from a finite state grammar, an SRN progressively encodes more and more temporal context. We also explored the conditions under which the network can carry information about distant sequential contingencies across intervening elements to distant elements. Such information is retained with relative ease if it is relevant at each intermediate step of a sequence; it tends to be lost when intervening elements do not depend on it. However, in a more complex simulation, we showed that long distance sequential contingencies can be encoded by an SRN even if only subtle statistical properties of embedded strings depend on the early information. Our interpretation of this phenomenon is that the network encodes long-distance dependencies by *shading* internal representations that are responsible for processing common embeddings in otherwise different sequences. This ability to represent simultaneously similarities and differences between several sequences relies on the graded nature of representations used by the network, which contrast with the finite states of traditional automata. For this reason, in our more recent work we have started to call such networks *Graded State Machines*.

Axel and Jay have also shown that learning and processing in such graded state machines accounts nicely for the way in which human subjects improve and perform in an implicit finite-state grammar learning experiment.

Finally, in addition to Jeff Elman's and Jordan Pollack's work, Bob Allen has also done some interesting experiments with recurrent networks and discovery of sequential structures. Unfortunately I cannot put my hands on the appropriate references just now, but he can be contacted at RBA at flash.bellcore.com.

Cleeremans A, and McClelland J (submitted to Cognitive Science). Learning the Structure of Event Sequences.
Available from the first author, Dept. of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213.

Cleeremans A, Servan-Schreiber D, and McClelland J (1989). Finite State Automata and Simple Recurrent Networks. Neural Computation 1:372-381.

Servan-Schreiber D, Cleeremans A, and McClelland J (1988). Encoding Sequential Structure in Simple Recurrent Networks. Technical Report CMU-CS-183, Carnegie Mellon University (orders taken by copetas at cs.cmu.edu, no charge).

From mel at aurel.cns.caltech.edu Wed May 2 12:36:15 1990
From: mel at aurel.cns.caltech.edu (Bartlett Mel)
Date: Wed, 2 May 90 09:36:15 PDT
Subject: TR available
Message-ID: <9005021636.AA08904@aurel.cns.caltech.edu>

**********DO NOT FORWARD TO OTHER BBOARDS**************
**********DO NOT FORWARD TO OTHER BBOARDS**************
**********DO NOT FORWARD TO OTHER BBOARDS**************

The following TR is now available. A postscript version can be gotten by the usual anonymous ftp (see below). If you can't use the postscript version, you can get a hardcopy by sending a postcard to:

C. Hochenedel
Division of Biology
Caltech, 216-76
Pasadena, CA 91125

________________________________________

THE SIGMA-PI COLUMN: A MODEL OF ASSOCIATIVE LEARNING IN CEREBRAL NEOCORTEX

Bartlett W. Mel
Computation and Neural Systems Program, 216-76
California Institute of Technology
Pasadena, California 91125
mel at aurel.cns.caltech.edu

ABSTRACT

In this paper we present a model of associative learning in cerebral neocortex. The extrinsically-projecting pyramidal cells of layers 2, 3, and 5 of association cortex are modeled as sigma-pi units, where a sigma-pi unit computes its activation level as a sum of contributions from a set of multiplicative (or locally-thresholded) clusters of synapses distributed throughout its dendritic tree. The model demonstrates how a broad class of biologically-relevant nonlinear associative learning problems can be solved in this system by modifying only a single layer of excitatory synapses under the control of a Hebb-type learning rule. The model also accounts for a variety of features of cortical anatomy, physiology, and biophysics whose relations to learning have remained poorly understood. These include, (1) three learning-related roles for the NMDA channel, one of them new, (2) the gross asymmetry in number and patterns of termination of excitatory vs. inhibitory synapses onto cortical pyramidal cells, as well as the apparent lack of plasticity at inhibitory synapses, (3) the replication of like-activated neurons beneath a single point in cerebral cortex, and in particular the clumping of apical dendrites of pyramidal cells on their rise to the cortical surface, (4) the complex 3-dimensional arborizations of axons and dendrites in layer 1, which give rise to a rich ``combinatorial'' association interface crucial to the current model, and (5) putative rules for activity-dependent axon growth and synaptogenesis during associative learning.
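For readers who have not met the term: a sigma-pi unit, as defined in the abstract above, sums weighted contributions from clusters of synapses, each cluster contributing the product of its inputs. A minimal Python sketch follows; the particular cluster assignment and weights are arbitrary illustrations, and the locally-thresholded variant would replace the product with a thresholded version of it.

import numpy as np

def sigma_pi(x, clusters, w):
    # Activation of a sigma-pi unit: a weighted sum over synapse clusters,
    # each cluster contributing the product of its inputs.
    # x: input vector; clusters: list of index tuples; w: one weight per cluster.
    return sum(wj * np.prod(x[list(c)]) for wj, c in zip(w, clusters))

# Example: y = 0.5*x0*x1 + 1.5*x2 + 2.0*x1*x3*x4
x = np.array([1.0, 0.0, 1.0, 1.0, 1.0])
print(sigma_pi(x, [(0, 1), (2,), (1, 3, 4)], [0.5, 1.5, 2.0]))  # -> 1.5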
Here is what you need to do to get them: unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62) Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get (remote-file) mel.sigmapi1.ps.Z (local-file) foo1.ps.Z ftp> get (remote-file) mel.sigmapi2.ps.Z (local-file) foo2.ps.Z ftp> get (remote-file) mel.sigmapi3.ps.Z (local-file) foo3.ps.Z ftp> quit unix> uncompress foo1.ps.Z foo2.ps.Z foo3.ps.Z unix> lpr -Pxx foo1.ps foo2.ps foo3.ps (xx is the name of your local postscript printer.) _______________________________________________________________________ From slehar at bucasb.bu.edu Wed May 2 08:23:47 1990 From: slehar at bucasb.bu.edu (slehar@bucasb.bu.edu) Date: Wed, 2 May 90 08:23:47 EDT Subject: INNC - Call for Volunteers In-Reply-To: connectionists@c.cs.cmu.edu's message of 1 May 90 23:45:40 GM Message-ID: <9005021223.AA16219@thalamus.bu.edu> Please consider me for volunteering at the INNC conference. Steven Lehar 350 Marlborough St, Boston MA 02115 USA slehar at bucasb.bu.edu AM shift NOTE: I will be giving a presentation, I'm not sure exactly when, but my schedule must of course conform with my presentation. From rr%cstr.edinburgh.ac.uk at NSFnet-Relay.AC.UK Thu May 3 10:22:02 1990 From: rr%cstr.edinburgh.ac.uk at NSFnet-Relay.AC.UK (Richard Rohwer) Date: Thu, 3 May 90 10:22:02 BST Subject: Recurrent Linguistic Domain Papers? Message-ID: <10665.9005030922@cstr.ed.ac.uk> D. Servan-Schreiber writes, regarding simple recurrent networks... > [...] > We also explored the conditions under which the network can carry > information about distant sequential contingencies across intervening > elements to distant elements. Such information is retained with relative > ease if it is relevant at each intermediate step of a sequence; it tends to > be lost when intervening elements do not depend on it. [...] The `Moving Targets' training algorithm (which is a bit of a pig in a lot of ways) can deal with situations in which information from the distant past is required in order to make a correct decision in the present. It can do this because error information is communicated through time by additive tradeoffs in the cost function (which has contributions from hidden nodes as well as target nodes), rather than by the multiplicative processes derived from the chain rule. I make no claims about biological plausibility. References: R. Rohwer, "The `Moving Targets' Training Algorithm" in Proc. EURASIP Workshop on Neural Networks, Springer-Verlag Lecture Notes in Computer Science No. 412. (1990). R. Rohwer, "The `Moving Targets' Training Algorithm" to appear in Proc. NIPS 1989 Richard Rohwer JANET: rr at uk.ac.ed.cstr Centre for Speech Technology Research ARPA: rr%ed.cstr at nsfnet-relay.ac.uk Edinburgh University BITNET: rr at cstr.ed.ac.uk, 80, South Bridge rr%cstr.ed.UKACRL Edinburgh EH1 1HN, Scotland UUCP: ...!{seismo,decvax,ihnp4} !mcvax!ukc!cstr!rr From rich at gte.com Thu May 3 12:12:19 1990 From: rich at gte.com (Rich Sutton) Date: Thu, 3 May 90 12:12:19 -0400 Subject: Preprint announcement Message-ID: <9005031612.AA18444@bunny.gte.com> How could a connectionist network _plan_ a sequence of actions before doing them? The following preprint describes one answer. --------------- INTEGRATED ARCHITECTURES FOR LEARNING, PLANNING, AND REACTING BASED ON APPROXIMATING DYNAMIC PROGRAMMING Richard S. Sutton GTE Labs Abstract This paper extends previous work with Dyna, a class of architectures for intelligent systems based on approximating dynamic programming methods. 
Dyna architectures integrate trial-and-error (reinforcement) learning and execution-time planning into a single process operating alternately on the world and on a learned model of the world. In this paper, I present and show results for two Dyna architectures. The Dyna-PI architecture is based on dynamic programming's policy iteration method and can be related to existing AI ideas such as evaluation functions and universal plans (reactive systems). Using a navigation task, results are shown for a simple Dyna-PI system which simultaneously learns by trial and error, learns a world model, and plans optimal routes using the evolving world model. The Dyna-Q architecture is based on Watkins's Q-learning, a new kind of reinforcement learning. Dyna-Q uses a less familiar set of data structures than does Dyna-PI, but is arguably simpler to implement and use. We show that Dyna-Q architectures are easy to adapt for use in changing environments. --------------- This paper will appear in the proceedings of the Seventh International Conference on Machine Learning, to be held June, 1990. For copies, send a request with your US MAIL address to: clc2 at gte.com From tp at irst.it Thu May 3 14:41:07 1990 From: tp at irst.it (Tomaso Poggio) Date: Thu, 3 May 90 20:41:07 +0200 Subject: Preprint announcement In-Reply-To: Rich Sutton's message of Thu, 3 May 90 12:12:19 -0400 <9005031612.AA18444@bunny.gte.com> Message-ID: <9005031841.AA06800@caneva.irst.it> From tsejnowski at UCSD.EDU Sat May 5 19:53:44 1990 From: tsejnowski at UCSD.EDU (Terry Sejnowski) Date: Sat, 5 May 90 16:53:44 PDT Subject: Neural Computation 2:1 Message-ID: <9005052353.AA16019@sdbio2.UCSD.EDU> Reviews: Generalized Deformable Model, Statistical Physics, and Matching Problems Alan L. 
Yuille Letters: An Optoelectronic Architecture for Multilayer Learning in a Single Photorefractive Crystal Carsten Peterson, Stephen Redfield, James D. Keeler, and Eric Hartman VLSI Implementation of Neural Classifiers Arun Rao, Mark R. Walker, Lawrence T. Clark, L. A. Akers, R. O. Grodin Coherent Compound Motion: Corners and Nonrigid Configurations Steven W. Zucker, Lee Iverson, and Robert A. Hummel A Complementarity Mechanism for Enhanced Pattern Processing James L. Adams Hebb-Type Dynamics Is Sufficient to Account for the Inverse Magnification Rule in Cortical Somatotopy Kamil A. Grajski and Michael M. Merzenich Optimal Plasticity from Matrix Memories: What Goes Up Must Come Down David Willshaw and Peter Dayan A Syntactically Structured Associative Memory DeLiang Wang, Joachim Buhmann, and Christoph von der Malsburg A Neural Net Associative Memory for Real-Time Applications Gregory L. Heileman, George M. Papadourakis, and Michael Georgiopoulos Gram-Schmidt Neural Networks Sophocles J. Orfanidis The Perceptron Algorithm Is Fast for Nonmalicious Distributions Eric B. Baum SUBSCRIPTIONS: Volume 2 ______ $35 Student ______ $50 Individual ______ $100 Institution Add $12 for postage outside USA and Canada surface mail. Add $18 for air mail. (Back issues of volume 1 are available for $25 each.) MIT Press Journals, 55 Hayward Street, Cambridge, MA 02142. (617) 253-2889. ----- From jose at learning.siemens.com Mon May 7 08:38:53 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Mon, 7 May 90 07:38:53 EST Subject: NIPS UPDATE Message-ID: <9005071238.AA11377@learning.siemens.com.siemens.com> *************NIPS UPDATE***************** Note there are less than 2 weeks left for your submission of NIPS abstracts. Please send your abstracts by MAY 17th. Mail Submissions To: John Moody, NIPS*90 Submissions, Department of Computer Science, Yale University, P.O. Box 2158 Yale Station, New Haven, Conn. 06520. Mail Requests For Registration Material To: Kathie Hibbard, NIPS*90 Local Committee, Engineering Center, University of Colorado, Campus Box 425, Boulder, CO 80309-0425. DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 17, 1990 (see big green poster for more detail on NIPS topics for abstracts and summaries) please tell your friends --Steve *************NIPS UPDATE***************** From nips-90 at CS.YALE.EDU Tue May 8 15:49:33 1990 From: nips-90 at CS.YALE.EDU (nips90) Date: Tue, 8 May 90 15:49:33 EDT Subject: NIPS SUBMISSIONS AND REGISTRATION Message-ID: <9005081949.AA03897@CASPER.SUN2.CS.YALE.EDU> 
This message corrects a previous message sent out yesterday. *************NIPS SUBMISSIONS AND REGISTRATION***************** Note there is about only 1 week left for your submission to NIPS. Please send six copies of both your 50-100 word abstracts and 1000 word summaries by MAY 17th to: John Moody NIPS*90 Submissions Department of Computer Science Yale University P.O. Box 2158 Yale Station New Haven, Conn. 06520 ****ALL SUBMITTING AUTHORS WILL BE SENT REGISTRATION**** *******MATERIALS AUTOMATICALLY!******* DEADLINE FOR SUMMARIES & ABSTRACTS IS MAY 17, 1990 (see big green poster for more detail on NIPS topics for abstracts and summaries) *************NIPS REGISTRATION ONLY!***************** If you are not sending in a submission for NIPS, but would still like to attend, please request registration materials from: Kathie Hibbard NIPS*90 Local Committee Engineering Center University of Colorado Campus Box 425 Boulder, CO 80309-0425 -- John Moody Program Chairman ------- From weili at wpi.wpi.edu Tue May 8 17:54:09 1990 From: weili at wpi.wpi.edu (Wei Li) Date: Tue, 8 May 90 16:54:09 EST Subject: neural networks apply to ATN Message-ID: <9005082154.AA09434@wpi.wpi.edu> Hi, we are very interested in knowing whether neural networks can solve ATN (Augmented Transition Network) problems. If so, can a neural network do it like a recursive process or some other type of process? (ATN for natural language processing). weili at wpi.wpi.edu or apache!weil at uunet.uu.net Thanks for any information. From Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU Tue May 8 19:49:28 1990 From: Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave.Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Tue, 08 May 90 19:49:28 EDT Subject: NIPS proceedings Message-ID: <3746.642210568@DST.BOLTZ.CS.CMU.EDU> The proceedings of the 1989 NIPS conference have started arriving in people's mailboxes. They were supposed to be out a few weeks ago, but there was a problem with the quality of the binding, so Morgan Kaufmann sent the whole batch back to the bindery to be redone. The second time around they got it perfect. If you are an author or co-author of a paper in the volume, OR if you attended the conference, you should receive a copy of the proceedings. If you don't get yours some time this week, call Morgan Kaufmann on Monday to check on it. Their number is 415-578-9911; ask for Shirley Jowell. If you would like to order extra copies of the proceedings, they are available from: Morgan Kaufmann Publishers 2929 Campus Drive, Suite 260 San Mateo, CA 94403 tel. 415-965-4081 (order department) fax: 415-578-0672. 
Enclose a check for $35.95 per copy, plus shipping charge of $3.50 for the first copy and $2.50 for each additional copy. California residents must add sales tax. There are higher shipping charges for air mail or international orders; contact the publisher for information. Note: the catalog code for this volume is "100-7"; include that in your order. An example of proper citation format for the volume is: Cowan, J. D. (1990) Neural networks: the early days. In D. S. Touretzky (ed.), Advances in Neural Information Processing Systems 2, pp. 828-842. San Mateo, CA: Morgan Kaufmann. -- Dave From noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK Wed May 9 13:33:07 1990 From: noel%cs.exeter.ac.uk at NSFnet-Relay.AC.UK (Noel Sharkey) Date: Wed, 9 May 90 13:33:07 BST Subject: psychologists Message-ID: <19754.9005091233@entropy.cs.exeter.ac.uk> ******************** CALL FOR PAPERS ****************** CONNECTION SCIENCE SPECIAL ISSUE CONNECTIONIST MODELLING OF PSYCHOLOGICAL PROCESSES EDITOR Noel Sharkey SPECIAL BOARD Jim Anderson Andy Barto Thomas Bever Glyn Humphries Walter Kintsch Dennis Norris Ronan Reilly Dave Rumelhart The journal Connection Science would like to encourage submissions from researchers modelling psychological data or conducting experiments comparing models within the connectionist framework. Papers of this nature may be submitted to our regular issues or to the special issue. Authors wishing to submit papers to the special issue should mark them SPECIAL PSYCHOLOGY ISSUE. Good quality papers not accepted for the special issue may appear in later regular issues. DEADLINE FOR SUBMISSION 12th October, 1990. Notification of acceptance or rejection will be by the end of December/beginning of January. From lyn%cs.exeter.ac.uk at NSFnet-Relay.AC.UK Wed May 9 13:45:18 1990 From: lyn%cs.exeter.ac.uk at NSFnet-Relay.AC.UK (Lyn Shackleton) Date: Wed, 9 May 90 13:45:18 BST Subject: Special issue for Connection Science Message-ID: <19910.9005091245@exsc.cs.exeter.ac.uk> ******************** CALL FOR PAPERS ****************** CONNECTION SCIENCE SPECIAL ISSUE CONNECTIONIST MODELLING OF PSYCHOLOGICAL PROCESSES EDITOR Noel Sharkey SPECIAL BOARD Jim Anderson Andy Barto Thomas Bever Glyn Humphries Walter Kintsch Dennis Norris Ronan Reilly Dave Rumelhart The journal Connection Science would like to encourage submissions from researchers modelling psychological data or conducting experiments comparing models within the connectionist framework. Papers of this nature may be submitted to our regular issues or to the special issue. Authors wishing to submit papers to the special issue should mark them SPECIAL PSYCHOLOGY ISSUE. Good quality papers not accepted for the special issue may appear in later regular issues. DEADLINE FOR SUBMISSION 12th October, 1990. Notification of acceptance or rejection will be by the end of December/beginning of January. Submissions should be sent to: lyn shackleton Centre for Connection Science JANET: lyn at uk.ac.exeter.cs Dept. Computer Science University of Exeter UUCP: !ukc!expya!lyn Exeter EX4 4PT Devon BITNET: lyn at cs.exeter.ac.uk.UKACRL U.K. From jfeldman%icsib2.Berkeley.EDU at jade.berkeley.edu Wed May 9 12:18:52 1990 From: jfeldman%icsib2.Berkeley.EDU at jade.berkeley.edu (Jerry Feldman) Date: Wed, 9 May 90 09:18:52 PDT Subject: ICSI Deputy Ad Message-ID: <9005091618.AA01252@icsib2.berkeley.edu.> We are starting a search for a full-time Deputy Director for the International Computer Science Institute (ICSI). 
We would highly appreciate any help you can give us in this search. The enclosed ad describes the position. Please feel free to distribute it electronically to anybody who might be interested. Thank you in advance, and best regards. Jerry Feldman Domenico Ferrari PS: If you need more information about duties and perks, please let us know. =============================================================== DEPUTY DIRECTOR International Computer Science Institute Nominations and Applications are solicited for the position of Deputy Director of the International Computer Science Institute. The Institute is an independent basic research laboratory affiliated with and physically near the University of California at Berkeley. Support comes from U.S. sources and sponsor nations, currently Germany, Italy and Switzerland. The Deputy Director will have the primary responsibility for the internal administration of the Institute and its post-doctoral and exchange programs with sponsor nations. There are also many opportunities for new initiatives. The position is like the chair of a research oriented computer science department and the ideal candidate would have such experience. ICSI is also expanding its research staff and welcomes applications from outstanding scientists at any post-doctoral level. Please respond to: Dr. Domenico Ferrari Deputy Director International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704-1105 From hinton at ai.toronto.edu Wed May 9 16:09:56 1990 From: hinton at ai.toronto.edu (Geoffrey Hinton) Date: Wed, 9 May 1990 16:09:56 -0400 Subject: image compression Message-ID: <90May9.161014edt.8256@ephemeral.ai.toronto.edu> We are doing some work on lossless image compression. (i.e. the image must be transmitted using as few bits as possible, but must be perfectly reconstructed after transmission). 1. Does anyone know of any neural network work on lossless image compression? (We know about vector quantization, autoencoders, etc, but these are lossy techniques that wouldn't be too good for compressing your disk files etc.) 2. Does anyone have an image on which other lossless techniques have been tried so that we can compare our technique? Thanks Geoff From harnad at clarity.Princeton.EDU Wed May 9 16:06:51 1990 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Wed, 9 May 90 16:06:51 EDT Subject: Optimality: BBS Call for Commentators Message-ID: <9005092006.AA02403@reason.Princeton.EDU> Below is the abstract of a forthcoming target article to appear in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. To be considered as a commentator or to suggest other appropriate commentators, please send email to: harnad at clarity.princeton.edu or write to: BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771] Please specify the aspect of the article that you are qualified and interested to comment upon. If you are not a current BBS Associate, please send your CV and/or the name of a current Associate who would be prepared to nominate you. ____________________________________________________________________ The Quest for Optimality: A Positive Heuristic of Science? Paul J. H. 
Schoemaker Center for Decision Research Graduate School of Business University of Chicago Chicago, IL 6063 Abstract This paper examines the strengths and weaknesses of one of science's most pervasive and flexible metaprinciples: Optimality is used to explain utility maximization in economics, least effort principles in physics, entropy in chemistry, and survival of the fittest in biology. Fermat's principle of least time involves both teleological and causal considerations, two distinct modes of explanation resting on poorly understood psychological primitives. The rationality heuristic in economics provides an example from social science of the potential biases arising from the extreme flexibility of optimality considerations, including selective search for confirming evidence, ex post rationalization, and the confusion of prediction with explanation. Commentators are asked to reflect on the extent to which optimality is (1) an organizing principle of nature, (2) a set of relatively unconnected techniques of science, (3) a normative principle for rational choice and social organization, (4) a metaphysical way of looking at the world, or (5) something else still. Key Words: Optimization, Variational Principles, Rationality, Explanation, Evolution, Economics, Adaptation, Causality, Heuristics, Biases, Sociobiology, Control Theory, Homeostasis, Entropy, Regulation. From weissg at lan.informatik.tu-muenchen.dbp.de Wed May 9 20:35:00 1990 From: weissg at lan.informatik.tu-muenchen.dbp.de (Gerhard Weiss) Date: 09 May 90 20:35 GMT-0200 Subject: reports available Message-ID: <9005091635.AA09834@tumult.informatik.tu-muenchen.de> *** Do not use 'REPLY' *** The following two reports are available. COMBINING NEURAL AND EVOLUTIONARY LEARNING: ASPECTS AND APPROACHES Report FKI-132-90 Gerhard Weiss This report focusses on the intersection of neural and evolutionary learning and shows basic aspects of and current approaches to the combination of these two learning paradigms. Advantages and difficulties of such a combination are described. Approaches from both the field of artificial intelligence and the neurosciences are surveyed. A number of related works as well as extensive references to further literature are presented. Contents: - Hybrid approaches in artificial intelligence . Evolutionary design of artificial neural networks . Evolutionary training of artificial neural networks . Further hybrid approaches and related works - Selective theories in the neurosciences . The evolutionary selection circuits model of learning (Conrad et al.) . The theories of selective stabilization of synapses and pre-representations (Changeux et al.) . The theory of neuronal group selection (Edelman) ARTIFICIAL NEURAL LEARNING Report FKI-127-90 Gerhard Weiss This report provides an introductory overview of the foundations and the principles of learning in artificial neural networks. Contents: - General aspects (artificial neural nets / adaptation rules / gradient-following / ...) 
- Supervised learning (perceptron convergence procedure / backprop / Boltzmann learning) - Associative reinforcement learning (associative reward-penalty algorithm / reinforcement comparison procedures / REINFORCE algorithms) - Unsupervised learning (topology-preserving feature maps / adaptive resonance theory / development of feature analyzing cells) REQUESTS FOR COPIES: weissg at lan.informatik.tu-muenchen.dbp.de -> Please use subject: FKI-127 or FKI-132 or FKI-127+132 -> Please leave only your physical address -> Those who already asked for copies will receive them without any further request OTHER CORRESPONDENCE: weissg at tumult.informatik.tu-muenchen.de or Gerhard Weiss Institut fuer Informatik -H2- Technische Universitaet Muenchen Arcisstrasse 21 D-8000 Muenchen 2 Fed.Rep.Germany From mark at cis.ohio-state.edu Thu May 10 08:13:19 1990 From: mark at cis.ohio-state.edu (Mark Jansen) Date: Thu, 10 May 90 08:13:19 -0400 Subject: image compression Message-ID: <9005101213.AA29337@giza.cis.ohio-state.edu> There is a professor here at OSU in the department of electrical engineering, I believe his name is Aho, who is working on image compression using neural nets, but it is not lossless compression. From bms%dcs.leeds.ac.uk at NSFnet-Relay.AC.UK Thu May 10 11:44:31 1990 From: bms%dcs.leeds.ac.uk at NSFnet-Relay.AC.UK (B M Smith) Date: Thu, 10 May 90 11:44:31 BST Subject: Item for Distribution Message-ID: <10920.9005101044@csuna6.dcs.leeds.ac.uk> CALL FOR PAPERS AISB'91 8th SSAISB CONFERENCE ON ARTIFICIAL INTELLIGENCE University of Leeds, UK 16-19 April, 1991 The Society for the Study of Artificial Intelligence and Simulation of Behaviour (SSAISB) will hold its eighth biennial conference at Bodington Hall, University of Leeds, from 16 to 19 April 1991. There will be a Tutorial Programme on 16 April followed by the full Technical Programme. The Programme Chair will be Luc Steels (AI Lab, Vrije Universiteit Brussel). Scope: Papers are sought in all areas of Artificial Intelligence and Simulation of Behaviour, but especially on the following AISB91 special themes: * Emergent functionality in autonomous agents * Neural networks and self-organisation * Constraint logic programming * Knowledge level expert systems research Papers may describe theoretical or practical work but should make a significant and original contribution to knowledge about the field of Artificial Intelligence. A prize of 500 pounds for the best paper has been offered by British Telecom Computing (Advanced Technology Group). It is expected that the proceedings will be published as a book. Submission: All submissions should be in hardcopy in letter quality print and should be written in 12 point or pica typewriter face on A4 or 8.5" x 11" paper, and should be no longer than 10 sides, single-spaced. Each paper should contain an abstract of not more than 200 words and a list of up to four keywords or phrases describing the content of the paper. Five copies should be submitted. Papers must be written in English. Authors should give an electronic mail address where possible. Submission of a paper implies that all authors have obtained all necessary clearances from the institution and that an author will attend the conference to present the paper if it is accepted. Papers should describe work that will be unpublished on the date of the conference. 
Dates: Deadline for Submission: 1 October 1990 Notification of Acceptance: 7 December 1990 Deadline for camera ready copy: 16 January 1991 Information: Papers and all queries regarding the programme should be sent to Judith Dennison. All other correspondence and queries regarding the conference to the Local Organiser, Barbara Smith. Ms. Judith Dennison Dr. Barbara Smith Cognitive Sciences Division of AI University of Sussex School of Computer Studies Falmer University of Leeds Brighton BN1 9QN Leeds LS2 9JT UK UK Tel: (+44) 273 678379 Tel: (+44) 532 334627 Email: judithd at cogs.sussex.ac.uk FAX: (+44) 532 335468 Email: aisb91 at ai.leeds.ac.uk From dario%TECHUNIX.BITNET at vma.CC.CMU.EDU Thu May 10 10:54:42 1990 From: dario%TECHUNIX.BITNET at vma.CC.CMU.EDU (Dario Ringach) Date: Thu, 10 May 90 17:54:42 +0300 Subject: image compression In-Reply-To: Geoffrey Hinton "image compression" (May 9, 4:09pm) Message-ID: <9005101454.AA10878@techunix.bitnet> I wouldn't expect NNs to perform better on error-free encoding than any of the standard algorithms (Lempel-Ziv for instance)... As far as the testing picture you are after, I think most of the vision community will agree that the famous "Lena" is the usual picture used for comparison... If you can tolerate errors in the reconstruction, then there are a couple of nice works I'm aware of: Daugman used a net to find the Gabor expansion of a picture and then compress it, and Sanger used Oja/Kohonen units + an orthogonalization procedure to obtain convergence to the first eigenfunction/eigenvalues of the Karhunen- Loeve expansion, and of course used it to compress the picture. If anyone is interested I can look for the exact references. -- Dario. From kammen at aurel.cns.caltech.edu Thu May 10 14:33:32 1990 From: kammen at aurel.cns.caltech.edu (Dan Kammen) Date: Thu, 10 May 90 11:33:32 PDT Subject: No subject Message-ID: <9005101833.AA19470@aurel.cns.caltech.edu> TOPIC: PAPER FOR DISSEMINATION WE HAVE RECENTLY COMPLETED AND SUBMITTED (N. NETWORKS) THE FOLLOWING PAPER WHICH SHOULD BE OF INTEREST BOTH TO PERSONS MODELING NEUROBIOLOGICAL NETWORKS AND THOSE DESIGNING SELF-ORGANIZING ALGORITHMS: CORRELATIONS IN HIGH DIMENSIONAL OR ASYMMETRIC DATA SETS: HEBBIAN NEURONAL PROCESSING WILLIAM R.SOFTKY and DANIEL M. KAMMEN Computation and Neural Systems Program California Institute of Technology Pasadena, CA 91125 ABSTRACT The Hebbian neural learning algorithm that implements Principal Component Analysis (PCA) can be extended for the analysis of more realistic forms of neural data by including higher than 2-channel correlations and non-Euclidean (l_P; l-sub-P) metrics. Maximizing a D-th rank tensor form which correlates D channels is equivalent to raising the exponential order of variance correlation from 2 to D in the algorithm that implements PCA. Simulations suggest that a generalized version of Oja's PCA neuron can detect such a D-th order principal component. Arguments from biology and pattern-recognition suggest that neural data in general is not symmetric about its mean; performing PCA with an implicit l_1 metric rather than the Euclidean metric weights exponentially distributed vectors according to their probability, as does a highly nonlinear Hebb rule. The correlation order D and the l_P metric exponent P were each roughly constant for each of several Hebb rules simulated. We propose and discuss a number of these generalized correlation algorithms in terms of natural (biological) and artificial network implementations. 
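[Editorial aside: for readers unfamiliar with the baseline that Softky and Kammen generalize, Oja's standard (second-order) PCA neuron is only a few lines of code. The sketch below shows that baseline only; the D-th order and l_P generalizations studied in the paper are not reproduced here, and all names and parameters are illustrative.]

import numpy as np

def oja_pca_neuron(X, lr=0.01, epochs=50):
    # Train a single linear neuron with Oja's rule on zero-mean data X
    # of shape (n_samples, n_features). For a suitable learning rate the
    # weight vector converges (up to sign) toward the first principal
    # component of X, with its norm self-normalizing toward 1.
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                  # neuron output: projection onto w
            w += lr * y * (x - y * w)  # Hebbian growth term plus decay
    return w

[For Gaussian data with one dominant eigendirection, the returned w lines up with the leading eigenvector of the covariance matrix; the paper's proposal, as described above, raises the order of the correlation from 2 to D and changes the implicit metric.]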
Keywords: Principal Component Analysis, Hebbian learning, self-organization, correlation functions, multi-dimensional analysis, non-Euclidean metrics, information theory, asymmetric coding. Address correspondence or preprint requests to: Dr. D. M. KAMMEN: Division of Biology, 216-76 California Institute of Technology Pasadena, CA 91125 USA kammen at aurel.cns.caltech.edu KAMMEN at CALTECH.BITNET From BARB at REAGAN.AI.MIT.EDU Thu May 10 15:36:00 1990 From: BARB at REAGAN.AI.MIT.EDU (Barbara K. Moore) Date: Thu, 10 May 90 15:36 EDT Subject: image compression In-Reply-To: <9005101454.AA10878@techunix.bitnet> Message-ID: <19900510193614.3.BARB@PENGUIN.AI.MIT.EDU> Sorry for being slightly off-track, but I think this is important: (In response to the suggestion to Geoff that he use "Lena" as an example for his image compression algorithm.) For years I have been bothered by the "pretty woman looking seductive" pictures used all too frequently as examples in machine vision ("Marilyn", "Lena", etc.). I realize that they are often used, but I think it's time for a change. How about a still life, or an animal, or just... people? Barbara Moore (barb at ai.mit.edu) From pkube at UCSD.EDU Fri May 11 01:56:07 1990 From: pkube at UCSD.EDU (Paul Kube) Date: Thu, 10 May 90 22:56:07 PDT Subject: image compression In-Reply-To: Your message of Thu, 10 May 90 15:36:00 EDT. <19900510193614.3.BARB@PENGUIN.AI.MIT.EDU> Message-ID: <9005110556.AA00312@kokoro.ucsd.edu> Barbara Moore is right; the Lena picture offends and should no longer be used as a benchmark image in research publications. A commonly used alternative is the "mandrill picture", obtainable from various places. (Mail me if you need it.) --Paul Kube at ucsd.edu From HKF218%DJUKFA11.BITNET at vma.CC.CMU.EDU Fri May 11 10:02:26 1990 From: HKF218%DJUKFA11.BITNET at vma.CC.CMU.EDU (Gregory Kohring) Date: Fri, 11 May 90 10:02:26 MES Subject: Preprint Message-ID: The following preprint is currently available. -- G.A. Kohring Finite-State Neural Networks: A Step Towards the Simulation of Very Large Systems G.A. Kohring HLRZ an der KFA Julich (Supercomputing Center at the KFA Julich) Abstract Neural networks composed of neurons with Q_N states and synapses with Q_J states are studied analytically and numerically. Analytically it is shown that these finite-state networks are up to 25 times more efficient at information storage than networks with continuous synapses. In order to take the utmost advantage of networks with finite-state elements, a multi-neuron and multi-synapse coding scheme is introduced which allows the simulation of networks having over one billion couplings at a speed of 7.1 billion coupling evaluations per second on a single processor of the Cray-YMP. A local learning algorithm is also introduced which allows for the efficient training of large networks with finite-state elements. Key Words: Neural Networks, Multi-Spin Coding, Replica Method, Finite-State Networks, Learning Algorithms HLRZ-33/90 Send Correspondence and requests for preprints to: G.A. Kohring HLRZ an der KFA Julich Postfach 1913 D-5170 Julich, West Germany From pittman at mcc.com Fri May 11 09:42:26 1990 From: pittman at mcc.com (pittman@mcc.com) Date: Fri, 11 May 90 06:42:26 -0700 Subject: seductive compression Message-ID: <9005111342.AA02870@gluttony.aca.mcc.com> Barbara Moore complains about the "pretty woman looking seductive" and suggests an alternate image of "just... people". 
Perhaps we could get the producers of "The Bob Newhart Show" to submit a shot of Larry, Darrell, and Darrell. We might even get them to look seductive. I don't (at this time) wish to discuss the relative merits of the mandrill over LD&D. Seriously, if you want to appeal to the general public (and therefore also public officials), I suggest you stick with what has funded the photo-developing industry: cute pictures of small children with big smiles on their faces. Jay Pittman (jay.pittman at mcc.com) From sstone%weber at ucsd.edu Fri May 11 13:23:40 1990 From: sstone%weber at ucsd.edu (Allucquere Rosanne Stone) Date: Fri, 11 May 90 10:23:40 pdt Subject: Image compression Message-ID: <9005111723.AA17488@weber.ucsd.edu> I heartily agree with Barbara Moore's suggestion that the stereotype soft-core photographs of women have seen their day and should be retired. Hopefully, by this time not only are there enough women in the profession who find this sort of thing demeaning, but there are enough men who are able to see how softcore photos perpetuate the idea of women as objects. Let's keep our objects within our programming languages. From dlovell at s1.elec.uq.OZ.AU Sat May 12 03:07:06 1990 From: dlovell at s1.elec.uq.OZ.AU (dlovell@s1.elec.uq.OZ.AU) Date: Sat, 12 May 1990 17:07:06 +1000 Subject: No subject Message-ID: <9005120708.1057@munnari.oz.au> From dlovell at s1.elec.uq.oz Sat May 12 18:06:05 1990 From: dlovell at s1.elec.uq.oz (dlovell@s1.elec.uq.oz) Date: Sat, 12 May 90 17:06:05 EST Subject: image compression and mandrills. Message-ID: > > Barbara Moore is right; the Lena picture offends and should no longer > be used as a benchmark image in research publications. A commonly > used alternative is the "mandrill picture", obtainable from various places. > (Mail me if you need it.) > > --Paul Kube at ucsd.edu > > Would that be a soft focus picture of a seductive looking primate perhaps? From koch%HAMLET.BITNET at vma.CC.CMU.EDU Sat May 12 01:50:42 1990 From: koch%HAMLET.BITNET at vma.CC.CMU.EDU (Christof Koch) Date: Fri, 11 May 90 22:50:42 PDT Subject: image compression In-Reply-To: Your message <9005110556.AA00312@kokoro.ucsd.edu> dated 10-May-1990 Message-ID: <900511225018.2260196b@Hamlet.Caltech.Edu> Paul Kube is entirely right. Not only should we not use the Lena picture as a bench mark, but we should also boycott books and museums which display the "Mona Lisa", Botticelli's "Venus" or Gauguin's "Tahiti Nudes". They are all honorable, sorry, offensive pictures. Christof From schraudo%cs at ucsd.edu Sat May 12 18:23:36 1990 From: schraudo%cs at ucsd.edu (Nici Schraudolph) Date: Sat, 12 May 90 15:23:36 PDT Subject: image compression Message-ID: <9005122223.AA07628@beowulf.ucsd.edu> > From: Christof Koch > > Paul Kube is entirely right. Not only should we not use the Lena picture > as a bench mark, but we should also boycott books and museums which > display the "Mona Lisa", Botticelli's "Venus" or Gauguin's "Tahiti Nudes". This comparison demonstrates ignorance of the cultural context in which Da Vinci, Botticelli and Gauguin created their masterpieces. But even if you choose to consider these works as sexist this neither diminishes their cultural value, nor does it excuse you from the social responsibilities of our time. 
-- Nici Schraudolph, C-014 nschraudolph at ucsd.edu University of California, San Diego nschraudolph at ucsd.bitnet La Jolla, CA 92093 ...!ucsd!nschraudolph From dario%TECHUNIX.BITNET at vma.CC.CMU.EDU Sun May 13 00:46:59 1990 From: dario%TECHUNIX.BITNET at vma.CC.CMU.EDU (Dario Ringach) Date: Sun, 13 May 90 07:46:59 +0300 Subject: image compression In-Reply-To: Paul Kube "Re: image compression" (May 10, 10:56pm) Message-ID: <9005130446.AA02576@techunix.bitnet> I want to apologize if I've offended anyone suggesting "Lena"... Now, regarding the references for image compression I mentioned in my previous mail, here they are: T. Sanger, 'Optimal Unsupervised Learning in a Single-Layer Feedforward Neural Network', Neural Networks, Vol. 2, pp. 459-73, 1989. [He has also some extensions of this work as internal MIT publications]. G. Cottrell et al., 'Principal Component Analysis of Images via Back Propagation', SPIE Proc. Visual Communications and Image Processing '88, pp. 1070-77, 1988 J. Daugman, 'Complete Discrete 2-D Gabor Transforms by Neural Networks for Image Analysis and Compression', IEEE Trans. ASSP, Vol. 36, No. 7, pp. 1169-79, 1988 N. Nasrabadi et al. 'Vector Quantization Based Upon the Kohonen Self-Organizing Feature Map', IEEE Conf. on NNs, pp. I-101-8, 1988. The list is surely incomplete... -- Dario Ringach From hbs at lucid.com Sun May 13 04:44:47 1990 From: hbs at lucid.com (Harlan Sexton) Date: Sun, 13 May 90 01:44:47 PDT Subject: image compression In-Reply-To: Nici Schraudolph's message of Sat, 12 May 90 15:23:36 PDT <9005122223.AA07628@beowulf.ucsd.edu> Message-ID: <9005130844.AA00207@kent-state> Do we need to discuss this any further? I think that the position that a more neutral set of benchmark pictures is desirable has been generally accepted (or at least understood), and the discussion seems to be wandering from this point onto topics that belong in some "readnews" category or in private correspondence. --Harlan From hgigley at note.nsf.gov Tue May 15 09:57:30 1990 From: hgigley at note.nsf.gov (Helen M. Gigley) Date: Tue, 15 May 90 09:57:30 EDT Subject: NSF offers access to Japanese data bases Message-ID: <9005150957.aa21494@Note.NSF.GOV> ------- Forwarded Message From jea%BKLYN.BITNET at VMA.CC.CMU.EDU Tue May 15 10:34:00 1990 From: jea%BKLYN.BITNET at VMA.CC.CMU.EDU (Jonathan E. Adler) Date: Tue, 15 May 90 10:34 EDT Subject: Optimality: BBS Call for Commentators In-Reply-To: Message of Wed, 9 May 90 16:06:51 EDT from Stevan Harnad Message-ID: I decline the commentary, but recommend Philip Kitcher, Philosophy, U.C. San Diego and Benaolette Guimberteau, School of Education, U.C. Berkeley. From vg at psyche.inria.fr Tue May 15 13:56:57 1990 From: vg at psyche.inria.fr (Thierry BERNARD) Date: Tue, 15 May 90 19:56:57 +0200 Subject: image compression Message-ID: <9005151756.AA29413@psyche> As NNs are usually meant to yield suboptimal answers for difficult problems, I am surprised that they can be used for LOSSLESS image compression. Anyway, if losing some information is acceptable, maybe our work is of some interest. For image processing purposes within smart sensors, we have designed a neural technique for image analog-to-binary conversion, that we actually call "neural halftoning". We treat this conversion as an optimization problem subject to a fidelity criterion. The neural approach turns out to be so well adapted that: - the conversion quality is better than in any other halftoning technique. - a 100x100 pixel/neuron array can easily fit inside a standard CMOS chip. 
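[Editorial aside: as a rough illustration of halftoning cast as optimization, the sketch below greedily flips binary pixels to reduce a filtered squared error against the grayscale input. The energy, kernel, and greedy schedule are assumptions chosen for illustration, not the formulation of the papers the author cites just below.]

import numpy as np

def blur(img, kernel):
    # 2-D convolution with zero padding (small kernels only).
    H, W = img.shape
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros((H, W))
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + H, j:j + W]
    return out

def halftone(gray, sweeps=5):
    # Search for a binary image whose blurred version matches the
    # blurred grayscale original (a simple fidelity criterion).
    k = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]]) / 16.0
    target = blur(gray, k)
    b = (gray > 0.5).astype(float)  # start from plain thresholding
    for _ in range(sweeps):
        changed = False
        for i in range(gray.shape[0]):
            for j in range(gray.shape[1]):
                e_before = np.sum((blur(b, k) - target) ** 2)
                b[i, j] = 1.0 - b[i, j]        # tentative flip
                e_after = np.sum((blur(b, k) - target) ** 2)
                if e_after >= e_before:
                    b[i, j] = 1.0 - b[i, j]    # revert: no improvement
                else:
                    changed = True
        if not changed:                        # local minimum reached
            break
    return b

[A serious implementation would update the error locally after each flip instead of re-filtering the whole image, which is what makes a parallel pixel/neuron array attractive.]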
Anyone interested can read 2 recent papers of ours: [1] A neural halftoning algorithm suiting VLSI implementation. T.Bernard, P.Garda, B.Zavidovique. IEEE ICASSP April 90 [2] About the use of the adjective "neural", when applied to smart sensors. T.Bernard, B.Zavidovique. IEEE ICPR June 90 -- Thierry Bernard (vg at etca.fr) From nips-90 at CS.YALE.EDU Tue May 15 13:52:22 1990 From: nips-90 at CS.YALE.EDU (nips90) Date: Tue, 15 May 90 13:52:22 EDT Subject: FedEx address for NIPS Submissions Message-ID: <9005151752.AA29294@CASPER.NA.CS.YALE.EDU> Fellow Colleagues: For those of you feverishly trying to make the deadline for NIPS*90 (Neural Information Processing Systems, Natural and Synthetic), the correct street address for using Fed Ex or other express delivery services is John Moody NIPS*90 Submissions Department of Computer Science Yale University 51 Prospect St. New Haven, CT 06520 US Postal Service Express Mail can be sent to John Moody NIPS*90 Submissions Department of Computer Science PO Box 2158 Yale Station New Haven, CT 06520 I will accept any submission with express postmark as late as May 17. Remember to include six copies of both abstract and 1000 word summary. Incomplete submissions will be returned. Lastly, contributing authors will be automatically sent registration materials, so there is no need to request them separately. Happy writing! --John ------- From Alex.Waibel at SPEECH2.CS.CMU.EDU Tue May 15 18:12:59 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Tue, 15 May 90 18:12:59 EDT Subject: FedEx address for NIPS Submissions Message-ID: Prospective NIPS'90 attendees, Please note: Due date for proposals for the NIPS'90 postconference workshops is also May 17th. To ensure proper and timely consideration of your workshop proposal, however, please be sure to send it directly to: Alex Waibel attn.: NIPS'90 Workshops School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 --------------------------------------------------------------------- Following the regular NIPS program, workshops on current topics in Neural Information Processing will be held on November 30 and December 1, 1990, at a ski resort near Denver. Proposals by qualified individuals interested in chairing one of these workshops are solicited. Past topics have included: Rules and Connectionist Models; Speech; Vision; Neural Network Dynamics; Neurobiology; Computational Complexity Issues; Fault Tolerance in Neural Networks; Benchmarking and Comparing Neural Network Applications; Architectural Issues; Fast Training Techniques; VLSI; Control; Optimization; Statistical Inference; Genetic Algorithms. The format of the workshops is informal. Beyond reporting on past research, their goal is to provide a forum for scientists actively working in the field to freely discuss current issues of concern and interest. Sessions will meet in the morning and in the afternoon of both days, with free time in between for ongoing individual exchange or outdoor activities. Specific open or controversial issues are encouraged and preferred as workshop topics. Individuals interested in chairing a workshop must propose a topic of current interest and must be willing to accept responsibility for their group's discussion. 
Discussion leaders' responsibilities include: arrange brief informal presentations by experts working on this topic, moderate or lead the discussion, and report its high points, findings and conclusions to the group during evening plenary sessions, and in a short (2 page) summary. Submission Procedure: Interested parties should submit a short proposal for a workshop of interest by May 17, 1990. Proposals should include a title and a short description of what the workshop is to address and accomplish. It should state why the topic is of interest or controversial, why it should be discussed and what the targeted group of participants is. In addition, please send a brief resume of the prospective workshop chair, list of publications and evidence of scholarship in the field of interest. Name, mailing address, phone number, and e-mail net address (if applicable) must be on all submissions. --------------------------------------------------------------------- From marcus at aurel.cns.caltech.edu Wed May 16 03:07:42 1990 From: marcus at aurel.cns.caltech.edu (Marcus Quintana Mitchell) Date: Wed, 16 May 90 00:07:42 PDT Subject: mailing list Message-ID: <9005160707.AA09427@aurel.cns.caltech.edu> To whom it may concern: I would like to be placed on the connectionists mailing list. Thank you M. Q. Mitchell 164-30 California Inst. of Technology marcus at aurel.cns.caltech.edu From jose at learning.siemens.com Thu May 17 08:06:40 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Thu, 17 May 90 07:06:40 EST Subject: NIPS NOTE Message-ID: <9005171206.AA24748@learning.siemens.com.siemens.com> LAST MINUTE NOTE (RE: Cognitive Science/AI) as you are doing your last minute details prior to mailing... Remember there is a new submission category this year. Anyone submitting summaries relevant to COGNITIVE SCIENCE or AI please indicate this on your summary/abstract. Steve From PI05%primeb.dundee.ac.uk at NSFnet-Relay.AC.UK Thu May 17 16:24:46 1990 From: PI05%primeb.dundee.ac.uk at NSFnet-Relay.AC.UK (PI05%primeb.dundee.ac.uk@NSFnet-Relay.AC.UK) Date: Thu, 17 May 90 16:24:46 Subject: scheduling Message-ID: Does anyone have references to work on using connectionist techniques for solving scheduling problems - particularly timetabling? David Pickles. From steeg at ai.toronto.edu Thu May 17 13:53:33 1990 From: steeg at ai.toronto.edu (Evan W. Steeg) Date: Thu, 17 May 1990 13:53:33 -0400 Subject: scheduling Message-ID: <90May17.135338edt.8329@ephemeral.ai.toronto.edu> >Does anyone have references to work on using connectionist techniques for >solving scheduling problems - particularly timetabling? > > David Pickles. The following was announced a while ago: ---------------------------------------------- October 1989 LU TP 89-19 "TEACHERS AND CLASSES" WITH NEURAL NETWORKS Lars Gislen, Carsten Peterson and Bo Soderberg Department of Theoretical Physics, University of Lund Solvegatan 14A, S-22362 Lund, Sweden Submitted to International Journal of Neural Systems Abstract: A convenient mapping and an efficient algorithm for solving scheduling problems within the neural network paradigm is presented. It is based on a reduced encoding scheme and a mean field annealing prescription, which was recently successfully applied to TSP. Most scheduling problems are characterized by a set of hard and soft constraints. The prime target of this work is the hard constraints. In this domain the algorithm persistently finds legal solutions for quite difficult problems. 
We also make some exploratory investigations by adding soft constraints with very encouraging results. Our numerical studies cover problem sizes up to O(5*10^4) degrees of freedom with no parameter sensitivity. We stress the importance of adding certain extra terms to the energy functions which are redundant from the encoding point of view but beneficial when it comes to ignoring local minima and to stabilizing the good solutions in the annealing process. For copies of this report send requests to: THEPCAP at SELDC52. NOTICE: Those of you who requested our previous report, "A New Way of Mapping Optimization.... (LU TP 89-1), will automatically receive this one so no request is necessary. ------------------------------------------------------- -- Evan Evan W. Steeg (416) 978-7321 steeg at ai.toronto.edu (CSnet,UUCP,Bitnet) Dept of Computer Science steeg at ai.utoronto (other Bitnet) University of Toronto, steeg at ai.toronto.cdn (EAN X.400) Toronto, Canada M5S 1A4 {seismo,watmath}!ai.toronto.edu!steeg From sg at corwin.ccs.northeastern.edu Sat May 19 17:22:01 1990 From: sg at corwin.ccs.northeastern.edu (steve gallant) Date: Sat, 19 May 90 17:22:01 EDT Subject: TR: Representing Context and Word Disambiguation Message-ID: <9005192122.AA11525@corwin.CCS.Northeastern.EDU> The following TR is available: A Practical Approach for Representing Context And for Performing Word Sense Disambiguation Using Neural Networks Stephen I. Gallant ABSTRACT Representing and manipulating context information is one of the hardest problems in natural language processing. This paper proposes a method for representing some context information so that the correct meaning for a word in a sentence can be selected. The approach is based upon work by Waltz & Pollack, who emphasized neurally plausible systems. By contrast this paper focuses upon computationally feasible methods applicable to full-scale natural language processing systems. There are two key elements: a collection of context vectors defined for every word used by a natural language processing system, and a context algorithm that computes a dynamic context vector at any position in a body of text. Once the dynamic context vector has been computed it is easy to choose among competing meanings for a word. This choice of definitions is essentially a neural network computation, and neural network learning algorithms should be able to improve the system's choices. Although context vectors do not represent all context information, their use should improve those full-scale systems that have avoided context as being too difficult to deal with. Good candidates for full-scale context vector implementations are machine translation systems and text retrieval systems. A main goal of this paper is to encourage such large scale implementations and tests of context vector approaches. A variety of interesting directions for research in natural language processing and machine learning will be possible once a full set of context vectors has been created. In particular the development of more powerful context algorithms will be an important topic for future research. ----------------- Copies are available by Email only. To obtain a Latex copy, send mail to `sg at corwin.ccs.northeastern.edu'. Please do not post to other lists. Please be careful not to reply to the entire connectionist list! 
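[Editorial aside: a minimal sketch of the general idea in the abstract above, as one might read it — a decayed running combination of word vectors serves as the dynamic context, and an ambiguous word takes whichever of its sense vectors best matches the current context. All vectors, names, and the decay scheme are invented for illustration; the report's actual representations and context algorithm may differ.]

import numpy as np

def disambiguate(text, word_vectors, sense_vectors, decay=0.9):
    # text: list of words; word_vectors: word -> context vector;
    # sense_vectors: ambiguous word -> {sense name: sense vector}.
    # Returns {position in text: chosen sense name}.
    dim = len(next(iter(word_vectors.values())))
    context = np.zeros(dim)   # the dynamic context vector
    choices = {}
    for pos, word in enumerate(text):
        if word in sense_vectors:
            # pick the sense whose vector best matches the context
            scores = {name: float(v @ context)
                      for name, v in sense_vectors[word].items()}
            choices[pos] = max(scores, key=scores.get)
        if word in word_vectors:
            # decay the old context and fold in the current word
            context = decay * context + word_vectors[word]
    return choices

[For example, with hand-built vectors over features like (finance, water), the word "bank" occurring shortly after "money" would be assigned its financial sense. The scoring step is a simple inner product, which is where a neural network learning rule could adjust the vectors to improve the system's choices.]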
From aibanez at iai.es Mon May 21 11:31:00 1990 From: aibanez at iai.es (Alberto Ibáñez Rodríguez) Date: 21 May 90 16:31 +0100 Subject: Linear separability Message-ID: <2*aibanez@iai.es> About a couple of months ago we sent a question to the list concerning the existence of a fast method to determine whether two subsets of the set of vertices of a hypercube are linearly separable (every vertex in the hypercube falls in one of the subsets). Most of the mails that were sent dealt with linear programming and perceptrons, and some of them with the Walsh transform, convex polytopes or k-summability. Thank you to everyone for the interest and for dedicating time to that interesting discussion. The question arose from a simple method we arrived at, which looks like it works. However, we have not been able to prove that it will keep working in high-dimensional hypercubes. The idea is as follows: a subset of vertices (as defined formerly) is linearly separable from the other iff there exists a hyperplane perpendicular to the segment that joins the barycentres of the subsets into which the hypercube has been divided. If this proposition is correct, we can project every point in the subsets onto this segment and find out if the two sets of projections are separated. If so, the hyperplane equation can easily be calculated. We would appreciate any comments, and especially if someone finds out how to prove or disprove it. Thank you very much Alberto Ibáñez et al. 
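[Editorial aside: the proposed test is easy to state in code. A direct NumPy sketch follows, illustrative only; it inherits the open question of whether the barycentre direction always suffices in high dimensions.]

import numpy as np

def barycentre_separation(A, B):
    # A, B: arrays of hypercube vertices (0/1 entries), shapes (nA, n) and (nB, n).
    # Project all vertices onto the direction joining the two barycentres
    # and check whether the two sets of projections are separated.
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    d = A.mean(axis=0) - B.mean(axis=0)    # barycentre-to-barycentre direction
    pA, pB = A @ d, B @ d                  # projections onto d
    if pA.min() > pB.max():                # A's barycentre projects at least as
        theta = (pA.min() + pB.max()) / 2  # high, so separation must be this way
        return d, theta                    # separating hyperplane: d . x = theta
    return None                            # projections overlap: test fails

[When the test succeeds, (d, theta) is a genuine separating hyperplane; when it fails, whether the two sets might nonetheless be linearly separable is precisely the open question raised above.]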
From MURRE%HLERUL55.BITNET at VMA.CC.CMU.EDU Tue May 22 12:29:00 1990 From: MURRE%HLERUL55.BITNET at VMA.CC.CMU.EDU (MURRE%HLERUL55.BITNET@VMA.CC.CMU.EDU) Date: Tue, 22 May 90 12:29 MET Subject: request positions for practical work Message-ID: <8F1858EAE8BF000B99@HLERUL55.BITNET> Request for positions for practical work in connectionist modelling We have several good students who are interested in periods of practical work abroad (i.e., outside the Netherlands). These periods are usually three to six months, and the student is expected to take part in some ongoing research project. Most students are in their final year (close to getting their drs. degree, comparable to a USA master's degree) in experimental psychology, and they are particularly interested in connectionist modelling of cognitive processes. All of them have followed at least one intensive course in connectionism, and they are all experienced with psychological experiments. The practical work could be in the area of modelling or experimentation. The students are expected to provide their own financing (travel, housing, etc.), but they are not supposed to pay tuition. If you consider having one of the students of our connectionist group for practical work, please write me or send an E-mail to the address below. If you too have students that want to spend some time abroad we might think of some sort of exchange. Jaap M.J. Murre Leiden University Unit of Experimental and Theoretical Psychology P.O. Box 9555 2300 RB Leiden The Netherlands tel.: 31-71-273631 fax : 31-71-273619 E-Mail: MURRE at HLERUL55.Bitnet From oruiz at fi.upm.es Tue May 22 12:00:00 1990 From: oruiz at fi.upm.es (Oscar Ruiz) Date: 22 May 90 18:00 +0200 Subject: mensa. Message-ID: <11*oruiz@fi.upm.es> I am very interested in the dynamical behavior of neurons, and especially in the influence of the decay parameter that appears in their dynamic equation. I would like information about any work that uses this kind of neuron in pattern recognition, mainly for sequences of patterns, in feedforward or recurrent networks. At the moment I am eager to obtain the following three papers: G.Kuhn. "Connected Recognition with a Recurrent Network". Proceedings NEUROSPEECH, 18 May 1989, special issue of Speech Communication, v 9, num.2 (1990) W.S. Stornetta, T.Hogg and B.A.Huberman. "A dynamical approach to Temporal Pattern Processing", 1988, Neural Information Processing Systems, Editor A. Anderson. New York: American Institute of Physics. J.L. Elman, "Finding structure in time", CRL Technical Report 8801, University of California, San Diego, Center for Research in Language, La Jolla, 1988. If anybody has information about these papers please contact me. I would like to know what they are about, and perhaps a brief abstract would suffice. Thank you in advance. From paul at NMSU.Edu Tue May 22 14:48:39 1990 From: paul at NMSU.Edu (paul@NMSU.Edu) Date: Tue, 22 May 90 12:48:39 MDT Subject: No subject Message-ID: <9005221848.AA08906@NMSU.Edu> PLEASE DISTRIBUTE THE FOLLOWING ANNOUNCEMENT IN YOUR DEPARTMENT/LABORATORY: Cut--------------------------------------------------------------------------- PRAGMATICS IN ARTIFICIAL INTELLIGENCE 5th Rocky Mountain Conference on Artificial Intelligence (RMCAI-90) Science Hall and Music Center Auditorium New Mexico State University Las Cruces, New Mexico, USA, June 28-30, 1990 PRAGMATICS PROBLEM: The problem of pragmatics in AI is one of developing theories, models, and implementations of systems that make effective use of contextual information to solve problems in changing environments. CONFERENCE GOAL: This conference will provide a forum for researchers from all subfields of AI to discuss the problem of pragmatics in AI. The implications that each area has for the others in tackling this problem are of particular interest. COOPERATION: American Association for Artificial Intelligence (AAAI) IEEE Computer Society SPONSORSHIP: Association for Computing Machinery (ACM) Computing Research Laboratory (CRL), NMSU Special Interest Group on Artificial Intelligence (SIGART) U S WEST Advanced Technologies and the Rocky Mountain Society for Artificial Intelligence (RMSAI) INVITED SPEAKERS: The following researchers are invited to present papers at the conference: *Martin Casdagli, Los Alamos National Laboratory, Los Alamos USA *Arthur Cater, University College Dublin, Ireland EC *Jerry Feldman, University of California at Berkeley, Berkeley USA & International Computer Science Institute, Berkeley USA *Barbara Grosz, Harvard University, Cambridge USA *James Martin, University of Colorado at Boulder, Boulder USA *Derek Partridge, University of Exeter, United Kingdom EC *Roger Schank, Northwestern University, Illinois, USA *Philip Stenton, Hewlett Packard, United Kingdom EC *Robert Wilensky, University of California at Berkeley Berkeley USA SUBMITTED PAPERS: In addition over 40 papers on pragmatics in AI have been accepted for the conference. THE LAND OF ENCHANTMENT: Las Cruces lies in THE LAND OF ENCHANTMENT (New Mexico), USA and is situated in the Rio Grande Corridor with the scenic Organ Mountains overlooking the city. The city is close to Mexico, Carlsbad Caverns, and White Sands National Monument. There are a number of Indian Reservations and Pueblos in the Land Of Enchantment and the cultural and scenic cities of Taos and Santa Fe lie to the north. New Mexico has an interesting mixture of Indian, Mexican and Spanish culture. 
From paul at NMSU.Edu Tue May 22 14:48:39 1990
From: paul at NMSU.Edu (paul@NMSU.Edu)
Date: Tue, 22 May 90 12:48:39 MDT
Subject: No subject
Message-ID: <9005221848.AA08906@NMSU.Edu>

PLEASE DISTRIBUTE THE FOLLOWING ANNOUNCEMENT IN YOUR DEPARTMENT/LABORATORY:

Cut---------------------------------------------------------------------------

PRAGMATICS IN ARTIFICIAL INTELLIGENCE
5th Rocky Mountain Conference on Artificial Intelligence (RMCAI-90)
Science Hall and Music Center Auditorium
New Mexico State University
Las Cruces, New Mexico, USA, June 28-30, 1990

PRAGMATICS PROBLEM: The problem of pragmatics in AI is one of developing theories, models, and implementations of systems that make effective use of contextual information to solve problems in changing environments.

CONFERENCE GOAL: This conference will provide a forum for researchers from all subfields of AI to discuss the problem of pragmatics in AI. The implications that each area has for the others in tackling this problem are of particular interest.

COOPERATION: American Association for Artificial Intelligence (AAAI); IEEE Computer Society

SPONSORSHIP: Association for Computing Machinery (ACM); Computing Research Laboratory (CRL), NMSU; Special Interest Group on Artificial Intelligence (SIGART); U S WEST Advanced Technologies; and the Rocky Mountain Society for Artificial Intelligence (RMSAI)

INVITED SPEAKERS: The following researchers are invited to present papers at the conference:
*Martin Casdagli, Los Alamos National Laboratory, Los Alamos USA
*Arthur Cater, University College Dublin, Ireland EC
*Jerry Feldman, University of California at Berkeley, Berkeley USA & International Computer Science Institute, Berkeley USA
*Barbara Grosz, Harvard University, Cambridge USA
*James Martin, University of Colorado at Boulder, Boulder USA
*Derek Partridge, University of Exeter, United Kingdom EC
*Roger Schank, Northwestern University, Illinois, USA
*Philip Stenton, Hewlett Packard, United Kingdom EC
*Robert Wilensky, University of California at Berkeley, Berkeley USA

SUBMITTED PAPERS: In addition, over 40 papers on pragmatics in AI have been accepted for the conference.

THE LAND OF ENCHANTMENT: Las Cruces lies in THE LAND OF ENCHANTMENT (New Mexico), USA, and is situated in the Rio Grande Corridor with the scenic Organ Mountains overlooking the city. The city is close to Mexico, Carlsbad Caverns, and White Sands National Monument. There are a number of Indian Reservations and Pueblos in the Land of Enchantment, and the cultural and scenic cities of Taos and Santa Fe lie to the north. New Mexico has an interesting mixture of Indian, Mexican and Spanish culture. There is quite a variety of Mexican and New Mexican food to be found here too.

GENERAL INFORMATION: The Rocky Mountain Conference on Artificial Intelligence is a major regional forum in the USA for scientific exchange and presentation of AI research. The conference emphasizes discussion and informal interaction as well as presentations. The conference encourages the presentation of completed research, ongoing research, and preliminary investigations. Researchers from both within and outside the region are invited to participate.

DEADLINES:
Pre-registration: June 1st, 1990
Final papers due: June 1st, 1990

TRANSPORT: Las Cruces, New Mexico is located one hour from El Paso, Texas on I-10 West. Participants can fly into El Paso International Airport, and transport will be provided from and to the airport.

SOCIALS: The conference will include a registration reception buffet, a going-away-party full buffet, a banquet with banquet speaker (+ $25.00), and numerous refreshments.

HOTELS: The Las Cruces Hilton has rooms for $47.00 per night. (Call 1-800-284-0616; cutoff date is June 13th.) Accommodation is also available in other hotels and motels.

REGISTRATION:
Pre-Registration: Professionals: $50.00; Students: $30.00 (pre-registration cutoff date is June 1st, 1990)
Registration: Professionals: $70.00; Students: $50.00 (at the conference)
(Copied proof of student status is required.)

Registration form (IN BLOCK CAPITALS). Enclose payment made out to New Mexico State University. (ONLY checks in US dollars will be accepted.) Send to the following address (MARKED REGISTRATION):

Local Arrangements Chairperson, RMCAI-90
Computing Research Laboratory
Dept. 3CRL, Box 30001, NMSU
Las Cruces, NM 88003-0001, USA.

Name:_______________________________ E-mail:_____________________________
Phone:__________________________
Affiliation: ____________________________________________________
Fax: ____________________________________________________
Address: ____________________________________________________
____________________________________________________
____________________________________________________
COUNTRY:__________________________________________

LOCAL ARRANGEMENTS: Local Arrangements Chairperson, RMCAI-90 (same postal address as above).

INQUIRIES: Inquiries regarding the conference brochure and registration form should be addressed to the Local Arrangements Chairperson. Inquiries regarding the conference program should be addressed to the Program Chairperson.

Local Arrangements Chairperson: E-mail: INTERNET: rmcai at nmsu.edu; Phone: (+1 505)-646-5466; Fax: (+1 505)-646-6218.

Program Chairperson: E-mail: INTERNET: paul at sparta.nmsu.edu; Phone: (+1 505)-646-5109; Fax: (+1 505)-646-6218.

Paul Mc Kevitt, Program Chairperson, RMCAI-90, Computing Research Laboratory (CRL), Dept. 3CRL, Box 30001, New Mexico State University, Las Cruces, NM 88003-0001, USA.
TOPICS OF INTEREST: You are invited to submit a research paper addressing Pragmatics in AI, with any of the following orientations:

Philosophy, Foundations and Methodology
Knowledge Representation
Neural Networks and Connectionism
Genetic Algorithms, Emergent Computation, Nonlinear Systems
Natural Language and Speech Understanding
Problem Solving, Planning, Reasoning
Machine Learning
Vision and Robotics
Applications

TENTATIVE CONFERENCE SCHEDULE:

RMCAI-90 CONFERENCE SCHEDULE

WEDNESDAY 27th June 1990:
6:00 pm - 10:00 pm: Registration and Reception, Double Eagle, Old Mesilla

THURSDAY 28th June 1990:
8:50 am: Yorick Wilks and Paul Mc Kevitt: Welcome
9:00 am: Invited talk: Jerry Feldman, UC Berkeley
  Miniature Language Acquisition: A Paradigm Problem and Some Approaches
10:00 am: Coffee
10:30 am - 12:30 pm: Three tracks of submitted papers.

TRACK A:
PRACMA: Processing Arguments between Controversially-Minded Agents
  Jurgen Allgayer : Alfred Kobsa : Carola Reddig : Norbert Reithinger
Relevant Beliefs
  Afzal Ballim : Yorick Wilks
Speech Acts and Mental States
  Robbert-Jan Beun
Extensions of Constraints on Speech Act Ambiguity
  Elizabeth A. Hinkelman

TRACK B:
Dynamic Route Planning
  E. Cortes-Rello : F. Golshani
Strategic Planning System (SPS)
  Mitchell Smith : Peter Briggs : Edward Freeman
Re-planning a Route - A Pragmatic Approach
  Wai-Kiang Yeap
Evaluation of Pragmatics Processing in a Direction Finding Domain
  Deborah A. Dahl

TRACK C:
Computing with Fast Modulation: Experiments with Biologically Realistic Model Neurons
  Mark DeYong : Randall Findley : Chris Fields
Competition and Selection in Neural Networks with Distributed Representations
  Kankanahalli Srinivas : John Barnden
Using Genetic Algorithms as a Post-Processor for Improving Vehicle Routing Solutions
  Nagesh Kadaba : Kendall E. Nygard
An Application of Neural Networks in Robotics
  Dr. Behzad Ghavimi

12:30 pm - 2:00 pm: Lunch
2:00 pm: Invited talk: Robert Wilensky, UC Berkeley, USA
3:00 pm - 3:30 pm: Coffee
3:30 pm - 4:30 pm: Invited talk: Phil Stenton, HP Laboratories, Bristol, UK
  Putting NL to Work: A Dialogue Modeling Approach
4:30 pm - 5:30 pm: Three tracks of submitted talks

TRACK A:
Using Relational Knowledge Structures to Handle Null Value Situations in Natural Language Interfaces
  Nick Cercone : Dan Fass : Chris Groeneboer : Gary Hall : Mimi Kao : Paul McFetridge : Fred Popowich
A Classification of User-System Interactions in Natural Language with Special Reference to
  Dan Fass : Nick Cercone : Gary Hall : Chris Groeneboer : Paul McFetridge : Fred Popowich

TRACK B:
Problem Solving Experience and Problem Solving Knowledge
  Stephen W. Smoliar
An Abstraction-Partitioned Model for Reactive Planning
  Lee Spector : James A. Hendler

TRACK C:
A Graph Theoretic Basis for Problem Solving
  Daniel P. Eshner : Heather D. Pfeiffer
Meta-Structures: Intelligent Structures for Inference Control
  Daniel J. Goter : David E. Monarchi

FRIDAY 29th June 1990:
9:00 am: Invited talk: Barbara Grosz, Harvard University
  Collaborative Planning for Discourse
10:00 am: Coffee
10:30 am - 12:30 pm: Three tracks of submitted papers

TRACK A:
Why Does Language Matter to Artificial Intelligence
  Marcelo Dascal
Pragmatics of Postdeterminers, Non-restrictive Modifications & Wh-phrases
  Frens J.H. Dols
Pragmatics and Natural Language Processing
  Eduard H. Hovy
On the Semantics of the Conjunction "but"
  Wlodek Zadrozny : Karen Jensen

TRACK B:
How to Become Immune to Facts
  M.J. Coombs : R.T. Hartley : W.B. Kilgore : H.D. Pfeiffer
Constrained Rational Agency
  Bruce D'Ambrosio : Tony Fountain : Lothar Kaul
Abductive Inference in AI: Potential Unifications
  Venugopala Rao Dasigi
A Prolog Implementation of the Stable Model TMS
  Stephen Pimentel : John L. Cuadrado

TRACK C:
Multiple Level Island Search
  Peter C. Nelson : John F. Dillenburg
Efficient Learning with Representative Presentations
  Xiaofeng (Charles) Ling
User Modelling in a Knowledge-Based Environment for European Learning
  Michael F. McTear : Norman Creaney : Weiru Liu
Training a Neural Network to be a Context Sensitive Grammar
  Robert F. Simmons : Yeong-Ho Yu

12:30 pm - 2:00 pm: Lunch
2:00 pm: Invited talk: Roger Schank, Northwestern University
3:00 pm - 3:30 pm: Coffee
3:30 pm - 4:30 pm: Invited talk: Arthur Cater, University College Dublin, Ireland
4:30 pm - 5:30 pm: Three tracks of submitted papers

TRACK A:
Towards Empirically Derived Semantic Classes
  Brian M. Slator : Shahrzad Amirsoleymani : Sandra Andersen : Kent Braaten : John Davis : Rhonda Ficek : Hossein Hakimzadeh : Lester McCann : Joseph Rajkumar : Sam Thangiah : Daniel Thureen
Using Words
  Louise Guthrie : Paul Mc Kevitt : Yorick Wilks

TRACK B:
An Expert Tool for Digital Circuit Design
  F.N. Sibai : K.L. Watson
Explaining Control Strategy in Second Generation Expert Systems
  Xuejun Tong

TRACK C:
A New Approach to Analyzing Aerial Photographs
  Dwayne Phillips
Acquiring Categorical Aspects: A Connectionist Account of Figurative Noun Semantics
  Susan Hollbach Weber

6:00 pm - 9:00 pm: Japanese Buffet in Garden Center (Budagher's)

SATURDAY 30th June 1990:
9:00 am: Invited talk: Derek Partridge, University of Exeter, UK
10:00 am: Coffee
10:30 am - 11:30 am: Two tracks of submitted papers

TRACK A:
An Experiment on Technical Text Reproduction
  Wanying Jin
Explanation Dialogues: Interpreting Real Life Questions & Explanations
  Efstratios Sarantinos : Peter Johnson
Modeling of Mind and its Application to Image Sequence Understanding
  Naoyuki Okada

TRACK B:
Communication and Belief Changes in a Society of Agents
  Graca Gaspar
An Interval Calculus Based Finite Domain Constraint and its Implementation in Prolog
  Jin-Kao Hao : Jean-Jacques Chabrier
Dynamic Context Diagrams: The Pragmatics of Social Interaction in KBS Development
  Simon P.H. Morgan

11:30 am - 1:30 pm: Lunch
1:30 pm - 2:30 pm: Invited talk: James Martin, University of Colorado at Boulder
  A Unified Approach to Conventional Non-Literal Language
2:30 pm - 3:30 pm: Invited talk: Martin Casdagli, Los Alamos National Laboratory
  Pragmatic Artificial Neural Nets for the Nonlinear Prediction of Time Series
3:00 pm - 3:30 pm: Coffee
6:00 pm - 9:00 pm: Banquet (Double Eagle)

*****************************

PROGRAM COMMITTEE:
*John Barnden, New Mexico State University (Connectionism, Beliefs, Metaphor processing)
*Hans Brunner, U S WEST Advanced Technologies (Natural language interfaces, Dialogue interfaces)
*Martin Casdagli, Los Alamos National Laboratory (Dynamical systems, Artificial neural networks, Applications)
*Mike Coombs, New Mexico State University (Problem solving, Adaptive systems, Planning)
*Dan Eshner, University of Maryland (Planning, Search, Knowledge Representation)
*Thomas Eskridge, Lockheed Missiles and Space Co. (Analogy, Problem solving)
*Chris Fields, New Mexico State University (Neural networks, Nonlinear systems, Applications)
*Roger Hartley, New Mexico State University (Knowledge Representation, Planning, Problem Solving)
*Victor Johnson, New Mexico State University (Genetic Algorithms)
*Paul Mc Kevitt, New Mexico State University (Natural language interfaces, Dialogue modeling)
*Joe Pfeiffer, New Mexico State University (Computer Vision, Parallel architectures)
*Keith Phillips, University of Colorado at Colorado Springs (Computer vision, Mathematical modelling)
*Roger Schvaneveldt, New Mexico State University (Knowledge representation, Knowledge elicitation, Cognitive modeling)
*Brian Slator, North Dakota State University (Natural language processing, Knowledge acquisition)
*Yorick Wilks, New Mexico State University (Natural language processing, Knowledge representation)
*Scott Wolff, U S WEST Advanced Technologies (Intelligent tutoring, User interface design, Cognitive modeling)

Organizing Committee RMCAI-90:
Paul Mc Kevitt, Research Scientist, CRL
Yorick Wilks, Director, CRL, and Professor, NMSU Computer Science

cut------------------------------------------------------------------------

From aibanez at iai.es Wed May 23 05:44:00 1990
From: aibanez at iai.es (Alberto Ibañez Rodríguez)
Date: 23 May 90 10:44 +0100
Subject: Linear separability
Message-ID: <5*aibanez@iai.es>

After sending to the list the mail concerning the existence of a criterion to prove the linear separability of two subsets of the set of vertices of a given n-dimensional hypercube, we have realized that the word 'barycentre' is not correct (it does not appear in Webster's). We apologize for this slight mistake ('barycentric' does appear). The correct term would be 'centroid', 'centre of gravity' or 'centre of mass'.

Alberto Ibañez et al.

From YAEGER.L at AppleLink.Apple.COM Wed May 23 08:29:00 1990
From: YAEGER.L at AppleLink.Apple.COM (Yaeger, Larry)
Date: 23 May 90 12:29 GMT
Subject: character recognition
Message-ID: <1481495@AppleLink.Apple.COM>

There have been a number of published works on the use of neural networks for handprinted character recognition. All the work of this type that I recall hearing or reading about has been based either on pixel maps or on feature vectors derived from pixel maps. Segmentation was either provided a priori or was a separate step in the recognition process that did not use a connectionist approach (and was usually the weakest link in the process).

Is there any published (or unpublished) work on a connectionist approach to handprinted character recognition that uses stroke data - time-sequences of input-device position - as direct input to the network (whether predicting segmentation or having it supplied)? Is there an existing database of such stroke data (whether gathered for connectionist models or not)?

Thanks in advance for any information you might be able to pass on. If there is an outpouring of references on the subject, I'll collect them and post to this list.

- larryy at apple.com [please use this address regardless of what your mailer says]

P.S. Be careful not to reply to the entire list unless you intend to...
From Bill_McKellin at mtsg.ubc.ca Wed May 23 12:17:44 1990
From: Bill_McKellin at mtsg.ubc.ca (Bill_McKellin@mtsg.ubc.ca)
Date: Wed, 23 May 90 09:17:44 PDT
Subject: metaphor
Message-ID: <2251711@mtsg.ubc.ca>

I am trying to locate material on connectionist approaches to metaphor and analogical forms of discourse processing. If you are working in these areas (or know of others who are), please contact me with a description of your research and references to appropriate articles.

From fritz_dg%ncsd.dnet at gte.com Wed May 23 17:58:51 1990
From: fritz_dg%ncsd.dnet at gte.com (fritz_dg%ncsd.dnet@gte.com)
Date: Wed, 23 May 90 17:58:51 -0400
Subject: requests for information
Message-ID: <9005232158.AA05552@bunny.gte.com>

Briefly: I see a fair number of ISO ("in search of") requests without a return address, leaving no alternative but to respond to the entire list. Explicit inclusion of an e-mail address would be helpful.

Dave F.
fritz_dg%ncsd at gte.com

From AMR at IBM.COM Thu May 24 14:06:39 1990
From: AMR at IBM.COM (AMR@IBM.COM)
Date: Thu, 24 May 90 14:06:39 EDT
Subject: Formal Properties
Message-ID:

Some time ago I requested references (together with hard copies if possible) to work on formal properties (such as Turing equivalence or otherwise) of models used in connectionist research. I would like to renew that request now, since there must be more out there than I have received to date (while thanking those who have already responded).

From oruiz at fi.upm.es Thu May 24 12:30:00 1990
From: oruiz at fi.upm.es (Oscar Ruiz)
Date: 24 May 90 18:30 +0200
Subject: paper request
Message-ID: <18*oruiz@fi.upm.es>

I am interested in finding the following article: Cybenko, G. (1988), "Approximation by Superpositions of a Sigmoidal Function", and another about the implementation of functions with neural networks. I am also trying to find a probably unpublished paper by E.F. Moore, "Counter-Example to a Conjecture of McCluskey and Paull" (1957).

Miguel A. Lerma

P.S.: I share this e-mail account with Oscar Ruiz and Javier Segovia.

From issnnet at bucasb.bu.edu Fri May 25 13:29:35 1990
From: issnnet at bucasb.bu.edu (issnnet@bucasb.bu.edu)
Date: Fri, 25 May 90 13:29:35 EDT
Subject: ISSNNet meeting/dinner at IJCNN
Message-ID: <9005251729.AA18951@thalamus.bu.edu>

The International Student Society for Neural Networks (ISSNNet) will hold a meeting on Monday night, June 18, during the IJCNN conference in San Diego. All interested parties are welcome to join us. We are planning to organize a (cheap) quick meal right before or after the meeting, so that participants may attend the evening plenary talks. We also expect to get a lot of people together after the plenaries and head over to some local establishment (you do not need to be a member to join us there :-). Exact details will be available at registration or at the ISSNNet booth during the conference.

For more information send e-mail to: issnnet at bucasb.bu.edu

From biafore%cs at ucsd.edu Mon May 28 23:51:47 1990
From: biafore%cs at ucsd.edu (Louis Steven Biafore)
Date: Mon, 28 May 90 20:51:47 PDT
Subject: IJCNN Reminder
Message-ID: <9005290351.AA01290@beowulf.ucsd.edu>

............................................................

International Joint Conference on Neural Networks
San Diego, CA - June 17-21, 1990

The 1990 IJCNN is sponsored by the IEEE Council on Neural Networks and the International Neural Network Society (INNS). The IJCNN will cover the full spectrum of neural computing, from theory such as neurodynamics to applications such as machine vision. Meet leading experts and practitioners during the largest conference in the field. For further information contact Nomi Feldman, Meeting Management, 5665 Oberlin Dr., Suite 110, San Diego, CA 92121. Telephone (619) 453-6222.
Registration

The conference registration fee includes admission to all sessions, the exhibit area, the Sunday Welcome Reception and the Wednesday Party. TUTORIALS ARE NOT INCLUDED. The registration fee is $280. Single-day registration is available for $110 (proceedings not included). Full-time students may attend for $50, proceedings and Wednesday Party not included.

Schedule of Events

Sunday 17 June: TUTORIALS (8 am - 6 pm); RECEPTION (6 pm - 8 pm); INDUSTRY PANEL (8 pm - 10 pm)
Monday 18 June: TECHNICAL SESSIONS (8 am - 5 pm); BIOENGINEERING PANEL (12 pm - 1:30 pm); PLENARY SESSIONS (8 pm - 10 pm)
Tuesday 19 June: TECHNICAL SESSIONS (8 am - 5 pm); PLENARY SESSIONS (8 pm - 10 pm)
Wednesday 20 June: TECHNICAL SESSIONS (8 am - 5 pm); PARTY (6 pm - 8 pm); GOVERNMENT PANEL (8 pm - 10 pm)
Thursday 21 June: TECHNICAL SESSIONS (8 am - 5 pm)

Tutorials

Thirteen tutorials are planned for Sunday 17 June:

Adaptive Sensory-Motor Control - Stephen Grossberg
Associative Memory - Bart Kosko
Chaos for Engineers - Leon Chua
Dynamical Systems Review - Morris Hirsch
LMS Techniques in Neural Networks - Bernard Widrow
Neural Network Applications - Robert Hecht-Nielsen
Neurobiology I: Neurons and Simple Networks - Walter Freeman
Neurobiology II: Advanced Networks - Allen Selverston
Optical Neurocomputers - Demetri Psaltis
Reinforcement Learning - Andrew Barto
Self-Organizing Feature Maps - Teuvo Kohonen
Vision - John Daugman
VLSI Technology and Neural Network Chips - Lawrence Jackel

Exhibits

Exhibitors will present innovations in neural networks, including neurocomputers, VLSI neural networks, implementations, software systems and applications. IJCNN is the neural network industry's largest tradeshow. Vendors may contact Richard Rea at (619) 222-7447 for additional information.

Accommodations

IJCNN 90 will be held at the San Diego Marriott Hotel on San Diego Bay, (619) 234-1500.

............................................................

Please direct questions to the appropriate individual as specified above (please don't send questions to me).

S. Biafore - UCSD

From oruiz at fi.upm.es Tue May 29 09:47:00 1990
From: oruiz at fi.upm.es (Oscar Ruiz)
Date: 29 May 90 15:47 +0200
Subject: counter-example
Message-ID: <25*oruiz@fi.upm.es>

I still don't have an answer to my request for a counter-example to a conjecture of McCluskey and Paull. Roughly, the conjecture was the following: an n-argument truth function is linearly separable if and only if no four vertices of the n-cube form a parallelogram in which one pair of diagonally opposite vertices are true vertices and the other pair are false vertices. It is easy to show that the condition is necessary, but E.F. Moore proved in 1957 that it is not sufficient. I would like to know Moore's counter-example, or any other.

Miguel A. Lerma
Sancho Davila 18
28028 MADRID
SPAIN
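[For anyone who wants to hunt for such a counter-example numerically, here is a sketch in Python with SciPy (our own construction, not Moore's): a truth function that passes the parallelogram check below yet fails the LP separability check would be exactly the kind of counter-example requested above.]

    import itertools
    from scipy.optimize import linprog

    def separable(T, F):
        """LP feasibility test: is there (w, b) with w.x + b >= 1 on the
        true vertices T and w.x + b <= -1 on the false vertices F?"""
        n = len(T[0])
        A_ub, b_ub = [], []
        for x in T:                      # -(w.x + b) <= -1
            A_ub.append([-float(v) for v in x] + [-1.0]); b_ub.append(-1.0)
        for x in F:                      #  (w.x + b) <= -1
            A_ub.append([float(v) for v in x] + [1.0]); b_ub.append(-1.0)
        res = linprog([0.0] * (n + 1), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * (n + 1), method="highs")
        return res.status == 0           # feasible => separable

    def parallelogram_free(T, F):
        """The necessary condition: no pair of true vertices shares a
        vector sum (i.e., a diagonal midpoint) with a pair of false ones."""
        sums = lambda S: {tuple(a[i] + b[i] for i in range(len(a)))
                          for a, b in itertools.combinations(S, 2)}
        return not (sums(T) & sums(F))

    # XOR on the 2-cube violates the condition and is not separable:
    T, F = [(0, 1), (1, 0)], [(0, 0), (1, 1)]
    print(parallelogram_free(T, F), separable(T, F))   # False False

[This illustrates the necessity direction only; the interesting search is for functions, presumably in higher dimensions, that pass the first check and fail the second.]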
From LUBTODI%YALEVM.BITNET at vma.CC.CMU.EDU Tue May 29 14:24:00 1990
From: LUBTODI%YALEVM.BITNET at vma.CC.CMU.EDU (LUBTODI%YALEVM.BITNET@vma.CC.CMU.EDU)
Date: Tue, 29 May 90 14:24 EDT
Subject: bibliog on high-level tasks
Message-ID:

I have prepared a bibliography on the application of connectionist models to high-level cognitive tasks. High-level cognitive tasks include analogical thinking, evidential reasoning/decision making, and complex (multistage) problem solving. The bibliography also includes some related work on schemata, scripts, sequential information processing, mental models, and rule-like processing. I thank the members of this mailing list who suggested references for this bibliography. If you would like a copy, please send me your US mail address.

Todd Lubart, Yale Univ., Dept. of Psychology, Box 11A Yale Station, New Haven CT 06520
bitnet: LUBTODI at YALEVM

From jacobs at gluttony.cs.umass.edu Wed May 30 10:26:52 1990
From: jacobs at gluttony.cs.umass.edu (jacobs@gluttony.cs.umass.edu)
Date: Wed, 30 May 90 10:26:52 EDT
Subject: new technical report
Message-ID: <9005301426.AA01120@ANW.edu>

The following technical report is now available:

Task Decomposition Through Competition in a Modular Connectionist Architecture

Robert A. Jacobs
Department of Computer & Information Science
University of Massachusetts
Amherst, MA 01003

COINS Technical Report 90-44

Abstract
--------
A novel modular connectionist architecture is presented in which the networks composing the architecture compete to learn the training patterns. As a result of the competition, different networks learn different training patterns and, thus, learn to compute different functions. The architecture performs task decomposition in the sense that it learns to partition a task into two or more functionally independent tasks and allocates distinct networks to learn each task. In addition, the architecture tends to allocate to each task the network whose topology is most appropriate to that task, and tends to allocate the same network to similar tasks and distinct networks to dissimilar tasks. Furthermore, it can easily be modified so as to learn to perform a family of tasks by using one network to learn a shared strategy that is used in all contexts, along with other networks that learn modifications to this strategy that are applied in a context-sensitive manner. These properties are demonstrated by training the architecture to perform object recognition and spatial localization from simulated retinal images, and to control a simulated robot arm to move a variety of payloads, each of a different mass, along a specified trajectory. Finally, it is noted that function decomposition is an underconstrained problem and, thus, different modular architectures may decompose a function in different ways. A desirable decomposition can be achieved if the architecture is suitably restricted in the types of functions that it can compute. Finding appropriate restrictions is possible through the application of domain knowledge. A strength of the modular architecture is that its structure is well-suited for incorporating domain knowledge.

Please note that this technical report is my Ph.D. thesis and, thus, is considerably longer than the typical technical report (125 pages).
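[As a toy illustration of the competition mechanism described in the abstract (a simplification under our own assumptions; the report's architecture uses gating networks and richer update rules), two linear modules can compete for patterns drawn from two linear sub-tasks on different input clusters. Python/NumPy:]

    import numpy as np

    rng = np.random.default_rng(1)

    # Two linear modules compete for each training pattern: the module
    # with the smaller squared error "wins" the pattern and is the only
    # one updated on it (plain delta rule).
    W = [rng.normal(scale=0.5, size=3) for _ in range(2)]   # 2 inputs + bias
    lr = 0.01

    def augment(x):                      # append a constant bias input
        return np.append(np.asarray(x, dtype=float), 1.0)

    # Composite training set: the sum function near the origin and the
    # difference function near (5, 5).
    patterns = [(augment([a, b]), a + b) for a in (0.0, 1.0) for b in (0.0, 1.0)]
    patterns += [(augment([5 + a, 5 + b]), a - b) for a in (0.0, 1.0) for b in (0.0, 1.0)]

    for epoch in range(500):
        for x, y in patterns:
            k = int(np.argmin([(y - w @ x) ** 2 for w in W]))
            W[k] += lr * (y - W[k] @ x) * x          # update the winner only

    for k in range(2):
        wins = [i for i, (x, y) in enumerate(patterns)
                if k == int(np.argmin([(y - u @ x) ** 2 for u in W]))]
        print("module", k, "wins patterns", wins)

[Which module captures which sub-task depends on the random initialization, echoing the abstract's point that function decomposition is underconstrained.]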
If possible, please obtain a postscript version of this technical report from the pub/neuroprose directory at cheops.cis.ohio-state.edu.

a) Here are the directions:

unix> ftp cheops.cis.ohio-state.edu          # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): neuron
ftp> cd pub/neuroprose
ftp> type binary
ftp> get
(remote-file) jacobs.thesis1.ps.Z (local-file) foo1.ps.Z
ftp> get
(remote-file) jacobs.thesis2.ps.Z (local-file) foo2.ps.Z
ftp> get
(remote-file) jacobs.thesis3.ps.Z (local-file) foo3.ps.Z
ftp> get
(remote-file) jacobs.thesis4.ps.Z (local-file) foo4.ps.Z
ftp> quit
unix> uncompress foo1.ps.Z
unix> uncompress foo2.ps.Z
unix> uncompress foo3.ps.Z
unix> uncompress foo4.ps.Z
unix> lpr -P(your_local_postscript_printer) foo1.ps
unix> lpr -P(your_local_postscript_printer) foo2.ps
unix> lpr -P(your_local_postscript_printer) foo3.ps
unix> lpr -P(your_local_postscript_printer) foo4.ps

If your printer dies because the size of a file exceeds the printer's memory capacity, please try the -s option to the lpr command (see the manual page for the lpr command).

b) You can also use the Getps script posted on the connectionists mailing list a few months ago.

If you do not have access to a postscript printer, copies of this technical report can be obtained by sending a request to Connie Smith at smith at cs.umass.edu. Remember to ask for COINS Technical Report 90-44.

From reynolds at bucasb.bu.edu Wed May 30 13:44:35 1990
From: reynolds at bucasb.bu.edu (reynolds@bucasb.bu.edu)
Date: Wed, 30 May 90 13:44:35 EDT
Subject: bibliog on high-level tasks
In-Reply-To: connectionists@c.cs.cmu.edu's message of 30 May 90 07:40:56 GM
Message-ID: <9005301744.AA16598@thalamus.bu.edu>

I would be quite interested in seeing your bibliography. My mailing address is:

John Reynolds
Cognitive and Neural Systems
111 Cummington Street
Boston University
Boston, MA 02215

Thank you,
John

From R0MJW at IBM.COM Wed May 30 15:25:05 1990
From: R0MJW at IBM.COM (Michael Witbrock)
Date: Wed, 30 May 90 14:25:05 -0500
Subject: bibliog on high-level tasks
In-Reply-To: Your message of Wed, 30 May 90 13:44:35 EDT. <9005301744.AA16598@thalamus.bu.edu>
Message-ID: <9005301825.AA05423@mjw.watson.ibm.com>

You sent your request to everyone. I'm a former maintainer of ``connectionists''. The fact that your message has an 'In-Reply-To' field indicates that you used the reply function of your mailer. Because of the possibility of inadvertently mailing your notes around the world, ``Reply'' is not recommended with connectionists.

thanks
\michael

From al at gmdzi.uucp Thu May 31 07:31:58 1990
From: al at gmdzi.uucp (Alexander Linden)
Date: Thu, 31 May 90 09:31:58 -0200
Subject: Special Issue on Neural Networks
Message-ID: <9005310731.AA08372@gmdzi.UUCP>

Special Issue on Neural Networks in Parallel Computing (to appear in August)

This special issue focuses on the third generation of neural networks, which can be characterized as being heterogeneous, modular and asynchronous.

Contents:

H. Muehlenbein: Limitations of Multilayer Perceptrons: Towards Genetic Neural Networks
F. Smieja and H. Muehlenbein: The Geometry of Multilayer Perceptron Solutions
J. Kindermann and A. Linden: Inversion of Neural Networks by Gradient Descent
T.E. Lange: Simulation of Heterogeneous Neural Networks on Serial and Parallel Machines
A. Singer: Implementations of Artificial Neural Networks on the Connection Machine
X. Zhang et al.: The Backpropagation Algorithm on Grid and Hypercube Architectures
M. Witbrock and M. Zagha: An Implementation of Backpropagation Learning on GF11, a Large SIMD Parallel Computer
D. Whitley et al.: Genetic Algorithms and Neural Networks: Optimizing Connections and Connectivity
M. Tenorio: Topology Synthesis Networks: Self-Organization of Structure and Weight Adjustment as a Learning Paradigm
K. Obermayer: Large-Scale Simulations of Self-Organizing Neural Networks on Parallel Computers: Application to Biological Modelling
R. Kentridge: Neural Networks for Learning in the Real World: Representation, Reinforcement and Dynamics

-------------------------------------------------------------------------

HOW TO ORDER: The publisher is offering a special service. Copies of this issue can be obtained at a price of $25 from

Dr. F. van Drunen
Elsevier Science Publishers
Mathematics and Computer Science Section
P.O. BOX 103
1000 AC Amsterdam
The Netherlands
FAX: +31-10-5862-616

-------------------------------------------------------------------------

Copies of the first three papers can be obtained from

GMD
c/o Sekretariat Z1.HLRZ
P.O. BOX 1240
D-5205 Sankt Augustin 1
West Germany
FAX: +49-2241-142618

Heinz Muehlenbein