From oruiz at fi.upm.es Mon Jun 4 10:32:00 1990 From: oruiz at fi.upm.es (Oscar Ruiz) Date: 4 Jun 90 16:32 +0200 Subject: heuristic adjustment Message-ID: <29*oruiz@fi.upm.es> Subject: heuristic adjustment. I am looking for studies about applications of heuristic methods to neural network adjustment. I would appreciate any help in this matter. Miguel A. Lerma Sancho Davila 18 28028 MADRID SPAIN  From mclennan%MACLENNAN.CS.UTK.EDU at cs.utk.edu Mon Jun 4 16:07:50 1990 From: mclennan%MACLENNAN.CS.UTK.EDU at cs.utk.edu (mclennan%MACLENNAN.CS.UTK.EDU@cs.utk.edu) Date: Mon, 4 Jun 90 16:07:50 EDT Subject: TR available Message-ID: <9006042007.AA10605@MACLENNAN.CS.UTK.EDU> ***** DO NOT DISTRIBUTE TO OTHER LISTS ***** The following technical report is available: Synthetic Ethology: An Approach to the Study of Communication Bruce MacLennan Computer Science Department University of Tennessee Knoxville, TN 37996-1301 internet address: maclennan at cs.utk.edu CS-90-104 A complete understanding of communication, language, intentionality and related mental phenomena will require a theory integrating mechanistic explanations with ethological phenomena. For the foreseeable future, the complexities of natural life in its natural environment will preclude such an understanding. An approach more conducive to carefully controlled experiments and to the discovery of deep laws of great generality is to study synthetic life forms in a synthetic world to which they have become coupled through evolution. This is the approach of _synthetic ethology_. Some simple experiments in synthetic ethology are described, in which we have observed the evolution of communication in a population of simple machines. We show that even in these simple worlds communication manifests some of the richness and complexity found in natural communication. Finally, some future directions for research in synthetic ethology are discussed, as well as some issues relevant to both synthetic ethology and artificial life. For a copy, send your physical mail address to: library at cs.utk.edu For other correspondence send mail to the author. From esmythe at ANDREW.dnet.ge.com Mon Jun 4 17:16:20 1990 From: esmythe at ANDREW.dnet.ge.com (Erich J Smythe) Date: Mon, 4 Jun 90 17:16:20 EDT Subject: References needed on time-frequency classification methods Message-ID: <9006042116.AA29829@ge-dab.GE.COM> The following is posted for a friend. Please respond to him if you can. If not, I will forward the message. thanks -erich smythe esmythe at andrew.dnet.ge.com ------------------------------------------------------------------- I am writing a review on the use of time-frequency distributions of signals as inputs to classification algorithms. The review will appear in a book "New Methods in Time-Frequency Signal Analysis" to be published by Longman & Cheshire. I am particularly (but not solely) interested in schemes where the classification mechanism is that of a neural network. I would appreciate any inputs from the net as to appropriate references. All applications are relevant. I would like to see this review be comprehensive and adequately represent the contributions of neural nets. I will post a summary if there is interest. Please reply to "dmalkoff at atl.dnet.ge.com" ____________________________________ Donald B. Malkoff General Electric Company Advanced Technology Laboratories Moorestown Corporate Center Bldg. 145-2, Route 38 Moorestown, N.J.
08057 (609) 866-6516 From ersoy at ee.ecn.purdue.edu Mon Jun 4 14:28:45 1990 From: ersoy at ee.ecn.purdue.edu (Okan K Ersoy) Date: Mon, 4 Jun 90 13:28:45 -0500 Subject: No subject Message-ID: <9006041828.AA12660@ee.ecn.purdue.edu> FINAL CALL FOR PAPERS AND REFEREES HAWAII INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES - 24 NEURAL NETWORKS AND RELATED EMERGING TECHNOLOGIES HAWAII - JANUARY 9-11, 1991 The Neural Networks Track of HICSS-24 will contain a special set of papers focusing on a broad selection of topics in the area of Neural Networks and Related Emerging Technologies. The presentations will provide a forum to discuss new advances in learning theory, associative memory, self-organization, architectures, implementations and applications. Papers are invited that may be theoretical, conceptual, tutorial or descriptive in nature. Those papers selected for presentation will appear in the Conference Proceedings, which are published by the Computer Society of the IEEE. HICSS-24 is sponsored by the University of Hawaii in cooperation with the ACM, the Computer Society, and the Pacific Research Institute for Information Systems and Management (PRIISM). Submissions are solicited in: Supervised and Unsupervised Learning; Issues of Complexity and Scaling; Associative Memory; Self-Organization; Architectures; Optical, Electronic and Other Novel Implementations; Optimization; Signal/Image Processing and Understanding; Novel Applications. INSTRUCTIONS FOR SUBMITTING PAPERS Manuscripts should be 22-26 typewritten, double-spaced pages in length. Do not send submissions that are significantly shorter or longer than this. Papers must not have been previously presented or published, nor currently submitted for journal publication. Each manuscript will be put through a rigorous refereeing process. Manuscripts should have a title page that includes the title of the paper, full name of its author(s), affiliation(s), complete physical and electronic address(es), telephone number(s) and a 300-word abstract of the paper. DEADLINES Six copies of the manuscript are due by June 25, 1990. Notification of accepted papers by September 1, 1990. Accepted manuscripts, camera-ready, are due by October 3, 1990. SEND SUBMISSIONS AND QUESTIONS TO O. K. Ersoy Purdue University School of Electrical Engineering W. Lafayette, IN 47907 (317) 494-6162 From het at seiden.psych.mcgill.ca Wed Jun 6 15:01:08 1990 From: het at seiden.psych.mcgill.ca (Phil Hetherington) Date: Wed, 6 Jun 90 15:01:08 EDT Subject: competitive learning simulators Message-ID: <9006061901.AA05730@seiden.psych.mcgill.ca.> I am looking for alternative competitive learning packages (other than McClelland and Rumelhart's) that: accept continuously coded input values (ranging from 0 to 1), allow for a varying number of groups of competing units, allow for a varying number of units within a group, accept varying initial weight ranges, and are easily modified to use different learning algorithms. It would be helpful if the packages came with source code (preferably in C) so that they might be compiled for either an IBM compatible or a Sun-4. I would appreciate any information on simulators that satisfy the above conditions. Please reply to het at seiden.psych.mcgill.ca Thanks.
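For readers unfamiliar with the technique being requested above: the heart of such a package is a winner-take-all update within each group of competing units. A minimal sketch in C -- the squared-distance metric, flat group layout, and learning rate are illustrative assumptions, not features of any particular simulator:

    /* One competitive-learning step: within each group, the unit whose
       weight vector is closest to the input wins and moves its weights
       toward the input.  All names and constants are illustrative. */
    void compete_step(float **w,       /* w[u][i]: weight i of unit u   */
                      const float *x,  /* input vector, values in [0,1] */
                      int n_inputs, int n_groups,
                      int units_per_group, float lrate)
    {
        int g, u, i;
        for (g = 0; g < n_groups; g++) {
            int base = g * units_per_group;
            int winner = base;
            float best = 1e30f;
            for (u = base; u < base + units_per_group; u++) {
                float d = 0.0f;
                for (i = 0; i < n_inputs; i++) {
                    float diff = w[u][i] - x[i];
                    d += diff * diff;        /* squared Euclidean distance */
                }
                if (d < best) { best = d; winner = u; }
            }
            for (i = 0; i < n_inputs; i++)   /* move winner toward input */
                w[winner][i] += lrate * (x[i] - w[winner][i]);
        }
    }

A real package would add the requested options (initial weight ranges, pluggable learning rules) around this inner loop.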
From ejua61 at castle.edinburgh.ac.uk Thu Jun 7 12:29:20 1990 From: ejua61 at castle.edinburgh.ac.uk (ejua61@castle.edinburgh.ac.uk) Date: Thu, 7 Jun 90 12:29:20 WET DST Subject: Optimality: BBS Call for Commentators Message-ID: <9006071229.aa06693@castle.ed.ac.uk> I tried to contact you on your Princeton address but failed to get through. We (Wann, Wing) would be interested in providing commentary on "Optimality" as an organising principle in nature from the standpoint of developing movement control in humans, related to our previous commentary on Gottlieb et al. 1989. An indication of the time-course for review would be appreciated. John Wann (john1 at uk.ac.edinburgh) Dept. of Psychology 7 George Sq., Edinburgh EH8 9JZ, Scotland (Prev. MRC APU Cambridge England) From Alex.Waibel at SPEECH2.CS.CMU.EDU Sun Jun 10 20:09:38 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Sun, 10 Jun 90 20:09:38 EDT Subject: Special Issue Announcement Message-ID: ANNOUNCEMENT MACHINE LEARNING will be publishing a special issue devoted to connectionist models under the title: "Structured Connectionist Learning Systems: Methods and Real World Applications" MACHINE LEARNING publishes articles on all aspects of Machine Learning, and on occasion runs special issues on particular subtopics of special interest. This issue of the journal will emphasize connectionist learning systems that aim at real world applications. Papers are solicited on this topic. Five copies of the manuscript should be sent by August 3, 1990 to: Dr. Alex Waibel School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Telephone: (412) 268-7676 Papers will be subject to the standard review process. From dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU Mon Jun 11 08:29:59 1990 From: dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU (Dario Ringach) Date: Mon, 11 Jun 90 15:29:59 +0300 Subject: Density Theorems and Nonorthogonal Expansions: Are They the Same? Message-ID: <9006111229.AA12223@techunix.bitnet> Hi! Couldn't all those "density theorems" of networks in L^2(R) be regarded as particular cases of the nonorthogonal expansion results we already know? For instance, a representation in which we translate and dilate a single function - a family of affine coherent states - leads to the theory of wavelet representation; while translations and modulations - the Weyl-Heisenberg class - lead to Gabor expansions (see [1] for instance). Another question: given a function phi() in a Banach space, and a function f which we want to approximate by a linear combination g of N functions which are dilations and translations of phi(), does anyone know how to solve the optimization problem min ||f-g||? I know only about heuristic approaches such as "generalized radial basis functions"... Thanks! -- Dario Ringach. [1] I. Daubechies, A. Grossman, Y. Meyer, 'Painless Nonorthogonal Expansions', J. Math. Phys., Vol. 27, No. 5, pp. 1271-1283, May 1986. From harnad at clarity.Princeton.EDU Tue Jun 12 00:11:15 1990 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Tue, 12 Jun 90 00:11:15 EDT Subject: Cognitive Science Society Meeting Message-ID: <9006120411.AA03207@reason.Princeton.EDU> The XII Annual Conference of the Cognitive Science Society will take place at MIT, July 25-28, 1990 (immediately preceding the meeting of the AAAI, also to take place in the Boston area). Conference Chair: M. Piattelli-Palmarini (MIT) Scientific Advisors: Beth Adelson (Tufts), Stephen M.
Kosslyn (Harvard), Steven Pinker (MIT), Kenneth Wexler (MIT) Registration fees (before July 1 / after July 1): Members $150/$200; Non-members $185/$225; Students $90/$110. Contact the MIT Conference Services, MIT Room 7-111, Cambridge, MA 02139, Tel. (617) 253-1700 _______________________________________________ Outline of the program Tuesday July 24, Wednesday July 25 Tutorials: "Cognitive Aspects of Linguistic Theory", "Logic and Computability", "Cognitive Neuroscience" (Separate registrations required) Wednesday, July 25 4.00 - 7.30 pm Registration at Kresge Auditorium 7.30 - 9.00 First plenary session: Kresge Main Auditorium Welcoming address by Samuel Jay Keyser, Assistant Provost of MIT, Co-Director of the MIT Center for Cognitive Science; Welcoming address by David E. Rumelhart (Stanford), Chairman of the Board of the Cognitive Science Society Keynote speaker: Noam Chomsky (MIT) "Language and Cognition" __________ Thursday, July 26 9.00 am - 11.15 am Symposia: Execution-Time Response: Applying Plans in a Dynamic World Kristian J. Hammond (University of Chicago), Chair Phil Agre (University of Chicago) Richard Alterman (Brandeis University) Reid Simmons (Carnegie Mellon University) R. James Firby (NASA Jet Propulsion Lab) Cognitive Aspects of Linguistic Theory Howard Lasnik (University of Connecticut), Chair David Pesetsky (Massachusetts Institute of Technology), Chair James T. Higginbotham (Massachusetts Institute of Technology) John McCarthy (University of Massachusetts) Perception, Computation and Categorization Whitman Richards (Massachusetts Institute of Technology), Chair Aaron Bobick (SRI International) Ken Nakayama (Harvard University) Allan Jepson (University of Toronto) Paper Presentations: Rule-Based Reasoning, Explanation and Problem-Solving; Reasoning II: Planning 11.30 - 12.45 Plenary session Kresge Main Auditorium Keynote Speaker: Morris Halle (MIT) "Words and their Parts" Chair: Kenneth Wexler (MIT) __________________ Thursday, July 26 Afternoon 2.00 pm - 4.15 pm Symposia: Principle-Based Parsing Robert C. Berwick (Massachusetts Institute of Technology), Chair Steven P. Abney (Bell Communications Research) Bonnie J. Dorr (Massachusetts Institute of Technology) Sandiway Fong (Massachusetts Institute of Technology) Mark Johnson (Brown University) Edward P. Stabler, Jr. (University of California, Los Angeles) Recent Results in Formal Learning Theory Kevin T. Kelly (Carnegie Mellon University) Clark Glymour (Carnegie Mellon University), Chair Self-Organizing Cognitive and Neural Systems Stephen Grossberg (Boston University), Chair Ennio Mingolla (Boston University) Michael Rudd (Boston University) Daniel Bullock (Boston University) Gail A. Carpenter (Boston University) Action Systems: Planning and Execution Emilio Bizzi (Massachusetts Institute of Technology), Chair Michael I. Jordan (Massachusetts Institute of Technology) Paper presentations: Reasoning: Analogy; Learning and Memory: Acquisition 4.30 - 5.45 Plenary Session (Kresge Main Auditorium) Keynote Speaker: Amos Tversky (Stanford) "Decision under conflict" Chair: Daniel N. Osherson (MIT) Banquet ______________ Friday July 27 9.00 - 11.15 am Symposia: What's New in Language Acquisition?
Steven Pinker and Kenneth Wexler (MIT), Chair Stephen Crain (University of Connecticut) Myrna Gopnik (McGill University) Alan Prince (Brandeis University) Michelle Hollander, John Kim, Gary Marcus, Sandeep Prasada, Michael Ullman (MIT) Attracting Attention Anne Treisman (University of California, Berkeley), Chair Patrick Cavanagh (Harvard University) Ken Nakayama (Harvard University) Jeremy M. Wolfe (Massachusetts Institute of Technology) Steven Yantis (Johns Hopkins University) A New Look at Decision Making Susan Chipman (Office of Naval Research) and Judith Orasanu (Army Research Institute and Princeton University), Chair Gary Klein (Klein Associates) John A. Swets (Bolt Beranek & Newman Laboratories) Paul Thagard (Princeton University) Marvin S. Cohen (Decision Science Consortium, Inc.) Designing an Integrated Architecture: The Prodigy View Jaime G. Carbonell (Carnegie Mellon University), Chair Yolanda Gil (Carnegie Mellon University) Robert Joseph (Carnegie Mellon University) Craig A. Knoblock (Carnegie Mellon University) Steve Minton (NASA Ames Research Center) Manuela M. Veloso (Carnegie Mellon University) Paper presentations: Reasoning: Categories and Concepts; Language: Pragmatics and Communication 11.30 - 12.45 Plenary Session (Kresge Main Auditorium) Keynote speaker: Margaret Livingstone (Harvard) "Parallel Processing of Form, Color and Depth" Chair: Richard M. Held (MIT) _________________________ Friday, July 27 afternoon 2.00 - 4.15 pm Symposia: What is Cognitive Neuroscience? David Caplan (Harvard Medical School) and Stephen M. Kosslyn (Harvard University), Chair Michael S. Gazzaniga (Dartmouth Medical School) Michael I. Posner (University of Oregon) Larry Squire (University of California, San Diego) Computational Models of Category Learning Pat Langley (NASA Ames Research Center) and Michael Pazzani (University of California, Irvine), Chair Dorrit Billman (Georgia Institute of Technology) Douglas Fisher (Vanderbilt University) Mark Gluck (Stanford University) The Study of Expertise: Prospects and Limits Anders Ericsson (University of Colorado, Boulder), Chair Neil Charness (University of Waterloo) Vimla L. Patel and Guy Groen (McGill University) Yuichiro Anzai (Keio University) Fran Allard and Jan Starkes (University of Waterloo) Keith Holyoak (University of California, Los Angeles), Discussant Paper presentations: Language (Panel 1): Phonology; Language (Panel 2): Syntax 4.30 - 5.45 Keynote speaker: Anne Treisman (UC Berkeley) "Features and Objects" Chair: Stephen M. Kosslyn (Harvard) Poster Sessions I. Connectionist Models II. Machine Simulations and Algorithms III. Knowledge and Problem-Solving __________________________________ Saturday, July 28 9.00 am - 11.15 am Symposia: SOAR as a Unified Theory of Cognition: Spring 1990 Allen Newell (Carnegie Mellon University), Chair Richard L. Lewis (Carnegie Mellon University) Scott B. Huffman (University of Michigan) Bonnie E. John (Carnegie Mellon University) John E. Laird (University of Michigan) Jill Fain Lehman (Carnegie Mellon University) Paul S. Rosenbloom (University of Southern California) Tony Simon (Carnegie Mellon University) Shirley G. Tessler (Carnegie Mellon University) Neonate Cognition Richard Held (Massachusetts Institute of Technology), Chair Jane Gwiazda (Massachusetts Institute of Technology) Renee Baillargeon (University of Illinois) Adele Diamond (University of Pennsylvania) Jacques Mehler (CNRS, Paris, France), Discussant Conceptual Coherence in Text and Discourse Arthur C.
Graesser (Memphis State University), Chair Richard Alterman (Brandeis University) Kathleen Dahlgren (Intelligent Text Processing, Inc.) Bruce K. Britton (University of Georgia) Paul van den Broek (University of Minnesota) Charles R. Fletcher (University of Minnesota) Roger J. Kreuz (Memphis State University) Richard M. Roberts (Memphis State University) Tom Trabasso and Nancy Stein Paper presentations: Causality, Induction and Decision-Making; Vision (Panel 1): Objects and Features; Vision (Panel 2): Imagery; Language: Lexical Semantics; Case-Based Reasoning 11.30 - 12.45 Keynote Speaker: Ellen Markman (Stanford) "Constraints Children Place on Possible Word Meanings" Chair: Susan Carey (MIT) Lunch presentation: "Cognitive Science in Europe: A Panorama" Chair: Willem Levelt (Max Planck, Nijmegen). Informal presentations by: Jacques Mehler (CNRS, Paris), Paolo Viviani (University of Geneva), Paolo Legrenzi (University of Trieste), Karl Wender (University of Trier). _____________________________ Saturday, July 28 Afternoon 2.00 - 3.00 pm Paper presentations: Vision: Attention; Language Processing; Educational Methods; Learning and Memory; Agents, Goals and Constraints 3.15 - 4.30 Keynote Speaker: Roger Schank (Northwestern) "The Story is the Message: Memory and Instruction" Chair: Beth Adelson (Tufts) 4.30 - 5.45 Keynote Speaker: Stephen Jay Gould (Harvard) "Evolution and Cognition" From curt at cassi.cog.syr.edu Tue Jun 12 09:18:56 1990 From: curt at cassi.cog.syr.edu (Curt Burgess) Date: Tue, 12 Jun 90 09:18:56 EDT Subject: subscribe Message-ID: <9006121318.AA01553@cassi.cog.syr.edu> Can you send me subscription information? Thanks - Curt Burgess From KAMELI%COSMO at utrcgw.utc.com Mon Jun 11 09:17:00 1990 From: KAMELI%COSMO at utrcgw.utc.com (KAMELI%COSMO@utrcgw.utc.com) Date: Mon, 11 Jun 90 08:17 EST Subject: Connectionists Gathering Message-ID: Is anyone aware of any upcoming workshop(s) in Neural Networks or Neural Modeling, or any other kind of gathering that might provide hands-on training in NNs? Any information would be appreciated. Thanx Nader Kameli kameli at cosmo.otis.utc.com From EDSON%BRUC.ANSP.BR at VMA.CC.CMU.EDU Tue Jun 12 20:04:00 1990 From: EDSON%BRUC.ANSP.BR at VMA.CC.CMU.EDU (EDSON%BRUC.ANSP.BR@VMA.CC.CMU.EDU) Date: Tue, 12 Jun 90 21:04 -0300 Subject: No subject References: ANSP network: HEPnet/SPAN/Bitnet/Internet gateway Message-ID: Subscribe Edson Francozo From SATINDER at cs.umass.EDU Wed Jun 13 12:36:00 1990 From: SATINDER at cs.umass.EDU (SATINDER@cs.umass.EDU) Date: Wed, 13 Jun 90 11:36 EST Subject: This is for the Librarian! Message-ID: <9006131536.AA08343@dime.cs.umass.edu> Hi! Some time ago, Robbie Jacobs sent out a message inviting requests for his Ph.D. thesis report. Unfortunately he forgot to mention that the department charges for Ph.D. reports - even though other tech reports are distributed free of charge. In Robbie's absence, I am sending this reminder from the Librarian... Thank you for your request for COINS technical report 90-44, a Ph.D. thesis by Robert Jacobs. Unfortunately, Mr. Jacobs forgot to mention that Ph.D. papers are NOT free of charge. If you would still like to receive the paper, please send a check or money order made out to: ***** COINS Department ***** in the amount of US $7.05. Once I have received the money for the paper, I will forward the paper to you right away.
Please forward your request and your check to: Connie Smith, Librarian Computer & Information Science Department University of Massachusetts Lederle Graduate Research Center Amherst, MA 01007 Thanks very much for your interest in our department. I look forward to hearing from you. --------------------------- end message. satinder. P.S. PLEASE DO NOT REPLY TO THIS MESSAGE. It will be ignored!! From thomasp at lan.informatik.tu-muenchen.dbp.de Wed Jun 13 16:20:25 1990 From: thomasp at lan.informatik.tu-muenchen.dbp.de (Patrick Thomas) Date: 13 Jun 90 22:20:25+0200 Subject: Independent Rules for Synaptic Plasticity Message-ID: <9006132020.AA00883@gshalle1.informatik.tu-muenchen.de> I wonder who is currently supporting the idea of independent (non-Hebbian) rules for synaptic plasticity apart from Finkel & Edelman (1). They formulated a synaptic plasticity mechanism which is based on a PRESYNAPTIC RULE (the efficacy of the presynaptic terminal is dependent only on the activity of the presynaptic neuron, no postsynaptic firing or above-threshold depolarization is necessary, ALL presynaptic terminals are affected) and on a POSTSYNAPTIC RULE which is a heterosynaptic modification rule similar to that of Changeux and others. Unfortunately there is strong evidence that a Hebbian condition is necessary for modifications of synaptic efficacy (cf 2,3). Even in the experiment by Bonhoeffer et al., which demonstrated the non-locality of Hebbian modification (all presynaptic terminals are likewise enhanced although only one postsynaptic neuron is depolarized), the enhancement only occurred when the Hebbian condition was fulfilled. Could someone provide references to work either crushing the idea of independent modification rules or supporting it? Thanx in advance. Patrick Thomas Computer Science, Munich Technical University (1) "Synaptic Function", Edelman/Gall/Cowan (eds), Wiley, 1987. (2) "Hebbian Synapses in Hippocampus", Kelso et al, PNAS, 83:5326-5330. (3) Bonhoeffer et al, PNAS, 86:8113-8117. PS: Bad timing. I bet everyone is in San Diego. From kruschke at ucs.indiana.edu Wed Jun 13 15:31:00 1990 From: kruschke at ucs.indiana.edu (KRUSCHKE,JOHN,PSY) Date: 13 Jun 90 14:31:00 EST Subject: research report announcement Message-ID: *** PLEASE DO NOT FORWARD TO OTHER BULLETIN BOARDS *** Research Report announcement: ALCOVE: A Connectionist Model of Category Learning John K. Kruschke Psychology and Cognitive Science Indiana University This report should interest cognitive scientists studying category *learning*, especially those familiar with the work of psychologists such as Nosofsky, Medin, Gluck & Bower, and Estes. The report should also interest connectionists studying the abilities of back-prop architectures that use radial basis function nodes, and new architectures for selective attention. ABSTRACT ALCOVE is a new connectionist model of category learning that models the course of learning in humans and their asymptotic performance. The model is a variant of back propagation, using Gaussian (radial basis function) hidden nodes, and *adaptive attentional strengths* on the input dimensions. Unlike standard back propagation networks, ALCOVE cannot develop completely new dimensions for representing the stimuli, but it does learn to differentially attend to the given input dimensions. This constraint is an accurate reflection of human performance.
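In code, the hidden-node response just described might look like the following C sketch. The Gaussian form follows the abstract; the parameter names and the squared-distance metric are illustrative assumptions (Kruschke's papers also use a city-block metric):

    #include <math.h>

    /* Activation of one ALCOVE-style hidden (exemplar) node: an RBF
       response gated by per-dimension attention strengths alpha[i]. */
    double alcove_hidden(const double *exemplar, /* position of the node    */
                         const double *stimulus, /* current input           */
                         const double *alpha,    /* learned attention gains */
                         int dims,
                         double c)               /* specificity constant    */
    {
        double d = 0.0;
        int i;
        for (i = 0; i < dims; i++) {
            double diff = exemplar[i] - stimulus[i];
            d += alpha[i] * diff * diff;  /* attention-weighted distance */
        }
        return exp(-c * d);  /* node responds most to nearby stimuli */
    }

Dimensions with large alpha[i] dominate the distance, which is how learned attention shrinks or stretches the stimulus space.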
ALCOVE is successfully applied to several category learning phenomena: (1) It correctly orders the difficulty of the six category types from the classic work of Shepard, Hovland and Jenkins (1961). (2) It accurately fits trial-by-trial learning data and mimics the base-rate neglect observed by Gluck and Bower (1988b). In preliminary work, it is also shown that ALCOVE can: (3) exhibit three-stage learning of high-frequency exceptions to rules (cf. Rumelhart & McClelland 1986), (4) show emergent graded internal structure in categories, i.e., typicality ratings, (5) produce asymmetries of similarities between typical and atypical exemplars, (6) show selective sensitivity to correlated dimensions, and (7) learn non-linearly separable categories faster than linearly separable categories, in those cases that humans do. It is also suggested that ALCOVE could serve as the input to a rule-generating system, so that the dimensions most attended are the ones first used for rules. Moreover, it is shown that ALCOVE is falsifiable, in principle, and that there are some phenomena in category learning that ALCOVE cannot capture. Nevertheless, ALCOVE is attractive because of the broad range of phenomena it does model. If you are truly interested (supplies are limited), you are welcome to a free copy by e-mailing your physical address to the Cognitive Science Program secretary, Cathy Barnes, at iucogsci at ucs.indiana.edu Be sure to mention "Research Report #19 by John Kruschke". (As usual, don't use the "reply" command to make your request.) *** PLEASE DO NOT FORWARD TO OTHER BULLETIN BOARDS *** From MRE1%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU Thu Jun 14 15:06:00 1990 From: MRE1%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU (MRE1%VMS.BRIGHTON.AC.UK@VMA.CC.CMU.EDU) Date: Thu, 14 Jun 90 15:06 BST Subject: Facial Feature Detector Message-ID: I would like to know if anyone has or knows of a database of images of human faces or portrait shots of people. I require the images to form training and test sets for a neural network for facial feature detection. This is part of my PhD program and not a commercial development. I would greatly appreciate any help. Mark Evans mre1 at uk.ac.bton.vms From pawlicki at Kodak.COM Fri Jun 15 14:30:19 1990 From: pawlicki at Kodak.COM (Dr. Thaddeus F. Pawlicki) Date: Fri, 15 Jun 90 14:30:19 EDT Subject: Research Positions Available Message-ID: <9006151830.AA15626@strategic.> The Signal Processing Research Group of Eastman Kodak has ongoing research in the areas of Image Processing and Neural Networks. The focus of this work is document understanding and pattern recognition. We are actively recruiting individuals who are interested in these areas. Serious inquirers should contact (hardcopy/phone): Dr. Roger Gaborski Eastman Kodak Company 901 Elmgrove Road Rochester, New York, 14653-5722 (716) 726-4169 From white at cs.rochester.edu Fri Jun 15 11:40:15 1990 From: white at cs.rochester.edu (white@cs.rochester.edu) Date: Fri, 15 Jun 90 11:40:15 -0400 Subject: Tech-Report Announcement Message-ID: <9006151540.AA07823@maple.cs.rochester.edu> The following technical report is now available: LEARNING TO PERCEIVE AND ACT Steven D. Whitehead and Dana H. Ballard Technical Report # 331 (Revised) Department of Computer Science University of Rochester Rochester, NY 14627 ABSTRACT: This paper considers adaptive control architectures that integrate active sensory-motor systems with decision systems based on reinforcement learning.
One unavoidable consequence of active perception is that the agent's internal representation often confounds external world states. We call this phenomenon perceptual aliasing and show that it destabilizes existing reinforcement learning algorithms with respect to the optimal decision policy. We then describe a new decision system that overcomes these difficulties for a restricted class of decision problems. The system incorporates a perceptual subcycle within the overall decision cycle and uses a modified learning algorithm to suppress the effects of perceptual aliasing. The result is a control architecture that learns not only how to solve a task but also where to focus its attention in order to collect necessary sensory information. The report can be obtained by sending requests to either peg at cs.rochester.edu or white at cs.rochester.edu. Be sure to mention TR331(revised) in your request. From FRANKLINS%MEMSTVX1.BITNET at VMA.CC.CMU.EDU Fri Jun 15 13:44:00 1990 From: FRANKLINS%MEMSTVX1.BITNET at VMA.CC.CMU.EDU (FRANKLINS%MEMSTVX1.BITNET@VMA.CC.CMU.EDU) Date: Fri, 15 Jun 90 12:44 CDT Subject: report offered Message-ID: What follows is the abstract of a technical report, really more of a position paper, by Max Garzon and myself. It deals more with neural networks as computational tools than as models of cognition. It was motivated by more technical work of ours on the outer reaches of neural computation under ideal conditions. An extended abstract of this work appeared as "Neural computability II", in Proc. 3rd Int. Joint Conf. on Neural Networks, Washington, D.C., 1989, I, 631-637. ******************************************************* When does a neural network solve a problem? Stan Franklin and Max Garzon Institute for Intelligent Systems Department of Mathematical Sciences Memphis State University Memphis, TN 38152 USA Abstract Reproducibility, scalability, controllability, and physical realizability are characteristic features of conventional solutions to algorithmic problems. Their desirability for neural network approaches to computational problems is discussed. It is argued that reproducibility requires the eventual stability of the network at a correct answer, scalability requires consideration of successively larger finite (or just infinite) networks, and the other two features require discrete activations. A precise definition of solution with these properties is offered. The importance of the stability problem in neurocomputing is discussed, as well as the need for study of infinite networks. ******************************************************* A hard copy of the position paper (report 90-11) and/or a full version of "Neural computability II" may be requested from franklins at memstvx1.bitnet. We would greatly appreciate your comments. Please do not REPLY to this message. -- Stan Franklin From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Sun Jun 17 01:02:35 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Sun, 17 Jun 90 01:02:35 EDT Subject: Cascade-Correlation simulator in C Message-ID: Thanks to Scott Crowder, one of my graduate students at Carnegie Mellon, there is now a C version of the public-domain simulator for the Cascade-Correlation learning algorithm. This is a translation of the original simulator that I wrote in Common Lisp. Both versions are now available by anonymous FTP -- see the instructions below. Before anyone asks, we are *NOT* prepared to make tapes and floppy disks for people.
Since this code is in the public domain, it is free and no license agreement is required. Of course, as a matter of simple courtesy, we expect people who use this code, commercially or for research, to acknowledge the source. I am interested in hearing about people's experience with this algorithm, successful or not. I will maintain an E-mail mailing list of people using this code so that I can inform users of any bug-fixes, new versions, or problems. Send me E-mail if you want to be on this list. If you have questions that specifically pertain to the C version, contact Scott Crowder (rsc at cs.cmu.edu). If you have more general questions about the algorithm and how to run it, contact me (fahlman at cs.cmu.edu). We'll try to help, though the time we can spend on this is limited. Please use E-mail for such queries if at all possible. Scott Crowder will be out of town for the next couple of weeks, so C-specific problems might have to wait until he returns. The Cascade-Correlation algorithm is described in S. Fahlman and C. Lebiere, "The Cascade-Correlation Learning Architecture" in D. S. Touretzky (ed.), _Advances_in_Neural_Information_Processing_Systems_2_, Morgan Kaufmann Publishers, 1990. A tech report containing essentially the same information can be obtained via FTP from the "neuroprose" collection of postscript files at Ohio State. (See instructions below.) Enjoy, Scott E. Fahlman School of Computer Science Carnegie-Mellon University Pittsburgh, PA 15213 --------------------------------------------------------------------------- To FTP the simulation code: For people (at CMU, MIT, and soon some other places) with access to the Andrew File System (AFS), you can access the files directly from directory "/afs/cs.cmu.edu/project/connect/code". This file system uses the same syntactic conventions as BSD Unix: case sensitive names, slashes for subdirectories, no version numbers, etc. The protection scheme is a bit different, but that shouldn't matter to people just trying to read these files. For people accessing these files via FTP: 1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu". 2. Log in as user "anonymous" with no password. You may see an error message that says "filenames may not have /.. in them" or something like that. Just ignore it. 3. Change remote directory to "/afs/cs/project/connect/code". Any subdirectories of this one should also be accessible. The parent directories may not be. 4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want. The Lisp version of the Cascade-Correlation simulator lives in file "cascor1.lisp". The C version lives in "cascor1.c". If you try to access this directory by FTP and have trouble, please contact me. The exact FTP commands you use to change directories, list files, etc., will vary from one version of FTP to another.
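For readers who want the flavor of the algorithm before fetching the report: each candidate hidden unit is trained to maximize the magnitude of the covariance between its activation and the network's residual output errors (Fahlman & Lebiere, 1990). A C sketch of that scoring step -- variable names and layout are illustrative, not taken from the distributed simulator:

    #include <math.h>

    /* Score S for one candidate unit: summed magnitude of covariance
       between its activation V[p] over training patterns and the
       residual error E[p][o] at each output. */
    double candidate_score(const double *V, double **E,
                           int n_patterns, int n_outputs)
    {
        double vbar = 0.0, score = 0.0;
        int p, o;
        for (p = 0; p < n_patterns; p++) vbar += V[p];
        vbar /= n_patterns;
        for (o = 0; o < n_outputs; o++) {
            double ebar = 0.0, cov = 0.0;
            for (p = 0; p < n_patterns; p++) ebar += E[p][o];
            ebar /= n_patterns;
            for (p = 0; p < n_patterns; p++)
                cov += (V[p] - vbar) * (E[p][o] - ebar);
            score += fabs(cov);  /* sign of the correlation doesn't matter */
        }
        return score;
    }

The winning candidate is then frozen and installed as a new hidden unit, which is what gives the architecture its cascade structure.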
--------------------------------------------------------------------------- To access the postscript file for the tech report: unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62) Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get fahlman.cascor-tr.ps.Z ftp> quit unix> uncompress fahlman.cascor-tr.ps.Z unix> lpr fahlman.cascor-tr.ps (use flag your printer needs for Postscript) --------------------------------------------------------------------------- From Connectionists-Request at CS.CMU.EDU Sun Jun 17 20:25:18 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Sun, 17 Jun 90 20:25:18 EDT Subject: Connectionists maintainer out of town...delays are possible Message-ID: <7653.645668718@B.GP.CS.CMU.EDU> I will be out of town at the Connectionists Summer School for the next two weeks. After looking at the proposed schedule, it seems likely that I will not have time to answer any mail sent to Connectionists-Request at cs.cmu.edu until I return during the first week of July. Please be patient with any change of address, additions/deletions to/from the list, or other administrative requests. Thanks Scott Crowder Connectionists-Request at cs.cmu.edu (ARPAnet) From white at cs.rochester.edu Mon Jun 18 12:49:19 1990 From: white at cs.rochester.edu (white@cs.rochester.edu) Date: Mon, 18 Jun 90 12:49:19 -0400 Subject: $$ for TR Message-ID: <9006181649.AA08459@maple.cs.rochester.edu> >The following technical report is now available: > > > LEARNING TO PERCEIVE AND ACT > > Steven D. Whitehead and Dana H. Ballard > > Technical Report # 331 (Revised) > Department of Computer Science > University of Rochester > Rochester, NY 14627 > >ABSTRACT: This paper considers adaptive control architectures that >integrate active sensory-motor systems with decision systems based >on reinforcement learning. One unavoidable consequence of active perception >is that the agent's internal representation often confounds external world >states. We call this phenomenon perceptual aliasing and show that it >destabilizes existing reinforcement learning algorithms with respect >to the optimal decision policy. We then describe a new decision system >that overcomes these difficulties for a restricted class of decision >problems. The system incorporates a perceptual subcycle within the overall >decision cycle and uses a modified learning algorithm to suppress the effects >of perceptual aliasing. The result is a control architecture that learns not >only how to solve a task but also where to focus its attention in order to >collect necessary sensory information. > > >The report can be obtained by sending requests to either peg at cs.rochester.edu >or white at cs.rochester.edu. Be sure to mention TR331(revised) in your request. I failed to mention that the TR costs $2.00. If you have already requested the TR and NO LONGER WANT IT, PLEASE MAIL ME. Otherwise, I'll just send the TR (along with the bill). My original intention was to bypass our standard billing procedure and distribute the TR freely; however, the overwhelming number of requests has made that impractical. I apologize for the hassle. -Steve From dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU Tue Jun 19 06:27:08 1990 From: dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU (Dario Ringach) Date: Tue, 19 Jun 90 13:27:08 +0300 Subject: Attention! (summary request) Message-ID: <9006191027.AA16187@techunix.bitnet> Some time ago there was a discussion on the list dealing with models of visual spatial attention...
Somehow, I erased the summary I had. Can anyone who was interested in the discussion and still has the summary send me a copy? Thanks in advance! -- Dario Ringach From clay at CS.CMU.EDU Tue Jun 19 12:35:16 1990 From: clay at CS.CMU.EDU (Clay Bridges) Date: Tue, 19 Jun 90 12:35:16 EDT Subject: A GA Tutorial and a GA Short Course Message-ID: <6836.645813316@GS10.SP.CS.CMU.EDU> A tutorial entitled "Genetic Algorithms and Classifier Systems" will be presented on Wednesday afternoon, August 1, at the AAAI conference in Boston, MA by David E. Goldberg (Alabama) and John R. Koza (Stanford). The course will survey GA mechanics, power, applications, and advances together with similar information regarding classifier systems and other genetics-based machine learning systems. For further information regarding this tutorial write to AAAI-90, Burgess Drive, Menlo Park, CA 94025, (415)328-3123. A five-day short course entitled "Genetic Algorithms in Search, Optimization, and Machine Learning" will be presented at Stanford University's Western Institute in Computer Science on August 6-10 by David E. Goldberg (Alabama) and John R. Koza (Stanford). The course presents in-depth coverage of GA mechanics, theory and application in search, optimization, and machine learning. Students will be encouraged to solve their own problems in hands-on computer workshops monitored by the course instructors. For further information regarding this course contact Joleen Barnhill, Western Institute in Computer Science, PO Box 1238, Magalia, CA 95954, (916)873-0576. From Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU Wed Jun 20 00:08:55 1990 From: Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave.Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Wed, 20 Jun 90 00:08:55 EDT Subject: tech report announcement Message-ID: <2930.645854935@DST.BOLTZ.CS.CMU.EDU> Here comes yet another tech report announcement. *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** Rules and Maps III: Further Progress in Connectionist Phonology David S. Touretzky Deirdre W. Wheeler Gillette Elvgren III June 1990 Report number CMU-CS-90-138 ABSTRACT: This report contains three papers from an ongoing research project on connectionist phonology. The first introduces syllabification into our ``many maps'' processing model. The second shows how syllabification and a previously-described clustering mechanism can be used jointly to implement the stress assignment rules of a number of languages. The third paper describes a preliminary version of a phonological rule-learning program whose rule syntax is determined by the architecture of our model. ``Two Derivations Suffice: The Role of Syllabification in Cognitive Phonology'' will appear in C. Tenny (ed.), The MIT Parsing Volume, 1989-1990. MIT Center for Cognitive Science, Parsing Project Working Papers 3. ``From Syllables to Stress: A Cognitively Plausible Model'' will appear in K. Deaton, M. Noske, and M. Ziolkowski (eds.), CLS 26-II: Papers from the Parasession on The Syllable in Phonetics and Phonology. Chicago: Chicago Linguistic Society, 1990. ``Phonological Rule Induction: An Architectural Solution'' will appear in Proceedings of the Twelfth Annual Conference of the Cognitive Science Society. Hillsdale, NJ: Lawrence Erlbaum Associates, 1990. ................................................................ To order this report, send email to Catherine Copetas (copetas at cs.cmu.edu) requesting a copy of CMU-CS-90-138. 
Be sure to include your physical mail address in the message. *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** From lina at ai.mit.edu Wed Jun 20 17:54:51 1990 From: lina at ai.mit.edu (Lina Massone) Date: Wed, 20 Jun 90 17:54:51 EDT Subject: No subject Message-ID: <9006202154.AA03637@globus-pallidus> ********** DO NOT FORWARD TO OTHER BBOARDS *********** The following technical report is available. Target-Switching Experiments with a Sequential Neuro-Controller Lina Massone Dept. of Brain and Cognitive Sciences Massachusetts Institute of Technology 77 Massachusetts Avenue - Cambridge, MA 02139 This paper describes some target-switching experiments simulated with a neural system that drives a three-joint redundant limb. The system is composed of a controller (a sequential network) and a limb emulator. The system was trained to generate aiming movements of the limb towards targets specified as sensory stimuli; it was not trained to perform the target-switching task itself. The experiments demonstrate that the system possesses the ability to solve the target-switching task, which requires generalization with respect to both initial limb posture and sensory stimulation. I performed the experiments under two different perceptual conditions: (1) on/off switching of the two stimuli, (2) temporal overlap of the two stimuli. The second case refers to a hypothesis proposed by many experimental investigators about two different systems being involved in the programming of movements: a "where-system" that would build an internal representation of the target that shifts gradually to the new values, and a "when-system" that would start the motor program generator. The "where-system" would be able to account for the observed differences in path, while the "when-system" would be able to account for the response-time phenomenon. The case of temporal overlap of the two stimuli is a simulation of the "where-system". I present a qualitative comparison of data generated by the neural system under conditions (1) and (2), namely (i) the endpoint paths and velocity profiles, (ii) the patterns of muscular activation. Results of the comparison show that in the presence of the "where-system" the controller can account for the variability in paths and the basic two-peak structure of the velocity profiles commonly observed in psychophysical experiments. In the absence of the "where-system" the behavior of the controller is, on the contrary, highly stereotyped. Results also point out an inadequacy in the network architecture to deal with the observed high peak velocities after stimuli are switched. Please forward all requests to lina at ai.mit.edu From thomasp at lan.informatik.tu-muenchen.dbp.de Thu Jun 21 07:53:54 1990 From: thomasp at lan.informatik.tu-muenchen.dbp.de (Patrick Thomas) Date: 21 Jun 90 13:53:54+0200 Subject: Independent, again Message-ID: <9006211153.AA11935@gshalle1.informatik.tu-muenchen.de> I wonder who is currently supporting the idea of INDEPENDENT (non-Hebbian) rules for synaptic plasticity apart from Finkel & Edelman (1). They formulated a synaptic plasticity mechanism which is based on a PRESYNAPTIC RULE (the efficacy of the presynaptic terminal is dependent only on the activity of the presynaptic neuron, no postsynaptic firing or above-threshold depolarization is necessary, ALL presynaptic terminals are affected) and on a POSTSYNAPTIC RULE which is a heterosynaptic modification rule similar to that of Changeux, Alkon and others.
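To make the contrast concrete, a toy sketch in C of the kinds of update rule at issue. The functional forms and constants are illustrative assumptions, not Finkel & Edelman's actual equations:

    /* Hebbian: the weight changes only when presynaptic and
       postsynaptic activity coincide. */
    void hebb_update(double *w, double pre, double post, double lr)
    {
        *w += lr * pre * post;
    }

    /* Presynaptic rule: efficacy tracks presynaptic activity alone
       and is applied to ALL terminals of the presynaptic cell. */
    void presynaptic_rule(double *w, int n_terminals, double pre, double lr)
    {
        int t;
        for (t = 0; t < n_terminals; t++)
            w[t] += lr * (pre - w[t]);
    }

    /* Postsynaptic (heterosynaptic) rule: local dendritic
       depolarization modifies neighbouring synapses whether or not
       they were themselves active. */
    void postsynaptic_rule(double *w, int n_neighbours,
                           double local_depol, double lr)
    {
        int s;
        for (s = 0; s < n_neighbours; s++)
            w[s] += lr * local_depol * (1.0 - w[s]);
    }

The point of the independence claim is that the second and third rules never multiply presynaptic by postsynaptic activity, so no coincidence detector is required.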
There is general agreement that Hebb's original notion of postsynaptic FIRING as a condition of synaptic weight increase is inappropriate. Usually a postsynaptic DEPOLARIZATION is said to be needed, with a further refinement preventing unbounded weight increase, namely some kind of ANTI-HEBB condition which decreases synaptic weight in the absence of correlated conditions of "activity" (cf Stent 1973, Palm and others). Of course there are numerous other variations of Hebb rules not to be considered here (cf Brown, 1990, Ann Rev NS). But what shall we do with the following two facts: 1) No mechanism is known to detect coincidence of pre/postsynaptic "activity". The NMDA-receptor complex is currently en vogue, but available data is inconclusive. 2) There is a growing amount of data related to heteroassociative interactions LOCAL to the dendritic tree between neighbouring synapses. So why not redefine our models based on these observations? All of the heteroassociative effects of synaptic plasticity, of course, rely on some kind of "postsynaptic activity". But this is not meant in the Hebb sense, involving the postsynaptic neuron as a functional whole, but rather in the context of local depolarization affecting neighbouring membrane channel properties, for example. Alkon therefore simulates with his "neurons" having distinct patches for incoming signals. In addition to a postsynaptic/heterosynaptic mechanism there is ample evidence for homo/multisynaptic facilitation and depression which is independent of postsynaptic activity. Edelman's DUAL RULES MODEL sketched earlier could therefore well be an appropriate starting point for the investigation of new learning laws to be applied within the context of Artificial Neural Networks (actually, it needs some refinements). Could someone provide references to work either crushing the idea of independent modification rules or supporting it? Thanx in advance. Patrick Thomas Computer Science, Munich Technical University (1) "Synaptic Function", Edelman/Gall/Cowan (eds), Wiley, 1987. PS: Bad timing. I bet everybody is in San Diego. PPS: The Kelso (1986) and Bonhoeffer (1989) results are admittedly a challenge to non-Hebbian rules. Hopefully a moderate one. From holyoak at cognet.ucla.edu Thu Jun 21 13:41:57 1990 From: holyoak at cognet.ucla.edu (Keith J Holyoak) Date: Thu, 21 Jun 90 10:41:57 PDT Subject: connectionist analogy etc. Message-ID: <9006211741.AA02274@paris.cognet.ucla.edu> Information for inclusion in a CALL FOR PAPERS. SERIES: Advances in Connectionist and Neural Computation Theory SERIES EDITOR: John A. Barnden VOLUME: 2 VOLUME TITLE: "Connectionist Approaches to Analogy, Metaphor and Case-Based Reasoning" VOLUME EDITORS: Keith J. Holyoak, Department of Psychology, University of California, Los Angeles, CA 90024, (213) 206-1646, holyoak at cognet.ucla.edu; John A. Barnden, Computer Science Department & Computing Research Laboratory, Box 30001/3CRL, New Mexico State University, Las Cruces, NM 88003, (505) 646-6235, jbarnden at nmsu.edu DESCRIPTION Connectionist capabilities such as associative retrieval, approximate matching, soft constraint handling and adaptation hold considerable promise for supporting analogy-based reasoning, case-based reasoning and metaphor processing.
At the same time, these three strongly related forms of processing traditionally involve complex symbol structures, and connectionism continues to have difficulty in providing the benefits normally supplied by such structures. Recently, some connectionist approaches to metaphor, analogy and case-based reasoning have begun to appear, and the purpose of our volume is to encourage further work and discussion in this area. The volume will include both invited and submitted peer-reviewed articles. We are seeking submissions from researchers in any relevant field -- artificial intelligence, psychology, philosophy, linguistics, and others. Articles can be positive, neutral or negative on the applicability of connectionism to analogy/metaphor/case-based processing. They can be of any type, including subfield reviews, general discussions, critiques, detailed presentations of models or supporting mechanisms, formal theoretical analyses, empirical studies, and methodological studies. SUBMISSION PROCEDURE Submissions may be sent to either editor, by 20 November 1990. The suggested length is 7000-20,000 words excluding figures, references, abstract and so on. Format details, etc. will be supplied on request. Authors are strongly encouraged to discuss ideas for possible submissions with the editors. ((ADVERTISEMENT FOR VOLUME 1, TO BE INSERTED BY ABLEX)) From tsejnowski at UCSD.EDU Thu Jun 21 15:50:15 1990 From: tsejnowski at UCSD.EDU (Terry Sejnowski) Date: Thu, 21 Jun 90 12:50:15 PDT Subject: Independent, again Message-ID: <9006211950.AA19324@sdbio2.UCSD.EDU> There are two types of LTP in the hippocampus, one in area CA1 (and elsewhere) that depends on the NMDA receptor (and is blocked by AP5) and another type in area CA3 that is not blocked by AP5. The latter appears not to be associative and may not be Hebbian (but the experimental evidence is not yet definitive on this point). In addition to heterosynaptic depression (postsynaptic activity in the absence of presynaptic activity) there is also evidence for homosynaptic depression (presynaptic activity in the absence of postsynaptic activity). For a review of these mechanisms, see Sejnowski et al., Induction of synaptic plasticity by Hebbian covariance in the hippocampus, In: R. Durbin, C. Miall and G. Mitchison (Eds.), The Computing Neuron, Addison-Wesley, 1989. Incidentally, this collection of papers is one of the best on the interface of biology with computational models. Another good recent collection specifically on biologically relevant connectionist models is Connectionist Modeling and Brain Function, Hanson and Olson (Eds.), MIT Press, 1990. The emerging evidence from neurobiologists is that there is a multiplicity of mechanisms for plasticity at synapses. Furthermore, there are mechanisms that can change the excitability of a neuron, such as changing the density or voltage dependence of ion-selective channels in the membrane. This is similar to changing the threshold and shape of the nonlinearity, except that the change may be specific to a dendritic branch, not the whole neuron. This gives Nature (and modelers) a much richer palette of mechanisms to work with. Terry ----- From rbelew at UCSD.EDU Tue Jun 26 08:26:18 1990 From: rbelew at UCSD.EDU (Rik Belew) Date: Tue, 26 Jun 90 05:26:18 PDT Subject: Evolving Networks - New TR Message-ID: <9006261226.AA04629@blakey.ucsd.edu> EVOLVING NETWORKS: USING THE GENETIC ALGORITHM WITH CONNECTIONIST LEARNING Richard K.
Belew John McInerney Nicolaus Schraudolph Cognitive Computer Science Research Group Computer Science & Engr. Dept. (C-014) Univ. California at San Diego La Jolla, CA 92093 rik at cs.ucsd.edu CSE Technical Report #CS90-174 June, 1990 ABSTRACT It is appealing to consider hybrids of neural-network learning algorithms with evolutionary search procedures, simply because Nature has so successfully done so. In fact, computational models of learning and evolution offer theoretical biology new tools for addressing questions about Nature that have dogged that field since Darwin. The concern of this paper, however, is strictly artificial: Can hybrids of connectionist learning algorithms and genetic algorithms produce more efficient and effective algorithms than either technique applied in isolation? The paper begins with a survey of recent work (by us and others) that combines Holland's Genetic Algorithm (GA) with connectionist techniques and delineates some of the basic design problems these hybrids share. This analysis suggests the dangers of overly literal representations of the network on the genome (e.g., encoding each weight explicitly). A preliminary set of experiments that use the GA to find unusual but successful values for BP parameters (learning rate, momentum) are also reported. The focus of the report is a series of experiments that use the GA to explore the space of initial weight values, from which two different gradient techniques (conjugate gradient and back propagation) are then allowed to optimize. We find that use of the GA provides much greater confidence in the face of the stochastic variation that can plague gradient techniques, and can also allow training times to be reduced by as much as two orders of magnitude. Computational trade-offs between BP and the GA are considered, including discussion of a software facility that exploits the parallelism inherent in GA/BP hybrids. This evidence leads us to conclude that the GA's GLOBAL SAMPLING characteristics complement connectionist LOCAL SEARCH techniques well, leading to efficient and reliable hybrids. -------------------------------------------------- If possible, please obtain a postscript version of this technical report from the pub/neuroprose directory at cheops.cis.ohio-state.edu. Here are the directions: /*** Note: This file is not yet in place. Give us a few days, ***/ /*** say until after 4th of July weekend, before you try to get it. ***/ unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Name (cheops.cis.ohio-state.edu:): anonymous Password (cheops.cis.ohio-state.edu:anonymous): neuron ftp> cd pub/neuroprose ftp> type binary ftp> get (remote-file) evol-net.ps.Z (local-file) foo.ps.Z ftp> quit unix> uncompress foo.ps.Z unix> lpr -P(your_local_postscript_printer) foo.ps /*** Note: This file is not yet in place. Give us a few days, ***/ /*** say until after 4th of July weekend, before you try to get it. ***/ If you do not have access to a postscript printer, copies of this technical report can be obtained by sending requests to: Kathleen Hutcheson CSE Department (C-014) Univ. Calif. -- San Diego La Jolla, CA 92093 Ask for CSE Technical Report #CS90-174, and enclose $3.00 to cover the cost of publication and postage.
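As a concrete illustration of the hybrid the abstract describes -- the GA searching over initial weights, gradient descent refining each candidate -- here is a toy C program. The quadratic stand-in for network error, the tournament scheme, and all constants are illustrative assumptions, not the experiments reported in the TR:

    #include <stdio.h>
    #include <stdlib.h>

    #define POP        20  /* population of initial weight vectors */
    #define DIM         4  /* toy "network" has four weights       */
    #define GENS       30
    #define GRAD_STEPS 10

    static const double target[DIM] = { 0.3, -1.2, 0.8, 2.0 };

    static double net_error(const double *w)  /* stand-in for net error */
    {
        double e = 0.0;
        int i;
        for (i = 0; i < DIM; i++)
            e += (w[i] - target[i]) * (w[i] - target[i]);
        return e;
    }

    /* Fitness of an INITIAL weight vector: error remaining after a
       short gradient run started from it. */
    static double fitness(const double *w0)
    {
        double w[DIM];
        int i, s;
        for (i = 0; i < DIM; i++) w[i] = w0[i];
        for (s = 0; s < GRAD_STEPS; s++)
            for (i = 0; i < DIM; i++)
                w[i] -= 0.1 * 2.0 * (w[i] - target[i]);  /* -lr * dE/dw */
        return net_error(w);  /* lower is fitter */
    }

    static double frand(double lo, double hi)
    {
        return lo + (hi - lo) * rand() / RAND_MAX;
    }

    int main(void)
    {
        double pop[POP][DIM];
        int p, i, g, best;
        for (p = 0; p < POP; p++)
            for (i = 0; i < DIM; i++)
                pop[p][i] = frand(-2.0, 2.0);
        for (g = 0; g < GENS; g++) {
            /* crude pairwise tournament: the loser is overwritten by
               a mutated copy of the winner */
            for (p = 0; p < POP; p += 2) {
                int win = p, lose = p + 1;
                if (fitness(pop[win]) > fitness(pop[lose])) {
                    win = p + 1; lose = p;
                }
                for (i = 0; i < DIM; i++)
                    pop[lose][i] = pop[win][i] + frand(-0.1, 0.1);
            }
        }
        best = 0;
        for (p = 1; p < POP; p++)
            if (fitness(pop[p]) < fitness(pop[best])) best = p;
        printf("error after GA + gradient refinement: %g\n",
               fitness(pop[best]));
        return 0;
    }

The structural point is that fitness evaluation is itself a short gradient run, so the GA samples basins of attraction rather than single points -- the global-sampling / local-search division of labor the abstract argues for.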
From sontag at hilbert.rutgers.edu Wed Jun 27 16:27:35 1990 From: sontag at hilbert.rutgers.edu (Eduardo Sontag) Date: Wed, 27 Jun 90 16:27:35 EDT Subject: Tech Reports Available Message-ID: <9006272027.AA14695@hilbert.rutgers.edu> The following report is now available: "On the recognition capabilities of feedforward nets" by Eduardo D. Sontag, SYCON Center, Rutgers University. ABSTRACT: In this note we deal with the recognition capabilities of various feedforward neural net architectures, analyzing the effect of direct input to output connections and comparing Heaviside (threshold) with sigmoidal response units. The results state, roughly, that allowing direct connections or allowing sigmoidal responses doubles the recognition power of the standard architecture (no connections, Heaviside responses) which is often assumed in theoretical studies. Recognition power is expressed in terms of various measures, including worst-case and VC-dimension, though in the latter case, only results for subsets of the plane are proved (the general case is still open). There is also some discussion of Boolean recognition problems, including the example of computing N-bit parity with about N/2 sigmoids. --------------------------------------------------------------------------- To obtain copies of the postscript file, please use Jordan Pollack's service: Example: unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Name (cheops.cis.ohio-state.edu:): anonymous Password (cheops.cis.ohio-state.edu:anonymous): ftp> cd pub/neuroprose ftp> binary ftp> get (remote-file) sontag.capabilities.ps.Z (local-file) foo.ps.Z ftp> quit unix> uncompress foo.ps unix> lpr -P(your_local_postscript_printer) foo.ps ---------------------------------------------------------------------------- If you have any difficulties with the above, please send e-mail to sontag at hilbert.rutgers.edu. DO NOT "reply" to this message, please. From oruiz at fi.upm.es Thu Jun 28 09:19:00 1990 From: oruiz at fi.upm.es (Oscar Ruiz) Date: 28 Jun 90 15:19 +0200 Subject: neural efficiency Message-ID: <42*oruiz@fi.upm.es> I would appreciate any information about the relationship between neural networks and algorithm efficiency theory. My address is the following: Miguel A. Lerma Sancho Davila 18 28028 MADRID SPAIN Thanks in advance.  From koza at Sunburn.Stanford.EDU Fri Jun 29 18:33:15 1990 From: koza at Sunburn.Stanford.EDU (John Koza) Date: Fri, 29 Jun 1990 15:33:15 PDT Subject: Genetic Programming -new TR Available Message-ID: A new technical report entitled "Genetic Programming: A Paradigm for Genetically Breeding Populations of Computer Programs to Solve Problems" is now available as Stanford University Computer Science Department technical report no. STAN-CS-90-1314. ABSTRACT: Many seemingly different problems in artificial intelligence, symbolic processing, and machine learning can be viewed as requiring discovery of a computer program that produces some desired output for particular inputs. When viewed in this way, the process of solving these problems becomes equivalent to searching a space of possible computer programs for a most fit individual computer program. The new "genetic programming" paradigm described in this report provides a way to search for this most fit individual computer program. 
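The recombination operator described in the next passage acts on program parse trees: a randomly chosen subtree of one parent is swapped with a randomly chosen subtree of the other. A minimal sketch in C -- the node representation and uniform random choice are illustrative assumptions, not the report's actual implementation:

    #include <stdlib.h>

    typedef struct node {
        const char  *op;        /* e.g. "+", "*", "x", "3" */
        struct node *child[2];  /* NULL for terminals      */
    } node;

    static int tree_size(const node *t)
    {
        if (!t) return 0;
        return 1 + tree_size(t->child[0]) + tree_size(t->child[1]);
    }

    /* Return the address of the k-th node pointer, preorder. */
    static node **nth_slot(node **slot, int *k)
    {
        node **hit;
        if (!*slot) return NULL;
        if ((*k)-- == 0) return slot;
        hit = nth_slot(&(*slot)->child[0], k);
        return hit ? hit : nth_slot(&(*slot)->child[1], k);
    }

    /* Crossover: pick one random subtree in each parent and swap them,
       yielding two syntactically valid offspring programs. */
    void crossover(node **mom, node **dad)
    {
        int i = rand() % tree_size(*mom);
        int j = rand() % tree_size(*dad);
        node **a = nth_slot(mom, &i);
        node **b = nth_slot(dad, &j);
        node *tmp = *a;
        *a = *b;
        *b = tmp;
    }

Because whole subtrees are exchanged, every offspring is again a well-formed program, which is what makes crossover on programs workable at all.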
In this new "genetic programming" paradigm, populations of computer programs are genetically bred using the Darwinian principle of survival of the fittest and using a genetic crossover (recombination) operator appropriate for genetically mating computer programs. In this report, the process of formulating and solving problems using this new paradigm is illustrated using examples from various areas. Examples come from the areas of machine learning of a function; planning; sequence induction; symbolic function identification (including symbolic regression, empirical discovery, "data to function" symbolic integration, "data to function" symbolic differentiation); solving equations (including differential equations, integral equations, and functional equations); concept formation; automatic programming; pattern recognition; time-optimal control; playing differential pursuer-evader games; neural network design; and finding a game-playing strategy for a game in extensive form. AVAILABILITY: (1) A limited number of copies of this report can be obtained from the author FREE between now and August 31, 1990, by writing John Koza, Post Office Box K, Los Altos Hills, CA 94023. (2) Copies may be obtained for $15 from Taleen Nazarian, Computer Science Department, Margaret Jacks Hall, Stanford University, Stanford, CA 94023 USA. John R. Koza Computer Science Department Stanford University From lss at compsci.stirling.ac.uk Fri Jun 29 11:19:07 1990 From: lss at compsci.stirling.ac.uk (Dr. Leslie S. Smith) Date: 29 Jun 90 11:19:07 BST (Fri) Subject: No subject Message-ID: <9006291119.AA06820@uk.ac.stir.cs.crown> Subject: Request for information I have a student about to undertake a project on the application of Neural Nets to identification of earthquake seismic signatures. I would be most appreciative if anyone could tell me of any references in this area. -- Leslie Smith -- lss at uk.ac.stir.cs --Dr. L. S. Smith, Department of Computing Science, Univ of Stirling. From jmerelo at ugr.es Tue Jun 26 06:01:00 1990 From: jmerelo at ugr.es (JJ Merelo) Date: 26 Jun 90 12:01 +0200 Subject: Introduction Message-ID: <44*jmerelo@ugr.es> My name is JJ Merelo; I am working at Granada University. Our group is called CSIP, and we are more prone to the hardware stuff, but I am myself concerned with software. I have already implemented a Kohonen network, which is being used for Spanish speech recognition. The source code is available in C, should anyone be interested. That's all for now. JJ ================== From neuron-request at hplabs.hpl.hp.com Tue Jun 26 05:46:00 1990 From: neuron-request at hplabs.hpl.hp.com (Neuron-Digest Moderator Peter Marvit) Date: 26 Jun 90 11:46 +0200 Subject: Welcome to Neuron-Digest In-Reply-To: > Message-ID: <9192.646334026@hplpm.hpl.hp.com> >X-Handled-By: EUnet via goya.uucp The following address has been added to the Neuron-Digest mailing list: "JJ Merelo" You should begin receiving Digests shortly. At the end of this message is the official "blurb" of this Digest. You can retrieve back issues with anonymous ftp, as described in the blurb. Please let me know if you have difficulties or need back issues mailed. Please feel free to submit messages early and often. The Digest will be thin without your participation. Send all messages to (UUCP style) "hplabs!neuron-request" or (ARPA style) "neuron-request at hplabs.hp.com". Who are YOU and what are YOUR interests? Also, how did you find out about the Digest?
-Peter Marvit Neuron-Digest Moderator ------------------------------ CUT HERE ------------------------------- ARPA: NEURON at hplabs.hp.com uucp: ...!hplabs!neuron Neuron-Digest is a list (in digest form) dealing with all aspects of neural networks (and any type of network or neuromorphic system), especially: NATURAL SYSTEMS Software Simulations Neurobiology Hardware Neuroscience Digital ARTIFICIAL SYSTEMS Analog Neural Networks Optical Algorithms Cellular Automatons Some key words which may stir up some further interest include: Hebbian Systems Widrow-Hoff Algorithm Perceptron Threshold Logic Holography Content Addressable Memories Lyapunov Stability Criterion Navier-Stokes Equation Annealing Spin Glasses Locally Coupled Systems Globally Coupled Systems Dynamical Systems (Adaptive) Control Theory Back-Propagation Generalized Delta Rule Pattern Recognition Vision Systems Parallel Distributed Processing Connectionism Any contribution in these areas is accepted. Any of the following are reasonable: Abstracts Reviews Lab Descriptions Research Overviews Work Planned or in Progress Half-Baked Ideas Conference Announcements Conference Reports Bibliographies History Connectionism Puzzles and Unsolved Problems Anecdotes, Jokes, and Poems Queries and Requests Address Changes (Bindings) Archived files/messages are available with anonymous ftp from hplpm.hpl.hp.com (15.255.176.205) in the directory pub/Neuron-Digest. That directory contains back issues with the names vol-nn-no-mm (e.g., vol-3-no-02). I'm also collecting simulation software in pub/Neuron-Software. Contributions are welcome. All requests to be added to or deleted from this list, problems, questions, etc., should be sent to neuron-request at hplabs.hp.com. Moderator: Peter Marvit ------------------------------ CUT HERE -------------------------------
From ejua61 at castle.edinburgh.ac.uk Thu Jun 7 12:29:20 1990 From: ejua61 at castle.edinburgh.ac.uk (ejua61@castle.edinburgh.ac.uk) Date: Thu, 7 Jun 90 12:29:20 WET DST Subject: Optimality: BBS Call for Commentators Message-ID: <9006071229.aa06693@castle.ed.ac.uk> I tried to contact you at your Princeton address but failed to get through. We (Wann, Wing) would be interested in providing commentary on "Optimality" as an organising principle in nature from the standpoint of developing movement control in humans, related to our previous commentary on Gottlieb et al 1989. An indication of the time-course for review would be appreciated. John Wann (john1 at uk.ac.edinburgh) Dept. of Psychology 7 G 7 George Sq., Edinburgh EH8 9JZ, Scotland (Prev. MRC APU Cambridge England) From Alex.Waibel at SPEECH2.CS.CMU.EDU Sun Jun 10 20:09:38 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Sun, 10 Jun 90 20:09:38 EDT Subject: Special Issue Announcement Message-ID: ANNOUNCEMENT MACHINE LEARNING will be publishing a special issue devoted to connectionist models under the title: "Structured Connectionist Learning Systems: Methods and Real World Applications" MACHINE LEARNING publishes articles on all aspects of Machine Learning, and on occasion runs special issues on particular subtopics of special interest. This issue of the journal will emphasize connectionist learning systems that aim at real world applications. Papers are solicited on this topic. Five copies of the manuscript should be sent by August 3, 1990 to: Dr. Alex Waibel School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Telephone: (412) 268-7676 Papers will be subject to the standard review process. From dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU Mon Jun 11 08:29:59 1990 From: dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU (Dario Ringach) Date: Mon, 11 Jun 90 15:29:59 +0300 Subject: Density Theorems and Nonorthogonal Expansions: Are the Same? Message-ID: <9006111229.AA12223@techunix.bitnet> Hi! Couldn't all those "density theorems" of networks in L^2(R) be regarded as particular cases of the nonorthogonal expansion stuff we already know? For instance, a representation in which we translate and dilate a single function - a family of affine coherent states - leads to the theory of wavelet representation; while translations and modulations - the Weyl-Heisenberg class - lead to Gabor expansions (see [1] for instance). Another question: given a function phi() in a Banach space, and a function f which we want to approximate by a linear combination g of N functions which are dilations and translations of phi(), does anyone know how to solve the optimization problem min ||f-g||? I know only about heuristic approaches such as "generalized radial basis functions"... Thanks! -- Dario Ringach. [1] I. Daubechies, A. Grossman, Y. Meyer, 'Painless Nonorthogonal Expansions', J. Math. Phys., Vol. 27, No. 5, pp. 1271-1283, May 1986.
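A note on the optimization question just posed: when the N dilations and translations are held fixed, g is linear in the mixing coefficients, so in the L2 norm the optimal coefficients come from an ordinary least-squares solve; the genuinely hard, nonconvex part is choosing the dilation/translation parameters themselves. A small numpy sketch (the mother function phi, the grid, and all parameter values here are made up for illustration):

  import numpy as np

  def phi(t):
      return np.exp(-t * t)                  # an arbitrary mother function

  x = np.linspace(-5.0, 5.0, 400)
  f = np.sin(x) * np.exp(-0.1 * x * x)       # the function to approximate

  # Fixed dilations a and translations b: g(x) = sum_k c_k * phi((x - b_k) / a_k)
  dilations = [0.5, 1.0, 2.0]
  translations = np.linspace(-4.0, 4.0, 9)
  basis = np.column_stack([phi((x - b) / a) for a in dilations for b in translations])

  coef, _, _, _ = np.linalg.lstsq(basis, f, rcond=None)
  g = basis @ coef
  print("||f - g||^2 =", np.sum((f - g) ** 2))

Optimizing over the dilations and translations as well turns this into the same nonconvex fitting problem that the radial-basis-function heuristics mentioned above attack, typically by gradient descent or clustering on the centers.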
From harnad at clarity.Princeton.EDU Tue Jun 12 00:11:15 1990 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Tue, 12 Jun 90 00:11:15 EDT Subject: Cognitive Science Society Meeting Message-ID: <9006120411.AA03207@reason.Princeton.EDU> The XII Annual Conference of the Cognitive Science Society will take place at MIT, July 25-28, 1990. (Immediately preceding the meeting of the AAAI, also to take place in the Boston area). Conference Chair: M. Piattelli-Palmarini (MIT) Scientific Advisors: Beth Adelson (Tufts), Stephen M. Kosslyn (Harvard), Steven Pinker (MIT), Kenneth Wexler (MIT) Registration fees (before July 1 / after July 1): members $150 / $200; non-members $185 / $225; students $90 / $110. Contact the MIT Conference Services, MIT Room 7-111, Cambridge, MA 02139, Tel. (617) 253-1700 _______________________________________________ Outline of the program Tuesday July 24, Wednesday July 25 Tutorials: "Cognitive Aspects of Linguistic Theory", "Logic and Computability", "Cognitive Neuroscience" (Require separate registrations) Wednesday, July 25 4.00 - 7.30 pm Registration at Kresge Auditorium 7.30 - 9.00 First plenary session : Kresge Main Auditorium Welcoming address by Samuel Jay Keyser, Assistant Provost of MIT, Co-Director of the MIT Center for Cognitive Science; Welcoming address by David E. Rumelhart (Stanford), Chairman of the Board of the Cognitive Science Society Keynote speaker: Noam Chomsky (MIT) "Language and Cognition" __________ Thursday, July 26 9.00 am - 11.15 am Symposia: Execution-Time Response: Applying Plans in a Dynamic World Kristian J. Hammond (University of Chicago), Chair Phil Agre (University of Chicago) Richard Alterman (Brandeis University) Reid Simmons (Carnegie Mellon University) R. James Firby (NASA Jet Propulsion Lab) Cognitive Aspects of Linguistic Theory Howard Lasnik (University of Connecticut), Chair David Pesetsky (Massachusetts Institute of Technology), Chair
James T. Higginbotham (Massachusetts Institute of Technology) John McCarthy (University of Massachusetts) Perception, Computation and Categorization Whitman Richards (Massachusetts Institute of Technology), Chair Aaron Bobick (SRI International) Ken Nakayama (Harvard University) Allan Jepson (University of Toronto) Paper Presentations: Rule-Based Reasoning, Explanation and Problem-Solving Reasoning II: Planning 11.30 - 12.45 Plenary session Kresge Main Auditorium Keynote Speaker: Morris Halle (MIT) "Words and their Parts" Chair: Kenneth Wexler (MIT) __________________ Thursday, July 26 Afternoon 2.00 pm - 4.15 pm Symposia: Principle-Based Parsing Robert C. Berwick (Massachusetts Institute of Technology), Chair Steven P. Abney (Bell Communications Research) Bonnie J. Dorr (Massachusetts Institute of Technology) Sandiway Fong (Massachusetts Institute of Technology) Mark Johnson (Brown University) Edward P. Stabler, Jr. (University of California, Los Angeles) Recent Results in Formal Learning Theory Kevin T. Kelly (Carnegie Mellon University) Clark Glymour (Carnegie Mellon University), Chair Self-Organizing Cognitive and Neural Systems Stephen Grossberg (Boston University), Chair Ennio Mingolla (Boston University) Michael Rudd (Boston University) Daniel Bullock (Boston University) Gail A. Carpenter (Boston University) Action Systems: Planning and Execution Emilio Bizzi (Massachusetts Institute of Technology), Chair Michael I. Jordan (Massachusetts Institute of Technology) Paper presentations Reasoning : Analogy Learning and Memory : Acquisition 4.30 - 5.45 Plenary Session (Kresge Main Auditorium) Keynote Speaker: Amos Tversky (Stanford) "Decision under conflict" Chair: Daniel N. Osherson (MIT) Banquet ______________ Friday July 27 9.00 - 11.45 am Symposia: What's New in Language Acquisition? Steven Pinker and Kenneth Wexler (MIT), Chair Stephen Crain (University of Connecticut) Myrna Gopnik (McGill University) Alan Prince (Brandeis University) Michelle Hollander, John Kim, Gary Marcus, Sandeep Prasada, Michael Ullman (MIT) Attracting Attention Anne Treisman (University of California, Berkeley), Chair Patrick Cavanagh (Harvard University) Ken Nakayama (Harvard University) Jeremy M. Wolfe (Massachusetts Institute of Technology) Steven Yantis (Johns Hopkins University) A New Look at Decision Making Susan Chipman (Office of Naval Research) and Judith Orasanu (Army Research Institute and Princeton University), Chair Gary Klein (Klein Associates) John A. Swets (Bolt Beranek & Newman Laboratories) Paul Thagard (Princeton University) Marvin S. Cohen (Decision Science Consortium, Inc.) Designing an Integrated Architecture: The Prodigy View Jaime G. Carbonell (Carnegie Mellon University), Chair Yolanda Gil (Carnegie Mellon University) Robert Joseph (Carnegie Mellon University) Craig A. Knoblock (Carnegie Mellon University) Steve Minton (NASA Ames Research Center) Manuela M. Veloso (Carnegie Mellon University) Paper presentations: Reasoning : Categories and Concepts Language : Pragmatics and Communication 11.30 - 12.45 Plenary Session (Kresge Main Auditorium) Keynote speaker: Margaret Livingstone (Harvard) "Parallel Processing of Form, Color and Depth" Chair: Richard M. Held (MIT) _________________________ Friday, July 27 afternoon 2.00 - 4.15 pm Symposia: What is Cognitive Neuroscience? David Caplan (Harvard Medical School) and Stephen M. Kosslyn (Harvard University), Chair Michael S. Gazzaniga (Dartmouth Medical School)
Michael I. Posner (University of Oregon) Larry Squire (University of California, San Diego) Computational Models of Category Learning Pat Langley (NASA Ames Research Center) and Michael Pazzani (University of California, Irvine), Chair Dorrit Billman (Georgia Institute of Technology) Douglas Fisher (Vanderbilt University) Mark Gluck (Stanford University) The Study of Expertise: Prospects and Limits Anders Ericsson (University of Colorado, Boulder), Chair Neil Charness (University of Waterloo) Vimla L. Patel and Guy Groen (McGill University) Yuichiro Anzai (Keio University) Fran Allard and Jan Starkes (University of Waterloo) Keith Holyoak (University of California, Los Angeles), Discussant Paper presentations: Language (Panel 1) : Phonology Language (Panel 2) : Syntax 4.30 - 5.45 Keynote speaker: Anne Treisman (UC Berkeley) "Features and Objects" Chair: Stephen M. Kosslyn (Harvard) Poster Sessions I. Connectionist Models II. Machine Simulations and Algorithms III. Knowledge and Problem-Solving __________________________________ Saturday, July 28 9.00 am - 11.15 am Symposia: SOAR as a Unified Theory of Cognition: Spring 1990 Allen Newell (Carnegie Mellon University), Chair Richard L. Lewis (Carnegie Mellon University) Scott B. Huffman (University of Michigan) Bonnie E. John (Carnegie Mellon University) John E. Laird (University of Michigan) Jill Fain Lehman (Carnegie Mellon University) Paul S. Rosenbloom (University of Southern California) Tony Simon (Carnegie Mellon University) Shirley G. Tessler (Carnegie Mellon University) Neonate Cognition Richard Held (Massachusetts Institute of Technology), Chair Jane Gwiazda (Massachusetts Institute of Technology) Renee Baillargeon (University of Illinois) Adele Diamond (University of Pennsylvania) Jacques Mehler (CNRS, Paris, France) Discussant Conceptual Coherence in Text and Discourse Arthur C. Graesser (Memphis State University), Chair Richard Alterman (Brandeis University) Kathleen Dahlgren (Intelligent Text Processing, Inc.) Bruce K. Britton (University of Georgia) Paul van den Broek (University of Minnesota) Charles R. Fletcher (University of Minnesota) Roger J. Kreuz (Memphis State University) Richard M. Roberts (Memphis State University) Tom Trabasso and Nancy Stein Paper presentations: Causality, Induction and Decision-Making Vision (Panel 1) : Objects and Features Vision (Panel 2) : Imagery Language : Lexical Semantics Case-Based Reasoning 11.30 - 12.45 Keynote Speaker Ellen Markman (Stanford) "Constraints Children Place on Possible Word Meanings" Chair: Susan Carey (MIT) Lunch presentation: "Cognitive Science in Europe: A Panorama" Chair: Willem Levelt (Max Planck, Nijmegen). Informal presentations by: Jacques Mehler (CNRS, Paris), Paolo Viviani (University of Geneva), Paolo Legrenzi (University of Trieste), Karl Wender (University of Trier). _____________________________ Saturday 28 Afternoon 2.00 - 3.00 pm Paper presentations: Vision : Attention Language Processing Educational Methods Learning and Memory Agents, Goals and Constraints 3.15 - 4.30 Keynote Speaker: Roger Schank (Northwestern) "The Story is the Message: Memory and Instruction" Chair: Beth Adelson (Tufts) 4.30 - 5.45 Keynote Speaker: Stephen Jay Gould (Harvard) "Evolution and Cognition" From curt at cassi.cog.syr.edu Tue Jun 12 09:18:56 1990 From: curt at cassi.cog.syr.edu (Curt Burgess) Date: Tue, 12 Jun 90 09:18:56 EDT Subject: subscribe Message-ID: <9006121318.AA01553@cassi.cog.syr.edu> Can you send me subscription information?
Thanks - Curt Burgess From KAMELI%COSMO at utrcgw.utc.com Mon Jun 11 09:17:00 1990 From: KAMELI%COSMO at utrcgw.utc.com (KAMELI%COSMO@utrcgw.utc.com) Date: Mon, 11 Jun 90 08:17 EST Subject: Connectionists Gathering Message-ID: Is anyone aware of any upcoming workshop(s) in Neural Networks or Neural Modeling, or any other kind of gathering that might provide hands-on training in NNs? Any information would be appreciated. Thanx Nader Kameli kameli at cosmo.otis.utc.com From EDSON%BRUC.ANSP.BR at VMA.CC.CMU.EDU Tue Jun 12 20:04:00 1990 From: EDSON%BRUC.ANSP.BR at VMA.CC.CMU.EDU (EDSON%BRUC.ANSP.BR@VMA.CC.CMU.EDU) Date: Tue, 12 Jun 90 21:04 -0300 Subject: No subject References: ANSP network: HEPnet/SPAN/Bitnet/Internet gateway Message-ID: Subscribe Edson Francozo From SATINDER at cs.umass.EDU Wed Jun 13 12:36:00 1990 From: SATINDER at cs.umass.EDU (SATINDER@cs.umass.EDU) Date: Wed, 13 Jun 90 11:36 EST Subject: This is for the Librarian.! Message-ID: <9006131536.AA08343@dime.cs.umass.edu> Hi! Some time ago, Robbie Jacobs sent out a message inviting requests for his Ph.D. thesis report. Unfortunately he forgot to mention that the department charges for Ph.D. reports, even though other tech. reports are distributed free of charge. In Robbie's absence, I am sending this reminder from the Librarian... Thank you for your request for COINS technical report 90-44, a Ph.D. thesis by Robert Jacobs. Unfortunately, Mr. Jacobs forgot to mention that Ph.D. papers are NOT free of charge. If you would still like to receive the paper, please send a check or money order made out to: ***** COINS Department ***** in the amount of US $7.05. Once I have received the money for the paper, I will forward the paper to you right away. Please forward your request and your check to: Connie Smith, Librarian Computer & Information Science Department University of Massachusetts Lederle Graduate Research Center Amherst, MA 01007 Thanks very much for your interest in our department. I look forward to hearing from you. --------------------------- end message. satinder. P.S. PLEASE DO NOT REPLY TO THIS MESSAGE. It will be ignored! From thomasp at lan.informatik.tu-muenchen.dbp.de Wed Jun 13 16:20:25 1990 From: thomasp at lan.informatik.tu-muenchen.dbp.de (Patrick Thomas) Date: 13 Jun 90 22:20:25+0200 Subject: Independent Rules for Synaptic Plasticity Message-ID: <9006132020.AA00883@gshalle1.informatik.tu-muenchen.de> I wonder who is currently supporting the idea of independent (non-Hebbian) rules for synaptic plasticity apart from Finkel & Edelman (1). They formulated a synaptic plasticity mechanism which is based on a PRESYNAPTIC RULE (the efficacy of the presynaptic terminal is dependent only on the activity of the presynaptic neuron, no postsynaptic firing or above-threshold depolarization is necessary, ALL presynaptic terminals are affected) and on a POSTSYNAPTIC RULE which is a heterosynaptic modification rule similar to that of Changeux and others. Unfortunately there is strong evidence that a Hebbian condition is necessary for modifications of synaptic efficacy (cf 2,3). Even the experiment by Bonhoeffer et al which demonstrated the non-locality of Hebbian modification (all presynaptic terminals are likewise enhanced although only one postsynaptic neuron is depolarized) only occurred when the Hebbian condition was fulfilled. Could someone provide references to work either crushing the idea of independent modification rules or supporting it? Thanx in advance.
Patrick Thomas Computer Science, Munich Technical University (1) "Synaptic Function", Edelman/Gall/Cowan (eds), Wiley, 1987. (2) "Hebbian Synapses in Hippocampus", Kelso et al, PNAS, 83:5326-5330. (3) Bonhoeffer et al, PNAS, 86:8113-8117. PS: Bad timing. I bet everyone is in San Diego. From kruschke at ucs.indiana.edu Wed Jun 13 15:31:00 1990 From: kruschke at ucs.indiana.edu (KRUSCHKE,JOHN,PSY) Date: 13 Jun 90 14:31:00 EST Subject: research report announcement Message-ID: *** PLEASE DO NOT FORWARD TO OTHER BULLETIN BOARDS *** Research Report announcement: ALCOVE: A Connectionist Model of Category Learning John K. Kruschke Psychology and Cognitive Science Indiana University This report should interest cognitive scientists studying category *learning*, especially those familiar with the work of psychologists such as Nosofsky, Medin, Gluck & Bower, and Estes. The report should also interest connectionists studying the abilities of back-prop architectures that use radial basis function nodes, and new architectures for selective attention. ABSTRACT ALCOVE is a new connectionist model of category learning that models the course of learning in humans and their asymptotic performance. The model is a variant of back propagation, using Gaussian (radial basis function) hidden nodes, and *adaptive attentional strengths* on the input dimensions. Unlike standard back propagation networks, ALCOVE cannot develop completely new dimensions for representing the stimuli, but it does learn to differentially attend to the given input dimensions. This constraint is an accurate reflection of human performance. ALCOVE is successfully applied to several category learning phenomena: (1) It correctly orders the difficulty of the six category types from the classic work of Shepard, Hovland and Jenkins (1961). (2) It accurately fits trial-by-trial learning data and mimics the base-rate neglect observed by Gluck and Bower (1988b). In preliminary work, it is also shown that ALCOVE can: (3) exhibit three-stage learning of high-frequency exceptions to rules (cf. Rumelhart & McClelland 1986), (4) show emergent graded internal structure in categories, i.e., typicality ratings, (5) produce asymmetries of similarities between typical and atypical exemplars, (6) show selective sensitivity to correlated dimensions, and (7) learn non-linearly separable categories faster than linearly separable categories, in those cases that humans do. It is also suggested that ALCOVE could serve as the input to a rule generating system, so that the dimensions most attended are the ones first used for rules. Moreover, it is shown that ALCOVE is falsifiable, in principle, and that there are some phenomena in category learning that ALCOVE cannot capture. Nevertheless, ALCOVE is attractive because of the broad range of phenomena it does model. If you are truly interested (supplies are limited), you are welcome to a free copy by e-mailing your physical address to the Cognitive Science Program secretary, Cathy Barnes, at iucogsci at ucs.indiana.edu Be sure to mention "Research Report #19 by John Kruschke". (As usual, don't use the "reply" command to make your request.)
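The forward pass of the architecture described in this abstract is compact enough to sketch. The following is only a schematic reading of it, with all numbers invented, not Kruschke's code: exemplar-centered hidden nodes respond through an attention-weighted similarity function, and association weights map them to category outputs.

  import numpy as np

  def alcove_forward(x, exemplars, attention, assoc, c=1.0):
      # Attention-weighted city-block distance of input x from each exemplar node.
      dist = np.sum(attention * np.abs(exemplars - x), axis=1)
      hidden = np.exp(-c * dist)     # similarity-based (exponential) activation
      return assoc @ hidden          # category outputs

  exemplars = np.array([[0.0, 0.0], [1.0, 1.0]])   # hidden-node positions
  attention = np.array([1.0, 0.1])   # dimension 1 attended, dimension 2 nearly ignored
  assoc = np.array([[1.0, -1.0], [-1.0, 1.0]])     # hidden-to-category weights
  print(alcove_forward(np.array([0.1, 0.9]), exemplars, attention, assoc))

In the full model, gradient descent on the output error adjusts both the association weights and the attention strengths, which is how the given dimensions come to be differentially attended.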
*** PLEASE DO NOT FORWARD TO OTHER BULLETIN BOARDS *** From MRE1%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU Thu Jun 14 15:06:00 1990 From: MRE1%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU (MRE1%VMS.BRIGHTON.AC.UK@VMA.CC.CMU.EDU) Date: Thu, 14 Jun 90 15:06 BST Subject: Facial Feature Detector Message-ID: I would like to know if anyone has or knows of a database of images of human faces or portrait shots of people. I require the images to form training and test sets for a neural network for facial feature detection. This is part of my PhD program and not a commercial development. I would greatly appreciate any help. Mark Evans mre1 at uk.ac.bton.vms From pawlicki at Kodak.COM Fri Jun 15 14:30:19 1990 From: pawlicki at Kodak.COM (Dr. Thaddeus F. Pawlicki) Date: Fri, 15 Jun 90 14:30:19 EDT Subject: Research Positions Available Message-ID: <9006151830.AA15626@strategic.> The Signal Processing Research Group of Eastman Kodak has ongoing research in the areas of Image Processing and Neural Networks. The focus of this work is document understanding and pattern recognition. We are actively recruiting individuals who are interested in these areas. Serious inquiries should contact (hardcopy/phone): Dr. Roger Gaborski Eastman Kodak Company 901 Elmgrove Road Rochester, New York, 14653-5722 (716) 726-4169 From white at cs.rochester.edu Fri Jun 15 11:40:15 1990 From: white at cs.rochester.edu (white@cs.rochester.edu) Date: Fri, 15 Jun 90 11:40:15 -0400 Subject: Tech-Report Announcement Message-ID: <9006151540.AA07823@maple.cs.rochester.edu> The following technical report is now available: LEARNING TO PERCEIVE AND ACT Steven D. Whitehead and Dana H. Ballard Technical Report # 331 (Revised) Department of Computer Science University of Rochester Rochester, NY 14627 ABSTRACT: This paper considers adaptive control architectures that integrate active sensory-motor systems with decision systems based on reinforcement learning. One unavoidable consequence of active perception is that the agent's internal representation often confounds external world states. We call this phenomenon perceptual aliasing and show that it destabilizes existing reinforcement learning algorithms with respect to the optimal decision policy. We then describe a new decision system that overcomes these difficulties for a restricted class of decision problems. The system incorporates a perceptual subcycle within the overall decision cycle and uses a modified learning algorithm to suppress the effects of perceptual aliasing. The result is a control architecture that learns not only how to solve a task but also where to focus its attention in order to collect necessary sensory information. The report can be obtained by sending requests to either peg at cs.rochester.edu or white at cs.rochester.edu. Be sure to mention TR331(revised) in your request. From FRANKLINS%MEMSTVX1.BITNET at VMA.CC.CMU.EDU Fri Jun 15 13:44:00 1990 From: FRANKLINS%MEMSTVX1.BITNET at VMA.CC.CMU.EDU (FRANKLINS%MEMSTVX1.BITNET@VMA.CC.CMU.EDU) Date: Fri, 15 Jun 90 12:44 CDT Subject: report offered Message-ID: What follows is the abstract of a technical report, really more of a position paper, by Max Garzon and myself. It deals more with neural networks as computational tools than as models of cognition. It was motivated by more technical work of ours on the outer reaches of neural computation under ideal conditions. An extended abstract of this work appeared as "Neural computability II", in Proc. 3rd Int. Joint Conf. on Neural Networks, Washington, D.C., 1989, I, 631-637.
******************************************************* When does a neural network solve a problem? Stan Franklin and Max Garzon Institute for Intelligent Systems Department of Mathematical Sciences Memphis State University Memphis, TN 38152 USA Abstract Reproducibility, scalability, controllability, and physical realizability are characteristic features of conventional solutions to algorithmic problems. Their desirability for neural network approaches to computational problems is discussed. It is argued that reproducibility requires the eventual stability of the network at a correct answer, scalability requires consideration of successively larger finite (or just infinite) networks, and the other two features require discrete activations. A precise definition of solution with these properties is offered. The importance of the stability problem in neurocomputing is discussed, as well as the need for study of infinite networks. ******************************************************* A hard copy of the position paper (report 90-11) and/or a full version of "Neural computability II" may be requested from franklins at memstvx1.bitnet. We would greatly appreciate your comments. Please do not REPLY to this message. -- Stan Franklin From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Sun Jun 17 01:02:35 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Sun, 17 Jun 90 01:02:35 EDT Subject: Cascade-Correlation simulator in C Message-ID: Thanks to Scott Crowder, one of my graduate students at Carnegie Mellon, there is now a C version of the public-domain simulator for the Cascade-Correlation learning algorithm. This is a translation of the original simulator that I wrote in Common Lisp. Both versions are now available by anonymous FTP -- see the instructions below. Before anyone asks, we are *NOT* prepared to make tapes and floppy disks for people. Since this code is in the public domain, it is free and no license agreement is required. Of course, as a matter of simple courtesy, we expect people who use this code, commercially or for research, to acknowledge the source. I am interested in hearing about people's experience with this algorithm, successful or not. I will maintain an E-mail mailing list of people using this code so that I can inform users of any bug-fixes, new versions, or problems. Send me E-mail if you want to be on this list. If you have questions that specifically pertain to the C version, contact Scott Crowder (rsc at cs.cmu.edu). If you have more general questions about the algorithm and how to run it, contact me (fahlman at cs.cmu.edu). We'll try to help, though the time we can spend on this is limited. Please use E-mail for such queries if at all possible. Scott Crowder will be out of town for the next couple of weeks, so C-specific problems might have to wait until he returns. The Cascade-Correlation algorithm is described in S. Fahlman and C. Lebiere, "The Cascade-Correlation Learning Architecture" in D. S. Touretzky (ed.) _Advances_in_Neural_Information_Processing_Systems_2_, Morgan Kaufmann Publishers, 1990. A tech report containing essentially the same information can be obtained via FTP from the "neuroprose" collection of postscript files at Ohio State. (See instructions below.) Enjoy, Scott E.
Fahlman School of Computer Science Carnegie-Mellon University Pittsburgh, PA 15217 --------------------------------------------------------------------------- To FTP the simulation code: For people (at CMU, MIT, and soon some other places) with access to the Andrew File System (AFS), you can access the files directly from directory "/afs/cs.cmu.edu/project/connect/code". This file system uses the same syntactic conventions as BSD Unix: case sensitive names, slashes for subdirectories, no version numbers, etc. The protection scheme is a bit different, but that shouldn't matter to people just trying to read these files. For people accessing these files via FTP: 1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu". 2. Log in as user "anonymous" with no password. You may see an error message that says "filenames may not have /.. in them" or something like that. Just ignore it. 3. Change remote directory to "/afs/cs/project/connect/code". Any subdirectories of this one should also be accessible. The parent directories may not be. 4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want. The Lisp version of the Cascade-Correlation simulator lives in file "cascor1.lisp". The C version lives in "cascor1.c". If you try to access this directory by FTP and have trouble, please contact me. The exact FTP commands you use to change directories, list files, etc., will vary from one version of FTP to another. --------------------------------------------------------------------------- To access the postscript file for the tech report: unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62) Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get fahlman.cascor-tr.ps.Z ftp> quit unix> uncompress fahlman.cascor-tr.ps.Z unix> lpr fahlman.cascor-tr.ps (use the flag your printer needs for PostScript) --------------------------------------------------------------------------- From Connectionists-Request at CS.CMU.EDU Sun Jun 17 20:25:18 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Sun, 17 Jun 90 20:25:18 EDT Subject: Connectionists maintainer out of town...delays are possible Message-ID: <7653.645668718@B.GP.CS.CMU.EDU> I will be out of town at the Connectionists Summer School for the next two weeks. After looking at the proposed schedule, it seems likely that I will not have time to answer any mail sent to Connectionists-Request at cs.cmu.edu until I return during the first week of July. Please be patient with any change of address, additions/deletions to/from the list, or other administrative requests. Thanks Scott Crowder Connectionists-Request at cs.cmu.edu (ARPAnet) From white at cs.rochester.edu Mon Jun 18 12:49:19 1990 From: white at cs.rochester.edu (white@cs.rochester.edu) Date: Mon, 18 Jun 90 12:49:19 -0400 Subject: $$ for TR Message-ID: <9006181649.AA08459@maple.cs.rochester.edu> >The following technical report is now available: > > > LEARNING TO PERCEIVE AND ACT > > Steven D. Whitehead and Dana H. Ballard > > Technical Report # 331 (Revised) > Department of Computer Science > University of Rochester > Rochester, NY 14627 > >ABSTRACT: This paper considers adaptive control architectures that >integrate active sensory-motor systems with decision systems based >on reinforcement learning. One unavoidable consequence of active perception >is that the agent's internal representation often confounds external world >states.
We call this phenomenon perceptual aliasing and show that it >destabilizes existing reinforcement learning algorithms with respect >to the optimal decision policy. We then describe a new decision system >that overcomes these difficulties for a restricted class of decision >problems. The system incorporates a perceptual subcycle within the overall >decision cycle and uses a modified learning algorithm to suppress the effects >of perceptual aliasing. The result is a control architecture that learns not >only how to solve a task but also where to focus its attention in order to >collect necessary sensory information. > > >The report can be obtained by sending requests to either peg at cs.rochester.edu >or white at cs.rochester.edu. Be sure to mention TR331(revised) in your request. I failed to mention that the TR costs $2.00. If you have already requested the TR and NO LONGER WANT IT, PLEASE MAIL ME. Otherwise, I'll just send the TR (along with the bill.) My original intention was to bypass our standard billing procedure and distribute the TR freely; however, the overwhelming number of requests has made that impractical. I apologize for the hassle. -Steve From dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU Tue Jun 19 06:27:08 1990 From: dario%TECHUNIX.BITNET at VMA.CC.CMU.EDU (Dario Ringach) Date: Tue, 19 Jun 90 13:27:08 +0300 Subject: Attention! (summary request) Message-ID: <9006191027.AA16187@techunix.bitnet> Some time ago there was a discussion on the list dealing with models of visual spatial attention... Somehow, I erased the summary I had. Can anyone who was interested in the discussion and still has the summary send me a copy? Thanks in advance! -- Dario Ringach From clay at CS.CMU.EDU Tue Jun 19 12:35:16 1990 From: clay at CS.CMU.EDU (Clay Bridges) Date: Tue, 19 Jun 90 12:35:16 EDT Subject: A GA Tutorial and a GA Short Course Message-ID: <6836.645813316@GS10.SP.CS.CMU.EDU> A tutorial entitled "Genetic Algorithms and Classifier Systems" will be presented on Wednesday afternoon, August 1, at the AAAI conference in Boston, MA by David E. Goldberg (Alabama) and John R. Koza (Stanford). The course will survey GA mechanics, power, applications, and advances together with similar information regarding classifier systems and other genetics-based machine learning systems. For further information regarding this tutorial write to AAAI-90, Burgess Drive, Menlo Park, CA 94025, (415) 328-3123. A five-day short course entitled "Genetic Algorithms in Search, Optimization, and Machine Learning" will be presented at Stanford University's Western Institute in Computer Science on August 6-10 by David E. Goldberg (Alabama) and John R. Koza (Stanford). The course presents in-depth coverage of GA mechanics, theory and application in search, optimization, and machine learning. Students will be encouraged to solve their own problems in hands-on computer workshops monitored by the course instructors. For further information regarding this course contact Joleen Barnhill, Western Institute in Computer Science, PO Box 1238, Magalia, CA 95954, (916) 873-0576. From Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU Wed Jun 20 00:08:55 1990 From: Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave.Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Wed, 20 Jun 90 00:08:55 EDT Subject: tech report announcement Message-ID: <2930.645854935@DST.BOLTZ.CS.CMU.EDU> Here comes yet another tech report announcement.
*** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** Rules and Maps III: Further Progress in Connectionist Phonology David S. Touretzky Deirdre W. Wheeler Gillette Elvgren III June 1990 Report number CMU-CS-90-138 ABSTRACT: This report contains three papers from an ongoing research project on connectionist phonology. The first introduces syllabification into our ``many maps'' processing model. The second shows how syllabification and a previously-described clustering mechanism can be used jointly to implement the stress assignment rules of a number of languages. The third paper describes a preliminary version of a phonological rule-learning program whose rule syntax is determined by the architecture of our model. ``Two Derivations Suffice: The Role of Syllabification in Cognitive Phonology'' will appear in C. Tenny (ed.), The MIT Parsing Volume, 1989-1990. MIT Center for Cognitive Science, Parsing Project Working Papers 3. ``From Syllables to Stress: A Cognitively Plausible Model'' will appear in K. Deaton, M. Noske, and M. Ziolkowski (eds.), CLS 26-II: Papers from the Parasession on The Syllable in Phonetics and Phonology. Chicago: Chicago Linguistic Society, 1990. ``Phonological Rule Induction: An Architectural Solution'' will appear in Proceedings of the Twelfth Annual Conference of the Cognitive Science Society. Hillsdale, NJ: Lawrence Erlbaum Associates, 1990. ................................................................ To order this report, send email to Catherine Copetas (copetas at cs.cmu.edu) requesting a copy of CMU-CS-90-138. Be sure to include your physical mail address in the message. *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** *** PLEASE DO NOT FORWARD THIS MESSAGE TO OTHER GROUPS *** From lina at ai.mit.edu Wed Jun 20 17:54:51 1990 From: lina at ai.mit.edu (Lina Massone) Date: Wed, 20 Jun 90 17:54:51 EDT Subject: No subject Message-ID: <9006202154.AA03637@globus-pallidus> ********** DO NOT FORWARD TO OTHER BBOARDS *********** The following technical report is available. Target-Switching Experiments with a Sequential Neuro-Controller Lina Massone Dept. of Brain and Cognitive Sciences Massachusetts Institute of Technology 77 Massachusetts Avenue - Cambridge Ma 02139 This paper describes some target-switching experiments simulated with a neural system that drives a three-joint redundant limb. The system is composed of a controller (a sequential network) and a limb emulator. The system was trained to generate aiming movements of the limb towards targets specified as sensory stimuli; it was not trained to perform the target-switching task itself. The experiments demonstrate that the system possesses the ability to solve the target-switching task, which requires generalization with respect to both initial limb posture and sensory stimulation. I performed the experiments under two different perceptual conditions: (1) on/off switching of the two stimuli, (2) temporal overlap of the two stimuli. The second case refers to a hypothesis proposed by many experimental investigators about two different systems being involved in the programming of movements: a "where-system" that would build an internal representation of the target that shifts gradually to the new values, and a "when-system" that would start the motor program generator. The "where-system" would be able to account for the observed differences in path, while the "when-system" would be able to account for the response-time phenomenon.
The case of temporal overlap of the two stimuli is a simulation of the "where-system". I present a qualitative comparison of data generated by the neural system under conditions (1) and (2), namely (i) the endpoint paths and velocity profiles, (ii) the patterns of muscular activation. Results of the comparison show that in the presence of the "where-system" the controller can account for the variability in paths and the basic two-peak structure of the velocity profiles commonly observed in psychophysical experiments. In the absence of the "where-system" the behavior of the controller is, on the contrary, highly stereotyped. Results also point out an inadequacy of the network architecture in dealing with the observed high peak velocities after stimuli are switched. Please forward all requests to lina at ai.mit.edu From thomasp at lan.informatik.tu-muenchen.dbp.de Thu Jun 21 07:53:54 1990 From: thomasp at lan.informatik.tu-muenchen.dbp.de (Patrick Thomas) Date: 21 Jun 90 13:53:54+0200 Subject: Independent, again Message-ID: <9006211153.AA11935@gshalle1.informatik.tu-muenchen.de> I wonder who is currently supporting the idea of INDEPENDENT (non-Hebbian) rules for synaptic plasticity apart from Finkel & Edelman (1). They formulated a synaptic plasticity mechanism which is based on a PRESYNAPTIC RULE (the efficacy of the presynaptic terminal is dependent only on the activity of the presynaptic neuron, no postsynaptic firing or above-threshold depolarization is necessary, ALL presynaptic terminals are affected) and on a POSTSYNAPTIC RULE which is a heterosynaptic modification rule similar to that of Changeux, Alkon and others. There is general agreement that Hebb's original notion of postsynaptic FIRING as a condition of synaptic weight increase is inappropriate. Usually a postsynaptic DEPOLARIZATION is said to be needed, with a further refinement preventing unbounded weight increase, namely some kind of ANTI-HEBB condition which decreases synaptic weight in the absence of correlated conditions of "activity" (cf Stent 1973, Palm and others). Of course there are numerous other variations of Hebb rules not to be considered here (cf Brown, 1990, Ann Rev NS). But what shall we do with the following two facts: 1) No mechanism is known to detect coincidence of pre/postsynaptic "activity". The NMDA-receptor complex is currently en vogue, but available data is inconclusive. 2) There is a growing amount of data related to heteroassociative interactions LOCAL to the dendritic tree between neighbouring synapses. So why not redefine our models based on these observations? All of the heteroassociative effects of synaptic plasticity, of course, rely on some kind of "postsynaptic activity". But this is not meant in the Hebb sense as to involve the postsynaptic neuron as a functional whole, but rather in the context of local depolarization affecting neighbouring membrane channel properties, for example. Alkon therefore simulates with his "neurons" having distinct patches for incoming signals. In addition to a postsynaptic/heterosynaptic mechanism there is ample evidence for homo/multisynaptic facilitation and depression which is independent of postsynaptic activity. Edelman's DUAL RULES MODEL sketched earlier could therefore well be an appropriate starting point for the investigation of new learning laws to be applied within the context of Artificial Neural Networks (actually, it needs some refinements). Could someone provide references to work either crushing the idea of independent modification rules or supporting it?
Thanx in advance. Patrick Thomas Computer Science, Munich Technical University (1) "Synaptic Function", Edelman/Gall/Cowan (eds), Wiley, 1987. PS: Bad timing. I bet everybody is in San Diego. PPS: The Kelso (1986) and Bonhoeffer (1989) results are admittedly a challenge to non-Hebbian rules. Hopefully a moderate one. From holyoak at cognet.ucla.edu Thu Jun 21 13:41:57 1990 From: holyoak at cognet.ucla.edu (Keith J Holyoak) Date: Thu, 21 Jun 90 10:41:57 PDT Subject: connectionist analogy etc. Message-ID: <9006211741.AA02274@paris.cognet.ucla.edu> Information for inclusion in a CALL FOR PAPERS SERIES: Advances in Connectionist and Neural Computation Theory SERIES EDITOR: John A. Barnden VOLUME: 2 VOLUME TITLE: "Connectionist Approaches to Analogy, Metaphor and Case-Based Reasoning" VOLUME EDITORS: Keith J. Holyoak, Department of Psychology, University of California, Los Angeles, CA 90024, (213) 206-1646, holyoak at cognet.ucla.edu; John A. Barnden, Computer Science Department & Computing Research Laboratory, Box 30001/3CRL, New Mexico State University, Las Cruces, NM 88003, (505) 646-6235, jbarnden at nmsu.edu DESCRIPTION Connectionist capabilities such as associative retrieval, approximate matching, soft constraint handling and adaptation hold considerable promise for supporting analogy-based reasoning, case-based reasoning and metaphor processing. At the same time, these three strongly related forms of processing traditionally involve complex symbol structures, and connectionism continues to have difficulty in providing the benefits normally supplied by such structures. Recently, some connectionist approaches to metaphor, analogy and case-based reasoning have begun to appear, and the purpose of our volume is to encourage further work and discussion in this area. The volume will include both invited and submitted peer-reviewed articles. We are seeking submissions from researchers in any relevant field -- artificial intelligence, psychology, philosophy, linguistics, and others. Articles can be positive, neutral or negative on the applicability of connectionism to analogy/metaphor/case-based processing. They can be of any type, including subfield reviews, general discussions, critiques, detailed presentations of models or supporting mechanisms, formal theoretical analyses, empirical studies, and methodological studies. SUBMISSION PROCEDURE Submissions may be sent to either editor, by 20 November 1990. The suggested length is 7000-20,000 words excluding figures, references, abstract and so on. Format details, etc. will be supplied on request. Authors are strongly encouraged to discuss ideas for possible submissions with the editors. From tsejnowski at UCSD.EDU Thu Jun 21 15:50:15 1990 From: tsejnowski at UCSD.EDU (Terry Sejnowski) Date: Thu, 21 Jun 90 12:50:15 PDT Subject: Independent, again Message-ID: <9006211950.AA19324@sdbio2.UCSD.EDU> There are two types of LTP in the hippocampus, one in area CA1 (and elsewhere) that depends on the NMDA receptor (and is blocked by AP5) and another type in area CA3 that is not blocked by AP5. The latter appears not to be associative and may not be Hebbian (but the experimental evidence is not yet definitive on this point).
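The contrast at issue in this thread is easy to state in update-rule form. Below is a toy numpy illustration (the rates, means, and threshold are invented; this is not a model from any of the cited papers) of a conjunctive covariance rule, which requires correlated pre- and postsynaptic deviations, next to an independent pair of rules in which the presynaptic term ignores the postsynaptic cell entirely:

  import numpy as np

  rng = np.random.default_rng(0)
  pre = rng.random(5)        # presynaptic activity at five synapses
  post = 0.7                 # postsynaptic activity of the cell
  pre_mean, post_mean = 0.5, 0.5

  # Hebbian covariance rule: potentiation only for correlated deviations;
  # anti-correlated activity yields hetero- or homosynaptic depression.
  dw_hebb = 0.1 * (pre - pre_mean) * (post - post_mean)

  # "Dual rules" caricature: a purely presynaptic term, plus a separate
  # heterosynaptic postsynaptic term that depresses inactive inputs.
  dw_pre = 0.05 * (pre - pre_mean)
  dw_post = -0.02 * post * (pre < 0.2)
  print(dw_hebb, dw_pre + dw_post)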
From rbelew at UCSD.EDU Tue Jun 26 08:26:18 1990
From: rbelew at UCSD.EDU (Rik Belew)
Date: Tue, 26 Jun 90 05:26:18 PDT
Subject: Evolving Networks - New TR
Message-ID: <9006261226.AA04629@blakey.ucsd.edu>

EVOLVING NETWORKS: USING THE GENETIC ALGORITHM WITH CONNECTIONIST LEARNING

Richard K. Belew
John McInerney
Nicolaus Schraudolf

Cognitive Computer Science Research Group
Computer Science & Engr. Dept. (C-014)
Univ. California at San Diego
La Jolla, CA 92093
rik at cs.ucsd.edu

CSE Technical Report #CS90-174
June, 1990

ABSTRACT

It is appealing to consider hybrids of neural-network learning algorithms with evolutionary search procedures, simply because Nature has so successfully done so. In fact, computational models of learning and evolution offer theoretical biology new tools for addressing questions about Nature that have dogged that field since Darwin. The concern of this paper, however, is strictly artificial: can hybrids of connectionist learning algorithms and genetic algorithms produce more efficient and effective algorithms than either technique applied in isolation? The paper begins with a survey of recent work (by us and others) that combines Holland's Genetic Algorithm (GA) with connectionist techniques and delineates some of the basic design problems these hybrids share. This analysis suggests the dangers of overly literal representations of the network on the genome (e.g., encoding each weight explicitly). A preliminary set of experiments that use the GA to find unusual but successful values for BP parameters (learning rate, momentum) is also reported. The focus of the report is a series of experiments that use the GA to explore the space of initial weight values, from which two different gradient techniques (conjugate gradient and back propagation) are then allowed to optimize. We find that use of the GA provides much greater confidence in the face of the stochastic variation that can plague gradient techniques, and can also allow training times to be reduced by as much as two orders of magnitude. Computational trade-offs between BP and the GA are considered, including discussion of a software facility that exploits the parallelism inherent in GA/BP hybrids. This evidence leads us to conclude that the GA's GLOBAL SAMPLING characteristics complement connectionist LOCAL SEARCH techniques well, leading to efficient and reliable hybrids.

--------------------------------------------------

If possible, please obtain a postscript version of this technical report from the pub/neuroprose directory at cheops.cis.ohio-state.edu. Here are the directions:

/*** Note: This file is not yet in place. Give us a few days, ***/
/*** say until after the 4th of July weekend, before you try to get it. ***/

unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous): neuron
ftp> cd pub/neuroprose
ftp> type binary
ftp> get (remote-file) evol-net.ps.Z (local-file) foo.ps.Z
ftp> quit
unix> uncompress foo.ps.Z
unix> lpr -P(your_local_postscript_printer) foo.ps

If you do not have access to a postscript printer, copies of this technical report can be obtained by sending requests to:

Kathleen Hutcheson
CSE Department (C-014)
Univ. Calif. -- San Diego
La Jolla, CA 92093

Ask for CSE Technical Report #CS90-174, and enclose $3.00 to cover the cost of publication and postage.
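The division of labor the abstract describes (GA for global sampling, gradient descent for local refinement) can be seen in a toy C sketch. Everything below is assumed for illustration and is not from the technical report: a one-dimensional objective with two basins stands in for a network's error surface, the GA searches over starting points, and each candidate is scored by the value reached after a few gradient steps.

/* GA global sampling + gradient local search on a toy objective
   f(w) = (w*w - 1)^2 + 0.3*w, which has two basins (the left one is
   lower).  Sketch only: population size, mutation width, and the
   objective are illustrative assumptions, not Belew et al.'s setup. */
#include <stdio.h>
#include <stdlib.h>

#define POP 20
#define GENS 30
#define STEPS 50     /* gradient steps per fitness evaluation */
#define LR 0.05

double f(double w)     { double u = w * w - 1.0; return u * u + 0.3 * w; }
double fgrad(double w) { return 4.0 * w * (w * w - 1.0) + 0.3; }

/* Local search: plain gradient descent from a GA-supplied start. */
double descend(double w) {
    for (int t = 0; t < STEPS; t++) w -= LR * fgrad(w);
    return w;
}

double rand_uniform(double lo, double hi) {
    return lo + (hi - lo) * rand() / (double)RAND_MAX;
}

int main(void) {
    double pop[POP];
    srand(1990);
    for (int i = 0; i < POP; i++) pop[i] = rand_uniform(-3.0, 3.0);

    for (int g = 0; g < GENS; g++) {
        /* fitness = objective value after local refinement */
        int best = 0;
        for (int i = 1; i < POP; i++)
            if (f(descend(pop[i])) < f(descend(pop[best]))) best = i;
        /* next generation: mutated copies of the best start point */
        double elite = pop[best];
        for (int i = 0; i < POP; i++)
            pop[i] = elite + rand_uniform(-0.5, 0.5);
        pop[0] = elite;   /* keep the elite start unchanged */
    }
    double w = descend(pop[0]);
    printf("best initial point %.3f -> refined minimum %.3f, f = %.4f\n",
           pop[0], w, f(w));
    return 0;
}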
From sontag at hilbert.rutgers.edu Wed Jun 27 16:27:35 1990
From: sontag at hilbert.rutgers.edu (Eduardo Sontag)
Date: Wed, 27 Jun 90 16:27:35 EDT
Subject: Tech Reports Available
Message-ID: <9006272027.AA14695@hilbert.rutgers.edu>

The following report is now available:

"On the recognition capabilities of feedforward nets"
by Eduardo D. Sontag, SYCON Center, Rutgers University.

ABSTRACT: In this note we deal with the recognition capabilities of various feedforward neural net architectures, analyzing the effect of direct input-to-output connections and comparing Heaviside (threshold) with sigmoidal response units. The results state, roughly, that allowing direct connections or allowing sigmoidal responses doubles the recognition power of the standard architecture (no direct connections, Heaviside responses) which is often assumed in theoretical studies. Recognition power is expressed in terms of various measures, including worst-case and VC-dimension, though in the latter case only results for subsets of the plane are proved (the general case is still open). There is also some discussion of Boolean recognition problems, including the example of computing N-bit parity with about N/2 sigmoids.

---------------------------------------------------------------------------

To obtain copies of the postscript file, please use Jordan Pollack's service:

Example:
unix> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62)
Name (cheops.cis.ohio-state.edu:): anonymous
Password (cheops.cis.ohio-state.edu:anonymous):
ftp> cd pub/neuroprose
ftp> binary
ftp> get (remote-file) sontag.capabilities.ps.Z (local-file) foo.ps.Z
ftp> quit
unix> uncompress foo.ps.Z
unix> lpr -P(your_local_postscript_printer) foo.ps

----------------------------------------------------------------------------

If you have any difficulties with the above, please send e-mail to sontag at hilbert.rutgers.edu. DO NOT "reply" to this message, please.
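For comparison with the N/2-sigmoid result mentioned at the end of the abstract, the classical construction computes N-bit parity with N Heaviside hidden units: hidden unit k fires when at least k inputs are on, and alternating output weights then signal whether that count is odd. The C sketch below implements that classical construction for reference; it is not Sontag's construction, whose point is that sigmoids roughly halve the hidden-unit count.

/* N-bit parity with N Heaviside hidden units.  Hidden unit k fires
   when at least k inputs are on; alternating +/- output weights count
   whether that number is odd.  Reference construction only, not the
   report's N/2-sigmoid result. */
#include <stdio.h>

#define N 5

int heaviside(double x) { return x >= 0.0 ? 1 : 0; }

int parity_net(const int x[N]) {
    double out = 0.0;
    for (int k = 1; k <= N; k++) {
        double s = 0.0;
        for (int i = 0; i < N; i++) s += x[i];   /* unit input weights */
        int hk = heaviside(s - k + 0.5);          /* fires iff sum >= k */
        out += (k % 2 == 1 ? 1.0 : -1.0) * hk;    /* alternating weights */
    }
    return heaviside(out - 0.5);
}

int main(void) {
    int fail = 0;
    for (int m = 0; m < (1 << N); m++) {
        int x[N], ones = 0;
        for (int i = 0; i < N; i++) { x[i] = (m >> i) & 1; ones += x[i]; }
        if (parity_net(x) != ones % 2) fail++;
    }
    printf("%d of %d input patterns misclassified\n", fail, 1 << N);
    return 0;
}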
From oruiz at fi.upm.es Thu Jun 28 09:19:00 1990
From: oruiz at fi.upm.es (Oscar Ruiz)
Date: 28 Jun 90 15:19 +0200
Subject: neural efficiency
Message-ID: <42*oruiz@fi.upm.es>

I would appreciate any information about the relationship between neural networks and algorithm efficiency theory. My address is the following:

Miguel A. Lerma
Sancho Davila 18
28028 MADRID
SPAIN

Thanks in advance.

From koza at Sunburn.Stanford.EDU Fri Jun 29 18:33:15 1990
From: koza at Sunburn.Stanford.EDU (John Koza)
Date: Fri, 29 Jun 1990 15:33:15 PDT
Subject: Genetic Programming - new TR Available
Message-ID: 

A new technical report entitled "Genetic Programming: A Paradigm for Genetically Breeding Populations of Computer Programs to Solve Problems" is now available as Stanford University Computer Science Department technical report no. STAN-CS-90-1314.

ABSTRACT: Many seemingly different problems in artificial intelligence, symbolic processing, and machine learning can be viewed as requiring discovery of a computer program that produces some desired output for particular inputs. When viewed in this way, the process of solving these problems becomes equivalent to searching a space of possible computer programs for a most fit individual computer program. The new "genetic programming" paradigm described in this report provides a way to search for this most fit individual computer program. In this new paradigm, populations of computer programs are genetically bred using the Darwinian principle of survival of the fittest and using a genetic crossover (recombination) operator appropriate for genetically mating computer programs. In this report, the process of formulating and solving problems using this new paradigm is illustrated with examples from various areas: machine learning of a function; planning; sequence induction; symbolic function identification (including symbolic regression, empirical discovery, "data to function" symbolic integration, and "data to function" symbolic differentiation); solving equations (including differential equations, integral equations, and functional equations); concept formation; automatic programming; pattern recognition; time-optimal control; playing differential pursuer-evader games; neural network design; and finding a game-playing strategy for a game in extensive form.

AVAILABILITY: (1) A limited number of copies of this report can be obtained from the author FREE between now and August 31, 1990, by writing John Koza, Post Office Box K, Los Altos Hills, CA 94023. (2) Copies may be obtained for $15 from Taleen Nazarian, Computer Science Department, Margaret Jacks Hall, Stanford University, Stanford, CA 94305 USA.

John R. Koza
Computer Science Department
Stanford University
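The crossover operator for mating programs, which the abstract identifies as the heart of the paradigm, amounts to swapping randomly chosen subtrees between two parent expression trees. The C sketch below shows the idea on tiny arithmetic expressions; the representation and the operator details are illustrative assumptions, not taken from the report.

/* Subtree crossover on expression trees.  Programs here range over
   the operators {+, *} and terminals {x, 1}; everything about the
   representation is an illustrative assumption. */
#include <stdio.h>
#include <stdlib.h>

typedef struct Node {
    char op;                 /* '+', '*', 'x', or '1' */
    struct Node *l, *r;      /* NULL for terminals */
} Node;

Node *node(char op, Node *l, Node *r) {
    Node *n = malloc(sizeof *n);
    n->op = op; n->l = l; n->r = r;
    return n;
}

Node *clone(const Node *n) {
    if (!n) return NULL;
    return node(n->op, clone(n->l), clone(n->r));
}

int count(const Node *n) { return n ? 1 + count(n->l) + count(n->r) : 0; }

/* Return the k-th node (preorder) of the tree rooted at n. */
Node *kth(Node *n, int *k) {
    if (!n) return NULL;
    if ((*k)-- == 0) return n;
    Node *hit = kth(n->l, k);
    return hit ? hit : kth(n->r, k);
}

/* Crossover: copy both parents, pick a random node in each copy, and
   swap the subtrees rooted there.  The second offspring (and its
   memory) is simply discarded in this sketch. */
Node *crossover(const Node *p1, const Node *p2) {
    Node *c1 = clone(p1), *c2 = clone(p2);
    int i = rand() % count(c1), j = rand() % count(c2);
    Node *a = kth(c1, &i), *b = kth(c2, &j);
    Node tmp = *a; *a = *b; *b = tmp;   /* swap node contents in place */
    return c1;
}

void show(const Node *n) {
    if (!n) return;
    if (n->l) { printf("("); show(n->l); printf(" %c ", n->op); show(n->r); printf(")"); }
    else printf("%c", n->op);
}

int main(void) {
    srand(1990);
    /* parents: (x + 1) * x   and   x + (x * x) */
    Node *p1 = node('*', node('+', node('x',0,0), node('1',0,0)), node('x',0,0));
    Node *p2 = node('+', node('x',0,0), node('*', node('x',0,0), node('x',0,0)));
    Node *child = crossover(p1, p2);
    printf("parent 1: "); show(p1); printf("\n");
    printf("parent 2: "); show(p2); printf("\n");
    printf("child:    "); show(child); printf("\n");
    return 0;
}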
From lss at compsci.stirling.ac.uk Fri Jun 29 11:19:07 1990
From: lss at compsci.stirling.ac.uk (Dr. Leslie S. Smith)
Date: 29 Jun 90 11:19:07 BST (Fri)
Subject: Request for information
Message-ID: <9006291119.AA06820@uk.ac.stir.cs.crown>

I have a student about to undertake a project on the application of Neural Nets to identification of earthquake seismic signatures. I would be most appreciative if anyone could tell me of any references in this area.

-- Leslie Smith
-- lss at uk.ac.stir.cs
-- Dr. L. S. Smith, Department of Computing Science, Univ of Stirling.

From jmerelo at ugr.es Tue Jun 26 06:01:00 1990
From: jmerelo at ugr.es (JJ Merelo)
Date: 26 Jun 90 12:01 +0200
Subject: Introduction
Message-ID: <44*jmerelo@ugr.es>

My name is JJ Merelo, and I am working at Granada University. Our group is called CSIP; we lean more toward hardware, but I am myself concerned with software. I have already implemented a Kohonen network, which is being used for Spanish speech recognition. The source code is available in C, should anyone be interested. That's all for now.

JJ
==================
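For list members who have not implemented one, the core of a Kohonen network fits in a page of C. The sketch below is a minimal one-dimensional self-organizing map trained on random points from the unit square; it is illustrative only and is not JJ's code, and the map size, neighbourhood schedule, and learning rates are all assumptions.

/* Minimal 1-D Kohonen self-organizing map: find the best-matching
   unit for each input and drag it and its neighbours toward the
   input.  Illustrative sketch only; not the CSIP group's code.
   Compile with -lm for exp(). */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define UNITS 10
#define DIM 2
#define EPOCHS 2000

int main(void) {
    double w[UNITS][DIM];
    srand(1990);
    for (int u = 0; u < UNITS; u++)
        for (int d = 0; d < DIM; d++)
            w[u][d] = rand() / (double)RAND_MAX;

    for (int t = 0; t < EPOCHS; t++) {
        /* random training input from the unit square */
        double x[DIM];
        for (int d = 0; d < DIM; d++) x[d] = rand() / (double)RAND_MAX;

        /* best-matching unit = smallest squared Euclidean distance */
        int bmu = 0; double best = 1e30;
        for (int u = 0; u < UNITS; u++) {
            double dist = 0.0;
            for (int d = 0; d < DIM; d++)
                dist += (x[d] - w[u][d]) * (x[d] - w[u][d]);
            if (dist < best) { best = dist; bmu = u; }
        }

        /* learning rate and neighbourhood radius both shrink over time */
        double eta = 0.5 * (1.0 - (double)t / EPOCHS);
        double radius = 1.0 + 3.0 * (1.0 - (double)t / EPOCHS);
        for (int u = 0; u < UNITS; u++) {
            double h = exp(-(u - bmu) * (u - bmu) / (2.0 * radius * radius));
            for (int d = 0; d < DIM; d++)
                w[u][d] += eta * h * (x[d] - w[u][d]);
        }
    }
    for (int u = 0; u < UNITS; u++)
        printf("unit %d: (%.2f, %.2f)\n", u, w[u][0], w[u][1]);
    return 0;
}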
From neuron-request at hplabs.hpl.hp.com Tue Jun 26 05:46:00 1990
From: neuron-request at hplabs.hpl.hp.com (Neuron-Digest Moderator Peter Marvit)
Date: 26 Jun 90 11:46 +0200
Subject: Welcome to Neuron-Digest
Message-ID: <9192.646334026@hplpm.hpl.hp.com>

The following address has been added to the Neuron-Digest mailing list: "JJ Merelo" <jmerelo at ugr.es>

You should begin receiving Digests shortly. At the end of this message is the official "blurb" of this Digest. You can retrieve back issues with anonymous ftp, as described in the blurb. Please let me know if you have difficulties or need back issues mailed.

Please feel free to submit messages early and often. The Digest will be thin without your participation. Send all messages to (UUCP style) "hplabs!neuron-request" or (ARPA style) "neuron-request at hplabs.hp.com".

Who are YOU and what are YOUR interests? Also, how did you find out about the Digest?

-Peter Marvit, Neuron-Digest Moderator

------------------------------ CUT HERE -------------------------------

ARPA: NEURON at hplabs.hp.com
uucp: ...!hplabs!neuron

Neuron-Digest is a list (in digest form) dealing with all aspects of neural networks (and any type of network or neuromorphic system), especially:

NATURAL SYSTEMS: Neurobiology, Neuroscience
ARTIFICIAL SYSTEMS: Neural Networks, Algorithms, Cellular Automata
Software Simulations and Hardware: Digital, Analog, Optical

Some key words which may stir up some further interest include: Hebbian Systems, Widrow-Hoff Algorithm, Perceptron, Threshold Logic, Holography, Content Addressable Memories, Lyapunov Stability Criterion, Navier-Stokes Equation, Annealing, Spin Glasses, Locally Coupled Systems, Globally Coupled Systems, Dynamical Systems, (Adaptive) Control Theory, Back-Propagation, Generalized Delta Rule, Pattern Recognition, Vision Systems, Parallel Distributed Processing, Connectionism.

Any contribution in these areas is accepted. Any of the following are reasonable: Abstracts, Reviews, Lab Descriptions, Research Overviews, Work Planned or in Progress, Half-Baked Ideas, Conference Announcements, Conference Reports, Bibliographies, History, Connectionism, Puzzles and Unsolved Problems, Anecdotes, Jokes, and Poems, Queries and Requests, Address Changes (Bindings).

Archived files/messages are available with anonymous ftp from hplpm.hpl.hp.com (15.255.176.205) in the directory pub/Neuron-Digest. That directory contains back issues with the names vol-nn-no-mm (e.g., vol-3-no-02). I'm also collecting simulation software in pub/Neuron-Software. Contributions are welcome.

All requests to be added to or deleted from this list, problems, questions, etc., should be sent to neuron-request at hplabs.hp.com.

Moderator: Peter Marvit

------------------------------ CUT HERE -------------------------------