From krulwich-bruce at YALE.ARPA Mon May 2 15:28:31 1988 From: krulwich-bruce at YALE.ARPA (Bruce Krulwich) Date: Mon, 2 May 88 15:28:31 EDT Subject: Novelty of Neural Net Approach Message-ID: <8805021928.AA08835@SUNED.ZOO.CS.YALE.EDU> I think that Teru Homma's recent message contrasting connectionist and conventional computer systems shows the importance of distinguishing between the hardware characteristics of a system and the characteristics of its reasoning. The lack of such a distinction has resulted in several recent articles in the media claiming that connectionism has solved all the problems of AI, which I don't think is quite true yet. Examples of such a distinction follow:
> Conventional Systems Neural Net (Connectionist) Systems
>8. have complex structure have rather uniform structure
This is true only at the elemental level. Any particular connectionist model may have complex (nonuniform) structure. In fact, it seems to be the case that connectionist AI models will have to scale up the complexity of their representations before they will be able to handle any high-level cognitive behavior like goal-oriented activity, expectations, plans, etc.
>10. designed or programmed with learn from examples;
> rules specifying the system may self-organize
> behavior
Conventional computers can learn from examples just as well as connectionist systems. Conventional algorithms for inductive learning, which is basically what back propagation, recirculation, Boltzmann learning (tentatively), and competitive learning all do, have been under development for over 20 years. What connectionist models do provide is a uniform process model in which inductive learning becomes easier.
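Krulwich's point that inductive learning long predates the connectionist revival can be illustrated with a minimal sketch of learning from labeled examples; the classic perceptron rule below (the toy data, function names, and parameters are illustrative assumptions, not anything from the original message) learns logical AND from its four examples:

```python
# A minimal sketch of "conventional" inductive learning from examples:
# the perceptron rule. All names and data here are illustrative.

def perceptron_train(examples, epochs=20):
    """Learn weights w and bias b from (inputs, label) pairs, label in {0, 1}."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                      # -1, 0, or +1
            w = [wi + err * xi for wi, xi in zip(w, x)]  # nudge toward the target
            b += err
    return w, b

# Learn logical AND from its four labeled examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
         for x, _ in data]
print(preds)  # -> [0, 0, 0, 1]
```

Backprop generalizes this idea to multi-layer networks; the dependence on representation that the message discusses is already visible here, since a perceptron can only learn what is linearly separable in the chosen input encoding.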
Most connectionist learning algorithms so far (most notably backprop) suffer many of the limitations of inductive learning. Among them: on the one hand, how well the system can learn depends heavily on the representation chosen; on the other hand, if the system is given a complex representation, then it hasn't truly learned the domain but has simply learned to use the domain knowledge it started out with. In summary, I think that it's important to distinguish between the advances in neural network systems and the advances in connectionist AI. The two of them have to be discussed along different metrics and with different past and future goals. Bruce Krulwich ------- From netlist at psych.Stanford.EDU Tue May 3 17:48:25 1988 From: netlist at psych.Stanford.EDU (Mark Gluck) Date: Tue, 3 May 88 14:48:25 PDT Subject: Stanford Adaptive Networks Colloquium Message-ID: Stanford University Interdisciplinary Colloquium Series: ADAPTIVE NETWORKS AND THEIR APPLICATIONS ************************************************************************** May 5 (Thursday, 3:15pm): STUART KAUFMAN "Evolutionary Adaptation in Distributed Networks" Dept. of Biochemistry & Biophysics School of Medicine Univ. of Pennsylvania Philadelphia, PA 19174 ************************************************************************** * * * Location: Room 380-380C, which can be reached through the lower-level courtyard between the Psychology and Mathematical Sciences buildings. Information: To be placed on an electronic mail distribution list for information about these and other adaptive network events in the Stanford area, send email to netlist at psych.stanford.edu. For additional information, contact Mark Gluck, Bldg.
420-316; (415) 725-2434 or email to gluck at psych.stanford.edu From kawahara at nuesun.ntt.jp Wed May 4 23:27:11 1988 From: kawahara at nuesun.ntt.jp (Hideki Kawahara) Date: Wed, 4 May 88 23:27:11 JST Subject: Neural Network research in Japan, second report Message-ID: <8805041427.AA18808@MECL.NTT.jp> Dear Colleagues: This is the second report on Neural Network research in Japan. The following list is an excerpt from the programs presented at the EIC technical meeting this spring. This meeting has been part of an interdisciplinary meeting on neural information processing for about 20 years. I'm afraid this information is already outdated, but I think there is still some valuable information in it. Several of the authors will attend the ICNN, INNS or NIPS meetings, so you may have a chance to discuss their work with them in depth. -- kawahara%nttlab.ntt.JP at RELAY.CS.NET (from ARPA) ----------------------------------------------------------- Meeting title: ME and Bio Cybernetics Technical Meeting 26/March/1988 Tamagawa University, Tokyo, JAPAN Sponsor: EIC (The Institute of Electronics, Information and Communication Engineers) Co-sponsor: ME Society, IEEE Tokyo chapter, etc. The report may be available by ordering from EIC. Ordering information: Report title: IEICE Technical Report, Vol.87 No.428, 26/March/1988 issue Address: The Institute of Electronics, Information and Communication Engineers of Japan Kikai-Shinko-Kaikan Bldg., 5-8, Shibakoen 3 chome, Minato-ku, Tokyo 105 JAPAN (When I bought this, it was 4000 Yen (about $30) without postage.)
---------------------------------------------------------
%A Takeshi Sato %A Yukifumi Shigematsu %A Harehiko Nomura %A Masahiro Yamada %A Masahisa Saburi %A Terunori Mori %T Exciting/Inhibiting Effects of Supersonic to Neurons %R MBE87-109, EIC %O Electrotechnical Laboratory
%A Tomoyuki Okawa %A Hiroshi Yagi %T An Identification in Neural Network with Stridulation of Cricket %R MBE87-110, EIC %O Faculty of Engineering, Toyama University
%A Mitsuyuki Nakao %T Multi-Receptor Model of Taste Sensory System of Insect and Its Capacity %R MBE87-114, EIC %O Education Center for Information Processing, Tohoku University
%A K. Aihara %A T. Takabe %A J. Takahashi %A M. Toyoda %A M. Kotani %A G. Matsumoto %T Chaotic Neural Networks I. Theoretical Model %R MBE87-115, EIC %O 1st to 5th author: Tokyo Denki Univ., last author: Electrotechnical Lab.
%A Kazuhiko Shimizu %A Kazuyuki Aihara %A Makoto Kotani %T Chaotic Neural Networks II. Hardware Implementation %R MBE87-116, EIC %O Tokyo Denki University
%A Takeshi Aihara %A Nobuyasu Sakata %A Minoru Tsukada %T Temporal Pattern Discrimination in Mesencephalic Periaqueductal Gray of Rat and in Real-time Neuron Model %R MBE87-117, EIC %O Faculty of Engineering, Tamagawa University
%A Naotoshi Sugano %T Discharge Patterns of a Motoneuron Model Elicited by a Sinusoidal Driving Function %R MBE87-118, EIC %O Department of Electronic Engineering, Tamagawa University
%A Syozo Yasui %A Masahiro Yamada %T Unusual Synaptic Mechanisms in the Retina %R MBE87-119, EIC %O First author: National Institute for Basic Biology, Second author: Electrotechnical Lab.
%A Makoto Mizuno %A Susumu Iwasawa %A Minoru Tsukada %T Analysis of Spatial Non-linearity of Retinal Ganglion Cell in Cat %R MBE87-120, EIC %O Faculty of Engineering, Tamagawa University
%A Itsuo Kumazawa %A Hidemitsu Ogawa %T Constraints Satisfaction Searches using Attractors with Long Hamming Distance from Each Other and Automatic Temperature Control %R MBE87-121, EIC %O Department of Computer Science, Tokyo Institute of Technology
%A Hiroaki Hara %T Stochastic Model of Serial Position Effect %R MBE87-122, EIC %O Department of Science Engineering, Tohoku University
%A Yoko Yamaguchi %A Hiroshi Shimizu %T Encoding of Information of Figures by Coupled Oscillators with Negative Interactions and Its Application to Holovision %R MBE87-123, EIC %O Faculty of Pharmaceutical Science, University of Tokyo
%A Okihide Hikosaka %T Neural Mechanisms of Saccadic Eye Movement %R MBE87-130, EIC %O Department of Physiology, School of Medicine, Toho University
%A Bilin Zhang %A Yoji Uno %A Mitsuo Kawato %A Ryoji Suzuki %T Trajectory in Redundant Human Arm Movement with Three Joints %R MBE87-132, EIC %O Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University
%A Yoshiharu Maeda %A Mitsuo Kawato %A Yoji Uno %A Ryoji Suzuki %T Multi-layer Neural Network Model which Learns and Generates Human Multi-Joint Arm Trajectory %R MBE87-133, EIC %O Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University
%A Michiaki Isobe %A Mitsuo Kawato %A Ryoji Suzuki %T Iterative Learning Control of Industrial Manipulator in Joint-angular and Visual Coordinates %R MBE87-134, EIC %O First author: Faculty of Engineering, Tokyo University; second and third author: Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University
%A Tohru Setoyama %A Mitsuo Kawato %A Ryoji Suzuki %T Manipulator Control by Inverse-dynamics Model Learned in Multi-layer Neural Network %R MBE87-135, EIC %O Department of Biophysical Engineering,
Faculty of Engineering Science, Osaka University
%A Masahiko Fujita %T A Proposed System of Inhibition in the Output Layers of the Superior Colliculus %R MBE87-135, EIC %O Nagasaki Institute of Applied Science
%A Masamichi Nakagawa %A Shunsuke Sato %T Feature Extraction by means of Hough Transform %R MBE87-139, EIC %O Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University
%A Shuichi Kurogi %T Abilities and Limitation of a Neural Network Model for Spoken Word Recognition %R MBE87-140, EIC %O Kyushu Institute of Technology
%A Kenji Doya %A Shuji Yoshizawa %T Motor Pattern Memory in Neural Networks %R MBE87-141, EIC %O University of Tokyo
%A Ryoko Futami %A Mitsuo Hashiba %A Nozomu Hoshimiya %T A Neural Sequence Identification Network (ANSIN) Model %R MBE87-142, EIC %O Research Institute of Applied Electricity, Hokkaido University
%A Teruhiko Ohtomo %A Mikio Ishitani %A Ken-ichi Hara %T Learning and Feature Extraction of Perceptual Vowel Distribution using a Neural Net Model %R MBE87-143, EIC %O Faculty of Engineering, Yamagata University
%A T. Ohmori %A K. Satou %A K. Koyama %T Representation and Operation Model of the 3-Dimensional Object by the Neuron Network %R MBE87-145, EIC %O Tokyo University of Agriculture and Technology
%A Hideaki Honda %A Noboru Ohnishi %A Noboru Sugie %T Understanding of the Shape of 3-D Objects using the Connectionist Models %R MBE87-146 %O Faculty of Engineering, Nagoya University
%A Satoshi Omata %A Yoko Yamaguchi %A Hiroshi Shimizu %T A Holonic Model of Visual Motion Perception %R MBE87-147 %O First author: Canon Research Center, CANON INC.
Second and third author: University of Tokyo
%A Masafumi Yano %A Shigeo Suzuki %A Hiroshi Shimizu %T The Model of Color Recognition %R MBE87-148 %O Faculty of Pharmaceutical Science, University of Tokyo
%A Masahide Nomura %A Gen Matsumoto %T A Model of Binocular Response of Simple Cell in Striate Cortex %R MBE87-149, EIC %O First author: NEC Corporation Fundamental Research Labs. Second author: Electrotechnical Labs.
%A Yasuyuki Tsukui %A Yuzo Hirai %T A Model of Visual Information Processing with Hierarchical and Distributed Network Structure %R MBE87-150, EIC %O University of Tsukuba
%A Takashi Nagano %A Hiroshi Rakutani %T Neuroscientific Considerations on the Training of Three-layered Neural Net Models %R MBE87-151, EIC %O College of Engineering, Hosei University
%A Nobuhiko Ikeda %A Toyoshi Torioka %T An Associative Memory based on Self-organizing System of Recognition Cells %R MBE87-152, EIC %O First author: Tokuyama Technical College, Second author: Yamaguchi University
%A Shigeru Tanaka %T Theory of Ocular Dominance Column Formation %R MBE87-153, EIC %O Fundamental Research Laboratories, NEC Corporation
%A Koji Kurata %T Self-organization of Topographic Mapping in Boltzmann Machines %R MBE87-154, EIC %O Faculty of Engineering, University of Tokyo
%A Masamichi Fukaya %A Manabu Kitagawa %A Yoichi Okabe %T Neural Network Model Learning by Interaction with Environment: Quantized Representation of Inner State %R MBE87-155, EIC %O Department of Electronic Engineering, University of Tokyo
%A Yoshihiro Mori %A Kazuhiko Yokosawa %A Michio Umeda %T Hand-writing KANJI Character Recognition by a PDP Model %R MBE87-156, EIC %O ATR Auditory and Visual Perception Research Laboratories
%A Katsuhiro Kamada %A Yuzo Hirai %T Digital Neural Model %R MBE87-157, EIC %O University of Tsukuba
%A Akiyuki Anzai %A Yoshimi Kamiyama %A Shiro Usui %A Sei Miyake %T Neural Circuits Simulator: NCS %R MBE87-158, EIC %O First to third author: Toyohashi University of Technology, Department
of Information & Computer Sciences Last author: ATR Auditory and Visual Perception Research Laboratories
%A Mitsuo Takeda %T Optical Neurocomputing: A Review %R MBE87-159, EIC %O University of Electro-Communications
%A Keisuke Toyama %T Self-organization of Visual Cortical Circuitry %R MBE87-160, EIC %O Kyoto Prefectural School of Medicine
-------------------------------------------------------- Hideki Kawahara NTT Basic Research Laboratories kawahara%nttlab.ntt.JP at RELAY.CS.NET (from ARPA site) From SINGHS at KIRK.ASTON.AC.UK Wed May 4 17:42:22 1988 From: SINGHS at KIRK.ASTON.AC.UK (SINGHS@KIRK.ASTON.AC.UK) Date: 4-MAY-1988 21:42:22 GMT Subject: No subject Message-ID: Dear Sir, Please add me to your mailing list. Thanks. S.Singh From kawahara at speech-apollo.ntt.jp Thu May 5 13:23:38 1988 From: kawahara at speech-apollo.ntt.jp (Hideki Kawahara) Date: Thu, 5 May 88 13:23:38 jst Subject: Neural Network research in Japan, second report Message-ID: <8805050423.AA00492@speech-apollo.NTT.jp> >Report may be available through ordering to EIC. I forgot to point out one thing: the reports are written in Japanese. kawahara%nttlab.ntt.JP at RELAY.CS.NET (from ARPA site) From ELECTRO at atc.bendix.com Thu May 5 08:41:00 1988 From: ELECTRO at atc.bendix.com (Richard A. Burne) Date: Thu, 5 May 88 08:41 EDT Subject: Graduate Student Summer Employment Message-ID: For those who have students looking for a summer research job, or for those who are students in pursuit of summer employment, the following will be of interest.
Richard Burne Electro at ATC.Bendix.Com Allied-Signal Aerospace Technology Center Graduate Student Summer Employment Three-month Summer Research Position in Columbia, MD, for a graduate student interested in neural network methods for signal classification. Under the direction of a staff scientist, the student's duties would include setting up and testing a commercial PC-based neural-network system and performing computer-generated signal classification experiments. Successful applicant needs an awareness of neural-network concepts, especially feed-forward networks with back-propagation learning, and familiarity with linear system theory, signal processing and basic decision theory. Experience with IBM PC/AT hardware and software preferred. The Allied-Signal Aerospace Technology Center is located in modern facilities in Columbia, MD, (between Baltimore and Washington, D.C.) and employs a technical staff of 55, 27 of whom are advanced degree holders. Current research activities include neural networks, fault-tolerant computer architectures, and advanced microelectronics. Interested candidates should contact Dr. Robert Simpson at (301)964-4147, or write to him at the following address: Dr. Robert Simpson Allied-Signal Aerospace Technology Center 9140 Old Annapolis Road/MD 108 Columbia, MD 21045 We would prefer to fill this position beginning June 1, 1988, so a prompt response to this announcement will be to your advantage. From mike%bucasb.bu.edu at bu-it.BU.EDU Tue May 10 10:50:50 1988 From: mike%bucasb.bu.edu at bu-it.BU.EDU (Michael Cohen) Date: Tue, 10 May 88 10:50:50 EDT Subject: Lectures on Neural Networks Message-ID: <8805101450.AA27989@bucasb.bu.edu> Stephen Grossberg of Boston University will deliver two lectures at the U. Mass Worcester Medical School, 55 Lake Avenue North, Worcester, MA 01655, right off Route 9. 
Wednesday, May 11, 1988, 4:00PM: "Neural networks for learning and memory of perceptual and cognitive recognition codes" and, in the Southeast Conference Room, Thursday, May 12, 1988, 12 noon: "Neural dynamics of distributed decision making and short-term memory: Contrast-enhancement, automatic gain control, and reverberation in shunting on-center off-surround networks" Call 1-856-4147 for further information. ----------------------------------------------------------------------------- -- Michael Cohen ---- Center for Adaptive Systems Boston University (617-353-7857) Email: mike at bucasb.bu.edu Smail: Michael Cohen Center for Adaptive Systems Department of Mathematics, Boston University 111 Cummington Street Boston, Mass 02215 From barto at anger.tcp.cs.umass.edu Thu May 12 09:31:51 1988 From: barto at anger.tcp.cs.umass.edu (barto@anger.tcp.cs.umass.edu) Date: Thu, 12 May 88 09:31:51 EDT Subject: Technical Report Available Message-ID: <8805121331.AA10805@anger.ANW.edu> ACTION Michael I. Jordan Massachusetts Institute of Technology David A. Rosenbaum University of Massachusetts, Amherst COINS Technical Report 88-26 ABSTRACT We discuss three fundamental problems for theories of the organization of action--the degrees-of-freedom problem, the serial order problem, and the problem of sensorimotor learning. Our emphasis is on the interrelated nature of these problems. Several recent models and algorithms which address various aspects of these problems are described and evaluated. To appear in M. I. Posner (Ed.), Handbook of Cognitive Science, Cambridge, MA: MIT Press. To obtain a copy, contact Ms.
Connie Smith Computer and Information Science Graduate Research Center University of Massachusetts Amherst, MA 01003 smith at cs.umass.edu From netlist at psych.Stanford.EDU Sun May 15 23:20:57 1988 From: netlist at psych.Stanford.EDU (Mark Gluck) Date: Sun, 15 May 88 20:20:57 PDT Subject: Stanford Adaptive Networks Colloquium Message-ID: Stanford University Interdisciplinary Colloquium Series: ADAPTIVE NETWORKS AND THEIR APPLICATIONS May 17 (Tuesday, 3:15pm): STEPHEN J. HANSON "Some Comments and Variations on Backpropagation" Bell Communications Research 435 South St. Morristown, NJ 07960 * * * ============ NOTE: room change this week ===================================== Location: Room 040 in the lower level of the Psychology Building, Jordan Hall (Bldg. 420). ============================================================================== Information: To be placed on an electronic mail distribution list for information about these and other adaptive network events in the Stanford area, send email to netlist at psych.stanford.edu. For additional information, contact Mark Gluck, Bldg. 420-316; (415) 725-2434 or email to gluck at psych.stanford.edu From harnad at Princeton.EDU Mon May 16 01:25:26 1988 From: harnad at Princeton.EDU (Stevan Harnad) Date: Mon, 16 May 88 01:25:26 edt Subject: Call for qualified referees Message-ID: <8805160525.AA08748@mind.Princeton.EDU> BBS (Behavioral & Brain Sciences), published by Cambridge University Press, is an international, interdisciplinary journal devoted exclusively to Open Peer Commentary on important and controversial current target articles in the biobehavioral and cognitive sciences. Because of the growing volume of connectionist and connectionism-related submissions now being received at BBS, we are looking for more referees who are qualified and willing to evaluate submitted manuscripts.
If you are professionally qualified in connectionism, parallel distributed processing, associative networks, neural modeling etc., and wish to serve as a referee for BBS, please send your CV to the email or USmail address below. Individuals who are already BBS Associates need only specify that this is a specialty area that they wish to review in. Stevan Harnad ARPANET: harnad at mind.princeton.edu or harnad%princeton.mind.edu at princeton.edu CSNET: harnad%mind.princeton.edu at relay.cs.net BITNET: harnad%mind.princeton.edu at pucc.bitnet UUCP: princeton!mind!harnad PHONE: 609 921 7771 USMAIL: BBS, 20 Nassau Street, Rm. 240, Princeton NJ 08542 From esj at cs.brown.edu Wed May 18 14:28:26 1988 From: esj at cs.brown.edu (Eugene Santos Jr.) Date: Wed, 18 May 88 14:28:26 EDT Subject: Dynamic Neural Net Architectures Message-ID: I'm looking for references to work on neural networks capable of changing their architectures by adding/removing nodes. Thanks! ------------------------------------------------------------------------------- uucp: ...!{ihnp4,decvax}!brunix!esj Eugene Santos Jr. bitnet: esj at browncs.bitnet Brown University arpa: esj%cs.brown.edu at relay.cs.net csnet: esj at cs.brown.edu us mail:Eugene Santos Jr. Box 1910 Department of Computer Science Brown University Providence, RI 02912 ------------------------------------------------------------------------------- From netlist at psych.Stanford.EDU Thu May 19 09:40:50 1988 From: netlist at psych.Stanford.EDU (Mark Gluck) Date: Thu, 19 May 88 06:40:50 PDT Subject: Stanford Adaptive Networks Colloquium Message-ID: Stanford University Interdisciplinary Colloquium Series: ADAPTIVE NETWORKS AND THEIR APPLICATIONS ************************************************************************** May 20 (Friday, 2:15pm): STEVEN GALLANT "Sequential Associative Memories" College of Computer Science Northeastern Univ. 
Boston, MA 02115 ************************************************************************** ABSTRACT Humans are very good at manipulating sequential information, but sequences present special problems for connectionist models. As an approach to sequential problems we have examined totally connected subnetworks of cells called Sequential Associative Memories (SAM's). The coefficients for SAM cells are unmodifiable and are generated at random. A subnetwork of SAM cells performs two tasks: 1. Their activations determine a state for the network that permits previous inputs and outputs to be recalled, and 2. They increase the dimensionality of input and output representations to make it possible for other (modifiable) cells in the network to learn difficult tasks. The second function is similar to the Distributed Method, a way of generating intermediate cells for non-sequential problems. Results from several experiments are presented. The first is a robotic control task that required a network to produce one of several sequences of outputs when input cells were set to a corresponding `plan number'. The second experiment was to learn a sequential version of the parity function that would generalize to arbitrarily long input strings. Finally we attempted to teach a network how to add arbitrarily long pairs of binary numbers. Here we were successful if the network contained a cell dedicated to the notion of `carry'; otherwise the network performed at less than 100% for unseen sequences longer than those used during training. Each of these tasks required a representation of state, and hence a network with feedback. All were learned using subnetworks of SAM cells. * * * Location: Room 380-380C, which can be reached through the lower level courtyard between the Psychology and Mathematical Sciences buildings.
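The two tasks the abstract assigns to SAM cells — fixed random coefficients that both carry a state summarizing past inputs and expand the input into a higher-dimensional code for modifiable cells to learn from — can be sketched roughly as follows. All sizes, weight ranges, and names here are illustrative assumptions, not Gallant's actual implementation:

```python
# A hedged sketch of the SAM idea: an unmodifiable, randomly generated
# subnetwork whose threshold units (1) retain a trace of the input
# history and (2) expand inputs into a higher-dimensional code.
import random

random.seed(0)                 # "generated at random", then frozen forever
N_IN, N_SAM = 2, 12            # illustrative sizes

# Fixed (unmodifiable) SAM coefficients: input and recurrent weights.
W_in  = [[random.uniform(-1.0, 1.0) for _ in range(N_IN)] for _ in range(N_SAM)]
W_rec = [[random.uniform(-0.3, 0.3) for _ in range(N_SAM)] for _ in range(N_SAM)]

def sam_step(state, x):
    """One update of the SAM subnetwork: each cell thresholds its net input
    from the current external input and the previous state."""
    return [1.0 if (sum(W_in[i][j] * x[j] for j in range(N_IN)) +
                    sum(W_rec[i][k] * state[k] for k in range(N_SAM))) > 0.0
            else 0.0
            for i in range(N_SAM)]

def encode(seq):
    """Run an input sequence through the SAM cells; the final state is the
    expanded, history-dependent code a modifiable readout would learn from."""
    state = [0.0] * N_SAM
    for x in seq:
        state = sam_step(state, x)
    return state

code = encode([(1, 0), (0, 1), (1, 1)])
print(len(code), set(code) <= {0.0, 1.0})  # prints: 12 True
```

A separate layer of modifiable cells trained on these codes would play the role of the learning cells in the network; the SAM coefficients themselves are never updated.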
Information: To be placed on an electronic mail distribution list for information about these and other adaptive network events in the Stanford area, send email to netlist at psych.stanford.edu. For additional information, contact Mark Gluck, Bldg. 420-316; (415) 725-2434 or email to gluck at psych.stanford.edu From breen at silver.bacs.indiana.edu Fri May 20 20:37:24 1988 From: breen at silver.bacs.indiana.edu (elise breen) Date: Fri, 20 May 88 19:37:24 est Subject: e-mailing list Message-ID: Please put me on the connectionist mailing list. My e-mail address is breen at silver.bacs.indiana.edu Thank you, Elise M. Breen From UTANS%GINDI at Venus.YCC.Yale.Edu Tue May 24 12:45:00 1988 From: UTANS%GINDI at Venus.YCC.Yale.Edu (UTANS%GINDI@Venus.YCC.Yale.Edu) Date: Tue, 24 May 88 11:45 EST Subject: No subject Message-ID: help From harnad at Princeton.EDU Wed May 25 00:52:09 1988 From: harnad at Princeton.EDU (Stevan Harnad) Date: Wed, 25 May 88 00:52:09 edt Subject: Language Learnability: BBS Call for Commentators Message-ID: <8805250455.AA22675@Princeton.EDU> Below is the abstract of a forthcoming target article to appear in Behavioral and Brain Sciences (BBS), an international journal of "open peer commentary" in the biobehavioral and cognitive sciences, published by Cambridge University Press. For information on how to serve as a commentator or to nominate qualified professionals in these fields as commentators, please send email to: harnad at mind.princeton.edu or write to: BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771] ----------------------------------------------------------------------- [Chomsky, heredity/environment, poverty-of-the-stimulus, development] The Child's Trigger Experience: "Degree-0" Learnability David Lightfoot Linguistics Department University of Maryland A selective model of human language capacities holds that people come to know more than they experience. 
The discrepancy between experience and eventual capacity is bridged by genetically provided information. Hence any hypothesis about the linguistic genotype (or "Universal Grammar," UG) has consequences for what experience is needed and what form people's mature capacities (or "grammars") will take. This BBS target article discusses the "trigger experience," i.e., the experience that actually affects a child's linguistic development. It is argued that this must be a subset of a child's total linguistic experience and hence that much of what a child hears has no consequence for the form of the eventual grammar. UG filters experience and provides an upper bound on what constitutes the triggering experience. This filtering effect can often be seen in the way linguistic capacity can change between generations. Children only need access to robust structures of minimal ("degree-0") complexity. Everything can be learned from simple, unembedded "domains" (a grammatical concept involved in defining an expression's logical form). Children do not need access to more complex structures. From uttal at humu.nosc.mil Wed May 25 16:05:13 1988 From: uttal at humu.nosc.mil (William L. Uttal) Date: Wed, 25 May 88 16:05:13 HST Subject: Language Learnability: BBS Call for Commentators Message-ID: <8805260205.AA02056@humu.nosc.mil> Steve -- That is not my cup of tea but Ron Johnson at the U of Hawaii dept of Psych would be perfect. Aloha, Bill Uttal From ELECTRO at atc.bendix.com Tue May 24 10:16:00 1988 From: ELECTRO at atc.bendix.com (Richard A. Burne) Date: Tue, 24 May 88 10:16 EDT Subject: Research Scientist Position Message-ID: We have an opening at our research center for a research scientist with knowledge of learning algorithms for neural networks. The following provides additional information.
Richard Burne electro at ATC.BENDIX.COM Aerospace Technology Center Allied-Signal Corporation Research Scientist Position The AEROSPACE TECHNOLOGY CENTER, the research and development laboratory for the Aerospace Company of the Allied-Signal Corporation, is seeking an individual to join an ongoing research project investigating neural networks. The primary technical responsibility of the individual will be researching and developing novel learning algorithms for dynamic neural network architectures. Educational background should include a Ph.D. in Applied Math, Electrical Engineering, Physics or a related scientific discipline -- with an emphasis on learning algorithms. Knowledge of signal processing in biological systems would be advantageous. The AEROSPACE TECHNOLOGY CENTER is located in Columbia, Maryland. Our location provides cultural, recreational and educational opportunities to satisfy every taste. A half-hour trip to the North puts you in the heart of the Baltimore renaissance. The same distance to the South opens up the metropolitan and historical pleasures of Washington, D.C. For prompt consideration, please direct your resume to: Mr. John Flato Aerospace Technology Center 9140 Old Annapolis Road Columbia, MD 21045 The Aerospace Technology Center is an equal opportunity employer. From JHLU at sdr.slb.com Thu May 26 21:43:00 1988 From: JHLU at sdr.slb.com (JHLU@sdr.slb.com) Date: Thu, 26 May 88 21:43 EDT Subject: Asking for Information Message-ID: I am Jiang-hong Lu, a sophomore at MIT. I am doing a summer project on neural computing and want to join this committee. Please send me e-mail if you get this letter. Thanks a lot. Jiang-hong Lu From 8414902 at UWACDC.ACS.WASHINGTON.EDU Tue May 31 20:06:00 1988 From: 8414902 at UWACDC.ACS.WASHINGTON.EDU (TERU) Date: Tue, 31 May 1988 17:06 PDT Subject: HIMOSTHYLEDYNE - MACH Message-ID: That is, hierarchical, modular, stochastic, hybrid, learnable, dynamic nets -- maybe chaotic.
(I could replace MACH with PROCH standing for probably chaotic or pro-chaotic.) Well, I am trying to indicate by the adjectives the significant properties we want the ultimate nets to have. Other obvious candidates are self-organizing, self-reproducing, and genetic --- SOSERGE. Of course parallel processing, so add OPA! Add more or remove some? Take it lightly as a friendly noise since it has been quite quiet here for a while. -- Teru Homma From krulwich-bruce at YALE.ARPA Mon May 2 15:28:31 1988 From: krulwich-bruce at YALE.ARPA (Bruce Krulwich) Date: Mon, 2 May 88 15:28:31 EDT Subject: Novelty of Neural Net Approach Message-ID: <8805021928.AA08835@SUNED.ZOO.CS.YALE.EDU> I think that Teru Homma's recent message contrasting connectionist and conventional computer systems shows the importance of distinguishing between the hardware characteristics of a system and characteristics of its reasoning. The lack of such a distinction has resulted in several recent articles in the media claiming that connectionism has solved all the problems of AI, which I don't think is quite true yet. Examples of such a distinction follow: > Conventional Systems Neural Net (Connectionist) Systems >8. have complex structure have rather uniform structure This is true only at the elemental level. Any particular connectionist model may have complex (nonuniform) structure. In fact, it seems to be the case that connectionist AI models will have to scale up the complexity of their representations before they will be able to handle any high level cognitive behavior like goal-oriented activity, expectations, plans, etc. >10. designed or programmed with learn from examples; > rules specifying the system may self-organize > behavior Conventional computers can learn from examples just as well as connectionist systems. 
Conventional algorithms for inductive learning, which is basically what back propogation, recirculation, Boltzmann learning (tentatively), and competative learning all do, have been under development for over 20 years. What connectionist models do provide is a uniform process model in which inductive learning becomes easier. Most connectionist learning algorithms so far (most notably backprop) suffer many of the limitations that inductive learning does, among them that on the one hand how well the system can learn depends heavily on the representation chosen, but on the other hand if the system is given a complex representation then it hasn't truly learned the domain but simply learned to use the domain knowledge it started out with. In summary I think that it's important to distinguish between the advances in neural network systems and the advances in connectionist AI. The two of them have to discussed along different metrics and with different past and future goals. Bruce Krulwich ------- From netlist at psych.Stanford.EDU Tue May 3 17:48:25 1988 From: netlist at psych.Stanford.EDU (Mark Gluck) Date: Tue, 3 May 88 14:48:25 PDT Subject: Stanford Adaptive Networks Colloquium Message-ID: Stanford University Interdisciplinary Colloquium Series: ADAPTIVE NETWORKS AND THEIR APPLICATIONS ************************************************************************** May 5 (Thursday, 3:15pm): STUART KAUFMAN "Evolutionary Adaptation in Distributed Networks" Dept. of Biochemistry & Biophysics School of Medicine Univ. of Pennsylvania Philadelphia, PA 19174 ************************************************************************** * * * Location: Room 380-380C which can be reached through the lower level courtyard between the Psychology and Mathematical Sciences buildings. Information: To be placed on an electronic mail distribution list for information about these and other adaptive network events in the Stanford area, send email to netlist at psych.stanford.edu. 
For additional information, contact Mark Gluck, Bldg. 420-316; (415) 725-2434 or email to gluck at psych.stanford.edu

From kawahara at nuesun.ntt.jp Wed May 4 23:27:11 1988
From: kawahara at nuesun.ntt.jp (Hideki Kawahara)
Date: Wed, 4 May 88 23:27:11 JST
Subject: Neural Network research in Japan, second report
Message-ID: <8805041427.AA18808@MECL.NTT.jp>

Dear Colleagues:

This is the second report on neural network research in Japan. The following list is an excerpt from the program presented at the EIC technical meeting this spring. This meeting has been part of an interdisciplinary meeting on neural information processing for about 20 years. I'm afraid this information is already outdated, but I think some valuable information is still left. Several of the authors will attend the ICNN, INNS or NIPS meetings, so you may have a chance to discuss their work with them in depth.

-- kawahara%nttlab.ntt.JP at RELAY.CS.NET (from ARPA)

-----------------------------------------------------------
Meeting title: ME and Bio Cybernetics Technical Meeting
26/March/1988, Tamagawa University, Tokyo, JAPAN
Sponsor: EIC (The Institute of Electronics, Information and Communication Engineers)
Co-sponsor: ME Society, IEEE Tokyo chapter, etc.

Report may be available through ordering to EIC.
Ordering information:
Report title: IEICE Technical Report, Vol.87 No.428, 26/March/1988 issue
Address: The Institute of Electronics, Information and Communication Engineers of Japan, Kikai-Shinko-Kaikan Bldg., 5-8, Shibakoen 3 chome, Minato-ku, Tokyo 105 JAPAN
(When I bought this, it was 4000 Yen (about $30) without postage.)
---------------------------------------------------------
%A Takeshi Sato
%A Yukifumi Shigematsu
%A Harehiko Nomura
%A Masahiro Yamada
%A Masahisa Saburi
%A Terunori Mori
%T Exciting/Inhibiting Effects of Supersonic to Neurons
%R MBE87-109, EIC
%O Electrotechnical Laboratory

%A Tomoyuki Okawa
%A Hiroshi Yagi
%T An Identification in Neural Network with Stridulation of Cricket
%R MBE87-110, EIC
%O Faculty of Engineering, Toyama University

%A Mitsuyuki Nakao
%T Multi-Receptor Model of Taste Sensory System of Insect and Its Capacity
%R MBE87-114, EIC
%O Education Center for Information Processing, Tohoku University

%A K. Aihara
%A T. Takabe
%A J. Takahashi
%A M. Toyoda
%A M. Kotani
%A G. Matsumoto
%T Chaotic Neural Networks I. Theoretical Model
%R MBE87-115, EIC
%O 1st to 5th author: Tokyo Denki Univ., last author: Electrotechnical Lab.

%A Kazuhiko Shimizu
%A Kazuyuki Aihara
%A Makoto Kotani
%T Chaotic Neural Networks II. Hardware Implementation
%R MBE87-116, EIC
%O Tokyo Denki University

%A Takeshi Aihara
%A Nobuyasu Sakata
%A Minoru Tsukada
%T Temporal Pattern Discrimination in Mesencephalic Periaqueductal Gray of Rat and in Real-time Neuron Model
%R MBE87-117, EIC
%O Faculty of Engineering, Tamagawa University

%A Naotoshi Sugano
%T Discharge Patterns of a Motoneuron Model Elicited by a Sinusoidal Driving Function
%R MBE87-118, EIC
%O Department of Electronic Engineering, Tamagawa University

%A Syozo Yasui
%A Masahiro Yamada
%T Unusual Synaptic Mechanisms in the Retina
%R MBE87-119, EIC
%O First author: National Institute for Basic Biology, Second author: Electrotechnical Lab.
%A Makoto Mizuno
%A Susumu Iwasawa
%A Minoru Tsukada
%T Analysis of Spatial Non-linearity of Retinal Ganglion Cell in Cat
%R MBE87-120, EIC
%O Faculty of Engineering, Tamagawa University

%A Itsuo Kumazawa
%A Hidemitsu Ogawa
%T Constraint Satisfaction Searches using Attractors with Long Hamming Distance from Each Other and Automatic Temperature Control
%R MBE87-121, EIC
%O Department of Computer Science, Tokyo Institute of Technology

%A Hiroaki Hara
%T Stochastic Model of Serial Position Effect
%R MBE87-122, EIC
%O Department of Science Engineering, Tohoku University

%A Yoko Yamaguchi
%A Hiroshi Shimizu
%T Encoding of Information of Figures by Coupled Oscillators with Negative Interactions and Its Application to Holovision
%R MBE87-123, EIC
%O Faculty of Pharmaceutical Science, University of Tokyo

%A Okihide Hikosaka
%T Neural Mechanisms of Saccadic Eye Movement
%R MBE87-130, EIC
%O Department of Physiology, School of Medicine, Toho University

%A Bilin Zhang
%A Yoji Uno
%A Mitsuo Kawato
%A Ryoji Suzuki
%T Trajectory in Redundant Human Arm Movement with Three Joints
%R MBE87-132, EIC
%O Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University

%A Yoshiharu Maeda
%A Mitsuo Kawato
%A Yoji Uno
%A Ryoji Suzuki
%T Multi-layer Neural Network Model which Learns and Generates Human Multi-Joint Arm Trajectory
%R MBE87-133, EIC
%O Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University

%A Michiaki Isobe
%A Mitsuo Kawato
%A Ryoji Suzuki
%T Iterative Learning Control of Industrial Manipulator in Joint-angular and Visual Coordinates
%R MBE87-134, EIC
%O First author: Faculty of Engineering, Tokyo University; Second and third author: Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University

%A Tohru Setoyama
%A Mitsuo Kawato
%A Ryoji Suzuki
%T Manipulator Control by Inverse-dynamics Model Learned in Multi-layer Neural Network
%R MBE87-135, EIC
%O Department of Biophysical Engineering,
Faculty of Engineering Science, Osaka University

%A Masahiko Fujita
%T A Proposed System of Inhibition in the Output Layers of the Superior Colliculus
%R MBE87-135, EIC
%O Nagasaki Institute of Applied Science

%A Masamichi Nakagawa
%A Shunsuke Sato
%T Feature Extraction by means of Hough Transform
%R MBE87-139, EIC
%O Department of Biophysical Engineering, Faculty of Engineering Science, Osaka University

%A Shuichi Kurogi
%T Abilities and Limitation of a Neural Network Model for Spoken Word Recognition
%R MBE87-140, EIC
%O Kyushu Institute of Technology

%A Kenji Doya
%A Shuji Yoshizawa
%T Motor Pattern Memory in Neural Networks
%R MBE87-141, EIC
%O University of Tokyo

%A Ryoko Futami
%A Mitsuo Hashiba
%A Nozomu Hoshimiya
%T A Neural Sequence Identification Network (ANSIN) Model
%R MBE87-142, EIC
%O Research Institute of Applied Electricity, Hokkaido University

%A Teruhiko Ohtomo
%A Mikio Ishitani
%A Ken-ichi Hara
%T Learning and Feature Extraction of Perceptual Vowel Distribution using a Neural Net Model
%R MBE87-143, EIC
%O Faculty of Engineering, Yamagata University

%A T. Ohmori
%A K. Satou
%A K. Koyama
%T Representation and Operation Model of the 3-Dimensional Object by the Neuron Network
%R MBE87-145, EIC
%O Tokyo University of Agriculture and Technology

%A Hideaki Honda
%A Noboru Ohnishi
%A Noboru Sugie
%T Understanding of the Shape of 3-D Objects using the Connectionist Models
%R MBE87-146, EIC
%O Faculty of Engineering, Nagoya University

%A Satoshi Omata
%A Yoko Yamaguchi
%A Hiroshi Shimizu
%T A Holonic Model of Visual Motion Perception
%R MBE87-147, EIC
%O First author: Canon Research Center, CANON INC.
Second and third author: University of Tokyo

%A Masafumi Yano
%A Shigeo Suzuki
%A Hiroshi Shimizu
%T The Model of Color Recognition
%R MBE87-148, EIC
%O Faculty of Pharmaceutical Science, University of Tokyo

%A Masahide Nomura
%A Gen Matsumoto
%T A Model of Binocular Response of Simple Cell in Striate Cortex
%R MBE87-149, EIC
%O First author: NEC Corporation Fundamental Research Labs., Second author: Electrotechnical Labs.

%A Yasuyuki Tsukui
%A Yuzo Hirai
%T A Model of Visual Information Processing with Hierarchical and Distributed Network Structure
%R MBE87-150, EIC
%O University of Tsukuba

%A Takashi Nagano
%A Hiroshi Rakutani
%T Neuroscientific Considerations on the Training of Three-layered Neural Net Models
%R MBE87-151, EIC
%O College of Engineering, Hosei University

%A Nobuhiko Ikeda
%A Toyoshi Torioka
%T An Associative Memory based on Self-organizing System of Recognition Cells
%R MBE87-152, EIC
%O First author: Tokuyama Technical College, Second author: Yamaguchi University

%A Shigeru Tanaka
%T Theory of Ocular Dominance Column Formation
%R MBE87-153, EIC
%O Fundamental Research Laboratories, NEC Corporation

%A Koji Kurata
%T Self-organization of Topographic Mapping in Boltzmann Machines
%R MBE87-154, EIC
%O Faculty of Engineering, University of Tokyo

%A Masamichi Fuakaya
%A Manabu Kitagawa
%A Yoichi Okabe
%T Neural Network Model Learning by Interaction with Environment: Quantized Representation of Inner State
%R MBE87-155, EIC
%O Department of Electronic Engineering, University of Tokyo

%A Yoshihiro Mori
%A Kazuhiko Yokosawa
%A Michio Umeda
%T Hand-writing KANJI Character Recognition by a PDP Model
%R MBE87-156, EIC
%O ATR Auditory and Visual Perception Research Laboratories

%A Katsuhiro Kamada
%A Yuzo Hirai
%T Digital Neural Model
%R MBE87-157, EIC
%O University of Tsukuba

%A Akiyuki Anzai
%A Yoshimi Kamiyama
%A Shiro Usui
%A Sei Miyake
%T Neural Circuits Simulator: NCS
%R MBE87-158, EIC
%O First to third author: Toyohashi University of Technology, Department
of Information & Computer Sciences; Last author: ATR Auditory and Visual Perception Research Laboratories

%A Mitsuo Takeda
%T Optical Neurocomputing: A Review
%R MBE87-159, EIC
%O University of Electro-Communications

%A Keisuke Toyama
%T Self-organization of Visual Cortical Circuitry
%R MBE87-160, EIC
%O Kyoto Prefectural School of Medicine

--------------------------------------------------------
Hideki Kawahara
NTT Basic Research Laboratories
kawahara%nttlab.ntt.JP at RELAY.CS.NET (from ARPA site)

From SINGHS at KIRK.ASTON.AC.UK Wed May 4 17:42:22 1988
From: SINGHS at KIRK.ASTON.AC.UK (SINGHS@KIRK.ASTON.AC.UK)
Date: 4-MAY-1988 21:42:22 GMT
Subject: No subject
Message-ID:

Dear Sir, Please add me to your mailing list. Thanks. S.Singh

From kawahara at speech-apollo.ntt.jp Thu May 5 13:23:38 1988
From: kawahara at speech-apollo.ntt.jp (Hideki Kawahara)
Date: Thu, 5 May 88 13:23:38 jst
Subject: Neural Network research in Japan, second report
Message-ID: <8805050423.AA00492@speech-apollo.NTT.jp>

>Report may be available through ordering to EIC.

I forgot to point out one thing: the reports are written in Japanese.

kawahara%nttlab.ntt.JP at RELAY.CS.NET (from ARPA site)

From ELECTRO at atc.bendix.com Thu May 5 08:41:00 1988
From: ELECTRO at atc.bendix.com (Richard A. Burne)
Date: Thu, 5 May 88 08:41 EDT
Subject: Graduate Student Summer Employment
Message-ID:

For those who have students looking for a summer research job, or for those who are students in pursuit of summer employment, the following will be of interest.
Richard Burne
Electro at ATC.Bendix.Com
Allied-Signal Aerospace Technology Center

Graduate Student Summer Employment

Three-month Summer Research Position in Columbia, MD, for a graduate student interested in neural network methods for signal classification. Under the direction of a staff scientist, the student's duties would include setting up and testing a commercial PC-based neural-network system and performing computer-generated signal classification experiments. Successful applicant needs an awareness of neural-network concepts, especially feed-forward networks with back-propagation learning, and familiarity with linear system theory, signal processing and basic decision theory. Experience with IBM PC/AT hardware and software preferred.

The Allied-Signal Aerospace Technology Center is located in modern facilities in Columbia, MD (between Baltimore and Washington, D.C.) and employs a technical staff of 55, 27 of whom are advanced degree holders. Current research activities include neural networks, fault-tolerant computer architectures, and advanced microelectronics.

Interested candidates should contact Dr. Robert Simpson at (301) 964-4147, or write to him at the following address:

Dr. Robert Simpson
Allied-Signal Aerospace Technology Center
9140 Old Annapolis Road/MD 108
Columbia, MD 21045

We would prefer to fill this position beginning June 1, 1988, so a prompt response to this announcement will be to your advantage.

From mike%bucasb.bu.edu at bu-it.BU.EDU Tue May 10 10:50:50 1988
From: mike%bucasb.bu.edu at bu-it.BU.EDU (Michael Cohen)
Date: Tue, 10 May 88 10:50:50 EDT
Subject: Lectures on Neural Networks
Message-ID: <8805101450.AA27989@bucasb.bu.edu>

Stephen Grossberg of Boston University will deliver two lectures at the U. Mass Worcester Medical School, 55 Lake Avenue North, Worcester, MA 01655, right off Route 9.
Wednesday, May 11, 1988, 4:00PM:
"Neural networks for learning and memory of perceptual and cognitive recognition codes"

and in the Southeast Conference Room,

Thursday, May 12, 1988, 12 noon:
"Neural dynamics of distributed decision making and short-term memory: Contrast-enhancement, automatic gain control, and reverberation in shunting on-center off-surround networks"

Call 1-856-4147 for further information.

-----------------------------------------------------------------------------
-- Michael Cohen ---- Center for Adaptive Systems
Boston University (617-353-7857)
Email: mike at bucasb.bu.edu
Smail: Michael Cohen
Center for Adaptive Systems
Department of Mathematics, Boston University
111 Cummington Street
Boston, Mass 02215

From barto at anger.tcp.cs.umass.edu Thu May 12 09:31:51 1988
From: barto at anger.tcp.cs.umass.edu (barto@anger.tcp.cs.umass.edu)
Date: Thu, 12 May 88 09:31:51 EDT
Subject: Technical Report Available
Message-ID: <8805121331.AA10805@anger.ANW.edu>

ACTION

Michael I. Jordan
Massachusetts Institute of Technology

David A. Rosenbaum
University of Massachusetts, Amherst

COINS Technical Report 88-26

ABSTRACT

We discuss three fundamental problems for theories of the organization of action -- the degrees-of-freedom problem, the serial order problem, and the problem of sensorimotor learning. Our emphasis is on the interrelated nature of these problems. Several recent models and algorithms which address various aspects of these problems are described and evaluated.

To appear in M. I. Posner (Ed.), Handbook of Cognitive Science, Cambridge, MA: MIT Press.

To obtain a copy, contact Ms.
Connie Smith
Computer and Information Science Graduate Research Center
University of Massachusetts
Amherst, MA 01003
smith at cs.umass.edu

From netlist at psych.Stanford.EDU Sun May 15 23:20:57 1988
From: netlist at psych.Stanford.EDU (Mark Gluck)
Date: Sun, 15 May 88 20:20:57 PDT
Subject: Stanford Adaptive Networks Colloquium
Message-ID:

Stanford University Interdisciplinary Colloquium Series:
ADAPTIVE NETWORKS AND THEIR APPLICATIONS

May 17 (Tuesday, 3:15pm):

STEPHEN J. HANSON
"Some Comments and Variations on Backpropagation"
Bell Communications Research
435 South St.
Morristown, NJ 07960

* * *

============ NOTE: room change this week =====================================
Location: Room 040 in the lower level of the Psychology Building, Jordan Hall (Bldg. 420).
==============================================================================

Information: To be placed on an electronic mail distribution list for information about these and other adaptive network events in the Stanford area, send email to netlist at psych.stanford.edu. For additional information, contact Mark Gluck, Bldg. 420-316; (415) 725-2434 or email to gluck at psych.stanford.edu

From harnad at Princeton.EDU Mon May 16 01:25:26 1988
From: harnad at Princeton.EDU (Stevan Harnad)
Date: Mon, 16 May 88 01:25:26 edt
Subject: Call for qualified referees
Message-ID: <8805160525.AA08748@mind.Princeton.EDU>

BBS (Behavioral & Brain Sciences), published by Cambridge University Press, is an international, interdisciplinary journal devoted exclusively to Open Peer Commentary on important and controversial current target articles in the biobehavioral and cognitive sciences. Because of the growing volume of connectionist and connectionism-related submissions now being received at BBS, we are looking for more referees who are qualified and willing to evaluate submitted manuscripts.
If you are professionally qualified in connectionism, parallel distributed processing, associative networks, neural modeling etc., and wish to serve as a referee for BBS, please send your CV to the email or USmail address below. Individuals who are already BBS Associates need only specify that this is a specialty area that they wish to review in.

Stevan Harnad
ARPANET: harnad at mind.princeton.edu or harnad%princeton.mind.edu at princeton.edu
CSNET: harnad%mind.princeton.edu at relay.cs.net
BITNET: harnad%mind.princeton.edu at pucc.bitnet
UUCP: princeton!mind!harnad
PHONE: 609 921 7771
USMAIL: BBS, 20 Nassau Street, Rm. 240, Princeton NJ 08542

From esj at cs.brown.edu Wed May 18 14:28:26 1988
From: esj at cs.brown.edu (Eugene Santos Jr.)
Date: Wed, 18 May 88 14:28:26 EDT
Subject: Dynamic Neural Net Architectures
Message-ID:

I'm looking for references to work on neural networks capable of changing their architectures by adding/removing nodes. Thanks!

-------------------------------------------------------------------------------
Eugene Santos Jr., Brown University
uucp: ...!{ihnp4,decvax}!brunix!esj
bitnet: esj at browncs.bitnet
arpa: esj%cs.brown.edu at relay.cs.net
csnet: esj at cs.brown.edu
us mail: Eugene Santos Jr., Box 1910, Department of Computer Science, Brown University, Providence, RI 02912
-------------------------------------------------------------------------------

From netlist at psych.Stanford.EDU Thu May 19 09:40:50 1988
From: netlist at psych.Stanford.EDU (Mark Gluck)
Date: Thu, 19 May 88 06:40:50 PDT
Subject: Stanford Adaptive Networks Colloquium
Message-ID:

Stanford University Interdisciplinary Colloquium Series:
ADAPTIVE NETWORKS AND THEIR APPLICATIONS

**************************************************************************

May 20 (Friday, 2:15pm):

STEVEN GALLANT
"Sequential Associative Memories"
College of Computer Science
Northeastern Univ.
Boston, MA 02115

**************************************************************************

ABSTRACT

Humans are very good at manipulating sequential information, but sequences present special problems for connectionist models. As an approach to sequential problems we have examined totally connected subnetworks of cells called Sequential Associative Memories (SAM's). The coefficients for SAM cells are unmodifiable and are generated at random. A subnetwork of SAM cells performs two tasks:

1. Their activations determine a state for the network that permits previous inputs and outputs to be recalled, and

2. They increase the dimensionality of input and output representations to make it possible for other (modifiable) cells in the network to learn difficult tasks.

The second function is similar to the Distributed Method, a way of generating intermediate cells for non-sequential problems.

Results from several experiments are presented. The first is a robotic control task that required a network to produce one of several sequences of outputs when input cells were set to a corresponding `plan number'. The second experiment was to learn a sequential version of the parity function that would generalize to arbitrarily long input strings. Finally, we attempted to teach a network how to add arbitrarily long pairs of binary numbers. Here we were successful if the network contained a cell dedicated to the notion of `carry'; otherwise the network performed at less than 100% for unseen sequences longer than those used during training.

Each of these tasks required a representation of state, and hence a network with feedback. All were learned using subnetworks of SAM cells.

* * *

Location: Room 380-380C, which can be reached through the lower level courtyard between the Psychology and Mathematical Sciences buildings.
Information: To be placed on an electronic mail distribution list for information about these and other adaptive network events in the Stanford area, send email to netlist at psych.stanford.edu. For additional information, contact Mark Gluck, Bldg. 420-316; (415) 725-2434 or email to gluck at psych.stanford.edu

From breen at silver.bacs.indiana.edu Fri May 20 20:37:24 1988
From: breen at silver.bacs.indiana.edu (elise breen)
Date: Fri, 20 May 88 19:37:24 est
Subject: e-mailing list
Message-ID:

Please put me on the connectionist mailing list. My e-mail address is breen at silver.bacs.indiana.edu

Thank you,
Elise M. Breen

From UTANS%GINDI at Venus.YCC.Yale.Edu Tue May 24 12:45:00 1988
From: UTANS%GINDI at Venus.YCC.Yale.Edu (UTANS%GINDI@Venus.YCC.Yale.Edu)
Date: Tue, 24 May 88 11:45 EST
Subject: No subject
Message-ID:

help

From harnad at Princeton.EDU Wed May 25 00:52:09 1988
From: harnad at Princeton.EDU (Stevan Harnad)
Date: Wed, 25 May 88 00:52:09 edt
Subject: Language Learnability: BBS Call for Commentators
Message-ID: <8805250455.AA22675@Princeton.EDU>

Below is the abstract of a forthcoming target article to appear in Behavioral and Brain Sciences (BBS), an international journal of "open peer commentary" in the biobehavioral and cognitive sciences, published by Cambridge University Press. For information on how to serve as a commentator or to nominate qualified professionals in these fields as commentators, please send email to: harnad at mind.princeton.edu or write to: BBS, 20 Nassau Street, #240, Princeton NJ 08542 [tel: 609-921-7771]

-----------------------------------------------------------------------
[Chomsky, heredity/environment, poverty-of-the-stimulus, development]

The Child's Trigger Experience: "Degree-0" Learnability

David Lightfoot
Linguistics Department
University of Maryland

A selective model of human language capacities holds that people come to know more than they experience.
The discrepancy between experience and eventual capacity is bridged by genetically provided information. Hence any hypothesis about the linguistic genotype (or "Universal Grammar," UG) has consequences for what experience is needed and what form people's mature capacities (or "grammars") will take. This BBS target article discusses the "trigger experience," i.e., the experience that actually affects a child's linguistic development. It is argued that this must be a subset of a child's total linguistic experience, and hence that much of what a child hears has no consequence for the form of the eventual grammar. UG filters experience and provides an upper bound on what constitutes the triggering experience. This filtering effect can often be seen in the way linguistic capacity can change between generations. Children only need access to robust structures of minimal ("degree-0") complexity. Everything can be learned from simple, unembedded "domains" (a grammatical concept involved in defining an expression's logical form). Children do not need access to more complex structures.

From uttal at humu.nosc.mil Wed May 25 16:05:13 1988
From: uttal at humu.nosc.mil (William L. Uttal)
Date: Wed, 25 May 88 16:05:13 HST
Subject: Language Learnability: BBS Call for Commentators
Message-ID: <8805260205.AA02056@humu.nosc.mil>

Steve -- That is not my cup of tea, but Ron Johnson at the U of Hawaii dept of Psych would be perfect.

Aloha,
Bill Uttal

From ELECTRO at atc.bendix.com Tue May 24 10:16:00 1988
From: ELECTRO at atc.bendix.com (Richard A. Burne)
Date: Tue, 24 May 88 10:16 EDT
Subject: Research Scientist Position
Message-ID:

We have an opening at our research center for a research scientist with knowledge of learning algorithms for neural networks. The following provides additional information.
Richard Burne
electro at ATC.BENDIX.COM
Aerospace Technology Center
Allied-Signal Corporation

Research Scientist Position

The AEROSPACE TECHNOLOGY CENTER, the research and development laboratory for the Aerospace Company of the Allied-Signal Corporation, is seeking an individual to join an ongoing research project investigating neural networks. The primary technical responsibility of the individual will be researching and developing novel learning algorithms for dynamic neural network architectures. Educational background should include a Ph.D. in Applied Math, Electrical Engineering, Physics or a related scientific discipline -- with an emphasis on learning algorithms. Knowledge of signal processing in biological systems would be advantageous.

The AEROSPACE TECHNOLOGY CENTER is located in Columbia, Maryland. Our location provides cultural, recreational and educational opportunities to satisfy every taste. A half-hour trip to the north puts you in the heart of the Baltimore renaissance. The same distance to the south opens up the metropolitan and historical pleasures of Washington, D.C.

For prompt consideration, please direct your resume to:

Mr. John Flato
Aerospace Technology Center
9140 Old Annapolis Road
Columbia, MD 21045

The Aerospace Technology Center is an equal opportunity employer.

From JHLU at sdr.slb.com Thu May 26 21:43:00 1988
From: JHLU at sdr.slb.com (JHLU@sdr.slb.com)
Date: Thu, 26 May 88 21:43 EDT
Subject: Asking for Information
Message-ID:

I am Jiang-hong Lu, a sophomore at MIT. I am doing a summer project on neural computing and want to join this mailing list. Please send me e-mail if you get this letter. Thanks a lot.

Jiang-hong Lu

From 8414902 at UWACDC.ACS.WASHINGTON.EDU Tue May 31 20:06:00 1988
From: 8414902 at UWACDC.ACS.WASHINGTON.EDU (TERU)
Date: Tue, 31 May 1988 17:06 PDT
Subject: HIMOSTHYLEDYNE - MACH
Message-ID:

That is: hierarchical, modular, stochastic, hybrid, learnable, dynamic nets -- maybe chaotic.
(I could replace MACH with PROCH, standing for probably chaotic or pro-chaotic.) Well, I am trying to indicate by the adjectives the significant properties we want the ultimate nets to have. Other obvious candidates are self-organizing, self-reproducing, and genetic --- SOSERGE. Of course parallel processing, so add OPA! Add more or remove some? Take it lightly as friendly noise, since it has been quite quiet here for a while.

-- Teru Homma