From slehar at park.bu.edu Thu Nov 1 10:46:48 1990 From: slehar at park.bu.edu (slehar@park.bu.edu) Date: Thu, 1 Nov 90 10:46:48 -0500 Subject: splitting hairs In-Reply-To: connectionists@c.cs.cmu.edu's message of 1 Nov 90 03:16:42 GM Message-ID: <9011011546.AA01594@bucasd.bu.edu> HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA !!!!!!! Excellent stuff! Great reading! From finton at cs.wisc.edu Thu Nov 1 17:06:50 1990 From: finton at cs.wisc.edu (David J. Finton) Date: Thu, 1 Nov 90 16:06:50 -0600 Subject: CRG-TR-90-5 request Message-ID: <9011012206.AA02232@ai.cs.wisc.edu> finton at cs.wisc.edu From dlovell at s1.elec.uq.oz.au Fri Nov 2 10:42:40 1990 From: dlovell at s1.elec.uq.oz.au (David Lovell) Date: Fri, 2 Nov 90 10:42:40 EST Subject: Rejection not misclassification Message-ID: <9011020042.AA09177@c17.elec.uq.oz.au> Dear Net, I have just been talking with my PhD supervisor and he has pointed out the value of a neural net classifier which will reject patterns that are "too different" from the exemplars used to train the net. For example, if a neural net is being used to classify substances as being toxic or non-toxic then it is much more desirable for the net to reject a poorly recognized sample rather than possibly classify it incorrectly. Seized by a frenzy of motivation, I have decided that the international neural net research community might be able to point out some references on the rejection of poor pattern matches. Naturally, I will compile a list of all the references I receive and send it out to all subscribers of this mailgroup. Thanks in advance, David Lovell (dlovell at s1.elec.uq.oz.au) Department of Electrical Engineering University of Queensland QUEENSLAND 4072 Australia From Mail_System%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU Thu Nov 1 19:05:00 1990 From: Mail_System%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU (Mail_System%VMS.BRIGHTON.AC.UK@VMA.CC.CMU.EDU) Date: Thu, 1 Nov 90 19:05 BST Subject: %% Undelivered Mail %% Message-ID: Your mail was not delivered as follows: %MAIL-E-SENDERR, error sending to user MRE1 %MAIL-E-OPENOUT, error opening !AS as output -RMS-E-CRE, ACP file create failed -SYSTEM-F-EXDISKQUOTA, disk quota exceeded %MAIL-E-SENDERR, error sending to user MRE1 -MAIL-E-OPENOUT, error opening !AS as output -RMS-E-CRE, ACP file create failed -SYSTEM-F-EXDISKQUOTA, disk quota exceeded Your original mail header and message follow. %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% From kawahara at siva.ntt.jp Sat Nov 3 00:12:30 1990 From: kawahara at siva.ntt.jp (kawahara@siva.ntt.jp) Date: Sat, 03 Nov 90 00:12:30 JST Subject: JNNS'90 program and the mailing list (long) Message-ID: <9011021512.AA03947@siva.ntt.jp> I have finally compiled a list of titles presented at the first annual conference of Japan Neural Network Society. I hope this will give a general aspect of neural network research activities in Japan. If you have any questions, please contact neuro-admin at tut.ac.jp, which is the administrating group of Japanese neural network researcher's mailing list. The mailing list is still its infancy, and the traffic is still very low. I hope this will change in the near future. Hideki Kawahara NTT Basic Research Laboratories kawahara%siva.ntt.jp at RELAY.CS.NET ---------- cut here ---------- JNNS'90 The first annual conference of the Japan Neural Network Society Tamagawa University, Tokyo Japan 10-12 September, 1990 Presentation titles: (Original titles were Japanese. These titles were translated into English by the original authors. Some of them, indicated by '**', were translated by the editor of this list. 'O' stands for oral presentation and 'P' stands for poster presentation.) O1-1, A-BP "Another type of back propagation learning", Kazuhisa Niki (Electro technical Lab.) O1-2, Learning of Affine Invariance by Competitive Neural Network, Suichi Kurogi (Division of Control Engineering, Kyushu Institute of Technology) O1-3, ** An Optimal Network Size Selection for Generalization based on Cross Validation, Yasuhiro Wada and Mitsuo Kawato (ATR) O1-4, Generalizing Neural Networks using Mean Curvature, Shin Suzuki and Hideki Kawahara (NTT Basic Research Labs.) O1-5, Evolution of Artificial Animal having Perceptron by Genetic Algorithm, Masanori Ichinose and Tsutomu Hoshino (Institute of Engineering Mechanics, University of Tsukuba) O2-1, Neural Network Model of Self-Organization of Walking Patterns in Insects, Shinichi Kimura, Masafumi Yano and Hiroshi Shimizu(Faculty of Pharmaceutical Science University of Tokyo) O2-2, Learning Trajectory and Force Control of Human Arm Using Feedback-Error-Learning Scheme, Masazumi Katayama and Mitsuo Kawato(ATR Auditory and Visual Perception Research Laboratories) O2-3, Formation of Optimal Hand Configuration to grasp an Object, Naohiro Fukumura, Yoji Uno and Ryoji Suzuki(Department of Mathematical Engineering and Information Physics, Faculty of Engineering, University of Tokyo) O2-4, Hybrid Control of Robotic Manipulator by Neural Network Model (Variable learning of neural networks by Fuzzy set theory), Takanori Shibata and Toshio Fukuda (Nagoya University) Masatoshi Tokita and Toyokazu Mitsuda (Kisarazu Technical College) O2-5, An Overview of Neuofuzzy System, Akira KAWAMURA, Nobuo WATANABE, Yuri Owada, Ryusuke Masuoka and Kazuo Asakawa (Computer-based Systems Laboratory, Fujutsu Laboratories Ltd.) O3-1, ** Cognitive Effects caused by Random Movement of Wide-Field Patterns and Response Characteristics of MST Cells of Macaque Monkey, Masao Kaneko, Hiroshi Nakajima, Makoto Mizuno, Eiki Hida, Hide-aki Saito and Minoru Tsukada (Faculty of Engineering, Tamagawa University) O3-2, Extraction of Binocular Parallax with a Neural Network and 3-D Surface Reconstruction, Mahito Fujii, Takayuki Ito, Toshio Nakagawa (NHK Science and Technical Research Laboratories) O3-3, A Model of 3-D Surface Depth Perception from Its Boundary Perceived with Binocular Viewing, Masanori Idesawa (Rikin: The Institute of Physical and Chemical Research) O3-4, Theory of Information Propagation, Tetsuya Takahashi(The Institute for Physical and Chemical Research Laboratory for Neural Networks) O3-5, A model of the transformation of color selectivity in the monkey visual system, Hidehiko Komatsu, Shinji Kaji, Shigeru Yamane (Electrotechnical Laboratory Neuroscience Section), Yoshie Ideura (Komatsu Limited) O4-1, Magical number in cognitive map and quantization of cognitive map shape, Terunori Mori (Electrotechnical Laboratory) O4-2, Generative representation of symbolic information in a pattern recognition model "holovision", Hiroshi Shimizu and Yoko Yamaguchi (Faculty of Pharmaceutical Sciences, University of Tokyo) O4-3, Long-Term Potentiation to Temporal Pattern Stimuli in Hippocampal Slices, Takeshi Aihara, Minoru Tsukada and Makoto Mizuno (Faculty. of Eng. Tamagawa Univ.), Hiroshi Kato and Haruyoshi Miyagawa(Dept. of Physiol. Yamagata Univ.) O4-4, Bidirectional Neural Network Model for the Generation and Recognition of Temporal Patterns, Ryoko Futami and Nozomu Hoshimiya (Department of Electrical Communications, Faculty of Engineering, Tohoku University) O4-5, Theta rhythm in hippocampus: phase control of information circulation, Yoko Yamaguchi and Hiroshi Shimizu (Faculty of Pharmaceutical Science University of Tokyo) O5-1, Stability and/or instability of limit cycle memories embedded in an asynchronous neural network model, Toshinao Ishii and Wolfgang Banzhaf (Central Research Laboratory Mitsubishi Electric Corporation), Shigetoshi Nara (Department of Electric and Electronic Engineering Faculty of Engineer Okayama Univ.) O5-2, Geometric analysis of the dynamics of associative memory networks, Kenji Doya (Faculty of Engineering, University of Tokyo) O5-3, On the Integration of Mapping and Relaxation, Kazuyoshi Tsutsumi (Department of Mechanical and System Engineering Faculty of Science and Technology Ryukoku Univ.) P1-1, Neural network model on gustatory neurons in rat, Masaharu Adachi, Eiko Ohshima, Kazuyuki Aihara and Makoto Kotani (Faculty of Engineering, Tokyo Denki Univ.), Takanori Nagai (Faculty of Medicine, Teikyo Univ.), Takashi Yamamoto (Faculty of Dentistry, Osaka Univ.) P1-2, Learning Algorithm based on Temporal Pattern Discrimination in Hippocampus, Minoru Tsukada and Takeshi Aihara (Tamagawa University), K. Kato (Yamagata University) P1-3, A study on Learning by Synapse patch group, Shuji Akiyama, Yukifumi Shigematsu and Gen Matsumoto (Electrotechnical Laboratory Molecular and Cellular Neuroscience Section) P1-4, A Model of the Mechanisms of Long-Term Depression in the Cerebellum, Tatso Kitajima and Kenichi Hara (Faculty of Engineering, Yamagata Univ.) P1-5, ** Self-Organization in Neural Networks with Lateral Inhibition, Y. Tamori, S. Inawashiro and Y. Musya (Faculty of Engineering, Tohoku University) P1-6, ** Receptive Fields by Self-Organization in Neural Networks, S.Inawashiro, Y. Tamori and Y. Musya (Faculty of Engineering, Tohoku University) P1-7, ** Does Backpropagation Exist in Biological Systems? -- Discussions and Considerations, Shyozo Yasui (Kyusyu Institute of Technology), Eiki Hida (Tamagawa University) P1-8, Accelerating the convergence of the error back-propagation algorithm by deciding effective teacher signal, Yutaka Fukuoka, Hideo Matsuki, Hidetake Muraoka and Haruyuki Minamitani (Keio Univ.) P1-9, **Three-Layered Backpropagation Model with Temperature Parameter, Yoji Fukuda, Manabu Kotani and Haruya Matsumoto (Faculty of Engineering, Kobe University) P1-10, Kalman Type Least Square Error Learning Law for Sequential Neural Network and its Information Theory, Kazuyoshi Matsumoto (Kansai Advanced Research Center, CRL, MPT) P1-11, Learning Surface of Hierarchical Neural Networks and Valley Learning Method, Kazutoshi Gouhara, Norifusa Kanai, Takeshi Iwata and Yoshiki Uchikawa (Department of Electronic Mechanical Engineering School of Engineering: Nagoya Univ.) P1-12, A Study of Generalization of Multi-layered Neural Networks, Katsumasa Matsuura (Mechanical Engineering Research Laboratory, Hitachi Ltd.) P1-13, When "Learning" occurs in Machine Learning, Noboru Watanabe (Department of Biophysics, Kyoto University) P1-14, A Study on Self-supervised Learning System, Kazushige Saga, Tamami Sugasaka and Shigemi Nagata (Computer-Based Systems Laboratory Fujitsu Laboratories Ltd.) P1-15, A Learning Circuit for VLSI Analog Neural Network Implementation, Hiroyuki Wasaki, Yoshihiko Horio and Shogo Nakamura (Department of Electronic Engineering : Tokyo Denki Univ.) P1-16, Comparison of Learning Methods for Recurrent Neural Networks, Tatsumi Watanabe, Kazutoshi Gouhara and Yoshiki Uchikawa (Department of Electronic Mechanical Engineering, School of Engineering : Nagoya Univ.) P1-17, A Learning algorithm for the neural network of Hopfield type, Fumio Matsunari and Masuji Ohshima (Toyota Central Res. & Develop. Labs. Inc.) P1-18, Quantitative relationship between internal model of motor system and movement accuracy, movement distance and movement time, Yoji Uno and Ryoji Suzuki (Faculty of Engineering: University of Tokyo) P1-19, Autonomic control of bipedal locomotion using neural oscillators, Gentaro Taga, Yoko Yamaguchi and Hiroshi Shimizu (Faculty of Pharmaceutical Sciences: University of Tokyo) P1-20, Learning Model of Posture Control in Cerebellum, Hiroaki Gomi and Mitsuo Kawato (ATR Auditory and Visual Perception Research Laboratories) P1-21, Hybrid Control of Robotic Manipulator by Neural Network Model (Sensing and Hybrid Control of Robotic Manipulator with Collision Phenomena), Takanori Shibata, Toshio Fukuda, Fumihito Arai and Hiroshi Wada (Nagoya University), Masatoshi Tokita and Toyokazu Mituoka (Kisarazu Technical College) Yasumasa Shoji (Toyo Engineering Corp.) P1-22, Hybrid Position/Force Control of Robotic Manipulator by Application of Neural Network (Adaptive Control with Consideration of Characteristics of Objects), Masatoshi Tokita and Toyokazu Mituoka(Kisarazu National College of Technology), Toshio Fukuda and Takanori Shibata (Nagoya Univ.) P1-23, Reverbration in Chaotic Neural Network with a Fractal Connection, Masatoshi Hori, Masaaki Okabe and Masahiro Nakagawa (Department of Electrical Engineering, Faculty of Engineering, Nagaoka University of Technology) P1-24, An investigation of correlation possibility in neurocomputing and quantum mechanics via Matrix Dynamics, Tomoyuki Nishio (Technology Planning office General R&D Laboratory JUKI Corporation) P1-25, Knowledge Representation and Parallel Inference with Structured Networks, Akira Namatame and Youichi Ousawa (National Defense Academy) P1-26, Simulation of a Spiking Neuron with Multiple Input, G. Bugmann (Fundamental Research Laboratories, NEC Corporation) P2-1, Parallel Implementation of Edge detection by Energy Minimization on QCDPAX machine, Hideki Asoh (Electrotechnical Laboratory), Youichi Hachikubo and Tsutomu Hoshino (University of Tsukuba) P2-2, Characteristics of the Marr-Hildreth filter for two particle image, Shigeharu Toyoda (Department of Chemical Engineering, Faculty of Engineering Science, Osaka Univ.) P2-3, A neural network for fixation point selection, Makoto Hirahara and Takashi Nagano (College of engineering, Hosei Univ.) P2-4, A Method for Analyzing Inverse Dynamics of the Retinal Horizontal Cell Response through the Ionic Current Model, Yoshimi Kamiyama, Hiroyuki Ishii and Shiro Usui (Information and Computer Science, Toyohashi Uni. of Technology) P2-5, Selective Recognition of Plural Patterns by Neural Networks, Katsuji Imai, Kazutoshi Gouhara and Yoshiki Uchikawa (Department of Electronic Mechanical Engineering, School of Engineering, Nagoya Univ.) P2-6, A neural network for size-invariant pattern recognition, Masaichi Ishikawa and Toshi Nagano (College of Engineering, Hosei Univ.) P2-7, Human-Face Identification by Neural Network for Mosaic Pattern, Makoto Kosugi (NTT Human Interface Laboratories) P2-8, Pattern Classification and Recognition Neural Network based on the Associative Learning Rules, Hidetoshi Ikeno (Maizuru College of Technology), Shiro Usui and Manabu Sakakibara (Toyohashi Univ. of Technology) P2-9, Learning of A Three-Layer Neural Network with Translated Patterns, Jianqiang YI, Shuichi Kurogi and Kiyotoshi Matsuoka (Division of Control Engineering, Kyushu Institute of Technology) P2-10, Combining a Neural Network with Template Matching for Handwritten Numeral Classification, Masahiko Tateishi, Haruaki Yamazaki (Systems Laboratories, OKI Electric Corporation) P2-11, Recognition of Continuous Writing of English Words with the Mechanism of Selective Attention, Taro Imagawa and Kunihiko Fukushima (Faculty of Engineering Science, Osaka University) P2-12, Recognition of Handwritten Alphanumeric Characters by the Neocognitron, Nobuaki Wake Kunihiko Fukushima (Faculty of Engineering Science, Osaka University) P2-13, Target Recognition with Chebyshev Networks, Nobuhisa Ueda and Akira Namatame (National Defense Academy) P2-14, A Large Scale Neural Network "Comb NET" for Printed Kanji Character Recognition (JIS 1st and 2nd Level Character Set), Takashi Tohma, Akira Iwata, Hiroshi Matsuo and Nobuo Suzumura (Nagoya Institute of Technology) P2-15, Discriminative Properties of the Temporal Pattern in the Mesencephalic Periaqueductal Gray of Rat, Mitsuo Terasawa, Minoru Tsukada, Makoto Mizuno, Takeshi Aihara (Faculty of Engineering Tamagawa Univ.) P2-16, Spoken Word Recognition using sequential Neural Network, Seiichi Nakagawa and Isao Hayakawa (Toyohashi University of Technology, Dept. of Info. & Comp. Science.) P2-17, Learning of the three-vowel sequence by a neural network model and influence to a middle vowel from preceding and succeeding vowels, Teruhiko Ohtomo and Ken-ichi Hara (Faculty of Engineering, Yamagata Univ.) P2-18, A Self-Organizing Neural Network for The Classification of Temporal Sequences, Seiichi Ozawa (Nara National College of Technology), Kazuyoshi Tsutsumi (Faculty of Science and Technology, Ryukoku Univ.), Haruya Matsumoto (Faculty of Technology, Kobe Univ.) P2-19, Neural Mechanism for Fine Time-Resolution, Kiyohiko Nakamura (Graduate School of Science and Engineering Tokyo Institute of Technology) P2-20, Neuromagnetic Image Reconstruction by a Neural Network, Hisashi Tsuruoka (Fukuoka Institute of Technology) and Kohyu Fukunishi (Advanced Research Lab., Hitachi, Ltd.) P2-21, A Sparse Stabilization in Mutually Connected Neural Networks, Hiroshi Yamakawa (Faculty of Engineering, University of Tokyo), Yoichi Okabe (Research Center for Advanced Science and Technology, University of Tokyo) P2-22, ** Relation between Recollection Ability and Membrane Potential Threshold in Hopfield Model, Eizo Ohno and Atsushi Yamanaka (Sharp Laboratory) P2-23, Weight quantization in learning Boltzmann machine, Masanobu Takahashi, Wolfgang Balzer, Jun Ohta and Kazuo Kyuma (Solid state quantum electronics department Central Research Laboratory Mitsubishi Electric Corporation) P2-24, A cellular organizing approach for the travelling salesman problem, M. Shigematsu (Electrotechnical Laboratory) P2-25, Combinatorial Optimization Problems and Stochastic Logic Neural Network, Yoshikazu Kondo and Yasuji Sawada (Research Institute of Electrical Communication Tohoku Univ.) P2-26, Neural Network Models on SIMD Architecture Computer, Makoto Hirayama (ATR Auditory and Visual Perception Research Laboratories) P2-27, Optimal connection of an associative network with non-uniform elements, Hiro-fumi Yanai (Department of Mathematical Engineering and Information Physics, University of Tokyo) From B344DSL at UTARLG.UTARL.EDU Sat Nov 3 00:33:00 1990 From: B344DSL at UTARLG.UTARL.EDU (B344DSL@UTARLG.UTARL.EDU) Date: Fri, 2 Nov 90 23:33 CST Subject: Splitting hairs Message-ID: <0DDC7B47EB5F000497@utarlg.utarl.edu> From: IN%"gary%cs at ucsd.edu" "Gary Cottrell" 1-NOV-1990 23:35:40.19 To: jose at learning.siemens.com, tgd at turing.CS.ORST.EDU CC: connectionists at CS.CMU.EDU Subj: splitting hairs From davec at cogs.sussex.ac.uk Fri Nov 2 10:01:55 1990 From: davec at cogs.sussex.ac.uk (David Cliff) Date: Fri, 2 Nov 90 15:01:55 GMT Subject: Technical Report CSRP162 Message-ID: <2042.9011021501@rsund.cogs.susx.ac.uk> The following report is now available: "Computational Neuroethology: A Provisional Manifesto" Dave Cliff, University of Sussex School of Cognitive and Computing Sciences CSRP162, May 1990. \begin{abstract} This paper questions approaches to computational modelling of neural mechanisms underlying behaviour. It examines ``simplifying'' (connectionist) models used in computational neuroscience and concludes that, unless embedded within a sensorimotor system, they are meaningless. The implication is that future models should be situated within closed-environment simulation systems: output of the simulated nervous system is then expressed as observable behaviour. This approach is referred to as ``computational neuroethology''. Computational neuroethology offers a firmer grounding for the semantics of the model, eliminating subjectivity from the result-interpretation process. A number of more fundamental implications of the approach are also discussed, chief of which is that insect cognition should be studied in preference to mammalian cognition. \end{abstract} An abridged version of this paper is to appear in: "From Animals to Animats: Proceedings of the First International Conference on the Simulation of Adaptive Behaviour" J.-A. Meyer and S.W. Wilson, editors. MIT Press/Bradford Books, 1990. --------------------------------------------------------------------------- Copies of the postscript file cliff.manifesto.Z may be obtained from the pub/neuroprose directory in cheops.cis.ohio-state.edu. Either use the Getps script or do this: unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Connected to cheops.cis.ohio-state.edu. Name (cheops.cis.ohio-state.edu:): anonymous 331 Guest login ok, sent ident as password. Password: neuron 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose ftp> binary ftp> get cliff.manifesto.ps.Z ftp> quit unix-2> uncompress cliff.manifesto.ps.Z unix-3> lpr -P(your_local_postscript_printer) cliff.manifesto.ps ---------------------------------------------------------------------------- Or, order a hardcopy by sending your physical mail address to davec at cogs.sussex.ac.uk, mentioning CSRP162. Please do this only if you cannot use the ftp method described above. From rob at galab2.mh.ua.edu Sat Nov 3 16:55:51 1990 From: rob at galab2.mh.ua.edu (Robert Elliott Smith) Date: Sat, 03 Nov 90 15:55:51 CST Subject: literature related to kawahara posting. Message-ID: <9011032156.AA24258@galab2.mh.ua.edu> I tried to mail the following directly to Mr. Kawahara, but it bounced. Could someone else supply me with the necessary info? would like to get copies any papers assocciated with titles included in your list. Could you give me some idea of how to get any literature related to: O1-5, Evolution of Artificial Animal having Perceptron by Genetic Algorithm, Masanori Ichinose and Tsutomu Hoshino (Institute of Engineering Mechanics, University of Tsukuba) P1-13, When "Learning" occurs in Machine Learning, Noboru Watanabe (Department of Biophysics, Kyoto University) P1-14, A Study on Self-supervised Learning System, Kazushige Saga, Tamami Sugasaka and Shigemi Nagata (Computer-Based Systems Laboratory Fujitsu Laboratories Ltd.) Any help you can give me would be appreciated. Sincerely, Rob Smith. ------------------------------------------- Robert Elliott Smith Department of Engineering of Mechanics The University of Alabama P. O. Box 870278 Tuscaloosa, Alabama 35487 <> rob at galab2.mh.ua.edu <> (205) 348-4661 ------------------------------------------- From well!moritz at apple.com Sat Nov 3 21:46:55 1990 From: well!moritz at apple.com (Elan Moritz) Date: Sat, 3 Nov 90 18:46:55 pst Subject: _homo_trans sapiens, request for comments Message-ID: <9011040246.AA14274@well.sf.ca.us> TRANS_SAPIENS and TRANS_CULTURE *** REQUEST FOR COMMENTS ------------------------------------- In an earlier paper [Memetic Science: I - General Introduction; Journal of Ideas, Vol. 1, #1, 3-22, 1990] I postulated the emergence of a descendent of homo sapiens. This descendent will be primarily differentiated from h. sapiens by having * substantially greater cognitive abilities *. [the relevant section of the paper is included below]. >>>>> I plan to write a more substantive paper on the topic and would appreciate comments, speculation, arguments for & against this hypothesis. Relevant comments / arguments will be addressed in the paper and be properly acknowledged/referenced <<<<<. Elan Moritz <<<< -- text of h. trans sapiens section follows -- We also introduce here the concepts of trans-culture and Homo trans-sapiens (or simply trans-sapiens). While being topics of a future paper, trans-culture can be described as the next step of culture dominated by deep connections, interactions, and relationships between objects created by large human/machine teams. A manifest property of trans-culture is the extreme and transcendent complexity of interactions and relations between humans and the cultural objects involved, with the additional property of being non-accessible to Homo sapiens. Examples of trans-cultural objects already exist; for example, there is no individual who (at any given temporal instance) is an expert in all aspects of medicine, or who is familiar with all biological species and their relationships, or is an expert in all aspects of physics, or who is totally familiar with all aspects of even a single cultural artifact (e.g. Hubble space telescope, Space Shuttle design, or the total design of a nuclear power plant). In fact, we are approaching the point that certain proofs of mathematical theorems are becoming too long and difficult for any one individual to keep in conscious awareness. In a way, these transcendent and extended complexity relationships are examples of more complicated 'meta-memes', which is one of the reasons it is interesting to study the evolution of ideas. Homo trans-sapiens is the [postulated] next step in evolution of homo sapiens. There is no reason to expect or require that Homo sapiens will not undergo further evolution. The bio-historical trend indicates that the major evolutionary development in Homo is in the cortico-neural arena (i.e. increasingly more complex organization of the nervous system and the brain). Specifically it is the higher level cognitive - Knowledge Information Processing functions that set H. Sapiens apart. It is asserted here (and to be discussed in a future paper) that H. trans-sapiens is a logical consequence of evolution, and that the milieu and adaptive epigenetic landscape for H. trans-sapiens is already present in the form of trans-culture. It is indeed possible that the basic mutations are in place and trans-sapiens already exists or will appear in the biologically-near time frame. [ Please pass to other relevant news groups/ e-lists] Elan Moritz, snail mail: Elan Moritz The Institute for Memetic Research PO Box 16327, Panama City, Florida 32406 e mail: moritz at well.sf.ca.us [internet] From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Sun Nov 4 11:04:44 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Sun, 04 Nov 90 11:04:44 EST Subject: _homo_trans sapiens, request for comments In-Reply-To: Your message of Sat, 03 Nov 90 18:46:55 -0800. <9011040246.AA14274@well.sf.ca.us> Message-ID: ... There is no reason to expect or require that Homo sapiens will not undergo further evolution. The bio-historical trend indicates that the major evolutionary development in Homo is in the cortico-neural arena (i.e. increasingly more complex organization of the nervous system and the brain). I think that this is a very shaky bit of extrapolation. Perhaps the full paper addresses this point in more detail, but I think that many observers (including some of the more perceptive science fiction writers) have pointed out the following: The very matrix of culture and machines that give rise to your "trans-sapiens" has pretty much eliminated the selective pressure that has driven evolution for the last couple of billion years -- at least for humans living in developed countries. In fact, in such countries, it is often the most successful people (in the intellectual terms you are concerned with) who breed the least, so not-very-natural selection may actually be working against this kind of success. So, the cultural matrix continues to develop, but the individual humans who make up that matrix may, on the average, be less capable as time goes on. It may well be that within a generation or two we'll have such complete control over the genetic makeup of our human progeny that the old mutation/selection machinery will be irrelevant. Whether we will (or should) choose to use this control to deliberately change the species in certain directions is a *very* complex question. But whether or not we actively take control of human evolution, it makes little sense to predict the future based on "bio-historical trends" of the past million years. We have recently crossed a discontinuity that changes the rules of evolution in fundamental and not-yet-understood ways. That change is probably permanent unless the culture itself is destroyed. -- Scott From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Mon Nov 5 01:46:18 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Mon, 05 Nov 90 01:46:18 EST Subject: _homo_trans sapiens (Ooops!) Message-ID: A couple of people have suggested to me that the _homo_trans_sapiens_ discussion, while potentially interesting to some of us, really doesn't belong on this mailing list. Upon reflection, I have to agree. I hope Elan Moritz's original post and my hasty response don't generate a lot of amateur philosophizing on this list -- that's what usenet is for. :-) Perhaps Mr. Moritz could pick an appropriate newsgroup for the continuation of this discussion and inform people interested in this topic where to find the discussion. Then the rest of us can get back to building _back_propo_micro_sapiens_ or whatever. -- Scott From well!moritz at apple.com Sun Nov 4 21:46:19 1990 From: well!moritz at apple.com (Elan Moritz) Date: Sun, 4 Nov 90 18:46:19 pst Subject: _homo_trans sapiens, request for comments Message-ID: <9011050246.AA24500@well.sf.ca.us> You raise interesting points here, others have raised mutation rates, coordination with other physical parameters of human body, etc. The points most seem to make make are with respect to the genotype+phenotype set. When more comments come in I will summarize and post them. From rr at cstr.edinburgh.ac.uk Mon Nov 5 07:54:26 1990 From: rr at cstr.edinburgh.ac.uk (Richard Rohwer) Date: Mon, 5 Nov 90 12:54:26 GMT Subject: maximum entroppy and trans_homo sapiens Message-ID: <5812.9011051254@kahlo.cstr.ed.ac.uk> This isn't really a connectionist subject, but perhaps it has enough amusement value to merit posting... Elan Moritz writes: > Homo trans-sapiens is the [postulated] next step in > evolution of homo sapiens. There is no reason to expect or My view is that the fate of humanity is either anihilation or obsolesence (or both). There are lots of obvious possibilities for reaching these fates, and I do not wish to review the menu. Instead I would like to mention an amusing (if uncompelling) doomsday-is-imminent argument based on the Anthropic principle -- the idea that the universe appears as it does to folks like us because folks like us cannot survive in a substantially different universe, and therefore such universes go unnoticed. Suppose the human population grows exponentially until it is anihilated. A randomly selected person in this population (considered over all time) is most likely to live at a time when the population is large, which also happens to be a time near doomsday. So in the spirit of the Maximum Entropy Principle, one asserts that we are probably typical people, and therefore doomsday looms. (There's a wee problem with this argument... It applied equally well at any time in history.) Scott Fahlman makes the point that: > The very matrix of culture and machines that > give rise to your "trans-sapiens" has pretty much eliminated the selective > pressure that has driven evolution for the last couple of billion years -- Ah, but do selective pressures live on in the machines we build...? Those machines will survive which make the most effective use of their human (and other) resources... I presume this idea constitutes a standard theme in sci-fi. Richard Rohwer JANET: rr at uk.ac.ed.cstr Centre for Speech Technology Research ARPA: rr%ed.cstr at nsfnet-relay.ac.uk Edinburgh University BITNET: rr at cstr.ed.ac.uk, 80, South Bridge rr%cstr.ed.UKACRL Edinburgh EH1 1HN, Scotland UUCP: ...!uunet!mcsun!ukc!cstr!rr PHONE: (44 or 0) (31) 225-8883 x261 FAX: (44 or 0) (31) 226-2730 From kayama at CS.UCLA.EDU Sat Nov 3 23:34:09 1990 From: kayama at CS.UCLA.EDU (Masahiro Kayama) Date: Sat, 3 Nov 90 20:34:09 -0800 Subject: Question Message-ID: <9011040434.AA23120@oahu.cs.ucla.edu> Dear connectionist: I am Masahiro Kayama, a visiting scholar of UCLA from Hitachi Ltd., Japan. I am investigating to introduce Kohonen feature map into the feature extraction module of the Factory Automation Controller. Does anyone know how to determine a suitable size of units of a feature map? Are there any deterministic equation between a capacity of a object and a number of units? I am thinking two dimensional map. Masahiro Kayama kayama at cs.ucla.edu From JBECKMAN%DD0RUD81.BITNET at BITNET.CC.CMU.EDU Mon Nov 5 17:18:15 1990 From: JBECKMAN%DD0RUD81.BITNET at BITNET.CC.CMU.EDU (JBECKMAN%DD0RUD81.BITNET@BITNET.CC.CMU.EDU) Date: Mon, 05 Nov 90 17:18:15 CET Subject: NEURAL NETWORK APPLICATIONS Message-ID: <1881251526C00577@BITNET.CC.CMU.EDU> For a survey on NEURAL NETWORK APPLICATIONS (realized or intended) I am looking for REVIEW articles and/or publications about special projects. Who can help me ? - Thanks in advance. Juergen Beckmann, JBECKMAN at DD0RUD81.BITNET From malerma at batman.fi.upm.es Tue Nov 6 09:05:00 1990 From: malerma at batman.fi.upm.es (malerma@batman.fi.upm.es) Date: 6 Nov 90 15:05 +0100 Subject: No subject Message-ID: <9011061405.AA00938@batman.fi.upm.es> I am interested in constructive algorithms for neural networks (capable of generating new units and layers in the NN). Is there any constructive algorithm for NN whose units have continuous (e.g.: sigmoidal) activation function? Miguel A. Lerma Sancho Davila 18 28028 MADRID - SPAIN From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Tue Nov 6 11:35:49 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Tue, 06 Nov 90 11:35:49 EST Subject: No subject In-Reply-To: Your message of 06 Nov 90 15:05:00 +0100. <9011061405.AA00938@batman.fi.upm.es> Message-ID: For those of you who want to play with the Cascade-Correlation algorithm, a public-domain Common Lisp version of my Cascade-Correlation simulator is now available for FTP via Internet. This is the same version I've been using for my own experiments, except that a lot of non-portable display and user-interface code has been removed. I believe that this version is now strictly portable Common Lisp. [Some users have reported that changing "short-float" declarations to "single-float" in Lucid seems to make things work better.] Scott Crowder, one of my graduate students at CMU, has translated this code into C. The C version is considerably faster on most machines, but less flexible. Instructions for obtaining the code via Internet FTP are included at the end of this message. If people can't get it by FTP, contact me by E-mail and I'll try once to mail it to you. Specify whether you want the C or Lisp version. If it bounces or your mailer rejects such a large message, I don't have time to try a lot of other delivery methods. I am maintaining an E-mail list of people using this code so taht I can notify them of any changes or problems that occur. I would appreciate hearing about any interesting applications of this code, and will try to help with any problems people run into. Of course, if the code is incorporated into any products or larger systems, I would appreciate an acknowledgement of where it came from. There are several other programs in the "code" directory mentioned below: versions of Quickprop in Common Lisp and C, and some simulation code written by Tony Robinson for the vowel benchmark he contributed to the benchmark collection. NOTE: This code is distributed without charge on an "as is" basis. There is no warranty of any kind by the authors or by Carnegie-Mellon University. -- Scott *************************************************************************** For people (at CMU, MIT, and soon some other places) with access to the Andrew File System (AFS), you can access the files directly from directory "/afs/cs.cmu.edu/project/connect/code". This file system uses the same syntactic conventions as BSD Unix: case sensitive names, slashes for subdirectories, no version numbers, etc. The protection scheme is a bit different, but that shouldn't matter to people just trying to read these files. For people accessing these files via FTP: 1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu". The internet address of this machine is 128.2.254.155, for those who need it. 2. Log in as user "anonymous" with no password. You may see an error message that says "filenames may not have /.. in them" or something like that. Just ignore it. 3. Change remote directory to "/afs/cs/project/connect/code". Any subdirectories of this one should also be accessible. Parent directories may not be. 4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want. The Cascade-Correlation simulator lives in files "cascor1.lisp" and "cascor1.c". If you try to access this directory by FTP and have trouble, please contact me. The exact FTP commands you use to change directories, list files, etc., will vary from one version of FTP to another. --------------------------------------------------------------------------- To access the postscript file for the tech report describing this algorithm: unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62) Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get fahlman.cascor-tr.ps.Z ftp> quit unix> uncompress fahlman.cascor-tr.ps.Z unix> lpr fahlman.cascor-tr.ps (use flag your printer needs for Postscript) --------------------------------------------------------------------------- From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Tue Nov 6 11:33:25 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Tue, 06 Nov 90 11:33:25 EST Subject: No subject In-Reply-To: Your message of 06 Nov 90 15:05:00 +0100. <9011061405.AA00938@batman.fi.upm.es> Message-ID: I am interested in constructive algorithms for neural networks (capable of generating new units and layers in the NN). Is there any constructive algorithm for NN whose units have continuous (e.g.: sigmoidal) activation function? The only nets that come to mind that fit your description are Cascade-Correlation (Fahlman and Lebiere, paper in NIPS-89 proceedings) and SONN (Tenrio and Lee, NIPS-88). There are a lot of system around that build up a *single* layer of hidden units with continuous activation functions. Information on how to FTP the Cascade-Correlation tech report and simulation code has been sent to this list before. I'll send it to you in a separate message. -- Scott From moody-john at CS.YALE.EDU Wed Nov 7 12:34:22 1990 From: moody-john at CS.YALE.EDU (john moody) Date: Wed, 7 Nov 90 12:34:22 EST Subject: constructive algorithms Message-ID: <9011071734.AA26833@SUNNY.SYSTEMSX.CS.YALE.EDU> In addition to the Cascade Correlation and SONN networks, there is a large body of work by Barron and Barron and by Russian authors on synthesizing multilayer polynomial networks. The synthesis algorithms come under the general title GMDH (Group Method of Data Handling), and have been developed since the early 1960's. They have been the subject of large amounts of both theoretical analysis and empirical testing. Two useful references are: S.J. Farlow, Self Organizing Methods in Modeling: GMDH Type Algorithms, New York: Marcel Dekker 1984. A.R. Barron and R.L. Barron, "Statistical learning networks: a unifying view," Computing Science and Statistics: Proc. 20th Interface Symposium, Ed Wegman, ed. Am. Statist. Assoc. Washington DC, pp 192-203, 1988. Furthermore, an impressive new algorithm for generating tree-structured polynomial networks will be presented as an invited talk at NIPS*90 in three weeks. The algorithm called MARS (Multivariate Adaptive Regression Splines) was developed by Jerome Friedman, Chairman of the Stanford Statistics Department. I believe that a paper on the algorithm is about to appear in one of the statistics journals. Professor Friedman's email address is jhf at playfair.stanford.edu. Hope this helps, John Moody ------- From Alex.Waibel at SPEECH2.CS.CMU.EDU Wed Nov 7 12:12:41 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Wed, 7 Nov 90 12:12:41 EST Subject: constructive algorithms Message-ID: If you define it loosely as self-modifying architectures, there are more: Three at least four more that come to my mind: o Meiosis Nets (Hanson, NIPS'89) o Optimal Brain Damage (Denker and co., NIPS'89) o Adaptive Delays and Connections (Bodenhausen, NIPS'90) o Weight Elimination (Rumelhart) Alex From nips-90 at CS.YALE.EDU Wed Nov 7 14:17:37 1990 From: nips-90 at CS.YALE.EDU (nips90) Date: Wed, 7 Nov 90 14:17:37 EST Subject: For NIPS*90 Presenters! Message-ID: <9011071917.AA02402@CASPER.NA.CS.YALE.EDU> The NIPS*90 Conference is now less than 3 weeks away. I would like to encourage all NIPS*90 presenters to consider bringing videotapes of their work if available. We will most likely have a video projection system set up in the main conference room for oral and poster spotlight presenters. We will also have a limited number of televisions and video casette players available in the hall for the poster sessions. If you would like to present a videotape (VHS format preferred), please contact me as soon as possible so that I can reserve a machine for your time slot or for your poster session. Thanks much, John Moody NIPS*90 Program Chair nips at cs.yale.edu (203)432-1200 VOICE (203)432-0593 FAX ------- From moody-john at CS.YALE.EDU Wed Nov 7 22:08:55 1990 From: moody-john at CS.YALE.EDU (john moody) Date: Wed, 7 Nov 90 22:08:55 EST Subject: more constructive networks at NIPS Message-ID: <9011080308.AA29766@SUNNY.SYSTEMSX.CS.YALE.EDU> Four more constructive algorithms for networks will be presented at NIPS*90: "Bumptrees for Efficient Function, Constraint, and Classification Learning", by Steve Omohundro. "Neural Net Algorithms that Learn in Polynomial Time from Examples and Queries", by Eric Baum and Kevin Lang "A Resource-Allocating Network for Function Interpolation", by John Platt "A Tree-Structured Network for Approximation on High-Dimensional Space", by Terry Sanger ------- From GOLDFARB at UNB.CA Thu Nov 8 00:35:34 1990 From: GOLDFARB at UNB.CA (GOLDFARB) Date: Thu, 08 Nov 90 01:35:34 AST Subject: Reconfigurable neural networks Message-ID: > I am interested in constructive algorithms for neural networks > (capable of generating new units and layers in the NN). I believe that the current framework of neural networks is not well suited to the above task, i.e. for reconfigurable NN. See my message of 27 September. Introduction to a fundamentally new model that is "reconfigurable" can be found in my Pattern Recognition paper mentioned in the above message and in several more resent papers. Although the example considered in the above paper uses nonnumeric operations (and representation) and does not require the learning machine to reconfigure itself, the issue of reconfigurability was one of the main driving motivations for the proposed model and is partly addressed in the more resent papers. Since the proposed model is also valid (and even becomes simpler) for vector patterns, an excellent topic (particularly for a thesis) would be an application of the model to vector patterns. I would be glad to help with the necessary missing details. Lev Goldfarb From aboulang at BBN.COM Thu Nov 8 13:55:58 1990 From: aboulang at BBN.COM (aboulang@BBN.COM) Date: Thu, 8 Nov 90 13:55:58 EST Subject: constructive algorithms In-Reply-To: john moody's message of Wed, 7 Nov 90 12:34:22 EST <9011071734.AA26833@SUNNY.SYSTEMSX.CS.YALE.EDU> Message-ID: From: john moody Full-Name: john moody Date: Wed, 7 Nov 90 12:34:22 EST Furthermore, an impressive new algorithm for generating tree-structured polynomial networks will be presented as an invited talk at NIPS*90 in three weeks. The algorithm called MARS (Multivariate Adaptive Regression Splines) was developed by Jerome Friedman, Chairman of the Stanford Statistics Department. I believe that a paper on the algorithm is about to appear in one of the statistics journals. Professor Friedman's email address is jhf at playfair.stanford.edu. For those of you who can't make it to NIPS (me), or can't wait to find out about MARS (me), or would like to study-up on MARS before the meeting, or even would like to play with it, here are some references from the sci.math.stat newsgroup and how to get MARS code from statlib: From: pgh at stl.stc.co.uk (P.G.Hamer) Newsgroups: sci.math.stat Subject: Re: WHAT DOES MARS MEAN? Date: 23 Oct 90 15:04:20 GMT Reply-To: "P.G.Hamer" Organization: STC Technology Limited, London Road, Harlow, Essex, UK In article <90294.210657BKW1 at psuvm.psu.edu> BKW1 at psuvm.psu.edu writes: >I have been unable to obtain any reference on MARS. Can anyone >tell me what MARS means. c c Multivariate Adaptive Regression Splines (MARS modeling, version 2.5). c c MARS 2.5 is a collection of subroutines that implement the multivariate c adaptive regression spline strategy for data fitting and function c approximation described in Friedman (1988b). It is a generalization of c MARS 1.0 described in Friedman (1988a). It implements as a special case c a palindromically invariant version of the TURBO fitting technique for c smoothing and additive modeling described in Friedman and Silverman (1987). c c References: c c Friedman, J. H. (1988a). Fitting functions to noisy data in high dimensions. c Proceedings, Twentyth Symposium on the Interface, E. Wegman, Ed. c c Friedman, J. H. (1988b). Multivariate adaptive regression splines. c Department of Statistics, Stanford University, Tech. Report LCS102. c c Friedman, J. H. and Silverman, B. W. (1987). Flexible parsimonious smoothing c and additive modeling. Dept. of Statistics, Stanford University, Tech. c report. (TECHNOMETRICS: Feb. 1989). c I got fortran source from the statlib mail server with the request 'send newmars from general'. If you do this, you will first get a message replay that is bogus, following this reply will be another with the MARS source. Cheers, Albert Boulanger From GOLDFARB at unb.ca Thu Nov 8 15:13:28 1990 From: GOLDFARB at unb.ca (GOLDFARB) Date: Thu, 08 Nov 90 16:13:28 AST Subject: Reconfigurable neural networks Message-ID: " I am interested in constructive algorithms for neural networks (capable of generating new units and layers in the NN)." I believe that the current framework of neural networks is not well suited to the above task, i.e. for reconfigurable NN. See my message of 27 September. Introduction to a fundamentally new model that is "reconfigurable" can be found in my Pattern Recognition paper mentioned in the above message and in several more resent papers. Although the example considered in the above paper uses nonnumeric operations (and representation) and does not require the learning machine to reconfigure itself, the issue of reconfigurability was one of the main driving motivations for the proposed model and is partly addressed in the more resent papers. Since the proposed model is also valid (and even becomes simpler) for vector patterns, an excellent topic (particularly for a thesis) would be an application of the model to vector patterns. I would be glad to help with the necessary missing details. Lev Goldfarb From KOKAR at northeastern.edu Fri Nov 9 12:09:00 1990 From: KOKAR at northeastern.edu (KOKAR@northeastern.edu) Date: Fri, 9 Nov 90 12:09 EST Subject: Intelligent Control Conference Message-ID: CALL FOR PAPERS 1991 IEEE INTERNATIONAL SYMPOSIUM ON INTELLIGENT CONTROL August 13-15,1991 Key Bridge Marriott Arlington, Virginia Sponsored by the IEEE Control Systems Society General Chairman: Harry E. Stephanou, Rensselaer Polytechnic lnstitute Program Chairman: Alexander H. Levis, George Mason University Finance Chairman: Elizabelh R. Ducot, MlT Lincoln Labs Registration Chairman : Umit Ozguner, Ohio State University Publications Chairman: Mieczyslaw Kokar, Northeastern University Local Arrangements: James E. Gaby, UNYSlS Corporation The 6th IEEE International Symposium on Intelligent Control (ISIC 91 ) will be held in conjunction with the 1991 IFAC Symposium on Distributed Intelligence Systems. Registrants in either symposium will be able to attend all technical and social events in both symposia and will receive preprint volumes from both. The ISIC 91 theme will be "Integrating Quantitative and Symbolic Processing". The design and analysis of automatic control systems have traditionally been based on rigorous, numerical techniques for modeling and optimization. Conventional controllers perform well in the presence of random disturbances, and can adapt to relatively small changes in fairly well known environments. Intelligent controllers are designed to operate in unknown environments and, therefore, require much higher levels of adaptation to unexpected events. They are also required to process and interpret large quantities of sensor data, and use the results for action planning or replanning. The design of intelligent controllers, therefore, incorporates heuristic and/or symbolic tools from artificial intelligence. Such tools which have traditionally been applied to open-loop, off-line problems, must now be integrated into the perception-reasoning-action closed loop of intelligent controllers. Effective methods for the integration of numerical and symbolic processing schemes are needed. Robustness and graceful degradation issues must be addressed. Reconfigurable feedback loops at varying levels of abstraction should be considered. Papers are being solicited ior presentation at the Symposium and publication in the Symposium Proceedings. Topics include, but are not limited to, the following: Intelligent control architectures Reasoning under uncertainty Self-organizing systems Sensor-based robot control Fault detection and error recovery Cellular robotics Intelligent manufacturing control Microelectro-mechanical systems systems Discrete event systems Variable precision reasoning Concurrent engineering Active sensing and perception Neural network controllers Multisensor data fusion Hierarchical controllers Intelligent inspection Learning control systems Intelligent database systems Autonomous control systems Microelectronics,advanced materials, Knowledge representation for and other novel applications real-time processing Five copies of papers should be sent by February 15,1991 to: Professor Alexander H. Levis Dept. of ECE George Mason University Fairfax, VA 22030-4444 Telephone: 703-764-6282 A separate cover sheet with the name of the corresponding author, telephone and fax numbers, and e-mail address should also be included. Authors will be notified of acceptance by April 15, 1991. Accepted papers, in final camera ready form, will be due on May 15, 1991. Proposals for invited sessions and tutorial workshops are also solicited. Cohesive sessions focusing on successful applications are particularly encouraged. Requests for additional information and proposal submissions (by February 15, 1991) should be addressed to Professor Levis. Symposium Program Committee: Suguru Arimoto, University of Tokyo Vivek V, Badami, General Electric John Baras, University of Maryland Research Lab Piero Bonissone, General Electric Hamid Berenji, NASA Ames Research Lab V.T. Chien, National Science David B. Cooper, Brown University Foundation David A. Dornfeld, University Kenneth J. DeJong, George Mason of California, Berkeley University Judy A. Franklin, GTE Laboratories Masakazu Ejiri, Hitachi Janos Gertler, George Mason Univesity Roger Geesey, BDM International Roderic Grupen, University of George Giralt, LAAS Massachusetts William A. Gruver, University of Susan Hackwood, University of Kentucky California, Riverside Thomas Henderson, Uiversity of Utah Joseph K. Kearney, University of Pradeep Khosla, Carnegie Mellon Iowa University Yves Kodratoff, Universite de Paris Benjamin Kuipers, University of Texas, Michael B. Leahy, Air Force Institute Austin of Technology Gaston H. Lefranc, Universidad Catolica Ramiro Liscano, Nat'l Research Council Valparaiso of Canada Ronald Lumia, NIST Yukio Mieda, Honda Engineering Co.,Ltd Thang N. Nguyen, IBM Corporation Kevin M. Passino, Ohio State Michael A.Peshkin, Northwestern University University Roger T. Schappell, Martin Marietta Yoshiaki Shirai, Osaka University Marwan Simaan, University of Janos Sztipanovits, Vanderbilt Pittsburgh University Zuheir Tumeh, General Motors Research Kimon P. Valavanis, Northeastern Labs University Agostino Villa, Politecnico di Torino John Wen, Rensselaer Polytechnic Institute From zl at guinness.ias.edu Fri Nov 9 13:20:54 1990 From: zl at guinness.ias.edu (Zhaoping Li) Date: Fri, 9 Nov 90 13:20:54 EST Subject: No subject Message-ID: <9011091820.AA04621@guinness.ias.edu> THE INSTITUTE FOR ADVANCED STUDY Anticipates the opening of one or more positions in the area of mathematical biology with emphasis on neural systems and computations. The positions are at a postdoctoral level and are for two years beginning academic year 1991-92. Applicants should send a CV and arrange for three letters of recommendations to be sent directly to Dr. Joseph J. Atick, School of Natural Sciences Institute for Advanced Study Princeton, NJ 08540 Applications should be received no later than January 20, 1991. Women and minorities are encouraged to apply. From stolcke at ICSI.Berkeley.EDU Sat Nov 10 02:13:11 1990 From: stolcke at ICSI.Berkeley.EDU (stolcke@ICSI.Berkeley.EDU) Date: Fri, 09 Nov 90 23:13:11 PST Subject: Cluster analysis software sought Message-ID: <9011100713.AA10859@icsib12.Berkeley.EDU> I am looking for a tool that does hierarchical cluster analyis *and* is able to display the resulting tree in some high-quality graphics format (such as Postscript). The program I am currently using prints only a crude approximation using ASCII characters. It would truely appreciate it if anybody who knows where to find such beast could drop me a note. --Andreas Stolcke From jose at learning.siemens.com Mon Nov 12 06:11:18 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Mon, 12 Nov 90 06:11:18 EST Subject: NIPS update Message-ID: <9011121111.AA07507@learning.siemens.com.siemens.com> A Friendly Reminder to REGISTER and get a room; its getting late: IEEE Conference on Neural Information Processing Systems -Natural and Synthetic- Monday, November 26 - Thursday, November 29, 1990 Sheraton Denver Tech Center Denver, Colorado Mail Requests For Registration Material To: Kathie Hibbard NIPS*90 Local Committee Engineering Center University of Colorado Campus Box 425 Boulder, CO 80309-0425 hibbard at boulder.colorado.edu From sun at umiacs.UMD.EDU Mon Nov 12 13:59:59 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Mon, 12 Nov 90 13:59:59 -0500 Subject: No subject Message-ID: <9011121859.AA00284@neudec.umiacs.UMD.EDU> >I am interested in constructive algorithms for neural networks >(capable of generating new units and layers in the NN). Is there >any constructive algorithm for NN whose units have continuous >(e.g.: sigmoidal) activation function? The works of this kind back to 1987 may also include: (1)G.Z. Sun, H.H. Chen and Y.C. Lee, "A Novel Net that Learns Sequential Decision Processes", NIP Proceedings (1987),Denver. (2) G.Z. Sun, H.H.Chen and Y.C.Lee, "Parallel Sequential Induction Network: A New Paradiagm of Neural Network Architecture", ICNN Proceedings,(1988), San Diego. These two papers generalized the idea of conventional regression tree to generate the new NN units dynamically. The continuous activation fuction is used. This PSIN (Parallel Sequential Induction Net) algorithm integrates the parallel processing of neural net and sequential processing of decision tree together to form a "most efficient" multilayered neural net classifier. The efficiency measure is the information entropy. -G. Z. Sun University of Maryland From baird%icsia8.Berkeley.EDU at berkeley.edu Mon Nov 12 14:48:39 1990 From: baird%icsia8.Berkeley.EDU at berkeley.edu (Bill Baird) Date: Mon, 12 Nov 90 11:48:39 PST Subject: preprint - associative memory in oscillating cortex Message-ID: <9011121948.AA02070@icsia8> Preprint announcement: available by ftp from neuroprose Learning with Synaptic Nonlinearities in a Coupled Oscillator Model of Olfactory Cortex Bill Baird Depts. Mathematics, and Molecular and Cell Biology, U.C.Berkeley, Berkeley, Ca. 94720 Abstract A simple network model of olfactory cortex, which assumes only minimal coupling justified by known anatomy, can be analytically proven to function as an associative memory for oscillatory patterns. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long range excitatory connections. Using a local Hebb-like learning rule for primary and higher order synapses at the ends of the long range connections, the system can learn to store the kinds of oscillation amplitude and phase patterns observed in olfactory and visual cortex. Memory capacity is N/2 oscillatory attractors, N/4 chaotic attractors in an N node network. The network can be truely self-organizing because a synapse can modify itself according to it's own pre and postsynaptic activity during stimulation by an input pattern to be learned. The neurons of the neuron pools modeled here can be viewed as operating in the linear region of the usual sigmoidal axonal nonlinearity, and multiple memories are stored instead by the learned {\em synaptic} nonlinearities. Introduction We report recent results of work which seeks to narrow the gap that exists between physiologically detailed network models of real vertebrate cortical memory systems and analytically understood artificial neural networks for associative memory. The secondary olfactory sensory cortex known as prepyriform cortex is thought to be one of the clearest cases of a real biological network with associative memory function. Patterns of 40 to 80 Hz oscillation have been observed in the large scale activity (local field potentials) of olfactory cortex and visual neocortex, and shown to predict the olfactory and visual pattern recognition responses of a trained animal. Similar Observations of 40 Hz oscillation in retina, auditory cortex, motor cortex and in the EMG have been reported. It thus appears that cortical computation in general may occur by dynamical interaction of resonant modes, as has been thought to be the case in the olfactory system. Given the sensitivity of neurons to the location and arrival times of dendritic input, the sucessive volleys of pulses that are generated by the oscillation of a neural net may be ideal for the formation and reliable longe range transmission of the collective activity of one cortical area to another. The oscillation can serve a macroscopic clocking function and entrain or ``bind" the relevant microscopic activity of disparate cortical regions into a well defined phase coherent collective state or ``gestalt". This can overide irrelevant microscopic activity and help produce coordinated motor output. If this view is correct, then oscillatory network modules form the actual cortical substrate of the diverse sensory, motor, and cognitive operations now studied in static networks. It must ultimately be shown how those functions can be accomplished with oscillatory dynamics. ftp proceedure: unix> ftp cheops.cis.ohio-state.edu Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get baird.oscmem.ps.Z ftp> quit unix> uncompress baird.oscmem.ps.Z unix> lpr -P(your postscript printer) baird.oscmem.ps For background papers, send e-mail to baird at icsi.berkeley.edu, giving paper or e-mail address for Tex or Postscript output. From jt at cns.edinburgh.ac.uk Mon Nov 12 14:54:23 1990 From: jt at cns.edinburgh.ac.uk (Jay Buckingham) Date: Mon, 12 Nov 90 19:54:23 GMT Subject: Post-NIPS workshop on Associative Memory Message-ID: <2667.9011121954@cns.ed.ac.uk> There will be a 1-day post-NIPS workshop on Associative Memory this year. This is a short summary of what we plan to do. NEUROBIOLOGICALLY MOTIVATED ASSOCIATIVE MEMORY The 1990 ASSOCIATIVE MEMORY workshop will focus on theoretical issues relevant to the understanding of neurobiologically motivated associative memory models. We are organizing it into a set of discussions on key issues. In these discussions the workshop participants can explain how they have addressed the issue at hand in their work and hopefully improve their understanding via exchange with others. What we want are nuts-and-bolts discussions among people who have wrestled with these topics. Some of the key issues are: - Architectures for associative memories, with emphasis on partially connected, biologically motivated models, including multi-stage or modular architectures. - Synaptic learning rules - Performance measures and capacity - Information theoretic issues such as information efficiency - Thresholding techniques - Biological relevance of these models (Discussing, for example, how David Marr made functional statements about the cerebellum, hippocampus and neocortex with associative memory models.) This list is not exhaustive and the items are interrelated so we expect them to come up in various contexts. We will probably begin each discussion by formulating a few questions that the participants can address. So come with your questions about these topics and your ways of thinking about them. Jay Buckingham Cognitive Neuroscience University of Edinburgh 2 Buccleuch Place Edinburgh EH8 9LW, SCOTLAND Phone: (44 or 0) 31 667 1011 ext 6302 E-mail: jt%ed.cns at nsfnet-relay.ac.uk From dave at cogsci.indiana.edu Tue Nov 13 02:28:44 1990 From: dave at cogsci.indiana.edu (David Chalmers) Date: Tue, 13 Nov 90 02:28:44 EST Subject: Proceedings -- ConnectFest 1990 Message-ID: A small connectionist meeting, "ConnectFest 1990", was recently held at Indiana University. Here are selected highlights from the official proceedings. The full proceedings (about 3-4 times the size of this) are available on request from dave at cogsci.indiana.edu. The Frank Rosenblatt Memorial Award for best contribution was taken by Tim van Gelder, who has bravely allowed his name to be attached to his contribution at great risk to his career. For your very own ConnectFest T-shirt (only $8 each), send e-mail to Doug Blank, blank at copper.ucs.indiana.edu. Dave Chalmers (dave at cogsci.indiana.edu) Center for Research on Concepts and Cognition Indiana University. ========================================================================== Proceedings of ConnectFest 1990 -- Indiana University, November 3-4, 1990. Compiled and edited by Doug Blank and Dave Chalmers. (c) 1990, ConnectFest Publications. ----------------------------------------------------- Said the Boltzmann machine to the whore, When his temperature reached sixty-four, "This is such a great feeling I've slowed down my annealing. That way there's more time to explore." -- Mike Gasser ----------------------------------------------------- The folks from the journal Cognition Weren't more than a bad apparition. All's left of Fodor Is just a bad odor, And nobody's missin' Pylyshyn. -- Tim van Gelder ----------------------------------------------------- From Alex.Waibel at SPEECH2.CS.CMU.EDU Tue Nov 13 02:30:50 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Tue, 13 Nov 90 02:30:50 EST Subject: Job Openings at CMU Message-ID: The School of Computer Science at Carnegie Mellon University has several immediate positions at the level of Research Associate and Research Programmer. These positions are research positions without teaching obligations aimed at the development and evaluation of neural network based speech understanding systems. We are seeking strongly motivated, outstanding individuals who have a demonstrated background and research experience in at least one of the following areas and an interest and commitment to work in the others: o Speech Recognition and Understanding (Stochastic or Connectionist) o Neural Network Modeling and Connectionist System Design o System design and implementation, development of connectionist systems on fast parallel hardware, including various available supercomputers, signal processors and neural network chips. Applicants for the position of Research Associate should have completed their PhD and have a demonstrated ability to do independent innovative research in an area listed above. Applicants for the position of Research Programmer should have a B.S. or M.S. degree and be interested in performing independent research and in developing innovative system solutions. Carnegie Mellon University is one of the leading institutions in Computer Science and offers an exciting and challenging environment for research in the areas listed above. The connectionist group at CMU, is one of the most active research teams in this area in the US, and provides a stimulating environment for innovative research. It also maintains a lively interaction with other departments, including Psychology, the Center for Machine Translation, and Computational Linguistics. Interested individuals should send a complete curriculum vitae listing three references to: Dr. Alex Waibel School of Computer Science Carnegie Mellon University Pittburgh, PA 15213 Email: waibel at cs.cmu.edu From pratt at paul.rutgers.edu Tue Nov 13 09:18:42 1990 From: pratt at paul.rutgers.edu (Lorien Y. Pratt) Date: Tue, 13 Nov 90 09:18:42 EST Subject: Announcing a NIPS '90 workshop comparing decision trees and neural nets Message-ID: <9011131418.AA03376@paul.rutgers.edu> Neural Networks and Decision Tree Induction: Exploring the relationship between two research areas A NIPS '90 workshop, 11/30/1990 or 12/1/1990, Keystone, Colorado Workshop Co-Chairs: L. Y. Pratt and S. W. Norton The fields of Neural Networks and Machine Learning have evolved separately in many ways. However, close examination of multilayer perceptron learning algorithms (such as Back-Propagation) and decision tree induction methods (such as ID3 and CART) reveals that there is considerable convergence between these subfields. They address similar problem classes (inductive classifier learning) and can be characterized by a common representational formalism of hyperplane decision regions. Furthermore, topical subjects within both fields are related, from minimal trees and network reduction schemes to incremental learning. In this workshop, invited speakers from the Neural Network and Machine Learning communities will discuss their empirical and theoretical comparisons of the two areas, and then present work at the interface between these two fields which takes advantage of the potential for technology transfer between them. In a discussion period, we'll discuss our conclusions, comparing the methods along the dimensions of representation, learning, and performance. We'll debate the ``strong convergence hypothesis'' that these two research areas are really studying the same problem. Schedule of talks: AM: 7:30-7:50 Lori Pratt Introductory remarks 7:50-8:10 Tom Dietterich Evidence For and Against Convergence: Experiments Comparing ID3 and BP 8:15-8:35 Les Atlas Is backpropagation really better than classification and regression trees? 8:40-9:00 Ah Chung Tsoi Comparison of the performance of some popular machine learning algorithms: CART, C4.5, and multi-layer perceptrons 9:05-9:25 Ananth Sankar Neural Trees: A Hybrid Approach to Pattern Recognition PM: 4:30-4:55 Stephen Omohundro A Bayesian View of Learning with Tree Structures and Neural Networks 5:00-5:20 Paul Utgoff Linear Machine Decision Trees 5:25-5:45 Terry Sanger Basis Function Trees as a Generalization of CART, MARS, and Other Local Variable Selection Techniques 5:50-6:30 Discussion, wrap-up ------------------------------------------------------------------------------ L. Y. Pratt S. W. Norton pratt at paul.rutgers.edu, norton at learning.siemens.com Rutgers University Computer Science Dept. Siemens Corporate Research New Brunswick, NJ 08903. 755 College Road East (201) 932-4634 Princeton, NJ 08540 (609) 734-3365 From PEGJ at oregon.uoregon.edu Tue Nov 13 11:12:00 1990 From: PEGJ at oregon.uoregon.edu (Peggy Jennings) Date: Tue, 13 Nov 90 08:12 PST Subject: real thinking Message-ID: <05B83B73F77F40AB90@oregon.uoregon.edu> This is an open letter to those of you who assume that men are connectionists and women are whores. I am sick of having to point out to you that women are not only connectionists but are humans as well. I can't describe how angry it makes me to log onto the network and find a series of limericks, some of them clever and funny to all of us, others insulting and denigrating to women. What next? Sambo jokes? I wish you guys would use your real brains and think about others than yourselves. Get a clue. _____________________ Peggy J. Jennings Dept. of Psychology Univ. of Oregon From sun at umiacs.UMD.EDU Tue Nov 13 14:11:28 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Tue, 13 Nov 90 14:11:28 -0500 Subject: ~m Message-ID: <9011131911.AA01209@neudec.umiacs.UMD.EDU> From ml-connectionists-request at Q.CS.CMU.EDU Tue Nov 13 03:09:35 1990 Received: from skippy.umiacs.umd.edu by neudec.umiacs.UMD.EDU (5.61/UMIACS-0.9/04-05-88) id AA00957; Tue, 13 Nov 90 03:09:34 -0500 Received: from Q.CS.CMU.EDU by skippy.umiacs.UMD.EDU (5.61/UMIACS-0.9/04-05-88) id AA06534; Tue, 13 Nov 90 03:09:31 -0500 Received: from Q.CS.CMU.EDU by Q.CS.CMU.EDU id aa22519; 12 Nov 90 14:11:22 EST Received: from cs.cmu.edu by Q.CS.CMU.EDU id aa22517; 12 Nov 90 14:05:09 EST Received: from neudec.umiacs.umd.edu by CS.CMU.EDU id aa20292; 12 Nov 90 14:00:08 EST Received: by neudec.umiacs.UMD.EDU (5.61/UMIACS-0.9/04-05-88) id AA00284; Mon, 12 Nov 90 13:59:59 -0500 Date: Mon, 12 Nov 90 13:59:59 -0500 From: Guo-Zheng Sun Message-Id: <9011121859.AA00284 at neudec.umiacs.UMD.EDU> To: connectionists at CS.CMU.EDU Status: RO >I am interested in constructive algorithms for neural networks >(capable of generating new units and layers in the NN). Is there >any constructive algorithm for NN whose units have continuous >(e.g.: sigmoidal) activation function? The works of this kind back to 1987 may also include: (1)G.Z. Sun, H.H. Chen and Y.C. Lee, "A Novel Net that Learns Sequential Decision Processes", NIP Proceedings (1987),Denver. (2) G.Z. Sun, H.H.Chen and Y.C.Lee, "Parallel Sequential Induction Network: A New Paradiagm of Neural Network Architecture", ICNN Proceedings,(1988), San Diego. These two papers generalized the idea of conventional regression tree to generate the new NN units dynamically. The continuous activation fuction is used. This PSIN (Parallel Sequential Induction Net) algorithm integrates the parallel processing of neural net and sequential processing of decision tree together to form a "most efficient" multilayered neural net classifier. The efficiency measure is the information entropy. -G. Z. Sun University of Maryland this is a test of mail tilde escape From stolcke at ICSI.Berkeley.EDU Mon Nov 12 23:34:53 1990 From: stolcke at ICSI.Berkeley.EDU (stolcke@ICSI.Berkeley.EDU) Date: Mon, 12 Nov 90 20:34:53 PST Subject: Summary: Cluster Analysis software Message-ID: <9011130434.AA00821@icsib12.Berkeley.EDU> Here is a short summary of the replies I got to my request for clustering software. Thanks to all who replied, the problem is more that solved. A couple of home-brewn programs were mentioned. One that apparently has been floating around quite a bit is "cluster" by Yoshiro Miyata. This turned out to be the program that draws cluster trees in ASCII that I had mentioned in my original message. Many people mentioned "S" (or "S+" in a more recent version), a general statistics package that does cluster analysis among other things. Reportedly PostScript and X Windows graphics output are supported. This is commercial software, the contact address is STATSCI, PO Box 85625, Seattle, WA 98145-1625, 206-322-8707 (standard disclaimer: I have no personal interest in the economic success of this company, I don't even know their product yet). There is also a book about S by R.A. Becker & J.M. Chambers, "S: An Inter- active Environment for Data Analysis and Graphics," Wadsworth, Belmont, Ca., 1984 (2nd edition, probably not the most recent one). An alternative (possibly less expensive, but then I have no pricing info yet), might be Berkeley's BLSS software package, available through BLSS Project, Dept. of Statistics, University of California, Berkeley, CA 94720, 415-642-5258, blss at back.berkeley.edu. Again, there's a book out, by D.M. Abrahams & F. Rizzardi, "BLSS: The Berkeley Interactive Statistical System," W.W. Norton & Company, 1988. Finally, since I needed something FAST, I created my own solution. I added a couple of features to Miyata's cluster program to make it more useful in conjunction with other programs. It now can be used conveniently as a UNIX filter, and optionally produces device-independent graphics output that may be rendered on a variety of output media through the family of standard UNIX programs dealing with plot(5) format (including PostScript and X Windows previewing). Other features added include options for output format selection, distance metric selection, and scaling of individual dimensions. I have done absolutely nothing to improve the algorithm used to do the actual clustering, since my data sets are typically small and I needed to quickly hack up something that produces acceptable output. (The complexity of the algorithm currently used is n^3, whereas n log n is feasible, I am told.) The program is written modularly enough so that it shouldn't be too hard to plug in a better algorithm if available. The program as it stands now should be flexible enough to be of public use. It could be a poor man's solution for those who need to produce cluster analyses but do not have access to fancy statistics software packages or don't want to deal with one. I'll make it available through anonymous FTP from icsi-ftp.berkeley.edu (128.32.201.55). To get it use FTP as follows: % ftp icsi-ftp.berkeley.edu Connected to icsic.Berkeley.EDU. 220 icsi-ftp (icsic) FTP server (Version 5.60 local) ready. Name (icsic.Berkeley.EDU:stolcke): anonymous Password (icsic.Berkeley.EDU:anonymous): 331 Guest login ok, send ident as password. 230 Guest login Ok, access restrictions apply. ftp> cd pub 250 CWD command successful. ftp> binary 200 Type set to I. ftp> get cluster.tar.Z 200 PORT command successful. 150 Opening BINARY mode data connection for cluster.tar.Z (15531 bytes). 226 Transfer complete. 15531 bytes received in 0.08 seconds (1.9e+02 Kbytes/s) ftp> quit 221 Goodbye. Then unpack and compile using % zcat cluster.tar.Z | tar xf - % make Please don't ask me to mail it if you can get it through FTP. Of course I cannot guarantee the usefulness of the program or promise any level of support etc. However, I'd like to know of any bug fixes and improvements, especially to the algorithm (hint, hint ...) Again, thanks for all the replies, Andreas From gary%cs at ucsd.edu Tue Nov 13 15:38:09 1990 From: gary%cs at ucsd.edu (Gary Cottrell) Date: Tue, 13 Nov 90 12:38:09 PST Subject: real thinking Message-ID: <9011132038.AA10517@desi.ucsd.edu> I totally agree with Peggy. I find several of those limericks really offensive. Degrading women degrades the degrader. g. From gary%cs at ucsd.edu Tue Nov 13 15:43:05 1990 From: gary%cs at ucsd.edu (Gary Cottrell) Date: Tue, 13 Nov 90 12:43:05 PST Subject: real thinking Message-ID: <9011132043.AA10527@desi.ucsd.edu> PS. As long as we're on the subject, I also find the continued use of the "Missionaries and Cannibals" problem in AI as really poor taste. I usually present it to my class as the "Masai and missionaries" problem - the goal being for the Masai to avoid being converted by the missionaries. g. From ISAI at TECMTYVM.BITNET Tue Nov 13 19:23:35 1990 From: ISAI at TECMTYVM.BITNET (Centro de Inteligencia Artificial(ITESM)) Date: Tue, 13 Nov 90 18:23:35 CST Subject: CALL FOR PAPERS Message-ID: <901113.182335.CST.ISAI@TECMTYVM> Please include the following information in your bulletin board. Thank you in advance. Sincerely, The Symposium Publicity Committee. ====================================================================== INTERNATIONAL SYMPOSIUM ON ARTIFICIAL INTELLIGENCE APPLICATIONS IN INFORMATICS: Software Engineering, Data Base Systems, Computer Networks, Programming Environments, Management Information Systems, Decision Support Systems. C A L L F O R P A P E R S Preliminary Version. The Fourth International Sysmposium on Artificial Intelligence will be held in Cancun Mexico on November 13-15, 1991. The Symposium is sponsored by the ITESM (Instituto Tecnologico y de Estudios Superiores de Monterrey) in cooperation with the International Joint Conferences on Artificial Intelligence Inc., the American Association for Artificial Intelligence, the Canadian Society for Computational Studies of Intelligence, the Sociedad Mexicana de Inteligencia Artificial and IBM of Mexico. Papers from all countries are sought that: (1) Present applications of artificial intelligence technology to the solution of problems in Software Engineering, Data Base Systems, Computer Networks, Programming Environments, Management Information Systems, Decision Support Systems and other Informatics technologies; and (2) Describe research on techniques to accomplish such applications, (3) Address the problem of transfering the AI Technology in different socio-economic contexts and environments. Areas of application include but are no limited to: Software development, software design, software testing and validation, computer-aided software engineering, programming environments, structured techniques, intelligent databases, operating systems, intelligent compilers, local networks, computer network design, satellite and telecommunications, MIS and data processing applications, intelligent decision support systems. AI techniques include but are not limited to: Expert systems, knowledge acquisition and representation, natural language processing, computer vision, neural networks and genetic algorithms, automated learning, automated reasoning, search and problem solving, knowledge engineering tools and methodologies. Persons wishing to submit a paper should send five copies written in English to: Hugo Terashima, Program Chair Centro de Inteligencia Artificial, ITESM. Ave. Eugenio Garza Sada 2501, Col.Tecnologico C.P. 64849 Monterrey, N.L. Mexico Tel.(52-83) 58-2000 Ext.5134 Telefax (52-83) 58-1400 Dial Ext.5143 or 58-2000 Ask Ext.5143 Net address: ISAI at tecmtyvm.bitnet or ISAI at tecmtyvm.mty.itesm.mx The paper should identify the area and technique to which it belongs. Extended abstract is not required. Use a serif type font, size 10, sigle-spaced with a maximum of 10 pages. No papers will be accepted by electronic means. Important dates: Papers must be received by April 30,1991. Authors will be notified of acceptance or rejection by June 15,1991. A final copy of each accepted paper, camera ready for inclusion in the Symposium proceedings, will be due by July 15,1991. ======================================================================= From pollack at cis.ohio-state.edu Tue Nov 13 21:22:08 1990 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Tue, 13 Nov 90 21:22:08 -0500 Subject: real thinking In-Reply-To: Peggy Jennings's message of Tue, 13 Nov 90 08:12 PST <05B83B73F77F40AB90@oregon.uoregon.edu> Message-ID: <9011140222.AA01478@dendrite.cis.ohio-state.edu> **Do Not Forward** As one of the persons responsible for the Connectfest, I apologize to Peggy Jennings for all the hidden units of misogyny in the abridged proceedings recently circulated. One has to read very closely to find: "I recursively RAAM it/with Symbols Goddammit" (Brutality?) "We only had backpropagation." (Sodomy?) "Your bottom feels better/Sliding over each letter" (slave branding?) "a willing basin of attraction" (body-part objectification?) I also apologize to the women from Bloomington who were present during the sophomoric smut contest, and must have felt uncomfortable with us males engaging in such a "connectionismo" poetry competition. However, I think the charge of sexism would be much more effective if offered with the correct meter and form: The Boltzwoman yelled at the chauvinist: "You used the word "whore" and I'm pissed! I'll clamp down on your nodes, anneal your temporal lobes, And your genetic code will be fixed!" Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Fax/Phone: (614) 292-4890 From harnad at clarity.Princeton.EDU Wed Nov 14 09:15:09 1990 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Wed, 14 Nov 90 09:15:09 EST Subject: Netiquette Message-ID: <9011141415.AA12499@psycho.Princeton.EDU> > Date: Tue, 13 Nov 90 12:38:09 PST > From: Gary Cottrell > > ...I find several of those limericks really offensive. > Degrading women degrades the degrader... > > ...I also find the continued use of the "Missionaries and > Cannibals" problem in AI as really poor taste... Is there more than one Gary Cottrell on the net? Or has the one that posted the "Rumelphart" piece just a few weeks ago undergone some sort of an aesthetic conversion experience? Stevan Harnad From fritz_dg%ncsd.dnet at gte.com Wed Nov 14 09:15:26 1990 From: fritz_dg%ncsd.dnet at gte.com (fritz_dg%ncsd.dnet@gte.com) Date: Wed, 14 Nov 90 09:15:26 -0500 Subject: missionaries and cannibals Message-ID: <9011141415.AA13520@bunny.gte.com> Open to g. Masai and missionaries still in poor taste, as someone remains the butt of the joke. Think up something new and neutral. Also, in future, please provide address so reply needn't be broadcast. fritz_dg%ncsd at gte.com From hendler at cs.UMD.EDU Wed Nov 14 09:33:28 1990 From: hendler at cs.UMD.EDU (Jim Hendler) Date: Wed, 14 Nov 90 09:33:28 -0500 Subject: real thinking Message-ID: <9011141433.AA07326@dormouse.cs.UMD.EDU> Before this debate gets too political (and I do agree with Peggy and Gary), I'd like to point out that all of the limericks posted were in the modern limerick form, in which the first and fifth lines rhyme, but don't recap. The traditional limerick form, made most popular by Lear, has a reiteration of first and fifth. Here's an example in the traditional form to make up for this lack: Said a wizened old grad with a net, "Alas, It has not converged yet!" The learn rate? why 1, Momentum? ummm ... none. That frustrated grad with a net. -J Hendler From pastor at PRC.Unisys.COM Wed Nov 14 10:58:13 1990 From: pastor at PRC.Unisys.COM (pastor@PRC.Unisys.COM) Date: Wed, 14 Nov 90 10:58:13 EST Subject: Stop it right now, please Message-ID: <9011141558.AA01269@gem.PRC.Unisys.COM> I hesitate to get involved in this, but I think that the limerick "discussion" has moved outside the scope of connectionists. A news group/mailing list that consistently provides challenging, interesting, and *pertinent* discussion (and *no* flames or invective) is a rare animal these days, and I hate to see this one jeopardized. I am not commenting on the merits of either party's arguments, or on the importance of the topic; all I'm saying is that it has nothing to do with connectionism per se. I would like to suggest that the rest of this apparently-escalating (and probably irresolvable) conflict be continued via direct e-mail among the combatants, or on a newsgroup that is more appropriate to its content. Thanks to all concerned, and good luck in resolving your disagreements. From kimd at gizmo.usc.edu Wed Nov 14 14:09:42 1990 From: kimd at gizmo.usc.edu (Kim Daugherty) Date: Wed, 14 Nov 1990 11:09:42 PST Subject: Hierarchical Representations Message-ID: We are interested in representing variable sized hierarchical tree structures as activations over a fixed number of units in a connectionist model. We are aware of two current approaches, Touretzky's BoltzCONS and Pollack's RAAM, that perform this task. Have other approaches been developed as well? We are also looking for some connectionist modeling tools. If anybody has or knows of any simulator that fits some or all of the following description, please contact Kim Daugherty (email below): (1) RAAM simulator written in C or a nice backprop simulator written in C that can be easily converted to a RAAM, (2) A graphical interface for unit activations and weights, and (3) Something written to work under UNIX hosted on a SUN. Thanks in advance for help. We will summarize responses and post to the mailing list if appropriate. Mark Seidenberg Kim Daugherty Neuroscience Program Neuroscience Program University of Southern California University of Southern California marks at neuro.usc.edu kimd at gizmo.usc.edu From LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU Tue Nov 13 23:19:00 1990 From: LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU (LAI-WAN CHAN) Date: Wed, 14 Nov 90 12:19 +0800 Subject: real thinking Message-ID: <04CC9E29C53F20023C@CUCSD.CUHK.HK> > From: Peggy Jennings > Subject: real thinking > To: connectionists at CS.CMU.EDU > Message-id: <05B83B73F77F40AB90 at oregon.uoregon.edu> > This is an open letter to those of you who assume that men are > connectionists and women are whores. I am sick of having to point out to > you that women are not only connectionists but are humans as well. I can't > describe how angry it makes me to log onto the network and find a series of > limericks, some of them clever and funny to all of us, others insulting and > denigrating to women. What next? Sambo jokes? I wish you guys would use > your real brains and think about others than yourselves. Get a clue. Agreed. I, as a woman, am expressing my anger too. Funny jokes are good but don't be insulting. Lai-Wan Chan, Computer Science Dept, The Chinese University of Hong Kong. From wilson at cs.UMD.EDU Wed Nov 14 13:17:38 1990 From: wilson at cs.UMD.EDU (wilson@cs.UMD.EDU) Date: Wed, 14 Nov 90 13:17:38 -0500 Subject: real thinking Message-ID: <9011141817.AA14886@brillig.cs.UMD.EDU> I too found some of the limericks to be offensive and showing a great insensitivity, and am glad to see a few others agree. But because I'm afraid we may be in the minority I feel compelled to add my voice. Although I believe Jordan Pollack had the best intentions, he may have inadvertently trivialized the issue by making light of it. Sigh. Will we ever learn?? Anne Wilson From dave at cogsci.indiana.edu Wed Nov 14 16:36:41 1990 From: dave at cogsci.indiana.edu (David Chalmers) Date: Wed, 14 Nov 90 16:36:41 EST Subject: limericks Message-ID: I apologize to anyone who was offended. The organizers felt, perhaps naively, that none of the limericks were degrading, being sexual rather than sexist. Sorry for cluttering connectionists with all this. Dave Chalmers. From gary%cs at ucsd.edu Wed Nov 14 16:39:02 1990 From: gary%cs at ucsd.edu (Gary Cottrell) Date: Wed, 14 Nov 90 13:39:02 PST Subject: Netiquette Message-ID: <9011142139.AA11426@desi.ucsd.edu> >From: Stevan Harnad >> Date: Tue, 13 Nov 90 12:38:09 PST >> From: Gary Cottrell >> >> ...I find several of those limericks really offensive. >> Degrading women degrades the degrader... >> >> ...I also find the continued use of the "Missionaries and >> Cannibals" problem in AI as really poor taste... > >Is there more than one Gary Cottrell on the net? Or has the one that >posted the "Rumelphart" piece just a few weeks ago undergone some sort >of an aesthetic conversion experience? > >Stevan Harnad I think there's a big difference between satirizing individuals and using notions that degrade a class of people. The notion of "whore" is something that applies only to women. You can argue about this, and *we should take this argument off the net*, but there is no equivalent notion for men. It is an inherently sexist concept. Yes, the fake names I use in my humor are sometimes cheap shots. Sometimes they are just the closest funniest similar sounding thing I can think of, and not meant to attribute any particular qualities to the individuals so targeted. As far as I know my farts are the worst around. gary From chan at unb.ca Wed Nov 14 15:58:47 1990 From: chan at unb.ca (Tony Chan) Date: Wed, 14 Nov 90 16:58:47 AST Subject: Peggy Jennings Message-ID: I came from a culture, where, as far as I know, feminism is assumed and femininity is worshipped. I unreservedly agree with Peggy on this issue. Sexism aside, I find it sickening to learn that the cream of the crop of this culture wastes so much of the resources---mental, electronic, and others---on non-scientific matters. Being a member of this list, I feel I must apologize sincerely to women on the list even though I have absolutely nothing to do with Jordan Pollock or the Connectfest that he was partly responsible for. Tony Chan Telephone: 506 648-5500 P.O. Box 5050 Fax: 506 648-5528 Computer Science Email: chan at unb.ca UNB Saint John Campus Saint John, New Brunswick E2L 4L5 Canada From Connectionists-Request at CS.CMU.EDU Thu Nov 15 10:19:56 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Thu, 15 Nov 90 10:19:56 EST Subject: Fwd: CALL FOR PAPERS Message-ID: <18295.658682396@B.GP.CS.CMU.EDU> ------- Forwarded Message From wilson at magi.ncsl.nist.gov Wed Nov 14 13:16:00 1990 From: wilson at magi.ncsl.nist.gov (Charles Wilson x2080) Date: Wed, 14 Nov 90 13:16:00 EST Subject: CALL FOR PAPERS Message-ID: <9011141816.AA11580@magi.ncsl.nist.gov> CALL FOR PAPERS PROGRESS IN NEURAL NETWORKS This is a call for papers for Special Volume of the Progress In Neural Networks Series. This volume will concentrate on software implementation of neural networks, natural and synthetic. Contributions from leading researchers and experts will be sought. This series is intended for a wide audience, including those professionally involved in neural network research, such as lecturers and primary investigators in neural computing, neural modeling, neural learning, neural memory, and neurocomputers. Authors are invited to submit an abstract, extended summary, or manuscripts describing recent progress in theoretical analysis, modeling, or design of neural network architecture. The manuscripts should be self contained and of a tutorial nature. Suggested topics include, but are not limited to: * Parallel Architectures * Distributed Architectures * Hybrid Architectures * Parallel Learning Models * Connectionist Architectures * Associative Memory * Self Organizing and Adaptive Systems * Neural Net Language)to)Architecture Translation Ablex and the editors invite you to submit an abstract, extended summary, or manuscript proposal for consideration. Please contact the editors directly. Omid M. Omidvar, Associate Professor Charles L. Wilson, Manager Progress Series Editor Associate Volume Editor University of the District of Columbia Image Recognition Group Computer Science Department Advanced Systems Division 4200 Connecticut Avenue, N.W. National Inst. of Tech & Stand. Washington, D.C. 20008 Gaithersburg, Maryland 20899 Tel:(202)282-7345 Fax:(202)282-3677 Tel:(301)975-2080 Fax:(301)590-0932 Email: OOMIDVAR at UDCVAX.BITNET Email:Wilson at MAGI.NCSL.NIST.GOV Publisher:Ablex Publishing Corp.,355 Chestnut St., Norwood,NJ 07648 (201)767)8450 ------- End of Forwarded Message From AC1MPS at primea.sheffield.ac.uk Thu Nov 15 16:02:49 1990 From: AC1MPS at primea.sheffield.ac.uk (AC1MPS@primea.sheffield.ac.uk) Date: Thu, 15 Nov 90 16:02:49 Subject: super-Turing systems Message-ID: Two points: (1) All this chat about limericks is irrelevant to the actual purpose of this network, as I understand it. Some subscribers have to pay to receive this stuff - others have better things to do with their time than wade through it all to find the bits that matter. Please stop it. (2) super-Turing systems ==================== For some time now I've been interested in identifying systems that seem to possess 'super-Turing' potential, in the sense that they can implement/model processes which are provably not modellable by a Turing machine program. A few weeks ago, the Pour-El work was broadcast, showing that some aspects of analog computation were super-Turing. Other examples exist as well. Does anyone know of any others, besides the ones listed below? a) Pour-El et al (cited in earlier broadcast) The wave equation with computable initial conditions can assume non-computable values after computable time. b) Myhill (cited in Pour-El's work) A recursive function can have non-recursive derivative. c) An analog version of the Turing machine can solve the Turing halting problem. (myself: "X-machines and the Halting Problem ... ", to appear in Formal Aspects of Computing 1990(4)). d) An extended version of the universal quantum computer (Deutsch 1985) can implement unbounded non-determinism. (myself: work in progress, unpublished) [Reference to Deutsch's work: "Quantum theory, the Church-Turing principle, and the universal quantum computer", Proc R.Soc.London A400 (1985) pp. 97-117]. e) Standard concurrency specification languages can represent unbounded non-determinism (e.g. CCS using infinite summation). f) Smolin/Bennett Quantum mechanics can be used in cryptology. Claimed in Deutsch 1989 (Quantum communication thwarts eavesdroppers, in New Scientist, December 9th 1989) that their QPKD system is the first physical information processing device with capabilities beyond those of the Turing machine. I'm currently trying to decide how (if?) these models can be unified, and whether any of them might reasonably be implemented within biological systems. Is anyone working on similar problems? Mike Stannett, AC1MPS @ uk.ac.shefpa From kanal at cs.UMD.EDU Thu Nov 15 11:50:37 1990 From: kanal at cs.UMD.EDU (Laveen N. Kanal) Date: Thu, 15 Nov 90 11:50:37 -0500 Subject: Announcing a NIPS '90 workshop comparing decision trees and neural nets Message-ID: <9011151650.AA03106@brillig.cs.UMD.EDU> Your information has come too late for me to take advantage and be present aty this interesting workshop. Many of the papers in the A.I literature and statistics literature on the topic of decision trees have ingnored other literature on the subject in pattern recogntion. For those who are interested an extensive survey appears in " decision trees in Pattern recognition" by G.R. Dattatreya and L.N. Kanal, in Progress in pattern recognition 2, L.kanal and A. Rosenfeld (editors), North-Holland, 1985. Please let me know if you are going to put together the papers presented at your workshop as I would be most interested in seing them. Thanks, laveen kanal From lina at ai.mit.edu Thu Nov 15 11:36:41 1990 From: lina at ai.mit.edu (Lina Massone) Date: Thu, 15 Nov 90 11:36:41 EST Subject: limericks Message-ID: <9011151636.AA01275@cauda-equina> I too feel compelled to add my voice. So that, hopefully, people will do some more thinking before sending offensive, not-funny jokes on the network. Lina Massone From jai at sol.boeing.com Thu Nov 15 13:49:53 1990 From: jai at sol.boeing.com (Jai Choi) Date: Thu, 15 Nov 90 10:49:53 PST Subject: NN for image pre-processing Message-ID: <9011151849.AA09339@sol.boeing.com> Hi, my name is Jai Choi I found this connectionist letter very useful to get new information. As you know NN has proven to be well suited for the low-level image pre-processing. We can take advantage of speed over conventional convolution technique. I am currently trying to implement NN processor for examining an X-ray image with hard-wired neural network. Do anybody know of any kind of hardware development for this problem ? Please share information regarding NN application for low level image pre-processing either software or hardware aspect. My e-mail address is jai at sol.boeing.com. My physical address is Jai Choi Boeing Computer Services P.O.Box 24346, MS 6R-08 Seattle, WA 98124. I will summarize and post the result later. From pratt at paul.rutgers.edu Thu Nov 15 18:03:01 1990 From: pratt at paul.rutgers.edu (Lorien Y. Pratt) Date: Thu, 15 Nov 90 18:03:01 EST Subject: Request for references to adaptive MLP/BP networks (i.e. changing training data) Message-ID: <9011152303.AA12431@paul.rutgers.edu> Hello, I'm collecting references for papers which address the behavior of back-propagation-like networks when the training data changes. My bibliography so far of papers that look applicable is reproduced below this message. I'd appreciate any pointers that you might have to other papers, especially recent ones, or where the fact that the training data changes is not obvious from the title. Thanks in advance! -- Lori ------------------------------------------------------------------------------ @techreport{ weigend-90, MYKEY = " weigend-90 : .unb .csy .con .meth", TITLE = "Predicting the Future: A Connectionist Approach", AUTHOR = "Andreas S. Weigend and Bernardo A. Huberman and David E. Rumelhart", YEAR = 1990, MONTH = "April", INSTITUTION = "Stanford PDP Research Group", ADDRESS = "Stanford, California 94305-2130", NUMBER = "Stanford-PDP-90-01, PARC-SSl-90-20", } @misc{ tenorio-89a, MYKEY = " tenorio-89a : .bap .unr10 .unb .csy .app .con ", TITLE = "{Adaptive Networks as a Model for Human Speech Development}", NOTE = "A large scaled-up back-propagation study", KEY = "tenorio-89a" } @INCOLLECTION{ barron-84, MYKEY = " barron-84 : .unf .dt ", AUTHOR = {R. L. Barron and A. N. Mucciardi and F. J. Cook and J. N. Craig and A. R. Barron}, TITLE = {Adaptive Learning Networks: Development and Application in the {United} {States} of Algorithms related to {GMDH}}, BOOKTITLE = {Self-Organizing Methods in Modeling}, PUBLISHER = {Marcel Dekker}, YEAR = {1984}, EDITOR = {S. J. Farlow}, address = "{New} {York}" } @misc{ nguyen-89, MYKEY = " nguyen-89: .hop .blt .con .cnf ", TITLE = "{A New LMS-Based Algorithm for Rapid Adaptive Classification in Dynamic Environments}", KEY = "nguyen-89" } Allen, R.B. Adaptive Training of Connectionist State Machines. ACM Computer Science Conference, Louisville, Feb, 1989, 428. --References for which I don't have full bibliographic information:-- Adaptive Pattern Recognition and Neural Networks by Y. Pao >From Von Nostran Reinhold: NEURAL NETWORK ARCHITECTURES An Introduction by Judith Dayhoff, Ph.D., Editor of the Journal of Neural Network Computing Contents include: Associative and Adaptive Networks - More Paradigms The book: ``Neural Computing'' From slehar at park.bu.edu Thu Nov 15 19:06:43 1990 From: slehar at park.bu.edu (slehar@park.bu.edu) Date: Thu, 15 Nov 90 19:06:43 -0500 Subject: real thinking In-Reply-To: connectionists@c.cs.cmu.edu's message of 15 Nov 90 01:03:18 GM Message-ID: <9011160006.AA24654@thalamus.bu.edu> When the meaning of every joke Is perceived as a sexist poke The problem may lie In the reader's own eye That they can't laugh like regular folk :-) From french at cogsci.indiana.edu Fri Nov 16 08:52:32 1990 From: french at cogsci.indiana.edu (Bob French) Date: Fri, 16 Nov 90 08:52:32 EST Subject: No subject Message-ID: If one were to simply effect a count of the responses to Ms. Peggy Jennings' lambasting of the limericks posted recently on connectionists, one might conclude that there is significant support of her view. I believe this is wrong. I think it probable that that those who disagree with her have simply kept their mouths shut for fear of being branded as sexist for their disagreement. Allow me to buck that trend. I wish to speak on behalf of Mike Gasser (author of the Offending Limerick) and Dave Chalmers (one of the organizers of the Offending Contest in which the Offending Limerick was read and responsible for posting It to the net), both of whom are good friends of mine. Each of them is absolutely aware, in theory and in practice, in their speech and in their writings of the issues of sexism, sexist language, etc. and it pains me to see the inflammatory remarks of the Oregon connectionist directed at them. Stauncher supporters of women in science than Mike and Dave you would be very hard pressed to find. I feel that a Ms. Jennings' comments were an over-reaction verging on paranoia ("men are connectionists and women are whores"). While one might argue that the limerick about the Boltzman machine might have been mildly offensive because of its use of an apparently proscribed word ("whore"), under no circumstances did it merit such an extremely virulent response. I personally thought the limerick in question was funny and, judging by the reaction of the audience of men AND WOMEN at the ConnectFest here in Bloomington, so did they. And, no, the participating women were not unthinking minions to male domination. They, like the guys there, simply had a sense of humor. And now, hopefully, back to connectionism. Bob French Center for Research on Concepts and Cognition, Indiana University From petsche at learning.siemens.com Fri Nov 16 09:20:57 1990 From: petsche at learning.siemens.com (Thomas Petsche) Date: Fri, 16 Nov 90 09:20:57 EST Subject: Peggy Jennings In-Reply-To: Message-ID: <9011161420.AA20254@learning.siemens.com.siemens.com> Perhaps we could all get off our high horses now and stop bashing Jordan and his Connectionfest now. In response to Tony Chan <@bitnet.cc.cmu.edu:chan at unb.ca>: Actually, I spend approximately half my waking hours engaged in all sorts of activity that "wastes so much of the resources---mental, electronic, and others---on non-scientific matters". Eg, cooking, cleaning, house repairs, playing with children, and yes, even childish joking with friends (my goodness, (nope, no profanity here!) most of them have graduate degrees -- what is the world coming to?). I'm ashamed to admit it, but I even exchange email (would you believe it, wasting all that bandwidth!) *containing winking smileys* with my wife. Hope this sexual (sexist?) revelation doesn't offend anyone. ... I was interested to discover that there is no such thing as a male prostitute ... hmm, what was that movie, "American ...". Don't tell me ... I'll get it ... starts with a "G" ... From Michael.Witbrock at MJW.BOLTZ.CS.CMU.EDU Fri Nov 16 10:28:34 1990 From: Michael.Witbrock at MJW.BOLTZ.CS.CMU.EDU (Michael.Witbrock@MJW.BOLTZ.CS.CMU.EDU) Date: Fri, 16 Nov 90 10:28:34 EST Subject: verifiable cognition Message-ID: The recent attempt to show humour, which subsists in restating the rumour, that ``regular folks'' amounts to ``us blokes'', is no better than those posted sooner. From Alex.Waibel at SPEECH2.CS.CMU.EDU Fri Nov 16 11:51:08 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Fri, 16 Nov 90 11:51:08 EST Subject: Announcing a NIPS '90 workshop comparing decision trees and neural nets Message-ID: There will be summary write-ups for all workshops placed on file at Ohio-State for remote FTP-ing, after the workshops. Enjoy, Alex Waibel From jose at learning.siemens.com Fri Nov 16 08:30:39 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Fri, 16 Nov 90 08:30:39 EST Subject: Netiquette Message-ID: <9011161330.AA20175@learning.siemens.com.siemens.com> Yes folks. Lets get this off the net please. Steve Hanson From Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU Sat Nov 17 00:46:56 1990 From: Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave.Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Sat, 17 Nov 90 00:46:56 EST Subject: neural net position available Message-ID: <5238.658820816@DST.BOLTZ.CS.CMU.EDU> I'm posting this position announcement as a favor to the Navy. I have no additional information to what appears below, so contact Dr. Lorey, not me, if you have questions. And please: let's have no more limericks or limerick discussions. Everybody get back to work. -- Dave ................................................................ Subj: POSITION DESCRIPTION 1. The Naval Surface Warfare Center is seeking to expand its operations in basic research in Artificial Neural Nets to compliment the ongoing applied work. A new thrust in basic research is seeking to augment the existing staff with one or two new hires. Preference will be given candidates who will soon complete their PhD work but Masters candidates will be considered. We are also interested in filling at least one summer faculty position. Neural Network experience is a must for all positions. The full-time and summer positions will join the current staff of five--two with PhDs, two with Masters. 2. The ongoing research areas include: multi-sensor fusion, image/data compression, image processing, optimization, development of new learning rules for existing architectures and development of schemes to embed networks in multi-chip simulation systems. 3. Equipment on hand: 4 Sun Workstations 1 Alliant FX40 Minisuper Computer with 2 Compote engines 1 Silicon Graphics 4D/220GTXB Graphics Workstation 4. Planned positions(s) will encompass work in the following areas: -route planning for autonomous vehicles, -improvement of existing (Hopfield, Boltzmann machines) and -development of new network architectures for optimization, -development/implementation of networks for low-level vision processing, and -evaluation/integration of neural network processing chips 5. Planned procurements: 2 Silicon Graphics 4D/35TG 1 Terrain board 1 Silicon Graphics 4D1340VGXB 1 VME Sun Expansion box 1 VME SG Expansion box 1 Breadboard table Video capability 5. Technical questions regarding ongoing work at the Center may be addressed to Dr. George Rogers or Mr. Jeffrey Solka at 703-663- 7650. 6. Interested candidates should submit their resume to Dr. Richard Lorey (703-663-8159) at MILNET address: rlorey@ relay.nswc.navy.mil or via mail to: Commander Naval Surface Warfare Center Code K12 (Dr. Richard Lorey) Dahlgren, VA 22448-5000 7. ELIGIBILITY FOR SECRET LEVEL CLEARANCES MANDATORY - RESUME SHOULD INDICATE U.S. CITIZENSHIP STATUS. /s/ Dr. Richard Lorey From burr at mojave.stanford.edu Sat Nov 17 00:54:20 1990 From: burr at mojave.stanford.edu (Jim Burr) Date: Fri, 16 Nov 90 21:54:20 PST Subject: NIPS VLSI post conference workshop Message-ID: <9011170554.AA06228@mojave.Stanford.EDU> This year's post conference workshop on VLSI neural nets will be held Saturday, Dec 1. There will be a morning and an evening session. The workshop will address the latest advances in VLSI implementations of neural nets. How successful have implementations been so far? Are dedicated neurochips being used in real applications? What algorithms have been implemented? Which ones have not been? Why not? How important is on chip learning? How much arithmetic precision is necessary? Which is more important, capacity or performance? What are the issues in constructing very large networks? What are the technology scaling limits? Any new technology developments? Several invited speakers will address these and other questions from various points of view in discussing their current research. We will try to gain better insight into the strengths and limitations of dedicated hardware solutions. Jim Burr burr at mojave.stanford.edu From rsun at chaos.cs.brandeis.edu Sat Nov 17 00:14:40 1990 From: rsun at chaos.cs.brandeis.edu (Ron Sun) Date: Sat, 17 Nov 90 00:14:40 est Subject: No subject Message-ID: <9011170514.AA15626@chaos.cs.brandeis.edu> Tony Chan wrote: >I came from a culture, where, as far as I know, feminism is assumed and >femininity is worshipped. I wonder where that *culture* is. If you are from where I am from, you are too far from the truth. >I unreservedly agree with Pegg >y on this >issue. Sexism aside, I find it sickening to learn that the cream of the >crop of this culture wastes so much of the resources---mental, >electronic, and others---on non-scientific matters. so much? >Being a member of >this list, I feel I must apologize sincerely to women on the list even >though I have absolutely nothing to do with Jordan Pollock or the >Connectfest that he was partly responsible for. Sounds really weird to me. > >Tony Chan Telephone: 506 648-5500 >P.O. Box 5050 Fax: 506 648-5528 >Computer Science Email: chan at unb.ca >UNB Saint John Campus >Saint John, New Brunswick E2L 4L5 >Canada Please, folks, stop it now. enough of this stuff. --Ron From watt at compsci.stirling.ac.uk Fri Nov 16 11:07:01 1990 From: watt at compsci.stirling.ac.uk (R.J. Watt) Date: 16 Nov 90 16:07:01 GMT (Fri) Subject: Permanent Lectureship Message-ID: <9011161607.AA22485@uk.ac.stir.cs.lev> University of Stirling, Scotland Department of Psychology Centre for Cognitive and Computational Neuroscience (CCCN) We have a permanent post available for a LECTURER: M. Sc. in NEURAL COMPUTATION This new one-year M. Sc. will begin Sept 1991, and will cover a wide variety of topics in neural computation with vision as a major specialisation. The person appointed will have a major responsibility for running the course, and for teaching the vision components, and will also be exepected to have an active research program. The CCCN is multidisciplinary and includes staff from the Departments of Psychology, Computing Science and Mathematics. Starting date: 1 July 1991 Salary on scale UKL 12,086 - UKL 22,311 Applications (initially by by email, fax or snail-mail) including a CV with names and addresses (inc e-mail if poss) of 2 referees to Prof R. J. Watt, Psychology Dept., University of Stirling, FK9 4LA Scotland tel: 0786 67665 fax : + 44 786 63000 e-mail: watt at cs.stir.ac.uk by 26 Nov 1990. Further particulars and information from Prof Watt. The University of Stirling is an equal opportunities employer. From Connectionists-Request at CS.CMU.EDU Sat Nov 17 12:55:56 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Sat, 17 Nov 90 12:55:56 EST Subject: Enough Already! Message-ID: <18700.658864556@B.GP.CS.CMU.EDU> Move the discussion of the limericks off this mailing list! The Connectionists mailing list is a private mailing list for the discussion of *TECHNICAL* issues only. Persons who continue this limerick thread will be removed from the list. Scott Crowder Connectionists-Request at cs.cmu.edu (ARPAnet) From PEGJ at oregon.uoregon.edu Sat Nov 17 17:01:00 1990 From: PEGJ at oregon.uoregon.edu (Peggy Jennings) Date: Sat, 17 Nov 90 14:01 PST Subject: a limerick of my own Message-ID: <0262CDAF6F1F413D93@oregon.uoregon.edu> I am risking broadcasting this message on the network because I started a discussion surrounding the offensive limericks, but did not make clear what offended me. It is important to broadcast the clarification of this issue on the net because there is a recurring type of offense that happens on this network and at connectionist conferences and summer institutes that I am not faced with on other networks and at other conferences. The offenses in the limericks are not related to word usage ('whore') nor about topic selection (sex). I apologize to those who have involuntarily spent money or time reading those messages. The offenses lie in the communication of assumed male perspective and values, and in the assumption that humor that is insulting and degrading to another group of people is acceptable by tacit agreement. This *is* a concern of members of this network. Unfortunately, in my original message, I defined the issues in a conjunction using two asymmetrical relations: (1) that connectionists are men (2) that women are whores. Both of these assumptions anger me -- the second offends me personally (and is *not* the concern of this network) and the first offends me professionally (and *is* the concern of this network). I'd like clarify the issue and pull further discussion of this ***off the network.*** What I am trying to communicate is my frustration and anger at assumption (1): that those who attend Connectionist Modeling Summer Institutes, those who attend connectionism conferences and talks, and those who subscribe to connectionist networks are white men. The offense is one of egocentric perspective. This perspective is manifested as sexism in my case, but to others it could look like racism, for example, or of "religio- centrism" What angers me is the communication of the implicit assumption that members of this network and participants in the ConnectFest share a value system that, for example, finds humor in the function or dysfunction of male sexual arousal or in the exploitation of an objectified sexual "service". We who subscribe to this network have in common that we are *people,* but the only further assumption that can safely be made is that we are all interested in connectionist research and development. Posting humor or any other message that is insulting or degrading to *any* group of people is not appropriate. I am challenging members of this network to communicate a global perspective. When you post something on this network, when you give a presentation at a conference or special institute, do not assume that the audiece is just like you. Assume that members of the audience are different genders, races, nationalities and religions sharing scientific interests in connectionism. The only value system we have identified to you by subscribing to the network or by attending the conference is one of research interest. And that is the only topic we have asked to read about in our mail. With that, let's **get this off the net now** I'll gladly continue this discussion with individuals who want to continue. And finally, I'd like to close this discussion with a little limerick of my own (after Gasser): Said the Boltzmann machine to its lover, "There's an optimum we can discover. This is such a great feeling Let's slow our annealing Exploring each dip in the other." P.S. Thanks to all of you who wrote to support me, criticize me, agree with me, call me names, and ask me what I think. I am fortunate in having the opportunity to enter into discussion with so many of you who care about these issues in our profession. ******************************* -- Peg Jennings From Connectionists-Request at CS.CMU.EDU Mon Nov 19 09:12:22 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Mon, 19 Nov 90 09:12:22 EST Subject: call for papers for COLT '91 Message-ID: <10821.659023942@B.GP.CS.CMU.EDU> From haussler at saturn.ucsc.edu Mon Nov 19 01:18:16 1990 From: haussler at saturn.ucsc.edu (David Haussler) Date: Sun, 18 Nov 90 22:18:16 -0800 Subject: call for papers for COLT '91 Message-ID: <9011190618.AA27456@saturn.ucsc.edu> CALL FOR PAPERS COLT '91 Fourth Workshop on Computational Learning Theory Santa Cruz, CA. August 5-7, 1991 The fourth workshop on Computational Learning Theory will be held at the Santa Cruz Campus of the University of California. Registration is open, within the limits of the space available (about 150 people). In previous years COLT has focused primarily on developments in the analysis of learning algorithms within certain computational learning models. This year we would like to widen the scope of the workshop by encouraging papers in all areas that relate directly to the theory of machine learning, including artificial and biological neural networks, robotics, pattern recognition, information theory, decision theory, Bayesian/MDL estimation, and cryptography. We look forward to a lively, interdisciplinary meeting. As part of our program, we are pleased to present two special invited talks. "Gambling, Inference and Data Compression" Prof. Tom Cover of Stanford University "The Role of Learning in Autonomous Robots" Prof. Rodney Brooks of MIT Authors should submit an extended abstract that consists of: (1) A cover page with title, authors' names, (postal and e-mail) addresses, and a 200 word summary. (2) A body not longer than 10 pages in twelve-point font. Be sure to include a clear definition of the theoretical model used, an overview of the results, and some discussion of their significance, including comparison to other work. Proofs or proof sketches should be included in the technical section. Experimental results are welcome, but are expected to be supported by theoretical analysis. Authors should send 11 copies of their abstract to L.G. Valiant, COLT '91, Aiken Computing Laboratory, Harvard University, Cambridge, MA 02138. The deadline for receiving submissions is February 15, 1991. This deadline is FIRM. Authors will be notified by April 8; final camera-ready papers will be due May 22. The proceedings will be published by Morgan-Kaufmann. Each individual author will keep the copyright to his/her abstract, allowing subsequent journal submission of the full paper. Chair: Manfred Warmuth (UC Santa Cruz). Local arrangements chair: David Helmbold (UC Santa Cruz). Program committee: Leslie Valiant (Harvard, chair), Dana Angluin (Yale), Andrew Barron (U. Illinois), Eric Baum (NEC, Princeton), Tom Dietterich (Oregon State U.), Mark Fulk (U. Rochester), Alon Itai (Technion, Israel), Michael Kearns (Int. Comp. Sci. Inst., Berkeley), Ron Rivest (MIT), Naftali Tishby (Bell Labs, Murray Hill), Manfred Warmuth (UCSC). Hosting Institution: Department of Computer and Information Science, UC Santa Cruz. Papers that have appeared in journals or other conferences, or that are being submitted to other conferences are not appropriate for submission to COLT. Unlike previous years, this includes papers submitted to the IEEE Symposium on Foundations of Computer Science (FOCS). We no longer have a dual submission policy with FOCS. Note: this call is being distributed to THEORY-NET, ML-LIST, CONNECTIONISTS, Alife, NEWS.ANNOUNCE.CONFERENCES, COMP.THEORY, COMP.AI, COMP.AI.EDU, COMP.AI.NEURAL-NETS, and COMP.ROBOTICS. Please help us by forwarding it to colleagues who may be interested and posting it on any other relevant electronic networks. ------- End of Forwarded Message From szand at cs.utexas.edu Mon Nov 19 12:32:26 1990 From: szand at cs.utexas.edu (Shahriar Zand-Biglari) Date: Mon, 19 Nov 90 11:32:26 CST Subject: On Necessity of Awareness of Regular Folks! Message-ID: <9011191732.AA19610@cs.utexas.edu> Apparently, the sexist/prejudice trend is not going to stop their exploitation of the connectionist network. Here is another that just came up: When the meaning of every joke Is perceived as a sexist poke The problem may lie In the reader's own eye That they can't laugh like regular folk Such statements, even if not consciously and intentionally brought up, only reflects the serious social unawareness of the "regular folks" in our scientific community. And yet another evidence on how education and degree does not provide the members of our community with a progressive culture, appropriate for our time. I very much appreciate the sharp eyes of those like Peggy, which could see the prejudice behind the sexist poetry. Regular folks, male (as myself) or female, better be aware of such sexist remarks and feel sorrow for their existence in our time that to laugh. Shar Zand-Biglari Department of Computer Sciences University of Texas at Austin szand at cs.utexas.edu From fozzard at boulder.Colorado.EDU Mon Nov 19 12:46:54 1990 From: fozzard at boulder.Colorado.EDU (Richard Fozzard) Date: Mon, 19 Nov 90 10:46:54 -0700 Subject: all this hoopla over limericks... Message-ID: <9011191746.AA25019@alumni.colorado.edu> Will all of you PLEASE put a "set Replyall" in your .mailrc? This will default "r" to reply only to the sender (and "R" will reply to sender and all recipients). [This may be different on some UNIXs - do a man mail to be sure.] This should get some of the personal discussions (aka "flames") off of our mailing list. thanks! ======================================================================== Richard Fozzard "Serendipity empowers" Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 From ernst at russel.cns.caltech.edu Mon Nov 19 14:48:36 1990 From: ernst at russel.cns.caltech.edu (Ernst Miebur) Date: Mon, 19 Nov 90 11:48:36 PST Subject: Post NIPS Conference Workshop on Cortical Oscillations Message-ID: <9011191948.AA10858@russel.caltech.edu> NIPS Post Conference Workshop #1: Oscillations in Cortical Systems 40-60 Hz oscillations have long been reported in the rat and rabbit olfactory bulb and cortex on the basis of single-and multi-unit recordings as well as EEG activity. Periodicities in eye movement reaction times as well as oscillations in the auditory evoked potential in response to single click or a series of clicks all support a 30-50 Hz framework for aspects of cortical activity and possibly cortical information processing. Recently, highly synchronized, stimulus specific oscillations in the 35-85 Hz range were observed in areas 17, 18 and PMLS of anesthetized as well as awake cats. Neurons with similar orientation tuning up to 10 mm apart, even across the vertical meridian (i.e. in different hemispheres) can show phase-locked oscillations. Organization of the workshop will favor interaction between participants as much as possible. To set a framework, introductory talks will be presented. Speakers include J. Bower (Caltech): Experiments B. Ermentrout (U. Pittsburgh): Coupled Oscillators E. Niebur (Caltech): Models D. Schwenders (U. Munich): Psychophysics If you plan to present your work during a 5-10 minute talk, I would appreciate sending me a notice, although ``walk-ins'' are welcome. Topics that will be discussed include - possible functions of cortical oscillations, - crucial experiments to elucidate these functions, - mechanisms for long-range synchronization. Ernst Niebur Computation and Neural Systems Caltech 216-76 Pasadena, CA 91125 ernst at descartes.cns.caltech.edu (818)356-6885 From dave at cogsci.indiana.edu Tue Nov 20 00:16:24 1990 From: dave at cogsci.indiana.edu (David Chalmers) Date: Tue, 20 Nov 90 00:16:24 EST Subject: Bibliography available Message-ID: I originally sent this notice to a philosophy list, but somebody suggested that I send it to connectionists as a lot of the material is relevant, including about 60 papers on the philosophy of connectionism. -- For the last year or so, I've been working on a bibliography of recent work in the philosophy of mind, philosophy of cognitive science, and philosophy of AI. I keep intending to distribute it when it's complete, but of course it's never complete as I'm always coming across new things. So maybe I should just distribute it as it is. It consists of 861 entries, divided into 4 parts: 1. "First-person" issues (consciousness, qualia, etc) [218 entries] 2. "Third-person" issues (content, psych explanation, etc) [346 entries] 3. Philosophy of Artificial Intelligence [155 entries] 4. Miscellaneous topics [142 entries] About half of the entries are annotated with a 1-or-2-line summary, and occasionally criticism. The rest I either haven't read, or haven't got around to annotating yet. Of course none of the bibliographies are complete, but part 4 is particularly feeble, without any attempt at thoroughly covering the areas involved. The vast bulk of the bibliography consists of papers and books from the last 10-15 years, although a little earlier material is included where it is directly relevant to current concerns. I've enclosed a section-by-section summary below. To get a copy, write to me at dave at cogsci.indiana.edu. The files take up about 120K in total. Dave Chalmers (dave at cogsci.indiana.edu) Center for Research on Concepts and Cognition Indiana University. ----------------------------------------------------------------------------- Bibliography: Recent Work in the Philosophy of Mind and Cognition ================================================================= Compiled by David J. Chalmers, Center for Research on Concepts and Cognition, Indiana University, Bloomington, IN 47408. (c) 1990 David J. Chalmers. Summary ------- 1. "First-person" issues (consciousness, qualia, etc) [218] 1.1 Subjectivity (Nagel) [26] 1.2 The Knowledge Argument (Jackson) [13] 1.3 Functionalism and Qualia (including Absent Qualia, etc) [28] 1.4 Inverted Spectrum [12] 1.5 Qualia, General [18] 1.6 Are Programs Enough? (Searle) [32] 1.7 Machines and Conscious Mentality (other) [15] 1.8 Mind-Body Problem (Misc) [10] 1.9 Zombies and Other Minds [4] 1.10 Consciousness -- Eliminativist Perspectives [10] 1.11 Consciousness -- Functional Accounts [21] 1.12 Consciousness, General [17] 1.13 Subjective Mental Content [4] 1.14 Dualism [8] 2. "Third-person" issues (content, psych explanation, etc) [346] 2.1 The Reality of Propositional Psychology [50] 2.1a General [12] 2.1b Sententialism (esp. Fodor) [16] 2.1c Instrumentalism (Dennett) [17] 2.1d Syntactic Functionalism (Stich) [5] 2.2 Eliminativism, Psychology & Neuroscience (esp. Churchlands) [29] 2.2a Eliminative Materialism [16] 2.2b Psychology & Neuroscience [13] 2.3 Narrow/Wide Content [53] 2.3a Why Reference is not in the Head (Putnam) [9] 2.3b Implications for Psychology (Burge, Fodor) [20] 2.3c The Status of Narrow Content [12] 2.3d Miscellaneous [12] 2.4 Causal Theories of Content [37] 2.4a Information-Theoretic Accounts (Dretske) [11] 2.4b Causal Accounts, General [11] 2.4c Teleological Approaches (Millikan) [8] 2.4d Situation Semantics (Barwise/Perry) [7] 2.5 Theories of Content, Misc [13] 2.6 Representation (General) [13] 2.7 Supervenience, Reduction, Mental Causation [46] 2.7a Supervenience (Kim, etc) [13] 2.7b Anomalous Monism (Davidson) [14] 2.7c Token Identity (Davidson, etc) [4] 2.7d Mental Causation [8] 2.7e Mental/Physical, Misc [7] 2.8 Functionalism (General) [31] 2.9 Computationalism (General) [17] 2.10 Psychological Explanation, Misc [7] 2.11 Perception/Modularity/Plasticity (Fodor, Churchland) [12] 2.12 Nativism (Chomsky, etc) [22] 2.13 Misc Phil of Mind [16] 3. Philosophy of Artificial Intelligence [155] 3.1 Can Machines be Conscious? -- see 1.6, 1.7. 3.2 Computationalism as Psychological Explanation -- see 2.9. 3.3 The Turing Test [9] 3.4 Godelian Arguments (Lucas) [23] 3.5 Philosophy of Connectionism [40] 3.6 Foundations of Connectionism (more empirical) [10] 3.7 Connectionism & Structured Representation (Fodor/Pylyshyn) [10] 3.8 Foundations of AI (somewhat empirical) [14] 3.9 Computation and Semantics [12] 3.10 The Frame Problem [10] 3.11 Analog and Digital Processing [5] 3.12 Levels of Analysis (Marr, etc) [7] 3.13 Philosophy of AI, Misc [15] 4. Miscellaneous Topics [142] 4.1 Colour, General [15] 4.2 Colour Incompatibilities [6] 4.3 Split Brains [11] 4.4 Personal Identity (tiny selection) [7] 4.5 Pain and Pleasure [13] 4.6 Dreaming [9] 4.7 Phenomenal Qualities and the Sorites Paradox [7] 4.8 Mental Images (Pylyshyn, Kosslyn) [21] 4.9 Sensation and Perception, Misc [8] 4.10 Emotions, etc [6] 4.11 Free Will (tiny selection) [7] 4.12 Animal Cognition [7] 4.13 Brains in Vats (Putnam) [14] 4.14 Rationality [11] From AC1MPS at primea.sheffield.ac.uk Tue Nov 20 09:57:02 1990 From: AC1MPS at primea.sheffield.ac.uk (AC1MPS@primea.sheffield.ac.uk) Date: Tue, 20 Nov 90 09:57:02 Subject: thanks and  as requested Message-ID: Many thanks to everyone who replied to my last broadcast about super-Turing systems. To those of you who asked for hard-copies of my paper: they're in the post. A lot of you missed the original references to Pour-El's work. Here are the references I've got, although there may be later work as well. Myhill J. "A recursive function, defined on a compact interval and having a continuous derivative that is not recursive." Michegan Math. J. 18 (1971) pp 97-8. M. Boykan Pour-El "Abstract computability versus analog-generability." Springer Lecture Notes in Math 337 (1971) pp 345-60 (Cambridge summer school in math. logic). Marian Boykan Pour-El "Abstract computability and its relation to the general purpose analog computer." Trans AMS 199 (1974) pp 1-28. Marian Boykan Pour-El + Ian Richards "The wave equation with computable initial data such that its unique solution is not computable." Advances in math 39 (1981) pp 215-39. Thanks again. Mike Stannett. From pollack at cis.ohio-state.edu Tue Nov 20 13:00:39 1990 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Tue, 20 Nov 90 13:00:39 -0500 Subject: Postdoctoral Positions Message-ID: <9011201800.AA00608@dendrite.cis.ohio-state.edu> POSTDOCTORAL FELLOWS IN COGNITIVE SCIENCE The Ohio State University The Center for Cognitive Science at the Ohio State University has several openings for postdoctoral researchers. We will consider recent Ph.D.'s in all areas of Cognitive Science based on overall quality of research and commitment to interdisciplinary work. Applications are especially encouraged from candidates in areas in which we have active faculty interests.(See attached note) These two-year positions will begin July 1991 and carry a yearly stipend of $25,000 with $1000 for moving expenses and $1000 per year for research and travel. An office and computer access will be provided by the Center. The Center for Cognitive Science is an interdisciplinary University-wide research center with approximately eighty members from sixteen departments. OSU is one of the largest universities in the country with significant research resources including a Cray YMP and a PET scanner. Columbus provides a high quality of life along with very affordable housing. To apply, send your vita, reprints and a statement of research interests and a research plan, tell us the name of the faculty member(s) you wish to work with at OSU, and arrange for three recommendation letters to be sent to: Postdoctoral Search Commmittee Center for Cognitive Science 208 Ohio Stadium East 1961 Tuttle Park Place Columbus, OH 43210-1102 Materials must be postmarked by January 15, 1991. The Ohio State University is an Equal Opportunity/Affirmative Action employer. ------------------------------------- I've put a summary of the main interests of the center in neuroprose (file OSU.Cogsci). Personally, I am looking for someone with a solid background in non-linear dynamical systems to collaborate on the question of how fractals and chaos are exploited by cognition. Other opportunities exist in neuroscience, AI, vision, speech, language, music, motor control, philosophy of mind, and elsewhere, under sponsorship of faculty in those areas. Please contact me (and/or any other faculty you know at OSU) for further info. Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Fax/Phone: (614) 292-4890 From mike at park.bu.edu Tue Nov 20 18:11:35 1990 From: mike at park.bu.edu (mike@park.bu.edu) Date: Tue, 20 Nov 90 18:11:35 -0500 Subject: No subject Message-ID: <9011202311.AA01221@fenway.bu.edu> BOSTON UNIVERSITY A World Leader In Neural Network Research and Technology Presents Two Major Events on the Cutting Edge NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, MAY 5-10, 1991 A self-contained systematic course by leading neural architects. NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, MAY 10-12, 1991 An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. Special student registration rates are available. Sponsored by: Boston University's Wang Institute, Center for Adaptive Systems, and Graduate Program in Cognitive and Neural Systems, with partial support from the Air Force Office of Scientific Research. NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 5-10, 1991 This self-contained systematic five-day course is based on the graduate curriculum in the technology, computation, mathematics, and biology of neural networks developed at the Center for Adaptive Systems (CAS) and the graduate program in Cognitive and Neural Systems (CNS) of Boston University. The curriculum refines and updates the successful course held at the Wang Institute in May, 1990. The course will be taught by CAS/CNS faculty, as well as by distinguished guest lecturers at the beautiful and superbly equipped campus of the Wang Institute. An extraordinary range and depth of models, methods, and applications will be presented with ample opportunity for interaction with the lecturers and other participants at the daily discussion sections, meals, receptions, and breaks that are included with registration. At the 1990 Course, participants came from 20 countries and 35 states of the U.S. Boston University tutors are STEPHEN GROSSBERG, GAIL CARPENTER, ENNIO MINGOLLA, MICHAEL COHEN, DAN BULLOCK, AND JOHN MERRILL. Guest tutors are FEDERICO FAGGIN, ROBERT HECHT-NIELSEN, MICHAEL JORDAN, ANDY BARTO, AND ALEX WAIBEL. DAY 1 COURSE SCHEDULE (May 6, 1991) PROFESSOR GROSSBERG: Historical Overview, Cooperation and Competition, Content Addressable Memory, and Associative Learning. PROFESSORS CARPENTER, GROSSBERG, AND MINGOLLA: Associative Learning Continued, Neocognitron, Perceptrons, and Introduction to Back Propagation. PROFESSOR JORDAN: Recent Developments of Back Propagation. Evening Discussions with Tutors and Informal Presentations. DAY 2 COURSE SCHEDULE (May 7, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Adaptive Pattern Recognition. PROFESSORS CARPENTER AND GROSSBERG: Introduction to Adaptive Resonance, Theory and Analysis of ART 1. PROFESSOR CARPENTER: Analysis of ART 2, ART 3, Predictive ART, and Self-Organization of Invariant Pattern Recognition codes. Evening Discussions with Tutors and Informal Presentations. DAY 3 COURSE SCHEDULE (May 8, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Vision and Image Processing. PROFESSORS BULLOCK AND GROSSBERG: Adaptive Sensory-Motor Planning and Control. Evening Discussions with Tutors and Informal Presentations. DAY 4 COURSE SCHEDULE (May 9, 1991) PROFESSORS COHEN, GROSSBERG, AND WAIBEL: Speech Perception and Production. PROFESSORS BARTO, GROSSBERG, AND MERRILL: Reinforcement Learning and Prediction. DR. HECHT-NIELSEN: Recent Developments in the Neurocomputer Industry. Evening Discussions with Tutors and Informal Presentations. DAY 5 COURSE SCHEDULE (May 10, 1991) DR. FAGGIN: VLSI Implementation of Neural Networks. END OF COURSE (at 1:30 PM). RESEARCH CONFERENCE NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING MAY 10-12, 1991 This international research conference on a topic at the cutting edge of science and technology will bring together leading experts in academe, government, and industry to present their results on vision and image processing in INVITED LECTURES and CONTRIBUTED POSTERS. Topics range from visual neurobiology and psychophysics through computational modelling to technological applications. CALL FOR PAPERS - VIP POSTER SESSION: A featured 3-hour poster session on neural network research related to vision and image processing will be held on May 11, 1991. Attendees who wish to present a poster should submit three copies of an abstract (one single-spaced page), postmarked by March 1, 1991, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1991. DAY 1 CONFERENCE PROGRAM (May 10, 1991, 5:00-7:30 PM) PROFESSOR JOHN DAUGMAN, CAMBRIDGE UNIVERSITY: "High-Confidence Personal Identification System Built from Quadrature Neural Filter" PROFESSOR DAVID CASASENT, CARNEGIE MELLON UNIVERSITY: "CMU Hybrid Optical/ Digital Neural Net for Scene Analysis" DR. ROBERT HECHT-NIELSEN, HNC,: "Neurocomputers for Image Analysis" DAY 2 CONFERENCE PROGRAM (May 11, 1991) PROFESSOR V.S. RAMACHANDRAN, UNIVERSITY OF CALIFORNIA, SAN DIEGO: "Interactions Between `Channels' Concerned with the Perception of Motion, Depth, Color, and Form" PROFESSOR STEPHEN GROSSBERG, BOSTON UNIVERSITY: "A Neural Network Architecture for 3-D Vision and Figure-Ground Separation" PROFESSOR ENNIO MINGOLLA, BOSTON UNIVERSITY: "A Neural Network Architecture for Visual Motion Segmentation" PROFESSOR GEORGE SPERLING, NEW YORK UNIVERSITY: "Two Systems of Visual Processing" DR. ROBERT DESIMONE, NATIONAL INSTITUTE OF MENTAL HEALTH: "Attentional Control of Visual Perception: Cortical and Subcortical Mechanisms" PROFESSOR GAIL CARPENTER, BOSTON UNIVERSITY: "Neural Network Architectures for Attentive Learning, Recognition, and Prediction" DR. RALPH LINSKER, IBM T.J. WATSON RESEARCH CENTER: "New Approaches to Network Learning and Optimization" PROFESSOR STUART ANSTIS, UNIVERSITY OF TORONTO: "My Recent Research on Motion Perception" POSTER SESSION DAY 3 CONFERENCE PROGRAM (May 12, 1991) PROFESSOR JACOB BECK, UNIVERSITY OF OREGON: "Preattentive Visual Processing" PROFESSOR JAMES TODD, BRANDEIS UNIVERSITY: "Neural Analysis of Motion" DR. ALLEN M. WAXMAN, MIT LINCOLN LAB: "Extraction" PROFESSOR ERIC SCHWARTZ, NEW YORK UNIVERSITY: "Biologically Motivated Machine Vision" PROFESSOR ALEX PENTLAND, MASSACHUSETTS INSTITUTE OF TECHNOLOGY: "The Optimal Observer: Design of a Dynamically-Responding Visual System" DISCUSSION END OF RESEARCH CONFERENCE (at 1 PM) CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. STUDENT REGISTRATION: A limited number of spaces at the course and conference have been reserved at a subsidized rate for full time students. These spaces will be assigned on a first-come, first-served basis. Completed registration form and payment for students who wish to be considered for the reduced student rate must be received by April 15, 1991. YOUR REGISTRATION FEE INCLUDES: COURSE CONFERENCE Five days of tutorials Admission to all invited lectures Course notebooks for all tutorials Admission to poster session All guest lectures One reception Sunday evening reception Two continental breakfasts Five continental breakfasts One lunch Five lunches One dinner Four dinners Daily morning/afternoon Daily morning/afternoon coffee coffee service service Evening discussion sessions with leading neural architects CANCELLATION POLICY: Course fee, less $100, and the research conference fee, less $60, will be refunded upon receipt of a written request postmarked before March 31, 1991. After this date no refund will be made. Registrants who do not attend and who do not cancel in writing before March 31, 1991 are liable for the full amount of the registration fee. You must obtain a cancellation number from our registrar in order to make the cancellation valid. HOW TO REGISTER: ADVANCE REGISTRATION: To register by telephone, call (508) 649-9731 with VISA or Mastercard between 8:00-5:00 PM (EST). To register by fax, complete and fax back the Registration Form to (508) 649-6926. To register by mail, complete the registration form and mail it with your full form of payment as directed. Make check payable in U.S. dollars to Boston University. ON-SITE REGISTRATION: Those who wish to register for the course and the research conference on-site may do so on a space-available basis. SITE: The Wang Institute of Boston University possesses excellent conference facilities in a beautiful 220-acre setting. It is easily reached from Boston's Logan Airport and Route 128. HOTEL RESERVATIONS: Sheraton Tara, Nashua, NH (603) 888-9970; Red Roof Inn, Nashua, NH (603) 888-1893; or Stonehedge Inn, Tyngsboro, MA, (508) 649-4342. The special conference rate applies only if you mention the name and dates of the meeting when making the reservation. The hotels in Nashua are located approximately five miles from the Wang Institute. Shuttle bus service will be provided. REGISTRATION FORM: COURSE - NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, May 5-10, 1991 RESEARCH CONFERENCE - NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, May 10-12, 1991 Name: ______________________________________________________________ Title: _____________________________________________________________ Organization: ______________________________________________________ Address: ___________________________________________________________ City: ____________________________ State: __________ Zip: __________ Telephone: _________________________________________________________ Course: Research Conference: ( ) regular attendee $985 ( ) regular attendee $95 ( ) full-time student $275* ( ) Full-time student $75* *limited number of spaces. Student registrations must be received by April 15, 1991. Total payment enclosed: ____________________________________________ Form of payment: ( ) Check or money order (payable in U.S. dollars to Boston University). ( ) VISA ( ) Mastercard #_______________________________________Exp. Date:__________________ Signature (as it appears on card): _________________________________ Return to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 Boston University's policies provide for equal opportunity and affirmative action in employment and admission to all programs of the University. From WANG at nbivax.nbi.dk Wed Nov 21 06:56:00 1990 From: WANG at nbivax.nbi.dk (WANG@nbivax.nbi.dk) Date: Wed, 21 Nov 90 12:56 +0100 (NBI, Copenhagen) Subject: Copenhagen Optimization Conference Message-ID: ------------------------------------------------------------ INTERNATIONAL CONFERENCE ON NOVEL METHODS IN OPTIMIZATION February 7 - 8, 1991 arranged by NORDITA Nordic Institute of Theoretical Physics Copenhagen and DIKU Department of Computer Science University of Copenhagen supported by funding from Nordic Initiative for Neural Computation (NINC) ------------------------------------------------------------ In recent years there has been an increasing interest in using neural networks, simulated annealing, and genetics as modelling frames of reference to construct novel search heuristics for solving hard optimization problems. Algorithms constructed in this way, together with tabu search, constitute promising new approaches to optimization and are the subjects of this conference. The aim of the conference is to bring together researchers in classical optimization and researchers working with the novel methods, thus enabling a fruitful exchange of information and results. An important part of the conference will be a tutorial presentation of both classical and new methods to establish a common base for discussion among the participants. Tutorial session. ----------------- The first day of the conference will be devoted to introductory lectures given by invited speakers. The lectures will be on: * Classical Optimization. a) Laurence Wolsey, Center for Operations Research and Econometrics, Universite de Louvain, Belgium: Introduction to Classical Optimization: P-problems and their solution. b) Susan Powell, London School of Economics: Introduction to Classical Optimization: NP-problems and their solution. * Neural Networks. Carsten Peterson, Lund University, Sweden: The use of neural networks and optimization. * Simulated Annealing. (Speaker to be announced later) * Genetic Algorithms. (Speaker to be announced later) * Tabu Search. (Speaker to be announced later) * Statistical Mechanics. Marc Mezard, Ecole Normale Superieure, Paris: "Formal statistical mechanical methods in optimization problems." About the speakers: Laurence A. Wolsey is Professor of Applied Mathematics at CORE and is one of the leading researchers in the field of computational mathematical programming. He received the Beale-Orchard-Hays prize for his work in 1988, and is one of the authors of the widely used book "Integer and Combinatorial Optimization". Susan Powell is Lecturer in Operations Research at London School of Economics and is well known for her work on Fortran Codes for linear and integer programs. She has a solid background in prac- tical problem solving through her contacts with industry and British OR companies. Carsten Peterson is Lecturer in Theoretical Physics at Lund University. He is co-inventor of the deterministic Boltzmann learning algorithm for symmetric recurrent networks and a leader in applications of neural networks to optimization problems. Marc Mezard is Lecturer in Physics at the Ecole Nomale Superieure, Paris. Together with his colleagues there and their coworkers at the University of Rome, he pioneered the application of methods from the statistical mechanics of random systems to optimizaation problems. Contributed Papers. ------------------- The second day of the conference will be devoted to selected half-hour contributed presentations. An abstract of each paper submitted for presentation should be mailed or e-mailed to: Prof. Jens Clausen DIKU, Dept. of Computer Science Universitetsparken 1, DK-2100 Copenhagen OE Denmark. e-mail: clausen at diku.dk before January 1, 1991. Authors of accepted papers will be notified before January 15, 1991. (No proceedings will be published). Poster Sessions. ---------------- On both seminar days there will be poster sessions. An abstract of the poster should be mailed or e-mailed to Prof. Jens Clausen DIKU, Dept. of Computer Science Universitetsparken 1, DK-2100 Copenhagen OE Denmark. e-mail: clausen at diku.dk before january 1, 1991. Authors of accepted posters will be notified before January 15, 1991. Registration. ------------- The registration fee is 500 DKK (or equivalent in other convertible currency) and covers coffee/tea and lunch both days as well as an informal conference dinner on the evening of February 7. To register please fill in the form below and mail it together with the registrations fee to the address given on the form. No credit cards accepted. Cheques or Eurocheques should be payable to OPTIMIZATION CONFERENCE. The organizing commitee must receive your registration form January 15, 1991 the latest, and the final program will be mailed by January 22, 1991. Travel support for Nordic participants. --------------------------------------- A limited amount of money from NINC is reserved for paying the travel costs of participants from the Nordic countries, especially younger researchers. If you would like to apply for this support, please indicate on the registration form. Accommodation. -------------- The organizing commitee has reserved a certain number of hotel rooms. Please indicate on the registration form if you would like the conference to book one for you. ------------------------------------------------------------- INTERNATIONAL CONFERENCE ON NOVEL METHODS IN OPTIMIZATION February 7 - 8, 1991 ------------------------------------------------------------- REGISTRATION FORM ------------------------------------------------------------- Name:_______________________________________________ Affiliation:_______________________________________________ Address:_______________________________________________ _______________________________________________ _______________________________________________ Telephone no.:_______________________________________________ e-mail:_______________________________________________ If you want the conference to reserve you a hotel room, please indicate here for which nights: ______________________________________________________________ Nordic participants: If you want to be considered for travel support, please indicate your needs here: ______________________________________________________________ Mail this registration form to: John Hertz NORDITA Blegdamsvej 17 DK-2100 Copenhagen OE, Denmark For further information: e-mail: hertz at nordita.dk FAX: [+45] 31 38 91 57 From dyer at CS.UCLA.EDU Wed Nov 21 12:35:03 1990 From: dyer at CS.UCLA.EDU (Dr Michael G Dyer) Date: Wed, 21 Nov 90 09:35:03 PST Subject: tech rep available on evolving neural networks Message-ID: <901121.173503z.22459.dyer@lanai.cs.ucla.edu> Evolution of Communication in Artificial Organisms* Gregory M. Werner Michael G. Dyer Tech. Rep. UCLA-AI-90-06 Abstract: A population of artificial organisms evolved simple communication protocols for mate finding. Female animals in our artificial environment had the ability to see males and to emit sounds. Male animals were blind, but could hear signals from females. Thus, the environment was designed to favor organisms that evolved to generate and interpret meaningful signals. Starting with random neural networks, the simulation resulted in a progression of generations that exhibit increasingly effective mate finding strategies. In addition, a number of distinct subspecies, i.e. groups with different signaling protocols or "dialects", evolve and compete. These protocols become a behavioral barrier to mating that supports the formation of distinct subspecies. Experiments with physical barriers in the environment were also performed. A partially permeable barrier allows a separate subspecies to evolve and survive for indefinite periods of time, in spite of occasional migration and contact from members of other subspecies. * To appear in: J. D. Farmer, C. Langton, S. Rasmussen & C. Taylor (Eds.), Artificial Life II, Addison-Wesley, in press. For a copy of the above paper, please send a request for Tech. Rep. UCLA-AI-90-06 to: valerie at cs.ucla.edu From mike at park.bu.edu Tue Nov 20 18:11:35 1990 From: mike at park.bu.edu (mike@park.bu.edu) Date: Tue, 20 Nov 90 18:11:35 -0500 Subject: No subject Message-ID: <9011202311.AA01221@fenway.bu.edu> BOSTON UNIVERSITY A World Leader In Neural Network Research and Technology Presents Two Major Events on the Cutting Edge NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, MAY 5-10, 1991 A self-contained systematic course by leading neural architects. NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, MAY 10-12, 1991 An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. Special student registration rates are available. Sponsored by: Boston University's Wang Institute, Center for Adaptive Systems, and Graduate Program in Cognitive and Neural Systems, with partial support from the Air Force Office of Scientific Research. NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 5-10, 1991 This self-contained systematic five-day course is based on the graduate curriculum in the technology, computation, mathematics, and biology of neural networks developed at the Center for Adaptive Systems (CAS) and the graduate program in Cognitive and Neural Systems (CNS) of Boston University. The curriculum refines and updates the successful course held at the Wang Institute in May, 1990. The course will be taught by CAS/CNS faculty, as well as by distinguished guest lecturers at the beautiful and superbly equipped campus of the Wang Institute. An extraordinary range and depth of models, methods, and applications will be presented with ample opportunity for interaction with the lecturers and other participants at the daily discussion sections, meals, receptions, and breaks that are included with registration. At the 1990 Course, participants came from 20 countries and 35 states of the U.S. Boston University tutors are STEPHEN GROSSBERG, GAIL CARPENTER, ENNIO MINGOLLA, MICHAEL COHEN, DAN BULLOCK, AND JOHN MERRILL. Guest tutors are FEDERICO FAGGIN, ROBERT HECHT-NIELSEN, MICHAEL JORDAN, ANDY BARTO, AND ALEX WAIBEL. DAY 1 COURSE SCHEDULE (May 6, 1991) PROFESSOR GROSSBERG: Historical Overview, Cooperation and Competition, Content Addressable Memory, and Associative Learning. PROFESSORS CARPENTER, GROSSBERG, AND MINGOLLA: Associative Learning Continued, Neocognitron, Perceptrons, and Introduction to Back Propagation. PROFESSOR JORDAN: Recent Developments of Back Propagation. Evening Discussions with Tutors and Informal Presentations. DAY 2 COURSE SCHEDULE (May 7, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Adaptive Pattern Recognition. PROFESSORS CARPENTER AND GROSSBERG: Introduction to Adaptive Resonance, Theory and Analysis of ART 1. PROFESSOR CARPENTER: Analysis of ART 2, ART 3, Predictive ART, and Self-Organization of Invariant Pattern Recognition codes. Evening Discussions with Tutors and Informal Presentations. DAY 3 COURSE SCHEDULE (May 8, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Vision and Image Processing. PROFESSORS BULLOCK AND GROSSBERG: Adaptive Sensory-Motor Planning and Control. Evening Discussions with Tutors and Informal Presentations. DAY 4 COURSE SCHEDULE (May 9, 1991) PROFESSORS COHEN, GROSSBERG, AND WAIBEL: Speech Perception and Production. PROFESSORS BARTO, GROSSBERG, AND MERRILL: Reinforcement Learning and Prediction. DR. HECHT-NIELSEN: Recent Developments in the Neurocomputer Industry. Evening Discussions with Tutors and Informal Presentations. DAY 5 COURSE SCHEDULE (May 10, 1991) DR. FAGGIN: VLSI Implementation of Neural Networks. END OF COURSE (at 1:30 PM). RESEARCH CONFERENCE NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING MAY 10-12, 1991 This international research conference on a topic at the cutting edge of science and technology will bring together leading experts in academe, government, and industry to present their results on vision and image processing in INVITED LECTURES and CONTRIBUTED POSTERS. Topics range from visual neurobiology and psychophysics through computational modelling to technological applications. CALL FOR PAPERS - VIP POSTER SESSION: A featured 3-hour poster session on neural network research related to vision and image processing will be held on May 11, 1991. Attendees who wish to present a poster should submit three copies of an abstract (one single-spaced page), postmarked by March 1, 1991, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1991. DAY 1 CONFERENCE PROGRAM (May 10, 1991, 5:00-7:30 PM) PROFESSOR JOHN DAUGMAN, CAMBRIDGE UNIVERSITY: "High-Confidence Personal Identification System Built from Quadrature Neural Filter" PROFESSOR DAVID CASASENT, CARNEGIE MELLON UNIVERSITY: "CMU Hybrid Optical/ Digital Neural Net for Scene Analysis" DR. ROBERT HECHT-NIELSEN, HNC,: "Neurocomputers for Image Analysis" DAY 2 CONFERENCE PROGRAM (May 11, 1991) PROFESSOR V.S. RAMACHANDRAN, UNIVERSITY OF CALIFORNIA, SAN DIEGO: "Interactions Between `Channels' Concerned with the Perception of Motion, Depth, Color, and Form" PROFESSOR STEPHEN GROSSBERG, BOSTON UNIVERSITY: "A Neural Network Architecture for 3-D Vision and Figure-Ground Separation" PROFESSOR ENNIO MINGOLLA, BOSTON UNIVERSITY: "A Neural Network Architecture for Visual Motion Segmentation" PROFESSOR GEORGE SPERLING, NEW YORK UNIVERSITY: "Two Systems of Visual Processing" DR. ROBERT DESIMONE, NATIONAL INSTITUTE OF MENTAL HEALTH: "Attentional Control of Visual Perception: Cortical and Subcortical Mechanisms" PROFESSOR GAIL CARPENTER, BOSTON UNIVERSITY: "Neural Network Architectures for Attentive Learning, Recognition, and Prediction" DR. RALPH LINSKER, IBM T.J. WATSON RESEARCH CENTER: "New Approaches to Network Learning and Optimization" PROFESSOR STUART ANSTIS, UNIVERSITY OF TORONTO: "My Recent Research on Motion Perception" POSTER SESSION DAY 3 CONFERENCE PROGRAM (May 12, 1991) PROFESSOR JACOB BECK, UNIVERSITY OF OREGON: "Preattentive Visual Processing" PROFESSOR JAMES TODD, BRANDEIS UNIVERSITY: "Neural Analysis of Motion" DR. ALLEN M. WAXMAN, MIT LINCOLN LAB: "Extraction" PROFESSOR ERIC SCHWARTZ, NEW YORK UNIVERSITY: "Biologically Motivated Machine Vision" PROFESSOR ALEX PENTLAND, MASSACHUSETTS INSTITUTE OF TECHNOLOGY: "The Optimal Observer: Design of a Dynamically-Responding Visual System" DISCUSSION END OF RESEARCH CONFERENCE (at 1 PM) CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. STUDENT REGISTRATION: A limited number of spaces at the course and conference have been reserved at a subsidized rate for full time students. These spaces will be assigned on a first-come, first-served basis. Completed registration form and payment for students who wish to be considered for the reduced student rate must be received by April 15, 1991. YOUR REGISTRATION FEE INCLUDES: COURSE CONFERENCE Five days of tutorials Admission to all invited lectures Course notebooks for all tutorials Admission to poster session All guest lectures One reception Sunday evening reception Two continental breakfasts Five continental breakfasts One lunch Five lunches One dinner Four dinners Daily morning/afternoon Daily morning/afternoon coffee coffee service service Evening discussion sessions with leading neural architects CANCELLATION POLICY: Course fee, less $100, and the research conference fee, less $60, will be refunded upon receipt of a written request postmarked before March 31, 1991. After this date no refund will be made. Registrants who do not attend and who do not cancel in writing before March 31, 1991 are liable for the full amount of the registration fee. You must obtain a cancellation number from our registrar in order to make the cancellation valid. HOW TO REGISTER: ADVANCE REGISTRATION: To register by telephone, call (508) 649-9731 with VISA or Mastercard between 8:00-5:00 PM (EST). To register by fax, complete and fax back the Registration Form to (508) 649-6926. To register by mail, complete the registration form and mail it with your full form of payment as directed. Make check payable in U.S. dollars to Boston University. ON-SITE REGISTRATION: Those who wish to register for the course and the research conference on-site may do so on a space-available basis. SITE: The Wang Institute of Boston University possesses excellent conference facilities in a beautiful 220-acre setting. It is easily reached from Boston's Logan Airport and Route 128. HOTEL RESERVATIONS: Sheraton Tara, Nashua, NH (603) 888-9970; Red Roof Inn, Nashua, NH (603) 888-1893; or Stonehedge Inn, Tyngsboro, MA, (508) 649-4342. The special conference rate applies only if you mention the name and dates of the meeting when making the reservation. The hotels in Nashua are located approximately five miles from the Wang Institute. Shuttle bus service will be provided. REGISTRATION FORM: COURSE - NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, May 5-10, 1991 RESEARCH CONFERENCE - NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, May 10-12, 1991 Name: ______________________________________________________________ Title: _____________________________________________________________ Organization: ______________________________________________________ Address: ___________________________________________________________ City: ____________________________ State: __________ Zip: __________ Telephone: _________________________________________________________ Course: Research Conference: ( ) regular attendee $985 ( ) regular attendee $95 ( ) full-time student $275* ( ) Full-time student $75* *limited number of spaces. Student registrations must be received by April 15, 1991. Total payment enclosed: ____________________________________________ Form of payment: ( ) Check or money order (payable in U.S. dollars to Boston University). ( ) VISA ( ) Mastercard #_______________________________________Exp. Date:__________________ Signature (as it appears on card): _________________________________ Return to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 Boston University's policies provide for equal opportunity and affirmative action in employment and admission to all programs of the University. From mike at park.bu.edu Tue Nov 20 18:11:35 1990 From: mike at park.bu.edu (mike@park.bu.edu) Date: Tue, 20 Nov 90 18:11:35 -0500 Subject: No subject Message-ID: <9011202311.AA01221@fenway.bu.edu> BOSTON UNIVERSITY A World Leader In Neural Network Research and Technology Presents Two Major Events on the Cutting Edge NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, MAY 5-10, 1991 A self-contained systematic course by leading neural architects. NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, MAY 10-12, 1991 An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. Special student registration rates are available. Sponsored by: Boston University's Wang Institute, Center for Adaptive Systems, and Graduate Program in Cognitive and Neural Systems, with partial support from the Air Force Office of Scientific Research. NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 5-10, 1991 This self-contained systematic five-day course is based on the graduate curriculum in the technology, computation, mathematics, and biology of neural networks developed at the Center for Adaptive Systems (CAS) and the graduate program in Cognitive and Neural Systems (CNS) of Boston University. The curriculum refines and updates the successful course held at the Wang Institute in May, 1990. The course will be taught by CAS/CNS faculty, as well as by distinguished guest lecturers at the beautiful and superbly equipped campus of the Wang Institute. An extraordinary range and depth of models, methods, and applications will be presented with ample opportunity for interaction with the lecturers and other participants at the daily discussion sections, meals, receptions, and breaks that are included with registration. At the 1990 Course, participants came from 20 countries and 35 states of the U.S. Boston University tutors are STEPHEN GROSSBERG, GAIL CARPENTER, ENNIO MINGOLLA, MICHAEL COHEN, DAN BULLOCK, AND JOHN MERRILL. Guest tutors are FEDERICO FAGGIN, ROBERT HECHT-NIELSEN, MICHAEL JORDAN, ANDY BARTO, AND ALEX WAIBEL. DAY 1 COURSE SCHEDULE (May 6, 1991) PROFESSOR GROSSBERG: Historical Overview, Cooperation and Competition, Content Addressable Memory, and Associative Learning. PROFESSORS CARPENTER, GROSSBERG, AND MINGOLLA: Associative Learning Continued, Neocognitron, Perceptrons, and Introduction to Back Propagation. PROFESSOR JORDAN: Recent Developments of Back Propagation. Evening Discussions with Tutors and Informal Presentations. DAY 2 COURSE SCHEDULE (May 7, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Adaptive Pattern Recognition. PROFESSORS CARPENTER AND GROSSBERG: Introduction to Adaptive Resonance, Theory and Analysis of ART 1. PROFESSOR CARPENTER: Analysis of ART 2, ART 3, Predictive ART, and Self-Organization of Invariant Pattern Recognition codes. Evening Discussions with Tutors and Informal Presentations. DAY 3 COURSE SCHEDULE (May 8, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Vision and Image Processing. PROFESSORS BULLOCK AND GROSSBERG: Adaptive Sensory-Motor Planning and Control. Evening Discussions with Tutors and Informal Presentations. DAY 4 COURSE SCHEDULE (May 9, 1991) PROFESSORS COHEN, GROSSBERG, AND WAIBEL: Speech Perception and Production. PROFESSORS BARTO, GROSSBERG, AND MERRILL: Reinforcement Learning and Prediction. DR. HECHT-NIELSEN: Recent Developments in the Neurocomputer Industry. Evening Discussions with Tutors and Informal Presentations. DAY 5 COURSE SCHEDULE (May 10, 1991) DR. FAGGIN: VLSI Implementation of Neural Networks. END OF COURSE (at 1:30 PM). RESEARCH CONFERENCE NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING MAY 10-12, 1991 This international research conference on a topic at the cutting edge of science and technology will bring together leading experts in academe, government, and industry to present their results on vision and image processing in INVITED LECTURES and CONTRIBUTED POSTERS. Topics range from visual neurobiology and psychophysics through computational modelling to technological applications. CALL FOR PAPERS - VIP POSTER SESSION: A featured 3-hour poster session on neural network research related to vision and image processing will be held on May 11, 1991. Attendees who wish to present a poster should submit three copies of an abstract (one single-spaced page), postmarked by March 1, 1991, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1991. DAY 1 CONFERENCE PROGRAM (May 10, 1991, 5:00-7:30 PM) PROFESSOR JOHN DAUGMAN, CAMBRIDGE UNIVERSITY: "High-Confidence Personal Identification System Built from Quadrature Neural Filter" PROFESSOR DAVID CASASENT, CARNEGIE MELLON UNIVERSITY: "CMU Hybrid Optical/ Digital Neural Net for Scene Analysis" DR. ROBERT HECHT-NIELSEN, HNC,: "Neurocomputers for Image Analysis" DAY 2 CONFERENCE PROGRAM (May 11, 1991) PROFESSOR V.S. RAMACHANDRAN, UNIVERSITY OF CALIFORNIA, SAN DIEGO: "Interactions Between `Channels' Concerned with the Perception of Motion, Depth, Color, and Form" PROFESSOR STEPHEN GROSSBERG, BOSTON UNIVERSITY: "A Neural Network Architecture for 3-D Vision and Figure-Ground Separation" PROFESSOR ENNIO MINGOLLA, BOSTON UNIVERSITY: "A Neural Network Architecture for Visual Motion Segmentation" PROFESSOR GEORGE SPERLING, NEW YORK UNIVERSITY: "Two Systems of Visual Processing" DR. ROBERT DESIMONE, NATIONAL INSTITUTE OF MENTAL HEALTH: "Attentional Control of Visual Perception: Cortical and Subcortical Mechanisms" PROFESSOR GAIL CARPENTER, BOSTON UNIVERSITY: "Neural Network Architectures for Attentive Learning, Recognition, and Prediction" DR. RALPH LINSKER, IBM T.J. WATSON RESEARCH CENTER: "New Approaches to Network Learning and Optimization" PROFESSOR STUART ANSTIS, UNIVERSITY OF TORONTO: "My Recent Research on Motion Perception" POSTER SESSION DAY 3 CONFERENCE PROGRAM (May 12, 1991) PROFESSOR JACOB BECK, UNIVERSITY OF OREGON: "Preattentive Visual Processing" PROFESSOR JAMES TODD, BRANDEIS UNIVERSITY: "Neural Analysis of Motion" DR. ALLEN M. WAXMAN, MIT LINCOLN LAB: "Extraction" PROFESSOR ERIC SCHWARTZ, NEW YORK UNIVERSITY: "Biologically Motivated Machine Vision" PROFESSOR ALEX PENTLAND, MASSACHUSETTS INSTITUTE OF TECHNOLOGY: "The Optimal Observer: Design of a Dynamically-Responding Visual System" DISCUSSION END OF RESEARCH CONFERENCE (at 1 PM) CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. STUDENT REGISTRATION: A limited number of spaces at the course and conference have been reserved at a subsidized rate for full time students. These spaces will be assigned on a first-come, first-served basis. Completed registration form and payment for students who wish to be considered for the reduced student rate must be received by April 15, 1991. YOUR REGISTRATION FEE INCLUDES: COURSE CONFERENCE Five days of tutorials Admission to all invited lectures Course notebooks for all tutorials Admission to poster session All guest lectures One reception Sunday evening reception Two continental breakfasts Five continental breakfasts One lunch Five lunches One dinner Four dinners Daily morning/afternoon Daily morning/afternoon coffee coffee service service Evening discussion sessions with leading neural architects CANCELLATION POLICY: Course fee, less $100, and the research conference fee, less $60, will be refunded upon receipt of a written request postmarked before March 31, 1991. After this date no refund will be made. Registrants who do not attend and who do not cancel in writing before March 31, 1991 are liable for the full amount of the registration fee. You must obtain a cancellation number from our registrar in order to make the cancellation valid. HOW TO REGISTER: ADVANCE REGISTRATION: To register by telephone, call (508) 649-9731 with VISA or Mastercard between 8:00-5:00 PM (EST). To register by fax, complete and fax back the Registration Form to (508) 649-6926. To register by mail, complete the registration form and mail it with your full form of payment as directed. Make check payable in U.S. dollars to Boston University. ON-SITE REGISTRATION: Those who wish to register for the course and the research conference on-site may do so on a space-available basis. SITE: The Wang Institute of Boston University possesses excellent conference facilities in a beautiful 220-acre setting. It is easily reached from Boston's Logan Airport and Route 128. HOTEL RESERVATIONS: Sheraton Tara, Nashua, NH (603) 888-9970; Red Roof Inn, Nashua, NH (603) 888-1893; or Stonehedge Inn, Tyngsboro, MA, (508) 649-4342. The special conference rate applies only if you mention the name and dates of the meeting when making the reservation. The hotels in Nashua are located approximately five miles from the Wang Institute. Shuttle bus service will be provided. REGISTRATION FORM: COURSE - NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, May 5-10, 1991 RESEARCH CONFERENCE - NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, May 10-12, 1991 Name: ______________________________________________________________ Title: _____________________________________________________________ Organization: ______________________________________________________ Address: ___________________________________________________________ City: ____________________________ State: __________ Zip: __________ Telephone: _________________________________________________________ Course: Research Conference: ( ) regular attendee $985 ( ) regular attendee $95 ( ) full-time student $275* ( ) Full-time student $75* *limited number of spaces. Student registrations must be received by April 15, 1991. Total payment enclosed: ____________________________________________ Form of payment: ( ) Check or money order (payable in U.S. dollars to Boston University). ( ) VISA ( ) Mastercard #_______________________________________Exp. Date:__________________ Signature (as it appears on card): _________________________________ Return to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 Boston University's policies provide for equal opportunity and affirmative action in employment and admission to all programs of the University. From birnbaum at fido.ils.nwu.edu Wed Nov 21 16:31:40 1990 From: birnbaum at fido.ils.nwu.edu (Lawrence Birnbaum) Date: Wed, 21 Nov 90 15:31:40 CST Subject: ML91 Call for papers: Eighth International Machine Learning Workshop Message-ID: <9011212131.AA00580@fido.ils.nwu.edu> ML91 The Eighth International Workshop on Machine Learning Call for Papers The organizing committee is please to announce that ML91 will include the following workshop topics: Automated Knowledge Acquisition Computational Models of Human Learning Learning Relations Machine Learning in Engineering Automation Learning to React/in Complex Environments Constructive Induction Learning in Intelligent Information Retrieval Learning from Theory and Data Papers must be submitted to one of these workshops for consideration. The provisional deadline for submission is February 1, 1991. Papers to appear in the Proceedings must fit in 4 pages, double column format. More details about the constituent workshops, including submission procedures, contact points, and reviewing committees, will be forthcoming shortly. ML91 will be held at Northwestern University, Evanston, Illinois, USA (just north of Chicago), June 27-29, 1991. On behalf of the organizing committee, Larry Birnbaum and Gregg Collins From len at mqcomp.mqcs.mq.oz.au Thu Nov 22 10:35:18 1990 From: len at mqcomp.mqcs.mq.oz.au (Len Hamey) Date: Thu, 22 Nov 90 10:35:18 EST Subject: Back-propogation Message-ID: <9011220035.AA05175@mqcomp.mqcs.mq.oz.au> In studying back-propogation, I find a degree of uncertainty as to exactly what the algorithm is. In particular, we all know and love the parameter ETA which specifies how big a step to take, but is the step taken simply ETA times the derivative of the error, or is the derivative vector normalised so that the step taken is ETA in weight space? I have experimented with both of these variations and observe that normalising the size of the step taken produces faster convergence on parity and XOR problems, but can also introduce oscillation. SUrprisingly, I have not been able to observe oscillatory behaviour when using un-normalised derivative steps. (I refer to oscillation of the solution across a narrow valley in the error surface). I assume then that proponents of momentum (which stems back at least to Rumelhart et al) have all been normalising the derivative vector to achieve fixed-size steps in weight space. Is this assumption correct? If not, then can somebody point me to a problem that generates oscillatory behaviour (and please indicate the value of ETA also). Len Hamey len at mqcomp.mqcs.mq.oz.au From yoshua at HOMER.MACH.CS.CMU.EDU Wed Nov 21 19:41:36 1990 From: yoshua at HOMER.MACH.CS.CMU.EDU (Yoshua BENGIO) Date: Wed, 21 Nov 90 19:41:36 EST Subject: learning a synaptic learning rule. TR available. Message-ID: <9011220041.AA13980@homer.cs.mcgill.ca> The following technical report is now available by ftp from neuroprose: Bengio Y. and Bengio S. (1990). Learning a synaptic learning rule. Technical Report #751, Universite de Montreal, Departement d'informatique et de recherche operationelle. Learning a synaptic learning rule Yoshua Bengio Samy Bengio McGill University, Universite de Montreal School of Computer Science, Departement d'informatique 3480 University street, et de recherche operationelle, Montreal, Qc, Canada, H3A 2A7 Montreal, Qc, Canada, H3C 3J7 yoshua at cs.mcgill.ca bengio at iro.umontreal.ca An original approach to neural modeling is presented, based on the idea of searching for and tuning, with learning methods, a synaptic learning rule which is biologically plausible, and yields networks capable to learn to perform difficult tasks. This method relies on the idea of considering the synaptic modification rule DeltaW() as a parametric function. This function has local inputs and is the same in many neurons. Its parameters can be estimated with known learning methods. For this optimization, we give particular attention to gradient descent and genetic algorithms. Estimation of these parameters consists of a joint global optimization of (a) the synaptic modification function, and (b) the networks that are learning to perform some tasks, using this function. We show how to compute the gradient of an optimization criteria with respect to the parameters of DeltaW(). Both network architecture and the learning function can be designed within constraints derived from biological knowledge. To avoid that DeltaW() be too specialized, this function is forced to be the same for a large number of synapses, in a population of networks learning to perform different tasks. To enforce efficiency constraints, some of these networks should learn complex mappings (as in pattern recognition). Others should learn to reproduce behavioral phenomena, such as associative conditioning, and neurological phenomena, such as habituation, recovery, dishabituation and sensitization. The architecture of the networks reproducing these biological phenomena can be designed based on well-studied circuits, such as those involved in associations in Aplysia, Hermissenda, or the rabbit eyelid closure response. Multiple synaptic modification functions allow for the diverse types of synapses (e.g. inhibitory, excitatory). Models of pre-, epi- and post-synaptic mechanisms can be used to bootstrap Delta W(), so that it initially consists of a combination of simpler modules, each emulating a particular synaptic mechanism. --------------------------------------------------------------------------- Copies of the postscript file bengio.learn.ps.Z may be obtained from the pub/neuroprose directory in cheops.cis.ohio-state.edu. Either use the Getps script or do this: unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Connected to cheops.cis.ohio-state.edu. Name (cheops.cis.ohio-state.edu:): anonymous 331 Guest login ok, sent ident as password. Password: neuron 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose ftp> binary ftp> get bengio.learn.ps.Z ftp> quit unix-2> uncompress bengio.learn.ps.Z unix-3> lpr -P(your_local_postscript_printer) bengio.learn.ps Or, order a hardcopy by sending your physical mail address to bengio at iro.umontreal.ca, mentioning Technical Report #751. Please do this only if you cannot use the ftp method described above. ---------------------------------------------------------------------------- From ackley at chatham.bellcore.com Wed Nov 21 14:39:37 1990 From: ackley at chatham.bellcore.com (David H Ackley) Date: Wed, 21 Nov 90 14:39:37 -0500 Subject: NIPS*90 workshop at Keystone, CO. 11/30/90 or 12/1/90 Message-ID: <9011211939.AA09300@chatham.bellcore.com> Genetic Algorithms, Neural Networks, and Artificial Life David H. Ackley Richard K. Belew Based on the principles of natural selection, "genetic algorithms" (GAs) are a class of adaptive techniques that use a population of structures to represent a set of potential solutions to some problem. Selective reproduction emphasizes "more fit" individuals and focuses the search process, while genetic operators modify the offspring to increase diversity and search broadly. Theoretical and empirical results highlight the importance of employing the "crossover" operator to exchange information between individuals. Such genetic recombination produces a global search strategy quite different from --- and in some ways complementary to --- the gradient-based techniques popular in neural network learning. We will survey the theory and practice of genetic algorithms, and then focus on the growing body of research efforts that combine genetic algorithms and neural networks. Brief presentations from researchers active in the field (including Richard Lippmann, David Stork, and Darrell Whitley) will set the stage for in-depth discussions of issues in the area, such as: * Comparison and composition of GA sampling and NNet searching * The advantages and costs of recombination operators * Parallel implementations of GAs * Appropriate representations for NNets with the GA * Roles for ontogeny between GA evolution and NNet learning During the course of the workshop we will gradually broaden our scope. As natural neurons provide inspiration for artificial neural networks, and natural selection provides inspiration for GAs, other aspects of natural life can provide inspirations for studies in "artificial life". We will sample recent "alife" research efforts, and conclude with a discussion of prospects and problems for this new, interdisciplinary field. From carol at ai.toronto.edu Thu Nov 22 10:15:39 1990 From: carol at ai.toronto.edu (Carol Plathan) Date: Thu, 22 Nov 1990 10:15:39 -0500 Subject: CRG-TR-90-6 and 90-7 requests Message-ID: <90Nov22.101545edt.434@neuron.ai.toronto.edu> PLEASE DO NOT FORWARD TO OTHER NEWSGROUPS OF MAILING LISTS ********************************************************** You may order the following two new technical reports by sending your physical mailing address to carol at ai.toronto.edu. Pls do not reply to the whole list. 1. CRG-TR-90-6: Speaker Normalization and Adaptation using Second-Order Connectionist Networks by Raymond L. Watrous 2. CRG-TR-90-7: Learning Stochastic Feedforward Networks by Radford M. Neal Abstracts follow: ------------------------------------------------------------------------------- This technical report is an extended version of the paper that was presented to the Acoustical Society of America in May, 1990. SPEAKER NORMALIZATION AND ADAPTATION USING SECOND-ORDER CONNECTIONIST NETWORKS Raymond L. Watrous Department of Computer Science, University of Toronto, Toronto, Canada M5S 1A4 CRG-TR-90-6 A method for speaker-adaptive classification of vowels using connectionist networks is developed. A normalized representation of the vowels is computed by a speaker-specific linear transformation of observations of the speech signal using second-order connectionist network units. Vowel classification is accomplished by a multilayer network which operates on the normalized speech data. The network is adapted for a new talker by modifying the transformation parameters while leaving the classifier fixed. This is accomplished by back-propagating classification error through the classifier to the second-order transformation units. This method was evaluated for the classification of ten vowels for 76 speakers using the first two formant values of the Peterson/Barney data. A classifier optimized on the normalized data led to a recognition accuracy of 93.2%. When adapted to each speaker from various initial transformation parameters, the accuracy improved to 96.6%. When the speaker-dependent transformation and nonlinear classifier were simultaneously optimized, a vowel recognition accuracy of as high as 97.5% was obtained. Speaker adaptation using this network also yielded an accuracy of 96.6%. The results suggest that rapid speaker adaptation resulting in high classification accuracy can be accomplished by this method. ------------------------------------------------------------------------------- LEARNING STOCHASTIC FEEDFORWARD NETWORKS Radford M. Neal Department of Computer Science, University of Toronto, Toronto, Canada M5S 1A4 CRG-TR-90-7 Connectionist learning procedures are presented for "sigmoid" and "noisy-OR" varieties of stochastic feedforward network. These networks are in the same class as the "belief networks" used in expert systems. They represent a probability distribution over a set of visible variables using hidden variables to express correlations. Conditional probability distributions can be exhibited by stochastic simulation for use in tasks such as classification. Learning from empirical data is done via a gradient-ascent method analogous to that used in Boltzmann machines, but due to the feedforward nature of the connections, the negative phase of Boltzmann machine learning is unnecessary. Experimental results show that, as a result, learning in a sigmoid feedforward network can be faster than in a Boltzmann machine. These networks have other advantages over Boltzmann machines in pattern classification and decision making applications, and provide a link between work on connectionist learning and work on the representation of expert knowledge. ------------------------------------------------------------------------------- From perham at nada.kth.se Fri Nov 23 06:43:59 1990 From: perham at nada.kth.se (Per Hammarlund) Date: Fri, 23 Nov 90 12:43:59 +0100 Subject: Performance of the biosim program. Message-ID: <9011231143.AA14396@nada.kth.se> We have been approached to give some details about the capabilities and performance of the biosim program. (A program that enables biologically realistic simulations of large neuronal networks on the Connection Machine from TMC. The program has been developed by the SANS group, Royal Institute of Technology; Technical Report TRITA-NA-P9021.) The biosim program is used, among other things, to simulate the swimming rhythm generator of the lamprey. The present model includes 100 segments each consisting of 6 neurons who in turn are built up of 4 compartments. The segments have 12 internal synapses and are interconnected by 4 synapses, giving a total of 2400 compartments and 1600 synapses. The simulation, as presented in the TR, uses a time step of 0.4 milliseconds. To simulate 2 seconds of real time with this setup takes about 11 minutes or one time step in 0.135 seconds on our 8K machine. The system is thus running at a speed of about 0.3% of real time. The time step can often be doubled without significant changes in simulation results. Obviously, changing the time step directly effects the simulation time. It should probably also be mentioned that the program does not put any limitations on the number of compartments nor synapses per cell. These numbers are set individually for every cell. Due to the SIMD architecture of the Connection Machine, what is simulated is actually at least 8192 compartments and equally many synapses, since that is the number of processing elements in our machine. It is therefore possible to simulate either a three times larger model of the lamprey or three lampreys with the above specifications simultaneously without ANY change in execution time. It also turns out that up to 65536 compartments and synapses, i.e. 27 lampreys of the above type, can be simulated in roughly four times the time. This is in some sense equivalent to simulating one lamprey at about 2% of real time. For a full size Connection Machine with 64K processing elements it is possible to simulate roughly eight times larger models in the same time, i.e. using the same reasoning you get 16% of real time. All of the above shows that it is difficult to state any single general nice and easy to grasp number regarding the relation to real time but we hope that this has explained the issue. Why would anyone want to simulate the same thing 27 or 216 times in parallel? An excellent use of such an ability is to try out different settings of a number of parameters in one shot. A more obvious use of the biosim program is to run a single large model. For instance the number of cells in each segment of the lamprey model can be increased giving even better correspondence with reality. Bjorn Levin, Per Hammarlund, Anders Lansner blevin at bion.kth.se, perham at nada.kth.se, ala at bion.kth.se SANS-NADA Royal Institute of Technology S-100 44 Stockholm Sweden From burr at mojave.stanford.edu Sat Nov 24 00:26:13 1990 From: burr at mojave.stanford.edu (Jim Burr) Date: Fri, 23 Nov 90 21:26:13 PST Subject: NIPS90 VLSI workshop Message-ID: <9011240526.AA05352@mojave.Stanford.EDU> papers/neuro/nips90/agenda.t To: everyone I've contacted about the NIPS90 VLSI workshop, thanks for your help! It's shaping up to be a great session. Special thanks to those who have volunteered to give presentations. Workshop 8. on VLSI Neural Networks is being held Saturday, Dec 1 at Keystone. Related workshops are workshop 7. on implementations of neural networks on digital, massively parallel computers, and workshop 9. on optical implementations. Abstract: 8. VLSI Neural Networks Jim Burr Stanford University Stanford, CA 94305 (415) 723-4087 burr at mojave.stanford.edu This one day workshop will address the latest advances in VLSI implementations of neural nets. How successful have implementations been so far? Are dedicated neurochips being used in real applications? What algorithms have been implemented? Which ones have not been? Why not? How important is on chip learning? How much arithmetic precision is necessary? Which is more important, capacity or performance? What are the issues in constructing very large networks? What are the technology scaling limits? Any new technology developments? Several invited speakers will address these and other questions from various points of view in discussing their current research. We will try to gain better insight into the strengths and limitations of dedicated hardware solutions. Agenda: morning: 1. review of new chips capacity performance power learning architecture 2. guidelines for reporting results - recommendation at evening session specify technology, performance, power if possible translate power into joules/connection or joules/update 3. analog vs digital - the debate goes on 4. on-chip learning - who needs it 5. locality - who needs it (Boltzmann vs backprop) 6. precision - how much 7. leveraging tech scaling evening: 1. large networks - how big memory power Here are some of the issues we will discuss during the workshop: - What is the digital/analog tradeoff for storing weights? - What is the digital/analog tradeoff for doing inner products? - What is the digital/analog tradeoff for multichip systems? - Is on-chip learning necessary? - How important is locality? - How much precision is needed in a digital system? - What capabilities can we expect in 2 years? 5 years? - What are the biggest obstacles to implementing LARGE networks? Capacity? Performance? Power? Connectivity? presenters: Kristina Johnson, UC Boulder electro-optical networks Josh Alspector, Bellcore analog Boltzmann machines Andy Moore, Caltech subthreshold (with Video!) Edi Saeckinger, ATT NET32K Tom Baker, Adaptive Solutions precision Hal McCartor, Adaptive Solutions the X1 Chip presenters: please consider mentioning the following: - technology (eg 2.0 micron CMOS, 0.8 micron GaAs) - capacity in connections and neurons - performance in connections per second - energy per connection (power dissipation) - on-chip learning? (Updates per second) - scalability? How large a network? - a few words on tools See you at the workshop! Jim. From LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU Thu Nov 22 21:35:00 1990 From: LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU (LAI-WAN CHAN) Date: Fri, 23 Nov 90 10:35 +0800 Subject: Back-propogation Message-ID: > From: IN%"len at mqcomp.mqcs.mq.oz.au" "Len Hamey" 23-NOV-1990 09:39:15.77 > In studying back-propogation, I find a degree of uncertainty as to exactly > what the algorithm is. In particular, we all know and love the parameter > ETA which specifies how big a step to take, but is the step taken > simply ETA times the derivative of the error, or is the derivative vector > normalised so that the step taken is ETA in weight space? I have The standard back propagation algorithm uses the gradient descent method in which the step size is ETA times the gradient. The addition of the momentum term reduces some oscillations. However, the convergence depends very much on the step size (i.e. ETA and momentum). A small step size gives a smooth but slow learning path and a large step size gives an oscillatory path. The oscillatory path sometimes has a faster convergence but too much oscillation is hazardous to your nets !! The two updating methods that you mentioned were effectively adopting two sets of step sizes and this is why the network show different convergence speed. In fact, the optimal values of ETA (gradient term) and ALPHA (momentum term) depend on the training patterns and the size of the network. Finding these values are an headache. There are methods that automatically adapt the step size so as to speed up the training. I have worked on the adaptive learning [1] which I found it very useful and efficient. Other commonly used training methods which show faster convergence include the delta-bar-delta method [2] and the conjugate gradient method (used in optimisation). A comparison of these methods can be found in [3]. Reference : [1] L-W. Chan & F. Fallside, "An adaptive training algorithm for back propagation networks", Computer Speech and Language, 2, 1987, p205-218. [2] R.A. Jacobs, "Increased rates of convergence through learning rate adaptation", Neural networks, Vol 1, 1988, p295-307. [3] L-W. Chan, "Efficacy of different learning algorithms of the back propagation network", Proceeding of the IEEE Region 10 Conference on Computer and Communication Systems (TENCON'90), 1990, Vol 1, p23-27. Lai-Wan Chan, Computer Science Dept, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong. email : lwchan at cucsd.cuhk.hk (bitnet) tel : (+852) 695 2795 FAX : (+852) 603 5024 From WWILBERG at WEIZMANN.BITNET Mon Nov 26 03:28:19 1990 From: WWILBERG at WEIZMANN.BITNET (Asher Merhav) Date: Mon, 26 Nov 90 10:28:19 +0200 Subject: add new member Message-ID: <57C5DA0DF860012D@BITNET.CC.CMU.EDU> Dear mail distributor, Please add my name to your mailing list. I would very much appreciate you confirmation to this massage. Sincerely, Asher Merhav From bharucha at eleazar.dartmouth.edu Tue Nov 27 11:21:00 1990 From: bharucha at eleazar.dartmouth.edu (Jamshed Bharucha) Date: Tue, 27 Nov 90 11:21:00 -0500 Subject: Cognitive Neuroscience Message-ID: <9011271621.AA08894@eleazar.dartmouth.edu> The 1991 James S. McDonnell Foundation Summer Institute in Cognitive Neuroscience The 1991 Summer Institute will be held at Dartmouth College, July 1-12. The two week course will examine how information about the brain bears on issues in cognitive science, and how approaches in cognitive science apply to neuroscience research. A distinguished international faculty will lecture on current topics in attention and emotion, including neurological and psychiatric disorders; laboratories, demonstrations and videotapes will offer practical experience with cognitive neuropsychology experiments, connectionist and computational modeling, and neuroanatomy. At every stage, the relationship between cognitive issues and underlying neural circuits will be explored. The Institute directors will be Michael I. Posner, David L. LaBerge, Joseph E. LeDoux, Michael S. Gazzaniga, and Gordon M. Shepherd. The Foundation is providing limited support for travel expenses and room/board. Applications are invited from beginning and established researchers, and must be received by January 11, 1991. For further information contact P. Reuter-Lorenz (PARL at mac.dartmouth .edu) or write to: McDonnell Summer Institute in Cognitive Neuroscience HB 7915-A Dartmouth Medical School Hanover, NH 03756 ______________________________________________________ APPLICATION FORM 1991 JAMES S. McDONNELL FOUNDATION SUMMER INSTITUTE IN COGNITIVE NEUROSCIENCE NAME_________________________________________________ INSTITUTIONAL AFFILIATION________________________________________ RESEARCH INTERESTS________________________________________ POSITION______________________________________________ HOME ADDRESS_______________________________________________ ______________________________________________________ WORK ADDRESS_______________________________________________ ______________________________________________________ E-MAIL ADDRESS______________________________________________ TELEPHONES: WORK: ( )______________ HOME: ( )______________ Housing expenses and some meal costs will be covered by the Foundation. There will also be limited travel support available. Please indicate the percent of your need for this support: _______% APPLICATION DEADLINE: All materials must be recevied by JANUARY 11, 1991 NOTIFICATION OF ACCEPTANCE: MARCH 10, 1991 PLEASE SEND THIS FORM, TOGETHER WITH: 1. A one-page statement explaining why you wish to attend. 2. A curriculum vitae. 3. Two letters of recommendation. TO: McDonnell Summer Institute HB 7915-A Dartmouth Medical School Hanover, New Hampshire 03756 From faramarz at ecn.purdue.edu Tue Nov 27 19:32:39 1990 From: faramarz at ecn.purdue.edu (Faramarz Valafar) Date: Tue, 27 Nov 90 19:32:39 -0500 Subject: Network Message-ID: <9011280032.AA13943@fourier.ecn.purdue.edu> Please put me on the mail list for the connectionist mail network. Thank you FARAMARZ VALAFAR From dyer at CS.UCLA.EDU Wed Nov 28 00:29:39 1990 From: dyer at CS.UCLA.EDU (Dr Michael G Dyer) Date: Tue, 27 Nov 90 21:29:39 PST Subject: tech. rep. available on evolving trail-following organisms Message-ID: <901128.052939z.14327.dyer@lanai.cs.ucla.edu> Evolution as a Theme in Artificial Life: The Genesys/Tracker System* David Jefferson, Robert Collins, Claus Cooper Michael Dyer, Margot Flowers, Richard Korf Charles Taylor, Alan Wang Abstract Direct, fine-grained simulation is a promising way of investigating and modeling natural evolution. In this paper we show how we can model a population of organisms as a population of computer programs, and how the evolutionarily significant activities of organisms (birth, interaction with the environment, migration, sexual reproduction with genetic mutation and recombination, and death) can all be represented by corresponding operations on programs. We illustrate these ideas in a system built for the Connection Machine called Genesys/Tracker, in which artificial "ants" evolve the ability to perform a complex task. In less than 100 generations a population of 64K "random" ants, represented either as finite state automata or as artificial neural nets, evolve the ability to traverse a winding broken "trail" in a rectilinear grid environment. Throughout this study we pay special attention to methodological issues, such as the avoidance of representational artifacts, and to biological verisimilitude. * To appear in J. D. Farmer, C. Langton, S. Rasmussen and C. Taylor (Eds.), Artificial Life II, Addison-Wesley, in press. For a copy of this tech. rep., please send e-mail to: valerie at cs.ucla.edu requesting "Jefferson et al. -- Evolution as Theme in ALife" Be sure to include your USA mail address. From SCHWARTZ at PURCCVM.BITNET Wed Nov 28 11:39:00 1990 From: SCHWARTZ at PURCCVM.BITNET (SCHWARTZ) Date: Wed, 28 Nov 90 11:39 EST Subject: Please add me to the mailing list Message-ID: <2E58E50AF8600D38@BITNET.CC.CMU.EDU> Please add my name Richard G. Schwartz (schwartz at purccvm.bitnet) to the mailing list. Thank you. From menon at cs.utexas.edu Wed Nov 28 13:34:21 1990 From: menon at cs.utexas.edu (menon@cs.utexas.edu) Date: Wed, 28 Nov 90 12:34:21 cst Subject: Dissertation abstract & Technical report Message-ID: <9011281834.AA08555@jaguar.cs.utexas.edu> Technical report announcement: Dynamic Aspects of Signaling in Distributed Neural Systems Vinod Menon Dept. of Computer Sciences Univ. of Texas at Austin Austin, TX 78712 ABSTRACT A distributed neural system consists of localized populations of neurons -- neuronal groups -- linked by massive reciprocal connections. Signaling between neuronal groups forms the basis of functioning of such a system. In this thesis, fundamental aspects of signaling are investigated mathematically with particular emphasis on the architecture and temporal self-organizing features of distributed neural systems. Coherent population oscillations, driven by exogenous and endogenous events, serve as autonomous timing mechanisms and are the basis of one possible mechanism of signaling. The theoretical analysis has, therefore, concentrated on a detailed study of the origin and frequency-amplitude-phase characteristics of the oscillations and the emergent features of inter-group reentrant signaling. It is shown that a phase shift between the excitatory and inhibitory components of the interacting intra-neuronal-group signals underlies the generation of oscillations. Such a phase shift is readily induced by delayed inhibition or slowly decaying inhibition. Theoretical analysis shows that a large dynamic frequency-amplitude range is possible by varying the time course of the inhibitory signal. Reentrant signaling between two groups is shown to give rise to synchronization, desynchronization, and resynchronization (with a large jump in frequency and phase difference) of the oscillatory activity as the latency of the reentrant signal is varied. We propose that this phenomenon represents a correlation dependent non-Boolean switching mechanism. A study of triadic neuronal group interactions reveals topological effects -- the existence of stabilizing (closed loop) and destabilizing (open loop) circuits. The analysis indicates (1) the metastable nature of signaling, (2) the existence of time windows in which correlated and uncorrelated activity can take place, and (3) dynamic frequency-amplitude-phase modulation of oscillations. By varying the latencies, and hence the relative phases of the reentrant signals, it is possible to dynamically and selectively modulate the cross-correlation between coactive neuronal groups in a manner that reflects the mapping topology as well as the intrinsic neuronal circuit properties. These mechanisms, we argue, provide dynamic linkage between neuronal groups thereby enabling the distributed neural system to operate in a highly parallel manner without clocks, algorithms, and central control. To obtain a copy of the technical report TR-90-36 please send $5 in US bank check or international money order payable to "The University of Texas" at the following address: Technical Report Center Department of Computer Sciences University of Texas at Austin Taylor Hall 2.124 Austin, TX 78712-1188 USA ------------------------------------------------------------------------- Dept. of Computer Sciences email: menon at cs.utexas.edu University of Texas at Austin tel: 512-343-8033 Austin, TX 78712 -471-9572 ------------------------------------------------------------------------- From Alexis_Manaster_Ramer%Wayne-MTS at um.cc.umich.edu Wed Nov 28 11:18:58 1990 From: Alexis_Manaster_Ramer%Wayne-MTS at um.cc.umich.edu (Alexis_Manaster_Ramer%Wayne-MTS@um.cc.umich.edu) Date: Wed, 28 Nov 90 11:18:58 EST Subject: No subject Message-ID: <273352@Wayne-MTS> What is the address for business correspondence? From SCHNEIDER at vms.cis.pitt.edu Wed Nov 28 15:08:00 1990 From: SCHNEIDER at vms.cis.pitt.edu (SCHNEIDER@vms.cis.pitt.edu) Date: Wed, 28 Nov 90 16:08 EDT Subject: Neural Processes in Cognition Training Program Message-ID: Program announcement for Interdisciplinary Graduate and Postdoctoral Training in Neural Processes in Cognition at the University of Pittsburgh and Carnegie Mellon University The National Science Foundation has newly established an innovative program for students investigating the neurobiology of cognition. The program's focus is the interpretation of cognitive functions in terms of neuroanatomical and neurophysiological data and computer simulations. Such functions include perceiving, attending, learning, planning, and remembering in humans and in animals. A carefully designed program of study prepares each student to perform original research investigating cortical function at multiple levels of analysis. State of the art facilities include: computerized microscopy, human and animal electrophysiological instrumentation, behavioral assessment laboratories, brain scanners, the Pittsburgh Supercomputing Center, and a regional medical center providing access to human clinical populations. This is a joint program between the University of Pittsburgh, its School of Medicine, and Carnegie Mellon University. Each student receives full financial support, travel allowances and a computer workstation. Applications are encouraged from students with interest in biology, psychology, engineering, physics, mathematics, or computer science. Pittsburgh is one of America's most exciting and affordable cities, offering outstanding symphony, theater, professional sports, and outdoor recreation in the surrounding Allegheny mountains. More than ten thousand graduate students attend its universities. Core Faculty and interests and affiliation Carnegie Mellon University -Psychology- James McClelland, Johnathan Cohen, Martha Farah, Mark Johnson University of Pittsburgh Behavioral Neuroscience - Michael Ariel, Theodore Berger Biology - Teresa Chay Information Science - Paul Munro Neurobiology Anatomy and Cell Sciences - Al Humphrey, Jennifer Lund Neurological Surgery - Don Krieger, Robert Sclabassi Neurology Steven Small Psychiatry - David Lewis, Lisa Morrow, Stuart Steinhauer Psychology - Walter Schneider, Velma Dobson Physiology - Dan Simons Applications: To apply to the program contact the program office or one of the affiliated departments. Students are admitted jointly to a home department and the Neural Processes in Cognition Program. The application deadline is February 1. Students must be accepted both by an affiliated department and this program. For information contact: Professor Walter Schneider Program Director Neural Processes in Cognition University of Pittsburgh 3939 O'Hara St Pittsburgh, PA 15260 Or: call 412-624-7064 or Email to NEUROCOG at PITTVMS.BITNET. From BOHANNON%BUTLERU.BITNET at VMA.CC.CMU.EDU Thu Nov 29 11:59:00 1990 From: BOHANNON%BUTLERU.BITNET at VMA.CC.CMU.EDU (BOHANNON%BUTLERU.BITNET@VMA.CC.CMU.EDU) Date: Thu, 29 Nov 90 11:59 EST Subject: job opening - Rank open Message-ID: Applied Cognition - Rank Open: The Department of Psychology at Butler University is seeking nominations/applications for a tenure track opening to start August, 1991. We are seeking faculty who would be knowledgeable in distributed processing systems and their applications. Specific area is less important than evidence of excellence in both teaching and research. Salaries are negotiable and competitive. Responsibilities include: maintain an active research program with undergraduates, teach an undergraduate course in Human Factors and other courses in the candidate's area of interest. Minimum Qualifications: Ph.D. in psychology at time of appointment. For junior nominees/applicants: potential excellence in both research and teaching. Teaching experience preferred. For senior nominees/applicants: established record of excellence in research, more than 3 years of teaching experience, and an interest and/or experience in attracting external funding. Butler University is a selective, private university located on a 400+ acre campus in the residential heart of Indianapolis. The psychology department is housed in recently renovated space with state-of-the-art, video/social interaction and computer-cognition labs (Macintosh II and up). Screening of nominees/applicants will continue until suitable faculty are found. Those interested should send a statement of research and teaching interests, vita, and three original letters of reference to: John Neil Bohannon III, Head, Department of Psychology, Butler University, 4600 Sunset Ave., Indianapolis, IN 46208. Bitnet : Bohannon@ ButlerU. AA/EEO From hwang at uw-isdl.ee.washington.edu Thu Nov 29 19:39:07 1990 From: hwang at uw-isdl.ee.washington.edu ( J. N. Hwang) Date: Thu, 29 Nov 90 16:39:07 PST Subject: call for papers Message-ID: <9011300039.AA12518@uw-isdl.ee.washington.edu.isdlyp> IJCNN'91 SINGAPORE, CALL FOR PAPERS CONFERENCE: The IEEE Neural Network Council and the international neural network society (INNS) invite all persons interested in the field of Neural Networks to submit FULL PAPERS for possible presentation at the conference. FULL PAPERS: must be received by May 31, 1991. All submissions will be acknowledged by mail. Authors should submit their work via Air Mail or Express Courier so as to ensure timely arrival. Papers will be reviewed by senior researchers in the field, and all papers accepted will be published in full in the conference proceedings. The conference hosts tutorials on Nov. 18 and tours arranged probably on Nov. 17 and Nov. 22, 1991. Conference sess- ions will be held from Nov. 19-21, 1991. Proposals for tutorial speakers & topics should be submitted to Professor Toshio Fukuda (address below) by Nov. 15, 1990. TOPICS OF INTEREST: original, basic and applied papers in all areas of neural networks & their applications are being solicited. FULL PAPERS may be submitted for consideration as oral or poster pres- entation in (but not limited to) the following sessions: -- Associative Memory -- Sensation & Perception -- Electrical Neurocomputers -- Sensormotor Control System -- Image Processing -- Supervised Learning -- Invertebrate Neural Networks -- Unsupervised Learning -- Machine Vision -- Neuro-Physiology -- Neurocognition -- Hybrid Systems (AI, Neural -- Neuro-Dynamics Networks, Fuzzy Systems) -- Optical Neurocomputers -- Mathematical Methods -- Optimization -- Applications -- Robotics AUTHORS' SCHEDULE: Deadline for submission of FULL PAPERS (camera ready) May 31, 1991 Notification of acceptance AUg. 31, 1991 SUBMISSION GUIDELINES: Eight copies (One original and seven photocopies) are required for submission. Do not fold or staple the original, camera ready copy. Papers of no more than 6 pages, including figures, tables and references, should be written in English and only complete papers will be considered. Papers must be submitted camera-ready on 8 1/2" x 11" white bond paper with 1" margins on all four sides. They should be prepared by typewriter or letter quality printer in one-column format, single-spaced or similar type of 10 points or larger and should be printed on one side of the paper only. FAX submissions are not acceptable. Centered at the top of the first page should be the complete title, author name(s), affiliation(s) and mailing address(es). This is followed by a blank space and then the abstract, up to 15 lines, followed by the text. In an accompanying letter, the following must be included: -- Corresponding author: -- Presentation preferred: Name Oral Mailing Address Poster Telephone & FAX number -- Technical Session: -- Presenter: 1st Choice Name 2nd Choice Mailing Address Telephone & FAX number FOR SUBMISSION FROM JAPAN, SEND TO: Professor Toshio Fukuda Programme Chairman IJCNN'91 SINGAPORE Dept. of Mechanical Engineering Nagoya University, Furo-cho, Chikusa-Ku Nagoya 464-01 Japan. (FAX: 81-52-781-9243) FOR SUBMISSION FROM USA, SEND TO: Ms Nomi Feldman Meeting Management 5565 Oberlin Drive, Suite 110 San Diego, CA 92121 (FAX: 81-52-781-9243) FOR SUBMISSION FROM REST OF THE WORLD, SEND TO: Dr. Teck-Seng, Low IJCNN'91 SINGAPORE Communication Intl Associates Pte Ltd 44/46 Tanjong Pagar Road Singapore 0208 (TEL: (65) 226-2838, FAX: (65) 226-2877, (65) 221-8916) From fodslett at daimi.aau.dk Fri Nov 30 13:48:14 1990 From: fodslett at daimi.aau.dk (Martin Moller) Date: Fri, 30 Nov 90 19:48:14 +0100 Subject: Preprint about scaled conjugate gradient available. Message-ID: <9011301848.AA22842@daimi.aau.dk> ************************************************************* ******************** PREPRINT announcement: **************** A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning. Martin F. Moller Computer Science Dept. University of Aarhus Denmark e-mail: fodslett at daimi.aau.dk Abstract-- A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. SCG uses second order information from the neural network, but requires only O(N) memory usage. The performance of SCG is benchmarked against the performance of the standard backpropagation algorithm (BP) and several recently proposed standard conjugate gradient algorithms. SCG yields a speed-up at least an order of magnitude relative to BP. The speed-up depends on the convergence criterion,i.e., the bigger demand for reduction in error the bigger the speed-up. SCG is fully automated including no user dependent parameters and avoids a time consuming line search, which other conjugate gradient algorithms use in order to determine a good step size. Incorporating problem dependent structural information in the architecture of a neural network often lowers the overall complexity. The smaller the complexity of the neural network relative to the problem domain, the bigger the possibility that the weights space contains long ravines characterized by sharp curvature. While BP is inefficient on these ravine phenomena, SCG handles them effectively. The paper is available by ftp in the neuroprose directory under the name: moller.conjugate-gradient.ps.Z Any question or comments on this short writing or on the preprint would be very much appriciated. Martin M. From caroly at park.bu.EDU Fri Nov 30 15:13:20 1990 From: caroly at park.bu.EDU (caroly@park.bu.EDU) Date: Fri, 30 Nov 90 15:13:20 -0500 Subject: graduate study in neural networks Message-ID: <9011302013.AA00760@bucasb.bu.edu> (please post) *********************************************** * * * GRADUATE PROGRAM IN * * COGNITIVE AND NEURAL SYSTEMS (CNS) * * AT BOSTON UNIVERSITY * * * *********************************************** Gail A.Carpenter & Stephen Grossberg, Co-Directors The Boston University graduate program in Cognitive and Neural Systems offers comprehensive advanced training in the neural and computational principles, mechanisms, and architectures that underly human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Applications for Fall, 1991 admissions and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write or telephone: Cognitive & Neural Systems Program Boston University 111 Cummington Street, Room 240 Boston, MA 02215 (617) 353-9481 or send a mailing address to: caroly at park.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores may decrease an applicant's chances for admission and financial aid. Description of the CNS Program: The Cognitive and Neural Systems (CNS) Program provides advanced training and research experience for graduate students interested in the neural and computational principles, mechanisms, and architectures that underly human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Students are trained in a broad range of areas concerning cognitive and neural systems, including vision and image processing; speech and language understanding; adaptive pattern recognition; associative learning and long-term memory; cognitive information processing; self-organization; cooperative and competitive network dynamics and short-term memory; reinforcement, motivation, and attention; adaptive sensory-motor control and robotics; and biological rhythms; as well as the mathematical and computational methods needed to support advanced modeling research and applications. The CNS Program awards MA, PhD, and BA/MA degrees. The CNS Program embodies a number of unique features. Its core curriculum consists of eight interdisciplinary graduate courses each of which integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of neural networks to technology. Each course is taught once a week in the evening to make the program available to qualified students, including working professionals, throughout the Boston area. Students develop a coherent area of expertise by designing a program that includes courses in areas such as Biology, Computer Science, Engineering, Mathematics, and Psychology, in addition to courses in the CNS core curriculum. The CNS Program prepares Ph.D. students for thesis research with scientists in one of several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The unit most closely linked to the Program is the Center for Adaptive Systems. The Center for Adaptive Systems is also part of the Boston Consortium for Behavioral and Neural Studies, a Boston-area multi-institutional Congressional Center of Excellence. Another multi-institutional Congressional Center of Excellence focussed at Boston University is the Center for the Study of Rhythmic Processes. Other research resources include distinguished research groups in dynamical systems within the mathematics department; in theoretical computer science within the Computer Science Department; in biophysics and computational physics within the Physics Department; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the Engineering School; and in neurophysiology, neuroanatomy, and neuropharmacology at the Medical School. From slehar at park.bu.edu Thu Nov 1 10:46:48 1990 From: slehar at park.bu.edu (slehar@park.bu.edu) Date: Thu, 1 Nov 90 10:46:48 -0500 Subject: splitting hairs In-Reply-To: connectionists@c.cs.cmu.edu's message of 1 Nov 90 03:16:42 GM Message-ID: <9011011546.AA01594@bucasd.bu.edu> HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA !!!!!!! Excellent stuff! Great reading! From finton at cs.wisc.edu Thu Nov 1 17:06:50 1990 From: finton at cs.wisc.edu (David J. Finton) Date: Thu, 1 Nov 90 16:06:50 -0600 Subject: CRG-TR-90-5 request Message-ID: <9011012206.AA02232@ai.cs.wisc.edu> finton at cs.wisc.edu From dlovell at s1.elec.uq.oz.au Fri Nov 2 10:42:40 1990 From: dlovell at s1.elec.uq.oz.au (David Lovell) Date: Fri, 2 Nov 90 10:42:40 EST Subject: Rejection not misclassification Message-ID: <9011020042.AA09177@c17.elec.uq.oz.au> Dear Net, I have just been talking with my PhD supervisor and he has pointed out the value of a neural net classifier which will reject patterns that are "too different" from the exemplars used to train the net. For example, if a neural net is being used to classify substances as being toxic or non-toxic then it is much more desirable for the net to reject a poorly recognized sample rather than possibly classify it incorrectly. Seized by a frenzy of motivation, I have decided that the international neural net research community might be able to point out some references on the rejection of poor pattern matches. Naturally, I will compile a list of all the references I receive and send it out to all subscribers of this mailgroup. Thanks in advance, David Lovell (dlovell at s1.elec.uq.oz.au) Department of Electrical Engineering University of Queensland QUEENSLAND 4072 Australia From Mail_System%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU Thu Nov 1 19:05:00 1990 From: Mail_System%VMS.BRIGHTON.AC.UK at VMA.CC.CMU.EDU (Mail_System%VMS.BRIGHTON.AC.UK@VMA.CC.CMU.EDU) Date: Thu, 1 Nov 90 19:05 BST Subject: %% Undelivered Mail %% Message-ID: Your mail was not delivered as follows: %MAIL-E-SENDERR, error sending to user MRE1 %MAIL-E-OPENOUT, error opening !AS as output -RMS-E-CRE, ACP file create failed -SYSTEM-F-EXDISKQUOTA, disk quota exceeded %MAIL-E-SENDERR, error sending to user MRE1 -MAIL-E-OPENOUT, error opening !AS as output -RMS-E-CRE, ACP file create failed -SYSTEM-F-EXDISKQUOTA, disk quota exceeded Your original mail header and message follow. %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% From kawahara at siva.ntt.jp Sat Nov 3 00:12:30 1990 From: kawahara at siva.ntt.jp (kawahara@siva.ntt.jp) Date: Sat, 03 Nov 90 00:12:30 JST Subject: JNNS'90 program and the mailing list (long) Message-ID: <9011021512.AA03947@siva.ntt.jp> I have finally compiled a list of titles presented at the first annual conference of Japan Neural Network Society. I hope this will give a general aspect of neural network research activities in Japan. If you have any questions, please contact neuro-admin at tut.ac.jp, which is the administrating group of Japanese neural network researcher's mailing list. The mailing list is still its infancy, and the traffic is still very low. I hope this will change in the near future. Hideki Kawahara NTT Basic Research Laboratories kawahara%siva.ntt.jp at RELAY.CS.NET ---------- cut here ---------- JNNS'90 The first annual conference of the Japan Neural Network Society Tamagawa University, Tokyo Japan 10-12 September, 1990 Presentation titles: (Original titles were Japanese. These titles were translated into English by the original authors. Some of them, indicated by '**', were translated by the editor of this list. 'O' stands for oral presentation and 'P' stands for poster presentation.) O1-1, A-BP "Another type of back propagation learning", Kazuhisa Niki (Electro technical Lab.) O1-2, Learning of Affine Invariance by Competitive Neural Network, Suichi Kurogi (Division of Control Engineering, Kyushu Institute of Technology) O1-3, ** An Optimal Network Size Selection for Generalization based on Cross Validation, Yasuhiro Wada and Mitsuo Kawato (ATR) O1-4, Generalizing Neural Networks using Mean Curvature, Shin Suzuki and Hideki Kawahara (NTT Basic Research Labs.) O1-5, Evolution of Artificial Animal having Perceptron by Genetic Algorithm, Masanori Ichinose and Tsutomu Hoshino (Institute of Engineering Mechanics, University of Tsukuba) O2-1, Neural Network Model of Self-Organization of Walking Patterns in Insects, Shinichi Kimura, Masafumi Yano and Hiroshi Shimizu(Faculty of Pharmaceutical Science University of Tokyo) O2-2, Learning Trajectory and Force Control of Human Arm Using Feedback-Error-Learning Scheme, Masazumi Katayama and Mitsuo Kawato(ATR Auditory and Visual Perception Research Laboratories) O2-3, Formation of Optimal Hand Configuration to grasp an Object, Naohiro Fukumura, Yoji Uno and Ryoji Suzuki(Department of Mathematical Engineering and Information Physics, Faculty of Engineering, University of Tokyo) O2-4, Hybrid Control of Robotic Manipulator by Neural Network Model (Variable learning of neural networks by Fuzzy set theory), Takanori Shibata and Toshio Fukuda (Nagoya University) Masatoshi Tokita and Toyokazu Mitsuda (Kisarazu Technical College) O2-5, An Overview of Neuofuzzy System, Akira KAWAMURA, Nobuo WATANABE, Yuri Owada, Ryusuke Masuoka and Kazuo Asakawa (Computer-based Systems Laboratory, Fujutsu Laboratories Ltd.) O3-1, ** Cognitive Effects caused by Random Movement of Wide-Field Patterns and Response Characteristics of MST Cells of Macaque Monkey, Masao Kaneko, Hiroshi Nakajima, Makoto Mizuno, Eiki Hida, Hide-aki Saito and Minoru Tsukada (Faculty of Engineering, Tamagawa University) O3-2, Extraction of Binocular Parallax with a Neural Network and 3-D Surface Reconstruction, Mahito Fujii, Takayuki Ito, Toshio Nakagawa (NHK Science and Technical Research Laboratories) O3-3, A Model of 3-D Surface Depth Perception from Its Boundary Perceived with Binocular Viewing, Masanori Idesawa (Rikin: The Institute of Physical and Chemical Research) O3-4, Theory of Information Propagation, Tetsuya Takahashi(The Institute for Physical and Chemical Research Laboratory for Neural Networks) O3-5, A model of the transformation of color selectivity in the monkey visual system, Hidehiko Komatsu, Shinji Kaji, Shigeru Yamane (Electrotechnical Laboratory Neuroscience Section), Yoshie Ideura (Komatsu Limited) O4-1, Magical number in cognitive map and quantization of cognitive map shape, Terunori Mori (Electrotechnical Laboratory) O4-2, Generative representation of symbolic information in a pattern recognition model "holovision", Hiroshi Shimizu and Yoko Yamaguchi (Faculty of Pharmaceutical Sciences, University of Tokyo) O4-3, Long-Term Potentiation to Temporal Pattern Stimuli in Hippocampal Slices, Takeshi Aihara, Minoru Tsukada and Makoto Mizuno (Faculty. of Eng. Tamagawa Univ.), Hiroshi Kato and Haruyoshi Miyagawa(Dept. of Physiol. Yamagata Univ.) O4-4, Bidirectional Neural Network Model for the Generation and Recognition of Temporal Patterns, Ryoko Futami and Nozomu Hoshimiya (Department of Electrical Communications, Faculty of Engineering, Tohoku University) O4-5, Theta rhythm in hippocampus: phase control of information circulation, Yoko Yamaguchi and Hiroshi Shimizu (Faculty of Pharmaceutical Science University of Tokyo) O5-1, Stability and/or instability of limit cycle memories embedded in an asynchronous neural network model, Toshinao Ishii and Wolfgang Banzhaf (Central Research Laboratory Mitsubishi Electric Corporation), Shigetoshi Nara (Department of Electric and Electronic Engineering Faculty of Engineer Okayama Univ.) O5-2, Geometric analysis of the dynamics of associative memory networks, Kenji Doya (Faculty of Engineering, University of Tokyo) O5-3, On the Integration of Mapping and Relaxation, Kazuyoshi Tsutsumi (Department of Mechanical and System Engineering Faculty of Science and Technology Ryukoku Univ.) P1-1, Neural network model on gustatory neurons in rat, Masaharu Adachi, Eiko Ohshima, Kazuyuki Aihara and Makoto Kotani (Faculty of Engineering, Tokyo Denki Univ.), Takanori Nagai (Faculty of Medicine, Teikyo Univ.), Takashi Yamamoto (Faculty of Dentistry, Osaka Univ.) P1-2, Learning Algorithm based on Temporal Pattern Discrimination in Hippocampus, Minoru Tsukada and Takeshi Aihara (Tamagawa University), K. Kato (Yamagata University) P1-3, A study on Learning by Synapse patch group, Shuji Akiyama, Yukifumi Shigematsu and Gen Matsumoto (Electrotechnical Laboratory Molecular and Cellular Neuroscience Section) P1-4, A Model of the Mechanisms of Long-Term Depression in the Cerebellum, Tatso Kitajima and Kenichi Hara (Faculty of Engineering, Yamagata Univ.) P1-5, ** Self-Organization in Neural Networks with Lateral Inhibition, Y. Tamori, S. Inawashiro and Y. Musya (Faculty of Engineering, Tohoku University) P1-6, ** Receptive Fields by Self-Organization in Neural Networks, S.Inawashiro, Y. Tamori and Y. Musya (Faculty of Engineering, Tohoku University) P1-7, ** Does Backpropagation Exist in Biological Systems? -- Discussions and Considerations, Shyozo Yasui (Kyusyu Institute of Technology), Eiki Hida (Tamagawa University) P1-8, Accelerating the convergence of the error back-propagation algorithm by deciding effective teacher signal, Yutaka Fukuoka, Hideo Matsuki, Hidetake Muraoka and Haruyuki Minamitani (Keio Univ.) P1-9, **Three-Layered Backpropagation Model with Temperature Parameter, Yoji Fukuda, Manabu Kotani and Haruya Matsumoto (Faculty of Engineering, Kobe University) P1-10, Kalman Type Least Square Error Learning Law for Sequential Neural Network and its Information Theory, Kazuyoshi Matsumoto (Kansai Advanced Research Center, CRL, MPT) P1-11, Learning Surface of Hierarchical Neural Networks and Valley Learning Method, Kazutoshi Gouhara, Norifusa Kanai, Takeshi Iwata and Yoshiki Uchikawa (Department of Electronic Mechanical Engineering School of Engineering: Nagoya Univ.) P1-12, A Study of Generalization of Multi-layered Neural Networks, Katsumasa Matsuura (Mechanical Engineering Research Laboratory, Hitachi Ltd.) P1-13, When "Learning" occurs in Machine Learning, Noboru Watanabe (Department of Biophysics, Kyoto University) P1-14, A Study on Self-supervised Learning System, Kazushige Saga, Tamami Sugasaka and Shigemi Nagata (Computer-Based Systems Laboratory Fujitsu Laboratories Ltd.) P1-15, A Learning Circuit for VLSI Analog Neural Network Implementation, Hiroyuki Wasaki, Yoshihiko Horio and Shogo Nakamura (Department of Electronic Engineering : Tokyo Denki Univ.) P1-16, Comparison of Learning Methods for Recurrent Neural Networks, Tatsumi Watanabe, Kazutoshi Gouhara and Yoshiki Uchikawa (Department of Electronic Mechanical Engineering, School of Engineering : Nagoya Univ.) P1-17, A Learning algorithm for the neural network of Hopfield type, Fumio Matsunari and Masuji Ohshima (Toyota Central Res. & Develop. Labs. Inc.) P1-18, Quantitative relationship between internal model of motor system and movement accuracy, movement distance and movement time, Yoji Uno and Ryoji Suzuki (Faculty of Engineering: University of Tokyo) P1-19, Autonomic control of bipedal locomotion using neural oscillators, Gentaro Taga, Yoko Yamaguchi and Hiroshi Shimizu (Faculty of Pharmaceutical Sciences: University of Tokyo) P1-20, Learning Model of Posture Control in Cerebellum, Hiroaki Gomi and Mitsuo Kawato (ATR Auditory and Visual Perception Research Laboratories) P1-21, Hybrid Control of Robotic Manipulator by Neural Network Model (Sensing and Hybrid Control of Robotic Manipulator with Collision Phenomena), Takanori Shibata, Toshio Fukuda, Fumihito Arai and Hiroshi Wada (Nagoya University), Masatoshi Tokita and Toyokazu Mituoka (Kisarazu Technical College) Yasumasa Shoji (Toyo Engineering Corp.) P1-22, Hybrid Position/Force Control of Robotic Manipulator by Application of Neural Network (Adaptive Control with Consideration of Characteristics of Objects), Masatoshi Tokita and Toyokazu Mituoka(Kisarazu National College of Technology), Toshio Fukuda and Takanori Shibata (Nagoya Univ.) P1-23, Reverbration in Chaotic Neural Network with a Fractal Connection, Masatoshi Hori, Masaaki Okabe and Masahiro Nakagawa (Department of Electrical Engineering, Faculty of Engineering, Nagaoka University of Technology) P1-24, An investigation of correlation possibility in neurocomputing and quantum mechanics via Matrix Dynamics, Tomoyuki Nishio (Technology Planning office General R&D Laboratory JUKI Corporation) P1-25, Knowledge Representation and Parallel Inference with Structured Networks, Akira Namatame and Youichi Ousawa (National Defense Academy) P1-26, Simulation of a Spiking Neuron with Multiple Input, G. Bugmann (Fundamental Research Laboratories, NEC Corporation) P2-1, Parallel Implementation of Edge detection by Energy Minimization on QCDPAX machine, Hideki Asoh (Electrotechnical Laboratory), Youichi Hachikubo and Tsutomu Hoshino (University of Tsukuba) P2-2, Characteristics of the Marr-Hildreth filter for two particle image, Shigeharu Toyoda (Department of Chemical Engineering, Faculty of Engineering Science, Osaka Univ.) P2-3, A neural network for fixation point selection, Makoto Hirahara and Takashi Nagano (College of engineering, Hosei Univ.) P2-4, A Method for Analyzing Inverse Dynamics of the Retinal Horizontal Cell Response through the Ionic Current Model, Yoshimi Kamiyama, Hiroyuki Ishii and Shiro Usui (Information and Computer Science, Toyohashi Uni. of Technology) P2-5, Selective Recognition of Plural Patterns by Neural Networks, Katsuji Imai, Kazutoshi Gouhara and Yoshiki Uchikawa (Department of Electronic Mechanical Engineering, School of Engineering, Nagoya Univ.) P2-6, A neural network for size-invariant pattern recognition, Masaichi Ishikawa and Toshi Nagano (College of Engineering, Hosei Univ.) P2-7, Human-Face Identification by Neural Network for Mosaic Pattern, Makoto Kosugi (NTT Human Interface Laboratories) P2-8, Pattern Classification and Recognition Neural Network based on the Associative Learning Rules, Hidetoshi Ikeno (Maizuru College of Technology), Shiro Usui and Manabu Sakakibara (Toyohashi Univ. of Technology) P2-9, Learning of A Three-Layer Neural Network with Translated Patterns, Jianqiang YI, Shuichi Kurogi and Kiyotoshi Matsuoka (Division of Control Engineering, Kyushu Institute of Technology) P2-10, Combining a Neural Network with Template Matching for Handwritten Numeral Classification, Masahiko Tateishi, Haruaki Yamazaki (Systems Laboratories, OKI Electric Corporation) P2-11, Recognition of Continuous Writing of English Words with the Mechanism of Selective Attention, Taro Imagawa and Kunihiko Fukushima (Faculty of Engineering Science, Osaka University) P2-12, Recognition of Handwritten Alphanumeric Characters by the Neocognitron, Nobuaki Wake Kunihiko Fukushima (Faculty of Engineering Science, Osaka University) P2-13, Target Recognition with Chebyshev Networks, Nobuhisa Ueda and Akira Namatame (National Defense Academy) P2-14, A Large Scale Neural Network "Comb NET" for Printed Kanji Character Recognition (JIS 1st and 2nd Level Character Set), Takashi Tohma, Akira Iwata, Hiroshi Matsuo and Nobuo Suzumura (Nagoya Institute of Technology) P2-15, Discriminative Properties of the Temporal Pattern in the Mesencephalic Periaqueductal Gray of Rat, Mitsuo Terasawa, Minoru Tsukada, Makoto Mizuno, Takeshi Aihara (Faculty of Engineering Tamagawa Univ.) P2-16, Spoken Word Recognition using sequential Neural Network, Seiichi Nakagawa and Isao Hayakawa (Toyohashi University of Technology, Dept. of Info. & Comp. Science.) P2-17, Learning of the three-vowel sequence by a neural network model and influence to a middle vowel from preceding and succeeding vowels, Teruhiko Ohtomo and Ken-ichi Hara (Faculty of Engineering, Yamagata Univ.) P2-18, A Self-Organizing Neural Network for The Classification of Temporal Sequences, Seiichi Ozawa (Nara National College of Technology), Kazuyoshi Tsutsumi (Faculty of Science and Technology, Ryukoku Univ.), Haruya Matsumoto (Faculty of Technology, Kobe Univ.) P2-19, Neural Mechanism for Fine Time-Resolution, Kiyohiko Nakamura (Graduate School of Science and Engineering Tokyo Institute of Technology) P2-20, Neuromagnetic Image Reconstruction by a Neural Network, Hisashi Tsuruoka (Fukuoka Institute of Technology) and Kohyu Fukunishi (Advanced Research Lab., Hitachi, Ltd.) P2-21, A Sparse Stabilization in Mutually Connected Neural Networks, Hiroshi Yamakawa (Faculty of Engineering, University of Tokyo), Yoichi Okabe (Research Center for Advanced Science and Technology, University of Tokyo) P2-22, ** Relation between Recollection Ability and Membrane Potential Threshold in Hopfield Model, Eizo Ohno and Atsushi Yamanaka (Sharp Laboratory) P2-23, Weight quantization in learning Boltzmann machine, Masanobu Takahashi, Wolfgang Balzer, Jun Ohta and Kazuo Kyuma (Solid state quantum electronics department Central Research Laboratory Mitsubishi Electric Corporation) P2-24, A cellular organizing approach for the travelling salesman problem, M. Shigematsu (Electrotechnical Laboratory) P2-25, Combinatorial Optimization Problems and Stochastic Logic Neural Network, Yoshikazu Kondo and Yasuji Sawada (Research Institute of Electrical Communication Tohoku Univ.) P2-26, Neural Network Models on SIMD Architecture Computer, Makoto Hirayama (ATR Auditory and Visual Perception Research Laboratories) P2-27, Optimal connection of an associative network with non-uniform elements, Hiro-fumi Yanai (Department of Mathematical Engineering and Information Physics, University of Tokyo) From B344DSL at UTARLG.UTARL.EDU Sat Nov 3 00:33:00 1990 From: B344DSL at UTARLG.UTARL.EDU (B344DSL@UTARLG.UTARL.EDU) Date: Fri, 2 Nov 90 23:33 CST Subject: Splitting hairs Message-ID: <0DDC7B47EB5F000497@utarlg.utarl.edu> From: IN%"gary%cs at ucsd.edu" "Gary Cottrell" 1-NOV-1990 23:35:40.19 To: jose at learning.siemens.com, tgd at turing.CS.ORST.EDU CC: connectionists at CS.CMU.EDU Subj: splitting hairs From davec at cogs.sussex.ac.uk Fri Nov 2 10:01:55 1990 From: davec at cogs.sussex.ac.uk (David Cliff) Date: Fri, 2 Nov 90 15:01:55 GMT Subject: Technical Report CSRP162 Message-ID: <2042.9011021501@rsund.cogs.susx.ac.uk> The following report is now available: "Computational Neuroethology: A Provisional Manifesto" Dave Cliff, University of Sussex School of Cognitive and Computing Sciences CSRP162, May 1990. \begin{abstract} This paper questions approaches to computational modelling of neural mechanisms underlying behaviour. It examines ``simplifying'' (connectionist) models used in computational neuroscience and concludes that, unless embedded within a sensorimotor system, they are meaningless. The implication is that future models should be situated within closed-environment simulation systems: output of the simulated nervous system is then expressed as observable behaviour. This approach is referred to as ``computational neuroethology''. Computational neuroethology offers a firmer grounding for the semantics of the model, eliminating subjectivity from the result-interpretation process. A number of more fundamental implications of the approach are also discussed, chief of which is that insect cognition should be studied in preference to mammalian cognition. \end{abstract} An abridged version of this paper is to appear in: "From Animals to Animats: Proceedings of the First International Conference on the Simulation of Adaptive Behaviour" J.-A. Meyer and S.W. Wilson, editors. MIT Press/Bradford Books, 1990. --------------------------------------------------------------------------- Copies of the postscript file cliff.manifesto.Z may be obtained from the pub/neuroprose directory in cheops.cis.ohio-state.edu. Either use the Getps script or do this: unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Connected to cheops.cis.ohio-state.edu. Name (cheops.cis.ohio-state.edu:): anonymous 331 Guest login ok, sent ident as password. Password: neuron 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose ftp> binary ftp> get cliff.manifesto.ps.Z ftp> quit unix-2> uncompress cliff.manifesto.ps.Z unix-3> lpr -P(your_local_postscript_printer) cliff.manifesto.ps ---------------------------------------------------------------------------- Or, order a hardcopy by sending your physical mail address to davec at cogs.sussex.ac.uk, mentioning CSRP162. Please do this only if you cannot use the ftp method described above. From rob at galab2.mh.ua.edu Sat Nov 3 16:55:51 1990 From: rob at galab2.mh.ua.edu (Robert Elliott Smith) Date: Sat, 03 Nov 90 15:55:51 CST Subject: literature related to kawahara posting. Message-ID: <9011032156.AA24258@galab2.mh.ua.edu> I tried to mail the following directly to Mr. Kawahara, but it bounced. Could someone else supply me with the necessary info? would like to get copies any papers assocciated with titles included in your list. Could you give me some idea of how to get any literature related to: O1-5, Evolution of Artificial Animal having Perceptron by Genetic Algorithm, Masanori Ichinose and Tsutomu Hoshino (Institute of Engineering Mechanics, University of Tsukuba) P1-13, When "Learning" occurs in Machine Learning, Noboru Watanabe (Department of Biophysics, Kyoto University) P1-14, A Study on Self-supervised Learning System, Kazushige Saga, Tamami Sugasaka and Shigemi Nagata (Computer-Based Systems Laboratory Fujitsu Laboratories Ltd.) Any help you can give me would be appreciated. Sincerely, Rob Smith. ------------------------------------------- Robert Elliott Smith Department of Engineering of Mechanics The University of Alabama P. O. Box 870278 Tuscaloosa, Alabama 35487 <> rob at galab2.mh.ua.edu <> (205) 348-4661 ------------------------------------------- From well!moritz at apple.com Sat Nov 3 21:46:55 1990 From: well!moritz at apple.com (Elan Moritz) Date: Sat, 3 Nov 90 18:46:55 pst Subject: _homo_trans sapiens, request for comments Message-ID: <9011040246.AA14274@well.sf.ca.us> TRANS_SAPIENS and TRANS_CULTURE *** REQUEST FOR COMMENTS ------------------------------------- In an earlier paper [Memetic Science: I - General Introduction; Journal of Ideas, Vol. 1, #1, 3-22, 1990] I postulated the emergence of a descendent of homo sapiens. This descendent will be primarily differentiated from h. sapiens by having * substantially greater cognitive abilities *. [the relevant section of the paper is included below]. >>>>> I plan to write a more substantive paper on the topic and would appreciate comments, speculation, arguments for & against this hypothesis. Relevant comments / arguments will be addressed in the paper and be properly acknowledged/referenced <<<<<. Elan Moritz <<<< -- text of h. trans sapiens section follows -- We also introduce here the concepts of trans-culture and Homo trans-sapiens (or simply trans-sapiens). While being topics of a future paper, trans-culture can be described as the next step of culture dominated by deep connections, interactions, and relationships between objects created by large human/machine teams. A manifest property of trans-culture is the extreme and transcendent complexity of interactions and relations between humans and the cultural objects involved, with the additional property of being non-accessible to Homo sapiens. Examples of trans-cultural objects already exist; for example, there is no individual who (at any given temporal instance) is an expert in all aspects of medicine, or who is familiar with all biological species and their relationships, or is an expert in all aspects of physics, or who is totally familiar with all aspects of even a single cultural artifact (e.g. Hubble space telescope, Space Shuttle design, or the total design of a nuclear power plant). In fact, we are approaching the point that certain proofs of mathematical theorems are becoming too long and difficult for any one individual to keep in conscious awareness. In a way, these transcendent and extended complexity relationships are examples of more complicated 'meta-memes', which is one of the reasons it is interesting to study the evolution of ideas. Homo trans-sapiens is the [postulated] next step in evolution of homo sapiens. There is no reason to expect or require that Homo sapiens will not undergo further evolution. The bio-historical trend indicates that the major evolutionary development in Homo is in the cortico-neural arena (i.e. increasingly more complex organization of the nervous system and the brain). Specifically it is the higher level cognitive - Knowledge Information Processing functions that set H. Sapiens apart. It is asserted here (and to be discussed in a future paper) that H. trans-sapiens is a logical consequence of evolution, and that the milieu and adaptive epigenetic landscape for H. trans-sapiens is already present in the form of trans-culture. It is indeed possible that the basic mutations are in place and trans-sapiens already exists or will appear in the biologically-near time frame. [ Please pass to other relevant news groups/ e-lists] Elan Moritz, snail mail: Elan Moritz The Institute for Memetic Research PO Box 16327, Panama City, Florida 32406 e mail: moritz at well.sf.ca.us [internet] From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Sun Nov 4 11:04:44 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Sun, 04 Nov 90 11:04:44 EST Subject: _homo_trans sapiens, request for comments In-Reply-To: Your message of Sat, 03 Nov 90 18:46:55 -0800. <9011040246.AA14274@well.sf.ca.us> Message-ID: ... There is no reason to expect or require that Homo sapiens will not undergo further evolution. The bio-historical trend indicates that the major evolutionary development in Homo is in the cortico-neural arena (i.e. increasingly more complex organization of the nervous system and the brain). I think that this is a very shaky bit of extrapolation. Perhaps the full paper addresses this point in more detail, but I think that many observers (including some of the more perceptive science fiction writers) have pointed out the following: The very matrix of culture and machines that give rise to your "trans-sapiens" has pretty much eliminated the selective pressure that has driven evolution for the last couple of billion years -- at least for humans living in developed countries. In fact, in such countries, it is often the most successful people (in the intellectual terms you are concerned with) who breed the least, so not-very-natural selection may actually be working against this kind of success. So, the cultural matrix continues to develop, but the individual humans who make up that matrix may, on the average, be less capable as time goes on. It may well be that within a generation or two we'll have such complete control over the genetic makeup of our human progeny that the old mutation/selection machinery will be irrelevant. Whether we will (or should) choose to use this control to deliberately change the species in certain directions is a *very* complex question. But whether or not we actively take control of human evolution, it makes little sense to predict the future based on "bio-historical trends" of the past million years. We have recently crossed a discontinuity that changes the rules of evolution in fundamental and not-yet-understood ways. That change is probably permanent unless the culture itself is destroyed. -- Scott From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Mon Nov 5 01:46:18 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Mon, 05 Nov 90 01:46:18 EST Subject: _homo_trans sapiens (Ooops!) Message-ID: A couple of people have suggested to me that the _homo_trans_sapiens_ discussion, while potentially interesting to some of us, really doesn't belong on this mailing list. Upon reflection, I have to agree. I hope Elan Moritz's original post and my hasty response don't generate a lot of amateur philosophizing on this list -- that's what usenet is for. :-) Perhaps Mr. Moritz could pick an appropriate newsgroup for the continuation of this discussion and inform people interested in this topic where to find the discussion. Then the rest of us can get back to building _back_propo_micro_sapiens_ or whatever. -- Scott From well!moritz at apple.com Sun Nov 4 21:46:19 1990 From: well!moritz at apple.com (Elan Moritz) Date: Sun, 4 Nov 90 18:46:19 pst Subject: _homo_trans sapiens, request for comments Message-ID: <9011050246.AA24500@well.sf.ca.us> You raise interesting points here, others have raised mutation rates, coordination with other physical parameters of human body, etc. The points most seem to make make are with respect to the genotype+phenotype set. When more comments come in I will summarize and post them. From rr at cstr.edinburgh.ac.uk Mon Nov 5 07:54:26 1990 From: rr at cstr.edinburgh.ac.uk (Richard Rohwer) Date: Mon, 5 Nov 90 12:54:26 GMT Subject: maximum entroppy and trans_homo sapiens Message-ID: <5812.9011051254@kahlo.cstr.ed.ac.uk> This isn't really a connectionist subject, but perhaps it has enough amusement value to merit posting... Elan Moritz writes: > Homo trans-sapiens is the [postulated] next step in > evolution of homo sapiens. There is no reason to expect or My view is that the fate of humanity is either anihilation or obsolesence (or both). There are lots of obvious possibilities for reaching these fates, and I do not wish to review the menu. Instead I would like to mention an amusing (if uncompelling) doomsday-is-imminent argument based on the Anthropic principle -- the idea that the universe appears as it does to folks like us because folks like us cannot survive in a substantially different universe, and therefore such universes go unnoticed. Suppose the human population grows exponentially until it is anihilated. A randomly selected person in this population (considered over all time) is most likely to live at a time when the population is large, which also happens to be a time near doomsday. So in the spirit of the Maximum Entropy Principle, one asserts that we are probably typical people, and therefore doomsday looms. (There's a wee problem with this argument... It applied equally well at any time in history.) Scott Fahlman makes the point that: > The very matrix of culture and machines that > give rise to your "trans-sapiens" has pretty much eliminated the selective > pressure that has driven evolution for the last couple of billion years -- Ah, but do selective pressures live on in the machines we build...? Those machines will survive which make the most effective use of their human (and other) resources... I presume this idea constitutes a standard theme in sci-fi. Richard Rohwer JANET: rr at uk.ac.ed.cstr Centre for Speech Technology Research ARPA: rr%ed.cstr at nsfnet-relay.ac.uk Edinburgh University BITNET: rr at cstr.ed.ac.uk, 80, South Bridge rr%cstr.ed.UKACRL Edinburgh EH1 1HN, Scotland UUCP: ...!uunet!mcsun!ukc!cstr!rr PHONE: (44 or 0) (31) 225-8883 x261 FAX: (44 or 0) (31) 226-2730 From kayama at CS.UCLA.EDU Sat Nov 3 23:34:09 1990 From: kayama at CS.UCLA.EDU (Masahiro Kayama) Date: Sat, 3 Nov 90 20:34:09 -0800 Subject: Question Message-ID: <9011040434.AA23120@oahu.cs.ucla.edu> Dear connectionist: I am Masahiro Kayama, a visiting scholar of UCLA from Hitachi Ltd., Japan. I am investigating to introduce Kohonen feature map into the feature extraction module of the Factory Automation Controller. Does anyone know how to determine a suitable size of units of a feature map? Are there any deterministic equation between a capacity of a object and a number of units? I am thinking two dimensional map. Masahiro Kayama kayama at cs.ucla.edu From JBECKMAN%DD0RUD81.BITNET at BITNET.CC.CMU.EDU Mon Nov 5 17:18:15 1990 From: JBECKMAN%DD0RUD81.BITNET at BITNET.CC.CMU.EDU (JBECKMAN%DD0RUD81.BITNET@BITNET.CC.CMU.EDU) Date: Mon, 05 Nov 90 17:18:15 CET Subject: NEURAL NETWORK APPLICATIONS Message-ID: <1881251526C00577@BITNET.CC.CMU.EDU> For a survey on NEURAL NETWORK APPLICATIONS (realized or intended) I am looking for REVIEW articles and/or publications about special projects. Who can help me ? - Thanks in advance. Juergen Beckmann, JBECKMAN at DD0RUD81.BITNET From malerma at batman.fi.upm.es Tue Nov 6 09:05:00 1990 From: malerma at batman.fi.upm.es (malerma@batman.fi.upm.es) Date: 6 Nov 90 15:05 +0100 Subject: No subject Message-ID: <9011061405.AA00938@batman.fi.upm.es> I am interested in constructive algorithms for neural networks (capable of generating new units and layers in the NN). Is there any constructive algorithm for NN whose units have continuous (e.g.: sigmoidal) activation function? Miguel A. Lerma Sancho Davila 18 28028 MADRID - SPAIN From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Tue Nov 6 11:35:49 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Tue, 06 Nov 90 11:35:49 EST Subject: No subject In-Reply-To: Your message of 06 Nov 90 15:05:00 +0100. <9011061405.AA00938@batman.fi.upm.es> Message-ID: For those of you who want to play with the Cascade-Correlation algorithm, a public-domain Common Lisp version of my Cascade-Correlation simulator is now available for FTP via Internet. This is the same version I've been using for my own experiments, except that a lot of non-portable display and user-interface code has been removed. I believe that this version is now strictly portable Common Lisp. [Some users have reported that changing "short-float" declarations to "single-float" in Lucid seems to make things work better.] Scott Crowder, one of my graduate students at CMU, has translated this code into C. The C version is considerably faster on most machines, but less flexible. Instructions for obtaining the code via Internet FTP are included at the end of this message. If people can't get it by FTP, contact me by E-mail and I'll try once to mail it to you. Specify whether you want the C or Lisp version. If it bounces or your mailer rejects such a large message, I don't have time to try a lot of other delivery methods. I am maintaining an E-mail list of people using this code so taht I can notify them of any changes or problems that occur. I would appreciate hearing about any interesting applications of this code, and will try to help with any problems people run into. Of course, if the code is incorporated into any products or larger systems, I would appreciate an acknowledgement of where it came from. There are several other programs in the "code" directory mentioned below: versions of Quickprop in Common Lisp and C, and some simulation code written by Tony Robinson for the vowel benchmark he contributed to the benchmark collection. NOTE: This code is distributed without charge on an "as is" basis. There is no warranty of any kind by the authors or by Carnegie-Mellon University. -- Scott *************************************************************************** For people (at CMU, MIT, and soon some other places) with access to the Andrew File System (AFS), you can access the files directly from directory "/afs/cs.cmu.edu/project/connect/code". This file system uses the same syntactic conventions as BSD Unix: case sensitive names, slashes for subdirectories, no version numbers, etc. The protection scheme is a bit different, but that shouldn't matter to people just trying to read these files. For people accessing these files via FTP: 1. Create an FTP connection from wherever you are to machine "pt.cs.cmu.edu". The internet address of this machine is 128.2.254.155, for those who need it. 2. Log in as user "anonymous" with no password. You may see an error message that says "filenames may not have /.. in them" or something like that. Just ignore it. 3. Change remote directory to "/afs/cs/project/connect/code". Any subdirectories of this one should also be accessible. Parent directories may not be. 4. At this point FTP should be able to get a listing of files in this directory and fetch the ones you want. The Cascade-Correlation simulator lives in files "cascor1.lisp" and "cascor1.c". If you try to access this directory by FTP and have trouble, please contact me. The exact FTP commands you use to change directories, list files, etc., will vary from one version of FTP to another. --------------------------------------------------------------------------- To access the postscript file for the tech report describing this algorithm: unix> ftp cheops.cis.ohio-state.edu (or, ftp 128.146.8.62) Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get fahlman.cascor-tr.ps.Z ftp> quit unix> uncompress fahlman.cascor-tr.ps.Z unix> lpr fahlman.cascor-tr.ps (use flag your printer needs for Postscript) --------------------------------------------------------------------------- From Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU Tue Nov 6 11:33:25 1990 From: Scott.Fahlman at SEF1.SLISP.CS.CMU.EDU (Scott.Fahlman@SEF1.SLISP.CS.CMU.EDU) Date: Tue, 06 Nov 90 11:33:25 EST Subject: No subject In-Reply-To: Your message of 06 Nov 90 15:05:00 +0100. <9011061405.AA00938@batman.fi.upm.es> Message-ID: I am interested in constructive algorithms for neural networks (capable of generating new units and layers in the NN). Is there any constructive algorithm for NN whose units have continuous (e.g.: sigmoidal) activation function? The only nets that come to mind that fit your description are Cascade-Correlation (Fahlman and Lebiere, paper in NIPS-89 proceedings) and SONN (Tenrio and Lee, NIPS-88). There are a lot of system around that build up a *single* layer of hidden units with continuous activation functions. Information on how to FTP the Cascade-Correlation tech report and simulation code has been sent to this list before. I'll send it to you in a separate message. -- Scott From moody-john at CS.YALE.EDU Wed Nov 7 12:34:22 1990 From: moody-john at CS.YALE.EDU (john moody) Date: Wed, 7 Nov 90 12:34:22 EST Subject: constructive algorithms Message-ID: <9011071734.AA26833@SUNNY.SYSTEMSX.CS.YALE.EDU> In addition to the Cascade Correlation and SONN networks, there is a large body of work by Barron and Barron and by Russian authors on synthesizing multilayer polynomial networks. The synthesis algorithms come under the general title GMDH (Group Method of Data Handling), and have been developed since the early 1960's. They have been the subject of large amounts of both theoretical analysis and empirical testing. Two useful references are: S.J. Farlow, Self Organizing Methods in Modeling: GMDH Type Algorithms, New York: Marcel Dekker 1984. A.R. Barron and R.L. Barron, "Statistical learning networks: a unifying view," Computing Science and Statistics: Proc. 20th Interface Symposium, Ed Wegman, ed. Am. Statist. Assoc. Washington DC, pp 192-203, 1988. Furthermore, an impressive new algorithm for generating tree-structured polynomial networks will be presented as an invited talk at NIPS*90 in three weeks. The algorithm called MARS (Multivariate Adaptive Regression Splines) was developed by Jerome Friedman, Chairman of the Stanford Statistics Department. I believe that a paper on the algorithm is about to appear in one of the statistics journals. Professor Friedman's email address is jhf at playfair.stanford.edu. Hope this helps, John Moody ------- From Alex.Waibel at SPEECH2.CS.CMU.EDU Wed Nov 7 12:12:41 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Wed, 7 Nov 90 12:12:41 EST Subject: constructive algorithms Message-ID: If you define it loosely as self-modifying architectures, there are more: Three at least four more that come to my mind: o Meiosis Nets (Hanson, NIPS'89) o Optimal Brain Damage (Denker and co., NIPS'89) o Adaptive Delays and Connections (Bodenhausen, NIPS'90) o Weight Elimination (Rumelhart) Alex From nips-90 at CS.YALE.EDU Wed Nov 7 14:17:37 1990 From: nips-90 at CS.YALE.EDU (nips90) Date: Wed, 7 Nov 90 14:17:37 EST Subject: For NIPS*90 Presenters! Message-ID: <9011071917.AA02402@CASPER.NA.CS.YALE.EDU> The NIPS*90 Conference is now less than 3 weeks away. I would like to encourage all NIPS*90 presenters to consider bringing videotapes of their work if available. We will most likely have a video projection system set up in the main conference room for oral and poster spotlight presenters. We will also have a limited number of televisions and video casette players available in the hall for the poster sessions. If you would like to present a videotape (VHS format preferred), please contact me as soon as possible so that I can reserve a machine for your time slot or for your poster session. Thanks much, John Moody NIPS*90 Program Chair nips at cs.yale.edu (203)432-1200 VOICE (203)432-0593 FAX ------- From moody-john at CS.YALE.EDU Wed Nov 7 22:08:55 1990 From: moody-john at CS.YALE.EDU (john moody) Date: Wed, 7 Nov 90 22:08:55 EST Subject: more constructive networks at NIPS Message-ID: <9011080308.AA29766@SUNNY.SYSTEMSX.CS.YALE.EDU> Four more constructive algorithms for networks will be presented at NIPS*90: "Bumptrees for Efficient Function, Constraint, and Classification Learning", by Steve Omohundro. "Neural Net Algorithms that Learn in Polynomial Time from Examples and Queries", by Eric Baum and Kevin Lang "A Resource-Allocating Network for Function Interpolation", by John Platt "A Tree-Structured Network for Approximation on High-Dimensional Space", by Terry Sanger ------- From GOLDFARB at UNB.CA Thu Nov 8 00:35:34 1990 From: GOLDFARB at UNB.CA (GOLDFARB) Date: Thu, 08 Nov 90 01:35:34 AST Subject: Reconfigurable neural networks Message-ID: > I am interested in constructive algorithms for neural networks > (capable of generating new units and layers in the NN). I believe that the current framework of neural networks is not well suited to the above task, i.e. for reconfigurable NN. See my message of 27 September. Introduction to a fundamentally new model that is "reconfigurable" can be found in my Pattern Recognition paper mentioned in the above message and in several more resent papers. Although the example considered in the above paper uses nonnumeric operations (and representation) and does not require the learning machine to reconfigure itself, the issue of reconfigurability was one of the main driving motivations for the proposed model and is partly addressed in the more resent papers. Since the proposed model is also valid (and even becomes simpler) for vector patterns, an excellent topic (particularly for a thesis) would be an application of the model to vector patterns. I would be glad to help with the necessary missing details. Lev Goldfarb From aboulang at BBN.COM Thu Nov 8 13:55:58 1990 From: aboulang at BBN.COM (aboulang@BBN.COM) Date: Thu, 8 Nov 90 13:55:58 EST Subject: constructive algorithms In-Reply-To: john moody's message of Wed, 7 Nov 90 12:34:22 EST <9011071734.AA26833@SUNNY.SYSTEMSX.CS.YALE.EDU> Message-ID: From: john moody Full-Name: john moody Date: Wed, 7 Nov 90 12:34:22 EST Furthermore, an impressive new algorithm for generating tree-structured polynomial networks will be presented as an invited talk at NIPS*90 in three weeks. The algorithm called MARS (Multivariate Adaptive Regression Splines) was developed by Jerome Friedman, Chairman of the Stanford Statistics Department. I believe that a paper on the algorithm is about to appear in one of the statistics journals. Professor Friedman's email address is jhf at playfair.stanford.edu. For those of you who can't make it to NIPS (me), or can't wait to find out about MARS (me), or would like to study-up on MARS before the meeting, or even would like to play with it, here are some references from the sci.math.stat newsgroup and how to get MARS code from statlib: From: pgh at stl.stc.co.uk (P.G.Hamer) Newsgroups: sci.math.stat Subject: Re: WHAT DOES MARS MEAN? Date: 23 Oct 90 15:04:20 GMT Reply-To: "P.G.Hamer" Organization: STC Technology Limited, London Road, Harlow, Essex, UK In article <90294.210657BKW1 at psuvm.psu.edu> BKW1 at psuvm.psu.edu writes: >I have been unable to obtain any reference on MARS. Can anyone >tell me what MARS means. c c Multivariate Adaptive Regression Splines (MARS modeling, version 2.5). c c MARS 2.5 is a collection of subroutines that implement the multivariate c adaptive regression spline strategy for data fitting and function c approximation described in Friedman (1988b). It is a generalization of c MARS 1.0 described in Friedman (1988a). It implements as a special case c a palindromically invariant version of the TURBO fitting technique for c smoothing and additive modeling described in Friedman and Silverman (1987). c c References: c c Friedman, J. H. (1988a). Fitting functions to noisy data in high dimensions. c Proceedings, Twentyth Symposium on the Interface, E. Wegman, Ed. c c Friedman, J. H. (1988b). Multivariate adaptive regression splines. c Department of Statistics, Stanford University, Tech. Report LCS102. c c Friedman, J. H. and Silverman, B. W. (1987). Flexible parsimonious smoothing c and additive modeling. Dept. of Statistics, Stanford University, Tech. c report. (TECHNOMETRICS: Feb. 1989). c I got fortran source from the statlib mail server with the request 'send newmars from general'. If you do this, you will first get a message replay that is bogus, following this reply will be another with the MARS source. Cheers, Albert Boulanger From GOLDFARB at unb.ca Thu Nov 8 15:13:28 1990 From: GOLDFARB at unb.ca (GOLDFARB) Date: Thu, 08 Nov 90 16:13:28 AST Subject: Reconfigurable neural networks Message-ID: " I am interested in constructive algorithms for neural networks (capable of generating new units and layers in the NN)." I believe that the current framework of neural networks is not well suited to the above task, i.e. for reconfigurable NN. See my message of 27 September. Introduction to a fundamentally new model that is "reconfigurable" can be found in my Pattern Recognition paper mentioned in the above message and in several more resent papers. Although the example considered in the above paper uses nonnumeric operations (and representation) and does not require the learning machine to reconfigure itself, the issue of reconfigurability was one of the main driving motivations for the proposed model and is partly addressed in the more resent papers. Since the proposed model is also valid (and even becomes simpler) for vector patterns, an excellent topic (particularly for a thesis) would be an application of the model to vector patterns. I would be glad to help with the necessary missing details. Lev Goldfarb From KOKAR at northeastern.edu Fri Nov 9 12:09:00 1990 From: KOKAR at northeastern.edu (KOKAR@northeastern.edu) Date: Fri, 9 Nov 90 12:09 EST Subject: Intelligent Control Conference Message-ID: CALL FOR PAPERS 1991 IEEE INTERNATIONAL SYMPOSIUM ON INTELLIGENT CONTROL August 13-15,1991 Key Bridge Marriott Arlington, Virginia Sponsored by the IEEE Control Systems Society General Chairman: Harry E. Stephanou, Rensselaer Polytechnic lnstitute Program Chairman: Alexander H. Levis, George Mason University Finance Chairman: Elizabelh R. Ducot, MlT Lincoln Labs Registration Chairman : Umit Ozguner, Ohio State University Publications Chairman: Mieczyslaw Kokar, Northeastern University Local Arrangements: James E. Gaby, UNYSlS Corporation The 6th IEEE International Symposium on Intelligent Control (ISIC 91 ) will be held in conjunction with the 1991 IFAC Symposium on Distributed Intelligence Systems. Registrants in either symposium will be able to attend all technical and social events in both symposia and will receive preprint volumes from both. The ISIC 91 theme will be "Integrating Quantitative and Symbolic Processing". The design and analysis of automatic control systems have traditionally been based on rigorous, numerical techniques for modeling and optimization. Conventional controllers perform well in the presence of random disturbances, and can adapt to relatively small changes in fairly well known environments. Intelligent controllers are designed to operate in unknown environments and, therefore, require much higher levels of adaptation to unexpected events. They are also required to process and interpret large quantities of sensor data, and use the results for action planning or replanning. The design of intelligent controllers, therefore, incorporates heuristic and/or symbolic tools from artificial intelligence. Such tools which have traditionally been applied to open-loop, off-line problems, must now be integrated into the perception-reasoning-action closed loop of intelligent controllers. Effective methods for the integration of numerical and symbolic processing schemes are needed. Robustness and graceful degradation issues must be addressed. Reconfigurable feedback loops at varying levels of abstraction should be considered. Papers are being solicited ior presentation at the Symposium and publication in the Symposium Proceedings. Topics include, but are not limited to, the following: Intelligent control architectures Reasoning under uncertainty Self-organizing systems Sensor-based robot control Fault detection and error recovery Cellular robotics Intelligent manufacturing control Microelectro-mechanical systems systems Discrete event systems Variable precision reasoning Concurrent engineering Active sensing and perception Neural network controllers Multisensor data fusion Hierarchical controllers Intelligent inspection Learning control systems Intelligent database systems Autonomous control systems Microelectronics,advanced materials, Knowledge representation for and other novel applications real-time processing Five copies of papers should be sent by February 15,1991 to: Professor Alexander H. Levis Dept. of ECE George Mason University Fairfax, VA 22030-4444 Telephone: 703-764-6282 A separate cover sheet with the name of the corresponding author, telephone and fax numbers, and e-mail address should also be included. Authors will be notified of acceptance by April 15, 1991. Accepted papers, in final camera ready form, will be due on May 15, 1991. Proposals for invited sessions and tutorial workshops are also solicited. Cohesive sessions focusing on successful applications are particularly encouraged. Requests for additional information and proposal submissions (by February 15, 1991) should be addressed to Professor Levis. Symposium Program Committee: Suguru Arimoto, University of Tokyo Vivek V, Badami, General Electric John Baras, University of Maryland Research Lab Piero Bonissone, General Electric Hamid Berenji, NASA Ames Research Lab V.T. Chien, National Science David B. Cooper, Brown University Foundation David A. Dornfeld, University Kenneth J. DeJong, George Mason of California, Berkeley University Judy A. Franklin, GTE Laboratories Masakazu Ejiri, Hitachi Janos Gertler, George Mason Univesity Roger Geesey, BDM International Roderic Grupen, University of George Giralt, LAAS Massachusetts William A. Gruver, University of Susan Hackwood, University of Kentucky California, Riverside Thomas Henderson, Uiversity of Utah Joseph K. Kearney, University of Pradeep Khosla, Carnegie Mellon Iowa University Yves Kodratoff, Universite de Paris Benjamin Kuipers, University of Texas, Michael B. Leahy, Air Force Institute Austin of Technology Gaston H. Lefranc, Universidad Catolica Ramiro Liscano, Nat'l Research Council Valparaiso of Canada Ronald Lumia, NIST Yukio Mieda, Honda Engineering Co.,Ltd Thang N. Nguyen, IBM Corporation Kevin M. Passino, Ohio State Michael A.Peshkin, Northwestern University University Roger T. Schappell, Martin Marietta Yoshiaki Shirai, Osaka University Marwan Simaan, University of Janos Sztipanovits, Vanderbilt Pittsburgh University Zuheir Tumeh, General Motors Research Kimon P. Valavanis, Northeastern Labs University Agostino Villa, Politecnico di Torino John Wen, Rensselaer Polytechnic Institute From zl at guinness.ias.edu Fri Nov 9 13:20:54 1990 From: zl at guinness.ias.edu (Zhaoping Li) Date: Fri, 9 Nov 90 13:20:54 EST Subject: No subject Message-ID: <9011091820.AA04621@guinness.ias.edu> THE INSTITUTE FOR ADVANCED STUDY Anticipates the opening of one or more positions in the area of mathematical biology with emphasis on neural systems and computations. The positions are at a postdoctoral level and are for two years beginning academic year 1991-92. Applicants should send a CV and arrange for three letters of recommendations to be sent directly to Dr. Joseph J. Atick, School of Natural Sciences Institute for Advanced Study Princeton, NJ 08540 Applications should be received no later than January 20, 1991. Women and minorities are encouraged to apply. From stolcke at ICSI.Berkeley.EDU Sat Nov 10 02:13:11 1990 From: stolcke at ICSI.Berkeley.EDU (stolcke@ICSI.Berkeley.EDU) Date: Fri, 09 Nov 90 23:13:11 PST Subject: Cluster analysis software sought Message-ID: <9011100713.AA10859@icsib12.Berkeley.EDU> I am looking for a tool that does hierarchical cluster analyis *and* is able to display the resulting tree in some high-quality graphics format (such as Postscript). The program I am currently using prints only a crude approximation using ASCII characters. It would truely appreciate it if anybody who knows where to find such beast could drop me a note. --Andreas Stolcke From jose at learning.siemens.com Mon Nov 12 06:11:18 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Mon, 12 Nov 90 06:11:18 EST Subject: NIPS update Message-ID: <9011121111.AA07507@learning.siemens.com.siemens.com> A Friendly Reminder to REGISTER and get a room; its getting late: IEEE Conference on Neural Information Processing Systems -Natural and Synthetic- Monday, November 26 - Thursday, November 29, 1990 Sheraton Denver Tech Center Denver, Colorado Mail Requests For Registration Material To: Kathie Hibbard NIPS*90 Local Committee Engineering Center University of Colorado Campus Box 425 Boulder, CO 80309-0425 hibbard at boulder.colorado.edu From sun at umiacs.UMD.EDU Mon Nov 12 13:59:59 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Mon, 12 Nov 90 13:59:59 -0500 Subject: No subject Message-ID: <9011121859.AA00284@neudec.umiacs.UMD.EDU> >I am interested in constructive algorithms for neural networks >(capable of generating new units and layers in the NN). Is there >any constructive algorithm for NN whose units have continuous >(e.g.: sigmoidal) activation function? The works of this kind back to 1987 may also include: (1)G.Z. Sun, H.H. Chen and Y.C. Lee, "A Novel Net that Learns Sequential Decision Processes", NIP Proceedings (1987),Denver. (2) G.Z. Sun, H.H.Chen and Y.C.Lee, "Parallel Sequential Induction Network: A New Paradiagm of Neural Network Architecture", ICNN Proceedings,(1988), San Diego. These two papers generalized the idea of conventional regression tree to generate the new NN units dynamically. The continuous activation fuction is used. This PSIN (Parallel Sequential Induction Net) algorithm integrates the parallel processing of neural net and sequential processing of decision tree together to form a "most efficient" multilayered neural net classifier. The efficiency measure is the information entropy. -G. Z. Sun University of Maryland From baird%icsia8.Berkeley.EDU at berkeley.edu Mon Nov 12 14:48:39 1990 From: baird%icsia8.Berkeley.EDU at berkeley.edu (Bill Baird) Date: Mon, 12 Nov 90 11:48:39 PST Subject: preprint - associative memory in oscillating cortex Message-ID: <9011121948.AA02070@icsia8> Preprint announcement: available by ftp from neuroprose Learning with Synaptic Nonlinearities in a Coupled Oscillator Model of Olfactory Cortex Bill Baird Depts. Mathematics, and Molecular and Cell Biology, U.C.Berkeley, Berkeley, Ca. 94720 Abstract A simple network model of olfactory cortex, which assumes only minimal coupling justified by known anatomy, can be analytically proven to function as an associative memory for oscillatory patterns. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long range excitatory connections. Using a local Hebb-like learning rule for primary and higher order synapses at the ends of the long range connections, the system can learn to store the kinds of oscillation amplitude and phase patterns observed in olfactory and visual cortex. Memory capacity is N/2 oscillatory attractors, N/4 chaotic attractors in an N node network. The network can be truely self-organizing because a synapse can modify itself according to it's own pre and postsynaptic activity during stimulation by an input pattern to be learned. The neurons of the neuron pools modeled here can be viewed as operating in the linear region of the usual sigmoidal axonal nonlinearity, and multiple memories are stored instead by the learned {\em synaptic} nonlinearities. Introduction We report recent results of work which seeks to narrow the gap that exists between physiologically detailed network models of real vertebrate cortical memory systems and analytically understood artificial neural networks for associative memory. The secondary olfactory sensory cortex known as prepyriform cortex is thought to be one of the clearest cases of a real biological network with associative memory function. Patterns of 40 to 80 Hz oscillation have been observed in the large scale activity (local field potentials) of olfactory cortex and visual neocortex, and shown to predict the olfactory and visual pattern recognition responses of a trained animal. Similar Observations of 40 Hz oscillation in retina, auditory cortex, motor cortex and in the EMG have been reported. It thus appears that cortical computation in general may occur by dynamical interaction of resonant modes, as has been thought to be the case in the olfactory system. Given the sensitivity of neurons to the location and arrival times of dendritic input, the sucessive volleys of pulses that are generated by the oscillation of a neural net may be ideal for the formation and reliable longe range transmission of the collective activity of one cortical area to another. The oscillation can serve a macroscopic clocking function and entrain or ``bind" the relevant microscopic activity of disparate cortical regions into a well defined phase coherent collective state or ``gestalt". This can overide irrelevant microscopic activity and help produce coordinated motor output. If this view is correct, then oscillatory network modules form the actual cortical substrate of the diverse sensory, motor, and cognitive operations now studied in static networks. It must ultimately be shown how those functions can be accomplished with oscillatory dynamics. ftp proceedure: unix> ftp cheops.cis.ohio-state.edu Name: anonymous Password: neuron ftp> cd pub/neuroprose ftp> binary ftp> get baird.oscmem.ps.Z ftp> quit unix> uncompress baird.oscmem.ps.Z unix> lpr -P(your postscript printer) baird.oscmem.ps For background papers, send e-mail to baird at icsi.berkeley.edu, giving paper or e-mail address for Tex or Postscript output. From jt at cns.edinburgh.ac.uk Mon Nov 12 14:54:23 1990 From: jt at cns.edinburgh.ac.uk (Jay Buckingham) Date: Mon, 12 Nov 90 19:54:23 GMT Subject: Post-NIPS workshop on Associative Memory Message-ID: <2667.9011121954@cns.ed.ac.uk> There will be a 1-day post-NIPS workshop on Associative Memory this year. This is a short summary of what we plan to do. NEUROBIOLOGICALLY MOTIVATED ASSOCIATIVE MEMORY The 1990 ASSOCIATIVE MEMORY workshop will focus on theoretical issues relevant to the understanding of neurobiologically motivated associative memory models. We are organizing it into a set of discussions on key issues. In these discussions the workshop participants can explain how they have addressed the issue at hand in their work and hopefully improve their understanding via exchange with others. What we want are nuts-and-bolts discussions among people who have wrestled with these topics. Some of the key issues are: - Architectures for associative memories, with emphasis on partially connected, biologically motivated models, including multi-stage or modular architectures. - Synaptic learning rules - Performance measures and capacity - Information theoretic issues such as information efficiency - Thresholding techniques - Biological relevance of these models (Discussing, for example, how David Marr made functional statements about the cerebellum, hippocampus and neocortex with associative memory models.) This list is not exhaustive and the items are interrelated so we expect them to come up in various contexts. We will probably begin each discussion by formulating a few questions that the participants can address. So come with your questions about these topics and your ways of thinking about them. Jay Buckingham Cognitive Neuroscience University of Edinburgh 2 Buccleuch Place Edinburgh EH8 9LW, SCOTLAND Phone: (44 or 0) 31 667 1011 ext 6302 E-mail: jt%ed.cns at nsfnet-relay.ac.uk From dave at cogsci.indiana.edu Tue Nov 13 02:28:44 1990 From: dave at cogsci.indiana.edu (David Chalmers) Date: Tue, 13 Nov 90 02:28:44 EST Subject: Proceedings -- ConnectFest 1990 Message-ID: A small connectionist meeting, "ConnectFest 1990", was recently held at Indiana University. Here are selected highlights from the official proceedings. The full proceedings (about 3-4 times the size of this) are available on request from dave at cogsci.indiana.edu. The Frank Rosenblatt Memorial Award for best contribution was taken by Tim van Gelder, who has bravely allowed his name to be attached to his contribution at great risk to his career. For your very own ConnectFest T-shirt (only $8 each), send e-mail to Doug Blank, blank at copper.ucs.indiana.edu. Dave Chalmers (dave at cogsci.indiana.edu) Center for Research on Concepts and Cognition Indiana University. ========================================================================== Proceedings of ConnectFest 1990 -- Indiana University, November 3-4, 1990. Compiled and edited by Doug Blank and Dave Chalmers. (c) 1990, ConnectFest Publications. ----------------------------------------------------- Said the Boltzmann machine to the whore, When his temperature reached sixty-four, "This is such a great feeling I've slowed down my annealing. That way there's more time to explore." -- Mike Gasser ----------------------------------------------------- The folks from the journal Cognition Weren't more than a bad apparition. All's left of Fodor Is just a bad odor, And nobody's missin' Pylyshyn. -- Tim van Gelder ----------------------------------------------------- From Alex.Waibel at SPEECH2.CS.CMU.EDU Tue Nov 13 02:30:50 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Tue, 13 Nov 90 02:30:50 EST Subject: Job Openings at CMU Message-ID: The School of Computer Science at Carnegie Mellon University has several immediate positions at the level of Research Associate and Research Programmer. These positions are research positions without teaching obligations aimed at the development and evaluation of neural network based speech understanding systems. We are seeking strongly motivated, outstanding individuals who have a demonstrated background and research experience in at least one of the following areas and an interest and commitment to work in the others: o Speech Recognition and Understanding (Stochastic or Connectionist) o Neural Network Modeling and Connectionist System Design o System design and implementation, development of connectionist systems on fast parallel hardware, including various available supercomputers, signal processors and neural network chips. Applicants for the position of Research Associate should have completed their PhD and have a demonstrated ability to do independent innovative research in an area listed above. Applicants for the position of Research Programmer should have a B.S. or M.S. degree and be interested in performing independent research and in developing innovative system solutions. Carnegie Mellon University is one of the leading institutions in Computer Science and offers an exciting and challenging environment for research in the areas listed above. The connectionist group at CMU, is one of the most active research teams in this area in the US, and provides a stimulating environment for innovative research. It also maintains a lively interaction with other departments, including Psychology, the Center for Machine Translation, and Computational Linguistics. Interested individuals should send a complete curriculum vitae listing three references to: Dr. Alex Waibel School of Computer Science Carnegie Mellon University Pittburgh, PA 15213 Email: waibel at cs.cmu.edu From pratt at paul.rutgers.edu Tue Nov 13 09:18:42 1990 From: pratt at paul.rutgers.edu (Lorien Y. Pratt) Date: Tue, 13 Nov 90 09:18:42 EST Subject: Announcing a NIPS '90 workshop comparing decision trees and neural nets Message-ID: <9011131418.AA03376@paul.rutgers.edu> Neural Networks and Decision Tree Induction: Exploring the relationship between two research areas A NIPS '90 workshop, 11/30/1990 or 12/1/1990, Keystone, Colorado Workshop Co-Chairs: L. Y. Pratt and S. W. Norton The fields of Neural Networks and Machine Learning have evolved separately in many ways. However, close examination of multilayer perceptron learning algorithms (such as Back-Propagation) and decision tree induction methods (such as ID3 and CART) reveals that there is considerable convergence between these subfields. They address similar problem classes (inductive classifier learning) and can be characterized by a common representational formalism of hyperplane decision regions. Furthermore, topical subjects within both fields are related, from minimal trees and network reduction schemes to incremental learning. In this workshop, invited speakers from the Neural Network and Machine Learning communities will discuss their empirical and theoretical comparisons of the two areas, and then present work at the interface between these two fields which takes advantage of the potential for technology transfer between them. In a discussion period, we'll discuss our conclusions, comparing the methods along the dimensions of representation, learning, and performance. We'll debate the ``strong convergence hypothesis'' that these two research areas are really studying the same problem. Schedule of talks: AM: 7:30-7:50 Lori Pratt Introductory remarks 7:50-8:10 Tom Dietterich Evidence For and Against Convergence: Experiments Comparing ID3 and BP 8:15-8:35 Les Atlas Is backpropagation really better than classification and regression trees? 8:40-9:00 Ah Chung Tsoi Comparison of the performance of some popular machine learning algorithms: CART, C4.5, and multi-layer perceptrons 9:05-9:25 Ananth Sankar Neural Trees: A Hybrid Approach to Pattern Recognition PM: 4:30-4:55 Stephen Omohundro A Bayesian View of Learning with Tree Structures and Neural Networks 5:00-5:20 Paul Utgoff Linear Machine Decision Trees 5:25-5:45 Terry Sanger Basis Function Trees as a Generalization of CART, MARS, and Other Local Variable Selection Techniques 5:50-6:30 Discussion, wrap-up ------------------------------------------------------------------------------ L. Y. Pratt S. W. Norton pratt at paul.rutgers.edu, norton at learning.siemens.com Rutgers University Computer Science Dept. Siemens Corporate Research New Brunswick, NJ 08903. 755 College Road East (201) 932-4634 Princeton, NJ 08540 (609) 734-3365 From PEGJ at oregon.uoregon.edu Tue Nov 13 11:12:00 1990 From: PEGJ at oregon.uoregon.edu (Peggy Jennings) Date: Tue, 13 Nov 90 08:12 PST Subject: real thinking Message-ID: <05B83B73F77F40AB90@oregon.uoregon.edu> This is an open letter to those of you who assume that men are connectionists and women are whores. I am sick of having to point out to you that women are not only connectionists but are humans as well. I can't describe how angry it makes me to log onto the network and find a series of limericks, some of them clever and funny to all of us, others insulting and denigrating to women. What next? Sambo jokes? I wish you guys would use your real brains and think about others than yourselves. Get a clue. _____________________ Peggy J. Jennings Dept. of Psychology Univ. of Oregon From sun at umiacs.UMD.EDU Tue Nov 13 14:11:28 1990 From: sun at umiacs.UMD.EDU (Guo-Zheng Sun) Date: Tue, 13 Nov 90 14:11:28 -0500 Subject: ~m Message-ID: <9011131911.AA01209@neudec.umiacs.UMD.EDU> From ml-connectionists-request at Q.CS.CMU.EDU Tue Nov 13 03:09:35 1990 Received: from skippy.umiacs.umd.edu by neudec.umiacs.UMD.EDU (5.61/UMIACS-0.9/04-05-88) id AA00957; Tue, 13 Nov 90 03:09:34 -0500 Received: from Q.CS.CMU.EDU by skippy.umiacs.UMD.EDU (5.61/UMIACS-0.9/04-05-88) id AA06534; Tue, 13 Nov 90 03:09:31 -0500 Received: from Q.CS.CMU.EDU by Q.CS.CMU.EDU id aa22519; 12 Nov 90 14:11:22 EST Received: from cs.cmu.edu by Q.CS.CMU.EDU id aa22517; 12 Nov 90 14:05:09 EST Received: from neudec.umiacs.umd.edu by CS.CMU.EDU id aa20292; 12 Nov 90 14:00:08 EST Received: by neudec.umiacs.UMD.EDU (5.61/UMIACS-0.9/04-05-88) id AA00284; Mon, 12 Nov 90 13:59:59 -0500 Date: Mon, 12 Nov 90 13:59:59 -0500 From: Guo-Zheng Sun Message-Id: <9011121859.AA00284 at neudec.umiacs.UMD.EDU> To: connectionists at CS.CMU.EDU Status: RO >I am interested in constructive algorithms for neural networks >(capable of generating new units and layers in the NN). Is there >any constructive algorithm for NN whose units have continuous >(e.g.: sigmoidal) activation function? The works of this kind back to 1987 may also include: (1)G.Z. Sun, H.H. Chen and Y.C. Lee, "A Novel Net that Learns Sequential Decision Processes", NIP Proceedings (1987),Denver. (2) G.Z. Sun, H.H.Chen and Y.C.Lee, "Parallel Sequential Induction Network: A New Paradiagm of Neural Network Architecture", ICNN Proceedings,(1988), San Diego. These two papers generalized the idea of conventional regression tree to generate the new NN units dynamically. The continuous activation fuction is used. This PSIN (Parallel Sequential Induction Net) algorithm integrates the parallel processing of neural net and sequential processing of decision tree together to form a "most efficient" multilayered neural net classifier. The efficiency measure is the information entropy. -G. Z. Sun University of Maryland this is a test of mail tilde escape From stolcke at ICSI.Berkeley.EDU Mon Nov 12 23:34:53 1990 From: stolcke at ICSI.Berkeley.EDU (stolcke@ICSI.Berkeley.EDU) Date: Mon, 12 Nov 90 20:34:53 PST Subject: Summary: Cluster Analysis software Message-ID: <9011130434.AA00821@icsib12.Berkeley.EDU> Here is a short summary of the replies I got to my request for clustering software. Thanks to all who replied, the problem is more that solved. A couple of home-brewn programs were mentioned. One that apparently has been floating around quite a bit is "cluster" by Yoshiro Miyata. This turned out to be the program that draws cluster trees in ASCII that I had mentioned in my original message. Many people mentioned "S" (or "S+" in a more recent version), a general statistics package that does cluster analysis among other things. Reportedly PostScript and X Windows graphics output are supported. This is commercial software, the contact address is STATSCI, PO Box 85625, Seattle, WA 98145-1625, 206-322-8707 (standard disclaimer: I have no personal interest in the economic success of this company, I don't even know their product yet). There is also a book about S by R.A. Becker & J.M. Chambers, "S: An Inter- active Environment for Data Analysis and Graphics," Wadsworth, Belmont, Ca., 1984 (2nd edition, probably not the most recent one). An alternative (possibly less expensive, but then I have no pricing info yet), might be Berkeley's BLSS software package, available through BLSS Project, Dept. of Statistics, University of California, Berkeley, CA 94720, 415-642-5258, blss at back.berkeley.edu. Again, there's a book out, by D.M. Abrahams & F. Rizzardi, "BLSS: The Berkeley Interactive Statistical System," W.W. Norton & Company, 1988. Finally, since I needed something FAST, I created my own solution. I added a couple of features to Miyata's cluster program to make it more useful in conjunction with other programs. It now can be used conveniently as a UNIX filter, and optionally produces device-independent graphics output that may be rendered on a variety of output media through the family of standard UNIX programs dealing with plot(5) format (including PostScript and X Windows previewing). Other features added include options for output format selection, distance metric selection, and scaling of individual dimensions. I have done absolutely nothing to improve the algorithm used to do the actual clustering, since my data sets are typically small and I needed to quickly hack up something that produces acceptable output. (The complexity of the algorithm currently used is n^3, whereas n log n is feasible, I am told.) The program is written modularly enough so that it shouldn't be too hard to plug in a better algorithm if available. The program as it stands now should be flexible enough to be of public use. It could be a poor man's solution for those who need to produce cluster analyses but do not have access to fancy statistics software packages or don't want to deal with one. I'll make it available through anonymous FTP from icsi-ftp.berkeley.edu (128.32.201.55). To get it use FTP as follows: % ftp icsi-ftp.berkeley.edu Connected to icsic.Berkeley.EDU. 220 icsi-ftp (icsic) FTP server (Version 5.60 local) ready. Name (icsic.Berkeley.EDU:stolcke): anonymous Password (icsic.Berkeley.EDU:anonymous): 331 Guest login ok, send ident as password. 230 Guest login Ok, access restrictions apply. ftp> cd pub 250 CWD command successful. ftp> binary 200 Type set to I. ftp> get cluster.tar.Z 200 PORT command successful. 150 Opening BINARY mode data connection for cluster.tar.Z (15531 bytes). 226 Transfer complete. 15531 bytes received in 0.08 seconds (1.9e+02 Kbytes/s) ftp> quit 221 Goodbye. Then unpack and compile using % zcat cluster.tar.Z | tar xf - % make Please don't ask me to mail it if you can get it through FTP. Of course I cannot guarantee the usefulness of the program or promise any level of support etc. However, I'd like to know of any bug fixes and improvements, especially to the algorithm (hint, hint ...) Again, thanks for all the replies, Andreas From gary%cs at ucsd.edu Tue Nov 13 15:38:09 1990 From: gary%cs at ucsd.edu (Gary Cottrell) Date: Tue, 13 Nov 90 12:38:09 PST Subject: real thinking Message-ID: <9011132038.AA10517@desi.ucsd.edu> I totally agree with Peggy. I find several of those limericks really offensive. Degrading women degrades the degrader. g. From gary%cs at ucsd.edu Tue Nov 13 15:43:05 1990 From: gary%cs at ucsd.edu (Gary Cottrell) Date: Tue, 13 Nov 90 12:43:05 PST Subject: real thinking Message-ID: <9011132043.AA10527@desi.ucsd.edu> PS. As long as we're on the subject, I also find the continued use of the "Missionaries and Cannibals" problem in AI as really poor taste. I usually present it to my class as the "Masai and missionaries" problem - the goal being for the Masai to avoid being converted by the missionaries. g. From ISAI at TECMTYVM.BITNET Tue Nov 13 19:23:35 1990 From: ISAI at TECMTYVM.BITNET (Centro de Inteligencia Artificial(ITESM)) Date: Tue, 13 Nov 90 18:23:35 CST Subject: CALL FOR PAPERS Message-ID: <901113.182335.CST.ISAI@TECMTYVM> Please include the following information in your bulletin board. Thank you in advance. Sincerely, The Symposium Publicity Committee. ====================================================================== INTERNATIONAL SYMPOSIUM ON ARTIFICIAL INTELLIGENCE APPLICATIONS IN INFORMATICS: Software Engineering, Data Base Systems, Computer Networks, Programming Environments, Management Information Systems, Decision Support Systems. C A L L F O R P A P E R S Preliminary Version. The Fourth International Sysmposium on Artificial Intelligence will be held in Cancun Mexico on November 13-15, 1991. The Symposium is sponsored by the ITESM (Instituto Tecnologico y de Estudios Superiores de Monterrey) in cooperation with the International Joint Conferences on Artificial Intelligence Inc., the American Association for Artificial Intelligence, the Canadian Society for Computational Studies of Intelligence, the Sociedad Mexicana de Inteligencia Artificial and IBM of Mexico. Papers from all countries are sought that: (1) Present applications of artificial intelligence technology to the solution of problems in Software Engineering, Data Base Systems, Computer Networks, Programming Environments, Management Information Systems, Decision Support Systems and other Informatics technologies; and (2) Describe research on techniques to accomplish such applications, (3) Address the problem of transfering the AI Technology in different socio-economic contexts and environments. Areas of application include but are no limited to: Software development, software design, software testing and validation, computer-aided software engineering, programming environments, structured techniques, intelligent databases, operating systems, intelligent compilers, local networks, computer network design, satellite and telecommunications, MIS and data processing applications, intelligent decision support systems. AI techniques include but are not limited to: Expert systems, knowledge acquisition and representation, natural language processing, computer vision, neural networks and genetic algorithms, automated learning, automated reasoning, search and problem solving, knowledge engineering tools and methodologies. Persons wishing to submit a paper should send five copies written in English to: Hugo Terashima, Program Chair Centro de Inteligencia Artificial, ITESM. Ave. Eugenio Garza Sada 2501, Col.Tecnologico C.P. 64849 Monterrey, N.L. Mexico Tel.(52-83) 58-2000 Ext.5134 Telefax (52-83) 58-1400 Dial Ext.5143 or 58-2000 Ask Ext.5143 Net address: ISAI at tecmtyvm.bitnet or ISAI at tecmtyvm.mty.itesm.mx The paper should identify the area and technique to which it belongs. Extended abstract is not required. Use a serif type font, size 10, sigle-spaced with a maximum of 10 pages. No papers will be accepted by electronic means. Important dates: Papers must be received by April 30,1991. Authors will be notified of acceptance or rejection by June 15,1991. A final copy of each accepted paper, camera ready for inclusion in the Symposium proceedings, will be due by July 15,1991. ======================================================================= From pollack at cis.ohio-state.edu Tue Nov 13 21:22:08 1990 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Tue, 13 Nov 90 21:22:08 -0500 Subject: real thinking In-Reply-To: Peggy Jennings's message of Tue, 13 Nov 90 08:12 PST <05B83B73F77F40AB90@oregon.uoregon.edu> Message-ID: <9011140222.AA01478@dendrite.cis.ohio-state.edu> **Do Not Forward** As one of the persons responsible for the Connectfest, I apologize to Peggy Jennings for all the hidden units of misogyny in the abridged proceedings recently circulated. One has to read very closely to find: "I recursively RAAM it/with Symbols Goddammit" (Brutality?) "We only had backpropagation." (Sodomy?) "Your bottom feels better/Sliding over each letter" (slave branding?) "a willing basin of attraction" (body-part objectification?) I also apologize to the women from Bloomington who were present during the sophomoric smut contest, and must have felt uncomfortable with us males engaging in such a "connectionismo" poetry competition. However, I think the charge of sexism would be much more effective if offered with the correct meter and form: The Boltzwoman yelled at the chauvinist: "You used the word "whore" and I'm pissed! I'll clamp down on your nodes, anneal your temporal lobes, And your genetic code will be fixed!" Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Fax/Phone: (614) 292-4890 From harnad at clarity.Princeton.EDU Wed Nov 14 09:15:09 1990 From: harnad at clarity.Princeton.EDU (Stevan Harnad) Date: Wed, 14 Nov 90 09:15:09 EST Subject: Netiquette Message-ID: <9011141415.AA12499@psycho.Princeton.EDU> > Date: Tue, 13 Nov 90 12:38:09 PST > From: Gary Cottrell > > ...I find several of those limericks really offensive. > Degrading women degrades the degrader... > > ...I also find the continued use of the "Missionaries and > Cannibals" problem in AI as really poor taste... Is there more than one Gary Cottrell on the net? Or has the one that posted the "Rumelphart" piece just a few weeks ago undergone some sort of an aesthetic conversion experience? Stevan Harnad From fritz_dg%ncsd.dnet at gte.com Wed Nov 14 09:15:26 1990 From: fritz_dg%ncsd.dnet at gte.com (fritz_dg%ncsd.dnet@gte.com) Date: Wed, 14 Nov 90 09:15:26 -0500 Subject: missionaries and cannibals Message-ID: <9011141415.AA13520@bunny.gte.com> Open to g. Masai and missionaries still in poor taste, as someone remains the butt of the joke. Think up something new and neutral. Also, in future, please provide address so reply needn't be broadcast. fritz_dg%ncsd at gte.com From hendler at cs.UMD.EDU Wed Nov 14 09:33:28 1990 From: hendler at cs.UMD.EDU (Jim Hendler) Date: Wed, 14 Nov 90 09:33:28 -0500 Subject: real thinking Message-ID: <9011141433.AA07326@dormouse.cs.UMD.EDU> Before this debate gets too political (and I do agree with Peggy and Gary), I'd like to point out that all of the limericks posted were in the modern limerick form, in which the first and fifth lines rhyme, but don't recap. The traditional limerick form, made most popular by Lear, has a reiteration of first and fifth. Here's an example in the traditional form to make up for this lack: Said a wizened old grad with a net, "Alas, It has not converged yet!" The learn rate? why 1, Momentum? ummm ... none. That frustrated grad with a net. -J Hendler From pastor at PRC.Unisys.COM Wed Nov 14 10:58:13 1990 From: pastor at PRC.Unisys.COM (pastor@PRC.Unisys.COM) Date: Wed, 14 Nov 90 10:58:13 EST Subject: Stop it right now, please Message-ID: <9011141558.AA01269@gem.PRC.Unisys.COM> I hesitate to get involved in this, but I think that the limerick "discussion" has moved outside the scope of connectionists. A news group/mailing list that consistently provides challenging, interesting, and *pertinent* discussion (and *no* flames or invective) is a rare animal these days, and I hate to see this one jeopardized. I am not commenting on the merits of either party's arguments, or on the importance of the topic; all I'm saying is that it has nothing to do with connectionism per se. I would like to suggest that the rest of this apparently-escalating (and probably irresolvable) conflict be continued via direct e-mail among the combatants, or on a newsgroup that is more appropriate to its content. Thanks to all concerned, and good luck in resolving your disagreements. From kimd at gizmo.usc.edu Wed Nov 14 14:09:42 1990 From: kimd at gizmo.usc.edu (Kim Daugherty) Date: Wed, 14 Nov 1990 11:09:42 PST Subject: Hierarchical Representations Message-ID: We are interested in representing variable sized hierarchical tree structures as activations over a fixed number of units in a connectionist model. We are aware of two current approaches, Touretzky's BoltzCONS and Pollack's RAAM, that perform this task. Have other approaches been developed as well? We are also looking for some connectionist modeling tools. If anybody has or knows of any simulator that fits some or all of the following description, please contact Kim Daugherty (email below): (1) RAAM simulator written in C or a nice backprop simulator written in C that can be easily converted to a RAAM, (2) A graphical interface for unit activations and weights, and (3) Something written to work under UNIX hosted on a SUN. Thanks in advance for help. We will summarize responses and post to the mailing list if appropriate. Mark Seidenberg Kim Daugherty Neuroscience Program Neuroscience Program University of Southern California University of Southern California marks at neuro.usc.edu kimd at gizmo.usc.edu From LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU Tue Nov 13 23:19:00 1990 From: LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU (LAI-WAN CHAN) Date: Wed, 14 Nov 90 12:19 +0800 Subject: real thinking Message-ID: <04CC9E29C53F20023C@CUCSD.CUHK.HK> > From: Peggy Jennings > Subject: real thinking > To: connectionists at CS.CMU.EDU > Message-id: <05B83B73F77F40AB90 at oregon.uoregon.edu> > This is an open letter to those of you who assume that men are > connectionists and women are whores. I am sick of having to point out to > you that women are not only connectionists but are humans as well. I can't > describe how angry it makes me to log onto the network and find a series of > limericks, some of them clever and funny to all of us, others insulting and > denigrating to women. What next? Sambo jokes? I wish you guys would use > your real brains and think about others than yourselves. Get a clue. Agreed. I, as a woman, am expressing my anger too. Funny jokes are good but don't be insulting. Lai-Wan Chan, Computer Science Dept, The Chinese University of Hong Kong. From wilson at cs.UMD.EDU Wed Nov 14 13:17:38 1990 From: wilson at cs.UMD.EDU (wilson@cs.UMD.EDU) Date: Wed, 14 Nov 90 13:17:38 -0500 Subject: real thinking Message-ID: <9011141817.AA14886@brillig.cs.UMD.EDU> I too found some of the limericks to be offensive and showing a great insensitivity, and am glad to see a few others agree. But because I'm afraid we may be in the minority I feel compelled to add my voice. Although I believe Jordan Pollack had the best intentions, he may have inadvertently trivialized the issue by making light of it. Sigh. Will we ever learn?? Anne Wilson From dave at cogsci.indiana.edu Wed Nov 14 16:36:41 1990 From: dave at cogsci.indiana.edu (David Chalmers) Date: Wed, 14 Nov 90 16:36:41 EST Subject: limericks Message-ID: I apologize to anyone who was offended. The organizers felt, perhaps naively, that none of the limericks were degrading, being sexual rather than sexist. Sorry for cluttering connectionists with all this. Dave Chalmers. From gary%cs at ucsd.edu Wed Nov 14 16:39:02 1990 From: gary%cs at ucsd.edu (Gary Cottrell) Date: Wed, 14 Nov 90 13:39:02 PST Subject: Netiquette Message-ID: <9011142139.AA11426@desi.ucsd.edu> >From: Stevan Harnad >> Date: Tue, 13 Nov 90 12:38:09 PST >> From: Gary Cottrell >> >> ...I find several of those limericks really offensive. >> Degrading women degrades the degrader... >> >> ...I also find the continued use of the "Missionaries and >> Cannibals" problem in AI as really poor taste... > >Is there more than one Gary Cottrell on the net? Or has the one that >posted the "Rumelphart" piece just a few weeks ago undergone some sort >of an aesthetic conversion experience? > >Stevan Harnad I think there's a big difference between satirizing individuals and using notions that degrade a class of people. The notion of "whore" is something that applies only to women. You can argue about this, and *we should take this argument off the net*, but there is no equivalent notion for men. It is an inherently sexist concept. Yes, the fake names I use in my humor are sometimes cheap shots. Sometimes they are just the closest funniest similar sounding thing I can think of, and not meant to attribute any particular qualities to the individuals so targeted. As far as I know my farts are the worst around. gary From chan at unb.ca Wed Nov 14 15:58:47 1990 From: chan at unb.ca (Tony Chan) Date: Wed, 14 Nov 90 16:58:47 AST Subject: Peggy Jennings Message-ID: I came from a culture, where, as far as I know, feminism is assumed and femininity is worshipped. I unreservedly agree with Peggy on this issue. Sexism aside, I find it sickening to learn that the cream of the crop of this culture wastes so much of the resources---mental, electronic, and others---on non-scientific matters. Being a member of this list, I feel I must apologize sincerely to women on the list even though I have absolutely nothing to do with Jordan Pollock or the Connectfest that he was partly responsible for. Tony Chan Telephone: 506 648-5500 P.O. Box 5050 Fax: 506 648-5528 Computer Science Email: chan at unb.ca UNB Saint John Campus Saint John, New Brunswick E2L 4L5 Canada From Connectionists-Request at CS.CMU.EDU Thu Nov 15 10:19:56 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Thu, 15 Nov 90 10:19:56 EST Subject: Fwd: CALL FOR PAPERS Message-ID: <18295.658682396@B.GP.CS.CMU.EDU> ------- Forwarded Message From wilson at magi.ncsl.nist.gov Wed Nov 14 13:16:00 1990 From: wilson at magi.ncsl.nist.gov (Charles Wilson x2080) Date: Wed, 14 Nov 90 13:16:00 EST Subject: CALL FOR PAPERS Message-ID: <9011141816.AA11580@magi.ncsl.nist.gov> CALL FOR PAPERS PROGRESS IN NEURAL NETWORKS This is a call for papers for Special Volume of the Progress In Neural Networks Series. This volume will concentrate on software implementation of neural networks, natural and synthetic. Contributions from leading researchers and experts will be sought. This series is intended for a wide audience, including those professionally involved in neural network research, such as lecturers and primary investigators in neural computing, neural modeling, neural learning, neural memory, and neurocomputers. Authors are invited to submit an abstract, extended summary, or manuscripts describing recent progress in theoretical analysis, modeling, or design of neural network architecture. The manuscripts should be self contained and of a tutorial nature. Suggested topics include, but are not limited to: * Parallel Architectures * Distributed Architectures * Hybrid Architectures * Parallel Learning Models * Connectionist Architectures * Associative Memory * Self Organizing and Adaptive Systems * Neural Net Language)to)Architecture Translation Ablex and the editors invite you to submit an abstract, extended summary, or manuscript proposal for consideration. Please contact the editors directly. Omid M. Omidvar, Associate Professor Charles L. Wilson, Manager Progress Series Editor Associate Volume Editor University of the District of Columbia Image Recognition Group Computer Science Department Advanced Systems Division 4200 Connecticut Avenue, N.W. National Inst. of Tech & Stand. Washington, D.C. 20008 Gaithersburg, Maryland 20899 Tel:(202)282-7345 Fax:(202)282-3677 Tel:(301)975-2080 Fax:(301)590-0932 Email: OOMIDVAR at UDCVAX.BITNET Email:Wilson at MAGI.NCSL.NIST.GOV Publisher:Ablex Publishing Corp.,355 Chestnut St., Norwood,NJ 07648 (201)767)8450 ------- End of Forwarded Message From AC1MPS at primea.sheffield.ac.uk Thu Nov 15 16:02:49 1990 From: AC1MPS at primea.sheffield.ac.uk (AC1MPS@primea.sheffield.ac.uk) Date: Thu, 15 Nov 90 16:02:49 Subject: super-Turing systems Message-ID: Two points: (1) All this chat about limericks is irrelevant to the actual purpose of this network, as I understand it. Some subscribers have to pay to receive this stuff - others have better things to do with their time than wade through it all to find the bits that matter. Please stop it. (2) super-Turing systems ==================== For some time now I've been interested in identifying systems that seem to possess 'super-Turing' potential, in the sense that they can implement/model processes which are provably not modellable by a Turing machine program. A few weeks ago, the Pour-El work was broadcast, showing that some aspects of analog computation were super-Turing. Other examples exist as well. Does anyone know of any others, besides the ones listed below? a) Pour-El et al (cited in earlier broadcast) The wave equation with computable initial conditions can assume non-computable values after computable time. b) Myhill (cited in Pour-El's work) A recursive function can have non-recursive derivative. c) An analog version of the Turing machine can solve the Turing halting problem. (myself: "X-machines and the Halting Problem ... ", to appear in Formal Aspects of Computing 1990(4)). d) An extended version of the universal quantum computer (Deutsch 1985) can implement unbounded non-determinism. (myself: work in progress, unpublished) [Reference to Deutsch's work: "Quantum theory, the Church-Turing principle, and the universal quantum computer", Proc R.Soc.London A400 (1985) pp. 97-117]. e) Standard concurrency specification languages can represent unbounded non-determinism (e.g. CCS using infinite summation). f) Smolin/Bennett Quantum mechanics can be used in cryptology. Claimed in Deutsch 1989 (Quantum communication thwarts eavesdroppers, in New Scientist, December 9th 1989) that their QPKD system is the first physical information processing device with capabilities beyond those of the Turing machine. I'm currently trying to decide how (if?) these models can be unified, and whether any of them might reasonably be implemented within biological systems. Is anyone working on similar problems? Mike Stannett, AC1MPS @ uk.ac.shefpa From kanal at cs.UMD.EDU Thu Nov 15 11:50:37 1990 From: kanal at cs.UMD.EDU (Laveen N. Kanal) Date: Thu, 15 Nov 90 11:50:37 -0500 Subject: Announcing a NIPS '90 workshop comparing decision trees and neural nets Message-ID: <9011151650.AA03106@brillig.cs.UMD.EDU> Your information has come too late for me to take advantage and be present aty this interesting workshop. Many of the papers in the A.I literature and statistics literature on the topic of decision trees have ingnored other literature on the subject in pattern recogntion. For those who are interested an extensive survey appears in " decision trees in Pattern recognition" by G.R. Dattatreya and L.N. Kanal, in Progress in pattern recognition 2, L.kanal and A. Rosenfeld (editors), North-Holland, 1985. Please let me know if you are going to put together the papers presented at your workshop as I would be most interested in seing them. Thanks, laveen kanal From lina at ai.mit.edu Thu Nov 15 11:36:41 1990 From: lina at ai.mit.edu (Lina Massone) Date: Thu, 15 Nov 90 11:36:41 EST Subject: limericks Message-ID: <9011151636.AA01275@cauda-equina> I too feel compelled to add my voice. So that, hopefully, people will do some more thinking before sending offensive, not-funny jokes on the network. Lina Massone From jai at sol.boeing.com Thu Nov 15 13:49:53 1990 From: jai at sol.boeing.com (Jai Choi) Date: Thu, 15 Nov 90 10:49:53 PST Subject: NN for image pre-processing Message-ID: <9011151849.AA09339@sol.boeing.com> Hi, my name is Jai Choi I found this connectionist letter very useful to get new information. As you know NN has proven to be well suited for the low-level image pre-processing. We can take advantage of speed over conventional convolution technique. I am currently trying to implement NN processor for examining an X-ray image with hard-wired neural network. Do anybody know of any kind of hardware development for this problem ? Please share information regarding NN application for low level image pre-processing either software or hardware aspect. My e-mail address is jai at sol.boeing.com. My physical address is Jai Choi Boeing Computer Services P.O.Box 24346, MS 6R-08 Seattle, WA 98124. I will summarize and post the result later. From pratt at paul.rutgers.edu Thu Nov 15 18:03:01 1990 From: pratt at paul.rutgers.edu (Lorien Y. Pratt) Date: Thu, 15 Nov 90 18:03:01 EST Subject: Request for references to adaptive MLP/BP networks (i.e. changing training data) Message-ID: <9011152303.AA12431@paul.rutgers.edu> Hello, I'm collecting references for papers which address the behavior of back-propagation-like networks when the training data changes. My bibliography so far of papers that look applicable is reproduced below this message. I'd appreciate any pointers that you might have to other papers, especially recent ones, or where the fact that the training data changes is not obvious from the title. Thanks in advance! -- Lori ------------------------------------------------------------------------------ @techreport{ weigend-90, MYKEY = " weigend-90 : .unb .csy .con .meth", TITLE = "Predicting the Future: A Connectionist Approach", AUTHOR = "Andreas S. Weigend and Bernardo A. Huberman and David E. Rumelhart", YEAR = 1990, MONTH = "April", INSTITUTION = "Stanford PDP Research Group", ADDRESS = "Stanford, California 94305-2130", NUMBER = "Stanford-PDP-90-01, PARC-SSl-90-20", } @misc{ tenorio-89a, MYKEY = " tenorio-89a : .bap .unr10 .unb .csy .app .con ", TITLE = "{Adaptive Networks as a Model for Human Speech Development}", NOTE = "A large scaled-up back-propagation study", KEY = "tenorio-89a" } @INCOLLECTION{ barron-84, MYKEY = " barron-84 : .unf .dt ", AUTHOR = {R. L. Barron and A. N. Mucciardi and F. J. Cook and J. N. Craig and A. R. Barron}, TITLE = {Adaptive Learning Networks: Development and Application in the {United} {States} of Algorithms related to {GMDH}}, BOOKTITLE = {Self-Organizing Methods in Modeling}, PUBLISHER = {Marcel Dekker}, YEAR = {1984}, EDITOR = {S. J. Farlow}, address = "{New} {York}" } @misc{ nguyen-89, MYKEY = " nguyen-89: .hop .blt .con .cnf ", TITLE = "{A New LMS-Based Algorithm for Rapid Adaptive Classification in Dynamic Environments}", KEY = "nguyen-89" } Allen, R.B. Adaptive Training of Connectionist State Machines. ACM Computer Science Conference, Louisville, Feb, 1989, 428. --References for which I don't have full bibliographic information:-- Adaptive Pattern Recognition and Neural Networks by Y. Pao >From Von Nostran Reinhold: NEURAL NETWORK ARCHITECTURES An Introduction by Judith Dayhoff, Ph.D., Editor of the Journal of Neural Network Computing Contents include: Associative and Adaptive Networks - More Paradigms The book: ``Neural Computing'' From slehar at park.bu.edu Thu Nov 15 19:06:43 1990 From: slehar at park.bu.edu (slehar@park.bu.edu) Date: Thu, 15 Nov 90 19:06:43 -0500 Subject: real thinking In-Reply-To: connectionists@c.cs.cmu.edu's message of 15 Nov 90 01:03:18 GM Message-ID: <9011160006.AA24654@thalamus.bu.edu> When the meaning of every joke Is perceived as a sexist poke The problem may lie In the reader's own eye That they can't laugh like regular folk :-) From french at cogsci.indiana.edu Fri Nov 16 08:52:32 1990 From: french at cogsci.indiana.edu (Bob French) Date: Fri, 16 Nov 90 08:52:32 EST Subject: No subject Message-ID: If one were to simply effect a count of the responses to Ms. Peggy Jennings' lambasting of the limericks posted recently on connectionists, one might conclude that there is significant support of her view. I believe this is wrong. I think it probable that that those who disagree with her have simply kept their mouths shut for fear of being branded as sexist for their disagreement. Allow me to buck that trend. I wish to speak on behalf of Mike Gasser (author of the Offending Limerick) and Dave Chalmers (one of the organizers of the Offending Contest in which the Offending Limerick was read and responsible for posting It to the net), both of whom are good friends of mine. Each of them is absolutely aware, in theory and in practice, in their speech and in their writings of the issues of sexism, sexist language, etc. and it pains me to see the inflammatory remarks of the Oregon connectionist directed at them. Stauncher supporters of women in science than Mike and Dave you would be very hard pressed to find. I feel that a Ms. Jennings' comments were an over-reaction verging on paranoia ("men are connectionists and women are whores"). While one might argue that the limerick about the Boltzman machine might have been mildly offensive because of its use of an apparently proscribed word ("whore"), under no circumstances did it merit such an extremely virulent response. I personally thought the limerick in question was funny and, judging by the reaction of the audience of men AND WOMEN at the ConnectFest here in Bloomington, so did they. And, no, the participating women were not unthinking minions to male domination. They, like the guys there, simply had a sense of humor. And now, hopefully, back to connectionism. Bob French Center for Research on Concepts and Cognition, Indiana University From petsche at learning.siemens.com Fri Nov 16 09:20:57 1990 From: petsche at learning.siemens.com (Thomas Petsche) Date: Fri, 16 Nov 90 09:20:57 EST Subject: Peggy Jennings In-Reply-To: Message-ID: <9011161420.AA20254@learning.siemens.com.siemens.com> Perhaps we could all get off our high horses now and stop bashing Jordan and his Connectionfest now. In response to Tony Chan <@bitnet.cc.cmu.edu:chan at unb.ca>: Actually, I spend approximately half my waking hours engaged in all sorts of activity that "wastes so much of the resources---mental, electronic, and others---on non-scientific matters". Eg, cooking, cleaning, house repairs, playing with children, and yes, even childish joking with friends (my goodness, (nope, no profanity here!) most of them have graduate degrees -- what is the world coming to?). I'm ashamed to admit it, but I even exchange email (would you believe it, wasting all that bandwidth!) *containing winking smileys* with my wife. Hope this sexual (sexist?) revelation doesn't offend anyone. ... I was interested to discover that there is no such thing as a male prostitute ... hmm, what was that movie, "American ...". Don't tell me ... I'll get it ... starts with a "G" ... From Michael.Witbrock at MJW.BOLTZ.CS.CMU.EDU Fri Nov 16 10:28:34 1990 From: Michael.Witbrock at MJW.BOLTZ.CS.CMU.EDU (Michael.Witbrock@MJW.BOLTZ.CS.CMU.EDU) Date: Fri, 16 Nov 90 10:28:34 EST Subject: verifiable cognition Message-ID: The recent attempt to show humour, which subsists in restating the rumour, that ``regular folks'' amounts to ``us blokes'', is no better than those posted sooner. From Alex.Waibel at SPEECH2.CS.CMU.EDU Fri Nov 16 11:51:08 1990 From: Alex.Waibel at SPEECH2.CS.CMU.EDU (Alex.Waibel@SPEECH2.CS.CMU.EDU) Date: Fri, 16 Nov 90 11:51:08 EST Subject: Announcing a NIPS '90 workshop comparing decision trees and neural nets Message-ID: There will be summary write-ups for all workshops placed on file at Ohio-State for remote FTP-ing, after the workshops. Enjoy, Alex Waibel From jose at learning.siemens.com Fri Nov 16 08:30:39 1990 From: jose at learning.siemens.com (Steve Hanson) Date: Fri, 16 Nov 90 08:30:39 EST Subject: Netiquette Message-ID: <9011161330.AA20175@learning.siemens.com.siemens.com> Yes folks. Lets get this off the net please. Steve Hanson From Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU Sat Nov 17 00:46:56 1990 From: Dave.Touretzky at DST.BOLTZ.CS.CMU.EDU (Dave.Touretzky@DST.BOLTZ.CS.CMU.EDU) Date: Sat, 17 Nov 90 00:46:56 EST Subject: neural net position available Message-ID: <5238.658820816@DST.BOLTZ.CS.CMU.EDU> I'm posting this position announcement as a favor to the Navy. I have no additional information to what appears below, so contact Dr. Lorey, not me, if you have questions. And please: let's have no more limericks or limerick discussions. Everybody get back to work. -- Dave ................................................................ Subj: POSITION DESCRIPTION 1. The Naval Surface Warfare Center is seeking to expand its operations in basic research in Artificial Neural Nets to compliment the ongoing applied work. A new thrust in basic research is seeking to augment the existing staff with one or two new hires. Preference will be given candidates who will soon complete their PhD work but Masters candidates will be considered. We are also interested in filling at least one summer faculty position. Neural Network experience is a must for all positions. The full-time and summer positions will join the current staff of five--two with PhDs, two with Masters. 2. The ongoing research areas include: multi-sensor fusion, image/data compression, image processing, optimization, development of new learning rules for existing architectures and development of schemes to embed networks in multi-chip simulation systems. 3. Equipment on hand: 4 Sun Workstations 1 Alliant FX40 Minisuper Computer with 2 Compote engines 1 Silicon Graphics 4D/220GTXB Graphics Workstation 4. Planned positions(s) will encompass work in the following areas: -route planning for autonomous vehicles, -improvement of existing (Hopfield, Boltzmann machines) and -development of new network architectures for optimization, -development/implementation of networks for low-level vision processing, and -evaluation/integration of neural network processing chips 5. Planned procurements: 2 Silicon Graphics 4D/35TG 1 Terrain board 1 Silicon Graphics 4D1340VGXB 1 VME Sun Expansion box 1 VME SG Expansion box 1 Breadboard table Video capability 5. Technical questions regarding ongoing work at the Center may be addressed to Dr. George Rogers or Mr. Jeffrey Solka at 703-663- 7650. 6. Interested candidates should submit their resume to Dr. Richard Lorey (703-663-8159) at MILNET address: rlorey@ relay.nswc.navy.mil or via mail to: Commander Naval Surface Warfare Center Code K12 (Dr. Richard Lorey) Dahlgren, VA 22448-5000 7. ELIGIBILITY FOR SECRET LEVEL CLEARANCES MANDATORY - RESUME SHOULD INDICATE U.S. CITIZENSHIP STATUS. /s/ Dr. Richard Lorey From burr at mojave.stanford.edu Sat Nov 17 00:54:20 1990 From: burr at mojave.stanford.edu (Jim Burr) Date: Fri, 16 Nov 90 21:54:20 PST Subject: NIPS VLSI post conference workshop Message-ID: <9011170554.AA06228@mojave.Stanford.EDU> This year's post conference workshop on VLSI neural nets will be held Saturday, Dec 1. There will be a morning and an evening session. The workshop will address the latest advances in VLSI implementations of neural nets. How successful have implementations been so far? Are dedicated neurochips being used in real applications? What algorithms have been implemented? Which ones have not been? Why not? How important is on chip learning? How much arithmetic precision is necessary? Which is more important, capacity or performance? What are the issues in constructing very large networks? What are the technology scaling limits? Any new technology developments? Several invited speakers will address these and other questions from various points of view in discussing their current research. We will try to gain better insight into the strengths and limitations of dedicated hardware solutions. Jim Burr burr at mojave.stanford.edu From rsun at chaos.cs.brandeis.edu Sat Nov 17 00:14:40 1990 From: rsun at chaos.cs.brandeis.edu (Ron Sun) Date: Sat, 17 Nov 90 00:14:40 est Subject: No subject Message-ID: <9011170514.AA15626@chaos.cs.brandeis.edu> Tony Chan wrote: >I came from a culture, where, as far as I know, feminism is assumed and >femininity is worshipped. I wonder where that *culture* is. If you are from where I am from, you are too far from the truth. >I unreservedly agree with Pegg >y on this >issue. Sexism aside, I find it sickening to learn that the cream of the >crop of this culture wastes so much of the resources---mental, >electronic, and others---on non-scientific matters. so much? >Being a member of >this list, I feel I must apologize sincerely to women on the list even >though I have absolutely nothing to do with Jordan Pollock or the >Connectfest that he was partly responsible for. Sounds really weird to me. > >Tony Chan Telephone: 506 648-5500 >P.O. Box 5050 Fax: 506 648-5528 >Computer Science Email: chan at unb.ca >UNB Saint John Campus >Saint John, New Brunswick E2L 4L5 >Canada Please, folks, stop it now. enough of this stuff. --Ron From watt at compsci.stirling.ac.uk Fri Nov 16 11:07:01 1990 From: watt at compsci.stirling.ac.uk (R.J. Watt) Date: 16 Nov 90 16:07:01 GMT (Fri) Subject: Permanent Lectureship Message-ID: <9011161607.AA22485@uk.ac.stir.cs.lev> University of Stirling, Scotland Department of Psychology Centre for Cognitive and Computational Neuroscience (CCCN) We have a permanent post available for a LECTURER: M. Sc. in NEURAL COMPUTATION This new one-year M. Sc. will begin Sept 1991, and will cover a wide variety of topics in neural computation with vision as a major specialisation. The person appointed will have a major responsibility for running the course, and for teaching the vision components, and will also be exepected to have an active research program. The CCCN is multidisciplinary and includes staff from the Departments of Psychology, Computing Science and Mathematics. Starting date: 1 July 1991 Salary on scale UKL 12,086 - UKL 22,311 Applications (initially by by email, fax or snail-mail) including a CV with names and addresses (inc e-mail if poss) of 2 referees to Prof R. J. Watt, Psychology Dept., University of Stirling, FK9 4LA Scotland tel: 0786 67665 fax : + 44 786 63000 e-mail: watt at cs.stir.ac.uk by 26 Nov 1990. Further particulars and information from Prof Watt. The University of Stirling is an equal opportunities employer. From Connectionists-Request at CS.CMU.EDU Sat Nov 17 12:55:56 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Sat, 17 Nov 90 12:55:56 EST Subject: Enough Already! Message-ID: <18700.658864556@B.GP.CS.CMU.EDU> Move the discussion of the limericks off this mailing list! The Connectionists mailing list is a private mailing list for the discussion of *TECHNICAL* issues only. Persons who continue this limerick thread will be removed from the list. Scott Crowder Connectionists-Request at cs.cmu.edu (ARPAnet) From PEGJ at oregon.uoregon.edu Sat Nov 17 17:01:00 1990 From: PEGJ at oregon.uoregon.edu (Peggy Jennings) Date: Sat, 17 Nov 90 14:01 PST Subject: a limerick of my own Message-ID: <0262CDAF6F1F413D93@oregon.uoregon.edu> I am risking broadcasting this message on the network because I started a discussion surrounding the offensive limericks, but did not make clear what offended me. It is important to broadcast the clarification of this issue on the net because there is a recurring type of offense that happens on this network and at connectionist conferences and summer institutes that I am not faced with on other networks and at other conferences. The offenses in the limericks are not related to word usage ('whore') nor about topic selection (sex). I apologize to those who have involuntarily spent money or time reading those messages. The offenses lie in the communication of assumed male perspective and values, and in the assumption that humor that is insulting and degrading to another group of people is acceptable by tacit agreement. This *is* a concern of members of this network. Unfortunately, in my original message, I defined the issues in a conjunction using two asymmetrical relations: (1) that connectionists are men (2) that women are whores. Both of these assumptions anger me -- the second offends me personally (and is *not* the concern of this network) and the first offends me professionally (and *is* the concern of this network). I'd like clarify the issue and pull further discussion of this ***off the network.*** What I am trying to communicate is my frustration and anger at assumption (1): that those who attend Connectionist Modeling Summer Institutes, those who attend connectionism conferences and talks, and those who subscribe to connectionist networks are white men. The offense is one of egocentric perspective. This perspective is manifested as sexism in my case, but to others it could look like racism, for example, or of "religio- centrism" What angers me is the communication of the implicit assumption that members of this network and participants in the ConnectFest share a value system that, for example, finds humor in the function or dysfunction of male sexual arousal or in the exploitation of an objectified sexual "service". We who subscribe to this network have in common that we are *people,* but the only further assumption that can safely be made is that we are all interested in connectionist research and development. Posting humor or any other message that is insulting or degrading to *any* group of people is not appropriate. I am challenging members of this network to communicate a global perspective. When you post something on this network, when you give a presentation at a conference or special institute, do not assume that the audiece is just like you. Assume that members of the audience are different genders, races, nationalities and religions sharing scientific interests in connectionism. The only value system we have identified to you by subscribing to the network or by attending the conference is one of research interest. And that is the only topic we have asked to read about in our mail. With that, let's **get this off the net now** I'll gladly continue this discussion with individuals who want to continue. And finally, I'd like to close this discussion with a little limerick of my own (after Gasser): Said the Boltzmann machine to its lover, "There's an optimum we can discover. This is such a great feeling Let's slow our annealing Exploring each dip in the other." P.S. Thanks to all of you who wrote to support me, criticize me, agree with me, call me names, and ask me what I think. I am fortunate in having the opportunity to enter into discussion with so many of you who care about these issues in our profession. ******************************* -- Peg Jennings From Connectionists-Request at CS.CMU.EDU Mon Nov 19 09:12:22 1990 From: Connectionists-Request at CS.CMU.EDU (Connectionists-Request@CS.CMU.EDU) Date: Mon, 19 Nov 90 09:12:22 EST Subject: call for papers for COLT '91 Message-ID: <10821.659023942@B.GP.CS.CMU.EDU> From haussler at saturn.ucsc.edu Mon Nov 19 01:18:16 1990 From: haussler at saturn.ucsc.edu (David Haussler) Date: Sun, 18 Nov 90 22:18:16 -0800 Subject: call for papers for COLT '91 Message-ID: <9011190618.AA27456@saturn.ucsc.edu> CALL FOR PAPERS COLT '91 Fourth Workshop on Computational Learning Theory Santa Cruz, CA. August 5-7, 1991 The fourth workshop on Computational Learning Theory will be held at the Santa Cruz Campus of the University of California. Registration is open, within the limits of the space available (about 150 people). In previous years COLT has focused primarily on developments in the analysis of learning algorithms within certain computational learning models. This year we would like to widen the scope of the workshop by encouraging papers in all areas that relate directly to the theory of machine learning, including artificial and biological neural networks, robotics, pattern recognition, information theory, decision theory, Bayesian/MDL estimation, and cryptography. We look forward to a lively, interdisciplinary meeting. As part of our program, we are pleased to present two special invited talks. "Gambling, Inference and Data Compression" Prof. Tom Cover of Stanford University "The Role of Learning in Autonomous Robots" Prof. Rodney Brooks of MIT Authors should submit an extended abstract that consists of: (1) A cover page with title, authors' names, (postal and e-mail) addresses, and a 200 word summary. (2) A body not longer than 10 pages in twelve-point font. Be sure to include a clear definition of the theoretical model used, an overview of the results, and some discussion of their significance, including comparison to other work. Proofs or proof sketches should be included in the technical section. Experimental results are welcome, but are expected to be supported by theoretical analysis. Authors should send 11 copies of their abstract to L.G. Valiant, COLT '91, Aiken Computing Laboratory, Harvard University, Cambridge, MA 02138. The deadline for receiving submissions is February 15, 1991. This deadline is FIRM. Authors will be notified by April 8; final camera-ready papers will be due May 22. The proceedings will be published by Morgan-Kaufmann. Each individual author will keep the copyright to his/her abstract, allowing subsequent journal submission of the full paper. Chair: Manfred Warmuth (UC Santa Cruz). Local arrangements chair: David Helmbold (UC Santa Cruz). Program committee: Leslie Valiant (Harvard, chair), Dana Angluin (Yale), Andrew Barron (U. Illinois), Eric Baum (NEC, Princeton), Tom Dietterich (Oregon State U.), Mark Fulk (U. Rochester), Alon Itai (Technion, Israel), Michael Kearns (Int. Comp. Sci. Inst., Berkeley), Ron Rivest (MIT), Naftali Tishby (Bell Labs, Murray Hill), Manfred Warmuth (UCSC). Hosting Institution: Department of Computer and Information Science, UC Santa Cruz. Papers that have appeared in journals or other conferences, or that are being submitted to other conferences are not appropriate for submission to COLT. Unlike previous years, this includes papers submitted to the IEEE Symposium on Foundations of Computer Science (FOCS). We no longer have a dual submission policy with FOCS. Note: this call is being distributed to THEORY-NET, ML-LIST, CONNECTIONISTS, Alife, NEWS.ANNOUNCE.CONFERENCES, COMP.THEORY, COMP.AI, COMP.AI.EDU, COMP.AI.NEURAL-NETS, and COMP.ROBOTICS. Please help us by forwarding it to colleagues who may be interested and posting it on any other relevant electronic networks. ------- End of Forwarded Message From szand at cs.utexas.edu Mon Nov 19 12:32:26 1990 From: szand at cs.utexas.edu (Shahriar Zand-Biglari) Date: Mon, 19 Nov 90 11:32:26 CST Subject: On Necessity of Awareness of Regular Folks! Message-ID: <9011191732.AA19610@cs.utexas.edu> Apparently, the sexist/prejudice trend is not going to stop their exploitation of the connectionist network. Here is another that just came up: When the meaning of every joke Is perceived as a sexist poke The problem may lie In the reader's own eye That they can't laugh like regular folk Such statements, even if not consciously and intentionally brought up, only reflects the serious social unawareness of the "regular folks" in our scientific community. And yet another evidence on how education and degree does not provide the members of our community with a progressive culture, appropriate for our time. I very much appreciate the sharp eyes of those like Peggy, which could see the prejudice behind the sexist poetry. Regular folks, male (as myself) or female, better be aware of such sexist remarks and feel sorrow for their existence in our time that to laugh. Shar Zand-Biglari Department of Computer Sciences University of Texas at Austin szand at cs.utexas.edu From fozzard at boulder.Colorado.EDU Mon Nov 19 12:46:54 1990 From: fozzard at boulder.Colorado.EDU (Richard Fozzard) Date: Mon, 19 Nov 90 10:46:54 -0700 Subject: all this hoopla over limericks... Message-ID: <9011191746.AA25019@alumni.colorado.edu> Will all of you PLEASE put a "set Replyall" in your .mailrc? This will default "r" to reply only to the sender (and "R" will reply to sender and all recipients). [This may be different on some UNIXs - do a man mail to be sure.] This should get some of the personal discussions (aka "flames") off of our mailing list. thanks! ======================================================================== Richard Fozzard "Serendipity empowers" Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303 fozzard at boulder.colorado.edu (303)497-6011 or 444-3168 From ernst at russel.cns.caltech.edu Mon Nov 19 14:48:36 1990 From: ernst at russel.cns.caltech.edu (Ernst Miebur) Date: Mon, 19 Nov 90 11:48:36 PST Subject: Post NIPS Conference Workshop on Cortical Oscillations Message-ID: <9011191948.AA10858@russel.caltech.edu> NIPS Post Conference Workshop #1: Oscillations in Cortical Systems 40-60 Hz oscillations have long been reported in the rat and rabbit olfactory bulb and cortex on the basis of single-and multi-unit recordings as well as EEG activity. Periodicities in eye movement reaction times as well as oscillations in the auditory evoked potential in response to single click or a series of clicks all support a 30-50 Hz framework for aspects of cortical activity and possibly cortical information processing. Recently, highly synchronized, stimulus specific oscillations in the 35-85 Hz range were observed in areas 17, 18 and PMLS of anesthetized as well as awake cats. Neurons with similar orientation tuning up to 10 mm apart, even across the vertical meridian (i.e. in different hemispheres) can show phase-locked oscillations. Organization of the workshop will favor interaction between participants as much as possible. To set a framework, introductory talks will be presented. Speakers include J. Bower (Caltech): Experiments B. Ermentrout (U. Pittsburgh): Coupled Oscillators E. Niebur (Caltech): Models D. Schwenders (U. Munich): Psychophysics If you plan to present your work during a 5-10 minute talk, I would appreciate sending me a notice, although ``walk-ins'' are welcome. Topics that will be discussed include - possible functions of cortical oscillations, - crucial experiments to elucidate these functions, - mechanisms for long-range synchronization. Ernst Niebur Computation and Neural Systems Caltech 216-76 Pasadena, CA 91125 ernst at descartes.cns.caltech.edu (818)356-6885 From dave at cogsci.indiana.edu Tue Nov 20 00:16:24 1990 From: dave at cogsci.indiana.edu (David Chalmers) Date: Tue, 20 Nov 90 00:16:24 EST Subject: Bibliography available Message-ID: I originally sent this notice to a philosophy list, but somebody suggested that I send it to connectionists as a lot of the material is relevant, including about 60 papers on the philosophy of connectionism. -- For the last year or so, I've been working on a bibliography of recent work in the philosophy of mind, philosophy of cognitive science, and philosophy of AI. I keep intending to distribute it when it's complete, but of course it's never complete as I'm always coming across new things. So maybe I should just distribute it as it is. It consists of 861 entries, divided into 4 parts: 1. "First-person" issues (consciousness, qualia, etc) [218 entries] 2. "Third-person" issues (content, psych explanation, etc) [346 entries] 3. Philosophy of Artificial Intelligence [155 entries] 4. Miscellaneous topics [142 entries] About half of the entries are annotated with a 1-or-2-line summary, and occasionally criticism. The rest I either haven't read, or haven't got around to annotating yet. Of course none of the bibliographies are complete, but part 4 is particularly feeble, without any attempt at thoroughly covering the areas involved. The vast bulk of the bibliography consists of papers and books from the last 10-15 years, although a little earlier material is included where it is directly relevant to current concerns. I've enclosed a section-by-section summary below. To get a copy, write to me at dave at cogsci.indiana.edu. The files take up about 120K in total. Dave Chalmers (dave at cogsci.indiana.edu) Center for Research on Concepts and Cognition Indiana University. ----------------------------------------------------------------------------- Bibliography: Recent Work in the Philosophy of Mind and Cognition ================================================================= Compiled by David J. Chalmers, Center for Research on Concepts and Cognition, Indiana University, Bloomington, IN 47408. (c) 1990 David J. Chalmers. Summary ------- 1. "First-person" issues (consciousness, qualia, etc) [218] 1.1 Subjectivity (Nagel) [26] 1.2 The Knowledge Argument (Jackson) [13] 1.3 Functionalism and Qualia (including Absent Qualia, etc) [28] 1.4 Inverted Spectrum [12] 1.5 Qualia, General [18] 1.6 Are Programs Enough? (Searle) [32] 1.7 Machines and Conscious Mentality (other) [15] 1.8 Mind-Body Problem (Misc) [10] 1.9 Zombies and Other Minds [4] 1.10 Consciousness -- Eliminativist Perspectives [10] 1.11 Consciousness -- Functional Accounts [21] 1.12 Consciousness, General [17] 1.13 Subjective Mental Content [4] 1.14 Dualism [8] 2. "Third-person" issues (content, psych explanation, etc) [346] 2.1 The Reality of Propositional Psychology [50] 2.1a General [12] 2.1b Sententialism (esp. Fodor) [16] 2.1c Instrumentalism (Dennett) [17] 2.1d Syntactic Functionalism (Stich) [5] 2.2 Eliminativism, Psychology & Neuroscience (esp. Churchlands) [29] 2.2a Eliminative Materialism [16] 2.2b Psychology & Neuroscience [13] 2.3 Narrow/Wide Content [53] 2.3a Why Reference is not in the Head (Putnam) [9] 2.3b Implications for Psychology (Burge, Fodor) [20] 2.3c The Status of Narrow Content [12] 2.3d Miscellaneous [12] 2.4 Causal Theories of Content [37] 2.4a Information-Theoretic Accounts (Dretske) [11] 2.4b Causal Accounts, General [11] 2.4c Teleological Approaches (Millikan) [8] 2.4d Situation Semantics (Barwise/Perry) [7] 2.5 Theories of Content, Misc [13] 2.6 Representation (General) [13] 2.7 Supervenience, Reduction, Mental Causation [46] 2.7a Supervenience (Kim, etc) [13] 2.7b Anomalous Monism (Davidson) [14] 2.7c Token Identity (Davidson, etc) [4] 2.7d Mental Causation [8] 2.7e Mental/Physical, Misc [7] 2.8 Functionalism (General) [31] 2.9 Computationalism (General) [17] 2.10 Psychological Explanation, Misc [7] 2.11 Perception/Modularity/Plasticity (Fodor, Churchland) [12] 2.12 Nativism (Chomsky, etc) [22] 2.13 Misc Phil of Mind [16] 3. Philosophy of Artificial Intelligence [155] 3.1 Can Machines be Conscious? -- see 1.6, 1.7. 3.2 Computationalism as Psychological Explanation -- see 2.9. 3.3 The Turing Test [9] 3.4 Godelian Arguments (Lucas) [23] 3.5 Philosophy of Connectionism [40] 3.6 Foundations of Connectionism (more empirical) [10] 3.7 Connectionism & Structured Representation (Fodor/Pylyshyn) [10] 3.8 Foundations of AI (somewhat empirical) [14] 3.9 Computation and Semantics [12] 3.10 The Frame Problem [10] 3.11 Analog and Digital Processing [5] 3.12 Levels of Analysis (Marr, etc) [7] 3.13 Philosophy of AI, Misc [15] 4. Miscellaneous Topics [142] 4.1 Colour, General [15] 4.2 Colour Incompatibilities [6] 4.3 Split Brains [11] 4.4 Personal Identity (tiny selection) [7] 4.5 Pain and Pleasure [13] 4.6 Dreaming [9] 4.7 Phenomenal Qualities and the Sorites Paradox [7] 4.8 Mental Images (Pylyshyn, Kosslyn) [21] 4.9 Sensation and Perception, Misc [8] 4.10 Emotions, etc [6] 4.11 Free Will (tiny selection) [7] 4.12 Animal Cognition [7] 4.13 Brains in Vats (Putnam) [14] 4.14 Rationality [11] From AC1MPS at primea.sheffield.ac.uk Tue Nov 20 09:57:02 1990 From: AC1MPS at primea.sheffield.ac.uk (AC1MPS@primea.sheffield.ac.uk) Date: Tue, 20 Nov 90 09:57:02 Subject: thanks and  as requested Message-ID: Many thanks to everyone who replied to my last broadcast about super-Turing systems. To those of you who asked for hard-copies of my paper: they're in the post. A lot of you missed the original references to Pour-El's work. Here are the references I've got, although there may be later work as well. Myhill J. "A recursive function, defined on a compact interval and having a continuous derivative that is not recursive." Michegan Math. J. 18 (1971) pp 97-8. M. Boykan Pour-El "Abstract computability versus analog-generability." Springer Lecture Notes in Math 337 (1971) pp 345-60 (Cambridge summer school in math. logic). Marian Boykan Pour-El "Abstract computability and its relation to the general purpose analog computer." Trans AMS 199 (1974) pp 1-28. Marian Boykan Pour-El + Ian Richards "The wave equation with computable initial data such that its unique solution is not computable." Advances in math 39 (1981) pp 215-39. Thanks again. Mike Stannett. From pollack at cis.ohio-state.edu Tue Nov 20 13:00:39 1990 From: pollack at cis.ohio-state.edu (Jordan B Pollack) Date: Tue, 20 Nov 90 13:00:39 -0500 Subject: Postdoctoral Positions Message-ID: <9011201800.AA00608@dendrite.cis.ohio-state.edu> POSTDOCTORAL FELLOWS IN COGNITIVE SCIENCE The Ohio State University The Center for Cognitive Science at the Ohio State University has several openings for postdoctoral researchers. We will consider recent Ph.D.'s in all areas of Cognitive Science based on overall quality of research and commitment to interdisciplinary work. Applications are especially encouraged from candidates in areas in which we have active faculty interests.(See attached note) These two-year positions will begin July 1991 and carry a yearly stipend of $25,000 with $1000 for moving expenses and $1000 per year for research and travel. An office and computer access will be provided by the Center. The Center for Cognitive Science is an interdisciplinary University-wide research center with approximately eighty members from sixteen departments. OSU is one of the largest universities in the country with significant research resources including a Cray YMP and a PET scanner. Columbus provides a high quality of life along with very affordable housing. To apply, send your vita, reprints and a statement of research interests and a research plan, tell us the name of the faculty member(s) you wish to work with at OSU, and arrange for three recommendation letters to be sent to: Postdoctoral Search Commmittee Center for Cognitive Science 208 Ohio Stadium East 1961 Tuttle Park Place Columbus, OH 43210-1102 Materials must be postmarked by January 15, 1991. The Ohio State University is an Equal Opportunity/Affirmative Action employer. ------------------------------------- I've put a summary of the main interests of the center in neuroprose (file OSU.Cogsci). Personally, I am looking for someone with a solid background in non-linear dynamical systems to collaborate on the question of how fractals and chaos are exploited by cognition. Other opportunities exist in neuroscience, AI, vision, speech, language, music, motor control, philosophy of mind, and elsewhere, under sponsorship of faculty in those areas. Please contact me (and/or any other faculty you know at OSU) for further info. Jordan Pollack Assistant Professor CIS Dept/OSU Laboratory for AI Research 2036 Neil Ave Email: pollack at cis.ohio-state.edu Columbus, OH 43210 Fax/Phone: (614) 292-4890 From mike at park.bu.edu Tue Nov 20 18:11:35 1990 From: mike at park.bu.edu (mike@park.bu.edu) Date: Tue, 20 Nov 90 18:11:35 -0500 Subject: No subject Message-ID: <9011202311.AA01221@fenway.bu.edu> BOSTON UNIVERSITY A World Leader In Neural Network Research and Technology Presents Two Major Events on the Cutting Edge NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, MAY 5-10, 1991 A self-contained systematic course by leading neural architects. NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, MAY 10-12, 1991 An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. Special student registration rates are available. Sponsored by: Boston University's Wang Institute, Center for Adaptive Systems, and Graduate Program in Cognitive and Neural Systems, with partial support from the Air Force Office of Scientific Research. NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 5-10, 1991 This self-contained systematic five-day course is based on the graduate curriculum in the technology, computation, mathematics, and biology of neural networks developed at the Center for Adaptive Systems (CAS) and the graduate program in Cognitive and Neural Systems (CNS) of Boston University. The curriculum refines and updates the successful course held at the Wang Institute in May, 1990. The course will be taught by CAS/CNS faculty, as well as by distinguished guest lecturers at the beautiful and superbly equipped campus of the Wang Institute. An extraordinary range and depth of models, methods, and applications will be presented with ample opportunity for interaction with the lecturers and other participants at the daily discussion sections, meals, receptions, and breaks that are included with registration. At the 1990 Course, participants came from 20 countries and 35 states of the U.S. Boston University tutors are STEPHEN GROSSBERG, GAIL CARPENTER, ENNIO MINGOLLA, MICHAEL COHEN, DAN BULLOCK, AND JOHN MERRILL. Guest tutors are FEDERICO FAGGIN, ROBERT HECHT-NIELSEN, MICHAEL JORDAN, ANDY BARTO, AND ALEX WAIBEL. DAY 1 COURSE SCHEDULE (May 6, 1991) PROFESSOR GROSSBERG: Historical Overview, Cooperation and Competition, Content Addressable Memory, and Associative Learning. PROFESSORS CARPENTER, GROSSBERG, AND MINGOLLA: Associative Learning Continued, Neocognitron, Perceptrons, and Introduction to Back Propagation. PROFESSOR JORDAN: Recent Developments of Back Propagation. Evening Discussions with Tutors and Informal Presentations. DAY 2 COURSE SCHEDULE (May 7, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Adaptive Pattern Recognition. PROFESSORS CARPENTER AND GROSSBERG: Introduction to Adaptive Resonance, Theory and Analysis of ART 1. PROFESSOR CARPENTER: Analysis of ART 2, ART 3, Predictive ART, and Self-Organization of Invariant Pattern Recognition codes. Evening Discussions with Tutors and Informal Presentations. DAY 3 COURSE SCHEDULE (May 8, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Vision and Image Processing. PROFESSORS BULLOCK AND GROSSBERG: Adaptive Sensory-Motor Planning and Control. Evening Discussions with Tutors and Informal Presentations. DAY 4 COURSE SCHEDULE (May 9, 1991) PROFESSORS COHEN, GROSSBERG, AND WAIBEL: Speech Perception and Production. PROFESSORS BARTO, GROSSBERG, AND MERRILL: Reinforcement Learning and Prediction. DR. HECHT-NIELSEN: Recent Developments in the Neurocomputer Industry. Evening Discussions with Tutors and Informal Presentations. DAY 5 COURSE SCHEDULE (May 10, 1991) DR. FAGGIN: VLSI Implementation of Neural Networks. END OF COURSE (at 1:30 PM). RESEARCH CONFERENCE NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING MAY 10-12, 1991 This international research conference on a topic at the cutting edge of science and technology will bring together leading experts in academe, government, and industry to present their results on vision and image processing in INVITED LECTURES and CONTRIBUTED POSTERS. Topics range from visual neurobiology and psychophysics through computational modelling to technological applications. CALL FOR PAPERS - VIP POSTER SESSION: A featured 3-hour poster session on neural network research related to vision and image processing will be held on May 11, 1991. Attendees who wish to present a poster should submit three copies of an abstract (one single-spaced page), postmarked by March 1, 1991, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1991. DAY 1 CONFERENCE PROGRAM (May 10, 1991, 5:00-7:30 PM) PROFESSOR JOHN DAUGMAN, CAMBRIDGE UNIVERSITY: "High-Confidence Personal Identification System Built from Quadrature Neural Filter" PROFESSOR DAVID CASASENT, CARNEGIE MELLON UNIVERSITY: "CMU Hybrid Optical/ Digital Neural Net for Scene Analysis" DR. ROBERT HECHT-NIELSEN, HNC,: "Neurocomputers for Image Analysis" DAY 2 CONFERENCE PROGRAM (May 11, 1991) PROFESSOR V.S. RAMACHANDRAN, UNIVERSITY OF CALIFORNIA, SAN DIEGO: "Interactions Between `Channels' Concerned with the Perception of Motion, Depth, Color, and Form" PROFESSOR STEPHEN GROSSBERG, BOSTON UNIVERSITY: "A Neural Network Architecture for 3-D Vision and Figure-Ground Separation" PROFESSOR ENNIO MINGOLLA, BOSTON UNIVERSITY: "A Neural Network Architecture for Visual Motion Segmentation" PROFESSOR GEORGE SPERLING, NEW YORK UNIVERSITY: "Two Systems of Visual Processing" DR. ROBERT DESIMONE, NATIONAL INSTITUTE OF MENTAL HEALTH: "Attentional Control of Visual Perception: Cortical and Subcortical Mechanisms" PROFESSOR GAIL CARPENTER, BOSTON UNIVERSITY: "Neural Network Architectures for Attentive Learning, Recognition, and Prediction" DR. RALPH LINSKER, IBM T.J. WATSON RESEARCH CENTER: "New Approaches to Network Learning and Optimization" PROFESSOR STUART ANSTIS, UNIVERSITY OF TORONTO: "My Recent Research on Motion Perception" POSTER SESSION DAY 3 CONFERENCE PROGRAM (May 12, 1991) PROFESSOR JACOB BECK, UNIVERSITY OF OREGON: "Preattentive Visual Processing" PROFESSOR JAMES TODD, BRANDEIS UNIVERSITY: "Neural Analysis of Motion" DR. ALLEN M. WAXMAN, MIT LINCOLN LAB: "Extraction" PROFESSOR ERIC SCHWARTZ, NEW YORK UNIVERSITY: "Biologically Motivated Machine Vision" PROFESSOR ALEX PENTLAND, MASSACHUSETTS INSTITUTE OF TECHNOLOGY: "The Optimal Observer: Design of a Dynamically-Responding Visual System" DISCUSSION END OF RESEARCH CONFERENCE (at 1 PM) CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. STUDENT REGISTRATION: A limited number of spaces at the course and conference have been reserved at a subsidized rate for full time students. These spaces will be assigned on a first-come, first-served basis. Completed registration form and payment for students who wish to be considered for the reduced student rate must be received by April 15, 1991. YOUR REGISTRATION FEE INCLUDES: COURSE CONFERENCE Five days of tutorials Admission to all invited lectures Course notebooks for all tutorials Admission to poster session All guest lectures One reception Sunday evening reception Two continental breakfasts Five continental breakfasts One lunch Five lunches One dinner Four dinners Daily morning/afternoon Daily morning/afternoon coffee coffee service service Evening discussion sessions with leading neural architects CANCELLATION POLICY: Course fee, less $100, and the research conference fee, less $60, will be refunded upon receipt of a written request postmarked before March 31, 1991. After this date no refund will be made. Registrants who do not attend and who do not cancel in writing before March 31, 1991 are liable for the full amount of the registration fee. You must obtain a cancellation number from our registrar in order to make the cancellation valid. HOW TO REGISTER: ADVANCE REGISTRATION: To register by telephone, call (508) 649-9731 with VISA or Mastercard between 8:00-5:00 PM (EST). To register by fax, complete and fax back the Registration Form to (508) 649-6926. To register by mail, complete the registration form and mail it with your full form of payment as directed. Make check payable in U.S. dollars to Boston University. ON-SITE REGISTRATION: Those who wish to register for the course and the research conference on-site may do so on a space-available basis. SITE: The Wang Institute of Boston University possesses excellent conference facilities in a beautiful 220-acre setting. It is easily reached from Boston's Logan Airport and Route 128. HOTEL RESERVATIONS: Sheraton Tara, Nashua, NH (603) 888-9970; Red Roof Inn, Nashua, NH (603) 888-1893; or Stonehedge Inn, Tyngsboro, MA, (508) 649-4342. The special conference rate applies only if you mention the name and dates of the meeting when making the reservation. The hotels in Nashua are located approximately five miles from the Wang Institute. Shuttle bus service will be provided. REGISTRATION FORM: COURSE - NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, May 5-10, 1991 RESEARCH CONFERENCE - NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, May 10-12, 1991 Name: ______________________________________________________________ Title: _____________________________________________________________ Organization: ______________________________________________________ Address: ___________________________________________________________ City: ____________________________ State: __________ Zip: __________ Telephone: _________________________________________________________ Course: Research Conference: ( ) regular attendee $985 ( ) regular attendee $95 ( ) full-time student $275* ( ) Full-time student $75* *limited number of spaces. Student registrations must be received by April 15, 1991. Total payment enclosed: ____________________________________________ Form of payment: ( ) Check or money order (payable in U.S. dollars to Boston University). ( ) VISA ( ) Mastercard #_______________________________________Exp. Date:__________________ Signature (as it appears on card): _________________________________ Return to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 Boston University's policies provide for equal opportunity and affirmative action in employment and admission to all programs of the University. From WANG at nbivax.nbi.dk Wed Nov 21 06:56:00 1990 From: WANG at nbivax.nbi.dk (WANG@nbivax.nbi.dk) Date: Wed, 21 Nov 90 12:56 +0100 (NBI, Copenhagen) Subject: Copenhagen Optimization Conference Message-ID: ------------------------------------------------------------ INTERNATIONAL CONFERENCE ON NOVEL METHODS IN OPTIMIZATION February 7 - 8, 1991 arranged by NORDITA Nordic Institute of Theoretical Physics Copenhagen and DIKU Department of Computer Science University of Copenhagen supported by funding from Nordic Initiative for Neural Computation (NINC) ------------------------------------------------------------ In recent years there has been an increasing interest in using neural networks, simulated annealing, and genetics as modelling frames of reference to construct novel search heuristics for solving hard optimization problems. Algorithms constructed in this way, together with tabu search, constitute promising new approaches to optimization and are the subjects of this conference. The aim of the conference is to bring together researchers in classical optimization and researchers working with the novel methods, thus enabling a fruitful exchange of information and results. An important part of the conference will be a tutorial presentation of both classical and new methods to establish a common base for discussion among the participants. Tutorial session. ----------------- The first day of the conference will be devoted to introductory lectures given by invited speakers. The lectures will be on: * Classical Optimization. a) Laurence Wolsey, Center for Operations Research and Econometrics, Universite de Louvain, Belgium: Introduction to Classical Optimization: P-problems and their solution. b) Susan Powell, London School of Economics: Introduction to Classical Optimization: NP-problems and their solution. * Neural Networks. Carsten Peterson, Lund University, Sweden: The use of neural networks and optimization. * Simulated Annealing. (Speaker to be announced later) * Genetic Algorithms. (Speaker to be announced later) * Tabu Search. (Speaker to be announced later) * Statistical Mechanics. Marc Mezard, Ecole Normale Superieure, Paris: "Formal statistical mechanical methods in optimization problems." About the speakers: Laurence A. Wolsey is Professor of Applied Mathematics at CORE and is one of the leading researchers in the field of computational mathematical programming. He received the Beale-Orchard-Hays prize for his work in 1988, and is one of the authors of the widely used book "Integer and Combinatorial Optimization". Susan Powell is Lecturer in Operations Research at London School of Economics and is well known for her work on Fortran Codes for linear and integer programs. She has a solid background in prac- tical problem solving through her contacts with industry and British OR companies. Carsten Peterson is Lecturer in Theoretical Physics at Lund University. He is co-inventor of the deterministic Boltzmann learning algorithm for symmetric recurrent networks and a leader in applications of neural networks to optimization problems. Marc Mezard is Lecturer in Physics at the Ecole Nomale Superieure, Paris. Together with his colleagues there and their coworkers at the University of Rome, he pioneered the application of methods from the statistical mechanics of random systems to optimizaation problems. Contributed Papers. ------------------- The second day of the conference will be devoted to selected half-hour contributed presentations. An abstract of each paper submitted for presentation should be mailed or e-mailed to: Prof. Jens Clausen DIKU, Dept. of Computer Science Universitetsparken 1, DK-2100 Copenhagen OE Denmark. e-mail: clausen at diku.dk before January 1, 1991. Authors of accepted papers will be notified before January 15, 1991. (No proceedings will be published). Poster Sessions. ---------------- On both seminar days there will be poster sessions. An abstract of the poster should be mailed or e-mailed to Prof. Jens Clausen DIKU, Dept. of Computer Science Universitetsparken 1, DK-2100 Copenhagen OE Denmark. e-mail: clausen at diku.dk before january 1, 1991. Authors of accepted posters will be notified before January 15, 1991. Registration. ------------- The registration fee is 500 DKK (or equivalent in other convertible currency) and covers coffee/tea and lunch both days as well as an informal conference dinner on the evening of February 7. To register please fill in the form below and mail it together with the registrations fee to the address given on the form. No credit cards accepted. Cheques or Eurocheques should be payable to OPTIMIZATION CONFERENCE. The organizing commitee must receive your registration form January 15, 1991 the latest, and the final program will be mailed by January 22, 1991. Travel support for Nordic participants. --------------------------------------- A limited amount of money from NINC is reserved for paying the travel costs of participants from the Nordic countries, especially younger researchers. If you would like to apply for this support, please indicate on the registration form. Accommodation. -------------- The organizing commitee has reserved a certain number of hotel rooms. Please indicate on the registration form if you would like the conference to book one for you. ------------------------------------------------------------- INTERNATIONAL CONFERENCE ON NOVEL METHODS IN OPTIMIZATION February 7 - 8, 1991 ------------------------------------------------------------- REGISTRATION FORM ------------------------------------------------------------- Name:_______________________________________________ Affiliation:_______________________________________________ Address:_______________________________________________ _______________________________________________ _______________________________________________ Telephone no.:_______________________________________________ e-mail:_______________________________________________ If you want the conference to reserve you a hotel room, please indicate here for which nights: ______________________________________________________________ Nordic participants: If you want to be considered for travel support, please indicate your needs here: ______________________________________________________________ Mail this registration form to: John Hertz NORDITA Blegdamsvej 17 DK-2100 Copenhagen OE, Denmark For further information: e-mail: hertz at nordita.dk FAX: [+45] 31 38 91 57 From dyer at CS.UCLA.EDU Wed Nov 21 12:35:03 1990 From: dyer at CS.UCLA.EDU (Dr Michael G Dyer) Date: Wed, 21 Nov 90 09:35:03 PST Subject: tech rep available on evolving neural networks Message-ID: <901121.173503z.22459.dyer@lanai.cs.ucla.edu> Evolution of Communication in Artificial Organisms* Gregory M. Werner Michael G. Dyer Tech. Rep. UCLA-AI-90-06 Abstract: A population of artificial organisms evolved simple communication protocols for mate finding. Female animals in our artificial environment had the ability to see males and to emit sounds. Male animals were blind, but could hear signals from females. Thus, the environment was designed to favor organisms that evolved to generate and interpret meaningful signals. Starting with random neural networks, the simulation resulted in a progression of generations that exhibit increasingly effective mate finding strategies. In addition, a number of distinct subspecies, i.e. groups with different signaling protocols or "dialects", evolve and compete. These protocols become a behavioral barrier to mating that supports the formation of distinct subspecies. Experiments with physical barriers in the environment were also performed. A partially permeable barrier allows a separate subspecies to evolve and survive for indefinite periods of time, in spite of occasional migration and contact from members of other subspecies. * To appear in: J. D. Farmer, C. Langton, S. Rasmussen & C. Taylor (Eds.), Artificial Life II, Addison-Wesley, in press. For a copy of the above paper, please send a request for Tech. Rep. UCLA-AI-90-06 to: valerie at cs.ucla.edu From mike at park.bu.edu Tue Nov 20 18:11:35 1990 From: mike at park.bu.edu (mike@park.bu.edu) Date: Tue, 20 Nov 90 18:11:35 -0500 Subject: No subject Message-ID: <9011202311.AA01221@fenway.bu.edu> BOSTON UNIVERSITY A World Leader In Neural Network Research and Technology Presents Two Major Events on the Cutting Edge NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, MAY 5-10, 1991 A self-contained systematic course by leading neural architects. NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, MAY 10-12, 1991 An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. Special student registration rates are available. Sponsored by: Boston University's Wang Institute, Center for Adaptive Systems, and Graduate Program in Cognitive and Neural Systems, with partial support from the Air Force Office of Scientific Research. NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 5-10, 1991 This self-contained systematic five-day course is based on the graduate curriculum in the technology, computation, mathematics, and biology of neural networks developed at the Center for Adaptive Systems (CAS) and the graduate program in Cognitive and Neural Systems (CNS) of Boston University. The curriculum refines and updates the successful course held at the Wang Institute in May, 1990. The course will be taught by CAS/CNS faculty, as well as by distinguished guest lecturers at the beautiful and superbly equipped campus of the Wang Institute. An extraordinary range and depth of models, methods, and applications will be presented with ample opportunity for interaction with the lecturers and other participants at the daily discussion sections, meals, receptions, and breaks that are included with registration. At the 1990 Course, participants came from 20 countries and 35 states of the U.S. Boston University tutors are STEPHEN GROSSBERG, GAIL CARPENTER, ENNIO MINGOLLA, MICHAEL COHEN, DAN BULLOCK, AND JOHN MERRILL. Guest tutors are FEDERICO FAGGIN, ROBERT HECHT-NIELSEN, MICHAEL JORDAN, ANDY BARTO, AND ALEX WAIBEL. DAY 1 COURSE SCHEDULE (May 6, 1991) PROFESSOR GROSSBERG: Historical Overview, Cooperation and Competition, Content Addressable Memory, and Associative Learning. PROFESSORS CARPENTER, GROSSBERG, AND MINGOLLA: Associative Learning Continued, Neocognitron, Perceptrons, and Introduction to Back Propagation. PROFESSOR JORDAN: Recent Developments of Back Propagation. Evening Discussions with Tutors and Informal Presentations. DAY 2 COURSE SCHEDULE (May 7, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Adaptive Pattern Recognition. PROFESSORS CARPENTER AND GROSSBERG: Introduction to Adaptive Resonance, Theory and Analysis of ART 1. PROFESSOR CARPENTER: Analysis of ART 2, ART 3, Predictive ART, and Self-Organization of Invariant Pattern Recognition codes. Evening Discussions with Tutors and Informal Presentations. DAY 3 COURSE SCHEDULE (May 8, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Vision and Image Processing. PROFESSORS BULLOCK AND GROSSBERG: Adaptive Sensory-Motor Planning and Control. Evening Discussions with Tutors and Informal Presentations. DAY 4 COURSE SCHEDULE (May 9, 1991) PROFESSORS COHEN, GROSSBERG, AND WAIBEL: Speech Perception and Production. PROFESSORS BARTO, GROSSBERG, AND MERRILL: Reinforcement Learning and Prediction. DR. HECHT-NIELSEN: Recent Developments in the Neurocomputer Industry. Evening Discussions with Tutors and Informal Presentations. DAY 5 COURSE SCHEDULE (May 10, 1991) DR. FAGGIN: VLSI Implementation of Neural Networks. END OF COURSE (at 1:30 PM). RESEARCH CONFERENCE NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING MAY 10-12, 1991 This international research conference on a topic at the cutting edge of science and technology will bring together leading experts in academe, government, and industry to present their results on vision and image processing in INVITED LECTURES and CONTRIBUTED POSTERS. Topics range from visual neurobiology and psychophysics through computational modelling to technological applications. CALL FOR PAPERS - VIP POSTER SESSION: A featured 3-hour poster session on neural network research related to vision and image processing will be held on May 11, 1991. Attendees who wish to present a poster should submit three copies of an abstract (one single-spaced page), postmarked by March 1, 1991, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1991. DAY 1 CONFERENCE PROGRAM (May 10, 1991, 5:00-7:30 PM) PROFESSOR JOHN DAUGMAN, CAMBRIDGE UNIVERSITY: "High-Confidence Personal Identification System Built from Quadrature Neural Filter" PROFESSOR DAVID CASASENT, CARNEGIE MELLON UNIVERSITY: "CMU Hybrid Optical/ Digital Neural Net for Scene Analysis" DR. ROBERT HECHT-NIELSEN, HNC,: "Neurocomputers for Image Analysis" DAY 2 CONFERENCE PROGRAM (May 11, 1991) PROFESSOR V.S. RAMACHANDRAN, UNIVERSITY OF CALIFORNIA, SAN DIEGO: "Interactions Between `Channels' Concerned with the Perception of Motion, Depth, Color, and Form" PROFESSOR STEPHEN GROSSBERG, BOSTON UNIVERSITY: "A Neural Network Architecture for 3-D Vision and Figure-Ground Separation" PROFESSOR ENNIO MINGOLLA, BOSTON UNIVERSITY: "A Neural Network Architecture for Visual Motion Segmentation" PROFESSOR GEORGE SPERLING, NEW YORK UNIVERSITY: "Two Systems of Visual Processing" DR. ROBERT DESIMONE, NATIONAL INSTITUTE OF MENTAL HEALTH: "Attentional Control of Visual Perception: Cortical and Subcortical Mechanisms" PROFESSOR GAIL CARPENTER, BOSTON UNIVERSITY: "Neural Network Architectures for Attentive Learning, Recognition, and Prediction" DR. RALPH LINSKER, IBM T.J. WATSON RESEARCH CENTER: "New Approaches to Network Learning and Optimization" PROFESSOR STUART ANSTIS, UNIVERSITY OF TORONTO: "My Recent Research on Motion Perception" POSTER SESSION DAY 3 CONFERENCE PROGRAM (May 12, 1991) PROFESSOR JACOB BECK, UNIVERSITY OF OREGON: "Preattentive Visual Processing" PROFESSOR JAMES TODD, BRANDEIS UNIVERSITY: "Neural Analysis of Motion" DR. ALLEN M. WAXMAN, MIT LINCOLN LAB: "Extraction" PROFESSOR ERIC SCHWARTZ, NEW YORK UNIVERSITY: "Biologically Motivated Machine Vision" PROFESSOR ALEX PENTLAND, MASSACHUSETTS INSTITUTE OF TECHNOLOGY: "The Optimal Observer: Design of a Dynamically-Responding Visual System" DISCUSSION END OF RESEARCH CONFERENCE (at 1 PM) CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. STUDENT REGISTRATION: A limited number of spaces at the course and conference have been reserved at a subsidized rate for full time students. These spaces will be assigned on a first-come, first-served basis. Completed registration form and payment for students who wish to be considered for the reduced student rate must be received by April 15, 1991. YOUR REGISTRATION FEE INCLUDES: COURSE CONFERENCE Five days of tutorials Admission to all invited lectures Course notebooks for all tutorials Admission to poster session All guest lectures One reception Sunday evening reception Two continental breakfasts Five continental breakfasts One lunch Five lunches One dinner Four dinners Daily morning/afternoon Daily morning/afternoon coffee coffee service service Evening discussion sessions with leading neural architects CANCELLATION POLICY: Course fee, less $100, and the research conference fee, less $60, will be refunded upon receipt of a written request postmarked before March 31, 1991. After this date no refund will be made. Registrants who do not attend and who do not cancel in writing before March 31, 1991 are liable for the full amount of the registration fee. You must obtain a cancellation number from our registrar in order to make the cancellation valid. HOW TO REGISTER: ADVANCE REGISTRATION: To register by telephone, call (508) 649-9731 with VISA or Mastercard between 8:00-5:00 PM (EST). To register by fax, complete and fax back the Registration Form to (508) 649-6926. To register by mail, complete the registration form and mail it with your full form of payment as directed. Make check payable in U.S. dollars to Boston University. ON-SITE REGISTRATION: Those who wish to register for the course and the research conference on-site may do so on a space-available basis. SITE: The Wang Institute of Boston University possesses excellent conference facilities in a beautiful 220-acre setting. It is easily reached from Boston's Logan Airport and Route 128. HOTEL RESERVATIONS: Sheraton Tara, Nashua, NH (603) 888-9970; Red Roof Inn, Nashua, NH (603) 888-1893; or Stonehedge Inn, Tyngsboro, MA, (508) 649-4342. The special conference rate applies only if you mention the name and dates of the meeting when making the reservation. The hotels in Nashua are located approximately five miles from the Wang Institute. Shuttle bus service will be provided. REGISTRATION FORM: COURSE - NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, May 5-10, 1991 RESEARCH CONFERENCE - NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, May 10-12, 1991 Name: ______________________________________________________________ Title: _____________________________________________________________ Organization: ______________________________________________________ Address: ___________________________________________________________ City: ____________________________ State: __________ Zip: __________ Telephone: _________________________________________________________ Course: Research Conference: ( ) regular attendee $985 ( ) regular attendee $95 ( ) full-time student $275* ( ) Full-time student $75* *limited number of spaces. Student registrations must be received by April 15, 1991. Total payment enclosed: ____________________________________________ Form of payment: ( ) Check or money order (payable in U.S. dollars to Boston University). ( ) VISA ( ) Mastercard #_______________________________________Exp. Date:__________________ Signature (as it appears on card): _________________________________ Return to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 Boston University's policies provide for equal opportunity and affirmative action in employment and admission to all programs of the University. From mike at park.bu.edu Tue Nov 20 18:11:35 1990 From: mike at park.bu.edu (mike@park.bu.edu) Date: Tue, 20 Nov 90 18:11:35 -0500 Subject: No subject Message-ID: <9011202311.AA01221@fenway.bu.edu> BOSTON UNIVERSITY A World Leader In Neural Network Research and Technology Presents Two Major Events on the Cutting Edge NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, MAY 5-10, 1991 A self-contained systematic course by leading neural architects. NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, MAY 10-12, 1991 An international research conference presenting INVITED and CONTRIBUTED papers, herewith solicited, on one of the most active research topics in science and technology today. Special student registration rates are available. Sponsored by: Boston University's Wang Institute, Center for Adaptive Systems, and Graduate Program in Cognitive and Neural Systems, with partial support from the Air Force Office of Scientific Research. NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS MAY 5-10, 1991 This self-contained systematic five-day course is based on the graduate curriculum in the technology, computation, mathematics, and biology of neural networks developed at the Center for Adaptive Systems (CAS) and the graduate program in Cognitive and Neural Systems (CNS) of Boston University. The curriculum refines and updates the successful course held at the Wang Institute in May, 1990. The course will be taught by CAS/CNS faculty, as well as by distinguished guest lecturers at the beautiful and superbly equipped campus of the Wang Institute. An extraordinary range and depth of models, methods, and applications will be presented with ample opportunity for interaction with the lecturers and other participants at the daily discussion sections, meals, receptions, and breaks that are included with registration. At the 1990 Course, participants came from 20 countries and 35 states of the U.S. Boston University tutors are STEPHEN GROSSBERG, GAIL CARPENTER, ENNIO MINGOLLA, MICHAEL COHEN, DAN BULLOCK, AND JOHN MERRILL. Guest tutors are FEDERICO FAGGIN, ROBERT HECHT-NIELSEN, MICHAEL JORDAN, ANDY BARTO, AND ALEX WAIBEL. DAY 1 COURSE SCHEDULE (May 6, 1991) PROFESSOR GROSSBERG: Historical Overview, Cooperation and Competition, Content Addressable Memory, and Associative Learning. PROFESSORS CARPENTER, GROSSBERG, AND MINGOLLA: Associative Learning Continued, Neocognitron, Perceptrons, and Introduction to Back Propagation. PROFESSOR JORDAN: Recent Developments of Back Propagation. Evening Discussions with Tutors and Informal Presentations. DAY 2 COURSE SCHEDULE (May 7, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Adaptive Pattern Recognition. PROFESSORS CARPENTER AND GROSSBERG: Introduction to Adaptive Resonance, Theory and Analysis of ART 1. PROFESSOR CARPENTER: Analysis of ART 2, ART 3, Predictive ART, and Self-Organization of Invariant Pattern Recognition codes. Evening Discussions with Tutors and Informal Presentations. DAY 3 COURSE SCHEDULE (May 8, 1991) PROFESSORS GROSSBERG AND MINGOLLA: Vision and Image Processing. PROFESSORS BULLOCK AND GROSSBERG: Adaptive Sensory-Motor Planning and Control. Evening Discussions with Tutors and Informal Presentations. DAY 4 COURSE SCHEDULE (May 9, 1991) PROFESSORS COHEN, GROSSBERG, AND WAIBEL: Speech Perception and Production. PROFESSORS BARTO, GROSSBERG, AND MERRILL: Reinforcement Learning and Prediction. DR. HECHT-NIELSEN: Recent Developments in the Neurocomputer Industry. Evening Discussions with Tutors and Informal Presentations. DAY 5 COURSE SCHEDULE (May 10, 1991) DR. FAGGIN: VLSI Implementation of Neural Networks. END OF COURSE (at 1:30 PM). RESEARCH CONFERENCE NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING MAY 10-12, 1991 This international research conference on a topic at the cutting edge of science and technology will bring together leading experts in academe, government, and industry to present their results on vision and image processing in INVITED LECTURES and CONTRIBUTED POSTERS. Topics range from visual neurobiology and psychophysics through computational modelling to technological applications. CALL FOR PAPERS - VIP POSTER SESSION: A featured 3-hour poster session on neural network research related to vision and image processing will be held on May 11, 1991. Attendees who wish to present a poster should submit three copies of an abstract (one single-spaced page), postmarked by March 1, 1991, for refereeing. Include with the abstract the name, address, and telephone number of the corresponding author. Mail to: Poster Session, Neural Networks Conference, Wang Institute of Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be informed of abstract acceptance by March 31, 1991. DAY 1 CONFERENCE PROGRAM (May 10, 1991, 5:00-7:30 PM) PROFESSOR JOHN DAUGMAN, CAMBRIDGE UNIVERSITY: "High-Confidence Personal Identification System Built from Quadrature Neural Filter" PROFESSOR DAVID CASASENT, CARNEGIE MELLON UNIVERSITY: "CMU Hybrid Optical/ Digital Neural Net for Scene Analysis" DR. ROBERT HECHT-NIELSEN, HNC,: "Neurocomputers for Image Analysis" DAY 2 CONFERENCE PROGRAM (May 11, 1991) PROFESSOR V.S. RAMACHANDRAN, UNIVERSITY OF CALIFORNIA, SAN DIEGO: "Interactions Between `Channels' Concerned with the Perception of Motion, Depth, Color, and Form" PROFESSOR STEPHEN GROSSBERG, BOSTON UNIVERSITY: "A Neural Network Architecture for 3-D Vision and Figure-Ground Separation" PROFESSOR ENNIO MINGOLLA, BOSTON UNIVERSITY: "A Neural Network Architecture for Visual Motion Segmentation" PROFESSOR GEORGE SPERLING, NEW YORK UNIVERSITY: "Two Systems of Visual Processing" DR. ROBERT DESIMONE, NATIONAL INSTITUTE OF MENTAL HEALTH: "Attentional Control of Visual Perception: Cortical and Subcortical Mechanisms" PROFESSOR GAIL CARPENTER, BOSTON UNIVERSITY: "Neural Network Architectures for Attentive Learning, Recognition, and Prediction" DR. RALPH LINSKER, IBM T.J. WATSON RESEARCH CENTER: "New Approaches to Network Learning and Optimization" PROFESSOR STUART ANSTIS, UNIVERSITY OF TORONTO: "My Recent Research on Motion Perception" POSTER SESSION DAY 3 CONFERENCE PROGRAM (May 12, 1991) PROFESSOR JACOB BECK, UNIVERSITY OF OREGON: "Preattentive Visual Processing" PROFESSOR JAMES TODD, BRANDEIS UNIVERSITY: "Neural Analysis of Motion" DR. ALLEN M. WAXMAN, MIT LINCOLN LAB: "Extraction" PROFESSOR ERIC SCHWARTZ, NEW YORK UNIVERSITY: "Biologically Motivated Machine Vision" PROFESSOR ALEX PENTLAND, MASSACHUSETTS INSTITUTE OF TECHNOLOGY: "The Optimal Observer: Design of a Dynamically-Responding Visual System" DISCUSSION END OF RESEARCH CONFERENCE (at 1 PM) CNS FELLOWSHIP FUND: Net revenues from the course will endow fellowships for Ph.D. candidates in the CNS Graduate Program. Corporate and individual gifts to endow CNS Fellowships are also welcome. Please write: Cognitive and Neural Systems Fellowship Fund, Center for Adaptive Systems, Boston University, 111 Cummington Street, Boston, MA 02215. STUDENT REGISTRATION: A limited number of spaces at the course and conference have been reserved at a subsidized rate for full time students. These spaces will be assigned on a first-come, first-served basis. Completed registration form and payment for students who wish to be considered for the reduced student rate must be received by April 15, 1991. YOUR REGISTRATION FEE INCLUDES: COURSE CONFERENCE Five days of tutorials Admission to all invited lectures Course notebooks for all tutorials Admission to poster session All guest lectures One reception Sunday evening reception Two continental breakfasts Five continental breakfasts One lunch Five lunches One dinner Four dinners Daily morning/afternoon Daily morning/afternoon coffee coffee service service Evening discussion sessions with leading neural architects CANCELLATION POLICY: Course fee, less $100, and the research conference fee, less $60, will be refunded upon receipt of a written request postmarked before March 31, 1991. After this date no refund will be made. Registrants who do not attend and who do not cancel in writing before March 31, 1991 are liable for the full amount of the registration fee. You must obtain a cancellation number from our registrar in order to make the cancellation valid. HOW TO REGISTER: ADVANCE REGISTRATION: To register by telephone, call (508) 649-9731 with VISA or Mastercard between 8:00-5:00 PM (EST). To register by fax, complete and fax back the Registration Form to (508) 649-6926. To register by mail, complete the registration form and mail it with your full form of payment as directed. Make check payable in U.S. dollars to Boston University. ON-SITE REGISTRATION: Those who wish to register for the course and the research conference on-site may do so on a space-available basis. SITE: The Wang Institute of Boston University possesses excellent conference facilities in a beautiful 220-acre setting. It is easily reached from Boston's Logan Airport and Route 128. HOTEL RESERVATIONS: Sheraton Tara, Nashua, NH (603) 888-9970; Red Roof Inn, Nashua, NH (603) 888-1893; or Stonehedge Inn, Tyngsboro, MA, (508) 649-4342. The special conference rate applies only if you mention the name and dates of the meeting when making the reservation. The hotels in Nashua are located approximately five miles from the Wang Institute. Shuttle bus service will be provided. REGISTRATION FORM: COURSE - NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS, May 5-10, 1991 RESEARCH CONFERENCE - NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING, May 10-12, 1991 Name: ______________________________________________________________ Title: _____________________________________________________________ Organization: ______________________________________________________ Address: ___________________________________________________________ City: ____________________________ State: __________ Zip: __________ Telephone: _________________________________________________________ Course: Research Conference: ( ) regular attendee $985 ( ) regular attendee $95 ( ) full-time student $275* ( ) Full-time student $75* *limited number of spaces. Student registrations must be received by April 15, 1991. Total payment enclosed: ____________________________________________ Form of payment: ( ) Check or money order (payable in U.S. dollars to Boston University). ( ) VISA ( ) Mastercard #_______________________________________Exp. Date:__________________ Signature (as it appears on card): _________________________________ Return to: Neural Networks Wang Institute of Boston University 72 Tyng Road Tyngsboro, MA 01879 Boston University's policies provide for equal opportunity and affirmative action in employment and admission to all programs of the University. From birnbaum at fido.ils.nwu.edu Wed Nov 21 16:31:40 1990 From: birnbaum at fido.ils.nwu.edu (Lawrence Birnbaum) Date: Wed, 21 Nov 90 15:31:40 CST Subject: ML91 Call for papers: Eighth International Machine Learning Workshop Message-ID: <9011212131.AA00580@fido.ils.nwu.edu> ML91 The Eighth International Workshop on Machine Learning Call for Papers The organizing committee is please to announce that ML91 will include the following workshop topics: Automated Knowledge Acquisition Computational Models of Human Learning Learning Relations Machine Learning in Engineering Automation Learning to React/in Complex Environments Constructive Induction Learning in Intelligent Information Retrieval Learning from Theory and Data Papers must be submitted to one of these workshops for consideration. The provisional deadline for submission is February 1, 1991. Papers to appear in the Proceedings must fit in 4 pages, double column format. More details about the constituent workshops, including submission procedures, contact points, and reviewing committees, will be forthcoming shortly. ML91 will be held at Northwestern University, Evanston, Illinois, USA (just north of Chicago), June 27-29, 1991. On behalf of the organizing committee, Larry Birnbaum and Gregg Collins From len at mqcomp.mqcs.mq.oz.au Thu Nov 22 10:35:18 1990 From: len at mqcomp.mqcs.mq.oz.au (Len Hamey) Date: Thu, 22 Nov 90 10:35:18 EST Subject: Back-propogation Message-ID: <9011220035.AA05175@mqcomp.mqcs.mq.oz.au> In studying back-propogation, I find a degree of uncertainty as to exactly what the algorithm is. In particular, we all know and love the parameter ETA which specifies how big a step to take, but is the step taken simply ETA times the derivative of the error, or is the derivative vector normalised so that the step taken is ETA in weight space? I have experimented with both of these variations and observe that normalising the size of the step taken produces faster convergence on parity and XOR problems, but can also introduce oscillation. SUrprisingly, I have not been able to observe oscillatory behaviour when using un-normalised derivative steps. (I refer to oscillation of the solution across a narrow valley in the error surface). I assume then that proponents of momentum (which stems back at least to Rumelhart et al) have all been normalising the derivative vector to achieve fixed-size steps in weight space. Is this assumption correct? If not, then can somebody point me to a problem that generates oscillatory behaviour (and please indicate the value of ETA also). Len Hamey len at mqcomp.mqcs.mq.oz.au From yoshua at HOMER.MACH.CS.CMU.EDU Wed Nov 21 19:41:36 1990 From: yoshua at HOMER.MACH.CS.CMU.EDU (Yoshua BENGIO) Date: Wed, 21 Nov 90 19:41:36 EST Subject: learning a synaptic learning rule. TR available. Message-ID: <9011220041.AA13980@homer.cs.mcgill.ca> The following technical report is now available by ftp from neuroprose: Bengio Y. and Bengio S. (1990). Learning a synaptic learning rule. Technical Report #751, Universite de Montreal, Departement d'informatique et de recherche operationelle. Learning a synaptic learning rule Yoshua Bengio Samy Bengio McGill University, Universite de Montreal School of Computer Science, Departement d'informatique 3480 University street, et de recherche operationelle, Montreal, Qc, Canada, H3A 2A7 Montreal, Qc, Canada, H3C 3J7 yoshua at cs.mcgill.ca bengio at iro.umontreal.ca An original approach to neural modeling is presented, based on the idea of searching for and tuning, with learning methods, a synaptic learning rule which is biologically plausible, and yields networks capable to learn to perform difficult tasks. This method relies on the idea of considering the synaptic modification rule DeltaW() as a parametric function. This function has local inputs and is the same in many neurons. Its parameters can be estimated with known learning methods. For this optimization, we give particular attention to gradient descent and genetic algorithms. Estimation of these parameters consists of a joint global optimization of (a) the synaptic modification function, and (b) the networks that are learning to perform some tasks, using this function. We show how to compute the gradient of an optimization criteria with respect to the parameters of DeltaW(). Both network architecture and the learning function can be designed within constraints derived from biological knowledge. To avoid that DeltaW() be too specialized, this function is forced to be the same for a large number of synapses, in a population of networks learning to perform different tasks. To enforce efficiency constraints, some of these networks should learn complex mappings (as in pattern recognition). Others should learn to reproduce behavioral phenomena, such as associative conditioning, and neurological phenomena, such as habituation, recovery, dishabituation and sensitization. The architecture of the networks reproducing these biological phenomena can be designed based on well-studied circuits, such as those involved in associations in Aplysia, Hermissenda, or the rabbit eyelid closure response. Multiple synaptic modification functions allow for the diverse types of synapses (e.g. inhibitory, excitatory). Models of pre-, epi- and post-synaptic mechanisms can be used to bootstrap Delta W(), so that it initially consists of a combination of simpler modules, each emulating a particular synaptic mechanism. --------------------------------------------------------------------------- Copies of the postscript file bengio.learn.ps.Z may be obtained from the pub/neuroprose directory in cheops.cis.ohio-state.edu. Either use the Getps script or do this: unix-1> ftp cheops.cis.ohio-state.edu # (or ftp 128.146.8.62) Connected to cheops.cis.ohio-state.edu. Name (cheops.cis.ohio-state.edu:): anonymous 331 Guest login ok, sent ident as password. Password: neuron 230 Guest login ok, access restrictions apply. ftp> cd pub/neuroprose ftp> binary ftp> get bengio.learn.ps.Z ftp> quit unix-2> uncompress bengio.learn.ps.Z unix-3> lpr -P(your_local_postscript_printer) bengio.learn.ps Or, order a hardcopy by sending your physical mail address to bengio at iro.umontreal.ca, mentioning Technical Report #751. Please do this only if you cannot use the ftp method described above. ---------------------------------------------------------------------------- From ackley at chatham.bellcore.com Wed Nov 21 14:39:37 1990 From: ackley at chatham.bellcore.com (David H Ackley) Date: Wed, 21 Nov 90 14:39:37 -0500 Subject: NIPS*90 workshop at Keystone, CO. 11/30/90 or 12/1/90 Message-ID: <9011211939.AA09300@chatham.bellcore.com> Genetic Algorithms, Neural Networks, and Artificial Life David H. Ackley Richard K. Belew Based on the principles of natural selection, "genetic algorithms" (GAs) are a class of adaptive techniques that use a population of structures to represent a set of potential solutions to some problem. Selective reproduction emphasizes "more fit" individuals and focuses the search process, while genetic operators modify the offspring to increase diversity and search broadly. Theoretical and empirical results highlight the importance of employing the "crossover" operator to exchange information between individuals. Such genetic recombination produces a global search strategy quite different from --- and in some ways complementary to --- the gradient-based techniques popular in neural network learning. We will survey the theory and practice of genetic algorithms, and then focus on the growing body of research efforts that combine genetic algorithms and neural networks. Brief presentations from researchers active in the field (including Richard Lippmann, David Stork, and Darrell Whitley) will set the stage for in-depth discussions of issues in the area, such as: * Comparison and composition of GA sampling and NNet searching * The advantages and costs of recombination operators * Parallel implementations of GAs * Appropriate representations for NNets with the GA * Roles for ontogeny between GA evolution and NNet learning During the course of the workshop we will gradually broaden our scope. As natural neurons provide inspiration for artificial neural networks, and natural selection provides inspiration for GAs, other aspects of natural life can provide inspirations for studies in "artificial life". We will sample recent "alife" research efforts, and conclude with a discussion of prospects and problems for this new, interdisciplinary field. From carol at ai.toronto.edu Thu Nov 22 10:15:39 1990 From: carol at ai.toronto.edu (Carol Plathan) Date: Thu, 22 Nov 1990 10:15:39 -0500 Subject: CRG-TR-90-6 and 90-7 requests Message-ID: <90Nov22.101545edt.434@neuron.ai.toronto.edu> PLEASE DO NOT FORWARD TO OTHER NEWSGROUPS OF MAILING LISTS ********************************************************** You may order the following two new technical reports by sending your physical mailing address to carol at ai.toronto.edu. Pls do not reply to the whole list. 1. CRG-TR-90-6: Speaker Normalization and Adaptation using Second-Order Connectionist Networks by Raymond L. Watrous 2. CRG-TR-90-7: Learning Stochastic Feedforward Networks by Radford M. Neal Abstracts follow: ------------------------------------------------------------------------------- This technical report is an extended version of the paper that was presented to the Acoustical Society of America in May, 1990. SPEAKER NORMALIZATION AND ADAPTATION USING SECOND-ORDER CONNECTIONIST NETWORKS Raymond L. Watrous Department of Computer Science, University of Toronto, Toronto, Canada M5S 1A4 CRG-TR-90-6 A method for speaker-adaptive classification of vowels using connectionist networks is developed. A normalized representation of the vowels is computed by a speaker-specific linear transformation of observations of the speech signal using second-order connectionist network units. Vowel classification is accomplished by a multilayer network which operates on the normalized speech data. The network is adapted for a new talker by modifying the transformation parameters while leaving the classifier fixed. This is accomplished by back-propagating classification error through the classifier to the second-order transformation units. This method was evaluated for the classification of ten vowels for 76 speakers using the first two formant values of the Peterson/Barney data. A classifier optimized on the normalized data led to a recognition accuracy of 93.2%. When adapted to each speaker from various initial transformation parameters, the accuracy improved to 96.6%. When the speaker-dependent transformation and nonlinear classifier were simultaneously optimized, a vowel recognition accuracy of as high as 97.5% was obtained. Speaker adaptation using this network also yielded an accuracy of 96.6%. The results suggest that rapid speaker adaptation resulting in high classification accuracy can be accomplished by this method. ------------------------------------------------------------------------------- LEARNING STOCHASTIC FEEDFORWARD NETWORKS Radford M. Neal Department of Computer Science, University of Toronto, Toronto, Canada M5S 1A4 CRG-TR-90-7 Connectionist learning procedures are presented for "sigmoid" and "noisy-OR" varieties of stochastic feedforward network. These networks are in the same class as the "belief networks" used in expert systems. They represent a probability distribution over a set of visible variables using hidden variables to express correlations. Conditional probability distributions can be exhibited by stochastic simulation for use in tasks such as classification. Learning from empirical data is done via a gradient-ascent method analogous to that used in Boltzmann machines, but due to the feedforward nature of the connections, the negative phase of Boltzmann machine learning is unnecessary. Experimental results show that, as a result, learning in a sigmoid feedforward network can be faster than in a Boltzmann machine. These networks have other advantages over Boltzmann machines in pattern classification and decision making applications, and provide a link between work on connectionist learning and work on the representation of expert knowledge. ------------------------------------------------------------------------------- From perham at nada.kth.se Fri Nov 23 06:43:59 1990 From: perham at nada.kth.se (Per Hammarlund) Date: Fri, 23 Nov 90 12:43:59 +0100 Subject: Performance of the biosim program. Message-ID: <9011231143.AA14396@nada.kth.se> We have been approached to give some details about the capabilities and performance of the biosim program. (A program that enables biologically realistic simulations of large neuronal networks on the Connection Machine from TMC. The program has been developed by the SANS group, Royal Institute of Technology; Technical Report TRITA-NA-P9021.) The biosim program is used, among other things, to simulate the swimming rhythm generator of the lamprey. The present model includes 100 segments each consisting of 6 neurons who in turn are built up of 4 compartments. The segments have 12 internal synapses and are interconnected by 4 synapses, giving a total of 2400 compartments and 1600 synapses. The simulation, as presented in the TR, uses a time step of 0.4 milliseconds. To simulate 2 seconds of real time with this setup takes about 11 minutes or one time step in 0.135 seconds on our 8K machine. The system is thus running at a speed of about 0.3% of real time. The time step can often be doubled without significant changes in simulation results. Obviously, changing the time step directly effects the simulation time. It should probably also be mentioned that the program does not put any limitations on the number of compartments nor synapses per cell. These numbers are set individually for every cell. Due to the SIMD architecture of the Connection Machine, what is simulated is actually at least 8192 compartments and equally many synapses, since that is the number of processing elements in our machine. It is therefore possible to simulate either a three times larger model of the lamprey or three lampreys with the above specifications simultaneously without ANY change in execution time. It also turns out that up to 65536 compartments and synapses, i.e. 27 lampreys of the above type, can be simulated in roughly four times the time. This is in some sense equivalent to simulating one lamprey at about 2% of real time. For a full size Connection Machine with 64K processing elements it is possible to simulate roughly eight times larger models in the same time, i.e. using the same reasoning you get 16% of real time. All of the above shows that it is difficult to state any single general nice and easy to grasp number regarding the relation to real time but we hope that this has explained the issue. Why would anyone want to simulate the same thing 27 or 216 times in parallel? An excellent use of such an ability is to try out different settings of a number of parameters in one shot. A more obvious use of the biosim program is to run a single large model. For instance the number of cells in each segment of the lamprey model can be increased giving even better correspondence with reality. Bjorn Levin, Per Hammarlund, Anders Lansner blevin at bion.kth.se, perham at nada.kth.se, ala at bion.kth.se SANS-NADA Royal Institute of Technology S-100 44 Stockholm Sweden From burr at mojave.stanford.edu Sat Nov 24 00:26:13 1990 From: burr at mojave.stanford.edu (Jim Burr) Date: Fri, 23 Nov 90 21:26:13 PST Subject: NIPS90 VLSI workshop Message-ID: <9011240526.AA05352@mojave.Stanford.EDU> papers/neuro/nips90/agenda.t To: everyone I've contacted about the NIPS90 VLSI workshop, thanks for your help! It's shaping up to be a great session. Special thanks to those who have volunteered to give presentations. Workshop 8. on VLSI Neural Networks is being held Saturday, Dec 1 at Keystone. Related workshops are workshop 7. on implementations of neural networks on digital, massively parallel computers, and workshop 9. on optical implementations. Abstract: 8. VLSI Neural Networks Jim Burr Stanford University Stanford, CA 94305 (415) 723-4087 burr at mojave.stanford.edu This one day workshop will address the latest advances in VLSI implementations of neural nets. How successful have implementations been so far? Are dedicated neurochips being used in real applications? What algorithms have been implemented? Which ones have not been? Why not? How important is on chip learning? How much arithmetic precision is necessary? Which is more important, capacity or performance? What are the issues in constructing very large networks? What are the technology scaling limits? Any new technology developments? Several invited speakers will address these and other questions from various points of view in discussing their current research. We will try to gain better insight into the strengths and limitations of dedicated hardware solutions. Agenda: morning: 1. review of new chips capacity performance power learning architecture 2. guidelines for reporting results - recommendation at evening session specify technology, performance, power if possible translate power into joules/connection or joules/update 3. analog vs digital - the debate goes on 4. on-chip learning - who needs it 5. locality - who needs it (Boltzmann vs backprop) 6. precision - how much 7. leveraging tech scaling evening: 1. large networks - how big memory power Here are some of the issues we will discuss during the workshop: - What is the digital/analog tradeoff for storing weights? - What is the digital/analog tradeoff for doing inner products? - What is the digital/analog tradeoff for multichip systems? - Is on-chip learning necessary? - How important is locality? - How much precision is needed in a digital system? - What capabilities can we expect in 2 years? 5 years? - What are the biggest obstacles to implementing LARGE networks? Capacity? Performance? Power? Connectivity? presenters: Kristina Johnson, UC Boulder electro-optical networks Josh Alspector, Bellcore analog Boltzmann machines Andy Moore, Caltech subthreshold (with Video!) Edi Saeckinger, ATT NET32K Tom Baker, Adaptive Solutions precision Hal McCartor, Adaptive Solutions the X1 Chip presenters: please consider mentioning the following: - technology (eg 2.0 micron CMOS, 0.8 micron GaAs) - capacity in connections and neurons - performance in connections per second - energy per connection (power dissipation) - on-chip learning? (Updates per second) - scalability? How large a network? - a few words on tools See you at the workshop! Jim. From LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU Thu Nov 22 21:35:00 1990 From: LWCHAN%CUCSD.CUHK.HK at VMA.CC.CMU.EDU (LAI-WAN CHAN) Date: Fri, 23 Nov 90 10:35 +0800 Subject: Back-propogation Message-ID: > From: IN%"len at mqcomp.mqcs.mq.oz.au" "Len Hamey" 23-NOV-1990 09:39:15.77 > In studying back-propogation, I find a degree of uncertainty as to exactly > what the algorithm is. In particular, we all know and love the parameter > ETA which specifies how big a step to take, but is the step taken > simply ETA times the derivative of the error, or is the derivative vector > normalised so that the step taken is ETA in weight space? I have The standard back propagation algorithm uses the gradient descent method in which the step size is ETA times the gradient. The addition of the momentum term reduces some oscillations. However, the convergence depends very much on the step size (i.e. ETA and momentum). A small step size gives a smooth but slow learning path and a large step size gives an oscillatory path. The oscillatory path sometimes has a faster convergence but too much oscillation is hazardous to your nets !! The two updating methods that you mentioned were effectively adopting two sets of step sizes and this is why the network show different convergence speed. In fact, the optimal values of ETA (gradient term) and ALPHA (momentum term) depend on the training patterns and the size of the network. Finding these values are an headache. There are methods that automatically adapt the step size so as to speed up the training. I have worked on the adaptive learning [1] which I found it very useful and efficient. Other commonly used training methods which show faster convergence include the delta-bar-delta method [2] and the conjugate gradient method (used in optimisation). A comparison of these methods can be found in [3]. Reference : [1] L-W. Chan & F. Fallside, "An adaptive training algorithm for back propagation networks", Computer Speech and Language, 2, 1987, p205-218. [2] R.A. Jacobs, "Increased rates of convergence through learning rate adaptation", Neural networks, Vol 1, 1988, p295-307. [3] L-W. Chan, "Efficacy of different learning algorithms of the back propagation network", Proceeding of the IEEE Region 10 Conference on Computer and Communication Systems (TENCON'90), 1990, Vol 1, p23-27. Lai-Wan Chan, Computer Science Dept, The Chinese University of Hong Kong, Shatin, N.T., Hong Kong. email : lwchan at cucsd.cuhk.hk (bitnet) tel : (+852) 695 2795 FAX : (+852) 603 5024 From WWILBERG at WEIZMANN.BITNET Mon Nov 26 03:28:19 1990 From: WWILBERG at WEIZMANN.BITNET (Asher Merhav) Date: Mon, 26 Nov 90 10:28:19 +0200 Subject: add new member Message-ID: <57C5DA0DF860012D@BITNET.CC.CMU.EDU> Dear mail distributor, Please add my name to your mailing list. I would very much appreciate you confirmation to this massage. Sincerely, Asher Merhav From bharucha at eleazar.dartmouth.edu Tue Nov 27 11:21:00 1990 From: bharucha at eleazar.dartmouth.edu (Jamshed Bharucha) Date: Tue, 27 Nov 90 11:21:00 -0500 Subject: Cognitive Neuroscience Message-ID: <9011271621.AA08894@eleazar.dartmouth.edu> The 1991 James S. McDonnell Foundation Summer Institute in Cognitive Neuroscience The 1991 Summer Institute will be held at Dartmouth College, July 1-12. The two week course will examine how information about the brain bears on issues in cognitive science, and how approaches in cognitive science apply to neuroscience research. A distinguished international faculty will lecture on current topics in attention and emotion, including neurological and psychiatric disorders; laboratories, demonstrations and videotapes will offer practical experience with cognitive neuropsychology experiments, connectionist and computational modeling, and neuroanatomy. At every stage, the relationship between cognitive issues and underlying neural circuits will be explored. The Institute directors will be Michael I. Posner, David L. LaBerge, Joseph E. LeDoux, Michael S. Gazzaniga, and Gordon M. Shepherd. The Foundation is providing limited support for travel expenses and room/board. Applications are invited from beginning and established researchers, and must be received by January 11, 1991. For further information contact P. Reuter-Lorenz (PARL at mac.dartmouth .edu) or write to: McDonnell Summer Institute in Cognitive Neuroscience HB 7915-A Dartmouth Medical School Hanover, NH 03756 ______________________________________________________ APPLICATION FORM 1991 JAMES S. McDONNELL FOUNDATION SUMMER INSTITUTE IN COGNITIVE NEUROSCIENCE NAME_________________________________________________ INSTITUTIONAL AFFILIATION________________________________________ RESEARCH INTERESTS________________________________________ POSITION______________________________________________ HOME ADDRESS_______________________________________________ ______________________________________________________ WORK ADDRESS_______________________________________________ ______________________________________________________ E-MAIL ADDRESS______________________________________________ TELEPHONES: WORK: ( )______________ HOME: ( )______________ Housing expenses and some meal costs will be covered by the Foundation. There will also be limited travel support available. Please indicate the percent of your need for this support: _______% APPLICATION DEADLINE: All materials must be recevied by JANUARY 11, 1991 NOTIFICATION OF ACCEPTANCE: MARCH 10, 1991 PLEASE SEND THIS FORM, TOGETHER WITH: 1. A one-page statement explaining why you wish to attend. 2. A curriculum vitae. 3. Two letters of recommendation. TO: McDonnell Summer Institute HB 7915-A Dartmouth Medical School Hanover, New Hampshire 03756 From faramarz at ecn.purdue.edu Tue Nov 27 19:32:39 1990 From: faramarz at ecn.purdue.edu (Faramarz Valafar) Date: Tue, 27 Nov 90 19:32:39 -0500 Subject: Network Message-ID: <9011280032.AA13943@fourier.ecn.purdue.edu> Please put me on the mail list for the connectionist mail network. Thank you FARAMARZ VALAFAR From dyer at CS.UCLA.EDU Wed Nov 28 00:29:39 1990 From: dyer at CS.UCLA.EDU (Dr Michael G Dyer) Date: Tue, 27 Nov 90 21:29:39 PST Subject: tech. rep. available on evolving trail-following organisms Message-ID: <901128.052939z.14327.dyer@lanai.cs.ucla.edu> Evolution as a Theme in Artificial Life: The Genesys/Tracker System* David Jefferson, Robert Collins, Claus Cooper Michael Dyer, Margot Flowers, Richard Korf Charles Taylor, Alan Wang Abstract Direct, fine-grained simulation is a promising way of investigating and modeling natural evolution. In this paper we show how we can model a population of organisms as a population of computer programs, and how the evolutionarily significant activities of organisms (birth, interaction with the environment, migration, sexual reproduction with genetic mutation and recombination, and death) can all be represented by corresponding operations on programs. We illustrate these ideas in a system built for the Connection Machine called Genesys/Tracker, in which artificial "ants" evolve the ability to perform a complex task. In less than 100 generations a population of 64K "random" ants, represented either as finite state automata or as artificial neural nets, evolve the ability to traverse a winding broken "trail" in a rectilinear grid environment. Throughout this study we pay special attention to methodological issues, such as the avoidance of representational artifacts, and to biological verisimilitude. * To appear in J. D. Farmer, C. Langton, S. Rasmussen and C. Taylor (Eds.), Artificial Life II, Addison-Wesley, in press. For a copy of this tech. rep., please send e-mail to: valerie at cs.ucla.edu requesting "Jefferson et al. -- Evolution as Theme in ALife" Be sure to include your USA mail address. From SCHWARTZ at PURCCVM.BITNET Wed Nov 28 11:39:00 1990 From: SCHWARTZ at PURCCVM.BITNET (SCHWARTZ) Date: Wed, 28 Nov 90 11:39 EST Subject: Please add me to the mailing list Message-ID: <2E58E50AF8600D38@BITNET.CC.CMU.EDU> Please add my name Richard G. Schwartz (schwartz at purccvm.bitnet) to the mailing list. Thank you. From menon at cs.utexas.edu Wed Nov 28 13:34:21 1990 From: menon at cs.utexas.edu (menon@cs.utexas.edu) Date: Wed, 28 Nov 90 12:34:21 cst Subject: Dissertation abstract & Technical report Message-ID: <9011281834.AA08555@jaguar.cs.utexas.edu> Technical report announcement: Dynamic Aspects of Signaling in Distributed Neural Systems Vinod Menon Dept. of Computer Sciences Univ. of Texas at Austin Austin, TX 78712 ABSTRACT A distributed neural system consists of localized populations of neurons -- neuronal groups -- linked by massive reciprocal connections. Signaling between neuronal groups forms the basis of functioning of such a system. In this thesis, fundamental aspects of signaling are investigated mathematically with particular emphasis on the architecture and temporal self-organizing features of distributed neural systems. Coherent population oscillations, driven by exogenous and endogenous events, serve as autonomous timing mechanisms and are the basis of one possible mechanism of signaling. The theoretical analysis has, therefore, concentrated on a detailed study of the origin and frequency-amplitude-phase characteristics of the oscillations and the emergent features of inter-group reentrant signaling. It is shown that a phase shift between the excitatory and inhibitory components of the interacting intra-neuronal-group signals underlies the generation of oscillations. Such a phase shift is readily induced by delayed inhibition or slowly decaying inhibition. Theoretical analysis shows that a large dynamic frequency-amplitude range is possible by varying the time course of the inhibitory signal. Reentrant signaling between two groups is shown to give rise to synchronization, desynchronization, and resynchronization (with a large jump in frequency and phase difference) of the oscillatory activity as the latency of the reentrant signal is varied. We propose that this phenomenon represents a correlation dependent non-Boolean switching mechanism. A study of triadic neuronal group interactions reveals topological effects -- the existence of stabilizing (closed loop) and destabilizing (open loop) circuits. The analysis indicates (1) the metastable nature of signaling, (2) the existence of time windows in which correlated and uncorrelated activity can take place, and (3) dynamic frequency-amplitude-phase modulation of oscillations. By varying the latencies, and hence the relative phases of the reentrant signals, it is possible to dynamically and selectively modulate the cross-correlation between coactive neuronal groups in a manner that reflects the mapping topology as well as the intrinsic neuronal circuit properties. These mechanisms, we argue, provide dynamic linkage between neuronal groups thereby enabling the distributed neural system to operate in a highly parallel manner without clocks, algorithms, and central control. To obtain a copy of the technical report TR-90-36 please send $5 in US bank check or international money order payable to "The University of Texas" at the following address: Technical Report Center Department of Computer Sciences University of Texas at Austin Taylor Hall 2.124 Austin, TX 78712-1188 USA ------------------------------------------------------------------------- Dept. of Computer Sciences email: menon at cs.utexas.edu University of Texas at Austin tel: 512-343-8033 Austin, TX 78712 -471-9572 ------------------------------------------------------------------------- From Alexis_Manaster_Ramer%Wayne-MTS at um.cc.umich.edu Wed Nov 28 11:18:58 1990 From: Alexis_Manaster_Ramer%Wayne-MTS at um.cc.umich.edu (Alexis_Manaster_Ramer%Wayne-MTS@um.cc.umich.edu) Date: Wed, 28 Nov 90 11:18:58 EST Subject: No subject Message-ID: <273352@Wayne-MTS> What is the address for business correspondence? From SCHNEIDER at vms.cis.pitt.edu Wed Nov 28 15:08:00 1990 From: SCHNEIDER at vms.cis.pitt.edu (SCHNEIDER@vms.cis.pitt.edu) Date: Wed, 28 Nov 90 16:08 EDT Subject: Neural Processes in Cognition Training Program Message-ID: Program announcement for Interdisciplinary Graduate and Postdoctoral Training in Neural Processes in Cognition at the University of Pittsburgh and Carnegie Mellon University The National Science Foundation has newly established an innovative program for students investigating the neurobiology of cognition. The program's focus is the interpretation of cognitive functions in terms of neuroanatomical and neurophysiological data and computer simulations. Such functions include perceiving, attending, learning, planning, and remembering in humans and in animals. A carefully designed program of study prepares each student to perform original research investigating cortical function at multiple levels of analysis. State of the art facilities include: computerized microscopy, human and animal electrophysiological instrumentation, behavioral assessment laboratories, brain scanners, the Pittsburgh Supercomputing Center, and a regional medical center providing access to human clinical populations. This is a joint program between the University of Pittsburgh, its School of Medicine, and Carnegie Mellon University. Each student receives full financial support, travel allowances and a computer workstation. Applications are encouraged from students with interest in biology, psychology, engineering, physics, mathematics, or computer science. Pittsburgh is one of America's most exciting and affordable cities, offering outstanding symphony, theater, professional sports, and outdoor recreation in the surrounding Allegheny mountains. More than ten thousand graduate students attend its universities. Core Faculty and interests and affiliation Carnegie Mellon University -Psychology- James McClelland, Johnathan Cohen, Martha Farah, Mark Johnson University of Pittsburgh Behavioral Neuroscience - Michael Ariel, Theodore Berger Biology - Teresa Chay Information Science - Paul Munro Neurobiology Anatomy and Cell Sciences - Al Humphrey, Jennifer Lund Neurological Surgery - Don Krieger, Robert Sclabassi Neurology Steven Small Psychiatry - David Lewis, Lisa Morrow, Stuart Steinhauer Psychology - Walter Schneider, Velma Dobson Physiology - Dan Simons Applications: To apply to the program contact the program office or one of the affiliated departments. Students are admitted jointly to a home department and the Neural Processes in Cognition Program. The application deadline is February 1. Students must be accepted both by an affiliated department and this program. For information contact: Professor Walter Schneider Program Director Neural Processes in Cognition University of Pittsburgh 3939 O'Hara St Pittsburgh, PA 15260 Or: call 412-624-7064 or Email to NEUROCOG at PITTVMS.BITNET. From BOHANNON%BUTLERU.BITNET at VMA.CC.CMU.EDU Thu Nov 29 11:59:00 1990 From: BOHANNON%BUTLERU.BITNET at VMA.CC.CMU.EDU (BOHANNON%BUTLERU.BITNET@VMA.CC.CMU.EDU) Date: Thu, 29 Nov 90 11:59 EST Subject: job opening - Rank open Message-ID: Applied Cognition - Rank Open: The Department of Psychology at Butler University is seeking nominations/applications for a tenure track opening to start August, 1991. We are seeking faculty who would be knowledgeable in distributed processing systems and their applications. Specific area is less important than evidence of excellence in both teaching and research. Salaries are negotiable and competitive. Responsibilities include: maintain an active research program with undergraduates, teach an undergraduate course in Human Factors and other courses in the candidate's area of interest. Minimum Qualifications: Ph.D. in psychology at time of appointment. For junior nominees/applicants: potential excellence in both research and teaching. Teaching experience preferred. For senior nominees/applicants: established record of excellence in research, more than 3 years of teaching experience, and an interest and/or experience in attracting external funding. Butler University is a selective, private university located on a 400+ acre campus in the residential heart of Indianapolis. The psychology department is housed in recently renovated space with state-of-the-art, video/social interaction and computer-cognition labs (Macintosh II and up). Screening of nominees/applicants will continue until suitable faculty are found. Those interested should send a statement of research and teaching interests, vita, and three original letters of reference to: John Neil Bohannon III, Head, Department of Psychology, Butler University, 4600 Sunset Ave., Indianapolis, IN 46208. Bitnet : Bohannon@ ButlerU. AA/EEO From hwang at uw-isdl.ee.washington.edu Thu Nov 29 19:39:07 1990 From: hwang at uw-isdl.ee.washington.edu ( J. N. Hwang) Date: Thu, 29 Nov 90 16:39:07 PST Subject: call for papers Message-ID: <9011300039.AA12518@uw-isdl.ee.washington.edu.isdlyp> IJCNN'91 SINGAPORE, CALL FOR PAPERS CONFERENCE: The IEEE Neural Network Council and the international neural network society (INNS) invite all persons interested in the field of Neural Networks to submit FULL PAPERS for possible presentation at the conference. FULL PAPERS: must be received by May 31, 1991. All submissions will be acknowledged by mail. Authors should submit their work via Air Mail or Express Courier so as to ensure timely arrival. Papers will be reviewed by senior researchers in the field, and all papers accepted will be published in full in the conference proceedings. The conference hosts tutorials on Nov. 18 and tours arranged probably on Nov. 17 and Nov. 22, 1991. Conference sess- ions will be held from Nov. 19-21, 1991. Proposals for tutorial speakers & topics should be submitted to Professor Toshio Fukuda (address below) by Nov. 15, 1990. TOPICS OF INTEREST: original, basic and applied papers in all areas of neural networks & their applications are being solicited. FULL PAPERS may be submitted for consideration as oral or poster pres- entation in (but not limited to) the following sessions: -- Associative Memory -- Sensation & Perception -- Electrical Neurocomputers -- Sensormotor Control System -- Image Processing -- Supervised Learning -- Invertebrate Neural Networks -- Unsupervised Learning -- Machine Vision -- Neuro-Physiology -- Neurocognition -- Hybrid Systems (AI, Neural -- Neuro-Dynamics Networks, Fuzzy Systems) -- Optical Neurocomputers -- Mathematical Methods -- Optimization -- Applications -- Robotics AUTHORS' SCHEDULE: Deadline for submission of FULL PAPERS (camera ready) May 31, 1991 Notification of acceptance AUg. 31, 1991 SUBMISSION GUIDELINES: Eight copies (One original and seven photocopies) are required for submission. Do not fold or staple the original, camera ready copy. Papers of no more than 6 pages, including figures, tables and references, should be written in English and only complete papers will be considered. Papers must be submitted camera-ready on 8 1/2" x 11" white bond paper with 1" margins on all four sides. They should be prepared by typewriter or letter quality printer in one-column format, single-spaced or similar type of 10 points or larger and should be printed on one side of the paper only. FAX submissions are not acceptable. Centered at the top of the first page should be the complete title, author name(s), affiliation(s) and mailing address(es). This is followed by a blank space and then the abstract, up to 15 lines, followed by the text. In an accompanying letter, the following must be included: -- Corresponding author: -- Presentation preferred: Name Oral Mailing Address Poster Telephone & FAX number -- Technical Session: -- Presenter: 1st Choice Name 2nd Choice Mailing Address Telephone & FAX number FOR SUBMISSION FROM JAPAN, SEND TO: Professor Toshio Fukuda Programme Chairman IJCNN'91 SINGAPORE Dept. of Mechanical Engineering Nagoya University, Furo-cho, Chikusa-Ku Nagoya 464-01 Japan. (FAX: 81-52-781-9243) FOR SUBMISSION FROM USA, SEND TO: Ms Nomi Feldman Meeting Management 5565 Oberlin Drive, Suite 110 San Diego, CA 92121 (FAX: 81-52-781-9243) FOR SUBMISSION FROM REST OF THE WORLD, SEND TO: Dr. Teck-Seng, Low IJCNN'91 SINGAPORE Communication Intl Associates Pte Ltd 44/46 Tanjong Pagar Road Singapore 0208 (TEL: (65) 226-2838, FAX: (65) 226-2877, (65) 221-8916) From fodslett at daimi.aau.dk Fri Nov 30 13:48:14 1990 From: fodslett at daimi.aau.dk (Martin Moller) Date: Fri, 30 Nov 90 19:48:14 +0100 Subject: Preprint about scaled conjugate gradient available. Message-ID: <9011301848.AA22842@daimi.aau.dk> ************************************************************* ******************** PREPRINT announcement: **************** A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning. Martin F. Moller Computer Science Dept. University of Aarhus Denmark e-mail: fodslett at daimi.aau.dk Abstract-- A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. SCG uses second order information from the neural network, but requires only O(N) memory usage. The performance of SCG is benchmarked against the performance of the standard backpropagation algorithm (BP) and several recently proposed standard conjugate gradient algorithms. SCG yields a speed-up at least an order of magnitude relative to BP. The speed-up depends on the convergence criterion,i.e., the bigger demand for reduction in error the bigger the speed-up. SCG is fully automated including no user dependent parameters and avoids a time consuming line search, which other conjugate gradient algorithms use in order to determine a good step size. Incorporating problem dependent structural information in the architecture of a neural network often lowers the overall complexity. The smaller the complexity of the neural network relative to the problem domain, the bigger the possibility that the weights space contains long ravines characterized by sharp curvature. While BP is inefficient on these ravine phenomena, SCG handles them effectively. The paper is available by ftp in the neuroprose directory under the name: moller.conjugate-gradient.ps.Z Any question or comments on this short writing or on the preprint would be very much appriciated. Martin M. From caroly at park.bu.EDU Fri Nov 30 15:13:20 1990 From: caroly at park.bu.EDU (caroly@park.bu.EDU) Date: Fri, 30 Nov 90 15:13:20 -0500 Subject: graduate study in neural networks Message-ID: <9011302013.AA00760@bucasb.bu.edu> (please post) *********************************************** * * * GRADUATE PROGRAM IN * * COGNITIVE AND NEURAL SYSTEMS (CNS) * * AT BOSTON UNIVERSITY * * * *********************************************** Gail A.Carpenter & Stephen Grossberg, Co-Directors The Boston University graduate program in Cognitive and Neural Systems offers comprehensive advanced training in the neural and computational principles, mechanisms, and architectures that underly human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Applications for Fall, 1991 admissions and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write or telephone: Cognitive & Neural Systems Program Boston University 111 Cummington Street, Room 240 Boston, MA 02215 (617) 353-9481 or send a mailing address to: caroly at park.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores may decrease an applicant's chances for admission and financial aid. Description of the CNS Program: The Cognitive and Neural Systems (CNS) Program provides advanced training and research experience for graduate students interested in the neural and computational principles, mechanisms, and architectures that underly human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Students are trained in a broad range of areas concerning cognitive and neural systems, including vision and image processing; speech and language understanding; adaptive pattern recognition; associative learning and long-term memory; cognitive information processing; self-organization; cooperative and competitive network dynamics and short-term memory; reinforcement, motivation, and attention; adaptive sensory-motor control and robotics; and biological rhythms; as well as the mathematical and computational methods needed to support advanced modeling research and applications. The CNS Program awards MA, PhD, and BA/MA degrees. The CNS Program embodies a number of unique features. Its core curriculum consists of eight interdisciplinary graduate courses each of which integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of neural networks to technology. Each course is taught once a week in the evening to make the program available to qualified students, including working professionals, throughout the Boston area. Students develop a coherent area of expertise by designing a program that includes courses in areas such as Biology, Computer Science, Engineering, Mathematics, and Psychology, in addition to courses in the CNS core curriculum. The CNS Program prepares Ph.D. students for thesis research with scientists in one of several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The unit most closely linked to the Program is the Center for Adaptive Systems. The Center for Adaptive Systems is also part of the Boston Consortium for Behavioral and Neural Studies, a Boston-area multi-institutional Congressional Center of Excellence. Another multi-institutional Congressional Center of Excellence focussed at Boston University is the Center for the Study of Rhythmic Processes. Other research resources include distinguished research groups in dynamical systems within the mathematics department; in theoretical computer science within the Computer Science Department; in biophysics and computational physics within the Physics Department; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the Engineering School; and in neurophysiology, neuroanatomy, and neuropharmacology at the Medical School.