From hava at bimacs.cs.biu.ac.il Wed Dec 1 13:33:07 1993
From: hava at bimacs.cs.biu.ac.il (Siegelmann Hava)
Date: Wed, 1 Dec 93 20:33:07 +0200
Subject: TR: Unreliable Neurons and Asynchronous Recurrent Nets
Message-ID: <9312011833.AA21754@bimacs.cs.biu.ac.il>

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/siegelmann.prob.ps.Z

The file siegelmann.prob.ps.Z is now available for copying from the Neuroprose repository:

=============================================================================

ON THE COMPUTATIONAL POWER OF FAULTY AND ASYNCHRONOUS NEURAL NETWORKS (28 pages)

Hava Siegelmann
Bar-Ilan University

ABSTRACT
========

This paper deals with finite-size recurrent neural networks consisting of general (possibly cyclic) interconnections of evolving processors. Each neuron may assume a real activation value. We provide the first rigorous foundations for {\it recurrent} networks which are built of unreliable {\it analog} devices and exhibit asynchrony in their updates. The first model considered incorporates unreliable devices (either neurons or the connections between them) which fail with fixed error probabilities, independent of the history and the global state of the network. This model corresponds to the random-noise philosophy of Shannon. Another model allows the error probabilities to depend on both the global state and the history. Next, we let the various faulty nets update in total asynchrony. We prove that all the above models are computationally equivalent and characterize their power. In particular, we see that for some constrained models of networks, the random behavior adds nonuniformity to the computation. However, the general model of networks is robust to probabilistic failures and asynchronous behavior. Nondeterministic unreliable nets are defined, and we show that in the faulty model the equality P = NP holds.
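The first (Shannon-style) model is easy to picture: each device computes its usual output, which is then flipped with a fixed probability, independent of history and global state. A minimal sketch assuming a simple threshold unit; the function name and parameters below are illustrative, not from the paper:

```python
import random

random.seed(0)

def faulty_threshold(z, p):
    """Threshold unit whose binary output is flipped with a fixed
    probability p, independently of history and global state."""
    out = 1 if z >= 0.5 else 0
    return 1 - out if random.random() < p else out

# The reliable output for z = 0.7 is 1; with p = 0.1 the unit emits 0
# roughly 10% of the time.
outputs = [faulty_threshold(0.7, 0.1) for _ in range(10000)]
print(sum(outputs) / len(outputs))
```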
=======================================================================

Hope you find it interesting,
Hava

From gupta at prl.philips.co.uk Thu Dec 2 05:09:43 1993
From: gupta at prl.philips.co.uk (Ashok Gupta)
Date: Thu, 2 Dec 93 10:09:43 UTC
Subject: Announcement - General Purpose Parallel Computing
Message-ID: <29129.9312021009@apollo60.prl.philips.co.uk>

The British Computer Society Parallel Processing Specialist Group (BCS PPSG)

General Purpose Parallel Computing
A One Day Open Meeting with Invited and Contributed Papers
22 December 1993, University of Westminster, London, UK

Invited speakers: Les Valiant, Harvard University; Bill McColl, PRG, University of Oxford, UK; David May, Inmos, UK

A key factor for the growth of parallel computing is the availability of portable software. To be portable, software must be written to a model of machine performance with universal applicability. Software providers must be able to provide programs whose performance will scale with machine and application size according to agreed principles. This environment presupposes a model of parallel performance, and one which will perform well for irregular as well as regular patterns of interaction. Adoption of a common model by machine architects, algorithm & language designers and programmers is a precondition for general purpose parallel computing.

Valiant's Bulk Synchronous Parallel (BSP) model provides a bridge between application, language design and architecture for parallel computers. BSP plays the same role for parallel computing as the Von Neumann model does for sequential computing. It forms the focus of a project for scalable-performance parallel architectures supporting architecture-independent software. The model and its implications for hardware and software design will be described in invited and contributed talks.
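For readers new to the model, a BSP computation proceeds in supersteps: each processor computes locally, exchanges messages, and then all processors meet at a barrier before the next superstep. A toy thread-based sketch (all names illustrative; not from the announcement):

```python
import threading

# Toy BSP program: p "processors" each square their local value, broadcast
# it, synchronise at a barrier, then sum everything they received.
p = 4
barrier = threading.Barrier(p)
inbox = [[] for _ in range(p)]   # one message slot per processor
data = [1, 2, 3, 4]
results = [0] * p

def worker(pid):
    # Superstep 1: local computation, then communication.
    local = data[pid] * data[pid]
    for dest in range(p):
        inbox[dest].append(local)   # "send" local value to every processor
    barrier.wait()                  # bulk-synchronous barrier
    # Superstep 2: combine everything received.
    results[pid] = sum(inbox[pid])

threads = [threading.Thread(target=worker, args=(i,)) for i in range(p)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)   # every processor ends up holding 1 + 4 + 9 + 16 = 30
```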
The PPSG, founded in 1986, exists to foster development of parallel architectures, languages and applications & to disseminate information on parallel processing. Membership is completely open; you do not have to be a member of the British Computer Society. For further information about the group contact either of the following:

Chair: Mr. A. Gupta, Philips Research Labs, Crossoak Lane, Redhill, Surrey, RH1 5HA, UK (gupta at prl.philips.co.uk)
Membership Secretary: Dr. N. Tucker, Paradis Consultants, East Berriow, Berriow Bridge, North Hill, Nr. Launceston, Cornwall, PL15 7NL, UK

Please share this information and display this announcement

The British Computer Society Parallel Processing Specialist Group (BCS PPSG)
General Purpose Parallel Computing
22 December 1993, Fyvie Hall, 309 Regent Street, University of Westminster, London, UK

Provisional Programme

9 am-10 am Registration & Coffee
L. Valiant, Harvard University, "Title to be announced"
W. McColl, Oxford University, Programming models for General Purpose Parallel Computing
A. Chin, King's College, London University, Locality of Reference in Bulk-Synchronous Parallel Computation
P. Thannisch et al, Edinburgh University, Exponential Processor Requirements for Optimal Schedules in Architecture with Locality
Lunch
D. May, Inmos, "Title to be announced"
R. Miller, Oxford University, A Library for Bulk Synchronous Parallel Programming
C. Jesshope et al, Surrey University, BSPC and the N-Computer
Tea/Coffee
P. Dew et al, Leeds University, Scalable Parallel Computing using the XPRAM model
S. Turner et al, Exeter University, Portability and Parallelism with `Lightweight P4'
N. Kalentery et al, University of Westminster, From BSP to a Virtual Von Neumann Machine
R. Bisseling, Utrecht University, Scientific Computing on Bulk Synchronous Parallel Architectures
B.
Thompson et al, University College of Swansea, Equational Specification of Synchronous Concurrent Algorithms and Architectures 5.30 pm Close Please share this information and display this announcement The British Computer Society Parallel Processing Specialist Group Booking Form/Invoice BCS VAT No. : 440-3490-76 Please reserve a place at the Conference on General Purpose Parallel Computing, London, December 22 1993, for the individual(s) named below. Name of delegate BCS membership no. Fee VAT Total (if applicable) ___________________________________________________________________________ ___________________________________________________________________________ ___________________________________________________________________________ Cheques, in pounds sterling, should be made payable to "BCS Parallel Processing Specialist Group". Unfortunately credit card bookings cannot be accepted. The delegate fees (including lunch, refreshments and proceedings) are (in pounds sterling) : Members of both PPSG & BCS: 55 + 9.62 VAT = 64.62 PPSG or BCS members: 70 + 12.25 VAT = 82.25 Non members: 90 + 15.75 VAT = 105.75 Full-time students: 25 + 4.37 VAT = 29.37 (Students should provide a letter of endorsement from their supervisor that also clearly details their institution) Contact Address: ___________________________________________ ___________________________________________ ___________________________________________ Email address: _________________ Date: _________________ Day time telephone: ________________ Places are limited so please return this form as soon as possible to : Mrs C. 
Cunningham BCS PPSG 2 Mildenhall Close, Lower Earley, Reading, RG6 3AT, UK (Phone 0734 665570) From pf2 at st-andrews.ac.uk Thu Dec 2 06:24:27 1993 From: pf2 at st-andrews.ac.uk (Peter Foldiak) Date: Thu, 2 Dec 93 11:24:27 GMT Subject: Lectureship in St Andrews Message-ID: <27004.9312021124@psych.st-andrews.ac.uk> UNIVERSITY OF ST ANDREWS LECTURESHIP IN THE SCHOOL OF PSYCHOLOGY Applications are invited for the above post, which is made available following the retirement of Professor MA Jeeves. Individuals with a strong research record in an area of psychology related to existing research strengths in the School are encouraged to apply. The successful candidate will be expected to make a significant contribution to the School's research activity, and to contribute to undergraduate and graduate teaching programmes. The appointment is available from 1 September 1994. The University is prepared to consider appointing at the senior lectureship or readership level in light of an assessment of the quality of the field available to it. The salary shall be on the appropriate point on the Academic payscale GBP 13601 - GBP 29788 per annum. Application forms and further particulars are available from Personnel Services, University of St Andrews, College Gate, St Andrews KY16 9AJ, U.K. (tel: +44 334 62562, out of hours +44 334 62571 or by fax +44 334 62570), to whom completed forms accompanied by a letter of application and CV should be returned to arrive not later than 10 December 1993. PLEASE QUOTE REFERENCE NUMBER SL/APS0001. The University operates an Equal Opportunities Policy. 
From kaylan%TRBOUN.BITNET at FRMOP11.CNUSC.FR Thu Dec 2 05:18:58 1993 From: kaylan%TRBOUN.BITNET at FRMOP11.CNUSC.FR (kaylan%TRBOUN.BITNET@FRMOP11.CNUSC.FR) Date: Thu, 02 Dec 1993 12:18:58 +0200 Subject: ESS'94 - Call For Papers Message-ID: <00976695.E0F01620.16208@trboun.bitnet> ESS'94 EUROPEAN SIMULATION SYMPOSIUM CALL FOR PAPERS ISTANBUL, TURKEY OCTOBER 9-12, 1994 HOSTED BY BOGAZICI UNIVERSITY Organized and sponsored by: The Society for Computer Simulation International (SCS) With cooperation of: The European Simulation Council (ESC) Ministry of Industry and Trade, Turkey Operational Research Society of Turkey (ORST) Cosponsored by: Bekoteknik Digital Hewlett Packard IBM Turk Main Topics: * Advances in Simulation Methodology and Practices * Artificial Intelligence in Simulation * Innovative Simulation Technologies * Industrial Simulation * Computer and Telecommunication Systems CONFERENCE COMMITTEE Conference Chairman: Prof. Dr. Tuncer I. Oren University of Ottawa, Computer Science Department, 150 Louis Pasteur / Pri., Ottawa, Ontario, Canada K1N 6N5 Phone: 1.613.654.5068 Fax: 1.613.564.7089 E-mail: oren at csi.uottawa.ca Program Chairman: Prof. Dr. Ali Riza Kaylan Bogazici University, Dept.of Industrial Engineering, 80815 Bebek, Istanbul, Turkey Phone: 90.212.2631540/2072 Fax: 90.212.2651800 E-Mail: Kaylan at trboun.bitnet Program Co-chairman: Prof. Dr. Axel Lehmann Universitaet der Bundeswehr, Munchen, Institut fur Technische Informatik, Werner-Heisenberg-Weg 39, D 85577 Neubiberg, Germany. Phone: 49.89.6004.2648/2654 Fax: 49.89.6004.3560 E-Mail: Lehmann at informatik.unibw-muenchen.de Finance Chairman: Rainer Rimane, University of Erlangen - Nurnberg Organization Committee: Ali Riza Kaylan, Yaman Barlas, Murat Draman, Levent Mollamustafaoglu, Tulin Yazgac International Program Committee (Preliminary): O. Balci, USA J. Banks, USA G. Bolch, Germany R. Crosbie, USA B. Delaney, USA M. S. Elzas, Netherlands H. Erkut, Turkey A. Eyler, Turkey P. Fishwick, USA E. 
Gelenbe, USA A. Guasch, Spain M. Hitz, Austria R. Huntsinger, USA G. Iazeolla, Italy K. Irmscher, Germany K. Juslin, Finland A. Javor, Hungary E. Kerckhoffs, Netherlands J. Kleijnen, Netherlands M. Kotva, Czech Rep. M. Koksalan, Turkey M. L. Pagdett, USA M. Pior, Germany R. Reddy, USA S. Reddy, USA B. Schmidt, Germany S. Sevinc, Australia H. Szczerbicka, Germany S. Tabaka, Japan O. Tanir, Canada G. Vansteenkiste, Belgium M. Wildberger, USA S. Xia, UK R. Zobel, UK

CONFERENCE INFORMATION

The ESS series (organized by SCS, the Society for Computer Simulation International) is now in its fifth year. SCS is an international non-profit organization founded in 1952. On a yearly basis SCS organizes 6 simulation conferences worldwide, cooperates in 2 others, and publishes the monthly magazine Simulation, a quarterly Transactions, and books. For more information, please tick the appropriate box on the reply card.

During ESS'94 the following events will be presented besides the scientific program:

Professional Seminars
The first day of the conference is dedicated to professional seminars, which will provide interested participants with a state-of-the-art overview of each of the five main themes of this conference. The participation fee is included in the conference registration fee. If you have suggestions for other advanced tutorial topics, please contact one of the program chairmen.

Exhibits
An exhibition will be held in the central hall where all participants meet for coffee and tea. There will be a special exhibition section for universities and non-profit organizations, and a special section for publishers and commercial stands. If you would like to participate in the exhibition, please contact the SCS European Office.

Vendor Sessions, Demonstrations and Video Presentations
For demonstrations or video sessions, please contact SCS International at the European Office. Special sessions within the scientific program will be set up for vendor presentations.
Other Organized Meetings
Several User Group meetings for simulation languages and tools will be organized on Monday. It is possible to have other meetings on Monday as well. If you would like to arrange a meeting, please contact the Conference Chairman. We will be happy to provide a meeting room and other necessary equipment.

VENUE

Istanbul, the only city in the world built on two continents, stands on the shores of the Istanbul Bogazi (Bosphorus), where the waters of the Black Sea mingle with those of the Sea of Marmara and the Golden Horn. Here on this splendid site, Istanbul guards the precious relics of three empires of which she has been the capital; a unique link between East and West, past and present. Istanbul has infinite variety: museums, ancient churches, palaces, great mosques, bazaars and the Bosphorus. However long you stay, just a few days or longer, your time will be wonderfully filled in this unforgettable city.

Bogazici University, which will host ESS'94, has its origins in Robert College, the first American college founded outside the United States (1863). It has a well-deserved reputation for academic excellence and accordingly attracts students from among the best and brightest in Turkey. The University is composed of four faculties, six institutes (offering graduate programs), and two other schools.

The conference location is the Istanbul Dedeman, an international five-star hotel located in the center of the city with a spectacular view of the Bosphorus. It is close to most of the historical sites as well as to the business center. For conference participants the special single-room rate is 65 US dollars.

SCIENTIFIC PROGRAM

The 1994 SCS European Simulation Symposium is structured around the following five major themes. A parallel track will be devoted to each of the five topics. The conference language is English.
* Advances in Simulation Methodology and Practices, e.g.:
  - Advanced Modelling, Experimentation, and Output Analysis and Display
  - Object-Oriented System Design and Simulation
  - Optimization of Simulation Models
  - Validation and Verification Techniques
  - Mixed Methodology Modelling
  - Special Simulation Tools and Environments
* Artificial Intelligence in Simulation, e.g.:
  - Knowledge-based Simulation Environments and Knowledge Bases
  - Knowledge-based System Applications
  - Reliability Assurance through Knowledge-based Techniques
  - Mixed Qualitative and Quantitative Simulation
  - Neural Networks in Simulation
* Innovative Simulation Technologies:
  - Virtual Reality
  - Multimedia Applications
* Industrial Simulation, e.g. Simulation in:
  - Design and Manufacturing, CAD, CIM
  - Process Control
  - Robotics and Automation
  - Concurrent Engineering, Scheduling
* Computer and Telecommunication Systems, e.g.:
  - Circuit Simulation, Fault Simulation
  - Computer Systems
  - Telecommunication Devices and Systems
  - Networks

INVITED SPEAKERS

Focusing on the main tracks of the conference, invited speakers will give special in-depth presentations in plenary sessions, which will be included in the proceedings of the conference.

BEST PAPER AWARDS

The 1994 European Simulation Symposium will award the best five papers, one in each of the five tracks. From these five papers, the best overall paper of the conference will be chosen. The awarded papers will be published in an international journal, if necessary after incorporating modifications.

DEADLINES AND REQUIREMENTS

Extended abstracts (2-3 pages/300 words for full papers; 1 page/150 words for short papers; typewritten, without drawings or tables) are due to arrive in QUADRUPLICATE at the office of Ali Riza Kaylan, Industrial Engineering Department, Bogazici University, TURKEY, before March 1, 1994. Only original papers, written in English, which have not previously been published elsewhere will be accepted.
If you want to organize a panel discussion, please contact the program chairmen. Authors are expected to register early (at a reduced fee) and to attend the conference at their own expense to present the accepted papers. If early registration and payment are not made, the paper will not be published in the conference proceedings. For multi-author papers, one author should be identified as the correspondent for the paper. Full-paper abstracts will be reviewed by three members of the International Program Committee, short-paper abstracts by one member. Notification of acceptance or rejection will be sent by April 30, 1994. An author kit with complete instructions for preparing a camera-ready copy for the proceedings will be sent to authors of accepted abstracts. The camera-ready copy of the papers must be received by July 15, 1994. Only the full papers, which are expected to be 5-6 pages long, will be published in the conference proceedings. In order to guarantee a high-quality conference, the full papers will be reviewed as well, to check whether the suggestions of the program committee have been incorporated. The nominees for the best paper awards will also be selected at this stage.

REGISTRATION FEE

                                      Author                    SCS members         Other participants
Registration before August 31, 1994   BF 15000 (375 ECU)        BF 15000 (375 ECU)  BF 17000 (425 ECU)
Registration after August 31, 1994
or at the conference                  Preregistration required  BF 17000 (425 ECU)  BF 20000 (500 ECU)

The registration fee includes one copy of the Conference Proceedings, attendance of the professional seminars, coffee and tea during the breaks, all lunches, a welcome cocktail and the conference dinner.

CORRESPONDENCE ADDRESS

Philippe Geril
The Society for Computer Simulation, European Simulation Office
University of Ghent, Coupure Links 653, B-9000 Ghent, Belgium.
Phone (Office): 32.9.233.77.90 Phone (Home): 32.59.800.804 Fax (Office): 32.9.223.49.41 E-Mail: Philippe.Geril at rug.ac.be

REPLY CARD

Family Name:
First Name:
Occupation and/or Title:
Affiliation:
Mailing Address:
Zip:  City:  Country:
Telephone:  Fax:
E-mail:

Yes, I intend to attend the European Simulation Symposium ESS'94:
o Proposing a paper
o Proposing a panel discussion
o Participating in a vendor session
o Contributing to the exhibition
o Without presenting a paper

The provisional title of my paper / poster / exhibited tool is:
With the following topics:

The paper belongs to the category (please tick one):
o Advances in Simulation Methodology and Practices
o Artificial Intelligence in Simulation
o Innovative Simulation Technologies
o Industrial Simulation
o Computer and Telecommunication Systems

The paper will be submitted as a:
o Full paper
o Short Paper
o Poster session
o Demonstration

Other colleague(s) interested in the topics of the conference is/are:
Name:  Address:
Name:  Address:

If you would like to receive more information about SCS and its activities, please tick the following box:
o YES, I would like to know more about SCS.

Please mail this card immediately to:
Philippe Geril, The Society for Computer Simulation, European Simulation Office, University of Ghent, Coupure Links 653, B-9000 Ghent, Belgium.

=============================================================================
Prof. Dr. Ali R. Kaylan          Director of Computer Center
Bogazici University              e-mail: Kaylan at Trboun.Bitnet
Dept. of Industrial Eng'g.       fax-no: (90-1)265 63 57 or (90-1)265 93 62
Bebek 80815                      phone: (90-1)265 93 62
Istanbul, TURKIYE                phone: (90-1)263 15 40 ext.
1445,1727,1407
=============================================================================

From plunkett at dragon.psych Thu Dec 2 10:04:26 1993
From: plunkett at dragon.psych (plunkett (Kim Plunkett))
Date: Thu, 2 Dec 93 15:04:26 GMT
Subject: No subject
Message-ID: <9312021504.AA20456@dragon.psych.pdp>

Lecturer in Cognitive Psychology
University of Oxford
Department of Experimental Psychology

Job Specification

The successful applicant will be required to assume special responsibility for teaching the Final Honours School paper "Memory and Cognition" which covers the following topics to be published in Examination Decrees and Regulations:

Basic processes and varieties of human memory. Memory retrieval and interference; recognition and recall; short- and long-term memory; working memory; sensory memory; priming; acquisition of cognitive and motor skills; modality-specific and material-specific varieties of coding in memory; mnemonics; everyday memory; mathematical and computational models of learning and memory; impairment of learning and memory.

The representation and use of knowledge. Topics such as: semantic memory; inference; concept formation; encoding of similarities and differences; concepts, prototypes, and natural categories; schemata; imagery; problem solving; decision-making; heuristics and biases; cross-cultural differences in cognition.

This is one of four papers in cognitive psychology offered in Final Honours. The appointed lecturer will be expected to pursue active research in an area of cognitive psychology. Although interests in higher mental functions, cognitive neuropsychology, language, or artificial intelligence would be an advantage, it should be stressed to potential applicants that there is no restriction on area of interest.

Further details can be obtained from:

Professor S.D.
Iversen, Head of Department Department of Experimental Psychology South Parks Road Oxford OX1 3UD or email: Jane Brooks - brooks at psy.ox.ac.uk From oja at dendrite.hut.fi Fri Dec 3 09:45:27 1993 From: oja at dendrite.hut.fi (Erkki Oja) Date: Fri, 3 Dec 93 16:45:27 +0200 Subject: No subject Message-ID: <9312031445.AA25681@dendrite.hut.fi.hut.fi> % % *** A LIST OF REFERENCES RELATED TO PCA NEURAL NETWORKS *** % We offer a fairly extensive collection of references on Principal Component Analysis (PCA) neural networks and learning algorithms, available by anonymous ftp. The list also contains references on extensions and generalizations of such networks and some basic references on PCA and related matters. You can copy the list freely on your own responsibility. The original list has been compiled by Liu-Yue Wang, a graduate student of Erkki Oja, and updated by Juha Karhunen, all from Helsinki University of Technology, Finland. The list should cover fairly well the field of PCA networks. Although it is not complete and contains possibly some errors and nonuniformity in notation, the reference collection should be useful for people interested in PCA neural networks already in its present form. To get the list, connect by ftp to dendrite.hut.fi and give anonymous as the user id. Then proceed according to instructions. 
Erkki Oja, Liu-Yue Wang, Juha Karhunen % ************************************************************ From liaw at rana.usc.edu Fri Dec 3 19:00:47 1993 From: liaw at rana.usc.edu (Jim Liaw) Date: Fri, 3 Dec 93 16:00:47 PST Subject: Frog-Net Announcement Message-ID: <9312040000.AA21922@rana.usc.edu> ********************************************************************* ** ** ** o/\o Frog-Net o/\o ** ** \| |/ \| |/ ** ** | | An electronic forum for researchers | | ** ** -- engaged in the study of the behavior -- ** ** \/ \/ and the underlying neural mechanisms \/ \/ ** ** in amphibians ** ** ** ********************************************************************* This mailing list is set up to facilitate the communication and interaction among researchers interested in the behavior and the underlying neural mechanisms in amphibians. If you would like to send email to all members of the list, address it to "frog-net at rana.usc.edu" If you want to subscribe to the mailing list, please send an email to "liaw at rana.usc.edu" ::::::::::::::::::::::::::::::::::: Jim Liaw Center for Neural Engineering Univ. of Southern California Los Angeles, CA 90089-2520 (213) 740-6991 liaw at rana.usc.edu From amari at sat.t.u-tokyo.ac.jp Sat Dec 4 13:43:53 1993 From: amari at sat.t.u-tokyo.ac.jp (amari@sat.t.u-tokyo.ac.jp) Date: Sat, 4 Dec 93 13:43:53 JST Subject: PCA reference Message-ID: <9312040443.AA12881@bpel.tutics.tut.ac.jp> I have copied Dr.Sanger's very useful bibliography on PCA. I would like to add one "prehistoric" reference. In the paper S.Amari, Neural Theory of Association and Concept Formation, Biological Cybernetics, vol.26, 175 - 185, 1977, I discussed the general aspect of neural learning of the form (d/dt)w = -cw + c'rx, where w is the synaptic weight vector, c and c' are constant, x is the input vector and r is the "learning signal" depending on w, x and an extra signal. 
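A minimal discrete-time sketch of this rule with learning signal r = w.x and the weight renormalized after each step (the subsidiary condition w.w = const); the constants, step size and input distribution below are illustrative choices, not from the paper. The weight w should align with the principal eigenvector of the input covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative input distribution: zero-mean Gaussian with covariance C,
# so the principal eigenvector of C is the predicted limit of w.
C = np.array([[3.0, 1.0], [1.0, 1.0]])
A = np.linalg.cholesky(C)

c, c_prime, eta = 1.0, 1.0, 0.01   # constants c, c' and an Euler step size
w = rng.normal(size=2)
w /= np.linalg.norm(w)

for _ in range(20000):
    x = A @ rng.normal(size=2)
    r = w @ x                              # learning signal r = w.x
    w += eta * (-c * w + c_prime * r * x)  # dw/dt = -c*w + c'*r*x
    w /= np.linalg.norm(w)                 # subsidiary condition w.w = 1

top = np.linalg.eigh(C)[1][:, -1]  # eigenvector of the largest eigenvalue
print(abs(w @ top))                # approaches 1 as w aligns with it
```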
I have proved the existence of a potential or Lyapunov function in various types of neural learning. The case r = w.x was also noted. On p. 179, one can see the following statement: "If the connection weight is subject to the subsidiary condition w.w = const, so that w(t) is normalized after each step of learning, we can prove that w(t) converges to the minimum of L(w) under the subsidiary condition. It is the direction of the eigenvector of the matrix corresponding to the maximum eigenvalue." Here L(w) is the Lyapunov function, a special case of a more general one I proposed, and the matrix in question is the covariance matrix (second-order moment matrix) of the input signals. This was only a few lines of description, and neural PCA was not the main theme of the paper, so this is only prehistory. But someone might be interested in a prehistoric anecdote.

From LC4A%ICINECA.BITNET at BITNET.CC.CMU.EDU Tue Dec 7 15:09:35 1993
From: LC4A%ICINECA.BITNET at BITNET.CC.CMU.EDU (F. Ventriglia)
Date: Tue, 07 Dec 93 15:09:35 SET
Subject: New Book Announcement
Message-ID: <01H672ZH3AKY0003VW@BITNET.CC.CMU.EDU>

Dear fellow Connectionists,

The following book has appeared as part of the Studies in Neuroscience series and may be of interest to you.

Best,
Francesco Ventriglia
Neurodynamics Department
Cybernetics Institute, CNR
Arco Felice (NA), Italy

*****************************************************************

Neural Modeling and Neural Networks
F. Ventriglia, editor - Pergamon Press

Research in neural modeling and neural networks has escalated dramatically in the last decade, acquiring along the way terms and concepts, such as learning, memory, perception, recognition, which are the basis of neuropsychology. Nevertheless, for many, neural modeling remains controversial in its purported ability to describe brain activity.
The difficulties in modeling are various, but arise principally in identifying those elements that are fundamental for the expression (and description) of higher neural activity. This is complicated by our incomplete knowledge of neural structures and functions, at the cellular and population levels. The first step towards enhanced appreciation of the value of neural modeling and neural networks is to be aware of what has been achieved in this multidisciplinary field of research. This book sets out to create such awareness. Leading experts develop in twelve chapters the key topics of neural structures and functions, dynamics of single neurons, oscillations in groups of neurons, randomness and chaos in neural activity, (statistical) dynamics of neural networks, learning, memory and pattern recognition.

Contents: Preface. Contributors.
Anatomical bases of neural network modeling (J. Szentagothai)
Models of visuomotor coordination in frog and monkey (M.A. Arbib)
Analysis of single-unit activity in the cerebral cortex (M. Abeles)
Single neuron dynamics: an introduction (L.F. Abbott)
An introduction to neural oscillators (B. Ermentrout)
Mechanisms responsible for epilepsy in hippocampal slices predispose the brain to collective oscillations (R.D. Traub, J.G.R. Jefferys)
Diffusion models of single neurones' activity and related problems (L.M. Ricciardi)
Noise and chaos in neural systems (P. Erdi)
Qualitative overview of population neurodynamics (W.F. Freeman)
Towards a kinetic theory of cortical-like neural fields (F. Ventriglia)
Psychology, neuro-biology and modeling: the science of Hebbian reverberations (D.J. Amit)
Pattern recognition with neural networks (K. Fukushima)
Bibliography. Author index. Subject index.

Publication date: November 1993. Approx. 300 pages. Price: US$ 125.00.

Available from: Pergamon Press Inc.
660 White Plains Road, Tarrytown, NY 10591-5153, USA
Phone +1-914-524-9200 Fax +1-914-333-2444

From tds at ai.mit.edu Tue Dec 7 18:40:47 1993
From: tds at ai.mit.edu (Terence D. Sanger)
Date: Tue, 7 Dec 93 18:40:47 EST
Subject: PCA algorithms, continued.
Message-ID: <9312072340.AA11203@rice-chex>

In response to my previous message, many people have sent me new references to PCA algorithms, and these have been included in the BibTeX database pca.bib. (Also note Wang's more extensive pclist.tex file announced recently on this net.) Erkki Oja has been kind enough to forward copies of some of his recent papers on the "Weighted Subspace Algorithm" and "Nonlinear PCA". Looking at these carefully, I think both algorithms are closely related to Brockett's algorithm, and probably work for the same reason. I have created another short derivation, "oja.tex", which is available along with the updated pca.bib by anonymous ftp from ftp.ai.mit.edu in the directory pub/sanger-papers.

One could invoke some sort of transitivity property to claim that since Oja's algorithms are related to Brockett's, Brockett's are related to GHA, and GHA does deflation, then Oja's algorithms must also do deflation. This would imply that Oja's algorithms also satisfy the hypothesis: "All algorithms for PCA which are based on a Hebbian learning rule must use sequential deflation to extract components beyond the first." But I must admit that the connection is becoming somewhat tenuous. Probably the hypothesis should be interpreted as a vague description of a motivation for the computational mechanism, rather than a direct description of the algorithm. However, I still feel that it is important to realize the close relationship between the many algorithms which use Hebbian learning to find exact eigenvectors.

As always, comments/suggestions/counterexamples/references are welcomed!
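For concreteness, here is a small sketch of GHA itself, in which the lower-triangular term implements the sequential deflation referred to above; the data, step size and dimensions are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Data with known covariance diag(5, 2, 0.5): the two leading eigenvectors
# are the first two standard basis vectors, so convergence is easy to check.
stds = np.array([5.0, 2.0, 0.5]) ** 0.5
X = rng.normal(size=(60000, 3)) * stds

k, eta = 2, 0.005                   # number of components, learning rate
W = 0.1 * rng.normal(size=(k, 3))   # rows converge to leading eigenvectors

for x in X:
    y = W @ x
    # GHA update: Hebbian term y*x^T minus the contributions of preceding
    # units (np.tril keeps the lower-triangular part of y*y^T, i.e. each
    # unit sees the input deflated by the units before it).
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

print(abs(W[0, 0]), abs(W[1, 1]))   # both near 1 once converged
```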
Terry Sanger

Instructions for retrieving latex documents:
ftp ftp.ai.mit.edu
login: anonymous
password: your-net-address
cd pub/sanger-papers
get pca.bib
get oja.tex
quit
latex oja
lpr oja.dvi

From brutlag at cmgm.stanford.edu Wed Dec 8 20:48:58 1993 From: brutlag at cmgm.stanford.edu (Doug Brutlag) Date: Wed, 8 Dec 93 17:48:58 -0800 Subject: Intelligent Systems for Molecular Biology Message-ID: Last year's version of the following conference contained many papers that involved neural networks. Hence, I thought that some of the readers of this mailing list might be interested. Doug Brutlag ***************** CALL FOR PAPERS ***************** The Second International Conference on Intelligent Systems for Molecular Biology August 15-17, 1994 Stanford University

Organizing Committee: Russ Altman, Stanford U, Stanford; Doug Brutlag, Stanford U, Stanford; Peter Karp, SRI, Menlo Park; Richard Lathrop, MIT, Cambridge; David Searls, U Penn, Philadelphia.

Deadlines: Papers due: March 11, 1994. Replies to authors: April 29, 1994. Revised papers due: May 27, 1994.

Program Committee: K. Asai, ETL, Tsukuba; A. Lapedes, LANL, Los Alamos; D. Benson, NCBI, Bethesda; M. Mavrovouniotis, Northwestern U, Evanston; B. Buchanan, U of Pittsburgh; G. Michaels, George Mason U, Fairfax; C. Burks, LANL, Los Alamos; G. Myers, U. Arizona, Tucson; D. Clark, ICRF, London; K. Nitta, ICOT, Tokyo; F. Cohen, UCSF, San Francisco; C. Rawlings, ICRF, London; T. Dietterich, OSU, Corvallis; J. Sallatin, LIRM, Montpellier; S. Forrest, UNM, Albuquerque; C. Sander, EMBL, Heidelberg; J. Glasgow, Queen's U., Kingston; J. Shavlik, U Wisconsin, Madison; P. Green, Wash U, St. Louis; D. States, Wash U, St. Louis; M. Gribskov, SDSC, San Diego; G. Stormo, U Colorado, Boulder; D. Haussler, UCSC, Santa Cruz; E. Uberbacher, ORNL, Oak Ridge; S. Henikoff, FHRC, Seattle; M. Walker, Stanford U, Stanford; L. Hunter, NLM, Bethesda; T. Webster, Stanford U, Stanford; T. Klein, UCSF, San Francisco; X. Zhang, TMC, Cambridge.

The Second International Conference on Intelligent Systems for Molecular Biology will take place at Stanford University in the San Francisco Bay Area, August 14-17, 1994. The ISMB conference, held for the first time last summer in Bethesda, MD, attracted an overflow crowd, yielded an excellent offering of papers, invited speakers, posters and tutorials, provided an exciting opportunity for researchers to meet and exchange ideas, and was an important forum for the developing field. We will continue the tradition of pre-published, rigorously refereed proceedings, and opportunities for fruitful personal interchange. The conference will bring together scientists who are applying the technologies of advanced data modeling, artificial intelligence, neural networks, probabilistic reasoning, massively parallel computing, robotics, and related computational methods to problems in molecular biology. We invite participation from both developers and users of any novel system, provided it supports a biological task that is cognitively challenging, involves a synthesis of information from multiple sources at multiple levels, or in some other way exhibits the abstraction and emergent properties of an "intelligent system." The four-day conference will feature introductory tutorials (August 14), presentations of original refereed papers and invited talks (August 15-17). Paper submissions should be single-spaced, 12 point type, 12 pages maximum including title, abstract, figures, tables, and bibliography with titles. The first page should include the full postal address, electronic mailing address, telephone and FAX number of each author. Also, please list five to ten keywords describing the methods and concepts discussed in the paper. State whether you wish the paper to be considered for oral presentation only, poster presentation only or for either presentation format. Submit 6 copies to the address below.
For more information, please contact ismb at camis.stanford.edu. Proposals for introductory tutorials must be well documented, including the purpose and intended audience of the tutorial as well as previous experience of the author in presenting such material. Those considering submitting tutorial proposals are strongly encouraged to submit a one-page outline, before the deadline, to enable early feedback regarding topic and content suitability. The conference will pay an honorarium and support, in part, the travel expenses of tutorial speakers. Limited funds are available to support travel to ISMB-94 for those students, post-docs, minorities and women who would otherwise be unable to attend. Please submit papers and tutorial proposals to: Intelligent Systems for Molecular Biology c/o Dr. Douglas L. Brutlag Beckman Center, B400 Department of Biochemistry Stanford University School of Medicine Stanford, California 94305-5307 From sylee%eekaist.kaist.ac.kr at daiduk.kaist.ac.kr Fri Dec 10 11:14:43 1993 From: sylee%eekaist.kaist.ac.kr at daiduk.kaist.ac.kr (Soo-Young Lee) Date: Fri, 10 Dec 93 11:14:43 KST Subject: Postdoc and Graduate Studies Message-ID: <9312100214.AA02590@eekaist.kaist.ac.kr> Subject: Postdoc/Graduate Study - Neural Net Applications and Implementation From: "Soo-Young Lee" POSTDOCTORAL POSITION / GRADUATE STUDENTS Computation and Neural Systems Laboratory Department of Electrical Engineering Korea Advanced Institute of Science and Technology A postdoctoral position is available beginning after March 1st, 1994. The position is for one year initially, and may be extended for another year. Graduate students with full scholarship are also welcome, especially from developing countries. We are seeking individuals interested in research on neural net applications and/or VLSI implementation.
We especially emphasize a "systems" approach, which combines neural network theory, application-specific knowledge, and hardware implementation technology for much better performance. Although many applications are currently under investigation, speech recognition is the preferred choice at this moment. Interested parties should send a C.V. and a brief statement of research interests to the address listed below. Present address: Prof. Soo-Young Lee Computation and Neural Systems Laboratory Department of Electrical Engineering Korea Advanced Institute of Science and Technology 373-1 Kusong-dong, Yusong-gu Taejon 305-701 Korea (South) Fax: +82-42-869-3410 E-mail: sylee at ee.kaist.ac.kr RESEARCH INTERESTS OF THE GROUP The Korea Advanced Institute of Science and Technology (KAIST) is a unique engineering school which emphasizes graduate studies through high-quality research. All graduate students receive full scholarships, and Ph.D. course students are exempt from military service. The Department of Electrical Engineering is the largest, with 39 professors, 250 Ph.D. course students, 180 Master course students, and 300 undergraduate students. The Computation and Neural Systems Laboratory is led by Prof. Soo-Young Lee, and consists of about 10 Ph.D. course students and about 5 Master course students. The primary focus of this laboratory is to merge neural network theory, VLSI implementation technology, and application-specific knowledge for much better performance in real-world applications. Speech recognition, pattern recognition, and control applications have been emphasized.
Neural network models developed include the Multilayer Bidirectional Associative Memory, an extension of BAM to a multilayer architecture; IJNN (Intelligent Judge Neural Networks) for intelligent ruling on disputes among several low-level classifiers; TAG (Training by Adaptive Gain) for large-scale implementation and speaker adaptation; and a Hybrid Hebbian-Backpropagation Algorithm for MLPs with improved robustness and generalization. The correlation-matrix MBAM chip has been fabricated, and a new analog neuro-chip with on-chip learning is now under design. From ken at phy.ucsf.edu Fri Dec 10 04:58:05 1993 From: ken at phy.ucsf.edu (ken@phy.ucsf.edu) Date: Fri, 10 Dec 93 01:58:05 -0800 Subject: Graduate studies in computational and systems neuroscience Message-ID: <9312100958.AA13745@phybeta.ucsf.EDU> University of California, San Francisco (UCSF) is a leading institute of biomedical research. Its graduate program in Neuroscience is widely regarded as one of the very best such programs. The organization of the Keck Center for Integrative Neuroscience (see below), including the hiring of computational faculty, makes UCSF an exciting location for students interested in theoretical as well as experimental approaches to understanding brain function. UCSF is *not* a reasonable place for those wishing to work on applications of neural networks, as we have no programs in that area. But, for those truly interested in understanding the nervous system and its function, using theoretical and/or experimental methods and remaining solidly based in biology, it is a superb program. I would like to personally encourage theoretically inclined individuals with such an interest to apply. Application deadline is Jan. 15. For further information and application materials, contact Patricia Arrandale: patricia at phy.ucsf.edu; 415-476-2248 (phone); 415-476-4929 (fax).
-------------------------------------------- The Keck Center for Integrative Neuroscience: A completely reconstructed space at UCSF, to open in January, 1994, will house the following seven faculty and their labs in a highly interactive setting: Allan Basbaum: The Neural Substrate of Pain and Pain Control Allison Doupe: The Neural Basis of Vocal Learning in Songbirds Stephen Lisberger: Neural Control of Eye Movements Michael Merzenich: Dynamic Neocortical Processes: Neural Origins of Higher Brain Functions Kenneth Miller: Computational Neuroscience Christof Schreiner: Mammalian Auditory Cortex Michael Stryker: Development and Plasticity of Mammalian Central Visual System Other faculty closely associated with the Center, although not housed in the center itself, include: Howard Fields: Neural Circuitry Underlying Pain Modulation Rob Malenka: Synaptic Plasticity in the Mammalian Central Nervous System Roger Nicoll: Physiology and Pharmacology of CNS Synapses Henry Ralston: Neuronal Organization in Spinal Cord and Thalamus From paulh at hdl.ie Fri Dec 10 10:09:40 1993 From: paulh at hdl.ie (Paul Horan) Date: Fri, 10 Dec 93 15:09:40 GMT Subject: Post-Doc position Message-ID: <9312101509.AA07932@sun1.hdl.ie> Post-Doctoral Research Position Dept of Pure and Applied Physics Trinity College Dublin, Ireland. Applications are invited for a postdoctoral position in the Department of Physics at Trinity College Dublin to work on the integration of semiconductor optical modulators and electronics, as part of a smart pixel neural network project which is currently underway. The research will be carried out in collaboration with a team at the Hitachi Dublin Lab in Trinity College. The successful candidate should have experience in both electronics and optics, preferably in the design and processing of GaAs devices. Applicants should have a PhD. The post will be for two years initially, with the possibility of an extension. Inquiries or applications + CV + 2 referees to: Prof. 
John Hegarty, Head of Dept., Dept of Pure and Applied Physics, Trinity College, Dublin 2, Ireland. Tel +353-1-7021675 Fax +353-1-6711759 email _______________________________________________________ Paul Horan, Hitachi Dublin Lab., Trinity College, Dublin 2, Ireland Fax +353-1-6798926, e-mail paulh at hdl.ie From P.McKevitt at dcs.shef.ac.uk Fri Dec 10 13:00:24 1993 From: P.McKevitt at dcs.shef.ac.uk (Paul Mc Kevitt) Date: Fri, 10 Dec 93 18:00:24 GMT Subject: No subject Message-ID: <9312101800.AA06422@dcs.shef.ac.uk> *PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST* Advance Announcement CALL FOR PAPERS AND PARTICIPATION AAAI-94 Workshop on the Integration of Natural Language and Vision Processing Twelfth National Conference on Artificial Intelligence (AAAI-94) Seattle, Washington, USA 2 days during July 31st-August 4th 1994 Chair: Paul Mc Kevitt Department of Computer Science University of Sheffield WORKSHOP DESCRIPTION There has been a recent move towards considering the integration of perception sources in Artificial Intelligence (AI) (see Dennett 1991 and Mc Kevitt (Guest Ed.) 1994). This workshop will focus on research involved in the integration of Natural Language Processing (NLP) and Vision Processing (VP). Although there has been much progress in developing theories, models and systems in the areas of NLP and VP, there has been little progress on integrating these two subareas of Artificial Intelligence (AI). It is not clear why there has not already been much activity in integrating NLP and VP. Is it because of the long-time reductionist trend in science up until the recent emphasis on chaos theory, non-linear systems, and emergent behaviour?
Or, is it because the people who have tended to work on NLP tend to be in other Departments, or of a different ilk, to those who have worked on VP? We believe it is high time to bring together NLP and VP. Already we have advertised a call for papers for a special issue of the Journal of AI Review to focus on the integration of NLP and VP and we have had a tremendous response. There will be three special issues focussing on theory and applications of NLP and VP. Also, there will be an issue focussing on intelligent multimedia systems. The workshop is of particular interest at this time because research in NLP and VP has advanced to the stage that each can benefit from integrated approaches. Also, such integration is important as people in NLP and VP can gain insight from each other's work. References Dennett, Daniel (1991) Consciousness explained Harmondsworth: Penguin Mc Kevitt, Paul (1994) (Guest Editor) Integration of Natural Language and Vision Processing Special Volume (Issues 1,2,3) of AI Review Journal Dordrecht: Kluwer (forthcoming) WORKSHOP TOPICS: The workshop will focus on three themes: * Theoretical issues on integrated NLP and VP * Systems exhibiting integrated NLP and VP * Intelligent multimedia involving NLP and VP The following issues will be focussed upon during the workshop: * Common representations for NLP and VP * How does NLP help VP and vice-versa? * What does integration buy us? * Symbolic versus connectionist models * Varieties of communication between NLP and VP processors * Designs for integrating NLP + VP * Tools for integrating NLP + VP * Possible applications of integration WORKSHOP FORMAT: Our intention is to have as much discussion as possible during the workshop and to stress panel sessions and discussion rather than having formal paper presentations. We will also organize a number of presentations on Site Descriptions of ongoing work on NLP + VP. There may be a number of invited speakers.
Day 1: Theory and modelling for integrated NLP and VP. Day 2: Systems for integrated NLP/VP, and intelligent multimedia. ATTENDANCE: We hope to have between 25 and 50 attendees at the workshop. SUBMISSION REQUIREMENTS: Papers of not more than 8 pages should be submitted by electronic mail to Paul Mc Kevitt at p.mckevitt at dcs.shef.ac.uk. Preferred format is two columns with 3/4 " margins all round. Papers must be printed to 8 1/2" x 11" size. Double sided printing is encouraged. If you cannot submit your paper by e-mail please submit three copies by snail mail. *******Submission Deadline: March 18th 1994 *******Notification Date: April 8th 1994 *******Camera-ready Copy: April 29th 1994 PUBLICATION: Workshop notes/preprints will be published by AAAI. If there is sufficient interest we will publish a book on the workshop with AAAI Press. WORKSHOP CHAIR: Paul Mc Kevitt Department of Computer Science Regent Court University of Sheffield 211 Portobello Street GB- S1 4DP, Sheffield England, UK, EC. e-mail: p.mckevitt at dcs.shef.ac.uk fax: +44 742 780972 phone: +44 742 825572 (office) 825590 (secretary) WORKSHOP COMMITTEE: Prof. Jerry Feldman (ICSI, Berkeley, USA) Prof. John Frisby (Sheffield, England) Dr. Eduard Hovy (USC ISI, Los Angeles, USA) Dr. Mark Maybury (MITRE, Cambridge, USA) Dr. Ryuichi Oka (RWC, Tsukuba, Japan) Dr. Terry Reiger (ICSI, Berkeley, USA) Prof. Roger Schank (ILS, Illinois, USA) Dr. Oliviero Stock (IRST, Italy) Prof. Dr. Wolfgang Wahlster (DFKI, Germany) Prof.
Yorick Wilks (Sheffield, England) *PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST* From jaap.murre at mrc-apu.cam.ac.uk Mon Dec 13 09:25:51 1993 From: jaap.murre at mrc-apu.cam.ac.uk (Jaap Murre) Date: Mon, 13 Dec 93 14:25:51 GMT Subject: Neurosimulator review available by ftp Message-ID: <12980.9312131425@rigel.mrc-apu.cam.ac.uk> I have recently completed a review of roughly 40 neurosimulators. A first draft of this review is now available through our ftp site. Comments on the current review are very welcome. -- Jaap Murre --------------------------------------------------------------------------- The filename is neurosim1.ps.Z. A WordPerfect 5.1 version (.w51.Z extension) and an ASCII dump (.txt) are also available, as well as uncompressed versions (no .Z extension). To obtain the file use the following command sequence: ftp ftp.mrc-apu.cam.ac.uk NAME: anonymous PASSWORD: cd pub cd nn bin /* if you want to retrieve a binary file */ get neurosim1.ps.Z quit uncompress neurosim1.ps.Z lpr neurosim1.ps --------------------------------------------------------------------------- Dr. Jacob M.J. Murre Medical Research Council Applied Psychology Unit 15 Chaucer Road Cambridge CB2 2EF United Kingdom tel 44 223 355294 (ext.139) fax 44 223 359062 E-mail: Jaap.Murre at MRC-APU.CAM.AC.UK From robbie at prodigal.psych.rochester.edu Mon Dec 13 11:07:38 1993 From: robbie at prodigal.psych.rochester.edu (Robbie Jacobs) Date: Mon, 13 Dec 93 11:07:38 EST Subject: paper available Message-ID: <9312131607.AA27061@prodigal.psych.rochester.edu> The following paper is now available via anonymous ftp from the neuroprose archive. The paper has been accepted for publication in the journal "Cognitive Science." The manuscript is 44 pages.
------------------------------------------------------------------ Encoding Shape and Spatial Relations: The Role of Receptive Field Size in Coordinating Complementary Representations Robert A. Jacobs Stephen M. Kosslyn University of Rochester Harvard University An effective functional architecture facilitates interactions among subsystems that are often used together. Computer simulations showed that differences in receptive field sizes can promote such organization. When input was filtered through relatively small nonoverlapping receptive fields, artificial neural networks learned to categorize shapes relatively quickly; in contrast, when input was filtered through relatively large overlapping receptive fields, networks learned to encode specific shape exemplars or metric spatial relations relatively quickly. Moreover, when the receptive field sizes were allowed to adapt during learning, networks developed smaller receptive fields when they were trained to categorize shapes or spatial relations, and developed larger receptive fields when they were trained to encode specific exemplars or metric distances. In addition, when pairs of networks were constrained to use input from the same type of receptive fields, networks learned a task faster when they were paired with networks that were trained to perform a compatible type of task. Finally, using a novel modular architecture, networks were not pre-assigned a task, but rather competed to perform the different tasks. Networks with small nonoverlapping receptive fields tended to win the competition for categorical tasks whereas networks with large overlapping receptive fields tended to win the competition for exemplar/metric tasks. 
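[Editor's note: the receptive-field manipulation described in the abstract above can be illustrated with a toy mask generator. This is my own sketch, not the authors' code; the input size, unit count, and widths are invented for the demonstration.]

```python
import numpy as np

def receptive_fields(n_in, n_units, width):
    """Return an (n_units, n_in) binary mask of equally spaced 1-D
    receptive fields. A small `width` gives nonoverlapping fields;
    a large `width` makes neighbouring fields overlap."""
    centers = np.linspace(0, n_in - 1, n_units)
    idx = np.arange(n_in)
    return (np.abs(idx[None, :] - centers[:, None]) <= width / 2).astype(float)

small = receptive_fields(n_in=24, n_units=6, width=4)   # nonoverlapping
large = receptive_fields(n_in=24, n_units=6, width=12)  # overlapping

# An input pattern reaches the network only through the mask:
x = np.random.default_rng(2).random(24)
coarse_view = small @ x   # each unit sees a small, disjoint patch
broad_view = large @ x    # each unit sees a large, shared patch
```

In the experiments summarized above, the small/nonoverlapping and large/overlapping conditions correspond to the two mask settings here; which kind of network "wins" a task then depends on which view of the input it receives.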
------------------------------------------------------------------ FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/jacobs.rfield.ps.Z Robbie Jacobs robbie at psych.rochester.edu From jaap.murre at mrc-apu.cam.ac.uk Mon Dec 13 11:35:11 1993 From: jaap.murre at mrc-apu.cam.ac.uk (Jaap Murre) Date: Mon, 13 Dec 93 16:35:11 GMT Subject: Workshop on Language Acquisition Message-ID: <13227.9312131635@rigel.mrc-apu.cam.ac.uk> *************** CALL FOR PAPERS ****************************************************** Workshop on 'Cognitive Models of Language Acquisition' April 21-23, 1994 Tilburg University, The Netherlands ****************************************************** Organizers: Peter Broeder (Tilburg) Jaap Murre (Cambridge) Scientific committee: Melissa Bowerman (Nijmegen) Peter Coopmans (Utrecht) Guus Extra (Tilburg) Peter Jordens (Amsterdam) Sponsored by: L.O.T. (Landelijke Onderzoeksschool Taalkunde), The Dutch national Ph.D. program in Linguistics AIM OF THE WORKSHOP The workshop is centered around two basic questions with respect to the nature and origins of language as "an individual phenomenon": (1) What constitutes knowledge of language? (2) How is knowledge of language acquired? Currently, these questions are being addressed within different cognitive models of language acquisition which derive from strongly contrasting research paradigms. The paradigms start from fundamentally different assumptions about language (symbolic or subsymbolic) and the mechanisms that drive the process of language acquisition (inductive or deductive). The workshop will focus on processes of language acquisition in children and adults and on modelling these processes. In particular, the acquisition and representation of words will be a central topic.
The workshop aims to bring together researchers willing to discuss the merits and constraints of the various models based on the interdisciplinary approaches of linguistics, psychology, cognitive science, NLP, and AI. PARTICIPANTS Melissa Bowerman (Nijmegen), Harald Clahsen (Colchester), Vivian Cook (Colchester), Peter Coopmans (Utrecht), Walter Daelemans (Tilburg), Guus Extra (Tilburg), Michael Gasser (Indiana), Steven Gillis (Antwerp), Peter Jordens (Amsterdam), Gerard Kempen (Leiden), Brian MacWhinney (Pittsburgh), Paul Meara (Swansea), Dennis Norris (Cambridge), Kim Plunkett (Oxford), Henk van Riemsdijk (Tilburg), Mike Sharwood-Smith (Utrecht), Paul Smolensky (Colorado), Sven Stromqvist (Goteborg). ABSTRACTS We invite those interested to submit a two-page abstract (for a 30 minute oral presentation) by January 15, 1994. We would prefer to receive the abstracts by e-mail. The organizers can be contacted at the following addresses:

Peter Broeder, Department of Linguistics, University of Tilburg, P.O. Box 90153, 5000 LE Tilburg, The Netherlands; tel: +31 13-662239; fax: +31 13-663110; e-mail: peter.broeder at kub.nl

Jaap Murre, Medical Research Council Applied Psychology Unit, 15 Chaucer Road, Cambridge CB2 2EF, United Kingdom; tel: +44 223 355294; fax: +44 223 359062; e-mail: jaap.murre at mrc-apu.cam.ac.uk

From hava at bimacs.cs.biu.ac.il Mon Dec 13 13:46:31 1993 From: hava at bimacs.cs.biu.ac.il (Siegelmann Hava) Date: Mon, 13 Dec 93 20:46:31 +0200 Subject: TR available: Complexity of Training Message-ID: <9312131846.AA29216@bimacs.cs.biu.ac.il> FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/siegelmann.training.ps.Z The file siegelmann.training.ps.Z is now available for copying from the Neuroprose repository: ===================================================================== On the Complexity of Training Neural Networks with Continuous Activation Functions (30 pages) Bhaskar DasGupta (Minnesota), Hava T. Siegelmann (Bar-Ilan), Eduardo D. Sontag (Rutgers) ====================================================================== We deal with computational issues of loading a fixed-architecture neural network with a set of positive and negative examples. This is the first result on the hardness of loading networks which do not consist of binary-threshold neurons, but rather utilize a particular continuous activation function, commonly used in the neural network literature. We observe that the loading problem is solvable in polynomial time if the input dimension is constant. Otherwise, however, any possible learning algorithm based on particular fixed architectures faces severe computational barriers. Similar theorems have already been proved by Megiddo and by Blum and Rivest, for the case of binary-threshold networks only. Our theoretical results lend further justification to the use of incremental (architecture-changing) techniques for training networks. From gjg at cns.edinburgh.ac.uk Tue Dec 14 16:11:08 1993 From: gjg at cns.edinburgh.ac.uk (Geoffrey Goodhill) Date: Tue, 14 Dec 93 21:11:08 GMT Subject: Preprints available Message-ID: <25549.9312142111@cns.ed.ac.uk> FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/goodhill.normalization.ps.Z FTP-filename: /pub/neuroprose/goodhill.elastic.ps.Z The following two papers have been accepted for publication in "Neural Computation": for abstracts and information on obtaining preprints, see below. "The Role of Weight Normalization in Competitive Learning" - Goodhill, G.J. and Barrow, H.G. "Elastic Net Model of Ocular Dominance: Overall Stripe Pattern and Monocular Deprivation" - Goodhill, G.J. and Willshaw, D.J. Geoff Goodhill (gjg at cns.ed.ac.uk) -------------------- Instructions for obtaining by anonymous ftp: % ftp archive.cis.ohio-state.edu (128.146.8.52) Name: anonymous ftp> binary ftp> cd pub/neuroprose ftp> get goodhill.normalization.ps.Z ftp> get goodhill.elastic.ps.Z ftp> quit % uncompress .....
The papers are of length 11 pages and 6 pages respectively. --------------------- The Role of Weight Normalization in Competitive Learning Goodhill, G.J. and Barrow, H.G. The effect of different kinds of weight normalization on the outcome of a simple competitive learning rule is analysed. It is shown that there are important differences in the representation formed depending on whether the constraint is enforced by dividing each weight by the same amount (``divisive enforcement''), or subtracting a fixed amount from each weight (``subtractive enforcement''). For the divisive cases weight vectors spread out over the space so as to evenly represent ``typical'' inputs, whereas for the subtractive cases the weight vectors tend to the axes of the space, so as to represent ``extreme'' inputs. The consequences of these differences are examined. ----- Elastic Net Model of Ocular Dominance: Overall Stripe Pattern and Monocular Deprivation Goodhill, G.J. and Willshaw, D.J. The elastic net \cite{durwil87} can account for the development of both topography and ocular dominance in the mapping from the lateral geniculate nucleus to primary visual cortex. Here it is further shown for this model that (a) the overall pattern of stripes produced is strongly influenced by the shape of the cortex: in particular stripes with a global order similar to that seen biologically can be produced under appropriate conditions, and (b) the observed changes in stripe width associated with monocular deprivation are reproduced in the model. From jaap.murre at mrc-apu.cam.ac.uk Fri Dec 17 13:57:46 1993 From: jaap.murre at mrc-apu.cam.ac.uk (Jaap Murre) Date: Fri, 17 Dec 93 18:57:46 GMT Subject: Six papers available by ftp Message-ID: <29396.9312171857@rigel.mrc-apu.cam.ac.uk> Six reports by the Leiden Connectionist Group have recently been placed in our ftp site. Retrieval instructions can be found at the end of this message. 
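[Editor's note: the divisive-versus-subtractive contrast in the Goodhill and Barrow abstract above can be made concrete with a small sketch. This is my own illustration, not the authors' analysis or code; the fixed input vector, learning rate, and the non-negativity clip on the subtractive rule are invented assumptions for the demonstration.]

```python
import numpy as np

def update_divisive(w, x, lr=0.05):
    """Hebbian-style growth with divisive enforcement: every weight is
    divided by the same factor so the weights keep summing to 1."""
    w = w + lr * x
    return w / w.sum()

def update_subtractive(w, x, lr=0.05):
    """Same growth with subtractive enforcement: the same amount is
    subtracted from every weight to restore the sum, and weights are
    clipped at zero (an illustrative assumption)."""
    w = w + lr * x
    w = w - (w.sum() - 1.0) / len(w)
    return np.clip(w, 0.0, None)

x = np.array([0.9, 0.5, 0.3, 0.1])   # one fixed "input" direction (made up)
w_div = np.full(4, 0.25)
w_sub = np.full(4, 0.25)
for _ in range(2000):
    w_div = update_divisive(w_div, x)
    w_sub = update_subtractive(w_sub, x)

# Divisive: w_div ends up proportional to x, representing the "typical"
# input. Subtractive: the weight piles onto the largest component of x,
# an "extreme" representation near an axis of weight space.
```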
-- Jaap Murre --------------------------------------------------------------------------- File: modann1.ps.Z Title: Designing Modular Artificial Neural Networks Authors: Egbert J.W. Boers, Herman Kuiper, Bart L.M. Happel, and Ida G. Sprinkhuizen-Kuyper Abstract: This paper presents a method for designing artificial neural network architectures. The method implies a reverse engineering of the processes resulting in the mammalian brain. The method extends the brain metaphor in neural network design with genetic algorithms and L-systems, modelling natural evolution and growth. It will be argued that a principle of modularity, which is inherent to the design method as well as the resulting network architectures, improves network performance. E-mail comments to boers at WI.LeidenUniv.nl or to happel at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: bsp400.ps.Z Title: The BSP400: A Modular Neurocomputer Authors: Jan N.H. Heemskerk, Jaap Hoekstra, Jacob M.J. Murre, Leon H.J.G. Kemna, and Patrick T.W. Hudson Abstract: This paper discusses the main architectural issues, the implementation, and the performance of a parallel neurocomputer, the Brain-Style Processor or BSP400. This project presents a feasibility study for larger parallel neurocomputers. The design principles are hardware modularity, simple processors, and in situ (local) learning. The modular approach of the design ensures extensibility of the present version. The BSP400 consists of 25 Modules (boards) each containing 16 simple 8-bit single-chip computers. The Module boards are connected to a dedicated connection network. The architectural configuration of the BSP400 supports local activation and learning rules. The ability to communicate activations with the outside world in real-time makes the BSP400 particularly suited for real-world applications. 
The present version implements a modular type of neural network, the CALM (Categorizing And Learning Module) neural network. In this implementation of CALM, activations are transmitted as single bits, but an internal representation of one byte is kept for both activations and weights. The system has a capacity of 400 processing elements and 32,000 connections. Even with slow and simple processing elements, it still achieves a speed of 6.4 million connections per second for a non-learning CALM network. Some small network simulation studies carried out on the BSP400 are reported. A comparison with a design study (Mark III and Mark IV) is made. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: schema.ps.Z Title: A Real-Time Neural Implementation of a Schema Driven Toy-Car Authors: Jan N.H. Heemskerk and Fred A. Keijzer Abstract: An actual implementation of a schema driven toy-car is presented in this paper. The car is equipped with two motors and four light sensors. Supervised learning behavior of the car is achieved by using a neural network with adaptive connections. The car can be taught to drive towards a light and avoid obstacles. The controlling neural network is implemented on the BSP400 neurocomputer, a Brain Style Processor with 400 nodes. A subset of the digital nodes in the BSP400 is connected by fixed weights to form logical circuits in order to re-train the car. In this way, cooperative computation of both 'logical' and 'neural' processes is integrated into one system. The actions of the car are described in terms of both distal and proximal schemas. These correspond respectively to a description of the car's actions in terms of distal system environmental stimuli and effects, and a description of the local routines carried out by the system. The latter are restricted to the specific situatedness and embodiment of the acting system.
The distinction between distal and proximal schemas is presented as a way to link neural structure to adaptive action. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: mindshap.ps.Z Title: MindShape: a neurocomputer concept based on a fractal architecture Authors: Jan N.H. Heemskerk, Jacob M.J. Murre, Arend Melissant, Mirko Pelgrom, and Patrick T.W. Hudson Abstract: A parallel architecture for implementing massive neural networks, called MindShape, is presented. MindShape is the successor to the Brain Style Processor, a 400-processor neurocomputer based on the principle of modularity. The MindShape machine consists of Neural Processing Elements (NPEs) and Communication Elements (CEs) organized in what we have called a fractal architecture. The architecture is by definition scalable, permitting the implementation of very large networks consisting of many thousands of nodes. Through simulations of data-communication flow on different architectures, and through implementation studies of VLSI hardware on a chip simulator, the specific requirements of the CEs and the NPEs have been investigated. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: optim.ps.Z Title: Implementation of Optimization Networks in Synchronous Massively Parallel Hardware Authors: Jan N.H. Heemskerk, Peter A. Starreveld, and Patrick T.W. Hudson Abstract: In this paper, implementation possibilities of a synchronous binary neural model for solving optimization problems in massively parallel hardware are studied. It is argued that synchronous, as opposed to asynchronous, models are best suited to the general characteristics of massively parallel architectures. In this study the massively parallel target device is the BSP400 (Brain Style Processor with 400 nodes).
The updating of the nodes in the BSP400 is synchronous and the nodes can only process local data (i.e., activations). The synchronous models studied were introduced by Takefuji [7] and make use of both local and global operators. The functionality of these operators with regard to the quality of the solutions was examined through software simulations. Fully digital neurocomputers such as the BSP400 offer sufficient flexibility for programming local operations at node level. The possibilities of translating the function of global operators into local operations were also studied. The aim of this study is to combine massively parallel hardware with synchronous neural network models for optimization problems in order to get both high speed and high quality of the solutions. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: pictwrd.ps.Z Title: A Connectionist Model for Context Effects in the Picture-Word Interference Task Authors: Peter A. Starreveld and Jan N. H. Heemskerk Abstract: In the picture-word interference task, two context effects can be distinguished: the semantic interference effect and the orthographic facilitation effect. A theory is described to explain these effects. This theory was implemented in a connectionist model. The model is able to simulate the time courses of the two context effects and their interaction.
E-mail comments to STARREVE at rulfsw.LeidenUniv.nl
---------------------------------------------------------------------------
To obtain the file use the following command sequence:
ftp ftp.mrc-apu.cam.ac.uk
NAME: anonymous
PASSWORD:
cd pub
cd nn
bin
get
quit
decompress
lpr -P
---------------------------------------------------------------------------
The PPSG, founded in 1986, exists to foster development of parallel architectures, languages and applications & to disseminate information on parallel processing. Membership is completely open; you do not have to be a member of the British Computer Society. For further information about the group contact either of the following:
Chair: Mr. A. Gupta, Philips Research Labs, Crossoak Lane, Redhill, Surrey, RH1 5HA, UK (gupta at prl.philips.co.uk)
Membership Secretary: Dr. N. Tucker, Paradis Consultants, East Berriow, Berriow Bridge, North Hill, Nr. Launceston, Cornwall, PL15 7NL, UK
Please share this information and display this announcement
The British Computer Society Parallel Processing Specialist Group (BCS PPSG)
General Purpose Parallel Computing
22 December 1993, Fyvie Hall, 309 Regent Street, University of Westminster, London, UK
Provisional Programme
9 am-10 am Registration & Coffee
L. Valiant, Harvard University, "Title to be announced"
W. McColl, Oxford University, Programming models for General Purpose Parallel Computing
A. Chin, King's College, London University, Locality of Reference in Bulk-Synchronous Parallel Computation
P. Thannisch et al, Edinburgh University, Exponential Processor Requirements for Optimal Schedules in Architecture with Locality
Lunch
D. May, Inmos, "Title to be announced"
R. Miller, Oxford University, A Library for Bulk Synchronous Parallel Programming
C. Jesshope et al, Surrey University, BSPC and the N-Computer
Tea/Coffee
P. Dew et al, Leeds University, Scalable Parallel Computing using the XPRAM model
S. Turner et al, Exeter University, Portability and Parallelism with `Lightweight P4'
N. Kalentery et al, University of Westminster, From BSP to a Virtual Von Neumann Machine
R.
Bisseling, Utrecht University, Scientific Computing on Bulk Synchronous Parallel Architectures
B. Thompson et al, University College of Swansea, Equational Specification of Synchronous Concurrent Algorithms and Architectures
5.30 pm Close
Please share this information and display this announcement
The British Computer Society Parallel Processing Specialist Group
Booking Form/Invoice BCS VAT No.: 440-3490-76
Please reserve a place at the Conference on General Purpose Parallel Computing, London, December 22 1993, for the individual(s) named below.
Name of delegate BCS membership no. Fee VAT Total (if applicable)
___________________________________________________________________________
___________________________________________________________________________
___________________________________________________________________________
Cheques, in pounds sterling, should be made payable to "BCS Parallel Processing Specialist Group". Unfortunately credit card bookings cannot be accepted.
The delegate fees (including lunch, refreshments and proceedings) are (in pounds sterling):
Members of both PPSG & BCS:  55 +  9.62 VAT =  64.62
PPSG or BCS members:         70 + 12.25 VAT =  82.25
Non members:                 90 + 15.75 VAT = 105.75
Full-time students:          25 +  4.37 VAT =  29.37
(Students should provide a letter of endorsement from their supervisor that also clearly details their institution)
Contact Address: ___________________________________________
___________________________________________
___________________________________________
Email address: _________________ Date: _________________
Day time telephone: ________________
Places are limited so please return this form as soon as possible to: Mrs C.
Cunningham, BCS PPSG, 2 Mildenhall Close, Lower Earley, Reading, RG6 3AT, UK (Phone 0734 665570)
From pf2 at st-andrews.ac.uk Thu Dec 2 06:24:27 1993
From: pf2 at st-andrews.ac.uk (Peter Foldiak)
Date: Thu, 2 Dec 93 11:24:27 GMT
Subject: Lectureship in St Andrews
Message-ID: <27004.9312021124@psych.st-andrews.ac.uk>
UNIVERSITY OF ST ANDREWS
LECTURESHIP IN THE SCHOOL OF PSYCHOLOGY
Applications are invited for the above post, which is made available following the retirement of Professor MA Jeeves. Individuals with a strong research record in an area of psychology related to existing research strengths in the School are encouraged to apply. The successful candidate will be expected to make a significant contribution to the School's research activity, and to contribute to undergraduate and graduate teaching programmes. The appointment is available from 1 September 1994. The University is prepared to consider appointing at the senior lectureship or readership level in light of an assessment of the quality of the field available to it. The salary will be at the appropriate point on the Academic payscale, GBP 13601 - GBP 29788 per annum. Application forms and further particulars are available from Personnel Services, University of St Andrews, College Gate, St Andrews KY16 9AJ, UK (tel: +44 334 62562, out of hours +44 334 62571, or by fax +44 334 62570), to whom completed forms accompanied by a letter of application and CV should be returned to arrive not later than 10 December 1993. PLEASE QUOTE REFERENCE NUMBER SL/APS0001. The University operates an Equal Opportunities Policy.
From kaylan%TRBOUN.BITNET at FRMOP11.CNUSC.FR Thu Dec 2 05:18:58 1993
From: kaylan%TRBOUN.BITNET at FRMOP11.CNUSC.FR (kaylan%TRBOUN.BITNET@FRMOP11.CNUSC.FR)
Date: Thu, 02 Dec 1993 12:18:58 +0200
Subject: ESS'94 - Call For Papers
Message-ID: <00976695.E0F01620.16208@trboun.bitnet>
ESS'94 EUROPEAN SIMULATION SYMPOSIUM
CALL FOR PAPERS
ISTANBUL, TURKEY, OCTOBER 9-12, 1994
HOSTED BY BOGAZICI UNIVERSITY
Organized and sponsored by: The Society for Computer Simulation International (SCS)
With cooperation of: The European Simulation Council (ESC); Ministry of Industry and Trade, Turkey; Operational Research Society of Turkey (ORST)
Cosponsored by: Bekoteknik, Digital, Hewlett Packard, IBM Turk
Main Topics:
* Advances in Simulation Methodology and Practices
* Artificial Intelligence in Simulation
* Innovative Simulation Technologies
* Industrial Simulation
* Computer and Telecommunication Systems
CONFERENCE COMMITTEE
Conference Chairman: Prof. Dr. Tuncer I. Oren, University of Ottawa, Computer Science Department, 150 Louis Pasteur / Pri., Ottawa, Ontario, Canada K1N 6N5. Phone: 1.613.654.5068 Fax: 1.613.564.7089 E-mail: oren at csi.uottawa.ca
Program Chairman: Prof. Dr. Ali Riza Kaylan, Bogazici University, Dept. of Industrial Engineering, 80815 Bebek, Istanbul, Turkey. Phone: 90.212.2631540/2072 Fax: 90.212.2651800 E-Mail: Kaylan at trboun.bitnet
Program Co-chairman: Prof. Dr. Axel Lehmann, Universitaet der Bundeswehr, Munchen, Institut fur Technische Informatik, Werner-Heisenberg-Weg 39, D 85577 Neubiberg, Germany. Phone: 49.89.6004.2648/2654 Fax: 49.89.6004.3560 E-Mail: Lehmann at informatik.unibw-muenchen.de
Finance Chairman: Rainer Rimane, University of Erlangen - Nurnberg
Organization Committee: Ali Riza Kaylan, Yaman Barlas, Murat Draman, Levent Mollamustafaoglu, Tulin Yazgac
International Program Committee (Preliminary): O. Balci, USA; J. Banks, USA; G. Bolch, Germany; R. Crosbie, USA; B. Delaney, USA; M. S. Elzas, Netherlands; H. Erkut, Turkey; A. Eyler, Turkey; P. Fishwick, USA; E.
Gelenbe, USA; A. Guasch, Spain; M. Hitz, Austria; R. Huntsinger, USA; G. Iazeolla, Italy; K. Irmscher, Germany; K. Juslin, Finland; A. Javor, Hungary; E. Kerckhoffs, Netherlands; J. Kleijnen, Netherlands; M. Kotva, Czech Rep.; M. Koksalan, Turkey; M. L. Pagdett, USA; M. Pior, Germany; R. Reddy, USA; S. Reddy, USA; B. Schmidt, Germany; S. Sevinc, Australia; H. Szczerbicka, Germany; S. Tabaka, Japan; O. Tanir, Canada; G. Vansteenkiste, Belgium; M. Wildberger, USA; S. Xia, UK; R. Zobel, UK
CONFERENCE INFORMATION
The ESS series (organized by SCS, the Society for Computer Simulation International) is now in its fifth year. SCS is an international non-profit organization founded in 1952. On a yearly basis SCS organizes 6 simulation conferences worldwide, cooperates in 2 others, and publishes the monthly magazine Simulation, a quarterly Transactions, and books. For more information, please tick the appropriate box on the reply card.
During ESS'94 the following events will be presented besides the scientific program:
Professional Seminars
The first day of the conference is dedicated to professional seminars, which will give interested participants a state-of-the-art overview of each of the five main themes of this conference. The participation fee is included in the conference registration fee. If you have suggestions for other advanced tutorial topics, please contact one of the program chairmen.
Exhibits
An exhibition will be held in the central hall where all participants meet for coffee and tea. There will be a special exhibition section for universities and non-profit organizations, and a special section for publishers and commercial stands. If you would like to participate in the exhibition, please contact the SCS European Office.
Vendor Sessions, Demonstrations and Video Presentations
For demonstrations or video sessions, please contact SCS International at the European Office. Special sessions within the scientific program will be set up for vendor presentations.
Other Organized Meetings
Several user group meetings for simulation languages and tools will be organized on Monday. It is possible to have other meetings on Monday as well. If you would like to arrange a meeting, please contact the Conference Chairman. We will be happy to provide a meeting room and other necessary equipment.
VENUE
Istanbul, the only city in the world built on two continents, stands on the shores of the Istanbul Bogazi (Bosphorus), where the waters of the Black Sea mingle with those of the Sea of Marmara and the Golden Horn. Here on this splendid site, Istanbul guards the precious relics of three empires of which she has been the capital; a unique link between East and West, past and present. Istanbul has infinite variety: museums, ancient churches, palaces, great mosques, bazaars and the Bosphorus. However long you stay, just a few days or longer, your time will be wonderfully filled in this unforgettable city.
Bogazici University, which will host ESS'94, has its origins in Robert College, the first American college founded outside the United States, in 1863. It has a well-deserved reputation for academic excellence and accordingly attracts students from among the best and brightest in Turkey. The University is composed of four faculties, six institutes (offering graduate programs), and two other schools.
The conference location is the Istanbul Dedeman, an international five-star hotel located in the center of the city with a spectacular view of the Bosphorus. It is very close to most of the historical sites as well as to the business center. For conference participants the special single-room rate is 65 US dollars.
SCIENTIFIC PROGRAM
The 1994 SCS European Simulation Symposium is structured around the following five major themes. A parallel track will be devoted to each of the five topics. The conference language is English.
* Advances in Simulation Methodology and Practices, e.g.:
  - Advanced Modelling, Experimentation, and Output Analysis and Display
  - Object-Oriented System Design and Simulation
  - Optimization of Simulation Models
  - Validation and Verification Techniques
  - Mixed Methodology Modelling
  - Special Simulation Tools and Environments
* Artificial Intelligence in Simulation, e.g.:
  - Knowledge-based Simulation Environments and Knowledge Bases
  - Knowledge-based System Applications
  - Reliability Assurance through Knowledge-based Techniques
  - Mixed Qualitative and Quantitative Simulation
  - Neural Networks in Simulation
* Innovative Simulation Technologies:
  - Virtual Reality
  - Multimedia Applications
* Industrial Simulation, e.g. Simulation in:
  - Design and Manufacturing, CAD, CIM
  - Process Control
  - Robotics and Automation
  - Concurrent Engineering, Scheduling
* Computer and Telecommunication Systems, e.g.:
  - Circuit Simulation, Fault Simulation
  - Computer Systems
  - Telecommunication Devices and Systems
  - Networks
INVITED SPEAKERS
Focusing on the main tracks of the conference, invited speakers will give special in-depth presentations in plenary sessions, which will be included in the proceedings of the conference.
BEST PAPER AWARDS
The 1994 European Simulation Symposium will award the best five papers, one in each of the five tracks. From these five papers, the best overall paper of the conference will be chosen. The awarded papers will be published in an international journal, if necessary after modifications have been incorporated.
DEADLINES AND REQUIREMENTS
Extended abstracts (300 words, 2-3 pages for full papers; 150 words, 1 page for short papers; typewritten without drawings and tables) are due to arrive in QUADRUPLICATE at the office of Ali Riza Kaylan, at the Industrial Engineering Department of Bogazici University, TURKEY, before March 1, 1994. Only original papers, written in English, which have not previously been published elsewhere will be accepted.
If you want to organize a panel discussion, please contact the program chairmen. Authors are expected to register early (at a reduced fee) and to attend the conference at their own expense to present the accepted papers. If early registration and payment are not made, the paper will not be published in the conference proceedings. In the case of multiple authors, one author should be identified as the correspondent for the paper. Abstracts will be reviewed by three members of the International Program Committee for full papers and by one member for short papers. Notification of acceptance or rejection will be sent by April 30, 1994. An author kit with complete instructions for preparing a camera-ready copy for the proceedings will be sent to authors of accepted abstracts. The camera-ready copy of the papers must be received by July 15, 1994. Only the full papers, which are expected to be 5-6 pages long, will be published in the conference proceedings. In order to guarantee a high-quality conference, the full papers will be reviewed as well, to check whether the suggestions of the program committee have been incorporated. The nominees for the best paper awards will be selected as well.
REGISTRATION FEE
                       Author           SCS members      Other participants
Registration before    BF 15000         BF 15000         BF 17000
August 31, 1994        (375 ECU)        (375 ECU)        (425 ECU)
Registration after     Preregistration  BF 17000         BF 20000
August 31, 1994,       required         (425 ECU)        (500 ECU)
or at the conference
The registration fee includes one copy of the Conference Proceedings, attendance at the professional seminars, coffee and tea during the breaks, all lunches, a welcome cocktail and the conference dinner.
CORRESPONDENCE ADDRESS
Philippe Geril, The Society for Computer Simulation, European Simulation Office, University of Ghent, Coupure Links 653, B-9000 Ghent, Belgium.
Phone (Office): 32.9.233.77.90 Phone (Home): 32.59.800.804 Fax (Office): 32.9.223.49.41 E-Mail: Philippe.Geril at rug.ac.be
REPLY CARD
Family Name:
First Name:
Occupation and/or Title:
Affiliation:
Mailing Address:
Zip: City: Country:
Telephone: Fax:
E-mail:
Yes, I intend to attend the European Simulation Symposium ESS'94:
o Proposing a paper
o Proposing a panel discussion
o Participating in a vendor session
o Contributing to the exhibition
o Without presenting a paper
The provisional title of my paper / poster / exhibited tool is:
With the following topics:
The paper belongs to the category (please tick one):
o Advances in Simulation Methodology and Practices
o Artificial Intelligence in Simulation
o Innovative Simulation Technologies
o Industrial Simulation
o Computer and Telecommunication Systems
The paper will be submitted as a:
o Full paper
o Short paper
o Poster session
o Demonstration
Other colleague(s) interested in the topics of the conference is/are:
Name: Address:
Name: Address:
If you would like to receive more information about SCS and its activities, please tick the following box:
o YES, I would like to know more about SCS.
Please mail this card immediately to: Philippe Geril, The Society for Computer Simulation, European Simulation Office, University of Ghent, Coupure Links 653, B-9000 Ghent, Belgium.
=============================================================================
Prof. Dr. Ali R. Kaylan, Director of Computer Center, Bogazici University, Dept. of Industrial Eng'g., Bebek 80815, Istanbul, TURKIYE
e-mail: Kaylan at Trboun.Bitnet
fax-no: (90-1)265 63 57 or (90-1)265 93 62
phone: (90-1)265 93 62, (90-1)263 15 40 ext.
1445,1727,1407
=============================================================================
From plunkett at dragon.psych Thu Dec 2 10:04:26 1993
From: plunkett at dragon.psych (plunkett (Kim Plunkett))
Date: Thu, 2 Dec 93 15:04:26 GMT
Subject: No subject
Message-ID: <9312021504.AA20456@dragon.psych.pdp>
Lecturer in Cognitive Psychology
University of Oxford
Department of Experimental Psychology
Job Specification
The successful applicant will be required to assume special responsibility for teaching the Final Honours School paper "Memory and Cognition", which covers the following topics to be published in Examination Decrees and Regulations:
Basic processes and varieties of human memory. Memory retrieval and interference; recognition and recall; short- and long-term memory; working memory; sensory memory; priming; acquisition of cognitive and motor skills; modality-specific and material-specific varieties of coding in memory; mnemonics; everyday memory; mathematical and computational models of learning and memory; impairment of learning and memory.
The representation and use of knowledge. Topics such as: semantic memory; inference; concept formation; encoding of similarities and differences; concepts, prototypes, and natural categories; schemata; imagery; problem solving; decision-making; heuristics and biases; cross-cultural differences in cognition.
This is one of four papers in cognitive psychology offered in Final Honours. The appointed lecturer will be expected to pursue active research in an area of cognitive psychology. Although interests in higher mental functions, cognitive neuropsychology, language, or artificial intelligence would be an advantage, it should be stressed to potential applicants that there is no restriction on area of interest.
Further details can be obtained from: Professor S.D.
Iversen, Head of Department
Department of Experimental Psychology
South Parks Road
Oxford OX1 3UD
or email: Jane Brooks - brooks at psy.ox.ac.uk
From oja at dendrite.hut.fi Fri Dec 3 09:45:27 1993
From: oja at dendrite.hut.fi (Erkki Oja)
Date: Fri, 3 Dec 93 16:45:27 +0200
Subject: No subject
Message-ID: <9312031445.AA25681@dendrite.hut.fi.hut.fi>
%
% *** A LIST OF REFERENCES RELATED TO PCA NEURAL NETWORKS ***
%
We offer a fairly extensive collection of references on Principal Component Analysis (PCA) neural networks and learning algorithms, available by anonymous ftp. The list also contains references on extensions and generalizations of such networks and some basic references on PCA and related matters. You can copy the list freely on your own responsibility. The original list was compiled by Liu-Yue Wang, a graduate student of Erkki Oja, and updated by Juha Karhunen, all from Helsinki University of Technology, Finland. The list should cover the field of PCA networks fairly well. Although it is not complete and possibly contains some errors and nonuniformity in notation, the reference collection should be useful, already in its present form, for people interested in PCA neural networks. To get the list, connect by ftp to dendrite.hut.fi and give anonymous as the user id. Then proceed according to instructions.
Erkki Oja, Liu-Yue Wang, Juha Karhunen
% ************************************************************
From liaw at rana.usc.edu Fri Dec 3 19:00:47 1993
From: liaw at rana.usc.edu (Jim Liaw)
Date: Fri, 3 Dec 93 16:00:47 PST
Subject: Frog-Net Announcement
Message-ID: <9312040000.AA21922@rana.usc.edu>
*********************************************************************
**   o/\o                   Frog-Net                   o/\o        **
**   An electronic forum for researchers engaged in the study     **
**   of the behavior and the underlying neural mechanisms in      **
**   amphibians                                                   **
*********************************************************************
This mailing list is set up to facilitate communication and interaction among researchers interested in the behavior and the underlying neural mechanisms in amphibians. If you would like to send email to all members of the list, address it to "frog-net at rana.usc.edu". If you want to subscribe to the mailing list, please send an email to "liaw at rana.usc.edu".
:::::::::::::::::::::::::::::::::::
Jim Liaw
Center for Neural Engineering
Univ. of Southern California
Los Angeles, CA 90089-2520
(213) 740-6991
liaw at rana.usc.edu
From amari at sat.t.u-tokyo.ac.jp Sat Dec 4 13:43:53 1993
From: amari at sat.t.u-tokyo.ac.jp (amari@sat.t.u-tokyo.ac.jp)
Date: Sat, 4 Dec 93 13:43:53 JST
Subject: PCA reference
Message-ID: <9312040443.AA12881@bpel.tutics.tut.ac.jp>
I have copied Dr. Sanger's very useful bibliography on PCA. I would like to add one "prehistoric" reference. In the paper S. Amari, "Neural Theory of Association and Concept Formation", Biological Cybernetics, vol. 26, 175-185, 1977, I discussed the general aspect of neural learning of the form
(d/dt)w = -cw + c'rx,
where w is the synaptic weight vector, c and c' are constants, x is the input vector and r is the "learning signal" depending on w, x and an extra signal.
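[Editorial aside: the convergence behavior Amari describes is easy to check numerically. The following is a minimal sketch, not part of the original message, of the rule above in discrete time with r = w.x, where the -cw decay term is replaced by explicitly renormalizing w after each step (the subsidiary condition w.w = const). The weight vector should align with the principal eigenvector of the input covariance matrix.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean inputs with an anisotropic covariance: variance 9 along
# the first axis and 1 along the second, so the principal eigenvector
# is (1, 0) up to sign.
C = np.array([[9.0, 0.0],
              [0.0, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=C, size=5000)

w = rng.normal(size=2)
w /= np.linalg.norm(w)          # enforce the condition w.w = const
eta = 0.01                      # discretized learning rate

for x in X:
    r = w @ x                   # Hebbian "learning signal" r = w.x
    w = w + eta * r * x         # Hebbian update; the -cw decay term is
    w /= np.linalg.norm(w)      # absorbed by explicit renormalization

# Compare with the top eigenvector of the sample covariance matrix.
evals, evecs = np.linalg.eigh(X.T @ X / len(X))
v = evecs[:, np.argmax(evals)]
print(abs(w @ v))               # close to 1.0 when w is aligned with v
```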
I have proved the existence of a potential or Lyapunov function for various types of neural learning. The case r = w.x was also noted. On p.179, one can see the following statement: "If the connection weight is subject to the subsidiary condition w.w = const, so that w(t) is normalized after each step of learning, we can prove that w(t) converges to the minimum of L(w) under the subsidiary condition. It is the direction of the eigenvector of the matrix corresponding to the maximum eigenvalue." Here L(w) is the Lyapunov function, a special case of the more general one I proposed, and the matrix is the covariance matrix (second-order moment matrix) of the input signals. This was only a few lines of description, and the main theme of the paper was not neural PCA. So this was only prehistory. But I think that someone might have interest in a prehistoric anecdote.
From LC4A%ICINECA.BITNET at BITNET.CC.CMU.EDU Tue Dec 7 15:09:35 1993
From: LC4A%ICINECA.BITNET at BITNET.CC.CMU.EDU (F. Ventriglia)
Date: Tue, 07 Dec 93 15:09:35 SET
Subject: New Book Announcement
Message-ID: <01H672ZH3AKY0003VW@BITNET.CC.CMU.EDU>
Dear Connectionists fellow,
The following book has appeared as part of the Studies in Neuroscience series and may be of interest to you.
Best,
Francesco Ventriglia
Neurodynamics Department, Cybernetics Institute, CNR, Arco Felice (NA), Italy
*****************************************************************
Neural Modeling and Neural Networks
F. Ventriglia, editor - Pergamon Press
Research in neural modeling and neural networks has escalated dramatically in the last decade, acquiring along the way terms and concepts, such as learning, memory, perception, recognition, which are the basis of neuropsychology. Nevertheless, for many, neural modeling remains controversial in its purported ability to describe brain activity.
The difficulties in modeling are various, but arise principally in identifying those elements that are fundamental for the expression (and description) of higher neural activity. This is complicated by our incomplete knowledge of neural structures and functions, at the cellular and population levels. The first step towards an enhanced appreciation of the value of neural modeling and neural networks is to be aware of what has been achieved in this multidisciplinary field of research. This book sets out to create such awareness. Leading experts develop in twelve chapters the key topics of neural structures and functions, dynamics of single neurons, oscillations in groups of neurons, randomness and chaos in neural activity, (statistical) dynamics of neural networks, learning, memory and pattern recognition.
Contents: Preface. Contributors.
Anatomical bases of neural network modeling (J. Szentagothai)
Models of visuomotor coordination in frog and monkey (M.A. Arbib)
Analysis of single-unit activity in the cerebral cortex (M. Abeles)
Single neuron dynamics: an introduction (L.F. Abbott)
An introduction to neural oscillators (B. Ermentrout)
Mechanisms responsible for epilepsy in hippocampal slices predispose the brain to collective oscillations (R.D. Traub, J.G.R. Jefferys)
Diffusion models of single neurones' activity and related problems (L.M. Ricciardi)
Noise and chaos in neural systems (P. Erdi)
Qualitative overview of population neurodynamics (W.F. Freeman)
Towards a kinetic theory of cortical-like neural fields (F. Ventriglia)
Psychology, neuro-biology and modeling: the science of hebbian reverberations (D.J. Amit)
Pattern recognition with neural networks (K. Fukushima)
Bibliography. Author index. Subject index.
Publication date: November 1993. Approx. 300 pages. Price: US$ 125.00.
Available from: Pergamon Press Inc.
660 White Plains Road, Tarrytown, NY 10591-5153, USA. Phone +1-914-524-9200 Fax +1-914-333-2444
From tds at ai.mit.edu Tue Dec 7 18:40:47 1993
From: tds at ai.mit.edu (Terence D. Sanger)
Date: Tue, 7 Dec 93 18:40:47 EST
Subject: PCA algorithms, continued.
Message-ID: <9312072340.AA11203@rice-chex>
In response to my previous message, many people have sent me new references to PCA algorithms, and these have been included in the BibTeX database pca.bib. (Also note Wang's more extensive pclist.tex file announced recently on this net.) Erkki Oja has been kind enough to forward copies of some of his recent papers on the "Weighted Subspace Algorithm" and "Nonlinear PCA". Looking at these carefully, I think both algorithms are closely related to Brockett's algorithm, and probably work for the same reason. I have created another short derivation "oja.tex" which is available along with the updated pca.bib by anonymous ftp from ftp.ai.mit.edu in the directory pub/sanger-papers. One could invoke some sort of transitivity property to claim that since Oja's algorithms are related to Brockett's, Brockett's are related to GHA, and GHA does deflation, then Oja's algorithms must also do deflation. This would imply that Oja's algorithms also satisfy the hypothesis: "All algorithms for PCA which are based on a Hebbian learning rule must use sequential deflation to extract components beyond the first." But I must admit that the connection is becoming somewhat tenuous. Probably the hypothesis should be interpreted as a vague description of a motivation for the computational mechanism, rather than a direct description of the algorithm. However, I still feel that it is important to realize the close relationship between the many algorithms which use Hebbian learning to find exact eigenvectors. As always, comments/suggestions/counterexamples/references are welcomed!
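[Editorial aside: for readers who want to see the "sequential deflation" idea concretely, here is a minimal sketch, not from Sanger's message, of the Generalized Hebbian Algorithm mentioned above. Row i of the weight matrix learns the i-th principal component because each update subtracts the reconstructions of components 1..i before applying the Hebbian term.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Inputs with decreasing variance along the three coordinate axes,
# so the principal components are (up to sign) the axes themselves.
X = rng.normal(size=(20000, 3)) * np.array([3.0, 2.0, 1.0])

W = 0.1 * rng.normal(size=(2, 3))   # learn the first two components
eta = 0.002                         # learning rate

for x in X:
    y = W @ x
    # GHA update: dW = eta * (y x^T - lower_triangular(y y^T) W).
    # The lower-triangular term is the sequential deflation: row i is
    # trained on the input with components 1..i-1 (and itself) removed.
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Rows of W should align, up to sign, with the top two eigenvectors
# of the sample covariance matrix.
evals, evecs = np.linalg.eigh(X.T @ X / len(X))
print(abs(W[0] @ evecs[:, -1]), abs(W[1] @ evecs[:, -2]))
```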
Terry Sanger
Instructions for retrieving latex documents:
ftp ftp.ai.mit.edu
login: anonymous
password: your-net-address
cd pub/sanger-papers
get pca.bib
get oja.tex
quit
latex oja
lpr oja.dvi
From brutlag at cmgm.stanford.edu Wed Dec 8 20:48:58 1993
From: brutlag at cmgm.stanford.edu (Doug Brutlag)
Date: Wed, 8 Dec 93 17:48:58 -0800
Subject: Intelligent Systems for Molecular Biology
Message-ID:
Last year's version of the following conference contained many papers that involved neural networks. Hence, I thought that some of the readers of this mailing list might be interested.
Doug Brutlag
***************** CALL FOR PAPERS *****************
The Second International Conference on Intelligent Systems for Molecular Biology
August 15-17, 1994, Stanford University
Organizing Committee: Russ Altman, Stanford U, Stanford; Doug Brutlag, Stanford U, Stanford; Peter Karp, SRI, Menlo Park; Richard Lathrop, MIT, Cambridge; David Searls, U Penn, Philadelphia
Deadlines: Papers due: March 11, 1994; Replies to authors: April 29, 1994; Revised papers due: May 27, 1994
Program Committee: K. Asai, ETL, Tsukuba; A. Lapedes, LANL, Los Alamos; D. Benson, NCBI, Bethesda; M. Mavrovouniotis, Northwestern U, Evanston; B. Buchanan, U of Pittsburgh; G. Michaels, George Mason U, Fairfax; C. Burks, LANL, Los Alamos; G. Myers, U. Arizona, Tucson; D. Clark, ICRF, London; K. Nitta, ICOT, Tokyo; F. Cohen, UCSF, San Francisco; C. Rawlings, ICRF, London; T. Dietterich, OSU, Corvallis; J. Sallatin, LIRM, Montpellier; S. Forrest, UNM, Albuquerque; C. Sander, EMBL, Heidelberg; J. Glasgow, Queen's U., Kingston; J. Shavlik, U Wisconsin, Madison; P. Green, Wash U, St. Louis; D. States, Wash U, St. Louis; M. Gribskov, SDSC, San Diego; G. Stormo, U Colorado, Boulder; D. Haussler, UCSC, Santa Cruz; E. Uberbacher, ORNL, Oak Ridge; S. Henikoff, FHRC, Seattle; M. Walker, Stanford U, Stanford; L. Hunter, NLM, Bethesda; T. Webster, Stanford U, Stanford; T. Klein, UCSF, San Francisco; X.
Zhang, TMC, Cambridge The Second International Conference on Intelligent Systems for Molecular Biology will take place at Stanford University in the San Francisco Bay Area, August 14-17, 1994. The ISMB conference, held for the first time last summer in Bethesda, MD, attracted an overflow crowd, yielded an excellent offering of papers, invited speakers, posters and tutorials, provided an exciting opportunity for researchers to meet and exchange ideas, and was an important forum for the developing field. We will continue the tradition of pre-published, rigorously refereed proceedings, and opportunities for fruitful personal interchange. The conference will bring together scientists who are applying the technologies of advanced data modeling, artificial intelligence, neural networks, probabilistic reasoning, massively parallel computing, robotics, and related computational methods to problems in molecular biology. We invite participation from both developers and users of any novel system, provided it supports a biological task that is cognitively challenging, involves a synthesis of information from multiple sources at multiple levels, or in some other way exhibits the abstraction and emergent properties of an "intelligent system." The four-day conference will feature introductory tutorials (August 14), presentations of original refereed papers and invited talks (August 15-17). Paper submissions should be single-spaced, 12 point type, 12 pages maximum including title, abstract, figures, tables, and bibliography with titles. The first page should include the full postal address, electronic mailing address, telephone and FAX number of each author. Also, please list five to ten keywords describing the methods and concepts discussed in the paper. State whether you wish the paper to be considered for oral presentation only, poster presentation only or for either presentation format. Submit 6 copies to the address below. 
For more information, please contact ismb at camis.stanford.edu. Proposals for introductory tutorials must be well documented, including the purpose and intended audience of the tutorial as well as previous experience of the author in presenting such material. Those considering submitting tutorial proposals are strongly encouraged to submit a one-page outline, before the deadline, to enable early feedback regarding topic and content suitability. The conference will pay an honorarium and support, in part, the travel expenses of tutorial speakers. Limited funds are available to support travel to ISMB-94 for those students, post-docs, minorities and women who would otherwise be unable to attend.

Please submit papers and tutorial proposals to:
Intelligent Systems for Molecular Biology
c/o Dr. Douglas L. Brutlag
Beckman Center, B400
Department of Biochemistry
Stanford University School of Medicine
Stanford, California 94305-5307

From sylee%eekaist.kaist.ac.kr at daiduk.kaist.ac.kr Fri Dec 10 11:14:43 1993 From: sylee%eekaist.kaist.ac.kr at daiduk.kaist.ac.kr (Soo-Young Lee) Date: Fri, 10 Dec 93 11:14:43 KST Subject: Postdoc and Graduate Studies Message-ID: <9312100214.AA02590@eekaist.kaist.ac.kr> Subject: Postdoc/Graduate Study - Neural Net Applications and Implementation From: "Soo-Young Lee"

POSTDOCTORAL POSITION / GRADUATE STUDENTS
Computation and Neural Systems Laboratory
Department of Electrical Engineering
Korea Advanced Institute of Science and Technology

A postdoctoral position is available beginning after March 1st, 1994. The position is for one year initially, and may be extended for another year. Graduate students with full scholarships are also welcome, especially from developing countries. We are seeking individuals interested in research on neural net applications and/or VLSI implementation.
In particular, we emphasize a "systems" approach, which combines neural network theory, application-specific knowledge, and hardware implementation technology for much better performance. Although many applications are currently under investigation, speech recognition is the preferred choice at this moment. Interested parties should send a C.V. and a brief statement of research interests to the address listed below.

Present address:
Prof. Soo-Young Lee
Computation and Neural Systems Laboratory
Department of Electrical Engineering
Korea Advanced Institute of Science and Technology
373-1 Kusong-dong, Yusong-gu
Taejon 305-701
Korea (South)
Fax: +82-42-869-3410
E-mail: sylee at ee.kaist.ac.kr

RESEARCH INTERESTS OF THE GROUP

The Korea Advanced Institute of Science and Technology (KAIST) is a unique engineering school, which emphasizes graduate studies through high-quality research. All graduate students receive full scholarships, and Ph.D. course students are exempt from military service. The Department of Electrical Engineering is the largest one, with 39 professors, 250 Ph.D. course students, 180 Master course students, and 300 undergraduate students. The Computation and Neural Systems Laboratory is led by Prof. Soo-Young Lee, and consists of about 10 Ph.D. course students and about 5 Master course students. The primary focus of this laboratory is to merge neural network theory, VLSI implementation technology, and application-specific knowledge for much better performance in real-world applications. Speech recognition, pattern recognition, and control applications have been emphasized.
Neural network models developed include the Multilayer Bidirectional Associative Memory, an extension of BAM to a multilayer architecture; IJNN (Intelligent Judge Neural Networks), which rules intelligently on disputes among several low-level classifiers; TAG (Training by Adaptive Gain) for large-scale implementation and speaker adaptation; and a Hybrid Hebbian-Backpropagation Algorithm for MLPs with improved robustness and generalization. The correlation-matrix MBAM chip has been fabricated, and a new analog neuro-chip with on-chip learning is now under design.

From ken at phy.ucsf.edu Fri Dec 10 04:58:05 1993 From: ken at phy.ucsf.edu (ken@phy.ucsf.edu) Date: Fri, 10 Dec 93 01:58:05 -0800 Subject: Graduate studies in computational and systems neuroscience Message-ID: <9312100958.AA13745@phybeta.ucsf.EDU>

University of California, San Francisco (UCSF) is a leading institute of biomedical research. Its graduate program in Neuroscience is widely regarded as one of the very best such programs. The organization of the Keck Center for Integrative Neuroscience (see below), including the hiring of computational faculty, makes UCSF an exciting location for students interested in theoretical as well as experimental approaches to understanding brain function. UCSF is *not* a reasonable place for those wishing to work on applications of neural networks, as we have no programs in that area. But, for those truly interested in understanding the nervous system and its function, using theoretical and/or experimental methods and remaining solidly based in biology, it is a superb program. I would like to personally encourage theoretically inclined individuals with such an interest to apply. Application deadline is Jan. 15. For further information and application materials, contact Patricia Arrandale: patricia at phy.ucsf.edu; 415-476-2248 (phone); 415-476-4929 (fax).
--------------------------------------------
The Keck Center for Integrative Neuroscience:

A completely reconstructed space at UCSF, to open in January, 1994, will house the following seven faculty and their labs in a highly interactive setting:

Allan Basbaum: The Neural Substrate of Pain and Pain Control
Allison Doupe: The Neural Basis of Vocal Learning in Songbirds
Stephen Lisberger: Neural Control of Eye Movements
Michael Merzenich: Dynamic Neocortical Processes: Neural Origins of Higher Brain Functions
Kenneth Miller: Computational Neuroscience
Christof Schreiner: Mammalian Auditory Cortex
Michael Stryker: Development and Plasticity of Mammalian Central Visual System

Other faculty closely associated with the Center, although not housed in the center itself, include:

Howard Fields: Neural Circuitry Underlying Pain Modulation
Rob Malenka: Synaptic Plasticity in the Mammalian Central Nervous System
Roger Nicoll: Physiology and Pharmacology of CNS Synapses
Henry Ralston: Neuronal Organization in Spinal Cord and Thalamus

From paulh at hdl.ie Fri Dec 10 10:09:40 1993 From: paulh at hdl.ie (Paul Horan) Date: Fri, 10 Dec 93 15:09:40 GMT Subject: Post-Doc position Message-ID: <9312101509.AA07932@sun1.hdl.ie>

Post-Doctoral Research Position
Dept of Pure and Applied Physics
Trinity College Dublin, Ireland.

Applications are invited for a postdoctoral position in the Department of Physics at Trinity College Dublin to work on the integration of semiconductor optical modulators and electronics, as part of a smart pixel neural network project which is currently underway. The research will be carried out in collaboration with a team at the Hitachi Dublin Lab in Trinity College. The successful candidate should have experience in both electronics and optics, preferably in the design and processing of GaAs devices. Applicants should have a PhD. The post will be for two years initially, with the possibility of an extension. Inquiries or applications + CV + 2 referees to: Prof.
John Hegarty head of Dept., Dept of Pure and Applied physics, Trinity College, Dublin 2 Ireland. Tel +353-1-7021675 Fax +353-1-6711759 email _______________________________________________________ Paul Horan, Hitachi Dublin Lab., Trinity College, Dublin 2, Ireland Fax +353-1-6798926, e-mail paulh at hdl.ie From P.McKevitt at dcs.shef.ac.uk Fri Dec 10 13:00:24 1993 From: P.McKevitt at dcs.shef.ac.uk (Paul Mc Kevitt) Date: Fri, 10 Dec 93 18:00:24 GMT Subject: No subject Message-ID: <9312101800.AA06422@dcs.shef.ac.uk> *PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST**PLEASE POST* Advance Announcement CALL FOR PAPERS AND PARTICIPATION AAAI-94 Workshop on the Integration of Natural Language and Vision Processing Twelfth National Conference on Artificial Intelligence (AAAI-94) Seattle, Washington, USA 2 days during July 31st-August 4th 1994 Chair: Paul Mc Kevitt Department of Computer Science University of Sheffield WORKSHOP DESCRIPTION There has been a recent move towards considering the integration of perception sources in Artificial Intelligence (AI) (see Dennett 1991 and Mc Kevitt (Guest Ed.) 1994). This workshop will focus on research involved in the integration of Natural Language Processing (NLP) and Vision Processing (VP). Although there has been much progress in developing theories, models and systems in the areas of NLP and VP there has been little progress on integrating these two subareas of Artificial Intelligence (AI). It is not clear why there has not already been much activity in integrating NLP and VP. Is it because of the long-time reductionist trend in science up until the recent emphasis on chaos theory, non-linear systems, and emergent behaviour? 
Or, is it because the people who have tended to work on NLP tend to be in other Departments, or of a different ilk, to those who have worked on VP? We believe it is high time to bring together NLP and VP. Already we have advertised a call for papers for a special issue of the Journal of AI Review to focus on the integration of NLP and VP and we have had a tremendous response. There will be three special issues focussing on theory and applications of NLP and VP. Also, there will be an issue focussing on intelligent multimedia systems. The workshop is of particular interest at this time because research in NLP and VP has advanced to the stage that each can benefit from integrated approaches. Also, such integration is important as people in NLP and VP can gain insight from each other's work.

References

Dennett, Daniel (1991) Consciousness Explained. Harmondsworth: Penguin
Mc Kevitt, Paul (1994) (Guest Editor) Integration of Natural Language and Vision Processing. Special Volume (Issues 1,2,3) of AI Review Journal. Dordrecht: Kluwer (forthcoming)

WORKSHOP TOPICS:

The workshop will focus on three themes:

* Theoretical issues on integrated NLP and VP
* Systems exhibiting integrated NLP and VP
* Intelligent multimedia involving NLP and VP

The following issues will be focussed upon during the workshop:

* Common representations for NLP and VP
* How does NLP help VP and vice-versa?
* What does integration buy us?
* Symbolic versus connectionist models
* Varieties of communication between NLP and VP processors
* Designs for integrating NLP + VP
* Tools for integrating NLP + VP
* Possible applications of integration

WORKSHOP FORMAT:

Our intention is to have as much discussion as possible during the workshop and to stress panel sessions and discussion rather than having formal paper presentations. We will also organize a number of presentations on Site Descriptions of ongoing work on NLP + VP. There may be a number of invited speakers.
Day 1: Theory and modelling for integrated NLP and VP.
Day 2: Systems for integrated NLP/VP, and intelligent multimedia.

ATTENDANCE: We hope to have an attendance of between 25 and 50 people at the workshop.

SUBMISSION REQUIREMENTS: Papers of not more than 8 pages should be submitted by electronic mail to Paul Mc Kevitt at p.mckevitt at dcs.shef.ac.uk. Preferred format is two columns with 3/4 " margins all round. Papers must be printed to 8 1/2" x 11" size. Double sided printing is encouraged. If you cannot submit your paper by e-mail please submit three copies by snail mail.

*******Submission Deadline: March 18th 1994
*******Notification Date: April 8th 1994
*******Camera ready Copy: April 29th 1994

PUBLICATION: Workshop notes/preprints will be published by AAAI. If there is sufficient interest we will publish a book on the workshop with AAAI Press.

WORKSHOP CHAIR:
Paul Mc Kevitt
Department of Computer Science
Regent Court
University of Sheffield
211 Portobello Street
GB- S1 4DP, Sheffield
England, UK, EC.
e-mail: p.mckevitt at dcs.shef.ac.uk
fax: +44 742 780972
phone: +44 742 825572 (office) 825590 (secretary)

WORKSHOP COMMITTEE:
Prof. Jerry Feldman (ICSI, Berkeley, USA)
Prof. John Frisby (Sheffield, England)
Dr. Eduard Hovy (USC ISI, Los Angeles, USA)
Dr. Mark Maybury (MITRE, Cambridge, USA)
Dr. Ryuichi Oka (RWC, Tsukuba, Japan)
Dr. Terry Regier (ICSI, Berkeley, USA)
Prof. Roger Schank (ILS, Illinois, USA)
Dr. Oliviero Stock (IRST, Italy)
Prof. Dr. Wolfgang Wahlster (DFKI, Germany)
Prof.
Yorick Wilks (Sheffield, England) From jaap.murre at mrc-apu.cam.ac.uk Mon Dec 13 09:25:51 1993 From: jaap.murre at mrc-apu.cam.ac.uk (Jaap Murre) Date: Mon, 13 Dec 93 14:25:51 GMT Subject: Neurosimulator review available by ftp Message-ID: <12980.9312131425@rigel.mrc-apu.cam.ac.uk> I have recently completed a review of roughly 40 neurosimulators. A first draft of this review is now available through our ftp site. Comments to the current review are very welcome. -- Jaap Murre --------------------------------------------------------------------------- The filename is neurosim1.ps.Z. A WordPerfect 5.1 version (.w51.Z extension) and an ASCII dump (.txt) are also available, as well as uncompressed versions (no .Z extension). To obtain the file use the following command sequence: ftp ftp.mrc-apu.cam.ac.uk NAME: anonymous PASSWORD: cd pub cd nn bin /* if you want to retrieve a binary file */ get neurosim1.ps.Z quit decompress neurosim1.ps.Z lpr -P neurosim1.ps --------------------------------------------------------------------------- Dr. Jacob M.J. Murre Medical Research Council Applied Psychology Unit 15 Chaucer Road Cambridge CB2 2EF United Kingdom tel 44 223 355294 (ext.139) fax 44 223 359062 E-mail: Jaap.Murre at MRC-APU.CAM.AC.UK From robbie at prodigal.psych.rochester.edu Mon Dec 13 11:07:38 1993 From: robbie at prodigal.psych.rochester.edu (Robbie Jacobs) Date: Mon, 13 Dec 93 11:07:38 EST Subject: paper available Message-ID: <9312131607.AA27061@prodigal.psych.rochester.edu> The following paper is now available via anonymous ftp from the neuroprose archive. The paper has been accepted for publication in the journal "Cognitive Science." The manuscript is 44 pages.
------------------------------------------------------------------ Encoding Shape and Spatial Relations: The Role of Receptive Field Size in Coordinating Complementary Representations Robert A. Jacobs Stephen M. Kosslyn University of Rochester Harvard University An effective functional architecture facilitates interactions among subsystems that are often used together. Computer simulations showed that differences in receptive field sizes can promote such organization. When input was filtered through relatively small nonoverlapping receptive fields, artificial neural networks learned to categorize shapes relatively quickly; in contrast, when input was filtered through relatively large overlapping receptive fields, networks learned to encode specific shape exemplars or metric spatial relations relatively quickly. Moreover, when the receptive field sizes were allowed to adapt during learning, networks developed smaller receptive fields when they were trained to categorize shapes or spatial relations, and developed larger receptive fields when they were trained to encode specific exemplars or metric distances. In addition, when pairs of networks were constrained to use input from the same type of receptive fields, networks learned a task faster when they were paired with networks that were trained to perform a compatible type of task. Finally, using a novel modular architecture, networks were not pre-assigned a task, but rather competed to perform the different tasks. Networks with small nonoverlapping receptive fields tended to win the competition for categorical tasks whereas networks with large overlapping receptive fields tended to win the competition for exemplar/metric tasks. 
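The receptive-field manipulation at the heart of these simulations is easy to state concretely. Below is a small sketch of my own (the mean-pooling filter and the 8x8 input are illustrative assumptions, not the paper's actual encoding): the same image is passed through small non-overlapping fields versus larger overlapping ones simply by varying the field size relative to the stride.

```python
import numpy as np

def receptive_field_outputs(image, size, stride):
    """Pool a 2-D input through square receptive fields.

    size == stride gives small non-overlapping fields;
    size > stride gives larger, overlapping fields.
    """
    h, w = image.shape
    outputs = []
    for r in range(0, h - size + 1, stride):
        for c in range(0, w - size + 1, stride):
            outputs.append(image[r:r + size, c:c + size].mean())
    return np.array(outputs)

img = np.arange(64.0).reshape(8, 8)
small = receptive_field_outputs(img, size=2, stride=2)  # non-overlapping
large = receptive_field_outputs(img, size=4, stride=2)  # overlapping
print(small.shape, large.shape)  # (16,) (9,)
```

Small non-overlapping fields keep input regions cleanly separated (well suited to categorical distinctions), while large overlapping fields blur across regions and carry coarse metric information, which is the contrast the simulations above exploit.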
------------------------------------------------------------------ FTP-host: archive.cis.ohio-state.edu FTP-filename: /pub/neuroprose/jacobs.rfield.ps.Z Robbie Jacobs robbie at psych.rochester.edu From jaap.murre at mrc-apu.cam.ac.uk Mon Dec 13 11:35:11 1993 From: jaap.murre at mrc-apu.cam.ac.uk (Jaap Murre) Date: Mon, 13 Dec 93 16:35:11 GMT Subject: Workshop on Language Acquisition Message-ID: <13227.9312131635@rigel.mrc-apu.cam.ac.uk> *************** CALL FOR PAPERS ****************************************************** Workshop on 'Cognitive Models of Language Acquisition' April 21-23, 1994 Tilburg University, The Netherlands ****************************************************** Organizers: Peter Broeder (Tilburg) Jaap Murre (Cambridge) Scientific committee: Melissa Bowerman (Nijmegen) Peter Coopmans (Utrecht) Guus Extra (Tilburg) Peter Jordens (Amsterdam) Sponsored by: L.O.T. (Landelijke Onderzoeksschool Taalkunde), The Dutch national Ph.D. program in Linguistics AIM OF THE WORKSHOP The workshop is centered around two basic questions with respect to the nature and origins of language as "an individual phenomenon": (1) What constitutes knowledge of language? (2) How is knowledge of language acquired? Currently, these questions are being addressed within different cognitive models of language acquisition which derive from strongly contrasting research paradigms. The paradigms start from fundamentally different assumptions about language (symbolic or subsymbolic) and the mechanisms that drive the process of language acquisition (inductive or deductive). The workshop will focus on processes of language acquisition in children and adults and on modelling these processes. In particular, the acquisition and representation of words will be a central topic.
The workshop aims to bring together researchers willing to discuss the merits and constraints of the various models based on the interdisciplinary approaches of linguistics, psychology, cognitive science, NLP, and AI.

PARTICIPANTS

Melissa Bowerman (Nijmegen), Harald Clahsen (Colchester), Vivian Cook (Colchester), Peter Coopmans (Utrecht), Walter Daelemans (Tilburg), Guus Extra (Tilburg), Michael Gasser (Indiana), Steven Gillis (Antwerp), Peter Jordens (Amsterdam), Gerard Kempen (Leiden), Brian MacWhinney (Pittsburgh), Paul Meara (Swansea), Dennis Norris (Cambridge), Kim Plunkett (Oxford), Henk van Riemsdijk (Tilburg), Mike Sharwood-Smith (Utrecht), Paul Smolensky (Colorado), Sven Stromqvist (Goteborg).

ABSTRACTS

We invite those interested to submit a two-page abstract (for a 30 minute oral presentation) by January 15, 1994. We would prefer to receive the abstracts by e-mail. The organizers can be contacted at the following addresses:

Peter Broeder
Department of Linguistics
University of Tilburg
P.O. Box 90153
5000 LE Tilburg
The Netherlands
tel: +31 13-662239
fax: +31 13-663110
e-mail: peter.broeder at kub.nl

Jaap Murre
Medical Research Council
Applied Psychology Unit
15 Chaucer Road
Cambridge CB2 2EF
United Kingdom
tel: +44 223 355294
fax: +44 223 359062
e-mail: jaap.murre at mrc-apu.cam.ac.uk

From hava at bimacs.cs.biu.ac.il Mon Dec 13 13:46:31 1993 From: hava at bimacs.cs.biu.ac.il (Siegelmann Hava) Date: Mon, 13 Dec 93 20:46:31 +0200 Subject: TR available: Complexity of Training Message-ID: <9312131846.AA29216@bimacs.cs.biu.ac.il>

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/siegelmann.training.ps.Z

The file siegelmann.training.ps.Z is now available for copying from the Neuroprose repository:

=====================================================================
On the Complexity of Training Neural Networks with Continuous Activation Functions (30 pages)

Bhaskar DasGupta (Minnesota), Hava T. Siegelmann (Bar-Ilan), and Eduardo D. Sontag (Rutgers)
======================================================================

We deal with computational issues of loading a fixed-architecture neural network with a set of positive and negative examples. This is the first result on the hardness of loading networks which do not consist of binary-threshold neurons, but rather utilize a particular continuous activation function commonly used in the neural network literature. We observe that the loading problem is polynomial-time if the input dimension is constant. Otherwise, however, any possible learning algorithm based on particular fixed architectures faces severe computational barriers. Similar theorems had already been proved by Megiddo and by Blum and Rivest, for the case of binary-threshold networks only. Our theoretical results lend further justification to the use of incremental (architecture-changing) techniques for training networks.

From gjg at cns.edinburgh.ac.uk Tue Dec 14 16:11:08 1993 From: gjg at cns.edinburgh.ac.uk (Geoffrey Goodhill) Date: Tue, 14 Dec 93 21:11:08 GMT Subject: Preprints available Message-ID: <25549.9312142111@cns.ed.ac.uk>

FTP-host: archive.cis.ohio-state.edu
FTP-filename: /pub/neuroprose/goodhill.normalization.ps.Z
FTP-filename: /pub/neuroprose/goodhill.elastic.ps.Z

The following two papers have been accepted for publication in "Neural Computation": for abstracts and information on obtaining preprints, see below.

"The Role of Weight Normalization in Competitive Learning" - Goodhill, G.J. and Barrow, H.G.
"Elastic Net Model of Ocular Dominance: Overall Stripe Pattern and Monocular Deprivation" - Goodhill, G.J. and Willshaw, D.J.

Geoff Goodhill (gjg at cns.ed.ac.uk)

--------------------
Instructions for obtaining by anonymous ftp:

% ftp archive.cis.ohio-state.edu (128.146.8.52)
Name: anonymous
ftp> binary
ftp> cd pub/neuroprose
ftp> get goodhill.normalization.ps.Z
ftp> get goodhill.elastic.ps.Z
ftp> quit
% uncompress .....
The papers are of length 11 pages and 6 pages respectively. --------------------- The Role of Weight Normalization in Competitive Learning Goodhill, G.J. and Barrow, H.G. The effect of different kinds of weight normalization on the outcome of a simple competitive learning rule is analysed. It is shown that there are important differences in the representation formed depending on whether the constraint is enforced by dividing each weight by the same amount (``divisive enforcement''), or subtracting a fixed amount from each weight (``subtractive enforcement''). For the divisive cases weight vectors spread out over the space so as to evenly represent ``typical'' inputs, whereas for the subtractive cases the weight vectors tend to the axes of the space, so as to represent ``extreme'' inputs. The consequences of these differences are examined. ----- Elastic Net Model of Ocular Dominance: Overall Stripe Pattern and Monocular Deprivation Goodhill, G.J. and Willshaw, D.J. The elastic net \cite{durwil87} can account for the development of both topography and ocular dominance in the mapping from the lateral geniculate nucleus to primary visual cortex. Here it is further shown for this model that (a) the overall pattern of stripes produced is strongly influenced by the shape of the cortex: in particular stripes with a global order similar to that seen biologically can be produced under appropriate conditions, and (b) the observed changes in stripe width associated with monocular deprivation are reproduced in the model. From jaap.murre at mrc-apu.cam.ac.uk Fri Dec 17 13:57:46 1993 From: jaap.murre at mrc-apu.cam.ac.uk (Jaap Murre) Date: Fri, 17 Dec 93 18:57:46 GMT Subject: Six papers available by ftp Message-ID: <29396.9312171857@rigel.mrc-apu.cam.ac.uk> Six reports by the Leiden Connectionist Group have recently been placed in our ftp site. Retrieval instructions can be found at the end of this message. 
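To make the divisive/subtractive distinction of the first abstract concrete, here is a minimal sketch of my own (the winner-take-all rule and the sum-to-one constraint are illustrative assumptions, not taken from the paper): after the winning unit moves toward the input, its weight vector is renormalized either by dividing every weight by a common factor or by subtracting a common amount from every weight.

```python
import numpy as np

def competitive_step(W, x, lr=0.1, mode="divisive"):
    """One step of simple competitive learning with a weight-sum
    constraint enforced in one of two ways.

    "divisive":    divide every weight of the winner by the same factor.
    "subtractive": subtract the same amount from every weight.
    """
    winner = int(np.argmax(W @ x))
    w = W[winner] + lr * x                 # move winner toward input
    if mode == "divisive":
        w = w / w.sum()                    # rescale: preserves ratios
    else:
        w = w - (w.sum() - 1.0) / len(w)   # shift: preserves differences
    W[winner] = w
    return winner

rng = np.random.default_rng(1)
W = np.abs(rng.normal(size=(4, 3)))
W /= W.sum(axis=1, keepdims=True)          # start on the constraint surface
for _ in range(50):
    competitive_step(W, np.abs(rng.normal(size=3)), mode="divisive")
print(W.sum(axis=1))                       # every row still sums to 1
```

Divisive rescaling preserves the ratios between a unit's weights, so weight vectors stay spread over the input space; the subtractive shift preserves absolute differences, which pushes vectors toward the axes of the space (and can drive individual weights negative, so in practice it is usually combined with clipping at zero) -- consistent with the "typical" versus "extreme" contrast described in the abstract.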
-- Jaap Murre --------------------------------------------------------------------------- File: modann1.ps.Z Title: Designing Modular Artificial Neural Networks Authors: Egbert J.W. Boers, Herman Kuiper, Bart L.M. Happel, and Ida G. Sprinkhuizen-Kuyper Abstract: This paper presents a method for designing artificial neural network architectures. The method implies a reverse engineering of the processes resulting in the mammalian brain. The method extends the brain metaphor in neural network design with genetic algorithms and L-systems, modelling natural evolution and growth. It will be argued that a principle of modularity, which is inherent to the design method as well as the resulting network architectures, improves network performance. E-mail comments to boers at WI.LeidenUniv.nl or to happel at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: bsp400.ps.Z Title: The BSP400: A Modular Neurocomputer Authors: Jan N.H. Heemskerk, Jaap Hoekstra, Jacob M.J. Murre, Leon H.J.G. Kemna, and Patrick T.W. Hudson Abstract: This paper discusses the main architectural issues, the implementation, and the performance of a parallel neurocomputer, the Brain-Style Processor or BSP400. This project presents a feasibility study for larger parallel neurocomputers. The design principles are hardware modularity, simple processors, and in situ (local) learning. The modular approach of the design ensures extensibility of the present version. The BSP400 consists of 25 Modules (boards) each containing 16 simple 8-bit single-chip computers. The Module boards are connected to a dedicated connection network. The architectural configuration of the BSP400 supports local activation and learning rules. The ability to communicate activations with the outside world in real-time makes the BSP400 particularly suited for real-world applications. 
The present version implements a modular type of neural network, the CALM (Categorizing And Learning Module) neural network. In this implementation of CALM, activations are transmitted as single bits, but an internal representation of one byte is kept for both activations and weights. The system has a capacity of 400 processing elements and 32,000 connections. Even with slow and simple processing elements, it still achieves a speed of 6.4 million connections per second for a non-learning CALM network. Some small network simulation studies carried out on the BSP400 are reported. A comparison with a design study (Mark III and Mark IV) is made. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: schema.ps.Z Title: A Real-Time Neural Implementation of a Schema Driven Toy-Car Authors: Jan N.H. Heemskerk and Fred A. Keijzer Abstract: An actual implementation of a schema driven toy-car is presented in this paper. The car is equipped with two motors and 4 light sensors. Supervised learning behavior of the car is achieved by using a neural network with adaptive connections. The car can be taught to drive towards a light and avoid obstacles. The controlling neural network is implemented on the BSP400 neurocomputer, a Brain Style Processor with 400 nodes. A subset of the digital nodes in the BSP400 are connected by fixed weights to form logical circuits in order to re-train the car. In this way cooperative computation of both 'logical' and 'neural' processes are integrated into one system. The actions of the car are described in terms of both distal and proximal schemas. These correspond respectively with a description of the car's actions in terms of distal system environmental stimuli and effects, and a description of the local routines carried out by the system. The latter are restricted to the specific situatedness and embodiment of the acting system. 
The distinction between distal and proximal schemas is presented as a way to link neural structure to adaptive action. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: mindshap.ps.Z Title: MindShape: a neurocomputer concept based on a fractal architecture Authors: Jan N.H. Heemskerk, Jacob M.J. Murre, Arend Melissant, Mirko Pelgrom, and Patrick T.W. Hudson Abstract: A parallel architecture for implementing massive neural networks, called MindShape, is presented. MindShape is the successor to the Brain Style Processor, a 400-processor neurocomputer based on the principle of modularity. The MindShape machine consists of Neural Processing Elements (NPEs) and Communication Elements (CEs) organized in what we have called a fractal architecture. The architecture is by definition scalable, permitting the implementation of very large networks consisting of many thousands of nodes. Through simulations of data-communication flow on different architectures, and through implementation studies of VLSI hardware on a chip simulator, the specific requirements of the CEs and the NPEs have been investigated. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: optim.ps.Z Title: Implementation of Optimization Networks in Synchronous Massively Parallel Hardware Authors: Jan N.H. Heemskerk, Peter A. Starreveld, and Patrick T.W. Hudson Abstract: In this paper, implementation possibilities of a synchronous binary neural model for solving optimization problems in massively parallel hardware are studied. It is argued that synchronous, as opposed to asynchronous, models are best suited to the general characteristics of massively parallel architectures. In this study the massively parallel target device is the BSP400 (Brain Style Processor with 400 nodes).
The updating of the nodes in the BSP400 is synchronous and the nodes can only process local data (i.e., activations). The synchronous models studied were introduced by Takefuji [7] and make use of both local and global operators. The functionality of these operators with regard to the quality of the solutions was examined through software simulations. Fully digital neurocomputers such as the BSP400 offer sufficient flexibility for programming local operations on node level. The possibilities of translating the function of global operators into local operations were also studied. The aim of this study is to combine massively parallel hardware with synchronous neural network models for optimization problems in order to get both high speed and high quality of the solutions. E-mail comments to HMSKERK at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- File: pictwrd.ps.Z Title: A Connectionist Model for Context Effects in the Picture-Word Interference Task Authors: Peter A. Starreveld and Jan N. H. Heemskerk Abstract: In the picture-word interference task, two context effects can be distinguished: the semantic interference effect and the orthographic facilitation effect. A theory is described to explain these effects. This theory was implemented in a connectionist model. The model is able to simulate the time courses of the two context effects and their interaction. E-mail comments to STARREVE at rulfsw.LeidenUniv.nl --------------------------------------------------------------------------- To obtain the file use the following command sequence: ftp ftp.mrc-apu.cam.ac.uk NAME: anonymous PASSWORD: cd pub cd nn bin get quit decompress lpr -P ---------------------------------------------------------------------------