From peter.hansen at physiol.ox.ac.uk Thu Jun 4 10:59:35 1998 From: peter.hansen at physiol.ox.ac.uk (Peter Hansen) Date: Thu, 4 Jun 1998 15:59:35 +0100 (BST) Subject: Job Openings; University of Oxford Message-ID: +-------------------------------------------+ | UNIVERSITY OF OXFORD | | CENTRE FOR COGNITIVE NEUROSCIENCE | +-------------------------------------------+ The Centre, which is funded by grants from the Medical Research Council and the McDonnell-Pew Program, supports collaborative, interdisciplinary research on many aspects of brain function relevant to human cognition, in several departments at Oxford. Computational Neuroscientist (Ref IRC2) This appointment, for computational approaches to cognitive function, will probably be made for three years in the first instance, on the RSII scale (21,016 - 27,935 UKP), or on the RS1A scale (15,159 - 22,785 UKP), depending on experience and specified duties. The post holder should have a PhD or equivalent experience, evidence of a capacity for independent research, and expertise in the mathematical analysis of neuronal networks and/or other computational approaches to neuroscience. S/he will work with neuroscientists to develop biologically constrained models of cortical function. S/he will probably have the opportunity to supervise graduate students and to participate in organising seminars and workshops. Computer Officer (Ref McDPCO) Applications are invited from individuals, with extensive experience of modern computing techniques, for the above post, for three years in the first instance, full-time (or for a longer period part-time, pro rata), on the RS1A scale (15,159 - 22,785 UKP) or RSII scale (21,016 - 27,935 UKP), depending on experience and specified duties. The person appointed to this post will carry the main responsibility for supporting computing-related research activities in the Centre. The present facilities include a network of Unix workstations (Sun/SGI) and PCs using MS Windows 95/NT. The work will involve varied project-based programming using C or C++, general computing support, training of research staff and liaising with existing computer staff. Knowledge of experimental control programs or system administration (Unix/PC) would be an advantage. Further information is available from the Centre Web site (http://www.physiol.ox.ac.uk/mcdp/jobs) or from the Administrative Secretary, Harriet Fishman, University Laboratory of Physiology, Parks Road, Oxford OX1 3PT. Tel: 01865-272497; Fax: 01865-272488. Applicants should write, quoting the reference number of the post and enclosing a full curriculum vitae together with names and addresses of two referees, to the Assistant Administrator, University Laboratory of Physiology, Parks Road, Oxford OX1 3PT. Closing date is 10 July 1998. The University is an equal opportunity employer. From hali at theophys.kth.se Thu Jun 4 18:41:23 1998 From: hali at theophys.kth.se (Hans Liljenstrm) Date: Fri, 05 Jun 1998 00:41:23 +0200 Subject: 1998 Sigtuna Workshop on Fluctuations Message-ID: <35772293.841A08E6@theophys.kth.se> 2nd announcement and call for participation Third Sigtuna Workshop Random Events in Biological Systems Sigtuna, Sweden 3-5 Sep 1998 organized by the Agora for Biosystems Objectives In this meeting we wish to address questions concerning various forms of fluctuations and disorder in biological systems. 
By bringing together experimentalists and theoreticians with knowledge and insights from different disciplines, such as biology, physics, and computer science, we hope to shed more light on problems which we think are profound for understanding the phenomenon of life. Topics will include synchronization, oscillations, chaos, noise, and stochastic resonance in e.g. the origin and evolution of life, biomolecular kinetics, neural information processing, and organ system functioning. Both experimental data and theory from the frontiers of science will be discussed. A number of invited speakers will provide presentations on the fundamental problems, but we invite further contributions, in the form of short lectures, computer demonstrations and posters. In order to maintain a close contact between all participants, and to provide an efficient workshop atmosphere, the number of participants will be limited to approximately forty people. The location of the workshop is at a unique guest house in Sigtuna, a royal town in the early Middle Ages. Situated at the shore of the beautiful lake Mälaren, Sigtuna is only 15 km away from the Stockholm airport and 45 km from downtown Stockholm. It is also close to the city of Uppsala. The total cost, including accommodation, all meals and the registration fee, is 3000 SEK (approx 375 USD). Call for submissions: A small number of contributed talks will be organized. Interested participants are asked to submit by email a title and abstract to any of the organizers, by July 15, 1998. Organizing committee: Hans Liljenström, Dept. of Physics, Royal Institute of Technology, Stockholm Peter Århem, Nobel Institute for Neurophysiology, Karolinska Institutet, Stockholm Clas Blomberg, Dept. of Physics, Royal Institute of Technology, Stockholm (All also affiliated with the Agora for Biosystems) Confirmed invited speakers: Agnes Babloyantz, Dept of Chemical Physics, Free University of Brussels, Belgium Hans Braun, Institute of Physiology, University of Marburg, Germany Jarl-Thure Eriksson, Laboratory of Electricity, Technical University of Tampere, Finland Hans Frauenfelder, Los Alamos National Laboratory, New Mexico, USA Hermann Haken, Institut für Theoret. Physik und Synergetik, Universität Stuttgart, Germany John Hertz, Nordita, Copenhagen, Denmark Amit Manwani, Computation and Neural Systems Program, Caltech, Pasadena, USA Michael Mackey, Dept. of Physiology, McGill University, Montreal, Canada Frank Moss, Dept. of Physics, University of Missouri, St Louis, USA Erik Mosekilde, Dept of Physics, Technical University of Denmark, Lyngby Sakire Pögun, Center for Brain Research, Ege University, Turkey Jörg Stucki, Dept. of Pharmacology, University of Bern, Switzerland Eörs Szathmary, Collegium Budapest, Hungary Peter Wolynes, Dept. of Chemistry, Univ. of Indiana, Urbana For further information, please see our web site, http://www.theophys.kth.se/~hali/agora/sigtuna98, or contact Hans Liljenström Theoretical Biophysics Group Dept. of Physics Royal Institute of Technology S-100 44 Stockholm, SWEDEN Email: hali at theophys.kth.se Phone: +46-(0)8-790 9423 Fax: +46-(0)8-10 48 79 If you are interested in participating in this workshop, please fill in and return the pre-registration form below.
1998 Sigtuna Workshop on RANDOM EVENTS IN BIOLOGICAL SYSTEMS Pre-Registration Name:___________________________________________________________ Address:_________________________________________________________ __________________________________________________________ Student: Yes No Willing to contribute with a presentation: Yes No Presentation preference: Oral Poster Preliminary title/subject:____________________________________________ ________________________________________________________________ DEADLINE FOR SUBMISSIONS IS JULY 15, 1998 - please post - From dld at cs.monash.edu.au Fri Jun 5 01:47:24 1998 From: dld at cs.monash.edu.au (David L Dowe) Date: Fri, 5 Jun 1998 15:47:24 +1000 Subject: CFPs: Info theory in biology, due July 13 Message-ID: <199806050547.PAA06363@dec11.cs.monash.edu.au> Information-theoretic approaches to biology ------------------------------------------- This is the Call For Papers for the 4th Pacific Symposium on BioComputing (PSB99, 1999) conference track on "Information-theoretic approaches to biology". PSB-99 will be held from 4-9 January, 1999, in Mauna Lani on the Big Island of Hawaii. Track Organisers: David L. Dowe (dld at cs.monash.edu.au) and Klaus Prank. WWW site: http://www.cs.monash.edu.au/~dld/PSB99/PSB99.Info.CFPs.html . Specific technical area to be covered by this track: Approaches to biological problems using notions of information or complexity, including methods such as Algorithmic Probability, Minimum Message Length and Minimum Description Length. Two possible applications are (e.g.) protein folding and biological information processing. Kolmogorov (1965) and Chaitin (1966) studied the notions of complexity and randomness, with Solomonoff (1964), Wallace (1968) and Rissanen (1978) applying these to problems of statistical and inferential learning (and ``data mining'') and to prediction. The methods of Solomonoff, Wallace and Rissanen have respectively come to be known as Algorithmic Probability (ALP), Minimum Message Length (MML) and Minimum Description Length (MDL). All of these methods relate to information theory and can be thought of both in terms of Shannon's information theory and in terms of Boltzmann's thermodynamic entropy. An MDL/MML perspective has been suggested by a number of authors in the context of approximating unknown functions with some parametric approximation scheme (such as a neural network). The designated measure to optimize under this scheme combines an estimate of the cost of misfit with an estimate of the cost of describing the parametric approximation (Akaike 1973, Rissanen 1978, Barron and Barron 1988, Wallace and Boulton 1968). This track invites all original papers of a biological nature which use notions of information and/or information-theoretic complexity, with no strong preference as to the specific application area. Such work has been done in problems of, e.g., protein folding and DNA string alignment. As we shortly describe in some detail, such work has also been done in the analysis of temporal dynamics in biology such as neural spike trains and endocrine (hormonal) time series analysis using the MDL principle in the context of neural networks and context-free grammar complexity.
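For illustration only, here is a minimal numerical sketch of the two-part trade-off described above (the cost of misfit plus the cost of describing the fitted model), using the coarse BIC-style surrogate for a two-part code rather than the exact ALP/MML/MDL coding schemes; the toy data and the function name two_part_score are invented for this example.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: a cubic signal observed with Gaussian noise.
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 - 2.0 * x + 0.5 * x**3 + rng.normal(0.0, 0.2, x.size)

def two_part_score(x, y, degree):
    # Crude two-part cost (in nats): misfit cost + model-description cost.
    # This is a coarse surrogate, not an exact ALP/MML/MDL code.
    n = x.size
    k = degree + 1                       # number of fitted coefficients
    coeffs = np.polyfit(x, y, degree)    # least-squares fit
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    misfit_cost = 0.5 * n * np.log(rss / n)   # ~ negative log-likelihood at the fit
    model_cost = 0.5 * k * np.log(n)          # ~ cost of stating k parameters
    return misfit_cost + model_cost

scores = {d: two_part_score(x, y, d) for d in range(9)}
print("selected degree:", min(scores, key=scores.get))

Minimizing this combined score trades data fit against model complexity, which is the intuition behind the MDL/MML perspective mentioned above.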
To elaborate on one of the relevant topics above, in the last three years or so, there has been a major focus on the aspect of timing in biological information processing ranging from fields such as neuroscience to endocrinology. The latest work on information processing at the single-cell level using computational as well as experimental approaches reveals previously unimagined complexity and dynamism. Timing in biological information processing on the single-cell level as well as on the systems level has been studied by signal-processing and information-theoretic approaches, in particular in the field of neuroscience (see for an overview: Rieke et al. 1996). Using such approaches to the understanding of temporal complexity in biological information transfer, the maximum information rates and the precision of spike timing could be revealed by computational methods (Mainen and Sejnowski, 1995; Gabbiani and Koch 1996; Gabbiani et al., 1996). The examples given above illustrate some possible biological application domains. We invite and solicit papers in all areas of (computational) biology which make use of ALP, MDL, MML and/or other notions of information and information-theoretic complexity. In problems of prediction, as well as using "yes"/"no" predictions, we would encourage the authors to consider also using probabilistic prediction, where the score assigned to a probabilistic prediction is given according to the negative logarithm of the stated probability of the event. Further comments re PSB-99 : ---------------------------- PSB99 will publish accepted full papers in an archival Proceedings. All contributed papers will be rigorously peer-reviewed by at least three referees. Each accepted full paper will be allocated up to 12 pages in the conference Proceedings. The best papers will be selected for a 30-minute oral presentation to the full assembled conference. Accepted poster abstracts will be distributed at the conference separately from the archival Proceedings. To be eligible for proceedings publication, each full paper must be accompanied by a cover letter stating that it contains original unpublished results not currently under consideration elsewhere. See http://www.cgl.ucsf.edu/psb/cfp.html for more information. IMPORTANT DATES: Full paper submissions due: July 13, 1998 Poster abstracts due: August 22, 1998 Notification of paper acceptance: September 22, 1998 Camera-ready copy due: October 1, 1998 Conference: January 4 - 9, 1999 More information about the "Information-theoretic approaches to biology" track, including a sample list of relevant papers, is available on the WWW at http://www.cs.monash.edu.au/~dld/PSB99/PSB99.Info.CFPs.html . More information about PSB99 is available from http://www.cgl.ucsf.edu/psb/cfp.html For further information, e-mail Dr. David Dowe, dld at cs.monash.edu.au or e-mail Dr. Klaus Prank, ndxdpran at rrzn-serv.de . This page was put together by Dr. David Dowe, School of Computer Science and Softw. Eng., Monash University, Clayton, Vic. 3168, Australia e-mail: dld at cs.monash.edu.au Fax: +61 3 9905-5146 http://www.csse.monash.edu.au/~dld/ and Dr. Klaus Prank, Abteilung Klinische Endokrinologie Medizinische Hochschule Hannover Carl-Neuberg-Str.
1 D-30623 Hannover Germany e-mail: ndxdpran at rrzn-serv.de Tel.: +49 (511) 532-3827 Fax.: +49 (511) 532-3825 http://sun1.rrzn-user.uni-hannover.de/~ndxdpran/ From harnad at coglit.soton.ac.uk Fri Jun 5 14:37:49 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Fri, 5 Jun 1998 19:37:49 +0100 (BST) Subject: Pylyshyn on Vision & Cognition: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article on: IS VISION CONTINUOUS WITH COGNITION? THE CASE FOR COGNITIVE IMPENETRABILITY OF VISUAL PERCEPTION by Zenon Pylyshyn This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences Department of Psychology University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. ____________________________________________________________________ IS VISION CONTINUOUS WITH COGNITION? THE CASE FOR COGNITIVE IMPENETRABILITY OF VISUAL PERCEPTION Zenon Pylyshyn Rutgers Center for Cognitive Science Rutgers University Psychology Addition, Busch Campus, New Brunswick, NJ 08903 zenon at ruccs.rutgers.edu KEYWORDS: visual processing, modularity, cognitive penetrability, early vision, context effects, top-down processes, signal detection theory, attention, expert perception, perceptual learning, knowledge-based vision, visual agnosia, categorical perception. ABSTRACT: Although the study of visual perception has made more progress in the past 40 years than any other area of cognitive science, there remain major disagreements as to how closely vision is tied to cognition. This paper sets out some of the arguments for both sides (arguments from computer vision, neuroscience, psychophysics, perceptual learning and other areas of vision science) and defends the position that an important part of visual perception, corresponding to what some people have called early vision, is prohibited from accessing relevant expectations, knowledge and utilities in determining the function it computes - in other words it is cognitively impenetrable. That part of vision is complex and involves top-down interactions that are internal to the early vision system.
Its function is to provide a structured representation of the 3-D surfaces of objects sufficient to serve as an index into memory, with somewhat different outputs being made available to other systems such as those dealing with motor control. The paper also addresses certain conceptual and methodological issues raised by this claim, including the use of signal detection theory and event-related potentials to assess cognitive penetration of vision. A distinction is made among several stages in visual processing. These include, in addition to the inflexible early-vision stage, a pre-perceptual attention-allocation stage and a post-perceptual evaluation, selection, and inference stage which accesses long-term memory. These two stages provide the primary ways in which cognition can affect the outcome of visual perception. The paper discusses arguments that have been presented in both computer vision and psychology showing that vision is "intelligent" and involves elements of "problem solving". It is suggested that the cases of apparently intelligent interpretation that are sometimes cited in support of this claim do not show cognitive penetration, but rather they show that certain natural constraints on interpretation, concerned primarily with optical and geometrical properties of the world, have been compiled into the visual system. The paper also examines a number of examples where instructions and "hints" are alleged to affect what is seen. In each case it is concluded that the evidence is more readily assimilated to the view that when cognitive effects are found, they have a locus outside early vision, in such processes as the allocation of focal attention and identification of the stimulus. -------------------------------------------------------------- To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp or gopher from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.pylyshyn.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.pylyshyn ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.pylyshyn gopher://gopher.princeton.edu:70/11/.libraries/.pujournals To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.pylyshyn When you have the file(s) you want, type: quit From terry at salk.edu Mon Jun 8 18:27:36 1998 From: terry at salk.edu (Terry Sejnowski) Date: Mon, 8 Jun 1998 15:27:36 -0700 (PDT) Subject: NEURAL COMPUTATION 10:5 Message-ID: <199806082227.PAA05690@helmholtz.salk.edu> Neural Computation - Contents Volume 10, Number 5 - July 1, 1998 ARTICLE Dynamics of Membrane Excitability Determine Inter-Spike Interval Variability: A Link Between Spike Generation Mechanisms and Cortical Spike Train Statistics Boris S. Gutkin and G. 
Bard Ermentrout NOTE Correction To Proof That Recurrent Neural Networks Can Robustly Recognize Only Regular Languages Michael Casey LETTERS On the Effect of Analog Noise in Discrete-Time Analog Computations Wolfgang Maass and Pekka Orponen Category Learning Through Multi-Modality Sensing Virginia R. de Sa and Dana H. Ballard A Hierarchical Model of Binocular Rivalry Peter Dayan Efficient Learning in Boltzmann Machines Using Linear Response Theory H. J. Kappen and F. B. Rodriguez A Learning Theorem for Networks at Detailed Stochastic Equilibrium Javier R. Movellan Asymmetric Dynamics in Optimal Variance Adaptation Michael DeWeese and Anthony Zador Computation with Infinite Neural Networks Christopher K. I. Williams Bayesian Radial Basis Functions of Variable Dimension C. C. Holmes and B. K. Mallick Absence of Cycles in Symmetric Neural Networks Xin Wang, Arun Jagota, Fernanda Botelho, and Max Garzon Pattern Generation by Two Coupled Time-Discrete Neural Networks with Synaptic Depression W. Senn, Th. Wannier, J. Kleinle, H.-R. Luscher, L. Muller, J. Streit, and K. Wyler Computational Studies of Lateralization of Phoneme Sequence Generation James A. Reggia, Sharon Goodall, and Yuri Shkuro Nonlinear Component Analysis as a Kernel Eigenvalue Problem Bernhard Scholkopf, Alexander Smola, and Klaus-Robert Muller ----- ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 1998 - VOLUME 10 - 8 ISSUES
                   USA      Canada*    Other Countries
Student/Retired    $50      $53.50     $78
Individual         $82      $87.74     $110
Institution        $285     $304.95    $318
* includes 7% GST (Back issues from Volumes 1-9 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From S.Holden at cs.ucl.ac.uk Tue Jun 9 09:42:27 1998 From: S.Holden at cs.ucl.ac.uk (Sean Holden) Date: Tue, 09 Jun 1998 14:42:27 +0100 Subject: Special issue on generalization Message-ID: <1057.897399747@cs.ucl.ac.uk> Readers of this mailing list may be interested to know that the March 1998 issue of the journal STATISTICS AND COMPUTING, published by Chapman & Hall, is a special issue on the subject of "generalization". Full details can be found at http://statsandcomp.thomsonscience.com, and the list of contents follows. Best wishes, Sean Holden. Guest Editor.
M. Anthony, "Probabilistic 'generalization' of functions and dimension-based uniform convergence results"
D. J. C. MacKay and R. Takeuchi, "Interpolation models with multiple hyperparameters"
R. Tibshirani and G. Hinton, "Coaching variables for regression and classification"
D. Wolpert, E. Knill and T. Grossman, "Some results concerning off-training-set and IID error for the Gibbs and Bayes Optimal Generalizers"
C. W. H. Mace and A. C. C. Coolen, "Statistical mechanical analysis of the dynamics of learning in perceptrons"
From Dave_Touretzky at skinner.boltz.cs.cmu.edu Mon Jun 8 14:18:46 1998 From: Dave_Touretzky at skinner.boltz.cs.cmu.edu (Dave Touretzky) Date: Mon, 08 Jun 1998 14:18:46 -0400 Subject: test posting Message-ID: <16336.897329926@skinner.boltz.cs.cmu.edu> ------- Blind-Carbon-Copy From Dave_Touretzky at cs.cmu.edu Mon Jun 8 14:18:46 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Mon, 08 Jun 1998 14:18:46 -0400 Subject: test posting Message-ID: <16336.897329926@skinner.boltz.cs.cmu.edu> The Connectionists list has been experiencing some technical problems in the past week. Please excuse this test posting. - -- Dave Touretzky, CONNECTIONISTS moderator ------- End of Blind-Carbon-Copy From tp at ai.mit.edu Thu Jun 4 11:27:57 1998 From: tp at ai.mit.edu (Tomaso Poggio) Date: Thu, 04 Jun 1998 11:27:57 -0400 Subject: Computational Position Message-ID: <3.0.5.32.19980604112757.00c57ea0@ai.mit.edu> MASSACHUSETTS INSTITUTE OF TECHNOLOGY DEPARTMENT OF BRAIN SCIENCES The MIT Department of Brain Sciences anticipates making another tenure-track appointment in computational brain and cognitive science at the Assistant Professor level. Candidates should have a strong mathematical background and an active research interest in the mathematical modeling of specific biophysical, neural or cognitive phenomena. Individuals whose research focuses on learning and memory at the level of neurons and networks of neurons are especially encouraged to apply. Responsibilities include graduate and undergraduate teaching and research supervision. Applications should include a brief cover letter stating the candidate's research and teaching interests, a vita, three letters of recommendation and representative reprints. Qualified individuals should send their dossiers by October 21, 1998 to: Chair, Faculty Search Committee/Computational Neuroscience Department of Brain & Cognitive Sciences, E25-406 MIT 77 Massachusetts Avenue Cambridge, MA 02139-4307 Previous applicants need not resubmit their dossiers. MIT is an Affirmative Action/Equal Opportunity Employer. Qualified women and minority candidates are encouraged to apply. Tomaso Poggio Uncas and Helen Whitaker Professor Brain Sciences Department and A.I. Lab M.I.T., E25-218, 45 Carleton St Cambridge, MA 02142 E-mail: tp at ai.mit.edu Web: Phone: 617-253-5230 Fax: 617-253-2964 From singer at research.att.com Mon Jun 8 17:31:39 1998 From: singer at research.att.com (Yoram Singer) Date: Mon, 8 Jun 1998 17:31:39 -0400 (EDT) Subject: new and improved family of boosting algorithms Message-ID: <199806082131.RAA10700@allegro.research.att.com> The following papers introduce, analyze, and describe applications of a new and improved family of boosting algorithms. The papers are available from: http://www.research.att.com/~schapire/boost.html and http://www.research.att.com/~singer/pub.html Questions and comments are welcome. - Rob Schapire and Yoram Singer {schapire,singer}@research.att.com ----------------------------------------------------------------------------- Improved Boosting Algorithms Using Confidence-rated Predictions Robert E. Schapire and Yoram Singer We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions.
We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour. We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two boosting methods for this problem. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested by Freund and Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper. ----------------------------------------------------------------------------- BoosTexter: A System for Multiclass Multi-label Text Categorization Robert E. Schapire and Yoram Singer This work focuses on algorithms which learn from examples to perform multiclass text and speech categorization tasks. We first show how to extend the standard notion of classification by allowing each instance to be associated with multiple labels. We then discuss our approach for multiclass multi-label text categorization which is based on a new and improved family of boosting algorithms. We describe in detail an implementation, called BoosTexter, of the new boosting algorithms for text categorization tasks. We present results comparing the performance of BoosTexter and a number of other text-categorization algorithms on a variety of tasks. We conclude by describing the application of our system to automatic call-type identification from unconstrained spoken customer responses. ----------------------------------------------------------------------------- An Efficient Boosting Algorithm for Combining Preferences Yoav Freund, Raj Iyer, Robert E. Schapire, Yoram Singer The problem of combining preferences arises in several applications, such as combining the results of different search engines. This work describes an efficient algorithm for combining multiple preferences. We first give a formal framework for the problem. We then describe and analyze a new boosting algorithm for combining preferences called RankBoost. We also describe an efficient implementation of the algorithm for a restricted case. We discuss two experiments we carried out to assess the performance of RankBoost. In the first experiment, we used the algorithm to combine different WWW search strategies, each of which is a query expansion for a given domain. For this task, we compare the performance of RankBoost to the individual search strategies. The second experiment is a collaborative-filtering task for making movie recommendations. Here, we present results comparing RankBoost to nearest-neighbor and regression algorithms. From giro at open.brain.riken.go.jp Mon Jun 8 01:42:19 1998 From: giro at open.brain.riken.go.jp (Dr. 
Mark Girolami) Date: Mon, 08 Jun 1998 14:42:19 +0900 Subject: PhD Research Studentships Message-ID: <357B79BB.FF6@open.brain.riken.go.jp> UNIVERSITY OF PAISLEY DEPARTMENT OF COMPUTING AND INFORMATION SYSTEMS PhD Studentships 'Applying Artificial Neural Networks in Non-invasive Direct Depth-Of-Anaesthesia Monitoring' Applications are invited for PhD research studentships to participate in a three year funded project on applying artificial neural networks and advanced signal processing techniques to non-invasive direct depth-of-anaesthesia monitoring. This project is being carried out in collaboration with the Department of Anaesthesia, Glasgow Western Infirmary. Suitable candidates will also have the opportunity of carrying out periods of research in collaborating laboratories based in Japan and the USA. Despite much research, directly monitoring the depth-of-anaesthesia has not yet found a place in routine anaesthetic practice. Two technologies form the basis of current monitors: auditory evoked potentials (AEPs) and bispectral index (BIS). AEPs involve the analysis of electrical signals produced by the auditory cortex in response to a pattern of clicks through a pair of headphones. BIS is a method which employs higher order statistics in processing brain EEG data which results in a single figure measure of depth of anaesthesia. Recording EEG and AEP data are now established and reliable techniques. The extraction of anaesthesia related data from the resultant stream of information holds the key to further advances in directly quantifying depth-of-anaesthesia. The project aims to develop and assess the use of artificial neural network (ANN) techniques in processing and analysing EEG and AEP data with the specific aim of improving the sensitivity and specificity of direct depth-of-anaesthesia monitoring. Applicants should have at least an upper second class degree in one of these disciplines: electronic engineering, mathematics, physics or computer science. Knowledge of signal processing or neural networks would be desirable. Applications in the form of a CV and names and addresses of three referees should be sent, as soon as possible and at the latest by 30th July 1998, to Dr. Mark Girolami, Computational Intelligence Research Unit, Department of Computing and Information Systems, University of Paisley, High Street, Paisley, PA1 2BE, Scotland, UK. Informal inquiries can be made direct to Dr. Mark Girolami giro at open.brain.riken.go.jp Or giro0ci at paisley.ac.uk -- ---------------------------------------------- Dr. 
Mark Girolami (TM) RIKEN, Brain Science Institute Laboratory for Open Information Systems 2-1 Hirosawa, Wako-shi, Saitama 351-01, Japan Email: giro at open.brain.riken.go.jp Tel: +81 48 467 9666 Tel: +81 48 462 3769 (apartment) Fax: +81 48 467 9694 --------------------------------------------- Currently on Secondment From: Department of Computing and Information Systems University of Paisley High Street, PA1 2BE Scotland, UK Email: giro0ci at paisley.ac.uk Tel: +44 141 848 3963 Fax: +44 141 848 3542 Secretary: Mrs E Campbell Tel: +44 141 848 3966 --------------------------------------------- From mac+ at andrew.cmu.edu Mon Jun 1 13:57:35 1998 From: mac+ at andrew.cmu.edu (Mary Anne Cowden) Date: Mon, 1 Jun 1998 13:57:35 -0400 (EDT) Subject: Carnegie Symposium on Mechanisms of Cognitive Development, Oct 9-11, 1998 Message-ID: =============================================================== CALL FOR PARTICIPATION The 29th Carnegie Symposium on Cognition Mechanisms of Cognitive Development: Behavioral and Neural Perspectives October 9 - 11, 1998 James L. McClelland and Robert S. Siegler, Organizers ---------------------------------------------------------------------------- The 29th Carnegie Symposium on Cognition is sponsored by the Department of Psychology and the Center for the Neural Basis of Cognition. The symposium is supported by the National Science Foundation, the National Institute of Mental Health, and the National Institute of Child Health and Human Development. ---------------------------------------------------------------------------- This post contains the following entries relevant to the symposium: * Overview * Schedule of Events * Attending the Symposium * Travel Fellowships ---------------------------------------------------------------------------- Overview This symposium will consider how children's thinking evolves during development, with a focus on the role of experience in causing change. Speakers will examine the processes by which children learn and those that make children ready and able to learn at particular points in development, using both behavioral and neural approaches. Behavioral approaches will include research on the 'microgenesis' of cognitive change over short time periods (e.g., several hour-long sessions) in specific task situations. Research on cognitive change over longer time scales (months and years) will also be presented, as will research that uses computational modeling and dynamical systems approaches to understand learning and development. Neural approaches will include the study of how neuronal activity and connectivity change during acquisition of cognitive skills in children and adults. Other studies will consider the possible emergence of cognitive abilities through the maturation of brain structures and the effects of experience on the organization of functions in the brain. Developmental anomalies such as autism and attention deficit disorder will also be examined, as windows on normal development. Four questions will be examined throughout the symposium: 1) Why do cognitive abilities emerge when they do during development? 2) What are the sources of developmental and individual differences, and of developmental anomalies in learning? 3) What happens in the brain when people learn? 4) How can experiences be ordered and timed so as to optimize learning? The answers to these questions have strong implications for how we educate children and remediate deficits that impede development of thinking abilities.
These implications will be explored in discussions among the participants. ---------------------------------------------------------------------------- The 29th Carnegie Symposium on Cognition: Schedule ---------------------------------------------------------------------------- Friday, October 9th: Studies of the Microgenesis of Cognitive Development 8:30 - 9:00 Continental Breakfast 9:00 Welcome BEHAVIORAL APPROACHES 9:20 Susan Goldin-Meadow, University of Chicago Giving the mind a hand: The role of gesture in cognitive change 10:20 Break 10:40 Robert Siegler, Carnegie Mellon University Microgenetic studies of learning in children and in brain-damaged adults 11:40 Lunch NEUROSCIENCE APPROACHES 1:00 Michael Merzenich, University of California, San Francisco Cortical plasticity phenomenology and mechanisms: Implications for neurorehabilitation 2:00 James L. McClelland, Carnegie Mellon University/CNBC Revisiting the critical period: Interventions that enhance adaptation to non-native phonological contrasts in Japanese adults 3:00 Break 3:20 Richard Haier, University of California, Irvine PET studies of learning and individual differences 4:20 Discussant: James Stigler, UCLA Saturday, October 10th: Studies of Change Over Long Time Scales 8:30 - 9:00 Continental Breakfast BEHAVIORAL APPROACHES 9:00 Esther Thelen, Indiana University Dynamic mechanisms of change in early perceptual motor development 10:00 Robbie Case, University of Toronto Differentiation and integration as the mechanisms in cognitive and neurological development 11:00 Break 11:20 Deanna Kuhn, Teacher's College, Columbia University Why development does (and doesn't) occur: Evidence from the domain of inductive reasoning 12:20 Lunch NEUROSCIENCE APPROACHES 2:00 Mark Johnson, Birkbeck College/University College London Cortical specialization for cognitive functions 3:00 Helen Neville, University of Oregon Specificity and plasticity in human brain development 4:00 Break 4:20 Discussant: David Klahr, Carnegie Mellon University Sunday, October 11th: Developmental Disorders 8:30 - 9:00 Continental Breakfast DYSLEXIA 9:00 Albert Galaburda, Harvard Medical School Toxicity of neural plasticity as seen through a model of learning disability AUTISM 10:00 Patricia Carpenter, Marcel Just, Carnegie Mellon University Cognitive load distribution in normal and autistic individuals 11:00 Break ATTENTION DEFICIT DISORDER 11:20 B. J. Casey, University of Pittsburgh Medical Center Disruption and inhibitory control in developmental disorders: A mechanistic model of implicated frontostriatal circuitry 12:20 Concluding discussant: Michael I. Posner, University of Oregon ---------------------------------------------------------------------------- Attending the Symposium Sessions on Friday, October 9 will be held in McConomy Auditorium, University Center, Carnegie Mellon. Sessions on Saturday, October 10 and Sunday, October 11 will be held in the Adamson Wing, Room 135 Baker Hall. Admission is free, and everyone is welcome to attend. Out of town visitors can contact Mary Anne Cowden, (412) 268-3151, mac+ at cmu.edu, for additional information. Travel Fellowships Fellowships are available for junior scientists for travel and lodging expenses associated with attending the symposium. Interested applicants should send a brief statement of interest, a curriculum vitae, and one letter of recommendation by August 15, 1998 to Mary Anne Cowden, Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213. 
--------------------------------------------------------------------------- This material is based on the symposium web-page: http://www.cnbc.cmu.edu/carnegie-symposium ---------------------------------------------------------------------------- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Mary Anne Cowden, Administrative Coord. Psychology Dept, Carnegie Mellon University Phone: 412/268-3151 Fax: 412/268-3464 http://www.contrib.andrew.cmu.edu/~mac/ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From mjjs at eng.cam.ac.uk Tue Jun 2 07:22:37 1998 From: mjjs at eng.cam.ac.uk (M.J.J. Scott) Date: Tue, 02 Jun 1998 12:22:37 +0100 Subject: Technical report available Message-ID: <3573E07D.796@eng.cam.ac.uk> The following technical report is available by anonymous ftp from the archive of the Speech, Vision and Robotics Group at the Cambridge University Engineering Department. The authors would welcome comments on this report. Parcel: feature subset selection in variable cost domains M.J.J. Scott, M. Niranjan, R.W. Prager. Technical Report CUED/F-INFENG/TR.323 Cambridge University Engineering Department Trumpington Street Cambridge CB2 1PZ England Abstract The vast majority of classification systems are designed with a single set of features, and optimised to a single specified cost. However, in examples such as medical and financial risk modelling, costs are known to vary subsequent to system design. In this paper, we present a design method for feature selection in the presence of varying costs. Starting from the Wilcoxon nonparametric statistic for the performance of a classification system, we introduce a concept called the maximum realisable receiver operating characteristic (MRROC), and prove a related theorem. A novel criterion for feature selection, based on the area under the MRROC curve, is then introduced. This leads to a framework which we call Parcel. This has the flexibility to use different combinations of features at different operating points on the resulting MRROC curve. Empirical support for each stage in our approach is provided by experiments on real world problems, with Parcel achieving superior results. ************************ How to obtain a copy ************************ a) http://svr-www.eng.cam.ac.uk/reports/abstracts/Scott_tr323.html b) Via FTP: unix> ftp svr-ftp.eng.cam.ac.uk Name: anonymous Password: (type your email address) ftp> cd reports ftp> binary ftp> get Scott_tr323.ps.gz ftp> quit unix> gunzip Scott_tr323.ps.gz unix> lpr Scott_tr323.ps (or however you print PostScript) c) Via postal mail: Request a hardcopy from Martin J.J. Scott, Cambridge University Engineering Department, Trumpington Street, Cambridge CB2 1PZ, England. or email me: mjjs at eng.cam.ac.uk -- Martin JJ Scott Fallside Lab, Engineering Dept, Trumpington St., Cambridge CB2 1PZ, +(44 1223) 332754 http://svr-www.eng.cam.ac.uk/~mjjs/Personal.html "We have heard the chimes at midnight ..." 
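For readers not familiar with the statistic the report above starts from, the area under an ROC curve equals the Wilcoxon-Mann-Whitney probability that a randomly chosen positive example receives a higher score than a randomly chosen negative one. The following minimal sketch is not taken from the report; the function name auc_wilcoxon and the example scores are invented for illustration.

import numpy as np

def auc_wilcoxon(pos_scores, neg_scores):
    # Area under the ROC curve computed as the Wilcoxon-Mann-Whitney
    # statistic: the fraction of positive/negative pairs ranked correctly,
    # with ties counted as one half.
    pos = np.asarray(pos_scores, dtype=float)
    neg = np.asarray(neg_scores, dtype=float)
    diff = pos[:, None] - neg[None, :]
    wins = (diff > 0).sum() + 0.5 * (diff == 0).sum()
    return wins / (pos.size * neg.size)

# Invented classifier scores for five positive and five negative cases.
print(auc_wilcoxon([0.9, 0.8, 0.7, 0.6, 0.55],
                   [0.5, 0.4, 0.65, 0.3, 0.2]))   # prints 0.92

A criterion based on the area under the (MR)ROC curve, as in the report, is independent of any single choice of misclassification costs, which is what makes it attractive in variable-cost domains.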
From derrabi at fin.ucl.ac.be Tue Jun 2 09:40:16 1998 From: derrabi at fin.ucl.ac.be (Derrabi Mohamed) Date: Tue, 02 Jun 1998 15:40:16 +0200 Subject: ACSEG - Reminder Message-ID: <1.5.4.32.19980602134016.006a8398@doyens1.iag.ucl.ac.be> CONNECTIONIST APPROACHES IN ECONOMICS AND MANAGEMENT SCIENCES FIFTH INTERNATIONAL MEETING COMPLEX DATA : MODELING AND ANALYSIS LOUVAIN-LA-NEUVE, NOVEMBER 20, 1998 CALL FOR PAPERS - REMINDER ---------------------------------------------------------------------------- ------ Since the beginning of the 80s, important advances have been made in developing diverse new approaches of bio-mimetic inspiration (neural nets, genetic algorithms, cellular automata, ...). These approaches are of prime interest for researchers both in Economics and in Management Sciences. The ACSEG International Meetings give the opportunity to assess the state-of-the-art in the domain, to delineate future developments, and to evidence the contribution of bio-mimetic methods to Economics and Management Sciences. They also allow the researchers to present their recent work, to exchange know-how, and to discuss the problems encountered in their research. The 1998 ACSEG International Meeting on COMPLEX DATA : MODELING AND ANALYSIS will take place at the Universite catholique de Louvain, November 20, 1998. Organizers are the research centers SAMOS (Universite de Paris 1- Pantheon - Sorbonne), CEGF (Universite catholique de Louvain) and CeReFim (Facultes Universitaires Notre-Dame de la Paix). The members of the scientific committee invite you to submit papers in Economics and Management Sciences on the following topics: - simulation of complex processes (non-linear, non-parametric, ...) - new approaches for data analysis - local and global optimization - forecasting (financial series, bankruptcies, consumer behavior, ...) - behavioral modeling - hybrid approaches associating new and classical approaches - numerical evaluation methods (prices of financial assets, ...) If interested, check the conference page http://mkb.fin.ucl.ac.be/Acseg98 or write to: ACSEG98, Centre d'Etudes en Gestion Financière, Institut d'Administration et de Gestion, Universite catholique de Louvain, 1 place des Doyens, 1348 Louvain-la-Neuve - Belgium (Fax. : + (32).10.47.83.24) for additional information. Submission Date : Before June 30, 1998 **************************************************************************** ******* Mohamed DERRABI UCL- Institut d'Administration et de Gestion Unite Finance d'entreprise 1, Place des doyens B-1348 LLN Tel: 010 / 47 84 36 Fax: 010 / 47 83 24 **************************************************************************** ******* From xli at sckcen.be Tue Jun 9 05:32:50 1998 From: xli at sckcen.be (Xiaozhong Li) Date: Tue, 9 Jun 1998 11:32:50 +0200 Subject: Papers related to FLINS are available Message-ID: <2.2.16.19980609113045.0c5f5cf2@mail.sckcen.be> The following papers (1997-1998) related to FLINS are available from the following site: http://www.sckcen.be/people/xli/ Directory: Publications in English Tip: The files are in PostScript format, compressed with WinZip. Comments are welcome. My regards. Xiaozhong Li Xiaozhong Li, Da Ruan Novel Neural Algorithms Based on Fuzzy $\delta$ Rules for Solving Fuzzy Relation Equations: Part I Fuzzy Sets and Systems 90 (1997) 11-23. ABSTRACT Although there are some papers on using neural networks to solve fuzzy relation equations, they have some widespread problems.
For example, the best learning rate cannot be decided easily, and strict theoretical analyses of the convergence of the algorithms are not given due to the complexity of a given system. To overcome these problems, we present some novel neural algorithms in this paper. We first describe such algorithms for max-min operator networks, then we demonstrate that these algorithms can also be extended to max-times operator networks. Important results include some improved fuzzy $\delta$ rules, a convergence theorem and an equivalence theorem which reflects that fuzzy theory and neural networks can reach the same goal by different routes. The fuzzy bidirectional associative memory network and its training algorithms are also discussed. All important theorems are well proved, and a simulation and a comparison with the results of Blanco and Pedrycz are reported. Xiaozhong Li, Da Ruan Fuzzy $\delta$ Rule and Its Simulations in Fuzzy Relation Equations Int. J. of Fuzzy Mathematics. Accepted. ABSTRACT After a short review of our previous work, in this paper we will present a new simplified proof of a lemma which plays an important role in proving the convergence theorem of the fuzzy perceptron. The new proof is much shorter. Moreover, we give some typical simulation results to illustrate the power of the fuzzy $\delta$ rule. Xiaozhong Li, Da Ruan Novel Neural Algorithms Based on Fuzzy $\delta$ Rules for Solving Fuzzy Relation Equations: Part II Fuzzy Sets and Systems, Accepted. ABSTRACT In this paper, we first design a fuzzy neuron which possesses some generality. This fuzzy neuron is constructed by replacing the operators of the traditional neuron with a pair of abstract fuzzy operators as ($\widehat+$, $\widehat\bullet$) which we call fuzzy neuron operators. For example, it may be $(+, \bullet)$, $(\bigwedge,\bullet)$, $(\bigvee,\bullet)$, or $(\bigwedge,\bigwedge)$, etc. It is an extended fuzzy neuron, and a network composed of such neurons is an extended fuzzy neural network. Then we discuss the relationship between the fuzzy neuron operators and $t$-norms and $t$-conorms, and point out that fuzzy neuron operators are based on $t$-norms but are much more general than $t$-norms. In this paper we will focus on a two-layered network and its training algorithm, which will have to satisfy a set of various operators. This work is closely related to solving fuzzy relation equations, so it can be used to solve them. Furthermore, the new fuzzy neural algorithm is found to be stronger than other existing methods to some degree. Some simulation results will be reported in detail. Xiaozhong Li, Da Ruan Novel Neural Algorithms Based on Fuzzy $\delta$ Rules for Solving Fuzzy Relation Equations: Part III Fuzzy Sets and Systems. Accepted. ABSTRACT In our previous work, we proposed a max-min operator network and a series of training algorithms, called fuzzy $\delta$ rules, which could be used to solve fuzzy relation equations. The most basic and important result is the convergence theorem of the fuzzy perceptron based on max-min operators. This convergence theorem has been extended to the max-times operator network in the previous paper. In this paper, we will further extend the fuzzy $\delta$ rule and its convergence theorem to the case of max-* operator networks, in which * is a t-norm. An equivalence theorem points out that the neural algorithm in solving this kind of fuzzy relation equations is equivalent to the fuzzy solving method (non-neural) in \cite{Nol:848,Got:946}. The proof and simulation will be given.
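To make the objects in the abstracts above concrete for readers outside the fuzzy-systems community, the sketch below shows the max-min composition these papers work with, together with the classical Goedel-implication construction of a greatest candidate solution, which appears to be the kind of non-neural "fuzzy solving method" the Part III abstract refers to. The numbers and function names are invented; this is an illustration, not the authors' algorithm.

import numpy as np

def max_min_compose(a, R):
    # Max-min composition: (a o R)_j = max_i min(a_i, R_ij).
    a = np.asarray(a, dtype=float)
    R = np.asarray(R, dtype=float)
    return np.max(np.minimum(a[:, None], R), axis=0)

def goedel_greatest_candidate(a, b):
    # Candidate relation R_ij = a_i -> b_j with the Goedel implication:
    # 1 if a_i <= b_j, otherwise b_j.  The equation a o R = b is solvable
    # exactly when this candidate satisfies it.
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return np.where(a[:, None] <= b[None, :], 1.0, b[None, :])

a = np.array([0.2, 0.9, 0.6])   # invented input membership degrees
b = np.array([0.6, 0.2])        # invented target output
R = goedel_greatest_candidate(a, b)
print(R)
print(max_min_compose(a, R), "should equal", b)

The neural algorithms in these papers train a network whose forward pass is exactly such a max-* composition, so that the learned weights play the role of the unknown relation R.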
Xiaozhong Li, Da Ruan, Arien J. Van der Wal Discussions on Soft Computing at FLINS'96 International Journal of Intelligent Systems, Vol. 13, Nos. 2/3, Feb./Mar. 1998, pp. 287-300. ABSTRACT This is a report on the discussion about soft computing (SC) during FLINS'96. The discussion is based on the 5 questions formulated by X. Li, viz. (1) What is SC? (2) What are the characteristics of SC? (3) What are the principal achievements of SC? (4) What are the typical problems of SC and what are the solutions? and (5) What is the prediction for the future of SC? Before and during FLINS'96, these 5 questions were sent to several well-known specialists for a reply. Among them, Martin Wildberger, Bart Kosko, Bo Yuan, Hideyuki Takagi, Takehisa Onisawa, Germano Resconi, Zhong Zhang and Yasushi Nishiwaki answered these questions with their opinions. By this report we hope to stimulate some further discussion on this topic. Xiaozhong Li, Da Ruan Constructing A Fuzzy Logic Control Demo Model at SCK.CEN Proceedings of the 5th European Congress on Intelligent Techniques and Soft Computing (EUFIT'97), Aachen, Germany, September 8-11, 1997. Vol. 2, pp. 1408-1412. ABSTRACT Based on the background of fuzzy logic control application in nuclear reactors at SCK.CEN, we have made a real fuzzy logic control demo model. The demo model is suitable for us to test and compare our new algorithms of fuzzy control, because it is always difficult and risky to do all experiments in a real nuclear environment. This paper will mainly report the construction of the demo model and its fuzzy logic control system. Although this demo model is specially designed to simulate the working principle of a nuclear reactor, it can also be used as a general object or platform for control experiments. It is much better than an inverted pendulum system, which is often used as a test platform, in imitating the delay of a real complex system. The current fuzzy logic control algorithm in this demo model is a normal algorithm based on the Mamdani model. In our system, triangular-shaped membership functions are used. In order to overcome the well-known dilemma of fast response without overshoot, some parameters, for instance, fuzzy control rules and universes of discourse, must be adjusted. Finally, we have fulfilled this goal; however, it is not easy to choose suitable parameters. This is the real drawback which has slowed down the wide application of fuzzy logic control. Therefore new effective algorithms must be further researched, and it is possible to combine other intelligent technologies, such as neural network learning and genetic algorithm evolution, although much work has already been done. Da Ruan, Xiaozhong Li Fuzzy Logic Control Applications to Belgian Nuclear Reactor 1 (BR1) Computers and Artificial Intelligence, Accepted. ABSTRACT Fuzzy logic applications in the nuclear industry present a tremendous challenge. The main reason for this is the public awareness of the risks of the nuclear industry and the very strict safety regulations in force for nuclear power plants. The very same regulations prevent a researcher from quickly introducing novel fuzzy-logic methods into this field. On the other hand, the application of fuzzy logic has, despite the ominous sound of the word "fuzzy" to nuclear engineers, a number of very desirable advantages over classical methods, e.g., its robustness and the capability to include human experience into the controller.
In this paper we report an on-going R&D project for controlling the power level of the Belgian Nuclear Reactor 1 (BR1) at the Belgian Nuclear Research Centre (SCK.CEN). The project started in 1995 and aims to investigate the added value of fuzzy logic control for nuclear reactors. We first review some relevant literature on fuzzy logic control in nuclear reactors, then present the state-of-the-art of the BR1 project. After experimenting with fuzzy logic control in off-line test cases at the BR1 reactor, we now foresee a new development for a closed-loop fuzzy control as an on-line operation of the BR1 reactor. Finally, we present the new development for a closed-loop fuzzy logic control at BR1 with an understanding of the safety requirements for this real fuzzy logic control application in nuclear reactors. Xiaozhong Li, Da Ruan Comparative Study of Fuzzy Control, PID Control, and Advanced Fuzzy Control for Simulating a Nuclear Reactor Operation Intelligent Systems and Soft Computing for Nuclear Science and Industry, Proceedings of the 3rd International FLINS Workshop, Mol, Belgium, September 14-16, 1998, Eds. Da Ruan, Pierre D'hondt et al., World Scientific Publisher. ABSTRACT Based on the background of fuzzy control applications at the BR1 reactor at SCK.CEN, we have made a real fuzzy logic control demo model. The demo model is suitable for us to test and compare any new algorithms of fuzzy control and intelligent systems, because it is always difficult and time-consuming, due to safety aspects, to do all experiments in a real nuclear environment. In this paper, we first briefly report the construction of the demo model, and then introduce the results of a fuzzy control, a PID control, and an advanced fuzzy control, where the advanced fuzzy control is a fuzzy control with an adaptive function that can self-regulate the fuzzy control rules. Afterwards, we give a comparative study among those three methods. The results have shown that fuzzy control has more advantages in terms of flexibility, robustness, and ease of updating with respect to the PID control of the demo model, but PID control has much higher regulation resolution due to its integration term. The adaptive fuzzy control can dynamically adjust the rule base; it is therefore more robust and better suited to highly uncertain situations.
_____________________________________________________________________
Xiaozhong Li, PhD, currently Young Scientific Researcher
Belgian Nuclear Research Centre (SCK.CEN)
Boeretang 200, B-2400 Mol, Belgium
Phone: (+32-14) 33 22 30 (O); (+32-14) 32 25 52 (H)
Fax: (+32-14) 32 15 29
E-mail: xli at sckcen.be
http://www.sckcen.be/people/xli
From delapaz at dia.uned.es Wed Jun 10 05:02:28 1998 From: delapaz at dia.uned.es (Felix de la Paz Lopez) Date: Wed, 10 Jun 1998 11:02:28 +0200 Subject: IWANN'99 Message-ID: <004d01bd944e$7ff0f660$8df092c1@pc-felix.dia.uned.es> (sorry if you have received this message previously) Call for papers 5TH INTERNATIONAL WORK-CONFERENCE ON ARTIFICIAL AND NATURAL NEURAL NETWORKS Biological and Artificial Computation: Methodologies, Neural Modeling and Bioinspired Applications IWANN'99 Alicante, Spain June 2-4, 1999 http://iwann99.umh.es/ Organized by: Asociación Española de Redes Neuronales (AERN) Universidad Nacional de Educacion a Distancia (UNED) Instituto de Bioingenieria, Universidad Miguel Hernandez (UMH) IN COOPERATION WITH Universidad de Granada Universidad de Malaga Universitat Politecnica de Catalunya Universidad de Las Palmas de Gran Canaria AND IFIP (Working Group in Neural Computer Systems, WG10.6) Spanish RIG IEEE Neural Networks Council UK&RI Communication Chapter of IEEE SCOPE Under the basic idea that living beings and machines can be understood using the same experimental methodology and the same theoretical and formal tools, the interdisciplinary team of the IWANN'99 program committee recognizes the following global goals: I. Developments on Foundations and Methodology. II. From artificial to natural: How can systems theory, electronics and computation (including AI) help in understanding the Nervous System? As a science of analysis, neural computation seeks to help neurology, brain theory, and cognitive psychology in the understanding of the functioning of the Nervous System by means of computational models of neurons, neural nets and subcellular processes. III. From Natural to Artificial: How can the understanding of the Nervous System help us obtain bio-inspired models of artificial neurons, evolutionary architectures, and learning algorithms of value in computation and engineering? As engineering, neural computation seeks to complement the symbolic perspective of Artificial Intelligence (AI), using these biologically inspired models of neurons and nets to solve those non-algorithmic problems of function approximation and pattern classification having to do with changing and only partially known environments. IV. Bio-inspired Technology and Engineering Applications: How can we obtain bio-inspired formulations for sensory coding, perception, memory, decision making, planning, and control? The essential aim of this perspective is to reduce the distance between the biological and artificial perspectives of neural computation. Contributions on the following and related topics are welcome. TOPICS 1. Foundations of Computational Neuroscience: Brain Organization Principles: Communication, control and oscillations, cooperativity, self-organization, and evolution. Convergence between theory and experiments. 2. Neural Modeling: Biophysical and Structural Models: Ionic channels, synaptic level, neurons, circuits and system level.
Functional Models: Analogue, digital, probabilistic, Bayesian, fuzzy and object oriented formulations. Energy related models. Hybrid techniques. 3. Plasticity Phenomena (Maturing, Learning and Memory): Biological mechanisms at the molecular, cellular, network, and behavioural levels. Computable models of adaptation and plasticity. Supervised and non-supervised algorithms. Inductive, deductive and hybrid symbolic-subsymbolic formulations. 4. Complex Systems Dynamics: Optimization, self-organization, cooperative processes, fault-tolerance and self-repair. Genetic algorithms. Simulated evolution. Social organization processes and large scale neural models, non-linear dynamics in biological systems. 5. Artificial Intelligence and Cognitive Neuroscience: Knowledge modeling. Ontologies. Generic tasks of analysis, modification and synthesis. Libraries of problem solving methods and reusable components. Concept formation. Natural language understanding and linguistics. Intentionality and consciousness in autonomous agents. 6. Artificial Neural Nets Simulation, Implementation, and Evaluation: Development environments, formal frames, and simulation languages. Neural models editing tools. Advances in ANN implementation. Evolving hardware. Validation and evaluation criteria. Acceptability and explanatory capacity. 7. Methodology for Nets Design: Data analysis, task identification and recursive hierarchical design in specific domains. Hybrid solutions to hybrid problems. 8. Bio-inspired Systems and Engineering: Signal processing, cochlear systems, auditory processing, retinomorphic systems, other sensory processing systems, neuromorphic communication, neuromorphic learning, neural prosthetic devices. 9. Other Applications: Artificial vision, speech recognition, multisensorial integration, spatio-temporal planning and scheduling, strategies of sensory-motor coordination. Applications of ANNs in vision, real time, control, robotics, economy, industry and medicine. IMPORTANT DATES Second and final call for papers: September 1998 *** Final date for submission: January 15, 1999 *** Acceptance notification: February 15, 1999 Formalization of inscription: March 1, 1999 Contributions must be sent by surface mail to: Prof. Jose Mira-Mira Dpto. Inteligencia Artificial - UNED Senda del Rey s/n. E-28040 MADRID, Spain. Additional Information: http://iwann99.umh.es/ Phone: +34 91-398-7155 FAX: +34 91-398-6697 e-mail: iwann99 at dia.uned.es PAPER SUBMISSION The Programme Committee requests original papers on the topics listed above. Authors are invited to submit five copies of papers, written in English, of up to 10 pages, including figures, tables and references. The format should be A4 or 8 1/2 x 11 inch paper, in a Roman font, 12 points in size, with a printing area of 15.3 x 24.2 cm (6.0 x 9.5 inches). If possible, please make use of the LaTeX/plain TeX style available on our WWW site. In addition, one sheet must be attached including: title, authors' names, a list of five keywords, the topic under which the paper fits best, the preferred presentation (oral or poster) and the corresponding author's information (name, postal and e-mail address, phone and fax number). All received papers will be reviewed by the Programme Committee. Accepted papers may be presented orally or as poster panels; however, all accepted contributions will be published at full length (Springer-Verlag proceedings are expected, as usual).
-------------------------------------------------------------------------- From ngoddard at psc.edu Wed Jun 10 22:45:21 1998 From: ngoddard at psc.edu (Nigel Goddard) Date: Wed, 10 Jun 98 22:45:21 -0400 Subject: Position in Computational Neural Science Message-ID: <17039.897533121@pscuxc.psc.edu> This position is responsible for developing a nationally recognized research program in computational neural science. The following research areas of computational neuroscience are included: neural modeling, neural nets and machine learning, and functional/structural MRI. The position will determine the priorities and direction for this effort based on the overall goals and objectives of the biomedical applications group and the Pittsburgh Supercomputing Center (PSC). This position will be responsible for writing research grants for continued funding, writing annual reports, presenting research results at national and international conferences, developing application-specific or research workshops, and publishing the results of this work in peer-reviewed publications. QUALIFICATIONS: Ph.D. in computer science, mathematics or a related discipline, or an equivalent combination of training and experience; five or more years of experience; proficiency in C++ or C; proficiency with message passing libraries; and the ability to communicate effectively are required. Experience performing computational neural science research on high performance computers is preferred. This is a summary statement of the responsibilities and qualifications for this position. AA/EEO EMPLOYER To apply, send resume and cover letter to: David W. Deerfield, Biomedical Applications Manager Pittsburgh Supercomputing Center 4400 Fifth Avenue Pittsburgh, PA 15213 email: deerfield at psc.edu From wkistler at physik.tu-muenchen.de Thu Jun 11 04:37:40 1998 From: wkistler at physik.tu-muenchen.de (Werner Kistler) Date: Thu, 11 Jun 1998 10:37:40 +0200 Subject: Paper available: Modelling Collective Excitations in Cortical Tissue Message-ID: <357F9754.B2770228@physik.tu-muenchen.de> The following paper is available on my web page: http://www.physik.tu-muenchen.de/~wkistler/kistler98a.html ------------------------------------------------------------------ W. M. Kistler, R. Seitz, and J. L. van Hemmen. Modelling Collective Excitations in Cortical Tissue. Physica D, 114(3/4): 273-295, 1998. Abstract: We study a two-dimensional system of spiking neurons with local interactions depending on distance. The interactions between the neurons decrease as the distance between them increases and can be either excitatory or inhibitory. Depending on the mix of excitation and inhibition, this kind of system exhibits a rich repertoire of collective excitations such as traveling waves, expanding rings, and rotating spirals. We present a continuum approximation that allows an analytic treatment of plane waves and circular rings. We calculate the dispersion relation for plane waves and perform a linear stability analysis. Only waves with a propagation speed below a certain critical velocity are stable. For target patterns, we derive an integro-differential equation that describes the evolution of a circular excitation. Its asymptotic behavior is handled exactly. We illustrate the analytic results by parallel-computer simulations of a network of 10^6 neurons. In so doing, we exhibit a novel type of local excitation, a so-called `paternoster'. ------------------------------------------------------------------
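For readers who want to experiment with this class of model, here is a minimal, purely illustrative Python sketch of a two-dimensional grid of leaky integrate-and-fire units with distance-dependent, centre-excitatory/surround-inhibitory coupling. It is not the authors' code, the grid size, kernel, and all parameters are arbitrary assumptions, and it only hints at the wave-like collective activity the paper analyses.

# Illustrative sketch only (not the Kistler et al. model or parameters).
import numpy as np

N = 40                      # grid is N x N neurons
tau, dt = 10.0, 1.0         # membrane time constant and time step (ms)
v_thresh, v_reset = 1.0, 0.0

# Distance-dependent coupling kernel: excitatory near, inhibitory farther out.
r = 6                       # kernel radius in grid units
y, x = np.mgrid[-r:r + 1, -r:r + 1]
d2 = x**2 + y**2
kernel = 1.2 * np.exp(-d2 / (2 * 1.5**2)) - 0.6 * np.exp(-d2 / (2 * 3.0**2))
kernel[r, r] = 0.0          # no self-coupling

def coupling(spikes):
    """Sum kernel-weighted input from all neurons that spiked this step."""
    padded = np.pad(spikes, r)
    out = np.zeros_like(spikes, dtype=float)
    for (dy, dx), w in np.ndenumerate(kernel):
        if w != 0.0:
            out += w * padded[dy:dy + N, dx:dx + N]
    return out

v = np.zeros((N, N))
v[N // 2 - 2:N // 2 + 2, N // 2 - 2:N // 2 + 2] = 1.5   # seed a local excitation

for step in range(100):
    spikes = (v >= v_thresh).astype(float)
    v = np.where(spikes > 0, v_reset, v)                # reset neurons that fired
    v += dt / tau * (-v) + coupling(spikes)             # leak plus synaptic input
    if step % 20 == 0:
        print(f"step {step:3d}: {int(spikes.sum())} spikes")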
Werner Kistler Phone: +49(89)289.12193 Dipl.-Phys. Fax: +49(89)289.12296 email: wkistler at physik.tu-muenchen.de WWW: http://www.physik.tu-muenchen.de/~wkistler Institut für Theoretische Physik Physik-Department der Technischen Universität München James-Franck-Strasse D-85748 Garching bei München Germany ------------------------------------------------------------------ From leila at ida.his.se Thu Jun 11 05:36:43 1998 From: leila at ida.his.se (Leila Khammari) Date: Thu, 11 Jun 1998 11:36:43 +0200 Subject: ICANN 98 Message-ID: <357FA52B.980A47C0@ida.his.se> CALL FOR PARTICIPATION: 8th INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS (ICANN 98) September 1-4, 1998, Skoevde, Sweden (Tutorials Sept 1, Conference Sept 2-4) ==================================== PRELIMINARY PROGRAM AND REGISTRATION FORMS NOW AVAILABLE AT: http://www.his.se/ida/icann98/ ==================================== ==================================== INVITED TALKS: ==================================== Diagrammatic Representation and Reasoning in a Connectionist Framework John Barnden, University of Birmingham, UK Variational Learning in Graphical Models and Neural Networks Chris Bishop, Microsoft Research, Cambridge, UK Learning To Be Social Rodney Brooks, MIT, Cambridge, USA Synchronization: The Computational Currency of Cognition Leif Finkel, University of Pennsylvania, USA Applications of Vapnik's theory for prediction Francoise Fogelman Soulie, Atos, France Title pending David Hansel, CNRS, France Brains, Gases and Robots Phil Husbands, University of Sussex, UK Self-Organization of Very Large Document Collections: State of the Art Teuvo Kohonen, Helsinki Univ. of Technology, Finland Gaussian Processes - a replacement for supervised neural networks? David MacKay, Cavendish Laboratory, Cambridge, UK Title pending Barak Pearlmutter, University of New Mexico, USA The Silicon Way to Artificial Neural Networks Ulrich Rueckert, Universitaet Paderborn, Germany Title pending David Rumelhart, Stanford University, USA ==================================== SCOPE: ==================================== ICANN 98 covers all aspects of ANN research, broadly divided into six areas, corresponding to separate organizational modules. - THEORY - APPLICATIONS - COMPUTATIONAL NEUROSCIENCE AND BRAIN THEORY - CONNECTIONIST COGNITIVE SCIENCE AND AI - AUTONOMOUS ROBOTICS AND ADAPTIVE BEHAVIOR - HARDWARE/IMPLEMENTATION Out of 340 submissions, 180 papers have been accepted for presentation. 65 of these will be presented orally in 3 parallel tracks, and 115 will be presented in two separate poster sessions. In addition to the modules mentioned above we aim to further promote contacts between researchers and industry.
To achieve this, a special session on INDUSTRY AND RESEARCH is organized, featuring the following talks: Neural Computation at Siemens: Challenges in Applications and Research, Bernd Schuermann, Siemens, Germany Toward Real World Intelligence: R&D in the Real World Computing Program, Nobuyuki Otsu, ETL, Japan Industry - researchers interface: What is important for effective co-operation?, Timo Salo, Helsinki University, Finland Industrial perspective on ANN-research, Tony Larsson, Ericsson, Sweden Funding programs in Europe, Karl-Einar Sjödin, NUTEK, Sweden Technology transfer from European academia to industry, Trevor Clarkson, King's College London, NEuroNet, Great Britain ==================================== TUTORIALS, September 1 ==================================== Spiking Neurons Wulfram Gerstner, EPFL, Lausanne Realistic Modeling of Neurons and Networks using GENESIS Erik De Schutter, University of Antwerp The Self-Organizing Map Teuvo Kohonen, HUT, Helsinki Combining Artificial Neural Networks Amanda Sharkey, University of Sheffield The Working Brain: Brain Imaging and its Implications John Taylor, King's College, London Analogic Cellular Computing based on Cellular Neural Networks Tamas Roska, Computer and Automation Institute, Budapest Evolutionary Robotics Stefano Nolfi, National Research Council, Rome Dario Floreano, EPFL, Lausanne ==================================== REGISTRATION: ==================================== Early registration deadline: July 14th. Registration Fees:
                             Before July 14   After July 14
  Regular ENNS-member        3000 SEK         3500 SEK
  Regular non-ENNS-member    3500 SEK         4000 SEK
  Student                    2000 SEK         2500 SEK
  Tutorial day                500 SEK          500 SEK
==================================== For more detailed information and registration forms, please use www.his.se/ida/icann98, or get in contact with the ICANN 98 secretariat: Address: ICANN 98 Hoegskolan i Skoevde P.O. Box 408 541 28 Skoevde, SWEDEN Email: icann98 at ida.his.se Fax: +46 (0)500 46 47 25 ==================================== From c.k.i.williams at aston.ac.uk Thu Jun 11 06:48:59 1998 From: c.k.i.williams at aston.ac.uk (Chris Williams) Date: Thu, 11 Jun 1998 11:48:59 +0100 Subject: Postdoc position at the University of Edinburgh Message-ID: <2284.9806111048@sun.aston.ac.uk> [Apologies for cross-posting. Please note that I shall be moving to the Department of Artificial Intelligence, University of Edinburgh on 1 July 1998] ---------------------------------------------------------------------- Research Associate: Probabilistic Models for Sequences Department of Artificial Intelligence, University of Edinburgh A vacancy exists for a research associate on the RA1A scale (point 6), to work on a 3-year EPSRC funded research project entitled "Probabilistic Models for Sequences". The aim of the project is to investigate belief network models for sequences, with a particular focus on image sequences. This will entail the development and evaluation of approximation schemes for multiple-cause/hierarchical belief network models. These methods will be applied to problems such as road-scene interpretation and medical image analysis. This post is fixed-term, funded until 30 September 2001. Candidates should have strong mathematical and computational skills, preferably with a background in belief networks or more generally in probabilistic modelling. Starting salary will be point 6 on the RA1A scale, 16,927 pounds per annum. The successful applicant will work with a PhD student and an MSc student who are also funded through the project.
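The posting names no specific model or software; purely as an illustration of what a "probabilistic model for sequences" can look like in code, the short Python sketch below performs exact forward filtering in a toy two-state hidden-state model. All states, symbols, and probabilities are invented for the example, and the multiple-cause, hierarchical belief networks the project targets would of course be far richer.

# Illustrative sketch only: a toy discrete hidden-state sequence model with
# exact forward filtering. States, symbols, and probabilities are invented.
import numpy as np

trans = np.array([[0.9, 0.1],      # P(next state | current state)
                  [0.2, 0.8]])
emit = np.array([[0.7, 0.3],       # P(observation | state)
                 [0.1, 0.9]])
prior = np.array([0.5, 0.5])

def forward_filter(observations):
    """Return P(state_t | obs_1..t) for each t, computed recursively."""
    belief = prior * emit[:, observations[0]]
    belief /= belief.sum()
    beliefs = [belief]
    for obs in observations[1:]:
        predicted = trans.T @ belief           # propagate one step forward
        belief = predicted * emit[:, obs]      # weight by the new observation
        belief /= belief.sum()
        beliefs.append(belief)
    return np.array(beliefs)

print(forward_filter([0, 0, 1, 1, 1]))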
Further information about the project can be obtained from http://www.dai.ed.ac.uk/daidb/byhand/ckiw/, and/or by contacting Dr. Chris Williams (ckiw at dai.ed.ac.uk) telephone: 0121 359 3621 ext 4382 (international +44 121 359 3621 ext 4382) (with voicemail). Further particulars including the application procedure should be obtained from The Personnel Office, 1 Roxburgh Street, Edinburgh EH8 9TB Scotland or telephone: 0131 650 2511 (24 hour answering service). Please quote reference 896421. Closing date for receipt of applications is 3 July 1998. From pollack at cs.brandeis.edu Thu Jun 11 14:27:38 1998 From: pollack at cs.brandeis.edu (jordan pollack) Date: Thu, 11 Jun 1998 14:27:38 -0400 Subject: NCS survey paper References: <199805290056.RAA18938@arapaho.cse.ucsc.edu> Message-ID: <3580219A.37D5@cs.brandeis.edu> My history survey was written 10 years ago, and it may be of interest to some to see what's changed since then: Pollack, J. B. (1989). Connectionism: Past, Present, and Future. Artificial Intelligence Review, 3, 3-20. Research efforts to study computation and cognitive modeling on neurally-inspired mechanisms have come to be called Connectionism. ... This paper surveys the history of the field, often in relation to AI, discusses its current successes and failures, and makes some predictions for where it might lead in the future. http://www.demo.cs.brandeis.edu/papers/long.html#nnhistory -- Professor Jordan B. Pollack DEMO Laboratory, Volen Center for Complex Systems Computer Science Dept, MS018 Phone (781) 736-2713/Lab x3366/Fax x2741 Brandeis University website: http://www.demo.cs.brandeis.edu Waltham, MA 02254 email: pollack at cs.brandeis.edu From terry at salk.edu Thu Jun 11 15:41:40 1998 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 11 Jun 1998 12:41:40 -0700 (PDT) Subject: NIPS Volume 10 Message-ID: <199806111941.MAA26296@helmholtz.salk.edu> The abstracts for NIPS*97 are available online at: http://mitpress.mit.edu/cognet/abstracts/NIPS10/ The full papers are available in: Advances in Neural Information Processing Systems 10 Michael I. Jordan, Michael J. Kearns, and Sara A. Solla (eds.) Cambridge, MA: MIT Press (1998) This volume has been mailed to all registered participants. Terry ----- From segevr at post.tau.ac.il Fri Jun 12 05:32:39 1998 From: segevr at post.tau.ac.il (Ronen Segev) Date: Fri, 12 Jun 1998 12:32:39 +0300 (IDT) Subject: Paper: From Neurons to Brain: Adaptive Self-Wiring of Neurons Message-ID: Dear Connectionists, The following paper will be published in the Journal of Complex Systems, vol. 1 (1998). Hard copies can be obtained by sending an email to: segevr at post.tau.ac.il An electronic version can be found at: http://xxx.lanl.gov/find/cond-mat/1/segev/0/1/0/past/3/0 Your comments are welcome! Ronen Segev, email: segevr at post.tau.ac.il, School of Physics & Astronomy, Tel Aviv University. Title: From Neurons to Brain: Adaptive Self-Wiring of Neurons Authors: Ronen Segev, Eshel Ben-Jacob Comments: LaTeX, 12 pages, 9 GIF figures. Report-no: S2 Subj-class: Neural Networks and Disordered Systems. Journal-ref: J. Comp. Sys. 1 (1998) During embryonic morphogenesis, a collection of individual neurons turns into a functioning network with unique capabilities. Only recently has this most staggering example of an emergent process in the natural world begun to be studied. Here we propose a navigational strategy for neurite growth cones, based on sophisticated chemical signaling.
We further propose that the embryonic environment (the neurons and the glia cells) acts as an excitable media in which concentric and spiral chemical waves are formed. Together with the navigation strategy, the chemical waves provide a mechanism for communication, regulation, and control required for the adaptive self-wiring of neurons. From stefan.wermter at sunderland.ac.uk Fri Jun 12 13:57:04 1998 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Fri, 12 Jun 1998 18:57:04 +0100 Subject: Job: Neural and Intelligent Systems Message-ID: <35816BF0.53A4D1E4@sunderland.ac.uk> Please post, thank you very much, Stefan Wermter --------------------------- Research Assistant in Neural and Intelligent Systems (reference number CIRG28) Applications are invited for a three year research assistant position in the School of Computing and Information Systems investigating the development of hybrid neural/symbolic techniques for intelligent processing. This is an exciting new project which aims at developing new environments for integrating neural networks and symbolic processing. You will play a key role in the development of such hybrid subsymbolic/symbolic environments. It is intended to apply the developed hybrid environments in areas such as natural language processing, intelligent information extraction, or the integration of speech/language in multimedia applications. You should have a degree in a computing discipline and will be able to register for a higher degree. A demonstrated interest in artificial neural networks, software engineering skills and programming experience are essential (preferably including a subset of C, C++, CommonLisp, Java, GUI). Experience and interest in neural network software and simulators would be an advantage (e.g. Planet, SNNS, Tlearn, Matlab, etc). Salary is according to the researcher A scale (currently up to 13,871, under revision). Application forms and further particulars are available from the Personnel department under +44 191 515 and extensions 2055, 2429, 2054, 2046, or 2425 or E-Mail employee.recruitment at sunderland.ac.uk quoting the reference number CIRG28. For informal inquiries please contact Professor Stefan Wermter, e-mail: Stefan.Wermter at sunderland.ac.uk. Closing date: 10 July 1998. The successful candidate is expected to start the job as soon as possible. ******************************************** Professor Stefan Wermter University of Sunderland Dept. of Computing & Information Systems St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://osiris.sunderland.ac.uk/~cs0stw/ ******************************************** From gary at cs.ucsd.edu Fri Jun 12 20:22:21 1998 From: gary at cs.ucsd.edu (Gary Cottrell) Date: Fri, 12 Jun 1998 17:22:21 -0700 (PDT) Subject: Recent publications from GURU Message-ID: <199806130022.RAA02659@gremlin.ucsd.edu> Hello all, Below are titles of six recent papers from Gary's Unbelievable Research Unit (GURU). Five of these will appear in the 1998 Proceedings of the Cognitive Science Society. One will appear in the Proceedings of Special Interest Group on Information Retrieval. All are available from my home page: http://www-cse.ucsd.edu/users/gary/ Abstracts are appended to the end of this message. 
Cheers, gary Gary Cottrell 619-534-6640 FAX: 619-534-7029 Faculty Assistant Joy Gorback: 619-534-5948 Computer Science and Engineering 0114 IF USING FED EX INCLUDE THE FOLLOWING LINE: "Only connect" 3101 Applied Physics and Math Building University of California San Diego -E.M. Forster La Jolla, Ca. 92093-0114 Email: gary at cs.ucsd.edu or gcottrell at ucsd.edu Anderson, Karen, Milostan, Jeanne C. and Cottrell, Garrison W. (1998) Assessing the contribution of representation to results. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Clouse, Daniel S. and Cottrell, Garrison W. (1998) Regularities in a Random Mapping from Orthography to Semantics. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Dailey, Matthew N., Cottrell, Garrison W. and Busey, Thomas A. (1998) Eigenfaces for familiarity. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Laakso, Aarre and Cottrell, Garrison W. (1998) How can I know what You think?: Assessing representational similarity in neural systems. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Padgett, Curtis and Cottrell, Garrison W. (1998) A simple neural network models categorical perception of facial expressions. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Vogt, Christopher C. and Cottrell, Garrison W. (1998) Predicting the performance of linearly combined IR systems. In Proceedings of the Special Interest Group on Information Retrieval. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Abstracts XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Assessing the Contribution of Representation to Results Karen Anderson kanders at cs.ucsd.edu Jeanne Milostan jmilosta at cs.ucsd.edu Garrison W. Cottrell gary at cs.ucsd.edu Computer Science and Engineering Department 0114 Institute for Neural Computation University of California San Diego La Jolla, CA 92093-0114 In this paper, we make a methodological point concerning the contribution of the representation of the output of a neural network model when using the model to compare to human error performance. We replicate part of Dell, Juliano & Govindjee's work on modeling speech errors using recurrent networks (Dell et al. 1993). We find that 1) the error patterns reported by Dell et al. do not appear to remain when more networks are used; and 2) some components of the error patterns that are found can be accounted for by simply adding Gaussian noise to the output representation they used. We suggest that when modeling error behavior, the technique of adding noise to the output representation of a network should be used as a control to assess to what degree errors may be attributed to the underlying network. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Regularities in a Random Mapping from Orthography to Semantics Daniel S. Clouse and Garrison W. Cottrell Computer Science & Engineering 0114 University of California, San Diego La Jolla, CA 92093 {dclouse,gary}@cs.ucsd.edu In this paper we investigate representational and methodological issues in an attractor network model of the mapping from orthography to semantics based on (Plaut, 1995).
We find that, contrary to psycholinguistic studies, the response time to concrete words (represented by more 1 bits in the output pattern) is slower than for abstract words. This model also predicts that response times to words in a dense semantic neighborhood will be faster than words which have few semantically similar neighbors in the language. This is conceptually consistent with the neighborhood effect seen in the mapping from orthography to phonology (Seidenberg & McClelland, 1989; Plaut et al. 1996) in that patterns with many neighbors are faster in both pathways, but since there is no regularity in the random mapping used here, it is clear that the cause of this effect is different than that of previous experiments. We also report a rather distressing finding. Reaction time in this model is measured by the time it takes the network to settle after being presented with a new input. When the criterion used to determine when the network is ``settled'' is changed to include testing of the hidden units, each of the results reported above change the direction of effect -- abstract words are now slower, as are words in dense semantic neighborhoods. Since there are independent reasons to exclude hidden units from the stopping criterion, and this is what is done in common practice, we believe this phenomenon to be of interest mostly to neural network practitioners. However, it does provide some insight into the interaction between the hidden and output units during settling. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Eigenfaces for Familiarity Matthew N. Dailey mdailey at cs.ucsd.edu Garrison W. Cottrell gary at cs.ucsd.edu Computer Science and Engineering Department University of California, San Diego 9500 Gilman Dr., La Jolla CA 92093-0114 USA Thomas A. Busey busey at indiana.edu Department of Psychology Indiana University Bloomington, IN 47405 USA A previous experiment tested subjects' new/old judgments of previously-studied faces, distractors, and morphs between pairs of studied parents. We examine the extent to which models based on principal component analysis (eigenfaces) can predict human recognition of studied faces and false alarms to the distractors and morphs. We also compare eigenface models to the predictions of previous models based on the positions of faces in a multidimensional ``face space'' derived from a multidimensional scaling (MDS) of human similarity ratings. We find that the error in reconstructing a test face from its position in an ``eigenface space'' provides a good overall prediction of human familiarity ratings. However, the model has difficulty accounting for the fact that humans false alarm to morphs with similar parents more frequently than they false alarm to morphs with dissimilar parents. We ascribe this to the limitations of the simple reconstruction error-based model. We then outline preliminary work to improve the fine-grained fit within the eigenface-based modeling framework, and discuss the results' implications for exemplar- and face space-based models of face processing. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX How Can *I* Know What *You* Think?: Assessing Representational Similarity in Neural Systems Aarre Laakso aarre at ucsd.edu Department of Philosophy University of California, San Diego La Jolla, CA 92093 Garrison W. 
Cottrell gary at cs.ucsd.edu Institute for Neural Computation Computer Science and Engineering University of California, San Diego La Jolla, CA 92093 How do my mental states compare to yours? We suggest that, while we may not be able to compare experiences, we can compare neural representations, and that the correct way to compare neural representations is through analysis of the distances between them. In this paper, we present a technique for measuring the similarities between representations at various layers of neural networks. We then use the measure to demonstrate empirically that different artificial neural networks trained by backpropagation on the same categorization task, even with different representational encodings of the input patterns and different numbers of hidden units, reach states in which representations at the hidden units are similar. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX A Simple Neural Network Models Categorical Perception of Facial Expressions Curtis Padgett and Garrison W. Cottrell Computer Science & Engineering 0114 University of California, San Diego La Jolla, CA 92093-0114 {cpadgett,gary}@cs.ucsd.edu The performance of a neural network that categorizes facial expressions is compared with human subjects over a set of experiments using interpolated imagery. The experiments for both the human subjects and neural networks make use of interpolations of facial expressions from the Pictures of Facial Affect Database (Ekman & Freisen, 1976). The only difference in materials between those used in the human subjects experiments (Young et al., 1997) and our materials are the manner in which the interpolated images are constructed -- image-quality morphs versus pixel averages. Nevertheless, the neural network accurately captures the categorical nature of the human responses, showing sharp transitions in labeling of images along the interpolated sequence. Crucially for a demonstration of categorical perception (Harnad, 1987), the model shows the highest discrimination between transition images at the crossover point. The model also captures the shape of the reaction time curves of the human subjects along the sequences. Finally, the network matches human subjects' judgements of which expressions are being mixed in the images. The main failing of the model is that there are intrusions of ``neutral'' responses in some transitions, which are not seen in the human subjects. We attribute this difference to the difference between the pixel average stimuli and the image quality morph stimuli. These results show that a simple neural network classifier, with no access to the biological constraints that are presumably imposed on the human emotion processor, and whose only access to the surrounding culture is the category labels placed by American subjects on the facial expressions, can nevertheless simulate fairly well the human responses to emotional expressions. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Predicting the Performance of Linearly Combined IR Systems Christopher C. Vogt University of California, San Diego, CSE 0114, La Jolla, CA 92093, USA Garrison W. Cottrell University of California, San Diego, CSE 0114, La Jolla, CA 92093, USA Abstract We introduce a new technique for analyzing combination models. The technique allows us to make qualitative conclusions about which IR systems should be combined. 
We achieve this by using a linear regression to accurately (r2=0.98) predict the performance of the combined system based on quantitative measurements of individual component systems taken from TREC5. When applied to a linear model (weighted sum of relevance scores), the technique supports several previously suggested hypotheses: one should maximize both the individual systems' performances and the overlap of relevant documents between systems, while minimizing the overlap of nonrelevant documents. It also suggests new conclusions: both systems should distribute scores similarly, but not rank relevant documents similarly. It furthermore suggests that the linear model is only able to exploit a fraction of the benefit possible from combination. The technique is general in nature and capable of pointing out the strengths and weaknesses of any given combination approach. SIGIR'98 24-28 August 1998 Melbourne, Australia. From harnad at coglit.soton.ac.uk Sat Jun 13 05:47:40 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Sat, 13 Jun 1998 10:47:40 +0100 (BST) Subject: Invitation to archive your papers in CogPrints Archive Message-ID: To all cognitive scientists (apologies if you receive this more than once): You are invited to archive your preprints and reprints in the CogPrints electronic archive. The Archive covers all the Cognitive Sciences: Psychology, Neuroscience, Biology, Computer Science, Linguistics and Philosophy CogPrints is completely free for everyone, both authors and readers, thanks to a subsidy from the Electronic Libraries Programme of the Joint Information Systems of the United Kingdom and the collaboration of the NSF-supported Physics Eprint Archive at Los Alamos. CogPrints has just been opened for public automatic archiving. This means authors can now deposit their own papers automatically. The first wave of papers had been invited and hand-archived by CogPrints in order to set a model of the form and content of CogPrints. To see the current holdings: http://cogprints.soton.ac.uk/ To archive your own papers automatically: http://cogprints.soton.ac.uk/author.html All authors are encouraged to archive their papers on their home servers as well. For ferther information: admin at coglit.soton.ac.uk -------------------------------------------------------------------- BACKGROUND INFORMATION (No need to read if you wish to proceed directly to the Archive.) The objective of CogPrints is to emulate in the cognitive and biobehavioral sciences the remarkable success of the NSF-subsidised Physics Eprint Archive at Los Alamos http://xxx.lanl.gov The Physics Eprint Archive now makes available, free for all, over half of the annual physics periodical literature, with its annual growth strongly suggesting that it will not be long before it becomes the locus classicus for all of the literature in Physics. What this means is that anyone in the world with access to the Internet (and that number too is rising at a breath-taking rate, and already includes all academics, researchers and students in the West, and an increasing proportion in the Third World as well) can now search and retrieve virtually all current work in, for example, High Energy Physics, much of it retroactive to 1990 when the Physics archive was founded by Paul Ginsparg, who must certainly be credited by historians with having launched this revolution in scientific and scholarly publication (www-admin at xxx.lanl.gov). Does this mean that learned journals will disappear? Not at all. 
They will continue to play their traditional role of validating research through peer review, but this function will be an "overlay" on the electronic archives. The literature that is still in the form of unrefereed preprints and technical reports will be classified as such, to distinguish it from the refereed literature, which will be tagged with the imprimatur of the journal that refereed and accepted it for publication, as it always has been. It will no longer be necessary for publishers to recover (and research libraries to pay) the substantial costs of producing and distributing paper through ever-higher library subscription prices: Instead, it will be the beneficiaries of the global, unimpeded access to the learned research literature -- the funders of the research and the employers of the researcher -- who will cover the much reduced costs of implementing peer review, editing, and archiving in the electronic medium alone, in the form of minimal page-charges, in exchange for instant, permanent, worldwide access to the research literature for all, for free. If this arrangement strikes you as anomalous, consider that the real anomaly was that the authors of the scientific and scholarly periodical research literature, who, unlike trade authors, never got (or expected) royalties for the sale of their texts -- on the contrary, so important was it to them that their work should reach all potentially interested fellow-researchers that they had long been willing to pay for the printing and mailing of preprints and reprints to those who requested them -- nevertheless had to consent to have access to their work restricted to those who paid for it. This Faustian bargain was unavoidable in the Gutenberg age, because of the need to recover the high cost of producing and disseminating print on paper, but Paul Ginsparg has shown the way to launch the entire learned periodical literature into the PostGutenberg Galaxy, in which scientists and scholars can publish their work in the form of "skywriting": visible and available for free to all. -------------------------------------------------------------------- Stevan Harnad harnad at cogsci.soton.ac.uk Professor of Psychology harnad at princeton.edu Director, phone: +44 1703 592582 Cognitive Sciences Centre fax: +44 1703 594597 Department of Psychology http://www.cogsci.soton.ac.uk/~harnad/ University of Southampton http://www.princeton.edu/~harnad/ Highfield, Southampton ftp://ftp.princeton.edu/pub/harnad/ SO17 1BJ UNITED KINGDOM ftp://cogsci.soton.ac.uk/pub/harnad/ From juergen at idsia.ch Mon Jun 15 13:35:55 1998 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Mon, 15 Jun 1998 19:35:55 +0200 Subject: fractal face Message-ID: <199806151735.TAA19652@ruebe.idsia.ch> FACIAL BEAUTY AND FRACTAL GEOMETRY Juergen Schmidhuber What is it that makes a face beautiful? Average faces obtained by photographic (Galton 1878) or digital (Langlois & Roggman 1990) blending are judged attractive but not optimally attractive (Alley & Cunningham 1991) --- digital exaggerations of deviations from average face blends can lead to higher attractiveness ratings (Perrett, May, & Yoshikawa 1994). My novel approach to face design does not involve blending at all. Instead, the image of a female face with high ratings is composed from a fractal geometry based on rotated squares and powers of two. The corresponding geometric rules are more specific than those previously used by artists such as Leonardo and Duerer. 
They yield a short algorithmic description of all facial characteristics, many of which are compactly encodable with the help of simple feature detectors similar to those found in mammalian brains. This suggests that a face's beauty correlates with simplicity relative to the subjective observer's way of encoding it. HTML: http://www.idsia.ch/~juergen/locoface/locoface.html (5 color figures, total of 0.7MB) Postscript: ftp://ftp.idsia.ch/pub/juergen/locoface.ps.gz (7 pages, 1.3MB, 5MB gunzipped) Comments welcome! IDSIA, Switzerland Juergen Schmidhuber www.idsia.ch From Dave_Touretzky at cs.cmu.edu Mon Jun 15 17:06:08 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Mon, 15 Jun 1998 14:06:08 -0700 Subject: CNS*98 listing of papers, and registration information Message-ID: ************************************************************************ SEVENTH ANNUAL COMPUTATIONAL NEUROSCIENCE MEETING (CNS*98) July 26 - 30, 1998 Santa Barbara, California REGISTRATION INFORMATION ************************************************************************ Registration is now open for this year's Computational Neuroscience meeting (CNS*98).
This is the seventh in a series of annual inter-disciplinary conferences intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience. As in previous years, this meeting will bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in understanding how biological neural systems compute. The meeting will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation. The meeting in 1998 will take place at Fess Parker's Doubletree Resort in Santa Barbara, California and include plenary, contributed, and poster sessions. The first session starts at 9 am, Sunday July 26th and the meeting ends with the annual CNS banquet on Thursday evening, July 30th. There will be no parallel sessions. The meeting includes two half days of informal workshops focused on current issues in computational neuroscience. Day care will be available for children, and given the beauty and recreational interest of the area, we encourage families to attend. LOCATION: The meeting will take place at Fess Parker's Doubletree Resort in Santa Barbara, California. MEETING ACCOMMODATIONS: Accommodations for the meeting have been arranged at Fess Parker's Doubletree Resort. Information concerning reservations, hotel accommodations, etc. is available at the meeting web site indicated below. A block of rooms is reserved at special rates. 30 student-rate rooms are available on a first-come-first-served basis, so we recommend that students act quickly to reserve these slots. NOTE that registering for the meeting WILL NOT result in an automatic room reservation. Instead you must make your own reservations by contacting the hotel itself. As this is the high season for tourists in Santa Barbara, you should be sure to reserve your accommodations quickly by contacting: Fess Parker's Doubletree Resort (RESERVATION REQUEST ORDER FORM LOCATED BELOW) NOTE: IN ORDER TO GET THE AVAILABLE ROOMS, YOU MUST CONFIRM HOTEL REGISTRATIONS BY JUNE 24, 1998. When making reservations by phone, be sure to indicate that you are registering for the Computational Neuroscience (CNS*98) meeting. Students will be asked to verify their status at check-in with a student ID or other documentation. MEETING REGISTRATION FEES: Registration received on or before June 26, 1998: Student: $95; Regular: $225. Meeting registration after June 26, 1998: Student: $125; Regular: $250. BANQUET: Registration for the meeting includes a single ticket to the annual CNS Banquet. Additional Banquet tickets can be purchased for $35 per person. The banquet will be held on Thursday, July 30th. DAY CARE: Day care will be available at the conference for those who inform us in advance of their day care needs. Note that day care will not be provided during the evening. Please send e-mail to judy at bbb.caltech.edu. Please provide the following information: 1. name of parent(s), 2. e-mail address, 3. age of children and 4. estimated times during which children will need day care. Day care will be provided free of charge except for children under the age of 2 years old, for whom a fee may be charged. AIRFARE: Santa Barbara has its own small airport with daily flights from Los Angeles and San Francisco.
In addition, ground transportation is available to Santa Barbara from Los Angeles International Airport (about a one-and-a-half-hour drive). Special discount rates have been arranged with United and Northwest Airlines if you mention the following group ID when making airline reservations: Northwest Airlines - Phone No: 1-800-328-1111 Meeting I.D. No: NMG66 United Airlines - Phone No: 1-800-521-4041 (U.S. and Canada) Meeting I.D. No: 5255V ********************************************************************* ADDITIONAL INFORMATION (including the agenda with list of talks) can be obtained by: o Using our on-line WWW information and registration server, URL of: http://www.bbb.caltech.edu/cns98.html o ftp-ing to our ftp site. yourhost% ftp ftp.bbb.caltech.edu Name (ftp.bbb.caltech.edu:<): ftp Password: yourname at yourhost.yourside.yourdomain ftp> cd cns98 ftp> ls o Sending Email to: cns98 at bbb.caltech.edu ************************************************************************ ************************************************************************ SUNDAY, JULY 26, 1998 9:00 Welcoming Remarks and General Information 9:15 Featured Contributed Talk: Barry J. Richmond (NIH/NIMH) John A. Hertz and Timothy J. Gawne Comparing Responses to Visual Stimuli Appearing on Receptive Fields of V1 Complex Cells Due to Saccades with Responses Elicited by Stimulus Sequences Contributed Talks 10:05 Udo Ernst (MPI for Fluid Dynamics) Klaus Pawelzik, Fred Wolf, and Theo Geisel Theory of Nonclassical Receptive Field Phenomena in the Visual Cortex 10:25 Ko Sakai (RIKEN Brain Science Institute) Shigeru Tanaka Retinotopic Coding and Perceptual Segmentation in Tilt Illusion 10:45 David H. Goldberg (Brown University) Harel Shouval and Leon N Cooper Lateral Connectivity: A Possible Scaffolding for the Development of Orientation Preference Maps 11:05 Break 11:20 Gyöngyi Gaál (Emory University) John P. Donoghue and Jerome N. Sanes Relations Among Neural Activities Recorded in Premotor and Motor Cortex of Trained Monkeys During Visually Guided Hand and Arm Movements Tasks 11:40 Emery N. Brown (Massachusetts General Hospital / MIT) Loren M. Frank, Dengda Tang, Michael C. Quirk, and Matthew A. Wilson A Statistical Model of Spatial Information Encoding in The Rat Hippocampus 12:00 Satoru Inoue (The University of Electro-Communications) Yoshiki Kashimori and Takeshi Kambara The Neural Model of Nucleus Laminaris and Integration Layer Accomplishing Hyperacuity in Sound Location in the Barn Owl 12:20 Lunch Break and Poster Preview Session A 2:20 Featured Contributed Talk: Michael E Hasselmo (Harvard University) Erik Fransen, Gene V Wallenstein, Angel A Alonso, and Clayton T Dickson A Biophysical Simulation of Intrinsic and Network Properties of Entorhinal Cortex Contributed Talks 3:10 Jonathan Wolfe (University of New Mexico) Akaysha Tang Neuromodulation of Spike Timing and Spike Rate 3:30 Omer Artun (Brown University) Harel Z. Shouval Temporal Coding by Dynamic Synapses 3:50 Break 4:10 Hans E. Plesser (Max-Planck-Institut) Theo Geisel Bandpass Properties of Integrate-Fire Neurons 4:30 Invited Talk: To be Announced 5:20 End of Day Announcements 8:00 Poster Session A Péter Adorján (Technische Universität Berlin) György Barna, Péter Érdi, and Klaus Obermayer A Statistical Neural Field Approach to Orientation Selectivity 543 Charles H. Anderson (Washington University School of Medicine) Modeling Population Codes using Probability Density Functions 239 Charles H.
Anderson (Washington University School of Medicine) Shahin Hakimian and W. Thomas Thach A PDF Model of Populations of Purkinje Cells: Non-linear Interactions and High Variability 248 David J. Anderson (University of Michigan) Steven M. Bierer Noise Reduction of Multi-channel Neural Activity Using an Array Processing Technique 501 Ildiko Aradi (Ohio University) William R. Holmes Active Dendrites Regulate Spatio-temporal Synaptic Integration in Hippocampal Dentate Granule Cells 388 Delorme Arnaud (Centre de Recherche Cerveau et Cognition) =46abre-Thorpe Mich=E8le, Richard Ghislaine, Fize Denis, and Simon Thorpe Rapid Processing of Complex Natural Scenes : A Role for the Magnocellular Visual Pathways? 235 Delorme Arnaud (Centre de Recherche Cerveau et Cognition) Jacques Gautrais, Rufin van Rullen, and Simon Thorpe Pikenet : A Simulator for Modeling Large Networks of Integrate and Fire Neurons 369 Bill Baird (U.C. Berkeley) An Oscillating Cortical Architecture Simulating Auditory Attention and Eeg-erp Data 513 Davis Barch (University of California, Berkeley) Donald A. Glaser Detection and Characterization of Coherent Motion by A 2-dimensional Sheet of Connected Elements: The "bow-wave" Model 412 William H. A. Beaudot (McGill University) =46igure-ground Segregation of Coherent Motion In V1: A Model Based on The Role of Intra-cortical and Extra-cortical Feedbacks 386 Avrama Blackwell (George Mason University) Dynamics of the Light-induced Na+ Current in Hermissenda 206 Brian Blais (Brown University) Harel Shouval and Leon N Cooper =46ormation of Direction Selectivity in Natural Scene Environments 226 Alan H. Bond (Caltech) A System Model of the Primate Neocortex 593 Vladimir E. Bondarenko (Russian Academy of Sciences) Teresa Ree Chay Generation of Various Rhythms by Thalamic Neural Network Model 87 Victoria Booth (New Jersey Institute of Technology) Dendritic Plateau Potentials in Bistable Motoneurons 236 Carlos Brody (UNAM) Slow Resting Potential Covariations in Lgn Neurons Can Lead To Apparently =46ast Cross-correlations in Their Spike Trains 316 David Brown (The Babraham Institute) Jianfeng Feng Is There a Problem Matching Model and Real Cv(isi)? 238 Anthony N. Burkitt (The Bionic Ear Institute) Graeme M. Clark New Technique for Analyzing Integrate and Fire Neurons 574 Gully Burns (University of Southern California) Neuroscholar 1.00, A Neuroinformatics Databasing Website 517 Maria Bykhovskaia (University of Virginia) Mary Kate Worden and John T. Hackett =46requency Facilitation at the Lobster Neuromuscular Junction: Quantal Analysis and Simulations 193 Martin T. Chian (University of Southern California) M. T. Chian, V.Z. Marmarelis, and T.W. Berger Identification of Unobservable Neural Systems in the Hippocampus using Adaptive Estimation 406 Ryan Clement (Arizona State University) Russell Witte, Rob Rennaker, and Daryl Kipke =46unctional Connectivity in Auditory Cortex using Chronic, Multichannel Microelectrodes in Awake Animals 602 Sharon Crook (Montana State University) John P. Miller The Mechanistic Basis of Neural Encoding 192 Erik De Schutter (University of Antwerp) Reinoud Maex and Bart Vos Synchronized Firing of Golgi Cells: Functional Modulation by the Cerebellar Circuitry 259 Patricia M. Di Lorenzo (SUNY at Binghamton) Inhibitory Influence on Electrophysiological Response to Taste in the Brain Stem 373 Jim Dilmore (University of Pittsburgh) J. G. Dilmore, B. S. Gutkin, and G. B. 
Ermentrout A Biophysical Model of Dopaminergic Modulation of Persistent Sodium Currents in Pfc Pyramidal Neurons: Effects on Neural Response Properties 217 Alexander Dimitrov (The University of Chicago) Alexander Dimitrov, Trevor Mundel, Vernon L. Towle, and Jack D. Cowan Independent Components Analysis of Subdural Ecog Recordings from an Epileptic Patient 396 Mikael Djurfeldt (SANS/NADA, KTH) Anders Sandberg, =D6rjan Ekeberg, and Anders Lansner See---a Framework for Simulation of Biologically Detailed and Artificial Neural Networks and Systems 518 Gideon Dror (The Academic College of Tel-Aviv-Yaffo) Analysis and Modelling of Population Dynamics in the Visual Cortex 564 Witali L. Dunin-Barkowski (Texas Tech University) Serge L. Shishkin and Donald C. Wunsch Phase-based Storage of Information in Cerebellum: A Case of Stationary Random Inputs 531 Michael Eisele (Salk Institute) Terry Sejnowski Model-based Reinforcement Learning by Pyramidal Neurons 485 Chris Eliasmith (Washington University in St. Louis) Charles H. Anderson Attractors, Representation, and the Pdf Framework 288 P=C8ter Erdi (Hungarian Academy of Sciences) A Statistical Approach to Neural Population Dynamics: Theory, Algorithms, Simulations 263 Jianfeng FENG (The Babraham Institute) Coefficient of Variation Greater Than .5 How And When? 223 Brent A. Field (Univeristy of Oregon) Alexander R. Pico and Richard T. Marrocco Local Injections Of Neurotransmitters Significantly Alter High-frequency (245hz) Activity 559 Piotr J. Franaszczuk (Univ.of Maryland School of Medicine) Pawel Kudela and Gregory K. Bergey Model of the Propagation of Synchronous Firing in a Reduced Neuron Network 368 =46rancesco Frisone (University of Genova) Paolo Vitali, Pietro G. Morasso, Guido Rodriguez, Alberto Pilot, and Marco R= osa Can the Synchronization of Cortical Areas Be Evidenced by Fmri? 264 Tomoki Fukai (Tokai University) Modeling the Interplay of Short-term Memory and the Basal Ganglia in Sequence Processing 157 Gradwohl Gideon (University of the Negev) Nitzan Ron Grossman Yoram Homogeneous Distribution of Excitatory and Inhibitory Synapses on the Dendrites of the Cat Surea Triceps Alpha-motoneurons Increases Synaptic Efficacy: Computer Model 191 J. Randall Gobbel Carnegie Mellon University Synchronization of Tonically Active Neurons in a Biophysical Model of the Neostriatum 488 David Golomb (Ben-Gurion University of Negev) David Hansel Theory of Synchrony in Sparse Neuronal Networks 208 Jeremy P. Goodridge (Carnegie Mellon University) A. David Redish and David S. Touretzky A Model of the Rat Head Direction System That Accounts for the Unique Properties of Anterior Thalamic Head Direction Cell Firing 384 Alex Guazzelli (USC Brain Project) Mihail Bota and Michael A. Arbib Incorporating Path Integration Capabilities In the Tam-wg Model of Rodent Navigation 428 Alex Guazzelli (USC Brain Project) Mihail Bota and Michael A. 
Arbib Incorporating Path Integration Capabilities in the Tam-wg Model of Rodent Navigation 524 Juergen Haag (Friedrich-Miescher-Laboratory) Alexander Borst Influence of Active Membrane Properties on the Encoding of Motion Information In Visual Interneurons of the Blowfly 103 Rolf Henkel University of Bremen Sampling Three-dimensional Space --- The Interplay of Vergence- And =46usion-system 198 Michael Herrmann (Max-Planck-Institut) Klaus Pawelzik and Theo Geisel Simultaneous Self-organization of Place and Direction Selectivity in a Neural Model of Self-localization 420 MONDAY, JULY 27, 1998 9:00 General Information 9:15 Featured Contributed Talk: Vikaas S. Sohal (Stanford University) and John R. Huguenard Long-range Connections Synchronize Rather Than Spread Intrathalamic Oscillatory Activity: Computational Modeling and In Vitro Electrophysiology Contributed Talks 10:05 Thomas Wennekers (University of Ulm) Gunther Palm How Imprecise is Neuronal Synchronization? 10:25 Steven P. Dear (Pennsylvania State University) Corey B. Hart Computational Mechanisms Linking Synchronization and Information Coding 10:45 Nicholas Hatsopoulos (Brown University) Liam Paninski, Nicholas G. Hatsopoulos, and John P. Donoghue Mutual Information Provided by Synchronous Neuronal Discharge about Target Location 11:05 Break 11:20 Dror Gideon (Academic College of Tel-Aviv-Yaffo) Tsodyks Misha Analysis and Modelling of Population Dynamics in the Visual Cortex 11:40 Steven L. Bressler (Florida Atlantic University) Mingzhou Ding and Weiming Yang Investigation of Cooperative Cortical Dynamics by Multivariate Autoregressive Modeling of Event-related Local Field Potentials 12:00 Charlotte Gruner (Rice University) Don H. Johnson Correlation and Neural Information Coding Efficiency 12:20 Mark S. Goldman (Brandeis University) Sacha B. Nelson and Laurence F. Abbott Decorrelation Of Spike Trains by Synaptic Depression 12:40 Lunch Break and Poster Preview Session B 2:00 Featured Contributed Talk: John K. Chapin (Allegheny University of Health Sci.) Ronald S. Markowitz and Karen A. Moxon Controlling Robot Arms using Neuronal Population Recordings Contributed Talks 2:50 Rolf Eckmiller (Universitaet Bonn) Ralph Huenermann and Michael Becker Exploration of a Dialog-based Tunable Retina Encoder for Retina Implants 3:10 Joel White (Tufts Medical School) John Kauer Odor Recognition in an Artificial Nose by Spatio-temporal Processing using an Olfactory Neuronal Network. 3:30 Ralf Moeller (University of Zurich) Marinus Maris and Dimitrios Lambrinos A Neural Model of Landmark Navigation in Insects 3:50 Break 4:10 Malcolm P. 
Young (Neural Systems Group) Claus-C Hilgetag and Jack W Scannell Models of Paradoxical Lesion Effects and Rules of Inference for Imputing Function to Structure in the Brain 4:30 Invited Talk: TBA 5:20 End of Day Announcements 8:00 Poster Session B John Hertz (Nordita) Zhaoping Li Odor Recognition and Segmentation by Coupled Olfactory Bulb and Cortical Networks 106 Andrew Hill (Emory University) Phase Lag between Oscillators of a Realistic Neuronal Network Model of the Leech Heartbeat Motor Pattern Generating System 429 Osamu Hoshino (The University of Electro-Communications) Yoshiki Kashimori and Takeshi Kambara A Neural Mechanism of Feature Binding Based on the Dynamical Map Theory in Distributed Coding Scheme 257 Arthur Houweling (The Salk Institute) Maxim Bazhenov, Igor Timofeev, Mircea Steriade, and Terrence Sejnowski Computational Analysis of Intracortical Augmenting Responses Resulting from Short-term Synaptic Plasticity 482 Hidetoshi Ikeno (Maizuru National College of Technology) Shiro Usui Mathematical Description of Ionic Currents of the Kenyon Cell in the Mushroom Body of Honeybee 553 Laurent Itti (California Institute of Technology) Christof Koch and Jochen Braun A Quantitative Model Relating Visual Neuronal Activity to Psychophysical Thresholds 530 Eugene Izhikevich (Arizona State University) =46M Interactions in Brain Models 11 Eugene Izhikevich (Arizona State University) Theoretical Foundations of Pulse-coupled Models. 12 David B. Jaffe (University of Texas at San Antonio) Raymond A. Chitwood Comparing Electrotonus in Hippocampal Ca3 Nonpyramidal and Pyramidal Neurons 274 David B. Jaffe (The University of Texas at San Antonio) Raymond A. Chitwood The Contribution of Active Dendrites to Epsp Propagation in a Hippocampal Ca3 Nonpyramidal Neuron Model 438 Pat Johnston UCLA John Klopp, Val Nenov, and Eric Halgren The Effects of Varying Connectional Parameters on Input-output Relationships in a Neocortical-hippocampal Model 432 Jorge V. Jose (Northeastern University) P.H.E. Tiesinga Spiking Statistics in Noisy Hippocampal Interneurons 98 Jeeyune Jung (University of Kentucky) Ranu Jung Brain-spinal Feedforward-feedback Interactions Affect Output Pattern and Intracellular Properties of Motor Networks in the Lamprey 471 Takeshi Kambara (Univ. of Electro-Communications) Hiromichi Owada and Yoshiki Kashimori A Neural Mechanism of Am Frequency Selectivity of Pyramidal Cell Circuit in Electrosensory Lateral-line Lobe of Weakly Electric Fish 496 Adam Kepecs (Biology Department, Brandeis University) Milos Dolnik Control of Neuronal Chaos: Using Single Cell Dynamics To Store Information 440 Daryl Kipke (Arizona State University) Russell Witte, Glen Hattrup, Justin Williams, and Daryl Kipke Pursuing Dynamic Reorganization in Auditory Cortex using Chronic, Multichannel Microelectrodes in Awake, Behaving Animals 497 Jeanette Hellgren Kotaleski (Karolinska Institutet) Jesper Tegner, Sten Grillner, and Anders Lansner Control of Burst Proportion and Frequency Range by Drive Dependent Modulation of Adaptation 200 Jeffrey L. Krichmar (George Mason University) Kim T. Blackwell, Garth S. Barbour, Alexander B. Golovan, and Thomas P. Vogl A Solution To The Feature Correspondence Problem Inspired By Visual Scanpath= s 215 Yoshihisa Kubota (Caltech) James M. Bower Decoding Time-Varying Calcium Signals by CaMKII/PP1: A Dynamical System Theory of Synaptic Calcium Computation 616 Linda J. Larson-Prior San Francisco College of Osteopathic Medicine Huo Lu and Fred W. 
Prior Serotonergic Modulation of the Cerebellar Granule Cell Network 234 Mark Laubach (Duke University Medical Center) Marshall Shuler and Miguel Nicolelis Principal and Independent Component Analyses for Multi-site Investigations of Neural Ensemble Interactions 229 Sarah Lesher (University of Maryland) Nick Mellen, Suzanne Dykstra, Mark L. Spano, and Avis H. Cohen Stable Lamprey Swimming Has Unstable Periodic Orbits 374 William B. Levy (University of Virginia) Xiangbao Wu Enhancing The Performance of A Hippocampal Model by Increasing Variability Early in Learning 529 Jim-Shih Liaw (University of Southern California) J.-S.Liaw and T.W. Berger Synapse Dynamics: Harnessing the Computing Power of Synaptic Dynamics 398 David T.J. Liley (University of Technology) Peter J. Cadusch and James J. Wright A Continuum Theory of Electrocortical Activity 72 Miguel Maravall (SUNY at Stony Brook) An Analysis of Connectivity and Function in Hippocampal Associative Memory 232 Bethge Matthias (Max-Planck-Institut) Klaus Pawelzik and Theo Geisel Rapid Learning with Depressing Synapses 555 Marcelo Bastos Mazza (Universidade de São Paulo) Antônio Carlos Roque da Silva Filho A Realistic Computer Simulation of Properties of Somatotopic Maps 101 Marilene de Pinho S. Mazza (Universidade de São Paulo) Marilene de Pinho and Antônio Carlos Roque da Silva Filho A Realistic Computer Simulation of Tonotopic Maps Formation Processes in the Auditory Cortex 102 Bruce H. McCormick (Texas A&M University) Design of a Brain Tissue Scanner 160 Bruce H. McCormick (Texas A&M University) Brent P. Burton and Travis S. Chow Virtual Microscopy of Brain Tissue 161 Bruce H. McCormick (Texas A&M University) Brent P. Burton, Travis S. Chow, and Andrew T. Duchowski Exploring The Brain Forest 159 Elliot D. Menschik (University of Pennsylvania) Shih-Cheng Yen and Leif H. Finkel Functional Properties of a Cellular-level Model of Hippocampal Ca3 127 John Miller (Montana State University) B. Girish, Tenaya M. Rodewald, and John P. Miller Encoding of Direction and Dynamics of Air Currents by Filiform Mechanoreceptors in the Cricket Cercal System 431 Ali A. Minai (University of Cincinnati) Simona Doboli and Phillip J. Best A Latent Attractors Model of Context-selection in The Dentate Gyrus-hilus System 268 Farhad K. Mosallaie (Baylor College of Medicine) John A. Halter and Andrew R. Blight Effects of Activity Dependent Ion Concentration on Repetitive Firing in the Myelinated Axon 528 Frank Moss (University of Missouri at St. Louis) Hans A. Braun, M. Dewald, M. Huber, K. Voigt, and Xing Pei Unstable Periodic Orbits and Chaos in Thermally Sensitive Neurons 272 Frank Moss (University of Missouri at St. Louis) Peter Jung and Ann Cornell-Bell Noise Mediated Spiral Waves in Glial Cell Networks Show Evidence of Self Organized Critical Behavior 347 Frank Moss (University of Missouri) Xing Pei, Kevin Dolan, and Ying-Cheng Lai Counting and Scaling Unstable Periodic Orbits in Biological and Physical Systems 252 Karen Anne Moxon (Allegheny University) John K. Chapin Cortico-thalamic Interactions in Response to Whisker Stimulation in a Computer Model of the Rat Barrel System 222 Robert Muller (SUNY Health Science Center at Brooklyn) Andre Fenton and Gyorgy Csizmadia Conjoint Control of Place Cell Activity by Two Visual Stimuli 372 Shingo Murakami (The University of Tokyo) Akira Hirose Proposal of Microscopic Nerve-cell-activity Analysis Theory for Elucidating Membrane Potential Dynamics 328 John S.
Nafziger (University of Pennsylvania) Shih-Cheng Yen and Leif H. Finkel Effects of Element Spacing on the Detection of Contours: Psychophysical and Modeling Studies 400 Hirofumi Nagashino (The University of Tokushima) Minoru Kataoka and Yohsuke Kinouchi A Coupled Neural Oscillator Model for Recruitment and Annihilation of the Degrees of Freedom of Oscillatory Movements 95 Bruno A. Olshausen (University of California, Davis) A Functional Model of V1 Horizontal Connectivity Based on the Statistics of Natural Images 468 Tetsuya Oyamada (The University of Electro-Communications) Yoshiki Kashimori and Takeshi Kambara A Neural Network Model of Olfactory System for Odor Recognition and Memorization Controlled by Amygdala 459 Xing Pei (University of Missouri at St. Louis) Lon Wilkens and Winfried Wojtenek The Site Of Endogenous Oscillation In The Electrosensory Primary Afferent Nerve Fiber In The Paddlefish, Polyodon Spathula 332 J. S. Pezaris (California Institute of Technology) M. Sahani and R. A. Andersen Response Correlations in Parietal Cortex 538 TUESDAY, JULY 28, 1998 9:00 General Information 9:15 Featured Contributed Talk: Raymon M. Glantz (Rice University) A Cellular Model for the Mechanism of Directional Selectivity in Tangential Cells of the Crayfish Visual System Contributed Talks 10:05 Hermann Schobesberger (University of Pittsburgh) Boris S. Gutkin and John P. Horn A Minimal Model for Metabotropic Modulation of Fast Synaptic Transmission and Firing Properties in Bullfrog Sympathetic B Neurons 10:25 Taraneh Ghaffari-Farazi (University of Southern California) T. Ghaffari-Farazi, J.-S. Liaw, and T.W. Berger Morphological Impacts on Synaptic Transmission 10:45 Niraj S. Desai (Brandeis University) Lana C. Rutherford, Sacha B. Nelson, and Gina G. Turrigiano Activity Regulates the Intrinsic Excitability of Neocortical Neurons 11:05 Break 11:20 Dieter Jaeger (Emory University) Volker Gauck The Response Function of Different Types of Neurons for Artificial Synaptic Input Applied with Dynamic Current Clamping. 11:40 Máté Lengyel (KFKI, Research Inst.) Ádám Kepecs and Péter Érdi An Investigation of Location-dependent Differences between Somatic and Dendritic IPSPs 12:00 Farzan Nadim (Brandeis University) Yair Manor, Nancy Kopell, and Eve Marder Frequency Regulation by a Synapse Acting as a Switch: A Role for Synaptic Depression of Graded Transmission 12:20 Lunch Break and Poster Preview Session C 2:00 Invited Talk: TBA Contributed Talks 2:50 Christopher A. Del Negro (UCLA) Chie-Fang Hsiao and Scott H. Chandler Orthodox and Unorthodox Dynamics Govern Bursting Behavior in Rodent Trigeminal Neurons 3:10 Mike Neubig (University Laval) Alain Destexhe Changes in the Subcellular Localization of T-type Calcium Channels Change the Threshold and Strength of its Bursts 3:30 Paul Rhodes (National Institutes of Health) Regulation of Na+ And K+ Channel Gating Controls the Electrical Properties of Pyramidal Cell Dendrites 3:50 Break 4:10 Jaap van Pelt (Netherlands Institute for Brain Research) Harry B. M. Uylings Modeling the Natural Variability in Neuronal Branching Patterns 4:30 Adrian Robert (UC San Diego) Pyramidal Arborizations and Activity Spread in Neocortex 4:50 Geoffrey J. Goodhill (Georgetown University Medical Center) Jeffrey S.
Urbach Mathematical Analysis of Gradient Detection by Growth Cones 5:20 End of Day Announcements 8:00 Poster Session C Christian Piepenbrock (Technical University of Berlin) Klaus Obermayer Effects of Lateral Competition in the Primary Visual Cortex on the Development of Topographic Projections and Ocular Dominance Maps 381 Panayiota Poirazi (University of Southern California) Bartlett Mel Memory Capacity of Neurons with Active Dendrites 233 Sergei Rebrik (University of California, San Francisco) Brian D. Wright and Ken Miller Cross Channel Correlations in Tetrode Recordings: Implications for Spike-sorting 434 Chris Roehrig (University of British Columbia) Catharine H. Rankin Dymods: A Framework for Modularizing Dynamical Neuronal Structures 558 Jonathan Rubin (The Ohio State University) David Terman Geometric Analysis of Neural Firing Patterns in Network Models with Fast Inhibitory Synapses 614 Eytan Ruppin (Tel-Aviv University) Gal Chechik and Isaac Meilijson Neuronal Regulation: A Biologically Plausible Mechanism for Efficient Synaptic Pruning in Development 572 Maureen E. Rush (California State University, Bakersfield) William Ott A-current Modulation of Low-threshold Spiking 481 Ilya A. Rybak (DuPont Central Research) Michael L. Ramaker and James S. Schwaber Modeling Interacting Neural Populations: Dynamics, State Transitions and Applications to Particular Models 83 Cristiane Salum (St. George's Hospital Medical School) Alan D. Pickering Striatal Dopamine in Reinforcement Learning: A Computational Model 326 Hermann Schobesberger (University of Pittsburgh) Boris S. Gutkin and John P. Horn A Minimal Model for Metabotropic Modulation of Fast Synaptic Transmission and Firing Properties in Bullfrog Sympathetic B Neurons 385 Simon Schultz (Oxford University) Stefano Panzeri, Alessandro Treves, and Edmund T. Rolls Correlated Firing and the Information Represented by Neurons in Short Epochs 325 Nicolas Schweighofer (ERATO) Kenji Doya and Mitsuo Kawato A Model of the Electrophysiological Properties of the Inferior Olive Neurons 197 Peggy Series (Ecole Normale Superieure) Philippe Tarroux Synchrony and Delay Activity in Cortical Network Models 311 Ladan Shams (University of Southern California) Christoph von der Malsburg Are Object Shape Primitives Learnable? 63 Lokendra Shastri (International Computer Science Institute) Recruitment of Binding-match and Binding-error Detector Circuits Via Long-term Potentiation and Depression 427 Natalia Shevtsova (University of Maryland) James A. Reggia Lateralization in a Bihemispheric Neural Model of Letter Identification 294 Ying Shu (Univ. of Southern California) Xiaping Xie, Jim-shih Liaw, and Ted W. Berger A Protocol-based Simulation for Linking Computational and Experimental Studies 491 Jonathan Z. Simon (University of Maryland) Catherine E. Carr and Shihab A. Shamma A Dendritic Model of Coincidence Detection in the Avian Brainstem 297 Frances K. Skinner The Toronto Hospital Research Institute Liang Zhang, Jose Luis Perez Velazquez, and Peter L. Carlen Bursting: A Role for Gap-junctional Coupling 370 Gregory D. Smith National Institute of Health Charles L. Cox, S. Murray Sherman, and John Rinzel Fourier Analysis of Sinusoidally Driven Thalamocortical Relay Neurons and a Minimal Integrate-and-fire-or-burst Model 65 Sheryl S. Smith Allegheny University Ronald S. Markowitz, Chris I. deZeeuw and John K.
Chapin Hormone Modulation of Synchronized Inferior Olivary Ensembles During Rapid Vibrissa Movement: Association with Increased Levels of Gap Junction Proteins 218 Jacob Spoelstra University of Southern California Michael A. Arbib and Nicolas Schweighofer Cerebellar Adaptive Control of a Biomimetic Manipulator 466 Klaas Enno Stephan (C&O-Vogt Institute) Rolf Kötter Objective Relational Transformation (ort) - A New Foundation for Connectivity Databases 251 Michael Stiber (Univ. of Washington) Bilin Zhang Stiber, Edwin R. Lewis, and Kenneth R. Henry Categorization of Gerbil Auditory Fiber Responses 162 Susanne Still (Institut fuer Neuroinformatik) Gwendal Le Masson Traveling Waves with Asymmetric Phase-lags in a Ring of Three Inhibitory Coupled Model Neurons 585 Fahad Sultan (University Tuebingen) A Model of Temporal and Activity Dependent Mechanisms Underlying the Phylogenetic Development of Cerebellar Molecular Interneuron Morphology 504 Daniel Suta (Johns Hopkins University) Eric D. Young Computer Simulations of Dorsal Cochlear Nucleus Neuronal Circuits 377 Joel Tabak (NINDS/NIH) Walter Senn, Michael O'Donovan, and John Rinzel Comparison of Two Models for Pattern Generation Based on Synaptic Depression 421 David C. Tam (University of North Texas) A Spike Train Analysis for Detecting Temporal Integration in Neurons 413 Shoji Tanaka (Yale University School of Medicine) Shuhei Okada Functional Prefrontal Cortical Circuitry for Visuospatial Working Memory Formation: A Computational Model. 94 Masami Tatsuno (Waseda University) Yoji Aizawa Network Model of Synaptic Modification Induced by Time-structured Stimuli in the Hippocampal Ca1 Area 158 Kathleen Taylor (University of Oxford) John Stein Attention, Intention and Salience in the Posterior Parietal Cortex 261 Jesper Tegnér (Nobel Institute for Neurophysiology) Jeanette Hellgren-Kotaleski The Synaptic Nmda Component Affects the Synchronization between Neural Oscillators 201 Simon Thorpe (Centre de Recherche Cerveau et Cognition) Van Rullen Rufin Spatial Attention in Asynchronous Neural Networks 335 Hiroyuki Uchiyama (Kagoshima University) Avian Centrifugal Visual System: A Possible Neural Substrate for Selective Visual Attention 407 Michael S. Wehr (Caltech) John Pezaris and Maneesh Sahani Simultaneous Paired Intracellular and Tetrode Recordings for Evaluating The Performance of Spike Sorting Algorithms. 507 Thomas Wennekers (University of Ulm, Germany) Friedrich T. Sommer Gamma-oscillations Support Optimal Retrieval in Associative Memories of Pinsky-rinzel Neurons 376 Ralf Wessel (UCSD) William B. Kristan Jr and David Kleinfeld Spatial Distribution of Voltage-gated Channels in the Neurites of the Leech Anterior Pagoda Cell: Functional Consequences for Nonlinear Synaptic Integration 142 Justin C. Williams (Arizona State University) Robert Rennaker, David Pellinen, and Daryl Kipke Towards A Long Term Neural Interface: Unit Stability in Chronic, Multichannel Recordings 492 Simon A. J. Winder (Microsoft Corporation) A Model for Biological Winner-take-all Neural Competition Employing Inhibitory Modulation of Nmda-mediated Excitatory Gain 536 Laurenz Wiskott (The Salk Institute) Learning Invariance Manifolds 108 Russell Witte (Arizona State University) Glen Hattrup, Justin Williams, and Daryl Kipke Pursuing Dynamic Reorganization in Auditory Cortex Using Chronic, Multichannel Microelectrodes in Awake, Behaving Animals. 560 Shih-Cheng Yen (University of Pennsylvania) Elliot D. Menschik and Leif H.
Finkel Synchronization and Desynchronization in Striate Cortical Networks 395 Katherine R. Zaremba (Arizona State University) Steven M. Baer Relaxation Oscillators and Bursters Coupled Through Passive Cables 573 Ying Zhou (Rhode Island College) Walter Gall Including a Second Inward Conductance in Morris and Lecar Dynamics 211 WEDNESDAY, July 29, 1998 9:30 General Information 9:45 Featured Contributed Talk: Volker Steuber (University of Edinburgh) and David Willshaw A Model of Intracellular Signalling Can Implement Radial Basis Function Learning in Cerebellar Purkinje Cell Contributed Talks 10:35 Berthold Ruf (Institute for Theoretical Computer Science) Thomas Natschlaeger Pattern Analysis with Spiking Neurons Using Delay Coding 10:55 Arthur D. Kuo (University of Michigan) Mark J. Evans An Adaptive Filter Model of Velocity Storage in Visual-vestibular Interactions 11:15 Yoshiki Kashimori (Univ. of Electro-Communications, Japan) Takeshi Kambara Neural Mechanism of Adaptive Suppression of Background Signals Arising from Tail Movements in the Gymnotid Electrosensory System 11:35 Daniel A. Butts (Lawrence Berkeley Laboratory) Marla B. Feller, Carla J. Shatz, and Daniel S. Rokhsar The Developing Retina Motivates a General Model of Neural Waves 11:55 Federal Funding Opportunities 12:30 Lunch Break 2:00 Workshops I - Organization 9:30 Rock and Roll Jam Session THURSDAY, July 30, 1998 9:30 General Information 9:45 Featured Contributed Talk: Xiao-Jing Wang (Brandeis University) Yinghui Liu and Baylor Fox Neuronal Mechanisms of Working Memory in Prefrontal Cortex Contributed Talks 10:35 David Horn (Tel Aviv University) Nir Levy and Eytan Ruppin The Importance of Nonlinear Dendritic Processing in Multimodular Memory Networks 10:55 Song Chun Zhu (Stanford University) From Local Features to Global Perception: Computing Texture and Shape in Markov Random Fields 11:15 Enrico Simonotto (Univ. Missouri at St. Louis) F.Spano, M.Riani, A. Ferrari, F. Levrero, A. Pillot, P. Renzetti, R.C. Parodi, F. Sardanelli, P. Vitali, J. Twitty, and F. Moss FMRI Studies of Visual Cortical Activity during Noise Stimulation 11:35 Featured Contributed Talk: Frank Moss (University of Missouri at St. Louis) David F. Russell Animal Behavior Enhanced By Noise 12:25 Business Meeting 1:00 Lunch Break 2:30 Workshops II - Organization 9:30 Banquet - Life is a beach ************************************************************************ CNS*98 REGISTRATION FORM Last Name: First Name: Title: Student___ Graduate Student___ Post Doc___ Professor___ Committee Member___ Other___ Organization: Address: City: State: Zip: Country: Telephone: Email Address: REGISTRATION FEES: Technical Program --July 26 - July 30, 1998 Regular $225 ($250 after June 26th) Price Includes One Banquet Ticket Student $ 95 ($125 after June 26th) Price Includes One Banquet Ticket Each Additional Banquet Ticket $35 Total Payment: $ Please Indicate Method of Payment: Check or Money Order * Payable in U. S. Dollars to CNS*98 - Caltech * Please make sure to indicate CNS*98 and YOUR name on all money transfers. Charge my card: Visa Mastercard American Express Number: Expiration Date: Name of Cardholder: Signature as appears on card (for mailed in applications): Date: ADDITIONAL QUESTIONS: Previously Attended: CNS*92___ CNS*93___ CNS*94___ CNS*95___ CNS*96___ CNS*97___ Did you submit an abstract and summary? ( ) Yes ( ) No Title: Do you have special dietary preferences or restrictions (e.g., diabetic, low sodium, kosher, vegetarian)?
If so, please note: Some grants to cover partial travel expenses may become available for students and postdoctoral fellows who present papers at the meeting. Do you wish further information? ( ) Yes ( ) No PLEASE FAX OR MAIL REGISTRATION FORM TO: Caltech, Division of Biology 216-76, Pasadena, CA 91125 Attn: Judy Macias Fax Number: (626) 795-2088 (Refund Policy: 50% refund for cancellations on or before July 17th, no refund after July 17th) ******************************************************************** ************************************************************************ PLEASE CALL FESS PARKER'S DOUBLETREE RESORT TO MAKE HOTEL RESERVATIONS AT (800) 879-2929, (805) 564-4333 Fax (805)564-4964 PLEASE NOTE: YOU CAN MAIL REGISTRATION FORM TWO WAYS * MAIL REGISTRATION FORM TO FESS PARKER'S DOUBLETREE RESORT AT THE ADDRESS BELOW * FAX REGISTRATION TO (805) 564-4964 ********************************************************************** MAIL TO: Fess Parker's Doubletree Resort Attn: Reservation Department 633 East Cabrillo Boulevard Santa Barbara, CA 93103-9932 Check-In Time: 4:00 p.m. Check-Out Time: 12:00 noon Computational Neuroscience Conference - CNS*98 July 26 - 30, 1998 REQUESTS MUST BE RECEIVED BY: JUNE 24, 1998 Name of Person Requesting Rooms: Last Name:____________________________ First Name:____________________________ Company Name:________________________ Institute:_______________________________ Street Address or PO Box Number:___________________________ City:___________________________________ State:___________________________________ Zip Code:________________________________ Area Code and Phone Number:______________________________ ARRIVAL (DAY/DATE)______________ TIME ______________ DEPARTURE (DAY/DATE)____________ TIME ______________ ACCOMMODATIONS RATES Student.................................$99.00 - 30 student room rates available (Students will be asked to verify their status on check in with a student ID or other documentation.) SINGLE $139.00 Double $139.00 ___ I prefer a non-smoking room. Deposit non-refundable if not cancelled 72 hours before arrival. FOR RESERVATIONS OR CANCELLATIONS OR OTHER INFORMATION, PLEASE CALL DIRECT (800)879-2929 (INSIDE U.S.), (805)564-4333, FAX (805) 564-4964 THIS IS A RESERVATION REQUEST AND MUST BE GUARANTEED BY A DEPOSIT OR AN ACCEPTED CREDIT CARD NUMBER AND SIGNATURE: ____ Guaranteed by my first night's deposit (check or Money Order enclosed) ____ Guaranteed by my credit card (Visa___, MasterCard___, American Express___, Diners Club___, or Carte blanche.) Credit Card No. ______________________________________ Expiration Date:______________________________________ I understand that I am liable for one night's room and tax which will be deducted from my deposit or billed through my credit card. Cancellation will be subject to current hotel policy and handling charge. Signature:__________________________ There is a 72 hour cancellation policy in effect at the resort. Rooms are subject to 10 % Santa Barbara Occupancy Tax. Children under 18 free when sharing room with adult. If the room type requested is not available, the next available room type will be assigned. If your group has a range of rates and the rate category requested has been filled, then the next available rate will apply.
Special Requests:________________________________________________________ ______________________________________________________________________ ______________________________________________________________________ ______________________________________________________________________ All special requests are on a space availability basis. From ted.carnevale at yale.edu Thu Jun 18 09:45:30 1998 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Thu, 18 Jun 1998 09:45:30 -0400 Subject: the NEURON summer course! Message-ID: <358919FA.35D@yale.edu> There have been so many re-announcements of other summer events through multiple channels that I was a bit hesitant to add to the din--but just in case you haven't heard: Time is running out to register for the NEURON course we're presenting August 1-5 at the San Diego Supercomputer Center. Salient features: 1. lots of hands-on exercises using NEURON on UNIX and Windows NT workstations 2. covers everything you need to know to get started 3. topics include: --using real anatomical and biophysical data --extending NEURON's library of biophysical mechanisms with NMODL --using NEURON's built-in tools for automated data analysis and model optimization --strategies for increasing model efficiency and accuracy --accelerating simulations with the new variable order, variable timestep integration method --NEURON's powerful tools for electrotonic analysis (as described in Carnevale et al. Comparative electrotonic analysis of three classes of rat hippocampal neurons. JNP 78:703-720, 1997) An early version of this toolkit is described at http://www.neuron.yale.edu/papers/ebench/ebench.html --project management --customizing the graphical user interface --using NEURON to model networks of neurons Learn all this and more from the experts! For more information and the registration form, see http://www.neuron.yale.edu/sdsc98/sdsc98.htm The registration deadline is approaching rapidly, and only a few seats are left, so don't delay! --Ted From stefan.wermter at sunderland.ac.uk Thu Jun 18 13:14:21 1998 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Thu, 18 Jun 1998 18:14:21 +0100 Subject: Several additional PhD stipends Message-ID: <35894AEC.34C91471@sunderland.ac.uk> Several openings exist for PhD students/researchers. Please note the different sources for further information. Please forward to potential researchers/students. Apologies if you are on multiple mailing lists. -------------------------------------------------- UNIVERSITY OF SUNDERLAND SCHOOL OF COMPUTING AND INFORMATION SYSTEMS The School of Computing and Information Systems is pleased to be able to offer a small number of studentships for candidates to study for MPhil and PhD (full-time). The School has a growing research reputation and is keen to build upon its 3A rating in the 1996 Research Assessment Exercise. Applicants should have a good honours degree in Computer Science, and a strong desire to undertake high quality research in an ambitious and thriving School. Studentships are available to EU citizens and cover all fees and maintenance support of approximately 5,000 per annum. Studentships will commence in October 1998.
Projects are available in the following areas: Neural Networks Speech and Image Processing Human Computer Interaction Natural Language Processing Information Retrieval Digital Media Software Engineering Decision Support Systems For a detailed list of projects please contact: Mark Hindmarch Email: mark.hindmarch at sunderland.ac.uk School of Computing and Information Systems University of Sunderland St Peters Campus St Peters Way Sunderland SR6 0DD UK Application is by CV to Mark Hindmarch at the above address. Closing date 17 July 1998. ---------------------------------------------------- ---------------------------------------------------- ---------------------------------------------------- 2. Researcher in Neural and Intelligent Systems (reference number CIRG28) Applications are invited for a three year research assistant position in the School of Computing and Information Systems investigating the development of hybrid neural/symbolic techniques for intelligent processing. This is an exciting new project which aims at developing new environments for integrating neural networks and symbolic processing. You will play a key role in the development of such hybrid subsymbolic/symbolic environments. It is intended to apply the developed hybrid environments in areas such as natural language processing, intelligent information extraction, or the integration of speech/language in multimedia applications. You should have a degree in a computing discipline and will be able to register for a higher degree. A demonstrated interest in artificial neural networks, software engineering skills and programming experience are essential (preferably including a subset of C, C++, CommonLisp, Java, GUI). Experience and interest in neural network software and simulators would be an advantage (e.g. Planet, SNNS, Tlearn, Matlab, etc). Salary is according to the researcher A scale (currently up to 13,871, under revision). Application forms and further particulars are available from the Personnel department on +44 191 515, extensions 2055, 2429, 2054, 2046, or 2425, or by e-mail to employee.recruitment at sunderland.ac.uk, quoting the reference number CIRG28. For informal enquiries please contact Professor Stefan Wermter, e-mail: Stefan.Wermter at sunderland.ac.uk. Closing date: 10 July 1998. The successful candidate is expected to start the job as soon as possible. ******************************************** Professor Stefan Wermter University of Sunderland Dept. of Computing & Information Systems St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://osiris.sunderland.ac.uk/~cs0stw/ ******************************************** From mel at lnc.usc.edu Thu Jun 18 14:57:33 1998 From: mel at lnc.usc.edu (Bartlett Mel) Date: Thu, 18 Jun 1998 11:57:33 -0700 Subject: Paper Announcement: Active Dendrites & Complex Cells Message-ID: <3589631D.CCB14428@lnc.usc.edu> Members of the connectionist community may be interested in the following paper in the June issue of the Journal of Neuroscience: --------- "Translation-Invariant Orientation Tuning in Visual 'Complex' Cells Could Derive from Intradendritic Computations" Bartlett W. Mel, Daniel L. Ruderman, and Kevin A.
Archie ---------- Journal of Neuroscience Online: http://www.jneurosci.org/cgi/content/full/18/11/4325 Preprint, via our lab web page (click on Publications): http://lnc.usc.edu/ ----------- ABSTRACT Hubel and Wiesel (1962) first distinguished ``simple'' from ``complex'' cells in visual cortex, and proposed a processing hierarchy in which rows of LGN cells are pooled to drive oriented simple cell subunits, which are pooled in turn to drive complex cells. Though parsimonious and highly influential, the pure hierarchical model has since been challenged by results indicating many complex cells receive excitatory monosynaptic input from LGN cells, or do not depend on simple cell input. Alternative accounts for complex cell orientation tuning remain scant, however, and the function of monosynaptic LGN contacts onto complex cell dendrites remains unknown. We have used a biophysically detailed compartmental model to investigate whether nonlinear integration of LGN synaptic inputs within the dendrites of individual pyramidal cells could contribute to complex-cell receptive field structure. We show that an isolated cortical neuron with ``active'' dendrites, driven only by excitatory inputs from overlapping ON- and OFF-center LGN subfields, can produce clear phase-invariant orientation tuning---a hallmark response characteristic of a complex cell. The tuning is shown to depend critically upon both the spatial arrangement of LGN synaptic contacts across the complex cell dendritic tree, established by a Hebbian developmental principle, and on the physiological efficacy of excitatory voltage-dependent dendritic ion channels. We conclude that unoriented LGN inputs to a complex cell could contribute in a significant way to its orientation tuning, acting in concert with oriented inputs to the same cell provided by simple cells or other complex cells. As such, our model provides a novel, experimentally testable hypothesis regarding the basis of orientation tuning in the complex cell population, and more generally, underscores the potential importance of nonlinear intradendritic subunit processing in cortical neurophysiology. -- Bartlett W. Mel (213)740-0334, -3397(lab) Assistant Professor of Biomedical Engineering (213)740-0343 fax University of Southern California, OHE 500 mel at lnc.usc.edu, http://lnc.usc.edu US Mail: BME Department, MC 1451, USC, Los Angeles, CA 90089 Fedex: 3650 McClintock Ave, 500 Olin Hall, LA, CA 90089 From Kim.Plunkett at psy.ox.ac.uk Fri Jun 19 06:49:21 1998 From: Kim.Plunkett at psy.ox.ac.uk (Kim Plunkett) Date: Fri, 19 Jun 1998 11:49:21 +0100 (BST) Subject: PostDoc position Message-ID: <199806191049.LAA14247@pegasus.psych.ox.ac.uk> University of Oxford Department of Experimental Psychology Applications are invited for a post-doctoral research assistant to work on a programme of research concerned with investigating the nature and causes of language disorders in children. The project is funded for five years by the Wellcome Trust and aims to further understanding of developmental language disorders, in terms of both etiology and cognitive processes. Details of the research programme can be found on the web-site: http://www.mrc-apu.cam.ac.uk/personal/dorothy.bishop/wellcome Post doctoral Research Assistant Academic-Related Research Staff Grade 1A: Salary stlg15,159 - stlg22,785 Candidates should hold, or expect to hold by the time of appointment, a doctoral qualification and relevant research background in connectionist modelling.
The post holder will be responsible for developing simulations of the effects of auditory deficits on language acquisition. The post holder will work both independently and as part of an integrated team actively engaged with all stages of the research process. For this post quote Reference: RA/dvmb2. Interviews are planned for 12 August 1998. Dorothy Bishop MRC Senior Scientist, MRC Cognition and Brain Sciences Unit (formerly Applied Psychology Unit) 15, Chaucer Road, Cambridge, UK, CB2 2EF. tel: UK: 01223 355294 ex 850 overseas: 44 1223 355294 ex 850 fax: UK 01223 359062 overseas: 44 1223 359062 email: dorothy.bishop at mrc-apu.cam.ac.uk World Wide Web: http://www.mrc-apu.cam.ac.uk/personal/dorothy.bishop/ From jkh at dcs.rhbnc.ac.uk Fri Jun 19 16:01:26 1998 From: jkh at dcs.rhbnc.ac.uk (Keith Howker) Date: Fri, 19 Jun 1998 16:01:26 +0100 Subject: Technical Report Series in Neural and Computational Learning (second attempt to send) Message-ID: <01BD9B9B.86E172C0@pc7.cs.rhbnc.ac.uk> [Apologies if you get more than one copy: the first attempt ] [had some non-deliveries. rgds, K. ] The European Community ESPRIT Working Group in Neural and Computational Learning Theory (NeuroCOLT) has been funded for a further three years as the Working Group NeuroCOLT2. The overall objective of the Working Group is to demonstrate the effectiveness of technologies that arise from a deep understanding of the performance and implementation of learning systems on real world data. We will continue to maintain the Technical Report Archive of papers produced by members or associates of the group's partners. We have created a new web site: http://www.neurocolt.com/ which gives more information about the project, including access to the Technical Reports (see below for further access instructions). Best wishes John Shawe-Taylor -------------------------------------------------------------------- Titles follow: Abstracts are available via the Web Site
1998 Document Archive (Ref. -- Title -- Author)
1998-abs  Complete Abstract File For 1998
1998-001  JNN, a Randomized Algorithm for Learning Multilayer Networks in Polynomial Time -- Elisseeff & Paugam-Moisy
1998-002  A comparison of non-informative priors for Bayesian networks -- Grunwald
1998-003  Data-Dependent Structural Risk Minimisation for Perceptron Decision Trees -- Shawe-Taylor
1998-004  Are Lower Bounds Easier over the Reals? -- Fournier & Koiran
1998-005  Query, PACS and simple-PAC Learning -- Castro & Guijarro
1998-006  The Real Dimension Problem is NPR-Complete -- Koiran
1998-007  Elimination of Parameters in the Polynomial Hierarchy -- Koiran
1998-008  Bayesian Classifiers are Large Margin Hyperplanes in a Hilbert Space -- Cristianini, Shawe-Taylor, Sykacek
1998-009  Learning via Internal Representation -- Dichterman
1998-010  Discrete versus analog computation: Some aspects of studying the same problem in different computational models -- Meer
1998-011  How many connected components must a difficult set have? -- Matamala & Meer
1998-012  The Separation Theorem for the Relation Classes Associated to the Extended Grzegorczyk Classes -- Gakwaya
1998-013  Isomorphism Theorem for BSS Recursively Enumerable Sets over Real Closed Fields -- Michaux & Troestler
1998-014  Efficient Read-Restricted Monotone CNF/DNF Dualization by Learning with Membership Queries -- Domingo, Mishra, Pitt
1998-015  Equality Is a Jump -- Boldi & Vigna
1998-016  Multiplicative Updatings for Support-Vector Learning -- Cristianini, Campbell, Shawe-Taylor
1998-017  Dynamically Adapting Kernels in Support Vector Machines -- Cristianini, Campbell, Shawe-Taylor
1998-018  Practical Algorithms for On-line Sampling -- Domingo, Gavalda, Watanabe
--------------------------------------------------------------------------- ***************** ACCESS INSTRUCTIONS ****************** The files and abstracts may be accessed via WWW starting from the NeuroCOLT homepage: http://www.neurocolt.com/ or from the archive: ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports Alternatively, it is still possible to use ftp access as follows: % ftp ftp.dcs.rhbnc.ac.uk (134.219.96.1) Name: anonymous password: your full email address ftp> cd pub/neurocolt/tech_reports/1998 ftp> binary ftp> get nc-tr-1998-001.ps.Z ftp> bye % zcat nc-tr-1998-001.ps.Z | lpr Similarly for the other technical reports. In some cases there are two files available, for example, nc-tr-97-002-title.ps.Z nc-tr-97-002-body.ps.Z The first contains the title page while the second contains the body of the report. The single command, ftp> mget nc-tr-97-002* will prompt you for the files you require. --------------------------------------------------------------------- | Keith Howker | e-mail: jkh at dcs.rhbnc.ac.uk | | Dept. of Computer Science | Phone : +44 1784 443696 | | RHUL | Fax : +44 1784 439786 | | EGHAM TW20 0EX, UK | Home: +44 1932 222529 | --------------------------------------------------------------------- From juergen at idsia.ch Fri Jun 19 11:14:34 1998 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Fri, 19 Jun 1998 17:14:34 +0200 Subject: locoface mirrors Message-ID: <199806191514.RAA27565@ruebe.idsia.ch> Since the announcement on Mon Jun 15 1998 we have experienced unusually strong demand for an HTML document entitled "Facial beauty and fractal geometry." Due to limited capacity several thousand of the many download attempts led to incomplete results. I am very sorry for this. Friendly observers on the web, however, noticed our problems and established mirror sites.
Darrin Chandler's mirror in the US: http://stilyagin.com/locoface/ Axel deKimpe's mirror: http://www.uoglobe.net/wm/idsia/index.html Another copy can now be found in the new cogprint archive: http://cogprints.soton.ac.uk/search?dom=Authors&query=Schmidhuber_J My original link temporarily broke down due to the unexpected overload but should be working again: http://www.idsia.ch/~juergen/locoface/locoface.html Juergen Schmidhuber, IDSIA www.idsia.ch From nic at idsia.ch Fri Jun 19 18:24:22 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Sat, 20 Jun 1998 00:24:22 +0200 Subject: two technical reports Message-ID: <199806192224.AAA01683@idsia.ch> Dear colleagues, the following two papers are available by anonymous ftp: Technical Report IDSIA-32-98 (to be presented at ICANN'98) Slope Centering: Making Shortcut Weights Effective -------------------------------------------------- Nicol N. Schraudolph Shortcut connections are a popular architectural feature of multi-layer perceptrons. It is generally assumed that by implementing a linear sub-mapping, shortcuts assist the learning process in the remainder of the network. Here we find that this is not always the case: shortcut weights may also act as distractors that slow down convergence and can lead to inferior solutions. This problem can be addressed with slope centering, a particular form of gradient factor centering. By removing the linear component of the error signal at a hidden node, slope centering effectively decouples that node from the shortcuts that bypass it. This eliminates the possibility of destructive interference from shortcut weights, and thus ensures that the benefits of shortcut connections are fully realized. ftp://ftp.idsia.ch/pub/nic/slope.ps.gz Technical Report IDSIA-33-98 (submitted to NIPS*98) Accelerated Gradient Descent by Factor-Centering Decomposition -------------------------------------------------------------- Nicol N. Schraudolph Gradient factor centering is a new methodology for decomposing neural networks into biased and centered subnets which are then trained in parallel. The decomposition can be applied to any pattern-dependent factor in the network's gradient, and is designed such that the subnets are more amenable to optimization by gradient descent than the original network: biased subnets because of their simplified architecture, centered subnets due to a modified gradient that improves conditioning. The architectural and algorithmic modifications mandated by this approach include both familiar and novel elements, often in prescribed combinations. The framework suggests for instance that shortcut connections -- a well-known architectural feature -- should work best in conjunction with slope centering, a new technique described herein. Our benchmark experiments bear out this prediction, and show that factor-centering decomposition can speed up learning significantly without adversely affecting the trained network's generalization ability. ftp://ftp.idsia.ch/pub/nic/facede.ps.gz Best wishes, -- Dr. Nicol N.
Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland From Sebastian_Thrun at heaven.learning.cs.cmu.edu Fri Jun 19 18:55:19 1998 From: Sebastian_Thrun at heaven.learning.cs.cmu.edu (Sebastian Thrun) Date: Fri, 19 Jun 1998 18:55:19 -0400 Subject: Vacant positions at CMU Message-ID: *** please forward *** Carnegie Mellon University seeks applications for several vacant positions in the areas of * machine learning / neural networks * multi-agent systems * distributed databases * adaptable software * distributed mobile robotics * security for U.S.-Government-funded research projects jointly carried out by the Computer Science Department (CSD), the Institute for Complex Engineered Systems (ICES), the Robotics Institute (RI), and the newly created Center for Automated Learning and Discovery (CALD). Applications are solicited at the research programmer, postdoc, and research faculty level. Prospective research programmers should hold a B.S. degree (or equivalent) and have extensive programming experience in C, C++ and/or Java. Prospective Postdocs and research faculty should hold a Ph.D. degree and have strong interests in scientific research and track records in one or more areas listed above. Applications from outside the US are welcome. Applications should include a CV, a statement of interest (1-2 pages), a recent relevant paper (if available), and a list of three or more references. We anticipate filling these positions at the earliest convenience. Applications and inquiries should be addressed to Ms. Rhonda L Moyer Institute for Complex Engineered Systems Carnegie Mellon University 5000 Forbes Ave Pittsburgh, PA 15213-3891 Carnegie Mellon University is an equal opportunity employer. CMU possesses a unique research environment with a world-renowned faculty in computer science, robotics, and engineering. ------- End of Forwarded Message ------- From mrj at cs.usyd.edu.au Sat Jun 20 23:49:47 1998 From: mrj at cs.usyd.edu.au (Mark James) Date: Sun, 21 Jun 1998 13:49:47 +1000 Subject: Ph.D. Thesis on A Model of Isocortex available Message-ID: <358C82DB.F31ADD4C@cs.usyd.edu.au> My doctoral thesis is available for download from: http://www.cs.usyd.edu.au/~mrj/AMI An Adaptive Model of Isocortex ABSTRACT A complete description of the functioning of the human brain requires understanding of three levels of neural processing: the operation of single brain neurons, how these neurons work together in each of the brain's functional modules, and how these modules interact to produce the observed sensory, motor, and cognitive abilities. This thesis is principally concerned with the second of these processing levels, providing a plausible explanation of the operation of the functional modules of the cerebral cortex, namely cortical areas. Following a review of models of the operation of single cortical neurons, a model of the neural circuitry of homotypic six-layer cortex (isocortex) is constructed. The model is in accord with much of the anatomical and physiological data, and posits computational roles for each cortical layer and for each of the main types of cortical neurons.
Analysis of the adaptive and activation dynamics of the isocortical model suggests that the neural feedback loops between cortical layers allow the cortex to perform powerful pattern recognition operations using a rule for adaptation of synaptic strengths that is constrained by biology to be much simpler than those often used in artificial neural network models. Results of computer simulations are described, which demonstrate that the model is capable of performing simple pattern discrimination and clustering tasks. The thesis concludes with an outline of ways in which the cortical model may be applied to speech and language processing. -- Mark James |EMAIL : mrj at cs.usyd.edu.au| Basser Department of Computer Science, F09 |PHONE : +61-2-9351-3423 | The University of Sydney NSW 2006 AUSTRALIA |FAX : +61-2-9351-3838 | ================- WEB: http://www.cs.usyd.edu.au/~mrj -================= From cchang at cns.bu.edu Mon Jun 22 11:41:17 1998 From: cchang at cns.bu.edu (Carolina Chang) Date: Mon, 22 Jun 1998 11:41:17 -0400 (EDT) Subject: CFP: Biomimetic Robotics - Special Issue of RAS Message-ID: Call for Papers: Biomimetic Robotics Special Issue of Robotics and Autonomous Systems Guest Editors: Carolina Chang and Paolo Gaudiano {cchang, gaudiano}@bu.edu Boston University Neurobotics Lab Department of Cognitive and Neural Systems Submission Deadline: October 31, 1998 It has been argued that today's supercomputers are able to process information at a rate comparable to that of simple invertebrates. And yet, even ignoring physical constraints, no existing algorithm running on the fastest supercomputer could enable a robot to fly around a room, avoid obstacles, land upside down on the ceiling, feed, reproduce, and perform many of the other tasks that a housefly learns to perform without external training or supervision. The apparent simplicity with which flies and even much simpler biological organisms manage to survive in a constantly changing environment suggests that a potentially fruitful avenue of research is that of understanding the mechanisms adopted by biological systems for perception and control, and applying what is learned to robots. While we may not yet be able to make a computer function as flexibly as a housefly, there have been many promising starts in that direction. The goal of this special issue is to present recent results in "biomimetic robotics", or the application of biological principles to robotics. The term "biological" in this case should be taken broadly to refer to any aspect of biological function, including for example, psychological theories or detailed models of neural function. Preference will be given to manuscripts describing original work that closely models biological principles observed in real animals and that uses real robots. Prospective authors should contact one of the guest editors as soon as possible to determine the relevance of their submission to this special issue. Authors are encouraged to submit manuscripts electronically. The final version of all accepted manuscripts should be in LaTeX, using the Elsevier style files available from the Robotics and Autonomous Systems web-page: http://www.elsevier.nl/locate/robot To submit an electronic copy of your manuscript, preferably in postscript or PDF format, upload it to the anonymous ftp site "neurobotics.bu.edu". Use "anonymous" as user name, and your e-mail as password. Change directory to pub/ras and "put" your file using binary transfer mode.
You will get detailed directions when you enter the ras directory. To expedite uploading, your document may be compressed by any commonly used compression scheme. Once you have uploaded your file to the ftp site, please send e-mail to cchang at bu.edu indicating the filename, manuscript title, and the name and contact information (electronic and surface mail address, phone, fax) for the corresponding author. The title page of the manuscript should include contact information for all authors. For printed submissions, please send your double-spaced manuscript to: Carolina Chang Boston University Neurobotics Lab Department of Cognitive and Neural Systems 677 Beacon Street Boston, MA 02215 USA From Friedrich.Leisch at ci.tuwien.ac.at Mon Jun 22 11:29:53 1998 From: Friedrich.Leisch at ci.tuwien.ac.at (Friedrich Leisch) Date: Mon, 22 Jun 1998 17:29:53 +0200 (CEST) Subject: CI BibTeX Collection -- Update Message-ID: <13710.30833.579159.418831@galadriel.ci.tuwien.ac.at> The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Neural Networks 8/5-9/3 Advances in Neural Information Processing Systems 10 Most files have been converted automatically from various source formats, please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ The NIPS proceedings source files have been generously provided by the editors. Best, Fritz Leisch -- =================================== Friedrich Leisch Institut für Statistik Tel: (+43 1) 58801 4541 Technische Universität Wien Fax: (+43 1) 504 14 98 Wiedner Hauptstraße 8-10/1071 Friedrich.Leisch at ci.tuwien.ac.at A-1040 Wien, Austria http://www.ci.tuwien.ac.at/~leisch PGP public key http://www.ci.tuwien.ac.at/~leisch/pgp.key =================================== From gsiegle at sunstroke.sdsu.edu Tue Jun 23 13:35:36 1998 From: gsiegle at sunstroke.sdsu.edu (Greg Siegle) Date: Tue, 23 Jun 1998 10:35:36 -0700 (PDT) Subject: Connectionist models of disorder web site Message-ID: Dear Researcher, A new web site has been created as a source list for connectionist and neural network models of cognitive, affective, brain, and behavioral disorders. The site can be found at: www.sci.sdsu.edu/CAL/connectionist-models/ Please feel free to contribute references, links, comments, or suggestions. Sincerely, Greg Siegle From nenet at posta.unizar.es Tue Jun 23 19:22:36 1998 From: nenet at posta.unizar.es (Bonifacio Martin-del-Brio) Date: Tue, 23 Jun 1998 16:22:36 -0700 Subject: Summer Course (in Spanish) Message-ID: <359038BB.53582BB3@posta.unizar.es> In the following web sites you will find information on the 4th summer course 'Introduction to neural networks and fuzzy systems' (in Spanish, 'Introduccion a las redes neuronales y sistemas borrosos'), which will be taught at the 'Universidad de Verano de Teruel' (Summer University of Teruel, Spain) from the 6th to the 10th of July. There are not many courses on the subject in the Spanish language, so this course can be a great opportunity for the Spanish-speaking community. http://www.unizar.es/univerter/inicio.html http://zape.unizar.es The course is based on the following introductory textbook, written in Spanish: Bonifacio Martin-del-Brio and Alfredo Sanz, 'Redes Neuronales y Sistemas Borrosos', Editorial RA-MA, Madrid (Spain), 1997. Best regards. ------------------------------------------------------- Dr.
Bonifacio Martin-del-Brio Dept. Ingenieria Electronica y Comunicaciones Universidad de Zaragoza C. Corona de Aragon, 35. 50009 ZARAGOZA (Spain) ------------------------------------------------------- Phone: +34 976 351609 Fax: +34 976 762189 E-mail: nenet at posta.unizar.es ------------------------------------------------------- From espaa at soc.plym.ac.uk Tue Jun 23 12:04:44 1998 From: espaa at soc.plym.ac.uk (espaa) Date: Tue, 23 Jun 1998 16:04:44 GMT Subject: PAA Journal Message-ID: <17E85E45D9@scfs3.soc.plym.ac.uk> PATTERN ANALYSIS AND APPLICATIONS journal (Springer-Verlag Limited) http://www.soc.plym.ac.uk/soc/sameer/paa.htm VOLUME 1, ISSUE 2, July 1998 INDEX OF PAPERS Nonparametric Image Segmentation Thomas Kampke and Rudolf Kober Forschunginstitut fur Anwendungsorientierte Wissensverarbeitung, Germany A Monte Carlo Evaluation of the Moving Method, K-means and Self-organising Neural Networks E. W. Tyree, City University, UK J. A. Long, City University, UK Knowledge-Based Spatiotemporal Linear Abstraction Yuval Shahar, Stanford University, USA Martin Molina, Technical University of Madrid, Spain Recognition of Hand-printed Chinese Characters using Decision Trees/Machine Learning C4.5 System Adnan Amin, University of New South Wales, Australia Sameer Singh, University of Plymouth, UK Improving Stereovision Matching through Supervised Learning Gonzalo Pajares and Jesus Cruz, Universidad Complutense, Spain Beam Search and Simulated Beam Annealing for PFSA Inference Anand Raman, Massey University, New Zealand Book Reviews Pattern Classification by Juergen Shurmann Generic Object Recognition using Form and Function by Louise Stark and Kevin Bowyer World Scientific, 1996 Further enquiries to the journal should be sent to Barbara Davies, Editorial Secretary, at espaa at soc.plym.ac.uk From harnad at coglit.soton.ac.uk Wed Jun 24 08:26:15 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Wed, 24 Jun 1998 13:26:15 +0100 (BST) Subject: Expanded BBS 1998: Call for Papers Message-ID: [Apologies if you get this message more than once: sent to several lists] BBS 1998 Has Expanded by 50% CALL FOR PAPERS Behavioral and Brain Sciences Journal (BBS), founded in '78, has begun its third decade in '98 with a 50% Expansion. This means that more articles can be accorded Open Peer Commentary, the feature that has had such a great impact on the international cognitive and biobehavioral science community. (BBS's ISI Impact Factor of 15 is nearly three times that of the highest-impact Psychology journal and is one of the 25 highest among all 6500 science, social science and Arts/Humanities journals indexed by ISI.) BBS is a unique scientific communication medium, providing the service of Open Peer Commentary for reports of significant current work in psychology, neuroscience, behavioral biology and cognitive science. If a manuscript is judged by BBS referees and editors to be appropriate for Commentary it is circulated to a large number of commentators across disciplines and around the world. The target article, commentaries, and authors' responses then co-appear in BBS. To be eligible for publication, a paper should not only meet the standards of a journal such as Psychological Review or the International Review of Neurobiology in terms of conceptual rigor, empirical grounding, and clarity of style, but should also offer a clear rationale for soliciting Commentary.
A BBS target article can be (i) the report and discussion of empirical research that the author judges to have broader scope and implications than might be more appropriately reported in a specialty journal; (ii) an unusually significant theoretical article that formally models or systematizes a body of research; or (iii) a novel interpretation, synthesis, or critique of existing experimental or theoretical work. Occasionally, articles dealing with social or philosophical aspects of the behavioral and brain sciences will be considered. Multiple reviews of books also appear. BBS's Web Pages: http://www.princeton.edu/~harnad/bbs.html http://www.cogsci.soton.ac.uk/bbs Email: bbs at cogsci.soton.ac.uk harnad at cogsci.soton.ac.uk -------------------------------------------------------------------- Stevan Harnad harnad at cogsci.soton.ac.uk Professor of Psychology harnad at princeton.edu Director, phone: +44 1703 592582 Cognitive Sciences Centre fax: +44 1703 594597 Department of Psychology http://www.cogsci.soton.ac.uk/~harnad/ University of Southampton http://www.princeton.edu/~harnad/ Highfield, Southampton ftp://ftp.princeton.edu/pub/harnad/ SO17 1BJ UNITED KINGDOM ftp://cogsci.soton.ac.uk/pub/harnad/ From bressler at walt.ccs.fau.edu Thu Jun 25 19:48:57 1998 From: bressler at walt.ccs.fau.edu (Steven Bressler) Date: Thu, 25 Jun 1998 19:48:57 -0400 Subject: Postdoctoral Position in Computational Neuroscience Message-ID: <3.0.1.32.19980625194857.006fdf40@mail.ccs.fau.edu> COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION AVAILABLE Center for Complex Systems Florida Atlantic University A new postdoctoral position is open in the Center for Complex Systems at Florida Atlantic University to participate in a project in computational neuroscience. The aim of the project is to develop multivariate techniques for the analysis of cortical event-related potentials, and use the results from such analysis as the basis for computational modeling. The approach will emphasize the close interplay between state-of-the-art multivariate autoregressive analysis and the development of dynamical models of distributed information processing in the cerebral cortex. The research project will be conducted in close collaboration with S. Bressler, a cognitive neuroscientist and M. Ding, a computational modeler. The position is for two years, possibly renewable for another year. The desired starting date is September 1, 1998. Required background: -- Ph.D. degree -- Experience in C programming on UNIX systems and X11 Window programming -- Basic knowledge in dynamical systems, matrix algebra, signal processing, and statistics -- Research experience Desired background: -- Working knowledge in neurobiology and neural networks -- Knowledge in autoregressive time series modeling This project is funded by research grants from the National Science Foundation and the National Institute of Mental Health. Please send curriculum vitae, expression of interest, and the names and e-mail or phone numbers of three references to Steven Bressler at bressler at walt.ccs.fau.edu. Information about the Center for Complex Systems at Florida Atlantic University is available at http://www.ccs.fau.edu/ Steven L. Bressler, Ph.D. voice: 561-297-2322 Professor, Complex Systems & Brain Sciences fax: 561-297-3634 Center for Complex Systems Florida Atlantic University bressler at walt.ccs.fau.edu 777 Glades Road http://www.ccs.fau.edu/~bressler/ Boca Raton, FL 33431 U.S.A. 
From sontag at hilbert.rutgers.edu Fri Jun 26 00:51:58 1998 From: sontag at hilbert.rutgers.edu (Eduardo Sontag) Date: Fri, 26 Jun 1998 00:51:58 -0400 (EDT) Subject: book announcement for hybrid newsletter Message-ID: <199806260451.AAA14903@control.rutgers.edu> Contributed by: Eduardo Sontag (sontag at hilbert.rutgers.edu) Second Edition (revised and much extended) of Mathematical Control Theory Announcing a new book: Eduardo D. Sontag Mathematical Control Theory: Deterministic Finite Dimensional Systems ***Second Edition*** Springer-Verlag, New York, 1998, ISBN 0-387-984895 May be ordered from 1-800-Springer toll-free in the USA, or via email from: orders at springer-ny.com; or faxing +1.201.345.4505. This textbook introduces the core concepts and results of Control and System Theory. Unique in its emphasis on foundational aspects, it takes a "hybrid" approach in which basic results are derived for discrete and continuous time scales, and discrete and continuous state variables. Primarily geared towards mathematically advanced undergraduate or graduate students, it may also be suitable for a second engineering course in control which goes beyond the classical frequency domain and state-space material. The choice of topics, together with detailed end-of-chapter links to the bibliography, makes it an excellent research reference as well. The Second Edition constitutes a substantial revision and extension of the First Edition, mainly adding or expanding upon advanced material, including: Lie-algebraic accessibility theory, feedback linearization, controllability of neural networks, reachability under input constraints, topics in nonlinear feedback design (such as backstepping, damping, control-Lyapunov functions, and topological obstructions to stabilization), and introductions to the calculus of variations, the maximum principle, numerical optimal control, and linear time-optimal control. Also covered, as in the First Edition, are notions of systems and automata theory, and the algebraic theory of linear systems, including controllability, observability, feedback equivalence, and minimality; stability via Lyapunov, as well as input/output methods; linear-quadratic optimal control; observers and dynamic feedback; Kalman filtering via deterministic optimal observation; parametrization of stabilizing controllers, and facts about frequency domain such as the Nyquist criterion. From nic at idsia.ch Sat Jun 27 18:56:25 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Sun, 28 Jun 1998 00:56:25 +0200 Subject: revised TR on fast exponentiation Message-ID: <199806272256.AAA00626@idsia.ch> Dear colleagues, the following technical report has undergone extensive revision since it was first announced here. Among other things, the EXP macro itself has been modified (faster still), and its mean, maximum, and RMS relative approximation error are now derived analytically. With best regards, -- Dr. Nicol N. Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland --------------------------- cut here ---------------------------- Technical Report IDSIA-07-98: A Fast, Compact Approximation of the Exponential Function --------------------------------------------------------- Nicol N. Schraudolph Neural network simulations often spend a large proportion of their time computing exponential functions. 
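For readers who have not met this family of tricks, here is a minimal, self-contained C sketch of the general idea only: scale the argument, add an offset, and store the result as an integer into the upper 32 bits of an IEEE-754 double, so that the exponent field performs the exponentiation and the top mantissa bits provide a linear interpolation between powers of two. This is a hedged illustration under stated assumptions (32-bit int, little-endian doubles, and the error-balancing constant 60801 used in commonly circulated versions of the trick), not the macro from the report itself; consult the report for the exact macro, the choice of constants, and the error analysis.

/*
 * fast_exp_sketch.c -- illustrative approximation of exp(y) via IEEE-754
 * bit manipulation.  Assumptions: 32-bit int, little-endian doubles
 * (e.g. x86); on big-endian machines swap the i and j members below.
 * Accuracy is only a few percent; this is not a replacement for exp()
 * where precision matters.  Compile with:  cc fast_exp_sketch.c -lm
 */
#include <math.h>
#include <stdio.h>

static union {
    double d;
    struct { int j, i; } n;   /* n.i is the high 32-bit word on little-endian machines */
} fastexp_u;                  /* static storage, so n.j (low mantissa bits) starts at zero */

#define FASTEXP_A (1048576.0 / 0.6931471805599453)   /* 2^20 / ln 2 */
#define FASTEXP_C 60801                              /* assumed error-balancing constant */
/* High word = A*y + 1023*2^20 - C; the exponent field then holds floor(y/ln 2). */
#define FAST_EXP(y) (fastexp_u.n.i = (int)(FASTEXP_A * (y) + (1072693248 - FASTEXP_C)), fastexp_u.d)

int main(void)
{
    /* Compare the approximation with libm's exp() at a few sample points. */
    double xs[] = { -2.0, -0.5, 0.0, 0.5, 1.0, 2.0 };
    int k;
    for (k = 0; k < (int)(sizeof xs / sizeof xs[0]); ++k)
        printf("x = %5.2f   approx %10.6f   exp %10.6f\n", xs[k], FAST_EXP(xs[k]), exp(xs[k]));
    return 0;
}

The attraction is that each evaluation costs roughly a multiply, an add, and a store, in exchange for a relative error of a few percent, which can be acceptable inside neural network inner loops such as sigmoid evaluations.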
Since the exponentiation routines of typical math libraries are rather slow, their replacement with a fast approximation can greatly reduce the overall computation time. This note describes how exponentiation can be approximated by manipulating the components of a standard (IEEE-754) floating-point representation. This models the exponential function as well as a lookup table with linear interpolation, but is significantly faster and more compact. ftp://ftp.idsia.ch/pub/nic/exp.ps.gz (10 pages, 145 kB compressed) From adevries at sarnoff.com Tue Jun 30 11:09:31 1998 From: adevries at sarnoff.com (Bert De Vries) Date: Tue, 30 Jun 1998 11:09:31 -0400 Subject: Workshop Ann.: Neural Nets for Signal Proc. (Aug31-Sep2 '98) Message-ID: <3598FFAA.5E185768@sarnoff.com> We still have some openings for interested researchers to attend the 1998 IEEE Workshop on Neural Networks for Signal Processing, which this year will be held in beautiful Cambridge, UK, on August 31st to September 2nd 1998. We have a very interesting program, including many papers on blind signal processing, biomedical processing, speech, time series prediction etc. This note includes the preliminary program, registration information and a registration form. More information can be found at our website:- http://www.newton.cam.ac.uk/programs/nspw03.html http://www.newton.cam.ac.uk/programs/nsp.html Hope to see you in Cambridge! --Bert de Vries, Publicity chair NNSP98 =================================================================== IEEE Workshop on Neural Networks for Signal Processing NNSP 98 PRELIMINARY PROGRAMME Monday, 31 August 1998 9:00 Opening Remarks 9:15 Invited Talk Jose Principe, University of Florida 10:00 Oral Session I : Source Separation, Deconvolution & ICA 101 KuicNet Algorithms for Blind Deconvolution S. C. Douglas and S-Y. Kung 104 On the Stability of some Source Separation Algorithms J-F. Cardoso 110 Convolutive Blind Source Separation based on Multiple Decorrelation" L. Parra, C. Spence & B de Vries 10:45 Coffee Break 11:15 Oral Session I (Continued) 114 Independent Component Analysis: A flexible non-linearity and decorrelating manifold approach R. Everson & S.J. Roberts 115 Bayesian Blind Marginal Separation of Convolutely Mixed Discrete Sources C. Andrieu, A. Doucet, S. Godsill 116 Independent Component Analysis in Hybrid Mixture: Extrema Properties for Kurtosis and Higher Order Cumulant Function S-Y. Kung 12:00 Poster Previews I _________ 13:00 Lunch _________ 14:30 Oral Session II : Algorithms and Architectures 201 A General Probabilistic Formulation for Feedforward Neural Classifiers T. Adali, M. K. Sonmez & H. Ni 205 Learning from Examples with Mutual Information D. Xu & J C Principe 207 Experimental Evaluation of Latent Variable Models for Dimentionality Reduction M.A.Carrierra-Perpinian & S.J. Renals 208 From an A priori RNN to an A Posteriori PRNN Nonlinear Predictor D.P.Mandic & J Chambers ------------ 15:30 Coffee Break ------------ 16:00 Poster Session I (ICA, A&A) 103 Removing Electroencephalographic Artifacts: Comparison between ICA and PCA T-P Jung, C. Humphries, T-W Lee, M.J.McKeown, V. Iragui, S. Makeig & T. Sejnowski 105 A New Variable Step Algorithm for Blind Source Separation P.M. On & Y Hirai 107 Flexible Independent Component Analysis S. Choi, A. Cichocki & S. Amari 108 Blind Equalisation of Multichannels via Spatio-temporal Anti-Hebbian Learning Rule S. Choi, A. Cichocki & A. Amari 109 Asymmetric PCA Neural Networks for Adaptive Blind Source Separation K. 
I Diamantaras 111 The Effect of Signal Non-Stationarity on the Performanc of Information Maximisation Based Blind Separation M.J.T.Alphey, D.I. Laurensen & A.F. Murray 112 Blind Deconvolution / Equalization using State-Space Models L-Q Zhang & A. Cichocki 113 Two EM Algorithms for Blind Separation of Noisy Mixtures H. Attias 304 Online EM Algorithm and Reconstruction of Chaotic Dynamics S. Ishii & M. Sato Tuesday, 1 September 1998 9:00 Invited Talk Steve Young Cambridge University 10:00 Oral Session III Algorithms and Architectures 212 Adaptive Metric Kernel Regression C. Goutte and J. Larsen 215 Bayesian Filtering for Hidden Markov Models via Monte Carlo Methods A. Doucet, C. Andrieu, W.J. Fitzgerald 216 Clustering with Kernel Based Equiprobabilistic Topographic Maps M. van Hulle 10:45 Coffee Break 11:15 Oral Session III (Continued) 219 An Empirical Comparison of Arc-Cosine Distance, Generalised Fisher Ratio and Normalised Entropy Criteria for Model Selection S. Zheng & C. G. Molina 305 Stochastic Approximation by Neural Networks using the Radon and Wavelet Transforms R, Meir & V. Maiorov 701 Stochastic Unobserved Component Models for Adaptive Signal Extraction and Forecasting' P. Young 12:00 Poster Previews ___________ 13:00 Lunch ___________ 14:00 Poster Session II (A & A) 202 A Likelihood Framework for Nonlinear Signal Processing with Normal Mixtures B. Wang, T. Adali, X. Liu & J. Xuan 203 Nonlinear State Space Learning with EM and Neural Networks J. De Freitas, M. Niranjan & A.H. Gee 204 Volterra Signal Modelling using Lagrange Programming Neural Networks S. Chan, T. Stathaki & A. Constantinides 209 Split and Merge EM Algorithm for Improving Gaussian Mixture Density Estimates N. Ueda, R. Nakano, Z. Ghahramani & G.E. Hinton 210 Neural Network Regression with Input Uncertainty W. A. Wright 211 Speeding up MLP Execution by Approximating Neural Network Activation Functions Rosella Cancelliere 213 A Model for Non-Stationary Signal Processing with Clustering Methods S. Policker & A.B.Geva 214 A Reduced Size Lattice Ladder Neural Network D. Navakauskas 217 Designing the Optimal Structure of a Neural Filter K. Suzuki, I. Horiba, N. Sugie 218 From Data to Nonlinear Dynamics: A Hierarchical Bayes Approach to Neural Networks T. Matsumoto, Y. Nakajima, H. Hamagishi, J. Sugi & M. Saito 220 Recursive Nonlinear System Identification with Modular Networks V. Kadirkamanathan and S.G. Fabri 222 A heuristic Pattern Correction Scheme for GRNNs and its Application to Speech Recognition T. Hoya and A.G. Constantinides 306 Kohonen Networks and the Influence of Training on Data Structures I. Morlini _______________________ 15:30 Guided Tour of Colleges Punting in River Cam Conference Dinner _______________________ Wednesday, 2 September 1998 9:00 Invited Talk Josef Kittler Surrey University 10:00 Oral Session IV: Applications 506 Sound Monitoring based on the Generalised Probabilistic Descent Method H. Watanabe, Y. Matsumoto, S. Tanaka & S. Katagiri 511 Combining Neural Networks and Belief Networks for Image Segemtnation C. K. I. Williams and X. Feng 512 Analysing Time Series Structure with Hidden Markov Models M. Azzouzi & I.T. Nabney 10:45 Coffee Break 11:15 Oral Session IV (Continued) 515 Morphing Dynamical Sound Models A. Robel 603 Time Series Forecasting with Neural Networks Chris Chatfield 613 PCA/ICA Embeddings for Extracting Structure from Single Channel Wake EEG using Neural Networks D. 
Lowe 12:00 Poster Previews _____________ 13:00 Lunch _____________ 14:00 Invited Talk TBA 15:00 Oral Session V: Applications 611 Boundary Conditions of Pharyngeal Bolus Modeling by Neural Network Inversion E. Lin, J-N Hwang & MW Chang 602 Communication Channel Equalisation using Minimal Radial Basis Function Neural Networks P. Chandrakumar, P Saratchandran & N Sundararajan 614 Adaptive Medical Image Visualisation Based on Hierarchical Neural Networks and Intelligent Decision Fusion S-H. Lai & M Fang 605 Combining Histograms and Neural Networks in Static and Dynamical Systems Approach to Engine Condition Monitoring V. Kadirkamanathan and V.C. Patel 604 A Neural Network Extension of the Method of Analogues for Iterated Time Series Prediction N. Hazarika & D. Lowe 16:15 Poster Session (Applications) and Tea Break 501 Unconstrained Freehand-written Chinese Characters Recognition by Self-growing Probabilistic Decision-based Neural Networks H-C Fu, Y.Y.Xu & Y.P.Lee 502 Adaptive FIR filter use for signal Noise Cancelling M. Kolinova, A. Prochazka & M Mudrova 504 Face Classification using Principal Component Analysis and Multiresolution V. Brennan & J.C. Principe 505 A Comparison of a Hardware and a Software Integrate and Fire Neural Network for Clustering Onsets in Cohlear Filtered Sound L.S. Smith, M. Glover & A. Hamilton 507 Feature Extraction Techniques for Hindi Numerals H. Sanossian 508 Online Adaptive Histogram Equalisation D. Martinez 509 Weightless Neural Networks for Face Recognition: A Comparison S. Lauria & R.J. Mitchell 510 Exploiting the Statistical Characteristic of the Speech Signals for an Improved Neural Learning in a MLP Neural Network H. Altun and K.M. Curtis 513 Postprocessing for Image Coding Applications using Neural Network Visual Model Z. He, S. Chen, B. Luk & R. Istepanian 606 Structured Neural Network Approach for Measuring Raindrop Sizes and Velocities B. Denby, P. Gole & J. Tarniewicz 608 A Neural Network Architecture for the Classification of Remote Sensing Imagery with Advanced Learning Algorithms M.L. Goncalves, M. L. Netto, J.Z. Junior 612 A Framework for Combining Stochastic and Deterministic Descriptions of Nonstationary Financial Time Series R. H. Lesch & D. Lowe 18:15 Closing Remarks _____________ 18:30 End of Workshop _____________ ======================================================================= IEEE Neural Networks for Signal Processing Workshop (NNSP 98) Registration Information General: We have two types of delegates: Residential at Robinson College and Non residential. 120 Rooms have been booked at Robinson College for the purpose of the workshop, and it is expected that the majority of us will be residential there. The expected arrival is Sunday, 31 August and Departure is Thursday 3 rd September [i.e four nights]. It should be possible to accommodate those who might wish to arrive earlier / stay longer in Cambridge. The charges include accommodation, all meals between Sunday supper to breakfast on Thursday, and the Conference Dinner on Tuesday evening. Non-residential delegates are responsible for their own accommodation. Their registration includes Lunches on Monday, Tuesday & Wednesday and the Social Dinner on Tuesday. Registration Fees: ____________________________ EARLY REGISTRATION Early registration is encouraged, the cut off for this will be 10 July 1998. 
____________________________ RESIDENTIAL AT ROBINSON The cost of accommodation will be UKP 93.00 per night Registration for the workshop: Member IEEE, Early registration: UKP 140.00 Nonmember IEEE, Early : UKP 170.00 Member IEEE Late Registration : UKP 170.00 Nonmember IEEE, Late : UKP 200.00 Student, Early : UKP 100.00 Student, Late : UKP 130.00 ____________________________ NON-RESIDENTIAL AT ROBINSON You are responsible for your own accommodation Registration for the workshop: Member IEEE, Early registration: UKP 225.00 Nonmember IEEE, Early : UKP 260.00 Member IEEE Late Registration : UKP 270.00 Nonmember IEEE, Late : UKP 290.00 Student, Early : UKP 160.00 Student, Late : UKP 200.00 ____________________________ ACCOMPANYING PERSON We have currently reserved 10 rooms that can accommodate an accompanying partner at an additional cost of UKP 35.70 per night [Bed and Breakfast] ____________________________ METHOD OF PAYMENT: Payment may be made by Bank Draft or International Money Order, payable to Robinson College Enterprise Ltd. Please make sure that we do not incur banking charges at this end. It is also possible to pay by credit card, but only VISA, MasterCard or JCB are acceptable. Please fill the form below and mail to Elizbeth Perrett Conference Office Robinson College Grange Road Cambridge CB3 9AN England Phone: 44 1223 332859 FaX : 44 1223 315094 Email: conference at robinson.cam.ac.uk Registration Form: ------------------------ Cut Here ------------------------------------ IEEE Neural Networks for Signal Processing Workshop (NNSP 98) REGISTRATION FORM Title: Neural Networks for Signal Processing, NNSP 98 Taking place at: Robinson College & Newton Institute, Cambridge, UK Date: August 31 - September 2, 1998 Last Name: ...............................Title (Mr, Ms, Dr etc)....... First Name: ............................................................ Date of Birth: ..../..../......Nationality: ............................ Professional Status: ................................................... University/Company: .................................................... Address:................................................................. ........................................................................ City: ............................... Postcode: ........................ Tel: ................................ Fax: ............................. Email: ................................................................. PAYMENT DETAILS: Accommodation at Robinson College ..... Nights @ 93.00 / night ....... Accompanying person (additional 35.70 / night) Residential Registration Early Registration, Member IEEE ................ Early Registration, NonMember IEEE ................ Non Early Registration, Member IEEE ................ Non Early Registration, NonMember IEEE ................ Early Student ................ Non Early Student ................ Non Residential Registration Early Registration, Member IEEE ................ Early Registration, NonMember IEEE ................ Non Early Registration, Member IEEE ................ Non Early Registration, NonMember IEEE ................ Early Student ................ Non Early Student ................ Total ........ Method of Payment ....... Bank Draft enclosed [Payabel to Robinson College Enterprises Ltd] ....... Charge Credit Card [VISA / MasterCard / JCB only] Card Number ............................... Expiry Date ................. Print Name .......................... Signed: .................................. 
Date: ........................ ---End of Registration form ---------------------------------------------- From hali at theophys.kth.se Thu Jun 4 18:41:23 1998 From: hali at theophys.kth.se (Hans Liljenstrm) Date: Fri, 05 Jun 1998 00:41:23 +0200 Subject: 1998 Sigtuna Workshop on Fluctuations Message-ID: <35772293.841A08E6@theophys.kth.se> If you are interested in participating in this workshop, please fill in and return the pre-registration form below.
1998 Sigtuna Workshop on RANDOM EVENTS IN BIOLOGICAL SYSTEMS Pre-Registration Name:___________________________________________________________ Address:_________________________________________________________ __________________________________________________________ Student: Yes No Willing to contribute with a presentation: Yes No Presentation preference: Oral Poster Preliminary title/subject:____________________________________________ ________________________________________________________________ DEADLINE FOR SUBMISSIONS IS JULY 15, 1998 - please post - From dld at cs.monash.edu.au Fri Jun 5 01:47:24 1998 From: dld at cs.monash.edu.au (David L Dowe) Date: Fri, 5 Jun 1998 15:47:24 +1000 Subject: CFPs: Info theory in biology, due July 13 Message-ID: <199806050547.PAA06363@dec11.cs.monash.edu.au> Information-theoretic approaches to biology ------------------------------------------- This is the Call For Papers for the 4th Pacific Symposium on BioComputing (PSB99, 1999) conference track on "Information-theoretic approaches to biology". PSB-99 will be held from 4-9 January, 1999, in Mauni Lani on the Big Island of Hawaii. Track Organisers: David L. Dowe (dld at cs.monash.edu.au) and Klaus Prank. WWW site: http://www.cs.monash.edu.au/~dld/PSB99/PSB99.Info.CFPs.html . Specific technical area to be covered by this track: Approaches to biological problems using notions of information or complexity, including methods such as Algorithmic Probability, Minimum Message Length and Minimum Description Length. Two possible applications are (e.g.) protein folding and biological information processing. Kolmogorov (1965) and Chaitin (1966) studied the notions of complexity and randomness, with Solomonoff (1964), Wallace (1968) and Rissanen (1978) applying these to problems of statistical and inferential learning (and ``data mining'') and to prediction. The methods of Solomonoff, Wallace and Rissanen have respectively come to be known as Algorithmic Probability (ALP), Minimum Message Length (MML) and Minimum Description Length (MDL). All of these methods relate to information theory, and can also be thought of in terms of Shannon's information theory, and can also be thought of in terms of Boltzmann's thermo-dynamic entropy. An MDL/MML perspective has been suggested by a number of authors in the context of approximating unknown functions with some parametric approximation scheme (such as a neural network). The designated measure to optimize under this scheme combines an estimate of the cost of misfit with an estimate of the cost of describing the parametric approximation (Akaike 1973, Rissanen 1978, Barron and Barron 1988, Wallace and Boulton, 1968). This track invites all original papers of a biological nature which use notions of information and/or information-theoretic complexity, with no strong preference as to what specific nature. Such work has been done in problems of, e.g., protein folding and DNA string alignment. As we shortly describe in some detail, such work has also been done in the analysis of temporal dynamics in biology such as neural spike trains and endocrine (hormonal) time series analysis using the MDL principle in the context of neural networks and context-free grammar complexity. To elaborate on one of the relevant topics above, in the last three years or so, there has been a major focus on the aspect of timing in biological information processing ranging from fields such as neuroscience to endocrinology. 
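To make two of the notions in this call concrete, the sketch below adds up a two-part message length (cost of stating the hypothesis plus cost of the data encoded under it) and computes the negative-log score for probabilistic predictions that is encouraged further on in this announcement. The function names are illustrative only and are not taken from any particular MML/MDL implementation.

  #include <math.h>
  #include <stddef.h>

  /* Second part of a two-part message: encoding the data under hypothesis H
   * costs sum_i -log2 P(x_i | H) bits.  The same per-event quantity is the
   * logarithmic score for a probabilistic prediction. */
  double data_code_length_bits(const double *prob_given_H, size_t n)
  {
      double bits = 0.0;
      for (size_t i = 0; i < n; ++i)
          bits += -log2(prob_given_H[i]);
      return bits;
  }

  /* Total two-part message length: cost of stating H plus cost of the data
   * given H; MML/MDL-style inference prefers the hypothesis minimizing it. */
  double message_length_bits(double hypothesis_bits,
                             const double *prob_given_H, size_t n)
  {
      return hypothesis_bits + data_code_length_bits(prob_given_H, n);
  }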
The latest work on information processing at the single-cell level using computational as well as experimental approaches reveals previously unimagined complexity and dynamism. Timing in biological information processing on the single-cell level as well as on the systems level has been studied by signal-processing and information-theoretic approaches, in particular in the field of neuroscience (see Rieke et al. 1996 for an overview). Using such approaches to the understanding of temporal complexity in biological information transfer, the maximum information rates and the precision of spike timing could be revealed by computational methods (Mainen and Sejnowski, 1995; Gabbiani and Koch 1996; Gabbiani et al., 1996). The examples given above illustrate some possible biological application domains. We invite and solicit papers in all areas of (computational) biology which make use of ALP, MDL, MML and/or other notions of information and information-theoretic complexity. In problems of prediction, as well as using "yes"/"no" predictions, we would encourage the authors to consider also using probabilistic prediction, where the score assigned to a probabilistic prediction is given according to the negative logarithm of the stated probability of the event. Further comments re PSB-99 : ---------------------------- PSB99 will publish accepted full papers in an archival Proceedings. All contributed papers will be rigorously peer-reviewed by at least three referees. Each accepted full paper will be allocated up to 12 pages in the conference Proceedings. The best papers will be selected for a 30-minute oral presentation to the full assembled conference. Accepted poster abstracts will be distributed at the conference separately from the archival Proceedings. To be eligible for proceedings publication, each full paper must be accompanied by a cover letter stating that it contains original unpublished results not currently under consideration elsewhere. See http://www.cgl.ucsf.edu/psb/cfp.html for more information. IMPORTANT DATES: Full paper submissions due: July 13, 1998 Poster abstracts due: August 22, 1998 Notification of paper acceptance: September 22, 1998 Camera-ready copy due: October 1, 1998 Conference: January 4 - 9, 1999 More information about the "Information-theoretic approaches to biology" track, including a sample list of relevant papers, is available on the WWW at http://www.cs.monash.edu.au/~dld/PSB99/PSB99.Info.CFPs.html . More information about PSB99 is available from http://www.cgl.ucsf.edu/psb/cfp.html For further information, e-mail Dr. David Dowe, dld at cs.monash.edu.au or e-mail Dr. Klaus Prank, ndxdpran at rrzn-serv.de . This page was put together by Dr. David Dowe, School of Computer Science and Softw. Eng., Monash University, Clayton, Vic. 3168, Australia e-mail: dld at cs.monash.edu.au Fax: +61 3 9905-5146 http://www.csse.monash.edu.au/~dld/ and Dr. Klaus Prank, Abteilung Klinische Endokrinologie Medizinische Hochschule Hannover Carl-Neuberg-Str.
1 D-30623 Hannover Germany e-mail: ndxdpran at rrzn-serv.de Tel.: +49 (511) 532-3827 Fax.: +49 (511) 532-3825 http://sun1.rrzn-user.uni-hannover.de/~ndxdpran/ From harnad at coglit.soton.ac.uk Fri Jun 5 14:37:49 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Fri, 5 Jun 1998 19:37:49 +0100 (BST) Subject: Pylyshyn on Vision & Cognition: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article on: IS VISION CONTINUOUS WITH COGNITION? THE CASE FOR COGNITIVE IMPENETRABILITY OF VISUAL PERCEPTION by Zenon Pylyshyn This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL to: bbs at cogsci.soton.ac.uk or write to: Behavioral and Brain Sciences Department of Psychology University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. ____________________________________________________________________ IS VISION CONTINUOUS WITH COGNITION? THE CASE FOR COGNITIVE IMPENETRABILITY OF VISUAL PERCEPTION Zenon Pylyshyn Rutgers Center for Cognitive Science Rutgers University Psychology Addition, Busch Campus, New Brunswick, NJ 08903 zenon at ruccs.rutgers.edu KEYWORDS: visual processing, modularity, cognitive penatrability, early vision context effects, top down processes, signal detection theory, attention expert perception, perceptual learning, knowledge-based vision, visual agnosia, categorical perception. ABSTRACT: Although the study of visual perception has made more progress in the past 40 years than any other area of cognitive science, there remain major disagreements as to how closely vision is tied to cognition. This paper sets out some of the arguments for both sides (arguments from computer vision, neuroscience, Psychophysics, perceptual learning and other areas of vision science) and defends the position that an important part of visual perception, corresponding to what some people have called early vision, is prohibited from accessing relevant expectations, knowledge and utilities in determining the function it computes - in other words it is cognitively impenetrable. That part of vision is complex and involves top-down interactions that are internal to the early vision system. 
Its function is to provide a structured representation of the 3-D surfaces of objects sufficient to serve as an index into memory, with somewhat different outputs being made available to other systems such as those dealing with motor control. The paper also addresses certain conceptual and methodological issues raised by this claim, including the use of signal detection theory and event-related potentials to assess cognitive penetration of vision. A distinction is made among several stages in visual processing. These include, in addition to the inflexible early-vision stage, a pre-perceptual attention-allocation stage and a post-perceptual evaluation, selection, and inference stage which accesses long-term memory. These two stages provide the primary ways in which cognition can affect the outcome of visual perception. The paper discusses arguments that have been presented in both computer vision and psychology showing that vision is "intelligent" and involves elements of "problem solving". It is suggested that the cases of apparently intelligent interpretation that are sometimes cited in support of this claim do not show cognitive penetration, but rather they show that certain natural constraints on interpretation, concerned primarily with optical and geometrical properties of the world, have been compiled into the visual system. The paper also examines a number of examples where instructions and "hints" are alleged to affect what is seen. In each case it is concluded that the evidence is more readily assimilated to the view that when cognitive effects are found, they have a locus outside early vision, in such processes as the allocation of focal attention and identification of the stimulus. -------------------------------------------------------------- To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp or gopher from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.pylyshyn.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.pylyshyn ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.pylyshyn gopher://gopher.princeton.edu:70/11/.libraries/.pujournals To retrieve a file by ftp from an Internet site, type either: ftp ftp.princeton.edu or ftp 128.112.128.1 When you are asked for your login, type: anonymous Enter password as queried (your password is your actual userid: yourlogin at yourhost.whatever.whatever - be sure to include the "@") cd /pub/harnad/BBS To show the available files, type: ls Next, retrieve the file you want with (for example): get bbs.pylyshyn When you have the file(s) you want, type: quit From terry at salk.edu Mon Jun 8 18:27:36 1998 From: terry at salk.edu (Terry Sejnowski) Date: Mon, 8 Jun 1998 15:27:36 -0700 (PDT) Subject: NEURAL COMPUTATION 10:5 Message-ID: <199806082227.PAA05690@helmholtz.salk.edu> Neural Computation - Contents Volume 10, Number 5 - July 1, 1998 ARTICLE Dynamics of Membrane Excitability Determine Inter-Spike Interval Variability: A Link Between Spike Generation Mechanisms and Cortical Spike Train Statistics Boris S. Gutkin and G. 
Bard Ermentrout NOTE Correction To Proof That Recurrent Neural Networks Can Robustly Recognize Only Regular Languages Michael Casey LETTERS On the Effect of Analog Noise in Discrete-Time Analog Computations Wolfgang Maass and Pekka Orponen Category Learning Through Multi-Modality Sensing Virginia R. de Sa and Dana H. Ballard A Hierarchical Model of Binocular Rivalry Peter Dayan Efficient Learning in Boltzmann Machines Using Linear Response Theory H. J. Kappen and F. B. Rodriguez A Learning Theorem for Networks at Detailed Stochastic Equilibrium Javier R. Movellan Asymmetric Dynamics in Optimal Variance Adaptation Michael DeWeese and Anthony Zador Computation with Infinite Neural Networks Christopher K. I. Williams Bayesian Radial Basis Functions of Variable Dimension C. C. Holmes and B. K. Mallick Absence of Cycles in Symmetric Neural Networks Xin Wang, Arun Jagota, Fernanda Botelho, and Max Garzon Pattern Generation by Two Coupled Time-Discrete Neural Networks with Synaptic Depression W. Senn, Th. Wannier, J. Kleinle, H.-R. Luscher, L. Muller, J. Streit, and K. Wyler Computational Studies of Lateralization of Phoneme Sequence Generation James A. Reggia, Sharon Goodall, and Yuri Shkuro Nonlinear Component Analysis as a Kernel Eigenvalue Problem Bernhard Scholkopf, Alexander Smola, and Klaus-Robert Muller ----- ABSTRACTS - http://mitpress.mit.edu/NECO/ SUBSCRIPTIONS - 1998 - VOLUME 10 - 8 ISSUES USA Canada* Other Countries Student/Retired $50 $53.50 $78 Individual $82 $87.74 $110 Institution $285 $304.95 $318 * includes 7% GST (Back issues from Volumes 1-9 are regularly available for $28 each to institutions and $14 each for individuals. Add $5 for postage per issue outside USA and Canada. Add +7% GST for Canada.) MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From S.Holden at cs.ucl.ac.uk Tue Jun 9 09:42:27 1998 From: S.Holden at cs.ucl.ac.uk (Sean Holden) Date: Tue, 09 Jun 1998 14:42:27 +0100 Subject: Special issue on generalization Message-ID: <1057.897399747@cs.ucl.ac.uk> Readers of this mailing list may be interested to know that the March 1998 issue of the journal STATISTICS AND COMPUTING, published by Chapman & Hall, is a special issue on the subject of "generalization". Full details can be found at http://statsandcomp.thomsonscience.com and the list of contents follows. Best wishes, Sean Holden. Guest Editor.
M. Anthony, "Probabilistic 'generalization' of functions and dimension-based uniform convergence results"
D. J. C. MacKay and R. Takeuchi, "Interpolation models with multiple hyperparameters"
R. Tibshirani and G. Hinton, "Coaching variables for regression and classification"
D. Wolpert, E. Knill and T. Grossman, "Some results concerning off-training-set and IID error for the Gibbs and Bayes Optimal Generalizers"
C. W. H. Mace and A. C. C. Coolen, "Statistical mechanical analysis of the dynamics of learning in perceptrons"
From Dave_Touretzky at skinner.boltz.cs.cmu.edu Mon Jun 8 14:18:46 1998 From: Dave_Touretzky at skinner.boltz.cs.cmu.edu (Dave Touretzky) Date: Mon, 08 Jun 1998 14:18:46 -0400 Subject: test posting Message-ID: <16336.897329926@skinner.boltz.cs.cmu.edu> ------- Blind-Carbon-Copy From Dave_Touretzky at cs.cmu.edu Mon Jun 8 14:18:46 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Mon, 08 Jun 1998 14:18:46 -0400 Subject: test posting Message-ID: <16336.897329926@skinner.boltz.cs.cmu.edu> The Connectionists list has been experiencing some technical problems in the past week. Please excuse this test posting. - -- Dave Touretzky, CONNECTIONISTS moderator ------- End of Blind-Carbon-Copy From tp at ai.mit.edu Thu Jun 4 11:27:57 1998 From: tp at ai.mit.edu (Tomaso Poggio) Date: Thu, 04 Jun 1998 11:27:57 -0400 Subject: Computational Position Message-ID: <3.0.5.32.19980604112757.00c57ea0@ai.mit.edu> MASSACHUSETTS INSTITUTE OF TECHNOLOGY DEPARTMENT OF BRAIN SCIENCES The MIT Department of Brain Sciences anticipates making another tenure-track appointment in computational brain and cognitive science at the Assistant Professor level. Candidates should have a strong mathematical background and an active research interest in the mathematical modeling of specific biophysical, neural or cognitive phenomena. Individuals whose research focuses on learning and memory at the level of neurons and networks of neurons are especially encouraged to apply. Responsibilities include graduate and undergraduate teaching and research supervision. Applications should include a brief cover letter stating the candidate's research and teaching interests, a vita, three letters of recommendation and representative reprints. Qualified individuals should send their dossiers by October 21, 1998 to: Chair, Faculty Search Committee/Computational Neuroscience Department of Brain & Cognitive Sciences, E25-406 MIT 77 Massachusetts Avenue Cambridge, MA 02139-4307 Previous applicants need not resubmit their dossiers. MIT is an Affirmative Action/Equal Opportunity Employer. Qualified women and minority candidates are encouraged to apply. Tomaso Poggio Uncas and Helen Whitaker Professor Brain Sciences Department and A.I. Lab M.I.T., E25-218, 45 Carleton St Cambridge, MA 02142 E-mail: tp at ai.mit.edu Web: Phone: 617-253-5230 Fax: 617-253-2964 From singer at research.att.com Mon Jun 8 17:31:39 1998 From: singer at research.att.com (Yoram Singer) Date: Mon, 8 Jun 1998 17:31:39 -0400 (EDT) Subject: new and improved family of boosting algorithms Message-ID: <199806082131.RAA10700@allegro.research.att.com> The following papers introduce, analyze, and describe applications of a new and improved family of boosting algorithms. The papers are available from: http://www.research.att.com/~schapire/boost.html and http://www.research.att.com/~singer/pub.html Questions and comments are welcome. - Rob Schapire and Yoram Singer {schapire,singer}@research.att.com ----------------------------------------------------------------------------- Improved Boosting Algorithms Using Confidence-rated Predictions Robert E. Schapire and Yoram Singer We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions.
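To make the confidence-rated setting concrete: each weak hypothesis outputs a signed real value whose sign is the predicted label and whose magnitude is its confidence, and the boosting distribution over training examples is updated multiplicatively each round. The sketch below is a generic rendering of that update, not the authors' code.

  #include <math.h>
  #include <stddef.h>

  /* One boosting round with a confidence-rated weak hypothesis:
   * h[i] is the real-valued prediction for example i (sign = predicted
   * label, magnitude = confidence), y[i] is the true label in {-1,+1},
   * w[] is the distribution over examples, updated in place. */
  void update_weights(double *w, const int *y, const double *h, size_t n)
  {
      double z = 0.0;                         /* normalizer Z_t */
      for (size_t i = 0; i < n; ++i) {
          w[i] *= exp(-(double)y[i] * h[i]);  /* correct & confident -> smaller weight */
          z += w[i];
      }
      for (size_t i = 0; i < n; ++i)
          w[i] /= z;
  }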
We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour. We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two boosting methods for this problem. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested by Freund and Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper. ----------------------------------------------------------------------------- BoosTexter: A System for Multiclass Multi-label Text Categorization Robert E. Schapire and Yoram Singer This work focuses on algorithms which learn from examples to perform multiclass text and speech categorization tasks. We first show how to extend the standard notion of classification by allowing each instance to be associated with multiple labels. We then discuss our approach for multiclass multi-label text categorization which is based on a new and improved family of boosting algorithms. We describe in detail an implementation, called BoosTexter, of the new boosting algorithms for text categorization tasks. We present results comparing the performance of BoosTexter and a number of other text-categorization algorithms on a variety of tasks. We conclude by describing the application of our system to automatic call-type identification from unconstrained spoken customer responses. ----------------------------------------------------------------------------- An Efficient Boosting Algorithm for Combining Preferences Yoav Freund, Raj Iyer, Robert E. Schapire, Yoram Singer The problem of combining preferences arises in several applications, such as combining the results of different search engines. This work describes an efficient algorithm for combining multiple preferences. We first give a formal framework for the problem. We then describe and analyze a new boosting algorithm for combining preferences called RankBoost. We also describe an efficient implementation of the algorithm for a restricted case. We discuss two experiments we carried out to assess the performance of RankBoost. In the first experiment, we used the algorithm to combine different WWW search strategies, each of which is a query expansion for a given domain. For this task, we compare the performance of RankBoost to the individual search strategies. The second experiment is a collaborative-filtering task for making movie recommendations. Here, we present results comparing RankBoost to nearest-neighbor and regression algorithms. From giro at open.brain.riken.go.jp Mon Jun 8 01:42:19 1998 From: giro at open.brain.riken.go.jp (Dr. 
Mark Girolami) Date: Mon, 08 Jun 1998 14:42:19 +0900 Subject: PhD Research Studentships Message-ID: <357B79BB.FF6@open.brain.riken.go.jp> UNIVERSITY OF PAISLEY DEPARTMENT OF COMPUTING AND INFORMATION SYSTEMS PhD Studentships 'Applying Artificial Neural Networks in Non-invasive Direct Depth-Of-Anaesthesia Monitoring' Applications are invited for PhD research studentships to participate in a three year funded project on applying artificial neural networks and advanced signal processing techniques to non-invasive direct depth-of-anaesthesia monitoring. This project is being carried out in collaboration with the Department of Anaesthesia, Glasgow Western Infirmary. Suitable candidates will also have the opportunity of carrying out periods of research in collaborating laboratories based in Japan and the USA. Despite much research, directly monitoring the depth-of-anaesthesia has not yet found a place in routine anaesthetic practice. Two technologies form the basis of current monitors: auditory evoked potentials (AEPs) and bispectral index (BIS). AEPs involve the analysis of electrical signals produced by the auditory cortex in response to a pattern of clicks through a pair of headphones. BIS is a method which employs higher order statistics in processing brain EEG data which results in a single figure measure of depth of anaesthesia. Recording EEG and AEP data are now established and reliable techniques. The extraction of anaesthesia related data from the resultant stream of information holds the key to further advances in directly quantifying depth-of-anaesthesia. The project aims to develop and assess the use of artificial neural network (ANN) techniques in processing and analysing EEG and AEP data with the specific aim of improving the sensitivity and specificity of direct depth-of-anaesthesia monitoring. Applicants should have at least an upper second class degree in one of these disciplines: electronic engineering, mathematics, physics or computer science. Knowledge of signal processing or neural networks would be desirable. Applications in the form of a CV and names and addresses of three referees should be sent, as soon as possible and at the latest by 30th July 1998, to Dr. Mark Girolami, Computational Intelligence Research Unit, Department of Computing and Information Systems, University of Paisley, High Street, Paisley, PA1 2BE, Scotland, UK. Informal inquiries can be made direct to Dr. Mark Girolami giro at open.brain.riken.go.jp Or giro0ci at paisley.ac.uk -- ---------------------------------------------- Dr. 
Mark Girolami (TM) RIKEN, Brain Science Institute Laboratory for Open Information Systems 2-1 Hirosawa, Wako-shi, Saitama 351-01, Japan Email: giro at open.brain.riken.go.jp Tel: +81 48 467 9666 Tel: +81 48 462 3769 (apartment) Fax: +81 48 467 9694 --------------------------------------------- Currently on Secondment From: Department of Computing and Information Systems University of Paisley High Street, PA1 2BE Scotland, UK Email: giro0ci at paisley.ac.uk Tel: +44 141 848 3963 Fax: +44 141 848 3542 Secretary: Mrs E Campbell Tel: +44 141 848 3966 --------------------------------------------- From mac+ at andrew.cmu.edu Mon Jun 1 13:57:35 1998 From: mac+ at andrew.cmu.edu (Mary Anne Cowden) Date: Mon, 1 Jun 1998 13:57:35 -0400 (EDT) Subject: Carnegie Symposium on Mechanisms of Cognitive Development, Oct 9-11, 1998 Message-ID: =============================================================== CALL FOR PARTICIPATION The 29th Carnegie Symposium on Cognition Mechanisms of Cognitive Development: Behavioral and Neural Perspectives October 9 - 11, 1998 James L. McClelland and Robert S. Siegler, Organizers ---------------------------------------------------------------------------- The 29th Carnegie Symposium on Cognition is sponsored by the Department of Psychology and the Center for the Neural Basis of Cognition. The symposium is supported by the National Science Foundation, the National Institute of Mental Heatlh, and the National Institute of Child Health and Human Development. ---------------------------------------------------------------------------- This post contains the following entries relevant to the symposium: * Overview * Schedule of Events * Attending the Symposium * Travel Fellowships ---------------------------------------------------------------------------- Overview This symposium will consider how children's thinking evolves during development, with a focus on the role of experience in causing change. Speakers will examine the processes by which children learn and those that make children ready and able to learn at particular points in development, using both behavioral and neural approaches. Behavioral approaches will include research on the 'microgenesis' of cognitive change over short time periods (e.g., several hour-long sessions) in specific task situations. Research on cognitive change over longer time scales (months and years) will also be presented, as will research that uses computational modeling and dynamical systems approaches to understand learning and development. Neural approaches will include the study of how neuronal activity and connectivity change during acquisition of cognitive skills in children and adults. Other studies will consider the possible emergence of cognitive abilities through the maturation of brain structures and the effects of experience on the organization of functions in the brain. Developmental anomalies such as autism and attention deficit disorder will also be examined, as windows on normal development. Four questions will be examined throughout the symposium: 1) Why do cognitive abilities emerge when they do during development? 2) What are the sources of developmental and individual differences, and of developmental anomalies in learning? 3) What happens in the brain when people learn? 4) How can experiences be ordered and timed so as to optimize learning? The answers to these questions have strong implications for how we educate children and remediate deficits that impede development of thinking abilities. 
These implications will be explored in discussions among the participants. ---------------------------------------------------------------------------- The 29th Carnegie Symposium on Cognition: Schedule ---------------------------------------------------------------------------- Friday, October 9th: Studies of the Microgenesis of Cognitive Development 8:30 - 9:00 Continental Breakfast 9:00 Welcome BEHAVIORAL APPROACHES 9:20 Susan Goldin-Meadow, University of Chicago Giving the mind a hand: The role of gesture in cognitive change 10:20 Break 10:40 Robert Siegler, Carnegie Mellon University Microgenetic studies of learning in children and in brain-damaged adults 11:40 Lunch NEUROSCIENCE APPROACHES 1:00 Michael Merzenich, University of California, San Francisco Cortical plasticity phenomenology and mechanisms: Implications for neurorehabilitation 2:00 James L. McClelland, Carnegie Mellon University/CNBC Revisiting the critical period: Interventions that enhance adaptation to non-native phonological contrasts in Japanese adults 3:00 Break 3:20 Richard Haier, University of California, Irvine PET studies of learning and individual differences 4:20 Discussant: James Stigler, UCLA Saturday, October 10th: Studies of Change Over Long Time Scales 8:30 - 9:00 Continental Breakfast BEHAVIORAL APPROACHES 9:00 Esther Thelen, Indiana University Dynamic mechanisms of change in early perceptual motor development 10:00 Robbie Case, University of Toronto Differentiation and integration as the mechanisms in cognitive and neurological development 11:00 Break 11:20 Deanna Kuhn, Teacher's College, Columbia University Why development does (and doesn't) occur: Evidence from the domain of inductive reasoning 12:20 Lunch NEUROSCIENCE APPROACHES 2:00 Mark Johnson, Birkbeck College/University College London Cortical specialization for cognitive functions 3:00 Helen Neville, University of Oregon Specificity and plasticity in human brain development 4:00 Break 4:20 Discussant: David Klahr, Carnegie Mellon University Sunday, October 11th: Developmental Disorders 8:30 - 9:00 Continental Breakfast DYSLEXIA 9:00 Albert Galaburda, Harvard Medical School Toxicity of neural plasticity as seen through a model of learning disability AUTISM 10:00 Patricia Carpenter, Marcel Just, Carnegie Mellon University Cognitive load distribution in normal and autistic individuals 11:00 Break ATTENTION DEFICIT DISORDER 11:20 B. J. Casey, University of Pittsburgh Medical Center Disruption and inhibitory control in developmental disorders: A mechanistic model of implicated frontostriatal circuitry 12:20 Concluding discussant: Michael I. Posner, University of Oregon ---------------------------------------------------------------------------- Attending the Symposium Sessions on Friday, October 9 will be held in McConomy Auditorium, University Center, Carnegie Mellon. Sessions on Saturday, October 10 and Sunday, October 11 will be held in the Adamson Wing, Room 135 Baker Hall. Admission is free, and everyone is welcome to attend. Out of town visitors can contact Mary Anne Cowden, (412) 268-3151, mac+ at cmu.edu, for additional information. Travel Fellowships Fellowships are available for junior scientists for travel and lodging expenses associated with attending the symposium. Interested applicants should send a brief statement of interest, a curriculum vitae, and one letter of recommendation by August 15, 1998 to Mary Anne Cowden, Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213. 
--------------------------------------------------------------------------- This material is based on the symposium web-page: http://www.cnbc.cmu.edu/carnegie-symposium ---------------------------------------------------------------------------- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Mary Anne Cowden, Administrative Coord. Psychology Dept, Carnegie Mellon University Phone: 412/268-3151 Fax: 412/268-3464 http://www.contrib.andrew.cmu.edu/~mac/ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ From mjjs at eng.cam.ac.uk Tue Jun 2 07:22:37 1998 From: mjjs at eng.cam.ac.uk (M.J.J. Scott) Date: Tue, 02 Jun 1998 12:22:37 +0100 Subject: Technical report available Message-ID: <3573E07D.796@eng.cam.ac.uk> The following technical report is available by anonymous ftp from the archive of the Speech, Vision and Robotics Group at the Cambridge University Engineering Department. The authors would welcome comments on this report. Parcel: feature subset selection in variable cost domains M.J.J. Scott, M. Niranjan, R.W. Prager. Technical Report CUED/F-INFENG/TR.323 Cambridge University Engineering Department Trumpington Street Cambridge CB2 1PZ England Abstract The vast majority of classification systems are designed with a single set of features, and optimised to a single specified cost. However, in examples such as medical and financial risk modelling, costs are known to vary subsequent to system design. In this paper, we present a design method for feature selection in the presence of varying costs. Starting from the Wilcoxon nonparametric statistic for the performance of a classification system, we introduce a concept called the maximum realisable receiver operating characteristic (MRROC), and prove a related theorem. A novel criterion for feature selection, based on the area under the MRROC curve, is then introduced. This leads to a framework which we call Parcel. This has the flexibility to use different combinations of features at different operating points on the resulting MRROC curve. Empirical support for each stage in our approach is provided by experiments on real world problems, with Parcel achieving superior results. ************************ How to obtain a copy ************************ a) http://svr-www.eng.cam.ac.uk/reports/abstracts/Scott_tr323.html b) Via FTP: unix> ftp svr-ftp.eng.cam.ac.uk Name: anonymous Password: (type your email address) ftp> cd reports ftp> binary ftp> get Scott_tr323.ps.gz ftp> quit unix> gunzip Scott_tr323.ps.gz unix> lpr Scott_tr323.ps (or however you print PostScript) c) Via postal mail: Request a hardcopy from Martin J.J. Scott, Cambridge University Engineering Department, Trumpington Street, Cambridge CB2 1PZ, England. or email me: mjjs at eng.cam.ac.uk -- Martin JJ Scott Fallside Lab, Engineering Dept, Trumpington St., Cambridge CB2 1PZ, +(44 1223) 332754 http://svr-www.eng.cam.ac.uk/~mjjs/Personal.html "We have heard the chimes at midnight ..." 
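For readers unfamiliar with the starting point of this abstract: the Wilcoxon (Mann-Whitney) statistic of a two-class scoring classifier equals the area under its ROC curve, i.e. the fraction of (positive, negative) pairs that the classifier ranks correctly. A minimal sketch follows, written for clarity rather than speed and not taken from the report.

  #include <stddef.h>

  /* Area under the ROC curve as the Wilcoxon-Mann-Whitney statistic:
   * the proportion of (positive, negative) score pairs in which the
   * positive example receives the higher score (ties count one half). */
  double roc_area(const double *pos, size_t npos,
                  const double *neg, size_t nneg)
  {
      double wins = 0.0;
      for (size_t i = 0; i < npos; ++i)
          for (size_t j = 0; j < nneg; ++j) {
              if (pos[i] > neg[j])       wins += 1.0;
              else if (pos[i] == neg[j]) wins += 0.5;
          }
      return wins / ((double)npos * (double)nneg);
  }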
From derrabi at fin.ucl.ac.be Tue Jun 2 09:40:16 1998 From: derrabi at fin.ucl.ac.be (Derrabi Mohamed) Date: Tue, 02 Jun 1998 15:40:16 +0200 Subject: ACSEG - Reminder Message-ID: <1.5.4.32.19980602134016.006a8398@doyens1.iag.ucl.ac.be> CONNECTIONIST APPROACHES IN ECONOMICS AND MANAGEMENT SCIENCES FIFTH INTERNATIONAL MEETING COMPLEX DATA : MODELING AND ANALYSIS LOUVAIN-LA-NEUVE, NOVEMBER 20, 1998 CALL FOR PAPERS - REMINDER ---------------------------------------------------------------------------- ------ Since the beginning of the 80s, important advances have been made in developing diverse new approaches of bio-mimetic inspiration (neural nets, genetic algorithms, cellular automata, ...). These approaches are of prime interest for researchers both in Economics and in Management Sciences. The ACSEG International Meetings give the opportunity to assess the state-of-the-art in the domain, to delineate future developments, and to evidence the contribution of bio-mimetic methods to Economics and Management Sciences. They also allow the researchers to present their recent work, to exchange know-how, and to discuss the problems encountered in their research. The 1998 ACSEG International Meeting on COMPLEX DATA : MODELING AND ANALYSIS will take place at the Universite catholique de Louvain, November 20, 1998. Organizers are the research centers SAMOS (Universite de Paris 1- Pantheon - Sorbonne), CEGF (Universite catholique de Louvain) and CeReFim (Facultes Universitaires Notre-Dame de la Paix). The members of the scientific committee invite you to submit papers in Economics and Management Sciences on the following topics: - simulation of complex processes (non-linear, non-parametric, ...) - new approaches for data analysis - local and global optimization - forecasting (financial series, bankruptcies, consumer behavior, ...) - behavioral modeling - hybrid approaches associating new and classical approaches - numerical evaluation methods (prices of financial assets, ...) If interested, check the conference page http://mkb.fin.ucl.ac.be/Acseg98 or write to: ACSEG98, Centre d'Etudes en Gestion Financière, Institut d'Administration et de Gestion, Universite catholique de Louvain, 1 place des Doyens, 1348 Louvain-la-Neuve - Belgium (Fax. : + (32).10.47.83.24) for additional information. Submission Date : Before June 30, 1998 **************************************************************************** ******* Mohamed DERRABI UCL- Institut d'Administration et de Gestion Unite Finance d'entreprise 1, Place des doyens B-1348 LLN Tel: 010 / 47 84 36 Fax: 010 / 47 83 24 **************************************************************************** ******* From xli at sckcen.be Tue Jun 9 05:32:50 1998 From: xli at sckcen.be (Xiaozhong Li) Date: Tue, 9 Jun 1998 11:32:50 +0200 Subject: Papers related to FLINS are available Message-ID: <2.2.16.19980609113045.0c5f5cf2@mail.sckcen.be> The following papers (1997-1998) related to FLINS are available from the following site: http://www.sckcen.be/people/xli/ Directory: Publications in English Tip: The files are in PostScript format, compressed by WinZip. Comments are welcome. My regards. Xiaozhong Li Xiaozhong Li, Da Ruan Novel Neural Algorithms Based on Fuzzy $\delta$ Rules for Solving Fuzzy Relation Equations: Part I Fuzzy Sets and Systems 90 (1997) 11-23. ABSTRACT Although there are some papers on using neural networks to solve fuzzy relation equations, they have some widespread problems.
For example, the best learning rate cannot be decided easily, and rigorous theoretical analyses of the convergence of the algorithms are not given, due to the complexity of the systems involved. To overcome these problems, we present some novel neural algorithms in this paper. We first describe such algorithms for max-min operator networks, then we demonstrate that these algorithms can also be extended to the max-times operator network. Important results include some improved fuzzy $\delta$ rules, a convergence theorem, and an equivalence theorem which shows that fuzzy theory and neural networks can reach the same goal by different routes. The fuzzy bidirectional associative memory network and its training algorithms are also discussed. All important theorems are proved, and a simulation and a comparison with the result of Blanco and Pedrycz are reported. Xiaozhong Li, Da Ruan Fuzzy $\delta$ Rule and Its Simulations in Fuzzy Relation Equations Int. J. of Fuzzy Mathematics. Accepted. ABSTRACT After a short review of our previous work, in this paper we present a new simplified proof of a lemma which plays an important role in proving the convergence theorem of the fuzzy perceptron. The new proof is much shorter. Moreover, we give some typical simulation results to illustrate the power of the fuzzy $\delta$ rule. Xiaozhong Li, Da Ruan Novel Neural Algorithms Based on Fuzzy $\delta$ Rules for Solving Fuzzy Relation Equations: Part II Fuzzy Sets and Systems, Accepted. ABSTRACT In this paper, we first design a fuzzy neuron which possesses some generality. This fuzzy neuron is constructed by replacing the operators of the traditional neuron with a pair of abstract fuzzy operators ($\widehat+$, $\widehat\bullet$), which we call fuzzy neuron operators. For example, the pair may be $(+, \bullet)$, $(\bigwedge,\bullet)$, $(\bigvee,\bullet)$, or $(\bigwedge,\bigwedge)$, etc. It is an extended fuzzy neuron, and a network composed of such neurons is an extended fuzzy neural network. We then discuss the relationship between the fuzzy neuron operators and $t$-norms and $t$-conorms, and point out that fuzzy neuron operators are based on $t$-norms but are much wider than $t$-norms. In this paper we focus on a two-layered network and its training algorithm, which must accommodate a variety of such operators. This work is closely related to solving fuzzy relation equations, so it can be used for that purpose. Furthermore, the new fuzzy neural algorithm is found to be stronger than other existing methods to some degree. Some simulation results are reported in detail. Xiaozhong Li, Da Ruan Novel Neural Algorithms Based on Fuzzy $\delta$ Rules for Solving Fuzzy Relation Equations: Part III Fuzzy Sets and Systems. Accepted. ABSTRACT In our previous work, we proposed a max-min operator network and a series of training algorithms, called fuzzy $\delta$ rules, which can be used to solve fuzzy relation equations. The most basic and important result is the convergence theorem of the fuzzy perceptron based on max-min operators. This convergence theorem was extended to the max-times operator network in the previous paper. In this paper, we further extend the fuzzy $\delta$ rule and its convergence theorem to the case of the max-* operator network, in which * is a t-norm. An equivalence theorem points out that the neural algorithm for solving this kind of fuzzy relation equation is equivalent to the fuzzy solving method (non-neural) in \cite{Nol:848,Got:946}. The proof and simulation will be given.
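For readers unfamiliar with the notation in the abstracts above: a fuzzy relation equation has the form A o R = B, where o is, for example, max-min composition. The Python sketch below is only a generic illustration of max-min and max-times composition, not the authors' fuzzy $\delta$ rules or their training algorithms; the function name and all numerical values are invented.

# Generic fuzzy composition b_j = max_i t_norm(a_i, r_ij); made-up example values.
def compose(a, R, t_norm=min):
    """Compose fuzzy input vector a with relation matrix R using max-(t_norm)."""
    n_out = len(R[0])
    return [max(t_norm(a[i], R[i][j]) for i in range(len(a))) for j in range(n_out)]

a = [0.2, 0.9, 0.5]          # fuzzy input (membership degrees)
R = [[0.3, 0.7],
     [0.8, 0.4],
     [0.6, 0.9]]             # fuzzy relation, |a| x |b|

print(compose(a, R))                            # max-min:   [0.8, 0.5]
print(compose(a, R, t_norm=lambda x, y: x * y)) # max-times: [0.72, 0.45]

The neural algorithms of Parts I-III train a network so that the composed output matches a target B, using the fuzzy $\delta$ rules whose convergence the papers analyse.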
Xiaozhong Li, Da Ruan, Arien J. Van der Wal Discussions on Soft Computing at FLINS'96 International Journal of Intelligent Systems, Vol. 13, Nos. 2/3, Feb./Mar. 1998, pp. 287-300. ABSTRACT This is a report on the discussion about soft computing (SC) during FLINS'96. The discussion is based on the 5 questions formulated by X. Li, viz. (1) What is SC? (2) What are the characteristics of SC? (3) What are the principal achievements of SC? (4) What are the typical problems of SC and what are the solutions? and (5) What is the prediction of SC for the future? Before and during FLINS'96, these 5 questions were sent to several well-known specialists for a reply. Among them, Martin Wildberger, Bart Kosko, Bo Yuan, Hideyuki Takagi, Takehisa Onisawa, Germano Resconi, Zhong Zhang and Yasushi Nishiwaki answered these questions with their opinions. By this report we hope to stimulate further discussion on this topic. Xiaozhong Li, Da Ruan Constructing A Fuzzy Logic Control Demo Model at SCK•CEN Proceedings of the 5th European Congress on Intelligent Techniques and Soft Computing (EUFIT'97), Aachen, Germany, September 8-11, 1997. Vol. 2, pp. 1408-1412. ABSTRACT Based on the background of fuzzy logic control applications in nuclear reactors at SCK•CEN, we have built a real fuzzy logic control demo model. The demo model is suitable for us to test and compare our new algorithms of fuzzy control, because it is always difficult and risky to do all experiments in a real nuclear environment. This paper mainly reports the construction of the demo model and its fuzzy logic control system. Although this demo model is specially designed to simulate the working principle of a nuclear reactor, it can also be used as a general plant for control experiments. It is much better than an inverted pendulum system, which is often used as a test platform, at imitating the delay of a real complex system. The current fuzzy logic control algorithm in this demo model is a standard algorithm based on the Mamdani model. In our system, triangular membership functions are used. In order to overcome the well-known dilemma of fast response versus no overshoot, some parameters, for instance the fuzzy control rules and the universes of discourse, must be adjusted. Finally, we have fulfilled this goal; however, it is not easy to choose suitable parameters. This is the real drawback which has slowed down the wide application of fuzzy logic control. Therefore new, effective algorithms must be further researched, and it is possible to combine other intelligent technologies, such as the learning of neural networks and the evolution of genetic algorithms, although much work has already been done. Da Ruan, Xiaozhong Li Fuzzy Logic Control Applications to Belgian Nuclear Reactor 1 (BR1) Computers and Artificial Intelligence, Accepted. ABSTRACT Fuzzy logic applications in the nuclear industry present a tremendous challenge. The main reason for this is the public awareness of the risks of the nuclear industry and the very strict safety regulations in force for nuclear power plants. These same regulations prevent a researcher from quickly introducing novel fuzzy-logic methods into this field. On the other hand, the application of fuzzy logic has, despite the ominous sound of the word "fuzzy" to nuclear engineers, a number of very desirable advantages over classical methods, e.g., its robustness and the capability to include human experience in the controller.
In this paper we report an on-going R&D project for controlling the power level of the Belgian Nuclear Reactor 1 (BR1) at the Belgian Nuclear Research Centre (SCK•CEN). The project started in 1995 and aims to investigate the added value of fuzzy logic control for nuclear reactors. We first review some relevant literature on fuzzy logic control in nuclear reactors, then present the state of the art of the BR1 project. After experimenting with fuzzy logic control in off-line test cases at the BR1 reactor, we now foresee a new development towards closed-loop fuzzy control in on-line operation of the BR1 reactor. Finally, we present this new development for closed-loop fuzzy logic control at BR1 with an understanding of the safety requirements for this real fuzzy logic control application in nuclear reactors. Xiaozhong Li, Da Ruan Comparative Study of Fuzzy Control, PID Control, and Advanced Fuzzy Control for Simulating a Nuclear Reactor Operation Intelligent Systems and Soft Computing for Nuclear Science and Industry, Proceedings of the 3rd International FLINS Workshop, Mol, Belgium, September 14-16, 1998, Eds. Da Ruan, Pierre D'hondt et al., World Scientific Publisher. ABSTRACT Based on the background of fuzzy control applications at the BR1 reactor at SCK•CEN, we have built a real fuzzy logic control demo model. The demo model is suitable for us to test and compare any new algorithms of fuzzy control and intelligent systems, because it is always difficult and time-consuming, due to safety aspects, to do all experiments in a real nuclear environment. In this paper, we first briefly report the construction of the demo model, and then introduce the results of a fuzzy control, a PID control, and an advanced fuzzy control, where the advanced fuzzy control is a fuzzy controller with an adaptive function which can self-regulate the fuzzy control rules. Afterwards, we give a comparative study of those three methods. The results show that fuzzy control has advantages in terms of flexibility, robustness, and ease of updating with respect to PID control of the demo model, but PID control has much higher regulation resolution due to its integration term. The adaptive fuzzy control can dynamically adjust the rule base, and is therefore more robust and better suited to highly uncertain situations. _____________________________________________________________________ * Xiaozhong Li.
PhD, Currently Young Scientific Researcher * * Belgian Nuclear Research Centre (SCK.CEN) *----------* * Boeretang 200, B-2400 Mol, Belgium | _L_ * * phone: (+32-14) 33 22 30(O); (+32-14) 32 25 52(H) | /\X/\ * * fax: (+32-14) 32 15 29 | \/Z\/ * * e-mail:xli at sckcen.be http://www.sckcen.be/people/xli | / \ @ * *________________________________________________________*----------* From delapaz at dia.uned.es Wed Jun 10 05:02:28 1998 From: delapaz at dia.uned.es (Felix de la Paz Lopez) Date: Wed, 10 Jun 1998 11:02:28 +0200 Subject: IWANN'99 Message-ID: <004d01bd944e$7ff0f660$8df092c1@pc-felix.dia.uned.es> (sorry if you have received this message previously) Call for papers 5TH INTERNATIONAL WORK-CONFERENCE ON ARTIFICIAL AND NATURAL NEURAL NETWORKS Biological and Artificial Computation: Methodologies, Neural Modeling and Bioinspired Applications IWANN'99 Alicante, Spain June 2-4, 1999 http://iwann99.umh.es/ Organized by: Asociación Española de Redes Neuronales (AERN) Universidad Nacional de Educacion a Distancia (UNED) Instituto de Bioingenieria, Universidad Miguel Hernandez (UMH) IN COOPERATION WITH Universidad de Granada Universidad de Malaga Universitat Politecnica de Catalunya Universidad de Las Palmas de Gran Canaria AND IFIP (Working Group in Neural Computer Systems, WG10.6) Spanish RIG IEEE Neural Networks Council UK&RI Communication Chapter of IEEE SCOPE Under the basic idea that living beings and machines can be understood using the same experimental methodology and the same theoretical and formal tools, the interdisciplinary team of the IWANN'99 program committee recognizes the following global goals: I. Developments in Foundations and Methodology. II. From Artificial to Natural: How can systems theory, electronics and computation (including AI) help in understanding the nervous system? As a science of analysis, neural computation seeks to help neurology, brain theory, and cognitive psychology in understanding the functioning of the nervous system by means of computational models of neurons, neural nets and subcellular processes. III. From Natural to Artificial: How can the understanding of the nervous system help in obtaining bio-inspired models of artificial neurons, evolutionary architectures, and learning algorithms of value in computation and engineering? As engineering, neural computation seeks to complement the symbolic perspective of Artificial Intelligence (AI), using these biologically inspired models of neurons and nets to solve those non-algorithmic problems of function approximation and pattern classification having to do with changing and only partially known environments. IV. Bio-inspired Technology and Engineering Applications: How can we obtain bio-inspired formulations for sensory coding, perception, memory, decision making, planning, and control? The essential aim of this perspective is to reduce the distance between the biological and artificial perspectives of neural computation. Contributions on the following and related topics are welcome. TOPICS 1. Foundations of Computational Neuroscience: Brain Organization Principles: Communication, control and oscillations, cooperativity, self-organization, and evolution. Convergence between theory and experiments. 2. Neural Modeling: Biophysical and Structural Models: Ionic channels, synaptic level, neurons, circuits and system level.
Functional Models: Analogue, digital, probabilistic, Bayesian, fuzzy and object-oriented formulations. Energy-related models. Hybrid techniques. 3. Plasticity Phenomena (Maturing, Learning and Memory): Biological mechanisms at the molecular, cellular, network, and behavioural levels. Computable models of adaptation and plasticity. Supervised and non-supervised algorithms. Inductive, deductive and hybrid symbolic-subsymbolic formulations. 4. Complex Systems Dynamics: Optimization, self-organization, cooperative processes, fault-tolerance and self-repair. Genetic algorithms. Simulated evolution. Social organization processes and large-scale neural models, non-linear dynamics in biological systems. 5. Artificial Intelligence and Cognitive Neuroscience: Knowledge modeling. Ontologies. Generic tasks of analysis, modification and synthesis. Libraries of problem-solving methods and reusable components. Concept formation. Natural language understanding and linguistics. Intentionality and consciousness in autonomous agents. 6. Artificial Neural Nets Simulation, Implementation, and Evaluation: Development environments, formal frames, and simulation languages. Neural model editing tools. Advances in ANN implementation. Evolving hardware. Validation and evaluation criteria. Acceptability and explanatory capacity. 7. Methodology for Net Design: Data analysis, task identification and recursive hierarchical design in specific domains. Hybrid solutions to hybrid problems. 8. Bio-inspired Systems and Engineering: Signal processing, cochlear systems, auditory processing, retinomorphic systems, other sensory processing systems, neuromorphic communication, neuromorphic learning, neural prosthetic devices. 9. Other Applications: Artificial vision, speech recognition, multisensorial integration, spatio-temporal planning and scheduling, strategies of sensory-motor coordination. Applications of ANNs in vision, real time, control, robotics, economics, industry and medicine. IMPORTANT DATES Second and final call for papers: September 1998 *** Final date for submission: January 15, 1999 *** Acceptance notification: February 15, 1999 Formalization of inscription: March 1, 1999 Contributions must be sent by surface mail to: Prof. Jose Mira-Mira Dpto. Inteligencia Artificial - UNED Senda del Rey s/n. E-28040 MADRID, Spain. Additional Information: http://iwann99.umh.es/ Phone: +34 91-398-7155 FAX: +34 91-398-6697 e-mail: iwann99 at dia.uned.es PAPER SUBMISSION The Programme Committee requests original papers on the topics mentioned above. Authors are invited to submit five copies of their papers, written in English, of up to 10 pages, including figures, tables and references. The format should be A4 or 8 1/2 x 11 inch paper, in a Roman font, 12 point in size, with a printing area of 15.3 x 24.2 cm2 (6.0 x 9.5 sq. inches). If possible, please make use of the LaTeX/plain TeX style available on our WWW site. In addition, one sheet must be attached including: title, authors' names, a list of five keywords, the topic under which the paper fits best, the preferred presentation (oral or poster) and the corresponding author's information (name, postal and e-mail address, phone and fax number). All received papers will be reviewed by the Programme Committee. Accepted papers may be presented orally or as poster panels; however, all accepted contributions will be published at full length (Springer-Verlag Proceedings are expected, as usual).
-------------------------------------------------------------------------- From ngoddard at psc.edu Wed Jun 10 22:45:21 1998 From: ngoddard at psc.edu (Nigel Goddard) Date: Wed, 10 Jun 98 22:45:21 -0400 Subject: Position in Computational Neural Science Message-ID: <17039.897533121@pscuxc.psc.edu> This position is responsible for developing a nationally recognized research program in computational neural science. The following research areas of computational neuroscience are included: neural modeling, neural nets and machine learning and functional/structural MRI. The position will determine the priorities and direction for this effort based on the overall goals and objectives of the biomedical applications group and the Pittsburgh Supercomputing Center (PSC). This position will be responsible for writing research grants for continued funding, writing annual reports, presenting research results at national and international conferences, developing application-specific or research workshops, and publishing the results of this work in peer-reviewed publications. QUALIFICATIONS: Ph.D. in a computer science, mathematics or related discipline or equivalent combination of training and experience; five or more years of experience; proficient in C++ or C; proficient with message passing libraries; and ability to communicate effectively required. Experience performing computational neural science research on high performance computers preferred. This is a summary statement of the responsibilities and qualifications for this position. AA/EEO EMPLOYER To apply, send resume and cover letter to: David W. Deerfield, Biomedical Applications Manager Pittsburgh Supercomputing Center 4400 Fifth Avenue Pittsburgh, PA 15213 email: deerfield at psc.edu From wkistler at physik.tu-muenchen.de Thu Jun 11 04:37:40 1998 From: wkistler at physik.tu-muenchen.de (Werner Kistler) Date: Thu, 11 Jun 1998 10:37:40 +0200 Subject: Paper available: Modelling Collective Excitations in Cortical Tissue Message-ID: <357F9754.B2770228@physik.tu-muenchen.de> The following paper is available on my web page: http://www.physik.tu-muenchen.de/~wkistler/kistler98a.html ------------------------------------------------------------------ W. M. Kistler, R. Seitz, and J. L. van Hemmen. Modelling Collective Excitations in Cortical Tissue. Physica D, 114(3/4): 273-295, 1998. Abstract: We study a two-dimensional system of spiking neurons with local interactions depending on distance. The interactions between the neurons decrease as the distance between them increases and can be either excitatory or inhibitory. Depending on the mix of excitation and inhibition, this kind of system exhibits a rich repertoire of collective excitations such as traveling waves, expanding rings, and rotating spirals. We present a continuum approximation that allows an analytic treatment of plane waves and circular rings. We calculate the dispersion relation for plane waves and perform a linear stability analysis. Only waves that have a speed of propagation below a certain critical velocity, are stable. For target patterns, we derive an integro-differential equation that describes the evolution of a circular excitation. Its asymptotic behavior is handled exactly. We illustrate the analytic results by parallel-computer simulations of a network of 10^6 neurons. In so doing, we exhibit a novel type of local excitation, a so-called `paternoster'. ------------------------------------------------------------------ Werner Kistler Phone: +49(89)289.12193 Dipl.-Phys. 
Fax: +49(89)289.12296 email: wkistler at physik.tu-muenchen.de WWW: http://www.physik.tu-muenchen.de/~wkistler Institut f"ur Theoretische Physik Physik-Department der Technischen Universit"at M"unchen James-Franck-Strasse D-85748 Garching bei M"unchen Germany ------------------------------------------------------------------ From leila at ida.his.se Thu Jun 11 05:36:43 1998 From: leila at ida.his.se (Leila Khammari) Date: Thu, 11 Jun 1998 11:36:43 +0200 Subject: ICANN 98 Message-ID: <357FA52B.980A47C0@ida.his.se> CALL FOR PARTICIPATION: 8th INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS (ICANN 98) September 1-4, 1998, Skoevde, Sweden (Tutorials Sept 1, Conference Sept 2-4) ==================================== PRELIMINARY PROGRAM AND REGISTRATION FORMS NOW AVAILABLE AT: http://www.his.se/ida/icann98/ ==================================== ==================================== INVITED TALKS: ==================================== Diagrammatic Representation and Reasoning in a Connectionist Framework John Barnden, University of Birmingham, UK Variational Learning in Graphical Models and Neural Networks Chris Bishop, Microsoft Research, Cambridge, UK Learning To Be Social Rodney Brooks, MIT, Cambridge, USA Synchronization: The Computational Currency of Cognition Leif Finkel, University of Pennsylvania, USA Applications of Vapnik's theory for prediction Francoise Fogelman Soulie, Atos, France Title pending David Hansel, CNRS, France Brains, Gases and Robots Phil Husbands, University of Sussex, UK Self-Organization of Very Large Document Collections: State of the Art Teuvo Kohonen, Helsinki Univ. of Technology, Finland Gaussian Processes- a replacement for supervised neural networks? David MacKay, Cavendish Laboratory, Cambridge, UK Title pending Barak Pearlmutter, University of New Mexico, USA The Silicon Way to Artificial Neural Networks Ulrich Rueckert, Universitaet Paderborn, Germany Title pending David Rumelhart, Stanford University, USA ==================================== SCOPE: ==================================== ICANN 98 covers all aspects of ANN research, broadly divided into six areas, corresponding to separate organizational modules. - THEORY - APPLICATIONS - COMPUTATIONAL NEUROSCIENCE AND BRAIN THEORY - CONNECTIONIST COGNITIVE SCIENCE AND AI - AUTONOMOUS ROBOTICS AND ADAPTIVE BEHAVIOR - HARDWARE/IMPLEMENTATION Out of 340 submissions, 180 papers have been accepted for presentation. 65 of these will be presented orally in 3 parallel tracks, and 115 will be presented in two separate poster sessions. In addition to the modules mentioned above we aim to further promote contacts between researchers and industry. 
To achieve this, a special session on INDUSTRY AND RESEARCH is organized, featuring the following talks: Neural Computation at Siemens: Challenges in Applications and Research, Bernd Schuermann, Siemens, Germany Toward Real World Intelligence: R&D in the Real World Computing Program, Nobuyuki Otsu, ETL, Japan Industry - researchers interface: What is important for effective co-operation?, Timo Salo, Helsinki University, Finland Industrial perspective on ANN-research, Tony Larsson, Ericsson, Sweden Funding programs in Europe, Karl-Einar Sj?din, NUTEK, Sweden Technology transfer from European academia to industry Trevor Clarkson, King's College London, NEuroNet, Great Britain ==================================== TUTORIALS, September, 1 ==================================== Spiking Neurons Wulfram Gerstner, EPFL, Lausanne Realistic Modeling of Neurons and Networks using GENESIS Erik De Schutter, University of Antwerp The Self-Organizing Map Teuvo Kohonen, HUT, Helsinki Combining Artificial Neural Networks Amanda Sharkey, University of Sheffield The Working Brain: Brain Imaging and its Implications John Taylor, King's College, London Analogic Cellular Computing based on Cellular Neural Networks Tamas Roska, Computer and Automation Institute, Budapest Evolutionary Robotics Stefano Nolfi, National Research Council, Rome Dario Floreano, EPFL, Lausanne ==================================== REGISTRATION: ==================================== Early registration deadline: July 14th. Registration Fees: Before July 14 After July 14 Regular ENNS-member 3000 SEK 3500 SEK Regular non ENNS-member 3500 SEK 4000 SEK Student 2000 SEK 2500 SEK Tutorial day 500 SEK 500 SEK ==================================== For more detailed information, and registration forms please use www.his.se/ida/icann98, or get in contact with the ICANN 98 secretariat: Address: ICANN 98 Hoegskolan i Skoevde P.O. Box 408 541 28 Skoevde, SWEDEN Email: icann98 at ida.his.se Fax: +46 (0)500 46 47 25 ==================================== From c.k.i.williams at aston.ac.uk Thu Jun 11 06:48:59 1998 From: c.k.i.williams at aston.ac.uk (Chris Williams) Date: Thu, 11 Jun 1998 11:48:59 +0100 Subject: Postdoc position at the University of Edinburgh Message-ID: <2284.9806111048@sun.aston.ac.uk> [Apologies for cross-posting. Please note that I shall be moving to the Department of Arificial Intelligence, Univeristy of Edinburgh on 1 July 1998] ---------------------------------------------------------------------- Research Associate: Probabilistic Models for Sequences Department of Artificial Intelligence, University of Edinburgh A vacancy exists for a research associate on the RA1A scale (point 6), to work on a 3-year EPSRC funded research project entitled "Probabilistic Models for Sequences". The aim of the project is to investigate belief network models for sequences, with a particular focus on image sequences. This will entail the development and evaluation of approximation schemes for multiple-cause/hierarchical belief network models. These methods will be applied to problems such as road-scene interpretation and medical image analysis. This post is fixed-term, funded until 30 September 2001. Candidates should have strong mathematical and computational skills, preferably with a background in belief networks or more generally in probabilistic modelling. Starting salary will be point 6 on the RA1A scale, 16,927 pounds per annum. The successful applicant will work with a PhD student and a MSc student who are also funded through the project. 
Futher information about the project can be obtained from http://www.dai.ed.ac.uk/daidb/byhand/ckiw/, and/or by contacting Dr. Chris Williams (ckiw at dai.ed.ac.uk) telephone: 0121 359 3621 ext 4382 (international +44 121 359 3621 ext 4382) (with voicemail). Further particulars including the application procedure should be obtained from The Personnel Office, 1 Roxburgh Street, Edinburgh EH8 9TB Scotland or telephone: 0131 650 2511 (24 hour answering service). Please quote reference 896421. Closing date for receipt of applications is 3 July 1998. From pollack at cs.brandeis.edu Thu Jun 11 14:27:38 1998 From: pollack at cs.brandeis.edu (jordan pollack) Date: Thu, 11 Jun 1998 14:27:38 -0400 Subject: NCS survey paper References: <199805290056.RAA18938@arapaho.cse.ucsc.edu> Message-ID: <3580219A.37D5@cs.brandeis.edu> my history survey was written 10 years ago, and may be of interest to some to see whats changed since then: Pollack, J. B. (1989). Connectionism: Past, Present, and Future. Artificial Intelligence Review, 3, 3-20. Research efforts to study computation and cognitive modeling on neurally-inspired mechanisms have come to be called Connectionism. ..This paper surveys the history of the field, often in relation to AI, discusses its current successes and failures, and makes some predictions for where it might lead in the future. http://www.demo.cs.brandeis.edu/papers/long.html#nnhistory -- Professor Jordan B. Pollack DEMO Laboratory, Volen Center for Complex Systems Computer Science Dept, MS018 Phone (781) 736-2713/Lab x3366/Fax x2741 Brandeis University website: http://www.demo.cs.brandeis.edu Waltham, MA 02254 email: pollack at cs.brandeis.edu From terry at salk.edu Thu Jun 11 15:41:40 1998 From: terry at salk.edu (Terry Sejnowski) Date: Thu, 11 Jun 1998 12:41:40 -0700 (PDT) Subject: NIPS Volume 10 Message-ID: <199806111941.MAA26296@helmholtz.salk.edu> The abstracts for NIPS*97 are available online at: http://mitpress.mit.edu/cognet/abstracts/NIPS10/ The full papers are available in: Advances in Neural Information Processing Systems 10 Michael I. Jordan, Michael J. Kearns, and Sara A. Solla (eds.) Cambridge, MA: MIT Press (1998) This volume has been mailed to all registered participants. Terry ----- From segevr at post.tau.ac.il Fri Jun 12 05:32:39 1998 From: segevr at post.tau.ac.il (Ronen Segev) Date: Fri, 12 Jun 1998 12:32:39 +0300 (IDT) Subject: Paper: From Neurons to Brain: Adaptive Self-Wiring of Neurons Message-ID: Dear Connectionist, The following paper will be published at Journal of Complex Systems, vol 1, (1998). Hard copies can be obtained by sending an email to: segevr at post.tau.ac.il An electronic version can be found at: http://xxx.lanl.gov/find/cond-mat/1/segev/0/1/0/past/3/0 Your comments are welcome! Ronen Segev, email: segevr at post.tau.ac.il, School of Physics & Astronomy, Tel Aviv university. Title: From Neurons to Brain: Adaptive Self-Wiring of Neurons Authors: Ronen Segev , Eshel Ben-Jacob Comments: Latex, 12 pages, 9 gif figures. Report-no: S2 Subj-class: Neural Networks and Disorderd Systems. Journal-ref: J. Comp. Sys. 1 (1998) During embryonic morpho-genesis, a collection of individual neurons turns into a functioning network with unique capabilities. Only recently has this most staggering example of emergent process in the natural world, began to be studied. Here we propose a navigational strategy for neurites growth cones, based on sophisticated chemical signaling. 
We further propose that the embryonic environment (the neurons and the glia cells) acts as an excitable media in which concentric and spiral chemical waves are formed. Together with the navigation strategy, the chemical waves provide a mechanism for communication, regulation, and control required for the adaptive self-wiring of neurons. From stefan.wermter at sunderland.ac.uk Fri Jun 12 13:57:04 1998 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Fri, 12 Jun 1998 18:57:04 +0100 Subject: Job: Neural and Intelligent Systems Message-ID: <35816BF0.53A4D1E4@sunderland.ac.uk> Please post, thank you very much, Stefan Wermter --------------------------- Research Assistant in Neural and Intelligent Systems (reference number CIRG28) Applications are invited for a three year research assistant position in the School of Computing and Information Systems investigating the development of hybrid neural/symbolic techniques for intelligent processing. This is an exciting new project which aims at developing new environments for integrating neural networks and symbolic processing. You will play a key role in the development of such hybrid subsymbolic/symbolic environments. It is intended to apply the developed hybrid environments in areas such as natural language processing, intelligent information extraction, or the integration of speech/language in multimedia applications. You should have a degree in a computing discipline and will be able to register for a higher degree. A demonstrated interest in artificial neural networks, software engineering skills and programming experience are essential (preferably including a subset of C, C++, CommonLisp, Java, GUI). Experience and interest in neural network software and simulators would be an advantage (e.g. Planet, SNNS, Tlearn, Matlab, etc). Salary is according to the researcher A scale (currently up to 13,871, under revision). Application forms and further particulars are available from the Personnel department under +44 191 515 and extensions 2055, 2429, 2054, 2046, or 2425 or E-Mail employee.recruitment at sunderland.ac.uk quoting the reference number CIRG28. For informal inquiries please contact Professor Stefan Wermter, e-mail: Stefan.Wermter at sunderland.ac.uk. Closing date: 10 July 1998. The successful candidate is expected to start the job as soon as possible. ******************************************** Professor Stefan Wermter University of Sunderland Dept. of Computing & Information Systems St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://osiris.sunderland.ac.uk/~cs0stw/ ******************************************** From gary at cs.ucsd.edu Fri Jun 12 20:22:21 1998 From: gary at cs.ucsd.edu (Gary Cottrell) Date: Fri, 12 Jun 1998 17:22:21 -0700 (PDT) Subject: Recent publications from GURU Message-ID: <199806130022.RAA02659@gremlin.ucsd.edu> Hello all, Below are titles of six recent papers from Gary's Unbelievable Research Unit (GURU). Five of these will appear in the 1998 Proceedings of the Cognitive Science Society. One will appear in the Proceedings of Special Interest Group on Information Retrieval. All are available from my home page: http://www-cse.ucsd.edu/users/gary/ Abstracts are appended to the end of this message. 
Cheers, gary Gary Cottrell 619-534-6640 FAX: 619-534-7029 Faculty Assistant Joy Gorback: 619-534-5948 Computer Science and Engineering 0114 IF USING FED EX INCLUDE THE FOLLOWING LINE: "Only connect" 3101 Applied Physics and Math Building University of California San Diego -E.M. Forster La Jolla, Ca. 92093-0114 Email: gary at cs.ucsd.edu or gcottrell at ucsd.edu Anderson, Karen, Milostan, Jeanne C. and Cottrell, Garrison W. (1998) Assessing the contribution of representation to results. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Clouse, Daniel S. and Cottrell, Garrison W. (1998) Regulari- ties in a Random Mapping from Orthography to Semantics. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Dailey, Matthew N., Cottrell, Garrison W. and Busey, Thomas A. (1998) Eigenfaces for familiarity. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Laakso, Aarre and Cottrell, Garrison W. (1998) How can I know what You think?: Assessing representational similarity in neural systems. In Proceedings of the Twentieth Annual Cognitive Science Conference, Madison, WI, Mahwah: Lawrence Erlbaum. Padgett, Curtis and Cottrell, Garrison W. (1998) A simple neural network models categorical perception of facial expressions. In Proceedings of the Twentieth Annual Cogni- tive Science Conference, Madison, WI, Mahwah: Lawrence Erl- baum. Vogt, Christopher C. and Cottrell, Garrison W. (1998) Predicting the performance of linearly combined IR systems. In Proceedings of Special Interest Group on Information Retrieval. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Abstracts XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Assessing the Contribution of Representation to Results Karen Anderson kanders at cs.ucsd.edu Jeanne Milostan jmilosta at cs.ucsd.edu Garrison W. Cottrell gary at cs.ucsd.edu Computer Science and Engineering Department 0114 Institute for Neural Computation University of California San Diego La Jolla, CA 92093-0114 In this paper, we make a methodological point concerning the contribution of the representation of the output of a neural network model when using the model to compare to human error performance. We replicate part of Dell, Juliano \& Govindjee's work on modeling speech errors using recurrent networks (Dell et al. 1993). We find that 1) the error patterns reported by Dell et al. do not appear to remain when more networks are used; and 2) some components of the error patterns that are found can be accounted for by simply adding Gaussian noise to the output representation they used. We suggest that when modeling error behavior, the technique of adding noise to the output representation of a network should be used as a control to assess to what degree errors may be attributed to the underlying network. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Regularities in a Random Mapping from Orthography to Semantics Daniel S. Clouse and Garrison W. Cottrell Computer Science & Engineering 0114 University of California, San Diego La Jolla, CA 92093 {dclouse,gary}@cs.ucsd.edu In this paper we investigate representational and methodological issues in a attractor network model of the mapping from orthography to semantics based on (Plaut, 1995). 
We find that, contrary to psycholinguistic studies, the response time to concrete words (represented by more 1 bits in the output pattern) is slower than for abstract words. This model also predicts that response times to words in a dense semantic neighborhood will be faster than words which have few semantically similar neighbors in the language. This is conceptually consistent with the neighborhood effect seen in the mapping from orthography to phonology (Seidenberg & McClelland, 1989; Plaut et al. 1996) in that patterns with many neighbors are faster in both pathways, but since there is no regularity in the random mapping used here, it is clear that the cause of this effect is different than that of previous experiments. We also report a rather distressing finding. Reaction time in this model is measured by the time it takes the network to settle after being presented with a new input. When the criterion used to determine when the network is ``settled'' is changed to include testing of the hidden units, each of the results reported above change the direction of effect -- abstract words are now slower, as are words in dense semantic neighborhoods. Since there are independent reasons to exclude hidden units from the stopping criterion, and this is what is done in common practice, we believe this phenomenon to be of interest mostly to neural network practitioners. However, it does provide some insight into the interaction between the hidden and output units during settling. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Eigenfaces for Familiarity Matthew N. Dailey mdailey at cs.ucsd.edu Garrison W. Cottrell gary at cs.ucsd.edu Computer Science and Engineering Department University of California, San Diego 9500 Gilman Dr., La Jolla CA 92093-0114 USA Thomas A. Busey busey at indiana.edu Department of Psychology Indiana University Bloomington, IN 47405 USA A previous experiment tested subjects' new/old judgments of previously-studied faces, distractors, and morphs between pairs of studied parents. We examine the extent to which models based on principal component analysis (eigenfaces) can predict human recognition of studied faces and false alarms to the distractors and morphs. We also compare eigenface models to the predictions of previous models based on the positions of faces in a multidimensional ``face space'' derived from a multidimensional scaling (MDS) of human similarity ratings. We find that the error in reconstructing a test face from its position in an ``eigenface space'' provides a good overall prediction of human familiarity ratings. However, the model has difficulty accounting for the fact that humans false alarm to morphs with similar parents more frequently than they false alarm to morphs with dissimilar parents. We ascribe this to the limitations of the simple reconstruction error-based model. We then outline preliminary work to improve the fine-grained fit within the eigenface-based modeling framework, and discuss the results' implications for exemplar- and face space-based models of face processing. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX How Can *I* Know What *You* Think?: Assessing Representational Similarity in Neural Systems Aarre Laakso aarre at ucsd.edu Department of Philosophy University of California, San Diego La Jolla, CA 92093 Garrison W. 
Cottrell gary at cs.ucsd.edu Institute for Neural Computation Computer Science and Engineering University of California, San Diego La Jolla, CA 92093 How do my mental states compare to yours? We suggest that, while we may not be able to compare experiences, we can compare neural representations, and that the correct way to compare neural representations is through analysis of the distances between them. In this paper, we present a technique for measuring the similarities between representations at various layers of neural networks. We then use the measure to demonstrate empirically that different artificial neural networks trained by backpropagation on the same categorization task, even with different representational encodings of the input patterns and different numbers of hidden units, reach states in which representations at the hidden units are similar. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX A Simple Neural Network Models Categorical Perception of Facial Expressions Curtis Padgett and Garrison W. Cottrell Computer Science & Engineering 0114 University of California, San Diego La Jolla, CA 92093-0114 {cpadgett,gary}@cs.ucsd.edu The performance of a neural network that categorizes facial expressions is compared with human subjects over a set of experiments using interpolated imagery. The experiments for both the human subjects and neural networks make use of interpolations of facial expressions from the Pictures of Facial Affect Database (Ekman & Freisen, 1976). The only difference in materials between those used in the human subjects experiments (Young et al., 1997) and our materials are the manner in which the interpolated images are constructed -- image-quality morphs versus pixel averages. Nevertheless, the neural network accurately captures the categorical nature of the human responses, showing sharp transitions in labeling of images along the interpolated sequence. Crucially for a demonstration of categorical perception (Harnad, 1987), the model shows the highest discrimination between transition images at the crossover point. The model also captures the shape of the reaction time curves of the human subjects along the sequences. Finally, the network matches human subjects' judgements of which expressions are being mixed in the images. The main failing of the model is that there are intrusions of ``neutral'' responses in some transitions, which are not seen in the human subjects. We attribute this difference to the difference between the pixel average stimuli and the image quality morph stimuli. These results show that a simple neural network classifier, with no access to the biological constraints that are presumably imposed on the human emotion processor, and whose only access to the surrounding culture is the category labels placed by American subjects on the facial expressions, can nevertheless simulate fairly well the human responses to emotional expressions. XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX Predicting the Performance of Linearly Combined IR Systems Christopher C. Vogt University of California, San Diego, CSE 0114, La Jolla, CA 92093, USA Garrison W. Cottrell University of California, San Diego, CSE 0114, La Jolla, CA 92093, USA Abstract We introduce a new technique for analyzing combination models. The technique allows us to make qualitative conclusions about which IR systems should be combined. 
We achieve this by using a linear regression to accurately (r2=0.98) predict the performance of the combined system based on quantitative measurements of individual component systems taken from TREC5. When applied to a linear model (weighted sum of relevance scores), the technique supports several previously suggested hypotheses: one should maximize both the individual systems' performances and the overlap of relevant documents between systems, while minimizing the overlap of nonrelevant documents. It also suggests new conclusions: both systems should distribute scores similarly, but not rank relevant documents similarly. It furthermore suggests that the linear model is only able to exploit a fraction of the benefit possible from combination. The technique is general in nature and capable of pointing out the strengths and weaknesses of any given combination approach. SIGIR'98 24-28 August 1998 Melbourne, Australia. From harnad at coglit.soton.ac.uk Sat Jun 13 05:47:40 1998 From: harnad at coglit.soton.ac.uk (Stevan Harnad) Date: Sat, 13 Jun 1998 10:47:40 +0100 (BST) Subject: Invitation to archive your papers in CogPrints Archive Message-ID: To all cognitive scientists (apologies if you receive this more than once): You are invited to archive your preprints and reprints in the CogPrints electronic archive. The Archive covers all the Cognitive Sciences: Psychology, Neuroscience, Biology, Computer Science, Linguistics and Philosophy CogPrints is completely free for everyone, both authors and readers, thanks to a subsidy from the Electronic Libraries Programme of the Joint Information Systems of the United Kingdom and the collaboration of the NSF-supported Physics Eprint Archive at Los Alamos. CogPrints has just been opened for public automatic archiving. This means authors can now deposit their own papers automatically. The first wave of papers had been invited and hand-archived by CogPrints in order to set a model of the form and content of CogPrints. To see the current holdings: http://cogprints.soton.ac.uk/ To archive your own papers automatically: http://cogprints.soton.ac.uk/author.html All authors are encouraged to archive their papers on their home servers as well. For ferther information: admin at coglit.soton.ac.uk -------------------------------------------------------------------- BACKGROUND INFORMATION (No need to read if you wish to proceed directly to the Archive.) The objective of CogPrints is to emulate in the cognitive and biobehavioral sciences the remarkable success of the NSF-subsidised Physics Eprint Archive at Los Alamos http://xxx.lanl.gov The Physics Eprint Archive now makes available, free for all, over half of the annual physics periodical literature, with its annual growth strongly suggesting that it will not be long before it becomes the locus classicus for all of the literature in Physics. What this means is that anyone in the world with access to the Internet (and that number too is rising at a breath-taking rate, and already includes all academics, researchers and students in the West, and an increasing proportion in the Third World as well) can now search and retrieve virtually all current work in, for example, High Energy Physics, much of it retroactive to 1990 when the Physics archive was founded by Paul Ginsparg, who must certainly be credited by historians with having launched this revolution in scientific and scholarly publication (www-admin at xxx.lanl.gov). Does this mean that learned journals will disappear? Not at all. 
They will continue to play their traditional role of validating research through peer review, but this function will be an "overlay" on the electronic archives. The literature that is still in the form of unrefereed preprints and technical reports will be classified as such, to distinguish it from the refereed literature, which will be tagged with the imprimatur of the journal that refereed and accepted it for publication, as it always has been. It will no longer be necessary for publishers to recover (and research libraries to pay) the substantial costs of producing and distributing paper through ever-higher library subscription prices: Instead, it will be the beneficiaries of the global, unimpeded access to the learned research literature -- the funders of the research and the employers of the researcher -- who will cover the much reduced costs of implementing peer review, editing, and archiving in the electronic medium alone, in the form of minimal page-charges, in exchange for instant, permanent, worldwide access to the research literature for all, for free. If this arrangement strikes you as anomalous, consider that the real anomaly was that the authors of the scientific and scholarly periodical research literature, who, unlike trade authors, never got (or expected) royalties for the sale of their texts -- on the contrary, so important was it to them that their work should reach all potentially interested fellow-researchers that they had long been willing to pay for the printing and mailing of preprints and reprints to those who requested them -- nevertheless had to consent to have access to their work restricted to those who paid for it. This Faustian bargain was unavoidable in the Gutenberg age, because of the need to recover the high cost of producing and disseminating print on paper, but Paul Ginsparg has shown the way to launch the entire learned periodical literature into the PostGutenberg Galaxy, in which scientists and scholars can publish their work in the form of "skywriting": visible and available for free to all. -------------------------------------------------------------------- Stevan Harnad harnad at cogsci.soton.ac.uk Professor of Psychology harnad at princeton.edu Director, phone: +44 1703 592582 Cognitive Sciences Centre fax: +44 1703 594597 Department of Psychology http://www.cogsci.soton.ac.uk/~harnad/ University of Southampton http://www.princeton.edu/~harnad/ Highfield, Southampton ftp://ftp.princeton.edu/pub/harnad/ SO17 1BJ UNITED KINGDOM ftp://cogsci.soton.ac.uk/pub/harnad/ From juergen at idsia.ch Mon Jun 15 13:35:55 1998 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Mon, 15 Jun 1998 19:35:55 +0200 Subject: fractal face Message-ID: <199806151735.TAA19652@ruebe.idsia.ch> FACIAL BEAUTY AND FRACTAL GEOMETRY Juergen Schmidhuber What is it that makes a face beautiful? Average faces obtained by photographic (Galton 1878) or digital (Langlois & Roggman 1990) blending are judged attractive but not optimally attractive (Alley & Cunningham 1991) --- digital exaggerations of deviations from average face blends can lead to higher attractiveness ratings (Perrett, May, & Yoshikawa 1994). My novel approach to face design does not involve blending at all. Instead, the image of a female face with high ratings is composed from a fractal geometry based on rotated squares and powers of two. The corresponding geometric rules are more specific than those previously used by artists such as Leonardo and Duerer. 
They yield a short algorithmic description of all facial characteristics, many of which are compactly encod- able with the help of simple feature detectors similar to those found in mammalian brains. This suggests that a face's beauty correlates with simplicity relative to the subjective observer's way of encoding it. HTML: http://www.idsia.ch/~juergen/locoface/locoface.html (5 color figures, total of 0.7MB) Postscript: ftp://ftp.idsia.ch/pub/juergen/locoface.ps.gz (7 pages, 1.3MB, 5MB gunzipped) Comments welcome! IDSIA, Switzerland Juergen Schmidhuber www.idsia.ch From stefan.wermter at sunderland.ac.uk Tue Jun 16 09:12:59 1998 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Tue, 16 Jun 1998 14:12:59 +0100 Subject: Neural and Intelligent Systems Message-ID: <35866F5B.FA15A41A@sunderland.ac.uk> Research Assistant in Neural and Intelligent Systems (reference number CIRG28) Applications are invited for a three year research assistant position in the School of Computing and Information Systems investigating the development of hybrid neural/symbolic techniques for intelligent processing. This is an exciting new project which aims at developing new environments for integrating neural networks and symbolic processing. You will play a key role in the development of such hybrid subsymbolic/symbolic environments. It is intended to apply the developed hybrid environments in areas such as natural language processing, intelligent information extraction, or the integration of speech/language in multimedia applications. You should have a degree in a computing discipline and will be able to register for a higher degree. A demonstrated interest in artificial neural networks, software engineering skills and programming experience are essential (preferably including a subset of C, C++, CommonLisp, Java, GUI). Experience and interest in neural network software and simulators would be an advantage (e.g. Planet, SNNS, Tlearn, Matlab, etc). Salary is according to the researcher A scale (currently up to 13,871, under revision). Application forms and further particulars are available from the Personnel department under +44 191 515 and extensions 2055, 2429, 2054, 2046, or 2425 or E-Mail employee.recruitment at sunderland.ac.uk quoting the reference number CIRG28. For informal inquiries please contact Professor Stefan Wermter, e-mail: Stefan.Wermter at sunderland.ac.uk. Closing date: 10 July 1998. The successful candidate is expected to start the job as soon as possible. ******************************************** Professor Stefan Wermter University of Sunderland Dept. of Computing & Information Systems St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://osiris.sunderland.ac.uk/~cs0stw/ ******************************************** From Dave_Touretzky at cs.cmu.edu Mon Jun 15 17:06:08 1998 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Mon, 15 Jun 1998 14:06:08 -0700 Subject: CNS*98 listing of papers, and registration information Message-ID: ************************************************************************ SEVENTH ANNUAL COMPUTATIONAL NEUROSCIENCE MEETING (CNS*98) July 26 - 30, 1998 Santa Barbara, California REGISTRATION INFORMATION ************************************************************************ Registration is now open for this year's Computational Neuroscience meeting (CNS*98). 
This is the seventh in a series of annual inter-disciplinary conferences intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience. As in previous years, this meeting will bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in understanding how biological neural systems compute. The meeting will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation. The meeting in 1998 will take place at Fess Parker's Doubletree Resort in Santa Barbara, California, and include plenary, contributed, and poster sessions. The first session starts at 9 am on Sunday, July 26th, and the meeting ends with the annual CNS banquet on Thursday evening, July 30th. There will be no parallel sessions. The meeting includes two half days of informal workshops focused on current issues in computational neuroscience. Day care will be available for children, and given the beauty and recreational interest of the area, we encourage families to attend. LOCATION: The meeting will take place at Fess Parker's Doubletree Resort in Santa Barbara, California. MEETING ACCOMMODATIONS: Accommodations for the meeting have been arranged at Fess Parker's Doubletree Resort. Information concerning reservations, hotel accommodations, etc. is available at the meeting web site indicated below. A block of rooms is reserved at special rates. 30 student-rate rooms are available on a first-come-first-served basis, so we recommend that students act quickly to reserve these slots. NOTE that registering for the meeting WILL NOT result in an automatic room reservation. Instead you must make your own reservations by contacting the hotel itself. As this is the high season for tourists in Santa Barbara, you should make sure to reserve your accommodations quickly by contacting: Fess Parker's Doubletree Resort (RESERVATION REQUEST ORDER FORM LOCATED BELOW) NOTE: IN ORDER TO GET THE AVAILABLE ROOMS, YOU MUST CONFIRM HOTEL REGISTRATIONS BY JUNE 24, 1998. When making reservations by phone, make sure to indicate that you are registering for the Computational Neuroscience (CNS*98) meeting. Students will be asked to verify their status at check-in with a student ID or other documentation. MEETING REGISTRATION FEES: Registration received on or before June 26, 1998: Student: $ 95 Regular: $ 225 Meeting registration after June 26, 1998: Student: $ 125 Regular: $ 250 BANQUET: Registration for the meeting includes a single ticket to the annual CNS Banquet. Additional Banquet tickets can be purchased for $35 per person. The banquet will be held on Thursday, July 30th. DAY CARE: Day care will be available at the conference for those who inform us in advance of their day care needs. Note that day care will not be provided during the evening. Please send e-mail to judy at bbb.caltech.edu. Please provide the following information: 1. name of parent(s), 2. e-mail address, 3. age of children and 4. estimated times during which children will need day care. Day care will be provided free of charge except for children under the age of 2 years old, for whom a fee may be charged. AIRFARE: Santa Barbara has its own small airport with daily flights from Los Angeles and San Francisco.
In addition, ground transportation to Santa Barbara is available from Los Angeles International Airport (a drive of about one and a half hours). Special discount rates have been arranged with United and Northwest airlines if you mention the following group ID with airline reservations: Northwest Airlines - Phone No: 1-800-328-1111 Meeting I.D. No: NMG66 United Airlines - Phone No: 1-800-521-4041 (U.S. and Canada) Meeting I.D. No: 5255V ************************************************************************ ADDITIONAL INFORMATION (including the agenda with list of talks) can be obtained by: o Using our on-line WWW information and registration server, URL of: http://www.bbb.caltech.edu/cns98.html o ftp-ing to our ftp site. yourhost% ftp ftp.bbb.caltech.edu Name (ftp.bbb.caltech.edu:<): ftp Password: yourname at yourhost.yourside.yourdomain ftp> cd cns98 ftp> ls o Sending Email to: cns98 at bbb.caltech.edu ************************************************************************ ************************************************************************ SUNDAY, JULY 26, 1998 9:00 Welcoming Remarks and General Information 9:15 Featured Contributed Talk: Barry J. Richmond (NIH/NIMH) John A. Hertz and Timothy J. Gawne Comparing Responses to Visual Stimuli Appearing on Receptive Fields of V1 Complex Cells Due to Saccades with Responses Elicited by Stimulus Sequences Contributed Talks 10:05 Udo Ernst (MPI for Fluid Dynamics) Klaus Pawelzik, Fred Wolf, and Theo Geisel Theory of Nonclassical Receptive Field Phenomena in the Visual Cortex 10:25 Ko Sakai (RIKEN Brain Science Institute) Shigeru Tanaka Retinotopic Coding and Perceptual Segmentation in Tilt Illusion 10:45 David H. Goldberg (Brown University) Harel Shouval and Leon N Cooper Lateral Connectivity: A Possible Scaffolding for the Development of Orientation Preference Maps 11:05 Break 11:20 Gyöngyi Gaál (Emory University) John P. Donoghue and Jerome N. Sanes Relations Among Neural Activities Recorded in Premotor and Motor Cortex of Trained Monkeys During Visually Guided Hand and Arm Movements Tasks 11:40 Emery N. Brown (Massachusetts General Hospital / MIT) Loren M. Frank, Dengda Tang, Michael C. Quirk, and Matthew A. Wilson A Statistical Model of Spatial Information Encoding in The Rat Hippocampus 12:00 Satoru Inoue (The University of Electro-Communications) Yoshiki Kashimori and Takeshi Kambara The Neural Model of Nucleus Laminaris and Integration Layer Accomplishing Hyperacuity in Sound Location in the Barn Owl 12:20 Lunch Break and Poster Preview Session A 2:20 Featured Contributed Talk: Michael E Hasselmo (Harvard University) Erik Fransen, Gene V Wallenstein, Angel A Alonso, and Clayton T Dickson A Biophysical Simulation of Intrinsic and Network Properties of Entorhinal Cortex Contributed Talks 3:10 Jonathan Wolfe (University of New Mexico) Akaysha Tang Neuromodulation of Spike Timing and Spike Rate 3:30 Omer Artun (Brown University) Harel Z. Shouval Temporal Coding by Dynamic Synapses 3:50 Break 4:10 Hans E. Plesser (Max-Planck-Institut) Theo Geisel Bandpass Properties of Integrate-Fire Neurons 4:30 Invited Talk: To be Announced 5:20 End of Day Announcements 8:00 Poster Session A Péter Adorján (Technische Universität Berlin) György Barna, Péter Érdi, and Klaus Obermayer A Statistical Neural Field Approach to Orientation Selectivity 543 Charles H. Anderson (Washington University School of Medicine) Modeling Population Codes using Probability Density Functions 239 Charles H.
Anderson (Washington University School of Medicine) Shahin Hakimian and W. Thomas Thach A PDF Model of Populations of Purkinje Cells: Non-linear Interactions and High Variability 248 David J. Anderson (University of Michigan) Steven M. Bierer Noise Reduction of Multi-channel Neural Activity Using an Array Processing Technique 501 Ildiko Aradi (Ohio University) William R. Holmes Active Dendrites Regulate Spatio-temporal Synaptic Integration in Hippocampal Dentate Granule Cells 388 Delorme Arnaud (Centre de Recherche Cerveau et Cognition) Fabre-Thorpe Michèle, Richard Ghislaine, Fize Denis, and Simon Thorpe Rapid Processing of Complex Natural Scenes : A Role for the Magnocellular Visual Pathways? 235 Delorme Arnaud (Centre de Recherche Cerveau et Cognition) Jacques Gautrais, Rufin van Rullen, and Simon Thorpe Pikenet : A Simulator for Modeling Large Networks of Integrate and Fire Neurons 369 Bill Baird (U.C. Berkeley) An Oscillating Cortical Architecture Simulating Auditory Attention and Eeg-erp Data 513 Davis Barch (University of California, Berkeley) Donald A. Glaser Detection and Characterization of Coherent Motion by A 2-dimensional Sheet of Connected Elements: The "bow-wave" Model 412 William H. A. Beaudot (McGill University) Figure-ground Segregation of Coherent Motion In V1: A Model Based on The Role of Intra-cortical and Extra-cortical Feedbacks 386 Avrama Blackwell (George Mason University) Dynamics of the Light-induced Na+ Current in Hermissenda 206 Brian Blais (Brown University) Harel Shouval and Leon N Cooper Formation of Direction Selectivity in Natural Scene Environments 226 Alan H. Bond (Caltech) A System Model of the Primate Neocortex 593 Vladimir E. Bondarenko (Russian Academy of Sciences) Teresa Ree Chay Generation of Various Rhythms by Thalamic Neural Network Model 87 Victoria Booth (New Jersey Institute of Technology) Dendritic Plateau Potentials in Bistable Motoneurons 236 Carlos Brody (UNAM) Slow Resting Potential Covariations in Lgn Neurons Can Lead To Apparently Fast Cross-correlations in Their Spike Trains 316 David Brown (The Babraham Institute) Jianfeng Feng Is There a Problem Matching Model and Real Cv(isi)? 238 Anthony N. Burkitt (The Bionic Ear Institute) Graeme M. Clark New Technique for Analyzing Integrate and Fire Neurons 574 Gully Burns (University of Southern California) Neuroscholar 1.00, A Neuroinformatics Databasing Website 517 Maria Bykhovskaia (University of Virginia) Mary Kate Worden and John T. Hackett Frequency Facilitation at the Lobster Neuromuscular Junction: Quantal Analysis and Simulations 193 Martin T. Chian (University of Southern California) M. T. Chian, V.Z. Marmarelis, and T.W. Berger Identification of Unobservable Neural Systems in the Hippocampus using Adaptive Estimation 406 Ryan Clement (Arizona State University) Russell Witte, Rob Rennaker, and Daryl Kipke Functional Connectivity in Auditory Cortex using Chronic, Multichannel Microelectrodes in Awake Animals 602 Sharon Crook (Montana State University) John P. Miller The Mechanistic Basis of Neural Encoding 192 Erik De Schutter (University of Antwerp) Reinoud Maex and Bart Vos Synchronized Firing of Golgi Cells: Functional Modulation by the Cerebellar Circuitry 259 Patricia M. Di Lorenzo (SUNY at Binghamton) Inhibitory Influence on Electrophysiological Response to Taste in the Brain Stem 373 Jim Dilmore (University of Pittsburgh) J. G. Dilmore, B. S. Gutkin, and G. B.
Ermentrout A Biophysical Model of Dopaminergic Modulation of Persistent Sodium Currents in Pfc Pyramidal Neurons: Effects on Neural Response Properties 217 Alexander Dimitrov (The University of Chicago) Alexander Dimitrov, Trevor Mundel, Vernon L. Towle, and Jack D. Cowan Independent Components Analysis of Subdural Ecog Recordings from an Epileptic Patient 396 Mikael Djurfeldt (SANS/NADA, KTH) Anders Sandberg, Örjan Ekeberg, and Anders Lansner See---a Framework for Simulation of Biologically Detailed and Artificial Neural Networks and Systems 518 Gideon Dror (The Academic College of Tel-Aviv-Yaffo) Analysis and Modelling of Population Dynamics in the Visual Cortex 564 Witali L. Dunin-Barkowski (Texas Tech University) Serge L. Shishkin and Donald C. Wunsch Phase-based Storage of Information in Cerebellum: A Case of Stationary Random Inputs 531 Michael Eisele (Salk Institute) Terry Sejnowski Model-based Reinforcement Learning by Pyramidal Neurons 485 Chris Eliasmith (Washington University in St. Louis) Charles H. Anderson Attractors, Representation, and the Pdf Framework 288 Péter Érdi (Hungarian Academy of Sciences) A Statistical Approach to Neural Population Dynamics: Theory, Algorithms, Simulations 263 Jianfeng FENG (The Babraham Institute) Coefficient of Variation Greater Than .5 How And When? 223 Brent A. Field (University of Oregon) Alexander R. Pico and Richard T. Marrocco Local Injections Of Neurotransmitters Significantly Alter High-frequency (245hz) Activity 559 Piotr J. Franaszczuk (Univ. of Maryland School of Medicine) Pawel Kudela and Gregory K. Bergey Model of the Propagation of Synchronous Firing in a Reduced Neuron Network 368 Francesco Frisone (University of Genova) Paolo Vitali, Pietro G. Morasso, Guido Rodriguez, Alberto Pilot, and Marco Rosa Can the Synchronization of Cortical Areas Be Evidenced by Fmri? 264 Tomoki Fukai (Tokai University) Modeling the Interplay of Short-term Memory and the Basal Ganglia in Sequence Processing 157 Gradwohl Gideon (University of the Negev) Nitzan Ron Grossman Yoram Homogeneous Distribution of Excitatory and Inhibitory Synapses on the Dendrites of the Cat Surea Triceps Alpha-motoneurons Increases Synaptic Efficacy: Computer Model 191 J. Randall Gobbel Carnegie Mellon University Synchronization of Tonically Active Neurons in a Biophysical Model of the Neostriatum 488 David Golomb (Ben-Gurion University of Negev) David Hansel Theory of Synchrony in Sparse Neuronal Networks 208 Jeremy P. Goodridge (Carnegie Mellon University) A. David Redish and David S. Touretzky A Model of the Rat Head Direction System That Accounts for the Unique Properties of Anterior Thalamic Head Direction Cell Firing 384 Alex Guazzelli (USC Brain Project) Mihail Bota and Michael A. Arbib Incorporating Path Integration Capabilities In the Tam-wg Model of Rodent Navigation 428 Alex Guazzelli (USC Brain Project) Mihail Bota and Michael A.
Arbib Incorporating Path Integration Capabilities in the Tam-wg Model of Rodent Navigation 524 Juergen Haag (Friedrich-Miescher-Laboratory) Alexander Borst Influence of Active Membrane Properties on the Encoding of Motion Information In Visual Interneurons of the Blowfly 103 Rolf Henkel University of Bremen Sampling Three-dimensional Space --- The Interplay of Vergence- And Fusion-system 198 Michael Herrmann (Max-Planck-Institut) Klaus Pawelzik and Theo Geisel Simultaneous Self-organization of Place and Direction Selectivity in a Neural Model of Self-localization 420 MONDAY, JULY 27, 1998 9:00 General Information 9:15 Featured Contributed Talk: Vikaas S. Sohal (Stanford University) and John R. Huguenard Long-range Connections Synchronize Rather Than Spread Intrathalamic Oscillatory Activity: Computational Modeling and In Vitro Electrophysiology Contributed Talks 10:05 Thomas Wennekers (University of Ulm) Gunther Palm How Imprecise is Neuronal Synchronization? 10:25 Steven P. Dear (Pennsylvania State University) Corey B. Hart Computational Mechanisms Linking Synchronization and Information Coding 10:45 Nicholas Hatsopoulos (Brown University) Liam Paninski, Nicholas G. Hatsopoulos, and John P. Donoghue Mutual Information Provided by Synchronous Neuronal Discharge about Target Location 11:05 Break 11:20 Dror Gideon (Academic College of Tel-Aviv-Yaffo) Tsodyks Misha Analysis and Modelling of Population Dynamics in the Visual Cortex 11:40 Steven L. Bressler (Florida Atlantic University) Mingzhou Ding and Weiming Yang Investigation of Cooperative Cortical Dynamics by Multivariate Autoregressive Modeling of Event-related Local Field Potentials 12:00 Charlotte Gruner (Rice University) Don H. Johnson Correlation and Neural Information Coding Efficiency 12:20 Mark S. Goldman (Brandeis University) Sacha B. Nelson and Laurence F. Abbott Decorrelation Of Spike Trains by Synaptic Depression 12:40 Lunch Break and Poster Preview Session B 2:00 Featured Contributed Talk: John K. Chapin (Allegheny University of Health Sci.) Ronald S. Markowitz and Karen A. Moxon Controlling Robot Arms using Neuronal Population Recordings Contributed Talks 2:50 Rolf Eckmiller (Universitaet Bonn) Ralph Huenermann and Michael Becker Exploration of a Dialog-based Tunable Retina Encoder for Retina Implants 3:10 Joel White (Tufts Medical School) John Kauer Odor Recognition in an Artificial Nose by Spatio-temporal Processing using an Olfactory Neuronal Network. 3:30 Ralf Moeller (University of Zurich) Marinus Maris and Dimitrios Lambrinos A Neural Model of Landmark Navigation in Insects 3:50 Break 4:10 Malcolm P.
Young (Neural Systems Group) Claus-C Hilgetag and Jack W Scannell Models of Paradoxical Lesion Effects and Rules of Inference for Imputing Function to Structure in the Brain 4:30 Invited Talk: TBA 5:20 End of Day Announcements 8:00 Poster Session B John Hertz (Nordita) Zhaoping Li Odor Recognition and Segmentation by Coupled Olfactory Bulb and Cortical Networks 106 Andrew Hill (Emory University) Phase Lag between Oscillators of a Realistic Neuronal Network Model of the Leech Heartbeat Motor Pattern Generating System 429 Osamu Hoshino (The University of Electro-Communications) Yoshiki Kashimori and Takeshi Kambara A Neural Mechanism of Feature Binding Based on the Dynamical Map Theory in Distributed Coding Scheme 257 Arthur Houweling (The Salk Institute) Maxim Bazhenov, Igor Timofeev, Mircea Steriade, and Terrence Sejnowski Computational Analysis of Intracortical Augmenting Responses Resulting from Short-term Synaptic Plasticity 482 Hidetoshi Ikeno (Maizuru National College of Technology) Shiro Usui Mathematical Description of Ionic Currents of the Kenyon Cell in the Mushroom Body of Honeybee 553 Laurent Itti (California Institute of Technology) Christof Koch and Jochen Braun A Quantitative Model Relating Visual Neuronal Activity to Psychophysical Thresholds 530 Eugene Izhikevich (Arizona State University) FM Interactions in Brain Models 11 Eugene Izhikevich (Arizona State University) Theoretical Foundations of Pulse-coupled Models. 12 David B. Jaffe (University of Texas at San Antonio) Raymond A. Chitwood Comparing Electrotonus in Hippocampal Ca3 Nonpyramidal and Pyramidal Neurons 274 David B. Jaffe (The University of Texas at San Antonio) Raymond A. Chitwood The Contribution of Active Dendrites to Epsp Propagation in a Hippocampal Ca3 Nonpyramidal Neuron Model 438 Pat Johnston UCLA John Klopp, Val Nenov, and Eric Halgren The Effects of Varying Connectional Parameters on Input-output Relationships in a Neocortical-hippocampal Model 432 Jorge V. Jose (Northeastern University) P.H.E. Tiesinga Spiking Statistics in Noisy Hippocampal Interneurons 98 Jeeyune Jung (University of Kentucky) Ranu Jung Brain-spinal Feedforward-feedback Interactions Affect Output Pattern and Intracellular Properties of Motor Networks in the Lamprey 471 Takeshi Kambara (Univ. of Electro-Communications) Hiromichi Owada and Yoshiki Kashimori A Neural Mechanism of Am Frequency Selectivity of Pyramidal Cell Circuit in Electrosensory Lateral-line Lobe of Weakly Electric Fish 496 Adam Kepecs (Biology Department, Brandeis University) Milos Dolnik Control of Neuronal Chaos: Using Single Cell Dynamics To Store Information 440 Daryl Kipke (Arizona State University) Russell Witte, Glen Hattrup, Justin Williams, and Daryl Kipke Pursuing Dynamic Reorganization in Auditory Cortex using Chronic, Multichannel Microelectrodes in Awake, Behaving Animals 497 Jeanette Hellgren Kotaleski (Karolinska Institutet) Jesper Tegner, Sten Grillner, and Anders Lansner Control of Burst Proportion and Frequency Range by Drive Dependent Modulation of Adaptation 200 Jeffrey L. Krichmar (George Mason University) Kim T. Blackwell, Garth S. Barbour, Alexander B. Golovan, and Thomas P. Vogl A Solution To The Feature Correspondence Problem Inspired By Visual Scanpaths 215 Yoshihisa Kubota (Caltech) James M. Bower Decoding Time-Varying Calcium Signals by CaMKII/PP1: A Dynamical System Theory of Synaptic Calcium Computation 616 Linda J. Larson-Prior San Francisco College of Osteopathic Medicine Huo Lu and Fred W.
Prior Serotonergic Modulation of the Cerebellar Granule Cell Network 234 Mark Laubach (Duke University Medical Center) Marshall Shuler and Miguel Nicolelis Principal and Independent Component Analyses for Multi-site Investigations of Neural Ensemble Interactions 229 Sarah Lesher (University of Maryland) Nick Mellen, Suzanne Dykstra, Mark L. Spano, and Avis H. Cohen Stable Lamprey Swimming Has Unstable Periodic Orbits 374 William B. Levy (University of Virginia) Xiangbao Wu Enhancing The Performance of A Hippocampal Model by Increasing Variability Early in Learning 529 Jim-Shih Liaw (University of Southern California) J.-S.Liaw and T.W. Berger Synapse Dynamics: Harnessing the Computing Power of Synaptic Dynamics 398 David T.J. Liley (University of Technology) Peter J. Cadusch and James J. Wright A Continuum Theory of Electrocortical Activity 72 Miguel Maravall (SUNY at Stony Brook) An Analysis of Connectivity and Function in Hippocampal Associative Memory 232 Bethge Matthias (Max-Planck-Institut) Klaus Pawelzik and Theo Geisel Rapid Learning with Depressing Synapses 555 Marcelo Bastos Mazza (Universidade de São Paulo) Antônio Carlos Roque da Silva Filho A Realistic Computer Simulation of Properties of Somatotopic Maps 101 Marilene de Pinho S. Mazza (Universidade de São Paulo) Marilene de Pinho and Antônio Carlos Roque da Silva Filho A Realistic Computer Simulation of Tonotopic Maps Formation Processes in the Auditory Cortex 102 Bruce H. McCormick (Texas A&M University) Design of a Brain Tissue Scanner 160 Bruce H. McCormick (Texas A&M University) Brent P. Burton and Travis S. Chow Virtual Microscopy of Brain Tissue 161 Bruce H. McCormick (Texas A&M University) Brent P. Burton, Travis S. Chow, and Andrew T. Duchowski Exploring The Brain Forest 159 Elliot D. Menschik (University of Pennsylvania) Shih-Cheng Yen and Leif H. Finkel Functional Properties of a Cellular-level Model of Hippocampal Ca3 127 John Miller (Montana State University) B. Girish, Tenaya M. Rodewald, and John P. Miller Encoding of Direction and Dynamics of Air Currents by Filiform Mechanoreceptors in the Cricket Cercal System 431 Ali A. Minai (University of Cincinnati) Simona Doboli and Phillip J. Best A Latent Attractors Model of Context-selection in The Dentate Gyrus-hilus System 268 Farhad K. Mosallaie (Baylor College of Medicine) John A. Halter and Andrew R. Blight Effects of Activity Dependent Ion Concentration on Repetitive Firing in the Myelinated Axon 528 Frank Moss (University of Missouri at St. Louis) Hans A. Braun, M. Dewald, M. Huber, K. Voigt, and Xing Pei Unstable Periodic Orbits and Chaos in Thermally Sensitive Neurons 272 Frank Moss (University of Missouri at St. Louis) Peter Jung and Ann Cornell-Bell Noise Mediated Spiral Waves in Glial Cell Networks Show Evidence of Self Organized Critical Behavior 347 Frank Moss (University of Missouri) Xing Pei, Kevin Dolan, and Ying-Cheng Lai Counting and Scaling Unstable Periodic Orbits in Biological and Physical Systems 252 Karen Anne Moxon (Allegheny University) John K. Chapin Cortico-thalamic Interactions in Response to Whisker Stimulation in a Computer Model of the Rat Barrel System 222 Robert Muller (SUNY Health Science Center at Brooklyn) Andre Fenton and Gyorgy Csizmadia Conjoint Control of Place Cell Activity by Two Visual Stimuli 372 Shingo Murakami (The University of Tokyo) Akira Hirose Proposal of Microscopic Nerve-cell-activity Analysis Theory for Elucidating Membrane Potential Dynamics 328 John S.
Nafziger (University of Pennsylvania) Shih-Cheng Yen and Leif H. Finkel Effects of Element Spacing on the Detection of Contours: Psychophysical and Modeling Studies 400 Hirofumi Nagashino (The University of Tokushima) Minoru Kataoka and Yohsuke Kinouchi A Coupled Neural Oscillator Model for Recruitment and Annihilation of the Degrees of Freedom of Oscillatory Movements 95 Bruno A. Olshausen (University of California, Davis) A Functional Model of V1 Horizontal Connectivity Based on the Statistics of Natural Images 468 Tetsuya Oyamada (The University of Electro-Communications) Yoshiki Kashimori and Takeshi Kambara A Neural Network Model of Olfactory System for Odor Recognition and Memorization Controlled by Amygdala 459 Xing Pei (University of Missouri at St. Louis) Lon Wilkens and Winfried Wojtenek The Site Of Endogenous Oscillation In The Electrosensory Primary Afferent Nerve Fiber In The Paddlefish, Polyodon Spathula 332 J. S. Pezaris (California Institute of Technology) M. Sahani and R. A. Andersen Response Correlations in Parietal Cortex 538 TUESDAY, JULY 28, 1998 9:00 General Information 9:15 Featured Contributed Talk: Raymon M. Glantz (Rice University) A Cellular Model for the Mechanism of Directional Selectivity in Tangential Cells of the Crayfish Visual System Contributed Talks 10:05 Hermann Schobesberger (University of Pittsburgh) Boris S. Gutkin and John P. Horn A Minimal Model for Metabotropic Modulation of Fast Synaptic Transmission and Firing Properties in Bullfrog Sympathetic B Neurons 10:25 Taraneh Ghaffari-Farazi (University of Southern California) T. Ghaffari-Farazi, J.-S. Liaw, and T.W. Berger Morphological Impacts on Synaptic Transmission 10:45 Niraj S. Desai (Brandeis University) Lana C. Rutherford, Sacha B. Nelson, and Gina G. Turrigiano Activity Regulates the Intrinsic Excitability of Neocortical Neurons 11:05 Break 11:20 Dieter Jaeger (Emory University) Volker Gauck The Response Function of Different Types of Neurons for Artificial Synaptic Input Applied with Dynamic Current Clamping. 11:40 Máté Lengyel (KFKI, Research Inst.) Ádám Kepecs and Péter Érdi An Investigation of Location-dependent Differences between Somatic and Dendritic IPSPs 12:00 Farzan Nadim (Brandeis University) Yair Manor, Nancy Kopell, and Eve Marder Frequency Regulation by a Synapse Acting as a Switch: A Role for Synaptic Depression of Graded Transmission 12:20 Lunch Break and Poster Preview Session C 2:00 Invited Talk: TBA Contributed Talks 2:50 Christopher A. Del Negro (UCLA) Chie-Fang Hsiao and Scott H. Chandler Orthodox and Unorthodox Dynamics Govern Bursting Behavior in Rodent Trigeminal Neurons 3:10 Mike Neubig (University Laval) Alain Destexhe Changes in the Subcellular Localization of T-type Calcium Channels Change the Threshold and Strength of its Bursts 3:30 Paul Rhodes (National Institutes of Health) Regulation of Na+ And K+ Channel Gating Controls the Electrical Properties of Pyramidal Cell Dendrites 3:50 Break 4:10 Jaap van Pelt (Netherlands Institute for Brain Research) Harry B. M. Uylings Modeling the Natural Variability in Neuronal Branching Patterns 4:30 Adrian Robert (UC San Diego) Pyramidal Arborizations and Activity Spread in Neocortex 4:50 Geoffrey J. Goodhill (Georgetown University Medical Center) Jeffrey S.
Urbach Mathematical Analysis of Gradient Detection by Growth Cones 5:20 End of Day Announcements 8:00 Poster Session C Christian Piepenbrock (Technical University of Berlin) Klaus Obermayer Effects of Lateral Competition in the Primary Visual Cortex on the Development of Topographic Projections and Ocular Dominance Maps 381 Panayiota Poirazi (University of Southern California) Bartlett Mel Memory Capacity of Neurons with Active Dendrites 233 Sergei Rebrik (University of California, San Francisco) Brian D. Wright and Ken Miller Cross Channel Correlations in Tetrode Recordings: Implications for Spike-sorting 434 Chris Roehrig (University of British Columbia) Catharine H. Rankin Dymods: A Framework for Modularizing Dynamical Neuronal Structures 558 Jonathan Rubin (The Ohio State University) David Terman Geometric Analysis of Neural Firing Patterns in Network Models with Fast Inhibitory Synapses 614 Eytan Ruppin (Tel-Aviv University) Gal Chechik and Isaac Meilijson Neuronal Regulation: A Biological Plausible Mechanism for Efficient Synaptic Pruning in Development 572 Maureen E. Rush (California State University, Bakersfield) William Ott A-current Modulation of Low-threshold Spiking 481 Ilya A. Rybak (DuPont Central Research) Michael L. Ramaker and James S. Schwaber Modeling Interacting Neural Populations: Dynamics, State Transitions and Applications to Particular Models 83 Cristiane Salum (St. George's Hospital Medical School) Alan D. Pickering Striatal Dopamine in Reinforcement Learning: A Computational Model 326 Hermann Schobesberger (University of Pittsburgh) Boris S. Gutkin and John P. Horn A Minimal Model for Metabotropic Modulation of Fast Synaptic Transmission and Firing Properties in Bullfrog Sympathetic B Neurons 385 Simon Schultz (Oxford University) Stefano Panzeri, Alessandro Treves, and Edmund T. Rolls Correlated Firing and the Information Represented by Neurons in Short Epochs 325 Nicolas Schweighofer (ERATO) Kenji Doya and Mitsuo Kawato A Model of the Electrophysiological Properties of the Inferior Olive Neurons 197 Peggy Series (Ecole Normale Superieure) Philippe Tarroux Synchrony and Delay Activity in Cortical Network Models 311 Ladan Shams (University of Southern California) Christoph von der Malsburg Are Object Shape Primitives Learnable? 63 Lokendra Shastri (International Computer Science Institute) Recruitment of Binding-match and Binding-error Detector Circuits Via Long-term Potentiation and Depression 427 Natalia Shevtsova (University of Maryland) James A. Reggia Lateralization in a Bihemispheric Neural Model of Letter Identification 294 Ying Shu (Univ. of Southern California) Xiaping Xie, Jim-shih Liaw, and Ted W. Berger A Protocol-based Simulation for Linking Computational and Experimental Studies 491 Jonathan Z. Simon (University of Maryland) Catherine E. Carr and Shihab A. Shamma A Dendritic Model of Coincidence Detection in the Avian Brainstem 297 Frances K. Skinner The Toronto Hospital Research Institute Liang Zhang, Jose Luis Perez Velazquez, and Peter L. Carlen Bursting: A Role for Gap-junctional Coupling 370 Gregory D. Smith National Institute of Health Charles L. Cox, S. Murray Sherman, and John Rinzel Fourier Analysis of Sinusoidally Driven Thalamocortical Relay Neurons and a Minimal Integrate-and-fire-or-burst Model 65 Sheryl S. Smith Allegheny University Ronald S. Markowitz, Chris I. deZeeuw and John K.
Chapin Hormone Modulation of Synchronized Inferior Olivary Ensembles During Rapid Vibrissa Movement: Association with Increased Levels of Gap Junction Proteins 218 Jacob Spoelstra University of Southern California Michael A. Arbib and Nicolas Schweighofer Cerebellar Adaptive Control of a Biomimetic Manipulator 466 Klaas Enno Stephan (C&O-Vogt Institute) Rolf Kötter Objective Relational Transformation (ort) - A New Foundation for Connectivity Databases 251 Michael Stiber (Univ. of Washington) Bilin Zhang Stiber, Edwin R. Lewis, and Kenneth R. Henry Categorization of Gerbil Auditory Fiber Responses 162 Susanne Still (Institut fuer Neuroinformatik) Gwendal Le Masson Traveling Waves with Asymmetric Phase-lags in a Ring of Three Inhibitory Coupled Model Neurons 585 Fahad Sultan (University Tuebingen) A Model of Temporal and Activity Dependent Mechanisms Underlying the Phylogenetic Development of Cerebellar Molecular Interneuron Morphology 504 Daniel Suta (Johns Hopkins University) Eric D. Young Computer Simulations of Dorsal Cochlear Nucleus Neuronal Circuits 377 Joel Tabak (NINDS/NIH) Walter Senn, Michael O'Donovan, and John Rinzel Comparison of Two Models for Pattern Generation Based on Synaptic Depression 421 David C. Tam (University of North Texas) A Spike Train Analysis for Detecting Temporal Integration in Neurons 413 Shoji Tanaka (Yale University School of Medicine) Shuhei Okada Functional Prefrontal Cortical Circuitry for Visuospatial Working Memory Formation: A Computational Model. 94 Masami Tatsuno (Waseda University) Yoji Aizawa Network Model of Synaptic Modification Induced by Time-structured Stimuli in the Hippocampal Ca1 Area 158 Kathleen Taylor (University of Oxford) John Stein Attention, Intention and Salience in the Posterior Parietal Cortex 261 Jesper Tegnér (Nobel Institute for Neurophysiology) Jeanette Hellgren-Kotaleski The Synaptic Nmda Component Affects the Synchronization between Neural Oscillators 201 Simon Thorpe (Centre de Recherche Cerveau et Cognition) Van Rullen Rufin Spatial Attention in Asynchronous Neural Networks 335 Hiroyuki Uchiyama (Kagoshima University) Avian Centrifugal Visual System: A Possible Neural Substrate for Selective Visual Attention 407 Michael S. Wehr (Caltech) John Pezaris and Maneesh Sahani Simultaneous Paired Intracellular and Tetrode Recordings for Evaluating The Performance of Spike Sorting Algorithms. 507 Thomas Wennekers (University of Ulm, Germany) Friedrich T. Sommer Gamma-oscillations Support Optimal Retrieval in Associative Memories of Pinsky-rinzel Neurons 376 Ralf Wessel (UCSD) William B. Kristan Jr and David Kleinfeld Spatial Distribution of Voltage-gated Channels in the Neurites of the Leech Anterior Pagoda Cell: Functional Consequences for Nonlinear Synaptic Integration 142 Justin C. Williams (Arizona State University) Robert Rennaker, David Pellinen, and Daryl Kipke Towards A Long Term Neural Interface: Unit Stability in Chronic, Multichannel Recordings 492 Simon A. J. Winder (Microsoft Corporation) A Model for Biological Winner-take-all Neural Competition Employing Inhibitory Modulation of Nmda-mediated Excitatory Gain 536 Laurenz Wiskott (The Salk Institute) Learning Invariance Manifolds 108 Russell Witte (Arizona State University) Glen Hattrup, Justin Williams, and Daryl Kipke Pursuing Dynamic Reorganization in Auditory Cortex Using Chronic, Multichannel Microelectrodes in Awake, Behaving Animals. 560 Shih-Cheng Yen (University of Pennsylvania) Elliot D. Menschik and Leif H.
Finkel Synchronization and Desynchronization in Striate Cortical Networks 395 Katherine R. Zaremba (Arizona State University) Steven M. Baer Relaxation Oscillators and Bursters Coupled Through Passive Cables 573 Ying Zhou (Rhode Island College) Walter Gall Including a Second Inward Conductance in Morris and Lecar Dynamics 211 WEDNESDAY, July 29, 1998 9:30 General Information 9:45 Featured Contributed Talk: Volker Steuber (University of Edinburgh) and David Willshaw A Model of Intracellular Signalling Can Implement Radial Basis Function Learning in Cerebellar Purkinje Cell Contributed Talks 10:35 Berthold Ruf (Institute for Theoretical Computer Science) Thomas Natschlaeger Pattern Analysis with Spiking Neurons Using Delay Coding 10:55 Arthur D. Kuo (University of Michigan) Mark J. Evans An Adaptive Filter Model of Velocity Storage in Visual-vestibular Interactions 11:15 Yoshiki Kashimori (Univ. of Electro-Communications, Japan) Takeshi Kambara Neural Mechanism of Adaptive Suppression of Background Signals Arising from Tail Movements in the Gymnotid Electrosensory System 11:35 Daniel A. Butts (Lawrence Berkeley Laboratory) Marla B. Feller, Carla J. Shatz, and Daniel S. Rokhsar The Developing Retina Motivates a General Model of Neural Waves 11:55 Federal Funding Opportunities 12:30 Lunch Break 2:00 Workshops I - Organization 9:30 Rock and Roll Jam Session THURSDAY, July 30, 1998 9:30 General Information 9:45 Featured Contributed Talk: Xiao-Jing Wang (Brandeis University) Yinghui Liu and Baylor Fox Neuronal Mechanisms of Working Memory in Prefrontal Cortex Contributed Talks 10:35 David Horn (Tel Aviv University) Nir Levy and Eytan Ruppin The Importance of Nonlinear Dendritic Processing in Multimodular Memory Networks 10:55 Song Chun Zhu (Stanford University) From Local Features to Global Perception: Computing Texture and Shape in Markov Random Fields 11:15 Enrico Simonotto (Univ. Missouri at St. Louis) F. Spano, M. Riani, A. Ferrari, F. Levrero, A. Pillot, P. Renzetti, R.C. Parodi, F. Sardanelli, P. Vitali, J. Twitty, and F. Moss FMRI Studies of Visual Cortical Activity during Noise Stimulation 11:35 Featured Contributed Talk: Frank Moss (University of Missouri at St. Louis) David F. Russell Animal Behavior Enhanced By Noise 12:25 Business Meeting 1:00 Lunch Break 2:30 Workshops II - Organization 9:30 Banquet - Life is a beach ************************************************************************ CNS*98 REGISTRATION FORM Last Name: First Name: Title: Student___ Graduate Student___ Post Doc___ Professor___ Committee Member___ Other___ Organization: Address: City: State: Zip: Country: Telephone: Email Address: REGISTRATION FEES: Technical Program --July 26 - July 30, 1998 Regular $225 ($250 after June 26th) Price Includes One Banquet Ticket Student $ 95 ($125 after June 26th) Price Includes One Banquet Ticket Each Additional Banquet Ticket $35 Total Payment: $ Please Indicate Method of Payment: Check or Money Order * Payable in U. S. Dollars to CNS*98 - Caltech * Please make sure to indicate CNS*98 and YOUR name on all money transfers. Charge my card: Visa Mastercard American Express Number: Expiration Date: Name of Cardholder: Signature as appears on card (for mailed in applications): Date: ADDITIONAL QUESTIONS: Previously Attended: CNS*92___ CNS*93___ CNS*94___ CNS*95___ CNS*96___ CNS*97___ Did you submit an abstract and summary? ( ) Yes ( ) No Title: Do you have special dietary preferences or restrictions (e.g., diabetic, low sodium, kosher, vegetarian)?
If so, please note: Some grants to cover partial travel expenses may become available for students and postdoctoral fellows who present papers at the meeting. Do you wish further information? ( ) Yes ( ) No PLEASE FAX OR MAIL REGISTRATION FORM TO: Caltech, Division of Biology 216-76, Pasadena, CA 91125 Attn: Judy Macias Fax Number: (626) 795-2088 (Refund Policy: 50% refund for cancellations on or before July 17th, no refund after July 17th) ******************************************************************** ************************************************************************ PLEASE CALL FESS PARKER'S DOUBLETREE RESORT TO MAKE HOTEL RESERVATIONS AT (800) 879-2929, (805) 564-4333 Fax (805)564-4964 PLEASE NOTE: YOU CAN MAIL REGISTRATION FORM TWO WAYS * MAIL REGISTRATION FORM TO FESS PARKER'S DOUBLETREE RESORT AT THE ADDRESS BELOW * FAX REGISTRATION TO (805) 564-4964 ********************************************************************** MAIL TO: Fess Parker's Doubletree Resort Attn: Reservation Department 633 East Cabrillo Boulevard Santa Barbara, CA 93103-9932 Check-In Time: 4:00 p.m. Check-Out Time: 12:00 noon Computational Neuroscience Conference - CNS*98 July 26 - 30, 1998 REQUESTS MUST BE RECEIVED BY: JUNE 24, 1998 Name of Person Requesting Rooms: Last Name:____________________________ First Name:____________________________ Company Name:________________________ Institute:_______________________________ Street Address or PO Box Number:___________________________ City:___________________________________ State:___________________________________ Zip Code:________________________________ Area Code and Phone Number:______________________________ ARRIVAL (DAY/DATE)______________ TIME ______________ DEPARTURE (DAY/DATE)____________ TIME ______________ ACCOMMODATIONS RATES Student.................................$99.00 - 30 student room rates available (Students will be asked to verify their status on check-in with a student ID or other documentation.) SINGLE $139.00 Double $139.00 ___ I prefer a non-smoking room. Deposit non-refundable if not cancelled 72 hours before arrival. FOR RESERVATIONS OR CANCELLATIONS OR OTHER INFORMATION, PLEASE CALL DIRECT (800)879-2929 (INSIDE U.S.), (805)564-4333, FAX (805) 564-4964 THIS IS A RESERVATION REQUEST AND MUST BE GUARANTEED BY A DEPOSIT OR AN ACCEPTED CREDIT CARD NUMBER AND SIGNATURE: ____ Guaranteed by my first night's deposit (check or Money Order enclosed) ____ Guaranteed by my credit card (Visa___, MasterCard___, American Express___, Diners Club___, or Carte blanche.) Credit Card No. ______________________________________ Expiration Date:______________________________________ I understand that I am liable for one night's room and tax which will be deducted from my deposit or billed through my credit card. Cancellation will be subject to current hotel policy and handling charge. Signature:__________________________ There is a 72 hour cancellation policy in effect at the resort. Rooms are subject to 10% Santa Barbara Occupancy Tax. Children under 18 free when sharing room with adult. If the room type requested is not available, the next available room type will be assigned. If your group has a range of rates and the rate category requested has been filled, then the next available rate will apply.
Special Requests:________________________________________________________ ______________________________________________________________________ ______________________________________________________________________ ______________________________________________________________________ All special requests are on a space availability basis. From ted.carnevale at yale.edu Thu Jun 18 09:45:30 1998 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Thu, 18 Jun 1998 09:45:30 -0400 Subject: the NEURON summer course! Message-ID: <358919FA.35D@yale.edu> There have been so many re-announcements of other summer events through multiple channels that I was a bit hesitant to add to the din--but just in case you haven't heard: Time is running out to register for the NEURON course we're presenting August 1-5 at the San Diego Supercomputer Center. Salient features: 1. lots of hands-on exercises using NEURON on UNIX and Windows NT workstations 2. covers everything you need to know to get started 3. topics include: --using real anatomical and biophysical data --extending NEURON's library of biophysical mechanisms with NMODL --using NEURON's built-in tools for automated data analysis and model optimization --strategies for increasing model efficiency and accuracy --accelerating simulations with the new variable order, variable timestep integration method --NEURON's powerful tools for electrotonic analysis (as described in Carnevale et al. Comparative electrotonic analysis of three classes of rat hippocampal neurons. JNP 78:703-720, 1997) An early version of this toolkit is described at http://www.neuron.yale.edu/papers/ebench/ebench.html --project management --customizing the graphical user interface --using NEURON to model networks of neurons Learn all this and more from the experts! For more information and the registration form, see http://www.neuron.yale.edu/sdsc98/sdsc98.htm The registration deadline is approaching rapidly, and only a few seats are left, so don't delay! --Ted From stefan.wermter at sunderland.ac.uk Thu Jun 18 13:14:21 1998 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Thu, 18 Jun 1998 18:14:21 +0100 Subject: Several additional PhD stipends Message-ID: <35894AEC.34C91471@sunderland.ac.uk> Several openings exist for PhDstudents/ researcher. Please note the different sources for further information. Please forward to potential researcher/students. Apologies if you are on multiple mailing lists. -------------------------------------------------- UNIVERSITY OF SUNDERLAND SCHOOL OF COMPUTING AND INFORMATION SYSTEMS The School of Computing and Information Systems is pleased to be able to offer a small number of studentships for candidates to study for MPhil and PhD (full-time). The School has a growing research reputation and is keen to build upon its 3A rating in the 1996 Research Assessment Exercise. Applicants should have a good honours degree in Computer Science, and a strong desire to undertake high quality research in an ambitious and thriving School. Studentships are available to EU citizens and cover all fees and maintenance support of approximately 5,000 per annum. Studentships will commence in October 1998. 
Projects are available in the following areas: Neural Networks Speech and Image Processing Human Computer Interaction Natural Language Processing Information Retrieval Digital Media Software Engineering Decision Support Systems For a detailed list of projects please contact: Mark Hindmarch Email: mark.hindmarch at sunderland.ac.uk School of Computing and Information Systems University of Sunderland St Peters Campus St Peters Way Sunderland SR6 0DD UK Application is by CV to Mark Hindmarch at the above address. Closing date 17 July 1998. ---------------------------------------------------- ---------------------------------------------------- ---------------------------------------------------- 2. Researcher in Neural and Intelligent Systems (reference number CIRG28) Applications are invited for a three year research assistant position in the School of Computing and Information Systems investigating the development of hybrid neural/symbolic techniques for intelligent processing. This is an exciting new project which aims at developing new environments for integrating neural networks and symbolic processing. You will play a key role in the development of such hybrid subsymbolic/symbolic environments. It is intended to apply the developed hybrid environments in areas such as natural language processing, intelligent information extraction, or the integration of speech/language in multimedia applications. You should have a degree in a computing discipline and will be able to register for a higher degree. A demonstrated interest in artificial neural networks, software engineering skills and programming experience are essential (preferably including a subset of C, C++, CommonLisp, Java, GUI). Experience and interest in neural network software and simulators would be an advantage (e.g. Planet, SNNS, Tlearn, Matlab, etc). Salary is according to the researcher A scale (currently up to 13,871, under revision). Application forms and further particulars are available from the Personell department under +44 191 515 and extensions 2055, 2429, 2054, 2046, or 2425 or E-Mail employee.recruitment at sunderland.ac.uk quoting the reference number CIRG28. For informal enquiries please contact Professor Stefan Wermter, e-mail: Stefan.Wermter at sunderland.ac.uk. Closing date: 10 July 1998. The successful candidate is expected to start the job as soon as possible. ******************************************** Professor Stefan Wermter University of Sunderland Dept. of Computing & Information Systems St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 2781 email: stefan.wermter at sunderland.ac.uk http://osiris.sunderland.ac.uk/~cs0stw/ ******************************************** From mel at lnc.usc.edu Thu Jun 18 14:57:33 1998 From: mel at lnc.usc.edu (Bartlett Mel) Date: Thu, 18 Jun 1998 11:57:33 -0700 Subject: Paper Announcement: Active Dendrites & Complex Cells Message-ID: <3589631D.CCB14428@lnc.usc.edu> Members of the connectionist community may be interested in the following paper in the June issue of the Journal of Neuroscience: --------- "Translation-Invariant Orientation Tuning in Visual 'Complex' Cells Could Derive from Intradendritic Computations " Bartlett W. Mel, Daniel L. Ruderman, and Kevin A. 
Archie ---------- Journal of Neurocience Online: http://www.jneurosci.org/cgi/content/full/18/11/4325 Preprint, via our lab web page (click on Publications): http://lnc.usc.edu/ ----------- ABSTRACT Hubel and Wiesel (1962) first distinguished ``simple'' from ``complex'' cells in visual cortex, and proposed a processing hierarchy in which rows of LGN cells are pooled to drive oriented simple cell subunits, which are pooled in turn to drive complex cells. Though parsimonious and highly influential, the pure hierarchical model has since been challenged by results indicating many complex cells receive excitatory monosynaptic input from LGN cells, or do not depend on simple cell input. Alternative accounts for complex cell orientation tuning remain scant, however, and the function of monosynaptic LGN contacts onto complex cell dendrites remains unknown. We have used a biophysically detailed compartmental model to investigate whether nonlinear integration of LGN synaptic inputs within the dendrites of individual pyramidal cells could contribute to complex-cell receptive field structure. We show that an isolated cortical neuron with ``active'' dendrites, driven only by excitatory inputs from overlapping ON- and OFF-center LGN subfields, can produce clear phase-invariant orientation tuning---a hallmark response characteristic of a complex cell. The tuning is shown to depend critically upon both the spatial arrangement of LGN synaptic contacts across the complex cell dendritic tree, established by a Hebbian developmental principle, and on the physiological efficacy of excitatory voltage-dependent dendritic ion channels. We conclude that unoriented LGN inputs to a complex cell could contribute in a significant way to its orientation tuning, acting in concert with oriented inputs to the same cell provided by simple cells or other complex cells. As such, our model provides a novel, experimentally testable hypothesis regarding the basis of orientation tuning in the complex cell population, and more generally, underscores the potential importance of nonlinear intradendritic subunit processing in cortical neurophysiology. -- Bartlett W. Mel (213)740-0334, -3397(lab) Assistant Professor of Biomedical Engineering (213)740-0343 fax University of Southern California, OHE 500 mel at lnc.usc.edu, http://lnc.usc.edu US Mail: BME Department, MC 1451, USC, Los Angeles, CA 90089 Fedex: 3650 McClintock Ave, 500 Olin Hall, LA, CA 90089 From Kim.Plunkett at psy.ox.ac.uk Fri Jun 19 06:49:21 1998 From: Kim.Plunkett at psy.ox.ac.uk (Kim Plunkett) Date: Fri, 19 Jun 1998 11:49:21 +0100 (BST) Subject: PostDoc position Message-ID: <199806191049.LAA14247@pegasus.psych.ox.ac.uk> University of Oxford Department of Experimental Psychology Applications are invited for a post-doctoral research assistants to work on a programme of research concerned with investigating the nature and causes of language disorders in children. The project is funded for five years by the Wellcome Trust and aims to further understanding of developmental language disorders, in terms of both etiology and cognitive processes. Details of the research programme can be found on the web-site: http://www.mrc-apu.cam.ac.uk/personal/dorothy.bishop/wellcome Post doctoral Research Assistant Academic-Related Research Staff Grade 1A: Salary stlg15,159 - stlg22,785 Candidates should hold, or expect to hold by the time of appointment, a doctoral qualification and relevant research background in connectionist modelling. 
The post holder will be responsible for developing simulations of the effects of auditory deficits on language acquisition. The post holder will work both independently and as part of an integrated team actively engaged with all stages of the research process. For this post quote Reference: RA/dvmb2. Interviews are planned for 12 August 1998. Dorothy Bishop MRC Senior Scientist, MRC Cognition and Brain Sciences Unit (formerly Applied Psychology Unit) 15, Chaucer Road, Cambridge, UK, CB2 2EF. tel: UK: 01223 355294 ex 850 overseas: 44 1223 355294 ex 850 fax: UK 01223 359062 overseas: 44 1223 359062 email: dorothy.bishop at mrc-apu.cam.ac.uk World Wide Web: http://www.mrc-apu.cam.ac.uk/personal/dorothy.bishop/ From jkh at dcs.rhbnc.ac.uk Fri Jun 19 16:01:26 1998 From: jkh at dcs.rhbnc.ac.uk (Keith Howker) Date: Fri, 19 Jun 1998 16:01:26 +-100 Subject: Technical Report Series in Neural and Computational Learning (second attempt to send) Message-ID: <01BD9B9B.86E172C0@pc7.cs.rhbnc.ac.uk> [Apologies if you get more than one copy: the first attempt ] [had some non-deliveries. rgds, K. ] The European Community ESPRIT Working Group in Neural and Computational Learning Theory (NeuroCOLT) has been funded for a further three years as the Working Group NeuroCOLT2. The overall objective of the Working Group is to demonstrate the effectiveness of technologies that arise from a deep understanding of the performance and implementation of learning systems on real world data. We will continue to maintain the Technical Report Archive of papers produced by members or associates of the group's partners. We have created a new web site: http://www.neurocolt.com/ which gives more information about the project, including access to the Technical Reports (see below for further access instructions). Best wishes John Shawe-Taylor -------------------------------------------------------------------- Titles follow: Abstracts are available via the Web Site

1998 Document Archive

Ref.      Title -- Author
1998-abs  Complete Abstract File For 1998
1998-001  JNN, a Randomized Algorithm for Learning Multilayer Networks in Polynomial Time -- Elisseeff & Paugam-Moisy
1998-002  A comparison of non-informative priors for Bayesian networks -- Grunwald
1998-003  Data-Dependent Structural Risk Minimisation for Perceptron Decision Trees -- Shawe-Taylor
1998-004  Are Lower Bounds Easier over the Reals? -- Fournier & Koiran
1998-005  Query, PACS and simple-PAC Learning -- Castro & Guijarro
1998-006  The Real Dimension Problem is NPR-Complete -- Koiran
1998-007  Elimination of Parameters in the Polynomial Hierarchy -- Koiran
1998-008  Bayesian Classifiers are Large Margin Hyperplanes in a Hilbert Space -- Cristianini, Shawe-Taylor, Sykacek
1998-009  Learning via Internal Representation -- Dichterman
1998-010  Discrete versus analog computation: Some aspects of studying the same problem in different computational models -- Meer
1998-011  How many connected components must a difficult set have? -- Matamala & Meer
1998-012  The Separation Theorem for the Relation Classes Associated to the Extended Grzegorczyk Classes -- Gakwaya
1998-013  Isomorphism Theorem for BSS Recursively Enumerable Sets over Real Closed Fields -- Michaux & Troestler
1998-014  Efficient Read-Restricted Monotone CNF/DNF Dualization by Learning with Membership Queries -- Domingo, Mishra, Pitt
1998-015  Equality Is a Jump -- Boldi & Vigna
1998-016  Multiplicative Updatings for Support-Vector Learning -- Cristianini, Campbell, Shawe-Taylor
1998-017  Dynamically Adapting Kernels in Support Vector Machines -- Cristianini, Campbell, Shawe-Taylor
1998-018  Practical Algorithms for On-line Sampling -- Domingo, Gavalda, Watanabe
--------------------------------------------------------------------------- ***************** ACCESS INSTRUCTIONS ****************** The files and abstracts may be accessed via WWW starting from the NeuroCOLT homepage: http://www.neurocolt.com/ or from the archive: ftp://ftp.dcs.rhbnc.ac.uk/pub/neurocolt/tech_reports Alternatively, it is still possible to use ftp access as follows: % ftp ftp.dcs.rhbnc.ac.uk (134.219.96.1) Name: anonymous password: your full email address ftp> cd pub/neurocolt/tech_reports/1998 ftp> binary ftp> get nc-tr-1998-001.ps.Z ftp> bye % zcat nc-tr-1998-001.ps.Z | lpr Similarly for the other technical reports. In some cases there are two files available, for example, nc-tr-97-002-title.ps.Z nc-tr-97-002-body.ps.Z The first contains the title page while the second contains the body of the report. The single command, ftp> mget nc-tr-97-002* will prompt you for the files you require. --------------------------------------------------------------------- | Keith Howker | e-mail: jkh at dcs.rhbnc.ac.uk | | Dept. of Computer Science | Phone : +44 1784 443696 | | RHUL | Fax : +44 1784 439786 | | EGHAM TW20 0EX, UK | Home: +44 1932 222529 | --------------------------------------------------------------------- From juergen at idsia.ch Fri Jun 19 11:14:34 1998 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Fri, 19 Jun 1998 17:14:34 +0200 Subject: locoface mirrors Message-ID: <199806191514.RAA27565@ruebe.idsia.ch> Since the announcement on Mon Jun 15 1998 we have experienced unusually strong demand for an HTML document entitled "Facial beauty and fractal geometry." Due to limited capacity, several thousand of the many download attempts led to incomplete results. I am very sorry for this. Friendly observers on the web, however, noticed our problems and established mirror sites.
Darrin Chandler's mirror in the US: http://stilyagin.com/locoface/ Axel deKimpe's mirror: http://www.uoglobe.net/wm/idsia/index.html Another copy can now be found in the new cogprint archive: http://cogprints.soton.ac.uk/search?dom=Authors&query=Schmidhuber_J My original link temporally broke down due to the unexpected overload but should be working again: http://www.idsia.ch/~juergen/locoface/locoface.html Juergen Schmidhuber, IDSIA www.idsia.ch From nic at idsia.ch Fri Jun 19 18:24:22 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Sat, 20 Jun 1998 00:24:22 +0200 Subject: two technical reports Message-ID: <199806192224.AAA01683@idsia.ch> Dear colleagues, the following two papers are available by anonymous ftp: Technical Report IDSIA-32-98 (to be presented at ICANN'98) Slope Centering: Making Shortcut Weights Effective -------------------------------------------------- Nicol N. Schraudolph Shortcut connections are a popular architectural feature of multi-layer perceptrons. It is generally assumed that by implementing a linear sub-mapping, shortcuts assist the learning process in the remainder of the network. Here we find that this is not always the case: shortcut weights may also act as distractors that slow down convergence and can lead to inferior solutions. This problem can be addressed with slope centering, a particular form of gradient factor centering. By removing the linear component of the error signal at a hidden node, slope centering effectively decouples that node from the shortcuts that bypass it. This eliminates the possibility of destructive interference from shortcut weights, and thus ensures that the benefits of shortcut connections are fully realized. ftp://ftp.idsia.ch/pub/nic/slope.ps.gz Technical Report IDSIA-33-98 (submitted to NIPS*98) Accelerated Gradient Descent by Factor-Centering Decomposition -------------------------------------------------------------- Nicol N. Schraudolph Gradient factor centering is a new methodology for decomposing neural networks into biased and centered subnets which are then trained in parallel. The decomposition can be applied to any pattern-dependent factor in the network's gradient, and is designed such that the subnets are more amenable to optimization by gradient descent than the original network: biased subnets because of their simplified architecture, centered subnets due to a modified gradient that improves conditioning. The architectural and algorithmic modifications mandated by this approach include both familiar and novel elements, often in prescribed combinations. The framework suggests for instance that shortcut connections -- a well-known architectural feature -- should work best in conjunction with slope centering, a new technique described herein. Our benchmark experiments bear out this prediction, and show that factor-centering decomposition can speed up learning significantly without adversely affecting the trained network's generalization ability. ftp://ftp.idsia.ch/pub/nic/facede.ps.gz Best wishes, -- Dr. Nicol N. 
Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland From Sebastian_Thrun at heaven.learning.cs.cmu.edu Fri Jun 19 18:55:19 1998 From: Sebastian_Thrun at heaven.learning.cs.cmu.edu (Sebastian Thrun) Date: Fri, 19 Jun 1998 18:55:19 -0400 Subject: Vacant positions at CMU Message-ID: *** please forward *** Carnegie Mellon University seeks applications for several vacant positions in the areas of * machine learning / neural networks * multi-agent systems * distributed databases * adaptable software * distributed mobile robotics * security for U.S.-Government-funded research projects jointly carried out by the Computer Science Department (CSD), the Institute for Complex Engineered Systems (ICES), the Robotics Institute (RI), and the newly created Center for Automated Learning and Discovery (CALD). Applications are solicited at the research programmer, postdoc, and research faculty level. Prospective research programmers should hold a B.S. degree (or equivalent) and have extensive programming experience in C, C++ and/or Java. Prospective Postdocs and research faculty should hold a Ph.D. degree and have strong interests in scientific research and track records in one or more areas listed above. Applications from outside the US are welcome. Applications should include a CV, a statement of interest (1-2 pages), a recent relevant paper (if available), and a list of three or more references. We anticipate filling these positions at the earliest convenience. Applications and inquiries should be addressed to Ms. Rhonda L Moyer Institute for Complex Engineered Systems Carnegie Mellon University 5000 Forbes Ave Pittsburgh, PA 15213-3891 Carnegie Mellon University is an equal opportunity employer. CMU possesses a unique research environment with a world-renowned faculty in computer science, robotics, and engineering. ------- End of Forwarded Message ------- From mrj at cs.usyd.edu.au Sat Jun 20 23:49:47 1998 From: mrj at cs.usyd.edu.au (Mark James) Date: Sun, 21 Jun 1998 13:49:47 +1000 Subject: Ph.D. Thesis on A Model of Isocortex available Message-ID: <358C82DB.F31ADD4C@cs.usyd.edu.au> My doctoral thesis is available for download from: http://www.cs.usyd.edu.au/~mrj/AMI An Adaptive Model of Isocortex ABSTRACT A complete description of the functioning of the human brain requires an understanding of three levels of neural processing: the operation of single brain neurons, how these neurons work together in each of the brain's functional modules, and how these modules interact to produce the observed sensory, motor, and cognitive abilities. This thesis is principally concerned with the second of these processing levels, providing a plausible explanation of the operation of the functional modules of the cerebral cortex, namely cortical areas. Following a review of models of the operation of single cortical neurons, a model of the neural circuitry of homotypic six-layer cortex (isocortex) is constructed. The model is in accord with much of the anatomical and physiological data, and posits computational roles for each cortical layer and for each of the main types of cortical neurons.
Analysis of the adaptive and activation dynamics of the isocortical model suggests that the neural feedback loops between cortical layers allow the cortex to perform powerful pattern recognition operations using a rule for adaptation of synaptic strengths that is constrained by biology to be much simpler than those often used in artificial neural network models. Results of computer simulations are described which demonstrate that the model is capable of performing simple pattern discrimination and clustering tasks. The thesis concludes with an outline of ways in which the cortical model may be applied to speech and language processing.

-- Mark James                                 |EMAIL: mrj at cs.usyd.edu.au|
Basser Department of Computer Science, F09    |PHONE: +61-2-9351-3423      |
The University of Sydney NSW 2006 AUSTRALIA   |FAX:   +61-2-9351-3838      |
================- WEB: http://www.cs.usyd.edu.au/~mrj -=================

From cchang at cns.bu.edu Mon Jun 22 11:41:17 1998
From: cchang at cns.bu.edu (Carolina Chang)
Date: Mon, 22 Jun 1998 11:41:17 -0400 (EDT)
Subject: CFP: Biomimetic Robotics - Special Issue of RAS
Message-ID:

Call for Papers: Biomimetic Robotics
Special Issue of Robotics and Autonomous Systems

Guest Editors: Carolina Chang and Paolo Gaudiano
{cchang, gaudiano}@bu.edu
Boston University Neurobotics Lab
Department of Cognitive and Neural Systems

Submission Deadline: October 31, 1998

It has been argued that today's supercomputers are able to process information at a rate comparable to that of simple invertebrates. And yet, even ignoring physical constraints, no existing algorithm running on the fastest supercomputer could enable a robot to fly around a room, avoid obstacles, land upside down on the ceiling, feed, reproduce, and perform many of the other tasks that a housefly learns to perform without external training or supervision. The apparent simplicity with which flies and even much simpler biological organisms manage to survive in a constantly changing environment suggests that a potentially fruitful avenue of research is that of understanding the mechanisms adopted by biological systems for perception and control, and applying what is learned to robots. While we may not yet be able to make a computer function as flexibly as a housefly, there have been many promising starts in that direction.

The goal of this special issue is to present recent results in "biomimetic robotics", or the application of biological principles to robotics. The term "biological" in this case should be taken broadly to refer to any aspect of biological function, including, for example, psychological theories or detailed models of neural function. Preference will be given to manuscripts describing original work that closely models biological principles observed in real animals and that uses real robots. Prospective authors should contact one of the guest editors as soon as possible to determine the relevance of their submission to this special issue.

Authors are encouraged to submit manuscripts electronically. The final version of all accepted manuscripts should be in LaTeX, using the Elsevier style files available from the Robotics and Autonomous Systems web page: http://www.elsevier.nl/locate/robot

To submit an electronic copy of your manuscript, preferably in PostScript or PDF format, upload it to the anonymous ftp site "neurobotics.bu.edu". Use "anonymous" as the user name and your e-mail address as the password. Change directory to pub/ras and "put" your file using binary transfer mode.
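For those who prefer to script the upload rather than type the commands into an interactive ftp client, the same procedure can be reproduced with Python's standard ftplib; the e-mail address and filename below are placeholders, while the host and directory are the ones given above.

from ftplib import FTP

ftp = FTP("neurobotics.bu.edu")                        # anonymous ftp site named above
ftp.login(user="anonymous", passwd="you@example.edu")  # your e-mail address as the password
ftp.cwd("pub/ras")                                     # change to the submission directory
with open("manuscript.ps.gz", "rb") as fh:             # placeholder filename
    ftp.storbinary("STOR manuscript.ps.gz", fh)        # binary-mode "put"
ftp.quit()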
You will get detailed directions when you enter the ras directory. To expedite uploading, your document may be compressed with any commonly used compression scheme. Once you have uploaded your file to the ftp site, please send e-mail to cchang at bu.edu indicating the filename, manuscript title, and the name and contact information (electronic and surface mail address, phone, fax) for the corresponding author. The title page of the manuscript should include contact information for all authors.

For printed submissions, please send your double-spaced manuscript to:

  Carolina Chang
  Boston University Neurobotics Lab
  Department of Cognitive and Neural Systems
  677 Beacon Street
  Boston, MA 02215 USA

From Friedrich.Leisch at ci.tuwien.ac.at Mon Jun 22 11:29:53 1998
From: Friedrich.Leisch at ci.tuwien.ac.at (Friedrich Leisch)
Date: Mon, 22 Jun 1998 17:29:53 +0200 (CEST)
Subject: CI BibTeX Collection -- Update
Message-ID: <13710.30833.579159.418831@galadriel.ci.tuwien.ac.at>

The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence:

  IEEE Transactions on Neural Networks 8/5-9/3
  Advances in Neural Information Processing Systems 10

Most files have been converted automatically from various source formats; please report any bugs you find. The complete collection can be downloaded from

  http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html
  ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/

The NIPS proceedings source files have been generously provided by the editors.

Best,
Fritz Leisch

--
===================================
Friedrich Leisch
Institut für Statistik                    Tel: (+43 1) 58801 4541
Technische Universität Wien               Fax: (+43 1) 504 14 98
Wiedner Hauptstraße 8-10/1071             Friedrich.Leisch at ci.tuwien.ac.at
A-1040 Wien, Austria                      http://www.ci.tuwien.ac.at/~leisch
PGP public key http://www.ci.tuwien.ac.at/~leisch/pgp.key
===================================

From gsiegle at sunstroke.sdsu.edu Tue Jun 23 13:35:36 1998
From: gsiegle at sunstroke.sdsu.edu (Greg Siegle)
Date: Tue, 23 Jun 1998 10:35:36 -0700 (PDT)
Subject: Connectionist models of disorder web site
Message-ID:

Dear Researcher,

A new web site has been created as a source list for connectionist and neural network models of cognitive, affective, brain, and behavioral disorders. The site can be found at: www.sci.sdsu.edu/CAL/connectionist-models/

Please feel free to contribute references, links, comments, or suggestions.

Sincerely,
Greg Siegle

From nenet at posta.unizar.es Tue Jun 23 19:22:36 1998
From: nenet at posta.unizar.es (Bonifacio Martin-del-Brio)
Date: Tue, 23 Jun 1998 16:22:36 -0700
Subject: Summer Course (in Spanish)
Message-ID: <359038BB.53582BB3@posta.unizar.es>

The following web sites provide information on the 4th summer course 'Introduction to neural networks and fuzzy systems' (in Spanish, 'Introduccion a las redes neuronales y sistemas borrosos'), which will be taught at the 'Universidad de Verano de Teruel' (Summer University of Teruel, Spain) from the 6th to the 10th of July. There are not many courses on this subject in Spanish, so this course can be a great opportunity for the Spanish-speaking community.

  http://www.unizar.es/univerter/inicio.html
  http://zape.unizar.es

The course is based on the following introductory textbook, written in Spanish:

  Bonifacio Martin-del-Brio and Alfredo Sanz
  'Redes Neuronales y Sistemas Borrosos'
  Editorial RA-MA, Madrid (Spain), 1997.

Best regards.

-------------------------------------------------------
Dr. Bonifacio Martin-del-Brio
Dept. Ingenieria Electronica y Comunicaciones
Universidad de Zaragoza
C. Corona de Aragon, 35. 50009 ZARAGOZA (Spain)
-------------------------------------------------------
Phone: +34 976 351609   Fax: +34 976 762189
E-mail: nenet at posta.unizar.es
-------------------------------------------------------

From espaa at soc.plym.ac.uk Tue Jun 23 12:04:44 1998
From: espaa at soc.plym.ac.uk (espaa)
Date: Tue, 23 Jun 1998 16:04:44 GMT
Subject: PAA Journal
Message-ID: <17E85E45D9@scfs3.soc.plym.ac.uk>

PATTERN ANALYSIS AND APPLICATIONS journal (Springer-Verlag Limited)
http://www.soc.plym.ac.uk/soc/sameer/paa.htm

VOLUME 1, ISSUE 2, July 1998

INDEX OF PAPERS

Nonparametric Image Segmentation
  Thomas Kampke and Rudolf Kober, Forschunginstitut fur Anwendungsorientierte Wissensverarbeitung, Germany

A Monte Carlo Evaluation of the Moving Method, K-means and Self-organising Neural Networks
  E. W. Tyree, City University, UK
  J. A. Long, City University, UK

Knowledge-Based Spatiotemporal Linear Abstraction
  Yuval Shahar, Stanford University, USA
  Martin Molina, Technical University of Madrid, Spain

Recognition of Hand-printed Chinese Characters using Decision Trees/Machine Learning C4.5 System
  Adnan Amin, University of New South Wales, Australia
  Sameer Singh, University of Plymouth, UK

Improving Stereovision Matching through Supervised Learning
  Gonzalo Pajares and Jesus Cruz, Universidad Complutense, Spain

Beam Search and Simulated Beam Annealing for PFSA Inference
  Anand Raman, Massey University, New Zealand

Book Reviews
  Pattern Classification by Juergen Shurmann
  Generic Object Recognition using Form and Function by Louise Stark and Kevin Bowyer, World Scientific, 1996

Further enquiries to the journal should be sent to Barbara Davies, Editorial Secretary, at espaa at soc.plym.ac.uk

From harnad at coglit.soton.ac.uk Wed Jun 24 08:26:15 1998
From: harnad at coglit.soton.ac.uk (Stevan Harnad)
Date: Wed, 24 Jun 1998 13:26:15 +0100 (BST)
Subject: Expanded BBS 1998: Call for Papers
Message-ID:

[Apologies if you get this message more than once: sent to several lists]

BBS 1998 Has Expanded by 50%

CALL FOR PAPERS

Behavioral and Brain Sciences Journal (BBS), founded in '78, has begun its third decade in '98 with a 50% expansion. This means that more articles can be accorded Open Peer Commentary, the feature that has had such a great impact on the international cognitive and biobehavioral science community. (BBS's ISI Impact Factor of 15 is nearly three times that of the highest-impact Psychology journal and is one of the 25 highest among all 6500 science, social science and Arts/Humanities journals indexed by ISI.)

BBS is a unique scientific communication medium, providing the service of Open Peer Commentary for reports of significant current work in psychology, neuroscience, behavioral biology and cognitive science. If a manuscript is judged by BBS referees and editors to be appropriate for Commentary, it is circulated to a large number of commentators across disciplines and around the world. The target article, commentaries, and authors' responses then co-appear in BBS.

To be eligible for publication, a paper should not only meet the standards of a journal such as Psychological Review or the International Review of Neurobiology in terms of conceptual rigor, empirical grounding, and clarity of style, but should also offer a clear rationale for soliciting Commentary.
A BBS target article can be (i) the report and discussion of empirical research that the author judges to have broader scope and implications than might be more appropriately reported in a specialty journal; (ii) an unusually significant theoretical article that formally models or systematizes a body of research; or (iii) a novel interpretation, synthesis, or critique of existing experimental or theoretical work. Occasionally, articles dealing with social or philosophical aspects of the behavioral and brain sciences will be considered. Multiple reviews of books also appear. BBS's Web Pages: http://www.princeton.edu/~harnad/bbs.html http://www.cogsci.soton.ac.uk/bbs Email: bbs at cogsci.soton.ac.uk harnad at cogsci.soton.ac.uk -------------------------------------------------------------------- Stevan Harnad harnad at cogsci.soton.ac.uk Professor of Psychology harnad at princeton.edu Director, phone: +44 1703 592582 Cognitive Sciences Centre fax: +44 1703 594597 Department of Psychology http://www.cogsci.soton.ac.uk/~harnad/ University of Southampton http://www.princeton.edu/~harnad/ Highfield, Southampton ftp://ftp.princeton.edu/pub/harnad/ SO17 1BJ UNITED KINGDOM ftp://cogsci.soton.ac.uk/pub/harnad/ From bressler at walt.ccs.fau.edu Thu Jun 25 19:48:57 1998 From: bressler at walt.ccs.fau.edu (Steven Bressler) Date: Thu, 25 Jun 1998 19:48:57 -0400 Subject: Postdoctoral Position in Computational Neuroscience Message-ID: <3.0.1.32.19980625194857.006fdf40@mail.ccs.fau.edu> COMPUTATIONAL NEUROSCIENCE POSTDOCTORAL POSITION AVAILABLE Center for Complex Systems Florida Atlantic University A new postdoctoral position is open in the Center for Complex Systems at Florida Atlantic University to participate in a project in computational neuroscience. The aim of the project is to develop multivariate techniques for the analysis of cortical event-related potentials, and use the results from such analysis as the basis for computational modeling. The approach will emphasize the close interplay between state-of-the-art multivariate autoregressive analysis and the development of dynamical models of distributed information processing in the cerebral cortex. The research project will be conducted in close collaboration with S. Bressler, a cognitive neuroscientist and M. Ding, a computational modeler. The position is for two years, possibly renewable for another year. The desired starting date is September 1, 1998. Required background: -- Ph.D. degree -- Experience in C programming on UNIX systems and X11 Window programming -- Basic knowledge in dynamical systems, matrix algebra, signal processing, and statistics -- Research experience Desired background: -- Working knowledge in neurobiology and neural networks -- Knowledge in autoregressive time series modeling This project is funded by research grants from the National Science Foundation and the National Institute of Mental Health. Please send curriculum vitae, expression of interest, and the names and e-mail or phone numbers of three references to Steven Bressler at bressler at walt.ccs.fau.edu. Information about the Center for Complex Systems at Florida Atlantic University is available at http://www.ccs.fau.edu/ Steven L. Bressler, Ph.D. voice: 561-297-2322 Professor, Complex Systems & Brain Sciences fax: 561-297-3634 Center for Complex Systems Florida Atlantic University bressler at walt.ccs.fau.edu 777 Glades Road http://www.ccs.fau.edu/~bressler/ Boca Raton, FL 33431 U.S.A. 
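As a rough illustration of the kind of multivariate autoregressive (MVAR) analysis mentioned in the posting above, the sketch below fits a vector autoregressive model to multichannel data by ordinary least squares. It is a generic textbook construction, not the project's actual analysis pipeline; the channel count, model order, and variable names are illustrative assumptions.

import numpy as np

def fit_var(data, p):
    """Least-squares fit of a VAR(p) model.
    data: array of shape (T, C) with T time points and C channels.
    Returns the lag coefficient matrices A (shape (p, C, C)) and the
    residual covariance matrix (shape (C, C))."""
    T, C = data.shape
    # Regressor row for time t holds [x_{t-1}, x_{t-2}, ..., x_{t-p}]
    X = np.hstack([data[p - k - 1:T - k - 1] for k in range(p)])
    Y = data[p:]
    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)    # shape (p*C, C)
    A = coeffs.T.reshape(C, p, C).transpose(1, 0, 2)  # A[k][j, i]: lag k+1, channel i -> channel j
    resid = Y - X @ coeffs
    return A, np.cov(resid, rowvar=False)

# Toy example: 3 channels generated by a known VAR(1) process
rng = np.random.default_rng(0)
x = np.zeros((1000, 3))
noise = rng.standard_normal((1000, 3))
for t in range(1, 1000):
    x[t] = 0.5 * x[t - 1] + noise[t]
A, sigma = fit_var(x, p=2)
print(A[0].round(2))   # the lag-1 matrix should be close to 0.5 * identity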
From sontag at hilbert.rutgers.edu Fri Jun 26 00:51:58 1998 From: sontag at hilbert.rutgers.edu (Eduardo Sontag) Date: Fri, 26 Jun 1998 00:51:58 -0400 (EDT) Subject: book announcement for hybrid newsletter Message-ID: <199806260451.AAA14903@control.rutgers.edu> Contributed by: Eduardo Sontag (sontag at hilbert.rutgers.edu) Second Edition (revised and much extended) of Mathematical Control Theory Announcing a new book: Eduardo D. Sontag Mathematical Control Theory: Deterministic Finite Dimensional Systems ***Second Edition*** Springer-Verlag, New York, 1998, ISBN 0-387-984895 May be ordered from 1-800-Springer toll-free in the USA, or via email from: orders at springer-ny.com; or faxing +1.201.345.4505. This textbook introduces the core concepts and results of Control and System Theory. Unique in its emphasis on foundational aspects, it takes a "hybrid" approach in which basic results are derived for discrete and continuous time scales, and discrete and continuous state variables. Primarily geared towards mathematically advanced undergraduate or graduate students, it may also be suitable for a second engineering course in control which goes beyond the classical frequency domain and state-space material. The choice of topics, together with detailed end-of-chapter links to the bibliography, makes it an excellent research reference as well. The Second Edition constitutes a substantial revision and extension of the First Edition, mainly adding or expanding upon advanced material, including: Lie-algebraic accessibility theory, feedback linearization, controllability of neural networks, reachability under input constraints, topics in nonlinear feedback design (such as backstepping, damping, control-Lyapunov functions, and topological obstructions to stabilization), and introductions to the calculus of variations, the maximum principle, numerical optimal control, and linear time-optimal control. Also covered, as in the First Edition, are notions of systems and automata theory, and the algebraic theory of linear systems, including controllability, observability, feedback equivalence, and minimality; stability via Lyapunov, as well as input/output methods; linear-quadratic optimal control; observers and dynamic feedback; Kalman filtering via deterministic optimal observation; parametrization of stabilizing controllers, and facts about frequency domain such as the Nyquist criterion. From nic at idsia.ch Sat Jun 27 18:56:25 1998 From: nic at idsia.ch (Nici Schraudolph) Date: Sun, 28 Jun 1998 00:56:25 +0200 Subject: revised TR on fast exponentiation Message-ID: <199806272256.AAA00626@idsia.ch> Dear colleagues, the following technical report has undergone extensive revision since it was first announced here. Among other things, the EXP macro itself has been modified (faster still), and its mean, maximum, and RMS relative approximation error are now derived analytically. With best regards, -- Dr. Nicol N. Schraudolph Tel: +41-91-970-3877 IDSIA Fax: +41-91-911-9839 Corso Elvezia 36 CH-6900 Lugano http://www.idsia.ch/~nic/ Switzerland --------------------------- cut here ---------------------------- Technical Report IDSIA-07-98: A Fast, Compact Approximation of the Exponential Function --------------------------------------------------------- Nicol N. Schraudolph Neural network simulations often spend a large proportion of their time computing exponential functions. 
Since the exponentiation routines of typical math libraries are rather slow, their replacement with a fast approximation can greatly reduce the overall computation time. This note describes how exponentiation can be approximated by manipulating the components of a standard (IEEE-754) floating-point representation. This models the exponential function as well as a lookup table with linear interpolation, but is significantly faster and more compact.

ftp://ftp.idsia.ch/pub/nic/exp.ps.gz (10 pages, 145 kB compressed)

From adevries at sarnoff.com Tue Jun 30 11:09:31 1998
From: adevries at sarnoff.com (Bert De Vries)
Date: Tue, 30 Jun 1998 11:09:31 -0400
Subject: Workshop Ann.: Neural Nets for Signal Proc. (Aug31-Sep2 '98)
Message-ID: <3598FFAA.5E185768@sarnoff.com>

We still have some openings for interested researchers to attend the 1998 IEEE Workshop on Neural Networks for Signal Processing, which this year will be held in beautiful Cambridge, UK, from August 31st to September 2nd, 1998. We have a very interesting program, including many papers on blind signal processing, biomedical processing, speech, time series prediction, etc. This note includes the preliminary program, registration information and a registration form. More information can be found at our website:

  http://www.newton.cam.ac.uk/programs/nspw03.html
  http://www.newton.cam.ac.uk/programs/nsp.html

Hope to see you in Cambridge!

--Bert de Vries, Publicity chair NNSP98

===================================================================

IEEE Workshop on Neural Networks for Signal Processing
NNSP 98

PRELIMINARY PROGRAMME

Monday, 31 August 1998

9:00  Opening Remarks

9:15  Invited Talk
      Jose Principe, University of Florida

10:00 Oral Session I: Source Separation, Deconvolution & ICA
101 KuicNet Algorithms for Blind Deconvolution  S. C. Douglas and S-Y. Kung
104 On the Stability of some Source Separation Algorithms  J-F. Cardoso
110 Convolutive Blind Source Separation based on Multiple Decorrelation  L. Parra, C. Spence & B de Vries

10:45 Coffee Break

11:15 Oral Session I (Continued)
114 Independent Component Analysis: A flexible non-linearity and decorrelating manifold approach  R. Everson & S.J. Roberts
115 Bayesian Blind Marginal Separation of Convolutely Mixed Discrete Sources  C. Andrieu, A. Doucet, S. Godsill
116 Independent Component Analysis in Hybrid Mixture: Extrema Properties for Kurtosis and Higher Order Cumulant Function  S-Y. Kung

12:00 Poster Previews I

_________
13:00 Lunch
_________

14:30 Oral Session II: Algorithms and Architectures
201 A General Probabilistic Formulation for Feedforward Neural Classifiers  T. Adali, M. K. Sonmez & H. Ni
205 Learning from Examples with Mutual Information  D. Xu & J C Principe
207 Experimental Evaluation of Latent Variable Models for Dimensionality Reduction  M.A.Carrierra-Perpinian & S.J. Renals
208 From an A priori RNN to an A Posteriori PRNN Nonlinear Predictor  D.P.Mandic & J Chambers

------------
15:30 Coffee Break
------------

16:00 Poster Session I (ICA, A&A)
103 Removing Electroencephalographic Artifacts: Comparison between ICA and PCA  T-P Jung, C. Humphries, T-W Lee, M.J.McKeown, V. Iragui, S. Makeig & T. Sejnowski
105 A New Variable Step Algorithm for Blind Source Separation  P.M. On & Y Hirai
107 Flexible Independent Component Analysis  S. Choi, A. Cichocki & S. Amari
108 Blind Equalisation of Multichannels via Spatio-temporal Anti-Hebbian Learning Rule  S. Choi, A. Cichocki & A. Amari
109 Asymmetric PCA Neural Networks for Adaptive Blind Source Separation  K.
I. Diamantaras
111 The Effect of Signal Non-Stationarity on the Performance of Information Maximisation Based Blind Separation  M.J.T.Alphey, D.I. Laurensen & A.F. Murray
112 Blind Deconvolution / Equalization using State-Space Models  L-Q Zhang & A. Cichocki
113 Two EM Algorithms for Blind Separation of Noisy Mixtures  H. Attias
304 Online EM Algorithm and Reconstruction of Chaotic Dynamics  S. Ishii & M. Sato

Tuesday, 1 September 1998

9:00  Invited Talk
      Steve Young, Cambridge University

10:00 Oral Session III: Algorithms and Architectures
212 Adaptive Metric Kernel Regression  C. Goutte and J. Larsen
215 Bayesian Filtering for Hidden Markov Models via Monte Carlo Methods  A. Doucet, C. Andrieu, W.J. Fitzgerald
216 Clustering with Kernel Based Equiprobabilistic Topographic Maps  M. van Hulle

10:45 Coffee Break

11:15 Oral Session III (Continued)
219 An Empirical Comparison of Arc-Cosine Distance, Generalised Fisher Ratio and Normalised Entropy Criteria for Model Selection  S. Zheng & C. G. Molina
305 Stochastic Approximation by Neural Networks using the Radon and Wavelet Transforms  R. Meir & V. Maiorov
701 Stochastic Unobserved Component Models for Adaptive Signal Extraction and Forecasting  P. Young

12:00 Poster Previews

___________
13:00 Lunch
___________

14:00 Poster Session II (A & A)
202 A Likelihood Framework for Nonlinear Signal Processing with Normal Mixtures  B. Wang, T. Adali, X. Liu & J. Xuan
203 Nonlinear State Space Learning with EM and Neural Networks  J. De Freitas, M. Niranjan & A.H. Gee
204 Volterra Signal Modelling using Lagrange Programming Neural Networks  S. Chan, T. Stathaki & A. Constantinides
209 Split and Merge EM Algorithm for Improving Gaussian Mixture Density Estimates  N. Ueda, R. Nakano, Z. Ghahramani & G.E. Hinton
210 Neural Network Regression with Input Uncertainty  W. A. Wright
211 Speeding up MLP Execution by Approximating Neural Network Activation Functions  Rosella Cancelliere
213 A Model for Non-Stationary Signal Processing with Clustering Methods  S. Policker & A.B.Geva
214 A Reduced Size Lattice Ladder Neural Network  D. Navakauskas
217 Designing the Optimal Structure of a Neural Filter  K. Suzuki, I. Horiba, N. Sugie
218 From Data to Nonlinear Dynamics: A Hierarchical Bayes Approach to Neural Networks  T. Matsumoto, Y. Nakajima, H. Hamagishi, J. Sugi & M. Saito
220 Recursive Nonlinear System Identification with Modular Networks  V. Kadirkamanathan and S.G. Fabri
222 A Heuristic Pattern Correction Scheme for GRNNs and its Application to Speech Recognition  T. Hoya and A.G. Constantinides
306 Kohonen Networks and the Influence of Training on Data Structures  I. Morlini

_______________________
15:30 Guided Tour of Colleges
      Punting on the River Cam
      Conference Dinner
_______________________

Wednesday, 2 September 1998

9:00  Invited Talk
      Josef Kittler, Surrey University

10:00 Oral Session IV: Applications
506 Sound Monitoring based on the Generalised Probabilistic Descent Method  H. Watanabe, Y. Matsumoto, S. Tanaka & S. Katagiri
511 Combining Neural Networks and Belief Networks for Image Segmentation  C. K. I. Williams and X. Feng
512 Analysing Time Series Structure with Hidden Markov Models  M. Azzouzi & I.T. Nabney

10:45 Coffee Break

11:15 Oral Session IV (Continued)
515 Morphing Dynamical Sound Models  A. Robel
603 Time Series Forecasting with Neural Networks  Chris Chatfield
613 PCA/ICA Embeddings for Extracting Structure from Single Channel Wake EEG using Neural Networks  D.
Lowe

12:00 Poster Previews

_____________
13:00 Lunch
_____________

14:00 Invited Talk
      TBA

15:00 Oral Session V: Applications
611 Boundary Conditions of Pharyngeal Bolus Modeling by Neural Network Inversion  E. Lin, J-N Hwang & MW Chang
602 Communication Channel Equalisation using Minimal Radial Basis Function Neural Networks  P. Chandrakumar, P Saratchandran & N Sundararajan
614 Adaptive Medical Image Visualisation Based on Hierarchical Neural Networks and Intelligent Decision Fusion  S-H. Lai & M Fang
605 Combining Histograms and Neural Networks in Static and Dynamical Systems Approach to Engine Condition Monitoring  V. Kadirkamanathan and V.C. Patel
604 A Neural Network Extension of the Method of Analogues for Iterated Time Series Prediction  N. Hazarika & D. Lowe

16:15 Poster Session (Applications) and Tea Break
501 Unconstrained Freehand-written Chinese Characters Recognition by Self-growing Probabilistic Decision-based Neural Networks  H-C Fu, Y.Y.Xu & Y.P.Lee
502 Adaptive FIR filter use for signal Noise Cancelling  M. Kolinova, A. Prochazka & M Mudrova
504 Face Classification using Principal Component Analysis and Multiresolution  V. Brennan & J.C. Principe
505 A Comparison of a Hardware and a Software Integrate and Fire Neural Network for Clustering Onsets in Cochlear Filtered Sound  L.S. Smith, M. Glover & A. Hamilton
507 Feature Extraction Techniques for Hindi Numerals  H. Sanossian
508 Online Adaptive Histogram Equalisation  D. Martinez
509 Weightless Neural Networks for Face Recognition: A Comparison  S. Lauria & R.J. Mitchell
510 Exploiting the Statistical Characteristic of the Speech Signals for an Improved Neural Learning in a MLP Neural Network  H. Altun and K.M. Curtis
513 Postprocessing for Image Coding Applications using Neural Network Visual Model  Z. He, S. Chen, B. Luk & R. Istepanian
606 Structured Neural Network Approach for Measuring Raindrop Sizes and Velocities  B. Denby, P. Gole & J. Tarniewicz
608 A Neural Network Architecture for the Classification of Remote Sensing Imagery with Advanced Learning Algorithms  M.L. Goncalves, M. L. Netto, J.Z. Junior
612 A Framework for Combining Stochastic and Deterministic Descriptions of Nonstationary Financial Time Series  R. H. Lesch & D. Lowe

18:15 Closing Remarks
_____________
18:30 End of Workshop
_____________

=======================================================================

IEEE Neural Networks for Signal Processing Workshop (NNSP 98)
Registration Information

General:

We have two types of delegates: Residential at Robinson College and Non-residential. 120 rooms have been booked at Robinson College for the purpose of the workshop, and it is expected that the majority of us will be residential there. The expected arrival is Sunday, 31 August, and departure is Thursday, 3rd September [i.e. four nights]. It should be possible to accommodate those who might wish to arrive earlier / stay longer in Cambridge. The charges include accommodation, all meals from Sunday supper to breakfast on Thursday, and the Conference Dinner on Tuesday evening.

Non-residential delegates are responsible for their own accommodation. Their registration includes lunches on Monday, Tuesday & Wednesday and the Social Dinner on Tuesday.

Registration Fees:

____________________________
EARLY REGISTRATION

Early registration is encouraged; the cut-off for this will be 10 July 1998.
____________________________
RESIDENTIAL AT ROBINSON

The cost of accommodation will be UKP 93.00 per night.

Registration for the workshop:
  Member IEEE, Early registration:    UKP 140.00
  Nonmember IEEE, Early:              UKP 170.00
  Member IEEE, Late registration:     UKP 170.00
  Nonmember IEEE, Late:               UKP 200.00
  Student, Early:                     UKP 100.00
  Student, Late:                      UKP 130.00

____________________________
NON-RESIDENTIAL AT ROBINSON

You are responsible for your own accommodation.

Registration for the workshop:
  Member IEEE, Early registration:    UKP 225.00
  Nonmember IEEE, Early:              UKP 260.00
  Member IEEE, Late registration:     UKP 270.00
  Nonmember IEEE, Late:               UKP 290.00
  Student, Early:                     UKP 160.00
  Student, Late:                      UKP 200.00

____________________________
ACCOMPANYING PERSON

We have currently reserved 10 rooms that can accommodate an accompanying partner at an additional cost of UKP 35.70 per night [Bed and Breakfast].

____________________________
METHOD OF PAYMENT:

Payment may be made by Bank Draft or International Money Order, payable to Robinson College Enterprise Ltd. Please make sure that we do not incur banking charges at this end. It is also possible to pay by credit card, but only VISA, MasterCard or JCB are acceptable.

Please fill in the form below and mail it to

  Elizbeth Perrett
  Conference Office
  Robinson College
  Grange Road
  Cambridge CB3 9AN
  England
  Phone: 44 1223 332859
  Fax: 44 1223 315094
  Email: conference at robinson.cam.ac.uk

Registration Form:

------------------------ Cut Here ------------------------------------

IEEE Neural Networks for Signal Processing Workshop (NNSP 98)
REGISTRATION FORM

Title: Neural Networks for Signal Processing, NNSP 98
Taking place at: Robinson College & Newton Institute, Cambridge, UK
Date: August 31 - September 2, 1998

Last Name: ...............................Title (Mr, Ms, Dr etc).......
First Name: ............................................................
Date of Birth: ..../..../......Nationality: ............................
Professional Status: ...................................................
University/Company: ....................................................
Address:.................................................................
........................................................................
City: ............................... Postcode: ........................
Tel: ................................ Fax: .............................
Email: .................................................................

PAYMENT DETAILS:

Accommodation at Robinson College ..... Nights @ 93.00 / night .......
Accompanying person (additional 35.70 / night)

Residential Registration
  Early Registration, Member IEEE         ................
  Early Registration, NonMember IEEE      ................
  Non Early Registration, Member IEEE     ................
  Non Early Registration, NonMember IEEE  ................
  Early Student                           ................
  Non Early Student                       ................

Non Residential Registration
  Early Registration, Member IEEE         ................
  Early Registration, NonMember IEEE      ................
  Non Early Registration, Member IEEE     ................
  Non Early Registration, NonMember IEEE  ................
  Early Student                           ................
  Non Early Student                       ................

Total ........

Method of Payment
  ....... Bank Draft enclosed [Payable to Robinson College Enterprises Ltd]
  ....... Charge Credit Card [VISA / MasterCard / JCB only]

Card Number ............................... Expiry Date .................
Print Name .......................... Signed: ..................................
Date: ........................ ---End of Registration form ----------------------------------------------