From amari at brain.riken.go.jp Tue Jun 1 02:59:55 1999
From: amari at brain.riken.go.jp (Shunichi Amari)
Date: Tue, 01 Jun 1999 15:59:55 +0900
Subject: New Laboratory Heads of RIKEN Brain Science Institute
Message-ID: <19990601155955C.amari@brain.riken.go.jp>

RIKEN Brain Science Institute, Japan
Applications Invited for Heads of New Laboratories in the "Creating the Brain" Area
***************

Applications are invited for heads of several new laboratories in the "Creating the Brain" area of RIKEN BSI. The RIKEN Brain Science Institute was established in October 1997 to promote three strategic research areas: "Understanding the Brain", "Protecting the Brain" and "Creating the Brain". The "Creating the Brain" area aims to elucidate the information principles of the brain through theoretical and experimental approaches, and to create new information technology based on brain-style computation and systems. Its targets include the theoretical foundations of neurocomputing, the modeling and simulation of brain structures and functions, and the construction of brain-style computing systems. The area currently consists of two research groups, the "Brainway Group" headed by Gen Matsumoto and the "Brain-Style Information Systems Group" headed by Shun-ichi Amari. For more information, see the web site: http://www.brain.riken.go.jp.

BSI will establish two more research groups, each including several laboratories, in order to further develop the activities of the "Creating the Brain" area. One new group is concerned with the computational cognitive neuroscience of higher-order brain functions. It will include laboratories studying intelligent, emergent and complex systems, brain-style artificial intelligence, language, logical reasoning, symbol processing, and brain-style super-parallel computation. The other group is concerned with brain-style behavioral systems and robotics. It will include laboratories studying motor planning, command generation, brain-style control (including learnable internal models and inverse models), mechanisms of sensorimotor transformation, and brain-style robot systems. A new computational neuroscience research laboratory will also be established within the existing groups.

New laboratory heads will be expected to organize a team of 5-10 researchers and technical staff, and will be provided with generous support for five years. Progress is reviewed by an international review committee every five years, with the possibility of contract renewal. Applications are encouraged from outside Japan as well as within Japan, provided that heads are willing to work at BSI full time.

Applicants interested in leading a new laboratory are invited to submit a research proposal (of no more than 2,000 words) and a suggested name for the laboratory. In addition, applicants should provide a full CV, a list of all publications with reprints of five papers, a statement of research interests, and the names and addresses of three referees to:

Search Committee (10), RIKEN Brain Science Institute
2-1 Hirosawa, Wako-shi, Saitama 351-0198, Japan
Fax: +81-48-462-4796, e-mail: search10 at brain.riken.go.jp

Deadline: September 30, 1999

From jagota at cse.ucsc.edu Tue Jun 1 16:26:02 1999
From: jagota at cse.ucsc.edu (Arun Jagota)
Date: Tue, 1 Jun 1999 13:26:02 -0700 (PDT)
Subject: new e-publication: ICA survey
Message-ID: <199906012026.NAA17656@arapaho.cse.ucsc.edu>

New refereed e-publication. Action editor: Barak Pearlmutter.

A. Hyvarinen, Survey on Independent Component Analysis, Neural Computing Surveys 2, 94--128, 1999. 150 references.
http://www.icsi.berkeley.edu/~jagota/NCS

Abstract: A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research is finding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes the statistical dependence of the components of the representation. Such a representation seems to capture the essential structure of the data in many applications. In this paper, we survey the existing theory and methods for ICA.
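As a concrete taste of the kind of method the survey covers, here is a toy sketch (not taken from the paper) that recovers two independent sources from a linear mixture by whitening followed by a kurtosis-based fixed-point iteration, one classical ICA approach; the mixing matrix, sample sizes, and iteration counts are all illustrative choices.

```python
# Toy ICA illustration: recover two independent, non-Gaussian sources from
# a linear mixture via whitening plus a kurtosis-based fixed-point rule.
import numpy as np

rng = np.random.default_rng(0)

# Two independent, sub-Gaussian sources and an arbitrary mixing matrix.
S = rng.uniform(-1, 1, size=(2, 10000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S                                   # observed mixtures

# Whiten: decorrelate and scale to unit variance.
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X        # whitened data

# Fixed-point iteration driving |kurtosis| to an extremum, with deflation
# so the second direction stays orthogonal to the first.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    for _ in range(100):
        w = (Z * (w @ Z) ** 3).mean(axis=1) - 3 * w
        w -= W[:i].T @ (W[:i] @ w)          # Gram-Schmidt deflation
        w /= np.linalg.norm(w)
    W[i] = w

Y = W @ Z  # estimated sources (up to permutation, sign, and scale)
# Correlating Y with S should show one strong match per row.
print(np.corrcoef(np.vstack([Y, S]))[:2, 2:].round(2))
```

Kurtosis is used here only because it gives a short working example; it is just one of the many estimation principles such a survey covers.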
From hali at theophys.kth.se Tue Jun 1 17:52:53 1999
From: hali at theophys.kth.se (Hans Liljenström)
Date: Tue, 01 Jun 1999 23:52:53 +0200
Subject: Final CALL FOR PAPERS: 1999 Agora Meeting on Fluctuations in Biological Systems
Message-ID: <37545635.694BFF49@theophys.kth.se>

*************************************************************************
2nd Announcement and Final Call For Papers

1999 Agora Meeting on Fluctuations in Biological Systems

Agora'99
Sigtuna, Sweden
August 3-7, 1999

Organized by Agora for Biosystems

Sponsored by
International Union for Pure and Applied Physics (IUPAP)
Swedish Council for Planning and Coordination of Research (FRN)

>>>>>> DEADLINE for abstract submission extended to: June 15, 1999 <<<<<<<

More information and registration at
http://www.theophys.kth.se/~hali/agora/agora99
*************************************************************************

SCOPE

This interdisciplinary conference on fluctuations in biological systems will be held in the small old town of Sigtuna, Sweden, Aug 3-7, 1999, and follows a series of workshops, the first of which was held in Sigtuna, Sep 4-9, 1995 (Sigtuna Workshop 95). The approach at these meetings is theoretical as well as experimental, and the meetings are intended to attract participants from various fields, such as biology, physics, and computer science. A number of invited speakers will give presentations on the fundamental problems, but participants are invited to submit abstracts on topics related to those listed below. The number of participants is limited to approx. 150.

MOTIVATION

Life is normally associated with a high degree of order and organization. However, disorder -- in various contexts referred to as fluctuations, noise or chaos -- is also a crucial component of many biological processes. For example, in evolution, random errors in the reproduction of the genetic material provide the variation that is fundamental for the selection of adaptive organisms. At a molecular level, thermal fluctuations govern the movements and functions of the macromolecules in the cell. Yet, it is also clear that too large a variation may have disastrous effects. Uncontrolled processes need stabilizing mechanisms. More knowledge of the stability requirements of biological processes is needed in order to better understand these problems, which also have important medical applications. Many diseases, for instance certain degenerations of brain cells, are caused by failure of the stabilizing mechanisms in the cell.
Stability is also important and difficult to achieve in biotechnological applications.

There is also randomness in the structure and function of the neural networks of the brain. Spontaneous firing of neurons seems to be important for maintaining an adequate level of activity, but does this "neuronal noise" have any other significance? What are the effects of errors and fluctuations in the information processing of the brain? Can these microscopic fluctuations be amplified to produce macroscopic effects? Often, one cannot easily determine whether an apparently random process is due to noise, governed by uncontrolled degrees of freedom, or whether it is a result of "deterministic chaos". Would the difference be of any importance for biology? In particular, could chaos, which is characterized by sensitivity and divergence, be useful for any kind of information processing that normally depends upon stability and convergence?

OBJECTIVE

The objective of this meeting is to address questions and problems related to those above, for a deeper understanding of the effects of disorder in biological systems. Fluctuations and chaos have been extensively studied in physics, but to a much lesser degree in biology. Important concepts from physics, such as "noise-induced state transitions" and "controlled chaos", could also be of relevance for biological systems. Yet, little has been done about such applications, and a more critical analysis of the positive and negative effects of disorder for living systems is needed. It is essential to make concrete and testable hypotheses, and to avoid the kind of superficial and more fashionable treatment that often dominates the field. By bringing together scientists with knowledge and insights from different disciplines, we hope to shed more light on these problems, which we think are profound for understanding the phenomenon of life.

TOPICS

Topics include various aspects, experimental as well as theoretical, of fluctuations, noise and chaos in biological systems at the microscopic (molecular), mesoscopic (cellular), and macroscopic (network and systems) levels. Contributions are welcome regarding, among others, the following topics:

- Biological signals and noise
- Neural information processing
- Synaptic fluctuations
- Spontaneous neural firing
- Macromolecular dynamics
- Dynamics of microtubules
- Ion channel kinetics
- Cell motility
- Medical implications

INVITED SPEAKERS

Per Andersen, Oslo University, Norway
Hans Braun, University of Marburg, Germany
Franco Conti, Istituto di Cibernetica e Biofisica, CNR, Genova, Italy
Louis DeFelice, Vanderbilt University, Nashville, USA
Hans Frauenfelder, Los Alamos National Laboratory, New Mexico, USA
John Hopfield, Princeton University, USA
Fernan Jaramillo, Emory University School of Medicine, Atlanta, USA
Stanislas Leibler, Princeton University, USA
Uno Lindberg, Stockholm University, Sweden
Koichiro Matsuno, Nagaoka University of Technology, Japan
Erik Mosekilde, Technical University of Denmark, Lyngby
Frank Moss, University of Missouri, St Louis, USA
Sakire Pögün, Center for Brain Research, Ege University, Turkey
Stephen Traynelis, Emory University School of Medicine, Atlanta, USA
Horst Vogel, Swiss Federal Institute of Technology, Lausanne, Switzerland
Peter Wolynes, University of Illinois, USA
James J. Wright, University of Melbourne, Australia
Mikhail Zhadin, Institute of Cell Biophysics, Pushchino, Russia

PRE-REGISTRATION and abstract submission can preferably be done via the Agora'99 home page:
http://www.theophys.kth.se/~hali/agora/agora99

REGISTRATION FEES
Regular: 1500 SEK (ca 190 USD) before June 1 ---- 2000 SEK after June 1
Students: 800 SEK (ca 100 USD) before June 1 ---- 1000 SEK after June 1

DEADLINE for abstract submission: June 15, 1999.

FURTHER INFORMATION available from:
Hans Liljenström
Theoretical Physics, Royal Institute of Technology, SE-100 44 Stockholm, Sweden
and
Agora for Biosystems, Box 57, SE-193 22 Sigtuna, Sweden
Phone: +46-8-790 7167
Fax: +46-8-10 48 79
Email: hali at theophys.kth.se
WWW: http://www.theophys.kth.se/~hali

From amari at brain.riken.go.jp Tue Jun 1 22:59:54 1999
From: amari at brain.riken.go.jp (Shunichi Amari)
Date: Wed, 02 Jun 1999 11:59:54 +0900
Subject: adaptive natural gradient papers
Message-ID: <19990602115954G.amari@brain.riken.go.jp>

Two papers concerning natural gradient on-line learning are available!

Natural gradient learning has attracted much attention because of its excellent dynamical behavior. When it is applied to multilayer perceptrons, its superior properties have been proved by statistical-physical methods. It is not only locally Fisher efficient but also avoids plateaus, or quickly escapes from them. Although the method is theoretically attractive, its practical implementation has been believed to be difficult, because calculating the Fisher information matrix and inverting it are very costly. In order to avoid this difficulty, we have developed an adaptive method of directly estimating the inverse of the Fisher information matrix. The natural gradient method works surprisingly well with this adaptive estimate of the inverse.

The first paper proposes the method itself, and has been accepted for publication in Neural Computation. The second paper generalizes the idea to a wider class of network models and loss functions; it has been submitted to Neural Networks.

S. Amari, H. Park and K. Fukumizu, Adaptive method of realizing natural gradient learning for multilayer perceptrons.
H. Park, S. Amari, and K. Fukumizu, Adaptive natural gradient learning algorithms for various stochastic models.

You can copy the papers from http://www.bsis.brain.riken.go.jp/

Shun-ichi Amari
Wako-shi, Hirosawa 2-1, Saitama 351-0198, Japan
RIKEN Brain Science Institute
Director of Brain-Style Information Systems Research Group
Laboratory for Information Synthesis, Head
tel: +81-(0)48-467-9669
fax: +81-(0)48-467-9687
e-mail: amari at brain.riken.go.jp
home page: http://www.bsis.brain.riken.go.jp/
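To make the general idea concrete, here is a rough sketch of adaptive natural gradient descent on a toy problem. This is not the authors' code: the model (linear regression with unit noise), the particular rank-one form of the inverse-Fisher update, and all step-size schedules are illustrative assumptions.

```python
# Sketch of adaptive natural gradient descent: the inverse Fisher matrix
# is tracked on-line by a rank-one update, so no explicit inversion of the
# Fisher information matrix is ever performed. Toy model and step sizes
# are illustrative assumptions, not taken from the papers.
import numpy as np

rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)
w = np.zeros(d)
Ginv = np.eye(d)                      # running estimate of G^{-1}

for t in range(1, 5001):
    x = rng.normal(size=d)
    y = w_true @ x + rng.normal()     # observation with unit noise
    grad = (w @ x - y) * x            # per-example gradient of -log p

    # Rank-one adaptive update of the inverse Fisher estimate:
    # G^{-1} <- (1+eps) G^{-1} - eps (G^{-1} g)(G^{-1} g)^T
    eps = 1.0 / (t + 100.0)
    v = Ginv @ grad
    Ginv = (1 + eps) * Ginv - eps * np.outer(v, v)

    w -= 0.01 * (Ginv @ grad)         # natural gradient step

print("parameter error:", np.linalg.norm(w - w_true))
```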
From phil at rome.cis.plym.ac.uk Fri Jun 4 08:25:03 1999
From: phil at rome.cis.plym.ac.uk (Phil Culverhouse)
Date: Fri, 04 Jun 1999 13:25:03 +0100
Subject: JOB POSITION
Message-ID: <3757C59F.247A59DD@cis.plym.ac.uk>

SCHOOL OF ELECTRONIC, COMMUNICATION & ELECTRICAL ENGINEERING, UNIVERSITY OF PLYMOUTH, UK.
Ref: 3289/TECH
RESEARCH ASSISTANT/FELLOW
Salary £10,399 to £17,606 pa - RA/RF

An exciting 20-month post is IMMEDIATELY available for a vision scientist. You will LEAD the development of a neural network based natural object categoriser for laboratory use. The existing prototype (UNIX platform) is capable of categorising 23 species of marine plankton, but has to be further developed, with a user interface tailored to marine ecologists.

You should have a working knowledge of wavelet transforms, current machine vision techniques, and multi-dimensional clustering statistics. You should ideally be familiar with the UNIX and Windows NT operating systems, and be a Matlab and C++ programmer.

The POST IS AVAILABLE IMMEDIATELY and will involve some European travel. For informal enquiries regarding this post, please contact Dr P Culverhouse on +44 1752 233517 or email: pculverhouse at plymouth.ac.uk

Closing date: 30th June 1999

From ishii at is.aist-nara.ac.jp Sun Jun 6 22:54:41 1999
From: ishii at is.aist-nara.ac.jp (Shin Ishii)
Date: Mon, 07 Jun 1999 11:54:41 +0900
Subject: Paper on on-line EM algorithm for NRBF
Message-ID: <199906070254.LAA06189@axp27.aist-nara.ac.jp>

The following paper is available on my web site:
http://mimi.aist-nara.ac.jp/~ishii/publication.html
We would greatly appreciate comments and suggestions.

-------------------------------------------------------------------
On-line EM algorithm for the normalized Gaussian network
Masa-aki Sato and Shin Ishii
To appear in Neural Computation

A Normalized Gaussian Network (NGnet) (Moody and Darken 1989) is a network of local linear regression units. The model softly partitions the input space by normalized Gaussian functions, and each local unit linearly approximates the output within its partition. In this article, we propose a new on-line EM algorithm for the NGnet, which is derived from the batch EM algorithm (Xu, Jordan and Hinton 1995) by introducing a discount factor. We show that the on-line EM algorithm is equivalent to the batch EM algorithm if a specific schedule for the discount factor is employed. In addition, we show that the on-line EM algorithm can be considered a stochastic approximation method for finding the maximum likelihood estimator. A new regularization method is proposed in order to deal with singular input distributions. In order to manage dynamic environments, where the input-output distribution of the data changes over time, unit manipulation mechanisms such as unit production, unit deletion, and unit division are also introduced, based on the probabilistic interpretation. Experimental results show that our approach is suitable for function approximation problems in dynamic environments. We also apply our on-line EM algorithm to robot dynamics problems and compare it with the Mixtures-of-Experts family.
-------------------------------------------------------------------

Shin Ishii
Nara Institute of Science and Technology
ATR Human Information Processing Research Laboratories
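For readers unfamiliar with the model, here is a minimal sketch of the NGnet's forward computation as described in the abstract: softmax-normalized Gaussian gates select among local linear regression units. This shows only the model, not the paper's on-line EM algorithm; the isotropic Gaussians, shapes, and names are illustrative simplifications.

```python
# Minimal sketch of the NGnet forward pass: normalized Gaussian gates
# weight the outputs of local linear regression units.
import numpy as np

def ngnet_predict(x, centers, widths, W, b):
    """x: (d,) input; centers: (M, d); widths: (M,) isotropic std devs;
    W: (M, o, d) local linear maps; b: (M, o) local offsets."""
    # Unnormalized Gaussian activations of the M units.
    g = np.exp(-0.5 * np.sum((x - centers) ** 2, axis=1) / widths ** 2)
    p = g / g.sum()                      # normalized (softmax-like) gates
    local = W @ x + b                    # (M, o): each unit's linear fit
    return p @ local                     # gate-weighted mixture of outputs

# Example: M = 3 units, 2-D input, 1-D output.
rng = np.random.default_rng(0)
centers = rng.normal(size=(3, 2))
widths = np.ones(3)
W = rng.normal(size=(3, 1, 2))
b = rng.normal(size=(3, 1))
print(ngnet_predict(np.array([0.5, -0.2]), centers, widths, W, b))
```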
From amari at brain.riken.go.jp Mon Jun 7 05:05:38 1999
From: amari at brain.riken.go.jp (Shunichi Amari)
Date: Mon, 07 Jun 1999 18:05:38 +0900
Subject: exact location of two natural gradient learning papers
Message-ID: <19990607180538R.amari@brain.riken.go.jp>

I have received a number of complaints about difficulties in copying the announced papers, caused by the inexact information I gave. The following is the exact location of the two adaptive natural gradient papers. We have made two files for each paper, one in postscript form (gzipped) and the other in pdf form. The URL is as follows:

http://www.islab.brain.riken.go.jp/~amari/pub_j.html#Journal

Please contact Dr. Fukumizu (fuku at brain.riken.go.jp) if you have any trouble.

Shun-ichi Amari
Wako-shi, Hirosawa 2-1, Saitama 351-0198, Japan
RIKEN Brain Science Institute
Director of Brain-Style Information Systems Research Group
Laboratory for Information Synthesis, Head
tel: +81-(0)48-467-9669
fax: +81-(0)48-467-9687
e-mail: amari at brain.riken.go.jp
home page: http://www.bsis.brain.riken.go.jp/

From jagota at cse.ucsc.edu Mon Jun 7 17:50:12 1999
From: jagota at cse.ucsc.edu (Arun Jagota)
Date: Mon, 7 Jun 1999 14:50:12 -0700 (PDT)
Subject: new {H}MM e-survey
Message-ID: <199906072150.OAA02253@arapaho.cse.ucsc.edu>

New refereed e-publication. Action editor: Yoram Singer.

Y. Bengio, Markovian Models for Sequential Data, Neural Computing Surveys 2, 129--162, 1999. 141 references.
http://www.icsi.berkeley.edu/~jagota/NCS

Abstract: Hidden Markov Models (HMMs) are statistical models of sequential data that have been used successfully in many machine learning applications, especially for speech recognition. Furthermore, in the last few years, many new and promising probabilistic models related to HMMs have been proposed. We first summarize the basics of HMMs, and then review several recent related learning algorithms and extensions of HMMs, including in particular hybrids of HMMs with artificial neural networks, Input-Output HMMs (which are conditional HMMs using neural networks to compute probabilities), weighted transducers, variable-length Markov models, and Markov switching state-space models. Finally, we discuss some of the challenges of future research in this very active area.
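As a pointer to the basic machinery the survey builds on, here is a tiny sketch of the forward algorithm, which computes the likelihood of an observation sequence under an HMM by dynamic programming; the two-state model and its numbers are made up for the example.

```python
# Forward algorithm for a tiny discrete HMM: p(obs) via dynamic programming.
import numpy as np

A = np.array([[0.9, 0.1],      # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],      # emission probabilities per state
              [0.1, 0.9]])
init = np.array([0.5, 0.5])    # initial state distribution

obs = [0, 1, 1, 0, 1]          # observed symbol sequence

alpha = init * B[:, obs[0]]    # forward variables at time 0
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission

print("p(obs) =", alpha.sum())
```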
From ctan at bond.edu.au Tue Jun 8 03:21:35 1999
From: ctan at bond.edu.au (Dr Clarence N W Tan)
Date: Tue, 8 Jun 1999 17:21:35 +1000
Subject: call for abstracts: International Conference on Advanced Investment Technology 1999
Message-ID: <012d01beb17f$8b04c0a0$3a08f483@it.bond.edu.au>

International Conference on Advanced Investment Technology 1999
Incorporating Workshops on "A Primer to Advanced Investment Technology"
Announcement and Call for Papers
-------------------------------------------------------------------------
Date: Sun 19 December 1999 - Tue 21 December 1999
Venue: The Conference Centre, Bond University, Gold Coast, Queensland 4229, Australia
Extended Abstract of Paper Deadline: July 31, 1999

Organisers:
-----------
Honorary Chair: Prof. Gopal Gupta
Conference Chairs: Dr. Clarence Tan, Dr. Kuldeep Kumar

Invitation
-----------
Bond University would like to invite all researchers and practitioners interested in any aspect of applying technology to finance to attend the Advanced Investment Technology 1999 Conference (AIT99).

Objectives
----------
AIT99 aims to bring together academics and professionals in the finance and investment industry who are interested in applications of advanced information technology to finance problems. It intends to foster better relationships between academia and the finance industry on topics such as soft computing and web-based technology in the investment industry. A one-day workshop is planned for the day prior to the conference to cover a range of practical issues concerning the use of information technology in financial environments, including applications of advanced technology such as artificial neural networks, chaos theory, and forecasting to financial analysis and investment.

Structure:
----------
The conference is organised into two principal sessions: a series of workshops on 19th December 1999, and a conference program consisting of keynote speakers and presentations by the delegates/participants on 20th and 21st December 1999.

There are plans to have a public exhibition area for members of the industry to display their products in conjunction with the conference. Please contact the organisers for exhibition details.

Conference Topics:
-----------------
Topics include but are not limited to: applications in finance of soft computing, artificial intelligence and statistical methods, such as neural networks, genetic algorithms, and fuzzy logic; time series and forecasting; multivariate analysis; hybrid intelligent systems; intelligent agents; chaos theory; data mining; online gambling; gaming strategies; and trading systems.

Invited Keynote Speakers
------------------------
Dr Jeffrey Carmichael, Ph.D. (Princeton), AO, Chairman of the Australian Prudential Regulatory Authority (APRA) and Member of the 1997 Australian Wallis Financial Inquiry Commission.
Prof. Efraim Turban, Ph.D. (Berkeley), author of over 100 publications in the areas of information systems, electronic commerce, and neural networks in finance and investment.

International Review Committee:
-------------------------------
Prof. Efraim Turban (USA/HK), Dr. Jeff Carmichael (Aus), Prof. Ah Chung Tsoi (Aus), Emeritus Prof. Peter Poole (Aus), Prof. John D. Haynes (NZ), Prof. M. L. Tiku (Canada), Prof. R. Velu (USA), Prof. Kevin Burrage (Aus), Prof. V K Srivastava (India), Prof. Neville De Mestre (Aus), Prof. Berlin Wu (Taiwan), Prof. Nikola Kasabov (NZ), Prof. Stan Hurn (Aus), Dr. Vance Martin (Aus), Dr. Graham McMahon (Aus), Dr. A Flitman (Aus), Dr. Mark Chignell (Can), Dr. Gavin Finnie (Aus), Dr. A. N. Roy (USA), Dr. Jeff Barker (Aus), Dr. Zheng Da Wu (Aus), Dr. S. Ganesalingam (NZ), Dr. Stephen Sugden (Aus), Dr. S. Alvandi (Singapore), Dr. Gerhard Wittig (Aus), Dr. James Liu (HK), Dr. W. K. Yeap (NZ), Dr. Steven Lawrence (USA)

Registration Fees
-----------------
Workshop(s):
  Individual: full day A$600, or A$150 each
  Industry representative: full day A$1200

Conference (2 full days):    Before Oct 31   After Oct 31
  Students:                  A$150           A$150
  Academic:                  A$300           A$400
  Industry representative:   A$600           A$750

Exchange rate indication: US$1.00 = A$1.55

All SIA, AIBF, ATAA, GCRITF and ANZIAM members are entitled to a 10% discount on conference and workshop fees, subject to availability. Group bookings and sponsors are entitled to discounts; contact the organisers for details. The registration fee includes lunches and refreshments for each day of the conference and workshops.

Workshops (very limited places, register early to ensure a place)
---------------------------------------------------------------
Workshop 1: Financial Forecasting Techniques (Kumar)
Workshop 2: Basic Financial Trading System Design Using Excel (Tan)
Workshop 3: Applications of E-commerce in Finance I (Prof. Efraim Turban)
Workshop 4: Applications of E-commerce in Finance II (Prof. Efraim Turban)
Workshop 5: Artificial Neural Networks (Tsoi)
Workshop 6: Applying Chaos Theory to Finance (Kumar, Tan and Ghosh)

Registration
------------
To register, please complete the registration form and mail it with the payment to:
AIT 99
Ms Herlina Dihardjo
School of Information Technology
Bond University, QLD 4229
Australia
Telephone: +61 (0) 7 5595-3392
Fax: +61 (0) 7 5595-3320
*Do not dial (0) from outside Australia
E-mail: ait99 at bond.edu.au

Submission of Papers
--------------------
To submit a paper to the AIT99 conference, please send an extended abstract to the above address no later than 31st July 1999. If accepted, two copies of the final paper should be submitted before November 1, 1999.
The preferred format is Microsoft Word. Student papers are invited and encouraged. The Securities Institute of Australia is proud to sponsor the Best Student Paper Prize.

Accommodation & Travel
----------------------
Bond University has its own Conference Centre, with good accommodation facilities of one hundred standard and executive rooms. There is also a large number of choices for accommodation on the Gold Coast, ranging from luxury hotels such as Jupiter's Casino/Conrad Hotel, Marriott, Sheraton Mirage and Grand Mercure to backpackers' accommodation.

QANTAS is the official airline for the AIT99 conference and is proud to be part of the conference. A discount of up to 45% off the full economy airfare (excluding taxes) for domestic travel in Australia at the time of booking has been negotiated for delegates attending the conference, subject to seat availability in group class and to payment and ticketing conditions. Please quote Association Profile Number "1203355", the destination, and the date of the conference when making your reservation. The Qantas Association Sales contact number for Australian delegates is Toll Free 1800 684 880. *International delegates can contact their local Qantas office for the best available fare of the day.*

Our Web Site
------------
Prospective conference participants and delegates are invited to visit our website at the following URL: http://tide.it.bond.edu.au/ait99 or http://w3.to/ait99. On-line registration is available on the web.

Sponsors
--------
Major sponsor: Bond University's School of Information Technology. Other AIT99 sponsors and/or supporters include: the Gold Coast Regional Information Technology Forum (GRITF), the Gold Coast City Council, the Securities Institute of Australia (SIA), the Australian Institute of Banking and Finance (AIBF), Australia and New Zealand Industrial and Applied Mathematics (ANZIAM), the Australian Technical Analysts Association (ATAA), TechQuad, and On The Net. See the web site for the full list.

From FYFE-CI0 at wpmail.paisley.ac.uk Tue Jun 8 09:29:40 1999
From: FYFE-CI0 at wpmail.paisley.ac.uk (COLIN FYFE)
Date: Tue, 08 Jun 1999 13:29:40 +0000
Subject: PhD Studentships
Message-ID:

Two three-year studentships are offered in the field of unsupervised artificial neural networks applied to the extraction of information from visual data. One student will work most closely with Dr Bogdan Gabrys and will concentrate on novel cost functions for the extraction of independent components of visual scenes. The second will work with Dr Darryl Charles and will concentrate on additive noise for the creation of minimal code sets for sparsely coded visual data. Each studentship comprises payment of fees and an annual grant of approximately £5,500. Both are expected to lead to the award of a PhD within the three-year period.

The Applied Computational Intelligence Research Unit is a very active research unit within the Department of Computing and Information Systems, comprising some 10 academics, 5 research assistants and 15 research students.

To apply for either of these posts, please send a current CV to either char-ci0 at paisley.ac.uk or fyfe-ci0 at paisley.ac.uk before 30th June 1999.
Colin Fyfe

From ckiw at dai.ed.ac.uk Tue Jun 8 12:27:58 1999
From: ckiw at dai.ed.ac.uk (Chris Williams)
Date: Tue, 8 Jun 1999 17:27:58 +0100 (BST)
Subject: Call for Participation: ICANN 99 post-conference workshops
Message-ID:

Dear Connectionists,

Below are brief announcements of the 5 ICANN 99 post-conference workshops taking place at the University of Edinburgh on Saturday 11 September 1999. See http://www.dai.ed.ac.uk/daidb/people/homes/ckiw/icann/ and the URLs listed below for further details.

* Call for participation/presentations

This is a call for participation/presentations for these workshops. Please see the individual workshop web pages for details, and then contact the appropriate organizers.

* Registration Arrangements

Registration for the workshops is free. Those registering for the ICANN conference should register for the workshops on the same form. Anyone not attending the conference should register with the organizers of the workshop(s) they wish to attend. We need you to do this so that we can get rooms of a suitable size for the workshops. Those attending are responsible for their own travel and accommodation arrangements.

Chris Williams
ICANN 99 Post-Conference Workshops Organizer

--------------------------------------------------------------------------
Interactions between theoretical and experimental approaches in developmental neuroscience

Organizers: Stephen Eglen, Bruce Graham, David Willshaw (Edinburgh)
http://www.anc.ed.ac.uk/~stephen/workshop.html

This workshop will highlight the role of theoretical approaches in understanding the development of the nervous system. It will also allow us to discuss the ways in which experimental and theoretical approaches can interact on various developmental problems. The workshop will examine several key areas in neural development:

- Growth and branching in dendritic trees
- Molecular gradients, and their role in topographic mappings
- Neurotrophic factors
- Visual system development
- Development of innervation at the neuromuscular junction

--------------------------------------------------------------------------
Emergent Neural Computation Architectures Based on Neuroscience

Organizers: Stefan Wermter (Sunderland), Jim Austin (York), David Willshaw (Edinburgh)
http://www.his.sunderland.ac.uk/emernet/icann99w.html

Areas of interest include issues of neuroscience and neural networks, such as:

1. Synchronisation: How does the brain synchronise its processing? How does the brain schedule its processing?
2. Processing speed: How does the brain compute with relatively slow computing elements but still achieve rapid and real-time performance?
3. Robustness: How does human memory manage to continue to operate despite failure of its components?
4. Modular construction: What can we learn from the brain for building modular, more powerful artificial neural network architectures to solve larger tasks?
5. Learning in context: How can we build learning algorithms which take context into account? How can we design incremental learning algorithms and dynamic architectures?

--------------------------------------------------------------------------
Neural Networks for Intelligent User Interfaces

Organizers: Rainer Malaka (Heidelberg), Ramin Yasdi (Sankt Augustin)
http://www.dai.ed.ac.uk/daidb/people/homes/ckiw/icann/malaka.html

User interfaces that adapt themselves to the individual needs, preferences, and knowledge of their users are becoming more and more important.
Personalized interfaces are of special importance for dealing with information overload and navigation, by personalizing and improving the quality of information retrieval and filtering, information restructuring and annotation, and information visualization. The development of these new intelligent user interfaces requires techniques that enable computer programs to learn how to serve the user most efficiently. Neural networks are not yet widely used within this challenging domain. But the domain seems to be an interesting new application area for neural networks, due to the availability of large data sets and the required automatic adaptation to new situations and users. Therefore, growing interest in using the various powerful learning methods known from neural network models for intelligent user interfaces is arising among researchers.

--------------------------------------------------------------------------
Kernel Methods: Gaussian Process and Support Vector Machine predictors

Organizers: Carl Edward Rasmussen (Lyngby), Roderick Murray-Smith (Lyngby), Alex Smola (Berlin), Chris Williams (Edinburgh)
http://www.dai.ed.ac.uk/daidb/people/homes/ckiw/icann/gpsvm.html

This workshop aims to bring together people working with Gaussian Process (GP) and Support Vector Machine (SVM) predictors for regression and classification problems. The scope of the workshop includes:

- Methods for choosing kernels: generic vs problem-specific issues
- Uniform convergence and Bayesian theory
- Efficient implementation/approximation of GP and SVM predictors on large datasets
- GP classifiers: MCMC methods, variational and Laplace approximations
- Kernel methods for dynamic system modelling
- Applications of kernel methods

--------------------------------------------------------------------------
Developments in Artificial Neural Network Theory: Independent Component Analysis and Blind Source Separation

Organizer: Mark Girolami (Paisley)
http://cis.paisley.ac.uk/staff/giro-ci0/ICANN99/ICANN99_ICA_WS.html

This workshop seeks to re-focus the attention of ANN researchers by exploring how ICA / BSS and its further development can push forward our knowledge of the computational brain. Proposals are solicited for presentations and discussion that address and explore some of the following topics:

- Models of Sensory Coding in the Brain
- Mammalian Visual Cortex
- Image Feature Extraction
- Natural Image Statistics and Efficient Coding
- Auditory Modelling and the Binaural Cocktail Party Effect
- Over-complete Basis Representations
- State Space Models
- Time Varying Mixtures
- Non-linear ICA and Topographic Mappings
- Applications of ICA to Electrophysiological Data

From yweiss at CS.Berkeley.EDU Tue Jun 8 08:43:30 1999
From: yweiss at CS.Berkeley.EDU (Yair Weiss)
Date: Tue, 8 Jun 1999 05:43:30 -0700 (PDT)
Subject: TR on loopy belief propagation
Message-ID: <199906081243.FAA18143@gibson.CS.Berkeley.EDU>

Hi,

The following paper, showing that belief propagation gives the exact means for Gaussian graphical models regardless of the number of loops, is available online via:
http://www.cs.berkeley.edu/~yweiss/gaussTR.ps.gz
Comments are most welcome.

Yair

--------------------------------------------------------------------------
Title: Correctness of belief propagation in Gaussian graphical models of arbitrary topology
Authors: Yair Weiss and William T. Freeman
Reference: UC Berkeley CS Department TR UCB//CSD-99-1046

Abstract: Graphical models, such as Bayesian networks and Markov random fields, represent statistical dependencies of variables by a graph. Local "belief propagation" rules of the sort proposed by Pearl (1988) are guaranteed to converge to the correct posterior probabilities in singly connected graphical models. Recently, a number of researchers have empirically demonstrated good performance of "loopy belief propagation" -- using these same rules on graphs with loops. Perhaps the most dramatic instance is the near Shannon-limit performance of "Turbo codes", whose decoding algorithm is equivalent to loopy belief propagation. Except for the case of graphs with a single loop, there has been little theoretical understanding of the performance of loopy propagation. Here we analyze belief propagation in networks with arbitrary topologies when the nodes in the graph describe jointly Gaussian random variables. We give an analytical formula relating the true posterior probabilities with those calculated using loopy propagation. We give sufficient conditions for convergence and show that when belief propagation converges it gives the correct posterior means for all graph topologies, not just networks with a single loop. The related "max-product" belief propagation algorithm finds the maximum posterior probability estimate for singly connected networks. We show that, even for non-Gaussian probability distributions, the convergence points of the max-product algorithm in loopy networks are at least local maxima of the posterior probability. These results motivate using the powerful belief propagation algorithm in a broader class of networks, and help clarify the empirical performance results.
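The headline result can be checked numerically on a small example. The following toy sketch (not the authors' code) runs Gaussian belief propagation on a loopy pairwise model p(x) proportional to exp(-x'Jx/2 + h'x) and compares the resulting means with the exact solution J^{-1}h; the particular precision matrix, potentials, and flooding message schedule are illustrative choices.

```python
# Gaussian belief propagation on a loopy pairwise MRF: the converged
# means match the exact solution J^{-1} h. Model chosen to be diagonally
# dominant so that the iteration converges.
import numpy as np

J = np.array([[ 3.0, -1.0,  0.0, -1.0],
              [-1.0,  3.0, -1.0, -0.5],
              [ 0.0, -1.0,  3.0, -1.0],
              [-1.0, -0.5, -1.0,  3.0]])
h = np.array([1.0, 0.0, -1.0, 2.0])
n = len(h)
edges = [(i, j) for i in range(n) for j in range(n)
         if i != j and J[i, j] != 0.0]

# Messages i->j, parameterized by a precision P and a potential g.
P = {e: 0.0 for e in edges}
g = {e: 0.0 for e in edges}

for _ in range(200):  # flooding schedule; ample for convergence here
    newP, newg = {}, {}
    for (i, j) in edges:
        # Cavity precision/potential at i, excluding the message from j.
        Pi = J[i, i] + sum(P[(k, l)] for (k, l) in edges if l == i and k != j)
        gi = h[i] + sum(g[(k, l)] for (k, l) in edges if l == i and k != j)
        newP[(i, j)] = -J[i, j] ** 2 / Pi
        newg[(i, j)] = -J[i, j] * gi / Pi
    P, g = newP, newg

# Beliefs: marginal precision and mean at each node.
mean_bp = np.array([
    (h[i] + sum(g[(k, l)] for (k, l) in edges if l == i)) /
    (J[i, i] + sum(P[(k, l)] for (k, l) in edges if l == i))
    for i in range(n)])

print("loopy BP means:", mean_bp.round(6))
print("exact means:   ", np.linalg.solve(J, h).round(6))  # these agree
```

Note that the theorem quoted above concerns the means; the variances computed by loopy propagation need not match the exact marginal variances.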
From masaaki at hip.atr.co.jp Wed Jun 9 05:12:48 1999
From: masaaki at hip.atr.co.jp (Masa-aki SATO)
Date: Wed, 9 Jun 1999 18:12:48 +0900
Subject: TR: Fast Learning of On-line EM Algorithm
Message-ID: <01BEB2A3.AEC1C360@hippc4660.hip.atr.co.jp>

The following paper is available on my web site:
http://www.hip.atr.co.jp/~masaaki/
We would greatly appreciate comments and suggestions.

TITLE: "Fast Learning of On-line EM Algorithm"
Masa-aki Sato
ATR Human Information Processing Research Laboratories

------------------------------------------------------------------
Abstract

In this article, an on-line EM algorithm is derived for general Exponential Family models with Hidden variables (EFH models). It is proven that the on-line EM algorithm is equivalent to a stochastic gradient method with the inverse of the Fisher information matrix as a coefficient matrix. As a result, stochastic approximation theory guarantees convergence to a local maximum of the likelihood function. The performance of the on-line EM algorithm is examined using the mixture-of-Gaussians model, which is a special type of EFH model. The simulation results show that the on-line EM algorithm is much faster than the batch EM algorithm and the on-line gradient ascent algorithm. The fast learning speed is achieved by the systematic design of the learning rate schedule. Moreover, it is shown that the on-line EM algorithm can escape from a local maximum of the likelihood function in the early training phase, even when the batch EM algorithm is trapped in a local maximum solution. It is pointed out that the on-line EM algorithm has a similar form to the natural gradient method proposed by Amari (1998), which gives optimal asymptotic convergence. The inverse of the Fisher information matrix in the on-line EM algorithm may contribute to its fast learning performance. In our on-line EM algorithm, however, it is not necessary to calculate the inverse of the Fisher information matrix. In the future, it would be interesting to study the relation of our algorithm to the natural gradient method.
--------------------------------------

Masa-aki Sato
ATR Human Information Processing Research Laboratories
2-2, Hikaridai, Seika-cho, Soraku-gun
Kyoto 619-0288 Japan
phone: 0774-95-1039
fax: 0774-95-1008
E-mail: masaaki at hip.atr.co.jp
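To illustrate the discount-factor idea in the simplest case, here is a minimal sketch of on-line EM for a one-dimensional mixture of Gaussians. It is not the author's code: the schedule for the discount factor lam and all other parameter choices are illustrative. Note that the updates never form or invert a Fisher information matrix explicitly.

```python
# On-line EM for a 1-D mixture of Gaussians: sufficient statistics are
# accumulated with a discount factor, and parameters follow in closed form.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stream from a two-component mixture.
x_stream = np.concatenate([rng.normal(-2.0, 0.5, 5000),
                           rng.normal(3.0, 1.0, 5000)])
rng.shuffle(x_stream)

K = 2
w = np.full(K, 1.0 / K)             # mixing proportions
mu = np.array([-1.0, 1.0])          # component means
var = np.ones(K)                    # component variances

# Discounted sufficient statistics <1>, <x>, <x^2> per component.
s0 = np.full(K, 1.0 / K)
s1 = mu * s0
s2 = (var + mu**2) * s0

for t, x in enumerate(x_stream, start=1):
    lam = 1.0 / (t + 10.0)          # discount factor schedule (assumed)

    # E-step: posterior responsibility of each component for x.
    log_p = (np.log(w) - 0.5 * np.log(2 * np.pi * var)
             - 0.5 * (x - mu) ** 2 / var)
    r = np.exp(log_p - log_p.max())
    r /= r.sum()

    # Discounted accumulation: old statistics are down-weighted by
    # (1 - lam), so the remote past is gradually forgotten.
    s0 = (1 - lam) * s0 + lam * r
    s1 = (1 - lam) * s1 + lam * r * x
    s2 = (1 - lam) * s2 + lam * r * x * x

    # M-step: closed-form parameter updates from the current statistics.
    w = s0 / s0.sum()
    mu = s1 / s0
    var = np.maximum(s2 / s0 - mu**2, 1e-6)

print("weights:", w, "means:", mu, "variances:", var)
```

Holding lam at a small constant instead of decaying it makes the statistics forget the remote past, which is what allows tracking of slowly drifting input-output distributions.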
From C.Campbell at bristol.ac.uk Wed Jun 9 08:16:16 1999
From: C.Campbell at bristol.ac.uk (I C G Campbell)
Date: Wed, 9 Jun 1999 13:16:16 +0100 (BST)
Subject: PhD studentship available
Message-ID: <199906091216.NAA12541@zeus.bris.ac.uk>

PhD STUDENTSHIP AVAILABLE

A three-year studentship is available in the area of Support Vector Machines and their applications. The project has a theoretical component and an applied component, principally the application of SVMs and related kernel classifiers to biosequences. The student will be a member of the Computational Intelligence Group (see our web pages: http://lara.enm.bris.ac.uk/cig/ ), which is part of the Advanced Computing Research Centre, Bristol University (http://www.cs.bris.ac.uk/ACRC/ ). Suitable applicants should have a strong mathematical background; additional computing experience would be an advantage. The studentship comprises payment of fees and an annual maintenance grant at EPSRC rates. To apply for this post, please send a current CV to: C.Campbell at bris.ac.uk

Colin Campbell
ACRC, Bristol University, United Kingdom

From frey at dendrite.uwaterloo.ca Wed Jun 9 08:39:48 1999
From: frey at dendrite.uwaterloo.ca (Brendan Frey)
Date: Wed, 9 Jun 1999 08:39:48 -0400
Subject: loopy learning
Message-ID: <199906091239.IAA08681@dendrite.uwaterloo.ca>

It turns out that loopy propagation, or "turboinference", in Gaussian networks can be used effectively for INFERENCE and LEARNING:
http://www.cs.toronto.edu/~frey/papers/tfa-nc99.abs.html

In this paper (submitted for publication in May), I show that iterative probability propagation in factor analysis networks has a fixed point, and I give an eigenvalue condition for global convergence. I also show that iterative propagation can be used for learning factor analyzer networks, and give results on face recognition.

Brendan.

From mdorigo at ulb.ac.be Wed Jun 9 09:31:46 1999
From: mdorigo at ulb.ac.be (Marco DORIGO)
Date: Wed, 9 Jun 1999 15:31:46 +0200
Subject: Fourth European Workshop on Reinforcement Learning: Call for Participation and Abstracts
Message-ID:

Call for Abstracts: EWRL-4, Fourth European Workshop on Reinforcement Learning
Lugano, Switzerland, October 29-30, 1999
(We apologize for duplicates of this email)

Reinforcement learning (RL) is a growing research area. To build a European RL community and give visibility to the current situation on the old continent, we are running a now biennial series of workshops. EWRL-1 took place in Brussels, Belgium (1994), EWRL-2 in Milano, Italy (1995), and EWRL-3 in Rennes, France (1997). EWRL-4 will take place in Lugano, Switzerland (1999). The first morning will feature a plenary talk by Dario Floreano. The rest of the two-day workshop will be dedicated to presentations given by selected participants. Presentation length will be determined once we have some feedback on the number of participants. The number of participants will be limited. Access will be restricted to active RL researchers and their students.
Please communicate your intention to participate as soon as possible, and in any case before the end of July 1999, by means of the intention form attached below (e-mail preferred: ewrl at iridia.ulb.ac.be). Otherwise send intention forms to:

Marco Dorigo
IRIDIA, CP 194/6
Université Libre de Bruxelles
Avenue Franklin Roosevelt 50
1050 Bruxelles
Belgium

TIMELINE: intention forms and one-page abstracts should be emailed by the end of July to ewrl at iridia.ulb.ac.be

Up-to-date information, including registration fees, hotel information, etc., is maintained at:
http://iridia.ulb.ac.be/~ewrl/EWRL4/EWRL4.html

The Organizing Committee
Marco Dorigo and Hugues Bersini, IRIDIA, ULB, Brussels, Belgium
Luca M. Gambardella and Juergen Schmidhuber, IDSIA, Lugano, Switzerland
Marco Wiering, University of Amsterdam, The Netherlands

--------------------------------------------------------------------
INTENTION FORM
(to be emailed by the end of July, 1999, to ewrl at iridia.ulb.ac.be)

Fourth European Workshop on Reinforcement Learning (EWRL-4)
Lugano, Switzerland, October 29-30, 1999

Family Name:
First Name:
Institution:
Address:
Phone No.:
Fax No.:
E-mail:

____ I intend to participate without giving a presentation
____ I intend to participate and would like to give a presentation with the following title:

____ MAX one page abstract:

From a.burkitt at medoto.unimelb.edu.au Wed Jun 9 21:58:23 1999
From: a.burkitt at medoto.unimelb.edu.au (Anthony BURKITT)
Date: Thu, 10 Jun 1999 11:58:23 +1000
Subject: Preprint available
Message-ID: <60E1B9CE4896D111A22700E0291005973580DA@mail.medoto.unimelb.edu.au>

The following paper on the analysis of integrate-and-fire neurons has been accepted for publication in Neural Computation and is available now from my web page:
http://www.medoto.unimelb.edu.au/people/burkitta/poisson.ps.zip

"Calculation of interspike intervals for integrate and fire neurons with Poisson distribution of synaptic inputs"
A. N. Burkitt and G. M. Clark

Abstract: In this paper we present a new technique for calculating the interspike intervals of integrate-and-fire neurons. There are two new components to this technique. Firstly, the probability density of the summed potential is calculated by integrating over the distribution of arrival times of the afferent postsynaptic potentials (PSPs), rather than using conventional stochastic differential equation techniques. A general formulation of this technique is given in terms of the probability distribution of the inputs and the time course of the postsynaptic response. The expressions are evaluated in the Gaussian approximation, which gives results that become more accurate for large numbers of small-amplitude PSPs. Secondly, the probability density of output spikes, which are generated when the potential reaches threshold, is given in terms of an integral involving a conditional probability density. This expression is a generalization of the renewal equation, but it holds both for leaky neurons and for situations in which there is no time-translational invariance. The conditional probability density of the potential is calculated using the same technique of integrating over the distribution of arrival times of the afferent PSPs. For inputs with a Poisson distribution, the known analytic solutions for both the perfect integrator model and the Stein model (which incorporates membrane potential leakage) in the diffusion limit are obtained. The interspike interval distribution may also be calculated numerically for models which incorporate both membrane potential leakage and a finite rise time of the postsynaptic response. Plots of the relationship between input and output firing rates, as well as the coefficient of variation, are given, and inputs with varying rates and amplitudes, including inhibitory inputs, are analyzed. The results indicate that neurons functioning near their critical threshold, where the inputs are just sufficient to cause firing, display a large variability in their spike timings.

====================ooOOOoo====================
Anthony N. Burkitt
The Bionic Ear Institute
384-388 Albert Street
East Melbourne, VIC 3002
Australia
Email: a.burkitt at medoto.unimelb.edu.au
http://www.medoto.unimelb.edu.au/people/burkitta
Phone: +61 - 3 - 9283 7510
Fax: +61 - 3 - 9283 7518
=====================ooOOOoo===================
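For readers who want to experiment with this setting, here is a minimal Monte Carlo sketch of the Stein model with Poisson inputs (an illustrative companion, not the paper's analytic method), from which the interspike interval distribution and its coefficient of variation can be estimated empirically; all parameter values are assumptions.

```python
# Stein model with Poisson input: each afferent spike adds a fixed jump to
# an exponentially decaying membrane potential; an interspike interval
# (ISI) is recorded whenever the potential crosses threshold.
import numpy as np

rng = np.random.default_rng(1)

tau = 10.0       # membrane time constant (ms)
rate = 5.0       # total afferent Poisson rate (spikes/ms)
a = 0.03         # PSP amplitude (jump per input spike)
theta = 1.0      # firing threshold
n_isis = 20000   # number of intervals to collect

isis = []
v, t_last, t = 0.0, 0.0, 0.0
while len(isis) < n_isis:
    dt = rng.exponential(1.0 / rate)   # waiting time to next input spike
    t += dt
    v *= np.exp(-dt / tau)             # leaky decay between inputs
    v += a                             # excitatory PSP jump
    if v >= theta:                     # threshold crossing: spike + reset
        isis.append(t - t_last)
        t_last = t
        v = 0.0

isis = np.array(isis)
cv = isis.std() / isis.mean()          # coefficient of variation
print(f"mean ISI = {isis.mean():.2f} ms, CV = {cv:.2f}")
```

Lowering the PSP amplitude or rate toward the critical threshold regime, where the mean drive barely reaches threshold, noticeably increases the estimated CV, in line with the conclusion of the abstract.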
From ericwan at ece.ogi.edu Thu Jun 10 14:59:53 1999
From: ericwan at ece.ogi.edu (Eric Wan)
Date: Thu, 10 Jun 1999 18:59:53 +0000
Subject: OGI PH.D. STUDENT RESEARCH POSITION
Message-ID: <37600B29.9CDF09A2@ece.ogi.edu>

*********** PH.D. STUDENT RESEARCH POSITION OPENING ****************

CENTER FOR SPOKEN LANGUAGE UNDERSTANDING
http://cslu.cse.ogi.edu/
OREGON GRADUATE INSTITUTE

The Oregon Graduate Institute of Science and Technology (OGI) has an immediate opening for an outstanding student in its Electrical and Computer Engineering Ph.D. program. Full stipend and tuition will be covered. The student will work specifically with Professor Eric A. Wan (http://www.ece.ogi.edu/~ericwan/) on a number of projects relating to neural network learning and speech enhancement.

QUALIFICATIONS: The candidate should have a strong background in signal processing, with some prior knowledge of neural networks. A Masters degree in Electrical Engineering is preferred. Please send inquiries and background information to ericwan at ece.ogi.edu.

Eric A. Wan
Associate Professor, OGI

*********************************************************************

OGI

OGI is a young, but rapidly growing, private research institute located in the Portland area. OGI offers Masters and PhD programs in Computer Science and Engineering, Applied Physics, Electrical Engineering, Biology, Chemistry, Materials Science and Engineering, and Environmental Science and Engineering. OGI has world-renowned research programs in the areas of speech systems (Center for Spoken Language Understanding) and machine learning (Center for Information Technologies).

Center for Spoken Language Understanding

The Center for Spoken Language Understanding is a multidisciplinary academic organization that focuses on basic research in spoken language systems technologies, training of new investigators, and development of tools and resources for free distribution to the research and education community. Areas of specific interest include speech recognition, natural language understanding, text-to-speech synthesis, speech enhancement in noisy conditions, and modeling of human dialogue. A key activity is the ongoing development of the CSLU Toolkit, a comprehensive software platform for learning about, researching, and developing spoken dialog systems and new applications.

Center for Information Technologies

The Center for Information Technologies supports development of powerful, robust, and reliable information processing techniques by incorporating human strategies and constraints.
Such techniques are critical building blocks of multimodal communication systems, decision support systems, and human-machine interfaces. The CIT approach is based on emulating relevant human information processing capabilities and extending them to a variety of complex tasks. The approach requires expertise in nonlinear and adaptive signal processing (e.g., neural networks), statistical computation, decision analysis, and modeling of human information processing. Correspondingly, CIT research areas include perceptual characterization of speech and images, prediction, robust signal processing, rapid adaptation to changing environments, nonlinear signal representation, integration of information from several sources, and integration of prior knowledge with adaptation.

From jbower at bbb.caltech.edu Thu Jun 10 18:22:12 1999
From: jbower at bbb.caltech.edu (James M. Bower)
Date: Thu, 10 Jun 1999 15:22:12 -0700
Subject: REGISTRATION FOR CNS*99
Message-ID:

************************************************************************
EIGHTH ANNUAL COMPUTATIONAL NEUROSCIENCE MEETING (CNS*99)
July 18 - 22, 1999
Pittsburgh, Pennsylvania
REGISTRATION INFORMATION
************************************************************************

Registration is now open for this year's Computational Neuroscience meeting (CNS*99). This is the eighth in a series of annual interdisciplinary conferences intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience. As in previous years, this meeting will bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in understanding how biological neural systems compute. The meeting will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation.

The meeting in 1999 will take place at the Pittsburgh Hilton and Towers in Pittsburgh, Pennsylvania, and will include plenary, contributed, and poster sessions. The first session starts at 9 am on Sunday, July 18th, and the meeting ends with the annual CNS banquet on Thursday evening, July 22nd. There will be no parallel sessions. The meeting includes two half-days of informal workshops focused on current issues in computational neuroscience. Day care will not be available.

LOCATION: The meeting will take place at the Pittsburgh Hilton and Towers in Pittsburgh, Pennsylvania.

MEETING ACCOMMODATIONS: Accommodations for the meeting have been arranged at the Pittsburgh Hilton and Towers. Information concerning reservations, hotel accommodations, etc. is available at the meeting web site indicated below. A block of rooms is reserved at special rates; 40 student-rate rooms are available on a first-come-first-served basis, so we recommend that students act quickly to reserve these slots. NOTE that registering for the meeting WILL NOT result in an automatic room reservation. Instead, you must make your own reservations by contacting the hotel itself. As this is the high season for tourists in Pittsburgh, you should make sure to reserve your accommodations quickly by contacting:

Pittsburgh Hilton and Towers (RESERVATION REQUEST FORM LOCATED BELOW)

NOTE: IN ORDER TO GET THE AVAILABLE ROOMS, YOU MUST CONFIRM HOTEL REGISTRATIONS BY SATURDAY, JUNE 17, 1999. When making reservations by phone, make sure to indicate that you are registering for the Computational Neuroscience (CNS*99) meeting.
Students will be asked to verify their status on check-in with a student ID or other documentation.

MEETING REGISTRATION FEES:

Registration received on or before July 3, 1999:
  Student: $125
  Regular: $275

Meeting registration after July 3, 1999:
  Student: $150
  Regular: $300

BANQUET: Registration for the meeting includes a single ticket to the annual CNS banquet. Additional banquet tickets can be purchased for $35 per person. The banquet will be held on Thursday, July 22nd.

*********************************************************************
REGISTRATION AND ADDITIONAL INFORMATION (including the agenda with the list of talks) can be obtained by:

o Using our on-line WWW information and registration server, URL:
  http://cns.numedeon.com/cns99/
o Sending Email to: cns99 at bbb.caltech.edu

PLEASE FAX OR MAIL REGISTRATION FORM TO:
Caltech, Division of Biology 216-76, Pasadena, CA 91125
Attn: Judy Macias
Fax Number: (626) 795-2088

(Refund Policy: 50% refund for cancellations on or before July 9th; no refund after July 10th)

*******************************************************************
PLEASE CALL THE PITTSBURGH HILTON AND TOWERS AT (412) 391-4600 (Fax: (412) 594-5144) TO MAKE HOTEL RESERVATIONS, OR SUBMIT THE RESERVATION FORM BELOW IN ONE OF TWO WAYS:
* MAIL THE FORM TO THE PITTSBURGH HILTON AND TOWERS AT THE ADDRESS BELOW, OR
* FAX THE FORM TO (412) 594-5144
**********************************************************************

MAIL TO:
Pittsburgh Hilton and Towers
Attn: Reservation Department
600 Commonwealth Place
Pittsburgh, Pennsylvania 15222

Check-In Time: 3:00 p.m.  Check-Out Time: 12:00 noon

Computational Neuroscience Conference - CNS*99
July 17 - 22, 1999

* A $50 early departure fee will be assessed should you change your departure date after you have checked in.
** The hotel requires a one-night advance deposit for all reservations. The deposit is refundable up to 3 days before arrival. Checks and major credit cards are acceptable to establish the deposit.

REQUESTS MUST BE RECEIVED BY: SATURDAY, JUNE 17, 1999

PLEASE RESERVE ____________ ROOM(S) OF THE TYPE CIRCLED ___________
ARRIVAL (DAY/DATE)______________ TIME ______________
DEPARTURE (DAY/DATE)____________ TIME ______________

Name of Person Requesting Rooms:
Last Name:____________________________
First Name:____________________________
Company Name:________________________
Institute:_______________________________
Street Address or PO Box Number:___________________________
City:___________________________________
State:___________________________________
Zip Code:________________________________
Area Code and Phone Number:______________________________

PERSONS SHARING ACCOMMODATIONS:
1. ____________________________
2. ____________________________
3. ____________________________

__ ADVANCE DEPOSIT (check)
__ AMERICAN EXPRESS
__ MASTERCARD
__ VISA
__ CARTE BLANCHE
__ DINERS
__ DISCOVER

Credit Card No. ______________________________________
Expiration Date:______________________________________

SCHEDULE OF RATES (CIRCLE 1)
Queen          1 person/1 bed    $119.00
Queen          2 persons/1 bed   $119.00
Double/Double  2 persons/2 beds  $119.00
Plus prevailing State Sales Tax (14%)
Additional person charge is $20.00

Cut-Off Date: June 26, 1999

Hilton HHonors Membership No. ______________________________________

**If a room is not available at the rate requested, the reservation will be made at the next available rate.
********************************************************************

CNS*99 MEETING AGENDA

SUNDAY, JULY 18, 1999

9:00 Welcoming Remarks and General Information

9:15 Featured Contributed Talk: Boris Gutkin (Center for Neuroscience at University of Pittsburgh), G. Bard Ermentrout: A Canonical Theory of Spike Generation in Cortical Neurons Accounts for Complex Neural Responses to Constant and Time-varying Stimuli

Contributed Talks

10:05 Frances Chance (Brandeis University), Sacha Nelson, and L. F. Abbott: Recurrent Cortical Amplification Produces Complex Cell Responses

10:25 Brian Blais (Brown University), Ann Besu, Harel Shouval, and Leon Cooper: Statistics of LGN Activity Determine the Segregation of ON/OFF Subfields for Simple Cells in Cortex

10:45 Alain Destexhe (Laval University), Denis Pare: Correlated Synaptic Bombardment in Neocortical Pyramidal Neurons in Vivo

11:05 Break

11:20 Reinoud Maex (University of Antwerp), Bart P. Vos and Erik De Schutter: Imprecise Spike Synchronization Reveals Common Excitation through Weak Distributed Synapses

11:40 Allan Coop (Rockefeller University), George Reeke: Simulating the Temporal Evolution of Neuronal Discharge

12:00 Gwendal le Masson (Institut François Magendie), Emmanuel Barbe, Valerie Morisset, and Frederic Nagy: From Current Clamp Experiments to Conductance-based Model Neurons: A Direct Link Using a New Error Function and Optimization Scheme

12:20 Lunch Break and Poster Preview Session A

2:20 Featured Contributed Talk: Rama Ratnam (University of Illinois), Mark Nelson: Impact of Afferent Spike Train Irregularity on the Detection of Weak Sensory Signals

Contributed Talks

3:10 Elise Cassidente (Carnegie Mellon University), Xiaogang Yan and Tai Sing Lee: A Bayesian Decision Approach to Decode Local and Contextual Signals from Spike Trains

3:30 Alexander Dimitrov (MSU Center for Computational Biology), John P. Miller: Natural Time Scale for Neural Encoding

3:50 Break

4:10 Kevin Otto (Arizona State University), Patrick Rousche and Daryl Kipke: Investigating Neural Coding and Plasticity in Auditory Cortex using Real-Time Feedback from Ensemble Neural Recordings

4:30 End of Day Announcements

8:00 Poster Session A

Pamela Abshire (Johns Hopkins University), Andreas Andreou: Relating Information Capacity to a Biophysical Model for Blowfly Retina

Paul Adams (SUNY Stony Brook), Kingsley Cox: Implications of Digital Synapses for Thalamocortical Function

Ildiko Aradi (Ohio University), William Holmes: Synchronized Oscillation in Networks of Hippocampal Dentate Gyrus Interneurons with Different Adaptation Properties

Per Aronsson (Royal Institute of Technology), Hans Liljenström: Electromagnetic Interaction in a Neural System

Giorgio A. Ascoli (George Mason University), Jeffrey L. Krichmar:
Krichmar L-Neuron: A Modeling Tool for the Efficient Generation and Parsimonious Description of Dendritic Morphology Davis Barch (University of California at Berkeley) Characterization of Static Input by Activity Oscillations in an Excitable Membrane Model: Effect of Input Size, Shape and Texture on Oscillation Parameters Hauke Bartsch (Technische Universität) The Influence of Threshold Variability on the Response of Visual Cortical Neurons John Beggs (Yale University) Why LTP and LTD are Asymmetric: A Bin Model to Explain Induction Parameters Alan Bond (California Institute of Technology) Problem-solving Behavior in a System Model of the Primate Neocortex Vladimir Bondarenko (University of Pittsburgh) Teresa Chay The Role of AMPA, GABA, [Ca2+]i, and Calcium Stores in Propagating Waves in Neuronal Networks Mihail Bota (USC Brain Project) Alex Guazzelli and Michael Arbib The Extended Taxon-Affordances Model: Egocentric Navigation and the Interactions between the Prefrontal Cortex and the Striatum Hans A. Braun (University of Marburg) Martin T. Huber, Mathias Dewald, Karlheinz Voigt, Alexander Neiman, Xing Pei, and Frank Moss A Computer-Model of Temperature Encoding in Peripheral Cold Receptors: Oscillations, Noise and Chaotic Dynamics Adam Briska (University of Wisconsin) Daniel Uhlrich and William Lytton Independent Dendritic Domains in the Thalamic Circuit Dyrk Brockmann (Max-Planck-Institut) Theo Geisel The Ecology of Gaze Shifts David Brown (The Babraham Institute) Jianfeng Feng Low Correlation between Random Synaptic Inputs Impacts Considerably on the Output of the Hodgkin-Huxley Model Emery N. Brown (Massachusetts General Hospital) Riccardo Barbieri, Michael C. Quirk, Loren M. Frank, and Matthew A. Wilson Constructing a Time-dependent Gamma Probability Model of Spatial Information Encoding in the Rat Hippocampus Nicolas Brunel (Brandeis University) Phase Diagrams of Sparsely Connected Networks of Excitatory and Inhibitory Spiking Neurons Anthony Burkitt (The Bionic Ear Institute) Graeme Clark Analysis of Synchronization in the Response of Neurons to Noisy Periodic Synaptic Input Anthony Burkitt (The Bionic Ear Institute) Interspike Interval Variability for Balanced Networks with Reversal Potentials for Large Numbers of Inputs Marcelo Camperi (University of San Francisco) Peter Pacheco, Nicola Rugai, and Toshiyuki Uchino NEUROSYS: An Easy-to-Use System for the Simulation of Very Large Networks of Biologically Accurate Neurons on Parallel Computers Marcelo Camperi (University of San Francisco) Nicola Rugai Modeling Dopaminergic Modulation of Delay-Period Activity in Prefrontal Cortex During Working Memory Processes Carmen Canavier (University of New Orleans) Reciprocal Excitatory Ohmic Synapses Convert Pacemaker-Like Firing into Burst Firing in a Simple Model of Coupled Neurons Gal Chechik (Tel-Aviv University) Isaac Meilijson and Eytan Ruppin Neuronal Normalization Provides Effective Learning through Ineffective Synaptic Learning Rules Yoonsuck Choe (The University of Texas at Austin) Risto Miikkulainen and Lawrence K. 
Cormack Effects of Presynaptic and Postsynaptic Resource Redistribution in Hebbian Weight Adaptation Carson Chow (University Of Pittsburgh) Nancy Kopell Dynamics of Spiking Neurons with Electrical Coupling Thomas Coates (Pennsylvania State University) Control and Monitoring of a Parallel Processed Neural Network via the World Wide Web Eyal Cohen (Tel-Aviv University) Nir Levy and Eytan Ruppin Global Versus Local Processing of Compressed Representations: A Computational Model of Visual Search Gennady S. Cymbalyuk (Emory University) Ronald L. Calabrese Oscillatory Behaviors in Pharmacologically Isolated Heart Interneurons from the Medicinal Leech Yue Dai (University of Manitoba) Kelvin Jones, Brent Fedirchuk, and Larry Jordan Regulation of the Action Potential Voltage Threshold in Cat Spinal Motoneurons during Fictive Locomotion David Field (Cornell University) Learning Wavelet-Like Receptive Fields from Natural Scenes with a Biologically Plausible De-correlation Network Marilene de Pinho (Universidade de São Paulo) Marcelo Mazza and Antonio Roque-da-Silva A Biologically Plausible Computer Simulation of Classical Conditioning Induced Reorganization of Tonotopic Maps in the Auditory Cortex Gustavo Deco (Siemens AG) Josef Zihl Neurodynamical Mechanism of Binding and Selective Attention for Visual Search Alain Destexhe (Laval University) Helmut Kroger Consequences of Correlated Synaptic Bombardment on Dendritic Integration in Neocortical Pyramidal Neurons Patricia M. Di Lorenzo (SUNY at Binghamton) Kurt Grandis and Christian Reich Stimulation of Sodium Channels in Taste Receptor Cells Provides Noise that Enhances Taste Detection L. M. Dobbs (Schafer Corporation) T. J. Manuccia and J. L. Murphy Planar Optically Switched Microelectrode Array (OSMA) Silke Dodel (Max Planck Institute) J. Michael Herrmann, Theo Geisel, and Jens Frahm Components of Brain Activity - Data Analysis for fMRI Gideon Dror (The Academic College of Tel-Aviv-Yaffo) Misha Tsodyks Activity of Coupled Excitatory and Inhibitory Neural Populations with Dynamic Synapses Gideon Dror (The Academic College of Tel-Aviv-Yaffo) Misha Tsodyks Chaotic Phenomena in Neural Populations with Dynamic Synapses Witali Dunin-Barkowski (Texas Tech University) Donald Wunsch Phase-Based Cerebellar Learning of Dynamic Signals Gaute T. Einevoll (Agricultural University of Norway) Paul Heggelund Mathematical Models for Spatial Receptive-Field Organization of dLGN Neurons in Cat Chris Eliasmith (Washington University in St. Louis) Charles H. Anderson Rethinking Central Pattern Generators: A General Framework Steven Epstein (Boston University) Jason Ritt, Yair Manor, Farzan Nadim, Eve Marder, and Nancy Kopell Network Oscillations Generated by Balancing Graded Asymmetric Reciprocal Inhibition in Passive Neurons T. Ghaffari Farazi (University of Southern California) J.-S. Liaw and T.W. Berger Functional Implications of Synaptic Morphology Stuart Feerick (The Babraham Institute) Jianfeng Feng and David Brown Random Pulse Input Versus Continuous Current plus White/Colored Noise: Are They Equivalent? 
Jianfeng Feng (The Babraham Institute) Stimulus-Evoked Oscillatory Synchronization in Neuronal Models Joseph Francis (George Washington University) Bruce Gluckman and Steven Schiff Deterministic Structure in Data from a Free Running Neuronal Ensemble: A Comparison of Three Non-linear Tests for Determinism Mark Fuhs (Carnegie Mellon University) David Touretzky Synaptic Learning Models of Map Separation in the Hippocampus Tomoki Fukai (Tokai University) Seinichi Kanemura Precisely-Timed Transient Synchronization by Synaptic Depression Ryuta Fukuda (Keio University) Satoshi Nagayasu, Junko Hara, William Shankle, Masaru Tomita Suggesting Human Cortical Connectivity for Language-related Areas and Simulations of its Computational Model Enrique Garibay (Brandeis University) Xiao-Jing Wang Information Transfer in a Model Neuron with Correlated Inputs Daniel Gill (The Hebrew University) Lidror Troyansky and Israel Nelken Auditory Localization using Direction-dependent Spectral Information Simon F. Giszter (MCPHU) William J. Kargo On-line Limb Trajectory Adaptation by Assembly and Control of Force-field Primitives Using Modular Encapsulated Feedback J. Randall Gobbel (Carnegie Mellon University) Reinforcement Learning in a Biophysical Model of Basal Ganglia-Neocortex Loops MONDAY, JULY 19, 1999 9:00 General Information 9:15 Featured Contributed Talk: Don Johnson (Rice University) Charlotte Gruner, Raymon Glantz Quantifying Information Transfer in Spike Generation Contributed Talks 10:05 Daniel Butts (Lawrence Berkeley National Laboratory) Daniel Rokhsar The Information Content of Spontaneous Retinal Waves 10:25 Mona Spiridon (EPFL, Swiss Federal Institute of Technology) Carson Chow and Wulfram Gerstner Signal Transmission through A Population of Integrate-and-fire Neurons 10:45 Paul H.E. Tiesinga (Salk Institute) Jorge V. Jose Driven by Inhibition 11:05 Break 11:20 Jianfeng Feng (The Babraham Institute) Stimulus-Evoked Oscillatory Synchronization in Neuronal Models 11:40 Charles Anderson (Washington University) Qingfeng Huang and John Clark Harmonic Analysis of Spiking Neuronal Ensembles 12:00 Steven Schiff (Krasnow Institute) David Colella Brain Chirps: Spectrographic Signatures of Epileptic Seizures 12:20 Vikaas Sohal (Stanford University School of Medicine) Molly Huntsman and John Huguenard Reciprocal Inhibitory Connections Produce Phase Lags That Desynchronize Intrathalamic Oscillations 12:40 Lunch Break and Poster Preview Session B 2:00 Featured Contributed Talk: Nancy Kopell (Boston University) Bard Ermentrout, Miles Whittington, and Roger Traub Gamma and Beta Rhythms Have Different Synchronization Properties Contributed Talks 2:50 Carson Chow (University of Pittsburgh) Carlo Laing and G. Bard Ermentrout Bump Solutions in a Network of Spiking Neurons 3:10 Christian W. 
Eurich (Institute of Theoretical Physics) Klaus Pawelzik, Udo Ernst, Jack Cowan, and John Milton Delay Adaptation in the Nervous System 3:30 Kay Robbins (University of Texas at San Antonio) David Senseman The Relationship of Response Latency to Modal Decomposition: Analysis of the Initial Spread of Visually-evoked Cortical Depolarization 3:50 Break 4:10 Ernst Niebur (Johns Hopkins University) Arup Roy, Peter Steinmetz, and Kenneth Johnson Model-free Detection of Synchrony in Neuronal Spike Trains, with an Application to Primate Somatosensory Cortex 4:30 Invited Talk: To be announced 5:20 End of Day Announcements 8:00 Poster Session B Mark Goldman (Brandeis University) Jorge Golowasch, Laurence Abbott, and Eve Marder Dependence of Firing Pattern on Intrinsic Ionic Conductances: Sensitive and Insensitive Combinations Anatoli Gorchetchnikov (Middle Tennessee State University) Introduction of Threshold Self-adjustment Improves the Convergence in Feature-detective Neural Nets Boris S. Gutkin (University of Pittsburgh) G. Bard Ermentrout and Joseph O'Sullivan Layer 3 Patchy Recurrent Excitatory Connections May Determine the Spatial Organization of Sustained Activity in the Primate Prefrontal Cortex Junko Hara (University of California at Irvine) Ryuta Fukuda, William Shankle, and James Fallon Estimating Cortical Connectivity from Statistical Properties of the Microscopic Features of the Developing Human Cerebral Cortex: Comparison to Contemporary Methods and Relevance to Computational Modeling Jeanette Hellgren Kotaleski (NADA) Patrik Krieger Simulation of Metabotropic Glutamate Receptor Induced Cellular and Network Effects in the Lamprey Locomotor CPG Jeanette Hellgren Kotaleski (NADA) Alexander Kozlov, Erik Aurell, Sten Grillner, and Anders Lansner Modeling of Plasticity of the Synaptic Connections in the Lamprey Spinal CPG - Consequences for Network Behavior Tim Hely (Santa Fe Institute) The Development of Corpus Callosum Connections in Primary Visual Cortex Alix Herrmann (Swiss Federal Institute of Technology) Wulfram Gerstner Effect of Noise on Neuron Transient Response J. Michael Herrmann (Max-Planck-Institut Fuer Stroemungsforschung) Klaus Pawelzik and Theo Geisel Learning Predictive Representations Claus C. Hilgetag (Newcastle University) Spatial Neglect and Paradoxical Lesion Effects in the Cat - A Model Based on Midbrain Connectivity Ulrich Hofmann (California Institute of Technology) Stephen Van Hooser, David Kewley, and James Bower Relationship between Field Potentials and Spike Activity in Rat S1: Multi-site Cortical Recordings and Simulations William Holmes (Ohio University) Comparison of CaMKinase II Activation in a Dendritic Spine Computed with Deterministic and Stochastic Models of the NMDA Synaptic Conductance Greg Hood (Pittsburgh Supercomputing Center) John Burkardt and Greg Foss Visualizing the Visual System David Horn (Tel Aviv University) Irit Opher Complex Dynamics of Neuronal Thresholds Osamu Hoshino (The University of Electro-Communications) Satoru Inoue, Yoshiki Kashimori, and Takeshi Kambara A Role of a Hierarchical Dynamical Map in Cognition and Binding Different Sensory Modalities Michael Howe (University of Texas at Austin) Risto Miikkulainen Hebbian Learning and Temporary Storage in the Convergence-Zone Model of Episodic Memory Fred Howell (University of Edinburgh) Jonas Dyhrfjeld-Johnsen, Reinoud Maex, Nigel Goddard, and Erik De Schutter A Large Scale Simulation Model of the Cerebellar Cortex using PGENESIS Martin T. Huber (University of Marburg) Jürgen C. 
Krieg, Hans A. Braun, Xing Pei, and Frank Moss Do Stochastic-dynamic Effects Contribute to the Progression of Mood Disorders: Implications from Neurodynamic Modelling Hidetoshi Ikeno (Himeji Institute of Technology) Shiro Usui Information Processing by Electro-diffusion in the Kenyon Cell Satoru Inoue (The University of Electro-Communications) Yoshiki Kashimori, Osamu Hoshino, and Takeshi Kambara A Neuronal Model of Nucleus Laminaris and Inferior Colliculus Detecting Microsecond Interaural Time Difference in Sound Localization David Jaffe (University of Texas at San Antonio) Nicholas Carnevale Morphological Determinants of Synaptic Integration Kelvin E. Jones (University of Manitoba) Kevin P. Carlin, Jeremy Rempel, Larry M. Jordan, and Rob M. Brownstone Dendritic Calcium Channels in Mouse Spinal Motoneurons: Implications for Bistable Membrane Properties Ranu Jung (University of Kentucky) David Magnuson Non-stationary Analysis of Extracellular Neural Activity Ranu Jung (University of Kentucky) Min Shao Robustness of the CGSA in Estimating the Hurst Exponent from Time Series with Fractal and Harmonic Components George Kalarickal (Massachusetts Institute of Technology) Jonathan Marshall Neural Model of Temporal and Stochastic Properties of Binocular Rivalry Takeshi Kambara (University of Electro-Communications) Yoshiki Kashimori A Positive Role of Noises in Accurate Detection of Time Difference by Electrosensory System of Weakly Electric Fish Jan Karbowski (Boston University) Nancy Kopell Multispikes and Synchronization in a Large Neural Network with Temporal Delays Matthias Kaschube (Max-Planck-Institut Fuer Stroemungsforschung) Fred Wolf, Theo Geisel, and Siegrid Loewel Quantifying the Variability of Patterns of Orientation Domains in the Visual Cortex of Cats Yoshiki Kashimori (Univ. of Electro-Communications) Takeshi Kambara A Role of Synaptic Variation Depending on Precise Timing of Pre- and Post-synaptic Depolarization in Electrolocation Adam Kepecs (Brandeis University) Xiao-Jing Wang An Analysis of Complex Bursting in Cortical Pyramidal Neuron Models Kim Blackwell (George Mason University) Characterization of the Light-induced Currents in Hermissenda D. O. Kim (Univ. Conn. Health Center) W. R. D'Angelo Computational Model for the Bushy Cell of the Cochlear Nucleus Wonryull Koh (Texas A&M University) Bruce H. McCormick Distributed, Web-based Microstructure Database for Brain Tissue Wonryull Koh (Texas A&M University) Bruce H. McCormick, William R. Shankle, and James H. Fallon Geometric Modeling of Local Cortical Networks Alexander Kozlov (Russian Academy of Science) Erik Aurell, Tatiana Deliagina, Sten Grillner, Jeanette Hellgren-Kotaleski, Grigory Orlovsky, and Pavel Zelenin Modeling Control of Body Orientation in the Lamprey Sarah Lesher (University of Maryland) Li Guan and Avis Cohen Symbolic Time Series Analysis of Neural Data Nir Levy (Tel Aviv University) David Horn and Eytan Ruppin Distributed Synchrony in an Attractor of Spiking Neurons Shu-Chen Li (Max Planck Institute for Human Development) Ulman Lindenberger and Peter A. 
Frensch Unifying Levels of Cognitive Aging: From Neurotransmission to Representation to Cognition Hualou Liang (Florida Atlantic University) Mingzhou Ding and Steve Bressler On the Tracking of Dynamic Functional Relations in Monkey Cerebral Cortex Christiane Linster (Boston University) Eve Derosa, Michaella Maloney, and Michael Hasselmo Selective Cholinergic Suppression of Pre-strengthened Synapses: A Mechanism to Minimize Associative Interference between Odors John E. Love (Florida Institute of Technology) Kathleen M. Johnson GRAVICOGNITOR: Toward a Hybrid Fuzzy Cellular Neural Network Based on the Cytoarchitectonics of Biological Gravity-Sensing Organ Ontogenesis Huo Lu (California Institute of Technology) James Bower Noradrenergic Modulation of Interneurons in the Cerebellar Cortex Malcolm A. MacIver (University of Illinois) Mark E. Nelson Evidence for Closed-Loop Control of Prey Capture in Weakly Electric Fish Norbert Mayer (MPI Fuer Stroemungsforschung) Michael Herrmann and Theo Geisel Receptive Field Formation in Binocularly Deprived Cats Marcelo Mazza (Universidade de São Paulo) Antonio Roque-da-Silva Realistic Computer Simulation of Cortical Lesion Induced Imbalances in Properties of Somatotopic Maps Bruce H. McCormick (Texas A&M University) Richard W. DeVaul, William R. Shankle, and James H. Fallon Modeling Neuron Spatial Distribution and Morphology in the Developing Human Cerebral Cortex Rebecca McNamee (University Of Pittsburgh) Mingui Sun and Robert Sclabassi Use of a Neuro-Fuzzy Inference System (NFIS) for Modeling the Physiologic System of Beat-By-Beat Cardiac Control: Comparison to an Auto-Regressive Moving Average (ARMA) Model Georgiy Medvedev (Boston University) Charles Wilson, Jay Callaway, and Nancy Kopell A Dopaminergic Neuron as a Chain of Oscillators: Analysis of Transient Dynamics Eduardo Mercado III (Rutgers University) Catherine E. Myers and Mark A. Gluck Modeling Auditory Cortical Processing as an Adaptive Chirplet Transform Eugene Mihaliuk (West Virginia University) Kenneth Showalter Entrainment with Hebbian Learning Octavian D. Mocanu (Universidad Autónoma de Barcelona) Joan Oliver, Fidel Santamaría, and James Bower A Passive Featuring of the Cerebellar Granule Cell (The Branching Point Hypothesis) Benoit Morel (Carnegie Mellon University) Biologically Plausible Learning Rules for Neural Networks and Quantum Computing John Nafziger (University of Pennsylvania) Leif Finkel A Stimulus Density-dependent Normalization Mechanism for Modulating the Range of Contour Integration TUESDAY, JULY 20, 1999 9:00 General Information 9:15 Featured Contributed Talk: Seth Wolpert (Penn State University-Harrisburg) W. Otto Friesen On the Parametric Stability of a Central Pattern Generator Contributed Talks 10:05 Robert Butera (Lab of Neural Control, NINDS, NIH) Sheree Johnson, Christopher Del Negro, John Rinzel, and Jeffrey Smith Dynamics of Excitatory Networks of Burst-capable Neurons: Experimental and Modeling Studies of the Respiratory Central Pattern Generator 10:25 Philip Ulinski (University of Chicago) Vivek Khatri Functional Significance of Interactions between Inhibitory Interneurons in Visual Cortex 10:45 Stella Yu (Carnegie Mellon University) Tai Sing Lee What do V1 Neurons Tell Us about Saccadic Suppression 11:05 Break 11:20 John A. White (Boston University) Matthew I. Banks, Nancy Kopell, and Robert A. 
Pearce A Novel Mechanism for Theta Modulation of Fast GABA_A Circuit Activity 11:40 Jeremy Caplan (Brandeis University) Michael Kahana, Robert Sekuler, Matthew Kirschen, and Joseph Madsen Task Dependence of Human Theta Oscillations during Virtual Maze Navigation 12:00 Albert Compte (Brandeis University) Nicolas Brunel and Xiao-Jing Wang Spontaneous and Spatially Tuned Persistent Activity in a Cortical Working Memory Model 12:20 Lunch Break and Poster Preview Session C 2:00 Featured Contributed Talk: A. D. Redish (University of Arizona) B. L. McNaughton and C. A. Barnes What Makes Place Cells Directional on the Linear Track? Contributed Talks 2:50 Victoria Booth (New Jersey Institute of Technology) Amitabha Bose and Michael Recce Hippocampal Place Cells and the Generation of Temporal Codes 3:10 Ali Minai (University of Cincinnati) Simona Doboli and Phillip Best A Comparison of Context-Dependent Hippocampal Place Codes In 1-Layer and 2-Layer Recurrent Networks 3:30 Mayank Mehta (Massachusetts Institute of Technology) Michael Quirk and Matthew Wilson From Hippocampus to V1: Effect of LTP on Spatio-Temporal Dynamics of Receptive Fields 3:50 Break 4:10 Jessica D. Bayliss (University of Rochester) Dana H. Ballard Single Trial P3 Epoch Recognition in a Virtual Environment 4:30 Geoffrey Goodhill (Georgetown University Medical Center) Andrei Cimponeriu Modeling the Joint Development of Ocular Dominance and Orientation Columns in Visual Cortex 4:50 Arjen van Ooyen (Netherlands Institute for Brain Research) David Willshaw Influence of Dendritic Morphology on Axonal Competition and Pattern of Innervation 5:20 End of Day Announcements 8:00 Poster Session C Hirofumi Nagashino (The University of Tokushima) Kazumi Achi and Yohsuke Kinouchi Synchronization with a Periodic Pulse Train in an Asymmetrically Coupled Neuronal Network Model Alexander Neiman (University of Missouri at St. Louis) Xing Pei, Frank Moss, Winfried Wojtenek, Lon Wilkens, Martin Huber, Mathias Dewald, and Hans Braun Spike Train Reveals Low Dimensional Deterministic Behaviors in a Hodgkin-Huxley Neuron with Intrinsic Oscillator Alexander Neiman (University of Missouri at St. Louis) Frank Moss, Pei Xing, David Russell, Winfried Wojtenek, Lon Wilkens, Hans Braun, and Martin Huber Synchronization of the Electroreceptors of the Paddlefish Alexander Neiman (University of Missouri at St. Louis) Ulrike Feudel, Xing Pei, Winfried Wojtenek, Frank Moss, Hans Braun, Mathias Dewald, and Martin Huber Global Bifurcations and Intermittency in a Hodgkin-Huxley Model of Thermally Sensitive Neurons Mike Neubig (Laval University) Alain Destexhe Are Inhibitory Synaptic Conductances on Thalamic Relay Neurons Inhomogeneous? Are Synapses from Individual Afferents Clustered? Duane Nykamp (New York University) Daniel Tranchina A Population Density Approach that Facilitates Large-scale Modeling of Neural Networks: Extension to Slow Inhibitory Synapses Hiroshi Okamoto (Fuji Xerox Co.) Tomoki Fukai A Model for A Cortical Mechanism to Store Intervals of Time Tim C. Pearce (University of Leicester) Odour to Sensor Space Transformations in Artificial and Biological Noses John Pezaris (California Institute of Technology) Maneesh Sahani and Richard Andersen Dynamics in LIP Spike Train Coherence Hans E. 
Plesser (Max-Planck-Institut for Fluid Dynamics) Wulfram Gerstner Escape Rate Models for Noisy Integrate-and-Fire Neurons Gregor Rainer (Massachusetts Institute of Technology) Earl Miller Neural Ensemble States in the Prefrontal Cortex during Free Viewing Identified using Hidden Markov Model Pamela Reinagel (Harvard Medical School) R. Clay Reid Reproducibility of Firing Patterns in the Thalamus David V. Reynolds (University of Windsor) Christopher Aswin A Large-Scale Neuroanatomical Model of Attention Implemented as a Computer Simulation Barry Richmond (National Institute of Mental Health) Mike Oram and Matthew Wiener The Random Origin of Precise Timing within Single Spike Trains during Pattern Recognition Dan Rizzuto (Brandeis University) Michael Kahana An Autoassociative Neural Network Model of Paired-associate Learning Patrick Roberts (Neurological Sciences Institute) Electrosensory Response Mechanisms in Mormyrid Electric Fish Bas Rokers (Rutgers University) Catherine Myers A Dynamic Model of Learning in the Septo-Hippocampal System Ilya A. Rybak (Drexel University) John K. Chapin, Allon Guez, and Karen A. Moxon Competition and Cooperation between the Automatic and Higher Order Voluntary/Behavioral Neural Mechanisms in the Brain Control of Movements Maneesh Sahani (California Institute of Technology) John Pezaris and Richard Andersen Short Discrete Epochs of Chirped Oscillatory Spiking in LIP Neurons Maneesh Sahani (California Institute of Technology) Jennifer Linden Doubly Stochastic Poisson Models for Smoothing and Clustering of Spike Trains Ko Sakai (RIKEN) Shigeru Tanaka Perceptual Segmentation and Neural Grouping in Tilt Illusion Anders Sandberg (Royal Institute of Technology) Anders Lansner, Karl-Magnus Peterson, Martin Ingvar, and Örjan Ekeberg A Palimpsest Memory Based on an Incremental Bayesian Learning Rule Fidel Santamaría (California Institute of Technology) Dieter Jaeger, James Bower, and Erik De Schutter Dendritic Temporal Integration Properties of a Purkinje Cell are Modulated by Background Activity: A Modeling Study William Rodman Shankle (University of California at Irvine) Junko Hara, James H. Fallon, A. Kimball Romney, John P. Boyd, Robert S. Sneddon, and Benjamin H. Landing Insights into the Structuring of the Cytoarchitecture of the Developing Postnatal Human Cerebral Cortex from Term Birth to Age 72 Months: Relevance to Computational Models and Access to the Data Gregory Smith (New York University) Charles Cox, S. Murray Sherman, and John Rinzel Spike-frequency Adaptation in Sinusoidally-driven Thalamocortical Relay Neurons Paul Smolen (The University of Texas) Douglas A. Baxter and John H. Byrne Biochemical Constraints on Realistic Models of Circadian Rhythms Friedrich T. Sommer (University of Ulm) On Cell Assemblies in a Cortical Column Sen Song (Brandeis University) Larry Abbott Temporally Asymmetric Hebbian Learning and Neuronal Response Variability Cristina Soto-Trevino (Brandeis University) L.F. Abbott and Eve Marder A Robust Network Based on Activity-dependent Regulation of Inhibitory Synaptic Conductances Bilin Zhang Stiber (University of California at Berkeley) Edwin R. Lewis, Michael Stiber, and Kenneth R. Henry Auditory Singularity Detection by a Gerbil Cochlea Model Katrin Suder (Ruhr-University Bochum) Florentin Woergoetter and Thomas Wennekers Neural Field Description of State-Dependent Visual Receptive Field Changes Mingui Sun (University of Pittsburgh) Robert J. 
Sclabassi A Novel Method to Salvage Clipped Multichannel Neurophysiological Recordings Krisztina Szalisznyo (The Hungarian Academy of Sciences) Peter Erdi Search for Resonators: Effects of Granule Cell Firing Properties on Temporal Patterns of the CA3 Pyramidal Cell David Tam (University of North Texas) A Spike Train Analysis for Detecting Spatio-temporal Integration in Neurons David Tam (University of North Texas) A Spike Train Analysis for Detecting Spatial Integration in Neurons Shoji Tanaka (Sophia University) Post-Cue Activity of Prefrontal Cortical Neurons Controlled by Local Inhibition Akaysha Tang (University of New Mexico) Barak Pearlmutter, Michael Zibulevsky, and Rebecca Loring Response Time Variability in the Human Sensory and Motor Systems Adam Taylor (University of California at San Diego) William B. Kristan, Jr. and Garrison W. Cottrell A Model of the Leech Segmental Swim Central Pattern Generator Peter Thomas (University of Chicago) Jack Cowan Spin Model for Orientation Map Via Reduction of Hebb's Rule Simon Thorpe (CERCO) Arnaud Delorme and Rufin van Rullen Real-time Simulation of Visual Processing with Millions of Spiking Neurons Paul H. E. Tiesinga (Salk Institute) Shuang Zhang and Jorge V. Jose Model of Carbachol-induced Gamma-frequency Oscillations in Hippocampus Wilson Truccolo-Filho (Florida Atlantic University) Mingzhou Ding and Steven L. Bressler Stability and Bifurcation Analysis of a Generic Cortical Area Model Philip Ulinski (University of Chicago) Zoran Nenadic and Bijoy Ghosh Spatiotemporal Dynamics in a Model of Turtle Visual Cortex Jean-Francois Vibert (Faculté de Médecine Saint-Antoine) Vincent Lagoueyte, Nicolas Bourrié, Gilles Fortin, and Jean Champagnat Modeling the Primitive Rhythmic Neural Network of the Chicken Embryo Gene Wallenstein (University of Utah) Michael Hasselmo Septal Modulation of Hippocampal Firing Fields and Flexible Spatial Learning Heng Wang (University of Kentucky) Ranu Jung Effects of Supraspinal-Spinal Loops on the Dynamic Evolution of Fictive Locomotion Stuart Washington (The Krasnow Institute) Giorgio Ascoli and Jeffrey Krichmar Statistical Analysis of Dendritic Morphology's Effect on CA3 Pyramidal Cell Electrophysiology Thomas Wennekers (MPI for Mathematics in the Sciences) Dynamics of Spatio-temporal Patterns in Associative Networks of Spiking Neurons Matthew Wiener (NIMH, NIH) Mike Oram and Barry Richmond Assessing Information Processing in V1 Neurons across Stimulus Sets Russell Witte (Arizona State University) Daryl Kipke Evidence of Competing Neural Assemblies in Background Activity of Neurons in Auditory Cortex of Awake Guinea Pig Xiangbao Wu (University of Virginia) Sean Polyn and William Levy Entorhinal/dentate Excitation of CA3: A Critical Variable in Hippocampal Models? 
Xiangbao Wu (University of Virginia) Aaron Shon and William Levy Using Computational Simulations to Discover Optimal Training Paradigms Jian Zhang (Brandeis University) Larry Abbott Gain Modulation of Recurrent Networks Ying Zhou (Rhode Island College) Walter Gall An Organizing Center for Planar Neural Excitability WEDNESDAY, July 21, 1999 9:30 General Information 9:45 Featured contributed talk: Sharon Crook (Montana State University) Gwen Jacobs, Kelli Hodge, and Cooper Roddey Dynamic Patterns of Activation in a Neural Map in the Cricket Cercal Sensory System Contributed Talks 10:35 Richard Zemel (University of Arizona) Jonathan Pillow Encoding Multiple Orientations in a Recurrent Network 10:55 Dana Ballard (University of Rochester) Rajesh Rao A Single-spike Model of Predictive Coding 11:15 Invited Presentation: To be announced 11:50 Federal Funding Opportunities 12:30 Lunch Break 2:00 Workshops I - Organization 9:30 Rock and Roll Jam Session THURSDAY, July 22, 1999 9:30 General Information 9:45 Featured Contributed Talk: Nathaniel D. Daw (Carnegie Mellon University) David S. Touretzky Behavioral Results Suggest an Average Reward TD Model of Dopamine Neurons Contributed Talks 10:35 Christiane Linster (Boston University) Michael Hasselmo How Cholinergic Modulation Influences Generalization between Similar Odorants: Behavioral Predictions from A Computational Model 10:55 Elliot D. Menschik (University of Pennsylvania) Leif H. Finkel Cholinergic Neuromodulation of an Anatomically Reconstructed Hippocampal CA3 Pyramidal Cell 11:15 Invited Presentation: To be announced 12:25 Business Meeting 1:00 Lunch Break 2:30 Workshops II - Organization 6:30 Banquet - Banquet River Cruise From jbower at bbb.caltech.edu Thu Jun 10 22:49:13 1999 From: jbower at bbb.caltech.edu (James M. Bower) Date: Thu, 10 Jun 1999 19:49:13 -0700 Subject: Travel Grants for CNS*99 Message-ID: ********* TRAVEL GRANTS ********* FOR THE UPCOMING EIGHTH ANNUAL COMPUTATIONAL NEUROSCIENCE MEETING (CNS*99) July 18 - 22, 1999 Pittsburgh, Pennsylvania We have just learned that travel grants for students and postdoctoral fellows will be available for the upcoming CNS meeting. While priority for support will go to those who are listed as authors on CNS papers, we anticipate also being able to support some graduate students and postdoctoral fellows who want to attend the meeting without presenting a paper. Information on the meeting agenda, registration, etc. can be found at: http://cns.numedeon.com/cns99/ Please note that conference rates at the hotel and assured room availability require that you make reservations at the Pittsburgh Hilton and Towers BEFORE JUNE 17th. Please contact the hotel directly for reservations and make sure to mention that you are attending CNS*99. The Pittsburgh Hilton and Towers: (412) 391-4600 From zhaoping at gatsby.ucl.ac.uk Sun Jun 13 16:21:10 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Sun, 13 Jun 1999 21:21:10 +0100 Subject: Positions available in computational vision Message-ID: <199906132021.VAA05253@flies.gatsby.ucl.ac.uk> Ph.D. student/postdoctoral positions in computational vision The Gatsby Computational Neuroscience Unit at University College London seeks applicants for a position as a Ph.D. student or postdoctoral researcher in computational vision. Candidates should have a strong analytical background and a keen interest in neuroscience and/or psychophysics. 
Applicants should send a detailed CV, a statement of study/research interests, and names and addresses of 3 references to zhaoping at gatsby.ucl.ac.uk (email applications preferred) or Zhaoping Li, 17 Queen Square, London, WC1N 3AR, UK. For more information on the Gatsby Unit see http://www.gatsby.ucl.ac.uk, and on Zhaoping Li's lab see http://www.gatsby.ucl.ac.uk/~zhaoping From andre at physics.uottawa.ca Mon Jun 14 17:23:36 1999 From: andre at physics.uottawa.ca (Andre Longtin) Date: Mon, 14 Jun 99 17:23:36 EDT Subject: postdoctoral position Message-ID: <9906142123.AA16516@miro.physics.uottawa.ca.physics.uottawa.ca> POSTDOCTORAL POSITION IN NEURONAL MODELING/NONLINEAR DYNAMICS The Physics Department of the University of Ottawa has an immediate opening for a postdoctoral position in Neuronal Modeling and Nonlinear Dynamics. The research will focus mainly on the role of feedback in sensory systems, with an emphasis on electrosensory systems. The position is for one year, renewable for a second year. Applications will be accepted until the position is filled. Candidates must be within four years of having completed their doctoral degree. Candidates must have demonstrated excellence in research, and possess a strong background in neuronal modeling and nonlinear dynamics; those whose training is more slanted towards either of these areas will also be considered. Candidates wishing to carry out a blend of experimental work and modeling/computational studies are strongly encouraged to apply; experiments would be carried out in Prof. Len Maler's laboratory in the Faculty of Medicine at the University of Ottawa. The salary of $31,000 CAN per year conforms with current NSERC guidelines; additional support for moving and conference travel is also available. A second similar postdoctoral position will become available with a starting date around January 2001. The National Capital Region of Canada, with a population around one million, is the home of the Federal Government, the National Research Council, and many other governmental research laboratories. Also known as Silicon Valley North, the region has a very high concentration of high-tech companies. Its National Art Center, Rideau Canal, museums, cafes, numerous cultural festivals, and its close proximity to hiking, swimming, canoeing and skiing areas make Ottawa a most enjoyable place to live. Applicants should send a CV and a brief statement of research interests by regular mail or email (postdoc99 at miro.physics.uottawa.ca), and arrange for a minimum of two letters of recommendation to be sent to: Prof. Andre Longtin Physics Department University of Ottawa 150 Louis Pasteur Ottawa, Ont. Canada K1N 6N5 tel: 613-562-5800 ext.6762 fax: 613-562-5190 From marks at maxwell.ee.washington.edu Wed Jun 16 01:20:45 1999 From: marks at maxwell.ee.washington.edu (Robert J. Marks II) Date: Tue, 15 Jun 1999 22:20:45 -0700 Subject: New Book: "Neural Smithing" (MIT Press - 1999) Message-ID: <3.0.1.32.19990615222045.006b57b4@maxwell.ee.washington.edu> NEW BOOK: Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks Russell D. Reed & Robert J. Marks II (MIT Press, 1999). _____________________ REVIEW "I have added a new book to the list of "The best elementary textbooks on practical use of NNs" in the NN FAQ ..." "Reed, R.D., and Marks, R.J, II (1999), Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks, Cambridge, MA: The MIT Press, ISBN 0-262-18190-8. 
"After you have read Smith (1993) or Weiss and Kulikowski (1991), Reed and Marks provide an excellent source of practical details for training MLPs. They cover both backprop and conventional optimization algorithms. Their coverage of initialization methods, constructive networks, pruning, and regularization methods is unusually thorough. Unlike the vast majority of books on NNs, this one has lots of really informative graphs..." Warren S. Sarle, SAS Institute Inc. on . ______________________ Contents: Artificial neural networks are nonlinear mapping systems whose structure is loosely based on principles observed in the nervous systems of humans and animals. The basic idea is that massive systems of simple units linked together in appropriate ways can generate many complex and interesting behaviors. This book focuses on the subset of feedforward artificial neural networks called multilayer perceptions (MLP). These are the most widely used neural networks, with applications as diverse as finance (forecasting), manufacturing (process control), and science (speech and image recognition). This book presents an extensive and practical overview of almost every aspect of MLP methodology, progressing from an initial discussion of what MLPs are and how they might be used to an in-depth examination of technical factors affecting performance. The book can be used as a tool kit by readers interested in applying networks to specific problems, yet it also presents theory and references outlining the last ten years of MLP research. Table of Contents Preface 1 Introduction 1 2 Supervised Learning 7 3 Single-Layer Networks 15 4 MLP Representational Capabilities 31 5 Back-Propagation 49 6 Learning Rate and Momentum 71 7 Weight-Initialization Techniques 97 8 The Error Surface 113 9 Faster Variations of Back-Propagation 135 10 Classical Optimization Techniques 155 11 Genetic Algorithms and Neural Networks 185 12 Constructive Methods 197 13 Pruning Algorithms 219 14 Factors Influencing Generalization 239 15 Generalization Prediction and Assessment 257 16 Heuristics for Improving Generalization 265 17 Effects of Training with Noisy Inputs 277 A Linear Regression 293 B Principal Components Analysis 299 C Jitter Calculations 311 D Sigmoid-like Nonlinear Functions 315 References 319 Index 339 Ordering information: 1. MIT Press http://mitpress.mit.edu/book-home.tcl?isbn=0262181908 2. amazon.com http://www.amazon.com/exec/obidos/ASIN/0262181908/qid%3D909520837/sr%3D1-21/ 002-3321940-3881246 3. Barnes & Nobel http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=1KKG10OPZT& mscssid=A7M4XXV5DNS12MEG00CGNDBFPT573NJS&pcount=&isbn=0262181908 4. buy.com http://www.buy.com/books/product.asp?sku=30360116 From harnad at coglit.ecs.soton.ac.uk Wed Jun 16 17:36:27 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Wed, 16 Jun 1999 22:36:27 +0100 (BST) Subject: EEG and Neocortical Function: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article: NEOCORTICAL DYNAMIC FUNCTION AND EEG by Paul L. Nunez This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. 
To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL by May 14th to: bbs at cogsci.soton.ac.uk or write to [PLEASE NOTE SLIGHTLY CHANGED ADDRESS]: Behavioral and Brain Sciences ECS: New Zepler Building University of Southampton Highfield, Southampton SO17 1BJ UNITED KINGDOM http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/ ftp://ftp.princeton.edu/pub/harnad/BBS/ ftp://ftp.cogsci.soton.ac.uk/pub/bbs/ gopher://gopher.princeton.edu:70/11/.libraries/.pujournals If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract. _____________________________________________________________ TOWARD A QUANTITATIVE DESCRIPTION OF LARGE SCALE NEOCORTICAL DYNAMIC FUNCTION AND EEG. Paul L. Nunez Permanent Address: Brain Physics Group, Dept. of Biomedical Engineering, Tulane University, New Orleans, Louisiana 70118 pnunez at mailhost.tcs.tulane.edu Temporary Address (6/98 - 6/00): Brain Sciences Institute, Swinburne University of Technology, 400 Burwood Road, Melbourne, Victoria 3122, Australia pnunez at mind.scan.swin.edu.au ABSTRACT: A conceptual framework for large-scale neocortical dynamic behavior is proposed. It is sufficiently general to embrace brain theories applied to different experimental designs, spatial scales and brain states. This framework, based on the work of many scientists, is constructed from anatomical, physiological and EEG data. Neocortical dynamics and correlated behavioral/cognitive brain states are viewed in the context of partly distinct, but interacting local (regionally specific) processes and globally coherent dynamics. Local and regional processes (eg, neural networks) are enabled by functional segregation; global processes are facilitated by functional integration. Global processes can also facilitate synchronous activity in remote cell groups (top down) which function simultaneously at several different spatial scales. At the same time, local processes may help drive (bottom up) macroscopic global dynamics observed with EEG (or MEG). A specific, physiologically based local/global dynamic theory is outlined in the context of this general conceptual framework. It is consistent with a body of EEG data and fits naturally within the proposed conceptual framework. The theory is incomplete since its physiological control parameters are known only approximately. Thus, brain state-dependent contributions of local versus global dynamics cannot be predicted. It is also neutral on properties of neural networks, assumed to be embedded within macroscopic fields. Nevertheless, the purely global part of the theory makes qualitative, and in a few cases, semi-quantitative predictions of the outcomes of several disparate EEG studies in which global contributions to the dynamics appear substantial. 
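[Illustrative aside, not part of the target article: the flavor of the "pure global" frequency prediction can be conveyed by a back-of-envelope standing-wave estimate whose numbers are assumptions. If cortico-cortical axons carry activity at conduction velocity v around an effectively closed cortical medium of loop length L, periodic boundary conditions admit standing-wave modes with frequencies of roughly

    f_n \approx \frac{n \, v}{L}, \qquad n = 1, 2, \ldots

Taking, for example, v \approx 7 \ \mathrm{m/s} and L \approx 0.7 \ \mathrm{m} gives f_1 \approx 10 \ \mathrm{Hz}, a fundamental in the alpha band; this is the sense in which axonal delays combined with boundary conditions can set global EEG frequencies.]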
Experimental data are used to obtain a variety of measures of traveling and standing wave phenomena, predicted by the pure global theory. The more general local/global theory is also proposed as a "meta-theory," a suggestion of what large-scale quantitative theories of neocortical dynamics may be like when more accurate treatment of local and non-linear effects is achieved. In the proposed local/global theory, the dynamics of excitatory and inhibitory synaptic action fields are described. EEG and MEG are believed to provide large-scale estimates of modulation of these synaptic fields about background levels. Brain state is determined by neuromodulatory control parameters. Some states are dominated by local cell groups, in which EEG frequencies are due to local feedback gains and rise and decay times of post-synaptic potentials. Local frequencies vary with brain location. Other states are strongly global, with multiple, closely spaced EEG frequencies, but identical at each cortical location. Coherence at these frequencies is high over large distances. The global mode frequencies are due to a combination of delays in cortico-cortical axons and neocortical boundary conditions. Many states involve dynamic interactions between local networks and the global system, in which case observed EEG frequencies may involve "matching" of local resonant frequencies with one or more of the global frequencies. KEYWORDS: EEG, neocortical dynamics, standing waves, functional integration, spatial scale, binding problem, synchronization, coherence, cell assemblies, limit cycles, pacemakers ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.nunez.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.nunez ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.nunez *** FIVE IMPORTANT ANNOUNCEMENTS *** ------------------------------------------------------------------ (1) There have been some very important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. 
Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/ -------------------------------------------------------------------- (4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/ INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html --------------------------------------------------------------------- (5) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). From Gerhard.Paass at gmd.de Wed Jun 16 08:31:54 1999 From: Gerhard.Paass at gmd.de (Gerhard Paass) Date: Wed, 16 Jun 1999 14:31:54 +0200 Subject: CFP: Workshop Neural Networks and Connectionism, Magdeburg Message-ID: <3767993A.DF7290F7@gmd.de> [ We apologise if you should receive this message more than once ] ANNOUNCEMENT AND CALL FOR PAPERS Workshop on Neural Networks and Connectionism Magdeburg, Germany, 29.Sep.1999 The meeting of the working group 1.1.2 "Connectionism" of the German Society of Computer Science (GI) takes place in Magdeburg during the GI-Workshop-Days Learning, Knowledge Discovery, and Adaptivity. It is devoted to the discussion of new trends and ongoing research projects in the areas of connectionism and neural networks. TOPICS We want to discuss papers covering empirical, theoretical or interdisciplinary topics from connectionism and neural networks, especially * Theory: Prediction and generalization, regularization, computational learning theory, support vector machines, approximation and estimation, learning in dynamical systems. * Algorithms and Architectures: supervised and unsupervised learning, model selection, feedforward and recurrent architectures, hybrid symbolic-subsymbolic approaches. * Knowledge discovery and Adaptivity: active learning, reinforcement learning, Markovian state estimation, novelty detection, information content, time-varying systems. * Applications: medical diagnosis, data mining, expert systems, financial predictions, time series analysis, information retrieval, etc. 
Deadline for contributions: July 31, 1999 For details see http://ais.gmd.de/~paass/nn99 From M.Usher at ukc.ac.uk Thu Jun 17 07:25:05 1999 From: M.Usher at ukc.ac.uk (M.Usher) Date: Thu, 17 Jun 1999 12:25:05 +0100 Subject: 1 year research position Message-ID: A research associate is required to assist in the running of a BBSRC funded project titled 'The role of VISUAL SYNCHRONY in perceptual organisation: studies in human psychophysics and computational methods.' The research programme follows up on studies described in a recent paper (Usher & Donnelly, 1998, Nature, pp. 179-182) investigating spatio-temporal interactions in visual grouping as a means for exploring mechanisms for GROUPING based on neural synchrony. The award was made to Dr Nick Donnelly (University of Southampton) and to Dr Marius Usher (Birkbeck College, University of London). The project will run for twelve months in the first instance (and will be performed either in London or in Southampton). It is intended that further funds will be sought during this first year to continue the research. The ideal candidate will have a background in visual psychophysics and should have experience programming in C on Unix workstations (experience with Silicon Graphics graphical libraries and/or OpenGL will be helpful). The salary will be based on the research associate scale and will be up to BP 16,927 (i.e., approx. $27,920) per annum. For further information contact either Nick Donnelly (email: n.donnelly at ukc.ac.uk) or Marius Usher (email: m.usher at ukc.ac.uk http://www.ukc.ac.uk/psychology/people/usherm/ ) Deadline for applications: July 1st 1999. Marius Usher Lecturer in Psychology and Cognitive Neuroscience From moeller at ifi.unizh.ch Fri Jun 18 11:11:17 1999 From: moeller at ifi.unizh.ch (Ralf Moeller) Date: Fri, 18 Jun 1999 17:11:17 +0200 Subject: Ph.D. student position synthetic modeling/biorobotics Message-ID: <376A6194.4750EE96@ifi.unizh.ch> ------------------------------------------ Position for a Ph.D. student in *Synthetic modeling / Biorobotics* at the AILab, University of Zurich ------------------------------------------ A new Ph.D. student position is open at the Artificial Intelligence Laboratory, Dept. of Computer Science of the University of Zurich. Availability: Immediately or at earliest convenience. Research on this position will focus on "Synthetic modeling" or "Biorobotics". "Synthetic modeling" is a novel biological methodology for gaining insights into the mechanisms underlying the behavior of biological agents. Models developed to explain the animal's abilities are implemented on an artificial agent and validated by observing the behavior and the internal states of the robot. At the same time, the method will lead to new solutions for robotics applications. The specific goal of the project is to improve our understanding of the impressive visual navigation abilities of insects and to apply this knowledge to enable robots to safely navigate in complex environments. This will require scaling up of previous theoretical and practical work. If the above challenges capture your interest, and you would like to become a member of an international research team conducting interdisciplinary work, submit a curriculum vitae, statement of research interests, and names of three referees to: Corinne Maurer Dept. 
of Computer Science University of Zurich Winterthurerstrasse 190 CH-8057 Zurich, Switzerland E-mail: maurer at ifi.unizh.ch Phone: 41-1-63 54331 Fax: 41-1-63 56809 For details on the research subject, contact: Ralf Moeller Email: moeller at ifi.unizh.ch Profile: Applicants should have an M.Sc., a university Diploma, or a similar degree, in one of the following areas: computer science, electrical or mechanical engineering, biology, neurobiology, physics, mathematics, or related disciplines. He/she should have good programming skills (C, C++, etc.) and preferably experience with robot control, image processing, and electronics. Tasks: The main task for the accepted candidate will be to conduct research towards his/her Ph.D. Additional tasks include support for classes organized by the AI-Lab as well as other administrative tasks required by the computer science department. Salary: The salary will be according to the specification of the Swiss National Science Foundation. Time prospect: The candidate is expected to complete his/her Ph.D. work within a period of maximum 4 years. From l.s.smith at cs.stir.ac.uk Fri Jun 18 08:34:16 1999 From: l.s.smith at cs.stir.ac.uk (Dr L S Smith) Date: Fri, 18 Jun 99 13:34:16 +0100 Subject: 2nd European Workshop on Neuromorphic Systems: Call for participation Message-ID: <199906181234.NAA09196@tinker.cs.stir.ac.uk> (Apologies if you receive this more than once) Call for Participation: 2nd European Workshop on Neuromorphic Systems (EWNS2) 3-5 September 1999, Cottrell Building, University of Stirling, Stirling, Scotland Neuromorphic systems are implementations in silicon of sensory and neural systems whose architecture and design are based on neurobiology. The area is at the intersection of many disciplines: neurophysiology, computer science and electrical engineering. Registration Forms and Further Information are available from the WWW page http://www.cs.stir.ac.uk/EWNS2 _____________________ Provisional Programme Friday September 3 0900-1030: Registration and Coffee 1030-1115: Pedro Marijuan, Dept. Ingen. Electronica y Comunicaciones, Universidad de Zaragoza, Spain: From Darwin to Cajal: The Quest for a Neurodynamic Optimization Principle Session 1: General Papers 1115-1145: Barbara Webb, University of Stirling: A Framework for Models of Biological Behaviour 1145-1215: Catherine Breslin and Leslie Smith, University of Stirling: Silicon Cellular Morphology 1215-1245: J Love, Florida Institute of Technology, Melbourne, Florida USA and K M Johnson National Research Council, NASA Kennedy Space Center, Florida USA: Towards Evolvable Neuromorphic Systems: Adaptive Ontogenetic Engineering of Artificial Sensorineural Vestibular Organs 1245-1400: Lunch Session 2: Auditory I 1400-1445 Simon Jones, Dept. of Electrical Engineering, University of Loughborough, England: (title to be announced) 1445-1515: Mete Erturk, C P Brown, D J Klein and S A Shamma, Institute for Systems Research and Dept. 
of Electrical Engineering, University of Maryland, USA: A Neuromorphic Approach to the Analysis of Monaural and Binaural auditory Signals 1515-1545: Amir Hussain and Douglas R Campbell, Dept of Applied Computing, University of Dundee, Scotland: Speech Intelligibility - Improvements Using a Binaural Adaptive-Scheme Based Conceptually on the Human Auditory System 1545-1610: Tea Session 3: Vision I 1610-1640 Tobi Delbrück, Institute for Neuroinformatics, Zurich, Switzerland: Three Silicon Retinas for Simple Consumer Applications 1640-1710 Seiji Kameda, Akira Honda, Tetsuya Yagi, Faculty of Computer Science and Systems Engineering, Kyushu Institute of Technology, Fukuoka, Japan: Real Time Image Processing with an Analog Vision Chip System 1930 Wine and Cheese Reception in the Atrium _________________________________________________________ Saturday September 4 Session 4: Auditory II 0900-0945: Andre van Schaik, Craig Jin and Simon Carlile, University of Sydney, Australia: Human Localisation of Band-Pass Filtered Noise 0945-1015: Amir Hussain, Dept of Applied Computing, University of Dundee, Scotland: Binaural Neural-Network Based Sub-Band Processing of Noisy Speech Signals 1015-1045: Sofia Cavaco, Universidade Nova de Lisboa and John Hallam, University of Edinburgh: A Biologically Plausible Acoustic Motion Detection System for a Robotic Cat 1045-1115: Coffee Session 5: Vision II 1115-1145: E Ros, F J Pelayo, D Palomar, I Rojas, J L Bernier and A Prieto, Dept of Architecture and Technology of Computers, University of Granada, Spain: Stimulus Correlation and Adaptive Local Motion Detection 1145-1215: Reid R Harrison and Christof Koch, Computation and Neural Systems Program, California Institute of Technology, USA: An Analog VLSI Implementation of a Visual Interneuron: Enhanced Sensory Processing through Biophysical Modelling 1215-1245: R Timothy Edwards, Johns Hopkins University, Applied Physics Laboratory, Maryland, USA: Acoustic Transient Classification with a Template Correlation Processor 1245-1400: Lunch 1400-1445 Avis Cohen, Dept of Biology and Neuroscience and Cognitive Science, University of Maryland, USA: (title to be announced) Session 6: Robotics 1445-1515 Timothy Chapman and Barbara Webb, University of Stirling: A Neuromorphic Hair Sensor Model of Wind-Mediated Escape in the Cricket 1515-1545: R Mudra, R Hahnloser and R J Douglas, Institute for Neuroinformatics, Zurich, Switzerland: Integrating neuromorphic action-oriented perceptual inputs to generate a navigation behaviour for a robot 1545-1615 Coffee 1615-1645 Mark Blanchard, P F M J Verschure, Institute of Neuroinformatics, Zurich, Switzerland and F C Rind, Dept of Neurobiology, University of Newcastle upon Tyne, England: Using Mobile Robots to Study Locust Collision Avoidance Responses 1645-1715 Ralf Moeller, Dept of Computer Science and Dept of Zoology, University of Zurich, Switzerland: Visual Homing in Analog Hardware 1915: Conference Dinner _______________________________________________ Sunday September 5 Session 7 Hardware 1000-1045: Shih-Chii Liu, Institute of Neuroinformatics, Zurich, Switzerland: (title to be announced) 1045-1115: Peter Paschke and Carsten Schauer, Technical University of Ilmenau, Germany: A Spike-Based Model of Binaural Sound Localization 1115-1145: Cyprian Grassmann and Joachim K Anlauf, University of Bonn, Germany: Fast Digital Simulation of Spiking Neural Networks and Neuromorphic Integration with SPIKELAB 1145-1215: B E Eriksson, L S Smith, University of Stirling, M Glover, DERA, A Hamilton, University of 
Edinburgh, Scotland: SPIKE II: An integrate-and-fire aVLSI chip. 1215-1315: Lunch 1315-1345: Best Paper Prize 1345-1500: Panel Discussion: Neuromorphic Systems - The Ways Forward Presentation on EPSRC Silicon and Neurobiology Network (L.S. Smith) 1500: Tea and Close of Conference We gratefully acknowledge the assistance of the Gatsby Charitable Foundation. From j.v.stone at sheffield.ac.uk Sat Jun 19 10:19:30 1999 From: j.v.stone at sheffield.ac.uk (Jim Stone) Date: Sat, 19 Jun 1999 15:19:30 +0100 Subject: Research Assistant Message-ID: ----------------------------------------------------------------------- Research Assistant in Functional Decomposition and Analysis of Brain Activity ------------------------------------------------------------------------ DEPARTMENT OF PSYCHOLOGY, UNIVERSITY OF SHEFFIELD, UK. Applications are invited for a research assistant, to commence before October 1st 1999 (the contract will end on May 31st 2002). The successful candidate should have a solid theoretical background from a numerate discipline (e.g. mathematics, control engineering). We would also consider candidates from other backgrounds with experience of image analysis/signal processing. Post-doctoral candidates are preferred, but graduates with appropriate skills will also be considered. The research assistant will work with Dr John Porrill and Dr Jim Stone, developing and applying novel techniques for analysis of brain activity using data obtained from fMRI, optical imaging and EEG. The candidate will also have the opportunity to design and run fMRI/EEG experiments. This research is funded as one component of a co-operative MRC award, and will involve collaboration with other groups working on fMRI and optical imaging techniques. The salary is up to 21,815 pounds (RA1A scale, point 11) according to experience. This interdisciplinary project is shared between the Departments of Psychology and Clinical Neurology. The project is based in the Department of Psychology, which is one of the leading research centres for psychology in the UK, and has a rapidly expanding research base in imaging technologies. This is reflected in the top ratings (currently Grade 5A) achieved by the department in all four national research assessment exercises. Applicants should send a CV and a brief statement of research interests by regular mail or email to: Dr JV Stone, Psychology Department, Sheffield University, Sheffield, S10 2TP, UK. Informal enquiries can be made via email to: {j.porrill}{j.v.stone}@Sheffield.ac.uk Recent papers relevant to this project can be obtained from: http://www.shef.ac.uk/~pc1jvs/ ----------------------------------------------------------------------- From ESANN at dice.ucl.ac.be Mon Jun 21 03:49:35 1999 From: ESANN at dice.ucl.ac.be (ESANN) Date: Mon, 21 Jun 1999 09:49:35 +0200 Subject: ESANN'2000: European Symposium on Artificial Neural Networks Message-ID: <000601bebbba$9b5cbd50$5aed6882@natacha.dice.ucl.ac.be> ***************************************************** Call for special sessions European Symposium on Artificial Neural Networks Bruges (Belgium), April 26-27-28, 2000 ***************************************************** (We apologize for duplicates of this email) The preliminary announcement for the ESANN'2000 conference is available from the WWW page http://www.dice.ucl.ac.be/esann The call for papers will be published soon on this page. We are now waiting for proposals/suggestions to organize special sessions during the conference.
A description of special sessions is available from the above link. For those who have never attended ESANN, the programmes (and lists of committee members) of the former conferences are also available from the above link. DEADLINE for proposals to organize special sessions: July 15, 1999. ===================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat D facto conference services 27 rue du Laekenveld - B-1080 Brussels - Belgium tel: + 32 2 420 37 57 - fax: + 32 2 420 02 55 mailto:esann at dice.ucl.ac.be ===================================================== From lorincz at iserv.iki.kfki.hu Tue Jun 22 12:58:42 1999 From: lorincz at iserv.iki.kfki.hu (Andras Lorincz) Date: Tue, 22 Jun 1999 18:58:42 +0200 (MET) Subject: Post-doctoral position in electrophysiology Message-ID: A post-doctoral position is available for 3 years from the 1st of January 2000 in the field of visual electrophysiology. The project is on the coding of various depth (3D) cues by inferior temporal neurons. The work involves behavioral training of rhesus monkeys and recording of single cortical neurons in the awake monkey. For more information and applications contact: Dr. Rufin Vogels email: Rufin.Vogels at med.kuleuven.ac.be http://www.kuleuven.ac.be/facdep/medicine/dep_neu/neufys/ Lab. Neuro- en Psychofysiologie Fac. Geneeskunde Onderwijs en Navorsing Campus Gasthuisberg B-3000 Leuven Belgium Tel: +32 16 345857 From dummy at ultra3.ing.unisi.it Wed Jun 23 07:20:45 1999 From: dummy at ultra3.ing.unisi.it (Paolo Frasconi) Date: Wed, 23 Jun 1999 13:20:45 +0200 (MET DST) Subject: REMINDER: Special Issue on Learning in Structured Domains Message-ID: REMINDER: Submission deadline, IEEE TKDE Special Issue Electronic abstracts due: July 15, 1999 Submissions due: July 30, 1999 Special issue on Connectionist Models for Learning in Structured Domains IEEE Transactions on Knowledge and Data Engineering BACKGROUND Structured representations are ubiquitous in different fields such as knowledge representation, language modeling, and pattern recognition. Although many of the most successful connectionist models are designed for "flat" (vector-based) or sequential representations, recursive or nested representations should be preferred in several situations. One obvious setting is concept learning when objects in the instance space are graphs or can be conveniently represented as graphs. Terms in first-order logic, blocks in document processing, patterns in structural and syntactic pattern recognition, chemical compounds, proteins in molecular biology, and even world wide web sites, are all entities which are best represented as graphical structures, and they cannot easily be dealt with by vector-based architectures. In other cases (e.g., language processing) the process underlying the data has a (hidden) recursive nature but only a flat representation is left as an observation. Still, the architecture should be able to deal with recursive representations in order to correctly model the mechanism that generated the observations. The interest in developing connectionist architectures capable of dealing with these rich representations can be traced back to the end of the 80's.
Early approaches include Touretzky's BoltzCONS, Pollack's RAAM model, and Hinton's recursive distributed representations. More recent techniques include labeled RAAMs, holographic reduced representations, and recursive neural networks. Today, after more than ten years since the explosion of interest in connectionism, research in architectures and algorithms for learning structured representations still has a lot to explore and no definitive answers have emerged. It seems that the major difficulty with connectionist models is not just representing symbols, but rather devising proper ways of learning when examples are data structures, i.e. labeled graphs that can be used for describing relationships among symbols (or, more generally, combinations of symbols and continuously-valued attributes). TOPICS The aim of this special issue is to solicit and publish valuable papers that bring a clear picture of the state of the art in this area. We encourage submissions of papers addressing, in addition to other relevant issues, the following topics: * Algorithms and architectures for classification of data structures. * Unsupervised learning in structured domains. * Belief networks for learning structured patterns. * Compositional distributed representations. * Recursive autoassociative memories. * Learning structured rules and structured rule refinement. * Connectionist learning of syntactic parsing from text corpora. * Stochastic grammars and their relationships to neural and belief networks. * Links between connectionism and syntactic and structural pattern recognition. * Analogical reasoning. * Applications, including: - Medical and technical diagnosis: discovery and manipulation of structured dependencies, constraints, explanations. - Molecular biology and chemistry: prediction of molecular structure folding, classification of chemical structures. - Automated reasoning: robust matching, manipulation of logical terms, proof plans, search space reduction. - Software engineering: quality testing, modularization of software. - Geometrical and spatial reasoning: robotics, structured representation of objects in space, figure animation, layout of objects. INSTRUCTIONS We encourage e-mail submissions (Postscript, RTF, and PDF are the only acceptable formats). For hard copy submission please send 6 copies of the manuscript to Prof. Marco Gori. Manuscripts should not exceed 30 pages, double-spaced (excluding Figures and Tables). The title and the abstract should be sent separately in ASCII format, even before the final submission, so that reviewers can be contacted in a timely manner. IMPORTANT DATES Submission of title and abstract (e-mail): July 15, 1999 Submission deadline: July 30, 1999 Notification of acceptance: December 31, 1999 Expected publication date: Mid-to-late 2000. GUEST EDITORS Prof. Paolo Frasconi DIEE, University of Cagliari Piazza d'Armi 09123 Cagliari (ITALY) Phone: +39 070 675 5849 E-mail: paolo at diee.unica.it Prof. Marco Gori DII, University of Siena Via Roma 56, 53100 Siena (ITALY) Phone: +39 0577 263 610 E-mail: marco at ing.unisi.it Prof. Alessandro Sperduti DI, University of Pisa Corso Italia 40, 56125 Pisa (ITALY) Phone: +39 050 887 213 E-mail: perso at di.unipi.it From adr at nsma.arizona.edu Wed Jun 23 16:34:39 1999 From: adr at nsma.arizona.edu (David Redish) Date: Wed, 23 Jun 1999 13:34:39 -0700 Subject: Book announcement Message-ID: <199906232034.NAA13211@raphe.nsma.arizona.edu> The following book might be of interest to the people on this list.
adr ----------------------------------------------------- A. David Redish adr at nsma.arizona.edu Post-doc http://www.cs.cmu.edu/~dredish Neural Systems, Memory and Aging, Univ of AZ, Tucson AZ ----------------------------------------------------- BEYOND THE COGNITIVE MAP: From Place Cells to Episodic Memory A. David Redish Now available from MIT Press. [From the inside cover description] There are currently two major theories about the role of the hippocampus, a distinctive structure in the back of the temporal lobe. One says that it stores a cognitive map, the other that it is a key locus for the temporary storage of episodic memories. A. David Redish takes the approach that understanding the role of the hippocampus in space will make it possible to address its role in less easily quantifiable areas such as memory. Basing his investigation on the study of rodent navigation--one of the primary domains for understanding information processing in the brain--he places the hippocampus in its anatomical context as part of a greater functional system. Redish draws on the extensive experimental and theoretical work of the last 100 years to paint a coherent picture of rodent navigation. His presentation encompasses multiple levels of analysis, from single-unit recording results to behavioral tasks to computational modeling. From this foundation, he proposes a novel understanding of the role of the hippocampus in rodents that can shed light on the role of the hippocampus in primates, explaining data from primate studies and human neurology. The book will be of interest not only to neuroscientists and psychologists, but also to researchers in computer science, robotics, artificial intelligence, and artificial life. [Table of contents] 1 The Hippocampus Debate 2 Navigation Overview 3 Local View 4 Route Navigation: Taxon and Praxic Strategies 5 Head Direction 6 Path Integration 7 Goal Memory 8 Place Code 9 Self-Localization 10 Multiple Maps 11 Route Replay 12 Consolidation 13 Questions of Hippocampal Function 14 The Primate Hippocampus 15 Coda A Attractor Networks B Selective Experimental Review C Open Questions From alex at nervana.montana.edu Thu Jun 24 19:11:15 1999 From: alex at nervana.montana.edu (Alexander Dimitrov) Date: Thu, 24 Jun 1999 17:11:15 -0600 Subject: postdoctoral positions available Message-ID: <3772BB13.76AF5700@nervana.montana.edu> Two postdoctoral positions are available immediately in the Center for Computational Biology at Montana State University, Bozeman, in the laboratories of John Miller and Gwen Jacobs. Both positions are to study information processing in the cercal sensory system of the cricket. Applicants for both positions should have experience in electrophysiology. The focus of the project in Miller's lab is to understand encoding of dynamic sensory stimuli by ensembles of nerve cells. The primary experimental approach will be multi-unit extracellular recording. Candidates should have a background in the application of information theory to the analysis of neural systems. The focus of the project in Dr. Jacobs' lab is to understand the physiological mechanisms underlying neural encoding in this system. Experimental approaches will include intracellular and optical recording, and advanced morphometric analysis of identified nerve cells. For both projects, analysis of the data will involve the development of compartmental and analytical models of identified nerve cells and networks.
Advanced computational, microscopy and physiological recording facilities are available within the Center, including a 32-processor SGI Origin2000 computer and a Leica TSP confocal microscope. More information about the Center for Computational Biology can be found at: http://www.nervana.montana.edu Interested parties should contact John Miller or Gwen Jacobs: jpm at nervana.montana.edu and gwen at nervana.montana.edu -- Alexander Dimitrov Center for Computational Biology Montana State University Bozeman, MT 59717-3505 phone: (406)994-6494 fax: (406)994-5122 email: alex at nervana.montana.edu From dwang at wjh.harvard.edu Fri Jun 25 09:47:20 1999 From: dwang at wjh.harvard.edu (DeLiang Wang) Date: Fri, 25 Jun 1999 09:47:20 -0400 Subject: Tech report on speech segregation Message-ID: <37738859.E77A8BAF@wjh.harvard.edu> The following technical report is available via FTP/WWW: ******************************************************************** "A Comparison of Auditory and Blind Separation Techniques for Speech Segregation" Technical Report #15, June 1999 Department of Computer and Information Science The Ohio State University ******************************************************************** Andre J. W. van der Kouwe, The Ohio State University DeLiang L. Wang, The Ohio State University Guy J. Brown, The University of Sheffield A fundamental problem in auditory and speech processing is the segregation of speech from concurrent sounds. This problem has been a focus of study in computational auditory scene analysis (CASA), and it has also been recently investigated from the perspective of blind source separation. Using a standard corpus of voiced speech mixed with interfering sounds, we report a comparison between CASA and blind source separation techniques, which have been developed independently. Our comparison reveals that they perform well under very different conditions. A number of conclusions are drawn with respect to their relative strengths and weaknesses in speech segregation applications as well as in modeling auditory function. (10 pages, 92 KB compressed) for anonymous ftp: FTP-HOST: ftp.cis.ohio-state.edu Directory: /pub/tech-report/1999/ Filename: TR15.ps.gz for WWW: http://www.cis.ohio-state.edu/~dwang/reports.html -- ------------------------------------------------------------ My contact information through September, 1999 is: Dr. DeLiang Wang Visual Sciences Lab. William James Hall Harvard University 33 Kirkland Street Cambridge, MA 02138 Email: dwang at wjh.harvard.edu (my OSU email address is good too) Phone: 617-496-3367 (OFFICE); 617-495-3884 (LAB) Fax: 617-495-3764 URL: http://www.cis.ohio-state.edu/~dwang From austin at minster.cs.york.ac.uk Mon Jun 28 05:18:45 1999 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Mon, 28 Jun 1999 10:18:45 +0100 Subject: Lectureship in Neural Networks. Message-ID: <9906281018.ZM661@minster.cs.york.ac.uk> LECTURESHIP IN NEURAL NETWORKS University of York, UK. Closing date 12 July 1999. Applications are invited for a Lectureship, available immediately, in any aspect of neural networks, including such areas as hardware implementation, parallel processing, cognitive aspects, modelling, and theory. You will join the Advanced Computer Architectures Group within the Department of Computer Science, one of the UK's leading groups working in neural networks and a Department with the highest rating in research and teaching.
The Group undertakes research in all aspects of the theory, implementation and application of neural networks, with over 10 researchers active in this area. The group attracts funds from many government sources including the EPSRC and the EU, as well as working with a number of industries including British Aerospace, The Post Office, Porta Systems and Glaxo Wellcome. The group is also well known for its work in the hardware implementation of neural networks, with extensive support for this aspect of our research. The group's facilities include an SGI Origin 2000 32-node supercomputer, high-performance workstations and a very supportive research culture. An associated post is also available in Computer Vision (see web below). Full information on the group's activities can be found on http://www.cs.york.ac.uk/arch/neural/. Full details of the post can be found at http://www.york.ac.uk/admin/persnl/jobs/3030.htm Informal enquiries can be made to Prof. Jim Austin at austin at cs.york.ac.uk -- Jim Austin, Professor of Neural Computation Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch/ From kimmo at james.hut.fi Tue Jun 29 04:44:06 1999 From: kimmo at james.hut.fi (Kimmo Raivio) Date: 29 Jun 1999 11:44:06 +0300 Subject: Thesis: Receiver Structures based on SOMs Message-ID: The following Dr.Tech. thesis is available at http://www.cis.hut.fi/~kimmo/papers/thesis.ps.gz (compressed postscript, 217K ) http://www.cis.hut.fi/~kimmo/papers/thesis.ps (postscript, 797K ) Most of the articles that belong to the thesis can be accessed through the page http://www.cis.hut.fi/~kimmo/papers/ ---------------------------------------------------------------------- Receiver Structures Based on Self-Organizing Maps Kimmo Raivio Helsinki University of Technology Lab. of Computer and Information Science P.O.BOX 5400, FIN-02015 HUT, FINLAND Email: Kimmo.Raivio at hut.fi Abstract New adaptive receiver structures are studied to allow a more efficient compensation of the disturbances of the communication channel. This study concentrates on the use of the Self-Organizing Map (SOM) algorithm as a building block of new adaptive receivers. The SOM has been used both as an adaptive decision device and to follow up error signals. When the SOM was used as an adaptive decision device, comparisons with conventional equalizers such as the linear equalizer and the decision feedback equalizer were performed. The new structures were also compared with other neural methods like radial basis function networks and multi-layer perceptrons. The performances of the neural equalizers, and especially the SOM, have been found to be better in nonlinear multipath channels and about equal in linear channels. When the SOM was used to follow up error signals, the actual idea was to cancel interference. This task was divided between following up the error distribution and finding out the error estimate. The error was approximately the same as the interference. Other sources of error were noise, intersymbol interference, wrong error estimates and detection errors due to the reasons mentioned before. The error distribution can be followed up, but the problem is how to predict the error. Some solutions are presented in this thesis, but they do not provide satisfactory results.
The performance has been compared with a pure detector without any kind of interference cancellation and with a receiver based on the radial basis function network. However, it was discovered that these neural receivers designed for interference cancellation perform better when nonlinear distortions are compensated. The receivers based on the SOM are slightly more complicated than conventional ones, but particularly when a channel has nonlinear disturbances, they offer one possible solution. -- * Kimmo Raivio, Dr. of Science in Technology | email: Kimmo.Raivio at hut.fi * Lab. of Computer and Information Science | http://www.cis.hut.fi/~kimmo/ * Helsinki University of Technology | phone +358 9 4515295 * P.O.BOX 5400, FIN-02015 HUT, FINLAND | fax +358 9 4513277 From aapo at james.hut.fi Tue Jun 29 09:24:37 1999 From: aapo at james.hut.fi (Aapo Hyvarinen) Date: Tue, 29 Jun 1999 16:24:37 +0300 (EEST) Subject: ICA 2000: 1st CFP Message-ID: <199906291324.QAA17887@james.hut.fi> [Our apologies if you receive multiple copies of this message.] First Call for Papers: ------------- I C A 2000 ------------- International Workshop on INDEPENDENT COMPONENT ANALYSIS and BLIND SIGNAL SEPARATION 19-22 June 2000 Helsinki, Finland http://www.cis.hut.fi/ica2000/ Submission deadline: 1 March 2000 ---------------------------------------------------------------------------- AIMS AND SCOPE ---------------------------------------------------------------------------- This workshop is the second in the series initiated by the highly successful ICA'99 workshop in Aussois, France. It is devoted to recent advances in Independent Component Analysis and Blind Signal Separation. An important goal of the workshop is to bring together researchers from artificial neural networks, signal processing, and other related fields to provide interdisciplinary exchange. Papers describing original work on ICA and BSS are invited. Relevant topics include, for example: - Theory and estimation methods - Extensions of basic models - Convolutive and noisy mixtures - Nonlinear methods - Hardware implementations - Audio and telecommunications applications - Biomedical applications - Image processing applications - Data mining applications - Sensory coding models ---------------------------------------------------------------------------- PAPER SUBMISSION ---------------------------------------------------------------------------- Important dates: 1 March, 2000 Submission of *full* paper 15 April, 2000 Notification of acceptance 19-22 June, 2000 Workshop Detailed submission information will be available from our web site: http://www.cis.hut.fi/ica2000/ Submitted papers will be peer-reviewed, and acceptance will be based on quality, relevance and originality. All the papers presented at the workshop will be published in the Proceedings of ICA 2000. ---------------------------------------------------------------------------- INTERNATIONAL PROGRAM COMMITTEE ---------------------------------------------------------------------------- L. Almeida, INESC, Portugal S.-I. Amari, RIKEN, Japan A. Bell, Interval Research, USA J.-F. Cardoso, ENST, France A. Cichocki, RIKEN, Japan P. Comon, Universite de Nice, France S. Douglas, Southern Methodist University, USA C. Fyfe, Univ. of Paisley, UK S. Haykin, McMaster University, Canada A. Hyvarinen, Helsinki Univ. of Technology, Finland C. Jutten, INPG, France J. Karhunen, Helsinki Univ. of Technology, Finland S. Kassam, Univ. of Pennsylvania, USA V. Koivunen, Helsinki Univ. of Technology, Finland T.-W.
Lee, Salk Institute, USA R.-W. Liu, Univ. of Notre Dame, USA P. Loubaton, Universite de Marne la Vallee, France K.-R. Mueller, GMD First, Germany B. Olshausen, UC Davis, USA E. Oja, Helsinki Univ. of Technology, Finland P. Pajunen, Helsinki Univ. of Technology, Finland J. Principe, Univ. of Florida, USA T. Sejnowski, Salk Institute, USA K. Torkkola, Motorola Corporate Research, USA J. Tugnait, Auburn University, USA L. Xu, The Chinese Univ. of Hong Kong, China ---------------------------------------------------------------------------- LOCAL ORGANIZING COMMITTEE ---------------------------------------------------------------------------- General Chair: E. Oja Program Chair: J. Karhunen Local Arrangements Chair: V. Koivunen Publications Chair: P. Pajunen Publicity Chair: A. Hyvarinen Treasurer: J. Iivarinen Web Master: J. Sarela ---------------------------------------------------------------------------- COOPERATING SOCIETIES ---------------------------------------------------------------------------- European Neural Network Society, IEEE Signal Processing Society, EURASIP, IEEE Neural Networks Council, IEEE Circuits and Systems Society ---------------------------------------------------------------------------- CONTACT INFORMATION ---------------------------------------------------------------------------- web site http://www.cis.hut.fi/ica2000/ email ica2000 at mail.cis.hut.fi postal mail ICA 2000, P.O.Box 5400 Lab of Comp. and Info. Science Helsinki Univ. of Technology FIN-02015 HUT, Finland ---------------------------------------------------------------------------- From sbaluja at lycos.com Tue Jun 29 18:24:10 1999 From: sbaluja at lycos.com (sbaluja@lycos.com) Date: Tue, 29 Jun 1999 18:24:10 -0400 Subject: Paper: High Performance Named-Entity Extraction Message-ID: <8525679F.007A85FD.00@pghmta2.mis.pgh.lycos.com> Paper: Applying Machine Learning for High Performance Named-Entity Extraction Authors: Shumeet Baluja, Vibhu Mittal, Rahul Sukthankar Available from: http://www.cs.cmu.edu/~baluja Abstract: This paper describes a machine learning approach to building an efficient, accurate and fast name spotting system. Finding names in free text is an important task in addressing real-world text-based applications. Most previous approaches have been based on carefully hand-crafted modules encoding linguistic knowledge specific to the language and document genre. Such approaches have two drawbacks: they require large amounts of time and linguistic expertise to develop, and they are not easily portable to new languages and genres. This paper describes an extensible system which automatically combines weak evidence for name extraction. This evidence is gathered from easily available sources: part-of-speech tagging, dictionary lookups, and textual information such as capitalization and punctuation. Individually, each piece of evidence is insufficient for robust name detection. However, the combination of evidence, through standard machine learning techniques, yields a system that achieves performance equivalent to the best existing hand-crafted approaches. Contact: sbaluja at lycos.com, mittal at jprc.com, rahuls at jprc.com Questions and comments are welcome!
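[Editorial note] As a concrete illustration of the evidence-combination idea in the abstract above, here is a toy Python sketch. It is not the authors' system: the features, the tiny dictionary and the training examples are all invented, and a plain logistic combiner stands in for whatever "standard machine learning techniques" the paper actually uses.

```python
# Toy name-spotting sketch: individually weak binary cues (capitalization,
# dictionary membership, sentence position) are combined by a learned
# logistic model. Everything below is illustrative, not the paper's system.
import math

FIRST_NAMES = {"john", "mary", "david"}          # stand-in dictionary lookup

def features(tok, prev):
    return [
        1.0,                                     # bias term
        1.0 if tok[:1].isupper() else 0.0,       # capitalization cue
        1.0 if tok.lower() in FIRST_NAMES else 0.0,   # dictionary cue
        1.0 if prev in {"", ".", "!", "?"} else 0.0,  # sentence-initial (weakens the capitalization cue)
    ]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=200, lr=0.5):
    w = [0.0] * 4
    for _ in range(epochs):
        for (tok, prev), y in data:
            x = features(tok, prev)
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
            w = [wi + lr * (y - p) * xi for wi, xi in zip(w, x)]
    return w

# tiny invented training set: ((token, previous token), is-a-name)
data = [(("John", "."), 1), (("Mary", "said"), 1), (("David", ","), 1),
        (("The", "."), 0), (("table", "on"), 0)]
w = train(data)
for tok, prev in [("John", "said"), ("The", ".")]:
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, features(tok, prev))))
    print(tok, round(p, 2))    # high probability for "John", low for "The"
```

The point is only that cues which are useless in isolation become informative once a learned combiner weighs them jointly.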
From sue at soc.plym.ac.uk Wed Jun 30 06:27:53 1999 From: sue at soc.plym.ac.uk (Sue Denham) Date: Wed, 30 Jun 1999 11:27:53 +0100 Subject: Senior Lectureships / Readerships in Computational Neuroscience and Neural Computation Message-ID: <1.5.4.32.19990630102753.00749220@soc.plym.ac.uk> University of Plymouth, UK School of Computing Centre for Neural and Adaptive Systems Senior Lectureships / Readerships in Computational Neuroscience and Neural Computation Salary: £27,998-£29,600 pa (Senior Lecturer) Applications are invited for two newly-established, permanent research-related academic positions in the Centre for Neural and Adaptive Systems, a well-established research group which has an international reputation in the theory and computational modelling of neural systems. Applicants must have a very strong research record in either computational neuroscience or biologically inspired neural computation. Appointments will be made either to a Senior Lectureship (salary: £27,998-£29,600 pa), or to a Readership (salary: £27,998-£35,204 pa), depending on the current research standing of the appointee. Further information about the positions and the research activities of the Centre for Neural and Adaptive Systems can be obtained via e-mail or telephone, from Professor Mike Denham (mike at soc.plym.ac.uk; tel: +44 (0)1752 232547). Applicants are invited to send, as soon as possible, a full curriculum vitae, together with details of their current research activities and future plans, to Professor Mike Denham by e-mail to the above address, or by mail to University of Plymouth, Plymouth, PL4 8AA, UK. Dr Sue Denham Centre for Neural and Adaptive Systems School of Computing University of Plymouth Plymouth PL4 8AA England tel: +44 17 52 23 26 10 fax: +44 17 52 23 25 40 e-mail: sue at soc.plym.ac.uk http://www.tech.plym.ac.uk/soc/research/neural/index.html From movellan at cogsci.ucsd.edu Wed Jun 30 18:45:48 1999 From: movellan at cogsci.ucsd.edu (Javier R. Movellan) Date: Wed, 30 Jun 1999 15:45:48 -0700 Subject: TR Announcement Message-ID: <377A9E1C.B7F37BEA@cogsci.ucsd.edu> The following technical report is available online at http://cogsci.ucsd.edu (follow links to Tech Reports & Software ) Physical copies are also available (see the site for information). Modeling Path Distributions Using Partially Observable Diffusion Networks: A Monte-Carlo Approach. Paul Mineiro Department of Cognitive Science University of California San Diego Javier R. Movellan Department of Cognitive Science & Institute for Neural Computation University of California San Diego Ruth J. Williams Department of Mathematics & Institute for Neural Computation University of California San Diego Hidden Markov models have been more successful than recurrent neural networks for problems involving temporal sequences, e.g., speech recognition. One possible reason for this is that recurrent neural networks are being used in ways that do not handle temporal uncertainty well. In this paper we present a framework for learning, recognition and stochastic filtering of temporal sequences based on a probabilistic version of continuous recurrent neural networks. We call these networks diffusion (neural) networks for they are based on stochastic diffusion processes defined by adding Brownian motion to the standard recurrent neural network dynamics. The goal is to combine the versatility of recurrent neural networks with the power of probabilistic techniques.
We focus on the problem of learning to approximate a desired probability distribution of sequences. Once a distribution of sequences has been learned, well known techniques can be applied for the generation, recognition and stochastic filtering of new sequences. We present an adaptive importance sampling scheme for estimation of log-likelihood gradients. This allows the use of iterative optimization techniques, like gradient descent and the EM algorithm, to train diffusion networks. We present results for an automatic visual speech recognition task in which diffusion networks provide excellent performance when compared to hidden Markov models.
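[Editorial note] To make the construction described in this abstract concrete, here is a minimal simulation sketch of a diffusion network's forward dynamics: standard continuous recurrent network drift plus a Brownian increment, integrated with the Euler-Maruyama scheme. The network size, weights and noise level are arbitrary illustrative choices, not taken from the report.

```python
# Simulate one sample path of dX = (-X + W tanh(X)) dt + sigma dB,
# i.e. recurrent network dynamics with additive Brownian motion.
import numpy as np

rng = np.random.default_rng(0)
n, dt, T, sigma = 4, 0.01, 2.0, 0.3     # illustrative constants
W = rng.normal(scale=1.0, size=(n, n))  # recurrent weight matrix
x = np.zeros(n)                         # initial state

path = [x.copy()]
for _ in range(int(T / dt)):
    drift = -x + W @ np.tanh(x)                                    # deterministic RNN part
    x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n)  # Brownian increment
    path.append(x.copy())

print(np.array(path).shape)   # (201, 4): one sampled trajectory of the diffusion
```

Repeating the loop with fresh noise yields different paths from the same network, which is exactly the path distribution the report's learning scheme shapes.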
From hali at theophys.kth.se Tue Jun 1 17:52:53 1999 From: hali at theophys.kth.se (Hans Liljenström) Date: Tue, 01 Jun 1999 23:52:53 +0200 Subject: Final CALL FOR PAPERS: 1999 Agora Meeting on Fluctuations in Biological Systems Message-ID: <37545635.694BFF49@theophys.kth.se> ************************************************************************* 2nd Announcement and Final Call For Papers 1999 Agora Meeting on Fluctuations in Biological Systems Agora'99 Sigtuna, Sweden August 3-7, 1999 Organized by Agora for Biosystems Sponsored by International Union for Pure and Applied Physics (IUPAP) Swedish Council for Planning and Coordination of Research (FRN) >>>>>>DEADLINE for abstract submission extended to: June 15, 1999 <<<<<<< More information and registration at http://www.theophys.kth.se/~hali/agora/agora99 ************************************************************************* SCOPE This interdisciplinary conference on fluctuations in biological systems will be held in the small old town of Sigtuna, Sweden, Aug 3-7, 1999, and follows a series of workshops, the first of which was held in Sigtuna, Sep 4-9 1995 (Sigtuna Workshop 95). The approach at these meetings is theoretical as well as experimental, and the meetings are intended to attract participants from various fields, such as biology, physics, and computer science. A number of invited speakers will provide presentations on the fundamental problems, but participants are invited to submit abstracts on topics related to those listed below. The number of participants is limited to approx. 150. MOTIVATION Life is normally associated with a high degree of order and organization. However, disorder -- in various contexts referred to as fluctuations, noise or chaos -- is also a crucial component of many biological processes.
For example, in evolution random errors in the reproduction of the genetic material provide a variation that is fundamental for the selection of adaptive organisms. At a molecular level, thermal fluctuations govern the movements and functions of the macromolecules in the cell. Yet, it is also clear that too large a variation may have disastrous effects. Uncontrolled processes need stabilizing mechanisms. More knowledge of the stability requirements of biological processes is needed in order to better understand these problems, which also have important medical applications. Many diseases, for instance certain degenerations of brain cells, are caused by failure of the stabilizing mechanisms in the cell. Stability is also important and difficult to achieve in biotechnological applications. There is also randomness in the structure and function of the neural networks of the brain. Spontaneous firing of neurons seems to be important for maintaining an adequate level of activity, but does this "neuronal noise" have any other significance? What are the effects of errors and fluctuations in the information processing of the brain? Can these microscopic fluctuations be amplified to provide macroscopic effects? Often, one cannot easily determine whether an apparently random process is due to noise, governed by uncontrolled degrees of freedom, or is a result of "deterministic chaos". Would the difference be of any importance for biology? In particular, could chaos, which is characterized by sensitivity and divergence, be useful for any kind of information processing that normally depends upon stability and convergence? OBJECTIVE The objective of this meeting is to address questions and problems related to those above, for a deeper understanding of the effects of disorder in biological systems. Fluctuations and chaos have been extensively studied in physics, but to a much lesser degree in biology. Important concepts from physics, such as "noise-induced state transitions" and "controlled chaos", could also be of relevance for biological systems. Yet, little has been done about such applications, and a more critical analysis of the positive and negative effects of disorder for living systems is needed. It is essential to make concrete and testable hypotheses, and to avoid the kind of superficial and more fashionable treatment that often dominates the field. By bringing together scientists with knowledge and insights from different disciplines we hope to shed more light on these problems, which we think are profound for understanding the phenomenon of life. TOPICS Topics include various aspects, experimental as well as theoretical, of fluctuations, noise and chaos in biological systems at a microscopic (molecular), mesoscopic (cellular), and macroscopic (network and systems) level.
Contributions are welcome regarding, among others, the following topics: - Biological signals and noise - Neural information processing - Synaptic fluctuations - Spontaneous neural firing - Macromolecular dynamics - Dynamics of microtubuli - Ion channel kinetics - Cell motility - Medical implications INVITED SPEAKERS Per Andersen, Oslo University, Norway Hans Braun, University of Marburg, Germany Franco Conti, Istituto di Cibernetica e Biofisica, CNR, Genova, Italy Louis DeFelice, Vanderbilt University, Nashville, USA Hans Frauenfelder, Los Alamos National Laboratory, New Mexico, USA John Hopfield, Princeton University, USA Fernan Jaramillo, Emory University School of Medicine, Atlanta, USA Stanislas Leibler, Princeton University, USA Uno Lindberg, Stockholm University, Sweden Koichiro Matsuno, Nagaoka University of Technology, Japan Erik Mosekilde, Technical University of Denmark, Lyngby Frank Moss, University of Missouri, St Louis, USA Sakire Pögun, Center for Brain Research, Ege University, Turkey Stephen Traynelis, Emory University School of Medicine, Atlanta, USA Horst Vogel, Swiss Federal Institute of Technology, Lausanne, Switzerland Peter Wolynes, University of Illinois, USA James J. Wright, University of Melbourne, Australia Mikhail Zhadin, Institute of Cell Biophysics, Pushchino, Russia PRE-REGISTRATION and abstract submission can preferably be done via the Agora'99 home page: http://www.theophys.kth.se/~hali/agora/agora99 REGISTRATION FEES Regular: 1500 SEK (ca 190 USD) before June 1 ---- 2000 SEK after June 1 Students: 800 SEK (ca 100 USD) before June 1 ---- 1000 SEK after June 1 DEADLINE for abstract submission: June 15, 1999. FURTHER INFORMATION available from: Hans Liljenstrom Theoretical Physics Royal Institute of Technology SE-100 44 Stockholm, Sweden and Agora for Biosystems Box 57, SE-193 22 Sigtuna, Sweden Phone: +46-8-790 7167 Fax: +46-8-10 48 79 Email: hali at theophys.kth.se WWW: http://www.theophys.kth.se/~hali From amari at brain.riken.go.jp Tue Jun 1 22:59:54 1999 From: amari at brain.riken.go.jp (Shunichi Amari) Date: Wed, 02 Jun 1999 11:59:54 +0900 Subject: adaptive natural gradient papers Message-ID: <19990602115954G.amari@brain.riken.go.jp> Two papers concerning natural gradient on-line learning are available! Natural gradient learning has attracted much attention because of its excellent dynamical behaviors. When it is applied to multilayer perceptrons, its superior properties have been proved by statistical-physical methods. It is not only locally Fisher efficient but also avoids plateaus or quickly escapes them. Although theoretically attractive, its practical implementation has been believed to be difficult, because calculating the Fisher information matrix and inverting it are both very costly. In order to avoid this difficulty, we have developed an adaptive method of directly calculating the inverse of the Fisher information. The natural gradient method works surprisingly well with this adaptive estimate of the inverse. The first paper proposes the method itself, which has been accepted for publication in Neural Computation. The second paper generalizes the idea to be applicable to a wider class of network models and loss functions. This has been submitted to Neural Networks. S.Amari, H.Park and K.Fukumizu, Adaptive method of realizing natural gradient learning for multilayer perceptrons. H.Park, S.Amari, and K.Fukumizu, Adaptive natural gradient learning algorithms for various stochastic models.
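[Editorial note] For readers who want the flavour of the approach before reading the papers, the following is a rough sketch of the general idea only (consult the papers for the actual algorithm): keep a running estimate of the inverse Fisher information via a cheap rank-one correction, and precondition the stochastic gradient with it. The linear-regression model, the step sizes, the initialisation and the stability guard below are all illustrative assumptions.

```python
# Adaptive inverse-Fisher sketch: the rank-one update is a first-order
# approximation to inverting G <- (1-eps)G + eps * g g^T. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
scales = np.array([2.0, 1.0, 0.5])      # anisotropic inputs, so Fisher != identity
theta_true = np.array([1.0, -2.0, 0.5])
theta = np.zeros(3)
Ginv = 0.1 * np.eye(3)                  # small initial estimate of G^{-1}
eta, eps = 0.05, 0.005
tail = []

for t in range(6000):
    x = scales * rng.normal(size=3)
    y = theta_true @ x + rng.normal()   # unit observation noise
    g = (theta @ x - y) * x             # stochastic gradient (minus the score)
    Gg = Ginv @ g
    c = min(eps, 0.5 / max(g @ Gg, 1e-12))            # guard keeps Ginv positive definite
    Ginv = (1 + c) * Ginv - c * np.outer(Gg, Gg)      # adaptive inverse-Fisher update
    theta -= eta * Gg                   # preconditioned ("natural") gradient step
    if t >= 5000:
        tail.append(theta.copy())

print(np.round(np.mean(tail, axis=0), 2))   # approaches theta_true
```

The preconditioning equalizes convergence speed across the differently scaled input directions, which is the practical payoff the announcement describes.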
You can copy the papers from http://www.bsis.brain.riken.go.jp/ Shun-ichi Amari, Wako-shi, Hirosawa 2-1, Saitama 351-0198, Japan RIKEN Brain Science Institute Director of Brain-Style Information Systems Research Group Laboratory for Information Synthesis, Head tel: +81-(0)48-467-9669 fax: +81-(0)48-467-9687 e-mail: amari at brain.riken.go.jp home page: http://www.bsis.brain.riken.go.jp/ From phil at rome.cis.plym.ac.uk Fri Jun 4 08:25:03 1999 From: phil at rome.cis.plym.ac.uk (Phil Culverhouse) Date: Fri, 04 Jun 1999 13:25:03 +0100 Subject: JOB POSITION References: Message-ID: <3757C59F.247A59DD@cis.plym.ac.uk> SCHOOL OF ELECTRONIC, COMMUNICATION & ELECTRICAL ENGINEERING, UNIVERSITY OF PLYMOUTH, UK. Ref: 3289/TECH RESEARCH ASSISTANT/FELLOW Salary £10,399 to £17,606 pa - RA/RF An exciting 20-MONTH post is IMMEDIATELY available for a Vision Scientist. You will LEAD the development of a neural-network-based natural object categoriser for laboratory use. The existing prototype (UNIX platform) is capable of categorising 23 species of marine plankton, but has to be further developed and a user interface tailored to Marine Ecologists. You should have a working knowledge of wavelet transforms, current machine vision techniques and multi-dimensional clustering statistics. You should ideally be familiar with UNIX and Windows NT operating systems as well as being a Matlab and C++ programmer. The POST IS AVAILABLE IMMEDIATELY and will involve some European travel. For informal enquiries regarding this post, please contact Dr P Culverhouse on +44 1752 233517 or email: pculverhouse at plymouth.ac.uk closing date: 30th June 1999 From ishii at is.aist-nara.ac.jp Sun Jun 6 22:54:41 1999 From: ishii at is.aist-nara.ac.jp (Shin Ishii) Date: Mon, 07 Jun 1999 11:54:41 +0900 Subject: Paper on on-line EM algorithm for NRBF Message-ID: <199906070254.LAA06189@axp27.aist-nara.ac.jp> The following paper is available on my web site: http://mimi.aist-nara.ac.jp/~ishii/publication.html We would greatly appreciate comments and suggestions. ------------------------------------------------------------------- On-line EM algorithm for the normalized Gaussian network Masa-aki Sato and Shin Ishii To appear in Neural Computation A Normalized Gaussian Network (NGnet) (Moody and Darken 1989) is a network of local linear regression units. The model softly partitions the input space by normalized Gaussian functions and each local unit linearly approximates the output within the partition. In this article, we propose a new on-line EM algorithm for the NGnet, which is derived from the batch EM algorithm (Xu, Jordan and Hinton 1995) by introducing a discount factor. We show that the on-line EM algorithm is equivalent to the batch EM algorithm if a specific scheduling of the discount factor is employed. In addition, we show that the on-line EM algorithm can be considered as a stochastic approximation method to find the maximum likelihood estimator. A new regularization method is proposed in order to deal with a singular input distribution. In order to manage dynamic environments, where the input-output distribution of data changes over time, unit manipulation mechanisms such as unit production, unit deletion, and unit division are also introduced based on the probabilistic interpretation. Experimental results show that our approach is suitable for function approximation problems in dynamic environments. We also apply our on-line EM algorithm to robot dynamics problems and compare our algorithm with the Mixtures-of-Experts family.
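[Editorial note] As a toy illustration of the discount-factor mechanism described in the abstract, the sketch below runs on-line EM with exponentially discounted sufficient statistics. For brevity it fits a plain two-component Gaussian mixture rather than the full NGnet regression model, and all constants are invented; see the paper for the actual algorithm.

```python
# On-line EM with a discount: each E-step computes posteriors for one
# sample; the M-step re-estimates parameters from running averages of the
# sufficient statistics, updated with step size lam so old data fade out.
import math, random

random.seed(0)
K, lam = 2, 0.01
mu, var, pi = [-1.0, 1.0], [1.0, 1.0], [0.5, 0.5]
# discounted sufficient statistics per unit: <1>, <x>, <x^2>
s0, s1, s2 = [0.5, 0.5], [-0.5, 0.5], [1.0, 1.0]

def normal(x, m, v):
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

for t in range(20000):
    # data stream: mixture of two Gaussians centred at -2 and +2 (std 0.5)
    x = random.gauss(-2, 0.5) if random.random() < 0.5 else random.gauss(2, 0.5)
    # E-step: responsibility of each unit for this sample
    p = [pi[k] * normal(x, mu[k], var[k]) for k in range(K)]
    z = sum(p) or 1e-300
    h = [pk / z for pk in p]
    # discounted accumulation, then M-step from the ratios
    for k in range(K):
        s0[k] += lam * (h[k] - s0[k])
        s1[k] += lam * (h[k] * x - s1[k])
        s2[k] += lam * (h[k] * x * x - s2[k])
        pi[k] = s0[k]
        mu[k] = s1[k] / s0[k]
        var[k] = max(s2[k] / s0[k] - mu[k] ** 2, 1e-3)

print([round(m, 1) for m in mu])   # near [-2.0, 2.0] (unit order may swap)
```

Because the averages forget at rate lam, the same loop tracks a data distribution that drifts over time, which is the dynamic-environment setting the abstract emphasizes.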
------------------------------------------------------------------- Shin Ishii Nara Institute of Science and Technology ATR Human Information Processing Research Laboratories From amari at brain.riken.go.jp Mon Jun 7 05:05:38 1999 From: amari at brain.riken.go.jp (Shunichi Amari) Date: Mon, 07 Jun 1999 18:05:38 +0900 Subject: exact location of two natural gradient learning papers Message-ID: <19990607180538R.amari@brain.riken.go.jp> I have received a number of complaints concerning difficulties in copying the announced papers because of incorrect information in my announcement. The following is the exact location of the two adaptive natural gradient papers. We have made two files for each, one in PostScript form (gzipped) and the other in PDF form. The URL is as follows; http://www.islab.brain.riken.go.jp/~amari/pub_j.html#Journal Please contact Dr. Fukumizu (fuku at brain.riken.go.jp) if you have any trouble. Shun-ichi Amari, Wako-shi, Hirosawa 2-1, Saitama 351-0198, Japan RIKEN Brain Science Institute Director of Brain-Style Information Systems Research Group Laboratory for Information Synthesis, Head tel: +81-(0)48-467-9669 fax: +81-(0)48-467-9687 e-mail: amari at brain.riken.go.jp home page: http://www.bsis.brain.riken.go.jp/ From jagota at cse.ucsc.edu Mon Jun 7 17:50:12 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Mon, 7 Jun 1999 14:50:12 -0700 (PDT) Subject: new {H}MM e-survey Message-ID: <199906072150.OAA02253@arapaho.cse.ucsc.edu> New refereed e-publication action editor: Yoram Singer Y. Bengio, Markovian Models for Sequential Data, Neural Computing Surveys 2, 129--162, 1999. 141 references. http://www.icsi.berkeley.edu/~jagota/NCS Abstract: Hidden Markov Models (HMMs) are statistical models of sequential data that have been used successfully in many machine learning applications, especially for speech recognition. Furthermore, in the last few years, many new and promising probabilistic models related to HMMs have been proposed. We first summarize the basics of HMMs, and then review several recent related learning algorithms and extensions of HMMs, including in particular hybrids of HMMs with artificial neural networks, Input-Output HMMs (which are conditional HMMs using neural networks to compute probabilities), weighted transducers, variable-length Markov models and Markov switching state-space models. Finally, we discuss some of the challenges of future research in this very active area. From ctan at bond.edu.au Tue Jun 8 03:21:35 1999 From: ctan at bond.edu.au (Dr Clarence N W Tan) Date: Tue, 8 Jun 1999 17:21:35 +1000 Subject: call for abstracts: International Conference on Advanced Investment Technology 1999 Message-ID: <012d01beb17f$8b04c0a0$3a08f483@it.bond.edu.au> International Conference on Advanced Investment Technology 1999 Incorporating Workshops on "A Primer to Advanced Investment Technology" Announcement and Call for Papers ------------------------------------------------------------------------- Date: Sun 19 December 1999 - Tue 21 December 1999 Venue: The Conference Centre Bond University, Gold Coast, Queensland 4229 Australia Extended Abstract of Paper Deadline: July 31, 1999 Organisers: ----------- Honorary Chair: Prof. Gopal Gupta Conference Chairs: Dr. Clarence Tan Dr. Kuldeep Kumar Invitation ----------- Bond University would like to invite all researchers and practitioners interested in all aspects of technology applications to finance to attend the Advanced Investment Technology 1999 Conference (AIT99).
Objectives ---------- AIT99 aims to bring together academics and professionals in the finance and investment industry who are interested in applications of advanced information technology to finance problems. It intends to foster better relationships between academia and the finance industry on topics such as soft computing and web-based technology in the investment industry. A one-day workshop is planned for the day prior to the conference to cover a range of practical issues concerning the use of information technology in financial environments, including applications of advanced technology such as Artificial Neural Networks, Chaos Theory and Forecasting to financial analysis and investment. Structure: ---------- The conference is organised into two principal sessions: A series of workshops on 19th December 1999 and a conference program consisting of keynote speakers and presentations by the delegates/participants on 20th and 21st December 1999. There are plans to have a public exhibition area for members of the industry to display their products in conjunction with the conference. Please contact organisers for exhibition details. Conference Topics: ----------------- Topics include but are not limited to: Applications in Finance of Soft Computing, Artificial Intelligence & Statistical methods such as Neural Networks, Genetic Algorithms, Fuzzy Logic; Time Series & Forecasting, Multivariate Analysis, Hybrid Intelligent Systems, Intelligent Agents, Chaos Theory, Data Mining, Online Gambling, Gaming Strategies and Trading Systems. Invited Keynote Speakers ------------------------ Dr Jeffrey Carmichael, Ph.D. (Princeton), AO, Chairman of the Australian Prudential Regulatory Authority (APRA) and Member of the 1997 Australian Wallis Financial Inquiry Commission. Prof. Efraim Turban, Ph.D. (Berkeley), Author of over 100 publications in the areas of Information Systems, Electronic Commerce and Neural Networks in Finance and Investment. International Review Committee: ------------------------------- Prof. Efraim Turban (USA/HK) Dr. Jeff Carmichael (Aus) Prof. Ah Chung Tsoi (Aus) Emeritus Prof. Peter Poole (Aus) Prof. John D. Haynes (NZ) Prof. M. L. Tiku (Canada) Prof. R. Velu (USA) Prof. Kevin Burrage (Aus) Prof. V K Srivastava (India) Prof. Neville De Mestre (Aus) Prof. Berlin Wu (Taiwan) Prof. Nikola Kasabov (NZ) Prof. Stan Hurn (Aus) Dr. Vance Martin (Aus) Dr. Graham McMahon (Aus) Dr. A Flitman (Aus) Dr. Mark Chignell (Can) Dr. Gavin Finnie (Aus) Dr. A. N. Roy (USA) Dr. Jeff Barker (Aus) Dr. Zheng Da Wu (Aus) Dr. S. Ganesalingam (NZ) Dr. Stephen Sugden (Aus) Dr. S. Alvandi (Singapore) Dr. Gerhard Wittig (Aus) Dr. James Liu (HK) Dr W. K. Yeap (NZ) Dr. Steven Lawrence (USA) Registration Fees ----------------- Workshop(s): Individual: Full day A$600 or A$150 each Industry representative: Full day A$1200 Conference (2 full days): Before Oct 31 After Oct 31 ------------------------------------------------------------ Students: A$150 A$150 Academic: A$300 A$400 Industry representative: A$600 A$750 Exchange Rate Indication: US$1.00 = A$1.55 All SIA, AIBF, ATAA, GCRITF and ANZIAM members are entitled to a 10% discount on conference and workshop fees, subject to availability. Group bookings and sponsors are entitled to discounts. Contact organisers for details. The registration fee includes lunches and refreshments for each day of the conference and workshops.
Workshops (very limited places, register early to ensure a place) --------------------------------------------------------------- Workshop 1: Financial Forecasting Techniques (Kumar) Workshop 2: Basic Financial Trading System Design Using Excel (Tan) Workshop 3: Applications of E-commerce in Finance I (Prof. Efraim Turban) Workshop 4: Applications of E-commerce in Finance II (Prof. Efraim Turban) Workshop 5: Artificial Neural Networks (Tsoi) Workshop 6: Applying Chaos Theory to Finance (Kumar, Tan and Ghosh) Registration ------------ To register please complete the registration form and mail it with the payment to: AIT 99 Ms Herlina Dihardjo School of Information Technology Bond University, QLD 4229 Australia Telephone: +61 (0) 7 5595-3392 Fax: +61 (0) 7 5595-3320 *Do not dial (0) from outside Australia E-mail: ait99 at bond.edu.au Submission of Papers -------------------- To submit a paper to the AIT99 conference, please send an extended abstract to the above address no later than 31st July 1999. If accepted, two copies of the final paper should be submitted before November 1, 1999. The preferred format is Microsoft Word. Student papers are invited and encouraged. The Securities Institute of Australia is proud to sponsor the Best Student Paper Prize. Accommodation & Travel ---------------------- Bond University has its own Conference Centre with good accommodation facilities of one hundred standard and executive rooms. There is also a large number of choices for accommodation on the Gold Coast, ranging from luxury hotels such as Jupiter's Casino/Conrad Hotel, Marriott, Sheraton Mirage and Grand Mercure to backpackers' accommodation. QANTAS is the official airline for the AIT99 conference and is proud to be part of the conference. A discount of up to 45% of the full economy airfare (excluding taxes) for domestic travel in Australia has been negotiated for delegates attending the conference, subject to seat availability in group class and payment & ticketing conditions at the time of booking. Please quote Association Profile Number: "1203355", destination and date of conference when making your reservation. The Qantas Association Sales contact number for Australian delegates is Toll Free 1800 684 880. International delegates can contact their local Qantas office for the best available fare of the day. Our Web Site ------------ Prospective Conference Participants and Delegates are invited to visit our website at the following URL: http://tide.it.bond.edu.au/ait99 or http://w3.to/ait99. On-line registration is available on the web. Sponsors -------- Major Sponsor: Bond University's School of Information Technology. Other AIT99 sponsors and/or supporters include: Gold Coast Regional Information Technology Forum (GRITF), the Gold Coast City Council, the Securities Institute of Australia (SIA), the Australian Institute of Banking and Finance (AIBF), Australia and New Zealand Industrial and Applied Mathematics (ANZIAM), Australian Technical Analysts Association (ATAA), TechQuad, and On The Net, etc. See web site for full list. From FYFE-CI0 at wpmail.paisley.ac.uk Tue Jun 8 09:29:40 1999 From: FYFE-CI0 at wpmail.paisley.ac.uk (COLIN FYFE) Date: Tue, 08 Jun 1999 13:29:40 +0000 Subject: PhD Studentships Message-ID: Two three-year studentships are offered in the field of unsupervised artificial neural networks applied to the extraction of information from visual data.
One student will work most closely with Dr Bogdan Gabrys and will concentrate on novel cost functions for extraction of independent components of visual scenes. The second will work with Dr Darryl Charles and will concentrate on additive noise for the creation of minimal code sets for sparsely coded visual data. Each will comprise payment of fees and an annual grant of approximately £5,500. Both are expected to lead to the award of a PhD within the three-year period. The Applied Computational Intelligence Research Unit is a very active research unit within the Department of Computing and Information Systems, comprising some 10 academics, 5 research assistants and 15 research students. To apply for either of these posts please send a current CV to either char-ci0 at paisley.ac.uk or fyfe-ci0 at paisley.ac.uk before 30th June 1999. Colin Fyfe From ckiw at dai.ed.ac.uk Tue Jun 8 12:27:58 1999 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Tue, 8 Jun 1999 17:27:58 +0100 (BST) Subject: Call for Participation: ICANN 99 post-conference workshops Message-ID: Dear Connectionists, Below are brief announcements of the 5 ICANN 99 post-conference workshops taking place at the University of Edinburgh on Saturday 11 September 1999. See http://www.dai.ed.ac.uk/daidb/people/homes/ckiw/icann/ and the URLs listed below for further details. * Call for participation/presentations This is a call for participation/presentations for these workshops. Please see the individual workshop web pages for details, and then contact the appropriate organizers. * Registration Arrangements Registration for the workshops is free. Those registering for the ICANN conference should register for the workshops on the same form. Anyone not attending the conference should register with the organizers of the workshop(s) they wish to attend. We need you to do this so that we can get rooms of a suitable size for the workshops. Those attending are responsible for their own travel and accommodation arrangements. Chris Williams ICANN 99 Post-Conference workshops organizer -------------------------------------------------------------------------- Interactions between theoretical and experimental approaches in developmental neuroscience Organizers: Stephen Eglen, Bruce Graham, David Willshaw (Edinburgh) http://www.anc.ed.ac.uk/~stephen/workshop.html This workshop will highlight the role of theoretical approaches in understanding the development of the nervous system. It will also allow us to discuss the ways in which experimental and theoretical approaches can interact on various developmental problems. The workshop will examine several key areas in neural development: growth and branching in dendritic trees; molecular gradients, and their role in topographic mappings; neurotrophic factors; visual system development; and development of innervation at the neuromuscular junction. -------------------------------------------------------------------------- Emergent Neural Computation Architectures Based on Neuroscience Organizers: Stefan Wermter (Sunderland), Jim Austin (York), David Willshaw (Edinburgh) http://www.his.sunderland.ac.uk/emernet/icann99w.html Areas of interest include issues of neuroscience and neural networks, such as: 1. Synchronisation: How does the brain synchronise its processing? How does the brain schedule its processing? 2. Processing speed: How does the brain compute with relatively slow computing elements but still achieve rapid and real time performance?
3. Robustness: How does human memory manage to continue to operate despite failure of its components? 4. Modular construction: What can we learn from the brain about building more powerful, modular artificial neural network architectures to solve larger tasks? 5. Learning in context: How can we build learning algorithms which consider context? How can we design incremental learning algorithms and dynamic architectures? -------------------------------------------------------------------------- Neural Networks for Intelligent User Interfaces Organizers: Rainer Malaka (Heidelberg), Ramin Yasdi (Sankt Augustin) http://www.dai.ed.ac.uk/daidb/people/homes/ckiw/icann/malaka.html User interfaces that adapt themselves to the individual needs, preferences, and knowledge of their users are becoming more and more important. Personalized interfaces are especially important for dealing with information overload and navigation: they personalize and improve the quality of information retrieval and filtering, information restructuring and annotation, and information visualization. The development of these new intelligent user interfaces requires techniques that enable computer programs to learn how to serve the user most efficiently. Neural networks are not yet widely used within this challenging domain, but it seems an interesting new application area for them, given the availability of large data sets and the need for automatic adaptation to new situations and users. Interest in applying powerful neural network learning methods to intelligent user interfaces is therefore growing among researchers. -------------------------------------------------------------------------- Kernel Methods: Gaussian Process and Support Vector Machine predictors Organizers: Carl Edward Rasmussen (Lyngby), Roderick Murray-Smith (Lyngby), Alex Smola (Berlin), Chris Williams (Edinburgh) http://www.dai.ed.ac.uk/daidb/people/homes/ckiw/icann/gpsvm.html This workshop aims to bring together people working with Gaussian Process (GP) and Support Vector Machine (SVM) predictors for regression and classification problems. The scope of the workshop includes:
Methods for choosing kernels: generic vs problem-specific issues
Uniform convergence and Bayesian theory
Efficient implementation/approximation of GP and SVM predictors on large datasets
GP classifiers: MCMC methods, variational and Laplace approximations
Kernel methods for dynamic system modelling
Applications of kernel methods
-------------------------------------------------------------------------- Developments in Artificial Neural Network Theory: Independent Component Analysis and Blind Source Separation Organizer: Mark Girolami (Paisley) http://cis.paisley.ac.uk/staff/giro-ci0/ICANN99/ICANN99_ICA_WS.html This workshop seeks to re-focus the attention of ANN researchers by exploring how ICA / BSS and its further development can push forward our knowledge of the computational brain.
Proposals are solicited for presentation and discussion that address and explore some of the following topics:
Models of Sensory Coding in the Brain
Mammalian Visual Cortex
Image Feature Extraction
Natural Image Statistics and Efficient Coding
Auditory Modelling and the Binaural Cocktail Party Effect
Over-complete Basis Representations
State Space Models
Time Varying Mixtures
Non-linear ICA and Topographic Mappings
Applications of ICA to Electrophysiological Data

From yweiss at CS.Berkeley.EDU Tue Jun 8 08:43:30 1999 From: yweiss at CS.Berkeley.EDU (Yair Weiss) Date: Tue, 8 Jun 1999 05:43:30 -0700 (PDT) Subject: TR on loopy belief propagation Message-ID: <199906081243.FAA18143@gibson.CS.Berkeley.EDU> Hi, The following paper, showing that belief propagation gives the exact means for Gaussian graphical models regardless of the number of loops, is available online via: http://www.cs.berkeley.edu/~yweiss/gaussTR.ps.gz Comments are most welcome. Yair -------------------------------------------------------------------------- Title: Correctness of belief propagation in Gaussian graphical models of arbitrary topology. Authors: Yair Weiss and William T. Freeman Reference: UC Berkeley CS Department TR UCB//CSD-99-1046 Abstract: Graphical models, such as Bayesian networks and Markov Random Fields, represent statistical dependencies of variables by a graph. Local ``belief propagation'' rules of the sort proposed by Pearl (1988) are guaranteed to converge to the correct posterior probabilities in singly connected graphical models. Recently, a number of researchers have empirically demonstrated good performance of ``loopy belief propagation''--using these same rules on graphs with loops. Perhaps the most dramatic instance is the near Shannon-limit performance of ``Turbo codes'', whose decoding algorithm is equivalent to loopy belief propagation. Except for the case of graphs with a single loop, there has been little theoretical understanding of the performance of loopy propagation. Here we analyze belief propagation in networks with arbitrary topologies when the nodes in the graph describe jointly Gaussian random variables. We give an analytical formula relating the true posterior probabilities with those calculated using loopy propagation. We give sufficient conditions for convergence and show that when belief propagation converges it gives the correct posterior means {\em for all graph topologies}, not just networks with a single loop. The related ``max-product'' belief propagation algorithm finds the maximum posterior probability estimate for singly connected networks. We show that, even for non-Gaussian probability distributions, the convergence points of the max-product algorithm in loopy networks are at least local maxima of the posterior probability. These results motivate using the powerful belief propagation algorithm in a broader class of networks, and help clarify the empirical performance results.
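To make the TR's key claim concrete, here is a minimal NumPy sketch (not the authors' code) of synchronous Gaussian belief propagation in information form on a fully connected three-node model, i.e. a graph with a loop. The matrix J, the vector h, all numbers, and all variable names are invented for illustration. If the updates converge, the BP means match the exact posterior means J^{-1} h; the BP variances, by contrast, are in general only approximate on loopy graphs:

--------------------------------------------------------------------------
import numpy as np

# p(x) ~ exp(-0.5 x'Jx + h'x); J symmetric and diagonally dominant here,
# so loopy BP converges.  Exact posterior mean is J^{-1} h.
J = np.array([[4.0, 1.0, 0.5],
              [1.0, 5.0, 1.2],
              [0.5, 1.2, 4.5]])
h = np.array([1.0, -2.0, 0.5])
n = len(h)

Lam = np.zeros((n, n))   # Lam[i, j]: precision part of message i -> j
eta = np.zeros((n, n))   # eta[i, j]: potential part of message i -> j

for sweep in range(500):
    Lam_new, eta_new = np.zeros_like(Lam), np.zeros_like(eta)
    for i in range(n):
        for j in range(n):
            if i == j or J[i, j] == 0.0:
                continue
            # Fuse local evidence at i with messages from all neighbours but j.
            a = J[i, i] + sum(Lam[k, i] for k in range(n) if k not in (i, j))
            b = h[i] + sum(eta[k, i] for k in range(n) if k not in (i, j))
            # Integrate x_i out of exp(-0.5 a x_i^2 + b x_i - J_ij x_i x_j).
            Lam_new[i, j] = -J[i, j] ** 2 / a
            eta_new[i, j] = -J[i, j] * b / a
    if np.allclose(Lam_new, Lam) and np.allclose(eta_new, eta):
        break
    Lam, eta = Lam_new, eta_new

prec = J.diagonal() + Lam.sum(axis=0)       # belief precisions
mean_bp = (h + eta.sum(axis=0)) / prec      # belief means

print("loopy BP means:", mean_bp)
print("exact means   :", np.linalg.solve(J, h))  # identical at convergence
print("BP variances  :", 1.0 / prec)             # generally not exact
--------------------------------------------------------------------------

On a tree the same update is Pearl's exact algorithm; the point of the TR is that on loopy Gaussian graphs the means remain exact at convergence even though the variances need not.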
From masaaki at hip.atr.co.jp Wed Jun 9 05:12:48 1999 From: masaaki at hip.atr.co.jp (Masa-aki SATO) Date: Wed, 9 Jun 1999 18:12:48 +0900 Subject: TR: Fast Learning of On-line EM Algorithm Message-ID: <01BEB2A3.AEC1C360@hippc4660.hip.atr.co.jp> The following paper is available on my web site: http://www.hip.atr.co.jp/~masaaki/ We would greatly appreciate comments and suggestions. TITLE: "Fast Learning of On-line EM Algorithm" Masa-aki Sato ATR Human Information Processing Research Laboratories ------------------------------------------------------------------ Abstract In this article, an on-line EM algorithm is derived for general Exponential Family models with Hidden variables (EFH models). It is proven that the on-line EM algorithm is equivalent to a stochastic gradient method with the inverse of the Fisher information matrix as a coefficient matrix. As a result, stochastic approximation theory guarantees convergence to a local maximum of the likelihood function. The performance of the on-line EM algorithm is examined using the mixture-of-Gaussians model, which is a special type of EFH model. The simulation results show that the on-line EM algorithm is much faster than the batch EM algorithm and the on-line gradient ascent algorithm. The fast learning speed is achieved by the systematic design of the learning rate schedule. Moreover, it is shown that the on-line EM algorithm can escape from a local maximum of the likelihood function in the early training phase, even when the batch EM algorithm is trapped at a local maximum solution. It is pointed out that the on-line EM algorithm has a form similar to the natural gradient method proposed by Amari (1998), which gives optimal asymptotic convergence. The inverse of the Fisher information matrix in the on-line EM algorithm may contribute to its fast learning performance. In our on-line EM algorithm, however, it is not necessary to calculate the inverse of the Fisher information matrix. In the future, it would be interesting to study the relation of our algorithm to the natural gradient method. -------------------------------------- Masa-aki Sato ATR Human Information Processing Research Laboratories 2-2, Hikaridai, Seika-cho, Soraku-gun Kyoto 619-0288 Japan phone : 0774-95-1039 fax : 0774-95-1008 E-mail: masaaki at hip.atr.co.jp
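For a feel for the method, here is a minimal sketch of on-line EM on a two-component one-dimensional Gaussian mixture: each incoming point gets a single E-step (its posterior responsibilities), the sufficient statistics are updated with an exponentially decaying step size, and the M-step reads the parameters off the statistics in closed form. The step-size schedule, the initial values, and all names below are illustrative assumptions, not the paper's exact algorithm or its optimized learning-rate schedule:

--------------------------------------------------------------------------
import numpy as np

rng = np.random.default_rng(0)

# Stream of 20000 points from a two-component 1-D Gaussian mixture.
x_stream = np.where(rng.random(20000) < 0.3,
                    rng.normal(-2.0, 0.5, 20000),
                    rng.normal(3.0, 1.0, 20000))

K = 2
w = np.full(K, 1.0 / K)          # mixing weights
mu = np.array([-1.0, 1.0])       # means (rough initial guesses)
var = np.ones(K)                 # variances

# Exponentially weighted sufficient statistics <r>, <r x>, <r x^2>.
s0, s1, s2 = np.full(K, 1.0 / K), mu / K, (mu ** 2 + var) / K

for t, x in enumerate(x_stream, start=1):
    eta = 1.0 / (t ** 0.6 + 10.0)        # decaying step size (illustrative)
    # E-step on the single new point: posterior responsibilities r_k.
    logp = np.log(w) - 0.5 * ((x - mu) ** 2 / var + np.log(2 * np.pi * var))
    r = np.exp(logp - logp.max())
    r /= r.sum()
    # Stochastic-approximation update of the sufficient statistics.
    s0 = (1 - eta) * s0 + eta * r
    s1 = (1 - eta) * s1 + eta * r * x
    s2 = (1 - eta) * s2 + eta * r * x * x
    # M-step: parameters are closed-form functions of the statistics.
    w = s0 / s0.sum()
    mu = s1 / s0
    var = np.maximum(s2 / s0 - mu ** 2, 1e-6)

print("weights:", w)     # approx [0.3, 0.7]
print("means  :", mu)    # approx [-2, 3]
print("vars   :", var)   # approx [0.25, 1.0]
--------------------------------------------------------------------------

Because the M-step is exact given the statistics, there is no gradient step size to tune per parameter; only the single schedule eta controls how quickly old data are forgotten, which is the knob the abstract says was designed systematically.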
From C.Campbell at bristol.ac.uk Wed Jun 9 08:16:16 1999 From: C.Campbell at bristol.ac.uk (I C G Campbell) Date: Wed, 9 Jun 1999 13:16:16 +0100 (BST) Subject: PhD studentship available Message-ID: <199906091216.NAA12541@zeus.bris.ac.uk> PhD STUDENTSHIP AVAILABLE A three-year studentship is available in the area of Support Vector Machines and their applications. The project has a theoretical component and an applied component, principally the application of SVMs and related kernel classifiers to biosequences. The student will be a member of the Computational Intelligence Group (see our web pages http://lara.enm.bris.ac.uk/cig/ ), which is part of the Advanced Computing Research Centre, Bristol University (http://www.cs.bris.ac.uk/ACRC/ ). Suitable applicants should have a strong mathematical background; some additional computing experience would be preferred. The studentship comprises payment of fees and an annual maintenance grant at EPSRC rates. To apply for this post please send a current CV to: C.Campbell at bris.ac.uk Colin Campbell, ACRC, Bristol University, United Kingdom From frey at dendrite.uwaterloo.ca Wed Jun 9 08:39:48 1999 From: frey at dendrite.uwaterloo.ca (Brendan Frey) Date: Wed, 9 Jun 1999 08:39:48 -0400 Subject: loopy learning Message-ID: <199906091239.IAA08681@dendrite.uwaterloo.ca> It turns out that loopy propagation, or "turboinference", in Gaussian networks can be used effectively for INFERENCE and LEARNING: http://www.cs.toronto.edu/~frey/papers/tfa-nc99.abs.html In this paper (submitted for publication in May), I show that iterative probability propagation in factor analysis networks has a fixed point and I give an eigenvalue condition for global convergence. I also show that iterative propagation can be used for learning factor analyzer networks and give results on face recognition. Brendan. From mdorigo at ulb.ac.be Wed Jun 9 09:31:46 1999 From: mdorigo at ulb.ac.be (Marco DORIGO) Date: Wed, 9 Jun 1999 15:31:46 +0200 Subject: Fourth European Workshop on Reinforcement Learning: Call for Participation and Abstracts Message-ID: Call for Abstracts: EWRL-4, Fourth European Workshop on Reinforcement Learning Lugano, Switzerland, October 29-30, 1999 (We apologize for duplicates of this email) Reinforcement learning (RL) is a growing research area. To build a European RL community and give visibility to the current situation on the old continent, we are running a now-biennial series of workshops. EWRL-1 took place in Brussels, Belgium (1994), EWRL-2 in Milano, Italy (1995), EWRL-3 in Rennes, France (1997). EWRL-4 will take place in Lugano, Switzerland (1999). The first morning will feature a plenary talk by Dario Floreano. The rest of the two-day workshop will be dedicated to presentations given by selected participants. Presentation length will be determined once we have some feedback on the number of participants. The number of participants will be limited. Access will be restricted to active RL researchers and their students. Please communicate as soon as possible, and in any case before the end of July 1999, your intention to participate by means of the intention form attached below (e-mail preferred: ewrl at iridia.ulb.ac.be). Otherwise send intention forms to: Marco Dorigo IRIDIA, CP 194/6 Universite' Libre de Bruxelles Avenue Franklin Roosevelt 50 1050 Bruxelles Belgium TIMELINE: intention forms and one page abstracts should be emailed by the end of July to ewrl at iridia.ulb.ac.be Up-to-date information, including registration fees, hotel information, etc., is maintained at: http://iridia.ulb.ac.be/~ewrl/EWRL4/EWRL4.html The Organizing Committee Marco Dorigo and Hugues Bersini, IRIDIA, ULB, Brussels, Belgium, Luca M. Gambardella and Juergen Schmidhuber, IDSIA, Lugano, Switzerland, Marco Wiering, University of Amsterdam, The Netherlands.
-------------------------------------------------------------------- INTENTION FORM (to be emailed by the end of July, 1999, to ewrl at iridia.ulb.ac.be) Fourth European Workshop on Reinforcement Learning (EWRL-4) Lugano, Switzerland, October 29-30, 1999 Family Name: First Name: Institution: Address: Phone No.: Fax No.: E-mail: ____ I intend to participate without giving a presentation ____ I intend to participate and would like to give a presentation with the following title: ____ MAX one page abstract: From a.burkitt at medoto.unimelb.edu.au Wed Jun 9 21:58:23 1999 From: a.burkitt at medoto.unimelb.edu.au (Anthony BURKITT) Date: Thu, 10 Jun 1999 11:58:23 +1000 Subject: Preprint available Message-ID: <60E1B9CE4896D111A22700E0291005973580DA@mail.medoto.unimelb.edu.au> The following paper on the analysis of integrate-and-fire neurons has been accepted for publication in Neural Computation and is available now from my web page: http://www.medoto.unimelb.edu.au/people/burkitta/poisson.ps.zip "Calculation of interspike intervals for integrate and fire neurons with Poisson distribution of synaptic inputs" A. N. Burkitt and G. M. Clark Abstract: In this paper we present a new technique for calculating the interspike intervals of integrate-and-fire neurons. There are two new components to this technique. Firstly, the probability density of the summed potential is calculated by integrating over the distribution of arrival times of the afferent postsynaptic potentials (PSPs), rather than using conventional stochastic differential equation techniques. A general formulation of this technique is given in terms of the probability distribution of the inputs and the time course of the postsynaptic response. The expressions are evaluated in the Gaussian approximation, which gives results that become more accurate for large numbers of small-amplitude PSPs. Secondly, the probability density of output spikes, which are generated when the potential reaches threshold, is given in terms of an integral involving a conditional probability density. This expression is a generalization of the renewal equation, but it holds for both leaky neurons and for situations in which there is no time-translational invariance. The conditional probability density of the potential is calculated using the same technique of integrating over the distribution of arrival times of the afferent PSPs. For inputs with a Poisson distribution the known analytic solutions for both the perfect integrator model and the Stein model (which incorporates membrane potential leakage) in the diffusion limit are obtained. The interspike interval distribution may also be calculated numerically for models which incorporate both membrane potential leakage and a finite rise time of the postsynaptic response. Plots of the relationship between input and output firing rates as well as the coefficient of variation are given, and inputs with varying rates and amplitudes, including inhibitory inputs, are analyzed. The results indicate that neurons functioning near their critical threshold, where the inputs are just sufficient to cause firing, display a large variability in their spike timings.
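As a quick numerical companion to that closing observation, here is a minimal Monte-Carlo sketch of the Stein model (leaky integration of Poisson PSP arrivals, with threshold and reset) that estimates the interspike-interval mean and coefficient of variation by direct simulation; these are the quantities the paper computes analytically. All parameter values and names are arbitrary illustrative choices, with the mean drive a*rate*tau set just above threshold, the near-threshold regime where the abstract predicts large spike-timing variability:

--------------------------------------------------------------------------
import numpy as np

rng = np.random.default_rng(1)

# Stein model: dV/dt = -V/tau + a * sum_k delta(t - t_k), threshold + reset.
tau = 10.0      # membrane time constant (ms)
theta = 1.0     # firing threshold
a = 0.02        # excitatory PSP amplitude
rate = 5.5      # total input rate per ms (e.g. 550 afferents at 10 Hz)
dt = 0.01       # time step (ms)
T = 20000.0     # total simulated time (ms)

n_steps = int(T / dt)
decay = np.exp(-dt / tau)
# Number of Poisson PSP arrivals in each time bin.
events = rng.poisson(rate * dt, size=n_steps)

V, t_last, isis = 0.0, 0.0, []
for i in range(n_steps):
    V = V * decay + a * events[i]      # leak, then add incoming PSPs
    if V >= theta:                     # spike: record ISI and reset
        t = i * dt
        isis.append(t - t_last)
        t_last, V = t, 0.0

isis = np.asarray(isis)
print(f"{len(isis)} spikes, mean ISI = {isis.mean():.1f} ms, "
      f"CV = {isis.std() / isis.mean():.2f}")
# Mean drive a*rate*tau = 1.1*theta: firing relies partly on fluctuations,
# so the coefficient of variation comes out substantial, as the abstract
# predicts for neurons operating near their critical threshold.
--------------------------------------------------------------------------

Raising the drive well above threshold (e.g. rate = 10) makes firing drift-dominated and the CV drops sharply, which is the qualitative dependence the paper's analytic plots trace out.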
====================ooOOOoo==================== Anthony N. Burkitt The Bionic Ear Institute 384-388 Albert Street East Melbourne, VIC 3002 Australia Email: a.burkitt at medoto.unimelb.edu.au http://www.medoto.unimelb.edu.au/people/burkitta Phone: +61 - 3 - 9283 7510 Fax: +61 - 3 - 9283 7518 =====================ooOOOoo=================== From ericwan at ece.ogi.edu Thu Jun 10 14:59:53 1999 From: ericwan at ece.ogi.edu (Eric Wan) Date: Thu, 10 Jun 1999 18:59:53 +0000 Subject: OGI PH.D. STUDENT RESEARCH POSITION Message-ID: <37600B29.9CDF09A2@ece.ogi.edu> *********** PH.D. STUDENT RESEARCH POSITION OPENING **************** CENTER FOR SPOKEN LANGUAGE UNDERSTANDING http://cslu.cse.ogi.edu/ OREGON GRADUATE INSTITUTE The Oregon Graduate Institute of Science and Technology (OGI) has an immediate opening for an outstanding student in its Electrical and Computer Engineering Ph.D. program. A full stipend and tuition will be covered. The student will work specifically with Professor Eric A. Wan (http://www.ece.ogi.edu/~ericwan/) on a number of projects relating to neural network learning and speech enhancement. QUALIFICATIONS: The candidate should have a strong background in signal processing with some prior knowledge of neural networks. A Masters Degree in Electrical Engineering is preferred. Please send inquiries and background information to ericwan at ece.ogi.edu. Eric A. Wan Associate Professor, OGI ********************************************************************* OGI OGI is a young, but rapidly growing, private research institute located in the Portland area. OGI offers Masters and PhD programs in Computer Science and Engineering, Applied Physics, Electrical Engineering, Biology, Chemistry, Materials Science and Engineering, and Environmental Science and Engineering. OGI has world-renowned research programs in the areas of speech systems (Center for Spoken Language Understanding) and machine learning (Center for Information Technologies). Center for Spoken Language Understanding The Center for Spoken Language Understanding is a multidisciplinary academic organization that focuses on basic research in spoken language systems technologies, training of new investigators, and development of tools and resources for free distribution to the research and education community. Areas of specific interest include speech recognition, natural language understanding, text-to-speech synthesis, speech enhancement in noisy conditions, and modeling of human dialogue. A key activity is the ongoing development of the CSLU Toolkit, a comprehensive software platform for learning about, researching, and developing spoken dialog systems and new applications. Center for Information Technologies The Center for Information Technologies supports development of powerful, robust, and reliable information processing techniques by incorporating human strategies and constraints. Such techniques are critical building blocks of multimodal communication systems, decision support systems, and human-machine interfaces. The CIT approach is based on emulating relevant human information processing capabilities and extending them to a variety of complex tasks. The approach requires expertise in nonlinear and adaptive signal processing (e.g., neural networks), statistical computation, decision analysis, and modeling of human information processing.
Correspondingly, CIT research areas include perceptual characterization of speech and images, prediction, robust signal processing, rapid adaptation to changing environments, nonlinear signal representation, integration of information from several sources, and integration of prior knowledge with adaptation. From jbower at bbb.caltech.edu Thu Jun 10 18:22:12 1999 From: jbower at bbb.caltech.edu (James M. Bower) Date: Thu, 10 Jun 1999 15:22:12 -0700 Subject: REGISTRATION FOR CNS*99 Message-ID: ************************************************************************ EIGHTH ANNUAL COMPUTATIONAL NEUROSCIENCE MEETING (CNS*99) July 18 - 22, 1999 Pittsburgh, Pennsylvania REGISTRATION INFORMATION ************************************************************************ Registration is now open for this year's Computational Neuroscience meeting (CNS*99). This is the eighth in a series of annual inter-disciplinary conferences intended to address the broad range of research approaches and issues involved in the general field of computational neuroscience. As in previous years, this meeting will bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in understanding how biological neural systems compute. The meeting will equally emphasize experimental, model-based, and more abstract theoretical approaches to understanding neurobiological computation. The meeting in 1999 will take place at Pittsburgh Hilton and Towers in Pittsburgh, Pennsylvania and include plenary, contributed, and poster sessions. The first session starts at 9 am, Sunday July 18th and the meeting ends with the annual CNS banquet on Thursday evening, July 22nd. There will be no parallel sessions. The meeting includes two half days of informal workshops focused on current issues in computational neuroscience. Day care will not be available. LOCATION: The meeting will take place at the Pittsburgh Hilton and Towers in Pittsburgh, Pennsylvania MEETING ACCOMMODATIONS: Accommodations for the meeting have been arranged at Pittsburgh Hilton and Towers. Information concerning reservations, hotel accommodations, etc., is available at the meeting web site indicated below. A block of rooms is reserved at special rates. 40 student rate rooms are available on a first-come-first-served basis, so we recommend that students act quickly to reserve them. NOTE that registering for the meeting WILL NOT result in an automatic room reservation. Instead you must make your own reservations by contacting the hotel itself. As this is the high season for tourists in Pittsburgh, you should make sure to reserve your accommodations quickly by contacting: Pittsburgh Hilton and Towers (RESERVATION REQUEST ORDER FORM LOCATED BELOW) NOTE: IN ORDER TO GET THE AVAILABLE ROOMS, YOU MUST CONFIRM HOTEL REGISTRATIONS BY SATURDAY, JUNE 17, 1999. When making reservations by phone, make sure to indicate that you are registering for the Computational Neuroscience (CNS*99) meeting. Students will be asked to verify their status at check-in with a student ID or other documentation. MEETING REGISTRATION FEES: Registration received on or before July 3, 1999: Student: $ 125 Regular: $ 275 Meeting registration after July 3, 1999: Student: $ 150 Regular: $ 300 BANQUET: Registration for the meeting includes a single ticket to the annual CNS Banquet. Additional Banquet tickets can be purchased for $35 per person. The banquet will be held on Thursday, July 22nd.
********************************************************************* REGISTRATION AND ADDITIONAL INFORMATION (including the agenda with list of talks) can be obtained by: o Using our on-line WWW information and registration server, URL of: http://cns.numedeon.com/cns99/ o Sending Email to: cns99 at bbb.caltech.edu PLEASE FAX OR MAIL REGISTRATION FORM TO: Caltech, Division of Biology 216-76, Pasadena, CA 91125 Attn: Judy Macias Fax Number: (626) 795-2088 (Refund Policy: 50% refund for cancellations on or before July 9th; no refund after July 10th) ******************************************************************* PLEASE CALL PITTSBURGH HILTON AND TOWERS TO MAKE HOTEL RESERVATIONS AT (412) 391-4600 Fax (412) 594-5144 OR YOU CAN SEND THE REGISTRATION FORM IN TWO WAYS: MAIL IT TO THE PITTSBURGH HILTON AND TOWERS AT THE ADDRESS BELOW, OR FAX THE FORM BELOW TO (412) 594-5144 ********************************************************************** MAIL TO: Pittsburgh Hilton and Towers Attn: Reservation Department 600 Commonwealth Place Pittsburgh, Pennsylvania 15222 Check-In Time: 3:00 p.m. Check-Out Time: 12:00 noon Computational Neuroscience Conference - CNS*99 July 17 - 22, 1999 * A $50 early departure fee will be assessed should you change your departure date after you have checked in. ** The hotel requires a 1 night advance deposit for all reservations. The deposit is refundable up to 3 days before arrival. Checks and major credit cards are acceptable to establish the deposit. REQUESTS MUST BE RECEIVED BY: SATURDAY, JUNE 17, 1999 PLEASE RESERVE ____________ ROOM(S) OF THE TYPE CIRCLED ___________ ARRIVAL (DAY/DATE)______________ TIME ______________ DEPARTURE (DAY/DATE)____________ TIME ______________ Name of Person Requesting Rooms: Last Name:____________________________ First Name:____________________________ Company Name:________________________ Institute:_______________________________ Street Address or PO Box Number:___________________________ City:___________________________________ State:___________________________________ Zip Code:________________________________ Area Code and Phone Number:______________________________ PERSON SHARING ACCOMMODATIONS: 1. ____________________________ 2. ____________________________ 3. ____________________________ __ ADVANCE DEPOSIT (check) __ AMERICAN EXPRESS __ MASTERCARD __ VISA __ CARTE BLANCHE __ DINERS __ DISCOVER Credit Card No. ______________________________________ Expiration Date:______________________________________ SCHEDULE OF RATES (CIRCLE 1) Queen 1 person/1 bed $119.00 Queen 2 persons/1 bed $119.00 Double/Double 2 persons/2 beds $119.00 Plus prevailing State Sales Tax 14% Additional Person charge is $20.00 Cut Off Date: June 26, 1999 HHONORS Membership No. ______________________________________ **If a room is not available at the rate requested, the reservation will be made at the next available rate. ******************************************************************** CNS*99 MEETING AGENDA SUNDAY, JULY 18, 1999 9:00 Welcoming Remarks and General Information 9:15 Featured Contributed Talk: Boris Gutkin (Center for Neuroscience at University of Pittsburgh) G. Bard Ermentrout A Canonical Theory of Spike Generation in Cortical Neurons Accounts for Complex Neural Responses to Constant and Time-varying Stimuli Contributed Talks 10:05 Frances Chance (Brandeis University) Sacha Nelson, and L. F.
Abbott Recurrent Cortical Amplification Produces Complex Cell Responses 10:25 Brian Blais (Brown University) Ann Besu, Harel Shouval, and Leon Cooper Statistics of LGN Activity Determine the Segregation of ON/OFF Subfields for Simple Cells in Cortex 10:45 Alain Destexhe (Laval University) Denis Pare Correlated Synaptic Bombardment in Neocortical Pyramidal Neurons in Vivo 11:05 Break 11:20 Reinoud Maex (University of Antwerp) Bart P. Vos and Erik De Schutter Imprecise Spike Synchronization Reveals Common Excitation through Weak Distributed Synapses 11:40 Allan Coop (Rockefeller University) George Reeke Simulating the Temporal Evolution of Neuronal Discharge 12:00 Gwendal le Masson (Institut François Magendie) Emmanuel Barbe, Valerie Morisset, and Frederic Nagy From Current Clamp Experiments to Conductance-based Model Neurons: A Direct Link Using A New Error Function and Optimization Scheme 12:20 Lunch Break and Poster Preview Session A 2:20 Featured Contributed Talk: Rama Ratnam (University of Illinois) Mark Nelson Impact of Afferent Spike Train Irregularity on the Detection of Weak Sensory Signals Contributed Talks 3:10 Elise Cassidente (Carnegie Mellon University) Xiaogang Yan and Tai Sing Lee A Bayesian Decision Approach to Decode Local and Contextual Signals from Spike Trains 3:30 Alexander Dimitrov (MSU Center for Computational Biology) John P. Miller Natural Time Scale for Neural Encoding 3:50 Break 4:10 Kevin Otto (Arizona State University) Rousche Patrick and Daryl Kipke Investigating Neural Coding and Plasticity in Auditory Cortex using Real-Time Feedback from Ensemble Neural Recordings 4:30 End of Day Announcements 8:00 Poster Session A Pamela Abshire (Johns Hopkins University) Andreas Andreou Relating Information Capacity to a Biophysical Model for Blowfly Retina Paul Adams (SUNY Stony Brook) Kingsley Cox Implications of Digital Synapses for Thalamocortical Function Ildiko Aradi (Ohio University) William Holmes Synchronized Oscillation in Networks of Hippocampal Dentate Gyrus Interneurons with Different Adaptation Properties Per Aronsson (Royal Institute of Technology) Hans Liljenström Electromagnetic Interaction in a Neural System Giorgio A. Ascoli (George Mason University) Jeffrey L. Krichmar L-Neuron: A Modeling Tool for the Efficient Generation and Parsimonious Description of Dendritic Morphology Davis Barch (University of California at Berkeley) Characterization of Static Input by Activity Oscillations in an Excitable Membrane Model: Effect of Input Size, Shape and Texture on Oscillation Parameters Hauke Bartsch (Technische University) The Influence of Threshold Variability on the Response of Visual Cortical Neurons John Beggs (Yale University) Why LTP and LTD are Asymmetric: A Bin Model to Explain Induction Parameters Alan Bond (California Institute of Technology) Problem-solving Behavior in a System Model of the Primate Neocortex Vladimir Bondarenko (University of Pittsburgh) Teresa Chay The Role of AMPA, GABA, [Ca2+]i, and Calcium Stores in Propagating Waves in Neuronal Networks Mihail Bota (USC Brain Project) Alex Guazzelli and Michael Arbib The Extended Taxon-Affordances Model: Egocentric Navigation and the Interactions between the Prefrontal Cortex and the Striatum Hans A. Braun (University of Marburg) Martin T.
Huber, Mathias Dewald, Karlheinz Voigt, Alexander Neimann, Xing Pei, and Frank Moss A Computer-Model of Temperature Encoding in Peripheral Cold Receptors: Oscillations, Noise and Chaotic Dynamics Adam Briska (University of Wisconsin) Daniel Uhlrich and William Lytton Independent Dendritic Domains in the Thalamic Circuit Dyrk Brockmann (Max-Planck-Institut) Theo Geisel The Ecology of Gaze Shifts David Brown (The Babraham Institute) Jianfeng Feng Low Correlation between Random Synaptic Inputs Impacts Considerably on the Output of the Hodgkin-Huxley Model Emery N. Brown (Massachusetts General Hospital) Riccardo Barbieri, Michael C. Quirk, Loren M. Frank, and Matthew A. Wilson Constructing a Time-dependent Gamma Probability Model of Spatial Information Encoding in the Rat Hippocampus Nicolas Brunel (Brandeis University) Phase Diagrams of Sparsely Connected Networks of Excitatory and Inhibitory Spiking Neurons Anthony Burkitt (The Bionic Ear Institute) Graeme Clark Analysis of Synchronization in the Response of Neurons to Noisy Periodic Synaptic Input Anthony Burkitt (The Bionic Ear Institute) Interspike Interval Variability for Balanced Networks with Reversal Potentials for Large Numbers of Inputs Marcelo Camperi (University of San Francisco) Peter Pacheco, Nicola Rugai, and Toshiyuki Uchino NEUROSYS: An Easy-to-Use System for the Simulation of Very Large Networks of Biologically Accurate Neurons on Parallel Computers Marcelo Camperi (University of San Francisco) Nicola Rugai Modeling Dopaminergic Modulation of Delay-Period Activity in Prefrontal Cortex During Working Memory Processes Carmen Canavier (University of New Orleans) Reciprocal Excitatory Ohmic Synapses Convert Pacemaker-Like Firing into Burst Firing in a Simple Model of Coupled Neurons Gal Chechik (Tel-Aviv University) Isaac Mailijson and Eytan Ruppin Neuronal Normalization Provides Effective Learning through Ineffective Synaptic Learning Rules Yoonsuck Choe (The University of Texas at Austin) Risto Miikkulainen and Lawrence K. Cormack Effects of Presynaptic and Postsynaptic Resource Redistribution in Hebbian Weight Adaptation Carson Chow (University Of Pittsburgh) Nancy Kopell Dynamics of Spiking Neurons with Electrical Coupling Thomas Coates (Pennsylvania State University) Control and Monitoring of a Parallel Processed Neural Network via the World Wide Web Eyal Cohen (Tel-Aviv University) Nir Levy and Eytan Ruppin Global Versus Local Processing of Compressed Representations: A Computational Model of Visual Search Gennady S. Cymbalyuk (Emory University) Ronald L. Calabrese Oscillatory Behaviors in Pharmacologically Isolated Heart Interneurons from the Medicinal Leech Yue Dai (University of Manitoba) Kelvin Jones, Brent Fedirchuk, and Larry Jordan Regulation of the Action Potential Voltage Threshold in Cat Spinal Motoneurons during Fictive Locomotion Field David (Cornell University) Learning Wavelet-Like Receptive Fields from Natural Scenes with a Biologically Plausible De-correlation Network Marilene de Pinho (Universidade De São Paulo) Marcelo Mazza and Antonio Roque-da-Silva A Biologically Plausible Computer Simulation of Classical Conditioning Induced Reorganization of Tonotopic Maps in the Auditory Cortex Gustavo Deco (Siemens AG) Josef Zihl Neurodynamical Mechanism of Binding and Selective Attention for Visual Search Alain Destexhe (Laval University) Helmut Kroger Consequences of Correlated Synaptic Bombardment on Dendritic Integration in Neocortical Pyramidal Neurons Patricia M.
Di Lorenzo (SUNY at Binghamton) Kurt Grandis and Christian Reich Stimulation of Sodium Channels in Taste Receptor Cells Provides Noise that Enhances Taste Detection L. M. Dobbs (Schafer Corporation) T. J. Manuccia and J. L. Murphy Planar Optically Switched Microelectrode Array (OSMA) Silke Dodel (Max Planck Institute) J. Michael Herrmann, Theo Geisel, and Jens Frahm Components of Brain Activity - Data Analysis for fMRI Gideon Dror (The Academic College of Tel-Aviv-Yaffo) Misha Tsodyks Activity of Coupled Excitatory and Inhibitory Neural Populations with Dynamic Synapses Gideon Dror (The Academic College of Tel-Aviv-Yaffo) Misha Tsodyks Chaotic Phenomena in Neural Populations with Dynamic Synapses Witali Dunin-Barkowski (Texas Tech University) Donald Wunsch Phase-Based Cerebellar Learning of Dynamic Signals Gaute T. Einevoll (Agricultural University of Norway) Paul Heggelund Mathematical Models for Spatial Receptive-Field Organization of DLGN Neurons in CAT Chris Eliasmith (Washington University in St. Louis) Charles H. Anderson Rethinking Central Pattern Generators: A General Framework Steven Epstein (Boston University) Jason Ritt, Yair Manor, Farzan Nadim, Eve Marder, and Nancy Kopell Network Oscillations Generated by Balancing Graded Asymmetric Reciprocal Inhibition in Passive Neurons T. Ghaffari Farazi (University of Southern California) J.-S. Liaw and T.W. Berger Functional Implications of Synaptic Morphology Stuart Feerick (The Babraham Institute) Jianfeng Feng and David Brown Random Pulse input Versus Continuous Current plus White/colored Noise: Are They Equivalent? Jianfeng Feng (The Babraham Institute) Stimulus-Evoked Oscillatory Synchronization in Neuronal Models Joseph Francis (George Washington University) Bruce Gluckman and Steven Schiff Deterministic Structure in Data from a Free Running Neuronal Ensemble: A Comparison of Three Non-linear Tests for Determinism Mark Fuhs (Carnegie Mellon University) David Touretzky Synaptic Learning Models of Map Separation in the Hippocampus Tomoki Fukai (Tokai University) Seinichi Kanemura Precisely-Timed Transient Synchronization by Synaptic Depression Ryuta Fukuda (Keio University) Satoshi Nagayasu, Junko Hara, William Shankle, Masaru Tomita Suggesting Human Cortical Connectivity for Language-related Areas and Simulations of its Computational Model Enrique Garibay (Brandeis University) Xiao-Jing Wang Information Transfer in a Model Neuron with Correlated Inputs Daniel Gill (The Hebrew University) Lidror Troyansky and Israel Nelken Auditory Localization using Direction-dependent Spectral Information Simon F. Giszter (MCPHU) William J. Kargo On-line Limb Trajectory Adaptation by Assembly and Control of Force-field Primitives Using Modular Encapsulated Feedback J. Randall Gobbel (Carnegie Mellon University) Reinforcement Learning in a Biophysical Model of Basal Ganglia-Neocortex Loops MONDAY, JULY 19, 1999 9:00 General Information 9:15 Featured Contributed Talk: Don Johnson (Rice University) Charlotte Gruner, Raymon Glantz Quantifying Information Transfer in Spike Generation Contributed Talks 10:05 Daniel Butts (Lawrence Berkeley National Laboratory) Daniel Rokhsar The Information Content of Spontaneous Retinal Waves 10:25 Mona Spiridon (EPFL, Swiss Federal Institute of Technology) Carson Chow and Wulfram Gerstner Signal Transmission through A Population of Integrate-and-fire Neurons 10:45 Paul H.E. Tiesinga (Salk Institute) Jorge V.
Jose Driven by Inhibition 11:05 Break 11:20 Jianfeng Feng (The Babraham Institute) Stimulus-Evoked Oscillatory Synchronization in Neuronal Models 11:40 Charles Anderson (Washington University) Qingfeng Huang and John Clark Harmonic Analysis of Spiking Neuronal Ensembles 12:00 Steven Schiff (Krasnow Institute) David Colella Brain Chirps: Spectrographic Signatures of Epileptic Seizures 12:20 Vikaas Sohal (Stanford University School of Medicine) Molly Hunstman and John Huguenard Reciprocal Inhibitory Connections Produce Phase Lags That Desynchronize Intrathalamic Oscillations 12:40 Lunch Break and Poster Preview Session B 2:00 Featured Contributed Talk: Nancy Kopell (Boston University) Bard Ermentrout, Miles Whittington, and Roger Traub Gamma and Beta Rhythms Have Different Synchronization Properties Contributed Talks 2:50 Carson Chow (University of Pittsburgh) Carlo Laing and G. Bard Ermentrout Bump Solutions in a Network of Spiking Neurons 3:10 Christian W. Eurich (Institute of Theoretical Physics) Klaus Pawelzik, Udo Ernst, Jack Cowan, and John Milton Delay Adaptation in the Nervous System 3:30 Kay Robbins (University of Texas at San Antonio) David Senseman The Relationship of Response Latency to Modal Decomposition: Analysis of the Initial Spread of Visually-evoked Cortical Depolarization 3:50 Break 4:10 Ernst Niebur (Johns Hopkins University) Arup Roy, Peter Steinmetz, and Kenneth Johnson Model-free Detection of Synchrony in Neuronal Spike Trains, with an Application to Primate Somatosensory Cortex 4:30 Invited Talk: To be announced 5:20 End of Day Announcements 8:00 Poster Session B Mark Goldman (Brandeis University) Jorge Golowasch, Laurence Abbott, and Eve Marder Dependence of Firing Pattern on Intrinsic Ionic Conductances: Sensitive and Insensitive Combinations Anatoli Gorchetchnikov (Middle Tennessee State University) Introduction of Threshold Self-adjustment Improves the Convergence in Feature-detective Neural Nets Boris S. Gutkin (University of Pittsburgh) G. Bard Ermentrout and Joseph O'Sullivan Layer 3 Patchy Recurrent Excitatory Connections May Determine the Spatial Organization of Sustained Activity in the Primate Prefrontal Cortex Junko Hara (University of California at Irvine) Ryuta Fukuda, William Shankle, and James Fallon Estimating Cortical Connectivity from Statistical Properties of the Microscopic Features of the Developing Human Cerebral Cortex: Comparison to Contemporary Methods and Relevance to Computational Modeling Jeanette Hellgren Kotaleski (NADA) Patrik Krieger Simulation of Metabotropic Glutamate Receptor Induced Cellular and Network Effects in the Lamprey Locomotor CPG Jeanette Hellgren Kotaleski (NADA) Alexander Kozlov, Erik Aurell, Sten Grillner, and Anders Lansner Modeling of Plasticity of the Synaptic Connections in the Lamprey Spinal CPG - Consequences for Network Behavior Tim Hely (Santa Fe Institute) The Development of Corpus Callosum Connections in Primary Visual Cortex Alix Herrmann (Swiss Federal Institute of Technology) Wulfram Gerstner Effect of Noise on Neuron Transient Response J. Michael Herrmann (Max-Planck-Institut Fuer Stroemungsforschung) Klaus Pawelzik and Theo Geisel Learning Predictive Representations Claus C. 
Hilgetag (Newcastle University) Spatial Neglect and Paradoxical Lesion Effects in the CAT - A Model Based on Midbrain Connectivity Ulrich Hofmann (California Institute of Technology) Stephen Van Hooser, David Kewley, and James Bower Relationship between Field Potentials and Spike Activity in Rat S1: Multi-site Cortical Recordings and Simulations William Holmes (Ohio University) Comparison of CaMKinase II Activation in a Dendritic Spine Computed with Deterministic and Stochastic Models of the NMDA Synaptic Conductance Greg Hood (Pittsburgh Supercomputing Center) John Burkardt and Greg Foss Visualizing the Visual System David Horn (Tel Aviv University) Irit Opher Complex Dynamics of Neuronal Thresholds Osamu Hoshino (The University of Electro-Communications) Satoru Inoue, Yoshiki Kashimori, and Takeshi Kambara A Role of a Hierarchical Dynamical Map in Cognition and Binding Different Sensory Modalities Michael Howe (University of Texas at Austin) Risto Miikkulainen Hebbian Learning and Temporary Storage in the Convergence-Zone Model of Episodic Memory Fred Howell (University of Edinburgh) Jonas Dyhrfjeld-Johnsen, Reinhoud Maex, Nigel Goddard, and Erik De Schutter A Large Scale Simulation Model of the Cerebellar Cortex using PGENESIS Martin T. Huber (University of Marburg) Jürgen C. Krieg, Hans A. Braun, Xing Pei, and Frank Moss Do Stochastic-dynamic Effects Contribute to the Progression of Mood Disorders: Implications from Neurodynamic Modelling Hidetoshi Ikeno (Himeji Institute of Technology) Shiro Usui Information Processing by Electro-diffusion in the Kenyon Cell Satoru Inoue (The University of Electro-Communications) Yoshiki Kashimori, Osamu Hoshino, and Takeshi Kambara A Neuronal Model of Nucleus Laminaris and Inferior Colliculus Detecting Microsecond Interaural Time Difference in Sound Localization David Jaffe (University of Texas at San Antonio) Nicholas Carnevale Morphological Determinants of Synaptic Integration Kelvin E. Jones (University of Manitoba) Kevin P. Carlin, Jeremy Rempel, Larry M. Jordan, and Rob M. Brownstone Dendritic Calcium Channels in Mouse Spinal Motoneurons: Implications for Bistable Membrane Properties Ranu Jung (University of Kentucky) David Magnuson Non-stationary Analysis of Extracellular Neural Activity Ranu Jung (University of Kentucky) Min Shao Robustness of the CGSA in Estimating the Hurst Exponent from Time Series with Fractal and Harmonic Components George Kalarickal (Massachusetts Institute of Technology) Jonathan Marshall Neural Model of Temporal and Stochastic Properties of Binocular Rivalry Takeshi Kambara (University of Electro-Communications) Yoshiki Kashimori A Positive Role of Noises in Accurate Detection of Time Difference by Electrosensory System of Weakly Electric Fish Jan Karbowski (Boston University) Nancy Kopell Multispikes and Synchronization in a Large Neural Network with Temporal Delays Matthias Kaschube (Max-Planck-Institut Fuer Stroemungsforschung) Fred Wolf, Theo Geisel, and Siegrid Loewel Quantifying the Variability of Patterns of Orientation Domains in the Visual Cortex of Cats Yoshiki Kashimori (Univ. of Electro-Communications) Takeshi Kambara A Role of Synaptic Variation Depending on Precise Timing of Pre- and Post-synaptic Depolarization in Electrolocation Adam Kepecs (Brandeis University) Xiao-Jing Wang An Analysis of Complex Bursting in Cortical Pyramidal Neuron Models Blackwell Kim (George Mason University) Characterization of the Light-induced Currents in Hermissenda D. O. Kim (Univ. Conn. Health Center) W. R.
D'Angelo Computational Model for the Bushy Cell of the Cochlear Nucleus Wonryull Koh (Texas A&M University) Bruce H. McCormick Distributed, Web-based Microstructure Database for Brain Tissue Wonryull Koh (Texas A&M University) Bruce H. McCormick, William R. Shankle, and James H. Fallon Geometric Modeling of Local Cortical Networks Alexander Kozlov (Russian Academy of Science) Erik Aurell, Tatiana Deliagina, Sten Grillner, Jeanette Hellgren-Kotaleski, Grigory Orlovsky, and Pavel Zelenin Modeling Control of Body Orientation in the Lamprey Sarah Lesher (University of Maryland) Li Guan and Avis Cohen Symbolic Time Series Analysis of Neural Data Nir Levy (Tel Aviv University) David Horn and Eytan Ruppin Distributed Synchrony in an Attractor of Spiking Neurons Shu-Chen Li (Max Planck Institute for Human Development) Ulman Lindenberger and Peter A. Frensch Unifying Levels of Cognitive Aging: From Neurotransmission to Representation to Cognition Hualou Liang (Florida Atlantic University) Mingzhou Ding and Steve Bressler On the Tracking of Dynamic Functional Relations in Monkey Cerebral Cortex Christiane Linster (Boston University) Eve Derosa, Michaella Maloney, and Michael Hasselmo Selective Cholinergic Suppression of Pre-strengthened Synapses: A Mechanism to Minimize Associative Interference between Odors John E. Love (Florida Institute of Technology) Kathleen M. Johnson GRAVICOGNITOR: Toward a Hybrid Fuzzy Cellular Neural Network Based on the Cytoarchitectonics of Biological Gravity-Sensing Organ Ontogenesis Huo Lu (California Institute of Technology) James Bower Noradrenergic Modulation of Interneurons in the Cerebellar Cortex Malcolm A. MacIver (University of Illinois) Mark E. Nelson Evidence for Closed-Loop Control of Prey Capture in Weakly Electric Fish Norbert Mayer (MPI Fuer Stroemungsforschung) Michael Herrmann and Theo Geisel Receptive Field Formation in Binocularly Deprived Cats Marcelo Mazza (Universidade De São Paulo) Antonio Roque-da-Silva Realistic Computer Simulation of Cortical Lesion Induced Imbalances in Properties of Somatotopic Maps Bruce H. McCormick (Texas A&M University) Richard W. DeVaul, William R. Shankle, and James H. Fallon Modeling Neuron Spatial Distribution and Morphology in the Developing Human Cerebral Cortex Rebecca McNamee (University Of Pittsburgh) Mingui Sun and Robert Sclabassi Use of a Neuro-Fuzzy Inference System (NFIS) for Modeling the Physiologic System of Beat-By-Beat Cardiac Control: Comparison to an Auto-Regressive Moving Average (ARMA) Model Georgiy Medvedev (Boston University) Charles Wilson, Jay Callaway, and Nancy Kopell A Dopaminergic Neuron as a Chain of Oscillators: Analysis of Transient Dynamics Eduardo Mercado III (Rutgers University) Catherine E. Myers and Mark A. Gluck Modeling Auditory Cortical Processing as an Adaptive Chirplet Transform Eugene Mihaliuk (West Virginia University) Kenneth Showalter Entrainment with Hebbian Learning Octavian D. Mocanu (Universidad Autónoma De Barcelona) Joan Oliver, Fidel Santamaría, and James Bower A Passive Featuring of the Cerebellar Granule Cell (The Branching Point Hypothesis) Benoit Morel (Carnegie Mellon University) Biologically Plausible Learning Rules for Neural Networks and Quantum Computing John Nafziger (University of Pennsylvania) Leif Finkel A Stimulus Density-dependent Normalization Mechanism for Modulating the Range of Contour Integration TUESDAY, JULY 20, 1999 9:00 General Information 9:15 Featured Contributed Talk: Seth Wolpert (Penn State University-Harrisburg) W.
Otto Friesen On the Parametric Stability of a Central Pattern Generator Contributed Talks 10:05 Robert Butera (Lab of Neural Control, NINDS, NIH) Sheree Johnson, Christopher Del Negro, John Rinzel, and Jeffrey Smith Dynamics of Excitatory Networks of Burst-capable Neurons: Experimental and Modeling Studies of the Respiratory Central Pattern Generator 10:25 Philip Ulinski (University of Chicago) Vivek Khatri Functional Significance of Interactions between Inhibitory Interneurons in Visual Cortex 10:45 Stella Yu (Carnegie Mellon University) Tai Sing Lee What do V1 Neurons Tell Us about Saccadic Suppression 11:05 Break 11:20 John A. White (Boston University) Matthew I. Banks, Nancy Kopell, and Robert A. Pearce A Novel Mechanism for Theta Modulation of Fast GABA_A Circuit Activity 11:40 Jeremy Caplan (Brandeis University) Michael Kahana, Robert Sekuler, Matthew Kirschen, and Joseph Madsen Task Dependence of Human Theta Oscillations during Virtual Maze Navigation 12:00 Albert Compte (Brandeis University) Nicolas Brunel and Xiao-Jing Wang Spontaneous and Spatially Tuned Persistent Activity in a Cortical Working Memory Model 12:20 Lunch Break and Poster Preview Session C 2:00 Featured Contributed Talk: A. D. Redish (University of Arizona) B. L. McNaughton and C. A. Barnes What Makes Place Cells Directional on the Linear Track? Contributed Talks 2:50 Victoria Booth (New Jersey Institute of Technology) Amitabha Bose and Michael Recce Hippocampal Place Cells and the Generation of Temporal Codes 3:10 Ali Minai (University of Cincinnati) Simona Doboli and Phillip Best A Comparison of Context-Dependent Hippocampal Place Codes In 1-Layer and 2-Layer Recurrent Networks 3:30 Mayank Mehta (Massachusetts Institute of Technology) Michael Quirk and Matthew Wilson From Hippocampus to V1: Effect of LTP on Spatio-Temporal Dynamics of Receptive Fields 3:50 Break 4:10 Jessica D. Bayliss (University of Rochester) Dana H. Ballard Single Trial P3 Epoch Recognition in a Virtual Environment 4:30 Geoffrey Goodhill (Georgetown University Medical Center) Andrei Cimponeriu Modeling the Joint Development of Ocular Dominance and Orientation Columns in Visual Cortex 4:50 Arjen van Ooyen (Netherlands Institute for Brain Research) David Willshaw Influence of Dendritic Morphology on Axonal Competition and Pattern of Innervation 5:20 End of Day Announcements 8:00 Poster Session C Hirofumi Nagashino (The University of Tokushima) Kazumi Achi and Yohsuke Kinouchi Synchronization with a Periodic Pulse Train in an Asymmetrically Coupled Neuronal Network Model Alexander Neiman (University of Missouri at St. Louis) Xing Pei, Frank Moss, Winfried Wojtenek, Lon Wilkens, Martin Huber, Mathias Dewald, and Hans Braun Spike Train Reveals Low Dimensional Deterministic Behaviors in a Hodgkin-Huxley Neuron with Intrinsic Oscillator Alexander Neiman (University of Missouri at St. Louis) Frank Moss, Pei Xing, David Russell, Winfried Wojtenek, Lon Wilkens, Hans Braun, and Martin Huber Synchronization of the Electroreceptors of the Paddlefish Alexander Neiman (University of Missouri at St.Louis) Ulrike Feudel, Xing Pei, Winfried Wojtenek, Frank Moss, Hans Braun, Mathias Dewald, and Martin Huber Global Bifurcations and Intermittency in a Hodgkin-Huxley Model of Thermally Sensitive Neurons Mike Neubig (University Laval) Alain Destexhe Are Inhibitory Synaptic Conductances on Thalamic Relay Neurons Inhomogeneous? Are Synapses from Individual Afferents Clustered?
Duane Nykamp (New York University) Daniel Tranchina A Population Density Approach that Facilitates Large-scale Modeling of Neural Networks: Extension to Slow Inhibitory Synapses Hiroshi Okamoto (Fuji Xerox Co.) Tomoki Fukai A Model for A Cortical Mechanism to Store Intervals of Time Tim C. Pearce (University of Leicester) Odour to Sensor Space Transformations in Artificial and Biological Noses John Pezaris (California Institute of Technology) Maneesh Sahani and Richard Andersen Dynamics in LIP Spike Train Coherence Hans E. Plesser (Max-Planck-Institut for Fluid Dynamics) Wulfram Gerstner Escape Rate Models for Noisy Integrate-and-Fire Neurons Gregor Rainer (Massachusetts Institute of Technology) Earl Miller Neural Ensemble States in the Prefrontal Cortex during Free Viewing Identified using Hidden Markov Model Pamela Reinagel (Harvard Medical School) R. Clay Reid Reproducibility of Firing Patterns in the Thalamus David V. Reynolds (University of Windsor) Christopher Aswin A Large-Scale Neuroanatomical Model of Attention Implemented as a Computer Simulation Barry Richmond (National Institute of Mental Health) Mike Oram and Matthew Wiener The Random Origin of Precise Timing within Single Spike Trains during Pattern Recognition Dan Rizzuto (Brandeis University) Michael Kahana An Autoassociative Neural Network Model of Paired-associate Learning Patrick Roberts (Neurological Sciences Institute) Electrosensory Response Mechanisms in Mormyrid Electric Fish Bas Rokers (Rutgers University) Catherine Myers A Dynamic Model of Learning in the Septo-Hippocampal System Ilya A. Rybak (Drexel University) John K. Chapin, Allon Guez, and Karen A. Moxon Competition and Cooperation between the Automatic and Higher Order Voluntary/Behavioral Neural Mechanisms in the Brain Control of Movements Maneesh Sahani (California Institute of Technology) John Pezaris and Richard Andersen Short Discrete Epochs of Chirped Oscillatory Spiking in LIP Neurons Maneesh Sahani (California Institute of Technology) Jennifer Linden Doubly Stochastic Poisson Models for Smoothing and Clustering of Spike Trains Ko Sakai (RIKEN) Shigeru Tanaka Perceptual Segmentation and Neural Grouping in Tilt Illusion Anders Sandberg (Royal Institute of Technology) Anders Lansner, Karl-Magnus Peterson, Martin Ingvar, and Örjan Ekeberg A Palimpsest Memory Based on an Incremental Bayesian Learning Rule Fidel Santamaria (California Institute of Technology) Dieter Jaeger, James Bower, and Erik De Schutter Dendritic Temporal Integration Properties of a Purkinje Cell are Modulated by Background Activity: A Modeling Study William Rodman Shankle (University of California at Irvine) Junko Hara, James H. Fallon, A. Kimball Romney, John P. Boyd, Robert S. Sneddon, and Benjamin H. Landing Insights into the Structuring of the Cytoarchitecture of the Developing Postnatal Human Cerebral Cortex from Term Birth to Age 72 Months: Relevance to Computational Models and Access to the Data Gregory Smith (New York University) Charles Cox, S. Murray Sherman, and John Rinzel Spike-frequency Adaptation in Sinusoidally-driven Thalamocortical Relay Neurons Paul Smolen (The University of Texas) Douglas A. Baxter and John H. Byrne Biochemical Constraints on Realistic Models of Circadian Rhythms Friedrich T. Sommer (University of Ulm) On Cell Assemblies in a Cortical Column Sen Song (Brandeis University) Larry Abbott Temporally Asymmetric Hebbian Learning and Neuronal Response Variability Cristina Soto-Trevino (Brandeis University) L.F.
Abbott and Eve Marder A Robust Network Based on Activity-dependent Regulation of Inhibitory Synaptic Conductances Bilin Zhang Stiber (University of California at Berkeley) Edwin R. Lewis, Michael Stiber, and Kenneth R. Henry Auditory Singularity Detection by a Gerbil Cochlea Model Katrin Suder (Ruhr-University Bochum) Florentin Woergoetter and Thomas Wennekers Neural Field Description of State-Dependent Visual Receptive Field Changes Mingui Sun (University of Pittsburgh) Robert J. Sclabassi A Novel Method to Salvage Clipped Multichannel Neurophysiological Recordings Krisztina Szalisznyo (The Hungarian Academy of Sciences) Peter Erdi Search for Resonators: Effects of Granule Cell Firing Properties on Temporal Patterns of the CA3 Pyramidal Cell David Tam (University of North Texas) A Spike Train Analysis for Detecting Spatio-temporal Integration in Neurons David Tam (University of North Texas) A Spike Train Analysis for Detecting Spatial Integration in Neurons Shoji Tanaka (Sophia University) Post-Cue Activity of Prefrontal Cortical Neurons Controlled by Local Inhibition Akaysha Tang (University of New Mexico) Barak Pearlmutter, Michael Zibulevsky, and Rebecca Loring Response Time Variability in the Human Sensory and Motor Systems Adam Taylor (University of California at San Diego) William B. Kristan, Jr. and Garrison W. Cottrell A Model of the Leech Segmental Swim Central Pattern Generator Peter Thomas (University of Chicago) Jack Cowan Spin Model for Orientation Map Via Reduction of Hebb's Rule Simon Thorpe (CERCO) Arnaud Delorme and Rufin van Rullen Real-time Simulation of Visual Processing with Millions of Spiking Neurons Paul H. E. Tiesinga (Salk Institute) Shuang Zhang and Jorge V. Jose Model of Carbachol-induced Gamma-frequency Oscillations in Hippocampus Wilson Truccolo-Filho (Florida Atlantic University) Mingzhou Ding and Steven L. Bressler Stability and Bifurcation Analysis of a Generic Cortical Area Model Philip Ulinski (University of Chicago) Zoran Nenadic and Bijoy Ghosh Spatiotemporal Dynamics in a Model of Turtle Visual Cortex Jean-Francois Vibert (Faculté de Médecine Saint-Antoine) Vincent Lagoueyte, Nicolas Bourrié, Gilles Fortin, and Jean Champagnat Modeling the Primitive Rhythmic Neural Network of the Chicken Embryo Gene Wallenstein (University of Utah) Michael Hasselmo Septal Modulation of Hippocampal Firing Fields and Flexible Spatial Learning Heng Wang (University of Kentucky) Ranu Jung Effects of Supraspinal-Spinal Loops on the Dynamic Evolution of Fictive Locomotion Stuart Washington (The Krasnow Institute) Giorgio Ascoli and Jeffrey Krichmar Statistical Analysis of Dendritic Morphology's Effect on CA3 Pyramidal Cell Electrophysiology Thomas Wennekers (MPI for Mathematics in the Sciences) Dynamics of Spatio-temporal Patterns in Associative Networks of Spiking Neurons Matthew Wiener (NIMH, NIH) Mike Oram and Barry Richmond Assessing Information Processing in V1 Neurons across Stimulus Sets Russell Witte (Arizona State University) Daryl Kipke Evidence of Competing Neural Assemblies in Background Activity of Neurons in Auditory Cortex of Awake Guinea Pig Xiangbao Wu (University of Virginia) Sean Polyn and William Levy Entorhinal/dentate Excitation of CA3: A Critical Variable in Hippocampal Models?
Xiangbao Wu (University of Virginia) Aaron Shon and William Levy Using Computational Simulations to Discover Optimal Training Paradigms
Jian Zhang (Brandeis University) Larry Abbott Gain Modulation of Recurrent Networks
Ying Zhou (Rhode Island College) Walter Gall An Organizing Center for Planar Neural Excitability

WEDNESDAY, July 21, 1999
9:30 General Information
9:45 Featured contributed talk: Sharon Crook (Montana State University) Gwen Jacobs, Kelli Hodge, and Cooper Roddey Dynamic Patterns of Activation in a Neural Map in the Cricket Cercal Sensory System
Contributed Talks
10:35 Richard Zemel (University of Arizona) Jonathan Pillow Encoding Multiple Orientations in a Recurrent Network
10:55 Dana Ballard (University of Rochester) Rajesh Rao A Single-spike Model of Predictive Coding
11:15 Invited Presentation: To be announced
11:50 Federal Funding Opportunities
12:30 Lunch Break
2:00 Workshops I - Organization
9:30 Rock and Roll Jam Session

THURSDAY, July 22, 1999
9:30 General Information
9:45 Featured Contributed Talk: Nathaniel D. Daw (Carnegie Mellon University) David S. Touretzky Behavioral Results Suggest an Average Reward TD Model of Dopamine Neurons
Contributed Talks
10:35 Christiane Linster (Boston University) Michael Hasselmo How Cholinergic Modulation Influences Generalization between Similar Odorants: Behavioral Predictions from A Computational Model
10:55 Elliot D. Menschik (University of Pennsylvania) Leif H. Finkel Cholinergic Neuromodulation of an Anatomically Reconstructed Hippocampal CA3 Pyramidal Cell
11:15 Invited Presentation: To be announced
12:25 Business Meeting
1:00 Lunch Break
2:30 Workshops II - Organization
6:30 Banquet - River Cruise

From jbower at bbb.caltech.edu Thu Jun 10 22:49:13 1999 From: jbower at bbb.caltech.edu (James M. Bower) Date: Thu, 10 Jun 1999 19:49:13 -0700 Subject: Travel Grants for CNS*99 Message-ID:

********* TRAVEL GRANTS ********* FOR THE UPCOMING EIGHTH ANNUAL COMPUTATIONAL NEUROSCIENCE MEETING (CNS*99) July 18 - 22, 1999 Pittsburgh, Pennsylvania

We have just learned that travel grants for students and postdoctoral fellows will be available for the upcoming CNS meeting. While priority for support will go to those who are listed as authors on CNS papers, we anticipate also being able to support some graduate students and postdoctoral fellows who want to attend the meeting without presenting a paper. Information on the meeting agenda, registration, etc. can be found at: http://cns.numedeon.com/cns99/

Please note that conference rates at the hotel and assured room availability require that you make reservations at the Pittsburgh Hilton and Towers BEFORE JUNE 17th. Please contact the hotel directly for reservations and make sure to mention that you are attending CNS*99. The Pittsburgh Hilton and Towers: (412) 391-4600

From zhaoping at gatsby.ucl.ac.uk Sun Jun 13 16:21:10 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Sun, 13 Jun 1999 21:21:10 +0100 Subject: Positions available in computational vision Message-ID: <199906132021.VAA05253@flies.gatsby.ucl.ac.uk>

Ph.D. student/postdoctoral positions in computational vision

The Gatsby Computational Neuroscience Unit at University College London seeks applicants for a position as a Ph.D. student or postdoctoral researcher in computational vision. Candidates should have a strong analytical background and a keen interest in neuroscience and/or psychophysics.
Applicants should send a detailed CV, a statement of study/research interests, and names and addresses of 3 references to zhaoping at gatsby.ucl.ac.uk (email applications preferred) or Zhaoping Li, 17 Queen Square, London, WC1N 3AR, UK. For more information on the Gatsby Unit see http://www.gatsby.ucl.ac.uk, and on Zhaoping Li's lab see http://www.gatsby.ucl.ac.uk/~zhaoping

From andre at physics.uottawa.ca Mon Jun 14 17:23:36 1999 From: andre at physics.uottawa.ca (Andre Longtin) Date: Mon, 14 Jun 99 17:23:36 EDT Subject: postdoctoral position Message-ID: <9906142123.AA16516@miro.physics.uottawa.ca.physics.uottawa.ca>

POSTDOCTORAL POSITION IN NEURONAL MODELING/NONLINEAR DYNAMICS

The Physics Department of the University of Ottawa has an immediate opening for a postdoctoral position in Neuronal Modeling and Nonlinear Dynamics. The research will focus mainly on the role of feedback in sensory systems, with an emphasis on electrosensory systems. The position is for one year, renewable for a second year. Applications will be accepted until the position is filled. Candidates must be within the first four years of receiving their doctoral degree. Candidates must have demonstrated excellence in research, and possess a strong background in neuronal modeling and nonlinear dynamics; those whose training is more slanted towards either of these areas will also be considered. Candidates wishing to carry out a blend of experimental work and modeling/computational studies are strongly encouraged to apply; experiments would be carried out in Prof. Len Maler's laboratory in the Faculty of Medicine at the University of Ottawa. The salary of $31,000 CAN per year conforms with current NSERC guidelines; additional support for moving and conference travel is also available. A second similar postdoctoral position will become available with a starting date around January 2001.

The National Capital Region of Canada, with a population around one million, is the home of the Federal Government, the National Research Council, and many other governmental research laboratories. Also known as Silicon Valley North, the region has a very high concentration of high-tech companies. Its National Arts Centre, Rideau Canal, museums, cafes, numerous cultural festivals, and its close proximity to hiking, swimming, canoeing and skiing areas make Ottawa a most enjoyable place to live.

Applicants should send a CV and a brief statement of research interests by regular mail or email (postdoc99 at miro.physics.uottawa.ca), and arrange for a minimum of two letters of recommendation to be sent to: Prof. Andre Longtin, Physics Department, University of Ottawa, 150 Louis Pasteur, Ottawa, Ont., Canada K1N 6N5. tel: 613-562-5800 ext. 6762, fax: 613-562-5190

From marks at maxwell.ee.washington.edu Wed Jun 16 01:20:45 1999 From: marks at maxwell.ee.washington.edu (Robert J. Marks II) Date: Tue, 15 Jun 1999 22:20:45 -0700 Subject: New Book: "Neural Smithing" (MIT Press - 1999) Message-ID: <3.0.1.32.19990615222045.006b57b4@maxwell.ee.washington.edu>

NEW BOOK: Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks, Russell D. Reed & Robert J. Marks II (MIT Press, 1999).

_____________________

REVIEW

"I have added a new book to the list of "The best elementary textbooks on practical use of NNs" in the NN FAQ ..." "Reed, R.D., and Marks, R.J, II (1999), Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks, Cambridge, MA: The MIT Press, ISBN 0-262-18190-8.
"After you have read Smith (1993) or Weiss and Kulikowski (1991), Reed and Marks provide an excellent source of practical details for training MLPs. They cover both backprop and conventional optimization algorithms. Their coverage of initialization methods, constructive networks, pruning, and regularization methods is unusually thorough. Unlike the vast majority of books on NNs, this one has lots of really informative graphs..." Warren S. Sarle, SAS Institute Inc. on . ______________________ Contents: Artificial neural networks are nonlinear mapping systems whose structure is loosely based on principles observed in the nervous systems of humans and animals. The basic idea is that massive systems of simple units linked together in appropriate ways can generate many complex and interesting behaviors. This book focuses on the subset of feedforward artificial neural networks called multilayer perceptions (MLP). These are the most widely used neural networks, with applications as diverse as finance (forecasting), manufacturing (process control), and science (speech and image recognition). This book presents an extensive and practical overview of almost every aspect of MLP methodology, progressing from an initial discussion of what MLPs are and how they might be used to an in-depth examination of technical factors affecting performance. The book can be used as a tool kit by readers interested in applying networks to specific problems, yet it also presents theory and references outlining the last ten years of MLP research. Table of Contents Preface 1 Introduction 1 2 Supervised Learning 7 3 Single-Layer Networks 15 4 MLP Representational Capabilities 31 5 Back-Propagation 49 6 Learning Rate and Momentum 71 7 Weight-Initialization Techniques 97 8 The Error Surface 113 9 Faster Variations of Back-Propagation 135 10 Classical Optimization Techniques 155 11 Genetic Algorithms and Neural Networks 185 12 Constructive Methods 197 13 Pruning Algorithms 219 14 Factors Influencing Generalization 239 15 Generalization Prediction and Assessment 257 16 Heuristics for Improving Generalization 265 17 Effects of Training with Noisy Inputs 277 A Linear Regression 293 B Principal Components Analysis 299 C Jitter Calculations 311 D Sigmoid-like Nonlinear Functions 315 References 319 Index 339 Ordering information: 1. MIT Press http://mitpress.mit.edu/book-home.tcl?isbn=0262181908 2. amazon.com http://www.amazon.com/exec/obidos/ASIN/0262181908/qid%3D909520837/sr%3D1-21/ 002-3321940-3881246 3. Barnes & Nobel http://shop.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=1KKG10OPZT& mscssid=A7M4XXV5DNS12MEG00CGNDBFPT573NJS&pcount=&isbn=0262181908 4. buy.com http://www.buy.com/books/product.asp?sku=30360116 From harnad at coglit.ecs.soton.ac.uk Wed Jun 16 17:36:27 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Wed, 16 Jun 1999 22:36:27 +0100 (BST) Subject: EEG and Neocortical Function: BBS Call for Commentators Message-ID: Below is the abstract of a forthcoming BBS target article: NEOCORTICAL DYNAMIC FUNCTION AND EEG by Paul L. Nunez This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. 
From harnad at coglit.ecs.soton.ac.uk Wed Jun 16 17:36:27 1999 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Wed, 16 Jun 1999 22:36:27 +0100 (BST) Subject: EEG and Neocortical Function: BBS Call for Commentators Message-ID:

Below is the abstract of a forthcoming BBS target article: NEOCORTICAL DYNAMIC FUNCTION AND EEG by Paul L. Nunez

This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please send EMAIL by May 14th to: bbs at cogsci.soton.ac.uk or write to [PLEASE NOTE SLIGHTLY CHANGED ADDRESS]: Behavioral and Brain Sciences, ECS: New Zepler Building, University of Southampton, Highfield, Southampton SO17 1BJ, UNITED KINGDOM

http://www.princeton.edu/~harnad/bbs/
http://www.cogsci.soton.ac.uk/bbs/
ftp://ftp.princeton.edu/pub/harnad/BBS/
ftp://ftp.cogsci.soton.ac.uk/pub/bbs/
gopher://gopher.princeton.edu:70/11/.libraries/.pujournals

If you are not a BBS Associate, please send your CV and the name of a BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work. All past BBS authors, referees and commentators are eligible to become BBS Associates. To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. An electronic draft of the full text is available for inspection with a WWW browser, anonymous ftp or gopher according to the instructions that follow after the abstract.

_____________________________________________________________

TOWARD A QUANTITATIVE DESCRIPTION OF LARGE SCALE NEOCORTICAL DYNAMIC FUNCTION AND EEG. Paul L. Nunez

Permanent Address: Brain Physics Group, Dept. of Biomedical Engineering, Tulane University, New Orleans, Louisiana 70118. pnunez at mailhost.tcs.tulane.edu
Temporary Address (6/98 - 6/00): Brain Sciences Institute, Swinburne University of Technology, 400 Burwood Road, Melbourne, Victoria 3122, Australia. pnunez at mind.scan.swin.edu.au

ABSTRACT: A conceptual framework for large-scale neocortical dynamic behavior is proposed. It is sufficiently general to embrace brain theories applied to different experimental designs, spatial scales and brain states. This framework, based on the work of many scientists, is constructed from anatomical, physiological and EEG data. Neocortical dynamics and correlated behavioral/cognitive brain states are viewed in the context of partly distinct, but interacting local (regionally specific) processes and globally coherent dynamics. Local and regional processes (e.g., neural networks) are enabled by functional segregation; global processes are facilitated by functional integration. Global processes can also facilitate synchronous activity in remote cell groups (top down) which function simultaneously at several different spatial scales. At the same time, local processes may help drive (bottom up) macroscopic global dynamics observed with EEG (or MEG). A specific, physiologically based local/global dynamic theory is outlined in the context of this general conceptual framework. It is consistent with a body of EEG data and fits naturally within the proposed conceptual framework. The theory is incomplete since its physiological control parameters are known only approximately. Thus, brain state-dependent contributions of local versus global dynamics cannot be predicted. It is also neutral on properties of neural networks, assumed to be embedded within macroscopic fields. Nevertheless, the purely global part of the theory makes qualitative, and in a few cases, semi-quantitative predictions of the outcomes of several disparate EEG studies in which global contributions to the dynamics appear substantial.
Experimental data are used to obtain a variety of measures of traveling and standing wave phenomena, predicted by the pure global theory. The more general local/global theory is also proposed as a "meta-theory," a suggestion of what large-scale quantitative theories of neocortical dynamics may be like when more accurate treatment of local and non-linear effects is achieved. In the proposed local/global theory, the dynamics of excitatory and inhibitory synaptic action fields are described. EEG and MEG are believed to provide large-scale estimates of modulation of these synaptic fields about background levels. Brain state is determined by neuromodulatory control parameters. Some states are dominated by local cell groups, in which EEG frequencies are due to local feedback gains and rise and decay times of post-synaptic potentials. Local frequencies vary with brain location. Other states are strongly global, with multiple, closely spaced EEG frequencies, but identical at each cortical location. Coherence at these frequencies is high over large distances. The global mode frequencies are due to a combination of delays in cortico-cortical axons and neocortical boundary conditions. Many states involve dynamic interactions between local networks and the global system, in which case observed EEG frequencies may involve "matching" of local resonant frequencies with one or more of the global frequencies. KEYWORDS: EEG, neocortical dynamics, standing waves, functional integration, spatial scale, binding problem, synchronization, coherence, cell assemblies, limit cycles, pacemakers ____________________________________________________________ To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the World Wide Web or by anonymous ftp from the US or UK BBS Archive. Ftp instructions follow below. Please do not prepare a commentary on this draft. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. The URLs you can use to get to the BBS Archive: http://www.princeton.edu/~harnad/bbs/ http://www.cogsci.soton.ac.uk/bbs/Archive/bbs.nunez.html ftp://ftp.princeton.edu/pub/harnad/BBS/bbs.nunez ftp://ftp.cogsci.soton.ac.uk/pub/bbs/Archive/bbs.nunez *** FIVE IMPORTANT ANNOUNCEMENTS *** ------------------------------------------------------------------ (1) There have been some very important developments in the area of Web archiving of scientific papers very recently. Please see: Science: http://www.cogsci.soton.ac.uk/~harnad/science.html Nature: http://www.cogsci.soton.ac.uk/~harnad/nature.html American Scientist: http://www.cogsci.soton.ac.uk/~harnad/amlet.html Chronicle of Higher Education: http://www.chronicle.com/free/v45/i04/04a02901.htm --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to archive all their papers (on their Home-Servers as well as) on CogPrints: http://cogprints.soton.ac.uk/ It is extremely simple to do so and will make all of our papers available to all of us everywhere at no cost to anyone. --------------------------------------------------------------------- (3) BBS has a new policy of accepting submissions electronically. Authors can specify whether they would like their submissions archived publicly during refereeing in the BBS under-refereeing Archive, or in a referees-only, non-public archive. 
Upon acceptance, preprints of final drafts are moved to the public BBS Archive: ftp://ftp.princeton.edu/pub/harnad/BBS/.WWW/index.html http://www.cogsci.soton.ac.uk/bbs/Archive/

--------------------------------------------------------------------

(4) BBS has expanded its annual page quota and is now appearing bimonthly, so the service of Open Peer Commentary can now be offered to more target articles. The BBS refereeing procedure is also going to be considerably faster with the new electronic submission and processing procedures. Authors are invited to submit papers to: Email: bbs at cogsci.soton.ac.uk Web: http://cogprints.soton.ac.uk http://bbs.cogsci.soton.ac.uk/

INSTRUCTIONS FOR AUTHORS: http://www.princeton.edu/~harnad/bbs/instructions.for.authors.html http://www.cogsci.soton.ac.uk/bbs/instructions.for.authors.html

---------------------------------------------------------------------

(5) Call for Book Nominations for BBS Multiple Book Review

In the past, the Behavioral and Brain Sciences (BBS) journal had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!).

From Gerhard.Paass at gmd.de Wed Jun 16 08:31:54 1999 From: Gerhard.Paass at gmd.de (Gerhard Paass) Date: Wed, 16 Jun 1999 14:31:54 +0200 Subject: CFP: Workshop Neural Networks and Connectionism, Magdeburg Message-ID: <3767993A.DF7290F7@gmd.de>

[ We apologise if you should receive this message more than once ]

ANNOUNCEMENT AND CALL FOR PAPERS
Workshop on Neural Networks and Connectionism
Magdeburg, Germany, 29.Sep.1999

The meeting of the working group 1.1.2 "Connectionism" of the German Society of Computer Science (GI) takes place in Magdeburg during the GI-Workshop-Days Learning, Knowledge Discovery, and Adaptivity. It is devoted to the discussion of new trends and ongoing research projects in the areas of connectionism and neural networks.

TOPICS

We want to discuss papers covering empirical, theoretical or interdisciplinary topics from connectionism and neural networks, especially
* Theory: Prediction and generalization, regularization, computational learning theory, support vector machines, approximation and estimation, learning in dynamical systems.
* Algorithms and Architectures: supervised and unsupervised learning, model selection, feedforward and recurrent architectures, hybrid symbolic-subsymbolic approaches.
* Knowledge Discovery and Adaptivity: active learning, reinforcement learning, Markovian state estimation, novelty detection, information content, time-varying systems.
* Applications: medical diagnosis, data mining, expert systems, financial predictions, time series analysis, information retrieval, etc.
Deadline for contributions: July 31, 1999. For details see http://ais.gmd.de/~paass/nn99

From M.Usher at ukc.ac.uk Thu Jun 17 07:25:05 1999 From: M.Usher at ukc.ac.uk (M.Usher) Date: Thu, 17 Jun 1999 12:25:05 +0100 Subject: 1 year research position Message-ID:

A research associate is required to assist in the running of a BBSRC funded project titled 'The role of VISUAL SYNCHRONY in perceptual organisation: studies in human psychophysics and computational methods.' The research programme follows up studies described in a recent paper (Usher & Donnelly, 1998, Nature, pp. 179-182) investigating spatio-temporal interactions in visual grouping as a means for exploring mechanisms for GROUPING based on neural synchrony. The award was made to Dr Nick Donnelly (University of Southampton) and to Dr Marius Usher (Birkbeck College, University of London). The project will run for twelve months in the first instance (and will be performed either in London or in Southampton). It is intended that further funds will be sought during this first year to continue the research.

The ideal candidate will have a background in visual psychophysics and should have experience programming in C on Unix workstations (experience with Silicon Graphics graphical libraries and/or OpenGL will be helpful). The salary will be based on the research associate scale and will be up to BP 16,927 (i.e., approx. $27,920) per annum. For further information contact either Nick Donnelly (email: n.donnelly at ukc.ac.uk) or Marius Usher (email: m.usher at ukc.ac.uk http://www.ukc.ac.uk/psychology/people/usherm/ )

Deadline for applications: July 1st 1999.

Marius Usher, Lecturer in Psychology and Cognitive Neuroscience

From moeller at ifi.unizh.ch Fri Jun 18 11:11:17 1999 From: moeller at ifi.unizh.ch (Ralf Moeller) Date: Fri, 18 Jun 1999 17:11:17 +0200 Subject: Ph.D. student position synthetic modeling/biorobotics Message-ID: <376A6194.4750EE96@ifi.unizh.ch>

------------------------------------------
Position for a Ph.D. student in *Synthetic modeling / Biorobotics* at the AILab, University of Zurich
------------------------------------------

A new Ph.D. student position is open at the Artificial Intelligence Laboratory, Dept. of Computer Science of the University of Zurich. Availability: Immediately or at earliest convenience. Research on this position will focus on "Synthetic modeling" or "Biorobotics". "Synthetic modeling" is a novel biological methodology to gain insights into the mechanisms underlying the behavior of biological agents. Models developed to explain the animal's abilities are implemented on an artificial agent and validated by observing the behavior and the internal states of the robot. At the same time, the method will lead to new solutions for robotics applications. The specific goal of the project is to improve our understanding of the impressive visual navigation abilities of insects and to apply this knowledge to enable robots to safely navigate in complex environments. This will require scaling-up of previous theoretical and practical work.

If the above challenges capture your interest, and you would like to become a member of an international research team conducting interdisciplinary work, submit a curriculum vitae, statement of research interests, and names of three referees to:
Corinne Maurer, Dept. of Computer Science, University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland. E-mail: maurer at ifi.unizh.ch, Phone: 41-1-63 54331, Fax: 41-1-63 56809

For details on the research subject, contact: Ralf Moeller, Email: moeller at ifi.unizh.ch

Profile: Applicants should have an M.Sc., a university Diploma, or a similar degree, in one of the following areas: computer science, electrical or mechanical engineering, biology, neurobiology, physics, mathematics, or related disciplines. He/she should have good programming skills (C, C++, etc.) and preferably experience with robot control, image processing, and electronics.

Tasks: The main task for the accepted candidate will be to conduct research towards his/her Ph.D. Additional tasks include support for classes organized by the AI-Lab as well as other administrative tasks required by the computer science department.

Salary: The salary will be according to the specification of the Swiss National Science Foundation.

Time prospect: The candidate is expected to complete his/her Ph.D. work within a maximum of 4 years.

From l.s.smith at cs.stir.ac.uk Fri Jun 18 08:34:16 1999 From: l.s.smith at cs.stir.ac.uk (Dr L S Smith) Date: Fri, 18 Jun 99 13:34:16 +0100 Subject: 2nd European Workshop on Neuromorphic Systems: Call for participation Message-ID: <199906181234.NAA09196@tinker.cs.stir.ac.uk>

(Apologies if you receive this more than once)

Call for Participation: 2nd European Workshop on Neuromorphic Systems (EWNS2), 3-5 September 1999, Cottrell Building, University of Stirling, Stirling, Scotland

Neuromorphic systems are implementations in silicon of sensory and neural systems whose architecture and design are based on neurobiology. The area is at the intersection of many disciplines: neurophysiology, computer science and electrical engineering. Registration Forms and Further Information are available from the WWW page http://www.cs.stir.ac.uk/EWNS2

_____________________

Provisional Programme

Friday September 3
0900-1030: Registration and Coffee
1030-1115: Pedro Marijuan, Dept. Ingen. Electronica y Comunicaciones, Universidad de Zaragoza, Spain: From Darwin to Cajal: The Quest for a Neurodynamic Optimization Principle

Session 1: General Papers
1115-1145: Barbara Webb, University of Stirling: A Framework for Models of Biological Behaviour
1145-1215: Catherine Breslin and Leslie Smith, University of Stirling: Silicon Cellular Morphology
1215-1245: J Love, Florida Institute of Technology, Melbourne, Florida USA and K M Johnson, National Research Council, NASA Kennedy Space Center, Florida USA: Towards Evolvable Neuromorphic Systems: Adaptive Ontogenetic Engineering of Artificial Sensorineural Vestibular Organs
1245-1400: Lunch

Session 2: Auditory I
1400-1445: Simon Jones, Dept. of Electrical Engineering, University of Loughborough, England: (title to be announced)
1445-1515: Mete Erturk, C P Brown, D J Klein and S A Shamma, Institute for Systems Research and Dept. of Electrical Engineering, University of Maryland, USA: A Neuromorphic Approach to the Analysis of Monaural and Binaural Auditory Signals
1515-1545: Amir Hussain and Douglas R Campbell, Dept of Applied Computing, University of Dundee, Scotland: Speech Intelligibility - Improvements Using a Binaural Adaptive-Scheme Based Conceptually on the Human Auditory System
1545-1610: Tea

Session 3: Vision I
1610-1640: Tobi Delbruck, Institute for Neuroinformatics, Zurich, Switzerland: Three Silicon Retinas for Simple Consumer Applications
1640-1710: Seiji Kameda, Akira Honda, Tetsuya Yagi, Faculty of Computer Science and Systems Engineering, Kyushu Institute of Technology, Fukuoka, Japan: Real Time Image Processing with an Analog Vision Chip System
1930: Wine and Cheese Reception in the Atrium

_________________________________________________________

Saturday September 4

Session 4: Auditory II
0900-0945: Andre van Schaik, Craig Jin and Simon Carlile, University of Sydney, Australia: Human Localisation of Band-Pass Filtered Noise
0945-1015: Amir Hussain, Dept of Applied Computing, University of Dundee, Scotland: Binaural Neural-Network Based Sub-Band Processing of Noisy Speech Signals
1015-1045: Sofia Cavaco, Universidade Nova de Lisboa and John Hallam, University of Edinburgh: A Biologically Plausible Acoustic Motion Detection System for a Robotic Cat
1045-1115: Coffee

Session 5: Vision II
1115-1145: E Ros, F J Pelayo, D Palomar, I Rojas, J L Bernier and A Prieto, Dept of Architecture and Technology of Computers, University of Granada, Spain: Stimulus Correlation and Adaptive Local Motion Detection
1145-1215: Reid R Harrison and Christof Koch, Computation and Neural Systems Program, California Institute of Technology, USA: An Analog VLSI Implementation of a Visual Interneuron: Enhanced Sensory Processing through Biophysical Modelling
1215-1245: R Timothy Edwards, Johns Hopkins University, Applied Physics Laboratory, Maryland, USA: Acoustic Transient Classification with a Template Correlation Processor
1245-1400: Lunch
1400-1445: Avis Cohen, Dept of Biology and Neuroscience and Cognitive Science, University of Maryland, USA: (title to be announced)

Session 6: Robotics
1445-1515: Timothy Chapman and Barbara Webb, University of Stirling: A Neuromorphic Hair Sensor Model of Wind-Mediated Escape in the Cricket
1515-1545: R Mudra, R Hahnloser and R J Douglas, Institute for Neuroinformatics, Zurich, Switzerland: Integrating Neuromorphic Action-Oriented Perceptual Inputs to Generate a Navigation Behaviour for a Robot
1545-1615: Coffee
1615-1645: Mark Blanchard, P F M J Verschure, Institute of Neuroinformatics, Zurich, Switzerland and F C Rind, Dept of Neurobiology, University of Newcastle upon Tyne, England: Using Mobile Robots to Study Locust Collision Avoidance Responses
1645-1715: Ralf Moller, Dept of Computer Science and Dept of Zoology, University of Zurich, Switzerland: Visual Homing in Analog Hardware
1915: Conference Dinner

_______________________________________________

Sunday September 5

Session 7: Hardware
1000-1045: Shih-Chii Liu, Institute of Neuroinformatics, Zurich, Switzerland: (title to be announced)
1045-1115: Peter Paschke and Carsten Schauer, Technical University of Ilmenau, Germany: A Spike-Based Model of Binaural Sound Localization
1115-1145: Cyprian Grassmann and Joachim K Anlauf, University of Bonn, Germany: Fast Digital Simulation of Spiking Neural Networks and Neuromorphic Integration with SPIKELAB
1145-1215: B E Eriksson, L S Smith, University of Stirling, M Glover, DERA, A Hamilton, University of Edinburgh, Scotland: SPIKE II: An Integrate-and-Fire aVLSI Chip
1215-1315: Lunch
1315-1345: Best Paper Prize
1345-1500: Panel Discussion: Neuromorphic Systems - The Ways Forward; Presentation on EPSRC Silicon and Neurobiology Network (L.S. Smith)
1500: Tea and Close of Conference

We gratefully acknowledge the assistance of the Gatsby Charitable Foundation.

From j.v.stone at sheffield.ac.uk Sat Jun 19 10:19:30 1999 From: j.v.stone at sheffield.ac.uk (Jim Stone) Date: Sat, 19 Jun 1999 15:19:30 +0100 Subject: Research Assistant Message-ID:

-----------------------------------------------------------------------
Research Assistant in Functional Decomposition and Analysis of Brain Activity
-----------------------------------------------------------------------

DEPARTMENT OF PSYCHOLOGY, UNIVERSITY OF SHEFFIELD, UK.

Applications are invited for a research assistant, to commence before October 1st 1999 (the contract will end on May 31st 2002). The successful candidate should have a solid theoretical background from a numerate discipline (e.g. mathematics, control engineering). We would also consider candidates from other backgrounds with experience of image analysis/signal processing. Post-doctoral candidates are preferred, but graduates with appropriate skills will also be considered.

The research assistant will work with Dr John Porrill and Dr Jim Stone, developing and applying novel techniques for analysis of brain activity using data obtained from fMRI, optical imaging and EEG. The candidate will also have the opportunity to design and run fMRI/EEG experiments. This research is funded as one component of a co-operative MRC award, and will involve collaboration with other groups working on fMRI and optical imaging techniques. The salary is up to 21,815 pounds (RA1A scale, point 11) according to experience.

This interdisciplinary project is shared between the Departments of Psychology and Clinical Neurology. The project is based in the Department of Psychology, which is one of the leading research centres for psychology in the UK, and has a rapidly expanding research base in imaging technologies. This is reflected in the top ratings (currently Grade 5A) achieved by the department in all four national research assessment exercises.

Applicants should send a CV and a brief statement of research interests by regular mail or email to: Dr JV Stone, Psychology Department, Sheffield University, Sheffield, S10 2TP, UK. Informal enquiries can be made via email to: {j.porrill}{j.v.stone}@Sheffield.ac.uk Recent papers relevant to this project can be obtained from: http://www.shef.ac.uk/~pc1jvs/

-----------------------------------------------------------------------

From ESANN at dice.ucl.ac.be Mon Jun 21 03:49:35 1999 From: ESANN at dice.ucl.ac.be (ESANN) Date: Mon, 21 Jun 1999 09:49:35 +0200 Subject: ESANN'2000: European Symposium on Artificial Neural Networks Message-ID: <000601bebbba$9b5cbd50$5aed6882@natacha.dice.ucl.ac.be>

*****************************************************
Call for special sessions
European Symposium on Artificial Neural Networks
Bruges (Belgium), April 26-27-28, 2000
*****************************************************

(We apologize for duplicates of this email)

The preliminary announcement for the ESANN'2000 conference is available from the WWW page http://www.dice.ucl.ac.be/esann The call for papers will be published soon on this page. We are now waiting for proposals/suggestions to organize special sessions during the conference.
A description of special sessions is available from the above link. For those who have never attended ESANN, the programmes (and lists of committee members) of the former conferences are also available from the above link.

DEADLINE for proposals to organize special sessions: July 15, 1999.

=====================================================
ESANN - European Symposium on Artificial Neural Networks
http://www.dice.ucl.ac.be/esann
* For submissions of papers, reviews,...: Michel Verleysen, Univ. Cath. de Louvain - Microelectronics Laboratory, 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium, tel: +32 10 47 25 51 - fax: + 32 10 47 25 98, mailto:esann at dice.ucl.ac.be
* Conference secretariat: D facto conference services, 27 rue du Laekenveld - B-1080 Brussels - Belgium, tel: + 32 2 420 37 57 - fax: + 32 2 420 02 55, mailto:esann at dice.ucl.ac.be
=====================================================

From lorincz at iserv.iki.kfki.hu Tue Jun 22 12:58:42 1999 From: lorincz at iserv.iki.kfki.hu (Andras Lorincz) Date: Tue, 22 Jun 1999 18:58:42 +0200 (MET) Subject: Post-doctoral position in electrophysiology Message-ID:

A post-doctoral position is available for 3 years from the 1st of January 2000 in the field of visual electrophysiology. The project is on the coding of various depth (3D) cues by inferior temporal neurons. The work involves behavioral training of rhesus monkeys and recording of single cortical neurons in the awake monkey. For more information and applications contact: Dr. Rufin Vogels, email: Rufin.Vogels at med.kuleuven.ac.be, http://www.kuleuven.ac.be/facdep/medicine/dep_neu/neufys/ Lab. Neuro- en Psychofysiologie, Fac. Geneeskunde, Onderwijs en Navorsing, Campus Gasthuisberg, B-3000 Leuven, Belgium. Tel: +32 16 345857

From dummy at ultra3.ing.unisi.it Wed Jun 23 07:20:45 1999 From: dummy at ultra3.ing.unisi.it (Paolo Frasconi) Date: Wed, 23 Jun 1999 13:20:45 +0200 (MET DST) Subject: REMINDER: Special Issue on Learning in Structured Domains Message-ID:

REMINDER: Submission deadline, IEEE TKDE Special Issue
Electronic abstracts due: July 15, 1999
Submissions due: July 30, 1999

Special issue on Connectionist Models for Learning in Structured Domains, IEEE Transactions on Knowledge and Data Engineering

BACKGROUND

Structured representations are ubiquitous in different fields such as knowledge representation, language modeling, and pattern recognition. Although many of the most successful connectionist models are designed for "flat" (vector-based) or sequential representations, recursive or nested representations should be preferred in several situations. One obvious setting is concept learning when objects in the instance space are graphs or can be conveniently represented as graphs. Terms in first-order logic, blocks in document processing, patterns in structural and syntactic pattern recognition, chemical compounds, proteins in molecular biology, and even world wide web sites, are all entities which are best represented as graphical structures, and they cannot be easily dealt with by vector-based architectures. In other cases (e.g., language processing) the process underlying the data has a (hidden) recursive nature but only a flat representation is left as an observation. Still, the architecture should be able to deal with recursive representations in order to model correctly the mechanism that generated the observations. The interest in developing connectionist architectures capable of dealing with these rich representations can be traced back to the end of the 80's.
Early approaches include Touretzky's BoltzCONS, Pollack's RAAM model, and Hinton's recursive distributed representations. More recent techniques include labeled RAAMs, holographic reduced representations, and recursive neural networks (a toy sketch of the shared-weights idea behind recursive networks follows at the end of this call). Today, after more than ten years since the explosion of interest in connectionism, research in architectures and algorithms for learning structured representations still has a lot to explore and no definitive answers have emerged. It seems that the major difficulty with connectionist models is not just representing symbols, but rather devising proper ways of learning when examples are data structures, i.e. labeled graphs that can be used for describing relationships among symbols (or, more generally, combinations of symbols and continuously-valued attributes).

TOPICS

The aim of this special issue is to solicit and publish valuable papers that bring a clear picture of the state of the art in this area. We encourage submissions of papers addressing, in addition to other relevant issues, the following topics:
* Algorithms and architectures for classification of data structures.
* Unsupervised learning in structured domains.
* Belief networks for learning structured patterns.
* Compositional distributed representations.
* Recursive autoassociative memories.
* Learning structured rules and structured rule refinement.
* Connectionist learning of syntactic parsing from text corpora.
* Stochastic grammars and their relationships to neural and belief networks.
* Links between connectionism and syntactic and structural pattern recognition.
* Analogical reasoning.
* Applications, including:
- Medical and technical diagnosis: discovery and manipulation of structured dependencies, constraints, explanations.
- Molecular biology and chemistry: prediction of molecular structure folding, classification of chemical structures.
- Automated reasoning: robust matching, manipulation of logical terms, proof plans, search space reduction.
- Software engineering: quality testing, modularization of software.
- Geometrical and spatial reasoning: robotics, structured representation of objects in space, figure animation, layouting of objects.

INSTRUCTIONS

We encourage e-mail submissions (Postscript, RTF, and PDF are the only acceptable formats). For hard copy submission please send 6 copies of the manuscript to Prof. Marco Gori. Manuscripts should not exceed 30 pages double spaced (excluding Figures and Tables). The title and the abstract should be sent separately in ASCII format, even before the final submission, so that reviewers can be contacted timely.

IMPORTANT DATES
Submission of title and abstract (e-mail): July 15, 1999
Submission deadline: July 30, 1999
Notification of acceptance: December 31, 1999
Expected publication date: Mid-to-late 2000.

GUEST EDITORS
Prof. Paolo Frasconi, DIEE, University of Cagliari, Piazza d'Armi, 09123 Cagliari (ITALY), Phone: +39 070 675 5849, E-mail: paolo at diee.unica.it
Prof. Marco Gori, DII, University of Siena, Via Roma 56, 53100 Siena (ITALY), Phone: +39 0577 263 610, E-mail: marco at ing.unisi.it
Prof. Alessandro Sperduti, DI, University of Pisa, Corso Italia 40, 56125 Pisa (ITALY), Phone: +39 050 887 213, E-mail: perso at di.unipi.it
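As promised above, here is a toy sketch of the core idea behind recursive neural networks: one shared set of weights is applied bottom-up at every node of a structure, so trees of any shape map to fixed-size vectors. All dimensions, weights and example trees are invented for illustration; this is not an implementation of any of the cited models (BoltzCONS, RAAM, etc.).

import numpy as np

rng = np.random.default_rng(0)
DIM = 8                                     # size of every node representation
W = rng.normal(0., 0.1, (DIM, 2 * DIM))     # one weight matrix shared by all nodes
b = np.zeros(DIM)
labels = {s: rng.normal(0., 1., DIM) for s in "abc"}   # leaf label vectors

def encode(tree):
    # A tree is either a leaf symbol or a (left, right) pair of subtrees.
    if isinstance(tree, str):
        return labels[tree]
    left, right = tree
    children = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ children + b)        # same weights at every node

# Structures of any shape are mapped to fixed-size vectors in R^DIM;
# differently shaped trees over the same symbols get distinct codes.
print(encode(("a", ("b", "c"))))
print(encode((("a", "b"), "c")))

In a learning system, W, b and possibly the label vectors would be trained by back-propagating errors through the structure of each example.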
From adr at nsma.arizona.edu Wed Jun 23 16:34:39 1999 From: adr at nsma.arizona.edu (David Redish) Date: Wed, 23 Jun 1999 13:34:39 -0700 Subject: Book announcement Message-ID: <199906232034.NAA13211@raphe.nsma.arizona.edu>

The following book might be of interest to the people on this list.

adr
-----------------------------------------------------
A. David Redish adr at nsma.arizona.edu
Post-doc http://www.cs.cmu.edu/~dredish
Neural Systems, Memory and Aging, Univ of AZ, Tucson AZ
-----------------------------------------------------

BEYOND THE COGNITIVE MAP: From Place Cells to Episodic Memory
A. David Redish
Now available from MIT Press.

[From the inside cover description]

There are currently two major theories about the role of the hippocampus, a distinctive structure in the back of the temporal lobe. One says that it stores a cognitive map, the other that it is a key locus for the temporary storage of episodic memories. A. David Redish takes the approach that understanding the role of the hippocampus in space will make it possible to address its role in less easily quantifiable areas such as memory. Basing his investigation on the study of rodent navigation--one of the primary domains for understanding information processing in the brain--he places the hippocampus in its anatomical context as part of a greater functional system. Redish draws on the extensive experimental and theoretical work of the last 100 years to paint a coherent picture of rodent navigation. His presentation encompasses multiple levels of analysis, from single-unit recording results to behavioral tasks to computational modeling. From this foundation, he proposes a novel understanding of the role of the hippocampus in rodents that can shed light on the role of the hippocampus in primates, explaining data from primate studies and human neurology. The book will be of interest not only to neuroscientists and psychologists, but also to researchers in computer science, robotics, artificial intelligence, and artificial life.

[Table of contents]
1 The Hippocampus Debate
2 Navigation Overview
3 Local View
4 Route Navigation: Taxon and Praxic Strategies
5 Head Direction
6 Path Integration
7 Goal Memory
8 Place Code
9 Self-Localization
10 Multiple Maps
11 Route Replay
12 Consolidation
13 Questions of Hippocampal Function
14 The Primate Hippocampus
15 Coda
A Attractor Networks
B Selective Experimental Review
C Open Questions

From alex at nervana.montana.edu Thu Jun 24 19:11:15 1999 From: alex at nervana.montana.edu (Alexander Dimitrov) Date: Thu, 24 Jun 1999 17:11:15 -0600 Subject: postdoctoral positions available Message-ID: <3772BB13.76AF5700@nervana.montana.edu>

Two postdoctoral positions are available immediately in the Center for Computational Biology at Montana State University, Bozeman, in the laboratories of John Miller and Gwen Jacobs. Both positions are to study information processing in the cercal sensory system of the cricket. Applicants for both positions should have experience in electrophysiology.

The focus of the project in Miller's lab is to understand encoding of dynamic sensory stimuli by ensembles of nerve cells. The primary experimental approach will be multi-unit extracellular recording. Candidates should have background in the application of information theory to analysis of neural systems. The focus of the project in Dr. Jacobs' lab is to understand the physiological mechanisms underlying neural encoding in this system. Experimental approaches will include intracellular and optical recording, and advanced morphometric analysis of identified nerve cells. For both projects, analysis of the data will involve the development of compartmental and analytical models of identified nerve cells and networks.
Advanced computational, microscopy and physiological recording facilities are available within the Center, including a 32-processor SGI Origin2000 computer and a Leica TSP confocal microscope. More information about the Center for Computational Biology can be found at: http://www.nervana.montana.edu Interested parties should contact John Miller or Gwen Jacobs: jpm at nervana.montana.edu and gwen at nervana.montana.edu

--
Alexander Dimitrov
Center for Computational Biology, Montana State University, Bozeman, MT 59717-3505
phone: (406)994-6494, fax: (406)994-5122, email: alex at nervana.montana.edu

From dwang at wjh.harvard.edu Fri Jun 25 09:47:20 1999 From: dwang at wjh.harvard.edu (DeLiang Wang) Date: Fri, 25 Jun 1999 09:47:20 -0400 Subject: Tech report on speech segregation Message-ID: <37738859.E77A8BAF@wjh.harvard.edu>

The following technical report is available via FTP/WWW:

********************************************************************
"A Comparison of Auditory and Blind Separation Techniques for Speech Segregation"
Technical Report #15, June 1999
Department of Computer and Information Science, The Ohio State University
********************************************************************

Andre J. W. van der Kouwe, The Ohio State University
DeLiang L. Wang, The Ohio State University
Guy J. Brown, The University of Sheffield

A fundamental problem in auditory and speech processing is the segregation of speech from concurrent sounds. This problem has been a focus of study in computational auditory scene analysis (CASA), and it has also been recently investigated from the perspective of blind source separation. Using a standard corpus of voiced speech mixed with interfering sounds, we report a comparison between CASA and blind source separation techniques, which have been developed independently. Our comparison reveals that they perform well under very different conditions. A number of conclusions are drawn with respect to their relative strengths and weaknesses in speech segregation applications as well as in modeling auditory function. (10 pages, 92 KB compressed)

for anonymous ftp: FTP-HOST: ftp.cis.ohio-state.edu Directory: /pub/tech-report/1999/ Filename: TR15.ps.gz
for WWW: http://www.cis.ohio-state.edu/~dwang/reports.html

--
My contact information through September, 1999 is:
Dr. DeLiang Wang, Visual Sciences Lab., William James Hall, Harvard University, 33 Kirkland Street, Cambridge, MA 02138
Email: dwang at wjh.harvard.edu (my OSU email address is good too)
Phone: 617-496-3367 (OFFICE); 617-495-3884 (LAB)
Fax: 617-495-3764
URL: http://www.cis.ohio-state.edu/~dwang
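For readers unfamiliar with the blind source separation side of the comparison above, the following sketch separates a generic instantaneous mixture of synthetic signals. It is not the authors' code or corpus: the signals, the mixing matrix and the use of scikit-learn's FastICA are illustrative assumptions, and real speech-plus-interference recordings are generally harder than this idealized setting.

import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0., 1., 4000)
s1 = np.sin(2 * np.pi * 13 * t)                  # stand-in for a voiced source
s2 = np.sign(np.sin(2 * np.pi * 7 * t + 1.0))    # stand-in for an interferer
S = np.c_[s1, s2]

A = np.array([[1.0, 0.6],                        # unknown mixing matrix:
              [0.4, 1.0]])                       # two "microphones"
X = S @ A.T

# Recover the sources, up to permutation and scaling, from the mixtures alone.
S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)

# Check separation: each component should match one true source closely.
for i in range(2):
    c = max(abs(np.corrcoef(S_hat[:, i], S[:, j])[0, 1]) for j in range(2))
    print("component %d: best |corr| with a true source = %.3f" % (i, c))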
From austin at minster.cs.york.ac.uk Mon Jun 28 05:18:45 1999 From: austin at minster.cs.york.ac.uk (Jim Austin) Date: Mon, 28 Jun 1999 10:18:45 +0100 Subject: Lectureship in Neural Networks. Message-ID: <9906281018.ZM661@minster.cs.york.ac.uk>

LECTURESHIP IN NEURAL NETWORKS
University of York, UK. Closing date 12 July 1999.

Applications are invited for a Lectureship, available immediately, in any aspect of neural networks, including such areas as hardware implementation, parallel processing, cognitive aspects, modelling, and theory. You will join the Advanced Computer Architectures Group within the Department of Computer Science, one of the UK's leading groups working in neural networks and a Department with the highest rating in research and teaching. The Group undertakes research in all aspects of the theory, implementation and application of neural networks, with over 10 researchers active in this area. The group attracts funds from many government sources including EPSRC and the EU, as well as working with a number of industries including British Aerospace, The Post Office, Porta Systems and Glaxo-Wellcome. The group is also well known for its work in the hardware implementation of neural networks, with extensive support for this aspect of our research. The group's facilities include an SGI Origin 2000 32-node supercomputer, high performance workstations and a very supportive research culture. An associated post is also available in Computer Vision (see web below).

Full information on the group's activities can be found on http://www.cs.york.ac.uk/arch/neural/. Full details of the post can be found at http://www.york.ac.uk/admin/persnl/jobs/3030.htm Informal enquiries can be made to Prof. Jim Austin at austin at cs.york.ac.uk

--
Jim Austin, Professor of Neural Computation
Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK.
Tel: 01904 43 2734 Fax: 01904 43 2767
web pages: http://www.cs.york.ac.uk/arch/

From kimmo at james.hut.fi Tue Jun 29 04:44:06 1999 From: kimmo at james.hut.fi (Kimmo Raivio) Date: 29 Jun 1999 11:44:06 +0300 Subject: Thesis: Receiver Structures based on SOMs Message-ID:

The following Dr.Tech. thesis is available at
http://www.cis.hut.fi/~kimmo/papers/thesis.ps.gz (compressed postscript, 217K)
http://www.cis.hut.fi/~kimmo/papers/thesis.ps (postscript, 797K)
Most of the articles that belong to the thesis can be accessed through the page http://www.cis.hut.fi/~kimmo/papers/

----------------------------------------------------------------------

Receiver Structures Based on Self-Organizing Maps

Kimmo Raivio, Helsinki University of Technology, Lab. of Computer and Information Science, P.O.BOX 5400, FIN-02015 HUT, FINLAND. Email: Kimmo.Raivio at hut.fi

Abstract

New adaptive receiver structures are studied to allow more efficient compensation of the disturbances of the communication channel. This study concentrates on the use of the Self-Organizing Map (SOM) algorithm as a building block of new adaptive receivers. The SOM has been used both as an adaptive decision device and to follow up error signals. When the SOM was used as an adaptive decision device, comparisons with conventional equalizers such as the linear equalizer and the decision feedback equalizer were performed. The new structures were also compared with other neural methods like radial basis function networks and multi-layer perceptrons. The performance of the neural equalizers, and especially of the SOM, has been found to be better in nonlinear multipath channels and about equal in linear channels. When the SOM was used to follow up error signals, the underlying idea was to cancel interference. This task was divided between following up the error distribution and finding out the error estimate. The error was approximately the same as the interference. Other sources of error were noise, intersymbol interference, wrong error estimates and detection errors due to the reasons mentioned before. The error distribution can be followed up, but the problem is how to predict the error. Some solutions are presented in this thesis, but they do not provide satisfactory results.
The performance has been compared with a pure detector without any kind of interference cancellation and with a receiver based on a radial basis function network. However, it was discovered that these neural receivers designed for interference cancellation perform better when nonlinear distortions are compensated. The receivers based on the SOM are slightly more complicated than conventional ones, but when a channel has nonlinear disturbances in particular, they offer one possible solution.

--
* Kimmo Raivio, Dr. of Science in Technology | email: Kimmo.Raivio at hut.fi
* Lab. of Computer and Information Science | http://www.cis.hut.fi/~kimmo/
* Helsinki University of Technology | phone +358 9 4515295
* P.O.BOX 5400, FIN-02015 HUT, FINLAND | fax +358 9 4513277
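Since the thesis builds its receivers from the SOM algorithm, a minimal version of that building block may help place the abstract. This is only the generic SOM update on toy two-dimensional data: the grid size, learning schedules and data are invented, and none of the thesis's receiver structures are implemented here.

import numpy as np

rng = np.random.default_rng(0)
GRID = 8                                        # 8 x 8 map
codebook = rng.uniform(-1, 1, (GRID, GRID, 2))  # one 2-D weight vector per unit
gy, gx = np.mgrid[0:GRID, 0:GRID]

data = rng.uniform(-1, 1, (2000, 2))            # stand-in for received samples
for t, x in enumerate(data):
    frac = 1 - t / len(data)
    lr = 0.5 * frac                             # decaying learning rate
    sigma = 1 + 3 * frac                        # shrinking neighborhood width
    # Best-matching unit: the codebook vector closest to the input.
    d2 = ((codebook - x) ** 2).sum(-1)
    by, bx = np.unravel_index(d2.argmin(), d2.shape)
    # Move the winner and its grid neighbors toward the input.
    h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
    codebook += lr * h[..., None] * (x - codebook)

# The trained codebook tiles the input space; an adaptive decision device
# reads its decisions off whichever unit wins for each received sample.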
From aapo at james.hut.fi Tue Jun 29 09:24:37 1999 From: aapo at james.hut.fi (Aapo Hyvarinen) Date: Tue, 29 Jun 1999 16:24:37 +0300 (EEST) Subject: ICA 2000: 1st CFP Message-ID: <199906291324.QAA17887@james.hut.fi>

[Our apologies if you receive multiple copies of this message.]

First Call for Papers:
I C A 2000
International Workshop on INDEPENDENT COMPONENT ANALYSIS and BLIND SIGNAL SEPARATION
19-22 June 2000, Helsinki, Finland
http://www.cis.hut.fi/ica2000/
Submission deadline: 1 March 2000

AIMS AND SCOPE

This workshop is the second in the series initiated by the highly successful ICA'99 workshop in Aussois, France. It is devoted to recent advances in Independent Component Analysis and Blind Signal Separation. An important goal of the workshop is to bring together researchers from artificial neural networks, signal processing, and other related fields to provide interdisciplinary exchange. Papers describing original work on ICA and BSS are invited. Relevant topics include, for example:
- Theory and estimation methods
- Extensions of basic models
- Convolutive and noisy mixtures
- Nonlinear methods
- Hardware implementations
- Audio and telecommunications applications
- Biomedical applications
- Image processing applications
- Data mining applications
- Sensory coding models

PAPER SUBMISSION

Important dates:
1 March, 2000: Submission of *full* paper
15 April, 2000: Notification of acceptance
19-22 June, 2000: Workshop

Detailed submission information will be available from our web site: http://www.cis.hut.fi/ica2000/ Submitted papers will be peer-reviewed, and acceptance will be based on quality, relevance and originality. All the papers presented at the workshop will be published in the Proceedings of ICA 2000.

INTERNATIONAL PROGRAM COMMITTEE
L. Almeida, INESC, Portugal; S.-I. Amari, RIKEN, Japan; A. Bell, Interval Research, USA; J.-F. Cardoso, ENST, France; A. Cichocki, RIKEN, Japan; P. Comon, Universite de Nice, France; S. Douglas, Southern Methodist University, USA; C. Fyfe, Univ. of Paisley, UK; S. Haykin, McMaster University, Canada; A. Hyvarinen, Helsinki Univ. of Technology, Finland; C. Jutten, INPG, France; J. Karhunen, Helsinki Univ. of Technology, Finland; S. Kassam, Univ. of Pennsylvania, USA; V. Koivunen, Helsinki Univ. of Technology, Finland; T.-W. Lee, Salk Institute, USA; R.-W. Liu, Univ. of Notre Dame, USA; P. Loubaton, Universite de Marne la Vallee, France; K.-R. Mueller, GMD First, Germany; B. Olshausen, UC Davis, USA; E. Oja, Helsinki Univ. of Technology, Finland; P. Pajunen, Helsinki Univ. of Technology, Finland; J. Principe, Univ. of Florida, USA; T. Sejnowski, Salk Institute, USA; K. Torkkola, Motorola Corporate Research, USA; J. Tugnait, Auburn University, USA; L. Xu, The Chinese Univ. of Hong Kong, China

LOCAL ORGANIZING COMMITTEE
General Chair: E. Oja; Program Chair: J. Karhunen; Local Arrangements Chair: V. Koivunen; Publications Chair: P. Pajunen; Publicity Chair: A. Hyvarinen; Treasurer: J. Iivarinen; Web Master: J. Sarela

COOPERATING SOCIETIES
European Neural Network Society, IEEE Signal Processing Society, EURASIP, IEEE Neural Networks Council, IEEE Circuits and Systems Society

CONTACT INFORMATION
web site: http://www.cis.hut.fi/ica2000/
email: ica2000 at mail.cis.hut.fi
postal mail: ICA 2000, P.O.Box 5400, Lab of Comp. and Info. Science, Helsinki Univ. of Technology, FIN-02015 HUT, Finland

From sbaluja at lycos.com Tue Jun 29 18:24:10 1999 From: sbaluja at lycos.com (sbaluja@lycos.com) Date: Tue, 29 Jun 1999 18:24:10 -0400 Subject: Paper: High Performance Named-Entity Extraction Message-ID: <8525679F.007A85FD.00@pghmta2.mis.pgh.lycos.com>

Paper: Applying Machine Learning for High Performance Named-Entity Extraction
Authors: Shumeet Baluja, Vibhu Mittal, Rahul Sukthankar
Available from: http://www.cs.cmu.edu/~baluja

Abstract: This paper describes a machine learning approach to build an efficient, accurate and fast name spotting system. Finding names in free text is an important task in addressing real-world text based applications. Most previous approaches have been based on carefully hand-crafted modules encoding linguistic knowledge specific to the language and document genre. Such approaches have two drawbacks: they require large amounts of time and linguistic expertise to develop, and they are not easily portable to new languages and genres. This paper describes an extensible system which automatically combines weak evidence for name extraction. This evidence is gathered from easily available sources: part-of-speech tagging, dictionary lookups, and textual information such as capitalization and punctuation. Individually, each piece of evidence is insufficient for robust name detection. However, the combination of evidence, through standard machine learning techniques, yields a system that achieves performance equivalent to the best existing hand-crafted approaches.

Contact: sbaluja at lycos.com, mittal at jprc.com, rahuls at jprc.com
Questions and comments are welcome!
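A toy illustration of the combining-weak-evidence idea described in the abstract: each token is mapped to a handful of cheap binary features, and a standard classifier learns how to weight them. Everything here (the feature set, the tiny dictionary, the made-up training tokens, the use of scikit-learn's logistic regression) is a hypothetical stand-in, far simpler than the system in the paper.

from sklearn.linear_model import LogisticRegression

def features(tok, prev):
    # A few cheap, individually weak evidence sources per token.
    return [
        tok[:1].isupper(),                          # capitalized
        tok.isupper(),                              # all caps
        prev in {"mr.", "dr.", "prof."},            # honorific to the left
        tok.lower() in {"john", "mary", "smith"},   # tiny name dictionary
        tok.endswith("."),                          # punctuation cue
    ]

# Invented training tokens: (token, lowercased previous token, is-name label).
train = [("Dr.", "", 0), ("Smith", "dr.", 1), ("said", "smith", 0),
         ("Mary", "", 1), ("bought", "mary", 0), ("apples", "bought", 0),
         ("Prof.", "", 0), ("John", "prof.", 1), ("left", "john", 0),
         ("The", "", 0), ("report", "the", 0), ("It", "", 0)]
X = [features(t, p) for t, p, _ in train]
y = [label for _, _, label in train]

clf = LogisticRegression().fit(X, y)    # learns how to weight the evidence

prev = ""
for tok in ["Yesterday", "Dr.", "Jones", "arrived"]:
    print(tok, clf.predict([features(tok, prev)])[0])   # 0/1 name tag per token
    prev = tok.lower()

With a realistic corpus the same pattern scales: additional evidence sources such as part-of-speech tags simply become additional feature columns.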
From sue at soc.plym.ac.uk Wed Jun 30 06:27:53 1999 From: sue at soc.plym.ac.uk (Sue Denham) Date: Wed, 30 Jun 1999 11:27:53 +0100 Subject: Senior Lectureships / Readerships in Computational Neuroscience and Neural Computation Message-ID: <1.5.4.32.19990630102753.00749220@soc.plym.ac.uk> University of Plymouth, UK School of Computing Centre for Neural and Adaptive Systems Senior Lectureships / Readerships in Computational Neuroscience and Neural Computation Salary: £27,998 - £29,600 pa (Senior Lecturer) Applications are invited for two newly-established, permanent research-related academic positions in the Centre for Neural and Adaptive Systems, a well-established research group with an international reputation in the theory and computational modelling of neural systems. Applicants must have a very strong research record in either computational neuroscience or biologically inspired neural computation. Appointments will be made either to a Senior Lectureship (salary: £27,998 - £29,600 pa) or to a Readership (salary: £27,998 - £35,204 pa), depending on the current research standing of the appointee. Further information about the positions and the research activities of the Centre for Neural and Adaptive Systems can be obtained, via e-mail or telephone, from Professor Mike Denham (mike at soc.plym.ac.uk; tel: +44 (0)1752 232547). Applicants are invited to send, as soon as possible, a full curriculum vitae, together with details of their current research activities and future plans, to Professor Mike Denham by e-mail to the above address, or by mail to University of Plymouth, Plymouth, PL4 8AA, UK. Dr Sue Denham Centre for Neural and Adaptive Systems School of Computing University of Plymouth Plymouth PL4 8AA England tel: +44 17 52 23 26 10 fax: +44 17 52 23 25 40 e-mail: sue at soc.plym.ac.uk http://www.tech.plym.ac.uk/soc/research/neural/index.html

From movellan at cogsci.ucsd.edu Wed Jun 30 18:45:48 1999 From: movellan at cogsci.ucsd.edu (Javier R. Movellan) Date: Wed, 30 Jun 1999 15:45:48 -0700 Subject: TR Announcement Message-ID: <377A9E1C.B7F37BEA@cogsci.ucsd.edu> The following technical report is available online at http://cogsci.ucsd.edu (follow the links to Tech Reports & Software). Physical copies are also available (see the site for information). Modeling Path Distributions Using Partially Observable Diffusion Networks: A Monte-Carlo Approach. Paul Mineiro, Department of Cognitive Science, University of California San Diego; Javier R. Movellan, Department of Cognitive Science & Institute for Neural Computation, University of California San Diego; Ruth J. Williams, Department of Mathematics & Institute for Neural Computation, University of California San Diego. Hidden Markov models have been more successful than recurrent neural networks for problems involving temporal sequences, e.g., speech recognition. One possible reason for this is that recurrent neural networks are being used in ways that do not handle temporal uncertainty well. In this paper we present a framework for learning, recognition and stochastic filtering of temporal sequences, based on a probabilistic version of continuous recurrent neural networks. We call these networks diffusion (neural) networks because they are based on stochastic diffusion processes, defined by adding Brownian motion to the standard recurrent neural network dynamics. The goal is to combine the versatility of recurrent neural networks with the power of probabilistic techniques.
We focus on the problem of learning to approximate a desired probability distribution of sequences. Once a distribution of sequences has been learned, well-known techniques can be applied for the generation, recognition and stochastic filtering of new sequences. We present an adaptive importance sampling scheme for the estimation of log-likelihood gradients, which allows the use of iterative optimization techniques, such as gradient descent and the EM algorithm, to train diffusion networks. We present results for an automatic visual speech recognition task in which diffusion networks provide excellent performance compared to hidden Markov models.
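To make the model concrete, here is a minimal sketch of how the forward dynamics of such a diffusion network can be simulated with the Euler-Maruyama method: standard continuous recurrent-network dynamics plus a Brownian-motion increment at each step. This is not the authors' code; the network size, weights and noise level are illustrative stand-ins, and the learning machinery (the importance-sampled likelihood gradients) is omitted.

# Euler-Maruyama simulation of RNN dynamics driven by Brownian motion.
# Illustrative sketch only; all parameters are arbitrary stand-ins.
import numpy as np

rng = np.random.default_rng(0)

n_units, dt, n_steps, sigma = 4, 0.01, 1000, 0.3
W = rng.standard_normal((n_units, n_units)) / np.sqrt(n_units)  # recurrent weights
b = np.zeros(n_units)                                           # external input / bias

x = np.zeros(n_units)                     # network state
path = np.empty((n_steps, n_units))
for t in range(n_steps):
    drift = -x + W @ np.tanh(x) + b       # deterministic RNN dynamics
    dB = np.sqrt(dt) * rng.standard_normal(n_units)  # Brownian increment
    x = x + drift * dt + sigma * dB       # Euler-Maruyama step
    path[t] = x

# Each run yields a different sample path; a trained model assigns
# probabilities to such paths for recognition and stochastic filtering.
print(path.mean(axis=0))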