From bengioy at IRO.UMontreal.CA Tue Oct 1 13:31:35 2002 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Tue, 1 Oct 2002 13:31:35 -0400 Subject: machine learning position in HEC, Montreal Message-ID: <20021001133135.B17930@vor.iro.umontreal.ca> POSITION IN DATA MINING, ARTIFICIAL INTELLIGENCE OR MACHINE LEARNING Department of Management Science, HEC Montréal The Department of Management Science at HEC Montréal invites applications for a tenure-track position in data mining, artificial intelligence or machine learning at the rank of assistant, associate or full professor. Priority will be given to candidates with high interest and expertise in data mining, but candidates with interests and expertise in related fields will also be considered. HEC Montréal is affiliated with the University of Montreal and is recognized as a world leader in business education. For information about the Department of Management Science and HEC Montréal, candidates are invited to visit the webpages www.hec.ca/mqg/ and www.hec.ca/. Duties: Undergraduate and graduate teaching, supervision of graduate students, and research. Requirements: A Ph.D. degree in Computing Science, Statistics, Operations Research or a related discipline. Teaching excellence is required, as well as a strong aptitude for research. The main language of instruction at HEC is French; however, programs and courses are also offered in English and Spanish. Starting date: June 1, 2003. 
Interested candidates must submit their curriculum vitae, including a concise statement of their research interests, the contact details of three references, and copies of at most three of their most important research publications, before December 15, 2002, to: François Bellavance Chair Department of Management Science HEC Montréal 3000 chemin de la Côte-Sainte-Catherine Montreal (Quebec) H3T 2A7 e-mail : francois.bellavance at hec.ca From Marc.Maier at snv.jussieu.fr Tue Oct 1 10:35:26 2002 From: Marc.Maier at snv.jussieu.fr (Marc Maier) Date: Tue, 01 Oct 2002 16:35:26 +0200 Subject: Job announcement in Computational Neuroscience, Paris Message-ID: <3D99B2AD.10357644@snv.jussieu.fr> Job announcement in Computational Neuroscience We will open a permanent position in Computational Neuroscience at the level of Maître de conférences (equivalent to Lecturer or Assistant Professor) at University Paris-VII in fall 2003. The compulsory qualification procedure for candidature closes October 8, 2002 (see: http://www.education.gouv.fr/personnel/enssup/antares/default.htm). The opening of the position will be officially announced in early 2003. Research will be centred on learning and control of upper limb movements, including manual dexterity and object manipulation. We seek candidates with strong experience in biologically inspired neural network modeling, motor control and biomechanics, and a background in Neuroscience. A francophone background or working knowledge of French is required for the teaching duties at University Paris-VII that go with the position. Teaching will involve the domains of Neuroscience, Computational Neuroscience and Bioinformatics. The position will be attached to INSERM U.483, headed by Y. Burnod, who wishes to enlarge and reinforce our existing modeling team, which comprises wide-ranging and complementary expertise spanning motor neuroscience, biophysical and neural network modeling, and robotics. 
For further information contact: Marc.Maier at snv.jussieu.fr - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - Prof. Marc Maier Dept. of Biology, University Paris-VII and INSERM U483 Bldg. C, 6th floor 9, Quai St-Bernard 75005 Paris tel: +33 1 44 27 34 21 From myosioka at brain.riken.go.jp Wed Oct 2 16:31:33 2002 From: myosioka at brain.riken.go.jp (Masahiko Yoshioka) Date: Thu, 03 Oct 2002 05:31:33 +0900 (JST) Subject: Preprint: Linear stability analysis for networks of spiking neurons Message-ID: <20021003.053133.74751205.myosioka@brain.riken.go.jp> Dear Connectionists, I would like to announce the availability of the preprint of my recent paper, "Linear stability analysis of retrieval state in associative memory neural networks of spiking neurons" M. Yoshioka, Phys. Rev. E in press Available at http://arxiv.org/abs/cond-mat/0209686 Abstract: We study associative memory neural networks of Hodgkin-Huxley-type spiking neurons in which multiple periodic spatio-temporal patterns of spike timing are memorized as limit-cycle-type attractors. In encoding the spatio-temporal patterns, we assume spike-timing-dependent synaptic plasticity with an asymmetric time window. Analysis of the periodic solution of the retrieval state reveals that if the area of the negative part of the time window is equal to that of the positive part, then crosstalk among encoded patterns vanishes. A phase transition due to loss of stability of the periodic solution is observed when we assume a fast alpha-function for direct interaction among neurons. In order to evaluate the critical point of this phase transition, we employ Floquet theory, in which the stability problem for the infinite number of spiking neurons interacting through alpha-functions is reduced to an eigenvalue problem for a finite-size matrix. 
Numerical integration of the single-body dynamics yields the explicit value of the matrix, which enables us to determine the critical point of the phase transition with a high degree of precision. --- Masahiko Yoshioka Brain Science Institute, RIKEN Hirosawa 2-1, Wako-shi, Saitama, 351-0198, Japan From intneuro at net2000.com.au Thu Oct 3 15:41:22 2002 From: intneuro at net2000.com.au (INTEGRATIVE NEUROSCIENCE) Date: Thu, 03 Oct 2002 12:41:22 -0700 Subject: order your copy now! Message-ID: <3D9C9D61.B483D875@net2000.com.au> The following paper is scheduled to appear in the forthcoming issue of the Journal of Integrative Neuroscience, volume 1, Issue 2, December 2002: AN INTEGRATIVE THEORY OF COGNITION A framework is outlined for connecting brain imaging activity with the underlying biophysical properties of neural networks, and their mechanisms of action and organizing principles. The main thrust of the framework is a dynamic theory of semantics based on functional integration of biophysical neural networks. It asserts that higher-brain function arises from both synaptic and extrasynaptic integration in the neuropil, where information on environmental changes is represented dynamically through a discourse of semantics. Consequently, integrative neural modeling is shown to be an important methodology for analyzing the response activities of functional imaging studies in elucidating the relationship between brain structure, function and behavior. Roman R. Poznanski Centre de Recherche en Physiologie Intégrative Hôpital Tarnier CHU Cochin-Port-Royal, 89, rue d'Assas, Paris 75006 FRANCE poznan at integrative-physiology.org ----------------------------- The Journal of Integrative Neuroscience will serve as a focus for new discoveries in the advancement of experimental and theoretical neuroscience. The journal covers the following disciplines: 1. PDEs and nonlinear dynamical systems; 2. Noninformationalist/noncomputationalist theories of mind; 3. 
Integrative neural modeling; 4. Functional imaging (PET/fMRI); 5. Experiments linking molecular with cellular phenomena. 6. Interregional functional connectivity (anatomy and physiology); 7. Philosophical foundations of neuroscience; 8. Neural engineering applications. Computational neuroscience with its computer metaphors provides little help to foundational theory because of its lack of relation to physical law and its weak relationship to neurobiological processes. The new field of integrative neuroscience encompasses the requirements of physical-biological foundations, expansive inclusiveness of scope across biological and psychological variables, multi-leveled hierarchical complexity, and analytical tools demanded by the brain's complexity. Integrative neuroscience is a large-scale synthesis whose scope and time are right. It embodies the future directions of theoretical neuroscience, and should provide many bridging recognitions. http://www.worldscinet.com/jin/mkt/editorial.shtml (Only vol. 1, issue 1 is free). For orders within Europe, please contact the Imperial College Press sales department at: Tel: +44 (0)20 7836-0888 Fax: +44 (0)20 7836-2020 during U.K. business hours. Outside Europe, our books and journals are distributed by World Scientific Publishing Co.: 5 Toh Tuck Link, SINGAPORE 596224 Fax: 65-6467-7667 Tel: 65-6466-5775 E-mail: wspc at wspc.com.sg Price Information: ISSN: 0219-6352 ; Vol. 
1/2002; 2 Issues Special Rates: Individuals (Print Only) US$56 Institutions/Libraries US$ 84 -------------------------- From james at mis.mpg.de Thu Oct 3 07:51:44 2002 From: james at mis.mpg.de (Matthew James) Date: Thu, 3 Oct 2002 13:51:44 +0200 (MET DST) Subject: PhD Thesis: Dynamics of Synaptically Interacting Integrate-and-Fire Neurons Message-ID: <200210031151.NAA19358@kopernikus.mis.mpg.de> Dear Colleagues, I would like to draw your attention to my PhD thesis "Dynamics of Synaptically Interacting Integrate-and-Fire Neurons" supervised by Professor Paul Bressloff and Dr Steve Coombes at the Department of Mathematical Sciences, Loughborough University, UK. Available at: http://www.lboro.ac.uk/departments/ma/pg/theses/mampj-abs.html Abstract: Travelling waves of activity have been experimentally observed in many neural systems. The functional significance of such travelling waves is not always clear. Elucidating the mechanisms of wave initiation, propagation and bifurcation may therefore have a role to play in ascertaining the function of such waves. Previous treatments of travelling waves of neural activity have focussed on the mathematical analysis of travelling pulses and numerical studies of travelling waves. It is the aim of this thesis to provide insight into the propagation and bifurcation of travelling waveforms in biologically realistic systems. There is a great deal of experimental evidence which suggests that the response of a neuron is strongly dependent upon its previous activity. A simple model of this synaptic adaptation is incorporated into an existing theory of strongly coupled discrete integrate-and-fire (IF) networks. Stability boundaries for synchronous firing shift in parameter space according to the level of adaptation, but the qualitative nature of solutions is unaffected. The level of synaptic adaptation is found to cause a switch between bursting states and those which display temporal coherence. 
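The discrete IF network dynamics described in the abstract can be illustrated with a small simulation. The following is a generic leaky integrate-and-fire neuron with an alpha-function synaptic input, written by the editor as a sketch, not the thesis model; the function name and all parameter values are illustrative assumptions:

```python
import math

def simulate_lif(T=200.0, dt=0.1, tau_m=10.0, tau_s=2.0,
                 I0=1.5, v_th=1.0, v_reset=0.0, t_input=10.0):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron.

    Drive = constant current I0 plus one alpha-function synaptic event
    arriving at t_input (peaking tau_s after arrival). Returns spike times.
    """
    spikes, v = [], 0.0
    for i in range(int(T / dt)):
        t = i * dt
        s = t - t_input
        # alpha-function synaptic current, normalized to peak 1 at s = tau_s
        I_syn = (s / tau_s) * math.exp(1.0 - s / tau_s) if s > 0.0 else 0.0
        v += dt * (-v + I0 + I_syn) / tau_m   # leaky integration
        if v >= v_th:                         # threshold crossing
            spikes.append(t)
            v = v_reset                       # reset after spike
    return spikes
```

With I0 above threshold the neuron fires periodically; for the noise-free leaky IF model the inter-spike interval is tau_m * ln(I0 / (I0 - v_th)), about 11 time units for the values above.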
Travelling waves are analysed within a framework for a one-dimensional continuum of integrate-and-fire neurons. Self-consistent speeds and periods are determined from integro-differential equations. A number of synaptic responses (alpha-function and passive and quasi-active dendrites) produce qualitatively similar results in the travelling pulse case. For IF neurons, an additional refractory mechanism needs to be introduced in order to prevent arbitrarily high firing rates. Different mathematical formulations are considered, each producing similar results. Dendrites are repeatedly branching extensions of a neuron whose electrical properties may vary; in particular, the dendritic membrane may contain active (voltage-dependent) conductances. Under certain conditions, this active membrane gives rise to a membrane impedance that displays a prominent maximum at some nonzero resonant frequency. Dispersion curves, which relate the speed of a periodic travelling wave to its period, are constructed for the different synaptic responses, with additional oscillatory behaviour apparent in the quasi-active dendritic regime. The stationary points of these curves are shown to be critical for the formation of multi-periodic wave trains: periodic travelling waves with two periods bifurcate from trains with a single period via a drift in the spike times at stationary points of the dispersion curve. Some neurons rebound and fire after release from sustained inhibition. Many previous mathematical treatments have not included the effect of this activity. Analytical studies of a simple model which exhibits post-inhibitory rebound show that these neurons can support half-centre oscillations and periodic travelling waves. In contrast to IF networks, only a single travelling pulse wavespeed is possible in this network. Simulations of this biophysical model show broad agreement with the analytical solutions and provide insight into more complex waveforms. Results of the thesis are presented in a discussion along with possible directions for future study. 
Noise, inhomogeneous media and higher spatial dimensions are suggested. Keywords: biophysical models, dendrites, integrate-and-fire, neural coding, neural networks, post-inhibitory rebound, synaptic adaptation, travelling waves ---------------------------------------------------- Dr. Matthew P. James Max-Planck-Institute for Mathematics in the Sciences Inselstrasse 22 - 26 04103 Leipzig / Germany Phone: +49-341-9959-531 Fax: +49-341-9959-658 Email: james at mis.mpg.de URL: http://personal-homepages.mis.mpg.de/james/ From owi at imm.dtu.dk Thu Oct 3 05:23:48 2002 From: owi at imm.dtu.dk (Ole Winther) Date: Thu, 03 Oct 2002 11:23:48 +0200 Subject: Matlab ICA packages available Message-ID: <3D9C0CA4.4A8083A1@imm.dtu.dk> [Our sincere apologies if you receive multiple copies of this email] We are happy to announce the availability of three Matlab software packages for Independent Component Analysis (ICA). The algorithms are easy to use; by default no parameters need to be set. All algorithms come with demonstration scripts. The packages contain the following three algorithms: 1. Maximum likelihood (Infomax) - the Bell and Sejnowski 1995 algorithm. Optimization is performed by an effective second-order method. 2. Mean Field - the Bayesian ICA algorithm of Højen-Sørensen, Winther and Hansen. Linear and instantaneous mixing. Estimation of the noise covariance - Gaussian noise model. General mixing matrix - quadratic, over- and under-complete - free/positivity-constrained estimation. Variety of source distributions, e.g. - exponential for positive sources - Gaussian for probabilistic PCA and factor analysis - bi-Gauss for negative-kurtosis sources - heavy-tailed for positive-kurtosis sources. 3. Molgedey and Schuster - the Molgedey-Schuster decorrelation algorithm. Square mixing matrix and no noise. Very fast - no iterations. The delay tau is estimated. Log likelihoods are calculated for all three algorithms. 
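For readers curious how the third algorithm works, here is a minimal NumPy sketch of the Molgedey-Schuster idea: separation via the eigenvectors of the zero-lag and time-lagged covariance matrices. This is the editor's own illustration, not the announced Matlab toolbox; the function name and the fixed lag are assumptions (the toolbox estimates tau).

```python
import numpy as np

def molgedey_schuster(X, tau=1):
    """Blind source separation by joint diagonalization of the
    zero-lag and tau-lag covariance matrices of the mixtures.

    X   : (n_sources, n_samples) array of mixed, zero-mean signals.
    tau : time lag (fixed here; the announced toolbox estimates it).
    Returns (S, A) with estimated sources S and mixing matrix A, X ~ A @ S.
    """
    T = X.shape[1] - tau
    C0 = X[:, :T] @ X[:, :T].T / T            # zero-lag covariance
    Ct = X[:, :T] @ X[:, tau:tau + T].T / T   # lagged covariance
    Ct = 0.5 * (Ct + Ct.T)                    # symmetrize for stability
    # Generalized eigenproblem Ct w = lambda C0 w: the eigenvectors of
    # C0^{-1} Ct are the unmixing filters (eigenvalues are real because
    # C0 is positive definite and Ct is symmetric).
    _, V = np.linalg.eig(np.linalg.solve(C0, Ct))
    V = np.real(V)
    S = V.T @ X                               # unmix the signals
    A = np.linalg.inv(V.T)                    # estimated mixing matrix
    return S, A
```

As the announcement says, no iterations are needed: a single eigendecomposition separates sources whose autocorrelations at lag tau differ.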
As an additional feature the Bayesian Information Criterion (BIC) can be used for selecting the number of independent components. The packages are available from: http://isp.imm.dtu.dk/toolbox/ All comments and suggestions are welcome. Authors: Thomas Kolenda http://isp.imm.dtu.dk/staff/thko/ Ole Winther http://isp.imm.dtu.dk/staff/winther/ Lars Kai Hansen http://isp.imm.dtu.dk/staff/lkhansen/ Best regards, Ole Winther -- Associate Professor, Digital Signal Processing Informatics and Mathematical Modelling (IMM) Technical University of Denmark (DTU) http://www.imm.dtu.dk Tel: +45 4525 3895 Homepage: http://isp.imm.dtu.dk/staff/winther/ From qian at brahms.cpmc.columbia.edu Thu Oct 3 16:48:33 2002 From: qian at brahms.cpmc.columbia.edu (Ning Qian) Date: Thu, 3 Oct 2002 16:48:33 -0400 Subject: paper: a theory of depth from vertical disparity Message-ID: <200210032048.g93KmX519473@bach.cpmc.columbia.edu> Dear Colleagues, The following paper on "a physiological theory of depth perception from vertical disparity" can be downloaded from: http://brahms.cpmc.columbia.edu/publications/vertical.ps.gz or: http://brahms.cpmc.columbia.edu/publications/vertical.pdf Related work on horizontal disparity and interocular time delay (Pulfrich effect) can be found at http://brahms.cpmc.columbia.edu/ Best regards, Ning ------------------------------------------------------------------ A physiological theory of depth perception from vertical disparity Nestor Matthews, Xin Meng, Peng Xu, and Ning Qian Vision Research (in press) Abstract It has been known since the time of Helmholtz that vertical differences between the two retinal images can generate depth perception. Although many ecologically and geometrically inspired theories have been proposed, the neural mechanisms underlying the phenomenon remain elusive. 
Here we propose a new theory for depth perception from vertical disparity based on the oriented binocular receptive fields of visual cortical cells and on the radial bias of the preferred-orientation distribution in the cortex. The theory suggests that oriented cells may treat a vertical disparity as a weaker, equivalent horizontal disparity. It explains the induced effect, and the quadrant- and size-dependence of vertical disparity. It predicts that horizontal and vertical disparities should locally enhance or cancel each other according to their depth-signs, and that the effect of vertical disparity should be orientation dependent. These predictions were confirmed through psychophysical experiments. From kaynak at boun.edu.tr Fri Oct 4 02:49:11 2002 From: kaynak at boun.edu.tr (okyay kaynak) Date: Fri, 4 Oct 2002 09:49:11 +0300 Subject: Soft Computing days in Istanbul Message-ID: Please accept our apologies if you receive multiple copies of this message. SOFT COMPUTING DAYS IN ISTANBUL June 26-July 3, 2003 Dear Colleague, During the early summer of 2003, the magnificent city of Istanbul is going to host two major events in the area of soft computing. ICANN, the annual conference of the European Neural Network Society, and ICONIP, the annual conference of the Asia-Pacific Neural Network Assembly, will be held jointly (what better place than Istanbul for such an event!), and this will be followed by the biennial conference of the International Fuzzy Systems Association, the IFSA Congress. The two events will overlap by one day so that neural and fuzzy researchers can interact. For those who want to participate in both events, a special registration fee will be offered. City sightseeing and pre- and post-conference tours will complement the scientific programs, for you to experience some of the cultural riches of Istanbul and Turkey. Make a note of the days in your diary and come and join us. ISTANBUL AWAITS YOU..... 
Joint 13th International Conference on Artificial Neural Networks and 10th International Conference on Neural Information Processing: ICANN/ICONIP 2003 June 26 - 29, 2003, Istanbul, Turkey http://www.nn2003.org 10th IFSA World Congress: IFSA 2003 June 29 - July 2, 2003, Istanbul, Turkey http://www.ifsa2003.org If you are more interested in the CONTROL area, there is a control applications conference too: 2003 IEEE Conference on Control Applications: CCA 2003 June 23-25, 2003, Istanbul, Turkey http://mecha.ee.boun.edu.tr/cca2003 From mzib at ee.technion.ac.il Fri Oct 4 09:45:53 2002 From: mzib at ee.technion.ac.il (Michael Zibulevsky) Date: Fri, 4 Oct 2002 16:45:53 +0300 (IDT) Subject: New paper: Relative Newton Method for Quasi-ML Blind Source Separation Message-ID: Announcing a paper ... Title: Relative Newton Method for Quasi-ML Blind Source Separation Author: Michael Zibulevsky ABSTRACT: The presented relative Newton method for quasi-maximum likelihood blind source separation significantly outperforms natural gradient descent in batch mode. The structure of the corresponding Hessian matrix allows fast inversion without assembling it. Experiments with sparsely representable signals and images demonstrate super-efficient separation. URL of gzipped ps file: http://ie.technion.ac.il/~mcib/newt_ica_jmlr1.ps.gz Contact: mzib at ee.technion.ac.il =========================================================================== Michael Zibulevsky, Ph.D. 
Email: mzib at ee.technion.ac.il Faculty of Electrical Engineering Phone: 972-4-829-4724 Technion - Israel Institute of Technology 972-4-832-3885 Haifa 32000, Israel Cell: 972-55-968297 http://ie.technion.ac.il/~mcib/ Fax: 972-4-829-4799 =========================================================================== From esann at dice.ucl.ac.be Fri Oct 4 07:03:37 2002 From: esann at dice.ucl.ac.be (esann) Date: Fri, 4 Oct 2002 13:03:37 +0200 Subject: CFP: ESANN'2003 European Symposium on Artificial Neural Networks Message-ID: <002501c26b95$b4238100$48ed6882@dice.ucl.ac.be> ESANN'2003 11th European Symposium on Artificial Neural Networks Bruges (Belgium) - April 23-24-25, 2003 Announcement and call for papers ===================================================== Technically co-sponsored by the IEEE Region 8, the IEEE Benelux Section, the IEEE Neural Networks Society (to be confirmed), the International Neural Networks Society and the European Neural Networks Society. The call for papers for the ESANN'2003 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; please accept our apologies if you receive this e-mail twice despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). If you have difficulty connecting to the Web, please send an e-mail to esann at dice.ucl.ac.be and we will send you a full version of the call for papers. ESANN'2003 is organised in collaboration with the UCL (Université catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven). 
Scope and topics ---------------- Since it was first held in 1993, the European Symposium on Artificial Neural Networks has become the reference event for research on the fundamentals and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN to present their latest results and comprehensive surveys, and to discuss future developments in the field. The ESANN'2003 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical and statistical aspects, in the context of function approximation, classification, control, time-series prediction, signal processing, vision, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, data analysis, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. Note that it is the topic of a paper, not its quality, that decides whether it better fits an oral or a poster session. Posters will be selected by the same criteria as oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may indicate on the submission form that they are only willing to present their paper orally. 
The following is a non-exhaustive list of topics covered during the ESANN conferences:
- Models and architectures
- Learning algorithms
- Theory
- Mathematics
- Statistical data analysis
- Classification
- Approximation of functions
- Time series forecasting
- Nonlinear dimension reduction
- Multi-layer Perceptrons
- RBF networks
- Self-organizing maps
- Vector quantization
- Support Vector Machines
- Recurrent networks
- Fuzzy neural nets
- Hybrid networks
- Bayesian neural nets
- Cellular neural networks
- Signal processing
- Independent component analysis
- Natural and artificial vision
- Adaptive control
- Identification of non-linear dynamical systems
- Biologically plausible networks
- Bio-inspired systems
- Cognitive psychology
- Evolutive learning
- Adaptive behaviour
Special sessions ---------------- Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address. Here is the list of special sessions that will be organized during the ESANN'2003 conference:
1. Links between neural networks and webs (M. Gori)
2. Mathematical aspects of neural networks (B. Hammer, T. Villmann)
3. Statistical learning and kernel-based algorithms (M. Pontil, J. Suykens)
4. Digital image processing with neural networks (A. Wismüller, U. Seiffert)
5. Industrial and agronomical applications of neural networks (L.M. Reyneri)
6. Neural networks for human/computer interaction (C.W. Omlin)
Location -------- The conference will be held in Bruges (also called the "Venice of the North"), one of the most beautiful medieval towns in Europe. 
Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known world-wide for its architectural style, its canals, and its pleasant atmosphere. The conference will be organised in a hotel located within walking distance of the town centre. There is no obligation for participants to stay in this hotel; hotels of all levels of comfort and price are available in Bruges. A room in the conference hotel can be booked at a preferential rate through the conference secretariat, and a list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium. Proceedings and journal special issue ------------------------------------- The proceedings will include all communications presented at the conference (tutorials, oral and posters), and will be available on-site. Extended versions of selected papers will be published in the Neurocomputing journal (Elsevier, V. David Sanchez A., ed.). Call for contributions ---------------------- Prospective authors are invited to submit their contributions before 6 December 2002. The electronic submission procedure is detailed on the ESANN Web pages: http://www.dice.ucl.ac.be/esann/. Authors must indicate their choice for oral or poster presentation at submission. They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. Authors of accepted papers will have to register before February 28, 2003, and will benefit from the advance registration fee. 
Deadlines --------- Submission of papers: December 6, 2002 Notification of acceptance: February 6, 2003 Symposium: April 23-25, 2003
Registration fees -----------------
              before March 15, 2003   after March 15, 2003
Universities  425                     480
Industries    525                     580
The registration fee includes attendance at all sessions, the ESANN'2003 dinner, a copy of the proceedings, daily lunches (23-25 April 2003), and the coffee breaks. Conference secretariat ---------------------- ESANN'2003, d-side conference services, 24 av. L. Mommaerts, B - 1140 Evere (Belgium) phone: + 32 2 730 06 11 fax: + 32 2 730 06 00 E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/esann Steering and local committee (to be confirmed) ---------------------------- Hughes Bersini Univ. Libre Bruxelles (B) François Blayo Préfigure (F) Marie Cottrell Univ. Paris I (F) Jeanny Hérault INPG Grenoble (F) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Jean-Pierre Peters FUNDP Namur (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------- Hervé Bourlard IDIAP Martigny (CH) Joan Cabestany Univ. Polit. de Catalunya (E) Colin Campbell Bristol Univ. (UK) Stéphane Canu Inst. Nat. Sciences App. (F) Holk Cruse Universität Bielefeld (D) Eric de Bodt Univ. Lille II (F) & UCL Louvain-la-Neuve (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicholas Copernicus Univ. (PL) Marc Duranton Philips Semiconductors (USA) Richard Duro Univ. Coruna (E) Jean-Claude Fort Université Nancy I (F) Colin Fyfe Univ. Paisley (UK) Stan Gielen Univ. of Nijmegen (NL) Marco Gori Univ. Siena (I) Bernard Gosselin Fac. Polytech. Mons (B) Manuel Grana UPV San Sebastian (E) Anne Guérin-Dugué INPG Grenoble (F) Barbara Hammer Univ. of Osnabrück (D) Martin Hasler EPFL Lausanne (CH) Laurent Hérault CEA-LETI Grenoble (F) Gonzalo Joya Univ. Malaga (E) Christian Jutten INPG Grenoble (F) Juha Karhunen Helsinki Univ. of Technology (FIN) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Jouko Lampinen Helsinki Univ. of Tech. (FIN) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Mia Loccufier Univ. Gent (B) Erzsebet Merenyi Rice Univ. (USA) Jean Arcady Meyer Univ. Paris 6 (F) José Mira UNED (E) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Christian W. Omlin Univ. of the Western Cape (SA) Gilles Pagès Univ. Paris 6 (F) Thomas Parisini Univ. Trieste (I) Hélène Paugam-Moisy Université Lumière Lyon 2 (F) Alberto Prieto Universidad de Granada (E) Didier Puzenat Univ. Antilles-Guyane (F) Leonardo Reyneri Politecnico di Torino (I) Jean-Pierre Rospars INRA Versailles (F) Jose Santos Reyes Univ. Coruna (E) Jochen Steil Univ. Bielefeld (D) John Stonham Brunel University (UK) Johan Suykens K. U. Leuven (B) John Taylor Kings College London (UK) Claude Touzet Univ. Provence (F) Marc Van Hulle KUL Leuven (B) Thomas Villmann Univ. Leipzig (D) Christian Wellekens Eurecom Sophia-Antipolis (F) Axel Wismüller Ludwig-Maximilians-Univ. München (D) ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From paulm at earthlink.net Sat Oct 5 23:22:10 2002 From: paulm at earthlink.net (Dr Paul R Martin) Date: Sun, 06 Oct 2002 13:22:10 +1000 (EST) Subject: CVNet - Positions in Physiology at Sydney Message-ID: <0H3J002ASIJJN9@mail.newcastle.edu.au> The Anderson Stuart precinct has a vigorous Visual Neuroscience Group. 
Potential applicants for the positions below are welcome to contact me for more details. Paul ------------------------------------------- 2 Lecturer/Senior Lecturer Positions, School of Biomedical Sciences, University of Sydney The School of Biomedical Sciences is seeking to appoint 2 Lecturers/Senior Lecturers, one to be based in the Physiology Department, the other in the Pharmacology Department. The School of Biomedical Sciences lies within the Faculty of Medicine of the University of Sydney, which comprises one of the largest and most active groups of biomedical researchers in Australia. Central facilities for genomics, proteomics and bioinformatics are developing rapidly, and opportunities for collaboration between basic and clinical sciences and research institutes located in the area are exceptional. Applicants should demonstrate how their research will build on these opportunities. The University of Sydney has established areas of research strength in cellular and molecular sciences, cardiovascular and respiratory sciences and the neurosciences. It is likely that the research interests of the appointee would lie within one of these broad areas, which are well represented within the two Departments. Applicants at either level would be expected to demonstrate creativity and productivity in their scientific output; applicants at the Senior Lecturer level will need a track record of competitive grant support and evidence of an emerging international reputation. The Departments teach Dental, Medical, Pharmacy and Science students and have substantial numbers of Honours and PhD students. Applicants for the position in Physiology must nominate several areas of Physiology which they could teach at an elementary level and should also identify their areas of expertise for research-based teaching. 
Applicants for the position in Pharmacology should demonstrate a background in the fundamentals of pharmacology and should also identify their areas of expertise for research-based teaching. Evidence of interest and experience in teaching should be presented by applicants for each position. Applicants should submit a full c.v., a short account of the type of research they wish to pursue and the areas of teaching to which they can contribute, an indication of the Departmental affiliation and level of appointment they seek, and the names of 3 referees. Further details of the departments can be found at http://www.physiol.usyd.edu.au/ and http://www.usyd.edu.au/su/pharmacology/ . ------------------------------------------------------------------- Paul R Martin Associate Professor Dept Physiology F13 University of Sydney NSW 2006 Australia paulm at physiol.usyd.edu.au Tel +61 (02) 9351 3928 Fax +61 (02) 9351 2058 ------------------------------------------------------------------- To get information on using CVNet, send a note to: majordomo at mail.ewind.com In the body of the message, enter: info cvnet From ted.carnevale at yale.edu Mon Oct 7 13:23:28 2002 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Mon, 07 Oct 2002 13:23:28 -0400 Subject: an excellent new NEURON tutorial! Message-ID: <3DA1C310.419ED028@yale.edu> It is my distinct pleasure to announce that Andrew Gillies and David Sterratt of Edinburgh University have created a new tutorial about NEURON that you will find at http://www.anc.ed.ac.uk/school/neuron This is based on the earlier tutorial written by Kevin Martin, but covers a host of new topics and reflects important changes to NEURON over the years.
Items of particular interest include --the use of templates to define new cell classes --efficient and convenient management of networks with the NetCon class --combined use of hoc code and the GUI --adding new mechanisms David and Andrew have worked with us to try to ensure that their new tutorial is clear and up-to-date, remaining tractable for entry-level users while at the same time discussing topics that may be new to more advanced users. --Ted From schwabe at cs.tu-berlin.de Mon Oct 7 12:17:24 2002 From: schwabe at cs.tu-berlin.de (Lars Schwabe) Date: Mon, 7 Oct 2002 18:17:24 +0200 Subject: International Neuroscience Summit 2002 in Berlin Message-ID: Dear Colleague, we invite you to join the International Neuroscience Summit 2002 in Berlin (please refer to www.ins2002.org for further details) which will take place from 11/28/2002 until 12/01/2002. This year's invited speakers include M. Sur (MIT, USA) K. Miller (UCSF, USA) P. Dayan (Gatsby Unit, UK) M. Mozer (University of Colorado, USA) H. Markram (EPFL, Switzerland) A. Borst (MPI, Martinsried, Germany) H. Super (University of Amsterdam, Netherlands) J. Bower (University of Texas, USA) A. Aertsen (Freiburg, Germany) Cheers and see you in Berlin, Lars Schwabe, Gregor Wenning and Peter Wiesing From dgw at MIT.EDU Mon Oct 7 12:14:55 2002 From: dgw at MIT.EDU (David Weininger) Date: Mon, 07 Oct 2002 12:14:55 -0400 Subject: book announcement--Hallamb Message-ID: <200210071214558052@outgoing.mit.edu> I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/0262582171/ Thank you!
Best, David From James_Morgan at brown.edu Mon Oct 7 13:58:10 2002 From: James_Morgan at brown.edu (Jim Morgan) Date: Mon, 07 Oct 2002 13:58:10 -0400 Subject: Position in Language Processing, Brown University Message-ID: <5.1.0.14.2.20021007135740.01cca1a0@postoffice.brown.edu> LANGUAGE PROCESSING, Brown University: The Department of Cognitive and Linguistic Sciences invites applications for a three year renewable tenure-track position at the Assistant Professor level beginning July 1, 2003. Areas of interest include but are not limited to phonology or phonological processing, syntax or sentence processing, and lexical access or lexical semantics, using experimental, formal, developmental, neurological, or computational methods. Expertise in two or more areas and/or the application of multiple paradigms is preferred. Applicants should have a strong research program and a broad teaching ability in cognitive science and/or linguistics at both the undergraduate and graduate levels. Interest in contributing curricular innovations in keeping with Brown's university-college tradition is desirable. Applicants should have completed all Ph.D. requirements by no later than July 1, 2003. Women and minorities are especially encouraged to apply. Send curriculum vitae, three letters of reference, reprints and preprints of publications, and a one page statement of research interests to Dr. James Morgan, Chair, Search Committee, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, R.I. 02912 by December 1, 2002. Brown University is an Equal Opportunity/Affirmative Action Employer From becker at mcmaster.ca Mon Oct 7 23:03:18 2002 From: becker at mcmaster.ca (S. 
Becker) Date: Mon, 7 Oct 2002 23:03:18 -0400 (EDT) Subject: Registration for NIPS*2002 Message-ID: You are invited to attend the 15th Annual Conference of NIPS 2002, Neural Information Processing Systems, at the Hyatt Regency in Vancouver, British Columbia, Canada and the Post-Conference Workshops at The Westin Resort in Whistler, B.C. Tutorials: December 9, 2002 Conference: December 10-12, 2002 Workshops: December 12-14, 2002 The Conference Program is now online: http://www.nips.cc We accepted 207 papers this year from a record 694 submissions, maintaining the same high quality with an acceptance rate of 30%, as in previous years. Sue Becker General Chair ----------------------------------------------- NEW ONLINE REGISTRATION PROCESS: https://register.nips.salk.edu/ When registering for this year's meeting you will be asked to create an account. We will retain the contact information that you provide and it will be used for a NIPS Directory that will be posted on the NIPS Foundation web site. You will be given the opportunity to choose exactly what you would like to appear in the Directory, or you may choose not to be listed at all. The NIPS Member Directory will be posted to the NIPS website in January of 2003. Our preferred method of registration is online; however, wire transfers and checks will be accepted. Even if you plan to pay using a check or wire transfer, please begin the registration process online to prevent errors in recording your contact information and to speed your registration processing. Applications for financial/travel support can also be submitted online. The deadline for such applications is Friday, Oct 18 (midnight PST). REGISTRATION DEADLINE: The deadline for early registration (with reduced registration fees) is November 8, 2002. REGISTRATION WEBSITE: https://register.nips.salk.edu/ PROCEEDINGS: All registrants will receive a CD-ROM of the conference proceedings. Proceedings will also be available free online.
The two-volume soft-cover edition, published by the MIT Press, can be purchased at the special conference rate of $35. We hope you will join us in Vancouver for an exciting NIPS 2002. Terry Sejnowski President, NIPS Foundation ----------------------------------------------- NIPS 2002 TUTORIALS - December 9, 2002 Michael Kearns, University of Pennsylvania -- Computational Game Theory Sebastian Seung, Howard Hughes Medical Institute and MIT -- Neural Integrators Yair Weiss, Hebrew University, Jianbo Shi, Carnegie Mellon University, and Serge Belongie, UC San Diego -- Eigenvector Methods for Clustering and Image Segmentation Richard M. Karp, UC Berkeley and International Computer Science Institute -- Mathematical, Statistical and Algorithmic Challenges from Genomics and Molecular Biology Martin Cooke, University of Sheffield -- Computational Auditory Scene Analysis in Listeners and Machines Andrew McCallum, University of Massachusetts at Amherst, William Cohen, Carnegie Mellon University -- Information Extraction from the World Wide Web INVITED SPEAKERS - December 10-12, 2002 Hugh Durrant-Whyte, University of Sydney -- Information Flow in Sensor Networks Paul Glimcher, New York University -- Decisions, Uncertainty and the Brain: Neuroeconomics Deborah Gordon, Stanford University -- Ants at Work David Heeger, New York University -- Neural Correlates of Perception and Attention Andrew W. Moore, Carnegie Mellon University -- Statistical Data Mining Pietro Perona, Caltech -- Learning Visual Categories WORKSHOPS - December 12-14, 2002 Propagation Algorithms on Graphs with Cycles: Theory and Applications (2 day) -- Shiro Ikeda, Toshiyuki Tanaka, Max Welling Computational Neuroimaging: Foundations, Concepts & Methods (2 day) -- S. Hanson, B. Pearlmutter, S. Strother, L. Hansen, B. Martin-Bly Multi-Agent Learning: Theory and Practice (2 day) -- Gerald Tesauro, Michael L.
Littman Independent Component Analysis and Beyond -- Stefan Harmeling, Luis Borges de Almeida, Erkki Oja, Dinh-Tuan Pham Learning of Invariant Representations -- Konrad Paul Kording, Bruno A. Olshausen Quantum Neural Computing -- Elizabeth C. Behrman, James E. Steck Spectral Methods in Dimensionality Reduction, Clustering, and Classification -- Josh Tenenbaum, Sam T. Roweis Universal Learning Algorithms and Optimal Search -- Juergen Schmidhuber, Marcus Hutter On Learning Kernels -- Nello Cristianini, Tommi Jaakkola, Michael Jordan, Gert Lanckriet Negative Results and Counter Examples -- Isabelle Guyon Neuromorphic Engineering in the Commercial World -- Timothy Horiuchi, Giacomo Indiveri, Ralph Etienne-Cummings Beyond Classification and Regression: Learning Rankings, Preferences, Equality Predicates, and Other Structures -- Rich Caruana, Thorsten Joachims Statistical Methods for Computational Experiments in Visual Processing and Computer Vision -- Ross Beveridge, Bruce Draper, Geof Givens, Ross J. Micheals, Jonathon Phillips Unreal Data: Principles of Modeling Nonvectorial Data -- Alex Smola, Gunnar Raetsch, Zoubin Ghahramani Machine Learning Techniques for Bioinformatics -- Colin Campbell, Phil Long Adaptation: Spatial and Temporal Effects on Coding -- Garrett B. Stanley Thalamocortical Processing in Audition and Vision -- Shihab A. Shamma, Anthony M. Zador ----------------------------------------------- From zaffalon at idsia.ch Wed Oct 9 10:43:38 2002 From: zaffalon at idsia.ch (Marco Zaffalon) Date: Wed, 09 Oct 2002 16:43:38 +0200 Subject: Ph.D. Position Available Message-ID: <5.1.0.14.0.20021009164303.0297c6b8@mailhost.idsia.ch> [apologies for multiple postings] Ph.D. Position Available - Deadline December 1st 2002 ----------------------------------------------------- IDSIA, Switzerland, is seeking an outstanding Ph.D. student with excellent mathematical as well as computer programming skills.
We intend to explore the opportunities that imprecise probabilities and graphical models offer for data mining, in particular for classification. Classification has a long tradition in Artificial Intelligence and Statistics. Classifiers are tools, inferred from data, that learn to perform important real-world tasks of diagnosis, prediction and recognition. Imprecise probability-based classification is an exciting new area, with great potential to provide new and reliable ways to deal with difficult problems. The research will need to address both theoretical issues (foundations of statistical reasoning) and applied ones (implementation and testing of new models). Possible backgrounds are computer science, physics, mathematics, engineering, etc. The position is funded by the Swiss National Science Foundation. The initial appointment will be for 2 years; an extension will normally be granted. The new Ph.D. student will interact with Marco Zaffalon and other people at IDSIA, and will be involved in international research collaborations. See http://www.idsia.ch/~zaffalon/positions/credal2002.htm for more information. Applicants should submit: 1. Detailed curriculum vitae, 2. List of three references (and their email addresses), 3. Transcripts of undergraduate and graduate (if applicable) studies, 4. Concise statement of their research interests (two pages max). Please address all correspondence to: Marco Zaffalon, IDSIA, Galleria 2, CH-6928 Manno (Lugano), Switzerland. Applications can also be submitted by e-mail to zaffalon at idsia.ch (2MB max). WWW pointers to ps/pdf/doc/html files are welcome. Use Firstname.Lastname.DocDescription.DocType as the filename convention. Thanks for your interest.
Marco Zaffalon, Senior Researcher, IDSIA tel +41 91 610 8665 fax +41 91 610 8661 e-mail mailto:zaffalon at idsia.ch web http://www.idsia.ch/~zaffalon ABOUT IDSIA ----------- IDSIA (http://www.idsia.ch) is a joint research institute of the University of Lugano (http://www.unisi.ch) and the Swiss Italian University for Applied Science (http://www.supsi.ch). Our research focuses on uncertain reasoning, imprecise probabilities, graphical models, data mining, artificial neural nets, reinforcement learning, complexity and generalization issues, unsupervised learning and information theory, forecasting, artificial ants, combinatorial optimization, evolutionary computation. IDSIA is small but visible, competitive, and influential. The "X-Lab Survey" by Business Week Magazine ranked IDSIA among the world's top ten labs in Artificial Intelligence. IDSIA's algorithms hold the world records for several important operations research benchmarks (see Nature 406(6791):39-42 for an overview of artificial ant algorithms developed at IDSIA). IDSIA is located near the Swiss supercomputing center. IDSIA is close to the beautiful city of Lugano in Ticino, the scenic southernmost province of Switzerland. Zurich, Milan and Venice are only a few hours away by train. From stefan.wermter at sunderland.ac.uk Wed Oct 9 13:02:35 2002 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Wed, 09 Oct 2002 18:02:35 +0100 Subject: New MSc Intelligent Systems Message-ID: <3DA4612B.7776324@sunderland.ac.uk> New MSc Intelligent Systems ------------------- The School of Computing and Technology, University of Sunderland is delighted to announce the launch of its new MSc Intelligent Systems programme, starting 24th February. Building on the School's leading-edge research in intelligent systems, this master's programme will be funded via the ESF scheme (see below).
Intelligent Systems is an exciting field of study for science and industry, since existing computing systems have not yet matched many aspects of human performance. "Intelligent Systems" is a term describing software systems and methods that simulate aspects of intelligent behaviour. The intention is to learn from nature and human performance, drawing on cognitive science, neuroscience, biology, engineering, and linguistics, in order to build more powerful computing systems and computational system architectures. In this programme a wide variety of novel and exciting techniques will be taught, including neural networks, intelligent robotics, machine learning, natural language processing, vision, evolutionary genetic computing, data mining, information retrieval, Bayesian computing, knowledge-based systems, fuzzy methods, and hybrid intelligent architectures. Programme Structure -------------- The following lectures/modules are available: Neural Networks Intelligent Systems Architectures Learning Agents Evolutionary Computation Cognitive Neural Science Knowledge Based Systems and Data Mining Bayesian Computation Vision and Intelligent Robots Natural Language Processing Dynamics of Adaptive Systems Intelligent Systems Programming Funding up to 6000 pounds (9500 Euro) for eligible students ------------------------------ The Bursary Scheme applies to this Masters programme commencing February 2003, and we have obtained funding through the European Social Fund (ESF). ESF support enables the University to waive the normal tuition fee and provide a bursary of 75 pounds per week for 45 weeks for eligible EU students, together worth up to 6000 pounds.
For further information in the first instance please see: http://osiris.sund.ac.uk/webedit/allweb/courses/progmode.php?prog=G550A&mode=FT&mode2=&dmode=C For information on applications and start dates contact: gillian.potts at sunderland.ac.uk Tel: 0191 515 2758 For academic information about the programme contact: alfredo.moscardini at sunderland.ac.uk *************************************** Professor Stefan Wermter Chair for Intelligent Systems Informatics Centre School of Computing and Technology University of Sunderland St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From mcrae at uwo.ca Wed Oct 9 00:32:57 2002 From: mcrae at uwo.ca (Ken McRae) Date: Wed, 09 Oct 2002 00:32:57 -0400 Subject: Cognitive Position at Western Message-ID: FACULTY POSITION IN COGNITIVE PSYCHOLOGY. The Psychology Department at the University of Western Ontario invites applications for a tenure-track position at the Assistant Professor level to enhance the Department's strengths in Cognitive Psychology, Cognitive Science, and Cognitive Neuroscience. The successful applicant will be expected to maintain an active research program in his or her research area, teach undergraduate and graduate courses, and provide graduate student supervision. The primary selection criteria will be research excellence and productivity; researchers from any domain of study in Cognitive Psychology, Cognitive Science, and Cognitive Neuroscience will be considered. Applicants should submit by November 15, 2002, a curriculum vitae, statement of research and teaching experience and interests, copies of representative publications, and arrange to have 3 letters of recommendation sent to: Dr. Jim Olson, Chair, Department of Psychology, The University of Western Ontario, London, Ontario, Canada N6A 5C2. 
This position is subject to budgetary approval. The scheduled starting date is July 1, 2003. Please see http://www.ssc.uwo.ca/psychology for the relevant information. The University of Western Ontario is committed to employment equity and welcomes applications from all qualified women and men, including visible minorities, aboriginal people, and persons with disabilities. The Department of Psychology at UWO is strong in these areas. Faculty in the Cognition Area include Marc Joanisse (cognitive neuroscience of language processing in normal and impaired adults and children, neural network modeling), Albert Katz (autobiographical memory, figurative language), Stephen Lupker (word recognition, semantic memory), and Ken McRae (cognitive and neural bases of word meaning, sentence processing, neural network modeling). Faculty in other areas of the Department include Stephan Köhler (cognitive and neural bases of implicit and explicit memory), Debra Jared (word recognition, bilingualism), Mel Goodale (perception and action, object recognition), Keith Humphrey (perception and action, object recognition), Jody Culham (perception and action), David Sherry, William Roberts, and Scott MacDougall-Shackleton (animal learning and cognition). Research facilities at UWO include fMRI, ERP, eyetracking and TMS laboratories, high performance computing facilities for computational modeling, a large database for developmental studies, and a large participant pool of normal undergraduate adults.
From cns at cns.bu.edu Thu Oct 10 10:58:26 2002 From: cns at cns.bu.edu (Boston University CNS Department) Date: Thu, 10 Oct 2002 10:58:26 -0400 Subject: Graduate Program in the Department of Cognitive and Neural Systems (CNS) at Boston University Message-ID: <3DA59592.7010504@cns.bu.edu> PLEASE POST ******************************************************************* GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY ******************************************************************* The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The brochure may also be viewed on line at: http://www.cns.bu.edu/brochure/ and application forms at: http://www.bu.edu/cas/graduate/application.html Applications for Fall 2003 admission and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write, telephone, or fax: DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS Boston University 677 Beacon Street Boston, MA 02215 617/353-9481 (phone) 617/353-7755 (fax) or send via email your full name and mailing address to the attention of Mr. Robin Amos at: amos at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. 
GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. ******************************************************************* Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students and qualified undergraduates interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The department's training and research focus on two broad questions. The first question is: How does the brain control behavior? This is a modern form of the Mind/Body Problem. The second question is: How can technology emulate biological intelligence? This question needs to be answered to develop intelligent technologies that are well suited to human societies. These goals are symbiotic because brains are unparalleled in their ability to intelligently adapt on their own to complex and novel environments. Models of how the brain accomplishes this are developed through systematic empirical, mathematical, and computational analysis in the department. Autonomous adaptation to a changing world is also needed to solve many of the outstanding problems in technology, and the biological models have inspired qualitatively new designs for applications. During the past decade, CNS has led the way in developing biological models that can quantitatively simulate the dynamics of identified brain cells in identified neural circuits, and the behaviors that they control. This new level of understanding is leading to comparable advances in intelligent technology. 
CNS is a graduate department that is devoted to the interdisciplinary training of graduate students. The department awards MA, PhD, and BA/MA degrees. Its students are trained in a broad range of areas concerning computational neuroscience, cognitive science, and neuromorphic systems. The biological training includes study of the brain mechanisms of vision and visual object recognition; audition, speech, and language understanding; recognition learning, categorization, and long-term memory; cognitive information processing; self-organization and development, navigation, planning, and spatial orientation; cooperative and competitive network dynamics and short-term memory; reinforcement and motivation; attention; adaptive sensory-motor planning, control, and robotics; biological rhythms; consciousness; mental disorders; and the mathematical and computational methods needed to support advanced modeling research and applications. Technological training includes methods and applications in image processing, multiple types of signal processing, adaptive pattern recognition and prediction, information fusion, and intelligent control and robotics. The foundation of this broad training is the unique interdisciplinary curriculum of seventeen graduate courses that have been developed at CNS. Each of these courses integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of artificial neural networks and hybrid systems to technology. A student's curriculum is tailored to his or her career goals with an academic advisor and a research advisor. In addition to taking interdisciplinary courses within CNS, students develop important disciplinary expertise by also taking courses in departments such as biology, computer science, engineering, mathematics, and psychology.
In addition to these formal courses, students work individually with one or more research advisors to learn how to do advanced interdisciplinary research in their chosen research areas. As a result of this breadth and depth of training, CNS students have succeeded in finding excellent jobs in both academic and technological areas after graduation. The CNS Department interacts with colleagues in several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The units most closely linked to the department are the Center for Adaptive Systems and the CNS Technology Laboratory. Students interested in neural network hardware can work with researchers in CNS and at the College of Engineering. Other research resources include the campus-wide Program in Neuroscience, which includes distinguished research groups in cognitive neuroscience, neurophysiology, neuroanatomy, neuropharmacology, and neural modeling across the Charles River Campus and the Medical School; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. Key colleagues in these units hold joint appointments in CNS in order to expedite training and research interactions with CNS core faculty and students. In addition to its basic research and training program, the department organizes an active colloquium series, various research and seminar series, and international conferences and symposia, to bring distinguished scientists from experimental, theoretical, and technological disciplines to the department.
The department is housed in its own four-story building, which includes ample space for faculty and student offices and laboratories (computational neuroscience, visual psychophysics, psychoacoustics, speech and language, sensory-motor control, neurobotics, computer vision), as well as an auditorium, classroom, seminar rooms, a library, and a faculty-student lounge. The department has a powerful computer network for carrying out large-scale simulations of behavioral and brain models and applications. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior Helen Barbas Professor, Department of Health Sciences, Sargent College PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex Jacob Beck Research Professor of Cognitive and Neural Systems PhD, Psychology, Cornell University Visual perception, psychophysics, computational models of vision Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development Gail A. Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems Director, CNS Technology Laboratory PhD, Mathematics, University of Wisconsin, Madison Learning and memory, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations Michael A. 
Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems, cardiovascular oscillations physiology and time series H. Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, auditory virtual environments, signal processing models of hearing Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology John C. Fiala Research Assistant Professor of Biology PhD, Cognitive and Neural Systems, Boston University Synaptic plasticity, dendrite anatomy and pathology, motor learning, robotics, neuroinformatics Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Vision, audition, language, learning and memory, reward and motivation, cognition, development, sensory-motor control, mental disorders, applications Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, biological sensory-motor control and functional brain imaging Catherine L.
Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition Michael E. Hasselmo Associate Professor of Psychology Director of Graduate Studies, Psychology Department PhD, Experimental Psychology, Oxford University Computational modeling and experimental testing of neuromodulatory mechanisms involved in encoding, retrieval and consolidation Allyn Hubbard Associate Professor of Electrical and Computer Engineering PhD, Electrical Engineering, University of Wisconsin Peripheral auditory system (experimental and modeling), chip design spanning the range from straightforward digital applications to exotic sub-threshold analog circuits that emulate the functionality of the visual and auditory periphery, BCS/FCS, the mammalian cochlea in silicon and MEMS, and drug discovery on silicon Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Neural network theory, complexity theory, wavelet theory, mathematical physics Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamics of networks of neurons Jacqueline A. 
Liederman Associate Professor of Psychology PhD, Psychology, University of Rochester Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders Ennio Mingolla Professor of Cognitive and Neural Systems and Psychology Acting Chairman 2002-2003, Department of Cognitive and Neural Systems PhD, Psychology, University of Connecticut Visual perception, mathematical modeling of visual processes Joseph Perkell Adjunct Professor of Cognitive and Neural Systems Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology PhD, Massachusetts Institute of Technology Motor control of speech production Adam Reeves Adjunct Professor of Cognitive and Neural Systems Professor of Psychology, Northeastern University PhD, Psychology, City University of New York Psychophysics, cognitive psychology, vision Bradley Rhodes Research Associate, Technology Lab, Department of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Motor control, learning, and adaptation, serial order behavior (timing in particular), attention and memory Michele Rucci Assistant Professor of Cognitive and Neural Systems PhD, Scuola Superiore S.-Anna, Pisa, Italy Vision, sensory-motor control and learning, and computational neuroscience Elliot Saltzman Associate Professor of Physical Therapy, Sargent College Research Scientist, Haskins Laboratories, New Haven, CT Assistant Professor in Residence, Department of Psychology and Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, CT PhD, Developmental Psychology, University of Minnesota Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities Robert Savoy Adjunct Associate Professor of Cognitive and Neural Systems Experimental Psychologist, Massachusetts General Hospital PhD, 
Experimental Psychology, Harvard University Computational neuroscience; visual psychophysics of color, form, and motion perception Teaching about functional MRI and other brain mapping methods Eric Schwartz Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology PhD, High Energy Physics, Columbia University Computational neuroscience, machine vision, neuroanatomy, neural modeling Robert Sekuler Adjunct Professor of Cognitive and Neural Systems Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center Frances and Louis H. Salvage Professor of Psychology, Brandeis University Consultant in neurosurgery, Boston Children's Hospital PhD, Psychology, Brown University Visual motion, brain imaging, relation of visual perception, memory, and movement Barbara Shinn-Cunningham Assistant Professor of Cognitive and Neural Systems and Biomedical Engineering PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance David Somers Assistant Professor of Psychology PhD, Cognitive and Neural Systems, Boston University Functional MRI, psychophysical, and computational investigations of visual perception and attention Chantal E. Stern Assistant Professor of Psychology and Program in Neuroscience, Boston University Assistant in Neuroscience, MGH-NMR Center and Harvard Medical School PhD, Experimental Psychology, Oxford University Functional neuroimaging studies (fMRI and MEG) of learning and memory Malvin C. 
Teich Professor of Electrical and Computer Engineering, Biomedical Engineering, and Physics PhD, Cornell University Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission Lucia Vaina Professor of Biomedical Engineering Research Professor of Neurology, School of Medicine PhD, Sorbonne (France); Dres Science, National Politechnique Institute, Toulouse (France) Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging Takeo Watanabe Associate Professor of Psychology PhD, Behavioral Sciences, University of Tokyo Perception of objects and motion and effects of attention on perception using psychophysics and brain imaging (f-MRI) Jeremy Wolfe Adjunct Associate Professor of Cognitive and Neural Systems Associate Professor of Ophthalmology, Harvard Medical School Psychophysicist, Brigham & Women's Hospital, Surgery Department Director of Psychophysical Studies, Center for Clinical Cataract Research PhD, Massachusetts Institute of Technology Visual attention, pre-attentive and attentive object representation Curtis Woodcock Professor of Geography Chairman, Department of Geography Director, Geographic Applications, Center for Remote Sensing PhD, University of California, Santa Barbara Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing CNS DEPARTMENT COURSE OFFERINGS CAS CN500 Computational Methods in Cognitive and Neural Systems CAS CN510 Principles and Methods of Cognitive and Neural Modeling I CAS CN520 Principles and Methods of Cognitive and Neural Modeling II CAS CN530 Neural and Computational Models of Vision CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control CAS CN550 Neural and Computational Models of 
Recognition, Memory and Attention CAS CN560 Neural and Computational Models of Speech Perception and Production CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm CAS CN580 Introduction to Computational Neuroscience GRS CN700 Computational and Mathematical Methods in Neural Modeling GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior GRS CN730 Models of Visual Perception GRS CN740 Topics in Sensory-Motor Control GRS CN760 Topics in Speech Perception and Recognition GRS CN780 Topics in Computational Neuroscience GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition GRS CN915,916 Research in Neural Networks for Vision and Image Processing GRS CN921,922 Research in Neural Networks for Speech and Language Processing GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning GRS CN935,936 Research in Neural Networks for Cognitive Information Processing GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks GRS CN945,946 Research in Technological Applications of Neural Networks GRS CN951,952 Research in Hardware Implementations of Neural Networks CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and student-run special interest groups, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups. LABORATORY AND COMPUTER FACILITIES The department is funded by fellowships, grants, and contracts from federal agencies and private foundations that support research in life sciences, mathematics, artificial intelligence, and engineering. 
Facilities include laboratories for experimental research and computational modeling in visual perception; audition, speech and language processing; and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network comprising Sun workstations, Silicon Graphics workstations, Macintoshes, and PCs. A PC farm running the Linux operating system is available as a distributed computational environment. All students have access to X-terminals or UNIX workstation consoles, a selection of color systems and PCs, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software packages are available for use. These include:

Active Perception Laboratory

The Active Perception Laboratory is dedicated to investigating the interactions between perception and behavior. Research focuses on theoretical and computational analyses of the effects of motor behavior on sensory perception, and on the design of psychophysical experiments with human subjects. The laboratory includes extensive computational facilities that allow the execution of large-scale simulations of neural systems. Additional facilities will soon include instruments for the psychophysical investigation of eye movements during visual analysis, including an accurate and non-invasive eye tracker, and robotic systems for the simulation of different types of behavior.
Computer Vision/Computational Neuroscience Laboratory

The Computer Vision/Computational Neuroscience Laboratory comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; a light machine shop; an active vision laboratory including actuators and video hardware; and systems for computer-aided neuroanatomy and the application of computer graphics and image processing to brain sections and MRI images. The laboratory supports research in neural modeling, computational neuroscience, computer vision, and robotics. The major question being addressed is the nature of the representation of the visual world in the brain, in terms of observable neural architectures such as topographic mapping and columnar architecture. The application of novel architectures for image processing in computer vision and robotics is also a major topic of interest. Recent work in this area has included the design and patenting of novel actuators for robotic active vision systems, the design of real-time algorithms for mobile robotic applications, and the design and construction of miniature autonomous vehicles using space-variant active vision design principles. Recently one such vehicle successfully drove itself on the streets of Boston.

Neurobotics Laboratory

The Neurobotics Laboratory uses wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The laboratory currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a movable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Other platforms may be investigated in the future.
Psychoacoustics Laboratory

The Psychoacoustics Laboratory in the Department of Cognitive and Neural Systems (CNS) is equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli. The laboratory contains approximately eight PCs (running Windows 98 and/or Linux), used both as workstations for students and to control laboratory equipment and run experiments. Other major equipment in the laboratory includes special-purpose signal-processing and sound-generating equipment from Tucker-Davis Technologies, electromagnetic head-tracking systems, a two-channel spectrum analyzer, and other miscellaneous equipment for producing, measuring, analyzing, and monitoring auditory stimuli. The Psychoacoustics Laboratory consists of three adjacent rooms in the basement of 677 Beacon St. (the home of the CNS Department). One room houses an 8 ft. x 8 ft. single-walled sound-treated booth as well as space for students. The second room is primarily used as student workspace for developing and debugging experiments. The third houses a robotic arm capable of automatically positioning a small acoustic speaker anywhere on the surface of a sphere of adjustable radius, allowing automatic measurement of the signals reaching the ears of a listener from sound sources at different positions in space, including the effects of room reverberation.

Sensory-Motor Control Laboratory

The Sensory-Motor Control Laboratory supports experimental and computational studies of sensory-motor control. A computer-controlled infrared WatSmart system allows measurement of large-scale (e.g., reaching) movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. A second major component is a helmet-mounted, video-based eye-head tracking system (ISCAN Corp., 1997).
The latter's camera samples eye position at 240 Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. The system thus affords a wide range of visuo-motor studies. The laboratory is connected to the department's extensive network of Linux and Windows workstations and Linux computational servers.

Speech and Language Laboratory

The Speech Laboratory includes software facilities for analog-to-digital and digital-to-analog conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal-processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing speech recognition algorithms. The laboratory also contains a network of Windows-based PCs equipped with software for the analysis of functional magnetic resonance imaging (fMRI) data, including region-of-interest (ROI) analyses involving software for the parcellation of cortical and subcortical brain regions in structural MRI images.

Technology Laboratory

The Technology Laboratory fosters the development of neural network models derived from basic scientific research and facilitates the transition of the resulting technologies to software and applications. The Lab was established in July 2001 with a grant from the Air Force Office of Scientific Research, "Information Fusion for Image Analysis: Neural Models and Technology Development." Initial projects have focused on multi-level fusion and data mining in a geospatial context, in collaboration with the Boston University Center for Remote Sensing.
This research and development has built on models of opponent-color visual processing, boundary contour system (BCS) and texture processing, and Adaptive Resonance Theory (ART) pattern learning and recognition, as well as other models of associative learning and prediction. Other projects include collaborations with the New England Medical Center and Boston Medical Center to develop methods for the analysis of large-scale medical databases, currently to predict HIV resistance to antiretroviral therapy. Associated basic research projects are conducted within the joint context of scientific data and technological constraints.

Visual Psychophysics Laboratory

The Visual Psychophysics Laboratory occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer-controlled display platforms, including Macintosh, Windows, and Linux workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing devices, a photometer, and a variety of display-generation, data-collection, and data-analysis software.

Affiliated Laboratories

Affiliated CAS/CNS faculty members have additional laboratories ranging from visual and auditory psychophysics, neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations.

*******************************************************************
DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT
Boston University
677 Beacon Street
Boston, MA 02215
Phone: 617/353-9481
Fax: 617/353-7755
Email: inquiries at cns.bu.edu
Web: http://www.cns.bu.edu/
*******************************************************************

From bengioy at IRO.UMontreal.CA Thu Oct 10 09:55:02 2002 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Thu, 10 Oct 2002 09:55:02 -0400 Subject: tenure-track position at U of Montreal, stat. mach.
learning Message-ID: <20021010095502.A15118@vor.iro.umontreal.ca> Hello, My department is advertising a tenure-track faculty position whose areas of interest include Statistical Machine Learning. The department has a statistical machine learning lab (currently comprising Balazs Kegl and me) with about 15 graduate students and post-docs, and access to several Linux clusters for high-performance computing. The official advert is below. Please feel free to contact me for more information. Teaching is in French, but non-francophones are usually given a break the first year to learn enough French to teach. ---------------------------------------------------------------------- Université de Montréal Faculté des arts et des sciences Department of Computer Science and Operations Research The DIRO (Département d'informatique et de recherche opérationnelle - Department of Computer Science and Operations Research) invites applications for several tenure-track positions, starting June 1st, 2003. Applicants seeking positions at the Assistant Professor level will have priority. The Department is seeking qualified candidates in Computer Science. Preference will be given to applicants with a strong research program in one of the areas of Software Engineering, Systems (design and implementation of programming languages, compilation, parallel processing), or Computer Networking and Distributed Systems (including electronic commerce). Other areas of interest are Machine Learning (statistical learning, data mining) and Operations Research (stochastic modelling and optimization in particular). A background combining more than one of the above areas, or combining Computer Science and Operations Research, is an asset. An excellent candidate working in a field different from those listed above would also receive consideration. Beyond demonstrating a clear potential for outstanding research, the successful candidate must be committed to excellence in teaching.
The candidate is expected to have a working knowledge of French, and to be prepared to teach and supervise students in French within one year. The Université de Montréal is the leading French-language university in North America. The DIRO offers B.Sc., M.Sc., and Ph.D. degrees in Computer Science and Operations Research, a B.Sc. in Bioinformatics, several bidisciplinary B.Sc. degrees, as well as an M.Sc. in electronic commerce. With 41 faculty members, 600 undergraduates and close to 200 graduate students, the DIRO is one of the largest Computer Science departments in Canada as well as one of the most active in research. Research interests of current faculty include bioinformatics, computer networking, intelligent tutoring systems, computer architecture, software engineering, artificial intelligence, computational linguistics, computer graphics, vision and solid modelling, machine learning, theoretical and quantum computing, parallelism, optimization, and simulation. See http://www.iro.umontreal.ca. Task: Undergraduate and graduate teaching, research and supervision of graduate students. Requirements: Ph.D. in Computer Science or a related area. Salary: Starting salary is competitive and fringe benefits are excellent. Hardcopy applications including a curriculum vitae, a description of the candidate's current research program, at least three letters of reference, and up to three selected preprints/reprints should be sent to: Pierre McKenzie, professeur et directeur, Département d'informatique et de recherche opérationnelle, FAS, Université de Montréal, C.P. 6128, Succ. Centre-Ville, Montréal (Québec) Canada H3C 3J7, by February 1st, 2003. Applications received after that date may be considered until the positions are filled. In accordance with Canadian Immigration requirements, priority will be given to Canadian citizens and permanent residents. The Université de Montréal is committed to equity in employment and encourages applications from qualified women.
----- End forwarded message ----- -- Yoshua Bengio Full Professor / Professeur titulaire Canada Research Chair in Statistical Learning Algorithms / titulaire de la chaire de recherche du Canada en algorithmes d'apprentissage statistique Département d'Informatique et Recherche Opérationnelle Université de Montréal, adresse postale: C.P. 6128 Succ. Centre-Ville, Montréal, Québec, Canada H3C 3J7 adresse civique: 2920 Chemin de la Tour, Montréal, Québec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From robbie at bcs.rochester.edu Thu Oct 10 11:20:09 2002 From: robbie at bcs.rochester.edu (Robert Jacobs) Date: Thu, 10 Oct 2002 11:20:09 -0400 Subject: articles available Message-ID: <5.1.1.6.0.20021010111815.00ac6fd0@bcs.rochester.edu> The following papers may be of interest to readers of this mailing list: (1) Jacobs, R.A., Jiang, W., and Tanner, M.A. (2002) Factorial hidden Markov models and the generalized backfitting algorithm. Neural Computation, 14, 2415-2437. (2) Jacobs, R.A. (2002) What determines visual cue reliability? Trends in Cognitive Sciences, 6, 345-350. Robbie Jacobs =================================== (1) Jacobs, R.A., Jiang, W., and Tanner, M.A. (2002) Factorial hidden Markov models and the generalized backfitting algorithm. Neural Computation, 14, 2415-2437. Previous researchers developed new learning architectures for sequential data by extending conventional hidden Markov models through the use of distributed state representations. Although exact inference and parameter estimation in these architectures is computationally intractable, Ghahramani and Jordan (1997) showed that approximate inference and parameter estimation in one such architecture, factorial hidden Markov models (FHMMs), is feasible in certain circumstances.
However, the learning algorithm proposed by these investigators, based on variational techniques, is difficult to understand and implement, and is limited to the study of real-valued datasets. This paper proposes an alternative method for approximate inference and parameter estimation in FHMMs based on the perspective that FHMMs are a generalization of a well-known class of statistical models known as Generalized Additive Models (GAMs; Hastie and Tibshirani, 1990). Using existing statistical techniques for GAMs as a guide, we have developed the generalized backfitting algorithm. This algorithm computes customized error signals for each hidden Markov chain of an FHMM, and then trains each chain one at a time using conventional techniques from the hidden Markov models literature. Relative to previous perspectives on FHMMs, we believe that the viewpoint taken here has a number of advantages. First, it places FHMMs on firm statistical foundations by relating FHMMs to a class of models that are well-studied in the statistics community, yet it generalizes this class of models in an interesting way. Second, it leads to an understanding of how FHMMs can be applied to many different types of time series data, including Bernoulli and multinomial data, not just data which are real-valued. Lastly, it leads to an effective learning procedure for FHMMs which is easier to understand and easier to implement than existing learning procedures. Simulation results suggest that FHMMs trained with the generalized backfitting algorithm are a practical and powerful tool for analyzing sequential data. http://www.bcs.rochester.edu/people/robbie/jacobs.j.t.nc02.pdf =================================== (2) Jacobs, R.A. (2002) What determines visual cue reliability? Trends in Cognitive Sciences, 6, 345-350. Visual environments often contain many cues to properties of an observed scene. 
In order to integrate information provided by multiple cues in an efficient manner, observers must assess the degree to which each cue provides reliable versus unreliable information. Two hypotheses are reviewed regarding how observers estimate cue reliabilities, namely that the estimated reliability of a cue is related to the ambiguity of the cue, and that people use correlations among cues in order to estimate cue reliabilities. It is shown that cue reliabilities are important both for cue combination and for aspects of visual learning. http://www.bcs.rochester.edu/people/robbie/jacobs.tics02.pdf ---------------------------------------------------------------------------------------- Robert Jacobs Department of Brain and Cognitive Sciences University of Rochester Rochester, NY 14627-0268 phone: 585-275-0753 fax: 585-442-9216 email: robbie at bcs.rochester.edu web: http://www.bcs.rochester.edu/people/robbie/robbie.html From E.Koning at elsevier.nl Fri Oct 11 08:18:40 2002 From: E.Koning at elsevier.nl (Koning, Esther (ELS)) Date: Fri, 11 Oct 2002 13:18:40 +0100 Subject: Introducing www.ComputerScienceWeb.com Message-ID: <4D56BD81F62EFD49A74B1057ECD75C0603A9A34C@elsamss02571> Researchers can now benefit from the newly launched on-line platform "www.ComputerScienceWeb.com". 
Tailored to your specific needs, ComputerScienceWeb offers comprehensive search facilities and customized services in 12 categories of computer science, and provides:
* access to over 50,000 articles and more than 70 journals in computer science
* free access to abstracts and tables of contents
* integrated search and browse facilities across all journals and preprints
* free services such as "The Computer Science Preprint Server" and "Who Cites Who"
From tkelley at arl.army.mil Fri Oct 11 16:43:54 2002 From: tkelley at arl.army.mil (Troy Kelley) Date: Fri, 11 Oct 2002 16:43:54 -0400 Subject: Free SuperComputer Use Available for Connectionist Researchers Message-ID: Hello, The Human Research and Engineering Directorate (HRED) of the Army Research Laboratory (ARL) is seeking proposals to develop models of the interactions and dynamics of human cognition using ARL's High Performance Computing (HPC) supercomputer assets. The project is called Modeling and Interaction of Neurological Dynamics with Symbolic Structures (MINDSS). Cooperative Research and Development Agreements (CRADAs) will be used to leverage the research activities of participating major universities that are interested in developing computational models of human cognition. The objective is to use ARL's state-of-the-art computer facilities to develop connectionist, symbolic, and neurological models that are relevant to DoD's research interests. Major areas of interest include language processing, visual and auditory processing, complex reasoning, and the brain's reaction to environmental trauma and stress. If you are interested in joining the MINDSS program, having access to our state-of-the-art computer systems, and submitting a proposal, please send an e-mail to the address listed below. You must be a U.S. citizen and you must be affiliated with a major university. Please feel free to forward this message to anyone who might be interested.
Thanks, Troy Kelley Army Research Laboratory tkelley at arl.army.mil From ingber at ingber.com Sun Oct 13 09:52:51 2002 From: ingber at ingber.com (Lester Ingber) Date: Sun, 13 Oct 2002 09:52:51 -0400 Subject: open position: Financial Engineer Message-ID: <20021013135251.GA3356@ingber.com> If you have very strong credentials for the position described below, please email your resume to: Lester Ingber, Director R&D, DUNN Capital Management, Stuart FL. Some recent press on DUNN can be seen at http://www.businessweek.com/magazine/content/02_39/b3801113.htm http://www.businessweek.com/magazine/content/02_39/b3801114.htm Financial Engineer A disciplined, quantitative, analytic individual proficient in prototyping and coding (such as C/C++, Maple/Mathematica, or Visual Basic, etc.) is sought for a financial engineering/risk:reward optimization research position with an established Florida hedge fund (over two decades in the business and $1 billion in assets under management). A PhD in a mathematical science, such as physics, statistics, math, or computer science, is preferred. Hands-on experience in the financial industry is required. Emphasis is on applying state-of-the-art methods to financial time series of various frequencies. The ability to work with a team to transform ideas/models into robust, intelligible code is key. Salary: commensurate with experience, with bonuses tied to the individual's and the firm's performance. Status of Selection Process: All applicants will be reviewed, and a long list will be generated for phone interviews. Other applicants will not be contacted further. Information on the status of this process will be available at http://www.ingber.com/open_positions.html From these phone interviews, a short list will be generated for face-to-face interviews. A small coding exam will be given during the on-site interview.
Start date for this position may range anywhere from immediately to six months later, depending on both the candidate's and the firm's needs. -- Prof. Lester Ingber ingber at ingber.com ingber at alumni.caltech.edu www.ingber.com www.alumni.caltech.edu/~ingber From Bob.Williamson at anu.edu.au Mon Oct 14 02:54:48 2002 From: Bob.Williamson at anu.edu.au (Bob.Williamson@anu.edu.au) Date: Mon, 14 Oct 2002 16:54:48 +1000 (AUS Eastern Standard Time) Subject: Postdoctoral Fellowships Available Message-ID: Postdoctoral Fellowships (approximately 30 positions) ---------------------------------------------------- National ICT Australia is a newly formed research institute based in Canberra and Sydney. Details of the centre can be found on its website, http://nicta.com.au. We are now hiring postdoctoral fellows in a range of research areas, including machine learning. These are 3-5 year positions. There is a program on Statistical Machine Learning based in Canberra; see http://nicta.com.au/stat-ml.html for a brief overview of the current group. There are around 3 postdoc vacancies in this program. The formal postdoc job ad, which contains details on how to apply, can be found at http://nicta.com.au/jobs/postdoc.pdf. There is no closing date; assessment of applications will commence on 18 November 2002.

Professor Robert (Bob) Williamson
Designate Canberra Node Director and Vice-President, National ICT Australia (NICTA)
Research School of Information Sciences and Engineering, Australian National University, Canberra 0200 AUSTRALIA
Phone: +61 2 6125 0079 | Office: +61 2 6125 8801 | Fax: +61 2 6125 8623 | Mobile: +61 4 0405 3877
Bob.Williamson at nicta.edu.au | http://www.nicta.com | http://axiom.anu.edu.au/~williams

From P.J.Lisboa at livjm.ac.uk Sun Oct 13 12:12:10 2002 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Sun, 13 Oct 2002 17:12:10 +0100 Subject: NNESMED/CIMED International Conference Sheffield July 21 - 23, 2003 Message-ID: After the success of the NNESMED conference held last year on the island of Milos, the conference returns to the UK. It is expected to retain the single-track format, in order to foster the multidisciplinary interaction that is a regular feature of this event. The deadline for submission of extended abstracts is 31 January 2003. Further details can be found at http://www.shu.ac.uk/conference/nnesmed and a text version of the call for papers follows: FIFTH INTERNATIONAL CONFERENCE ON NEURAL NETWORKS AND EXPERT SYSTEMS IN MEDICINE AND HEALTHCARE, NNESMED 2003 and First International Conference on Computational Intelligence in Medicine and Healthcare, CIMED 2003 July 21 - 23, 2003, School of Engineering, Sheffield Hallam University, Sheffield, England This is the fifth of a successful series of international conferences focused on the application of intelligent computational methods and systems to support all areas of biomedical, clinical, and healthcare practice. The conference brings together healthcare specialists, clinicians, biomedical engineers, computer scientists, communication and computer network engineers, and applied mathematicians. The language of the conference is English.
TOPICS OF INTEREST
Artificial neural networks
Intelligent signal processing
Intelligent Image Processing
Fuzzy and neuro-fuzzy systems
Rough sets
Evolutionary computing
Probabilistic reasoning
Non-linear dynamical analysis methods
Independent Components Analysis
Belief networks
Machine learning
Artificial life
Intelligent agents
Data mining
Expert systems
Intelligent telemedicine and telecare
MEDICAL APPLICATION AREAS INCLUDE:
Signal processing
Diagnosis and therapy
Bioinformatics
Monitoring and control
Image processing and interpretation
Rehabilitation
Process modelling and simulation
Education
INTERNATIONAL STEERING COMMITTEE
Professor Dr Barrie Jervis (Conference Chair, UK), Professor Emmanuel Ifeachor (UK), Professor Periklis Ktonas (USA), Professor Paulo Lisboa (UK), Professor Antonina Starita (Italy), Professor George Papadourakis (GR)
INTERNATIONAL PROGRAMME COMMITTEE
The International Programme Committee will referee the papers submitted. Marco Gori (Italy), Ben Jansen (USA), Nicolaos B. Karayiannis (USA), David Lowe (UK), Francesco Masulli (Italy), Sifis Micheloyannis (Greece), Ryszard Tadeusiewicz (Poland), Azzam Taktak (UK), Michael Zervakis (Greece), Farhat Fnaiech (Tunisia), Jiri Jan (Czech Republic).
SUBMISSION OF PAPERS
Authors are requested to submit an extended abstract (two pages in length, single spacing, full details in the Call for Papers, available early October). Extended abstracts should clearly identify the medical or healthcare context of the work, the methodology used, the advances made and the significance of the results. Papers will be accepted as either full session oral papers or as poster papers. Authors whose abstracts are accepted will be asked to develop them into full papers of 4-6 pages in length for inclusion in the conference proceedings.
IMPORTANT DEADLINES
Submission of extended abstracts: 31 January 2003
Notification of provisional acceptance: 28 February 2003
Submission of full papers (camera ready): 18 April 2003
Receipt of Conference Booking Form: 13 June 2003
CONFERENCE WEBSITE
The temporary website address until further notice is: http://www.shu.ac.uk/conference/nnesmed
CONFERENCE FEES
The full non-residential conference fee for bookings made by 30 April 2003 is £300. Later bookings will incur a supplementary charge of £50. A daily rate of £180 is also available, which does not include evening dinner.
ACCOMMODATION AND SOCIAL PROGRAMME
The booking form will include details of residential accommodation in hotels and student residences to suit all budgets. Accommodation is not included in the conference fee. Details of the proposed social programme will appear in the Call for Papers.
CONFERENCE CONTACT
Conference 21, Sheffield Hallam University, City Campus, Sheffield, S1 1WB, England. Tel.: +44-114-225-5338/5336 Fax: +44-114-225-5337 E-mail: conference21 at shu.ac.uk From Ronan.Reilly at may.ie Mon Oct 14 05:11:17 2002 From: Ronan.Reilly at may.ie (Ronan Reilly) Date: Mon, 14 Oct 2002 10:11:17 +0100 Subject: postdoctoral possibilities in Ireland Message-ID: Readers of the list may be interested in the possibility of postdoctoral positions in Ireland funded by the Irish Research Council for Science, Engineering, & Technology (IRCSET). An "advance notice" to the call is appended to this email. Further details will shortly appear at http://www.embark.ie. The deadline for submissions is November 1. Potential applicants should note that there is no nationality restriction on who may apply. The Department of Computer Science at NUI Maynooth is interested in hosting post-doctoral applicants in any of the Department's active research areas (http://www.cs.may.ie/research/index.html). Potential applicants should contact the relevant members of the Department directly. Ronan ______________________ Prof.
Ronan G. Reilly Department of Computer Science National University of Ireland, Maynooth Co. Kildare IRELAND v: +353-1-708 3847 f: +353-1-708 3848 w1: www.cs.may.ie/~rreilly (homepage) w2: cortex.cs.may.ie (research group) e: Ronan.Reilly at may.ie ========= Advance Notice IRCSET/Embark Initiative Post Doctoral Fellowship Scheme The Embark Initiative will shortly launch its Post Doctoral Fellowship Scheme. This scheme is designed to encourage excellence in research careers by funding doctoral students to associate with established research teams who have achieved international recognition for their work. Applicants will normally have submitted their PhD to their supervisor and have certification from their university that their thesis has been submitted for examination. Applicants will be required to submit:
- a research plan with input from the candidate and the host research group
- a summary of Ph.D. work
- a list of publications and conference presentations
- a statement from the Ph.D. supervisor regarding the candidate's achievements and suitability for post-doctoral work
- a statement from the proposed laboratory regarding the suitability of the candidate for the research area and the resources available for the proposed work
The award will consist of a salary contribution of €33,000 per annum and a contribution to laboratory costs and travel of up to €5,000 per annum. Awards will be tenable for 2 years. The closing date for applications will be 1st November 2002. This short call duration is necessary in order to ensure that assessment and completion of contracts are in place for the 2002/2003 academic year. The full application documents will be available in approximately one week.
From nik.kasabov at aut.ac.nz Tue Oct 15 02:49:43 2002 From: nik.kasabov at aut.ac.nz (Nik Kasabov) Date: Tue, 15 Oct 2002 19:49:43 +1300 Subject: Book announcement: Evolving Connectionist Systems Message-ID: The monograph "Evolving Connectionist Systems - Methods and Applications in Bioinformatics, Brain Study and Intelligent Machines" has just been published by Springer (http://www.springer.de), series Perspectives in Neurocomputing, 2002, XII, 308 pp., softcover, ISBN 1-85233-400-2. Some software, colour figures, .ppt presentations, and related papers are available from http://www.kedri.info (ECOS page) ----------------------------------------------------------------
Contents:
Prologue
Part I. Evolving Connectionist Systems: Methods and Techniques
Chapter 1. Evolving processes and evolving connectionist systems
Chapter 2. Evolving connectionist systems for unsupervised learning
Chapter 3. Evolving connectionist systems for supervised learning
Chapter 4. Recurrent evolving systems, reinforcement learning, and evolving automata
Chapter 5. Evolving neuro-fuzzy inference systems
Chapter 6. Evolutionary computation and evolving connectionist systems
Chapter 7. Evolving connectionist machines: a framework, biological motivation, and implementation issues
Part II. Applications in Bioinformatics, Brain Study, and Intelligent Machines
Chapter 8. Data analysis, modelling and knowledge discovery in Bioinformatics
Chapter 9. Analysis and modelling of brain functions and cognitive processes
Chapter 10. Modelling the emergence of acoustic segments (phones) in spoken languages
Chapter 11. On-line adaptive speech recognition
Chapter 12. On-line image and video data processing
Chapter 13. Evolving systems for integrated multi-modal information processing
Epilogue
References
Extended glossary
Subject index
---------------------------------------------------------------- best regards Nik Kasabov Prof.
Nik Kasabov, MSc, PhD Fellow RSNZ, NZCS, Sr Member IEEE Director, Knowledge Engineering and Discovery Research Institute Chair of Knowledge Engineering, School of IT Auckland University of Technology (AUT) phone: +64 9 917 9506 ; fax: +64 9 917 9501 mobile phone: +64 21 488 328 WWW http://www.kedri.info email: nkasabov at aut.ac.nz From juergen at idsia.ch Tue Oct 15 10:04:39 2002 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Tue, 15 Oct 2002 16:04:39 +0200 Subject: 2003: Job Openings (Postdocs, PhDs) at IDSIA, Switzerland Message-ID: <3DAC2077.3080902@idsia.ch> For the next year we are anticipating several job openings for outstanding postdocs and PhD students interested in at least one of the following topics: 1. State-of-the-Art Recurrent Neural Networks: http://www.idsia.ch/~juergen/rnn.html 2. Optimal Incremental Universal Search Algorithms: http://www.idsia.ch/~juergen/oops.html 3. Universal Learning Algorithms: http://www.idsia.ch/~juergen/unilearn.html To apply, please follow the instructions in http://www.idsia.ch/~juergen/jobs2003.html Job interviews are possible at NIPS 2002 in Vancouver and at the NIPS workshop on Universal Learning Algorithms and Optimal Search: http://www.idsia.ch/~marcus/idsia/nipsws.htm Juergen Schmidhuber, IDSIA http://www.idsia.ch/~juergen From mvzaanen at science.uva.nl Tue Oct 15 10:51:03 2002 From: mvzaanen at science.uva.nl (Menno van Zaanen) Date: Tue, 15 Oct 2002 16:51:03 +0200 (CEST) Subject: Call for papers: Special Issue Pattern Recognition Message-ID: Apologies for Multiple Copies. Please distribute... CALL FOR PAPERS Pattern Recognition (The Journal of the Pattern Recognition Society) Special Issue on Grammatical Inference Techniques & Applications This Special Issue will be published in April, 2004 to commemorate and honor the memory of Late Professor K. S. Fu. Grammatical Inference (GI) is a collection of methodologies for learning grammars from training data. 
The most traditional field of application of GI has been syntactic pattern recognition. In the recent past, however, concerted efforts from diverse disciplines to find tractable inference techniques have added new dimensions and opened up uncharted territories. Applications of GI in more nontraditional fields include Gene Analysis, Sequence Prediction, Cryptography and Information Retrieval. Development of algorithms for GI has evolved over the years from dealing with only positive training samples to more fundamental efforts that try to circumvent the lack of negative samples. This idea is pursued in stochastic grammars and languages, which attempt to overcome the absence of negative samples by gathering statistical information from the available positive samples. Also, within the framework of information theory, probability estimation techniques such as the Backward-Forward algorithm for Hidden Markov Models and the Inside-Outside algorithm for context-free languages are focal points of investigation in the stochastic grammar field. Techniques that use intelligent search to infer the rules of grammar are showing considerable promise. Recently, there has been a surge of activity dealing with specialized neural network architectures and dedicated learning algorithms to approach GI problems. On a more customary track, research in learning classes of transducers continues to arouse interest in the GI community. Close interaction and collaboration between different disciplines and the availability of powerful computers are fueling novel research efforts in GI. The objective of the Special Issue is to present the current status of this topic through the works of researchers in different disciplines. Original and tutorial papers are solicited that address theoretical and practical issues on this theme.
Topics of interest include (but are not limited to):
Theory:
- Neural network framework and learning algorithms geared to GI
- GI via heuristic and genetic search
- Inference mechanisms for stochastic grammars/languages
- Algebraic methods for identification of languages
- Transduction learning
Applications:
- Image processing and computer vision
- Biosequence analysis and prediction
- Speech and natural language processing
- Data mining/information retrieval
- Optical character recognition
Submission Procedure: Only electronic (ftp) submission will be accepted. Instructions for submission of papers will be posted on November 10 at the guest editor's web site (http://www-ee.ccny.cuny.edu/basu). All submitted papers will be reviewed according to guidelines and standards of Pattern Recognition.
Deadlines:
Manuscript Submission: December 10, 2002
Notification of Acceptance: April 16, 2003
Final Manuscript Due: June 16, 2003
Publication Date: April 2004
Guest Editor: Mitra Basu, The City College of CUNY, New York, U.S.A. basu at ccny.cuny.edu +-------------------------------------+ | Menno van Zaanen | "Let him not vow to walk in the dark, | mvzaanen at science.uva.nl | who has not seen the nightfall." | http://www.science.uva.nl/~mvzaanen | -Elrond From sml at essex.ac.uk Tue Oct 15 07:50:19 2002 From: sml at essex.ac.uk (Lucas, Simon M) Date: Tue, 15 Oct 2002 12:50:19 +0100 Subject: ICDAR 2003 Competitions and Datasets Message-ID: <7AC902A40BEDD411A3A800D0B7847B66E0FA8A@sernt14.essex.ac.uk> Dear All, For ICDAR 2003 (International Conference on Document Analysis and Recognition) we are running some competitions that may be of interest to readers of this list. The competition areas include cursive script recognition, page and table segmentation, and various text-in-scene (robust reading) problems. Some of these competitions have new datasets associated with them, which you can download.
For more details: http://algoval.essex.ac.uk/icdar/Competitions.html Best regards, Simon Lucas (Competitions chair, ICDAR 2003) -------------------------------------------------- Dr. Simon Lucas Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom Email: sml at essex.ac.uk http://cswww.essex.ac.uk -------------------------------------------------- From markman at psyvax.psy.utexas.edu Wed Oct 16 10:02:51 2002 From: markman at psyvax.psy.utexas.edu (Art Markman) Date: Wed, 16 Oct 2002 09:02:51 -0500 Subject: Cognitive Science Society Virtual Seminar - Dr. James McClelland Message-ID: Virtual Seminar Series October 15, 2002 The Cognitive Science Society will be hosting a virtual colloquium series this year presented live via the Internet, with the first talk given by Jay McClelland in October. Topic: Semantic Cognition: A Parallel Distributed Processing Approach Time: Friday, October 25, 2002, 1:00pm US Eastern Standard Time Presenter: Dr. James McClelland, Carnegie Mellon University There are two technologies available to participate in the seminar - Web Conferencing or Phone in: Web Conferencing Voice over IP - using a PC and Internet Explorer: Address: www.voicecafe.cc/aptima/client.htm Username: seminar Password: 214365 Phone in - using a toll line (i.e., you'll be charged your regular long distance rates): Ahead of time, download the slides from: http://www.cognitivesciencesociety.org/colloquium Call Phone Number: 1-620-584-8200 Enter Code: 37004# Web Conferencing installation and rehearsal Installation There is a one-time automatic download of required software plug-ins. This will take about five minutes to complete. If you have a Windows 2000 operating system, you will be required to log in to the "Administrator" account to download the software.
Using Internet Explorer, Address: www.voicecafe.cc/aptima/client.htm Username: seminar Password: 214365 Rehearsal Session Tuesday, October 22 at 2:00pm US EST Thursday, October 24 at 10:00am US EST We encourage you to attend one of two brief 15-minute sessions on Aptima's Web Conferencing Facility. If you should have any difficulties during the practice session, call the audio line, which will be open: 1-620-584-8200 Code: 37004# If you have any installation issues prior to the seminar, view the FAQs or contact Gilbert Mizrahi. If you have difficulty on the day of the seminar, call for technical support at 781-935-3966 x214 or email cta at aptima.com Please forward this invitation to colleagues who would benefit from this seminar. Sincerely, Art Markman markman at psy.utexas.edu Dr. Arthur B. Markman University of Texas Department of Psychology Austin, TX 78712 512-232-4645 The seminar series is sponsored by the Office of Naval Research From greiner at cs.ualberta.ca Thu Oct 17 00:03:40 2002 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Wed, 16 Oct 2002 22:03:40 -0600 Subject: Alberta Ingenuity Centre for Machine Learning Message-ID: <20021017040350Z80335-23157+1339@sunkay.cs.ualberta.ca> We are pleased to announce the creation of the new Alberta Ingenuity Centre for Machine Learning. This multi-year, multi-million dollar centre, located at the University of Alberta (Edmonton), will conduct the highest quality research in both fundamental and applied machine learning. While we will initially focus on * bioinformatics * interactive entertainment (including computer games) we are eager to extend to any other area related to Machine Learning and Data Mining. We are currently recruiting at essentially EVERY level: faculty members (junior or senior; even endowed chairs!)
post-doctoral fellows / research associates graduate students -- both MSc and PhD We also have a substantial budget to support visitors, both short and long term. For more information, see http://www.aicml.ca or contact us at recruit at aicml.ca | R Greiner Phone: (780) 492-5461 | | Director, Alberta Ingenuity Centre for Machine Learning | | Dep't of Computing Science FAX: (780) 492-1071 | | University of Alberta Email: greiner at cs.ualberta.ca | | Edmonton, AB T6G 2E8 Canada http://www.cs.ualberta.ca/~greiner/ | From nnk at his.atr.co.jp Thu Oct 17 02:10:58 2002 From: nnk at his.atr.co.jp (Neural Networks Japan Office) Date: Thu, 17 Oct 2002 15:10:58 +0900 Subject: [REMINDER] Call for Papers NN 2003 Special Issue on Neuroinformatics Message-ID: [Apologies if you receive this announcement more than once.] ****************************************************************** CALL FOR PAPERS Neural Networks 2003 Special Issue "Neuroinformatics" ****************************************************************** ---------------> The deadline for submission is close <------------ Co-Editors Professor Shun-ichi Amari, RIKEN Brain Science Institute Professor Michael A. Arbib, University of Southern California Dr. Rolf Kötter, Heinrich Heine University Düsseldorf Submission Deadline for submission: October 30, 2002 Notification of acceptance: March 31, 2003 Format: as for normal papers in the journal (APA format) and no longer than 10,000 words Address for Papers Dr. Mitsuo Kawato ATR Human Information Science Laboratories 2-2-2 Hikaridai, Seika-cho Soraku-gun, Kyoto 619-0288, Japan. Neuroinformatics is an emerging field, integrating approaches from neuroscience and information science/technology toward understanding the structure and function of the brain. Neuroinformatics research is interdisciplinary and aims at unraveling the complex structure-function relationships of the brain at all levels and scales of analysis.
Neuroinformatics will accelerate the progress of neuroscience and informatics, for example, by * more efficient use of neuroscience data by information-based approaches, * developing and applying new tools and methods for acquiring, visualizing and analyzing data, * developing new methodologies for generating theories to derive further experiments and engineering applications, * generating computational theories connecting neuroscience and information science/technology. The Special Issue will include invited and contributed articles taking a broad view of neuroinformatics, with special emphasis on database construction, data mining, ontologies for neural systems, and the integration of simulation methods with data analysis. Neural Networks Official home page http://www.elsevier.com/inca/publications/store/8/4/1/index.htt Instructions to Authors http://authors.elsevier.com/GuideForAuthors.html?PubID=841&dc=GFA ----------------------------------------------------------------- END. -- ==================================================================== NEURAL NETWORKS Editorial Office ATR-I, Human Information Science Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-2647 E-MAIL nnk at his.atr.co.jp ========================================================= EMAIL ADDRESS HAS BEEN CHANGED FROM OCT.1, 2001 ========================================================= From: esann To: "'Connectionists at cs.cmu.edu'" References: From bogus@does.not.exist.com Fri Oct 18 03:29:00 2002 From: bogus@does.not.exist.com () Date: Fri, 18 Oct 2002 09:29:00 +0200 Subject: special sessions at ESANN'2003 Message-ID: ESANN'2003 11th European Symposium on Artificial Neural Networks Bruges (Belgium) - April 23-24-25, 2003 Special sessions ===================================================== The following message contains a summary of all special sessions that will be organized during the ESANN'2003 conference. 
Authors are invited to submit their contributions to one of these sessions or to a regular session, according to the guidelines found on the web server of the conference (http://www.dice.ucl.ac.be/esann/). Deadline for submissions is December 6, 2002. List of special sessions that will be organized during the ESANN'2003 conference ========================================================================
1. Links between neural networks and webs (M. Gori)
2. Mathematical aspects of neural networks (B. Hammer, T. Villmann)
3. Statistical learning and kernel-based algorithms (M. Pontil, J. Suykens)
4. Digital image processing with neural networks (A. Wismüller, U. Seiffert)
5. Industrial and agronomical applications of neural networks (L.M. Reyneri)
6. Neural networks for human/computer interaction (C.W. Omlin)
Short description ================= 1. Links between neural networks and webs ----------------------------------------- Organised by : Marco Gori, University of Siena (Italy) Description of the session: Artificial neural networks have been the subject of massive in-depth investigation in the last twenty years. General theories on architectural and learning issues are now well known and widely disseminated in the scientific community. The recent development of the Web, with the corresponding crucial problem of performing information retrieval, has recently been addressed by introducing the concept of page rank (Google search engine), which is a sort of visibility index of the pages in the Web. Interestingly enough, one of the most successful solutions to page scoring is based on a dynamical model which reminds us of a neural network. Recent extensions of this model give page scoring systems tightly related to neural networks, the main difference being that each unit typically accepts an input which is roughly constant.
A more general view of page scoring systems includes, however, the presence of a dynamics on the inputs of the nodes and the introduction of parameters which are dual with respect to the weights of a neural network. This special session addresses the newly emerging notion of the Web as a sort of huge artificial neural network, whose learning environment is distributed throughout the units in a single instance, which comes in conjunction with the network, instead of being presented as a collection of examples. The Web is a notable case where the learning data can be the users' relevance feedback on the pages, but other interesting examples can be given. In the special session, we also expect to lay the foundations of web learning as a generalization of the theory of adaptive computation on structured domains. 2. Mathematical aspects of neural networks ------------------------------------------ Organised by : Barbara Hammer, Univ. Osnabrück (Germany) Thomas Villmann, Univ. Leipzig (Germany) Description of the session: ESANN has become the reference for researchers on fundamentals and theoretical aspects of neural networks. Nevertheless, an increasing number of presentations of successful neural network applications could be found at past ESANN conferences. This might be due to two reasons: theoretical aspects of neural networks are now well understood; theory of neural networks directly leads to improved algorithms. Though this is certainly the case with respect to many aspects, we believe that there still exist many questions concerning neural networks which are not yet understood or even adequately formalized, and where direct applicability cannot be expected in the near future. We would like to open a forum for mathematical aspects of neural networks with a focus on in-principle possibilities of formalizing heuristic observations, open questions and directions of theoretical research, and mathematical results for possibly not yet practically relevant situations.
We encourage submissions which could be related to the following topics: capacity and approximation results, complexity of neural network training, learning theory, convergence and stability of network dynamics or training, alternative mathematical descriptions of models, evaluation of network behaviour in non-standard domains, ... 3. Statistical learning and kernel-based algorithms --------------------------------------------------- Organised by : Massimiliano Pontil, Univ. Siena (Italy) Johan Suykens, K.U. Leuven (Belgium) Description of the session: Over the past few years, statistical learning theory has emerged as a principled approach for learning from examples. The theory has grown on ideas from different fields, including empirical processes, statistics, regularization, and convex optimization, to name a few, and has produced remarkable learning algorithms, such as the popular support vector machine. Those algorithms make use of reproducing kernel Hilbert spaces as the key computational ingredient and bring together older ideas from statistics (e.g., ridge regression, principal components, and canonical correlation analysis). We encourage authors to submit papers which either present novel algorithmic ideas or discuss important application studies of already existing kernel-based learning algorithms. 4. Digital image processing with neural networks ------------------------------------------------ Organised by : Axel Wismüller, Univ. Munich (Germany) Udo Seiffert, Univ. Magdeburg (Germany) Description of the session: In the proposed special session we want to focus on image processing based on neural networks as well as other advanced methods of computational intelligence. A special emphasis is put on real-world applications combining original ideas and new developments with a strong theoretical background. Authors are invited to submit contributions which can be in any area of image processing with neural networks.
The following non-restrictive list can serve as an orientation; however, additional topics may be chosen as well:
- Real-world image processing applications in science and industry, e.g. for robotics, security, biometry, medicine, biology, ...
- Preprocessing and feature extraction: dimension and noise reduction, image enhancement, edge detection, compression, ...
- Segmentation and object recognition, texture and colour analysis
- Image registration, matching, morphing
- Image understanding and scene analysis
- Methods for neural network image processing: classification, clustering, embedding, hybrid systems, ...
- Multidimensional image analysis, image time-series
- Knowledge discovery in image databases, web applications
5. Industrial and agronomical applications of neural networks ------------------------------------------------------------- Organised by : Leonardo M. Reyneri, Politecnico di Torino (Italy) Description of the session: For more than 30 years, neural networks have given rise to a huge amount of theoretical studies and developments, research and evaluation phases, etc. Theory is now becoming rather stable, and neural networks, together with fuzzy systems and other soft computing paradigms, have become knowledge that every expert should manage. At this stage, it is of the utmost importance to evaluate the relevance and effectiveness of neuro-fuzzy systems in real-world applications. Too few such papers are found in the literature; contributors are therefore invited to share their experience by presenting papers which describe the application of a neuro-fuzzy system to an industrial or agronomical problem.
Requirements for papers submitted to this session: i) the paper should briefly describe the problems and limitations of the older approach(es); in case of obvious constraints due to intellectual property or non-disclosure agreements, details of the problem can be masked or modified, provided that the reader can get a feeling for the complexity; ii) enough details should be given about the network paradigm, size, topology, and training rule; iii) a table SHALL be included which lists, at least: design time (e.g. in days), training time (e.g. in days, NOT training epochs), and one to three performance figures (in appropriate units for the problem). All these figures shall be given for the proposed solution AND for at least one other (possibly more) non-neural approach(es). 6. Neural networks for human/computer interaction ------------------------------------------------- Organised by : Christian W. Omlin, Univ. Western Cape (South Africa) Description of the session: text not yet available ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L.
Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From hasselmo at bu.edu Fri Oct 18 15:04:14 2002 From: hasselmo at bu.edu (Michael Hasselmo) Date: Fri, 18 Oct 2002 15:04:14 -0400 Subject: IJCNN 2003 - Portland, Oregon, USA - July 20-24, 2003 Message-ID: CALL FOR PAPERS **************************************************************** International Joint Conference on Neural Networks (IJCNN 2003) Portland, Oregon, July 20-24, 2003 http://www.ijcnn.net DEADLINE: January 29, 2003 **************************************************************** Co-sponsored by the International Neural Network Society (INNS) And the IEEE Neural Networks Society. Paper submission deadline is January 29, 2003. Selected papers will be published in a special issue of the journal Neural Networks, in addition to publication of all papers in the conference proceedings. The International Joint Conference on Neural Networks provides an overview of state of the art research in Neural Networks, covering a wide range of topics (see topic list below). The IJCNN meeting is organized annually by the International Neural Network Society (INNS) and the IEEE Neural Networks Society. Conference attendees who are INNS or IEEE Neural Networks Society members, or who join one of these societies now will receive a reduced IJCNN conference registration fee, and those who are INNS members will receive the IJCNN special issue for free as part of their annual membership subscription to Neural Networks. Location: ------------------------------------ The conference will take place at the Doubletree Hotel Portland-Columbia River, Portland, Oregon, July 20-24, 2003. 
For more information about Portland see http://www.ijcnn.net
Article submission:
------------------------------------
Authors should submit their articles electronically on the conference web site at http://www.ijcnn.net by the conference deadline of January 29, 2003. The site opens on October 15, 2002.
Special issue of the journal Neural Networks
------------------------------------
The review process of the conference will allow selection of a large subset of the articles for inclusion in the special issue of the journal Neural Networks. For more information about this journal see: http://www.elsevier.com/locate/neunet
Plenary speakers:
------------------------------------
Kunihiko Fukushima, Tokyo University of Technology, Japan
Earl Miller, Massachusetts Institute of Technology, USA
Terrence Sejnowski, Salk Institute and UCSD, USA
Vladimir Vapnik, NEC Research Labs, USA
Christoph von der Malsburg, USC, USA and Univ. Bochum, Germany
Special sessions:
------------------------------------
There will be a number of special sessions, including the following titles:
1. Neuroinformatics
2. Visual cortex: How illusions represent reality
3. Dynamical aspects of information encoding
4. Incremental Learning
5. Attention and consciousness in normal brains: Theoretical models and phenomenological data from MEG
Tutorials
------------------------------------
Tutorials will take place on Sunday, July 20, 2003. Two-hour sessions will cover a range of different topics. Researchers interested in proposing a tutorial should access the web site at http://www.ijcnn.net.
Topic list
------------------------------------
Regular oral and poster sessions will include papers in the following topics:
A. PERCEPTUAL AND MOTOR FUNCTION
Vision and image processing
Pattern recognition
Face recognition
Handwriting recognition
Other pattern recognition
Auditory and speech processing
Audition
Speech recognition
Speech production
Other perceptual systems
Motor control and response
B. COGNITIVE FUNCTION
Cognitive information processing
Learning and memory
Spatial Navigation
Conditioning, Reward and Behavior
Mental disorders
Attention and Consciousness
Language
Emotion and Motivation
C. COMPUTATIONAL NEUROSCIENCE
Models of neurons and local circuits
Systems neurobiology and neural modeling
Spiking neurons
D. INFORMATICS
Neuroinformatics
Bioinformatics
Artificial immune systems
Data mining
E. HARDWARE
Neuromorphic hardware and implementations
Embedded neural networks
F. REINFORCEMENT LEARNING AND CONTROL
Reinforcement learning
Approximate/Adaptive dynamic programming
Control
Reconfigurable systems
Robotics
Fuzzy neural systems
Optimization
G. DYNAMICS
Neurodynamics
Recurrent networks
Chaos and learning theory
H. THEORY
Mathematics of Neural Systems
Support vector machines
Extended Kalman filters
Mixture models, EM algorithms and ensemble learning
Radial basis functions
Self-organizing maps
Adaptive resonance theory
Principal component analysis and Independent component analysis
Probabilistic and information-theoretic methods
Neural Networks and Evolutionary Computation
I. APPLICATIONS
Signal Processing
Telecommunications Applications
Time Series Analysis
Biomedical Applications
Financial Engineering
Biomimetic applications
Computer security applications
Power system applications
Aeroinformatics
Diagnostics and Quality Control
Other applications
General Chair: Don Wunsch, University of Missouri - Rolla
Program Chair: Michael Hasselmo, Boston University
Program co-chairs: DeLiang Wang, Ohio State University; Ganesh K. Venayagamoorthy, University of Missouri - Rolla
Tutorial co-chairs: F.
Carlo Morabito, University of Reggio Calabria, Italy Harold Szu, Office of Naval Research Local Arrangements Chair: George Lendaris, Portland State University Publicity chair: Derong Liu, University of Illinois at Chicago Web chair: Tomasz Cholewo, Lexmark International Inc., Kentucky Exhibits chair: Karl Mathia, Brooks-PRI Automation Inc., California Student travel and volunteer chair: Slawo Wesolkowski, University of Waterloo, Canada International Liaison: William N. Howell, Mining and Mineral Sciences Laboratories, Canada Program committee: ------------------------------------------ David Brown, FDA David Casasent, Carnegie Mellon University Ke Chen, University of Birmingham, UK Michael Denham, University of Plymouth, UK Tom Dietterich, Oregon State University Lee Feldkamp, Ford Motor Company Kunihiko Fukushima, Tokyo University of Technology, Japan Joydeep Ghosh, University of Texas at Austin Stephen Grossberg, Boston University Fred Ham, Florida Institute of Technology Ron Harley, Georgia Institute of Technology Bart Kosko, University of Southern California Robert Kozma, University of Memphis Dan Levine, University of Texas at Dallas Xiuwen Liu, Florida State University F. Carlo Morabito, Universita di Reggio Calabria, Italy Ali Minai, University of Cincinnati Catherine Myers, Rutgers University Erkki Oja, Helsinki University of Technology, Finland Jose Principe, University of Florida Danil Prokhorov, Ford Motor Company Harold Szu, Office of Naval Research John Gerald Taylor, University College, London, UK Shiro Usui, Toyohashi Univ. of Technology, Japan Bernie Widrow, Stanford University Lei Xu, The Chinese University of Hong Kong Gary Yen, Oklahoma State University Lotfi Zadeh, University of California, Berkeley Review committee: ----------------------------------------------------- Reviews will be performed by a group of over 130 researchers in the field. The review committee member list will be posted on the IJCNN web site. 
For more information see the web page at http://www.ijcnn.net or contact INNS at: 19 Mantua Road Mt. Royal, NJ 08061 856-423-0162 or FAX: 856-423-3420. From bert at snn.kun.nl Fri Oct 18 08:11:31 2002 From: bert at snn.kun.nl (Bert Kappen) Date: Fri, 18 Oct 2002 14:11:31 +0200 (MEST) Subject: Promedas: a decision support system for medical diagnosis Message-ID: Dear all, we recently completed a report describing the state-of-the-art of our ongoing efforts in medical diagnosis, the Promedas project. "PROMEDAS": a probabilistic decision support system for medical diagnosis The objective of our project is to build a large Bayesian network for diagnosis in internal medicine. As is well known, this is not easy and requires the combined efforts of experts in internal medicine as well as advanced software and algorithmic development. On the medical side, one of the distinguishing features of our work is that we have the financial resources to contract physicians to do the medical modeling and evaluation in a clinical setting. In our view, this is critical to 1) obtain valid models and 2) gain acceptance from potential users. On the algorithmic side, our research group in Nijmegen has experience with approximate inference techniques that are needed to keep computation tractable. We have developed our own graphical model software, called BayesBuilder, which is freely available for non-commercial use. The maturity of our work can be described as 'close to clinical practice'. We have just started the first clinical trials to evaluate the acceptance of a module on lipids and vascular diseases (approx. 500 variables) by the intended users, i.e. experts in internal medicine working in a hospital environment. 
The report can be downloaded from http://www.snn.kun.nl/~bert/#diagnosis Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: www.snn.kun.nl/~bert From kim.plunkett at psy.ox.ac.uk Sat Oct 19 11:55:27 2002 From: kim.plunkett at psy.ox.ac.uk (Kim Plunkett) Date: Sat, 19 Oct 2002 16:55:27 +0100 Subject: Oxford Connectionist Summer School Message-ID: <003b01c27787$f1495340$30274381@KIMSLAPTOP> UNIVERSITY OF OXFORD OXFORD SUMMER SCHOOL ON CONNECTIONIST MODELLING Department of Experimental Psychology University of Oxford Sunday 20th July - Friday 1st August, 2003 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research, and it will provide a general introduction to connectionist modelling, biologically plausible neural networks and brain function through lectures and exercises on Macintoshes and PCs. The course is interdisciplinary in content, though many of the illustrative examples are taken from cognitive and developmental psychology, and cognitive neuroscience. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. No prior knowledge of computational modelling will be required, though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is £950. This figure covers the cost of accommodation (bed and breakfast at St. John's College), registration and all literature required for the Summer School. Participants will be expected to cover their own travel and meal costs. A number of partial bursaries (£200) will be available for graduate students. 
Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek further funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. If you are interested in participating in the Summer School, please complete the application form at the web address http://epwww.psych.ox.ac.uk/conferences/connectionist_modelling or alternatively send a brief description of your background with an explanation of why you would like to attend the Summer School, to: Mrs Sue King Department of Experimental Psychology University of Oxford South Parks Road Oxford OX1 3UD Tel: (01865) 271353 Email: susan.king at psy.ox.ac.uk no later than 28th February 2003. From rens at science.uva.nl Sun Oct 20 14:14:53 2002 From: rens at science.uva.nl (Rens Bod) Date: Sun, 20 Oct 2002 20:14:53 +0200 (MEST) Subject: New article on Unified Model of Linguistic and Musical Processing In-Reply-To: <003b01c27787$f1495340$30274381@KIMSLAPTOP> Message-ID: Dear Connectionists, The following paper may be of interest to the readers of this list. Best, Rens Bod (http://turing.wins.uva.nl/~rens) ---------------------------- Bod, Rens (2002) "A Unified Model of Structural Organization in Language and Music", Journal of Artificial Intelligence Research (JAIR) Volume 17, pages 289-308. Available at http://turing.wins.uva.nl/~rens/jair02.pdf (or also via http://www.jair.org/abstracts/bod02a.html) Abstract: Is there a general model that can predict the perceived phrase structure in language and music? While it is usually assumed that humans have separate faculties for language and music, this work focuses on the commonalities rather than on the differences between these modalities, aiming at finding a deeper 'faculty'. 
Our key idea is that the perceptual system strives for the simplest structure (the 'simplicity principle'), but in doing so it is biased by the likelihood of previous structures (the 'likelihood principle'). We present a series of data-oriented parsing (DOP) models that combine these two principles and that are tested on the Penn Treebank and the Essen Folksong Collection. Our experiments show that (1) a combination of the two principles outperforms the use of either of them, and (2) exactly the same model with the same parameter setting achieves maximum accuracy for both language and music. We argue that our results suggest an interesting parallel between linguistic and musical structuring. From terry at salk.edu Mon Oct 21 17:34:48 2002 From: terry at salk.edu (Terry Sejnowski) Date: Mon, 21 Oct 2002 14:34:48 -0700 (PDT) Subject: Computational Biology of Time Message-ID: <200210212134.g9LLYmI98120@purkinje.salk.edu> COMPUTATIONAL BIOLOGY OF TIME Organizers: Terrence Sejnowski and Sydney Brenner January 31 - February 4, 2003 Banff Centre - Banff, Alberta http://www.keystonesymposia.org/Sites/SitesDetail.cfm?SiteID=19 Abstract Deadline: November 1, 2002 Early Registration Deadline: December 2, 2002 http://www.keystonesymposia.org/Meetings/ViewMeetings.cfm?MeetingID=659 Time is the final frontier in biology and uncovering molecular and cellular mechanisms in cells that keep time is essential to understanding biological systems. Biological clocks cover a wide range of time scales, from the heartbeat to circadian rhythms. In each of these systems, molecular mechanisms are being uncovered that underlie these rhythms and stabilize them, but the number of molecules and the complexity of their interactions are daunting. There is growing interest in applying computational models to these biological systems. 
This symposium brings together some of the leading computational model builders and key researchers studying the circadian clock, photoperiodism in plants, the cell cycle in yeast, cardiac rhythms, brain rhythms that occur during sleep and firefly synchronization. The mathematical principles that emerge from the models highlight deep similarities that exist between these diverse systems, and allow a broader understanding to emerge for how biological systems organize time in robust and effective ways. Friday, January 31, 7:30 - 8:30 PM: Keynote Address: Sydney Brenner, 2002 Nobel Prize in Physiology or Medicine HOW CELLS COMPUTE Saturday, February 1, 8:00 - 11:00 AM CIRCADIAN RHYTHMS Joseph S. Takahashi, Northwestern University "Circadian Clock Genes" Martha U. Gillette, University of Illinois "Circadian Pacemaker in the Suprachiasmatic Nucleus" Stanislas Leibler, Rockefeller University "Oscillations and Noise in Genetic Networks" Albert Goldbeter, Université Libre de Bruxelles "Computational Biology of Circadian Rhythms" Saturday, February 1, 5:00 - 7:00 PM COUPLED BIOLOGICAL OSCILLATORS Andrew Moiseff, University of Connecticut "Temporal Rhythms in Firefly Communication" Wolfgang O. Friesen, University of Virginia "Coupled Central and Peripheral Oscillators Generate Efficient Swim Undulations" G. Bard Ermentrout, University of Pittsburgh "Coupled Neural Oscillators" Sunday, February 2, 8:00 - 11:00 AM SLEEP RHYTHMS David A. McCormick, Yale University "Slow Oscillations in Thalamic and Cortical Slices" Mircea Steriade, Université Laval "Sleep Oscillations In Vivo" Terrence Sejnowski, Salk Institute "Neural Models of Sleep Rhythms" Alexander A. Borbely, University of Zurich "Sleep in Humans: Intrinsic and Extrinsic Oscillations" Sunday, February 2, 5:00 - 7:00 PM PHOTOPERIODISM Steve A. Kay, The Scripps Research Institute "Comparative Genetics and Genomics Approaches to Understanding Circadian Clock and Photoperiodism" Susan S. 
Golden, Texas A & M University "Plasticity of circadian rhythms of gene expression in cyanobacteria" Takao Kondo, Nagoya University "Genome-Wide Circadian System of Cyanobacteria Driven by Kai Feedback Loop" Monday, February 3, 8:00 - 11:00 AM CARDIAC RHYTHMS Denis Noble, University of Oxford "The Modes of Oscillation of the Heart" Peter Hunter, University of Auckland "Electro-Mechanical Heart Model" John Peter Wikswo Jr. , Vanderbilt University "Cardiac Reentry as a Spatiotemporal Oscillator" Leon Glass, McGill University "Puzzles Concerning the Starting and Stopping of Biological Oscillations" Monday, February 3, 5:00 - 7:00 PM CELL CYCLE John Tyson, Virginia Polytechnic Institute "Cyberyeast: Modeling the Eukaryotic Cell Cycle" Marc W. Kirschner, Harvard Medical School "Modeling the Wnt Signaling Pathway" ----- For more than 30 years, Keystone Symposia has been connecting the scientific community in a way no other meeting or conference can. Your opportunity to enjoy quality scientific discussions, networking among colleagues, and cutting-edge presentations -- all in a relaxed atmosphere -- is here. For more information about the Banff Center in Alberta, Canada: http://www.keystonesymposia.org/Sites/SitesDetail.cfm?SiteID=19 ----- From agathe at dcs.gla.ac.uk Mon Oct 21 10:03:25 2002 From: agathe at dcs.gla.ac.uk (Agathe Girard) Date: Mon, 21 Oct 2002 15:03:25 +0100 Subject: technical report available Message-ID: <3DB4092D.638E5042@dcs.gla.ac.uk> Dear All, The following new technical report Gaussian Process Priors with Uncertain Inputs: Multiple-Step Ahead Prediction A. Girard, C. E. Rasmussen and R. Murray-Smith is available at http://www.dcs.gla.ac.uk/~agathe/reports.html Feedback most appreciated! Regards, Agathe Girard http://www.dcs.gla.ac.uk/~agathe %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% Abstract: We consider the problem of multi-step ahead prediction in time series analysis using the non-parametric Gaussian process model. 
$k$-step ahead forecasting of a discrete-time non-linear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form $y_t=f(y_{t-1}, \dots, y_{t-L})$, the prediction of $y$ at time $t+k$ is based on the estimates ${\hat y_{t+k-1}}, \dots, {\hat y_{t+k-L}}$ of the previous outputs. We show how, using an analytical Gaussian approximation, we can formally incorporate the uncertainty about intermediate regressor values, thus updating the uncertainty on the current prediction. In this framework, the problem is that of predicting responses at a random input and we compare the Gaussian approximation to the Monte-Carlo numerical approximation of the predictive distribution. The approach is illustrated on a simulated non-linear dynamic example, as well as on a simple one-dimensional static example. From O.Simonnot at elsevier.com Tue Oct 22 06:05:53 2002 From: O.Simonnot at elsevier.com (Simonnot, Olivier (ELS)) Date: Tue, 22 Oct 2002 11:05:53 +0100 Subject: The Computer Science Preprint Server Message-ID: <4D56BD81F62EFD49A74B1057ECD75C06046C974C@elsamss02571> Computer scientists can now fully enjoy a completely free platform facilitating the exchange of scientific information, The Computer Science Preprint Server "http://www.compscipreprints.com". 
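The iterated $k$-step ahead setting in the Girard, Rasmussen and Murray-Smith abstract above can be sketched in a few lines. This is a toy illustration under stated assumptions, not the authors' code: the one-step model `f` below is a made-up stand-in for a trained Gaussian process that returns a predictive mean and variance. The sketch contrasts naive iterated prediction, which feeds point estimates back in, with the Monte-Carlo numerical approximation the report compares against, which propagates the uncertainty on intermediate regressors by sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 2  # lag order of the autoregressive model y_t = f(y_{t-1}, ..., y_{t-L})

def f(lags):
    # Hypothetical one-step predictor returning (mean, variance);
    # a stand-in for a trained GP's predictive distribution.
    mean = 0.7 * lags[0] - 0.2 * lags[1] + 0.1 * np.tanh(lags[0])
    return mean, 0.01

def naive_k_step(history, k):
    """Iterate one-step predictions, feeding back only the means."""
    lags = list(history)
    for _ in range(k):
        mean, _ = f(lags)
        lags = [mean] + lags[:-1]
    return lags[0]

def monte_carlo_k_step(history, k, n_samples=2000):
    """Propagate input uncertainty by sampling each intermediate output."""
    finals = []
    for _ in range(n_samples):
        lags = list(history)
        for _ in range(k):
            mean, var = f(lags)
            # draw the fed-back regressor from its predictive distribution
            lags = [rng.normal(mean, np.sqrt(var))] + lags[:-1]
        finals.append(lags[0])
    return np.mean(finals), np.var(finals)

point = naive_k_step([1.0, 0.5], k=5)
mc_mean, mc_var = monte_carlo_k_step([1.0, 0.5], k=5)
```

The report's contribution is to replace the sampling loop with an analytical Gaussian approximation of the same propagated distribution; the Monte-Carlo version above serves only as the reference the approximation is compared to.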
The Computer Science Preprint Server provides you with: * FREE full text access to the articles already posted * final-version articles as well as reports on work in progress * streamlined and easy to use submission process with instant online visibility of your article * as an author, total freedom to remove your article at any time, and/or to submit it for publication to the journal of your choice * discussion threads to get direct feedback from your peer researchers * an extended alerting service enabling you to keep on track with the latest developments but also make your article visible From canete at ctima.uma.es Wed Oct 23 06:41:34 2002 From: canete at ctima.uma.es (Javier Fernández de Cañete) Date: Wed, 23 Oct 2002 12:41:34 +0200 Subject: 8th Conference of Eng. Applications of Neural Networks EANN'03. Call for papers Message-ID: <004801c27a80$c1ba2680$836dd696@isa.uma.es> Call for Papers and Participation Eighth International Conference on Engineering Applications of Neural Networks Costa del Sol, Málaga, Spain 8-10 September 2003 The conference is a forum for presenting the latest results on neural network applications in technical fields. The applications may be in any engineering or technical field, including but not limited to: building systems, systems engineering, mechanical engineering, robotics, process engineering, metallurgy, pulp and paper technology, aeronautical engineering, computer science, machine vision, chemistry, chemical engineering, physics, electrical engineering, electronics, civil engineering, geophysical sciences, biomedical systems, and environmental engineering. This year's conference is organised by the University of Malaga in co-operation with the Dept. of System Engineering and Automation. 
For information on earlier EANN conferences visit http://www.kingston.ac.uk/eann/EANN.htm EANN2003 Web Page: http://www.isa.uma.es/eann03 Abstract Submission Prospective authors are requested to send an extended abstract for review by the International Committee. All papers must be written in English, starting with a succinct statement of the problem and the application area, the results achieved, their significance and a comparison with previous work (if any). The following must also be included: Title of proposed paper, Author names, Affiliations Addresses, Name of author to contact for correspondence, E-mail address and fax number of contact author, Topics which best describe the paper (max. 5 keywords). It is strongly recommended to submit extended abstracts by electronic mail to: eann03 at ctima.uma.es Tutorial Proposals EANN2003 is soliciting tutorial proposals, covering any area of neurocomputing from theory to implementation guidelines for the benefit of the participants. Location The conference will take place in a hotel of the Costa del Sol by the Mediterranean Sea in southern Spain, 12 km west of Malaga. The hotel is located near the beach and yacht harbour "Puerto Marina" and less than 10 minutes' drive from Malaga international airport. The Costa del Sol is situated in the South of Spain, located in the far west of the Mediterranean. The area of coastline that is the Costa del Sol extends northwards from the Straits of Gibraltar, which is the door to the Mediterranean Sea from the Atlantic Ocean. Diary Dates Abstract Submission Deadline: 28 February 2003 Notification of Acceptance: 28 March 2003 Delivery of full papers: 2 May 2003 Proposals for Tutorials: 17 May 2003 Social Events A boat trip will be organised on Monday 8th and a gala dinner will be offered to participants on Tuesday 9th September. 
Post-conference Publications A number of papers will be selected for inclusion (enhanced versions) in a Special Issue of a prestigious scientific journal (Neurocomputing, Neural Computing and Applications, etc.), as was done for earlier EANN conferences. Registration Conference Fee Industry and University Rate: GBP 300 Student Rate with proceedings: GBP 200 Student Rate without proceedings: GBP 150 The rates above entitle you to access to the conference sessions (3 days), a copy of the final programme and the proceedings (except option 3, see above), a list of conference participants, coffee/tea breaks (3 days), lunch (3 days) and the gala dinner (1 day), and social events that are free of any additional charges. Contacts Local Committee Conference Secretariat J. Fernandez de Canete A. García-Cerezo A. Mandow I. García-Moral G. Joya J. Muñoz-Perez ------------------------------------------------------------------------- Organising Committee A. Osman (USA) R. Baratti (Italy) C. Kuroda (Japan) A. Ruano (Portugal) A.J. Owens (USA) D. Tsaptsinos (UK) ------------------------------------------------------------------------- International Committee to be extended A. Bulsari, Finland A. Iwata, Japan S. Michaelides, Cyprus F. Sandoval, Spain R. Saatchi, UK P. Zufiria, Spain S. Lecoeuche, France A. J. Owens, USA R. Parenti, Italy A. Servida, Italy F. García-Lagos, Spain A. Fanni, Italy N. Hazarika, UK S. Bitzer, Germany J. Ringwood, Ireland ------------------------------------------------------------------------- Prof. Javier Fernandez de Canete, Ph.D. Dpto. de Ingeniería de Sistemas y Automatica E.T.S.I. 
Informatica Campus de Teatinos, 29071 Malaga (SPAIN) Phone: +34-95-2132887 FAX: +34-95-2133361 e-mail: canete at ctima.uma.es From t.windeatt at eim.surrey.ac.uk Thu Oct 24 07:23:50 2002 From: t.windeatt at eim.surrey.ac.uk (Terry Windeatt) Date: Thu, 24 Oct 2002 12:23:50 +0100 Subject: MCS 2003 CALL FOR PAPERS Message-ID: <20021024122350.A9617@ee.surrey.ac.uk> **Apologies for multiple copies** ****************************************** *****MCS 2003 Call for Papers***** ****************************************** *****Paper Submission: 10 January 2003***** *********************************************************************** FOURTH INTERNATIONAL WORKSHOP ON MULTIPLE CLASSIFIER SYSTEMS Guildford, Surrey, GU2 7XH, United Kingdom June 11-13 2003 Updated information: http://www.diee.unica.it/mcs E-mail: mcs2003 at eim.surrey.ac.uk *********************************************************************** WORKSHOP OBJECTIVES MCS 2003 is the fourth workshop of a series aimed at creating a common international forum for researchers of the diverse communities working in the field of Multiple Classifier Systems. Information on the previous editions of the MCS workshop can be found at www.diee.unica.it/mcs. Contributions from all the research communities working in the field are welcome in order to compare the different approaches and to define the common research priorities. Special attention is also devoted to assessing the applications of Multiple Classifier Systems. The workshop is an official event of the International Association for Pattern Recognition (IAPR-TC1). WORKSHOP CHAIRS Terry Windeatt (Univ. of Surrey, United Kingdom) Fabio Roli (Univ. of Cagliari, Italy) ORGANIZED BY Center for Vision, Speech and Signal Proc. of the University of Surrey Dept. of Electrical and Electronic Eng. of the University of Cagliari PAPER SUBMISSION Two hard copies of the full paper should be mailed to: MCS 2003 Dr. Terry Windeatt Dept. of Electrical and Electronic Eng. 
University of Surrey Guildford, Surrey, GU2 7XH, United Kingdom. In addition, participants should submit an electronic version of the manuscript (PDF or PostScript format) to mcs2003 at eim.surrey.ac.uk. The papers should not exceed 10 pages (LNCS format, see http://www.springer.de/comp/lncs/authors.html). A cover sheet with the authors' names and affiliations is also requested, with the complete address of the corresponding author, and an abstract (200 words). Two members of the Scientific Committee will referee the papers. IMPORTANT NOTICE: Submission implies the willingness of at least one author to register, attend the workshop, and present the paper. Accepted papers will be published in the proceedings only if the registration form and payment for one of the authors has been received. WORKSHOP TOPICS Papers describing original work in the following and related research topics are welcome: Foundations of multiple classifier systems Methods for classifier fusion Design of multiple classifier systems Neural network ensembles Bagging and boosting Mixtures of experts New and related approaches Applications INVITED SPEAKERS Jerry Friedman (USA) Mohamed Kamel (Canada) SCIENTIFIC COMMITTEE J. A. Benediktsson (Iceland) H. Bunke (Switzerland) L. P. Cordella (Italy) B. V. Dasarathy (USA) R.P.W. Duin (The Netherlands) C. Furlanello (Italy) J. Ghosh (USA) T. K. Ho (USA) S. Impedovo (Italy) N. Intrator (Israel) A.K. Jain (USA) M. Kamel (Canada) J. Kittler (UK) L.I. Kuncheva (UK) L. Lam (Hong Kong) D. Landgrebe (USA) D-S. Lee (USA) D. Partridge (UK) A.J.C. Sharkey (UK) K. Tumer (USA) G. Vernazza (Italy) IMPORTANT DATES January 10, 2003: Paper Submission February 20, 2003: Notification of Acceptance April 1, 2003: Camera-ready Manuscript April 10, 2003: Registration WORKSHOP PROCEEDINGS Accepted papers will appear in the workshop proceedings that will be published in the series Lecture Notes in Computer Science by Springer-Verlag. 
Word processing templates are available (www.springer.de/comp/lncs/authors.html). Furthermore, extended and revised versions of selected papers will be considered for possible publication in a special journal issue. Selected papers from previous editions of the MCS workshop have been published in Pattern Analysis and Applications, Information Fusion and International Journal of Pattern Recognition and Artificial Intelligence. From djin at MIT.EDU Sat Oct 26 22:23:38 2002 From: djin at MIT.EDU (Dezhe Jin) Date: Sat, 26 Oct 2002 22:23:38 -0400 (EDT) Subject: papers available Message-ID: Dear Connectionists, The following two papers may be of interest to some of you. They can be downloaded at http://hebb.mit.edu/~djin/index.html. Thanks! -Dezhe Jin 1. Fast Convergence of Spike Sequences to Periodic Patterns in Recurrent Networks, Dezhe Z. Jin, Physical Review Letters, 89, 208102 (2002). Abstract: The dynamical attractors are thought to underlie many biological functions of recurrent neural networks. Here we show that stable periodic spike sequences with precise timings are the attractors of the spiking dynamics of recurrent neural networks with global inhibition. Almost all spike sequences converge within a finite number of transient spikes to these attractors. The convergence is fast, especially when the global inhibition is strong. These results support the possibility that precise spatiotemporal sequences of spikes are useful for information encoding and processing in biological neural networks. 2. Fast computation with spikes in a recurrent neural network, Dezhe Z. Jin and H. Sebastian Seung, Physical Review E, 65, 051922 (2002). Abstract: Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. 
Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self-excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M > 1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner. ============================================================================= Dezhe Z. Jin, Ph.D. Postdoctoral Fellow Seung Lab, Dept. of Brain and Cognitive Sciences, M.I.T. ============================================================================= From CL243 at cornell.edu Mon Oct 28 05:22:50 2002 From: CL243 at cornell.edu (Christiane Linster) Date: Mon, 28 Oct 2002 05:22:50 -0500 Subject: Computational Neuroscience meeting Message-ID: <003701c27e6b$f7ed7510$8201a8c0@cpl.cornell.edu> Dear colleagues, At this summer's annual Computational Neuroscience (CNS) meeting held in Chicago, the decision was made to form a CNS organization to administer future meetings. The purpose of this e-mail is to solicit nominations for a 15-member board of directors for the organization. Overall policy for the organization will be set by the board and a 4-member executive committee. 
Board members will serve 5-year terms and may be re-elected. Executive officers serve an initial 2-year term, renewable annually by a simple majority vote of the board. After 5 years, a 2/3 majority will be needed to continue. The scientific program for each meeting will be set by a separate program committee. For details, the current draft of the bylaws (to be confirmed by the initial board) is available for viewing at www.nbb.cornell.edu/neurobio/linster/cns/cns.htm. Nominations are open to all members of the computational neuroscience community (including students and postdocs). All nominations should be accompanied by a brief (1-3 paragraph) statement outlining the candidate's past affiliation with the meeting and/or vision for the future direction of the meeting. Self-nominations are accepted. Please send nominations by e-mail to Christiane Linster at CL243 at cornell.edu by November 15th, 2002. After receiving a complete list of nominees, board members will be elected by the current program committee (held over from the previous meeting structure) and the executive committee. The initial term of service for elected board members will be set at 1-5 years by lottery. This will ensure that in the future roughly equal numbers of board positions will open up for annual election. The main considerations for selecting board members will be diversity and continuity. It is expected that board members will have attended at least one CNS meeting in the past and will have some familiarity with the scope and tone of previous meetings. We anticipate that a number of changes will come up for discussion in the next few years, but feel that it is important to retain continuity until the new governing structure is established and working smoothly. That said, we are quite open to new ideas, and obtaining new contributing voices will be an important consideration in forming a diverse board. 
Other forms of diversity that will be considered include: level of professional advancement (faculty, postdoc, student), "home" discipline (e.g. math, biology, physics, engineering), level of analysis (e.g. cellular, network, systems), geographic location, and gender. Finally, we encourage any and all opinions regarding the future direction of the CNS meeting and the role that it should play in the field of computational neuroscience. Now is a particularly important time to voice your opinion, since the direction of the meeting over the next 5-10 years will likely be formed within the next several years (CL243 at cornell.edu). We hope to keep CNS a lively, inclusive meeting while striving for continued improvement in scientific quality. Sincerely, The CNS executive committee: Christiane Linster - President Todd Troyer - Vice President Erik De Schutter - Secretary Linda Larson-Prior - Treasurer **************************************************** Christiane Linster Dept. of Neurobiology and Behavior Tel: (607) 254-4331 Cornell University Fax: (607) 254-4308 W249 Seeley G. Mudd Hall, Ithaca, NY 14853 cl243 at cornell.edu http://www.nbb.cornell.edu/neurobio/linster *************************************************** From bert at snn.kun.nl Mon Oct 28 08:37:04 2002 From: bert at snn.kun.nl (Bert Kappen) Date: Mon, 28 Oct 2002 14:37:04 +0100 (MET) Subject: Paper available: On the storage capacity of attractor neural networks with depressing synapses Message-ID: Dear all, I would like to announce the following paper, which has been accepted for Physical Review E: On the storage capacity of attractor neural networks with depressing synapses J.J. Torres, L. Pantic and H.J. Kappen We compute the capacity of a binary neural network with dynamic depressing synapses to store and retrieve an infinite number of patterns. We use a biologically motivated model of synaptic depression and a standard mean-field approach. 
We find that at $T=0$ the critical storage capacity decreases with the degree of depression. We confirm the validity of our main mean-field results with numerical simulations. In short: dynamic synapses drastically reduce the storage capacity of attractor neural networks. ftp://ftp.mbfys.kun.nl/pub/snn/pub/reports/TPK_PRE2002.ps Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: www.snn.kun.nl/~bert From pavel at PH.TN.TUDelft.NL Tue Oct 29 03:23:13 2002 From: pavel at PH.TN.TUDelft.NL (Pavel Paclik) Date: 29 Oct 2002 09:23:13 +0100 Subject: PhD Position in Pattern Recognition Message-ID: Ph.D. Position Available - Deadline November 15, 2002 ----------------------------------------------------- The Delft Pattern Recognition Group, The Netherlands, is seeking an outstanding Ph.D. student with a background in engineering, physics or computer science, good computer programming skills, an interest in pattern recognition or artificial intelligence, and preferably experience in spectral data analysis. The Ph.D. student will participate in an applied project on hyperspectral image analysis funded by the Technology Foundation STW. In the project, industrial applications of hyperspectral imaging are studied with an emphasis on the integration of spectral and spatial information. Automatic learning procedures based on unsupervised and partially supervised techniques will be basic components to be studied, used and developed further in this project. The research will be done in cooperation with, among others, Unilever Research Vlaardingen, AgriVision, Wageningen and the Institute of Applied Physics in Delft. The Ph.D. position is granted for 4 years. The nomination is formally for 2 years and is expected to be extended for another 2 years. The project is integrated in the research on statistical pattern recognition fundamentals and applications of the Delft Pattern Recognition Group. 
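[Editorial aside: as a toy illustration of the depressing-synapse effect described in the Torres, Pantic and Kappen abstract above, the sketch below retrieves a stored pattern in a small Hopfield-style network with and without a crude resource-depletion rule on active synapses. The depression rule, network size, and parameters are arbitrary choices for illustration, not the paper's biologically motivated model or its mean-field analysis.]

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 400, 40                       # neurons, stored patterns (load P/N = 0.1)
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)
W = patterns.T @ patterns / N        # Hebbian couplings
np.fill_diagonal(W, 0.0)

def retrieve(u_dep, steps=50):
    """Synchronous retrieval from a noisy cue.  Each step, synapses from
    neurons that just fired lose a fraction u_dep of their resource x_j
    and recover slowly toward 1 -- a crude stand-in for dynamic depression."""
    s = patterns[0].copy()
    s[rng.random(N) < 0.2] *= -1     # corrupt 20% of the cue bits
    x = np.ones(N)                   # presynaptic resources
    for _ in range(steps):
        s = np.where(W @ (x * s) >= 0, 1.0, -1.0)
        x[s > 0] *= 1.0 - u_dep      # deplete synapses of active neurons
        x += 0.05 * (1.0 - x)        # slow recovery toward full resources
    return float(patterns[0] @ s) / N

m_static = retrieve(0.0)             # ordinary Hopfield network
m_depressed = retrieve(0.9)          # strongly depressing synapses
print(m_static, m_depressed)
```

Without depression the cue is cleaned up to a state close to the stored pattern; with strong depression the effective signal-to-noise ratio drops and retrieval typically degrades, in the spirit of the paper's capacity result (exact numbers depend on the arbitrary parameters above).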
This research is done by 7 researchers, while the entire group consists of about 35 researchers. The Ph.D. student will be a member of ASCI, the Advanced School for Computers and Imaging, a nationwide Ph.D. school. For information on the project see: http://www.ph.tn.tudelft.nl/~pavel/spectra/index.html It may also be obtained by email from Pavel Paclik (pavel at ph.tn.tudelft.nl), primary researcher in the project. For information on the research on statistical pattern recognition see: http://www.ph.tn.tudelft.nl/Research/neural/index.html For information on the Delft Pattern Recognition Group see: http://www.ph.tn.tudelft.nl Applications should be sent before November 15, 2002 to the project leader, Robert P.W. Duin: duin at ph.tn.tudelft.nl. From wahba at stat.wisc.edu Tue Oct 29 21:40:01 2002 From: wahba at stat.wisc.edu (Grace Wahba) Date: Tue, 29 Oct 2002 20:40:01 -0600 (CST) Subject: MSVM, Poly.Penalized.Likelihood, Nonparametric.LASSO-variable selector Message-ID: <200210300240.UAA29879@spline.stat.wisc.edu> Announcing papers recently available via http://www.stat.wisc.edu/~wahba click on TRLIST 1, 2 and 3 below present the Multicategory Support Vector Machine (MSVM), a generalization of the SVM which classifies to one of k categories via a single optimization problem. 4 contrasts the MSVM with the Polychotomous Penalized Likelihood estimate, which estimates k probabilities, one for each category. 5 and 6 present new nonparametric variable selection and model building via likelihood basis pursuit, a generalization of the LASSO. 1 Lee, Y., Lin, Y. and Wahba, G. "Multicategory Support Vector Machines, Theory, and Application to the Classification of Microarray Data and Satellite Radiance Data" TR 1064, September 2002. 2 Lee, Y. "Multicategory Support Vector Machines, Theory, and Application to the Classification of Microarray Data and Satellite Radiance Data" TR 1063, September 2002. Ph.D. thesis. 3 Lee, Y. and Lee, C.-K. 
Classification of Multiple Cancer Types by Multicategory Support Vector Machines Using Gene Expression Data. (ps) TR 1051, April 2002, minor revisions July 2002. 4 Wahba, G. "Soft and Hard Classification by Reproducing Kernel Hilbert Space Methods" (ps) (pdf) TR 1067, October 2002. To appear in Proceedings of the National Academy of Sciences. 5 Zhang, H. "Nonparametric Variable Selection and Model Building Via Likelihood Basis Pursuit" TR 1066, September 2002. Ph.D. thesis. 6 Zhang, H., Wahba, G., Lin, Y., Voelker, M., Ferris, M., Klein, R. and Klein, B. Variable Selection and Model Building via Likelihood Basis Pursuit (ps) (pdf) TR 1059, July 2002. From oreilly at grey.colorado.edu Wed Oct 30 16:05:07 2002 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Wed, 30 Oct 2002 14:05:07 -0700 Subject: ICS Director Position: CU Boulder Message-ID: <200210302105.g9UL57m21522@grey.colorado.edu> People with a computational approach would be strongly considered for this position -- if you have any questions you can also contact me or Mike Mozer (mozer at cs.colorado.edu) or Yuko Munakata (munakata at psych.colorado.edu). - Randy DIRECTOR OF THE INSTITUTE OF COGNITIVE SCIENCE The Institute of Cognitive Science at the University of Colorado, Boulder, is looking for a cognitive psychologist to become its Director. The Director should be a person with a distinguished academic career and some experience and interest in the duties of an executive position in academic administration. Exceptional individuals in any of the cognitive sciences other than cognitive psychology will also be considered. The Director is expected to create and promulgate the vision for the Institute's future. This requires leadership skills and active interfacing with members of the supporting departments, CU administration, and various national and international agencies and businesses whose agendas are relevant to ICS research and academic program goals. 
The core departments that make up the Institute of Cognitive Science include Psychology, Computer Science, Education, Linguistics, Philosophy, and Speech/Language & Hearing Sciences. The Director must be eligible for tenure as Full Professor in one of these home departments. The expected balance of duties is 40% service, 40% research, and 20% teaching. The position start date is August 2003. The desired application deadline is December 31, 2002, but we will accept applications until the position is filled. Contact Dr. Donna Caccamise, Associate Director, Institute of Cognitive Science, University of Colorado, Boulder CO 80309-0344. The University of Colorado at Boulder is committed to diversity and equality in education and employment. From christof at teuscher.ch Wed Oct 30 07:20:48 2002 From: christof at teuscher.ch (Christof Teuscher) Date: Wed, 30 Oct 2002 13:20:48 +0100 Subject: [IPCAT2003] - Second Call for Papers Message-ID: <3DBFCEA0.3054293A@teuscher.ch> ================================================================ We apologize if you receive multiple copies of this email. Please distribute this announcement to all interested parties. 
For removal, go to http://lslwww.epfl.ch/ipcat2003/del.html ================================================================ **************************************************************** SECOND CALL FOR PAPERS **************************************************************** ** IPCAT2003 ** Fifth International Workshop on Information Processing in Cells and Tissues September 8 - 11, 2003 Swiss Federal Institute of Technology Lausanne (EPFL) Lausanne, Switzerland http://lslwww.epfl.ch/ipcat2003 **************************************************************** Important Dates: ---------------- Paper submission: February 28, 2003 Notification of acceptance: May 28, 2003 Camera-ready copy: July 11, 2003 Description: ------------ The aim of the series of IPCAT workshops is to bring together a multidisciplinary core of scientists who are working in the general area of modeling information processing in biosystems. A general theme is the nature of biological information and the ways in which it is processed in biological and artificial cells and tissues. The key motivation is to provide a common ground for dialogue and interaction, without emphasis on any particular research constituency, or way of modeling, or single issue in the relationship between biology and information. IPCAT2003 will highlight recent research and seek to further the dialogue, exchange of ideas, and development of interactive viewpoints between biologists, physicists, computer scientists, technologists and mathematicians that have been progressively expanded throughout the IPCAT series of meetings (since 1995). The workshop will feature sessions of selected original research papers grouped around emergent themes of common interest, and a number of discussions and talks focusing on wider themes. IPCAT2003 will give particular attention to morphogenetic and ontogenetic processes and systems. 
IPCAT2003 encourages experimental, computational, and theoretical articles that link biology and the information processing sciences and that encompass the fundamental nature of biological information processing, the computational modeling of complex biological systems, evolutionary models of computation, the application of biological principles to the design of novel computing systems, and the use of biomolecular materials to synthesize artificial systems that capture essential principles of natural biological information processing. Accepted papers will be published in a special issue of the BioSystems journal (Elsevier Science). Topics of Interest: ------------------- Topics to be covered include, but are not limited to, the following:
o Self-organizing, self-repairing, and self-replicating systems
o Evolutionary algorithms
o Machine learning
o Evolving, adapting, and neural hardware
o Automata and cellular automata
o Information processing in neural and non-neural biosystems
o Parallel distributed processing biosystem models
o Information processing in bio-developmental systems
o Novel bio-information processing systems
o Autonomous and evolutionary robotics
o Bionics, neural implants, and bio-robotics
o Molecular evolution and theoretical biology
o Enzyme and gene networks
o Modeling of metabolic pathways and responses
o Simulation of genetic and ecological systems
o Single neuron and sub-neuron information processing
o Microelectronic simulation of bio-information systemics
o Artificial bio-sensor and vision implementations
o Artificial tissue and organ implementations
o Applications of nanotechnology
o Quantum informational biology
o Quantum computation in cells and tissues
o DNA computing
Special Session: ---------------- Morphomechanics of the Embryo and Genome + Artificial Life -> Embryonics Artificial intelligence started with imitation of the adult brain, and artificial life has dealt mostly with the adult organism and its evolution, in that the 
span from genome to organism has been short or nonexistent. Embryonics is the attempt to grow artificial life in a way analogous to real embryonic development. This session will include speakers grappling with both ends of the problem. Papers for this special session should be submitted through the regular procedure. Organizers: R. Gordon, Lev V. Beloussov For up-to-date information, consult the IPCAT2003 web site: http://lslwww.epfl.ch/ipcat2003 We are looking forward to seeing you in beautiful Lausanne! Sincerely, Christof Teuscher IPCAT2003 Program Chair ---------------------------------------------------------------- Christof Teuscher Swiss Federal Institute of Technology Lausanne (EPFL) christof at teuscher.ch http://www.teuscher.ch/christof ---------------------------------------------------------------- IPCAT2003: http://lslwww.epfl.ch/ipcat2003 ---------------------------------------------------------------- From ahirose at eis.t.u-tokyo.ac.jp Thu Oct 31 01:35:26 2002 From: ahirose at eis.t.u-tokyo.ac.jp (Akira Hirose) Date: Thu, 31 Oct 2002 15:35:26 +0900 (JST) Subject: Complex-valued Neural Networks: Special Invited Session in KES2003 Message-ID: <200210310635.g9V6ZQwH007694@pekoe.eis.t.u-tokyo.ac.jp> Special Invited Session "Complex-Valued Neural Networks" KES 2003 (7th International Conference on Knowledge-Based Intelligent Information & Engineering Systems) 3-5 September 2003, St Anne's College, University of Oxford, U.K. Conference Web Site: http://www.bton.ac.uk/kes/kes2003/ Topics and Objectives of the Special Invited Session: In recent years, complex-valued neural networks have expanded their fields of application to optoelectronic imaging, remote sensing, quantum neural devices and systems, and the spatiotemporal analysis of physiological neural systems, as well as artificial neural information processing. At the same time, this potentially wide applicability calls for new theories enabling novel and more effective functions and mechanisms. 
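[Editorial aside: for readers new to the area, a single complex-valued neuron can be sketched in a few lines. The amplitude-phase ("split") activation below is one common choice in the complex-valued network literature; the weights, inputs, and parameters are invented for illustration.]

```python
import numpy as np

def split_activation(z):
    """'Split' amplitude-phase activation often used in complex-valued
    networks: squash the magnitude with tanh, preserve the phase."""
    return np.tanh(np.abs(z)) * np.exp(1j * np.angle(z))

rng = np.random.default_rng(1)
n_in = 4
# Complex weights and unit-amplitude, phase-coded inputs.
w = (rng.normal(size=n_in) + 1j * rng.normal(size=n_in)) / np.sqrt(n_in)
x = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=n_in))
y = split_activation(np.sum(w * x))   # complex weighted sum, then activation
print(abs(y), np.angle(y))
```

The output carries two quantities at once, a bounded amplitude and an unconstrained phase, which is one reason such networks suit wave-like signals (optics, radar, physiological oscillations).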
This session aims to discuss the latest progress in the field and to explore its future prospects. Instructions for Authors: Please see the Conference Web Site above. Publication: The Conference Proceedings will be published by a major publisher, for example IOS Press of Amsterdam. Extended versions of selected papers will also be considered for publication in the International Journal of Knowledge-Based Intelligent Engineering Systems, http://www.bton.ac.uk/kes/journal/ Important Dates (tentative): First of all, please contact the Session Chair below. Deadline for submission intention: December 1, 2002 Deadline for receipt of papers by Session Chair: February 1, 2003 Notification of acceptance: March 1, 2003 Camera-ready papers to session chair by: April 1, 2003 Contact: Session Chair: Akira Hirose, Assoc. Prof. Department of Electrical and Electronic Engineering The University of Tokyo 7-3-1 Hongo, Bunkyo-ku, Tokyo 153-8656, Japan Email: ahirose at ee.t.u-tokyo.ac.jp http://www.eis.t.u-tokyo.ac.jp/ *Note* Those who are interested in submitting are requested to contact the Session Chair by ca. 1 December 2002. From dwang at cis.ohio-state.edu Wed Oct 30 16:10:22 2002 From: dwang at cis.ohio-state.edu (DeLiang Wang) Date: Wed, 30 Oct 2002 17:10:22 -0400 Subject: IEEE TNN Special Issue on temporal coding Message-ID: <3DC03C8C.B2A5BA8E@cis.ohio-state.edu> IEEE Transactions on Neural Networks Call for Papers Special Issue on "Temporal Coding for Neural Information Processing" Largely motivated by neurobiological discoveries, neural network research is currently witnessing a significant shift of emphasis towards temporal coding, which uses time as an extra degree of freedom in neural representations. Temporal coding is passionately debated in neuroscience and related fields, but in the last few years a large volume of physiological and behavioral data has emerged that supports a key role for temporal coding in the brain. 
In neural networks, a great deal of research is undertaken under the topics of nonlinear dynamics, oscillatory and chaotic networks, spiking neurons, and pulse-coupled networks. Various information processing tasks are investigated using temporal coding, including scene segmentation, figure-ground separation, classification, learning, associative memory, inference, motor control, and communication. Progress has been made that substantially advances the state of the art of neural computing. In many instances, however, neural models incorporating temporal coding are driven merely by the assertion that real neurons use impulses. It is often unclear whether, and to what extent, the temporal aspects of the models contribute to information processing capabilities. It is time to assess the role and potential of temporal coding in terms of information processing performance, and to provide the neural networks community with a comprehensive view of current approaches and issues. This special issue seeks to present, in a collective way, research that makes a clear contribution to addressing information processing tasks using temporal coding. The issue is intended not only to highlight successful uses of temporal coding in neural networks but also to clarify outstanding issues for future progress. 
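[Editorial aside: a minimal example of what "time as an extra degree of freedom" can mean is a latency code, sketched below with a leaky integrate-and-fire neuron whose analog input strength is read off from when its first spike occurs rather than from a firing rate. All parameters are arbitrary illustrations, not from the call.]

```python
def first_spike_time(i_input, tau=20.0, v_th=1.0, dt=0.1, t_max=200.0):
    """Leaky integrate-and-fire neuron, dv/dt = (-v + I) / tau.
    Returns the latency of the first spike, or None if the input
    never drives the membrane to threshold."""
    v, t = 0.0, 0.0
    while t < t_max:
        v += dt * (i_input - v) / tau
        t += dt
        if v >= v_th:
            return t
    return None

# Stronger inputs cross threshold earlier, so spike *timing*
# carries the analog input value (a latency code).
for i in (1.2, 2.0, 4.0):
    print(i, first_spike_time(i))
```

Analytically the latency is tau * ln(I / (I - v_th)) for I > v_th, so it decreases monotonically with input strength; inputs at or below threshold never spike at all.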
Suggested topics include but are not limited to the following: - Synchrony, desynchrony, and other temporal phenomena - Encoding and decoding in the temporal domain - Comparative issues in rate coding and temporal coding - Cognitive aspects of temporal/spatiotemporal phenomena in neural systems - Effects and uses of time delays - Potential roles of chaos, randomness and noise - Learning for temporal codes - Temporal/spatiotemporal information processing for: * Perceptual processing * Learning, memory, and reasoning * Motor control * Communication - Innovative applications - Hardware implementation The guest editors of the special issue are: Walter Freeman, University of California, Berkeley Robert Kozma, University of Memphis Andrzej Lozowski, Southern Illinois University Ali Minai, University of Cincinnati DeLiang Wang, Ohio State University Manuscripts will first be screened for topical relevance, and those that pass the screening process will undergo the standard review procedure of the IEEE Transactions on Neural Networks (see the instructions for authors for the Transactions). Paper submission deadline is May 30, 2003, and the special issue will be published by July 2004. 
Papers should be submitted in PDF format via email to the lead guest editor: DeLiang Wang Email: dwang at cis.ohio-state.edu http://www.cis.ohio-state.edu/~dwang From butz at illigal.ge.uiuc.edu Thu Oct 31 14:38:34 2002 From: butz at illigal.ge.uiuc.edu (Martin Butz) Date: Thu, 31 Oct 2002 13:38:34 -0600 (CST) Subject: Call for contributions - Anticipatory Behavior in Adaptive Learning Systems Message-ID: (We apologize for multiple copies) ############################################################################### C A L L F O R C O N T R I B U T I O N S ABiALS 2002 Post Proceedings Book: "Anticipatory Behavior in Adaptive Learning Systems: Foundations, Theories, and Systems" to appear in Springer's Lecture Notes in Artificial Intelligence ############################################################################### The first workshop on Anticipatory Behavior in Adaptive Learning Systems (ABiALS 2002) was held on August 11, 2002 in Edinburgh, Scotland http://www-illigal.ge.uiuc.edu/ABiALS in association with the Seventh International Conference on Simulation of Adaptive Behavior (SAB'02) http://www.isab.org.uk/sab02/ This upcoming volume addresses the question of when, where, and how anticipations are useful in adaptive systems. Anticipations refer to the influence of future predictions or future expectations on behavior and learning. ABiALS 2002 was the first interdisciplinary gathering of people interested in how anticipations can be used efficiently to improve behavior and learning. Four fundamentally different types of systems were distinguished: (1) Implicitly anticipatory systems are those that act/learn in an intelligent way but do not include any predictive bias in the applied learning and/or behavioral mechanisms. (2) Payoff anticipatory systems are those systems that do compare payoff predictions for action decisions but do not use any state predictions. 
(3) Sensory anticipatory systems are systems that use sensory predictions to improve perceptual processing (e.g. preparatory attention). (4) State anticipatory systems are systems that form explicit future predictions/expectations that influence action decisions and learning. The book "Anticipatory Behavior in Adaptive Learning Systems" will address the latter two of the four types of systems. Submissions are welcome that are concerned with any type of sensory anticipatory or state anticipatory system. ___________________________________________________________________________ Aim and Objectives: Most research in artificial adaptive behavior over recent years concerned with model learning and anticipatory behavior has focused on the model-learning side. Research is particularly engaged in online generalized model learning. Until now, though, the model has been exploited mainly to show that exploitation is possible or that an appropriate model exists in the first place. Only a few applications are available that show the utility of the model for the simulation of anticipatory behavior. The aim of this book is to lay out the foundations for a study of anticipatory learning and behavior. The content will be divided roughly into three chapters. The first chapter will provide psychological background that not only supports the presence of anticipatory mechanisms in ``higher'' animals and humans but also sheds light on when and why anticipatory mechanisms can be useful. Chapter 2 will provide foundations and frameworks for the study of anticipatory mechanisms, distinguishing fundamentally different mechanisms. Finally, Chapter 3 will contain examples of implemented frameworks and systems. ___________________________________________________________________________ Essential questions: * How can anticipations influence the adaptive behavior of an artificial learning system? 
* How can anticipatory adaptive behavior be implemented in an artificial learning system?
* How does an incomplete model influence anticipatory behavior?
* How can anticipations guide further model learning?
* How can anticipations control attention?
* Can anticipations be used for the detection of special environmental properties?
* What are the benefits of anticipations for adaptive behavior?
* What is the trade-off between simple bottom-up stimulus-response driven behavior and more top-down anticipatory driven behavior?
* In what respect does anticipation mediate between low-level environmental processing and more complex cognitive simulation?
* What role do anticipations play in the implementation of motivations and emotions?
___________________________________________________________________________ Submission: Submissions for the book should address one or more of the above questions or provide appropriate psychological background on anticipatory mechanisms in animals and humans. Papers with inappropriate content will be rejected. The book is intended to be interdisciplinary and open to all approaches to anticipatory behavior. There is no restriction on the type of anticipatory learning system or on the representation of predictions for anticipatory behavior and learning. Papers will be reviewed for acceptance by the program committee and the organizers. Papers should be submitted electronically to one of the organizers via email in pdf or ps format. Electronic submission is strongly encouraged. If you cannot submit your contribution electronically, please contact one of the organizers. Submitted papers should be between 10 and 20 pages in 10pt, one-column format. Please use the LNCS style available at: http://www.springer.de/comp/lncs/authors.html. Submission deadline is DECEMBER 20, 2002. 
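[Editorial aside: to make the workshop's taxonomy concrete, here is a toy "state anticipatory" agent in a hypothetical ten-state corridor world. It consults an explicit forward model to predict the next state of each candidate action before acting, rather than reacting to the current stimulus or comparing only payoff predictions. The world, model, and policy are invented for illustration.]

```python
# Hypothetical corridor world: states 0..9, goal at 9, actions -1 / +1.
GOAL = 9

def model(state, action):
    """The agent's (here: perfect) forward model -- an explicit
    next-state prediction, the hallmark of a state anticipatory system."""
    return max(0, min(9, state + action))

def state_anticipatory_policy(state):
    # Choose the action whose *predicted* next state looks best.
    return min((-1, +1), key=lambda a: abs(GOAL - model(state, a)))

state, trace = 0, []
while state != GOAL:
    a = state_anticipatory_policy(state)
    state = model(state, a)
    trace.append(state)
print(trace)   # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

A payoff anticipatory system would instead compare learned action values with no notion of "which state comes next"; the two coincide here only because the world is trivial.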
For more information please refer to the workshop page: http://www-illigal.ge.uiuc.edu/ABiALS/ Please also see our introductory talk to the workshop for more detailed information on anticipations and different types of anticipatory behavior: http://www-illigal.ge.uiuc.edu/ABiALS/ABiALS2002Introduction.htm There is also an introductory paper available that provides further general information on the topic: http://www-illigal.ge.uiuc.edu/ABiALS/Papers/ABiALS2002Intro.pdf ___________________________________________________________________________ Important Dates: 20 December 2002: Deadline for submissions 24 January 2003: Notification of acceptance 21 February 2003: Camera-ready version for LNAI volume ___________________________________________________________________________ Program Committee: Emmanuel Daucé Faculté des sciences du sport Université de la Méditerranée Marseille, France Ralf Moeller Cognitive Robotics Max Planck Institute for Psychological Research Munich, Germany Wolfgang Stolzmann DaimlerChrysler AG Berlin, Germany Jun Tani Lab. for Behavior and Dynamic Cognition Brain Science Institute, RIKEN 2-1 Hirosawa, Wako-shi, Saitama, 351-0198 Japan Stewart W. Wilson President Prediction Dynamics USA ___________________________________________________________________________ Organizers: Martin V. 
Butz, Illinois Genetic Algorithms Laboratory (IlliGAL), University of Illinois at Urbana-Champaign, Illinois, USA also: Department of Cognitive Psychology, University of Wuerzburg, Germany butz at illigal.ge.uiuc.edu http://www-illigal.ge.uiuc.edu/~butz Pierre Gérard, AnimatLab, University Paris VI, Paris, France pierre.gerard at lip6.fr http://animatlab.lip6.fr/Gerard Olivier Sigaud, AnimatLab, University Paris VI, Paris, France olivier.sigaud at lip6.fr http://animatlab.lip6.fr/Sigaud From Marc.Maier at snv.jussieu.fr Tue Oct 1 10:35:26 2002 From: Marc.Maier at snv.jussieu.fr (Marc Maier) Date: Tue, 01 Oct 2002 16:31:26 +0200 Subject: Job announcement in Computational Neuroscience, Paris Message-ID: <3D99B2AD.10357644@snv.jussieu.fr> Job announcement in Computational Neuroscience We will open a permanent position in Computational Neuroscience at the level of Maître de conférences (equivalent to Lecturer or Assistant Professor) at University Paris-VII in fall 2003. Compulsory qualification for candidature ends October 8, 2002 (see: http://www.education.gouv.fr/personnel/enssup/antares/default.htm). The opening of the position will be officially announced in early 2003. Research will be centred on the learning and control of upper limb movements, including manual dexterity and object manipulation. We seek candidates with strong experience in biologically inspired neural network modeling, motor control and biomechanics, and a background in Neuroscience. A francophone background or a working knowledge of French is required for the teaching duties at University Paris-VII that go with the position. Teaching will involve the domains of Neuroscience, Computational Neuroscience and Bioinformatics. The position will be attached to INSERM U.483, headed by Y. Burnod, who wishes to enlarge and reinforce our existing modeling team, which comprises wide-ranging and complementary expertise from motor neuroscience, through biophysical and neural network modeling, to robotics. 
For further information, contact: Marc.Maier at snv.jussieu.fr - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - Prof. Marc Maier Dept. of Biology, University Paris-VII and INSERM U483 Bldg. C, 6th floor 9, Quai St-Bernard 75005 Paris tel: +33 1 44 27 34 21 From myosioka at brain.riken.go.jp Wed Oct 2 16:31:33 2002 From: myosioka at brain.riken.go.jp (Masahiko Yoshioka) Date: Thu, 03 Oct 2002 05:31:33 +0900 (JST) Subject: Preprint: Linear stability analysis for networks of spiking neurons Message-ID: <20021003.053133.74751205.myosioka@brain.riken.go.jp> Dear Connectionists, I would like to announce the availability of the preprint of my recent paper, "Linear stability analysis of retrieval state in associative memory neural networks of spiking neurons" M. Yoshioka, Phys. Rev. E in press Available at http://arxiv.org/abs/cond-mat/0209686 Abstract: We study associative memory neural networks of Hodgkin-Huxley-type spiking neurons in which multiple periodic spatio-temporal patterns of spike timing are memorized as limit-cycle-type attractors. In encoding the spatio-temporal patterns, we assume spike-timing-dependent synaptic plasticity with an asymmetric time window. Analysis of the periodic solution of the retrieval state reveals that if the area of the negative part of the time window equals that of the positive part, then crosstalk among encoded patterns vanishes. A phase transition due to the loss of stability of the periodic solution is observed when we assume a fast alpha function for the direct interactions among neurons. In order to evaluate the critical point of this phase transition, we employ Floquet theory, in which the stability problem for an infinite number of spiking neurons interacting via alpha functions is reduced to an eigenvalue problem for a finite-size matrix. 
Numerical integration of the single-body dynamics yields the explicit value of the matrix, which enables us to determine the critical point of the phase transition with a high degree of precision. --- Masahiko Yoshioka Brain Science Institute, RIKEN Hirosawa 2-1, Wako-shi, Saitama, 351-0198, Japan From intneuro at net2000.com.au Thu Oct 3 15:41:22 2002 From: intneuro at net2000.com.au (INTEGRATIVE NEUROSCIENCE) Date: Thu, 03 Oct 2002 12:41:22 -0700 Subject: order your copy now! Message-ID: <3D9C9D61.B483D875@net2000.com.au> The following paper is scheduled to appear in the forthcoming issue of the Journal of Integrative Neuroscience, volume 1, issue 2, December 2002: AN INTEGRATIVE THEORY OF COGNITION A framework is outlined for connecting brain imaging activity with the underlying biophysical properties of neural networks, and their mechanisms of action and organizing principles. The main thrust of the framework is a dynamic theory of semantics based on the functional integration of biophysical neural networks. It asserts that higher brain function arises from both synaptic and extrasynaptic integration in the neuropil, where information on environmental changes is represented dynamically through a discourse of semantics. Consequently, integrative neural modeling is shown to be an important methodology for analyzing the response activities of functional imaging studies in elucidating the relationship between brain structure, function and behavior. Roman R. Poznanski Centre de Recherche en Physiologie Intégrative Hôpital Tarnier CHU Cochin-Port-Royal, 89, rue d'Assas, Paris 75006 FRANCE poznan at integrative-physiology.org ----------------------------- The Journal of Integrative Neuroscience will serve as a focus for new discoveries in the advancement of experimental and theoretical neuroscience. The journal covers the following disciplines: 1. PDEs and nonlinear dynamical systems; 2. Noninformationalist/noncomputationalist theories of mind; 3. 
Integrative neural modeling; 4. Functional imaging (PET/fMRI); 5. Experiments linking molecular with cellular phenomena; 6. Interregional functional connectivity (anatomy and physiology); 7. Philosophical foundations of neuroscience; 8. Neural engineering applications. Computational neuroscience with its computer metaphors provides little help to foundational theory because of its lack of relation to physical law and its weak relationship to neurobiological processes. The new field of integrative neuroscience encompasses the requirements of physical-biological foundations, expansive inclusiveness of scope across biological and psychological variables, multi-leveled hierarchical complexity, and analytical tools demanded by the brain's complexity. Integrative neuroscience is a large-scale synthesis whose scope and time are right. It embodies the future directions of theoretical neuroscience, and should provide many bridging recognitions. http://www.worldscinet.com/jin/mkt/editorial.shtml (Only vol. 1, issue 1 is free). For orders within Europe, please contact the Imperial College Press sales department at: Tel: +44 (0)20 7836-0888 Fax: +44 (0)20 7836-2020 during U.K. business hours. Outside Europe, our books and journals are distributed by World Scientific Publishing Co.: 5 Toh Tuck Link, SINGAPORE 596224 Fax: 65-6467-7667 Tel: 65-6466-5775 E-mail: wspc at wspc.com.sg Price Information: ISSN: 0219-6352; Vol.
1/2002; 2 Issues Special Rates: Individuals (Print Only) US$56 Institutions/Libraries US$ 84 -------------------------- From james at mis.mpg.de Thu Oct 3 07:51:44 2002 From: james at mis.mpg.de (Matthew James) Date: Thu, 3 Oct 2002 13:51:44 +0200 (MET DST) Subject: PhD Thesis: Dynamics of Synaptically Interacting Integrate-and-Fire Neurons Message-ID: <200210031151.NAA19358@kopernikus.mis.mpg.de> Dear Colleagues, I would like to draw your attention to my PhD thesis "Dynamics of Synaptically Interacting Integrate-and-Fire Neurons" supervised by Professor Paul Bressloff and Dr Steve Coombes at the Department of Mathematical Sciences, Loughborough University, UK. Available at: http://www.lboro.ac.uk/departments/ma/pg/theses/mampj-abs.html Abstract: Travelling waves of activity have been experimentally observed in many neural systems. The functional significance of such travelling waves is not always clear. Elucidating the mechanisms of wave initiation, propagation and bifurcation may therefore have a role to play in ascertaining the function of such waves. Previous treatments of travelling waves of neural activity have focussed on the mathematical analysis of travelling pulses and numerical studies of travelling waves. It is the aim of this thesis to provide insight into the propagation and bifurcation of travelling waveforms in biologically realistic systems. There is a great deal of experimental evidence which suggests that the response of a neuron is strongly dependent upon its previous activity. A simple model of this synaptic adaptation is incorporated into an existing theory of strongly coupled discrete integrate-and-fire (IF) networks. Stability boundaries for synchronous firing shift in parameter space according to the level of adaptation, but the qualitative nature of solutions is unaffected. The level of synaptic adaptation is found to cause a switch between bursting states and those which display temporal coherence. 
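The integrate-and-fire neuron at the heart of both the discrete networks above and the continuum analysis that follows reduces a cell to a leaky voltage that resets on reaching threshold. As a minimal illustration (a generic textbook leaky IF neuron, not the thesis's adapted, synaptically coupled model), note that for constant suprathreshold drive the inter-spike interval has a closed form, which is what makes IF analysis tractable:

```python
import math

def lif_spike_times(I, t_end, tau=10.0, v_th=1.0, v_reset=0.0, dt=0.01):
    """Leaky integrate-and-fire: tau dv/dt = -v + I, spike and reset at v_th."""
    v, t, spikes = v_reset, 0.0, []
    while t < t_end:
        v += dt * (I - v) / tau      # Euler step of the membrane equation
        t += dt
        if v >= v_th:
            spikes.append(t)
            v = v_reset              # instantaneous reset after a spike
    return spikes

def lif_isi(I, tau=10.0, v_th=1.0, v_reset=0.0):
    """Closed-form inter-spike interval for constant suprathreshold input I."""
    return tau * math.log((I - v_reset) / (I - v_th))
```

With I = 2, tau = 10 and unit threshold, the analytic interval is 10 ln 2, about 6.93, and the simulated spike train matches it up to the integration step.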
Travelling waves are analysed within a framework for a one-dimensional continuum of integrate-and-fire neurons. Self-consistent speeds and periods are determined from integro-differential equations. A number of synaptic responses (alpha-function and passive and quasi-active dendrites) produce qualitatively similar results in the travelling pulse case. For IF neurons, an additional refractory mechanism needs to be introduced in order to prevent arbitrarily high firing rates. Different mathematical formulations are considered, with each producing similar results. Dendrites are extensions of a neuron which branch repeatedly, and their electrical properties may vary; in particular, the dendritic membrane may be active. Under certain conditions, this active membrane gives rise to a membrane impedance that displays a prominent maximum at some nonzero resonant frequency. Dispersion curves which relate the speed of a periodic travelling wave to its period are constructed for the different synaptic responses, with additional oscillatory behaviour apparent in the quasi-active dendritic regime. Stationary points of these dispersion curves are shown to be critical for the formation of multi-periodic wave trains. It is found that periodic travelling waves with two periods bifurcate from trains with a single period via a drift in the spike times at stationary points in the dispersion curve. Some neurons rebound and fire after release from sustained inhibition. Many previous mathematical treatments have not included the effect of this activity. Analytical studies of a simple model which exhibits post-inhibitory rebound show that these neurons can support half-centre oscillations and periodic travelling waves. In contrast to IF networks, only a single travelling pulse wavespeed is possible in this network. Simulations of this biophysical model show broad agreement with the analytical solutions and provide insight into more complex waveforms. Results of the thesis are presented in a discussion along with possible directions for future study.
Noise, inhomogeneous media and higher spatial dimensions are suggested. Keywords: biophysical models, dendrites, integrate-and-fire, neural coding, neural networks, post-inhibitory rebound, synaptic adaptation, travelling waves ---------------------------------------------------- Dr. Matthew P. James Max-Planck-Institute for Mathematics in the Sciences Inselstrasse 22 - 26 04103 Leipzig / Germany Phone: +49-341-9959-531 Fax: +49-341-9959-658 Email: james at mis.mpg.de URL: http://personal-homepages.mis.mpg.de/james/ From owi at imm.dtu.dk Thu Oct 3 05:23:48 2002 From: owi at imm.dtu.dk (Ole Winther) Date: Thu, 03 Oct 2002 11:23:48 +0200 Subject: Matlab ICA packages available Message-ID: <3D9C0CA4.4A8083A1@imm.dtu.dk> [Our sincere apologies if you receive multiple copies of this email] We are happy to announce the availability of three Matlab software packages for Independent Component Analysis (ICA). The algorithms are easy to use; by default, no parameters need to be set. All algorithms come with demonstration scripts. The packages contain the following three algorithms: 1. Maximum likelihood (Infomax) - the Bell and Sejnowski 1995 algorithm Optimization is performed by an effective second order method. 2. Mean Field - Bayesian ICA alg. by Højen-Sørensen, Winther and Hansen Linear and instantaneous mixing. Estimation of noise covariance - Gaussian noise model. General mixing matrix - quadratic, over- and under-complete - free/positivity constraint estimation. Variety of source distributions, e.g. - exponential for positive sources - Gaussian for prob. PCA and factor analysis - bi-Gauss for negative kurtosis sources - heavy tailed for positive kurtosis sources. 3. Molgedey and Schuster - The Molgedey-Schuster decorrelation algorithm Square mixing matrix and no noise. Very fast - no iterations. The delay Tau is estimated. Log likelihoods are calculated for all three algorithms.
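The Molgedey-Schuster decorrelation algorithm listed above can be summarized in a few lines: it jointly diagonalizes the zero-lag and time-lagged covariance matrices of the mixtures, so separation needs a single eigendecomposition and no iterations. A minimal NumPy sketch of the idea (an illustration only, not the announced Matlab toolbox; here the lag tau is supplied by hand rather than estimated as in the package):

```python
import numpy as np

def molgedey_schuster(X, tau=1):
    """Decorrelation ICA: separate sources with distinct autocorrelations
    by solving the generalized eigenproblem  C_tau w = lambda C_0 w."""
    X = X - X.mean(axis=1, keepdims=True)
    T = X.shape[1] - tau
    C0 = X[:, :T] @ X[:, :T].T / T            # zero-lag covariance
    Ct = X[:, :T] @ X[:, tau:tau + T].T / T   # lagged covariance
    Ct = 0.5 * (Ct + Ct.T)                    # symmetrize the lagged estimate
    # eigenvectors of C0^{-1} C_tau give the rows of the unmixing matrix
    _, V = np.linalg.eig(np.linalg.solve(C0, Ct))
    W = np.real(V).T
    return W @ X, W
```

For a square, noise-free mixture of sources whose lagged autocorrelations differ, the recovered signals match the originals up to scale and permutation, which is why the method is fast and iteration-free.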
As an additional feature the Bayesian Information Criterion (BIC) can be used for selecting the number of independent components. The packages are available from: http://isp.imm.dtu.dk/toolbox/ All comments and suggestions are welcome. Authors: Thomas Kolenda http://isp.imm.dtu.dk/staff/thko/ Ole Winther http://isp.imm.dtu.dk/staff/winther/ Lars Kai Hansen http://isp.imm.dtu.dk/staff/lkhansen/ Best regards, Ole Winther -- Associate Professor, Digital Signal Processing Informatics and Mathematical Modelling (IMM) Technical University of Denmark (DTU) http://www.imm.dtu.dk Tel: +45 4525 3895 Homepage: http://isp.imm.dtu.dk/staff/winther/ From qian at brahms.cpmc.columbia.edu Thu Oct 3 16:48:33 2002 From: qian at brahms.cpmc.columbia.edu (Ning Qian) Date: Thu, 3 Oct 2002 16:48:33 -0400 Subject: paper: a theory of depth from vertical disparity Message-ID: <200210032048.g93KmX519473@bach.cpmc.columbia.edu> Dear Colleagues, The following paper on "a physiological theory of depth perception from vertical disparity" can be downloaded from: http://brahms.cpmc.columbia.edu/publications/vertical.ps.gz or: http://brahms.cpmc.columbia.edu/publications/vertical.pdf Related work on horizontal disparity and interocular time delay (Pulfrich effect) can be found at http://brahms.cpmc.columbia.edu/ Best regards, Ning ------------------------------------------------------------------ A physiological theory of depth perception from vertical disparity Nestor Matthews, Xin Meng, Peng Xu, and Ning Qian Vision Research (in press) Abstract It has been known since the time of Helmholtz that vertical differences between the two retinal images can generate depth perception. Although many ecologically and geometrically inspired theories have been proposed, the neural mechanisms underlying the phenomenon remain elusive. 
Here we propose a new theory for depth perception from vertical disparity based on the oriented binocular receptive fields of visual cortical cells and on the radial bias of the preferred-orientation distribution in the cortex. The theory suggests that oriented cells may treat a vertical disparity as a weaker, equivalent horizontal disparity. It explains the induced effect, and the quadrant- and size-dependence of vertical disparity. It predicts that horizontal and vertical disparities should locally enhance or cancel each other according to their depth-signs, and that the effect of vertical disparity should be orientation dependent. These predictions were confirmed through psychophysical experiments. From kaynak at boun.edu.tr Fri Oct 4 02:49:11 2002 From: kaynak at boun.edu.tr (okyay kaynak) Date: Fri, 4 Oct 2002 09:49:11 +0300 Subject: Soft Computing days in Istanbul Message-ID: Please accept our apologies if you receive multiple copies of this message. SOFT COMPUTING DAYS IN ISTANBUL June 26-July 3, 2003 Dear Colleague; During the early summer of 2003, the magnificent city of Istanbul is going to host two major events in the area of soft computing. ICANN, the annual conference of the European Neural Network Society, and ICONIP, the annual conference of the Asia-Pacific Neural Network Assembly, will be held jointly (what better place than Istanbul for such an event!), and this will be followed by the biennial conference of the International Fuzzy Systems Association, the IFSA Congress. There will be an overlapping day between the two events for the neural and the fuzzy researchers to interact. For those who want to participate in both events, a special registration fee will be offered. City sightseeing and pre- and post-conference tours will complement the scientific programs for you to experience some of the cultural riches of Istanbul and Turkey. Make a note of the days in your diary and come and join us. ISTANBUL AWAITS YOU.....
Joint 13th International Conference on Artificial Neural Networks and 10th International Conference on Neural Information Processing: ICANN/ICONIP 2003 June 26 - 29, 2003, Istanbul, Turkey http://www.nn2003.org 10th IFSA World Congress: IFSA 2003 June 29 - July 2, 2003, Istanbul, Turkey http://www.ifsa2003.org If your interests lie more in the CONTROL area, there is a control applications conference too. 2003 IEEE Conference on Control Applications: CCA 2003 June 23-25, 2003, Istanbul, Turkey http://mecha.ee.boun.edu.tr/cca2003 From mzib at ee.technion.ac.il Fri Oct 4 09:45:53 2002 From: mzib at ee.technion.ac.il (Michael Zibulevsky) Date: Fri, 4 Oct 2002 16:45:53 +0300 (IDT) Subject: New paper: Relative Newton Method for Quasi-ML Blind Source Separation Message-ID: Announcing a paper ... Title: Relative Newton Method for Quasi-ML Blind Source Separation Author: Michael Zibulevsky ABSTRACT: The relative Newton method presented here for quasi-maximum-likelihood blind source separation significantly outperforms natural gradient descent in batch mode. The structure of the corresponding Hessian matrix allows fast inversion without assembling the matrix explicitly. Experiments with sparsely representable signals and images demonstrate super-efficient separation. URL of gzipped ps file: http://ie.technion.ac.il/~mcib/newt_ica_jmlr1.ps.gz Contact: mzib at ee.technion.ac.il =========================================================================== Michael Zibulevsky, Ph.D.
Email: mzib at ee.technion.ac.il Faculty of Electrical Engineering Phone: 972-4-829-4724 Technion - Israel Institute of Technology 972-4-832-3885 Haifa 32000, Israel Cell: 972-55-968297 http://ie.technion.ac.il/~mcib/ Fax: 972-4-829-4799 =========================================================================== From esann at dice.ucl.ac.be Fri Oct 4 07:03:37 2002 From: esann at dice.ucl.ac.be (esann) Date: Fri, 4 Oct 2002 13:03:37 +0200 Subject: CFP: ESANN'2003 European Symposium on Artificial Neural Networks Message-ID: <002501c26b95$b4238100$48ed6882@dice.ucl.ac.be> ESANN'2003 11th European Symposium on Artificial Neural Networks Bruges (Belgium) - April 23-24-25, 2003 Announcement and call for papers ===================================================== Technically co-sponsored by the IEEE Region 8, the IEEE Benelux Section, the IEEE Neural Networks Society (to be confirmed), the International Neural Networks Society and the European Neural Networks Society. The call for papers for the ESANN'2003 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). If you have difficulty connecting to the Web, please send an e-mail to esann at dice.ucl.ac.be and we will send you a full version of the call for papers. ESANN'2003 is organised in collaboration with the UCL (Université catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven).
Scope and topics ---------------- Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference for researchers on fundamentals and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN, in order to present their latest results and comprehensive surveys, and to discuss the future developments in this field. The ESANN'2003 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical and statistical aspects, in the context of function approximation, classification, control, time-series prediction, signal processing, vision, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, data analysis, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. It is important to mention that it is the topic of the paper which will decide whether it better fits into an oral or a poster session, not its quality. The selection criteria for posters will be identical to those for oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may indicate on the author submission form that they will only accept an oral presentation of their paper.
The following is a non-exhaustive list of topics covered during the ESANN conferences: - Models and architectures - Learning algorithms - Theory - Mathematics - Statistical data analysis - Classification - Approximation of functions - Time series forecasting - Nonlinear dimension reduction - Multi-layer Perceptrons - RBF networks - Self-organizing maps - Vector quantization - Support Vector Machines - Recurrent networks - Fuzzy neural nets - Hybrid networks - Bayesian neural nets - Cellular neural networks - Signal processing - Independent component analysis - Natural and artificial vision - Adaptive control - Identification of non-linear dynamical systems - Biologically plausible networks - Bio-inspired systems - Cognitive psychology - Evolutive learning - Adaptive behaviour Special sessions ---------------- Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address. Here is the list of special sessions that will be organized during the ESANN'2003 conference: 1. Links between neural networks and webs (M. Gori) 2. Mathematical aspects of neural networks (B. Hammer, T. Villmann) 3. Statistical learning and kernel-based algorithms (M. Pontil, J. Suykens) 4. Digital image processing with neural networks (A. Wismüller, U. Seiffert) 5. Industrial and agronomical applications of neural networks (L.M. Reyneri) 6. Neural networks for human/computer interaction (C.W. Omlin) Location -------- The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe.
Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known world-wide, famous for its architectural style, its canals, and its pleasant atmosphere. The conference will be organised in a hotel located near the centre of the town (walking distance). There is no obligation for the participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; rooms in the conference hotel may be booked at a preferential rate through the conference secretariat. A list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium. Proceedings and journal special issue ------------------------------------- The proceedings will include all communications presented at the conference (tutorials, oral and posters), and will be available on-site. Extended versions of selected papers will be published in the Neurocomputing journal (Elsevier, V. David Sanchez A. ed.). Call for contributions ---------------------- Prospective authors are invited to submit their contributions before 6 December 2002. The electronic submission procedure is detailed on the ESANN Web pages http://www.dice.ucl.ac.be/esann/. Authors must indicate their choice for oral or poster presentation at submission. They must also sign a written agreement that they will register for the conference and present the paper in case of acceptance of their submission. Authors of accepted papers will have to register before February 28, 2003. They will benefit from the advance registration fee.
Deadlines
---------
Submission of papers: December 6, 2002
Notification of acceptance: February 6, 2003
Symposium: April 23-25, 2003

Registration fees
-----------------
                 before March 15, 2003   after March 15, 2003
  Universities         425                    480
  Industries           525                    580

The registration fee includes the attendance to all sessions, the ESANN'2003 dinner, a copy of the proceedings, daily lunches (23-25 April 2003), and the coffee breaks. Conference secretariat ---------------------- ESANN'2003 d-side conference services phone: + 32 2 730 06 11 24 av. L. Mommaerts Fax: + 32 2 730 06 00 B - 1140 Evere (Belgium) E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/esann Steering and local committee (to be confirmed) ---------------------------- Hughes Bersini Univ. Libre Bruxelles (B) François Blayo Préfigure (F) Marie Cottrell Univ. Paris I (F) Jeanny Hérault INPG Grenoble (F) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Jean-Pierre Peters FUNDP Namur (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------- Hervé Bourlard IDIAP Martigny (CH) Joan Cabestany Univ. Polit. de Catalunya (E) Colin Campbell Bristol Univ. (UK) Stéphane Canu Inst. Nat. Sciences App. (F) Holk Cruse Universität Bielefeld (D) Eric de Bodt Univ. Lille II (F) & UCL Louvain-la-Neuve (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicholas Copernicus Univ. (PL) Marc Duranton Philips Semiconductors (USA) Richard Duro Univ. Coruna (E) Jean-Claude Fort Université Nancy I (F) Colin Fyfe Univ. Paisley (UK) Stan Gielen Univ. of Nijmegen (NL) Marco Gori Univ. Siena (I) Bernard Gosselin Fac. Polytech. Mons (B) Manuel Grana UPV San Sebastian (E) Anne Guérin-Dugué INPG Grenoble (F) Barbara Hammer Univ. of Osnabrück (D) Martin Hasler EPFL Lausanne (CH) Laurent Hérault CEA-LETI Grenoble (F) Gonzalo Joya Univ. Malaga (E) Christian Jutten INPG Grenoble (F) Juha Karhunen Helsinki Univ.
of Technology (FIN) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Jouko Lampinen Helsinki Univ. of Tech. (FIN) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Mia Loccufier Univ. Gent (B) Erzsebet Merenyi Rice Univ. (USA) Jean Arcady Meyer Univ. Paris 6 (F) José Mira UNED (E) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Christian W. Omlin Univ. of the Western Cape (SA) Gilles Pagès Univ. Paris 6 (F) Thomas Parisini Univ. Trieste (I) Hélène Paugam-Moisy Université Lumière Lyon 2 (F) Alberto Prieto Universidad de Granada (E) Didier Puzenat Univ. Antilles-Guyane (F) Leonardo Reyneri Politecnico di Torino (I) Jean-Pierre Rospars INRA Versailles (F) Jose Santos Reyes Univ. Coruna (E) Jochen Steil Univ. Bielefeld (D) John Stonham Brunel University (UK) Johan Suykens K. U. Leuven (B) John Taylor King's College London (UK) Claude Touzet Univ. Provence (F) Marc Van Hulle KUL Leuven (B) Thomas Villmann Univ. Leipzig (D) Christian Wellekens Eurecom Sophia-Antipolis (F) Axel Wismüller Ludwig-Maximilians-Univ. München (D) ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From paulm at earthlink.net Sat Oct 5 23:22:10 2002 From: paulm at earthlink.net (Dr Paul R Martin) Date: Sun, 06 Oct 2002 13:22:10 +1000 (EST) Subject: CVNet - Positions in Physiology at Sydney Message-ID: <0H3J002ASIJJN9@mail.newcastle.edu.au> The Anderson Stuart precinct has a vigorous Visual Neuroscience Group.
Potential applicants for the positions below are welcome to contact me for more details. Paul ------------------------------------------- 2 Lecturer/Senior Lecturer Positions, School of Biomedical Sciences, University of Sydney The School of Biomedical Sciences is seeking to appoint 2 Lecturers/Senior Lecturers, one to be based in the Physiology Department, the other in the Pharmacology Department. The School of Biomedical Sciences lies within the Faculty of Medicine of the University of Sydney, which comprises one of the largest and most active groups of biomedical researchers in Australia. Central facilities for genomics, proteomics and bioinformatics are developing rapidly, and opportunities for collaboration between basic and clinical sciences and with research institutes located in the area are exceptional. Applicants should demonstrate how their research will build on these opportunities. The University of Sydney has established areas of research strength in cellular and molecular sciences, cardiovascular and respiratory sciences and the neurosciences. It is likely that the research interests of the appointee would lie within one of these broad areas, which are well represented within the two Departments. Applicants at either level would be expected to demonstrate creativity and productivity in their scientific output; applicants at the Senior Lecturer level will need a track record of competitive grant support and evidence of an emerging international reputation. The Departments teach Dental, Medical, Pharmacy and Science students and have substantial numbers of Honours and PhD students. Applicants for the position in Physiology must nominate several areas of Physiology which they could teach at an elementary level and should also identify their areas of expertise for research-based teaching.
Applicants for the position in Pharmacology should demonstrate a background in the fundamentals of pharmacology and should also identify their areas of expertise for research-based teaching. Evidence of interest and experience in teaching should be presented by applicants for each position. Applicants should submit a full c.v., a short account of the type of research they wish to pursue and the areas of teaching to which they can contribute, an indication of the Departmental affiliation and level of appointment they seek, and the names of 3 referees. Further details of the departments can be found at http://www.physiol.usyd.edu.au/ and http://www.usyd.edu.au/su/pharmacology/ . ------------------------------------------------------------------- Paul R Martin Associate Professor Dept Physiology F13 University of Sydney NSW 2006 Australia paulm at physiol.usyd.edu.au Tel +61 (02) 9351 3928 Fax +61 (02) 9351 2058 ------------------------------------------------------------------- To get information on using CVNet, send a note to: majordomo at mail.ewind.com In the body of the message, enter: info cvnet From ted.carnevale at yale.edu Mon Oct 7 13:23:28 2002 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Mon, 07 Oct 2002 13:23:28 -0400 Subject: an excellent new NEURON tutorial! Message-ID: <3DA1C310.419ED028@yale.edu> It is my distinct pleasure to announce that Andrew Gillies and David Sterratt of Edinburgh University have created a new tutorial about NEURON that you will find at http://www.anc.ed.ac.uk/school/neuron This is based on the earlier tutorial written by Kevin Martin, but covers a host of new topics and reflects important changes to NEURON over the years.
Items of particular interest include --the use of templates to define new cell classes --efficient and convenient management of networks with the NetCon class --combined use of hoc code and the GUI --adding new mechanisms David and Andrew have worked with us to try to ensure that their new tutorial is clear and up-to-date, remaining tractable for entry-level users while at the same time discussing topics that may be new to more advanced users. --Ted From schwabe at cs.tu-berlin.de Mon Oct 7 12:17:24 2002 From: schwabe at cs.tu-berlin.de (Lars Schwabe) Date: Mon, 7 Oct 2002 18:17:24 +0200 Subject: International Neuroscience Summit 2002 in Berlin Message-ID: Dear Colleague, we invite you to join the International Neuroscience Summit 2002 in Berlin (please refer to www.ins2002.org for further details), which will take place from 11/28/2002 until 12/01/2002. This year's invited speakers include M. Sur (MIT, USA) K. Miller (UCSF, USA) P. Dayan (Gatsby Unit, UK) M. Mozer (University of Colorado, USA) H. Markram (EPFL, Switzerland) A. Borst (MPI, Martinsried, Germany) H. Super (University of Amsterdam, Netherlands) J. Bower (University of Texas, USA) A. Aertsen (Freiburg, Germany) Cheers and see you in Berlin, Lars Schwabe, Gregor Wenning and Peter Wiesing From dgw at MIT.EDU Mon Oct 7 12:14:55 2002 From: dgw at MIT.EDU (David Weininger) Date: Mon, 07 Oct 2002 12:14:55 -0400 Subject: book announcement--Hallamb Message-ID: <200210071214558052@outgoing.mit.edu> I thought readers of the Connectionists List might be interested in this book. For more information, please visit http://mitpress.mit.edu/0262582171/ Thank you!
Best, David From James_Morgan at brown.edu Mon Oct 7 13:58:10 2002 From: James_Morgan at brown.edu (Jim Morgan) Date: Mon, 07 Oct 2002 13:58:10 -0400 Subject: Position in Language Processing, Brown University Message-ID: <5.1.0.14.2.20021007135740.01cca1a0@postoffice.brown.edu> LANGUAGE PROCESSING, Brown University: The Department of Cognitive and Linguistic Sciences invites applications for a three-year renewable tenure-track position at the Assistant Professor level beginning July 1, 2003. Areas of interest include but are not limited to phonology or phonological processing, syntax or sentence processing, and lexical access or lexical semantics, using experimental, formal, developmental, neurological, or computational methods. Expertise in two or more areas and/or the application of multiple paradigms is preferred. Applicants should have a strong research program and a broad teaching ability in cognitive science and/or linguistics at both the undergraduate and graduate levels. Interest in contributing curricular innovations in keeping with Brown's university-college tradition is desirable. Applicants should have completed all Ph.D. requirements by no later than July 1, 2003. Women and minorities are especially encouraged to apply. Send curriculum vitae, three letters of reference, reprints and preprints of publications, and a one-page statement of research interests to Dr. James Morgan, Chair, Search Committee, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, R.I. 02912 by December 1, 2002. Brown University is an Equal Opportunity/Affirmative Action Employer From becker at mcmaster.ca Mon Oct 7 23:03:18 2002 From: becker at mcmaster.ca (S.
Becker) Date: Mon, 7 Oct 2002 23:03:18 -0400 (EDT) Subject: Registration for NIPS*2002 Message-ID: You are invited to attend NIPS 2002, the 15th Annual Conference on Neural Information Processing Systems, at the Hyatt Regency in Vancouver, British Columbia, Canada, and the Post-Conference Workshops at The Westin Resort in Whistler, B.C. Tutorials: December 9, 2002 Conference: December 10-12, 2002 Workshops: December 12-14, 2002 The Conference Program is now online: http://www.nips.cc We accepted 207 papers this year from a record 694 submissions, maintaining the same high quality as in previous years, with an acceptance rate of 30%. Sue Becker General Chair ----------------------------------------------- NEW ONLINE REGISTRATION PROCESS: https://register.nips.salk.edu/ When registering for this year's meeting, you will be asked to create an account. We will retain the contact information that you provide and it will be used for a NIPS Directory that will be posted on the NIPS Foundation web site. You will be given the opportunity to choose exactly what you would like to appear in the Directory, or you may choose not to be listed at all. The NIPS Member Directory will be posted to the NIPS website in January of 2003. Our preferred method of registration is online; however, wire transfers and checks will be accepted. Even if you plan to pay using a check or wire transfer, please begin the registration process online to prevent errors in recording your contact information and to speed your registration processing. Applications for financial/travel support can also be submitted online. The deadline for such applications is Friday, Oct 18 (midnight PST). REGISTRATION DEADLINE: The early registration deadline (with reduced registration fees) is November 8, 2002. REGISTRATION WEBSITE: https://register.nips.salk.edu/ PROCEEDINGS: All registrants will receive a CD-ROM of the conference proceedings. Proceedings will also be available free online.
The two-volume soft-cover edition, published by the MIT Press, can be purchased at the special conference rate of $35. We hope you will join us in Vancouver for an exciting NIPS 2002. Terry Sejnowski President, NIPS Foundation ----------------------------------------------- NIPS 2002 TUTORIALS - December 9, 2002 Michael Kearns, University of Pennsylvania -- Computational Game Theory Sebastian Seung, Howard Hughes Medical Institute and MIT -- Neural Integrators Yair Weiss, Hebrew University, Jianbo Shi, Carnegie Mellon University, and Serge Belongie, UC San Diego -- Eigenvector Methods for Clustering and Image Segmentation Richard M. Karp, UC Berkeley and International Computer Science Institute -- Mathematical, Statistical and Algorithmic Challenges from Genomics and Molecular Biology Martin Cooke, University of Sheffield -- Computational Auditory Scene Analysis in Listeners and Machines Andrew McCallum, University of Massachusetts at Amherst, William Cohen, Carnegie Mellon University -- Information Extraction from the World Wide Web INVITED SPEAKERS - December 10-12, 2002 Hugh Durrant-Whyte, University of Sydney -- Information Flow in Sensor Networks Paul Glimcher, New York University -- Decisions, Uncertainty and the Brain: Neuroeconomics Deborah Gordon, Stanford University -- Ants at Work David Heeger, New York University -- Neural Correlates of Perception and Attention Andrew W. Moore, Carnegie Mellon University -- Statistical Data Mining Pietro Perona, Caltech -- Learning Visual Categories WORKSHOPS - December 12-14, 2002 Propagation Algorithms on Graphs with Cycles: Theory and Applications (2 day) -- Shiro Ikeda, Toshiyuki Tanaka, Max Welling Computational Neuroimaging: Foundations, Concepts & Methods (2 day) -- S. Hanson, B. Pearlmutter, S. Strother, L. Hansen, B. Martin-Bly Multi-Agent Learning: Theory and Practice (2 day) -- Gerald Tesauro, Michael L. 
Littman Independent Component Analysis and Beyond -- Stefan Harmeling, Luis Borges de Almeida, Erkki Oja, Dinh-Tuan Pham Learning of Invariant Representations -- Konrad Paul Kording, Bruno A. Olshausen Quantum Neural Computing -- Elizabeth C. Behrman, James E. Steck Spectral Methods in Dimensionality Reduction, Clustering, and Classification -- Josh Tenenbaum, Sam T. Roweis Universal Learning Algorithms and Optimal Search -- Juergen Schmidhuber, Marcus Hutter On Learning Kernels -- Nello Cristianini, Tommi Jaakkola, Michael Jordan, Gert Lanckriet Negative Results and Counter Examples -- Isabelle Guyon Neuromorphic Engineering in the Commercial World -- Timothy Horiuchi, Giacomo Indiveri, Ralph Etienne-Cummings Beyond Classification and Regression: Learning Rankings, Preferences, Equality Predicates, and Other Structures -- Rich Caruana, Thorsten Joachims Statistical Methods for Computational Experiments in Visual Processing and Computer Vision -- Ross Beveridge, Bruce Draper, Geof Givens, Ross J. Micheals, Jonathon Phillips Unreal Data: Principles of Modeling Nonvectorial Data -- Alex Smola, Gunnar Raetsch, Zoubin Ghahramani Machine Learning Techniques for Bioinformatics -- Colin Campbell, Phil Long Adaptation: Spatial and Temporal Effects on Coding -- Garrett B. Stanley Thalamocortical Processing in Audition and Vision -- Shihab A. Shamma, Anthony M. Zador ----------------------------------------------- From zaffalon at idsia.ch Wed Oct 9 10:43:38 2002 From: zaffalon at idsia.ch (Marco Zaffalon) Date: Wed, 09 Oct 2002 16:43:38 +0200 Subject: Ph.D. Position Available Message-ID: <5.1.0.14.0.20021009164303.0297c6b8@mailhost.idsia.ch> [apologies for multiple postings] Ph.D. Position Available - Deadline December 1st 2002 ----------------------------------------------------- IDSIA, Switzerland, is seeking an outstanding Ph.D. student with excellent mathematical as well as computer programming skills. 
We intend to explore the opportunities that imprecise probabilities and graphical models offer for data mining, in particular for classification. Classification has a long tradition in Artificial Intelligence and Statistics. Classifiers are tools, inferred from data, that perform important real-world tasks of diagnosis, prediction, and recognition. Imprecise probability-based classification is an exciting new area, with great potential to provide new and reliable ways to deal with difficult problems. The research will need to address both theoretical issues (foundations of statistical reasoning) and applied ones (implementation and testing of new models). Possible backgrounds include computer science, physics, mathematics, and engineering. The position is funded by the Swiss National Science Foundation. The initial appointment will be for two years, with an extension normally granted. The new Ph.D. student will interact with Marco Zaffalon and other people at IDSIA, and will be involved in international research collaborations. See http://www.idsia.ch/~zaffalon/positions/credal2002.htm for more information. Applicants should submit: 1. Detailed curriculum vitae, 2. List of three references (and their email addresses), 3. Transcripts of undergraduate and graduate (if applicable) studies, 4. Concise statement of their research interests (two pages max). Please address all correspondence to: Marco Zaffalon, IDSIA, Galleria 2, CH-6928 Manno (Lugano), Switzerland. Applications can also be submitted by e-mail to zaffalon at idsia.ch (2MB max). WWW pointers to ps/pdf/doc/html files are welcome. Use Firstname.Lastname.DocDescription.DocType as the filename convention. Thanks for your interest. 
Marco Zaffalon, Senior Researcher, IDSIA tel +41 91 610 8665 fax +41 91 610 8661 e-mail mailto:zaffalon at idsia.ch web http://www.idsia.ch/~zaffalon ABOUT IDSIA ----------- IDSIA (http://www.idsia.ch) is a joint research institute of the University of Lugano (http://www.unisi.ch) and the Swiss Italian University for Applied Science (http://www.supsi.ch). Our research focuses on uncertain reasoning, imprecise probabilities, graphical models, data mining, artificial neural nets, reinforcement learning, complexity and generalization issues, unsupervised learning and information theory, forecasting, artificial ants, combinatorial optimization, and evolutionary computation. IDSIA is small but visible, competitive, and influential. The "X-Lab Survey" by Business Week Magazine ranked IDSIA among the world's top ten labs in Artificial Intelligence. IDSIA's algorithms hold the world records for several important operations research benchmarks (see Nature 406(6791):39-42 for an overview of artificial ant algorithms developed at IDSIA). IDSIA is located near the Swiss supercomputing center. IDSIA is close to the beautiful city of Lugano in Ticino, the scenic southernmost province of Switzerland. Zurich, Milan and Venice are only a few hours away by train. From stefan.wermter at sunderland.ac.uk Wed Oct 9 13:02:35 2002 From: stefan.wermter at sunderland.ac.uk (Stefan Wermter) Date: Wed, 09 Oct 2002 18:02:35 +0100 Subject: New MSc Intelligent Systems Message-ID: <3DA4612B.7776324@sunderland.ac.uk> New MSc Intelligent Systems ------------------- The School of Computing and Technology, University of Sunderland is delighted to announce the launch of its new MSc Intelligent Systems programme, starting 24 February 2003. Building on the School's leading-edge research in intelligent systems, this master's programme will be funded via the ESF scheme (see below). 
Intelligent Systems is an exciting field of study for science and industry, since existing computing systems still fall short of human performance in many respects. "Intelligent Systems" is a term for software systems and methods that simulate aspects of intelligent behaviour. The intention is to learn from nature, human performance, and the disciplines of cognitive science, neuroscience, biology, engineering, and linguistics in order to build more powerful computational system architectures. In this programme a wide variety of novel and exciting techniques will be taught, including neural networks, intelligent robotics, machine learning, natural language processing, vision, evolutionary genetic computing, data mining, information retrieval, Bayesian computing, knowledge-based systems, fuzzy methods, and hybrid intelligent architectures. Programme Structure -------------- The following lectures/modules are available: Neural Networks Intelligent Systems Architectures Learning Agents Evolutionary Computation Cognitive Neural Science Knowledge Based Systems and Data Mining Bayesian Computation Vision and Intelligent Robots Natural Language Processing Dynamics of Adaptive Systems Intelligent Systems Programming Funding up to 6000 pounds (9500 Euro) for eligible students ------------------------------ The Bursary Scheme applies to this Masters programme commencing February 2003, and we have obtained funding through the European Social Fund (ESF). ESF support enables the University to waive the normal tuition fee and provide a bursary of 75 pounds per week for 45 weeks for eligible EU students, together worth up to 6000 pounds. 
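The bursary arithmetic above can be spelled out; a minimal sketch using the stated figures (note the tuition-fee value is only implied by the 6000-pound total, not stated directly in the announcement):

```python
# ESF bursary figures quoted in the announcement above.
weekly_bursary = 75   # pounds per week (stated)
weeks = 45            # duration in weeks (stated)
stipend = weekly_bursary * weeks   # 75 * 45 = 3375 pounds
total_support = 6000               # stated upper bound, pounds
# The remainder is the implied value of the waived tuition fee
# (an inference, not a figure given in the announcement).
implied_fee_waiver = total_support - stipend
print(f"stipend: {stipend}, implied fee waiver: {implied_fee_waiver}")
```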
For further information in the first instance please see: http://osiris.sund.ac.uk/webedit/allweb/courses/progmode.php?prog=G550A&mode=FT&mode2=&dmode=C For information on applications and start dates contact: gillian.potts at sunderland.ac.uk Tel: 0191 515 2758 For academic information about the programme contact: alfredo.moscardini at sunderland.ac.uk *************************************** Professor Stefan Wermter Chair for Intelligent Systems Informatics Centre School of Computing and Technology University of Sunderland St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From mcrae at uwo.ca Wed Oct 9 00:32:57 2002 From: mcrae at uwo.ca (Ken McRae) Date: Wed, 09 Oct 2002 00:32:57 -0400 Subject: Cognitive Position at Western Message-ID: FACULTY POSITION IN COGNITIVE PSYCHOLOGY. The Psychology Department at the University of Western Ontario invites applications for a tenure-track position at the Assistant Professor level to enhance the Department's strengths in Cognitive Psychology, Cognitive Science, and Cognitive Neuroscience. The successful applicant will be expected to maintain an active research program in his or her research area, teach undergraduate and graduate courses, and provide graduate student supervision. The primary selection criteria will be research excellence and productivity; researchers from any domain of study in Cognitive Psychology, Cognitive Science, and Cognitive Neuroscience will be considered. Applicants should submit by November 15, 2002, a curriculum vitae, statement of research and teaching experience and interests, copies of representative publications, and arrange to have 3 letters of recommendation sent to: Dr. Jim Olson, Chair, Department of Psychology, The University of Western Ontario, London, Ontario, Canada N6A 5C2. 
This position is subject to budgetary approval. The scheduled starting date is July 1, 2003. Please see http://www.ssc.uwo.ca/psychology for the relevant information. The University of Western Ontario is committed to employment equity and welcomes applications from all qualified women and men, including visible minorities, aboriginal people, and persons with disabilities. The Department of Psychology at UWO is strong in these areas. Faculty in the Cognition Area include Marc Joanisse (cognitive neuroscience of language processing in normal and impaired adults and children, neural network modeling), Albert Katz (autobiographical memory, figurative language), Stephen Lupker (word recognition, semantic memory), and Ken McRae (cognitive and neural bases of word meaning, sentence processing, neural network modeling). Faculty in other areas of the Department include Stephan Köhler (cognitive and neural bases of implicit and explicit memory), Debra Jared (word recognition, bilingualism), Mel Goodale (perception and action, object recognition), Keith Humphrey (perception and action, object recognition), Jody Culham (perception and action), David Sherry, William Roberts, and Scott MacDougall-Shackleton (animal learning and cognition). Research facilities at UWO include fMRI, ERP, eyetracking and TMS laboratories, high-performance computing facilities for computational modeling, a large database for developmental studies, and a large participant pool of normal undergraduate adults. 
From cns at cns.bu.edu Thu Oct 10 10:58:26 2002 From: cns at cns.bu.edu (Boston University CNS Department) Date: Thu, 10 Oct 2002 10:58:26 -0400 Subject: Graduate Program in the Department of Cognitive and Neural Systems (CNS) at Boston University Message-ID: <3DA59592.7010504@cns.bu.edu> PLEASE POST ******************************************************************* GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY ******************************************************************* The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The brochure may also be viewed on line at: http://www.cns.bu.edu/brochure/ and application forms at: http://www.bu.edu/cas/graduate/application.html Applications for Fall 2003 admission and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write, telephone, or fax: DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS Boston University 677 Beacon Street Boston, MA 02215 617/353-9481 (phone) 617/353-7755 (fax) or send via email your full name and mailing address to the attention of Mr. Robin Amos at: amos at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. 
GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. ******************************************************************* Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students and qualified undergraduates interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The department's training and research focus on two broad questions. The first question is: How does the brain control behavior? This is a modern form of the Mind/Body Problem. The second question is: How can technology emulate biological intelligence? This question needs to be answered to develop intelligent technologies that are well suited to human societies. These goals are symbiotic because brains are unparalleled in their ability to intelligently adapt on their own to complex and novel environments. Models of how the brain accomplishes this are developed through systematic empirical, mathematical, and computational analysis in the department. Autonomous adaptation to a changing world is also needed to solve many of the outstanding problems in technology, and the biological models have inspired qualitatively new designs for applications. During the past decade, CNS has led the way in developing biological models that can quantitatively simulate the dynamics of identified brain cells in identified neural circuits, and the behaviors that they control. This new level of understanding is leading to comparable advances in intelligent technology. 
CNS is a graduate department that is devoted to the interdisciplinary training of graduate students. The department awards MA, PhD, and BA/MA degrees. Its students are trained in a broad range of areas concerning computational neuroscience, cognitive science, and neuromorphic systems. The biological training includes study of the brain mechanisms of vision and visual object recognition; audition, speech, and language understanding; recognition learning, categorization, and long-term memory; cognitive information processing; self-organization and development, navigation, planning, and spatial orientation; cooperative and competitive network dynamics and short-term memory; reinforcement and motivation; attention; adaptive sensory-motor planning, control, and robotics; biological rhythms; consciousness; mental disorders; and the mathematical and computational methods needed to support advanced modeling research and applications. Technological training includes methods and applications in image processing, multiple types of signal processing, adaptive pattern recognition and prediction, information fusion, and intelligent control and robotics. The foundation of this broad training is a unique curriculum of seventeen interdisciplinary graduate courses that have been developed at CNS. Each of these courses integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of artificial neural networks and hybrid systems to technology. A student's curriculum is tailored to his or her career goals with an academic advisor and a research advisor. In addition to taking interdisciplinary courses within CNS, students develop important disciplinary expertise by also taking courses in departments such as biology, computer science, engineering, mathematics, and psychology. 
In addition to these formal courses, students work individually with one or more research advisors to learn how to do advanced interdisciplinary research in their chosen research areas. As a result of this breadth and depth of training, CNS students have succeeded in finding excellent jobs in both academic and technological areas after graduation. The CNS Department interacts with colleagues in several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The units most closely linked to the department are the Center for Adaptive Systems and the CNS Technology Laboratory. Students interested in neural network hardware can work with researchers in CNS and at the College of Engineering. Other research resources include the campus-wide Program in Neuroscience, which includes distinguished research groups in cognitive neuroscience, neurophysiology, neuroanatomy, neuropharmacology, and neural modeling across the Charles River Campus and the Medical School; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department ; and in biophysics and computational physics within the Physics Department. Key colleagues in these units hold joint appointments in CNS in order to expedite training and research interactions with CNS core faculty and students. In addition to its basic research and training program, the department organizes an active colloquium series, various research and seminar series, and international conferences and symposia, to bring distinguished scientists from experimental, theoretical, and technological disciplines to the department. 
The department is housed in its own four-story building, which includes ample space for faculty and student offices and laboratories (computational neuroscience, visual psychophysics, psychoacoustics, speech and language, sensory-motor control, neurobotics, computer vision), as well as an auditorium, classroom, seminar rooms, a library, and a faculty-student lounge. The department has a powerful computer network for carrying out large-scale simulations of behavioral and brain models and applications. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior Helen Barbas Professor, Department of Health Sciences, Sargent College PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex Jacob Beck Research Professor of Cognitive and Neural Systems PhD, Psychology, Cornell University Visual perception, psychophysics, computational models of vision Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development Gail A. Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems Director, CNS Technology Laboratory PhD, Mathematics, University of Wisconsin, Madison Learning and memory, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations Michael A. 
Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems, cardiovascular oscillations physiology and time series H. Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, auditory virtual environments, signal processing models of hearing Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology John C. Fiala Research Assistant Professor of Biology PhD, Cognitive and Neural Systems, Boston University Synaptic plasticity, dendrite anatomy and pathology, motor learning, robotics, neuroinformatics Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Vision, audition, language, learning and memory, reward and motivation, cognition, development, sensory-motor control, mental disorders, applications Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, biological sensory-motor control and functional brain imaging Catherine L. 
Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition Michael E. Hasselmo Associate Professor of Psychology Director of Graduate Studies, Psychology Department PhD, Experimental Psychology, Oxford University Computational modeling and experimental testing of neuromodulatory mechanisms involved in encoding, retrieval and consolidation Allyn Hubbard Associate Professor of Electrical and Computer Engineering PhD, Electrical Engineering, University of Wisconsin Peripheral auditory system (experimental and modeling), chip design spanning the range from straightforward digital applications to exotic sub-threshold analog circuits that emulate the functionality of the visual and auditory periphery, BCS/FCS, the mammalian cochlea in silicon and MEMS, and drug discovery on silicon Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Neural network theory, complexity theory, wavelet theory, mathematical physics Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamics of networks of neurons Jacqueline A. 
Liederman Associate Professor of Psychology PhD, Psychology, University of Rochester Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders Ennio Mingolla Professor of Cognitive and Neural Systems and Psychology Acting Chairman 2002-2003, Department of Cognitive and Neural Systems PhD, Psychology, University of Connecticut Visual perception, mathematical modeling of visual processes Joseph Perkell Adjunct Professor of Cognitive and Neural Systems Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology PhD, Massachusetts Institute of Technology Motor control of speech production Adam Reeves Adjunct Professor of Cognitive and Neural Systems Professor of Psychology, Northeastern University PhD, Psychology, City University of New York Psychophysics, cognitive psychology, vision Bradley Rhodes Research Associate, Technology Lab, Department of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Motor control, learning, and adaptation, serial order behavior (timing in particular), attention and memory Michele Rucci Assistant Professor of Cognitive and Neural Systems PhD, Scuola Superiore S.-Anna, Pisa, Italy Vision, sensory-motor control and learning, and computational neuroscience Elliot Saltzman Associate Professor of Physical Therapy, Sargent College Research Scientist, Haskins Laboratories, New Haven, CT Assistant Professor in Residence, Department of Psychology and Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, CT PhD, Developmental Psychology, University of Minnesota Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities Robert Savoy Adjunct Associate Professor of Cognitive and Neural Systems Experimental Psychologist, Massachusetts General Hospital PhD, 
Experimental Psychology, Harvard University Computational neuroscience; visual psychophysics of color, form, and motion perception Teaching about functional MRI and other brain mapping methods Eric Schwartz Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology PhD, High Energy Physics, Columbia University Computational neuroscience, machine vision, neuroanatomy, neural modeling Robert Sekuler Adjunct Professor of Cognitive and Neural Systems Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center Frances and Louis H. Salvage Professor of Psychology, Brandeis University Consultant in neurosurgery, Boston Children's Hospital PhD, Psychology, Brown University Visual motion, brain imaging, relation of visual perception, memory, and movement Barbara Shinn-Cunningham Assistant Professor of Cognitive and Neural Systems and Biomedical Engineering PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance David Somers Assistant Professor of Psychology PhD, Cognitive and Neural Systems, Boston University Functional MRI, psychophysical, and computational investigations of visual perception and attention Chantal E. Stern Assistant Professor of Psychology and Program in Neuroscience, Boston University Assistant in Neuroscience, MGH-NMR Center and Harvard Medical School PhD, Experimental Psychology, Oxford University Functional neuroimaging studies (fMRI and MEG) of learning and memory Malvin C. 
Teich Professor of Electrical and Computer Engineering, Biomedical Engineering, and Physics PhD, Cornell University Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission Lucia Vaina Professor of Biomedical Engineering Research Professor of Neurology, School of Medicine PhD, Sorbonne (France); Docteur ès Sciences, National Polytechnique Institute, Toulouse (France) Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging Takeo Watanabe Associate Professor of Psychology PhD, Behavioral Sciences, University of Tokyo Perception of objects and motion and effects of attention on perception using psychophysics and brain imaging (fMRI) Jeremy Wolfe Adjunct Associate Professor of Cognitive and Neural Systems Associate Professor of Ophthalmology, Harvard Medical School Psychophysicist, Brigham & Women's Hospital, Surgery Department Director of Psychophysical Studies, Center for Clinical Cataract Research PhD, Massachusetts Institute of Technology Visual attention, pre-attentive and attentive object representation Curtis Woodcock Professor of Geography Chairman, Department of Geography Director, Geographic Applications, Center for Remote Sensing PhD, University of California, Santa Barbara Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing CNS DEPARTMENT COURSE OFFERINGS CAS CN500 Computational Methods in Cognitive and Neural Systems CAS CN510 Principles and Methods of Cognitive and Neural Modeling I CAS CN520 Principles and Methods of Cognitive and Neural Modeling II CAS CN530 Neural and Computational Models of Vision CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control CAS CN550 Neural and Computational Models of 
Recognition, Memory and Attention CAS CN560 Neural and Computational Models of Speech Perception and Production CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm CAS CN580 Introduction to Computational Neuroscience GRS CN700 Computational and Mathematical Methods in Neural Modeling GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior GRS CN730 Models of Visual Perception GRS CN740 Topics in Sensory-Motor Control GRS CN760 Topics in Speech Perception and Recognition GRS CN780 Topics in Computational Neuroscience GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition GRS CN915,916 Research in Neural Networks for Vision and Image Processing GRS CN921,922 Research in Neural Networks for Speech and Language Processing GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning GRS CN935,936 Research in Neural Networks for Cognitive Information Processing GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks GRS CN945,946 Research in Technological Applications of Neural Networks GRS CN951,952 Research in Hardware Implementations of Neural Networks CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and student-run special interest groups, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups. LABORATORY AND COMPUTER FACILITIES The department is funded by fellowships, grants, and contracts from federal agencies and private foundations that support research in life sciences, mathematics, artificial intelligence, and engineering. 
Facilities include laboratories for experimental research and computational modeling in visual perception; audition, speech and language processing; and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network comprising Sun workstations, Silicon Graphics workstations, Macintoshes, and PCs. A PC farm running the Linux operating system is available as a distributed computational environment. All students have access to X-terminals or UNIX workstation consoles, a selection of color systems and PCs, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software are available for use. These include:

Active Perception Laboratory
The Active Perception Laboratory is dedicated to the investigation of the interactions between perception and behavior. Research focuses on the theoretical and computational analyses of the effects of motor behavior on sensory perception and on the design of psychophysical experiments with human subjects. The Active Perception Laboratory includes extensive computational facilities that allow the execution of large-scale simulations of neural systems. Additional facilities will soon include instruments for the psychophysical investigation of eye movements during visual analysis, including an accurate and non-invasive eye tracker, and robotic systems for the simulation of different types of behavior.
Computer Vision/Computational Neuroscience Laboratory
The Computer Vision/Computational Neuroscience Laboratory comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; a light machine shop; an active vision laboratory including actuators and video hardware; and systems for computer-aided neuroanatomy and the application of computer graphics and image processing to brain sections and MRI images. The laboratory supports research in the areas of neural modeling, computational neuroscience, computer vision and robotics. The major question being addressed is the nature of the representation of the visual world in the brain, in terms of observable neural architectures such as topographic mapping and columnar architecture. The application of novel architectures for image processing for computer vision and robotics is also a major topic of interest. Recent work in this area has included the design and patenting of novel actuators for robotic active vision systems, the design of real-time algorithms for use in mobile robotic applications, and the design and construction of miniature autonomous vehicles using space-variant active vision design principles. Recently one such vehicle has successfully driven itself on the streets of Boston.

Neurobotics Laboratory
The Neurobotics Laboratory utilizes wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The laboratory currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a moveable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Other platforms may be investigated in the future.
Psychoacoustics Laboratory
The Psychoacoustics Laboratory in the Department of Cognitive and Neural Systems (CNS) is equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli. The laboratory contains approximately eight PCs (running Windows 98 and/or Linux), used both as workstations for students and to control laboratory equipment and run experiments. The other major equipment in the laboratory includes special-purpose signal processing and sound generating equipment from Tucker-Davis Technologies, electromagnetic head tracking systems, a two-channel spectrum analyzer, and other miscellaneous equipment for producing, measuring, analyzing, and monitoring auditory stimuli. The Psychoacoustics Laboratory consists of three adjacent rooms in the basement of 677 Beacon St. (the home of the CNS Department). One room houses an 8 ft. x 8 ft. single-walled sound-treated booth as well as space for students. The second room is primarily used as student workspace for developing and debugging experiments. The third space houses a robotic arm, capable of automatically positioning a small acoustic speaker anywhere on the surface of a sphere of adjustable radius, allowing automatic measurement of the signals reaching the ears of a listener for a sound source at different positions in space, including the effects of room reverberation.

Sensory-Motor Control Laboratory
The Sensory-Motor Control Laboratory supports experimental and computational studies of sensory-motor control. A computer-controlled infrared WatSmart system allows measurement of large-scale (e.g. reaching) movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. A second major component is a helmet-mounted, video-based, eye-head tracking system (ISCAN Corp., 1997).
The latter's camera samples eye position at 240 Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. Thus the system affords a wide range of visuo-motor studies. The laboratory is connected to the department's extensive network of Linux and Windows workstations and Linux computational servers.

Speech and Language Laboratory
The Speech Laboratory includes software facilities for analog-to-digital and digital-to-analog conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal-processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing algorithms of speech recognition. The laboratory also contains a network of Windows-based PC computers equipped with software for the analysis of functional magnetic resonance imaging (fMRI) data, including region-of-interest (ROI) based analyses involving software for the parcellation of cortical and subcortical brain regions in structural MRI images.

Technology Laboratory
The Technology Laboratory fosters the development of neural network models derived from basic scientific research and facilitates the transition of the resulting technologies to software and applications. The Lab was established in July 2001, with a grant from the Air Force Office of Scientific Research: "Information Fusion for Image Analysis: Neural Models and Technology Development." Initial projects have focused on multi-level fusion and data mining in a geospatial context, in collaboration with the Boston University Center for Remote Sensing.
This research and development has built on models of opponent-color visual processing, boundary contour system (BCS) and texture processing, and Adaptive Resonance Theory (ART) pattern learning and recognition, as well as other models of associative learning and prediction. Other projects include collaborations with the New England Medical Center and Boston Medical Center, to develop methods for analysis of large-scale medical databases, currently to predict HIV resistance to antiretroviral therapy. Associated basic research projects are conducted within the joint context of scientific data and technological constraints.

Visual Psychophysics Laboratory
The Visual Psychophysics Laboratory occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer controlled display platforms, including Macintosh, Windows and Linux workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing devices, a photometer, and a variety of display-generation, data-collection, and data-analysis software.

Affiliated Laboratories
Affiliated CAS/CNS faculty members have additional laboratories ranging from visual and auditory psychophysics and neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations.

*******************************************************************
DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT
Boston University
677 Beacon Street
Boston, MA 02215
Phone: 617/353-9481
Fax: 617/353-7755
Email: inquiries at cns.bu.edu
Web: http://www.cns.bu.edu/
*******************************************************************

From bengioy at IRO.UMontreal.CA Thu Oct 10 09:55:02 2002 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Thu, 10 Oct 2002 09:55:02 -0400 Subject: tenure-track position at U of Montreal, stat. mach.
learning Message-ID: <20021010095502.A15118@vor.iro.umontreal.ca> Hello, My department advertises a tenure-track faculty position, with axes of interest that include Statistical Machine Learning. The department has a statistical machine learning lab (currently with Balázs Kégl and me) with about 15 graduate students and post-docs, and access to several Linux clusters for high-performance computing. The official advert is below. Please feel free to contact me for more information. Teaching is in French, but non-francophones are usually given a break the first year, to learn enough French to teach. ---------------------------------------------------------------------- Université de Montréal Faculté des arts et des sciences Department of Computer Science and Operations Research The DIRO (Département d'informatique et de recherche opérationnelle - Department of Computer Science and Operations Research) invites applications for several tenure-track positions, starting June 1st, 2003. Applicants seeking positions at the Assistant Professor level will have priority. The Department is seeking qualified candidates in Computer Science. Preference will be given to applicants with a strong research program in one of the areas of Software Engineering, Systems (design and implementation of programming languages, compilation, parallel processing), or Computer Networking and Distributed Systems (including electronic commerce). Other areas of interest are Machine Learning (statistical learning, data mining) and Operations Research (stochastic modelling and optimization in particular). A background combining more than one of the above areas, or combining Computer Science and Operations Research, is an asset. An excellent candidate working in a field different from those listed above would also receive consideration. Beyond demonstrating a clear potential for outstanding research, the successful candidate must be committed to excellence in teaching.
The candidate is expected to have a working knowledge of French, and be prepared to teach and supervise students in French within one year. The Université de Montréal is the leading French-language university in North America. The DIRO offers B.Sc., M.Sc., and Ph.D. degrees in Computer Science and Operations Research, a B.Sc. in Bioinformatics, several bidisciplinary B.Sc. degrees, as well as an M.Sc. in electronic commerce. With 41 faculty members, 600 undergraduates and close to 200 graduate students, the DIRO is one of the largest Computer Science departments in Canada as well as one of the most active in research. Research interests of current faculty include bioinformatics, computer networking, intelligent tutoring systems, computer architecture, software engineering, artificial intelligence, computational linguistics, computer graphics, vision and solid modelling, automatic learning, theoretical and quantum computing, parallelism, optimization, and simulation. See http://www.iro.umontreal.ca. Task: Undergraduate and graduate teaching, research and supervision of graduate students. Requirements: Ph.D. in Computer Science or a related area. Salary: Starting salary is competitive and fringe benefits are excellent. Hardcopy applications including a curriculum vitae, a description of the candidate's current research program, at least three letters of reference, and up to three selected preprints/reprints, should be sent to: Pierre McKenzie, professeur et directeur Département d'informatique et de recherche opérationnelle, FAS Université de Montréal C.P. 6128, Succ. Centre-Ville Montréal (Québec) Canada H3C 3J7 by February 1st, 2003. Applications received after that date may be considered until the positions are filled. In accordance with Canadian Immigration requirements, priority will be given to Canadian citizens and permanent residents. The Université de Montréal is committed to equity in employment and encourages applications from qualified women.
----- End forwarded message ----- -- Yoshua Bengio Full Professor / Professeur titulaire Canada Research Chair in Statistical Learning Algorithms / titulaire de la chaire de recherche du Canada en algorithmes d'apprentissage statistique Département d'Informatique et Recherche Opérationnelle Université de Montréal, adresse postale: C.P. 6128 Succ. Centre-Ville, Montréal, Québec, Canada H3C 3J7 adresse civique: 2920 Chemin de la Tour, Montréal, Québec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From robbie at bcs.rochester.edu Thu Oct 10 11:20:09 2002 From: robbie at bcs.rochester.edu (Robert Jacobs) Date: Thu, 10 Oct 2002 11:20:09 -0400 Subject: articles available Message-ID: <5.1.1.6.0.20021010111815.00ac6fd0@bcs.rochester.edu> The following papers may be of interest to readers of this mailing list: (1) Jacobs, R.A., Jiang, W., and Tanner, M.A. (2002) Factorial hidden Markov models and the generalized backfitting algorithm. Neural Computation, 14, 2415-2437. (2) Jacobs, R.A. (2002) What determines visual cue reliability? Trends in Cognitive Sciences, 6, 345-350. Robbie Jacobs =================================== (1) Jacobs, R.A., Jiang, W., and Tanner, M.A. (2002) Factorial hidden Markov models and the generalized backfitting algorithm. Neural Computation, 14, 2415-2437. Previous researchers developed new learning architectures for sequential data by extending conventional hidden Markov models through the use of distributed state representations. Although exact inference and parameter estimation in these architectures is computationally intractable, Ghahramani and Jordan (1997) showed that approximate inference and parameter estimation in one such architecture, factorial hidden Markov models (FHMMs), is feasible in certain circumstances.
However, the learning algorithm proposed by these investigators, based on variational techniques, is difficult to understand and implement, and is limited to the study of real-valued datasets. This paper proposes an alternative method for approximate inference and parameter estimation in FHMMs based on the perspective that FHMMs are a generalization of a well-known class of statistical models known as Generalized Additive Models (GAMs; Hastie and Tibshirani, 1990). Using existing statistical techniques for GAMs as a guide, we have developed the generalized backfitting algorithm. This algorithm computes customized error signals for each hidden Markov chain of an FHMM, and then trains each chain one at a time using conventional techniques from the hidden Markov models literature. Relative to previous perspectives on FHMMs, we believe that the viewpoint taken here has a number of advantages. First, it places FHMMs on firm statistical foundations by relating FHMMs to a class of models that are well-studied in the statistics community, yet it generalizes this class of models in an interesting way. Second, it leads to an understanding of how FHMMs can be applied to many different types of time series data, including Bernoulli and multinomial data, not just data which are real-valued. Lastly, it leads to an effective learning procedure for FHMMs which is easier to understand and easier to implement than existing learning procedures. Simulation results suggest that FHMMs trained with the generalized backfitting algorithm are a practical and powerful tool for analyzing sequential data. http://www.bcs.rochester.edu/people/robbie/jacobs.j.t.nc02.pdf =================================== (2) Jacobs, R.A. (2002) What determines visual cue reliability? Trends in Cognitive Sciences, 6, 345-350. Visual environments often contain many cues to properties of an observed scene. 
In order to integrate information provided by multiple cues in an efficient manner, observers must assess the degree to which each cue provides reliable versus unreliable information. Two hypotheses are reviewed regarding how observers estimate cue reliabilities, namely that the estimated reliability of a cue is related to the ambiguity of the cue, and that people use correlations among cues in order to estimate cue reliabilities. It is shown that cue reliabilities are important both for cue combination and for aspects of visual learning. http://www.bcs.rochester.edu/people/robbie/jacobs.tics02.pdf ---------------------------------------------------------------------------------------- Robert Jacobs Department of Brain and Cognitive Sciences University of Rochester Rochester, NY 14627-0268 phone: 585-275-0753 fax: 585-442-9216 email: robbie at bcs.rochester.edu web: http://www.bcs.rochester.edu/people/robbie/robbie.html From E.Koning at elsevier.nl Fri Oct 11 08:18:40 2002 From: E.Koning at elsevier.nl (Koning, Esther (ELS)) Date: Fri, 11 Oct 2002 13:18:40 +0100 Subject: Introducing www.ComputerScienceWeb.com Message-ID: <4D56BD81F62EFD49A74B1057ECD75C0603A9A34C@elsamss02571> Researchers can now benefit from the newly launched on-line platform "www.ComputerScienceWeb.com". 
Tailored to your specific needs, ComputerScienceWeb offers comprehensive search facilities and customized services in 12 categories of computer science, and provides:
* access to over 50,000 articles and more than 70 journals in computer science
* free access to abstracts and tables of contents
* integrated search and browse facilities across all journals and preprints
* free services such as the "Computer Science Preprint Server" and "Who Cites Who"

From tkelley at arl.army.mil Fri Oct 11 16:43:54 2002 From: tkelley at arl.army.mil (Troy Kelley) Date: Fri, 11 Oct 2002 16:43:54 -0400 Subject: Free SuperComputer Use Available for Connectionist Researchers Message-ID: Hello, The Human Research and Engineering Directorate (HRED) of the Army Research Laboratory (ARL) is seeking proposals to develop models of the interactions and dynamics of human cognition by using ARL's High Performance Computing (HPC) SuperComputer assets. The project is called: Modeling and Interaction of Neurological Dynamics with Symbolic Structures (MINDSS). Cooperative Research and Development Agreements (CRADAs) will be used to leverage the research activities of participating major universities that are interested in developing computational models of human cognition. The objective is to use ARL's state-of-the-art computer facilities to develop connectionist, symbolic, and neurological models which are relevant to DoD's research interests. Major areas of interest include: language processing, visual and auditory processing, complex reasoning, and the brain's reaction to environmental trauma and stress. If you are interested in joining the MINDSS program, having access to our state-of-the-art computer systems, and submitting a proposal, please send an e-mail to the address listed below. You must be a U.S. citizen and you must be affiliated with a major university. Please feel free to forward this message to anyone who might be interested.
Thanks, Troy Kelley Army Research Laboratory tkelley at arl.army.mil From ingber at ingber.com Sun Oct 13 09:52:51 2002 From: ingber at ingber.com (Lester Ingber) Date: Sun, 13 Oct 2002 09:52:51 -0400 Subject: open position: Financial Engineer Message-ID: <20021013135251.GA3356@ingber.com> If you have very strong credentials for the position described below, please email your resume to: Lester Ingber Director R&D DUNN Capital Management Stuart FL Some recent press on DUNN can be seen at http://www.businessweek.com/magazine/content/02_39/b3801113.htm http://www.businessweek.com/magazine/content/02_39/b3801114.htm

Financial Engineer
A disciplined, quantitative, analytic individual proficient in prototyping and coding (such as C/C++, Maple/Mathematica, or Visual Basic) is sought for a financial engineering/risk:reward optimization research position with an established Florida hedge fund (over two decades in the business and $1 billion in assets under management). A PhD in a mathematical science, such as physics, statistics, math, or computer science, is preferred. Hands-on experience in the financial industry is required. Emphasis is on applying state-of-the-art methods to financial time-series of various frequencies. Ability to work with a team to transform ideas/models into robust, intelligible code is key. Salary: commensurate with experience, with bonuses tied to the individual's and the firm's performance.

Status of Selection Process
All applicants will be reviewed, and a long list will be generated for phone interviews. Other applicants will not be contacted further. Information on the status of this process will be available at http://www.ingber.com/open_positions.html From these phone interviews, a short list will be generated for face-to-face interviews. During the on-site visit, a small coding exam will be given.
The start date for this position may range anywhere from immediately to six months later, depending on both the candidate's and the firm's needs. -- Prof. Lester Ingber ingber at ingber.com ingber at alumni.caltech.edu www.ingber.com www.alumni.caltech.edu/~ingber From Bob.Williamson at anu.edu.au Mon Oct 14 02:54:48 2002 From: Bob.Williamson at anu.edu.au (Bob.Williamson@anu.edu.au) Date: Mon, 14 Oct 2002 16:54:48 +1000 (AUS Eastern Standard Time) Subject: Postdoctoral Fellowships Available Message-ID: Postdoctoral Fellowships (approximately 30 positions) ---------------------------------------------------- National ICT Australia is a newly formed research institute based in Canberra and Sydney. Details of the centre can be found on its website, http://nicta.com.au. We are now hiring postdoctoral fellows in a range of research areas including machine learning. These are 3-5 year positions. There is a program on Statistical Machine Learning based in Canberra (see http://nicta.com.au/stat-ml.html for a brief overview of the current group). There are around 3 postdoc vacancies in this program. The formal postdoc job ad can be found at http://nicta.com.au/jobs/postdoc.pdf, which contains details on how to apply. There is no closing date. Assessment of applications will commence on 18 November 2002. -----------------------------------------+-----------------------------.
Professor Robert (Bob) Williamson
Designate Canberra Node Director and Vice-President, National ICT Australia (NICTA)
Research School of Information Sciences and Engineering
Australian National University, Canberra 0200 AUSTRALIA
Phone: +61 2 6125 0079 | Office: +61 2 6125 8801 | Fax: +61 2 6125 8623 | Mobile: +61 4 0405 3877
Email: Bob.Williamson at nicta.edu.au
Web: http://www.nicta.com | http://axiom.anu.edu.au/~williams

From P.J.Lisboa at livjm.ac.uk Sun Oct 13 12:12:10 2002 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Sun, 13 Oct 2002 17:12:10 +0100 Subject: NNESMED/CIMED International Conference Sheffield July 21 - 23, 2003 Message-ID: After the success of the NNESMED conference held last year on the island of Milos, the conference returns to the UK. It is expected to retain the single-track format, in order to foster the multidisciplinary interaction that is a regular feature of this event. The deadline for submission of extended abstracts is 31 January 2003. Further details can be found at http://www.shu.ac.uk/conference/nnesmed and a text version of the call for papers follows:

FIFTH INTERNATIONAL CONFERENCE ON NEURAL NETWORKS AND EXPERT SYSTEMS IN MEDICINE AND HEALTHCARE, NNESMED 2003
and First International Conference on Computational Intelligence in Medicine and Healthcare, CIMED 2003
July 21 - 23, 2003, School of Engineering, Sheffield Hallam University, Sheffield, England

This is the fifth of a successful series of international conferences focused on the application of intelligent computational methods and systems to support all areas of biomedical, clinical, and healthcare practice, which brings together healthcare specialists, clinicians, biomedical engineers, computer scientists, communication and computer network engineers, and applied mathematicians. The language of the conference is English.
TOPICS OF INTEREST
Artificial neural networks
Intelligent signal processing
Intelligent image processing
Fuzzy and neuro-fuzzy systems
Rough sets
Evolutionary computing
Probabilistic reasoning
Non-linear dynamical analysis methods
Independent Components Analysis
Belief networks
Machine learning
Artificial life
Intelligent agents
Data mining
Expert systems
Intelligent telemedicine and telecare

MEDICAL APPLICATION AREAS INCLUDE:
Signal processing
Diagnosis and therapy
Bioinformatics
Monitoring and control
Image processing and interpretation
Rehabilitation
Process modelling and simulation
Education

INTERNATIONAL STEERING COMMITTEE
Professor Dr Barrie Jervis (Conference Chair, UK), Professor Emmanuel Ifeachor (UK), Professor Periklis Ktonas (USA), Professor Paulo Lisboa (UK), Professor Antonina Starita (Italy), Professor George Papadourakis (GR)

INTERNATIONAL PROGRAMME COMMITTEE
The International Programme Committee will referee the papers submitted. Marco Gori (Italy), Ben Jansen (USA), Nicolaos B. Karayiannis (USA), David Lowe (UK), Francesco Masulli (Italy), Sifis Micheloyannis (Greece), Ryszard Tadeusiewicz (Poland), Azzam Taktak (UK), Michael Zervakis (Greece), Farhat Fnaiech (Tunisia), Jiri Jan (Czech Republic).

SUBMISSION OF PAPERS
Authors are requested to submit an extended abstract (two pages in length, single spacing; full details in the Call for Papers, available early October). Extended abstracts should clearly identify the medical or healthcare context of the work, the methodology used, the advances made and the significance of the results. Papers will be accepted as either full session oral papers or as poster papers. Authors whose abstracts are accepted will be asked to develop them into full papers of 4-6 pages in length for inclusion in the conference proceedings.
IMPORTANT DEADLINES
Submission of extended abstracts: 31 January 2003
Notification of provisional acceptance: 28 February 2003
Submission of full papers (camera ready): 18 April 2003
Receipt of Conference Booking Form: 13 June 2003

CONFERENCE WEBSITE
The temporary website address until further notice is: http://www.shu.ac.uk/conference/nnesmed

CONFERENCE FEES
The full non-residential conference fee for bookings made by 30 April 2003 is £300. Later bookings will incur a supplementary charge of £50. A daily rate of £180 is also available, which does not include evening dinner.

ACCOMMODATION AND SOCIAL PROGRAMME
The booking form will include details of residential accommodation in hotels and student residences to suit all budgets. Accommodation is not included in the conference fee. Details of the proposed social programme will appear in the Call for Papers.

CONFERENCE CONTACT
Conference 21, Sheffield Hallam University, City Campus, Sheffield, S1 1WB, England. Tel.: +44-114-225-5338/5336 Fax: +44-114-225-5337 E-mail: conference21 at shu.ac.uk

From Ronan.Reilly at may.ie Mon Oct 14 05:11:17 2002 From: Ronan.Reilly at may.ie (Ronan Reilly) Date: Mon, 14 Oct 2002 10:11:17 +0100 Subject: postdoctoral possibilities in Ireland Message-ID: Readers of the list may be interested in the possibility of postdoctoral positions in Ireland funded by the Irish Research Council for Science, Engineering, & Technology (IRCSET). An "advance notice" of the call is appended to this email. Further details will shortly appear at http://www.embark.ie. The deadline for submissions is November 1. Potential applicants should note that there is no nationality restriction on who may apply. The Department of Computer Science at NUI Maynooth is interested in hosting post-doctoral applicants in any of the Department's active research areas (http://www.cs.may.ie/research/index.html). Potential applicants should contact the relevant members of the Department directly. Ronan ______________________ Prof.
Ronan G. Reilly Department of Computer Science National University of Ireland, Maynooth Co. Kildare IRELAND v: +353-1-708 3847 f: +353-1-708 3848 w1: www.cs.may.ie/~rreilly (homepage) w2: cortex.cs.may.ie (research group) e: Ronan.Reilly at may.ie =========

Advance Notice
IRCSET/Embark Initiative Post Doctoral Fellowship Scheme

The Embark Initiative will shortly launch its Post Doctoral Fellowship Scheme. This scheme is designed to encourage excellence in research careers by funding doctoral students to associate with established research teams who have achieved international recognition for their work. Applicants will normally have submitted their PhD thesis and have certification from their university that the thesis has been submitted for examination.

Applicants will be required to submit:
a research plan with input from the candidate and host research group
a summary of Ph.D. work
a list of publications and conference presentations
a statement from the Ph.D. supervisor regarding the candidate's achievements and suitability for postdoctoral work
a statement from the proposed laboratory regarding the suitability of the candidate for the research area and the resources available for the proposed work

The award will consist of a salary contribution of €33,000 per annum and a contribution to laboratory costs and travel of up to €5,000 per annum. Awards will be tenable for 2 years. The closing date for applications will be 1st November 2002. This short call duration is necessary in order to ensure that assessment and completion of contracts are in place for the 2002/2003 academic year. The full application documents will be available in approximately one week.
From nik.kasabov at aut.ac.nz Tue Oct 15 02:49:43 2002 From: nik.kasabov at aut.ac.nz (Nik Kasabov) Date: Tue, 15 Oct 2002 19:49:43 +1300 Subject: Book announcement: Evolving Connectionist Systems Message-ID: The monograph "Evolving Connectionist Systems - Methods and Applications in Bioinformatics, Brain Study and Intelligent Machines" has just been published by Springer (http://www.springer.de) in the series Perspectives in Neurocomputing, 2002, XII, 308 pp., softcover, ISBN 1-85233-400-2. Some software, colour figures, .ppt presentations, and related papers are available from http://www.kedri.info (ECOS page). ----------------------------------------------------------------

Content:
Prologue
Part I. Evolving Connectionist Systems: Methods and Techniques
Chapter 1. Evolving processes and evolving connectionist systems
Chapter 2. Evolving connectionist systems for unsupervised learning
Chapter 3. Evolving connectionist systems for supervised learning
Chapter 4. Recurrent evolving systems, reinforcement learning, and evolving automata
Chapter 5. Evolving neuro-fuzzy inference systems
Chapter 6. Evolutionary computation and evolving connectionist systems
Chapter 7. Evolving connectionist machines: a framework, biological motivation, and implementation issues
Part II. Applications in Bioinformatics, Brain Study, and Intelligent Machines
Chapter 8. Data analysis, modelling and knowledge discovery in Bioinformatics
Chapter 9. Analysis and modelling of brain functions and cognitive processes
Chapter 10. Modelling the emergence of acoustic segments (phones) in spoken languages
Chapter 11. On-line adaptive speech recognition
Chapter 12. On-line image and video data processing
Chapter 13. Evolving systems for integrated multi-modal information processing
Epilogue
References
Extended glossary
Subject index
----------------------------------------------------------------

best regards Nik Kasabov Prof.
Nik Kasabov, MSc, PhD Fellow RSNZ, NZCS, Sr Member IEEE Director, Knowledge Engineering and Discovery Research Institute Chair of Knowledge Engineering, School of IT Auckland University of Technology (AUT) phone: +64 9 917 9506 ; fax: +64 9 917 9501 mobile phone: +64 21 488 328 WWW http://www.kedri.info email: nkasabov at aut.ac.nz From juergen at idsia.ch Tue Oct 15 10:04:39 2002 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Tue, 15 Oct 2002 16:04:39 +0200 Subject: 2003: Job Openings (Postdocs, PhDs) at IDSIA, Switzerland Message-ID: <3DAC2077.3080902@idsia.ch> For the next year we are anticipating several job openings for outstanding postdocs and PhD students interested in at least one of the following topics: 1. State-of-the-Art Recurrent Neural Networks: http://www.idsia.ch/~juergen/rnn.html 2. Optimal Incremental Universal Search Algorithms: http://www.idsia.ch/~juergen/oops.html 3. Universal Learning Algorithms: http://www.idsia.ch/~juergen/unilearn.html To apply, please follow the instructions in http://www.idsia.ch/~juergen/jobs2003.html Job interviews are possible at NIPS 2002 in Vancouver and at the NIPS workshop on Universal Learning Algorithms and Optimal Search: http://www.idsia.ch/~marcus/idsia/nipsws.htm Juergen Schmidhuber, IDSIA http://www.idsia.ch/~juergen From mvzaanen at science.uva.nl Tue Oct 15 10:51:03 2002 From: mvzaanen at science.uva.nl (Menno van Zaanen) Date: Tue, 15 Oct 2002 16:51:03 +0200 (CEST) Subject: Call for papers: Special Issue Pattern Recognition Message-ID: Apologies for Multiple Copies. Please distribute... CALL FOR PAPERS Pattern Recognition (The Journal of the Pattern Recognition Society) Special Issue on Grammatical Inference Techniques & Applications This Special Issue will be published in April, 2004 to commemorate and honor the memory of Late Professor K. S. Fu. Grammatical Inference (GI) is a collection of methodologies for learning grammars from training data. 
The most traditional field of application of GI has been syntactic pattern recognition. In the recent past, however, concerted efforts from diverse disciplines to find tractable inference techniques have added new dimensions and opened up uncharted territories. Applications of GI in more nontraditional fields include gene analysis, sequence prediction, cryptography and information retrieval. Development of algorithms for GI has evolved over the years from dealing with only positive training samples to more fundamental efforts that try to circumvent the lack of negative samples. This idea is pursued in stochastic grammars and languages, which attempt to overcome the absence of negative samples by gathering statistical information from available positive samples. Also, within the framework of information theory, probability estimation techniques (the Forward-Backward algorithm for Hidden Markov Models and the Inside-Outside algorithm for context-free languages) are focal points of investigation in the stochastic grammar field. Techniques that use intelligent search to infer the rules of a grammar are showing considerable promise. Recently, there has been a surge of activity dealing with specialized neural network architectures and dedicated learning algorithms to approach GI problems. In a more customary track, research in learning classes of transducers continues to arouse interest in the GI community. Close interaction and collaboration between different disciplines and the availability of powerful computers are fueling novel research efforts in GI. The objective of the Special Issue is to present the current status of this topic through the works of researchers in different disciplines. Original and tutorial papers are solicited that address theoretical and practical issues on this theme. 
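For readers less familiar with the probability estimation techniques mentioned above, here is a minimal sketch of the forward pass that the Forward-Backward family builds on: computing the likelihood of an observation sequence under an HMM. The two-state model and all its probabilities are invented for illustration and are not part of the call.

```python
# Forward algorithm: likelihood of an observation sequence under a small HMM.
# All model parameters below are illustrative assumptions.

def forward_likelihood(pi, A, B, obs):
    """pi[i]: initial state probabilities, A[i][j]: transition
    probabilities, B[i][o]: emission probabilities, obs: observation indices."""
    n = len(pi)
    # Initialise with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Propagate the forward variables through the remaining observations.
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

# Two hidden states, two observation symbols; numbers made up.
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.5, 0.5], [0.1, 0.9]]
print(forward_likelihood(pi, A, B, [0, 1, 1]))  # approximately 0.146
```

The backward pass is the mirror-image recursion; Baum-Welch combines the two to re-estimate the parameters from positive samples alone, which is exactly the appeal noted above for learning without negative data.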
Topics of interest include (but are not limited to): Theory: Neural network framework and learning algorithms geared to GI GI via heuristic and genetic search Inference mechanisms for stochastic grammars/languages Algebraic methods for identification of languages Transduction learning Applications: Image processing and computer vision Biosequence analysis and prediction Speech and natural language processing Data mining/information retrieval Optical character recognition Submission Procedure: Only electronic (ftp) submission will be accepted. Instructions for submission of papers will be posted on November 10 at the guest editor's web site (http://www-ee.ccny.cuny.edu/basu) . All submitted papers will be reviewed according to guidelines and standards of Pattern Recognition. Deadlines: Manuscript Submission: December 10, 2002 Notification of Acceptance: April 16, 2003 Final Manuscript Due: June 16, 2003 Publication Date: April 2004 Guest Editor: Mitra Basu , The City College of CUNY, New York, U.S.A. basu at ccny.cuny.edu +-------------------------------------+ | Menno van Zaanen | "Let him not vow to walk in the dark, | mvzaanen at science.uva.nl | who has not seen the nightfall." | http://www.science.uva.nl/~mvzaanen | -Elrond From sml at essex.ac.uk Tue Oct 15 07:50:19 2002 From: sml at essex.ac.uk (Lucas, Simon M) Date: Tue, 15 Oct 2002 12:50:19 +0100 Subject: ICDAR 2003 Competitions and Datasets Message-ID: <7AC902A40BEDD411A3A800D0B7847B66E0FA8A@sernt14.essex.ac.uk> Dear All, For ICDAR 2003 (International Conference on Document Analysis and Recognition) we are running some competitions that may be of interest to readers of this list. The competition areas include cursive script recognition, page and table segmentation, and various text-in-scene (robust reading) problems. Some of these competitions have new datasets associated with them, which you can download. 
For more details: http://algoval.essex.ac.uk/icdar/Competitions.html Best regards, Simon Lucas (Competitions chair, ICDAR 2003) -------------------------------------------------- Dr. Simon Lucas Department of Computer Science University of Essex Colchester CO4 3SQ United Kingdom Email: sml at essex.ac.uk http://cswww.essex.ac.uk -------------------------------------------------- From markman at psyvax.psy.utexas.edu Wed Oct 16 10:02:51 2002 From: markman at psyvax.psy.utexas.edu (Art Markman) Date: Wed, 16 Oct 2002 09:02:51 -0500 Subject: Cognitive Science Society Virtual Seminar - Dr. James McClelland Message-ID: Virtual Seminar Series October 15, 2002 The Cognitive Science Society will be hosting a virtual colloquium series this year presented live via the Internet, with the first talk given by Jay McClelland in October. Topic: Semantic Cognition: A Parallel Distributed Processing Approach Time: Friday, October 25, 2002 1:00pm US Eastern Standard Time Presenter: Dr. James McClelland Carnegie Mellon University There are two technologies available to participate in the seminar - Web Conferencing or Phone in: Web Conferencing Voice over IP - using a PC and Internet Explorer: Address: www.voicecafe.cc/aptima/client.htm Username: seminar Password: 214365 Phone in - using a toll line (i.e., you'll be charged your regular long distance rates): Ahead of time download the slides from: http://www.cognitivesciencesociety.org/colloquium Call Phone Number: 1-620-584-8200 Enter Code: 37004# Web Conferencing installation and rehearsal Installation There is a one-time automatic download of required software plug-ins. This will take about five minutes to complete. If you have a Windows 2000 operating system, you will be required to log in to the "Administrator" account to download the software. 
Using Internet Explorer, Address: www.voicecafe.cc/aptima/client.htm Username: seminar Password: 214365 Rehearsal Session Tuesday, October 22 at 2:00pm US EST Thursday, October 24 at 10:00am US EST We encourage you to attend one of two brief 15-minute sessions on Aptima's Web Conferencing Facility. If you should have any difficulties during the practice session, call the audio line which will be open: 1-620-584-8200 Code: 37004# If you have any installation issues prior to the seminar view the FAQs or contact Gilbert Mizrahi. If you have difficulty on the day of the seminar, call for technical support 781-935-3966 x214 or email cta at aptima.com Please forward this invitation to colleagues who would benefit from this seminar. Sincerely, Art Markman markman at psy.utexas.edu Dr. Arthur B. Markman University of Texas Department of Psychology Austin, TX 78712 512-232-4645 The seminar series is sponsored by the Office of Naval Research From greiner at cs.ualberta.ca Thu Oct 17 00:03:40 2002 From: greiner at cs.ualberta.ca (Russ Greiner) Date: Wed, 16 Oct 2002 22:03:40 -0600 Subject: Alberta Ingenuity Centre for Machine Learning Message-ID: <20021017040350Z80335-23157+1339@sunkay.cs.ualberta.ca> We are pleased to announce the creation of the new Alberta Ingenuity Centre for Machine Learning. This multi-year, multi-million dollar centre, located at the University of Alberta (Edmonton), will conduct the highest quality research in both fundamental and applied machine learning. While we will initially focus on * bioinformatics * interactive entertainment (including computer games) we are eager to extend to any other area related to Machine Learning and Datamining. We are currently recruiting at essentially EVERY level: faculty members (junior or senior; even endowed chairs!) 
post-doctoral fellows / research associates graduate students -- both MSc and PhD We also have a substantial budget to support visitors, both short and long term. For more information, see http://www.aicml.ca or contact us at recruit at aicml.ca | R Greiner Phone: (780) 492-5461 | | Director, Alberta Ingenuity Centre for Machine Learning | | Dep't of Computing Science FAX: (780) 492-1071 | | University of Alberta Email: greiner at cs.ualberta.ca | | Edmonton, AB T6G 2E8 Canada http://www.cs.ualberta.ca/~greiner/ | From nnk at his.atr.co.jp Thu Oct 17 02:10:58 2002 From: nnk at his.atr.co.jp (Neural Networks Japan Office) Date: Thu, 17 Oct 2002 15:10:58 +0900 Subject: [REMINDER] Call for Papers NN 2003 Special Issue on Neuroinformatics Message-ID: [Apologies if you receive this announcement more than once.] ****************************************************************** CALL FOR PAPERS Neural Networks 2003 Special Issue "Neuroinformatics" ****************************************************************** ---------------> The deadline for submission is close <------------ Co-Editors Professor Shun-ichi Amari, RIKEN Brain Science Institute Professor Michael A Arbib, University of Southern California Dr. Rolf Kotter, Heinrich Heine University Dusseldorf Submission Deadline for submission: October 30, 2002 Notification of acceptance: March 31, 2003 Format: as for normal papers in the journal (APA format) and no longer than 10,000 words Address for Papers Dr. Mitsuo Kawato ATR Human Information Science Laboratories 2-2-2 Hikaridai, Seika-cho Soraku-gun, Kyoto 619-0288, Japan. Neuroinformatics is an emerging field, integrating approaches from neuroscience and information science/technology to understanding the structure and function of the brain. Neuroinformatics research is interdisciplinary and aims at unraveling the complex structure-function relationships of the brain at all levels and scales of analysis. 
Neuroinformatics will accelerate the progress of neuroscience and informatics, for example, by * more efficient use of neuroscience data by information-based approaches, * developing and applying new tools and methods for acquiring, visualizing and analyzing data, * developing new methodologies for generating theories to derive further experiments and engineering applications, * generating computational theories connecting neuroscience and information science/technology. The Special Issue will include invited and contributed articles taking a broad view of neuroinformatics, with special emphasis on database construction, data mining, ontologies for neural systems, and the integration of simulation methods with data analysis. Neural Networks Official home page http://www.elsevier.com/inca/publications/store/8/4/1/index.htt Instructions to Authors http://authors.elsevier.com/GuideForAuthors.html?PubID=841&dc=GFA ----------------------------------------------------------------- END. -- ==================================================================== NEURAL NETWORKS Editorial Office ATR-I, Human Information Science Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan TEL +81-774-95-1058 FAX +81-774-95-2647 E-MAIL nnk at his.atr.co.jp ========================================================= EMAIL ADDRESS HAS BEEN CHANGED FROM OCT.1, 2001 ========================================================= From bogus@does.not.exist.com Fri Oct 18 03:29:00 2002 From: bogus@does.not.exist.com () Date: Fri, 18 Oct 2002 09:29:00 +0200 Subject: special sessions at ESANN'2003 Message-ID: ESANN'2003 11th European Symposium on Artificial Neural Networks Bruges (Belgium) - April 23-24-25, 2003 Special sessions ===================================================== The following message contains a summary of all special sessions that will be organized during the ESANN'2003 conference. 
Authors are invited to submit their contributions to one of these sessions or to a regular session, according to the guidelines found on the web server of the conference (http://www.dice.ucl.ac.be/esann/). Deadline for submissions is December 6, 2002. List of special sessions that will be organized during the ESANN'2003 conference ======================================================================== 1. Links between neural networks and webs (M. Gori) 2. Mathematical aspects of neural networks (B. Hammer, T. Villmann) 3. Statistical learning and kernel-based algorithms (M. Pontil, J. Suykens) 4. Digital image processing with neural networks (A. Wismüller, U. Seiffert) 5. Industrial and agronomical applications of neural networks (L.M. Reyneri) 6. Neural networks for human/computer interaction (C.W. Omlin) Short description ================= 1. Links between neural networks and webs ----------------------------------------- Organised by : Marco Gori, University of Siena (Italy) Description of the session: Artificial neural networks have been the subject of massive in-depth investigation in the last twenty years. General theories on architectural and learning issues are now spread in the scientific community, are well known, and have been widely disseminated. The recent development of the Web, with the corresponding crucial problem of performing information retrieval, has recently been addressed by introducing the concept of page rank (the Google search engine), which is a sort of visibility index of the pages on the Web. Interestingly enough, one of the most successful solutions to page scoring is based on a dynamical model which reminds us of a neural network. Recent extensions of this model give page scoring systems tightly related to neural networks, the main difference being that each unit typically accepts an input which is roughly constant. 
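The page-rank dynamics alluded to in this session description can be illustrated by a plain power iteration on a link graph: each page repeatedly redistributes its score along its outgoing links, plus a constant "teleportation" input to every unit. The three-page graph and damping factor below are invented for illustration and are not part of the announcement.

```python
# Power-iteration page rank on a toy link graph (illustrative only).

def pagerank(links, d=0.85, iters=100):
    """links[i]: list of pages that page i links to (no dangling pages)."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        # Constant teleportation term: the roughly constant unit input.
        new = [(1.0 - d) / n] * n
        for i, outs in enumerate(links):
            for j in outs:
                new[j] += d * rank[i] / len(outs)  # i shares its score
        rank = new
    return rank

# Toy web: page 0 links to 1 and 2, page 1 to 2, page 2 back to 0.
ranks = pagerank([[1, 2], [2], [0]])
```

Viewed this way, the iteration is a linear recurrent network whose fixed point is the score vector, which is what makes the neural-network reading of page scoring natural.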
A more general view of page scoring systems includes, however, the presence of a dynamics on the inputs of the nodes and the introduction of parameters which are dual with respect to the weights of a neural network. This special session addresses the newly emerging notion of the web as a sort of huge artificial neural network, whose learning environment is distributed throughout the units in a single instance, which comes in conjunction with the network, instead of being presented as a collection of examples. The Web is a notable case where the learning data can be the users' relevance feedback on the pages, but other interesting examples can be given. In the special session, we also expect to lay the foundations of web learning as a generalization of the theory of adaptive computation on structured domains. 2. Mathematical aspects of neural networks ------------------------------------------ Organised by : Barbara Hammer, Univ. Osnabrück (Germany) Thomas Villmann, Univ. Leipzig (Germany) Description of the session: ESANN has become the reference for researchers on fundamentals and theoretical aspects of neural networks. Nevertheless, an increasing number of presentations of successful neural network applications could be found at past ESANN conferences. This might be due to two reasons: theoretical aspects of neural networks are now well understood; theory of neural networks directly leads to improved algorithms. Though this is certainly the case with respect to many aspects, we believe that there still exist many questions concerning neural networks which are not yet understood or even adequately formalized and where direct applicability cannot be expected in the near future. We would like to open a forum for mathematical aspects of neural networks with a focus on in-principle possibilities of formalizing heuristic observations, open questions and directions of theoretical research, and mathematical results for possibly not yet practically relevant situations. 
We encourage submissions which could be related to the following topics: capacity and approximation results, complexity of neural network training, learning theory, convergence and stability of network dynamics or training, alternative mathematical descriptions of models, evaluation of network behaviour in non-standard domains, ... 3. Statistical learning and kernel-based algorithms --------------------------------------------------- Organised by : Massimiliano Pontil, Univ. Siena (Italy) Johan Suykens, K.U. Leuven (Belgium) Description of the session: Over the past few years, statistical learning theory has emerged as a principled approach for learning from examples. The theory has grown on ideas from different fields, including empirical processes, statistics, regularization, convex optimization, to name a few, and has produced remarkable learning algorithms, such as the popular support vector machine. Those algorithms make use of reproducing kernel Hilbert spaces as the key computational ingredient and bring together older ideas from statistics (e.g., ridge regression, principal components, canonical correlation analysis, to name a few). We encourage authors to submit papers which either present novel algorithmic ideas or discuss important application studies of already existing kernel-based learning algorithms. 4. Digital image processing with neural networks ------------------------------------------------ Organised by : Axel Wismüller, Univ. Munich (Germany) Udo Seiffert, Univ. Magdeburg (Germany) Description of the session: In the proposed special session we want to focus on image processing based on neural networks as well as other advanced methods of computational intelligence. A special emphasis is put on real-world applications combining original ideas and new developments with a strong theoretical background. Authors are invited to submit contributions which can be in any area of image processing with neural networks. 
The following non-restrictive list can serve as an orientation, however, additional topics may be chosen as well: Real-world image processing applications in science and industry, e.g. for robotics, security, biometry, medicine, biology, ... Preprocessing and feature extraction: dimension and noise reduction, image enhancement, edge detection, compression, ... Segmentation and object recognition, texture and colour analysis Image registration, matching, morphing Image understanding and scene analysis Methods for neural network image processing: classification, clustering, embedding, hybrid systems, ... Multidimensional image analysis, image time-series Knowledge discovery in image databases, web applications 5. Industrial and agronomical applications of neural networks ------------------------------------------------------------- Organised by : Leonardo M. Reyneri, Politecnico di Torino (Italy) Description of the session: For more than 30 years, neural networks have given rise to a huge amount of theoretical studies and developments, research and evaluation phases, etc. The theory is now becoming rather stable, and neural networks, together with fuzzy systems and other soft computing paradigms, have become knowledge that every expert should command. At this stage, it is of utmost importance to evaluate the relevance and effectiveness of neuro-fuzzy systems in real-world applications. Too few such papers are found in the literature; contributors are therefore invited to share their experience by presenting papers which describe an application of a neuro-fuzzy system to an industrial or agronomical problem. 
Requirements for papers submitted to this session:
i) the paper should briefly describe the problems and limitations of older approach(es); in case of obvious constraints due to intellectual property or non-disclosure agreements, details of the problem can be masked or modified, provided that the reader can get a feeling for the complexity;
ii) enough details should be given about network paradigm, size, topology and training rule;
iii) a table SHALL be included which lists, at least: design time (e.g. in days), training time (e.g. in days, NOT training epochs), and one to three performance figures (in appropriate units for the problem). All these figures shall be given for the proposed solution AND for at least one other (possibly more) non-neural approach(es).
6. Neural networks for human/computer interaction ------------------------------------------------- Organised by : Christian W. Omlin, Univ. Western Cape (South Africa) Description of the session: text not yet available ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. 
Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From hasselmo at bu.edu Fri Oct 18 15:04:14 2002 From: hasselmo at bu.edu (Michael Hasselmo) Date: Fri, 18 Oct 2002 15:04:14 -0400 Subject: IJCNN 2003 - Portland, Oregon, USA - July 20-24, 2003 Message-ID: CALL FOR PAPERS **************************************************************** International Joint Conference on Neural Networks (IJCNN 2003) Portland, Oregon, July 20-24, 2003 http://www.ijcnn.net DEADLINE: January 29, 2003 **************************************************************** Co-sponsored by the International Neural Network Society (INNS) And the IEEE Neural Networks Society. Paper submission deadline is January 29, 2003. Selected papers will be published in a special issue of the journal Neural Networks, in addition to publication of all papers in the conference proceedings. The International Joint Conference on Neural Networks provides an overview of state of the art research in Neural Networks, covering a wide range of topics (see topic list below). The IJCNN meeting is organized annually by the International Neural Network Society (INNS) and the IEEE Neural Networks Society. Conference attendees who are INNS or IEEE Neural Networks Society members, or who join one of these societies now will receive a reduced IJCNN conference registration fee, and those who are INNS members will receive the IJCNN special issue for free as part of their annual membership subscription to Neural Networks. Location: ------------------------------------ The conference will take place at the Doubletree Hotel Portland-Columbia River, Portland, Oregon, July 20-24, 2003. 
For more information about Portland see http://www.ijcnn.net Article submission: ------------------------------------ Authors should submit their articles electronically on the conference web site at http://www.ijcnn.net by the conference deadline of January 29, 2003. The site opens on October 15, 2002. Special issue of the journal Neural Networks ------------------------------------ The review process of the conference will allow selection of a large subset of the articles for inclusion in the special issue of the journal Neural Networks. For more information about this journal see: http://www.elsevier.com/locate/neunet Plenary speakers: ------------------------------------ Kunihiko Fukushima, Tokyo University of Technology, Japan Earl Miller, Massachusetts Institute of Technology, USA Terrence Sejnowski, Salk Institute and UCSD, USA Vladimir Vapnik, NEC Research Labs, USA Christoph von der Malsburg, USC, USA and Univ. Bochum, Germany Special sessions: ------------------------------------ There will be a number of special sessions, including the following titles: 1. Neuroinformatics 2. Visual cortex: How illusions represent reality 3. Dynamical aspects of information encoding 4. Incremental Learning 5. Attention and consciousness in normal brains: Theoretical models and phenomenological data from MEG Tutorials ------------------------------------ Tutorials will take place on Sunday, July 20, 2003. Two-hour sessions will cover a range of different topics. Researchers interested in proposing a tutorial should access the web site at http://www.ijcnn.net. Topic list ------------------------------------ Regular oral and poster sessions will include papers in the following topics: A. PERCEPTUAL AND MOTOR FUNCTION Vision and image processing Pattern recognition Face recognition Handwriting recognition Other pattern recognition Auditory and speech processing Audition Speech recognition Speech production Other perceptual systems Motor control and response B. 
COGNITIVE FUNCTION Cognitive information processing Learning and memory Spatial Navigation Conditioning, Reward and Behavior Mental disorders Attention and Consciousness Language Emotion and Motivation C. COMPUTATIONAL NEUROSCIENCE Models of neurons and local circuits Systems neurobiology and neural modeling Spiking neurons D. INFORMATICS Neuroinformatics Bioinformatics Artificial immune systems Data mining E. HARDWARE Neuromorphic hardware and implementations Embedded neural networks F. REINFORCEMENT LEARNING AND CONTROL Reinforcement learning Approximate/Adaptive dynamic programming Control Reconfigurable systems Robotics Fuzzy neural systems Optimization G. DYNAMICS Neurodynamics Recurrent networks Chaos and learning theory H. THEORY Mathematics of Neural Systems Support vector machines Extended Kalman filters Mixture models, EM algorithms and ensemble learning Radial basis functions Self-organizing maps Adaptive resonance theory Principal component analysis and Independent component analysis Probabilistic and information-theoretic methods Neural Networks and Evolutionary Computation I. APPLICATIONS Signal Processing Telecommunications Applications Time Series Analysis Biomedical Applications Financial Engineering Biomimetic applications Computer security applications Power system applications Aeroinformatics Diagnostics and Quality Control Other applications General Chair: Don Wunsch, University of Missouri - Rolla Program Chair: Michael Hasselmo, Boston University Program co-chairs: DeLiang Wang, Ohio State University Ganesh K. Venayagamoorthy,University of Missouri - Rolla Tutorial co-chairs: F. 
Carlo Morabito, University of Reggio Calabria, Italy Harold Szu, Office of Naval Research Local Arrangements Chair: George Lendaris, Portland State University Publicity chair: Derong Liu, University of Illinois at Chicago Web chair: Tomasz Cholewo, Lexmark International Inc., Kentucky Exhibits chair: Karl Mathia, Brooks-PRI Automation Inc., California Student travel and volunteer chair: Slawo Wesolkowski, University of Waterloo, Canada International Liaison: William N. Howell, Mining and Mineral Sciences Laboratories, Canada Program committee: ------------------------------------------ David Brown, FDA David Casasent, Carnegie Mellon University Ke Chen, University of Birmingham, UK Michael Denham, University of Plymouth, UK Tom Dietterich, Oregon State University Lee Feldkamp, Ford Motor Company Kunihiko Fukushima, Tokyo University of Technology, Japan Joydeep Ghosh, University of Texas at Austin Stephen Grossberg, Boston University Fred Ham, Florida Institute of Technology Ron Harley, Georgia Institute of Technology Bart Kosko, University of Southern California Robert Kozma, University of Memphis Dan Levine, University of Texas at Dallas Xiuwen Liu, Florida State University F. Carlo Morabito, Universita di Reggio Calabria, Italy Ali Minai, University of Cincinnati Catherine Myers, Rutgers University Erkki Oja, Helsinki University of Technology, Finland Jose Principe, University of Florida Danil Prokhorov, Ford Motor Company Harold Szu, Office of Naval Research John Gerald Taylor, University College, London, UK Shiro Usui, Toyohashi Univ. of Technology, Japan Bernie Widrow, Stanford University Lei Xu, The Chinese University of Hong Kong Gary Yen, Oklahoma State University Lotfi Zadeh, University of California, Berkeley Review committee: ----------------------------------------------------- Reviews will be performed by a group of over 130 researchers in the field. The review committee member list will be posted on the IJCNN web site. 
For more information see the web page at http://www.ijcnn.net or contact INNS at: 19 Mantua Road Mt. Royal, NJ 08061 856-423-0162 or FAX: 856-423-3420. From bert at snn.kun.nl Fri Oct 18 08:11:31 2002 From: bert at snn.kun.nl (Bert Kappen) Date: Fri, 18 Oct 2002 14:11:31 +0200 (MEST) Subject: Promedas: a decision support system for medical diagnosis Message-ID: Dear all, we recently completed a report describing the state-of-the-art of our ongoing efforts in medical diagnosis, the Promedas project. "PROMEDAS": a probabilistic decision support system for medical diagnosis The objective of our project is to build a large Bayesian network for diagnosis in internal medicine. As is well known, this is not easy and requires the combined efforts of experts in internal medicine as well as advanced software and algorithmic development. On the medical side, one of the distinguishing features of our work is that we have the financial resources to contract physicians to do the medical modeling and evaluation in a clinical setting. In our view, this is critical 1) to obtain valid models and 2) to gain acceptance by potential users. On the algorithmic side, our research group in Nijmegen has experience with approximate inference techniques that are needed to keep computation tractable. We have developed our own graphical model software, called BayesBuilder, which is freely available for non-commercial use. The maturity of our work can be described as 'close to clinical practice'. We have just started the first clinical trials to evaluate the acceptance of a module on lipids and vascular diseases (approx 500 variables) by the intended users, i.e. experts in internal medicine working in a hospital environment. 
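The kind of probabilistic reasoning such a diagnostic network performs can be sketched at its smallest scale: one disease node, one finding node, and Bayes' rule. All probabilities below are invented for illustration and have nothing to do with the actual Promedas models, which couple hundreds of variables.

```python
# Two-node Bayesian diagnosis: P(disease | positive finding) by Bayes' rule.
# The prior, sensitivity and specificity are illustrative assumptions.

def posterior(prior, sensitivity, specificity):
    """P(disease | positive finding)."""
    # Total probability of a positive finding: true positives + false positives.
    p_pos = sensitivity * prior + (1.0 - specificity) * (1.0 - prior)
    return sensitivity * prior / p_pos

# A rare condition and a fairly accurate test: the posterior stays
# below 10% despite the positive finding (the classic base-rate effect).
p = posterior(prior=0.01, sensitivity=0.95, specificity=0.90)
print(round(p, 3))  # 0.088
```

A full diagnostic network chains many such conditional tables and needs the approximate inference techniques mentioned above, since exact propagation becomes intractable at a few hundred variables.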
The report can be downloaded from http://www.snn.kun.nl/~bert/#diagnosis Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: www.snn.kun.nl/~bert From kim.plunkett at psy.ox.ac.uk Sat Oct 19 11:55:27 2002 From: kim.plunkett at psy.ox.ac.uk (Kim Plunkett) Date: Sat, 19 Oct 2002 16:55:27 +0100 Subject: Oxford Connectionist Summer School Message-ID: <003b01c27787$f1495340$30274381@KIMSLAPTOP> UNIVERSITY OF OXFORD OXFORD SUMMER SCHOOL ON CONNECTIONIST MODELLING Department of Experimental Psychology University of Oxford Sunday 20th July - Friday 1st August, 2003 Applications are invited for participation in a 2-week residential Summer School on techniques in connectionist modelling. The course is aimed primarily at researchers who wish to exploit neural network models in their teaching and/or research and it will provide a general introduction to connectionist modelling, biologically plausible neural networks and brain function through lectures and exercises on Macintoshes and PCs. The course is interdisciplinary in content though many of the illustrative examples are taken from cognitive and developmental psychology, and cognitive neuroscience. The instructors with primary responsibility for teaching the course are Kim Plunkett and Edmund Rolls. No prior knowledge of computational modelling will be required though simple word processing skills will be assumed. Participants will be encouraged to start work on their own modelling projects during the Summer School. The cost of participation in the Summer School is £950. This figure covers the cost of accommodation (bed and breakfast at St. John's College), registration and all literature required for the Summer School. Participants will be expected to cover their own travel and meal costs. A number of partial bursaries (£200) will be available for graduate students. 
Applicants should indicate whether they wish to be considered for a graduate student scholarship but are advised to seek further funding as well, since in previous years the number of graduate student applications has far exceeded the number of scholarships available. If you are interested in participating in the Summer School, please complete the application form at the web address http://epwww.psych.ox.ac.uk/conferences/connectionist_modelling or alternatively send a brief description of your background with an explanation of why you would like to attend the Summer School, to: Mrs Sue King Department of Experimental Psychology University of Oxford South Parks Road Oxford OX1 3UD Tel: (01865) 271353 Email: susan.king at psy.ox.ac.uk no later than 28th February 2003. From rens at science.uva.nl Sun Oct 20 14:14:53 2002 From: rens at science.uva.nl (Rens Bod) Date: Sun, 20 Oct 2002 20:14:53 +0200 (MEST) Subject: New article on Unified Model of Linguistic and Musical Processing In-Reply-To: <003b01c27787$f1495340$30274381@KIMSLAPTOP> Message-ID: Dear Connectionists, The following paper may be of interest to the readers of this list. Best, Rens Bod (http://turing.wins.uva.nl/~rens) ---------------------------- Bod, Rens (2002) "A Unified Model of Structural Organization in Language and Music", Journal of Artificial Intelligence Research (JAIR) Volume 17, pages 289-308. Available at http://turing.wins.uva.nl/~rens/jair02.pdf (or also via http://www.jair.org/abstracts/bod02a.html) Abstract: Is there a general model that can predict the perceived phrase structure in language and music? While it is usually assumed that humans have separate faculties for language and music, this work focuses on the commonalities rather than on the differences between these modalities, aiming at finding a deeper 'faculty'.
Our key idea is that the perceptual system strives for the simplest structure (the 'simplicity principle'), but in doing so it is biased by the likelihood of previous structures (the 'likelihood principle'). We present a series of data-oriented parsing (DOP) models that combine these two principles and that are tested on the Penn Treebank and the Essen Folksong Collection. Our experiments show that (1) a combination of the two principles outperforms the use of either of them, and (2) exactly the same model with the same parameter setting achieves maximum accuracy for both language and music. We argue that our results suggest an interesting parallel between linguistic and musical structuring. From terry at salk.edu Mon Oct 21 17:34:48 2002 From: terry at salk.edu (Terry Sejnowski) Date: Mon, 21 Oct 2002 14:34:48 -0700 (PDT) Subject: Computational Biology of Time Message-ID: <200210212134.g9LLYmI98120@purkinje.salk.edu> COMPUTATIONAL BIOLOGY OF TIME Organizers: Terrence Sejnowski and Sydney Brenner January 31 - February 4, 2003 Banff Centre - Banff, Alberta http://www.keystonesymposia.org/Sites/SitesDetail.cfm?SiteID=19 Abstract Deadline: November 1, 2002 Early Registration Deadline: December 2, 2002 http://www.keystonesymposia.org/Meetings/ViewMeetings.cfm?MeetingID=659 Time is the final frontier in biology and uncovering molecular and cellular mechanisms in cells that keep time is essential to understanding biological systems. Biological clocks cover a wide range of time scales, from the heartbeat to circadian rhythms. In each of these systems, molecular mechanisms are being uncovered that underlie these rhythms and stabilize them, but the number of molecules and the complexity of their interactions are daunting. There is growing interest in applying computational models to these biological systems. 
This symposium brings together some of the leading computational model builders and key researchers studying the circadian clock, photoperiodism in plants, the cell cycle in yeast, cardiac rhythms, brain rhythms that occur during sleep and firefly synchronization. The mathematical principles that emerge from the models highlight deep similarities that exist between these diverse systems, and allow a broader understanding to emerge for how biological systems organize time in robust and effective ways. Friday, January 31, 7:30 - 8:30 PM: Keynote Address: Sydney Brenner, 2002 Nobel Prize in Physiology or Medicine HOW CELLS COMPUTE Saturday, February 1, 8:00 - 11:00 AM CIRCADIAN RHYTHMS Joseph S. Takahashi, Northwestern University "Circadian Clock Genes" Martha U. Gillette, University of Illinois "Circadian Pacemaker in the Suprachiasmatic Nucleus" Stanislas Leibler, Rockefeller University "Oscillations and Noise in Genetic Networks" Albert Goldbeter, Université Libre de Bruxelles "Computational Biology of Circadian Rhythms" Saturday, February 1, 5:00 - 7:00 PM COUPLED BIOLOGICAL OSCILLATORS Andrew Moiseff, University of Connecticut "Temporal Rhythms in Firefly Communication" Wolfgang O. Friesen, University of Virginia "Coupled Central and Peripheral Oscillators Generate Efficient Swim Undulations" G. Bard Ermentrout, University of Pittsburgh "Coupled Neural Oscillators" Sunday, February 2, 8:00 - 11:00 AM SLEEP RHYTHMS David A. McCormick, Yale University "Slow Oscillations in Thalamic and Cortical Slices" Mircea Steriade, Université Laval "Sleep Oscillations In Vivo" Terrence Sejnowski, Salk Institute "Neural Models of Sleep Rhythms" Alexander A. Borbely, University of Zurich "Sleep in Humans: Intrinsic and Extrinsic Oscillations" Sunday, February 2, 5:00 - 7:00 PM PHOTOPERIODISM Steve A. Kay, The Scripps Research Institute "Comparative Genetics and Genomics Approaches to Understanding Circadian Clock and Photoperiodism" Susan S.
Golden, Texas A & M University "Plasticity of circadian rhythms of gene expression in cyanobacteria" Takao Kondo, Nagoya University "Genome-Wide Circadian System of Cyanobacteria Driven by Kai Feedback Loop" Monday, February 3, 8:00 - 11:00 AM CARDIAC RHYTHMS Denis Noble, University of Oxford "The Modes of Oscillation of the Heart" Peter Hunter, University of Auckland "Electro-Mechanical Heart Model" John Peter Wikswo Jr. , Vanderbilt University "Cardiac Reentry as a Spatiotemporal Oscillator" Leon Glass, McGill University "Puzzles Concerning the Starting and Stopping of Biological Oscillations" Monday, February 3, 5:00 - 7:00 PM CELL CYCLE John Tyson, Virginia Polytechnic Institute "Cyberyeast: Modeling the Eukaryotic Cell Cycle" Marc W. Kirschner, Harvard Medical School "Modeling the Wnt Signaling Pathway" ----- For more than 30 years, Keystone Symposia has been connecting the scientific community in a way no other meeting or conference can. Your opportunity to enjoy quality scientific discussions, networking among colleagues, and cutting-edge presentations -- all in a relaxed atmosphere -- is here. For more information about the Banff Center in Alberta, Canada: http://www.keystonesymposia.org/Sites/SitesDetail.cfm?SiteID=19 ----- From agathe at dcs.gla.ac.uk Mon Oct 21 10:03:25 2002 From: agathe at dcs.gla.ac.uk (Agathe Girard) Date: Mon, 21 Oct 2002 15:03:25 +0100 Subject: technical report available Message-ID: <3DB4092D.638E5042@dcs.gla.ac.uk> Dear All, The following new technical report Gaussian Process Priors with Uncertain Inputs: Multiple-Step Ahead Prediction A. Girard, C. E. Rasmussen and R. Murray-Smith is available at http://www.dcs.gla.ac.uk/~agathe/reports.html Feedback most appreciated! Regards, Agathe Girard http://www.dcs.gla.ac.uk/~agathe %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% Abstract: We consider the problem of multi-step ahead prediction in time series analysis using the non-parametric Gaussian process model. 
$k$-step ahead forecasting of a discrete-time non-linear dynamic system can be performed by doing repeated one-step ahead predictions. For a state-space model of the form $y_t=f(y_{t-1}, \dots, y_{t-L})$, the prediction of $y$ at time $t+k$ is based on the estimates ${\hat y_{t+k-1}}, \dots, {\hat y_{t+k-L}}$ of the previous outputs. We show how, using an analytical Gaussian approximation, we can formally incorporate the uncertainty about intermediate regressor values, thus updating the uncertainty on the current prediction. In this framework, the problem is that of predicting responses at a random input and we compare the Gaussian approximation to the Monte-Carlo numerical approximation of the predictive distribution. The approach is illustrated on a simulated non-linear dynamic example, as well as on a simple one-dimensional static example. From O.Simonnot at elsevier.com Tue Oct 22 06:05:53 2002 From: O.Simonnot at elsevier.com (Simonnot, Olivier (ELS)) Date: Tue, 22 Oct 2002 11:05:53 +0100 Subject: The Computer Science Preprint Server Message-ID: <4D56BD81F62EFD49A74B1057ECD75C06046C974C@elsamss02571> Computer scientists can now fully enjoy a completely free platform facilitating the exchange of scientific information, The Computer Science Preprint Server "http://www.compscipreprints.com". 
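As an aside on the Girard, Rasmussen and Murray-Smith abstract above: the iterated $k$-step-ahead scheme and the Monte-Carlo baseline it is compared against can be sketched in a few lines. This is only an illustration, not the report's Gaussian process model; the one-step map f, the noise level SIGMA and the lag L=1 are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a learned one-step model y_t = f(y_{t-1}) + noise
# (lag L = 1; the map and noise level are illustrative assumptions).
def f(y):
    return 0.9 * np.sin(y) + 0.5 * y

SIGMA = 0.1  # one-step predictive standard deviation, assumed known

def naive_k_step(y0, k):
    """Iterate the point prediction, ignoring the growing input uncertainty."""
    y = y0
    for _ in range(k):
        y = f(y)
    return y

def monte_carlo_k_step(y0, k, n_samples=20_000):
    """Approximate the k-step predictive distribution by sampling, i.e.
    by predicting at a random input at every intermediate step."""
    y = np.full(n_samples, float(y0))
    for _ in range(k):
        y = f(y) + SIGMA * rng.standard_normal(n_samples)
    return y.mean(), y.std()

mean_mc, std_mc = monte_carlo_k_step(1.0, k=10)
print("naive 10-step point estimate:", naive_k_step(1.0, 10))
print("Monte-Carlo 10-step mean / std:", mean_mc, std_mc)
```

The sampled k-step standard deviation exceeds the one-step SIGMA, which is exactly the accumulated regressor uncertainty that the report's analytical Gaussian approximation propagates in closed form instead of by sampling.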
The Computer Science Preprint Server provides you with: * FREE full text access to the articles already posted * final-version articles as well as reports on work in progress * streamlined and easy to use submission process with instant online visibility of your article * as an author, total freedom to remove your article at any time, and/or to submit it for publication to the journal of your choice * discussion threads to get direct feedback from your peer researchers * an extended alerting service enabling you to keep on track with the latest developments but also make your article visible From canete at ctima.uma.es Wed Oct 23 06:41:34 2002 From: canete at ctima.uma.es (Javier Fernández de Cañete) Date: Wed, 23 Oct 2002 12:41:34 +0200 Subject: 8th Conference of Eng. Applications of Neural Networks EANN'03. Call for papers Message-ID: <004801c27a80$c1ba2680$836dd696@isa.uma.es> Call for Papers and Participation Eighth International Conference on Engineering Applications of Neural Networks Costa del Sol, Málaga, Spain 8-10 September 2003 The conference is a forum for presenting the latest results on neural network applications in technical fields. The applications may be in any engineering or technical field, including but not limited to: building systems, systems engineering, mechanical engineering, robotics, process engineering, metallurgy, pulp and paper technology, aeronautical engineering, computer science, machine vision, chemistry, chemical engineering, physics, electrical engineering, electronics, civil engineering, geophysical sciences, biomedical systems, and environmental engineering. This year's conference is organised by the University of Málaga in co-operation with the Dept. of System Engineering and Automation.
For information on earlier EANN conferences visit http://www.kingston.ac.uk/eann/EANN.htm EANN2003 Web Page: http://www.isa.uma.es/eann03 Abstract Submission Prospective authors are requested to send an extended abstract for review by the International Committee. All papers must be written in English, starting with a succinct statement of the problem and the application area, the results achieved, their significance and a comparison with previous work (if any). The following must also be included: Title of proposed paper, Author names, Affiliations, Addresses, Name of author to contact for correspondence, E-mail address and fax number of contact author, Topics which best describe the paper (max. 5 keywords). It is strongly recommended to submit extended abstracts by electronic mail to: eann03 at ctima.uma.es Tutorial Proposals EANN2003 is soliciting tutorial proposals, covering any area of neurocomputing from theory to implementation guidelines for the benefit of the participants. Location The conference will take place in a hotel of the Costa del Sol by the Mediterranean sea in southern Spain, 12 km west of Malaga. The hotel is located near the beach and yacht harbour "Puerto Marina" and less than 10 minutes' drive from Malaga international airport. The Costa del Sol is situated in the South of Spain, located in the far west of the Mediterranean. The area of coastline that is the Costa del Sol extends northwards from the Straits of Gibraltar, which is the door to the Mediterranean Sea from the Atlantic Ocean. Diary Dates Abstract Submission Deadline: 28 February 2003 Notification of Acceptance: 28 March 2003 Delivery of full papers: 2 May 2003 Proposals for Tutorials: 17 May 2003 Social Events A boat trip will be organised on Monday 8th and a gala dinner will be offered to participants on Tuesday 9th September.
Post-conference Publications A number of papers will be selected for inclusion (enhanced versions) in a Special Issue of a prestigious scientific journal (Neurocomputing, Neural Computing and Applications, etc.), as was done at earlier EANN conferences. Registration Conference Fee Industry and University Rate: GBP 300 Student Rate with proceedings: GBP 200 Student Rate without proceedings: GBP 150 The rates above entitle you to access to the conference sessions (3 days), a copy of the final programme and the proceedings (except option 3, see above), a list of conference participants, coffee/tea breaks (3 days), lunch (3 days), the gala dinner (1 day) and social events that are free of any additional charges. Contacts Local Committee Conference Secretariat J. Fernandez de Canete A. García-Cerezo A. Mandow I. García-Moral G. Joya J. Muñoz-Perez ------------------------------------------------------------------------- Organising Committee A. Osman (USA) R. Baratti (Italy) C. Kuroda (Japan) A. Ruano (Portugal) A.J. Owens (USA) D. Tsaptsinos (UK) ------------------------------------------------------------------------- International Committee to be extended A. Bulsari, Finland A. Iwata, Japan S. Michaelides, Cyprus F. Sandoval, Spain R. Saatchi, UK P. Zufiria, Spain S. Lecoeuche, France A. J. Owens, USA R. Parenti, Italy A. Servida, Italy F. García-Lagos, Spain A. Fanni, Italy N. Hazarika, UK S. Bitzer, Germany J. Ringwood, Ireland ------------------------------------------------------------------------- Prof. Javier Fernandez de Canete, Ph.D. Dpto. de Ingeniería de Sistemas y Automatica E.T.S.I.
Informatica Campus de Teatinos, 29071 Malaga (SPAIN) Phone: +34-95-2132887 FAX: +34-95-2133361 e-mail: canete at ctima.uma.es From t.windeatt at eim.surrey.ac.uk Thu Oct 24 07:23:50 2002 From: t.windeatt at eim.surrey.ac.uk (Terry Windeatt) Date: Thu, 24 Oct 2002 12:23:50 +0100 Subject: MCS 2003 CALL FOR PAPERS Message-ID: <20021024122350.A9617@ee.surrey.ac.uk> **Apologies for multiple copies** ****************************************** *****MCS 2003 Call for Papers***** ****************************************** *****Paper Submission: 10 January 2003***** *********************************************************************** FOURTH INTERNATIONAL WORKSHOP ON MULTIPLE CLASSIFIER SYSTEMS Guildford, Surrey, GU2 7XH, United Kingdom June 11-13 2003 Updated information: http://www.diee.unica.it/mcs E-mail: mcs2003 at eim.surrey.ac.uk *********************************************************************** WORKSHOP OBJECTIVES MCS 2003 is the fourth workshop of a series aimed at creating a common international forum for researchers of the diverse communities working in the field of Multiple Classifier Systems. Information on the previous editions of the MCS workshop can be found on www.diee.unica.it/mcs. Contributions from all the research communities working in the field are welcome in order to compare the different approaches and to define the common research priorities. Special attention is also devoted to assessing the applications of Multiple Classifier Systems. The workshop is an official event of the International Association for Pattern Recognition (IAPR-TC1). WORKSHOP CHAIRS Terry Windeatt (Univ. of Surrey, United Kingdom) Fabio Roli (Univ. of Cagliari, Italy) ORGANIZED BY Center for Vision, Speech and Signal Proc. of the University of Surrey Dept. of Electrical and Electronic Eng. of the University of Cagliari PAPER SUBMISSION Two hard copies of the full paper should be mailed to: MCS 2003 Dr. Terry Windeatt Dept. of Electrical and Electronic Eng.
University of Surrey Guildford, Surrey, GU2 7XH, United Kingdom. In addition, participants should submit an electronic version of the manuscript (PDF or PostScript format) to mcs2003 at eim.surrey.ac.uk. The papers should not exceed 10 pages (LNCS format, see http://www.springer.de/comp/lncs/authors.html). A cover sheet with the authors' names and affiliations is also requested, with the complete address of the corresponding author, and an abstract (200 words). Two members of the Scientific Committee will referee the papers. IMPORTANT NOTICE: Submission implies the willingness of at least one author to register, attend the workshop, and present the paper. Accepted papers will be published in the proceedings only if the registration form and payment for one of the authors has been received. WORKSHOP TOPICS Papers describing original work in the following and related research topics are welcome: Foundations of multiple classifier systems Methods for classifier fusion Design of multiple classifier systems Neural network ensembles Bagging and boosting Mixtures of experts New and related approaches Applications INVITED SPEAKERS Jerry Friedman (USA) Mohamed Kamel (Canada) SCIENTIFIC COMMITTEE J. A. Benediktsson (Iceland) H. Bunke (Switzerland) L. P. Cordella (Italy) B. V. Dasarathy (USA) R. P.W. Duin (The Netherlands) C. Furlanello (Italy) J. Ghosh (USA) T. K. Ho (USA) S. Impedovo (Italy) N. Intrator (Israel) A.K. Jain (USA) M. Kamel (Canada) J. Kittler (UK) L.I. Kuncheva (UK) L. Lam (Hong Kong) D. Landgrebe (USA) D-S. Lee (USA) D. Partridge (UK) A.J.C. Sharkey (UK) K. Tumer (USA) G. Vernazza (Italy) IMPORTANT DATES January 10, 2003: Paper Submission February 20, 2003: Notification of Acceptance April 1, 2003: Camera-ready Manuscript April 10, 2003: Registration WORKSHOP PROCEEDINGS Accepted papers will appear in the workshop proceedings that will be published in the series Lecture Notes in Computer Science by Springer-Verlag.
Word processing templates are available (www.springer.de/comp/lncs/authors.html). Furthermore, extended and revised versions of selected papers will be considered for possible publication in a special journal issue. Selected papers from previous editions of the MCS workshop have been published in Pattern Analysis and Applications, Information Fusion and International Journal of Pattern Recognition and Artificial Intelligence. From djin at MIT.EDU Sat Oct 26 22:23:38 2002 From: djin at MIT.EDU (Dezhe Jin) Date: Sat, 26 Oct 2002 22:23:38 -0400 (EDT) Subject: papers available Message-ID: Dear Connectionists, The following two papers may be of interest to some of you. They can be downloaded at http://hebb.mit.edu/~djin/index.html. Thanks! -Dezhe Jin 1. Fast Convergence of Spike Sequences to Periodic Patterns in Recurrent Networks, Dezhe Z. Jin, Physical Review Letters, 89, 208102 (2002). Abstract: The dynamical attractors are thought to underlie many biological functions of recurrent neural networks. Here we show that stable periodic spike sequences with precise timings are the attractors of the spiking dynamics of recurrent neural networks with global inhibition. Almost all spike sequences converge within a finite number of transient spikes to these attractors. The convergence is fast, especially when the global inhibition is strong. These results support the possibility that precise spatiotemporal sequences of spikes are useful for information encoding and processing in biological neural networks. 2. Fast computation with spikes in a recurrent neural network, Dezhe Z. Jin and H. Sebastian Seung, Physical Review E, 65, 051922 (2002). Abstract: Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. 
Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self-excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M > 1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner. ============================================================================= Dezhe Z. Jin, Ph.D. Postdoctoral Fellow Seung Lab, Dept. of Brain and Cognitive Sciences, M.I.T. ============================================================================= From CL243 at cornell.edu Mon Oct 28 05:22:50 2002 From: CL243 at cornell.edu (Christiane Linster) Date: Mon, 28 Oct 2002 05:22:50 -0500 Subject: Computational Neuroscience meeting Message-ID: <003701c27e6b$f7ed7510$8201a8c0@cpl.cornell.edu> Dear colleagues At this summer's annual Computational Neuroscience (CNS) meeting held in Chicago, the decision was made to form a CNS organization to administer future meetings. The purpose of this e-mail is to solicit nominations for a 15-member board of directors for the organization. Overall policy for the organization will be set by the board and a 4-member executive committee.
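Returning to the Jin and Seung abstract above: the special case it mentions, where the computation finishes at the network's first spike, can be illustrated with a minimal simulation. This is only a sketch, not the paper's exact model; the threshold, inhibition strength, input range and leak-free integration are all simplifying assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 10
THETA = 1.0    # spiking threshold (illustrative value)
W_INH = 2.0    # all-to-all inhibition, assumed strong enough to silence losers
DT = 1e-3

inputs = rng.uniform(0.5, 1.5, size=N)  # constant external driving inputs
V = np.zeros(N)                         # identical initial states (rest)

winner = None
for step in range(1_000_000):
    V += inputs * DT                    # leak-free integration, for simplicity
    crossed = np.flatnonzero(V >= THETA)
    if crossed.size:
        # If several neurons cross within one time step, the most depolarized
        # one (i.e. the one with the largest input) spikes first in
        # continuous time.
        winner = int(crossed[np.argmax(V[crossed])])
        V -= W_INH                      # instantaneous inhibition of the rest
        V[winner] = 0.0                 # winner resets; with self-excitation
                                        # it stays the only neuron able to spike
        break

print("winner:", winner, "neuron with maximal input:", int(np.argmax(inputs)))
```

From identical initial states the neuron with the largest drive reaches threshold first, so the computation is decided at the network's first spike; with heterogeneous initial states, any of the top-M inputs could win, as the abstract notes.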
Board members will serve 5-year terms and may be re-elected. Executive officers serve an initial 2-year term, renewable annually by a simple majority vote of the board. After 5 years, a 2/3 majority will be needed to continue. The scientific program for each meeting will be set by a separate program committee. For details, the current draft of the bylaws (to be confirmed by the initial board) is available for viewing at www.nbb.cornell.edu/neurobio/linster/cns/cns.htm. Nominations are open to all members of the computational neuroscience community (including students and postdocs). All nominations should be accompanied by a brief (1-3 paragraph) statement outlining the candidate's past affiliation with the meeting and/or vision for the future direction of the meeting. Self-nominations are accepted. Please send nominations by e-mail to Christiane Linster at CL243 at cornell.edu by November 15th, 2002. After receiving a complete list of nominees, board members will be elected by the current program committee (held over from the previous meeting structure) and the executive committee. The initial term of service for elected board members will be set at 1-5 years by lottery. This will ensure that in the future roughly equal numbers of board positions will open up for annual election. The main considerations for selecting board members will be diversity and continuity. It is expected that board members will have attended at least one CNS meeting in the past and will have some familiarity with the scope and tone of previous meetings. We anticipate that a number of changes will come up for discussion in the next few years, but feel that it is important to retain continuity until the new governing structure is established and working smoothly. That said, we are quite open to new ideas and obtaining new contributing voices will be an important consideration in forming a diverse board.
Other forms of diversity that will be considered include: level of professional advancement (faculty, postdoc, student), "home" discipline (e.g. math, biology, physics, engineering), level of analysis (e.g. cellular, network, systems), geographic location, and gender. Finally, we encourage any and all opinions regarding the future direction of the CNS meeting and the role that it should play in the field of computational neuroscience. Now is a particularly important time to voice your opinion since the direction of the meeting over the next 5-10 years will likely be formed within the next several years (CL243 at cornell.edu). We hope to keep CNS as a lively, inclusive meeting while striving for continued improvement in scientific quality. Sincerely, The CNS executive committee: Christiane Linster - President Todd Troyer - Vice President Eric DeSchutter - Secretary Linda Larson-Prior - Treasurer **************************************************** Christiane Linster Dept. of Neurobiology and Behavior Tel: (607) 2544331 Cornell University Fax: (607) 254 4308 W249 Seeley G. Mudd Hall, Ithaca, NY 14853 cl243 at cornell.edu http://www.nbb.cornell.edu/neurobio/linster *************************************************** From bert at snn.kun.nl Mon Oct 28 08:37:04 2002 From: bert at snn.kun.nl (Bert Kappen) Date: Mon, 28 Oct 2002 14:37:04 +0100 (MET) Subject: Paper available: On the storage capacity of attractor neural networks with depressing synapses Message-ID: Dear all, I would like to announce the following paper, which has been accepted for Physical Review E: On the storage capacity of attractor neural networks with depressing synapses J.J. Torres, L. Pantic and H.J. Kappen We compute the capacity of a binary neural network with dynamic depressing synapses to store and retrieve an infinite number of patterns. We use a biologically motivated model of synaptic depression and a standard mean-field approach.
We find that at $T=0$ the critical storage capacity decreases with the degree of the depression. We confirm the validity of our main mean-field results with numerical simulations. In short: dynamic synapses drastically reduce the storage capacity of attractor neural networks. ftp://ftp.mbfys.kun.nl/pub/snn/pub/reports/TPK_PRE2002.ps Bert Kappen SNN University of Nijmegen tel: +31 24 3614241 fax: +31 24 3541435 URL: www.snn.kun.nl/~bert From pavel at PH.TN.TUDelft.NL Tue Oct 29 03:23:13 2002 From: pavel at PH.TN.TUDelft.NL (Pavel Paclik) Date: 29 Oct 2002 09:23:13 +0100 Subject: PhD Position in Pattern Recognition Message-ID: Ph.D. Position Available - Deadline November 15, 2002 ----------------------------------------------------- The Delft Pattern Recognition Group, The Netherlands, is seeking an outstanding Ph.D. student with a background in engineering, physics or computer science, having good computer programming skills, an interest in pattern recognition or artificial intelligence and preferably experienced in spectral data analysis. The Ph.D. student will participate in an applied project on hyperspectral image analysis funded by the Technology Foundation STW. In the project, industrial applications of hyperspectral imaging are studied with an emphasis on the integration of the spectral and spatial information. Automatic learning procedures based on unsupervised and partially supervised techniques will be basic components to be studied, used and developed further in this project. The research will be done in cooperation with, among others, Unilever Research Vlaardingen, AgriVision, Wageningen and the Institute of Applied Physics in Delft. The Ph.D. position is granted for 4 years. The nomination is formally for 2 years and is expected to be extended for another 2 years. The project is integrated in the research on statistical pattern recognition fundamentals and applications of the Delft Pattern Recognition Group.
This research is done by 7 researchers, while the entire group consists of about 35 researchers. The Ph.D. student will be a member of ASCI, the Advanced School for Computers and Imaging, a nationwide Ph.D. school. For information on the project see: http://www.ph.tn.tudelft.nl/~pavel/spectra/index.html It may also be obtained by email from Pavel Paclik, pavel at ph.tn.tudelft.nl, the primary researcher in the project. For information on the research on statistical pattern recognition see: http://www.ph.tn.tudelft.nl/Research/neural/index.html For information on the Delft Pattern Recognition Group see: http://www.ph.tn.tudelft.nl Applications should be sent before November 15, 2002 to the project leader, Robert P.W. Duin: duin at ph.tn.tudelft.nl. From wahba at stat.wisc.edu Tue Oct 29 21:40:01 2002 From: wahba at stat.wisc.edu (Grace Wahba) Date: Tue, 29 Oct 2002 20:40:01 -0600 (CST) Subject: MSVM, Poly.Penalized.Likelihood, Nonparametric.LASSO-variable selector Message-ID: <200210300240.UAA29879@spline.stat.wisc.edu> Announcing papers recently available via http://www.stat.wisc.edu/~wahba (click on TRLIST). 1, 2 and 3 below present the Multicategory Support Vector Machine (MSVM), a generalization of the SVM which classifies to one of k categories via a single optimization problem. 4 contrasts the MSVM with the Polychotomous Penalized Likelihood estimate, which estimates k probabilities, one for each category. 5 and 6 present a new nonparametric variable selection and model building via likelihood basis pursuit and a generalization of the LASSO. 1 Lee, Y., Lin, Y. and Wahba, G. " Multicategory Support Vector Machines, Theory, and Application to the Classification of Microarray Data and Satellite Radiance Data " TR 1064, September 2002. 2 Lee, Y. " Multicategory Support Vector Machines, Theory, and Application to the Classification of Microarray Data and Satellite Radiance Data " TR 1063, September 2002. Ph.D. Thesis. 3 Lee, Y. and Lee, C.-K.
Classification of Multiple Cancer Types by Multicategory Support Vector Machines Using Gene Expression Data. (ps) TR 1051, April 2002, minor revisions July 2002. 4 Wahba, G. " Soft and Hard Classification by Reproducing Kernel Hilbert Space Methods " (ps) (pdf) TR 1067, October 2002. To appear Proceedings of the National Academy of Sciences. 5 Zhang, H. " Nonparametric Variable Selection and Model Building Via Likelihood Basis Pursuit " TR 1066, September 2002. PhD. Thesis. 6 Zhang, H., Wahba, G., Lin, Y., Voelker, M., Ferris, M., Klein, R. and Klein, B. Variable Selection and Model Building via Likelihood Basis Pursuit (ps) (pdf) TR 1059, July, 2002. From oreilly at grey.colorado.edu Wed Oct 30 16:05:07 2002 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Wed, 30 Oct 2002 14:05:07 -0700 Subject: ICS Director Position: CU Boulder Message-ID: <200210302105.g9UL57m21522@grey.colorado.edu> People with a computational approach would be strongly considered for this position -- if you have any questions you can also contact me or Mike Mozer (mozer at cs.colorado.edu) or Yuko Munakata (munakata at psych.colorado.edu). - Randy DIRECTOR OF THE INSTITUTE OF COGNITIVE SCIENCE The Institute of Cognitive Science at the University of Colorado, Boulder, is looking for a cognitive psychologist to become its Director. The director should be a person with a distinguished academic career and some experience and interest in the duties of an executive position in academic administration. Exceptional individuals in any of the cognitive sciences other than cognitive psychology will also be considered. The Director is expected to create and promulgate the vision for the Institute's future. This requires leadership skills and active interfacing with members of the supporting departments, CU administration, and various national and international agencies and businesses whose agendas are relevant to ICS research and academic program goals. 
The core departments that make up the Institute of Cognitive Science include Psychology, Computer Science, Education, Linguistics, Philosophy, and Speech/Language & Hearing Sciences. The Director must be eligible for tenure as Full Professor in one of these home departments. The expected balance of duties is 40% service, 40% research, and 20% teaching. The position start date is August 2003. The desired application deadline is December 31, 2002, but we will accept applications until the position is filled. Contact Dr. Donna Caccamise, Associate Director, Institute of Cognitive Science, University of Colorado, Boulder CO 80309-0344. The University of Colorado at Boulder is committed to diversity and equality in education and employment. From christof at teuscher.ch Wed Oct 30 07:20:48 2002 From: christof at teuscher.ch (Christof Teuscher) Date: Wed, 30 Oct 2002 13:20:48 +0100 Subject: [IPCAT2003] - Second Call for Papers Message-ID: <3DBFCEA0.3054293A@teuscher.ch> ================================================================ We apologize if you receive multiple copies of this email. Please distribute this announcement to all interested parties.
For removal, go to http://lslwww.epfl.ch/ipcat2003/del.html
================================================================

****************************************************************
SECOND CALL FOR PAPERS
****************************************************************

** IPCAT2003 **
Fifth International Workshop on Information Processing in Cells and Tissues
September 8 - 11, 2003
Swiss Federal Institute of Technology Lausanne (EPFL)
Lausanne, Switzerland
http://lslwww.epfl.ch/ipcat2003
****************************************************************

Important Dates:
----------------
Paper submission:           February 28, 2003
Notification of acceptance: May 28, 2003
Camera-ready copy:          July 11, 2003

Description:
------------
The aim of the IPCAT series of workshops is to bring together a multidisciplinary core of scientists working in the general area of modeling information processing in biosystems. A general theme is the nature of biological information and the ways in which it is processed in biological and artificial cells and tissues. The key motivation is to provide a common ground for dialogue and interaction, without emphasis on any particular research constituency, way of modeling, or single issue in the relationship between biology and information. IPCAT2003 will highlight recent research and seek to further the dialogue, exchange of ideas, and development of interactive viewpoints between biologists, physicists, computer scientists, technologists, and mathematicians that have progressively expanded throughout the IPCAT series of meetings (since 1995). The workshop will feature sessions of selected original research papers grouped around emergent themes of common interest, and a number of discussions and talks focusing on wider themes. IPCAT2003 will give particular attention to morphogenetic and ontogenetic processes and systems.
IPCAT2003 encourages experimental, computational, and theoretical articles that link biology and the information processing sciences and that encompass the fundamental nature of biological information processing, the computational modeling of complex biological systems, evolutionary models of computation, the application of biological principles to the design of novel computing systems, and the use of biomolecular materials to synthesize artificial systems that capture essential principles of natural biological information processing. Accepted papers will be published in a special issue of the BioSystems journal (Elsevier Science).

Topics of Interest:
-------------------
Topics to be covered include, but are not limited to, the following:

o Self-organizing, self-repairing, and self-replicating systems
o Evolutionary algorithms
o Machine learning
o Evolving, adapting, and neural hardware
o Automata and cellular automata
o Information processing in neural and non-neural biosystems
o Parallel distributed processing biosystem models
o Information processing in bio-developmental systems
o Novel bio-information processing systems
o Autonomous and evolutionary robotics
o Bionics, neural implants, and bio-robotics
o Molecular evolution and theoretical biology
o Enzyme and gene networks
o Modeling of metabolic pathways and responses
o Simulation of genetic and ecological systems
o Single neuron and sub-neuron information processing
o Microelectronic simulation of bio-information systemics
o Artificial bio-sensor and vision implementations
o Artificial tissue and organ implementations
o Applications of nanotechnology
o Quantum informational biology
o Quantum computation in cells and tissues
o DNA computing

Special Session:
----------------
Morphomechanics of the Embryo and Genome + Artificial Life -> Embryonics

Artificial intelligence started with imitation of the adult brain, and artificial life has dealt mostly with the adult organism and its evolution, in that the
span from genome to organism has been short or nonexistent. Embryonics is the attempt to grow artificial life in a way analogous to real embryonic development. This session will include speakers grappling with both ends of the problem. Papers for this special session should be submitted through the regular procedure.

Organizers: R. Gordon, Lev V. Beloussov

For up-to-date information, consult the IPCAT2003 web site: http://lslwww.epfl.ch/ipcat2003

We are looking forward to seeing you in beautiful Lausanne!

Sincerely,
Christof Teuscher
IPCAT2003 Program Chair

----------------------------------------------------------------
Christof Teuscher
Swiss Federal Institute of Technology Lausanne (EPFL)
christof at teuscher.ch
http://www.teuscher.ch/christof
----------------------------------------------------------------
IPCAT2003: http://lslwww.epfl.ch/ipcat2003
----------------------------------------------------------------

From ahirose at eis.t.u-tokyo.ac.jp Thu Oct 31 01:35:26 2002
From: ahirose at eis.t.u-tokyo.ac.jp (Akira Hirose)
Date: Thu, 31 Oct 2002 15:35:26 +0900 (JST)
Subject: Complex-valued Neural Networks: Special Invited Session in KES2003
Message-ID: <200210310635.g9V6ZQwH007694@pekoe.eis.t.u-tokyo.ac.jp>

Special Invited Session "Complex-Valued Neural Networks"
KES 2003 (7th International Conference on Knowledge-Based Intelligent Information & Engineering Systems)
3-5 September 2003, St Anne's College, University of Oxford, U.K.
Conference Web Site: http://www.bton.ac.uk/kes/kes2003/

Topics and Objectives of the Special Invited Session:
In recent years, complex-valued neural networks have been expanding their application fields in optoelectronic imaging, remote sensing, quantum neural devices and systems, and the spatiotemporal analysis of physiological neural systems, as well as artificial neural information processing. At the same time, this potentially wide applicability calls for new theories that enable novel and more effective functions and mechanisms.
This session aims to discuss the latest progress in the field and to explore its future prospects.

Instructions for Authors:
Please find them at the conference web site above.

Publication:
The conference proceedings will be published by a major publisher, for example IOS Press of Amsterdam. Extended versions of selected papers will also be considered for publication in the International Journal of Knowledge-Based Intelligent Engineering Systems, http://www.bton.ac.uk/kes/journal/

Important Dates (tentative):
First of all, please contact the Session Chair below.
Deadline for submission intention:               December 1, 2002
Deadline for receipt of papers by Session Chair: February 1, 2003
Notification of acceptance:                      March 1, 2003
Camera-ready papers to session chair by:         April 1, 2003

Contact:
Session Chair: Akira Hirose, Assoc. Prof.
Department of Electrical and Electronic Engineering
The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 153-8656, Japan
Email: ahirose at ee.t.u-tokyo.ac.jp
http://www.eis.t.u-tokyo.ac.jp/

*Note* Those who are interested in submitting are requested to contact the Session Chair by ca. 1 December 2002.

From dwang at cis.ohio-state.edu Wed Oct 30 16:10:22 2002
From: dwang at cis.ohio-state.edu (DeLiang Wang)
Date: Wed, 30 Oct 2002 17:10:22 -0400
Subject: IEEE TNN Special Issue on temporal coding
Message-ID: <3DC03C8C.B2A5BA8E@cis.ohio-state.edu>

IEEE Transactions on Neural Networks
Call for Papers
Special Issue on "Temporal Coding for Neural Information Processing"

Largely motivated by neurobiological discoveries, neural network research is currently witnessing a significant shift of emphasis towards temporal coding, which uses time as an extra degree of freedom in neural representations. Temporal coding is passionately debated in neuroscience and related fields, but in the last few years a large volume of physiological and behavioral data has emerged that supports a key role for temporal coding in the brain.
In neural networks, a great deal of research is undertaken under the topics of nonlinear dynamics, oscillatory and chaotic networks, spiking neurons, and pulse-coupled networks. Various information processing tasks have been investigated using temporal coding, including scene segmentation, figure-ground separation, classification, learning, associative memory, inference, motor control, and communication. Progress has been made that substantially advances the state of the art of neural computing. In many instances, however, neural models incorporating temporal coding are driven merely by the assertion that real neurons use impulses. It is often unclear whether, and to what extent, the temporal aspects of the models contribute to their information processing capabilities. It is time to assess the role and potential of temporal coding in terms of information processing performance by providing a comprehensive view of current approaches and issues to the neural networks community.

This special issue seeks to present, in a collective way, research that makes a clear contribution to addressing information processing tasks using temporal coding. The issue is intended not only to highlight successful uses of temporal coding in neural networks but also to clarify outstanding issues for future progress.
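The rate-coding versus temporal-coding contrast at the heart of this call can be made concrete with a minimal sketch (an editorial illustration, not part of the call itself; the spike trains and the crude `synchrony` measure below are invented for the example): two spike trains may carry identical firing rates yet differ entirely in the timing information they convey.

```python
# Minimal sketch: rate coding vs. temporal coding.
# Two spike trains with the same firing rate can still differ in the
# timing information they carry (here, phase locking to a reference train).

def firing_rate(spike_times, duration):
    """Mean firing rate: spikes per unit time (a rate code)."""
    return len(spike_times) / duration

def synchrony(train, reference, window=1.0):
    """Fraction of spikes in `train` that fall within +/- `window`
    of some spike in `reference` (a crude temporal-code measure)."""
    if not train:
        return 0.0
    locked = sum(1 for t in train if any(abs(t - r) <= window for r in reference))
    return locked / len(train)

reference    = [10, 20, 30, 40, 50]            # reference spike times (ms)
locked_train = [10.2, 19.8, 30.1, 40.3, 49.9]  # phase-locked to the reference
random_train = [3, 17, 26, 44, 58]             # same rate, no locking

# Indistinguishable under a pure rate code ...
assert firing_rate(locked_train, 60) == firing_rate(random_train, 60)

# ... but clearly separated by the temporal measure.
print(synchrony(locked_train, reference))   # -> 1.0
print(synchrony(random_train, reference))   # -> 0.0
```

The point of the sketch is only that "same rate" does not imply "same code", which is why the special issue asks whether the temporal aspects of a model actually contribute to its information processing capabilities.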
Suggested topics include, but are not limited to, the following:

- Synchrony, desynchrony, and other temporal phenomena
- Encoding and decoding in the temporal domain
- Comparative issues in rate coding and temporal coding
- Cognitive aspects of temporal/spatiotemporal phenomena in neural systems
- Effects and uses of time delays
- Potential roles of chaos, randomness and noise
- Learning for temporal codes
- Temporal/spatiotemporal information processing for:
  * Perceptual processing
  * Learning, memory, and reasoning
  * Motor control
  * Communication
- Innovative applications
- Hardware implementation

The guest editors of the special issue are:
Walter Freeman, University of California, Berkeley
Robert Kozma, University of Memphis
Andrzej Lozowski, Southern Illinois University
Ali Minai, University of Cincinnati
DeLiang Wang, Ohio State University

Manuscripts will first be screened for topical relevance, and those that pass the screening will undergo the standard review procedure of the IEEE Transactions on Neural Networks (see the Transactions' instructions for authors). The paper submission deadline is May 30, 2003, and the special issue will be published by July 2004.
Papers should be submitted in PDF format via email to the lead guest editor:
DeLiang Wang
Email: dwang at cis.ohio-state.edu
http://www.cis.ohio-state.edu/~dwang

From butz at illigal.ge.uiuc.edu Thu Oct 31 14:38:34 2002
From: butz at illigal.ge.uiuc.edu (Martin Butz)
Date: Thu, 31 Oct 2002 13:38:34 -0600 (CST)
Subject: Call for contributions - Anticipatory Behavior in Adaptive Learning Systems
Message-ID:

(We apologize for multiple copies)

###############################################################################
C A L L   F O R   C O N T R I B U T I O N S
ABiALS 2002 Post Proceedings Book:
"Anticipatory Behavior in Adaptive Learning Systems: Foundations, Theories, and Systems"
to appear in Springer's Lecture Notes in Artificial Intelligence
###############################################################################

The first workshop on Anticipatory Behavior in Adaptive Learning Systems (ABiALS 2002) was held on August 11, 2002 in Edinburgh, Scotland
http://www-illigal.ge.uiuc.edu/ABiALS
in association with the Seventh International Conference on Simulation of Adaptive Behavior (SAB'02)
http://www.isab.org.uk/sab02/

This upcoming volume addresses the question of when, where, and how anticipations are useful in adaptive systems. Anticipations refer to the influence of future predictions or future expectations on behavior and learning. ABiALS 2002 was a first interdisciplinary gathering of people interested in how anticipations can be used efficiently to improve behavior and learning. Four fundamentally different types of systems were distinguished:

(1) Implicitly anticipatory systems are those that act and learn in an intelligent way but do not include any predictive bias in the applied learning and/or behavioral mechanisms.
(2) Payoff anticipatory systems are those that compare payoff predictions when making action decisions but do not use any state predictions.
(3) Sensory anticipatory systems are systems that use sensory predictions to improve perceptual processing (e.g. preparatory attention).
(4) State anticipatory systems are systems that form explicit future predictions/expectations that influence action decisions and learning.

The book "Anticipatory Behavior in Adaptive Learning Systems" will address the latter two of the four types of systems. Submissions are welcome that are concerned with any type of sensory anticipatory or state anticipatory system.

___________________________________________________________________________
Aim and Objectives:

Most research in artificial adaptive behavior over recent years that is concerned with model learning and anticipatory behavior has focused on the model-learning side. Research is particularly engaged in online generalized model learning. Until now, though, exploitation of the model has been done mainly to show that exploitation is possible or that an appropriate model exists in the first place. Only very few applications are available that show the utility of the model for the simulation of anticipatory behavior.

The aim of this book is to lay out the foundations for a study of anticipatory learning and behavior. The content will be divided roughly into three chapters. The first chapter will provide psychological background that not only supports the presence of anticipatory mechanisms in "higher" animals and humans but also sheds light on when and why anticipatory mechanisms can be useful. Chapter 2 will provide foundations and frameworks for the study of anticipatory mechanisms, distinguishing fundamentally different mechanisms. Finally, Chapter 3 will contain examples of implemented frameworks and systems.

___________________________________________________________________________
Essential questions:

* How can anticipations influence the adaptive behavior of an artificial learning system?
* How can anticipatory adaptive behavior be implemented in an artificial learning system?
* How does an incomplete model influence anticipatory behavior?
* How can anticipations guide further model learning?
* How can anticipations control attention?
* Can anticipations be used for the detection of special environmental properties?
* What are the benefits of anticipations for adaptive behavior?
* What is the trade-off between simple bottom-up, stimulus-response driven behavior and more top-down, anticipation-driven behavior?
* In what respect does anticipation mediate between low-level environmental processing and more complex cognitive simulation?
* What role do anticipations play in the implementation of motivations and emotions?

___________________________________________________________________________
Submission:

Submissions for the book should address one or more of the above questions or provide appropriate psychological background on anticipatory mechanisms in animals and humans. Papers with inappropriate content will be rejected. The book is intended to be interdisciplinary and open to all approaches to anticipatory behavior. There is no restriction on the type of anticipatory learning system or on the representation of predictions for anticipatory behavior and learning.

Papers will be reviewed for acceptance by the program committee and the organizers. Papers should be submitted electronically to one of the organizers via email in PDF or PS format. Electronic submission is strongly encouraged; if you cannot submit your contribution electronically, please contact one of the organizers. Submitted papers should be between 10 and 20 pages in 10pt, one-column format. Please use the LNCS style available at http://www.springer.de/comp/lncs/authors.html.

Submission deadline is DECEMBER 20, 2002.
For more information please refer to the workshop page:
http://www-illigal.ge.uiuc.edu/ABiALS/

Please also see our introductory talk to the workshop for more detailed information on anticipations and the different types of anticipatory behavior:
http://www-illigal.ge.uiuc.edu/ABiALS/ABiALS2002Introduction.htm

There is also an introductory paper available that provides further general information on the topic:
http://www-illigal.ge.uiuc.edu/ABiALS/Papers/ABiALS2002Intro.pdf

___________________________________________________________________________
Important Dates:

December 20, 2002: Deadline for submissions
January 24, 2003:  Notification of acceptance
February 21, 2003: Camera-ready version for the LNAI volume

___________________________________________________________________________
Program Committee:

Emmanuel Daucé, Faculté des sciences du sport, Université de la Méditerranée, Marseille, France
Ralf Moeller, Cognitive Robotics, Max Planck Institute for Psychological Research, Munich, Germany
Wolfgang Stolzmann, DaimlerChrysler AG, Berlin, Germany
Jun Tani, Lab. for Behavior and Dynamic Cognition, Brain Science Institute, RIKEN, 2-1 Hirosawa, Wako-shi, Saitama, 351-0198 Japan
Stewart W. Wilson, President, Prediction Dynamics, USA

___________________________________________________________________________
Organizers:

Martin V. Butz, Illinois Genetic Algorithms Laboratory (IlliGAL), University of Illinois at Urbana-Champaign, Illinois, USA
(also: Department of Cognitive Psychology, University of Wuerzburg, Germany)
butz at illigal.ge.uiuc.edu
http://www-illigal.ge.uiuc.edu/~butz

Pierre Gérard, AnimatLab, University Paris VI, Paris, France
pierre.gerard at lip6.fr
http://animatlab.lip6.fr/Gerard

Olivier Sigaud, AnimatLab, University Paris VI, Paris, France
olivier.sigaud at lip6.fr
http://animatlab.lip6.fr/Sigaud