From amari at brain.riken.go.jp Thu Jun 1 04:54:58 2000
From: amari at brain.riken.go.jp (Shun-ichi Amari)
Date: Thu, 1 Jun 2000 17:54:58 +0900
Subject: No subject
Message-ID:

Dear connectionists:

It is my pleasure to announce the following symposium, focusing on 1) graphical methods and statistics, 2) combining learners, 3) VC dimension and support vector machines, and 4) information geometry and statistical physical methods.

Shun-ichi Amari
Vice Director, RIKEN Brain Science Institute
Laboratory for Mathematical Neuroscience
Research Group on Brain-Style Information Systems
tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687
amari at brain.riken.go.jp
http://www.bsis.brain.riken.go.jp/

**********************************************

Bernoulli-RIKEN BSI 2000 Symposium on Neural Networks and Learning

Dates and Venues:
Symposium: October 25-27, 2000, Ohkouchi Hall, RIKEN (The Institute of Physical and Chemical Research), Japan
Satellite Workshop: October 28, 2000, The Institute of Statistical Mathematics, Japan

Aim:
To celebrate Mathematical Year 2000, the Bernoulli Society is organizing a number of symposia in rapidly developing research areas in which probability theory and statistics can play important roles. Brain science in the broad sense will become still more important in the 21st century. Information processing in the brain is so flexible, and its learning ability so strong, that elucidating its mechanisms is a genuine challenge for information science. Constructing brain-style information processing systems is an equally large challenge. The present Symposium focuses on the learning ability of real and artificial neural networks and related systems, from both theoretical and practical points of view. Probability theory and statistics will play fundamental roles in elucidating these systems, and such systems in turn fortify stochastic and statistical methods.
Theories of neural learning and pattern recognition have a long history, and many new ideas are currently emerging. They also have practical applicability to real-world problems. Now is a good time to review these new ideas and methods and to discuss future directions of development. We will invite top-class researchers in these fields from around the world and discuss the state of the art of neural networks and learning as well as future directions of this important area.

Participants are by invitation only. We expect 50-80 participants from all over the world. After the symposium, we will organize a more informal one-day workshop: "Towards new unification of statistics and neural networks learning".

Detailed information on timetables and abstracts can be obtained at http://www.bsis.brain.riken.go.jp/Bernoulli

Those interested in joining the Symposium and Workshop may request an invitation through the above web site after June 15, when it will be ready. If you have any questions, contact the organizing committee at bernoulli2000 at bsis.brain.riken.go.jp

*******************

Sponsors:
The Bernoulli Society for Mathematical Statistics and Probability
RIKEN Brain Science Institute
The Institute of Statistical Mathematics
Japanese Neural Networks Society

In Cooperation with: Japanese Statistical Society

Supported by:
The Commemorative Association for the Japan World Exposition (1970)
The Support Center for Advanced Telecommunications Research Technology (SCAT)

Organizing Committee:
Chair: Shun-ichi Amari, RIKEN Brain Science Institute, Japan
Leo Breiman, University of California, Berkeley, USA
Shinto Eguchi, The Institute of Statistical Mathematics, Japan
Michael Jordan, University of California, Berkeley, USA
Noboru Murata, Waseda University, Japan
Mike Titterington, University of Glasgow, UK
Vladimir Vapnik, AT&T, USA

A registration fee of 10,000 Japanese yen (roughly 100 US$), which includes the reception, is payable at the conference venue.
There is a 50% student discount.

*****************

Program:

1. Graphical Models and Statistical Methods:
Steffen L. Lauritzen (Aalborg University): Graphical models for learning
Thomas S. Richardson (University of Warwick): Ancestral graph Markov models: an alternative to models with latent or selection variables
Lawrence Saul (AT&T Labs): Hidden variables and distributed representations in automatic speech recognition
Martin Tanner (Northwestern University): Inference for and Applications of Hierarchical Mixtures-of-Experts

2. Combining Learners:
Leo Breiman (University of California, Berkeley): Random Forests
Jerome H. Friedman (Stanford University): Gradient boosting and multiple additive regression trees
Peter Bartlett (Australian National University): Large Margin Classifiers and Risk Minimization
Yoram Singer (The Hebrew University): Combining Learners: an Output Coding Perspective

3. Information Geometry and Statistical Physics:
Shinto Eguchi (The Institute of Statistical Mathematics): Information geometry of tubular neighbourhoods for a statistical model
Shun-ichi Amari (RIKEN Brain Science Institute): Information geometry of neural networks
Manfred Opper (Aston University): The TAP Mean Field approach for probabilistic models
Magnus Rattray (University of Manchester): Modelling the learning dynamics of latent variable models

4. VC Dimension and SVM:
Vladimir Vapnik (AT&T Labs): Statistical learning theory and support vector machines
Michael Kearns (AT&T Labs): Sparse Sampling Algorithms for Probabilistic Artificial Intelligence
Gabor Lugosi (Pompeu Fabra University): Model selection, error estimation, and concentration
Bernhard Schoelkopf (Microsoft Research Ltd.): SV Algorithms and Applications

From t.c.pearce at leicester.ac.uk Thu Jun 1 07:00:38 2000
From: t.c.pearce at leicester.ac.uk (Tim Pearce)
Date: Thu, 1 Jun 2000 12:00:38 +0100
Subject: Ph.D Studentship in Neuronal Modelling
Message-ID:

Ph.D.
Studentship: NEURONAL MODELLING IN FIELD PROGRAMMABLE GATE ARRAYS (FPGAs)
(Ph.D. Position, University of Leicester/ETH Switzerland; UK/EU Nationals)

An opportunity exists for an EPSRC Ph.D. studentship to develop novel neuronal models that may be implemented in both software and FPGAs. Biologically realistic models of spiking neurons are a relatively new research topic and require significant computational power to simulate. This project investigates reduced-complexity models of biological neurons that may be implemented digitally, and that function in parallel, using standalone FPGA devices. Research will then focus on combining large numbers of these models on a single device for real-time olfactory (smell) sensing on mobile, behaving robots. This portion of the project will be conducted in close collaboration with the Institute of Neuroinformatics, ETH Zürich, Switzerland, which the student will visit during the final year.

While a biological background is not necessary, a good degree in a numerate discipline such as maths, engineering, physics, or computer science is required. The successful candidate will be expected to register for the degree of Ph.D. in the Control and Instrumentation Research Group at the Leicester University Engineering Dept.

Applicants should send a CV and the names and addresses of two referees by regular mail to:

Dr. Tim Pearce
Dept. of Engineering, University of Leicester
University Road, LEICESTER LE1 7RH, U.K.

Informal enquiries (phone or e-mail) are also welcome.
Tel +44 (0)116 223 1290; Fax +44 (0)116 252 2619
e-mail: t.c.pearce at le.ac.uk

-- T.C.
Pearce, PhD
Lecturer in Bioengineering
Department of Engineering, University of Leicester
Leicester LE1 7RH, United Kingdom
Bioengineering, Transducers and Signal Processing Group
URL: http://www.leicester.ac.uk/engineering/
E-mail: t.c.pearce at leicester.ac.uk
Tel: +44 (0)116 223 1290
Fax: +44 (0)116 252 2619

From morten at compute.it.siu.edu Thu Jun 1 12:27:04 2000
From: morten at compute.it.siu.edu (Morten H. Christiansen)
Date: Thu, 1 Jun 2000 11:27:04 -0500 (CDT)
Subject: Two Graduate Openings in Brain and Cognitive Sciences
Message-ID:

Dear Colleague,

Please bring the following information to the attention of potential graduate school applicants from your program with an interest in Brain and Cognitive Sciences.

TWO GRADUATE OPENINGS IN BRAIN AND COGNITIVE SCIENCES IN THE DEPARTMENT OF PSYCHOLOGY AT SOUTHERN ILLINOIS UNIVERSITY, CARBONDALE.

Dr. Matthew Schlessinger (UMass) and Dr. Michael Young (UIowa) will be joining the faculty of the Brain and Cognitive Science Graduate Program in the Department of Psychology at Southern Illinois University. In this connection, the Brain and Cognitive Science program has two openings for graduate study (though the two openings are not tied to the incoming faculty). Each opening comes with a monthly stipend of approximately $1000.00 for at least nine months. The start date is August 14, 2000, and applications should be submitted as soon as possible.

The Ph.D. program in Brain and Cognitive Sciences is unique and exciting. The focus is on an interdisciplinary approach to understanding human behavior, approached from a combination of developmental (infancy and childhood, adolescence and aging), neurobiological (neurophysiology, neuropsychology, genetics), behavioral (human and animal experimentation), and computational (neural networks, statistical analyses, intelligent software agents) perspectives.
As an integral part of their training, students become active participants in ongoing faculty research programs in the Brain and Cognitive Sciences. Students receive training in two or more different research methodologies and are expected to develop a multidisciplinary approach to their own research.

Current research by the Brain and Cognitive Sciences faculty includes: perinatal risk factors in child development; neurophysiological and behavioral correlates of infant and child cognitive and language development; personality and social correlates of cognitive aging; child play and social behaviors; identity development across the life span; judgment and decision making; causal and category learning; neural network models of learning and sensorimotor cognition; neural network models of language acquisition and processing; agent-based computational modeling of the evolution and development of action and perception; artificial grammar learning; sentence processing; evolution of language and the brain; the pharmacological modulation of memory; effects of psychoactive drugs; reversible inactivation of discrete brain areas and memory; recovery of function from brain damage; electrophysiological models (e.g., long-term potentiation); the neurophysiology of memory; animal learning; and human learning and memory.

For more information about the program and application procedures, please visit our web site at: http://www.siu.edu/~psycho/bcs
Visit also the Department's web site at: http://www.siu.edu/~psycho

Best regards,
Morten Christiansen
Coordinator of the Brain and Cognitive Sciences Program

----------------------------------------------------------------------
Morten H.
Christiansen, Assistant Professor
Department of Psychology, Southern Illinois University
Carbondale, IL 62901-6502
Office: Life Sciences II, Room 271A
Phone: +1 (618) 453-3547; Fax: +1 (618) 453-3563
Email: morten at siu.edu
Personal Web Page: http://www.siu.edu/~psycho/faculty/mhc.html
Lab Web Site: http://www.siu.edu/~morten/csl
----------------------------------------------------------------------

From ericwan at ece.ogi.edu Thu Jun 1 13:45:22 2000
From: ericwan at ece.ogi.edu (Eric Wan)
Date: Thu, 01 Jun 2000 10:45:22 -0700
Subject: POSTDOCTORAL and PH.D. RESEARCH POSITIONS
Message-ID: <3936A132.70AC4573@ece.ogi.edu>

POSTDOCTORAL and PH.D. RESEARCH POSITIONS

The Center for Spoken Language Understanding (CSLU) at the Oregon Graduate Institute of Science and Technology (OGI) is seeking applicants for one Postdoctoral Research Associate position and one Ph.D. Student Fellowship to work with Professor Eric A. Wan (http://www.ece.ogi.edu/~ericwan/) on a number of projects relating to speech enhancement and machine learning.

QUALIFICATIONS: The postdoctoral candidate should have a Ph.D. with a strong background in signal processing, speech technologies, and neural networks. The Ph.D. candidate should have a strong background in signal processing and neural networks, and some familiarity with speech technologies. A Masters Degree in Electrical Engineering is preferred.

Please send inquiries to ericwan at ece.ogi.edu, including the following background information:
- name and affiliation
- a short paragraph describing qualifications and interests
- a CV including a list of publications and prior work

Eric A. Wan
Associate Professor, OGI

******************************************************

OGI

OGI is a young but rapidly growing private research institute located in the Portland area. OGI offers Masters and Ph.D.
programs in Computer Science and Engineering, Applied Physics, Electrical Engineering, Biology, Chemistry, Materials Science and Engineering, and Environmental Science and Engineering. OGI is located near Portland, a thriving city in the heart of the lush natural beauty of the Pacific Northwest. It is only a 1-2 hour drive from year-round downhill skiing on dormant volcanoes, high desert, countless forest hiking trails, the Pacific Ocean, and numerous breweries and wineries. OGI is an equal-opportunity, affirmative action employer; women, minorities, and individuals with disabilities are encouraged to apply.

OGI has world-renowned research programs in the areas of speech systems (Center for Spoken Language Understanding) and machine learning (Center for Information Technologies).

Center for Spoken Language Understanding
http://cslu.cse.ogi.edu

The Center for Spoken Language Understanding is a multidisciplinary academic organization that focuses on basic research in spoken language systems technologies, training of new investigators, and development of tools and resources for free distribution to the research and education community. Areas of specific interest include speech recognition, natural language understanding, text-to-speech synthesis, speech enhancement in noisy conditions, and modeling of human dialogue. A key activity is the ongoing development of the CSLU Toolkit, a comprehensive software platform for learning about, researching, and developing spoken dialog systems and new applications.

Center for Information Technologies

The Center for Information Technologies supports development of powerful, robust, and reliable information processing techniques by incorporating human strategies and constraints. Such techniques are critical building blocks of multimodal communication systems, decision support systems, and human-machine interfaces.
The CIT approach is based on emulating relevant human information processing capabilities and extending them to a variety of complex tasks. The approach requires expertise in nonlinear and adaptive signal processing (e.g., neural networks), statistical computation, decision analysis, and modeling of human information processing. Correspondingly, CIT research areas include perceptual characterization of speech and images, prediction, robust signal processing, rapid adaptation to changing environments, nonlinear signal representation, integration of information from several sources, and integration of prior knowledge with adaptation.

From adr at raphe.NSMA.Arizona.EDU Thu Jun 1 15:18:31 2000
From: adr at raphe.NSMA.Arizona.EDU (David Redish)
Date: Thu, 01 Jun 2000 12:18:31 -0700
Subject: MClust: Spike-sorting toolbox
Message-ID: <200006011937.MAA24927@cortex.NSMA.Arizona.EDU>

Announcing the release of the MClust spike-sorting toolbox.

MClust is a Matlab toolbox that enables a user to perform manual clustering on single-electrode, stereotrode, and tetrode recordings taken with the DataWave and Cheetah recording systems. It can be found at http://www.nsma.arizona.edu/adr/mclust/

The MClust toolbox is freeware, but you will need Matlab 5.2 or higher to run it. It has been tested under the Windows and Solaris families of operating systems, and ports to other operating systems are in the works. Further details (such as the copyright notice and disclaimer) are available from the website above.

-----------------------------------------------------
A.
David Redish, adr at nsma.arizona.edu
Post-doc, http://www.cs.cmu.edu/~dredish
Neural Systems, Memory and Aging, Univ of AZ, Tucson AZ
-----------------------------------------------------

From bvr at stanford.edu Fri Jun 2 01:33:45 2000
From: bvr at stanford.edu (Benjamin Van Roy)
Date: Thu, 01 Jun 2000 22:33:45 -0700
Subject: NIPS*2000 WORKSHOP PROPOSALS - DEADLINE EXTENDED TO JUNE 9
Message-ID: <4.2.0.58.20000601223246.00da83c0@bvr.pobox.stanford.edu>

NIPS*2000 WORKSHOP PROPOSALS - DEADLINE EXTENDED TO JUNE 9

=====================================
Neural Information Processing Systems: Natural and Synthetic
NIPS*2000 Post-Conference Workshops
December 1 and 2, 2000
Breckenridge, Colorado
=====================================

Following the regular program of the Neural Information Processing Systems 2000 conference, workshops on various current topics in neural information processing will be held on December 1 and 2, 2000, in Breckenridge, Colorado. Proposals by qualified individuals interested in chairing one of these workshops are solicited.

Example topics include: Active Learning, Architectural Issues, Attention, Audition, Bayesian Analysis, Bayesian Networks, Benchmarking, Brain Imaging, Computational Complexity, Computational Molecular Biology, Control, Genetic Algorithms, Graphical Models, Hippocampus and Memory, Hybrid Supervised/Unsupervised Learning Methods, Hybrid HMM/ANN Systems, Implementations, Independent Component Analysis, Mean-Field Methods, Markov Chain Monte-Carlo Methods, Music, Network Dynamics, Neural Coding, Neural Plasticity, On-Line Learning, Optimization, Recurrent Nets, Robot Learning, Rule Extraction, Self-Organization, Sensory Biophysics, Signal Processing, Spike Timing, Support Vectors, Speech, Time Series, Topological Maps, and Vision.

The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest.
There will be six hours of workshop meetings per day, split into morning and afternoon sessions, with free time in between for ongoing individual exchange or outdoor activities. Controversial issues, open problems, and comparisons of competing approaches are encouraged and preferred as workshop topics. Representation of alternative viewpoints and panel-style discussions are particularly encouraged.

Descriptions of previous workshops may be found at http://www.cs.cmu.edu/Groups/NIPS/NIPS99/Workshops/

Select workshops may be invited to submit their workshop proceedings for publication as part of a new series of monographs for the post-NIPS workshops.

Workshop organizers will have responsibilities including:
++ coordinating workshop participation and content, which includes arranging short informal presentations by experts, arranging for expert commentators to sit on a discussion panel, formulating a set of discussion topics, etc.
++ moderating the discussion, and reporting its findings and conclusions to the group during evening plenary sessions
++ writing a brief summary and/or coordinating submitted material for post-conference electronic dissemination.

=======================
Submission Instructions
=======================

Interested parties should submit a short proposal for a workshop of interest via email by June 9, 2000. Proposals should include a title, a description of what the workshop is to address and accomplish, the proposed workshop length (1 or 2 days), the planned format (mini-conference, panel discussion, combinations of the above, etc.), and proposed speakers. Names of potential invitees should be given where possible. Preference will be given to workshops that reserve a significant portion of time for open discussion or panel discussion, as opposed to a pure "mini-conference" format.

An example format is:
++ Tutorial lecture providing background and introducing terminology relevant to the topic.
++ Two short lectures introducing different approaches, alternating with discussions after each lecture.
++ Discussion or panel presentation.
++ Short talks or panels alternating with discussion and question/answer sessions.
++ General discussion and wrap-up.

We suggest that organizers allocate at least 50% of the workshop schedule to questions, discussion, and breaks. Past experience suggests that workshops otherwise degrade into mini-conferences as talks begin to run over.

The proposal should motivate why the topic is of interest or controversial, why it should be discussed, and who the targeted group of participants is. It should also include a brief resume of the prospective workshop chair with a list of publications to establish scholarship in the field. Submissions should include a contact name, address, email address, and phone and fax numbers.

Proposals should be emailed to caruana at cs.cmu.edu. Proposals must be RECEIVED by June 9, 2000. If email is unavailable, mail to: NIPS Workshops, Rich Caruana, SCS CMU, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA.

Questions may be addressed to either of the Workshop Co-Chairs:
Rich Caruana (caruana at cs.cmu.edu)
Virginia de Sa (desa at phy.ucsf.edu)

PROPOSALS MUST BE RECEIVED BY June 9, 2000

From juergen at idsia.ch Fri Jun 2 12:51:17 2000
From: juergen at idsia.ch (Juergen Schmidhuber)
Date: Fri, 2 Jun 2000 18:51:17 +0200
Subject: job
Message-ID: <200006021651.SAA20708@ruebe.idsia.ch>

We are seeking an outstanding PhD candidate for an ongoing research project that combines machine learning (unsupervised coding, neural networks, reinforcement learning, evolutionary computation) and computational fluid dynamics. We tackle problems such as turbulent flow encoding and drag minimisation. Details: http://www.idsia.ch/~juergen/flow2000.html

Interviews: IJCNN (24-27 July 2000) will take place in Como, Italy. Como is very close to the Swiss border, just 30 minutes away from IDSIA.
In case you are participating in the conference, this would be a good time for a job interview.

___________________________________________________
Juergen Schmidhuber, director
IDSIA, Galleria 2, 6928 Manno (Lugano), Switzerland
juergen at idsia.ch
http://www.idsia.ch/~juergen

From Pierre.Bessiere at imag.fr Fri Jun 2 06:14:30 2000
From: Pierre.Bessiere at imag.fr (Pierre Bessiere)
Date: Fri, 2 Jun 2000 11:14:30 +0100
Subject: 3 papers about BAYESIAN ROBOTICS
Message-ID:

Three papers about Bayesian robotics are available online (comments welcome):

Bayesian Robots Programming

Abstract: We propose a new method to program robots based on Bayesian inference and learning. The capacities of this programming method are demonstrated through a succession of increasingly complex experiments. Starting from the learning of simple reactive behaviors, we present instances of behavior combination, sensor fusion, hierarchical behavior composition, situation recognition, and temporal sequencing. This series of experiments comprises the steps in the incremental development of a complex robot program. The advantages and drawbacks of this approach are discussed along with these different experiments and summed up in the conclusion. These robotics programs may be seen as an illustration of probabilistic programming, applicable whenever one must deal with problems based on uncertain or incomplete knowledge. The scope of possible applications is obviously much broader than robotics.

PDF: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Lebeltel2000.pdf
PS: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Lebeltel2000.ps
PS.GZ: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Lebeltel2000.ps.gz

Reference: Lebeltel O., Bessière P., Diard J. & Mazer E.
(2000); Bayesian Robots Programming; Les cahiers du Laboratoire Leibniz (Technical Report), no. 1, May 2000; Grenoble, France

The Design and Implementation of a Bayesian CAD Modeler for Robotic Applications

Abstract: We present a Bayesian CAD modeler for robotic applications. We address the problem of taking into account the propagation of geometric uncertainties when solving inverse geometric problems. The proposed method may be seen as a generalization of constraint-based approaches in which we explicitly model geometric uncertainties. Using our methodology, a geometric constraint is expressed as a probability distribution on the system's parameters and the sensor measurements, instead of a simple equality or inequality. To solve geometric problems in this framework, we propose an original resolution method able to adapt to problem complexity. Using two examples, we show how to apply our approach by providing simulation results using our modeler.

PDF: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Mekhnacha2000.pdf
PS: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Mekhnacha2000.ps
PS.GZ: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Mekhnacha2000.ps.gz

Reference: Mekhnacha K., Mazer E. & Bessière P. (2000); The Design and Implementation of a Bayesian CAD Modeler for Robotic Applications; Les cahiers du Laboratoire Leibniz (Technical Report), no. 2, May 2000; Grenoble, France

State Identification for Planetary Rovers: Learning and Recognition

Abstract: A planetary rover must be able to identify states where it should stop or change its plan. With limited and infrequent communication from the ground, the rover must recognize states accurately. However, the sensor data is inherently noisy, so identifying the temporal patterns of data that correspond to interesting or important states becomes a complex problem. In this paper, we present an approach to state identification using second-order Hidden Markov Models.
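[As a generic illustration of how HMM-based state identification works: one candidate model is kept per rover state, and a state is identified by whichever model assigns the observed sequence the highest likelihood. The sketch below is a minimal first-order version with invented toy parameters, purely for illustration; it is not the authors' second-order implementation, and all names in it are hypothetical.]

```python
# Minimal sketch of HMM-based state identification: score an
# observation sequence under several candidate HMMs (one per rover
# state) and pick the most likely model. First-order HMMs with
# hand-set toy parameters; NOT the authors' second-order system.

def forward_likelihood(obs, pi, A, B):
    """P(obs | model) via the standard forward recursion."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [
            sum(alpha[t] * A[t][s] for t in range(n)) * B[s][o]
            for s in range(n)
        ]
    return sum(alpha)

# Two toy models over a binary observation (0 = low wheel current,
# 1 = high). "driving" tends to emit low current, "stuck" high.
# Each model is (initial dist, transition matrix, emission matrix).
models = {
    "driving": ([0.5, 0.5],
                [[0.9, 0.1], [0.1, 0.9]],
                [[0.8, 0.2], [0.6, 0.4]]),
    "stuck":   ([0.5, 0.5],
                [[0.9, 0.1], [0.1, 0.9]],
                [[0.3, 0.7], [0.1, 0.9]]),
}

def identify_state(obs):
    """Return the name of the model most likely to have produced obs."""
    return max(models, key=lambda m: forward_likelihood(obs, *models[m]))

print(identify_state([1, 1, 1, 0, 1]))  # mostly high current -> stuck
```

The second-order models in the paper condition each transition on the two previous states rather than one, but the selection-by-likelihood step is the same.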
Models are trained automatically on a set of labeled training data; the rover then uses those models to identify its state from the observed data. The approach is demonstrated on data from a planetary rover platform.

PDF: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Aycard2000.pdf
PS: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Aycard2000.ps
PS.GZ: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Aycard2000.ps.gz

Reference: O. Aycard and R. Washington (2000); State Identification for Planetary Rovers: Learning and Recognition; in Proceedings of the 2000 IEEE International Conference on Robotics and Automation; San Francisco, USA.

*****

You are welcome to visit our WWW pages (http://www-leibniz.imag.fr/LAPLACE) with numerous VIDEOs and DEMOs.

___________________________________________________________________
Dr Pierre BESSIERE, CNRS
Laboratoire LEIBNIZ - Institut IMAG
46 ave. Felix Viallet, 38031 Grenoble - FRANCE
Work: +33/(0)4.76.57.46.73; Fax: +33/(0)4.76.57.46.02
mailto:Pierre.Bessiere at imag.fr
WWW: http://www-leibniz.imag.fr/LAPLACE
http://www-leibniz.imag.fr/PRASC
http://www-leibniz.imag.fr/~bessiere
CNRS - INPG - UJF

From poznan at harl.hitachi.co.jp Sat Jun 3 02:27:15 2000
From: poznan at harl.hitachi.co.jp (Roman Poznanski)
Date: Sat, 03 Jun 2000 15:27:15 +0900
Subject: A new book on NEURAL NETWORKS
References: <38D96662.6203063A@neuron.kaist.ac.kr>
Message-ID: <3938A543.1B7368D2@harl.hitachi.co.jp>

FORTHCOMING

Biophysical Neural Networks: Foundations of Analytical Neuroscience
Edited by Roman R. Poznanski, Advanced Research Laboratory, Hitachi, Ltd., Japan

Biophysical Neural Networks focuses on biologically realistic models in the exploration of brain function at a multi-hierarchical level of organization. From biochemistry to large assemblies of neurons, the modeling of morphologically diverse, biophysically and biochemically realistic neural networks is the book's central concern.
An essential text for researchers active in the fields of medicine (neuroscience), biophysics, biomedical engineering, and neural science.

Key Features
--------------
** Learn why the brain is intrinsically noncomputational.
** Read how the Darwinian brain model can be made more profitable for engineers.
** Discover the new field of ANALYTICAL NEUROSCIENCE.
** With 45 research projects/unsolved problems.
** Over 500 references and color plates.

The book is scheduled to appear late this year. If you would like to receive a color brochure, email me your name and postal address.

Sincerely,
Roman R. Poznanski
Editor, Biophysical Neural Networks: Foundations of Analytical Neuroscience

From stefan.wermter at sunderland.ac.uk Mon Jun 5 07:48:06 2000
From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter)
Date: Mon, 05 Jun 2000 12:48:06 +0100
Subject: cognitive systems research journal book review
Message-ID: <393B9376.3A2462EC@sunderland.ac.uk>

The journal of COGNITIVE SYSTEMS RESEARCH invites recent and new books to be sent for review. We are also interested in hearing from researchers who would like to write a review, to be published as a brief article in the journal. Likewise, if you have recently read a new interesting, challenging, or controversial book within the scope of the journal, please let us know at the address below. We are particularly interested in new books and new review-article writers.

The journal of Cognitive Systems Research covers all topics in the study of cognitive processes, in both natural and artificial systems. The journal emphasizes the integration/synthesis of ideas, concepts, constructs, theories, and techniques from multiple paradigms, perspectives, and disciplines, in the analysis, understanding, and design of cognitive and intelligent systems.
Contributions describing results obtained within the traditional disciplines (e.g., psychology, artificial intelligence) using well-established paradigms are also sought if such work has broader implications and relevance. The journal seeks to foster and promote the discussion of novel approaches in studying cognitive and intelligent systems. It also encourages cross-fertilization of disciplines. This is to be achieved by soliciting and publishing high-quality contributions in all areas of study in cognitive science, including artificial intelligence, linguistics, psychology, psychiatry, philosophy, system and control theory, anthropology, sociology, biological sciences, and neuroscience.

For more information on topics and scope please see http://www.cecs.missouri.edu/~rsun/journal.html

Please send suggestions for book reviews, or offers to be a review-article author, to the book review editor at the address below. Publishers are also invited to send two review copies directly for inspection.

best wishes,
Stefan

***************************************
Professor Stefan Wermter
Research Chair in Intelligent Systems
University of Sunderland
Centre of Informatics, SCET
St Peters Way, Sunderland SR6 0DD, United Kingdom
phone: +44 191 515 3279
fax: +44 191 515 3553
email: stefan.wermter at sunderland.ac.uk
http://www.his.sunderland.ac.uk/~cs0stw/
http://www.his.sunderland.ac.uk/
****************************************

From niranjan at eng.cam.ac.uk Tue Jun 6 09:25:44 2000
From: niranjan at eng.cam.ac.uk (niranjan@eng.cam.ac.uk)
Date: Tue, 6 Jun 2000 14:25:44 +0100 (BST)
Subject: Faculty Jobs - Sheffield
Message-ID: <200006061325.25395@baby.eng.cam.ac.uk>

The University of Sheffield Department of Computer Science has two faculty positions (lectureships), to start as soon as possible. We are looking for outstanding researchers in any area of CS, with a willingness to do some undergraduate teaching outside their area of research.
Appointees in the area of Machine Learning may join a new group of four faculty: Joab Winkler (works in wavelets), Thia Kirubarajan (works in tracking, currently at UConn, joining us in September), Si Wu (works in pattern recognition, currently at RIKEN, joining us in September), and myself. I am keen to encourage applicants who are strong in probabilistic modelling with an interest in difficult real-world problems such as signal processing or computational biology.

Deadline: 16 June; interviews: 27 June. The formal application procedure is via the instructions at http://www.shef.ac.uk/jobs
Information on the department and University: http://www.dcs.shef.ac.uk

If you are interested, or know someone who might be, please get in touch.

Many thanx
niranjan

__________________________________________________________________
Mahesan Niranjan
Professor of Computer Science
The University of Sheffield
M.Niranjan at dcs.shef.ac.uk
_________________________________________________________________

From n.sharkey at dcs.shef.ac.uk Wed Jun 7 05:27:35 2000
From: n.sharkey at dcs.shef.ac.uk (Noel Sharkey)
Date: Wed, 7 Jun 2000 10:27:35 +0100 (BST)
Subject: Faculty Positions at Sheffield
Message-ID:

*Apologies if you receive more than one copy*

********************* FACULTY POSITIONS ******************

The University of Sheffield Department of Computer Science has two faculty positions (lectureships - the British equivalent of tenure-track assistant professorships), to start ASAP. We are looking for outstanding researchers in any area of Computer Science, with a willingness to do some undergraduate teaching outside their area of research. There are a number of internationally excellent research groups within the dept. - see www.dcs.shef.ac.uk/research/groups/

I would particularly like to encourage researchers in the field of robotics with a preference for adaptive methods (NNs, GAs), BioRobotics, or intelligent sensing.
(See the NRG group pages: www.dcs.shef.ac.uk/research/groups/nrg/) Please pass this on to anyone who might be interested. Deadline 16 June; Interview 27 June. Formal application procedure is via instructions in http://www.shef.ac.uk/jobs Information on the department and University from: http://www.dcs.shef.ac.uk ******************************************************************** Noel Sharkey PhD FIEE FBCS Professor of Computer Science Dept. Computer Science email: n.sharkey at dcs.shef.ac.uk University of Sheffield fax: (0114) 2221810 Sheffield, S. Yorks, UK phone: (0114) 2221803 ******************************************************************** From thomas.runkler at mchp.siemens.de Thu Jun 8 09:46:51 2000 From: thomas.runkler at mchp.siemens.de (Thomas Runkler) Date: Thu, 8 Jun 2000 15:46:51 +0200 (MET DST) Subject: Siemens Ph.D. studentship Message-ID: <200006081346.PAA17307@obsidian.mchp.siemens.de> Siemens Corporate Technology is seeking outstanding applicants for a three-year Ph.D. studentship for the project NEURAL COMPUTATION AND FUZZY SYSTEMS IN INDUSTRY with a focus on applications in process industry, production, and logistics. Applicants must have a Masters degree in EE/CS/Physics/Applied Mathematics or an engineering discipline with a solid background in one or more of the following areas: neural networks, fuzzy logic, data analysis, modeling, simulation, control, optimization, diagnosis, nonlinear dynamics, and distributed systems. Programming experience in MATLAB and C/C++ or JAVA is essential for the offered position. Candidates must demonstrate good communication skills in either English or German. The successful candidate(s) will join an active R&D team located in picturesque Munich not far from the Bavarian Alps with its world-famous castles and mountain lakes. 
Candidates will be responsible for conducting leading-edge research in the fields of data-driven process modeling, preventive diagnosis, distributed control, and distributed optimization, and for developing systems for new products. Starting date for the position is October 2000. Successful applicants will be issued a residency visa for the three-year period. Applicants should send their resume, three letters of recommendation, and a statement of interests and goals to: Barbara Mayr Siemens AG, Otto-Hahn-Ring 6, D-81730 Munich, Germany Phone: 0049-89-636-46863 Fax: 0049-89-636-53981 Email: barbara.mayr at mchp.siemens.de From Dave_Touretzky at cs.cmu.edu Fri Jun 9 21:45:59 2000 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Fri, 09 Jun 2000 21:45:59 -0400 Subject: new journal: Brain and Mind Message-ID: <17121.960601559@skinner.boltz.cs.cmu.edu> The first issue of Brain and Mind, a new journal from Kluwer, is available for free on the web at http://www.wkap.nl./journalhome.htm/1389-1987 The table of contents follows. -- Dave Touretzky ================================================================ Brain and Mind A Transdisciplinary Journal of Neuroscience and Neurophilosophy Table of Contents Volume 1, Issue 1, April 2000 Editors' Introduction John Bickle, Gillian Einstein, Valerie Hardcastle pp. 1-6 Why is Brain Size so Important: Design Problems and Solutions as Neocortex Gets Bigger or Smaller Jon H. Kaas pp. 7-23 Editor's Note John Bickle pp. 25-25 A Remembrance of an Event - Foreword to ''The Two Factor Theory of the Mind-Brain Relation'' by Ullin T. Place C.B. Martin pp. 27-27 The Two Factor Theory of the Mind-Brain Relation Ullin T. Place pp. 29-43 Behavioral Tolerance (Contingent Tolerance) is Mediated in Part by Variations in Regional Cerebral Blood Flow Stephen C. Fowler pp. 45-57 Self, World and Space: The Meaning and Mechanisms of Ego- and Allocentric Spatial Representation Rick Grush pp.
59-92 Terra Cognita: From Functional Neuroimaging to the Map of the Mind Dan Lloyd pp. 93-116 Editor's Note: State of the Science Article pp. 117-117 On the 'Dynamic Brain' Metaphor Péter Érdi pp. 119-145 ==== From robtag at unisa.it Mon Jun 12 07:02:25 2000 From: robtag at unisa.it (Roberto Tagliaferri) Date: Mon, 12 Jun 2000 13:02:25 +0200 Subject: material available: SOFT COMPUTING METHODOLOGIES FOR DATA MINING Message-ID: <3944C341.45E0401C@dia.unisa.it> Dear Colleagues, The programme and the transparencies of the workshop held at IIASS "E.R. Caianiello" on May 19-20, 2000, organized by SIREN (the Italian Neural Networks Society, Società Italiana Reti Neuroniche), on SOFT COMPUTING METHODOLOGIES FOR DATA MINING are now available for download on-line. You can find them on the SIREN site at http://dsi.ing.unifi.it/neural/siren/Workshop2000/workshop2000_en.html I would like to thank the speakers, the organizing committee and Dr. Enrico Francesconi for their help in the organization. For any problems you can send an e-mail to robtag at unisa.it Roberto Tagliaferri From abbass at cs.adfa.edu.au Mon Jun 12 20:51:43 2000 From: abbass at cs.adfa.edu.au (Hussein A. Abbass) Date: Tue, 13 Jun 2000 10:51:43 +1000 Subject: Call for Book Chapters: (Data Mining: A Heuristic Approach) Message-ID: <3.0.6.32.20000613105143.007a3c90@csadfa.cs.adfa.edu.au> Our sincere apologies if you receive multiple copies of this call for chapters or if it is not in your academic research interests. Dear colleague, Please post this call for chapters to the relevant researchers in your organization. Call for Chapters and Contributions ------------------------------------------- Data Mining: A Heuristic Approach http://www.cs.adfa.edu.au/~abbass/Book/DMHA.html Editors: H.A. Abbass, R. Sarkar, and C. Newton Publisher: Idea Group Publishing, USA This book volume will be a repository for the applications of heuristic techniques in data mining.
With roots in optimisation, artificial intelligence, and statistics, data mining is an interdisciplinary area concerned with finding patterns in databases. These patterns might be the expected trend in women's fashion, the potential change in the prices of some shares on the stock exchange, the prospective behaviour of some competitors, or the causes of a budding virus. With the large amount of data stored in many organizations, business people have observed that these data are an important intangible asset, if not the most important one, in their organizations. This has instigated an enormous amount of research, searching for learning methods that are capable of recognising novel and non-trivial patterns in databases. Unfortunately, handling large databases is a very complex process, and traditional learning techniques such as neural networks and traditional decision trees are expensive to use. New optimisation techniques such as support vector machines and kernel methods, as well as statistical techniques such as Bayesian learning, are now widely used in the field of data mining. However, these techniques are computationally expensive, and heuristic techniques can provide much help in this arena. Notwithstanding, there are only a few books in the area of heuristics and a few more in the area of data mining. Surprisingly, no single book has been published that brings together these two fast-changing, inter-related fields. Topics: The use of heuristics (evolutionary algorithms, simulated annealing, tabu search, swarm intelligence, biological agents, memetic algorithms, and others) in the following areas: Feature selection. Data cleaning. Clustering, classification, prediction, and association rules. Optimisation methods for data mining. Kernels and support vector machines. Fast algorithms for training neural networks. Bayesian inference and learning. Survey chapters are also welcomed.
Other related topics are also of interest. Important dates: Abstract submission: August 15, 2000 Acceptance of abstract: September 15, 2000 Full chapter due: January 15, 2001 Notification of full-chapter acceptance: March 1, 2001 Final version due: April 30, 2001 Estimated publication date: Fall 2001 by Idea Group Publishing Contact information: Send electronic submissions to one of the editors at abbass at cs.adfa.edu.au ruhul at cs.adfa.edu.au csn at cs.adfa.edu.au Hard copies should be sent to any of the editors at: School of Computer Science, University College, University of New South Wales, Australian Defence Force Academy, Canberra, ACT 2600, Australia. Fax submissions to: 02-62688581 within Australia +61-2-62688581 international Hussein Aly Abbass Amein Lecturer in Computer Science, Email: abbass at cs.adfa.edu.au Australian Defence Force Academy, http: http://www.cs.adfa.edu.au/~abbass School of Computer Science, Tel.(M) (+61) 0402212977 University College, Tel.(H) (+61) (2) 62578757 University of New South Wales, Tel.(W) (+61) (2) 62688158 Canberra, ACT 2600, Australia. Fax.(W) (+61) (2) 62688581 From cyrano at arti.vub.ac.be Tue Jun 13 09:47:00 2000 From: cyrano at arti.vub.ac.be (Andreas Birk) Date: Tue, 13 Jun 2000 15:47:00 +0200 (MET DST) Subject: book announcements Message-ID: <200006131347.PAA25190@arti13.vub.ac.be> Dear researchers interested in robot learning, the book "Interdisciplinary Approaches to Robot Learning" edited by John Demiris and Andreas Birk is now available. It is published and distributed by World Scientific. At the same time, Springer has decided to offer an online version of the previously published book "Learning Robots" edited by Andreas Birk and John Demiris. Information on both books is given below. ---- John Demiris, Andreas Birk (Eds.) Interdisciplinary Approaches to Robot Learning Robotics and Intelligent Systems Series, World Scientific, 2000 ISBN 981-02-4320-0 250 pp (approx.)
US$56 book description: http://www.wspc.com/books/compsci/4436.htm Contents: Preface to Interdisciplinary Approaches to Robot Learning (J Demiris & A Birk) Bootstrapping the Developmental Process: The Filter Hypothesis (L Berthouze) Biomimetic Gaze Stabilization (T Shibata & S Schaal) Experiments and Models About Cognitive Map Learning for Motivated Navigation (P Gaussier et al.) Learning Selection of Action for Cortically-Inspired Robot Control (H Frezza-Buet & F Alexandre) Transferring Learned Knowledge in a Lifelong Learning Mobile Robot Agent (J O'Sullivan) Of Hummingbirds and Helicopters: An Algebraic Framework for Interdisciplinary Studies of Imitation and Its Applications (C Nehaniv & K Dautenhahn) Evolving Complex Visual Behaviours Using Genetic Programming and Shaping (S Perkins & G M Hayes) Preston: A System for the Evaluation of Behaviour Sequences (M Wilson) Readership: Researchers and graduate students in robotics and machine learning who are interested in interdisciplinary approaches to their fields. ----- Andreas Birk, John Demiris (Eds.) Learning Robots Proceedings of EWLR-6, Brighton, UK; Lecture Notes in Artificial Intelligence (LNAI) 1545, Springer, 1998 ISBN 3-540-65480-1 188 pp DM 50,- This book is now online via the Springer LINK service. This means that subscribed parties (libraries, institutes, etc.) can access the complete content of the book via the WWW. Access from hosts which are not subscribed to the LINK service is limited to the abstracts of the different chapters. 
Online access to "Learning Robots" is available at http://link.springer.de/link/service/series/0558/tocs/t1545.htm ----- From cindy at cns.bu.edu Thu Jun 15 09:59:00 2000 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Thu, 15 Jun 2000 09:59:00 -0400 Subject: Neural Networks 13(4/5) Message-ID: <200006151359.JAA27950@retina.bu.edu> NEURAL NETWORKS 13(4/5) Contents - Volume 13, Numbers 4 & 5 - 2000 ------------------------------------------------------------------ NEURAL NETWORKS LETTERS: Bias reduction in skewed binary classification with Bayesian neural networks P.J.G. Lisboa, A. Vellido, and H. Wong INVITED ARTICLES: Independent component analysis: Algorithms and applications A. Hyvarinen and E. Oja Evolutionary robots with on-line self-organization and behavioral fitness D. Floreano and J. Urzelai CONTRIBUTED ARTICLES: ***** Neuroscience and Neuropsychology ***** A generalized Hebbian rule for activity-dependent synaptic modifications T. Kitajima and K.-I. Hara ***** Mathematical and Computational Analysis ***** Mutual information of sparsely coded associative memory with self-control and ternary neurons D. Bolle, D.R.C. Dominguez, and S. Amari Construction of confidence intervals for neural networks based on least squares estimation I. Rivals and L. Personnaz A new algorithm for learning in piecewise-linear neural networks E.F. Gad, A.F. Atiya, S. Shaheen, and A. El-Dessouki Evolution and generalization of a single neurone, III: Primitive, standard, robust, and minimax regressions S. Raudys ***** Engineering and Design ***** Determining the number of centroids for CMLP network M. Lehtokangas ***** Technology and Applications ***** Defining a neural network controller structure for a rubbertuator robot M. Ozkan, K. Inoue, K. Negishi, and T.
Yamanaka An efficient learning algorithm for improving generalization performance of radial basis function neural networks Zheng-ou Wang and Tao Zhu ***** Book Review ***** Adaptive resonance theory microchips: Circuit design techniques A.E. Hubbard ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type                       INNS       ENNS       JNNS
----------------------------------------------------------------------------
Membership with                       $80        660 SEK    Y 15,000 [including
Neural Networks                                             Y 2,000 entrance fee]
  (student)                           $55        460 SEK    Y 13,000 [including
                                                            Y 2,000 entrance fee]
----------------------------------------------------------------------------
Membership without                    $30        200 SEK    not available to
Neural Networks                                             non-students (subscribe
                                                            through another society)
  (student)                                                 Y 5,000 [including
                                                            Y 2,000 entrance fee]
----------------------------------------------------------------------------
Institutional rates                   $1132      2230 NLG   Y 149,524
----------------------------------------------------------------------------

Name: _____________________________________
Title: _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone: _____________________________________
Fax: _____________________________________
Email: _____________________________________

Payment:
[ ] Check or money order enclosed, payable to INNS or ENNS
OR
[ ] Charge my VISA or MasterCard
    card number ____________________________
    expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From duch at phys.uni.torun.pl Fri Jun 16 11:09:07 2000 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Fri, 16 Jun 2000 17:09:07 +0200 Subject: tutorial on extraction of knowledge from data, IJCNN'2000 Message-ID: <006501bfd7a4$d1a7b2a0$04054b9e@phys.uni.torun.pl> Web pages containing notes for the tutorial Extraction of Knowledge from Data Using Computational Intelligence Methods, to be presented at The International Joint Conference on Neural Networks, Como, Italy, 24-27 July 2000, are available at the address: http://www.phys.uni.torun.pl/~duch/ref/kdd-tut/index.html These pages will still be developed further, but at the present stage they should be sufficient to give you an idea of the neural and machine learning methods that will be presented. See you in Como! W/lodzis/law Duch http://www.phys.uni.torun.pl/~duch From amari at brain.riken.go.jp Fri Jun 16 03:07:45 2000 From: amari at brain.riken.go.jp (Shun-ichi Amari) Date: Fri, 16 Jun 2000 16:07:45 +0900 Subject: Bernoulli-RIKEN Symposium on Neural Networks and Learning Message-ID: We are now accepting registration for the Bernoulli-RIKEN Symposium on Neural Networks and Learning. Please register through the web-site http://www.bsis.brain.riken.go.jp/Bernoulli if you wish to be invited to this Symposium. Only a limited number of around one hundred people can attend the conference, so we may have to decline some requests depending on circumstances. Thank you for your understanding and cooperation.
********************************************** Bernoulli-RIKEN BSI 2000 Symposium on Neural Networks and Learning Dates and Venues: Symposium: October 25 - 27, 2000 Ohkouchi Hall, RIKEN (The Institute of Physical and Chemical Research), Japan Satellite Workshop: October 28, 2000 The Institute of Statistical Mathematics, Japan Aim: In order to celebrate Mathematical Year 2000, The Bernoulli Society is organizing a number of Symposia in rapidly developing research areas in which probability theory and statistics can play important roles. Brain science in the broad sense will become ever more important in the 21st century. Information processing in the brain is so flexible, and its learning ability so strong, that it is a genuine challenge for information science to elucidate its mechanisms. It is also a major challenge to construct brain-style information processing systems. The present Symposium focuses on the learning ability of real and artificial neural networks and related systems from theoretical and practical points of view. Probability theory and statistics will play fundamental roles in elucidating these systems, and such studies in turn fortify stochastic and statistical methods. Theories of neural learning and pattern recognition have a long history, and many new ideas are currently emerging. They also have practical applicability to real-world problems. Now is a good time to review all of these new ideas and methods and to discuss future directions of development. We will invite top-class researchers in these fields from around the world to discuss the state of the art of neural networks and learning as well as future directions of this important area. Participation is by invitation only. We are expecting 50-80 participants from all over the world. After the symposium, we will organize a more informal one-day workshop: "Towards new unification of statistics and neural networks learning".
Detailed information on timetables and abstracts can be obtained at http://www.bsis.brain.riken.go.jp/Bernoulli Those interested in joining the Symposium and Workshop may request an invitation through the above web-site after June 15, when it will be ready. If you have any questions, contact the organizing committee at bernoulli2000 at bsis.brain.riken.go.jp ******************* Sponsors: The Bernoulli Society for Mathematical Statistics and Probability RIKEN Brain Science Institute The Institute of Statistical Mathematics Japanese Neural Networks Society In Cooperation with: Japanese Statistical Society Supported by: The Commemorative Association for the Japan World Exposition (1970) The Support Center for Advanced Telecommunications research Technology (SCAT) Organizing Committee: Chair Shun-ichi Amari, RIKEN Brain Science Institute, Japan Leo Breiman, University of California, Berkeley, USA Shinto Eguchi, The Institute of Statistical Mathematics, Japan Michael Jordan, University of California, Berkeley, USA Noboru Murata, Waseda University, Japan Mike Titterington, University of Glasgow, UK Vladimir Vapnik, AT&T, USA A registration fee of 10,000 Japanese yen (roughly 100 US$, including reception) is payable at the conference venue. There is a 50% student discount. ***************** Program: 1. Graphical Models and Statistical Methods: Steffen L. Lauritzen (Aalborg University) Graphical models for learning Thomas S. Richardson (University of Warwick) Ancestral graph Markov models: an alternative to models with latent or selection variables Lawrence Saul (AT&T Labs) Learning the Global Structure of Nonlinear Manifolds Martin Tanner (Northwestern University) Inference for and Applications of Hierarchical Mixtures-of-Experts 2. Combining Learners Leo Breiman (University of California, Berkeley) Random Forests Jerome H.
Friedman (Stanford University) Gradient boosting and multiple additive regression trees Peter Bartlett (Australian National University) Large Margin Classifiers and Risk Minimization Yoram Singer (The Hebrew University) Combining Learners: an Output Coding Perspective 3. Information Geometry and Statistical Physics Shinto Eguchi (The Institute of Statistical Mathematics) Information geometry of tubular neighbourhoods for a statistical model Shun-ichi Amari (RIKEN Brain Science Institute) Information geometry of neural networks Manfred Opper (Aston University) The TAP Mean Field approach for probabilistic models Magnus Rattray (University of Manchester) Modelling the learning dynamics of latent variable models 4. VC Dimension and SVM Vladimir Vapnik (AT&T Labs) Statistical learning theory and support vector machines Michael Kearns (AT&T Labs) Sparse Sampling Algorithms for Probabilistic Artificial Intelligence Gabor Lugosi (Pompeu Fabra University) Model selection, error estimation, and concentration Bernhard Schoelkopf (Microsoft Research Ltd.) SV Algorithms and Applications ******************** Shun-ichi Amari Vice Director, RIKEN Brain Science Institute Laboratory for Mathematical Neuroscience Research Group on Brain-Style Information Systems tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687 amari at brain.riken.go.jp http://www.bsis.brain.riken.go.jp/ From harnad at coglit.ecs.soton.ac.uk Sun Jun 18 10:59:04 2000 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sun, 18 Jun 2000 15:59:04 +0100 (BST) Subject: Language-Origins: PSYC Call for Multiple Book Reviewers Message-ID: PSYCOLOQUY CALL FOR BOOK REVIEWERS of: "The Origins of Complex Language" by Andrew Carstairs-McCarthy (OUP 1999) Below is the abstract of the Precis of "The Origins of Complex Language" by Andrew Carstairs-McCarthy (740 lines). This book has been selected for multiple review in Psycoloquy. 
If you wish to submit a formal book review please write to psyc at pucc.princeton.edu indicating what expertise you would bring to bear on reviewing the book if you were selected to review it. Full Precis: http://www.cogsci.soton.ac.uk/psyc-bin/newpsy?11.082 (If you have never reviewed for PSYCOLOQUY or Behavioral & Brain Sciences before, it would be helpful if you could also append a copy of your CV to your inquiry.) If you are selected as one of the reviewers and do not have a copy of the book, you will be sent a copy of the book directly by the publisher (please let us know if you have a copy already). Reviews may also be submitted without invitation, but all reviews will be refereed. The author will reply to all accepted reviews. FULL PSYCOLOQUY BOOK REVIEW INSTRUCTIONS AT: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psycoloquy/ Psycoloquy reviews are of the book, not the Precis. Length should be about 200 lines [c. 1800 words], with a short abstract (about 50 words), an indexable title, and reviewer's full name and institutional address, email and Home Page URL. All references that are electronically accessible should also have URLs. AUTHOR'S RATIONALE FOR SOLICITING MULTIPLE BOOK REVIEW Most recent investigators assume that the brain has always been the most important part of human anatomy for the evolution of language, and do not seriously examine other conceivable directions in which grammatical evolution might have proceeded. In "The Origins of Complex Language," it is suggested that certain central features of language-as-it-is, notably the distinction between sentences and noun phrases, are by no means inevitable outcomes of linguistic or cognitive evolution, so that where they come from constitutes a genuine puzzle. 
The solution that is proposed is that grammar-as-it-is was, in fundamental respects, exapted from, or tinkered out of, the neural mechanisms that arose for the control of syllabically organized vocalization, made possible by (among other things) the descent of the larynx. This proposal turns upside down mainstream views about the relationship between language development and vocal tract development, and also challenges the logical and epistemological basis of notions closely tied to the distinction between sentences and noun phrases, such as 'reference', 'predication' and 'assertion'. It should therefore be of interest to anthropologists, psychologists, cognitive scientists, linguists and philosophers of language. psycoloquy.00.11.082.language-origins.1.carstairs-mccarthy Wed May 24 2000 ISSN 1055-0143 (44 paragraphs, 27 references, 85 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 2000 Andrew Carstairs-McCarthy THE ORIGINS OF COMPLEX LANGUAGE [Oxford University Press 1999, ISBN 0-19-823822-3, 0-19-823821-5] Precis of Carstairs-McCarthy on Complex Language Andrew Carstairs-McCarthy University of Canterbury Department of Linguistics Private Bag 4800 Christchurch New Zealand a.c-mcc at ling.canterbury.ac.nz ABSTRACT: Some puzzling characteristics of grammar, such as the sentence/NP distinction and the organization of inflection classes, may provide clues about its prehistory. When bipedalism led to changes in the vocal tract that favoured syllabically organized vocalization, this made possible an increase in vocabulary which in turn rendered advantageous a reliable syntax, whose source was the neural mechanism for controlling syllable structure. Several features of syntax make sense as byproducts of characteristics of the syllable (for example, grammatical 'subjects' may be byproducts of onset margins). This scenario is consistent with evidence from biological anthropology, ape language studies, and brain neurophysiology. 
KEYWORDS: ape, aphasia, brain development, evolution of language, grammar, language, larynx, noun phrase, predication, principle of contrast, reference, sentence, sign language, speech, syllable, truth From NKasabov at infoscience.otago.ac.nz Sun Jun 18 18:54:50 2000 From: NKasabov at infoscience.otago.ac.nz (Nik Kasabov) Date: Mon, 19 Jun 2000 10:54:50 +1200 Subject: a new book on intelligent systems and information sciences Message-ID: Dear colleagues, The following book has now been published by Springer Verlag (Physica Verlag): "Future Directions for Intelligent Systems and Information Sciences", N. Kasabov (ed), 2000, Springer Verlag (Physica Verlag). The book contains 19 chapters on contemporary topics, including: adaptive learning and evolving systems; artificial life; speech and image processing; virtual reality; intelligent robots; brain-like computing; bio-informatics; quantum neural networks; intelligent decision making; data mining; granular computing and computing with words. The chapters are written by internationally recognised authors. Contents: Part I: Adaptive, evolving, learning systems N. Kasabov, ECOS - Evolving Connectionist Systems - a new/old paradigm for on-line learning and knowledge engineering. S.-B. Cho, Artificial life technology for adaptive information processing. R.J. Duro, J. Santos, J.A. Becerra: Evolving ANN controllers for smart mobile robots. G. Coghill, A simulation environment for the manipulation of naturally variable objects. Y. Maeda: Behavior-decision fuzzy algorithm for autonomous mobile robot. Part II: Intelligent human computer interaction and scientific visualisation: J. Taylor, N. Kasabov: Modelling the emergence of speech and language through evolving connectionist systems. H.J. van den Herik, E.O. Postma: Discovering the visual signature of painters. A. Nijholt, J. Hulstijn: Multimodal interactions with agents in virtual worlds. M. Paulin, R. Berquist: Virtual BioBots.
Part III: New connectionist computational paradigms: Brain-like computing and quantum neural networks: J.G. Taylor, Future directions for neural networks and intelligent systems from the brain imaging research. A.A. Ezhov, D. Ventura, Quantum neural networks. N.G. Stocks, R. Mannella, Suprathreshold stochastic resonance in a neuronal network model: a possible strategy for sensory coding. Part IV: Bioinformatics: C. Brown, M. Schreiber, B. Chapman, G. Jacobs, Information science and bioinformatics. V.B. Bajic, I.V. Bajic, Neural network system for promoter recognition. Part V: Knowledge representation, knowledge processing, knowledge discovery, and some applications: W. Pedrycz, Granular computing: An introduction. J. Kacprzyk, A new paradigm shift from computation on numbers to computation on words on an example of linguistic database summarization. N. Kasabov, L. Erzegovesi, M. Fedrizzi, A. Beber, D. Deng: Hybrid intelligent decision support systems and applications for risk analysis and discovery of evolving economic clusters in Europe. Y.Y. Yun: Intelligent resource management through the constrained resource planning model. A. Ramer, M. do Carmo Nicoletti, S.Y. Sung: Evaluative studies of fuzzy knowledge discovery through NF systems. Series: Studies in Fuzziness and Soft Computing, Vol. 45. More information about the book can be obtained from the Web site: http://www.springer.de/cgi-bin/search_book.pl?isbn=3-7908-1276-5 with best regards, Nik Kasabov ------------------------------------------------------------------------ Prof. Dr.
Nikola (Nik) Kasabov Director Knowledge Engineering Laboratory Department of Information Science University of Otago, P.O. Box 56, Dunedin, New Zealand phone: +64 3 4798319; fax: +64 3 4798311 email: nkasabov at otago.ac.nz http://divcom.otago.ac.nz/infosci/Staff/NikolaK.htm ------------------------------------------------------------------------ From char-ci0 at wpmail.paisley.ac.uk Mon Jun 19 10:59:20 2000 From: char-ci0 at wpmail.paisley.ac.uk (Darryl Charles) Date: Mon, 19 Jun 2000 15:59:20 +0100 Subject: Ph.D Studentship in Image Processing Message-ID: Ph.D. Studentship (From September 2000) Image processing with hybrid supervised and unsupervised neural networks. Ph.D. studentship available in the Applied Computational Intelligence Research Unit (ACIRU) at the University of Paisley, Scotland. An opportunity exists for a university-sponsored Ph.D. studentship to research and develop novel hybrid supervised/unsupervised artificial neural networks for application to visual data. A particular emphasis in this Ph.D. will be on data related to remote sensing, but there will be opportunity to develop generic models that may be adapted and applied to other image processing tasks. Research will attempt to uncover methods that improve classification and feature detection with the developed hybrid models on visual data. Applicants should normally hold a good degree in a related subject area; specific work experience and/or publications regarding neural networks and image processing will strengthen an application. Programming experience with MATLAB and/or C++ will be an advantage. The successful candidate will be supervised jointly by Dr Darryl Charles and Dr Bogdan Gabrys and will play an active part in a vibrant research group. A three-year grant is available at the normal rate, and some part-time teaching should be available to supplement income. Applicants should send a CV and the names and addresses of two referees by e-mail or standard mail to the following address.
Dr Darryl Charles CIS dept. University of Paisley High Street Paisley PA1 2BE e-mail: Darryl.Charles at paisley.ac.uk dept. home page: http://cis.paisley.ac.uk/ Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. -------------------------- From tsr at stat.washington.edu Mon Jun 19 18:37:56 2000 From: tsr at stat.washington.edu (Thomas Richardson) Date: Mon, 19 Jun 2000 15:37:56 -0700 (PDT) Subject: AI & STATISTICS 2001 - Second Call for Papers: Deadline 10 July Message-ID: (apologies for multiple posting) ==================================================================== Second call for papers: AI and STATISTICS 2001 Eighth International Workshop on Artificial Intelligence and Statistics January 3-6, 2001, Hyatt Hotel, Key West, Florida http://www.ai.mit.edu/conferences/aistats2001/ SUBMISSION DEADLINE: midnight July 10, 2000 (PST) This is the eighth in a series of workshops which have brought together researchers in Artificial Intelligence (AI) and in Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. Papers on all aspects of the interface between AI & Statistics are encouraged.
To encourage interaction and a broad exchange of ideas, the presentations will be limited to about 20 discussion papers in single-session meetings over three days (Jan. 4-6). Focused poster sessions will provide the means for presenting and discussing the remaining research papers. Poster papers will be treated equally with presented papers in publications. Attendance at the workshop will not be limited. The three days of research presentations will be preceded by a day of tutorials (Jan. 3). These are intended to expose researchers in each field to the methodology and techniques used in other related areas. The Eighth workshop especially encourages submissions related to the following workshop themes in the interface between information retrieval and statistics:
Statistical natural language processing
Game theory
Missing information; unlabeled examples
Error correcting codes
In addition, papers on all aspects of the interface between AI & Statistics are strongly encouraged, including but not limited to
Automated data analysis
Cluster analysis and unsupervised learning
Statistical advisory systems, experimental design
Integrated man-machine modeling methods
Interpretability in modelling
Knowledge discovery in databases
Metadata and the design of statistical data bases
Model uncertainty, multiple models
Multivariate graphical models, belief networks, causal modeling
Online analytic processing in statistics
Pattern recognition
Prediction: classification and regression
Probabilistic neural networks
Probability and search
Statistical strategy
Vision, robotics, natural language processing, speech recognition
Visualization of very large datasets
Submission Requirements: Electronic submission of abstracts is required. The abstracts (up to 4 pages in length) should be submitted through the AI and Statistics Conference Management page supported by Microsoft Research.
More specific instructions are available at http://cmt.research.microsoft.com/AISTATS2001/ In special circumstances other arrangements can be made to facilitate submission. For more information about possible arrangements, please contact the conference chairs. Submissions will be considered if they are received by midnight July 10, 2000 (PST). Please indicate the theme and/or the topic(s) your abstract addresses. Receipt of all submissions will be confirmed via electronic mail. Acceptance notices will be emailed by September 1, 2000. Preliminary papers (up to 12 pages, double column) must be received by November 1, 2000. These preliminary papers will be copied and distributed at the workshop.
Program Chairs:
Thomas Richardson, University of Washington, tsr at stat.washington.edu
Tommi Jaakkola, MIT, tommi at ai.mit.edu
Program Committee:
Russell Almond, Educational Testing Service, Princeton
Hagai Attias, Microsoft Research, Redmond
Yoshua Bengio, University of Montreal
Max Chickering, Microsoft Research, Redmond
Greg Cooper, University of Pittsburgh
Robert Cowell, City University, London
Phil Dawid, University College, London
Vanessa Didelez, University of Munich
David Dowe, Monash University
Brendan Frey, University of Waterloo
Nir Friedman, Hebrew University, Jerusalem
Dan Geiger, Technion
Edward George, University of Texas
Paolo Giudici, University of Pavia
Zoubin Ghahramani, University College, London
Clark Glymour, Carnegie-Mellon University
Moises Goldszmidt, Peakstone Corporation
David Heckerman, Microsoft Research, Redmond
Thomas Hofmann, Brown University
Reimar Hofmann, Siemens
Michael Jordan, University of California, Berkeley
David Madigan, Soliloquy
Chris Meek, Microsoft Research, Redmond
Marina Meila, Carnegie-Mellon University
Kevin Murphy, University of California, Berkeley
Mahesan Niranjan, University of Sheffield
John Platt, Microsoft Research, Redmond
Greg Ridgeway, University of Washington
Lawrence Saul, AT&T Research
Prakash Shenoy, University of Kansas
Dale Schuurmans, University of Waterloo
Padhraic Smyth, University of California, Irvine
David Spiegelhalter, University of Cambridge
Peter Spirtes, Carnegie-Mellon University
Milan Studeny, Academy of Sciences, Czech Republic
Michael Tipping, Microsoft Research, Cambridge
Henry Tirri, University of Helsinki
Volker Tresp, Siemens
Chris Watkins, Royal Holloway and Bedford New College
Nanny Wermuth, University of Mainz
Joe Whittaker, Lancaster University
Chris Williams, University of Edinburgh
From gluck at pavlov.rutgers.edu Tue Jun 20 10:19:44 2000 From: gluck at pavlov.rutgers.edu (Mark A. Gluck) Date: Tue, 20 Jun 2000 10:19:44 -0400 Subject: Postdoctoral Position in Computational Neurosci. at Rutgers-Newark Message-ID: Computational Neuroscience of Learning and Memory. Postdoctoral position open for January, 2001, start. Work on computational models of neural substrates of associative learning in animals and humans, with reference to the hippocampal, basal forebrain, basal ganglia, amygdala, and frontal brain systems. Applicants must have a strong prior record (PhD thesis and/or papers) in the theory, implementation, and analysis of neural network algorithms and models. Some familiarity with relevant biological and behavioral systems helpful. Modeling will be integrated into ongoing experimental studies of the psychobiology of animal conditioning, and the neuropsychology of human memory disorders. Additional training in these areas provided. Located at Center for Molecular & Behavioral Neuroscience, Rutgers, Newark, NJ (15 min by train from downtown New York City). Send -- by email only -- cover letter, CV, and names and emails of three references to: Mark Gluck and Catherine Myers, c/o gluck at pavlov.rutgers.edu. ______________________________________________________ Dr. Mark A. Gluck, Associate Professor Center for Molecular and Behavioral Neuroscience Phone: (973) 353-1080 x3221 Rutgers University Fax: (973) 353-1272 197 University Ave.
Newark, New Jersey 07102 Email: gluck at pavlov.rutgers.edu WWW Homepages: Research Lab: http://www.gluck.edu Rutgers Memory Disorders Project: http://www.memory.rutgers.edu _______________________________________________________ From ckiw at dai.ed.ac.uk Tue Jun 20 14:20:06 2000 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Tue, 20 Jun 2000 19:20:06 +0100 (BST) Subject: Faculty position at University of Edinburgh, UK Message-ID: Apologies if you receive this message multiple times. I am keen to encourage people from the machine learning/probabilistic modelling fields who work on life sciences problems to apply. I shall be attending the ICML/UAI conferences at Stanford and will be happy to discuss this further with interested parties. Chris Williams Dr Chris Williams ckiw at dai.ed.ac.uk Institute for Adaptive and Neural Computation Division of Informatics, University of Edinburgh 5 Forrest Hill, Edinburgh EH1 2QL, Scotland, UK fax: +44 131 650 6899 tel: (direct) +44 131 651 1212 (department switchboard) +44 131 650 3090 http://anc.ed.ac.uk/ -------------------------------------------------------------------- INFORMATICS AND ITS APPLICATION TO THE LIFE SCIENCES The Division of Informatics at the University of Edinburgh (http://www.informatics.ed.ac.uk) is seeking to make an appointment in the area of Informatics and its applications to the Life Sciences. This is aimed not only at extending the applications of Informatics to biological problems but also at stimulating fundamental research in Informatics. The Division has inherited a very strong tradition of research in computer systems, theoretical computer science, cognitive science, artificial intelligence, robotics and neural networks. The successful candidate will add to our existing strengths in research and teaching, encourage the integration of his or her own research with that of others and contribute to the development of Informatics. 
The successful candidate can anticipate profitable involvement with local researchers in Biology, Medicine and Veterinary Medicine, including the new Edinburgh Genomic Microarray Facility (http://www.gmf-microarray.ed.ac.uk/). It is anticipated that the successful candidate will take a leadership role in the rapidly expanding bioinformatics enterprise at the University. The appointment will be on the Lecturer scale (17,238 - 30,065 pounds) or the Senior Lecturer/Reader scale (31,356 - 35,670 pounds). (Salary scales under review.) Further particulars can be found at http://www.informatics.ed.ac.uk/events/vacancies/life_sciences_fp.html and application packs can be obtained from the PERSONNEL DEPARTMENT, The University of Edinburgh, 9-16 Chambers Street, Edinburgh EH1 1HT, UK Closing date: 28 July 2000 Please quote reference number 306395. Informal questions and requests for information can be sent to d.willshaw at cns.ed.ac.uk bonnie.webber at ed.ac.uk From wsenn at cns.unibe.ch Wed Jun 21 09:40:44 2000 From: wsenn at cns.unibe.ch (Walter Senn) Date: Wed, 21 Jun 2000 15:40:44 +0200 Subject: Similar IF neurons synchronize Message-ID: <3950C5DC.CAC8E678@cns.unibe.ch> The following paper has been accepted at the SIAM Journal on Applied Mathematics: "Similar non-leaky integrate-and-fire neurons with instantaneous couplings always synchronize" The paper reconsiders the dynamics of pulse-coupled integrate-and-fire (IF) neurons analyzed by Mirollo and Strogatz (SIAM J. Appl. Math., 1990). Lifting their restriction to identical oscillators, we study the case of different intrinsic frequencies and different thresholds of the neurons, as well as different but positive couplings. For non-leaky neurons, we prove that generically the dynamics becomes fully synchronous for any initial conditions if the intrinsic frequencies, the thresholds and the couplings are not too different.
For the case of non-leaky IF neurons, this confirms Peskin's conjecture (1975), according to which nearly identical pulse-coupled oscillators in general synchronize. In particular, leakiness, or more generally, concave evolution functions as imposed by Mirollo and Strogatz, are not necessary to ensure global synchronization. Our result differs from the findings of Mirollo and Strogatz in that, for non-leaky IF neurons, almost all networks with weak homogeneity converge for all initial conditions to synchronous firing, while in their work with leaky and identical IF neurons, the dynamics synchronizes for all sets of parameter values, but for each set only for almost all initial conditions. Interestingly, for non-leaky neurons the case of exactly identical oscillators and weights is exceptional in that it does not assure full synchronization for all initial conditions. Walter Senn and Robert Urbanczik The paper can be downloaded from http://www.cns.unibe.ch/~wsenn/#pub . ------------------------------------------------------------- Walter Senn Phone office: +41 31 631 87 21 Physiological Institute Phone home: +41 31 332 38 31 University of Bern Fax: +41 31 631 46 11 Buehlplatz 5 email: wsenn at cns.unibe.ch CH-3012 Bern SWITZERLAND http://www.cns.unibe.ch/ ------------------------------------------------------------- From duch at phys.uni.torun.pl Wed Jun 21 10:01:27 2000 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Wed, 21 Jun 2000 16:01:27 +0200 Subject: book reviewers wanted Message-ID: <000b01bfdb89$317c5ea0$04054b9e@phys.uni.torun.pl> Dear all, I have a few good books for reviewing in Transactions on Neural Networks. Unfortunately most of our reviewers are so slow that I have to look for help; please let me know who would be willing to write some reviews, mention your credentials, and after some consulting I'll add you to the list of reviewers and send you a list of books for reviewing.
Sincerely, Włodzisław Duch Department of Computer Methods, Nicholas Copernicus University, Poland tel/fax: (+48-56) 622 1543, home 623 6685 http://www.phys.uni.torun.pl/~duch From rosaria at ICSI.Berkeley.EDU Wed Jun 21 14:50:34 2000 From: rosaria at ICSI.Berkeley.EDU (Rosaria Silipo) Date: Wed, 21 Jun 2000 11:50:34 -0700 (PDT) Subject: Summer School on Intelligent Data Analysis Message-ID: PRELIMINARY ANNOUNCEMENT SUMMER SCHOOL ON INTELLIGENT DATA ANALYSIS PALERMO, SEPTEMBER 18-22, 2000 Over the last decade or so, the size of machine-readable data sets has increased dramatically and the problem of "data explosion" has become apparent. In parallel with this, recent developments in computing have provided the basic infrastructure for fast access to online data. In particular, many advanced computational methods for extracting information from large quantities of heterogeneous data and for data representation are now beginning to mature. These developments have created a new range of problems and challenges for analysts, as well as new opportunities for intelligent systems in data analysis. All this has led to the emergence of the field of Intelligent Data Analysis (IDA), a combination of diverse disciplines including, in particular, Artificial Intelligence and Statistics. The School on Intelligent Data Analysis (IDA) will focus on the core techniques of Intelligent Data Analysis:
- Statistics,
- Bayesian Networks,
- Neural Networks,
- Time Series Analysis,
- Rule Induction,
- Fuzzy Logic,
- Evolutionary Computation.
All courses are organized so as to provide a broad description of the theoretical and practical aspects of each discipline. For this purpose, speakers from industry are also invited to show practical, already-implemented applications of Intelligent Data Analysis techniques. The target audience of the IDA Summer School is advanced undergraduate students, PhD students, postdoctoral students, and academic and industrial researchers and developers.
The Summer School will take place at the University of Palermo (Italy) from September 18th to September 22nd, 2000. More information - including the preliminary program and the list of speakers - will be available from early June on the IDA summer school's web page: http://www.cere.pa.cnr.it/IDAschool/ FINANCIAL SUPPORT. At the moment it is very likely that financial support from the European Commission will be available for young researchers (<35) to attend the school. This support will cover the registration fee and part of the travel and subsistence expenses. More details will be made available in the course of June. If you are interested in attending the school or would like to inquire about the possibility of financial support, please send e-mail to ida at cere.pa.cnr.it. From rfrench at ulg.ac.be Fri Jun 23 13:18:16 2000 From: rfrench at ulg.ac.be (Robert French) Date: Fri, 23 Jun 2000 19:18:16 +0200 Subject: Deadline extension for Abstracts for NCPW6: June 30 Message-ID: <4.1.20000623190321.00ad2e10@pop3.mailst.ulg.ac.be> SIXTH NEURAL COMPUTATION AND PSYCHOLOGY WORKSHOP (NCPW6) We have decided to extend the Final Deadline for Abstracts for NCPW6 to JUNE 30. The Workshop will be held at the University of Liège in eastern Belgium from September 16-18 of this year. The goal of this Workshop, the sixth in a series, is to bring together psychologists and neuropsychologists doing neural network modeling. Abstracts are to be approximately 200 words long. Notification of acceptance for a paper presentation will be sent by July 5. The finished paper must be ready by the time of the Workshop. See the NCPW6 Web page for all details: http://www.fapse.ulg.ac.be/ncpw6/ So far we have people presenting from Belgium, Brazil, Britain, France, Germany, Holland, Italy, and the U.S.
Some of the participants include: Domenico Parisi, Maartje Raijmakers, John Bullinaria, Bob French, Samantha Hartley, Martin LeVoi, Richard Shillcock, Jonathan Shapiro, Joe Levy, Richard Cooper, David Glasspool, Roland Baddeley, Noel Sharkey, Axel Cleeremans, Frank van Overwall, Barbara Tillmann, Gert Westermann, Jacques Sougné, and Claudia Schiffer, among others. (Well, actually, Claudia Schiffer won't be there. She doesn't do _connectionist_ modeling.) The Workshop will be held over two and a half days (Sept 16-18) and there will be about 25-30 talks, no parallel sessions and no posters. The atmosphere is designed to be congenial but rigorous. The welcoming reception, coffee breaks, lunches and a copy of the Proceedings (published by Springer-Verlag) will be included in the registration fee (100 euros). There will be an optional banquet (30 euros) on Sunday night, Sept. 17. Please register as soon as you are sure you are coming. You can register and pay electronically over a secure line. This year the theme will be "Evolution, Learning and Development." This is a broad topic and intentionally so. Although we aren't interested in, say, connectionist applications to submarine warfare, we will consider all papers that have something to do with the announced topic, even if rather tangentially. This is the first year that NCPW is being held on the Continent, a move explicitly designed to attract not only the usual contingent of British connectionists but also our colleagues from other European countries. The exact organization of the final program will depend on the submissions received. We will publish the program on the Web site as soon as it is determined. Hope to see you in September. Bob French, on behalf of the NCPW6 organizing committee CONTACT DETAILS For any problems or questions, please send e-mail to mailto:cogsci at ulg.ac.be ---------------------------------------------------------------------------- Robert M.
French, Ph.D Quantitative Psychology and Cognitive Science Psychology Department University of Liege 4000 Liege, Belgium Tel: (32.[0]4) 366.20.10 FAX: (32.[0]4) 366.28.59 email: rfrench at ulg.ac.be URL: http://www.fapse.ulg.ac.be/Lab/cogsci/rfrench.html ---------------------------------------------------------------------------- From marcusg at csee.uq.edu.au Sun Jun 25 23:46:22 2000 From: marcusg at csee.uq.edu.au (Marcus Gallagher) Date: Mon, 26 Jun 2000 13:46:22 +1000 Subject: PhD thesis available: MLP Error Surfaces. Message-ID: <3956D20E.1512DE44@csee.uq.edu.au> Dear Connectionists, I am happy to announce the availability of my PhD thesis for download in electronic format. Apologies if you receive multiple copies of this posting. URL: http://www.elec.uq.edu.au/~marcusg/thesis.html Regards, Marcus. ---------------------------------------------------------------- Multi-Layer Perceptron Error Surfaces: Visualization, Structure and Modelling Marcus R. Gallagher PhD Thesis, University of Queensland, Department of Computer Science and Electrical Engineering, 2000. Abstract The Multi-Layer Perceptron (MLP) is one of the most widely applied and researched Artificial Neural Network models. MLP networks are normally applied to supervised learning tasks, which involve iterative training methods to adjust the connection weights within the network. This is commonly formulated as a multivariate non-linear optimization problem over a very high-dimensional space of possible weight configurations. Analogous to the field of mathematical optimization, training an MLP is often described as the search of an error surface for a weight vector which gives the smallest possible error value. Although this presents a useful notion of the training process, there are many problems associated with using the error surface to understand the behaviour of learning algorithms and the properties of MLP mappings themselves.
Because of the high-dimensionality of the system, many existing methods of analysis are not well-suited to this problem. Visualizing and describing the error surface are also nontrivial and problematic. These problems are specific to complex systems such as neural networks, which contain large numbers of adjustable parameters, and the investigation of such systems in this way is largely a developing area of research. In this thesis, the concept of the error surface is explored using three related methods. Firstly, Principal Component Analysis (PCA) is proposed as a method for visualizing the learning trajectory followed by an algorithm on the error surface. It is found that PCA provides an effective method for performing such a visualization, as well as providing an indication of the significance of individual weights to the training process. Secondly, sampling methods are used to explore the error surface and to measure certain properties of the error surface, providing the necessary data for an intuitive description of the error surface. A number of practical MLP error surfaces are found to contain a high degree of ultrametric structure, in common with other known configuration spaces of complex systems. Thirdly, a class of global optimization algorithms is also developed, which is focused on the construction and evolution of a model of the error surface (or search space) as an integral part of the optimization process. The relationships between this algorithm class, the Population-Based Incremental Learning algorithm, evolutionary algorithms and cooperative search are discussed. The work provides important practical techniques for exploration of the error surfaces of MLP networks. These techniques can be used to examine the dynamics of different training algorithms, the complexity of MLP mappings and an intuitive description of the nature of the error surface. The configuration spaces of other complex systems are also amenable to many of these techniques. 
Finally, the algorithmic framework provides a powerful paradigm for visualization of the optimization process and the development of parallel coupled optimization algorithms which apply knowledge of the error surface to solving the optimization problem. Keywords: error surface, neural networks, multi-layer perceptron, global optimization, supervised learning, scientific visualization, ultrametricity, configuration space analysis, search space analysis, evolutionary algorithms, probabilistic modelling, probability density estimation, principal component analysis. -- marcusg at csee.uq.edu.au http://www.elec.uq.edu.au/~marcusg/ From vogdrup at daimi.au.dk Mon Jun 26 09:22:44 2000 From: vogdrup at daimi.au.dk (Jakob Vogdrup Hansen) Date: Mon, 26 Jun 2000 15:22:44 +0200 Subject: PhD thesis available: Combining Predictors ... Message-ID: <200006261322.PAA03273@ppp.brics.dk> Dear Connectionists, I am happy to announce the availability of my PhD thesis for download in postscript format. URL: http://www.daimi.au.dk/~vogdrup/diss.ps Comments are welcome. regards, Jakob Title: Combining Predictors. Meta Machine Learning Methods and Bias/Variance & Ambiguity Decompositions Abstract: The most important theoretical tool in connection with machine learning is the bias/variance decomposition of error functions. Together with Tom Heskes, I have found the family of error functions with a natural bias/variance decomposition that has target-independent variance. It is shown that no other group of error functions can be decomposed in the same way. An open problem in the machine learning community is thereby solved. The error functions are derived from the deviance measure on distributions in the one-parameter exponential family. It is therefore called the deviance error family. A bias/variance decomposition can also be viewed as an ambiguity decomposition for an ensemble method.
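[Editor's note: as a concrete anchor for these notions, the familiar squared-error special case can be written out; this is the standard textbook decomposition and the Krogh-Vedelsby ambiguity identity, not the thesis's general deviance-family result.]

```latex
% Bias/variance decomposition of squared error for a predictor f of
% target t (expectation over training sets). Note the variance term
% does not involve the target:
\mathbb{E}\!\left[(f - t)^2\right]
  = \underbrace{\left(\mathbb{E}[f] - t\right)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[(f - \mathbb{E}[f])^2\right]}_{\text{variance}}

% The same identity, read for a weighted ensemble F = \sum_i w_i f_i
% with \sum_i w_i = 1, is the ambiguity decomposition: ensemble error
% equals mean member error minus the (target-independent) ambiguity.
(F - t)^2 = \sum_i w_i (f_i - t)^2 - \sum_i w_i (f_i - F)^2
```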
The family of error functions with a natural bias/variance decomposition that has target-independent variance can therefore be of use in connection with ensemble methods. The logarithmic opinion pool ensemble method has been developed together with Anders Krogh. It is based on the logarithmic opinion pool ambiguity decomposition using the Kullback-Leibler error function. It has been extended to the cross-validation logarithmic opinion pool ensemble method. The advantage of the cross-validation logarithmic opinion pool ensemble method is that it can use unlabeled data to estimate the generalization error, while it still uses the entire labeled example set for training. The cross-validation logarithmic opinion pool ensemble method is easily reformulated for another error function, as long as the error function has an ambiguity decomposition with target-independent ambiguity. It is therefore possible to use the cross-validation ensemble method on all error functions in the deviance error family. -- Jakob V. Hansen Tlf: 86 750618 Rydevnget 87, 1. th. Kontor: B2.15 Lokal: (8942)3355 8210 Aarhus V E-mail: Vogdrup at daimi.au.dk From P.J.Lisboa at livjm.ac.uk Mon Jun 26 05:51:35 2000 From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo) Date: Mon, 26 Jun 2000 10:51:35 +0100 Subject: Computational Neuroscience: Modelling single trial EEG data for source separation and localisation Message-ID: PhD Studentship at JMU, Liverpool A position is available for the analysis of single-trial EEG for source separation and localisation, funded as a 3-year PhD studentship (approx. £6,800 p.a. plus fees) in a project involving industrial collaboration. The project will involve the use of Hidden Markov modelling, Independent Component Analysis, and purpose-built neural networks, as well as advanced statistical methods, all of which are in demand in industrial positions world-wide.
Applicants should have a good first degree or a Masters degree in Mathematics, Statistics or Physics, preferably with familiarity with neural networks, and with an interest in cognitive neuroscience. Familiarity with programming in Matlab or C++ is essential, as are good written and oral communication skills in English. The successful candidate will be expected to contribute towards a team effort, but must also be self-motivated and able to work individually. If you would like to obtain further information, please contact Professor Paulo Lisboa at p.j.lisboa at livjm.ac.uk The deadline for expressions of interest is Friday, 7th July 2000. From sok at cs.york.ac.uk Mon Jun 26 12:17:24 2000 From: sok at cs.york.ac.uk (Simon E M O'Keefe) Date: Mon, 26 Jun 2000 17:17:24 +0100 Subject: Research Post in Parallel Implementation of Neural Networks, University of York, UK Message-ID: <39578214.AFFCE5E8@cs.york.ac.uk> University of York Department of Computer Science Advanced Computer Architectures Group Research Post in Parallel Implementation of Neural Networks (Ref: web/6051) The above post is available immediately to work on an EPSRC-funded project investigating the parallel implementation of binary neural networks. The work will centre on parallelisation and load-balancing of neural network software on a special-purpose parallel associative neural network machine (Cortex-1) under construction in the department. You will be expected to be proficient in C++ and programming at the systems level. Knowledge of parallelisation, neural networks, performance evaluation and MPI would be an advantage, but is not essential. The post is available for a period of up to 8 months and the salary will be in the range 16,286 - 20,811 pounds per annum, depending on experience. Further details of the project can be found at http://www.cs.york.ac.uk/arch/nn/aura.html and details of the department can be found at http://www.cs.york.ac.uk/arch/neural/ Informal enquiries can be made to Prof.
Jim Austin at austin at cs.york.ac.uk or on +44/0 1904 432734. The closing date for applications is 25 July 2000. -- Jim Austin, Professor of Neural Computation Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch From avrama at gmu.edu Tue Jun 27 09:16:32 2000 From: avrama at gmu.edu (avrama) Date: Tue, 27 Jun 2000 09:16:32 -0400 (EDT) Subject: Post-doctoral position available Message-ID: Please post and circulate as you see fit. Many thanks! NEUROSCIENCE POST-DOCTORAL POSITION AVAILABLE George Mason University A post-doctoral position is available beginning in September to work on a project funded by the National Science Foundation. The goal of the project is to completely characterize the differences between Hermissenda type A and type B photoreceptors by measuring the light-induced currents and developing computational models. All highly motivated candidates with a recent PhD (or expecting one in the year 2000) in physiology, neuroscience, or a related discipline are encouraged to apply. Electrophysiology skills are essential, as well as the ability and willingness to learn new techniques and concepts. Programming skills and/or experience with modeling packages are desirable but not necessary. The position is ideal for an experimentalist who would like to learn mathematical and computational neuroscience techniques. The post-doc will join the multidisciplinary Invertebrate Neurobiology Laboratory at the Krasnow Institute for Advanced Study, and the new School of Computational Sciences of George Mason University, both of which are located in Fairfax, VA (less than 20 miles west of Washington DC). Members of the INL are interested in the biophysical and biochemical mechanisms of long term memory storage. 
In particular, we seek to understand the cellular events underlying the requirement for temporal proximity of stimuli to be associated, and the neural circuits involved in the behavioral expression of memory. The PI has developed software for modeling the biochemical reactions of second messenger dynamics, and the effects of second messengers on channel properties. Please refer to the website for further details: http://www.krasnow.gmu.edu/avrama/invertlab.html The post-doc will be hired as an assistant research professor with a salary based on the NIH postdoctoral scale (with VA state employee benefits), and will have full access to library, laboratory and computing facilities within both the Krasnow Institute and George Mason University. Application review will begin in August and continue until a suitable candidate is found. Send CV, (p)reprints, a brief description of your motivation, and names, email addresses and phone/fax numbers of three references to: Avrama Blackwell, V.M.D., Ph.D. Krasnow Institute for Advanced Study, MS 2A1 George Mason University Fairfax, VA 22030 Ph. (703)993-4381 Fax (703)993-4325 Non-resident aliens are welcome to apply. Women and minorities are encouraged to apply. George Mason University is an equal opportunity employer.
From brenner at informatik.uni-freiburg.de Tue Jun 27 11:57:00 2000 From: brenner at informatik.uni-freiburg.de (Michael Brenner) Date: Tue, 27 Jun 2000 17:57:00 +0200 Subject: German Autumn School on Cognition, Call for Participation Message-ID: <3958CECC.AEE8816C@informatik.uni-freiburg.de> --------------------------------------------------------------------------- We apologize in advance if you have received this announcement before --------------------------------------------------------------------------- CALL FOR PARTICIPATION German Autumn School on Cognition 2000 September 10-15, 2000 Freiburg, Germany http://www.fr.vgk.de/herbstschule_2000/ The autumn school offers courses and lectures covering different areas of cognitive science and learning technologies. It is intended for students and scientists from the fields of psychology, computer science, linguistics, computational linguistics, cognitive and instructional science. Lecturers: Shaaron Ainsworth, Joost Breuker, Cristiano Castelfranchi, Richard Cooper, Hector Geffner, Jim Hollan, Dietmar Janetzko, Timothy Koschmann, Marcia Linn, Deborah McGuinness, Thomas Metzinger, Bernhard Nebel, Josef Nerb, Werner Nutt, Uwe Oestermeier, Rolf Ploetzner, Lloyd Rieber, Keith Stenning, Gerhard Strube, Peter Yule Registration is open to all, and is possible electronically through the Autumn School web page (see above for the URL). Registration fees are DM 60,- for early registration (before 1st of July, 2000) and DM 80,- thereafter (DM 80 is approximately 39 US$, as of June 25, 2000). Registration includes admission to all sessions and plenary addresses at the Autumn School 2000 in Freiburg. For additional information on lecture and tutorial topics, schedule, accommodation and travel, please refer to our web pages. Please forward this message to interested colleagues and students.
We look forward to welcoming you in Freiburg, Michael Brenner, Katja Lay, Susanne Thalemann From aapo at james.hut.fi Fri Jun 30 05:27:53 2000 From: aapo at james.hut.fi (Aapo Hyvarinen) Date: Fri, 30 Jun 2000 12:27:53 +0300 (EEST) Subject: New papers on ICA Message-ID: Dear All, The following papers on ICA and related topics are now available on my home page: http://www.cis.hut.fi/aapo/pub.html I would also like to mention that I'm going to give a tutorial on ICA at IJCNN'00 in Como, Italy. ------------------------------------------------------------------------ A. Hyvarinen. Complexity Pursuit: Separating interesting components from time-series. Shorter version appeared in Proc. Int. Workshop on Independent Component Analysis and Blind Signal Separation (ICA2000), Helsinki, Finland, 2000. http://www.cis.hut.fi/aapo/ps/gz/complexity.ps.gz Abstract: A generalization of projection pursuit for time series, i.e., signals with time structure, is introduced. The goal is to find projections of time series that have interesting structure. We define interestingness using criteria related to Kolmogorov complexity or coding length: interesting signals are those that can be coded with a short code length. We derive a simple approximation of coding length that takes into account both the nongaussianity and the autocorrelations of the time series. We also derive a simple algorithm for its approximate optimization. The resulting method is closely related to blind separation of nongaussian, time-dependent source signals. ------------------------------------------------------------------------ P.O. Hoyer and A. Hyvarinen. Independent Component Analysis Applied to Feature Extraction from Colour and Stereo Images. To appear in Network.
http://www.cis.hut.fi/aapo/ps/gz/Network00.ps.gz Abstract: Previous work has shown that independent component analysis (ICA) applied to feature extraction from natural image data yields features resembling Gabor functions and simple-cell receptive fields. This article considers the effects of including chromatic and stereo information. The inclusion of colour leads to features divided into separate red/green, blue/yellow, and bright/dark channels. Stereo image data, on the other hand, leads to binocular receptive fields which are tuned to various disparities. The similarities between these results and observed properties of simple cells in primary visual cortex provide further evidence for the hypothesis that visual cortical neurons perform some type of redundancy reduction, which was one of the original motivations for ICA. In addition, ICA provides a principled method for feature extraction from colour and stereo images; such features could be used in image processing operations such as denoising and compression, as well as in pattern recognition. ------------------------------------------------------------------------- A. Hyvarinen and R. Karthikesh. Sparse priors on the mixing matrix in independent component analysis. Proc. ICA2000, Helsinki, Finland. http://www.cis.hut.fi/aapo/ps/gz/ICA00_sp.ps.gz Abstract: In independent component analysis, prior information on the distributions of the independent components is often used; some weak information is in fact necessary for successful estimation. In contrast, prior information on the mixing matrix is usually not used. This is because it is considered that the estimation should be completely blind as to the form of the mixing matrix. Nevertheless, it could be possible to find forms of prior information that are sufficiently general to be useful in a wide range of applications.
In this paper, we argue that prior information on the sparsity of the mixing matrix could be a constraint general enough to merit attention. Moreover, we show that the computational implementation of such sparsifying priors on the mixing matrix is very simple since in many cases they can be expressed as conjugate priors. The property of being conjugate priors means that essentially the same algorithm can be used as in ordinary ICA. Best Regards, Aapo ---------------------------------------------------- Aapo Hyvarinen Neural Networks Research Centre Helsinki University of Technology P.O.Box 5400, FIN-02015 HUT, Finland Tel: +358-9-4513278, Fax: +358-9-4513277 Email: Aapo.Hyvarinen at hut.fi Home page: http://www.cis.hut.fi/~aapo/ ---------------------------------------------------- From amari at brain.riken.go.jp Thu Jun 1 04:54:58 2000 From: amari at brain.riken.go.jp (Shun-ichi Amari) Date: Thu, 1 Jun 2000 17:54:58 +0900 Subject: No subject Message-ID: Dear connectionists: It is my pleasure to announce the following symposium, focussing on 1) graphical methods and statistics, 2) combining learners, 3) VC dimension and support vector machines, and 4) information geometry and statistical physical methods. 
Shun-ichi Amari Vice Director, RIKEN Brain Science Institute Laboratory for Mathematical Neuroscience Research Group on Brain-Style Information Systems tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687 amari at brain.riken.go.jp http://www.bsis.brain.riken.go.jp/ ********************************************** Bernoulli-RIKEN BSI 2000 Symposium on Neural Networks and Learning Dates and Venues: Symposium: October 25-27, 2000 Ohkouchi Hall, RIKEN (The Institute of Physical and Chemical Research), Japan Satellite Workshop: October 28, 2000 The Institute of Statistical Mathematics, Japan Aim: In order to celebrate Mathematical Year 2000, The Bernoulli Society is organizing a number of Symposia in rapidly developing research areas in which probability theory and statistics can play important roles. Brain science in the broad sense will become even more important in the 21st century. Information processing in the brain is so flexible, and its learning ability so strong, that it is a genuine challenge for information science to elucidate its mechanisms. It is an equally large challenge to construct brain-style information processing systems. The present Symposium focuses on the learning ability of real and artificial neural networks and related systems from theoretical and practical points of view. Probability theory and statistics will play fundamental roles in elucidating these systems, and the study of these systems will in turn fortify stochastic and statistical methods. Theories of neural learning and pattern recognition have a long history, and many new ideas are currently emerging. They also have practical applicability to real-world problems. Now is a good time to review these new ideas and methods and to discuss future directions of development. We will invite world-class researchers in these fields to discuss the state of the art of neural networks and learning as well as future directions of this important area. Participants are by invitation only.
We are expecting 50-80 participants from all over the world. After the symposium, we will organize a more informal one-day workshop: "Towards new unification of statistics and neural networks learning". Detailed information on timetables and abstracts can be obtained at http://www.bsis.brain.riken.go.jp/Bernoulli Those interested in joining the Symposium and Workshop may request an invitation through the above website after June 15, when it will be ready. If you have any questions, contact the organizing committee at bernoulli2000 at bsis.brain.riken.go.jp ******************* Sponsors: The Bernoulli Society for Mathematical Statistics and Probability RIKEN Brain Science Institute The Institute of Statistical Mathematics Japanese Neural Networks Society In Cooperation with: Japanese Statistical Society Supported by: The Commemorative Association for the Japan World Exposition (1970) The Support Center for Advanced Telecommunications Research Technology (SCAT) Organizing Committee: Chair Shun-ichi Amari, RIKEN Brain Science Institute, Japan Leo Breiman, University of California, Berkeley, USA Shinto Eguchi, The Institute of Statistical Mathematics, Japan Michael Jordan, University of California, Berkeley, USA Noboru Murata, Waseda University, Japan Mike Titterington, University of Glasgow, UK Vladimir Vapnik, AT&T, USA A registration fee of 10,000 Japanese yen (roughly 100 US$), including the reception, is payable at the conference venue. A 50% student discount is available. ***************** Program: 1. Graphical Models and Statistical Methods: Steffen L. Lauritzen (Aalborg University) Graphical models for learning Thomas S. Richardson (University of Warwick) Ancestral graph Markov models: an alternative to models with latent or selection variables Lawrence Saul (AT&T Labs) Hidden variables and distributed representations in automatic speech recognition Martin Tanner (Northwestern University) Inference for and Applications of Hierarchical Mixtures-of-Experts 2.
Combining Learners Leo Breiman (University of California, Berkeley) Random Forests Jerome H. Friedman (Stanford University) Gradient boosting and multiple additive regression trees Peter Bartlett (Australian National University) Large Margin Classifiers and Risk Minimization Yoram Singer (The Hebrew University) Combining Learners: an Output Coding Perspective 3. Information Geometry and Statistical Physics Shinto Eguchi (The Institute of Statistical Mathematics) Information geometry of tubular neighbourhoods for a statistical model Shun-ichi Amari (RIKEN Brain Science Institute) Information geometry of neural networks Manfred Opper (Aston University) The TAP Mean Field approach for probabilistic models Magnus Rattray (University of Manchester) Modelling the learning dynamics of latent variable models 4. VC Dimension and SVM Vladimir Vapnik (AT&T Labs) Statistical learning theory and support vector machines Michael Kearns (AT&T Labs) Sparse Sampling Algorithms for Probabilistic Artificial Intelligence Gabor Lugosi (Pompeu Fabra University) Model selection, error estimation, and concentration Bernhard Schoelkopf (Microsoft Research Ltd.) SV Algorithms and Applications From t.c.pearce at leicester.ac.uk Thu Jun 1 07:00:38 2000 From: t.c.pearce at leicester.ac.uk (Tim Pearce) Date: Thu, 1 Jun 2000 12:00:38 +0100 Subject: Ph.D Studentship in Neuronal Modelling Message-ID: Ph.D. Studentship NEURONAL MODELLING IN FIELD PROGRAMMABLE GATE ARRAYS (FPGAs) (Ph.D Position, University of Leicester/ETH Switzerland UK/EU Nationals) An opportunity exists for an EPSRC Ph.D. studentship to develop novel neuronal models that may be implemented in both software and FPGAs. Biologically realistic models of spiking neurons are a relatively new research topic, and require significant computational power to simulate. This project investigates reduced complexity models of biological neurons that may be implemented digitally, and function in parallel, using standalone FPGA devices. 
Research will then focus on combining large numbers of these models on a single device for real-time olfactory (smell) sensing for use on mobile behaving robots. This portion of the project will be conducted in close collaboration with the Institute of Neuroinformatics, ETH Zürich, Switzerland, which the student will visit during the final year. While a biological background is not necessary, a good degree in a numerate discipline such as maths, engineering, physics, or computer science is required. The successful candidate will be expected to register for the degree of Ph.D. in the Control and Instrumentation Research Group at Leicester University Engineering Dept. Applicants should send a CV and names and addresses of two referees by regular mail to: Dr. Tim Pearce, Dept. of Engineering University of Leicester University Road LEICESTER LE1 7RH, U.K. Informal enquiries (phone or e-mail) are also welcome. Tel +44 (0)116 223 1290 Fax +44 (0)116 252 2619 e-mail: t.c.pearce at le.ac.uk -- T.C. Pearce, PhD URL: http://www.leicester.ac.uk/engineering/ Lecturer in Bioengineering E-mail: t.c.pearce at leicester.ac.uk Department of Engineering Tel: +44 (0)116 223 1290 University of Leicester Fax: +44 (0)116 252 2619 Leicester LE1 7RH Bioengineering, Transducers and United Kingdom Signal Processing Group From morten at compute.it.siu.edu Thu Jun 1 12:27:04 2000 From: morten at compute.it.siu.edu (Morten H. Christiansen) Date: Thu, 1 Jun 2000 11:27:04 -0500 (CDT) Subject: Two Graduate Openings in Brain and Cognitive Sciences Message-ID: Dear Colleague, Please bring the following information to the attention of potential graduate school applicants from your program with an interest in Brain and Cognitive Sciences. TWO GRADUATE OPENINGS IN BRAIN AND COGNITIVE SCIENCES IN THE DEPARTMENT OF PSYCHOLOGY AT SOUTHERN ILLINOIS UNIVERSITY, CARBONDALE. Dr. Matthew Schlessinger (UMass) and Dr.
Michael Young (UIowa) will be joining the faculty in the Brain and Cognitive Science Graduate Program, the Department of Psychology at Southern Illinois University. In this connection, the Brain and Cognitive Science program has two openings for graduate study (though the two openings are not tied to the incoming faculty). Each opening comes with a monthly stipend of approximately $1000.00 for at least nine months. The start date is August 14, 2000, and applications should be submitted a.s.a.p. The Ph.D. program in Brain and Cognitive Sciences is unique and exciting. The focus is on an interdisciplinary approach to understanding human behavior approached from a combination of developmental (infancy and childhood, adolescence and aging), neurobiological (neurophysiology, neuropsychology, genetics), behavioral (human and animal experimentation) and computational (neural networks, statistical analyses, intelligent software agents) perspectives. As an integral part of their training, students become active participants in ongoing faculty research programs in the Brain and Cognitive Sciences. Students will receive training in two or more different research methodologies, and are expected to develop a multidisciplinary approach to their own research. 
Current research by the Brain and Cognitive Sciences faculty includes perinatal risk factors in child development, neurophysiological and behavioral correlates of infant and child cognitive and language development, personality and social correlates of cognitive aging, child play and social behaviors, identity development across the life span, judgment and decision making, causal and category learning, neural network models of learning and sensorimotor cognition, neural network models of language acquisition and processing, agent-based computational modeling of the evolution and development of action and perception, artificial grammar learning, sentence processing, evolution of language and the brain, the pharmacological modulation of memory, effects of psychoactive drugs, reversible inactivation of discrete brain areas and memory, recovery of function from brain damage, electrophysiological models (e.g., long-term potentiation), the neurophysiology of memory, animal learning, and human learning and memory. For more information about the program and application procedures, please visit our web site at: http://www.siu.edu/~psycho/bcs Visit also the Department's web site at: http://www.siu.edu/~psycho Best regards, Morten Christiansen Coordinator of the Brain and Cognitive Sciences Program ---------------------------------------------------------------------- Morten H. Christiansen Assistant Professor Phone: +1 (618) 453-3547 Department of Psychology Fax: +1 (618) 453-3563 Southern Illinois University Email: morten at siu.edu Carbondale, IL 62901-6502 Office: Life Sciences II, Room 271A Personal Web Page: http://www.siu.edu/~psycho/faculty/mhc.html Lab Web Site: http://www.siu.edu/~morten/csl ---------------------------------------------------------------------- From ericwan at ece.ogi.edu Thu Jun 1 13:45:22 2000 From: ericwan at ece.ogi.edu (Eric Wan) Date: Thu, 01 Jun 2000 10:45:22 -0700 Subject: POSTDOCTORAL and PH.D. 
RESEARCH POSITIONS Message-ID: <3936A132.70AC4573@ece.ogi.edu> POSTDOCTORAL and PH.D. RESEARCH POSITIONS The Center for Spoken Language Understanding (CSLU) at the Oregon Graduate Institute of Science and Technology (OGI) is seeking applicants for one Postdoctoral Research Associate and one Ph.D. Student Fellowship to work with Professor Eric A. Wan (http://www.ece.ogi.edu/~ericwan/) on a number of projects relating to speech enhancement and machine learning. QUALIFICATIONS: The postdoctoral candidate should have a Ph.D. with a strong background in signal processing, speech technologies, and neural networks. The Ph.D. candidate should have a strong background in signal processing and neural networks, and some familiarity with speech technologies. A Master's degree in Electrical Engineering is preferred. Please send inquiries to ericwan at ece.ogi.edu. Include the following background information: - name and affiliation - a short paragraph describing qualifications and interests - a CV including a list of publications and prior work Eric A. Wan Associate Professor, OGI ****************************************************** OGI OGI is a young, but rapidly growing, private research institute located in the Portland area. OGI offers Masters and Ph.D. programs in Computer Science and Engineering, Applied Physics, Electrical Engineering, Biology, Chemistry, Materials Science and Engineering, and Environmental Science and Engineering. OGI is located near Portland, a thriving city in the heart of the lush natural beauty of the Pacific Northwest. It is only a 1-2 hour drive from year-round downhill skiing on dormant volcanoes, high desert, countless forest hiking trails, the Pacific Ocean, and numerous breweries and wineries. OGI is an equal-opportunity, affirmative action employer; women, minorities, and individuals with disabilities are encouraged to apply.
OGI has world-renowned research programs in the areas of speech systems (Center for Spoken Language Understanding) and machine learning (Center for Information Technologies). Center for Spoken Language Understanding http://cslu.cse.ogi.edu The Center for Spoken Language Understanding is a multidisciplinary academic organization that focuses on basic research in spoken language systems technologies, training of new investigators, and development of tools and resources for free distribution to the research and education community. Areas of specific interest include speech recognition, natural language understanding, text-to-speech synthesis, speech enhancement in noisy conditions, and modeling of human dialogue. A key activity is the ongoing development of the CSLU Toolkit, a comprehensive software platform for learning about, researching, and developing spoken dialog systems and new applications. Center for Information Technologies The Center for Information Technologies supports development of powerful, robust, and reliable information processing techniques by incorporating human strategies and constraints. Such techniques are critical building blocks of multimodal communication systems, decision support systems, and human-machine interfaces. The CIT approach is based on emulating relevant human information processing capabilities and extending them to a variety of complex tasks. The approach requires expertise in nonlinear and adaptive signal processing (e.g., neural networks), statistical computation, decision analysis, and modeling of human information processing. Correspondingly, CIT research areas include perceptual characterization of speech and images, prediction, robust signal processing, rapid adaptation to changing environments, nonlinear signal representation, integration of information from several sources, and integration of prior knowledge with adaptation.
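As a concrete, if simplified, illustration of the adaptive signal processing mentioned above, the following sketch implements a normalized LMS (NLMS) noise canceller, a classical building block for speech enhancement in noisy conditions. It is not CSLU or CIT code; the signals and parameters are invented for the demo:

```python
import numpy as np

def nlms_cancel(primary, reference, n_taps=8, mu=0.1, eps=1e-8):
    """Normalized LMS adaptive noise canceller: estimate the noise
    in `primary` from a correlated `reference` and subtract it."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # tap-delay line
        y = w @ x                                  # noise estimate
        e = primary[n] - y                         # cleaned sample
        w += mu * e * x / (x @ x + eps)            # NLMS update
        out[n] = e
    return out

# Synthetic demo: a sinusoidal "speech" signal plus filtered noise,
# with the raw noise available as a second, reference channel.
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 0.03 * np.arange(4000))
noise = rng.standard_normal(4000)
primary = clean + np.convolve(noise, [0.6, 0.3], mode="same")
cleaned = nlms_cancel(primary, noise)  # converges toward `clean`
```

Because the reference noise is uncorrelated with the speech, the filter learns only the noise path, so the subtraction removes noise without cancelling the speech itself.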
From adr at raphe.NSMA.Arizona.EDU Thu Jun 1 15:18:31 2000 From: adr at raphe.NSMA.Arizona.EDU (David Redish) Date: Thu, 01 Jun 2000 12:18:31 -0700 Subject: MClust: Spike-sorting toolbox Message-ID: <200006011937.MAA24927@cortex.NSMA.Arizona.EDU> Announcing the release of the MClust spike-sorting toolbox. MClust is a Matlab toolbox which enables a user to perform manual clustering on single-electrode, stereotrode, and tetrode recordings taken with the DataWave and Cheetah recording systems. It can be found at http://www.nsma.arizona.edu/adr/mclust/ The MClust toolbox is freeware, but you will need Matlab 5.2 or higher to run it. It has been tested under the Windows and Solaris families of operating systems, and ports to other operating systems are in the works. Further details (such as the copyright notice and disclaimer) are available from the website above. ----------------------------------------------------- A. David Redish adr at nsma.arizona.edu Post-doc http://www.cs.cmu.edu/~dredish Neural Systems, Memory and Aging, Univ of AZ, Tucson AZ ----------------------------------------------------- From bvr at stanford.edu Fri Jun 2 01:33:45 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Thu, 01 Jun 2000 22:33:45 -0700 Subject: NIPS*2000 WORKSHOP PROPOSALS - DEADLINE EXTENDED TO JUNE 9 Message-ID: <4.2.0.58.20000601223246.00da83c0@bvr.pobox.stanford.edu> NIPS*2000 WORKSHOP PROPOSALS - DEADLINE EXTENDED TO JUNE 9 ===================================== Neural Information Processing Systems Natural and Synthetic NIPS*2000 Post-Conference Workshops December 1 and 2, 2000 Breckenridge, Colorado ===================================== Following the regular program of the Neural Information Processing Systems 2000 conference, workshops on various current topics in neural information processing will be held on December 1 and 2, 2000, in Breckenridge, Colorado.
Proposals by qualified individuals interested in chairing one of these workshops are solicited. Example topics include: Active Learning, Architectural Issues, Attention, Audition, Bayesian Analysis, Bayesian Networks, Benchmarking, Brain Imaging, Computational Complexity, Computational Molecular Biology, Control, Genetic Algorithms, Graphical Models, Hippocampus and Memory, Hybrid Supervised/Unsupervised Learning Methods, Hybrid HMM/ANN Systems, Implementations, Independent Component Analysis, Mean-Field Methods, Markov Chain Monte-Carlo Methods, Music, Network Dynamics, Neural Coding, Neural Plasticity, On-Line Learning, Optimization, Recurrent Nets, Robot Learning, Rule Extraction, Self-Organization, Sensory Biophysics, Signal Processing, Spike Timing, Support Vectors, Speech, Time Series, Topological Maps, and Vision. The goal of the workshops is to provide an informal forum for researchers to discuss important issues of current interest. There will be six hours of workshop meetings per day, split into morning and afternoon sessions, with free time in between for ongoing individual exchange or outdoor activities. Controversial issues, open problems, and comparison of competing approaches are encouraged and preferred as workshop topics. Representation of alternative viewpoints and panel-style discussions are particularly encouraged. Descriptions of previous workshops may be found at http://www.cs.cmu.edu/Groups/NIPS/NIPS99/Workshops/ Select workshops may be invited to submit their workshop proceedings for publication as part of a new series of monographs for the post-NIPS workshops. Workshop organizers will have responsibilities including: ++ coordinating workshop participation and content, which includes arranging short informal presentations by experts, arranging for expert commentators to sit on a discussion panel, formulating a set of discussion topics, etc. 
++ moderating the discussion, and reporting its findings and conclusions to the group during evening plenary sessions ++ writing a brief summary and/or coordinating submitted material for post-conference electronic dissemination. ======================= Submission Instructions ======================= Interested parties should submit a short proposal for a workshop of interest via email by June 9, 2000. Proposals should include title, description of what the workshop is to address and accomplish, proposed workshop length (1 or 2 days), planned format (mini-conference, panel discussion, combinations of the above, etc), and proposed speakers. Names of potential invitees should be given where possible. Preference will be given to workshops that reserve a significant portion of time for open discussion or panel discussion, as opposed to pure "mini-conference" format. An example format is: ++ Tutorial lecture providing background and introducing terminology relevant to the topic. ++ Two short lectures introducing different approaches, alternating with discussions after each lecture. ++ Discussion or panel presentation. ++ Short talks or panels alternating with discussion and question/answer sessions. ++ General discussion and wrap-up. We suggest that organizers allocate at least 50% of the workshop schedule to questions, discussion, and breaks. Past experience suggests that workshops otherwise degrade into mini-conferences as talks begin to run over. The proposal should motivate why the topic is of interest or controversial, why it should be discussed, and who the targeted group of participants is. It also should include a brief resume of the prospective workshop chair with a list of publications to establish scholarship in the field. Submissions should include contact name, address, email address, phone and fax numbers. Proposals should be emailed to caruana at cs.cmu.edu. Proposals must be RECEIVED by June 9, 2000. 
If email is unavailable, mail to: NIPS Workshops, Rich Caruana, SCS CMU, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA. Questions may be addressed to either of the Workshop Co-Chairs: Rich Caruana (caruana at cs.cmu.edu) Virginia de Sa (desa at phy.ucsf.edu) PROPOSALS MUST BE RECEIVED BY June 9, 2000 From juergen at idsia.ch Fri Jun 2 12:51:17 2000 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Fri, 2 Jun 2000 18:51:17 +0200 Subject: job Message-ID: <200006021651.SAA20708@ruebe.idsia.ch> We are seeking an outstanding PhD candidate for an ongoing research project that combines machine learning (unsupervised coding, neural networks, reinforcement learning, evolutionary computation) and computational fluid dynamics. We tackle problems such as turbulent flow encoding and drag minimisation. Details in http://www.idsia.ch/~juergen/flow2000.html Interviews: IJCNN (24-27 July 2000) will take place in Como, Italy. Como is very close to the Swiss border, just 30 minutes away from IDSIA. In case you are participating in the conference: this would be a good time for a job interview. ___________________________________________________ Juergen Schmidhuber director IDSIA, Galleria 2, 6928 Manno (Lugano), Switzerland juergen at idsia.ch http://www.idsia.ch/~juergen From Pierre.Bessiere at imag.fr Fri Jun 2 06:14:30 2000 From: Pierre.Bessiere at imag.fr (Pierre Bessiere) Date: Fri, 2 Jun 2000 11:14:30 +0100 Subject: 3 papers about BAYESIAN ROBOTICS Message-ID: 3 papers about Bayesian Robotics are available online (comments welcome) : Bayesian Robots Programming Abstract: We propose a new method to program robots based on Bayesian inference and learning. The capacities of this programming method are demonstrated through a succession of increasingly complex experiments. Starting from the learning of simple reactive behaviors, we present instances of behavior combinations, sensor fusion, hierarchical behavior composition, situation recognition and temporal sequencing. 
This series of experiments comprises the steps in the incremental development of a complex robot program. The advantages and drawbacks of this approach are discussed along with these different experiments and summed up as a conclusion. These different robotics programs may be seen as an illustration of probabilistic programming applicable whenever one must deal with problems based on uncertain or incomplete knowledge. The scope of possible applications is obviously much broader than robotics. PDF: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Lebeltel2000.pdf PS: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Lebeltel2000.ps PS.GZ: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Lebeltel2000.ps.gz Reference: Lebeltel O., Bessière P., Diard J. & Mazer E. (2000); Bayesian Robots Programming; Les cahiers du Laboratoire Leibniz (Technical Report), n°1, May 2000; Grenoble, France The Design and Implementation of a Bayesian CAD Modeler for Robotic Applications Abstract: We present a Bayesian CAD modeler for robotic applications. We address the problem of taking into account the propagation of geometric uncertainties when solving inverse geometric problems. The proposed method may be seen as a generalization of constraint-based approaches in which we explicitly model geometric uncertainties. Using our methodology, a geometric constraint is expressed as a probability distribution on the system's parameters and the sensor measurements, instead of a simple equality or inequality. To solve geometric problems in this framework, we propose an original resolution method able to adapt to problem complexity. Using two examples, we show how to apply our approach by providing simulation results using our modeler.
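The idea in the abstract above of replacing a hard geometric constraint with a probability distribution can be illustrated in one dimension. In this toy sketch (the sensor model, parameter names, and numbers are hypothetical, not taken from the modeler described above), we keep a posterior over a length parameter L instead of asserting that L equals the measurement:

```python
import numpy as np

def posterior_over_length(z_measured, sigma=0.02, lo=0.5, hi=1.5, n=1001):
    """Grid-based Bayesian update: flat prior on L, Gaussian sensor
    model z ~ Normal(L, sigma), posterior normalized over the grid."""
    grid = np.linspace(lo, hi, n)
    like = np.exp(-0.5 * ((z_measured - grid) / sigma) ** 2)
    post = like / like.sum()   # flat prior: posterior is proportional to likelihood
    return grid, post

grid, post = posterior_over_length(z_measured=1.0)
mean_L = (grid * post).sum()   # posterior mean instead of a hard value
```

Downstream geometric reasoning can then propagate the whole distribution (or its mean and spread) rather than a single value, which is the generalization of equality constraints the abstract describes.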
PDF: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Mekhnacha2000.pdf PS: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Mekhnacha2000.ps PS.GZ: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Mekhnacha2000.ps.gz Reference: Mekhnacha K., Mazer E. & Bessière P. (2000); The Design and Implementation of a Bayesian CAD Modeler for Robotic Applications; Les cahiers du Laboratoire Leibniz (Technical Report), n°2, May 2000; Grenoble, France State Identification for Planetary Rovers: Learning and Recognition Abstract: A planetary rover must be able to identify states where it should stop or change its plan. With limited and infrequent communication from ground, the rover must recognize states accurately. However, the sensor data is inherently noisy, so identifying the temporal patterns of data that correspond to interesting or important states becomes a complex problem. In this paper, we present an approach to state identification using second-order Hidden Markov Models. Models are trained automatically on a set of labeled training data; the rover uses those models to identify its state from the observed data. The approach is demonstrated on data from a planetary rover platform. PDF: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Aycard2000.pdf PS: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Aycard2000.ps PS.GZ: http://www-leibniz.imag.fr/LAPLACE/Publications/Rayons/Aycard2000.ps.gz Reference: O. Aycard and R. Washington (2000). State Identification for Planetary Rovers: Learning and Recognition. In Proceedings of the 2000 IEEE International Conference on Robotics and Automation. San Francisco, USA. ***** You are welcome to visit our WWW pages (http://www-leibniz.imag.fr/LAPLACE) with numerous VIDEOs and DEMOs. ___________________________________________________________________ Dr Pierre BESSIERE CNRS ********************* Laboratoire LEIBNIZ - Institut IMAG 46 ave.
Felix Viallet Work: +33/(0)4.76.57.46.73 38031 Grenoble - FRANCE Fax : +33/(0)4.76.57.46.02 mailto:Pierre.Bessiere at imag.fr WWW: http://www-leibniz.imag.fr/LAPLACE http://www-leibniz.imag.fr/PRASC http://www-leibniz.imag.fr/~bessiere CNRS - INPG - UJF From poznan at harl.hitachi.co.jp Sat Jun 3 02:27:15 2000 From: poznan at harl.hitachi.co.jp (Roman Poznanski) Date: Sat, 03 Jun 2000 15:27:15 +0900 Subject: A new book on NEURAL NETWORKS....... References: <38D96662.6203063A@neuron.kaist.ac.kr> Message-ID: <3938A543.1B7368D2@harl.hitachi.co.jp> FORTHCOMING

Biophysical Neural Networks: Foundations of Analytical Neuroscience
Edited by Roman R. Poznanski, Advanced Research Laboratory, Hitachi, Ltd., Japan

Biophysical Neural Networks focuses on biologically realistic models in the exploration of brain function at a multi-hierarchical level of organization. From biochemistry to large assemblies of neurons, the modeling of morphologically diverse, biophysically and biochemically realistic neural networks is the main focus of the book. An essential text for researchers active in the fields of medicine (neuroscience), biophysics, biomedical engineering, and neural science.

Key Features
--------------
**Learn why the brain is intrinsically noncomputational.
**Read how the Darwinian brain model can be made more profitable for engineers.
**Discover the new field of ANALYTICAL NEUROSCIENCE.
**With 45 research projects / unsolved problems.
**Over 500 references and color plates.

The book is scheduled to appear late this year. If you would like to receive a color brochure, email me your name and postal address. Sincerely, Roman R.
Poznanski Editor, Biophysical Neural Networks: Foundations of Analytical Neuroscience From stefan.wermter at sunderland.ac.uk Mon Jun 5 07:48:06 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Mon, 05 Jun 2000 12:48:06 +0100 Subject: cognitive systems research journal book review Message-ID: <393B9376.3A2462EC@sunderland.ac.uk> The journal COGNITIVE SYSTEMS RESEARCH invites recent and new books to be sent for review. We are also interested in hearing from researchers who would like to write a review, to be published as a brief article in the journal. If you have recently read an interesting, challenging, or controversial new book within the scope of the journal, please let us know at the address below. We are particularly interested in new books and new review article writers. The journal Cognitive Systems Research covers all topics in the study of cognitive processes, in both natural and artificial systems. The journal emphasizes the integration/synthesis of ideas, concepts, constructs, theories, and techniques from multiple paradigms, perspectives, and disciplines, in the analysis, understanding, and design of cognitive and intelligent systems. Contributions describing results obtained within the traditional disciplines (e.g., psychology, artificial intelligence) using well-established paradigms are also sought if such work has broader implications and relevance. The journal seeks to foster and promote the discussion of novel approaches in studying cognitive and intelligent systems. It also encourages cross-fertilization of disciplines. This is to be achieved by soliciting and publishing high-quality contributions in all of the areas of study in cognitive science, including artificial intelligence, linguistics, psychology, psychiatry, philosophy, system and control theory, anthropology, sociology, biological sciences, and neuroscience.
For more information on topics and scope please see http://www.cecs.missouri.edu/~rsun/journal.html Please send suggestions for book reviews or for being a review article author to the book review editor at the address below. Also publishers are invited to send two review copies for inspection directly. best wishes, Stefan *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From niranjan at eng.cam.ac.uk Tue Jun 6 09:25:44 2000 From: niranjan at eng.cam.ac.uk (niranjan@eng.cam.ac.uk) Date: Tue, 6 Jun 2000 14:25:44 +0100 (BST) Subject: Faculty Jobs - Sheffield Message-ID: <200006061325.25395@baby.eng.cam.ac.uk> The University of Sheffield Department of Computer Science has two faculty positions (lectureships), to start asap. We are looking for outstanding researchers in any area of CS, with a willingness to do some undergraduate teaching outside their area of research. Appointees in the area of Machine Learning may join a new group of four faculty: Joab Winkler (works in Wavelets), Thia Kirubarajan (works in tracking, currently at UCONN, joining us in September), Si Wu (works in Pattern Recognition, currently at RIKEN, joining us in September) and myself. I am keen to encourage applicants who are good in probabilistic modelling with interest in difficult real world problems such as signal processing or computational biology. Deadline 16 June; Interview 27 June. Formal application procedure is via instructions in http://www.shef.ac.uk/jobs Information on the department and University from: http://www.dcs.shef.ac.uk If you are interested, or know someone who might be, please get in touch. 
Many thanx niranjan __________________________________________________________________ Mahesan Niranjan Professor of Computer Science The University of Sheffield M.Niranjan at dcs.shef.ac.uk _________________________________________________________________ From n.sharkey at dcs.shef.ac.uk Wed Jun 7 05:27:35 2000 From: n.sharkey at dcs.shef.ac.uk (Noel Sharkey) Date: Wed, 7 Jun 2000 10:27:35 +0100 (BST) Subject: Faculty Positions at Sheffield Message-ID: *Apologies if you receive more than one copy ********************* Faculty POSITIONS ****************** The University of Sheffield Department of Computer Science has two faculty positions (lectureships - British equivalent of tenure-track assistant professor), to start ASAP. We are looking for outstanding researchers in any area of Computer Science, with a willingness to do some undergraduate teaching outside their area of research. There are a number of internationally excellent research groups within the dept. - see www.dcs.shef.ac.uk/research/groups/ I would particularly like to encourage researchers in the field of robotics with a preference for adaptive methods (NNs, GAs), BioRobotics, or intelligent sensing. (See the NRG group pages: www.dcs.shef.ac.uk/research/groups/nrg/) Please pass this on to anyone who might be interested. Deadline 16 June; Interview 27 June. Formal application procedure is via instructions in http://www.shef.ac.uk/jobs Information on the department and University from: http://www.dcs.shef.ac.uk ******************************************************************** Noel Sharkey PhD FIEE FBCS Professor of Computer Science Dept. Computer Science email: n.sharkey at dcs.shef.ac.uk University of Sheffield fax: (0114) 2221810 Sheffield, S. 
Yorks, UK phone: (0114) 2221803 ******************************************************************** From thomas.runkler at mchp.siemens.de Thu Jun 8 09:46:51 2000 From: thomas.runkler at mchp.siemens.de (Thomas Runkler) Date: Thu, 8 Jun 2000 15:46:51 +0200 (MET DST) Subject: Siemens Ph.D. studentship Message-ID: <200006081346.PAA17307@obsidian.mchp.siemens.de> Siemens Corporate Technology is seeking outstanding applicants for a three-year Ph.D. studentship for the project NEURAL COMPUTATION AND FUZZY SYSTEMS IN INDUSTRY with a focus on applications in process industry, production, and logistics. Applicants must have a Master's degree in EE/CS/Physics/Applied Mathematics or an engineering discipline with a solid background in one or more of the following areas: neural networks, fuzzy logic, data analysis, modeling, simulation, control, optimization, diagnosis, nonlinear dynamics, and distributed systems. Programming experience in MATLAB and C/C++ or Java is essential for the offered position. Candidates must demonstrate good communication skills in either English or German. The successful candidate(s) will join an active R&D team located in picturesque Munich, not far from the Bavarian Alps with its world-famous castles and mountain lakes. Candidates will be responsible for conducting leading-edge research in the field of data-driven process modeling, preventive diagnosis, distributed control, and distributed optimization, and for developing systems for new products. The starting date for the position is October 2000. Successful applicants will be issued a residency visa for the three-year period.
Applicants should send their resume, three letters of recommendation, and a statement of interests and goals to: Barbara Mayr Siemens AG, Otto-Hahn-Ring 6, D-81730 Munich, Germany Phone: 0049-89-636-46863 Fax: 0049-89-636-53981 Email: barbara.mayr at mchp.siemens.de From Dave_Touretzky at cs.cmu.edu Fri Jun 9 21:45:59 2000 From: Dave_Touretzky at cs.cmu.edu (Dave_Touretzky@cs.cmu.edu) Date: Fri, 09 Jun 2000 21:45:59 -0400 Subject: new journal: Brain and Mind Message-ID: <17121.960601559@skinner.boltz.cs.cmu.edu> The first issue of Brain and Mind, a new journal from Kluwer, is available for free on the web at http://www.wkap.nl./journalhome.htm/1389-1987 The table of contents follows. -- Dave Touretzky ================================================================

Brain and Mind
A Transdisciplinary Journal of Neuroscience and Neurophilosophy
Table of Contents, Volume 1, Issue 1, April 2000

Editors' Introduction - John Bickle, Gillian Einstein, Valerie Hardcastle, pp. 1-6
Why is Brain Size so Important: Design Problems and Solutions as Neocortex Gets Bigger or Smaller - Jon H. Kaas, pp. 7-23
Editor's Note - John Bickle, pp. 25-25
A Remembrance of an Event - Foreword to ''The Two Factor Theory of the Mind-Brain Relation'' by Ullin T. Place - C.B. Martin, pp. 27-27
The Two Factor Theory of the Mind-Brain Relation - Ullin T. Place, pp. 29-43
Behavioral Tolerance (Contingent Tolerance) is Mediated in Part by Variations in Regional Cerebral Blood Flow - Stephen C. Fowler, pp. 45-57
Self, World and Space: The Meaning and Mechanisms of Ego- and Allocentric Spatial Representation - Rick Grush, pp. 59-92
Terra Cognita: From Functional Neuroimaging to the Map of the Mind - Dan Lloyd, pp. 93-116
Editor's Note: State of the Science Article, pp. 117-117
On the 'Dynamic Brain' Metaphor - Péter Érdi, pp.
119-145 ==== From robtag at unisa.it Mon Jun 12 07:02:25 2000 From: robtag at unisa.it (Roberto Tagliaferri) Date: Mon, 12 Jun 2000 13:02:25 +0200 Subject: material available: SOFT COMPUTING METHODOLOGIES FOR DATA MINING Message-ID: <3944C341.45E0401C@dia.unisa.it> Dear Colleagues, The programme and the transparencies of the workshop held at IIASS "E.R. Caianiello" on May 19-20, 2000, organized by SIREN (Italian Neural Networks Society, Società Italiana Reti Neuroniche) on SOFT COMPUTING METHODOLOGIES FOR DATA MINING, are now available for download on-line. You can find them on the SIREN site at http://dsi.ing.unifi.it/neural/siren/Workshop2000/workshop2000_en.html I would like to thank the speakers, the organizing committee, and Dr. Enrico Francesconi for their help in the organization. For any problems you can send an e-mail to robtag at unisa.it Roberto Tagliaferri From abbass at cs.adfa.edu.au Mon Jun 12 20:51:43 2000 From: abbass at cs.adfa.edu.au (Hussein A. Abbass) Date: Tue, 13 Jun 2000 10:51:43 +1000 Subject: Call for Book Chapters: (Data Mining: A Heuristic Approach) Message-ID: <3.0.6.32.20000613105143.007a3c90@csadfa.cs.adfa.edu.au> Our sincere apologies if you receive multiple copies of this call for chapters or if it is not in your academic research interests. Dear colleague, Please post this call for chapters to the relevant researchers in your organization. Call for Chapters and Contributions ------------------------------------------- Data Mining: A Heuristic Approach http://www.cs.adfa.edu.au/~abbass/Book/DMHA.html Editors: H.A. Abbass, R. Sarkar, and C. Newton Publisher: Idea Group Publishing, USA This book volume will be a repository for the applications of heuristic techniques in data mining. With roots in optimisation, artificial intelligence, and statistics, data mining is an interdisciplinary area that is concerned with finding patterns in databases.
These patterns might be the expected trend of fashion in women's clothes, the potential change in the prices of some shares on the stock exchange, the prospective behaviour of some competitors, or the causes of a budding virus. With the large amount of data stored in many organizations, businesspeople have observed that these data are an important intangible asset, if not the most important one, in their organizations. This instigated an enormous amount of research searching for learning methods capable of recognising novel and non-trivial patterns in databases. Unfortunately, handling large databases is a very complex process, and traditional learning techniques such as Neural Networks and traditional Decision Trees are expensive to use. New optimisation techniques such as support vector machines and kernel methods, as well as statistical techniques such as Bayesian learning, are widely used in the field of data mining nowadays. However, these techniques are computationally expensive. Obviously, heuristic techniques provide much help in this arena. Notwithstanding, there are only a few books in the area of heuristics and a few more in the area of data mining. Surprisingly, no single book has been published to put together these two fast-changing, inter-related fields.

Topics: The use of heuristics (evolutionary algorithms, simulated annealing, tabu search, swarm intelligence, biological agents, memetic algorithms, and others) in the following areas:
- Feature selection.
- Data cleaning.
- Clustering, classification, prediction, and association rules.
- Optimisation methods for data mining.
- Kernels and support vector machines.
- Fast algorithms for training neural networks.
- Bayesian inference and learning.
Survey chapters are also welcomed.
and other related topics. Important dates: Abstract submission: August 15, 2000 Acceptance of abstract: September 15, 2000 Full chapter due: January 15, 2001 Notification of full-chapter acceptance: March 1, 2001 Final Version Due: April 30, 2001 Estimated publication date: Fall 2001 by Idea Group Publishing Contact information: Send electronic submissions to one of the editors at abbass at cs.adfa.edu.au ruhul at cs.adfa.edu.au csn at cs.adfa.edu.au Hard copies should be sent to any of the editors at: School of Computer Science, University College, University of New South Wales, Australian Defence Force Academy, Canberra, ACT2600, Australia. Fax submission to: 02-62688581 within Australia +61-2-62688581 International Hussein Aly Abbass Amein Lecturer in Computer Science, Email: abbass at cs.adfa.edu.au Australian Defence Force Academy, http: http://www.cs.adfa.edu.au/~abbass School of Computer Science, Tel.(M) (+61) 0402212977 University College, Tel.(H) (+61) (2) 62578757 University of New South Wales, Tel.(W) (+61) (2) 62688158 Canberra, ACT2600, Australia. Fax.(W) (+61) (2) 62688581 From cyrano at arti.vub.ac.be Tue Jun 13 09:47:00 2000 From: cyrano at arti.vub.ac.be (Andreas Birk) Date: Tue, 13 Jun 2000 15:47:00 +0200 (MET DST) Subject: book announcements Message-ID: <200006131347.PAA25190@arti13.vub.ac.be> Dear researchers interested in robot learning, the book "Interdisciplinary Approaches to Robot Learning" edited by John Demiris and Andreas Birk is now available. It is published and distributed by World Scientific. At the same time, Springer decided to offer an online version of the previously published book "Learning Robots" edited by Andreas Birk and John Demiris. Information on both books is given below. ---- John Demiris, Andreas Birk (Eds.) Interdisciplinary Approaches to Robot Learning Robotics and Intelligent Systems Series, World Scientific, 2000 ISBN 981-02-4320-0 250 pp (approx.)
US$56 book description: http://www.wspc.com/books/compsci/4436.htm Contents: Preface to Interdisciplinary Approaches to Robot Learning (J Demiris & A Birk) Bootstrapping the Developmental Process: The Filter Hypothesis (L Berthouze) Biomimetic Gaze Stabilization (T Shibata & S Schaal) Experiments and Models About Cognitive Map Learning for Motivated Navigation (P Gaussier et al.) Learning Selection of Action for Cortically-Inspired Robot Control (H Frezza-Buet & F Alexandre) Transferring Learned Knowledge in a Lifelong Learning Mobile Robot Agent (J O'Sullivan) Of Hummingbirds and Helicopters: An Algebraic Framework for Interdisciplinary Studies of Imitation and Its Applications (C Nehaniv & K Dautenhahn) Evolving Complex Visual Behaviours Using Genetic Programming and Shaping (S Perkins & G M Hayes) Preston: A System for the Evaluation of Behaviour Sequences (M Wilson) Readership: Researchers and graduate students in robotics and machine learning who are interested in interdisciplinary approaches to their fields. ----- Andreas Birk, John Demiris (Eds.) Learning Robots Proceedings of EWLR-6, Brighton, UK; Lecture Notes in Artificial Intelligence (LNAI) 1545, Springer, 1998 ISBN 3-540-65480-1 188 pp DM 50,- This book is now online via the Springer LINK service. This means that subscribed parties (libraries, institutes, etc.) can access the complete content of the book via the WWW. Access from hosts which are not subscribed to the LINK service is limited to the abstracts of the different chapters. 
The online access of "Learning Robots" is located at http://link.springer.de/link/service/series/0558/tocs/t1545.htm ----- From cindy at cns.bu.edu Thu Jun 15 09:59:00 2000 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Thu, 15 Jun 2000 09:59:00 -0400 Subject: Neural Networks 13(4/5) Message-ID: <200006151359.JAA27950@retina.bu.edu> NEURAL NETWORKS 13(4/5) Contents - Volume 13, Numbers 4 & 5 - 2000 ------------------------------------------------------------------ NEURAL NETWORKS LETTERS: Bias reduction in skewed binary classification with Bayesian neural networks P.J.G. Lisboa, A. Vellido, and H. Wong INVITED ARTICLES: Independent component analysis: Algorithms and applications A. Hyvarinen and E. Oja Evolutionary robots with on-line self-organization and behavioral fitness D. Floreano and J. Urzelai CONTRIBUTED ARTICLES: ***** Neuroscience and Neuropsychology ***** A generalized Hebbian rule for activity-dependent synaptic modifications T. Kitajima and K.-I. Hara ***** Mathematical and Computational Analysis ***** Mutual information of sparsely coded associative memory with self-control and ternary neurons D. Bolle, D.R.C. Dominguez, and S. Amari Construction of confidence intervals for neural networks based on least squares estimation I. Rivals and L. Personnaz A new algorithm for learning in piecewise-linear neural networks E.F. Gad, A.F. Atiya, S. Shaheen, and A. El-Dessouki Evolution and generalization of a single neurone, III: Primitive, standard, robust, and minimax regressions S. Raudys ***** Engineering and Design ***** Determining the number of centroids for CMLP network M. Lehtokangas ***** Technology and Applications ***** Defining a neural network controller structure for a rubbertune robot M. Ozkan, K. Inoue, K. Negishi, and T. 
Yamanaka An efficient learning algorithm for improving generalization performance of radial basis function neural networks Zheng-ou Wang and Tao Zhu ***** Book Review ***** Adaptive resonance theory microchips: Circuit design techniques A.E. Hubbard ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
-----------------------------------------------------------------------------
Membership Type        INNS        ENNS        JNNS
-----------------------------------------------------------------------------
Membership with        $80         660 SEK     Y 15,000 [including Y 2,000
Neural Networks                                entrance fee]
  student              $55         460 SEK     Y 13,000 [including Y 2,000
                                               entrance fee]
-----------------------------------------------------------------------------
Membership without     $30         200 SEK     not available to non-students
Neural Networks                                (subscribe through another
                                               society); Y 5,000 (student)
                                               [including Y 2,000 entrance fee]
-----------------------------------------------------------------------------
Institutional rates    $1132       2230 NLG    Y 149,524
-----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From duch at phys.uni.torun.pl Fri Jun 16 11:09:07 2000 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Fri, 16 Jun 2000 17:09:07 +0200 Subject: tutorial on extraction of knowledge from data, IJCNN'2000 Message-ID: <006501bfd7a4$d1a7b2a0$04054b9e@phys.uni.torun.pl> Web pages containing notes for the tutorial Extraction of Knowledge from Data Using Computational Intelligence Methods, to be presented at the International Joint Conference on Neural Networks, Como, Italy, 24-27 July 2000, are available at the address: http://www.phys.uni.torun.pl/~duch/ref/kdd-tut/index.html These pages will still be developed further, but at the present stage they should be sufficient to give you an idea about the neural and machine learning methods that will be presented. See you in Como! W/lodzis/law Duch http://www.phys.uni.torun.pl/~duch From amari at brain.riken.go.jp Fri Jun 16 03:07:45 2000 From: amari at brain.riken.go.jp (Shun-ichi Amari) Date: Fri, 16 Jun 2000 16:07:45 +0900 Subject: Bernoulli-RIKEN Symposium on Neural Networks and Learning Message-ID: We are now accepting registration for the Bernoulli-RIKEN Symposium on Neural Networks and Learning. Please register through the web-site http://www.bsis.brain.riken.go.jp/Bernoulli if you hope to be invited to this Symposium. Only a limited number of around one hundred people can attend the conference; we may therefore have to decline some requests, depending on circumstances. Thank you for your understanding and cooperation.
********************************************** Bernoulli-RIKEN BSI 2000 Symposium on Neural Networks and Learning Dates and Venues: Symposium: October 25 - 27, 2000 Ohkouchi Hall, RIKEN (The Institute of Physical and Chemical Research), Japan Satellite Workshop: October 28, 2000 The Institute of Statistical Mathematics, Japan Aim: In order to celebrate Mathematical Year 2000, The Bernoulli Society is organizing a number of Symposia in rapidly developing research areas in which probability theory and statistics can play important roles. Brain science in the wide sense will become more important in the 21st century. Information processing in the brain is so flexible, and its learning ability so strong, that it is indeed a challenge for information science to elucidate its mechanisms. It is also a big challenge to construct brain-style information processing systems. The present Symposium focuses on the learning ability of real and artificial neural networks and related systems from theoretical and practical points of view. Probability theory and statistics will play fundamental roles in elucidating these systems, and these systems in turn fortify stochastic and statistical methods. Theories of neural learning and pattern recognition have a long history, and many new ideas are emerging currently. They also have practical applicability to real-world problems. Now is a good time to review all of these new ideas and methods and to discuss future directions of development. We will invite world-class researchers in these fields and discuss the state-of-the-art of neural networks and learning as well as future directions of this important area. Participants are by invitation only. We are expecting 50-80 participants from all over the world. After the symposium, we will organize a more informal one-day workshop: "Towards new unification of statistics and neural networks learning".
Detailed information on timetables and abstracts can be obtained at http://www.bsis.brain.riken.go.jp/Bernoulli Those interested in joining the Symposium and Workshop may request an invitation through the above web-site after June 15, when we are ready. If you have any questions, contact the organizing committee at bernoulli2000 at bsis.brain.riken.go.jp ******************* Sponsors: The Bernoulli Society for Mathematical Statistics and Probability RIKEN Brain Science Institute The Institute of Statistical Mathematics Japanese Neural Networks Society In Cooperation with: Japanese Statistical Society Supported by: The Commemorative Association for the Japan World Exposition (1970) The Support Center for Advanced Telecommunications research Technology (SCAT) Organizing Committee: Chair Shun-ichi Amari, RIKEN Brain Science Institute, Japan Leo Breiman, University of California, Berkeley, USA Shinto Eguchi, The Institute of Statistical Mathematics, Japan Michael Jordan, University of California, Berkeley, USA Noboru Murata, Waseda University, Japan Mike Titterington, University of Glasgow, UK Vladimir Vapnik, AT&T, USA A registration fee of 10,000 Japanese yen (about 100 US$), including the reception, is payable at the conference venue. There is a 50% student discount. ***************** Program:

1. Graphical Models and Statistical Methods:
Steffen L. Lauritzen (Aalborg University) Graphical models for learning
Thomas S. Richardson (University of Warwick) Ancestral graph Markov models: an alternative to models with latent or selection variables
Lawrence Saul (AT&T Labs) Learning the Global Structure of Nonlinear Manifolds
Martin Tanner (Northwestern University) Inference for and Applications of Hierarchical Mixtures-of-Experts

2. Combining Learners
Leo Breiman (University of California, Berkeley) Random Forests
Jerome H.
Friedman (Stanford University) Gradient boosting and multiple additive regression trees Peter Bartlett (Australian National University) Large Margin Classifiers and Risk Minimization Yoram Singer (The Hebrew University) Combining Learners: an Output Coding Perspective 3. Information Geometry and Statistical Physics Shinto Eguchi (The Institute of Statistical Mathematics) Information geometry of tubular neighbourhoods for a statistical model Shun-ichi Amari (RIKEN Brain Science Institute) Information geometry of neural networks Manfred Opper (Aston University) The TAP Mean Field approach for probabilistic models Magnus Rattray (University of Manchester) Modelling the learning dynamics of latent variable models 4. VC Dimension and SVM Vladimir Vapnik (AT&T Labs) Statistical learning theory and support vector machines Michael Kearns (AT&T Labs) Sparse Sampling Algorithms for Probabilistic Artificial Intelligence Gabor Lugosi (Pompeu Fabra University) Model selection, error estimation, and concentration Bernhard Schoelkopf (Microsoft Research Ltd.) SV Algorithms and Applications ******************** Shun-ichi Amari Vice Director, RIKEN Brain Science Institute Laboratory for Mathematical Neuroscience Research Group on Brain-Style Information Systems tel: +81-(0)48-467-9669; fax: +81-(0)48-467-9687 amari at brain.riken.go.jp http://www.bsis.brain.riken.go.jp/ From harnad at coglit.ecs.soton.ac.uk Sun Jun 18 10:59:04 2000 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sun, 18 Jun 2000 15:59:04 +0100 (BST) Subject: Language-Origins: PSYC Call for Multiple Book Reviewers Message-ID: PSYCOLOQUY CALL FOR BOOK REVIEWERS of: "The Origins of Complex Language" by Andrew Carstairs-McCarthy (OUP 1999) Below is the abstract of the Precis of "The Origins of Complex Language" by Andrew Carstairs-McCarthy (740 lines). This book has been selected for multiple review in Psycoloquy. 
If you wish to submit a formal book review please write to psyc at pucc.princeton.edu indicating what expertise you would bring to bear on reviewing the book if you were selected to review it. Full Precis: http://www.cogsci.soton.ac.uk/psyc-bin/newpsy?11.082 (If you have never reviewed for PSYCOLOQUY or Behavioral & Brain Sciences before, it would be helpful if you could also append a copy of your CV to your inquiry.) If you are selected as one of the reviewers and do not have a copy of the book, you will be sent a copy of the book directly by the publisher (please let us know if you have a copy already). Reviews may also be submitted without invitation, but all reviews will be refereed. The author will reply to all accepted reviews. FULL PSYCOLOQUY BOOK REVIEW INSTRUCTIONS AT: http://www.princeton.edu/~harnad/psyc.html http://www.cogsci.soton.ac.uk/psycoloquy/ Psycoloquy reviews are of the book, not the Precis. Length should be about 200 lines [c. 1800 words], with a short abstract (about 50 words), an indexable title, and reviewer's full name and institutional address, email and Home Page URL. All references that are electronically accessible should also have URLs. AUTHOR'S RATIONALE FOR SOLICITING MULTIPLE BOOK REVIEW Most recent investigators assume that the brain has always been the most important part of human anatomy for the evolution of language, and do not seriously examine other conceivable directions in which grammatical evolution might have proceeded. In "The Origins of Complex Language," it is suggested that certain central features of language-as-it-is, notably the distinction between sentences and noun phrases, are by no means inevitable outcomes of linguistic or cognitive evolution, so that where they come from constitutes a genuine puzzle. 
The solution that is proposed is that grammar-as-it-is was, in fundamental respects, exapted from, or tinkered out of, the neural mechanisms that arose for the control of syllabically organized vocalization, made possible by (among other things) the descent of the larynx. This proposal turns upside down mainstream views about the relationship between language development and vocal tract development, and also challenges the logical and epistemological basis of notions closely tied to the distinction between sentences and noun phrases, such as 'reference', 'predication' and 'assertion'. It should therefore be of interest to anthropologists, psychologists, cognitive scientists, linguists and philosophers of language. psycoloquy.00.11.082.language-origins.1.carstairs-mccarthy Wed May 24 2000 ISSN 1055-0143 (44 paragraphs, 27 references, 85 lines) PSYCOLOQUY is sponsored by the American Psychological Association (APA) Copyright 2000 Andrew Carstairs-McCarthy THE ORIGINS OF COMPLEX LANGUAGE [Oxford University Press 1999, ISBN 0-19-823822-3, 0-19-823821-5] Precis of Carstairs-McCarthy on Complex Language Andrew Carstairs-McCarthy University of Canterbury Department of Linguistics Private Bag 4800 Christchurch New Zealand a.c-mcc at ling.canterbury.ac.nz ABSTRACT: Some puzzling characteristics of grammar, such as the sentence/NP distinction and the organization of inflection classes, may provide clues about its prehistory. When bipedalism led to changes in the vocal tract that favoured syllabically organized vocalization, this made possible an increase in vocabulary which in turn rendered advantageous a reliable syntax, whose source was the neural mechanism for controlling syllable structure. Several features of syntax make sense as byproducts of characteristics of the syllable (for example, grammatical 'subjects' may be byproducts of onset margins). This scenario is consistent with evidence from biological anthropology, ape language studies, and brain neurophysiology. 
KEYWORDS: ape, aphasia, brain development, evolution of language, grammar, language, larynx, noun phrase, predication, principle of contrast, reference, sentence, sign language, speech, syllable, truth From NKasabov at infoscience.otago.ac.nz Sun Jun 18 18:54:50 2000 From: NKasabov at infoscience.otago.ac.nz (Nik Kasabov) Date: Mon, 19 Jun 2000 10:54:50 +1200 Subject: a new book on intelligent systems and information sciences Message-ID: Dear colleagues, The following book has now been published by Springer Verlag (Physica Verlag): "Future Directions for Intelligent Systems and Information Sciences", N. Kasabov (ed), 2000, Springer Verlag (Physica Verlag). The book contains 19 chapters on contemporary topics, including: adaptive learning and evolving systems; artificial life; speech and image processing; virtual reality; intelligent robots; brain-like computing; bio-informatics; quantum neural networks; intelligent decision making; data mining; granular computing and computing with words. The chapters are written by internationally recognised authors. Content: Part I: Adaptive, evolving, learning systems N. Kasabov, ECOS - Evolving Connectionist Systems - a new/old paradigm for on-line learning and knowledge engineering. S.-B. Cho, Artificial life technology for adaptive information processing. R.J. Duro, J. Santos, J.A. Becerra: Evolving ANN controllers for smart mobile robots. G. Coghill, A simulation environment for the manipulation of naturally variable objects. Y. Maeda: Behavior-decision fuzzy algorithm for autonomous mobile robot. Part II: Intelligent human computer interaction and scientific visualisation: J. Taylor, N. Kasabov: Modelling the emergence of speech and language through evolving connectionist systems. H.J. van den Herik, E.O. Postma: Discovering the visual signature of painters. A. Nijholt, J. Hulstijn: Multimodal interactions with agents in virtual worlds. M. Paulin, R. Berquist: Virtual BioBots. 
Part III: New connectionist computational paradigms: Brainlike computing and quantum neural networks: J.G. Taylor, Future directions for neural networks and intelligent systems from the brain imaging research. A.A. Ezhov, D. Ventura, Quantum neural networks. N.G. Stocks, R. Mannella, Suprathreshold stochastic resonance in a neuronal network model: a possible strategy for sensory coding. Part IV: Bioinformatics: C. Brown, M. Schreiber, B. Chapman, G. Jacobs, Information science and bioinformatics. V.B. Bajic, I.V. Bajic, Neural network system for promoter recognition. Part V: Knowledge representation, knowledge processing, knowledge discovery, and some applications: W. Pedrycz, Granular computing: An introduction. J. Kacprzyk, A new paradigm shift from computation on numbers to computation on words on an example of linguistic database summarization. N. Kasabov, L. Erzegovesi, M. Fedrizzi, A. Beber, D. Deng: Hybrid intelligent decision support systems and applications for risk analysis and discovery of evolving economic clusters in Europe. Y.Y. Yun: Intelligent resource management through the constrained resource planning model. A. Ramer, M. do Carmo Nicoletti, S.Y. Sung: Evaluative studies of fuzzy knowledge discovery through NF systems. Series: Studies in Fuzziness and Soft Computing, Vol. 45. More information about the book can be obtained from the Web site: http://www.springer.de/cgi-bin/search_book.pl?isbn=3-7908-1276-5 with best regards, Nik Kasabov ------------------------------------------------------------------------ Prof. Dr. 
Nikola (Nik) Kasabov Director Knowledge Engineering Laboratory Department of Information Science University of Otago, P.O. Box 56, Dunedin, New Zealand phone: +64 3 4798319; fax: +64 3 4798311 email: nkasabov at otago.ac.nz http://divcom.otago.ac.nz/infosci/Staff/NikolaK.htm ------------------------------------------------------------------------ From char-ci0 at wpmail.paisley.ac.uk Mon Jun 19 10:59:20 2000 From: char-ci0 at wpmail.paisley.ac.uk (Darryl Charles) Date: Mon, 19 Jun 2000 15:59:20 +0100 Subject: Ph.D Studentship in Image Processing Message-ID: Ph.D. Studentship (From September 2000) Image processing with hybrid supervised and unsupervised neural networks. Ph.D. studentship available in the Applied Computational Intelligence Research Unit (ACIRU) at the University of Paisley, Scotland. An opportunity exists for a university-sponsored Ph.D. studentship to research and develop novel hybrid supervised/unsupervised artificial neural networks for application to visual data. A particular emphasis in this Ph.D. will be on data related to remote sensing, but there will be opportunity to develop generic models that may be adapted and applied to other image processing tasks. Research will attempt to uncover methods that improve classification and feature detection with the developed hybrid models on visual data. Applicants should normally hold a good degree in a related subject area; specific work experience and/or publications with regard to neural networks and image processing will strengthen an application. Programming experience with MATLAB and/or C++ will be an advantage. The successful candidate will be supervised jointly by Dr Darryl Charles and Dr Bogdan Gabrys and will play an active part in a vibrant research group. A 3-year grant is available at the normal rate and some part-time teaching should be available to supplement income. Applicants should send a CV and the names and addresses of two referees by e-mail or standard mail to the following address. 
Dr Darryl Charles CIS dept. University of Paisley High Street Paisley PA1 2BE e-mail: Darryl.Charles at paisley.ac.uk dept. home page: http://cis.paisley.ac.uk/ Dr Darryl Charles CIS Department University of Paisley Scotland Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material. Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. -------------------------- From tsr at stat.washington.edu Mon Jun 19 18:37:56 2000 From: tsr at stat.washington.edu (Thomas Richardson) Date: Mon, 19 Jun 2000 15:37:56 -0700 (PDT) Subject: AI & STATISTICS 2001 - Second Call for Papers: Deadline 10 July Message-ID: (apologies for multiple posting) ==================================================================== Second call for papers: AI and STATISTICS 2001 Eighth International Workshop on Artificial Intelligence and Statistics January 3-6, 2001, Hyatt Hotel, Key West, Florida http://www.ai.mit.edu/conferences/aistats2001/ SUBMISSION DEADLINE: midnight July 10, 2000 (PST) This is the eighth in a series of workshops which have brought together researchers in Artificial Intelligence (AI) and in Statistics to discuss problems of mutual interest. The exchange has broadened research in both fields and has strongly encouraged interdisciplinary work. Papers on all aspects of the interface between AI & Statistics are encouraged. 
To encourage interaction and a broad exchange of ideas, the presentations will be limited to about 20 discussion papers in single session meetings over three days (Jan. 4-6). Focused poster sessions will provide the means for presenting and discussing the remaining research papers. Papers for poster sessions will be treated equally with papers for presentation in publications. Attendance at the workshop will not be limited. The three days of research presentations will be preceded by a day of tutorials (Jan. 3). These are intended to expose researchers in each field to the methodology and techniques used in other related areas. The Eighth workshop especially encourages submissions related to the following workshop themes in the interface between information retrieval and statistics: Statistical natural language processing Game theory Missing information; unlabeled examples Error correcting codes In addition, papers on all aspects of the interface between AI & Statistics are strongly encouraged, including but not limited to Automated data analysis Cluster analysis and unsupervised learning Statistical advisory systems, experimental design Integrated man-machine modeling methods Interpretability in modelling Knowledge discovery in databases Metadata and the design of statistical data bases Model uncertainty, multiple models Multivariate graphical models, belief networks, causal modeling Online analytic processing in statistics Pattern recognition Prediction: classification and regression Probabilistic neural networks Probability and search Statistical strategy Vision, robotics, natural language processing, speech recognition Visualization of very large datasets Submission Requirements: Electronic submission of abstracts is required. The abstracts (up to 4 pages in length) should be submitted through the AI and Statistics Conference Management page supported by Microsoft Research. 
More specific instructions are available at http://cmt.research.microsoft.com/AISTATS2001/ In special circumstances other arrangements can be made to facilitate submission. For more information about possible arrangements, please contact the conference chairs. Submissions will be considered if they are received by midnight July 10, 2000 (PST). Please indicate the theme and/or the topic(s) your abstract addresses. Receipt of all submissions will be confirmed via electronic mail. Acceptance notices will be emailed by September 1, 2000. Preliminary papers (up to 12 pages, double column) must be received by November 1, 2000. These preliminary papers will be copied and distributed at the workshop. Program Chairs: Thomas Richardson, University of Washington, tsr at stat.washington.edu Tommi Jaakkola, MIT, tommi at ai.mit.edu Program Committee: Russell Almond, Educational Testing Service, Princeton Hagai Attias, Microsoft Research, Redmond Yoshua Bengio, University of Montreal Max Chickering, Microsoft Research, Redmond Greg Cooper, University of Pittsburgh Robert Cowell, City University, London Phil Dawid, University College, London Vanessa Didelez, University of Munich David Dowe, Monash University Brendan Frey, University of Waterloo Nir Friedman, Hebrew University, Jerusalem Dan Geiger, Technion Edward George, University of Texas Paolo Giudici, University of Pavia Zoubin Ghahramani, University College, London Clark Glymour, Carnegie-Mellon University Moises Goldszmidt, Peakstone Corporation David Heckerman, Microsoft Research, Redmond Thomas Hofmann, Brown University Reimar Hofmann, Siemens Michael Jordan, University of California, Berkeley David Madigan, Soliloquy Chris Meek, Microsoft Research, Redmond Marina Meila, Carnegie-Mellon University Kevin Murphy, University of California, Berkeley Mahesan Niranjan, University of Sheffield John Platt, Microsoft Research, Redmond Greg Ridgeway, University of Washington Lawrence Saul, AT&T Research Prakash Shenoy, University 
of Kansas Dale Schuurmans, University of Waterloo Padhraic Smyth, University of California, Irvine David Spiegelhalter, University of Cambridge Peter Spirtes, Carnegie-Mellon University Milan Studeny, Academy of Sciences, Czech Republic Michael Tipping, Microsoft Research, Cambridge Henry Tirri, University of Helsinki Volker Tresp, Siemens Chris Watkins, Royal Holloway and Bedford New College, Nanny Wermuth, University of Mainz Joe Whittaker, Lancaster University Chris Williams, University of Edinburgh From gluck at pavlov.rutgers.edu Tue Jun 20 10:19:44 2000 From: gluck at pavlov.rutgers.edu (Mark A. Gluck) Date: Tue, 20 Jun 2000 10:19:44 -0400 Subject: Postdoctoral Position in Computational Neurosci. at Rutgers-Newark Message-ID: Computational Neuroscience of Learning and Memory. Postdoctoral position open for January, 2001, start. Work on computational models of neural substrates of associative learning in animals and humans, with reference to the hippocampal, basal forebrain, basal ganglia, amygdala, and frontal brain systems. Applicants must have strong prior record (PhD thesis and/or papers) in the theory, implementation, and analysis of neural network algorithms and models. Some familiarity with relevant biological and behavioral systems helpful. Modeling will be integrated into ongoing experimental studies of the psychobiology of animal conditioning, and the neuropsychology of human memory disorders. Additional training in these areas provided. Located at Center for Molecular & Behavioral Neuroscience, Rutgers, Newark, NJ (15 min by train from downtown New York City). Send-- by email only--cover letter, CV, and names and emails of three references to: Mark Gluck and Catherine Myers, c/o gluck at pavlov.rutgers.edu. ______________________________________________________ Dr. Mark A. Gluck, Associate Professor Center for Molecular and Behavioral Neuroscience Phone: (973) 353-1080 x3221 Rutgers University Fax: (973) 353-1272 197 University Ave. 
Newark, New Jersey 07102 Email: gluck at pavlov.rutgers.edu WWW Homepages: Research Lab: http://www.gluck.edu Rutgers Memory Disorders Project: http://www.memory.rutgers.edu _______________________________________________________ From ckiw at dai.ed.ac.uk Tue Jun 20 14:20:06 2000 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Tue, 20 Jun 2000 19:20:06 +0100 (BST) Subject: Faculty position at University of Edinburgh, UK Message-ID: Apologies if you receive this message multiple times. I am keen to encourage people from the machine learning/probabilistic modelling fields who work on life sciences problems to apply. I shall be attending the ICML/UAI conferences at Stanford and will be happy to discuss this further with interested parties. Chris Williams Dr Chris Williams ckiw at dai.ed.ac.uk Institute for Adaptive and Neural Computation Division of Informatics, University of Edinburgh 5 Forrest Hill, Edinburgh EH1 2QL, Scotland, UK fax: +44 131 650 6899 tel: (direct) +44 131 651 1212 (department switchboard) +44 131 650 3090 http://anc.ed.ac.uk/ -------------------------------------------------------------------- INFORMATICS AND ITS APPLICATION TO THE LIFE SCIENCES The Division of Informatics at the University of Edinburgh (http://www.informatics.ed.ac.uk) is seeking to make an appointment in the area of Informatics and its applications to the Life Sciences. This is aimed not only at extending the applications of Informatics to biological problems but also at stimulating fundamental research in Informatics. The Division has inherited a very strong tradition of research in computer systems, theoretical computer science, cognitive science, artificial intelligence, robotics and neural networks. The successful candidate will add to our existing strengths in research and teaching, encourage the integration of his or her own research with that of others and contribute to the development of Informatics. 
The successful candidate can anticipate profitable involvement with local researchers in Biology, Medicine and Veterinary Medicine, including the new Edinburgh Genomic Microarray Facility (http://www.gmf-microarray.ed.ac.uk/). It is anticipated that the successful candidate will take a leadership role in the rapidly expanding bioinformatics enterprise at the University. The appointment will be at the Lecturer (17,238 pounds - 30,065 pounds), or Senior Lecturer or Readership (31,356 pounds - 35,670 pounds) scale. (Salary scales under review.) Further particulars can be found at http://www.informatics.ed.ac.uk/events/vacancies/life_sciences_fp.html and applications packs can be obtained from the PERSONNEL DEPARTMENT, The University of Edinburgh, 9-16 Chambers Street, Edinburgh EH1 1HT, UK Closing date: 28 July 2000 Please quote reference number 306395. Informal questions and requests for information can be sent to d.willshaw at cns.ed.ac.uk bonnie.webber at ed.ac.uk From wsenn at cns.unibe.ch Wed Jun 21 09:40:44 2000 From: wsenn at cns.unibe.ch (Walter Senn) Date: Wed, 21 Jun 2000 15:40:44 +0200 Subject: Similar IF neurons synchronize Message-ID: <3950C5DC.CAC8E678@cns.unibe.ch> The following paper is accepted at the SIAM Journal on Applied Mathematics: "Similar non-leaky integrate-and-fire neurons with instantaneous couplings always synchronize" The paper reconsiders the dynamics of pulse-coupled integrate-and-fire (IF) neurons analyzed by Mirollo and Strogatz (SIAM J. Appl. Math., 1990). Lifting their restriction to identical oscillators, we study the case of different intrinsic frequencies and different thresholds of the neurons, as well as different but positive couplings. For non-leaky neurons, we prove that generically the dynamics becomes fully synchronous for any initial conditions if the intrinsic frequencies, the thresholds and the couplings are not too different. 
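The claimed convergence is easy to observe numerically. The sketch below is not code from the paper; the network size, frequencies, thresholds, couplings and initial states are all illustrative. It simulates a small network of non-leaky IF neurons with slightly different intrinsic frequencies and instantaneous positive couplings, using event-driven updates: between events each state grows linearly, and each spike instantly adds eps to every neuron that has not yet fired in the same event.

```python
import numpy as np

# Non-leaky IF neurons: state x_i grows at rate I_i, neuron i fires when
# x_i reaches theta_i, resets to 0, and adds eps to all not-yet-fired neurons.
N = 5
I = np.array([1.00, 1.01, 1.02, 1.03, 1.04])   # slightly different frequencies
theta = np.ones(N)                              # thresholds
eps = 0.3                                       # positive coupling strength
x = np.array([0.05, 0.25, 0.45, 0.65, 0.85])    # illustrative initial states

def next_event(x):
    """Advance all neurons to the next threshold crossing, then run the
    firing cascade. Returns the set of neurons that fired in this event."""
    t = np.min((theta - x) / I)
    x += I * t
    fired = set()
    while True:
        crossing = [i for i in range(N)
                    if i not in fired and x[i] >= theta[i] - 1e-12]
        if not crossing:
            break
        for i in crossing:
            fired.add(i)
            for j in range(N):
                if j not in fired:
                    x[j] += eps        # instantaneous excitatory pulse
    for i in fired:
        x[i] = 0.0                     # reset fired neurons
    return fired

events = [next_event(x) for _ in range(50)]
print(len(events[-1]))  # once synchronized, every event fires all N neurons
```

With these parameters the heterogeneity is small relative to the coupling, so the cascades quickly absorb all neurons into a single synchronously firing group, as the theorem predicts.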
For the case of non-leaky IF neurons, this confirms Peskin's conjecture (1975), according to which nearly identical pulse-coupled oscillators in general synchronize. In particular, leakiness or, more generally, concave evolution functions as imposed by Mirollo and Strogatz, are not necessary to ensure global synchronization. Our result differs from the findings of Mirollo and Strogatz in that, for non-leaky IF neurons, almost all networks with weak homogeneity converge for all initial conditions to synchronous firing, while in their work with leaky and identical IF neurons, the dynamics synchronizes for all sets of parameter values, but for each set only for almost all initial conditions. Interestingly, for non-leaky neurons the case of exactly identical oscillators and weights is exceptional in that it does not assure full synchronization for all initial conditions. Walter Senn and Robert Urbanczik The paper can be downloaded from http://www.cns.unibe.ch/~wsenn/#pub . ------------------------------------------------------------- Walter Senn Phone office: +41 31 631 87 21 Physiological Institute Phone home: +41 31 332 38 31 University of Bern Fax: +41 31 631 46 11 Buehlplatz 5 email: wsenn at cns.unibe.ch CH-3012 Bern SWITZERLAND http://www.cns.unibe.ch/ ------------------------------------------------------------- From duch at phys.uni.torun.pl Wed Jun 21 10:01:27 2000 From: duch at phys.uni.torun.pl (Wlodzislaw Duch) Date: Wed, 21 Jun 2000 16:01:27 +0200 Subject: book reviewers wanted Message-ID: <000b01bfdb89$317c5ea0$04054b9e@phys.uni.torun.pl> Dear all, I have a few good books for reviewing in Transactions on Neural Networks. Unfortunately most of our reviewers are so slow that I have to look for help; please let me know who would be willing to write some reviews, mention your credentials, and after some consulting I'll add you to the list of reviewers and send you a list of books for reviewing. 
Sincerely, Włodzisław Duch Department of Computer Methods, Nicholas Copernicus University, Poland tel/fax: (+48-56) 622 1543, home 623 6685 http://www.phys.uni.torun.pl/~duch From rosaria at ICSI.Berkeley.EDU Wed Jun 21 14:50:34 2000 From: rosaria at ICSI.Berkeley.EDU (Rosaria Silipo) Date: Wed, 21 Jun 2000 11:50:34 -0700 (PDT) Subject: Summer School on Intelligent Data Analysis Message-ID: PRELIMINARY ANNOUNCEMENT SUMMER SCHOOL ON INTELLIGENT DATA ANALYSIS PALERMO, SEPTEMBER 18-22, 2000 Over the last decade or so, the size of machine-readable data sets has increased dramatically and the problem of "data explosion" has become apparent. In parallel with this, recent developments in computing have provided the basic infrastructure for fast access to online data. In particular, many advanced computational methods for extracting information from large quantities of heterogeneous data and for data representation are now beginning to mature. These developments have created a new range of problems and challenges for the analysts, as well as new opportunities for intelligent systems in data analysis. All this has led to the emergence of the field of Intelligent Data Analysis (IDA), a combination of diverse disciplines including Artificial Intelligence and Statistics in particular. The School on Intelligent Data Analysis (IDA) will focus on the core techniques of Intelligent Data Analysis: - Statistics, - Bayesian Networks, - Neural networks, - Time Series Analysis, - Rule Induction, - Fuzzy Logic, - Evolutionary Computation. All courses are organized so as to provide a broad description of the theoretical and practical aspects of each discipline. For this purpose, speakers from industry are also invited to present practical, already implemented applications of Intelligent Data Analysis techniques. The target audience of the IDA Summer School comprises advanced undergraduate students, PhD students, postdoctoral students and academic and industrial researchers and developers. 
The Summer School will take place at the University of Palermo (Italy) from September 18th to September 22nd, 2000. More information - including the preliminary program and the list of speakers - will be available from early June on the IDA summer school's web page: http://www.cere.pa.cnr.it/IDAschool/ FINANCIAL SUPPORT. At the moment it is very likely that financial support from the European Commission will be available for young researchers (<35) to attend the school. This support will cover the registration fee and part of the travel and subsistence expenses. In the course of June more details will be made available. If you are interested in attending the school or would like to inquire about the possibility of financial support, please send e-mail to ida at cere.pa.cnr.it. From rfrench at ulg.ac.be Fri Jun 23 13:18:16 2000 From: rfrench at ulg.ac.be (Robert French) Date: Fri, 23 Jun 2000 19:18:16 +0200 Subject: Deadline extension for Abstracts for NCPW6: June 30 Message-ID: <4.1.20000623190321.00ad2e10@pop3.mailst.ulg.ac.be> SIXTH NEURAL COMPUTATION AND PSYCHOLOGY WORKSHOP (NCPW6) We have decided to extend the Final Deadline for Abstracts for NCPW6 to JUNE 30. The Workshop will be held at the University of Liège in eastern Belgium from September 16-18 of this year. The goal of this Workshop, the sixth in a series, is to bring together psychologists and neuropsychologists doing neural network modeling. Abstracts are to be approximately 200 words long. Notification of acceptance for a paper presentation will be sent by July 5. The finished paper must be ready by the time of the Workshop. See the NCPW6 Web page for all details: http://www.fapse.ulg.ac.be/ncpw6/ So far we have people presenting from Belgium, Brazil, Britain, France, Germany, Holland, Italy, and the U.S. 
Some of the participants include: Domenico Parisi, Maartje Raijmakers, John Bullinaria, Bob French, Samantha Hartley, Martin LeVoi, Richard Shillcock, Jonathan Shapiro, Joe Levy, Richard Cooper, David Glasspool, Roland Baddeley, Noel Sharkey, Axel Cleeremans, Frank van Overwall, Barbara Tillmann, Gert Westermann, Jacques Sougné, and Claudia Schiffer, among others. (Well, actually, Claudia Schiffer won't be there. She doesn't do _connectionist_ modeling.) The Workshop will be held over two and a half days (Sept 16-18) and there will be about 25-30 talks, no parallel sessions and no posters. The atmosphere is designed to be congenial but rigorous. The welcoming reception, coffee breaks, lunches and a copy of the Proceedings (published by Springer-Verlag) will be included in the registration fee (100 euros). There will be an optional banquet (30 euros) on Sunday night, Sept. 17. Please register as soon as you are sure you are coming. You can register and pay electronically over a secure line. This year the theme will be "Evolution, Learning and Development." This is a broad topic and intentionally so. Although we aren't interested in, say, connectionist applications to submarine warfare, we will consider all papers that have something to do with the announced topic, even if rather tangentially. This is the first year that NCPW is being held on the Continent, a move that was explicitly designed to attract not only the usual contingent of British connectionists, but also our colleagues from other European countries as well. The exact organization of the final program will depend on the submissions received. We will publish the program on the Web site as soon as it is determined. Hope to see you in September. Bob French, on behalf of the NCPW6 organizing committee CONTACT DETAILS For any problems or questions, please send e-mail to mailto:cogsci at ulg.ac.be ---------------------------------------------------------------------------- Robert M. 
French, Ph.D Quantitative Psychology and Cognitive Science Psychology Department University of Liege 4000 Liege, Belgium Tel: (32.[0]4) 366.20.10 FAX: (32.[0]4) 366.28.59 email: rfrench at ulg.ac.be URL: http://www.fapse.ulg.ac.be/Lab/cogsci/rfrench.html ---------------------------------------------------------------------------- From marcusg at csee.uq.edu.au Sun Jun 25 23:46:22 2000 From: marcusg at csee.uq.edu.au (Marcus Gallagher) Date: Mon, 26 Jun 2000 13:46:22 +1000 Subject: PhD thesis available: MLP Error Surfaces. Message-ID: <3956D20E.1512DE44@csee.uq.edu.au> Dear Connectionists, I am happy to announce the availability of my PhD thesis for download in electronic format. Apologies if you receive multiple copies of this posting. URL: http://www.elec.uq.edu.au/~marcusg/thesis.html Regards, Marcus. ---------------------------------------------------------------- Multi-Layer Perceptron Error Surfaces: Visualization, Structure and Modelling Marcus R. Gallagher PhD Thesis, University of Queensland, Department of Computer Science and Electrical Engineering, 2000. Abstract The Multi-Layer Perceptron (MLP) is one of the most widely applied and researched Artificial Neural Network models. MLP networks are normally applied to performing supervised learning tasks, which involve iterative training methods to adjust the connection weights within the network. This is commonly formulated as a multivariate non-linear optimization problem over a very high-dimensional space of possible weight configurations. Analogous to the field of mathematical optimization, training an MLP is often described as the search of an error surface for a weight vector which gives the smallest possible error value. Although this presents a useful notion of the training process, there are many problems associated with using the error surface to understand the behaviour of learning algorithms and the properties of MLP mappings themselves. 
Because of the high-dimensionality of the system, many existing methods of analysis are not well-suited to this problem. Visualizing and describing the error surface are also nontrivial and problematic. These problems are specific to complex systems such as neural networks, which contain large numbers of adjustable parameters, and the investigation of such systems in this way is largely a developing area of research. In this thesis, the concept of the error surface is explored using three related methods. Firstly, Principal Component Analysis (PCA) is proposed as a method for visualizing the learning trajectory followed by an algorithm on the error surface. It is found that PCA provides an effective method for performing such a visualization, as well as providing an indication of the significance of individual weights to the training process. Secondly, sampling methods are used to explore the error surface and to measure certain properties of the error surface, providing the necessary data for an intuitive description of the error surface. A number of practical MLP error surfaces are found to contain a high degree of ultrametric structure, in common with other known configuration spaces of complex systems. Thirdly, a class of global optimization algorithms is also developed, which is focused on the construction and evolution of a model of the error surface (or search space) as an integral part of the optimization process. The relationships between this algorithm class, the Population-Based Incremental Learning algorithm, evolutionary algorithms and cooperative search are discussed. The work provides important practical techniques for exploration of the error surfaces of MLP networks. These techniques can be used to examine the dynamics of different training algorithms, the complexity of MLP mappings and an intuitive description of the nature of the error surface. The configuration spaces of other complex systems are also amenable to many of these techniques. 
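The first of these techniques can be sketched generically. The code below is not from the thesis; the network, task and hyperparameters are placeholders chosen only to produce a realistic weight trajectory. It records the full weight vector at every gradient-descent step and projects the trajectory onto its leading principal components via an SVD of the centred trajectory matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny 2-2-1 sigmoid MLP trained by batch gradient descent on XOR;
# the point is only to generate a weight trajectory to visualize.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
W1 = rng.normal(0.0, 1.0, (2, 2)); b1 = np.zeros(2)
W2 = rng.normal(0.0, 1.0, 2);      b2 = 0.0
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

trajectory = []                        # weight vector recorded at every step
lr = 0.5
for step in range(500):
    h = sig(X @ W1 + b1)               # forward pass
    out = sig(h @ W2 + b2)
    err = out - y
    d_out = err * out * (1 - out)      # backprop for 0.5 * sum(err**2)
    gW2 = h.T @ d_out
    gb2 = d_out.sum()
    d_h = np.outer(d_out, W2) * h * (1 - h)
    gW1 = X.T @ d_h
    gb1 = d_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    trajectory.append(np.concatenate([W1.ravel(), b1, W2, [b2]]))

T = np.array(trajectory)               # shape (steps, n_weights) = (500, 9)
Tc = T - T.mean(axis=0)                # centre before PCA
U, S, Vt = np.linalg.svd(Tc, full_matrices=False)
proj = Tc @ Vt[:2].T                   # trajectory in the top-2 PC plane
explained = S**2 / np.sum(S**2)        # variance fraction per component
print(proj.shape)                      # (500, 2)
```

The 2-D `proj` array can then be plotted directly; `explained` indicates how faithfully the plane captures the trajectory, and the loadings in `Vt` indicate which individual weights move most during training.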
Finally, the algorithmic framework provides a powerful paradigm for visualization of the optimization process and the development of parallel coupled optimization algorithms which apply knowledge of the error surface to solving the optimization problem. Keywords: error surface, neural networks, multi-layer perceptron, global optimization, supervised learning, scientific visualization, ultrametricity, configuration space analysis, search space analysis, evolutionary algorithms, probabilistic modelling, probability density estimation, principal component analysis. -- marcusg at csee.uq.edu.au http://www.elec.uq.edu.au/~marcusg/ From vogdrup at daimi.au.dk Mon Jun 26 09:22:44 2000 From: vogdrup at daimi.au.dk (Jakob Vogdrup Hansen) Date: Mon, 26 Jun 2000 15:22:44 +0200 Subject: PhD thesis available: Combining Predictors ... Message-ID: <200006261322.PAA03273@ppp.brics.dk> Dear Connectionists, I am happy to announce the availability of my PhD thesis for download in postscript format. URL: http://www.daimi.au.dk/~vogdrup/diss.ps Comments are welcome. regards, Jakob Title: Combining Predictors. Meta Machine Learning Methods and Bias/Variance & Ambiguity Decompositions Abstract: The most important theoretical tool in connection with machine learning is the bias/variance decomposition of error functions. Together with Tom Heskes, I have found the family of error functions with a natural bias/variance decomposition that has target independent variance. It is shown that no other group of error functions can be decomposed in the same way. An open problem in the machine learning community is thereby solved. The error functions are derived from the deviance measure on distributions in the one-parameter exponential family. It is therefore called the deviance error family. A bias/variance decomposition can also be viewed as an ambiguity decomposition for an ensemble method. 
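For the familiar special case of squared error, this kind of ambiguity decomposition is an exact identity (the classical Krogh-Vedelsby form). The sketch below uses made-up numbers and uniform weights purely to illustrate that special case; it is not the thesis's general deviance-family result:

```python
import numpy as np

rng = np.random.default_rng(2)
K = 4                                   # ensemble members
w = np.full(K, 1.0 / K)                 # uniform combination weights
f = rng.normal(0.0, 1.0, K)             # member predictions at one input (made up)

def decompose(t):
    """Squared-error ambiguity decomposition at target t."""
    fbar = w @ f                        # ensemble prediction
    ens_err = (fbar - t) ** 2           # ensemble error
    avg_err = w @ (f - t) ** 2          # weighted average member error
    ambiguity = w @ (f - fbar) ** 2     # spread of members around the ensemble
    return ens_err, avg_err, ambiguity

for t in (0.5, 2.0):
    ens_err, avg_err, ambiguity = decompose(t)
    # Exact identity: ensemble error = average member error - ambiguity.
    assert np.isclose(ens_err, avg_err - ambiguity)

# The ambiguity term never references the target, so it is target independent.
print(np.isclose(decompose(0.5)[2], decompose(2.0)[2]))  # True
```

Since the ambiguity term is nonnegative and target independent, the ensemble is never worse than the weighted average of its members, and member disagreement can be measured without labels, which is the property the abstract exploits.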
The family of error functions with a natural bias/variance decomposition that has target-independent variance can therefore be of use in connection with ensemble methods. The logarithmic opinion pool ensemble method has been developed together with Anders Krogh. It is based on the logarithmic opinion pool ambiguity decomposition using the Kullback-Leibler error function. It has been extended to the cross-validation logarithmic opinion pool ensemble method, whose advantage is that it can use unlabeled data to estimate the generalization error while still using the entire labeled example set for training. The cross-validation logarithmic opinion pool ensemble method is easily reformulated for another error function, as long as that error function has an ambiguity decomposition with target-independent ambiguity. It is therefore possible to use the cross-validation ensemble method with all error functions in the deviance error family.

--
Jakob V. Hansen Tlf: 86 750618 Rydevnget 87, 1. th. Kontor: B2.15 Lokal: (8942)3355 8210 Aarhus V E-mail: Vogdrup at daimi.au.dk

From P.J.Lisboa at livjm.ac.uk Mon Jun 26 05:51:35 2000
From: P.J.Lisboa at livjm.ac.uk (Lisboa Paulo)
Date: Mon, 26 Jun 2000 10:51:35 +0100
Subject: Computational Neuroscience: Modelling single trial EEG data for source separation and localisation
Message-ID:

PhD Studentship at JMU, Liverpool

A position is available for the analysis of single trial EEG for source separation and localisation, funded as a 3-year PhD studentship (approx. £6,800 p.a. plus fees) in a project involving industrial collaboration. The project will involve the use of Hidden Markov modelling, Independent Component Analysis, and purpose-built neural networks, as well as advanced statistical methods, all of which are in demand in industrial positions world-wide.
Applicants should have a good first degree or a Masters degree in Mathematics, Statistics or Physics, preferably with familiarity with neural networks and an interest in cognitive neuroscience. Familiarity with programming in Matlab or C++ is essential, as are good written and oral communication skills in English. The successful candidate will be expected to contribute towards a team effort, but must also be self-motivated and able to work individually. If you would like to obtain further information, please contact Professor Paulo Lisboa at p.j.lisboa at livjm.ac.uk The deadline for expressions of interest is Friday, 7th July 2000.

From sok at cs.york.ac.uk Mon Jun 26 12:17:24 2000
From: sok at cs.york.ac.uk (Simon E M O'Keefe)
Date: Mon, 26 Jun 2000 17:17:24 +0100
Subject: Research Post in Parallel Implementation of Neural Networks, University of York, UK
Message-ID: <39578214.AFFCE5E8@cs.york.ac.uk>

University of York Department of Computer Science Advanced Computer Architectures Group

Research Post in Parallel Implementation of Neural Networks (Ref: web/6051)

The above post is available immediately to work on an EPSRC-funded project investigating the parallel implementation of binary neural networks. The work will centre on the parallelisation and load-balancing of neural network software on a special-purpose parallel associative neural network machine (Cortex-1) under construction in the department. You will be expected to be proficient in C++ and in programming at the systems level. Knowledge of parallelisation, neural networks, performance evaluation and MPI would be an advantage, but is not essential. The post is available for a period of up to 8 months and the salary will be in the range £16,286 - £20,811 per annum, depending on experience. Further details of the project can be found at http://www.cs.york.ac.uk/arch/nn/aura.html and details of the department can be found at http://www.cs.york.ac.uk/arch/neural/ Informal enquiries can be made to Prof.
Jim Austin at austin at cs.york.ac.uk or on +44 (0)1904 432734. The closing date for applications is 25 July 2000.

--
Jim Austin, Professor of Neural Computation Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK. Tel : 01904 43 2734 Fax : 01904 43 2767 web pages: http://www.cs.york.ac.uk/arch

From avrama at gmu.edu Tue Jun 27 09:16:32 2000
From: avrama at gmu.edu (avrama)
Date: Tue, 27 Jun 2000 09:16:32 -0400 (EDT)
Subject: Post-doctoral position available
Message-ID:

Please post and circulate as you see fit. Many thanks!

NEUROSCIENCE POST-DOCTORAL POSITION AVAILABLE
George Mason University

A post-doctoral position is available beginning in September to work on a project funded by the National Science Foundation. The goal of the project is to completely characterize the differences between Hermissenda type A and type B photoreceptors by measuring the light-induced currents and developing computational models. All highly motivated candidates with a recent PhD (or expecting one in the year 2000) in physiology, neuroscience, or a related discipline are encouraged to apply. Electrophysiology skills are essential, as is the ability and willingness to learn new techniques and concepts. Programming skills and/or experience with modeling packages are desirable but not necessary. The position is ideal for an experimentalist who would like to learn mathematical and computational neuroscience techniques. The post-doc will join the multidisciplinary Invertebrate Neurobiology Laboratory at the Krasnow Institute for Advanced Study and the new School of Computational Sciences of George Mason University, both of which are located in Fairfax, VA (less than 20 miles west of Washington DC). Members of the INL are interested in the biophysical and biochemical mechanisms of long-term memory storage.
In particular, we seek to understand the cellular events underlying the requirement for temporal proximity of stimuli to be associated, and the neural circuits involved in the behavioral expression of memory. The PI has developed software for modeling the biochemical reactions of second messenger dynamics, and the effects of second messengers on channel properties. Please refer to the website for further details: http://www.krasnow.gmu.edu/avrama/invertlab.html

The post-doc will be hired as an assistant research professor with a salary based on the NIH postdoctoral scale (with VA state employee benefits), and will have full access to library, laboratory and computing facilities within both the Krasnow Institute and George Mason University. Application review will begin in August and continue until a suitable candidate is found. Send CV, (p)reprints, a brief description of your motivation, and the names, email addresses and phone/fax numbers of three references to:

Avrama Blackwell, V.M.D., Ph.D.
Krasnow Institute for Advanced Study, MS 2A1
George Mason University
Fairfax, VA 22030
Ph. (703)993-4381 Fax (703)993-4325

Non-resident aliens are welcome to apply. Women and minorities are encouraged to apply. George Mason University is an equal opportunity employer.
From brenner at informatik.uni-freiburg.de Tue Jun 27 11:57:00 2000
From: brenner at informatik.uni-freiburg.de (Michael Brenner)
Date: Tue, 27 Jun 2000 17:57:00 +0200
Subject: German Autumn School on Cognition, Call for Participation
Message-ID: <3958CECC.AEE8816C@informatik.uni-freiburg.de>

---------------------------------------------------------------------------
We apologize in advance if you have received this announcement before
---------------------------------------------------------------------------

CALL FOR PARTICIPATION

German Autumn School on Cognition 2000
September 10-15, 2000
Freiburg, Germany
http://www.fr.vgk.de/herbstschule_2000/

The autumn school offers courses and lectures covering different areas of cognitive science and learning technologies. It is intended for students and scientists from the fields of psychology, computer science, linguistics, computational linguistics, and cognitive and instructional science.

Lecturers: Shaaron Ainsworth, Joost Breuker, Cristiano Castelfranchi, Richard Cooper, Hector Geffner, Jim Hollan, Dietmar Janetzko, Timothy Koschmann, Marcia Linn, Deborah McGuinness, Thomas Metzinger, Bernhard Nebel, Josef Nerb, Werner Nutt, Uwe Oestermeier, Rolf Ploetzner, Lloyd Rieber, Keith Stenning, Gerhard Strube, Peter Yule

Registration is open to all, and is possible electronically through the Autumn School web page (see above for the URL). Registration fees are DM 60,- for early registration (before 1st of July, 2000) and DM 80,- thereafter (DM 80 is approximately 39 US$, as of June 25, 2000). Registration includes admission to all sessions and plenary addresses at the Autumn School 2000 in Freiburg. For additional information on lecture and tutorial topics, schedule, accommodation and travel, please refer to our web pages. Please forward this message to interested colleagues and students.
We look forward to welcoming you in Freiburg. Michael Brenner, Katja Lay, Susanne Thalemann

From aapo at james.hut.fi Fri Jun 30 05:27:53 2000
From: aapo at james.hut.fi (Aapo Hyvarinen)
Date: Fri, 30 Jun 2000 12:27:53 +0300 (EEST)
Subject: New papers on ICA
Message-ID:

Dear All,

The following papers on ICA and related topics are now available on my home page: http://www.cis.hut.fi/aapo/pub.html I would also like to mention that I'm going to give a tutorial on ICA at IJCNN'00 in Como, Italy.

------------------------------------------------------------------------
A. Hyvarinen. Complexity Pursuit: Separating interesting components from time-series. Shorter version appeared in Proc. Int. Workshop on Independent Component Analysis and Blind Signal Separation (ICA2000), Helsinki, Finland, 2000. http://www.cis.hut.fi/aapo/ps/gz/complexity.ps.gz

Abstract: A generalization of projection pursuit for time series, i.e. signals with time structure, is introduced. The goal is to find projections of time series that have interesting structure. We define interestingness using criteria related to Kolmogorov complexity or coding length: interesting signals are those that can be coded with a short code length. We derive a simple approximation of coding length that takes into account both the nongaussianity and the autocorrelations of the time series. We also derive a simple algorithm for its approximate optimization. The resulting method is closely related to blind separation of nongaussian, time-dependent source signals.

------------------------------------------------------------------------
P.O. Hoyer and A. Hyvarinen. Independent Component Analysis Applied to Feature Extraction from Colour and Stereo Images. To appear in Network.
http://www.cis.hut.fi/aapo/ps/gz/Network00.ps.gz

Abstract: Previous work has shown that independent component analysis (ICA) applied to feature extraction from natural image data yields features resembling Gabor functions and simple-cell receptive fields. This article considers the effects of including chromatic and stereo information. The inclusion of colour leads to features divided into separate red/green, blue/yellow, and bright/dark channels. Stereo image data, on the other hand, leads to binocular receptive fields which are tuned to various disparities. The similarities between these results and observed properties of simple cells in primary visual cortex are further evidence for the hypothesis that visual cortical neurons perform some type of redundancy reduction, which was one of the original motivations for ICA. In addition, ICA provides a principled method for feature extraction from colour and stereo images; such features could be used in image processing operations such as denoising and compression, as well as in pattern recognition.

-------------------------------------------------------------------------
A. Hyvarinen and R. Karthikesh. Sparse priors on the mixing matrix in independent component analysis. Proc. ICA2000, Helsinki, Finland. http://www.cis.hut.fi/aapo/ps/gz/ICA00_sp.ps.gz

Abstract: In independent component analysis, prior information on the distributions of the independent components is often used; some weak information is in fact necessary for successful estimation. In contrast, prior information on the mixing matrix is usually not used, because it is considered that the estimation should be completely blind as to the form of the mixing matrix. Nevertheless, it could be possible to find forms of prior information that are sufficiently general to be useful in a wide range of applications.
In this paper, we argue that prior information on the sparsity of the mixing matrix could be a constraint general enough to merit attention. Moreover, we show that the computational implementation of such sparsifying priors on the mixing matrix is very simple, since in many cases they can be expressed as conjugate priors; this means that essentially the same algorithm can be used as in ordinary ICA.

Best Regards, Aapo

----------------------------------------------------
Aapo Hyvarinen
Neural Networks Research Centre
Helsinki University of Technology
P.O.Box 5400, FIN-02015 HUT, Finland
Tel: +358-9-4513278, Fax: +358-9-4513277
Email: Aapo.Hyvarinen at hut.fi
Home page: http://www.cis.hut.fi/~aapo/
----------------------------------------------------
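As a toy illustration of the complexity pursuit idea in the first abstract above: score candidate projections of a multivariate time series by how nongaussian and how autocorrelated they are, and keep the best. The score below is an invented stand-in for the paper's coding-length approximation, and all data are synthetic.

```python
import numpy as np

def interestingness(y):
    """Toy score: |excess kurtosis| (nongaussianity) plus |lag-1 autocorrelation|."""
    y = (y - y.mean()) / y.std()
    kurt = np.mean(y ** 4) - 3.0
    ac1 = np.mean(y[:-1] * y[1:])
    return abs(kurt) + abs(ac1)

# Synthetic data: one structured source (a supergaussian AR(1) signal)
# mixed alongside plain white Gaussian noise.
rng = np.random.default_rng(2)
n = 5000
shocks = rng.laplace(size=n)
s = np.empty(n)
s[0] = shocks[0]
for t in range(1, n):
    s[t] = 0.9 * s[t - 1] + shocks[t]
noise = rng.standard_normal(n)
X = np.vstack([s, noise])

# Crude projection pursuit: sample random unit directions and keep the most
# interesting projection; it should align with the structured source.
best_w, best_score = None, -np.inf
for _ in range(500):
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    score = interestingness(w @ X)
    if score > best_score:
        best_w, best_score = w, score
print(best_w, best_score)
```

The papers above replace both the score (a principled coding-length approximation) and the search (a gradient-style fixed-point algorithm), but the structure of the problem is the same.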