From jwl at discus.anu.edu.au Mon Sep 1 01:26:57 2003 From: jwl at discus.anu.edu.au (John Lloyd) Date: Mon, 1 Sep 2003 15:26:57 +1000 Subject: 3 Year Post at The Australian National University Message-ID: Smart Internet Technology Cooperative Research Centre Smart Personal Assistant Research Program The Australian National University Research School of Information Sciences and Engineering The CRC has a vacant 3 year Research Fellowship at ANU. Applicants should have expertise in one or more of the following areas of research: machine learning, intelligent agents, or computational logic. Details are available at: http://info.anu.edu.au/hr/Jobs/Academic_Positions/_ISE1885.asp Closing date: 19 September 2003 From bio-adit2004-NOSPAM at listes.epfl.ch Mon Sep 1 16:06:55 2003 From: bio-adit2004-NOSPAM at listes.epfl.ch (Bio-ADIT2004) Date: Mon, 1 Sep 2003 22:06:55 +0200 Subject: [Bio-ADIT2004] final CFP and extension of submission deadline: 19 Sept. Message-ID: <91226.AJTLYJAJ@listes.epfl.ch> Bio-ADIT 2004 FINAL CALL FOR PAPERS + 2-WEEK SUBMISSION EXTENSION: SEPTEMBER 19, 2003 The First International Workshop on Biologically Inspired Approaches to Advanced Information Technology January 29 - 30, 2004 Swiss Federal Institute of Technology, Lausanne (EPFL), Switzerland Web site: http://lslwww.epfl.ch/bio-adit2004 Sponsored by - Osaka University Forum - Swiss Federal Institute of Technology, Lausanne (EPFL) - The 21st Century Center of Excellence Program of The Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan, under the Program Title "Opening Up New Information Technologies for Building a Networked Symbiosis Environment" Biologically inspired approaches have already proved successful in achieving major breakthroughs in a wide variety of problems in information technology (IT). 
A more recent trend is to explore the applicability of bio-inspired approaches to the development of self-organizing, evolving, adaptive and autonomous information technologies, which will meet the requirements of next-generation information systems, such as diversity, scalability, robustness, and resilience. These new technologies will become a base on which to build a networked symbiotic environment for a pleasant, symbiotic society of human beings in the 21st century. Bio-ADIT 2004 will be the first international workshop to present original research results in the field of bio-inspired approaches to advanced information technologies. It will also serve to foster the connection between biological paradigms and solutions for building the next-generation information systems. SCOPE: The primary focus of the workshop is on new and original research results in the areas of information systems inspired by biology. We invite you to submit papers that present novel, challenging, and innovative results. The topics include all aspects of bio-inspired information technologies in networks, distributed/parallel systems, hardware (including robotics) and software. We also encourage you to submit papers dealing with: - Self-organizing, self-repairing, self-replicating and self-stabilizing systems - Evolving and adapting systems - Autonomous and evolutionary software and robotic systems - Scalable, robust and resilient systems - Complex biosystems - Gene, protein and metabolic networks - Symbiosis networks SUBMISSION OF PAPERS: Authors are invited to submit complete and original papers. Papers submitted should not have been previously published in any forum, nor be under review for any journal or other conference. All submitted papers will be refereed for quality, correctness, originality and relevance. All accepted papers will be published in the conference proceedings. It is also planned to publish accepted papers as a book. 
Manuscripts should include an abstract and be limited to 16 pages in single-spaced, single-column format. Submissions should include the title, author(s), authors' affiliations, e-mail addresses, fax numbers and postal addresses. In the case of multiple authors, an indication of which author is responsible for correspondence and for preparing the camera-ready paper should also be included. Electronic submission is strongly encouraged. Preferred file formats are PDF (.pdf) or Postscript (.ps). Visit our Web site at http://lslwww.epfl.ch/bio-adit2004/ for more information. Please contact Dr. Murata if you have to submit hard copies. Manuscripts should be submitted by September 19, 2003 through the Bio-ADIT Web site. Please contact the technical program co-chairs for any questions: Professor Auke Jan Ijspeert School of Computer and Communication Sciences Swiss Federal Institute of Technology, Lausanne (EPFL) CH-1015 Lausanne, Switzerland Tel: +41-21-693-2658 Fax: +41-21-693-3705 E-mail: Auke.Ijspeert at epfl.ch Professor Masayuki Murata Cybermedia Center Osaka University 1-32 Machikaneyama, Toyonaka, Osaka 560-0043, Japan Tel: +81-6-6850-6860 Fax: +81-6-6850-6868 E-mail: murata at cmc.osaka-u.ac.jp STUDENT TRAVEL GRANTS A limited number of travel grants will be provided for students attending Bio-ADIT 2004. Details of how to apply for a student travel grant will be posted on the workshop Web site. IMPORTANT DATES: Paper submission deadline : September 19, 2003 (NEW!) Notification of acceptance: November 3, 2003 Camera-ready papers due : December 1, 2003 WEB SITE: An electronic paper submission system has been accepting on-line paper submissions for Bio-ADIT since July 1, 2003. Please visit our Web site at http://lslwww.epfl.ch/bio-adit2004/ for more up-to-date information. 
REGISTRATION: Thanks to the generous support from the sponsors, we expect that the total registration fee for the conference (including pre-proceedings and lunches) will be fixed to only 55.- CHF (Swiss Francs). EXECUTIVE COMMITTEE: General Co-Chairs: Daniel Mange (EPFL, Switzerland) Shojiro Nishio (Osaka Univ., Japan) Technical Program Committee Co-Chairs: Auke Jan Ijspeert (EPFL, Switzerland) Masayuki Murata (Osaka Univ., Japan) Finance Chairs: Marlyse Taric (EPFL, Switzerland) Toshimitsu Masuzawa (Osaka Univ., Japan) Publicity Chairs: Christof Teuscher (EPFL, Switzerland) Takao Onoye (Osaka Univ., Japan) Publications Chair: Naoki Wakamiya (Osaka Univ., Japan) Local Arrangements Chair: Carlos Andres Pena-Reyes (EPFL, Switzerland) Internet Chair: Jonas Buchli (EPFL, Switzerland) TECHNICAL PROGRAM COMMITTEE: Co-Chairs: Auke Jan Ijspeert (EPFL, Switzerland) Masayuki Murata (Osaka Univ., Japan) Members: Michael A. Arbib (Univ. of Southern California, USA) Aude Billard (EPFL, Switzerland) Takeshi Fukuda (IBM Tokyo Research Lab., Japan) Katsuro Inoue (Osaka Univ., Japan) Wolfgang Maass (Graz Univ. of Technology, Austria) Ian W. Marshall (BTexact Technologies, UK) Toshimitsu Masuzawa (Osaka Univ., Japan) Alberto Montresor (Univ. of Bologna, Italy) Stefano Nolfi (ISTC, CNR, Italy) Takao Onoye (Osaka Univ., Japan) Rolf Pfeifer (Univ. of Zurich, Switzerland) Eduardo Sanchez (EPFL, Switzerland) Hiroshi Shimizu (Osaka Univ., Japan) Moshe Sipper (Ben-Gurion Univ., Israel) Gregory Stephanopoulos (MIT, USA) Adrian Stoica (Jet Propulsion Lab., USA) Gianluca Tempesti (EPFL, Switzerland) Naoki Wakamiya (Osaka Univ., Japan) Hans V. Westerhoff (Free University, Amsterdam, The Netherlands) Xin Yao (Univ. of Birmingham, UK) From klaus at prosun.first.fraunhofer.de Tue Sep 2 13:36:59 2003 From: klaus at prosun.first.fraunhofer.de (Klaus-R. 
Mueller) Date: Tue, 2 Sep 2003 19:36:59 +0200 (MEST) Subject: NIPS 2003 Message-ID: <200309021736.h82Haxse016013@prosun.first.fraunhofer.de> Dear colleagues, The NIPS conference program is now available online at www.nips.cc and the Web site is now accepting online registrations. Best wishes, Klaus Robert Mueller NIPS*2003 Publicity Chair &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& Prof. Dr. Klaus Robert Mueller Fraunhofer FIRST.IDA and University of Potsdam Kekulestr. 7 12489 Berlin, Germany Klaus AT first.fhg.de Tel: +49 30 6392 1860 or 1800 Fax: +49 30 6392 1805 http://www.first.fhg.de/persons/Mueller.Klaus-Robert.html &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& From moody at ICSI.Berkeley.EDU Tue Sep 2 13:05:41 2003 From: moody at ICSI.Berkeley.EDU (John Moody) Date: Tue, 2 Sep 2003 10:05:41 -0700 (PDT) Subject: POSTDOC -- Reinforcement Learning & Finance Message-ID: The International Computer Science Institute (ICSI) is seeking to hire a Postdoctoral Fellow to work with Professor John Moody on research in Reinforcement Learning and Computational Finance. This two-year Postdoc opportunity is part of a project funded by the National Science Foundation entitled "Risk, Reward and Reinforcement". This interdisciplinary investigation will explore powerful new algorithms for Direct Reinforcement and the application of these algorithms to competitive games and important, real-world financial problems. Activities will include fundamental algorithm research, extensive simulation and empirical testing. Prototype development will make use of substantial financial market data resources under development at ICSI. The goals are to create highly effective and efficient algorithms for Direct Reinforcement that can discover robust, low-risk solutions to challenging problems. Candidates must have a Ph.D. or comparable research experience in a physical, mathematical or engineering science or a quantitative social science such as economics or finance. 
Strong mathematical, statistical and computational skills are required. Ideally, applicants should have knowledge of machine learning, Monte Carlo methods, time series analysis, optimization, control engineering or quantitative finance. Preference will be given to candidates who are about to complete or have completed a Ph.D. within the past five years. Exceptionally well-qualified non-Ph.D. candidates with three to five years of relevant professional experience may also be considered. ICSI is an independent, nonprofit research institute with headquarters in Berkeley, California. It is closely affiliated with the University of California at Berkeley. ICSI is funded by the U.S. and several European governments, and brings together top researchers from participating countries. This ICSI project will be based in Portland, Oregon, which offers exceptional quality of life and was recently named America's "Best Big City". See http://www.moneymag.com/best/bplive/portland.html . The successful applicant for this postdoc will have opportunities to interact with researchers at ICSI, UC Berkeley, and other leading Northwest and West Coast universities. Interested individuals should email a curriculum vitae, up to three representative publications, names / phones / emails of three to five references, and a cover note describing their research interests, professional goals and availability. Please email materials to moody at icsi.berkeley.edu. Applications received on or before September 15, 2003 will receive priority consideration. We will begin reviewing applications on September 15 and will continue considering candidates until the position is filled. Applicants should be available to begin a two-year postdoc between October 1, 2003 and February 1, 2004. ICSI is an equal opportunity employer; minorities and women are encouraged to apply. 
____________________________________________________________________ John Moody, Professor International Computer Science Institute Berkeley & Portland Tel: +1-503-750-5942 moody at icsi.berkeley.edu FAX: +1-503-531-2026 http://www.icsi.berkeley.edu/~moody ____________________________________________________________________ From nik.kasabov at aut.ac.nz Wed Sep 3 01:35:13 2003 From: nik.kasabov at aut.ac.nz (Nik Kasabov) Date: Wed, 03 Sep 2003 17:35:13 +1200 Subject: NCEI'03: Neurocomputing and Evolving Intelligence 2003 Message-ID: Conference "Neuro-Computing and Evolving Intelligence" 2003 - NCEI'03 20 and 21 November, 2003, Auckland, New Zealand Conference Chair: Prof. Nik Kasabov (nkasabov at aut.ac.nz) Programme Chair: Dr. Zeke Chan (shchan at aut.ac.nz) Contact person: Joyce D'Mello (joyce.dmello at aut.ac.nz) phone: +64 9 917 9504 Technical support: Peter Hwang (peter.hwang at aut.ac.nz) The emphasis of the Conference will be on connectionist-based methods, systems and applications of Evolving Intelligence (EI) - information systems that develop, unfold, evolve their structure and functionality over time through interaction with the environment. These systems evolve their "intelligence" through learning and interaction. The topics of interest include: Novel methods for autonomous learning Novel methods for knowledge engineering and knowledge discovery Evolving intelligence (EI) Evolving molecular processes and their modelling Evolving processes in the brain and their modelling Evolving language and cognition Adaptive speech recognition Adaptive image and multi-modal processing Adaptive decision making Dynamic time-series modelling in a changing environment Adaptive control Adaptive intelligent systems on the WWW Applications in Medicine, Health, Information Technologies, Horticulture, Agriculture, Bio-security, Business and finance, Process and robot control. 
The two-day event will include oral presentations, poster presentations and various demonstrations of neurocomputing systems for bioinformatics and biomedical applications; brain study and cognitive engineering; agriculture; environment; decision support; speech recognition and language processing; image and video processing; multi-modal information processing; and process control. Abstracts of 1 page should be submitted to the Programme Chair before 6th October 2003. The abstracts will be reviewed and authors will be notified of acceptance by 25th October. All accepted abstracts will be published in the conference proceedings. Selected papers will be invited for publication in special issues of two international journals. Registration fee: NZ $300. Bookings and registrations should be made through the contact person Joyce D'Mello (joyce.dmello at aut.ac.nz; phone: 09 917 9504). Conference site: www.kedri.org/NCEI_03.htm ----------------------------------------------------------------------------------------------------------------------------------- Prof. Nik Kasabov, MSc, PhD Fellow RSNZ, FNZCS, SrMIEEE Director, Knowledge Engineering and Discovery Research Institute Chair of Knowledge Engineering, School of Computer and Information Sciences Auckland University of Technology (AUT) phone: +64 9 917 9506 ; fax: +64 9 917 9501 mobile phone: +64 21 488 328 WWW http://www.kedri.info email: nkasabov at aut.ac.nz From daw at cs.cmu.edu Wed Sep 3 16:15:09 2003 From: daw at cs.cmu.edu (Nathaniel Daw) Date: Wed, 3 Sep 2003 16:15:09 -0400 (EDT) Subject: thesis: Reinforcement learning models of the dopamine system and their behavioral implications Message-ID: Dear Connectionists, I thought that some of you might be interested in my recently completed PhD thesis, "Reinforcement learning models of the dopamine system and their behavioral implications," which is available as a (rather large, 4 Mb) pdf download at http://www.cs.cmu.edu/~daw/thesis.pdf An abstract follows. 
best, Nathaniel Daw ABSTRACT This thesis aims to improve theories of how the brain functions and to provide a framework to guide future neuroscientific experiments by making use of theoretical and algorithmic ideas from computer science. The work centers around the detailed understanding of the dopamine system, an important and phylogenetically venerable brain system that is implicated in such general functions as motivation, decision-making and motor control, and whose dysfunction is associated with disorders such as schizophrenia, addiction, and Parkinson's disease. A series of influential models have proposed that the responses of dopamine neurons recorded from behaving monkeys can be identified with the error signal from temporal difference (TD) learning, a reinforcement learning algorithm for learning to predict rewards in order to guide decision-making. Here I propose extensions to these theories that improve them along a number of dimensions simultaneously. The new models that result eliminate several unrealistic simplifying assumptions from the original accounts; explain many sorts of dopamine responses that had previously seemed anomalous; flesh out nascent suggestions that these neurophysiological mechanisms can also explain animal behavior in conditioning experiments; and extend the theories' reach to incorporate proposals about the computational function of several other brain systems that interact with the dopamine neurons. Chapter 3 relaxes the assumption from previous models that the system tracks only short-term predictions about rewards expected within a single experimental trial. It introduces a new model based on average-reward TD learning that suggests that long-run reward predictions affect the slow-timescale, tonic behavior of dopamine neurons. This account resolves a seemingly paradoxical finding that the dopamine system is excited by aversive events such as electric shock, which had fueled several published attacks on the TD theories. 
These investigations also provide a basis for proposals about the functional role of interactions between the dopamine and serotonin systems, and about behavioral data on animal decision-making. Chapter 4 further revises the theory to account for animals' uncertainty about the timing of events and about the moment-to-moment state of an experimental task. These issues are handled in the context of a TD algorithm incorporating partial observability and semi-Markov dynamics; a number of other new or extant models are shown to follow from this one in various limits. The new theory is able to explain a number of previously puzzling results about dopamine responses to events whose timing is variable, and provides an appropriate framework for investigating behavioral results concerning variability in animals' temporal judgments and timescale invariance properties in animal learning. Chapter 5 departs from the thesis' primary methodology of computational modeling to present a complementary attempt to address the same issues empirically. The chapter reports the results of an experiment recording from the striatum of behaving rats (a brain area that is one of the major inputs and outputs of the dopamine system), during a task designed to probe the functional organization of decision-making in the brain. The results broadly support the contention of most versions of the TD models that the functions of action selection and reward prediction are segregated in the brain, as in "actor/critic" reinforcement learning systems. 
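For readers unfamiliar with the formalism, the temporal difference error that these models identify with phasic dopamine responses can be sketched in a few lines. This is a generic TD(0) illustration, not code from the thesis; the state names, reward values, and learning parameters are invented for the example:

```python
# Minimal TD(0) sketch: delta is the "dopamine-like" prediction error.
# States, rewards, alpha and gamma are illustrative assumptions.

def td_update(V, state, next_state, reward, alpha=0.1, gamma=0.9):
    """One TD(0) update of the value table V; returns the TD error."""
    delta = reward + gamma * V[next_state] - V[state]
    V[state] += alpha * delta
    return delta

V = {"cue": 0.0, "reward_time": 0.0, "end": 0.0}

# Repeated cue -> reward pairings: over training, the prediction error
# migrates from the time of reward back to the predictive cue, as in
# the dopamine recordings the TD models address.
for _ in range(100):
    td_update(V, "cue", "reward_time", reward=0.0)
    td_update(V, "reward_time", reward=1.0, next_state="end")
```

After training, `V["reward_time"]` approaches 1 and `V["cue"]` approaches gamma times that value, so a well-predicted reward no longer generates an error while the unexpected cue does.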
From terry at salk.edu Fri Sep 5 20:49:17 2003 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 5 Sep 2003 17:49:17 -0700 (PDT) Subject: NEURAL COMPUTATION 15:10 In-Reply-To: <200308051940.h75JenY17542@purkinje.salk.edu> Message-ID: <200309060049.h860nHD15021@dax.salk.edu> Neural Computation - Contents - Volume 15, Number 10 - October 1, 2003 LETTERS Doubly Distributional Population Codes: Simultaneous Representation of Uncertainty and Multiplicity Maneesh Sahani and Peter Dayan Firing Rate of the Noisy Quadratic Integrate-and-Fire Neuron Nicolas Brunel and Peter E. Latham Pattern Filtering for Detection of Neural Activity, with Examples from HVc Activity During Sleep in Zebra Finches Zhiyi Chi, Peter L. Rauske, and Daniel Margoliash Synaptic Depression Leads to Nonmonotonic Frequency Dependence in the Coincidence Detector Shawn Mikula and Ernst Niebur Probing Changes in Neural Interaction During Adaptation Liqiang Zhu, Ying-Cheng Lai, Frank C. Hoppensteadt and Jiping He Memory Encoding by Theta Phase Precession in the Hippocampal Network Naoyuki Sato and Yoko Yamaguchi A Computational Model as Neurodecoder Based on Synchronous Oscillation in the Visual Cortex Zhao Songnian, Xiong Xiaoyun, Yao Guozheng and Fu Zhi Parameter Estimation of Sigmoid Superpositions: Dynamical System Approach Ivan Tyukin, Cees van Leeuwen, Danil Prokhorov On The Partitioning Capabilities of Feedforward Neural Networks with Sigmoid Nodes K. Koutroumbas ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES

                   USA     Canada*   Other Countries
  Student/Retired  $60     $64.20    $108
  Individual       $95     $101.65   $143
  Institution      $590    $631.30   $638

* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. 
Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From junshuilanl at yahoo.com Sat Sep 6 10:33:33 2003 From: junshuilanl at yahoo.com (Junshui Ma) Date: Sat, 06 Sep 2003 10:33:33 -0400 Subject: Position: Senior Machine Learning Scientist Message-ID: <007701c37483$da1605c0$6400a8c0@apple> Please do *not* reply to this email address. Instead, please reply directly to: Nancy.Schwartz at kornferry.com ============================= Position: Senior Machine Learning Scientist Job Description: Aureon Biosciences Corp., a bioscience company in Yonkers, New York, is looking for a candidate with extensive experience and knowledge in machine learning, statistics, and data mining, including statistical analysis, Bayesian reasoning and learning, combinatorial optimization and learning, neural networks, and kernel-based methods. The candidate is required to have an MS or Ph.D. related to the mentioned areas from an accredited university. Post-doctoral and/or industry experience is strongly preferred. The ideal candidate will be versed in the latest research, methods, developments and theories in these areas, and possess extensive experience applying them to commercial/industrial applications. The candidate is also required to be visionary, creative, and a great problem solver, capable of proposing solutions to multiple problems in parallel. Additional Skills: - Familiarity with various machine learning and statistical software tools. - Excellent programming skills. Candidates must be well versed and experienced in at least two of the following languages: Matlab, S-Plus (or R), SAS, and C/C++. Working experience in Java is a plus. 
From Alain.Destexhe at iaf.cnrs-gif.fr Mon Sep 8 06:46:18 2003 From: Alain.Destexhe at iaf.cnrs-gif.fr (Alain Destexhe) Date: Mon, 08 Sep 2003 12:46:18 +0200 Subject: Call for proposal to host the Advanced Course in Computational Neuroscience Message-ID: <3F5C5DFA.288ABDFC@iaf.cnrs-gif.fr> CALL FOR PROPOSALS The organizing committee of the "European Advanced Course in Computational Neuroscience" is looking for applications for potential sites to host the course for 3 years (2005-2007). The course is now in its eighth year. It was held for three years in Crete (Greece, 1996-1998), three years in Trieste (Italy, 1999-2001), and it is currently being held in the small medieval village of Obidos (Portugal; 2002-2004). Traditionally, the course is held in August in a European (or Associated) country. The ideal site is relatively remote and small (i.e., not a large institution in a big city), in order to ensure intimacy and quietness, and an attractive location to spend the summer. We also need a relatively fast internet connection for the computer network. One of the most important aspects of the course is to have an efficient local organizer to sort out local facilities, such as lodging, food, transport, rooms to hold the lectures and the computer network. We will also need a firm commitment to secure everything for a period of three years (2005-2007). Anyone interested should contact Alain Destexhe (see address below) and will be requested to send details such as a description of the site and an approximate budget for lodging, food, rental of computers, etc. A site visit to the selected locations is planned for the spring of 2004 to decide on the next host. 
Below are contact addresses and a short description of the course CONTACT Alain Destexhe Integrative and Computational Neuroscience Unit CNRS 1, Avenue de la Terrasse (BAT 33) 91198 Gif-sur-Yvette, France email: destexhe at iaf.cnrs-gif.fr Tel: 33-1-69-82-34-35 SHORT DESCRIPTION OF THE COURSE The European Advanced Course in Computational Neuroscience is a high-level 4-week intensive course on the computational aspects of the central nervous system function, from the cellular to the systems level. It is usually structured in 4 thematic weeks, cellular, sensory, motor and multilevel systems. The invited faculty members (usually from 10 to 15 per week) are among the best known scientists in their respective fields (both experimental and theoretical; see web site for past programs). The course is highly selective - we receive from 90 to 180 applications every year, from which 25 to 30 students are selected. Students are mid-term PhD or postdocs, and can be of any background (usually a mixture of experimentalists and theoreticians). The course is intended to give them a solid basis on the different aspects that are important to understand the complexity of the nervous system, as well as the different approaches that have been used in theoretical studies. Students are required to do a research project during the course, and are helped by the faculty and tutors. The selection of students is based on letters of recommendation and the advice of three independent referees. More information is available at our website: http://www.neuroinf.org/courses/EUCOURSE From isabelle at clopinet.com Mon Sep 8 13:38:49 2003 From: isabelle at clopinet.com (Isabelle Guyon) Date: Mon, 08 Sep 2003 10:38:49 -0700 Subject: Feature selection challenge Message-ID: <3F5CBEA9.787E7837@clopinet.com> Dear feature selection researcher, We are launching today a benchmark on feature selection, see: http://www.nipsfsc.ecs.soton.ac.uk/ Deadline: December 1st, 2003. 
Discussion of the benchmark results will take place at a one-day NIPS 2003 workshop on feature extraction (December 11-13, 2003, Whistler, British Columbia, Canada), see http://clopinet.com/isabelle/Projects/NIPS2003/. Good luck! Isabelle Guyon From glanzman at helix.nih.gov Mon Sep 8 11:05:02 2003 From: glanzman at helix.nih.gov (Dennis Glanzman) Date: Mon, 08 Sep 2003 11:05:02 -0400 Subject: Flat text version of Dynamical Neuroscience Flyer Message-ID: <4.3.2.7.2.20030908110405.02cea910@helix.nih.gov> An HTML attachment was scrubbed... URL: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/77a55c69/attachment.html From bayer at cs.orst.edu Tue Sep 9 01:25:56 2003 From: bayer at cs.orst.edu (Valentina Bayer) Date: Mon, 8 Sep 2003 22:25:56 -0700 (PDT) Subject: Ph.D. thesis available: Learning Cost-Sensitive Diagnostic Policies from Data Message-ID: Dear Connectionists, My Ph.D. thesis: Learning Cost-Sensitive Diagnostic Policies from Data is available to download from http://eecs.oregonstate.edu/library/?call=2003-13 Advisor: Prof. Tom Dietterich, Oregon State University The abstract and table of contents follow. Best regards, Valentina Bayer Zubek http://web.engr.oregonstate.edu/~bayer/ ------------- Abstract: In its simplest form, the process of diagnosis is a decision-making process in which the diagnostician performs a sequence of tests culminating in a diagnostic decision. For example, a physician might perform a series of simple measurements (body temperature, weight, etc.) and laboratory measurements (white blood count, CT scan, MRI scan, etc.) in order to determine the disease of the patient. A diagnostic policy is a complete description of the decision-making actions of a diagnostician under all possible circumstances. This dissertation studies the problem of learning diagnostic policies from training examples. 
An optimal diagnostic policy is one that minimizes the expected total cost of diagnosing a patient, where the cost is composed of two components: (a) measurement costs (the costs of performing various diagnostic tests) and (b) misdiagnosis costs (the costs incurred when the patient is incorrectly diagnosed). The optimal policy must perform diagnostic tests until further measurements do not reduce the expected total cost of diagnosis. The dissertation investigates two families of algorithms for learning diagnostic policies: greedy methods and methods based on the AO* algorithm for systematic search. Previous work in supervised learning constructed greedy diagnostic policies that either ignored all costs or considered only measurement costs or only misdiagnosis costs. This research recognizes the practical importance of costs incurred by performing measurements and by making incorrect diagnoses and studies the tradeoff between them. This dissertation develops improved greedy methods. It also introduces a new family of learning algorithms based on systematic search. Systematic search has previously been regarded as computationally infeasible for learning diagnostic policies. However, this dissertation describes an admissible heuristic for AO* that enables it to prune large parts of the search space. In addition, the dissertation shows that policies with better performance on an independent test set are learned when the AO* method is regularized in order to reduce overfitting. Experimental studies on benchmark data sets show that in most cases the systematic search methods produce better diagnostic policies than the greedy methods. Hence, these AO*-based methods are recommended for learning diagnostic policies that seek to minimize the expected total cost of diagnosis. 
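The cost tradeoff at the center of the abstract can be made concrete with a small numeric sketch. This is an illustration of the expected-total-cost objective for a two-class problem with invented numbers, not code from the dissertation:

```python
# Illustrative sketch (assumed numbers, not from the thesis): the
# expected total cost that a diagnostic policy minimizes is the sum of
# measurement costs plus the expected misdiagnosis cost.

def expected_total_cost(measurement_costs, p_disease, misdiagnosis_cost):
    """Cost of running the given tests and then diagnosing the more
    probable class; the chance of being wrong times the misdiagnosis
    cost gives the expected misdiagnosis component."""
    p_error = 1.0 - max(p_disease, 1.0 - p_disease)
    return sum(measurement_costs) + p_error * misdiagnosis_cost

# Before any test: disease probability 0.3, misdiagnosis costs 100.
cost_without_test = expected_total_cost([], 0.3, 100.0)
# A test costing 5 that sharpens the estimate to 0.05 pays for itself:
cost_with_test = expected_total_cost([5.0], 0.05, 100.0)
# An optimal policy keeps testing only while this quantity decreases.
```

Here the untested diagnosis has an expected cost of about 30, while paying 5 for the test lowers it to about 10, so the policy orders the test; a policy stops testing exactly when no further measurement reduces the expected total cost, as the abstract states.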
-------------- Table of Contents: Chapter 1:Introduction Chapter 2: Cost-sensitive Learning (CSL) 2.1 Supervised Learning 2.2 Markov Decision Problems (MDPs) 2.3 Formal Description of the Cost-sensitive Learning Problem as an (Acyclic) MDP 2.4 Example of Diagnostic Policies 2.5 Assumptions and Extensions of Our Cost-sensitive Learning Framework 2.5.1 Complex Attribute Costs and Misclassification Costs 2.5.2 Complex Actions 2.5.3 CSL Problem Changes in Time 2.5.4 Missing Attribute Values 2.5.5 Multiple Classes 2.5.6 Continuous Attributes 2.5.7 Objective Function 2.6 Literature Review for the Cost-sensitive Learning Problem in Machine Learning 2.7 Related Work in Decision-theoretic Analysis 2.8 Summary Chapter 3: Greedy Search for Diagnostic Policies 3.1 General Description of Greedy Algorithms 3.2 InfoGainCost Methods 3.3 Modified InfoGainCost Methods (MC+InfoGainCost) 3.4 One-step Value of Information (VOI) 3.5 Implementation Details for Greedy Algorithms 3.6 Summary Chapter 4: Systematic Search for Diagnostic Policies 4.1 AND/OR Graphs 4.2 AO* Algorithm 4.2.1 Overview of the AO* Algorithm 4.2.2 Admissible Heuristic 4.2.3 Optimistic Values and Optimistic Policy 4.2.4 Realistic Values and Realistic Policy 4.2.5 Selecting a Node for Expansion 4.2.6 Our Implementation of AO* (High Level) 4.2.7 AO* for CSL Problems, With an Admissible Heuristic, Converges to the Optimal Value Function V* 4.2.8 Pseudocode and Implementation Details for the AO* Algorithm 4.3 Regularizers 4.3.1 Memory Limit 4.3.2 Laplace Correction (L) 4.3.3 Statistical Pruning (SP) 4.3.4 Pessimistic Post-Pruning (PPP) Based on Misclassification Costs 4.3.5 Early Stopping (ES) 4.3.6 Dynamic Method 4.3.7 AND/OR Graph Initialized with a Known Policy 4.3.8 Combining Regularizers 4.4 Review of AO* Literature 4.4.1 AO* Relation with A* 4.4.2 AO* Notations, Implementations, and Relation with Branch-and-Bound 4.4.3 Theoretical Results on AO* 4.4.4 POMDPs 4.4.5 Decision-theoretic Analysis 4.4.6 Test Sequencing 
Problem 4.4.7 Relation of CSL with Reinforcement Learning 4.5 Summary Chapter 5: Experimental Studies 5.1 Experimental Setup 5.1.1 UCI Domains 5.1.2 Setting the Misclassification Costs (MC) 5.1.3 Training Data, Test Data, Memory Limit 5.1.4 Notations for the Cost-Sensitive Algorithms 5.1.5 Evaluation Methods 5.2 Overfitting 5.3 Results 5.3.1 Laplace Correction Improves All Algorithms 5.3.2 Results on the bupa Domain 5.3.3 Results on the pima Domain 5.3.4 Results on the heart Domain 5.3.5 Results on the breast-cancer Domain 5.3.6 Results on the spect Domain 5.3.7 Summary of Algorithms' Performance 5.4 Discussion 5.4.1 An Overall Score for Algorithms (Chess Metric) 5.4.2 The Most Robust Algorithms 5.4.3 Comparing The Most Robust Algorithms Against the Best Algorithm on Each Domain 5.4.4 Summary of Discussion 5.4.5 Insights Into the Algorithms' Performance 5.5 Summary Chapter 6: Conclusions and Future Work 6.1 Contributions of This Dissertation 6.2 Future Work 6.3 Thesis Summary Appendices Appendix A: Details on Our AO* Implementation Appendix B: More Information on the Experimental Studies B.1 Misclassification Costs Matrices for the UCI Domains B.2 Comparing the Worst Algorithms in the Systematic and Greedy Search Families B.3 Comparing AO* with All the Other Algorithms using BDeltaCost B.4 Results of Comparing Each Laplace-Corrected Algorithm with All the Other Laplace-corrected Algorithms, on Each Domain and Misclassification Cost Level (MC) B.5 Paired-Graphs Comparing the Best Algorithm on Each Domain with Our Recommended Algorithms From ted.carnevale at yale.edu Tue Sep 9 15:59:28 2003 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Tue, 09 Sep 2003 15:59:28 -0400 Subject: NEURON course at SFN 2003 meeting Message-ID: <3F5E3120.80007@yale.edu> Due to increased support for this year's NEURON course at the SFN 2003 meeting, we are dropping the registration fee to $100. This is the lowest registration fee for this course since 1998. 
However, we can only accept a total of _30_ registrants at this reduced rate. Sign up now to ensure a place in this year's course! USING THE NEURON SIMULATION ENVIRONMENT Satellite Symposium, Society for Neuroscience Meeting 9 AM - 5 PM on Friday, Nov. 7, 2003 Speakers to include M.L. Hines and N.T. Carnevale This one-day course with lectures and live demonstrations will present information essential for teaching and research applications of NEURON, an advanced simulation environment that handles realistic models of biophysical mechanisms, individual neurons, and networks of cells. The emphasis is on practical issues that are key to the most productive use of this powerful and convenient modeling tool. Features that will be covered include:
- constructing and managing models of cells and networks
- importing detailed morphometric data
- expanding NEURON's repertoire of biophysical mechanisms
- database resources for empirically-based modeling
Each registrant will receive a comprehensive set of notes, which includes material that has not appeared elsewhere in print. Registration is limited to _30_ individuals on a first-come, first-served basis. For more information see http://www.neuron.yale.edu/no2003.html --Ted From mm at cse.ogi.edu Tue Sep 9 18:48:17 2003 From: mm at cse.ogi.edu (Melanie Mitchell) Date: Tue, 9 Sep 2003 15:48:17 -0700 Subject: Ph.D. Student Position at OGI Message-ID: <16222.22705.136974.174360@sangre.cse.ogi.edu> Dear Connectionists, The following position announcement is for research on combining evolutionary computation methods with other machine learning methods, as applied to image processing. Persons with neural network experience are strongly encouraged to apply. I have an opening for one graduate research assistant to work on a project applying machine learning methods, including evolutionary algorithms, to image analysis. The applications will be primarily in biomedical domains.
Two major aspects of the research will be (1) how to best combine different machine learning methods and (2) how to automatically incorporate prior knowledge and contextual information in image analysis. Applicants must be willing to pursue a Ph.D. degree in Computer Science and Engineering at the OGI School of Science and Engineering, Oregon Health & Science University, near Portland, Oregon. The department web pages can be found at http://www.cse.ogi.edu. Proficiency in C, C++, Java, or another high-level programming language is required. Background in machine learning, evolutionary computation, image processing and/or computer vision is highly desirable. The assistantship will cover tuition and stipend. To apply, send a resume with your research interests, list of relevant course work or experience, programming experience and languages, and any other information you think would be relevant, and the names and contact information of at least two professors or scientists who will act as references. Please send this information in electronic form to Melanie Mitchell at the e-mail address above. Applications will be considered until the position is filled. Students of any nationality may apply. OGI is an equal opportunity employer and particularly welcomes applications from women and minority candidates. OGI is located 12 miles west of downtown Portland. Portland offers a superb quality of life, with extensive cultural amenities and spectacular natural surroundings, including close proximity to mountains, beaches, and wilderness areas. ----------------------------------- Melanie Mitchell Associate Professor Department of Computer Science and Engineering OGI School of Science & Engineering Oregon Health & Science University 20000 NW Walker Road Beaverton, OR 97006 From Nicolas.Rougier at loria.fr Mon Sep 15 04:13:30 2003 From: Nicolas.Rougier at loria.fr (Nicolas Rougier) Date: Mon, 15 Sep 2003 10:13:30 +0200 Subject: Frontal workshop to be held in Nancy, France. 
20 October 2003. Message-ID: <3F6574AA.9020606@loria> Dear Connectionists, I would like to announce the following workshop on the frontal cortex, to be held in Nancy, France on October 20, 2003. "A multidisciplinary approach to the study of the frontal cortex" Substantial progress has been achieved in the last decade regarding the role of the prefrontal cortex on several grounds: computational modeling, neuropsychology, neurosciences, and neuroimaging, to mention but the most manifest. The complexity of the subject compels a diversity of standpoints. Accordingly, the aim of the present workshop is to propose an unambiguously multidisciplinary approach to the study of the prefrontal cortex. We wish to set off fruitful discussion on how to bridge the gap among different perspectives, by presenting data from diverse fields of research. We think that viewing together some of the multifarious facets of the work on the prefrontal cortex may help to improve our attempts to understand its role. Invited speakers:
Todd Braver (Washington University): "Prefrontal mechanisms of cognitive control"
Vittorio Gallese (Parma University): "From action control to action representation: mirror neurons and the cognitive functions of the premotor cortex"
Etienne Koechlin (Pierre et Marie Curie University): "Organization of executive functions in the human prefrontal cortex"
Michael Kopelman (King's College, London): "Frontal amnesias"
Randall O'Reilly (University of Colorado): "Reinforcement learning of dynamic gating signal in the prefrontal cortex/basal ganglia working memory system"
Registration is free (and mandatory) but places are limited.
Further information available at: http://www.loria.fr/~rougier/workshop.html Detailed program at: http://www.loria.fr/~rougier/workshop-program.html Regards, Nicolas Rougier From Chris.Mellen at gmf.com.au Sun Sep 14 21:02:58 2003 From: Chris.Mellen at gmf.com.au (Chris Mellen) Date: Mon, 15 Sep 2003 11:02:58 +1000 Subject: Research Position at Grinham Managed Funds, Australia Message-ID: <026c01c37b25$1dc59400$0e80a8c0@gmf.com.au> Dear Connectionists, You might find the following of interest. Regards, Christopher Mellen +++++++++++++++++++++++++++++++++++++++++++++++++++ Research Position Grinham Managed Funds Sydney, NSW, Australia. Grinham Managed Funds is one of the Southern Hemisphere's largest hedge fund managers. Managing in excess of $1 billion, we trade in over 40 futures markets across 9 countries, 24 hours a day. We are looking for an individual to fill a newly created permanent research position. The primary task will be to undertake research into the detection and exploitation of robust, statistically significant patterns within financial time series data. This is a task with the potential to encompass a diverse range of research directions; consequently the role will have a broad scope. The successful applicant will have a Ph.D. in physics, statistics, mathematics, computer science, engineering or a related field. All levels of experience will be considered. Competency in software development is essential, and knowledge of one or more of C/C++, R/S+, Matlab/Octave or related languages will be required. Past research experience in any of the fields of complex systems, statistical and numerical analysis, machine learning, pattern recognition, time series modelling or related areas would be highly regarded. Prior knowledge of or experience in finance is, however, not a pre-requisite. Applicants should be willing to work closely with other researchers and with I.T. professionals within the company.
This is an exciting, intellectually challenging and rewarding role for someone with enthusiasm and imagination. The work environment is friendly and informal. Salary will be commensurate with experience. Bonuses are linked to the individual's and to the firm's performance. Individuals who are interested may apply by emailing their resume to: Research at gmf.com.au From anderson at CS.ColoState.EDU Mon Sep 15 18:08:49 2003 From: anderson at CS.ColoState.EDU (Chuck Anderson) Date: Mon, 15 Sep 2003 16:08:49 -0600 Subject: graduate research assistant position in reinforcement learning Message-ID: <3F663871.3070801@cs.colostate.edu> A graduate research assistant position is available starting Spring 2004, for research on combining reinforcement learning, recurrent neural networks, and robust control theory, on a project funded by the National Science Foundation. You may read about this project at http://www.engr.colostate.edu/nnhvac. Proficiency in C, C++, and Matlab is required. Experience with neural networks, feedback control, and Simulink for Matlab is highly desirable. Applicants must be willing to pursue a Ph.D. degree in Computer Science at Colorado State University. Information about the department can be found at http://www.cs.colostate.edu. The assistantship will cover tuition and a stipend. To apply, send a resume describing your education, work experience, and programming experience and languages, plus a statement of your research interests and any other information that might be relevant. Also include the names and contact information for at least two professors or scientists who will serve as references. Please send this information in electronic form to Chuck Anderson at anderson at cs.colostate.edu. Colorado State University is located in Fort Collins, Colorado, which is situated about 60 miles from Denver, alongside the foothills of the Rocky Mountains. Read more about CSU at http://www.colostate.edu.
-- Chuck Anderson associate professor Department of Computer Science anderson at cs.colostate.edu Colorado State University http://www.cs.colostate.edu/~anderson Fort Collins, CO 80523-1873 office: 970-491-7491 From pfbaldi at ics.uci.edu Mon Sep 15 22:05:39 2003 From: pfbaldi at ics.uci.edu (Pierre Baldi) Date: Mon, 15 Sep 2003 19:05:39 -0700 Subject: Postdoctoral Fellowships in Bioinformatics/Computational Biology and Machine Learning at UCI Message-ID: <001601c37bf7$06bb1b90$cd04c380@TIMESLICE2> Please forward this announcement to people you think might be interested. Thank you. ======================================================================= Several NIH-supported postdoctoral positions in the areas of Computational Biology/Bioinformatics and Machine Learning are available in the School of Information and Computer Science (www.ics.uci.edu) and the Institute for Genomics and Bioinformatics (www.igb.uci.edu) at the University of California, Irvine. Areas of particular interest include: protein structure/function prediction, molecular docking and drug design, chemical informatics, comparative genomics, analysis of high-throughput data (e.g. DNA microarray data), gene regulation, systems biology, medical informatics, and all areas of machine learning and large-scale data analysis. Prospective candidates should apply with a cover letter, CV, statement of research interests and accomplishments, and names and email addresses of 3 referees to be sent, preferably by email, to: pfbaldi at ics.uci.edu. The positions are available starting October 1, 2003, and the duration of the appointments is typically 2 years with the possibility of renewal. Relevant faculty in the School include: P. Baldi, R. Dechter, D. Kibler, R. Lathrop, E. Mjolsness, M. Pazzani, P. Smyth, H. Stern, and M. Welling. There are many opportunities for collaboration with life scientists located in other units within short walking distance from the School.
The University of California is an Equal Opportunity Employer committed to excellence through diversity. =========================================================================== Pierre Baldi School of Information and Computer Science and Department of Biological Chemistry, College of Medicine Director Institute for Genomics and Bioinformatics University of California, Irvine Irvine, CA 92697-3425 (949) 824-5809 (949) 824-4056 FAX www.ics.uci.edu/~pfbaldi From fredrik.linaker at ida.his.se Tue Sep 16 09:28:52 2003 From: fredrik.linaker at ida.his.se (Fredrik Linaker) Date: Tue, 16 Sep 2003 15:28:52 +0200 Subject: Ph.D. thesis available: Unsupervised On-line Data Reduction for Memorisation and Learning in Mobile Robotics Message-ID: <3F671014.5090000@ida.his.se> Dear Connectionists, My Ph.D. thesis: Unsupervised On-line Data Reduction for Memorisation and Learning in Mobile Robotics is now available for download: http://www.ida.his.se/~fredrik/publications/linaker_thesis2003.pdf http://www.ida.his.se/~fredrik/publications/linaker_thesis2003.ps.gz Supervisor: Prof. Noel Sharkey, University of Sheffield, UK An abstract follows. I'd be very interested in information about open post-doc positions within the learning and/or robotics areas. Best regards, Fredrik Linaker fredrik.linaker at ida.his.se ABSTRACT The amount of data available to a mobile robot controller is staggering. This thesis investigates how extensive continuous-valued data streams of noisy sensor and actuator activations can be stored, recalled, and processed by robots equipped with only limited memory buffers. We address three robot memorisation problems, namely Route Learning (store a route), Novelty Detection (detect changes along a route) and the Lost Robot Problem (find best match along a route or routes). A robot learning problem called the Road-Sign Problem is also addressed. It involves a long-term delayed response task where temporal credit assignment is needed. 
The limited memory buffer entails a trade-off between memorisation and learning. A traditional overall data compression could be used for memorisation, but the compressed representations are not always suitable for subsequent learning. We present a novel unsupervised on-line data reduction technique which focuses on change detection rather than overall data compression. It produces reduced sensory flows which are suitable for storage in the memory buffer while preserving underrepresented inputs. Such inputs can be essential when using temporal credit assignment for learning a task. The usefulness of the technique is evaluated through a number of experiments on the identified robot problems. Results show that a learning ability can be introduced while at the same time maintaining memorisation capabilities. The essentially symbolic representation resulting from the unsupervised on-line reduction could, by extension, also help bridge the gap between the raw sensory flows and the symbolic structures useful in prediction and communication. http://www.ida.his.se/~fredrik/ From a.hussain at cs.stir.ac.uk Tue Sep 16 08:23:37 2003 From: a.hussain at cs.stir.ac.uk (Dr. Amir Hussain) Date: Tue, 16 Sep 2003 13:23:37 +0100 Subject: Call for Papers: Control & Intelligent Systems Journal Special Issue on "Non-linear Adaptive PID Control" References: <001601c37bf7$06bb1b90$cd04c380@TIMESLICE2> Message-ID: <3F6700C9.E15B3E85@cs.stir.ac.uk> Dear all, Please forward this CFP announcement to people you think might be interested. Thank you. ======================================================================= Call for Papers Special Issue on "NON-LINEAR ADAPTIVE PID CONTROL" International Journal of Control and Intelligent Systems (CIS), IASTED / ACTA Press, Vol.
33, 2005 CIS Journal Special Issue Website: http://www.actapress.com/journals/specialci2.htm It is well known that three-term proportional-integral-derivative (PID) controllers are quite versatile and widely used, particularly in the process control industries. This dominance is due both to the robustness of these controllers across a wide range of processes and to the simplicity of their structure. They are easy to implement in either digital or analog hardware, easily understood by technical personnel, and remarkably effective in regulating a wide range of applications when properly tuned. For most simple processes, PID control can achieve satisfactory closed-loop performance. However, many industrial processes possess complex properties such as nonlinear and time-varying characteristics. In this context, conventional PID controllers with fixed parameters have a major drawback: they need to be re-tuned in order to adapt to variations in plant dynamics and environments, otherwise control performance degrades. For the control of real-world processes, an alternative is to use adaptive control schemes, which can automatically adjust the PID parameters on-line. However, before such adaptive controllers can be implemented, the plant has to be modeled mathematically, usually under the assumption that the process to be controlled is linear. Since most real-world plants are nonlinear, their responses cannot be shaped to a desired performance using linear controllers, and inaccuracies in modeling real-world plants result in degraded performance of linear adaptive controllers. Hence, conventional linear adaptive controllers are of limited applicability and not ideally suited to nonlinear and complex real-world processes.
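Since the call centers on the classical three-term control law described above, a minimal discrete-time PID loop may help fix ideas. This is an illustrative sketch only: the `PID` class, the gains, the sample time, and the toy first-order plant are hypothetical choices for illustration, not anything prescribed by this call.

```python
# Minimal fixed-gain discrete-time PID sketch (hypothetical example;
# gains, sample time, and plant are illustrative choices only).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # running integral of the error
        self.prev_error = 0.0    # previous error, for the derivative term

    def update(self, setpoint, measurement):
        """Return one control output u = Kp*e + Ki*integral(e) + Kd*de/dt."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Regulate a toy linear plant x' = -x + u toward the setpoint x = 1,
# integrating with simple Euler steps.
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01)
x = 0.0
for _ in range(2000):  # simulate 20 seconds
    u = pid.update(1.0, x)
    x += (-x + u) * pid.dt
```

In the nonlinear adaptive schemes this special issue targets, the fixed gains kp, ki and kd above would instead be adjusted on-line, for example by a neural or fuzzy tuner, as the plant dynamics drift.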
Consequently, it may be argued that nonlinear adaptive controllers are required to effectively control such plants, and it is necessary to incorporate the inherent nonlinearity of the process into the controller design. One of the main difficulties in designing nonlinear controllers, however, is the lack of a general structure for them. In recent years, computational-intelligence techniques such as artificial neural networks, fuzzy logic, genetic algorithms, combined neuro-fuzzy approaches, and other nonlinear and biologically inspired techniques have become valuable tools to describe and control nonlinear plants. More recently, however, there has been a growing interest among academic and industrial researchers in new types of controllers, which can combine the approximation power of such nonlinear adaptive techniques with the simplicity of PID control structures. This special issue of the International Journal, Control & Intelligent Systems, is an attempt to bring together active researchers in this emerging field of nonlinear adaptive PID control. Scope: The application domain includes (but is not limited to):
- Neuro-PID control
- Fuzzy-PID control
- Neuro-fuzzy PID control
- General nonlinear techniques, including nonlinear predictive- and minimum variance-based PID control
- Design, development, stability, and robustness issues of these techniques
- Practical applications
Any topic relevant to Nonlinear Adaptive PID Control will be considered. Instructions for Manuscripts: All manuscripts should be e-mailed to the Guest Editor of the Special Issue at a.hussain at cs.stir.ac.uk by Dec 15, 2003. On the e-mail subject line please indicate "Submission for CIS Special Issue Nonlinear Adaptive PID Control." The submission should include the name(s) of the author(s), their affiliation, addresses, fax numbers, and e-mail addresses.
Manuscripts should strictly follow the guidelines of ACTA Press, given at the following website: http://www.actapress.com/journals/submission.htm Important Dates Deadline for paper submission: December 15, 2003 Notification of acceptance: April 30, 2004 Final manuscripts due: July 15, 2004 Publication in Special Issue: Early 2005 Guest Editor: Dr. Amir Hussain Department of Computing Science & Mathematics University of Stirling Stirling FK9 4LA Scotland, UK E-mail: a.hussain at cs.stir.ac.uk http://www.cs.stir.ac.uk/~ahu/ Tel/Fax: (++44) 01786-467437/464551 -- The University of Stirling is a university established in Scotland by charter at Stirling, FK9 4LA. Privileged/Confidential Information may be contained in this message. If you are not the addressee indicated in this message (or responsible for delivery of the message to such person), you may not disclose, copy or deliver this message to anyone and any action taken or omitted to be taken in reliance on it, is prohibited and may be unlawful. In such case, you should destroy this message and kindly notify the sender by reply email. Please advise immediately if you or your employer do not consent to Internet email for messages of this kind. From Anthony.Pipe at uwe.ac.uk Wed Sep 17 00:06:01 2003 From: Anthony.Pipe at uwe.ac.uk (Tony Pipe) Date: Tue, 16 Sep 2003 16:06:01 -1200 Subject: CFP: Track on Applications of Neural Networks at Flairs 04 in Miami Florida Message-ID: <005301c37cd1$01f66580$55800ba4@HP29375324311> Neural Network Applications: Special Track at the 17th International FLAIRS Conference In cooperation with the American Association for Artificial Intelligence Palms South Beach Hotel Miami Beach, FL May 17-19, 2004 -------------------------------------------------------------------------------- Call for Papers Papers are being solicited for a special track on Neural Network Applications at the 17th International FLAIRS Conference (FLAIRS-2004).
The special track will be devoted to applications of Neural Networks, with the aim of presenting new and important contributions in this area. These application areas include, but are not limited to, the following:
- Vision
- Pattern Recognition
- Control and Process Monitoring
- Biomedical Applications
- Robotics
- Speech Recognition
- Text Mining
- Diagnostic Problems
- Telecommunications
- Power Systems
- Signal Processing
- Image Processing
-------------------------------------------------------------------------------- Submission Guidelines Interested authors must submit completed manuscripts by October 24, 2003. Submissions should be no more than 6 pages (4000 words) in length, including figures, and contain no identifying reference to self or organization. Papers should be formatted according to AAAI Guidelines. Submission instructions can be found at the FLAIRS-04 website at http://www.flairs.com/flairs2004. Notification of acceptance will be mailed around January 7, 2004. Authors of accepted papers will be expected to submit the final camera-ready copies of their full papers by February 6, 2004 for publication in the conference proceedings, which will be published by AAAI Press. Authors may be invited to submit a revised copy of their paper to a special issue of the International Journal on Artificial Intelligence Tools (IJAIT).
-------------------------------------------------------------------------------- FLAIRS 2004 Invited Speakers
- Justine Cassell (Massachusetts Institute of Technology)
- Edward Feigenbaum (Stanford University)
- Jim Hendler (University of Maryland)
- Tom Mitchell (Carnegie Mellon University)
-------------------------------------------------------------------------------- Important Dates
- Paper submissions due: October 24, 2003
- Notification letters sent: January 7, 2004
- Camera-ready copy due: February 6, 2004
-------------------------------------------------------------------------------- Special Track Committee (Tentative)
Ingrid Russell (Co-Chair), University of Hartford, USA
Tony Pipe (Co-Chair), University of the West of England, UK
Brian Carse (Co-Chair), University of the West of England, UK
Jim Austin, University of York, UK
Vijayakumar Bhagavatula, Carnegie Mellon University, USA
Serge Dolenko, Moscow State University, Russia
Okan Ersoy, Purdue University, USA
Michael Georgiopoulos, University of Central Florida, USA
Mike James, York University, UK
John Kolen, University of West Florida, USA
Lisa Meeden, Swarthmore College, USA
Sergio Roa, National University of Colombia, Colombia
Roberto Santana, Institute of Cybernetics, Mathematics and Physics (ICIMAF), Cuba
Bernhard Sendhoff, Honda Research and Development Europe, Offenbach/Main, Germany
C. N. Schizas, University of Cyprus, Cyprus
Wai Sum Tang, Chinese University of Hong Kong, Hong Kong
Stefan Wermter, University of Sunderland, UK
Hujun Yin, University of Manchester Institute of Science and Technology, UK
-------------------------------------------------------------------------------- Further Information Questions regarding the special track should be addressed to: Tony Pipe Voice: +44-117-344-2818 Fax: +44-117-344-3800 email: Anthony.Pipe at uwe.ac.uk From dimi at ci.tuwien.ac.at Wed Sep 17 09:30:19 2003 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Wed, 17 Sep 2003 15:30:19 +0200 (CEST) Subject: CI BibTeX Collection -- Update Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 6/5-7/4 IEEE Transactions on Fuzzy Systems, Volumes 10/5-11/4 IEEE Transactions on Neural Networks, Volumes 13/6-14/4 Machine Learning, Volumes 50/1-2-53/1-2 Neural Computation, Volumes 14/10-15/9 Neural Networks, Volumes 15/8-9-16/6
Neural Processing Letters, Volumes 16/3-17/2 Most files have been converted automatically from various source formats, please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/services/BibTeX.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut fuer Statistik * Tel: (+43 1) 58801 10773 * * Technische Universitaet Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ From djaeger at emory.edu Sat Sep 20 12:38:05 2003 From: djaeger at emory.edu (Dieter Jaeger) Date: Sat, 20 Sep 2003 12:38:05 -0400 Subject: Postdoctoral Position in Computational Neuroscience Message-ID: <3F6C826D.2F419699@emory.edu> A funded postdoctoral opening in the area of computational neuroscience is available in my laboratory at Emory University, Atlanta. The research project is aimed at elucidating the operation of the deep cerebellar nuclei using whole cell recordings in slices and compartmental modeling. This work will build on our previous publications in this area (Gauck and Jaeger, J. Neurosci. 2000; 2003). The Neuroscience environment at Emory University is excellent, and living in Atlanta features a large international community and plenty of activities. Candidates should have previous experience in intracellular electrophysiology and/or compartmental modeling. Interested candidates should contact djaeger at emory.edu for further details. -Dieter Jaeger Associate Professor Emory University Department of Biology 1510 Clifton Rd. 
Atlanta, GA 30322 Tel: 404 727 8139 Fax: 404 727 2880 e-mail: djaeger at emory.edu From ishikawa at brain.kyutech.ac.jp Sun Sep 21 04:23:30 2003 From: ishikawa at brain.kyutech.ac.jp (Masumi Ishikawa) Date: Sun, 21 Sep 2003 17:23:30 +0900 Subject: 2004 Special Issue of Neural Networks Message-ID: <5.0.2.5.2.20030921171938.00e14008@mail.brain.kyutech.ac.jp> ---------------------------------------------------------- We apologize if you receive multiple copies of this email. ---------------------------------------------------------- Call for Papers 2004 Special Issue of Neural Networks New Developments in Self-Organizing Systems Research on self-organizing systems including self-organizing maps (SOMs) is an important area of unsupervised learning and has been rapidly growing in various directions: theoretical developments, applications in various fields, in-depth analysis of self-organizing systems in neuroscience and so forth. The Workshop on Self-Organizing Maps (WSOM) has been held biennially since 1997. In October, 2002, a Special Issue on Self-Organizing Maps was published in the journal Neural Networks selected from presentations at WSOM'01. The latest workshop, WSOM'03, was held in September 2003 in Kitakyushu, Japan (for details, see http://www.brain.kyutech.ac.jp/~wsom). Considering the extensive growth of this area, we plan another special issue related to self-organization in 2004, ranging from theoretical aspects to various applications. We will select papers from those submitted to this special issue of Neural Networks; papers will be either revisions of those presented at WSOM'03 or those directly submitted to the special issue. A limited number of invited papers by leading scientists are also planned. 
Technical areas include, but are not limited to:
- Theory of self-organizing systems
- Data visualization and mining
- Applications to WEB intelligence
- Applications to text and document analysis
- Applications to robotics
- Applications to image processing and vision
- Applications to pattern recognition
- Hardware and architecture
- Self-organizing systems in neuroscience
Guest-Editors Masumi Ishikawa, Kyushu Institute of Technology Risto Miikkulainen, The University of Texas at Austin Helge Ritter, University of Bielefeld Submission Deadline for submission: December 10, 2003 Notification of acceptance: April 15, 2004 Deadline for submission of accepted papers: June 20, 2004 Deadline for submission of final papers: August 30, 2004 Format: as normal papers in the journal Address for Papers: Dr. Mitsuo Kawato ATR Computational Neuroscience Laboratories 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan From owen at stat.Stanford.EDU Tue Sep 23 12:06:44 2003 From: owen at stat.Stanford.EDU (Art Owen) Date: Tue, 23 Sep 2003 09:06:44 -0700 Subject: No subject Message-ID: Dear Connectionists, I'm helping Stanford's School of Education recruit a faculty member with an interest in statistics. They're looking for somebody who is:
1) at the assistant or associate professor level
2) a good teacher for bright non-mathematical grad students
3) able to handle large data sets featuring missing values, selection biases, non-tabular structures, etc.
4) interested in policy issues
Recently there seems to have been a sharp increase in the use of quantitative methods in the social sciences. They, like everybody else, are getting floods of data. There is a copy of the position advertisement at: www-stat.stanford.edu/~owen/EdPos.pdf Please read it carefully if you're interested in applying. Note that it stresses excellence in teaching, and methodology relevant to the social sciences.
-Art Owen _____________________________________________________________________ Art Owen Tel: 650.725.2232 Dept. of Statistics Fax: 650.725.8977 Sequoia Hall, 130 art "AT" stat.stanford.edu Stanford, CA 94305 www.stanford.edu/~owen From r.p.w.duin at tnw.tudelft.nl Wed Sep 24 11:09:49 2003 From: r.p.w.duin at tnw.tudelft.nl (Bob Duin) Date: Wed, 24 Sep 2003 17:09:49 +0200 Subject: Four positions in Delft Pattern Recognition Research Message-ID: <20030924150948.GA29364@ph.tn.tudelft.nl> Dear colleagues, We have at this moment a number of open positions in pattern recognition. We are looking for PhD students (receiving a full salary) as well as post-docs. Applicants should have a solid academic background in physics, electrical engineering, computer science or mathematics. Post-docs should have recently finished their PhD research. Experience in multi-dimensional data analysis, pattern recognition, machine learning or image processing is an advantage. Good programming skills are highly desirable. For one of the PhD positions a strong mathematical background is needed. For more information have a look at: http://www.ph.tn.tudelft.nl/~duin/vacancies.html Sincerely, Robert P.W. Duin -- R.P.W. Duin Phone: (31) 15 2786143 Faculty of Applied Sciences Fax: (31) 15 2786740 Delft University of Technology mailto: r.p.w.duin at tnw.tudelft.nl P.O.
Box 5046, 2600 GA Delft http://www.ph.tn.tudelft.nl/~duin The Netherlands From calls at bbsonline.org Wed Sep 24 12:41:35 2003 From: calls at bbsonline.org (Behavioral & Brain Sciences) Date: Wed, 24 Sep 2003 17:41:35 +0100 Subject: Burns/An evolutionary theory of schizophrenia: BBS Call for Commentators Message-ID: Below is a link to the forthcoming BBS target article An evolutionary theory of schizophrenia: Cortical connectivity, metarepresentation and the social brain by Jonathan Kenneth Burns http://www.bbsonline.org/Preprints/Burns/Referees/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or suggested by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to suggest someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. An electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. 
(In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) ======================================================================= ** IMPORTANT ** ======================================================================= To help us put together a balanced list of commentators, it would be most helpful if you would send us an indication of the relevant expertise you would bring to bear on the paper, and what aspect of the paper you would anticipate commenting upon. (Please note that we only request expertise information in order to simplify the selection process.) Please DO NOT prepare a commentary until you receive a formal invitation, indicating that it was possible to include your name on the final list, which is constructed so as to balance areas of expertise and frequency of prior commentaries in BBS. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable at the URL that follows the abstract and keywords below. ======================================================================= ======================================================================= An evolutionary theory of schizophrenia: Cortical connectivity, metarepresentation and the social brain Jonathan Kenneth Burns University of Edinburgh ABSTRACT: Schizophrenia is a worldwide prevalent disorder with a multifactorial but highly genetic aetiology. A constant prevalence rate in the face of reduced fecundity has caused some to argue that an evolutionary advantage exists in unaffected relatives. This adaptationist approach is critiqued and Crow's 'speciation' hypothesis is reviewed and found wanting. In keeping with available biological and psychological evidence, an alternative theory of the origins of this disorder is proposed. Schizophrenia is a disorder of the social brain and exists as a costly trade off in the evolution of complex social cognition. 
Paleoanthropological and comparative primate research suggests that hominids evolved complex cortical interconnectivity (in particular fronto-temporal and fronto-parietal circuits) in order to regulate social cognition and the intellectual demands of group living. I suggest that the ontogenetic mechanism underlying this cerebral adaptation was sequential hypermorphosis and that it rendered the hominid brain vulnerable to genetic and environmental insults. I argue that changes in genes regulating the timing of neurodevelopment occurred, prior to the migration of H. sapiens out of Africa 150,000-100,000 years ago, giving rise to the schizotypal spectrum. While some individuals within this spectrum may have exhibited unusual creativity and iconoclasm, this phenotype was not necessarily adaptive in reproductive terms. However, because the disorder shared a common genetic basis with the evolving circuitry of the social brain, it persisted. Thus schizophrenia emerged as a costly trade off in the evolution of complex social cognition. KEYWORDS: cortical connectivity, evolution, heterochrony, metarepresentation, primates, psychiatry, schizophrenia, social brain, social cognition http://www.bbsonline.org/Preprints/Burns/Referees/ ======================================================================= ======================================================================= *** SUPPLEMENTARY ANNOUNCEMENT *** (1) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review.
(Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Please note: Your email address has been added to our user database for Calls for Commentators, the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password. Or, email a response with the word "remove" in the subject line. *-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-* Paul Bloom - Editor Barbara Finlay - Editor Jeffrey Gray - Editor Behavioral and Brain Sciences bbs at bbsonline.org http://www.bbsonline.org ------------------------------------------------------------------- From cindy at bu.edu Thu Sep 25 09:28:00 2003 From: cindy at bu.edu (Cynthia Bradford) Date: Thu, 25 Sep 2003 09:28:00 -0400 Subject: Call for Papers: 2004 Special Issue of Neural Networks on Vision and Brain Message-ID: <021d01c38368$d7fa2c00$903dc580@cnspc31> ***** FINAL REMINDER ***** CALL FOR PAPERS 2004 Special Issue VISION AND BRAIN Understanding how the brain sees is one of the most active and exciting areas in perceptual science, neuroscience, and modeling. This is because vision is one of our most important sources of information about the world, and a large amount of brain is used to process visual signals, ranging from early filtering processes through perceptual grouping, surface formation, depth perception, texture perception, figure-ground separation, motion perception, navigation, search, and object recognition. 
This Special Issue will incorporate invited and contributed articles focused on recent experimental and modeling progress in unifying physiological, psychophysical and computational mechanisms of vision. The Special Issue will also include articles that summarize biologically inspired approaches to computer vision in technology, including hardware approaches to realizing neuromorphic vision algorithms. CO-EDITORS: Professor David Field, Cornell University Professor Leif Finkel, University of Pennsylvania Professor Stephen Grossberg, Boston University SUBMISSION: Deadline for submission: September 30, 2003 Notification of acceptance: January 31, 2004 Format: no longer than 10,000 words; APA reference format ADDRESS FOR SUBMISSION: Stephen Grossberg, Editor Neural Networks Department of Cognitive and Neural Systems Boston University 677 Beacon Street, Room 203 Boston, Massachusetts 02215 USA From jose at tractatus.rutgers.edu Thu Sep 25 10:15:45 2003 From: jose at tractatus.rutgers.edu (Stephen J. Hanson) Date: 25 Sep 2003 10:15:45 -0400 Subject: GRADUATE FELLOWSHIPS and RESEARCH ASSISTANT POSITIONS - RUTGERS U. RUMBA LABS Message-ID: <1064499345.2076.40.camel@vaio> RUTGERS UNIVERSITY (NEWARK CAMPUS)-- RUMBA LABORATORIES-ADVANCED IMAGING CENTER (UMDNJ/RUTGERS)--RESEARCH ASSISTANTS/GRADUATE FELLOWSHIPS. IMMEDIATE OPENINGS Research in cognitive neuroscience, category learning, and event perception, using magnetic resonance imaging and electrophysiological techniques. These research assistant positions are intended to lead to competitive ($20k + tuition) GRADUATE FELLOWSHIPS in Cognitive Science/Cognitive Neuroscience. Experience in standard neuroimaging software (SPM, AFNI, VoxBo, etc.) is important. Background in experimental psychology or cognitive science (BA/BS required), neuroscience and statistics would be helpful. Strong computer skills are a plus. An excellent opportunity for someone bound for graduate school in cognitive science, cognitive neuroscience or medicine.
Send by email a CV with a description of research experience and the names of three references to: rumbalabs at psychology.rutgers.edu (see www.rumba.rutgers.edu for more information) -- Stephen J. Hanson From jose at tractatus.rutgers.edu Thu Sep 25 09:40:29 2003 From: jose at tractatus.rutgers.edu (Stephen J. Hanson) Date: 25 Sep 2003 09:40:29 -0400 Subject: FACULTY COG SCI/COGNITIVE NEURO RUTGERS UNIVERSITY (NEWARK CAMPUS) Message-ID: <1064497229.1938.12.camel@vaio> RUTGERS UNIVERSITY (NEWARK CAMPUS), PSYCHOLOGY DEPARTMENT, COGNITIVE SCIENCE, COGNITIVE NEUROSCIENCE The Department of Psychology anticipates making one tenure-track, Associate Professor level appointment (will consider a junior faculty appointment near tenure) in the area of COGNITIVE SCIENCE/NEUROSCIENCE. In particular we are seeking individuals in any of the following THREE areas: LEARNING, COMPUTATIONAL NEUROSCIENCE, or SOCIAL NEUROSCIENCE (interests in NEUROIMAGING in any of these areas would also be a plus, since the Department, in conjunction with UMDNJ, jointly administers a 3T Neuroimaging Center; see http://www.newark.rutgers.edu/fmri/). The successful candidate is expected to develop and maintain an active, externally funded research program, and to teach at both the graduate and undergraduate levels. Review of applications will begin JANUARY 30th 2004, pending final budgetary approval from the administration. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are encouraged to apply. Please send a CV, a statement of current and future research interests, and three letters of recommendation to COGNITIVE SCIENCE SEARCH COMMITTEE, Department of Psychology, Rutgers University, Newark, NJ 07102. Email enquiries can be made to cogsci at psychology.rutgers.edu. -- Stephen J.
Hanson From cns at cns.bu.edu Fri Sep 26 14:09:00 2003 From: cns at cns.bu.edu (CNS Department) Date: Fri, 26 Sep 2003 14:09:00 -0400 Subject: GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY Message-ID: <3F7480BC.8010201@cns.bu.edu> PLEASE POST ******************************************************************* GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY ******************************************************************* The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The brochure may also be viewed on line at: http://www.cns.bu.edu/brochure/ and application forms at: http://www.bu.edu/cas/graduate/application.html Applications for Fall 2004 admission and financial aid are now being accepted for PhD, MA, and BA/MA degree programs. To obtain a brochure describing CNS programs and a set of application materials, write, telephone, or fax: DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS Boston University 677 Beacon Street Boston, MA 02215 617/353-9481 (phone) 617/353-7755 (fax) or send via email your full name and mailing address to the attention of Mr. Robin Amos at: amos at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) general test scores. 
GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. ******************************************************************* Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students and qualified undergraduates interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The department's training and research focus on two broad questions. The first question is: How does the brain control behavior? This is a modern form of the Mind/Body Problem. The second question is: How can technology emulate biological intelligence? This question needs to be answered to develop intelligent technologies that are well suited to human societies. These goals are symbiotic because brains are unparalleled in their ability to intelligently adapt on their own to complex and novel environments. Models of how the brain accomplishes this are developed through systematic empirical, mathematical, and computational analysis in the department. Autonomous adaptation to a changing world is also needed to solve many of the outstanding problems in technology, and the biological models have inspired qualitatively new designs for applications. CNS is a world leader in developing biological models that can quantitatively simulate the dynamics of identified brain cells in identified neural circuits, and the behaviors that they control. This new level of understanding is producing comparable advances in intelligent technology. CNS is a graduate department that is devoted to the interdisciplinary training of graduate students. 
The department awards MA, PhD, and BA/MA degrees. Its students are trained in a broad range of areas concerning computational neuroscience, cognitive science, and neuromorphic systems. The biological training includes study of the brain mechanisms of vision and visual object recognition; audition, speech, and language understanding; recognition learning, categorization, and long-term memory; cognitive information processing; self-organization and development, navigation, planning, and spatial orientation; cooperative and competitive network dynamics and short-term memory; reinforcement and motivation; attention; adaptive sensory-motor planning, control, and robotics; biological rhythms; consciousness; mental disorders; and the mathematical and computational methods needed to support advanced modeling research and applications. Technological training includes methods and applications in image processing, multiple types of signal processing, adaptive pattern recognition and prediction, information fusion, and intelligent control and robotics. The foundation of this broad training is the unique curriculum of seventeen interdisciplinary graduate courses that have been developed at CNS. Each of these courses integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of artificial neural networks and hybrid systems to technology. A student's curriculum is tailored to his or her career goals with academic and research advisors. In addition to taking interdisciplinary courses within CNS, students develop important disciplinary expertise by also taking courses in departments such as biology, computer science, engineering, mathematics, and psychology.
In addition to these formal courses, students work individually with one or more research advisors to learn how to carry out advanced interdisciplinary research in their chosen research areas. As a result of this breadth and depth of training, CNS students have succeeded in finding excellent jobs in both academic and technological areas after graduation. The CNS Department interacts with colleagues in several Boston University research centers, and with Boston-area scientists collaborating with these centers. The units most closely linked to the department are the Center for Adaptive Systems and the CNS Technology Laboratory. Students interested in neural network hardware can work with researchers in CNS and at the College of Engineering. Other research resources include distinguished research groups in the campus-wide Program in Neuroscience, which unites cognitive neuroscience, neurophysiology, neuroanatomy, neuropharmacology, and neural modeling across the Charles River Campus and the Medical School; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. Key colleagues in these units hold joint appointments in CNS in order to expedite training and research interactions with CNS core faculty and students. In addition to its basic research and training program, the department organizes an active colloquium series, various research and seminar series, and international conferences and symposia, to bring distinguished scientists from experimental, theoretical, and technological disciplines to the department.
The department is housed in its own four-story building, which includes ample space for faculty and student offices and laboratories (computational neuroscience, visual psychophysics, psychoacoustics, speech and language, sensory-motor control, neurobotics, computer vision, and technology), as well as an auditorium, classroom, seminar rooms, a library, and a faculty-student lounge. The department has a powerful computer network for carrying out large-scale simulations of behavioral and brain models and applications. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior Helen Barbas Professor, Department of Health Sciences, Sargent College PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development Gail A. Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems PhD, Mathematics, University of Wisconsin, Madison Learning and memory, vision, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations Michael A. Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems, cardiovascular oscillations physiology and time series H. 
Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, auditory virtual environments, signal processing models of hearing Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology John C. Fiala Research Assistant Professor of Biology PhD, Cognitive and Neural Systems, Boston University Synaptic plasticity, dendrite anatomy and pathology, motor learning, robotics, neuroinformatics Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Vision, audition, language, learning and memory, reward and motivation, cognition, development, sensory-motor control, mental disorders, applications Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, biological sensory-motor control and functional brain imaging Catherine L. Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition Michael E.
Hasselmo Associate Professor of Psychology Director of Graduate Studies, Psychology Department PhD, Experimental Psychology, Oxford University Computational modeling and experimental testing of neuromodulatory mechanisms involved in encoding, retrieval and consolidation Allyn Hubbard Associate Professor of Electrical and Computer Engineering PhD, Electrical Engineering, University of Wisconsin VLSI circuit design: digital, analog, subthreshold analog, biCMOS, CMOS; information processing in neurons, neural net chips, synthetic aperture radar (SAR) processing chips, sonar processing chips; auditory models and experiments Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Neural network theory, complexity theory, wavelet theory, mathematical physics Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamics of networks of neurons Jacqueline A. 
Liederman Associate Professor of Psychology PhD, Psychology, University of Rochester Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders Siegfried Martens Research Associate, Department of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Learning models, pattern recognition, visualization, remote sensing, sensor fusion Ennio Mingolla Professor of Cognitive and Neural Systems and Psychology PhD, Psychology, University of Connecticut Visual perception, mathematical modeling of visual processes Joseph Perkell Adjunct Professor of Cognitive and Neural Systems Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology PhD, Massachusetts Institute of Technology Motor control of speech production Adam Reeves Adjunct Professor of Cognitive and Neural Systems Professor of Psychology, Northeastern University PhD, Psychology, City University of New York Psychophysics, cognitive psychology, vision Bradley Rhodes Research Associate, Department of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Motor control, learning, and adaptation, serial order behavior (timing in particular), attention and memory Michele Rucci Assistant Professor of Cognitive and Neural Systems PhD, Scuola Superiore S.-Anna, Pisa, Italy Vision, sensory-motor control and learning, and computational neuroscience Elliot Saltzman Associate Professor of Physical Therapy, Sargent College Research Scientist, Haskins Laboratories, New Haven, CT Assistant Professor in Residence, Department of Psychology and Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, CT PhD, Developmental Psychology, University of Minnesota Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities Robert 
Savoy Adjunct Associate Professor of Cognitive and Neural Systems Scientist, Rowland Institute for Science Experimental Psychologist, Massachusetts General Hospital PhD, Experimental Psychology, Harvard University Computational neuroscience; visual psychophysics of color, form, and motion perception Teaching about functional MRI and other brain mapping methods Eric Schwartz Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology PhD, High Energy Physics, Columbia University Computational neuroscience, machine vision, neuroanatomy, neural modeling Robert Sekuler Adjunct Professor of Cognitive and Neural Systems Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center Frances and Louis H. Salvage Professor of Psychology, Brandeis University Consultant in neurosurgery, Boston Children's Hospital PhD, Psychology, Brown University Visual motion, brain imaging, relation of visual perception, memory, and movement Barbara Shinn-Cunningham Associate Professor of Cognitive and Neural Systems and Biomedical Engineering PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance David Somers Assistant Professor of Psychology PhD, Cognitive and Neural Systems, Boston University Functional MRI, psychophysical, and computational investigations of visual perception and attention Chantal E. Stern Assistant Professor of Psychology and Program in Neuroscience, Boston University Assistant in Neuroscience, MGH-NMR Center and Harvard Medical School PhD, Experimental Psychology, Oxford University Functional neuroimaging studies (fMRI and MEG) of learning and memory Malvin C. 
Teich Professor of Electrical and Computer Engineering, Biomedical Engineering, and Physics PhD, Cornell University Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission Lucia Vaina Professor of Biomedical Engineering Research Professor of Neurology, School of Medicine PhD, Sorbonne (France); Dres Science, National Polytechnique Institute, Toulouse (France) Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging Takeo Watanabe Associate Professor of Psychology PhD, Behavioral Sciences, University of Tokyo Perception of objects and motion and effects of attention on perception using psychophysics and brain imaging (fMRI) Jeremy Wolfe Adjunct Professor of Cognitive and Neural Systems Associate Professor of Ophthalmology, Harvard Medical School Psychophysicist, Brigham & Women's Hospital, Surgery Department Director of Psychophysical Studies, Center for Clinical Cataract Research PhD, Massachusetts Institute of Technology Visual attention, pre-attentive and attentive object representation Curtis Woodcock Professor of Geography Chairman, Department of Geography Director, Geographic Applications, Center for Remote Sensing PhD, University of California, Santa Barbara Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing CNS DEPARTMENT COURSE OFFERINGS CAS CN500 Computational Methods in Cognitive and Neural Systems CAS CN510 Principles and Methods of Cognitive and Neural Modeling I CAS CN520 Principles and Methods of Cognitive and Neural Modeling II CAS CN530 Neural and Computational Models of Vision CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control CAS CN550 Neural and Computational Models of
Recognition, Memory and Attention CAS CN560 Neural and Computational Models of Speech Perception and Production CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm CAS CN580 Introduction to Computational Neuroscience GRS CN700 Computational and Mathematical Methods in Neural Modeling GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior GRS CN730 Models of Visual Perception GRS CN740 Topics in Sensory-Motor Control GRS CN760 Topics in Speech Perception and Recognition GRS CN780 Topics in Computational Neuroscience GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition GRS CN915,916 Research in Neural Networks for Vision and Image Processing GRS CN921,922 Research in Neural Networks for Speech and Language Processing GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning GRS CN935,936 Research in Neural Networks for Cognitive Information Processing GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks GRS CN945,946 Research in Technological Applications of Neural Networks GRS CN951,952 Research in Hardware Implementations of Neural Networks CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and student-run special interest groups, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups. LABORATORY AND COMPUTER FACILITIES The department is funded by fellowships, grants, and contracts from federal agencies and private foundations that support research in life sciences, mathematics, artificial intelligence, and engineering. 
Facilities include laboratories for experimental research and computational modeling in visual perception; audition, speech and language processing; and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network comprised of Sun workstations, Macintoshes, and PCs. A PC farm running Linux operating systems is available as a distributed computational environment. All students have access to X-terminals or UNIX workstation consoles, a selection of color systems and PCs, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software are available for use. These include: Active Perception Laboratory The Active Perception Laboratory is dedicated to the investigation of the interactions between perception and behavior. Research focuses on the theoretical and computational analyses of the effects of motor behavior on sensory perception and on the design of psychophysical experiments with human subjects. The Active Perception Laboratory includes extensive computational facilities that allow the execution of large-scale simulations of neural systems. Additional facilities include instruments for the psychophysical investigation of eye movements during visual analysis, including an accurate and non-invasive eye tracker, and robotic systems for the simulation of different types of behavior. Auditory Neuroscience Laboratory The Auditory Neuroscience Laboratory in the Department of Cognitive and Neural Systems (CNS) is equipped to perform both traditional psychoacoustic experiments as well as experiments using interactive auditory virtual-reality stimuli. 
The laboratory contains approximately eight PCs (running Windows 98 and/or Linux), used both as workstations for students and to control laboratory equipment and run experiments. The other major equipment in the laboratory includes special-purpose signal processing and sound generating equipment from Tucker-Davis Technologies, electromagnetic head tracking systems, a two-channel spectrum analyzer, and other miscellaneous equipment for producing, measuring, analyzing, and monitoring auditory stimuli. The Auditory Neuroscience Laboratory consists of three adjacent rooms in the basement of 677 Beacon Street (the home of the CNS Department). One room houses an 8 ft. × 8 ft. single-walled sound-treated booth as well as space for students. The second room is primarily used as student workspace for developing and debugging experiments. The third space houses a robotic arm, capable of automatically positioning a small acoustic speaker anywhere on the surface of a sphere of adjustable radius, allowing automatic measurement of the signals reaching the ears of a listener for a sound source from different positions in space, including the effects of room reverberation. Computer Vision/Computational Neuroscience Laboratory The Computer Vision/Computational Neuroscience Laboratory comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; an active vision laboratory including actuators and video hardware; and systems for computer-aided neuroanatomy and application of computer graphics and image processing to brain sections and MRI images. The laboratory supports research in the areas of neural modeling, computational neuroscience, computer vision and robotics. The major question being addressed is the nature of representation of the visual world in the brain, in terms of observable neural architectures such as topographic mapping and columnar architecture.
The application of novel architectures for image processing in computer vision and robotics is also a major topic of interest. Recent work in this area has included the design and patenting of novel actuators for robotic active vision systems, the design of real-time algorithms for mobile robotic applications, and the design and construction of miniature autonomous vehicles using space-variant active vision design principles. One such vehicle has recently driven itself successfully on the streets of Boston.

Neurobotics Laboratory

The Neurobotics Laboratory uses wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The laboratory currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a movable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Other platforms may be investigated in the future.

Sensory-Motor Control Laboratory

The Sensory-Motor Control Laboratory supports experimental and computational studies of sensory-motor control. A computer-controlled infrared WatSmart system allows measurement of large-scale (e.g., reaching) movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. A second major component is a helmet-mounted, video-based eye-head tracking system (ISCAN Corp., 1997). The system's camera samples eye position at 240 Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. Thus the system affords a wide range of visuo-motor studies. The laboratory is connected to the department's extensive network of Linux and Windows workstations and Linux computational servers.
Speech and Language Laboratory

The Speech Laboratory includes facilities for analog-to-digital and digital-to-analog conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal-processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing speech recognition algorithms. The laboratory also contains a network of Windows-based PCs equipped with software for the analysis of functional magnetic resonance imaging (fMRI) data, including region-of-interest (ROI) based analyses involving software for the parcellation of cortical and subcortical brain regions in structural MRI images.

Technology Laboratory

The Technology Laboratory fosters the development of neural network models derived from basic scientific research and facilitates the transition of the resulting technologies to software and applications. The laboratory was established in July 2001 with a grant from the Air Force Office of Scientific Research, "Information Fusion for Image Analysis: Neural Models and Technology Development." Initial projects have focused on multi-level fusion and data mining in a geospatial context, in collaboration with the Boston University Center for Remote Sensing. This research and development has built on models of opponent-color visual processing, boundary contour system (BCS) and texture processing, and Adaptive Resonance Theory (ART) pattern learning and recognition, as well as other models of associative learning and prediction. Other projects include collaborations with the New England Medical Center and Boston Medical Center to develop methods for the analysis of large-scale medical databases, currently to predict HIV resistance to antiretroviral therapy.
Associated basic research projects are conducted within the joint context of scientific data and technological constraints.

Visual Psychophysics Laboratory

The Visual Psychophysics Laboratory occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer-controlled display platforms, including Macintosh, Windows, and Linux workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing devices, a photometer, and a variety of display-generation, data-collection, and data-analysis software.

Affiliated Laboratories

Affiliated CAS/CNS faculty members have additional laboratories ranging from visual and auditory psychophysics, neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations. ******************************************************************* DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT Boston University 677 Beacon Street Boston, MA 02215 Phone: 617/353-9481 Fax: 617/353-7755 Email: inquiries at cns.bu.edu Web: http://cns.bu.edu/ ******************************************************************* From kunliu1 at cs.umbc.edu Sat Sep 27 05:03:12 2003 From: kunliu1 at cs.umbc.edu (Kun Liu) Date: Sat, 27 Sep 2003 04:03:12 -0500 Subject: Distributed Data Mining Bibliography Message-ID: <200309270803.h8R83Sgv023416@mailserver-ng.cs.umbc.edu> We are pleased to announce the availability of a bibliography on Distributed Data Mining. The current version contains 246 entries. It can be downloaded from the following site: http://www.csee.umbc.edu/~hillol/DDMBIB/ The site also provides an interface to submit bibliographic information for relevant papers.
Kun Liu University of Maryland, Baltimore County
Manuscripts should include an abstract and be limited to 16 pages in single-spaced, single-column format. Submissions should include the title, author(s), authors' affiliations, e-mail address, fax number, and postal address. In the case of multiple authors, an indication of which author is responsible for correspondence and for preparing the camera-ready paper should also be included. Electronic submission is strongly encouraged. Preferred file formats are PDF (.pdf) or Postscript (.ps). Visit our Web site at http://lslwww.epfl.ch/bio-adit2004/ for more information. Please contact Dr. Murata if you have to submit hard copies. Manuscripts should be submitted by September 19, 2003 through the Bio-ADIT Web site. Please contact the technical program co-chairs for any questions: Professor Auke Jan Ijspeert School of Computer and Communication Sciences Swiss Federal Institute of Technology, Lausanne (EPFL) CH-1015 Lausanne, Switzerland Tel: +41-21-693-2658 Fax: +41-21-693-3705 E-mail: Auke.Ijspeert at epfl.ch Professor Masayuki Murata Cybermedia Center Osaka University 1-32 Machikaneyama, Toyonaka, Osaka 560-0043, Japan Tel: +81-6-6850-6860 Fax: +81-6-6850-6868 E-mail: murata at cmc.osaka-u.ac.jp

STUDENT TRAVEL GRANTS: A limited number of travel grants will be provided for students attending Bio-ADIT 2004. Details of how to apply for a student travel grant will be posted on the workshop Web site.

IMPORTANT DATES:
Paper submission deadline: September 19, 2003 (NEW!)
Notification of acceptance: November 3, 2003
Camera-ready papers due: December 1, 2003

WEB SITE: The electronic paper submission system has been accepting on-line paper submissions for Bio-ADIT since July 1, 2003. Please visit our Web site at http://lslwww.epfl.ch/bio-adit2004/ for more up-to-date information.
REGISTRATION: Thanks to the generous support from the sponsors, we expect that the total registration fee for the conference (including pre-proceedings and lunches) will be fixed at only 55.- CHF (Swiss Francs).

EXECUTIVE COMMITTEE:
General Co-Chairs: Daniel Mange (EPFL, Switzerland), Shojiro Nishio (Osaka Univ., Japan)
Technical Program Committee Co-Chairs: Auke Jan Ijspeert (EPFL, Switzerland), Masayuki Murata (Osaka Univ., Japan)
Finance Chairs: Marlyse Taric (EPFL, Switzerland), Toshimitsu Masuzawa (Osaka Univ., Japan)
Publicity Chairs: Christof Teuscher (EPFL, Switzerland), Takao Onoye (Osaka Univ., Japan)
Publications Chair: Naoki Wakamiya (Osaka Univ., Japan)
Local Arrangements Chair: Carlos Andres Pena-Reyes (EPFL, Switzerland)
Internet Chair: Jonas Buchli (EPFL, Switzerland)

TECHNICAL PROGRAM COMMITTEE:
Co-Chairs: Auke Jan Ijspeert (EPFL, Switzerland), Masayuki Murata (Osaka Univ., Japan)
Members: Michael A. Arbib (Univ. of Southern California, USA), Aude Billard (EPFL, Switzerland), Takeshi Fukuda (IBM Tokyo Research Lab., Japan), Katsuro Inoue (Osaka Univ., Japan), Wolfgang Maass (Graz Univ. of Technology, Austria), Ian W. Marshall (BTexact Technologies, UK), Toshimitsu Masuzawa (Osaka Univ., Japan), Alberto Montresor (Univ. of Bologna, Italy), Stefano Nolfi (ISTC, CNR, Italy), Takao Onoye (Osaka Univ., Japan), Rolf Pfeifer (Univ. of Zurich, Switzerland), Eduardo Sanchez (EPFL, Switzerland), Hiroshi Shimizu (Osaka Univ., Japan), Moshe Sipper (Ben-Gurion Univ., Israel), Gregory Stephanopoulos (MIT, USA), Adrian Stoica (Jet Propulsion Lab., USA), Gianluca Tempesti (EPFL, Switzerland), Naoki Wakamiya (Osaka Univ., Japan), Hans V. Westerhoff (Free University, Amsterdam, The Netherlands), Xin Yao (Univ. of Birmingham, UK)

From klaus at prosun.first.fraunhofer.de Tue Sep 2 13:36:59 2003 From: klaus at prosun.first.fraunhofer.de (Klaus-R.
Mueller) Date: Tue, 2 Sep 2003 19:36:59 +0200 (MEST) Subject: NIPS 2003 Message-ID: <200309021736.h82Haxse016013@prosun.first.fraunhofer.de> Dear colleagues, The NIPS conference program is now available online at www.nips.cc, and the Web site is now accepting online registrations. Best wishes, Klaus-Robert Mueller NIPS*2003 Publicity Chair &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& Prof. Dr. Klaus-Robert Mueller Fraunhofer FIRST.IDA and University of Potsdam Kekulestr. 7 12489 Berlin, Germany Klaus AT first.fhg.de Tel: +49 30 6392 1860 or 1800 Fax: +49 30 6392 1805 http://www.first.fhg.de/persons/Mueller.Klaus-Robert.html &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& From moody at ICSI.Berkeley.EDU Tue Sep 2 13:05:41 2003 From: moody at ICSI.Berkeley.EDU (John Moody) Date: Tue, 2 Sep 2003 10:05:41 -0700 (PDT) Subject: POSTDOC -- Reinforcement Learning & Finance Message-ID: The International Computer Science Institute (ICSI) is seeking to hire a Postdoctoral Fellow to work with Professor John Moody on research in Reinforcement Learning and Computational Finance. This two-year Postdoc opportunity is part of a project funded by the National Science Foundation entitled "Risk, Reward and Reinforcement". This interdisciplinary investigation will explore powerful new algorithms for Direct Reinforcement and the application of these algorithms to competitive games and important real-world financial problems. Activities will include fundamental algorithm research, extensive simulation, and empirical testing. Prototype development will make use of substantial financial market data resources under development at ICSI. The goals are to create highly effective and efficient algorithms for Direct Reinforcement that can discover robust, low-risk solutions to challenging problems. Candidates must have a Ph.D. or comparable research experience in a physical, mathematical or engineering science, or in a quantitative social science such as economics or finance.
Strong mathematical, statistical, and computational skills are required. Ideally, applicants should have knowledge of machine learning, Monte Carlo methods, time series analysis, optimization, control engineering, or quantitative finance. Preference will be given to candidates who are about to complete or have completed a Ph.D. within the past five years. Exceptionally well-qualified non-Ph.D. candidates with three to five years of relevant professional experience may also be considered. ICSI is an independent, nonprofit research institute with headquarters in Berkeley, California. It is closely affiliated with the University of California at Berkeley. ICSI is funded by the U.S. and several European governments, and brings together top researchers from participating countries. This ICSI project will be based in Portland, Oregon, which offers exceptional quality of life and was recently named America's "Best Big City". See http://www.moneymag.com/best/bplive/portland.html . The successful applicant for this postdoc will have opportunities to interact with researchers at ICSI, UC Berkeley, and other leading Northwest and West Coast universities. Interested individuals should email a curriculum vitae, up to three representative publications, the names, phone numbers, and email addresses of three to five references, and a cover note describing their research interests, professional goals, and availability. Please email materials to moody at icsi.berkeley.edu. Applications received on or before September 15, 2003 will receive priority consideration. We will begin reviewing applications on September 15 and will continue considering candidates until the position is filled. Applicants should be available to begin a two-year postdoc between October 1, 2003 and February 1, 2004. ICSI is an equal opportunity employer; minorities and women are encouraged to apply.
____________________________________________________________________ John Moody, Professor International Computer Science Institute Berkeley & Portland Tel: +1-503-750-5942 moody at icsi.berkeley.edu FAX: +1-503-531-2026 http://www.icsi.berkeley.edu/~moody ____________________________________________________________________ From nik.kasabov at aut.ac.nz Wed Sep 3 01:35:13 2003 From: nik.kasabov at aut.ac.nz (Nik Kasabov) Date: Wed, 03 Sep 2003 17:35:13 +1200 Subject: NCEI'03: Neurocomputing and Evolving Intelligence 2003 Message-ID: Conference "Neuro-Computing and Evolving Intelligence" 2003 - NCEI'03 20 and 21 November, 2003, Auckland, New Zealand Conference Chair: Prof. Nik Kasabov (nkasabov at aut.ac.nz) Programme Chair: Dr. Zeke Chan (shchan at aut.ac.nz) Contact person: Joyce D'Mello (joyce.dmello at aut.ac.nz) phone: +64 9 917 9504 Technical support: Peter Hwang (peter.hwang at aut.ac.nz) The emphasis of the conference will be on connectionist-based methods, systems, and applications of Evolving Intelligence (EI) - information systems that develop, unfold, and evolve their structure and functionality over time through interaction with the environment. These systems evolve their "intelligence" through learning and interaction. The topics of interest include:
- Novel methods for autonomous learning
- Novel methods for knowledge engineering and knowledge discovery
- Evolving intelligence (EI)
- Evolving molecular processes and their modelling
- Evolving processes in the brain and their modelling
- Evolving language and cognition
- Adaptive speech recognition
- Adaptive image and multi-modal processing
- Adaptive decision making
- Dynamic time-series modelling in a changing environment
- Adaptive control
- Adaptive intelligent systems on the WWW
- Applications in medicine, health, information technologies, horticulture, agriculture, bio-security, business and finance, and process and robot control
The two-day event will include oral presentations, poster presentations, and various demonstrations of neurocomputing systems for bioinformatics and biomedical applications; brain study and cognitive engineering; agriculture and the environment; decision support; speech recognition and language processing; image and video processing; multi-modal information processing; and process control. One-page abstracts should be submitted to the Program Chair before 6th October 2003. The abstracts will be reviewed, and authors will be notified of acceptance by 25th October. All accepted abstracts will be published in the conference proceedings. Selected papers will be invited for publication in special issues of two international journals. Registration fee: NZ $300. Bookings and registrations should be made through the contact person, Joyce D'Mello (joyce.dmello at aut.ac.nz; phone: 09 917 9504). Conference site: www.kedri.org/NCEI_03.htm ----------------------------------------------------------------------------------------------------------------------------------- Prof. Nik Kasabov, MSc, PhD Fellow RSNZ, FNZCS, SrMIEEE Director, Knowledge Engineering and Discovery Research Institute Chair of Knowledge Engineering, School of Computer and Information Sciences Auckland University of Technology (AUT) phone: +64 9 917 9506 ; fax: +64 9 917 9501 mobile phone: +64 21 488 328 WWW http://www.kedri.info email: nkasabov at aut.ac.nz From daw at cs.cmu.edu Wed Sep 3 16:15:09 2003 From: daw at cs.cmu.edu (Nathaniel Daw) Date: Wed, 3 Sep 2003 16:15:09 -0400 (EDT) Subject: thesis: Reinforcement learning models of the dopamine system and their behavioral implications Message-ID: Dear Connectionists, I thought that some of you might be interested in my recently completed PhD thesis, "Reinforcement learning models of the dopamine system and their behavioral implications," which is available as a (rather large, 4 Mb) pdf download at http://www.cs.cmu.edu/~daw/thesis.pdf An abstract follows.
best, Nathaniel Daw ABSTRACT This thesis aims to improve theories of how the brain functions and to provide a framework to guide future neuroscientific experiments by making use of theoretical and algorithmic ideas from computer science. The work centers around the detailed understanding of the dopamine system, an important and phylogenetically venerable brain system that is implicated in such general functions as motivation, decision-making and motor control, and whose dysfunction is associated with disorders such as schizophrenia, addiction, and Parkinson's disease. A series of influential models have proposed that the responses of dopamine neurons recorded from behaving monkeys can be identified with the error signal from temporal difference (TD) learning, a reinforcement learning algorithm for learning to predict rewards in order to guide decision-making. Here I propose extensions to these theories that improve them along a number of dimensions simultaneously. The new models that result eliminate several unrealistic simplifying assumptions from the original accounts; explain many sorts of dopamine responses that had previously seemed anomalous; flesh out nascent suggestions that these neurophysiological mechanisms can also explain animal behavior in conditioning experiments; and extend the theories' reach to incorporate proposals about the computational function of several other brain systems that interact with the dopamine neurons. Chapter 3 relaxes the assumption from previous models that the system tracks only short-term predictions about rewards expected within a single experimental trial. It introduces a new model based on average-reward TD learning that suggests that long-run reward predictions affect the slow-timescale, tonic behavior of dopamine neurons. This account resolves a seemingly paradoxical finding that the dopamine system is excited by aversive events such as electric shock, which had fueled several published attacks on the TD theories. 
These investigations also provide a basis for proposals about the functional role of interactions between the dopamine and serotonin systems, and about behavioral data on animal decision-making. Chapter 4 further revises the theory to account for animals' uncertainty about the timing of events and about the moment-to-moment state of an experimental task. These issues are handled in the context of a TD algorithm incorporating partial observability and semi-Markov dynamics; a number of other new or extant models are shown to follow from this one in various limits. The new theory is able to explain a number of previously puzzling results about dopamine responses to events whose timing is variable, and provides an appropriate framework for investigating behavioral results concerning variability in animals' temporal judgments and timescale invariance properties in animal learning. Chapter 5 departs from the thesis' primary methodology of computational modeling to present a complementary attempt to address the same issues empirically. The chapter reports the results of an experiment recording from the striatum (a brain area that is one of the major inputs and outputs of the dopamine system) of behaving rats during a task designed to probe the functional organization of decision-making in the brain. The results broadly support the contention of most versions of the TD models that the functions of action selection and reward prediction are segregated in the brain, as in "actor/critic" reinforcement learning systems.
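The TD account that the abstract builds on turns on a single quantity, the prediction error. As a purely illustrative sketch (not code or data from the thesis), a minimal TD(0) learner on a hypothetical two-state conditioning trial looks like this:

```python
# Hedged, illustrative sketch of TD(0) learning; the states, rewards,
# and parameter values are invented for this example.

def td_update(V, s, r, s_next, alpha=0.1, gamma=0.95):
    """One TD(0) step: delta = r + gamma * V(s') - V(s); move V(s) toward the target."""
    delta = r + gamma * V[s_next] - V[s]
    V[s] += alpha * delta
    return delta

# Toy conditioning trial: a cue state is always followed by a rewarded state.
V = {"cue": 0.0, "reward": 0.0, "end": 0.0}
for _ in range(200):
    td_update(V, "cue", 0.0, "reward")               # no reward at the cue
    last_delta = td_update(V, "reward", 1.0, "end")  # reward delivered

# After learning, V["cue"] approaches gamma * 1, and the error at reward
# delivery (last_delta) has shrunk toward zero, because the cue now
# predicts the reward.
```

In the models the abstract describes, this error signal delta is what is identified with the phasic dopamine response: once a reward is fully predicted by its cue, the error at reward delivery vanishes.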
From terry at salk.edu Fri Sep 5 20:49:17 2003 From: terry at salk.edu (Terry Sejnowski) Date: Fri, 5 Sep 2003 17:49:17 -0700 (PDT) Subject: NEURAL COMPUTATION 15:10 In-Reply-To: <200308051940.h75JenY17542@purkinje.salk.edu> Message-ID: <200309060049.h860nHD15021@dax.salk.edu>

Neural Computation - Contents - Volume 15, Number 10 - October 1, 2003

LETTERS

Doubly Distributional Population Codes: Simultaneous Representation of Uncertainty and Multiplicity
Maneesh Sahani and Peter Dayan

Firing Rate of the Noisy Quadratic Integrate-and-Fire Neuron
Nicolas Brunel and Peter E. Latham

Pattern Filtering for Detection of Neural Activity, with Examples from HVc Activity During Sleep in Zebra Finches
Zhiyi Chi, Peter L. Rauske, and Daniel Margoliash

Synaptic Depression Leads to Nonmonotonic Frequency Dependence in the Coincidence Detector
Shawn Mikula and Ernst Niebur

Probing Changes in Neural Interaction During Adaptation
Liqiang Zhu, Ying-Cheng Lai, Frank C. Hoppensteadt, and Jiping He

Memory Encoding by Theta Phase Precession in the Hippocampal Network
Naoyuki Sato and Yoko Yamaguchi

A Computational Model as Neurodecoder Based on Synchronous Oscillation in the Visual Cortex
Zhao Songnian, Xiong Xiaoyun, Yao Guozheng, and Fu Zhi

Parameter Estimation of Sigmoid Superpositions: Dynamical System Approach
Ivan Tyukin, Cees van Leeuwen, and Danil Prokhorov

On the Partitioning Capabilities of Feedforward Neural Networks with Sigmoid Nodes
K. Koutroumbas

-----

ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2003 - VOLUME 15 - 12 ISSUES

                USA     Canada*   Other Countries
Student/Retired $60     $64.20    $108
Individual      $95     $101.65   $143
Institution     $590    $631.30   $638

* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902.
Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From junshuilanl at yahoo.com Sat Sep 6 10:33:33 2003 From: junshuilanl at yahoo.com (Junshui Ma) Date: Sat, 06 Sep 2003 10:33:33 -0400 Subject: Position: Senior Machine Learning Scientist Message-ID: <007701c37483$da1605c0$6400a8c0@apple> Please do *not* reply to this email address. Instead, please reply directly to: Nancy.Schwartz at kornferry.com ============================= Position: Senior Machine Learning Scientist Job Description: Aureon Biosciences Corp., a bioscience company in Yonkers, New York, is looking for a candidate with extensive experience and knowledge in machine learning, statistics, and data mining, including statistical analysis, Bayesian reasoning and learning, combinatorial optimization and learning, neural networks, and kernel-based methods. The candidate is required to have an MS or Ph.D. in one of the mentioned areas from an accredited university. Post-doctoral and/or industry experience is strongly preferred. The ideal candidate will be versed in the latest research, methods, developments, and theories in these areas, and will possess extensive experience in applying them to commercial/industrial applications. The candidate is also required to be a visionary, creative, and strong problem solver, capable of proposing solutions to multiple problems in parallel.

Additional Skills:
- Familiarity with various machine learning and statistical software tools.
- Excellent programming skills. Candidates must be well versed and experienced in at least two of the following languages: Matlab, S-Plus (or R), SAS, and C/C++. Working experience in Java is a plus.
From Alain.Destexhe at iaf.cnrs-gif.fr Mon Sep 8 06:46:18 2003 From: Alain.Destexhe at iaf.cnrs-gif.fr (Alain Destexhe) Date: Mon, 08 Sep 2003 12:46:18 +0200 Subject: Call for proposal to host the Advanced Course in Computational Neuroscience Message-ID: <3F5C5DFA.288ABDFC@iaf.cnrs-gif.fr> CALL FOR PROPOSALS The organizing committee of the "European Advanced Course in Computational Neuroscience" is looking for applications from potential sites to host the course for three years (2005-2007). The course is now in its eighth year. It was held for three years in Crete (Greece, 1996-1998) and three years in Trieste (Italy, 1999-2001), and it is currently being held in the small medieval village of Obidos (Portugal, 2002-2004). Traditionally, the course is held in August in a European (or Associated) country. The ideal site is relatively remote and small (i.e., not a large institution in a big city), in order to ensure intimacy and quiet, and should be an attractive location in which to spend the summer. We also need a relatively fast internet connection for the computer network. One of the most important aspects of the course is having an efficient local organizer to sort out local facilities, such as lodging, food, transport, rooms to hold the lectures, and the computer network. We will also need a firm commitment to secure everything for a period of three years (2005-2007). Anyone interested should contact Alain Destexhe (see address below) and will be asked to send details such as a description of the site and an approximate budget for lodging, food, rental of computers, etc. A site visit to the selected locations is planned for the spring of 2004 to decide on our next host.
Below are contact addresses and a short description of the course. CONTACT Alain Destexhe Integrative and Computational Neuroscience Unit CNRS 1, Avenue de la Terrasse (BAT 33) 91198 Gif-sur-Yvette, France email: destexhe at iaf.cnrs-gif.fr Tel: 33-1-69-82-34-35 SHORT DESCRIPTION OF THE COURSE The European Advanced Course in Computational Neuroscience is a high-level, 4-week intensive course on the computational aspects of central nervous system function, from the cellular to the systems level. It is usually structured in four thematic weeks: cellular, sensory, motor, and multilevel systems. The invited faculty members (usually 10 to 15 per week) are among the best-known scientists in their respective fields (both experimental and theoretical; see the web site for past programs). The course is highly selective - we receive from 90 to 180 applications every year, from which 25 to 30 students are selected. Students are mid-term PhD students or postdocs, and can be of any background (usually a mixture of experimentalists and theoreticians). The course is intended to give them a solid basis in the different aspects that are important for understanding the complexity of the nervous system, as well as the different approaches that have been used in theoretical studies. Students are required to do a research project during the course, and are helped by the faculty and tutors. The selection of students is based on letters of recommendation and the advice of three independent referees. More information is available at our website: http://www.neuroinf.org/courses/EUCOURSE From isabelle at clopinet.com Mon Sep 8 13:38:49 2003 From: isabelle at clopinet.com (Isabelle Guyon) Date: Mon, 08 Sep 2003 10:38:49 -0700 Subject: Feature selection challenge Message-ID: <3F5CBEA9.787E7837@clopinet.com> Dear feature selection researcher, We are launching today a benchmark on feature selection; see: http://www.nipsfsc.ecs.soton.ac.uk/ Deadline: December 1st, 2003.
Discussion of the benchmark results will take place at a one-day NIPS 2003 workshop on feature extraction (December 11-13, 2003, Whistler, British Columbia, Canada); see http://clopinet.com/isabelle/Projects/NIPS2003/. Good luck! Isabelle Guyon From glanzman at helix.nih.gov Mon Sep 8 11:05:02 2003 From: glanzman at helix.nih.gov (Dennis Glanzman) Date: Mon, 08 Sep 2003 11:05:02 -0400 Subject: Flat text version of Dynamical Neuroscience Flyer Message-ID: <4.3.2.7.2.20030908110405.02cea910@helix.nih.gov> An HTML attachment was scrubbed... URL: https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/77a55c69/attachment-0001.html From bayer at cs.orst.edu Tue Sep 9 01:25:56 2003 From: bayer at cs.orst.edu (Valentina Bayer) Date: Mon, 8 Sep 2003 22:25:56 -0700 (PDT) Subject: Ph.D. thesis available: Learning Cost-Sensitive Diagnostic Policies from Data Message-ID: Dear Connectionists, My Ph.D. thesis, "Learning Cost-Sensitive Diagnostic Policies from Data," is available for download from http://eecs.oregonstate.edu/library/?call=2003-13 Advisor: Prof. Tom Dietterich, Oregon State University The abstract and table of contents follow. Best regards, Valentina Bayer Zubek http://web.engr.oregonstate.edu/~bayer/ ------------- Abstract: In its simplest form, the process of diagnosis is a decision-making process in which the diagnostician performs a sequence of tests culminating in a diagnostic decision. For example, a physician might perform a series of simple measurements (body temperature, weight, etc.) and laboratory measurements (white blood count, CT scan, MRI scan, etc.) in order to determine the disease of the patient. A diagnostic policy is a complete description of the decision-making actions of a diagnostician under all possible circumstances. This dissertation studies the problem of learning diagnostic policies from training examples.
An optimal diagnostic policy is one that minimizes the expected total cost of diagnosing a patient, where the cost is composed of two components: (a) measurement costs (the costs of performing various diagnostic tests) and (b) misdiagnosis costs (the costs incurred when the patient is incorrectly diagnosed). The optimal policy must perform diagnostic tests until further measurements do not reduce the expected total cost of diagnosis. The dissertation investigates two families of algorithms for learning diagnostic policies: greedy methods and methods based on the AO* algorithm for systematic search. Previous work in supervised learning constructed greedy diagnostic policies that either ignored all costs or considered only measurement costs or only misdiagnosis costs. This research recognizes the practical importance of costs incurred by performing measurements and by making incorrect diagnoses and studies the tradeoff between them. This dissertation develops improved greedy methods. It also introduces a new family of learning algorithms based on systematic search. Systematic search has previously been regarded as computationally infeasible for learning diagnostic policies. However, this dissertation describes an admissible heuristic for AO* that enables it to prune large parts of the search space. In addition, the dissertation shows that policies with better performance on an independent test set are learned when the AO* method is regularized in order to reduce overfitting. Experimental studies on benchmark data sets show that in most cases the systematic search methods produce better diagnostic policies than the greedy methods. Hence, these AO*-based methods are recommended for learning diagnostic policies that seek to minimize the expected total cost of diagnosis. 
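The optimality criterion above (keep testing while the expected reduction in misdiagnosis cost exceeds the measurement cost) can be sketched as a one-step value-of-information computation. This is an illustrative sketch only, NOT code from the thesis: every function name, cost, and probability below is invented, and it covers just one binary test with a binary diagnosis.

```python
# A minimal sketch (not from the thesis) of the expected-total-cost trade-off:
# a test is worth performing when the expected misdiagnosis cost it saves
# exceeds its measurement cost. All numbers are made-up, illustrative values.

def expected_misdiagnosis_cost(p_disease, mc_false_neg, mc_false_pos):
    """Expected misdiagnosis cost of the best immediate diagnosis."""
    cost_if_diagnose_sick = (1.0 - p_disease) * mc_false_pos
    cost_if_diagnose_healthy = p_disease * mc_false_neg
    return min(cost_if_diagnose_sick, cost_if_diagnose_healthy)

def one_step_voi(p, test_cost, p_pos_given_d, p_pos_given_not_d,
                 mc_fn=100.0, mc_fp=10.0):
    """One-step value of information: expected misdiagnosis cost saved by
    performing the test, minus the test's measurement cost. Positive VOI
    means the test reduces the expected total cost of diagnosis."""
    p_pos = p * p_pos_given_d + (1.0 - p) * p_pos_given_not_d
    p_d_pos = p * p_pos_given_d / p_pos                  # Bayes update, test +
    p_d_neg = p * (1.0 - p_pos_given_d) / (1.0 - p_pos)  # Bayes update, test -
    cost_now = expected_misdiagnosis_cost(p, mc_fn, mc_fp)
    cost_after = (p_pos * expected_misdiagnosis_cost(p_d_pos, mc_fn, mc_fp)
                  + (1.0 - p_pos) * expected_misdiagnosis_cost(p_d_neg, mc_fn, mc_fp))
    return cost_now - cost_after - test_cost

# A cheap, informative test has positive VOI, so a greedy policy performs it;
# the same test at ten times the cost does not pay for itself.
cheap = one_step_voi(0.3, 1.0, 0.9, 0.1)
expensive = one_step_voi(0.3, 10.0, 0.9, 0.1)
```

A greedy policy applies this rule myopically at each state; the AO* methods in the dissertation instead search over whole policies, which is why they can do better.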
--------------
Table of Contents:

Chapter 1: Introduction
Chapter 2: Cost-sensitive Learning (CSL)
   2.1 Supervised Learning
   2.2 Markov Decision Problems (MDPs)
   2.3 Formal Description of the Cost-sensitive Learning Problem as an (Acyclic) MDP
   2.4 Example of Diagnostic Policies
   2.5 Assumptions and Extensions of Our Cost-sensitive Learning Framework
      2.5.1 Complex Attribute Costs and Misclassification Costs
      2.5.2 Complex Actions
      2.5.3 CSL Problem Changes in Time
      2.5.4 Missing Attribute Values
      2.5.5 Multiple Classes
      2.5.6 Continuous Attributes
      2.5.7 Objective Function
   2.6 Literature Review for the Cost-sensitive Learning Problem in Machine Learning
   2.7 Related Work in Decision-theoretic Analysis
   2.8 Summary
Chapter 3: Greedy Search for Diagnostic Policies
   3.1 General Description of Greedy Algorithms
   3.2 InfoGainCost Methods
   3.3 Modified InfoGainCost Methods (MC+InfoGainCost)
   3.4 One-step Value of Information (VOI)
   3.5 Implementation Details for Greedy Algorithms
   3.6 Summary
Chapter 4: Systematic Search for Diagnostic Policies
   4.1 AND/OR Graphs
   4.2 AO* Algorithm
      4.2.1 Overview of the AO* Algorithm
      4.2.2 Admissible Heuristic
      4.2.3 Optimistic Values and Optimistic Policy
      4.2.4 Realistic Values and Realistic Policy
      4.2.5 Selecting a Node for Expansion
      4.2.6 Our Implementation of AO* (High Level)
      4.2.7 AO* for CSL Problems, With an Admissible Heuristic, Converges to the Optimal Value Function V*
      4.2.8 Pseudocode and Implementation Details for the AO* Algorithm
   4.3 Regularizers
      4.3.1 Memory Limit
      4.3.2 Laplace Correction (L)
      4.3.3 Statistical Pruning (SP)
      4.3.4 Pessimistic Post-Pruning (PPP) Based on Misclassification Costs
      4.3.5 Early Stopping (ES)
      4.3.6 Dynamic Method
      4.3.7 AND/OR Graph Initialized with a Known Policy
      4.3.8 Combining Regularizers
   4.4 Review of AO* Literature
      4.4.1 AO* Relation with A*
      4.4.2 AO* Notations, Implementations, and Relation with Branch-and-Bound
      4.4.3 Theoretical Results on AO*
      4.4.4 POMDPs
      4.4.5 Decision-theoretic Analysis
      4.4.6 Test Sequencing Problem
      4.4.7 Relation of CSL with Reinforcement Learning
   4.5 Summary
Chapter 5: Experimental Studies
   5.1 Experimental Setup
      5.1.1 UCI Domains
      5.1.2 Setting the Misclassification Costs (MC)
      5.1.3 Training Data, Test Data, Memory Limit
      5.1.4 Notations for the Cost-Sensitive Algorithms
      5.1.5 Evaluation Methods
   5.2 Overfitting
   5.3 Results
      5.3.1 Laplace Correction Improves All Algorithms
      5.3.2 Results on the bupa Domain
      5.3.3 Results on the pima Domain
      5.3.4 Results on the heart Domain
      5.3.5 Results on the breast-cancer Domain
      5.3.6 Results on the spect Domain
      5.3.7 Summary of Algorithms' Performance
   5.4 Discussion
      5.4.1 An Overall Score for Algorithms (Chess Metric)
      5.4.2 The Most Robust Algorithms
      5.4.3 Comparing The Most Robust Algorithms Against the Best Algorithm on Each Domain
      5.4.4 Summary of Discussion
      5.4.5 Insights Into the Algorithms' Performance
   5.5 Summary
Chapter 6: Conclusions and Future Work
   6.1 Contributions of This Dissertation
   6.2 Future Work
   6.3 Thesis Summary
Appendices
   Appendix A: Details on Our AO* Implementation
   Appendix B: More Information on the Experimental Studies
      B.1 Misclassification Costs Matrices for the UCI Domains
      B.2 Comparing the Worst Algorithms in the Systematic and Greedy Search Families
      B.3 Comparing AO* with All the Other Algorithms using BDeltaCost
      B.4 Results of Comparing Each Laplace-Corrected Algorithm with All the Other Laplace-corrected Algorithms, on Each Domain and Misclassification Cost Level (MC)
      B.5 Paired-Graphs Comparing the Best Algorithm on Each Domain with Our Recommended Algorithms

From ted.carnevale at yale.edu Tue Sep 9 15:59:28 2003 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Tue, 09 Sep 2003 15:59:28 -0400 Subject: NEURON course at SFN 2003 meeting Message-ID: <3F5E3120.80007@yale.edu> Due to increased support for this year's NEURON course at the SFN 2003 meeting, we are dropping the registration fee to $100. This is the lowest registration fee for this course since 1998.
However, we can only accept a total of _30_ registrants at this reduced rate. Sign up now to ensure a place in this year's course!

USING THE NEURON SIMULATION ENVIRONMENT Satellite Symposium, Society for Neuroscience Meeting 9 AM - 5 PM on Friday, Nov. 7, 2003 Speakers to include M.L. Hines and N.T. Carnevale

This one-day course with lectures and live demonstrations will present information essential for teaching and research applications of NEURON, an advanced simulation environment that handles realistic models of biophysical mechanisms, individual neurons, and networks of cells. The emphasis is on practical issues that are key to the most productive use of this powerful and convenient modeling tool. Features that will be covered include:
- constructing and managing models of cells and networks
- importing detailed morphometric data
- expanding NEURON's repertoire of biophysical mechanisms
- database resources for empirically-based modeling
Each registrant will receive a comprehensive set of notes, which includes material that has not appeared elsewhere in print. Registration is limited to _30_ individuals on a first-come, first-served basis. For more information see http://www.neuron.yale.edu/no2003.html --Ted

From mm at cse.ogi.edu Tue Sep 9 18:48:17 2003 From: mm at cse.ogi.edu (Melanie Mitchell) Date: Tue, 9 Sep 2003 15:48:17 -0700 Subject: Ph.D. Student Position at OGI Message-ID: <16222.22705.136974.174360@sangre.cse.ogi.edu> Dear Connectionists, The following position announcement is for research on combining evolutionary computation methods with other machine learning methods, as applied to image processing. Persons with neural network experience are strongly encouraged to apply. I have an opening for one graduate research assistant to work on a project applying machine learning methods, including evolutionary algorithms, to image analysis. The applications will be primarily in biomedical domains.
Two major aspects of the research will be (1) how best to combine different machine learning methods and (2) how to automatically incorporate prior knowledge and contextual information in image analysis. Applicants must be willing to pursue a Ph.D. degree in Computer Science and Engineering at the OGI School of Science and Engineering, Oregon Health & Science University, near Portland, Oregon. The department web pages can be found at http://www.cse.ogi.edu. Proficiency in C, C++, Java, or another high-level programming language is required. Background in machine learning, evolutionary computation, image processing and/or computer vision is highly desirable. The assistantship will cover tuition and a stipend. To apply, send a resume describing your research interests, relevant coursework or experience, programming experience and languages, and any other information you think is relevant, along with the names and contact information of at least two professors or scientists who will act as references. Please send this information in electronic form to Melanie Mitchell at the e-mail address above. Applications will be considered until the position is filled. Students of any nationality may apply. OGI is an equal opportunity employer and particularly welcomes applications from women and minority candidates. OGI is located 12 miles west of downtown Portland. Portland offers a superb quality of life, with extensive cultural amenities and spectacular natural surroundings, including close proximity to mountains, beaches, and wilderness areas. ----------------------------------- Melanie Mitchell Associate Professor Department of Computer Science and Engineering OGI School of Science & Engineering Oregon Health & Science University 20000 NW Walker Road Beaverton, OR 97006

From Nicolas.Rougier at loria.fr Mon Sep 15 04:13:30 2003 From: Nicolas.Rougier at loria.fr (Nicolas Rougier) Date: Mon, 15 Sep 2003 10:13:30 +0200 Subject: Frontal workshop to be held in Nancy, France.
20 October 2003. Message-ID: <3F6574AA.9020606@loria> Dear Connectionists, I would like to announce the following workshop on frontal cortex to be held in Nancy, France on October 20, 2003.

"A multidisciplinary approach to the study of the frontal cortex"

Substantial progress has been achieved in the last decade regarding the role of the prefrontal cortex on several fronts: computational modeling, neuropsychology, neuroscience, and neuroimaging, to mention only the most prominent. The complexity of the subject calls for a diversity of standpoints. Accordingly, the aim of the present workshop is to propose an unambiguously multidisciplinary approach to the study of the prefrontal cortex. We wish to spark fruitful discussion on how to bridge the gaps among different perspectives by presenting data from diverse fields of research. We think that viewing some of the many facets of work on the prefrontal cortex together may help improve our attempts to understand its role.

Invited speakers:
Todd Braver (Washington University): "Prefrontal mechanisms of cognitive control"
Vittorio Gallese (Parma University): "From action control to action representation: mirror neurons and the cognitive functions of the premotor cortex"
Etienne Koechlin (Pierre et Marie Curie University): "Organization of executive functions in the human prefrontal cortex"
Michael Kopelman (King's College, London): "Frontal amnesias"
Randall O'Reilly (University of Colorado): "Reinforcement learning of dynamic gating signal in the prefrontal cortex/basal ganglia working memory system"

Registration is free (and mandatory), but places are limited.
Further information available at: http://www.loria.fr/~rougier/workshop.html Detailed program at: http://www.loria.fr/~rougier/workshop-program.html Regards, Nicolas Rougier

From Chris.Mellen at gmf.com.au Sun Sep 14 21:02:58 2003 From: Chris.Mellen at gmf.com.au (Chris Mellen) Date: Mon, 15 Sep 2003 11:02:58 +1000 Subject: Research Position at Grinham Managed Funds, Australia Message-ID: <026c01c37b25$1dc59400$0e80a8c0@gmf.com.au> Dear Connectionists, You might find the following of interest. Regards, Christopher Mellen +++++++++++++++++++++++++++++++++++++++++++++++++++ Research Position Grinham Managed Funds Sydney, NSW, Australia. Grinham Managed Funds is one of the Southern Hemisphere's largest hedge fund managers. Managing in excess of $1 billion, we trade in over 40 futures markets across 9 countries, 24 hours a day. We are looking for an individual to fill a newly created permanent research position. The primary task will be to undertake research into the detection and exploitation of robust, statistically significant patterns within financial time series data. This task has the potential to encompass a diverse range of research directions; consequently, the role will have a broad scope. The successful applicant will have a Ph.D. in physics, statistics, mathematics, computer science, engineering or a related field. All levels of experience will be considered. Competency in software development is essential, and knowledge of one or more of C/C++, R/S+, Matlab/Octave or related languages will be required. Past research experience in any of the fields of complex systems, statistical and numerical analysis, machine learning, pattern recognition, time series modelling or related areas would be highly regarded. Prior knowledge of or experience in finance is, however, not a prerequisite. Applicants should be willing to work closely with other researchers and with I.T. professionals within the company.
This is an exciting, intellectually challenging and rewarding role for someone with enthusiasm and imagination. The work environment is friendly and informal. Salary will be commensurate with experience. Bonuses are linked to the individual's and to the firm's performance. Individuals who are interested may apply by emailing their resume to: Research at gmf.com.au

From anderson at CS.ColoState.EDU Mon Sep 15 18:08:49 2003 From: anderson at CS.ColoState.EDU (Chuck Anderson) Date: Mon, 15 Sep 2003 16:08:49 -0600 Subject: graduate research assistant position in reinforcement learning Message-ID: <3F663871.3070801@cs.colostate.edu> A graduate research assistant position is available starting Spring 2004 for research on combining reinforcement learning, recurrent neural networks, and robust control theory, on a project funded by the National Science Foundation. You may read about this project at http://www.engr.colostate.edu/nnhvac. Proficiency in C, C++, and Matlab is required. Experience with neural networks, feedback control, and Simulink for Matlab is highly desirable. Applicants must be willing to pursue a Ph.D. degree in Computer Science at Colorado State University. Information about the department can be found at http://www.cs.colostate.edu. The assistantship will cover tuition and a stipend. To apply, send a resume describing your education, work experience, and programming experience and languages, plus a statement of your research interests and any other information that might be relevant. Also include the names and contact information for at least two professors or scientists who will serve as references. Please send this information in electronic form to Chuck Anderson at anderson at cs.colostate.edu. Colorado State University is located in Fort Collins, Colorado, which is situated about 60 miles from Denver, alongside the foothills of the Rocky Mountains. Read more about CSU at http://www.colostate.edu.
-- Chuck Anderson associate professor Department of Computer Science anderson at cs.colostate.edu Colorado State University http://www.cs.colostate.edu/~anderson Fort Collins, CO 80523-1873 office: 970-491-7491

From pfbaldi at ics.uci.edu Mon Sep 15 22:05:39 2003 From: pfbaldi at ics.uci.edu (Pierre Baldi) Date: Mon, 15 Sep 2003 19:05:39 -0700 Subject: Postdoctoral Fellowships in Bioinformatics/Computational Biology and Machine Learning at UCI Message-ID: <001601c37bf7$06bb1b90$cd04c380@TIMESLICE2> Please forward this announcement to people you think might be interested. Thank you. ======================================================================= Several NIH-supported postdoctoral positions in the areas of Computational Biology/Bioinformatics and Machine Learning are available in the School of Information and Computer Science (www.ics.uci.edu) and the Institute for Genomics and Bioinformatics (www.igb.uci.edu) at the University of California, Irvine. Areas of particular interest include: protein structure/function prediction, molecular docking and drug design, chemical informatics, comparative genomics, analysis of high-throughput data (e.g. DNA microarray data), gene regulation, systems biology, medical informatics, and all areas of machine learning and large-scale data analysis. Prospective candidates should apply with a cover letter, CV, statement of research interests and accomplishments, and the names and email addresses of 3 referees, sent preferably by email to: pfbaldi at ics.uci.edu. The positions are available starting October 1, 2003, and the duration of the appointments is typically 2 years, with the possibility of renewal. Relevant faculty in the School include: P. Baldi, R. Dechter, D. Kibler, R. Lathrop, E. Mjolsness, M. Pazzani, P. Smyth, H. Stern, and M. Welling. There are many opportunities for collaboration with life scientists located in other units within short walking distance from the School.
The University of California is an Equal Opportunity Employer committed to excellence through diversity. =========================================================================== Pierre Baldi School of Information and Computer Science and Department of Biological Chemistry, College of Medicine Director Institute for Genomics and Bioinformatics University of California, Irvine Irvine, CA 92697-3425 (949) 824-5809 (949) 824-4056 FAX www.ics.uci.edu/~pfbaldi From fredrik.linaker at ida.his.se Tue Sep 16 09:28:52 2003 From: fredrik.linaker at ida.his.se (Fredrik Linaker) Date: Tue, 16 Sep 2003 15:28:52 +0200 Subject: Ph.D. thesis available: Unsupervised On-line Data Reduction for Memorisation and Learning in Mobile Robotics Message-ID: <3F671014.5090000@ida.his.se> Dear Connectionists, My Ph.D. thesis: Unsupervised On-line Data Reduction for Memorisation and Learning in Mobile Robotics is now available for download: http://www.ida.his.se/~fredrik/publications/linaker_thesis2003.pdf http://www.ida.his.se/~fredrik/publications/linaker_thesis2003.ps.gz Supervisor: Prof. Noel Sharkey, University of Sheffield, UK An abstract follows. I'd be very interested in information about open post-doc positions within the learning and/or robotics areas. Best regards, Fredrik Linaker fredrik.linaker at ida.his.se ABSTRACT The amount of data available to a mobile robot controller is staggering. This thesis investigates how extensive continuous-valued data streams of noisy sensor and actuator activations can be stored, recalled, and processed by robots equipped with only limited memory buffers. We address three robot memorisation problems, namely Route Learning (store a route), Novelty Detection (detect changes along a route) and the Lost Robot Problem (find best match along a route or routes). A robot learning problem called the Road-Sign Problem is also addressed. It involves a long-term delayed response task where temporal credit assignment is needed. 
The limited memory buffer entails a trade-off between memorisation and learning. A traditional overall data compression could be used for memorisation, but the compressed representations are not always suitable for subsequent learning. We present a novel unsupervised on-line data reduction technique which focuses on change detection rather than overall data compression. It produces reduced sensory flows which are suitable for storage in the memory buffer while preserving underrepresented inputs. Such inputs can be essential when using temporal credit assignment for learning a task. The usefulness of the technique is evaluated through a number of experiments on the identified robot problems. Results show that a learning ability can be introduced while at the same time maintaining memorisation capabilities. The essentially symbolic representation resulting from the unsupervised on-line reduction could, by extension, also help bridge the gap between the raw sensory flows and the symbolic structures useful in prediction and communication. http://www.ida.his.se/~fredrik/

From a.hussain at cs.stir.ac.uk Tue Sep 16 08:23:37 2003 From: a.hussain at cs.stir.ac.uk (Dr. Amir Hussain) Date: Tue, 16 Sep 2003 13:23:37 +0100 Subject: Call for Papers: Control & Intelligent Systems Journal Special Issue on "Non-linear Adaptive PID Control" References: <001601c37bf7$06bb1b90$cd04c380@TIMESLICE2> Message-ID: <3F6700C9.E15B3E85@cs.stir.ac.uk> Dear all, Please forward this CFP announcement to people you think might be interested. Thank you. ======================================================================= Call for Papers Special Issue on "NON-LINEAR ADAPTIVE PID CONTROL" International Journal of Control and Intelligent Systems (CIS), IASTED / ACTA Press, Vol.
33, 2005 CIS Journal Special Issue Website: http://www.actapress.com/journals/specialci2.htm

It is well known that three-term proportional-integral-derivative (PID) controllers are versatile and widely used, particularly in the process control industries. This dominance is due both to the robustness of these controllers across a wide range of processes and to the simplicity of their structure. They are easy to implement in either digital or analog hardware, easily understood by technical personnel, and remarkably effective in regulating a wide range of applications when properly tuned. For most simple processes, PID control can achieve satisfactory closed-loop performance. However, many industrial processes possess complex properties such as nonlinear and time-varying characteristics. In this context, conventional PID controllers with fixed parameters have a major drawback: they need to be re-tuned to adapt to variations in plant dynamics and environments, and otherwise deliver poor control performance. For the control of real-world processes, an alternative is to use adaptive control schemes, which can automatically adjust the PID parameters on-line. However, before such adaptive controllers can be implemented, the plant must be modeled mathematically, which is usually done under the assumption that the process to be controlled is linear. Since most real-world plants are nonlinear, their responses cannot be shaped to a desired performance using linear controllers, and modeling inaccuracies further degrade the performance of linear adaptive controllers. Hence, conventional linear adaptive controllers have limited applicability and are not ideally suited to nonlinear, complex real-world processes.
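To ground the discussion, here is a minimal discrete-time sketch of the fixed-gain, three-term PID structure described above, closed around a toy first-order plant. It is purely illustrative and not from the special issue: the gains, sampling period, and plant model are all invented numbers.

```python
# A minimal fixed-gain PID sketch (illustrative only; gains and plant are
# made-up numbers, not taken from the special issue or any cited work).

class PID:
    """Textbook three-term PID controller with fixed gains (no adaptation)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # u = Kp*e + Ki*integral(e) + Kd*de/dt
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Closed loop with a toy linear first-order plant x' = -x + u, integrated by
# forward Euler. Fixed gains work well here precisely because the plant is
# linear and time-invariant; a drifting or nonlinear plant would need
# re-tuning or adaptation, which is the gap this special issue addresses.
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01)
x = 0.0
for _ in range(2000):          # simulate 20 s
    u = pid.step(1.0, x)       # drive x toward the setpoint 1.0
    x += (-x + u) * 0.01
```

With the integral term, the steady-state error on this toy plant vanishes; after 20 simulated seconds the state has settled close to the setpoint.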
Consequently, it may be argued that nonlinear adaptive controllers are required to control such plants effectively, and it is necessary to incorporate the inherent nonlinearity of the process into the controller design. One of the main difficulties in designing nonlinear controllers, however, is the lack of a general structure for them. In recent years, computational-intelligence techniques such as artificial neural networks, fuzzy logic, genetic algorithms, combined neuro-fuzzy approaches, and other nonlinear and biologically inspired techniques have become valuable tools for describing and controlling nonlinear plants. More recently, however, there has been growing interest among academic and industrial researchers in new types of controllers that combine the approximation power of such nonlinear adaptive techniques with the simplicity of PID control structures. This special issue of the International Journal of Control and Intelligent Systems is an attempt to bring together active researchers in this emerging field of nonlinear adaptive PID control.

Scope: The application domain includes (but is not limited to):
- Neuro-PID Control
- Fuzzy-PID Control
- Neuro-fuzzy PID Control
- General nonlinear techniques, including nonlinear predictive- and minimum-variance-based PID Control
- Design, development, stability, and robustness issues of these techniques
- Practical applications
Any topic relevant to Nonlinear Adaptive PID Control will be considered.

Instructions for Manuscripts: All manuscripts should be e-mailed to the Guest Editor of the Special Issue at a.hussain at cs.stir.ac.uk by Dec 15, 2003. On the e-mail subject line please indicate "Submission for CIS Special Issue Nonlinear Adaptive PID Control." The submission should include the name(s) of the author(s), their affiliation, addresses, fax numbers, and e-mail addresses.
Manuscripts should strictly follow the guidelines of ACTA Press, given at the following website: http://www.actapress.com/journals/submission.htm

Important Dates
Deadline for paper submission: December 15, 2003
Notification of acceptance: April 30, 2004
Final manuscripts due: July 15, 2004
Publication in Special Issue: Early 2005

Guest Editor: Dr. Amir Hussain, Department of Computing Science & Mathematics, University of Stirling, Stirling FK9 4LA, Scotland, UK E-mail: a.hussain at cs.stir.ac.uk http://www.cs.stir.ac.uk/~ahu/ Tel/Fax: (++44) 01786-467437/464551

From Anthony.Pipe at uwe.ac.uk Wed Sep 17 00:06:01 2003 From: Anthony.Pipe at uwe.ac.uk (Tony Pipe) Date: Tue, 16 Sep 2003 16:06:01 -1200 Subject: CFP: Track on Applications of Neural Networks at Flairs 04 in Miami Florida Message-ID: <005301c37cd1$01f66580$55800ba4@HP29375324311> Neural Network Applications: Special Track at the 17th International FLAIRS Conference In cooperation with the American Association for Artificial Intelligence Palms South Beach Hotel Miami Beach, FL May 17-19, 2004 -------------------------------------------------------------------------------- Call for Papers Papers are being solicited for a special track on Neural Network Applications at the 17th International FLAIRS Conference (FLAIRS-2004).
The special track will be devoted to applications of Neural Networks, with the aim of presenting new and important contributions in this area. These application areas include, but are not limited to, the following:
- Vision
- Pattern Recognition
- Control and Process Monitoring
- Biomedical Applications
- Robotics
- Speech Recognition
- Text Mining
- Diagnostic Problems
- Telecommunications
- Power Systems
- Signal Processing
- Image Processing
--------------------------------------------------------------------------------
Submission Guidelines
Interested authors must submit completed manuscripts by October 24, 2003. Submissions should be no more than 6 pages (4000 words) in length, including figures, and contain no identifying reference to self or organization. Papers should be formatted according to AAAI Guidelines. Submission instructions can be found at the FLAIRS-04 website at http://www.flairs.com/flairs2004. Notification of acceptance will be mailed around January 7, 2004. Authors of accepted papers will be expected to submit the final camera-ready copies of their full papers by February 6, 2004, for publication in the conference proceedings, which will be published by AAAI Press. Authors may be invited to submit a revised copy of their paper to a special issue of the International Journal on Artificial Intelligence Tools (IJAIT).
--------------------------------------------------------------------------------
FLAIRS 2004 Invited Speakers
- Justine Cassell (Massachusetts Institute of Technology)
- Edward Feigenbaum (Stanford University)
- Jim Hendler (University of Maryland)
- Tom Mitchell (Carnegie Mellon University)
--------------------------------------------------------------------------------
Important Dates
- Paper submissions due: October 24, 2003
- Notification letters sent: January 7, 2004
- Camera-ready copy due: February 6, 2004
--------------------------------------------------------------------------------
Special Track Committee (Tentative)
Ingrid Russell (Co-Chair), University of Hartford, USA
Tony Pipe (Co-Chair), University of the West of England, UK
Brian Carse (Co-Chair), University of the West of England, UK
Jim Austin, University of York, UK
Vijayakumar Bhagavatula, Carnegie Mellon University, USA
Serge Dolenko, Moscow State University, Russia
Okan Ersoy, Purdue University, USA
Michael Georgiopoulos, University of Central Florida, USA
Mike James, York University, UK
John Kolen, University of West Florida, USA
Lisa Meeden, Swarthmore College, USA
Sergio Roa, National University of Colombia, Colombia
Roberto Santana, Institute of Cybernetics, Mathematics and Physics (ICIMAF), Cuba
Bernhard Sendhoff, Honda Research and Development Europe, Offenbach/Main, Germany
C. N. Schizas, University of Cyprus, Cyprus
Wai Sum Tang, Chinese University of Hong Kong, Hong Kong
Stefan Wermter, University of Sunderland, UK
Hujun Yin, University of Manchester Institute of Science and Technology, UK
--------------------------------------------------------------------------------
Further Information
Questions regarding the special track should be addressed to: Tony Pipe Voice: +44-117-344-2818 Fax: +44-117-344-3800 email: Anthony.Pipe at uwe.ac.uk

From dimi at ci.tuwien.ac.at Wed Sep 17 09:30:19 2003 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Wed, 17 Sep 2003 15:30:19 +0200 (CEST) Subject: CI BibTeX Collection -- Update Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence:
IEEE Transactions on Evolutionary Computation, Volumes 6/5-7/4
IEEE Transactions on Fuzzy Systems, Volumes 10/5-11/4
IEEE Transactions on Neural Networks, Volumes 13/6-14/4
Machine Learning, Volumes 50/1-2-53/1-2
Neural Computation, Volumes 14/10-15/9
Neural Networks, Volumes 15/8-9-16/6
Neural Processing Letters, Volumes 16/3-17/2
Most files have been converted automatically from various source formats; please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/services/BibTeX.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut fuer Statistik * Tel: (+43 1) 58801 10773 * * Technische Universitaet Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************

From djaeger at emory.edu Sat Sep 20 12:38:05 2003 From: djaeger at emory.edu (Dieter Jaeger) Date: Sat, 20 Sep 2003 12:38:05 -0400 Subject: Postdoctoral Position in Computational Neuroscience Message-ID: <3F6C826D.2F419699@emory.edu> A funded postdoctoral opening in the area of computational neuroscience is available in my laboratory at Emory University, Atlanta. The research project is aimed at elucidating the operation of the deep cerebellar nuclei using whole-cell recordings in slices and compartmental modeling. This work will build on our previous publications in this area (Gauck and Jaeger, J. Neurosci. 2000; 2003). The neuroscience environment at Emory University is excellent, and Atlanta offers a large international community and plenty of activities. Candidates should have previous experience in intracellular electrophysiology and/or compartmental modeling. Interested candidates should contact djaeger at emory.edu for further details. -Dieter Jaeger Associate Professor Emory University Department of Biology 1510 Clifton Rd.
Atlanta, GA 30322
Tel: 404 727 8139
Fax: 404 727 2880
e-mail: djaeger at emory.edu

From ishikawa at brain.kyutech.ac.jp Sun Sep 21 04:23:30 2003
From: ishikawa at brain.kyutech.ac.jp (Masumi Ishikawa)
Date: Sun, 21 Sep 2003 17:23:30 +0900
Subject: 2004 Special Issue of Neural Networks
Message-ID: <5.0.2.5.2.20030921171938.00e14008@mail.brain.kyutech.ac.jp>

----------------------------------------------------------
We apologize if you receive multiple copies of this email.
----------------------------------------------------------

Call for Papers
2004 Special Issue of Neural Networks
New Developments in Self-Organizing Systems

Research on self-organizing systems, including self-organizing maps (SOMs), is an important area of unsupervised learning and has been growing rapidly in various directions: theoretical developments, applications in many fields, in-depth analysis of self-organizing systems in neuroscience, and so forth. The Workshop on Self-Organizing Maps (WSOM) has been held biennially since 1997. In October 2002, a Special Issue on Self-Organizing Maps, with papers selected from presentations at WSOM'01, was published in the journal Neural Networks. The latest workshop, WSOM'03, was held in September 2003 in Kitakyushu, Japan (for details, see http://www.brain.kyutech.ac.jp/~wsom). Considering the extensive growth of this area, we plan another special issue related to self-organization in 2004, ranging from theoretical aspects to various applications. We will select papers from those submitted to this special issue of Neural Networks; papers may be either revisions of those presented at WSOM'03 or papers submitted directly to the special issue. A limited number of invited papers by leading scientists are also planned.
Technical areas include, but are not limited to:
- Theory of self-organizing systems
- Data visualization and mining
- Applications to WEB intelligence
- Applications to text and document analysis
- Applications to robotics
- Applications to image processing and vision
- Applications to pattern recognition
- Hardware and architecture
- Self-organizing systems in neuroscience

Guest-Editors:
Masumi Ishikawa, Kyushu Institute of Technology
Risto Miikkulainen, The University of Texas at Austin
Helge Ritter, University of Bielefeld

Submission:
Deadline for submission: December 10, 2003
Notification of acceptance: April 15, 2004
Deadline for submission of accepted papers: June 20, 2004
Deadline for submission of final papers: August 30, 2004
Format: as normal papers in the journal

Address for Papers:
Dr. Mitsuo Kawato
ATR Computational Neuroscience Laboratories
2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan

From owen at stat.Stanford.EDU Tue Sep 23 12:06:44 2003
From: owen at stat.Stanford.EDU (Art Owen)
Date: Tue, 23 Sep 2003 09:06:44 -0700
Subject: No subject
Message-ID:

Dear Connectionists,

I'm helping Stanford's School of Education recruit a faculty member with an interest in statistics. They're looking for somebody who is:
1) at the assistant or associate professor level
2) a good teacher for bright non-mathematical grad students
3) able to handle large data sets featuring missing values, selection biases, non-tabular structures, etc.
4) interested in policy issues

Recently there seems to be a sharp increase in quantitative methods in the social sciences. They, like everybody else, are getting floods of data. There is a copy of the position advertisement at: www-stat.stanford.edu/~owen/EdPos.pdf Please read it carefully if you're interested in applying. Note that it stresses excellence in teaching, and methodology relevant to the social sciences.
-Art Owen

_____________________________________________________________________
Art Owen Tel: 650.725.2232
Dept. of Statistics Fax: 650.725.8977
Sequoia Hall, 130 art "AT" stat.stanford.edu
Stanford, CA 94305 www.stanford.edu/~owen

From r.p.w.duin at tnw.tudelft.nl Wed Sep 24 11:09:49 2003
From: r.p.w.duin at tnw.tudelft.nl (Bob Duin)
Date: Wed, 24 Sep 2003 17:09:49 +0200
Subject: Four positions in Delft Pattern Recognition Research
Message-ID: <20030924150948.GA29364@ph.tn.tudelft.nl>

Dear colleagues,

We currently have a number of open positions in pattern recognition. We are looking for PhD students (receiving a full salary) as well as post-docs. Applicants should have a solid academic background in physics, electrical engineering, computer science or mathematics. Post-docs should have recently finished their PhD research. Experience in multi-dimensional data analysis, pattern recognition, machine learning or image processing is an advantage. Good programming skills are highly desirable. For one of the PhD positions a strong mathematical background is needed. For more information have a look at:

http://www.ph.tn.tudelft.nl/~duin/vacancies.html

Sincerely, Robert P.W. Duin

--
R.P.W. Duin Phone: (31) 15 2786143
Faculty of Applied Sciences Fax: (31) 15 2786740
Delft University of Technology mailto: r.p.w.duin at tnw.tudelft.nl
P.O.
Box 5046, 2600 GA Delft http://www.ph.tn.tudelft.nl/~duin The Netherlands From calls at bbsonline.org Wed Sep 24 12:41:35 2003 From: calls at bbsonline.org (Behavioral & Brain Sciences) Date: Wed, 24 Sep 2003 17:41:35 +0100 Subject: Burns/An evolutionary theory of schizophrenia: BBS Call for Commentators Message-ID: Below is a link to the forthcoming BBS target article An evolutionary theory of schizophrenia: Cortical connectivity, metarepresentation and the social brain by Jonathan Kenneth Burns http://www.bbsonline.org/Preprints/Burns/Referees/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or suggested by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 10,000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to suggest someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. An electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. 
(In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) ======================================================================= ** IMPORTANT ** ======================================================================= To help us put together a balanced list of commentators, it would be most helpful if you would send us an indication of the relevant expertise you would bring to bear on the paper, and what aspect of the paper you would anticipate commenting upon. (Please note that we only request expertise information in order to simplify the selection process.) Please DO NOT prepare a commentary until you receive a formal invitation, indicating that it was possible to include your name on the final list, which is constructed so as to balance areas of expertise and frequency of prior commentaries in BBS. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable at the URL that follows the abstract and keywords below. ======================================================================= ======================================================================= An evolutionary theory of schizophrenia: Cortical connectivity, metarepresentation and the social brain Jonathan Kenneth Burns University of Edinburgh ABSTRACT: Schizophrenia is a worldwide prevalent disorder with a multifactorial but highly genetic aetiology. A constant prevalence rate in the face of reduced fecundity has caused some to argue that an evolutionary advantage exists in unaffected relatives. This adaptationist approach is critiqued and Crow's 'speciation' hypothesis is reviewed and found wanting. In keeping with available biological and psychological evidence, an alternative theory of the origins of this disorder is proposed. Schizophrenia is a disorder of the social brain and exists as a costly trade off in the evolution of complex social cognition. 
Paleoanthropological and comparative primate research suggests that hominids evolved complex cortical interconnectivity (in particular fronto-temporal and fronto-parietal circuits) in order to regulate social cognition and the intellectual demands of group living. I suggest that the ontogenetic mechanism underlying this cerebral adaptation was sequential hypermorphosis and that it rendered the hominid brain vulnerable to genetic and environmental insults. I argue that changes in genes regulating the timing of neurodevelopment occurred, prior to the migration of H. sapiens out of Africa 150,000-100,000 years ago, giving rise to the schizotypal spectrum. While some individuals within this spectrum may have exhibited unusual creativity and iconoclasm, this phenotype was not necessarily adaptive in reproductive terms. However, because the disorder shared a common genetic basis with the evolving circuitry of the social brain, it persisted. Thus schizophrenia emerged as a costly trade off in the evolution of complex social cognition.

KEYWORDS: cortical connectivity, evolution, heterochrony, metarepresentation, primates, psychiatry, schizophrenia, social brain, social cognition

http://www.bbsonline.org/Preprints/Burns/Referees/

=======================================================================
=======================================================================

*** SUPPLEMENTARY ANNOUNCEMENT ***

(1) Call for Book Nominations for BBS Multiple Book Review

In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review.
(Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!).

*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Please note: Your email address has been added to our user database for Calls for Commentators, which is the reason you received this email. If you do not wish to receive further Calls, please feel free to change your mailshot status through your User Login link on the BBSPrints homepage, using your username and password. Or, email a response with the word "remove" in the subject line.
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*

Paul Bloom - Editor
Barbara Finlay - Editor
Jeffrey Gray - Editor
Behavioral and Brain Sciences
bbs at bbsonline.org
http://www.bbsonline.org
-------------------------------------------------------------------

From cindy at bu.edu Thu Sep 25 09:28:00 2003
From: cindy at bu.edu (Cynthia Bradford)
Date: Thu, 25 Sep 2003 09:28:00 -0400
Subject: Call for Papers: 2004 Special Issue of Neural Networks on Vision and Brain
Message-ID: <021d01c38368$d7fa2c00$903dc580@cnspc31>

***** FINAL REMINDER *****

CALL FOR PAPERS
2004 Special Issue
VISION AND BRAIN

Understanding how the brain sees is one of the most active and exciting areas in perceptual science, neuroscience, and modeling. This is because vision is one of our most important sources of information about the world, and a large portion of the brain is used to process visual signals, ranging from early filtering processes through perceptual grouping, surface formation, depth perception, texture perception, figure-ground separation, motion perception, navigation, search, and object recognition.
This Special Issue will incorporate invited and contributed articles focused on recent experimental and modeling progress in unifying physiological, psychophysical and computational mechanisms of vision. The Special Issue will also include articles that summarize biologically inspired approaches to computer vision in technology, including hardware approaches to realizing neuromorphic vision algorithms.

CO-EDITORS:
Professor David Field, Cornell University
Professor Leif Finkel, University of Pennsylvania
Professor Stephen Grossberg, Boston University

SUBMISSION:
Deadline for submission: September 30, 2003
Notification of acceptance: January 31, 2004
Format: no longer than 10,000 words; APA reference format

ADDRESS FOR SUBMISSION:
Stephen Grossberg, Editor
Neural Networks
Department of Cognitive and Neural Systems
Boston University
677 Beacon Street, Room 203
Boston, Massachusetts 02215 USA

From jose at tractatus.rutgers.edu Thu Sep 25 10:15:45 2003
From: jose at tractatus.rutgers.edu (Stephen J. Hanson)
Date: 25 Sep 2003 10:15:45 -0400
Subject: GRADUATE FELLOWSHIPS and RESEARCH ASSISTANT POSITIONS - RUTGERS U. RUMBA LABS
Message-ID: <1064499345.2076.40.camel@vaio>

RUTGERS UNIVERSITY (NEWARK CAMPUS) -- RUMBA LABORATORIES - ADVANCED IMAGING CENTER (UMDNJ/RUTGERS) -- RESEARCH ASSISTANTS/GRADUATE FELLOWSHIPS. IMMEDIATE OPENINGS.

Research in cognitive neuroscience, category learning, and event perception, using magnetic resonance imaging and electrophysiological techniques. These research assistantships are intended to lead to competitive ($20k + tuition) GRADUATE FELLOWSHIPS in Cognitive Science/Cognitive Neuroscience. Experience in standard neuroimaging software (SPM, AFNI, VOXBO, etc.) is important. Background in experimental psychology or cognitive science (BA/BS required), neuroscience and statistics would be helpful. Strong computer skills are a plus. An excellent opportunity for someone bound for graduate school in cognitive science, cognitive neuroscience or medicine.
Send by email a CV with a description of research experience and the names of three references to: rumbalabs at psychology.rutgers.edu (see www.rumba.rutgers.edu for more information)

-- Stephen J. Hanson

From jose at tractatus.rutgers.edu Thu Sep 25 09:40:29 2003
From: jose at tractatus.rutgers.edu (Stephen J. Hanson)
Date: 25 Sep 2003 09:40:29 -0400
Subject: FACULTY COG SCI/COGNITIVE NEURO RUTGERS UNIVERSITY (NEWARK CAMPUS)
Message-ID: <1064497229.1938.12.camel@vaio>

RUTGERS UNIVERSITY (NEWARK CAMPUS), PSYCHOLOGY DEPARTMENT, COGNITIVE SCIENCE, COGNITIVE NEUROSCIENCE

The Department of Psychology anticipates making one tenure-track, Associate Professor level appointment (a junior faculty appointment near tenure will also be considered) in the area of COGNITIVE SCIENCE/NEUROSCIENCE. In particular, we are seeking individuals in any of the following THREE areas: LEARNING, COMPUTATIONAL NEUROSCIENCE, or SOCIAL NEUROSCIENCE. Interests in NEUROIMAGING in any of these areas would also be a plus, since the Department, in conjunction with UMDNJ, jointly administers a 3T Neuroimaging Center (see http://www.newark.rutgers.edu/fmri/). The successful candidate is expected to develop and maintain an active, externally funded research program, and to teach at both the graduate and undergraduate levels. Review of applications will begin JANUARY 30th, 2004, pending final budgetary approval from the administration. Rutgers University is an equal opportunity/affirmative action employer. Qualified women and minority candidates are encouraged to apply. Please send a CV, a statement of current and future research interests, and three letters of recommendation to COGNITIVE SCIENCE SEARCH COMMITTEE, Department of Psychology, Rutgers University, Newark, NJ 07102. Email enquiries can be made to cogsci at psychology.rutgers.edu.

-- Stephen J.
Hanson From cns at cns.bu.edu Fri Sep 26 14:09:00 2003 From: cns at cns.bu.edu (CNS Department) Date: Fri, 26 Sep 2003 14:09:00 -0400 Subject: GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY Message-ID: <3F7480BC.8010201@cns.bu.edu> PLEASE POST ******************************************************************* GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY ******************************************************************* The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The brochure may also be viewed on line at: http://www.cns.bu.edu/brochure/ and application forms at: http://www.bu.edu/cas/graduate/application.html Applications for Fall 2004 admission and financial aid are now being accepted for PhD, MA, and BA/MA degree programs. To obtain a brochure describing CNS programs and a set of application materials, write, telephone, or fax: DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS Boston University 677 Beacon Street Boston, MA 02215 617/353-9481 (phone) 617/353-7755 (fax) or send via email your full name and mailing address to the attention of Mr. Robin Amos at: amos at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) general test scores. 
GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. ******************************************************************* Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students and qualified undergraduates interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. The department's training and research focus on two broad questions. The first question is: How does the brain control behavior? This is a modern form of the Mind/Body Problem. The second question is: How can technology emulate biological intelligence? This question needs to be answered to develop intelligent technologies that are well suited to human societies. These goals are symbiotic because brains are unparalleled in their ability to intelligently adapt on their own to complex and novel environments. Models of how the brain accomplishes this are developed through systematic empirical, mathematical, and computational analysis in the department. Autonomous adaptation to a changing world is also needed to solve many of the outstanding problems in technology, and the biological models have inspired qualitatively new designs for applications. CNS is a world leader in developing biological models that can quantitatively simulate the dynamics of identified brain cells in identified neural circuits, and the behaviors that they control. This new level of understanding is producing comparable advances in intelligent technology. CNS is a graduate department that is devoted to the interdisciplinary training of graduate students. 
The department awards MA, PhD, and BA/MA degrees. Its students are trained in a broad range of areas concerning computational neuroscience, cognitive science, and neuromorphic systems. The biological training includes study of the brain mechanisms of vision and visual object recognition; audition, speech, and language understanding; recognition learning, categorization, and long-term memory; cognitive information processing; self-organization and development, navigation, planning, and spatial orientation; cooperative and competitive network dynamics and short-term memory; reinforcement and motivation; attention; adaptive sensory-motor planning, control, and robotics; biological rhythms; consciousness; mental disorders; and the mathematical and computational methods needed to support advanced modeling research and applications. Technological training includes methods and applications in image processing, multiple types of signal processing, adaptive pattern recognition and prediction, information fusion, and intelligent control and robotics. The foundation of this broad training is the unique curriculum of seventeen interdisciplinary graduate courses that have been developed at CNS. Each of these courses integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of artificial neural networks and hybrid systems to technology. A student's curriculum is tailored to his or her career goals with academic and research advisors. In addition to taking interdisciplinary courses within CNS, students develop important disciplinary expertise by also taking courses in departments such as biology, computer science, engineering, mathematics, and psychology.
In addition to these formal courses, students work individually with one or more research advisors to learn how to carry out advanced interdisciplinary research in their chosen research areas. As a result of this breadth and depth of training, CNS students have succeeded in finding excellent jobs in both academic and technological areas after graduation. The CNS Department interacts with colleagues in several Boston University research centers, and with Boston-area scientists collaborating with these centers. The units most closely linked to the department are the Center for Adaptive Systems and the CNS Technology Laboratory. Students interested in neural network hardware can work with researchers in CNS and at the College of Engineering. Other research resources include distinguished research groups in the campus-wide Program in Neuroscience, which unites cognitive neuroscience, neurophysiology, neuroanatomy, neuropharmacology, and neural modeling across the Charles River Campus and the Medical School; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. Key colleagues in these units hold joint appointments in CNS in order to expedite training and research interactions with CNS core faculty and students. In addition to its basic research and training program, the department organizes an active colloquium series, various research and seminar series, and international conferences and symposia, to bring distinguished scientists from experimental, theoretical, and technological disciplines to the department.
The department is housed in its own four-story building, which includes ample space for faculty and student offices and laboratories (computational neuroscience, visual psychophysics, psychoacoustics, speech and language, sensory-motor control, neurobotics, computer vision, and technology), as well as an auditorium, classroom, seminar rooms, a library, and a faculty-student lounge. The department has a powerful computer network for carrying out large-scale simulations of behavioral and brain models and applications. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior Helen Barbas Professor, Department of Health Sciences, Sargent College PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development Gail A. Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems PhD, Mathematics, University of Wisconsin, Madison Learning and memory, vision, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations Michael A. Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems, cardiovascular oscillations physiology and time series H. 
Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, auditory virtual environments, signal processing models of hearing Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology John C. Fiala Research Assistant Professor of Biology PhD, Cognitive and Neural Systems, Boston University Synaptic plasticity, dendrite anatomy and pathology, motor learning, robotics, neuroinformatics Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Vision, audition, language, learning and memory, reward and motivation, cognition, development, sensory-motor control, mental disorders, applications Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, biological sensory-motor control and functional brain imaging Catherine L. Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition Michael E.
Hasselmo Associate Professor of Psychology Director of Graduate Studies, Psychology Department PhD, Experimental Psychology, Oxford University Computational modeling and experimental testing of neuromodulatory mechanisms involved in encoding, retrieval and consolidation Allyn Hubbard Associate Professor of Electrical and Computer Engineering PhD, Electrical Engineering, University of Wisconsin VLSI circuit design: digital, analog, subthreshold analog, biCMOS, CMOS; information processing in neurons, neural net chips, synthetic aperture radar (SAR) processing chips, sonar processing chips; auditory models and experiments Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Neural network theory, complexity theory, wavelet theory, mathematical physics Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamics of networks of neurons Jacqueline A. 
Liederman
Associate Professor of Psychology
PhD, Psychology, University of Rochester
Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders

Siegfried Martens
Research Associate, Department of Cognitive and Neural Systems
PhD, Cognitive and Neural Systems, Boston University
Learning models, pattern recognition, visualization, remote sensing, sensor fusion

Ennio Mingolla
Professor of Cognitive and Neural Systems and Psychology
PhD, Psychology, University of Connecticut
Visual perception, mathematical modeling of visual processes

Joseph Perkell
Adjunct Professor of Cognitive and Neural Systems
Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology
PhD, Massachusetts Institute of Technology
Motor control of speech production

Adam Reeves
Adjunct Professor of Cognitive and Neural Systems
Professor of Psychology, Northeastern University
PhD, Psychology, City University of New York
Psychophysics, cognitive psychology, vision

Bradley Rhodes
Research Associate, Department of Cognitive and Neural Systems
PhD, Cognitive and Neural Systems, Boston University
Motor control, learning, and adaptation; serial order behavior (timing in particular); attention and memory

Michele Rucci
Assistant Professor of Cognitive and Neural Systems
PhD, Scuola Superiore S. Anna, Pisa, Italy
Vision, sensory-motor control and learning, and computational neuroscience

Elliot Saltzman
Associate Professor of Physical Therapy, Sargent College
Research Scientist, Haskins Laboratories, New Haven, CT
Assistant Professor in Residence, Department of Psychology and Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, CT
PhD, Developmental Psychology, University of Minnesota
Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities

Robert Savoy
Adjunct Associate Professor of Cognitive and Neural Systems
Scientist, Rowland Institute for Science
Experimental Psychologist, Massachusetts General Hospital
PhD, Experimental Psychology, Harvard University
Computational neuroscience; visual psychophysics of color, form, and motion perception; teaching about functional MRI and other brain mapping methods

Eric Schwartz
Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology
PhD, High Energy Physics, Columbia University
Computational neuroscience, machine vision, neuroanatomy, neural modeling

Robert Sekuler
Adjunct Professor of Cognitive and Neural Systems
Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center
Frances and Louis H. Salvage Professor of Psychology, Brandeis University
Consultant in neurosurgery, Boston Children's Hospital
PhD, Psychology, Brown University
Visual motion, brain imaging, relation of visual perception, memory, and movement

Barbara Shinn-Cunningham
Associate Professor of Cognitive and Neural Systems and Biomedical Engineering
PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology
Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance

David Somers
Assistant Professor of Psychology
PhD, Cognitive and Neural Systems, Boston University
Functional MRI, psychophysical, and computational investigations of visual perception and attention

Chantal E. Stern
Assistant Professor of Psychology and Program in Neuroscience, Boston University
Assistant in Neuroscience, MGH-NMR Center and Harvard Medical School
PhD, Experimental Psychology, Oxford University
Functional neuroimaging studies (fMRI and MEG) of learning and memory

Malvin C. Teich
Professor of Electrical and Computer Engineering, Biomedical Engineering, and Physics
PhD, Cornell University
Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission

Lucia Vaina
Professor of Biomedical Engineering
Research Professor of Neurology, School of Medicine
PhD, Sorbonne (France); Dres Science, National Polytechnique Institute, Toulouse (France)
Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging

Takeo Watanabe
Associate Professor of Psychology
PhD, Behavioral Sciences, University of Tokyo
Perception of objects and motion, and effects of attention on perception, using psychophysics and brain imaging (fMRI)

Jeremy Wolfe
Adjunct Professor of Cognitive and Neural Systems
Associate Professor of Ophthalmology, Harvard Medical School
Psychophysicist, Brigham & Women's Hospital, Surgery Department
Director of Psychophysical Studies, Center for Clinical Cataract Research
PhD, Massachusetts Institute of Technology
Visual attention, pre-attentive and attentive object representation

Curtis Woodcock
Professor of Geography
Chairman, Department of Geography
Director, Geographic Applications, Center for Remote Sensing
PhD, University of California, Santa Barbara
Biophysical remote sensing, particularly of forests and natural vegetation; canopy reflectance models and their inversion; spatial modeling and change detection; biogeography; spatial analysis; geographic information systems; digital image processing

CNS DEPARTMENT COURSE OFFERINGS

CAS CN500 Computational Methods in Cognitive and Neural Systems
CAS CN510 Principles and Methods of Cognitive and Neural Modeling I
CAS CN520 Principles and Methods of Cognitive and Neural Modeling II
CAS CN530 Neural and Computational Models of Vision
CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control
CAS CN550 Neural and Computational Models of Recognition, Memory and Attention
CAS CN560 Neural and Computational Models of Speech Perception and Production
CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm
CAS CN580 Introduction to Computational Neuroscience
GRS CN700 Computational and Mathematical Methods in Neural Modeling
GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior
GRS CN730 Models of Visual Perception
GRS CN740 Topics in Sensory-Motor Control
GRS CN760 Topics in Speech Perception and Recognition
GRS CN780 Topics in Computational Neuroscience
GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception
GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception
GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition
GRS CN915,916 Research in Neural Networks for Vision and Image Processing
GRS CN921,922 Research in Neural Networks for Speech and Language Processing
GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control
GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning
GRS CN935,936 Research in Neural Networks for Cognitive Information Processing
GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks
GRS CN945,946 Research in Technological Applications of Neural Networks
GRS CN951,952 Research in Hardware Implementations of Neural Networks

CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and student-run special interest groups, and attend lectures and meetings throughout the Boston area; advanced students work in small research groups.

LABORATORY AND COMPUTER FACILITIES

The department is funded by fellowships, grants, and contracts from federal agencies and private foundations that support research in life sciences, mathematics, artificial intelligence, and engineering.
Facilities include laboratories for experimental research and computational modeling in visual perception; audition, speech, and language processing; and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network of Sun workstations, Macintoshes, and PCs. A PC farm running the Linux operating system is available as a distributed computational environment. All students have access to X-terminals or UNIX workstation consoles, a selection of color systems and PCs, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software packages are available for use. These include:

Active Perception Laboratory

The Active Perception Laboratory is dedicated to investigating the interactions between perception and behavior. Research focuses on theoretical and computational analyses of the effects of motor behavior on sensory perception, and on the design of psychophysical experiments with human subjects. The laboratory includes extensive computational facilities that allow large-scale simulations of neural systems. Additional facilities include instruments for the psychophysical investigation of eye movements during visual analysis, including an accurate, non-invasive eye tracker, and robotic systems for the simulation of different types of behavior.

Auditory Neuroscience Laboratory

The Auditory Neuroscience Laboratory in the Department of Cognitive and Neural Systems (CNS) is equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli.
The laboratory contains approximately eight PCs (running Windows 98 and/or Linux), used both as workstations for students and to control laboratory equipment and run experiments. Other major equipment includes special-purpose signal-processing and sound-generating equipment from Tucker-Davis Technologies, electromagnetic head-tracking systems, a two-channel spectrum analyzer, and miscellaneous equipment for producing, measuring, analyzing, and monitoring auditory stimuli. The laboratory consists of three adjacent rooms in the basement of 677 Beacon Street (the home of the CNS Department). One room houses an 8 ft. x 8 ft. single-walled sound-treated booth as well as space for students. The second room is primarily used as student workspace for developing and debugging experiments. The third room houses a robotic arm capable of automatically positioning a small acoustic speaker anywhere on the surface of a sphere of adjustable radius, allowing automatic measurement of the signals reaching a listener's ears from a sound source at different positions in space, including the effects of room reverberation.

Computer Vision/Computational Neuroscience Laboratory

The Computer Vision/Computational Neuroscience Laboratory comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; an active vision laboratory, including actuators and video hardware; and systems for computer-aided neuroanatomy and the application of computer graphics and image processing to brain sections and MRI images. The laboratory supports research in neural modeling, computational neuroscience, computer vision, and robotics. The major question being addressed is the nature of the representation of the visual world in the brain, in terms of observable neural architectures such as topographic mapping and columnar architecture.
The application of novel architectures for image processing in computer vision and robotics is also a major topic of interest. Recent work in this area has included the design and patenting of novel actuators for robotic active vision systems, the design of real-time algorithms for mobile robotic applications, and the design and construction of miniature autonomous vehicles using space-variant active vision design principles. Recently one such vehicle successfully drove itself on the streets of Boston.

Neurobotics Laboratory

The Neurobotics Laboratory uses wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The laboratory currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a movable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Other platforms may be investigated in the future.

Sensory-Motor Control Laboratory

The Sensory-Motor Control Laboratory supports experimental and computational studies of sensory-motor control. A computer-controlled infrared WatSmart system allows measurement of large-scale (e.g., reaching) movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. A second major component is a helmet-mounted, video-based eye-head tracking system (ISCAN Corp., 1997). The system's camera samples eye position at 240 Hz and allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting, affording a wide range of visuo-motor studies. The laboratory is connected to the department's extensive network of Linux and Windows workstations and Linux computational servers.
Speech and Language Laboratory

The Speech Laboratory includes software facilities for analog-to-digital and digital-to-analog conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal-processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing speech recognition algorithms. The laboratory also contains a network of Windows-based PCs equipped with software for the analysis of functional magnetic resonance imaging (fMRI) data, including region-of-interest (ROI) analyses involving software for the parcellation of cortical and subcortical brain regions in structural MRI images.

Technology Laboratory

The Technology Laboratory fosters the development of neural network models derived from basic scientific research and facilitates the transition of the resulting technologies to software and applications. The Lab was established in July 2001 with a grant from the Air Force Office of Scientific Research, "Information Fusion for Image Analysis: Neural Models and Technology Development." Initial projects have focused on multi-level fusion and data mining in a geospatial context, in collaboration with the Boston University Center for Remote Sensing. This research and development has built on models of opponent-color visual processing, the boundary contour system (BCS) and texture processing, and Adaptive Resonance Theory (ART) pattern learning and recognition, as well as other models of associative learning and prediction. Other projects include collaborations with the New England Medical Center and Boston Medical Center to develop methods for the analysis of large-scale medical databases, currently to predict HIV resistance to antiretroviral therapy.
Associated basic research projects are conducted within the joint context of scientific data and technological constraints.

Visual Psychophysics Laboratory

The Visual Psychophysics Laboratory occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer-controlled display platforms, including Macintosh, Windows, and Linux workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing devices, a photometer, and a variety of display-generation, data-collection, and data-analysis software.

Affiliated Laboratories

Affiliated CAS/CNS faculty members have additional laboratories, ranging from visual and auditory psychophysics, neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations.

*******************************************************************
DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS
GRADUATE TRAINING ANNOUNCEMENT
Boston University
677 Beacon Street
Boston, MA 02215
Phone: 617/353-9481
Fax: 617/353-7755
Email: inquiries at cns.bu.edu
Web: http://cns.bu.edu/
*******************************************************************

From kunliu1 at cs.umbc.edu Sat Sep 27 05:03:12 2003
From: kunliu1 at cs.umbc.edu (Kun Liu)
Date: Sat, 27 Sep 2003 4:3:12 -0500
Subject: Distributed Data Mining Bibliography
Message-ID: <200309270803.h8R83Sgv023416@mailserver-ng.cs.umbc.edu>

We are pleased to announce the availability of a bibliography on Distributed Data Mining. The current version contains 246 entries. It can be downloaded from the following site:

http://www.csee.umbc.edu/~hillol/DDMBIB/

The site also provides an interface to submit bibliographic information for relevant papers.

Kun Liu
University of Maryland, Baltimore County