From vlassis at science.uva.nl Wed Oct 1 05:06:24 2003 From: vlassis at science.uva.nl (Nikos Vlassis) Date: 01 Oct 2003 11:06:24 +0200 Subject: intro to multiagent systems and DAI Message-ID: <1064999184.23181.102.camel@thebe> Dear colleagues, I would like to point out a short introductory text (70 pages) on "Multiagent systems and Distributed AI": http://www.science.uva.nl/~vlassis/cimasdai/ There is also software available for the well-known "predator-prey" simulation environment for multiagent systems. Corrections/suggestions are most welcome! Best regards, Nikos Vlassis Univ. of Amsterdam, The Netherlands -- From esann at dice.ucl.ac.be Wed Oct 1 15:25:17 2003 From: esann at dice.ucl.ac.be (esann) Date: Wed, 1 Oct 2003 21:25:17 +0200 Subject: CFP: ESANN'2004 European Symposium on Artificial Neural Networks Message-ID: <002b01c38851$c1928780$43ed6882@dice.ucl.ac.be> ESANN'2004 12th European Symposium on Artificial Neural Networks Bruges (Belgium) - April 28-29-30, 2004 Announcement and call for papers ===================================================== Technically co-sponsored by the International Neural Networks Society, the European Neural Networks Society, the IEEE Neural Networks Society, the IEEE Region 8 (to be confirmed), the IEEE Benelux Section. The call for papers for the ESANN'2004 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate it if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; please accept our apologies if you receive this e-mail twice despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). ESANN'2004 is organised in collaboration with the UCL (Universite catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven). Scope and topics ---------------- Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference for researchers on fundamentals and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN, in order to present their latest results and comprehensive surveys, and to discuss the future developments in this field. The ESANN'2004 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical and statistical aspects, in the context of function approximation, classification, data analysis, control, time-series prediction, signal processing, vision, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. Whether a paper is assigned to an oral or a poster session is decided by its topic, not by its quality. Posters will be selected according to the same criteria as oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may only indicate a preference for oral or poster presentation.
The following is a non-exhaustive list of topics covered during the ESANN conferences: - Models and architectures - Learning algorithms - Theory - Mathematics - Statistical data analysis - Classification - Approximation of functions - Time series forecasting - Nonlinear dimension reduction - Multi-layer Perceptrons - RBF networks - Self-organizing maps - Vector quantization - Support Vector Machines - Recurrent networks - Fuzzy neural nets - Hybrid networks - Bayesian neural nets - Cellular neural networks - Signal processing - Independent component analysis - Natural and artificial vision - Adaptive control - Identification of non-linear dynamical systems - Biologically plausible networks - Bio-inspired systems - Cognitive psychology - Evolutive learning - Adaptive behaviour Special sessions ---------------- Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address. Here is the list of special sessions that will be organized during the ESANN'2004 conference: 1. Neural methods for non-standard data B. Hammer, Univ. Osnabrück, B.J. Jain, Tech. Univ. Berlin (Germany) 2. Soft-computing techniques for time series forecasting I. Rojas, Univ. Granada (Spain) 3. Neural networks for data mining R. Andonie, Transylvania Univ. (Romania) 4. Theory and applications of neural maps U. Seiffert, IPK Gatersleben, T. Villmann, Univ. Leipzig, A. Wismüller, Univ. Munich (Germany) 5. Industrial applications of neural networks L.M. Reyneri, Politecnico di Torino (Italy) 6. Hardware systems for neural devices P. Fleury, A. Bofill-i-Petit, Univ. Edinburgh (Scotland, UK) Location -------- The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe. Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known worldwide and famous for its architectural style, its canals, and its pleasant atmosphere. The conference will be organized in a hotel located near the centre (walking distance) of the town. There is no obligation for the participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; there is a possibility to book a room in the hotel of the conference at a preferential rate through the conference secretariat. A list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium. Proceedings and journal special issue ------------------------------------- The proceedings will include all communications presented at the conference (tutorials, oral and posters), and will be available on-site. Extended versions of selected papers will be published in the Neurocomputing journal (Elsevier). Call for contributions ---------------------- Prospective authors are invited to submit their contributions before 5 December 2003. The electronic submission procedure will be available soon on the ESANN Web pages http://www.dice.ucl.ac.be/esann/. Authors must indicate their preference for oral or poster presentation at submission time.
They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. Authors of accepted papers will have to register before February 28, 2004. They will benefit from the advance registration fee. Deadlines --------- Submission of papers December 5, 2003 Notification of acceptance February 6, 2004 Symposium April 28-30, 2004 Registration fees -----------------
                                                   Universities   Industries
speaker registration (before 28 February 2004,
  one paper per speaker)                                430            530
non-speaker registration (before 12 March 2004)         430            530
non-speaker registration (after 12 March 2004)          485            585
The registration fee includes attendance at all sessions, the ESANN'2004 dinner, a copy of the proceedings, daily lunches (28-30 April 2004), and the coffee breaks. Conference secretariat ---------------------- ESANN'2004, d-side conference services, 24 av. L. Mommaerts, B - 1140 Evere (Belgium) phone: + 32 2 730 06 11 Fax: + 32 2 730 06 00 E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/esann Steering and local committee (to be confirmed) ---------------------------- Hugues Bersini Univ. Libre Bruxelles (B) François Blayo Préfigure (F) Marie Cottrell Univ. Paris I (F) Jeanny Hérault INPG Grenoble (F) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Jean-Pierre Peters FUNDP Namur (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------- Hervé Bourlard IDIAP Martigny (CH) Joan Cabestany Univ. Polit. de Catalunya (E) Colin Campbell Bristol Univ. (UK) Stéphane Canu Inst. Nat. Sciences App. (F) Holk Cruse Universität Bielefeld (D) Eric de Bodt Univ. Lille II (F) & UCL Louvain-la-Neuve (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicholas Copernicus Univ. (PL) Marc Duranton Philips Semiconductors (USA) Richard Duro Univ. Coruna (E) Jean-Claude Fort Université Nancy I (F) Colin Fyfe Univ. Paisley (UK) Stan Gielen Univ. of Nijmegen (NL) Marco Gori Univ. Siena (I) Bernard Gosselin Fac. Polytech. Mons (B) Manuel Grana UPV San Sebastian (E) Anne Guérin-Dugué INPG Grenoble (F) Barbara Hammer Univ. of Osnabrück (D) Martin Hasler EPFL Lausanne (CH) Laurent Hérault CEA-LETI Grenoble (F) Gonzalo Joya Univ. Malaga (E) Christian Jutten INPG Grenoble (F) Juha Karhunen Helsinki Univ. of Technology (FIN) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Jouko Lampinen Helsinki Univ. of Tech. (FIN) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Mia Loccufier Univ. Gent (B) Erzsebet Merenyi Rice Univ. (USA) Jean-Arcady Meyer Univ. Paris 6 (F) José Mira UNED (E) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Gilles Pagès Univ. Paris 6 (F) Thomas Parisini Univ. Trieste (I) Hélène Paugam-Moisy Université Lumière Lyon 2 (F) Alberto Prieto Universidad de Granada (E) Didier Puzenat Univ. Antilles-Guyane (F) Leonardo Reyneri Politecnico di Torino (I) Jean-Pierre Rospars INRA Versailles (F) Jose Santos Reyes Univ. Coruna (E) Udo Seiffert IPK Gatersleben (D) Jochen Steil Univ. Bielefeld (D) John Stonham Brunel University (UK) Johan Suykens K. U. Leuven (B) John Taylor King's College London (UK) Claude Touzet Univ. Provence (F) Marc Van Hulle KUL Leuven (B) Thomas Villmann Univ. Leipzig (D) Christian Wellekens Eurecom Sophia-Antipolis (F) Axel Wismüller Ludwig-Maximilians-Univ.
München (D) ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Machine Learning Group 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From chaudhri at AI.SRI.COM Wed Oct 1 20:26:43 2003 From: chaudhri at AI.SRI.COM (Vinay K. Chaudhri) Date: Wed, 01 Oct 2003 17:26:43 -0700 Subject: Research Position Message-ID: <3F7B70C3.2030709@ai.sri.com> Dear Colleagues: The Artificial Intelligence Center at SRI International is looking for a highly creative Computer Scientist to join a team of researchers building evaluation-driven but knowledge-based systems. We are seeking an individual with a thorough understanding of knowledge representation, with an emphasis on the representation of uncertain knowledge and reasoning under uncertainty. The person should have a knack for combining theoretical background with the pragmatic approaches necessary for engineering real systems. For a complete job posting, please see: http://sri.hrdpt.com/cgi-bin/c/highlightjob.cgi?jobID=1314 From juergen at idsia.ch Thu Oct 2 08:58:17 2003 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Thu, 02 Oct 2003 14:58:17 +0200 Subject: Job at IDSIA, Switzerland / nips 2003 RNNaissance workshop Message-ID: <3F7C20E9.4060300@idsia.ch> We are seeking an outstanding postdoc or visitor for 8 months or so in 2004, with the possibility of prolongation for another year or more. He/she should be interested in reinforcement learning, adaptive robotics, recurrent neural networks, optimal universal search algorithms, program evolution, and cognitive science: http://www.idsia.ch/~juergen/jobcsem2004.html Juergen Schmidhuber, IDSIA ps: the NIPS 2003 RNNaissance workshop on recurrent neural networks is now open for submissions: http://www.idsia.ch/~juergen/rnnaissance.html From cns at cnsorg.org Thu Oct 2 12:47:12 2003 From: cns at cnsorg.org (CNS - Organization for Computational Neurosciences) Date: Thu, 2 Oct 2003 09:47:12 -0700 Subject: CNS*04 Message-ID: <1065113232.3f7c569021007@webmail.mydomain.com> CNS*04 The annual Computational Neuroscience Meeting will be held in Baltimore, MD, from July 18th - 20th, 2004. The main meeting will be followed by two days of workshops on July 21st and 22nd. In conjunction, the "2004 Annual Symposium, University of Maryland Program in Neuroscience: Computation in the Olfactory System" will be held as a satellite symposium to CNS*04 on Saturday, July 17th. The call for papers for CNS*04 can be expected in December 2003. For CNS*04, two types of submissions will be considered. Full papers, which can be included in the proceedings, will appear in the journal Neurocomputing and will be fully peer reviewed; extended abstracts will not be included in the proceedings but will be peer reviewed for inclusion in the meeting program. The submission deadline for full papers (in final format) and extended abstracts will be January 19, 2004. CNS is organized by the Organization for Computational Neurosciences. Further announcements will be posted on the organization's website, www.cnsorg.org.
Christiane Linster, President Erik De Schutter, Program Chair Asaf Keller, Local Organizer ** CNS - Organization for Computational Neurosciences ** From ken at phy.ucsf.edu Thu Oct 2 20:44:42 2003 From: ken at phy.ucsf.edu (Ken Miller) Date: Thu, 2 Oct 2003 17:44:42 -0700 Subject: New Q-Bio Archives Message-ID: <16252.50810.380146.842433@coltrane.ucsf.edu> Hi, I received the following announcement which seems very relevant to these lists, so I am passing it on. I am not formally involved in the Q-Bio archive, just an interested potential user. List moderator: It's probably best if you just post the announcement below without this header from me, since I have nothing to do with it, but it's up to you. Ken Miller Kenneth D. Miller telephone: (415) 476-8217 Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 ---------------------------------- ---------------------------------- Announcement of new Quantitative Biology (q-bio) archive 15 Sept 2003 Dear Colleagues, In recent years, an increasing number of researchers from mathematics, computer science, and the physical sciences have been joining biologists in the ongoing revolution in biology. In a variety of ways, these researchers are contributing towards making biology a quantitative science. With this letter, we announce the formation of the q-bio archive (http://arXiv.org/archive/q-bio, see also http://arXiv.org/new/q-bio.html), which aims to serve the need of this emerging community. If you and your colleagues have active interest in quantitative biology (including but not limited to biological physics, computational biology, neural science, systems biology, bioinformatics, mathematical biology, and theoretical biology), we urge you to subscribe to the archive and submit (p)reprints to it. Both theoretical and experimental contributions are welcome, and subscription is freely accessible over the internet to all members of the scientific community. Instructions for registration, submission and subscription to the archive can be found at http://arXiv.org/help/registerhelp, http://arXiv.org/help/uploads, and http://arXiv.org/help/subscribe. The q-bio archive has grown out of a well-established series of e-Print archives accessible at http://arxiv.org/. The number of biology-related submissions to these archives has risen steadily over the last several years, and is averaging over 40/month so far in 2003. Unfortunately, these submissions are currently scattered across a number of sub-archives (including physics, cond-mat, nonlinear science, math, etc.), reflecting mostly the "home field" of the contributors rather than the subject matters of their submissions. Many colleagues have expressed the desire to have a centralized archive to share their latest results, and to learn about related findings by others in this field. The q-bio archive is designed to address this problem. It is organized mainly according to different categories of biological processes and partitioned according to their scales in space and time. The categories http://arXiv.org/new/q-bio.html range from molecular and sub-cellular structures to tissues and organs, from the kinetics of molecules to population and evolutionary dynamics. In addition, a separate category is devoted to method-dominated contributions, including computational algorithms, experimental methods, as well as novel approaches to analyzing experimental data. 
All submissions are required to choose a primary category, with the option for one or more secondary categories. Subscribers of the archive will receive by e-mail the title/abstracts of all submissions in their chosen categories on a regular basis. A large number of bio-related submissions to the e-Print archives during the past decade have already been identified and categorized according to the above scheme using an automated procedure. They can be accessed at http://arxiv.org/archive/q-bio. Please note that the current list of categories is a compromise between the large number of active subject matters in biology and the areas of quantitative biology where the e-print archives have received significant contributions during the past several years. The subject list will undoubtedly be updated as the major active areas develop/shift in time. This continuous structuring of the archive is overseen by an advisory committee. It consists of a number of well-established biologists, William F. Loomis (UCSD), Chuck Stevens (Salk), Gary Stormo (WUSTL), Diethard Tautz (Cologne), together with a number of dedicated volunteers who will serve as "moderators" for each category listed at http://arXiv.org/new/q-bio.html. If you have suggestions to improve the q-bio archive, please contact the coordinators or the relevant moderators by e-mail. From nips-workshop-admin at bcs.rochester.edu Fri Oct 3 17:21:45 2003 From: nips-workshop-admin at bcs.rochester.edu (Robert Jacobs) Date: Fri, 03 Oct 2003 17:21:45 -0400 Subject: NIPS workshop schedule Message-ID: <5.1.1.6.0.20031003171116.0373ca38@bcs.rochester.edu> The Neural Information Processing Systems (NIPS) conference and workshops will take place in Vancouver and Whistler, respectively, on December 8-13. The workshop schedule is now available (see below). More information (including more details about each workshop) can be obtained from the NIPS web site: http://www.nips.cc Robert Jacobs and Satinder Singh NIPS workshop co-chairs ========================================== Neural Information Processing Systems (NIPS) workshops ------------------------------------------------------------------------------------- Two-Day Workshops ------------------------------- Title: Neural-Inspired Architectures for Nanoelectronics Organizers: Valeriu Beiu, Ulrich Rückert Title: Robust Communication Dynamics in Complex Networks Organizers: Rajarshi Das, Irina Rish, Gerald Tesauro, Cris Moore Friday, December 12 ------------------------------- Title: Estimation of Entropy and Information of Undersampled Probability Distributions: Theory, Algorithms, and Applications to the Neural Code Organizers: William Bialek, Ilya Nemenman Title: Feature Extraction Challenge Organizers: Isabelle Guyon, Masoud Nikravesh, Kristin Bennett, Richard Caruana, Asa Ben-Hur, Andre Elisseeff, Fernando Perez-Cruz, Steve Gunn Title: Hyperspectral Remote Sensing and Machine Learning Organizers: J.
Anthony Gualtieri Title: Machine Learning Meets the User Interface Organizers: John Shawe-Taylor, John Platt Title: Neural Representation of Uncertainty Organizers: Sophie Deneve, Angela Yu Title: New Problems and Methods in Bioinformatics Organizers: Christina Leslie, William Noble, Koji Tsuda Title: RNNaissance Workshop (Recurrent Neural Networks) Organizers: Juergen Schmidhuber, Alex Graves, Bram Bakker Title: Syntax, Semantics, and Statistics Organizers: Richard Shiffrin, Mark Steyvers, David Blei, Tom Griffiths Saturday, December 13 ----------------------------------- Title: Approximate Nearest Neighbor Techniques for Local Learning and Perception Organizers: Trevor Darrell, Piotr Indyk, Gregory Shakhnarovich, Paul Viola Title: Computing with Spikes: Implementation of Biology and Theory Organizers: Ralph Etienne-Cummings, Timothy Horiuchi, Giacomo Indiveri Title: ICA: Sparse Representations in Signal Processing Organizers: Barak Pearlmutter, Scott Rickard, Justinian Rosca, Stefan Harmeling Title: Information Theory and Learning: The Bottleneck and Information Distortion Approach Organizers: Naftali Tishby, Tomas Gedeon Title: Neural Processing of Complex Acoustic Signals Organizers: Melissa Dominguez, Ian Bruce, Sue Becker Title: Nonparametric Bayesian Methods and Infinite Models Organizers: Matthew Beal, Yee Whye Teh Title: Open Challenges in Cognitive Vision Organizers: Barbara Caputo, Henrik Christensen, Christian Wallraven Title: Planning for the Real World: The Promises and Challenges of Dealing with Uncertainty Organizers: Drew Bagnell, Joelle Pineau, Nicholas Roy From mcrae at uwo.ca Sun Oct 5 17:50:26 2003 From: mcrae at uwo.ca (Ken McRae) Date: Sun, 05 Oct 2003 17:50:26 -0400 Subject: Postdoctoral Fellowship Message-ID: Postdoctoral Fellowship in Connectionist Modeling and Semantic Memory We have funding for a two-year Postdoctoral Fellowship at the University of Western Ontario in London, Ontario, Canada. The stipend is $45,000 per year plus $2,500 per year for conference travel. There are no citizenship restrictions. We are most interested in someone who would like to study issues regarding the role of semantic factors in word recognition and connectionist models of the computations involved. Our research incorporates theories and methodologies from a number of areas, including those typically associated with connectionist modeling, word recognition, semantic memory, concepts and categorization, and cognitive neuropsychology. Thus, there is the potential for working on various projects under the general umbrella of modeling semantic memory phenomena. Our department has a number of Cognition faculty, many of whom conduct research related to language processing and concepts. Thus, our faculty, postdocs, and graduate students provide a rich research environment. Our labs are well-equipped for both human experimentation and computational modeling. London is a pleasant city of approximately 350,000, and is located 2 hours drive from either Toronto or Detroit. Note that a reasonable one-bedroom apartment in London costs approximately $700 per month. For further information about our labs, and Cognition at UWO, see: http://www.ssc.uwo.ca/psychology/cognitive/cognitive.html although it is somewhat out of date and is currently being reconstructed. If you are interested in this position, please send a cv, statement of research interests, at least 2 letters of reference, and sample articles to the address below. Sending all information electronically is preferable. 
We are interested in hiring someone as soon as possible. If you would like more information about this position, please contact either Stephen Lupker (lupker at uwo.ca) or me directly. Ken McRae Associate Professor, Department of Psychology & Neuroscience Program Social Science Centre University of Western Ontario London, Ontario CANADA N6A 5C2 email: mcrae at uwo.ca phone: (519) 661-2111 ext. 84688 fax: (519) 661-3961 From mseeger at EECS.Berkeley.EDU Mon Oct 6 19:31:47 2003 From: mseeger at EECS.Berkeley.EDU (mseeger@EECS.Berkeley.EDU) Date: Mon, 06 Oct 2003 23:31:47 GMT Subject: PhD thesis on PAC-Bayesian bounds and sparse Gaussian processes Message-ID: <182621182fdd.182fdd182621@EECS.Berkeley.EDU> Dear colleagues, my PhD thesis is available online at www.dai.ed.ac.uk/~seeger/papers/thesis.html. It mainly deals with: - PAC-Bayesian generalisation error bounds and applications to Gaussian process classification - Sparse approximations for linear-time inference in Gaussian process models Please find the abstract and table of contents on the website. You might also be interested in the tutorial paper Gaussian Processes for Machine Learning, available at www.dai.ed.ac.uk/~seeger/papers/bayesgp-tut.html, which is extracted from the thesis but is self-contained. An abstract follows. Best wishes, Matthias. ---- Gaussian Processes for Machine Learning Gaussian processes (GPs) are natural generalisations of multivariate Gaussian random variables to infinite (countable or continuous) index sets. GPs have been applied in a large number of fields to a diverse range of ends, and many deep theoretical analyses of various properties are available. This paper gives an introduction to Gaussian processes on a fairly elementary level, with special emphasis on characteristics relevant in machine learning. It draws explicit connections to branches such as spline smoothing models and support vector machines, in which similar ideas have been investigated. --- Matthias Seeger, 485 Soda Hall, UC Berkeley, Berkeley, CA 94720-1776, Fax: 510-642-5775, www.dai.ed.ac.uk/~seeger
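As a concrete companion to the tutorial announced above, here is a minimal sketch of Gaussian process regression with a squared-exponential covariance, using the standard Cholesky-based computation of the posterior mean and variance. It is written for this archive as an illustration and is not code taken from the thesis or the tutorial; the function names and the toy data are made up for the example.

# Minimal Gaussian process regression sketch (squared-exponential covariance);
# an illustration of the ideas in the tutorial above, not code taken from it.
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance k(a, b) = variance * exp(-(a - b)^2 / (2 l^2))."""
    return variance * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

def gp_posterior(X, y, Xstar, noise=0.1):
    """Posterior mean and variance of f(Xstar) given noisy observations y = f(X) + eps."""
    K = rbf(X, X) + noise**2 * np.eye(len(X))    # covariance of the noisy training targets
    Ks = rbf(X, Xstar)                           # train/test cross-covariance
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y via two triangular solves
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xstar, Xstar)) - np.sum(v**2, axis=0)
    return mean, var

# Toy usage: noisy samples of a sine, predictions on a grid of test inputs
X = np.linspace(0, 5, 20)
y = np.sin(X) + 0.1 * np.random.default_rng(1).standard_normal(20)
mean, var = gp_posterior(X, y, np.linspace(0, 5, 100))

The cost of the exact computation is cubic in the number of training points, which is the bottleneck that the sparse approximations described in the thesis are designed to remove.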
From manfred at cse.ucsc.edu Tue Oct 7 17:16:51 2003 From: manfred at cse.ucsc.edu (Manfred Warmuth) Date: Tue, 7 Oct 2003 14:16:51 -0700 (PDT) Subject: Rob Schapire and Yoav Freund receive Goedel Prize for AdaBoost algorithm Message-ID: The Goedel Prize is one of the most prestigious prizes in Theoretical Computer Science (jointly sponsored by EATCS and SIGACT). This is the first time that a paper in Machine Learning has received this award. See http://sigact.acm.org/prizes/godel for some background information on the Goedel Prize and a list of past recipients. Rob Schapire and Yoav Freund received the 2003 prize for their famed AdaBoost paper. For an announcement see http://sigact.acm.org/prizes/godel/2003.html Background: Michael Kearns and Les Valiant first defined weak and strong learning and posed the open problem of whether weak learning and strong learning are the same. In short, weak learners must have accuracy only slightly better than 50%, while strong learners must be able to achieve high accuracy. In his 1991 Ph.D. thesis from MIT, Rob gave the first recursive construction for combining many weak learners to form a strong learner. This was followed by Yoav Freund's Ph.D. thesis in 1993, where he gave a simple flat scheme for combining weak learners by a majority vote. After graduating from Santa Cruz, Yoav accepted a job at AT&T Bell Labs, in what was one of the strongest machine learning research groups in the country. Rob was part of that group as well. They combined ideas and came up with an ``adaptive'' boosting algorithm (called AdaBoost - just 10 lines of code), which received a lot of attention in the Machine Learning and Statistics communities. The prize-winning paper that introduced AdaBoost: "A Decision Theoretic Generalization of On-Line Learning and an Application to Boosting," Journal of Computer and System Sciences 55 (1997), pp. 119-139. Congrats to them! _____________________________ Manfred K. Warmuth Prof. in Computer Science Univ. of Calif. at Santa Cruz manfred at cse.ucsc.edu http://www.cse.ucsc.edu/~manfred/
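For readers who have not seen it written out, here is a minimal sketch of the AdaBoost procedure referred to above, with decision stumps standing in as the weak learners. It follows the published algorithm (re-weight the examples after each round and combine the weak hypotheses by a weighted majority vote), but the helper names and the choice of weak learner are illustrative, not code from the prize-winning paper.

# AdaBoost sketch with decision stumps as weak learners (labels y in {-1, +1}).
import numpy as np

def fit_stump(X, y, w):
    """Weak learner: the single-feature threshold rule with lowest weighted error."""
    best, best_err = (0, 0.0, 1), np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] > t, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best, best_err = (j, t, s), err
    return best

def adaboost(X, y, rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        j, t, s = fit_stump(X, y, w)
        pred = s * np.where(X[:, j] > t, 1, -1)
        err = max(w[pred != y].sum(), 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # this weak hypothesis' vote
        w *= np.exp(-alpha * y * pred)          # up-weight the examples it got wrong
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    votes = sum(a * s * np.where(X[:, j] > t, 1, -1) for a, j, t, s in ensemble)
    return np.where(votes >= 0, 1, -1)

Each round concentrates the weight on the examples the current ensemble still misclassifies, so later weak learners focus on the hard cases; the weighted vote over all rounds is the strong learner whose existence the Kearns-Valiant question asked about.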
From BGabrys at bournemouth.ac.uk Tue Oct 7 10:23:51 2003 From: BGabrys at bournemouth.ac.uk (Bogdan Gabrys) Date: Tue, 7 Oct 2003 15:23:51 +0100 Subject: Industrial CASE PhD studentship available Message-ID: <4C40C6148BACD711AEF800805F8B335EDA45F1@exchange1.bournemouth.ac.uk> Dear Connectionists, The following research studentship is currently available. Please feel free to distribute to whoever may be interested in and qualified for this position. Best regards, Bogdan Gabrys *************************************************************************** EPSRC/BT funded Industrial CASE Studentship Computational Intelligence Research Group (CIRG) School of Design, Engineering and Computing, Bournemouth University, United Kingdom Applications are invited for a 3-year PhD research studentship to work on a project entitled "High Performance Fusion Systems", which is jointly funded by EPSRC and British Telecommunications plc (BT) under the EPSRC CASE scheme. The proposed research project will investigate and develop various approaches for highly efficient multiple classifier (prediction) systems composed of actively generated, well-performing and decorrelated classifiers (predictors). The emphasis will be on the automatic avoidance of data overfitting, accompanied by complexity and reliability control appropriate for potential industrial applications. Combination, aggregation and fusion of information are major problems for all kinds of knowledge-based systems, from image processing to decision making, from pattern recognition to automatic learning. Various statistical, machine learning and hybrid intelligent techniques will be used for processing and modelling of imperfect data and information. The student will join the Computational Intelligence Research Group and will be primarily based in the School of Design, Engineering & Computing in Bournemouth, but will also spend up to 3 months in each year of the project at BT Exact in Ipswich. The studentship carries a remuneration starting at a minimum of £12,000 pa tax-free (suitably increased in subsequent years) and payment of tuition fees at the home/EU rate. The successful applicant will need to have permanent residency status in the UK. Applicants should have a strong mathematical background and hold a first or upper second class honours degree or equivalent in computer science, mathematics, physics, engineering, statistics or a similar discipline. Additionally, the candidate should have strong programming experience using any or a combination of C, C++, Matlab or Java. Knowledge of ORACLE will be an advantage. For further details please contact Dr. Bogdan Gabrys, e-mail: bgabrys at bournemouth.ac.uk. Interested candidates should send a letter of application and a detailed CV with the names and addresses of two referees to: Dr. Bogdan Gabrys, Computational Intelligence Research Group, School of DEC, Bournemouth University, Poole House, Talbot Campus, Poole, BH12 5BB, UK. Applications can also be sent by e-mail. *************************************************************************** --------------------------------------------------------------------------- Dr Bogdan Gabrys Computational Intelligence Research Group School of Design, Engineering & Computing Bournemouth University, Poole House Talbot Campus, Fern Barrow Poole, BH12 5BB United Kingdom Tel: +44 (0)1202 595298 Fax: +44 (0) 1202 595314 E-mail: bgabrys at bournemouth.ac.uk WWW: http://dec.bournemouth.ac.uk/staff/bgabrys/ --------------------------------------------------------------------------- From rogilmore at psu.edu Tue Oct 7 11:28:57 2003 From: rogilmore at psu.edu (Rick Gilmore) Date: Tue, 7 Oct 2003 11:28:57 -0400 Subject: Developmental neuroscience position at Penn State Message-ID: Developmental Neuroscience Search PSYCHOLOGY, PENN STATE. The Department of Psychology at Penn State is broadening a search for candidates for a tenure-line faculty position, at any rank, with a specialization in any area of developmental neuroscience for Fall 2004. The department seeks an individual whose research will contribute to the department-wide neuroscience initiative and complement and broaden the capacities of the Department's Child Study Center (CSC) and Penn State's Child, Youth, and Family Consortium (CYFC). The CSC, a unit of the Psychology Department, is dedicated to the integration of developmental and clinical science (http://csc.la.psu.edu). The CYFC is a university-wide consortium dedicated to promoting interdisciplinary collaborations that advance research and outreach to children, youth, and families (http://www.cyfc.psu.edu). Candidates for the position may hold a doctorate in developmental or clinical child psychology (clinical candidates must hold a doctorate from an APA-approved program with an APA-approved internship). Preference will be given to candidates with postdoctoral experience. The ideal candidate would bring a research program with a focus on the role of the developing brain in the development of competence and/or psychopathology in childhood (e.g., cognitive neuroscience applied to learning and/or learning difficulties, affective neuroscience applied to the development of emotional and social competence or particular disorders). The ideal candidate would also be interested in collaborating with colleagues in the Psychology Department, other departments at Penn State, and the Hershey Medical School. Please send a letter of interest, vita, sample papers, and three letters of reference to Pamela M. Cole, Developmental Neuroscience Search Committee, Box M, Department of Psychology, Penn State University, University Park, PA 16802. Review of applications will begin September 15, 2003 and will continue until the position is filled. Penn State is committed to affirmative action, equal opportunity and the diversity of its workforce. From nnk at atr.co.jp Wed Oct 8 04:08:25 2003 From: nnk at atr.co.jp (Neural Networks Editorial Office) Date: Wed, 8 Oct 2003 17:08:25 +0900 Subject: NEURAL NETWORKS 16(8) Message-ID: NEURAL NETWORKS 16(8) Contents - Volume 16, Number 8 - 2003 ------------------------------------------------------------------ NEURAL NETWORKS LETTERS: Recurrent neural networks with trainable amplitude of activation functions Su Lee Goh, Danilo P.
Mandic Solving the XOR problem and the detection of symmetry using a single complex-valued neuron Tohru Nitta CONTRIBUTED ARTICLES: ***Psychology and Cognitive Science*** A neural model of how the brain represents and compares multi-digit numbers: spatial and categorical processes Stephen Grossberg, Dmitry V. Repin A neural network simulating human reach-grasp coordination by continuous updating of vector positioning commands Antonio Ulloa, Daniel Bullock ***Neuroscience and Neuropsychology*** A model synapse that incorporates the properties of short- and long-term synaptic plasticity Armen R. Sargsyan, Albert A. Melkonyan, Costas Papatheodoropoulos, Hovhannes H. Mkrtchian, George K. Kostopoulos ***Mathematical and Computational Analysis Back-propagation learning of infinite-dimensional dynamical systems Isao Tokuda, Ryuji Tokunaga, Kazuyuki Aihara Controlling chaos in chaotic neural network Guoguang He, Zhitong Cao, Ping Zhu, Hisakazu Ogura Neural independent component analysis by 'maximum-mismatch' learning principle Simone Fiori Necessary and sufficient condition for absolute stability of normal neural networks Tianguang Chu, Cishen Zhang, Zongda Zhang Book Review: Artificial Immune Systems: A New Computational Intelligence Approach L.N. de Castro, J. Timmis *** CURRENT EVENTS *** ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
--------------------------------------------------------------------------------
Membership Type           INNS            ENNS       JNNS
--------------------------------------------------------------------------------
membership with           $80 (regular)   SEK 660    Y 13,000 (plus Y 2,000
Neural Networks                                      enrollment fee)
                          $20 (student)   SEK 460    Y 11,000 (plus Y 2,000
                                                     enrollment fee)
--------------------------------------------------------------------------------
membership without        $30             SEK 200    not available to non-students
Neural Networks                                      Y 5,000 student (plus Y 2,000
(subscribe through                                   enrollment fee)
another society)
--------------------------------------------------------------------------------
Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Shozo Yasui Kyushu Institute of Technology Graduate School of Life Science and Engineering 2-4 Hibikino, Wakamatsu-ku Kitakyushu 808-0196 Japan 81 93 695 6108 (phone and fax) jnns at brain.kyutech.ac.jp http://www.jnns.org/ ----------------------------------------------------------------- From masami at email.arizona.edu Wed Oct 8 12:26:57 2003 From: masami at email.arizona.edu (Masami TATSUNO) Date: Wed, 8 Oct 2003 09:26:57 -0700 Subject: Preprint on possible neural architectures underlying information-geometric measures Message-ID: <4CC5E04DDFF2FF4D84EAC8F1A5EC00D82D05EB@mail.nsma.arizona.edu> Dear Connectionists, Our preprint, 'Investigation of Possible Neural Architectures Underlying Information-Geometric Measures', M. Tatsuno and M. Okada, to appear in Neural Computation, is now available for download at http://www.mns.brain.riken.go.jp/~okada/Tatsuno_Okada.pdf The preliminary results of this study have been reported in the following articles. 'Possible neural mechanisms underlying information-geometric measure parameters', M. Tatsuno and M. Okada, Society for Neuroscience Abstracts, 28, 675.15, 2002. 'How does the information-geometric measure depend on underlying neural mechanisms?', M. Tatsuno and M. Okada, Neurocomputing, Vol. 52 - 54, pp. 649 - 654, 2003. Best regards, Masami TATSUNO ARL Division of Neural Systems, Memory and Aging Life Sciences North Building, Room 384 The University of Arizona Tucson, AZ 85724, USA ----- Abstract ----- A novel analytical method based on information geometry was recently proposed, and this method may provide useful insights into the statistical interactions within neural groups. The link between information-geometric measures and the structure of neural interactions has not yet been elucidated, however, because of the ill-posed nature of the problem. Here, possible neural architectures underlying information-geometric measures are investigated using an isolated pair and an isolated triplet of model neurons.
By assuming the existence of equilibrium states, we derive analytically the relationship between the information-geometric parameters and these simple neural architectures. For symmetric networks, the first- and second-order information-geometric parameters represent, respectively, the external input and the underlying connections between the neurons, provided that the number of neurons used in the parameter estimation in the log-linear model and the number of neurons in the network are the same. For asymmetric networks, however, these parameters depend both on the intrinsic connections and on the external inputs to each neuron. In addition, we derive the relation between the information-geometric parameter corresponding to the two-neuron interaction and a conventional cross-correlation measure. We also show that the information-geometric parameters vary depending on the number of neurons assumed for parameter estimation in the log-linear model. This finding suggests a need to examine the information-geometric method carefully, and a possible criterion for choosing an appropriate orthogonal coordinate is also discussed. This paper points out the importance of a model-based approach, and sheds light on the possible neural structure underlying the application of information geometry to neural network analysis.
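To make the quantities in the abstract concrete, the information-geometric ("theta") coordinates of the log-linear model for an isolated pair of binary neurons can be read off directly from the joint firing probabilities. The sketch below is only an illustration of that parameterization; the function name and the example numbers are hypothetical and are not taken from the preprint.

# Illustrative sketch (not the authors' code): theta coordinates of the
# log-linear model for a pair of binary neurons, x1, x2 in {0, 1}, where
# log p(x1, x2) = th1*x1 + th2*x2 + th12*x1*x2 - psi.
import numpy as np

def theta_coordinates(p):
    """p[x1, x2] = P(neuron 1 = x1, neuron 2 = x2); entries positive and summing to 1."""
    p = np.asarray(p, dtype=float)
    psi = -np.log(p[0, 0])
    th1 = np.log(p[1, 0] / p[0, 0])                         # first-order terms
    th2 = np.log(p[0, 1] / p[0, 0])
    th12 = np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1]))  # pairwise interaction term
    return th1, th2, th12, psi

# Hypothetical joint firing probabilities for one time bin
p = np.array([[0.50, 0.20],
              [0.20, 0.10]])
th1, th2, th12, psi = theta_coordinates(p)
# th12 vanishes exactly when the two neurons fire independently, whereas the
# covariance p[1, 1] - p[1, :].sum() * p[:, 1].sum() is the conventional
# cross-correlation measure that the abstract compares it with.
print(th1, th2, th12)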
From poznan at iub-psych.psych.indiana.edu Thu Oct 9 01:18:09 2003 From: poznan at iub-psych.psych.indiana.edu (Roman Poznanski) Date: Thu, 09 Oct 2003 01:18:09 -0400 Subject: JIN 2(2) CONTENTS Message-ID: <3F84EF91.4010304@iub-psych.psych.indiana.edu> JOURNAL OF INTEGRATIVE NEUROSCIENCE Vol. 2, No. 2, December 2003 CONTENTS ----------- Short communications Savant-like Skills Exposed in Normal People by Suppressing the Left Fronto-temporal Lobe Allan W. Snyder, Elaine Mulcahy, Janet L. Taylor, D. John Mitchell, Perminder Sachdev, and Simon C. Gandevia. Closing an Open-loop Control System: Vestibular Substitution Through the Tongue Mitchell Tyler, Yuri Danilov and Paul Bach-y-Rita. Research Reports Age Related Alterations in the Complexity of Respiratory Patterns Metin Akay, Karen L. Moodie, P. Jack Hoopes, A Basal Ganglia Inspired Model of Action Selection Evaluated in a Robotic Survival Task B. Girard, V. Cuzin, A. Guillot, K.N. Gurney, T.J. Prescott. Applying Database Technology to Clinical and Basic Research Bioinformatics Projects Kelly A. Sullivan Short-term Autonomic Control of Cardiovascular Function: A Mini-review With the Help of Mathematical Models Mauro Ursino Distributed Coding Efficiency in Orthogonal Models of Facial Processing Paul A. Watters -- Roman R. Poznanski, Ph.D., Associate Editor, Journal of Integrative Neuroscience Department of Psychology Indiana University 1101 E. 10th St. Bloomington, IN 47405-7007 email: poznan at iub-psych.psych.indiana.edu phone (Office): (812) 856-0838 http://www.worldscinet.com/jin/mkt/editorial.shtml From malchiodi at dsi.unimi.it Fri Oct 10 11:06:26 2003 From: malchiodi at dsi.unimi.it (Dario Malchiodi) Date: Fri, 10 Oct 2003 17:06:26 +0200 Subject: Book announcement: Algorithmic Inference in Machine Learning Message-ID: <5191CBE0-FB33-11D7-A0ED-0003939B3D3E@dsi.unimi.it> Apologies for cross-posting Dear Colleagues, It is our pleasure to announce the availability of our book Algorithmic Inference in Machine Learning (International Series on Advanced Intelligence, Vol. 5), ISBN 0-9751004-2-4 (see http://laren.dsi.unimi.it/aibook). This book offers a new theoretical framework for modern statistical inference problems, generally referred to as learning problems. They arise in connection with hard operational contexts to be managed in the absence of all necessary knowledge. The success of their solutions lies in a suitable mix of computational skill in processing the available data and sophisticated attitude in stating logical relations between their properties and the expected behavior of candidate solutions. The framework is discussed through rigorous mathematical statements in the province of probability theory, in a highly comprehensive style. Theoretical concepts are introduced using examples from everyday life. The book can be ordered from Advanced Knowledge International Pty Ltd PO Box 228 Magill, Adelaide South Australia SA 5072 Australia Email: info at innoknowledge.com Fax: +61-8-8332-6805 or on-line at amazon.com: http://www.amazon.com/exec/obidos/tg/detail/-/0975100424/qid=1065775131/sr=1-2/ref=sr_1_2/103-1284135-9639054?v=glance&s=books A numerical tool for computing statistics according to our framework is under development. The tool is available through the web site http://laren.dsi.unimi.it/TAP, and we hope for your suggestions and comments. Sincerely, Bruno Apolloni, Dario Malchiodi and Sabrina Gaito From Harel.Shouval at uth.tmc.edu Fri Oct 10 13:06:07 2003 From: Harel.Shouval at uth.tmc.edu (Harel Shouval) Date: Fri, 10 Oct 2003 12:06:07 -0500 Subject: Postdoctoral Positions Message-ID: <3F86E6FF.2020907@uth.tmc.edu> Postdoctoral positions available in Theoretical/Computational Neuroscience. We use analytical and computational techniques for studying the cellular basis of learning, memory and development. Current topics include modeling of 'spike time dependent plasticity', receptive field development with spiking neurons, and studying the molecular basis of stable long-term plasticity. Knowledge of analytical or computational methods is required; knowledge of Neuroscience is preferred. Most of our computational work is carried out using Matlab and C, on Linux platforms. A description of work in my lab, and links to recent papers, can be found at: http://nba.uth.tmc.edu/resources/faculty/members/shouval.htm Please contact: Harel Shouval Department of Neurobiology and Anatomy The University of Texas-Houston Medical Center Harel.shouval at uth.tmc.edu Tel: 713-500-5708 EOE/AA/SSP/smoke free environment From djaeger at emory.edu Thu Oct 9 13:00:34 2003 From: djaeger at emory.edu (Dieter Jaeger) Date: Thu, 9 Oct 2003 19:00:34 +0200 Subject: Postdoctoral Position in Computational Neuroscience Message-ID: <3F6C826D.2F419699@emory.edu> A funded postdoctoral opening in the area of computational neuroscience is available in my laboratory at Emory University, Atlanta. The research project is aimed at elucidating the operation of the deep cerebellar nuclei using whole cell recordings in slices and compartmental modeling. This work will build on our previous publications in this area (Gauck and Jaeger, J. Neurosci. 2000; 2003). The Neuroscience environment at Emory University is excellent, and living in Atlanta features a large international community and plenty of activities. Candidates should have previous experience in intracellular electrophysiology and/or compartmental modeling. Interested candidates should contact djaeger at emory.edu for further details. -Dieter Jaeger Associate Professor Emory University Department of Biology 1510 Clifton Rd.
Atlanta, GA 30322 Tel: 404 727 8139 Fax: 404 727 2880 e-mail: djaeger at emory.edu From mackay at mrao.cam.ac.uk Mon Oct 13 09:08:28 2003 From: mackay at mrao.cam.ac.uk (David J.C. MacKay) Date: Mon, 13 Oct 2003 14:08:28 +0100 Subject: Information Theory, Inference, and Learning Algorithms Message-ID: The following book is now available for purchase in bookstores, price 30 pounds or $50 US. As of Mon 13/10/03, Barnes and Noble are offering it at the special price of $40.00. The book also remains available for free on-screen viewing. It can be downloaded from http://www.inference.phy.cam.ac.uk/mackay/itila/ ======================================================================== "Information Theory, Inference, and Learning Algorithms" by David J.C. MacKay Cambridge University Press http://www.cambridge.org/0521642981 ------------------------------------------------------------------------- `An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.' Bob McEliece, California Institute of Technology An undergraduate / graduate textbook. This book features: * lots of figures and demonstrations. * more than four hundred exercises, many with worked solutions. * up to date exposition of: . source coding - including arithmetic coding. . channel coding - including Gallager codes, turbo codes, and digital fountain codes. . machine learning - including clustering, neural networks, and Gaussian processes. . message-passing algorithms - especially the sum-product algorithm, or loopy belief propagation. . variational methods - including the variational view of the E-M algorithm. . Monte Carlo methods - including Hamiltonian Monte Carlo, Overrelaxation, Slice sampling, and Exact sampling. * Entertaining asides on topics as varied as crosswords, codebreaking, evolution, and sex. The book was published in September 2003. (Hardback, 640 pages) Inspect it for free at http://www.inference.phy.cam.ac.uk/mackay/itila/ --- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - David J.C. MacKay mackay at mrao.cam.ac.uk http://www.inference.phy.cam.ac.uk/mackay/ Cavendish Laboratory, Madingley Road, Cambridge CB3 0HE. U.K. (01223) 339852 | fax: 354599 | home: 740511 international: +44 1223 From planning at icsc.ab.ca Sun Oct 12 17:16:26 2003 From: planning at icsc.ab.ca (Jeanny S. Ryffel) Date: Sun, 12 Oct 2003 15:16:26 -0600 Subject: Biologically Inspired Cognitive Systems / Scotland 2004 Message-ID: <5.1.0.14.2.20031012151451.00a44b30@pop.interbaun.com> The science of neural computation focuses on mathematical aspects for solving complex practical problems. It also seeks to help neurology, brain theory and cognitive psychology in the understanding of the functioning of the nervous system by means of computational models of neurons, neural nets and sub-cellular processes. BICS2004 aims to become a major point of contact for research scientists, engineers and practitioners throughout the world in the fields of cognitive and computational systems inspired by the brain and biology. Participants will share the latest research, developments and ideas in the wide arena of disciplines encompassed under the heading of BICS2004: First International ICSC Symposium on Cognitive Neuro Science (CNS 2004) (from computationally inspired models to brain-inspired computation) Chair: Prof. 
Igor Aleksander, Imperial College London, U.K. Second International ICSC Symposium on Biologically Inspired Systems (BIS 2004) Chair: Prof. Leslie Smith, University of Stirling, U.K. Third International ICSC Symposium on Neural Computation (NC'2004) Chair: Dr. Amir Hussain, University of Stirling, U.K. http://www.icsc-naiso.org/conferences/bics2004/bics-cfp.html From werning at phil-fak.uni-duesseldorf.de Mon Oct 13 07:32:52 2003 From: werning at phil-fak.uni-duesseldorf.de (Markus Werning) Date: Mon, 13 Oct 2003 13:32:52 +0200 Subject: CFP: Compositionality, Concepts and Cognition Message-ID: <008201c3917d$d146e650$e26a6386@philos19> Please distribute - apologies for multiple posting -------------------------------------------------------------------- FIRST CALL FOR PAPERS Compositionality, Concepts and Cognition An Interdisciplinary Conference in Cognitive Science Düsseldorf, Germany February 28 to March 3, 2004 http://www.phil.uni-duesseldorf.de/thphil/compositionality CONFERENCE AIM The conference on compositionality is to take place at Heinrich Heine University Düsseldorf, Germany, from February 28 to March 3, 2004. Compositionality is a key feature of structured representational systems, be they linguistic, mental or neuronal. A system of representations is called compositional just in case the semantic values of complex representations are determined by the semantic values of their parts. The conference brings together internationally renowned scholars from various disciplines of the cognitive sciences, including philosophers, psychologists, linguists, computer scientists and neuroscientists. The speakers will address the issue of compositionality from very different perspectives. It is the aim of the conference to further the exchange of views on compositionality across the disciplines and to explore the implications and conditions of compositionality as a property of representational systems in the study of language, mind and brain. PLENARY SPEAKERS The list of plenary speakers includes Johannes Brandl, Henry Brighton, Daniel Cohnitz, Andreas K. Engel, Lila Gleitman, Terry Horgan, Theo Janssen, Hannes Leitgeb, Sebastian Löbner, Edouard Machery, Alexander Maye, Brian McLaughlin, C. Ulises Moulines, Jeff Pelletier, Martina Penke, Jesse Prinz, Gabriel Sandu, Richard Schantz, Oliver Scholz, Ricarda Schubotz, Gerhard Schurz, Markus Werning, Gert Westermann, Edward Wisniewski, and Dieter Wunderlich. ORGANIZATION The conference is the result of a co-operation between the Institut Jean Nicod, the University Paris-Sorbonne, the Ecole Normale Superieure in France and Heinrich Heine University Düsseldorf in Germany. It is organized by - Markus Werning, Department of Philosophy, Heinrich Heine University and Center for Language, Logic, and Information, Düsseldorf; - Edouard Machery, Department of Philosophy, Sorbonne, Paris, and Max-Planck-Institute for Human Development, Berlin; - Gerhard Schurz, Department of Philosophy, Heinrich Heine University, Düsseldorf.
SCIENTIFIC BOARD - Daniel Andler, Department of Philosophy, Sorbonne, Paris, and Department of Cognitive Studies, ENS-Ulm, Paris; - Peter Carruthers, Department of Philosophy, University of Maryland; - James Hampton, Department of Psychology, City University London; - Douglas Medin, Department of Psychology, Northwestern University, Evanston; - Jesse Prinz, Department of Philosophy, University of North Carolina, Chapel-Hill; - Francois Recanati, Institut Jean-Nicod, Centre National de la Recherche Scientifique, Paris; - Philippe Schlenker, Department of Linguistics, University of California, Los Angeles, and Institut Jean-Nicod, Centre National de la Recherche Scientifique, Paris; - Dag Westerstahl, Department of Philosophy, University of Gothenburg. ABSTRACT SUBMISSION The programme committee invites researchers in the cognitive sciences (philosophy, psychology, neuroscience, computer science, linguistics, etc.) to present their work on compositionality at the conference. The deadline for paper submission is December 10, 2003. Only a limited number of oral and poster presentations can be accepted. Papers should fit into the overall programme of the conference and should be accessible to an interdisciplinary audience. Oral presentations are 20 minutes plus 10 minutes of discussion. To submit a paper, please send in an extended abstract of about 1500 words using the online submission form on the conference homepage: http://www.phil-fak.uni-duesseldorf.de/thphil/compositionality In exceptional cases, hardcopy submission is also possible at: CoCoCo2004 c/o Markus Werning Chair of Theoretical Philosophy Heinrich-Heine-University Düsseldorf Universitätsstr. 1 D-40225 Düsseldorf, Germany All submitted abstracts will be reviewed. The corresponding author will be notified of acceptance by January 15, 2004. Submissions must be received by December 10, 2003. The presenting author(s) must register for the conference after notification of acceptance. SPONSOR The conference is sponsored by the Fritz Thyssen Foundation. CONTACT Please address any questions to cococo2004 at phil.uni-duesseldorf.de From mm at cse.ogi.edu Wed Oct 15 14:28:18 2003 From: mm at cse.ogi.edu (Melanie Mitchell) Date: Wed, 15 Oct 2003 11:28:18 -0700 Subject: faculty positions at OGI Message-ID: <16269.37314.622706.412847@sangre.cse.ogi.edu> Dear Connectionists, I wanted to bring to your attention the following advertisement for faculty recruitment at OGI. Adaptive systems and machine learning is one of our specific target areas for hiring. OGI is putting significant resources into building a strong and diverse Laboratory for Adaptive Systems, and we expect to hire between two and four people in this area within the next five years. Our recent merger with the Oregon Health & Science University has opened up great opportunities for collaboration with researchers in biological and medical sciences, and our close proximity to the OHSU Neurological Sciences Institute has already resulted in collaborations between our respective faculties in the fields of neural modeling and neural computation.
OGI's current adaptive systems faculty includes: Dan Hammerstrom: Biologically inspired computation, VLSI chip design, neural networks Marwan Jabri: Intelligent signal processing, biologically inspired control for robotics Todd Leen: Machine learning, local and mixture models, neurophysiological modeling Melanie Mitchell: Evolutionary computation, cognitive science, complex systems John Moody (ICSI; joint appointment at OGI): Reinforcement learning, neural networks, time series analysis, data mining, computational finance Misha Pavel: Cognitive science, biologically inspired computation, biomedical engineering Xubo Song: Image processing, statistical pattern recognition Eric Wan: Neural networks, adaptive signal processing and control We are looking for outstanding individuals in any area of machine learning or adaptive systems to join our faculty. Please pass this on to anyone you think might be interested. --------------------------------------------------------------------- Faculty Positions at the OGI School of Science and Engineering The Department of Computer Science and Engineering invites applications for faculty positions at all ranks. We invite applications from outstanding candidates in any area of computer science and engineering. The typical teaching load in CSE is 2 graduate-level classes per year. Following an institutional merger in 2001, the Oregon Graduate Institute of Science and Technology (OGI) became one of the four schools of Oregon Health & Science University (OHSU). The merger is enabling the CSE department to expand in core disciplines of computer science and engineering and to establish strong interdisciplinary collaborations among researchers in computational, biological, and biomedical sciences. OGI is located 12 miles west of Portland, Oregon, in the heart of the Silicon Forest. Portland's extensive high-tech community, diverse cultural amenities and spectacular natural surroundings combine to make the quality of life here extraordinary. To learn more about the department, OGI, OHSU and Portland, please visit http://www.cse.ogi.edu. To apply, send a brief description of your research interests, the names of at least three references, and a curriculum vitae with a list of publications to: Chair, Recruiting Committee Department of Computer Science and Engineering OGI School of Science and Engineering at OHSU 20000 NW Walker Road Beaverton, Oregon 97006 Applications sent in before January 15, 2004, will be given preference. All applications will be reviewed. The email address for inquiries is: csedept at cse.ogi.edu. OGI/OHSU is an Equal Opportunity/Affirmative Action employer. We particularly welcome applications from women, minorities, and individuals with disabilities. From klaus at prosun.first.fraunhofer.de Thu Oct 16 10:42:24 2003 From: klaus at prosun.first.fraunhofer.de (Klaus-R. Mueller) Date: Thu, 16 Oct 2003 16:42:24 +0200 (MEST) Subject: NIPS Travel Grant Deadline Message-ID: <200310161442.h9GEgOCx021488@prosun.first.fraunhofer.de> Dear colleagues, please take note of the following announcement by Bartlett Mel (NIPS Treasurer): The deadline to apply for a travel grant for the NIPS 2003 Conference is this Friday, October 17, 2003. Modest travel support will be available, with preference given to students, to those who will be presenting at the meeting, and to those who have not previously received travel awards.
For details, and to fill out the on-line application, click here: http://www.nips.cc/Conferences/2003/TravelSupport.php Best wishes, klaus Muller -- &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& Prof. Dr. Klaus-Robert Müller University of Potsdam and Fraunhofer Institut FIRST Intelligent Data Analysis Group (IDA) Kekulestr. 7, 12489 Berlin e-mail: Klaus-Robert.Mueller at first.fraunhofer.de and klaus at first.fhg.de Tel: +49 30 6392 1860 Tel: +49 30 6392 1800 (secretary) FAX: +49 30 6392 1805 http://www.first.fhg.de/persons/Mueller.Klaus-Robert.html &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&& From nnk at atr.co.jp Sun Oct 19 20:35:42 2003 From: nnk at atr.co.jp (Neural Networks Editorial Office) Date: Mon, 20 Oct 2003 09:35:42 +0900 Subject: Neural Networks 16(9): Special Issue on "Neuroinformatics" Message-ID: Contents - Volume 16, Number 9 - 2003 2003 Special Issue: "Neuroinformatics" edited by Shun-ichi Amari, Michael A. Arbib and Rolf Kotter ------------------------------------------------------------------ Contents: Introduction Language evolution: neural homologies and neuroinformatics. Michael Arbib, Mihail Bota Network participation indices: characterizing component roles for information processing in neural networks. Rolf Kotter, Klaas E. Stephan Towards a formalization of disease-specific ontologies for neuroinformatics. Amarnath Gupta, Bertram Ludascher, Jeffrey S. Grethe, Maryann E. Martone Visiome: neuroinformatics research in vision project. Shiro Usui Neuroanatomical database of normal Japanese brains. Kazunori Sato, Yasuyuki Taki, Hiroshi Fukuda, Ryuta Kawashima Complex independent component analysis of frequency-domain electroencephalographic data. Jorn Anemuller, Terrence J. Sejnowski, Scott Makeig Learning and inference in the brain. Karl Friston Modeling the adaptive visual system: a survey of principled approaches. Lars Schwabe, Klaus Obermayer Self-correction mechanism for path integration in a modular navigation system on the basis of an egocentric spatial map. Regina Mudra, Rodney J. Douglas Kinetic simulation of signal transduction system in hippocampal long-term potentiation with dynamic modeling of protein phosphatase 2A. Shinichi Kikuchi, Kenji Fujimoto, Noriyuki Kitagawa, Taro Fuchikawa, Michiko Abe, Kotaro Oka, Kohtaro Takei, Masaru Tomita ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies.
Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.

----------------------------------------------------------------------------
Membership Type      INNS              ENNS                JNNS
----------------------------------------------------------------------------
membership with      $80 (regular)     SEK 660 (regular)   Y 13,000 (regular)
Neural Networks                                            (plus 2,000 enrollment fee)
                     $20 (student)     SEK 460 (student)   Y 11,000 (student)
                                                           (plus 2,000 enrollment fee)
----------------------------------------------------------------------------
membership without   $30               SEK 200             not available to non-students
Neural Networks                                            (subscribe through another society)
                                                           Y 5,000 (student)
                                                           (plus 2,000 enrollment fee)
----------------------------------------------------------------------------

Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Shozo Yasui Kyushu Institute of Technology Graduate School of Life Science and Engineering 2-4 Hibikino, Wakamatsu-ku Kitakyushu 808-0196 Japan 81 93 695 6108 (phone and fax) jnns at brain.kyutech.ac.jp http://www.jnns.org/ ----------------------------------------------------------------- From mark.laubach at yale.edu Mon Oct 20 10:24:50 2003 From: mark.laubach at yale.edu (Mark Laubach) Date: Mon, 20 Oct 2003 10:24:50 -0400 Subject: POSTDOCTORAL POSITION: REAL-TIME DECODING FOR BRAIN-MACHINE INTERFACES Message-ID: <3F93F032.4000502@yale.edu> POSTDOCTORAL POSITION: REAL-TIME DECODING FOR BRAIN-MACHINE INTERFACES A postdoctoral position is available in January 2004 in the laboratory of Dr Mark Laubach at the John B Pierce Laboratory, a non-profit research institute affiliated with Yale University. The position is part of a collaboration with Drs Jon Kaas and Troy Hackett and their colleagues at Vanderbilt University and is supported by DARPA. This individual will develop novel methods for on-line analyses of neuronal ensemble data (e.g., adaptive methods for spike sorting and decoding analyses of spike and field potential data) using a cluster of Linux workstations. In addition, putative neuronal codes in auditory, prefrontal, and motor areas of the cerebral cortex will be investigated using neuronal ensemble recording and microstimulation methods in collaboration with neurophysiologists in our group.
The position requires expertise in methods for statistical pattern recognition (e.g., random forests, SVMs), functional data analysis, and scientific computing under Linux. Our lab makes heavy use of Matlab, R, and Python, so knowledge of these tools is also necessary. Interested individuals should submit a letter of application, CV, copies of publications, and the names of three references to: Mark Laubach, PhD The John B. Pierce Laboratory Inc. 290 Congress Avenue New Haven, CT 06519 Inquiries: laubach at jbpierce.org The review of applications will begin November 1, 2003, and will continue until the position is filled. E.O.E. From mr287 at georgetown.edu Mon Oct 20 19:57:45 2003 From: mr287 at georgetown.edu (Maximilian Riesenhuber) Date: Mon, 20 Oct 2003 19:57:45 -0400 Subject: postdoctoral position in computational neuroscience/fMRI, Georgetown University Message-ID: <3F947679.4070201@georgetown.edu> Postdoctoral Position in Computational Neuroscience and fMRI Department of Neuroscience Georgetown University A postdoctoral position is available immediately to study the neural mechanisms underlying real-world object recognition in cortex using a combination of modeling, psychophysics, and fMRI. The project is part of an NIH-funded collaboration between the Riesenhuber lab at Georgetown and labs at MIT (Jim DiCarlo, Earl Miller, Tomaso Poggio), Caltech (Christof Koch) and Northwestern (David Ferster). Candidates should have experience in two of the following: computational neuroscience, visual psychophysics, fMRI. Experience with fMRI is a particular asset, as a key component of the project will be the experimental testing of model predictions with fMRI, using Georgetown's new 3T magnet. For more information, see http://riesenhuberlab.neuro.georgetown.edu/Riesenhuber_lab_jobs.html . Interested candidates should send a CV, representative reprints, and the names of three references to Maximilian Riesenhuber (mr287 at georgetown.edu). Review of applications will begin November 1, with a possibility for interviews at this year's SFN meeting in New Orleans. ************************************* Maximilian Riesenhuber Department of Neuroscience Georgetown University Medical Center 3970 Reservoir Rd., NW Research Building Room EP09 Washington, DC 20007 From terry at salk.edu Tue Oct 21 20:13:40 2003 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 21 Oct 2003 17:13:40 -0700 (PDT) Subject: UCSD Computational Neurobiology Training Program Message-ID: <200310220013.h9M0DeN21455@purkinje.salk.edu> DEADLINE: JANUARY 2, 2004 COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM Department of Biology - University of California, San Diego http://www.biology.ucsd.edu/grad/CN_overview.html The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Financial support for students enrolled in this training program is available through an NSF Integrative Graduate Education and Research Training (IGERT) award. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. The three major themes in the training program are: 1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons. Using modern neuroanatomical, behavioral, neuropharmacological and electrophysiological techniques.
Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI). 2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA). 3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs. A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control. Participating Faculty include: * Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Science at UCSD * Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology * Darwin Berg (Neurobiology): Regulation of synaptic components, assembly and localization, function and long-term stability. * Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms * Virginia De Sa (Cognitive Science): Computational basis of perception and learning (both human and machine); multi-sensory integration and contextual influences * Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research * Marla Feller (Neurobiology): Mechanisms and function of spontaneous activity in the developing nervous system, including the retina, spinal cord, hippocampus and neocortex. * Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation * Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels * David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity. * William Kristan (Neurobiology): Computational Neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion.
Director, Neurosciences Graduate Program at UCSD * Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies * Scott Makeig (Institute for Neural Computation): Analysis of cognitive event-related brain dynamics and fMRI using time-frequency and Independent Component Analysis * Javier Movellan (Institute for Neural Computation): Sensory fusion and learning algorithms for continuous stochastic systems * Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects * Terrence Sejnowski (Salk Institute/Neurobiology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms. Director, Institute for Neural Computation * Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques * Nicholas Spitzer (Neurobiology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of Neurobiology * Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons * Jochen Triesch (Cognitive Science): Sensory integration, visual psychophysics, vision systems and robotics, human-robot interaction, cognitive development * Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters * Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior * Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms Requests for application materials should be sent to the University of California, San Diego, Division of Biological Sciences 0348, Graduate Admissions Office, 9500 Gilman Drive, La Jolla, CA, 92093-0348 or to [gradprog at biomail.ucsd.edu]. The deadline for completed application materials, including letters of recommendation, is January 2, 2004. For more information about applying to the UCSD Biology Graduate Program, see http://www.biology.ucsd.edu/grad/admissions/index.html. A preapplication is not required for the Computational Neurobiology Program. From Wulfram.Gerstner at epfl.ch Wed Oct 22 03:43:52 2003 From: Wulfram.Gerstner at epfl.ch (Wulfram Gerstner) Date: Wed, 22 Oct 2003 09:43:52 +0200 Subject: Spike-Timing Dependent Plasticity Workshop Message-ID: <3F963538.82628815@epfl.ch> I would like to announce ********************************************************************** The Monte Verita Workshop on Spike-Timing Dependent Plasticity (STDP) Ascona, Switzerland, February 29 - March 5, 2004 ********************************************************************** http://diwww.epfl.ch/~gerstner/STDP/index.html The symposium will bring together experimentalists and theoreticians working on Hebbian Learning, in particular Spike-Timing Dependent Plasticity (STDP). There will be a single track of talks. Most talks are by invitation, but a few extra time slots for contributed talks are available. There is also the possibility to present a poster.
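For readers less familiar with the term, a standard pair-based STDP rule (an illustrative summary added here, not text from the workshop announcement) changes a synaptic weight according to the time difference \Delta t = t_{\mathrm{post}} - t_{\mathrm{pre}} between post- and presynaptic spikes:

\[
  \Delta w =
  \begin{cases}
    A_{+}\, e^{-\Delta t/\tau_{+}}, & \Delta t > 0 \quad \text{(pre before post: potentiation)} \\
    -A_{-}\, e^{\Delta t/\tau_{-}}, & \Delta t < 0 \quad \text{(post before pre: depression)}
  \end{cases}
\]

with amplitudes A_{+}, A_{-} > 0 and time constants \tau_{+}, \tau_{-} typically on the order of tens of milliseconds; the workshop covers many refinements of, and alternatives to, this simple form.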
Registration is possible before January 15, 2004, but early registration is encouraged since the total number of participants is limited. Invited speakers A. Experiments G.Q. Bi, Pittsburgh Dean Buonomano, UCLA Yang Dan, Berkeley Dominique Debanne, Marseille Nace Golding, Texas Mark Hubener, MPI of Neurobiology, Munich Mayank Mehta, MIT Henry Markram, EPFL, Lausanne M.-M. Poo, Berkeley Miguel Remondes, Caltech Jesper Sjoestroem, Brandeis and London B. Theory Henry D.I. Abarbanel, UCSD Larry Abbott, Brandeis S. Fusi, Bern W. Gerstner, Lausanne R. Guetig, Berlin Leo van Hemmen, TU Munich R. Rao, Univ. of Washington P.D. Roberts, Oregon S. Seung, MIT Harel Shouval, Brown University Haim Sompolinsky, Jerusalem Walter Senn, Bern Misha Tsodyks, Jerusalem For further information, see http://diwww.epfl.ch/~gerstner/STDP/index.html Organizing committee: W. Gerstner, H. Markram, and W. Senn best regards Wulfram -- Wulfram Gerstner Swiss Federal Institute of Technology Lausanne Professor Laboratory of Computational Neuroscience, LCN wulfram.gerstner at epfl.ch Batiment AA-B Tel. +41-21-693 6713 1015 Lausanne EPFL Fax. +41-21-693 9600 http://diwww.epfl.ch/mantra From todorov at Cogsci.ucsd.edu Thu Oct 23 08:45:04 2003 From: todorov at Cogsci.ucsd.edu (Emo Todorov) Date: Thu, 23 Oct 2003 05:45:04 -0700 Subject: SFN computational motor control satellite Message-ID: <003a01c39963$7b86cdf0$8306ef84@todorov> Dear Colleagues, We are pleased to announce the second SFN satellite "Advances in Computational Motor Control". The program is now finalized and attached below. More information about the symposium, including extended abstracts for all contributed presentations, can be found at www.bme.jhu.edu/acmc Yours, Emo Todorov, UCSD Reza Shadmehr, JHU ================================= Advances in Computational Motor Control II Symposium at the Annual Society for Neuroscience Meeting Friday, November 7, 2003, 2:00PM - 9:30PM Room 288-290, Morial Convention Center, New Orleans 1:00-2:00 Registration 2:05-3:35 Session 1 INVITED TALK: Stefan Schaal. USC. "A computational approach to motor control and learning with motor primitives" James Patton, Sandro Mussa-Ivaldi, Y. Wei, M. Phillips, M. Stoykov. Northwestern University. "Exploiting sensorimotor adaptation" Steve Massaquoi. MIT. "Stabilization of cerebrocerebellar feedback control without internal dynamic models" Opher Donchin and Reza Shadmehr. Johns Hopkins University. "Can training change the desired trajectory?" 3:50-5:45 Session 2 Ken Ohta, Rafael Laboissiere, Mikhail Svinin. Max Planck Institute for Psychological Research and RIKEN. "Optimal trajectory of human arm reaching movements in dynamical environments" Emmanuel Guigon, Pierre Baraduc, Michel Desmurget. INSERM France. "Constant effort computation as a determinant of motor behavior" Emanuel Todorov. University of California San Diego. "Stochastic optimal feedback control of nonlinear biomechanical systems" Zhaoping Li, Alex Lewis, Silvia Scarpetta. University College London and University of Salerno, Italy. "Computational understanding of the neural circuit for the central pattern generator for locomotion and its control in lamprey" Madhusudhan Venkadesan, Francisco Valero-Cuevas, John Guckenheimer. Cornell University. "The boundary of instability as a powerful experimental paradigm for understanding complex dynamical sensorimotor behavior: dexterous manipulation as an example" 7:30-9:30 Session 3 INVITED TALK: Richard Andersen. CalTech. "Coordinate transformations for sensory guided movements" A.
Roitman, S. Pasalar, and Tim Ebner. University of Minnesota. "Models of Purkinje cell discharge during circular manual tracking in monkey" Edward Boyden, Richard Tsien, Talal Chatila, Jennifer Raymond. Stanford University. "Is oppositely directed motor learning implemented with inverse plasticity mechanisms?" Paul Cisek. University of Montreal. "A computational model of reach decisions in the primate cerebral cortex" Dana Cohen and Miguel Nicolelis. Duke University. "Uncertainty reduction at the neuronal ensemble but not in single neurons during motor skill learning" From gpatane at ai.unime.it Fri Oct 24 04:49:24 2003 From: gpatane at ai.unime.it (Giuseppe Patane') Date: Fri, 24 Oct 2003 10:49:24 +0200 Subject: Clustering/VQ applet available (ELBG algorithm) Message-ID: <3F98E794.40701@ai.unime.it> Dear Connectionists, I am pleased to invite you to take a look at the Java applet demonstrating how "The Enhanced LBG Algorithm" (ELBG), developed by Prof. Marco Russo and me, works. The paper describing ELBG appeared in "Neural Networks", vol. 14, no. 9, pp. 1219-1237, November 2001. You can find it, together with the related papers, at: http://ai.unime.it/~gp/english/clustering_applet/elbg_demo_applet.htm On the same site you can find other papers of mine about clustering and vector quantization. Please note that the applet was developed using Java Swing components and it may not be supported by your browser, as the following quotation from the official Java web site explains: "You can run Swing applets in any browser that has the appropriate version of Java Plug-in installed. Another option is to use a 1.2-compliant browser. Currently, the only 1.2-compliant browser available is the Applet Viewer utility provided with the Java 2 SDK." Best Regards -- Giuseppe Patane', Ph.D ------------------------------------------ Department of Physics University of Messina Contrada Papardo - Salita Sperone, 31 98166 S. Agata, Messina - ITALY and INFN Section of Catania 64, Via S. Sofia, I-95123 Catania - Italy e-mail: gpatane at ai.unime.it Fax: +39 (0)6 233245336 Home page: http://ai.unime.it/~gp/ From osporns at indiana.edu Tue Oct 28 09:25:57 2003 From: osporns at indiana.edu (Olaf Sporns) Date: Tue, 28 Oct 2003 09:25:57 -0500 Subject: Positions in robotics/cognitive science Message-ID: <3F9E7C75.1000509@indiana.edu> Indiana University, Bloomington, IN. Faculty position beginning August 2004. As part of a series of new appointments, the Cognitive Science Program at Indiana University Bloomington seeks applicants with a strong record of research and teaching using the ideas and techniques of biomorphic robotics, evolutionary robotics and artificial life. The successful applicant will take a leadership role in the planning and execution of a new, state-of-the-art laboratory for teaching and research. Applicants should send full dossiers, including letters of recommendation or names and addresses of referees. Indiana University is an equal opportunity/affirmative action employer. Applications from women and minority group members are especially encouraged. Please send materials to: Professor Andy Clark Search Committee Indiana University Cognitive Science Program 1033 E. Third St., Sycamore 0014 Bloomington, IN 47405. Applications received by January 9, 2004 are assured full consideration.
-- Olaf Sporns, PhD Department of Psychology Neural Science Program Cognitive Science Program Indiana University Bloomington, IN 47405 http://php.indiana.edu/~osporns From tgd at cs.orst.edu Tue Oct 28 19:42:23 2003 From: tgd at cs.orst.edu (Thomas G. Dietterich) Date: Tue, 28 Oct 2003 16:42:23 -0800 Subject: Faculty Positions in Machine Learning and Artificial Intelligence Message-ID: <9018-Tue28Oct2003164223-0800-tgd@cs.orst.edu> The School of Electrical Engineering and Computer Science at Oregon State University is recruiting faculty in machine learning and related areas including vision, speech, natural language processing, robotics, and so on. Our school already includes several faculty members in these areas: * Bruce D'Ambrosio (Graphical Models) * Tom Dietterich (Machine Learning; Reinforcement Learning; Pattern Recognition) * Jon Herlocker (Intelligent Information Access; Collaborative Filtering) * Luca Lucchese (Camera Calibration) * Larry Marple (Signal Interpretation) * Eric Mortensen (Computer Vision, Image Processing) * Prasad Tadepalli (Reinforcement Learning) Full details are available at http://eecs.oregonstate.edu/faculty/03-04cs.html --Tom -- Thomas G. Dietterich, Professor Voice: 541-737-5559 School of Electrical Engineering FAX: 541-737-3014 and Computer Science URL: http://www.cs.orst.edu/~tgd Dearborn Hall 102, Oregon State University, Corvallis, OR 97331-3102 From geos at etf.ukim.edu.mk Tue Oct 28 15:49:17 2003 From: geos at etf.ukim.edu.mk (Georgi Stojanov) Date: Tue, 28 Oct 2003 21:49:17 +0100 Subject: CFP Epigenetic Robotics 2003 Message-ID: <001d01c39d94$fadc0630$cb9095c2@toshibauser> EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004 EPIROB2004 EPIROB2004 CALL FOR PAPERS EPIROB2004 EPIROB2004 Fourth International Workshop on Epigenetic Robotics: EPIROB2004 Modeling Cognitive Development in Robotic Systems EPIROB2004 EPIROB2004 http://www.epigenetic-robotics.org EPIROB2004 EPIROB2004 EPIROB2004 August 25-27, 2004 EPIROB2004 Location: LIRA-Lab, University of Genoa EPIROB2004 Genoa, Italy EPIROB2004 EPIROB2004 EPIROB2004 Submission Deadline: March 1st, 2004 EPIROB2004 EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004 This workshop focuses on combining developmental psychology, neuroscience, biology, and robotics with the goal of understanding the functioning of biological systems. Epigenetic systems, either natural or artificial, share a prolonged developmental process through which varied and complex cognitive and perceptual structures emerge as a result of the interaction of an embodied system with a physical and social environment. Epigenetic robotics includes the two-fold goal of understanding biological systems by the interdisciplinary integration between neural and engineering sciences and, simultaneously, that of enabling robots and artificial systems to develop skills for any particular environment instead of programming them for specific environments. To this aim, psychological theory and empirical evidence should be used to inform epigenetic robotic models, and these models should be used as theoretical tools to make experimental predictions in developmental psychology. We encourage submissions from different disciplines such as robotics, artificial intelligence, developmental psychology, biology or neurophysiology, as well as interdisciplinary work bridging the gap between science and engineering.
Subject Areas include, but are not limited to: * The role of motivations, emotions, and value systems in development; * The development of: concepts, consciousness and self-awareness, emotion, imitation, intentionality, intersubjectivity, joint attention, learning, motivation, non-verbal and verbal communication, self, sensorimotor schemata, shared meaning and symbolic reference, social learning, social relationships, social understanding ("mind reading", "theory of mind"), value systems; * Interaction between innate structure, ongoing developing structure, and experience; * Related issues in algorithms, robotics, simulated robots, and embodied systems; * Strong AI (true intelligence and autonomy) versus weak AI; * Related issues from human and nonhuman empirical studies. For summaries of the papers from the latest workshops see: Zlatev and Balkenius (2001), Prince (2002), and Berthouze and Prince (2003). Please send any questions to the workshop co-chairs: Giorgio Metta (pasa at dist.unige.it) and Luc Berthouze (Luc.Berthouze at aist.go.jp). Sponsors LIRA-Lab, University of Genoa, Italy Communications Research Laboratory, Japan Location University of Genoa, Italy Invited Speakers Luciano Fadiga, Dept. of Biomedical Sciences, University of Ferrara, Italy Claes von Hofsten, Dept. of Psychology, University of Uppsala, Sweden Jürgen Konczak, Human Sensorimotor Control Lab, University of Minnesota, USA Jacqueline Nadel, CNRS, University Pierre & Marie Curie, Paris, France Submissions Papers not exceeding eight (8) pages should be submitted electronically (PDF or Postscript) as attachment files to Luc Berthouze (Luc.Berthouze at aist.go.jp). Extended abstracts (maximum two pages) can also be submitted, and will be presented as posters (extended abstracts should also be submitted in PDF or Postscript as attachments to Luc Berthouze (Luc.Berthouze at aist.go.jp)).
Further instructions to authors will be posted on the workshop web page: http://www.epigenetic-robotics.org Important Dates March 1st, 2004: Deadline for submission of papers and posters April 21st, 2004: Notification of acceptance for papers and posters May 21st, 2004: Deadline for camera-ready papers & posters August 25-27, 2004: Workshop Organizing Committee Christian Balkenius (Cognitive Science, Lund University, Sweden) Luc Berthouze (Neuroscience Research Institute, AIST, Japan) Hideki Kozima (Communications Research Laboratory, Japan) Giorgio Metta (LIRA-Lab, University of Genoa, Italy) Giulio Sandini (LIRA-Lab, University of Genoa, Italy) Georgi Stojanov (Computer Science Institute, SS Cyril and Methodius University, Macedonia) Program Committee Christian Balkenius (Cognitive Science, Lund University, Sweden) Luc Berthouze (Neuroscience Research Institute, AIST, Japan) Aude Billard (Autonomous Systems Laboratory, EPFL, Switzerland) Daniel Bullock (Cognitive & Neural Systems Department, Boston University, USA) Kerstin Dautenhahn (Adaptive Systems Research Group, University of Hertfordshire, UK) Yiannis Demiris (Intelligent and Interactive Systems, Imperial College, UK) Luciano Fadiga (University of Ferrara, Italy) Peter Gärdenfors (Cognitive Science, Lund University, Sweden) Philippe Gaussier (Universite de Cergy-Pontoise & ENSEA, France) Gyorgy Gergely (Institute for Psychological Research, Hungarian Academy of Sciences, Hungary) Frédéric Kaplan (Sony Computer Science Lab Paris, France) Hideki Kozima (Communications Research Laboratory, Japan) Valerie Kuhlmeier (Yale University, Department of Psychology, USA) Max Lungarella (Neuroscience Research Institute, AIST, Japan) Yuval Marom (Division of Informatics, University of Edinburgh, UK) Giorgio Metta (LIRA-Lab, Genoa, Italy) Jacqueline Nadel (CNRS, France) Chrystopher Nehaniv (Adaptive Systems Research Group, University of Hertfordshire, UK) Rolf Pfeifer (AI Lab, University of Zurich, Switzerland) Christopher G. Prince (Computer Science, University of Minnesota Duluth, USA) Deb Roy (Media Laboratory, MIT, USA) Giulio Sandini (LIRA-Lab, Genoa, Italy) Brian Scassellati (Department of Computer Science, Yale University, USA) Stefan Schaal (Computer Science Department, USC, USA) Matthew Schlesinger (Psychology Department, Southern Illinois University, USA) Sylvain Sirois (Department of Psychology, Manchester University, UK) Georgi Stojanov (Computer Science Institute, SS Cyril and Methodius University, Macedonia) Gert Westermann (Department of Psychology, Oxford Brookes University, UK) Tom Ziemke (Department of Computer Science, University of Skovde, Sweden) Publication of Papers & Poster Abstracts Papers and poster abstracts will be published in the proceedings, and archived at CogPrints (http://cogprints.ecs.soton.ac.uk). REFERENCES Zlatev, J. & Balkenius, C. (2001). Introduction: Why "epigenetic robotics"? Proceedings of the First International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems (pp. 1-4). Lund University Cognitive Studies, Volume 85. Available at: http://www.lucs.lu.se/Epigenetic-robotics/Papers/Zlatev.Balkenius.2001.pdf Prince, C. G. (2002). Introduction: The Second International Workshop on Epigenetic Robotics. In C. G. Prince, Y. Demiris, Y. Marom, H. Kozima, & C. Balkenius (Eds.) Proceedings of the Second International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems. Lund, Sweden: Lund University Cognitive Studies Volume 94.
Available at: http://www.cprince.com/PubRes/EpigeneticRobotics2002/Prince-Intro.pdf Weng, J., McClelland, J., Pentland, A., Sporns, O., Stockman, I., Sur, M., & Thelen, E. (2001). Autonomous mental development by robots and animals. Science, 291, 599-600. Available at: http://www.cse.msu.edu/dl/SciencePaper.pdf Berthouze, L. and Prince, C. G. (2003). Introduction: The Third International Workshop on Epigenetic Robotics. In C. G. Prince, L. Berthouze, H. Kozima, D. Bullock, G. Stojanov, & C. Balkenius (Eds.) Proceedings of the Third International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems. Lund, Sweden: Lund University Cognitive Studies Volume 101. Available at: http://www.d.umn.edu/~cprince/epigenetic-robotics/2003/intro.pdf From verleysen at dice.ucl.ac.be Wed Oct 29 11:33:03 2003 From: verleysen at dice.ucl.ac.be (Michel Verleysen) Date: Wed, 29 Oct 2003 17:33:03 +0100 Subject: special sessions at ESANN'2004 Message-ID: <007c01c39e3a$5342e2d0$43ed6882@dice.ucl.ac.be> ---------------------------------------------------- | | | ESANN'2004 | | | | 12th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 28-29-30, 2004 | | | | Special sessions | ---------------------------------------------------- The following message contains a summary of all special sessions that will be organized during the ESANN'2004 conference. Authors are invited to submit their contributions to one of these sessions or to a regular session, according to the guidelines found on the web pages of the conference (http://www.dice.ucl.ac.be/esann/). List of special sessions that will be organized during the ESANN'2004 conference ===================================================================== 1. Neural methods for non-standard data B. Hammer, Univ. Osnabrück, B.J. Jain, Tech. Univ. Berlin (Germany) 2. Soft-computing techniques for time series forecasting I. Rojas, H. Pomares, Univ. Granada (Spain) 3. Neural networks for data mining R. Andonie, Central Washington Univ. (USA) 4. Theory and applications of neural maps U. Seiffert, IPK Gatersleben, T. Villmann, Univ. Leipzig, A. Wismüller, Univ. Munich (Germany) 5. Industrial applications of neural networks L.M. Reyneri, Politecnico di Torino (Italy) 6. Hardware systems for Neural devices P. Fleury, A. Bofill-i-Petit, Univ. Edinburgh (Scotland, UK) Short descriptions ================== Neural methods for non-standard data ------------------------------------ Organized by : - B. Hammer, Univ. Osnabrück (Germany) - B.J. Jain, Tech. Univ. Berlin (Germany) In modern neural network research it is common practice to represent data as feature vectors in a Euclidean vector space. This kind of representation is convenient; due to possibly high dimensions or potential loss of structural information, however, it is limited for many relevant application areas including bioinformatics, chemistry, natural language processing, network analysis, or text mining. Alternative powerful and expressive representations for complex data structures include, for example, graphs, trees, strings, sequences, or functions. Recent neural models which directly deal with complex data structures include recursive models, kernels for structures, or functional networks, to name just a few approaches. The session will focus on neural techniques for processing of non-vectorial data.
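As one deliberately simple illustration of a similarity measure defined directly on non-vectorial data of the kind this session targets (an editorial sketch; the choice of a k-mer "spectrum" kernel and all function names are ours, not material from the call):

from collections import Counter

def spectrum_kernel(s: str, t: str, k: int = 3) -> int:
    """Count shared k-mers between two strings.

    Illustrative toy example of a kernel on string (non-vectorial) data.
    """
    def kmers(x: str) -> Counter:
        # All contiguous substrings of length k, with multiplicities.
        return Counter(x[i:i + k] for i in range(len(x) - k + 1))

    a, b = kmers(s), kmers(t)
    # Inner product in the implicit feature space indexed by k-mers.
    return sum(count * b[kmer] for kmer, count in a.items())

if __name__ == "__main__":
    print(spectrum_kernel("GATTACA", "ATTACCA", k=2))

Counting shared substrings amounts to an inner product in an implicit feature space indexed by k-mers, which is the sense in which such similarity measures can be plugged into kernel-based learners without ever building an explicit feature vector.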
Authors are invited to submit contributions related to the following list of topics: - supervised and unsupervised models for complex data structures, - coupling of symbolic and sub-symbolic systems, - similarity measures and kernel models for non-vectorial data, - specific preprocessing methods for complex data structures, - incorporation of prior knowledge and invariances, - theoretical results within this topic, - applications e.g. in bioinformatics, chemistry, language processing, - time series processing, graph processing. Soft-computing techniques for time series forecasting ----------------------------------------------------- Organized by : - I. Rojas, Univ. Granada (Spain) - H. Pomares, Univ. Granada (Spain) It is obvious that forecasting activities play an important role in our daily life. A time series is a sequence of quantities measured from some physical system at regular intervals of time. Time series analysis includes three important specific problems: prediction, modelling, and characterization. The goal of prediction is to accurately forecast the short-term evolution of the system, the aim of modelling is to precisely capture the features of the long-term behaviour of the system, and the purpose of system characterization is to determine some underlying fundamental properties of the system. Papers concerning these goals, using traditional statistical models (ARMA), neural networks, soft-computing techniques, fuzzy systems, evolutionary algorithms, etc., are welcome. Neural networks for data mining ------------------------------- Organized by : - R. Andonie, Central Washington Univ. (USA) Data mining is an attractive application area for neural networks. This session will focus on the specificity and limits of neural computation for data mining. The following questions will be discussed: 1. What makes the difference between data for data mining applications and data for other NN applications: huge databases, mixed data types, uncertain data, redundant and conflicting data, etc. 2. Data mining applications may be related to Internet applications with on-line processing capability. Only a few NN models can handle such requirements. How useful are NNs in this case? 3. Data mining includes not only knowledge acquisition (rule extraction) but also decision making. This can be done by using NN models. How specific is this task, considering points 1-2? 4. Computational intelligence applications in E-commerce, customer profiling, marketing segmentation, etc. Authors are invited to submit contributions related to neural and neuro-fuzzy techniques used in data mining. Papers discussing why/when/how neural models are appropriate for data mining applications are especially welcome. Theory and applications of neural maps -------------------------------------- Organized by : - U. Seiffert, IPK Gatersleben (Germany) - T. Villmann, Univ. Leipzig (Germany) - A. Wismüller, Univ. Munich (Germany) Neural maps in real biological systems can be seen as information processing systems which map complex information onto a roughly two-dimensional structure such that the statistical relations within the data are transformed into geometrical relations - called topographic mapping. Models which describe these brain properties are called neural maps. In a technical context, these models are utilized as topographic vector quantizers. Famous examples are the Self-Organizing Map (SOM), the Elastic Net (EN), etc.
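To make the notion of a topographic vector quantizer concrete, here is a minimal SOM training loop (an editorial illustration under simple assumptions; the function name train_som, its parameters, and the decay schedules are ours, not the organizers'):

import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map; `data` has shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    # Map-grid coordinates, used for neighbourhood distances on the 2-D sheet.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = step / n_steps
            lr = lr0 * np.exp(-3.0 * frac)        # decaying learning rate
            sigma = sigma0 * np.exp(-3.0 * frac)  # shrinking neighbourhood radius
            # Best-matching unit: node whose weight vector is closest to the input.
            bmu = np.unravel_index(
                np.argmin(np.linalg.norm(weights - x, axis=-1)), (h, w))
            # Gaussian neighbourhood on the grid implements the topographic coupling.
            grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            nbh = np.exp(-grid_dist2 / (2.0 * sigma**2))
            weights += lr * nbh[..., None] * (x - weights)
            step += 1
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    print(train_som(rng.uniform(size=(500, 2))).shape)  # (10, 10, 2)

Each input is assigned to its best-matching unit, and that unit together with its grid neighbours is pulled towards the input, so nearby map nodes come to represent nearby regions of the data space - the topographic property described above.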
However, other vector quantizers that were not originally biologically motivated can also be taken as topographic vector quantizers. Examples are the Neural Gas (NG), Soft Topographic Vector Quantization (STVQ) and others. Topographic vector quantizers have found a large range of applications in data mining, visualization, data processing, control and so on. In parallel, a growing number of extensions of existing algorithms as well as new approaches have been developed in recent years. In the proposed session we want to focus on new developments of topographic vector quantization and neural maps. We thereby emphasize the theoretical background as well as interesting applications with key ideas for optimal use of the properties of neural maps. We invite researchers to provide new ideas in these topics. Possible contributions can be in any area matching this framework, with the following (but not restricted to) topics: - theory of topographic vector quantization - estimation of probability density - image processing - time series prediction - classification tasks - pattern classification, clustering, fuzzy clustering - blind source separation and decorrelation - dimension and noise reduction - evaluation of non-metric data (categorical/ordinal) - data mining Industrial applications of neural networks ------------------------------------------ Organized by : - L.M. Reyneri, Politecnico di Torino (Italy) Hardware systems for Neural devices ----------------------------------- Organized by : - P. Fleury, Univ. Edinburgh (Scotland, UK) - A. Bofill-i-Petit, Univ. Edinburgh (Scotland, UK) This special session aims to present new developments in neural hardware engineering to the neural networks community. The emphasis will be placed on the computational properties of the hardware systems or their use in specific applications, rather than on intricate details of circuit implementations. Some suggested areas of interest include (but are not restricted to) neuromorphic VLSI, interfaces between biological neurons and hardware, implementation of novel ANN algorithms and stochastic computing with nanotechnologies. ======================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat d-side conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ======================================================== From jyoshimi at ucsd.edu Wed Oct 29 15:38:04 2003 From: jyoshimi at ucsd.edu (Jeff Yoshimi) Date: Wed, 29 Oct 2003 12:38:04 -0800 Subject: Simbrain 1.0.4 Message-ID: Connectionists: Simbrain 1.0.4 has been released. See: http://simbrain.sourceforge.net/ This is a maintenance release which fixes several bugs, including a major issue with update order. Other fixes and new features include: improved auto-zoom, improved threading, some control over input and output nodes, improved keyboard layout, new networks, and a new random-weight option. Learning rules are still local. Supervised learning should be incorporated by January, via collaboration with Snarli ( http://snarli.sourceforge.net/ ).
A rewritten gauge component, including new projection algorithms (PCA and isomap in addition to MDS), is also forthcoming, sometime in December, I hope. Feedback welcome. I'm happy to add new learning and activation rules at user request. Best, Jeff From cindy at bu.edu Fri Oct 31 10:24:57 2003 From: cindy at bu.edu (Cynthia Bradford) Date: Fri, 31 Oct 2003 10:24:57 -0500 Subject: 8th ICCNS: Call for Abstracts and Confirmed Invited Speakers Message-ID: <006b01c39fc3$249b9140$903dc580@cnspc31> Apologies if you receive more than one copy of this announcement. ***** Call for Abstracts and Confirmed Invited Speakers ***** EIGHTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 19 - 22, 2004 Boston University 677 Beacon Street Boston, Massachusetts 02215 USA http://www.cns.bu.edu/meetings/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from the Office of Naval Research This interdisciplinary conference is attended each year by participants from approximately 30 countries around the world. As in previous years, the conference will focus on solutions to the questions: HOW DOES THE BRAIN CONTROL BEHAVIOR? HOW CAN TECHNOLOGY EMULATE BIOLOGICAL INTELLIGENCE? The conference is aimed at researchers and students of computational neuroscience, cognitive science, neural networks, neuromorphic engineering, and artificial intelligence. The conference includes tutorial and invited lectures, and contributed lectures and posters, by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. Single-track oral and poster sessions enable all presented work to be highly visible. Three-hour poster sessions with no conflicting events will be held on two of the conference days. Posters will be up all day, and can also be viewed during breaks in the talk schedule. TUTORIAL LECTURE SERIES Stephen Grossberg (Boston University): "Linking brain to mind." See below for details. CONFIRMED INVITED AND PLENARY SPEAKERS Ehud Ahissar (Weizmann Institute of Science): "Encoding and decoding of vibrissal active touch" John Anderson (Carnegie Mellon University): "Using fMRI to track the components of a cognitive architecture" Alan D. Baddeley (University of Bristol): "In search of the episodic buffer" Moshe Bar (Massachusetts General Hospital): "Top-down facilitation of visual object recognition" Gail A. Carpenter (Boston University): "Information fusion and hierarchical knowledge discovery by ARTMAP neural networks" Stephen Goldinger (Arizona State University): "Generalization gradients in perceptual memory" Daniel Kersten (University of Minnesota): "How does human vision resolve ambiguity about objects?" Stephen M. Kosslyn (Harvard University): "The imagery debate 30 years later: Can neuroscience help resolve the issue?" Tai-Sing Lee (Carnegie Mellon University): "Inference and prediction in the visual cortex" Eve Marder (Brandeis University): "Plasticity and stability in rhythmic neuronal networks" Bartlett W. Mel (University of Southern California): "The pyramidal neuron: What sort of computing device?" Miguel Nicolelis (Duke University): "Real-time computing with neural ensembles" Jeffrey D. Schall (Vanderbilt University): "Neural selection and control of visual guided eye movements" Chantal Stern (Boston University): "Sequence? What sequence?
fMRI studies of the medial temporal lobe in sequence learning" Mriganka Sur (Massachusetts Institute of Technology): "Plasticity and dynamics of visual cortex networks" Joseph Z. Tsien (Princeton University): "Temporal analysis of memory process" William H. Warren Jr. (Brown University): "Behavioral dynamics of locomotor path formation" Jeremy Wolfe (Harvard Medical School): "Has "preattentive vision" reached the end of the road?" LINKING BRAIN TO MIND: A Tutorial Lecture Series by Stephen Grossberg steve at bu.edu http://www.cns.bu.edu/Profiles/Grossberg In 1983, Stephen Grossberg gave a week-long series of tutorial lectures at an NSF-sponsored conference at Arizona State University. The lectures included a self-contained introduction to principles, mechanisms, and architectures whereby neural models link mind to brain and inspire neuromorphic applications to technology. Many leaders of the Connectionist Revolution which gained momentum during the mid-1980s attended the conference. In 1990-1992, three additional tutorial lecture series were given at the Wang Institute of Boston University. Since 1992, major breakthroughs have occurred in the theoretical understanding of how a brain gives rise to a mind. Models have begun to quantitatively explain and predict the neurophysiologically recorded dynamics of identified nerve cells, in anatomically verified circuits and systems, and the behaviors that they control. Because these results clarify how an intelligent system can autonomously adapt to a changing world, they have also been used to develop biologically-inspired solutions to technological problems. Several research groups have asked Professor Grossberg to give another lecture series to chart recent progress. Each morning session of the May 2004 conference will include one such tutorial lecture. The lectures will introduce concepts, principles, and mechanisms of mind/brain modeling and summaries of recent models about how brain development, learning, and information processing control perception, cognition, emotion, and action during both normal and abnormal behaviors. Brain-inspired algorithms for solving difficult technological problems will also be described. CALL FOR ABSTRACTS Session Topics: * vision * image understanding * audition * speech and language * unsupervised learning * supervised learning * reinforcement and emotion * sensory-motor control * cognition, planning, and attention * spatial mapping and navigation * object recognition * neural circuit models * neural system models * mathematics of neural systems * robotics * hybrid systems (fuzzy, evolutionary, digital) * neuromorphic VLSI * industrial applications * other Contributed abstracts must be received, in English, by January 30, 2004. Notification of acceptance will be provided by email by February 27, 2004. A meeting registration fee must accompany each Abstract. See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted Abstracts will be returned on request only until April 16, 2004. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides in a single-spaced, single-column format with a font of 10 points or larger, printed on one side of the page only. Fax or electronic submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. 
An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; requested preference for oral or poster presentation; and a first and second choice from the topics above, including whether it is biological (B) or technological (T) work [Example: first choice: vision (T); second choice: neural system models (B)]. Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, VCR, and LCD projector facilities will be available for talks. Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. Accepted Abstracts will be printed in the conference proceedings volume. A longer paper will not be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, Massachusetts 02215 USA. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to +1 617 353 7755, or email to cindy at bu.edu. The registration fee will help to pay for a conference reception, 3 daily coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to help cover meeting travel and living costs. The deadline to apply for fellowship support is January 30, 2004. Applicants will be notified by email by February 27, 2004. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery must accompany the application and summarize how the candidate may benefit from the meeting. Fellowship applicants who also submit an Abstract need to include the registration fee payment with their Abstract submission. Fellowship checks will be distributed after the meeting. REGISTRATION FORM Eighth International Conference on Cognitive and Neural Systems Boston University Department of Cognitive and Neural Systems 677 Beacon Street Boston, Massachusetts 02215 USA May 19-22, 2004 Fax: +1 617 353 7755 http://www.cns.bu.edu/meetings/ Mr/Ms/Dr/Prof:_____________________________________________________ Affiliation:_________________________________________________________ Address:__________________________________________________________ City, State, Postal Code:______________________________________________ Phone and Fax:_____________________________________________________ Email:____________________________________________________________ The registration fee includes the conference proceedings, a reception, and 3 coffee breaks each day. CHECK ONE: ( ) $95 Conference (Regular) ( ) $65 Conference (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University" Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges.
[ ] I wish to pay by credit card (MasterCard, Visa, or Discover Card only) Name as it appears on the card:___________________________________________ Type of card: _____________________________ Expiration date:________________ Account number:_______________________________________________________ Signature:____________________________________________________________
From chaudhri at AI.SRI.COM Wed Oct 1 20:26:43 2003 From: chaudhri at AI.SRI.COM (Vinay K. Chaudhri) Date: Wed, 01 Oct 2003 17:26:43 -0700 Subject: Research Position Message-ID: <3F7B70C3.2030709@ai.sri.com> Dear Colleagues: The Artificial Intelligence Center at SRI International is looking for a highly creative Computer Scientist to join a team of researchers building evaluation driven, but knowledge based systems. We are seeking an individual with a thorough understanding of knowledge representation with an emphasis on the representation of uncertain knowledge and reasoning under uncertainty. The person should have a knack for combining theoretical background with pragmatic approaches necessary for engineering real systems. For a complete job posting, please see: http://sri.hrdpt.com/cgi-bin/c/highlightjob.cgi?jobID=1314 From juergen at idsia.ch Thu Oct 2 08:58:17 2003 From: juergen at idsia.ch (Juergen Schmidhuber) Date: Thu, 02 Oct 2003 14:58:17 +0200 Subject: Job at IDSIA, Switzerland / nips 2003 RNNaissance workshop Message-ID: <3F7C20E9.4060300@idsia.ch> We are seeking an outstanding postdoc or visitor for 8 months or so in 2004, with possibility of prolongation for another year or more. He/she should be interested in reinforcement learning, adaptive robotics, recurrent neural networks, optimal universal search algorithms, program evolution, cognitive science: http://www.idsia.ch/~juergen/jobcsem2004.html Juergen Schmidhuber, IDSIA ps: the NIPS 2003 RNNaissance workshop on recurrent neural networks is now open for submissions: http://www.idsia.ch/~juergen/rnnaissance.html From cns at cnsorg.org Thu Oct 2 12:47:12 2003 From: cns at cnsorg.org (CNS - Organization for Computational Neurosciences) Date: Thu, 2 Oct 2003 09:47:12 -0700 Subject: CNS*04 Message-ID: <1065113232.3f7c569021007@webmail.mydomain.com> CNS*04 The annual Computational Neuroscience Meeting will be held in Baltimore, MD, from July 18th - 20th, 2004. The main meeting will be followed by two days of workshops on July 21st and 22nd. In conjunction, the "2004 Annual Symposium, University of Maryland Program in Neuroscience: Computation in the Olfactory System" will be held as a satellite symposium to CNS*04 on Saturday, July 17th. The call for papers for CNS*04 can be expected in December 2003. For CNS*04, two types of submissions will be considered. Full papers, which can be included in the proceedings, will appear in the journal Neurocomputing and will be fully peer reviewed; extended abstracts will not be included in the proceedings but will be peer reviewed for inclusion in the meeting program. The submission deadline for full papers (in final format) and extended abstracts will be January 19, 2004. CNS is organized by the Organization for Computational Neurosciences.
Further announcements will be posted on the organization's website, www.cnsorg.org. Christiane Linster, President Erik De Schutter, Program Chair Asaf Keller, Local Organizer ** CNS - Organization for Computational Neurosciences ** From ken at phy.ucsf.edu Thu Oct 2 20:44:42 2003 From: ken at phy.ucsf.edu (Ken Miller) Date: Thu, 2 Oct 2003 17:44:42 -0700 Subject: New Q-Bio Archives Message-ID: <16252.50810.380146.842433@coltrane.ucsf.edu> Hi, I received the following announcement which seems very relevant to these lists, so I am passing it on. I am not formally involved in the Q-Bio archive, just an interested potential user. List moderator: It's probably best if you just post the announcement below without this header from me, since I have nothing to do with it, but it's up to you. Ken Miller Kenneth D. Miller telephone: (415) 476-8217 Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 ---------------------------------- ---------------------------------- Announcement of new Quantitative Biology (q-bio) archive 15 Sept 2003 Dear Colleagues, In recent years, an increasing number of researchers from mathematics, computer science, and the physical sciences have been joining biologists in the ongoing revolution in biology. In a variety of ways, these researchers are contributing towards making biology a quantitative science. With this letter, we announce the formation of the q-bio archive (http://arXiv.org/archive/q-bio, see also http://arXiv.org/new/q-bio.html), which aims to serve the need of this emerging community. If you and your colleagues have active interest in quantitative biology (including but not limited to biological physics, computational biology, neural science, systems biology, bioinformatics, mathematical biology, and theoretical biology), we urge you to subscribe to the archive and submit (p)reprints to it. Both theoretical and experimental contributions are welcome, and subscription is freely accessible over the internet to all members of the scientific community. Instructions for registration, submission and subscription to the archive can be found at http://arXiv.org/help/registerhelp, http://arXiv.org/help/uploads, and http://arXiv.org/help/subscribe. The q-bio archive has grown out of a well-established series of e-Print archives accessible at http://arxiv.org/. The number of biology-related submissions to these archives has risen steadily over the last several years, and is averaging over 40/month so far in 2003. Unfortunately, these submissions are currently scattered across a number of sub-archives (including physics, cond-mat, nonlinear science, math, etc.), reflecting mostly the "home field" of the contributors rather than the subject matters of their submissions. Many colleagues have expressed the desire to have a centralized archive to share their latest results, and to learn about related findings by others in this field. The q-bio archive is designed to address this problem. It is organized mainly according to different categories of biological processes and partitioned according to their scales in space and time. The categories http://arXiv.org/new/q-bio.html range from molecular and sub-cellular structures to tissues and organs, from the kinetics of molecules to population and evolutionary dynamics. 
In addition, a separate category is devoted to method-dominated contributions, including computational algorithms, experimental methods, as well as novel approaches to analyzing experimental data. All submissions are required to choose a primary category, with the option for one or more secondary categories. Subscribers of the archive will receive by e-mail the title/abstracts of all submissions in their chosen categories on a regular basis. A large number of bio-related submissions to the e-Print archives during the past decade have already been identified and categorized according to the above scheme using an automated procedure. They can be accessed at http://arxiv.org/archive/q-bio. Please note that the current list of categories is a compromise between the large number of active subject matters in biology and the areas of quantitative biology where the e-print archives have received significant contributions during the past several years. The subject list will undoubtedly be updated as the major active areas develop/shift in time. This continuous structuring of the archive is overseen by an advisory committee. It consists of a number of well-established biologists, William F. Loomis (UCSD), Chuck Stevens (Salk), Gary Stormo (WUSTL), Diethard Tautz (Cologne), together with a number of dedicated volunteers who will serve as "moderators" for each category listed at http://arXiv.org/new/q-bio.html. If you have suggestions to improve the q-bio archive, please contact the coordinators or the relevant moderators by e-mail. From nips-workshop-admin at bcs.rochester.edu Fri Oct 3 17:21:45 2003 From: nips-workshop-admin at bcs.rochester.edu (Robert Jacobs) Date: Fri, 03 Oct 2003 17:21:45 -0400 Subject: NIPS workshop schedule Message-ID: <5.1.1.6.0.20031003171116.0373ca38@bcs.rochester.edu> The Neural Information Processing Systems (NIPS) conference and workshops will take place in Vancouver and Whistler, respectively, on December 8-13. The workshop schedule is now available (see below). More information (including more details about each workshop) can be obtained from the NIPS web site: http://www.nips.cc Robert Jacobs and Satinder Singh NIPS workshop co-chairs ========================================== Neural Information Processing Systems (NIPS) workshops ------------------------------------------------------------------------------------- Two-Day Workshops ------------------------------- Title: Neural-Inspired Architectures for Nanoelectronics Organizers: Valeriu Beiu, Ulrich Rückert Title: Robust Communication Dynamics in Complex Networks Organizers: Rajarshi Das, Irina Rish, Gerald Tesauro, Cris Moore Friday, December 12 ------------------------------- Title: Estimation of Entropy and Information of Undersampled Probability Distributions: Theory, Algorithms, and Applications to the Neural Code Organizers: William Bialek, Ilya Nemenman Title: Feature Extraction Challenge Organizers: Isabelle Guyon, Masoud Nikravesh, Kristin Bennett, Richard Caruana, Asa Ben-Hur, Andre Elisseeff, Fernando Perez-Cruz, Steve Gunn Title: Hyperspectral Remote Sensing and Machine Learning Organizers: J.
Anthony Gualtieri Title: Machine Learning Meets the User Interface Organizers: John Shawe-Taylor, John Platt Title: Neural Representation of Uncertainty Organizers: Sophie Deneve, Angela Yu Title: New Problems and Methods in Bioinformatics Organizers: Christina Leslie, William Noble, Koji Tsuda Title: RNNaissance Workshop (Recurrent Neural Networks) Organizers: Juergen Schmidhuber, Alex Graves, Bram Bakker Title: Syntax, Semantics, and Statistics Organizers: Richard Shiffrin, Mark Steyvers, David Blei, Tom Griffiths Saturday, December 13 ----------------------------------- Title: Approximate Nearest Neighbor Techniques for Local Learning and Perception Organizers: Trevor Darrell, Piotr Indyk, Gregory Shakhnarovich, Paul Viola Title: Computing with Spikes: Implementation of Biology and Theory Organizers: Ralph Etienne-Cummings, Timothy Horiuchi, Giacomo Indiveri Title: ICA: Sparse Representations in Signal Processing Organizers: Barak Pearlmutter, Scott Rickard, Justinian Rosca, Stefan Harmeling Title: Information Theory and Learning: The Bottleneck and Information Distortion Approach Organizers: Naftali Tishby, Tomas Gedeon Title: Neural Processing of Complex Acoustic Signals Organizers: Melissa Dominguez, Ian Bruce, Sue Becker Title: Nonparametric Bayesian Methods and Infinite Models Organizers: Matthew Beal, Yee Whye Teh Title: Open Challenges in Cognitive Vision Organizers: Barbara Caputo, Henrik Christensen, Christian Wallraven Title: Planning for the Real World: The Promises and Challenges of Dealing with Uncertainty Organizers: Drew Bagnell, Joelle Pineau, Nicholas Roy From mcrae at uwo.ca Sun Oct 5 17:50:26 2003 From: mcrae at uwo.ca (Ken McRae) Date: Sun, 05 Oct 2003 17:50:26 -0400 Subject: Postdoctoral Fellowship Message-ID: Postdoctoral Fellowship in Connectionist Modeling and Semantic Memory We have funding for a two-year Postdoctoral Fellowship at the University of Western Ontario in London, Ontario, Canada. The stipend is $45,000 per year plus $2,500 per year for conference travel. There are no citizenship restrictions. We are most interested in someone who would like to study issues regarding the role of semantic factors in word recognition and connectionist models of the computations involved. Our research incorporates theories and methodologies from a number of areas, including those typically associated with connectionist modeling, word recognition, semantic memory, concepts and categorization, and cognitive neuropsychology. Thus, there is the potential for working on various projects under the general umbrella of modeling semantic memory phenomena. Our department has a number of Cognition faculty, many of whom conduct research related to language processing and concepts. Thus, our faculty, postdocs, and graduate students provide a rich research environment. Our labs are well-equipped for both human experimentation and computational modeling. London is a pleasant city of approximately 350,000, and is located 2 hours drive from either Toronto or Detroit. Note that a reasonable one-bedroom apartment in London costs approximately $700 per month. For further information about our labs, and Cognition at UWO, see: http://www.ssc.uwo.ca/psychology/cognitive/cognitive.html although it is somewhat out of date and is currently being reconstructed. If you are interested in this position, please send a cv, statement of research interests, at least 2 letters of reference, and sample articles to the address below. Sending all information electronically is preferable. 
We are interested in hiring someone as soon as possible. If you would like more information about this position, please contact either Stephen Lupker (lupker at uwo.ca) or me directly. Ken McRae Associate Professor, Department of Psychology & Neuroscience Program Social Science Centre University of Western Ontario London, Ontario CANADA N6A 5C2 email: mcrae at uwo.ca phone: (519) 661-2111 ext. 84688 fax: (519) 661-3961 From mseeger at EECS.Berkeley.EDU Mon Oct 6 19:31:47 2003 From: mseeger at EECS.Berkeley.EDU (mseeger@EECS.Berkeley.EDU) Date: Mon, 06 Oct 2003 23:31:47 GMT Subject: PhD thesis on PAC-Bayesian bounds and sparse Gaussian processes Message-ID: <182621182fdd.182fdd182621@EECS.Berkeley.EDU> Dear colleagues, my PhD thesis is available online at www.dai.ed.ac.uk/~seeger/papers/thesis.html. It mainly deals with: - PAC-Bayesian generalisation error bounds and applications to Gaussian process classification - Sparse approximations for linear-time inference in Gaussian process models Please find abstract and table of contents on the website. You might also be interested in the tutorial paper Gaussian Processes for Machine Learning, available at www.dai.ed.ac.uk/~seeger/papers/bayesgp-tut.html, which is extracted from the thesis, but is self-contained. An abstract follows. Best wishes, Matthias. ---- Gaussian Processes for Machine Learning Gaussian processes (GPs) are natural generalisations of multivariate Gaussian random variables to infinite (countably or continuous) index sets. GPs have been applied in a large number of fields to a diverse range of ends, and very many deep theoretical analyses of various properties are available. This paper gives an introduction to Gaussian processes on a fairly elementary level with special emphasis on characteristics relevant in machine learning. It draws explicit connections to branches such as spline smoothing models and support vector machines in which similar ideas have been investigated. --- Matthias Seeger, 485 Soda Hall, UC Berkeley, Berkeley, CA 94720-1776; Fax: 510-642-5775; www.dai.ed.ac.uk/~seeger
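For readers who want a concrete feel for the object the tutorial describes, a minimal sketch follows (ours, not drawn from the thesis or the tutorial): it draws sample functions from a GP prior with a squared-exponential covariance. The kernel choice, hyperparameters, and jitter value are illustrative assumptions only.

  # Minimal sketch (assumed setup, not from the thesis): draw sample
  # functions from a Gaussian process prior with an RBF covariance.
  import numpy as np

  def rbf_kernel(xa, xb, lengthscale=1.0, variance=1.0):
      # k(x, x') = variance * exp(-0.5 * (x - x')**2 / lengthscale**2)
      d2 = (xa[:, None] - xb[None, :]) ** 2
      return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

  x = np.linspace(-5.0, 5.0, 200)                  # index points
  K = rbf_kernel(x, x) + 1e-8 * np.eye(len(x))     # jitter for numerical stability
  L = np.linalg.cholesky(K)                        # K = L @ L.T
  samples = L @ np.random.randn(len(x), 3)         # three draws from N(0, K)
  print(samples.shape)                             # (200, 3)

Plotting the columns of samples against x gives smooth random functions whose wiggliness is set by the lengthscale.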
From manfred at cse.ucsc.edu Tue Oct 7 17:16:51 2003 From: manfred at cse.ucsc.edu (Manfred Warmuth) Date: Tue, 7 Oct 2003 14:16:51 -0700 (PDT) Subject: Rob Schapire and Yoav Freund receive Goedel Prize for AdaBoost algorithm Message-ID: The Goedel Prize is one of the most prestigious prizes in Theoretical Computer Science (jointly sponsored by EATCS and SIGACT). This is the first time that a paper in Machine Learning received this award. See http://sigact.acm.org/prizes/godel for some background information on the Goedel Prize and a list of past recipients. Rob Schapire and Yoav Freund received the 2003 prize for their famed AdaBoost paper. For an announcement see http://sigact.acm.org/prizes/godel/2003.html Background: Michael Kearns and Les Valiant first defined weak and strong learning and posed the open problem whether weak learning and strong learning are the same. In short, weak learners must have accuracy only slightly better than 50% and strong learners must be able to achieve high accuracy. In his 1991 Ph.D. thesis from MIT, Rob gave the first recursive construction for combining many weak learners to form a strong learner. This was followed by Yoav Freund's Ph.D. thesis in 1993, where he gave a simple flat scheme of combining weak learners by a majority vote. After graduating from Santa Cruz, Yoav accepted a job at AT&T Bell labs in what was one of the strongest machine learning research groups in the country. Rob was part of that group as well. They combined ideas and came up with an ``adaptive'' Boosting algorithm (called AdaBoost - just 10 lines of code), which received a lot of attention in the Machine Learning and Statistics communities.
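For readers curious what such a compact procedure looks like, here is a rough sketch in the same spirit (not the authors' code); the decision-stump weak learner, the round count, and the small numerical guards are our own assumptions for illustration.

  # Sketch of discrete AdaBoost with one-feature threshold stumps as weak
  # learners; labels y are +/-1. Illustrative only, not the prize-winning code.
  import numpy as np

  def fit_stump(X, y, w):
      best = None
      for j in range(X.shape[1]):                      # feature
          for t in np.unique(X[:, j]):                 # threshold
              for s in (1, -1):                        # orientation
                  pred = s * np.sign(X[:, j] - t + 1e-12)
                  err = np.sum(w * (pred != y))        # weighted error
                  if best is None or err < best[0]:
                      best = (err, j, t, s)
      return best

  def adaboost(X, y, rounds=10):
      w = np.full(len(y), 1.0 / len(y))                # example weights
      ensemble = []
      for _ in range(rounds):
          err, j, t, s = fit_stump(X, y, w)
          alpha = 0.5 * np.log((1.0 - err + 1e-12) / (err + 1e-12))
          pred = s * np.sign(X[:, j] - t + 1e-12)
          w *= np.exp(-alpha * y * pred)               # up-weight mistakes
          w /= w.sum()
          ensemble.append((alpha, j, t, s))
      return ensemble

  def predict(ensemble, X):
      score = sum(a * s * np.sign(X[:, j] - t + 1e-12) for a, j, t, s in ensemble)
      return np.sign(score)                            # weighted majority vote

The re-weighting step is what makes the scheme "adaptive": each new weak learner concentrates on the examples the current ensemble still gets wrong.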
Prize-winning paper that introduced AdaBoost: "A Decision Theoretic Generalization of On-Line Learning and an Application to Boosting," Journal of Computer and System Sciences 55 (1997), pp. 119-139. Congrats to them! _____________________________ Manfred K. Warmuth Prof. in Computer Science Univ. of Calif. at Santa Cruz manfred at cse.ucsc.edu http://www.cse.ucsc.edu/~manfred/ From BGabrys at bournemouth.ac.uk Tue Oct 7 10:23:51 2003 From: BGabrys at bournemouth.ac.uk (Bogdan Gabrys) Date: Tue, 7 Oct 2003 15:23:51 +0100 Subject: Industrial CASE PhD studentship available Message-ID: <4C40C6148BACD711AEF800805F8B335EDA45F1@exchange1.bournemouth.ac.uk> Dear Connectionists, The following research studentship is currently available. Please feel free to distribute to whoever may be interested in and qualified for this position. Best regards, Bogdan Gabrys *************************************************************************** EPSRC/BT funded Industrial CASE Studentship Computational Intelligence Research Group (CIRG) School of Design, Engineering and Computing, Bournemouth University, United Kingdom Applications are invited for a 3 year PhD research studentship to work on a project entitled "High Performance Fusion Systems" which is jointly funded by EPSRC and British Telecommunications plc (BT) under the EPSRC CASE scheme. The proposed research project will investigate and develop various approaches for highly efficient multiple classifier (prediction) systems composed of actively generated, well performing and decorrelated classifiers (predictors). The emphasis will be put on the automatic avoidance of data overfitting accompanied by complexity and reliability control appropriate for potential industrial applications. Combination, aggregation and fusion of information are major problems for all kinds of knowledge-based systems, from image processing to decision making, from pattern recognition to automatic learning. Various statistical, machine learning and hybrid intelligent techniques will be used for processing and modelling of imperfect data and information. The student will be joining the Computational Intelligence Research Group and will be primarily based in the School of Design, Engineering & Computing in Bournemouth but will also spend up to 3 months in each year of the project duration at BT Exact in Ipswich. The studentship carries a remuneration starting at a minimum of £12,000 pa tax-free (suitably increased in subsequent years) and payment of tuition fees at home/EU rate. The successful applicant will need to have permanent residency status in the UK. Applicants should have a strong mathematical background and hold a first or upper second class honours degree or equivalent in computer science, mathematics, physics, engineering, statistics or a similar discipline. Additionally the candidate should have strong programming experience using any one or a combination of C, C++, Matlab or Java. Knowledge of ORACLE will be an advantage. For further details please contact Dr. Bogdan Gabrys, e-mail: bgabrys at bournemouth.ac.uk . Interested candidates should send a letter of application and a detailed CV with the names and addresses of two referees to: Dr. Bogdan Gabrys, Computational Intelligence Research Group, School of DEC, Bournemouth University, Poole House, Talbot Campus, Poole, BH12 5BB, UK. Applications can also be sent by e-mail. *************************************************************************** --------------------------------------------------------------------------- Dr Bogdan Gabrys Computational Intelligence Research Group School of Design, Engineering & Computing Bournemouth University, Poole House Talbot Campus, Fern Barrow Poole, BH12 5BB United Kingdom Tel: +44 (0)1202 595298 Fax: +44 (0) 1202 595314 E-mail: bgabrys at bournemouth.ac.uk WWW: http://dec.bournemouth.ac.uk/staff/bgabrys/ --------------------------------------------------------------------------- From rogilmore at psu.edu Tue Oct 7 11:28:57 2003 From: rogilmore at psu.edu (Rick Gilmore) Date: Tue, 7 Oct 2003 11:28:57 -0400 Subject: Developmental neuroscience position at Penn State Message-ID: Developmental Neuroscience Search PSYCHOLOGY, PENN STATE. The Department of Psychology at Penn State is broadening a search for candidates for a tenure line faculty position, at any rank, with a specialization in any area of developmental neuroscience for Fall 2004. The department seeks an individual whose research will contribute to the department-wide neuroscience initiative and complement and broaden the capacities of the Department's Child Study Center (CSC) and Penn State's Child, Youth, and Family Consortium (CYFC). The CSC, a unit of the Psychology Department, is dedicated to the integration of developmental and clinical science (http://csc.la.psu.edu). The CYFC is a university-wide consortium dedicated to promoting interdisciplinary collaborations that advance research and outreach to children, youth, and families (http://www.cyfc.psu.edu). Candidates for the position may hold a doctorate in developmental or clinical child psychology (clinical candidates must hold a doctorate from an APA-approved program with an APA-approved internship). Preference will be given to candidates with postdoctoral experience. The ideal candidate would bring a research program with a focus on the role of the developing brain in the development of competence and/or psychopathology in childhood (e.g., cognitive neuroscience applied to learning and/or learning difficulties, affective neuroscience applied to the development of emotional and social competence or particular disorders). The ideal candidate would also be interested in collaborating with colleagues in the Psychology Department, other departments at Penn State, and the Hershey Medical School. Please send letter of interest, vita, sample papers, and three letters of reference to Pamela M. Cole, Developmental Neuroscience Search Committee, Box M, Department of Psychology, Penn State University, University Park, PA 16802. Review of applications will begin September 15, 2003 and will continue until the position is filled. Penn State is committed to affirmative action, equal opportunity and the diversity of its workforce. From nnk at atr.co.jp Wed Oct 8 04:08:25 2003 From: nnk at atr.co.jp (Neural Networks Editorial Office) Date: Wed, 8 Oct 2003 17:08:25 +0900 Subject: NEURAL NETWORKS 16(8) Message-ID: NEURAL NETWORKS 16(8) Contents - Volume 16, Number 8 - 2003 ------------------------------------------------------------------ NEURAL NETWORKS LETTERS: Recurrent neural networks with trainable amplitude of activation functions Su Lee Goh, Danilo P.
Mandic Solving the XOR problem and the detection of symmetry using a single complex-valued neuron Tohru Nitta CONTRIBUTED ARTICLES: ***Psychology and Cognitive Science*** A neural model of how the brain represents and compares multi-digit numbers: spatial and categorical processes Stephen Grossberg, Dmitry V. Repin A neural network simulating human reach-grasp coordination by continuous updating of vector positioning commands Antonio Ulloa, Daniel Bullock ***Neuroscience and Neuropsychology*** A model synapse that incorporates the properties of short- and long-term synaptic plasticity Armen R. Sargsyan, Albert A. Melkonyan, Costas Papatheodoropoulos, Hovhannes H. Mkrtchian, George K. Kostopoulos ***Mathematical and Computational Analysis Back-propagation learning of infinite-dimensional dynamical systems Isao Tokuda, Ryuji Tokunaga, Kazuyuki Aihara Controlling chaos in chaotic neural network Guoguang He, Zhitong Cao, Ping Zhu, Hisakazu Ogura Neural independent component analysis by 'maximum-mismatch' learning principle Simone Fiori Necessary and sufficient condition for absolute stability of normal neural networks Tianguang Chu, Cishen Zhang, Zongda Zhang Book Review: Artificial Immune Systems: A New Computational Intelligence Approach L.N. de Castro, J. Timmis *** CURRENT EVENTS *** ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
------------------------------------------------------------------------
Membership Type                      INNS            ENNS       JNNS
------------------------------------------------------------------------
membership with Neural Networks      $80 (regular)   SEK 660    Y 13,000 (plus Y 2,000 enrollment fee)
                                     $20 (student)   SEK 460    Y 11,000 (plus Y 2,000 enrollment fee)
------------------------------------------------------------------------
membership without Neural Networks   $30             SEK 200    not available to non-students (subscribe
                                                                 through another society); Y 5,000 student
                                                                 (plus Y 2,000 enrollment fee)
------------------------------------------------------------------------
Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Shozo Yasui Kyushu Institute of Technology Graduate School of Life Science and Engineering 2-4 Hibikino, Wakamatsu-ku Kitakyushu 808-0196 Japan 81 93 695 6108 (phone and fax) jnns at brain.kyutech.ac.jp http://www.jnns.org/ ----------------------------------------------------------------- From masami at email.arizona.edu Wed Oct 8 12:26:57 2003 From: masami at email.arizona.edu (Masami TATSUNO) Date: Wed, 8 Oct 2003 09:26:57 -0700 Subject: Preprint on possible neural architectures underlying information-geometric measures Message-ID: <4CC5E04DDFF2FF4D84EAC8F1A5EC00D82D05EB@mail.nsma.arizona.edu> Dear Connectionists, Our preprint, 'Investigation of Possible Neural Architectures Underlying Information-Geometric Measures', M. Tatsuno and M. Okada, to appear in Neural Computation is now available for download at, http://www.mns.brain.riken.go.jp/~okada/Tatsuno_Okada.pdf The preliminary results of this study have been reported in the following articles. 'Possible neural mechanisms underlying information-geometric measure parameters', M. Tatsuno and M. Okada, Society for Neuroscience Abstracts, 28, 675.15, 2002. 'How does the information-geometric measure depend on underlying neural mechanisms?', M. Tatsuno and M. Okada, Neurocomputing, Vol. 52 - 54, pp. 649 - 654, 2003. Best regards, Masami TATSUNO ARL Division of Neural Systems, Memory and Aging Life Sciences North Building, Room 384 The University of Arizona Tucson, AZ 85724, USA ----- Abstract ----- A novel analytical method based on information geometry was recently proposed, and this method may provide useful insights into the statistical interactions within neural groups. The link between information-geometric measures and the structure of neural interactions has not yet been elucidated, however, because of the ill-posed nature of the problem. Here, possible neural architectures underlying information-geometric measures are investigated using an isolated pair and an isolated triplet of model neurons. By assuming the existence of equilibrium states, we derive analytically the relationship between the information-geometric parameters and these simple neural architectures. For symmetric networks, the first- and second-order information-geometric parameters represent, respectively, the external input and the underlying connections between the neurons provided that the number of neurons used in the parameter estimation in the log-linear model and the number of neurons in the network are the same. For asymmetric networks, however, these parameters are dependent both on the intrinsic connections and on the external inputs to each neuron. In addition, we derive the relation between the information-geometric parameter corresponding to the two-neuron interaction and a conventional cross-correlation measure. We also show that the information-geometric parameters vary depending on the number of neurons assumed for parameter estimation in the log-linear model. This finding suggests a need to examine the information-geometric method carefully, and a possible criterion for choosing an appropriate orthogonal coordinate is also discussed. This paper points out the importance of a model-based approach, and sheds light on the possible neural structure underlying the application of information geometry to neural network analysis.
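To make the second-order measure concrete for the simplest case, here is a small sketch (our illustration, not the paper's code) that estimates the pairwise log-linear parameter theta_12 = log[ p(1,1) p(0,0) / ( p(1,0) p(0,1) ) ] from two binarized spike trains; theta_12 = 0 corresponds to independence, and the 0.5 added to each cell count is an assumed smoothing convention.

  # Sketch: estimate the pairwise parameter theta_12 of the log-linear model
  # p(x1, x2) = exp(th1*x1 + th2*x2 + th12*x1*x2 - psi) from binary spike trains.
  import numpy as np

  def theta_12(x1, x2):
      x1 = np.asarray(x1, dtype=int)
      x2 = np.asarray(x2, dtype=int)
      counts = np.full((2, 2), 0.5)            # joint counts with mild smoothing
      for a, b in zip(x1, x2):
          counts[a, b] += 1
      p = counts / counts.sum()
      return np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1]))   # log odds ratio

  rng = np.random.default_rng(0)
  a = rng.integers(0, 2, size=1000)
  b = rng.integers(0, 2, size=1000)            # independent of a
  print(theta_12(a, b))                        # near 0 for independent trains

For a pair of binary units, this log odds ratio is exactly the second-order parameter of the log-linear expansion.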
From poznan at iub-psych.psych.indiana.edu Thu Oct 9 01:18:09 2003 From: poznan at iub-psych.psych.indiana.edu (Roman Poznanski) Date: Thu, 09 Oct 2003 01:18:09 -0400 Subject: JIN 2(2) CONTENTS Message-ID: <3F84EF91.4010304@iub-psych.psych.indiana.edu> JOURNAL OF INTEGRATIVE NEUROSCIENCE Vol. 2, No.2, December 2003 CONTENTS ----------- Short communications Savant-like Skills Exposed in Normal People by Suppressing the Left Fronto-temporal Lobe Allan W. Snyder, Elaine Mulcahy, Janet L. Taylor, D. John Mitchell, Perminder Sachdev, and Simon C. Gandevia. Closing an Open-loop Control System: Vestibular Substitution Through the Tongue Mitchell Tyler, Yuri Danilov and Paul Bach-y-Rita. Research Reports Age Related Alterations in the Complexity of Respiratory Patterns Metin Akay, Karen L. Moodie, P. Jack Hoopes, A Basal Ganglia Inspired Model of Action Selection Evaluated in a Robotic Survival Task B. Girard, V. Cuzin, A. Guillot, K.N. Gurney, T.J. Prescott. Applying Database Technology to Clinical and Basic Research Bioinformatics Projects Kelly A. Sullivan Short-term Autonomic Control of Cardiovascular Function: A Mini-review With the Help of Mathematical Models Mauro Ursino Distributed Coding Efficiency in Orthogonal Models of Facial Processing Paul A. Watters -- Roman R. Poznanski, Ph.D Associate Editor, Journal of Integrative Neuroscience Department of Psychology Indiana University 1101 E. 10th St. Bloomington, IN 47405-7007 email: poznan at iub-psych.psych.indiana.edu phone (Office): (812) 856-0838 http://www.worldscinet.com/jin/mkt/editorial.shtml From malchiodi at dsi.unimi.it Fri Oct 10 11:06:26 2003 From: malchiodi at dsi.unimi.it (Dario Malchiodi) Date: Fri, 10 Oct 2003 17:06:26 +0200 Subject: Book announcement: Algorithmic Inference in Machine Learning Message-ID: <5191CBE0-FB33-11D7-A0ED-0003939B3D3E@dsi.unimi.it> Apologies for cross-posting Dear Colleagues, It is our pleasure to announce the availability of our book Algorithmic Inference in Machine Learning (International Series on Advanced Intelligence, Vol.
5) ISBN 0-9751004-2-4 (see http://laren.dsi.unimi.it/aibook) This book offers a new theoretical framework for modern statistical inference problems, generally referred to as learning problems. They arise in connection with hard operational contexts to be managed in the absence of all necessary knowledge. The success of their solutions lies in a suitable mix of computational skill in processing the available data and sophisticated attitude in stating logical relations between their properties and the expected behavior of candidate solutions. The framework is discussed through rigorous mathematical statements in the province of probability theory and a highly comprehensive style. Theoretical concepts are introduced using examples from everyday life. The book can be ordered from Advanced Knowledge International Pty Ltd PO Box 228 Magill, Adelaide South Australia SA 5072 Australia Email: info at innoknowledge.com Fax: +61-8-8332-6805 or on-line at amazon.com: http://www.amazon.com/exec/obidos/tg/detail/-/0975100424/ qid=1065775131/sr=1-2/ref=sr_1_2/103-1284135-9639054?v=glance&s=books A numerical tool for computing statistics according to our framework is under development. The tool is available through the web site http://laren.dsi.unimi.it/TAP, and we hope for your suggestions and comments. Sincerely, Bruno Apolloni, Dario Malchiodi and Sabrina Gaito From Harel.Shouval at uth.tmc.edu Fri Oct 10 13:06:07 2003 From: Harel.Shouval at uth.tmc.edu (Harel Shouval) Date: Fri, 10 Oct 2003 12:06:07 -0500 Subject: Postdoctoral Positions Message-ID: <3F86E6FF.2020907@uth.tmc.edu> Postdoctoral positions available in Theoretical/Computational Neuroscience. We use analytical and computational techniques for studying the cellular basis of learning, memory and development. Current topics include modeling of 'spike time dependent plasticity', receptive field development with spiking neurons and studying the molecular basis of stable long term plasticity. Knowledge of analytical or computational methods is required, knowledge of Neuroscience is preferred. Most of our computational work is carried out using Matlab and C, on Linux platforms. Description of work in my lab, and links to recent papers, can be found at: http://nba.uth.tmc.edu/resources/faculty/members/shouval.htm Please contact: Harel Shouval Department of Neurobiology and Anatomy The University of Texas-Houston Medical Center Harel.shouval at uth.tmc.edu Tel: 713-500-5708 EOE/AA/SSP/smoke free environment From djaeger at emory.edu Thu Oct 9 13:00:34 2003 From: djaeger at emory.edu (Dieter Jaeger) Date: Thu, 9 Oct 2003 19:00:34 +0200 Subject: Postdoctoral Position in Computational Neuroscience Message-ID: <3F6C826D.2F419699@emory.edu> A funded postdoctoral opening in the area of computational neuroscience is available in my laboratory at Emory University, Atlanta. The research project is aimed at elucidating the operation of the deep cerebellar nuclei using whole cell recordings in slices and compartmental modeling. This work will build on our previous publications in this area (Gauck and Jaeger, J. Neurosci. 2000; 2003). The Neuroscience environment at Emory University is excellent, and living in Atlanta features a large international community and plenty of activities. Candidates should have previous experience in intracellular electrophysiology and/or compartmental modeling. Interested candidates should contact djaeger at emory.edu for further details. -Dieter Jaeger Associate Professor Emory University Department of Biology 1510 Clifton Rd.
Atlanta, GA 30322 Tel: 404 727 8139 Fax: 404 727 2880 e-mail: djaeger at emory.edu From mackay at mrao.cam.ac.uk Mon Oct 13 09:08:28 2003 From: mackay at mrao.cam.ac.uk (David J.C. MacKay) Date: Mon, 13 Oct 2003 14:08:28 +0100 Subject: Information Theory, Inference, and Learning Algorithms Message-ID: The following book is now available for purchase in bookstores, price 30 pounds or $50 US. As of Mon 13/10/03, Barnes and Noble are offering it at the special price of $40.00. The book also remains available for free on-screen viewing. It can be downloaded from http://www.inference.phy.cam.ac.uk/mackay/itila/ ======================================================================== "Information Theory, Inference, and Learning Algorithms" by David J.C. MacKay Cambridge University Press http://www.cambridge.org/0521642981 ------------------------------------------------------------------------- `An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.' Bob McEliece, California Institute of Technology An undergraduate / graduate textbook. This book features: * lots of figures and demonstrations. * more than four hundred exercises, many with worked solutions. * up to date exposition of: . source coding - including arithmetic coding. . channel coding - including Gallager codes, turbo codes, and digital fountain codes. . machine learning - including clustering, neural networks, and Gaussian processes. . message-passing algorithms - especially the sum-product algorithm, or loopy belief propagation. . variational methods - including the variational view of the E-M algorithm. . Monte Carlo methods - including Hamiltonian Monte Carlo, Overrelaxation, Slice sampling, and Exact sampling. * Entertaining asides on topics as varied as crosswords, codebreaking, evolution, and sex. The book was published in September 2003. (Hardback, 640 pages) Inspect it for free at http://www.inference.phy.cam.ac.uk/mackay/itila/ --- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - David J.C. MacKay mackay at mrao.cam.ac.uk http://www.inference.phy.cam.ac.uk/mackay/ Cavendish Laboratory, Madingley Road, Cambridge CB3 0HE. U.K. (01223) 339852 | fax: 354599 | home: 740511 international: +44 1223 From planning at icsc.ab.ca Sun Oct 12 17:16:26 2003 From: planning at icsc.ab.ca (Jeanny S. Ryffel) Date: Sun, 12 Oct 2003 15:16:26 -0600 Subject: Biologically Inspired Cognitive Systems / Scotland 2004 Message-ID: <5.1.0.14.2.20031012151451.00a44b30@pop.interbaun.com> The science of neural computation focuses on mathematical aspects for solving complex practical problems. It also seeks to help neurology, brain theory and cognitive psychology in the understanding of the functioning of the nervous system by means of computational models of neurons, neural nets and sub-cellular processes. BICS2004 aims to become a major point of contact for research scientists, engineers and practitioners throughout the world in the fields of cognitive and computational systems inspired by the brain and biology. Participants will share the latest research, developments and ideas in the wide arena of disciplines encompassed under the heading of BICS2004: First International ICSC Symposium on Cognitive Neuro Science (CNS 2004) (from computationally inspired models to brain-inspired computation) Chair: Prof. 
Igor Aleksander, Imperial College London, U.K. Second International ICSC Symposium on Biologically Inspired Systems (BIS 2004) Chair: Prof. Leslie Smith, University of Stirling, U.K. Third International ICSC Symposium on Neural Computation (NC'2004) Chair: Dr. Amir Hussain, University of Stirling, U.K. http://www.icsc-naiso.org/conferences/bics2004/bics-cfp.html From werning at phil-fak.uni-duesseldorf.de Mon Oct 13 07:32:52 2003 From: werning at phil-fak.uni-duesseldorf.de (Markus Werning) Date: Mon, 13 Oct 2003 13:32:52 +0200 Subject: CFP: Compositionality, Concepts and Cognition Message-ID: <008201c3917d$d146e650$e26a6386@philos19> Please distribute - apologies for multiple posting -------------------------------------------------------------------- FIRST CALL FOR PAPERS Compositionality, Concepts and Cognition An Interdisciplinary Conference in Cognitive Science Düsseldorf, Germany February 28 to March 3, 2004 http://www.phil.uni-duesseldorf.de/thphil/compositionality CONFERENCE AIM The conference on compositionality is to take place at Heinrich Heine University Düsseldorf, Germany, from February 28 to March 3, 2004. Compositionality is a key feature of structured representational systems, be they linguistic, mental or neuronal. A system of representations is called compositional just in case the semantic values of complex representations are determined by the semantic values of their parts. The conference brings together internationally renowned scholars from various disciplines of the cognitive sciences, including philosophers, psychologists, linguists, computer scientists and neuroscientists. The speakers will address the issue of compositionality from very different perspectives. It is the aim of the conference to further the exchange of views on compositionality across the disciplines and to explore the implications and conditions of compositionality as a property of representational systems in the study of language, mind and brain.
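To see the definition in miniature (a toy of our own, not part of the call): a semantic valuation is compositional when the value of a complex representation is a fixed function of the values of its immediate parts, as in this small evaluator whose grammar and denotations are invented for the example.

  # Toy compositional semantics: the value of a complex expression is
  # determined by the values of its parts via a fixed operation table.
  OPS = {"plus": lambda a, b: a + b, "times": lambda a, b: a * b}

  def value(expr):
      # expr is a number (atomic) or a tuple (operator, left, right)
      if isinstance(expr, (int, float)):
          return expr                                   # lexical meaning
      op, left, right = expr
      return OPS[op](value(left), value(right))         # whole from parts

  print(value(("plus", 2, ("times", 3, 4))))            # -> 14

Swapping the operation table changes the meanings, while the compositional recipe (the whole's value as a function of its parts' values) stays the same.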
PLENARY SPEAKERS The list of plenary speakers includes Johannes Brandl, Henry Brighton, Daniel Cohnitz, Andreas K. Engel, Lila Gleitman, Terry Horgan, Theo Janssen, Hannes Leitgeb, Sebastian Löbner, Edouard Machery, Alexander Maye, Brian McLaughlin, C. Ulises Moulines, Jeff Pelletier, Martina Penke, Jesse Prinz, Gabriel Sandu, Richard Schantz, Oliver Scholz, Ricarda Schubotz, Gerhard Schurz, Markus Werning, Gert Westermann, Edward Wisniewski, and Dieter Wunderlich. ORGANIZATION The conference is the result of a co-operation between the Institut Jean Nicod, the University Paris-Sorbonne, the Ecole Normale Superieure in France and Heinrich Heine University Düsseldorf in Germany. It is organized by - Markus Werning, Department of Philosophy, Heinrich Heine University and Center for Language, Logic, and Information, Düsseldorf; - Edouard Machery, Department of Philosophy, Sorbonne, Paris, and Max-Planck-Institute for Human Development, Berlin; - Gerhard Schurz, Department of Philosophy, Heinrich Heine University, Düsseldorf. SCIENTIFIC BOARD - Daniel Andler, Department of Philosophy, Sorbonne, Paris, and Department of Cognitive Studies, ENS-Ulm, Paris; - Peter Carruthers, Department of Philosophy, University of Maryland; - James Hampton, Department of Psychology, City University London; - Douglas Medin, Department of Psychology, Northwestern University, Evanston; - Jesse Prinz, Department of Philosophy, University of North Carolina, Chapel-Hill; - Francois Recanati, Institut Jean-Nicod, Centre National de la Recherche Scientifique, Paris; - Philippe Schlenker, Department of Linguistics, University of California, Los Angeles, and Institut Jean-Nicod, Centre National de la Recherche Scientifique, Paris; - Dag Westerstahl, Department of Philosophy, University of Gotenborg. ABSTRACT SUBMISSION The programme committee invites researchers in the cognitive sciences (philosophy, psychology, neuroscience, computer science, linguistics, etc.) to present their work on compositionality at the conference. The deadline for paper submission is December 10, 2003. Only a limited number of oral and poster presentations can be accepted. Papers should fit into the overall programme of the conference and should be accessible to an interdisciplinary audience. Oral presentations are 20 minutes plus 10 minutes discussion. To submit a paper, please send in an extended abstract of about 1500 words using the online submission form on the conference homepage: http://www.phil-fak.uni-duesseldorf.de/thphil/compositionality In exceptional cases, hardcopy submission is also possible at: CoCoCo2004 c/o Markus Werning Chair of Theoretical Philosophy Heinrich-Heine-University Düsseldorf Universitätsstr. 1 D-40225 Düsseldorf, Germany All submitted abstracts will be reviewed. The corresponding author will be notified about acceptance by January 15, 2004. Submissions must be received by December 10, 2003. The presenting author(s) must register for the conference after notification of acceptance. SPONSOR The conference is sponsored by the Fritz Thyssen Foundation. CONTACT Please address any questions to cococo2004 at phil.uni-duesseldorf.de From mm at cse.ogi.edu Wed Oct 15 14:28:18 2003 From: mm at cse.ogi.edu (Melanie Mitchell) Date: Wed, 15 Oct 2003 11:28:18 -0700 Subject: faculty positions at OGI Message-ID: <16269.37314.622706.412847@sangre.cse.ogi.edu> Dear Connectionists, I wanted to bring to your attention the following advertisement for faculty recruitment at OGI. Adaptive systems and machine learning is one of our specific target areas for hiring. OGI is putting significant resources into building a strong and diverse Laboratory for Adaptive Systems, and we expect to hire between two and four people in this area within the next five years. Our recent merger with the Oregon Health & Science University has opened up great opportunities for collaboration with researchers in biological and medical sciences, and our close proximity to the OHSU Neurological Sciences Institute has already resulted in collaborations between our respective faculties in the fields of neural modeling and neural computation.
OGI's current adaptive systems faculty includes: Dan Hammerstrom: Biologically inspired computation, VLSI chip design, neural networks Marwan Jabri: Intelligent signal processing, biologically inspired control for robotics Todd Leen: Machine learning, local and mixture models, neurophysiological modeling Melanie Mitchell: Evolutionary computation, cognitive science, complex systems John Moody (ICSI; joint appointment at OGI): Reinforcement learning, neural networks, time series analysis, data mining, computational finance Misha Pavel: Cognitive science, biologically inspired computation, biomedical engineering Xubo Song: Image processing, statistical pattern recognition Eric Wan: Neural networks, adaptive signal processing and control We are looking for outstanding individuals in any area of machine learning or adaptive systems to join our faculty. Please pass this on to anyone you think might be interested. --------------------------------------------------------------------- Faculty Positions at the OGI School of Science and Engineering The Department of Computer Science and Engineering invites applications for faculty positions at all ranks. We invite applications from outstanding candidates in any area of computer science and engineering. The typical teaching load in CSE is 2 graduate-level classes per year. Following an institutional merger in 2001, the Oregon Graduate Institute of Science and Technology (OGI) became one of the four schools of Oregon Health & Science University (OHSU). The merger is enabling the CSE department to expand in core disciplines of computer science and engineering and to establish strong interdisciplinary collaborations among researchers in computational, biological, and biomedical sciences. OGI is located 12 miles west of Portland, Oregon, in the heart of the Silicon Forest. Portland's extensive high-tech community, diverse cultural amenities and spectacular natural surroundings combine to make the quality of life here extraordinary. To learn more about the department, OGI, OHSU and Portland, please visit http://www.cse.ogi.edu. To apply, send a brief description of your research interests, the names of at least three references, and a curriculum vitae with a list of publications to: Chair, Recruiting Committee Department of Computer Science and Engineering OGI School of Science and Engineering at OHSU 20000 NW Walker Road Beaverton, Oregon 97006 Applications sent in before January 15, 2004 will be given preference. All applications will be reviewed. The email address for inquiries is: csedept at cse.ogi.edu. OGI/OHSU is an Equal Opportunity/Affirmative Action employer. We particularly welcome applications from women, minorities, and individuals with disabilities. From klaus at prosun.first.fraunhofer.de Thu Oct 16 10:42:24 2003 From: klaus at prosun.first.fraunhofer.de (Klaus-R. Mueller) Date: Thu, 16 Oct 2003 16:42:24 +0200 (MEST) Subject: NIPS Travel Grant Deadline Message-ID: <200310161442.h9GEgOCx021488@prosun.first.fraunhofer.de> Dear collegues, please take note of the following announcement by Bartlett Mel (NIPS Treasurer): The deadline to apply for a travel grant for the NIPS 2003 Conference is this Friday, October 17, 2003. Modest travel support will be available, with a preference to students, to those who will be presenting at the meeting, and to those who have not previously received travel awards. 
For details, and to fill out the on-line application, click here: http://www.nips.cc/Conferences/2003/TravelSupport.php Best wishes, klaus Muller
--
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
Prof. Dr. Klaus-Robert Müller
University of Potsdam and Fraunhofer Institut FIRST
Intelligent Data Analysis Group (IDA)
Kekulestr. 7, 12489 Berlin
e-mail: Klaus-Robert.Mueller at first.fraunhofer.de and klaus at first.fhg.de
Tel: +49 30 6392 1860
Tel: +49 30 6392 1800 (secretary)
FAX: +49 30 6392 1805
http://www.first.fhg.de/persons/Mueller.Klaus-Robert.html
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
From nnk at atr.co.jp Sun Oct 19 20:35:42 2003 From: nnk at atr.co.jp (Neural Networks Editorial Office) Date: Mon, 20 Oct 2003 09:35:42 +0900 Subject: Neural Networks 16(9): Special Issue on "Neuroinformatics" Message-ID: Contents - Volume 16, Number 9 - 2003
2003 Special Issue: "Neuroinformatics" edited by Shun-ichi Amari, Michael A Arbib and Rolf Kotter
------------------------------------------------------------------
Contents:
Introduction
Language evolution: neural homologies and neuroinformatics. Michael Arbib, Mihail Bota
Network participation indices: characterizing component roles for information processing in neural networks. Rolf Kotter, Klaas E. Stephan
Towards a formalization of disease-specific ontologies for neuroinformatics. Amarnath Gupta, Bertram Ludascher, Jeffrey S. Grethe, Maryann E. Martone
Visiome: neuroinformatics research in vision project. Shiro Usui
Neuroanatomical database of normal Japanese brains. Kazunori Sato, Yasuyuki Taki, Hiroshi Fukuda, Ryuta Kawashima
Complex independent component analysis of frequency-domain electroencephalographic data. Jorn Anemuller, Terrence J. Sejnowski, Scott Makeig
Learning and inference in the brain. Karl Friston
Modeling the adaptive visual system: a survey of principled approaches. Lars Schwabe, Klaus Obermayer
Self-correction mechanism for path integration in a modular navigation system on the basis of an egocentric spatial map. Regina Mudra, Rodney J. Douglas
Kinetic simulation of signal transduction system in hippocampal long-term potentiation with dynamic modeling of protein phosphatase 2A. Shinichi Kikuchi, Kenji Fujimoto, Noriyuki Kitagawa, Taro Fuchikawa, Michiko Abe, Kotaro Oka, Kohtaro Takei, Masaru Tomita
------------------------------------------------------------------
Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies.
Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment.
----------------------------------------------------------------------------
Membership Type        INNS            ENNS               JNNS
----------------------------------------------------------------------------
membership with        $80 (regular)   SEK 660 (regular)  Y 13,000 (regular)
Neural Networks                                           (plus 2,000 enrollment fee)
                       $20 (student)   SEK 460 (student)  Y 11,000 (student)
                                                          (plus 2,000 enrollment fee)
-----------------------------------------------------------------------------
membership without     $30             SEK 200            not available to
Neural Networks                                           non-students (subscribe
                                                          through another society)
                                                          Y 5,000 (student)
                                                          (plus 2,000 enrollment fee)
-----------------------------------------------------------------------------
Name: _____________________________________
Title: _____________________________________
Address: _____________________________________
_____________________________________
_____________________________________
Phone: _____________________________________
Fax: _____________________________________
Email: _____________________________________
Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________
INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org
ENNS Membership
University of Skovde
P.O. Box 408
531 28 Skovde
Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns
JNNS Membership
c/o Professor Shozo Yasui
Kyushu Institute of Technology
Graduate School of Life Science and Engineering
2-4 Hibikino, Wakamatsu-ku
Kitakyushu 808-0196 Japan
81 93 695 6108 (phone and fax)
jnns at brain.kyutech.ac.jp
http://www.jnns.org/
-----------------------------------------------------------------
From mark.laubach at yale.edu Mon Oct 20 10:24:50 2003 From: mark.laubach at yale.edu (Mark Laubach) Date: Mon, 20 Oct 2003 10:24:50 -0400 Subject: POSTDOCTORAL POSITION: REAL-TIME DECODING FOR BRAIN-MACHINE INTERFACES Message-ID: <3F93F032.4000502@yale.edu> POSTDOCTORAL POSITION: REAL-TIME DECODING FOR BRAIN-MACHINE INTERFACES A postdoctoral position is available in January 2004 in the laboratory of Dr Mark Laubach at the John B Pierce Laboratory, a non-profit research institute affiliated with Yale University. The position is part of a collaboration with Drs Jon Kaas and Troy Hackett and their colleagues at Vanderbilt University and is supported by DARPA. This individual will develop novel methods for on-line analyses of neuronal ensemble data (e.g., adaptive methods for spike sorting and decoding analyses of spike and field potential data) using a cluster of Linux workstations. In addition, putative neuronal codes in auditory, prefrontal, and motor areas of the cerebral cortex will be investigated using neuronal ensemble recording and microstimulation methods in collaboration with neurophysiologists in our group.
The position requires expertise in methods for statistical pattern recognition (e.g., random forests, SVMs), functional data analysis, and scientific computing under Linux. Our lab makes heavy use of Matlab, R, and Python, so knowledge of these tools is also necessary. Interested individuals should submit a letter of application, CV, copies of publications, and the names of three references to:
Mark Laubach, PhD
The John B. Pierce Laboratory Inc.
290 Congress Avenue
New Haven, CT 06519
Inquiries: laubach at jbpierce.org
The review of applications will begin November 1, 2003, and will continue until the position is filled. E.O.E. From mr287 at georgetown.edu Mon Oct 20 19:57:45 2003 From: mr287 at georgetown.edu (Maximilian Riesenhuber) Date: Mon, 20 Oct 2003 19:57:45 -0400 Subject: postdoctoral position in computational neuroscience/fMRI, Georgetown University Message-ID: <3F947679.4070201@georgetown.edu> Postdoctoral Position in Computational Neuroscience and fMRI Department of Neuroscience Georgetown University A postdoctoral position is available immediately to study the neural mechanisms underlying real world object recognition in cortex using a combination of modeling, psychophysics, and fMRI. The project is part of an NIH-funded collaboration between the Riesenhuber lab at Georgetown and labs at MIT (Jim DiCarlo, Earl Miller, Tomaso Poggio), Caltech (Christof Koch) and Northwestern (David Ferster). Candidates should have experience in two of the following: computational neuroscience, visual psychophysics, fMRI. Experience with fMRI is a particular asset as a key component of the project will be the experimental testing of model predictions with fMRI, using Georgetown's new 3T magnet. For more information, see http://riesenhuberlab.neuro.georgetown.edu/Riesenhuber_lab_jobs.html . Interested candidates should send a CV, representative reprints, and the names of three references to Maximilian Riesenhuber (mr287 at georgetown.edu). Review of applications will begin November 1, with a possibility for interviews at this year's SFN meeting in New Orleans.
*************************************
Maximilian Riesenhuber
Department of Neuroscience
Georgetown University Medical Center
3970 Reservoir Rd., NW
Research Building Room EP09
Washington, DC 20007
From terry at salk.edu Tue Oct 21 20:13:40 2003 From: terry at salk.edu (Terry Sejnowski) Date: Tue, 21 Oct 2003 17:13:40 -0700 (PDT) Subject: UCSD Computational Neurobiology Training Program Message-ID: <200310220013.h9M0DeN21455@purkinje.salk.edu> DEADLINE: JANUARY 2, 2004
COMPUTATIONAL NEUROBIOLOGY GRADUATE PROGRAM
Department of Biology - University of California, San Diego
http://www.biology.ucsd.edu/grad/CN_overview.html
The goal of the Computational Neurobiology Graduate Program at UCSD is to train researchers who are equally at home measuring large-scale brain activity, analyzing the data with advanced computational techniques, and developing new models for brain development and function. Financial support for students enrolled in this training program is available through an NSF Integrative Graduate Education and Research Training (IGERT) award. Candidates from a wide range of backgrounds are invited to apply, including Biology, Psychology, Computer Science, Physics and Mathematics. The three major themes in the training program are:
1. Neurobiology of Neural Systems: Anatomy, physiology and behavior of systems of neurons. Using modern neuroanatomical, behavioral, neuropharmacological and electrophysiological techniques.
Lectures, wet laboratories and computer simulations, as well as research rotations. Major new imaging and recording techniques also will be taught, including two-photon laser scanning microscopy and functional magnetic resonance imaging (fMRI).
2. Algorithms and Realizations for the Analysis of Neuronal Data: New algorithms and techniques for analyzing data obtained from physiological recording, with an emphasis on recordings from large populations of neurons with imaging and multielectrode recording techniques. New methods for the study of co-ordinated activity, such as multi-taper spectral analysis and Independent Component Analysis (ICA).
3. Neuroinformatics, Dynamics and Control of Systems of Neurons: Theoretical aspects of single cell function and emergent properties as many neurons interact among themselves and react to sensory inputs. A synthesis of approaches from mathematics and physical sciences as well as biology will be used to explore the collective properties and nonlinear dynamics of neuronal systems, as well as issues of sensory coding and motor control.
Participating Faculty include:
* Henry Abarbanel (Physics): Nonlinear and oscillatory dynamics; modeling central pattern generators in the lobster stomatogastric ganglion. Director, Institute for Nonlinear Science at UCSD
* Thomas Albright (Salk Institute): Motion processing in primate visual cortex; linking single neurons to perception; fMRI in awake, behaving monkeys. Director, Sloan Center for Theoretical Neurobiology
* Darwin Berg (Neurobiology): Regulation of synaptic components, assembly and localization, function and long-term stability.
* Garrison Cottrell (Computer Science and Engineering): Dynamical neural network models and learning algorithms
* Virginia De Sa (Cognitive Science): Computational basis of perception and learning (both human and machine); multi-sensory integration and contextual influences
* Mark Ellisman (Neurosciences, School of Medicine): High resolution electron and light microscopy; anatomical reconstructions. Director, National Center for Microscopy and Imaging Research
* Marla Feller (Neurobiology): Mechanisms and function of spontaneous activity in the developing nervous system including the retina, spinal cord, hippocampus and neocortex.
* Robert Hecht-Nielsen (Electrical and Computer Engineering): Neural computation and the functional organization of the cerebral cortex. Founder of Hecht-Nielsen Corporation
* Harvey Karten (Neurosciences, School of Medicine): Anatomical, physiological and computational studies of the retina and optic tectum of birds and squirrels
* David Kleinfeld (Physics): Active sensation in rats; properties of neuronal assemblies; optical imaging of large-scale activity.
* William Kristan (Neurobiology): Computational Neuroethology; functional and developmental studies of the leech nervous system, including studies of the bending reflex and locomotion.
Director, Neurosciences Graduate Program at UCSD
* Herbert Levine (Physics): Nonlinear dynamics and pattern formation in physical and biological systems, including cardiac dynamics and the growth and form of bacterial colonies
* Scott Makeig (Institute for Neural Computation): Analysis of cognitive event-related brain dynamics and fMRI using time-frequency and Independent Component Analysis
* Javier Movellan (Institute for Neural Computation): Sensory fusion and learning algorithms for continuous stochastic systems
* Mikhael Rabinovich (Institute for Nonlinear Science): Dynamical systems analysis of the stomatogastric ganglion of the lobster and the antenna lobe of insects
* Terrence Sejnowski (Salk Institute/Neurobiology): Computational neurobiology; physiological studies of neuronal reliability and synaptic mechanisms. Director, Institute for Neural Computation
* Martin Sereno (Cognitive Science): Neural bases of visual cognition and language using anatomical, electrophysiological, computational, and non-invasive brain imaging techniques
* Nicholas Spitzer (Neurobiology): Regulation of ionic channels and neurotransmitters in neurons; effects of electrical activity in developing neurons on neural function. Chair of Neurobiology
* Charles Stevens (Salk Institute): Synaptic physiology; physiological studies and biophysical models of synaptic plasticity in hippocampal neurons
* Jochen Triesch (Cognitive Science): Sensory integration, visual psychophysics, vision systems and robotics, human-robot interaction, cognitive development
* Roger Tsien (Chemistry): Second messenger systems in neurons; development of new optical and MRI probes of neuron function, including calcium indicators and caged neurotransmitters
* Mark Whitehead (Neurosurgery, School of Medicine): Peripheral and central taste systems; anatomical and functional studies of regions in the caudal brainstem important for feeding behavior
* Ruth Williams (Mathematics): Probabilistic analysis of stochastic systems and continuous learning algorithms
Requests for application materials should be sent to the University of California, San Diego, Division of Biological Sciences 0348, Graduate Admissions Office, 9500 Gilman Drive, La Jolla, CA, 92093-0348 or to [gradprog at biomail.ucsd.edu]. The deadline for completed application materials, including letters of recommendation, is January 2, 2004. For more information about applying to the UCSD Biology Graduate Program, see the URL below. A preapplication is not required for the Computational Neurobiology Program. http://www.biology.ucsd.edu/grad/admissions/index.html From Wulfram.Gerstner at epfl.ch Wed Oct 22 03:43:52 2003 From: Wulfram.Gerstner at epfl.ch (Wulfram Gerstner) Date: Wed, 22 Oct 2003 09:43:52 +0200 Subject: Spike-Timing Dependent Plasticity Workshop Message-ID: <3F963538.82628815@epfl.ch> I would like to announce
**********************************************************************
The Monte Verita Workshop on Spike-Timing Dependent Plasticity (STDP)
Ascona, Switzerland, February 29 - March 5, 2004
**********************************************************************
http://diwww.epfl.ch/~gerstner/STDP/index.html
The symposium will bring together experimentalists and theoreticians working on Hebbian Learning, in particular Spike-Timing Dependent Plasticity (STDP). There will be a single track of talks. Most talks are by invitation, but a few extra time slots for contributed talks are available. There is also the possibility to present a poster.
Registration is possible before January 15, 2004, but early registration is encouraged since the total number of participants is limited.
Invited speakers
A. Experiments
G.Q. Bi, Pittsburgh
Dean Buonomano, UCLA
Yang Dan, Berkeley
Dominique Debanne, Marseille
Nace Golding, Texas
Mark Hubener, MPI of Neurobiology, Munich
Mayank Mehta, MIT
Henry Markram, EPFL, Lausanne
M.-M. Poo, Berkeley
Miguel Remondes, Caltech
Jesper Sjoestroem, Brandeis and London
B. Theory
Henry D.I. Abarbanel, UCSD
Larry Abbott, Brandeis
S. Fusi, Bern
W. Gerstner, Lausanne
R. Guetig, Berlin
Leo van Hemmen, TU Munich
R. Rao, Univ. of Washington
P.D. Roberts, Oregon
S. Seung, MIT
Harel Shouval, Brown University
Haim Sompolinsky, Jerusalem
Walter Senn, Bern
Misha Tsodyks, Jerusalem
For further information, see http://diwww.epfl.ch/~gerstner/STDP/index.html
Organization committee: W. Gerstner, H. Markram, and W. Senn
best regards Wulfram
--
Wulfram Gerstner, Professor
Swiss Federal Institute of Technology Lausanne
Laboratory of Computational Neuroscience, LCN
Batiment AA-B, 1015 Lausanne EPFL
wulfram.gerstner at epfl.ch
Tel. +41-21-693 6713
Fax. +41-21-693 9600
http://diwww.epfl.ch/mantra
From todorov at Cogsci.ucsd.edu Thu Oct 23 08:45:04 2003 From: todorov at Cogsci.ucsd.edu (Emo Todorov) Date: Thu, 23 Oct 2003 05:45:04 -0700 Subject: SFN computational motor control satellite Message-ID: <003a01c39963$7b86cdf0$8306ef84@todorov> Dear Colleagues, We are pleased to announce the second SFN satellite "Advances in Computational Motor Control". The program is now finalized and attached below. More information about the symposium, including extended abstracts for all contributed presentations, can be found at www.bme.jhu.edu/acmc Yours, Emo Todorov, UCSD Reza Shadmehr, JHU
=================================
Advances in Computational Motor Control II
Symposium at the Annual Society for Neuroscience Meeting
Friday, November 7, 2003, 2:00PM - 9:30PM
Room 288-290, Morial Convention Center, New Orleans
1:00-2:00 Registration
2:05-3:35 Session 1
INVITED TALK: Stefan Schaal. USC. "A computational approach to motor control and learning with motor primitives"
James Patton, Sandro Mussa-Ivaldi, Y. Wei, M. Phillips, M. Stoykov. Northwestern University. "Exploiting sensorimotor adaptation"
Steve Massaquoi. MIT. "Stabilization of cerebrocerebellar feedback control without internal dynamic models"
Opher Donchin and Reza Shadmehr. Johns Hopkins University. "Can training change the desired trajectory?"
3:50-5:45 Session 2
Ken Ohta, Rafael Laboissiere, Mikhail Svinin. Max Planck Institute for Psychological Research and RIKEN. "Optimal trajectory of human arm reaching movements in dynamical environments"
Emmanuel Guigon, Pierre Baraduc, Michel Desmurget. INSERM France. "Constant effort computation as a determinant of motor behavior"
Emanuel Todorov. University of California San Diego. "Stochastic optimal feedback control of nonlinear biomechanical systems"
Zhaoping Li, Alex Lewis, Silvia Scarpetta. University College London and University of Salerno, Italy. "Computational understanding of the neural circuit for the central pattern generator for locomotion and its control in lamprey"
Madhusudhan Venkadesan, Francisco Valero-Cuevas, John Guckenheimer. Cornell University. "The boundary of instability as a powerful experimental paradigm for understanding complex dynamical sensorimotor behavior: dexterous manipulation as an example"
7:30-9:30 Session 3
INVITED TALK: Richard Andersen. CalTech. "Coordinate transformations for sensory guided movements"
A. Roitman, S. Pasalar, and Tim Ebner. University of Minnesota. "Models of Purkinje cell discharge during circular manual tracking in monkey"
Edward Boyden, Richard Tsien, Talal Chatila, Jennifer Raymond. Stanford University. "Is oppositely directed motor learning implemented with inverse plasticity mechanisms?"
Paul Cisek. University of Montreal. "A computational model of reach decisions in the primate cerebral cortex"
Dana Cohen and Miguel Nicolelis. Duke University. "Uncertainty reduction at the neuronal ensemble but not in single neurons during motor skill learning"
From gpatane at ai.unime.it Fri Oct 24 04:49:24 2003 From: gpatane at ai.unime.it (Giuseppe Patane') Date: Fri, 24 Oct 2003 10:49:24 +0200 Subject: Clustering/VQ applet available (ELBG algorithm) Message-ID: <3F98E794.40701@ai.unime.it> Dear Connectionists, I am pleased to invite you to take a look at the Java applet demonstrating how "The Enhanced LBG Algorithm" (ELBG), developed by me and Prof. Marco Russo, works. The paper describing ELBG appeared in "Neural Networks", vol. 14 no. 9, pp 1219--1237, November 2001. You can find it, together with the related papers, at: http://ai.unime.it/~gp/english/clustering_applet/elbg_demo_applet.htm In the same site you can find other papers of mine about clustering and vector quantization. Please note that the applet was developed using Java Swing components and it may not be supported by your browser, as the following quotation from the official Java web site reports: "You can run Swing applets in any browser that has the appropriate version of Java Plug-in installed. Another option is to use a 1.2-compliant browser. Currently, the only 1.2-compliant browser available is the Applet Viewer utility provided with the Java 2 SDK." Best Regards
--
Giuseppe Patane', Ph.D
------------------------------------------
Department of Physics
University of Messina
Contrada Papardo - Salita Sperone, 31
98166 S. Agata, Messina - ITALY
and
INFN Section of Catania
64, Via S. Sofia, I-95123 Catania - Italy
e-mail: gpatane at ai.unime.it
Fax: +39 (0)6 233245336
Home page: http://ai.unime.it/~gp/
From osporns at indiana.edu Tue Oct 28 09:25:57 2003 From: osporns at indiana.edu (Olaf Sporns) Date: Tue, 28 Oct 2003 09:25:57 -0500 Subject: Positions in robotics/cognitive science Message-ID: <3F9E7C75.1000509@indiana.edu> Indiana University, Bloomington, IN. Faculty position beginning August 2004. As part of a series of new appointments, the Cognitive Science Program at Indiana University Bloomington seeks applicants with a strong record of research and teaching using the ideas and techniques of biomorphic robotics, evolutionary robotics and artificial life. The successful applicant will take a leadership role in the planning and execution of a new, state-of-the-art laboratory for teaching and research. Applicants should send full dossiers, including letters of recommendation or names and addresses of referees. Indiana University is an equal opportunity/affirmative action employer. Applications from women and minority group members are especially encouraged. Please send materials to:
Professor Andy Clark
Search Committee
Indiana University
Cognitive Science Program
1033 E. Third St., Sycamore 0014
Bloomington, IN 47405.
Applications received by January 9, 2004 are assured full consideration.
-- Olaf Sporns, PhD Department of Psychology Neural Science Program Cognitive Science Program Indiana University Bloomington, IN 47405 http://php.indiana.edu/~osporns From tgd at cs.orst.edu Tue Oct 28 19:42:23 2003 From: tgd at cs.orst.edu (Thomas G. Dietterich) Date: Tue, 28 Oct 2003 16:42:23 -0800 Subject: Faculty Positions in Machine Learning and Artificial Intelligence Message-ID: <9018-Tue28Oct2003164223-0800-tgd@cs.orst.edu> The School of Electrical Engineering and Computer Science at Oregon State University is recruiting faculty in machine learning and related areas including vision, speech, natural language processing, robotics, and so on. Our school already includes several faculty members in these areas: * Bruce D'Ambrosio (Graphical Models) * Tom Dietterich (Machine Learning; Reinforcement Learning; Pattern Recognition) * Jon Herlocker (Intelligent Information Access; Collaborative Filtering) * Luca Lucchese (Camera Calibration) * Larry Marple (Signal Interpretation) * Eric Mortensen (Computer Vision, Image Processing) * Prasad Tadepalli (Reinforcement Learning) Full details are available at http://eecs.oregonstate.edu/faculty/03-04cs.html --Tom -- Thomas G. Dietterich, Professor Voice: 541-737-5559 School of Electrical Engineering FAX: 541-737-3014 and Computer Science URL: http://www.cs.orst.edu/~tgd Dearborn Hall 102, Oregon State University, Corvallis, OR 97331-3102 From geos at etf.ukim.edu.mk Tue Oct 28 15:49:17 2003 From: geos at etf.ukim.edu.mk (Georgi Stojanov) Date: Tue, 28 Oct 2003 21:49:17 +0100 Subject: CFP Epigenetic Robotics 2003 Message-ID: <001d01c39d94$fadc0630$cb9095c2@toshibauser> EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004 EPIROB2004 EPIROB2004 CALL FOR PAPERS EPIROB2004 EPIROB2004 Fourth International Workshop on Epigenetic Robotics: EPIROB2004 Modeling Cognitive Development in Robotic Systems EPIROB2004 EPIROB2004 http://www.epigenetic-robotics.org EPIROB2004 EPIROB2004 EPIROB2004 August 25-27, 2004 EPIROB2004 Location: LIRA-Lab, University of Genoa EPIROB2004 Genoa, Italy EPIROB2004 EPIROB2004 EPIROB2004 Submission Deadline: March 1st, 2004 EPIROB2004 EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004--EPIROB2004 This workshop focuses on combining developmental psychology, neuroscience, biology, and robotics with the goal of understanding the functioning of biological systems. Epigenetic systems, either natural or artificial, share a prolonged developmental process through which varied and complex cognitive and perceptual structures emerge as a result of the interaction of an embodied system with a physical and social environment. Epigenetic robotics includes the two-fold goal of understanding biological systems by the interdisciplinary integration between neural and engineering sciences and, simultaneously, that of enabling robots and artificial systems to develop skills for any particular environment instead of programming them for specific environments. To this aim, psychological theory and empirical evidence should be used to inform epigenetic robotic models, and these models should be used as theoretical tools to make experimental predictions in developmental psychology. We encourage the submission from different disciplines such as robotics, artificial intelligence, developmental psychology, biology or neurophysiology, as well as interdisciplinary work bridging the gap between science and engineering. 
Subject Areas include, but are not limited to:
* The role of motivations, emotions, and value systems in development;
* The development of: concepts, consciousness and self-awareness, emotion, imitation, intentionality, intersubjectivity, joint attention, learning, motivation, non-verbal and verbal communication, self, sensorimotor schemata, shared meaning and symbolic reference, social learning, social relationships, social understanding ("mind reading", "theory of mind"), value systems;
* Interaction between innate structure, ongoing developing structure, and experience;
* Related issues in algorithms, robotics, simulated robots, and embodied systems;
* Strong AI (true intelligence and autonomy) versus weak AI;
* Related issues from human and nonhuman empirical studies.
For summaries of the papers from the latest workshops see: Zlatev and Balkenius (2001), Prince (2002), and Berthouze and Prince (2003). Please send any questions to the workshop co-chairs: Giorgio Metta (pasa at dist.unige.it) and Luc Berthouze (Luc.Berthouze at aist.go.jp).
Sponsors
LIRA-Lab, University of Genoa, Italy
Communications Research Laboratory, Japan
Location
University of Genoa, Italy
Invited Speakers
Luciano Fadiga, Dept. of Biomedical Sciences, University of Ferrara, Italy
Claes von Hofsten, Dept. of Psychology, University of Uppsala, Sweden
Jürgen Konczak, Human Sensorimotor Control Lab, University of Minnesota, USA
Jacqueline Nadel, CNRS, University Pierre & Marie Curie, Paris, France
Submissions
Papers not exceeding eight (8) pages should be submitted electronically (PDF or Postscript) as attachment files to Luc Berthouze (Luc.Berthouze at aist.go.jp). Extended abstracts (maximum two pages) can also be submitted, and will be presented as posters; extended abstracts should likewise be submitted in PDF or Postscript as attachments to Luc Berthouze (Luc.Berthouze at aist.go.jp).
Further instructions to authors will be posted on the workshop web page: http://www.epigenetic-robotics.org
Important Dates
March 1st, 2004: Deadline for submission of papers and posters
April 21st, 2004: Notification of acceptance for papers and posters
May 21st, 2004: Deadline for camera-ready papers & posters
August 25-27, 2004: Workshop
Organizing Committee
Christian Balkenius (Cognitive Science, Lund University, Sweden)
Luc Berthouze (Neuroscience Research Institute, AIST, Japan)
Hideki Kozima (Communications Research Laboratory, Japan)
Giorgio Metta (LIRA-Lab, University of Genoa, Italy)
Giulio Sandini (LIRA-Lab, University of Genoa, Italy)
Georgi Stojanov (Computer Science Institute, SS Cyril and Methodius University, Macedonia)
Program Committee
Christian Balkenius (Cognitive Science, Lund University, Sweden)
Luc Berthouze (Neuroscience Research Institute, AIST, Japan)
Aude Billard (Autonomous Systems Laboratory, EPFL, Switzerland)
Daniel Bullock (Cognitive & Neural Systems Department, Boston University, USA)
Kerstin Dautenhahn (Adaptive Systems Research Group, University of Hertfordshire, UK)
Yiannis Demiris (Intelligent and Interactive Systems, Imperial College, UK)
Luciano Fadiga (University of Ferrara, Italy)
Peter Gärdenfors (Cognitive Science, Lund University, Sweden)
Philippe Gaussier (Universite de Cergy-Pontoise & ENSEA, France)
Gyorgy Gergely (Institute for Psychological Research, Hungarian Academy of Sciences, Hungary)
Frédéric Kaplan (Sony Computer Science Lab Paris, France)
Hideki Kozima (Communications Research Laboratory, Japan)
Valerie Kuhlmeier (Yale University, Department of Psychology, USA)
Max Lungarella (Neuroscience Research Institute, AIST, Japan)
Yuval Marom (Division of Informatics, University of Edinburgh, UK)
Giorgio Metta (LIRA-Lab, Genoa, Italy)
Jacqueline Nadel (CNRS, France)
Chrystopher Nehaniv (Adaptive Systems Research Group, University of Hertfordshire, UK)
Rolf Pfeifer (AI Lab, University of Zurich, Switzerland)
Christopher G. Prince (Computer Science, University of Minnesota Duluth, USA)
Deb Roy (Media Laboratory, MIT, USA)
Giulio Sandini (LIRA-Lab, Genoa, Italy)
Brian Scassellati (Department of Computer Science, Yale University, USA)
Stefan Schaal (Computer Science Department, USC, USA)
Matthew Schlesinger (Psychology Department, Southern Illinois University, USA)
Sylvain Sirois (Department of Psychology, Manchester University, UK)
Georgi Stojanov (Computer Science Institute, SS Cyril and Methodius University, Macedonia)
Gert Westermann (Department of Psychology, Oxford Brookes University, UK)
Tom Ziemke (Department of Computer Science, University of Skovde, Sweden)
Publication of Papers & Poster Abstracts
Papers and poster abstracts will be published in the proceedings, and archived at CogPrints (http://cogprints.ecs.soton.ac.uk).
REFERENCES
Zlatev, J. & Balkenius, C. (2001). Introduction: Why "epigenetic robotics"? Proceedings of the First International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems (pp. 1-4). Lund University Cognitive Studies, Volume 85. Available at: http://www.lucs.lu.se/Epigenetic-robotics/Papers/Zlatev.Balkenius.2001.pdf
Prince, C. G. (2002). Introduction: The Second International Workshop on Epigenetic Robotics. In C. G. Prince, Y. Demiris, Y. Marom, H. Kozima, & C. Balkenius (Eds.) Proceedings of the Second International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems. Lund, Sweden: Lund University Cognitive Studies Volume 94.
Available at: http://www.cprince.com/PubRes/EpigeneticRobotics2002/Prince-Intro.pdf
Weng, J., McClelland, J., Pentland, A., Sporns, O., Stockman, I., Sur, M., & Thelen, E. (2001). Autonomous mental development by robots and animals. Science, 291, 599-600. Available at: http://www.cse.msu.edu/dl/SciencePaper.pdf
Berthouze, L. and Prince, C. G. (2003). Introduction: The Third International Workshop on Epigenetic Robotics. In C. G. Prince, L. Berthouze, H. Kozima, D. Bullock, G. Stojanov, & C. Balkenius (Eds.) Proceedings of the Third International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems. Lund, Sweden: Lund University Cognitive Studies Volume 101. Available at: http://www.d.umn.edu/~cprince/epigenetic-robotics/2003/intro.pdf
From verleysen at dice.ucl.ac.be Wed Oct 29 11:33:03 2003 From: verleysen at dice.ucl.ac.be (Michel Verleysen) Date: Wed, 29 Oct 2003 17:33:03 +0100 Subject: special sessions at ESANN'2004 Message-ID: <007c01c39e3a$5342e2d0$43ed6882@dice.ucl.ac.be>
----------------------------------------------------
|                                                  |
|                   ESANN'2004                     |
|                                                  |
|             12th European Symposium              |
|          on Artificial Neural Networks           |
|                                                  |
|     Bruges (Belgium) - April 28-29-30, 2004      |
|                                                  |
|                 Special sessions                 |
----------------------------------------------------
The following message contains a summary of all special sessions that will be organized during the ESANN'2004 conference. Authors are invited to submit their contributions to one of these sessions or to a regular session, according to the guidelines found on the web pages of the conference (http://www.dice.ucl.ac.be/esann/).
List of special sessions that will be organized during the ESANN'2004 conference
=====================================================================
1. Neural methods for non-standard data
   B. Hammer, Univ. Osnabrück, B.J. Jain, Tech. Univ. Berlin (Germany)
2. Soft-computing techniques for time series forecasting
   I. Rojas, H. Pomares, Univ. Granada (Spain)
3. Neural networks for data mining
   R. Andonie, Central Washington Univ. (USA)
4. Theory and applications of neural maps
   U. Seiffert, IPK Gatersleben, T. Villmann, Univ. Leipzig, A. Wismüller, Univ. Munich (Germany)
5. Industrial applications of neural networks
   L.M. Reyneri, Politecnico di Torino (Italy)
6. Hardware systems for Neural devices
   P. Fleury, A. Bofill-i-Petit, Univ. Edinburgh (Scotland, UK)
Short descriptions
==================
Neural methods for non-standard data
------------------------------------
Organized by:
- B. Hammer, Univ. Osnabrück (Germany)
- B.J. Jain, Tech. Univ. Berlin (Germany)
In modern neural network research it is common practice to represent data as feature vectors in a Euclidean vector space. This kind of representation is convenient; due to possibly high dimensions or potential loss of structural information, however, it is limited for many relevant application areas including bioinformatics, chemistry, natural language processing, network analysis, or text mining. Alternative powerful and expressive representations for complex data structures include, for example, graphs, trees, strings, sequences, or functions. Recent neural models which directly deal with complex data structures include recursive models, kernels for structures, or functional networks, to name just a few approaches. The session will focus on neural techniques for processing of non-vectorial data.
Authors are invited to submit contributions related to the following list of topics:
- supervised and unsupervised models for complex data structures,
- coupling of symbolic and sub-symbolic systems,
- similarity measures and kernel models for non-vectorial data,
- specific preprocessing methods for complex data structures,
- incorporation of prior knowledge and invariances,
- theoretical results within this topic,
- applications e.g. in bioinformatics, chemistry, language processing,
- time series processing, graph processing.
Soft-computing techniques for time series forecasting
-----------------------------------------------------
Organized by:
- I. Rojas, Univ. Granada (Spain)
- H. Pomares, Univ. Granada (Spain)
It is obvious that forecasting activities play an important role in our daily life. A time series is a sequence of measured quantities of some physical system, taken at regular intervals of time. Time series analysis includes three important specific problems: prediction, modelling, and characterization. The goal of prediction is to accurately forecast the short-term evolution of the system, the aim of modelling is to precisely capture the features of the long-term behaviour of the system, and the purpose of system characterization is to determine some underlying fundamental properties of the system. Papers concerning these goals, using traditional statistical models (ARMA), neural networks, soft-computing techniques, fuzzy systems, evolutionary algorithms, etc., are welcome.
Neural networks for data mining
-------------------------------
Organized by:
- R. Andonie, Central Washington Univ. (USA)
Data mining is an attractive application area for neural networks. This session will focus on the specificity and limits of neural computation for data mining. The following questions will be discussed:
1. What makes the difference between data for data mining applications and data for other NN applications: huge data bases, mixed data types, uncertain data, redundant and conflicting data, etc.
2. Data mining applications may be related to Internet applications with on-line processing capability. Only a few NN models can handle such requirements. How useful are NNs in this case?
3. Data mining includes not only knowledge acquisition (rule extraction) but also decision making. This can be done by using NN models. How specific is this task, considering points 1-2?
4. Computational intelligence applications in E-commerce, customer profiling, marketing segmentation, etc.
Authors are invited to submit contributions related to neural and neuro-fuzzy techniques used in data mining. Papers discussing why/when/how neural models are appropriate for data mining applications are especially welcome.
Theory and applications of neural maps
--------------------------------------
Organized by:
- U. Seiffert, IPK Gatersleben (Germany)
- T. Villmann, Univ. Leipzig (Germany)
- A. Wismüller, Univ. Munich (Germany)
Neural maps in real biological systems can be seen as information processing systems which map complex information onto a roughly two-dimensional structure such that the statistical relations within the data are transformed into geometrical relations - called topographic mapping. Models which describe these brain properties are called neural maps. In a technical context these models are utilized as topographic vector quantizers. Famous examples are the Self-Organizing Map (SOM), the Elastic Net (EN), etc.
However, other vector quantizers that were not originally biologically motivated can also be used as topographic vector quantizers. Examples are the Neural Gas (NG), Soft Topographic Vector Quantization (STVQ), and others. Topographic vector quantizers have found a large range of applications in data mining, visualization, data processing, control and so on. In parallel, a growing number of extensions of existing algorithms as well as new approaches have been developed in recent years. In the proposed session we want to focus on new developments of topographic vector quantization and neural maps. We emphasize the theoretical background as well as interesting applications with key ideas for optimal use of the properties of neural maps. We invite researchers to provide new ideas in these topics. Possible contributions can be in any area matching this framework, with the following (non-exhaustive) list of topics:
- theory of topographic vector quantization
- estimation of probability density
- image processing
- time series prediction
- classification tasks
- pattern classification, clustering, fuzzy clustering
- blind source separation and decorrelation
- dimension and noise reduction
- evaluation of non-metric data (categorical/ordinal)
- data mining
Industrial applications of neural networks
------------------------------------------
Organized by:
- L.M. Reyneri, Politecnico di Torino (Italy)
Hardware systems for Neural devices
-----------------------------------
Organized by:
- P. Fleury, Univ. Edinburgh (Scotland, UK)
- A. Bofill-i-Petit, Univ. Edinburgh (Scotland, UK)
This special session aims to present new developments in neural hardware engineering to the neural networks community. The emphasis will be placed on the computational properties of the hardware systems or their use in specific applications, rather than on intricate details of circuit implementations. Some suggested areas of interest include (but are not restricted to) neuromorphic VLSI, interfaces between biological neurons and hardware, implementation of novel ANN algorithms and stochastic computing with nanotechnologies.
========================================================
ESANN - European Symposium on Artificial Neural Networks
http://www.dice.ucl.ac.be/esann
* For submissions of papers, reviews,...
Michel Verleysen
Univ. Cath. de Louvain - Microelectronics Laboratory
3, pl. du Levant - B-1348 Louvain-la-Neuve - Belgium
tel: +32 10 47 25 51 - fax: + 32 10 47 25 98
mailto:esann at dice.ucl.ac.be
* Conference secretariat
d-side conference services
24 av. L. Mommaerts - B-1140 Evere - Belgium
tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00
mailto:esann at dice.ucl.ac.be
========================================================
From jyoshimi at ucsd.edu Wed Oct 29 15:38:04 2003 From: jyoshimi at ucsd.edu (Jeff Yoshimi) Date: Wed, 29 Oct 2003 12:38:04 -0800 Subject: Simbrain 1.0.4 Message-ID: Connectionists: Simbrain 1.0.4 has been released. See: http://simbrain.sourceforge.net/ This is a maintenance release which fixes several bugs, including a major issue with update order. Other fixes and new features include: improved auto-zoom, improved threading, some control over input and output nodes, improved keyboard layout, new networks, and a new random-weight option. Learning rules are still local. Supervised learning should be incorporated by January, via a collaboration with Snarli ( http://snarli.sourceforge.net/ ).
A rewritten gauge component, including new projection algorithms (PCA and isomap in addition to MDS) is also forthcoming, sometime in December I hope. Feedback welcome. I'm happy to add new learning and activation rules at user request. Best, Jeff From cindy at bu.edu Fri Oct 31 10:24:57 2003 From: cindy at bu.edu (Cynthia Bradford) Date: Fri, 31 Oct 2003 10:24:57 -0500 Subject: 8th ICCNS: Call for Abstracts and Confirmed Invited Speakers Message-ID: <006b01c39fc3$249b9140$903dc580@cnspc31> Apologies if you receive more than one copy of this announcement. ***** Call for Abstracts and Confirmed Invited Speakers ***** EIGHTH INTERNATIONAL CONFERENCE ON COGNITIVE AND NEURAL SYSTEMS May 19 - 22, 2004 Boston University 677 Beacon Street Boston, Massachusetts 02215 USA http://www.cns.bu.edu/meetings/ Sponsored by Boston University's Center for Adaptive Systems and Department of Cognitive and Neural Systems with financial support from the Office of Naval Research This interdisciplinary conference is attended each year by approximately people from 30 countries around the world. As in previous years, the conference will focus on solutions to the questions: HOW DOES THE BRAIN CONTROL BEHAVIOR? HOW CAN TECHNOLOGY EMULATE BIOLOGICAL INTELLIGENCE? The conference is aimed at researchers and students of computational neuroscience, cognitive science, neural networks, neuromorphic engineering, and artificial intelligence. The conference includes tutorial and invited lectures, and contributed lectures and posters, by experts on the biology and technology of how the brain and other intelligent systems adapt to a changing world. Single-track oral and poster sessions enable all presented work to be highly visible. Three-hour poster sessions with no conflicting events will be held on two of the conference days. Posters will be up all day, and can also be viewed during breaks in the talk schedule. TUTORIAL LECTURE SERIES Stephen Grossberg (Boston University): "Linking brain to mind." See below for details. CONFIRMED INVITED AND PLENARY SPEAKERS Ehud Ahissar (Weizmann Institute of Science): "Encoding and decoding of vibrissal active touch" John Anderson (Carnegie Mellon University): "Using fMRI to track the components of a cognitive architecture" Alan D. Baddeley (University of Bristol): "In search of the episodic buffer" Moshe Bar (Massachusetts General Hospital): "Top-down facilitation of visual object recognition" Gail A. Carpenter (Boston University): "Information fusion and hierarchical knowledge discovery by ARTMAP neural networks" Stephen Goldinger (Arizona State University): "Generalization gradients in perceptual memory" Daniel Kersten (University of Minnesota): "How does human vision resolve ambiguity about objects?" Stephen M. Kosslyn (Harvard University): "The imagery debate 30 years later: Can neuroscience help resolve the issue?" Tai-Sing Lee (Carnegie Mellon University): "Inference and prediction in the visual cortex" Eve Marder (Brandeis University): "Plasticity and stability in rhythmic neuronal networks" Bartlett W. Mel (University of Southern California): "The pyramidal neuron: What sort of computing device?" Miguel Nicolelis (Duke University): "Real-time computing with neural ensembles" Jeffrey D. Schall (Vanderbilt University): "Neural selection and control of visual guided eye movements" Chantal Stern (Boston University): "Sequence? What sequence? 
fMRI studies of the medial temporal lobe in sequence learning" Mriganka Sur (Massachusetts Institute of Technology): "Plasticity and dynamics of visual cortex networks" Joseph Z. Tsien (Princeton University): "Temporal analysis of memory process" William H. Warren Jr. (Brown University): "Behavioral dynamics of locomotor path formation" Jeremy Wolfe (Harvard Medical School): "Has "preattentive vision" reached the end of the road?" LINKING BRAIN TO MIND: A Tutorial Lecture Series by Stephen Grossberg steve at bu.edu http://www.cns.bu.edu/Profiles/Grossberg In 1983, Stephen Grossberg gave a week-long series of tutorial lectures at an NSF-sponsored conference at Arizona State University. The lectures included a self-contained introduction to principles, mechanisms, and architectures whereby neural models link mind to brain and inspire neuromorphic applications to technology. Many leaders of the Connectionist Revolution which gained momentum during the mid-1980s attended the conference. In 1990-1992, three additional tutorial lecture series were given at the Wang Institute of Boston University. Since 1992, major breakthroughs have occurred in the theoretical understanding of how a brain gives rise to a mind. Models have begun to quantitatively explain and predict the neurophysiologically recorded dynamics of identified nerve cells, in anatomically verified circuits and systems, and the behaviors that they control. Because these results clarify how an intelligent system can autonomously adapt to a changing world, they have also been used to develop biologically-inspired solutions to technological problems. Several research groups have asked Professor Grossberg to give another lecture series to chart recent progress. Each morning session of the May 2004 conference will include one such tutorial lecture. The lectures will introduce concepts, principles, and mechanisms of mind/brain modeling and summaries of recent models about how brain development, learning, and information processing control perception, cognition, emotion, and action during both normal and abnormal behaviors. Brain-inspired algorithms for solving difficult technological problems will also be described. CALL FOR ABSTRACTS Session Topics: * vision * image understanding * audition * speech and language * unsupervised learning * supervised learning * reinforcement and emotion * sensory-motor control * cognition, planning, and attention * spatial mapping and navigation * object recognition * neural circuit models * neural system models * mathematics of neural systems * robotics * hybrid systems (fuzzy, evolutionary, digital) * neuromorphic VLSI * industrial applications * other Contributed abstracts must be received, in English, by January 30, 2004. Notification of acceptance will be provided by email by February 27, 2004. A meeting registration fee must accompany each Abstract. See Registration Information below for details. The fee will be returned if the Abstract is not accepted for presentation and publication in the meeting proceedings. Registration fees of accepted Abstracts will be returned on request only until April 16, 2004. Each Abstract should fit on one 8.5" x 11" white page with 1" margins on all sides in a single-spaced, single-column format with a font of 10 points or larger, printed on one side of the page only. Fax or electronic submissions will not be accepted. Abstract title, author name(s), affiliation(s), mailing, and email address(es) should begin each Abstract. 
An accompanying cover letter should include: Full title of Abstract; corresponding author and presenting author name, address, telephone, fax, and email address; requested preference for oral or poster presentation; and a first and second choice from the topics above, including whether it is biological (B) or technological (T) work [Example: first choice: vision (T); second choice: neural system models (B)]. Talks will be 15 minutes long. Posters will be up for a full day. Overhead, slide, VCR, and LCD projector facilities will be available for talks. Abstracts which do not meet these requirements or which are submitted with insufficient funds will be returned. Accepted Abstracts will be printed in the conference proceedings volume. No longer paper will be required. The original and 3 copies of each Abstract should be sent to: Cynthia Bradford, Boston University, Department of Cognitive and Neural Systems, 677 Beacon Street, Boston, Massachusetts 02215 USA. REGISTRATION INFORMATION: Early registration is recommended. To register, please fill out the registration form below. Student registrations must be accompanied by a letter of verification from a department chairperson or faculty/research advisor. If accompanied by an Abstract or if paying by check, mail to the address above. If paying by credit card, mail as above, or fax to +1 617 353 7755, or email to cindy at bu.edu. The registration fee will help to pay for a conference reception, 3 daily coffee breaks, and the meeting proceedings. STUDENT FELLOWSHIPS: Fellowships for PhD candidates and postdoctoral fellows are available to help cover meeting travel and living costs. The deadline to apply for fellowship support is January 30, 2004. Applicants will be notified by email by February 27, 2004. Each application should include the applicant's CV, including name; mailing address; email address; current student status; faculty or PhD research advisor's name, address, and email address; relevant courses and other educational data; and a list of research articles. A letter from the listed faculty or PhD advisor on official institutional stationery must accompany the application and summarize how the candidate may benefit from the meeting. Fellowship applicants who also submit an Abstract need to include the registration fee payment with their Abstract submission. Fellowship checks will be distributed after the meeting. REGISTRATION FORM Eighth International Conference on Cognitive and Neural Systems Boston University Department of Cognitive and Neural Systems 677 Beacon Street Boston, Massachusetts 02215 USA May 19-22, 2004 Fax: +1 617 353 7755 http://www.cns.bu.edu/meetings/ Mr/Ms/Dr/Prof:_____________________________________________________ Affiliation:_________________________________________________________ Address:__________________________________________________________ City, State, Postal Code:______________________________________________ Phone and Fax:_____________________________________________________ Email:____________________________________________________________ The registration fee includes the conference proceedings, a reception, and 3 coffee breaks each day. CHECK ONE: ( ) $95 Conference (Regular) ( ) $65 Conference (Student) METHOD OF PAYMENT (please fax or mail): [ ] Enclosed is a check made payable to "Boston University" Checks must be made payable in US dollars and issued by a US correspondent bank. Each registrant is responsible for any and all bank charges. 
[ ] I wish to pay by credit card (MasterCard, Visa, or Discover Card only) Name as it appears on the card:___________________________________________ Type of card: _____________________________ Expiration date:________________ Account number:_______________________________________________________ Signature:____________________________________________________________