From scheler at ICSI.Berkeley.EDU Tue May 1 12:12:18 2001 From: scheler at ICSI.Berkeley.EDU (Gabriele Scheler) Date: Tue, 1 May 2001 09:12:18 -0700 (PDT) Subject: Two papers on Tuning Curves and Neural Control Message-ID: <200105011612.JAA25734@raclette.ICSI.Berkeley.EDU> The following two papers from CNS 2000 and CNS 2001 are available from my website http://wwwbrauer.in.tum.de/~scheler/pub/control.ps http://wwwbrauer.in.tum.de/~scheler/pub/cns-paper.pdf Signal Loss with Neural Controllers Gabriele Scheler We examine the effect of neuronal plasticity on information processing in a neural feedback-control system implemented by a recurrent structure. We show that fine tuning of low-pass and high-pass filters on information flow may lead to controlled signal degradation and signal loss, which is an important function for any self-recursive system. We relate the filter function to neuromodulatory control, and discuss the biological realization of short-term and long-term plasticity effects. Dopamine modulation of prefrontal delay activity - reverberatory activity and sharpness of tuning curves Gabriele Scheler and Jean-Marc Fellous Recent electrophysiological experiments have shown that dopamine (D1) modulation of pyramidal cells in prefrontal cortex reduces spike frequency adaptation and enhances NMDA transmission. Using four models, from multicompartmental to integrate-and-fire, we examine the effects of these modulations on sustained (delay) activity in a reverberatory network. We find that D1 modulation may enable robust network bistability yielding selective reverberation among cells that code for a particular item or location. We further show that the tuning curve of such cells is sharpened, and the signal-to-noise ratio is increased. We postulate that D1 modulation affects the tuning of "memory fields" and yields efficient distributed dynamic representations. Gabriele Scheler ------------------- Dr. Gabriele Scheler ICSI 1947 Center Street Berkeley, Ca. 
94704 From a.hussain at cs.stir.ac.uk Tue May 1 10:23:38 2001 From: a.hussain at cs.stir.ac.uk (Dr. Amir Hussain) Date: Tue, 1 May 2001 15:23:38 +0100 Subject: Extended Deadline for CIS Journal Special Issue: Final Call for Papers References: <5.0.2.1.0.20010430193834.0277c990@morgana.elet.polimi.it> <007f01c0d22d$f2f86420$56acfea9@guilder> Message-ID: <002801c0d24a$51f50980$56acfea9@guilder> Dear Connectionists: As I have been asked by quite a few potential paper authors for an extension in the submission deadline, I have decided to extend the formal deadline for submission of papers to the Journal of Control & Intelligent Systems, Special Issue on "Non-linear Speech Processing Techniques & Applications" (Vol.30(1), 2002 issue) UNTIL 21 MAY 2001 !! This email announcement therefore also serves as the final call for papers for this journal special issue. Please see http://www.actapress.com/journals/specialci.htm for more details. Best wishes Dr. Amir Hussain Guest Editor - Neural Computing Research Group Department of Computing Science & Mathematics University of Stirling, Stirling FK9 4LA SCOTLAND, UK Tel / Fax: (++44) 01786 - 476437 / 464551 Email: a.hussain at cs.stir.ac.uk http://www.cs.stir.ac.uk/~ahu/ From yann at research.att.com Tue May 1 17:05:46 2001 From: yann at research.att.com (Yann LeCun) Date: Tue, 01 May 2001 17:05:46 -0400 Subject: Announcing NIPS Online Message-ID: <200105012104.RAA07062@surfcity.research.att.com> Dear Colleagues: We are pleased to announce the availability of the NIPS Online web site at http://nips.djvuzone.org NIPS Online offers free access to the full collection of NIPS Proceedings, volumes 1 to 12 (NIPS*88 to NIPS*99). High resolution scans of the articles are provided in DjVu format with full-text search capability. 
Viewer software and information about DjVu are available at http://www.djvuzone.org The NIPS Online web site was made possible by the NIPS Foundation, which funded the scanning; the original publishers, MIT Press and Morgan-Kaufmann, which graciously let us provide free access to the content; and AT&T Labs, which supported the project. -- Yann LeCun [apologies if you receive multiple copies of this message] ____________________________________________________________________ Yann LeCun Head, Image Processing Research Dept. AT&T Labs - Research tel:+1(732)420-9210 fax:(732)368-9454 200 Laurel Avenue, Room A5-4E34 yann at research.att.com Middletown, NJ 07748, USA. http://www.research.att.com/~yann From woonw at aston.ac.uk Tue May 1 17:10:14 2001 From: woonw at aston.ac.uk (Wei Lee Woon) Date: Tue, 1 May 2001 22:10:14 +0100 Subject: Lectureship available in Neural Computing at Aston University Message-ID: <008401c0d283$1ec43de0$81f4a8c0@canggih> LECTURESHIP IN NEURAL COMPUTING, COMPLEX SYSTEMS OR INFORMATION ENGINEERING/MATHEMATICS. ASTON UNIVERSITY, UK. The NCRG, as part of the Information Engineering group, is looking for a highly motivated individual to contribute to an internationally renowned research effort in the areas of neural computing, biomedical information engineering, and inference systems. Our theoretical research interests span the traditional areas of signal processing, statistical pattern processing and information mathematics. Current applications-oriented activity includes work in biomedical areas (ECG/EEG/MEG), image segmentation, time series analysis, geographic information systems, error correcting codes, cryptography and steganography. We are seeking an enthusiast who can contribute to our research directions. The new lecturer will also be able to contribute towards a research-based MSc and a novel undergraduate information mathematics programme. Details of the Group's activities are on www.maths.aston.ac.uk and www.ncrg.aston.ac.uk. 
Candidates should have excellent qualifications, a deep commitment to research and a caring and involved attitude towards students. Appointments will be for 5 years in the first instance, with the possibility of subsequent transfer to a continuing appointment. Salary scale £18,731 to £30,967 and exceptionally to £34,601 per annum. Further information is available from the Personnel Office (quoting Ref A01/69). Tel: (+44/0) 121 359 0870 (24 hour answerphone); email b.a.power at aston.ac.uk. Informal enquiries can be made to Professor David Lowe (d.lowe at aston.ac.uk). Closing date for the receipt of applications: 28 June 2001. From Nigel.Goddard at ed.ac.uk Wed May 2 08:12:56 2001 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Wed, 02 May 2001 13:12:56 +0100 Subject: Neural Coding: Call for Participation Message-ID: <3AEFF9C8.53D60004@ed.ac.uk> THE NEURAL CODE: MULTILEVEL AND COMPUTATIONAL APPROACHES a Maxwell Neuroinformatics Workshop Call for Participation May 28-June 1, 2001, Edinburgh, Scotland http://www.anc.ed.ac.uk/workshops This workshop is concerned with theoretical and empirical approaches to understanding the neural code, particularly with respect to analysis of multineuron data and theoretical approaches which can inform these analyses. To be able to address the key questions, it is necessary to bring together biologists, physicists, computer scientists and statisticians. The workshop brings together scientists with experimental, computational and theoretical approaches spanning multiple levels to provide an opportunity for interaction between methodological and phenomenological foci. 
Confirmed speakers include: Moshe Abeles Peter Dayan Peter Foldiak Andreas Herz Rob Kass Bruce McNaughton Mike Oram Stefano Panzeri Maneesh Sahani David Sterret Alessandro Treves Emma Wood Florentin Worgotter Rich Zemel The meeting is being organized in a small workshop style with emphasis on short presentations from invited speakers and from participants, round table discussions, and open debates on emerging topics. Time is scheduled for informal, self-organised, small-group activities. Computers will be available to support explorative work and demonstrations. In addition to the invited speakers, a limited number of places will be available to interested scientists, who will be chosen on the basis of the contribution they can make to the workshop. A number of places are reserved for junior faculty, postdoctoral researchers and senior graduate students who are early in a research career in the areas covered by the workshop and who could gain significantly from exposure to the workshop presentations and discussions. Travel/accommodation stipends will be available for some of these participants who do not have their own funding to participate. 
Registration is via the developing Neuroinformatics portal at http://www.neuroinf.org, and further information can be found at the workshop site: http://www.anc.ed.ac.uk/workshops From james at tardis.ed.ac.uk Wed May 2 12:18:56 2001 From: james at tardis.ed.ac.uk (James Hammerton) Date: Wed, 02 May 2001 17:18:56 +0100 Subject: CFP: Special Issue of JMLR on "Machine Learning Approaches to Shallow Parsing" Message-ID: <20010502161857.3A5ECC14A@davros.tardis.ed.ac.uk> [Please note the Reply-To field] Call for Papers: Special Issue of the Journal of Machine Learning Research -- "Machine Learning Approaches to Shallow Parsing" Editors: James Hammerton james.hammerton at ucd.ie, University College Dublin Miles Osborne osborne at cogsci.ed.ac.uk, University of Edinburgh Susan Armstrong susan.armstrong at issco.unige.ch, University of Geneva Walter Daelemans walter.daelemans at uia.ua.ac.be, University of Antwerp The Journal of Machine Learning Research invites authors to submit papers for the Special Issue on Machine Learning approaches to Shallow Parsing. Background ---------- Over the last decade there has been an increased interest in applying machine learning techniques to corpus-based natural language processing. In particular, many techniques have been applied to shallow parsing of large corpora, where rather than produce a detailed syntactic or semantic analysis of each sentence, key parts of the syntactic structure or key pieces of semantic information are identified or extracted. For example, such tasks include identifying the noun phrases in a text, extracting non-overlapping chunks of text that identify the major phrases in a sentence or extracting the subject, main verb and object from a sentence. Applications of shallow parsing include data mining from unstructured textual material (e.g. 
web pages, newswires), information extraction, question answering, automated annotation of linguistic corpora and the preprocessing of data for linguistic tasks such as machine translation or full scale parsing. Shallow parsing of realistic, naturally occurring language poses a number of challenges for a machine learning system. Firstly, the training set is usually large, which will push batch techniques to the limit. The training material is often noisy and frequently only partially determines a model (that is, only some aspects of the target model are observed). Secondly, shallow parsing requires making large numbers of decisions, which translates into learning large models. The size of such models usually results in extremely sparse counts, which makes reliable estimation difficult. In sum, learning how to do shallow parsing will tax almost any machine learning algorithm and will thus provide valuable insight into real-world performance. In a number of workshops and publications, a variety of machine learning techniques have been applied in this area, including memory based (instance based) learning, inductive logic programming, probabilistic context free grammars, maximum entropy, transformation based learning, artificial neural networks and, more recently, support vector machines. However, there has not been an opportunity to compare and contrast these techniques in a systematic manner. The special issue will thus provide a venue for drawing together the relevant ML techniques. TOPICS ------ The aim of the special issue is to solicit and publish papers that provide a clear view of the state of the art in machine learning for shallow parsing. We therefore encourage submissions in the following areas: * applications of machine learning techniques to shallow parsing tasks, including the development of new techniques. 
* comparisons of machine learning techniques for shallow parsing * analyses of the complexity of machine learning for shallow parsing tasks To facilitate cross-paper comparison and thus strengthen the special issue as a whole, authors are encouraged to consider using one of the following data sets provided via the CoNLL workshops (please note however that this is not mandatory): http://lcg-www.uia.ac.be/conll2000/chunking/ or: http://lcg-www.uia.ac.be/conll2001/clauses/ We emphasise that authors will not be judged solely in terms of raw performance, and this is not to be considered a competition: insight into the strengths and weaknesses of a given system is deemed to be more important. High quality papers reviewing machine learning for shallow parsing will also be welcome. Instructions ------------ Articles should be submitted electronically. Postscript or PDF formats are acceptable; submissions should be single column, typeset in 11 pt font, and include all author contact information on the first page. See the author instructions at www.jmlr.org for more details. To submit a paper, send the normal emails asked for by the JMLR in their author instructions to submissions at jmlr.org (NOT to the editors directly), indicating in the subject headers that the submission is intended for the Special Issue on Machine Learning Approaches to Shallow Parsing. Key dates --------- Submission deadline: 2nd September 2001 Notification of acceptance: 16th November 2001 Final drafts: 3rd February 2002 Further information ------------------- Please contact James Hammerton with any queries. From stefan.wermter at sunderland.ac.uk Wed May 2 11:49:57 2001 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Wed, 02 May 2001 16:49:57 +0100 Subject: PG Research Student Applications Message-ID: <3AF02CA5.3BAD883A@sunderland.ac.uk> With respect to this email list I would like to encourage applications from computing PhD students in intelligent systems (e.g. 
neural networks, natural language engineering, hybrid systems, cognitive neuroscience, neuro/fuzzy systems, machine learning). General application text for all areas of interest below Stefan Wermter ------------------------------------------------- PhD & MPhil Opportunities in Computing, Engineering & Technology The School of Computing, Engineering & Technology at the University of Sunderland is seeking high quality, motivated applicants wishing to gain a PhD or MPhil in the disciplines of Computing, Mathematical Sciences, Engineering and Technology. The school has a strong and growing research profile, with 6 EPSRC and more than 10 EU-funded projects and a vibrant community of over 100 researchers. The School is well-resourced and offers excellent facilities with much state-of-the-art computing equipment, and offers high quality postgraduate as well as undergraduate programmes accredited by professional societies. We look for applications in both computing and mathematics as well as general engineering. 
The main research groups in computing & mathematics are: intelligent systems (major strengths in neural networks, genetic algorithms, hybrid systems and natural language engineering: Professor Stefan Wermter - stefan.wermter at sunderland.ac.uk +44 191 5153279); human computer systems (includes themes such as multimedia, computer-aided learning, computing for the disabled and human computer interaction evaluation methodologies: Professor Gilbert Cockton - gilbert.cockton at sunderland.ac.uk +44 191 5153394); software engineering (focussed on practical areas especially software testing and the organisational risks of implementing information systems, methodologies and solutions for industry: Professor Helen Edwards helen.edwards at sunderland.ac.uk +44 191 5152786 or Professor Barrie Thompson barrie.thompson at sunderland.ac.uk +44 191 5152769); electronic commerce (encompasses the development and promotion of standards in this dynamic area with a special interest in the area of electronic procurement: Kevin Ginty - kevin.ginty at sunderland.ac.uk or Albert Bokma albert.bokma at sunderland.ac.uk +44 191 5153233); decision support systems (covers a diverse range of activities in statistics & mathematics at the boundary of Computer Science and Statistics and Operational Research: Professor Eric Fletcher eric.fletcher at sunderland.ac.uk +44 191 5152822 or Professor Alfredo Moscardini alfredo.moscardini at sunderland.ac.uk +44 191 5152763); *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From shultz at psych.mcgill.ca Wed May 2 13:36:12 2001 From: shultz at psych.mcgill.ca (Thomas R. 
Shultz) Date: Wed, 02 May 2001 13:36:12 -0400 Subject: Recent papers on knowledge and learning Message-ID: <4.3.1.0.20010502132519.00a86620@127.0.0.1> Recent papers on knowledge and learning that may be of interest to readers of this list Shultz, T. R., & Rivest, F. (2001, in press). Knowledge-based cascade-correlation: Using knowledge to speed learning. Connection Science. Research with neural networks typically ignores the role of knowledge in learning by initializing the network with random connection weights. We examine a new extension of a well-known generative algorithm, cascade-correlation. Ordinary cascade-correlation constructs its own network topology by recruiting new hidden units as needed to reduce network error. The extended algorithm, knowledge-based cascade-correlation (KBCC), recruits previously learned sub-networks as well as single hidden units. This paper describes KBCC and assesses its performance on a series of small, but clear problems involving discrimination between two classes. The target class is distributed as a simple geometric figure. Relevant source knowledge consists of various linear transformations of the target distribution. KBCC is observed to find, adapt, and use its relevant knowledge to significantly speed learning. ============= Shultz, T. R., & Rivest, F. (2000). Using knowledge to speed learning: A comparison of knowledge-based cascade-correlation and multi-task learning. Proceedings of the Seventeenth International Conference on Machine Learning (pp. 871-878). San Francisco: Morgan Kaufmann. Cognitive modeling with neural networks unrealistically ignores the role of knowledge in learning by starting from random weights. It is likely that effective use of knowledge by neural networks could significantly speed learning. A new algorithm, knowledge-based cascade-correlation (KBCC), finds and adapts its relevant knowledge in new learning. 
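Since both abstracts turn on cascade-correlation's recruitment step, a rough sketch may help. This is a minimal illustration under our own simplifying assumptions, not the authors' code: the function names, the callable-candidate pool, and the covariance score are ours. Candidates, whether fresh hidden units or previously learned sub-networks as in KBCC, are scored by how well their output covaries with the network's residual error, and the best one is recruited.

```python
import numpy as np

def correlation_score(candidate_out, residual_error):
    """Cascade-correlation-style objective: magnitude of the covariance
    between a candidate's activation and the residual error, summed
    over output units. Shapes: (n_samples,) and (n_samples, n_out)."""
    c = candidate_out - candidate_out.mean()
    e = residual_error - residual_error.mean(axis=0)
    return np.abs(c @ e).sum()

def recruit_best(candidates, inputs, residual_error):
    """Return the index of the candidate whose output best tracks the
    residual error. In KBCC the pool may mix fresh units with
    previously trained sub-networks; here each candidate is simply a
    callable mapping inputs to an activation vector."""
    scores = [correlation_score(f(inputs), residual_error) for f in candidates]
    return int(np.argmax(scores))
```

In full cascade-correlation the winning candidate's input weights are first trained to maximize this score before the unit is frozen and installed in the network; only the selection step is shown here.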
Comparison to multi-task learning (MTL) reveals that KBCC uses its knowledge more effectively to learn faster. ============= Preprints and reprints can be found at http://www.psych.mcgill.ca/perpg/fac/shultz/default.htm Cheers, Tom -------------------------------------------------------- Thomas R. Shultz, Professor, Department of Psychology, McGill University, 1205 Penfield Ave., Montreal, Quebec, Canada H3A 1B1. E-mail: shultz at psych.mcgill.ca Updated 7 April 2001: http://www.psych.mcgill.ca/perpg/fac/shultz/default.htm Phone: 514 398-6139 Fax: 514 398-4896 -------------------------------------------------------- From andre at icmc.sc.usp.br Thu May 3 09:31:24 2001 From: andre at icmc.sc.usp.br (andre) Date: Thu, 03 May 2001 10:31:24 -0300 Subject: International Journal of Computational Intelligence and Applications Message-ID: <3AF15DAC.EF5E0CF6@icmc.sc.usp.br> ========================================================= INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS http://ejournals.wspc.com.sg/ijcia/ijcia.html Vol. 1, No. 1, March 2001 Editorial Feedback Self-Organizing Map and its Application to Spatio-Temporal Pattern Classification K. Horio and T. Yamakawa Learning of Fuzzy Automata W. Pedrycz and A. Gacek Hybrid Instance-Based System for Predicting Ocean Temperatures J. M. Corchado, B. Lees and J. Aiken Modular Connectionist Modelling and Classification Approaches for Local Diagnosis in Telecommunication Traffic Management Y. Bennani and F. Bossaert Using Case Retrieval to Seed Genetic Algorithms S. Oman and P. Cunningham The Application of Feedforward Neural Networks in VLSI Fabrication Process Optimization W. Xiangdong and W. Shoujue An Enhanced Genetic Algorithm for Solving the High-Level Synthesis Problems of Scheduling, Allocation, and Binding G. W. Grewal and T. C. Wilson Calendar of Events Book Review: Words and Rules: The Ingredients of Language -- Prof. Andre Ponce de Leon F. 
de Carvalho Associate Professor Computational Intelligence Laboratory Department of Computer Science and Statistics University of Sao Paulo Sao Carlos, SP, Brazil www.icmc.sc.usp.br/~andre From giese at MIT.EDU Thu May 3 23:05:36 2001 From: giese at MIT.EDU (Martin A Giese) Date: Thu, 03 May 2001 23:05:36 -0400 Subject: research positions Message-ID: <200105040305.XAA19989@superior-temporal-sulcus.mit.edu> RESEARCH POSITIONS IN THEORETICAL NEUROSCIENCE AND COMPUTER VISION / ROBOTICS The research group for Action Representation and Learning at the Max-Planck Institute for Biological Cybernetics and the Department of Cognitive Neurology at the University Hospital in Tuebingen (Germany) offers the following research positions in theoretical neuroscience and computer vision / robotics: 1 postdoc position (BAT IIa) 2 PhD positions (BAT IIa / 2) The group investigates how complex movements and actions are represented in the brain, and how the underlying learning principles can be exploited for technical applications in computer vision, robotics and biomedical systems. One focus of the group is the development and experimental testing of models for action representation in the brain. This work includes the development of neural models and testing them in psychophysical, neurophysiological and fMRI experiments in close collaboration with well-established experimentalists in Tuebingen and the USA. The second focus is the development of technical applications of learning-based representations of actions for medical diagnosis, computer animation and movement programming in robots. Technical applications will be developed in collaboration with companies in robotics and biomedical technology and the Dept. for Neurology at the University Hospital in Tuebingen. Close collaborations exist with the Center for Biological and Computational Learning, M.I.T., Cambridge (USA), Harvard Medical School, and the Department of Biomedical Engineering, Boston University (USA). 
The postdoctoral position will be available for 3 years (salary BAT IIa), extendable to 5 years. The ideal candidate has a background in computer science, engineering, physics or mathematics and previous experience in computer vision / graphics, robotics or machine learning. She / he will be in charge of developing technical applications and new methods in machine learning for the representation of actions. Both PhD positions are available for 3 years (salary BAT IIa/2). One PhD student will focus on neural modeling of the recognition of complex movements in humans and primates. He / she will be closely involved in experiments to evaluate the theory. Ideally, this candidate has a strong interest in neuroscience, good mathematical skills and previous training in physics, mathematics, engineering, computer science or psychology. Tuebingen offers a new graduate program in neuroscience. The second PhD student will take part in the development of medical diagnosis systems and computer graphics applications exploiting methods from machine learning and computer vision. Ideally, this candidate has good mathematical and programming skills and previous training in physics, mathematics, engineering, or computer science. All positions are funded by the German Volkswagen Foundation. For further information please contact: Dr. Martin Giese Center for Biological and Computational Learning Massachusetts Institute of Technology E 25-206 45, Carleton Street Cambridge, Massachusetts 02139-4307 USA email: giese at mit.edu Tel: +001-617-253 0549 (office) +001-617-253 0551 (lab secretary) Fax: +001-617-253 2964 Applicants are asked to submit their CV, bibliography and the names of two references. Applications should be sent by email to the same address. -- ----------------------------------------------------- Dr. 
Martin Giese Center for Biological and Computational Learning Massachusetts Institute of Technology, Room E25 - 206 45, Carleton Street Cambridge, Massachusetts 02139-4307 USA email: giese at mit.edu Tel: +001-617-253 0549 (office) +001-617-253 0551 (lab secretary) +001-617-491 5538 (home) Fax: +001-617-253 2964 ---------------------------------------------------- From santini at dii.unisi.it Fri May 4 10:16:41 2001 From: santini at dii.unisi.it (Santini Fabrizio) Date: Fri, 04 May 2001 16:16:41 +0200 Subject: LFTNC 2001 Advanced Research Workshop - NATO Grants Message-ID: <3AF2B9C9.702136AD@dii.unisi.it> LFTNC 2001 NATO Advanced Research Workshop on Limitations and Future Trends in Neural Computation -------------------------------------------------------------- We are very pleased to announce that, within the framework of the ARW LFTNC 2001, NATO provides limited additional funds to support the participation of scientists from Greece, Portugal and Turkey. Please refer either to the European official site or the American mirror for further details: www.ims.unico.it/2001/lftnc www.ewh.ieee.org/soc/im/2001/lftnc -------------------------------------------------------------- Fabrizio Santini Universita' di Siena - Facolta' di Ingegneria Informatica web: http://www.dii.unisi.it/~santini -------------------------------------------------------------- From derprize at cnbc.cmu.edu Fri May 4 13:27:21 2001 From: derprize at cnbc.cmu.edu (David E. Rumelhart Prize) Date: Fri, 04 May 2001 13:27:21 -0400 Subject: First Recipient of the David E. Rumelhart Prize Announced Message-ID: <3AF2E678.9DD92390@cnbc.cmu.edu> Geoffrey E. Hinton Chosen as First Recipient of the David E. Rumelhart Prize for Contributions to the Formal Analysis of Human Cognition The Glushko-Samuelson Foundation and the Cognitive Science Society are pleased to announce that Geoffrey E. Hinton has been chosen as the first recipient of the David E. 
Rumelhart Prize for contributions to the formal analysis of human cognition. Hinton was chosen for his many important contributions to the analysis of neural networks, elucidating the nature of representation, processing, and learning in the brain. In a landmark early book with James Anderson (1), he pioneered the use of distributed representations and described how they can be used for semantic knowledge representation (2). With Terrence J. Sejnowski (3), he introduced the Boltzmann Machine, an important neural network architecture for finding globally optimal solutions to difficult constraint satisfaction problems, and with Sejnowski and Ackley (4) he proposed a learning algorithm for use in such networks. With David Rumelhart and Ronald Williams (5), he introduced the back-propagation learning algorithm and made clear how it could be used to discover useful representations capturing the underlying structure of a body of structured propositional information. He has gone on from this important early work to make many further contributions to the field of neural networks, including studies of mixtures of experts (6) and Helmholtz machines (7). His publication list includes more than 100 articles on these and a wide range of other topics. Beyond these contributions, Hinton is an outstanding mentor and advisor: 18 graduate students have earned the Ph. D. degree under his supervision. Hinton to Deliver Prize Lecture at the Edinburgh Meeting of the Cognitive Science Society in August, 2001 Geoffrey Hinton will receive the First David E. Rumelhart Prize and deliver the first Rumelhart Prize Lecture at the Annual Meeting of the Cognitive Science Society, to be held August 1-4 in Edinburgh, Scotland. The Prize itself will consist of a certificate, a citation of the awardee's contribution, and a monetary award of $100,000. Information on this year's meeting is available at http://www.hcrc.ed.ac.uk/cogsci2001/. The David E. 
Rumelhart Prize to be Awarded Annually When established in August of 2000, the David E. Rumelhart Prize was to be awarded biennially for outstanding contributions to the formal analysis of human cognition. Upon reviewing the pool of individuals nominated to receive the prize, the Glushko-Samuelson Foundation, in consultation with the Governing Board of the Cognitive Science Society, came to the conclusion that an annual prize is warranted. With the aid of the Prize Selection Committee (listed below), the foundation determined that there exists a large pool of outstanding candidates representing each of the approaches to the formal analysis of human cognition identified in the prize announcement: mathematical modeling of human cognitive processes, formal analysis of language and other products of human cognitive activity, and computational analyses of human cognition using symbolic and non-symbolic frameworks. Awarding the prize annually should facilitate the timely recognition of major contributions arising within each of these approaches. The recipient of the second David E. Rumelhart Prize will be announced at the Cognitive Science Society Meeting in Edinburgh, with the second prize lecture to be given at the following meeting of the society at George Mason University in July, 2002. Prize Selection Committee The membership of the prize selection committee was selected in consultation with the Distinguished Advisory Board (William Estes, Barbara Partee, and Herbert Simon). The members of the prize selection committee are Allan Collins, Bolt, Beranek and Newman and Northwestern University; Robert J. Glushko, Glushko-Samuelson Foundation; Mark Liberman, University of Pennsylvania; Anthony J. Marley, McGill University; and James L. McClelland (Chair), Carnegie Mellon. Brief Biography of Geoffrey E. Hinton Geoffrey Hinton received his BA in experimental psychology from Cambridge in 1970 and his PhD in Artificial Intelligence from Edinburgh in 1978. 
He did postdoctoral work at Sussex University and the University of California, San Diego and spent five years as a faculty member in the Computer Science department at Carnegie-Mellon University. He then moved to Toronto where he was a fellow of the Canadian Institute for Advanced Research and a Professor in the Computer Science and Psychology departments. He is a former president of the Cognitive Science Society, and he is a fellow of the Royal Society (UK), the Royal Society of Canada, and the American Association for Artificial Intelligence. In 1992 he won the ITAC/NSERC award for contributions to information technology. Hinton is currently Director of the Gatsby Computational Neuroscience Unit at University College London, where he leads an outstanding group of faculty, post-doctoral research fellows, and graduate students investigating the computational neural mechanisms of perception and action with an emphasis on learning. His current main interest is in unsupervised learning procedures for neural networks with rich sensory input. Cited Publications by Geoffrey E. Hinton (1) Hinton, G. E. and Anderson, J. A. (1981) Parallel Models of Associative Memory, Erlbaum, Hillsdale, NJ. (2) Hinton, G. E. (1981) Implementing semantic networks in parallel hardware. In Hinton, G. E. and Anderson, J. A. (Eds.), Parallel Models of Associative Memory, Erlbaum, Hillsdale, NJ. (3) Hinton, G. E. and Sejnowski, T. J. (1983) Optimal perceptual inference. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Washington DC. (4) Ackley, D. H., Hinton, G. E., and Sejnowski, T. J. (1985) A learning algorithm for Boltzmann machines. Cognitive Science, 9, 147-169. (5) Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986) Learning representations by back-propagating errors. Nature, 323, 533-536. (6) Jacobs, R., Jordan, M. I., Nowlan, S. J. and Hinton, G. E. (1991) Adaptive mixtures of local experts. Neural Computation, 3, 79-87. (7) Hinton, G. 
E., Dayan, P., Frey, B. J. and Neal, R. (1995) The wake-sleep algorithm for unsupervised neural networks. Science, 268, pp 1158-1161.

Visit the David E. Rumelhart Prize Website at: http://www.cnbc.cmu.edu/derprize

From Angelo.Arleo at dimail.epfl.ch Fri May 4 05:08:57 2001 From: Angelo.Arleo at dimail.epfl.ch (Angelo Arleo) Date: Fri, 04 May 2001 11:08:57 +0200 Subject: Preprints and Ph.D. thesis available Message-ID: <3AF271A9.4DB4C20@di.epfl.ch>

Dear Connectionists, the following documents are now available on the web:

=======================================================
A. Arleo (2000). "Spatial Learning and Navigation in Neuro-Mimetic Systems - Modeling the Rat Hippocampus", Ph.D. thesis, Dept. of Computer Science, Swiss Federal Inst. of Technology Lausanne, EPFL, Switzerland. http://diwww.epfl.ch/~arleo/PUBLICATIONS/PhD.html
=======================================================
A. Arleo and W. Gerstner (2000). "Place Cells and Spatial Navigation based on Vision, Path Integration, and Reinforcement Learning", Advances in Neural Information Processing Systems 13, MIT Press, pp. 89-95. http://diwww.epfl.ch/~arleo/PUBLICATIONS/nips00.pdf.Z
=======================================================
A. Arleo and W. Gerstner (2001). "Spatial Orientation in Navigating Agents: Modeling Head-direction Cells". Neurocomputing (to appear). http://diwww.epfl.ch/~arleo/PUBLICATIONS/NeuroComputing00.pdf.Z
=======================================================

Comments and suggestions are particularly welcome. Best regards, Angelo Arleo

______________________________________________________________________ ____/ __ / ____/ / Dr. Angelo Arleo. / / / / / Lab. of Computational Neuroscience (LCN) ____/ ____/ ____/ / Swiss Federal Inst.
of Technology Lausanne / / / / CH-1015 Lausanne EPFL _____/ _/ _/ _____/ Tel/Fax: ++41 21 693 6696 / 5263 E-mail: angelo.arleo at epfl.ch Web: http://diwww.epfl.ch/~arleo ______________________________________________________________________

From jose at psychology.rutgers.edu Sat May 5 11:43:16 2001 From: jose at psychology.rutgers.edu (Stephen J. Hanson) Date: Sat, 05 May 2001 11:43:16 -0400 Subject: New paper on Distributional Properties of BOLD Susceptibility effects in the Brain Message-ID: <3AF41F94.2020406@kreizler.rutgers.edu>

New paper available: "The Distribution of BOLD Susceptibility effects in the Brain is Non-Gaussian", S. J. Hanson & B. Martin Bly, to appear in NeuroReport (July, 2001).

Abstract: A key assumption underlying fMRI analysis in the General Linear Model is that the underlying distribution of BOLD susceptibility is Gaussian. Analysis of several common data sets and experimental paradigms shows that the underlying distribution is NON-Gaussian. Further identification shows that the distribution is most likely GAMMA, and implications for hemodynamic modeling are discussed as well as recommendations concerning inferential testing in such "heavy-tailed" environments.

PDF--> http://psychology.rutgers.edu/Users/jose/index.html Steve also see RUMBA--> www.rumba.rutgers.edu

From J.A.Bullinaria at cs.bham.ac.uk Sun May 6 09:59:07 2001 From: J.A.Bullinaria at cs.bham.ac.uk (John A Bullinaria) Date: Sun, 6 May 2001 14:59:07 +0100 (BST) Subject: MSc in Natural Computation Message-ID:

MSc in Natural Computation
==========================
School of Computer Science The University of Birmingham Birmingham, UK

Starting in October 2001, we are offering an advanced 12-month MSc programme in Natural Computation (i.e. computational systems that use ideas and inspirations from natural biological, ecological and physical systems).
This will comprise six taught modules in Neural Computation, Evolutionary Computation, Molecular and Quantum Computation, Nature Inspired Optimisation, Nature Inspired Learning, and Nature Inspired Design (10 credits each); two mini research projects (30 credits each); and one full-scale research project (60 credits). The programme is supported by the EPSRC through its Master's Level Training Packages, and by a number of leading companies including BT, Unilever, Nortel Networks, Thames Water, Pro Enviro, SPSS, GPU Power Distribution, and aQtive.

The School of Computer Science at the University of Birmingham has a strong research group in evolutionary and neural computation, with five members of academic staff (faculty) and two research fellows currently specialising in these fields:

Dr. John Bullinaria (Neural Networks, Evolutionary Computation, Cog.Sci.)
Dr. Jun He (Evolutionary Computation)
Dr. Julian Miller (Evolutionary Computation, Machine Learning)
Dr. Riccardo Poli (Evolutionary Computation, GP, Computer Vision, NNs, AI)
Dr. Jon Rowe (Evolutionary Computation, AI)
Dr. Thorsten Schnier (Evolutionary Computation, Engineering Design)
Prof. Xin Yao (Evolutionary Computation, NNs, Machine Learning, Optimisation)

Other staff members also working in these areas include Prof. Aaron Sloman (evolvable architectures of mind, co-evolution, interacting niches) and Dr. Jeremy Wyatt (evolutionary robotics, classifier systems). The programme is open to candidates with a very good honours degree or equivalent qualifications in Computer Science/Engineering or closely related areas. Six fully funded EPSRC studentships (covering fees and maintenance costs) are available each year, and additional financial support from our industrial partners may be available during the main project period.
Further details about this programme and funding opportunities are available from our Web-site at: http://www.cs.bham.ac.uk/natcomp Please note that the closing date for applications is 15th July 2001.

From vlassis at science.uva.nl Mon May 7 10:42:40 2001 From: vlassis at science.uva.nl (Nikos Vlassis) Date: Mon, 07 May 2001 16:42:40 +0200 Subject: some papers Message-ID: <3AF6B460.86DB8DB6@wins.uva.nl>

Dear Connectionists, The following three papers have been accepted for publication and might be of interest.

N. Vlassis, Y. Motomura, Ben Krose
Supervised Dimension Reduction of Intrinsically Low-dimensional Data
Neural Computation (to appear)
ftp://ftp.science.uva.nl/pub/computer-systems/aut-sys/reports/Vlassis01nc.ps.gz

Abstract: High-dimensional data generated by a system with limited degrees of freedom are often constrained to low-dimensional manifolds in the original space. In this paper we investigate dimension reduction methods for such intrinsically low-dimensional data through linear projections that preserve the manifold structure of the data. For intrinsically one-dimensional data this implies projecting to a curve on the plane with as few intersections as possible. We propose a supervised projection pursuit method which can be regarded as an extension of the single-index model for nonparametric regression. We show results from a toy problem and two robotic applications.

Keywords: dimension reduction, feature extraction, intrinsic dimensionality, projection pursuit, simple curve, single-index model, multiple-index model, appearance modeling, mobile robot localization.

----

N. Vlassis and Y. Motomura
Efficient Source Adaptivity in Independent Component Analysis
IEEE Trans. Neural Networks (to appear)
ftp://ftp.science.uva.nl/pub/computer-systems/aut-sys/reports/Vlassis01tnn.ps.gz

Abstract: A basic element in most ICA algorithms is the choice of a model for the score functions of the unknown sources.
While this is usually based on approximations, for large data sets it is possible to achieve `source adaptivity' by directly estimating from the data the `true' score functions of the sources. In this paper we describe an efficient scheme for achieving this by extending the fast density estimation method of Silverman (1982). We show with a real and a synthetic experiment that our method can provide more accurate solutions than state-of-the-art methods when optimization is carried out in the vicinity of the global minimum of the contrast function. Keywords: Independent component analysis, blind signal separation, source adaptivity, score function estimation. ---- N. Vlassis, A. Likas A greedy EM algorithm for Gaussian mixture learning Neural Processing Letters (to appear) ftp://ftp.science.uva.nl/pub/computer-systems/aut-sys/reports/Vlassis01npl.ps.gz Abstract: Learning a Gaussian mixture with a local algorithm like EM can be difficult because (i) the true number of mixing components is usually unknown, (ii) there is no generally accepted method for parameter initialization, and (iii) the algorithm can get stuck in one of the many local maxima of the likelihood function. In this paper we propose a greedy algorithm for learning a Gaussian mixture which tries to overcome these limitations. In particular, starting with a single component and adding components sequentially until a maximum number k, the algorithm is capable of achieving solutions superior to EM with k components in terms of the likelihood of a test set. The algorithm is based on recent theoretical results on incremental mixture density estimation, and uses a combination of global and local search each time a new component is added to the mixture. 
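The greedy scheme described in the last abstract can be illustrated in a few lines. The one-dimensional sketch below is not the authors' implementation (which builds on their incremental density-estimation results and a more careful combination of global and local search); it only shows the shape of the idea: fit a single Gaussian, then repeatedly insert a trial component at a few candidate locations (here, crudely, the data minimum, maximum, and mean — an illustrative choice, not the paper's), refine the whole mixture with EM, and keep the candidate with the highest log-likelihood.

```python
# Illustrative 1-D sketch of greedy Gaussian-mixture learning (not the
# authors' code).  Components are (weight, mean, variance) triples.
import math
import random

def gauss_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def log_likelihood(data, comps):
    return sum(math.log(max(sum(w * gauss_pdf(x, mu, var)
                                for w, mu, var in comps), 1e-300))
               for x in data)

def em(data, comps, iters=50):
    n = len(data)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            ps = [w * gauss_pdf(x, mu, var) for w, mu, var in comps]
            s = sum(ps) or 1e-300
            resp.append([p / s for p in ps])
        # M-step: re-estimate weights, means, and variances
        comps = []
        for j in range(len(resp[0])):
            nj = sum(r[j] for r in resp) or 1e-12
            mu = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mu) ** 2 for r, x in zip(resp, data)) / nj
            comps.append((nj / n, mu, max(var, 1e-6)))
    return comps

def greedy_em(data, k_max):
    mean = sum(data) / len(data)
    var = sum((x - mean) ** 2 for x in data) / len(data)
    comps = em(data, [(1.0, mean, var)])  # start with a single component
    while len(comps) < k_max:
        # crude "global search": try inserting a component at a few locations
        best = None
        for x0 in (min(data), max(data), mean):
            cand = em(data, [(0.8 * w, mu, v) for w, mu, v in comps]
                            + [(0.2, x0, 1.0)])
            if best is None or log_likelihood(data, cand) > log_likelihood(data, best):
                best = cand
        comps = best
    return comps

random.seed(0)
data = ([random.gauss(-3, 1) for _ in range(100)]
        + [random.gauss(3, 1) for _ in range(100)])
model = greedy_em(data, k_max=2)
```

On this well-separated two-cluster sample the greedy fit recovers one component per cluster, which is the behaviour the abstract contrasts with plain EM started directly at k components.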
-- http://www.science.uva.nl/~vlassis From bbs at bbsonline.org Mon May 7 17:22:58 2001 From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor)) Date: Mon, 07 May 2001 17:22:58 -0400 Subject: BBS Call for Commentators A SENSORIMOTOR ACCOUNT OF VISION AND VISUAL CONSCIOUSNESS Message-ID: Below is the abstract of a forthcoming BBS target article [Please note that this paper was in fact accepted and archived to the web in October 2000 but the recent move of BBS to New York delayed the Call until now.] A SENSORIMOTOR ACCOUNT OF VISION AND VISUAL CONSCIOUSNESS by J. Kevin O'Regan Alva Noe http://www.bbsonline.org/Preprints/ORegan/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 8000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. 
A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below.

_____________________________________________________________

A sensorimotor account of vision and visual consciousness

J. Kevin O'Regan
Laboratoire de Psychologie Expérimentale
Centre National de la Recherche Scientifique, Université René Descartes, 92774 Boulogne Billancourt, France
oregan at ext.jussieu.fr
http://nivea.psycho.univ-paris5.fr

Alva Noe
Department of Philosophy
University of California, Santa Cruz
Santa Cruz, CA 95064
anoe at cats.ucsc.edu
http://www2.ucsc.edu/people/anoe/

KEYWORDS: Sensation, Perception, Action, Consciousness, Experience, Qualia, Sensorimotor, Vision, Change blindness

ABSTRACT: Many current neurophysiological, psychophysical and psychological approaches to vision rest on the idea that when we see, the brain produces an internal representation of the world. The activation of this internal representation is assumed to give rise to the experience of seeing. The problem with this kind of approach is that it leaves unexplained how the existence of such a detailed internal representation might produce visual consciousness. An alternative proposal is made here.
We propose that seeing is a way of acting. It is a particular way of exploring the environment. Activity in internal representations does not generate the experience of seeing. The outside world serves as its own, external, representation. The experience of seeing occurs when the organism masters what we call the governing laws of sensorimotor contingency. The advantage of this approach is that it provides a natural and principled way of accounting for visual consciousness, and for the differences in the perceived quality of sensory experience in the different sensory modalities. Several lines of empirical evidence are brought forward in support of the theory, in particular: evidence from experiments in sensorimotor adaptation, visual "filling in", visual stability despite eye movements, change blindness, sensory substitution, and color perception. http://www.bbsonline.org/Preprints/ORegan/ ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENTS *** (1) The authors of scientific articles are not paid money for their refereed research papers; they give them away. What they want is to reach all interested researchers worldwide, so as to maximize the potential research impact of their findings. Subscription/Site-License/Pay-Per-View costs are accordingly access-barriers, and hence impact-barriers for this give-away research literature. There is now a way to free the entire refereed journal literature, for everyone, everywhere, immediately, by mounting interoperable university eprint archives, and self-archiving all refereed research papers in them. 
Please see: http://www.eprints.org http://www.openarchives.org/ http://www.dlib.org/dlib/december99/12harnad.html

---------------------------------------------------------------------

(2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to self-archive all their papers in their own institution's Eprint Archives or in CogPrints, the Eprint Archive for the biobehavioral and cognitive sciences: http://cogprints.soton.ac.uk/ It is extremely simple to self-archive and will make all of our papers available to all of us everywhere, at no cost to anyone, forever. Authors of BBS papers wishing to archive their already published BBS Target Articles should submit them to the BBSPrints Archive. Information about the archiving of BBS' entire backcatalogue will be sent to you in the near future. Meantime please see: http://www.bbsonline.org/help/ and http://www.bbsonline.org/Instructions/

---------------------------------------------------------------------

(3) Call for Book Nominations for BBS Multiple Book Review

In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!).
From bbs at bbsonline.org Mon May 7 17:05:35 2001 From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor)) Date: Mon, 07 May 2001 17:05:35 -0400 Subject: BBS re BBSPrints Logins Message-ID: Dear Connectionists List User, This list regularly receives Calls for Commentators from Behavioral and Brain Sciences (BBS) journal. BBS has now changed its procedures. If you also wish to be notified personally of accepted target articles and Calls for Commentators, you can get an individual login and password at the following URL: http://www.bbsonline.org/register.html Please note however that if you have had direct email communication with BBS in the past, a user account may already be in place for you, based on your most recent sending email address and details. In this case, the registration procedure at the URL above will send you the login details for that account. You can then logon to BBSPrints from the User Login link on the http://www.bbsonline.org/ front page and alter your mailshot (Call) status from there. There is of course no charge for any of this: also, there is no need to reply directly to this email. 
Many thanks, Stevan Harnad Editor Phineas de Thornley Head Editor, Electronic Review Systems Behavioral and Brain Sciences Journals Department _/_/_/ _/_/_/ _/_/_/_/ Cambridge University Press _/ _/ _/ _/ _/ 40 West 20th Street _/ _/ _/ _/ _/ New York _/_/_/_/_ _/_/_/_/_ _/_/_/_/ NY 10011-4211 _/ _/ _/ _/ _/ UNITED STATES _/ _/ _/ _/ _/ /_/_/_/_/ /_/_/_/_/ _/_/_/_/ bbs at bbsonline.org http://bbsonline.org __ __ | | |\ | | | |\ | |_ 'Phone: +001 212 924 3900 ext.369 |__| | \| |__ | | \| |__ Fax: +001 212 645 5960 From sper at informatik.uni-osnabrueck.de Tue May 8 05:07:00 2001 From: sper at informatik.uni-osnabrueck.de (Volker Sperschneider) Date: Tue, 08 May 2001 11:07:00 +0200 Subject: faculty opening Message-ID: <3AF7B734.5859C755@informatik.uni-osnabrueck.de> The University of Osnabrueck announces a full professorship for Computational Neuroscience. Further information is available at http://www.uos.de/career_service/stellenangebote/index.cfm. From mschmitt at lmi.ruhr-uni-bochum.de Wed May 9 07:34:54 2001 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Wed, 09 May 2001 13:34:54 +0200 Subject: Preprint on Radial Basis Function Neural Networks Message-ID: <3AF92B5E.BAC2BCF3@lmi.ruhr-uni-bochum.de> Dear Colleagues, a preprint of the paper "Radial basis function neural networks have superlinear VC dimension" by Michael Schmitt, accepted for the 14th Annual Conference on Computational Learning Theory COLT'2001, is available on-line from http://www.ruhr-uni-bochum.de/lmi/mschmitt/rbfsuper.ps.gz (19 pages gzipped PostScript). Regards, Michael Schmitt ------------------------------------------------------------ TITLE: Radial basis function neural networks have superlinear VC dimension AUTHOR: Michael Schmitt ABSTRACT We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. 
As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension $\Omega(W\log k)$, where $W$ is the number of parameters and $k$ the number of nodes. This significantly improves the previously known linear bound. We also derive superlinear lower bounds for networks of discrete and continuous variants of center-surround neurons. The constants in all bounds are larger than those obtained thus far for sigmoidal neural networks with constant depth. The results have several implications with regard to the computational power and learning capabilities of neural networks with local receptive fields. In particular, they imply that the pseudo dimension and the fat-shattering dimension of these networks are superlinear as well, and they yield lower bounds even when the input dimension is fixed. The methods presented in this paper might be suitable for obtaining similar results for other kernel-based function classes.

-- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: +49 234 32-23209, Fax: +49 234 32-14465 http://www.ruhr-uni-bochum.de/lmi/mschmitt/

From ken at phy.ucsf.edu Wed May 9 14:40:11 2001 From: ken at phy.ucsf.edu (Ken Miller) Date: Wed, 9 May 2001 11:40:11 -0700 Subject: Paper available: Origins of cortical temporal tuning Message-ID: <15097.36619.21538.1324@coltrane.ucsf.edu>

A preprint of the following article is now available, from http://www.keck.ucsf.edu/~ken (click on 'publications', then on 'Models of Neuronal Integration and Circuitry') or directly from ftp://ftp.keck.ucsf.edu/pub/ken/krukowski-miller01.pdf (there is also a web supplement to the article, ftp://ftp.keck.ucsf.edu/pub/ken/krukowski-miller01-websupp.pdf)

Krukowski, A.E. and K.D. Miller (2001). ``Thalamocortical NMDA conductances and intracortical inhibition can explain cortical temporal tuning.'' Nature Neuroscience 4, 424-430.
Abstract: Cells in cerebral cortex fail to respond to fast-moving stimuli that evoke strong responses in the thalamic nuclei that provide input to cortex. The reason for this behavior has remained a mystery. We study an experimentally-motivated model of the thalamic input-recipient layer of cat primary visual cortex that we have previously shown accounts for many aspects of cortical orientation tuning. In this circuit, inhibition dominates over excitation, but temporal modulations of excitation and inhibition occur out of phase with one another, allowing excitation to transiently drive cells. We show that this circuit provides a natural explanation of cortical low-pass temporal frequency tuning, provided N-methyl-D-aspartate (NMDA) receptors are present in thalamocortical synapses in proportions measured experimentally. This suggests a new and unanticipated role for NMDA conductances in shaping the temporal response properties of cortical cells, and suggests that common cortical circuit mechanisms underlie both spatial and temporal response tuning.

Ken

Kenneth D. Miller telephone: (415) 476-8217 Associate Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444

From maneesh at gatsby.ucl.ac.uk Thu May 10 08:25:29 2001 From: maneesh at gatsby.ucl.ac.uk (Maneesh Sahani) Date: Thu, 10 May 2001 13:25:29 +0100 Subject: CNS*2001 Workshops: Call for Proposals Message-ID: <200105101225.NAA30337@crick.gatsby.ucl.ac.uk>

CALL FOR PROPOSALS CNS*2001 Workshops July 4 and 5, 2001 Pacific Grove, California

Workshops focusing on current issues in computational neuroscience will be held on July 4 and 5, 2001, as part of the CNS*2001 conference in Pacific Grove, California. Potential organizers are invited to submit proposals for specific workshop topics. Workshops may fall into one of three formats: 1. Discussion Workshops (formal or informal); 2. Tutorials; and 3.
Mini-symposia, or they may combine more than one of these formats. The goal of the workshops is to provide an informal forum within the CNS meeting for focused discussion of recent or speculative research, novel techniques, and open issues in computational neuroscience. Discussion workshops, whether formal (i.e., held in a conference room with projection and writing media) or informal (held elsewhere), should stress interactive and open discussions in preference to sequential presentations. Tutorials and mini-symposia provide a format for a focused exploration of particular issues or techniques within a more traditional presentation framework; in these cases too, adequate time should be reserved for questions and general discussion. The organizers of a workshop should endeavor to bring together as broad a range of pertinent viewpoints as possible. In addition to recruiting participants and moderating discussion, workshop organizers should be prepared to submit a short report, summarizing the presentations and discussion, to the workshop coordinator shortly after the conference. These reports will be included on the CNS*2001 web site. ------------------------- How to propose a workshop ------------------------- To propose a workshop, please submit the following information to the workshop coordinator at the address below (1) the name(s) of the organizer(s) (2) the title of the workshop (3) a description of the subject matter, indicating clearly the range of topics to be discussed (4) the format(s) of the workshop; if a discussion session, please specify whether you would like it to be held in a conference room or in a less formal setting (5) for tutorials and mini-symposia, a provisional list of speakers (6) whether the workshop is to run for one or two days Please submit proposals as early as possible by email to cns2001workshops at gatsby.ucl.ac.uk or by post to Dr. 
Maneesh Sahani Gatsby Computational Neuroscience Unit Alexandra House 17, Queen Square London, WC1N 3AR, U.K. The descriptions of accepted workshops will appear on the CNS*2001 web site as they are received. Attendees are encouraged to check this list, and to contact the organizers of any workshops in which they are interested in participating. From s.holden at cs.ucl.ac.uk Thu May 10 09:31:19 2001 From: s.holden at cs.ucl.ac.uk (Sean Holden) Date: Thu, 10 May 2001 14:31:19 +0100 Subject: MSc in Intelligent Systems Message-ID: <3AFA9827.528AC379@cs.ucl.ac.uk> MSc in Intelligent Systems -------------------------- Department of Computer Science University College London London, UK A Masters Training Package funded by the Engineering and Physical Sciences Research Council (EPSRC). This new, 12 month advanced Masters programme covering all aspects of Intelligent Systems has extensive industrial involvement and is available to applicants having a good degree in Computer Science or a similar subject and/or appropriate industrial experience. Applicants will be expected to have completed final year courses in, or to have experience of, for example, neural networks, expert systems or artificial intelligence. The programme is available by full-time study for one year, or by part-time study for two years (day-release). A number of studentships are available for suitably qualified applicants. 
Courses, in addition to a substantial project, are planned to include:
- Supervised Learning
- Unsupervised Learning
- Advanced Artificial Intelligence
- Pattern Recognition & Machine Vision
- Programming and Management Issues
- Fundamental skills in Mathematical Methods and Statistics
- Evolutionary Systems
- Intelligent Text Handling
- Advanced Database and Information Systems
- Intelligent Systems in Business and Commerce
- Intelligent Systems in Bioinformatics

The Department of Computer Science at UCL has an excellent research group in Intelligent Systems, and the new programme has the active involvement of many leading researchers. For further information and application forms please contact the Admissions and General inquiries Office/Friends' Room, University College London (UCL), Gower Street, London WC1E 6BT, United Kingdom. Tel: +44 (0)207 679 3000 Fax: (0)207 679 3001 e-mail: degree-info at ucl.ac.uk. Alternatively, consult our web site: http://www.cs.ucl.ac.uk/teaching/MTPIS

From dirk at bioss.ac.uk Thu May 10 08:10:56 2001 From: dirk at bioss.ac.uk (Dirk Husmeier) Date: Thu, 10 May 2001 13:10:56 +0100 Subject: Paper on HMMs in Bioinformatics Message-ID: <3AFA8550.3F1799E7@bioss.ac.uk>

Dear Connectionists

The following paper has just been accepted for publication in JOURNAL OF COMPUTATIONAL BIOLOGY and might be of interest to researchers who apply machine learning techniques to problems in BIOINFORMATICS.

TITLE: Detection of Recombination in DNA Multiple Alignments with Hidden Markov Models
AUTHORS: Dirk Husmeier and Frank Wright
PAGES: 56
DOWNLOAD FROM: http://www.bioss.sari.ac.uk/~dirk/My_publications.html
FORMAT: PDF

SYNOPSIS The recent advent of multiple-resistant pathogens has led to an increased interest in interspecies recombination as an important, and previously underestimated, source of genetic diversification in bacteria and viruses.
The discovery of a surprisingly high frequency of mosaic RNA sequences in HIV-1 suggests that a substantial proportion of AIDS patients have been coinfected with HIV-1 strains belonging to different subtypes, and that recombination between these genomes can occur in vivo to generate new biologically active viruses. A phylogenetic analysis of the bacterial genera Neisseria and Streptococcus has revealed that the introduction of blocks of DNA from penicillin-resistant non-pathogenic strains into sensitive pathogenic strains has led to new strains that are both pathogenic and resistant. Thus interspecies recombination raises the possibility that bacteria and viruses can acquire biologically important traits through the exchange and transfer of genetic material. In the present article, a hidden Markov model (HMM) is employed to detect recombination events in multiple alignments of DNA sequences. The emission probabilities in a given state are determined by the branching order (topology) and the branch lengths of the respective phylogenetic tree, while the transition probabilities depend on the global recombination probability. The present study improves on an earlier heuristic parameter optimization scheme and shows how the branch lengths and the recombination probability can be optimized in a maximum likelihood sense by applying the expectation maximization (EM) algorithm. The novel algorithm is tested on a synthetic benchmark problem and is found to clearly outperform the earlier heuristic approach. The paper concludes with an application of this scheme to a DNA sequence alignment of the argF gene from four Neisseria strains, where a likely recombination event is clearly detected. 
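The state path of an HMM of the kind sketched in the synopsis can be decoded with standard dynamic programming. The toy example below is not the paper's algorithm (there are no phylogenetic likelihoods here, and no EM fitting of branch lengths): the per-site topology log-likelihoods are made-up numbers, and `nu` stands in for the global recombination probability that sets the transition matrix. The point it illustrates is just the mechanism: a change in the decoded topology path marks a putative recombination breakpoint.

```python
# Toy Viterbi decoding of tree topologies along an alignment (illustration
# only).  log_emis[t][s] stands in for the log-likelihood of alignment
# column t under candidate topology s; in the paper these come from
# phylogenetic likelihoods with branch lengths fitted by EM.
import math

def viterbi_topologies(log_emis, nu):
    n_states = len(log_emis[0])
    stay = math.log(1.0 - nu)                 # remain in the same topology
    switch = math.log(nu / (n_states - 1))    # recombination: change topology
    # uniform prior over topologies at the first site
    V = [math.log(1.0 / n_states) + e for e in log_emis[0]]
    back = []
    for col in log_emis[1:]:
        ptr, newV = [], []
        for s in range(n_states):
            scores = [V[r] + (stay if r == s else switch)
                      for r in range(n_states)]
            r_best = max(range(n_states), key=lambda r: scores[r])
            ptr.append(r_best)
            newV.append(scores[r_best] + col[s])
        V = newV
        back.append(ptr)
    # backtrack the most probable topology path
    path = [max(range(n_states), key=lambda s: V[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# 20 alignment columns: the first 10 favour topology 0, the rest topology 1,
# mimicking a single recombination breakpoint in the middle.
log_emis = [[0.0, -2.0, -2.0]] * 10 + [[-2.0, 0.0, -2.0]] * 10
path = viterbi_topologies(log_emis, nu=0.05)
breakpoints = [t for t in range(1, len(path)) if path[t] != path[t - 1]]
```

With a small `nu` the switch penalty keeps the path piecewise constant, so isolated noisy columns do not trigger spurious breakpoints; that trade-off is exactly what fitting the recombination probability in a maximum-likelihood sense controls.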
Best Wishes Dirk -- ---------------------------------------------- Dirk Husmeier Biomathematics and Statistics Scotland (BioSS) SCRI, Invergowrie, Dundee DD2 5DA, United Kingdom http://www.bioss.ac.uk/~dirk/ From Ulrich.Hillenbrand at dlr.de Thu May 10 11:47:21 2001 From: Ulrich.Hillenbrand at dlr.de (Ulrich Hillenbrand) Date: Thu, 10 May 2001 17:47:21 +0200 Subject: Thesis and articles on thalamocortical information processing Message-ID: <3AFAB807.104B6A85@dlr.de> Dear colleagues, you can find my doctoral thesis Spatiotemporal Adaptation in the Corticogeniculate Loop Ulrich Hillenbrand Technical University of Munich (2001) (see abstract below) for download at http://tumb1.biblio.tu-muenchen.de/publ/diss/ph/2001/hillenbrand.pdf You may also be interested in two related articles, Spatiotemporal adaptation through corticothalamic loops: A hypothesis Ulrich Hillenbrand and J. Leo van Hemmen Visual Neuroscience 17, 107-118 (2000) http://www.journals.cup.org/bin/bladerunner?REQUNIQ=989495926&REQSESS=156885&118200REQEVENT=&REQINT1=40136&REQAUTH=0 and Does Corticothalamic Feedback Control Cortical Velocity Tuning? Ulrich Hillenbrand and J. Leo van Hemmen Neural Computation 13, 327-355 (2001) http://neco.mitpress.org/cgi/content/full/13/2/327 Reprints are available upon request (by e-mail or to the address below). Please feel free to comment. Regards, Ulrich Hillenbrand ------------------------------------------------------------- Spatiotemporal Adaptation in the Corticogeniculate Loop Abstract The thalamus is the major gate to the cortex for almost all sensory signals, for input from various subcortical sources such as the cerebellum and the mammillary bodies, and for reentrant cortical information. Thalamic nuclei do not merely relay information to the cortex but perform some operation on it while being modulated by various transmitter systems and in continuous interplay with their cortical target areas. 
Indeed, cortical feedback to the thalamus is the anatomically dominant input to relay cells even in those thalamic nuclei that are directly driven by sensory systems. While it is well-established that the receptive fields of cortical neurons are strongly influenced by convergent thalamic inputs of different types, the modulation effected by cortical feedback in thalamic response has been difficult to interpret. Experiments and theoretical considerations have pointed to a variety of operations of the visual cortex on the visual thalamus, the lateral geniculate nucleus (LGN), such as control of binocular disparity for stereopsis (Schmielau & Singer, 1977), attention-related gating of relay cells (Sherman & Koch, 1986), gain control of relay cells (Koch, 1987), synchronizing firing of neighboring relay cells (Sillito et al., 1994; Singer 1994), increasing visual information in relay cells' output (McClurkin et al., 1994), and switching relay cells from a detection to an analyzing mode (Godwin et al., 1996; Sherman, 1996; Sherman & Guillery, 1996). Nonetheless, the evidence for any particular function is still sparse and rather indirect to date. Clearly, detailed concepts of the interdependency of thalamic and cortical operation could greatly advance our knowledge about complex sensory, and ultimately cognitive, processing. Here we present a novel view on the corticothalamic puzzle by proposing that control of velocity tuning of visual cortical neurons may be an eminent function of corticogeniculate processing. The hypothesis is advanced by studying a model of the primary visual pathway in extensive computer simulations. At the heart of the model is a biophysical account of the electrical membrane properties of thalamic relay neurons (Huguenard & McCormick, 1992; McCormick & Huguenard, 1992) that includes 12 ionic conductances. 
Among the different effects that corticogeniculate feedback may have on relay cells, we focus on the modulation of their relay mode (between tonic and burst mode) by control of their resting membrane potential. Employing two distinct temporal-response types of geniculate relay neurons (lagged and nonlagged), we find that shifts in membrane potential affect the temporal response properties of relay cells in a way that alters the tuning of cortical cells for speed. Given the loop of information from the LGN to cortical layer 4, via a variable number of synapses to layer 6, and back to the LGN, the question arises, what are likely implications of adaptive speed tuning for visual information processing? Based on some fairly general considerations concerning the nature of motion information, we devise a simple model of the corticogeniculate loop that utilizes adaptive speed tuning for the fundamental task of segmentation of objects in motion. A detailed mathematical analysis of the model's behavior is presented. Treating visual stimulation as a stochastic process that drives the adaptation dynamics, we prove the model's object-segmentation capabilities and reveal some non-intended properties, such as oscillatory responses, that are consequences of its basic design. Several aspects of the dynamics in the loop are discussed in relation to experimental data. -- Dr. Ulrich Hillenbrand Institute of Robotics and Mechatronics German Aerospace Center/DLR Postfach 1116 82230 Wessling Germany Phone: +49-8153-28-3501 Fax: +49-8153-28-1134 From glaser at socrates.Berkeley.EDU Thu May 10 17:24:20 2001 From: glaser at socrates.Berkeley.EDU (Donald A. 
Glaser) Date: Thu, 10 May 2001 14:24:20 -0700 Subject: post-doc positions in computational neuroscience (UC Berkeley) Message-ID: POST-DOC POSITIONS IN COMPUTATIONAL NEUROSCIENCE We are developing computational models of primate visual cortex based on the properties of two-dimensional arrays of interconnected model neurons and multiple layers of such arrays. These models are designed to mimic the anatomy and functioning of visual cortex as closely as practical and to make predictions of observable phenomena via psychophysical, electrophysiological, and fMRI techniques. Experiments being planned now will involve the new Berkeley Brain Imaging Center with its 4-Tesla fMRI system in studying patterns of cortical excitation resulting from various visual stimuli. Psychophysical experiments to test our models will continue in our own laboratory. Candidates will be expected to perform some combination of analysis, refinement, and elaboration of these or new, related, computer models, and to participate in the design and implementation of psychophysical and neuroimaging tests of these models. A strong background in mathematics, physics, or computer science and a continuing interest in neuroscience are required. Experience with Matlab, Mathematica, and Linux is desirable, as we will shortly install a Linux-based Beowulf system for large computations in addition to the computers now in use. A supercomputer at the Lawrence Berkeley National Laboratory is also available for these studies. Sample Publications: 1) Motion detection and characterization by an excitable membrane: The "bow wave" model, by Donald A. Glaser, Davis Barch, Neurocomputing 26-27 (1999) 137-146 2) Characterization of activity oscillations in an excitable membrane model and their potential functionality for neuronal computations by Davis Barch, Neurocomputing 32-33 (2000) 25-35 3) Multiple matching of features in simple stereograms, by T. Kumar, Vision Res. Vol 36, No.
5 pp 675-698, (1996) 4) To be presented at CNS 2001, the Tenth Annual Computational Neuroscience Meeting at Pacific Grove, California, June 30-July 5, 2001 1) Nearby edges and lines interfere very little with width discrimination of rectangles, by T. Kumar, Ilya Khaytin, and D. A. Glaser 2) Interactions among cortical maps, by Kirill N. Shokhirev and Donald A. Glaser 3) Synaptic depression and facilitation can induce motion aftereffects in an excitable membrane model of visual motion processing, by D. Barch and D. A. Glaser 4) Slowly moving visual stimuli induce characteristic periodic activity waves in an excitable membrane model of visual motion processing, by D. Barch and D. A. Glaser Please send your CV, a brief statement of your interests, and letters of recommendation to: Donald A. Glaser PhD. Nobel Laureate in Physics Professor of Physics and of Neurobiology in the Graduate School University of California at Berkeley 41 Hill Road, Berkeley CA 94708 W 510-642-7231, F 510-841-2563 glaser at socrates.berkeley.edu From schittenkopf at ftc.co.at Fri May 11 03:42:53 2001 From: schittenkopf at ftc.co.at (Christian Schittenkopf (FTC Research)) Date: Fri, 11 May 2001 09:42:53 +0200 Subject: ICANN 2001 WORKSHOP CONTRIBUTIONS Message-ID: <000001c0d9ed$fccf8970$6bfda8c0@FTCRD02> [ Moderator's note: Thanks to Christian Schittenkopf for preparing this summary of the ICANN workshops. The CONNECTIONISTS list doesn't carry announcements for individual workshops associated with a conference where we have also carried the call for papers and call for registrations, because we were being flooded with too many of these and they are usually only of interest to conference attendees. However, we are happy to carry a summary of a conference's entire workshop program as a single posting. -- Dave Touretzky, CONNECTIONISTS moderator ] Following the regular program of the ICANN 2001 conference (Aug. 21-24), four workshops on current topics will be held on Aug. 25 in Vienna.
CONTRIBUTIONS to the WORKSHOPS listed below are highly welcome. More details on tutorials, conference and workshops can be found at http://www.ai.univie.ac.at/icann/ Christian Schittenkopf (Workshop Chair) Workshop: RECURRENT NEURAL NETWORKS AND ONLINE SEQUENCE LEARNING Organizers: Douglas Eck and Juergen Schmidhuber, IDSIA OVERVIEW: A full-day workshop. We define the topic broadly and include presentations from related areas, although the focus will remain on recurrent neural networks (RNNs). RNNs are of interest as they can implement almost arbitrary sequential behavior. They are biologically more plausible and computationally more powerful than feedforward networks, support vector machines, hidden Markov models, etc. Making RNNs learn from examples used to be difficult, though. Recent progress has overcome fundamental problems plaguing traditional RNNs - now there exist online learning RNNs that efficiently learn previously unlearnable solutions to numerous tasks, using no more than O(1) computations per weight and time step:
- Recognition of temporally extended, noisy patterns
- Recognition of regular, context-free and context-sensitive languages
- Recognition of the temporal order of widely separated events
- Extraction of information conveyed by the temporal distance between events
- Generation of precisely timed rhythms
- Stable generation of smooth periodic trajectories
- Robust storage of high-precision real numbers across extended time intervals
The workshop is intended to discuss recent advances as well as the future potential of RNNs and alternative approaches to online sequence learning. WORKSHOP FORMAT: Like all ICANN 2001 workshops, this session will take place in a particularly nice venue, a traditional Heuriger ['hoy-ri-guer] in Vienna. A "Heuriger" provides a typically Viennese setting where one can drink local wine and eat schnitzel while sitting on wooden seats at wooden tables. SPEAKERS: We might be able to add one or two additional speakers.
If you are interested in presenting, please contact Doug Eck (doug at idsia.ch) with a suggested title and abstract. Here is a *tentative* list. Marco Gori (Universita degli Studi di Siena, Italy) Steve Grossberg (Boston University, USA) Sepp Hochreiter (University of Colorado, USA) Juan Antonio Perez-Ortiz (Universidad de Alicante, Spain) Nicol Schraudolph (ETH Zurich, Switzerland) Sebino Stramaglia (Istituto Nazionale di Fisica Nucleare, Italy) Ron Sun (University of Missouri, USA) Hans Georg Zimmermann (Siemens AG, Germany) For details and abstracts see the workshop website at http://www.idsia.ch/~doug/icann/index.html For registration see the ICANN website at http://www.ai.univie.ac.at/icann/ Workshop: KERNEL & SUBSPACE METHODS FOR COMPUTER VISION Co-organizers: Ales Leonardis, Horst Bischof http://www.prip.tuwien.ac.at/~bis/kernelws/ Scope of the workshop: This half-day workshop will be held in conjunction with ICANN 2001 on August 25, 2001 in Vienna. In recent years we have witnessed vivid developments of sophisticated kernel and subspace methods in the neural network and pattern recognition communities on the one hand, and extensive use of these methods in the area of computer vision on the other. These methods seem to be especially relevant for object and scene recognition. The purpose of the workshop is to bring together scientists from the neural network (pattern recognition) and computer vision communities to analyze new developments, identify open problems, and discuss possible solutions in the area of kernel & subspace methods such as: Support Vector Machines Independent Component Analysis Principal Component Analysis Canonical Correlation Analysis, etc. for computer vision problems such as: Object Recognition Navigation and Robotics 3D Vision, etc. Contributions in the above mentioned areas are welcome. The program will consist of invited and selected contributed papers.
The papers selected for the workshop will appear in a Workshop Proceedings which will be distributed among the workshop participants. It is planned that selected papers from the workshop will be published in a journal. Important dates: Submission Deadline: 31.5.2001 Notification of Acceptance: 29.6.2001 Final Papers Due: 3.8.2001 Submission instructions: A complete paper, not longer than 12 pages including figures and references, should be submitted in the LNCS page format. The layout of final papers must adhere strictly to the guidelines set out in the Instructions for the Preparation of Camera-Ready Contributions to LNCS Proceedings. Authors are asked to follow these instructions exactly. In order to reduce the handling effort of papers, we accept only electronic submissions by ftp (either ps or pdf files). ftp ftp.prip.tuwien.ac.at [anonymous ftp, i.e.: Name: ftp Password: ] cd kernelws binary put .ext quit Workshop Registration: Registration for the Workshop can be done at the ICANN 2001 Homepage http://www.ai.univie.ac.at/icann/ Workshop: ADVANCES TOWARDS LIFE-LIKE PERCEPTION SYSTEMS Organizer: Leslie Smith The aim of the workshop is to discuss neuromorphic systems in sensory perception: mechanisms, coding schemes, scene analysis (whether visual, auditory, olfactory, or another sense), and top-down and bottom-up processing. We are particularly interested in the nature of biological relevance (and indeed, whether this is really necessary) and the sensing-perception-action loop. We seek 1-page contributions by May 31. We are considering organising publication of the workshop.
For further information, see http://www.ai.univie.ac.at/icann/txt/workshop-lps.html Leslie S Smith lss at cs.stir.ac.uk tel: +44 1786 46 7435 fax: +44 1786 46 4551 Department of Computing Science and Mathematics, University of Stirling, Stirling FK9 4LA, Scotland, UK From paolo at dsi.unifi.it Sat May 12 16:28:28 2001 From: paolo at dsi.unifi.it (Paolo Frasconi) Date: Sat, 12 May 2001 22:28:28 +0200 Subject: Call for participation: NATO ASI on AI and Bioinformatics Message-ID: The following meeting may be of interest to researchers interested in neural networks applied to computational biology. Artificial Intelligence and Heuristic Methods for Bioinformatics A NATO Advanced Studies Institute San Miniato, Italy October 1-11, 2001 www.dsi.unifi.it/ai4bio Application deadline: July 25, 2001 Artificial Intelligence and Heuristics (e.g., machine learning and data mining, pattern recognition, cluster analysis, search, knowledge representation) can provide key solutions for the new challenges posed by the progressive transformation of biology into a data-massive science. This school is targeted to scientists who want to learn about the most recent advancements in the application of intelligent systems to computational biology. Topics: Computational analysis of biological data. Artificial intelligence, machine learning, and heuristic methods, including neural and belief networks. Prediction of protein structure (secondary structure, contact maps). The working draft of the human genome. Genome annotation. Computational tools for gene regulation. Analysis of gene expression data and their applications. Computer assisted drug discovery. Knowledge discovery in biological domains. 
Lecturers: Pierre Baldi (University of California, Irvine) Soeren Brunak (CBSA, The Technical University of Denmark) Rita Casadio (CIRB, University of Bologna) Antonello Covacci (Chiron Italia) Paolo Frasconi (DSI, University of Florence) Terry Gaasterland (Rockefeller University) Dan Geiger (Technion, Israel) Mikhail Gelfand (Russian Academy of Science, Moscow) David Haussler (University of California, Santa Cruz) Nikolay A. Kolchanov (Inst. of Cytology and Genetics, Novosibirsk) Richard H. Lathrop (University of California, Irvine) Heiko Mueller (Pharmacia & Upjohn, Milano) Steve Muggleton (Imperial College, London) Burkhard Rost (Columbia University) Roberto Serra (Montecatini SpA, Ravenna) Ron Shamir (Tel Aviv University) Co-directors: Paolo Frasconi (University of Florence) Email: paolo at dsi.unifi.it www.dsi.unifi.it/~paolo Ron Shamir (Tel Aviv University) Email: rshamir at tau.ac.il www.math.tau.ac.il/~rshamir Limited grants have been made available by NATO to cover the accommodation and/or travel expenses of selected attendees. A limited number of travel awards will be made available by the National Science Foundation for U.S. citizens or permanent residents. For APPLICATION, CONTRIBUTING PAPERS, GRANTS, FEES, and further information please visit http://www.dsi.unifi.it/ai4bio From lorincz at valerie.inf.elte.hu Sun May 13 07:47:47 2001 From: lorincz at valerie.inf.elte.hu (LORINCZ Andras) Date: Sun, 13 May 2001 13:47:47 +0200 (MET DST) Subject: TR on Event Learning and Robust Policy Heuristics Message-ID: A technical report is now available, from http://people.inf.elte.hu/lorincz/NIPG-ELU-14-05-2001.ps.gz TITLE Event Learning and Robust Policy Heuristics ABSTRACT In this paper we introduce a novel form of reinforcement learning called event-learning or E-learning. In our method an event is an ordered pair of two consecutive states. We define an event-value function and derive learning rules which are guaranteed to converge to the optimal event-value function.
Combining our method with a well-known robust control method, the SDS algorithm, we introduce Robust Policy Heuristics (RPH). It is shown that RPH, a fast-adapting non-Markovian policy, is particularly useful for coarse models of the environment and for partially observed systems. As such, RPH alleviates the `curse of dimensionality' problem. Fast adaptation can be used to separate the time scales of learning the value functions of a Markovian decision-making problem and of adaptation, i.e., the utilization of a non-Markovian policy. We shall argue that (i) the definition of modules is straightforward for E-learning, (ii) E-learning extends naturally to policy switching, and (iii) E-learning promotes planning. Computer simulations of a two-link pendulum with coarse discretization and a noisy controller are shown to demonstrate the principle. Comments are more than welcome. Andras Lorincz www.inf.elte.hu/~lorincz From ted.carnevale at yale.edu Sun May 13 08:24:40 2001 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Sun, 13 May 2001 08:24:40 -0400 Subject: NEURON 2001 Summer Course Message-ID: <3AFE7D08.B58203CC@yale.edu> The registration deadline for the NEURON 2001 Summer Course is rapidly approaching (May 25), but a few seats remain available. For more information and an application form see http://www.neuron.yale.edu/neuron/sdsc2001/sdsc2001.htm --Ted From harnad at coglit.ecs.soton.ac.uk Mon May 14 15:03:23 2001 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Mon, 14 May 2001 20:03:23 +0100 (BST) Subject: BBS Call for Commentators: VISUAL CONSCIOUSNESS Message-ID: Below is the abstract of a forthcoming BBS target article [Please note that this paper was accepted and archived to the web in October 2000 but the recent move of BBS to New York delayed the Call until now.] A SENSORIMOTOR ACCOUNT OF VISION AND VISUAL CONSCIOUSNESS by J.
Kevin O'Regan Alva Noe http://www.bbsonline.org/Preprints/ORegan/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 8000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. 
_____________________________________________________________ A sensorimotor account of vision and visual consciousness J. Kevin O'Regan Laboratoire de Psychologie Expérimentale Centre National de la Recherche Scientifique, Université René Descartes, 92774 Boulogne Billancourt, France oregan at ext.jussieu.fr http://nivea.psycho.univ-paris5.fr Alva Noe Department of Philosophy University of California, Santa Cruz Santa Cruz, CA 95064 anoe at cats.ucsc.edu http://www2.ucsc.edu/people/anoe/ KEYWORDS: Sensation, Perception, Action, Consciousness, Experience, Qualia, Sensorimotor, Vision, Change blindness ABSTRACT: Many current neurophysiological, psychophysical and psychological approaches to vision rest on the idea that when we see, the brain produces an internal representation of the world. The activation of this internal representation is assumed to give rise to the experience of seeing. The problem with this kind of approach is that it leaves unexplained how the existence of such a detailed internal representation might produce visual consciousness. An alternative proposal is made here. We propose that seeing is a way of acting. It is a particular way of exploring the environment. Activity in internal representations does not generate the experience of seeing. The outside world serves as its own, external, representation. The experience of seeing occurs when the organism masters what we call the governing laws of sensorimotor contingency. The advantage of this approach is that it provides a natural and principled way of accounting for visual consciousness, and for the differences in the perceived quality of sensory experience in the different sensory modalities. Several lines of empirical evidence are brought forward in support of the theory, in particular: evidence from experiments in sensorimotor adaptation, visual "filling in", visual stability despite eye movements, change blindness, sensory substitution, and color perception.
http://www.bbsonline.org/Preprints/ORegan/ ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENTS *** (1) The authors of scientific articles are not paid money for their refereed research papers; they give them away. What they want is to reach all interested researchers worldwide, so as to maximize the potential research impact of their findings. Subscription/Site-License/Pay-Per-View costs are accordingly access-barriers, and hence impact-barriers for this give-away research literature. There is now a way to free the entire refereed journal literature, for everyone, everywhere, immediately, by mounting interoperable university eprint archives, and self-archiving all refereed research papers in them. Please see: http://www.eprints.org http://www.openarchives.org/ http://www.dlib.org/dlib/december99/12harnad.html --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to self-archive all their papers in their own institution's Eprint Archives or in CogPrints, the Eprint Archive for the biobehavioral and cognitive sciences: http://cogprints.soton.ac.uk/ It is extremely simple to self-archive and will make all of our papers available to all of us everywhere, at no cost to anyone, forever. Authors of BBS papers wishing to archive their already published BBS Target Articles should submit it to BBSPrints Archive. Information about the archiving of BBS' entire backcatalogue will be sent to you in the near future. 
Meantime please see: http://www.bbsonline.org/help/ and http://www.bbsonline.org/Instructions/ --------------------------------------------------------------------- (3) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). From john at cs.rhul.ac.uk Tue May 15 08:37:24 2001 From: john at cs.rhul.ac.uk (John Shawe-Taylor) Date: Tue, 15 May 2001 13:37:24 +0100 (BST) Subject: Applications of learning to text and images In-Reply-To: Message-ID: Research Assistant Opening in Kernel Based Methods (see also web site: www.cs.rhul.ac.uk/vacancies/RAkernels.shtml ) Department of Computer Science, Royal Holloway, University of London Three year postdoctoral appointment available immediately Royal Holloway, University of London invites applications for a research assistant position in computer science. The salary is competitive and the work is associated with a new European-funded project directed by John Shawe-Taylor. The project involves developing kernel based methods for the analysis of multi-media documents provided by Reuters, who are collaborators on the project. 
The project is financed by the EU and also involves partners in France (Xerox), Italy (Genova University, Milano University) and Israel (Hebrew University of Jerusalem). We are seeking a researcher with experience in corpus-based methods of information retrieval and document categorisation, and a strong programming background. Experience with kernel methods is desirable but not required. Salary is in the range 20,865 to 27,347 per annum inclusive of London Allowance. Please contact John Shawe-Taylor by email at jst at cs.rhul.ac.uk for more information. From ncpw7 at biols.susx.ac.uk Tue May 15 09:34:25 2001 From: ncpw7 at biols.susx.ac.uk (neural computation workshop) Date: Tue, 15 May 2001 14:34:25 +0100 Subject: Call for papers for NCPW7 (Brighton, England) Message-ID: <3B013060.997D52FB@biols.susx.ac.uk> Dear Connectionists I wish to bring people's attention to the first call for papers: The Seventh Neural Computation and Psychology Workshop (NCPW7) at Sussex University, Brighton Connectionist models of Cognition and Perception University of Sussex, Falmer, England From stephen at computing.dundee.ac.uk Tue May 15 10:38:54 2001 From: stephen at computing.dundee.ac.uk (Stephen McKenna) Date: Tue, 15 May 2001 15:38:54 +0100 Subject: Postdoc position in vision and learning Message-ID: <052c01c0dd4c$c42ed370$26222486@dyn.computing.dundee.ac.uk> The following postdoc position may be of interest. UNIVERSITY OF DUNDEE, UK, School of Engineering Department of Applied Computing POSTDOCTORAL RESEARCHER IN COMPUTER VISION (Grade RA1A : 18,731 - 23,256) Candidates are invited to apply for a 2-year Postdoctoral position in the Department of Applied Computing at the University of Dundee. The post is funded by an EPSRC project "Advanced Sensors for Supportive Environments for the Elderly". The successful candidate will conduct research in the area of computer vision-based monitoring, learning and recognition of human action within the context of this application.
The Department of Applied Computing was awarded a "5A" rating in the UK RAE. Candidates should have a PhD (or equivalent experience) in a relevant discipline (e.g. computer vision, machine learning). Informal enquiries may be made to Dr Stephen McKenna, tel: (01382) 344732; e-mail: stephen at computing.dundee.ac.uk Further details of the department and this post can be found at http://www.computing.dundee.ac.uk/projects/supportiveenvironments Applications by CV & covering letter (2 copies of each), complete with the names, addresses, telephone/fax numbers/e-mail addresses of 2 referees should be sent to Personnel Services, University of Dundee, Dundee, DD1 4HN. Further Particulars are available for this post, tel: (01382) 344015. Please quote Reference: SE/151/1. Applicants will only be contacted if invited for interview. Closing date: 7 June 2001. The University of Dundee is committed to equal opportunities and welcomes applications from all sections of the community. http://www.dundee.ac.uk/ From kegl at IRO.UMontreal.CA Tue May 15 13:25:31 2001 From: kegl at IRO.UMontreal.CA (Balazs Kegl) Date: Tue, 15 May 2001 13:25:31 -0400 Subject: Principal Curves page updated and moved Message-ID: <200105151725.f4FHPVS12431@mercure.IRO.UMontreal.CA> Dear connectionists, I updated my Principal Curves web page and moved it to http://www.iro.umontreal.ca/~kegl/research/pcurves/ Recent references are included, and a new version of the Java implementation of the Polygonal Line Algorithm [1,2] is available. The most important new features are - arbitrary-dimensional input data - loading/downloading your own data and saving the results - adjusting the parameters of the algorithm in an interactive fashion [1] B. Kégl, A. Krzyzak, T. Linder, and K. Zeger "Learning and design of principal curves" IEEE Transactions on Pattern Analysis and Machine Intelligence vol. 22, no. 3, pp. 281-297, 2000. http://www.iro.umontreal.ca/~kegl/research/publications/keglKrzyzakLinderZeger99.ps [2] B.
Kégl "Principal curves: learning, design, and applications," Ph.D. Thesis, Concordia University, Canada, 1999. http://www.iro.umontreal.ca/~kegl/research/publications/thesis.ps Comments are welcome. Balazs Kegl -------------------------------------------------------------------------------- Balázs Kégl Assistant Professor E-mail: kegl at iro.umontreal.ca Dept. of Computer Science and Op. Res. Phone: (514) 343-7401 University of Montreal Fax: (514) 343-5834 CP 6128 succ. Centre-Ville http://www.iro.umontreal.ca/~kegl/ Montreal, Canada H3C 3J7 From scott at salk.edu Wed May 16 11:16:56 2001 From: scott at salk.edu (Scott Makeig) Date: Wed, 16 May 2001 08:16:56 -0700 (PDT) Subject: Call for Papers ICA2001 Message-ID: <200105161516.f4GFGuU41667@moniz.salk.edu> CALL FOR PAPERS ICA2001 http://ica2001.org Third International Conference on Independent Component Analysis and Signal Separation San Diego, California December 9-13, 2001 Independent Component Analysis (ICA) is emerging as a new standard area of signal processing and data analysis. ICA attempts to solve the blind source separation problem in which sensor signals are unknown mixtures of unknown source signals. While there are no general analytical solutions, in the last decade researchers have proposed good approximate methods based on simple assumptions about the source statistics and using maximum likelihood, information maximization and minimization of higher-order moments. ICA theory has received attention from several research communities including machine learning, neural networks, statistical signal processing and Bayesian modeling. More recently numerous applications of ICA have appeared including applications to adaptive speech filtering, speech signal coding, biomedical signal processing, image compression, text modeling and financial data analysis. ICA2001 will feature the latest developments in the new field of blind source separation.
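The blind source separation problem described above can be demonstrated in a few lines. The following is a minimal FastICA-style sketch, not the algorithm of any particular ICA2001 contributor; the sources (a square wave and a sawtooth), the 2x2 mixing matrix, and the iteration count are all invented for illustration. It whitens the sensor signals, then iterates the fixed-point update with a tanh nonlinearity and symmetric decorrelation:

```python
import numpy as np

rng = np.random.RandomState(0)

# Two independent, non-Gaussian sources and a mixing matrix that the
# algorithm never sees.
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sign(np.sin(3 * t)),   # square wave
               t % 1.0 - 0.5])           # sawtooth
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing
X = A @ S                                # observed sensor signals

# Whitening: decorrelate the sensors and scale them to unit variance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

# Symmetric FastICA with the tanh nonlinearity:
# w+ = E[z g(w'z)] - E[g'(w'z)] w, followed by orthonormalization.
W = rng.randn(2, 2)
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                           # symmetric decorrelation
Y = W @ Z                                # recovered sources
```

The rows of Y match the original sources only up to permutation, sign, and scale, which are the well-known indeterminacies of ICA.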
The Workshop will feature internationally respected keynote speakers, poster sessions, and symposia on theory, on algorithms and on applications to a wide range of fields and data types. The Conference recreational program includes an informal banquet and a unique opening cocktail party / unmixer. This, the third international meeting in this series, is being hosted by the Institute for Neural Computation, UCSD. The previous two meetings were held in Aussois, France (December, 1999) and Helsinki, Finland (June, 2000). This year's event will be held December 9-13, 2001 immediately following the Neural Information Processing Systems (NIPS) conference in Vancouver, Canada and its post-conference workshops. PAPERS WILL BE ACCEPTED THROUGH THE WORKSHOP WEBSITE http://ica2001.org BETWEEN JUNE 1 AND JUNE 29, 2001 Organizing Committee Chair Terrence Sejnowski terry at inc.ucsd.edu Program Te-Won Lee tewon at inc.ucsd.edu Publicity Scott Makeig scott at inc.ucsd.edu Treasurer Gary Cottrell gary at inc.ucsd.edu Publication Tzyy-Ping Jung jung at inc.ucsd.edu Comm. Javier Movellan javier at inc.ucsd.edu Arrangements John Staight john at inc.ucsd.edu International Advisory Committee C. Jutten, INPG, France E. Oja, Helsinki University of Technology, Finland A. Bell, The Salk Institute, USA S. I. Amari, RIKEN, Japan Program Committee Luis Almeida Hagai Attias Jean-Francois Cardoso Andrzej Cichocki Seungjin Choi Pierre Comon Gustavo Deco Scott Douglas Richard Everson Mark Girolami Lars Kai Hansen Aapo Hyvärinen Juha Karhunen Soo-Young Lee Te-Won Lee Michael Lewicki Juan Lin Eric Moreau Noboru Murata Klaus-Robert Mueller J.-P.
Nadal Klaus Obermayer Bruno Olshausen Dinh-Tuan Pham Barak Pearlmutter Jose Principe Juergen Schmidhuber Kari Torkkola From ojensen at neuro.hut.fi Wed May 16 03:05:19 2001 From: ojensen at neuro.hut.fi (Ole Jensen) Date: Wed, 16 May 2001 10:05:19 +0300 (EET DST) Subject: papers on phase coding Message-ID: Dear colleagues, I would like to draw your attention to two papers on phase coding and information transfer between rhythmically coupled networks. The papers are available in PDF at http://boojum.hut.fi/~ojensen/ or contact me for hard copies. Ole Jensen ========================================================================= Jensen, O. (in press) Information transfer between rhythmically coupled networks: reading the hippocampal phase code. Neural Computation Brain Research Unit, Low Temperature Laboratory, Helsinki University of Technology, P.O. Box 2200, FIN-02015 Espoo, Finland There are numerous reports on rhythmic coupling between separate brain networks. It has been proposed that this rhythmic coupling indicates exchange of information. So far, few computational models have been proposed which explore this principle and its potential computational benefits. Recent results on hippocampal place cells of the rat provide new insight: it has been shown that information about space is encoded by the firing of place cells with respect to the phase of the ongoing theta rhythm. This principle is termed phase coding and suggests that upcoming locations (predicted by the hippocampus) are encoded by cells firing late in the theta cycle, whereas current location is encoded by firing early in the theta cycle. A network reading the hippocampal output must inevitably also receive an oscillatory theta input in order to decipher the phase-coded firing patterns. In this work I propose a simple physiologically plausible mechanism implemented as an oscillatory network which can decode the hippocampal output.
By changing only the phase of the theta input to the decoder, qualitatively different information is transferred: the theta phase determines whether representations of current or upcoming locations are read by the decoder. The proposed mechanism provides a computational principle for information transfer between oscillatory networks and might generalize to brain networks beyond the hippocampal region. ========================================================================== Jensen, O. and J.E. Lisman (2000) Position reconstruction from an ensemble of hippocampal place cells: contribution of theta phase coding. Journal of Neurophysiology 83:2602-2609 Department of Biology, Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts 02454 Previous analysis of the firing of individual rat hippocampal place cells has shown that their firing rate increases when they enter a place field and that their phase of firing relative to the ongoing theta oscillation (7-12 Hz) varies systematically as the rat traverses the place field, a phenomenon termed the theta phase precession. To study the relative contribution of phase-coded and rate-coded information, we reconstructed the animal's position on a linear track using spikes recorded simultaneously from 38 hippocampal neurons. Two previous studies of this kind found no evidence that phase information substantially improves reconstruction accuracy. We have found that reconstruction is improved provided epochs with large, systematic errors are first excluded. With this condition, use of both phase and rate information improves the reconstruction accuracy by >43% as compared with the use of rate information alone. Furthermore, it becomes possible to predict the rat's position on a 204-cm track with very high accuracy (error of <3 cm). The best reconstructions were obtained with more than three phase divisions per theta cycle.
These results strengthen the hypothesis that information in rat hippocampal place cells is encoded by the phase of theta at which cells fire. ============================================================================== Ole Jensen, Ph.D. Helsinki University of Technology Low Temperature Laboratory Otakaari 3A P.O. Box 2200 FIN-02015 HUT Finland Office : (+358) 9 4512951 Mobile : (+358) 405049936 Fax : (+358) 9 4512969 e-mail : ojensen at neuro.hut.fi URL : http://boojum.hut.fi/~ojensen/ From terry at salk.edu Wed May 16 16:09:58 2001 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 16 May 2001 13:09:58 -0700 (PDT) Subject: NEURAL COMPUTATION 13:6 Message-ID: <200105162009.f4GK9we16485@purkinje.salk.edu>

Neural Computation - Contents - Volume 13, Number 6 - June 1, 2001

VIEW

Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning
Randall C. O'Reilly

NOTE

Optimal Smoothing in Visual Motion Perception
Rajesh P.N. Rao, David M. Eagleman and Terrence J. Sejnowski

LETTERS

Rate Coding Versus Temporal Order Coding: What the Retinal Ganglion Cells Tell the Visual Cortex
Rufin Van Rullen and Simon J. Thorpe

The Effects of Spike Frequency Adaptation and Negative Feedback on the Synchronization of Neural Oscillators
Bard Ermentrout, Matthew Pascal and Boris Gutkin

A Unified Approach to the Study of Temporal, Correlational, and Rate Coding
Stefano Panzeri and Simon R. Schultz

Determination of Response Latency and its Application to Normalization of Cross-Correlation Measures
Stuart N. Baker and George L. Gerstein

Attractive Periodic Sets in Discrete-Time Recurrent Networks (with Emphasis on Fixed-Point Stability and Bifurcations in Two-Neuron Networks)
Peter Tino, Bill G. Horne, and C. Lee Giles

Attractor Networks for Shape Recognition
Yali Amit and Massimo Mascaro

----- ON-LINE - http://neco.mitpress.org/

SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES

                  USA    Canada*   Other Countries
Student/Retired   $60    $64.20    $108
Individual        $88    $94.16    $136
Institution       $460   $492.20   $508
* includes 7% GST

MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From cierina at vis.caltech.edu Mon May 21 13:02:07 2001 From: cierina at vis.caltech.edu (Cierina Reyes) Date: Mon, 21 May 2001 10:02:07 -0700 Subject: Announcement - Caltech Postdoctoral Position Message-ID: <5.0.2.1.2.20010521100147.00ac2b50@vis.caltech.edu> CALIFORNIA INSTITUTE OF TECHNOLOGY 2 Postdoctoral Fellowships in Neuroscience Applications are invited for 2 postdoctoral research positions, available immediately, to join a collaborative research program between the laboratories of Partha Mitra at Bell Laboratories (Murray Hill, New Jersey) and Prof. R. Andersen at the California Institute of Technology (Pasadena, California). The research project will examine the temporal correlation structure of activity within and between local and distant cortical areas in parietal and frontal areas during cognitive tasks, involving memory and planning of eye and arm movements. Multi-site recordings of single cell activity and local field potentials will be made in behaving monkeys and the data analyzed using modern statistical methods for stochastic processes and machine learning techniques. The experimentalist will be located at Caltech, and the theorist at Bell Labs. The goal of the research is to elucidate the underlying functional architecture and multi-site dynamics of neural activity in local and long-range circuits involved in working memory.
Theorist Position #400 - The successful candidate should have training in an analytical subject, preferably in theoretical physics, as well as computational experience and/or experience with statistical analysis. The candidate should be motivated to work in understanding neural systems. The position offers an opportunity to work closely with an experiment, and will be part of a strong multidisciplinary team in a rich research environment. Experimentalist Position #500 - The successful candidate should have training in electrical engineering or experimental physics either at the Bachelor's or Ph.D. level or substantial background with electrical hardware and computer programming. Experience working in a neuroscience laboratory using electrophysiological recording is desirable. The position offers the opportunity to interact directly with theorists and will provide a rich opportunity for multidisciplinary study of the nervous system. Applications should include a curriculum vitae and two letters of recommendation. This material should be sent to Ms. Cierina R. Marks, California Institute of Technology, MC 216-76, 1201 E. California Blvd., Pasadena, CA 91125. Please indicate the position number that you are applying for. Caltech is an affirmative action, equal opportunity employer. Women, minorities, veterans, and disabled persons are encouraged to apply. From steve_kemp at unc.edu Mon May 21 23:38:10 2001 From: steve_kemp at unc.edu (Steven M. Kemp) Date: Mon, 21 May 2001 23:38:10 -0400 Subject: paper available: Situational Descriptions of Behavioral Procedures Message-ID: Dear Colleagues: The following paper on evaluating neural networks and other computational models of learning against laboratory data, entitled "Situational Descriptions of Behavioral Procedures" is available at: http://www.unc.edu/~skemp/documents/situate/InSitu/KEMP-75-135.PDF This paper appears in the forthcoming issue of the Journal of the Experimental Analysis of Behavior (JEAB). 
For those preferring hard copies, they should be available in a couple of months. Contact me via email. Best regards, steve p.s. If you have any trouble accessing, reading or printing this file, just drop me a line. If the problem is that you don't have Adobe Acrobat, you can get it (for free) here: http://www.adobe.com/prodindex/acrobat/readstep.html ----------------------------------------------------------------------- Situational Descriptions of Behavioral Procedures: The In Situ Testbed Steven M. Kemp and David A. Eckerman, Journal of the Experimental Analysis of Behavior (2001), vol. 75, pp. 135-164. Abstract We demonstrate the In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement/extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. Key words: neural networks, reinforcement schedules, situated action, cumulative records, learning theory, Mechner diagrams, extinction, key peck, computer simulation, Markov decision process, POMDP. -- Steve Kemp [apologies if you receive multiple copies of this message] >>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<< Steven M.
Kemp | Department of Psychology | email: steve_kemp at unc.edu Davie Hall, CB# 3270 | University of North Carolina | Chapel Hill, NC 27599-3270 | fax: (919) 962-2537 Visit our WebSite at: http://www.unc.edu/~skemp/ >>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<< The laws of mind [are] themselves of so fluid a character as to simulate divergences from law. -- C. S. Peirce (Collected Papers, 6.101). From sami.kaski at hut.fi Tue May 22 09:22:46 2001 From: sami.kaski at hut.fi (Sami Kaski) Date: 22 May 2001 16:22:46 +0300 Subject: Papers on learning metrics Message-ID: Dear connectionists, There are papers on learning metrics available at http://www.cis.hut.fi/projects/mi/ The methods learn, based on auxiliary data, to measure distances along relevant or important local directions in a data space. The approach has connections to discriminative learning, distributional clustering, information geometry, and maximization of mutual information. So far we have incorporated the metrics into a clustering algorithm and the SOM, and applied the methods to the analysis of gene expression data, text documents, and financial statements of companies. Best regards, Samuel Kaski ----- Abstracts of two papers: (1) Samuel Kaski, Janne Sinkkonen, and Jaakko Peltonen. Bankruptcy analysis with self-organizing maps in learning metrics. IEEE Transactions on Neural Networks, 2001. Accepted for publication. We introduce a method for deriving a metric, locally based on the Fisher information matrix, into the data space. A Self-Organizing Map is computed in the new metric to explore financial statements of enterprises. The metric measures local distances in terms of changes in the distribution of an auxiliary random variable that reflects what is important in the data. In this paper the variable indicates bankruptcy within the next few years. 
The conditional density of the auxiliary variable is first estimated, and the change in the estimate resulting from local displacements in the primary data space is measured using the Fisher information matrix. When a Self-Organizing Map is computed in the new metric it still visualizes the data space in a topology-preserving fashion, but represents the (local) directions in which the probability of bankruptcy changes the most. (2) Janne Sinkkonen and Samuel Kaski. Clustering based on conditional distributions in an auxiliary space. Neural Computation, 2001. Accepted for publication. We study the problem of learning groups or categories that are local in the continuous primary space, but homogeneous by the distributions of an associated auxiliary random variable over a discrete auxiliary space. Assuming variation in the auxiliary space is meaningful, categories will emphasize similarly meaningful aspects of the primary space. From a data set consisting of pairs of primary and auxiliary items, the categories are learned by minimizing a Kullback-Leibler divergence-based distortion between (implicitly estimated) distributions of the auxiliary data, conditioned on the primary data. Still, the categories are defined in terms of the primary space. An on-line algorithm resembling the traditional Hebb-type competitive learning is introduced for learning the categories. Minimizing the distortion criterion turns out to be equivalent to maximizing the mutual information between the categories and the auxiliary data. In addition, connections to density estimation and to the distributional clustering paradigm are outlined. The method is demonstrated by clustering yeast gene expression data from DNA chips, with biological knowledge about the functional classes of the genes as the auxiliary data. 
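The learning-metrics idea above can be made concrete with a small numerical sketch (my own illustration, not the authors' code: the logistic model of the auxiliary variable and all function names here are assumptions). When p(c|x) is a logistic model sigmoid(w.x + b) of a binary auxiliary variable c, the gradient of log p(c|x) with respect to x is (c - p) w, so the Fisher information matrix reduces to p(1-p) w w^T, and local distances in the learned metric are measured as dx^T J(x) dx:

```python
import numpy as np

def fisher_metric_logistic(x, w, b):
    """Fisher information matrix J(x) of a binary auxiliary variable c
    under a logistic model p(c=1|x) = sigmoid(w.x + b).
    Since grad_x log p(c|x) = (c - p) * w and E[(c - p)^2] = p(1 - p),
    J(x) = p(1 - p) * w w^T."""
    p = 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))
    return p * (1.0 - p) * np.outer(w, w)

def local_distance(x, dx, w, b):
    """Squared local displacement length d^2 = dx^T J(x) dx in the learned metric."""
    J = fisher_metric_logistic(x, w, b)
    return float(dx @ J @ dx)

x = np.array([0.0, 0.0])
w = np.array([1.0, 0.0])
along = local_distance(x, np.array([1.0, 0.0]), w, 0.0)   # changes p(c|x)
across = local_distance(x, np.array([0.0, 1.0]), w, 0.0)  # leaves p(c|x) unchanged
```

A step along w (the direction in which the auxiliary distribution changes) has positive length, while an orthogonal step has length zero, which is the sense in which the metric emphasizes directions that are "important" for the auxiliary data.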
From swatanab at pi.titech.ac.jp Wed May 23 00:26:12 2001 From: swatanab at pi.titech.ac.jp (Sumio Watanabe) Date: Wed, 23 May 2001 13:26:12 +0900 Subject: Geometry and Statistics in NN Learning Theory Message-ID: <000701c0e340$7fdbd7a0$988a7083@titech42lg8r0u> Dear Connectionists, We are very glad to inform you that we have a special session, "Geometry and Statistics in Neural Network Learning Theory" http://watanabe-www.pi.titech.ac.jp/~swatanab/kes2001.html in the International Conference KES'2001, which will be held in Osaka and Nara in Japan, 6th - 8th, September, 2001. http://www.bton.ac.uk/kes/kes2001/ In our session, we study the statistical problem caused by non-identifiability of layered learning machines. Information : * Date: September, 8th (Saturday), 2001, 14:40-16:45. * Place: Nara New Public Hall, Nara City, Japan. * Schedule: The time for each presentation is 25 minutes. * (Remark) Before this session, Professor Amari gives an invited talk, 13:40-14:40. ********** The authors and papers: You can get these papers from the site, http://watanabe-www.pi.titech.ac.jp/~swatanab/kes2001.html (1) S. Amari, T.Ozeki, and H.Park (RIKEN BSI) "Singularities in Learning Models: Gaussian Random Field Approach." (2) K. Fukumizu (ISM) "Asymptotic Theory of Locally Conic Models and its Application to Multilayer Neural Networks." (3) K.Hagiwara (Mie Univ.) "On the training error and generalization error of neural network regression without identifiability." (4) T. Hayasaka, M.Kitahara, K.Hagiwara, N.Toda, and S.Usui (TUT) "On the Asymptotic Distribution of the Least Squares Estimators for Non-identifiable Models." (5) S. Watanabe (TIT) "Bayes and Gibbs Estimations, Empirical Processes, and Resolution of Singularities." ********** A Short Introduction: [Why Non-identifiability ?] A parametric model in statistics is called identifiable if the mapping from the parameter to the probability distribution is one-to-one.
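As a toy illustration of this definition (my own example, not drawn from the session papers): in a one-hidden-layer tanh network, permuting the hidden units, or flipping the sign of one unit's input and output weights, changes the parameters but not the computed function, so the parameter-to-function map is many-to-one:

```python
import numpy as np

def mlp(x, W, v):
    """One-hidden-layer tanh network: f(x) = sum_i v_i * tanh(W_i . x)."""
    return np.tanh(W @ x) @ v

x = np.array([0.3, -1.2])
W = np.array([[1.0, 2.0], [0.5, -1.0]])
v = np.array([0.7, -0.4])
f0 = mlp(x, W, v)

# Permuting the hidden units leaves f unchanged:
perm = [1, 0]
f1 = mlp(x, W[perm], v[perm])

# Flipping the sign of one unit's input and output weights also leaves
# f unchanged, because tanh is an odd function:
W2, v2 = W.copy(), v.copy()
W2[0], v2[0] = -W2[0], -v2[0]
f2 = mlp(x, W2, v2)
```

Three distinct parameter vectors, one function: exactly the failure of one-to-oneness that makes such models non-identifiable.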
A lot of learning machines used in information processing, such as artificial neural networks, normal mixtures, and Boltzmann machines, are not identifiable. We do not yet have a mathematical and statistical foundation on which we can research such models. [Singularities and Asymptotics ] If a non-identifiable model is redundant compared with the true distribution, then the set of true parameters is an analytic set with complex singularities, and the rank of the Fisher information matrix depends on the parameter. The behaviors of the training and generalization errors of layered learning machines are quite different from those of regular statistical models. It should be emphasized that we cannot apply the standard asymptotic methods constructed by Fisher, Cramer, and Rao to these models. Nor can we use AIC, MDL, or BIC in statistical model selection for design of artificial neural networks. [Geometry and Statistics ] The purpose of this special session is to study and discuss the geometrical and statistical methodology by which non-identifiable learning machines can be analyzed. Note that conic singularities are given by blowing-downs, and normal crossing singularities are found by blowing-ups. These algebraic geometrical methods take us to the statistical concepts, the order statistic and the empirical process. We find that a new perspective in geometry and statistics is opened. [Results which will be reported] (1) Professor Amari et al. clarify the generalization and training errors of learning models of conic singularities in both the maximum likelihood method and the Bayesian method using the Gaussian random field approach. (2) Dr. Fukumizu proves that a three-layered neural network can be understood as a locally conic model, and that the asymptotic likelihood ratio is in proportion to (log n), where n is the number of training samples. (3) Dr.
Hagiwara shows that the training and generalization errors of radial basis functions with Gaussian units are in proportion to (log n) based on the assumption that the inputs are fixed. (4) Dr. Hayasaka et al. claim that the training error of a three-layer perceptron is closely related to the expectation value of the order statistic. (5) Lastly, Dr. Watanabe studies the Bayes and Gibbs estimations for the case of statistical models with normal crossing singularities, and shows that the general case reduces to this one by the resolution theorem. We expect that mathematicians, statisticians, information scientists, and theoretical physicists will be interested in this topic. ********** Thank you very much for your interest in our special session. For questions or comments, please send an e-mail to Dr. Sumio Watanabe, P&I Lab., Tokyo Institute of Technology. E-mail: swatanab at pi.titech.ac.jp http://watanabe-www.pi.titech.ac.jp/~swatanab/index.html [Postal Mail] 4259 Nagatsuta, Midori-ku, Yokohama, 226-8503 Japan. From icann at ai.univie.ac.at Wed May 23 12:24:10 2001 From: icann at ai.univie.ac.at (ICANN 2001 conference) Date: Wed, 23 May 2001 18:24:10 +0200 Subject: ICANN 2001: Call for Participation Message-ID: <3B0BE42A.71B615F8@ai.univie.ac.at> Call for Participation ============================================================== ICANN 2001 International Conference on Artificial Neural Networks Aug. 21-25, 2001 Vienna, Austria http://www.ai.univie.ac.at/icann the annual conference of the European Neural Network Society ============================================================== Deadline for early registration fees: June 15, 2001 Invited Speakers: ================= Eric Baum, NEC Research Institute Vladimir S. Cherkassky, Univ. of Minnesota Stephen Grossberg, Boston Univ. Wolfgang Maass, Graz Univ. of Technology Kim Plunkett, Oxford Univ. Stephen Roberts, Oxford Univ. Alessandro Sperduti, Univ. of Pisa Florentin Woergoetter, Univ.
of Stirling Program (Aug 22-24): ==================== 72 oral presentations and around 100 posters on the following topics: - Data Analysis and Pattern Recognition (Algorithms, Theory, Hardware, Applications) - Support Vector Machines, Kernel Methods - Independent Component Analysis - Topographic Mapping - Time Series and Signal Processing - Agent-based Economic Modeling (special session) - Computational Neuroscience - Vision and Image Processing - Robotics and Control - Selforganization and Dynamical Systems - Connectionist Cognitive Science Tutorials (Aug 21): =================== - Bioinformatics - The Machine Learning Approach Pierre Baldi - Predictive Learning and Modelling Financial Markets Vladimir Cherkassky - Extraction of Knowledge from Data using Computational Intelligence Methods Wlodek Duch - Support Vector Machines Alex Smola - Sequential Learning of Nonlinear Models Mahesan Niranjan - Identification and Forecasting of Dynamical Systems Hans Georg Zimmermann - Neuroscience for Engineers and Computer Scientists Peter Erdi - Independent Component Analysis Aapo Hyvärinen Workshops (Aug 25): =================== - Advances in EEG Analysis B. Blankertz, A. Flexer, J. Kohlmorgen, K.R. Müller, S. Roberts, P. Sykacek - Processing Temporal Patterns with Recurrent Networks D. Eck, J. Schmidhuber - Kernel and Subspace Methods for Computer Vision A. Leonardis, H. Bischof - Advances toward Lifelike Perception Systems L. Smith Program chairs: =============== Georg Dorffner (general chair) Horst Bischof Kurt Hornik _______________________________________________________ Please see our web page for more details and for online registration.
http://www.ai.univie.ac.at/icann From ckiw at dai.ed.ac.uk Fri May 25 06:53:45 2001 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Fri, 25 May 2001 11:53:45 +0100 (BST) Subject: PhD opportunities at the University of Edinburgh, UK Message-ID: PhD opportunities at the Institute for Adaptive and Neural Computation, University of Edinburgh, UK The Institute for Adaptive and Neural Computation (ANC, http://anc.ed.ac.uk) is part of the Division of Informatics at the University of Edinburgh. The Institute fosters the study of adaptive processes in both artificial and biological systems. It encourages interdisciplinary and collaborative work involving the traditional disciplines of neuroscience, cognitive science, computer science, computational science, mathematics and statistics. Many of the information-processing tasks under study draw on a common set of principles and mathematical techniques for their solution. Combined study of the adaptive nature of artificial and biological systems facilitates the many benefits accruing from treating essentially the same problem from different perspectives. A principal theme is the study of artificial learning systems. This includes theoretical foundations (e.g. statistical theory, information theory), the development of new models and algorithms, and applications. A second principal theme is the analysis and modelling of brain processes at all levels of organization with a particular focus on theoretical developments which span levels. Within this theme, research areas are broadly defined as the study of the neural foundations of perception, cognition and action and their underlying developmental processes. A secondary theme is the construction and study of computational tools and methods which can support studies in the two principal themes, such as in the analysis of brain data, simulation of networks and parallel data mining. Currently we have PhD studentships available as from 1 October 2001. 
These are supported by the Medical Research Council (MRC) and by the Biotechnology and Biological Sciences Research Council (BBSRC). In addition, the Division of Informatics receives a number of EPSRC studentships for which students wishing to study within the Institute for Adaptive and Neural Computation will be considered. PLEASE NOTE: Full funding under these studentships is only available to persons who satisfy a UK residence requirement (see www.epsrc.ac.uk/Documents/Guides/Students/Annex1.htm for more details). Under these studentships funding of university fees (but not maintenance) is available for EU nationals. To qualify for funding candidates must also have (or expect to receive) a good honours degree (1st or upper second class) (or equivalent). APPLICATION PROCEDURE: Formal applications should be made using the University of Edinburgh Postgraduate Application Form available via http://www.informatics.ed.ac.uk/prospectus/graduate/research.html and should be sent to the Faculty of Science and Engineering office. We wish to award these studentships as soon as possible, therefore applications should be received by June 15. Informal enquiries should be made to the contacts given below. * MRC Ph.D. studentship in Neuroinformatics and Functional MRI (see http://www.anc.ed.ac.uk/CFIS/hiring/MRC-PhD2.html for more details). Includes realtime methods in functional MRI, reproducibility of functional MRI brain imaging, and Bayesian Methods for the analysis of fMRI data. Contact: Dr. Nigel Goddard, Nigel.Goddard at ed.ac.uk * BBSRC studentship in the analysis of DNA microarray data (see http://www.bioss.ac.uk/student/newphdcag3.html for more details) Issues include: image analysis, to reduce noise and extract spot intensities; identification of differential gene expression between pairs of samples on a single microarray; exploratory graphical methods for analysing sets of arrays; and Bayesian networks to describe gene interactions. In collaboration with Dr. 
Chris Glasbey (Biomathematics & Statistics Scotland), c.glasbey at bioss.ac.uk. Applicants should have, or shortly expect to obtain, a first or upper second class degree in mathematics, statistics, physics, informatics, mathematical biology, or a related subject. Please send a CV and names of three academic referees to: Chris Glasbey Biomathematics and Statistics Scotland JCMB, King's Buildings Edinburgh EH9 3JZ, Scotland email: c.glasbey at bioss.ac.uk Tel: (44) +131 650 4899 Fax: (44) +131 650 4901 * The EPSRC studentships are not specifically targeted, and can potentially support work in all areas that ANC works in. These include: theoretical and practical issues in machine learning and probabilistic modelling (including applications areas such as astronomical data mining, analysis of proteomics data, condition monitoring of premature babies, etc.); developing computational and mathematical models for the analysis of particular neural systems, in particular (i) models for the functioning of the basal ganglia (ii) models for the growth of optic projections in three-dimensional space; study of human cognitive processes, particularly language-related, using computational modeling and/or brain imaging approaches; software architectures and computational methods for neuroscience and cognitive science, including simulation, visualisation, and databases; connectionist cognitive modelling and cognitive modelling based on large language corpora, applied to modelling normal and impaired visual word recognition and spoken language processing. Informal enquiries may be made to Fiona Jamieson, fiona at anc.ed.ac.uk From piuri at elet.polimi.it Sun May 27 12:26:31 2001 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Sun, 27 May 2001 18:26:31 +0200 Subject: NIMIA 2001 and LFTNC 2001: two great opportunities for phd students and researchers in the neural areas! do not miss them!!!
Message-ID: <5.0.2.1.0.20010527182221.02e13230@morgana.elet.polimi.it> Dear Colleague, Do not miss the opportunity to come to Italy and attend the following two international meetings! - the NATO ASI NIMIA 2001 - NATO Advanced Study Institute on Neural Networks for Instrumentation, Measurement and Related Industrial Applications, to be held on 9-20 October 2001, in Crema, Italy. - the NATO ARW LFTNC 2001 - NATO Advanced Research Workshop on Limitations and Future Trends of Neural Computation, to be held on 22-24 October 2001, in Siena, Italy. Please forward this announcement to everybody you feel could be interested in attending the meetings, especially to people working in application areas! Since the attendance has to be approved by NATO, applications to attend should be submitted by 15 JUNE 2001. Detailed information and the application forms are available at http://www.ims.unico.it/2001/ or at the mirror site at http://www.ewh.ieee.org/soc/im/2001/ You are allowed to withdraw your application at any time. Submitting earlier will give us more time to look for possible additional funding if the grants now available are not sufficient to cover all attendees. Best regards Vincenzo Piuri & Marco Gori Vincenzo Piuri University of Milan, Department of Information Technologies via Bramante 65, 26013 Crema (CR), Italy phone: +39-0373-898-242 secretary: +39-0373-898-249 fax: +39-0373-898-253 email: piuri at elet.polimi.it secondary address: Politecnico di Milano, Department of Electronics and Information piazza L.
da Vinci 32, 20133 Milano, Italy phone: +39-02-2399-3606 secretary: +39-02-2399-3623 fax: +39-02-2399-3411 email: piuri at elet.polimi.it From melchioc at csr.nih.gov Tue May 29 15:57:14 2001 From: melchioc at csr.nih.gov (Melchior, Christine (CSR)) Date: Tue, 29 May 2001 15:57:14 -0400 Subject: neuroscience job available Message-ID: SCIENTIFIC REVIEW ADMINISTRATOR (SRA) POSITION: The Center for Scientific Review (CSR), National Institutes of Health, seeks a neuroscientist with expertise in cognitive function who is interested in serving as an SRA. An SRA manages committees composed of leading scientists in their respective fields who meet to judge the scientific merit of research grant applications. Applicants must have earned the Ph.D. or M.D. (or have equivalent experience). It is crucial to have a record of independent research accomplishment, typically requiring several years beyond the doctoral degree. Salary is commensurate with experience. A recruitment or relocation bonus may be available. Submit curriculum vitae to: Christine Melchior, Ph.D., Chief, IFCN IRG, Center for Scientific Review, NIH, 6701 Rockledge Drive, Room 5176, MSC 7844, Bethesda, MD 20892-7844. E-mail: melchioc at csr.nih.gov NIH is an Equal Opportunity Employer. From giro-ci0 at wpmail.paisley.ac.uk Thu May 31 04:15:10 2001 From: giro-ci0 at wpmail.paisley.ac.uk (Mark Girolami) Date: Thu, 31 May 2001 09:15:10 +0100 Subject: Papers Now Available Message-ID: Dear Connectionists, The following papers are now available for download from http://cis.paisley.ac.uk/giro-ci0/ 1) Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem Mark Girolami To Appear : Neural Computation Abstract Kernel principal component analysis has been introduced as a method of extracting a set of orthonormal nonlinear features from multi-variate data and many impressive applications are being reported within the literature. 
This paper presents the view that the eigenvalue decomposition of a kernel matrix can also provide the discrete expansion coefficients required for a non-parametric orthogonal series density estimator. In addition to providing novel insights into non-parametric density estimation, this paper provides an intuitively appealing interpretation for the nonlinear features extracted from data using kernel principal component analysis. 2) A Variational Method for Learning Sparse and Overcomplete Representations. Mark Girolami To Appear : Neural Computation Abstract An expectation maximisation algorithm for learning sparse and overcomplete data representations is presented. The proposed algorithm exploits a variational approximation to a range of heavy-tailed distributions whose limit is the Laplacian. A rigorous lower bound on the sparse prior distribution is derived which enables the analytic marginalisation of a lower bound on the data likelihood. This lower bound enables the development of an expectation maximisation algorithm for learning the overcomplete basis vectors and inferring the most probable basis coefficients. 3) Mercer Kernel Based Clustering in Feature Space Mark Girolami To Appear : IEEE Transactions on Neural Networks Abstract This paper presents a method for both the unsupervised partitioning of a sample of data and the estimation of the possible number of inherent clusters which generate the data. This work exploits the notion that performing a nonlinear data transformation into some high dimensional feature space increases the probability of the linear separability of the patterns within the transformed space and therefore simplifies the associated data structure. It is shown that the eigenvectors of the kernel matrix which defines the implicit mapping provide a means to estimate the number of clusters inherent within the data, and a computationally simple iterative procedure is presented for the subsequent feature space partitioning of the data.
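[Editor's note] The kernel eigenvalue machinery behind papers 1 and 3 above can be demonstrated briefly. The sketch below is illustrative only: it is not code from the papers, and the RBF kernel choice, bandwidth, and toy data are assumptions. It builds a Gaussian kernel matrix, centres it in feature space as in kernel PCA, and inspects the eigenvalue spectrum, whose leading components carry the cluster structure discussed in the third abstract.

```python
import numpy as np

def rbf_kernel_matrix(X, sigma=1.0):
    # Gaussian (RBF) kernel: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def centred_kernel_eigensystem(X, sigma=1.0):
    # Centre K in feature space (the standard kernel PCA step), then
    # eigendecompose; eigenvalues are returned in descending order.
    K = rbf_kernel_matrix(X, sigma)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)
    return vals[::-1], vecs[:, ::-1]

# Two well-separated blobs: the spectrum is dominated by a few leading
# eigenvalues, the kind of structure used to estimate the cluster count.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3.0, 0.3, (20, 2)),
               rng.normal(3.0, 0.3, (20, 2))])
vals, vecs = centred_kernel_eigensystem(X, sigma=1.0)
```

With data this cleanly separated, the top few eigenvalues account for most of the spectrum; on real data, interpreting the decay of the spectrum is the harder part.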
From yann at research.att.com Tue May 1 17:05:46 2001 From: yann at research.att.com (Yann LeCun) Date: Tue, 01 May 2001 17:05:46 -0400 Subject: Announcing NIPS Online Message-ID: <200105012104.RAA07062@surfcity.research.att.com> Dear Colleagues: We are pleased to announce the availability of the NIPS Online web site at http://nips.djvuzone.org NIPS Online offers free access to the full collection of NIPS Proceedings, volumes 1 to 12 (NIPS*88 to NIPS*99). High resolution scans of the articles are provided in DjVu format with full-text search capability. Viewer software and information about DjVu are available at http://www.djvuzone.org The NIPS Online web site was made possible by the NIPS Foundation, which funded the scanning; the original publishers, MIT Press and Morgan-Kaufmann, which graciously let us provide free access to the content; and AT&T Labs, which supported the project. -- Yann LeCun [apologies if you receive multiple copies of this message] ____________________________________________________________________ Yann LeCun Head, Image Processing Research Dept. AT&T Labs - Research tel:+1(732)420-9210 fax:(732)368-9454 200 Laurel Avenue, Room A5-4E34 yann at research.att.com Middletown, NJ 07748, USA. http://www.research.att.com/~yann From woonw at aston.ac.uk Tue May 1 17:10:14 2001 From: woonw at aston.ac.uk (Wei Lee Woon) Date: Tue, 1 May 2001 22:10:14 +0100 Subject: Lectureship available in Neural Computing at Aston University Message-ID: <008401c0d283$1ec43de0$81f4a8c0@canggih> LECTURESHIP IN NEURAL COMPUTING, COMPLEX SYSTEMS OR INFORMATION ENGINEERING/MATHEMATICS. ASTON UNIVERSITY, UK.
The NCRG, as part of the Information Engineering group, is looking for a highly motivated individual to contribute to an internationally renowned research effort in the areas of neural computing, biomedical information engineering, and inference systems. Our theoretical research interests span the traditional areas of signal processing, statistical pattern processing and information mathematics. Current applications-oriented activity includes work in biomedical areas (ECG/EEG/MEG), image segmentation, time series analysis, geographic information systems, error correcting codes, cryptography and steganography. We are seeking an enthusiast who can contribute to our research directions. The new lecturer will also be able to contribute towards a research-based MSc and a novel undergraduate information mathematics programme. Details of the Group's activities are on www.maths.aston.ac.uk and www.ncrg.aston.ac.uk. Candidates should have excellent qualifications, a deep commitment to research and a caring and involved attitude towards students. Appointments will be for 5 years in the first instance, with the possibility of subsequent transfer to a continuing appointment. Salary scale: £18,731 to £30,967, and exceptionally to £34,601, per annum. Further information is available from the Personnel Office (quoting Ref A01/69). Tel: (+44/0) 121 359 0870 (24 hour answerphone); email b.a.power at aston.ac.uk. Informal enquiries can be made to Professor David Lowe (d.lowe at aston.ac.uk). Closing date for the receipt of applications: 28 June 2001.
From Nigel.Goddard at ed.ac.uk Wed May 2 08:12:56 2001 From: Nigel.Goddard at ed.ac.uk (Nigel Goddard) Date: Wed, 02 May 2001 13:12:56 +0100 Subject: Neural Coding: Call for Participation Message-ID: <3AEFF9C8.53D60004@ed.ac.uk> THE NEURAL CODE: MULTILEVEL AND COMPUTATIONAL APPROACHES a Maxwell Neuroinformatics Workshop Call for Participation May 28-June 1, 2001, Edinburgh, Scotland http://www.anc.ed.ac.uk/workshops This workshop is concerned with theoretical and empirical approaches to understanding the neural code, particularly with respect to the analysis of multineuron data and theoretical approaches which can inform these analyses. To address the key questions, it is necessary to bring together biologists, physicists, computer scientists and statisticians. The workshop brings together scientists with experimental, computational and theoretical approaches spanning multiple levels to provide an opportunity for interaction between methodological and phenomenological foci. Confirmed speakers include: Moshe Abeles, Peter Dayan, Peter Foldiak, Andreas Herz, Rob Kass, Bruce McNaughton, Mike Oram, Stefano Panzeri, Maneesh Sahani, David Sterret, Alessandro Treves, Emma Wood, Florentin Worgotter and Rich Zemel. The meeting is being organized in a small workshop style with emphasis on short presentations from invited speakers and from participants, round table discussions, and open debates on emerging topics. Time is scheduled for informal, self-organised, small-group activities. Computers will be available to support explorative work and demonstrations. In addition to the invited speakers, a limited number of places will be available to interested scientists, who will be chosen on the basis of the contribution they can make to the workshop.
A number of places are reserved for junior faculty, postdoctoral researchers and senior graduate students who are early in their research careers in the areas covered by the workshop and who could gain significantly from exposure to the workshop presentations and discussions. Some travel/accommodation stipends will be available for participants who do not have access to their own funding. Registration is via the developing Neuroinformatics portal at http://www.neuroinf.org, and further information can be found at the workshop site: http://www.anc.ed.ac.uk/workshops From james at tardis.ed.ac.uk Wed May 2 12:18:56 2001 From: james at tardis.ed.ac.uk (James Hammerton) Date: Wed, 02 May 2001 17:18:56 +0100 Subject: CFP: Special Issue of JMLR on "Machine Learning Approaches to Shallow Parsing" Message-ID: <20010502161857.3A5ECC14A@davros.tardis.ed.ac.uk> [Please note the Reply-To field] Call for Papers: Special Issue of the Journal of Machine Learning Research -- "Machine Learning Approaches to Shallow Parsing" Editors: James Hammerton james.hammerton at ucd.ie, University College Dublin Miles Osborne osborne at cogsci.ed.ac.uk, University of Edinburgh Susan Armstrong susan.armstrong at issco.unige.ch, University of Geneva Walter Daelemans walter.daelemans at uia.ua.ac.be, University of Antwerp The Journal of Machine Learning Research invites authors to submit papers for the Special Issue on Machine Learning Approaches to Shallow Parsing. Background ---------- Over the last decade there has been an increased interest in applying machine learning techniques to corpus-based natural language processing. In particular, many techniques have been applied to shallow parsing of large corpora, where rather than producing a detailed syntactic or semantic analysis of each sentence, key parts of the syntactic structure or key pieces of semantic information are identified or extracted.
For example, such tasks include identifying the noun phrases in a text, extracting non-overlapping chunks of text that identify the major phrases in a sentence, or extracting the subject, main verb and object from a sentence. Applications of shallow parsing include data mining from unstructured textual material (e.g. web pages, newswires), information extraction, question answering, automated annotation of linguistic corpora and the preprocessing of data for linguistic tasks such as machine translation or full scale parsing. Shallow parsing of realistic, naturally occurring language poses a number of challenges for a machine learning system. Firstly, the training set is usually large, which pushes batch techniques to the limit. The training material is often noisy and frequently only partially determines a model (that is, only some aspects of the target model are observed). Secondly, shallow parsing requires making large numbers of decisions, which translates into learning large models. The size of such models usually results in extremely sparse counts, which makes reliable estimation difficult. In sum, learning how to do shallow parsing will tax almost any machine learning algorithm and will thus provide valuable insight into real-world performance. In a number of workshops and publications, a variety of machine learning techniques have been applied in this area, including memory based (instance based) learning, inductive logic programming, probabilistic context free grammars, maximum entropy, transformation based learning, artificial neural networks and, more recently, support vector machines. However, there has not been an opportunity to compare and contrast these techniques in a systematic manner. The special issue will thus provide a venue for drawing together the relevant ML techniques. TOPICS ------ The aim of the special issue is to solicit and publish papers that provide a clear view of the state of the art in machine learning for shallow parsing.
We therefore encourage submissions in the following areas: * applications of machine learning techniques to shallow parsing tasks, including the development of new techniques * comparisons of machine learning techniques for shallow parsing * analyses of the complexity of machine learning for shallow parsing tasks To facilitate cross-paper comparison and thus strengthen the special issue as a whole, authors are encouraged to consider using one of the following data sets provided via the CoNLL workshops (please note however that this is not mandatory): http://lcg-www.uia.ac.be/conll2000/chunking/ or: http://lcg-www.uia.ac.be/conll2001/clauses/ We emphasise that authors will not be judged solely in terms of raw performance, and this is not to be considered a competition: insight into the strengths and weaknesses of a given system is deemed to be more important. High quality papers reviewing machine learning for shallow parsing will also be welcome. Instructions ------------ Articles should be submitted electronically. Postscript or PDF formats are acceptable; submissions should be single column, typeset in 11 pt font, and include all author contact information on the first page. See the author instructions at www.jmlr.org for more details. To submit a paper, send the normal emails asked for by the JMLR in their author instructions to submissions at jmlr.org (NOT to the editors directly), indicating in the subject headers that the submission is intended for the Special Issue on Machine Learning Approaches to Shallow Parsing. Key dates --------- Submission deadline: 2nd September 2001 Notification of acceptance: 16th November 2001 Final drafts: 3rd February 2002 Further information ------------------- Please contact James Hammerton with any queries.
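[Editor's note] For readers new to the CoNLL data sets mentioned in the CFP above: the shared chunking task labels each token with a BIO-style tag (B-XX opens a chunk of type XX, I-XX continues it, O is outside any chunk). The helper below is a minimal, hypothetical sketch of recovering phrase spans from such tags; the function name and the example sentence are invented for illustration and are not part of the CoNLL distribution.

```python
def bio_to_chunks(tokens, tags):
    """Collect (chunk_type, phrase) spans from BIO labels (B-XX, I-XX, O)."""
    chunks, current, ctype = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-") or (tag.startswith("I-") and ctype != tag[2:]):
            # A B- tag (or a type-mismatched I- tag) starts a new chunk.
            if current:
                chunks.append((ctype, " ".join(current)))
            current, ctype = [tok], tag[2:]
        elif tag.startswith("I-"):
            current.append(tok)  # continue the open chunk
        else:  # "O" closes any open chunk
            if current:
                chunks.append((ctype, " ".join(current)))
            current, ctype = [], None
    if current:
        chunks.append((ctype, " ".join(current)))
    return chunks

# Invented example in the style of the CoNLL-2000 chunking task:
tokens = ["He", "reckons", "the", "current", "account", "deficit", "will", "narrow"]
tags = ["B-NP", "B-VP", "B-NP", "I-NP", "I-NP", "I-NP", "B-VP", "I-VP"]
chunks = bio_to_chunks(tokens, tags)
print(chunks)
# → [('NP', 'He'), ('VP', 'reckons'), ('NP', 'the current account deficit'), ('VP', 'will narrow')]
```

The learning problem the CFP describes is predicting the tag sequence from the tokens; this decoding step is only the bookkeeping around it.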
From stefan.wermter at sunderland.ac.uk Wed May 2 11:49:57 2001 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Wed, 02 May 2001 16:49:57 +0100 Subject: PG Research Student Applications Message-ID: <3AF02CA5.3BAD883A@sunderland.ac.uk> With respect to this email list, I would like to encourage applications from computing PhD students in intelligent systems (e.g. neural networks, natural language engineering, hybrid systems, cognitive neuroscience, neuro/fuzzy systems, machine learning). The general application text for all areas of interest is below. Stefan Wermter ------------------------------------------------- PhD & MPhil Opportunities in Computing, Engineering & Technology The School of Computing, Engineering & Technology at the University of Sunderland is seeking high-quality, motivated applicants wishing to gain a PhD or MPhil in the disciplines of Computing, Mathematical Sciences, Engineering and Technology. The school has a strong and growing research profile, with 6 EPSRC and more than 10 EU-funded projects and a vibrant community of over 100 researchers. The School is well-resourced and offers excellent facilities with much state-of-the-art computing equipment, and offers high-quality postgraduate as well as undergraduate programmes accredited by professional societies. We welcome applications in both computing and mathematics as well as general engineering.
The main research groups in computing & mathematics are: intelligent systems (major strengths in neural networks, genetic algorithms, hybrid systems and natural language engineering: Professor Stefan Wermter - stefan.wermter at sunderland.ac.uk +44 191 5153279); human computer systems (includes themes such as multimedia, computer-aided learning, computing for the disabled and human computer interaction evaluation methodologies: Professor Gilbert Cockton - gilbert.cockton at sunderland.ac.uk +44 191 5153394); software engineering (focussed on practical areas especially software testing and the organisational risks of implementing information systems, methodologies and solutions for industry: Professor Helen Edwards helen.edwards at sunderland.ac.uk +44 191 5152786 or Professor Barrie Thompson barrie.thompson at sunderland.ac.uk +44 191 5152769); electronic commerce (encompasses the development and promotion of standards in this dynamic area with a special interest in the area of electronic procurement: Kevin Ginty - kevin.ginty at sunderland.ac.uk or Albert Bokma albert.bokma at sunderland.ac.uk +44 191 5153233); decision support systems (covers a diverse range of activities in statistics & mathematics at the boundary of Computer Science and Statistics and Operational Research: Professor Eric Fletcher eric.fletcher at sunderland.ac.uk +44 191 5152822 or Professor Alfredo Moscardini alfredo.moscardini at sunderland.ac.uk +44 191 5152763); *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Centre of Informatics, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From shultz at psych.mcgill.ca Wed May 2 13:36:12 2001 From: shultz at psych.mcgill.ca (Thomas R. 
Shultz) Date: Wed, 02 May 2001 13:36:12 -0400 Subject: Recent papers on knowledge and learning Message-ID: <4.3.1.0.20010502132519.00a86620@127.0.0.1> Recent papers on knowledge and learning that may be of interest to readers of this list Shultz, T. R., & Rivest, F. (2001, in press). Knowledge-based cascade-correlation: Using knowledge to speed learning. Connection Science. Research with neural networks typically ignores the role of knowledge in learning by initializing the network with random connection weights. We examine a new extension of a well-known generative algorithm, cascade-correlation. Ordinary cascade-correlation constructs its own network topology by recruiting new hidden units as needed to reduce network error. The extended algorithm, knowledge-based cascade-correlation (KBCC), recruits previously learned sub-networks as well as single hidden units. This paper describes KBCC and assesses its performance on a series of small, but clear problems involving discrimination between two classes. The target class is distributed as a simple geometric figure. Relevant source knowledge consists of various linear transformations of the target distribution. KBCC is observed to find, adapt, and use its relevant knowledge to significantly speed learning. ============= Shultz, T. R., & Rivest, F. (2000). Using knowledge to speed learning: A comparison of knowledge-based cascade-correlation and multi-task learning. Proceedings of the Seventeenth International Conference on Machine Learning (pp. 871-878). San Francisco: Morgan Kaufmann. Cognitive modeling with neural networks unrealistically ignores the role of knowledge in learning by starting from random weights. It is likely that effective use of knowledge by neural networks could significantly speed learning. A new algorithm, knowledge-based cascade-correlation (KBCC), finds and adapts its relevant knowledge in new learning. 
Comparison to multi-task learning (MTL) reveals that KBCC uses its knowledge more effectively to learn faster. ============= Preprints and reprints can be found at http://www.psych.mcgill.ca/perpg/fac/shultz/default.htm Cheers, Tom -------------------------------------------------------- Thomas R. Shultz, Professor, Department of Psychology, McGill University, 1205 Penfield Ave., Montreal, Quebec, Canada H3A 1B1. E-mail: shultz at psych.mcgill.ca Updated 7 April 2001: http://www.psych.mcgill.ca/perpg/fac/shultz/default.htm Phone: 514 398-6139 Fax: 514 398-4896 -------------------------------------------------------- From andre at icmc.sc.usp.br Thu May 3 09:31:24 2001 From: andre at icmc.sc.usp.br (andre) Date: Thu, 03 May 2001 10:31:24 -0300 Subject: International Journal of Computational Intelligence and Applications Message-ID: <3AF15DAC.EF5E0CF6@icmc.sc.usp.br> ========================================================= INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS http://ejournals.wspc.com.sg/ijcia/ijcia.html Vol. 1, No. 1, March 2001 Editorial Feedback Self-Organizing Map and its Application to Spatio-Temporal Pattern Classification K. Horio and T. Yamakawa Learning of Fuzzy Automata W. Pedrycz and A. Gacek Hybrid Instance-Based System for Predicting Ocean Temperatures J. M. Corchado, B. Lees and J. Aiken Modular Connectionist Modelling and Classification Approaches for Local Diagnosis in Telecommunication Traffic Management Y. Bennani and F. Bossaert Using Case Retrieval to Seed Genetic Algorithms S. Oman and P. Cunningham The Application of Feedforward Neural Networks in VLSI Fabrication Process Optimization W. Xiangdong and W. Shoujue An Enhanced Genetic Algorithm for Solving the High-Level Synthesis Problems of Scheduling, Allocation, and Binding G. W. Grewal and T. C. Wilson Calendar of Events Book Review: Words and Rules: The Ingredients of Language -- Prof. Andre Ponce de Leon F. 
de Carvalho Associate Professor Computational Intelligence Laboratory Department of Computer Science and Statistics University of Sao Paulo Sao Carlos, SP, Brazil www.icmc.sc.usp.br/~andre From giese at MIT.EDU Thu May 3 23:05:36 2001 From: giese at MIT.EDU (Martin A Giese) Date: Thu, 03 May 2001 23:05:36 -0400 Subject: research positions Message-ID: <200105040305.XAA19989@superior-temporal-sulcus.mit.edu> RESEARCH POSITIONS IN THEORETICAL NEUROSCIENCE AND COMPUTER VISION / ROBOTICS The research group for Action Representation and Learning at the Max-Planck Institute for Biological Cybernetics and the Department of Cognitive Neurology at the University Hospital in Tuebingen (Germany) offers the following research positions in theoretical neuroscience and computer vision / robotics: 1 postdoc position (BAT IIa) 2 PhD positions (BAT IIa / 2) The group investigates how complex movements and actions are represented in the brain, and how the underlying learning principles can be exploited for technical applications in computer vision, robotics and biomedical systems. One focus of the group is the development and experimental testing of models for action representation in the brain. This work includes the development of neural models and testing them in psychophysical, neurophysiological and fMRI experiments in close collaboration with different well-established experimentalists in Tuebingen and the USA. The second focus is the development of technical applications of learning-based representations of actions for medical diagnosis, computer animation and movement programming in robots. Technical applications will be developed in collaboration with companies in robotics and biomedical technology and the Dept. for Neurology at the University Hospital in Tuebingen. Close collaborations exist with the Center for Biological and Computational Learning, M.I.T., Cambridge (USA), Harvard Medical School, and the Department of Biomedical Engineering, Boston University (USA).
The postdoctoral position will be available for 3 years (salary BAT IIa), extendable to 5 years. The ideal candidate has a background in computer science, engineering, physics or mathematics and previous experience in computer vision / graphics, robotics or machine learning. She / he will be in charge of developing technical applications and new methods in machine learning for the representation of actions. Both PhD positions are available for 3 years (salary BAT IIa/2). One PhD student will focus on neural modeling of the recognition of complex movements in humans and primates. He / she will be closely involved in experiments to evaluate the theory. Ideally, this candidate has a strong interest in neuroscience, good mathematical skills and previous training in physics, mathematics, engineering, computer science or psychology. Tuebingen offers a new graduate program in neuroscience. The second PhD student will take part in the development of medical diagnosis systems and computer graphics applications exploiting methods from machine learning and computer vision. Ideally, this candidate has good mathematical and programming skills and previous training in physics, mathematics, engineering, or computer science. All positions are funded by the German Volkswagen Foundation. For further information please contact: Dr. Martin Giese Center for Biological and Computational Learning Massachusetts Institute of Technology E 25-206 45, Carleton Street Cambridge, Massachusetts 02139-4307 USA email: giese at mit.edu Tel: +001-617-253 0549 (office) +001-617-253 0551 (lab secretary) Fax: +001-617-253 2964 Applicants are asked to submit their CV, bibliography and the names of two references. Applications should be sent by email to the same address. -- ----------------------------------------------------- Dr.
Martin Giese Center for Biological and Computational Learning Massachusetts Institute of Technology, Room E25 - 206 45, Carleton Street Cambridge, Massachusetts 02139-4307 USA email: giese at mit.edu Tel: +001-617-253 0549 (office) +001-617-253 0551 (lab secretary) +001-617-491 5538 (home) Fax: +001-617-253 2964 ---------------------------------------------------- From santini at dii.unisi.it Fri May 4 10:16:41 2001 From: santini at dii.unisi.it (Santini Fabrizio) Date: Fri, 04 May 2001 16:16:41 +0200 Subject: LFTNC 2001 Advanced Research Workshop - NATO Grants Message-ID: <3AF2B9C9.702136AD@dii.unisi.it> LFTNC 2001 NATO Advanced Research Workshop on Limitations and Future Trends in Neural Computation -------------------------------------------------------------- We are very pleased to announce that, within the framework of the ARW LFTNC 2001, NATO provides limited additional funds to support the participation of scientists from Greece, Portugal and Turkey. Please refer either to the official European site or the American mirror for further details: www.ims.unico.it/2001/lftnc www.ewh.ieee.org/soc/im/2001/lftnc -------------------------------------------------------------- Fabrizio Santini Universita' di Siena - Facolta' di Ingegneria Informatica web: http://www.dii.unisi.it/~santini -------------------------------------------------------------- From derprize at cnbc.cmu.edu Fri May 4 13:27:21 2001 From: derprize at cnbc.cmu.edu (David E. Rumelhart Prize) Date: Fri, 04 May 2001 13:27:21 -0400 Subject: First Recipient of the David E. Rumelhart Prize Announced Message-ID: <3AF2E678.9DD92390@cnbc.cmu.edu> Geoffrey E. Hinton Chosen as First Recipient of the David E. Rumelhart Prize for Contributions to the Formal Analysis of Human Cognition The Glushko-Samuelson Foundation and the Cognitive Science Society are pleased to announce that Geoffrey E. Hinton has been chosen as the first recipient of the David E.
Rumelhart Prize for contributions to the formal analysis of human cognition. Hinton was chosen for his many important contributions to the analysis of neural networks, elucidating the nature of representation, processing, and learning in the brain. In a landmark early book with James Anderson (1), he pioneered the use of distributed representations and described how they can be used for semantic knowledge representation (2). With Terrence J. Sejnowski (3), he introduced the Boltzmann Machine, an important neural network architecture for finding globally optimal solutions to difficult constraint satisfaction problems, and with Sejnowski and Ackley (4) he proposed a learning algorithm for use in such networks. With David Rumelhart and Ronald Williams (5), he introduced the back-propagation learning algorithm and made clear how it could be used to discover useful representations capturing the underlying structure of a body of structured propositional information. He has gone on from this important early work to make many further contributions to the field of neural networks, including studies of mixtures of experts (6) and Helmholtz machines (7). His publication list includes more than 100 articles on these and a wide range of other topics. Beyond these contributions, Hinton is an outstanding mentor and advisor: 18 graduate students have earned the Ph. D. degree under his supervision. Hinton to Deliver Prize Lecture at the Edinburgh Meeting of the Cognitive Science Society in August, 2001 Geoffrey Hinton will receive the First David E. Rumelhart Prize and deliver the first Rumelhart Prize Lecture at the Annual Meeting of the Cognitive Science Society, to be held August 1-4 in Edinburgh, Scotland. The Prize itself will consist of a certificate, a citation of the awardee's contribution, and a monetary award of $100,000. Information on this year's meeting is available at http://www.hcrc.ed.ac.uk/cogsci2001/. The David E.
Rumelhart Prize to be Awarded Annually When established in August of 2000, the David E. Rumelhart Prize was to be awarded biennially for outstanding contributions to the formal analysis of human cognition. Upon reviewing the pool of individuals nominated to receive the prize, the Glushko-Samuelson Foundation, in consultation with the Governing Board of the Cognitive Science Society, came to the conclusion that an annual prize is warranted. With the aid of the Prize Selection Committee (listed below), the foundation determined that there exists a large pool of outstanding candidates representing each of the approaches to the formal analysis of human cognition identified in the prize announcement: mathematical modeling of human cognitive processes, formal analysis of language and other products of human cognitive activity, and computational analyses of human cognition using symbolic and non-symbolic frameworks. Awarding the prize annually should facilitate the timely recognition of major contributions arising within each of these approaches. The recipient of the second David E. Rumelhart Prize will be announced at the Cognitive Science Society Meeting in Edinburgh, with the second prize lecture to be given at the following meeting of the society at George Mason University in July, 2002. Prize Selection Committee The membership of the prize selection committee was selected in consultation with the Distinguished Advisory Board (William Estes, Barbara Partee, and Herbert Simon). The members of the prize selection committee are Allan Collins, Bolt, Beranek and Newman and Northwestern University; Robert J. Glushko, Glushko-Samuelson Foundation; Mark Liberman, University of Pennsylvania; Anthony J. Marley, McGill University; and James L. McClelland (Chair), Carnegie Mellon. Brief Biography of Geoffrey E. Hinton Geoffrey Hinton received his BA in experimental psychology from Cambridge in 1970 and his PhD in Artificial Intelligence from Edinburgh in 1978.
He did postdoctoral work at Sussex University and the University of California, San Diego, and spent five years as a faculty member in the Computer Science department at Carnegie-Mellon University. He then moved to Toronto where he was a fellow of the Canadian Institute for Advanced Research and a Professor in the Computer Science and Psychology departments. He is a former president of the Cognitive Science Society, and he is a fellow of the Royal Society (UK), the Royal Society of Canada, and the American Association for Artificial Intelligence. In 1992 he won the ITAC/NSERC award for contributions to information technology. Hinton is currently Director of the Gatsby Computational Neuroscience Unit at University College London, where he leads an outstanding group of faculty, post-doctoral research fellows, and graduate students investigating the computational neural mechanisms of perception and action with an emphasis on learning. His current main interest is in unsupervised learning procedures for neural networks with rich sensory input. Cited Publications by Geoffrey E. Hinton (1) Hinton, G. E. and Anderson, J. A. (1981) Parallel Models of Associative Memory, Erlbaum, Hillsdale, NJ. (2) Hinton, G. E. (1981) Implementing semantic networks in parallel hardware. In Hinton, G. E. and Anderson, J. A. (Eds.), Parallel Models of Associative Memory, Erlbaum, Hillsdale, NJ. (3) Hinton, G. E. and Sejnowski, T. J. (1983) Optimal perceptual inference. Proceedings of the IEEE conference on Computer Vision and Pattern Recognition, Washington DC. (4) Ackley, D. H., Hinton, G. E., and Sejnowski, T. J. (1985) A learning algorithm for Boltzmann machines. Cognitive Science, 9, 147--169. (5) Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986) Learning representations by back-propagating errors. Nature, 323, 533--536. (6) Jacobs, R., Jordan, M. I., Nowlan, S. J. and Hinton, G. E. (1991) Adaptive mixtures of local experts. Neural Computation, 3, 79-87. (7) Hinton, G.
E., Dayan, P., Frey, B. J. and Neal, R. (1995) The wake-sleep algorithm for unsupervised neural networks. Science, 268, pp. 1158-1161. Visit the David E. Rumelhart Prize Website at: http://www.cnbc.cmu.edu/derprize From Angelo.Arleo at dimail.epfl.ch Fri May 4 05:08:57 2001 From: Angelo.Arleo at dimail.epfl.ch (Angelo Arleo) Date: Fri, 04 May 2001 11:08:57 +0200 Subject: Preprints and Ph.D. thesis available Message-ID: <3AF271A9.4DB4C20@di.epfl.ch> Dear Connectionists, the following documents are now available on the web: ======================================================= A. Arleo (2000). "Spatial Learning and Navigation in Neuro-Mimetic Systems - Modeling the Rat Hippocampus", Ph.D. thesis, Dept. of Computer Science, Swiss Federal Inst. of Technology Lausanne, EPFL, Switzerland. http://diwww.epfl.ch/~arleo/PUBLICATIONS/PhD.html ======================================================= A. Arleo and W. Gerstner (2000). "Place Cells and Spatial Navigation based on Vision, Path Integration, and Reinforcement Learning", Advances in Neural Information Processing Systems 13, MIT Press, pp. 89-95 http://diwww.epfl.ch/~arleo/PUBLICATIONS/nips00.pdf.Z ======================================================= A. Arleo and W. Gerstner (2001). "Spatial Orientation in Navigating Agents: Modeling Head-direction Cells". Neurocomputing (to appear) http://diwww.epfl.ch/~arleo/PUBLICATIONS/NeuroComputing00.pdf.Z ======================================================= Comments and suggestions are particularly welcome. Best regards, Angelo Arleo ______________________________________________________________________ ____/ __ / ____/ / Dr. Angelo Arleo. / / / / / Lab. of Computational Neuroscience (LCN) ____/ ____/ ____/ / Swiss Federal Inst.
of Technology Lausanne / / / / CH-1015 Lausanne EPFL _____/ _/ _/ _____/ Tel/Fax: ++41 21 693 6696 / 5263 E-mail: angelo.arleo at epfl.ch Web: http://diwww.epfl.ch/~arleo ______________________________________________________________________ From jose at psychology.rutgers.edu Sat May 5 11:43:16 2001 From: jose at psychology.rutgers.edu (Stephen J. Hanson) Date: Sat, 05 May 2001 11:43:16 -0400 Subject: New paper on Distributional Properties of BOLD Susceptibility effects in the Brain Message-ID: <3AF41F94.2020406@kreizler.rutgers.edu> New paper available: "The Distribution of BOLD Susceptibility effects in the Brain is Non-Gaussian", S.J. Hanson & B. Martin Bly, to appear in NeuroReport (July, 2001). Abstract: A key assumption underlying fMRI analysis in the General Linear Model is that the underlying distribution of BOLD susceptibility is Gaussian. Analysis of several common data sets and experimental paradigms shows that the underlying distribution is NON-Gaussian. Further identification shows that the distribution is most likely GAMMA, and implications for hemodynamic modeling are discussed, as well as recommendations concerning inferential testing in such "heavy-tailed" environments. PDF--> http://psychology.rutgers.edu/Users/jose/index.html Steve also see RUMBA--> www.rumba.rutgers.edu From J.A.Bullinaria at cs.bham.ac.uk Sun May 6 09:59:07 2001 From: J.A.Bullinaria at cs.bham.ac.uk (John A Bullinaria) Date: Sun, 6 May 2001 14:59:07 +0100 (BST) Subject: MSc in Natural Computation Message-ID: MSc in Natural Computation ========================== School of Computer Science The University of Birmingham Birmingham, UK Starting in October 2001, we are offering an advanced 12-month MSc programme in Natural Computation (i.e. computational systems that use ideas and inspirations from natural biological, ecological and physical systems).
This will comprise six taught modules in Neural Computation, Evolutionary Computation, Molecular and Quantum Computation, Nature Inspired Optimisation, Nature Inspired Learning, and Nature Inspired Design (10 credits each); two mini research projects (30 credits each); and one full-scale research project (60 credits). The programme is supported by the EPSRC through its Master's Level Training Packages, and by a number of leading companies including BT, Unilever, Nortel Networks, Thames Water, Pro Enviro, SPSS, GPU Power Distribution, and aQtive. The School of Computer Science at the University of Birmingham has a strong research group in evolutionary and neural computation, with five members of academic staff (faculty) and two research fellows currently specialising in these fields: Dr. John Bullinaria (Neural Networks, Evolutionary Computation, Cog.Sci.) Dr. Jun He (Evolutionary Computation) Dr. Julian Miller (Evolutionary Computation, Machine Learning) Dr. Riccardo Poli (Evolutionary Computation, GP, Computer Vision, NNs, AI) Dr. Jon Rowe (Evolutionary Computation, AI) Dr. Thorsten Schnier (Evolutionary Computation, Engineering Design) Prof. Xin Yao (Evolutionary Computation, NNs, Machine Learning, Optimisation) Other staff members also working in these areas include Prof. Aaron Sloman (evolvable architectures of mind, co-evolution, interacting niches) and Dr. Jeremy Wyatt (evolutionary robotics, classifier systems). The programme is open to candidates with a very good honours degree or equivalent qualifications in Computer Science/Engineering or closely related areas. Six fully funded EPSRC studentships (covering fees and maintenance costs) are available each year, and additional financial support from our industrial partners may be available during the main project period.
Further details about this programme and funding opportunities are available from our Web-site at: http://www.cs.bham.ac.uk/natcomp Please note that the closing date for applications is 15th July 2001. From vlassis at science.uva.nl Mon May 7 10:42:40 2001 From: vlassis at science.uva.nl (Nikos Vlassis) Date: Mon, 07 May 2001 16:42:40 +0200 Subject: some papers Message-ID: <3AF6B460.86DB8DB6@wins.uva.nl> Dear Connectionists, The following three papers have been accepted for publication and might be of interest. N. Vlassis, Y. Motomura, Ben Krose Supervised Dimension Reduction of Intrinsically Low-dimensional Data Neural Computation (to appear) ftp://ftp.science.uva.nl/pub/computer-systems/aut-sys/reports/Vlassis01nc.ps.gz Abstract: High-dimensional data generated by a system with limited degrees of freedom are often constrained in low-dimensional manifolds in the original space. In this paper we investigate dimension reduction methods for such intrinsically low-dimensional data through linear projections that preserve the manifold structure of the data. For intrinsically one-dimensional data this implies projecting to a curve on the plane with as few intersections as possible. We propose a supervised projection pursuit method, which can be regarded as an extension of the single-index model for nonparametric regression. We show results from a toy example and two robotic applications. Keywords: dimension reduction, feature extraction, intrinsic dimensionality, projection pursuit, simple curve, single-index model, multiple-index model, appearance modeling, mobile robot localization. ---- N. Vlassis and Y. Motomura Efficient Source Adaptivity in Independent Component Analysis IEEE Trans. Neural Networks (to appear) ftp://ftp.science.uva.nl/pub/computer-systems/aut-sys/reports/Vlassis01tnn.ps.gz Abstract: A basic element in most ICA algorithms is the choice of a model for the score functions of the unknown sources.
While this is usually based on approximations, for large data sets it is possible to achieve `source adaptivity' by directly estimating from the data the `true' score functions of the sources. In this paper we describe an efficient scheme for achieving this by extending the fast density estimation method of Silverman (1982). We show with a real and a synthetic experiment that our method can provide more accurate solutions than state-of-the-art methods when optimization is carried out in the vicinity of the global minimum of the contrast function. Keywords: Independent component analysis, blind signal separation, source adaptivity, score function estimation. ---- N. Vlassis, A. Likas A greedy EM algorithm for Gaussian mixture learning Neural Processing Letters (to appear) ftp://ftp.science.uva.nl/pub/computer-systems/aut-sys/reports/Vlassis01npl.ps.gz Abstract: Learning a Gaussian mixture with a local algorithm like EM can be difficult because (i) the true number of mixing components is usually unknown, (ii) there is no generally accepted method for parameter initialization, and (iii) the algorithm can get stuck in one of the many local maxima of the likelihood function. In this paper we propose a greedy algorithm for learning a Gaussian mixture which tries to overcome these limitations. In particular, starting with a single component and adding components sequentially until a maximum number k, the algorithm is capable of achieving solutions superior to EM with k components in terms of the likelihood of a test set. The algorithm is based on recent theoretical results on incremental mixture density estimation, and uses a combination of global and local search each time a new component is added to the mixture. 
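The greedy strategy in the Vlassis-Likas abstract above (start from one component, alternate a global search for where to place a new component with local EM refinement) is easy to prototype. The sketch below is an illustrative reconstruction for 1-D data in plain NumPy, not the authors' algorithm: the quantile-based candidate search, the candidate variance, and all other parameter choices are my own assumptions.

```python
# Illustrative sketch (not the authors' code) of greedy Gaussian mixture
# learning: grow the mixture one component at a time, running EM after
# each insertion. 1-D data only; all heuristics here are assumptions.
import numpy as np

def em(x, means, variances, weights, iters=50):
    """Standard EM updates for a 1-D Gaussian mixture."""
    x = x[:, None]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        dens = weights * np.exp(-0.5 * (x - means) ** 2 / variances) \
               / np.sqrt(2 * np.pi * variances)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances.
        nk = resp.sum(axis=0)
        weights = nk / len(x)
        means = (resp * x).sum(axis=0) / nk
        variances = (resp * (x - means) ** 2).sum(axis=0) / nk + 1e-6
    return means, variances, weights

def log_likelihood(x, means, variances, weights):
    dens = weights * np.exp(-0.5 * (x[:, None] - means) ** 2 / variances) \
           / np.sqrt(2 * np.pi * variances)
    return np.log(dens.sum(axis=1)).sum()

def greedy_gmm(x, k_max=3):
    """Add components sequentially up to k_max, as the abstract outlines."""
    means, variances, weights = np.array([x.mean()]), np.array([x.var()]), np.array([1.0])
    for _ in range(k_max - 1):
        best = None
        # Crude "global search" for the new component: try a few candidate
        # locations, keep the one giving the best likelihood after local EM.
        for m_new in np.quantile(x, [0.1, 0.3, 0.5, 0.7, 0.9]):
            k = len(means)
            cand = em(x,
                      np.append(means, m_new),
                      np.append(variances, x.var() / 4),
                      np.append(weights * (1 - 1 / (k + 1)), 1 / (k + 1)))
            ll = log_likelihood(x, *cand)
            if best is None or ll > best[0]:
                best = (ll, cand)
        means, variances, weights = best[1]
    return means, variances, weights

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
means, variances, weights = greedy_gmm(x, k_max=2)
print(sorted(np.round(means, 1)))  # expected to recover means near -3 and 3
```

Real implementations would of course compare test-set likelihood rather than training likelihood, and use the incremental density-estimation results cited in the abstract for the insertion step.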
-- http://www.science.uva.nl/~vlassis From bbs at bbsonline.org Mon May 7 17:22:58 2001 From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor)) Date: Mon, 07 May 2001 17:22:58 -0400 Subject: BBS Call for Commentators A SENSORIMOTOR ACCOUNT OF VISION AND VISUAL CONSCIOUSNESS Message-ID: Below is the abstract of a forthcoming BBS target article [Please note that this paper was in fact accepted and archived to the web in October 2000 but the recent move of BBS to New York delayed the Call until now.] A SENSORIMOTOR ACCOUNT OF VISION AND VISUAL CONSCIOUSNESS by J. Kevin O'Regan Alva Noe http://www.bbsonline.org/Preprints/ORegan/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 8000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. 
A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. _____________________________________________________________ A sensorimotor account of vision and visual consciousness J. Kevin O'Regan Laboratoire de Psychologie Expérimentale Centre National de la Recherche Scientifique, Université René Descartes, 92774 Boulogne Billancourt, France oregan at ext.jussieu.fr http://nivea.psycho.univ-paris5.fr Alva Noe Department of Philosophy University of California, Santa Cruz Santa Cruz, CA 95064 anoe at cats.ucsc.edu http://www2.ucsc.edu/people/anoe/ KEYWORDS: Sensation, Perception, Action, Consciousness, Experience, Qualia, Sensorimotor, Vision, Change blindness ABSTRACT: Many current neurophysiological, psychophysical and psychological approaches to vision rest on the idea that when we see, the brain produces an internal representation of the world. The activation of this internal representation is assumed to give rise to the experience of seeing. The problem with this kind of approach is that it leaves unexplained how the existence of such a detailed internal representation might produce visual consciousness. An alternative proposal is made here.
We propose that seeing is a way of acting. It is a particular way of exploring the environment. Activity in internal representations does not generate the experience of seeing. The outside world serves as its own, external, representation. The experience of seeing occurs when the organism masters what we call the governing laws of sensorimotor contingency. The advantage of this approach is that it provides a natural and principled way of accounting for visual consciousness, and for the differences in the perceived quality of sensory experience in the different sensory modalities. Several lines of empirical evidence are brought forward in support of the theory, in particular: evidence from experiments in sensorimotor adaptation, visual "filling in", visual stability despite eye movements, change blindness, sensory substitution, and color perception. http://www.bbsonline.org/Preprints/ORegan/ ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENTS *** (1) The authors of scientific articles are not paid money for their refereed research papers; they give them away. What they want is to reach all interested researchers worldwide, so as to maximize the potential research impact of their findings. Subscription/Site-License/Pay-Per-View costs are accordingly access-barriers, and hence impact-barriers for this give-away research literature. There is now a way to free the entire refereed journal literature, for everyone, everywhere, immediately, by mounting interoperable university eprint archives, and self-archiving all refereed research papers in them. 
Please see: http://www.eprints.org http://www.openarchives.org/ http://www.dlib.org/dlib/december99/12harnad.html --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to self-archive all their papers in their own institution's Eprint Archives or in CogPrints, the Eprint Archive for the biobehavioral and cognitive sciences: http://cogprints.soton.ac.uk/ It is extremely simple to self-archive and will make all of our papers available to all of us everywhere, at no cost to anyone, forever. Authors of BBS papers wishing to archive their already published BBS Target Articles should submit it to BBSPrints Archive. Information about the archiving of BBS' entire backcatalogue will be sent to you in the near future. Meantime please see: http://www.bbsonline.org/help/ and http://www.bbsonline.org/Instructions/ --------------------------------------------------------------------- (3) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). 
From bbs at bbsonline.org Mon May 7 17:05:35 2001 From: bbs at bbsonline.org (Stevan Harnad - Behavioral & Brain Sciences (Editor)) Date: Mon, 07 May 2001 17:05:35 -0400 Subject: BBS re BBSPrints Logins Message-ID: Dear Connectionists List User, This list regularly receives Calls for Commentators from Behavioral and Brain Sciences (BBS) journal. BBS has now changed its procedures. If you also wish to be notified personally of accepted target articles and Calls for Commentators, you can get an individual login and password at the following URL: http://www.bbsonline.org/register.html Please note however that if you have had direct email communication with BBS in the past, a user account may already be in place for you, based on your most recent sending email address and details. In this case, the registration procedure at the URL above will send you the login details for that account. You can then logon to BBSPrints from the User Login link on the http://www.bbsonline.org/ front page and alter your mailshot (Call) status from there. There is of course no charge for any of this: also, there is no need to reply directly to this email. 
Many thanks, Stevan Harnad Editor Phineas de Thornley Head Editor, Electronic Review Systems Behavioral and Brain Sciences Journals Department _/_/_/ _/_/_/ _/_/_/_/ Cambridge University Press _/ _/ _/ _/ _/ 40 West 20th Street _/ _/ _/ _/ _/ New York _/_/_/_/_ _/_/_/_/_ _/_/_/_/ NY 10011-4211 _/ _/ _/ _/ _/ UNITED STATES _/ _/ _/ _/ _/ /_/_/_/_/ /_/_/_/_/ _/_/_/_/ bbs at bbsonline.org http://bbsonline.org __ __ | | |\ | | | |\ | |_ 'Phone: +001 212 924 3900 ext.369 |__| | \| |__ | | \| |__ Fax: +001 212 645 5960 From sper at informatik.uni-osnabrueck.de Tue May 8 05:07:00 2001 From: sper at informatik.uni-osnabrueck.de (Volker Sperschneider) Date: Tue, 08 May 2001 11:07:00 +0200 Subject: faculty opening Message-ID: <3AF7B734.5859C755@informatik.uni-osnabrueck.de> The University of Osnabrueck announces a full professorship for Computational Neuroscience. Further information is available at http://www.uos.de/career_service/stellenangebote/index.cfm. From mschmitt at lmi.ruhr-uni-bochum.de Wed May 9 07:34:54 2001 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Wed, 09 May 2001 13:34:54 +0200 Subject: Preprint on Radial Basis Function Neural Networks Message-ID: <3AF92B5E.BAC2BCF3@lmi.ruhr-uni-bochum.de> Dear Colleagues, a preprint of the paper "Radial basis function neural networks have superlinear VC dimension" by Michael Schmitt, accepted for the 14th Annual Conference on Computational Learning Theory COLT'2001, is available on-line from http://www.ruhr-uni-bochum.de/lmi/mschmitt/rbfsuper.ps.gz (19 pages gzipped PostScript). Regards, Michael Schmitt ------------------------------------------------------------ TITLE: Radial basis function neural networks have superlinear VC dimension AUTHOR: Michael Schmitt ABSTRACT We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. 
As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension $\Omega(W\log k)$, where $W$ is the number of parameters and $k$ the number of nodes. This significantly improves the previously known linear bound. We also derive superlinear lower bounds for networks of discrete and continuous variants of center-surround neurons. The constants in all bounds are larger than those obtained thus far for sigmoidal neural networks with constant depth. The results have several implications with regard to the computational power and learning capabilities of neural networks with local receptive fields. In particular, they imply that the pseudo dimension and the fat-shattering dimension of these networks are superlinear as well, and they yield lower bounds even when the input dimension is fixed. The methods presented in this paper might be suitable for obtaining similar results for other kernel-based function classes. -- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: +49 234 32-23209, Fax: +49 234 32-14465 http://www.ruhr-uni-bochum.de/lmi/mschmitt/ From ken at phy.ucsf.edu Wed May 9 14:40:11 2001 From: ken at phy.ucsf.edu (Ken Miller) Date: Wed, 9 May 2001 11:40:11 -0700 Subject: Paper available: Origins of cortical temporal tuning Message-ID: <15097.36619.21538.1324@coltrane.ucsf.edu> A preprint of the following article is now available, from http://www.keck.ucsf.edu/~ken (click on 'publications', then on 'Models of Neuronal Integration and Circuitry') or directly from ftp://ftp.keck.ucsf.edu/pub/ken/krukowski-miller01.pdf (there is also a web supplement to the article, ftp://ftp.keck.ucsf.edu/pub/ken/krukowski-miller01-websupp.pdf) Krukowski, A.E. and K.D. Miller (2001). ``Thalamocortical NMDA conductances and intracortical inhibition can explain cortical temporal tuning.'' Nature Neuroscience 4, 424-430.
Abstract: Cells in cerebral cortex fail to respond to fast-moving stimuli that evoke strong responses in the thalamic nuclei that provide input to cortex. The reason for this behavior has remained a mystery. We study an experimentally-motivated model of the thalamic input-recipient layer of cat primary visual cortex that we have previously shown accounts for many aspects of cortical orientation tuning. In this circuit, inhibition dominates over excitation, but temporal modulations of excitation and inhibition occur out of phase with one another, allowing excitation to transiently drive cells. We show that this circuit provides a natural explanation of cortical low-pass temporal frequency tuning, provided N-methyl-D-aspartate (NMDA) receptors are present in thalamocortical synapses in proportions measured experimentally. This suggests a new and unanticipated role for NMDA conductances in shaping the temporal response properties of cortical cells, and suggests that common cortical circuit mechanisms underlie both spatial and temporal response tuning. Ken Kenneth D. Miller telephone: (415) 476-8217 Associate Professor fax: (415) 476-4929 Dept. of Physiology, UCSF internet: ken at phy.ucsf.edu 513 Parnassus www: http://www.keck.ucsf.edu/~ken San Francisco, CA 94143-0444 From maneesh at gatsby.ucl.ac.uk Thu May 10 08:25:29 2001 From: maneesh at gatsby.ucl.ac.uk (Maneesh Sahani) Date: Thu, 10 May 2001 13:25:29 +0100 Subject: CNS*2001 Workshops: Call for Proposals Message-ID: <200105101225.NAA30337@crick.gatsby.ucl.ac.uk> CALL FOR PROPOSALS CNS*2001 Workshops July 4 and 5, 2001 Pacific Grove, California Workshops focusing on current issues in computational neuroscience will be held on July 4 and 5, 2001, as part of the CNS*2001 conference in Pacific Grove, California. Potential organizers are invited to submit proposals for specific workshop topics. Workshops may fall into one of three formats: 1. Discussion Workshops (formal or informal); 2. Tutorials; and 3.
Mini-symposia, or they may combine more than one of these formats. The goal of the workshops is to provide an informal forum within the CNS meeting for focused discussion of recent or speculative research, novel techniques, and open issues in computational neuroscience. Discussion workshops, whether formal (i.e., held in a conference room with projection and writing media) or informal (held elsewhere), should stress interactive and open discussions in preference to sequential presentations. Tutorials and mini-symposia provide a format for a focused exploration of particular issues or techniques within a more traditional presentation framework; in these cases too, adequate time should be reserved for questions and general discussion. The organizers of a workshop should endeavor to bring together as broad a range of pertinent viewpoints as possible. In addition to recruiting participants and moderating discussion, workshop organizers should be prepared to submit a short report, summarizing the presentations and discussion, to the workshop coordinator shortly after the conference. These reports will be included on the CNS*2001 web site. ------------------------- How to propose a workshop ------------------------- To propose a workshop, please submit the following information to the workshop coordinator at the address below: (1) the name(s) of the organizer(s) (2) the title of the workshop (3) a description of the subject matter, indicating clearly the range of topics to be discussed (4) the format(s) of the workshop; if a discussion session, please specify whether you would like it to be held in a conference room or in a less formal setting (5) for tutorials and mini-symposia, a provisional list of speakers (6) whether the workshop is to run for one or two days Please submit proposals as early as possible by email to cns2001workshops at gatsby.ucl.ac.uk or by post to Dr.
Maneesh Sahani Gatsby Computational Neuroscience Unit Alexandra House 17, Queen Square London, WC1N 3AR, U.K. The descriptions of accepted workshops will appear on the CNS*2001 web site as they are received. Attendees are encouraged to check this list, and to contact the organizers of any workshops in which they are interested in participating. From s.holden at cs.ucl.ac.uk Thu May 10 09:31:19 2001 From: s.holden at cs.ucl.ac.uk (Sean Holden) Date: Thu, 10 May 2001 14:31:19 +0100 Subject: MSc in Intelligent Systems Message-ID: <3AFA9827.528AC379@cs.ucl.ac.uk> MSc in Intelligent Systems -------------------------- Department of Computer Science University College London London, UK A Masters Training Package funded by the Engineering and Physical Sciences Research Council (EPSRC). This new, 12 month advanced Masters programme covering all aspects of Intelligent Systems has extensive industrial involvement and is available to applicants having a good degree in Computer Science or a similar subject and/or appropriate industrial experience. Applicants will be expected to have completed final year courses in, or to have experience of, for example, neural networks, expert systems or artificial intelligence. The programme is available by full-time study for one year, or by part-time study for two years (day-release). A number of studentships are available for suitably qualified applicants. 
Courses, in addition to a substantial project, are planned to include: - Supervised Learning - Unsupervised Learning - Advanced Artificial Intelligence - Pattern Recognition & Machine Vision - Programming and Management Issues - Fundamental skills in Mathematical Methods and Statistics - Evolutionary Systems - Intelligent Text Handling - Advanced Database and Information Systems - Intelligent Systems in Business and Commerce - Intelligent Systems in Bioinformatics The Department of Computer Science at UCL has an excellent research group in Intelligent Systems, and the new programme has the active involvement of many leading researchers. For further information and application forms please contact the Admissions and General inquiries Office/Friends' Room, University College London (UCL), Gower Street, London WC1E 6BT, United Kingdom. Tel: +44 (0)207 679 3000 Fax: (0)207 679 3001 e-mail: degree-info at ucl.ac.uk. Alternatively, consult our web site: http://www.cs.ucl.ac.uk/teaching/MTPIS From dirk at bioss.ac.uk Thu May 10 08:10:56 2001 From: dirk at bioss.ac.uk (Dirk Husmeier) Date: Thu, 10 May 2001 13:10:56 +0100 Subject: Paper on HMMs in Bioinformatics Message-ID: <3AFA8550.3F1799E7@bioss.ac.uk> Dear Connectionists The following paper has just been accepted for publication in JOURNAL OF COMPUTATIONAL BIOLOGY and might be of interest to researchers who apply machine learning techniques to problems in BIOINFORMATICS. TITLE: Detection of Recombination in DNA Multiple Alignments with Hidden Markov Models AUTHORS: Dirk Husmeier and Frank Wright PAGES: 56 DOWNLOAD FROM: http://www.bioss.sari.ac.uk/~dirk/My_publications.html FORMAT: PDF SYNOPSIS The recent advent of multiple-resistant pathogens has led to an increased interest in interspecies recombination as an important, and previously underestimated, source of genetic diversification in bacteria and viruses. 
The discovery of a surprisingly high frequency of mosaic RNA sequences in HIV-1 suggests that a substantial proportion of AIDS patients have been coinfected with HIV-1 strains belonging to different subtypes, and that recombination between these genomes can occur in vivo to generate new biologically active viruses. A phylogenetic analysis of the bacterial genera Neisseria and Streptococcus has revealed that the introduction of blocks of DNA from penicillin-resistant non-pathogenic strains into sensitive pathogenic strains has led to new strains that are both pathogenic and resistant. Thus interspecies recombination raises the possibility that bacteria and viruses can acquire biologically important traits through the exchange and transfer of genetic material. In the present article, a hidden Markov model (HMM) is employed to detect recombination events in multiple alignments of DNA sequences. The emission probabilities in a given state are determined by the branching order (topology) and the branch lengths of the respective phylogenetic tree, while the transition probabilities depend on the global recombination probability. The present study improves on an earlier heuristic parameter optimization scheme and shows how the branch lengths and the recombination probability can be optimized in a maximum likelihood sense by applying the expectation maximization (EM) algorithm. The novel algorithm is tested on a synthetic benchmark problem and is found to clearly outperform the earlier heuristic approach. The paper concludes with an application of this scheme to a DNA sequence alignment of the argF gene from four Neisseria strains, where a likely recombination event is clearly detected. 
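For readers unfamiliar with how such an HMM flags recombination, here is a minimal sketch, not the paper's implementation: hidden states are candidate tree topologies, the per-column emission likelihoods of the alignment under each topology are assumed precomputed (in the paper they come from the phylogenetic likelihood of each tree), and a single global recombination probability governs switching between states. A forward-backward recursion then gives a posterior topology for every alignment column; breakpoints show up as changes in the dominant topology.

```python
# Minimal sketch (not the paper's code) of an HMM over tree topologies.
# col_lik values below are toy numbers, not real phylogenetic likelihoods.
import numpy as np

def topology_posteriors(col_lik, p_recomb):
    """Forward-backward posteriors over topologies for each alignment column.

    col_lik: (n_columns, n_topologies) likelihood of each column under each tree.
    p_recomb: global probability of switching topology between adjacent columns.
    """
    n, k = col_lik.shape
    # Transitions: stay with prob 1 - p_recomb, else jump uniformly to another tree.
    trans = np.full((k, k), p_recomb / (k - 1))
    np.fill_diagonal(trans, 1 - p_recomb)
    # Forward pass, rescaled at each column to avoid underflow.
    fwd = np.zeros((n, k))
    fwd[0] = col_lik[0] / k
    fwd[0] /= fwd[0].sum()
    for t in range(1, n):
        fwd[t] = (fwd[t - 1] @ trans) * col_lik[t]
        fwd[t] /= fwd[t].sum()
    # Backward pass, similarly rescaled.
    bwd = np.ones((n, k))
    for t in range(n - 2, -1, -1):
        bwd[t] = trans @ (col_lik[t + 1] * bwd[t + 1])
        bwd[t] /= bwd[t].sum()
    post = fwd * bwd
    return post / post.sum(axis=1, keepdims=True)

# Toy alignment of 20 columns and 2 candidate topologies: the left half fits
# tree 0 better, the right half fits tree 1 better, mimicking a breakpoint.
lik = np.full((20, 2), 0.1)
lik[:10, 0] = 0.3
lik[10:, 1] = 0.3
post = topology_posteriors(lik, p_recomb=0.05)
print(post[:, 0].round(2))  # high before the breakpoint, low after it
```

The EM step described in the synopsis would sit on top of this: the forward-backward posteriors form the E-step, while the M-step re-optimizes the branch lengths (inside col_lik) and the recombination probability.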
Best Wishes Dirk -- ---------------------------------------------- Dirk Husmeier Biomathematics and Statistics Scotland (BioSS) SCRI, Invergowrie, Dundee DD2 5DA, United Kingdom http://www.bioss.ac.uk/~dirk/ From Ulrich.Hillenbrand at dlr.de Thu May 10 11:47:21 2001 From: Ulrich.Hillenbrand at dlr.de (Ulrich Hillenbrand) Date: Thu, 10 May 2001 17:47:21 +0200 Subject: Thesis and articles on thalamocortical information processing Message-ID: <3AFAB807.104B6A85@dlr.de> Dear colleagues, you can find my doctoral thesis Spatiotemporal Adaptation in the Corticogeniculate Loop Ulrich Hillenbrand Technical University of Munich (2001) (see abstract below) for download at http://tumb1.biblio.tu-muenchen.de/publ/diss/ph/2001/hillenbrand.pdf You may also be interested in two related articles, Spatiotemporal adaptation through corticothalamic loops: A hypothesis Ulrich Hillenbrand and J. Leo van Hemmen Visual Neuroscience 17, 107-118 (2000) http://www.journals.cup.org/bin/bladerunner?REQUNIQ=989495926&REQSESS=156885&118200REQEVENT=&REQINT1=40136&REQAUTH=0 and Does Corticothalamic Feedback Control Cortical Velocity Tuning? Ulrich Hillenbrand and J. Leo van Hemmen Neural Computation 13, 327-355 (2001) http://neco.mitpress.org/cgi/content/full/13/2/327 Reprints are available upon request (by e-mail or to the address below). Please feel free to comment. Regards, Ulrich Hillenbrand ------------------------------------------------------------- Spatiotemporal Adaptation in the Corticogeniculate Loop Abstract The thalamus is the major gate to the cortex for almost all sensory signals, for input from various subcortical sources such as the cerebellum and the mammillary bodies, and for reentrant cortical information. Thalamic nuclei do not merely relay information to the cortex but perform some operation on it while being modulated by various transmitter systems and in continuous interplay with their cortical target areas. 
Indeed, cortical feedback to the thalamus is the anatomically dominant input to relay cells even in those thalamic nuclei that are directly driven by sensory systems. While it is well-established that the receptive fields of cortical neurons are strongly influenced by convergent thalamic inputs of different types, the modulation of thalamic responses effected by cortical feedback has been difficult to interpret. Experiments and theoretical considerations have pointed to a variety of operations of the visual cortex on the visual thalamus, the lateral geniculate nucleus (LGN), such as control of binocular disparity for stereopsis (Schmielau & Singer, 1977), attention-related gating of relay cells (Sherman & Koch, 1986), gain control of relay cells (Koch, 1987), synchronizing firing of neighboring relay cells (Sillito et al., 1994; Singer, 1994), increasing visual information in relay cells' output (McClurkin et al., 1994), and switching relay cells from a detection to an analyzing mode (Godwin et al., 1996; Sherman, 1996; Sherman & Guillery, 1996). Nonetheless, the evidence for any particular function is still sparse and rather indirect to date. Clearly, detailed concepts of the interdependency of thalamic and cortical operation could greatly advance our knowledge about complex sensory, and ultimately cognitive, processing. Here we present a novel view on the corticothalamic puzzle by proposing that control of velocity tuning of visual cortical neurons may be an eminent function of corticogeniculate processing. The hypothesis is advanced by studying a model of the primary visual pathway in extensive computer simulations. At the heart of the model is a biophysical account of the electrical membrane properties of thalamic relay neurons (Huguenard & McCormick, 1992; McCormick & Huguenard, 1992) that includes 12 ionic conductances.
Among the different effects that corticogeniculate feedback may have on relay cells, we focus on the modulation of their relay mode (between tonic and burst mode) by control of their resting membrane potential. Employing two distinct temporal-response types of geniculate relay neurons (lagged and nonlagged), we find that shifts in membrane potential affect the temporal response properties of relay cells in a way that alters the tuning of cortical cells for speed. Given the loop of information from the LGN to cortical layer 4, via a variable number of synapses to layer 6, and back to the LGN, the question arises: what are the likely implications of adaptive speed tuning for visual information processing? Based on some fairly general considerations concerning the nature of motion information, we devise a simple model of the corticogeniculate loop that utilizes adaptive speed tuning for the fundamental task of segmentation of objects in motion. A detailed mathematical analysis of the model's behavior is presented. Treating visual stimulation as a stochastic process that drives the adaptation dynamics, we prove the model's object-segmentation capabilities and reveal some unintended properties, such as oscillatory responses, that are consequences of its basic design. Several aspects of the dynamics in the loop are discussed in relation to experimental data. -- Dr. Ulrich Hillenbrand Institute of Robotics and Mechatronics German Aerospace Center/DLR Postfach 1116 82230 Wessling Germany Phone: +49-8153-28-3501 Fax: +49-8153-28-1134 From glaser at socrates.Berkeley.EDU Thu May 10 17:24:20 2001 From: glaser at socrates.Berkeley.EDU (Donald A.
Glaser) Date: Thu, 10 May 2001 14:24:20 -0700 Subject: post-doc positions in computational neuroscience (UC Berkeley) Message-ID: POST-DOC POSITIONS IN COMPUTATIONAL NEUROSCIENCE We are developing computational models of primate visual cortex based on the properties of two-dimensional arrays of interconnected model neurons and multiple layers of such arrays. These models are designed to mimic the anatomy and functioning of visual cortex as closely as practical and to make predictions of observable phenomena via psychophysical, electrophysiological, and fMRI techniques. Experiments being planned now will involve the new Berkeley Brain Imaging Center with its 4-Tesla fMRI system in studying patterns of cortical excitation resulting from various visual stimuli. Psychophysical experiments to test our models will continue in our own laboratory. Candidates will be expected to perform some combination of analysis, refinement, and elaboration of these or new, related, computer models, and participate in design and implementation of psychophysical and neuroimaging tests of these models. A strong background in mathematics, physics, or computer science and a continuing interest in neuroscience are required. Experience with Matlab, Mathematica, and Linux is desirable as we will shortly install a Linux-based Beowulf system for large computations in addition to the computers now in use. A supercomputer at the Lawrence Berkeley National Laboratory is also available for these studies. Sample Publications: 1) Motion detection and characterization by an excitable membrane: The "bow wave" model, by Donald A. Glaser, Davis Barch, Neurocomputing 26-27 (1999) 137-146 2) Characterization of activity oscillations in an excitable membrane model and their potential functionality for neuronal computations by Davis Barch, Neurocomputing 32-33 (2000) 25-35 3) Multiple matching of features in simple stereograms, by T. Kumar, Vision Res. Vol 36, No.
5 pp 675-698, (1996) 4) To be presented at CNS 2001, the Tenth Annual Computational Neuroscience Meeting at Pacific Grove, California, June 30-July 5, 2001 1) Nearby edges and lines interfere very little with width discrimination of rectangles, by T. Kumar, Ilya Khaytin, and D. A. Glaser 2) Interactions among cortical maps, by Kirill N. Shokhirev and Donald A. Glaser 3) Synaptic depression and facilitation can induce motion aftereffects in an excitable membrane model of visual motion processing, by D. Barch and D. A. Glaser 4) Slowly moving visual stimuli induce characteristic periodic activity waves in an excitable membrane model of visual motion processing, by D. Barch and D. A. Glaser Please send your CV, a brief statement of your interests, and letters of recommendation to: Donald A. Glaser PhD. Nobel Laureate in Physics Professor of Physics and of Neurobiology in the Graduate School University of California at Berkeley 41 Hill Road, Berkeley CA 94708 W 510-642-7231, F 510-841-2563 glaser at socrates.berkeley.edu From schittenkopf at ftc.co.at Fri May 11 03:42:53 2001 From: schittenkopf at ftc.co.at (Christian Schittenkopf (FTC Research)) Date: Fri, 11 May 2001 09:42:53 +0200 Subject: ICANN 2001 WORKSHOP CONTRIBUTIONS Message-ID: <000001c0d9ed$fccf8970$6bfda8c0@FTCRD02> [ Moderator's note: Thanks to Christian Schittenkopf for preparing this summary of the ICANN workshops. The CONNECTIONISTS list doesn't carry announcements for individual workshops associated with a conference where we have also carried the call for papers and call for registrations, because we were being flooded with too many of these and they are usually only of interest to conference attendees. However, we are happy to carry a summary of a conference's entire workshop program as a single posting. -- Dave Touretzky, CONNECTIONISTS moderator ] Following the regular program of the ICANN 2001 conference (Aug. 21-24), four workshops on current topics will be held on Aug. 25 in Vienna.
CONTRIBUTIONS to the WORKSHOPS listed below are highly welcome. More details on tutorials, conference and workshops can be found at http://www.ai.univie.ac.at/icann/ Christian Schittenkopf (Workshop Chair) Workshop: RECURRENT NEURAL NETWORKS AND ONLINE SEQUENCE LEARNING Organizers: Douglas Eck and Juergen Schmidhuber, IDSIA OVERVIEW: A full-day workshop. We define the topic broadly and include presentations from related areas, although the focus will remain on recurrent neural networks (RNNs). RNNs are of interest as they can implement almost arbitrary sequential behavior. They are biologically more plausible and computationally more powerful than feedforward networks, support vector machines, hidden Markov models, etc. Making RNNs learn from examples used to be difficult though. Recent progress has overcome fundamental problems plaguing traditional RNNs - now there exist online learning RNNs that efficiently learn previously unlearnable solutions to numerous tasks, using not more than O(1) computations per weight and time step: Recognition of temporally extended, noisy patterns Recognition of regular, context-free and context-sensitive languages Recognition of temporal order of widely separated events Extraction of information conveyed by the temporal distance between events Generation of precisely timed rhythms Stable generation of smooth periodic trajectories Robust storage of high-precision real numbers across extended time intervals The workshop is intended to discuss recent advances as well as future potential of RNNs and alternative approaches to online sequence learning. WORKSHOP FORMAT: Like all ICANN 2001 workshops, this session will take place in a particularly nice venue, a traditional Heuriger ['hoy-ri-guer] in Vienna. A "Heuriger" provides a typically Viennese setting where one can drink local wine and eat schnitzel while sitting on wooden seats at wooden tables. SPEAKERS: We might be able to add one or two additional speakers.
If you are interested in presenting, please contact Doug Eck (doug at idsia.ch) with a suggested title and abstract. Here is a *tentative* list. Marco Gori (Universita degli Studi di Siena, Italy) Steve Grossberg (Boston University, USA) Sepp Hochreiter (University of Colorado, USA) Juan Antonio Perez-Ortiz (Universidad de Alicante, Spain) Nicol Schraudolph (ETH Zurich, Switzerland) Sebino Stramaglia (Istituto Nazionale di Fisica Nucleare, Italy) Ron Sun (University of Missouri, USA) Hans Georg Zimmermann (Siemens AG, Germany) For details and abstracts see the workshop website at http://www.idsia.ch/~doug/icann/index.html For registration see the ICANN website at http://www.ai.univie.ac.at/icann/ Workshop: KERNEL & SUBSPACE METHODS FOR COMPUTER VISION Co-organizers: Ales Leonardis, Horst Bischof http://www.prip.tuwien.ac.at/~bis/kernelws/ Scope of the workshop: This half-day workshop will be held in conjunction with ICANN 2001 on August 25, 2001 in Vienna. In the past years, we have been witnessing vivid developments of sophisticated kernel and subspace methods in neural network and pattern recognition communities on one hand and extensive use of these methods in the area of computer vision on the other hand. These methods seem to be especially relevant for object and scene recognition. The purpose of the workshop is to bring together scientists from the neural network (pattern recognition) and computer vision community to analyze new developments, identify open problems, and discuss possible solutions in the area of kernel & subspace methods such as: Support Vector Machines Independent Component Analysis Principal Component Analysis Canonical Correlation Analysis, etc. for computer vision problems such as: Object Recognition Navigation and Robotics 3D Vision, etc. Contributions in the above mentioned areas are welcome. The program will consist of invited and selected contributed papers.
The papers selected for the workshop will appear in a Workshop Proceedings which will be distributed among the workshop participants. It is planned that selected papers from the workshop will be published in a journal. Important dates: Submission Deadline: 31.5.2001 Notification of Acceptance: 29.6.2001 Final Papers Due: 3.8.2001 Submission instructions: A complete paper, not longer than 12 pages including figures and references, should be submitted in the LNCS page format. The layout of final papers must adhere strictly to the guidelines set out in the Instructions for the Preparation of Camera-Ready Contributions to LNCS Proceedings. Authors are asked to follow these instructions exactly. In order to reduce the handling effort of papers we allow only for electronic submissions by ftp (either ps or pdf files). ftp ftp.prip.tuwien.ac.at [anonymous ftp, i.e.: Name: ftp Password: ] cd kernelws binary put .ext quit Workshop Registration: Registration for the Workshop can be done at the ICANN 2001 Homepage http://www.ai.univie.ac.at/icann/ Workshop: ADVANCES TOWARDS LIFE-LIKE PERCEPTION SYSTEMS Organizer: Leslie Smith The aim of the workshop is to discuss neuromorphic systems in sensory perception: mechanisms, coding schemes, scene analysis (whether visual, auditory, olfactory, or another sense), top-down and bottom-up processing. We are particularly interested in the nature of biological relevance (and indeed, whether this is really necessary) and the sensing-perception-action loop. We seek 1 page contributions by May 31. We are considering organising publication of the workshop.
For further information, see http://www.ai.univie.ac.at/icann/txt/workshop-lps.html Leslie S Smith lss at cs.stir.ac.uk tel: +44 1786 46 7435 fax: +44 1786 46 4551 Department of Computing Science and Mathematics, University of Stirling, Stirling FK9 4LA, Scotland, UK From paolo at dsi.unifi.it Sat May 12 16:28:28 2001 From: paolo at dsi.unifi.it (Paolo Frasconi) Date: Sat, 12 May 2001 22:28:28 +0200 Subject: Call for participation: NATO ASI on AI and Bioinformatics Message-ID: The following meeting may be of interest to researchers interested in neural networks applied to computational biology. Artificial Intelligence and Heuristic Methods for Bioinformatics A NATO Advanced Studies Institute San Miniato, Italy October 1-11, 2001 www.dsi.unifi.it/ai4bio Application deadline: July 25, 2001 Artificial Intelligence and Heuristics (e.g., machine learning and data mining, pattern recognition, cluster analysis, search, knowledge representation) can provide key solutions for the new challenges posed by the progressive transformation of biology into a data-massive science. This school is targeted to scientists who want to learn about the most recent advancements in the application of intelligent systems to computational biology. Topics: Computational analysis of biological data. Artificial intelligence, machine learning, and heuristic methods, including neural and belief networks. Prediction of protein structure (secondary structure, contact maps). The working draft of the human genome. Genome annotation. Computational tools for gene regulation. Analysis of gene expression data and their applications. Computer assisted drug discovery. Knowledge discovery in biological domains. 
Lecturers: Pierre Baldi (University of California, Irvine) Soeren Brunak (CBSA, The Technical University of Denmark) Rita Casadio (CIRB, University of Bologna) Antonello Covacci (Chiron Italia) Paolo Frasconi (DSI, University of Florence) Terry Gaasterland (Rockefeller University) Dan Geiger (Technion, Israel) Mikhail Gelfand (Russian Academy of Science, Moscow) David Haussler (University of California, Santa Cruz) Nikolay A. Kolchanov (Inst. of Cytology and Genetics, Novosibirsk) Richard H. Lathrop (University of California, Irvine) Heiko Mueller (Pharmacia & Upjohn, Milano) Steve Muggleton (Imperial College, London) Burkhard Rost (Columbia University) Roberto Serra (Montecatini SpA, Ravenna) Ron Shamir (Tel Aviv University) Co-directors: Paolo Frasconi (University of Florence) Email: paolo at dsi.unifi.it www.dsi.unifi.it/~paolo Ron Shamir (Tel Aviv University) Email: rshamir at tau.ac.il www.math.tau.ac.il/~rshamir Limited grants have been made available by NATO to cover the accommodation and/or travel expenses of selected attendees. A limited number of travel awards will be made available by the National Science Foundation for U.S. citizens or permanent residents. For APPLICATION, CONTRIBUTING PAPERS, GRANTS, FEES, and further information please visit http://www.dsi.unifi.it/ai4bio From lorincz at valerie.inf.elte.hu Sun May 13 07:47:47 2001 From: lorincz at valerie.inf.elte.hu (LORINCZ Andras) Date: Sun, 13 May 2001 13:47:47 +0200 (MET DST) Subject: TR on Event Learning and Robust Policy Heuristics Message-ID: A technical report is now available, from http://people.inf.elte.hu/lorincz/NIPG-ELU-14-05-2001.ps.gz TITLE Event Learning and Robust Policy Heuristics ABSTRACT In this paper we introduce a novel form of reinforcement learning called event-learning or E-learning. In our method an event is an ordered pair of two consecutive states. We define an event-value function and derive learning rules which are guaranteed to converge to the optimal event-value function.
Combining our method with a well-known robust control method, the SDS algorithm, we introduce Robust Policy Heuristics (RPH). It is shown that RPH, a fast-adapting non-Markovian policy, is particularly useful for coarse models of the environment and for partially observed systems. As such, RPH alleviates the `curse of dimensionality' problem. Fast adaptation can be used to separate time scales of learning the value functions of a Markovian decision making problem and adaptation, the utilization of a non-Markovian policy. We shall argue that (i) the definition of modules is straightforward for E-learning, (ii) E-learning extends naturally to policy switching, and (iii) E-learning promotes planning. Computer simulations of a two-link pendulum with coarse discretization and noisy controller are shown to demonstrate the principle. Comments are more than welcome. Andras Lorincz www.inf.elte.hu/~lorincz From ted.carnevale at yale.edu Sun May 13 08:24:40 2001 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Sun, 13 May 2001 08:24:40 -0400 Subject: NEURON 2001 Summer Course Message-ID: <3AFE7D08.B58203CC@yale.edu> The registration deadline for the NEURON 2001 Summer Course is rapidly approaching (May 25), but a few seats remain available. For more information and an application form see http://www.neuron.yale.edu/neuron/sdsc2001/sdsc2001.htm --Ted From harnad at coglit.ecs.soton.ac.uk Mon May 14 15:03:23 2001 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Mon, 14 May 2001 20:03:23 +0100 (BST) Subject: BBS Call for Commentators: VISUAL CONSCIOUSNESS Message-ID: Below is the abstract of a forthcoming BBS target article [Please note that this paper was accepted and archived to the web in October 2000 but the recent move of BBS to New York delayed the Call until now.] A SENSORIMOTOR ACCOUNT OF VISION AND VISUAL CONSCIOUSNESS by J. 
Kevin O'Regan Alva Noe http://www.bbsonline.org/Preprints/ORegan/ This article has been accepted for publication in Behavioral and Brain Sciences (BBS), an international, interdisciplinary journal providing Open Peer Commentary on important and controversial current research in the biobehavioral and cognitive sciences. Commentators must be BBS Associates or nominated by a BBS Associate. To be considered as a commentator for this article, to suggest other appropriate commentators, or for information about how to become a BBS Associate, please reply by EMAIL within three (3) weeks to: calls at bbsonline.org The Calls are sent to 8000 BBS Associates, so there is no expectation (indeed, it would be calamitous) that each recipient should comment on every occasion! Hence there is no need to reply except if you wish to comment, or to nominate someone to comment. If you are not a BBS Associate, please approach a current BBS Associate (there are currently over 10,000 worldwide) who is familiar with your work to nominate you. All past BBS authors, referees and commentators are eligible to become BBS Associates. A full electronic list of current BBS Associates is available at this location to help you select a name: http://www.bbsonline.org/Instructions/assoclist.html If no current BBS Associate knows your work, please send us your Curriculum Vitae and BBS will circulate it to appropriate Associates to ask whether they would be prepared to nominate you. (In the meantime, your name, address and email address will be entered into our database as an unaffiliated investigator.) To help us put together a balanced list of commentators, please give some indication of the aspects of the topic on which you would bring your areas of expertise to bear if you were selected as a commentator. To help you decide whether you would be an appropriate commentator for this article, an electronic draft is retrievable from the online BBSPrints Archive, at the URL that follows the abstract below. 
_____________________________________________________________ A sensorimotor account of vision and visual consciousness J. Kevin O'Regan Laboratoire de Psychologie Expérimentale, Centre National de la Recherche Scientifique, Université René Descartes, 92774 Boulogne Billancourt, France oregan at ext.jussieu.fr http://nivea.psycho.univ-paris5.fr Alva Noe Department of Philosophy University of California, Santa Cruz Santa Cruz, CA 95064 anoe at cats.ucsc.edu http://www2.ucsc.edu/people/anoe/ KEYWORDS: Sensation, Perception, Action, Consciousness, Experience, Qualia, Sensorimotor, Vision, Change blindness ABSTRACT: Many current neurophysiological, psychophysical and psychological approaches to vision rest on the idea that when we see, the brain produces an internal representation of the world. The activation of this internal representation is assumed to give rise to the experience of seeing. The problem with this kind of approach is that it leaves unexplained how the existence of such a detailed internal representation might produce visual consciousness. An alternative proposal is made here. We propose that seeing is a way of acting. It is a particular way of exploring the environment. Activity in internal representations does not generate the experience of seeing. The outside world serves as its own, external, representation. The experience of seeing occurs when the organism masters what we call the governing laws of sensorimotor contingency. The advantage of this approach is that it provides a natural and principled way of accounting for visual consciousness, and for the differences in the perceived quality of sensory experience in the different sensory modalities. Several lines of empirical evidence are brought forward in support of the theory, in particular: evidence from experiments in sensorimotor adaptation, visual "filling in", visual stability despite eye movements, change blindness, sensory substitution, and color perception.
http://www.bbsonline.org/Preprints/ORegan/ ___________________________________________________________ Please do not prepare a commentary yet. Just let us know, after having inspected it, what relevant expertise you feel you would bring to bear on what aspect of the article. We will then let you know whether it was possible to include your name on the final formal list of invitees. _______________________________________________________________________ *** SUPPLEMENTARY ANNOUNCEMENTS *** (1) The authors of scientific articles are not paid money for their refereed research papers; they give them away. What they want is to reach all interested researchers worldwide, so as to maximize the potential research impact of their findings. Subscription/Site-License/Pay-Per-View costs are accordingly access-barriers, and hence impact-barriers for this give-away research literature. There is now a way to free the entire refereed journal literature, for everyone, everywhere, immediately, by mounting interoperable university eprint archives, and self-archiving all refereed research papers in them. Please see: http://www.eprints.org http://www.openarchives.org/ http://www.dlib.org/dlib/december99/12harnad.html --------------------------------------------------------------------- (2) All authors in the biobehavioral and cognitive sciences are strongly encouraged to self-archive all their papers in their own institution's Eprint Archives or in CogPrints, the Eprint Archive for the biobehavioral and cognitive sciences: http://cogprints.soton.ac.uk/ It is extremely simple to self-archive and will make all of our papers available to all of us everywhere, at no cost to anyone, forever. Authors of BBS papers wishing to archive their already published BBS Target Articles should submit them to the BBSPrints Archive. Information about the archiving of BBS' entire backcatalogue will be sent to you in the near future.
Meantime please see: http://www.bbsonline.org/help/ and http://www.bbsonline.org/Instructions/ --------------------------------------------------------------------- (3) Call for Book Nominations for BBS Multiple Book Review In the past, Behavioral and Brain Sciences (BBS) had only been able to do 1-2 BBS multiple book treatments per year, because of our limited annual page quota. BBS's new expanded page quota will make it possible for us to increase the number of books we treat per year, so this is an excellent time for BBS Associates and biobehavioral/cognitive scientists in general to nominate books you would like to see accorded BBS multiple book review. (Authors may self-nominate, but books can only be selected on the basis of multiple nominations.) It would be very helpful if you indicated in what way a BBS Multiple Book Review of the book(s) you nominate would be useful to the field (and of course a rich list of potential reviewers would be the best evidence of its potential impact!). From john at cs.rhul.ac.uk Tue May 15 08:37:24 2001 From: john at cs.rhul.ac.uk (John Shawe-Taylor) Date: Tue, 15 May 2001 13:37:24 +0100 (BST) Subject: Applications of learning to text and images In-Reply-To: Message-ID: Research Assistant Opening in Kernel Based Methods (see also web site: www.cs.rhul.ac.uk/vacancies/RAkernels.shtml ) Department of Computer Science, Royal Holloway, University of London Three year postdoctoral appointment available immediately Royal Holloway, University of London invites applications for a research assistant position in computer science. The salary is competitive and the work is associated with a new European-funded project directed by John Shawe-Taylor. The project involves developing kernel based methods for the analysis of multi-media documents provided by Reuters, who are collaborators on the project. 
The project is financed by the EU and also involves partners in France (Xerox), Italy (Genova University, Milano University) and Israel (Hebrew University of Jerusalem). We are seeking a researcher with experience in corpus based methods of information retrieval and document categorisation, and a strong programming background. Experience with kernel methods is desirable but not required. Salary is in the range £20,865 to £27,347 per annum inclusive of London Allowance. Please contact John Shawe-Taylor by email at jst at cs.rhul.ac.uk for more information. From ncpw7 at biols.susx.ac.uk Tue May 15 09:34:25 2001 From: ncpw7 at biols.susx.ac.uk (neural computation workshop) Date: Tue, 15 May 2001 14:34:25 +0100 Subject: Call for papers for NCPW7 (Brighton, England) Message-ID: <3B013060.997D52FB@biols.susx.ac.uk> Dear Connectionists I wish to bring people's attention to the first call for papers: The Seventh Neural Computation and Psychology Workshop (NCPW7) at Sussex University, Brighton Connectionist models of Cognition and Perception University of Sussex, Falmer, England From stephen at computing.dundee.ac.uk Tue May 15 10:38:54 2001 From: stephen at computing.dundee.ac.uk (Stephen McKenna) Date: Tue, 15 May 2001 15:38:54 +0100 Subject: Postdoc position in vision and learning Message-ID: <052c01c0dd4c$c42ed370$26222486@dyn.computing.dundee.ac.uk> The following postdoc position may be of interest. UNIVERSITY OF DUNDEE, UK, School of Engineering Department of Applied Computing POSTDOCTORAL RESEARCHER IN COMPUTER VISION (Grade RA1A: £18,731 - £23,256) Candidates are invited to apply for a 2 year Postdoctoral position in the Department of Applied Computing at the University of Dundee. The post is funded by an EPSRC project "Advanced Sensors for Supportive Environments for the Elderly". The successful candidate will conduct research in the area of computer vision-based monitoring, learning and recognition of human action within the context of this application.
The Department of Applied Computing was awarded a "5A" rating in the UK RAE. Candidates should have a PhD (or equivalent experience) in a relevant discipline (e.g. computer vision, machine learning). Informal enquiries may be made to Dr Stephen McKenna, tel: (01382) 344732; e-mail: stephen at computing.dundee.ac.uk Further details of the department and this post can be found at http://www.computing.dundee.ac.uk/projects/supportiveenvironments Applications by CV & covering letter (2 copies of each), complete with the names, addresses, telephone/fax numbers/e-mail addresses of 2 referees should be sent to Personnel Services, University of Dundee, Dundee, DD1 4HN. Further Particulars are available for this post, tel: (01382) 344015. Please quote Reference: SE/151/1. Applicants will only be contacted if invited for interview. Closing date: 7 June 2001. The University of Dundee is committed to equal opportunities and welcomes applications from all sections of the community. http://www.dundee.ac.uk/ From kegl at IRO.UMontreal.CA Tue May 15 13:25:31 2001 From: kegl at IRO.UMontreal.CA (Balazs Kegl) Date: Tue, 15 May 2001 13:25:31 -0400 Subject: Principal Curves page updated and moved Message-ID: <200105151725.f4FHPVS12431@mercure.IRO.UMontreal.CA> Dear connectionists, I updated my Principal Curves web page and moved it to http://www.iro.umontreal.ca/~kegl/research/pcurves/ Recent references are included, and a new version of the java implementation of the Polygonal Line Algorithm [1,2] is available. The most important new features are - arbitrary-dimensional input data - loading/downloading your own data and saving the results - adjusting the parameters of the algorithm in an interactive fashion [1] B. Kégl, A. Krzyzak, T. Linder, and K. Zeger "Learning and design of principal curves" IEEE Transactions on Pattern Analysis and Machine Intelligence vol. 22, no. 3, pp. 281-297, 2000. http://www.iro.umontreal.ca/~kegl/research/publications/keglKrzyzakLinderZeger99.ps [2] B.
Kégl "Principal curves: learning, design, and applications," Ph.D. Thesis, Concordia University, Canada, 1999. http://www.iro.umontreal.ca/~kegl/research/publications/thesis.ps Comments are welcome. Balazs Kegl -------------------------------------------------------------------------------- Balázs Kégl Assistant Professor E-mail: kegl at iro.umontreal.ca Dept. of Computer Science and Op. Res. Phone: (514) 343-7401 University of Montreal Fax: (514) 343-5834 CP 6128 succ. Centre-Ville http://www.iro.umontreal.ca/~kegl/ Montreal, Canada H3C 3J7 From scott at salk.edu Wed May 16 11:16:56 2001 From: scott at salk.edu (Scott Makeig) Date: Wed, 16 May 2001 08:16:56 -0700 (PDT) Subject: Call for Papers ICA2001 Message-ID: <200105161516.f4GFGuU41667@moniz.salk.edu> CALL FOR PAPERS CALL FOR PAPERS ICA2001 http://ica2001.org Third International Conference on Independent Component Analysis and Signal Separation San Diego, California December 9-13, 2001 Independent Component Analysis (ICA) is emerging as a new standard area of signal processing and data analysis. ICA attempts to solve the blind source separation problem in which sensor signals are unknown mixtures of unknown source signals. While there are no general analytical solutions, in the last decade researchers have proposed good approximate methods based on simple assumptions about the source statistics and using maximum likelihood, information maximization and minimization of higher-order moments. ICA theory has received attention from several research communities including machine learning, neural networks, statistical signal processing and Bayesian modeling. More recently, numerous applications of ICA have appeared including applications to adaptive speech filtering, speech signal coding, biomedical signal processing, image compression, text modeling and financial data analysis. ICA2001 will feature the latest developments in the new field of blind source separation.
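[Editorial aside] The blind source separation setup described in the call can be sketched in a few lines of NumPy. This is a toy illustration of my own (a FastICA-style fixed-point iteration with a tanh contrast, one of the higher-order-statistics approximations the call alludes to), not code from the conference organizers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Two independent, non-Gaussian sources: a sine wave and uniform noise.
s = np.vstack([np.sin(np.linspace(0, 40, n)),
               rng.uniform(-1, 1, n)])
A = np.array([[0.8, 0.3], [0.4, 0.7]])   # "unknown" mixing matrix
x = A @ s                                # observed sensor signals

# Whiten the mixtures: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x

# FastICA-style deflation with the tanh contrast function.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    for _ in range(200):
        g = np.tanh(w @ z)
        w_new = (z * g).mean(axis=1) - (1 - g ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)   # stay decorrelated from found rows
        w_new /= np.linalg.norm(w_new)
        converged = abs(abs(w_new @ w) - 1) < 1e-9
        w = w_new
        if converged:
            break
    W[i] = w

y = W @ z   # recovered sources, up to permutation and scale
```

Up to the usual permutation and sign/scale ambiguity of ICA, the rows of `y` recover the original sources even though neither `s` nor `A` was used by the unmixing step.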
The Workshop will feature internationally respected keynote speakers, poster sessions, and symposia on theory, on algorithms and on applications to a wide range of fields and data types. The Conference recreational program includes an informal banquet and a unique opening cocktail party / unmixer. This, the third international meeting in this series, is being hosted by the Institute for Neural Computation, UCSD. The previous two meetings were held in Aussois, France (December, 1999) and Helsinki, Finland (June, 2000). This year's event will be held December 9-13, 2001, immediately following the Neural Information Processing Systems (NIPS) conference in Vancouver, Canada and its post-conference workshops. PAPERS WILL BE ACCEPTED THROUGH THE WORKSHOP WEBSITE http://ica2001.org BETWEEN JUNE 1 AND JUNE 29, 2001 Organizing Committee Chair Terrence Sejnowski terry at inc.ucsd.edu Program Te-Won Lee tewon at inc.ucsd.edu Publicity Scott Makeig scott at inc.ucsd.edu Treasurer Gary Cottrell gary at inc.ucsd.edu Publication Tzyy-Ping Jung jung at inc.ucsd.edu Comm. Javier Movellan javier at inc.ucsd.edu Arrangements John Staight john at inc.ucsd.edu International Advisory Committee C. Jutten, INPG, France E. Oja, Helsinki University of Technology, Finland A. Bell, The Salk Institute, USA S. I. Amari, RIKEN, Japan Program Committee Luis Almeida Hagai Attias Jean-Francois Cardoso Andrzej Cichocki Seungjin Choi Pierre Comon Gustavo Deco Scott Douglas Richard Everson Mark Girolami Lars Kai Hansen Aapo Hyvärinen Juha Karhunen Soo-Young Lee Te-Won Lee Michael Lewicki Juan Lin Eric Moreau Noboru Murata Klaus-Robert Mueller J.-P.
Nadal Klaus Obermayer Bruno Olshausen Dinh-Tuan Pham Barak Pearlmutter Jose Principe Juergen Schmidhuber Kari Torkkola From ojensen at neuro.hut.fi Wed May 16 03:05:19 2001 From: ojensen at neuro.hut.fi (Ole Jensen) Date: Wed, 16 May 2001 10:05:19 +0300 (EET DST) Subject: papers on phase coding Message-ID: Dear colleagues, I would like to draw your attention to two papers on phase coding and information transfer between rhythmically coupled networks. The papers are available in PDF at http://boojum.hut.fi/~ojensen/ or contact me for hard copies. Ole Jensen ========================================================================= Jensen, O. (in press) Information transfer between rhythmically coupled networks: reading the hippocampal phase code. Neural Computation Brain Research Unit, Low Temperature Laboratory, Helsinki University of Technology, P.O. Box 2200, FIN-02015 Espoo, Finland There are numerous reports on rhythmic coupling between separate brain networks. It has been proposed that this rhythmic coupling indicates exchange of information. So far, few computational models have been proposed which explore this principle and its potential computational benefits. Recent results on hippocampal place cells of the rat provide new insight: it has been shown that information about space is encoded by the firing of place cells with respect to the phase of the ongoing theta rhythm. This principle is termed phase coding and suggests that upcoming locations (predicted by the hippocampus) are encoded by cells firing late in the theta cycle, whereas the current location is encoded by cells firing early in the theta cycle. A network reading the hippocampal output must inevitably also receive an oscillatory theta input in order to decipher the phase-coded firing patterns. In this work I propose a simple, physiologically plausible mechanism implemented as an oscillatory network which can decode the hippocampal output.
By changing only the phase of the theta input to the decoder, qualitatively different information is transferred: the theta phase determines whether representations of current or upcoming locations are read by the decoder. The proposed mechanism provides a computational principle for information transfer between oscillatory networks and might generalize to brain networks beyond the hippocampal region. ========================================================================== Jensen, O. and J.E. Lisman (2000) Position reconstruction from an ensemble of hippocampal place cells: contribution of theta phase coding. Journal of Neurophysiology 83:2602-2609 Department of Biology, Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts 02454 Previous analysis of the firing of individual rat hippocampal place cells has shown that their firing rate increases when they enter a place field and that their phase of firing relative to the ongoing theta oscillation (7-12 Hz) varies systematically as the rat traverses the place field, a phenomenon termed the theta phase precession. To study the relative contribution of phase-coded and rate-coded information, we reconstructed the animal's position on a linear track using spikes recorded simultaneously from 38 hippocampal neurons. Two previous studies of this kind found no evidence that phase information substantially improves reconstruction accuracy. We have found that reconstruction is improved provided epochs with large, systematic errors are first excluded. With this condition, use of both phase and rate information improves the reconstruction accuracy by >43% as compared with the use of rate information alone. Furthermore, it becomes possible to predict the rat's position on a 204-cm track with very high accuracy (error of <3 cm). The best reconstructions were obtained with more than three phase divisions per theta cycle.
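[Editorial aside] The readout principle from the first abstract — the phase of the decoder's own theta input selects whether "current" or "upcoming" information is read — can be caricatured in a few lines. This is a toy sketch of my own, not the model from either paper; the phase windows, firing rates and durations are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
theta_freq = 8.0                      # theta rhythm, Hz
dt = 0.001
t = np.arange(0.0, 1.0, dt)          # one second of simulated time
phase = (2 * np.pi * theta_freq * t) % (2 * np.pi)

# Toy phase code: the "current location" cell fires early in each theta
# cycle, the "upcoming location" cell fires late in the cycle.
current_spikes  = (np.abs(phase - 0.5) < 0.2) & (rng.random(t.size) < 0.5)
upcoming_spikes = (np.abs(phase - 5.5) < 0.2) & (rng.random(t.size) < 0.5)

def read_out(window_center, width=1.0):
    """Count spikes inside a theta-phase window; the decoder's own
    oscillatory input determines which phases are read."""
    wrapped = (phase - window_center + np.pi) % (2 * np.pi) - np.pi
    gate = np.abs(wrapped) < width / 2
    return current_spikes[gate].sum(), upcoming_spikes[gate].sum()

early = read_out(0.5)   # decoder phased to early theta -> current location
late  = read_out(5.5)   # decoder phased to late theta  -> upcoming location
```

Shifting only `window_center` flips which cell dominates the count, mirroring the claim that the phase of the decoder's theta input determines what information is transferred.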
These results strengthen the hypothesis that information in rat hippocampal place cells is encoded by the phase of theta at which cells fire. ============================================================================== Ole Jensen, Ph.D. Helsinki University of Technology Low Temperature Laboratory Otakaari 3A P.O. Box 2200 FIN-02015 HUT Finland Office : (+358) 9 4512951 Mobile : (+358) 405049936 Fax : (+358) 9 4512969 e-mail : ojensen at neuro.hut.fi URL : http://boojum.hut.fi/~ojensen/ From terry at salk.edu Wed May 16 16:09:58 2001 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 16 May 2001 13:09:58 -0700 (PDT) Subject: NEURAL COMPUTATION 13:6 Message-ID: <200105162009.f4GK9we16485@purkinje.salk.edu> Neural Computation - Contents - Volume 13, Number 6 - June 1, 2001 VIEW Generalization in Interactive Networks: The Benefits of Inhibitory Competition and Hebbian Learning Randall C. O'Reilly NOTE Optimal Smoothing in Visual Motion Perception Rajesh P.N. Rao, David M. Eagleman and Terrence J. Sejnowski LETTERS Rate Coding Versus Temporal Order Coding: What the Retinal Ganglion Cells Tell the Visual Cortex Rufin Van Rullen and Simon J. Thorpe The Effects of Spike Frequency Adaptation and Negative Feedback on the Synchronization of Neural Oscillators Bard Ermentrout, Matthew Pascal and Boris Gutkin A Unified Approach to the Study of Temporal, Correlational, and Rate Coding Stefano Panzeri and Simon R. Schultz Determination of Response Latency and its Application to Normalization of Cross-Correlation Measures Stuart N. Baker and George L. Gerstein Attractive Periodic Sets in Discrete-Time Recurrent Networks (with Emphasis on Fixed-Point Stability and Bifurcations in Two-Neuron Networks) Peter Tino, Bill G. Horne, and C.
Lee Giles Attractor Networks for Shape Recognition Yali Amit and Massimo Mascaro ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES
                 USA     Canada*   Other Countries
Student/Retired  $60     $64.20    $108
Individual       $88     $94.16    $136
Institution      $460    $492.20   $508
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From cierina at vis.caltech.edu Mon May 21 13:02:07 2001 From: cierina at vis.caltech.edu (Cierina Reyes) Date: Mon, 21 May 2001 10:02:07 -0700 Subject: Announcement - Caltech Postdoctoral Position Message-ID: <5.0.2.1.2.20010521100147.00ac2b50@vis.caltech.edu> CALIFORNIA INSTITUTE OF TECHNOLOGY 2 Postdoctoral Fellowships in Neuroscience Applications are invited for 2 postdoctoral research positions, available immediately, to join a collaborative research program between the laboratories of Partha Mitra at Bell Laboratories (Murray Hill, New Jersey) and Prof. R. Andersen at the California Institute of Technology (Pasadena, California). The research project will examine the temporal correlation structure of activity within and between local and distant cortical areas in parietal and frontal areas during cognitive tasks, involving memory and planning of eye and arm movements. Multi-site recordings of single cell activity and local field potentials will be made in behaving monkeys and the data analyzed using modern statistical methods for stochastic processes and machine learning techniques. The experimentalist will be located at Caltech, and the theorist at Bell Labs. The goal of the research is to elucidate the underlying functional architecture and multi-site dynamics of neural activity in local and long-range circuits involved in working memory.
Theorist Position #400 - The successful candidate should have training in an analytical subject, preferably in theoretical physics, as well as computational experience and/or experience with statistical analysis. The candidate should be motivated to work in understanding neural systems. The position offers an opportunity to work closely with an experiment, and will be part of a strong multidisciplinary team in a rich research environment. Experimentalist Position #500 - The successful candidate should have training in electrical engineering or experimental physics either at the Bachelor's or Ph.D. level or substantial background with electrical hardware and computer programming. Experience working in a neuroscience laboratory using electrophysiological recording is desirable. The position offers the opportunity to interact directly with theorists and will provide a rich opportunity for multidisciplinary study of the nervous system. Applications should include a curriculum vitae and two letters of recommendation. This material should be sent to Ms. Cierina R. Marks, California Institute of Technology, MC 216-76, 1201 E. California Blvd., Pasadena, CA 91125. Please indicate the position number that you are applying for. Caltech is an affirmative action, equal opportunity employer. Women, minorities, veterans, and disabled persons are encouraged to apply. From steve_kemp at unc.edu Mon May 21 23:38:10 2001 From: steve_kemp at unc.edu (Steven M. Kemp) Date: Mon, 21 May 2001 23:38:10 -0400 Subject: paper available: Situational Descriptions of Behavioral Procedures Message-ID: Dear Colleagues: The following paper on evaluating neural networks and other computational models of learning against laboratory data, entitled "Situational Descriptions of Behavioral Procedures" is available at: http://www.unc.edu/~skemp/documents/situate/InSitu/KEMP-75-135.PDF This paper appears in the forthcoming issue of the Journal of the Experimental Analysis of Behavior (JEAB). 
For those preferring hard copies, they should be available in a couple of months. Contact me via email. Best regards, steve p.s. If you have any trouble accessing, reading or printing this file, just drop me a line. If the problem is that you don't have Adobe Acrobat, you can get it (for free) here: http://www.adobe.com/prodindex/acrobat/readstep.html ----------------------------------------------------------------------- Situational Descriptions of Behavioral Procedures: The In Situ Testbed Steven M. Kemp and David A. Eckerman, Journal of the Experimental Analysis of Behavior (2001), vol. 75, pp. 135-164. Abstract We demonstrate In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement/extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. Key words: neural networks, reinforcement schedules, situated action, cumulative records, learning theory, Mechner diagrams, extinction, key peck, computer simulation, Markov decision process, POMDP. -- Steve Kemp [apologies if you receive multiple copies of this message] >>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<< Steven M. 
Kemp | Department of Psychology | email: steve_kemp at unc.edu Davie Hall, CB# 3270 | University of North Carolina | Chapel Hill, NC 27599-3270 | fax: (919) 962-2537 Visit our WebSite at: http://www.unc.edu/~skemp/ >>>>>>>>>>>>>>>>>>>>> <<<<<<<<<<<<<<<<<<<<<<<< The laws of mind [are] themselves of so fluid a character as to simulate divergences from law. -- C. S. Peirce (Collected Papers, 6.101). From sami.kaski at hut.fi Tue May 22 09:22:46 2001 From: sami.kaski at hut.fi (Sami Kaski) Date: 22 May 2001 16:22:46 +0300 Subject: Papers on learning metrics Message-ID: Dear connectionists, There are papers on learning metrics available at http://www.cis.hut.fi/projects/mi/ The methods learn, based on auxiliary data, to measure distances along relevant or important local directions in a data space. The approach has connections to discriminative learning, distributional clustering, information geometry, and maximization of mutual information. So far we have incorporated the metrics into a clustering algorithm and the SOM, and applied the methods to the analysis of gene expression data, text documents, and financial statements of companies. Best regards, Samuel Kaski ----- Abstracts of two papers: (1) Samuel Kaski, Janne Sinkkonen, and Jaakko Peltonen. Bankruptcy analysis with self-organizing maps in learning metrics. IEEE Transactions on Neural Networks, 2001. Accepted for publication. We introduce a method for deriving a metric, locally based on the Fisher information matrix, into the data space. A Self-Organizing Map is computed in the new metric to explore financial statements of enterprises. The metric measures local distances in terms of changes in the distribution of an auxiliary random variable that reflects what is important in the data. In this paper the variable indicates bankruptcy within the next few years. 
The conditional density of the auxiliary variable is first estimated, and the change in the estimate resulting from local displacements in the primary data space is measured using the Fisher information matrix. When a Self-Organizing Map is computed in the new metric it still visualizes the data space in a topology-preserving fashion, but represents the (local) directions in which the probability of bankruptcy changes the most. (2) Janne Sinkkonen and Samuel Kaski. Clustering based on conditional distributions in an auxiliary space. Neural Computation, 2001. Accepted for publication. We study the problem of learning groups or categories that are local in the continuous primary space, but homogeneous by the distributions of an associated auxiliary random variable over a discrete auxiliary space. Assuming variation in the auxiliary space is meaningful, categories will emphasize similarly meaningful aspects of the primary space. From a data set consisting of pairs of primary and auxiliary items, the categories are learned by minimizing a Kullback-Leibler divergence-based distortion between (implicitly estimated) distributions of the auxiliary data, conditioned on the primary data. Still, the categories are defined in terms of the primary space. An on-line algorithm resembling the traditional Hebb-type competitive learning is introduced for learning the categories. Minimizing the distortion criterion turns out to be equivalent to maximizing the mutual information between the categories and the auxiliary data. In addition, connections to density estimation and to the distributional clustering paradigm are outlined. The method is demonstrated by clustering yeast gene expression data from DNA chips, with biological knowledge about the functional classes of the genes as the auxiliary data. 
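[Editorial aside] The learning-metrics idea in the two abstracts above — local distances weighted by the Fisher information of the conditional distribution of an auxiliary variable — can be made concrete with a logistic toy model. The model `p(c=1|x) = sigmoid(w.x)`, the weight vector `w`, and the displacements below are hypothetical choices of mine standing in for the estimated conditional density, not the estimators used in the papers:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Hypothetical auxiliary model: p(c=1 | x) = sigmoid(w . x), a stand-in
# for the estimated conditional density of the auxiliary variable.
w = np.array([2.0, 0.5])

def fisher_metric(x):
    """Fisher information matrix of p(c|x) with respect to x for the
    logistic model: J(x) = p(1-p) w w^T."""
    p = sigmoid(w @ x)
    return p * (1 - p) * np.outer(w, w)

def local_distance(x, dx):
    """Squared length of a small displacement dx in the learning metric:
    dx^T J(x) dx."""
    return dx @ fisher_metric(x) @ dx

x = np.array([0.0, 0.0])
along_w  = local_distance(x, np.array([0.1, 0.025]))   # changes p(c|x)
across_w = local_distance(x, np.array([-0.025, 0.1]))  # leaves p(c|x) unchanged
```

Displacements along `w` change the predicted auxiliary distribution and so are "long" in the learning metric, while displacements orthogonal to `w` leave `p(c|x)` untouched and have (near-)zero length — exactly the behaviour the abstracts describe: distances are measured in terms of changes in the auxiliary distribution.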
From swatanab at pi.titech.ac.jp Wed May 23 00:26:12 2001 From: swatanab at pi.titech.ac.jp (Sumio Watanabe) Date: Wed, 23 May 2001 13:26:12 +0900 Subject: Geometry and Statistics in NN Learning Theory Message-ID: <000701c0e340$7fdbd7a0$988a7083@titech42lg8r0u> Dear Connectionists, We are very glad to announce that we have a special session, "Geometry and Statistics in Neural Network Learning Theory" http://watanabe-www.pi.titech.ac.jp/~swatanab/kes2001.html in the International Conference KES'2001, which will be held in Osaka and Nara in Japan, 6th - 8th, September, 2001. http://www.bton.ac.uk/kes/kes2001/ In our session, we study the statistical problem caused by the non-identifiability of layered learning machines. Information : * Date: September, 8th (Saturday), 2001, 14:40-16:45. * Place: Nara New Public Hall, Nara City, Japan. * Schedule: The time for each presentation is 25 minutes. * (Remark) Before this session, Professor Amari gives an invited talk, 13:40-14:40. ********** The authors and papers: You can get these papers from the site, http://watanabe-www.pi.titech.ac.jp/~swatanab/kes2001.html (1) S. Amari, T. Ozeki, and H. Park (RIKEN BSI) "Singularities in Learning Models: Gaussian Random Field Approach." (2) K. Fukumizu (ISM) "Asymptotic Theory of Locally Conic Models and its Application to Multilayer Neural Networks." (3) K. Hagiwara (Mie Univ.) "On the training error and generalization error of neural network regression without identifiability." (4) T. Hayasaka, M. Kitahara, K. Hagiwara, N. Toda, and S. Usui (TUT) "On the Asymptotic Distribution of the Least Squares Estimators for Non-identifiable Models." (5) S. Watanabe (TIT) "Bayes and Gibbs Estimations, Empirical Processes, and Resolution of Singularities." ********** A Short Introduction: [Why Non-identifiability ?] A parametric model in statistics is called identifiable if the mapping from the parameter to the probability distribution is one-to-one.
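[Editorial aside] A minimal numerical instance of this definition (my own sketch, not from the session papers): for a three-layer tanh network, permuting the hidden units gives a different parameter vector realizing exactly the same input-output map, and when an output weight is zero the corresponding input weights are completely free — so the set of "true" parameters is a whole subset of parameter space rather than a single point:

```python
import numpy as np

def mlp(x, W1, b1, w2):
    """Three-layer perceptron: tanh hidden layer, linear output."""
    return np.tanh(x @ W1 + b1) @ w2

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 3))
b1 = rng.normal(size=3)
w2 = rng.normal(size=3)
x = rng.normal(size=(100, 2))   # probe inputs

# 1) Permuting hidden units: a different parameter, the same function.
perm = [2, 0, 1]
f_orig = mlp(x, W1, b1, w2)
f_perm = mlp(x, W1[:, perm], b1[perm], w2[perm])

# 2) Redundancy: with an output weight set to zero, the corresponding
#    hidden unit's input weights can be anything without changing the
#    function -- the "true" parameter set is not a point.
w2_red = w2.copy(); w2_red[0] = 0.0
W1_b = W1.copy();   W1_b[:, 0] = 99.0   # arbitrary change to a dead unit
f_a = mlp(x, W1, b1, w2_red)
f_b = mlp(x, W1_b, b1, w2_red)
```

Both `f_perm == f_orig` and `f_a == f_b` hold on all probe inputs, even though the parameters differ — the parameter-to-function map is many-to-one.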
A lot of learning machines used in information processing, such as artificial neural networks, normal mixtures, and Boltzmann machines, are not identifiable. We do not yet have a mathematical and statistical foundation on which we can research such models. [Singularities and Asymptotics ] If a non-identifiable model is redundant compared with the true distribution, then the set of true parameters is an analytic set with complex singularities, and the rank of the Fisher information matrix depends on the parameter. The behaviors of the training and generalization errors of layered learning machines are quite different from those of regular statistical models. It should be emphasized that we cannot apply the standard asymptotic methods constructed by Fisher, Cramer, and Rao to these models. Nor can we use AIC, MDL, or BIC in statistical model selection for the design of artificial neural networks. [Geometry and Statistics ] The purpose of this special session is to study and discuss the geometrical and statistical methodology by which non-identifiable learning machines can be analyzed. Note that conic singularities are given by blowing-downs, and normal crossing singularities are found by blowing-ups. These algebraic geometrical methods take us to the statistical concepts of the order statistic and the empirical process. We find that a new perspective in geometry and statistics is opened. [Results which will be reported] (1) Professor Amari et al. clarify the generalization and training errors of learning models with conic singularities in both the maximum likelihood method and the Bayesian method using the Gaussian random field approach. (2) Dr. Fukumizu proves that a three-layered neural network can be understood as a locally conic model, and that the asymptotic likelihood ratio is in proportion to (log n), where n is the number of training samples. (3) Dr.
Hagiwara shows that the training and generalization errors of radial basis functions with Gaussian units are in proportion to (log n), based on the assumption that the inputs are fixed. (4) Dr. Hayasaka et al. claim that the training error of the three-layer perceptron is closely related to the expectation value of the order statistic. (5) Lastly, Dr. Watanabe studies the Bayes and Gibbs estimations for the case of statistical models with normal crossing singularities, and shows that all general cases reduce to this case by the resolution theorem. We expect that mathematicians, statisticians, information scientists, and theoretical physicists will be interested in this topic. ********** Thank you very much for your interest in our special session. For questions or comments, please send an e-mail to Dr. Sumio Watanabe, P&I Lab., Tokyo Institute of Technology. E-mail: swatanab at pi.titech.ac.jp http://watanabe-www.pi.titech.ac.jp/~swatanab/index.html [Postal Mail] 4259 Nagatsuta, Midori-ku, Yokohama, 226-8503 Japan. From icann at ai.univie.ac.at Wed May 23 12:24:10 2001 From: icann at ai.univie.ac.at (ICANN 2001 conference) Date: Wed, 23 May 2001 18:24:10 +0200 Subject: ICANN 2001: Call for Participation Message-ID: <3B0BE42A.71B615F8@ai.univie.ac.at> Call for Participation ============================================================== ICANN 2001 International Conference on Artificial Neural Networks Aug. 21-25, 2001 Vienna, Austria http://www.ai.univie.ac.at/icann the annual conference of the European Neural Network Society ============================================================== Deadline for early registration fees: June 15, 2001 Invited Speakers: ================= Eric Baum, NEC Research Institute Vladimir S. Cherkassky, Univ. of Minnesota Stephen Grossberg, Boston Univ. Wolfgang Maass, Graz Univ. of Technology Kim Plunkett, Oxford Univ. Stephen Roberts, Oxford Univ. Alessandro Sperduti, Univ. of Pisa Florentin Woergoetter, Univ.
of Stirling Program (Aug 22-24): ==================== 72 oral presentations and around 100 posters on the following topics: - Data Analysis and Pattern Recognition (Algorithms, Theory, Hardware, Applications) - Support Vector Machines, Kernel Methods - Independent Component Analysis - Topographic Mapping - Time Series and Signal Processing - Agent-based Economic Modeling (special session) - Computational Neuroscience - Vision and Image Processing - Robotics and Control - Selforganization and Dynamical Systems - Connectionist Cognitive Science Tutorials (Aug 21): =================== - Bioinformatics - The Machine Learning Approach Pierre Baldi - Predictive Learning and Modelling Financial Markets Vladimir Cherkassky - Extraction of Knowledge from Data using Computational Intelligence Methods Wlodek Duch - Support Vector Machines Alex Smola - Sequential Learning of Nonlinear Models Mahesan Niranjan - Identification and Forecasting of Dynamical Systems Hans Georg Zimmermann - Neuroscience for Engineers and Computer Scientists Peter Erdi - Independent Component Analysis Aapo Hyvärinen Workshops (Aug 25): =================== - Advances in EEG Analysis B. Blankertz, A. Flexer, J. Kohlmorgen, K.R. Müller, S. Roberts, P. Sykacek - Processing Temporal Patterns with Recurrent Networks D. Eck, J. Schmidhuber - Kernel and Subspace Methods for Computer Vision A. Leonardis, H. Bischof - Advances toward Lifelike Perception Systems L. Smith Program chairs: =============== Georg Dorffner (general chair) Horst Bischof Kurt Hornik _______________________________________________________ Please see our web page for more details and for online registration.
http://www.ai.univie.ac.at/icann From ckiw at dai.ed.ac.uk Fri May 25 06:53:45 2001 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Fri, 25 May 2001 11:53:45 +0100 (BST) Subject: PhD opportunities at the University of Edinburgh, UK Message-ID: PhD opportunities at the Institute for Adaptive and Neural Computation, University of Edinburgh, UK The Institute for Adaptive and Neural Computation (ANC, http://anc.ed.ac.uk) is part of the Division of Informatics at the University of Edinburgh. The Institute fosters the study of adaptive processes in both artificial and biological systems. It encourages interdisciplinary and collaborative work involving the traditional disciplines of neuroscience, cognitive science, computer science, computational science, mathematics and statistics. Many of the information-processing tasks under study draw on a common set of principles and mathematical techniques for their solution. Combined study of the adaptive nature of artificial and biological systems facilitates the many benefits accruing from treating essentially the same problem from different perspectives. A principal theme is the study of artificial learning systems. This includes theoretical foundations (e.g. statistical theory, information theory), the development of new models and algorithms, and applications. A second principal theme is the analysis and modelling of brain processes at all levels of organization with a particular focus on theoretical developments which span levels. Within this theme, research areas are broadly defined as the study of the neural foundations of perception, cognition and action and their underlying developmental processes. A secondary theme is the construction and study of computational tools and methods which can support studies in the two principal themes, such as in the analysis of brain data, simulation of networks and parallel data mining. Currently we have PhD studentships available as from 1 October 2001. 
These are supported by the Medical Research Council (MRC) and by the Biotechnology and Biological Sciences Research Council (BBSRC). In addition, the Division of Informatics receives a number of EPSRC studentships for which students wishing to study within the Institute for Adaptive and Neural Computation will be considered. PLEASE NOTE: Full funding under these studentships is only available to persons who satisfy a UK residence requirement (see www.epsrc.ac.uk/Documents/Guides/Students/Annex1.htm for more details). Under these studentships funding of university fees (but not maintenance) is available for EU nationals. To qualify for funding candidates must also have (or expect to receive) a good honours degree (1st or upper second class) (or equivalent). APPLICATION PROCEDURE: Formal applications should be made using the University of Edinburgh Postgraduate Application Form available via http://www.informatics.ed.ac.uk/prospectus/graduate/research.html and should be sent to the Faculty of Science and Engineering office. We wish to award these studentships as soon as possible, therefore applications should be received by June 15. Informal enquiries should be made to the contacts given below. * MRC Ph.D. studentship in Neuroinformatics and Functional MRI (see http://www.anc.ed.ac.uk/CFIS/hiring/MRC-PhD2.html for more details). Includes realtime methods in functional MRI, reproducibility of functional MRI brain imaging, and Bayesian Methods for the analysis of fMRI data. Contact: Dr. Nigel Goddard, Nigel.Goddard at ed.ac.uk * BBSRC studentship in the analysis of DNA microarray data (see http://www.bioss.ac.uk/student/newphdcag3.html for more details) Issues include: image analysis, to reduce noise and extract spot intensities; identification of differential gene expression between pairs of samples on a single microarray; exploratory graphical methods for analysing sets of arrays; and Bayesian networks to describe gene interactions. In collaboration with Dr. 
Chris Glasbey (Biomathematics & Statistics Scotland), c.glasbey at bioss.ac.uk. Applicants should have, or shortly expect to obtain, a first or upper second class degree in mathematics, statistics, physics, informatics, mathematical biology, or a related subject. Please send a CV and names of three academic referees to: Chris Glasbey Biomathematics and Statistics Scotland JCMB, King's Buildings Edinburgh EH9 3JZ, Scotland email: c.glasbey at bioss.ac.uk Tel: (44) +131 650 4899 Fax: (44) +131 650 4901 * The EPSRC studentships are not specifically targeted, and can potentially support work in all areas that ANC works in. These include: theoretical and practical issues in machine learning and probabilistic modelling (including application areas such as astronomical data mining, analysis of proteomics data, condition monitoring of premature babies, etc.); developing computational and mathematical models for the analysis of particular neural systems, in particular (i) models for the functioning of the basal ganglia (ii) models for the growth of optic projections in three-dimensional space; study of human cognitive processes, particularly language-related, using computational modeling and/or brain imaging approaches; software architectures and computational methods for neuroscience and cognitive science, including simulation, visualisation, and databases; connectionist cognitive modelling and cognitive modelling based on large language corpora, applied to modelling normal and impaired visual word recognition and spoken language processing. Informal enquiries may be made to Fiona Jamieson, fiona at anc.ed.ac.uk From piuri at elet.polimi.it Sun May 27 12:26:31 2001 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Sun, 27 May 2001 18:26:31 +0200 Subject: NIMIA 2001 and LFTNC 2001: two great opportunities for PhD students and researchers in the neural areas! do not miss them!!!
Message-ID: <5.0.2.1.0.20010527182221.02e13230@morgana.elet.polimi.it> Dear Colleague, Do not miss the opportunity to come to Italy and attend the following two international meetings!!!!!! - the NATO ASI NIMIA 2001 - NATO Advanced Study Institute on Neural Networks for Instrumentation, Measurement and Related Industrial Applications, to be held on 9-20 October 2001, in Crema, Italy. - the NATO ARW LFTNC 2001 - NATO Advanced Research Workshop on Limitations and Future Trends of Neural Computation, to be held on 22-24 October 2001, in Siena, Italy. Please forward this announcement to everybody you feel could be interested in attending the meetings, especially to people working in application areas! Since attendance has to be approved by NATO, applications to attend should be submitted by 15 JUNE 2001. Detailed information and the application forms are available at http://www.ims.unico.it/2001/ or at the mirror site at http://www.ewh.ieee.org/soc/im/2001/ You are allowed to withdraw your application at any time. Submitting earlier will give us more time to look for possible additional funding if the grants now available are not sufficient to cover all attendees. Best regards Vincenzo Piuri & Marco Gori Vincenzo Piuri University of Milan, Department of Information Technologies via Bramante 65, 26013 Crema (CR), Italy phone: +39-0373-898-242 secretary: +39-0373-898-249 fax: +39-0373-898-253 email: piuri at elet.polimi.it secondary address: Politecnico di Milano, Department of Electronics and Information piazza L.
da Vinci 32, 20133 Milano, Italy phone: +39-02-2399-3606 secretary: +39-02-2399-3623 fax: +39-02-2399-3411 email: piuri at elet.polimi.it From melchioc at csr.nih.gov Tue May 29 15:57:14 2001 From: melchioc at csr.nih.gov (Melchior, Christine (CSR)) Date: Tue, 29 May 2001 15:57:14 -0400 Subject: neuroscience job available Message-ID: SCIENTIFIC REVIEW ADMINISTRATOR (SRA) POSITION: The Center for Scientific Review (CSR), National Institutes of Health, seeks a neuroscientist with expertise in cognitive function who is interested in serving as an SRA. An SRA manages committees composed of leading scientists in their respective fields who meet to judge the scientific merit of research grant applications. Applicants must have earned the Ph.D. or M.D. (or have equivalent experience). It is crucial to have a record of independent research accomplishment, typically requiring several years beyond the doctoral degree. Salary is commensurate with experience. A recruitment or relocation bonus may be available. Submit curriculum vitae to: Christine Melchior, Ph.D., Chief, IFCN IRG, Center for Scientific Review, NIH, 6701 Rockledge Drive, Room 5176, MSC 7844, Bethesda, MD 20892-7844. E-mail: melchioc at csr.nih.gov NIH is an Equal Opportunity Employer. From giro-ci0 at wpmail.paisley.ac.uk Thu May 31 04:15:10 2001 From: giro-ci0 at wpmail.paisley.ac.uk (Mark Girolami) Date: Thu, 31 May 2001 09:15:10 +0100 Subject: Papers Now Available Message-ID: Dear Connectionists, The following papers are now available for download from http://cis.paisley.ac.uk/giro-ci0/ 1) Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem Mark Girolami To Appear : Neural Computation Abstract Kernel principal component analysis has been introduced as a method of extracting a set of orthonormal nonlinear features from multi-variate data and many impressive applications are being reported within the literature. 
This paper presents the view that the eigenvalue decomposition of a kernel matrix can also provide the discrete expansion coefficients required for a non-parametric orthogonal series density estimator. In addition to offering novel insights into non-parametric density estimation, this paper provides an intuitively appealing interpretation for the nonlinear features extracted from data using kernel principal component analysis. 2) A Variational Method for Learning Sparse and Overcomplete Representations. Mark Girolami To Appear : Neural Computation Abstract An expectation maximisation algorithm for learning sparse and overcomplete data representations is presented. The proposed algorithm exploits a variational approximation to a range of heavy-tailed distributions whose limit is the Laplacian. A rigorous lower bound on the sparse prior distribution is derived which enables the analytic marginalisation of a lower bound on the data likelihood. This lower bound enables the development of an expectation maximisation algorithm for learning the overcomplete basis vectors and inferring the most probable basis coefficients. 3) Mercer Kernel Based Clustering in Feature Space Mark Girolami To Appear : IEEE Transactions on Neural Networks Abstract This paper presents a method for both the unsupervised partitioning of a sample of data and the estimation of the possible number of inherent clusters which generate the data. This work exploits the notion that performing a nonlinear data transformation into some high-dimensional feature space increases the probability of the linear separability of the patterns within the transformed space and therefore simplifies the associated data structure. It is shown that the eigenvectors of the kernel matrix which defines the implicit mapping provide a means to estimate the number of clusters inherent within the data, and a computationally simple iterative procedure is presented for the subsequent feature space partitioning of the data.
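The kernel-eigenvector idea in the last abstract can be illustrated with a toy numerical sketch (this is an illustration of the general principle, not code from the paper): for well-separated clusters, an RBF Gram matrix is nearly block diagonal, so the number of dominant eigenvalues approximates the number of clusters.

```python
import numpy as np

def rbf_kernel_matrix(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

# Two well-separated toy clusters in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(3.0, 0.1, (20, 2))])

K = rbf_kernel_matrix(X, sigma=0.5)
eigvals = np.linalg.eigh(K)[0][::-1]   # eigenvalues, descending

# With well-separated clusters K is nearly block diagonal, giving
# roughly one dominant eigenvalue per cluster.
dominant = int(np.sum(eigvals > 0.1 * eigvals[0]))
print(dominant)
```

With the two clusters above, the count of dominant eigenvalues comes out as 2, matching the number of generating clusters.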