From R.J.Howlett at bton.ac.uk Mon Nov 1 11:59:10 1999
From: R.J.Howlett at bton.ac.uk (Dr R.J.Howlett)
Date: Mon, 1 Nov 1999 16:59:10 +0000 (GMT)
Subject: Invitation to authors
Message-ID:

Apologies if you receive multiple copies or if this is unwanted.

== Authors required - Invitation to contribute chapters ==

A new book with the title "Radial Basis Function Neural Networks: Design and Applications" has been commissioned by the publishers Springer Verlag. The editors are Dr R.J.Howlett, University of Brighton, UK, and Prof L.C.Jain, University of South Australia. The series editor is Prof J.Kacprzyk, Polish Academy of Sciences.

Five chapters have been contributed:
 - Introduction to RBF networks
 - Training Algorithms for Robust RBF Neural Networks
 - Hierarchical Radial Basis Neural Networks
 - Biomedical Applications of Radial Basis Function Networks
 - Servocontroller Applications using Radial Basis Function Networks

Proposals are invited for additional chapters in areas of RBF network architectures, clustering/training algorithms, applications, practical experience of use, or other relevant areas.
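For readers unfamiliar with the volume's topic, the basic architecture can be sketched in a few lines (an illustrative sketch only, not taken from the book): an RBF network computes a weighted sum of Gaussian bumps, and with fixed centres the output weights reduce to a linear least-squares problem. The centre positions, the width sigma, and the sine target below are arbitrary illustrative choices.

```python
import math

def rbf_design(xs, centers, sigma):
    """Design matrix of Gaussian radial basis functions."""
    return [[math.exp(-(x - c) ** 2 / (2 * sigma ** 2)) for c in centers] for x in xs]

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

# Fit sin(2*pi*x) on [0,1] with 10 fixed, evenly spaced Gaussian centres.
centers = [i / 9 for i in range(10)]
sigma = 0.15
xs = [i / 49 for i in range(50)]
ys = [math.sin(2 * math.pi * x) for x in xs]
Phi = rbf_design(xs, centers, sigma)

# Normal equations for the output weights: (Phi^T Phi) w = Phi^T y
PtP = [[sum(Phi[r][i] * Phi[r][j] for r in range(len(xs))) for j in range(10)] for i in range(10)]
Pty = [sum(Phi[r][i] * ys[r] for r in range(len(xs))) for i in range(10)]
w = solve(PtP, Pty)

def pred(x):
    return sum(wj * math.exp(-(x - c) ** 2 / (2 * sigma ** 2)) for wj, c in zip(w, centers))
```

With fixed centres only the linear output layer is trained; choosing the centres (e.g. by clustering) is the harder design problem and is part of what the book's chapters address.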
===============================================================
Dr R.J.Howlett
Head of Intelligent Signal Processing Labs, UK
Head of TCAR
---------------------------------------------------------------
Engineering Research Centre
School of Engineering
University of Brighton
Moulsecoomb, Brighton BN2 4GJ, UNITED KINGDOM
Tel: +44 1273 642300  Fax: +44 1273 642301
Email: r.j.howlett at brighton.ac.uk
---------------------------------------------------------------
Transfrontier Centre for Automotive Research (TCAR) Web Site:
http://www.eng.brighton.ac.uk/eee/research/tcar/
Engineering Research Centre Web Site:
http://www.eng.brighton.ac.uk/eee/research/
4th Int Conf on Knowledge-Based Intelligent Engineering Systems,
30 Aug-1 Sept 2000, Brighton, General Chair R.J.Howlett
http://www.eng.brighton.ac.uk/eee/research/KES2000
===============================================================

From fmdist at hotmail.com Mon Nov 1 16:36:42 1999
From: fmdist at hotmail.com (Fionn Murtagh)
Date: Mon, 01 Nov 1999 13:36:42 PST
Subject: Research Assistant position
Message-ID: <19991101213646.70513.qmail@hotmail.com>

One-year Research Assistant position, PhD or near completion, School of Computer Science, The Queen's University of Belfast. Closing date November 12, 1999.

- Neural networks for modeling and prediction in environmental and financial data analysis,
- And other applications including distributed information space searching and navigation; compression and classification; image and signal processing.

Good publications will be of major benefit.
Further information: Prof F Murtagh, f.murtagh at qub.ac.uk

______________________________________________________
Get Your Private, Free Email at http://www.hotmail.com

From elman at crl.ucsd.edu Mon Nov 1 18:21:19 1999
From: elman at crl.ucsd.edu (Jeff Elman)
Date: Mon, 1 Nov 1999 15:21:19 -0800
Subject: Center for Research in Language (UCSD) postdocs: 2000/2001
Message-ID: <199911012321.PAA02112@crl.ucsd.edu>

THE CENTER FOR RESEARCH IN LANGUAGE
UNIVERSITY OF CALIFORNIA, SAN DIEGO

ANNOUNCEMENT OF POSTDOCTORAL FELLOWSHIPS FOR 2000-2001

Applications are invited for postdoctoral fellowships in Language, Communication and Brain at the Center for Research in Language at the University of California, San Diego. The fellowships are supported by the National Institutes of Health (NIDCD), and provide an annual stipend ranging from $26,000 to $41,000 depending upon years of postdoctoral experience. In addition, some funding is provided for medical insurance and travel.

The program provides interdisciplinary training in: (1) psycholinguistics, including language processing in adults and language development in children; (2) communication disorders, including childhood language disorders and adult aphasia; (3) electrophysiological studies of language; and (4) neural network models of language learning and processing. Candidates are expected to work in at least one of these four areas, and preference will be given to candidates with background and interests involving more than one area.

Grant conditions require that candidates be citizens or permanent residents of the U.S. In addition, trainees will incur a payback obligation during their first year of postdoctoral NRSA support and are required to complete a Payback Agreement.*

Applications must be RECEIVED by FEBRUARY 1.
Applicants should send a cover page with requested information (attached), a statement of interest, three letters of recommendation, a curriculum vitae and copies of relevant publications to:

CRL POSTDOCTORAL FELLOWSHIP COORDINATOR
Center for Research in Language 0526
University of California, San Diego
9500 Gilman Drive
La Jolla, California 92093-0526
(619) 534-2536

Women and minority candidates are encouraged to apply.

Program Requirements for post-doctoral candidates:
(1) Postdoctoral fellows will elect one of the four research components as their major area, defined by the fellow's primary laboratory affiliation across all years in residence.
(2) Fellows are expected to attend weekly laboratory meetings within the major area.
(3) In addition, post-doctoral fellows will carry out a 3-6 month rotation in a laboratory associated with a second component of the training program, including attendance at weekly research meetings.

Access more information via our website: http://www.crl.ucsd.edu/fellowships/postdoc_fellow.html

*Payback for post-docs can be discharged in the following ways: (1) by receiving an equal period of postdoctoral NRSA support beginning in the 13th month of such postdoctoral NRSA support; (2) by engaging in an equal period of health-related research or research training that averages more than 20 hours per week of a full work year; (3) by engaging in an equal period of health-related teaching that averages more than 20 hours per week of a full work year.
STATEMENT OF INTEREST FOR CRL POSTDOCTORAL FELLOWSHIP 2000-2001
Deadline: 2/1/2000

Applicant Name:
Applicant Address:
Applicant e-mail:
Applicant Phone Number:
Research Interests (from fellowship announcement):
Title of PhD Thesis:
Institution/Year PhD granted:
Citizenship Status:

From amari at brain.riken.go.jp Tue Nov 2 03:20:55 1999
From: amari at brain.riken.go.jp (Shunichi Amari)
Date: Tue, 02 Nov 1999 17:20:55 +0900
Subject: multiway interactions of firing neurons
Message-ID: <19991102172055D.amari@brain.riken.go.jp>

Announcement of a new paper by Amari:

The following paper, entitled "Information Geometry on Hierarchical Decomposition of Stochastic Interactions", has been submitted to IEEE Trans. IT. The paper gives a method of orthogonal decomposition of higher-order or multi-way interactions of random variables into the sum of those of lower interactions. Information geometry gives a good solution to this problem. The theory can be applied to the decomposition of interactions among an ensemble of firing neurons. I obtained the results more than ten years ago, having given a talk at a meeting of the Mathematical Society of Japan, but could not find sufficient time to write them up in paper form until now.
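As a rough sketch of the announced result (an editorial paraphrase, hedged; the precise statement is in the paper): for $n$ binary variables the log-probability expands in a hierarchy of interaction terms,

```latex
\log p(x_1,\dots,x_n) \;=\; \sum_i \theta_i x_i \;+\; \sum_{i<j} \theta_{ij} x_i x_j
\;+\; \cdots \;+\; \theta_{12\cdots n}\, x_1 x_2 \cdots x_n \;-\; \psi .
```

Writing $p_k$ for the projection of $p$ onto the family $E_k$ of distributions with interactions of order at most $k$ (the maximum-entropy distribution preserving the $k$-th order marginals, with $p_n = p$), the Pythagorean relation of information geometry gives the orthogonal decomposition

```latex
D(p \,\|\, p_1) \;=\; \sum_{k=1}^{n-1} D(p_{k+1} \,\|\, p_k),
```

where each term $D(p_{k+1} \,\|\, p_k)$ measures the amount of pure $(k{+}1)$-way interaction, e.g. the correlation among firing neurons not explained by lower-order structure.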
The paper can be read at http://www.islab.brain.riken.go.jp/~amari/pub_j.html or
http://www.islab.brain.riken.go.jp/~amari/pub/IGHI.ps.gz (gzipped ps file)
http://www.islab.brain.riken.go.jp/~amari/pub/IGHI.pdf (pdf file)

*********************
Shun-ichi Amari
RIKEN Brain Science Institute
Hirosawa 2-1, Wako-shi, Saitama 351-0198, Japan
Director of Brain-Style Information Systems Research Group
Laboratory for Information Synthesis, Head
tel: +81-(0)48-467-9669
fax: +81-(0)48-467-9687
e-mail: amari at brain.riken.go.jp
home page: http://www.bsis.brain.riken.go.jp/

From alain at fmed.ulaval.ca Tue Nov 2 14:57:40 1999
From: alain at fmed.ulaval.ca (Alain Destexhe)
Date: Tue, 2 Nov 99 14:57:40 EST
Subject: postdoc position in computational neuroscience
Message-ID: <9911021957.AA25869@thalamus.fmed.ulaval.ca>

POSTDOC POSITION AVAILABLE

A postdoc position is available for a computational study of neocortical pyramidal neurons in vivo. This project will be conducted in collaboration between three laboratories: A. Destexhe (Laval University, Canada) for the computational part, and D. Pare (Laval University) and Y. Fregnac (CNRS, Gif-sur-Yvette, France) for the experimental part.

The candidate will have access to intracellular data from neocortical neurons in vivo, obtained in the two aforementioned labs. The project is primarily modeling, but participation in experiments is possible (to be discussed depending on the interests of the candidate). The project will consist of reconstructing the morphology of intracellularly-recorded neurons using a Neurolucida system (available at Laval University). The cellular morphologies will be incorporated in the NEURON simulator to design biophysical models that will be matched precisely to the intracellular recordings.
Because the models and experimental data correspond to the same cellular morphologies, this method will allow us to characterize various aspects of synaptic activity in vivo and to estimate its consequences for dendritic integration. The candidate should have experience in computational modeling and a sufficient knowledge of electrophysiology. The position is funded by an NIH grant, is available immediately, and runs for a period of 3 years. Candidates should contact Alain Destexhe for more details.

--
Alain Destexhe
Department of Physiology
Laval University
Quebec G1K 7P4, Canada
Tel: (418) 656 5711
Fax: (418) 656 7898
email: alain at fmed.ulaval.ca
http://cns.fmed.ulaval.ca

From wahba at stat.wisc.edu Tue Nov 2 21:41:03 1999
From: wahba at stat.wisc.edu (Grace Wahba)
Date: Tue, 2 Nov 1999 20:41:03 -0600 (CST)
Subject: Correlated Bernoulli observations
Message-ID: <199911030241.UAA31842@hera.stat.wisc.edu>

TR re multivariate Bernoulli observations, available via http://www.stat.wisc.edu/~wahba -> TRLIST

Smoothing Spline ANOVA for Multivariate Bernoulli Observations, With Application to Ophthalmology Data
Fangyu Gao, Grace Wahba, Ronald Klein, MD and Barbara Klein, MD
UW-Madison Statistics Dept TR 1009, July 15, 1999, submitted

We combine a Smoothing Spline ANOVA model and a log-linear model to build a partly flexible model for multivariate correlated Bernoulli response data, where the joint distribution of the components of the Bernoulli response vector may depend on a complex set of predictor variables. The joint distribution conditional on the predictor variables is estimated via an SS-ANOVA variational problem. The log odds ratio is used to measure the association between outcome variables. A numerical scheme based on the block one-step SOR-Newton-Raphson algorithm is proposed to obtain an approximate solution to the variational problem. We extend $GACV$ (Generalized Approximate Cross Validation) to the case of multivariate Bernoulli responses.
Its randomized version is fast and stable to compute, and is used to adaptively select smoothing parameters in each block one-step SOR iteration. Approximate Bayesian confidence intervals are obtained for the flexible estimates of the conditional logit functions. Simulation studies are conducted to check the performance of the proposed method. Finally, the model is applied to two-eye observational data from the Beaver Dam Eye Study to examine the association of pigmentary abnormalities and various covariates. The results are applicable to a variety of problems where the response of interest is a vector of 0's and 1's that exhibits pairwise and higher-order correlations.

From yilin at stat.wisc.edu Tue Nov 2 22:01:47 1999
From: yilin at stat.wisc.edu (Yi Lin)
Date: Tue, 2 Nov 1999 21:01:47 -0600 (CST)
Subject: No subject
Message-ID: <199911030301.VAA10802@hyg.stat.wisc.edu>

TR "Support Vector Machines and the Bayes Rule in Classification", available via http://www.stat.wisc.edu/~yilin

Support Vector Machines and the Bayes Rule in Classification
by Yi Lin
UW-Madison Statistics Department TR 1014, November 1, 1999

The Bayes rule is the optimal classification rule if the underlying distribution of the data is known. In practice we do not know the underlying distribution, and need to ``learn'' classification rules from the data. One way to derive classification rules in practice is to implement the Bayes rule approximately by estimating an appropriate classification function. Traditional statistical methods use the estimated log odds ratio as the classification function. Support vector machines (SVMs) are one type of large-margin classifier, and the relationship between SVMs and the Bayes rule was not previously clear. In this paper, it is shown that SVMs implement the Bayes rule approximately by targeting certain interesting classification functions.
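The claimed correspondence can be illustrated with a small simulation (an editorial sketch, not from the TR): for two unit-variance Gaussian classes centred at -1 and +1 with equal priors, the Bayes rule is sign(x) with its boundary at 0, and a linear classifier trained on the hinge loss recovers nearly the same decision boundary. The learning rate, penalty, and class geometry below are arbitrary illustrative choices.

```python
import random

random.seed(0)
# Two classes: x | y=-1 ~ N(-1,1), x | y=+1 ~ N(+1,1), equal priors.
# The Bayes-optimal rule for this mixture is sign(x): boundary at x = 0.
n = 500
X = [random.gauss(-1.0, 1.0) for _ in range(n)] + [random.gauss(1.0, 1.0) for _ in range(n)]
Y = [-1] * n + [+1] * n

# Linear "SVM": subgradient descent on the regularized hinge loss
#   sum_i max(0, 1 - y_i (w x_i + b)) + (lam/2) w^2
w, b = 0.0, 0.0
lam, lr = 0.01, 0.001
for epoch in range(100):
    for x, y in zip(X, Y):
        if y * (w * x + b) < 1.0:  # margin violated: hinge subgradient
            w += lr * (y * x - lam * w)
            b += lr * y
        else:                      # only the penalty term contributes
            w -= lr * lam * w

boundary = -b / w  # the fitted rule classifies by sign(w x + b)
print("fitted decision boundary (Bayes boundary is 0):", round(boundary, 3))
```

On symmetric data like this the hinge-loss minimizer places its sign change close to the Bayes boundary, which is the flavor of result the TR establishes in general.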
This helps to explain the success of SVMs in many classification studies, and makes it easier to compare SVMs with traditional statistical methods.

From j.hogan at qut.edu.au Wed Nov 3 06:22:24 1999
From: j.hogan at qut.edu.au (Jim Hogan)
Date: Wed, 03 Nov 1999 21:22:24 +1000 (EST)
Subject: Jobs in Australia
Message-ID: <3.0.32.19991103211844.009962d0@sky.fit.qut.edu.au>

Queensland University of Technology is offering a number of positions in CS, closing soon. Potential applicants with an interest in machine learning, neural networks or data mining are encouraged to apply. All details may be found at the URL: http://www.qut.edu.au/pubs/employment/99430.html

cheers
jh
------------------------------
James M Hogan
Lecturer in Computer Science
QUT, GPO Box 2434
Brisbane Qld 4001
AUSTRALIA

From neep at ecowar.demon.co.uk Thu Nov 4 12:12:03 1999
From: neep at ecowar.demon.co.uk (neep)
Date: Thu, 04 Nov 1999 17:12:03 +0000
Subject: EANN2000 First Call For Papers
Message-ID: <3821BE63.FF0F5A5@ecowar.demon.co.uk>

First Call for Papers
Sixth International Conference on Engineering Applications of Neural Networks
Kingston Upon Thames, UK
17-19 July 2000

The conference is a forum for presenting the latest results on neural network applications in technical fields. The applications may be in any engineering or technical field, including but not limited to: systems engineering, mechanical engineering, robotics, process engineering, metallurgy, pulp and paper technology, aeronautical engineering, computer science, machine vision, chemistry, chemical engineering, physics, electrical engineering, electronics, civil engineering, geophysical sciences, biomedical systems, and environmental engineering. Prospective authors are requested to send an extended abstract for review by the International Committee.
All papers must be written in English, starting with a succinct statement of the problem and the application area, the results achieved, their significance, and a comparison with previous work (if any). The following must also be included: title of proposed paper; author names, affiliations, addresses; name of author to contact for correspondence; e-mail address and fax number of contact author; topics which best describe the paper (max. 5 keywords); preferred session.

Submissions must be received by February 29, 2000. It is strongly recommended to submit extended abstracts by electronic mail to: eann2000 at kingston.ac.uk, or else by mail (2 copies) to the following address:

Dr Dimitris Tsaptsinos
Kingston University
School of Mathematics
Penhryn Road
Kingston Upon Thames, KT1 2EE, UK
Tel: +44 181 547 2000 ext. 2516
Fax: +44 181 547 7497

For information on earlier EANN conferences see the WWW pages: http://www.abo.fi/~abulsari/EANN.html
Web Page: http://www.kingston.ac.uk/eann

Diary Dates
Submission deadline: 29 February, 2000
Notification of acceptance: 15 March, 2000
Delivery of full papers: 15 April, 2000
Proposals for tutorials: 15 May, 2000
Registration fee paid by 15 April, 2000 to guarantee publication of contribution in the proceedings

Contacts
Conference Secretariat
Dr Dimitris Tsaptsinos
EANN2000 Conference Secretariat
School of Mathematics
Kingston University
Penhryn Road
Kingston Upon Thames, Surrey KT1 2EE, UK
Tel: +44 181 547 2000 ext. 2516
Fax: +44 181 547 7497
Email: eann2000 at kingston.ac.uk
http://www.kingston.ac.uk/eann

--------------------------------------------------------------------------------
Organising Committee
A. Osman (USA), R. Baratti (Italy), S. Draghici (USA), W. Duch (Poland), J. Fernandez de Canete (Spain), C. Kuroda (Japan), A. Ruano (Portugal), D. Tsaptsinos (UK), E. Tulunay (Turkey)

--------------------------------------------------------------------------------
International Committee (to be extended)
L. Bobrowski, Poland (leon at spam.ibib.waw.pl)
A. Bulsari, Finland (abulsari at spam.abo.fi)
T. Clarkson
A. Iwata
G. Jones, UK (G.Jones at kingston.ac.uk)
L. Ludwig
S. Michaelides
R. Parenti, Italy (parenti at spam.ari.ansaldo.it)
R. Saatchi, UK (R.Saatchi at spam.shu.ac.uk)
C. Schizas
S. Usui
P. Zufiria
(please remove "spam" before you use an email address)

--------------------------------------------------------------------------------
Session Chairs
Control Systems (A. Ruano, aruano at spam.ualg.pt)
Process Engineering (R. Baratti, baratti at spam.unica.it)
Vision/Image Processing (S. Draghici, sod at spam.cs.wayne.edu)
more to be announced
(please remove "spam" before you use an email address)

--
Neep Hazarika
Econostat Limited
Hennerton House
Wargrave, Berkshire RG10 8PD, U.K.
Phone: +44 (0)118 940 4141 (work), +44 (0)118 946 1659 (home)
Fax: +44 (0)118 940 4099
e-mail: neep at ecowar.demon.co.uk

From beer at eecs.cwru.edu Mon Nov 8 14:50:08 1999
From: beer at eecs.cwru.edu (Randall D. Beer)
Date: Mon, 8 Nov 1999 14:50:08 -0500
Subject: NSF/IGERT in Neuromechanical Systems at CWRU
Message-ID:

NSF-SPONSORED TRAINING PROGRAM IN NEUROMECHANICAL SYSTEMS AT CWRU

Predoctoral fellowships are now available in a new multidisciplinary graduate program in Neuro-Mechanical Systems at Case Western Reserve University. Neuro-mechanical systems include natural, man-made, or hybrid systems combining neural controllers and mechanical peripheries. Examples include natural organisms, biologically inspired robots, and neuroprostheses for restoring motor function in the disabled. We are seeking outstanding students with backgrounds in biology, neuroscience, biomedical engineering, computer engineering and science, electrical engineering, or mechanical engineering. Students participating in this program will learn the skills necessary to work in this exciting new multidisciplinary area.
This program, funded by the National Science Foundation's Integrative Graduate Education and Research Training initiative (NSF IGERT), brings together four research groups focused on the neurobiology and biomechanics of movement behavior, on bio-robotics, on evolution and analysis of model neuro-mechanical systems, and on motor system neuroprostheses. The program involves eight faculty from four departments: Biology, Biomedical Engineering, Electrical Engineering and Computer Science, and Mechanical Engineering:

Randall Beer, Electrical Engineering and Computer Science
Michael Branicky, Electrical Engineering and Computer Science
Hillel Chiel, Biology
Patrick Crago, Biomedical Engineering
Warren Grill, Biomedical Engineering
Robert Kirsch, Biomedical Engineering
Roger Quinn, Mechanical Engineering
Roy Ritzmann, Biology

Students in the training program will participate in cross-disciplinary courses and rotate through laboratories in all four fields. The program includes a multidisciplinary seminar featuring extended visits from leaders in each field. Funds will permit travel to scientific meetings and workshops in each field. Common computer facilities and office areas will be provided for students in the program. Internships in clinical and industrial settings will also be available as options.

We are particularly interested in recruiting under-represented minorities. Students must be U.S. citizens or permanent residents of the United States.

Further details of the program can be found at http://neuromechanics.cwru.edu. For further information, please contact Dr. Roy Ritzmann, Department of Biology, Case Western Reserve University, Cleveland, OH 44106-7080, (216) 368-3554, rer3 at po.cwru.edu.
From Annette_Burton at Brown.edu Mon Nov 8 14:22:18 1999
From: Annette_Burton at Brown.edu (Annette Burton)
Date: Mon, 8 Nov 1999 15:22:18 -0400
Subject: IGERT JOB ANNOUNCEMENT
Message-ID:

The Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce

A NEW INTERDISCIPLINARY POSTDOCTORAL OPPORTUNITY in
LEARNING AND ACTION IN THE FACE OF UNCERTAINTY:
COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES

As part of an NSF award to Brown University through the IGERT program, the Departments of Cognitive and Linguistic Sciences, Computer Science, and Applied Mathematics will be hiring a Postdoctoral Research Associate. Fellows will be scholars who have displayed significant interest and ability in conducting collaborative interdisciplinary research in one or more of the research areas of the program: computational and empirical approaches to uncertainty in language, vision, action, or human reasoning. As well as participating in collaborative research, responsibilities will include helping to coordinate cross-departmental graduate teaching and research, as well as some teaching of interdisciplinary graduate courses. We expect that the fellows will play an important role in creating a highly visible presence for the IGERT program at Brown, and that the interdisciplinary activities will help unify the interdepartmental activities of the IGERT program.

Applicants must hold a PhD in Cognitive Science, Linguistics, Computer Science, Mathematics, Applied Mathematics, or a related discipline, or show evidence that the PhD will be completed before the start of the position. Applicants should send a vita and three letters of reference to the IGERT Postdoc Search Committee, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, RI 02912. Special consideration will be given to those applicants whose research is relevant to at least two of the participating departments.
The position will begin September 1, 2000 for one year, renewable upon satisfactory completion of duties in the first year. Salaries will be between $35,000 and $42,500 per year. All materials must be received by Jan. 15, 2000, for full consideration. Brown University is an Equal Opportunity/Affirmative Action Employer. For additional information about the program and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT.

From ken at phy.ucsf.EDU Mon Nov 8 16:50:42 1999
From: ken at phy.ucsf.EDU (Ken Miller)
Date: Mon, 8 Nov 1999 13:50:42 -0800 (PST)
Subject: Paper available: review of circuitry underlying mature orientation selectivity, experiments and models
Message-ID: <14375.17842.908607.854386@coltrane.ucsf.edu>

The following paper is now available at
ftp://ftp.keck.ucsf.edu/pub/ken/fm_final.ps.gz (compressed postscript)
ftp://ftp.keck.ucsf.edu/pub/ken/fm_final.pdf (pdf)
or http://www.keck.ucsf.edu/~ken (click on 'Publications')

This is a preprint of an article to appear in the Annual Review of Neuroscience, Vol. 23 (2000). (Note: a review I announced about two weeks ago focused on *development* of orientation selectivity. This review, in contrast, focuses on the structure of the circuitry underlying mature orientation-selective responses.)

----------------------------------------
Neural Mechanisms of Orientation Selectivity in the Visual Cortex
David Ferster, Northwestern University
Kenneth D. Miller, University of California, San Francisco
to appear in the Annual Review of Neuroscience, Vol. 23 (2000)

ABSTRACT: The origin of orientation selectivity in the responses of simple cells in cat visual cortex serves as a model problem for understanding cortical circuitry and computation. The feedforward model of Hubel and Wiesel posits that this selectivity arises simply from the arrangement of thalamic inputs to a simple cell.
Much evidence, including a number of recent intracellular studies, supports a primary role of the thalamic inputs in determining simple cell response properties, including orientation tuning. However, this mechanism alone cannot explain the invariance of orientation tuning to changes in stimulus contrast. Simple cells receive push-pull inhibition: ON inhibition in OFF subregions and vice versa. Addition of such inhibition to the feedforward model can account for this contrast invariance, provided the inhibition is sufficiently strong. The predictions of "normalization" and "feedback" models are reviewed and compared to the predictions of this modified feedforward model and to experimental results. The modified feedforward and the feedback models ascribe fundamentally different functions to cortical processing.

Ken

Kenneth D. Miller
Dept. of Physiology, UCSF
513 Parnassus
San Francisco, CA 94143-0444
telephone: (415) 476-8217
fax: (415) 476-4929
internet: ken at phy.ucsf.edu
www: http://www.keck.ucsf.edu/~ken

From jas at hnc.com Tue Nov 9 22:25:27 1999
From: jas at hnc.com (Spoelstra, Jacob)
Date: Tue, 9 Nov 1999 19:25:27 -0800
Subject: Job opportunity: HNC Software
Message-ID: <72A838A51366D211B3B30008C7F4D363021F6E86@pchnc.hnc.com>

POSITION: Staff Scientist
DIVISION: HNC Financial Solutions
LOCATION: San Diego, CA
JOB CODE: FCT9911

Duties/Job Description:
-----------------------
Responsibilities include designing and building predictive models based on the latest technologies in neural networks, pattern recognition/artificial intelligence, and statistical modeling for various applications in the financial industry. Specific responsibilities may vary by project, but will include analyzing data to determine suitability for modeling, pattern identification and feature (variable) selection from large amounts of data, experimenting with different types of models, analyzing performance, and reporting results to customers.
Required Qualifications (Experience/Skills):
--------------------------------------------
MS or PhD in Computer Science, Electrical Engineering, Applied Statistics/Mathematics or a related field. Minimum two years of experience in pattern recognition, mathematical modeling, or data analysis on real-world problems. Familiarity with the latest modeling techniques and tools. Good oral and written communication skills, and the ability to interact well with both co-workers and customers. Proficiency in C and Unix, and familiarity with SAS or another analysis tool. Software engineering experience will be a plus.

Preferred Qualifications (Experience/Skills):
---------------------------------------------
Strong mathematical appetite, problem-solving and computer skills (C or C++ or Java). Good Unix scripting and rapid prototyping skills. Quick learner and good team player. Experience in designing systems based on neural networks, pattern recognition and/or statistical modeling techniques for financial, health care, marketing, or other real-world applications. Familiarity with object-oriented software design.

Careers at HNC Software Inc:
Headquartered in San Diego, California, HNC Software Inc. (Nasdaq: HNCS) is the world's leading provider of Predictive Software Solutions for service industries, including financial, retail, insurance, Internet, and telecommunications. It is HNC's employment philosophy to create a dynamic work environment that allows each employee to feel challenged and experience personal growth that maximizes each person's potential. HNC also offers a comprehensive array of employee benefits including stock options, an employee stock purchase plan, competitive health benefits, 401(k) plans and tuition support for continuing education.

(Please refer to job code FCT9911 in all correspondence)
Apply by Email: fct_jobs at hnc.com
By Fax: (858) 452-6524 (For attention: Dr. Khosrow Hassibi)
By Mail: Dr.
Khosrow Hassibi
HNC Software Inc., Financial Solutions
5935 Cornerstone Court West
San Diego, CA 92121-3728

From faramarz at cns.bu.edu Wed Nov 10 13:34:04 1999
From: faramarz at cns.bu.edu (Faramarz Valafar)
Date: Wed, 10 Nov 1999 13:34:04 -0500
Subject: Graduate Program in The Department of Cognitive and Neural Systems (CNS) at Boston University
Message-ID:

PLEASE POST

*******************************************************************
GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY
*******************************************************************

The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems.

Applications for Fall 2000 admission and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write, telephone, or fax:

DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS
Boston University
677 Beacon Street
Boston, MA 02215
617/353-9481 (phone)
617/353-7755 (fax)

or send via e-mail your full name and mailing address to the attention of Mr. Robin Amos at: inquiries at cns.bu.edu

Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization.
GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis.

Stephen Grossberg, Chairman
Gail A. Carpenter, Director of Graduate Studies

Description of the CNS Department:

The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Students are trained in a broad range of areas concerning computational neuroscience, cognitive science, and neuromorphic systems, including: the brain mechanisms of vision and visual object recognition; audition, speech, and language understanding; recognition, learning, categorization, and long-term memory; cognitive information processing; self-organization and development; navigation, planning, and spatial orientation; cooperative and competitive network dynamics and short-term memory; reinforcement and motivation; attention; adaptive sensory-motor control and robotics; biological rhythms; consciousness; mental disorders; and the mathematical and computational methods needed to support advanced modeling research and applications. The CNS Department awards MA, PhD, and BA/MA degrees.

The CNS Department embodies a number of unique features. It has developed a curriculum that consists of eighteen interdisciplinary graduate courses, each of which integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of neural networks to technology.
Additional advanced courses, including research apprenticeship and seminar courses, are also offered. Each course is typically taught once a week in the afternoon or evening to make the program available to qualified students, including working professionals, throughout the Boston area. Students develop a coherent area of expertise by designing a program that includes courses in areas such as biology, computer science, engineering, mathematics, and psychology, in addition to courses in the CNS curriculum.

The CNS Department interacts with colleagues in several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The unit most closely linked to the department is the Center for Adaptive Systems. Students interested in neural network hardware can work with researchers in CNS, at the College of Engineering, and at M.I.T. Lincoln Laboratory. Other research resources include distinguished research groups in neurophysiology, neuroanatomy, and neuropharmacology across the Boston University Charles River Campus and Medical School; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. Key colleagues in these units hold appointments in CNS.

In addition to its basic research and training program, the department conducts a seminar series, as well as conferences and symposia, which bring together distinguished scientists from experimental, theoretical, and applied disciplines.
The department is housed in its own new four-story building, which includes ample space for faculty and student offices and laboratories (computational neuroscience, visual psychophysics, psychoacoustics, speech and language, sensory-motor control, neurobotics, computer vision), as well as an auditorium, classroom and seminar rooms, a library, and a faculty-student lounge. The department has a powerful computer network for carrying out large-scale simulations of behavioral and brain models. Below are listed departmental faculty, courses and labs.

FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS

Jelle Atema, Professor of Biology; Director, Boston University Marine Program (BUMP). PhD, University of Michigan. Sensory physiology and behavior.

Aijaz Baloch, Adjunct Assistant Professor of Cognitive and Neural Systems; Senior Modeling Engineer, Nestor, Inc. PhD, Electrical Engineering, Boston University. Visual motion perception, computational vision, adaptive control, and financial fraud detection.

Helen Barbas, Professor of Anatomy and Neurobiology, Boston University School of Medicine. PhD, Physiology/Neurophysiology, McGill University. Organization of the prefrontal cortex, evolution of the neocortex.

Jacob Beck, Research Professor of Cognitive and Neural Systems. PhD, Psychology, Cornell University. Visual perception, psychophysics, computational models of vision.

Daniel H. Bullock, Associate Professor of Cognitive and Neural Systems, and Psychology. PhD, Experimental Psychology, Stanford University. Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development.

Gail A.
Carpenter, Professor of Cognitive and Neural Systems and Mathematics; Director of Graduate Studies, Department of Cognitive and Neural Systems. PhD, Mathematics, University of Wisconsin, Madison. Learning and memory, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations.

Laird Cermak, Director, Memory Disorders Research Center, Boston Veterans Affairs Medical Center; Professor of Neuropsychology, School of Medicine; Professor of Occupational Therapy, Sargent College. PhD, Ohio State University. Memory disorders.

Michael A. Cohen, Associate Professor of Cognitive and Neural Systems and Computer Science. PhD, Psychology, Harvard University. Speech and language processing, measurement theory, neural modeling, dynamical systems, cardiovascular oscillations physiology and time series.

H. Steven Colburn, Professor of Biomedical Engineering. PhD, Electrical Engineering, Massachusetts Institute of Technology. Audition, binaural interaction, auditory virtual environments, signal processing models of hearing.

Howard Eichenbaum, Professor of Psychology. PhD, Psychology, University of Michigan. Neurophysiological studies of how the hippocampal system mediates declarative memory.

William D. Eldred III, Professor of Biology. PhD, University of Colorado, Health Science Center. Visual neurobiology.

Paolo Gaudiano, Research Associate Professor of Cognitive and Neural Systems. PhD, Cognitive and Neural Systems, Boston University. Computational and neural models of robotics, vision, adaptive sensory-motor control, and behavioral neurobiology.

Jean Berko Gleason, Professor of Psychology. PhD, Harvard University. Psycholinguistics.

Sucharita Gopal, Associate Professor of Geography. PhD, University of California at Santa Barbara. Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition.
Stephen Grossberg, Wang Professor of Cognitive and Neural Systems; Professor of Mathematics, Psychology, and Biomedical Engineering; Chairman, Department of Cognitive and Neural Systems; Director, Center for Adaptive Systems. PhD, Mathematics, Rockefeller University. Vision, audition, language, learning and memory, reward and motivation, cognition, development, sensory-motor control, mental disorders, applications.

Frank Guenther, Associate Professor of Cognitive and Neural Systems. PhD, Cognitive and Neural Systems, Boston University; MSE, Electrical Engineering, Princeton University. Speech production, speech perception, biological sensory-motor control and functional brain imaging.

Catherine L. Harris, Assistant Professor of Psychology. PhD, Cognitive Science and Psychology, University of California at San Diego. Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition.

Michael E. Hasselmo, Associate Professor of Psychology; Director of Graduate Studies, Psychology Department. PhD, Experimental Psychology, Oxford University. Electrophysiological studies of neuromodulatory effects in cortical structures, network biophysical simulations of memory function in hippocampus and piriform cortex, behavioral studies of amnestic drugs.

Thomas G. Kincaid, Professor of Electrical, Computer and Systems Engineering, College of Engineering. PhD, Electrical Engineering, Massachusetts Institute of Technology. Signal and image processing, neural networks, non-destructive testing.

Mark Kon, Professor of Mathematics. PhD, Massachusetts Institute of Technology. Neural network theory, complexity theory, wavelet theory, mathematical physics.

Nancy Kopell, Professor of Mathematics. PhD, Mathematics, University of California at Berkeley. Dynamics of networks of neurons.

Jacqueline A.
Liederman, Associate Professor of Psychology. PhD, Psychology, University of Rochester. Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders.

Ennio Mingolla, Associate Professor of Cognitive and Neural Systems and Psychology. PhD, Psychology, University of Connecticut. Visual perception, mathematical modeling of visual processes.

Joseph Perkell, Adjunct Professor of Cognitive and Neural Systems; Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology. PhD, Massachusetts Institute of Technology. Motor control of speech production.

Alan Peters, Professor of Anatomy and Neurobiology, School of Medicine. PhD, Zoology, Bristol University, United Kingdom. Organization of neurons in the cerebral cortex; effects of aging on the primate brain; fine structure of the nervous system.

Andrzej Przybyszewski, Research Fellow, Department of Cognitive and Neural Systems; Assistant Professor, University of Massachusetts Medical School, Worcester. PhD, Warsaw Medical Academy. Electrophysiology of the primate visual system, mathematical and computer modeling of the neuronal networks in the visual system.

Adam Reeves, Adjunct Professor of Cognitive and Neural Systems; Professor of Psychology, Northeastern University. PhD, Psychology, City University of New York. Psychophysics, cognitive psychology, vision.

Mark Rubin, Research Assistant Professor of Cognitive and Neural Systems. PhD, Physics, University of Chicago. Pattern recognition; artificial and biological vision.

Michele Rucci, Assistant Professor of Cognitive and Neural Systems. PhD, Scuola Superiore, Pisa, Italy. Vision, sensory-motor control and learning, and computational neuroscience.
Elliot Saltzman, Associate Professor of Physical Therapy, Sargent College; Research Scientist, Haskins Laboratories, New Haven, CT; Assistant Professor in Residence, Department of Psychology and Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, CT. PhD, Developmental Psychology, University of Minnesota. Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities.

Robert Savoy, Adjunct Associate Professor of Cognitive and Neural Systems; Scientist, Rowland Institute for Science; Experimental Psychologist, Massachusetts General Hospital. PhD, Experimental Psychology, Harvard University. Computational neuroscience; visual psychophysics of color, form, and motion perception. Teaching about functional MRI and other brain mapping methods.

Eric Schwartz, Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology. PhD, High Energy Physics, Columbia University. Computational neuroscience, machine vision, neuroanatomy, neural modeling.

Robert Sekuler, Adjunct Professor of Cognitive and Neural Systems; Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center; Frances and Louis H. Salvage Professor of Psychology, Brandeis University; Consultant in neurosurgery, Boston Children's Hospital. PhD, Psychology, Brown University. Visual motion, brain imaging, relation of visual perception, memory, and movement.

Barbara Shinn-Cunningham, Assistant Professor of Cognitive and Neural Systems and Biomedical Engineering. PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology. Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance.
Malvin Carl Teich, Professor of Electrical and Computer Engineering, Biomedical Engineering and Physics. PhD, Cornell University. Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission.

Lucia Vaina, Professor of Biomedical Engineering; Research Professor of Neurology, School of Medicine. PhD, Sorbonne (France); Dres Science, National Polytechnique Institute, Toulouse (France). Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging.

Faramarz Valafar, Adjunct Assistant Professor of Cognitive and Neural Systems. PhD, Electrical Engineering, Purdue University. Bioinformatics, adaptive systems (artificial neural networks), data mining and modeling in medicine, medical decision making, pattern recognition and signal processing in biomedicine, biochemistry, and glycoscience.

Takeo Watanabe, Associate Professor of Psychology. PhD, Behavioral Sciences, University of Tokyo. Perception of objects and motion, and effects of attention on perception, using psychophysics and brain imaging (fMRI).

Allen Waxman, Adjunct Associate Professor of Cognitive and Neural Systems; Senior Staff Scientist, MIT Lincoln Laboratory. PhD, Astrophysics, University of Chicago. Visual system modeling, multisensor fusion, image mining, parallel computing, and advanced visualization.

James Williamson, Research Assistant Professor of Cognitive and Neural Systems. PhD, Cognitive and Neural Systems, Boston University. Pattern recognition; self-organization and topographic maps; perceptual grouping.

Jeremy Wolfe, Adjunct Associate Professor of Cognitive and Neural Systems; Associate Professor of Ophthalmology, Harvard Medical School; Psychophysicist, Brigham & Women's Hospital, Surgery Dept.; Director of Psychophysical Studies, Center for Clinical Cataract Research. PhD, Massachusetts Institute of Technology. Visual attention, preattentive and attentive object representation.
Curtis Woodcock, Professor of Geography; Director, Geographic Applications, Center for Remote Sensing. PhD, University of California, Santa Barbara. Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing.

CNS DEPARTMENT COURSE OFFERINGS

CAS CN500 Computational Methods in Cognitive and Neural Systems
CAS CN510 Principles and Methods of Cognitive and Neural Modeling I
CAS CN520 Principles and Methods of Cognitive and Neural Modeling II
CAS CN530 Neural and Computational Models of Vision
CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control
CAS CN550 Neural and Computational Models of Recognition, Memory and Attention
CAS CN560 Neural and Computational Models of Speech Perception and Production
CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm
CAS CN580 Introduction to Computational Neuroscience
GRS CN700 Computational and Mathematical Methods in Neural Modeling
GRS CN710 Advanced Topics in Neural Modeling
GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior
GRS CN730 Models of Visual Perception
GRS CN740 Topics in Sensory-Motor Control
GRS CN760 Topics in Speech Perception and Recognition
GRS CN780 Topics in Computational Neuroscience
GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception
GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception
GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition
GRS CN915,916 Research in Neural Networks for Vision and Image Processing
GRS CN921,922 Research in Neural Networks for Speech and Language Processing
GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control
GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning
GRS CN935,936 Research in Neural Networks for Cognitive Information Processing
GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks
GRS CN945,946 Research in Technological Applications of Neural Networks
GRS CN951,952 Research in Hardware Implementations of Neural Networks

CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and student-run special interest groups, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups.

LABORATORY AND COMPUTER FACILITIES

The department is funded by fellowships, grants, and contracts from federal agencies and private foundations that support research in life sciences, mathematics, artificial intelligence, and engineering. Facilities include laboratories for experimental research and computational modeling in visual perception; audition, speech and language processing; and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network composed of Sun workstations, Silicon Graphics workstations, Macintoshes, and PCs. A PC farm running the Linux operating system is available as a distributed computational environment. All students have access to PCs or UNIX workstation consoles, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab.

The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software are available for use.
These include:

Computer Vision/Computational Neuroscience Laboratory

The Computer Vision/Computational Neuroscience Lab comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; a light machine shop; an active vision lab including actuators and video hardware; and systems for computer-aided neuroanatomy and the application of computer graphics and image processing to brain sections and MRI images.

Neurobotics Laboratory

The Neurobotics Lab uses wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The lab currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a moveable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors.

Psychoacoustics Laboratory

The Psychoacoustics Lab houses a newly installed 8 ft. × 8 ft. sound-proof booth. The laboratory is extensively equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli. The major equipment dedicated to the psychoacoustics laboratory includes two Pentium-based personal computers; two PowerPC-based Macintosh computers; a 50-MHz array processor capable of generating auditory stimuli in real time; programmable attenuators; analog-to-digital and digital-to-analog converters; a real-time head-tracking system; a special-purpose signal-processing hardware system capable of generating "spatialized" stereo auditory signals in real time; a two-channel oscilloscope; a two-channel spectrum analyzer; various cables, headphones, and other miscellaneous electronics equipment; and software for signal generation, experimental control, data analysis, and word processing.
Sensory-Motor Control Laboratory

The Sensory-Motor Control Lab supports experimental studies of motor kinematics. An infrared WatSmart system allows measurement of large-scale movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. Equipment includes a 40-inch monitor that allows computer display of animations generated by an SGI workstation or a Pentium Pro (Windows NT) workstation. A second major component is a helmet-mounted, video-based eye-head tracking system (ISCAN Corp., 1997). The latter's camera samples eye position at 240 Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. Thus the system affords a wide range of visuo-motor studies.

Speech and Language Laboratory

The Speech and Language Lab includes facilities for analog-to-digital and digital-to-analog software conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal-processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing algorithms of speech recognition. For high-speed processing, supercomputer facilities speed filtering and data analysis.

Visual Psychophysics Laboratory

The Visual Psychophysics Lab occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer-controlled display platforms, including Silicon Graphics, Inc. (SGI) Onyx RE2, SGI Indigo2 High Impact, SGI Indigo2 Extreme, Power Computing (Macintosh compatible) PowerTower Pro 225, and Macintosh 7100/66 workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing glasses, prisms, a photometer, and a variety of display-generation, data-collection, and data-analysis software.
Affiliated Laboratories

Affiliated CAS/CNS faculty have additional laboratories ranging from visual and auditory psychophysics and neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations.

*******************************************************************
DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT
Boston University
677 Beacon Street
Boston, MA 02215
Phone: 617/353-9481
Fax: 617/353-7755
Email: inquiries at cns.bu.edu
Web: http://www.cns.bu.edu/
*******************************************************************

From A.van.Ooyen at nih.knaw.nl Thu Nov 11 11:40:33 1999
From: A.van.Ooyen at nih.knaw.nl (Arjen van Ooyen)
Date: Thu, 11 Nov 1999 17:40:33 +0100
Subject: Models of Axon Guidance and Bundling
Message-ID: <382AF181.7408@nih.knaw.nl>

New Paper: Models of Axon Guidance and Bundling During Development
H. G. E. Hentschel & A. van Ooyen
Proc. R. Soc. Lond. B (1999) 266: 2231-2238.

Request reprint: A.van.Ooyen at nih.knaw.nl
Or download from http://www.cns.ed.ac.uk/people/arjen/papers/bundle_abstract.html

ABSTRACT

Diffusible chemoattractants and chemorepellants, together with contact attraction and repulsion, have been implicated in the establishment of connections between neurons and their targets. Here we study how such diffusible and contact signals can be involved in the whole sequence of events, from the bundling of axons and the guidance of axon bundles towards their targets to debundling and the final innervation of individual targets. By means of computer simulations, we investigate the strengths and weaknesses of a number of particular mechanisms that have been proposed for these processes.

-- Arjen van Ooyen, Netherlands Institute for Brain Research, Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands.
email: A.van.Ooyen at nih.knaw.nl
website: http://www.cns.ed.ac.uk/people/arjen.html
phone: +31.20.5665483
fax: +31.20.6961006

From bengioy at IRO.UMontreal.CA Thu Nov 11 09:02:27 1999
From: bengioy at IRO.UMontreal.CA (Yoshua Bengio)
Date: Thu, 11 Nov 1999 09:02:27 -0500
Subject: open faculty position in machine learning / Montreal
Message-ID: <19991111090227.40797@IRO.UMontreal.CA>

Hello,

The department of computer science and operations research of the University of Montreal is opening two tenure-track faculty positions, and one of the main areas of interest is machine learning. Note that this is a French-speaking university and candidates are expected to be able to teach in French within about a year of hiring (in the past we have hired several non-francophones, and many non-Canadians, who have successfully adapted). Note also that Montreal is a great (bilingual English/French) city to live in, with the flavor of a European city, a vibrant cultural life, four universities, nearby ski slopes, great and inexpensive restaurants, and a strong network of research centers in the mathematical sciences. Please don't hesitate to contact me for further information.

-- Yoshua Bengio, bengioy at iro.umontreal.ca www.iro.umontreal.ca/~bengioy

The official announcement:
---------------------------------------------------------------------
Université de Montréal
Faculté des arts et des sciences
Department of Computer Science and Operations Research

The DIRO (Département d'informatique et de recherche opérationnelle - Department of Computer Science and Operations Research) invites applications for two tenure-track positions in Computer Science at the Assistant Professor level, starting June 1st, 2000. The Department is seeking qualified candidates in Computer Science.
Preference will be given to applicants with a strong research program in one of the following or related areas:
-- Hardware-software systems (specification, synthesis, and verification of embedded systems);
-- Artificial intelligence (machine learning and data mining);
-- Distributed multimedia systems;
-- Programming languages.

Beyond demonstrating a clear potential for outstanding research, the successful candidates must be committed to excellence in teaching.

The Université de Montréal is the leading French-speaking university in North America. The DIRO offers B.Sc., M.Sc. and Ph.D. degrees in Computer Science and Operations Research, as well as a combined undergraduate degree in Computer Science and Mathematics. With 35 faculty members, 600 undergraduate and 190 graduate students, the DIRO is one of the largest Computer Science departments in Canada as well as one of the most active in research. Research interests of current faculty include computational biology, telecommunications, intelligent tutoring systems, computer architecture, software engineering, artificial intelligence, computational linguistics, computer graphics and vision, machine learning, theoretical and quantum computing, parallelism, optimization, heuristics, and numerical simulation. Further information can be obtained at the Department's web site: http://www.iro.umontreal.ca.

Requirements: Ph.D. in Computer Science or a related area. Ability to teach and supervise students in French within one year.

Salary: Salary is competitive and fringe benefits are excellent.

Hardcopy applications, including a curriculum vitae, a statement of current research program, at least three letters of reference, and up to three selected preprints/reprints, should be sent to:

Sang Nguyen, professeur et directeur
Département d'informatique et de recherche opérationnelle, FAS
Université de Montréal
C.P. 6128, Succ. "Centre-Ville"
Montréal (Québec), H3C 3J7

by February 1st, 2000.
Applications received after that date may be considered until the positions are filled. In accordance with Canadian immigration requirements, priority will be given to Canadian citizens and permanent residents. The Université de Montréal is committed to equity in employment and encourages applications from qualified women.

-- Yoshua Bengio
Professeur agrégé
Département d'informatique et de recherche opérationnelle
Université de Montréal
Adresse postale: C.P. 6128, Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7
Adresse civique: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8, #2194
Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339.
http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa

From rsun at cecs.missouri.edu Thu Nov 11 16:27:48 1999
From: rsun at cecs.missouri.edu (Ron Sun)
Date: Thu, 11 Nov 1999 15:27:48 -0600
Subject: Ph.D program in AI and connectionist models
Message-ID: <199911112127.PAA15773@pc113.cecs.missouri.edu>

The Ph.D. program in CECS at the University of Missouri-Columbia is accepting applications. Graduate assistantships and other forms of financial support for graduate students are available. Prospective graduate students interested in Artificial Intelligence, Cognitive Science, Connectionist Models (Neural Networks), Multi-Agent Systems, and other related areas are especially encouraged to apply. Students with Master's degrees are preferred. The department has identified graduate education and research as its primary missions. The department is conducting quality research in a number of areas: artificial intelligence, cognitive science, machine learning, multi-agent systems, neural networks and connectionist models, computer graphics and scientific visualization, computer vision, digital libraries, fuzzy logic, multimedia systems, parallel and distributed computing, and Web computing.
To download application forms, use http://www.missouri.edu/~gradschl or http://web.missouri.edu/~regwww/admission/intl_admission/Application_Form/Application_index.html (for international students)
-----------------------------------------------------------------
The CECS Department awards degrees at the Bachelor's, Master's, and Ph.D. levels. The program is accredited by CSAB and ABET. The CECS Department has a variety of computing equipment and laboratories available for instruction and research. These facilities are currently being enhanced, in conjunction with computing laboratories maintained by the college and by the campus. The computing facilities offer students a wealth of opportunity to access and utilize a wide range of equipment best suited for their research needs. All of the equipment is connected to departmental, college, campus, and global networks, which provide ready access to the exploding world of information and computational resources. A wealth of library resources is available through the extensive collections of books and journals housed in the Engineering and Mathematical Sciences libraries, as well as collections in the Main Library and Health Sciences Libraries at MU.

The University of Missouri is a Research I university enrolling some 22,000 students. The University offers programs in many areas, ranging from sciences and engineering to psychology, neuroscience, education, biology, medicine, law, agriculture, and journalism.

For more information, send e-mail to: cecsdgs at cecs.missouri.edu
See the Web pages below:
===========================================================================
Prof.
Ron Sun http://www.cecs.missouri.edu/~rsun
CECS Department, University of Missouri-Columbia
201 Engineering Building West, Columbia, MO 65211-2060
fax: (573) 882 8318
email: rsun at cecs.missouri.edu
http://www.cecs.missouri.edu/~rsun
http://www.cecs.missouri.edu/~rsun/journal.html
http://www.cecs.missouri.edu/~rsun/clarion.html
===========================================================================

From X.Yao at cs.bham.ac.uk Thu Nov 11 16:43:38 1999
From: X.Yao at cs.bham.ac.uk (Xin Yao)
Date: Thu, 11 Nov 1999 21:43:38 +0000 (GMT)
Subject: Combinations between EC and NNs
Message-ID:

The First IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks
Co-sponsored by
IEEE Neural Network Council
The Center for Excellence in Evolutionary Computation
May 11-12, 2000
The Camberley Gunter Hotel, San Antonio, TX, USA
Symposium URL: http://www.cs.bham.ac.uk/~xin/ecnn2000

FINAL CALL FOR PAPERS

The recent increasing interest in the synergy between evolutionary computation and neural networks provides an impetus for a symposium dedicated to furthering our understanding of this synergy and the potential utility of hybridizing evolutionary and neural techniques. The First IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks will offer a forum that focuses specifically on the hybridization of evolutionary and neural computation. In particular, papers are solicited in the areas of
+ evolutionary training of neural networks,
+ evolutionary design of network topologies,
+ evolution of learning (weight updating) rules,
+ evolving solutions to inverse neural network problems,
+ the performance of alternative variation operators in designing neural networks,
+ comparisons between evolutionary and other training methods,
+ evolving developmental rules for neural network design, and
+ the use of coevolution in optimizing neural networks for pattern recognition, gaming, or other applications.
Other topics that combine evolutionary and neural computation are also welcome. Submitted papers should represent unpublished, original work.

PAPER SUBMISSION

Send three (3) copies of your manuscript to

Xin Yao
School of Computer Science
The University of Birmingham
Edgbaston, Birmingham B15 2TT
U.K.
Email: x.yao at cs.bham.ac.uk

All three hardcopies should be printed on 8.5 by 11 inch or A4 paper using 11 point Times. Allow at least one inch (25mm) margins on all borders. A paper must include a title, an abstract, and the body and references. It must also include the names and addresses of all authors, their email addresses, and their telephone/fax numbers. The length of submitted papers must be no more than 15 single-spaced, single-column pages, including all figures, tables, and references. Shorter papers are encouraged. In addition to hardcopies, please send a postscript file of your paper (gzipped if possible) to facilitate electronic reviewing to the following email address: x.yao at cs.bham.ac.uk. Please check the symposium's web site http://www.cs.bham.ac.uk/~xin/ecnn2000 for more details as they become available.

SUBMISSION DEADLINE: DECEMBER 1, 1999

General Chair: Xin Yao
Program Committee Chair: D.B. Fogel
Program Committee Members: H. Adeli, P.J. Angeline, K. Chellapilla, J.-C. Chen, S.-B. Cho, G.W. Greenwood, L. Guan, T. Hussain, N. Kasabov, S. Lucas, N. Murshed, V. Nissen, M. Rizki, R. Salomon, G. Yen, D. Van Veldhuizen, B.-T. Zhang, Q. Zhao

The Symposium follows the 9th IEEE International Conference on Fuzzy Systems (FUZZ-IEEE2000), Hilton Palacio del Rio, San Antonio, TX, USA, 7-10 May 2000.

From espaa at exeter.ac.uk Fri Nov 12 05:24:33 1999
From: espaa at exeter.ac.uk (ESPAA)
Date: Fri, 12 Nov 1999 10:24:33 +0000 (GMT Standard Time)
Subject: PAA JOURNAL CONTENTS
Message-ID:

PATTERN ANALYSIS AND APPLICATIONS
Springer-Verlag London Ltd.
Homepage: http://www.dcs.ex.ac.uk/paa
Electronic Version: http://link.springer.de/link/service/journals/10044
ISSN: 1433-7541 (printed version)
ISSN: 1433-755X (electronic version)

Table of Contents, Vol. 2, Issue 4 (November 1999)

M. Hauta-Kasari, J. Parkkinen, T. Jaaskelainen, R. Lenz: Multi-spectral Texture Segmentation Based on the Spectral Cooccurrence Matrix. Pattern Analysis & Applications 2 (1999) 4, 275-284

P. Fränti, E. I. Ageenko, A. Kolesnikov: Vectorising and Feature-Based Filtering for Line-Drawing Image Compression. Pattern Analysis & Applications 2 (1999) 4, 285-291

A. F. R. Rahman, M. C. Fairhurst: Serial Combination of Multiple Experts: A Unified Evaluation. Pattern Analysis & Applications 2 (1999) 4, 292-311

A. Fusiello, E. Trucco, T. Tommasini, V. Roberto: Improving Feature Tracking with Robust Statistics. Pattern Analysis & Applications 2 (1999) 4, 312-320

P. Carvalho, N. Costa, B. Ribeiro, A. Dourado: On the Use of Neural Networks and Geometrical Criteria for Localisation of Highly Irregular Elliptical Shapes. Pattern Analysis & Applications 2 (1999) 4, 321-342

From fritz at neuro.informatik.uni-ulm.de Fri Nov 12 05:44:56 1999
From: fritz at neuro.informatik.uni-ulm.de (Fritz Sommer)
Date: Fri, 12 Nov 1999 11:44:56 +0100 (MET)
Subject: Research Position in brain imaging/cogn. neuroscience
Message-ID: <14379.60917.506870.529751@cerebellum>

Research Position (BAT IIa) available (cognitive/computational neuroscience and brain imaging)

At the University of Ulm, an interdisciplinary research project on analysis and modeling of functional magnetic resonance data has been established. It is a joint project of the departments of Psychiatry (Prof. Dr. M. Spitzer), Radiology, and Neural Information Processing (Prof. Dr. G. Palm). The project focuses on the development of new methods for the detection and interpretation of functional/effective connectivity in fMRI data and their application to working memory tasks.
This project offers a unique environment for direct cooperation between theorists and experimenters. In the described project a position is available (beginning Jan 2000, 2 years, 1 year extension possible). Candidates should be strongly interested in interdisciplinary research. They should have a background in statistical methods (cluster analysis, Neural Networks), functional MRI analysis or computational neuroscience. A recent master's degree or equivalent in computer science, physics, mathematics or a closely related area is required. Experience in programming in C in a Unix environment is necessary; experience with MATLAB and SPM is helpful. The research can be conducted as part of a PhD thesis in Computer Science. Salary according to BAT IIa. The University of Ulm is an equal opportunity employer and encourages female scientists to apply. Employment will be effected through the "Zentrale Universitaetsverwaltung" of the University of Ulm. Ulm is a town of about 160,000 inhabitants nicely situated in the Danube valley. It has picturesque old parts (not to forget the Gothic cathedral with the highest church tower in the world) and is surrounded by beautiful landscape, lakes, creek valleys, forests and the sparsely populated "Schwaebische Alb" high plain. In a one-hour train ride one can either reach the Alps for skiing and hiking, or the cities of Munich and Stuttgart for more sophisticated cultural programs. Please send us a CV, letter of motivation, and, if possible, addresses of three referees. Prof. Dr. Dr. M. Spitzer, Department of Psychiatry III, University of Ulm, Leimgrubenweg 12, 89075 Ulm, Germany or e-mail to manfred.spitzer at medizin.uni-ulm.de. Because of the time constraints please email your application to: Dr. F. T. Sommer, email: fritz at neuro.informatik.uni-ulm.de (You can also ask for more detailed information on the research project.)
From shastri at ICSI.Berkeley.EDU Fri Nov 12 14:20:53 1999 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Fri, 12 Nov 1999 11:20:53 PST Subject: Algebraic rules Message-ID: <199911121920.LAA19182@lassi.ICSI.Berkeley.EDU> Dear Connectionists: The following technical report may be of interest to some of you. Best wishes. Lokendra Shastri ------- A Spatiotemporal Connectionist Model of Algebraic Rule-Learning Lokendra Shastri and Shawn Chang TR-99-011 July, 1999 International Computer Science Institute Berkeley, CA 94707 Recent experiments by Marcus, Vijayan, Bandi Rao, and Vishton suggest that infants are capable of extracting and using abstract algebraic rules such as ``the first item X is the same as the third item Y''. Such an algebraic rule represents a relationship between placeholders or variables for which one can substitute arbitrary values. As Marcus et al. point out, while most neural network models excel at capturing statistical patterns and regularities in data, they have difficulty in extracting algebraic rules that generalize to new items. We describe a connectionist network architecture that can readily acquire algebraic rules. The extracted rules are not tied to features of words used during habituation, and generalize to new words. Furthermore, the network acquires rules from a small number of examples, without using negative evidence, and without pretraining. A significant aspect of the proposed model is that it identifies a sufficient set of architectural and representational conditions that transform the problem of learning algebraic rules to the much simpler problem of learning to detect coincidences within a spatiotemporal pattern. Two key representational conditions are (i) the existence of nodes that encode serial position within a sequence and (ii) the use of temporal synchrony for expressing bindings between a positional role node and the item that occupies this position in a given sequence.
This work suggests that even abstract algebraic rules can be grounded in concrete and basic notions such as spatial and temporal location, and coincidence. Available at: http://www.icsi.berkeley.edu/~shastri/psfiles/tr-99-011.ps.gz OR http://www.icsi.berkeley.edu/~shastri/psfiles/tr-99-011.pdf Lokendra Shastri International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 shastri at icsi.berkeley.edu http://www.icsi.berkeley.edu/~shastri Phone: (510) 642-4274 ext 310 FAX: (510) 643-7684 From bogus@does.not.exist.com Mon Nov 15 13:07:43 1999 From: bogus@does.not.exist.com () Date: Mon, 15 Nov 1999 19:07:43 +0100 Subject: CFP: ESANN'2000 European Symposium on Artificial Neural Networks Message-ID: ---------------------------------------------------- | | | ESANN'2000 | | | | 8th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 26-27-28, 2000 | | | | Announcement and call for papers | | | | Deadline: 10 December 1999 | ---------------------------------------------------- Technically co-sponsored by the IEEE Neural Networks Council, the IEEE Region 8, the IEEE Benelux Section, and the International Neural Networks Society. The call for papers for the ESANN'2000 conference is now available on the Web: http://www.dice.ucl.ac.be/esann We try as much as possible to avoid multiple mailings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). If you have difficulty connecting to the Web please send an e-mail to esann at dice.ucl.ac.be and we will send you a full version of the call for papers. ESANN'2000 is organised in collaboration with the UCL (Universite catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven).
Scope and topics ---------------- Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference for researchers on fundamentals and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN, in order to present their latest results and comprehensive surveys, and to discuss the future developments in this field. The ESANN'2000 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical aspects, approximation of functions, classification, control, time-series prediction, statistics, signal processing, vision, self-organization, vector quantization, evolutive learning, psychological computations, biological plausibility, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, data analysis, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are also encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. Note that the topic of a paper, not its quality, determines whether it is better suited to an oral or a poster session. Posters will be selected according to the same criteria as oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may indicate on the author submission form that they are only willing to present their paper orally.
The following is a non-exhaustive list of topics covered during the ESANN conferences: o theory o models and architectures o mathematics o learning algorithms o vector quantization o self-organization o RBF networks o Bayesian classification o recurrent networks o support vector machines o time series forecasting o adaptive control o statistical data analysis o independent component analysis o signal processing o approximation of functions o cellular neural networks o fuzzy neural networks o natural and artificial vision o hybrid networks o identification of non-linear dynamic systems o biologically plausible artificial networks o bio-inspired systems o neurobiological systems o cognitive psychology o adaptive behaviour o evolutive learning Special sessions ---------------- Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address. o Self-organizing maps for data analysis J. Lampinen, K. Kaski, Helsinki Univ. of Tech. (Finland) o Time-series prediction J. Suykens, J. Vandewalle, K.U. Leuven (Belgium) o Artificial neural networks and robotics R. Duro, J. Santos Reyes, Univ. da Coruna (Spain) o Support Vector Machines C. Campbell, Bristol Univ. (UK), J. Suykens, K.U. Leuven (Belgium) o Neural networks and statistics W. Duch, Nicholas Copernicus Univ. (Poland) o Neural network in medicine T. Villmann, Univ. Leipzig (Germany) o Artificial neural networks for energy management systems G. Joya, Univ. de Malaga (Spain) Details on special sessions are available on the Web. 
Location -------- The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe. Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known world-wide, and famous for its architectural style, its canals, and its pleasant atmosphere. The conference will be organised in a hotel located near the centre (walking distance) of the town. There is no obligation for the participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; rooms may be booked in the conference hotel, or in another one (50 m. from the first one) at a preferential rate through the conference secretariat. A list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium. Call for contributions ---------------------- Prospective authors are invited to submit - six original copies of their manuscript (including at least two originals or very good copies without glued material, which will be used for the proceedings) - one signed copy of the author submission form before December 10, 1999. Authors are invited to include a floppy disk or CD with their contribution in (generic) PostScript or PDF format. Sorry, electronic or fax submissions are not accepted. The working language of the conference (including the proceedings) is English. The instructions to authors, together with the author submission form, are available on the ESANN Web server: http://www.dice.ucl.ac.be/esann A printed version of these documents is also available through the conference secretariat (please use email if possible). Authors are invited to follow the instructions to authors. A LaTeX style file is also available on the Web. Authors must indicate their choice for oral or poster presentation on the author submission form.
They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. Authors of accepted papers will have to register before February 28, 2000. They will benefit from the advance registration fee. Submissions must be sent to: Michel Verleysen UCL - DICE 3, place du Levant B-1348 Louvain-la-Neuve Belgium esann at dice.ucl.ac.be All submissions will be acknowledged by fax or email before December 23, 1999. Deadlines --------- Submission of papers December 10, 1999 Notification of acceptance January 31, 2000 Symposium April 26-27-28, 2000 Registration fees ----------------- registration before registration after March 17, 2000 March 17, 2000 Universities BEF 16000 BEF 17000 Industries BEF 20000 BEF 21000 The registration fee includes attendance at all sessions, the lunches during the three days of the conference, the coffee breaks twice a day, the conference dinner, and the proceedings. Conference secretariat ---------------------- Michel Verleysen D facto conference services phone: + 32 2 420 37 57 27 rue du Laekenveld Fax: + 32 2 420 02 55 B - 1080 Brussels (Belgium) E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/esann Steering and local committee ---------------------------- François Blayo Préfigure (F) Marie Cottrell Univ. Paris I (F) Jeanny Hérault INPG Grenoble (F) Henri Leich Fac. Polytech. Mons (B) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Jean-Pierre Peters FUNDP Namur (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------- Edoardo Amaldi Politecnico di Milano (I) Agnès Babloyantz Univ. Libre Bruxelles (B) Hervé Bourlard IDIAP Martigny (CH) Joan Cabestany Univ. Polit. de Catalunya (E) Holk Cruse Universität Bielefeld (D) Eric de Bodt Univ. Lille II & UCL Louv.-la-N. (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicholas Copernicus Univ.
(PL) Marc Duranton Philips / LEP (F) Jean-Claude Fort Université Nancy I (F) Bernd Fritzke Dresden Univ. of Technology (D) Stan Gielen Univ. of Nijmegen (NL) Manuel Grana UPV San Sebastian (E) Anne Guérin-Dugué INPG Grenoble (F) Martin Hasler EPFL Lausanne (CH) Laurent Hérault CEA-LETI Grenoble (F) Christian Jutten INPG Grenoble (F) Juha Karhunen Helsinki Univ. of Technology (FIN) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Mia Loccufier Univ. Gent (B) Eddy Mayoraz Motorola Palo Alto (USA) Jean-Arcady Meyer Univ. Pierre et Marie Curie - Paris 6 (F) José Mira UNED (E) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Gilles Pagès Univ. Pierre et Marie Curie - Paris 6 (F) Thomas Parisini Politecnico di Milano (I) Hélène Paugam-Moisy Univ. Lumière Lyon 2 (F) Alberto Prieto Universidad de Granada (E) Leonardo Reyneri Politecnico di Torino (I) Tamas Roska Hungarian Academy of Science (H) Jean-Pierre Rospars INRA Versailles (F) John Stonham Brunel University (UK) Johan Suykens KUL Leuven (B) John Taylor King's College London (UK) Claude Touzet IUSPIM Marseilles (F) Marc Van Hulle KUL Leuven (B) Christian Wellekens Eurecom Sophia-Antipolis (F) From mozer at cs.colorado.edu Tue Nov 16 14:47:25 1999 From: mozer at cs.colorado.edu (Mike Mozer) Date: Tue, 16 Nov 99 12:47:25 -0700 Subject: faculty positions in Machine Learning at U. Colorado, Boulder Message-ID: <199911161947.MAA20910@neuron.cs.colorado.edu> The Computer Science Department at the University of Colorado at Boulder has immediate openings for two new faculty members in the area of Machine Learning (at either junior or senior level). Additional openings are likely in the coming years through the Institute of Cognitive Science.
Details can be found in the job ad at http://www.cs.colorado.edu/department/news/news.html#search From jagota at cse.ucsc.edu Tue Nov 16 17:07:05 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Tue, 16 Nov 1999 14:07:05 -0800 Subject: NIPS*99: call for volunteers Message-ID: <199911162207.OAA27891@sundance.cse.ucsc.edu> We can use a few more student volunteers at NIPS*99. (Any nine hours of work in Denver gets registration for both the tutorials and the conference, as well as reception and dinner. Any six hours of work in Breckenridge gets registration for the workshops, which includes the receptions and dinner, and transportation there on the bus. One may volunteer at both places.) To apply, first visit the "Volunteer Work" subsection of the "Programs and Schedules" section of the NIPS page http://www.cs.cmu.edu/Groups/NIPS/ for instructions, then visit http://www.cse.ucsc.edu/~jagota/NIPS/S.html for the current list of open tasks. Arun Jagota, NIPS*99 Local Arrangements jagota at cse.ucsc.edu From geoff at giccs.georgetown.edu Tue Nov 16 18:16:07 1999 From: geoff at giccs.georgetown.edu (Geoff Goodhill) Date: Tue, 16 Nov 1999 18:16:07 -0500 Subject: Papers available: axon guidance Message-ID: <199911162316.SAA05112@brecker.giccs.georgetown.edu> The following papers in TINS and J Neurobiol are now available from http://www.giccs.georgetown.edu/labs/cns/axon.html 1. Retinotectal Maps: Molecules, Models, and Misplaced Data. Geoffrey J. Goodhill & Linda J. Richards. Trends in Neurosciences, 22, 529-534 (December 1999). 2. Theoretical analysis of gradient detection by growth cones. Geoffrey J. Goodhill & Jeffrey S. Urbach. Journal of Neurobiology, 41, 230-241 (November 1999). Abstracts are below. 
Geoff Geoffrey J Goodhill, PhD Assistant Professor, Department of Neuroscience & Georgetown Institute for Cognitive and Computational Sciences Georgetown University Medical Center 3970 Reservoir Road NW, Washington DC 20007 Tel: (202) 687 6889, Fax: (202) 687 0617 Email: geoff at giccs.georgetown.edu Homepage: www.giccs.georgetown.edu/labs/cns ABSTRACTS 1. The mechanisms underlying the formation of topographic maps in the retinotectal system have long been debated. Recently, members of the Eph and ephrin receptor-ligand family have been found to provide a molecular substrate for one type of mechanism, that of chemospecific gradient matching as proposed by Sperry. However, experiments over several decades have demonstrated that there is more to map formation than gradient matching. This article briefly reviews the old and new findings, argues that these two types of data must be properly integrated in order to understand map formation fully, and suggests some experimental and theoretical ways to begin this process. 2. Gradients of diffusible and substrate-bound molecules play an important role in guiding axons to appropriate targets in the developing nervous system. Although some of the molecules involved have recently been identified, little is known about the physical mechanisms by which growth cones sense gradients. This paper applies the seminal Berg & Purcell (1977) model of gradient sensing to this problem. The model provides estimates for the statistical fluctuations in the measurement of concentration by a small sensing device. By assuming that gradient detection consists of the comparison of concentrations at two spatially or temporally separated points, the model therefore provides an estimate for the steepness of gradient that can be detected as a function of physiological parameters. The model makes the following specific predictions. (1) It is more likely that growth cones use a spatial rather than temporal sensing strategy. 
(2) Growth cone sensitivity increases with the concentration of ligand, the speed of ligand diffusion, the size of the growth cone, and the time over which it averages the gradient signal. (3) The minimum detectable gradient steepness for growth cones is roughly in the range 1% - 10%. (4) This value varies depending on whether a bound or freely diffusing ligand is being sensed, and on whether the sensing occurs in three dimensions or two dimensions. The model also makes predictions concerning the role of filopodia in gradient detection. From steve at cns.bu.edu Wed Nov 17 08:27:00 1999 From: steve at cns.bu.edu (Stephen Grossberg) Date: Wed, 17 Nov 1999 08:27:00 -0500 Subject: Perceptual Grouping and Object-Based Attention by the Laminar Circuits of Visual Cortex Message-ID: The following article can be read at http://www.cns.bu.edu/Profiles/Grossberg/ S. Grossberg and R. D. S. Raizada (1999) Contrast-sensitive perceptual grouping and object-based attention in the laminar circuits of primary visual cortex. Vision Research, in press. ABSTRACT Recent neurophysiological studies have shown that primary visual cortex, or V1, does more than passively process image features using the feedforward filters suggested by Hubel and Wiesel. It also uses horizontal interactions to group features preattentively into object representations, and feedback interactions to selectively attend to these groupings. All neocortical areas, including V1, are organized into layered circuits. We present a neural model showing how the layered circuits in areas V1 and V2 enable feedforward, horizontal, and feedback interactions to complete perceptual groupings over positions that do not receive contrastive visual inputs, even while attention can only modulate or prime positions that do not receive such inputs. Recent neurophysiological data about how grouping and attention occur and interact in V1 are simulated and explained, and testable predictions are made. 
These simulations show how attention can selectively propagate along an object grouping and protect it from competitive masking, and how contextual stimuli can enhance or suppress groupings in a contrast-sensitive manner. Preliminary version appears as Boston University Technical Report, CAS/CNS-TR-99-008. Available in PDF and gzipped PostScript. From becker at curie.psychology.mcmaster.ca Wed Nov 17 18:18:18 1999 From: becker at curie.psychology.mcmaster.ca (Sue Becker) Date: Wed, 17 Nov 1999 18:18:18 -0500 (EST) Subject: faculty position: human cognition/cognitive neuroscience Message-ID: Dear list members, The department of Psychology at McMaster has an open faculty position at the assistant or associate level that may be of interest. Our advertisement is included below. Although it generally targets human cognition, cognitive neuroscience (neural computation, neuropsychology, brain imaging) is a sub-area of high priority. Feel free to contact me if you are interested in applying. cheers, Sue Becker Sue Becker Department of Psychology, McMaster University becker at mcmaster.ca 1280 Main Street West, Hamilton, Ont. L8S 4K1 Fax: (905)529-6225 http://www.science.mcmaster.ca/Psychology/sb.html Tel: 525-9140 ext. 23020 For Aug 6/1999-June 30/2000: becker at mcmaster.ca Institute of Cognitive Neuroscience Fax: 44-(0)171-391-1145 Alexandra House, University College London Tel: 44-(0)171-391-1148 17 Queen Square, London, UK WC1N 3AR ------------------------------------------------------------- The Department of Psychology at McMaster University invites applications for a tenure track appointment at the Assistant Professor level or early Associate Professor level in the area of human cognition. Preference will be given to applicants with research interests in higher level cognitive processes (e.g., memory, categorization, decision-making), or a research program in neuropsychology, particularly one involving patient populations.
However, candidates with research programs in other areas of cognition are strongly encouraged to apply. Research which extends to the domain of cognitive neuroscience (e.g., neuroimaging, neural computation) will be considered an asset. To apply, send a curriculum vitae, a short statement of research interests, selected reprints, and three letters of reference to: Dr. Bruce Milliken, Department of Psychology, McMaster University, Hamilton, Ontario, CANADA L8S 4K1. Closing date for applications and supporting material is December 15, 1999. In accordance with Canadian immigration requirements, this advertisement is directed to Canadian citizens and permanent residents. McMaster University is committed to Employment Equity and encourages applications from all qualified candidates, including aboriginal peoples, persons with disabilities, members of visible minorities, and women. Interested candidates may learn more about the department at http://www.psychology.mcmaster.ca. From tp at ai.mit.edu Thu Nov 18 01:01:26 1999 From: tp at ai.mit.edu (Tomaso Poggio) Date: Thu, 18 Nov 1999 01:01:26 -0500 Subject: position available/MIT Message-ID: <4.2.0.58.19991118005531.05622410@pop6.attglobal.net> MASSACHUSETTS INSTITUTE OF TECHNOLOGY DEPARTMENT OF BRAIN & COGNITIVE SCIENCES The MIT Department of Brain and Cognitive Sciences anticipates making a new tenure-track appointment in theoretical/experimental neuroscience at the Assistant Professor level. Candidates should combine a strong mathematical background and an active research interest in the modeling of specific cellular- or systems-level phenomena with appropriate experiments. Individuals whose research focuses on learning and memory or sensory-motor integration at the level of neurons and networks of neurons are especially encouraged to apply. We are also interested in individuals working on bioinformatics in neuroscience. Responsibilities include graduate and undergraduate teaching and research supervision.
Applications should include a brief cover letter stating the candidate's research and teaching interests, a vita, three letters of recommendation, and representative reprints, and should be sent to: Theoretical/Experimental Neuroscience Search Committee, Dept. of Brain & Cognitive Sciences, E25-406, MIT, Cambridge, MA 02139. Review of applications starts January 15, 2000. MIT is an Affirmative Action/Equal Opportunity Employer. Qualified women and minority candidates are especially encouraged to apply. Tomaso Poggio Uncas and Helen Whitaker Professor Brain Sciences Department and Artificial Intelligence Lab M.I.T., E25-218, 45 Carleton St Cambridge, MA 02142 E-mail: tp at ai.mit.edu Web: CBCL-Web-page: Phone: 617-253-5230 Fax: 617-253-2964 From giorgio.giacinto at computer.org Thu Nov 18 04:24:54 1999 From: giorgio.giacinto at computer.org (Giorgio Giacinto) Date: Thu, 18 Nov 1999 10:24:54 +0100 Subject: 1st International Workshop on Multiple Classifier Systems Message-ID: **Apologies for multiple copies** ************************************************** ***** First Announcement and Call for Papers ***** ************************************************** ********************************************************************** 1st MCS FIRST INTERNATIONAL WORKSHOP ON MULTIPLE CLASSIFIER SYSTEMS Santa Margherita di Pula, Cagliari, Italy, June 21-23 2000 ********************************************************************** *** Updated information: http://www.diee.unica.it/mcs *** *** E-mail: mcs at diee.unica.it *** WORKSHOP OBJECTIVES The main goal of the workshop is to assess the state of the art of the theory and the applications of multiple classifier systems and related approaches.
Contributions from all the research communities working in the field are welcome in order to compare the different approaches and to define the common research priorities. Special attention is also devoted to assessing the applications of multiple classifier systems and the potential market perspectives. The workshop program will include both plenary lectures given by invited speakers and papers accepted for oral presentation. The papers will be published in the workshop proceedings, and extended versions of selected papers will be considered for publication in a special issue of the Pattern Analysis and Applications Journal on Classifier Fusion. WORKSHOP TOPICS Papers describing original work in the following and related research topics are welcome: Theoretical foundations of multiple classifier systems Methods for classifier combination Methods for classifier selection Neural network ensembles Modular neural networks Mixture models Multiple expert systems Hybrid systems Learning in multiple classifier systems Design of multiple classifier systems Multiple models in data mining Related approaches (intelligent agents, multi-criteria decision making, etc.) Applications (biometrics, document analysis, data mining, remote sensing, etc.) WORKSHOP CHAIRS Josef Kittler (Univ. of Surrey, United Kingdom) Fabio Roli (Univ. of Cagliari, Italy) INVITED SPEAKERS Thomas G. Dietterich (Oregon State University, USA) Robert P.W. Duin (Delft Univ. of Tech., The Netherlands) Amanda J.C. Sharkey (Dept. Computer Science, Univ. of Sheffield, UK) Sargur N. Srihari (CEDAR, State Univ. of New York, Buffalo, USA) Ching Y. Suen (CENPARMI, Concordia Univ., Montreal, Canada) PROGRAM CHAIR Gianni Vernazza (Univ. of Genoa, Italy) SCIENTIFIC COMMITTEE J. A. Benediktsson (Iceland) H. Bunke (Switzerland) L. P. Cordella (Italy) T.G. Dietterich (USA) R. P.W. Duin (The Netherlands) J. Ghosh (USA) S. Impedovo (Italy) D. Landgrebe (USA) D.S. Lee (USA) A. K. Jain (USA) T. K. Ho (USA) D.
Partridge (UK) C. Scagliola (Italy) R. Schapire (USA) A. J.C. Sharkey (UK) S. N. Srihari (USA) C.Y. Suen (Canada) D. Wolpert (USA) LOCAL COMMITTEE G. Armano (Univ. of Cagliari, Italy) G. Giacinto (Univ. of Cagliari, Italy) G. Fumera (Univ. of Cagliari, Italy) PAPER SUBMISSION Three hard copies of the full papers should be mailed to: 1st MCS Prof. Fabio Roli Electrical and Electronic Engineering Dept. - University of Cagliari Piazza d'Armi 09123 Cagliari Italy In addition, participants should submit an electronic version of the manuscript (PostScript or PDF format) to mcs at diee.unica.it The papers should not exceed 15 A4 pages (12pt, double-spaced). A cover sheet with the authors' names and affiliations is also requested, with the complete address of the corresponding author, and an abstract (200 words). The papers will be refereed by two separate reviewers of the Scientific Committee. IMPORTANT DATES February 1, 2000: Paper Submission March 15, 2000: Notification of Acceptance April 30, 2000: Camera-ready Manuscripts May 2000: Early registration WORKSHOP VENUE The workshop will be held at the Is Molas Golf Hotel, Santa Margherita di Pula, Cagliari, Italy. Additional information concerning the workshop venue can be found at http://www.ismolas.it WORKSHOP PROCEEDINGS The papers will be published in the workshop proceedings, and extended versions of selected papers will be considered for publication in a special issue of the Pattern Analysis and Applications Journal on Classifier Fusion. ==================================================================== Fabio Roli, PhD Associate Professor of Computer Science Electrical and Electronic Engineering Dept.
- University of Cagliari Piazza d'Armi 09123 Cagliari Italy Phone +39 070 675 5874 Fax +39 070 6755900 e-mail roli at diee.unica.it Web Page at http://www.diee.unica.it From villmann at informatik.uni-leipzig.de Thu Nov 18 03:18:58 1999 From: villmann at informatik.uni-leipzig.de (villmann@informatik.uni-leipzig.de) Date: Thu, 18 Nov 1999 09:18:58 +0100 Subject: ESANN'2000 - special session - NN Applications in Medicine Message-ID: <199911180818.JAA13635@ilabws.informatik.uni-leipzig.de> Dear colleagues, I would like to announce, and to invite contributions to, the special session "Neural Networks Applications in Medicine" organized by Thomas Villmann (University of Leipzig, Germany) and to be held at the ---------------------------------------------------- | | | ESANN'2000 | | | | 8th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 26-27-28, 2000 | | | | Announcement and call for papers | | | | Deadline: 10 December 1999 | ---------------------------------------------------- PROPOSAL ------------------- Artificial neural networks can be taken as a special kind of learning and self-adapting data processing systems. The abilities to handle noisy and high-dimensional data, nonlinear problems, large data sets etc. using neural techniques have led to innumerable applications. An important area of application is medical research and data analysis. Thereby, we have to distinguish at least two main directions: the first one is the description of neural processes in brains by neural network models. The other one is the field of data analysis and related topics. The announced special session will focus on the second of these. The applications of neural networks for data processing in medicine cover a wide range of possible tasks. This leads to further developments also in the theory of neural networks.
The applications include several techniques ranging from image processing, (non-linear) principal component analysis and dimension reduction, processing of noisy and/or incomplete data, time series prediction, feature extraction and classification problems to the processing of non-metric data, i.e. categorical or ordinal data. The recently proposed approach of Blind Source Separation using neural networks also offers a way to solve the problem of data decorrelation, which often occurs in medicine. Neural network approaches as robust tools play an increasing role in real-world applications in medicine. In the proposed special session we want to give examples of applications of neural networks in medicine which have led to new developments in neural networks. Thereby we focus on applications combining original ideas and new aspects in neural network approaches with a real-world task. Authors are invited to submit contributions which can be in any area of medical research and applications, such as, for example (but not restricted to), the following: 1.) image processing (NMR-spectroscopy, radiology, ...) 2.) time series prediction (EEG analysis, cardiology, sleep detection, ...) 3.) pattern classification 4.) clustering, fuzzy clustering 5.) Blind Source Separation and Decorrelation 6.) dimension and noise reduction 7.) evaluation of non-metric data (categorical/ordinal data) It should be emphasized that, in agreement with the scope of the ESANN, a strong theoretical background and new aspects of neural network development are required for acceptance of contributions. Prospective authors are invited to submit - six original copies of their manuscript (including at least two originals or very good copies without glued material, which will be used for the proceedings) - one signed copy of the author submission form before December 10, 1999. Authors are invited to include a floppy disk or CD with their contribution in (generic) PostScript or PDF format.
Sorry, electronic or fax submissions are not accepted. The working language of the conference (including the proceedings) is English. The instructions to authors, together with the author submission form, are available on the ESANN Web server: http://www.dice.ucl.ac.be/esann A printed version of these documents is also available through the conference secretariat (please use email if possible). Authors are invited to follow the instructions to authors. A LaTeX style file is also available on the Web. Authors must indicate their choice for oral or poster presentation on the author submission form. They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. Authors of accepted papers will have to register before February 28, 2000. They will benefit from the advance registration fee. Submissions must be sent to: Michel Verleysen UCL - DICE 3, place du Levant B-1348 Louvain-la-Neuve Belgium esann at dice.ucl.ac.be All submissions will be acknowledged by fax or email before December 23, 1999.

Deadlines
---------
Submission of papers          December 10, 1999
Notification of acceptance    January 31, 2000
Symposium                     April 26-27-28, 2000

Registration fees
-----------------
                 registration before    registration after
                 March 17, 2000         March 17, 2000
Universities     BEF 16000              BEF 17000
Industries       BEF 20000              BEF 21000

The registration fee includes attendance at all sessions, lunches during the three days of the conference, coffee breaks twice a day, the conference dinner, and the proceedings. __________________________________________________________ Dr. Thomas Villmann University Leipzig - Clinic for Psychotherapy Karl-Tauchnitz-Str.
25 04109 Leipzig Germany Phone: +49 (0)341 9718868 Fax : +49 (0)341 2131257 email: villmann at informatik.uni-Leipzig.de __________________________________________________________ From szepes at mindmaker.hu Thu Nov 18 13:32:57 1999 From: szepes at mindmaker.hu (Csaba Szepesvari) Date: Thu, 18 Nov 1999 19:32:57 +0100 Subject: TR: Comparing Value-Function Estimation Algorithms in Undiscounted Problems Message-ID: <38344658.C3436C8E@mindmaker.hu> Dear Colleagues, The following technical report is available at http://victoria.mindmaker.hu/~szepes/papers/slowql-tr99-02.ps.gz All comments are welcome. Best wishes, Csaba Szepesvari ---------------------------------------------------------------- Comparing Value-Function Estimation Algorithms in Undiscounted Problems TR99-02, Mindmaker Ltd., Budapest 1121, Konkoly Th. M. u. 29-33 Ferenc Beleznay, Tamas Grobler and Csaba Szepesvari We compare the scaling properties of several value-function estimation algorithms. In particular, we prove that Q-learning can scale exponentially slowly with the number of states. We identify the reasons for this slow convergence and show that both TD($\lambda$) and learning with a fixed learning rate enjoy rather fast convergence, just like the model-based method. From guang at ce.chalmers.se Fri Nov 19 10:37:46 1999 From: guang at ce.chalmers.se (Guang Li) Date: Fri, 19 Nov 1999 16:37:46 +0100 Subject: PhD thesis available Message-ID: <38356EC9.F236B099@ce.chalmers.se> Dear researchers, A PhD thesis is available at Department of Computer Engineering Chalmers University of Technology S-412 96 Gothenburg, Sweden Anyone interested, please reply to this mail to request a copy. Regards Guang Li --------------------- Title: Towards On-line Learning Agents for Autonomous Navigation Abstract The design of a mechatronic agent capable of navigating autonomously in a changing and perhaps previously unfamiliar environment is a very challenging issue.
This thesis addresses this issue from both functional and system perspectives. Functions such as spatial representation, localization, path-finding and collision avoidance are essential to autonomous agent navigation. Four types of learning related to these functions have been identified as important: sensory information categorization and classification, the learning of stimulus-response mappings, the learning of spatial representation, and the coding and adaptation of travel experience with regard to specific tasks. It is argued that, in order to achieve a high degree of autonomy at the system level, it is essential to implement each of these navigational functions with a highly autonomous learning technique. An analysis of several representative artificial neural network (ANN) algorithms for their degrees of autonomy and computational characteristics indicates that none of the learning techniques analyzed is, on its own, sufficient for spatial learning. It is shown that biology can be inspirational in finding a possibly better, or perhaps more complete, solution to the learning of spatial representation than previous engineering or ANN-based approaches. In particular, data on the biological head direction system have inspired a computational model which is shown to use learned environmental features to correct the directional error accumulated by dead-reckoning in a simulated mobile robot. Furthermore, taking the hippocampal place-learning system as inspiration, a network model of dynamic cell structure is suggested. It allows an autonomous agent to perform tasks such as environmental mapping, localization and path-finding. In this model, a focus mechanism is included to help minimize computational needs by directing the adaptation of the network and the path-finding. The thesis also discusses various approaches toward achieving a high degree of autonomy at the system level.
It is also shown that a feedforward gating mechanism can be incorporated into a layered design framework to accommodate the interaction between various navigational functions having high degrees of autonomy. From ijspeert at rana.usc.edu Fri Nov 19 21:14:41 1999 From: ijspeert at rana.usc.edu (Auke Ijspeert) Date: Fri, 19 Nov 1999 18:14:41 -0800 (PST) Subject: PhD thesis and preprints: CPGs for lamprey and salamander locomotion Message-ID: Dear Connectionists, The following PhD thesis and preprints (see below) may interest people working on the modeling of central pattern generators for locomotion. They present results on the use of evolutionary algorithms for designing biologically plausible neural circuits modeled as continuous-time neural networks. In particular, the neural circuits underlying the swimming of the lamprey and the swimming and trotting of the salamander are investigated. This work is inspired by Ekeberg's neuromechanical model of lamprey swimming (Biol. Cybern. 69, 363-374, 1993), and, similarly, the developed CPGs are incorporated in simple biomechanical models of a lamprey and a salamander. Animated gifs illustrating the different gaits developed can be found at: http://rana.usc.edu:8376/~ijspeert/ The thesis and the papers (titles and abstracts below) can be found in gzipped PostScript at: http://rana.usc.edu:8376/~ijspeert/publications.html Please tell me if you have any problems downloading them. Comments are most welcome! Best regards, Auke Ijspeert -------------------------------------------------------------------------- Dr Auke Jan Ijspeert Brain Simulation Lab & Computational Learning and Motor Control Lab Department of Computer Science, Hedco Neurosciences building U.
of Southern California, Los Angeles, CA 90089, USA Web: http://rana.usc.edu:8376/~ijspeert/ Tel: +1 213 7401922 or 7406995 (work) +1 310 8238087 (home) Fax: +1 213 7405687 Email: ijspeert at rana.usc.edu -------------------------------------------------------------------------- _____________________________________________________________________ PhD thesis, A.J. Ijspeert Design of artificial neural oscillatory circuits for the control of lamprey- and salamander-like locomotion using evolutionary algorithms Supervisors: John Hallam and David Willshaw Department of Artificial Intelligence, University of Edinburgh, 1998 Abstract: This dissertation investigates the evolutionary design of oscillatory artificial neural networks for the control of animal-like locomotion. It is inspired by the neural organisation of locomotor circuitries in vertebrates, and explores in particular the control of undulatory swimming and walking. The difficulty with designing such controllers is to find mechanisms which can transform commands concerning the direction and the speed of motion into the multiple rhythmic signals sent to the multiple actuators typically involved in animal-like locomotion. In vertebrates, such control mechanisms are provided by central pattern generators which are neural circuits capable of producing the patterns of oscillations necessary for locomotion without oscillatory input from higher control centres or from sensory feedback. This thesis explores the space of possible neural configurations for the control of undulatory locomotion, and addresses the problem of how biologically plausible neural controllers can be automatically generated. Evolutionary algorithms are used to design connectionist models of central pattern generators for the motion of simulated lampreys and salamanders. This work is inspired by Ekeberg's neuronal and mechanical simulation of the lamprey [Ekeberg 93]. 
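The core loop of this kind of evolutionary design, a genetic algorithm searching connection parameters of a small continuous-time network until sustained oscillation emerges, can be sketched in miniature as follows. The two-neuron network, the variance-based fitness measure, and all GA settings below are illustrative assumptions, not the thesis's actual models.

```python
import math
import random

def simulate(w, steps=300, dt=0.1):
    """Euler-integrate a 2-neuron continuous-time network:
    da_i/dt = -a_i + tanh(w[i][0]*a[0] + w[i][1]*a[1] + w[i][2]).
    Returns the trace of the first unit's activity."""
    a = [0.1, 0.0]
    trace = []
    for _ in range(steps):
        a = [a[i] + dt * (-a[i] + math.tanh(w[i][0] * a[0] + w[i][1] * a[1] + w[i][2]))
             for i in range(2)]
        trace.append(a[0])
    return trace

def fitness(w):
    trace = simulate(w)[150:]                   # discard the initial transient
    mean = sum(trace) / len(trace)
    return sum((x - mean) ** 2 for x in trace)  # reward sustained oscillation

def evolve(pop_size=30, generations=40, sigma=0.3, seed=0):
    """Truncation-selection GA with Gaussian mutation over the 6 weights."""
    rng = random.Random(seed)
    pop = [[[rng.uniform(-3, 3) for _ in range(3)] for _ in range(2)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]           # keep the best half
        children = [[[g + rng.gauss(0, sigma) for g in row] for row in p]
                    for p in parents]           # mutated copies of parents
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("oscillation score of best controller:", fitness(best))
```

A real controller would of course use many more units, an incremental fitness schedule, and a mechanical body model in the loop; the sketch only shows the evaluate-select-mutate cycle.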
The first part of the thesis consists of developing alternative neural controllers for a similar mechanical simulation. Using a genetic algorithm and an incremental approach, a variety of controllers other than the biological configuration are successfully developed which can control swimming with at least the same efficiency. The same method is then used to generate synaptic weights for a controller which has the observed biological connectivity in order to illustrate how the genetic algorithm could be used for developing neurobiological models. Biologically plausible controllers are evolved which better fit physiological observations than Ekeberg's hand-crafted model. Finally, in collaboration with Jerome Kodjabachian, swimming controllers are designed using a developmental encoding scheme, in which developmental programs are evolved which determine how neurons divide and get connected to each other on a two-dimensional substrate. The second part of this dissertation examines the control of salamander-like swimming and trotting. Salamanders swim like lampreys but, on the ground, they switch to a trotting gait in which the trunk performs a standing wave with the nodes at the girdles. Little is known about the locomotion circuitry of the salamander, but neurobiologists have hypothesised that it is based on a lamprey-like organisation. A mechanical simulation of a salamander-like animat is developed, and neural controllers capable of exhibiting the two types of gaits are evolved. The controllers are made of two neural oscillators projecting to the limb motoneurons and to lamprey-like trunk circuitry. By modulating the tonic input applied to the networks, the type of gait, the speed and the direction of motion can be varied. 
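The role of tonic input in modulating such controllers can be caricatured with an abstract chain of phase oscillators: a constant drive sets the cycle frequency while fixed phase lags along the chain yield a head-to-tail travelling wave. This is only a loose analogy to the evolved CPGs above, and every parameter below is invented for illustration.

```python
import math

def step_chain(thetas, drive, dt=0.01, k=4.0, lag=0.5):
    """One Euler step for a chain of phase oscillators. Each segment
    tries to trail its rostral neighbour by `lag` radians, producing a
    head-to-tail travelling wave; `drive` sets the intrinsic frequency."""
    omega = 2 * math.pi * drive                   # tonic drive -> frequency
    new = [thetas[0] + dt * omega]                # head: no rostral neighbour
    for i in range(1, len(thetas)):
        coupling = k * math.sin(thetas[i - 1] - thetas[i] - lag)
        new.append(thetas[i] + dt * (omega + coupling))
    return new

def mean_period(drive, n=10, steps=20000, dt=0.01):
    """Average oscillation period of the head segment under a given drive."""
    thetas = [0.0] * n
    for _ in range(steps):
        thetas = step_chain(thetas, drive, dt=dt)
    return 2 * math.pi * steps * dt / thetas[0]

print(mean_period(1.0))   # ~1.0 s cycle at unit drive
print(mean_period(2.0))   # ~0.5 s: doubling the drive doubles the frequency
```

Switching gaits, as in the salamander model, would additionally require changing the coupling pattern (e.g. the sign of the inter-segment lag), not just the drive level.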
By developing neural controllers for lamprey- and salamander-like locomotion, this thesis provides insights into the biological control of undulatory swimming and walking, and shows how evolutionary algorithms can be used for developing neurobiological models and for generating neural controllers for locomotion. Such a method could potentially be used for designing controllers for swimming or walking robots, for instance. _______________________________________________________________________ A.J. Ijspeert, J. Hallam and D. Willshaw: Evolving swimming controllers for a simulated lamprey with inspiration from neurobiology, Adaptive Behavior 7:2, 1999 (in press). Abstract: This paper presents how neural swimming controllers for a simulated lamprey can be developed using evolutionary algorithms. A genetic algorithm is used for evolving the architecture of a connectionist model which determines the muscular activity of a simulated body in interaction with water. This work is inspired by the biological model developed by Ekeberg which reproduces the central pattern generator observed in the real lamprey \cite{Ekeberg93}. In evolving artificial controllers, we demonstrate that a genetic algorithm can be an interesting design technique for neural controllers and that there exist alternative solutions to the biological connectivity. A variety of neural controllers are evolved which can produce the pattern of oscillations necessary for swimming. These patterns can be modulated through the external excitation applied to the network in order to vary the speed and the direction of swimming. The best evolved controllers cover larger ranges of frequencies, phase lags and speeds of swimming than Ekeberg's model. We also show that the same techniques for evolving artificial solutions can be interesting tools for developing neurobiological models. 
In particular, biologically plausible controllers can be developed with ranges of oscillation frequency much closer to those observed in the real lamprey than Ekeberg's hand-crafted model. Keywords: Neural control; genetic algorithm; simulation; central pattern generator; swimming; lamprey. _______________________________________________________________________ A.J. Ijspeert, J. Kodjabachian: Evolution and development of a central pattern generator for the swimming of a lamprey, Artificial Life 5:3, 1999 (in press). Abstract: This paper describes the design of neural control architectures for locomotion using an evolutionary approach. Inspired by the central pattern generators found in animals, we develop neural controllers which can produce the patterns of oscillations necessary for the swimming of a simulated lamprey. This work is inspired by Ekeberg's neuronal and mechanical model of a lamprey \cite{Ekeberg93}, and follows experiments in which swimming controllers were evolved using a simple encoding scheme \cite{Ijspeert99_ab,Ijspeert98_sab}. Here, controllers are developed using an evolutionary algorithm based on the SGOCE encoding \cite{Kodjabachian98a,Kodjabachian98b} in which a genetic programming approach is used to evolve developmental programs which encode the growing of a dynamical neural network. The developmental programs determine how neurons located on a 2D substrate produce new cells through cellular division and how they form efferent or afferent interconnections. Swimming controllers are generated when the growing networks eventually create connections to the muscles located on both sides of the rectangular substrate. These muscles are part of a 2D mechanical simulation of the body of the lamprey in interaction with water. The motivation of this paper is to develop a method for the design of control mechanisms for animal-like locomotion. 
Such locomotion is characterised by a large number of actuators, a rhythmic activity, and the fact that efficient motion is only obtained when the actuators are well coordinated. The task of the control mechanism is therefore to transform commands concerning the speed and direction of motion into the signals sent to the multiple actuators. We define a fitness function, based on several simulations of the controller with different command settings, which rewards the capacity to modulate the speed and the direction of swimming in response to simple, varying input signals. Central pattern generators capable of producing the relatively complex patterns of oscillations necessary for swimming are thus evolved. The best solutions generate travelling waves of neural activity and propagate, similarly to the swimming of a real lamprey, undulations of the body from head to tail, propelling the lamprey forward through the water. By simply varying the amplitude of two input signals, the speed and the direction of swimming can be modulated. Keywords: Neural control; genetic programming; developmental encoding; SGOCE; simulation; central pattern generator; swimming; lamprey. ________________________________________________________________________ Preliminary results on CPGs for salamander locomotion can be found in: A.J. Ijspeert: Evolution of neural controllers for salamander-like locomotion, in McKee, G.T. and Schenker, P.S., Editors, Proceedings of Sensor Fusion and Decentralised Control in Robotics Systems II, SPIE Proceedings Vol. 3839, September 1999, pp. 168-179. Abstract: This paper presents an experiment in which evolutionary algorithms are used for the development of neural controllers for salamander locomotion. The aim of the experiment is to investigate which kind of neural circuitry can produce the typical swimming and trotting gaits of the salamander, and to develop a synthetic approach to neurobiology by using genetic algorithms as a design tool.
A 2D bio-mechanical simulation of the salamander's body is developed whose muscle contractions are determined by locomotion controllers simulated as continuous-time neural networks. While the connectivity of the neural circuitry underlying locomotion in the salamander has not yet been decoded, the general organization of the designed neural circuits corresponds to that hypothesized by neurobiologists for the real animal. In particular, the locomotion controllers are based on a body {\it central pattern generator} (CPG) corresponding to a lamprey-like swimming controller as developed by Ekeberg~\cite{Ekeberg93}, and are extended with a limb CPG for controlling the salamander's body. A genetic algorithm is used to instantiate synaptic weights of the connections within the limb CPG and from the limb CPG to the body CPG, given a high-level description of the desired gaits. A set of biologically plausible controllers is thus developed which can produce neural activity and locomotion gaits very similar to those observed in the real salamander. By simply varying the external excitation applied to the network, the speed, direction and type of gait can be varied. From gary at cs.ucsd.edu Sat Nov 20 04:10:18 1999 From: gary at cs.ucsd.edu (Gary Cottrell) Date: Sat, 20 Nov 1999 01:10:18 -0800 (PST) Subject: Faculty Positions at UCSD Message-ID: <199911200910.BAA29722@gremlin.ucsd.edu> Recall my previous note about three positions at UCSD, with most of them mentioning machine learning. We are moving quickly on senior candidates. If you're thinking of applying, don't delay! (If you've already applied, this does *not* mean we're unhappy with you! ;-)) See "joining our department" off the main web page, www.cse.ucsd.edu, for details on applying. One of the positions: The Ronald R. Taylor Chair position in Information Technology in Computer Science at the full professor level.
This chair is targeted towards applicants in bioinformatics, computational biology, databases, data-intensive computing, data mining, or information technology. cheers, gary Gary Cottrell 858-534-6640 FAX: 858-534-7029 Faculty Assistant Chet Frost: 858-822-3286 Computer Science and Engineering 0114 (IF USING FED EX INCLUDE THE FOLLOWING LINE: 3101 Applied Physics and Math Building) University of California San Diego La Jolla, Ca. 92093-0114 Email: gary at cs.ucsd.edu or gcottrell at ucsd.edu Home page: http://www-cse.ucsd.edu/~gary/ "Only connect" -E.M. Forster From steve at cns.bu.edu Sat Nov 20 12:04:41 1999 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sat, 20 Nov 1999 12:04:41 -0500 Subject: hallucinations Message-ID: The following article is available in HTML, PDF, and Gzipped Postscript at http://www.cns.bu.edu/Profiles/Grossberg Grossberg, S. (1999). How hallucinations may arise from brain mechanisms of learning, attention, and volition. Journal of the International Neuropsychological Society, in press. ABSTRACT: This article suggests how brain mechanisms of learning, attention, and volition may give rise to hallucinations during schizophrenia and other mental disorders. The article suggests that normal learning and memory are stabilized through the use of learned top-down expectations. These expectations learn prototypes that are capable of focusing attention upon the combinations of features that comprise conscious perceptual experiences. When top-down expectations are active in a priming situation, they can modulate or sensitize their target cells to respond more effectively to matched bottom-up information. They cannot, however, fully activate these target cells. These matching properties are shown to be essential for stabilizing the memory of learned representations. The modulatory property of top-down expectations is achieved through a balance between top-down excitation and inhibition.
The learned prototype is the excitatory on-center in this top-down network. Phasic volitional signals can shift the balance between excitation and inhibition to favor net excitatory activation. Such a volitionally-mediated shift enables top-down expectations, in the absence of supportive bottom-up inputs, to cause conscious experiences of imagery and inner speech, and thereby to enable fantasy and planning activities to occur. If these volitional signals become tonically hyperactive during a mental disorder, the top-down expectations can give rise to conscious experiences in the absence of bottom-up inputs and volition. These events are compared with data about hallucinations. The article predicts where these top-down expectations and volitional signals may act in the laminar circuits of visual cortex, and by extension in other sensory and cognitive neocortical areas, and how the level of abstractness of learned prototypes may covary with the abstractness of hallucinatory content. A similar breakdown of volition may lead to delusions of control in the motor system. Key Words: hallucinations, learned expectations, attention, learning, adaptive resonance theory Preliminary version appears as Boston University Technical Report, CAS/CNS-TR-99-020. From baolshausen at ucdavis.edu Sat Nov 20 17:09:10 1999 From: baolshausen at ucdavis.edu (Bruno Olshausen) Date: Sat, 20 Nov 1999 14:09:10 -0800 Subject: Faculty position in cognitive neuroscience - UC Davis Message-ID: <38371C06.F225F9D8@ucdavis.edu> FACULTY POSITION IN COGNITIVE NEUROSCIENCE UC DAVIS The Center for Neuroscience and the Department of Psychology at the University of California, Davis, invite applications for a Cognitive Neuroscientist at the assistant, associate or full professor level to begin July 1, 2000.
Specialization within the area of cognitive neuroscience is open; however, we are particularly interested in faculty who work on some aspect of human neurophysiology and can utilize newly established functional brain imaging facilities. Those who apply computational modeling approaches toward understanding cognitive aspects of brain function are also encouraged to apply. Applicants are expected to demonstrate leadership in their research specialty, the ability to obtain extramural funds, and the ability to teach cognitive neuroscience courses at both graduate and undergraduate levels. Candidates must possess a Ph.D. degree. The Center for Neuroscience is in the midst of rapid growth, and the successful applicant will have the opportunity to participate in the recruitment of additional positions in cognitive neuroscience. The University of California, Davis, is an affirmative action/equal opportunity employer with a strong institutional commitment to the development of a climate that supports equality of opportunity and respect for differences. Applicants should send a letter describing research and teaching interests, a curriculum vitae, copies of representative publications, and at least five letters of recommendation to: Edward G. Jones, Director, Center for Neuroscience, 1544 Newton Court, University of California, Davis, CA, 95616-8686. All materials must be received by February 1, 2000, to be assured consideration. The search will continue until the position is filled. The University of California is an Equal Opportunity Employer.
From nat at cs.dal.ca Sun Nov 21 08:32:55 1999 From: nat at cs.dal.ca (Nathalie Japkowicz) Date: Sun, 21 Nov 1999 09:32:55 -0400 (AST) Subject: CFP: AAAI Workshop on Learning from Imbalanced Data Sets Message-ID: ---------------------------------- Call for Participation AAAI-2000 Workshop on Learning from Imbalanced Data Sets July 31, 2000, Austin, Texas ---------------------------------- The majority of learning systems previously designed and tested on toy problems or carefully crafted benchmark data sets usually assume that the training sets are well balanced. In the case of concept-learning, for example, classifiers typically expect that their training set contains as many examples of the positive as of the negative class. Unfortunately, this assumption of balance is often violated in real-world settings. Indeed, there exist many domains for which some classes are represented by a large number of examples while the others are represented by only a few. Although the imbalanced data set problem is starting to attract researchers' attention, attempts at tackling it have remained isolated. It is our belief that much progress could be achieved from a concerted effort and greater interaction among researchers interested in this issue. The purpose of this workshop is to provide a forum to foster such interactions and identify future research directions.

Topics
------
* Novel techniques for dealing with imbalanced data sets:
  * Techniques for over-sampling the minority class.
  * Techniques for down-sizing the majority class.
  * Techniques for learning from a single class.
  * Techniques for internally biasing the learning process.
  * Other approaches.
* Comparing the various methodologies.
* The data imbalance problem in unsupervised learning.

Format ------ The workshop will consist of several sessions concentrating on the themes identified above.
The workshop will conclude with a panel of distinguished guests commenting on the presentations of the day, discussing future directions, and opening the floor for general discussion. Attendance ---------- This workshop is open to all members of the Machine-Learning, Data-Mining, Information Retrieval, Statistics and Connectionist communities interested in the data imbalance problem. Attendance is limited to 65 participants. Submission ---------- Prospective participants are invited to submit papers on the topics outlined above or on other related issues. Submissions should be 6 pages and in line with the AAAI style sheet. Electronic submissions, in PostScript format, are preferred and should be sent to Nathalie Japkowicz at nat at cs.dal.ca. Alternatively, four hard copies of the papers can be sent to: Nathalie Japkowicz Faculty of Computer Science DalTech/Dalhousie University 6050 University Avenue Halifax, N.S. Canada, B3H 1W5 Telephone: (902) 494-3157 FAX: (902) 492-1517 If space is available, attendance at the workshop is also possible by submitting a 1- or 2-page statement of interest to the above address.
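Two of the simplest techniques on the topic list above, random over-sampling of the minority class and random down-sizing of the majority class, can be sketched as follows. This is a generic illustration for a binary problem, not a method from any particular submission.

```python
import random

def rebalance(examples, labels, mode="over", seed=0):
    """Rebalance a binary data set either by random over-sampling of the
    minority class (duplicating examples) or by random under-sampling,
    i.e. down-sizing, of the majority class (dropping examples)."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(examples, labels):
        by_class.setdefault(y, []).append(x)
    sizes = {y: len(xs) for y, xs in by_class.items()}
    minority = min(sizes, key=sizes.get)   # assumes the classes differ in size
    majority = max(sizes, key=sizes.get)
    if mode == "over":
        # duplicate randomly chosen minority examples until sizes match
        extra = [rng.choice(by_class[minority])
                 for _ in range(sizes[majority] - sizes[minority])]
        by_class[minority] = by_class[minority] + extra
    else:
        # "under": keep a random subset of the majority class
        by_class[majority] = rng.sample(by_class[majority], sizes[minority])
    new_x, new_y = [], []
    for y, xs in by_class.items():
        for x in xs:
            new_x.append(x)
            new_y.append(y)
    return new_x, new_y

x = list(range(10))
y = [1, 0, 0, 0, 0, 0, 0, 0, 0, 1]       # 8 negatives, 2 positives
bx, by = rebalance(x, y, mode="over")
print(by.count(1), by.count(0))          # 8 8
```

Plain duplication and deletion are only the baseline; much of the workshop's topic list concerns smarter variants (single-class learning, internally biasing the learner) that avoid the overfitting and information loss these two incur.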
Timetable: ---------- * Submission deadline: March 10, 2000 * Notification date: March 24, 2000 * Final date for camera-ready copies to organizers: April 26, 2000 Co-Chairs: ---------- * Robert Holte, University of Ottawa (holte at site.uottawa.ca); * Nathalie Japkowicz, Dalhousie University (nat at cs.dal.ca); * Charles Ling, University of Western Ontario (ling at csd.uwo.ca); * Stan Matwin, University of Ottawa (stan at site.uottawa.ca) Additional Information ---------------------- http://borg.cs.dal.ca/~nat/Workshop2000/workshop2000.html From piuri at elet.polimi.it Mon Nov 22 07:34:55 1999 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Mon, 22 Nov 1999 13:34:55 +0100 Subject: IJCNN'2000: Call for Special Sessions Message-ID: <3.0.5.32.19991122133455.015bda30@elet.polimi.it> ======================================================================== IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS Grand Hotel di Como, Como, Italy - 24-27 July 2000 sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society with the technical cooperation of the Japanese Neural Network Society, AEI, SIREN, and AI*IA > > > > > > CALL FOR SPECIAL SESSIONS < < < < < < Proposals for special sessions for IJCNN 2000 are solicited. Special sessions consist of three to five papers related to neural networks or a related area. Proposals are due on December 31, 1999 and must include
1. The title of the special session and a one-paragraph summary describing the session's themes.
2. The names, affiliations and e-mail addresses of the organizers of the special session, with identification of one of these individuals as the contact person.
3. A list of papers including (a) authors, (b) author affiliation and e-mail, and (c) paper title.
4.
A statement from the session organizer that (a) the authors have been contacted and agree to write the paper and (b) at least one author per paper will register and attend IJCNN2000 to present the paper (the registration is due by 30 April 2000 to guarantee inclusion of the paper in the proceedings - see the detailed conditions in the conference Call for Papers on the conference web site). Final papers for publication in the conference record will be due on January 31, 2000. Proposals for Special Sessions should be submitted electronically to ijcnn2000 at ee.washington.edu For the complete Call for Papers and other information (including information about Como, Italy), visit the conference web site at: http://www.ims.unico.it/2000ijcnn.html ====================================================================== IJCNN 2000 Special Sessions Committee Payman Arabshahi, JPL, USA Alexandre P. Alves da Silva, Federal Engineering School at Itajuba, Brazil Mohamed El-Sharkawi, University of Washington, USA Michael Healy, The Boeing Airplane Company, USA Jae-Byung Jung, University of Washington, USA Robert J. Marks II (Chair), University of Washington, USA ====================================================================== Vincenzo Piuri Department of Electronics and Information, Politecnico di Milano piazza L. da Vinci 32, 20133 Milano, Italy phone +39-02-2399-3606 secretary +39-02-2399-3623 fax +39-02-2399-3411 email piuri at elet.polimi.it From Sebastian_Thrun at heaven.learning.cs.cmu.edu Mon Nov 22 11:31:22 1999 From: Sebastian_Thrun at heaven.learning.cs.cmu.edu (Sebastian Thrun) Date: Mon, 22 Nov 1999 11:31:22 -0500 Subject: NEW MASTERS PROGRAM IN KNOWLEDGE DISCOVERY AND DATA MINING Message-ID: Carnegie Mellon, Center for Automated Learning & Discovery announces: A NEW MASTERS PROGRAM IN KNOWLEDGE DISCOVERY AND DATA MINING The extraordinary spread of computers and online data is changing forever the way that important decisions are made in many organizations. 
Hospitals analyze online medical records to decide which treatments to apply to future patients. Banks routinely analyze past financial records to learn to spot future fraud. Today's demand for data mining expertise far exceeds the supply, and this imbalance will become more severe over the coming decade. To educate the next generation of experts in this important area, Carnegie Mellon University offers a new Master's program in Knowledge Discovery and Data Mining (KDD). This new inter-disciplinary program trains students to become tomorrow's leaders in the rapidly growing area of Knowledge Discovery and Data Mining. It is offered by Carnegie Mellon's Center for Automated Learning and Discovery (CALD), which has assembled a large multi-disciplinary team of faculty and students across several academic departments. KDD candidates will be trained in all important areas related to scientific data mining and knowledge discovery. The Master's program balances interdisciplinary course work, hands-on project work, and cutting-edge research carried out under direct faculty supervision. The curriculum addresses areas such as advanced machine learning algorithms, statistical principles and foundations, database and data warehousing methods, complexity analysis, approaches to data visualization, privacy and security, and specific application areas such as business, marketing, finance, and public policy. Our graduates are uniquely positioned to pioneer new data mining and knowledge discovery efforts, and to pursue top notch research on the next generation of data mining tools, algorithms, and systems. Carnegie Mellon invites applications of qualified individuals. Admission is highly competitive. A limited number of fellowships are available, which will be provided on a competitive basis. The application deadline is February 5, 2000. 
For more details about the program or to apply: http://www.cs.cmu.edu/~kdd

From cindy at cns.bu.edu Mon Nov 22 13:45:37 1999
From: cindy at cns.bu.edu (Cynthia Bradford)
Date: Mon, 22 Nov 1999 13:45:37 -0500
Subject: Neural Networks 12(9)
Message-ID: <199911221845.NAA04682@retina.bu.edu>

NEURAL NETWORKS 12(9)
Contents - Volume 12, Number 9 - 1999

NEURAL NETWORKS LETTERS:
Solving the n-bit parity problem using neural networks
  Myron E. Hohil, Derong Liu, and Stanley H. Smith

CURRENT OPINIONS:
How stereo vision interacts with optic flow perception: Neural mechanisms
  M. Lappe and A. Grigo

ARTICLES:
*** Mathematical and Computational Analysis ***
The asymptotic memory capacity of the generalized Hopfield network
  Jinwen Ma
Synchronization and desynchronization of neural oscillators
  A. Tonnelier, S. Meignen, H. Bosch, and J. Demongeot
Improved learning algorithms for mixture of experts in multiclass classification
  K. Chen, L. Xu, and H. Chi
On the identifiability of mixtures-of-experts
  W. Jiang and M.A. Tanner
Derivation of the multilayer perceptron weight constraints for direct network interpretation and knowledge discovery
  M.L. Vaughn

*** Engineering and Design ***
The Kohonen network incorporating explicit statistics and its application to the travelling salesman problem
  N. Aras, B.J. Oommen, and I.K. Altmel
Accelerating neural network training using weight extrapolations
  S.V. Kamarthi and S. Pittner
HyFIS: Adaptive neuro-fuzzy inference systems and their application to nonlinear dynamical systems
  J. Kim and N. Kasabov

BOOK REVIEW:
How to legitimate a field: A review of D.J. Stein and J. Ludik's (1998) "Neural networks and psychopathology: Connectionist models in practice and research"
  Greg J. Siegle

------------------------------
Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc.
Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type        INNS             ENNS              JNNS
----------------------------------------------------------------------------
membership with        $80              660 SEK           Y 15,000
Neural Networks                                           [including 2,000
                                                          entrance fee]
          (student)    $55              460 SEK           Y 13,000
                                                          [including 2,000
                                                          entrance fee]
----------------------------------------------------------------------------
membership without     $30              200 SEK           not available to
Neural Networks                                           non-students
                                                          (subscribe through
                                                          another society)
          (student)                                       Y 5,000
                                                          [including 2,000
                                                          entrance fee]
----------------------------------------------------------------------------
Institutional rates    $1132            2230 NLG          Y 149,524
----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408
531 28 Skovde
Sweden
46 500 44 83 37 (phone)
46 500 44 83 99 (fax)
enns at ida.his.se
http://www.his.se/ida/enns

JNNS Membership
c/o Professor Tsukada
Faculty of Engineering
Tamagawa University
6-1-1, Tamagawa Gakuen, Machida-city
Tokyo 113-8656 Japan
81 42 739 8431 (phone)
81 42 739 8858 (fax)
jnns at jnns.inf.eng.tamagawa.ac.jp
http://jnns.inf.eng.tamagawa.ac.jp/home-j.html

***************************************************************** From Nello.Cristianini at bristol.ac.uk Tue Nov 23 08:17:56 1999 From: Nello.Cristianini at bristol.ac.uk (N Cristianini) Date: Tue, 23 Nov 1999 13:17:56 +0000 (GMT) Subject: Support Vector Machines: Book Announcement Message-ID: AN INTRODUCTION TO SUPPORT VECTOR MACHINES and other kernel-based learning methods N. Cristianini and J. Shawe-Taylor Cambridge University Press, 2000 ISBN: 0 521 78019 5 Book's website: www.support-vector.net This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory. The book also introduces Bayesian analysis of learning and relates SVMs to Gaussian Processes and other kernel-based learning methods. SVMs deliver state-of-the-art performance in real-world applications such as text categorisation, hand-written character recognition, image classification, biosequence analysis, etc. Their first introduction in the early 1990s led to a recent explosion of applications and a deepening theoretical analysis that have now established Support Vector Machines, along with neural networks, as one of the standard tools for machine learning and data mining. Students will find the book both stimulating and accessible, while practitioners will be guided smoothly through the material required for a good grasp of the theory and application of these techniques. The concepts are introduced gradually in accessible and self-contained stages, though in each stage the presentation is rigorous and thorough.
Pointers to relevant literature and web sites containing software ensure that it forms an ideal starting point for further study. Equally the book will equip the practitioner to apply the techniques and an associated web site will provide pointers to updated literature, new applications, and on-line software. More information is available on the book's website: www.support-vector.net From Zoubin at gatsby.ucl.ac.uk Tue Nov 23 07:17:59 1999 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Tue, 23 Nov 1999 12:17:59 +0000 (GMT) Subject: Postdoc and PhD Positions, Gatsby Computational Neuroscience Unit Message-ID: <199911231217.MAA06430@cajal.gatsby.ucl.ac.uk> Gatsby Computational Neuroscience Unit Director: Geoffrey Hinton http://www.gatsby.ucl.ac.uk/ Post-doctoral and PhD Research Positions Computational Neuroscience The Gatsby Computational Neuroscience Unit invites applications for PhD studentships and post-doctoral research positions tenable from September 2000. Members of the unit are interested in models of all aspects of brain function, especially unsupervised learning, computational vision, reinforcement learning, and computational motor control, and also conduct psychophysical experiments in motor control and vision. Current researchers at the unit include 12 PhD students and the following faculty and postdocs: Faculty: Post-doctoral Fellows: Geoff Hinton Hagai Attias Peter Dayan Sam Roweis Zoubin Ghahramani Maneesh Sahani Zhaoping Li Emo Todorov Carl van Vreeswijk For further details please see: http://www.gatsby.ucl.ac.uk/research.html The Gatsby Unit provides a unique opportunity for a critical mass of theoreticians to interact closely with each other and with University College's other world class research groups including Anatomy, Computer Science, Functional Imaging Laboratory, Physics, Physiology, Psychology, Neurology, Ophthalmology, and Statistics. 
The unit has excellent computational facilities, and laboratory facilities include both motor and visual psychophysics labs for theoretically motivated experimental studies. The unit's visitor and seminar programmes enable its staff and students to interact with leading researchers from across the world. Candidates should have a strong analytical background and a keen interest in neuroscience. Competitive salaries and studentships are available. Applicants should send a CV [PhD applicants should include details of course work and grades], a statement of research interests, and names and addresses of 3 referees to janice at gatsby.ucl.ac.uk [email preferred] or to The Gatsby Computational Neuroscience Unit University College London Alexandra House 17 Queen Square LONDON WC1N 3AR UK ** Closing date for applications: 31 January 2000 ** From rogers at rtna.daimlerchrysler.com Tue Nov 23 14:15:23 1999 From: rogers at rtna.daimlerchrysler.com (Seth Rogers) Date: Tue, 23 Nov 1999 11:15:23 -0800 (PST) Subject: Complete CFP: International Conference on Machine Learning Message-ID: Call for Papers THE SEVENTEENTH INTERNATIONAL CONFERENCE ON MACHINE LEARNING June 29-July 2, 2000 Stanford University The Seventeenth International Conference on Machine Learning (ICML-2000) will be held at Stanford University from June 29 to July 2, 2000, in the heart of Silicon Valley. The conference will bring together researchers to exchange ideas and report recent progress in the computational study of learning. 
Topics for Submission ICML-2000 welcomes submissions on all facets of machine learning, but especially solicits papers on problem areas, research topics, learning paradigms, and approaches to evaluation that have been rare at recent conferences, including: - the role of learning in natural language, vision and speech, planning and scheduling, design and configuration, logical and spatial reasoning, motor control, and more generally on learning for performance tasks carried out by intelligent agents; - the discovery of scientific laws and taxonomies, the construction of componential and structural models, and learning at multiple levels of temporal and spatial resolution; - the effect of the developers' decisions about problem formulation, representation, data quality, and reward function on the learning process; - computational models of human learning, applications to real-world problems, exploratory research that describes novel learning tasks, work that integrates familiar methods to demonstrate new functionality, and agent architectures in which learning plays a central role; - empirical studies that combine natural data (to show relevance) with synthetic data (to understand conditions on behavior), along with formal analyses that make contact with empirical results, especially where the aim is to identify sources of power, rather than to show one method is superior to others. Naturally, we also welcome submissions on traditional topics, ranging from induction over supervised data to learning from delayed rewards, but we hope the conference will also attract contributions on the issues above. Review Process The ICML-2000 review process will be structured to encourage publications covering a broad range of research and to foster increased participation in the conference. 
To this end, we have instituted:
- area chairs who will be responsible for recruiting papers in their area of expertise and overseeing the review process for those submissions;
- conditional acceptance of papers that are not publishable in their initial form, but that can be improved enough for inclusion in time to appear in the proceedings; and
- a review form that requires referees to explicitly list any problems with a paper, what it would take to overcome them, and, if they recommend rejection, why it cannot be fixed in time for inclusion.
The overall goal is to make the review process more like that in journals, with time for the authors to incorporate feedback from reviewers. Each submitted paper will be reviewed by two members of the program committee, with the decision about its acceptance overseen by the responsible area chair and the program chair. Paper Submission Authors should submit papers using the same format and length as the final proceedings version. The detailed instructions for authors at http://www-csli.stanford.edu/icml2k/instructions.html include pointers to templates for LaTeX and Word documents. These specify two-column style, Times Roman font with 10 point type, vertical spacing of 11 points, overall text width of 6.75 inches, length of 9.0 inches, 0.25 inches between the two columns, top margin of 1.0 inch, and left margin of 0.75 inch. (The right and bottom margins will depend on whether one uses US letter or A4 paper.) Papers must not exceed eight (8) pages including figures and references. We will return to the authors any papers that do not satisfy these requirements. The deadline for submissions to ICML-2000 is MONDAY, JANUARY 24, 2000. Submission will be entirely electronic, by transferring papers to the ICML-2000 ftp site, as explained in the detailed instructions for authors. Authors must submit papers in POSTSCRIPT format to ensure our ability to print them out for review.
Each submission must be accompanied by the paper's title, the authors' names and physical addresses, a 250-word abstract, the contact author's email address and phone number, and the author who would present the talk at the conference. Authors must enter this information into the submission form at the conference web site by FRIDAY, JANUARY 21. ICML-2000 allows simultaneous submission to other conferences, provided this fact is clearly indicated on the submission form. Accepted papers will appear in the conference proceedings only if they are withdrawn from other conferences. Simultaneous submissions that are not clearly specified as such will be rejected. Other Conference Information The Seventeenth International Conference on Machine Learning will be collocated with the Thirteenth Annual Conference on Computational Learning Theory (COLT-2000) and the Sixteenth Conference on Uncertainty in Artificial Intelligence (UAI-2000). Registrants to any of these meetings will be able to attend the technical sessions of the others at no additional cost. ICML-2000 will also be preceded by tutorials on various facets of machine learning. For additional information, see the web site for the conference at http://www-csli.stanford.edu/icml2k/ which will provide additional details as they become available. If you have questions about ICML-2000, please send electronic mail to icml2k at csli.stanford.edu. The conference has received support from DaimlerChrysler Research and Technology, Stanford's Center for the Study of Language and Information (CSLI), and the Institute for the Study of Learning and Expertise (ISLE). 
From maass at igi.tu-graz.ac.at Wed Nov 24 09:46:41 1999 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Wed, 24 Nov 1999 15:46:41 +0100 Subject: Article on the computational power of winner-take-all Message-ID: <383BFA51.32366A4D@igi.tu-graz.ac.at> Content-Type: text/plain; charset=us-ascii Content-Transfer-Encoding: 7bit The following article is now available online: On the Computational Power of Winner-Take-All Wolfgang Maass Technische Universitaet Graz, Austria Abstract: Everybody ``KNOWS'' that neural networks need more than a single layer of nonlinear units to compute interesting functions. We show that this is FALSE if one employs winner-take-all as the nonlinear unit:
* Any boolean function can be computed by a single k-winner-take-all unit applied to weighted sums of the input variables.
* Any continuous function can be approximated arbitrarily well by a single soft winner-take-all unit applied to weighted sums of the input variables.
* Only positive weights are needed in these (linear) weighted sums. This may be of interest from the point of view of neurophysiology, since only 15% of the synapses in the cortex are inhibitory.
* Our results support the view that winner-take-all is a very suitable basic computational unit in Neural VLSI:
  - it is well known that winner-take-all of n input variables can be computed very efficiently with 2n transistors (and a total wire length and area that is linear in n) in analog VLSI [Lazzaro et al., 1989]
  - we show that winner-take-all is not just useful for special-purpose computations, but may serve as the only nonlinear unit for neural circuits with universal computational power
  - we show that any multi-layer perceptron needs a number of gates quadratic in n to compute winner-take-all for n input variables, hence winner-take-all provides a substantially more powerful computational unit than a perceptron (at about the same cost of implementation in analog VLSI).
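The first claim in the abstract can be made concrete with a toy case (an illustration only, not the paper's general construction; the particular weights below are chosen by hand): a single 2-winner-take-all unit applied to three positive weighted sums computes 2-bit parity (XOR), a function no single threshold unit can compute.

```python
def k_wta(values, k):
    """Return a 0/1 vector marking the k largest entries (hard k-winner-take-all)."""
    ranked = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    winners = set(ranked[:k])
    return [1 if i in winners else 0 for i in range(len(values))]

def xor_via_kwta(x1, x2):
    # Three weighted sums of the inputs; all weights are positive,
    # and the constants act as bias terms.
    s = [2 * x2 + 0.5,       # competitor favoring x2
         2 * x1 + 0.5,       # competitor favoring x1
         x1 + x2]            # the designated output line
    # Output 1 iff the designated line is among the 2 winners,
    # i.e. iff s[2] exceeds the smaller competitor.
    return k_wta(s, k=2)[2]

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_via_kwta(a, b))
# XOR truth table: 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```

At (1,1) both competitors reach 2.5 while the output line reaches only 2, so it loses; with exactly one active input the output line (value 1) beats the idle competitor (value 0.5), reproducing XOR.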
------------------------------------------------------------------ The full version of this article will appear in Neural Computation. It can be downloaded from http://www.tu-graz.ac.at/igi/maass/#Publications (see publication # 113). An extended abstract will appear in the Proceedings of NIPS 1999. From Leo.van.Hemmen at Physik.TU-Muenchen.DE Wed Nov 24 17:39:06 1999 From: Leo.van.Hemmen at Physik.TU-Muenchen.DE (J. Leo van Hemmen) Date: Wed, 24 Nov 1999 23:39:06 +0100 Subject: Biological Cybernetics Message-ID: Dear Friends: In the July issue 81/1 (1999) of ``Biological Cybernetics'', its Editors-in-Chief Gert Hauske and I have published an Editorial. As we think it could make for interesting reading for most of you we have appended the text. Enjoy reading, Leo van Hemmen. >>>$<<< The time has come for _Biological Cybernetics_ to face some changes and widen its scope. First, Leo van Hemmen has joined Gert Hauske as Coeditor-in-Chief. The latter welcomes him as a colleague and friend with extensive experience in theoretical biophysics, international reputation, and a passion for temporal coding, in particular, within the visual and auditory system. The former realizes that, in managing the Journal as well as Gert Hauske does, he is called to live up to a very high standard. In setting a new course for the Journal and working in a cybernetic vein, we intend to focus on sensory modalities, especially hearing and seeing, their cortical processing, and emanating activities such as those in the motor cortex, including gesture and locomotion. Learning and memory constitute the second key domain of the Journal, strongly dependent as they are on sensory modalities. Also the analysis of underlying techniques deserves our attention. Papers on artificial neural nets whose motivation stems from biology are as welcome as they were before [1]. In this way the editors will continue the liberal, innovative, and rich traditions set forth for the Journal since its conception. 
It was Norbert Wiener [2,3] who defined the notion of cybernetics: ``We have decided to call the entire field of control and communication theory, whether in the machine or in the animal, by the name _Cybernetics_, which we form from the Greek $\kappa\upsilon\beta\epsilon\rho\nu\eta\tau\eta\varsigma$ or _steersman_.'' He then noted that J.C. Maxwell's 1868 paper on governors, a word that is a Latin corruption of its Greek root, was the first significant paper on mechanisms of feedback. While control is a key notion for all of biology, the importance of feedback connections relative to feedforward regulation has been appreciated most profoundly through recent studies of the brain. Communication or transmission of information is another key idea that is indispensable in understanding neuronal processing. Here Wiener was also ahead of our time, save for the fact that he imagined machine and animal to perform information processing in the same way -- as is suggested by the above interlude ``whether in the machine or in the animal'' and is set out elsewhere [2,3]. Today we know, far better than one could have then [4], that the differences between neuronal and machine computation, and between the underlying mechanisms of learning and memory, are huge. With hindsight it is easy to see why the original cybernetic movement stalled in the mid-sixties. It rested on a never explicitly formulated hypothesis that a simple block diagram could account for the exquisite functions of animal brains [2-4], and that a clever person could write an algorithm for each procedure emerging from a block diagram. Life is not that simple, and would-be programmers were called upon to digest megabytes of data, which apparently they could not. Our brain is simply not made for that. Biological cybernetics as conceived by our founding Editor-in-Chief Werner Reichardt is different. His insertion of the adjective _biological_ before `cybernetics' focused attention on natural as opposed to artificial networks.
The underlying idea, as advocated by Wiener, is that a merger of biology, mathematics, and physics is needed to clarify the fascinating problems posed by animal brains functioning as `neuronal machines' in natural environments. This means that computational neuroscience necessarily must be an integral part of the Journal. Biologists are increasingly aware of the power theoretical neuroscience has for making predictions that can be verified by experiment, and for suggesting entirely new directions for experimental investigation. Accordingly, _Biological Cybernetics_ will aim at bringing to biologists a better understanding of the reciprocity between theory and experiment, which has so long been the successful formula for the physical sciences. At the same time, however, the Journal will remain deeply rooted in biological experimentation and its power for setting boundaries on overly zealous speculation. What, then, should be questioned or analyzed? Though biological cybernetics can, and did, give insight into such attractive practical problems as heart arrhythmias, a richer approach should focus on all aspects of information processing, notably in biological neural networks. The Journal's traditional preferences slanted towards the visual system, perhaps at the expense of a broader analysis of neuronal information processing as a whole. Since the auditory system is so intimately connected with the visual cortex, and a fascinating structure in itself, we think that focusing on both may be particularly fruitful. Through such comparisons our editorial objective is to stimulate an inquiring assessment of sensory information processing at large and to illuminate the `hows' and `whys' that give rise to similarity and difference in various systems. Putting things in a proper perspective, we are aiming at all aspects of communication and control in biological information processing.
To accommodate this broadened frontier for the Journal, the Editorial Board will be appropriately extended. Most importantly, we hope that you, the reader and prospective author, will join us in this new adventure. Gert Hauske J. Leo van Hemmen.

[1] Braitenberg V (1984) Vehicles: Experiments in Synthetic Psychology. MIT Press, Cambridge, MA
[2] Wiener N (1948) Cybernetics, or control and communication in the animal and the machine. Wiley, New York, and Hermann, Paris. See in particular p. 19.
[3] Wiener N (1948) Cybernetics. Sci. Amer. 179:14-18
[4] Rosenblith W and Wiesner J (1966) From philosophy to mathematics to biology. Bull. Amer. Math. Soc. 72:33-38. This is a critical appreciation of Wiener's role in biology by two of his contemporaries.

>>>$<<<

Prof. Dr. J. Leo van Hemmen
Physik Department, TU München
D-85747 Garching bei München, Germany
Phone: +49(89)289.12362 (office) and .12380 (secretary)
Fax: +49(89)289.14656
e-mail: Leo.van.Hemmen at ph.tum.de

From uchiyama at ics.kagoshima-u.ac.jp Thu Nov 25 00:12:55 1999 From: uchiyama at ics.kagoshima-u.ac.jp (Hiroyuki UCHIYAMA) Date: Thu, 25 Nov 1999 14:12:55 +0900 Subject: Paper available: Retinal computation of motion direction Message-ID: <199911250512.OAA03071@joho.ics.kagoshima-u.ac.jp> The following paper (pdf file) is now available from http://www.ics.kagoshima-u.ac.jp/~uchiyama/preprint.html This is a preprint of an article that will appear in Visual Neuroscience. ---------------------------------------------------------- Computation of Motion Direction by Quail Retinal Ganglion Cells That Have a Nonconcentric Receptive Field Hiroyuki Uchiyama, Takahide Kanaya and Shoichi Sonohata Abstract One type of retinal ganglion cell prefers object motion in a particular direction. Neuronal mechanisms for the computation of motion direction are still unknown.
We quantitatively mapped excitatory and inhibitory regions of receptive fields for directionally selective retinal ganglion cells in the Japanese quail, and found that the inhibitory regions are displaced about 1-3 deg. toward the side where the null sweep starts, relative to the excitatory regions. Directional selectivity thus results from delayed transient suppression exerted by the nonconcentrically arranged inhibitory regions, and not by local directional inhibition as hypothesized by Barlow and Levick (1965). ----------------------------------------------------------- ---------------------------------------------- Hiroyuki Uchiyama, Ph.D. Department of Information & Computer Science, Faculty of Engineering, Kagoshima University Korimoto 1-21-40, Kagoshima 890-0065, JAPAN phone +81-99-285-8449 fax +81-99-285-8464 http://www.ics.kagoshima-u.ac.jp/~uchiyama From omlin at waterbug.cs.sun.ac.za Thu Nov 25 11:55:04 1999 From: omlin at waterbug.cs.sun.ac.za (Christian Omlin) Date: Thu, 25 Nov 1999 18:55:04 +0200 Subject: Preprint available Message-ID: Dear Colleagues, The technical report below is available from our website http://www.cs.sun.ac.za/projects/tech_reports/US-CS-TR-99-14.ps.gz We welcome any comments you may have. With kind regards, Christian Christian W. Omlin e-mail: omlin at cs.sun.ac.za Department of Computer Science phone (direct): +27-21-808-4308 University of Stellenbosch phone (secretary): +27-21-808-4232 Private Bag X1 fax: +27-21-808-4416 Stellenbosch 7602 http://www.cs.sun.ac.za/people/staff/omlin SOUTH AFRICA http://www.neci.nj.nec.com/homepages/omlin ------------------------------- cut here ------------------------------ What Inductive Bias Gives Good Neural Network Training Performance? S. Snyders C.W.
Omlin Department of Computer Science University of Stellenbosch 7602 Stellenbosch South Africa E-mail: {snyders,omlin}@cs.sun.ac.za ABSTRACT There has been increased interest in the use of prior knowledge for training neural networks. Prior knowledge in the form of Horn clauses has been the predominant paradigm for knowledge-based neural networks. Given a set of training examples and an initial domain theory, a neural network is constructed that fits the training examples by preprogramming some of the weights. The initialized neural network is then trained using backpropagation to refine the knowledge. The prior knowledge presumably defines a good starting point in weight space and provides an inductive bias leading to faster convergence; it overrides backpropagation's bias toward a smooth interpolation resulting in small weights. This paper proposes a heuristic for determining the strength of the inductive bias by making use of gradient information in weight space in the direction of the programmed weights. The network starts its search in weight space where the gradient is maximal, thus speeding up convergence. Tests on a benchmark problem from molecular biology demonstrate that our heuristic on average reduces the training time by 60% compared to a random choice of the strength of the inductive bias; this performance is within 20% of the training time that can be achieved with optimal inductive bias. The difference in generalization performance is not statistically significant. From Peter.Bartlett at anu.edu.au Thu Nov 25 22:24:49 1999 From: Peter.Bartlett at anu.edu.au (Peter Bartlett) Date: Fri, 26 Nov 1999 14:24:49 +1100 Subject: machine learning position at ANU Message-ID: <383DFD81.9D933520@anu.edu.au> We're currently advertising a position in machine learning at the Australian National University. It's a limited-term (3-5 year) research position at academic level B (Research Fellow).
Researchers in the group (which currently includes Peter Bartlett, Jonathan Baxter, Markus Hegland, John Lloyd and Stephen Roberts) work on a variety of theoretical and experimental areas in machine learning, including: reinforcement learning, computational learning theory, neural networks, large margin classification and prediction, scalable data analysis for data mining, and logic for machine learning. See http://wwwrsise.anu.edu.au/ad.html#LevB_CSL for more information. -- Peter. Peter Bartlett email: Peter.Bartlett at anu.edu.au Computer Sciences Laboratory Phone: +61 2 6279 8681 Research School of Information Sciences and Engineering Australian National University Fax: +61 2 6279 8645 Canberra, 0200 AUSTRALIA http://csl.anu.edu.au/~bartlett From J.Sougne at ulg.ac.be Fri Nov 26 04:13:38 1999 From: J.Sougne at ulg.ac.be (J.Sougne@ulg.ac.be) Date: Fri, 26 Nov 1999 10:13:38 +0100 Subject: Dissertation available on cognitive modeling with spiking neurons Message-ID: Dear Connectionists, The following dissertation is now available on-line at http://www.fapse.ulg.ac.be/Lab/cogsci/jsougne/jspapers.html http://www.fapse.ulg.ac.be/Lab/cogsci/jsougne/JSougneThesis.ps.Z (compressed postscript) http://www.fapse.ulg.ac.be/Lab/cogsci/jsougne/JSougneThesis.pdf (pdf) INFERNET: A Neurocomputational Model of Binding and Inference by Jacques P. Sougné ABSTRACT An implementation of a network of integrate-and-fire neuron-like elements is presented. Integrate-and-fire nodes fire at a precise moment and transmit their activation, with a particular strength and delay, to nodes connected to them. The receiving nodes accumulate potential but also slowly lose their potential through decay. When the potential of the node reaches a particular threshold, it emits a spike. Thereafter, the potential is reset to a resting value.
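The node dynamics just described (leaky integration, threshold, spike, reset) can be sketched in a few lines. This is a minimal illustrative sketch only; the class name, the discrete-time update, and all parameter values are assumptions, not details of the INFERNET implementation, and refractory behavior is omitted.

```python
class IAFNode:
    """Toy discrete-time integrate-and-fire node: decay, threshold, reset."""

    def __init__(self, threshold=1.0, decay=0.9, rest=0.0):
        self.threshold = threshold  # potential at which the node spikes
        self.decay = decay          # per-step multiplicative potential decay
        self.rest = rest            # resting potential restored after a spike
        self.v = rest               # current membrane potential

    def step(self, weighted_input):
        """Integrate one time step; return True if the node emits a spike."""
        self.v = self.decay * self.v + weighted_input  # leaky accumulation
        if self.v >= self.threshold:
            self.v = self.rest                         # reset to resting value
            return True
        return False

# Driving the node with a constant subthreshold input makes it fire
# periodically: potential builds up over steps, spikes, and resets.
node = IAFNode()
spikes = [node.step(0.4) for _ in range(6)]
# spikes == [False, False, True, False, False, True]
```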
As with real neurons, there is a short refractory period during which this node will be completely insensitive to incoming signals, after which its sensitivity will slowly increase. Precise timing properties have been used to represent symbols in a distributed manner, and to solve the problems of binding and multiple instantiation. This architecture produced several predictions about human short-term memory, predicate processing, complex reasoning, and multiple instantiation. These predictions have been tested by empirical studies on humans. This network shows symbolic processing abilities using neurologically and psychologically plausible mechanisms that maintain the advantages of generalization and noise tolerance found in connectionist networks. From kathryn.cousins at hodder.co.uk Fri Nov 26 07:02:18 1999 From: kathryn.cousins at hodder.co.uk (Kathryn Cousins) Date: Fri, 26 Nov 1999 12:02:18 -0000 Subject: book announcement: Statistical Pattern Recognition Message-ID: <000e01bf3806$166c0380$5b04c00a@cathryncousins.hhinternal.co.uk> STATISTICAL PATTERN RECOGNITION - http://www.arnoldpublishers.com/scripts/webbook.asp?isbn=0340741643 By Andrew Webb, Defence Evaluation and Research Agency, UK From bengioy at IRO.UMontreal.CA Fri Nov 26 13:29:45 1999 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Fri, 26 Nov 1999 13:29:45 -0500 Subject: call for presentations / April Workshop on selecting and combining models Message-ID: <19991126132945.48542@IRO.UMontreal.CA> -------------------------------------------------------------------- Call for Presentations: CRM Workshop on Selecting and Combining Models with Machine Learning Algorithms April 12-14, 2000 Centre de Recherches Mathematiques Montreal Organizers: Yoshua Bengio (Universite de Montreal) and Dale Schuurmans (University of Waterloo) A central objective of machine learning research is to develop algorithms that learn predictive relationships from data. 
This is a central component of data mining and knowledge discovery tasks, which are becoming commonplace applications in the realm of e-commerce. This is a difficult task, however, because inferring a predictive function from data is in fact an "ill-posed" problem; that is, many functions can often "fit" a given finite data set, and yet these functions might generalize very differently on new data drawn from the same distribution. To make this problem well-posed one needs to somehow "calibrate" the complexity of the proposed function class to the amount and quality of available sample data. A classical approach is to perform "model selection", where one imposes a preference structure over function classes and then optimizes a combined objective of class preference and data fit. In doing so, however, it would be useful to have an accurate estimate of the expected generalization performance at each preference level; one could then pick the function class that obtained the lowest expected error, or combine functions from the function classes with the lowest expected error, and so on. Many approaches have been proposed for this purpose, both in the statistics and the machine learning research communities. Recently in machine learning there has been significant interest in new techniques for evaluating generalization error, for optimizing generalization error, and for combining and selecting models. This is exemplified by recent work on Structural Risk Minimization, Support Vector Machines, various Boosting algorithms, and the Bagging algorithm. These new approaches suggest that better generalization performance can be obtained using new, broadly applicable procedures. Progress in this area has not only been important for improving our understanding of how machine learning algorithms generalize; it has also been demonstrated to be very useful for practical applications of machine learning and data analysis.
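The core difficulty sketched above — several function classes can fit the training data, so a held-out estimate of generalization error is used to choose among them — can be illustrated with a toy example. The data-generating process and the two function classes below are invented for illustration and are not taken from the workshop announcement.

```python
# Toy model selection: two function classes are fit on a training split,
# and a held-out validation split estimates generalization error for
# each. The data and classes are invented for illustration.

def fit_constant(xs, ys):
    """Class 1: constant functions f(x) = c, fit by the training mean."""
    c = sum(ys) / len(ys)
    return lambda x: c

def fit_linear(xs, ys):
    """Class 2: affine functions f(x) = a*x + b, fit by least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return lambda x: a * x + b

def mse(f, xs, ys):
    """Mean squared error of predictor f on the pairs (xs, ys)."""
    return sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def select_model(train, valid, fitters):
    """Fit each class on `train`; keep the class whose fitted function
    has the lowest error on the held-out `valid` split."""
    best = min(fitters, key=lambda fit: mse(fit(*train), *valid))
    return best, best(*train)

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 * x + 1.0 for x in xs]          # underlying relation y = 2x + 1
train, valid = (xs[:4], ys[:4]), (xs[4:], ys[4:])
best, model = select_model(train, valid, [fit_constant, fit_linear])
```

Here the constant class underfits and is rejected by its validation error; with noisy data and richer classes, the same held-out comparison guards against overfitting as well.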
This workshop will bring together several key researchers in the fields of machine learning and statistics to present their recent results and debate the controversial issues that have divided them at recent machine learning and neural network conferences. The following leaders in this field have tentatively accepted to participate in the workshop as invited speakers: Peter Bartlett (Australian National University), Leo Breiman (University of California-Berkeley), Tom Dietterich (Oregon State University), Yoav Freund (AT&T Labs-Research), Radford Neal (University of Toronto), Michael Perrone (IBM T.J. Watson Research Center), Robert Schapire (AT&T Labs-Research), Grace Wahba (University of Wisconsin at Madison). The workshop will be sponsored by the CRM (Centre de Recherches Mathematiques) as well as by the MITACS (Mathematics of Information Technology And Complex Systems) Network of Centers of Excellence. Contributors to the workshop are invited to submit a short (1- or 2-page) summary of the proposed presentation in electronic form (ASCII text, PostScript or PDF) by e-mail to one of the organizers by February 1st, 2000: bengioy at iro.umontreal.ca or dale at cs.uwaterloo.ca. Information on the workshop will be posted on the following web site: www.iro.umontreal.ca/~bengioy/crmworkshop2000 -------------------------------------------------------------------- -- Yoshua Bengio Professeur agrégé Département d'Informatique et Recherche Opérationnelle Université de Montréal, adresse postale: C.P. 6128 Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7 adresse civique: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339.
http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From Peter.Bartlett at anu.edu.au Fri Nov 26 21:17:58 1999 From: Peter.Bartlett at anu.edu.au (Peter Bartlett) Date: Sat, 27 Nov 1999 13:17:58 +1100 Subject: paper: hebbian synaptic update rule for reinforcement learning Message-ID: <383F3F56.E5506911@anu.edu.au> The following paper is available at http://csl.anu.edu.au/~bartlett/papers/BartlettBaxter-Nov99.ps.gz Hebbian Synaptic Modifications in Spiking Neurons that Learn Peter L. Bartlett and Jonathan Baxter Australian National University In this paper, we derive a new model of synaptic plasticity, based on recent algorithms for reinforcement learning (in which an agent attempts to learn appropriate actions to maximize its long-term average reward). We show that these direct reinforcement learning algorithms also give locally optimal performance for the problem of reinforcement learning with multiple agents, without any explicit communication between agents. By considering a network of spiking neurons as a collection of agents attempting to maximize the long-term average of a reward signal, we derive a synaptic update rule that is qualitatively similar to Hebb's postulate. This rule requires only simple computations, such as addition and leaky integration, and involves only quantities that are available in the vicinity of the synapse. Furthermore, it leads to synaptic connection strengths that give locally optimal values of the long term average reward. The reinforcement learning paradigm is sufficiently broad to encompass many learning problems that are solved by the brain. We illustrate, with simulations, that the approach is effective for simple pattern classification and motor learning tasks. -- Peter. 
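The flavor of the rule described in the abstract can be conveyed by a generic REINFORCE-style sketch: a stochastic spiking unit keeps a leaky eligibility trace of locally available quantities and nudges each weight by reward times that trace. This is not the authors' derivation — only a simple rule with the same structural ingredients (addition, leaky integration, purely local signals); the task, reward, and all constants below are assumptions.

```python
# A generic REINFORCE-style synaptic rule with an eligibility trace:
# each weight keeps a leaky integral ("trace") of the locally available
# quantity (spike - p) * input, and changes by lr * reward * trace.
# Structurally similar to the rule the abstract describes, but NOT the
# authors' derivation; the task and all constants are assumptions.
import math
import random

def train(steps=3000, lr=0.2, trace_decay=0.8, seed=0):
    rng = random.Random(seed)
    w = [0.0, 0.0]             # input weight and bias weight
    trace = [0.0, 0.0]         # eligibility traces (leaky integrals)
    for _ in range(steps):
        x = [rng.choice([0.0, 1.0]), 1.0]        # input spike + bias
        drive = sum(wi * xi for wi, xi in zip(w, x))
        p = 1.0 / (1.0 + math.exp(-drive))       # firing probability
        spike = 1.0 if rng.random() < p else 0.0
        reward = 1.0 if spike == x[0] else -1.0  # fire iff input fired
        for i in range(2):
            # leaky integration of a locally available quantity ...
            trace[i] = trace_decay * trace[i] + (spike - p) * x[i]
            # ... and a purely additive, local weight change
            w[i] += lr * reward * trace[i]
    return w

w = train()
```

After training, the unit tends to fire when its first input is active and stay silent otherwise, since that is what the scalar reward signal favors; only addition and leaky integration of local quantities are used, as in the rule the abstract describes.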
Peter Bartlett email: Peter.Bartlett at anu.edu.au Machine Learning Group Computer Sciences Laboratory Phone: +61 2 6279 8681 Research School of Information Sciences and Engineering Australian National University Fax: +61 2 6279 8645 Canberra, 0200 AUSTRALIA http://csl.anu.edu.au/~bartlett From piuri at elet.polimi.it Sat Nov 27 06:00:13 1999 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Sat, 27 Nov 1999 12:00:13 +0100 Subject: IJCNN'2000: New web site, Call for Papers and more.... Message-ID: <3.0.5.32.19991127120013.01d474f0@elet.polimi.it> ======================================================================== IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS Grand Hotel di Como, Como, Italy - 24-27 July 2000 sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society with the technical cooperation of the Japanese Neural Network Society, AEI, SIREN, and AI*IA The paper submission deadline is coming soon! Please have a look at the call for papers on the conference web site. Do not miss the opportunity to submit a paper and participate actively in the conference! Calls for special sessions and tutorials have also been published; you can find them on the conference web site. For your convenience, we have also set up a mirror web site in the USA. Information will be updated simultaneously on both sites. You can find the sites at the following addresses: official conference web site: http://www.ims.unico.it/2000ijcnn.html conference mirror web site: http://www.lans.ece.utexas.edu/2000ijcnn.html ====================================================================== Vincenzo Piuri Department of Electronics and Information, Politecnico di Milano piazza L.
da Vinci 32, 20133 Milano, Italy phone +39-02-2399-3606 secretary +39-02-2399-3623 fax +39-02-2399-3411 email piuri at elet.polimi.it From ingber at ingber.com Sun Nov 28 17:29:56 1999 From: ingber at ingber.com (Lester Ingber) Date: Sun, 28 Nov 1999 16:29:56 -0600 Subject: paper: ... reaction time correlates of the g factor Message-ID: <19991128162956.A7362@ingber.com> Statistical mechanics of neocortical interactions: Reaction time correlates of the g factor has been submitted as an invited commentary on The g Factor: The Science of Mental Ability by Arthur Jensen. This paper can be retrieved as http://www.ingber.com/smni00_g_factor.ps.gz Instructions for retrieval are given below. ABSTRACT: A statistical mechanics of neuronal interactions (SMNI) is explored as providing some substance to a physiological basis of the g factor. Some specific elements of SMNI, previously used to develop a theory of short-term memory (STM) and a model of electroencephalography (EEG), are key to providing this basis. Specifically, Hick's Law, an observed linear relationship between reaction time (RT) and the information storage of STM, in turn correlated with an RT-g relationship, is derived. Links to information and utilities for compression/expansion and for viewing and printing PostScript are in http://www.ingber.com/Z_gz_ps_tar_shar.txt Lester ======================================================================== Instructions for Retrieval of Code and Reprints Interactively Via WWW The archive can be accessed via WWW path http://www.ingber.com/ http://www.alumni.caltech.edu/~ingber/ where the last address is a mirror homepage for the full archive. Interactively Via Anonymous FTP Code and reprints can be retrieved via anonymous ftp from ftp.ingber.com.
Interactively [brackets signify machine prompts]: [your_machine%] ftp ftp.ingber.com [Name (...):] anonymous [Password:] your_e-mail_address [ftp>] binary [ftp>] ls [ftp>] get file_of_interest [ftp>] quit The 00index file contains an index of the other files. Files have the same WWW and FTP paths under the main / directory; e.g., http://www.ingber.com/MISC.DIR/00index_misc and ftp://ftp.ingber.com/MISC.DIR/00index_misc reference the same file. Electronic Mail If you do not have WWW or FTP access, get the Guide to Offline Internet Access, returned by sending an e-mail to mail-server at rtfm.mit.edu with only the words send usenet/news.answers/internet-services/access-via-email in the body of the message. The guide gives information on using e-mail to access just about all InterNet information and documents. Additional Information Lester Ingber Research (LIR) develops projects in areas of expertise documented in the ingber.com InterNet archive. Limited help assisting people with queries on my codes and papers is available only by electronic mail correspondence. Sorry, I cannot mail out hardcopies of code or papers. 
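Returning to the abstract above: Hick's Law is commonly written as a linear relation between mean reaction time and the information conveyed by the number of equally likely response alternatives. A small worked illustration follows; the intercept and slope used here are purely illustrative assumptions, not values from the paper.

```python
# Hick's Law, as cited in the abstract above: mean reaction time is
# linear in the information content of the choice, commonly written
# RT = a + b * log2(n + 1) for n equally likely alternatives (the +1
# allows for the option of making no response). The intercept a and
# slope b below are illustrative assumptions, not values from the paper.
import math

def hick_rt(n_alternatives, a=0.2, b=0.15):
    """Predicted mean reaction time (seconds) for a choice among
    n equally likely alternatives."""
    return a + b * math.log2(n_alternatives + 1)

print(hick_rt(1), hick_rt(3), hick_rt(7))   # each doubling adds b seconds
```

Each doubling of (n + 1) adds the same increment b to the predicted reaction time, which is the linear-in-information character the abstract refers to.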
Lester ======================================================================== -- Lester Ingber http://www.ingber.com/ PO Box 06440 Wacker Dr PO Sears Tower Chicago IL 60606-0440 http://www.alumni.caltech.edu/~ingber/ From luciano at if.sc.usp.br Sun Nov 28 18:40:54 1999 From: luciano at if.sc.usp.br (Luciano Da Fontoura Costa) Date: Sun, 28 Nov 1999 21:40:54 -0200 (EDT) Subject: Research opportunities Message-ID: Dear Sir/Madam: Please help disseminate the following message: ==================================================================== OPPORTUNITY FOR POST-GRAD, POST-DOC AND SABBATICAL STUDIES Cybernetic Vision Research Group IFSC, University of Sao Paulo, Caixa Postal 369 Sao Carlos, SP, 13560-970 Brazil ____________________________________________________________________ We would like to announce possibilities for post-grad, post-doc and sabbatical programs at the Cybernetic Vision Research Group, University of Sao Paulo, Brazil. Additional information is presented in the following: THE CYBERNETIC VISION RESEARCH GROUP: Started in 1993, the Cybernetic Vision Research Group has become nationally and internationally recognized for research in the areas of shape analysis, computer and biological vision, and computational neuroscience. The group currently includes 15 researchers, most of them MSc and PhD students, each with access to individual computational resources. The group is part of the Instituto de Fisica de Sao Carlos, which has modern computational and network resources (Alpha workstations) as well as a well-equipped library. Additional information can be found at the following homepages: *** Group homepage: http://cyvision.if.sc.usp.br/ *** Luciano's personal homepage: http://www.if.sc.usp.br/visao/group/members/luciano/luciano.htm SAO CARLOS: Sao Carlos, where the group is located, is a small and quiet town (about 150 000 inhabitants) in the heart of the state of Sao Paulo, in Brazil.
The university campus is within a residential area, where accommodation is very affordable. Our town, which includes two major Brazilian universities as well as many small industries, is known as one of the most prominent Brazilian high-technology centers. Sao Carlos is not far from Sao Paulo (230 km), the state capital, where flights to most Brazilian and international destinations can be found. The weather is mild (no snow throughout the year), with an average temperature around 20C. RESEARCH POSSIBILITIES: Well-motivated and dynamic students and researchers, with backgrounds in the most diverse areas - including Physics, Mathematics, Computer Science, Engineering, Biology, and Medicine - are welcome to apply for studies in our group. The full-time MSc and PhD programs last up to 2 and 4 years, respectively, but it is possible to proceed directly to the PhD. Post-doc and sabbatical programs can last from a few months to one year or more. It is possible to apply for Brazilian sponsorship covering travelling and/or basic living expenses. Research possibilities include but are not limited to the following: *1* Neuromorphology and neuromorphic modeling: development of new neural shape measures, validation, and application to the classification of neural cells and neuromorphic modeling. We are particularly interested in investigating how neural shapes constrain and help define neural behavior; *2* Scale-space shape representations in 2D and 3D, including multiresolution curvature and skeletonization, singularity theory, differential geometry and differential equations; *3* Visual inspection and image analysis applied to microscopy; *4* Mathematical physics applications to image analysis and vision; *5* Datamining and its applications to visual design, visual quality assessment, neural modeling, and shape analysis. Candidates should contact Prof Luciano da F.
Costa at luciano at if.sc.usp.br, indicating their specific interests and including curricular information as well as at least three addresses for recommendation purposes. ===================================================================== Prof. Luciano da Fontoura Costa Coordinator - Cybernetic Vision Research Group DFI-IFSC, Universidade de Sao Paulo Caixa Postal 369 Sao Carlos, SP 13560-970 Brazil FAX: +55 162 73 9879 or +55 162 71 3616 e-mail: luciano at if.sc.usp.br Group homepage: http://cyvision.if.sc.usp.br/ Personal homepage: http://www.if.sc.usp.br/visao/group/members/luciano/luciano.htm --------------------------------------------------------------------- Forthcoming book: Shape Analysis and Recognition (CRC Press) http://www.ime.usp.br/~cesar/shape_crc/ Have you been to The Scientist? http://www.the-scientist.library.upenn.edu/index.html BMCV2000: http://image.korea.ac.kr/BMCV2000/ ===================================================================== END OF MESSAGE ============== From ASJagath at ntu.edu.sg Sun Nov 28 20:22:15 1999 From: ASJagath at ntu.edu.sg (Jagath C Rajapakse (Asst Prof)) Date: Mon, 29 Nov 1999 09:22:15 +0800 Subject: Postdoc in Brain Imaging Message-ID: Postdoctoral Position in Brain Image Analysis A postdoctoral research position in structural and functional brain image analysis is immediately available in the School of Applied Science, Nanyang Technological University, Singapore. Candidates should have a Ph.D. and experience in signal/image processing and Unix/C/C++. For further information, contact: Dr. Jagath Rajapakse, School of Applied Science, Nanyang Technological University, N4 Nanyang Avenue, Singapore 639798.
Email: asjagath at ntu.edu.sg, Phone: +65 790 5802, Fax: +65 792 6559 From cindy at cns.bu.edu Mon Nov 29 09:54:18 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Mon, 29 Nov 1999 09:54:18 -0500 Subject: Neural Networks 12(10) Message-ID: <199911291454.JAA17326@retina.bu.edu> NEURAL NETWORKS 12(10) Contents - Volume 12, Number 10 - 1999 NEURAL NETWORKS LETTERS: Exploiting inherent relationships in RNN architectures D.P. Mandic and J.A. Chambers ARTICLES: *** Psychology and Cognitive Science *** A self-supervised learning system for pattern recognition by sensory integration K. Yamauchi, M. Oota, and N. Ishii *** Neuroscience and Neuropsychology *** A distributed model of the saccade system: Simulations of temporally perturbed saccades using position and velocity feedback K. Arai, S. Das, E.L. Keller, and E. Aiyoshi *** Mathematical and Computational Analysis *** Storage capacity of non-monotonic neurons B. Crespi A neural implementation of canonical correlation analysis P.L. Lai and C. Fyfe Ensemble learning via negative correlation Y. Liu and X. Yao A regularization approach to continuous learning with an application to financial derivatives pricing D. Ormoneit *** Engineering and Design *** A developmental approach to visually-guided reaching in artificial systems G. Metta, G. Sandini, and J. Konczak Neuro-fuzzy feature evaluation with theoretical analysis R.K. De, J. Basak, and S.K. Pal ______________________________ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. 
Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type               INNS           ENNS            JNNS
----------------------------------------------------------------------------
Membership with               $80            660 SEK         Y 15,000 [incl.
Neural Networks                                              Y 2,000 entrance
                                                             fee]
  (student)                   $55            460 SEK         Y 13,000 [incl.
                                                             Y 2,000 entrance
                                                             fee]
----------------------------------------------------------------------------
Membership without            $30            200 SEK         not available to
Neural Networks                                              non-students
                                                             (subscribe through
                                                             another society)
  (student)                                                  Y 5,000 [incl.
                                                             Y 2,000 entrance
                                                             fee]
----------------------------------------------------------------------------
Institutional rates           $1132          2230 NLG        Y 149,524
----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership University of Skovde P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ***************************************************************** From simone at eealab.unian.it Mon Nov 29 11:31:30 1999 From: simone at eealab.unian.it (Simone G.O. Fiori) Date: Mon, 29 Nov 1999 17:31:30 +0100 Subject: Papers available on neural PCA and ICA Message-ID: <1.5.4.32.19991129163130.00d96ffc@unipg.it> Dear Connectionists, the following two papers are now available: `Mechanical' Neural Learning for Blind Source Separation ======================================================== by Simone Fiori - Dept. of Industrial Engineering, University of Perugia, Perugia (Italy) Journal: Electronics Letters, Vol. 35, No. 22, Oct. 1999 Extended abstract In this Letter we suggest a new learning theory which ensures that the weight matrix of each layer of a neural network remains orthonormal throughout the learning phase. The learning theory is based upon the study of the dynamics of an abstract rigid mechanical system subject to a field of external forces deriving from a potential energy function (PEF). We suggest that by properly selecting the PEF it is possible to force the system to perform different motions, hence the network to perform different tasks. The proposed learning theory is then applied to solve blind source separation problems. Related papers The mentioned work is part of a wider study of neural learning and weight flow on the Stiefel-Grassmann manifold. Interested readers can find more details in the following papers (and references therein): [1] S. Fiori et al., Orthonormal Strongly-Constrained Neural Learning, Proc. IJCNN'98, pp.
1332 - 1337, 1998 [2] --, A Second-Order Differential System for Orthonormal Optimization, Proc. of International Symposium on Circuits and Systems, Vol. V, pp. 531 - 534, 1999 [3] --, `Mechanical' Neural Learning and InfoMax Orthonormal Independent Component Analysis, Proc. IJCNN'99, in press [4] --, Neural Learning and Weight Flow on Stiefel Manifold, Proc. X Italian Workshop on Neural Nets, pp. 325 - 333, 1998 (in English) ~~~~~~~~~ An Experimental Comparison of Three PCA Neural Networks ======================================================= by Simone Fiori - Dept. of Industrial Engineering, University of Perugia, Perugia (Italy) Journal: Neural Processing Letters, accepted for publication. Abstract We present a numerical and structural comparison of three neural PCA techniques: the GHA by Sanger, the APEX by Kung and Diamantaras, and the $\psi$--APEX first proposed by the present author. Through computer simulations we illustrate the performance of the algorithms in terms of convergence speed and minimal attainable error; then an evaluation of the computational effort required by the different algorithms is presented and discussed. A close examination of the results shows that the members of the new class improve on the numerical performance of the existing algorithms, and are also easier to implement. ~~~~~~~~~ Comments and suggestions, especially about the first topic, would be particularly welcome. Both comments and requests for reprints should be addressed to: Dr. Simone Fiori Dept. of Industrial Engineering, Univ. of Perugia Via Ischia, 131 I-60026 Numana (An), Italy E-mail: simone at eealab.unian.it Fax: +39.0744.470188 Best regards, S. Fiori From apolloni at dsi.unimi.it Mon Nov 29 13:28:51 1999 From: apolloni at dsi.unimi.it (Prof. Apolloni Bruno) Date: Mon, 29 Nov 1999 19:28:51 +0100 Subject: Job position at Milano University Message-ID: RESEARCH POST IN THE DEPT.
OF COMPUTER SCIENCE, UNIVERSITY OF MILAN Applications are invited for a research post within the framework of the "Principled Hybrid Systems: Theory and Applications" (PHYSTA) research network. The network is funded by the Training and Mobility of Researchers programme (TMR) of the EC and started in December 1997. It involves research groups from five universities: i) King's College London, United Kingdom (Dept. of Mathematics, Prof. J. Taylor) ii) Katholieke Universiteit Nijmegen, Netherlands (Centre for Neural Networks, Prof. Stan Gielen) iii) National Technical University of Athens, Greece (Dept. of Electrical and Computer Engineering, Prof. Stefanos Kollias) iv) University of Milan, Italy (Department of Computer Science, Prof. Bruno Apolloni) v) Queen's University Belfast, United Kingdom (Dept. of Psychology/English, Prof. Roddy Cowie) The aim of the research network is the development of a theory and a methodology for combining symbolic and sub-symbolic techniques. The problem domain comes from HCI and concerns emotion understanding based on static/moving image and speech signals. The research post is available in the Neural Networks Laboratory of the Computer Science Department of the University of Milan. The post is available immediately for a period of 18 months. The successful applicant will continue the development of a hybrid system prototype which has already been implemented. This is a system combining a neural processing part, which performs a mapping from the feature space to the propositional variables space, and a symbolic processing part. The latter is a series of meditation jumps to higher levels of abstraction following a PAC learning framework for boolean formulas. Candidates must have (or be close to obtaining) a PhD in a field such as Computer Science, Mathematics or Electrical Engineering. C programming skills are necessary, while experience with SNNS and Scheme/Lisp programming skills are welcome.
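The two-stage architecture described above — a neural part mapping the feature space to propositional variables, followed by a symbolic part operating on boolean formulas — can be illustrated with a toy sketch. The threshold units, weights, and formula below are invented for illustration; this is not the PHYSTA prototype.

```python
# Toy neural/symbolic hybrid in the spirit described above: a "neural"
# stage of threshold units maps real-valued features to boolean
# propositional variables, and a symbolic stage evaluates a boolean
# formula over them. Weights, thresholds, and the formula are invented
# for illustration; this is NOT the PHYSTA prototype.

def neural_stage(features, weights, thresholds):
    """One threshold unit per proposition: proposition i is true when
    unit i's weighted sum of the features exceeds its threshold."""
    props = []
    for w, theta in zip(weights, thresholds):
        activation = sum(wi * xi for wi, xi in zip(w, features))
        props.append(activation > theta)
    return props

def symbolic_stage(props):
    """Evaluate a hand-written boolean formula: (p0 AND p1) OR NOT p2."""
    p0, p1, p2 = props
    return (p0 and p1) or (not p2)

weights = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # one unit per proposition
thresholds = [0.5, 0.5, 1.5]
decision = symbolic_stage(neural_stage([0.8, 0.9], weights, thresholds))
print(decision)   # -> True
```

In a real system the thresholds and weights would be learned from data and the boolean formulas induced within a PAC framework, but the division of labor between the two stages is the same.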
As the funding is provided by the EC TMR programme, there are some restrictions on who may benefit from it: * Candidates must be 35 years old or younger. * Candidates must be nationals of an EU country, Norway, Switzerland or Iceland. * Candidates must not be Italian nationals or have worked in Italy for 18 of the last 24 months. The salary for this post is approximately 2,100 Euros. To apply for this post please (e)mail your CV to the address below: Professor Bruno Apolloni, Neural Networks Laboratory Computer Science Department University of Milan, Via Comelico 39, Milano 20135, ITALY phone: 0039 02 55006284 fax: 0039 02 55006276 email: apolloni at dsi.unimi.it More information: apolloni at dsi.unimi.it http://www.image.ece.ntua.gr/physta From angelo.arleo at epfl.ch Tue Nov 30 03:47:41 1999 From: angelo.arleo at epfl.ch (Angelo Arleo) Date: Tue, 30 Nov 1999 09:47:41 +0100 Subject: Preprint available Message-ID: <38438F2C.F2D774C1@epfl.ch> Dear Connectionists, the following paper is to appear in Biological Cybernetics, Special Issue on Navigation in Biological and Artificial Systems: =================================================== "Spatial Cognition and Neuro-Mimetic Navigation: A Model of Hippocampal Place Cell Activity" Angelo Arleo and Wulfram Gerstner Centre for Neuro-Mimetic Systems, MANTRA Swiss Federal Institute of Technology Lausanne =================================================== A preprint of the paper is available at: ftp://lamiftp.epfl.ch/pub/arleo/BC99/paper.ps.Z Comments and suggestions are particularly welcome. Best regards, Angelo Arleo ================================================== Extended Abstract A computational model of hippocampal activity during spatial cognition and navigation tasks is presented. The spatial representation in our model of the rat hippocampus is built on-line during exploration via two processing streams.
An allothetic vision-based representation is built by unsupervised Hebbian learning, extracting spatio-temporal properties of the environment from visual input. An idiothetic representation is learned based on internal movement-related information provided by path integration. At the level of the hippocampus, allothetic and idiothetic representations are integrated to yield a stable representation of the environment by a population of localized overlapping CA3-CA1 place fields. The hippocampal spatial representation is used as a basis for goal-oriented spatial behavior. We focus on the neural pathway connecting the hippocampus to the nucleus accumbens. Place cells drive a population of locomotor action neurons in the nucleus accumbens. Reward-based learning is applied to map place cell activity into action cell activity. The ensemble action cell activity provides navigational maps to support spatial behavior. We present experimental results obtained with a mobile Khepera robot. ================================================= -- Angelo Arleo Laboratory of Microcomputing (LAMI-DI) Swiss Federal Inst. of Technology Lausanne (EPFL) CH-1015 Ecublens, Lausanne Tel/Fax: ++41 21 693 6696 / 5263 E-mail: angelo.arleo at di.epfl.ch Web: http://diwww.epfl.ch/lami/team/arleo From sbay at algonquin.ICS.UCI.EDU Tue Nov 30 00:43:47 1999 From: sbay at algonquin.ICS.UCI.EDU (Stephen D. Bay) Date: Mon, 29 Nov 1999 21:43:47 -0800 Subject: news item -- UCI KDD Archive: Current Contents Message-ID: <199911292143.aa17340@gremlin-relay.ics.uci.edu> I believe the following item may be of interest to the connectionists mailing list members. Stephen =================================================================== Stephen D.
Bay phone (949) 824-3491 Information and Computer Science fax (949) 824-4056 University of California e-mail sbay at ics.uci.edu Irvine, CA 92697-3425 http://www.ics.uci.edu/~sbay - ---------------------------------------------------------- ************************************************** The UCI KDD Archive Current Contents http://kdd.ics.uci.edu/ ************************************************** Thanks to our generous donors, we now have the the following data sets publically available: Discrete Sequence Data Spatio-Temporal Data UNIX User Data El Nino Data Image Text Data CMU Face Images 20 Newsgroups Data Volcanoes on Venus Reuters-21578 Text Collection Multivariate Data Time Series COIL Data Australian Sign Language Data Corel Image Features EEG Data Forest CoverType Pioneer-1 Mobile Robot Data Internet Usage Data Pseudo Periodic Synthetic Time Series IPUMS Census Data Robot Execution Failures KDD CUP 1998 Data Synthetic Control Chart Time Series KDD CUP 1999 Data Web Data Relational Data Microsoft Anonymous Web Data Movies Syskill Webert Web Data If you have a data set or an analysis of data that would be of interest to the KDD community, please consider donating it to the archive. The UCI KDD Archive web site has detailed submission instructions or you may contact the archive librarian for more information. This archive is supported by the Information and Data Management Program at the National Science Foundation. Stephen Bay (sbay at ics.uci.edu) archive librarian From R.J.Howlett at bton.ac.uk Mon Nov 1 11:59:10 1999 From: R.J.Howlett at bton.ac.uk (Dr R.J.Howlett) Date: Mon, 1 Nov 1999 16:59:10 +0000 (GMT) Subject: Invitation to authors Message-ID: Apologies if you receive multiple copies or if this is unwanted. == Authors required - Invitation to contribute chapters == A new book with the title "Radial Basis Function Neural Networks: Design and Applications has been commissioned by publishers Springer Verlag. 
The editors are Dr R.J.Howlett, University of Brighton, UK, and Prof L.C.Jain, University of South Australia. Series editor is Prof J.Kacprzyk, Polish Academy of Sciences. Five chapters have been contributed: Introduction to RBF networks Training Algorithms for Robust RBF Neural Networks Hierarchical Radial Basis Neural Networks Biomedical Applications of Radial Basis Function Networks Servocontroller Applications using Radial Basis Function Networks Proposals are invited for additional chapters in areas of RBF network architectures, clustering/training algorithms, applications, practical experience of use, or other relevant areas. =============================================================== Dr R.J.Howlett Head of Intelligent Signal Processing Labs, UK Head of TCAR --------------------------------------------------------------- Engineering Research Centre School of Engineering University of Brighton Moulsecoomb Brighton Tel:+44 1273 642300 Fax:+44 1273 642301 BN2 4GJ Email r.j.howlett at brighton.ac.uk UNITED KINGDOM --------------------------------------------------------------- Transfrontier Centre for Automotive Research (TCAR) Web Site: http://www.eng.brighton.ac.uk/eee/research/tcar/ Engineering Research Centre Web Site: http://www.eng.brighton.ac.uk/eee/research/ 4th Int Conf on Knowledge-Based Intelligent Engineering Systems, 30 Sept-1 Aug 2000, Brighton, General Chair R.J.Howlett http://www.eng.brighton.ac.uk/eee/research/KES2000 =============================================================== From fmdist at hotmail.com Mon Nov 1 16:36:42 1999 From: fmdist at hotmail.com (Fionn Murtagh) Date: Mon, 01 Nov 1999 13:36:42 PST Subject: Research Assistant position Message-ID: <19991101213646.70513.qmail@hotmail.com> One-year Research Assistant position, PhD or near completion, School of Computer Science, The Queen's University of Belfast. Closing date November 12, 1999. 
- Neural networks for modeling and prediction in environmental and financial data analysis,
- And other applications including distributed information space searching and navigation; compression and classification; image and signal processing.

Good publications will be of major benefit. Further information: Prof F Murtagh, f.murtagh at qub.ac.uk

From elman at crl.ucsd.edu Mon Nov 1 18:21:19 1999
From: elman at crl.ucsd.edu (Jeff Elman)
Date: Mon, 1 Nov 1999 15:21:19 -0800
Subject: Center for Research in Language (UCSD) postdocs: 2000/2001
Message-ID: <199911012321.PAA02112@crl.ucsd.edu>

THE CENTER FOR RESEARCH IN LANGUAGE
UNIVERSITY OF CALIFORNIA, SAN DIEGO
ANNOUNCEMENT OF POSTDOCTORAL FELLOWSHIPS FOR 2000-2001

Applications are invited for postdoctoral fellowships in Language, Communication and Brain at the Center for Research in Language at the University of California, San Diego. The fellowships are supported by the National Institutes of Health (NIDCD), and provide an annual stipend ranging from $26,000-41,000 depending upon years of postdoctoral experience. In addition, some funding is provided for medical insurance and travel. The program provides interdisciplinary training in: (1) psycholinguistics, including language processing in adults and language development in children; (2) communication disorders, including childhood language disorders and adult aphasia; (3) electrophysiological studies of language, and (4) neural network models of language learning and processing. Candidates are expected to work in at least one of these four areas, and preference will be given to candidates with background and interests involving more than one area. Grant conditions require that candidates be citizens or permanent residents of the U.S.
In addition, trainees will incur a payback obligation during their first year of postdoctoral NRSA support and are required to complete a Payback Agreement.* Applications must be RECEIVED by FEBRUARY 1. Applicants should send a cover page with requested information (attached), a statement of interest, three letters of recommendation, a curriculum vitae and copies of relevant publications to: CRL POSTDOCTORAL FELLOWSHIP COORDINATOR Center for Research in Language 0526 University of California, San Diego 9500 Gilman Drive La Jolla, California 92093-0526 (619) 534-2536 Women and minority candidates are encouraged to apply. Program Requirements for post-doctoral candidates (1) Postdoctoral fellows will elect one of the four research components as their major area, defined by the fellow's primary laboratory affiliation across all years in residence. (2) Fellows are expected to attend weekly laboratory meetings within the major area. (3) In addition, post-doctoral fellows will carry out a 3 - 6 month rotation in a laboratory associated with a second component of the training program, including attendance at weekly research meetings. Access more information via our website: http://www.crl.ucsd.edu/fellowships/postdoc_fellow.html *Payback for post-docs can be discharged in the following ways: (1) By receiving an equal period of postdoctoral NRSA support beginning in the 13th month of such postdoctoral NRSA support; (2) By engaging in an equal period of health-related research or research training that averages more than 20 hours per week of a full work year; (3) By engaging in an equal period of health-related teaching that averages more than 20 hours per week of a full work year. 
STATEMENT OF INTEREST FOR CRL POSTDOCTORAL FELLOWSHIP 2000-2001
Deadline: 2/1/2000

Applicant Name:
Applicant Address:
Applicant e-mail:
Applicant Phone Number:
Research Interests (from fellowship announcement):
Title of PhD Thesis:
Institution/Year PhD granted:
Citizenship Status:

From amari at brain.riken.go.jp Tue Nov 2 03:20:55 1999
From: amari at brain.riken.go.jp (Shunichi Amari)
Date: Tue, 02 Nov 1999 17:20:55 +0900
Subject: multiway interactions of firing neurons
Message-ID: <19991102172055D.amari@brain.riken.go.jp>

Announcement of a new paper by Amari: The following paper, entitled "Information Geometry on Hierarchical Decomposition of Stochastic Interactions", has been submitted to IEEE Trans. IT. The paper gives a method of orthogonal decomposition of higher-order or multi-way interactions of random variables into the sum of those of lower interactions. Information geometry gives a good solution to this problem. The theory can be applied to the decomposition of interactions among an ensemble of firing neurons. I obtained the results more than ten years ago, having given a talk at a meeting of the Japanese Mathematical Society, but I could not find sufficient time to write them up in paper form until now.
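As a toy illustration of the kind of quantity being decomposed (this is plain log-linear Moebius inversion, not Amari's information-geometric construction itself), the third-order interaction of three binary neurons can be read off as an alternating sum of log-probabilities over the eight joint states, and it vanishes exactly when the neurons fire independently:

```python
import itertools
import math

def third_order_interaction(p):
    """Third-order log-linear interaction theta_123 of three binary
    variables, where p maps each joint state (x1, x2, x3) to its
    probability.  Moebius inversion gives theta_123 as an alternating
    sum of log-probabilities over the eight joint states."""
    theta = 0.0
    for x in itertools.product((0, 1), repeat=3):
        sign = (-1) ** (3 - sum(x))   # +1 when 3 - (number of ones) is even
        theta += sign * math.log(p[x])
    return theta

# Three independent neurons with hypothetical firing probabilities:
# the joint distribution factorizes, so all interactions above first
# order vanish and theta_123 is zero up to floating-point error.
rates = (0.2, 0.5, 0.7)
p_indep = {
    x: math.prod(r if xi else 1 - r for r, xi in zip(rates, x))
    for x in itertools.product((0, 1), repeat=3)
}
print(third_order_interaction(p_indep))   # ~0.0
```

A correlated ensemble (e.g. neurons driven by a common input) generically produces a nonzero theta_123; the paper's contribution is making such decompositions orthogonal in the information-geometric sense.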
The paper is available at http://www.islab.brain.riken.go.jp/~amari/pub_j.html or directly:

http://www.islab.brain.riken.go.jp/~amari/pub/IGHI.ps.gz (gzipped PostScript file)
http://www.islab.brain.riken.go.jp/~amari/pub/IGHI.pdf (PDF file)

*********************
Shun-ichi Amari
RIKEN Brain Science Institute
Hirosawa 2-1, Wako-shi, Saitama 351-0198, Japan
Director of Brain-Style Information Systems Research Group
Head, Laboratory for Information Synthesis
tel: +81-(0)48-467-9669
fax: +81-(0)48-467-9687
e-mail: amari at brain.riken.go.jp
home page: http://www.bsis.brain.riken.go.jp/

From alain at fmed.ulaval.ca Tue Nov 2 14:57:40 1999
From: alain at fmed.ulaval.ca (Alain Destexhe)
Date: Tue, 2 Nov 99 14:57:40 EST
Subject: postdoc position in computational neuroscience
Message-ID: <9911021957.AA25869@thalamus.fmed.ulaval.ca>

POSTDOC POSITION AVAILABLE

A postdoc position is available for a computational study of neocortical pyramidal neurons in vivo. This project will be conducted in collaboration among three laboratories: A. Destexhe (Laval University, Canada) for the computational part, and D. Pare (Laval University) and Y. Fregnac (CNRS, Gif-sur-Yvette, France) for the experimental part. The candidate will have access to intracellular data from neocortical neurons in vivo, obtained in the two aforementioned labs. The project is primarily modeling, but participation in experiments is possible (to be discussed depending on the interests of the candidate). The project will consist of reconstructing the morphology of intracellularly recorded neurons using a Neurolucida system (available at Laval University). The cellular morphologies will be incorporated into the NEURON simulator to design biophysical models that will be matched precisely to the intracellular recordings.
Because the models and experimental data correspond to the same cellular morphologies, this method will allow us to characterize various aspects of synaptic activity in vivo, and estimate its consequences for dendritic integration. The candidate should have experience in computational modeling and sufficient knowledge of electrophysiology. The position is funded by an NIH grant and is available immediately for a period of 3 years. Candidates should contact Alain Destexhe for more details.

--
Alain Destexhe
Department of Physiology
Laval University
Quebec G1K 7P4, Canada
Tel: (418) 656 5711
Fax: (418) 656 7898
email: alain at fmed.ulaval.ca
http://cns.fmed.ulaval.ca

From wahba at stat.wisc.edu Tue Nov 2 21:41:03 1999
From: wahba at stat.wisc.edu (Grace Wahba)
Date: Tue, 2 Nov 1999 20:41:03 -0600 (CST)
Subject: Correlated Bernoulli observations
Message-ID: <199911030241.UAA31842@hera.stat.wisc.edu>

TR re multivariate Bernoulli observations: available via http://www.stat.wisc.edu/~wahba -> TRLIST

Smoothing Spline ANOVA for Multivariate Bernoulli Observations, With Application to Ophthalmology Data
Fangyu Gao, Grace Wahba, Ronald Klein, MD, and Barbara Klein, MD
UW-Madison Statistics Dept TR 1009, July 15, 1999, submitted

We combine a Smoothing Spline ANOVA model and a log-linear model to build a partly flexible model for multivariate correlated Bernoulli response data, where the joint distribution of the components of the Bernoulli response vector may depend on a complex set of predictor variables. The joint distribution conditioning on the predictor variables is estimated via a SS-ANOVA variational problem. The log odds ratio is used to measure the association between outcome variables. A numerical scheme based on the block one-step SOR-Newton-Raphson algorithm is proposed to obtain an approximate solution for the variational problem. We extend $GACV$ (Generalized Approximate Cross Validation) to the case of multivariate Bernoulli responses.
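For a pair of Bernoulli outcomes, the log odds ratio used in the abstract as the association measure depends only on the four joint probabilities. A minimal sketch with hypothetical numbers (the TR estimates these quantities flexibly as functions of the predictors; this is just the definition):

```python
import math

def log_odds_ratio(p11, p10, p01, p00):
    """Log odds ratio for a pair of Bernoulli outcomes (Y1, Y2),
    given the joint probabilities p_ab = P(Y1=a, Y2=b).  It is zero
    exactly when the two outcomes are independent."""
    return math.log((p11 * p00) / (p10 * p01))

# Independent outcomes (hypothetical marginals 0.3 and 0.6): the joint
# probabilities factorize, and the log odds ratio is exactly zero.
p1, p2 = 0.3, 0.6
print(log_odds_ratio(p1 * p2, p1 * (1 - p2), (1 - p1) * p2, (1 - p1) * (1 - p2)))

# Positively associated outcomes (the two components tend to agree):
print(log_odds_ratio(0.4, 0.1, 0.1, 0.4))   # log(16) ~ 2.77
```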
The randomized version of GACV is fast and stable to compute and is used to adaptively select smoothing parameters in each block one-step SOR iteration. Approximate Bayesian confidence intervals are obtained for the flexible estimates of the conditional logit functions. Simulation studies are conducted to check the performance of the proposed method. Finally, the model is applied to two-eye observational data from the Beaver Dam Eye Study to examine the association of pigmentary abnormalities and various covariates. The results are applicable to a variety of problems where the response of interest is a vector of 0's and 1's that exhibits pairwise and higher-order correlations.

From yilin at stat.wisc.edu Tue Nov 2 22:01:47 1999
From: yilin at stat.wisc.edu (Yi Lin)
Date: Tue, 2 Nov 1999 21:01:47 -0600 (CST)
Subject: No subject
Message-ID: <199911030301.VAA10802@hyg.stat.wisc.edu>

TR "Support Vector Machines and the Bayes Rule in Classification": available via http://www.stat.wisc.edu/~yilin

Support Vector Machines and the Bayes Rule in Classification
by Yi Lin
UW-Madison Statistics Department TR 1014, November 1, 1999.

The Bayes rule is the optimal classification rule if the underlying distribution of the data is known. In practice we do not know the underlying distribution, and need to ``learn'' classification rules from the data. One way to derive classification rules in practice is to implement the Bayes rule approximately by estimating an appropriate classification function. Traditional statistical methods use the estimated log odds ratio as the classification function. Support vector machines (SVMs) are one type of large-margin classifier, and the relationship between SVMs and the Bayes rule has not been clear. In this paper, it is shown that SVMs implement the Bayes rule approximately by targeting certain interesting classification functions.
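One way to see the connection numerically (a toy sketch, much weaker than the TR's analysis): at a point x where P(y=+1|x) = eta, the pointwise minimizer of the expected hinge loss used by SVMs is sign(2*eta - 1), which is exactly the label assigned by the Bayes rule:

```python
def expected_hinge(f, eta):
    """Expected hinge loss E[max(0, 1 - y*f)] at a point where
    P(y=+1) = eta and P(y=-1) = 1 - eta."""
    return eta * max(0.0, 1.0 - f) + (1.0 - eta) * max(0.0, 1.0 + f)

# Minimize over a fine grid of candidate values f.  Away from the
# boundary eta = 1/2, the minimizer sits at sign(2*eta - 1), i.e. the
# Bayes rule's label, rather than at the log odds ratio itself.
grid = [i / 1000.0 - 3.0 for i in range(6001)]
for eta in (0.2, 0.3, 0.8):          # hypothetical P(y=+1|x) values
    f_star = min(grid, key=lambda f: expected_hinge(f, eta))
    print(eta, f_star)               # f_star: -1.0, -1.0, then 1.0
```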
This result helps explain the success of SVMs in many classification studies, and makes it easier to compare SVMs with traditional statistical methods.

From j.hogan at qut.edu.au Wed Nov 3 06:22:24 1999
From: j.hogan at qut.edu.au (Jim Hogan)
Date: Wed, 03 Nov 1999 21:22:24 +1000 (EST)
Subject: Jobs in Australia
Message-ID: <3.0.32.19991103211844.009962d0@sky.fit.qut.edu.au>

Queensland Univ of Technology is offering a number of positions in CS, closing soon. Potential applicants with an interest in machine learning, neural networks or data mining are encouraged to apply. All details may be found at the URL: http://www.qut.edu.au/pubs/employment/99430.html

cheers
jh

------------------------------
James M Hogan
Lecturer in Computer Science
QUT, GPO Box 2434
Brisbane Qld 4001
AUSTRALIA

From neep at ecowar.demon.co.uk Thu Nov 4 12:12:03 1999
From: neep at ecowar.demon.co.uk (neep)
Date: Thu, 04 Nov 1999 17:12:03 +0000
Subject: EANN2000 First Call For Papers
Message-ID: <3821BE63.FF0F5A5@ecowar.demon.co.uk>

First Call for Papers
Sixth International Conference on Engineering Applications of Neural Networks
Kingston Upon Thames, UK
17-19 July 2000

The conference is a forum for presenting the latest results on neural network applications in technical fields. The applications may be in any engineering or technical field, including but not limited to: systems engineering, mechanical engineering, robotics, process engineering, metallurgy, pulp and paper technology, aeronautical engineering, computer science, machine vision, chemistry, chemical engineering, physics, electrical engineering, electronics, civil engineering, geophysical sciences, biomedical systems, and environmental engineering. Prospective authors are requested to send an extended abstract for review by the International Committee.
All papers must be written in English, starting with a succinct statement of the problem and the application area, the results achieved, their significance, and a comparison with previous work (if any). The following must also be included:

- Title of proposed paper
- Author names, affiliations, addresses
- Name of author to contact for correspondence
- E-mail address and fax number of contact author
- Topics which best describe the paper (max. 5 keywords)
- Preferred session

Submissions must be received by February 29, 2000. It is strongly recommended to submit extended abstracts by electronic mail to: eann2000 at kingston.ac.uk, or else by mail (2 copies) to the following address: Dr Dimitris Tsaptsinos, Kingston University, School of Mathematics, Penrhyn Road, Kingston Upon Thames, KT1 2EE, UK. Tel: +44 181 547 2000, extension 2516. Fax: +44 181 547 7497.

For information on earlier EANN conferences see the WWW pages: http://www.abo.fi/~abulsari/EANN.html
Web Page: http://www.kingston.ac.uk/eann

Diary Dates
Submission deadline: 29 February, 2000
Notification of acceptance: 15 March, 2000
Delivery of full papers: 15 April, 2000
Proposals for tutorials: 15 May, 2000
Registration fee must be paid by 15 April, 2000 to guarantee publication of the contribution in the proceedings.

Contacts
Conference Secretariat
Dr Dimitris Tsaptsinos
EANN2000 Conference Secretariat
School of Mathematics
Kingston University
Penrhyn Road
Kingston Upon Thames
Surrey KT1 2EE
UK
Tel: +44 181 547 2000, extension 2516
Fax: +44 181 547 7497
Email: eann2000 at kingston.ac.uk
http://www.kingston.ac.uk/eann

--------------------------------------------------------------------------------
Organising Committee
A. Osman (USA), R. Baratti (Italy), S. Draghici (USA), W. Duch (Poland), J. Fernandez de Canete (Spain), C. Kuroda (Japan), A. Ruano (Portugal), D. Tsaptsinos (UK), E. Tulunay (Turkey)
--------------------------------------------------------------------------------
International Committee (to be extended)
L.
Bobrowski, Poland (leon at spam.ibib.waw.pl), A. Bulsari, Finland (abulsari at spam.abo.fi), T. Clarkson, A. Iwata, G. Jones, UK (G.Jones at kingston.ac.uk), L. Ludwig, S. Michaelides, R. Parenti, Italy (parenti at spam.ari.ansaldo.it), R. Saatchi, UK (R.Saatchi at spam.shu.ac.uk), C. Schizas, S. Usui, P. Zufiria. Please remove "spam" before you use an email address.

--------------------------------------------------------------------------------
Session Chairs
Control Systems (A. Ruano, aruano at spam.ualg.pt)
Process Engineering (R. Baratti, baratti at spam.unica.it)
Vision/Image Processing (S. Draghici, sod at spam.cs.wayne.edu)
more to be announced
Please remove "spam" before you use an email address.

--
Neep Hazarika                  Phone: +44 (0)118 940 4141 (work)
Econostat Limited                     +44 (0)118 946 1659 (home)
Hennerton House                Fax:   +44 (0)118 940 4099
Wargrave, Berkshire RG10 8PD, U.K.
e-mail: neep at ecowar.demon.co.uk

From beer at eecs.cwru.edu Mon Nov 8 14:50:08 1999
From: beer at eecs.cwru.edu (Randall D. Beer)
Date: Mon, 8 Nov 1999 14:50:08 -0500
Subject: NSF/IGERT in Neuromechanical Systems at CWRU
Message-ID:

NSF-SPONSORED TRAINING PROGRAM IN NEUROMECHANICAL SYSTEMS AT CWRU

Predoctoral fellowships are now available in a new multidisciplinary graduate program in Neuro-Mechanical Systems at Case Western Reserve University. Neuro-mechanical systems include natural, man-made, or hybrid systems combining neural controllers and mechanical peripheries. Examples include natural organisms, biologically inspired robots, and neuroprostheses for restoring motor function in the disabled. We are seeking outstanding students with backgrounds in biology, neuroscience, biomedical engineering, computer engineering and science, electrical engineering, or mechanical engineering. Students participating in this program will learn the skills necessary to work in this exciting new multidisciplinary area.
This program, funded by the National Science Foundation's Integrative Graduate Education and Research Training initiative (NSF IGERT), brings together four research groups focused on the neurobiology and biomechanics of movement behavior, on bio-robotics, on evolution and analysis of model neuro-mechanical systems, and on motor system neuroprostheses. The program involves eight faculty from four Departments: Biology, Biomedical Engineering, Electrical Engineering and Computer Science, and Mechanical Engineering:

Randall Beer, Electrical Engineering and Computer Science
Michael Branicky, Electrical Engineering and Computer Science
Hillel Chiel, Biology
Patrick Crago, Biomedical Engineering
Warren Grill, Biomedical Engineering
Robert Kirsch, Biomedical Engineering
Roger Quinn, Mechanical Engineering
Roy Ritzmann, Biology

Students in the training program will participate in cross-disciplinary courses and rotate through laboratories in all four fields. The program includes a multidisciplinary seminar featuring extended visits from leaders in each field. Funds will permit travel to scientific meetings and workshops in each field. Common computer facilities and office areas will be provided for students in the program. Internships in clinical and industrial settings will also be available as options. We are particularly interested in recruiting under-represented minorities. Students must be U.S. Citizens or Permanent Residents of the United States. Further details of the program can be found at http://neuromechanics.cwru.edu. For further information, please contact Dr. Roy Ritzmann, Department of Biology, Case Western Reserve University, Cleveland, OH 44106-7080, (216) 368-3554, rer3 at po.cwru.edu.
From Annette_Burton at Brown.edu Mon Nov 8 14:22:18 1999 From: Annette_Burton at Brown.edu (Annette Burton) Date: Mon, 8 Nov 1999 15:22:18 -0400 Subject: IGERT JOB ANNOUNCEMENT Message-ID: Departments of Applied Mathematics, Cognitive and Linguistic Sciences, and Computer Science announce A NEW INTERDISCIPLINARY POSTDOCTORAL OPPORTUNITY in LEARNING AND ACTION IN THE FACE OF UNCERTAINTY: COGNITIVE, COMPUTATIONAL AND STATISTICAL APPROACHES As part of an NSF award to Brown University through the IGERT program, the Departments of Cognitive and Linguistic Sciences, Computer Science, and Applied Mathematics will be hiring a Postdoctoral Research Associate. Fellows will be scholars who have displayed significant interest and ability in conducting collaborative interdisciplinary research in one or more of the research areas of the program: computational and empirical approaches to uncertainty in language, vision, action, or human reasoning. As well as participating in collaborative research, responsibilities will include helping to coordinate cross-departmental graduate teaching and research as well as some teaching of interdisciplinary graduate courses. We expect that the fellows will play an important role in creating a highly visible presence for the IGERT program at Brown, and the interdisciplinary activities will help unify the interdepartmental activities of the IGERT program. Applicants must hold a PhD in Cognitive Science, Linguistics, Computer Science, Mathematics, Applied Mathematics, or a related discipline, or show evidence that the PhD will be completed before the start of the position. Applicants should send a vita and three letters of reference to the IGERT Postdoc Search Committee, Department of Cognitive and Linguistic Sciences, Brown University, Box 1978, Providence, RI 02912. Special consideration will be given to those applicants whose research is relevant to at least two of the participating departments. 
The position will begin September 1, 2000 for one year, renewable upon satisfactory completion of duties in the first year. Salaries will be between $35,000 and $42,500 per year. All materials must be received by Jan. 15, 2000, for full consideration. Brown University is an Equal Opportunity/Affirmative Action Employer. For additional information about the program and ongoing research initiatives please visit our website at: http://www.cog.brown.edu/IGERT.

From ken at phy.ucsf.EDU Mon Nov 8 16:50:42 1999
From: ken at phy.ucsf.EDU (Ken Miller)
Date: Mon, 8 Nov 1999 13:50:42 -0800 (PST)
Subject: Paper available: review of circuitry underlying mature orientation selectivity, experiments and models
Message-ID: <14375.17842.908607.854386@coltrane.ucsf.edu>

The following paper is now available at
ftp://ftp.keck.ucsf.edu/pub/ken/fm_final.ps.gz (compressed PostScript)
ftp://ftp.keck.ucsf.edu/pub/ken/fm_final.pdf (PDF)
or http://www.keck.ucsf.edu/~ken (click on 'Publications')

This is a preprint of an article to appear in Annual Review of Neuroscience, Vol. 23 (2000). (Note: A review I announced about two weeks ago focused on *development* of orientation selectivity. This review, in contrast, focuses on the structure of the circuitry underlying mature orientation-selective responses.)

----------------------------------------
Neural Mechanisms of Orientation Selectivity in the Visual Cortex
David Ferster, Northwestern University
Kenneth D. Miller, University of California, San Francisco
to appear in Annual Review of Neuroscience, Vol. 23 (2000).

ABSTRACT: The origin of orientation selectivity in the responses of simple cells in cat visual cortex serves as a model problem for understanding cortical circuitry and computation. The feedforward model of Hubel and Wiesel posits that this selectivity arises simply from the arrangement of thalamic inputs to a simple cell.
Much evidence, including a number of recent intracellular studies, supports a primary role of the thalamic inputs in determining simple cell response properties, including orientation tuning. However, this mechanism alone cannot explain the invariance of orientation tuning to changes in stimulus contrast. Simple cells receive push-pull inhibition: ON inhibition in OFF subregions and vice versa. Addition of such inhibition to the feedforward model can account for this contrast invariance, provided the inhibition is sufficiently strong. The predictions of "normalization" and "feedback" models are reviewed and compared to the predictions of this modified feedforward model and to experimental results. The modified feedforward and the feedback models ascribe fundamentally different functions to cortical processing.

Ken

Kenneth D. Miller              telephone: (415) 476-8217
Dept. of Physiology            fax: (415) 476-4929
UCSF                           internet: ken at phy.ucsf.edu
513 Parnassus                  www: http://www.keck.ucsf.edu/~ken
San Francisco, CA 94143-0444

From jas at hnc.com Tue Nov 9 22:25:27 1999
From: jas at hnc.com (Spoelstra, Jacob)
Date: Tue, 9 Nov 1999 19:25:27 -0800
Subject: Job opportunity: HNC Software
Message-ID: <72A838A51366D211B3B30008C7F4D363021F6E86@pchnc.hnc.com>

> POSITION: Staff Scientist
> DIVISION: HNC Financial Solutions
> LOCATION: San Diego, CA    JOB CODE: FCT9911
>
> Duties/Job Description:
> -----------------------
> Responsibilities include designing and building predictive models based on the latest technologies in neural networks, pattern recognition/artificial intelligence, and statistical modeling for various applications in the financial industry. Specific responsibilities may vary by project, but will include analyzing data to determine suitability for modeling, pattern identification and feature (variable) selection from large amounts of data, experimenting with different types of models, analyzing performance, and reporting results to customers.
> Required Qualifications (Experience/Skills):
> --------------------------------------------
> MS or PhD in Computer Science, Electrical Engineering, Applied Statistics/Mathematics, or a related field. Minimum two years of experience in pattern recognition, mathematical modeling, or data analysis on real-world problems. Familiarity with the latest modeling techniques and tools. Good oral and written communication skills, and the ability to interact well with both co-workers and customers. Proficiency in C and Unix, and familiarity with SAS or another analysis tool. Software engineering experience will be a plus.
>
> Preferred Qualifications (Experience/Skills):
> ---------------------------------------------
> Strong mathematical appetite, problem-solving and computer skills (C or C++ or Java). Good Unix scripting and rapid-prototyping skills. Quick learner and good team player. Experience in designing systems based on neural networks, pattern recognition, and/or statistical modeling techniques for financial, health care, marketing, or other real-world applications. Familiarity with object-oriented software design.
>
> Careers at HNC Software Inc:
> Headquartered in San Diego, California, HNC Software Inc. (Nasdaq: HNCS) is the world's leading provider of Predictive Software Solutions for service industries, including financial, retail, insurance, Internet, and telecommunications. It is HNC's employment philosophy to create a dynamic work environment that allows each employee to feel challenged and experience personal growth that maximizes each person's potential. HNC also offers a comprehensive array of employee benefits including stock options, an employee stock purchase plan, competitive health benefits, 401(k) plans, and tuition support for continuing education.
>
> (Please refer to job code FCT9911 in all correspondence)
> Apply by Email: fct_jobs at hnc.com
> By Fax: (858) 452-6524 (For attention: Dr. Khosrow Hassibi)
> By Mail: Dr.
Khosrow Hassibi HNC Software Inc. Financial Solutions 5935 Cornerstone Court West San Diego, CA 92121-3728 From faramarz at cns.bu.edu Wed Nov 10 13:34:04 1999 From: faramarz at cns.bu.edu (Faramarz Valafar) Date: Wed, 10 Nov 1999 13:34:04 -0500 Subject: Graduate Program in The Department of Cognitive and Neural Systems (CNS) at Boston University Message-ID: PLEASE POST ******************************************************************* GRADUATE TRAINING IN THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS (CNS) AT BOSTON UNIVERSITY ******************************************************************* The Boston University Department of Cognitive and Neural Systems offers comprehensive graduate training in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of technological problems. Applications for Fall 2000 admission and financial aid are now being accepted for both the MA and PhD degree programs. To obtain a brochure describing the CNS Program and a set of application materials, write, telephone, or fax: DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS Boston University 677 Beacon Street Boston, MA 02215 617/353-9481 (phone) 617/353-7755 (fax) or send via e-mail your full name and mailing address to the attention of Mr. Robin Amos at: inquiries at cns.bu.edu Applications for admission and financial aid should be received by the Graduate School Admissions Office no later than January 15. Late applications will be considered until May 1; after that date applications will be considered only as special cases. Applicants are required to submit undergraduate (and, if applicable, graduate) transcripts, three letters of recommendation, and Graduate Record Examination (GRE) scores. The Advanced Test should be in the candidate's area of departmental specialization. 
GRE scores may be waived for MA candidates and, in exceptional cases, for PhD candidates, but absence of these scores will decrease an applicant's chances for admission and financial aid. Non-degree students may also enroll in CNS courses on a part-time basis. Stephen Grossberg, Chairman Gail A. Carpenter, Director of Graduate Studies Description of the CNS Department: The Department of Cognitive and Neural Systems (CNS) provides advanced training and research experience for graduate students interested in the neural and computational principles, mechanisms, and architectures that underlie human and animal behavior, and the application of neural network architectures to the solution of outstanding technological problems. Students are trained in a broad range of areas concerning computational neuroscience, cognitive science, and neuromorphic systems, including the brain mechanisms of vision and visual object recognition; audition, speech, and language understanding; recognition, learning, categorization, and long-term memory; cognitive information processing; self-organization and development; navigation, planning, and spatial orientation; cooperative and competitive network dynamics and short-term memory; reinforcement and motivation; attention; adaptive sensory-motor control and robotics; biological rhythms; consciousness; mental disorders; and the mathematical and computational methods needed to support advanced modeling research and applications. The CNS Department awards MA, PhD, and BA/MA degrees. The CNS Department embodies a number of unique features. It has developed a curriculum that consists of eighteen interdisciplinary graduate courses, each of which integrates the psychological, neurobiological, mathematical, and computational information needed to theoretically investigate fundamental issues concerning mind and brain processes and the applications of neural networks to technology. 
Additional advanced courses, including research apprenticeship and seminar courses, are also offered. Each course is typically taught once a week in the afternoon or evening to make the program available to qualified students, including working professionals, throughout the Boston area. Students develop a coherent area of expertise by designing a program that includes courses in areas such as biology, computer science, engineering, mathematics, and psychology, in addition to courses in the CNS curriculum. The CNS Department interacts with colleagues in several Boston University research centers or groups, and with Boston-area scientists collaborating with these centers. The unit most closely linked to the department is the Center for Adaptive Systems. Students interested in neural network hardware can work with researchers in CNS, at the College of Engineering, and at M.I.T. Lincoln Laboratory. Other research resources include distinguished research groups in neurophysiology, neuroanatomy, and neuropharmacology across the Boston University Charles River Campus and Medical School; in sensory robotics, biomedical engineering, computer and systems engineering, and neuromuscular research within the College of Engineering; in dynamical systems within the Mathematics Department; in theoretical computer science within the Computer Science Department; and in biophysics and computational physics within the Physics Department. Key colleagues in these units hold appointments in CNS. In addition to its basic research and training program, the department conducts a seminar series, as well as conferences and symposia, which bring together distinguished scientists from experimental, theoretical, and applied disciplines.
The department is housed in its own new four-story building which includes ample space for faculty and student offices and laboratories (computational neuroscience, visual psychophysics, psychoacoustics, speech and language, sensory-motor control, neurobotics, computer vision), as well as an auditorium, classroom and seminar rooms, a library, and a faculty-student lounge. The department has a powerful computer network for carrying out large-scale simulations of behavioral and brain models. Below are listed departmental faculty, courses and labs. FACULTY AND STAFF OF THE DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS AND CENTER FOR ADAPTIVE SYSTEMS Jelle Atema Professor of Biology Director, Boston University Marine Program (BUMP) PhD, University of Michigan Sensory physiology and behavior. Aijaz Baloch Adjunct Assistant Professor of Cognitive and Neural Systems Senior Modeling Engineer, Nestor, Inc. PhD, Electrical Engineering, Boston University Visual motion perception, computational vision, adaptive control, and financial fraud detection. Helen Barbas Professor of Anatomy and Neurobiology, Boston University School of Medicine PhD, Physiology/Neurophysiology, McGill University Organization of the prefrontal cortex, evolution of the neocortex. Jacob Beck Research Professor of Cognitive and Neural Systems PhD, Psychology, Cornell University Visual perception, psychophysics, computational models of vision. Daniel H. Bullock Associate Professor of Cognitive and Neural Systems, and Psychology PhD, Experimental Psychology, Stanford University Sensory-motor performance and learning, voluntary control of action, serial order and timing, cognitive development. Gail A. 
Carpenter Professor of Cognitive and Neural Systems and Mathematics Director of Graduate Studies, Department of Cognitive and Neural Systems PhD, Mathematics, University of Wisconsin, Madison Learning and memory, synaptic processes, pattern recognition, remote sensing, medical database analysis, machine learning, differential equations. Laird Cermak Director, Memory Disorders Research Center, Boston Veterans Affairs Medical Center Professor of Neuropsychology, School of Medicine Professor of Occupational Therapy, Sargent College PhD, Ohio State University Memory disorders. Michael A. Cohen Associate Professor of Cognitive and Neural Systems and Computer Science PhD, Psychology, Harvard University Speech and language processing, measurement theory, neural modeling, dynamical systems, cardiovascular oscillations physiology and time series. H. Steven Colburn Professor of Biomedical Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Audition, binaural interaction, auditory virtual environments, signal processing models of hearing. Howard Eichenbaum Professor of Psychology PhD, Psychology, University of Michigan Neurophysiological studies of how the hippocampal system mediates declarative memory. William D. Eldred III Professor of Biology PhD, University of Colorado, Health Science Center Visual neurobiology. Paolo Gaudiano Research Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Computational and neural models of robotics, vision, adaptive sensory-motor control, and behavioral neurobiology. Jean Berko Gleason Professor of Psychology PhD, Harvard University Psycholinguistics. Sucharita Gopal Associate Professor of Geography PhD, University of California at Santa Barbara Neural networks, computational modeling of behavior, geographical information systems, fuzzy sets, and spatial cognition. 
Stephen Grossberg Wang Professor of Cognitive and Neural Systems Professor of Mathematics, Psychology, and Biomedical Engineering Chairman, Department of Cognitive and Neural Systems Director, Center for Adaptive Systems PhD, Mathematics, Rockefeller University Vision, audition, language, learning and memory, reward and motivation, cognition, development, sensory-motor control, mental disorders, applications. Frank Guenther Associate Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University MSE, Electrical Engineering, Princeton University Speech production, speech perception, biological sensory-motor control and functional brain imaging. Catherine L. Harris Assistant Professor of Psychology PhD, Cognitive Science and Psychology, University of California at San Diego Visual word recognition, psycholinguistics, cognitive semantics, second language acquisition, computational models of cognition. Michael E. Hasselmo Associate Professor of Psychology Director of Graduate Studies, Psychology Department PhD, Experimental Psychology, Oxford University Electrophysiological studies of neuromodulatory effects in cortical structures, network biophysical simulations of memory function in hippocampus and piriform cortex, behavioral studies of amnestic drugs. Thomas G. Kincaid Professor of Electrical, Computer and Systems Engineering, College of Engineering PhD, Electrical Engineering, Massachusetts Institute of Technology Signal and image processing, neural networks, non-destructive testing. Mark Kon Professor of Mathematics PhD, Massachusetts Institute of Technology Neural network theory, complexity theory, wavelet theory, mathematical physics. Nancy Kopell Professor of Mathematics PhD, Mathematics, University of California at Berkeley Dynamics of networks of neurons. Jacqueline A. 
Liederman Associate Professor of Psychology PhD, Psychology, University of Rochester Dynamics of interhemispheric cooperation; prenatal correlates of neurodevelopmental disorders. Ennio Mingolla Associate Professor of Cognitive and Neural Systems and Psychology PhD, Psychology, University of Connecticut Visual perception, mathematical modeling of visual processes. Joseph Perkell Adjunct Professor of Cognitive and Neural Systems Senior Research Scientist, Research Lab of Electronics and Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology PhD, Massachusetts Institute of Technology Motor control of speech production. Alan Peters Professor of Anatomy and Neurobiology, School of Medicine PhD, Zoology, Bristol University, United Kingdom Organization of neurons in the cerebral cortex; effects of aging on the primate brain; fine structure of the nervous system. Andrzej Przybyszewski Research Fellow, Department of Cognitive and Neural Systems Assistant Professor, University of Massachusetts Medical School, Worcester PhD, Warsaw Medical Academy Electrophysiology of the primate visual system, mathematical and computer modeling of the neuronal networks in the visual system. Adam Reeves Adjunct Professor of Cognitive and Neural Systems Professor of Psychology, Northeastern University PhD, Psychology, City University of New York Psychophysics, cognitive psychology, vision. Mark Rubin Research Assistant Professor of Cognitive and Neural Systems PhD, Physics, University of Chicago Pattern recognition; artificial and biological vision. Michele Rucci Assistant Professor of Cognitive and Neural Systems PhD, Scuola Superiore, Pisa, Italy Vision, sensory-motor control and learning, and computational neuroscience. 
Elliot Saltzman Associate Professor of Physical Therapy, Sargent College Research Scientist, Haskins Laboratories, New Haven, CT Assistant Professor in Residence, Department of Psychology and Center for the Ecological Study of Perception and Action, University of Connecticut, Storrs, CT PhD, Developmental Psychology, University of Minnesota Modeling and experimental studies of human sensorimotor control and coordination of the limbs and speech articulators, focusing on issues of timing in skilled activities. Robert Savoy Adjunct Associate Professor of Cognitive and Neural Systems Scientist, Rowland Institute for Science Experimental Psychologist, Massachusetts General Hospital PhD, Experimental Psychology, Harvard University Computational neuroscience; visual psychophysics of color, form, and motion perception. Teaching about functional MRI and other brain mapping methods. Eric Schwartz Professor of Cognitive and Neural Systems; Electrical, Computer and Systems Engineering; and Anatomy and Neurobiology PhD, High Energy Physics, Columbia University Computational neuroscience, machine vision, neuroanatomy, neural modeling. Robert Sekuler Adjunct Professor of Cognitive and Neural Systems Research Professor of Biomedical Engineering, College of Engineering, BioMolecular Engineering Research Center Frances and Louis H. Salvage Professor of Psychology, Brandeis University Consultant in neurosurgery, Boston Children's Hospital PhD, Psychology, Brown University Visual motion, brain imaging, relation of visual perception, memory, and movement. Barbara Shinn-Cunningham Assistant Professor of Cognitive and Neural Systems and Biomedical Engineering PhD, Electrical Engineering and Computer Science, Massachusetts Institute of Technology Psychoacoustics, audition, auditory localization, binaural hearing, sensorimotor adaptation, mathematical models of human performance. 
Malvin Carl Teich Professor of Electrical and Computer Engineering, Biomedical Engineering and Physics PhD, Cornell University Quantum optics and imaging, photonics, wavelets and fractal stochastic processes, biological signal processing and information transmission. Lucia Vaina Professor of Biomedical Engineering Research Professor of Neurology, School of Medicine PhD, Sorbonne (France); Dres Science, National Polytechnique Institute, Toulouse (France) Computational visual neuroscience, biological and computational learning, functional and structural neuroimaging. Faramarz Valafar Adjunct Assistant Professor of Cognitive and Neural Systems PhD, Electrical Engineering, Purdue University Bioinformatics, adaptive systems (artificial neural networks), data mining and modeling in medicine, medical decision making, pattern recognition and signal processing in biomedicine, biochemistry, and glycoscience. Takeo Watanabe Associate Professor of Psychology PhD, Behavioral Sciences, University of Tokyo Perception of objects and motion and effects of attention on perception using psychophysics and brain imaging (f-MRI). Allen Waxman Adjunct Associate Professor of Cognitive and Neural Systems Senior Staff Scientist, MIT Lincoln Laboratory PhD, Astrophysics, University of Chicago Visual system modeling, multisensor fusion, image mining, parallel computing, and advanced visualization. James Williamson Research Assistant Professor of Cognitive and Neural Systems PhD, Cognitive and Neural Systems, Boston University Pattern recognition; self-organization and topographic maps; perceptual grouping. Jeremy Wolfe Adjunct Associate Professor of Cognitive and Neural Systems Associate Professor of Ophthalmology, Harvard Medical School Psychophysicist, Brigham & Women's Hospital, Surgery Dept. Director of Psychophysical Studies, Center for Clinical Cataract Research PhD, Massachusetts Institute of Technology Visual attention, preattentive and attentive object representation. 
Curtis Woodcock Professor of Geography Director, Geographic Applications, Center for Remote Sensing PhD, University of California, Santa Barbara Biophysical remote sensing, particularly of forests and natural vegetation, canopy reflectance models and their inversion, spatial modeling, and change detection; biogeography; spatial analysis; geographic information systems; digital image processing. CNS DEPARTMENT COURSE OFFERINGS CAS CN500 Computational Methods in Cognitive and Neural Systems CAS CN510 Principles and Methods of Cognitive and Neural Modeling I CAS CN520 Principles and Methods of Cognitive and Neural Modeling II CAS CN530 Neural and Computational Models of Vision CAS CN540 Neural and Computational Models of Adaptive Movement Planning and Control CAS CN550 Neural and Computational Models of Recognition, Memory and Attention CAS CN560 Neural and Computational Models of Speech Perception and Production CAS CN570 Neural and Computational Models of Conditioning, Reinforcement, Motivation and Rhythm CAS CN580 Introduction to Computational Neuroscience GRS CN700 Computational and Mathematical Methods in Neural Modeling GRS CN710 Advanced Topics in Neural Modeling GRS CN720 Neural and Computational Models of Planning and Temporal Structure in Behavior GRS CN730 Models of Visual Perception GRS CN740 Topics in Sensory-Motor Control GRS CN760 Topics in Speech Perception and Recognition GRS CN780 Topics in Computational Neuroscience GRS CN810 Topics in Cognitive and Neural Systems: Visual Event Perception GRS CN811 Topics in Cognitive and Neural Systems: Visual Perception GRS CN911,912 Research in Neural Networks for Adaptive Pattern Recognition GRS CN915,916 Research in Neural Networks for Vision and Image Processing GRS CN921,922 Research in Neural Networks for Speech and Language Processing GRS CN925,926 Research in Neural Networks for Adaptive Sensory-Motor Planning and Control GRS CN931,932 Research in Neural Networks for Conditioning and Reinforcement Learning 
GRS CN935,936 Research in Neural Networks for Cognitive Information Processing GRS CN941,942 Research in Nonlinear Dynamics of Neural Networks GRS CN945,946 Research in Technological Applications of Neural Networks GRS CN951,952 Research in Hardware Implementations of Neural Networks CNS students also take a wide variety of courses in related departments. In addition, students participate in a weekly colloquium series, an informal lecture series, and student-run special interest groups, and attend lectures and meetings throughout the Boston area; and advanced students work in small research groups. LABORATORY AND COMPUTER FACILITIES The department is funded by fellowships, grants, and contracts from federal agencies and private foundations that support research in life sciences, mathematics, artificial intelligence, and engineering. Facilities include laboratories for experimental research and computational modeling in visual perception; audition, speech and language processing; and sensory-motor control and robotics. Data analysis and numerical simulations are carried out on a state-of-the-art computer network comprising Sun workstations, Silicon Graphics workstations, Macintoshes, and PCs. A PC farm running the Linux operating system is available as a distributed computational environment. All students have access to PCs or UNIX workstation consoles, a network of SGI machines, and standard modeling and mathematical simulation packages such as Mathematica, VisSim, Khoros, and Matlab. The department maintains a core collection of books and journals, and has access both to the Boston University libraries and to the many other collections of the Boston Library Consortium. In addition, several specialized facilities and software are available for use. 
These include: Computer Vision/Computational Neuroscience Laboratory The Computer Vision/Computational Neuroscience Lab comprises an electronics workshop, including a surface-mount workstation, PCB fabrication tools, and an Altera EPLD design system; a light machine shop; an active vision lab including actuators and video hardware; and systems for computer-aided neuroanatomy and application of computer graphics and image processing to brain sections and MRI images. Neurobotics Laboratory The Neurobotics Lab utilizes wheeled mobile robots to study potential applications of neural networks in several areas, including adaptive dynamics and kinematics, obstacle avoidance, path planning and navigation, visual object recognition, and conditioning and motivation. The lab currently has three Pioneer robots equipped with sonar and visual sensors; one B-14 robot with a moveable camera, sonars, infrared, and bump sensors; and two Khepera miniature robots with infrared proximity detectors. Psychoacoustics Laboratory The Psychoacoustics Lab houses a newly installed, 8 ft. x 8 ft. sound-proof booth. The laboratory is extensively equipped to perform both traditional psychoacoustic experiments and experiments using interactive auditory virtual-reality stimuli. The major equipment dedicated to the psychoacoustics laboratory includes two Pentium-based personal computers; two Power-PC-based Macintosh computers; a 50-MHz array processor capable of generating auditory stimuli in real time; programmable attenuators; analog-to-digital and digital-to-analog converters; a real-time head tracking system; a special-purpose, signal-processing hardware system capable of generating "spatialized" stereo auditory signals in real time; a two-channel oscilloscope; a two-channel spectrum analyzer; various cables, headphones, and other miscellaneous electronics equipment; and software for signal generation, experimental control, data analysis, and word processing. 
Sensory-Motor Control Laboratory The Sensory-Motor Control Lab supports experimental studies of motor kinematics. An infrared WatSmart system allows measurement of large-scale movements, and a pressure-sensitive graphics tablet allows studies of handwriting and other fine-scale movements. Equipment includes a 40-inch monitor that allows computer display of animations generated by an SGI workstation or a Pentium Pro (Windows NT) workstation. A second major component is a helmet-mounted, video-based, eye-head tracking system (ISCAN Corp, 1997). The latter's camera samples eye position at 240 Hz and also allows reconstruction of what subjects are attending to as they freely scan a scene under normal lighting. Thus the system affords a wide range of visuo-motor studies. Speech and Language Laboratory The Speech and Language Lab includes software facilities for analog-to-digital and digital-to-analog conversion. Ariel equipment allows reliable synthesis and playback of speech waveforms. An Entropic signal processing package provides facilities for detailed analysis, filtering, spectral construction, and formant tracking of the speech waveform. Various large databases, such as TIMIT and TIdigits, are available for testing speech recognition algorithms. For high-speed processing, supercomputer facilities accelerate filtering and data analysis. Visual Psychophysics Laboratory The Visual Psychophysics Lab occupies an 800-square-foot suite, including three dedicated rooms for data collection, and houses a variety of computer-controlled display platforms, including Silicon Graphics, Inc. (SGI) Onyx RE2, SGI Indigo2 High Impact, SGI Indigo2 Extreme, Power Computing (Macintosh compatible) PowerTower Pro 225, and Macintosh 7100/66 workstations. Ancillary resources for visual psychophysics include a computer-controlled video camera, stereo viewing glasses, prisms, a photometer, and a variety of display-generation, data-collection, and data-analysis software. 
Affiliated Laboratories Affiliated CAS/CNS faculty have additional laboratories ranging from visual and auditory psychophysics and neurophysiology, anatomy, and neuropsychology to engineering and chip design. These facilities are used in the context of faculty/student collaborations. ******************************************************************* DEPARTMENT OF COGNITIVE AND NEURAL SYSTEMS GRADUATE TRAINING ANNOUNCEMENT Boston University 677 Beacon Street Boston, MA 02215 Phone: 617/353-9481 Fax: 617/353-7755 Email: inquiries at cns.bu.edu Web: http://www.cns.bu.edu/ ******************************************************************* From A.van.Ooyen at nih.knaw.nl Thu Nov 11 11:40:33 1999 From: A.van.Ooyen at nih.knaw.nl (Arjen van Ooyen) Date: Thu, 11 Nov 1999 17:40:33 +0100 Subject: Models of Axon Guidance and Bundling Message-ID: <382AF181.7408@nih.knaw.nl> New Paper: Models of Axon Guidance and Bundling During Development H. G. E. Hentschel & A. van Ooyen Proc. R. Soc. Lond. B (1999) 266: 2231-2238. Request reprint: A.van.Ooyen at nih.knaw.nl Or download from http://www.cns.ed.ac.uk/people/arjen/papers/bundle_abstract.html ABSTRACT Diffusible chemoattractants and chemorepellants, together with contact attraction and repulsion, have been implicated in the establishment of connections between neurons and their targets. Here we study how such diffusible and contact signals can be involved in the whole sequence of events from bundling of axons, guidance of axon bundles towards their targets, to debundling and the final innervation of individual targets. By means of computer simulations, we investigate the strengths and weaknesses of a number of particular mechanisms that have been proposed for these processes. -- Arjen van Ooyen, Netherlands Institute for Brain Research, Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands. 
email: A.van.Ooyen at nih.knaw.nl website: http://www.cns.ed.ac.uk/people/arjen.html phone: +31.20.5665483 fax: +31.20.6961006 From bengioy at IRO.UMontreal.CA Thu Nov 11 09:02:27 1999 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Thu, 11 Nov 1999 09:02:27 -0500 Subject: open faculty position in machine learning / Montreal Message-ID: <19991111090227.40797@IRO.UMontreal.CA> Hello, The department of computer science and operations research of the University of Montreal is opening two tenure-track faculty positions, and one of the main areas of interest is machine learning. Note that this is a French-speaking university and candidates are expected to be able to teach in French within about a year of hiring (in the past we have hired several non-francophones, and many non-Canadians, who have successfully adapted). Note also that Montreal is a great (bilingual English/French) city to live in, with the flavor of a European city, a vibrant cultural life, four universities, nearby ski slopes, great and inexpensive restaurants, and a strong network of research centers in the mathematical sciences. Please don't hesitate to contact me for further information. -- Yoshua Bengio, bengioy at iro.umontreal.ca www.iro.umontreal.ca/~bengioy The official announcement: --------------------------------------------------------------------- Université de Montréal Faculté des arts et des sciences Department of Computer Science and Operations Research The DIRO (Département d'informatique et de recherche opérationnelle - Department of Computer Science and Operations Research) invites applications for two tenure-track positions in Computer Science at the Assistant Professor level, starting June 1st, 2000. The Department is seeking qualified candidates in Computer Science. 
Preference will be given to applicants with a strong research program in one of the following or related areas: -- Hardware-software systems (specification, synthesis, and verification of embedded systems); -- Artificial intelligence (machine learning and data mining); -- Distributed multimedia systems; -- Programming languages. Beyond demonstrating a clear potential for outstanding research, the successful candidates must be committed to excellence in teaching. The Université de Montréal is the leading French-speaking university in North America. The DIRO offers B.Sc., M.Sc. and Ph.D. degrees in Computer Science and Operations Research, as well as a combined undergraduate degree in Computer Science and Mathematics. With 35 faculty members, 600 undergraduate and 190 graduate students, the DIRO is one of the largest Computer Science departments in Canada as well as one of the most active in research. Research interests of current faculty include computational biology, telecommunications, intelligent tutoring systems, computer architecture, software engineering, artificial intelligence, computational linguistics, computer graphics and vision, machine learning, theoretical and quantum computing, parallelism, optimization, heuristics, and numerical simulation. Further information can be obtained at the Department's web site: http://www.iro.umontreal.ca. Requirements: Ph.D. in Computer Science or a related area. Ability to teach and supervise students in French within one year. Salary: Salary is competitive and fringe benefits are excellent. Hardcopy applications including a curriculum vitae, a statement of current research program, at least three letters of reference, and up to three selected preprints/reprints, should be sent to: Sang Nguyen, professeur et directeur Département d'informatique et de recherche opérationnelle, FAS Université de Montréal C.P. 6128, Succ. "Centre-Ville" Montréal (Québec), H3C 3J7 by February 1st, 2000. 
Applications received after that date may be considered until the positions are filled. In accordance with Canadian Immigration requirements, priority will be given to Canadian citizens and permanent residents. The Université de Montréal is committed to equity in employment and encourages applications from qualified women. -- Yoshua Bengio Professeur agrégé Département d'Informatique et Recherche Opérationnelle Université de Montréal, adresse postale: C.P. 6128 Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7 adresse civique: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Bureau 3339. http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From rsun at cecs.missouri.edu Thu Nov 11 16:27:48 1999 From: rsun at cecs.missouri.edu (Ron Sun) Date: Thu, 11 Nov 1999 15:27:48 -0600 Subject: Ph.D program in AI and connectionist models Message-ID: <199911112127.PAA15773@pc113.cecs.missouri.edu> The Ph.D. program in CECS at the University of Missouri-Columbia is accepting applications. Graduate assistantships and other forms of financial support for graduate students are available. Prospective graduate students interested in Artificial Intelligence, Cognitive Science, Connectionist Models (Neural Networks), Multi-Agent Systems, and other related areas are especially encouraged to apply. Students with Master's degrees are preferred. The department has identified graduate education and research as its primary missions. The department is conducting quality research in a number of areas: artificial intelligence, cognitive science, machine learning, multi-agent systems, neural networks and connectionist models, computer graphics and scientific visualization, computer vision, digital libraries, fuzzy logic, multimedia systems, parallel and distributed computing, and Web computing. 
To download application forms, use http://www.missouri.edu/~gradschl or http://web.missouri.edu/~regwww/admission/intl_admission/Application_Form/Application_index.html (for international students) ----------------------------------------------------------------- The CECS Department awards degrees at the Bachelor's, Master's, and Ph.D. levels. The program is accredited by CSAB and ABET. The CECS Department has a variety of computing equipment and laboratories available for instruction and research. These facilities are currently being enhanced, in conjunction with computing laboratories maintained by the college and by the campus. The computing facilities offer students a wealth of opportunity to access and utilize a wide range of equipment best suited for their research needs. All of the equipment is connected to departmental, college, campus, and global networks, which provide ready access to the exploding world of information and computational resources. A wealth of library resources is available through the extensive collections of books and journals housed in the Engineering and Mathematical Sciences libraries as well as collections in the Main Library and Health Sciences Libraries at MU. The University of Missouri is a Research I university enrolling some 22,000 students. The University offers programs in many areas, ranging from sciences and engineering to psychology, neuroscience, education, biology, medicine, law, agriculture, and journalism. For more information, send e-mail to: cecsdgs at cecs.missouri.edu See the Web pages below: =========================================================================== Prof. 
Ron Sun http://www.cecs.missouri.edu/~rsun CECS Department University of Missouri-Columbia fax: (573) 882 8318 201 Engineering Building West Columbia, MO 65211-2060 email: rsun at cecs.missouri.edu http://www.cecs.missouri.edu/~rsun http://www.cecs.missouri.edu/~rsun/journal.html http://www.cecs.missouri.edu/~rsun/clarion.html =========================================================================== From X.Yao at cs.bham.ac.uk Thu Nov 11 16:43:38 1999 From: X.Yao at cs.bham.ac.uk (Xin Yao) Date: Thu, 11 Nov 1999 21:43:38 +0000 (GMT) Subject: Combinations between EC and NNs Message-ID: The First IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks Co-sponsored by the IEEE Neural Network Council and the Center for Excellence in Evolutionary Computation May 11-12, 2000 The Camberley Gunter Hotel, San Antonio, TX, USA Symposium URL: http://www.cs.bham.ac.uk/~xin/ecnn2000 FINAL CALL FOR PAPERS The recent increasing interest in the synergy between evolutionary computation and neural networks provides an impetus for a symposium dedicated to furthering our understanding of this synergy and the potential utility of hybridizing evolutionary and neural techniques. The First IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks will offer a forum that focuses specifically on the hybridization of evolutionary and neural computation. In particular, papers are solicited in the areas of + evolutionary training of neural networks, + evolutionary design of network topologies, + evolution of learning (weight updating) rules, + evolving solutions to inverse neural network problems, + the performance of alternative variation operators in designing neural networks, + comparisons between evolutionary and other training methods, + evolving developmental rules for neural network design, and + the use of coevolution in optimizing neural networks for pattern recognition, gaming, or other applications. 
Other topics that combine evolutionary and neural computation are also welcome. Submitted papers should represent unpublished, original work. PAPER SUBMISSION Send three (3) copies of your manuscript to Xin Yao School of Computer Science The University of Birmingham Edgbaston, Birmingham B15 2TT U.K. Email: x.yao at cs.bham.ac.uk All three hardcopies should be printed on 8.5 by 11 inch or A4 paper using 11 point Times. Allow at least one inch (25mm) margins on all borders. A paper must include a title, an abstract, and the body and references. It must also include the names and addresses of all authors, their email addresses, and their telephone/fax numbers. The length of submitted papers must be no more than 15 single-spaced, single-column pages, including all figures, tables, and references. Shorter papers are encouraged. In addition to hardcopies, please send a postscript file of your paper (gzipped if possible) to facilitate electronic reviewing to the following email address: x.yao at cs.bham.ac.uk. Please check the symposium's web site http://www.cs.bham.ac.uk/~xin/ecnn2000 for more details as they become available. SUBMISSION DEADLINE: DECEMBER 1, 1999 General Chair: Xin Yao Programme Committee Chair: D.B. Fogel Program Committee Members: H. Adeli P.J. Angeline K. Chellapilla J.-C. Chen S.-B. Cho G.W. Greenwood L. Guan T. Hussain N. Kasabov S. Lucas N. Murshed V. Nissen M. Rizki R. Salomon G. Yen D. Van Veldhuizen B.-T. Zhang Q. Zhao The Symposium follows the 9th IEEE International Conference on Fuzzy Systems (FUZZ-IEEE2000), Hilton Palacio del Rio, San Antonio, TX, USA, 7-10 May 2000. From espaa at exeter.ac.uk Fri Nov 12 05:24:33 1999 From: espaa at exeter.ac.uk (ESPAA) Date: Fri, 12 Nov 1999 10:24:33 +0000 (GMT Standard Time) Subject: PAA JOURNAL CONTENTS Message-ID: PATTERN ANALYSIS AND APPLICATIONS Springer-Verlag London Ltd. 
Homepage: http://www.dcs.ex.ac.uk/paa Electronic Version: http://link.springer.de/link/service/journals/10044 ISSN: 1433-7541 (printed version) ISSN: 1433-755X (electronic version) Table of Contents Vol. 2 Issue 4 (November, 1999) M. Hauta-Kasari, J. Parkkinen, T. Jaaskelainen, R. Lenz: Multi-spectral Texture Segmentation Based on the Spectral Cooccurrence Matrix Pattern Analysis & Applications 2 (1999) 4, 275-284 P. Fränti, E. I. Ageenko, A. Kolesnikov: Vectorising and Feature-Based Filtering for Line-Drawing Image Compression Pattern Analysis & Applications 2 (1999) 4, 285-291 A. F. R. Rahman, M. C. Fairhurst: Serial Combination of Multiple Experts: A Unified Evaluation Pattern Analysis & Applications 2 (1999) 4, 292-311 A. Fusiello, E. Trucco, T. Tommasini, V. Roberto: Improving Feature Tracking with Robust Statistics Pattern Analysis & Applications 2 (1999) 4, 312-320 P. Carvalho, N. Costa, B. Ribeiro, A. Dourado: On the Use of Neural Networks and Geometrical Criteria for Localisation of Highly Irregular Elliptical Shapes Pattern Analysis & Applications 2 (1999) 4, 321-342 From fritz at neuro.informatik.uni-ulm.de Fri Nov 12 05:44:56 1999 From: fritz at neuro.informatik.uni-ulm.de (Fritz Sommer) Date: Fri, 12 Nov 1999 11:44:56 +0100 (MET) Subject: Research Position in brain imaging/cogn. neuroscience Message-ID: <14379.60917.506870.529751@cerebellum> Research Position (BAT IIa) available (cognitive/computational neuroscience and brain imaging) At the University of Ulm an interdisciplinary research project on analysis and modeling of functional magnetic resonance data has been established. It is a joint project of the departments of Psychiatry (Prof. Dr. M. Spitzer), Radiology and Neural Information Processing (Prof. Dr. G. Palm). The project focuses on the development of new methods for the detection and interpretation of functional/effective connectivity in fMRI data and their application to working memory tasks. 
This project offers a unique environment for direct cooperation between theorists and experimenters. A position is available in this project (beginning Jan 2000, 2 years, 1 year extension possible). Candidates should be strongly interested in interdisciplinary research. They should have a background in statistical methods (cluster analysis, Neural Networks), functional MRI analysis or computational neuroscience. Required is a recent master's degree or equivalent in computer science, physics, mathematics or in a closely related area. Experience in programming in C in a Unix environment is necessary; experience with MATLAB and SPM is helpful. The research can be conducted as part of a PhD thesis in Computer Science. Salary according to BAT IIa. The University of Ulm is an equal opportunity employer and encourages female scientists to apply. Employment will be effected through the "Zentrale Universitaetsverwaltung" of the University of Ulm. Ulm is a town of about 160,000 inhabitants nicely situated in the Danube valley. It has picturesque old parts (not to forget the Gothic cathedral with the highest church tower in the world) and is surrounded by beautiful landscape, lakes, creek valleys, forests and the sparsely populated "Schwaebische Alb" high plain. Within a one-hour train ride one can either reach the Alps for skiing and hiking, or the cities of Munich and Stuttgart for more sophisticated cultural programs. Please send us a CV, a letter of motivation, and, if possible, addresses of three referees. Prof. Dr. Dr. M. Spitzer, Department of Psychiatry III, University of Ulm, Leimgrubenweg 12, 89075 Ulm, Germany or e-mail to manfred.spitzer at medizin.uni-ulm.de. Because of the time constraints please email your application to: Dr. F. T. Sommer, email: fritz at neuro.informatik.uni-ulm.de (You can also ask for more detailed information on the research project.)
From shastri at ICSI.Berkeley.EDU Fri Nov 12 14:20:53 1999 From: shastri at ICSI.Berkeley.EDU (Lokendra Shastri) Date: Fri, 12 Nov 1999 11:20:53 PST Subject: Algebraic rules Message-ID: <199911121920.LAA19182@lassi.ICSI.Berkeley.EDU> Dear Connectionists: The following technical report may be of interest to some of you. Best wishes. Lokendra Shastri ------- A Spatiotemporal Connectionist Model of Algebraic Rule-Learning Lokendra Shastri and Shawn Chang TR-99-011 July, 1999 International Computer Science Institute Berkeley, CA 94704 Recent experiments by Marcus, Vijayan, Rao, and Vishton suggest that infants are capable of extracting and using abstract algebraic rules such as ``the first item X is the same as the third item Y''. Such an algebraic rule represents a relationship between placeholders or variables for which one can substitute arbitrary values. As Marcus et al. point out, while most neural network models excel at capturing statistical patterns and regularities in data, they have difficulty in extracting algebraic rules that generalize to new items. We describe a connectionist network architecture that can readily acquire algebraic rules. The extracted rules are not tied to features of words used during habituation, and generalize to new words. Furthermore, the network acquires rules from a small number of examples, without using negative evidence, and without pretraining. A significant aspect of the proposed model is that it identifies a sufficient set of architectural and representational conditions that transform the problem of learning algebraic rules to the much simpler problem of learning to detect coincidences within a spatiotemporal pattern. Two key representational conditions are (i) the existence of nodes that encode serial position within a sequence and (ii) the use of temporal synchrony for expressing bindings between a positional role node and the item that occupies this position in a given sequence.
This work suggests that even abstract algebraic rules can be grounded in concrete and basic notions such as spatial and temporal location, and coincidence. Available at: http://www.icsi.berkeley.edu/~shastri/psfiles/tr-99-011.ps.gz OR http://www.icsi.berkeley.edu/~shastri/psfiles/tr-99-011.pdf Lokendra Shastri International Computer Science Institute 1947 Center Street, Suite 600 Berkeley, CA 94704 shastri at icsi.berkeley.edu http://www.icsi.berkeley.edu/~shastri Phone: (510) 642-4274 ext 310 FAX: (510) 643-7684 From bogus@does.not.exist.com Mon Nov 15 13:07:43 1999 From: bogus@does.not.exist.com () Date: Mon, 15 Nov 1999 19:07:43 +0100 Subject: CFP: ESANN'2000 European Symposium on Artificial Neural Networks Message-ID: ---------------------------------------------------- | | | ESANN'2000 | | | | 8th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 26-27-28, 2000 | | | | Announcement and call for papers | | | | Deadline: 10 December 1999 | ---------------------------------------------------- Technically co-sponsored by the IEEE Neural Networks Council, the IEEE Region 8, the IEEE Benelux Section, and the International Neural Networks Society. The call for papers for the ESANN'2000 conference is now available on the Web: http://www.dice.ucl.ac.be/esann We try as much as possible to avoid multiple sendings of this call for papers; however, please accept our apologies if you receive this e-mail twice, despite our precautions. You will find below a short version of this call for papers, without the instructions to authors (available on the Web). If you have difficulty connecting to the Web, please send an e-mail to esann at dice.ucl.ac.be and we will send you a full version of the call for papers. ESANN'2000 is organised in collaboration with the UCL (Universite catholique de Louvain, Louvain-la-Neuve) and the KULeuven (Katholieke Universiteit Leuven).
Scope and topics ---------------- Since its first edition in 1993, the European Symposium on Artificial Neural Networks has become the reference for researchers on fundamentals and theoretical aspects of artificial neural networks. Each year, around 100 specialists attend ESANN, in order to present their latest results and comprehensive surveys, and to discuss the future developments in this field. The ESANN'2000 conference will focus on fundamental aspects of ANNs: theory, models, learning algorithms, mathematical aspects, approximation of functions, classification, control, time-series prediction, statistics, signal processing, vision, self-organization, vector quantization, evolutive learning, psychological computations, biological plausibility, etc. Papers on links and comparisons between ANNs and other domains of research (such as statistics, data analysis, signal processing, biology, psychology, evolutive learning, bio-inspired systems, etc.) are also encouraged. Papers will be presented orally (no parallel sessions) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. It is important to mention that the topic of a paper, not its quality, will decide whether it fits better into an oral or a poster session. The selection criteria for posters will be identical to those for oral presentations, and both will be printed in the same way in the proceedings. Nevertheless, authors may indicate on the author submission form that they are only willing to present their paper orally.
The following is a non-exhaustive list of topics covered during the ESANN conferences: o theory o models and architectures o mathematics o learning algorithms o vector quantization o self-organization o RBF networks o Bayesian classification o recurrent networks o support vector machines o time series forecasting o adaptive control o statistical data analysis o independent component analysis o signal processing o approximation of functions o cellular neural networks o fuzzy neural networks o natural and artificial vision o hybrid networks o identification of non-linear dynamic systems o biologically plausible artificial networks o bio-inspired systems o neurobiological systems o cognitive psychology o adaptive behaviour o evolutive learning Special sessions ---------------- Special sessions will be organized by renowned scientists in their respective fields. Papers submitted to these sessions are reviewed according to the same rules as any other submission. Authors who submit papers to one of these sessions are invited to mention it on the author submission form; nevertheless, submissions to the special sessions must follow the same format, instructions and deadlines as any other submission, and must be sent to the same address. o Self-organizing maps for data analysis J. Lampinen, K. Kaski, Helsinki Univ. of Tech. (Finland) o Time-series prediction J. Suykens, J. Vandewalle, K.U. Leuven (Belgium) o Artificial neural networks and robotics R. Duro, J. Santos Reyes, Univ. da Coruna (Spain) o Support Vector Machines C. Campbell, Bristol Univ. (UK), J. Suykens, K.U. Leuven (Belgium) o Neural networks and statistics W. Duch, Nicholas Copernicus Univ. (Poland) o Neural networks in medicine T. Villmann, Univ. Leipzig (Germany) o Artificial neural networks for energy management systems G. Joya, Univ. de Malaga (Spain) Details on special sessions are available on the Web.
Location -------- The conference will be held in Bruges (also called "Venice of the North"), one of the most beautiful medieval towns in Europe. Bruges can be reached by train from Brussels in less than one hour (frequent trains). The town of Bruges is known world-wide for its architectural style, its canals, and its pleasant atmosphere. The conference will be organised in a hotel located near the centre (walking distance) of the town. There is no obligation for the participants to stay in this hotel. Hotels of all levels of comfort and price are available in Bruges; a room may be booked in the conference hotel, or in another one (50 m from the first one), at a preferential rate through the conference secretariat. A list of other smaller hotels is also available. The conference will be held at the Novotel hotel, Katelijnestraat 65B, 8000 Brugge, Belgium. Call for contributions ---------------------- Prospective authors are invited to submit - six original copies of their manuscript (including at least two originals or very good copies without glued material, which will be used for the proceedings) - one signed copy of the author submission form before December 10, 1999. Authors are invited to include a floppy disk or CD with their contribution in (generic) PostScript or PDF format. Sorry, electronic or fax submissions are not accepted. Working language of the conference (including proceedings) is English. The instructions to authors, together with the author submission form, are available on the ESANN Web server: http://www.dice.ucl.ac.be/esann A printed version of these documents is also available through the conference secretariat (please use email if possible). Authors are invited to follow the instructions to authors. A LaTeX style file is also available on the Web. Authors must indicate their choice for oral or poster presentation on the author submission form.
They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. Authors of accepted papers will have to register before February 28, 2000. They will benefit from the advance registration fee. Submissions must be sent to: Michel Verleysen UCL - DICE 3, place du Levant B-1348 Louvain-la-Neuve Belgium esann at dice.ucl.ac.be All submissions will be acknowledged by fax or email before December 23, 1999. Deadlines --------- Submission of papers December 10, 1999 Notification of acceptance January 31, 2000 Symposium April 26-27-28, 2000 Registration fees ----------------- registration before registration after March 17, 2000 March 17, 2000 Universities BEF 16000 BEF 17000 Industries BEF 20000 BEF 21000 The registration fee includes attendance at all sessions, the lunches during the three days of the conference, the coffee breaks twice a day, the conference dinner, and the proceedings. Conference secretariat ---------------------- Michel Verleysen D facto conference services phone: + 32 2 420 37 57 27 rue du Laekenveld Fax: + 32 2 420 02 55 B - 1080 Brussels (Belgium) E-mail: esann at dice.ucl.ac.be http://www.dice.ucl.ac.be/esann Steering and local committee ---------------------------- François Blayo Préfigure (F) Marie Cottrell Univ. Paris I (F) Jeanny Hérault INPG Grenoble (F) Henri Leich Fac. Polytech. Mons (B) Bernard Manderick Vrije Univ. Brussel (B) Eric Noldus Univ. Gent (B) Jean-Pierre Peters FUNDP Namur (B) Joos Vandewalle KUL Leuven (B) Michel Verleysen UCL Louvain-la-Neuve (B) Scientific committee (to be confirmed) -------------------- Edoardo Amaldi Politecnico di Milano (I) Agnès Babloyantz Univ. Libre Bruxelles (B) Hervé Bourlard IDIAP Martigny (CH) Joan Cabestany Univ. Polit. de Catalunya (E) Holk Cruse Universität Bielefeld (D) Eric de Bodt Univ. Lille II & UCL Louv.-la-N. (B) Dante Del Corso Politecnico di Torino (I) Wlodek Duch Nicholas Copernicus Univ.
(PL) Marc Duranton Philips / LEP (F) Jean-Claude Fort Université Nancy I (F) Bernd Fritzke Dresden Univ. of Technology (D) Stan Gielen Univ. of Nijmegen (NL) Manuel Grana UPV San Sebastian (E) Anne Guérin-Dugué INPG Grenoble (F) Martin Hasler EPFL Lausanne (CH) Laurent Hérault CEA-LETI Grenoble (F) Christian Jutten INPG Grenoble (F) Juha Karhunen Helsinki Univ. of Technology (FIN) Vera Kurkova Acad. of Science of the Czech Rep. (CZ) Petr Lansky Acad. of Science of the Czech Rep. (CZ) Mia Loccufier Univ. Gent (B) Eddy Mayoraz Motorola Palo Alto (USA) Jean Arcady Meyer Univ. Pierre et Marie Curie - Paris 6 (F) José Mira UNED (E) Jean-Pierre Nadal Ecole Normale Supérieure Paris (F) Gilles Pagès Univ. Pierre et Marie Curie - Paris 6 (F) Thomas Parisini Politecnico di Milano (I) Hélène Paugam-Moisy Univ. Lumière Lyon 2 (F) Alberto Prieto Universidad de Granada (E) Leonardo Reyneri Politecnico di Torino (I) Tamas Roska Hungarian Academy of Science (H) Jean-Pierre Rospars INRA Versailles (F) John Stonham Brunel University (UK) Johan Suykens KUL Leuven (B) John Taylor King's College London (UK) Claude Touzet IUSPIM Marseilles (F) Marc Van Hulle KUL Leuven (B) Christian Wellekens Eurecom Sophia-Antipolis (F) From mozer at cs.colorado.edu Tue Nov 16 14:47:25 1999 From: mozer at cs.colorado.edu (Mike Mozer) Date: Tue, 16 Nov 99 12:47:25 -0700 Subject: faculty positions in Machine Learning at U. Colorado, Boulder Message-ID: <199911161947.MAA20910@neuron.cs.colorado.edu> The Computer Science Department at the University of Colorado at Boulder has immediate openings for two new faculty members in the area of Machine Learning (at either junior or senior level). Additional openings are likely in the coming years through the Institute of Cognitive Science.
Details can be found in the job ad at http://www.cs.colorado.edu/department/news/news.html#search From jagota at cse.ucsc.edu Tue Nov 16 17:07:05 1999 From: jagota at cse.ucsc.edu (Arun Jagota) Date: Tue, 16 Nov 1999 14:07:05 -0800 Subject: NIPS*99: call for volunteers Message-ID: <199911162207.OAA27891@sundance.cse.ucsc.edu> We can use a few more student volunteers at NIPS*99. (Any nine hours of work in Denver gets registration for both the tutorials and the conference, as well as reception and dinner. Any six hours of work in Breckenridge gets registration for the workshops, which includes the receptions and dinner, and transportation there on the bus. One may volunteer at both places.) To apply, first visit the "Volunteer Work" subsection of the "Programs and Schedules" section of the NIPS page http://www.cs.cmu.edu/Groups/NIPS/ for instructions, then visit http://www.cse.ucsc.edu/~jagota/NIPS/S.html for the current list of open tasks. Arun Jagota, NIPS*99 Local Arrangements jagota at cse.ucsc.edu From geoff at giccs.georgetown.edu Tue Nov 16 18:16:07 1999 From: geoff at giccs.georgetown.edu (Geoff Goodhill) Date: Tue, 16 Nov 1999 18:16:07 -0500 Subject: Papers available: axon guidance Message-ID: <199911162316.SAA05112@brecker.giccs.georgetown.edu> The following papers in TINS and J Neurobiol are now available from http://www.giccs.georgetown.edu/labs/cns/axon.html 1. Retinotectal Maps: Molecules, Models, and Misplaced Data. Geoffrey J. Goodhill & Linda J. Richards. Trends in Neurosciences, 22, 529-534 (December 1999). 2. Theoretical analysis of gradient detection by growth cones. Geoffrey J. Goodhill & Jeffrey S. Urbach. Journal of Neurobiology, 41, 230-241 (November 1999). Abstracts are below. 
Geoff Geoffrey J Goodhill, PhD Assistant Professor, Department of Neuroscience & Georgetown Institute for Cognitive and Computational Sciences Georgetown University Medical Center 3970 Reservoir Road NW, Washington DC 20007 Tel: (202) 687 6889, Fax: (202) 687 0617 Email: geoff at giccs.georgetown.edu Homepage: www.giccs.georgetown.edu/labs/cns ABSTRACTS 1. The mechanisms underlying the formation of topographic maps in the retinotectal system have long been debated. Recently, members of the Eph and ephrin receptor-ligand family have been found to provide a molecular substrate for one type of mechanism, that of chemospecific gradient matching as proposed by Sperry. However, experiments over several decades have demonstrated that there is more to map formation than gradient matching. This article briefly reviews the old and new findings, argues that these two types of data must be properly integrated in order to understand map formation fully, and suggests some experimental and theoretical ways to begin this process. 2. Gradients of diffusible and substrate-bound molecules play an important role in guiding axons to appropriate targets in the developing nervous system. Although some of the molecules involved have recently been identified, little is known about the physical mechanisms by which growth cones sense gradients. This paper applies the seminal Berg & Purcell (1977) model of gradient sensing to this problem. The model provides estimates for the statistical fluctuations in the measurement of concentration by a small sensing device. By assuming that gradient detection consists of the comparison of concentrations at two spatially or temporally separated points, the model therefore provides an estimate for the steepness of gradient that can be detected as a function of physiological parameters. The model makes the following specific predictions. (1) It is more likely that growth cones use a spatial rather than temporal sensing strategy. 
(2) Growth cone sensitivity increases with the concentration of ligand, the speed of ligand diffusion, the size of the growth cone, and the time over which it averages the gradient signal. (3) The minimum detectable gradient steepness for growth cones is roughly in the range 1% - 10%. (4) This value varies depending on whether a bound or freely diffusing ligand is being sensed, and on whether the sensing occurs in three dimensions or two dimensions. The model also makes predictions concerning the role of filopodia in gradient detection. From steve at cns.bu.edu Wed Nov 17 08:27:00 1999 From: steve at cns.bu.edu (Stephen Grossberg) Date: Wed, 17 Nov 1999 08:27:00 -0500 Subject: Perceptual Grouping and Object-Based Attention by the Laminar Circuits of Visual Cortex Message-ID: The following article can be read at http://www.cns.bu.edu/Profiles/Grossberg/ S. Grossberg and R. D. S. Raizada (1999) Contrast-sensitive perceptual grouping and object-based attention in the laminar circuits of primary visual cortex. Vision Research, in press. ABSTRACT Recent neurophysiological studies have shown that primary visual cortex, or V1, does more than passively process image features using the feedforward filters suggested by Hubel and Wiesel. It also uses horizontal interactions to group features preattentively into object representations, and feedback interactions to selectively attend to these groupings. All neocortical areas, including V1, are organized into layered circuits. We present a neural model showing how the layered circuits in areas V1 and V2 enable feedforward, horizontal, and feedback interactions to complete perceptual groupings over positions that do not receive contrastive visual inputs, even while attention can only modulate or prime positions that do not receive such inputs. Recent neurophysiological data about how grouping and attention occur and interact in V1 are simulated and explained, and testable predictions are made. 
These simulations show how attention can selectively propagate along an object grouping and protect it from competitive masking, and how contextual stimuli can enhance or suppress groupings in a contrast-sensitive manner. Preliminary version appears as Boston University Technical Report, CAS/CNS-TR-99-008. Available in PDF and gzipped PostScript. From becker at curie.psychology.mcmaster.ca Wed Nov 17 18:18:18 1999 From: becker at curie.psychology.mcmaster.ca (Sue Becker) Date: Wed, 17 Nov 1999 18:18:18 -0500 (EST) Subject: faculty position: human cognition/cognitive neuroscience Message-ID: Dear list members, The department of Psychology at McMaster has an open faculty position at the assistant or associate level that may be of interest. Our advertisement is included below. Although it generally targets human cognition, cognitive neuroscience (neural computation, neuropsychology, brain imaging) is a sub-area of high priority. Feel free to contact me if you are interested in applying. cheers, Sue Becker Sue Becker Department of Psychology, McMaster University becker at mcmaster.ca 1280 Main Street West, Hamilton, Ont. L8S 4K1 Fax: (905)529-6225 http://www.science.mcmaster.ca/Psychology/sb.html Tel: 525-9140 ext. 23020 For Aug 6/1999-June 30/2000: becker at mcmaster.ca Institute of Cognitive Neuroscience Fax: 44-(0)171-391-1145 Alexandra House, University College London Tel: 44-(0)171-391-1148 17 Queen Square, London, UK WC1N 3AR ------------------------------------------------------------- The Department of Psychology at McMaster University invites applications for a tenure track appointment at the Assistant Professor level or early Associate Professor level in the area of human cognition. Preference will be given to applicants with research interests in higher level cognitive processes (e.g., memory, categorization, decision-making), or a research program in neuropsychology, particularly one involving patient populations.
However, candidates with research programs in other areas of cognition are strongly encouraged to apply. Research which extends to the domain of cognitive neuroscience (e.g., neuroimaging, neural computation) will be considered an asset. To apply, send a curriculum vitae, a short statement of research interests, selected reprints, and three letters of reference to: Dr. Bruce Milliken, Department of Psychology, McMaster University, Hamilton, Ontario, CANADA L8S 4K1. Closing date for applications and supporting material is December 15, 1999. In accordance with Canadian immigration requirements, this advertisement is directed to Canadian citizens and permanent residents. McMaster University is committed to Employment Equity and encourages applications from all qualified candidates, including aboriginal peoples, persons with disabilities, members of visible minorities, and women. Interested candidates may learn more about the department at http://www.psychology.mcmaster.ca. From tp at ai.mit.edu Thu Nov 18 01:01:26 1999 From: tp at ai.mit.edu (Tomaso Poggio) Date: Thu, 18 Nov 1999 01:01:26 -0500 Subject: position available/MIT Message-ID: <4.2.0.58.19991118005531.05622410@pop6.attglobal.net> MASSACHUSETTS INSTITUTE OF TECHNOLOGY DEPARTMENT OF BRAIN & COGNITIVE SCIENCES The MIT Department of Brain and Cognitive Sciences anticipates making a new tenure-track appointment in theoretical/experimental neuroscience at the Assistant Professor level. Candidates should combine a strong mathematical background and an active research interest in the modeling of specific cellular- or systems-level phenomena with appropriate experiments. Individuals whose research focuses on learning and memory or sensory-motor integration at the level of neurons and networks of neurons are especially encouraged to apply. We are also interested in individuals working on bioinformatics in neuroscience. Responsibilities include graduate and undergraduate teaching and research supervision.
Applications should include a brief cover letter stating the candidate's research and teaching interests, a vita, three letters of recommendation, and representative reprints, and should be sent to: Theoretical/Experimental Neuroscience Search Committee, Dept. of Brain & Cognitive Sciences, E25-406, MIT, Cambridge, MA 02139. Review of applications starts January 15, 2000. MIT is an Affirmative Action/Equal Opportunity Employer. Qualified women and minority candidates are especially encouraged to apply. Tomaso Poggio Uncas and Helen Whitaker Professor Brain Sciences Department and Artificial Intelligence Lab M.I.T., E25-218, 45 Carleton St Cambridge, MA 02142 E-mail: tp at ai.mit.edu Web: CBCL-Web-page: Phone: 617-253-5230 Fax: 617-253-2964 From giorgio.giacinto at computer.org Thu Nov 18 04:24:54 1999 From: giorgio.giacinto at computer.org (Giorgio Giacinto) Date: Thu, 18 Nov 1999 10:24:54 +0100 Subject: 1st International Workshop on Multiple Classifier Systems Message-ID: **Apologies for multiple copies** ************************************************** ***** First Announcement and Call for Papers ***** ************************************************** ********************************************************************** 1st MCS FIRST INTERNATIONAL WORKSHOP ON MULTIPLE CLASSIFIER SYSTEMS Santa Margherita di Pula, Cagliari, Italy, June 21-23 2000 ********************************************************************** *** Updated information: http://www.diee.unica.it/mcs *** *** E-mail: mcs at diee.unica.it *** WORKSHOP OBJECTIVES The main goal of the workshop is to assess the state of the art of the theory and the applications of multiple classifier systems and related approaches.
Contributions from all the research communities working in the field are welcome in order to compare the different approaches and to define the common research priorities. Special attention is also devoted to assessing the applications of multiple classifier systems and the potential market perspectives. The workshop program will include both plenary lectures given by invited speakers and papers accepted for oral presentation. The papers will be published in the workshop proceedings, and extended versions of selected papers will be considered for publication in a special issue of the Pattern Analysis and Applications Journal on Classifier Fusion. WORKSHOP TOPICS Papers describing original work in the following and related research topics are welcome: Theoretical foundations of multiple classifier systems Methods for classifier combination Methods for classifier selection Neural network ensembles Modular neural networks Mixture models Multiple expert systems Hybrid systems Learning in multiple classifier systems Design of multiple classifier systems Multiple models in data mining Related approaches (intelligent agents, multi-criteria decision making, etc.) Applications (biometrics, document analysis, data mining, remote sensing, etc.) WORKSHOP CHAIRS Josef Kittler (Univ. of Surrey, United Kingdom) Fabio Roli (Univ. of Cagliari, Italy) INVITED SPEAKERS Thomas G. Dietterich (Oregon State University, USA) Robert P.W. Duin (Delft Univ. of Tech., The Netherlands) Amanda J.C. Sharkey (Dept. Computer Science, Univ. of Sheffield, UK) Sargur N. Srihari (CEDAR, State Univ. of New York, Buffalo, USA) Ching Y. Suen (CENPARMI, Concordia Univ., Montreal, Canada) PROGRAM CHAIR Gianni Vernazza (Univ. of Genoa, Italy) SCIENTIFIC COMMITTEE J. A. Benediktsson (Iceland) H. Bunke (Switzerland) L. P. Cordella (Italy) T.G. Dietterich (USA) R. P.W. Duin (The Netherlands) J. Ghosh (USA) S. Impedovo (Italy) D. Landgrebe (USA) D.S. Lee (USA) A. K. Jain (USA) T. K. Ho (USA) D.
Partridge (UK) C.Scagliola (Italy) R. Schapire (USA) A. J.C. Sharkey (UK) S. N. Srihari (USA) C.Y. Suen (Canada) D. Wolpert (USA) LOCAL COMMITTEE G.Armano (Univ. of Cagliari, Italy) G.Giacinto (Univ. of Cagliari, Italy) G.Fumera (Univ. of Cagliari, Italy) PAPER SUBMISSION Three hard copies of the full papers should be mailed to: 1st MCS Prof. Fabio Roli Electrical and Electronic Engineering Dept. - University of Cagliari Piazza d'Armi 09123 Cagliari Italy In addition, participants should submit an electronic version of the manuscript (PostScript or PDF format) to mcs at diee.unica.it The papers should not exceed 15 A4 pages (12pt, double-spaced). A cover sheet with the authors' names and affiliations is also requested, with the complete address of the corresponding author, and an abstract (200 words). The papers will be refereed by two separate reviewers of the Scientific Committee. IMPORTANT DATES February 1, 2000: Paper Submission March 15, 2000: Notification of Acceptance April 30, 2000: Camera-ready Manuscripts May 2000: Early registration WORKSHOP VENUE The workshop will be held at the Is Molas Golf Hotel, Santa Margherita di Pula, Cagliari, Italy. Additional information concerning the workshop venue can be found at http://www.ismolas.it WORKSHOP PROCEEDINGS The papers will be published in the workshop proceedings, and extended versions of selected papers will be considered for publication in a special issue of the Pattern Analysis and Applications Journal on Classifier Fusion. ==================================================================== Fabio Roli, PhD Associate Professor of Computer Science Electrical and Electronic Engineering Dept.
- University of Cagliari Piazza d'Armi 09123 Cagliari Italy Phone +39 070 675 5874 Fax +39 070 6755900 e-mail roli at diee.unica.it Web Page at http://www.diee.unica.it From villmann at informatik.uni-leipzig.de Thu Nov 18 03:18:58 1999 From: villmann at informatik.uni-leipzig.de (villmann@informatik.uni-leipzig.de) Date: Thu, 18 Nov 1999 09:18:58 +0100 Subject: ESANN'2000 - special session - NN Applications in Medicine Message-ID: <199911180818.JAA13635@ilabws.informatik.uni-leipzig.de> Dear colleagues, I would like to announce, and to invite contributions to, the special session "Neural Networks Applications in Medicine", organized by Thomas Villmann (University of Leipzig, Germany), to be held at the ---------------------------------------------------- | | | ESANN'2000 | | | | 8th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 26-27-28, 2000 | | | | Announcement and call for papers | | | | Deadline: 10 December 1999 | ---------------------------------------------------- PROPOSAL ------------------- Artificial neural networks can be regarded as a special kind of learning, self-adapting data-processing system. The ability of neural techniques to handle noisy and high-dimensional data, nonlinear problems, large data sets, etc. has led to innumerable applications. An important area of application is medical research and data analysis. Here we can distinguish at least two main directions: the first is the description of neural processes in the brain by neural network models; the other is the field of data analysis and related topics. The announced special session will highlight the second direction. The applications of neural networks for data processing in medicine cover a wide range of possible tasks, which in turn leads to further developments in the theory of neural networks.
The applications involve several techniques, ranging from image processing, (non-linear) principal component analysis and dimension reduction, processing of noisy and/or incomplete data, time series prediction, feature extraction and classification problems, to the processing of non-metric data, i.e. categorical or ordinal data. The recently proposed approach of Blind Source Separation using neural networks also offers a way to solve the problem of data decorrelation, which often arises in medicine. Neural network approaches, as robust tools, play an increasing role in real-world applications in medicine. In the proposed special session we want to give examples of applications of neural networks in medicine which have led to new developments in neural networks themselves. We therefore focus on applications combining original ideas and new aspects of neural network approaches with a real-world task. Authors are invited to submit contributions in any area of medical research and applications, for example (but not restricted to) the following: 1.) image processing (NMR-spectroscopy, radiology, ...) 2.) time series prediction (EEG analysis, cardiology, sleep detection, ...) 3.) pattern classification 4.) clustering, fuzzy clustering 5.) Blind Source Separation and decorrelation 6.) dimension and noise reduction 7.) evaluation of non-metric data (categorical/ordinal data) It should be emphasized that, in agreement with the scope of ESANN, a strong theoretical background and new aspects of neural network development are required for acceptance of contributions. Prospective authors are invited to submit - six original copies of their manuscript (including at least two originals or very good copies without glued material, which will be used for the proceedings) - one signed copy of the author submission form before December 10, 1999. Authors are invited to include a floppy disk or CD with their contribution in (generic) PostScript or PDF format.
Sorry, electronic or fax submissions are not accepted. Working language of the conference (including proceedings) is English.

The instructions to authors, together with the author submission form, are available on the ESANN Web server: http://www.dice.ucl.ac.be/esann A printed version of these documents is also available through the conference secretariat (please use email if possible). Authors are invited to follow the instructions to authors. A LaTeX style file is also available on the Web.

Authors must indicate their choice for oral or poster presentation on the author submission form. They must also sign a written agreement that they will register for the conference and present the paper if their submission is accepted. Authors of accepted papers will have to register before February 28, 2000. They will benefit from the advance registration fee.

Submissions must be sent to: Michel Verleysen UCL - DICE 3, place du Levant B-1348 Louvain-la-Neuve Belgium esann at dice.ucl.ac.be

All submissions will be acknowledged by fax or email before December 23, 1999.

Deadlines
---------
Submission of papers          December 10, 1999
Notification of acceptance    January 31, 2000
Symposium                     April 26-27-28, 2000

Registration fees
-----------------
                registration before    registration after
                March 17, 2000         March 17, 2000
Universities    BEF 16000              BEF 17000
Industries      BEF 20000              BEF 21000

The registration fee includes attendance at all sessions, the lunches during the three days of the conference, the coffee breaks twice a day, the conference dinner, and the proceedings.

__________________________________________________________
Dr. Thomas Villmann
University Leipzig - Clinic for Psychotherapy
Karl-Tauchnitz-Str.
25 04109 Leipzig Germany
Phone: +49 (0)341 9718868 Fax : +49 (0)341 2131257
email: villmann at informatik.uni-Leipzig.de
__________________________________________________________

From szepes at mindmaker.hu Thu Nov 18 13:32:57 1999
From: szepes at mindmaker.hu (Csaba Szepesvari)
Date: Thu, 18 Nov 1999 19:32:57 +0100
Subject: TR: Comparing Value-Function Estimation Algorithms in Undiscounted Problems
Message-ID: <38344658.C3436C8E@mindmaker.hu>

Dear Colleagues,

The following technical report is available at http://victoria.mindmaker.hu/~szepes/papers/slowql-tr99-02.ps.gz All comments are welcome.

Best wishes, Csaba Szepesvari

----------------------------------------------------------------
Comparing Value-Function Estimation Algorithms in Undiscounted Problems
TR99-02, Mindmaker Ltd., Budapest 1121, Konkoly Th. M. u. 29-33
Ferenc Beleznay, Tamas Grobler and Csaba Szepesvari

We compare the scaling properties of several value-function estimation algorithms. In particular, we prove that Q-learning can scale exponentially slowly with the number of states. We identify the reasons for this slow convergence and show that both TD($\lambda$) and learning with a fixed learning rate enjoy rather fast convergence, just like the model-based method.

From guang at ce.chalmers.se Fri Nov 19 10:37:46 1999
From: guang at ce.chalmers.se (Guang Li)
Date: Fri, 19 Nov 1999 16:37:46 +0100
Subject: PhD thesis available
Message-ID: <38356EC9.F236B099@ce.chalmers.se>

Dear researchers,

A PhD thesis is available at Department of Computer Engineering Chalmers University of Technology S-412 96 Gothenburg, Sweden Anyone interested, please reply to this mail to request a copy.

Regards, Guang Li

---------------------
Title: Towards On-line Learning Agents for Autonomous Navigation

Abstract The design of a mechatronic agent capable of navigating autonomously in a changing and perhaps previously unfamiliar environment is a very challenging issue.
This thesis addresses this issue from both functional and system perspectives. Functions such as spatial representation, localization, path-finding and collision avoidance are essential to autonomous agent navigation. Four types of learning related to these functions have been identified as important: sensory information categorization and classification, the learning of stimulus-response mapping, the learning of spatial representation and the coding and adaptation of the travel experience with regard to specific tasks. It is argued that, in order to achieve a high degree of autonomy at the system level, it is essential to implement each of these navigational functions with a highly autonomous learning technique. An analysis of several representative artificial neural network (ANN) algorithms for their degrees of autonomy and computational characteristics indicates that none of the learning techniques analyzed is alone sufficient in terms of spatial learning. It is shown that biology can be inspirational in finding a possibly better, or perhaps more complete, solution to the learning of spatial representation than previous engineering or ANN based approaches. In particular, data on the biological head direction system have inspired the generation of a computational model which is shown to be able to use learned environmental features to correct the directional error accumulated by dead-reckoning in a simulated mobile robot. Furthermore, using a hippocampal place learning system in biological systems as an inspiration, a network model of dynamic cell structure is suggested. It allows an autonomous agent to perform tasks such as environmental mapping, localization and path-finding. In this model, a focus mechanism is included to help minimize computation needs by directing the adaptation of the network and the path-finding. The thesis also discusses various approaches toward achieving a high degree of autonomy at the system level. 
It is also shown that a feed-forward gating mechanism can be incorporated into a layered design framework to accommodate the interaction between various navigational functions having high degrees of autonomy.

From ijspeert at rana.usc.edu Fri Nov 19 21:14:41 1999
From: ijspeert at rana.usc.edu (Auke Ijspeert)
Date: Fri, 19 Nov 1999 18:14:41 -0800 (PST)
Subject: PhD thesis and preprints: CPGs for lamprey and salamander locomotion
Message-ID:

Dear Connectionists,

The following PhD thesis and preprints (see below) may interest people working on the modeling of central pattern generators for locomotion. They present results on the use of evolutionary algorithms for designing biologically plausible neural circuits modeled as continuous-time neural networks. In particular, the neural circuits underlying the swimming of the lamprey and the swimming and trotting of the salamander are investigated. This work is inspired by Ekeberg's neuromechanical model of lamprey swimming (Biol. Cybern. 69, 363-374, 1993), and, similarly, the developed CPGs are incorporated in simple biomechanical models of a lamprey and a salamander.

Animated gifs illustrating the different gaits developed can be found at: http://rana.usc.edu:8376/~ijspeert/ The thesis and the papers (titles and abstracts below) can be found in gzipped postscript at: http://rana.usc.edu:8376/~ijspeert/publications.html Please tell me if you have any problems downloading them. Comments are most welcome!

Best regards, Auke Ijspeert

--------------------------------------------------------------------------
Dr Auke Jan Ijspeert
Brain Simulation Lab & Computational Learning and Motor Control Lab
Department of Computer Science, Hedco Neurosciences building U.
of Southern California, Los Angeles, CA 90089, USA Web: http://rana.usc.edu:8376/~ijspeert/ Tel: +1 213 7401922 or 7406995 (work) +1 310 8238087 (home) Fax: +1 213 7405687 Email: ijspeert at rana.usc.edu -------------------------------------------------------------------------- _____________________________________________________________________ PhD thesis, A.J. Ijspeert Design of artificial neural oscillatory circuits for the control of lamprey- and salamander-like locomotion using evolutionary algorithms Supervisors: John Hallam and David Willshaw Department of Artificial Intelligence, University of Edinburgh, 1998 Abstract: This dissertation investigates the evolutionary design of oscillatory artificial neural networks for the control of animal-like locomotion. It is inspired by the neural organisation of locomotor circuitries in vertebrates, and explores in particular the control of undulatory swimming and walking. The difficulty with designing such controllers is to find mechanisms which can transform commands concerning the direction and the speed of motion into the multiple rhythmic signals sent to the multiple actuators typically involved in animal-like locomotion. In vertebrates, such control mechanisms are provided by central pattern generators which are neural circuits capable of producing the patterns of oscillations necessary for locomotion without oscillatory input from higher control centres or from sensory feedback. This thesis explores the space of possible neural configurations for the control of undulatory locomotion, and addresses the problem of how biologically plausible neural controllers can be automatically generated. Evolutionary algorithms are used to design connectionist models of central pattern generators for the motion of simulated lampreys and salamanders. This work is inspired by Ekeberg's neuronal and mechanical simulation of the lamprey [Ekeberg 93]. 
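The evolutionary design loop described above — evolving controller parameters against a fitness measure — can be sketched in miniature. Everything below is a hypothetical stand-in: the fitness function is a simple target-matching score, not the thesis's swimming-efficiency criterion, and the GA is plain truncation selection with Gaussian mutation rather than the thesis's actual encoding.

```python
import random

def evolve(fitness, n_params, pop_size=30, generations=60, sigma=0.3, seed=0):
    """Minimal generational GA: rank by fitness, keep the top third as
    parents (elitism), refill the population with Gaussian-mutated copies.
    A sketch of the general approach, not the thesis's actual algorithm."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 3]
        pop = parents + [[w + rng.gauss(0, sigma) for w in rng.choice(parents)]
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

# Hypothetical fitness: negated squared distance to an arbitrary "good"
# weight vector, standing in for a controller's measured swimming efficiency.
TARGET = [0.5, -0.3, 0.8]

def fitness(weights):
    return -sum((w - t) ** 2 for w, t in zip(weights, TARGET))

best = evolve(fitness, n_params=3)
```

Because the parents survive each generation unchanged, the best fitness in the population is monotone non-decreasing, which is what makes incremental schemes like the one used in the thesis practical.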
The first part of the thesis consists of developing alternative neural controllers for a similar mechanical simulation. Using a genetic algorithm and an incremental approach, a variety of controllers other than the biological configuration are successfully developed which can control swimming with at least the same efficiency. The same method is then used to generate synaptic weights for a controller which has the observed biological connectivity in order to illustrate how the genetic algorithm could be used for developing neurobiological models. Biologically plausible controllers are evolved which better fit physiological observations than Ekeberg's hand-crafted model. Finally, in collaboration with Jerome Kodjabachian, swimming controllers are designed using a developmental encoding scheme, in which developmental programs are evolved which determine how neurons divide and get connected to each other on a two-dimensional substrate. The second part of this dissertation examines the control of salamander-like swimming and trotting. Salamanders swim like lampreys but, on the ground, they switch to a trotting gait in which the trunk performs a standing wave with the nodes at the girdles. Little is known about the locomotion circuitry of the salamander, but neurobiologists have hypothesised that it is based on a lamprey-like organisation. A mechanical simulation of a salamander-like animat is developed, and neural controllers capable of exhibiting the two types of gaits are evolved. The controllers are made of two neural oscillators projecting to the limb motoneurons and to lamprey-like trunk circuitry. By modulating the tonic input applied to the networks, the type of gait, the speed and the direction of motion can be varied. 
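The last point — a single tonic drive signal modulating the rhythm of a chain of coupled oscillators — can be illustrated with an abstract phase-oscillator sketch. This is not the thesis's neuron-level model; the oscillator equations and all constants here are illustrative assumptions.

```python
import math

def cpg_phases(drive, n_osc=6, steps=1000, dt=0.01, coupling=4.0):
    """Chain of phase oscillators standing in for a swimming CPG: the tonic
    drive sets the common frequency, and nearest-neighbour coupling locks a
    fixed head-to-tail phase lag, yielding a travelling wave."""
    freq = 2.0 * drive                  # oscillation frequency grows with drive
    lag = 2.0 * math.pi / n_osc         # desired phase lag between neighbours
    phase = [0.0] * n_osc
    for _ in range(steps):              # forward-Euler integration
        new = list(phase)
        for i in range(n_osc):
            dphi = 2.0 * math.pi * freq
            if i > 0:                   # pull toward a fixed lag behind neighbour
                dphi += coupling * math.sin(phase[i - 1] - phase[i] - lag)
            new[i] = phase[i] + dphi * dt
        phase = new
    return phase

slow = cpg_phases(0.5)   # low tonic drive
fast = cpg_phases(1.0)   # doubled tonic drive -> doubled oscillation frequency
```

Doubling the drive doubles every oscillator's frequency while the coupling keeps the inter-segment lag fixed, so speed is modulated without disturbing the wave shape; a gait switch can then be modelled as a discrete change of the target lag pattern.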
By developing neural controllers for lamprey- and salamander-like locomotion, this thesis provides insights into the biological control of undulatory swimming and walking, and shows how evolutionary algorithms can be used for developing neurobiological models and for generating neural controllers for locomotion. Such a method could potentially be used for designing controllers for swimming or walking robots, for instance. _______________________________________________________________________ A.J. Ijspeert, J. Hallam and D. Willshaw: Evolving swimming controllers for a simulated lamprey with inspiration from neurobiology, Adaptive Behavior 7:2, 1999 (in press). Abstract: This paper presents how neural swimming controllers for a simulated lamprey can be developed using evolutionary algorithms. A genetic algorithm is used for evolving the architecture of a connectionist model which determines the muscular activity of a simulated body in interaction with water. This work is inspired by the biological model developed by Ekeberg which reproduces the central pattern generator observed in the real lamprey \cite{Ekeberg93}. In evolving artificial controllers, we demonstrate that a genetic algorithm can be an interesting design technique for neural controllers and that there exist alternative solutions to the biological connectivity. A variety of neural controllers are evolved which can produce the pattern of oscillations necessary for swimming. These patterns can be modulated through the external excitation applied to the network in order to vary the speed and the direction of swimming. The best evolved controllers cover larger ranges of frequencies, phase lags and speeds of swimming than Ekeberg's model. We also show that the same techniques for evolving artificial solutions can be interesting tools for developing neurobiological models. 
In particular, biologically plausible controllers can be developed with ranges of oscillation frequency much closer to those observed in the real lamprey than Ekeberg's hand-crafted model. Keywords: Neural control; genetic algorithm; simulation; central pattern generator; swimming; lamprey. _______________________________________________________________________ A.J. Ijspeert, J. Kodjabachian: Evolution and development of a central pattern generator for the swimming of a lamprey, Artificial Life 5:3, 1999 (in press). Abstract: This paper describes the design of neural control architectures for locomotion using an evolutionary approach. Inspired by the central pattern generators found in animals, we develop neural controllers which can produce the patterns of oscillations necessary for the swimming of a simulated lamprey. This work is inspired by Ekeberg's neuronal and mechanical model of a lamprey \cite{Ekeberg93}, and follows experiments in which swimming controllers were evolved using a simple encoding scheme \cite{Ijspeert99_ab,Ijspeert98_sab}. Here, controllers are developed using an evolutionary algorithm based on the SGOCE encoding \cite{Kodjabachian98a,Kodjabachian98b} in which a genetic programming approach is used to evolve developmental programs which encode the growing of a dynamical neural network. The developmental programs determine how neurons located on a 2D substrate produce new cells through cellular division and how they form efferent or afferent interconnections. Swimming controllers are generated when the growing networks eventually create connections to the muscles located on both sides of the rectangular substrate. These muscles are part of a 2D mechanical simulation of the body of the lamprey in interaction with water. The motivation of this paper is to develop a method for the design of control mechanisms for animal-like locomotion. 
Such locomotion is characterised by a large number of actuators, a rhythmic activity, and the fact that efficient motion is only obtained when the actuators are well coordinated. The task of the control mechanism is therefore to transform commands concerning the speed and direction of motion into the signals sent to the multiple actuators. We define a fitness function, based on several simulations of the controller with different command settings, which rewards the capacity to modulate the speed and the direction of swimming in response to simple, varying input signals. Central pattern generators are thus evolved capable of producing the relatively complex patterns of oscillations necessary for swimming. The best solutions generate travelling waves of neural activity, and propagate, similarly to the swimming of a real lamprey, undulations of the body from head to tail propelling the lamprey forward through water. By simply varying the amplitude of two input signals, the speed and the direction of swimming can be modulated.

Keywords: Neural control; genetic programming; developmental encoding; SGOCE; simulation; central pattern generator; swimming; lamprey.
________________________________________________________________________

Preliminary results on CPGs for salamander locomotion can be found in: A.J. Ijspeert: Evolution of neural controllers for salamander-like locomotion, in McKee, G.T. and Schenker, P.S., Editors, Proceedings of Sensor Fusion and Decentralised Control in Robotics Systems II, SPIE Proceeding Vol. 3839, September 1999, pp. 168-179.

Abstract: This paper presents an experiment in which evolutionary algorithms are used for the development of neural controllers for salamander locomotion. The aim of the experiment is to investigate which kind of neural circuitry can produce the typical swimming and trotting gaits of the salamander, and to develop a synthetic approach to neurobiology by using genetic algorithms as a design tool.
A 2D bio-mechanical simulation of the salamander's body is developed whose muscle contractions are determined by locomotion controllers simulated as continuous-time neural networks. While the connectivity of the neural circuitry underlying locomotion in the salamander has not yet been decoded, the general organization of the designed neural circuits corresponds to that hypothesized by neurobiologists for the real animal. In particular, the locomotion controllers are based on a body {\it central pattern generator} (CPG) corresponding to a lamprey-like swimming controller as developed by Ekeberg~\cite{Ekeberg93}, and are extended with a limb CPG for controlling the salamander's body. A genetic algorithm is used to instantiate synaptic weights of the connections within the limb CPG and from the limb CPG to the body CPG, given a high-level description of the desired gaits. A set of biologically plausible controllers is thus developed which can produce neural activity and locomotion gaits very similar to those observed in the real salamander. By simply varying the external excitation applied to the network, the speed, direction and type of gait can be varied.

From gary at cs.ucsd.edu Sat Nov 20 04:10:18 1999
From: gary at cs.ucsd.edu (Gary Cottrell)
Date: Sat, 20 Nov 1999 01:10:18 -0800 (PST)
Subject: Faculty Positions at UCSD
Message-ID: <199911200910.BAA29722@gremlin.ucsd.edu>

Recall my previous note about three positions at UCSD, with most of them mentioning machine learning. We are moving quickly on senior candidates. If you're thinking of applying, don't delay! (If you've already applied, this does *not* mean we're unhappy with you! ;-)) See "joining our department" off the main web page, www.cse.ucsd.edu, for details on applying. One of the positions: The Ronald R. Taylor Chair position in Information Technology in Computer Science at the full professor level.
This chair is targeted towards applicants in bioinformatics, computational biology, databases, data-intensive computing, data mining, or information technology.

cheers, gary

Gary Cottrell 858-534-6640 FAX: 858-534-7029
Faculty Assistant Chet Frost: 858-822-3286
Computer Science and Engineering 0114
3101 Applied Physics and Math Building (IF USING FED EX INCLUDE THIS LINE)
University of California San Diego
La Jolla, Ca. 92093-0114
Email: gary at cs.ucsd.edu or gcottrell at ucsd.edu
Home page: http://www-cse.ucsd.edu/~gary/
"Only connect" -E.M. Forster

From steve at cns.bu.edu Sat Nov 20 12:04:41 1999
From: steve at cns.bu.edu (Stephen Grossberg)
Date: Sat, 20 Nov 1999 12:04:41 -0500
Subject: hallucinations
Message-ID:

The following article is available in HTML, PDF, and Gzipped Postscript at http://www.cns.bu.edu/Profiles/Grossberg

Grossberg, S. (1999). How hallucinations may arise from brain mechanisms of learning, attention, and volition. Journal of the International Neuropsychological Society, in press.

ABSTRACT: This article suggests how brain mechanisms of learning, attention, and volition may give rise to hallucinations during schizophrenia and other mental disorders. The article suggests that normal learning and memory are stabilized through the use of learned top-down expectations. These expectations learn prototypes that are capable of focusing attention upon the combinations of features that comprise conscious perceptual experiences. When top-down expectations are active in a priming situation, they can modulate or sensitize their target cells to respond more effectively to matched bottom-up information. They cannot, however, fully activate these target cells. These matching properties are shown to be essential to stabilizing the memory of learned representations. The modulatory property of top-down expectations is achieved through a balance between top-down excitation and inhibition.
The learned prototype is the excitatory on-center in this top-down network. Phasic volitional signals can shift the balance between excitation and inhibition to favor net excitatory activation. Such a volitionally-mediated shift enables top-down expectations, in the absence of supportive bottom-up inputs, to cause conscious experiences of imagery and inner speech, and thereby to enable fantasy and planning activities to occur. If these volitional signals become tonically hyperactive during a mental disorder, the top-down expectations can give rise to conscious experiences in the absence of bottom-up inputs and volition. These events are compared with data about hallucinations. The article predicts where these top-down expectations and volitional signals may act in the laminar circuits of visual cortex, and by extension in other sensory and cognitive neocortical areas, and how the level of abstractness of learned prototypes may covary with the abstractness of hallucinatory content. A similar breakdown of volition may lead to delusions of control in the motor system.

Key Words: hallucinations, learned expectations, attention, learning, adaptive resonance theory

Preliminary version appears as Boston University Technical Report, CAS/CNS-TR-99-020.

From baolshausen at ucdavis.edu Sat Nov 20 17:09:10 1999
From: baolshausen at ucdavis.edu (Bruno Olshausen)
Date: Sat, 20 Nov 1999 14:09:10 -0800
Subject: Faculty position in cognitive neuroscience - UC Davis
Message-ID: <38371C06.F225F9D8@ucdavis.edu>

FACULTY POSITION IN COGNITIVE NEUROSCIENCE UC DAVIS

The Center for Neuroscience and the Department of Psychology at the University of California, Davis, invite applications for a Cognitive Neuroscientist at the assistant, associate or full professor level to begin July 1, 2000.
Specialization within the area of cognitive neuroscience is open; however, we are particularly interested in faculty who work on some aspect of human neurophysiology and can utilize newly established functional brain imaging facilities. Those who apply computational modeling approaches toward understanding cognitive aspects of brain function are also encouraged to apply. Applicants are expected to demonstrate leadership in their research specialty, ability to obtain extramural funds, and ability to teach cognitive neuroscience courses at both graduate and undergraduate levels. Candidates must possess a Ph.D. degree. The Center for Neuroscience is in the midst of rapid growth, and the successful applicant will have the opportunity to participate in the recruitment of additional positions in cognitive neuroscience. The University of California, Davis, is an affirmative action/equal opportunity employer with a strong institutional commitment to the development of a climate that supports equality of opportunity and respect for differences. Applicants should send a letter describing research and teaching interests, a curriculum vitae, copies of representative publications, and at least five letters of recommendation to: Edward G. Jones, Director, Center for Neuroscience, 1544 Newton Court, University of California, Davis, CA, 95616-8686. All materials must be received by February 1, 2000, to be assured consideration. The search will continue until this position is filled. The University of California is an Equal Opportunity Employer.
From nat at cs.dal.ca Sun Nov 21 08:32:55 1999
From: nat at cs.dal.ca (Nathalie Japkowicz)
Date: Sun, 21 Nov 1999 09:32:55 -0400 (AST)
Subject: CFP: AAAI Workshop on Learning from Imbalanced Data Sets
Message-ID:

----------------------------------
Call for Participation
AAAI-2000 Workshop on Learning from Imbalanced Data Sets
July 31 2000, Austin Texas
----------------------------------

Most learning systems previously designed and tested on toy problems or carefully crafted benchmark data sets assume that the training sets are well balanced. In the case of concept-learning, for example, classifiers typically expect that their training set contains as many examples of the positive as of the negative class. Unfortunately, this assumption of balance is often violated in real-world settings. Indeed, there exist many domains for which some classes are represented by a large number of examples while the others are represented by only a few. Although the imbalanced data set problem is starting to attract researchers' attention, attempts at tackling it have remained isolated. It is our belief that much progress could be achieved from a concerted effort and a greater amount of interaction between researchers interested in this issue. The purpose of this workshop is to provide a forum to foster such interactions and identify future research directions.

Topics
------
* Novel techniques for dealing with imbalanced data sets:
  * Techniques for over-sampling the minority class.
  * Techniques for down-sizing the majority class.
  * Techniques for learning from a single class.
  * Techniques for internally biasing the learning process.
  * Other approaches.
* Comparing the various methodologies.
* The data imbalance problem in unsupervised learning.

Format
------
The workshop will consist of several sessions concentrating on the themes identified above.
The workshop will conclude with a panel of distinguished guests commenting on the presentations of the day, discussing future directions, and opening the floor for general discussion.

Attendance
----------
This workshop is open to all members of the Machine-Learning, Data-Mining, Information Retrieval, Statistics and Connectionist communities interested in the data imbalance problem. Attendance is limited to 65 participants.

Submission
----------
Prospective participants are invited to submit papers on the topics outlined above or on other related issues. Submissions should be 6 pages and follow the AAAI style sheet. Electronic submissions, in Postscript format, are preferred and should be sent to Nathalie Japkowicz at nat at cs.dal.ca. Alternatively, four hard copies of the papers can be sent to:

Nathalie Japkowicz
Faculty of Computer Science
DalTech/Dalhousie University
6050 University Avenue
Halifax, N.S. Canada, B3H 1W5
Telephone: (902) 494-3157 FAX: (902) 492-1517

If space is available, attendance to the workshop is also possible by submitting a 1 or 2 page statement of interest to the above address.
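The first two techniques in the Topics list — over-sampling the minority class and down-sizing the majority class — reduce, in their most basic form, to random resampling. A minimal two-class sketch (the function and variable names are ours, not from the workshop, and it assumes exactly two class labels):

```python
import random

def rebalance(examples, labels, mode="over", seed=0):
    """Random resampling for a two-class imbalanced set: 'over' duplicates
    minority examples until the classes match in size; 'under' instead
    discards majority examples. Purely illustrative sketch."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(examples, labels):
        by_class.setdefault(y, []).append(x)
    # Assumes exactly two classes; the smaller one is the minority.
    (min_y, minority), (maj_y, majority) = sorted(
        by_class.items(), key=lambda kv: len(kv[1]))
    if mode == "over":      # over-sample the minority class
        minority = minority + [rng.choice(minority)
                               for _ in range(len(majority) - len(minority))]
    else:                   # down-size the majority class
        majority = rng.sample(majority, len(minority))
    data = [(x, min_y) for x in minority] + [(x, maj_y) for x in majority]
    rng.shuffle(data)
    return data
```

Over-sampling keeps all the data but duplicates minority points (risking overfitting them); down-sizing discards majority information. The workshop topics above go beyond both, e.g. learning from a single class or biasing the learner internally.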
Timetable:
----------
* Submission deadline: March 10, 2000
* Notification date: March 24, 2000
* Final date for camera-ready copies to organizers: April 26, 2000

Co-Chairs:
----------
* Robert Holte, University of Ottawa (holte at site.uottawa.ca);
* Nathalie Japkowicz, Dalhousie University (nat at cs.dal.ca);
* Charles Ling, University of Western Ontario (ling at csd.uwo.ca);
* Stan Matwin, University of Ottawa (stan at site.uottawa.ca)

Additional Information
----------------------
http://borg.cs.dal.ca/~nat/Workshop2000/workshop2000.html

From piuri at elet.polimi.it Mon Nov 22 07:34:55 1999
From: piuri at elet.polimi.it (Vincenzo Piuri)
Date: Mon, 22 Nov 1999 13:34:55 +0100
Subject: IJCNN'2000: Call for Special Sessions
Message-ID: <3.0.5.32.19991122133455.015bda30@elet.polimi.it>

========================================================================
IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS
Grand Hotel di Como, Como, Italy - 24-27 July 2000
sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society, with the technical cooperation of the Japanese Neural Network Society, AEI, SIREN, and AI*IA

> > > > > > CALL FOR SPECIAL SESSIONS < < < < < <

Proposals for special sessions for IJCNN 2000 are solicited. Special sessions consist of three to five papers on neural networks or related areas. Proposals are due on December 31, 1999 and must include
1. The title of the special session and a one-paragraph summary describing the session's themes.
2. The names, affiliations and e-mail of the organizers of the special session, with identification of one of these individuals as the contact person.
3. A list of papers including (a) authors (b) author affiliation and e-mail and (c) paper title
4.
A statement from the session organizer that (a) the authors have been contacted and agree to write the paper and (b) at least one author per paper will register and attend IJCNN2000 to present the paper (the registration is due by 30 April 2000 to guarantee inclusion of the paper in the proceedings - see the detailed conditions in the conference Call for Papers on the conference web site). Final papers for publication in the conference record will be due on January 31, 2000. Proposals for Special Sessions should be submitted electronically to ijcnn2000 at ee.washington.edu For the complete Call for Papers and other information (including information about Como, Italy), visit the conference web site at: http://www.ims.unico.it/2000ijcnn.html ====================================================================== IJCNN 2000 Special Sessions Committee Payman Arabshahi, JPL, USA Alexandre P. Alves da Silva, Federal Engineering School at Itajuba, Brazil Mohamed El-Sharkawi, University of Washington, USA Michael Healy, The Boeing Airplane Company, USA Jae-Byung Jung, University of Washington, USA Robert J. Marks II (Chair), University of Washington, USA ====================================================================== Vincenzo Piuri Department of Electronics and Information, Politecnico di Milano piazza L. da Vinci 32, 20133 Milano, Italy phone +39-02-2399-3606 secretary +39-02-2399-3623 fax +39-02-2399-3411 email piuri at elet.polimi.it From Sebastian_Thrun at heaven.learning.cs.cmu.edu Mon Nov 22 11:31:22 1999 From: Sebastian_Thrun at heaven.learning.cs.cmu.edu (Sebastian Thrun) Date: Mon, 22 Nov 1999 11:31:22 -0500 Subject: NEW MASTERS PROGRAM IN KNOWLEDGE DISCOVERY AND DATA MINING Message-ID: Carnegie Mellon, Center for Automated Learning & Discovery announces: A NEW MASTERS PROGRAM IN KNOWLEDGE DISCOVERY AND DATA MINING The extraordinary spread of computers and online data is changing forever the way that important decisions are made in many organizations. 
Hospitals analyze online medical records to decide which treatments to apply to future patients. Banks routinely analyze past financial records to learn to spot future fraud. Today's demand for data mining expertise far exceeds the supply, and this imbalance will become more severe over the coming decade. To educate the next generation of experts in this important area, Carnegie Mellon University offers a new Master's program in Knowledge Discovery and Data Mining (KDD). This new inter-disciplinary program trains students to become tomorrow's leaders in the rapidly growing area of Knowledge Discovery and Data Mining. It is offered by Carnegie Mellon's Center for Automated Learning and Discovery (CALD), which has assembled a large multi-disciplinary team of faculty and students across several academic departments. KDD candidates will be trained in all important areas related to scientific data mining and knowledge discovery. The Master's program balances interdisciplinary course work, hands-on project work, and cutting-edge research carried out under direct faculty supervision. The curriculum addresses areas such as advanced machine learning algorithms, statistical principles and foundations, database and data warehousing methods, complexity analysis, approaches to data visualization, privacy and security, and specific application areas such as business, marketing, finance, and public policy. Our graduates are uniquely positioned to pioneer new data mining and knowledge discovery efforts, and to pursue top notch research on the next generation of data mining tools, algorithms, and systems. Carnegie Mellon invites applications of qualified individuals. Admission is highly competitive. A limited number of fellowships are available, which will be provided on a competitive basis. The application deadline is February 5, 2000. 
For more details about the program or to apply: http://www.cs.cmu.edu/~kdd From cindy at cns.bu.edu Mon Nov 22 13:45:37 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Mon, 22 Nov 1999 13:45:37 -0500 Subject: Neural Networks 12(9) Message-ID: <199911221845.NAA04682@retina.bu.edu> NEURAL NETWORKS 12(9) Contents - Volume 12, Number 9 - 1999 NEURAL NETWORKS LETTERS: Solving the n-bit parity problem using neural networks Myron E. Hohil, Derong Liu, and Stanley H. Smith CURRENT OPINIONS: How stereo vision interacts with optic flow perception: Neural mechanisms M. Lappe and A. Grigo ARTICLES: *** Mathematical and Computational Analysis *** The asymptotic memory capacity of the generalized Hopfield network Jinwen Ma Synchronization and desynchronization of neural oscillators A. Tonnelier, S. Meignen, H. Bosch, and J. Demongeot Improved learning algorithms for mixture of experts in multiclass classification K. Chen, L. Xu, and H. Chi On the identifiability of mixtures-of-experts W. Jiang and M.A. Tanner Derivation of the multilayer perceptron weight constraints for direct network interpretation and knowledge discovery M.L. Vaughn *** Engineering and Design *** The Kohonen network incorporating explicit statistics and its application to the travelling salesman problem N. Aras, B.J. Oommen, and I.K. Altmel Accelerating neural network training using weight extrapolations S.V. Kamarthi and S. Pittner HyFIS: Adaptive neuro-fuzzy inference systems and their application to nonlinear dynamical systems J. Kim and N. Kasabov BOOK REVIEW: How to legitimate a field: A review of D.J. Stein and J. Ludik's (1998) "Neural networks and psychopathology: Connectionist models in practice and research" Greg J. Siegle ------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc.
Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type       INNS                ENNS                JNNS
----------------------------------------------------------------------------
Membership with       $80                 660 SEK             Y 15,000 [including
Neural Networks       $55 (student)      460 SEK (student)    2,000 entrance fee]
                                                              Y 13,000 (student)
                                                              [including 2,000
                                                              entrance fee]
----------------------------------------------------------------------------
Membership without    $30                 200 SEK             not available to
Neural Networks                                               non-students
                                                              (subscribe through
                                                              another society)
                                                              Y 5,000 (student)
                                                              [including 2,000
                                                              entrance fee]
----------------------------------------------------------------------------
Institutional rates   $1132               2230 NLG            Y 149,524
----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ***************************************************************** From Nello.Cristianini at bristol.ac.uk Tue Nov 23 08:17:56 1999 From: Nello.Cristianini at bristol.ac.uk (N Cristianini) Date: Tue, 23 Nov 1999 13:17:56 +0000 (GMT) Subject: Support Vector Machines: Book Announcement Message-ID: AN INTRODUCTION TO SUPPORT VECTOR MACHINES and other kernel-based learning methods N. Cristianini and J. Shawe-Taylor Cambridge University Press, 2000 ISBN: 0 521 78019 5 Book's website: www.support-vector.net This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation of learning systems based on recent advances in statistical learning theory. The book also introduces Bayesian analysis of learning and relates SVMs to Gaussian Processes and other kernel-based learning methods. SVMs deliver state-of-the-art performance in real-world applications such as text categorisation, hand-written character recognition, image classification, biosequence analysis, etc. Their introduction in the early 1990s led to an explosion of applications and deepening theoretical analysis that has now established Support Vector Machines, along with neural networks, as one of the standard tools for machine learning and data mining. Students will find the book both stimulating and accessible, while practitioners will be guided smoothly through the material required for a good grasp of the theory and application of these techniques. The concepts are introduced gradually in accessible and self-contained stages, though in each stage the presentation is rigorous and thorough.
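As a toy illustration of the kernel idea behind these methods (our own sketch, not code from the book; the function names and the gamma parameter are illustrative), a Gaussian (RBF) kernel and the Gram matrix it induces can be written in a few lines:

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def gram_matrix(points, gamma=0.5):
    """Kernel (Gram) matrix K[i][j] = k(x_i, x_j); kernel methods such as
    SVMs operate on this matrix rather than on the raw input vectors."""
    return [[rbf_kernel(xi, xj, gamma) for xj in points] for xi in points]

points = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
K = gram_matrix(points)
# K is symmetric with ones on the diagonal, since k(x, x) = exp(0) = 1
```

Any learning algorithm that touches the data only through such inner products can be "kernelized" in this way, which is the common thread between SVMs, Gaussian Processes, and the other methods the book covers.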
Pointers to relevant literature and web sites containing software ensure that it forms an ideal starting point for further study. Equally the book will equip the practitioner to apply the techniques and an associated web site will provide pointers to updated literature, new applications, and on-line software. More information is available on the book's website: www.support-vector.net From Zoubin at gatsby.ucl.ac.uk Tue Nov 23 07:17:59 1999 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Tue, 23 Nov 1999 12:17:59 +0000 (GMT) Subject: Postdoc and PhD Positions, Gatsby Computational Neuroscience Unit Message-ID: <199911231217.MAA06430@cajal.gatsby.ucl.ac.uk> Gatsby Computational Neuroscience Unit Director: Geoffrey Hinton http://www.gatsby.ucl.ac.uk/ Post-doctoral and PhD Research Positions Computational Neuroscience The Gatsby Computational Neuroscience Unit invites applications for PhD studentships and post-doctoral research positions tenable from September 2000. Members of the unit are interested in models of all aspects of brain function, especially unsupervised learning, computational vision, reinforcement learning, and computational motor control, and also conduct psychophysical experiments in motor control and vision. Current researchers at the unit include 12 PhD students and the following faculty and postdocs: Faculty: Post-doctoral Fellows: Geoff Hinton Hagai Attias Peter Dayan Sam Roweis Zoubin Ghahramani Maneesh Sahani Zhaoping Li Emo Todorov Carl van Vreeswijk For further details please see: http://www.gatsby.ucl.ac.uk/research.html The Gatsby Unit provides a unique opportunity for a critical mass of theoreticians to interact closely with each other and with University College's other world class research groups including Anatomy, Computer Science, Functional Imaging Laboratory, Physics, Physiology, Psychology, Neurology, Ophthalmology, and Statistics. 
The unit has excellent computational facilities, and laboratory facilities include both motor and visual psychophysics labs for theoretically motivated experimental studies. The unit's visitor and seminar programmes enable its staff and students to interact with leading researchers from across the world. Candidates should have a strong analytical background and a keen interest in neuroscience. Competitive salaries and studentships are available. Applicants should send a CV [PhD applicants should include details of course work and grades], a statement of research interests, and names and addresses of 3 referees to janice at gatsby.ucl.ac.uk [email preferred] or to The Gatsby Computational Neuroscience Unit University College London Alexandra House 17 Queen Square LONDON WC1N 3AR UK ** Closing date for applications: 31 January 2000 ** From rogers at rtna.daimlerchrysler.com Tue Nov 23 14:15:23 1999 From: rogers at rtna.daimlerchrysler.com (Seth Rogers) Date: Tue, 23 Nov 1999 11:15:23 -0800 (PST) Subject: Complete CFP: International Conference on Machine Learning Message-ID: Call for Papers THE SEVENTEENTH INTERNATIONAL CONFERENCE ON MACHINE LEARNING June 29-July 2, 2000 Stanford University The Seventeenth International Conference on Machine Learning (ICML-2000) will be held at Stanford University from June 29 to July 2, 2000, in the heart of Silicon Valley. The conference will bring together researchers to exchange ideas and report recent progress in the computational study of learning. 
Topics for Submission ICML-2000 welcomes submissions on all facets of machine learning, but especially solicits papers on problem areas, research topics, learning paradigms, and approaches to evaluation that have been rare at recent conferences, including: - the role of learning in natural language, vision and speech, planning and scheduling, design and configuration, logical and spatial reasoning, motor control, and more generally on learning for performance tasks carried out by intelligent agents; - the discovery of scientific laws and taxonomies, the construction of componential and structural models, and learning at multiple levels of temporal and spatial resolution; - the effect of the developers' decisions about problem formulation, representation, data quality, and reward function on the learning process; - computational models of human learning, applications to real-world problems, exploratory research that describes novel learning tasks, work that integrates familiar methods to demonstrate new functionality, and agent architectures in which learning plays a central role; - empirical studies that combine natural data (to show relevance) with synthetic data (to understand conditions on behavior), along with formal analyses that make contact with empirical results, especially where the aim is to identify sources of power, rather than to show one method is superior to others. Naturally, we also welcome submissions on traditional topics, ranging from induction over supervised data to learning from delayed rewards, but we hope the conference will also attract contributions on the issues above. Review Process The ICML-2000 review process will be structured to encourage publications covering a broad range of research and to foster increased participation in the conference. 
To this end, we have instituted: - area chairs who will be responsible for recruiting papers in their area of expertise and overseeing the review process for those submissions; - conditional acceptance of papers that are not publishable in their initial form, but that can be improved enough for inclusion in time to appear in the proceedings; and - a review form that requires referees to explicitly list any problems with a paper, what it would take to overcome them, and, if they recommend rejection, why it cannot be fixed in time for inclusion. The overall goal is to make the review process more like that in journals, with time for the authors to incorporate feedback from reviewers. Each submitted paper will be reviewed by two members of the program committee, with the decision about its acceptance overseen by the responsible area chair and the program chair. Paper Submission Authors should submit papers using the same format and length as the final proceedings version. The detailed instructions for authors at http://www-csli.stanford.edu/icml2k/instructions.html include pointers to templates for LaTeX and Word documents. These specify two-column style, Times Roman font with 10 point type, vertical spacing of 11 points, overall text width of 6.75 inches, length of 9.0 inches, 0.25 inches between the two columns, top margin of 1.0 inch, and left margin of 0.75 inch. (The right and bottom margins will depend on whether one uses US letter or A4 paper.) Papers must not exceed eight (8) pages including figures and references. We will return to the authors any papers that do not satisfy these requirements. The deadline for submissions to ICML-2000 is MONDAY, JANUARY 24, 2000. Submission will be entirely electronic by transferring papers to the ICML-2000 ftp site, as explained in the detailed instructions for authors. Authors must submit papers in POSTSCRIPT format to ensure our ability to print them out for review.
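For orientation only, the stated dimensions can be approximated with a minimal, unofficial LaTeX preamble; this is a sketch of the numbers listed above, and the templates linked in the instructions for authors remain authoritative:

```latex
\documentclass[10pt,twocolumn]{article}
\usepackage{times}              % Times Roman, 10 point body type
\usepackage[textwidth=6.75in,   % overall text width
            textheight=9.0in,   % overall text length
            top=1.0in,          % top margin
            left=0.75in,        % left margin
            columnsep=0.25in    % gap between the two columns
           ]{geometry}
\linespread{0.9167}             % 11 pt leading at 10 pt type (11/12)
\begin{document}
% paper body goes here
\end{document}
```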
Each submission must be accompanied by the paper's title, the authors' names and physical addresses, a 250-word abstract, the contact author's email address and phone number, and the author who would present the talk at the conference. Authors must enter this information into the submission form at the conference web site by FRIDAY, JANUARY 21. ICML-2000 allows simultaneous submission to other conferences, provided this fact is clearly indicated on the submission form. Accepted papers will appear in the conference proceedings only if they are withdrawn from other conferences. Simultaneous submissions that are not clearly specified as such will be rejected. Other Conference Information The Seventeenth International Conference on Machine Learning will be collocated with the Thirteenth Annual Conference on Computational Learning Theory (COLT-2000) and the Sixteenth Conference on Uncertainty in Artificial Intelligence (UAI-2000). Registrants to any of these meetings will be able to attend the technical sessions of the others at no additional cost. ICML-2000 will also be preceded by tutorials on various facets of machine learning. For additional information, see the web site for the conference at http://www-csli.stanford.edu/icml2k/ which will provide additional details as they become available. If you have questions about ICML-2000, please send electronic mail to icml2k at csli.stanford.edu. The conference has received support from DaimlerChrysler Research and Technology, Stanford's Center for the Study of Language and Information (CSLI), and the Institute for the Study of Learning and Expertise (ISLE). 
From maass at igi.tu-graz.ac.at Wed Nov 24 09:46:41 1999 From: maass at igi.tu-graz.ac.at (Wolfgang Maass) Date: Wed, 24 Nov 1999 15:46:41 +0100 Subject: Article on the computational power of winner-take-all Message-ID: <383BFA51.32366A4D@igi.tu-graz.ac.at> Content-Type: text/plain; charset=us-ascii Content-Transfer-Encoding: 7bit The following article is now available online: On the Computational Power of Winner-Take-All Wolfgang Maass Technische Universitaet Graz, Austria Abstract: Everybody ``KNOWS'' that neural networks need more than a single layer of nonlinear units to compute interesting functions. We show that this is FALSE if one employs winner-take-all as the nonlinear unit: * Any boolean function can be computed by a single k-winner-take-all unit applied to weighted sums of the input variables. * Any continuous function can be approximated arbitrarily well by a single soft winner-take-all unit applied to weighted sums of the input variables. * Only positive weights are needed in these (linear) weighted sums. This may be of interest from the point of view of neurophysiology, since only 15% of the synapses in the cortex are inhibitory. * Our results support the view that winner-take-all is a very suitable basic computational unit in Neural VLSI: - it is well known that winner-take-all of n input variables can be computed very efficiently with 2n transistors (and a total wire length and area that is linear in n) in analog VLSI [Lazzaro et al., 1989] - we show that winner-take-all is not just useful for special-purpose computations, but may serve as the only nonlinear unit for neural circuits with universal computational power - we show that any multi-layer perceptron needs a number of gates that grows quadratically in n to compute winner-take-all for n input variables; hence winner-take-all provides a substantially more powerful computational unit than a perceptron (at about the same cost of implementation in analog VLSI).
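A minimal sketch of the k-winner-take-all operation discussed in the abstract (our own illustration, not code from the paper): the unit outputs 1 for the k largest of its inputs, and even a 1-WTA race between a weighted sum and a fixed constant already computes a linear threshold function such as AND. The weights and threshold below are ours:

```python
def k_wta(values, k):
    """k-winner-take-all: output 1 for the k largest inputs, 0 elsewhere."""
    ranked = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    out = [0] * len(values)
    for i in ranked[:k]:
        out[i] = 1
    return out

def and_gate(x1, x2):
    """AND as a 1-WTA race between the weighted sum x1 + x2 and the
    constant 1.5: input 0 wins exactly when x1 + x2 > 1.5."""
    winner = k_wta([x1 + x2, 1.5], k=1)
    return winner[0]

# truth table over all four input pairs
results = [and_gate(a, b) for a in (0, 1) for b in (0, 1)]
# results == [0, 0, 0, 1]
```

Note that only positive weights appear in the sums, consistent with the third bullet of the abstract; the paper's general construction for arbitrary boolean functions is of course more involved than this two-input race.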
------------------------------------------------------------------ The full version of this article will appear in Neural Computation. It can be downloaded from http://www.tu-graz.ac.at/igi/maass/#Publications (see publication # 113). An extended abstract will appear in the Proceedings of NIPS 1999. From Leo.van.Hemmen at Physik.TU-Muenchen.DE Wed Nov 24 17:39:06 1999 From: Leo.van.Hemmen at Physik.TU-Muenchen.DE (J. Leo van Hemmen) Date: Wed, 24 Nov 1999 23:39:06 +0100 Subject: Biological Cybernetics Message-ID: Dear Friends: In the July issue 81/1 (1999) of ``Biological Cybernetics'', its Editors-in-Chief Gert Hauske and I have published an Editorial. As we think it could make for interesting reading for most of you we have appended the text. Enjoy reading, Leo van Hemmen. >>>$<<< The time has come for _Biological Cybernetics_ to face some changes and widen its scope. First, Leo van Hemmen has joined Gert Hauske as Coeditor-in-Chief. The latter welcomes him as a colleague and friend with extensive experience in theoretical biophysics, international reputation, and a passion for temporal coding, in particular, within the visual and auditory system. The former realizes that, in managing the Journal as well as Gert Hauske does, he is called to live up to a very high standard. In setting a new course for the Journal and working in a cybernetic vein, we intend to focus on sensory modalities, especially hearing and seeing, their cortical processing, and emanating activities such as those in the motor cortex, including gesture and locomotion. Learning and memory constitute the second key domain of the Journal, strongly dependent as they are on sensory modalities. Also the analysis of underlying techniques deserves our attention. Papers on artificial neural nets whose motivation stems from biology are as welcome as they were before [1]. In this way the editors will continue the liberal, innovative, and rich traditions set forth for the Journal since its conception. 
It was Norbert Wiener [2,3] who defined the notion of cybernetics: ``We have decided to call the entire field of control and communication theory, whether in the machine or in the animal, by the name _Cybernetics_, which we form from the Greek $\kappa\upsilon\beta\epsilon\rho\nu\eta\tau\eta\varsigma$ or _steersman_.'' He then noted that J.C. Maxwell's 1868 paper on governors, a word that is a Latin corruption of its Greek root, was the first significant paper on mechanisms of feedback. While control is a key notion for all of biology, the importance of feedback connections relative to feedforward regulation has been appreciated most profoundly through recent studies of the brain. Communication or transmission of information is another key idea that is indispensable in understanding neuronal processing. Here Wiener was also ahead of our time, save for the fact that he imagined machine and animal to perform information processing in the same way -- as is suggested by the above interlude ``whether in the machine or in the animal'' and is set out elsewhere [2,3]. Today we know much more than one could have then [4], that the differences between neuronal and machine computation, and between the underlying mechanisms of learning and memory, are huge. With hindsight it is easy to see why the original cybernetic movement stalled in the mid sixties. It was a never explicitly formulated hypothesis that a simple block diagram could account for the exquisite functions of animal brains [2-4], and that a clever person could write an algorithm for each procedure emerging from a block diagram. Life is not that simple and would-be programmers were called upon to digest megabytes of data, which apparently they could not. Our brain is simply not made for that. Biological cybernetics as conceived by our founding Editor-in-Chief Werner Reichardt is different. His insertion of the adjective _biological_ before `cybernetics' focused attention on natural as opposed to artificial networks. 
The underlying idea, as advocated by Wiener, is that a merger of biology, mathematics, and physics is needed to clarify the fascinating problems posed by animal brains functioning as `neuronal machines' in natural environments. This means that computational neuroscience necessarily must be an integral part of the Journal. Biologists are increasingly aware of the power theoretical neuroscience has for making predictions that can be verified by experiment, and to suggest entirely new directions for experimental investigation. Accordingly, _Biological Cybernetics_ will aim at bringing to biologists a better understanding of the reciprocity between theory and experiment, which has so long been the successful formula for the physical sciences. At the same time, however, the Journal will remain deeply rooted in biological experimentation and its power for setting boundaries on overly zealous speculation. What, then, should be questioned or analyzed? Though biological cybernetics can, and did, give insight into such attractive practical problems as heart arrhythmias, a richer approach should focus on all aspects of information processing, notably in biological neural networks. The Journal's traditional preferences slanted towards the visual system, perhaps to the expense of a broader analysis of neuronal information processing as a whole. Since the auditory system is so intimately connected with the visual cortex, and a fascinating structure by itself, we think that focusing on both may be particularly fruitful. Through such comparisons our editorial objective is to stimulate an inquiring assessment of sensory information processing at large and to illuminate the `hows' and `whys' that give rise to similarity and difference in various systems. Putting things in a proper perspective, we are aiming at all aspects of communication and control in biological information processing. 
To accommodate this broadened frontier for the Journal, the Editorial Board will be appropriately extended. Most importantly, we hope that you, the reader and prospective author, will join us in this new adventure. Gert Hauske J. Leo van Hemmen. [1] Braitenberg V (1984) Vehicles: Experiments in Synthetic Psychology. MIT Press, Cambridge, MA [2] Wiener N (1948) Cybernetics, or control and communication in the animal and the machine. Wiley, New York, and Hermann, Paris. See in particular p. 19. [3] Wiener N (1948) Cybernetics. Sci. Amer. 179:14-18 [4] Rosenblith W and Wiesner J (1966) From philosophy to mathematics to biology. Bull. Amer. Math. Soc. 72:33-38. This is a critical appreciation of Wiener's role in biology by two of his contemporaries. >>>$<<< Prof. Dr. J. Leo van Hemmen Physik Department TU München D-85747 Garching bei München Germany Phone: +49(89)289.12362 (office) and .12380 (secretary) Fax: +49(89)289.14656 e-mail: Leo.van.Hemmen at ph.tum.de From uchiyama at ics.kagoshima-u.ac.jp Thu Nov 25 00:12:55 1999 From: uchiyama at ics.kagoshima-u.ac.jp (Hiroyuki UCHIYAMA) Date: Thu, 25 Nov 1999 14:12:55 +0900 Subject: Paper available: Retinal computation of motion direction Message-ID: <199911250512.OAA03071@joho.ics.kagoshima-u.ac.jp> The following paper (pdf file) is now available from http://www.ics.kagoshima-u.ac.jp/~uchiyama/preprint.html This is a preprint of an article that will appear in Visual Neuroscience. ---------------------------------------------------------- Computation of Motion Direction by Quail Retinal Ganglion Cells That Have a Nonconcentric Receptive Field Hiroyuki Uchiyama, Takahide Kanaya and Shoichi Sonohata Abstract One type of retinal ganglion cell prefers object motion in a particular direction. Neuronal mechanisms for the computation of motion direction are still unknown.
We quantitatively mapped excitatory and inhibitory regions of receptive fields for directionally selective retinal ganglion cells in the Japanese quail, and found that the inhibitory regions are displaced about 1-3 deg. toward the side where the null sweep starts, relative to the excitatory regions. Directional selectivity thus results from delayed transient suppression exerted by the nonconcentrically-arranged inhibitory regions, and not by local directional inhibition as hypothesized by Barlow and Levick (1965). ----------------------------------------------------------- ---------------------------------------------- Hiroyuki Uchiyama, Ph.D. Department of Information & Computer Science, Faculty of Engineering, Kagoshima University Korimoto 1-21-40, Kagoshima 890-0065, JAPAN phone +81-99-285-8449 fax +81-99-285-8464 http://www.ics.kagoshima-u.ac.jp/~uchiyama From omlin at waterbug.cs.sun.ac.za Thu Nov 25 11:55:04 1999 From: omlin at waterbug.cs.sun.ac.za (Christian Omlin) Date: Thu, 25 Nov 1999 18:55:04 +0200 Subject: Preprint available Message-ID: Dear Colleagues The technical report below is available from our website http://www.cs.sun.ac.za/projects/tech_reports/US-CS-TR-99-14.ps.gz We welcome any comments you may have. With kind regards, Christian Christian W. Omlin e-mail: omlin at cs.sun.ac.za Department of Computer Science phone (direct): +27-21-808-4308 University of Stellenbosch phone (secretary): +27-21-808-4232 Private Bag X1 fax: +27-21-808-4416 Stellenbosch 7602 http://www.cs.sun.ac.za/people/staff/omlin SOUTH AFRICA http://www.neci.nj.nec.com/homepages/omlin ------------------------------- cut here ------------------------------ What Inductive Bias Gives Good Neural Network Training Performance? S. Snyders C.W.
Omlin Department of Computer Science University of Stellenbosch 7602 Stellenbosch South Africa E-mail: {snyders,omlin}@cs.sun.ac.za ABSTRACT There has been an increased interest in the use of prior knowledge for training neural networks. Prior knowledge in the form of Horn clauses has been the predominant paradigm for knowledge-based neural networks. Given a set of training examples and an initial domain theory, a neural network is constructed that fits the training examples by preprogramming some of the weights. The initialized neural network is then trained using backpropagation to refine the knowledge. The prior knowledge presumably defines a good starting point in weight space and provides an inductive bias leading to faster convergence; it overrides backpropagation's bias toward a smooth interpolation resulting in small weights. This paper proposes a heuristic for determining the strength of the inductive bias by making use of gradient information in weight space in the direction of the programmed weights. The network starts its search in weight space where the gradient is maximal, thus speeding up convergence. Tests on a benchmark problem from molecular biology demonstrate that our heuristic on average reduces the training time by 60% compared to a random choice of the strength of the inductive bias; this performance is within 20% of the training time that can be achieved with optimal inductive bias. The difference in generalization performance is not statistically significant. From Peter.Bartlett at anu.edu.au Thu Nov 25 22:24:49 1999 From: Peter.Bartlett at anu.edu.au (Peter Bartlett) Date: Fri, 26 Nov 1999 14:24:49 +1100 Subject: machine learning position at ANU Message-ID: <383DFD81.9D933520@anu.edu.au> We're currently advertising a position in machine learning at the Australian National University. It's a limited-term (3-5 year) research position at academic level B (Research Fellow).
Researchers in the group (which currently includes Peter Bartlett, Jonathan Baxter, Markus Hegland, John Lloyd and Stephen Roberts) work on a variety of theoretical and experimental areas in machine learning, including: reinforcement learning, computational learning theory, neural networks, large margin classification and prediction, scalable data analysis for data mining, and logic for machine learning. See http://wwwrsise.anu.edu.au/ad.html#LevB_CSL for more information. -- Peter. Peter Bartlett email: Peter.Bartlett at anu.edu.au Computer Sciences Laboratory Phone: +61 2 6279 8681 Research School of Information Sciences and Engineering Australian National University Fax: +61 2 6279 8645 Canberra, 0200 AUSTRALIA http://csl.anu.edu.au/~bartlett From J.Sougne at ulg.ac.be Fri Nov 26 04:13:38 1999 From: J.Sougne at ulg.ac.be (J.Sougne@ulg.ac.be) Date: Fri, 26 Nov 1999 10:13:38 +0100 Subject: Dissertation available on cognitive modeling with spiking neurons Message-ID: Dear Connectionists, The following dissertation is now available on-line at http://www.fapse.ulg.ac.be/Lab/cogsci/jsougne/jspapers.html http://www.fapse.ulg.ac.be/Lab/cogsci/jsougne/JSougneThesis.ps.Z (compressed postscript) http://www.fapse.ulg.ac.be/Lab/cogsci/jsougne/JSougneThesis.pdf (pdf) INFERNET: A Neurocomputational Model of Binding and Inference by Jacques P. Sougné ABSTRACT An implementation of a network of integrate-and-fire neuron-like elements is presented. Integrate-and-fire nodes fire at a precise moment and transmit their activation, with a particular strength and delay, to nodes connected to them. The receiving nodes accumulate potential but also slowly lose their potential through decay. When the potential of the node reaches a particular threshold, it emits a spike. Thereafter, the potential is reset to a resting value.
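The integrate-and-fire dynamics just described (accumulate incoming activation, decay toward rest, spike and reset at threshold) can be sketched as follows; this is our own generic illustration, and the threshold and decay values are ours, not INFERNET's:

```python
def simulate_lif(inputs, threshold=1.0, decay=0.9, rest=0.0):
    """Leaky integrate-and-fire node: the potential decays toward rest each
    step, accumulates the incoming input, and is reset to the resting value
    when it crosses threshold. Returns the time steps at which spikes fire."""
    v = rest
    spikes = []
    for t, i_in in enumerate(inputs):
        v = rest + decay * (v - rest) + i_in  # leak toward rest, then integrate
        if v >= threshold:
            spikes.append(t)  # the node fires at a precise moment
            v = rest          # potential is reset after the spike
    return spikes

# a constant drive of 0.3 per step makes the potential climb and cross
# threshold periodically, producing a regular spike train
spikes = simulate_lif([0.3] * 20)
# spikes == [3, 7, 11, 15, 19]
```

Representing symbols by the precise timing of such spikes, as the dissertation does, builds on exactly this accumulate/decay/reset cycle.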
As with real neurons, there is a short refractory period during which this node will be completely insensitive to incoming signals, after which its sensitivity will slowly increase. Precise timing properties have been used to represent symbols in a distributed manner, and to solve the problems of binding and multiple instantiation. This architecture produced several predictions about human short-term memory, predicate processing, complex reasoning, and multiple instantiation. These predictions have been tested by empirical studies on humans. This network shows symbolic processing abilities using neurologically and psychologically plausible mechanisms that maintain the advantages of generalization and noise tolerance found in connectionist networks. From kathryn.cousins at hodder.co.uk Fri Nov 26 07:02:18 1999 From: kathryn.cousins at hodder.co.uk (Kathryn Cousins) Date: Fri, 26 Nov 1999 12:02:18 -0000 Subject: book announcement: Statistical Pattern Recognition Message-ID: <000e01bf3806$166c0380$5b04c00a@cathryncousins.hhinternal.co.uk> STATISTICAL PATTERN RECOGNITION - http://www.arnoldpublishers.com/scripts/webbook.asp?isbn=0340741643 By Andrew Webb, Defence Evaluation and Research Agency, UK From bengioy at IRO.UMontreal.CA Fri Nov 26 13:29:45 1999 From: bengioy at IRO.UMontreal.CA (Yoshua Bengio) Date: Fri, 26 Nov 1999 13:29:45 -0500 Subject: call for presentations / April Workshop on selecting and combining models Message-ID: <19991126132945.48542@IRO.UMontreal.CA> -------------------------------------------------------------------- Call for Presentations: CRM Workshop on Selecting and Combining Models with Machine Learning Algorithms April 12-14, 2000 Centre de Recherches Mathematiques Montreal Organizers: Yoshua Bengio (Universite de Montreal) and Dale Schuurmans (University of Waterloo) A central objective of machine learning research is to develop algorithms that learn predictive relationships from data. 
This is a central component of data mining and knowledge discovery tasks, which are becoming commonplace applications in the realm of e-commerce. This is a difficult task, however, because inferring a predictive function from data is in fact an "ill-posed" problem; that is, many functions can often "fit" a given finite data set, and yet these functions might generalize very differently on new data drawn from the same distribution. To make this problem well-posed one needs to somehow "calibrate" the complexity of the proposed function class to the amount and quality of available sample data. A classical approach is to perform "model selection" where one imposes a preference structure over function classes and then optimizes a combined objective of class preference and data fit. In doing so, however, it would be useful to have an accurate estimate of the expected generalization performance at each preference level; one could then pick the function class that obtained the lowest expected error, or combine functions from the function classes with the lowest expected error, and so on. Many approaches have been proposed in the past for this purpose, both in the statistics and the machine learning research communities. Recently in machine learning there has been significant interest in new techniques for evaluating generalization error, for optimizing generalization error, and for combining and selecting models. This is exemplified by recent work on Structural Risk Minimization, Support Vector Machines, various Boosting algorithms, and the Bagging algorithm. These new approaches suggest that better generalization performance can be obtained using new, broadly applicable procedures. Progress in this area has not only been important for improving our understanding of how machine learning algorithms generalize; it has already been demonstrated to be very useful for practical applications of machine learning and data analysis.
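The selection procedure outlined above, estimating expected generalization at each preference level and then keeping the class with the lowest estimated error, can be illustrated with a toy sketch. Everything below is an assumption of this example rather than part of the workshop call: the function classes are piecewise-constant models indexed by bin count, and held-out validation error stands in for the generalization estimate.

```python
import random

random.seed(1)

# Toy regression data: noisy samples of a smooth target on [0, 1].
def target(x):
    return x * (1.0 - x)

xs = [random.random() for _ in range(200)]
data = [(x, target(x) + random.gauss(0.0, 0.05)) for x in xs]
train, valid = data[:100], data[100:]

def fit_histogram(points, bins):
    """Fit one function class: piecewise-constant on `bins` equal cells."""
    sums, counts = [0.0] * bins, [0] * bins
    for x, y in points:
        i = min(int(x * bins), bins - 1)
        sums[i] += y
        counts[i] += 1
    return [sums[i] / counts[i] if counts[i] else 0.0 for i in range(bins)]

def mse(model, points):
    bins = len(model)
    return sum((y - model[min(int(x * bins), bins - 1)]) ** 2
               for x, y in points) / len(points)

# Estimate expected generalization error at each complexity level on
# held-out data, then keep the class with the lowest estimate.
errors = {b: mse(fit_histogram(train, b), valid) for b in (1, 2, 4, 8, 16, 32, 64)}
best = min(errors, key=errors.get)
print(best, round(errors[best], 4))
```

Too few bins underfit the target and too many overfit the noise; the held-out estimate typically settles on an intermediate complexity level.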
This workshop will bring together several key researchers in the fields of machine learning and statistics to present their recent results and debate the controversial issues that have divided them at recent machine learning and neural network conferences. The following leaders in this field have tentatively agreed to participate in the workshop as invited speakers: Peter Bartlett (Australian National University), Leo Breiman (University of California-Berkeley), Tom Dietterich (Oregon State University), Yoav Freund (AT&T Labs-Research), Radford Neal (University of Toronto), Michael Perrone (IBM T.J. Watson Research Center), Robert Schapire (AT&T Labs-Research), Grace Wahba (University of Wisconsin at Madison). The workshop will be sponsored by the CRM (Centre de Recherches Mathematiques) as well as by the MITACS (Mathematics of Information Technology And Complex Systems) Network of Centers of Excellence. Contributors to the workshop are invited to submit a short (1 or 2 page) summary in electronic form (ascii text, postscript or pdf) of the proposed presentation by e-mail, to one of the organizers by February 1st, 2000: bengioy at iro.umontreal.ca or dale at cs.uwaterloo.ca. Information on the workshop will be posted on the following web site: www.iro.umontreal.ca/~bengioy/crmworkshop2000 -------------------------------------------------------------------- -- Yoshua Bengio Professeur agrégé Département d'Informatique et Recherche Opérationnelle Université de Montréal, postal address: C.P. 6128 Succ. Centre-Ville, Montreal, Quebec, Canada H3C 3J7 street address: 2920 Chemin de la Tour, Montreal, Quebec, Canada H3T 1J8, #2194 Tel: 514-343-6804. Fax: 514-343-5834. Office 3339.
http://www.iro.umontreal.ca/~bengioy http://www.iro.umontreal.ca/~lisa From Peter.Bartlett at anu.edu.au Fri Nov 26 21:17:58 1999 From: Peter.Bartlett at anu.edu.au (Peter Bartlett) Date: Sat, 27 Nov 1999 13:17:58 +1100 Subject: paper: hebbian synaptic update rule for reinforcement learning Message-ID: <383F3F56.E5506911@anu.edu.au> The following paper is available at http://csl.anu.edu.au/~bartlett/papers/BartlettBaxter-Nov99.ps.gz Hebbian Synaptic Modifications in Spiking Neurons that Learn Peter L. Bartlett and Jonathan Baxter Australian National University In this paper, we derive a new model of synaptic plasticity, based on recent algorithms for reinforcement learning (in which an agent attempts to learn appropriate actions to maximize its long-term average reward). We show that these direct reinforcement learning algorithms also give locally optimal performance for the problem of reinforcement learning with multiple agents, without any explicit communication between agents. By considering a network of spiking neurons as a collection of agents attempting to maximize the long-term average of a reward signal, we derive a synaptic update rule that is qualitatively similar to Hebb's postulate. This rule requires only simple computations, such as addition and leaky integration, and involves only quantities that are available in the vicinity of the synapse. Furthermore, it leads to synaptic connection strengths that give locally optimal values of the long term average reward. The reinforcement learning paradigm is sufficiently broad to encompass many learning problems that are solved by the brain. We illustrate, with simulations, that the approach is effective for simple pattern classification and motor learning tasks. -- Peter. 
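To convey the flavour of such a rule, here is a generic REINFORCE-style sketch in the same spirit: each synapse maintains a local eligibility trace by leaky integration, and a global scalar reward multiplies the trace to give the weight change. The task, constants, and reward signal are invented for illustration; this is not the update rule derived in the paper.

```python
import math
import random

random.seed(0)

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def train(steps=2000, lr=0.1, trace_decay=0.5):
    """One stochastic binary neuron that fires with probability sigmoid(w . x).

    The update uses only locally available quantities: each synapse keeps a
    leaky-integrated eligibility trace of (spike - p) * x_i, and the global
    scalar reward multiplies the trace to give the weight change.
    """
    w = [0.0, 0.0]
    trace = [0.0, 0.0]
    for _ in range(steps):
        x = [float(random.getrandbits(1)), float(random.getrandbits(1))]
        p = sigmoid(w[0] * x[0] + w[1] * x[1])
        spike = 1.0 if random.random() < p else 0.0
        # Global reward: +1 if the neuron fired exactly when input 0 was active.
        reward = 1.0 if spike == x[0] else -1.0
        for i in range(2):
            trace[i] = trace_decay * trace[i] + (spike - p) * x[i]  # leaky integration
            w[i] += lr * reward * trace[i]
    return w

w = train()
print(w)
```

In this toy task the first weight should grow positive so that firing tracks the rewarded input, even though no error gradient is ever communicated to the synapse, only the global reward.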
Peter Bartlett email: Peter.Bartlett at anu.edu.au Machine Learning Group Computer Sciences Laboratory Phone: +61 2 6279 8681 Research School of Information Sciences and Engineering Australian National University Fax: +61 2 6279 8645 Canberra, 0200 AUSTRALIA http://csl.anu.edu.au/~bartlett From piuri at elet.polimi.it Sat Nov 27 06:00:13 1999 From: piuri at elet.polimi.it (Vincenzo Piuri) Date: Sat, 27 Nov 1999 12:00:13 +0100 Subject: IJCNN'2000: New web site, Call for Papers and more.... Message-ID: <3.0.5.32.19991127120013.01d474f0@elet.polimi.it> ======================================================================== IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS Grand Hotel di Como, Como, Italy - 24-27 July 2000 sponsored by the IEEE Neural Network Council, the International Neural Network Society, and the European Neural Network Society with the technical cooperation of the Japanese Neural Network Society, AEI, SIREN, and AI*IA The paper submission deadline is approaching! Please have a look at the call for papers on the conference web site. Do not miss the opportunity to submit a paper and participate actively in the conference! Calls for special sessions and tutorials have also been published; you can find them on the conference web site. For your convenience, we have also set up a mirror web site in the USA. Information will be updated simultaneously on both sites. You can find them at the following addresses: official conference web site: http://www.ims.unico.it/2000ijcnn.html conference mirror web site: http://www.lans.ece.utexas.edu/2000ijcnn.html ====================================================================== Vincenzo Piuri Department of Electronics and Information, Politecnico di Milano piazza L.
da Vinci 32, 20133 Milano, Italy phone +39-02-2399-3606 secretary +39-02-2399-3623 fax +39-02-2399-3411 email piuri at elet.polimi.it From ingber at ingber.com Sun Nov 28 17:29:56 1999 From: ingber at ingber.com (Lester Ingber) Date: Sun, 28 Nov 1999 16:29:56 -0600 Subject: paper: ... reaction time correlates of the g factor Message-ID: <19991128162956.A7362@ingber.com> Statistical mechanics of neocortical interactions: Reaction time correlates of the g factor has been submitted as an Invited commentary on The g Factor: The Science of Mental Ability by Arthur Jensen This paper can be retrieved as http://www.ingber.com/smni00_g_factor.ps.gz Instructions for retrieval are given below. ABSTRACT: A statistical mechanics of neuronal interactions (SMNI) is explored as providing some substance to a physiological basis of the g factor. Some specific elements of SMNI, previously used to develop a theory of short-term memory (STM) and a model of electroencephalography (EEG), are key to providing this basis. Specifically, Hick's Law, the observed linear relationship between reaction time (RT) and the information storage of STM, which in turn correlates with g, is derived. Links to information and utilities for compression/expansion and for viewing and printing PostScript are in http://www.ingber.com/Z_gz_ps_tar_shar.txt Lester ======================================================================== Instructions for Retrieval of Code and Reprints Interactively Via WWW The archive can be accessed via WWW path http://www.ingber.com/ http://www.alumni.caltech.edu/~ingber/ where the last address is a mirror homepage for the full archive. Interactively Via Anonymous FTP Code and reprints can be retrieved via anonymous ftp from ftp.ingber.com.
Interactively [brackets signify machine prompts]:
  [your_machine%] ftp ftp.ingber.com
  [Name (...):] anonymous
  [Password:] your_e-mail_address
  [ftp>] binary
  [ftp>] ls
  [ftp>] get file_of_interest
  [ftp>] quit
The 00index file contains an index of the other files. Files have the same WWW and FTP paths under the main / directory; e.g., http://www.ingber.com/MISC.DIR/00index_misc and ftp://ftp.ingber.com/MISC.DIR/00index_misc reference the same file. Electronic Mail If you do not have WWW or FTP access, get the Guide to Offline Internet Access, returned by sending an e-mail to mail-server at rtfm.mit.edu with only the words send usenet/news.answers/internet-services/access-via-email in the body of the message. The guide gives information on using e-mail to access just about all InterNet information and documents. Additional Information Lester Ingber Research (LIR) develops projects in areas of expertise documented in the ingber.com InterNet archive. Limited help assisting people with queries on my codes and papers is available only by electronic mail correspondence. Sorry, I cannot mail out hardcopies of code or papers.
Lester ======================================================================== -- Lester Ingber http://www.ingber.com/ PO Box 06440 Wacker Dr PO Sears Tower Chicago IL 60606-0440 http://www.alumni.caltech.edu/~ingber/ From luciano at if.sc.usp.br Sun Nov 28 18:40:54 1999 From: luciano at if.sc.usp.br (Luciano Da Fontoura Costa) Date: Sun, 28 Nov 1999 21:40:54 -0200 (EDT) Subject: Research opportunities Message-ID: Dear Sir/Madam: Please help disseminate the following message: ==================================================================== OPPORTUNITY FOR POST-GRAD, POST-DOC AND SABBATICAL STUDIES Cybernetic Vision Research Group IFSC, University of São Paulo, Caixa Postal 369 São Carlos, SP, 13560-970 Brazil ____________________________________________________________________ We would like to announce possibilities for post-grad, post-doc and sabbatical programs at the Cybernetic Vision Research Group, University of São Paulo, Brazil. Additional information is presented in the following: THE CYBERNETIC VISION RESEARCH GROUP: Started in 1993, the Cybernetic Vision Research Group has become nationally and internationally recognized for research in the areas of shape analysis, computer and biological vision, and computational neuroscience. The group currently includes 15 researchers, most of them MSc and PhD students, each with access to individual computational resources. The group is part of the Instituto de Fisica de Sao Carlos, which has modern computational and network resources (alpha workstations) as well as a well-equipped library. Additional information can be found at the following homepages: *** Group homepage: http://cyvision.if.sc.usp.br/ *** Luciano's personal homepage: http://www.if.sc.usp.br/visao/group/members/luciano/luciano.htm SAO CARLOS: Sao Carlos, where the group is located, is a small and quiet town (about 150 000 inhabitants) in the heart of the state of Sao Paulo, in Brazil.
The university campus is within a residential area, where accommodation is very affordable. Our town, which includes two major Brazilian universities as well as many small industries, is known as one of the most prominent Brazilian high-technology centers. Sao Carlos is not far from Sao Paulo (230 km), the state capital, where flights to most Brazilian and international destinations can be found. Weather is mild (no snow throughout the year), with an average temperature around 20C. RESEARCH POSSIBILITIES: Well-motivated and dynamic students and researchers, with backgrounds in the most diverse areas - including Physics, Mathematics, Computer Science, Engineering, Biology, and Medicine - are welcome to apply for studies in our group. The full-time MSc and PhD programs last up to 2 and 4 years, respectively, but it is possible to proceed directly to the PhD. Post-doc and sabbatical programs can last from a few months to one year or more. It is possible to apply for Brazilian sponsorship covering travelling and/or the basic living expenses. Research possibilities include but are not limited to the following:
*1* Neuromorphology and neuromorphic modeling: development of new neural shape measures, validation, and application to classification of neural cells and neuromorphic modeling. We are particularly interested in investigating how neural shapes constrain and help define neural behavior;
*2* Scale space shape representations in 2D and 3D, including multiresolution curvature and skeletonization, singularity theory, differential geometry and differential equations;
*3* Visual inspection and image analysis applied to microscopy;
*4* Mathematical physics applications to image analysis and vision;
*5* Data mining and its applications to visual design, visual quality assessment, neural modeling, and shape analysis.
Candidates should contact Prof Luciano da F.
Costa at luciano at if.sc.usp.br, indicating their specific interests and including curricular information as well as at least three addresses for recommendation purposes. ===================================================================== Prof. Luciano da Fontoura Costa Coordinator - Cybernetic Vision Research Group DFI-IFSC, Universidade de Sao Paulo Caixa Postal 369 Sao Carlos, SP 13560-970 Brazil FAX: +55 162 73 9879 or +55 162 71 3616 e-mail: luciano at if.sc.usp.br Group homepage: http://cyvision.if.sc.usp.br/ Personal homepage: http://www.if.sc.usp.br/visao/group/members/luciano/luciano.htm --------------------------------------------------------------------- Forthcoming book: Shape Analysis and Recognition (CRC Press) http://www.ime.usp.br/~cesar/shape_crc/ Have you been to The Scientist? http://www.the-scientist.library.upenn.edu/index.html BMCV2000: http://image.korea.ac.kr/BMCV2000/ ===================================================================== END OF MESSAGE ============== From ASJagath at ntu.edu.sg Sun Nov 28 20:22:15 1999 From: ASJagath at ntu.edu.sg (Jagath C Rajapakse (Asst Prof)) Date: Mon, 29 Nov 1999 09:22:15 +0800 Subject: Postdoc in Brain Imaging Message-ID: Postdoctoral Position in Brain Image Analysis A postdoctoral research position in structural and functional brain image analysis is immediately available in the School of Applied Science, Nanyang Technological University, Singapore. Candidates should have a Ph.D. and experience in signal/image processing and Unix/C/C++. For further information, contact: Dr. Jagath Rajapakse, School of Applied Science, Nanyang Technological University, N4 Nanyang Avenue, Singapore 639798.
Email: asjagath at ntu.edu.sg, Phone: +65 790 5802, Fax: +65 792 6559 From cindy at cns.bu.edu Mon Nov 29 09:54:18 1999 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Mon, 29 Nov 1999 09:54:18 -0500 Subject: Neural Networks 12(10) Message-ID: <199911291454.JAA17326@retina.bu.edu> NEURAL NETWORKS 12(10) Contents - Volume 12, Number 10 - 1999

NEURAL NETWORKS LETTERS:
Exploiting inherent relationships in RNN architectures
  D.P. Mandic and J.A. Chambers

ARTICLES:
*** Psychology and Cognitive Science ***
A self-supervised learning system for pattern recognition by sensory integration
  K. Yamauchi, M. Oota, and N. Ishii

*** Neuroscience and Neuropsychology ***
A distributed model of the saccade system: Simulations of temporally perturbed saccades using position and velocity feedback
  K. Arai, S. Das, E.L. Keller, and E. Aiyoshi

*** Mathematical and Computational Analysis ***
Storage capacity of non-monotonic neurons
  B. Crespi
A neural implementation of canonical correlation analysis
  P.L. Lai and C. Fyfe
Ensemble learning via negative correlation
  Y. Liu and X. Yao
A regularization approach to continuous learning with an application to financial derivatives pricing
  D. Ormoneit

*** Engineering and Design ***
A developmental approach to visually-guided reaching in artificial systems
  G. Metta, G. Sandini, and J. Konczak
Neuro-fuzzy feature evaluation with theoretical analysis
  R.K. De, J. Basak, and S.K. Pal

______________________________ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription.
Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
----------------------------------------------------------------------------
Membership Type                 INNS             ENNS                JNNS
----------------------------------------------------------------------------
membership with                 $80              660 SEK             Y 15,000 [including
Neural Networks                                                      2,000 entrance fee]
  (student)                     $55              460 SEK             Y 13,000 [including
                                                                     2,000 entrance fee]
-----------------------------------------------------------------------------
membership without              $30              200 SEK             not available to
Neural Networks                                                      non-students (subscribe
                                                                     through another society)
  (student)                                                          Y 5,000 [including
                                                                     2,000 entrance fee]
-----------------------------------------------------------------------------
Institutional rates             $1132            2230 NLG            Y 149,524
-----------------------------------------------------------------------------

Name:    _____________________________________
Title:   _____________________________________
Address: _____________________________________
         _____________________________________
         _____________________________________
Phone:   _____________________________________
Fax:     _____________________________________
Email:   _____________________________________

Payment: [ ] Check or money order enclosed, payable to INNS or ENNS
     OR  [ ] Charge my VISA or MasterCard
             card number ____________________________
             expiration date ________________________

INNS Membership
19 Mantua Road
Mount Royal NJ 08061 USA
856 423 0162 (phone)
856 423 3420 (fax)
innshq at talley.com
http://www.inns.org

ENNS Membership
University of Skovde
P.O.
Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ***************************************************************** From simone at eealab.unian.it Mon Nov 29 11:31:30 1999 From: simone at eealab.unian.it (Simone G.O. Fiori) Date: Mon, 29 Nov 1999 17:31:30 +0100 Subject: Papers available on neural PCA and ICA Message-ID: <1.5.4.32.19991129163130.00d96ffc@unipg.it> Dear Connectionists, the following two papers are now available: `Mechanical' Neural Learning for Blind Source Separation ======================================================== by Simone Fiori - Dept. of Industrial Engineering University of Perugia, Perugia (Italy) Journal: Electronics Letters, Vol. 35, No. 22, Oct. 1999 Extended abstract In this Letter we suggest a new learning theory which ensures that the weight matrix of each layer of a neural network remains orthonormal throughout the learning phase. The learning theory is based upon the study of the dynamics of an abstract rigid mechanical system subject to a field of external forces deriving from a potential energy function (PEF). We suggest that by properly selecting the PEF it is possible to force the system to perform different motions, hence the network to perform different tasks. The proposed learning theory is then applied to solve blind source separation problems. Related papers The mentioned work is part of a wider study of neural learning and weight flow on the Stiefel-Grassman manifold. Interested readers can find more details in the following papers (and references therein): [1] S. Fiori et al., Orthonormal Strongly-Constrained Neural Learning, Proc. IJCNN'98, pp.
1332 - 1337, 1998 [2] --, A Second-Order Differential System for Orthonormal Optimization, Proc. of International Symposium on Circuits and Systems, Vol. V, pp. 531 - 534, 1999 [3] --, `Mechanical' Neural Learning and InfoMax Orthonormal Independent Component Analysis, Proc. IJCNN'99, in press [4] --, Neural Learning and Weight Flow on Stiefel Manifold, Proc. X Italian Workshop on Neural Nets, pp. 325 -- 333, 1998 (in English) ~~~~~~~~~ An Experimental Comparison of Three PCA Neural Networks ======================================================= by Simone Fiori - Dept. of Industrial Engineering University of Perugia, Perugia (Italy) Journal: Neural Processing Letters, accepted for publication. Abstract We present a numerical and structural comparison of three neural PCA techniques: the GHA by Sanger, the APEX by Kung and Diamantaras, and the $\psi$--APEX first proposed by the present author. Through computer simulations we illustrate the performance of the algorithms in terms of convergence speed and minimal attainable error; then an evaluation of the computational effort required by the different algorithms is presented and discussed. A close examination of the obtained results shows that the members of the new class improve on the numerical performance of the existing algorithms, and are also easier to implement. ~~~~~~~~~ Comments and suggestions, especially about the first topic, would be particularly welcome. Both comments and requests for reprints should be addressed to: Dr. Simone Fiori Dept. of Industrial Engineering, Univ. of Perugia Via Ischia, 131 I-60026 Numana (An), Italy E-mail: simone at eealab.unian.it Fax: +39.0744.470188 Best regards, S. Fiori From apolloni at dsi.unimi.it Mon Nov 29 13:28:51 1999 From: apolloni at dsi.unimi.it (Prof. Apolloni Bruno) Date: Mon, 29 Nov 1999 19:28:51 +0100 Subject: Job position at Milano University Message-ID: RESEARCH POST IN THE DEPT.
OF COMPUTER SCIENCE, UNIVERSITY OF MILAN Applications are invited for a research post within the framework of the "Principled Hybrid Systems: Theory and Applications" (PHYSTA) research network. The network is funded by the Training and Mobility of Researchers programme (TMR) of the EC and started in December 1997. It involves research groups from five universities:
i) King's College London, United Kingdom (Dept. of Mathematics, Prof. J.Taylor)
ii) Catholic University of Nijmegen, Netherlands (Centre for Neural Networks, Prof. Stan Gielen)
iii) National Technical University of Athens, Greece (Dept. of Electrical and Computer Engineering, Prof. Stefanos Kollias)
iv) University of Milan, Italy (Department of Computer Science, Prof. Bruno Apolloni)
v) Queen's University Belfast, United Kingdom (Dept. of Psychology/English, Prof. Rodie Cowie)
The aim of the research network is the development of a theory and a methodology for combining symbolic and sub-symbolic techniques. The problem domain comes from HCI and concerns emotion understanding based on static/moving images and speech signals. The research post is available in the Neural Networks Laboratory of the Computer Science Department of the University of Milan. The post is available immediately for a period of 18 months. The successful applicant will continue the development of a hybrid system prototype that has already been implemented. This is a system combining a neural processing part, which performs a mapping from the feature space to the propositional variable space, and a symbolic processing part. The latter is a series of meditation jumps to higher levels of abstraction following a PAC learning framework for boolean formulas. Candidates must have (or be close to obtaining) a PhD in a field such as Computer Science, Mathematics or Electrical Engineering. C programming skills are necessary, while experience with SNNS and Scheme/Lisp programming skills are welcome.
As the funding is provided by the EC TMR programme, there are some restrictions on who may benefit from it:
* Candidates must be 35 years old or younger.
* Candidates must be nationals of an EU country, Norway, Switzerland or Iceland.
* Candidates must not be Italian nationals or have worked in Italy for 18 of the last 24 months.
The salary for this post is approximately 2,100 Euros. To apply for this post please (e)mail your CV to the address below: Professor Bruno Apolloni, Neural Networks Laboratory Computer Science Department University of Milan, Via Comelico 39, Milano 20135, ITALY phone: 0039 02 55006284 fax: 0039 02 55006276 email: apolloni at dsi.unimi.it More information: apolloni at dsi.unimi.it http://www.image.ece.ntua.gr/physta From angelo.arleo at epfl.ch Tue Nov 30 03:47:41 1999 From: angelo.arleo at epfl.ch (Angelo Arleo) Date: Tue, 30 Nov 1999 09:47:41 +0100 Subject: Preprint available Message-ID: <38438F2C.F2D774C1@epfl.ch> Dear Connectionists, the following paper is to appear in Biological Cybernetics, Special Issue on Navigation in Biological and Artificial Systems: =================================================== "Spatial Cognition and Neuro-Mimetic Navigation: A Model of Hippocampal Place Cell Activity" Angelo Arleo and Wulfram Gerstner Centre for Neuro-Mimetic Systems, MANTRA Swiss Federal Institute of Technology Lausanne =================================================== A preprint of the paper is available at the site: ftp://lamiftp.epfl.ch/pub/arleo/BC99/paper.ps.Z Comments and suggestions are particularly welcome. Best regards, Angelo Arleo ================================================== Extended Abstract A computational model of hippocampal activity during spatial cognition and navigation tasks is presented. The spatial representation in our model of the rat hippocampus is built on-line during exploration via two processing streams.
An allothetic vision-based representation is built by unsupervised Hebbian learning extracting spatio-temporal properties of the environment from visual input. An idiothetic representation is learned based on internal movement-related information provided by path integration. On the level of the hippocampus, allothetic and idiothetic representations are integrated to yield a stable representation of the environment by a population of localized overlapping CA3-CA1 place fields. The hippocampal spatial representation is used as a basis for goal-oriented spatial behavior. We focus on the neural pathway connecting the hippocampus to the nucleus accumbens. Place cells drive a population of locomotor action neurons in the nucleus accumbens. Reward-based learning is applied to map place cell activity into action cell activity. The ensemble action cell activity provides navigational maps to support spatial behavior. We present experimental results obtained with a mobile Khepera robot. =================================================
-------------------------------------------------------------------------
Angelo Arleo
Laboratory of Microcomputing (LAMI-DI)
Swiss Federal Inst. of Technology Lausanne (EPFL)
CH-1015 Ecublens, Lausanne
Tel/Fax: ++41 21 693 6696 / 5263
E-mail: angelo.arleo at di.epfl.ch
Web: http://diwww.epfl.ch/lami/team/arleo
-------------------------------------------------------------------------
From sbay at algonquin.ICS.UCI.EDU Tue Nov 30 00:43:47 1999 From: sbay at algonquin.ICS.UCI.EDU (Stephen D. Bay) Date: Mon, 29 Nov 1999 21:43:47 -0800 Subject: news item -- UCI KDD Archive: Current Contents Message-ID: <199911292143.aa17340@gremlin-relay.ics.uci.edu> I believe the following item may be of interest to the connectionists mailing list members. Stephen =================================================================== Stephen D.
Bay phone (949) 824-3491 Information and Computer Science fax (949) 824-4056 University of California e-mail sbay at ics.uci.edu Irvine, CA 92697-3425 http://www.ics.uci.edu/~sbay - ---------------------------------------------------------- ************************************************** The UCI KDD Archive Current Contents http://kdd.ics.uci.edu/ ************************************************** Thanks to our generous donors, we now have the following data sets publicly available:

Discrete Sequence Data: UNIX User Data
Image: CMU Face Images, Volcanoes on Venus
Multivariate Data: COIL Data, Corel Image Features, Forest CoverType, Internet Usage Data, IPUMS Census Data, KDD CUP 1998 Data, KDD CUP 1999 Data
Relational Data: Movies
Spatio-Temporal Data: El Nino Data
Text Data: 20 Newsgroups Data, Reuters-21578 Text Collection
Time Series: Australian Sign Language Data, EEG Data, Pioneer-1 Mobile Robot Data, Pseudo Periodic Synthetic Time Series, Robot Execution Failures, Synthetic Control Chart Time Series
Web Data: Microsoft Anonymous Web Data, Syskill Webert Web Data

If you have a data set or an analysis of data that would be of interest to the KDD community, please consider donating it to the archive. The UCI KDD Archive web site has detailed submission instructions or you may contact the archive librarian for more information. This archive is supported by the Information and Data Management Program at the National Science Foundation. Stephen Bay (sbay at ics.uci.edu) archive librarian