From planning at icsc.ab.ca Mon May 1 17:53:38 2000 From: planning at icsc.ab.ca (Jeanny S. Ryffel) Date: Mon, 1 May 2000 15:53:38 -0600 Subject: Auditory Computations and Neuromorphic Implementations Message-ID: <000501bfb3b7$f0868cc0$5a5b22cf@compusmart.ab.ca> Tutorial on Auditory Computations and Neuromorphic Implementations http://www.icsc.ab.ca/150-tut.htm#Dr. Shihab A. Shamma Organizer: Dr. Shihab A. Shamma (Chair) Electrical and Computer Engineering Department A.V. Williams Bldg University of Maryland, College Park MD 20742 sas at Glue.umd.edu http://www.isr.umd.edu/People/faculty/Shamma.html This session will focus on advances in auditory theory and computations for speech, music, and other complex sounds. Topics will include the latest findings and models for encoding of timbre and pitch of complex sounds, and binaural localization cues and algorithms. Also addressed are issues dealing with real-time implementations of auditory algorithms, and their applications in various industrial and military contexts ranging from manufacturing acoustics to MEMS interfaces and prosthetics. The session will also include invited talks on hardware realizations of the algorithms, both in analog VLSI and in DSP platforms. Invited speakers to the session include: Shihab Shamma (Chair) will provide the review talk on auditory computations and implementations (University of Maryland, College Park) Timothy Horiuchi (Co-Chair) - He will talk about auditory robotics, both acoustic and for echolocation (as in bats) (University of Maryland, College Park) Andre van Schaik - He will address aVLSI implementations of auditory algorithms for pitch extraction and peripheral auditory transformations (University of Sydney) Ralph Etienne-Cummings will review some of the MEMS sensors and actuators that have been used for acoustic signal processing. He will also describe an architecture and sensor/actuator design for a highly integrated microphone, speaker and processing electronics for ultrasonic ranging and imaging. (The Johns Hopkins University) Steve Greenberg - His talk will address the perception of speech and its relation to basic auditory processes in cortex and other higher auditory centers. (Int. Comp. Science Inst., Berkeley, California) Researchers interested in contributing to this session are requested to submit manuscripts of up to 5,000 words to: Shihab A. Shamma sas at Glue.umd.edu IMPORTANT DATES May 15, 2000: Submission deadline June 15, 2000: Notification of acceptance July 30, 2000: Delivery of full papers December 12-15, 2000: ISA'2000 congress This session is part of the International Congress on INTELLIGENT SYSTEMS AND APPLICATIONS (ISA'2000) University of Wollongong (near Sydney), Australia December 12-15, 2000 http://www.icsc.ab.ca/isa2000.htm From mkm at hnc.com Tue May 2 11:00:44 2000 From: mkm at hnc.com (McClarin, Melissa) Date: Tue, 2 May 2000 08:00:44 -0700 Subject: HNC Software-Financial Solutions seeks two Staff Scientists Message-ID: <72A838A51366D211B3B30008C7F4D363035BD088@pchnc.hnc.com> HNC Financial Solutions, a division of HNC Software, is a world leader in the development and delivery of predictive, neural network-based software solutions for the financial industry. We offer the opportunity to work with cutting-edge technology and a world-class team in a casual atmosphere. Benefits include three weeks' vacation, stock option grants and tuition reimbursement.
HNC Financial Solutions is seeking two full-time Staff Scientists to be based at our headquarters in San Diego, California. Please see the job description and requirements below. To apply, please use one of the following methods: Mail: Resume Processing Center PO Box 828 Burlington, MA 01803 Fax: 800-438-0957 Email: hnc at rpc.webhire.com Online: http://www.hnc.com To apply for this position, please reference job code 00-070CSD Staff Scientist Duties/Job Description: The individual will work with a highly motivated and talented group of scientists and engineers responsible for development of neural network-based predictive models for the Falcon payment card fraud product. Falcon currently protects more than 300 million payment cards against fraudulent use around the world on five continents. Forty of the top fifty U.S. banks and sixteen of the top twenty-five international banks currently use Falcon. Job responsibilities include designing and building predictive models for payment card fraud detection based on the latest technologies in neural networks, pattern recognition, artificial intelligence, statistical modeling, and machine learning. Specific responsibilities may vary by project, but will include analyzing huge amounts of data to determine suitability for modeling, pattern identification and feature (variable) selection from large amounts of data, experimenting with different types of models, analyzing performance, and reporting results to customers. Required Qualifications (Experience/Skills): MS or Ph.D. in Computer Science, Electrical Engineering, Applied Statistics/Mathematics or related fields. Minimum two years of experience in pattern recognition, mathematical/statistical modeling, or data analysis on real-world problems. Familiarity with the latest modeling techniques and tools. Good oral and written communication skills. Team orientation and, at the same time, the ability to work independently. Ability to interact well with both co-workers and customers. Strong programming skills desired. Proficiency in C and UNIX, and familiarity with some analysis tools, e.g., MATLAB, SAS, etc., desirable. Preferred Qualifications (Experience/Skills): Strong mathematical appetite, problem solving and computer skills (C or C++ or Java). Good UNIX scripting and rapid prototyping skills. Quick learner. Good team player. Experience in designing systems based on neural networks, pattern recognition and/or statistical modeling techniques for financial, health care, marketing, or other real-world applications. Familiarity with object-oriented software design. From sschaal at usc.edu Tue May 2 21:37:08 2000 From: sschaal at usc.edu (Stefan Schaal) Date: Tue, 2 May 2000 18:37:08 -0700 Subject: Conference on Humanoid Robots -- deadline extension May 21 Message-ID: <200005030137.SAA03955@rubens.usc.edu.> **************************************************************************** **** Note: Due to popular request, deadline was extended to May 21 ********* **************************************************************************** Appended you find the call for papers for the first international conference on Humanoid Robots. We are particularly interested in soliciting contributions from the learning community for this conference. Supervised learning, unsupervised learning, and reinforcement learning are core elements in sensory motor control of a humanoid robot, the same as in biological systems.
The need for algorithms that scale well to high-dimensional data, work incrementally in real-time, can integrate multi-modal information and deal with hidden state makes the area of humanoid robotics a very interesting challenge for new learning theories. With best regards, Stefan Schaal & Alois Knoll -------------------------------------------------------------------------- CALL FOR PAPERS -- Please circulate *** HUMANOIDS2000 *** -- The First IEEE-RAS Intern. Conf. on Humanoid Robots -- -- Co-sponsored by the Robotics Society of Japan (RSJ) -- Massachusetts Institute of Technology, Sept. 7-8, 2000 Papers should present current work, outline research programmes, and/or summarize in a tutorial style the state of the art in areas that are related to the building of, controlling of, and learning in humanoid robots, or that can be expected to be of importance to the field in the future. Note that we are also especially interested in connectionist and statistical learning methods as they relate to learning sensorimotor control and higher planning abilities in complex, high-dimensional movement systems. The paper submission deadline is May 21, 2000. For more information, please visit the conference web sites at http://humanoids.uni-bielefeld.de or http://humanoids.usc.edu (including a full Call for Papers in PDF and Postscript format). ______________________________________________________ Deadlines Submission: May 21, 2000 Notification: June 30, 2000 Camera-Ready Copy: August 4, 2000 ______________________________________________________ Contact address: humanoids at usc.edu ______________________________________________________ Conference Chairs: G.A.Bekey, USC (General) R.A.Brooks, MIT (Honorary) A.C.Knoll, U Bielefeld (Program) ______________________________________________________ Program Committee: M. Asada (Osaka U) C. Atkeson (Georgia Tech) T. Christaller (GMD-Bonn) T. Fukuda (Nagoya U) S. Hashimoto (U Waseda) H. Inoue (U Tokyo) K. Kawamura (Vanderbilt U) B. Keeley (U Northern Iowa) P. Khosla (CMU) T. Kobayashi (U Waseda) Y. Kuniyoshi (MITI Tsukuba) M. Mataric (USC) R. Pfeifer (U Zurich) R. Reiter (U Toronto) S. Schaal (USC) S. Sugano (Waseda U) M. Wheeler (U Stirling) S. Yuta (U Tsukuba) ______________________________________________________ From nnsp00 at neuro.kuleuven.ac.be Wed May 3 06:54:43 2000 From: nnsp00 at neuro.kuleuven.ac.be (NNSP2000, Sydney) Date: Wed, 03 May 2000 12:54:43 +0200 Subject: Postdoctoral position biomedical signal-processing and neuroimaging Message-ID: <39100573.502A8E9E@neuro.kuleuven.ac.be> Postdoctoral position biomedical signal-processing and neuroimaging ------------------------------------------------------------------- The Computational Neuroscience Group of the Laboratory of Neuro- and Psychophysiology, Medical School of the Catholic University of Leuven, Belgium (http://simone.neuro.kuleuven.ac.be), invites applications for a post-doctoral position in the area of biomedical signal-processing and neuroimaging (functional Magnetic Resonance Imaging). Desired profile: The highly qualified applicant should possess a Ph.D. degree in the field of signal-processing, image-processing, statistics, or neural networks. He/she should be familiar with Principal Components Analysis (PCA), Independent Components Analysis (ICA), projection pursuit, or related techniques, and have a profound knowledge of both uni-variate statistics, such as t-tests, F-tests, and multi-variate statistics, such as ANOVA, ANCOVA, and MANCOVA.
Programming skills are an asset (C, Matlab, ...), as is familiarity with UNIX and PC platforms. We offer: 1) A challenging research environment. The applicant will have access to data from state-of-the-art Magnetic Resonance scanners and advanced statistical tools such as SPM (Statistical Parametric Mapping) for examining brain activity in both humans and monkeys. 2) An attractive income. The applicant will receive 2150 USD or 2375 Euro per month, including full social security coverage and housing. This is comparable to the salary of an associate professor at the University. Housing will be taken care of by the host institute. 3) Free return airline ticket, economy class (maximum 1350 USD or 1500 Euro), and a reimbursement of all costs incurred for shipping luggage to Belgium (maximum 900 USD or 1000 Euro). Send your CV (including the names and contact information of three references), bibliography and how to contact you by mail/fax/email/phone to: Prof. Dr. Marc M. Van Hulle K.U.Leuven Laboratorium voor Neuro- en Psychofysiologie Faculteit Geneeskunde Campus Gasthuisberg Herestraat 49 B-3000 Leuven Belgium Phone: + 32 16 345961 Fax: + 32 16 345993 E-mail: marc at neuro.kuleuven.ac.be URL: http://simone.neuro.kuleuven.ac.be From POCASIP at aol.com Wed May 3 21:12:20 2000 From: POCASIP at aol.com (POCASIP@aol.com) Date: Wed, 3 May 2000 21:12:20 EDT Subject: Neurobiologically inspired image processing expert sought Message-ID: <1e.4c5850d.26422874@aol.com> The Advanced Signal and Image Processing Laboratory of Intelligent Optical Systems Inc. (IOS) is looking for a candidate who has expertise and experience in: Neurobiologically inspired image processing. We seek a candidate with hands-on experience in modeling the human visual system and its ability to perform edge and vertex detection, contour extraction, and illusory contour representations, as well as knowledge about neuronal synchrony. + Programming fluency in C++ and Java is an important asset. + Experience in solving real-world problems in a wide variety of applications is a definite plus. The activities of the Advanced Signal and Image Processing Laboratory include image analysis, biomedical diagnosis, food quality control, chemical analysis, system control, and target recognition, using neural computation-based implementations in software and hardware. IOS is a rapidly growing, dynamic high-tech R&D company with a focus on commercializing smart sensors and advanced information processing. We employ about 35 people, including 14 scientists from a variety of disciplines, and are located in Torrance, California, a pleasant seaside town with a high standard of living and year-round perfect weather. Please send your application, including curriculum vitae and three references, in ASCII only, by e-mail to POCASIP at aol.com E. Fiesler From Andres.PerezUribe at unifr.ch Wed May 3 08:04:05 2000 From: Andres.PerezUribe at unifr.ch (Andres Perez-Uribe) Date: Wed, 3 May 2000 14:04:05 +0200 Subject: Post-Doc: Uni-Fribourg, Switzerland Message-ID: <1000503140405.ZM5515@ufps25> The Parallelism and Artificial Intelligence (PAI) group is a dynamic team involved in hot research topics related to new information and communication technologies.
Its interests encompass the methodologies of Autonomous and Adaptive Systems, Collective Intelligence, Evolutionary Computing, Agent Technology, and Web Operating Systems, as well as the field of Human-Computer Interaction, where it specifically addresses the Immersive trend and Pervasive Computing, notably with Force-Feedback Interaction and Sound Imaging. For the launch of its new WELCOME project, the group is now opening a Post-Doctoral Research Position The position The position is intended for an enthusiastic postgraduate who has completed her/his doctorate. The candidate will participate in the research activities that are conducted within the WELCOME project (gzip'd postscript description of the project available at http://www-iiuf.unifr.ch/pai/index.html/Welcome.ps.gz). Her/his duties will consist of (i) developing Agent-based methodologies for Internet-based infrastructures, or (ii) tackling Human Computer Interaction issues, together with the supervision of two Ph.D. research projects. As far as possible she/he will promote industrial applications of her/his research, and build contacts with external academic or commercial organizations. An open mind towards interdisciplinary approaches and non-standard innovation techniques will be appreciated. The position is to be taken up in early spring 2000 (or at convenience). It is granted for two years by the Swiss National Foundation for Scientific Research (with a possibility of renewal once). Job location is Fribourg, a French-German bilingual mid-sized city in Switzerland. The requirements Education: Ph.D. in Computer Science Ability to speak, read and write French or German or English Proficiency in one or several topics, such as: Intelligent Networks, Distributed Systems and/or Coordination Languages, Agent Technology Human Computer Interaction, Immersive and Pervasive Computing, Force-feedback interaction, Augmented or virtual reality, Sound Imaging Evolutionary Computing, Artificial Life Object-Oriented design techniques, Java programming, Jini technology Applications with CV and research paper list must be sent to (Email submissions are encouraged): Prof. Béat Hirsbrunner University of Fribourg, ch. du Musée 3, CH-1700 Fribourg Tel.: +41 (079) 611 72 48 Email: beat.hirsbrunner at unifr.ch -- Andres PEREZ-URIBE Postdoctoral Fellow Parallelism and Artificial Intelligence Group (PAI) Computer Science Institute, University of Fribourg, Switzerland Ch. du Musee 3, CH-1700 Fribourg, Office 2.76b Perolles Tel. +41-26-300-8473, Fax +41-26-300-9731 Email:Andres.PerezUribe at unifr.ch, http://www-iiuf.unifr.ch/~aperezu/ From dimi at ci.tuwien.ac.at Fri May 5 09:43:36 2000 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Fri, 5 May 2000 15:43:36 +0200 (CEST) Subject: CI BibTeX Collection -- Update Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 3/4 IEEE Transactions on Fuzzy Systems, Volumes 7/6-8/1 IEEE Transactions on Neural Networks, Volumes 10/6-11/2 Machine Learning, Volumes 37/2-40/2 Neural Computation, Volumes 11/7-12/4 Neural Networks, Volumes 12/9-13/2 Neural Processing Letters, Volumes 10/2-11/2 Most files have been converted automatically from various source formats, please report any bugs you find.
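For readers who have not used the collection: each file holds plain BibTeX records, one per paper. The entry below is a purely hypothetical illustration of the record format (author, title, and page numbers are invented here, not taken from the collection):

  @Article{author00example,
    author  = {A. N. Author and B. Coauthor},
    title   = {An Example Title},
    journal = {Neural Computation},
    year    = {2000},
    volume  = {12},
    number  = {4},
    pages   = {801--820}
  }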
The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut für Statistik * Tel: (+43 1) 58801 10773 * * Technische Universität Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ From banzhaf at tarantoga.cs.uni-dortmund.de Fri May 5 10:49:19 2000 From: banzhaf at tarantoga.cs.uni-dortmund.de (Wolfgang Banzhaf) Date: Fri, 5 May 2000 16:49:19 +0200 (MET DST) Subject: No subject Message-ID: <200005051449.QAA18496@tarantoga.cs.uni-dortmund.de> A non-text attachment was scrubbed... Name: not available Type: text Size: 2102 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/38422686/attachment.ksh From hammer at informatik.uni-osnabrueck.de Fri May 5 09:45:02 2000 From: hammer at informatik.uni-osnabrueck.de (Barbara Hammer) Date: Fri, 5 May 2000 15:45:02 +0200 Subject: PhD positions at the University of Osnabrueck (Germany) Message-ID: <200005051345.PAA18905@pooh.informatik.uni-osnabrueck.de> Dear Sir or Madam, You are kindly requested to forward the following opening to anyone who might be interested. We apologize in advance in case of multiple receipts of this message. Please ignore this message if it does not lie in your field of interest. Thank you in advance for your cooperation. Sincerely yours, Barbara Hammer ----------------------------------------------------------------- PHD Positions in Neuromathematics ----------------------------------------------------------------- The Ministry of Science and Education of the Federal State of Lower Saxony has established a new research group of junior scientists working mainly in the field of "Learning with neural methods on structured data" in the Department of Mathematics and Computer Science at the University of Osnabrueck, and is therefore looking for two Scientific Assistants (payment according to German tariff BAT IIa) as of now and for a limited period of four years. The group co-operates with the working groups on discrete mathematics and theoretical computer science/neuro-informatics as well as with representatives of the interdisciplinary course of studies 'Cognitive Science'. The main working field is the combination of neuro-informatic methods and discrete optimization, and applications in information management and scheduling. The persons filling these posts will work in current research projects. These posts require an appropriate scientific university degree in mathematics, computer science or a related scientific-technical course of studies. Knowledge of machine learning or discrete mathematics is desirable. Readiness for interdisciplinary research is expected. There is the possibility of doing a doctorate. In exceptional cases the candidates may already have received the PhD. Part-time employment may be considered. The University of Osnabrueck aims at a higher share of women in the scientific field and expressly encourages qualified female scientists to apply for these posts. Seriously handicapped applicants will take precedence if equally qualified.
Applications containing the customary documents should be sent to Dr. Barbara Hammer, Department of Mathematics/Computer Science, University of Osnabrueck, D-49069 Osnabrueck, Germany by 26th of May 2000. Further information is available at: http://www.informatik.uni-osnabrueck.de/barbara/lnm/ or e-mail to hammer at informatik.uni-osnabrueck.de From mel at lnc.usc.edu Fri May 5 20:11:59 2000 From: mel at lnc.usc.edu (Bartlett Mel) Date: Fri, 05 May 2000 17:11:59 -0700 Subject: Prelim Program: 7th Joint Symposium on Neural Computation Message-ID: <3913634F.322AD1A7@lnc.usc.edu> PRELIMINARY PROGRAM --- 7th Joint Symposium on Neural Computation --- to be held at the UNIVERSITY OF SOUTHERN CALIFORNIA Saturday, May 20, 2000 ------------------------------------------- website: http://www.its.caltech.edu/~jsnc/ ------------------------------------------- 8:45 Opening Remarks - Mel Session I. Cells and Synapses - Bower 9:00 Olivier Coenen, San Diego Children's Hospital Research Center "A Hypothesis for Parallel Fiber Coding in a Cerebellar Model of Smooth Pursuit Eye Movement" 9:15 Roland Suri, The Salk Institute "Modeling Functions of Striatal Dopamine Modulation in Learning and Planning" 9:30 Ralf Wessel, UC San Diego "Biophysics of Visual Motion Analysis in Avian Tectum" 9:45 Panayiota Poirazi, USC "Sublinear vs. Superlinear Synaptic Integration? Tales of a Duplicitous Active Current" Session II. Sensory-Motor Learning - Schaal 10:00 Auke Ijspeert, USC "Locomotion and Visually-Guided Behavior in Salamander: An Artificial Evolution and Neuromechanical Study" 10:15 Thomas DeMarse, Caltech "The Animat Project: Interfacing Neuronal Cultures to a Computer Generated Virtual World" 10:30 Richard Belew, UC San Diego "Evolving Behavior in Developing Robot Bodies Controlled by Quasi-Hebbian Neural Networks" 10:45 Aude Billard, USC "A Biologically Inspired Connectionist Model for Learning Motor Skills by Imitation" 11:00 Coffee Break 11:15 KEYNOTE SPEAKER Gerald E. Loeb, USC "Dialogs with the Nervous System" 12:00 Lunch and Posters Session IV. Vision - Hoffmann 2:00 Martina Wicklein, The Salk Institute "Perception of Looming in the Hummingbird Hawkmoth Manduca Sexta (Sphingidae, Lepidoptera)" 2:15 Erhan Oztop, USC "Mirror Neuron System in Monkey: A Computational Modeling Approach" 2:30 Eric Ortega, USC "Smart Center-Surround Receptive Fields: What Bayes May Say About the Neural Substrate for Color Constancy" 2:45 Junmei Zhu, USC "Fast Dynamic Link Matching by Communicating Synapses" 3:00 David Eagleman, The Salk Institute "The Timing of Perception: How Far in the Past do we Live, and Why?" 3:15 Coffee Break Session III. Concepts and Memory - Mel 3:30 Jonathan Nelson, UC San Diego "Concept Induction in the Presence of Uncertainty" 3:45 Peter Latham, UCLA "Attractor Networks in Systems with Underlying Random Connectivity" Session V.
Faces - Sejnowski 4:00 Tim Marks, UC San Diego "Face Processing in Williams Syndrome: Using ICA to Discriminate Functionally Distinct Independent Components of ERPs in Face Recognition" 4:15 Ian Fasel, UC San Diego "Automatic Detection of Facial Landmarks: An Exhaustive Comparision of Methods" 4:30 Boris Shpungin, UC San Diego "A System for Robustly Tracking Faces in Real-Time" 4:45 Closing Remarks - Sejnowski 5:00 Adjourn for Dinner POSTERS ------- Ildiko Aradi, UC Irvine "Network Stability and Interneuronal Diversity" Javier Bautista, USC "Creating a World Representation of the Environment from Visual Images" Maxim Bazhenov, The Salk Institute "Slow Wave Sleep Oscillations and Transition to an Awake State in a Thalamocortical Network Model" Hamid Beigy, Amirkabir Univ. of Technology "Adaptation of Parameters of BP Algorithm Using Learning Automata" Axel Blau, Caltech "High-Speed Imaging of Neuronal Network Activity" Mihail Bota, USC "The NeuroHomology Database" Theodore Bullock, UC San Diego "When is a Rhythm" Spiros Courellis, USC "Modeling Event-Driven Dynamics in Biological Neural Networks" Holger Quast, UC San Diego "Absolute Perceived Loudness of Speech" Gary Holt, USC "Unsupervised Learning of the Non-Classical Surround" Jeff McKinstry, Point Loma Nazarene University "A Model of Primary Visual Cortex Applied to Edge Detection" Stefan Schaal, USC "Functional brain activation in rhythmic and discrete movement" Alexei Samsonovich, University of Arizona "A Theory-of-Mind Connectionist Model of Episodic Memory Consolidation" ------------------------------------------------------------------ REGISTRATION CHECKS SHOULD BE RECEIVED BY MAY 15 TO GUARANTEE THAT A DELICIOUS HOT LUNCH WILL BE WAITING FOR YOU. See conference web site to register, and for directions to the meeting: http://www.its.caltech.edu/~jsnc/ Fee: $40 students, $50 all others - both prices include hot lunch. Mail checks to: Linda Yokote BME Department USC, Mail Code 1451 Los Angeles, CA 90089 PROGRAM COMMITTEE ----------------- JAMES M. BOWER Division of Biology, Caltech GARRISON W. COTTRELL Dept. of Computer Science and Engineering, UCSD DONALD D. HOFFMAN Dept. of Cognitive Sciences, UCI GILLES LAURENT Division of Biology & Computation and Neural Systems Program, Caltech BARTLETT W. MEL (chair) Dept. of Biomedical Engineering & Neuroscience Program, USC SHEILA NIRENBERG Dept. of Neurobiology, UCLA STEFAN SCHAAL Dept. of Computer Science and Neuroscience Program, USC TERRENCE J. SEJNOWSKI Howard Hughes Medical Institute, UCSD/Salk Institute Local Arrangements ---------------------- Linda Yokote, BME Department, USC, marubaya at rcf.usc.edu, (213)740-0840 Gabriele Larmon, BME Dept, USC, larmon at bmsrs.usc.edu Proceedings ------------ Marilee Bateman, Institute for Neural Computation, UCSD, bateman at cogsci.ucsd.edu Web Site -------- Marionne Epalle, Engineering and Applied Science, Caltech, marionne at caltech.edu From kositsky at greed.cs.umass.edu Mon May 8 18:15:34 2000 From: kositsky at greed.cs.umass.edu (Michael Kositsky) Date: Mon, 8 May 2000 18:15:34 -0400 (EDT) Subject: PhD thesis anouncement Message-ID: Dear Connectionists, My PhD thesis on motor learning and skill acquisition is now available at http://www-anw.cs.umass.edu/~kositsky/phdThesis/phdThesis.html Title: Motor Learning and Skill Acquisition by Sequences of Elementary Actions Abstract: The work presents a computational model for motor learning and memory. The basic approach of the model is to treat complex activities as sequences of elementary actions. 
The model implements two major functions. First, the combination of elementary actions into sequences to produce desired complex activities, which is achieved by a search procedure involving multiscale task analysis and stochastic descent processing. Second, the utilization of past motor experience by effective memorization and retrieval, and the generalization of sequences. New tasks are accomplished by combining past sequences intended for similar tasks. The generalization is based upon the clustering property of motor experience data. Specifically, the clustering property results in concentrating the data points within compact regions, allowing fast and accurate generalization of the elementary actions and, consequently, enabling robust performance of familiar tasks. A motor memory architecture is proposed that uses the clusters as the basic memory units. The computational work is accompanied by a set of psychophysical studies aimed at examining the possible use of a cluster representation by the human motor system. The experiment examines the entire motor learning process, starting from untrained movements up to the formation of highly skilled actions. Michael Kositsky Senior Postdoctoral Researcher Department of Computer Science University of Massachusetts, Amherst email: kositsky at cs.umass.edu web: http://www-anw.cs.umass.edu/~kositsky From cjlin at csie.ntu.edu.tw Mon May 8 14:45:23 2000 From: cjlin at csie.ntu.edu.tw (Chih-jen Lin) Date: Tue, 9 May 2000 02:45:23 +0800 (CST) Subject: announcing a software Message-ID: <200005081845.CAA27195@ntucsa.csie.ntu.edu.tw> Dear Colleagues: We announce the release of the software LIBSVM, a support vector machine (SVM) library for classification problems by Chih-Chung Chang and Chih-Jen Lin. Most available SVM software is either quite complicated or not suitable for large problems. Instead of seeking very fast software for difficult problems, we provide a simple, easy-to-use, and moderately efficient package for SVM classification. We hope this library helps users from other fields to easily use SVM as a tool. We also provide a graphic interface to demonstrate 2-D pattern recognition. The current release (Version 1.0) is available from http://www.csie.ntu.edu.tw/~cjlin/libsvm Any comments are very welcome. Sincerely, Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan cjlin at csie.ntu.edu.tw From flake at research.nj.nec.com Mon May 8 13:16:32 2000 From: flake at research.nj.nec.com (Gary William Flake) Date: Mon, 8 May 2000 13:16:32 -0400 (EDT) Subject: NODElib availability Message-ID: <200005081716.NAA09498@cartman.nj.nec.com> Greetings: I am pleased to announce that NODElib is publicly available under the GNU public license. NODElib is a research and development programming library that can be used to rapidly produce neural network simulations. Some of NODElib's more advanced features include: * A vast number of NN architectures unified under a general framework: MLP, RBFN, higher-order, SMLP, CNLS, arbitrary activation functions, arbitrary connectivity, etc. * Advanced algorithms for all of the above: calculation of the Hessian, optimization of the Jacobian, etc. * Support vector machines with a generalized version of SMO that handles regression and kernel caching. * Advanced optimization routines with options for line search procedures. * Plus many other features...
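As background to the support vector machine component: at prediction time such a classifier evaluates a kernel expansion f(x) = sum_i alpha_i*y_i*K(x_i, x) + b and returns sign(f(x)). The C fragment below sketches that decision function with an RBF kernel; it is only an illustration of the underlying computation, and the type and function names are invented here -- they are not the actual API of NODElib (or of LIBSVM, announced above).

  #include <math.h>
  #include <stddef.h>

  /* Invented container for a trained SVM: support vectors, their
     coefficients alpha_i*y_i, the bias b, and the RBF width gamma. */
  typedef struct {
      size_t n_sv;        /* number of support vectors             */
      size_t dim;         /* input dimensionality                  */
      const double *sv;   /* n_sv x dim support vectors, row-major */
      const double *coef; /* alpha_i * y_i for each support vector */
      double bias;        /* b                                     */
      double gamma;       /* RBF kernel width                      */
  } svm_model_sketch;

  /* K(u, v) = exp(-gamma * ||u - v||^2) */
  static double rbf_kernel(const double *u, const double *v,
                           size_t dim, double gamma)
  {
      double d2 = 0.0;
      size_t i;
      for (i = 0; i < dim; i++) {
          double d = u[i] - v[i];
          d2 += d * d;
      }
      return exp(-gamma * d2);
  }

  /* Decision function: sign( sum_i coef_i * K(sv_i, x) + b ). */
  int svm_classify_sketch(const svm_model_sketch *m, const double *x)
  {
      double f = m->bias;
      size_t i;
      for (i = 0; i < m->n_sv; i++)
          f += m->coef[i] * rbf_kernel(m->sv + i * m->dim, x, m->dim, m->gamma);
      return (f >= 0.0) ? +1 : -1;
  }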
An overview of NODElib can be found at: http://www.neci.nj.nec.com/homepages/flake/nodelib/html/ And the actual library itself can be downloaded from: http://www.neci.nj.nec.com/homepages/flake/nodelib.tgz Best, -- GWF -- Gary William Flake, Ph.D. flake at research.nj.nec.com NEC Research Institute http://www.neci.nj.nec.com/homepages/flake/ 4 Independence Way voice: 609-951-2795 Princeton, NJ 08540 fax : 609-951-2488 =============================================================================== The Computational Beauty of Nature http://mitpress.mit.edu/books/FLAOH/cbnhtml/ From goodman at unr.edu Mon May 8 10:01:31 2000 From: goodman at unr.edu (goodman@unr.edu) Date: Mon, 8 May 2000 07:01:31 -0700 (PDT) Subject: Postdoc in Spike-Coding Message-ID: ********* POSTDOCTORAL POSITION IN SPIKE-CODING NEOCORTICAL MODELS ********* Philip H. Goodman Henry Markram Sushil J. Louis Weizmann Institute of Science University of Nevada, Reno Rehovot, Israel Applications are invited for a postdoctoral research fellowship in the field of large-scale biologically realistic models of cortical microcircuit dynamics. Location: Reno/Lake Tahoe with periodic work at the Weizmann Institute Funding: Negotiable, depending upon experience Dates: Available now; duration 2-3 years Deadline: Open until filled Qualifications (*all of the following*): 1. Ph.D. in computational modeling, neuroscience, or cognitive science 2. Strong mathematical background 3. Substantial modeling experience using GENESIS, NEURON, or SURF-HIPPO 4. Demonstrable programming ability in C++ 5. Familiarity with basic statistical analyses 6. Familiarity with machine learning & artificial neural network concepts 7. Willingness to commit at least two full years to the program Description: The purpose of this program is to address a major gap in our conceptual understanding of synaptic and brain-like network dynamics, and to benefit from untapped technological applications of related pulse-coding information networks. The core activity involves the design and implementation of increasingly complex and powerful brain-like simulations on parallel-distributed "Beowulf" computer systems, incorporating newly discovered excitatory and inhibitory parameters obtained from living tissue. We will use this technology to address the following questions: > What minimal microcircuit must be replicated to create a functional cortical column? > How many such columns must interact to demonstrate emergent behavior -- can we crack the "neural code"? > Can one "lesion" such models to evaluate putative therapies for brain disorders such as Alzheimer's disease, stroke, and epilepsy? We will also compare generalization abilities of "brain-wise" computation to existing artificial neural network and traditional non-neural classifiers. The fellow will work closely with faculty, a full-time PhD candidate in computer engineering, and other students. Reno is located at an elevation of 4,000 feet at the base of the Sierra Nevada mountain range, with outstanding year-round weather. Reno is only 45 minutes away from powder-skiing, boating, and hiking near Lake Tahoe, and 3 hours from San Francisco by car. Cost of living is estimated at only 65% that of the San Francisco region, and Nevada has no state income tax.
Inquiries: Fully qualified individuals should send all of the following in order to initiate consideration: (1) a short letter stating interest and summarizing qualifications, (2) a CV, and (3) names, phone numbers, and email addresses of three references to: Philip H. Goodman, MD, MS, Washoe Medical Center, 77 Pringle Way, Reno, NV 89502. Email inquiries to goodman at unr.edu are encouraged; please use one of the following formats: plain text, unencoded postscript, MS Word, Adobe Acrobat. ***************************************************************************** From ash at isical.ac.in Fri May 5 18:09:54 2000 From: ash at isical.ac.in (Ashish Ghosh) Date: Sat, 06 May 2000 03:39:54 +0530 Subject: New book Message-ID: <391346B2.F178061D@isical.ac.in> I am happy to announce the publication of the following book: "Soft Computing for Image Processing" by Sankar K. Pal, Ashish Ghosh and Malay K. Kundu (Eds.) from Physica-Verlag, Heidelberg, New York. The book is available at http://www.springer.de/cgi-bin/search_book.pl?isbn=3-7908-1268-4 Thanks, Ashish Ghosh ================================================================ Soft Computing for Image Processing Pal, S.K., Indian Statistical Institute, Calcutta, India Ghosh, A., Indian Statistical Institute, Calcutta, India Kundu, M.K., Indian Statistical Institute, Calcutta, India (Eds.) 2000. XVIII, 590 pp. 309 figs., 73 tabs. The volume provides a collection of 21 articles containing new material and describing, in a unified way with extensive real life applications, the merits and significance of performing different image processing/analysis tasks in soft computing paradigm. The articles, written by leading experts all over the world, demonstrate the various ways the fuzzy logic, artificial neural networks, genetic algorithms and fractals can be used independently and in integrated manner to provide efficient and flexible information processing capabilities in a stronger computational paradigm for handling the tasks like filtering, edge detection, segmentation, compression , classification, motion estimation, character regognition and target identification. Application domain includes, among others, data mining, computer vision, pattern recognition and machine learning, information technology, remote sensing, forensic investigation, video abstraction and knowledge based systems. Keywords: Image Processing, Image Analysis, Soft Computing Contents: S.K. Pal, A. Ghosh, M.K. Kundu: Soft Computing and Image Analysis: Features, Relevance and Hybridization.- Preprocessing and Feature Extraction: F.Russo: Image Filtering Using Evolutionary Neural Fuzzy Systems.- T. Law, D. Shibata, T. Nakamura, L. He, H. Itoh: Edge Extraction Using Fuzzy Reasoning.- S.K. Mitra, C.A. Murthy, M.K. Kundu: Image Compression and Edge Extraction Using Fractal Technique and Genetic Algorithms.- S. Mitra, R. Castellanos, S.-Y. Yang, S. Pemmaraju: Adaptive Clustering for Efficient Segmentation and Vector Quantization of Images.- B. Uma Shankar, A. Ghosh, S.K. Pal: On Fuzzy Thresholding of Remotely Sensed Images.- W. Skarbek: Image Compression Using Pixel Neural Networks.- L He, Y. Chao, T. Nakamura, H. Itho: Genetic Algorithm and Fuzzy Reasoning for Digital Image Compression Using Triangular Plane Patches.- N B. Karayiannis, T.C. Wang: Compression of Digital Mammograms Using Wavelets and Fuzzy Algorithms for Learning Vector Quantization.- V.D. Ges: Soft Computing and Image Analysis.- J.H. Han, T.Y. Kim, L.T. Kczy: Fuzzy Interpretation of Image Data.- Classification: M. 
Grabisch: New Pattern Recognition Tools Based on Fuzzy Logic for Image Understanding.- N.K. Kasabov, S.I. Israel, B.J. Woodford: Adaptive, Evolving, Hybrid Connectionist Systems for Image Pattern Recognition.- P.A. Stadter, N.K Bose: Neuro-Fuzzy Computing: Structure, Performance Measure and Applications.- K. D. Bollacker, J. Ghosh: Knowledge Reuse Mechanisms for Categorizing Related Image Sets.- K. C. Gowda, P. Nagabhushan, H.N. Srikanta Prakash: Symbolic Data Analysis for Image Processing.- Applications: N.M. Nasrabadi, S. De, L.-C. Wang, S. Rizvi, A. Chan: The Use of Artificial Neural Networks for Automatic Target Recognition.- S. Gutta, H. Wechsler: Hybrid Systems for Facial Analysis and Processing Tasks.- V. Susheela Devi, M. Narasimha Murty: Handwritten Digit Recognition Using Soft Computing Tools.- T.L. Huntsburger, J.R. Rose, D. Girard: Neural Systems for Motion Analysis: Single Neuron and Network Approaches.- H.M. Kim, B. Kosko: Motion Estimation and Compensation with Neural Fuzzy Systems. -- ************************************** * Dr. Ashish Ghosh, Ph.D., M.Tech. * * Associate Professor * * Machine Intelligence Unit * * Indian Statistical Institute * * 203 B. T. Road * * Calcutta 700 035, INDIA * * E-mail : ash at isical.ac.in * * ashishghosgisi at hotmail.com * * Fax : +91-33-577-6680/3035 * * Tel:+91-33-577-8085 ext.3110 (Off) * * +91-33-528-2399 (Res) * * URL: http://www.isical.ac.in/~ash * * ICQ: 48125276 * ************************************** From sml at essex.ac.uk Tue May 9 10:45:49 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Tue, 9 May 2000 15:45:49 +0100 Subject: Algorithm evaluation over the Internet (papers available) Message-ID: <8935BFF68E96D3119C91009027D3A56A4A75A8@sernt14.essex.ac.uk> Dear All, We've been developing a system for automatically evaluating algorithms over the Internet - in particular this is geared towards evaluation of image/signal processing and pattern recognition algorithms. There are many aims of the system but the main one is to make life easier for the algorithm developer by automating the evaluation process and by ensuring that the results produced are objective and authentic. There's a paper due to appear in ICPR-2000, and you can also get a preprint of a more complete paper submitted to the International Journal of Document Analysis and Recognition (see below). The website is in its early testing phase and can be found at http://algoval.essex.ac.uk Most of the site is being mirrored for the moment at http://ace.essex.ac.uk so try that if the first one fails. The results section shows how potentially informative and effortless evaluation can be when done this way (see trainable ROI results, or dictionary word recognition for example). Also, if there's sufficient interest, I wondered if anyone would be interested in helping organise a NIPS workshop on this kind of stuff. Comments welcome. Best Regards, Simon Lucas Papers http://algoval.essex.ac.uk/papers/icpr2000.ps title: Automatic evaluation of algorithms over the Internet authors: Simon Lucas and Kostas Sarampalis to appear: Proceedings ICPR 2000 abstract: This paper describes a system for the automatic evaluation of algorithms (especially pattern recognition algorithms) over the Internet. We present the case for such a system, discuss the system requirements and potential users and present an initial prototype. We illustrate usage of the system with an evaluation of image distance measures used for face recognition.
http://algoval.essex.ac.uk/papers/ijdar.ps title: Automatic evaluation of document image analysis algorithms over the Internet Simon Lucas, Paul Beattie and Joanne Coy Submission for IJDAR special issue on performance evaluation Abstract: We have implemented a system for automatically evaluating document image analysis algorithms on datasets over the Internet. In this paper we discuss some general issues related to this mode of evaluation and describe in particular the set-up of a common document image processing problem: locating regions of interest within an image. Traditionally, the evaluation of these and other document processing algorithms has usually been done in a rather ad hoc manner, in most cases by the developers of the algorithms. By contrast, our system makes the evaluation process simple, consistent, objective and automatic. The system also provides detailed log files to give useful feedback to algorithm developers. ------------------------------------------------ Dr. Simon Lucas Department of Electronic Systems Engineering University of Essex Colchester CO4 3SQ United Kingdom Tel: (+44) 1206 872935 Fax: (+44) 1206 872900 Email: sml at essex.ac.uk http://esewww.essex.ac.uk/~sml ------------------------------------------------- From stephen-m at uk2.net Tue May 9 06:35:51 2000 From: stephen-m at uk2.net (Stephen McGlinchey) Date: Tue, 9 May 2000 11:35:51 +0100 Subject: Thesis available - Transformation-Invariant Topology Preserving Maps Message-ID: <004301bfb9a2$5a1aa780$d363bf92@cis0-75s.staff.paisley.ac.uk> The following PhD thesis is available from http://cis.paisley.ac.uk/mcgl-ci0/ "Transformation-Invariant Topology Preserving Maps" Stephen J. McGlinchey (March 2000) Abstract This thesis investigates the use of unsupervised learning in the context of artificial neural networks to determine filters of data sets that exhibit invariances of one sort or another. The artificial neural networks in this thesis are all of a general type known as topology-preserving mappings. Topology-preserving maps have been of great interest in the computational intelligence community since they were devised in the 1980s. These are methods of mapping high dimensional data to a space of smaller dimensionality, whilst preserving the topographic structure of the data, at least to some degree. Such models have been successfully used in applications such as speech processing, robotics, data visualisation and computer vision, to name but a few. Apart from the many engineering applications of topology preservation, they have also been of biological interest since ensembles of neurons in biological brains have similar properties in that neurons that are close together in certain parts of the brain respond similarly to input data. The specific contributions of this thesis are: 1. An investigation of matrix constraints to preserve topological relations in neural network algorithms, which have previously only used orthonormality as a constraint. The previous algorithms are then seen as special cases of our new algorithms. 2. The development of a topology-preserving map network that ignores the magnitude of input data and responds to its radial location. The organised mappings are able to reliably classify data where the magnitude of the data has little or no bearing on which class they belong to. For example, voiced phonemes were classified with amplitude invariance, i.e. regardless of the volume of the speech. 3.
A novel neural network method based on Kohonen's self-organising map (Kohonen, 1997) algorithm, but combining it with a principal component analysis network to give a set of local principal components which globally cover the data set with a smooth ordering. The resulting filters are transformation invariant for some simple transformations. Stephen McGlinchey Dept. of Computing & Information Systems University of Paisley High Street Paisley PA1 2BE Scotland email: stephen-m at uk2.net fax: +44 141 848 3542 http://cis.paisley.ac.uk/mcgl-ci0/ From nnsp2000 at ee.usyd.edu.au Tue May 9 22:33:38 2000 From: nnsp2000 at ee.usyd.edu.au (NNSP 2000) Date: Wed, 10 May 2000 12:33:38 +1000 Subject: Postdoctoral position biomedical signal-processing and neuroimaging Message-ID: <001001bfba29$dc714720$581b4e81@ee.usyd.edu.au.pe088> Postdoctoral position biomedical signal-processing and neuroimaging ------------------------------------------------------------------- The Computational Neuroscience Group of the Laboratory of Neuro- and Psychophysiology, Medical School of the Catholic University of Leuven, Belgium (http://simone.neuro.kuleuven.ac.be), invites applications for a post-doctoral position in the area of biomedical signal-processing and neuroimaging (functional Magnetic Resonance Imaging). Desired profile: The highly qualified applicant should possess a Ph.D. degree in the field of signal-processing, image-processing, statistics, or neural networks. He/she should be familiar with Principal Components Analysis (PCA), Independent Components Analysis (ICA), projection pursuit, or related techniques, and have a profound knowledge of both uni-variate statistics, such as t-tests, F-tests, and multi-variate statistics, such as ANOVA, ANCOVA, and MANCOVA. Programming skills are an asset (C, Matlab, ...), as is familiarity with UNIX and PC platforms. We offer: 1) A challenging research environment. The applicant will have access to data from state-of-the-art Magnetic Resonance scanners and advanced statistical tools such as SPM (Statistical Parametric Mapping) for examining brain activity in both humans and monkeys. 2) An attractive income. The applicant will receive 2150 USD or 2375 Euro per month, including full social security coverage and housing. This is comparable to the salary of an associate professor at the University. Housing will be taken care of by the host institute. 3) Free return airline ticket, economy class (maximum 1350 USD or 1500 Euro), and a reimbursement of all costs incurred for shipping luggage to Belgium (maximum 900 USD or 1000 Euro). Send your CV (including the names and contact information of three references), bibliography and how to contact you by mail/fax/email/phone to: Prof. Dr. Marc M.
Van Hulle K.U.Leuven Laboratorium voor Neuro- en Psychofysiologie Faculteit Geneeskunde Campus Gasthuisberg Herestraat 49 B-3000 Leuven Belgium Phone: + 32 16 345961 Fax: + 32 16 345993 E-mail: marc at neuro.kuleuven.ac.be URL: http://simone.neuro.kuleuven.ac.be From jcheng at cs.ualberta.ca Thu May 11 14:52:59 2000 From: jcheng at cs.ualberta.ca (J Cheng) Date: Thu, 11 May 2000 12:52:59 -0600 Subject: SOFTWARE ANNOUNCEMENT (Bayesian Network Learning & Data Mining) Message-ID: <005f01bfbb7a$21b88bd0$211c8081@cs.ualberta.ca> Dear Colleagues, A software package for Bayesian belief network (BN) learning & data mining is now available for free download at: http://www.cs.ualberta.ca/~jcheng/bnsoft.htm The package includes two applications for Windows 95/98/NT/2000: BN PowerConstructor: An efficient system for learning BN structures and parameters from data. Constantly updated since 1997. BN PowerPredictor: An extension of BN PowerConstructor for learning unrestricted BN & Bayes multi-net based classifiers. Best regards, Jie Cheng Dept. of Computing Science, Univ. of Alberta email: jcheng at cs.ualberta.ca ############################################################################ Features of BN PowerPredictor: * Graphical Bayesian network editor for modifying BN classifiers' structure. * Wrapper algorithm. The system can automatically learn classifiers of different types and different complexities and choose the best performer. * Feature subset selection. The system can perform feature subset selection automatically. * Two inference modes. The classification can be performed in either batch mode (data set based) or interactive mode (instance based). * Supporting domain knowledge. * Supporting misclassification cost table definition. Sample performance of BN PowerPredictor on UCI ML data sets (Pentium II-300):

Data set   Cases (Train/Test)   Attr.   Running time (training)   Prediction accuracy
Adult      32561/16281          13      25 min.                   86.33+-0.53
Nursery    8640/4320            8       1 min.                    97.13+-0.50
Mushroom   5416/2708            22      9 min.                    100
Chess      2130/1066            36      3 min.                    96.44+-1.11
DNA        2000/1186            60      17 min.                   96.63+-1.03

* The confidence level is 95%. The system performs 300-1000 classifications per second. From ingber at ingber.com Fri May 12 07:58:21 2000 From: ingber at ingber.com (Lester Ingber) Date: Fri, 12 May 2000 06:58:21 -0500 Subject: Programmer/Analyst Position Message-ID: <20000512065821.A14116@ingber.com> Programmer/Analyst in Computational Finance. DRW Investments [www.drwtrading.com], 311 S Wacker Dr Ste 900, Chicago IL 60606. At least 2-3 years combined experience programming in C/C++, Visual Basic/VBA and Java, as well as financial industry experience with a financial institution. Must have an excellent background in Physics, Math, or similar disciplines, PhD preferred. Needs practical knowledge of methods of field theory and stochastic differential equations, as well as an understanding of derivatives pricing models, bond futures and modeling of financial indices. The position will be primarily dedicated to developing and coding algorithms for automated trading. Flexible hours in an intense environment. Requires strong commitment to several ongoing projects with shifting priorities. See www.ingber.com for some papers on current projects. Please email Lester Ingber ingber at drwtrading.com a resume regarding this position.
-- Lester Ingber http://www.ingber.com/ PO Box 06440 Wacker Dr PO Sears Tower Chicago IL 60606-0440 http://www.alumni.caltech.edu/~ingber/ From stephen at computing.dundee.ac.uk Mon May 15 10:01:09 2000 From: stephen at computing.dundee.ac.uk (Stephen McKenna) Date: Mon, 15 May 2000 15:01:09 +0100 Subject: New Book - Dynamic Vision: From Images to Face Recognition Message-ID: <035901bfbe76$05823160$26222486@lagavulin.dyn.computing.dundee.ac.uk> Dear colleagues, We are pleased to announce the publication of the following book: "Dynamic Vision: From Images to Face Recognition" by Shaogang Gong, Stephen J McKenna and Alexandra Psarrou Imperial College Press (World Scientific Publishing), ISBN 1-86094-181-8, 344 pp. Further details are available at http://www.computing.dundee.ac.uk/staff/stephen/book.html The book can be obtained from: http://www.worldscientific.com/books/bookshop.html or http://www.amazon.com/exec/obidos/ASIN/1860941818/qid%3D957267900/sr%3D1-2/102-5354817-5263206 Sincerely, Stephen McKenna Department of Applied Computing University of Dundee DD1 4HN Tel.: +44 (0)1382 344732 Fax.: +44 (0)1382 345509 ================ [ Table of contents inserted by Connectionists moderator: ] Contents PART I BACKGROUND 1 About Face The Visual Face, The Changing Face, Computing Faces, Biological Perspectives, The Approach 2 Perception and Representation A Distal Object, Representation by 3D Reconstruction, Two-dimensional View-based Representation, Image Template-based Representation, The Correspondence Problem and Alignment, Biological Perspectives, Discussion 3 Learning Under Uncertainty Statistical Learning, Learning as Function Approximation, Bayesian Inference and MAP Classification, Learning as Density Estimation, Unsupervised Learning without Density Estimation, Linear Classification and Regression, Non-linear Classification and Regression, Adaptation, Biological Perspectives, Discussion PART II FROM SENSORY TO MEANINGFUL PERCEPTION 4 Selective Attention: Where to Look Pre-attentive Visual Cues from Motion, Learning Object-based Colour Cues, Perceptual Grouping for Selective Attention, Data Fusion for Perceptual Grouping, Temporal Matching and Tracking, Biological Perspectives, Discussion 5 A Face Model: What to Look For Person-independent Face Models for Detection, Modelling the Face Class, Modelling a Near-face Class, Learning a Decision Boundary, Perceptual Search, Biological Perspectives, Discussion 6 Understanding Pose Feature and Template-based Correspondence, The Face Space across Views: Pose Manifolds, Template Matching as Affine Transformation, Similarities to Prototypes across Views, Learning View-based Support Vector Machines, Biological Perspectives, Discussion 7 Prediction and Adaptation Temporal Observations, Propagating First-order Markov Processes, Kalman Filters, Propagating Non-Gaussian Conditional Densities, Tracking Attended Regions, Adaptive Colour Models, Selective Adaptation, Tracking Faces, Pose Tracking, Biological Perspectives, Discussion PART III MODELS OF IDENTITY 8 Single-View Identification Identification Tasks, Nearest-neighbour Template Matching, Representing Knowledge of Facial Appearance, Statistical Knowledge of Facial Appearance, Statistical Knowledge of Identity, Structural Knowledge: The Role of Correspondence, Biological Perspectives, Discussion 9 Multi-View Identification View-based Models, The Role of Prior Knowledge, View Correspondence in Identification, Generalisation from a Single View, Generalisation from Multiple Views, Biological 
Perspectives, Discussion 10 Identifying Moving Faces Biological Perspectives, Computational Theories of Temporal Identification, Identification using Holistic Temporal Trajectories, Identification by Continuous View Transformation, An Experimental System, Discussion PART IV PERCEPTION IN CONTEXT 11 Perceptual Integration Sensory and Model-based Vision, Perceptual Fusion, Perceptual Inference, Vision as Co-operating Processes, Biological Perspectives, Discussion 12 Beyond Faces Multi-modal Identification, Visually Mediated Interaction, Visual Surveillance and Monitoring, Immersive Virtual Reality, Visual Database Screening PART V APPENDICES A Databases Database Acquisition and Design, Acquisition of a Pose-labelled Database, Benchmarking, Commercial Databases, Public Domain Face Databases, Discussion B Commercial Systems System Characterisation, A View on the Industry, Discussion C Mathematical Details Principal Components Analysis, Linear Discriminant Analysis, Gaussian Mixture Estimation, Kalman Filters, Bayesian Belief Networks, Hidden Markov Models, Gabor Wavelets Bibliography Index 344 pp. From steve at cns.bu.edu Mon May 15 22:32:33 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Mon, 15 May 2000 22:32:33 -0400 Subject: The Imbalanced Brain: From Normal Behavior to Schizophrenia Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg in HTML, PDF, and Gzipped postscript: Grossberg, S. (2000). The imbalanced brain: From normal behavior to schizophrenia. Biological Psychiatry, in press. Preliminary version appears as Boston University Technical Report CAS/CNS TR-99-018. ABSTRACT: An outstanding problem in psychiatry concerns how to link discoveries about the pharmacological, neurophysiological, and neuroanatomical substrates of mental disorders to the abnormal behaviors that they control. A related problem concerns how to understand abnormal behaviors on a continuum with normal behaviors. During the past few decades, neural models have been developed of how normal cognitive and emotional processes learn from the environment, focus attention and act upon motivationally important events, and cope with unexpected events. When arousal or volitional signals in these models are suitably altered, they give rise to symptoms that strikingly resemble negative and positive symptoms of schizophrenia, including flat affect, impoverishment of will, attentional problems, loss of a theory of mind, thought derailment, hallucinations, and delusions. The present article models how emotional centers of the brain, such as the amygdala, interact with sensory and prefrontal cortices (notably ventral, or orbital, prefrontal cortex) to generate affective states, attend to motivationally salient sensory events, and elicit motivated behaviors. Closing this feedback loop between cognitive and emotional centers is predicted to generate a cognitive-emotional resonance that can support conscious awareness. When such emotional centers become depressed, negative symptoms of schizophrenia emerge in the model. Such emotional centers are modeled as opponent affective processes, such as fear and relief, whose response amplitude and sensitivity are calibrated by an arousal level and chemical transmitters that slowly inactivate, or habituate, in an activity-dependent way. These opponent processes exhibit an Inverted-U whereby behavior become depressed if the arousal level is chosen too large or too small. 
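(For readers unfamiliar with habituative transmitter gates: a standard form used in Grossberg's gated-dipole opponent circuits, given here only as background and not necessarily as the exact equations of this paper, lets a signal S be multiplied by a transmitter z obeying

\[ \frac{dz}{dt} = A\,(B - z) - C\,S\,z, \qquad T = S\,z , \]

where T is the gated output. The transmitter recovers toward its maximum B at rate A and is depleted in proportion to the signal it gates, so a sustained input habituates the channel; in the full opponent circuit, with the arousal level feeding both the on- and off-channels, the net response first grows and then collapses as arousal increases, which is the Inverted-U referred to above.)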
The negative symptoms are due to the way in which the depressed opponent process interacts with other circuits throughout the brain. Keywords: schizophrenia, arousal, prefrontal cortex, amygdala, opponent process, neural networks From bvr at stanford.edu Mon May 15 18:45:05 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Mon, 15 May 2000 15:45:05 -0700 Subject: REMINDER: CALL FOR PAPERS -- NIPS*2000 Message-ID: <4.2.0.58.20000515154436.00d40d60@bvr.pobox.stanford.edu> CALL FOR PAPERS -- NIPS*2000 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Nov. 27 -- Saturday, Dec. 2, 2000 Denver, Colorado ========================================== This is the fourteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Nov. 27), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 1-2). Tutorials will include: Population Codes (Richard Zemel, U. of Toronto), Linking Brain to Behavior (Stephen Grosberg, Boston U.), Markov Chain Monte Carlo (Andrew Gelman, Columbia U.), Visual Attention (Harold Pashler, UCSD) and more! Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. A special area of emphasis this year is innovative applications of neural computation. Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, localized basis functions, mixture models, committee models, belief networks, graphical models, support vector machines, Gaussian processes, topographic maps, decision trees, factor analysis, principal component analysis and extensions, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, information retrieval, web and network applications, intrusion detection, fraud detection, bio-informatics, medical diagnosis, image processing and analysis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music, video and artistic applications, animation, virtual environments, learning dynamical systems. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, conditioning, human learning and memory, attention, language, natural language, reasoning, spatial cognition, emotional cognition, conceptual representation, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, optical neurocomputing systems, novel neurodevices, computational sensors and actuators, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, experimental data relevant to computational issues. 
Reinforcement Learning and Control: exploration, planning, navigation, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, Markov decision processes. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory, multi-agent learning. Visual Processing: image processing, image coding, object recognition, visual psychophysics, stereopsis, motion detection and tracking. ---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within a 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are required to use the NIPS LaTeX style files obtainable by anonymous FTP at the site given below. THE STYLE FILES HAVE BEEN UPDATED; please make sure that you use the current ones and not previous versions. Submission Instructions: NIPS has migrated to electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We are only accepting postscript manuscripts. No pdf files will be accepted this year. The electronic submission page will be available on April 28, 2000. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT MAY 19, 2000 PACIFIC DAYLIGHT TIME (08:00 GMT May 20). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS Copies of the style files are also available via anonymous ftp at ftp.cs.cmu.edu (128.2.242.152) in /afs/cs/Web/Groups/NIPS/formatting. For general inquiries or requests for registration material, send e-mail to nipsinfo at salk.edu or fax to (619)587-0417. NIPS*2000 Organizing Committee: General Chair, Todd K. 
Leen, Oregon Graduate Institute; Program Chair, Tom Dietterich, Oregon State University; Publications Chair, Volker Tresp, Siemens AG; Tutorial Chair, Mike Mozer, University of Colorado; Workshops Co-Chairs, Rich Caruana, Carnegie Mellon University, Virginia de Sa, Sloan Center for Theoretical Neurobiology; Publicity Chair, Benjamin Van Roy, Stanford University; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Doug Baker and Alex Gray, Carnegie Mellon University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. NIPS*2000 Program Committee: Leon Bottou, AT&T Labs - Research; Tom Dietterich, Oregon State University (chair); Bill Freeman, Mitsubishi Electric Research Lab; Zoubin Ghahramani, University College London; Dan Hammerstrom, Oregon Graduate Institute; Thomas Hofmann, Brown University; Tommi Jaakkola, MIT; Sridhar Mahadevan, Michigan State University; Klaus Obermeyer, TU Berlin; Manfred Opper, Aston University; Yoram Singer, Hebrew University of Jerusalem; Malcolm Slaney, Interval Research; Josh Tenenbaum, Stanford University; Sebastian Thrun, Carnegie Mellon University. PAPERS MUST BE SUBMITTED BY MAY 19, 2000 From gluck at pavlov.rutgers.edu Tue May 16 17:04:04 2000 From: gluck at pavlov.rutgers.edu (Mark A. Gluck) Date: Tue, 16 May 2000 17:04:04 -0400 Subject: Neural Net Programming/Research Job at Rutgers-Newark Neuroscience Message-ID: Neural Net Programming/Research Job at Rutgers-Newark Neuroscience The Gluck and Myers labs Rutgers-Newark have a part-time position available for a computer programmer. We are looking for someone with very strong programming skills who is capable of independent work. Hours and salary will be by arrangement, commensurate with experience and assigned work; there is also the possibility of course credits for independent study in computer science, psychology or neurobiology. We are looking for someone who can contribute to one or more of the following projects. Prior coursework dealing with brain systems and learning is not required for any of these projects, although an interest in brain science would be desirable. For more information on our research and the Memory Disorders Project at Rutgers-Newark, see the web pages listed at bottom of email. There is the training potential in this job to learn more about NN models, the neurobio of learning and memory, and the cognitive neuroscience of memory, etc. Significant contributions to research will be acknowledged by author-credit on on academic research papers. An ideal candidate might be someone who is looking to do a year or two of research and work in a research laboratory before applying to graduate school in a related area. Work duties will span three projects: 1) Computational neuroscience. We develop neural network models of the brain and learning, focusing on the role of specific brain structures such as the hippocampus and basal forebrain. We are looking for a programmer with a strong background in C or C++ with some experience working with neural networks to modify and extend existing code; experience with Unix systems and the Solaris operating system is essential. 2) Behavioral test development. To test our computational models, we perform behavioral tests in normal people and people with various memory impairments. These tests take the form of computerized "games". We are looking for a programmer with experience using object-oriented languages to implement new tests. 
Currently, our tests are written in SuperCard and SuperLab languages for Macintosh. Familiarity with these languages is not essential, but the successful applicant will be prepared to learn them. Some prior experience with Macintosh computers is essential. 3) Applications programming. We have several existing behavioral tests, programmed for the Macintosh, which need to be reprogrammed to run under Microsoft Windows using a platform such as Visual Basic, Visual C++ or MatLab. Strong experience with one of these platforms and with Windows is essential. If interested, please email both gluck at pavlov.rutgers.edu and myers at pavlov.rutgers.edu with information on your background experience, relevant skills, and future career goals. Please give emails for three people who can write letters of recommendation for you. Mark A. Gluck Associate Professor of Neuroscience Catherine E. Myers Assistant Professor of Psychology _______________________________________________________________ Dr. Mark A. Gluck, Associate Professor Center for Molecular and Behavioral Neuroscience Phone: (973) 353-1080 x3221 Rutgers University Fax: (973) 353-1272 197 University Ave. Newark, New Jersey 07102 Email: gluck at pavlov.rutgers.edu WWW Homepages: Research Lab: http://www.gluck.edu Rutgers Memory Disorders Project: http://www.memory.rutgers.edu ______________________________________________________________ From skremer at q.cis.uoguelph.ca Wed May 17 14:11:06 2000 From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer) Date: Wed, 17 May 2000 14:11:06 -0400 (EDT) Subject: No subject Message-ID: Dear Connectionists: We are in the process of trying to organize a competition involving using unlabeled data for supervised learning (similar to previous competitions on learning time-series and grammars). If this sounds like something you might be interested in, please check out the web-page at: http://q.cis.uoguelph.ca/~skremer/NIPS2000 Thanks, -Stefan -- Dr. Stefan C. Kremer, Assistant Prof., Dept. of Computing and Information Science University of Guelph, Guelph, Ontario N1G 2W1 WWW: http://hebb.cis.uoguelph.ca/~skremer Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323 E-mail: skremer at snowhite.cis.uoguelph.ca From cristina at idsia.ch Thu May 18 06:47:16 2000 From: cristina at idsia.ch (Cristina Versino) Date: Thu, 18 May 2000 12:47:16 +0200 Subject: Neuroinformatics for 'living' artefacts Message-ID: <200005181047.MAA02735@rapa.idsia.ch> The Information Society Technologies programme and the Quality of Life programme are launching a joint call for project proposals on "Neuroinformatics for 'living' artefacts". The call objective is to explore new synergies between Neurosciences and Information Technologies in order to enable the construction of hardware/software "artefacts that live and grow", i.e. artefacts that self-adapt and evolve beyond pure programming. Preference will be given to work that demonstrates adaptability and growth in the "real world" and that does not simply extrapolate from an already established research field (such as neural-networks or genetic algorithms). An Information Workshop on the call will take place in Brussels on the 9th of June. The workshop will feature: - An in-depth presentation of the call by European Commission staff. - To stimulate thinking and discussion, invited speakers will present topics of their choice that relate to the call. - _Stand up and speak_: anybody having registered can speak for 2 minutes to describe their interests and the kind of partnerships they are seeking. 
Updated information on the workshop: http://www.cordis.lu/ist/fetni-iw.htm Information on the call: http://www.cordis.lu/ist/fetni.htm http://www.cordis.lu/life/src/neuro.htm From ascoli at osf1.gmu.edu Fri May 19 12:25:28 2000 From: ascoli at osf1.gmu.edu (GIORGIO ASCOLI) Date: Fri, 19 May 2000 12:25:28 -0400 (EDT) Subject: Postdoc position available Message-ID: Please post and circulate as you see fit. Many thanks! COMPUTATIONAL NEUROSCIENCE POST-DOCTORAL POSITION AVAILABLE A post-doctoral position is available immediately for computational modeling of dendritic morphology, neuronal connectivity, and development of anatomically and physiologically accurate neural networks. All highly motivated candidates with a recent PhD (or expecting one in year 2000) in biology, computer science, or other scientific disciplines are encouraged to apply. Academic background and/or a strong interest in either computational science or neuroscience are required, as well as the ability and willingness to learn new techniques and concepts. Programming skills and/or experience with modeling packages are desirable but not necessary. Post-doc will join a young and dynamic research group at the Krasnow Institute for Advanced Study, located in Fairfax, VA (less than 20 miles west of Washington DC). The initial research project is focused on (1) the generation of complete neurons in virtual reality that reproduce accurately the experimental morphological data; and/or (2) the study of the influence of dendritic shape (geometry and topology) on the electrophysiological behavior. We have developed advanced software to build network models of entire regions of the brain (e.g. the rat hippocampus). Please refer to our website for further details: www.krasnow.gmu.edu/ascoli/CNG The post-doc will be hired as a Research Assistant Professor (with VA state employee benefits) with a salary based on the NIH postdoctoral scale, and will have full access to library and computing facilities both within the Krasnow Institute and George Mason University. Send CV, (p)reprints, a brief description of your motivation, and names, email addresses and phone/fax numbers of three references to: ascoli at gmu.edu (or by fax at the number below) ASAP. There is no deadline but the position will be filled as soon as a suitable candidate is found. Non-resident aliens are also welcome to apply. The Krasnow Institute is an equal opportunity employer. Giorgio Ascoli, PhD Head, Computational Neuroanatomy Group Krasnow Institute for Advanced Study at George Mason University, MS2A1 Fairfax, VA 22030 Ph. (703)993-4383 Fax (703)993-4325 From wolfskil at MIT.EDU Fri May 19 14:45:33 2000 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Fri, 19 May 2000 14:45:33 -0400 Subject: book announcement--Thornton Message-ID: A non-text attachment was scrubbed... Name: not available Type: text/enriched Size: 7454 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/39ea9e34/attachment.bin From michael at cs.unm.edu Fri May 19 19:17:27 2000 From: michael at cs.unm.edu (Zibulevsky Michael) Date: Fri, 19 May 2000 17:17:27 -0600 Subject: paper: Blind Source Separation by Sparse Decomposition Message-ID: Announcing a paper (revised version) ... Title: Blind Source Separation by Sparse Decomposition in a Signal Dictionary Authors: Michael Zibulevsky and Barak A. 
Pearlmutter Abstract The blind source separation problem is to extract the underlying source signals from a set of linear mixtures, where the mixing matrix is unknown. This situation is common, in acoustics, radio, medical signal and image processing, hyperspectral imaging, etc. We suggest a two-stage separation process. First, a priori selection of a possibly overcomplete signal dictionary (for instance a wavelet frame, or a learned dictionary) in which the sources are assumed to be sparsely representable. Second, unmixing the sources by exploiting the their sparse representability. We consider the general case of more sources than mixtures, but also derive a more efficient algorithm in the case of a non-overcomplete dictionary and an equal numbers of sources and mixtures. Experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques. URL of the ps file: http://ie.technion.ac.il/~mcib/spica12.ps.gz Contact: michael at cs.unm.edu, bap at cs.unm.edu From P.Dean at sheffield.ac.uk Wed May 17 05:25:28 2000 From: P.Dean at sheffield.ac.uk (Paul Dean) Date: Wed, 17 May 2000 10:25:28 +0100 Subject: Position available Message-ID: UNIVERSITY OF SHEFFIELD DEPARTMENT OF PSYCHOLOGY POSTDOCTORAL POSITION IN COMPUTATIONAL NEUROSCIENCE (R2012) Applications are invited for a 3 year post-doctoral research position to develop second-generation distributed models for the control of eye movements, with special reference to the role of the cerebellum. The project is supervised by Drs. Paul Dean and John Porrill. Applicants with an interest in the mathematical modelling of biological systems are welcomed. Experience with MATLAB is preferred but not essential. The post begins in October 2000. Salary: 16,286 - 20,811 pa. Closing date: 20 July 2000. For details of this post, email: jobs at sheffield.ac.uk or tel: 0114 222 1631 (24 hr). Please quote the post reference (R2012) in all enquiries. Vacancy Website: http://www.shef.ac.uk/jobs/ Informal enquiries to Paul Dean: email p.dean at sheffield.ac.uk, phone +44 (0)114 222 1631 (24 hr). ------------------------------------------------------------------------------ Dr. Paul Dean, Department of Psychology University of Sheffield, Sheffield S10 2TP England Phone: +44 (0)114 222 6521 Fax: +44 (0)114 276 6515 From jobs at thuris.com Sat May 20 00:09:47 2000 From: jobs at thuris.com (Thuris Corporation) Date: Fri, 19 May 2000 20:09:47 -0800 Subject: Thuris Corporation seeks senior scientists (two positions) Message-ID: Thuris Corporation (www.thuris.com) is a startup company applying innovative computer science methods to advanced neuroscience data. We offer extremely competitive compensation packages, as well as the opportunity to work in a thriving cutting-edge environment. We are headquartered in Newport Beach, California, and have extensive collaborative and cross-licensing agreements with the University of California, Irvine. Job Description: The individuals hired will work on a range of projects involving the computational analysis of brain data. Thuris currently has three products under development: the NeuroGraph, an EEG-based device for the diagnosis of Alzheimer's disease (see Forbes ASAP article, 5/29/'00); the BrainPrint system, a new method for in-depth analysis of the effects of pharmaceutical compounds in the brain, and RapidAging, an assay system for testing the effects of candidate neuroprotective drugs. 
All are patented or patent pending, and all are in advanced stages of development. Job responsibilities include designing and building applications software, diagnostic and classification software, and networking software. Specific responsibilities will vary by project, and may include data analysis, pattern recognition, statistical modeling, machine learning, and neural networks. Required qualifications: Bachelor's degree in computer science or related field, with experience in programming, pattern recognition, and advanced statistical tools. Ability to interact well with co-workers. Proficiency in C, C++, object-oriented programming, and familiarity with some analysis tools, e.g., MATLAB, etc. Preferred qualifications: Master's degree or Ph.D. in computer science or related field, background in neuroscience, experience in designing large statistical or pattern recognition systems. To apply: Thuris Corporation Suite 1100 620 Newport Center Drive Newport Beach, CA 92660 Attn: Human Resources fax: 949-856-9036 email: jobs at thuris.com From john at dcs.rhbnc.ac.uk Mon May 22 05:37:15 2000 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Mon, 22 May 2000 10:37:15 +0100 (BST) Subject: NeuroCOLT workshop on Reinforcement Learning Message-ID: NeuroCOLT workshop on Reinforcement Learning, 12-16 July 2000 Cumberland Lodge Windsor Great Park Windsor England registration forms and more information on the website: http://www.neurocolt.org/reinforcement.html Summary Reinforcement learning is one of the most active research areas in artificial intelligence. In this approach to machine learning, an agent tries to maximize the total amount of reward it receives when interacting with a complex and uncertain environment. The analysis of this scenario is radically different from standard approaches to supervised learning, as many of the common assumptions do not hold. Recent advances in the theoretical analysis of this problem will be surveyed in a 3 days course by Michael Kearns, followed by invited talks by Jonathan Baxter, Chris Watkins, and some talks contributed by the participants in the workshop. (For more details see web site). Cost The inclusive cost per participant is GBP 406.00 The non-resident cost is GBP 250.00 this includes all meals except breakfast Accommodation Residential accommodation for the workshop is limited. A maximum of 23 rooms is available on the first two nights, but more are free thereafter. From dale at logos.math.uwaterloo.ca Mon May 22 15:10:03 2000 From: dale at logos.math.uwaterloo.ca (Dale Schuurmans) Date: Mon, 22 May 2000 15:10:03 -0400 (EDT) Subject: CFP: MLJ special issue Message-ID: <200005221910.PAA26155@newlogos.math.uwaterloo.ca> Call for Papers MACHINE LEARNING Journal Special Issue on NEW METHODS FOR MODEL SELECTION AND MODEL COMBINATION GUEST EDITORS: Yoshua Bengio, Universit de Montral Dale Schuurmans, University of Waterloo SUBMISSION DEADLINE: July 31, 2000 (electronic submission in pdf or postscript format) A fundamental tradeoff in machine learning and statistics is the under-fitting versus over-fitting dilemma: When inferring a predictive relationship from data one must typically search a complex space of hypotheses to ensure that a good predictive model is available, but must simultaneously restrict the hypothesis space to ensure that good candidates can be reliably distinguished from bad. 
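A toy illustration of this tradeoff (the data, model family, and train/held-out split below are assumed purely for illustration and are not part of the call): fitting polynomials of increasing degree to a small noisy sample drives training error down monotonically, while error on held-out data typically falls and then rises again; a simple selection rule then picks the degree with the lowest held-out error.

    # Illustrative sketch only: polynomial fits of increasing degree,
    # scored on a held-out split of assumed toy data.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 60)
    y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(x.size)  # noisy sample

    idx = rng.permutation(x.size)
    tr, te = idx[:40], idx[40:]                                 # train / held-out split

    scores = {}
    for degree in (1, 2, 3, 6, 10):
        coeffs = np.polyfit(x[tr], y[tr], degree)               # fit on training data only
        train_mse = float(np.mean((np.polyval(coeffs, x[tr]) - y[tr]) ** 2))
        test_mse = float(np.mean((np.polyval(coeffs, x[te]) - y[te]) ** 2))
        scores[degree] = test_mse
        print(f"degree={degree:2d}  train MSE={train_mse:.3f}  held-out MSE={test_mse:.3f}")

    print("selected degree:", min(scores, key=scores.get))      # crude model selection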
That is, the learning problem is fundamentally ill-posed; several functions might fit a given set of data but behave very differently on further data drawn from the same distribution. A classical approach to coping with this tradeoff is to perform "model selection" where one imposes a complexity ranking over function classes and then optimizes a combined objective of class complexity and data fit. In doing so, however, it would be useful to have an accurate estimate of the expected generalization error at each complexity level so that the function class with the lowest expected error could be selected, or functions from the classes with lowest expected error could be combined, and so on. Many approaches have been proposed for this purpose in both the statistics and the machine learning research communities. Recently in machine learning and statistics there has been renewed interest in techniques for evaluating generalization error, for optimizing generalization error, and for combining and selecting models. This is exemplified, for instance, by recent work on structural risk minimization, support vector machines, boosting algorithms, and the bagging algorithm. These new approaches suggest that better generalization performance can be obtained using new, broadly applicable procedures. Progress in this area has not only been important for improving our understanding of how machine learning algorithms can generalize effectively, it has already proven its value in real applications of machine learning and data analysis. We seek submissions that cover any of these new areas of predictive model selection and combination. We are particularly interested in papers that present current work on boosting, bagging, and Bayesian model combination techniques, as well as work on model selection, regularization, and other automated complexity control methods. Papers can be either theoretical or empirical in nature; our primary goal is to collect papers that shed new light on existing algorithms or propose new algorithms that can be shown to exhibit superior performance under identifiable conditions. The key evaluation criteria will be insight and novelty. This special issue Machine Learning follows from a successful workshop held on the same topic at the Universit de Montral in April, 2000. This workshop brought together several key researchers in the fields of machine learning and statistics to discuss current research issues on boosting algorithms, support vector machines, and model selection and regularization techniques. Further details about the workshop can be found at www.iro.umontreal.ca/~bengioy/crmworkshop2000. SUBMISSION INSTRUCTIONS: Papers should be sent by email to dale at cs.uwaterloo.ca by July 31, 2000. The preferred format for submission is PDF or Postscript. (Please be sure to embed any special fonts.) If electronic submission is not possible, then a hard copy can be sent to: Dale Schuurmans Department of Computer Science 200 University Avenue West University of Waterloo Waterloo, Ontario N2L 3G1 Canada (519) 888-4567 x6769 (for courier delivery) From leews at comp.nus.edu.sg Tue May 23 03:31:11 2000 From: leews at comp.nus.edu.sg (Lee Wee Sun) Date: Tue, 23 May 2000 15:31:11 +0800 (GMT-8) Subject: postdoc position available Message-ID: The computational learning theory group of the National University of Singapore is looking for a postdoc. The focus of the project funding the position is the search for theoretically principled, practical algorithms for machine learning. 
One of the main areas of interest of the project is on developing learning algorithms that work on the internet - for applications such as collaborative filtering, adaptive placement of web advertisements and information retrieval. The postdoc will be free to pursue independent research along these general lines, but will also have the opportunity to collaborate with the members of the group on other research areas (see http://www.comp.nus.edu.sg/~plong/nuscolt.html for a description of the research interests of the members of group). The starting date is somewhat flexible, but should be some time before September, 2000. The position runs for two years. If you are interested, please send your CV and the names of three references to leews at comp.nus.edu.sg by June 24, 2000. From terry at salk.edu Tue May 23 23:51:40 2000 From: terry at salk.edu (terry@salk.edu) Date: Tue, 23 May 2000 20:51:40 -0700 (PDT) Subject: NEURAL COMPUTATION 12:6 Message-ID: <200005240351.UAA12818@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 6 - June 1, 2000 ARTICLE Separating Style and Content with Bilinear Models Joshua B. Tenenbaum and William T. Freeman NOTES Relationships between the A Priori and a Posteriori Errors in Nonlinear Adaptive Neural Filters Danilo P. Mandic, and Jonathon A. Chambers The VC Dimension For Mixtures of Binary Classifiers Wenxin Jiang The Early Restart Algorithm Malik Magdon-Ismail and Amir F. Atiya LETTERS Attractor Dynamics in Feedforward Neural Networks Lawrence K. Saul and Michael I. Jordan Visualizing the Function Computed by a Feedforward Neural Network Tony Plate, Joel Bert, John Grace and Pierre Band Discriminant Pattern Recognition Using Transformation Invariant Neurons Diego Sona, Alessandro Sperduti, Antonina Starita Observable Operator Models for Discrete Stochastic Time Series Herbert Jaeger Adaptive Method of Realizing Natural Gradient Learning for Multilayer Perceptrons Shun-ichi Amari, Hyeyoung Park, and Kenji Fukumizu Nonmontonic Generalization Bias of Gaussian Mixture Models Shotaro Akaho and Hilbert J. Kappen Efficient Block Training of Multilayer Perceptrons A Navia-Vazquez and A. R. Figueiras-Vidal An Opimization Approach to Design of Generalized BSB Neural Associative Memories Jooyoung Park and Yonmook Park Nonholonomic Orthogonal Learning Algorithms for Blind Source Separation Shun-ichi Amari, Tian-Ping Chen, and Andrzej Cichocki ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $430 $460.10 $478 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From qian at brahms.cpmc.columbia.edu Thu May 25 19:09:20 2000 From: qian at brahms.cpmc.columbia.edu (Ning Qian) Date: Thu, 25 May 2000 19:09:20 -0400 Subject: stereo vision paper available Message-ID: <200005252309.TAA15264@brahms.cpmc.columbia.edu> Dear Connectionists, The following paper on modeling disparity attraction and repulsion is available at: http://brahms.cpmc.columbia.edu/publications/attract-repul.ps.gz It has 28 text pages and 11 figures. Best regards, Ning -------------------------------------------------------------- A Physiologically-Based Explanation of Disparity Attraction and Repulsion Samuel Mikaelian and Ning Qian, Vision Research (in press). 
Abstract Westheimer and Levi found that when a few isolated features are viewed foveally, the perceived depth of a feature depends not only on its own disparity but also on those of its neighbors. The nature of this interaction is a function of the lateral separation between the features: When the distance is small the features appear to attract each other in depth but the interaction becomes repulsive at larger distances. Here we introduce a two-dimensional extension of our recent stereo model based on the physiological studies of Ohzawa et al, and demonstrate through analyses and simulations that these observations can be naturally explained without introducing ad hoc assumptions about the connectivity between disparity-tuned units. In particular, our model can explain the distance-dependent attraction/repulsion phenomena in both the vertical-line configuration used by Westheimer, and the horizontal-line-and-point configuration used by Westheimer and Levi. Thus, the psychophysically observed disparity interaction may be viewed as a direct consequence of the known physiological organization of the binocular receptive fields. We also find that the transition distance at which the disparity interaction between features changes from attraction to repulsion is largely determined by the preferred spatial frequency and orientation distributions of the cells used in the disparity computation. This result may explain the observed variations of the transition distance among different subjects in the psychophysical experiments. Finally, our model can also reproduce the observed effect on the perceived disparity when the disparity magnitude of the neighboring features is changed. From stefan.wermter at sunderland.ac.uk Fri May 26 12:54:17 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Fri, 26 May 2000 17:54:17 +0100 Subject: NN and neuroscience workshop call Message-ID: <392EAC38.E6A37954@sunderland.ac.uk> ***We also plan to have six places for advanced PhD students or recent post-doctorates and encourage applicants.**** EmerNet: International EPSRC Workshop on Current Computational Architectures Integrating Neural Networks and Neuroscience. Date: 8-9 August 2000 Location: Durham Castle, Durham, United Kingdom Workshop web page is http://www.his.sunderland.ac.uk/worksh3 Organising Committee ----------------------- Prof. Stefan Wermter Chair Hybrid Intelligent Systems Group University of Sunderland Prof. Jim Austin Advanced Computer Architecture Group Department of Computer Science University of York Prof. David Willshaw Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh Call for Papers and Participation -------------------------------- Description and Motivation --------------------------- Although there is a massive body of research and knowledge regarding how processing occurs in the brain this has had little impact on the design and development of computational systems. Many challenges remain in the development of computational systems, such as robustness, learning capability, modularity, massive parallelism for speed, simple programming, more reliability etc. This workshop aims to consider if the design of computational systems can learn from the integration of cognitive neuroscience, neurobiology and artificial neural networks. The main objective is the transfer of knowledge by bringing Together researchers in the twin domains of artificial and real neural networks. 
The goal is to enable computer scientists to comprehend how the brain processes information to generate new techniques for computation and encourage neuroscientists to consider computational factors when performing their research. Areas of Interest for Workshop -------------------------------- The main areas of interest for the workshop bring together Neural Network Architectures and Neuroscience Robustness: What are the characteristics that enable the human brain to carry on operating despite failure of its elements? How can the brain's slow but robust memory be utilised to replace the brittle but fast memory presently found in conventional computers? Modular construction: How can the brain provide ideas for Bringing together the current small artificial neural networks to create larger modular systems that can solve more complex tasks like associative retrieval, vision and language understanding? Learning in context: There is evidence from neuron, network and Brain levels that the internal state of such a neurobiological system has an influence on processing and learning. Is it possible to build computational models of these processes and states, and design incremental learning algorithms and dynamic architectures? Synchronisation: How does the brain synchronise its processing when using millions of processors? How can large asynchronous computerised systems be produced that do not rely on a central clock? Timing: Undertaking actions before a given deadline is vital. What structural and processing characteristics enable the brain to deal with real time situations? How can these be incorporated into a computerised approach? Processing speed: despite having relatively slow computing element, how is real-time performance achieved? Preliminary Invited Speakers We plan to have around 30 participants, including speakers and participants. -------------------------------------- Dr Jim Fleming - EPSRC Prof. Michael Fourman - University of Edinburgh Prof. Angela Frederici - Max Planck Institute of Cognitive NeuroScience Prof. Stephen Hanson - Rutgers University Prof. Stevan Harnad - University of Southampton Prof. Vasant Honavar - Iowa State University Dr Hermann Moisl - University of Newcastle upon Tyne Prof. Heiko Neumann - Universit Ulm Prof. Gnther Palm - Universit Ulm Prof. Kim Plunkett (tbc) - Oxford University Prof. James A. Reggia - University of Maryland Prof. John Taylor - King's College London Workshop Details ------------------- In order to have a workshop of the highest quality it incorporates a combination of paper presentations on one of the six areas of interest by the participants and more open discussion oriented activities. The discussion element of the EmerNet Workshop will be related to the questions above and it is highly desirable that those wishing to participate focus on one or more of these issues in an extended abstract or position paper of up to 4 pages. Papers should be in either ps, pdf or doc format via email for consideration to Professor Stefan Wermter and Mark Elshaw by the 1st of June 2000. KEY QUESTIONS IS: What can we learn from cognitive neuroscience and the brain for building new computational neural architectures. It is intended that for all participants registration, meals and accommodation at Durham Castle for the Workshop will be provided free of charge. Further, specially invited participants are to receive reasonable travel expenses reimbursed and additional participants rail travel costs in the UK. 
***We also plan to have six places for PhD students or recent post-doctorates and encourage applicants.**** Extended versions of papers can be published as book chapters in a book with Springer. Location - Durham Castle ------------------------- The EmerNet Workshop is to be held at Durham Castle, Durham(chosen as in between Sunderland, York and Edinburgh) in the North East of England. There are few places in the world that can match the historic City of Durham, with its dramatic setting on a rocky horseshoe bend in the River Wear and beautiful local countryside. Furthermore, it offers easy accessibility by rail from anywhere in the Great Britain and is close to the international airport at Newcastle. The workshop provides the chance to stay at a real English castle that was constructed under the orders of King William the Conqueror in 1072, shortly after the Norman Conquest. It has many rooms of interest including a Norman Chapel that has some of the most fascinating Norman sculptures in existence and the Great Hall that acts as the dinning area. By having the EmerNet Workshop at this excellent location this provides the chance for interesting and productive discussion in a peaceful and historic atmosphere. It is possible to gain a flavour of Durham Castle and Cathedral on the on-line tour at http://www.dur.ac.uk/~dla0www/c_tour/tour.html Contact Details --------------- Mark Elshaw (Workshop Organiser) Hybrid Intelligent Systems Group Informatics Centre SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3249 Fax: +44 191 515 2781 E-mail: Mark.Elshaw at sunderland.ac.uk Prof. Stefan Wermter (Chair) Informatics Centre, SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3279 Fax: +44 191 515 2781 E-mail: Stefan.Wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ From papanik at intelligencia.com Sat May 27 01:31:28 2000 From: papanik at intelligencia.com (Kostas Papanikolaou) Date: Sat, 27 May 2000 01:31:28 -0400 Subject: NNA '01 (Neural Netorks and Applications), FSFS '01 (Fuzzy Sets and Fuzzy Systems), EC '01(Evolutionary Computation) Message-ID: <005d01bfc79d$0fa92740$0101a8c0@Alexandros> Our sincere apologies if multiple copies of the call for papers arrive you or these conferences are not inside your research interests. ******************************************************************** We invite you to submit a paper and/or organize a special session for NNA, FSFS, EC 2001. NNA '01 (Neural Netorks and Applications), FSFS '01 (Fuzzy Sets and Fuzzy Systems) and EC '01(Evolutionary Computation) -- a unique triplet of soft computing conferences - will take place in Puerto De La Cruz, Tenerife, Canary Islands, (Spain), in February 11-15, 2001. Web Sites: http://www.worldses.org/wses/nna http://www.worldses.org/wses/fsfs http://www.worldses.org/wses/ec They are sponsored by: The World Scientific and Engineering Society (WSES) Co-Sponsored by IIARD, IMCS and they are supported by NeuroDimension Inc. http://www.nd.com Could you, please, forward the following call_for_papers to your friends, colleagues, working groups? Sincerely Yours K.Papanikolaou papanico at go.com papanik at intelligencia.com ***************** CALL FOR PAPERS ********************** We invite you to submit a paper and/or organize a special session for NNA, FSFS, EC 2001. Could you, please, forward the following call_for_papers to your friends, colleagues, working groups? 
Thanks a lot! 2001 WSES International Conference on: Neural Networks and Applications (NNA '01) http://www.worldses.org/wses/nna - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . 2001 WSES International Conference on: Fuzzy Sets & Fuzzy Systems (FSFS '01) http://www.worldses.org/wses/fsfs - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . 2001 WSES International Conference on: Evolutionary Computations (EC '01) http://www.worldses.org/wses/ec - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . Sponsored by: The World Scientific and Engineering Society (WSES) Co-Sponsored by IIARD, IMCS Supported by NeuroDimension Inc http://www.nd.com The WORLDSES Conferences: NNA'01 (Neural Networks and Applications), FSFS'01 (Fuzzy Sets Fuzzy Logic), EC'01 (Evolutionary Computation) will take place at the Hotel TENERIFE PLAYA (main hotel for the conferences) as well as at the Hotel SAN FELIPE. (Puerto De La Cruz, Tenerife, Canary Islands (Spain), February 11-15, 2001). DEADLINE FOR PAPER SUBMISSION: OCTOBER 30, 2000 NOTIFICATION OF ACCEPTANCE/REJECTION: NOVEMBER 30, 2000 INTERNATIONAL SCIENTIFIC COMMITTEE: Prof. Peter G. Anderson, Rochester Institute of Technology, NY, USA. Prof. George Antoniou, Mont Clair State University, NJ, USA. Prof. Hamid Arabnia, University of Georgia, Georgia, USA. Prof. Hans-Georg Beyer, University of Dortmund, Germany. Prof. Hans-Heinrich Bothe, Technical University of Denmark, Denmark. Prof. Andrew Lim Leong Chye, National Technical University, Singapore. Prof. Raimondas Ciegis, Vilnius Technical University, Lithuania. Prof. Patrick Corr, The Queen's University of Belfast, Northern Ireland. Prof. Satnam Dlay, University of Newcastle, UK. Prof. Meng Joo Er, National Technological University, Singapore. Prof. Janos Fodor, Szent Istvan University, Hungary. Prof. David Fogel, Natural Selection Company, IEEE Editor Trans.EC., USA. Prof. Kaoru Hirota, Tokyo Institute of Technology, Japan. Prof. Damir Kalpic, University of Zagreb, Croatia. Prof. Dae-Seong Kang, Dong-A University, Korea. Prof. Nikola Kasabov, University of Otago, Dunedin, New Zealand. Prof. Rudolf Kruse, Universitaet Magdeburg, Germany. Prof. Franz Kurfess, Concordia University, Canada. Prof. Pascal Lorenz, Universite de Haute Alsace, Colmar, France. Prof. Maria Makrynaki, IMCS, Greece. Prof. Nikos Mastorakis, Hellenic Naval Academy, Greece. Prof. Valeri Mladenov, Eindhoven University of Technology, Netherlands. Prof. Ahmed Mohamed, American University of Cairo, Egypt. Prof. Masoud Mohammadian, University of Canberra, Australia. Prof. Fionn Murtagh, The Queen's University of Belfast, Northern Ireland. Prof. Fazel Naghdy, University of Wollongong, Australia. Prof. Erkki Oja, Helsinki University of Technology (HTU), Finland. Prof. Marcin Paprzycki, University of Southern Mississippi, USA. Prof. Hristo Radev, Technical University of Sofia, Bulgaria. Prof. Raul Rojas, Freie Universitaet Berlin, Germany. Prof. David Sanchez, Elsevier: Neurocomputing, Editor in Chief, Pasadena, USA. Prof. Francisco Torrens, Universitat de Valencia, Spain. Prof. Tom Whalen, The Georgia State University, Atlanta, USA. Prof. Yanqing Zhang, The Georgia State University, Atlanta, USA. Prof. Hans-Jorgen Zimmermann, RWTH Aaachen, Germany. Prof. Jacek Zurada, University of Louisville, USA. Dr. Jacob Barhen, CESAR, ORNL, TN, USA also with JPL, California, USA. Dr. Dimitris Tsaptsinos, Kingston University, UK. Dr. Qingfu Zhang, UMIST, Manchester, UK. 
NNA'01 TOPICS: ============== Biological Neural Networks Artificial Neural Networks Mathematical Foundations of Neural Networks Virtual Environments Neural Networks (NN) for Signal Processing Connectionist Systems Learning Theory Architectures and Algorithms Neurodynamics and Attractor Networks Pattern Classification and Clustering Hybrid and Knowledge-Based Networks Artificial Life Implementation of (artificial) NN VLSI techniques for NN implementation Neural Control NN for Robotics NN for Optimization, Systems theory and Operational Research NN in Numerical Analysis problems NN Training using Fuzzy Logic NN Training using Evolutionary Computations Interaction between: Neural Networks - Fuzzy Logic - Genetic Algorithms NN and Non-linear Systems NN and Chaos and Fractals Modeling and Simulation Hybrid Intelligent systems Neural Networks for Electric Machines Neural Networks for Power Systems Neural Networks for Real-Time Systems Neural Networks in Information Systems Neural Networks in Decision Support Systems Neural Networks and Discrete Event Systems Neural Networks in Communications Neural Networks for Multimedia Neural Networks for Educational Software Neural Networks for Software Engineering NN for Adaptive Control NN for Aerospace, Oceanic and Vehicular Engineering Man-Machine Systems Cybernetics and Bio-Cybernetics Relevant Topics and Applications Parallel and Distributed Systems Special Topics Others. FSFS'01 TOPICS: ============== Fuzzy Logic Fuzzy Sets Fuzzy Topology and Fuzzy Functional Analysis Fuzzy Differential Geometry Fuzzy Differential Equations Fuzzy Algorithms Fuzzy Geometry Fuzzy Languages Fuzzy Control Fuzzy Signal Processing Fuzzy Subband Image Coding VLSI Fuzzy Systems Approximate Reasoning Fuzzy Logic and Possibility theory Fuzzy Expert Systems Fuzzy Systems theory Connectionist Systems Learning Theory Pattern Classification and Clustering Hybrid and Knowledge-Based Networks Artificial Life Fuzzy Systems in Robotics Fuzzy Systems for Operational Research NN Training using Fuzzy Logic Interaction between: Neural Networks - Fuzzy Logic - Genetic Algorithms Fuzzy Systems and Non-linear Systems Fuzzy Systems and Chaos and Fractals Modeling and Simulation Hybrid Intelligent systems Fuzzy Systems and Fuzzy Engineering for Electric Machines Fuzzy Systems and Fuzzy Engineering for Power Systems Fuzzy Systems and Fuzzy Engineering for Real-Time Systems Fuzzy Systems and Fuzzy Engineering for Information Systems Fuzzy Systems and Fuzzy Engineering for Decision Support Systems Fuzzy Systems and Fuzzy Engineering for Discrete Event Systems Fuzzy Systems and Fuzzy Engineering for Communications Fuzzy Systems and Fuzzy Engineering for Multimedia Fuzzy Systems and Fuzzy Engineering for Educational Software Fuzzy Systems and Fuzzy Engineering for Software Engineering Fuzzy Systems and Fuzzy Engineering for Adaptive Control Fuzzy Systems and Fuzzy Engineering for Aerospace, Oceanic and Vehicular Engineering Man-Machine Systems Cybernetics and Bio-Cybernetics Relevant Topics and Applications Parallel and Distributed Systems Special Topics Others. 
EC'01 TOPICS: ============= Genetic Algorithms (GA) Mathematical Foundations of GA Evolution Strategies Genetic Programming Evolutionary Programming Classifier Systems Cultural algorithms Simulated Evolution Artificial Life Learning Theory Pattern Classification and Clustering Evolutionary Computations (EC) in Knowledge Engineering Evolvable Hardware Molecular Computing EC in Control Theory EC in Signal Processing EC for Image Coding Approximate Reasoning EC in Robotics EC for Operational Research Neural Networks Training using EC Interaction between: Neural Networks - Fuzzy Logic - Evolutionary Computations EC and Non-linear Systems theory Modeling and Simulation Hybrid Intelligent systems EC for Electric Machines EC for Power Systems EC for Real-Time Systems EC for Information Systems EC for Decision Support Systems EC for Discrete Event Systems EC for Communications EC for Multimedia EC for Educational Software EC for Software Engineering EC for Adaptive Control EC for Aerospace, Oceanic and Vehicular Engineering Global Optimization Man-Machine Systems Cybernetics and Bio-Cybernetics Relevant Topics and Applications Parallel and Distributed Systems Special Topics Others. TENERIFE and PORTO DE LA CRUZ The population of the island is about 700.000 of which about 210.000 live in the capital city Santa Cruz, situated on the north-west coast of the island. Tenerife (as well as the other Canarian Islands) is partly tax-free zone. The main source of livelihood of Tenerife is the tourism industry: more than four million tourists visit the island every year. Tourism has very long traditions in Tenerife, the first tourists came from England in the 1880's! There is some agriculture too: vegetables, fruit and flowers. The most important cultivated plant is banana [platano]. A Tenerifean banana is quite different from its distant cousin the Chiquita banana. A platano is short and plump. The colour of the fruit flesh is darker yellow and the taste much more delicious. There can be hundreds of thousands of banana plants on one plantation. They also make licquer of the bananas on the island. Another important plant is the grapevine. The rich volcanic soil and mild climate give the wine its own unique aroma. In Tenerife there are as many as five "Denomination of Origen (DO)" vineyards: Abona, Tacoronte-Acentejo, Valle de Guimar, Valle de la Orotava and Ycoden-Daute-Isora. The largest of these is Tacoronte-Acentejo, area 1.200 hectares. The production in 1997 was 1.553.000 kgs grapes. The most cultivated brands of grape are the white Listan Blanco, Malvasia and Marmajuelo, the red Listan Negro and Negramoll. There are two airports on the island. The international airport Reina Sofia (Tenerife Sur TFS) in the south near Playa de las Americas where most of the international flights land. The other airport Los Rodeos (Tenerife Norte TFN) is in the north near La Laguna. Los Rodeos serves mainly domestic flights. The distance from Reina Sofia to Playa de las Americas is about 20 kms and the trip takes about 20 mins, to Puerto de la Cruz about 100 km and takes about 90 mins. Tenerife is dominated by the highest mountain in Spain, the volcano Teide, the often snow covered summit of which reaches the altitude of 3.717 meters. El Teide is not a dormant volcano! The last (though minor) eruption took place in the beginning of this century. The last disasterous eruption happened in the year 1706. The southern part of the island is very infertile and next to nothing grows without artificial irrigation. 
The southern resorts Playa de las Americas and Los Cristianos have been built for tourism only and there is no local settlement. The prices there are distinctly higher than in the Capital City or Puerto de la Cruz. But the best beaches are on the southern coast and the sunshine is best counted on there. The newcomer among the resorts of Tenerife is the small and peaceful Los Gigantes on the west coast. In the northern parts of the island the nature is quite different from the southern nature. The clouds arriving from north don't always have the strength to clear the mountain but pour their rain north of the mountain. Due to these showers the flora is unbelievably rich and breathtakingly beautiful. That is why Puerto de la Cruz is often called the City of Eternal Spring. The best time to travel to Tenerife is February. Puerto de la Cruz was founded in the beginning of the 17th century. Originally it was called Puerto de la Orotava. A big harbour was built there and the city became an important centre of commerce and navigation. The most important export articles until the 19th century were sugar and wine. Nowadays the main source of livelihood of Puerto de la Cruz is tourism. Despite of mass tourism Puerto is still a genuine Canarian town with about 35.000 natives living there. The island of Tenerife was born 10 million years ago as a result of an underwater landslide and a volcanic eruption. The first of the Canary Islands were born the same way some 20 million years ago. The island was conquered from the natives, the Guanches, to Spain by Andalusian Alonso Fernandez Lugo and his troops in the year 1496. The origin of the Guanches is still a mystery to the anthropologists because they were tall, blond and blue-eyed. Furthermore there is no proof of their boat making skills and obviously they couldn't even swim! You can familiarize yourself with the history of the Guanches in Museo Etnografico in La Orotava or in Santa Cruz in Museo Arqueologico where you can meet a Guanche in person - as a mummy. Tenerife is the largest of the Canary Islands.
The need for algorithms that scale well to high-dimensional data, work incrementally in real-time, can integrate multi-modal information and deal with hidden state makes the area of humanoid robotics a very interesting challenge for new learning theories. With best regards, Stefan Schaal & Alois Knoll -------------------------------------------------------------------------- CALL FOR PAPERS -- Please circulate *** HUMANOIDS2000 *** -- The First IEEE-RAS Intern. Conf. on Humanoid Robots -- -- Co-sponsored by the Robotics Society of Japan (RSJ) -- Massachusetts Institute of Technology, Sept. 7-8, 2000 Papers should present current work, outline research programmes, and/or summarize in a tutorial style the state of the art in areas that are related to the building of, controlling of, and learning in humanoid robots, or that can be expected to be of importance to the field in the future. Note that we are also especially interested in connectionist and statistical learning methods as they relate to learning sensorimotor control and higher planning abilities in complex, high-dimensional movement systems.
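To make the call's notion of incremental, real-time learning concrete, the following is a minimal illustrative sketch (not part of the call, and not any particular humanoid controller): a linear sensorimotor map trained one sample at a time with the least-mean-squares (LMS) rule. All dimensions, rates and the synthetic "teacher" are made-up placeholders.

# Minimal sketch of incremental (online) supervised learning: a linear map from
# high-dimensional "sensory" input to a low-dimensional "motor" output, updated
# one sample at a time with the least-mean-squares (LMS) rule.  Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
dim_in, dim_out, rate = 100, 3, 0.005      # input/output dimensions, learning rate
W = np.zeros((dim_out, dim_in))            # model weights, start at zero
W_true = rng.normal(size=(dim_out, dim_in)) / np.sqrt(dim_in)   # synthetic "teacher"

for t in range(10000):                     # each step sees only the current sample
    x = rng.normal(size=dim_in)            # one incoming observation
    y = W_true @ x                         # target output for that observation
    err = y - W @ x                        # prediction error on this sample alone
    W += rate * np.outer(err, x)           # LMS update; nothing is stored or re-used

print("mean squared weight error:", float(np.mean((W_true - W) ** 2)))

The per-sample update is what allows such a learner to run in real time on a robot: memory and computation per step do not grow with the amount of data already seen.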
Paper submission deadline is May 21, 2000. For more information, please visit the conference web sites at http://humanoids.uni-bielefeld.de or http://humanoids.usc.edu for further details (including a full Call for Papers in PDF and Postscript format). ______________________________________________________ Deadlines Submission: May 21, 2000 Notification: June 30, 2000 Camera-Ready Copy: August 4, 2000 ______________________________________________________ Contact address: humanoids at usc.edu ______________________________________________________ Conference Chairs: G.A.Bekey, USC (General) R.A.Brooks, MIT (Honorary) A.C.Knoll, U Bielefeld (Program) ______________________________________________________ Program Committee: M. Asada (Osaka U) C. Atkeson (Georgia Tech) T. Christaller (GMD-Bonn) T. Fukuda (Nagoya U) S. Hashimoto (U Waseda) H. Inoue (U Tokyo) K. Kawamura (Vanderbilt U) B. Keeley (U Northern Iowa) P. Khosla (CMU) T. Kobayashi (U Waseda) Y. Kuniyoshi (MITI Tsukuba) M. Mataric (USC) R. Pfeifer (U Zurich) R. Reiter (U Toronto) S. Schaal (USC) S. Sugano (Waseda U) M. Wheeler (U Stirling) S. Yuta (U Tsukuba) ______________________________________________________ From nnsp00 at neuro.kuleuven.ac.be Wed May 3 06:54:43 2000 From: nnsp00 at neuro.kuleuven.ac.be (NNSP2000, Sydney) Date: Wed, 03 May 2000 12:54:43 +0200 Subject: Postdoctoral position biomedical signal-processing and neuroimaging Message-ID: <39100573.502A8E9E@neuro.kuleuven.ac.be> Postdoctoral position biomedical signal-processing and neuroimaging ------------------------------------------------------------------- The Computational Neuroscience Group of the Laboratory of Neuro- and Psychophysiology, Medical School of the Catholic University of Leuven, Belgium (http://simone.neuro.kuleuven.ac.be), invites applications for a post-doctoral position in the area of biomedical signal-processing and neuroimaging (functional Magnetic Resonance Imaging). Desired profile: The highly qualified applicant should possess a Ph.D. degree in the field of signal-processing, image-processing, statistics, or neural networks. He/she should be familiar with Principal Components Analysis (PCA), Independent Components Analysis (ICA), projection pursuit, or related techniques, and have a profound knowledge of both uni-variate statistics, such as t-tests, F-tests, and multi-variate statistics, such as ANOVA, ANCOVA, and MANCOVA. Programming skills are an asset (C, Matlab, ...), as is familiarity with UNIX and PC platforms. We offer: 1) A challenging research environment. The applicant will have access to data from state-of-the-art Magnetic Resonance scanners and advanced statistical tools such as SPM (Statistical Parametric Mapping) for examining brain activity in both human and monkey. 2) An attractive income. The applicant will receive 2150 USD or 2375 Euro per month, including full social security coverage and housing. This is comparable to the salary of an associate Professor at the University. Housing will be taken care of by the host institute. 3) Free return airline ticket, economy class (maximum 1350 USD or 1500 Euro) and a reimbursement of all costs incurred for shipping luggage to Belgium (maximum 900 USD or 1000 Euro). Send your CV (including the names and contact information of three references), bibliography and how to contact you by mail/fax/email/phone to: Prof. Dr. Marc M.
Van Hulle K.U.Leuven Laboratorium voor Neuro- en Psychofysiologie Faculteit Geneeskunde Campus Gasthuisberg Herestraat 49 B-3000 Leuven Belgium Phone: + 32 16 345961 Fax: + 32 16 345993 E-mail: marc at neuro.kuleuven.ac.be URL: http://simone.neuro.kuleuven.ac.be From POCASIP at aol.com Wed May 3 21:12:20 2000 From: POCASIP at aol.com (POCASIP@aol.com) Date: Wed, 3 May 2000 21:12:20 EDT Subject: Neurobiologically inspired image processing expert sought Message-ID: <1e.4c5850d.26422874@aol.com> The Advanced Signal and Image Processing Laboratory of Intelligent Optical Systems Inc. (IOS) is looking for a candidate who has expertise and experience in neurobiologically inspired image processing. We seek a candidate with hands-on experience in modeling the human visual system and its ability to perform edge and vertex detection, contour extraction, and illusory contour representations, as well as knowledge about neuronal synchrony. + Programming fluency in C++ and Java is an important asset. + Experience in solving real-world problems in a wide variety of applications is a definite plus. The activities of the Advanced Signal and Image Processing Laboratory include image analysis, biomedical diagnosis, food quality control, chemical analysis, system control, and target recognition, using neural computation-based implementations in software and hardware. IOS is a rapidly growing, dynamic high-tech R&D company with a focus on commercializing smart sensors and advanced information processing. We employ about 35 people, including 14 scientists from a variety of disciplines, and are located in Torrance, California, a pleasant seaside town with a high standard of living and year-round perfect weather. Please send your application, including curriculum vitae and three references, in ASCII only, by e-mail to POCASIP at aol.com E. Fiesler From Andres.PerezUribe at unifr.ch Wed May 3 08:04:05 2000 From: Andres.PerezUribe at unifr.ch (Andres Perez-Uribe) Date: Wed, 3 May 2000 14:04:05 +0200 Subject: Post-Doc: Uni-Fribourg, Switzerland Message-ID: <1000503140405.ZM5515@ufps25> The Parallelism and Artificial Intelligence (PAI) group is a dynamic team involved in hot research topics related to new information and communication technologies. Its interests encompass the methodologies of Autonomous and Adaptive Systems, Collective Intelligence, Evolutionary Computing, Agent Technology, and Web Operating Systems, but also the field of Human Computer Interaction, where it specifically addresses the Immersive trend and Pervasive Computing, namely with Force-Feedback Interaction and Sound Imaging. For the launch of its new WELCOME header project, it is now opening a Post-Doctoral Research Position The position The position is intended for an enthusiastic postgraduate who has completed her/his doctoral studies. The candidate will participate in the research activities conducted within the WELCOME project (gzip'd postscript description of the project available at http://www-iiuf.unifr.ch/pai/index.html/Welcome.ps.gz). Her/his duties will consist of (i) developing Agent-based methodologies for Internet-based infrastructures, or (ii) tackling Human Computer Interaction issues, together with the supervision of two Ph.D. research projects. As far as possible she/he will promote industrial applications of her/his research and build contacts with external academic or commercial organizations. An open mind towards interdisciplinary approaches and non-standard innovation techniques will be appreciated.
The position is to be taken in early spring 2000 (or at convenience). It is granted for two years by the Swiss National Foundation for Scientific Research (with a possibility of renewal once). Job location is Fribourg, a french-german bilingual middle-size city in Switzerland. The requirements Education: Ph.D in Computer Science Ability to speak, read and write French or German or English Proficient at one or several topics, such as: Intelligent Networks, Distributed Systems and/or Coordination Languages, Agent Technology Human Computer Interaction, Immersive and Pervasive Computing, Force-feedback interaction, Augmented or virtual reality, Sound Imaging Evolutionary Computing, Artificial Life Object-Oriented design techniques, Java programming, Jini technology Applications with CV and research paper list must be sent to (Email submissions are encouraged): Prof. Béat Hirsbrunner University of Fribourg, ch. du Musée 3, CH-1700 Fribourg Tel.: +41 (079) 611 72 48 Email: beat.hirsbrunner at unifr.ch -- Andres PEREZ-URIBE Postdoctoral Fellow Parallelism and Artificial Intelligence Group (PAI) Computer Science Institute, University of Fribourg, Switzerland Ch. du Musee 3, CH-1700 Fribourg, Office 2.76b Perolles Tel. +41-26-300-8473, Fax +41-26-300-9731 Email:Andres.PerezUribe at unifr.ch, http://www-iiuf.unifr.ch/~aperezu/ From dimi at ci.tuwien.ac.at Fri May 5 09:43:36 2000 From: dimi at ci.tuwien.ac.at (Evgenia Dimitriadou) Date: Fri, 5 May 2000 15:43:36 +0200 (CEST) Subject: CI BibTeX Collection -- Update Message-ID: The following volumes have been added to the collection of BibTeX files maintained by the Vienna Center for Computational Intelligence: IEEE Transactions on Evolutionary Computation, Volumes 3/4 IEEE Transactions on Fuzzy Systems, Volumes 7/6-8/1 IEEE Transactions on Neural Networks, Volumes 10/6-11/2 Machine Learning, Volumes 37/2-40/2 Neural Computation, Volumes 11/7-12/4 Neural Networks, Volumes 12/9-13/2 Neural Processing Letters, Volumes 10/2-11/2 Most files have been converted automatically from various source formats, please report any bugs you find. The complete collection can be downloaded from http://www.ci.tuwien.ac.at/docs/ci/bibtex_collection.html ftp://ftp.ci.tuwien.ac.at/pub/texmf/bibtex/ Best, Vivi ************************************************************************ * Evgenia Dimitriadou * ************************************************************************ * Institut für Statistik * Tel: (+43 1) 58801 10773 * * Technische Universität Wien * Fax: (+43 1) 58801 10798 * * Wiedner Hauptstr. 8-10/1071 * Evgenia.Dimitriadou at ci.tuwien.ac.at * * A-1040 Wien, Austria * http://www.ci.tuwien.ac.at/~dimi* ************************************************************************ From banzhaf at tarantoga.cs.uni-dortmund.de Fri May 5 10:49:19 2000 From: banzhaf at tarantoga.cs.uni-dortmund.de (Wolfgang Banzhaf) Date: Fri, 5 May 2000 16:49:19 +0200 (MET DST) Subject: No subject Message-ID: <200005051449.QAA18496@tarantoga.cs.uni-dortmund.de> A non-text attachment was scrubbed...
Name: not available Type: text Size: 2102 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/38422686/attachment-0001.ksh From hammer at informatik.uni-osnabrueck.de Fri May 5 09:45:02 2000 From: hammer at informatik.uni-osnabrueck.de (Barbara Hammer) Date: Fri, 5 May 2000 15:45:02 +0200 Subject: PhD positions at the University of Osnabrueck (Germany) Message-ID: <200005051345.PAA18905@pooh.informatik.uni-osnabrueck.de> Dear Sir or Madam, You are kindly requested to forward the following opening to anyone who might be interested. We apologize in advance in case of multiple receipts of this message. Please ignore this message if it does not lie in your field of interest. Thank you in advance for your cooperation. Sincerely yours, Barbara Hammer ----------------------------------------------------------------- PHD Positions in Neuromathematics ----------------------------------------------------------------- The Ministry of Science and Education of the Federal State of Lower Saxony has established a new research scientists group of junior staff mainly working in the field of "Learning with neural methods on structured data" in the department of Mathematics and Computer Science at the University of Osnabrueck and is therefore looking for two Scientific Assistants (payment according to German tariff BAT IIa) as of now and for a limited period of four years. The group co-operates with the working groups discrete mathematics and theoretical computer science/neuro-informatics as well as with representatives of the interdisciplinary course of studies 'Cognitive Science'. The main working field is the combination of neuro-informatic methods and discrete optimization and applications in information management and scheduling. The persons filling in these posts will have to work in current research projects. These posts require an appropriate scientific university degree in mathematics, computer science or a related scientific-technical course of studies. Knowledge of machine learning or discrete mathematics is desirable. Readiness for interdisciplinary research is expected. The possibility of doing a doctorate is given. In an exceptional case the candidates may already have received the PhD. Part-time employment may be considered. The University of Osnabrueck aims at a higher share of women in the scientific field and would expressively encourage qualified female scientists to apply for these posts. Seriously handicapped applicants will take precedence if equally qualified. Applications containing the customary documents should be sent to Dr. Barbara Hammer, Department of Mathematics/Computer Science, University of Osnabrueck, D-49069 Osnabrueck, Germany by 26th of May 2000. Further information is available under: http://www.informatik.uni-osnabrueck.de/barbara/lnm/ or e-mail to hammer at informatik.uni-osnabrueck.de From mel at lnc.usc.edu Fri May 5 20:11:59 2000 From: mel at lnc.usc.edu (Bartlett Mel) Date: Fri, 05 May 2000 17:11:59 -0700 Subject: Prelim Program: 7th Joint Symposium on Neural Computation Message-ID: <3913634F.322AD1A7@lnc.usc.edu> PRELIMINARY PROGRAM --- 7th Joint Symposium on Neural Computation --- to be held at the UNIVERSITY OF SOUTHERN CALIFORNIA Saturday, May 20, 2000 ------------------------------------------- website: http://www.its.caltech.edu/~jsnc/ ------------------------------------------- 8:45 Opening Remarks - Mel Session I. 
Cells and Synapses - Bower 9:00 Olivier Coenen, San Diego Children's Hospital Research Center "A Hypothesis for Parallel Fiber Coding in a Cerebellar Model of Smooth Pursuit Eye Movement" 9:15 Roland Suri, The Salk Institute "Modeling Functions of Striatal Dopamine Modulation in Learning and Planning" 9:30 Ralf Wessel, UC San Diego "Biophysics of Visual Motion Analysis in Avian Tectum" 9:45 Panayiota Poirazi, USC "Sublinear vs. Superlinear Synaptic Integration? Tales of a Duplicitous Active Current" Session II. Sensory-Motor Learning - Schaal 10:00 Auke Ijspeert, USC "Locomotion and Visually-Guided Behavior in Salamander: An Artificial Evolution and Neuromechanical Study" 10:15 Thomas DeMarse, Caltech "The Animat Project: Interfacing Neuronal Cultures to a Computer Generated Virtual World" 10:30 Richard Belew, UC San Diego "Evolving Behavior in Developing Robot Bodies Controlled by Quasi-Hebbian Neural Networks" 10:45 Aude Billard, USC "A Biologically Inspired Connectionist Model for Learning Motor Skills by Imitation" 11:00 Coffee Break 11:15 KEYNOTE SPEAKER Gerald E. Loeb, USC "Dialogs with the Nervous System" 12:00 Lunch and Posters Session IV. Vision - Hoffmann 2:00 Martina Wicklein, The Salk Institute "Perception of Looming in the Hummingbird Hawkmoth Manduca Sexta (Sphingidae, Lepidoptera)" 2:15 Erhan Oztop, USC "Mirror Neuron System in Monkey: A Computational Modeling Approach" 2:30 Eric Ortega, USC "Smart Center-Surround Receptive Fields: What Bayes May Say About the Neural Substrate for Color Constancy" 2:45 Junmei Zhu, USC "Fast Dynamic Link Matching by Communicating Synapses" 3:00 David Eagleman, The Salk Institute "The Timing of Perception: How Far in the Past do we Live, and Why?" 3:15 Coffee Break Session III. Concepts and Memory - Mel 3:30 Jonathan Nelson, UC San Diego "Concept Induction in the Presence of Uncertainty" 3:45 Peter Latham, UCLA "Attractor Networks in Systems with Underlying Random Connectivity" Session V. Faces - Sejnowski 4:00 Tim Marks, UC San Diego "Face Processing in Williams Syndrome: Using ICA to Discriminate Functionally Distinct Independent Components of ERPs in Face Recognition" 4:15 Ian Fasel, UC San Diego "Automatic Detection of Facial Landmarks: An Exhaustive Comparison of Methods" 4:30 Boris Shpungin, UC San Diego "A System for Robustly Tracking Faces in Real-Time" 4:45 Closing Remarks - Sejnowski 5:00 Adjourn for Dinner POSTERS ------- Ildiko Aradi, UC Irvine "Network Stability and Interneuronal Diversity" Javier Bautista, USC "Creating a World Representation of the Environment from Visual Images" Maxim Bazhenov, The Salk Institute "Slow Wave Sleep Oscillations and Transition to an Awake State in a Thalamocortical Network Model" Hamid Beigy, Amirkabir Univ.
of Technology "Adaptation of Parameters of BP Algorithm Using Learning Automata" Axel Blau, Caltech "High-Speed Imaging of Neuronal Network Activity" Mihail Bota, USC "The NeuroHomology Database" Theodore Bullock, UC San Diego "When is a Rhythm" Spiros Courellis, USC "Modeling Event-Driven Dynamics in Biological Neural Networks" Holger Quast, UC San Diego "Absolute Perceived Loudness of Speech" Gary Holt, USC "Unsupervised Learning of the Non-Classical Surround" Jeff McKinstry, Point Loma Nazarene University "A Model of Primary Visual Cortex Applied to Edge Detection" Stefan Schaal, USC "Functional brain activation in rhythmic and discrete movement" Alexei Samsonovich, University of Arizona "A Theory-of-Mind Connectionist Model of Episodic Memory Consolidation" ------------------------------------------------------------------ REGISTRATION CHECKS SHOULD BE RECEIVED BY MAY 15 TO GUARANTEE THAT A DELICIOUS HOT LUNCH WILL BE WAITING FOR YOU. See conference web site to register, and for directions to the meeting: http://www.its.caltech.edu/~jsnc/ Fee: $40 students, $50 all others - both prices include hot lunch. Mail checks to: Linda Yokote BME Department USC, Mail Code 1451 Los Angeles, CA 90089 PROGRAM COMMITTEE ----------------- JAMES M. BOWER Division of Biology, Caltech GARRISON W. COTTRELL Dept. of Computer Science and Engineering, UCSD DONALD D. HOFFMAN Dept. of Cognitive Sciences, UCI GILLES LAURENT Division of Biology & Computation and Neural Systems Program, Caltech BARTLETT W. MEL (chair) Dept. of Biomedical Engineering & Neuroscience Program, USC SHEILA NIRENBERG Dept. of Neurobiology, UCLA STEFAN SCHAAL Dept. of Computer Science and Neuroscience Program, USC TERRENCE J. SEJNOWSKI Howard Hughes Medical Institute, UCSD/Salk Institute Local Arrangements ---------------------- Linda Yokote, BME Department, USC, marubaya at rcf.usc.edu, (213)740-0840 Gabriele Larmon, BME Dept, USC, larmon at bmsrs.usc.edu Proceedings ------------ Marilee Bateman, Institute for Neural Computation, UCSD, bateman at cogsci.ucsd.edu Web Site -------- Marionne Epalle, Engineering and Applied Science, Caltech, marionne at caltech.edu From kositsky at greed.cs.umass.edu Mon May 8 18:15:34 2000 From: kositsky at greed.cs.umass.edu (Michael Kositsky) Date: Mon, 8 May 2000 18:15:34 -0400 (EDT) Subject: PhD thesis anouncement Message-ID: Dear Connectionists, My PhD thesis on motor learning and skill acquisition is now available at http://www-anw.cs.umass.edu/~kositsky/phdThesis/phdThesis.html Title: Motor Learning and Skill Acquisition by Sequences of Elementary Actions Abstract: The work presents a computational model for motor learning and memory. The basic approach of the model is to treat complex activities as sequences of elementary actions. The model implements two major functions. First, the combination of elementary actions into sequences to produce desired complex activities, which is achieved by a search procedure involving multiscale task analysis and stochastic descent processing. Second, the utilization of past motor experience by effective memorization and retrieval, and generalizing sequences. New tasks are accomplished by combining past sequences intended for similar tasks. The generalization is based upon the clustering property of motor experience data. Specifically, the clustering property results in concentrating the data points within compact regions, allowing fast and accurate generalization of the elementary actions and consequently, enabling a robust performance of familiar tasks. 
A motor memory architecture is proposed that uses the clusters as the basic memory units. The computational work is accompanied by a set of psychophysical studies aimed at examining the possible use of a cluster representation by the human motor system. The experiment examines the entire motor learning process, starting from untrained movements up to the formation of highly skilled actions. Michael Kositsky Senior Postdoctoral Researcher Department of Computer Science University of Massachusetts, Amherst email: kositsky at cs.umass.edu web: http://www-anw.cs.umass.edu/~kositsky From cjlin at csie.ntu.edu.tw Mon May 8 14:45:23 2000 From: cjlin at csie.ntu.edu.tw (Chih-jen Lin) Date: Tue, 9 May 2000 02:45:23 +0800 (CST) Subject: announcing a software Message-ID: <200005081845.CAA27195@ntucsa.csie.ntu.edu.tw> Dear Colleagues: We announce the release of the software LIBSVM, a support vector machines (SVM) library for classification problems by Chih-Chung Chang and Chih-Jen Lin. Most available SVM software are either quite complicated or are not suitable for large problems. Instead of seeking a very fast software for difficult problems, we provide a simple, easy-to-use, and moderately efficient package for SVM classification. We hope this library helps users from other fields to easily use SVM as a tool. We also provide a graphic interface to demonstrate 2-D pattern recognition. The current release (Version 1.0) is available from http://www.csie.ntu.edu.tw/~cjlin/libsvm Any comments are very welcome. Sincerely, Chih-Jen Lin Department of Computer Science and Information Engineering National Taiwan University Taipei, Taiwan cjlin at csie.ntu.edu.tw From flake at research.nj.nec.com Mon May 8 13:16:32 2000 From: flake at research.nj.nec.com (Gary William Flake) Date: Mon, 8 May 2000 13:16:32 -0400 (EDT) Subject: NODElib availability Message-ID: <200005081716.NAA09498@cartman.nj.nec.com> Greetings: I am pleased to announce that NODElib is publicly available under the GNU public license. NODElib is a research and development programming library that can be used to rapidly produce neural network simulations. Some of NODElib's more advance features include: * A vast number NN architectures unified under a general framework: MLP, RBFN, higher-order, SMLP, CNLS, arbitrary activation functions, arbitrary connectivity, etc. * Advanced algorithms for all of the above: calculation of the Hessian, optimization of the Jacobian, etc. * Support vector machines with a generalized version of SMO that handles regression and kernel caching. * Advanced optimization routines with options for line search procedures. * Plus many other features... An overview of NODElib can be found at: http://www.neci.nj.nec.com/homepages/flake/nodelib/html/ And the actual library itself can be downloaded from: http://www.neci.nj.nec.com/homepages/flake/nodelib.tgz Best, -- GWF -- Gary William Flake, Ph.D. flake at research.nj.nec.com NEC Research Institute http://www.neci.nj.nec.com/homepages/flake/ 4 Independence Way voice: 609-951-2795 Princeton, NJ 08540 fax : 609-951-2488 =============================================================================== The Computational Beauty of Nature http://mitpress.mit.edu/books/FLAOH/cbnhtml/ From goodman at unr.edu Mon May 8 10:01:31 2000 From: goodman at unr.edu (goodman@unr.edu) Date: Mon, 8 May 2000 07:01:31 -0700 (PDT) Subject: Postdoc in Spike-Coding Message-ID: ********* POSTDOCTORAL POSITION IN SPIKE-CODING NEOCORTICAL MODELS ********* Philip H. Goodman Henry Markram Sushil J. 
Louis Weizmann Institute for Science University of Nevada, Reno Rehovot, Israel Applications are invited for a postdoctoral research fellowship in the field of large-scale biologically realistic models of cortical microcircuit dynamics. Location: Reno/Lake Tahoe with periodic work at the Weizmann Institute Funding: Negotiable, depending upon experience Dates: Available now; duration 2-3 years Deadline: Open until filled Qualifications (*all of the following*): 1. Ph.D. in computational modeling, neuroscience, or cognitive science 2. Strong mathematical background 3. Substantial modeling experience using GENESIS, NEURON, or SURF-HIPPO 4. Demonstrable programming ability in C++ 5. Familiarity with basic statistical analyses 6. Famliiarity wth machine learning & artificial neural networks concepts 7. Willingness to commit at least two full years to the program Description: The purpose of this program is to address a major gap in our conceptual understanding of synaptic and brain-like network dynamics, and to benefit from untapped technological applications of related pulse-coding information networks. The core activity involves the design and implementation of increasingly complex and powerful brain-like simulations on parallel- distributed "Beowulf" computer systems, incorporating newly discovered excitatory and inhibitory parameters obtained from living tissue. We will use this technology to address the following questions: > What minimal microcircuit must be replicated to create a functional cortical column? > How many such columns must interact to demonstrate emergent behavior -- can we crack the "neural code"? > Can one "lesion" such models to evaluate putative therapies for brain disorders such as Alzheimer's disease, stroke, and epilepsy? We will also compare generalization abilities of "brain-wise" computation to existing artificial neural network and traditional non-neural classifiers. The fellow will work closely with faculty, a full-time PhD-candidate in computer engineering, and other students. Reno is located at an elevation of 4,000 feet at the base of the Sierra Nevada mountain range, with outstanding year-round weather. Reno is only 45 minutes away from powder-skiing, boating, and hiking near Lake Tahoe, and 3 hours from San Francisco by car. Cost of living is estimated at only 65% that of the San Francisco region, and Nevada has no state income tax. Inquiries: Fully qualified individuals should send all of the following in order to initiate consideration: (1) a short letter stating interest and summarizing qualifications, (2) a CV, and (3) names, phone numbers, and email addresses of three references to: Philip H. Goodman, MD, MS, Washoe Medical Center, 77 Pringle Way, Reno, NV 89502. Email inquiries to goodman at unr.edu are encouraged; please use one of the following formats: plain text, unencoded postscript, MS Word, Adobe Acrobat. ***************************************************************************** From ash at isical.ac.in Fri May 5 18:09:54 2000 From: ash at isical.ac.in (Ashish Ghosh) Date: Sat, 06 May 2000 03:39:54 +0530 Subject: New book Message-ID: <391346B2.F178061D@isical.ac.in> I am happy to announce the publication of the following book: "Soft Computing for Image Processing" by Sankar K. Pal, Ashish Ghosh and Malay K. Kundu (Eds.) from Physica-Verlag, Heidelberg, New York. 
The book is available at http://www.springer.de/cgi-bin/search_book.pl?isbn=3-7908-1268-4 Thanks, Ashish Ghosh ================================================================ Soft Computing for Image Processing Pal, S.K., Indian Statistical Institute, Calcutta, India Ghosh, A., Indian Statistical Institute, Calcutta, India Kundu, M.K., Indian Statistical Institute, Calcutta, India (Eds.) 2000. XVIII, 590 pp. 309 figs., 73 tabs. The volume provides a collection of 21 articles containing new material and describing, in a unified way with extensive real life applications, the merits and significance of performing different image processing/analysis tasks in soft computing paradigm. The articles, written by leading experts all over the world, demonstrate the various ways the fuzzy logic, artificial neural networks, genetic algorithms and fractals can be used independently and in integrated manner to provide efficient and flexible information processing capabilities in a stronger computational paradigm for handling the tasks like filtering, edge detection, segmentation, compression , classification, motion estimation, character regognition and target identification. Application domain includes, among others, data mining, computer vision, pattern recognition and machine learning, information technology, remote sensing, forensic investigation, video abstraction and knowledge based systems. Keywords: Image Processing, Image Analysis, Soft Computing Contents: S.K. Pal, A. Ghosh, M.K. Kundu: Soft Computing and Image Analysis: Features, Relevance and Hybridization.- Preprocessing and Feature Extraction: F.Russo: Image Filtering Using Evolutionary Neural Fuzzy Systems.- T. Law, D. Shibata, T. Nakamura, L. He, H. Itoh: Edge Extraction Using Fuzzy Reasoning.- S.K. Mitra, C.A. Murthy, M.K. Kundu: Image Compression and Edge Extraction Using Fractal Technique and Genetic Algorithms.- S. Mitra, R. Castellanos, S.-Y. Yang, S. Pemmaraju: Adaptive Clustering for Efficient Segmentation and Vector Quantization of Images.- B. Uma Shankar, A. Ghosh, S.K. Pal: On Fuzzy Thresholding of Remotely Sensed Images.- W. Skarbek: Image Compression Using Pixel Neural Networks.- L He, Y. Chao, T. Nakamura, H. Itho: Genetic Algorithm and Fuzzy Reasoning for Digital Image Compression Using Triangular Plane Patches.- N B. Karayiannis, T.C. Wang: Compression of Digital Mammograms Using Wavelets and Fuzzy Algorithms for Learning Vector Quantization.- V.D. Ges: Soft Computing and Image Analysis.- J.H. Han, T.Y. Kim, L.T. Kczy: Fuzzy Interpretation of Image Data.- Classification: M. Grabisch: New Pattern Recognition Tools Based on Fuzzy Logic for Image Understanding.- N.K. Kasabov, S.I. Israel, B.J. Woodford: Adaptive, Evolving, Hybrid Connectionist Systems for Image Pattern Recognition.- P.A. Stadter, N.K Bose: Neuro-Fuzzy Computing: Structure, Performance Measure and Applications.- K. D. Bollacker, J. Ghosh: Knowledge Reuse Mechanisms for Categorizing Related Image Sets.- K. C. Gowda, P. Nagabhushan, H.N. Srikanta Prakash: Symbolic Data Analysis for Image Processing.- Applications: N.M. Nasrabadi, S. De, L.-C. Wang, S. Rizvi, A. Chan: The Use of Artificial Neural Networks for Automatic Target Recognition.- S. Gutta, H. Wechsler: Hybrid Systems for Facial Analysis and Processing Tasks.- V. Susheela Devi, M. Narasimha Murty: Handwritten Digit Recognition Using Soft Computing Tools.- T.L. Huntsburger, J.R. Rose, D. Girard: Neural Systems for Motion Analysis: Single Neuron and Network Approaches.- H.M. Kim, B. 
Kosko: Motion Estimation and Compensation with Neural Fuzzy Systems. -- ************************************** * Dr. Ashish Ghosh, Ph.D., M.Tech. * * Associate Professor * * Machine Intelligence Unit * * Indian Statistical Institute * * 203 B. T. Road * * Calcutta 700 035, INDIA * * E-mail : ash at isical.ac.in * * ashishghosgisi at hotmail.com * * Fax : +91-33-577-6680/3035 * * Tel:+91-33-577-8085 ext.3110 (Off) * * +91-33-528-2399 (Res) * * URL: http://www.isical.ac.in/~ash * * ICQ: 48125276 * ************************************** From sml at essex.ac.uk Tue May 9 10:45:49 2000 From: sml at essex.ac.uk (Lucas, Simon M) Date: Tue, 9 May 2000 15:45:49 +0100 Subject: Algorithm evaluation over the Internet (papers available) Message-ID: <8935BFF68E96D3119C91009027D3A56A4A75A8@sernt14.essex.ac.uk> Dear All, We've been developing a system for automatically evaluating algorithms over the Internet - in particular this is geared towards evaluation of image/signal processing and pattern recognition algorithms. There are many aims of the system but the main one is to make life easier for the algorithm developer by automating the evaluation process and by ensuring that the results produced are objective and authentic. There's a paper due to appear in ICPR-2000, and you can also get a preprint of a more complete paper submitted to the International Journal of Document Analysis and Recognition (see below). The website is in its early testing phase and can be found at http://algoval.essex.ac.uk Most of the site is being mirrored for the moment at http://ace.essex.ac.uk so try that if the first one fails. The results section shows how potentially informative and effortless evaluation can be when done this way (see trainable ROI results, or dictionary word recognition for example). Also, if there's sufficient interest, I wondered if anyone would be interested in helping organise a NIPS workshop on this kind of topic. Comments welcome. Best Regards, Simon Lucas Papers http://algoval.essex.ac.uk/papers/icpr2000.ps title: Automatic evaluation of algorithms over the Internet authors: Simon Lucas and Kostas Sarampalis to appear: Proceedings ICPR 2000 abstract: This paper describes a system for the automatic evaluation of algorithms (especially pattern recognition algorithms) over the Internet. We present the case for such a system, discuss the system requirements and potential users and present an initial prototype. We illustrate usage of the system with an evaluation of image distance measures used for face recognition. http://algoval.essex.ac.uk/papers/ijdar.ps title: Automatic evaluation of document image analysis algorithms over the Internet Simon Lucas, Paul Beattie and Joanne Coy Submission for IJDAR special issue on performance evaluation Abstract: We have implemented a system for automatically evaluating document image analysis algorithms on datasets over the Internet. In this paper we discuss some general issues related to this mode of evaluation and describe in particular the set-up of a common document image processing problem: locating regions of interest within an image. Traditionally, the evaluation of these and other document processing algorithms has usually been done in a rather ad hoc manner, in most cases by the developers of the algorithms. By contrast, our system makes the evaluation process simple, consistent, objective and automatic. The system also provides detailed log files to give useful feedback to algorithm developers. ------------------------------------------------ Dr.
Simon Lucas Department of Electronic Systems Engineering University of Essex Colchester CO4 3SQ United Kingdom Tel: (+44) 1206 872935 Fax: (+44) 1206 872900 Email: sml at essex.ac.uk http://esewww.essex.ac.uk/~sml ------------------------------------------------- From stephen-m at uk2.net Tue May 9 06:35:51 2000 From: stephen-m at uk2.net (Stephen McGlinchey) Date: Tue, 9 May 2000 11:35:51 +0100 Subject: Thesis available - Transformation-Invariant Topology Preserving Maps Message-ID: <004301bfb9a2$5a1aa780$d363bf92@cis0-75s.staff.paisley.ac.uk> The following PhD thesis is available from http://cis.paisley.ac.uk/mcgl-ci0/ "Transformation-Invariant Topology Preserving Maps" Stephen J. McGlinchey (March 2000) Abstract This thesis investigates the use of unsupervised learning in the context of artificial neural networks to determine filters of data sets that exhibit invariances of one sort or another. The artificial neural networks in this thesis are all of a general type known as topology-preserving mappings. Topology-preserving maps have been of great interest in the computational intelligence community since they were devised in the 1980s. These are methods of mapping high dimensional data to a space of smaller dimensionality, whilst preserving the topographic structure of the data, at least to some degree. Such models have been successfully used in applications such as speech processing, robotics, data visualisation and computer vision, to name but a few. Apart from the many engineering applications of topology preservation, they have also been of biological interest since ensembles of neurons in biological brains have similar properties in that neurons that are close together in certain parts of the brain respond similarly to input data. The specific contributions of this thesis are: 1. An investigation of matrix constraints to preserve topological relations in neural network algorithms, which have previously only used orthonormality as a constraint. The previous algorithms are then seen as special cases of our new algorithms. 2. The development of a topology-preserving map network that ignores the magnitude of input data and responds to its radial location. The organised mappings are able to reliably classify data where the magnitude of the data has little or no bearing on which class it belongs to. For example, voiced phonemes were classified with amplitude invariance, i.e. regardless of the volume of the speech. 3. A novel neural network method based on Kohonen's self-organising map (Kohonen, 1997) algorithm, but combining it with a principal component analysis network to give a set of local principal components which globally cover the data set with a smooth ordering. The resulting filters are transformation invariant for some simple transformations. Stephen McGlinchey Dept.
of Computing & Information Systems University of Paisley High Street Paisley PA1 2BE Scotland email: stephen-m at uk2.net fax: +44 141 848 3542 http://cis.paisley.ac.uk/mcgl-ci0/ From jcheng at cs.ualberta.ca Thu May 11 14:52:59 2000 From: jcheng at cs.ualberta.ca (J Cheng) Date: Thu, 11 May 2000 12:52:59 -0600 Subject: SOFTWARE ANNOUNCEMENT (Bayesian Network Learning & Data Mining) Message-ID: <005f01bfbb7a$21b88bd0$211c8081@cs.ualberta.ca> Dear Colleagues, A software package for Bayesian belief network (BN) learning & data mining is now available for free download at: http://www.cs.ualberta.ca/~jcheng/bnsoft.htm The package includes two applications for Windows 95/98/NT/2000: BN PowerConstructor: An efficient system for learning BN structures and parameters from data. Constantly update since 1997. BN PowerPredictor: An extension of BN PowerConstructor for learning unrestricted BN & Bayes multi-nets based classifiers. Best regards, Jie Cheng Dept. of Computing Science, Univ.
of Alberta email: jcheng at cs.ualberta.ca ############################################################################ Features of BN PowerPredictor: * Graphical Bayesian network editor for modifying BN classifiers' structure. * Wrapper algorithm. The system can automatically learn classifiers of different types and different complexities and choose the best performer. * Feature subset selection. The system can perform feature subset selection automatically. * Two inference modes. The classification can be performed in either batch mode (data set based) or interactive mode (instance based). * Supporting domain knowledge. * Supporting misclassification cost table definition. Sample Performance of BN PowerPredictor on UCI ML data sets (Pentium II-300):
Data set   Cases (Train/Test)   Attr.   Running Time (training)   Prediction Accu.
Adult      32561/16281          13      25 Min.                   86.33+-0.53
Nursery    8640/4320            8       1 Min.                    97.13+-0.50
Mushroom   5416/2708            22      9 Min.                    100
Chess      2130/1066            36      3 Min.                    96.44+-1.11
DNA        2000/1186            60      17 Min.                   96.63+-1.03
* The confidence level is 95%. The system performs 300-1000 classifications per second. From ingber at ingber.com Fri May 12 07:58:21 2000 From: ingber at ingber.com (Lester Ingber) Date: Fri, 12 May 2000 06:58:21 -0500 Subject: Programmer/Analyst Position Message-ID: <20000512065821.A14116@ingber.com> Programmer/Analyst in Computational Finance. DRW Investments [www.drwtrading.com], 311 S Wacker Dr Ste 900, Chicago IL 60606. At least 2-3 years combined experience programming in C/C++, Visual Basic/VBA and Java, as well as financial industry experience with a financial institution. Must have excellent background in Physics, Math, or similar disciplines, PhD preferred. Needs practical knowledge of methods of field theory and stochastic differential equations, as well as an understanding of derivatives pricing models, bond futures and modeling of financial indices. The position will be primarily dedicated to developing and coding algorithms for automated trading. Flexible hours in intense environment. Requires strong commitment to several ongoing projects with shifting priorities. See www.ingber.com for some papers on current projects. Please email Lester Ingber ingber at drwtrading.com a resume regarding this position. -- Lester Ingber http://www.ingber.com/ PO Box 06440 Wacker Dr PO Sears Tower Chicago IL 60606-0440 http://www.alumni.caltech.edu/~ingber/ From stephen at computing.dundee.ac.uk Mon May 15 10:01:09 2000 From: stephen at computing.dundee.ac.uk (Stephen McKenna) Date: Mon, 15 May 2000 15:01:09 +0100 Subject: New Book - Dynamic Vision: From Images to Face Recognition Message-ID: <035901bfbe76$05823160$26222486@lagavulin.dyn.computing.dundee.ac.uk> Dear colleagues, We are pleased to announce the publication of the following book: "Dynamic Vision: From Images to Face Recognition" by Shaogang Gong, Stephen J McKenna and Alexandra Psarrou Imperial College Press (World Scientific Publishing), ISBN 1-86094-181-8, 344 pp.
Further details are available at http://www.computing.dundee.ac.uk/staff/stephen/book.html The book can be obtained from: http://www.worldscientific.com/books/bookshop.html or http://www.amazon.com/exec/obidos/ASIN/1860941818/qid%3D957267900/sr%3D1-2/102-5354817-5263206 Sincerely, Stephen McKenna Department of Applied Computing University of Dundee DD1 4HN Tel.: +44 (0)1382 344732 Fax.: +44 (0)1382 345509 ================ [ Table of contents inserted by Connectionists moderator: ] Contents PART I BACKGROUND 1 About Face The Visual Face, The Changing Face, Computing Faces, Biological Perspectives, The Approach 2 Perception and Representation A Distal Object, Representation by 3D Reconstruction, Two-dimensional View-based Representation, Image Template-based Representation, The Correspondence Problem and Alignment, Biological Perspectives, Discussion 3 Learning Under Uncertainty Statistical Learning, Learning as Function Approximation, Bayesian Inference and MAP Classification, Learning as Density Estimation, Unsupervised Learning without Density Estimation, Linear Classification and Regression, Non-linear Classification and Regression, Adaptation, Biological Perspectives, Discussion PART II FROM SENSORY TO MEANINGFUL PERCEPTION 4 Selective Attention: Where to Look Pre-attentive Visual Cues from Motion, Learning Object-based Colour Cues, Perceptual Grouping for Selective Attention, Data Fusion for Perceptual Grouping, Temporal Matching and Tracking, Biological Perspectives, Discussion 5 A Face Model: What to Look For Person-independent Face Models for Detection, Modelling the Face Class, Modelling a Near-face Class, Learning a Decision Boundary, Perceptual Search, Biological Perspectives, Discussion 6 Understanding Pose Feature and Template-based Correspondence, The Face Space across Views: Pose Manifolds, Template Matching as Affine Transformation, Similarities to Prototypes across Views, Learning View-based Support Vector Machines, Biological Perspectives, Discussion 7 Prediction and Adaptation Temporal Observations, Propagating First-order Markov Processes, Kalman Filters, Propagating Non-Gaussian Conditional Densities, Tracking Attended Regions, Adaptive Colour Models, Selective Adaptation, Tracking Faces, Pose Tracking, Biological Perspectives, Discussion PART III MODELS OF IDENTITY 8 Single-View Identification Identification Tasks, Nearest-neighbour Template Matching, Representing Knowledge of Facial Appearance, Statistical Knowledge of Facial Appearance, Statistical Knowledge of Identity, Structural Knowledge: The Role of Correspondence, Biological Perspectives, Discussion 9 Multi-View Identification View-based Models, The Role of Prior Knowledge, View Correspondence in Identification, Generalisation from a Single View, Generalisation from Multiple Views, Biological Perspectives, Discussion 10 Identifying Moving Faces Biological Perspectives, Computational Theories of Temporal Identification, Identification using Holistic Temporal Trajectories, Identification by Continuous View Transformation, An Experimental System, Discussion PART IV PERCEPTION IN CONTEXT 11 Perceptual Integration Sensory and Model-based Vision, Perceptual Fusion, Perceptual Inference, Vision as Co-operating Processes, Biological Perspectives, Discussion 12 Beyond Faces Multi-modal Identification, Visually Mediated Interaction, Visual Surveillance and Monitoring, Immersive Virtual Reality, Visual Database Screening PART V APPENDICES A Databases Database Acquisition and Design, Acquisition of a Pose-labelled 
Database, Benchmarking, Commercial Databases, Public Domain Face Databases, Discussion B Commercial Systems System Characterisation, A View on the Industry, Discussion C Mathematical Details Principal Components Analysis, Linear Discriminant Analysis, Gaussian Mixture Estimation, Kalman Filters, Bayesian Belief Networks, Hidden Markov Models, Gabor Wavelets Bibliography Index 344 pp. From steve at cns.bu.edu Mon May 15 22:32:33 2000 From: steve at cns.bu.edu (Stephen Grossberg) Date: Mon, 15 May 2000 22:32:33 -0400 Subject: The Imbalanced Brain: From Normal Behavior to Schizophrenia Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg in HTML, PDF, and Gzipped postscript: Grossberg, S. (2000). The imbalanced brain: From normal behavior to schizophrenia. Biological Psychiatry, in press. Preliminary version appears as Boston University Technical Report CAS/CNS TR-99-018. ABSTRACT: An outstanding problem in psychiatry concerns how to link discoveries about the pharmacological, neurophysiological, and neuroanatomical substrates of mental disorders to the abnormal behaviors that they control. A related problem concerns how to understand abnormal behaviors on a continuum with normal behaviors. During the past few decades, neural models have been developed of how normal cognitive and emotional processes learn from the environment, focus attention and act upon motivationally important events, and cope with unexpected events. When arousal or volitional signals in these models are suitably altered, they give rise to symptoms that strikingly resemble negative and positive symptoms of schizophrenia, including flat affect, impoverishment of will, attentional problems, loss of a theory of mind, thought derailment, hallucinations, and delusions. The present article models how emotional centers of the brain, such as the amygdala, interact with sensory and prefrontal cortices (notably ventral, or orbital, prefrontal cortex) to generate affective states, attend to motivationally salient sensory events, and elicit motivated behaviors. Closing this feedback loop between cognitive and emotional centers is predicted to generate a cognitive-emotional resonance that can support conscious awareness. When such emotional centers become depressed, negative symptoms of schizophrenia emerge in the model. Such emotional centers are modeled as opponent affective processes, such as fear and relief, whose response amplitude and sensitivity are calibrated by an arousal level and chemical transmitters that slowly inactivate, or habituate, in an activity-dependent way. These opponent processes exhibit an Inverted-U whereby behavior become depressed if the arousal level is chosen too large or too small. The negative symptoms are due to the way in which the depressed opponent process interacts with other circuits throughout the brain. Keywords: schizophrenia, arousal, prefrontal cortex, amygdala, opponent process, neural networks From bvr at stanford.edu Mon May 15 18:45:05 2000 From: bvr at stanford.edu (Benjamin Van Roy) Date: Mon, 15 May 2000 15:45:05 -0700 Subject: REMINDER: CALL FOR PAPERS -- NIPS*2000 Message-ID: <4.2.0.58.20000515154436.00d40d60@bvr.pobox.stanford.edu> CALL FOR PAPERS -- NIPS*2000 ========================================== Neural Information Processing Systems Natural and Synthetic Monday, Nov. 27 -- Saturday, Dec. 
2, 2000 Denver, Colorado ========================================== This is the fourteenth meeting of an interdisciplinary conference which brings together cognitive scientists, computer scientists, engineers, neuroscientists, physicists, statisticians, and mathematicians interested in all aspects of neural processing and computation. The conference will include invited talks as well as oral and poster presentations of refereed papers. The conference is single track and is highly selective. Preceding the main session, there will be one day of tutorial presentations (Nov. 27), and following it there will be two days of focused workshops on topical issues at a nearby ski area (Dec. 1-2). Tutorials will include: Population Codes (Richard Zemel, U. of Toronto), Linking Brain to Behavior (Stephen Grosberg, Boston U.), Markov Chain Monte Carlo (Andrew Gelman, Columbia U.), Visual Attention (Harold Pashler, UCSD) and more! Major categories for paper submission, with example subcategories (by no means exhaustive), are listed below. A special area of emphasis this year is innovative applications of neural computation. Algorithms and Architectures: supervised and unsupervised learning algorithms, feedforward and recurrent network architectures, localized basis functions, mixture models, committee models, belief networks, graphical models, support vector machines, Gaussian processes, topographic maps, decision trees, factor analysis, principal component analysis and extensions, independent component analysis, model selection algorithms, combinatorial optimization, hybrid symbolic-subsymbolic systems. Applications: innovative applications of neural computation including data mining, information retrieval, web and network applications, intrusion detection, fraud detection, bio-informatics, medical diagnosis, image processing and analysis, handwriting recognition, industrial monitoring and control, financial analysis, time-series prediction, consumer products, music, video and artistic applications, animation, virtual environments, learning dynamical systems. Cognitive Science/Artificial Intelligence: perception and psychophysics, neuropsychology, cognitive neuroscience, development, conditioning, human learning and memory, attention, language, natural language, reasoning, spatial cognition, emotional cognition, conceptual representation, neurophilosophy, problem solving and planning. Implementations: analog and digital VLSI, optical neurocomputing systems, novel neurodevices, computational sensors and actuators, simulation tools. Neuroscience: neural encoding, spiking neurons, synchronicity, sensory processing, systems neurophysiology, neuronal development, synaptic plasticity, neuromodulation, dendritic computation, channel dynamics, experimental data relevant to computational issues. Reinforcement Learning and Control: exploration, planning, navigation, Q-learning, TD-learning, state estimation, dynamic programming, robotic motor control, process control, Markov decision processes. Speech and Signal Processing: speech recognition, speech coding, speech synthesis, speech signal enhancement, auditory scene analysis, source separation, applications of hidden Markov models to signal processing, models of human speech perception, auditory modeling and psychoacoustics. 
Theory: computational learning theory, statistical physics of learning, information theory, Bayesian methods, prediction and generalization, regularization, online learning (stochastic approximation), dynamics of learning, approximation and estimation theory, complexity theory, multi-agent learning. Visual Processing: image processing, image coding, object recognition, visual psychophysics, stereopsis, motion detection and tracking. ---------------------------------------------------------------------- Review Criteria: All submitted papers will be thoroughly refereed on the basis of technical quality, significance, and clarity. Novelty of the work is also a strong consideration in paper selection, but to encourage interdisciplinary contributions, we will consider work which has been submitted or presented in part elsewhere, if it is unlikely to have been seen by the NIPS audience. Authors new to NIPS are strongly encouraged to submit their work, and will be given preference for oral presentations. Authors should not be dissuaded from submitting recent work, as there will be an opportunity after the meeting to revise accepted manuscripts before submitting a final camera-ready copy for the proceedings. Paper Format: Submitted papers may be up to seven pages in length, including figures and references, using a font no smaller than 10 point. Text is to be confined within a 8.25in by 5in rectangle. Submissions failing to follow these guidelines will not be considered. Authors are required to use the NIPS LaTeX style files obtainable by anonymous FTP at the site given below. THE STYLE FILES HAVE BEEN UPDATED; please make sure that you use the current ones and not previous versions. Submission Instructions: NIPS has migrated to electronic submissions. Full submission instructions will be available at the web site given below. You will be asked to enter paper title, names of all authors, category, oral/poster preference, and contact author data (name, full address, telephone, fax, and email). You will upload your manuscript from the same page. We are only accepting postscript manuscripts. No pdf files will be accepted this year. The electronic submission page will be available on April 28, 2000. Submission Deadline: SUBMISSIONS MUST BE LOGGED BY MIDNIGHT MAY 19, 2000 PACIFIC DAYLIGHT TIME (08:00 GMT May 20). The LaTeX style files for NIPS, the Electronic Submission Page, and other conference information are available on the World Wide Web at http://www.cs.cmu.edu/Web/Groups/NIPS Copies of the style files are also available via anonymous ftp at ftp.cs.cmu.edu (128.2.242.152) in /afs/cs/Web/Groups/NIPS/formatting. For general inquiries or requests for registration material, send e-mail to nipsinfo at salk.edu or fax to (619)587-0417. NIPS*2000 Organizing Committee: General Chair, Todd K. Leen, Oregon Graduate Institute; Program Chair, Tom Dietterich, Oregon State University; Publications Chair, Volker Tresp, Siemens AG; Tutorial Chair, Mike Mozer, University of Colorado; Workshops Co-Chairs, Rich Caruana, Carnegie Mellon University, Virginia de Sa, Sloan Center for Theoretical Neurobiology; Publicity Chair, Benjamin Van Roy, Stanford University; Treasurer, Bartlett Mel, University of Southern California; Web Masters, Doug Baker and Alex Gray, Carnegie Mellon University; Government Liaison, Gary Blasdel, Harvard Medical School; Contracts, Steve Hanson, Rutgers University, Scott Kirkpatrick, IBM, Gerry Tesauro, IBM. 
NIPS*2000 Program Committee: Leon Bottou, AT&T Labs - Research; Tom Dietterich, Oregon State University (chair); Bill Freeman, Mitsubishi Electric Research Lab; Zoubin Ghahramani, University College London; Dan Hammerstrom, Oregon Graduate Institute; Thomas Hofmann, Brown University; Tommi Jaakkola, MIT; Sridhar Mahadevan, Michigan State University; Klaus Obermayer, TU Berlin; Manfred Opper, Aston University; Yoram Singer, Hebrew University of Jerusalem; Malcolm Slaney, Interval Research; Josh Tenenbaum, Stanford University; Sebastian Thrun, Carnegie Mellon University. PAPERS MUST BE SUBMITTED BY MAY 19, 2000 From gluck at pavlov.rutgers.edu Tue May 16 17:04:04 2000 From: gluck at pavlov.rutgers.edu (Mark A. Gluck) Date: Tue, 16 May 2000 17:04:04 -0400 Subject: Neural Net Programming/Research Job at Rutgers-Newark Neuroscience Message-ID: Neural Net Programming/Research Job at Rutgers-Newark Neuroscience The Gluck and Myers labs at Rutgers-Newark have a part-time position available for a computer programmer. We are looking for someone with very strong programming skills who is capable of independent work. Hours and salary will be by arrangement, commensurate with experience and assigned work; there is also the possibility of course credits for independent study in computer science, psychology or neurobiology. We are looking for someone who can contribute to one or more of the following projects. Prior coursework dealing with brain systems and learning is not required for any of these projects, although an interest in brain science would be desirable. For more information on our research and the Memory Disorders Project at Rutgers-Newark, see the web pages listed at the bottom of this email. This job also offers the potential to learn more about NN models, the neurobiology of learning and memory, and the cognitive neuroscience of memory. Significant contributions to research will be acknowledged by author credit on academic research papers. An ideal candidate might be someone who is looking to do a year or two of research and work in a research laboratory before applying to graduate school in a related area. Work duties will span three projects: 1) Computational neuroscience. We develop neural network models of the brain and learning, focusing on the role of specific brain structures such as the hippocampus and basal forebrain. We are looking for a programmer with a strong background in C or C++ with some experience working with neural networks to modify and extend existing code; experience with Unix systems and the Solaris operating system is essential. 2) Behavioral test development. To test our computational models, we perform behavioral tests in normal people and people with various memory impairments. These tests take the form of computerized "games". We are looking for a programmer with experience using object-oriented languages to implement new tests. Currently, our tests are written in SuperCard and SuperLab languages for Macintosh. Familiarity with these languages is not essential, but the successful applicant will be prepared to learn them. Some prior experience with Macintosh computers is essential. 3) Applications programming. We have several existing behavioral tests, programmed for the Macintosh, which need to be reprogrammed to run under Microsoft Windows using a platform such as Visual Basic, Visual C++ or MatLab. Strong experience with one of these platforms and with Windows is essential. 
If interested, please email both gluck at pavlov.rutgers.edu and myers at pavlov.rutgers.edu with information on your background experience, relevant skills, and future career goals. Please give emails for three people who can write letters of recommendation for you. Mark A. Gluck Associate Professor of Neuroscience Catherine E. Myers Assistant Professor of Psychology _______________________________________________________________ Dr. Mark A. Gluck, Associate Professor Center for Molecular and Behavioral Neuroscience Phone: (973) 353-1080 x3221 Rutgers University Fax: (973) 353-1272 197 University Ave. Newark, New Jersey 07102 Email: gluck at pavlov.rutgers.edu WWW Homepages: Research Lab: http://www.gluck.edu Rutgers Memory Disorders Project: http://www.memory.rutgers.edu ______________________________________________________________ From skremer at q.cis.uoguelph.ca Wed May 17 14:11:06 2000 From: skremer at q.cis.uoguelph.ca (Stefan C. Kremer) Date: Wed, 17 May 2000 14:11:06 -0400 (EDT) Subject: No subject Message-ID: Dear Connectionists: We are in the process of trying to organize a competition involving using unlabeled data for supervised learning (similar to previous competitions on learning time-series and grammars). If this sounds like something you might be interested in, please check out the web-page at: http://q.cis.uoguelph.ca/~skremer/NIPS2000 Thanks, -Stefan -- Dr. Stefan C. Kremer, Assistant Prof., Dept. of Computing and Information Science University of Guelph, Guelph, Ontario N1G 2W1 WWW: http://hebb.cis.uoguelph.ca/~skremer Tel: (519)824-4120 Ext.8913 Fax: (519)837-0323 E-mail: skremer at snowhite.cis.uoguelph.ca From cristina at idsia.ch Thu May 18 06:47:16 2000 From: cristina at idsia.ch (Cristina Versino) Date: Thu, 18 May 2000 12:47:16 +0200 Subject: Neuroinformatics for 'living' artefacts Message-ID: <200005181047.MAA02735@rapa.idsia.ch> The Information Society Technologies programme and the Quality of Life programme are launching a joint call for project proposals on "Neuroinformatics for 'living' artefacts". The call objective is to explore new synergies between Neurosciences and Information Technologies in order to enable the construction of hardware/software "artefacts that live and grow", i.e. artefacts that self-adapt and evolve beyond pure programming. Preference will be given to work that demonstrates adaptability and growth in the "real world" and that does not simply extrapolate from an already established research field (such as neural-networks or genetic algorithms). An Information Workshop on the call will take place in Brussels on the 9th of June. The workshop will feature: - An in-depth presentation of the call by European Commission staff. - To stimulate thinking and discussion, invited speakers will present topics of their choice that relate to the call. - _Stand up and speak_: anybody having registered can speak for 2 minutes to describe their interests and the kind of partnerships they are seeking. Updated information on the workshop: http://www.cordis.lu/ist/fetni-iw.htm Information on the call: http://www.cordis.lu/ist/fetni.htm http://www.cordis.lu/life/src/neuro.htm From ascoli at osf1.gmu.edu Fri May 19 12:25:28 2000 From: ascoli at osf1.gmu.edu (GIORGIO ASCOLI) Date: Fri, 19 May 2000 12:25:28 -0400 (EDT) Subject: Postdoc position available Message-ID: Please post and circulate as you see fit. Many thanks! 
COMPUTATIONAL NEUROSCIENCE POST-DOCTORAL POSITION AVAILABLE A post-doctoral position is available immediately for computational modeling of dendritic morphology, neuronal connectivity, and development of anatomically and physiologically accurate neural networks. All highly motivated candidates with a recent PhD (or expecting one in year 2000) in biology, computer science, or other scientific disciplines are encouraged to apply. Academic background and/or a strong interest in either computational science or neuroscience are required, as well as the ability and willingness to learn new techniques and concepts. Programming skills and/or experience with modeling packages are desirable but not necessary. The post-doc will join a young and dynamic research group at the Krasnow Institute for Advanced Study, located in Fairfax, VA (less than 20 miles west of Washington DC). The initial research project is focused on (1) the generation of complete neurons in virtual reality that accurately reproduce the experimental morphological data; and/or (2) the study of the influence of dendritic shape (geometry and topology) on the electrophysiological behavior. We have developed advanced software to build network models of entire regions of the brain (e.g. the rat hippocampus). Please refer to our website for further details: www.krasnow.gmu.edu/ascoli/CNG The post-doc will be hired as a Research Assistant Professor (with VA state employee benefits) with a salary based on the NIH postdoctoral scale, and will have full access to library and computing facilities both within the Krasnow Institute and George Mason University. Send CV, (p)reprints, a brief description of your motivation, and names, email addresses and phone/fax numbers of three references to: ascoli at gmu.edu (or by fax at the number below) ASAP. There is no deadline but the position will be filled as soon as a suitable candidate is found. Non-resident aliens are also welcome to apply. The Krasnow Institute is an equal opportunity employer. Giorgio Ascoli, PhD Head, Computational Neuroanatomy Group Krasnow Institute for Advanced Study at George Mason University, MS2A1 Fairfax, VA 22030 Ph. (703)993-4383 Fax (703)993-4325 From wolfskil at MIT.EDU Fri May 19 14:45:33 2000 From: wolfskil at MIT.EDU (Jud Wolfskill) Date: Fri, 19 May 2000 14:45:33 -0400 Subject: book announcement--Thornton Message-ID: A non-text attachment was scrubbed... Name: not available Type: text/enriched Size: 7454 bytes Desc: not available Url : https://mailman.srv.cs.cmu.edu/mailman/private/connectionists/attachments/00000000/39ea9e34/attachment-0001.bin From michael at cs.unm.edu Fri May 19 19:17:27 2000 From: michael at cs.unm.edu (Zibulevsky Michael) Date: Fri, 19 May 2000 17:17:27 -0600 Subject: paper: Blind Source Separation by Sparse Decomposition Message-ID: Announcing a paper (revised version) ... Title: Blind Source Separation by Sparse Decomposition in a Signal Dictionary Authors: Michael Zibulevsky and Barak A. Pearlmutter Abstract The blind source separation problem is to extract the underlying source signals from a set of linear mixtures, where the mixing matrix is unknown. This situation is common in acoustics, radio, medical signal and image processing, hyperspectral imaging, etc. We suggest a two-stage separation process. First, a priori selection of a possibly overcomplete signal dictionary (for instance a wavelet frame, or a learned dictionary) in which the sources are assumed to be sparsely representable. 
Second, unmixing the sources by exploiting their sparse representability. We consider the general case of more sources than mixtures, but also derive a more efficient algorithm in the case of a non-overcomplete dictionary and an equal number of sources and mixtures. Experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques. URL of the ps file: http://ie.technion.ac.il/~mcib/spica12.ps.gz Contact: michael at cs.unm.edu, bap at cs.unm.edu From P.Dean at sheffield.ac.uk Wed May 17 05:25:28 2000 From: P.Dean at sheffield.ac.uk (Paul Dean) Date: Wed, 17 May 2000 10:25:28 +0100 Subject: Position available Message-ID: UNIVERSITY OF SHEFFIELD DEPARTMENT OF PSYCHOLOGY POSTDOCTORAL POSITION IN COMPUTATIONAL NEUROSCIENCE (R2012) Applications are invited for a 3 year post-doctoral research position to develop second-generation distributed models for the control of eye movements, with special reference to the role of the cerebellum. The project is supervised by Drs. Paul Dean and John Porrill. Applicants with an interest in the mathematical modelling of biological systems are welcomed. Experience with MATLAB is preferred but not essential. The post begins in October 2000. Salary: 16,286 - 20,811 pa. Closing date: 20 July 2000. For details of this post, email: jobs at sheffield.ac.uk or tel: 0114 222 1631 (24 hr). Please quote the post reference (R2012) in all enquiries. Vacancy Website: http://www.shef.ac.uk/jobs/ Informal enquiries to Paul Dean: email p.dean at sheffield.ac.uk, phone +44 (0)114 222 1631 (24 hr). ------------------------------------------------------------------------------ Dr. Paul Dean, Department of Psychology University of Sheffield, Sheffield S10 2TP England Phone: +44 (0)114 222 6521 Fax: +44 (0)114 276 6515 From jobs at thuris.com Sat May 20 00:09:47 2000 From: jobs at thuris.com (Thuris Corporation) Date: Fri, 19 May 2000 20:09:47 -0800 Subject: Thuris Corporation seeks senior scientists (two positions) Message-ID: Thuris Corporation (www.thuris.com) is a startup company applying innovative computer science methods to advanced neuroscience data. We offer extremely competitive compensation packages, as well as the opportunity to work in a thriving cutting-edge environment. We are headquartered in Newport Beach, California, and have extensive collaborative and cross-licensing agreements with the University of California, Irvine. Job Description: The individuals hired will work on a range of projects involving the computational analysis of brain data. Thuris currently has three products under development: the NeuroGraph, an EEG-based device for the diagnosis of Alzheimer's disease (see Forbes ASAP article, 5/29/'00); the BrainPrint system, a new method for in-depth analysis of the effects of pharmaceutical compounds in the brain, and RapidAging, an assay system for testing the effects of candidate neuroprotective drugs. All are patented or patent pending, and all are in advanced stages of development. Job responsibilities include designing and building applications software, diagnostic and classification software, and networking software. Specific responsibilities will vary by project, and may include data analysis, pattern recognition, statistical modeling, machine learning, and neural networks. Required qualifications: Bachelor's degree in computer science or related field, with experience in programming, pattern recognition, and advanced statistical tools. Ability to interact well with co-workers. 
Proficiency in C, C++, object-oriented programming, and familiarity with some analysis tools, e.g., MATLAB, etc. Preferred qualifications: Master's degree or Ph.D. in computer science or related field, background in neuroscience, experience in designing large statistical or pattern recognition systems. To apply: Thuris Corporation Suite 1100 620 Newport Center Drive Newport Beach, CA 92660 Attn: Human Resources fax: 949-856-9036 email: jobs at thuris.com From john at dcs.rhbnc.ac.uk Mon May 22 05:37:15 2000 From: john at dcs.rhbnc.ac.uk (John Shawe-Taylor) Date: Mon, 22 May 2000 10:37:15 +0100 (BST) Subject: NeuroCOLT workshop on Reinforcement Learning Message-ID: NeuroCOLT workshop on Reinforcement Learning, 12-16 July 2000 Cumberland Lodge Windsor Great Park Windsor England registration forms and more information on the website: http://www.neurocolt.org/reinforcement.html Summary Reinforcement learning is one of the most active research areas in artificial intelligence. In this approach to machine learning, an agent tries to maximize the total amount of reward it receives when interacting with a complex and uncertain environment. The analysis of this scenario is radically different from standard approaches to supervised learning, as many of the common assumptions do not hold. Recent advances in the theoretical analysis of this problem will be surveyed in a 3-day course by Michael Kearns, followed by invited talks by Jonathan Baxter and Chris Watkins, and some talks contributed by the participants in the workshop. (For more details see the web site.) Cost The inclusive cost per participant is GBP 406.00 The non-resident cost is GBP 250.00; this includes all meals except breakfast Accommodation Residential accommodation for the workshop is limited. A maximum of 23 rooms is available on the first two nights, but more become available thereafter. From dale at logos.math.uwaterloo.ca Mon May 22 15:10:03 2000 From: dale at logos.math.uwaterloo.ca (Dale Schuurmans) Date: Mon, 22 May 2000 15:10:03 -0400 (EDT) Subject: CFP: MLJ special issue Message-ID: <200005221910.PAA26155@newlogos.math.uwaterloo.ca> Call for Papers MACHINE LEARNING Journal Special Issue on NEW METHODS FOR MODEL SELECTION AND MODEL COMBINATION GUEST EDITORS: Yoshua Bengio, Université de Montréal Dale Schuurmans, University of Waterloo SUBMISSION DEADLINE: July 31, 2000 (electronic submission in pdf or postscript format) A fundamental tradeoff in machine learning and statistics is the under-fitting versus over-fitting dilemma: When inferring a predictive relationship from data one must typically search a complex space of hypotheses to ensure that a good predictive model is available, but must simultaneously restrict the hypothesis space to ensure that good candidates can be reliably distinguished from bad. That is, the learning problem is fundamentally ill-posed; several functions might fit a given set of data but behave very differently on further data drawn from the same distribution. A classical approach to coping with this tradeoff is to perform "model selection" where one imposes a complexity ranking over function classes and then optimizes a combined objective of class complexity and data fit. In doing so, however, it would be useful to have an accurate estimate of the expected generalization error at each complexity level so that the function class with the lowest expected error could be selected, or functions from the classes with lowest expected error could be combined, and so on. 
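A minimal sketch of this classical procedure, in Python (purely illustrative and not taken from the special issue or the workshop; the data and variable names are invented for the example): rank the function classes by complexity, here polynomial degree, and keep the class whose held-out error is lowest, a simple surrogate for the combined objective of class complexity and data fit described above.

import numpy as np

rng = np.random.RandomState(0)
x = np.linspace(-1.0, 1.0, 60)
y = np.sin(3.0 * x) + 0.3 * rng.randn(60)      # noisy observations of an unknown function

# random split into a training half and a validation half
idx = rng.permutation(60)
train, valid = idx[:30], idx[30:]

def validation_error(degree):
    # fit a polynomial of the given degree to the training half (least squares)
    coeffs = np.polyfit(x[train], y[train], degree)
    pred = np.polyval(coeffs, x[valid])
    return np.mean((pred - y[valid]) ** 2)     # mean squared error on held-out data

# the complexity ranking: degree 1 (simplest) up to degree 10
errors = {d: validation_error(d) for d in range(1, 11)}
best_degree = min(errors, key=errors.get)
print("selected degree:", best_degree)

Degrees that are too low under-fit and degrees that are too high over-fit the thirty training points, so the held-out error is typically smallest at an intermediate degree; penalized criteria or the resampling and combination methods discussed in this call play the same role without a dedicated validation set.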
Many approaches have been proposed for this purpose in both the statistics and the machine learning research communities. Recently in machine learning and statistics there has been renewed interest in techniques for evaluating generalization error, for optimizing generalization error, and for combining and selecting models. This is exemplified, for instance, by recent work on structural risk minimization, support vector machines, boosting algorithms, and the bagging algorithm. These new approaches suggest that better generalization performance can be obtained using new, broadly applicable procedures. Progress in this area has not only been important for improving our understanding of how machine learning algorithms can generalize effectively, but has already proven its value in real applications of machine learning and data analysis. We seek submissions that cover any of these new areas of predictive model selection and combination. We are particularly interested in papers that present current work on boosting, bagging, and Bayesian model combination techniques, as well as work on model selection, regularization, and other automated complexity control methods. Papers can be either theoretical or empirical in nature; our primary goal is to collect papers that shed new light on existing algorithms or propose new algorithms that can be shown to exhibit superior performance under identifiable conditions. The key evaluation criteria will be insight and novelty. This special issue of Machine Learning follows from a successful workshop held on the same topic at the Université de Montréal in April 2000. This workshop brought together several key researchers in the fields of machine learning and statistics to discuss current research issues on boosting algorithms, support vector machines, and model selection and regularization techniques. Further details about the workshop can be found at www.iro.umontreal.ca/~bengioy/crmworkshop2000. SUBMISSION INSTRUCTIONS: Papers should be sent by email to dale at cs.uwaterloo.ca by July 31, 2000. The preferred format for submission is PDF or Postscript. (Please be sure to embed any special fonts.) If electronic submission is not possible, then a hard copy can be sent to: Dale Schuurmans Department of Computer Science 200 University Avenue West University of Waterloo Waterloo, Ontario N2L 3G1 Canada (519) 888-4567 x6769 (for courier delivery) From leews at comp.nus.edu.sg Tue May 23 03:31:11 2000 From: leews at comp.nus.edu.sg (Lee Wee Sun) Date: Tue, 23 May 2000 15:31:11 +0800 (GMT-8) Subject: postdoc position available Message-ID: The computational learning theory group of the National University of Singapore is looking for a postdoc. The focus of the project funding the position is the search for theoretically principled, practical algorithms for machine learning. One of the main areas of interest of the project is developing learning algorithms that work on the internet - for applications such as collaborative filtering, adaptive placement of web advertisements and information retrieval. The postdoc will be free to pursue independent research along these general lines, but will also have the opportunity to collaborate with the members of the group on other research areas (see http://www.comp.nus.edu.sg/~plong/nuscolt.html for a description of the research interests of the members of the group). The starting date is somewhat flexible, but should be some time before September, 2000. The position runs for two years. 
If you are interested, please send your CV and the names of three references to leews at comp.nus.edu.sg by June 24, 2000. From terry at salk.edu Tue May 23 23:51:40 2000 From: terry at salk.edu (terry@salk.edu) Date: Tue, 23 May 2000 20:51:40 -0700 (PDT) Subject: NEURAL COMPUTATION 12:6 Message-ID: <200005240351.UAA12818@hebb.salk.edu> Neural Computation - Contents - Volume 12, Number 6 - June 1, 2000 ARTICLE Separating Style and Content with Bilinear Models Joshua B. Tenenbaum and William T. Freeman NOTES Relationships between the A Priori and a Posteriori Errors in Nonlinear Adaptive Neural Filters Danilo P. Mandic and Jonathon A. Chambers The VC Dimension For Mixtures of Binary Classifiers Wenxin Jiang The Early Restart Algorithm Malik Magdon-Ismail and Amir F. Atiya LETTERS Attractor Dynamics in Feedforward Neural Networks Lawrence K. Saul and Michael I. Jordan Visualizing the Function Computed by a Feedforward Neural Network Tony Plate, Joel Bert, John Grace and Pierre Band Discriminant Pattern Recognition Using Transformation Invariant Neurons Diego Sona, Alessandro Sperduti, Antonina Starita Observable Operator Models for Discrete Stochastic Time Series Herbert Jaeger Adaptive Method of Realizing Natural Gradient Learning for Multilayer Perceptrons Shun-ichi Amari, Hyeyoung Park, and Kenji Fukumizu Nonmonotonic Generalization Bias of Gaussian Mixture Models Shotaro Akaho and Hilbert J. Kappen Efficient Block Training of Multilayer Perceptrons A. Navia-Vazquez and A. R. Figueiras-Vidal An Optimization Approach to Design of Generalized BSB Neural Associative Memories Jooyoung Park and Yonmook Park Nonholonomic Orthogonal Learning Algorithms for Blind Source Separation Shun-ichi Amari, Tian-Ping Chen, and Andrzej Cichocki ----- ON-LINE - http://neco.mitpress.org/
SUBSCRIPTIONS - 2000 - VOLUME 12 - 12 ISSUES
                   USA      Canada*    Other Countries
Student/Retired    $60      $64.20     $108
Individual         $88      $94.16     $136
Institution        $430     $460.10    $478
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 258-6779 mitpress-orders at mit.edu ----- From qian at brahms.cpmc.columbia.edu Thu May 25 19:09:20 2000 From: qian at brahms.cpmc.columbia.edu (Ning Qian) Date: Thu, 25 May 2000 19:09:20 -0400 Subject: stereo vision paper available Message-ID: <200005252309.TAA15264@brahms.cpmc.columbia.edu> Dear Connectionists, The following paper on modeling disparity attraction and repulsion is available at: http://brahms.cpmc.columbia.edu/publications/attract-repul.ps.gz It has 28 text pages and 11 figures. Best regards, Ning -------------------------------------------------------------- A Physiologically-Based Explanation of Disparity Attraction and Repulsion Samuel Mikaelian and Ning Qian, Vision Research (in press). Abstract Westheimer and Levi found that when a few isolated features are viewed foveally, the perceived depth of a feature depends not only on its own disparity but also on those of its neighbors. The nature of this interaction is a function of the lateral separation between the features: When the distance is small the features appear to attract each other in depth but the interaction becomes repulsive at larger distances. Here we introduce a two-dimensional extension of our recent stereo model based on the physiological studies of Ohzawa et al., and demonstrate through analyses and simulations that these observations can be naturally explained without introducing ad hoc assumptions about the connectivity between disparity-tuned units. 
In particular, our model can explain the distance-dependent attraction/repulsion phenomena in both the vertical-line configuration used by Westheimer, and the horizontal-line-and-point configuration used by Westheimer and Levi. Thus, the psychophysically observed disparity interaction may be viewed as a direct consequence of the known physiological organization of the binocular receptive fields. We also find that the transition distance at which the disparity interaction between features changes from attraction to repulsion is largely determined by the preferred spatial frequency and orientation distributions of the cells used in the disparity computation. This result may explain the observed variations of the transition distance among different subjects in the psychophysical experiments. Finally, our model can also reproduce the observed effect on the perceived disparity when the disparity magnitude of the neighboring features is changed. From stefan.wermter at sunderland.ac.uk Fri May 26 12:54:17 2000 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Fri, 26 May 2000 17:54:17 +0100 Subject: NN and neuroscience workshop call Message-ID: <392EAC38.E6A37954@sunderland.ac.uk> ***We also plan to have six places for advanced PhD students or recent post-doctorates and encourage applicants.**** EmerNet: International EPSRC Workshop on Current Computational Architectures Integrating Neural Networks and Neuroscience. Date: 8-9 August 2000 Location: Durham Castle, Durham, United Kingdom Workshop web page is http://www.his.sunderland.ac.uk/worksh3 Organising Committee ----------------------- Prof. Stefan Wermter Chair Hybrid Intelligent Systems Group University of Sunderland Prof. Jim Austin Advanced Computer Architecture Group Department of Computer Science University of York Prof. David Willshaw Institute for Adaptive and Neural Computation Division of Informatics University of Edinburgh Call for Papers and Participation -------------------------------- Description and Motivation --------------------------- Although there is a massive body of research and knowledge regarding how processing occurs in the brain, this has had little impact on the design and development of computational systems. Many challenges remain in the development of computational systems, such as robustness, learning capability, modularity, massive parallelism for speed, simple programming, greater reliability, etc. This workshop aims to consider whether the design of computational systems can learn from the integration of cognitive neuroscience, neurobiology and artificial neural networks. The main objective is the transfer of knowledge by bringing together researchers in the twin domains of artificial and real neural networks. The goal is to enable computer scientists to comprehend how the brain processes information to generate new techniques for computation, and to encourage neuroscientists to consider computational factors when performing their research. Areas of Interest for Workshop -------------------------------- The main areas of interest for the workshop bring together Neural Network Architectures and Neuroscience Robustness: What are the characteristics that enable the human brain to carry on operating despite failure of its elements? How can the brain's slow but robust memory be utilised to replace the brittle but fast memory presently found in conventional computers? 
Modular construction: How can the brain provide ideas for bringing together the current small artificial neural networks to create larger modular systems that can solve more complex tasks like associative retrieval, vision and language understanding? Learning in context: There is evidence from neuron, network and brain levels that the internal state of such a neurobiological system has an influence on processing and learning. Is it possible to build computational models of these processes and states, and design incremental learning algorithms and dynamic architectures? Synchronisation: How does the brain synchronise its processing when using millions of processors? How can large asynchronous computerised systems be produced that do not rely on a central clock? Timing: Undertaking actions before a given deadline is vital. What structural and processing characteristics enable the brain to deal with real time situations? How can these be incorporated into a computerised approach? Processing speed: Despite having relatively slow computing elements, how is real-time performance achieved? Preliminary Invited Speakers We plan to have around 30 attendees, including invited speakers and other participants. -------------------------------------- Dr Jim Fleming - EPSRC Prof. Michael Fourman - University of Edinburgh Prof. Angela Friederici - Max Planck Institute of Cognitive Neuroscience Prof. Stephen Hanson - Rutgers University Prof. Stevan Harnad - University of Southampton Prof. Vasant Honavar - Iowa State University Dr Hermann Moisl - University of Newcastle upon Tyne Prof. Heiko Neumann - Universitaet Ulm Prof. Guenther Palm - Universitaet Ulm Prof. Kim Plunkett (tbc) - Oxford University Prof. James A. Reggia - University of Maryland Prof. John Taylor - King's College London Workshop Details ------------------- In order to have a workshop of the highest quality, it will combine paper presentations by participants on one of the six areas of interest with more open, discussion-oriented activities. The discussion element of the EmerNet Workshop will be related to the questions above, and it is highly desirable that those wishing to participate focus on one or more of these issues in an extended abstract or position paper of up to 4 pages. Papers should be sent in either ps, pdf or doc format via email for consideration to Professor Stefan Wermter and Mark Elshaw by the 1st of June 2000. THE KEY QUESTION IS: What can we learn from cognitive neuroscience and the brain for building new computational neural architectures? It is intended that for all participants, registration, meals and accommodation at Durham Castle for the Workshop will be provided free of charge. Further, specially invited participants will have reasonable travel expenses reimbursed, and additional participants will have their UK rail travel costs reimbursed. ***We also plan to have six places for PhD students or recent post-doctorates and encourage applicants.**** Extended versions of papers can be published as chapters in a book with Springer. Location - Durham Castle ------------------------- The EmerNet Workshop is to be held at Durham Castle, Durham (chosen as it lies between Sunderland, York and Edinburgh) in the North East of England. There are few places in the world that can match the historic City of Durham, with its dramatic setting on a rocky horseshoe bend in the River Wear and beautiful local countryside. Furthermore, it offers easy accessibility by rail from anywhere in Great Britain and is close to the international airport at Newcastle. 
The workshop provides the chance to stay at a real English castle that was constructed under the orders of King William the Conqueror in 1072, shortly after the Norman Conquest. It has many rooms of interest including a Norman Chapel that has some of the most fascinating Norman sculptures in existence and the Great Hall that acts as the dining area. Holding the EmerNet Workshop at this excellent location provides the chance for interesting and productive discussion in a peaceful and historic atmosphere. It is possible to gain a flavour of Durham Castle and Cathedral on the on-line tour at http://www.dur.ac.uk/~dla0www/c_tour/tour.html Contact Details --------------- Mark Elshaw (Workshop Organiser) Hybrid Intelligent Systems Group Informatics Centre SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3249 Fax: +44 191 515 2781 E-mail: Mark.Elshaw at sunderland.ac.uk Prof. Stefan Wermter (Chair) Informatics Centre, SCET University of Sunderland St Peter's Way Sunderland SR6 0DD United Kingdom Phone: +44 191 515 3279 Fax: +44 191 515 2781 E-mail: Stefan.Wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ From papanik at intelligencia.com Sat May 27 01:31:28 2000 From: papanik at intelligencia.com (Kostas Papanikolaou) Date: Sat, 27 May 2000 01:31:28 -0400 Subject: NNA '01 (Neural Networks and Applications), FSFS '01 (Fuzzy Sets and Fuzzy Systems), EC '01 (Evolutionary Computation) Message-ID: <005d01bfc79d$0fa92740$0101a8c0@Alexandros> Our sincere apologies if multiple copies of the call for papers reach you or if these conferences are not within your research interests. ******************************************************************** We invite you to submit a paper and/or organize a special session for NNA, FSFS, EC 2001. NNA '01 (Neural Networks and Applications), FSFS '01 (Fuzzy Sets and Fuzzy Systems) and EC '01 (Evolutionary Computation) -- a unique triplet of soft computing conferences -- will take place in Puerto De La Cruz, Tenerife, Canary Islands (Spain), February 11-15, 2001. Web Sites: http://www.worldses.org/wses/nna http://www.worldses.org/wses/fsfs http://www.worldses.org/wses/ec They are sponsored by: The World Scientific and Engineering Society (WSES) Co-Sponsored by IIARD, IMCS and they are supported by NeuroDimension Inc. http://www.nd.com Could you, please, forward the following call_for_papers to your friends, colleagues, working groups? Sincerely Yours K.Papanikolaou papanico at go.com papanik at intelligencia.com ***************** CALL FOR PAPERS ********************** We invite you to submit a paper and/or organize a special session for NNA, FSFS, EC 2001. Could you, please, forward the following call_for_papers to your friends, colleagues, working groups? Thanks a lot! 2001 WSES International Conference on: Neural Networks and Applications (NNA '01) http://www.worldses.org/wses/nna - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . 2001 WSES International Conference on: Fuzzy Sets & Fuzzy Systems (FSFS '01) http://www.worldses.org/wses/fsfs - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . 2001 WSES International Conference on: Evolutionary Computations (EC '01) http://www.worldses.org/wses/ec - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . - . 
Sponsored by: The World Scientific and Engineering Society (WSES) Co-Sponsored by IIARD, IMCS Supported by NeuroDimension Inc http://www.nd.com The WORLDSES Conferences: NNA'01 (Neural Networks and Applications), FSFS'01 (Fuzzy Sets and Fuzzy Systems), EC'01 (Evolutionary Computation) will take place at the Hotel TENERIFE PLAYA (main hotel for the conferences) as well as at the Hotel SAN FELIPE. (Puerto De La Cruz, Tenerife, Canary Islands (Spain), February 11-15, 2001). DEADLINE FOR PAPER SUBMISSION: OCTOBER 30, 2000 NOTIFICATION OF ACCEPTANCE/REJECTION: NOVEMBER 30, 2000 INTERNATIONAL SCIENTIFIC COMMITTEE: Prof. Peter G. Anderson, Rochester Institute of Technology, NY, USA. Prof. George Antoniou, Montclair State University, NJ, USA. Prof. Hamid Arabnia, University of Georgia, Georgia, USA. Prof. Hans-Georg Beyer, University of Dortmund, Germany. Prof. Hans-Heinrich Bothe, Technical University of Denmark, Denmark. Prof. Andrew Lim Leong Chye, National Technical University, Singapore. Prof. Raimondas Ciegis, Vilnius Technical University, Lithuania. Prof. Patrick Corr, The Queen's University of Belfast, Northern Ireland. Prof. Satnam Dlay, University of Newcastle, UK. Prof. Meng Joo Er, National Technological University, Singapore. Prof. Janos Fodor, Szent Istvan University, Hungary. Prof. David Fogel, Natural Selection Company, IEEE Editor Trans.EC., USA. Prof. Kaoru Hirota, Tokyo Institute of Technology, Japan. Prof. Damir Kalpic, University of Zagreb, Croatia. Prof. Dae-Seong Kang, Dong-A University, Korea. Prof. Nikola Kasabov, University of Otago, Dunedin, New Zealand. Prof. Rudolf Kruse, Universitaet Magdeburg, Germany. Prof. Franz Kurfess, Concordia University, Canada. Prof. Pascal Lorenz, Universite de Haute Alsace, Colmar, France. Prof. Maria Makrynaki, IMCS, Greece. Prof. Nikos Mastorakis, Hellenic Naval Academy, Greece. Prof. Valeri Mladenov, Eindhoven University of Technology, Netherlands. Prof. Ahmed Mohamed, American University of Cairo, Egypt. Prof. Masoud Mohammadian, University of Canberra, Australia. Prof. Fionn Murtagh, The Queen's University of Belfast, Northern Ireland. Prof. Fazel Naghdy, University of Wollongong, Australia. Prof. Erkki Oja, Helsinki University of Technology (HUT), Finland. Prof. Marcin Paprzycki, University of Southern Mississippi, USA. Prof. Hristo Radev, Technical University of Sofia, Bulgaria. Prof. Raul Rojas, Freie Universitaet Berlin, Germany. Prof. David Sanchez, Elsevier: Neurocomputing, Editor in Chief, Pasadena, USA. Prof. Francisco Torrens, Universitat de Valencia, Spain. Prof. Tom Whalen, The Georgia State University, Atlanta, USA. Prof. Yanqing Zhang, The Georgia State University, Atlanta, USA. Prof. Hans-Juergen Zimmermann, RWTH Aachen, Germany. Prof. Jacek Zurada, University of Louisville, USA. Dr. Jacob Barhen, CESAR, ORNL, TN, USA, also with JPL, California, USA. Dr. Dimitris Tsaptsinos, Kingston University, UK. Dr. Qingfu Zhang, UMIST, Manchester, UK. 
NNA'01 TOPICS: ============== Biological Neural Networks Artificial Neural Networks Mathematical Foundations of Neural Networks Virtual Environments Neural Networks (NN) for Signal Processing Connectionist Systems Learning Theory Architectures and Algorithms Neurodynamics and Attractor Networks Pattern Classification and Clustering Hybrid and Knowledge-Based Networks Artificial Life Implementation of (artificial) NN VLSI techniques for NN implementation Neural Control NN for Robotics NN for Optimization, Systems theory and Operational Research NN in Numerical Analysis problems NN Training using Fuzzy Logic NN Training using Evolutionary Computations Interaction between: Neural Networks - Fuzzy Logic - Genetic Algorithms NN and Non-linear Systems NN and Chaos and Fractals Modeling and Simulation Hybrid Intelligent systems Neural Networks for Electric Machines Neural Networks for Power Systems Neural Networks for Real-Time Systems Neural Networks in Information Systems Neural Networks in Decision Support Systems Neural Networks and Discrete Event Systems Neural Networks in Communications Neural Networks for Multimedia Neural Networks for Educational Software Neural Networks for Software Engineering NN for Adaptive Control NN for Aerospace, Oceanic and Vehicular Engineering Man-Machine Systems Cybernetics and Bio-Cybernetics Relevant Topics and Applications Parallel and Distributed Systems Special Topics Others. FSFS'01 TOPICS: ============== Fuzzy Logic Fuzzy Sets Fuzzy Topology and Fuzzy Functional Analysis Fuzzy Differential Geometry Fuzzy Differential Equations Fuzzy Algorithms Fuzzy Geometry Fuzzy Languages Fuzzy Control Fuzzy Signal Processing Fuzzy Subband Image Coding VLSI Fuzzy Systems Approximate Reasoning Fuzzy Logic and Possibility theory Fuzzy Expert Systems Fuzzy Systems theory Connectionist Systems Learning Theory Pattern Classification and Clustering Hybrid and Knowledge-Based Networks Artificial Life Fuzzy Systems in Robotics Fuzzy Systems for Operational Research NN Training using Fuzzy Logic Interaction between: Neural Networks - Fuzzy Logic - Genetic Algorithms Fuzzy Systems and Non-linear Systems Fuzzy Systems and Chaos and Fractals Modeling and Simulation Hybrid Intelligent systems Fuzzy Systems and Fuzzy Engineering for Electric Machines Fuzzy Systems and Fuzzy Engineering for Power Systems Fuzzy Systems and Fuzzy Engineering for Real-Time Systems Fuzzy Systems and Fuzzy Engineering for Information Systems Fuzzy Systems and Fuzzy Engineering for Decision Support Systems Fuzzy Systems and Fuzzy Engineering for Discrete Event Systems Fuzzy Systems and Fuzzy Engineering for Communications Fuzzy Systems and Fuzzy Engineering for Multimedia Fuzzy Systems and Fuzzy Engineering for Educational Software Fuzzy Systems and Fuzzy Engineering for Software Engineering Fuzzy Systems and Fuzzy Engineering for Adaptive Control Fuzzy Systems and Fuzzy Engineering for Aerospace, Oceanic and Vehicular Engineering Man-Machine Systems Cybernetics and Bio-Cybernetics Relevant Topics and Applications Parallel and Distributed Systems Special Topics Others. 
EC'01 TOPICS: ============= Genetic Algorithms (GA) Mathematical Foundations of GA Evolution Strategies Genetic Programming Evolutionary Programming Classifier Systems Cultural algorithms Simulated Evolution Artificial Life Learning Theory Pattern Classification and Clustering Evolutionary Computations (EC) in Knowledge Engineering Evolvable Hardware Molecular Computing EC in Control Theory EC in Signal Processing EC for Image Coding Approximate Reasoning EC in Robotics EC for Operational Research Neural Networks Training using EC Interaction between: Neural Networks - Fuzzy Logic - Evolutionary Computations EC and Non-linear Systems theory Modeling and Simulation Hybrid Intelligent systems EC for Electric Machines EC for Power Systems EC for Real-Time Systems EC for Information Systems EC for Decision Support Systems EC for Discrete Event Systems EC for Communications EC for Multimedia EC for Educational Software EC for Software Engineering EC for Adaptive Control EC for Aerospace, Oceanic and Vehicular Engineering Global Optimization Man-Machine Systems Cybernetics and Bio-Cybernetics Relevant Topics and Applications Parallel and Distributed Systems Special Topics Others. TENERIFE and PUERTO DE LA CRUZ The population of the island is about 700.000, of which about 210.000 live in the capital city Santa Cruz, situated on the north-west coast of the island. Tenerife (as well as the other Canary Islands) is partly a tax-free zone. The main source of livelihood of Tenerife is the tourism industry: more than four million tourists visit the island every year. Tourism has very long traditions in Tenerife; the first tourists came from England in the 1880's! There is some agriculture too: vegetables, fruit and flowers. The most important cultivated plant is banana [platano]. A Tenerifean banana is quite different from its distant cousin the Chiquita banana. A platano is short and plump. The colour of the fruit flesh is darker yellow and the taste much more delicious. There can be hundreds of thousands of banana plants on one plantation. They also make liqueur from the bananas on the island. Another important plant is the grapevine. The rich volcanic soil and mild climate give the wine its own unique aroma. In Tenerife there are as many as five "Denominación de Origen (DO)" vineyards: Abona, Tacoronte-Acentejo, Valle de Guimar, Valle de la Orotava and Ycoden-Daute-Isora. The largest of these is Tacoronte-Acentejo, with an area of 1.200 hectares. The production in 1997 was 1.553.000 kg of grapes. The most cultivated varieties of grape are the white Listan Blanco, Malvasia and Marmajuelo, the red Listan Negro and Negramoll. There are two airports on the island. The international airport Reina Sofia (Tenerife Sur, TFS) is in the south near Playa de las Americas, where most of the international flights land. The other airport Los Rodeos (Tenerife Norte, TFN) is in the north near La Laguna. Los Rodeos serves mainly domestic flights. The distance from Reina Sofia to Playa de las Americas is about 20 kms and the trip takes about 20 mins; to Puerto de la Cruz it is about 100 km and takes about 90 mins. Tenerife is dominated by the highest mountain in Spain, the volcano Teide, the often snow-covered summit of which reaches an altitude of 3.717 meters. El Teide is not a dormant volcano! The last (though minor) eruption took place at the beginning of this century. The last disastrous eruption happened in the year 1706. The southern part of the island is very infertile and next to nothing grows without artificial irrigation. 
The southern resorts Playa de las Americas and Los Cristianos have been built for tourism only and there is no local settlement. The prices there are distinctly higher than in the Capital City or Puerto de la Cruz. But the best beaches are on the southern coast and the sunshine is most reliable there. The newcomer among the resorts of Tenerife is the small and peaceful Los Gigantes on the west coast. In the northern parts of the island the scenery is quite different from that of the south. The clouds arriving from the north don't always have the strength to clear the mountain but pour their rain north of the mountain. Due to these showers the flora is unbelievably rich and breathtakingly beautiful. That is why Puerto de la Cruz is often called the City of Eternal Spring. The best time to travel to Tenerife is February. Puerto de la Cruz was founded at the beginning of the 17th century. Originally it was called Puerto de la Orotava. A big harbour was built there and the city became an important centre of commerce and navigation. The most important export articles until the 19th century were sugar and wine. Nowadays the main source of livelihood of Puerto de la Cruz is tourism. Despite mass tourism, Puerto is still a genuine Canarian town with about 35.000 natives living there. The island of Tenerife was born 10 million years ago as a result of an underwater landslide and a volcanic eruption. The first of the Canary Islands were born the same way some 20 million years ago. The island was conquered from the natives, the Guanches, for Spain by the Andalusian Alonso Fernandez de Lugo and his troops in the year 1496. The origin of the Guanches is still a mystery to the anthropologists because they were tall, blond and blue-eyed. Furthermore, there is no proof of their boat-making skills and obviously they couldn't even swim! You can familiarize yourself with the history of the Guanches in the Museo Etnografico in La Orotava or in Santa Cruz in the Museo Arqueologico, where you can meet a Guanche in person - as a mummy. Tenerife is the largest of the Canary Islands.