From oby at cs.tu-berlin.de Thu Feb 1 07:52:12 2001 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Thu, 1 Feb 2001 13:52:12 +0100 (MET) Subject: funding opportunities Message-ID: <200102011252.NAA15347@pollux.cs.tu-berlin.de> Dear Connectionists, the German Alexander von Humboldt Foundation has launched several programs to support foreign researchers in Germany and to enable them to set up their own, independent group at German research institutions. The programs are open to all fields, but I feel that our area, "Neural Computation", could benefit from these funding opportunities. Therefore, I would encourage all of you to look into these possibilities. If there is interest in collaborating with universities or research institutions in Berlin, I am happy to help establish contacts locally. Cheers Klaus ----------------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ ========================================================================= Internationalization of Research in Germany Initiatives of the Alexander von Humboldt Foundation - Wolfgang Paul Program The Wolfgang Paul Award, which is endowed with up to DM 4.5 million between 2001 and 2003, will make it possible for internationally known scientists and scholars from other countries to set up their own teams of highly qualified younger researchers in Germany. Researchers who are selected will be able to concentrate on their work without the burden of many administrative responsibilities; the goal is to carry out first-rate, innovative research and integrate younger researchers in this work. In addition to the award money, this support also includes financing of the researcher's team and any associated technicians or groups of younger researchers, as well as financing of necessary equipment and materials (incl. travel costs, etc.). The award will be granted based on submitted nominations. Kosmos Program Recipients of the Kosmos Award, who are chosen from among the top young researchers abroad, will receive up to DM 750,000 annually from 2001 to 2003 to establish a group of young researchers in Germany. The award includes funds for the recipient's own salary and for the costs of setting up and operating an independent team for researchers in a field of the team's own choice. Qualified applicants are outstanding young researchers from other countries, generally up to 35 years of age. Friedrich Wilhelm Bessel Research Award The objective of the Friedrich Wilhelm Bessel Research Award Program is to attract top young researchers who are already recognized as outstanding specialists in their area to engage in cooperative research efforts with specialists in Germany over a longer period of time. Friedrich Wilhelm Bessel Research Awards are granted to acknowledge previous work conducted in the research area. Recipients are invited to carry out their own research projects in Germany in cooperation with colleagues working in the same area. Top young researchers from abroad can be nominated who, as proven specialists in their fields, are especially in demand by German researchers. The award is endowed with DM 75,000-110,000.
Humboldt Research Fellowship for Long-term Cooperation By further supporting Humboldt Research fellows already residing in Germany for a period of up to two years following their first residency, talented young foreigners are more strongly integrated into German research teams. The long-term support ensures intensive academic cooperation between the guest scholar and German colleagues working in the same area. The fellowship is matched with additional funds to cover research expenses. see also: http://www.humboldt-foundation.de From ijspeert at usc.edu Thu Feb 1 20:02:28 2001 From: ijspeert at usc.edu (ijspeert) Date: Thu, 1 Feb 2001 17:02:28 -0800 (PST) Subject: Preprint: a neuromechanical simulation of gait transition in salamander Message-ID: Dear Connectionists, Preprints of a paper to appear in Biological Cybernetics addressing the neural control of locomotion and gait transition in salamander are available for download from the following sites: http://rana.usc.edu:8376/~ijspeert/publications.html http://rana.usc.edu:8376/~ijspeert/PAPERS/BC.ps.gz http://rana.usc.edu:8376/~ijspeert/PAPERS/BC.pdf The paper (see abstract below) presents a neuromechanical model capable of exhibiting the typical swimming and walking gaits of the salamander. Animations of the salamander simulation can be viewed at http://rana.usc.edu:8376/~ijspeert/salamander.html Comments are most welcome. Best regards, Auke Ijspeert --------------------------------------------------------------------------- Dr Auke Jan Ijspeert Brain Simulation Lab & Computational Learning and Motor Control Lab Dept. of Computer Science, Hedco Neurosciences bldg, 3641 Watt way U. of Southern California, Los Angeles, CA 90089-2520, USA Web: http://rana.usc.edu:8376/~ijspeert/ Tel: +1 213 7401922 or 7406995 (work) +1 310 8238087 (home) Fax: +1 213 7405687 Email: ijspeert at usc.edu --------------------------------------------------------------------------- A connectionist central pattern generator for the aquatic and terrestrial gaits of a simulated salamander Auke Jan Ijspeert Abstract: This article investigates the neural mechanisms underlying salamander locomotion, and develops a biologically plausible connectionist model of a central pattern generator capable of producing the typical aquatic and terrestrial gaits of the salamander. It investigates, in particular, what type of neural circuitry can produce and modulate the two locomotor programs identified within the salamander's spinal cord, namely, a traveling wave of neural activity for swimming and a standing wave for trotting. A two-dimensional biomechanical simulation of the salamander's body is developed whose muscle contraction is determined by the locomotion controller simulated as a leaky-integrator neural network. While the connectivity of the neural circuitry underlying locomotion in the salamander has not been decoded for the moment, this article presents the design of a neural circuit which has a general organization corresponding to that hypothesized by neurobiologists. In particular, the locomotion controller is based on a body {\it central pattern generator} (CPG) corresponding to a lamprey-like swimming controller, and is extended with a limb CPG for controlling the salamander's limbs. The complete controller is developed in three stages, with first the development of segmental oscillators, second the development of intersegmental coupling for the making of a lamprey-like swimming CPG, and finally the development of the limb CPG and its coupling with the body CPG. 
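A minimal sketch of standard leaky-integrator dynamics of the kind referred to above, written in Python (the weights, time constants, and tonic drive below are arbitrary placeholders, not values from the paper):

# Sketch of standard leaky-integrator neuron dynamics (forward Euler);
# all weights, time constants and drives are illustrative placeholders.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(w, tau, drive, t_end=2.0, dt=0.001):
    """Integrate tau_i * dm_i/dt = -m_i + sum_j w_ij * y_j + drive_i,
    with firing rate y_i = sigmoid(m_i)."""
    m = np.zeros(len(tau))
    trace = []
    for _ in range(int(t_end / dt)):
        y = sigmoid(m)
        m += (dt / tau) * (-m + w @ y + drive)
        trace.append(y.copy())
    return np.array(trace)

# Two units coupled by mutual inhibition under constant tonic drive; they
# settle to steady firing rates here -- the rhythmic output of a CPG needs
# additional mechanisms (e.g. adaptation or suitable network structure).
w = np.array([[0.0, -4.0],
              [-4.0, 0.0]])
rates = simulate(w, tau=np.array([0.1, 0.1]), drive=np.array([1.5, 1.6]))
print(rates[-1])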
A genetic algorithm is used to instantiate the parameters of the neural circuit for the different stages given a high level description of the desired state space trajectories of the different subnetworks. A controller is thus developed which can produce neural activities and locomotion gaits very similar to those observed in the real salamander. By varying the tonic (i.e. non-oscillating) excitation applied to the network, the speed, direction and type of gait can be varied. From takane at takane2.psych.mcgill.ca Fri Feb 2 09:21:31 2001 From: takane at takane2.psych.mcgill.ca (Yoshio Takane) Date: Fri, 2 Feb 2001 09:21:31 -0500 (EST) Subject: No subject Message-ID: <200102021421.JAA07987@takane2.psych.mcgill.ca> Dear Colleagues, Thanks to you all! I have received overwhelming responses to my "call for papers" for my invited session at the IMPS2001, circulated a few weeks ago. The following are tentative speakers: names, titles of the talks, affiliations and email addresses. See you all there. You can get detailed information about the meeting at http://www.ir.rikkyo.ac.jp Regards, Yoshio Takane Professor ******************************************************************** 1. Thomas R. Shultz & Francois Rivest (McGill University). Knowledge-based cascade-correlation: Size variation of relevant prior knowledge. (Email: shultz at psych.mcgill.ca, frives at po-box.mcgill.ca) 2. Shotaro Akaho (National Institute of Advanced Industrial Science and Technology, formerly ETL). A kernel method for canonical correlation analysis. (Email: akaho at etl.go.jp) 3. Hideki Asoh (National Institute of Advanced Industrial Science and Technology, formerly ETL). An approximation of nonlinear canonical correlation analysis using neural networks. (Email: asoh at etl.go.jp) 4. Yoshio Takane & Yuriko Oshima-Takane (McGill University). Nonlinear generalized canonical correlation analysis by neural network models. (Email: takane at takane2.psych.mcgill.ca, yuoshima at ed.tokyo-fukushi.ac.jp) 5. Daniel L. Silver (Acadia University). The task rehearsal method of sequential learning. (Email: Danny.Silver at AcadiaU.ca) 6. Shogo Makioka (Osaka Women's University). A connectionist model of phonological working memory. (Email: makioka at center.osaka-wu.ac.jp) 7. Ryotaro Kamimura (Tokai University). Cooperative information control for self-organization maps. (Email: ryo at ego.mcgill.ca) 8. Emilia Barakova (GMD-Japan Research Laboratory). Temporal integration of self-organized sensor streams for robust spatial modeling. (Email: emilia.barakova at emilia-g3.gmd.gr.jp) 9. Allan Kardec Barros, Andrzej Cichocki & Noboru Ohnishi (Riken and Universidade Federal do Maranhao). Extraction of sources using a priori information about their temporal structure. (Email: allan at biomedica.org, cia at brain.riken.go.jp) 10. Keith Worsley (McGill University). Brain mapping data: classification, principal components, and multidimensional scaling. (Email: keith at scylla.math.mcgill.ca) From mpessk at guppy.mpe.nus.edu.sg Fri Feb 2 23:09:26 2001 From: mpessk at guppy.mpe.nus.edu.sg (S. Sathiya Keerthi) Date: Sat, 3 Feb 2001 12:09:26 +0800 (SGT) Subject: TR + Code for automatic C, sigma tuning in SVMs Message-ID: Efficient Tuning of SVM Hyperparameters Using Radius/Margin Bound and Iterative Algorithms S.
Keerthi National University of Singapore mpessk at guppy.mpe.nus.edu.sg Motivated by a recent SVM paper by Chapelle, Vapnik, Bousquet and Mukherjee, I have been working on a code for the automatic tuning of the hyperparameters of a Support Vector Machine with L2 soft margin, for which the radius/margin bound is taken as the index to be minimized and iterative techniques are employed for computing radius and margin. The implementation is found to be feasible and efficient even for large problems having more than 10,000 support vectors. A report discussing the implementation issues, together with a code running on a Matlab interface, can be downloaded from the following page: http://guppy.mpe.nus.edu.sg/~mpessk/nparm.shtml Any comments on the tech report as well as the performance of the code on new datasets will be very much appreciated. From fuzzi at ee.usyd.edu.au Sat Feb 3 22:12:18 2001 From: fuzzi at ee.usyd.edu.au (fuzz-ieee01) Date: Sun, 4 Feb 2001 14:12:18 +1100 (EST) Subject: Final CFP: FUZZ-IEEE 2001 Message-ID: <200102040312.OAA00053@cassius.ee.usyd.edu.au> *************************************************************************** Call for Papers The 10th IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2001) December 2-5, 2001, The University of Melbourne, Melbourne, Australia Sponsored by IEEE Neural Networks Council "Meeting the grand challenge: machines that serve people" Website: http://www.csse.melbourne.edu/FUZZ-IEEE2001 Students: Partial travel scholarships are available from the IEEE Neural Network Council. Please check http://www.arc.unm.edu/~karen/IEEE_NNC/Student_Travel_Grants/ Important dates: Paper submission Friday, 2 March 2001 Notification of acceptance Friday, 1 June 2001 Final manuscripts Friday, 3 August 2001 For general enquiries, please contact the secretariat of FUZZ-IEEE 2001: fuzz-ieee2001 at csse.melbourne.edu *************************************************************************** The FUZZ-IEEE 2001 conference will be held in Melbourne, Australia, one of the most beautiful and exciting cities in the Southern Hemisphere. Melbourne is also one of the safest, healthiest, and cleanest cities in the world, and is Australia's pre-eminent centre for arts and culture, education, fine food and dining and exciting shopping experiences. The conference will cover a broad range of research topics related to fuzzy logic and soft computing, including but not limited to: T1: Pattern recognition and image processing: supervised and unsupervised learning, classifier design and integration, signal/image processing and analysis, computer vision, multimedia applications. T2: Electronic and robotic systems: fuzzy logic in robotics, automation, and other industrial applications, fuzzy hardware design and implementation. T3: Soft computing and hybrid systems: intelligent information systems, database systems, data mining, intelligent agents, neuro-fuzzy systems, Internet computing. T4: Control systems: fuzzy control theory and applications. T5: Mathematics: foundations of fuzzy logic, approximate reasoning, evolutionary computation. Authors are invited to submit 6 copies of full papers for review. Papers should be written in English, not exceeding 7 single-sided pages on A4 or letter-size paper, in one-column format with 1-inch margin on all four sides, in Times or a similar font of 10 points or larger. Faxed or e-mailed papers will not be accepted.
The first page of each paper must include the following information: - title of the paper, - name(s) and affiliation(s) of the author(s), - abstract of the paper, - maximum 5 keywords, - technical area of the paper (T1, T2, T3, T4 or T5, choose one only), - name, postal address, phone and fax numbers and e-mail address of the contact author. Please send papers to: Secretariat of FUZZ-IEEE 2001 Conference Management The University of Melbourne Victoria 3010, Australia E-mail: fuzz-ieee2001 at csse.melbourne.edu *************************************************************************** CONFERENCE ORGANIZATION COMMITTEES Honorary Co-Chairs Jim Bezdek, University of West Florida, USA Enrique H. Ruspini, SRI International, USA Lotfi A. Zadeh, University of California, Berkeley, USA General Chair Zhi-Qiang Liu, University of Melbourne, Australia Program Chair Hong Yan, University of Sydney, Australia Special Sessions Chair Sadaaki Miyamoto, University of Tsukuba, Japan Workshop & Tutorials Chair Qiang Shen, University of Edinburgh, UK International Steering Committee Chair Valerie V. Cross, University of North Carolina, USA Publicity Chairs Alen Blair, University of Melbourne, Australia Publications Chair Ed Kazmierczak, University of Melbourne, Australia Local Arrangements Chair Nick M. Barnes, University of Melbourne, Australia PROGRAM AREA CHAIRS Pattern Recognition and Image Processing Jim Keller, University of Missouri, USA Electronic & Robotic Systems Toshio Fukuda, Nagoya University, Japan Control Systems Janusz Kacprzyk, Polish Academy of Sciences, Poland Soft Computing and Hybrid Systems Nikola Kasabov, University of Otago, New Zealand Mathematics Arthur Ramer, University of New South Wales, Australia *************************************************************************** From steve at cns.bu.edu Sat Feb 3 14:06:57 2001 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sat, 3 Feb 2001 11:06:57 -0800 Subject: Context-Sensitive Binding by the Laminar Circuits of V1 and V2 Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg in PDF format: Raizada, R.D.S. and Grossberg, S. (2001). Context-Sensitive Binding by the Laminar Circuits of V1 and V2: A Unified Model of Perceptual Grouping, Attention, and Orientation Contrast. Visual Cognition, in press. Preliminary version available as Technical Report CAS/CNS TR-2000-008 ABSTRACT: A detailed neural model is presented of how the laminar circuits of visual cortical areas V1 and V2 implement context-sensitive binding processes such as perceptual grouping and attention. The model proposes how specific laminar circuits allow the responses of visual cortical neurons to be determined not only by the stimuli within their classical receptive fields, but also to be strongly influenced by stimuli in the extra-classical surround. This context-sensitive visual processing can greatly enhance the analysis of visual scenes, especially those containing targets that are low contrast, partially occluded, or crowded by distractors. We show how interactions of feedforward, feedback and horizontal circuitry can implement several types of contextual processing simultaneously, using shared laminar circuits.
In particular, we present computer simulations which suggest how top-down attention and preattentive perceptual grouping, two processes that are fundamental for visual binding, can interact, with attentional enhancement selectively propagating along groupings of both real and illusory contours, thereby showing how attention can selectively enhance object representations. These simulations also illustrate how attention may have a stronger facilitatory effect on low contrast than on high contrast stimuli, and how pop-out from orientation contrast may occur. The specific functional roles which the model proposes for the cortical layers allow several testable neurophysiological predictions to be made. The results presented here simulate only the boundary grouping system of adult cortical architecture. However, we also discuss how this model contributes to a larger neural theory of vision which suggests how intracortical and intercortical feedback help to stabilize development and learning within these cortical circuits. Although feedback plays a key role, fast feedforward processing is possible in response to unambiguous information. Model circuits are capable of synchronizing quickly, but context-sensitive persistence of previous events can influence how synchrony develops. Although these results focus on how the interblob cortical processing stream controls boundary grouping and attention, related modeling of the blob cortical processing stream suggests how visible surfaces are formed, and modeling of the motion stream suggests how transient responses to scenic changes can control long-range apparent motion and also attract spatial attention. From oreilly at grey.colorado.edu Mon Feb 5 01:20:31 2001 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Sun, 4 Feb 2001 23:20:31 -0700 Subject: Changes at Cognitive Science Message-ID: <200102050620.XAA21864@grey.colorado.edu> ANNOUNCEMENT OF RECENT CHANGES AT COGNITIVE SCIENCE: The Cognitive Science journal has a new editorial board and set of policies. One of the more important policy changes is that we now officially publish four categories of articles: Letters to the editor, brief reports, regular articles, and extended articles. To find out more details on the nature of these categories and the journal more generally, visit the cognitive science society's web page at: http://www.cognitivesciencesociety.org/about.html The journal is also now available online for all members of the cognitive science society at: http://www.elsevier.nl/gej-ng/10/15/15/show/ Editor: Robert L. Goldstone, Indiana University Associate Editors: John R. Anderson, Carnegie Mellon University Nick Chater, University of Warwick Andy Clark, University of Sussex Shimon Edelman, Cornell University Kenneth Forbus, Northwestern University Dedre Gentner, Northwestern University Raymond W. Gibbs, Jr., University of California, Santa Cruz James Greeno, Stanford University Robert A. Jacobs, University of Rochester Randall C. O'Reilly, University of Colorado Boulder Colleen M. 
Seifert, University of Michigan Daniel Sperber, CNRS Paris From gustavo.deco at mchp.siemens.de Mon Feb 5 05:39:33 2001 From: gustavo.deco at mchp.siemens.de (Gustavo Deco) Date: Mon, 5 Feb 2001 11:39:33 +0100 (MET) Subject: New book Information Dynamics Message-ID: <200102051039.f15AdXS28209@loire.mchp.siemens.de> Dear Connectionists, It is my pleasure to announce the publication of a NEW BOOK: INFORMATION DYNAMICS: Foundations and Applications by Gustavo DECO and Bernd SCHUERMANN Publisher: SPRINGER-VERLAG Approx. 304 pp. 89 figs., ISBN 0-387-95047-8 Information Dynamics presents an interdisciplinary treatment of nonlinear dynamical systems with an emphasis on complex systems which transmit information in time and space. The book offers a new theoretical approach to dynamical systems from an information theory perspective. The goal of the book is to provide a detailed and unified study of the flow of information in a quantitative manner, utilizing methods and techniques from information theory, time series analysis, nonlinear dynamics and neural networks. The authors use analysis of test-bed simulations, empirical data, and real-world applications to give concrete perspectives and reinforcement for the key conceptual ideas and methods. The formulation provides a unique and consistent conceptual framework for the problem of discovering knowledge behind empirical data. Contents: Dynamical Systems Overview.- Statistical Structure Extraction in Dynamical Systems: Parametric Formulation.- Applications: Parametric Characterization of Time Series.- Statistical Structure Extraction in Dynamical Systems: Nonparametric Formulation.- Applications: Nonparametric Characterization of Time Series.- Statistical Structure Extraction in Dynamical Systems: Semiparametric Formulation.- Applications: Semiparametric Characterization of Time Series.- Information Processing and Coding in Spatio-Temporal Dynamical Systems.- Applications: Information Processing and Coding in Spatio-Temporal Dynamical Systems.- Appendixes A,B,C,D.- References. Info Processing Theory, Nonlinear Dynamics, Time Series Modeling, Neural Networks For graduates, researchers, professionals ---------------------------- PD Dr.Dr.habil. Gustavo Deco Siemens Corporate Research Computational Neuroscience Otto-Hahn-Ring 6 D-81739 Munich Germany Tel. +49 89 636 47373 Fax. +49 89 636 49767 From m.usher at psychology.bbk.ac.uk Mon Feb 5 09:48:47 2001 From: m.usher at psychology.bbk.ac.uk (m.usher@psychology.bbk.ac.uk) Date: Mon, 5 Feb 2001 14:48:47 GMT Subject: article on stochastic resonance in human cognition Message-ID: <200102051448.OAA01475@acer.ccs.bbk.ac.uk> The following article, which addresses the phenomenon of stochastic resonance (SR) and was recently published as a Letter to the Editors in Biol. Cybern. 83 (2000) 6, L011-L016, can now be accessed at the following website: www.psyc.bbk.ac.uk/staff/mu/homepage/noise/SR.pdf Stochastic resonance in the speed of memory retrieval Marius Usher (Birkbeck College, University of London; Email: M.Usher at bbk.ac.uk) and Mario Feingold (Ben Gurion University, Beer Sheva, Israel; Email: Mario at bgumail.bgu.ac.il) Abstract. The stochastic resonance (SR) phenomenon in human cognition (memory retrieval speed for arithmetical multiplication rules) is addressed in a behavioral and neurocomputational study. The results of an experiment in which performance was monitored for various magnitudes of acoustic noise are presented. The average response time was found to be minimal for some optimal noise level.
Moreover, it was shown that the optimal noise level and the magnitude of the SR effect depend on the difficulty of the task. A computational framework based on leaky accumulators that integrate noisy information and provide the output upon reaching a threshold criterion is used to explain the observed phenomena. From a.hussain at cs.stir.ac.uk Tue Feb 6 12:23:04 2001 From: a.hussain at cs.stir.ac.uk (Dr. Amir Hussain) Date: Tue, 6 Feb 2001 17:23:04 -0000 Subject: CIS Journal Special Issue: Call for Papers Message-ID: <003701c09061$77e99c80$5299fc3e@amir> Dear connectionists, Please find the Call for Papers for the Control & Intelligent Systems (CIS) Journal Special Issue on "Non-linear Speech Processing Techniques & Applications", at the link below: http://www.actapress.com/journals/specialci.htm Looking forward to hearing from interested connectionists soon! Sincerely - Dr.Amir Hussain, Lecturer Guest Editor, Control & Intelligent Systems Journal Special Issue Department of Computing Science & Mathematics University of Stirling, Stirling FK9 4LA SCOTLAND, UK Tel / Fax: (++44) 01786 - 476437 / 464551 Email: a.hussain at cs.stir.ac.uk http://www.cs.stir.ac.uk/~ahu/ From jbower at bbb.caltech.edu Tue Feb 6 12:32:00 2001 From: jbower at bbb.caltech.edu (James M. Bower) Date: Tue, 6 Feb 2001 09:32:00 -0800 Subject: No subject Message-ID: New deadline for submitting papers to CNS*01 Due to circumstances beyond our control, the computer systems serving the Computational Neuroscience Meetings have not been available for the last two weeks. They are now reinstated, however, and we have established a new deadline of Feb. 12 for paper submissions to the meeting. CNS*01 is the tenth annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. The meeting in 2001 will take place on the Central West Coast of the United States. The meeting will officially convene in San Francisco on the evening of June 30th with an opening reception. Then on the morning of Sunday July 1st, meeting participants will board busses for the Monterey Peninsula where the formal scientific sessions will start at 5:00 p.m. at the Asilomar Conference Grounds in Pacific Grove. The last meeting event will be the traditional banquet on Thursday evening, July 5th at the Monterey Bay Aquarium . Housing accommodations will be available for the evening of June 30th at the Ramada Plaza Hotel in the Market Street area of San Francisco and at the Asilomar Conference Grounds for the duration of the meeting. Travel information as well as instructions for paper submission can be found at the meeting's web site: http://cns.numedeon.com/cns2001/ Jim Bower Meeting Chairman From ormoneit at Robotics.Stanford.EDU Tue Feb 6 20:50:22 2001 From: ormoneit at Robotics.Stanford.EDU (Dirk Ormoneit) Date: Tue, 6 Feb 2001 17:50:22 -0800 (PST) Subject: Reinforcement learning, particle filters, and human motion Message-ID: <200102070150.RAA00782@bonfire.Stanford.EDU> Hi, Please take notice of the following brand-new papers: Kernel-based reinforcement learning in average-cost problems. D. Ormoneit and P. W. Glynn. http://robotics.stanford.edu/~ormoneit/publications/control.ps Lattice Particle Filters. C. Lemieux, D. 
Ormoneit, and David J. Fleet. http://robotics.stanford.edu/~ormoneit/publications/lattice.ps.gz Functional analysis of human motion data. D. Ormoneit, T. Hastie, and M.Black. http://robotics.stanford.edu/~ormoneit/publications/motion.ps.gz A more detailed description is enclosed below. Cheers, Dirk _________________________________________________________________ Kernel-based reinforcement learning in average-cost problems D. Ormoneit and P. W. Glynn. http://robotics.stanford.edu/~ormoneit/publications/control.ps Reinforcement learning (RL) is concerned with the identification of optimal controls in Markov Decision Processes (MDP) where no explicit model of the transition probabilities is available. Many existing approaches to RL --- including ``temporal-difference learning'' --- employ simulation-based approximations of the value function for this purpose \cite{sutton88,tsitsiklis97}. This proceeding frequently leads to numerical instabilities of the resulting learning algorithm, especially if the function approximators used are parametric such as linear combinations of basis functions or neural networks. In this work, we propose an alternative class of RL algorithms which always produces stable estimates of the value function. In detail, we use ``local averaging'' methods to construct an approximate dynamic programming (ADP) algorithm. _________________________________________________________________ Lattice Particle Filters C. Lemieux, D. Ormoneit, and David J. Fleet. http://robotics.stanford.edu/~ormoneit/publications/lattice.ps.gz A common way to formulate visual tracking is to adopt a Bayesian approach, and to use particle filters to cope with nonlinear dynamics and nonlinear observation equations. While particle filters can deal with such filtering tasks in principle, their performance often varies significantly due to their stochastic nature. We present a class of algorithms, called lattice particle filters, that circumvent this difficulty by placing the particles deterministically according to a Quasi-Monte Carlo integration rule. We describe a practical realization of this idea and discuss its theoretical properties. Experimental results with a synthetic 2D tracking problem show that the lattice particle filter yields a performance improvement over conventional particle filters that is equivalent to an increase between 10 and 60\% in the number of particles, depending on their ``sparsity'' in the state-space. We also present results on inferring 3D human motion from moving light displays. _________________________________________________________________ Functional analysis of human motion data D. Ormoneit, T. Hastie, and M.Black. http://robotics.stanford.edu/~ormoneit/publications/motion.ps.gz We present a method for the modeling of 3D human motion data using functional analysis. First, we estimate a statistical model of typical activities from a large set of 3D human motion data. For this purpose, the human body is represented as a set of articulated cylinders and the evolution of a particular joint angle is described by a time-series. Specifically, we consider periodic motion such as ``walking'' in this work, and we develop a new set of tools that allows for the automatic segmentation of the training data into a sequence of identical ``motion cycles''. Then we compute the mean and the principal components of these cycles using a new algorithm that accounts for missing information and that enforces smooth transitions between cycles. 
As an application of this methodology we consider the visual tracking of human motion in a 2D video sequence. Here the principal components serve to define a low-dimensional representation of the human 3D poses in a state-space model that treats the 2D video images as observations. We apply (approximate) Bayesian inference using a particle filter to the state-space model to infer the body poses at each time-step. The resulting algorithm is able to track human subjects in monocular video sequences and to recover their 3D motion in complex unknown environments. _________________________________________________________________ Dirk Ormoneit Department of Computer Science Gates Building, 1A-148 Stanford University Stanford, CA 94305-9010 ph.: (650) 725-8797 fax: (650) 725-1449 ormoneit at cs.stanford.edu http://robotics.stanford.edu/~ormoneit/ From Patrick.DeMaziere at med.kuleuven.ac.be Wed Feb 7 04:38:09 2001 From: Patrick.DeMaziere at med.kuleuven.ac.be (Patrick De Maziere) Date: Wed, 07 Feb 2001 10:38:09 +0100 Subject: Postdoctoral position in biomedical signal-processing Message-ID: <3A811781.62E819C5@med.kuleuven.ac.be> ----------------------------------------------------- Postdoctoral position in biomedical signal-processing ----------------------------------------------------- Deadline for application: 15 March 2001 The Computational Neuroscience Group of the Laboratory of Neuro- and Psychophysiology, Medical School of the Katholiek Universiteit Leuven, Belgium (http:\\simone.neuro.kuleuven.ac.be), invites applications for a post-doctoral position in the area of biomedical signal-processing (functional Magnetic Resonance Imaging). Desired profile: The highly qualified applicant should possess a Ph.D. degree in the field of signal-processing, image-processing, or neural networks. He/she should be familiar with Independent Components Analysis (ICA) and/or related techniques as Projection Pursuit, Blind Source Separation, ... Programming skills are an asset (C, Matlab, ...), as is a familiarity with UNIX and PC platforms. We offer: 1) A challenging research environment. The applicant will have access to data from state-of-the-art Magnetic Resonance scanners and signal-processing tools for examining brain activity in both humans and monkeys. 2) An attractive income. The applicant will receive 2375 Euro net per month, including a full social security coverage. This is comparable to the salary of an associate Professor at the University. Housing will be taken care of by the host institute. 3) Free return airline ticket, economy class (maximum 1500 Euro) and a reimbursement of all costs incurred for shipping luggage to Belgium (maximum 1000 Euro). Please send (mail/fax/email) your CV (including bibliography, and the names and addresses of three references), before the deadline of 15 March 2001 to: Prof. Dr. Marc M. Van Hulle K.U.Leuven Laboratorium voor Neuro- en Psychofysiologie Faculteit Geneeskunde Campus Gasthuisberg Herestraat 49 B-3000 Leuven Belgium Phone: + 32 16 345961 Fax: + 32 16 345993 E-mail: marc at neuro.kuleuven.ac.be URL: http://simone.neuro.kuleuven.ac.be From ml_conn at infrm.kiev.ua Wed Feb 7 16:00:18 2001 From: ml_conn at infrm.kiev.ua (Dmitri Rachkovskij) Date: Wed, 7 Feb 2001 23:00:18 +0200 (UKR) Subject: Building large-scale hierarchical models of the world... 
Message-ID: <2.07b5.7329.G8ENOI@infrm.kiev.ua> Keywords: analogy, analogical mapping, analogical retrieval, APNN, associative-projective neural networks, binary coding, binding, categories, chunking, compositional distributed representations, concepts, concept hierarchy, connectionist symbol processing, context-dependent thinning, distributed memory, distributed representations, Hebb, long-term memory, nested representations, neural assemblies, part-whole hierarchy, representation of structure, sparse coding, taxonomy hierarchy, thinning, working memory, world model Dear Colleagues, The following paper draft (abstract enclosed): Dmitri Rachkovskij & Ernst Kussul: "Building large-scale hierarchical models of the world with binary sparse distributed representations" is available at http://cogprints.soton.ac.uk/documents/disk0/00/00/12/87/index.html or by the ID code: cog00001287 at http://cogprints.soton.ac.uk/ Comments are welcome! Thank you and best regards, Dmitri Rachkovskij ************************************************************************* Dmitri A. Rachkovskij, Ph.D. Net: dar at infrm.kiev.ua Senior Researcher, V.M.Glushkov Cybernetics Center, Tel: 380 (44) 266-4119 Pr. Acad. Glushkova 40, Kiev 03680, UKRAINE Fax: 380 (44) 266-1570 ************************************************************************* Encl: Abstract Many researchers agree on the basic architecture of the "world model" where knowledge about the world required for organization of agent's intelligent behavior is represented. However, most proposals on possible implementation of such a model are far from being plausible both from computational and neurobiological points of view. Implementation ideas based on distributed connectionist representations offer a huge information capacity and flexibility of similarity representation. They also allow a distributed neural network memory to be used that provides an excellent storage capacity for sparse patterns and naturally forms generalization (or concept, or taxonomy) hierarchy using the Hebbian learning rule. However, for a long time distributed representations suffered from the "superposition catastrophe" that did not allow nested part-whole (or compositional) hierarchies to be handled. Besides, statistical nature of distributed representations demands their high dimensionality and a lot of memory, even for small tasks. Local representations are vivid, pictorial and easily interpretable, allow for an easy manual construction of both types of hierarchies and an economical computer simulation of toy tasks. The problems of local representations show up with scaling to the real world models. Such models include an enormous number of associated items that are met in various contexts and situations, comprise parts of other items, form multitude intersecting multilevel part-whole hierarchies, belong to various category-based hierarchies with fuzzy boundaries formed naturally in interaction with environment. It appears that using local representations in such models becomes less economical than distributed ones, and it is unclear how to solve their inherent problems under reasonable requirements imposed on memory size and speed (e.g., at the level of mammals' brain). We discuss the architecture of Associative-Projective Neural Networks (APNNs) that is based on binary sparse distributed representations of fixed dimensionality for items of various complexity and generality. 
Such representations are rather neurobiologically plausible; however, we consider that the main biologically relevant feature of APNNs is their promise for scaling up to a full-sized adequate model of the real world, a feature that implementations of other schemes lack. As in other schemes of compositional distributed representations, such as HRRs of Plate and BSCs of Kanerva, an on-the-fly binding procedure is proposed for APNNs. It overcomes the superposition catastrophe, permitting the order and grouping of component items in structures to be represented. The APNN representations allow a simple estimation of structures' similarity (such as analogical episodes), as well as finding various kinds of associations based on context-dependent similarity of these representations. A structured distributed auto-associative neural network of feedback type is used as long-term memory, wherein representations of models organized into both types of hierarchy are built. Examples of schematic APNN architectures and processes for recognition, prediction, reaction, analogical reasoning, and other tasks required for functioning of an intelligent system, as well as APNN implementations, are considered. --------------------------------------------------------------------- From school at cogs.nbu.acad.bg Fri Feb 9 12:56:12 2001 From: school at cogs.nbu.acad.bg (CogSci Summer School) Date: Fri, 9 Feb 2001 19:56:12 +0200 Subject: CogSci 2001 Message-ID: 8th International Summer School in Cognitive Science Sofia, New Bulgarian University, July 9 - 28, 2001 Courses: * Arthur Shimamura (University of California at Berkeley, USA) - Executive Control, Metacognition and Memory Processes * Barbara Finlay (Cornell University, USA) - Development and Evolution of the Brain * Maurice Greenberg (New Bulgarian University, BG) - Introduction to Connectionism * Stella Vosniadou (University of Athens, GR) - Cognitive Development and Conceptual Change * Michael Thomas (NCDU, ICH, UK) - Connectionist Models of Developmental Disorders * Csaba Pleh (University of Szeged, HU) - Language Understanding in Children, Adults and Patients * Robert Goldstone (Indiana University, USA) - Human learning and adaptive systems * Jeff Elman (University of California at San Diego, USA) - Connectionist Models of Learning and Development * Susan Epstein (City University of New York, USA) - Cognitive Modeling and the Development of Expertise Organised by New Bulgarian University, Bulgarian Academy of Sciences, and Bulgarian Society for Cognitive Science Endorsed by the Cognitive Science Society For more information look at: http://www.nbu.bg/cogs/events/ss2001.htm Central and East European Center for Cognitive Science New Bulgarian University 21 Montevideo Str. Sofia 1635 phone: 955-7518 Guergana Erdeklian, administrative manager, CEECenter for Cognitive Science From marchand at site.uottawa.ca Thu Feb 8 09:00:15 2001 From: marchand at site.uottawa.ca (Mario Marchand) Date: Thu, 8 Feb 2001 09:00:15 -0500 Subject: Tenure-track and contractual positions available Message-ID: <004a01c091d7$76d76e40$255a7a89@site.uottawa.ca> Dear Colleagues, The School of Information Technology and Engineering (SITE) at the University of Ottawa is searching for good candidates to fill several tenure-track faculty positions and contractual positions in basically any area of Computer Science, Software Engineering, and Computer and Electrical Engineering. Hence researchers in Machine Learning, Neural Networks, Pattern Recognition,... are invited to apply.
For more details about these positions and our institution, please consult: http://www.site.uottawa.ca/school/positions/ Regards, -mario From dandre at cs.berkeley.edu Sat Feb 10 23:38:44 2001 From: dandre at cs.berkeley.edu (David Andre) Date: Sat, 10 Feb 2001 20:38:44 -0800 Subject: ICML Workshop on Hierarchy and Memory in Reinforcement Learning Message-ID: <23276.981866324@tatami.cs.berkeley.edu> Workshop on Hierarchy and Memory in Reinforcement Learning ICML 2001, Williams College, June 28, 2001 Call for Participation In recent years, much research in reinforcement learning has focused on learning, planning, and representing knowledge at multiple levels of temporal abstraction. If reinforcement learning is to scale to solving larger, more real-world-like problems, it is essential to consider a hierarchical approach in which a complex learning task is decomposed into subtasks. It has been shown in recent and past work that a hierarchical approach substantially increases the efficiency and abilities of RL systems. Early work in reinforcement learning showed that faster learning resulted when tasks were decomposed into behaviors (Lin, 1993; Mahadevan and Connell, 1992; Singh et al 1994). However, these approaches were mostly based on modular, but not hierarchical decompositions. More recently, researchers have proposed various models, the most widely recognized being Hierarchies of Abstract Machines (HAMs) (Parr, 1998), options (Sutton, Precup and Singh, to appear), and MAXQ value function decomposition (Dietterich, 1998). A key technical breakthrough that enabled these approaches is the use of reinforcement learning over semi-Markov decision processes (Bradtke and Duff, Mahadevan et al, 1997, Parr, 1998). Although these approaches speed up learning considerably, they still assume the underlying environment is accessible (i.e. not perceptually aliased). A major direction of research into scaling up hierarchical methods is to extend them to domains where the underlying states are hidden (i.e., partially observable Markov decision processes, or POMDPs). Over the past year, there has been a surge of interest in applying hierarchical reinforcement learning (HRL) methods to such partially observable domains. Researchers are investigating techniques such as memory, state abstraction, offline decomposition, action abstraction, and many others to simplify the problem of learning near-optimal behaviors as they attack increasingly complex environments. This workshop will be an opportunity for the researchers in this growing field to share knowledge and expertise on the topic, open lines of communication for collaboration, prevent redundant research, and possibly agree on standard problems and techniques. The format of the workshop is designed to encourage discussion and debate. There will be approximately four invited talks by senior researchers. Ample time between talks will be provided for discussion. Additionally, there will be a NIPS-like poster session (complete with plenary poster previews). We will also be providing a partially observable problem domain that participants can optionally use to test their techniques, which may provide a common underlying experience for comparing and contrasting techniques. The discussion will focus not only on the various techniques and theories for using memory in reinforcement learning, but also on higher order questions, such as ``Must hierarchy and memory be combined for practical RL systems?'' and ``What forms of structured memory are useful for RL?''.
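A minimal sketch of the SMDP-style Q-learning update that underlies these hierarchical approaches, in Python (the option names, reward, and step count below are made-up illustrative values, not part of the workshop call):

# Sketch of the SMDP Q-learning update used with temporally extended
# actions ("options"); all names and numbers below are illustrative only.
from collections import defaultdict

def smdp_q_update(Q, s, o, reward, k, s_next, options, alpha=0.1, gamma=0.99):
    """Apply Q(s,o) += alpha * [R + gamma**k * max_o' Q(s',o') - Q(s,o)],
    where R is the discounted reward accumulated while option o ran for
    k primitive steps before terminating in state s_next."""
    best_next = max(Q[(s_next, o2)] for o2 in options)
    Q[(s, o)] += alpha * (reward + gamma ** k * best_next - Q[(s, o)])
    return Q[(s, o)]

# Toy usage: a "go-to-door" option ran for 7 steps and accumulated 2.3
# units of discounted reward before terminating in the hallway.
Q = defaultdict(float)
options = ["go-to-door", "go-to-goal"]
print(smdp_q_update(Q, s="room1", o="go-to-door", reward=2.3, k=7,
                    s_next="hallway", options=options))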
We are thus seeking attendees with the following interests: * Basic technologies applied to the problem of hierarchy and memory in RL - Function approximation - State abstraction - Finite memory methods * Training Issues for hierarchical RL - Shaping - Knowledge transfer across tasks/subproblems * Hierarchy Construction issues - Action models - Action decomposition methods - Automatic acquisition of hierarchical structure - Learning state abstractions * Applications - Hierarchical motor control (especially with feedback). - Hierarchical visual attention/gaze control - Hierarchical navigation Submissions: To participate in the workshop, please send an email message to one of the two organizers, giving your name, address, email address, and a brief description of your reasons for wanting to attend. In addition, if you wish to present a poster, please send a short (< 5 pages in ICML format) paper in postscript or PDF format to the organizers. If you have questions, please feel free to contact us. Important Dates: Deadline for workshop submissions: Mar 22, 2001 Notification of acceptance: April 9, 2001 Final version of workshop papers due: April 31, 2001 The workshop itself: June 28, 2001 Committee: Workshop Chairs: David Andre (dandre at cs.berkeley.edu) Anders Jonsson (ajonsson at cs.umass.edu) Workshop Committee: Andrew Barto Natalia Hernandez Sridhar Mahadevan Ronald Parr Stuart Russell Please see http://www-anw.cs.umass.edu/~ajonsson/icml/ for more information, including directions for formatting the submissions and details on the benchmark domain. From R.J.Howlett at bton.ac.uk Sun Feb 11 13:13:24 2001 From: R.J.Howlett at bton.ac.uk (R.J.Howlett@bton.ac.uk) Date: Sun, 11 Feb 2001 18:13:24 -0000 Subject: Invitation to AI/Internet researchers/authors Message-ID: <08C89600D966D4119C7800105AF0CB6B0246D12B@moulsecoomb.bton.ac.uk> Invitation to authors I am collecting material for a book with a working title of "Intelligent Internet Systems" to be published by a major publisher. The volume will describe the latest research on the interaction between intelligent systems (neural networks, fuzzy and rule-based systems, intelligent agents) and the internet/WWW. I already have quite a lot of material, but would like to invite proposals for chapters from prospective authors. Typical topics might be intelligent techniques for: * interpreting WWW-derived information * internet data-mining * internet search algorithms and mechanisms * remote condition monitoring and security monitoring * interfaces * tools * applications but suggestions for further topics would be welcome.
If you would like to propose a chapter, please let me have the following information: Name/email: University/Company: Are you a Professor/Researcher/Engineer/etc: Title of proposed chapter: Brief description of contents: Many thanks, Bob Howlett ------------------------------------------ Dr R.J.Howlett BSc, MPhil, PhD, MBCS, CEng Director of Brighton TCS Centre, Head of Intelligent Signal Processing Labs, University of Brighton, Moulsecoomb, Brighton, UK Tel: 01273 642305 Fax: 01273 642444 Email: R.J.Howlett at brighton.ac.uk ------------------------------------------ From ckiw at dai.ed.ac.uk Mon Feb 12 14:17:05 2001 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Mon, 12 Feb 2001 19:17:05 +0000 (GMT) Subject: Advert for Chair in Machine Learning at U of Edinburgh, UK Message-ID: The University of Edinburgh Chair in Machine Learning The University invites applications for a Chair in Machine Learning, to be held within the Division of Informatics. We seek a candidate who will further develop the strengths of the Division through empirical, theoretical, applications-oriented or inter-disciplinary research in any area of machine learning, including probabilistic graphical modelling, computational learning theory, inductive logic programming, pattern recognition, reinforcement learning, data mining, text mining, relations between human and machine learning. Outstanding applicants from areas of Informatics that provide a foundation for Machine Learning are also welcome, including databases, information retrieval, knowledge representation. In addition to strengths in research and scholarship, the successful candidate will provide leadership and inspiration, and play an active role in teaching and administration. Informal enquiries can be made to the Head of Division, Professor Alan Bundy, at the Division of Informatics, by telephone (+44 131 650 2716) or e-mail (hod at informatics.ed.ac.uk). Further particulars and application packs should be obtained from Personnel Department, the University of Edinburgh, 9-16 Chambers Street, Edinburgh EH1 1HT. Tel 0131 650 2511. Please quote Ref Number 316129. Further particulars may also be obtained from the Informatics web site http://www.informatics.ed.ac.uk/events/vacancies Closing date: 15 March 2001 From stefan.wermter at sunderland.ac.uk Mon Feb 12 07:35:22 2001 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Mon, 12 Feb 2001 12:35:22 +0000 Subject: CfC: Book on Intelligent Agent Engineering Message-ID: <3A87D88A.39862A9D@sunderland.ac.uk> We are particularly interested in novel approaches to intelligent agents: ----------------- CALL FOR CHAPTERS Intelligent Agent Software Engineering A book edited by Valentina Plekhanova and Stefan Wermter, University of Sunderland, United Kingdom Intelligent software agents are a unique generation of information society tools that independently perform various tasks on behalf of human user(s) or other software agents. The new possibilities of the information society require the development of new, more intelligent methods, tools and theories for modelling/engineering of agent-based systems and technologies. This directly involves a need for consideration, understanding and analysis of human factors, e.g. people's knowledge/skill, learning, and performance capabilities/compatibilities in particular software development environments. This is because software developers utilise their experience and represent their mental models via development/engineering of intelligent agent(s).
Therefore, the study of interrelated factors such as people's and agent's capabilities constitute an important/critical area in intelligent agent software engineering. This should eventually lead to more robust, intelligent, interactive, learning and adaptive agents. We encourage submitting papers where intelligent agent software engineering is considered as the application of the integration of formal methods and heuristic approaches to ensure support for the evaluation, comparison, analysis, and evolution of agent behaviour. The primary objective of the book is to introduce the readers to the concept of intelligent agent software engineering so that they will be able to develop their own agents and implement concepts in their own environments. Chapters based on research from both academia and industry are encouraged. Representative topics include, but are not limited to, the following: Requirements for intelligent agent software engineering Intelligent agent modelling and management Integration problems in intelligent agent software engineering Intelligent planning/scheduling systems Prioritisation problems in intelligent agent software engineering Engineering models of intelligent agents Modelling, analysis, and management of learning agents Engineering the Information Technology Engineering the learning processes Hybrid neural agents Adaptive agents Learning agents Communicating agents Meta agent architectures Negotiating agents Emerging knowledge based on collaborating software agents Machine learning for intelligent agents Cognitively oriented software agents Biologically motivated agent models Lifelong learning in software agents Robustness and noise in software agents Machine learning for software engineering Evolutionary agents Neuroscience - inspired agents; Agents based on soft and fuzzy computing SUBMISSION PROCEDURE Researchers and practitioners are invited to submit on or before March 31, 2001, a 2-5 page proposal outlining the mission of the proposed chapter. Authors of accepted proposals will be notified by May 15, 2001 about the status of their proposals and sent chapter organisational guidelines. Full chapters are expected to be submitted by September 15, 2001. The book is scheduled to be published by Idea Group Publishing in 2002. Inquiries and Submissions can be forwarded electronically to: Valentina Plekhanova, and Stefan Wermter, University of Sunderland, SCET, The Informatics Centre, St Peter's Campus, St Peter's Way, SR6 0DD, UK Tel: +44 (0)-191-515-2755 Fax: +44 (0)-191 -515-2781 Email: valentina.plekhanova at sunderland.ac.uk stefan.wermter at sunderland.ac.uk *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Informatics Centre, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From dagli at umr.edu Mon Feb 12 12:01:38 2001 From: dagli at umr.edu (Cihan Dagli) Date: Mon, 12 Feb 2001 11:01:38 -0600 Subject: ANNIE 2001 Smart Engineering Systems Design Conference November 4-7, 2001 Message-ID: Dear Colleagues: On behalf of the organizing committee I would like to invite you to attend ANNIE 2001 ( http://www.umr.edu/~annie/annie01 ) which will be held November 4th-7th, 2001, at the Marriott Pavilion Hotel in downtown St. Louis, Missouri, USA. 
This will be the eleventh international gathering of researchers interested in Smart Engineering System Design using neural networks, fuzzy logic, evolutionary programming, complex systems, data mining, and rough sets. Each previous conference drew approximately 150 papers from twenty countries. The proceedings of all conferences have been published by ASME Press as hardbound books in nine volumes. The latest volume, edited by Dagli, et. al., was titled "Smart Engineering System Design: Neural Networks, Fuzzy Logic, Evolutionary Programming, Data Mining, and Complex Systems." You can visit ANNIE home page for details of these conferences http://www.umr.edu/~annie/ . ANNIE 2001 ( http://www.umr.edu/~annie/annie01 ) will cover the theory of Smart Engineering System Design techniques, namely: neural networks, fuzzy logic, evolutionary programming, complex systems, data mining, and rough sets. Presentations dealing with applications of these technologies are encouraged. The organizing committee invites all persons interested in Computational Intelligence to submit papers for presentation at the conference. You can submit your abstracts online at http://www.umr.edu/~annie/oas.htm . All papers accepted for presentation will be published in the conference proceedings. They will be reviewed by two referees, senior researchers in the field, for technical merit and content. AUTHORS SCHEDULE March 2, 2001: Deadline for contributed paper abstract, information sheet, and letter of intent. May 18, 2001: Deadline for full papers. July 6, 2001: Notification of status of contributed papers. August 10, 2001: Deadline for camera ready manuscripts. Six pages will be allocated for each accepted paper in the proceedings. All accepted papers will be published as a hardbound book by ASME Press and edited by Drs. Dagli, Buczak, Embrechts, Ersoy, Ghosh and Kercel. Authors are requested to submit the following by March 2, 2001: 1) an abstract (up to 200 words) 2) an information sheet that includes the full name of the authors, address, phone and FAX number, E-mail, Web address 3) a letter of intent Authors should forward their letter of intent, information sheet, abstract, and full paper to: Dr. Cihan H. Dagli, Conference Chair Smart Engineering Systems Laboratory Department of Engineering Management University of Missouri - Rolla 1870 Miner Circle Rolla, MO 65409-0370, USA Phone: (573) 341-6576 or (573) 341-4374 FAX: (573) 341-6268 E-Mail: annie at umr.edu -or- patti at umr.edu Internet: http://www.umr.edu/~annie From tsims at talley.com Fri Feb 9 18:15:17 2001 From: tsims at talley.com (tsims@talley.com) Date: Fri, 9 Feb 2001 18:15:17 -0500 Subject: IJCNN 2001 deadline extended to Feb 22, 2001 Message-ID: <852569EE.007FBE33.00@frodo.talley.com> E X T E N D E D A B S T R A C T D E A D L I N E for I J C N N 2 0 0 1 International Joint Conference on Neural Networks 2001 Washington, D.C., July 15-19,2001 Deadline for ABSTRACTS extended to February 22, 2001. Electronic Submission of Abstracts: http://ams.cos.com/cgi-bin/login?institutionId=2425&meetingId=19 --------------------------------------------------- Dear Colleague, There is an important new development for the IJCNN 2001. We are informed that the National Science Foundation is launching a major new interdisciplinary funding initiative soon, one that should be of extreme interest to our community. 
Accordingly, the INNS Board has decided to feature this NSF Initiative at the IJCNN 2001, including opportunity for special sessions/tracks, and a Panel that focuses on this new initiative. This will be a venue in which our community can provide the agencies involved in this new interdisciplinary initiative a broader scope of ideas to include in the definition of their programs. WE INVITE you, our colleagues, to consider submitting paper abstracts and/or proposals for special sessions on topics which involve multiple disciplines in our field, and hopefully, in areas that push boundaries beyond the scope of previous session topics. We encourage you and other interested scientists to participate in presentations and on-site discussions to launch the new initiative. Our hope is that such presentations, coupled with details of the funding initiative and a panel discussion on the topic will provide an outstanding opportunity for attendees of the IJCNN 2001. Sincerely, Ken Marko and Paul Werbos, General Chairs, IJCNN 2001 PS: We are happy to announce that over 500 Abstracts have already been submitted for the IJCNN 2001, and encourage those who may have missed the original deadline to submit their abstracts now. Use the following URL to go to the abstract submission site: http://ams.cos.com/cgi-bin/login?institutionId=2425&meetingId=19 Sincerely, Ken Marko and Paul Werbos For the Organizing Committee of IJCNN 2001 From steve at cns.bu.edu Tue Feb 13 11:21:54 2001 From: steve at cns.bu.edu (Stephen Grossberg) Date: Tue, 13 Feb 2001 08:21:54 -0800 Subject: Resonant Dynamics of Speech Perception Message-ID: The following article is now available at http://www.cns.bu.edu/Profiles/Grossberg in HTML, PDF, and Gzipped Postscript. Grossberg, S. and Myers, C.W. The resonant dynamics of speech perception: Interword integration and duration-dependent backward effects. Psychological Review. ABSTRACT: How do listeners integrate temporally distributed phonemic information into coherent representations of syllables and words? During fluent speech perception, variations in the durations of speech sounds and silent pauses can produce different perceived groupings. For example, increasing the silence interval between the words "gray chip" may result in the percept "great chip", whereas increasing the duration of fricative noise in "chip" may alter the percept to "great ship" (Repp et al., 1978). The ARTWORD neural model quantitatively simulates such context-sensitive speech data. In ARTWORD, sequential activation and storage of phonemic items in working memory provides bottom-up input to unitized representations, or list chunks, that group together sequences of items of variable length. The list chunks compete with each other as they dynamically integrate this bottom-up information. The winning groupings feed back to provide top-down support to their phonemic items. Feedback establishes a resonance which temporarily boosts the activation levels of selected items and chunks, thereby creating an emergent conscious percept. Because the resonance evolves more slowly than working memory activation, it can be influenced by information presented after relatively long intervening silence intervals. The same phonemic input can hereby yield different groupings depending on its arrival time. Processes of resonant transfer and competitive teaming help determine which groupings win the competition. 
Habituating levels of neurotransmitter along the pathways that sustain the resonant feedback lead to a resonant collapse that permits the formation of subsequent resonances. Keywords: speech perception, word recognition, consciousness, adaptive resonance, context effects, consonant perception, neural network, silence duration, working memory, categorization, clustering. From mpp at us.ibm.com Tue Feb 13 13:58:04 2001 From: mpp at us.ibm.com (Michael Perrone) Date: Tue, 13 Feb 2001 13:58:04 -0500 Subject: IBM Graduate Summer Intern Positions in Handwriting Recognition Message-ID: _________________________________________________________________________________ Graduate Summer Intern Positions at IBM _________________________________________________________________________________ The Pen Technologies Group at the IBM T.J. Watson Research Center is looking for graduate students to fill summer R&D positions in the area of large-vocabulary, unconstrained, handwriting recognition. Candidates should have the following qualifications: - Currently enrolled in a PhD program in EE, CS, Math, Physics or similar field - Research experience in handwriting recognition or IR - Strong mathematics/probability background - Excellent programming skills (in C and C++) - Creativity Our current projects include: - HMM-based, unconstrained, handwriting recognition - Language and grammar modeling - Accurate, high-speed, search methods - Document understanding and processing - Pen computing - Handwritten document retrieval The IBM T.J. Watson Research Center is one of the top industrial laboratories in the world. We offer an exciting research environment with the opportunity to become involved in all aspects of cutting edge technology in the computer industry. ______________________ Please send CV's to: Michael P. Perrone mpp at us.ibm.com -or- Michael P. Perrone IBM T.J. Watson Research Center - 36-207 Route 134 Yorktown Heights, NY 10598 914-945-1779 From cangulo at esaii.upc.es Tue Feb 13 12:14:36 2001 From: cangulo at esaii.upc.es (Cecilio Angulo) Date: Tue, 13 Feb 2001 18:14:36 +0100 Subject: CFP Proposed Pre-Organized Session. IWANN 2001 Message-ID: <3A896B7B.19B6BAC9@esaii.upc.es> Call for papers ========== Proposed Pre-Organized Session. IWANN 2001 Granada, Spain. June 13-15, 2001 Title: Kernel Machines. Kernel-based Learning Methods. Organizer: Dr. Andreu Catal Description: Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel functions in learning systems for problems of classification, regression, density estimation, ... Some learning machines employing kernel methodology are Support Vector Machines, Radial Basis Function Neural Networks, Gaussian Process prediction, Mathematical Programming with Kernels, Regularized Artificial Neural Networks, Reproducing Kernel Hilbert Spaces and related methods. Some working directions are the design of novel kernel-based algorithms and novel types of kernel functions, the development of new learning theory concepts, and the speed-up of the learning process.
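[Editor's illustration, not part of the IWANN announcement: a minimal sketch of the kind of kernel-based learning machine this session covers, here kernel ridge regression with an RBF kernel in plain NumPy. The function names, the toy data, and the hyperparameter values are this sketch's own assumptions.]

```python
# Minimal kernel ridge regression sketch (illustrative only).
# An RBF kernel implicitly maps inputs into a reproducing kernel Hilbert
# space; the regression is then linear in that space but nonlinear in x.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_kernel_ridge(X, y, lam=1e-2, gamma=1.0):
    """Solve (K + lam*I) alpha = y in the dual; returns dual coefficients."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_kernel_ridge(X_train, alpha, X_new, gamma=1.0):
    """Prediction is a kernel expansion over the training points."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy usage: learn y = sin(x) from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
alpha = fit_kernel_ridge(X, y)
print(predict_kernel_ridge(X, alpha, np.array([[0.0], [1.5]])))
```

Support vector machines and Gaussian process prediction differ in how the dual coefficients are obtained (margin-based optimisation, Bayesian inference), but share this same structure of a kernel expansion over training points.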
Final Date for Submission February 28, 2001 Acceptance notification and start of inscription March 31, 2001 End of reduction fee for early inscription April 30, 2001 Congress date June 13-15, 2001 From cindy at cns.bu.edu Wed Feb 14 14:56:32 2001 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 14 Feb 2001 14:56:32 -0500 Subject: Neural Networks 14(2) Message-ID: <200102141956.OAA07727@retina.bu.edu> NEURAL NETWORKS 14(2) Contents - Volume 14, Number 2 - 2001 ------------------------------------------------------------------ CONTRIBUTED ARTICLES: ***** Mathematical and Computational Analysis ***** Modularity, evolution, and the binding problem: A view from stability theory J.-J.E. Slotine and W. Lohmiller A pruning method for the recursive least squared algorithm Chi-Sing Leung, Kwok-Wo Wong, Pui-Fai Sum, and Lai-Wan Chan Two methods for encoding clusters Pierre Courrieu Numerical solution of differential equations using multiquadric radial basis function networks Nam Mai-Duy and Thanh Tran-Cong ***** Engineering and Design ***** Biomimetic gaze stabilization based on feedback-error-learning with nonparametric regression networks T. Shibata and S. Schaal A globally convergent Lagrange and barrier function iterative algorithm for the traveling salesman problem Chuangyin Dang and Lei Xu Quick fuzzy backpropagation algorithm A. Nikov and S. Stoeva ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
---------------------------------------------------------------------------- Membership Type INNS ENNS JNNS ---------------------------------------------------------------------------- membership with $80 or 660 SEK or Y 15,000 [including Neural Networks 2,000 entrance fee] or $55 (student) 460 SEK (student) Y 13,000 (student) [including 2,000 entrance fee] ----------------------------------------------------------------------------- membership without $30 200 SEK not available to Neural Networks non-students (subscribe through another society) Y 5,000 (student) [including 2,000 entrance fee] ----------------------------------------------------------------------------- Institutional rates $1132 2230 NLG Y 149,524 ----------------------------------------------------------------------------- Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From evansdj at aston.ac.uk Wed Feb 14 05:51:10 2001 From: evansdj at aston.ac.uk (DJ EVANS) Date: Wed, 14 Feb 2001 10:51:10 +0000 Subject: JOB: Postdoctoral Research Fellowship in Non-linear Statistics Message-ID: <3A8A631E.4EEFE77C@aston.ac.uk> Postdoctoral Research Fellowship Neural Computing Research Group, Aston University, Birmingham, UK NAOC: Neural network Algorithms for Ocean Colour We are looking for a highly motivated individual for a 3 year postdoctoral research position in the area of the analysis of multi-spectral, remotely sensed data. The emphasis of this research will be on developing and applying data modelling, visualisation and analysis algorithms to measurements of the 'ocean colour' across the electro-magnetic spectrum to generate estimates of chlorophyll-A concentrations using satellite observations. This will allow a better understanding of the distribution of primary biological production and the global carbon cycle. You will be working in a team from seven institutions around the EuropeanUnion, and there is scope for a great deal of interaction. For more details, see the NAOC web site http://www.ncrg.aston.ac.uk/~cornfosd/naoc/. You will work in the Neural Computing Research Group which has a worldwide reputation in practical and theoretical aspects of information analysis. Applicants should have strong mathematical and computational skills; some knowledge of remote sensing would be an advantage. 
We expect to use methods such as the Generative Topographic Mapping (GTM) to model the unconditional probability distribution of the ocean colour data, and then combine this with expert labelling to perform 'unsupervised' classification. We will also develop spatial (Gaussian process) models of the abundance of phytoplankton to aid field-wise (Bayesian) ocean colour retrieval. Candidates should hold (or imminently expect to gain) a PhD in Mathematics, Statistics, Physics, Remote Sensing or a related field. Experience of programming (especially MATLAB or C/C++) and a knowledge of Unix would be an advantage. Salaries will be up to point 6 on the RA 1A scale, currently 18731 UK pounds per annum. The salary scale is subject to annual increments. If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, quoting reference NAOC01, to: Personnel Officer Aston University Birmingham B4 7ET, U.K. Tel: +44 (0)121 359 0870 Fax: +44 (0)121 359 6470 Further details can be obtained from Dr. Dan Cornford: d.cornford at aston.ac.uk. Closing date: 16 March 2001 -- Dr Dan Cornford d.cornford at aston.ac.uk Computer Science Aston University Aston Triangle tel +44 121 359 3611 ext 4667 Birmingham B4 7ET fax +44 121 333 6215 http://www.ncrg.aston.ac.uk/~cornfosd/ ------------------------------------------------------------- David J. Evans evansdj at aston.ac.uk Neural Computing Research Group tel: +44 (0)121 359 3611 Room 318 C ext. 4668 Aston University Aston Triangle Birmingham B4 7ET http://www.ncrg.aston.ac.uk -------------------------------------------------------------- From terry at salk.edu Wed Feb 14 21:15:47 2001 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 14 Feb 2001 18:15:47 -0800 (PST) Subject: NEURAL COMPUTATION 13:3 In-Reply-To: <200101130743.f0D7h9v12640@purkinje.salk.edu> Message-ID: <200102150215.f1F2FlC47730@kepler.salk.edu> Neural Computation - Contents - Volume 13, Number 3 - March 1, 2001 Article The Limits of Counting Accuracy in Distributed Neural Representations A. R. Gardner-Medwin and H. B. Barlow Note An Expectation-Maximisation Approach to Nonlinear Component Analysis Roman Rosipal and Mark Girolami Letters A Population Density Approach that Facilitates Large-Scale Modeling of Neural Networks: Extension to Slow Inhibitory Synapses Duane Q. Nykamp and Daniel Tranchina A Complex Cell-Like Receptive Field Obtained by Information Maximization Kenji Okajima and Hitoshi Imaoka Self-Organization of Topographic Mixture Networks Using Attentional Feedback James R. Williamson Auto-SOM: Recursive Parameter Estimation for Guidance of Self-Organizing Feature Maps Karin Haese and Geoffrey J. Goodhill Exponential Convergence of Delayed Dynamical Systems Tianping Chen and Shun-ichi Amari Improvements to Platt's SMO Algorithm for SVM Classifier Design S. S. Keerthi, S. K. Shevade, C. Bhattacharyya and K. R. K. Murthy Learning Hough Transform: A Neural Network Model Jayanta Basak A Constrained E.M.
Algorithm for Independent Component Analysis Max Welling and Markus Weber Emergence of Memory-Driven Command Neurons in Evolved Artificial Agents Ranit Aharonov-Barki, Tuvik Beker and Eytan Ruppin ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES USA Canada* Other Countries Student/Retired $60 $64.20 $108 Individual $88 $94.16 $136 Institution $460 $492.20 $508 * includes 7% GST MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From d.mareschal at bbk.ac.uk Thu Feb 15 04:24:20 2001 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Thu, 15 Feb 2001 10:24:20 +0100 Subject: changes to Developmental Science Message-ID: Dear all, Following the tragic and unexpected demise of Professor George Butterworth (founding editor of Developmental Science) a few changes have been made to the editorial board and emphasis of the journal. DEVELOPMENTAL SCIENCE publishes cutting-edge theory and up-to-the-minute research on scientific developmental psychology from leading thinkers in the field. New scientific findings and in-depth empirical studies are published, with coverage including species comparative, computational modelling (ESPECIALLY CONNECTIONIST OR NEURAL NETWORK MODELLING), social and biological approaches to development as well as cognitive development. INCREASING EMPHASIS WILL BE PLACED ON PAPERS THAT BRIDGE LEVELS OF EXPLANATION IN DEVELOPMENTAL SCIENCE, such as between brain growth and perceptual, cognitive and social development (sometimes called "developmental cognitive neuroscience"), and those which cover emerging areas such as functional neuroimaging of the developing brain. SPECIAL ISSUE, Volume 4, Issue 3 THE DEVELOPING HUMAN BRAIN Edited by Michael Posner, Mary Rothbart, Martha Farah and John Bruer This special issue will contain a state-of-the-art assessment written by six international panels who reviewed research progress on fundamental questions in the field. Chapters include: perception and attention, language, temperament and emotion, socialization and psychopathology. The two-year process of compiling the report involved meetings that included leading investigators in many fields and countries. EDITOR Mark H. Johnson, Birkbeck College, UK ASSOCIATE EDITORS B.J. Casey, Sackler Institute, Cornell University, USA Adele Diamond, Eunice Kennedy Shriver Center, USA Barbara Finlay, Cornell University, USA Patricia Kuhl, University of Washington, Seattle, USA Denis Mareschal, Birkbeck College, UK Andy Meltzoff, University of Seattle, USA Helen Neville, University of Oregon, USA Paul Quinn, Washington and Jefferson College, USA Scania de Schonen, CNRS Paris, France Michael Tomasello, Max Planck Inst. of Evol. Anthropology, Germany Editor for "Developmental Science in other languages" Juan-Carlos Gomez, University of St Andrews, UK FURTHER INFORMATION CAN BE OBTAINED FROM: http://www.blackwellpublishers.co.uk/journals/desc/ ================================================= Dr.
Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 (0)20 7631-6582/6207 fax +44 (0)20 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From lunga at ifi.unizh.ch Mon Feb 19 08:48:43 2001 From: lunga at ifi.unizh.ch (Max Lungarella) Date: 19 Feb 2001 14:48:43 +0100 Subject: DECO2001 - CFP Message-ID: <3A91243B.FB87BB9B@ifi.unizh.ch> We apologize for multiple postings of this call. ********************** CALL FOR PAPERS ************************** DECO2001 - Developmental Embodied Cognition Workshop to be held in conjunction with the 23rd Annual Meeting of the Cognitive Science Society July 31st, University of Edinburgh, Scotland http://www.cogsci.ed.ac.uk/~deco/ PAPER DEADLINE: April 30, 2001 WORKSHOP DATE: July 31, 2001 SCOPE The objective of this workshop is to bring together researchers from cognitive science, psychology, robotics, artificial intelligence, philosophy, and related fields to discuss the role of developmental and embodied views of cognition, and in particular, their mutual relationship. The ultimate goal of this approach is to understand the emergence of high-level cognition in organisms based on their interactions with their environment over extended periods of time. FOCUS The symposium will focus on research that explicitly takes development in embodied systems into account, either at the level of empirical work, computational models, or real-world devices. Finally, contributions giving a broad and novel philosophical or methodological view on embodied cognition are welcome. CONTRIBUTIONS Contributions are solicited from the following areas (but not restricted to this list): - Mechanisms of development (e.g. neural networks, dynamical systems, information theoretic) - Development of sensory and motor systems - Perception-action coupling, sensory-motor coordination - Cognitive developmental robotics - Categorization, object exploration - Communication and social interaction - Methodologies - Debates and philosophical issues (e.g. constructivism-selectionism, nature-nurture, scalability, symbol grounding) ORGANIZATION This will be a one-day workshop with a number of invited talks, a poster session, and with lots of room for discussion. In order to allow for more individual interactions, and to make better use of the limited time we have for the workshop, accepted papers will generally be presented as posters. The poster session will be held over a cocktail to ensure a relaxed atmosphere. FORM OF CONTRIBUTION Contributions should be in the form of full papers with a maximum of five pages, which, if accepted, can be presented as posters at the workshop. Accepted contributions will be assembled in a handout paper collection for the participants of the workshop. At the workshop there will be a discussion on potential further publication, e.g. as a special issue of a journal or a book. GUIDELINES FOR ABSTRACT/PAPER SUBMISSION The contribution should be submitted electronically to deco at cogsci.ed.ac.uk in pdf format.
Information on how to convert LaTeX or Word documents to pdf can be found at http://www.hcrc.ed.ac.uk/cogsci2001/PDF.html IMPORTANT DATES Full papers submission: April 30, 2001 Notification of acceptance: May 31, 2001 Workshop date: July 31, 2001 LINKS Workshop page: http://www.cogsci.ed.ac.uk/~deco Cognitive Science Conference: http://www.hcrc.ed.ac.uk/cogsci2001/ PROGRAM COMMITTEE Rolf Pfeifer (chair, AI Lab, University of Zurich, Switzerland) Gert Westermann (co-chair, Sony CSL, Paris) Cynthia Breazeal (MIT AI Lab, Cambridge, Mass., USA) Yiannis Demiris (Laboratory of Physiology, Oxford University, UK) Max Lungarella (AI Lab, University of Zurich, Switzerland) Rafael Nunez (University of Fribourg, Switzerland, and University of California, Berkeley, USA) Linda Smith (Department of Psychology, Indiana University, Bloomington, IN, USA) From moody at cse.ogi.edu Mon Feb 19 18:30:44 2001 From: moody at cse.ogi.edu (John E Moody) Date: Mon, 19 Feb 2001 15:30:44 -0800 Subject: Rio de Janeiro, 5th Congress on Neural Nets, April 2-5, 2001 Message-ID: <3A91ACA4.93039698@cse.ogi.edu> Connectionists, I am posting this announcement at the request of the organizers of the Fifth Brazilian Congress on Neural Networks, which will convene in Rio de Janeiro on April 2-5, 2001. For registration and abstract submission information, please see the conference URL at: http://www.ele.ita.br/cnrn Early registration continues through February 28. The Congress will feature presentations by Paul Werbos, Teuvo Kohonen and others, including: "Automatic Knowledge Discovery in Power Systems", Prof. Louis A. Wehenkel (U. Liege). "Real Time Dynamic Security Assessment", Prof. Mohamed El-Sharkawi (U. Washington). "Electrical Load Forecasting Via Neural Networks", Prof. Tharam S. Dillon (La Trobe U.). "The Application of Fundamental and Neural Network Models to Credit Risk Analysis", Dr. Pratap Sondhi (Citibank). "Financial Model Calibration Using Consistency Hints", Prof. Yaser Abu-Mostafa (CALTECH). "Learning to Trade via Direct Reinforcement", Prof. John Moody (Oregon Graduate Institute). "Self-Organizing Maps of Massive Document Collections", Prof. Teuvo Kohonen (Helsinki University of Technology). "Specification, Estimation, and Evaluation of Neural Networks Time Series Models", Prof. Timo Tersvirta (Stockholm School of Economics). "Neural Networks in Control Systems", Dr. Paul Werbos (National Science Foundation). See you in Rio! John ================================================================= John E. Moody http://www.cse.ogi.edu/~moody/ Professor and Director, http://www.cse.ogi.edu/CompFin/ Computational Finance Program Oregon Graduate Institute Voice: (503) 748-1554 20000 NW Walker Road FAX: (503) 748-1548 Beaverton, OR 97006 USA Email: moody at cse.ogi.edu ================================================================= From nnsp01 at neuro.kuleuven.ac.be Mon Feb 19 09:08:43 2001 From: nnsp01 at neuro.kuleuven.ac.be (Neural Networks for Signal Processing 2001) Date: Mon, 19 Feb 2001 15:08:43 +0100 Subject: NNSP 2001, Call For Papers Message-ID: <3A9128EB.6364725B@neuro.kuleuven.ac.be> ****************************************************************************** 2001 IEEE Workshop on Neural Networks for Signal Processing September 10--12, 2001 Falmouth, Massachusetts, USA ****************************************************************************** Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council General Chairs David J. 
MILLER Pennsylvania State University Tulay ADALI University of Maryland Baltimore County Program Chairs Jan LARSEN Technical University of Denmark Marc VAN HULLE Katholieke Universiteit, Leuven Finance Chair Lian YAN Athene Software, Inc. Proceedings Chair Scott C. DOUGLAS Southern Methodist University Publicity Chair Patrick DE MAZIERE Katholieke Universiteit, Leuven Registration Elizabeth J. WILSON Raytheon Co. Europe Liaison Herve BOURLARD Swiss Federal Institute of Technology, Lausanne America Liaison Amir ASSADI University of Wisconsin at Madison Asia Liaison H.C. FU National Chiao Tung University Papers Thanks to the sponsorship of IEEE Signal Processing Society and IEEE Neural Network Council, the eleventh of a series of IEEE workshops on Neural Networks for Signal Processing will be held in Falmouth, Massachusetts, at the SeaCrest Oceanfront Resort and Conference Center. The workshop will feature keynote addresses, technical presentations and panel discussions. Papers are solicited for, but not limited to,the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, parameter estimation, nonlinear signal processing, generalization, design algorithms, optimization, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, blind source separation, sonar and radar, data fusion, data mining, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. Further Information NNSP'2001 webpage: http://eivind.imm.dtu.dk/nnsp2001 Paper Submission Procedure Prospective authors are invited to submit a full paper of up to ten pages using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2001 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academic Publishers. Program Committee Yianni Attikiouzel Andrew Back Herve Bourlard Andrzej Cichocki Jesus Cid-Sueiro Robert Dony Ling Guan Tzyy-Ping Jung Shigeru Katagiri Jens Kohlmorgen Fa Long Luo Danilo Mandic Elias Manolakos Michael Manry Takashi Matsumoto Li Min Fu Christophe Molina Bernard Mulgrew Mahesan Niranjan Tomaso Poggio Kostas N. Plataniotis Jose Principe Phillip A.Regalia Joao-Marcos T. Romano Kenneth Rose Jonas Sjoberg Robert Snapp M. 
Kemal Sonmez Naonori Ueda Lizhong Wu Lian Yan Fernando J. Von Zuben Schedule Submission of full paper: March 15, 2001 Notification of acceptance: May 1, 2001 Submission of photo-ready accepted paper and author registration: June 1, 2001 Advance registration, before: July 15, 2001 From A.van.Ooyen at nih.knaw.nl Wed Feb 21 10:29:54 2001 From: A.van.Ooyen at nih.knaw.nl (Arjen van Ooyen) Date: Wed, 21 Feb 2001 16:29:54 +0100 Subject: New Paper Message-ID: <3A93DEF2.D1FF5234@nih.knaw.nl> New Paper: Competition in the Development of Nerve Connections: A Review of Models Network: Computation in Neural Systems (2001) 12: R1-R47 by Arjen van Ooyen Please download (free) from Network: http://www.iop.org/Journals/ne or via my website http://www.anc.ed.ac.uk/~arjen/competition.html Abstract: The establishment and refinement of neural circuits involve both the formation of new connections and the elimination of already existing connections. Elimination of connections occurs, for example, in the development of mononeural innervation of muscle fibres and in the formation of ocular dominance columns in the visual cortex. The process that leads to the elimination of connections is often referred to as axonal or synaptic competition. Although the notion of competition is commonly used, the process is not well understood -- with respect to, for example, the type of competition, what axons and synapses are competing for, and the role of electrical activity. This article reviews the types of competition that have been distinguished and the models of competition that have been proposed. Models of both the neuromuscular system and the visual system are described. For each of these models, the assumptions on which it is based, its mathematical structure, and the extent to which it is supported by the experimental data are evaluated. Special attention is given to the different modelling approaches and the role of electrical activity in competition. -- Arjen van Ooyen, Netherlands Institute for Brain Research, Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands. email: A.van.Ooyen at nih.knaw.nl website: http://www.anc.ed.ac.uk/~arjen phone: +31.20.5665483 fax: +31.20.6961006 From jbower at bbb.caltech.edu Wed Feb 21 17:57:43 2001 From: jbower at bbb.caltech.edu (James M. Bower) Date: Wed, 21 Feb 2001 14:57:43 -0800 Subject: No subject Message-ID: CNS*01 Housing Alert We are pleased to announce the submission of almost 300 papers to this year's Computational Neuroscience Meeting in San Francisco and Asilomar, California. http://cns.numedeon.com/cns2001/ Those of you planning on attending this year's CNS meeting, however, need to make your reservations at the Ramada Plaza Hotel in San Francisco immediately. The deadline for reserving rooms at the conference rate is: Wednesday, FEB. 28th, 2001 After that date, the cost per night for a hotel room will go from the conference rate of $139 to the regular rate of $239. Please reserve your room by calling 1-800-227-4747 or (415)626-8000. You can also download the reservation form from http://cns.numedeon.com/cns2001/ and fax the form to (415) 861-1435, Attn: Kuldip Singh. Please indicate to the reservation agent that you are a participant in the CNS*01 conference. Remember, you can reserve rooms at the conference rate anytime from Friday, June 29 to Saturday, June 30. Transportation will be provided from the Ramada Plaza Hotel on Sunday, July 1 to the Asilomar Conference Center. Thank you.
Jim Bower, CNS Meeting Chair Judy Macias, CNS Conference Coordinator From harnad at coglit.ecs.soton.ac.uk Sun Feb 25 10:53:31 2001 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sun, 25 Feb 2001 15:53:31 +0000 (GMT) Subject: Survey of Users and Non-Users of Eprint Archives Message-ID: I would be very grateful if you could participate in a survey we are conducting on current users and non-users of Eprint Archives. http://www.eprints.org/survey/ The purpose of the survey is to determine who is and is not using such archives at this time, how they use them if they do, why they do not use them if they do not, and what features they would like to have added to them to make them more useful. (The survey is anonymous. Revealing your identity is optional and it will be kept confidential.) The survey consists of about web-based 72 questions, and comes in four versions: PHYSICISTS, ASTROPHYSICISTS, MATHEMATICIANS 1. arXiv Users 2. arXiv Non-Users COGNITIVE SCIENTISTS (Psychologists, Neuroscientists, Behavioral Biologists, Computer Scientists [AI/robotics/vision/speech/learning], Linguists, Philosophers) 3. CogPrints Users 4. CogPrints Non-Users OTHER DISCIPLINES: Please use either 2. or 4. http://www.eprints.org/survey/ Many thanks, Stevan Harnad harnad at cogsci.soton.ac.uk Professor of Cognitive Science harnad at princeton.edu Department of Electronics and phone: +44 23-80 592-582 Computer Science fax: +44 23-80 592-865 University of Southampton http://www.cogsci.soton.ac.uk/~harnad/ Highfield, Southampton http://www.princeton.edu/~harnad/ SO17 1BJ UNITED KINGDOM From: esann To: "Connectionists at cs.cmu.edu" References: From bogus@does.not.exist.com Mon Feb 26 08:10:40 2001 From: bogus@does.not.exist.com () Date: Mon, 26 Feb 2001 14:10:40 +0100 Subject: ESANN'2001 programme ( European Symposium on Artificial Neural Networks) Message-ID: ---------------------------------------------------- | | | ESANN'2001 | | | | 9th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 25-26-27, 2001 | | | | Preliminary programme | ---------------------------------------------------- The preliminary programme of the ESANN'2001 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; however please apologize if you receive this e-mail twice, despite our precautions. For 9 years the ESANN conference has become a major event in the field of neural computation. ESANN is a human-size conference focusing on fundamental aspects of artificial neural networks (theory, models, algorithms, links with statistics, data analysis, biological background,...). The programme of the conference can be found at the URL http://www.dice.ucl.ac.be/esann, together with practical information about the conference venue, registration,... Other information can be obtained by sending an e-mail to esann at dice.ucl.ac.be . ===================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. 
du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat D facto conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ===================================================== From mzib at ee.technion.ac.il Mon Feb 26 09:31:58 2001 From: mzib at ee.technion.ac.il (Michael Zibulevsky) Date: Mon, 26 Feb 2001 16:31:58 +0200 (IST) Subject: New paper on extraction of a single source from multichannel data Message-ID: Announcing a paper ... Title: Extraction of a single source from multichannel data using sparse decomposition Authors: M. Zibulevsky and Y.Y. Zeevi ABSTRACT: It was discovered recently that the use of sparse decompositions in signal dictionaries dramatically improves the quality of blind source separation. In this work we exploit sparse decomposition of a single source in order to extract it from the multidimensional sensor data, when in addition a rough template of the source is known. This leads to a convex optimization problem, which is solved by a Newton-type method. Complete and overcomplete dictionaries are considered. Simulations with synthetic evoked responses mixed into natural 122-channel MEG data show significant improvement in accuracy of signal restoration. URL of gzipped ps file: http://ie.technion.ac.il/~mcib/extr_onesrc2.ps.gz Contact: mzib at ee.technion.ac.il From campber at CLEMSON.EDU Mon Feb 26 17:51:36 2001 From: campber at CLEMSON.EDU (Robert L. Campbell) Date: Mon, 26 Feb 2001 17:51:36 -0500 Subject: Interactivist Summer Institute, Lehigh University, July 23-27, 2001 Message-ID: The Interactivist Summer Institute 2001 July 23-27, 2001 Lehigh University, Bethlehem, Pennsylvania, USA GENERAL INFORMATION (This Call for Participation, along with related information, can also be viewed at http://www.lehigh.edu/~interact/isi2001.html). It's happening: research threads in multiple fields scattered across the mind-sciences seem to be converging towards a point where the classical treatment of representation within the encodingist framework is felt as an impasse. A rethinking of the methods, concepts, arguments, facts, etc. is needed and, so it seems, is being found in the interactivist approach. From research in human cognition, motivation, and development, through consciousness, sociality, and language, to artificial intelligence, post-behaviorist cognitive robotics, and interface design, we are witnessing the appearance of projects where the assumptions of interactivism are embraced. More often than not, this is in an implicit manner, so that at a superficial level those projects (the problems they deal with, the methods they use) seem to be incommensurable. However, underneath, one can feel their interactivist gist. The time is right (and ripe), we felt, to articulate this "irrational" (in the Feyerabendian sense) pressure for change at a programmatic level, and this is what we want to accomplish with the present workshop. The workshop will be preceded by a Summer School in Interactivism featuring several tutorials which are meant to provide the needed theoretical background, based mainly on Mark Bickhard and his collaborators' work.
The intention is for this Institute to become a traditional annual meeting where those sharing the core ideas of interactivism will meet and discuss their work, try to reconstruct its historical roots, put forward current research in different fields that fits the interactivist framework, and define research topics for prospective graduate students. People working in philosophy of mind, linguistics, social sciences, artificial intelligence, cognitive robotics, and other fields related to the sciences of mind are invited to send their statement of interest for participation to the organizers (see details below). ORGANIZING COMMITTEE Mark Bickhard John C. Christopher Wayne Christensen Robert Campbell Georgi Stojanov Goran Trajkovski MAJOR THEMES * Foundations of Interactivism * Naturalism * Emergence * Process metaphysics * Cognition and Representation * Representation emergent in action systems * Dissolution of problems of skepticism, error, Chinese room, etc. * Concepts * Memory * Learning * Heuristic learning * Metaphor * Rationality and negative knowledge * Agents * Interaction * Motivation * Emotions * Autonomous agents * Persons * Development * Consciousness * Sociality * Language * Ethics * Social processes and realities ACADEMIC AND SCIENTIFIC CO-SPONSORS Lehigh University, Bethlehem, PA, USA Institute for Interactive Studies Cognitive Science Program Humanities Research Center SS Cyril and Methodius University, Skopje, Macedonia West Virginia University at Parkersburg CALL FOR PARTICIPATION Participation will be limited to 30 people and by invitation only; People wishing to participate should submit a short curriculum vitae and a statement of interest to Interactivist Summer Institute. Please include e-mail address and/or fax number, if available. Applications should be received by March 15, 2001. Notification of acceptance will be provided by April 15, 2001. The meeting will take place in the conference room The Governor's Suite, Iaccoca Hall (tentative). A small number of scholarships for partial financial support will be provided by the organizers for graduate students or postdocs. CALL FOR PAPERS If you are interested in the issues mentioned above and wish to share your thoughts and research results with like-minded people, please submit an extended abstract or full paper via email with attached files (in ASCII, RTF, or Word) to: Interactivist Summer Institute (interact at lehigh.edu) Abstracts and papers should be sent taking into account the following format: 1. Major theme of the paper, related to the major themes given above. 2. Paper title. 3. Extended abstract of 500 to 1500 words and/or paper drafts of 2000 to 5000 words, in English. 4. Author or co-authors with names, addresses, telephone number, fax number and e-mail address. All abstracts will be refereed by an independent panel of experts. The judgments of the referees will determine the list of papers to be presented at the conference. DEADLINES Applications: March 15 Submission of papers: March 15 Notification date: April 15 Receipt of registration fee: May 1 On campus housing reservation (see below): June 30 Off campus housing reservation (see below): June 22 CONFERENCE FEES Standard registration fee: $150 Student registration fee: $100 Checks should be made out to: Interactivist Summer Institute. Mail to: Mark H. 
Bickhard Interactivist Summer Institute 17 Memorial Drive East Bethlehem, PA 18015 USA For wire transfers: Wire address: First Union National Bank Funds Transfer Department Attention: NC0803 1525 West W.T. Harris Blvd. Charlotte, NC 28288-0803 ABA # 031201467 Account # 2100012444293 Account Name: Lehigh University For international wires, these additional identification numbers are required: CHIPS Participant #0509 Swift TID #PNBPUS33 You must include your name and identify that the transfer is for the Interactivist Summer Institute. HOUSING Housing is available both on campus and off campus. Off campus housing is with Comfort Suites, and is within easy walking distance of the main Lehigh campus. The rates are $80/night for a single and $85/night for a double. Please contact: Comfort Suites 120 W 3rd Bethlehem, PA 18015 USA 610-882-9700 On campus housing is available both air-conditioned (Trembly Park) and not air-conditioned (Gamma Phi Beta). For on campus housing, please fill out and return the Interactivist Summer Institute housing form. This can be obtained from the Web site as a PDF file or a Word file. TRAVEL The easiest way to get to Bethlehem is to fly into Lehigh Valley International Airport (known as ABE, from Allentown, Bethlehem, Easton - LVI is already taken by Las Vegas International Airport). There are direct flights from Chicago, for example, for those coming from the west, and also flights from the South (e.g., Atlanta). Flying into New York, particularly Newark Airport, also works well. There are buses to Bethlehem from Newark Airport and from the Port Authority Bus Terminal in Manhattan. So, from Kennedy or LaGuardia, you first go the Port Authority, and then get a bus to Bethlehem. The bus company is: Trans Bridge Lines 2012 Industrial Drive Bethlehem 610-868-6001 800-962-9135 The Industrial Drive terminal is the main bus terminal, and taxis are available to the Lehigh campus. Lehigh Valley Taxi: 610-867-6000 Quick Service Taxi: 610-434-8132 Airport Taxi Service: 610-231-2000 There is also a South Bethlehem terminal that is within walking distance of Comfort Suites and of campus (though it would a little long with luggage), but fewer buses make that stop. Philadelphia airport is closer than Newark airport, but getting to Bethlehem from there is harder than from Newark. You get to the Philadelphia bus station (probably by taxi, though there is a train to downtown Philadelphia), and then take a bus (Bieber Tours) to Bethlehem: it's roughly the equivalent in complication of coming through Kennedy airport -- Robert L. Campbell Professor, Psychology Brackett Hall 410A Clemson University Clemson, SC 29634-1355 USA phone (864) 656-4986 fax (864) 656-0358 http://hubcap.clemson.edu/~campber/index.html From frey at dendrite.uwaterloo.ca Tue Feb 27 13:11:39 2001 From: frey at dendrite.uwaterloo.ca (frey@dendrite.uwaterloo.ca) Date: Tue, 27 Feb 2001 13:11:39 -0500 Subject: postdoc advertisement Message-ID: <200102271811.NAA31868@dendrite.uwaterloo.ca> CALL FOR APPLICATIONS POSTDOCTORAL RESEARCH SCIENTIST Probabilistic Inference - Machine Learning - Decision Making Statistical Computing - Bayesian Theory & Applications Computer Vision - Coding - Speech Recognition - Bioinformatics Our group at the University of Toronto would like to hire one or more postdoctoral research scientists. The successful applicant(s) will work on theoretical and applied research in areas such as those listed above. 
Faculty members in our group and their interests are as follows: Craig Boutilier http://www.cs.toronto.edu/~cebly Markov decision processes, reinforcement learning. Probabilistic inference. Economic models of agency, combinatorial auctions. Preference elicitation, interactive optimization under uncertainty. Brendan Frey http://www.cs.toronto.edu/~frey Graphical models, machine learning, variational techniques, loopy belief propagation. Computer vision. Speech recognition. Iterative error-correcting decoding. SAR and MRI imaging. Bioinformatics. Radford Neal http://www.cs.toronto.edu/~radford/ Bayesian modeling with neural networks, Gaussian processes, and mixtures. Markov chain Monte Carlo methods. Low density parity check codes. Empirical assessment of learning methods. Jeffrey Rosenthal http://markov.utstat.toronto.edu/jeff/ Probability theory and stochastic processes. Markov chain Monte Carlo theory and methods. Convergence rates of Markov chains. Randomized algorithms. Random walks on groups. Rich Zemel http://www.cs.toronto.edu/~zemel Unsupervised learning, boosting. Perceptual learning, representations of visual motion, multisensory integration. Neural coding, probabilistic models of neural representations. The group currently consists of the above faculty members, 2 postdoctoral researchers and 25 graduate students and has joint projects with Microsoft Research, Xerox PARC, the University of Illinois at Urbana-Champaign, the University of British Columbia, the University of Waterloo, and Simon Fraser University. Applicants should * have a solid background in one or more of the areas described above * have good scientific skills * be good at writing software to implement and evaluate algorithms Successful applicants who wish to do so will have the opportunity to apply to do sessional teaching in the departments of Computer Science, Statistics, or Electrical and Computer Engineering. Applicants should EMAIL a CV, the email addresses of 3 references, and a short description of their research interests and goals as a postdoc (ascii format, < 500 words) to Brendan Frey at frey at cs.toronto.edu. From xwu at gauss.Mines.EDU Tue Feb 27 12:35:44 2001 From: xwu at gauss.Mines.EDU (Xindong Wu) Date: Tue, 27 Feb 2001 10:35:44 -0700 (MST) Subject: Knowledge and Information Systems: 3(1) and 3(2), 2001 Message-ID: <200102271735.KAA05770@gauss.Mines.EDU> Knowledge and Information Systems: An International Journal ----------------------------------------------------------- ISSN: 0219-1377 (printed version) ISSN: 0219-3116 (electronic version) by Springer-Verlag Home Page: http://kais.mines.edu/~kais/home.html ================================================ I. Volume 3, Number 1 (February 2001) ------------------------------------- Regular Papers - Parallel Data Mining for Association Rules on Shared-Memory Systems by Srinivasan Parthasarathy, Mohammed J. Zaki, Mitsunori Ogihara, and Wei Li URL link.springer.de/link/service/journals/10115/bibs/1003001/10030001.htm or link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030001.htm - On Similarity Measures for Multimedia Database Applications by K. 
Selcuk Candan and Wen-Syan Li URL link.springer.de/link/service/journals/10115/bibs/1003001/10030030.htm or link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030030.htm - Representing and Reasoning on Database Conceptual Schemas by Mohand-Said Hacid, Jean-Marc Petit and Farouk Toumani URL link.springer.de/link/service/journals/10115/bibs/1003001/10030052.htm or link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030052.htm - Fuzzy User Modeling for Information Retrieval on the World Wide Web by Robert I. John and Gabrielle J. Mooney URL link.springer.de/link/service/journals/10115/bibs/1003001/10030081.htm or link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030081.htm - An Artificial Network Simulating Cause-to-Effect Reasoning: Cancellation Interactions and Numerical Studies by L. Ben Romdhane, B. Ayeb, and S. Wang URL link.springer.de/link/service/journals/10115/bibs/1003001/10030096.htm or link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030096.htm Short Papers - Zipf's Law for Web Surfers by Mark Levene, Jose Borges and George Loizou URL link.springer.de/link/service/journals/10115/bibs/1003001/10030120.htm or link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030120.htm II. Volume 3, Number 2 (May 2001) --------------------------------- Regular Papers - Making Use of the Most Expressive Jumping Emerging Patterns for Classification by Jinyan Li, Guozhu Dong, and Kotagiri Ramamohanarao - A Description Length Based Decision Criterion for Default Knowledge in the Ripple Down Rules Method by Takuya Wada, Tadashi Horiuchi, Hiroshi Motoda and Takashi Washio - Multipass Algorithms for Mining Association Rules in Text Databases by J.D. Holt and S.M. Chung - C-Net: A Method for Generating Non-Deterministic and Dynamic Multi-Variate Decision Trees by H.A. Abbass, M. Towsey, and G. Finn - A Hybrid Fragmentation Approach for Distributed Deductive Database Systems by Seung-Jin Lim and Yiu-Kai Ng - ActiveCBR: An Agent System that Integrates Case-based Reasoning and Active Databases by Sheng Li and Qiang Yang Short Papers - XML Indexing and Retrieval with a Hybrid Storage Model by Dongwook Shin From se37 at cornell.edu Tue Feb 27 17:25:17 2001 From: se37 at cornell.edu (Shimon Edelman) Date: Tue, 27 Feb 2001 17:25:17 -0500 Subject: a cross-disciplinary symposium on communication Message-ID: ---------------------------------------------------------------------- WHAT: "From Signals to Structured Communication" WHEN: May 4 and 5 WHERE: Ithaca, NY WEB: http://kybele.psych.cornell.edu/~edelman/CogStud/May2001.html The list of speakers at this Spring Symposium of the Cornell Cognitive Studies Program includes representatives from neurobiology, ethology, philosophy, linguistics, economics, game theory, and computer science. The emergence of a diverse, unconventional and exciting set of perspectives on communication at this event will be facilitated by focusing the presentations on certain common threads - notably, issues having to do with structure, compositional or other - that run through the entire spectrum of research themes represented at the symposium. ---------------------------------------------------------------------- Shimon Edelman Professor, Dept. 
of Psychology, 232 Uris Hall Cornell University, Ithaca, NY 14853-7601, USA Web: http://kybele.psych.cornell.edu/~edelman From giacomo at ini.phys.ethz.ch Tue Feb 27 03:58:12 2001 From: giacomo at ini.phys.ethz.ch (Giacomo Indiveri) Date: Tue, 27 Feb 2001 09:58:12 +0100 Subject: NEUROMORPHIC ENGINEERING WORKSHOP (second call) Message-ID: <3A9B6C23.7A1A7D89@ini.phys.ethz.ch> Please accept our apology for cross-postings. This is the second call for the Telluride Workshop application announcement (also at http://www.ini.unizh.ch/telluride2000/tell2001_announcement.html) ---------------------------------------------------------------------- NEUROMORPHIC ENGINEERING WORKSHOP Sunday, JULY 1 - Saturday, JULY 21, 2001 TELLURIDE, COLORADO ------------------------------------------------------------------------ Avis COHEN (University of Maryland) Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Timmer HORIUCHI (University of Maryland) Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland) Christof KOCH (California Institute of Technology) Terrence SEJNOWSKI (Salk Institute and UCSD) Shihab SHAMMA (University of Maryland) ------------------------------------------------------------------------ We invite applications for a three week summer workshop that will be held in Telluride, Colorado from Sunday, July 1 to Sunday, July 21, 2001. The application deadline is Friday, March 7, and application instructions are described at the bottom of this document. The 2000 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, Whitaker Foundation, the Office of Naval Research, and by the Center for Neuromorphic Systems Engineering at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available at http://www.ini.unizh.ch/telluride2000. We strongly encourage interested parties to browse through the previous workshop web pages: http://www.ini.unizh.ch/telluride2000/ GOALS: Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on active participation, with demonstration systems and hands-on-experience for all participants. Neuromorphic engineering has a wide range of applications from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware (with an emphasis on analog and asynchronous digital VLSI), are inspired by biological systems. However, existing applications are modest and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead. 
The assumption underlying this three week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems. FORMAT: The three week summer workshop will include background lectures on systems neuroscience (in particular learning, oculo-motor and other motor systems and attention), practical tutorials on analog VLSI design, small mobile robots (Koalas and Kheperas), hands-on projects, and special interest groups. Participants are required to take part and possibly complete at least one of the projects proposed (soon to be defined). They are furthermore encouraged to become involved in as many of the other activities proposed as interest and time allow. There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials, rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons, and after dinner. The analog VLSI practical tutorials will cover all aspects of analog VLSI design, simulation, layout, and testing during the three weeks of the workshop. The first week covers basics of transistors, simple circuit design and simulation. This material is intended for participants who have no experience with analog VLSI. The second week will focus on design frames for silicon retinas, from the silicon compilation and layout of on-chip video scanners, to building the peripheral boards necessary for interfacing analog VLSI retinas to video output monitors. Retina chips will be provided. The third week will feature sessions on floating gates, including lectures on the physics of tunneling and injection, and on inter-chip communication systems. We will also feature a tutorial on the use of small, mobile robots, focusing on Koala's, as an ideal platform for vision, auditory and sensory-motor circuits. Projects that are carried out during the workshop will be centered in a number of working groups, including: * active vision * audition * olfaction * motor control * central pattern generator * robotics, multichip communication * analog VLSI * learning The active perception project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include a robot head active vision system consisting of a three degree-of-freedom binocular camera system that is fully programmable. The central pattern generator group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPG's and theories of nonlinear oscillators for locomotion. It will also explore the use of simple analog VLSI sensors for autonomous robots. The robotics group will use rovers and working digital vision boards as well as other possible sensors to investigate issues of sensorimotor integration, navigation and learning. 
The audition group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform. The multichip communication project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed. LOCATION AND ARRANGEMENTS: The workshop will take place in the small town of Telluride, 9000 feet high in Southwest Colorado, about 6 hours drive away from Denver (350 miles). America West and United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums. The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software). Internet access will be provided. Technical staff present throughout the workshops will assist with software and hardware issues. We will have a network of workstations running UNIX and PCs running LINUX and Microsoft Windows. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. Unless otherwise arranged with one of the organizers, we expect participants to stay for the entire duration of this three week workshop. FINANCIAL ARRANGEMENT: Notification of acceptances will be mailed out around March 9, 2001. Participants are expected to pay a $275.00 workshop fee at that time in order to reserve a place in the workshop. The cost of a shared condominium will be covered for all academic participants but upgrades to a private room will cost extra. Participants from National Laboratories and Industry are expected to pay for these condominiums. Travel reimbursement of up to $500 for US domestic travel and up to $800 for overseas travel will be possible if financial help is needed (Please specify on the application). HOW TO APPLY: Applicants should be at the level of graduate students or above (i.e. postdoctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply. Application should include: * First name, Last name, valid email address. * Curriculum Vitae. * One page summary of background and interests relevant to the workshop. * Description of special equipment needed for demonstrations that could be brought to the workshop. 
* Two letters of recommendation Complete applications should be sent to: Terrence Sejnowski The Salk Institute 10010 North Torrey Pines Road San Diego, CA 92037 email: telluride at salk.edu FAX: (858) 587 0417 DEADLINE: March 7, 2001 Applicants will be notified by email around March 21, 2001 From vercher at laps.univ-mrs.fr Tue Feb 27 06:28:15 2001 From: vercher at laps.univ-mrs.fr (Jean-Louis Vercher) Date: Tue, 27 Feb 2001 12:28:15 +0100 Subject: Conference announcement (Marseille 2001) Message-ID: SORRY FOR MULTIPLE POSTINGS --------------------------- CONFERENCE ANNOUNCEMENT 3rd INTERNATIONAL CONFERENCE ON SENSORIMOTOR CONTROLS IN MEN AND MACHINES We are pleased to announce The IIIrd Conference on Sensorimotor Systems in Men and Machines, which will take place on Friday 5 - Saturday 6 October 2001 at the Palais du Pharo, Marseille, France. The first conference was held in Berkeley in 1994 (Starkfest), the second in Chicago in 1997. To see details, including the list of speakers, and to obtain your registration form, visit the conference web site at http://www.laps.univ-mrs.fr/umr/workshop/index.html or contact the Conference secretary: Nathalie FENOUIL mailto:workshop at laps.univ-mrs.fr +33 [0] 491 17 22 50 Note that the deadline for abstract submission is April 31. Confirmation of acceptance will be forwarded one month after the deadline. From rosi-ci0 at wpmail.paisley.ac.uk Tue Feb 27 11:15:56 2001 From: rosi-ci0 at wpmail.paisley.ac.uk (Roman Rosipal) Date: Tue, 27 Feb 2001 16:15:56 +0000 Subject: New TR Message-ID: Dear Connectionists, The following TR is now available at my home page: Kernel Partial Least Squares Regression in RKHS Roman Rosipal and Leonard J Trejo Abstract A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the Kernel Partial Least Squares (PLS) regression model. Similar to Principal Components Regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent variables (components). However, in contrast to PCR, PLS creates the components by modeling the relationship between input and output variables while maintaining most of the input variables' information. PLS is considered to be useful in situations where the number of explanatory variables exceeds the number of observations and/or a high level of multicollinearity among those variables is assumed. Motivated by this fact we will provide a Kernel PLS algorithm for the construction of non-linear regression models in possibly high-dimensional feature spaces. We give the theoretical description of the Kernel PLS algorithm and we experimentally compare the algorithm with the existing Kernel PCR and Kernel Ridge Regression techniques. We will demonstrate that on the data sets employed Kernel PLS achieves the same results but in comparison to Kernel PCR uses significantly smaller, qualitatively different components. __________________ You can download gzipped postscript from http://cis.paisley.ac.uk/rosi-ci0/Papers/TR01_1.ps.gz Any comments and remarks are very welcome. _______________ Roman Rosipal University of Paisley, CIS Department, Paisley, PA1 2BE Scotland, UK http://cis.paisley.ac.uk/rosi-ci0 e-mail: rosi-ci0 at paisley.ac.uk Legal disclaimer -------------------------- The information transmitted is the property of the University of Paisley and is intended only for the person or entity to which it is addressed and may contain confidential and/or privileged material.
Statements and opinions expressed in this e-mail may not represent those of the company. Any review, retransmission, dissemination and other use of, or taking of any action in reliance upon, this information by persons or entities other than the intended recipient is prohibited. If you received this in error, please contact the sender immediately and delete the material from any computer. -------------------------- From horwitzb at mail.nih.gov Wed Feb 28 13:50:19 2001 From: horwitzb at mail.nih.gov (Horwitz, Barry (NIDCD)) Date: Wed, 28 Feb 2001 13:50:19 -0500 Subject: postdoctoral position available Message-ID: <45120BC2AC24D4119B7100508B9506D48F6A97@nihexchange5.nih.gov> National Institute on Deafness and Other Communication Disorders National Institutes of Health Postdoctoral Fellowship in Neural Modeling of Human Functional Neuroimaging Data A two-year postdoctoral fellowship under the supervision of Dr. Barry Horwitz is available immediately for developing and applying computational neuroscience modeling methods to in vivo human functional neuroimaging data, obtained from functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). PET and fMRI facilities are available to implement and test hypotheses derived from the modeling. Two areas of research are possible, depending on the background and interests of the fellow: (1) extending the method of structural equation modeling (path analysis) to deal with fMRI designs and to improve the use of this method for network analysis; (2) extending a large-scale neurobiologically realistic computational model with the goal of understanding the relation between functional neuroimaging data and the underlying electrophysiological behavior of multiple interconnected neuronal populations. See Trends in Cognitive Science 3: 91-98, 1999 for examples of these types of research. Knowledge of neural modeling or statistical techniques, and programming experience are required. PhD or MD degree required. The position is in the Language Section, Voice, Speech and Language Branch, NIDCD, NIH, Bethesda, MD USA. For further information, contact: Dr. Barry Horwitz, Bldg. 10, Rm. 6C420, National Institutes of Health, Bethesda, MD 20892, USA. Tel. 301-594-7755; FAX: 301-480-5625; Email: horwitz at helix.nih.gov. ---------------------------------------------------------------------------- ------ Barry Horwitz, Ph.D. Senior Investigator Language Section Voice, Speech and Language Branch National Institute on Deafness and other Communication Disorders National Institutes of Health Bldg. 10, Rm. 6C420 MSC 1591 Bethesda, MD 20892 USA Tel. 301-594-7755 FAX 301-480-5625 horwitz at helix.nih.gov http://www.nidcd.nih.gov/intram/scientists/horwitzb.htm From kap-listman at wkap.nl Wed Feb 28 20:25:32 2001 From: kap-listman at wkap.nl (kap-listman@wkap.nl) Date: Thu, 01 Mar 2001 02:25:32 +0100 (MET) Subject: New Issue: Neural Processing Letters. Vol. 13, Issue 1 Message-ID: <200103010125.CAA08052@wkap.nl> Kluwer ALERT, the free notification service from Kluwer Academic/PLENUM Publishers and Kluwer Law International ------------------------------------------------------------ Neural Processing Letters ISSN 1370-4621 http://www.wkap.nl/issuetoc.htm/1370-4621+13+1+2001 Vol. 13, Issue 1, February 2001. 
TITLE: The Neural Solids: For optimization problems AUTHOR(S): Giansalvo Cirrincione, Maurizio Cirrincione KEYWORD(S): constrained optimization, essential matrix decomposition, Hopfield networks, matrix decomposition, polar decomposition, self organization, structure from motion. PAGE(S): 1-15 TITLE: Efficient Vector Quantization Using the WTA-Rule with Activity Equalization AUTHOR(S): Gunther Heidemann, Helge Ritter KEYWORD(S): clustering, codebook generation, competitive learning, neural gas, unsupervised learning, vector quantization, winner takes all. PAGE(S): 17-30 TITLE: Probability Density Estimation Using Adaptive Activation Function Neurons AUTHOR(S): Simone Fiori, Paolo Bucciarelli KEYWORD(S): adaptive activation function neurons, cumulative distribution function, differential entropy, probability density function, stochastic gradient. PAGE(S): 31-42 TITLE: A Simple Neural Network Pruning Algorithm with Application to Filter Synthesis AUTHOR(S): Kenji Suzuki, Isao Horiba, Noboru Sugie KEYWORD(S): generalization ability, image enhancement, neural filter, optimal structure, redundancy removal, right function, signal processing. PAGE(S): 43-53 TITLE: An Annealed Chaotic Competitive Learning Network with Nonlinear Self-feedback and Its Application in Edge Detection AUTHOR(S): Jzau-Sheng Lin, Ching-Tsorng Tsai, Jiann-Shu Lee KEYWORD(S): annealed chaotic competitive learning network, chaotic dynamic, edge detection, competitive learning network, simulated annealing. PAGE(S): 55-69 TITLE: Storage Capacity of the Exponential Correlation Associative Memory AUTHOR(S): Richard C. Wilson, Edwin R. Hancock KEYWORD(S): exponential correlation associative memory, storage capacity. PAGE(S): 71-80 TITLE: Learning Algorithm and Retrieval Process for the Multiple Classes Random Neural Network Model AUTHOR(S): Jose Aguilar KEYWORD(S): color pattern recognition, learning algorithm, multiple classes random neural network, retrieval process. PAGE(S): 81-91 -------------------------------------------------------------- Thank you for your interest in Kluwer's books and journals. NORTH, CENTRAL AND SOUTH AMERICA Kluwer Academic Publishers Order Department, PO Box 358 Accord Station, Hingham, MA 02018-0358 USA Telephone (781) 871-6600 Fax (781) 681-9045 E-Mail: kluwer at wkap.com EUROPE, ASIA AND AFRICA Kluwer Academic Publishers Distribution Center PO Box 322 3300 AH Dordrecht The Netherlands Telephone 31-78-6392392 Fax 31-78-6546474 E-Mail: orderdept at wkap.nl
From takane at takane2.psych.mcgill.ca Fri Feb 2 09:21:31 2001 From: takane at takane2.psych.mcgill.ca (Yoshio Takane) Date: Fri, 2 Feb 2001 09:21:31 -0500 (EST) Subject: No subject Message-ID: <200102021421.JAA07987@takane2.psych.mcgill.ca> Dear Colleagues, Thanks to you all! I have received overwhelming responses to my "call for papers" for my invited session at the IMPS2001, circulated a few weeks ago. The following are the tentative speakers: names, titles of the talks, affiliations and email addresses. See you all there. You can get detailed information about the meeting at http://www.ir.rikkyo.ac.jp Regards, Yoshio Takane Professor ******************************************************************** 1. Thomas R. Shultz & Francois Rivest (McGill University). Knowledge-based cascade-correlation: Size variation of relevant prior knowledge. (Email: shultz at psych.mcgill.ca, frives at po-box.mcgill.ca) 2. Shotaro Akaho (National Institute of Advanced Industrial Science and Technology, formerly ETL). A kernel method for canonical correlation analysis. (Email: akaho at etl.go.jp) 3. Hideki Asoh (National Institute of Advanced Industrial Science and Technology, formerly ETL). An approximation of nonlinear canonical correlation analysis using neural networks. (Email: asoh at etl.go.jp) 4. Yoshio Takane & Yuriko Oshima-Takane (McGill University). Nonlinear generalized canonical correlation analysis by neural network models. (Email: takane at takane2.psych.mcgill.ca, yuoshima at ed.tokyo-fukushi.ac.jp) 5. Daniel L. Silver (Acadia University). The task rehearsal method of sequential learning. (Email: Danny.Silver at AcadiaU.ca) 6. Shogo Makioka (Osaka Women's University). A connectionist model of phonological working memory. (Email: makioka at center.osaka-wu.ac.jp) 7. Ryotaro Kamimura (Tokai University). Cooperative information control for self-organization maps. (Email: ryo at ego.mcgill.ca) 8. Emilia Barakova (GMD-Japan Research Laboratory). Temporal integration of self-organized sensor streams for robust spatial modeling. (Email: emilia.barakova at emilia-g3.gmd.gr.jp) 9. Allan Kardec Barros, Andrzej Cichocki & Noboru Ohnishi (Riken and Universidade Federal do Maranhao). Extraction of sources using a priori information about their temporal structure. (Email: allan at biomedica.org, cia at brain.riken.go.jp) 10. Keith Worsley (McGill University). Brain mapping data: classification, principal components, and multidimensional scaling. (Email: keith at scylla.math.mcgill.ca) From mpessk at guppy.mpe.nus.edu.sg Fri Feb 2 23:09:26 2001 From: mpessk at guppy.mpe.nus.edu.sg (S. Sathiya Keerthi) Date: Sat, 3 Feb 2001 12:09:26 +0800 (SGT) Subject: TR + Code for automatic C, sigma tuning in SVMs Message-ID: Efficient Tuning of SVM Hyperparameters Using Radius/Margin Bound and Iterative Algorithms S. Keerthi National University of Singapore mpessk at guppy.mpe.nus.edu.sg Motivated by a recent SVM paper by Chapelle, Vapnik, Bousquet and Mukherjee, I have been working on a code for the automatic tuning of the hyperparameters of a Support Vector Machine with L2 soft margin, for which the radius/margin bound is taken as the index to be minimized and iterative techniques are employed for computing the radius and margin. The implementation is found to be feasible and efficient even for large problems having more than 10,000 support vectors.
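For readers who have not seen it stated, the radius/margin bound referred to above has the following standard form in the SVM literature (this is a paraphrase of the usual leave-one-out argument, e.g. Vapnik's, rather than a quotation from the report): up to a constant factor, the leave-one-out error of a hard-margin SVM trained on \ell examples is bounded by

  \frac{1}{\ell} \, \frac{R^2}{\gamma^2} \;=\; \frac{1}{\ell} \, R^2 \, \|w\|^2 ,

where R is the radius of the smallest sphere enclosing the training points in feature space and \gamma = 1/\|w\| is the margin. Tuning the hyperparameters then amounts to minimizing T(C, \sigma) = R^2 \|w\|^2, which is why iterative procedures for computing R and \|w\| are needed; for the L2 soft-margin machine the same form applies after the standard reformulation that adds 1/C to the diagonal of the kernel matrix.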
A report discussing the implementation issues, together with a code running on Matlab interface, can be downloaded from the following page: http://guppy.mpe.nus.edu.sg/~mpessk/nparm.shtml Any comments on the tech report as well as the performance of the code on new datasets will be very much appreciated. From fuzzi at ee.usyd.edu.au Sat Feb 3 22:12:18 2001 From: fuzzi at ee.usyd.edu.au (fuzz-ieee01) Date: Sun, 4 Feb 2001 14:12:18 +1100 (EST) Subject: Final CFP: FUZZ-IEEE 2001 Message-ID: <200102040312.OAA00053@cassius.ee.usyd.edu.au> *************************************************************************** Call for Papers The 10th IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2001) December 2-5, 2001, The University of Melbourne, Melbourne, Australia Sponsored by IEEE Neural Networks Council "Meeting the grand challenge: machines that serve people" Website: http://www.csse.melbourne.edu/FUZZ-IEEE2001 Students: Partial travel scholarships are available from the IEEE Neural Network Council. Please check http://www.arc.unm.edu/~karen/IEEE_NNC/Student_Travel_Grants/ Important dates: Paper submission Friday, 2 March 2001 Notification of acceptance Friday, 1 June 2001 Final manuscripts Friday, 3 August 2001 General enquiry please contact the secretariat of FUZZ-IEEE 2001: fuzz-ieee2001 at csse.melbourne.edu *************************************************************************** The FUZZ-IEEE 2001 conference will be held in Melbourne, Australia, one of the most beautiful and exciting cities in the Southern Hemisphere. Melbourne is also one of the safest, healthiest, and cleanest cities in the world, and is Australia's pre-eminent centre for arts and culture, education, fine food and dining and exciting shopping experiences. The conference will cover a broad range of research topics related to fuzzy logic and soft computing, including but not limited to: T1: Pattern recognition and image processing: supervised and unsupervised learning, classifier design and integration, signal/image processing and analysis, computer vision, multimedia applications. T2: Electronic and robotic systems: fuzzy logic in robotics, automation, and other industrial applications, fuzzy hardware design and implementation. T3: Soft computing and hybrid systems: intelligent information systems, database systems, data mining, intelligent agents, neuro-fuzzy systems, Internet computing. T4: Control systems: fuzzy control theory and applications. T5: Mathematics: foundations of fuzzy logic, approximate reasoning, evolutionary computation. Authors are invited to submit 6 copies of full papers for review. Papers should be written in English, not exceeding 7 single-sided pages on A4 or letter-size paper, in one-column format with 1-inch margin on all four sides, in Times or a similar font of 10 points or larger. Faxed or e-mailed papers will not be accepted. The first page of each paper must include the following information: - title of the paper, - names(s) and affiliation(s) of the author(s), - abstract of the paper, - maximum 5 keywords, - technical area of the paper (T1, T2, T3, T4 or T5, choose one only), - name, postal address, phone and fax numbers and e-mail address of the contact author. 
Please send papers to: Secretariat of FUZZ-IEEE 2001 Conference Management The University of Melbourne Victoria 3010, Australia E-mail: fuzz-ieee2001 at csse.melbourne.edu *************************************************************************** CONFERENCE ORGAZATION COMMITTEES Honorary Co-Chairs Jim Bezdek, University of West Florida, USA Enrique H. Ruspini, SRI International, USA Lotfi A. Zadeh, University of California, Berkeley, USA General Chair Zhi-Qiang Liu, University of Melbourne, Australia Program Chair Hong Yan, University of Sydney, Australia Special Sessions Chair Sadaaki Miyamoto, University of Tsukuba, Japan Workshop & Tutorials Chair Qiang Shen, University of Edinburgh, UK International Steering Committee Chair Valerie V. Cross, University of North Carolina, USA Publicity Chairs Alen Blair, University of Melbourne, Australia Publications Chair Ed Kazmierczak, University of Melbourne, Australia Local Arrangements Chair Nick M. Barnes, University of Melbourne, Australia PROGRAM AREA CHAIRS Pattern Recognition and Image Processing Jim Keller, University of Missouri, USA Electronic & Robotic Systems Toshio Fukuda, Nagoya University, Japan Control Systems Janusz Kacprzyk, Polish Academy of Science, Poland Soft Computing and Hybrid Systems Nikola Kasabov, University of Otago, New Zealand Mathematics Arthur Ramer, University of New South Wales, Australia *************************************************************************** From steve at cns.bu.edu Sat Feb 3 14:06:57 2001 From: steve at cns.bu.edu (Stephen Grossberg) Date: Sat, 3 Feb 2001 11:06:57 -0800 Subject: Context-Sensitive Binding by the Laminar Circuits of V1 and V2 Message-ID: The following article is available at http://www.cns.bu.edu/Profiles/Grossberg in PDF format: Raizada, R.D.S. and Grossberg, S. (2001). Context-Sensitive Binding by the Laminar Circuits of V1 and V2: A Unified Model of Perceptual Grouping, Attention, and Orientation Contrast. Visual Cognition, in press. Preliminary version available as Technical Report CAS/CNS TR-2000-008 ABSTRACT: A detailed neural model is presented of how the laminar circuits of visual cortical areas V1 and V2 implement context-sensitive binding processes such as perceptual grouping and attention. The model proposes how specific laminar circuits allow the responses of visual cortical neurons to be determined not only by the stimuli within their classical receptive fields, but also to be strongly influenced by stimuli in the extra-classical surround. This context-sensitive visual processing can greatly enhance the analysis of visual scenes, especially those containing targets that are low contrast, partially occluded, or crowded by distractors. We show how interactions of feedforward, feedback and horizontal circuitry can implement several types of contextual processing simultaneously, using shared laminar circuits. In particular, we present computer simulations which suggest how top-down attention and preattentive perceptual grouping, two processes that are fundamental for visual binding, can interact, with attentional enhancement selectively propagating along groupings of both real and illusory contours, thereby showing how attention can selectively enhance object representations. These simulations also illustrate how attention may have a stronger facilitatory effect on low contrast than on high contrast stimuli, and how pop-out from orientation contrast may occur. 
The specific functional roles which the model proposes for the cortical layers allow several testable neurophysiological predictions to be made. The results presented here simulate only the boundary grouping system of adult cortical architecture. However, we also discuss how this model contributes to a larger neural theory of vision which suggests how intracortical and intercortical feedback help to stabilize development and learning within these cortical circuits. Although feedback plays a key role, fast feedforward processing is possible in response to unambiguous information. Model circuits are capable of synchronizing quickly, but context-sensitive persistence of previous events can influence how synchrony develops. Although these results focus on how the interblob cortical processing stream controls boundary grouping and attention, related modeling of the blob cortical processing stream suggests how visible surfaces are formed, and modeling of the motion stream suggests how transient responses to scenic changes can control long-range apparent motion and also attract spatial attention. From oreilly at grey.colorado.edu Mon Feb 5 01:20:31 2001 From: oreilly at grey.colorado.edu (Randall C. O'Reilly) Date: Sun, 4 Feb 2001 23:20:31 -0700 Subject: Changes at Cognitive Science Message-ID: <200102050620.XAA21864@grey.colorado.edu> ANNOUNCEMENT OF RECENT CHANGES AT COGNITIVE SCIENCE: The Cognitive Science journal has a new editorial board and set of policies. One of the more important policy changes is that we now officially publish four categories of articles: Letters to the editor, brief reports, regular articles, and extended articles. To find out more details on the nature of these categories and the journal more generally, visit the cognitive science society's web page at: http://www.cognitivesciencesociety.org/about.html The journal is also now available online for all members of the cognitive science society at: http://www.elsevier.nl/gej-ng/10/15/15/show/ Editor: Robert L. Goldstone, Indiana University Associate Editors: John R. Anderson, Carnegie Mellon University Nick Chater, University of Warwick Andy Clark, University of Sussex Shimon Edelman, Cornell University Kenneth Forbus, Northwestern University Dedre Gentner, Northwestern University Raymond W. Gibbs, Jr., University of California, Santa Cruz James Greeno, Stanford University Robert A. Jacobs, University of Rochester Randall C. O'Reilly, University of Colorado Boulder Colleen M. Seifert, University of Michigan Daniel Sperber, CNRS Paris From gustavo.deco at mchp.siemens.de Mon Feb 5 05:39:33 2001 From: gustavo.deco at mchp.siemens.de (Gustavo Deco) Date: Mon, 5 Feb 2001 11:39:33 +0100 (MET) Subject: New book Information Dynamics Message-ID: <200102051039.f15AdXS28209@loire.mchp.siemens.de> Dear Connectionists It is my pleasure to announce the publication of a NEW BOOK: INFORMATION DYNAMICS:Foundations and Applications by Gustavo DECO and Bernd SCHUERMANN Editorial: SPRINGER-VERLAG Approx. 304 pp. 89 figs., ISBN 0-387-95047-8 Information Dynamics presents an interdisciplinary treatment of nonlinear dynamical systems with an emphasis on complex systems which transmit information in time and space. The book offers a new theoretical approach to dynamical systems from an information theory perspective. 
The goal of the book is to provide a detailed and unified study of the flow of information in a quantitative manner, utilizing methods and techniques from information theory, time series analysis, nonlinear dynamics and neural networks. The authors use analysis of test-bed simulations, empirical data, and real-world applications to give concrete perspectives and reinforcement for the key conceptual ideas and methods. The formulation provides a unique and consistent conceptual framework for the problem of discovering knowledge behind empirical data. Contents: Dynamical Systems Overview.- Statistical Structure Extraction in Dynamical Systems: Parametric Formulation.- Applications: Parametric Characterization of Time Series.- Statistical Structure Extraction in Dynamical Systems: Nonparametric Formulation.- Applications: Nonparametric Characterization of Time Series.- Statistical Structure Extraction in Dynamical Systems: Semiparametric Formulation.- Applications: Semiparametric Characterization of Time Series.- Information Processing and Coding in Spatio-Temporal Dynamical Systems.- Applications: Information Processing and Coding in Spatio-Temporal Dynamical Systems.- Appendixes A,B,C,D.- References. Info Processing Theory, Nonlinear Dynamics, Time Series Modeling, Neural Networks For graduates, researchers, professionals ---------------------------- PD Dr.Dr.habil. Gustavo Deco Siemens Corporate Research Computational Neuroscience Otto-Hahn-Ring 6 D-81739 Munich Germany Tel. +49 89 636 47373 Fax. +49 89 636 49767 From m.usher at psychology.bbk.ac.uk Mon Feb 5 09:48:47 2001 From: m.usher at psychology.bbk.ac.uk (m.usher@psychology.bbk.ac.uk) Date: Mon, 5 Feb 2001 14:48:47 GMT Subject: article on stochastic resonance in human cognition Message-ID: <200102051448.OAA01475@acer.ccs.bbk.ac.uk> The following article addressing the phenomenon of stochastic resonance (SR), and which was recently published as a Letter to the Editors in Biol. Cybern. 83 (2000) 6, L011-L016, can now be accessed from the following website: www.psyc.bbk.ac.uk/staff/mu/homepage/noise/SR.pdf Stochastic resonance in the speed of memory retrieval Marius Usher and Mario Feingold Birkbeck College Ben Gurion University University of London Beer Sheva, Israel Email: M.Usher at bbk.ac.uk Email: Mario at bgumail.bgu.ac.il Abstract. The stochastic resonance (SR) phenomenon in human cognition (memory retrieval speed for arithmetical multiplication rules) is addressed in a behavioral and neurocomputational study. The results of an experiment in which performance was monitored for various magnitudes of acoustic noise are presented. The average response time was found to be minimal for some optimal noise level. Moreover, it was shown that the optimal noise level and the magnitude of the SR effect depend on the difficulty of the task. A computational framework based on leaky accumulators that integrate noisy information and provide the output upon reaching a threshold criterion is used to explain the observed phenomena. From a.hussain at cs.stir.ac.uk Tue Feb 6 12:23:04 2001 From: a.hussain at cs.stir.ac.uk (Dr. 
Amir Hussain) Date: Tue, 6 Feb 2001 17:23:04 -0000 Subject: CIS Journal Special Issue: Call for Papers Message-ID: <003701c09061$77e99c80$5299fc3e@amir> Dear connectionists, Please find the Call for Papers for the Control & Intelligent Systems (CIS) Journal Special Issue on "Non-linear Speech Processing Techniques & Applications", at the link below: http://www.actapress.com/journals/specialci.htm Looking forward to hearing from interested connectionists soon! Sincerely - Dr.Amir Hussain, Lecturer Guest Editor, Control & Intelligent Systems Journal Special Issue Department of Computing Science & Mathematics University of Stirling, Stirling FK9 4LA SCOTLAND, UK Tel / Fax: (++44) 01786 - 476437 / 464551 Email: a.hussain at cs.stir.ac.uk http://www.cs.stir.ac.uk/~ahu/ From jbower at bbb.caltech.edu Tue Feb 6 12:32:00 2001 From: jbower at bbb.caltech.edu (James M. Bower) Date: Tue, 6 Feb 2001 09:32:00 -0800 Subject: No subject Message-ID: New deadline for submitting papers to CNS*01 Due to circumstances beyond our control, the computer systems serving the Computational Neuroscience Meetings have not been available for the last two weeks. They are now reinstated, however, and we have established a new deadline of Feb. 12 for paper submissions to the meeting. CNS*01 is the tenth annual meeting of an interdisciplinary conference addressing a broad range of research approaches and issues involved in the field of computational neuroscience. These meetings bring together experimental and theoretical neurobiologists along with engineers, computer scientists, cognitive scientists, physicists, and mathematicians interested in the functioning of biological nervous systems. The meeting in 2001 will take place on the Central West Coast of the United States. The meeting will officially convene in San Francisco on the evening of June 30th with an opening reception. Then on the morning of Sunday July 1st, meeting participants will board busses for the Monterey Peninsula where the formal scientific sessions will start at 5:00 p.m. at the Asilomar Conference Grounds in Pacific Grove. The last meeting event will be the traditional banquet on Thursday evening, July 5th at the Monterey Bay Aquarium . Housing accommodations will be available for the evening of June 30th at the Ramada Plaza Hotel in the Market Street area of San Francisco and at the Asilomar Conference Grounds for the duration of the meeting. Travel information as well as instructions for paper submission can be found at the meeting's web site: http://cns.numedeon.com/cns2001/ Jim Bower Meeting Chairman From ormoneit at Robotics.Stanford.EDU Tue Feb 6 20:50:22 2001 From: ormoneit at Robotics.Stanford.EDU (Dirk Ormoneit) Date: Tue, 6 Feb 2001 17:50:22 -0800 (PST) Subject: Reinforcement learning, particle filters, and human motion Message-ID: <200102070150.RAA00782@bonfire.Stanford.EDU> Hi, Please take notice of the following brand-new papers: Kernel-based reinforcement learning in average-cost problems. D. Ormoneit and P. W. Glynn. http://robotics.stanford.edu/~ormoneit/publications/control.ps Lattice Particle Filters. C. Lemieux, D. Ormoneit, and David J. Fleet. http://robotics.stanford.edu/~ormoneit/publications/lattice.ps.gz Functional analysis of human motion data. D. Ormoneit, T. Hastie, and M.Black. http://robotics.stanford.edu/~ormoneit/publications/motion.ps.gz A more detailed description is enclosed below. 
Cheers, Dirk _________________________________________________________________ Kernel-based reinforcement learning in average-cost problems D. Ormoneit and P. W. Glynn. http://robotics.stanford.edu/~ormoneit/publications/control.ps Reinforcement learning (RL) is concerned with the identification of optimal controls in Markov Decision Processes (MDP) where no explicit model of the transition probabilities is available. Many existing approaches to RL --- including ``temporal-difference learning'' --- employ simulation-based approximations of the value function for this purpose \cite{sutton88,tsitsiklis97}. This proceeding frequently leads to numerical instabilities of the resulting learning algorithm, especially if the function approximators used are parametric such as linear combinations of basis functions or neural networks. In this work, we propose an alternative class of RL algorithms which always produces stable estimates of the value function. In detail, we use ``local averaging'' methods to construct an approximate dynamic programming (ADP) algorithm. _________________________________________________________________ Lattice Particle Filters C. Lemieux, D. Ormoneit, and David J. Fleet. http://robotics.stanford.edu/~ormoneit/publications/lattice.ps.gz A common way to formulate visual tracking is to adopt a Bayesian approach, and to use particle filters to cope with nonlinear dynamics and nonlinear observation equations. While particle filters can deal with such filtering tasks in principle, their performance often varies significantly due to their stochastic nature. We present a class of algorithms, called lattice particle filters, that circumvent this difficulty by placing the particles deterministically according to a Quasi-Monte Carlo integration rule. We describe a practical realization of this idea and discuss its theoretical properties. Experimental results with a synthetic 2D tracking problem show that the lattice particle filter yields a performance improvement over conventional particle filters that is equivalent to an increase between 10 and 60\% in the number of particles, depending on their ``sparsity'' in the state-space. We also present results on inferring 3D human motion from moving light displays. _________________________________________________________________ Functional analysis of human motion data D. Ormoneit, T. Hastie, and M.Black. http://robotics.stanford.edu/~ormoneit/publications/motion.ps.gz We present a method for the modeling of 3D human motion data using functional analysis. First, we estimate a statistical model of typical activities from a large set of 3D human motion data. For this purpose, the human body is represented as a set of articulated cylinders and the evolution of a particular joint angle is described by a time-series. Specifically, we consider periodic motion such as ``walking'' in this work, and we develop a new set of tools that allows for the automatic segmentation of the training data into a sequence of identical ``motion cycles''. Then we compute the mean and the principal components of these cycles using a new algorithm that accounts for missing information and that enforces smooth transitions between cycles. As an application of this methodology we consider the visual tracking of human motion in a 2D video sequence. Here the principal components serve to define a low-dimensional representation of the human 3D poses in a state-space model that treats the 2D video images as observations. 
We apply (approximate) Bayesian inference using a particle filter to the state-space model to infer the body poses at each time-step. The resulting algorithm is able to track human subjects in monocular video sequences and to recover their 3D motion in complex unknown environments. _________________________________________________________________ Dirk Ormoneit Department of Computer Science Gates Building, 1A-148 Stanford University Stanford, CA 94305-9010 ph.: (650) 725-8797 fax: (650) 725-1449 ormoneit at cs.stanford.edu http://robotics.stanford.edu/~ormoneit/ From Patrick.DeMaziere at med.kuleuven.ac.be Wed Feb 7 04:38:09 2001 From: Patrick.DeMaziere at med.kuleuven.ac.be (Patrick De Maziere) Date: Wed, 07 Feb 2001 10:38:09 +0100 Subject: Postdoctoral position in biomedical signal-processing Message-ID: <3A811781.62E819C5@med.kuleuven.ac.be> ----------------------------------------------------- Postdoctoral position in biomedical signal-processing ----------------------------------------------------- Deadline for application: 15 March 2001 The Computational Neuroscience Group of the Laboratory of Neuro- and Psychophysiology, Medical School of the Katholiek Universiteit Leuven, Belgium (http:\\simone.neuro.kuleuven.ac.be), invites applications for a post-doctoral position in the area of biomedical signal-processing (functional Magnetic Resonance Imaging). Desired profile: The highly qualified applicant should possess a Ph.D. degree in the field of signal-processing, image-processing, or neural networks. He/she should be familiar with Independent Components Analysis (ICA) and/or related techniques as Projection Pursuit, Blind Source Separation, ... Programming skills are an asset (C, Matlab, ...), as is a familiarity with UNIX and PC platforms. We offer: 1) A challenging research environment. The applicant will have access to data from state-of-the-art Magnetic Resonance scanners and signal-processing tools for examining brain activity in both humans and monkeys. 2) An attractive income. The applicant will receive 2375 Euro net per month, including a full social security coverage. This is comparable to the salary of an associate Professor at the University. Housing will be taken care of by the host institute. 3) Free return airline ticket, economy class (maximum 1500 Euro) and a reimbursement of all costs incurred for shipping luggage to Belgium (maximum 1000 Euro). Please send (mail/fax/email) your CV (including bibliography, and the names and addresses of three references), before the deadline of 15 March 2001 to: Prof. Dr. Marc M. Van Hulle K.U.Leuven Laboratorium voor Neuro- en Psychofysiologie Faculteit Geneeskunde Campus Gasthuisberg Herestraat 49 B-3000 Leuven Belgium Phone: + 32 16 345961 Fax: + 32 16 345993 E-mail: marc at neuro.kuleuven.ac.be URL: http://simone.neuro.kuleuven.ac.be From ml_conn at infrm.kiev.ua Wed Feb 7 16:00:18 2001 From: ml_conn at infrm.kiev.ua (Dmitri Rachkovskij) Date: Wed, 7 Feb 2001 23:00:18 +0200 (UKR) Subject: Building large-scale hierarchical models of the world... 
Message-ID: <2.07b5.7329.G8ENOI@infrm.kiev.ua> Keywords: analogy, analogical mapping, analogical retrieval, APNN, associative-projective neural networks, binary coding, binding, categories, chunking, compositional distributed representations, concepts, concept hierarchy, connectionist symbol processing, context-dependent thinning, distributed memory, distributed representations, Hebb, long-term memory, nested representations, neural assemblies, part-whole hierarchy, representation of structure, sparse coding, taxonomy hierarchy, thinning, working memory, world model Dear Colleagues, The following paper draft (abstract enclosed): Dmitri Rachkovskij & Ernst Kussul: "Building large-scale hierarchical models of the world with binary sparse distributed representations" is available at http://cogprints.soton.ac.uk/documents/disk0/00/00/12/87/index.html or by the ID code: cog00001287 at http://cogprints.soton.ac.uk/ Comments are welcome! Thank you and best regards, Dmitri Rachkovskij ************************************************************************* Dmitri A. Rachkovskij, Ph.D. Net: dar at infrm.kiev.ua Senior Researcher, V.M.Glushkov Cybernetics Center, Tel: 380 (44) 266-4119 Pr. Acad. Glushkova 40, Kiev 03680, UKRAINE Fax: 380 (44) 266-1570 ************************************************************************* Encl: Abstract Many researchers agree on the basic architecture of the "world model" where knowledge about the world required for organization of agent's intelligent behavior is represented. However, most proposals on possible implementation of such a model are far from being plausible both from computational and neurobiological points of view. Implementation ideas based on distributed connectionist representations offer a huge information capacity and flexibility of similarity representation. They also allow a distributed neural network memory to be used that provides an excellent storage capacity for sparse patterns and naturally forms generalization (or concept, or taxonomy) hierarchy using the Hebbian learning rule. However, for a long time distributed representations suffered from the "superposition catastrophe" that did not allow nested part-whole (or compositional) hierarchies to be handled. Besides, statistical nature of distributed representations demands their high dimensionality and a lot of memory, even for small tasks. Local representations are vivid, pictorial and easily interpretable, allow for an easy manual construction of both types of hierarchies and an economical computer simulation of toy tasks. The problems of local representations show up with scaling to the real world models. Such models include an enormous number of associated items that are met in various contexts and situations, comprise parts of other items, form multitude intersecting multilevel part-whole hierarchies, belong to various category-based hierarchies with fuzzy boundaries formed naturally in interaction with environment. It appears that using local representations in such models becomes less economical than distributed ones, and it is unclear how to solve their inherent problems under reasonable requirements imposed on memory size and speed (e.g., at the level of mammals' brain). We discuss the architecture of Associative-Projective Neural Networks (APNNs) that is based on binary sparse distributed representations of fixed dimensionality for items of various complexity and generality. 
Such representations are rather neurobiologically plausible; however, we consider that the main biologically relevant feature of APNNs is their promise for scaling up to the full-sized adequate model of the real world, the feature that is lacked by implementations of other schemes. As in other schemes of compositional distributed representations, such as HRRs of Plate and BSCs of Kanerva, an on-the-fly binding procedure is proposed for APNNs. It overcomes the superposition catastrophe, permitting the order and grouping of component items in structures to be represented. The APNN representations allow a simple estimation of structures' similarity (such as analogical episodes), as well as finding various kinds of associations based on context-dependent similarity of these representations. A structured distributed auto-associative neural network of feedback type is used as long-term memory, wherein representations of models organized into both types of hierarchy are built. Examples of schematic APNN architectures and processes for recognition, prediction, reaction, analogical reasoning, and other tasks required for functioning of an intelligent system, as well as APNN implementations, are considered. --------------------------------------------------------------------- From school at cogs.nbu.acad.bg Fri Feb 9 12:56:12 2001 From: school at cogs.nbu.acad.bg (CogSci Summer School) Date: Fri, 9 Feb 2001 19:56:12 +0200 Subject: CogSci 2001 Message-ID: 8th International Summer School in Cognitive Science Sofia, New Bulgarian University, July 9 - 28, 2001 Courses: * Arthur Shimamura (University of California at Berkeley, USA) - Executive Control, Metacognition and Memory Processes * Barbara Finlay (Cornell University, USA) - Development and Evolution of the Brain * Maurice Greenberg (New Bulgarian University, BG) - Introduction to Connectionism * Stella Vosniadou (University of Athens, GR) - Cognitive Development and Conceptual Change * Michael Thomas (NCDU, ICH, UK) - Connectionist Models of Developmental Disorders * Csaba Pleh (University of Seget, HU) - Language Understanding in Children, Adults and Patients * Robert Goldstone (Indiana University, USA) - Human learning and adaptive systems * Jeff Elman (University of California at San Diego, USA) - Connectionist Models of Learning and Development * Susan Epstein (City University of New York, USA) - Cognitive Modeling and the Development of Expertise Organised by New Bulgarian University, Bulgarian Academy of Sciences, and Bulgarian Society for Cognitive Science Endorsed by the Cognitive Science Society For more information look at: http://www.nbu.bg/cogs/events/ss2001.htm Central and East European Center for Cognitive Science New Bulgarian University 21 Montevideo Str. Sofia 1635 phone: 955-7518 Guergana Erdeklian, administrative manager, CEECenter for Cognitive Science From marchand at site.uottawa.ca Thu Feb 8 09:00:15 2001 From: marchand at site.uottawa.ca (Mario Marchand) Date: Thu, 8 Feb 2001 09:00:15 -0500 Subject: Tenure-track and contractual positions available Message-ID: <004a01c091d7$76d76e40$255a7a89@site.uottawa.ca> Dear Colleagues, The School of Information Technology and Engineering (SITE) at the University of Ottawa is searching for good candidates to fill several tenure-track faculty positions and contractual positions in basically any area of Computer Science, Software Engineering, and Computer and Electrical Engineering. Hence researchers in Machine Learning, Neural Networks, Pattern Recognition,... are invited to apply.
For more details about these positions and our institution, please consult: http://www.site.uottawa.ca/school/positions/ Regards, -mario From dandre at cs.berkeley.edu Sat Feb 10 23:38:44 2001 From: dandre at cs.berkeley.edu (David Andre) Date: Sat, 10 Feb 2001 20:38:44 -0800 Subject: ICML Workshop on Hierarchy and Memory in Reinforcement Learning Message-ID: <23276.981866324@tatami.cs.berkeley.edu> Workshop on Hierarchy and Memory in Reinforcement Learning ICML 2001, Williams College, June 28, 2001 Call for Participation In recent years, much research in reinforcement learning has focused on learning, planning, and representing knowledge at multiple levels of temporal abstraction. If reinforcement learning is to scale to solving larger, more real-world-like problems, it is essential to consider a hierarchical approach in which a complex learning task is decomposed into subtasks. It has been shown in recent and past work that a hierarchical approach substantially increases the efficiency and abilities of RL systems. Early work in reinforcement learning showed that faster learning resulted when tasks were decomposed into behaviors (Lin, 1993; Mahadevan and Connell, 1992; Singh et al., 1994). However, these approaches were mostly based on modular, but not hierarchical, decompositions. More recently, researchers have proposed various models, the most widely recognized being Hierarchies of Abstract Machines (HAMs) (Parr, 1998), options (Sutton, Precup and Singh, to appear), and MAXQ value function decomposition (Dietterich, 1998). A key technical breakthrough that enabled these approaches is the use of reinforcement learning over semi-Markov decision processes (Bradtke and Duff, Mahadevan et al, 1997, Parr, 1998). Although these approaches speed up learning considerably, they still assume the underlying environment is accessible (i.e. not perceptually aliased). A major direction of research into scaling up hierarchical methods is to extend them to domains where the underlying states are hidden (i.e., partially observable Markov decision processes, or POMDPs). Over the past year, there has been a surge of interest in applying hierarchical reinforcement learning (HRL) methods to such partially observable domains. Researchers are investigating techniques such as memory, state abstraction, offline decomposition, action abstraction, and many others to simplify the problem of learning near-optimal behaviors as they attack increasingly complex environments. This workshop will be an opportunity for the researchers in this growing field to share knowledge and expertise on the topic, open lines of communication for collaboration, prevent redundant research, and possibly agree on standard problems and techniques. The format of the workshop is designed to encourage discussion and debate. There will be approximately four invited talks by senior researchers. Ample time between talks will be provided for discussion. Additionally, there will be a NIPS-like poster session (complete with plenary poster previews). We will also be providing a partially observable problem domain that participants can optionally use to test their techniques, which may provide for an underlying common experience for comparing and contrasting techniques. The discussion will focus not only on the various techniques and theories for using memory in reinforcement learning, but also on higher order questions, such as ``Must hierarchy and memory be combined for practical RL systems?'' and ``What forms of structured memory are useful for RL?''.
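As background for the SMDP-based methods mentioned above (HAMs, options, MAXQ), the learning rule that underlies much of this work can be written in its standard textbook form (a generic statement from the options/SMDP literature, not a result or requirement of the workshop): after executing a temporally extended action (option) o from state s for k primitive steps, accumulating the discounted reward r along the way and ending in state s', update

  Q(s,o) \leftarrow Q(s,o) + \alpha \, [\, r + \gamma^{k} \max_{o'} Q(s',o') - Q(s,o) \,],

where \alpha is a learning rate and \gamma the discount factor. Hierarchy enters through which options are available in each state, and memory (for the partially observable setting discussed above) enters through what is substituted for the raw state s.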
We are thus seeking attendees with the following interests: * Basic technologies applied to the problem of hierarchy and memory in RL - Function approximation - State abstraction - Finite memory methods * Training Issues for hierarchical RL - Shaping - Knowledge transfer across tasks/subproblems * Hierarchy Construction issues - Action models - Action decomposition methods - Automatic acquisition of hierarchical structure - Learning state abstractions * Applications - Hierarchical motor control (especially with feedback) - Hierarchical visual attention/gaze control - Hierarchical navigation Submissions: To participate in the workshop, please send an email message to one of the two organizers, giving your name, address, email address, and a brief description of your reasons for wanting to attend. In addition, if you wish to present a poster, please send a short (< 5 pages in ICML format) paper in postscript or PDF format to the organizers. If you have questions, please feel free to contact us. Important Dates: Deadline for workshop submissions: Mar 22, 2001 Notification of acceptance: April 9, 2001 Final version of workshop papers due: April 31, 2001 The workshop itself: June 28, 2001 Committee: Workshop Chairs: David Andre (dandre at cs.berkeley.edu) Anders Jonsson (ajonsson at cs.umass.edu) Workshop Committee: Andrew Barto Natalia Hernandez Sridhar Mahadevan Ronald Parr Stuart Russell Please see http://www-anw.cs.umass.edu/~ajonsson/icml/ for more information, including directions for formatting the submissions and details on the benchmark domain. From R.J.Howlett at bton.ac.uk Sun Feb 11 13:13:24 2001 From: R.J.Howlett at bton.ac.uk (R.J.Howlett@bton.ac.uk) Date: Sun, 11 Feb 2001 18:13:24 -0000 Subject: Invitation to AI/Internet researchers/authors Message-ID: <08C89600D966D4119C7800105AF0CB6B0246D12B@moulsecoomb.bton.ac.uk> Invitation to authors I am collecting material for a book with a working title of "Intelligent Internet Systems" to be published by a major publisher. The volume will describe the latest research on the interaction between intelligent systems (neural networks, fuzzy and rule-based systems, intelligent agents) and the internet/WWW. I already have quite a lot of material, but would like to invite proposals for chapters from prospective authors. Typical topics might be intelligent techniques for:- * interpreting WWW-derived information * internet data-mining * internet search algorithms and mechanisms * remote condition monitoring and security monitoring * interfaces * tools * applications but suggestions for further topics would be welcome.
If you would like to propose a chapter please would you let me have the following information:- Name/email: University/Company: Are you a Professor/Researcher/Engineer/etc: Title of proposed chapter: Brief description of contents: Many thanks, Bob Howlett ------------------------------------------ Dr R.J.Howlett BSc, MPhil, PhD, MBCS, CEng Director of Brighton TCS Centre, Head of Intelligent Signal Processing Labs, University of Brighton, Moulsecoomb, Brighton, UK Tel: 01273 642305 Fax: 01273 642444 Email: R.J.Howlett at brighton.ac.uk ------------------------------------------ From ckiw at dai.ed.ac.uk Mon Feb 12 14:17:05 2001 From: ckiw at dai.ed.ac.uk (Chris Williams) Date: Mon, 12 Feb 2001 19:17:05 +0000 (GMT) Subject: Advert for Chair in Machine Learning at U of Edinburgh, UK Message-ID: The University of Edinburgh Chair in Machine Learning The University invites applications for a Chair in Machine Learning, to be held within the Division of Informatics. We seek a candidate who will further develop the strengths of the Division through empirical, theoretical, applications-oriented or inter-disciplinary research in any area of machine learning, including probabilistic graphical modelling, computational learning theory, inductive logic programming, pattern recognition, reinforcement learning, data mining, text mining, relations between human and machine learning. Outstanding applicants from areas of Informatics that provide a foundation for Machine Learning are also welcome, including databases, information retrieval, knowledge representation. In addition to strengths in research and scholarship, the successful candidate will provide leadership and inspiration, and play an active role in teaching and administration. Informal enquiries can be made to the Head of Division, Professor Alan Bundy, at the Division of Informatics, by telephone (+44~131~650~2716) or e-mail (hod at informatics.ed.ac.uk). Further particulars and application packs should be obtained from Personnel Department, the University of Edinburgh, 9-16 Chambers Street, Edinburgh EH1 1HT. Tel 0131 650 2511. Please quote Ref Number 316129. Further particulars may also be obtained from the Informatics web site http://www.informatics.ed.ac.uk/events/vacancies Closing date: 15 March 2001 From stefan.wermter at sunderland.ac.uk Mon Feb 12 07:35:22 2001 From: stefan.wermter at sunderland.ac.uk (Stefan.Wermter) Date: Mon, 12 Feb 2001 12:35:22 +0000 Subject: CfC: Book on Intelligent Agent Engineering Message-ID: <3A87D88A.39862A9D@sunderland.ac.uk> We are particularly interested in Novel approaches to intelligent agents: ----------------- CALL FOR CHAPTERS Intelligent Agent Software Engineering A book edited by Valentina Plekhanova and Stefan Wermter, University of Sunderland, United Kingdom Intelligent software agents are a unique generation of information society tools that independently perform various tasks on behalf of human user(s) or other software agents. The new possibility of information society requires the development of new more intelligent methods, tools and theories for modelling/engineering of agent-based systems and technologies. This directly involves a need for consideration, understanding and analysis of human factors, e.g. people's knowledge/skill, learning, and performance capabilities/compatibilities in particular software development environments. This is because software developers utilise their experience and represent their mental models via development/engineering of intelligent agent(s). 
Therefore, the study of interrelated factors such as people's and agent's capabilities constitute an important/critical area in intelligent agent software engineering. This should eventually lead to more robust, intelligent, interactive, learning and adaptive agents. We encourage submitting papers where intelligent agent software engineering is considered as the application of the integration of formal methods and heuristic approaches to ensure support for the evaluation, comparison, analysis, and evolution of agent behaviour. The primary objective of the book is to introduce the readers to the concept of intelligent agent software engineering so that they will be able to develop their own agents and implement concepts in their own environments. Chapters based on research from both academia and industry are encouraged. Representative topics include, but are not limited to, the following: Requirements for intelligent agent software engineering Intelligent agent modelling and management Integration problems in intelligent agent software engineering Intelligent planning/scheduling systems Prioritisation problems in intelligent agent software engineering Engineering models of intelligent agents Modelling, analysis, and management of learning agents Engineering the Information Technology Engineering the learning processes Hybrid neural agents Adaptive agents Learning agents Communicating agents Meta agent architectures Negotiating agents Emerging knowledge based on collaborating software agents Machine learning for intelligent agents Cognitively oriented software agents Biologically motivated agent models Lifelong learning in software agents Robustness and noise in software agents Machine learning for software engineering Evolutionary agents Neuroscience - inspired agents; Agents based on soft and fuzzy computing SUBMISSION PROCEDURE Researchers and practitioners are invited to submit on or before March 31, 2001, a 2-5 page proposal outlining the mission of the proposed chapter. Authors of accepted proposals will be notified by May 15, 2001 about the status of their proposals and sent chapter organisational guidelines. Full chapters are expected to be submitted by September 15, 2001. The book is scheduled to be published by Idea Group Publishing in 2002. Inquiries and Submissions can be forwarded electronically to: Valentina Plekhanova, and Stefan Wermter, University of Sunderland, SCET, The Informatics Centre, St Peter's Campus, St Peter's Way, SR6 0DD, UK Tel: +44 (0)-191-515-2755 Fax: +44 (0)-191 -515-2781 Email: valentina.plekhanova at sunderland.ac.uk stefan.wermter at sunderland.ac.uk *************************************** Professor Stefan Wermter Research Chair in Intelligent Systems University of Sunderland Informatics Centre, SCET St Peters Way Sunderland SR6 0DD United Kingdom phone: +44 191 515 3279 fax: +44 191 515 3553 email: stefan.wermter at sunderland.ac.uk http://www.his.sunderland.ac.uk/~cs0stw/ http://www.his.sunderland.ac.uk/ **************************************** From dagli at umr.edu Mon Feb 12 12:01:38 2001 From: dagli at umr.edu (Cihan Dagli) Date: Mon, 12 Feb 2001 11:01:38 -0600 Subject: ANNIE 2001 Smart Engineering Systems Design Conference November 4-7, 2001 Message-ID: Dear Colleagues: On behalf of the organizing committee I would like to invite you to attend ANNIE 2001 ( http://www.umr.edu/~annie/annie01 ) which will be held November 4th-7th, 2001, at the Marriott Pavilion Hotel in downtown St. Louis, Missouri, USA. 
This will be the eleventh international gathering of researchers interested in Smart Engineering System Design using neural networks, fuzzy logic, evolutionary programming, complex systems, data mining, and rough sets. Each previous conference drew approximately 150 papers from twenty countries. The proceedings of all conferences have been published by ASME Press as hardbound books in nine volumes. The latest volume, edited by Dagli, et. al., was titled "Smart Engineering System Design: Neural Networks, Fuzzy Logic, Evolutionary Programming, Data Mining, and Complex Systems." You can visit ANNIE home page for details of these conferences http://www.umr.edu/~annie/ . ANNIE 2001 ( http://www.umr.edu/~annie/annie01 ) will cover the theory of Smart Engineering System Design techniques, namely: neural networks, fuzzy logic, evolutionary programming, complex systems, data mining, and rough sets. Presentations dealing with applications of these technologies are encouraged. The organizing committee invites all persons interested in Computational Intelligence to submit papers for presentation at the conference. You can submit your abstracts online at http://www.umr.edu/~annie/oas.htm . All papers accepted for presentation will be published in the conference proceedings. They will be reviewed by two referees, senior researchers in the field, for technical merit and content. AUTHORS SCHEDULE March 2, 2001: Deadline for contributed paper abstract, information sheet, and letter of intent. May 18, 2001: Deadline for full papers. July 6, 2001: Notification of status of contributed papers. August 10, 2001: Deadline for camera ready manuscripts. Six pages will be allocated for each accepted paper in the proceedings. All accepted papers will be published as a hardbound book by ASME Press and edited by Drs. Dagli, Buczak, Embrechts, Ersoy, Ghosh and Kercel. Authors are requested to submit the following by March 2, 2001: 1) an abstract (up to 200 words) 2) an information sheet that includes the full name of the authors, address, phone and FAX number, E-mail, Web address 3) a letter of intent Authors should forward their letter of intent, information sheet, abstract, and full paper to: Dr. Cihan H. Dagli, Conference Chair Smart Engineering Systems Laboratory Department of Engineering Management University of Missouri - Rolla 1870 Miner Circle Rolla, MO 65409-0370, USA Phone: (573) 341-6576 or (573) 341-4374 FAX: (573) 341-6268 E-Mail: annie at umr.edu -or- patti at umr.edu Internet: http://www.umr.edu/~annie From tsims at talley.com Fri Feb 9 18:15:17 2001 From: tsims at talley.com (tsims@talley.com) Date: Fri, 9 Feb 2001 18:15:17 -0500 Subject: IJCNN 2001 deadline extended to Feb 22, 2001 Message-ID: <852569EE.007FBE33.00@frodo.talley.com> E X T E N D E D A B S T R A C T D E A D L I N E for I J C N N 2 0 0 1 International Joint Conference on Neural Networks 2001 Washington, D.C., July 15-19,2001 Deadline for ABSTRACTS extended to February 22, 2001. Electronic Submission of Abstracts: http://ams.cos.com/cgi-bin/login?institutionId=2425&meetingId=19 --------------------------------------------------- Dear Colleague, There is an important new development for the IJCNN 2001. We are informed that the National Science Foundation is launching a major new interdisciplinary funding initiative soon, one that should be of extreme interest to our community. 
Accordingly, the INNS Board has decided to feature this NSF Initiative at the IJCNN 2001, including opportunity for special sessions/tracks, and a Panel that focuses on this new initiative. This will be a venue in which our community can provide the agencies involved in this new interdisciplinary initiative a broader scope of ideas to include in the definition of their programs. WE INVITE you, our colleagues, to consider submitting paper abstracts and/or proposals for special sessions on topics which involve multiple disciplines in our field, and hopefully, in areas that push boundaries beyond the scope of previous session topics. We encourage you and other interested scientists to participate in presentations and on-site discussions to launch the new initiative. Our hope is that such presentations, coupled with details of the funding initiative and a panel discussion on the topic will provide an outstanding opportunity for attendees of the IJCNN 2001. Sincerely, Ken Marko and Paul Werbos, General Chairs, IJCNN 2001 PS: We are happy to announce that over 500 Abstracts have already been submitted for the IJCNN 2001, and encourage those who may have missed the original deadline to submit their abstracts now. Use the following URL to go to the abstract submission site: http://ams.cos.com/cgi-bin/login?institutionId=2425&meetingId=19 Sincerely, Ken Marko and Paul Werbos For the Organizing Committee of IJCNN 2001 From steve at cns.bu.edu Tue Feb 13 11:21:54 2001 From: steve at cns.bu.edu (Stephen Grossberg) Date: Tue, 13 Feb 2001 08:21:54 -0800 Subject: Resonant Dynamics of Speech Perception Message-ID: The following article is now available at http://www.cns.bu.edu/Profiles/Grossberg in HTML, PDF, and Gzipped Postscript. Grossberg, S. and Myers, C.W. The resonant dynamics of speech perception: Interword integration and duration-dependent backward effects. Psychological Review. ABSTRACT: How do listeners integrate temporally distributed phonemic information into coherent representations of syllables and words? During fluent speech perception, variations in the durations of speech sounds and silent pauses can produce different perceived groupings. For example, increasing the silence interval between the words "gray chip" may result in the percept "great chip", whereas increasing the duration of fricative noise in "chip" may alter the percept to "great ship" (Repp et al., 1978). The ARTWORD neural model quantitatively simulates such context-sensitive speech data. In ARTWORD, sequential activation and storage of phonemic items in working memory provides bottom-up input to unitized representations, or list chunks, that group together sequences of items of variable length. The list chunks compete with each other as they dynamically integrate this bottom-up information. The winning groupings feed back to provide top-down support to their phonemic items. Feedback establishes a resonance which temporarily boosts the activation levels of selected items and chunks, thereby creating an emergent conscious percept. Because the resonance evolves more slowly than working memory activation, it can be influenced by information presented after relatively long intervening silence intervals. The same phonemic input can hereby yield different groupings depending on its arrival time. Processes of resonant transfer and competitive teaming help determine which groupings win the competition. 
Habituating levels of neurotransmitter along the pathways that sustain the resonant feedback lead to a resonant collapse that permits the formation of subsequent resonances. Keywords: speech perception, word recognition, consciousness, adaptive resonance, context effects, consonant perception, neural network, silence duration, working memory, categorization, clustering. From mpp at us.ibm.com Tue Feb 13 13:58:04 2001 From: mpp at us.ibm.com (Michael Perrone) Date: Tue, 13 Feb 2001 13:58:04 -0500 Subject: IBM Graduate Summer Intern Positions in Handwriting Recognition Message-ID: _________________________________________________________________________________ Graduate Summer Intern Positions at IBM _________________________________________________________________________________ The Pen Technologies Group at the IBM T.J. Watson Research Center is looking for graduate students to fill summer R&D positions in the area of large-vocabulary, unconstrained, handwriting recognition. Candidates should have the following qualifications: - Currently enrolled in a PhD program in EE, CS, Math, Physics or a similar field - Research experience in handwriting recognition or IR - Strong mathematics/probability background - Excellent programming skills (in C and C++) - Creativity Our current projects include: - HMM-based, unconstrained, handwriting recognition - Language and grammar modeling - Accurate, high-speed search methods - Document understanding and processing - Pen computing - Handwritten document retrieval The IBM T.J. Watson Research Center is one of the top industrial laboratories in the world. We offer an exciting research environment with the opportunity to become involved in all aspects of cutting-edge technology in the computer industry. ______________________ Please send CVs to: Michael P. Perrone mpp at us.ibm.com -or- Michael P. Perrone IBM T.J. Watson Research Center - 36-207 Route 134 Yorktown Heights, NY 10598 914-945-1779 From cangulo at esaii.upc.es Tue Feb 13 12:14:36 2001 From: cangulo at esaii.upc.es (Cecilio Angulo) Date: Tue, 13 Feb 2001 18:14:36 +0100 Subject: CFP Proposed Pre-Organized Session. IWANN 2001 Message-ID: <3A896B7B.19B6BAC9@esaii.upc.es> Call for papers ========== Proposed Pre-Organized Session. IWANN 2001 Granada, Spain. June 13-15, 2001 Title: Kernel Machines. Kernel-based Learning Methods. Organizer: Dr. Andreu Catal Description: Recent theoretical advances and experimental results have drawn considerable attention to the use of kernel functions in learning systems for problems of classification, regression, density estimation, ... Learning machines employing the kernel methodology include Support Vector Machines, Radial Basis Function Neural Networks, Gaussian Process prediction, Mathematical Programming with Kernels, Regularized Artificial Neural Networks, Reproducing Kernel Hilbert Space methods, and related approaches. Current working directions include the design of novel kernel-based algorithms and novel types of kernel functions, the development of new learning-theory concepts, and speeding up the learning process.
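For readers less familiar with the kernel methodology the session covers, the short Python sketch below (not part of the call) shows one of the simplest kernel machines, kernel ridge regression with a Gaussian (RBF) kernel; the toy data, the kernel width sigma, and the regularization constant lam are arbitrary illustrative choices.

# Illustrative sketch only: kernel ridge regression with an RBF kernel.
# All data and hyperparameters (sigma, lam) are made up for the example.
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x1_i - x2_j||^2 / (2 sigma^2))."""
    sq = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def fit_kernel_ridge(X, y, sigma=1.0, lam=1e-2):
    """Solve (K + lam*I) alpha = y; the predictor is f(x) = sum_i alpha_i k(x, x_i)."""
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train, sigma) @ alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
    alpha = fit_kernel_ridge(X, y, sigma=0.7, lam=1e-2)
    X_test = np.linspace(-3, 3, 5)[:, None]
    print(predict(X, alpha, X_test, sigma=0.7))  # approximates sin(x) at the test points

The same structure, computing a Gram matrix and then solving a regularized problem in the dual, underlies most of the kernel machines listed above; only the loss function and the regularizer change.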
Final Date for Submission February 28, 2001 Acceptance notification and start of inscription March 31, 2001 End of reduction fee for early inscription April 30, 2001 Congress date June 13-15, 2001 From cindy at cns.bu.edu Wed Feb 14 14:56:32 2001 From: cindy at cns.bu.edu (Cynthia Bradford) Date: Wed, 14 Feb 2001 14:56:32 -0500 Subject: Neural Networks 14(2) Message-ID: <200102141956.OAA07727@retina.bu.edu> NEURAL NETWORKS 14(2) Contents - Volume 14, Number 2 - 2001 ------------------------------------------------------------------ CONTRIBUTED ARTICLES: ***** Mathematical and Computational Analysis ***** Modularity, evolution, and the binding problem: A view from stability theory J.-J.E. Slotine and W. Lohmiller A pruning method for the recursive least squared algorithm Chi-Sing Leung, Kwok-Wo Wong, Pui-Fai Sum, and Lai-Wan Chan Two methods for encoding clusters Pierre Courrieu Numerical solution of differential equations using multiquadric radial basis function networks Nam Mai-Duy and Thanh Tran-Cong ***** Engineering and Design ***** Biomimetic gaze stabilization based on feedback-error-learning with nonparametric regression networks T. Shibata and S. Schaal A globally convergent Lagrange and barrier function iterative algorithm for the traveling salesman problem Chuangyin Dang and Lei Xu Quick fuzzy backpropagation algorithm A. Nikov and S. Stoeva ------------------------------------------------------------------ Electronic access: www.elsevier.com/locate/neunet/. Individuals can look up instructions, aims & scope, see news, tables of contents, etc. Those who are at institutions which subscribe to Neural Networks get access to full article text as part of the institutional subscription. Sample copies can be requested for free and back issues can be ordered through the Elsevier customer support offices: nlinfo-f at elsevier.nl usinfo-f at elsevier.com or info at elsevier.co.jp ------------------------------ INNS/ENNS/JNNS Membership includes a subscription to Neural Networks: The International (INNS), European (ENNS), and Japanese (JNNS) Neural Network Societies are associations of scientists, engineers, students, and others seeking to learn about and advance the understanding of the modeling of behavioral and brain processes, and the application of neural modeling concepts to technological problems. Membership in any of the societies includes a subscription to Neural Networks, the official journal of the societies. Application forms should be sent to all the societies you want to apply to (for example, one as a member with subscription and the other one or two as a member without subscription). The JNNS does not accept credit cards or checks; to apply to the JNNS, send in the application form and wait for instructions about remitting payment. The ENNS accepts bank orders in Swedish Crowns (SEK) or credit cards. The INNS does not invoice for payment. 
---------------------------------------------------------------------------- Membership Type INNS ENNS JNNS ---------------------------------------------------------------------------- membership with $80 or 660 SEK or Y 15,000 [including Neural Networks 2,000 entrance fee] or $55 (student) 460 SEK (student) Y 13,000 (student) [including 2,000 entrance fee] ----------------------------------------------------------------------------- membership without $30 200 SEK not available to Neural Networks non-students (subscribe through another society) Y 5,000 (student) [including 2,000 entrance fee] ----------------------------------------------------------------------------- Institutional rates $1132 2230 NLG Y 149,524 ----------------------------------------------------------------------------- Name: _____________________________________ Title: _____________________________________ Address: _____________________________________ _____________________________________ _____________________________________ Phone: _____________________________________ Fax: _____________________________________ Email: _____________________________________ Payment: [ ] Check or money order enclosed, payable to INNS or ENNS OR [ ] Charge my VISA or MasterCard card number ____________________________ expiration date ________________________ INNS Membership 19 Mantua Road Mount Royal NJ 08061 USA 856 423 0162 (phone) 856 423 3420 (fax) innshq at talley.com http://www.inns.org ENNS Membership University of Skovde P.O. Box 408 531 28 Skovde Sweden 46 500 44 83 37 (phone) 46 500 44 83 99 (fax) enns at ida.his.se http://www.his.se/ida/enns JNNS Membership c/o Professor Tsukada Faculty of Engineering Tamagawa University 6-1-1, Tamagawa Gakuen, Machida-city Tokyo 113-8656 Japan 81 42 739 8431 (phone) 81 42 739 8858 (fax) jnns at jnns.inf.eng.tamagawa.ac.jp http://jnns.inf.eng.tamagawa.ac.jp/home-j.html ----------------------------------------------------------------- From evansdj at aston.ac.uk Wed Feb 14 05:51:10 2001 From: evansdj at aston.ac.uk (DJ EVANS) Date: Wed, 14 Feb 2001 10:51:10 +0000 Subject: JOB: Postdoctoral Research Fellowship in Non-linear Statistics Message-ID: <3A8A631E.4EEFE77C@aston.ac.uk> Postdoctoral Research Fellowship Neural Computing Research Group, Aston University, Birmingham, UK NAOC: Neural network Algorithms for Ocean Colour We are looking for a highly motivated individual for a 3 year postdoctoral research position in the area of the analysis of multi-spectral, remotely sensed data. The emphasis of this research will be on developing and applying data modelling, visualisation and analysis algorithms to measurements of the 'ocean colour' across the electro-magnetic spectrum to generate estimates of chlorophyll-A concentrations using satellite observations. This will allow a better understanding of the distribution of primary biological production and the global carbon cycle. You will be working in a team from seven institutions around the EuropeanUnion, and there is scope for a great deal of interaction. For more details, see the NAOC web site http://www.ncrg.aston.ac.uk/~cornfosd/naoc/. You will work in the Neural Computing Research Group which has a worldwide reputation in practical and theoretical aspects of information analysis. Applicants should have strong mathematical and computational skills; some knowledge of remote sensing would be an advantage. 
We expect to use methods such as the Generative Topographic Mapping (GTM) to model the unconditional probability distribution of the ocean colour data, and then combine this with expert labelling to perform 'unsupervised' classification. We will also develop spatial (Gaussian process) models of the abundance of phytoplankton to aid field-wise (Bayesian) ocean colour retrieval. Candidates should hold (or imminently expect to gain) a PhD in Mathematics, Statistics, Physics, Remote Sensing or a related field. Experience of programming (especially MATLAB or C/C++) and a knowledge of Unix would be an advantage. Salaries will be up to point 6 on the RA 1A scale, currently 18731 UK pounds per annum. The salary scale is subject to annual increments. If you wish to be considered for this Fellowship, please send a full CV and publications list, including full details and grades of academic qualifications, together with the names of 3 referees, quoting reference NAOC01, to: Personnel Officer Aston University Birmingham B4 7ET, U.K. Tel: +44 (0)121 359 0870 Fax: +44 (0)121 359 6470 Further details can be obtained from Dr. Dan Cornford: d.cornford at aston.ac.uk. Closing date: 16 March 2001 -- Dr Dan Cornford d.cornford at aston.ac.uk Computer Science Aston University Aston Triangle tel +44 121 359 3611 ext 4667 Birmingham B4 7ET fax +44 121 333 6215 http://www.ncrg.aston.ac.uk/~cornfosd/ ------------------------------------------------------------- David J. Evans evansdj at aston.ac.uk Neural Computing Research Group tel: +44 (0)121 359 3611 Room 318 C ext. 4668 Aston University Aston Triangle Birmingham B4 7ET http://www.ncrg.aston.ac.uk -------------------------------------------------------------- From terry at salk.edu Wed Feb 14 21:15:47 2001 From: terry at salk.edu (Terry Sejnowski) Date: Wed, 14 Feb 2001 18:15:47 -0800 (PST) Subject: NEURAL COMPUTATION 13:3 In-Reply-To: <200101130743.f0D7h9v12640@purkinje.salk.edu> Message-ID: <200102150215.f1F2FlC47730@kepler.salk.edu> Neural Computation - Contents - Volume 13, Number 3 - March 1, 2001 Article The Limits of Counting Accuracy in Distributed Neural Representations A. R. Gardner-Medwin and H. B. Barlow Note An Expectation-Maximisation Approach to Nonlinear Component Analysis Roman Rosipal and Mark Girolami Letters A Population Density Approach that Facilitates Large-Scale Modeling of Neural Networks: Extension to Slow Inhibitory Synapses Duane Q. Nykamp and Daniel Tranchina A Complex Cell-Like Receptive Field Obtained Information Maximization Kenji Okajima and Hitoshi Imaoka Self-Organization of Topographic Mixture Networks Using Attentional Feedback James R. Williamson Auto-SOM: Recurseive Parameter Estimation for Guidance of Self-Organizing Feature Maps Karin Haese and Geoffrey J. Goodhill Exponential Convergence of Delayed Dynamical Systems Tianping Chen and Shun-ichi Amari Improvements to Platt's SMO Algorithm for SVM Classifier Design S. S. Keerthi, S. K. Shevade, C. Bhattacharyya and K. R. K. Murthy Learning Hough Transform: A Neural Network Model Jayanta Basak A Constrained E.M. 
Algorithm for Independent Component Analysis Max Welling and Markus Weber Emergence of Memory-Driven Command Neurons in Evolved Artificial Agents Ranit Aharonov-Barki, Tuvik Beker and Eytan Ruppin ----- ON-LINE - http://neco.mitpress.org/ SUBSCRIPTIONS - 2001 - VOLUME 13 - 12 ISSUES
                     USA      Canada*     Other Countries
Student/Retired      $60      $64.20      $108
Individual           $88      $94.16      $136
Institution          $460     $492.20     $508
* includes 7% GST
MIT Press Journals, 5 Cambridge Center, Cambridge, MA 02142-9902. Tel: (617) 253-2889 FAX: (617) 577-1545 journals-orders at mit.edu ----- From d.mareschal at bbk.ac.uk Thu Feb 15 04:24:20 2001 From: d.mareschal at bbk.ac.uk (Denis Mareschal) Date: Thu, 15 Feb 2001 10:24:20 +0100 Subject: changes to Developmental Science Message-ID: Dear all, Following the tragic and unexpected demise of Professor George Butterworth (founding editor of Developmental Science), a few changes have been made to the editorial board and emphasis of the journal. DEVELOPMENTAL SCIENCE publishes cutting-edge theory and up-to-the-minute research on scientific developmental psychology from leading thinkers in the field. New scientific findings and in-depth empirical studies are published, with coverage including species-comparative approaches, computational modelling (ESPECIALLY CONNECTIONIST OR NEURAL NETWORK MODELLING), and social and biological approaches to development, as well as cognitive development. INCREASING EMPHASIS WILL BE PLACED ON PAPERS THAT BRIDGE LEVELS OF EXPLANATION IN DEVELOPMENTAL SCIENCE, such as between brain growth and perceptual, cognitive and social development (sometimes called "developmental cognitive neuroscience"), and those which cover emerging areas such as functional neuroimaging of the developing brain. SPECIAL ISSUE, Volume 4, Issue 3 THE DEVELOPING HUMAN BRAIN Edited by Michael Posner, Mary Rothbart, Martha Farah and John Bruer This special issue will contain a state-of-the-art assessment written by six international panels who reviewed research progress on fundamental questions in the field. Chapters include: perception and attention, language, temperament and emotion, socialization and psychopathology. The two-year process of compiling the report involved meetings that included leading investigators in many fields and countries. EDITOR Mark H. Johnson, Birkbeck College, UK ASSOCIATE EDITORS B.J. Casey, Sackler Institute, Cornell University, USA Adele Diamond, Eunice Kennedy Shriver Center, USA Barbara Finlay, Cornell University, USA Patricia Kuhl, University of Washington, Seattle, USA Denis Mareschal, Birkbeck College, UK Andy Meltzoff, University of Washington, Seattle, USA Helen Neville, University of Oregon, USA Paul Quinn, Washington and Jefferson College, USA Scania de Schonen, CNRS Paris, France Michael Tomasello, Max Planck Inst. of Evol. Anthropology, Germany Editor for "Developmental Science in other languages" Juan-Carlos Gomez, University of St Andrews, UK FURTHER INFORMATION CAN BE OBTAINED FROM: http://www.blackwellpublishers.co.uk/journals/desc/ ================================================= Dr.
Denis Mareschal Centre for Brain and Cognitive Development School of Psychology Birkbeck College University of London Malet St., London WC1E 7HX, UK tel +44 (0)20 7631-6582/6207 fax +44 (0)20 7631-6312 http://www.psyc.bbk.ac.uk/staff/dm.html ================================================= From lunga at ifi.unizh.ch Mon Feb 19 08:48:43 2001 From: lunga at ifi.unizh.ch (Max Lungarella) Date: 19 Feb 2001 14:48:43 +0100 Subject: DECO2001 - CFP Message-ID: <3A91243B.FB87BB9B@ifi.unizh.ch> We apologize for multiple postings of this call. ********************** CALL FOR PAPERS ************************** DECO2001 - Developmental Embodied Cognition Workshop to be held in conjuction with the 23rd Annual Meeting of the Cognitive Science Society July 31st, University of Edinburgh, Scotland http://www.cogsci.ed.ac.uk/~deco/ PAPER DEADLINE: April 30, 2001 WORKSHOP DATE: July 31, 2001 SCOPE The objective of this workshop is to bring together researchers from cognitive science, psychology, robotics, artificial intelligence, philosophy, and related fields to discuss the role of developmental and embodied views of cognition, and in particular, their mutual relationship. The ultimate goal of this approach is to understand the emergence of high-level cognition in organisms based on their interactions with their environment over extended periods of time. FOCUS The symposium will focus on research that explicitly takes development in embodied systems into account, either at the level of empirical work, computational models, or real-world devices. Finally, contributions giving a broad and novel philosophical or methodological view on embodied cognition are welcome. CONTRIBUTIONS Contributions are solicited from the following areas (but not restricted to this list): - Mechanisms of development (e.g. neural networks, dynamical systems, information theoretic) - Development of sensory and motor systems - Perception-action coupling, sensory-motor coordination - Cognitive developmental robotics - Categorization, object exploration - Communication and social interaction - Methodologies - Debates and philosophical issues (e.g. constructivism-selectionism, nature-nurture, scalability, symbol grounding) ORGANIZATION This will be a one-day workshop with a number of invited talks, a poster session, and with lots of room for discussion. In order to allow for more individual interactions, and to make better use of the limited time we have for the workshop, accepted papers will generally be as posters. The poster session will be over a cocktail to ensure a relaxed atmosphere. FORM OF CONTRIBUTION Contributions should be in the form of full papers with a maximum of five pages, which, if accepted, can be presented as posters at the workshop. Accepted contributions will be assembled in a handout paper collection for the participants of the workshop. At the workshop there will be a discussion on potential further publication, e.g. as a special issue of a journal or a book. GUIDELINES FOR ABSTRACT/PAPER SUBMISSION The contribution should be submitted electronically to deco at cogsci.ed.ac.uk in pdf format. 
Information on how to convert LaTeX or Word documents to pdf can be found at http://www.hcrc.ed.ac.uk/cogsci2001/PDF.html IMPORTANT DATES Full papers submission: April 30, 2001 Notification of acceptance: May 31, 2001 Workshop date: July 31, 2001 LINKS Workshop page: http://www.cogsci.ed.ac.uk/~deco Cognitive Science Conference: http://www.hcrc.ed.ac.uk/cogsci2001/ PROGRAM COMMITTEE Rolf Pfeifer (chair, AI Lab, University of Zurich, Switzerland) Gert Westermann (co-chair, Sony CSL, Paris) Cynthia Breazeal (MIT AI Lab, Cambridge, Mass., USA) Yiannis Demiris (Laboratory of Physiology, Oxford University, UK) Max Lungarella (AI Lab, University of Zurich, Switzerland) Rafael Nunez (University of Fribourg, Switzerland, and University of California, Berkeley, USA) Linda Smith (Department of Psychology, Indiana University, Bloomington, IN, USA) From moody at cse.ogi.edu Mon Feb 19 18:30:44 2001 From: moody at cse.ogi.edu (John E Moody) Date: Mon, 19 Feb 2001 15:30:44 -0800 Subject: Rio de Janeiro, 5th Congress on Neural Nets, April 2-5, 2001 Message-ID: <3A91ACA4.93039698@cse.ogi.edu> Connectionists, I am posting this announcement at the request of the organizers of the Fifth Brazilian Congress on Neural Networks, which will convene in Rio de Janeiro on April 2-5, 2001. For registration and abstract submission information, please see the conference URL at: http://www.ele.ita.br/cnrn Early registration continues through February 28. The Congress will feature presentations by Paul Werbos, Teuvo Kohonen and others, including: "Automatic Knowledge Discovery in Power Systems", Prof. Louis A. Wehenkel (U. Liege). "Real Time Dynamic Security Assessment", Prof. Mohamed El-Sharkawi (U. Washington). "Electrical Load Forecasting Via Neural Networks", Prof. Tharam S. Dillon (La Trobe U.). "The Application of Fundamental and Neural Network Models to Credit Risk Analysis", Dr. Pratap Sondhi (Citibank). "Financial Model Calibration Using Consistency Hints", Prof. Yaser Abu-Mostafa (CALTECH). "Learning to Trade via Direct Reinforcement", Prof. John Moody (Oregon Graduate Institute). "Self-Organizing Maps of Massive Document Collections", Prof. Teuvo Kohonen (Helsinki University of Technology). "Specification, Estimation, and Evaluation of Neural Networks Time Series Models", Prof. Timo Tersvirta (Stockholm School of Economics). "Neural Networks in Control Systems", Dr. Paul Werbos (National Science Foundation). See you in Rio! John ================================================================= John E. Moody http://www.cse.ogi.edu/~moody/ Professor and Director, http://www.cse.ogi.edu/CompFin/ Computational Finance Program Oregon Graduate Institute Voice: (503) 748-1554 20000 NW Walker Road FAX: (503) 748-1548 Beaverton, OR 97006 USA Email: moody at cse.ogi.edu ================================================================= From nnsp01 at neuro.kuleuven.ac.be Mon Feb 19 09:08:43 2001 From: nnsp01 at neuro.kuleuven.ac.be (Neural Networks for Signal Processing 2001) Date: Mon, 19 Feb 2001 15:08:43 +0100 Subject: NNSP 2001, Call For Papers Message-ID: <3A9128EB.6364725B@neuro.kuleuven.ac.be> ****************************************************************************** 2001 IEEE Workshop on Neural Networks for Signal Processing September 10--12, 2001 Falmouth, Massachusetts, USA ****************************************************************************** Sponsored by the IEEE Signal Processing Society In cooperation with the IEEE Neural Networks Council General Chairs David J. 
MILLER Pennsylvania State University Tulay ADALI University of Maryland Baltimore County Program Chairs Jan LARSEN Technical University of Denmark Marc VAN HULLE Katholieke Universiteit, Leuven Finance Chair Lian YAN Athene Software, Inc. Proceedings Chair Scott C. DOUGLAS Southern Methodist University Publicity Chair Patrick DE MAZIERE Katholieke Universiteit, Leuven Registration Elizabeth J. WILSON Raytheon Co. Europe Liaison Herve BOURLARD Swiss Federal Institute of Technology, Lausanne America Liaison Amir ASSADI University of Wisconsin at Madison Asia Liaison H.C. FU National Chiao Tung University Papers Thanks to the sponsorship of IEEE Signal Processing Society and IEEE Neural Network Council, the eleventh of a series of IEEE workshops on Neural Networks for Signal Processing will be held in Falmouth, Massachusetts, at the SeaCrest Oceanfront Resort and Conference Center. The workshop will feature keynote addresses, technical presentations and panel discussions. Papers are solicited for, but not limited to,the following areas: Algorithms and Architectures: Artificial neural networks (ANN), adaptive signal processing, Bayesian modeling, MCMC, parameter estimation, nonlinear signal processing, generalization, design algorithms, optimization, Markov models, fuzzy systems (FS), evolutionary computation (EC), synergistic models of ANN/FS/EC, and wavelets. Applications: Speech processing, image processing, blind source separation, sonar and radar, data fusion, data mining, intelligent multimedia and web processing, OCR, robotics, adaptive filtering, communications, sensors, system identification, and other general signal processing and pattern recognition applications. Implementations: Parallel and distributed implementation, hardware design, and other general implementation technologies. Further Information NNSP'2001 webpage: http://eivind.imm.dtu.dk/nnsp2001 Paper Submission Procedure Prospective authors are invited to submit a full paper of up to ten pages using the electronic submission procedure described at the workshop homepage: http://eivind.imm.dtu.dk/nnsp2001 Accepted papers will be published in a hard-bound volume by IEEE and distributed at the workshop. Extended versions of the best workshop papers will be selected and published in a Special Issue of an international journal published by Kluwer Academic Publishers. Program Committee Yianni Attikiouzel Andrew Back Herve Bourlard Andrzej Cichocki Jesus Cid-Sueiro Robert Dony Ling Guan Tzyy-Ping Jung Shigeru Katagiri Jens Kohlmorgen Fa Long Luo Danilo Mandic Elias Manolakos Michael Manry Takashi Matsumoto Li Min Fu Christophe Molina Bernard Mulgrew Mahesan Niranjan Tomaso Poggio Kostas N. Plataniotis Jose Principe Phillip A.Regalia Joao-Marcos T. Romano Kenneth Rose Jonas Sjoberg Robert Snapp M. 
Kemal Sonmez Naonori Ueda Lizhong Wu Lian Yan Fernando J.Von Z uben Schedule Submission of full paper: March 15, 2001 Notification of acceptance: May 1, 2001 Submission of photo-ready accepted paper and author registration: June 1, 2001 Advance registration, before: July 15, 2001 From A.van.Ooyen at nih.knaw.nl Wed Feb 21 10:29:54 2001 From: A.van.Ooyen at nih.knaw.nl (Arjen van Ooyen) Date: Wed, 21 Feb 2001 16:29:54 +0100 Subject: New Paper Message-ID: <3A93DEF2.D1FF5234@nih.knaw.nl> New Paper: Competition in the Development of Nerve Connections: A Review of Models Network: Computation in Neural Systems (2001) 12: R1-R47 by Arjen van Ooyen Please download (free) from Network: http://www.iop.org/Journals/ne or via my website http://www.anc.ed.ac.uk/~arjen/competition.html Abstract: The establishment and refinement of neural circuits involve both the formation of new connections and the elimination of already existing connections. Elimination of connections occurs, for example, in the development of mononeural innervation of muscle fibres and in the formation of ocular dominance columns in the visual cortex. The process that leads to the elimination of connections is often referred to as axonal or synaptic competition. Although the notion of competition is commonly used, the process is not well understood -- with respect to, for example, the type of competition, what axons and synapses are competing for, and the role of electrical activity. This article reviews the types of competition that have been distinguished and the models of competition that have been proposed. Models of both the neuromuscular system and the visual system are described. For each of these models, the assumptions on which it is based, its mathematical structure, and the extent to which it is supported by the experimental data are evaluated. Special attention is given to the different modelling approaches and the role of electrical activity in competition. -- Arjen van Ooyen, Netherlands Institute for Brain Research, Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands. email: A.van.Ooyen at nih.knaw.nl website: http://www.anc.ed.ac.uk/~arjen phone: +31.20.5665483 fax: +31.20.6961006 From jbower at bbb.caltech.edu Wed Feb 21 17:57:43 2001 From: jbower at bbb.caltech.edu (James M. Bower) Date: Wed, 21 Feb 2001 14:57:43 -0800 Subject: No subject Message-ID: CNS*01 Housing Alert We are pleased to announce the submission of almost 300 papers to this year's Computational Neuroscience Meeting in San Francisco and Asylomar California. http://cns.numedeon.com/cns2001/ Those of you planning on attending this year's CNS meeting, however, need to make your reservations at the Ramada Plaza Hotel in San Francisco immediately. The deadline for reserving rooms at the conference rate is: Wednesday, FEB. 28th, 2001 After that date, the cost per night for a hotel room will go from the conference rate of $139 to the regular rate of $239. Please reserve your room by calling 1-800-227-4747 or (415)626-8000. You can also download the reservation form from http://cns.numedeon.com/cns2001/ and fax the form to (415) 861-1435, Attn: Kuldip Singh. Please indicate to the reservation agent that you are a participant in the CNS*01 conference. Remember, you can reserve rooms at the conference rate anytime from Friday, June 29 to Saturday, June 30. Transportation will be provided from the Ramada Plaza Hotel on Sunday, July 1 to the Asilomar Conference Center. Thank you. 
Jim Bower, CNS Meeting Chair Judy Macias, CNS Conference Coordinator From harnad at coglit.ecs.soton.ac.uk Sun Feb 25 10:53:31 2001 From: harnad at coglit.ecs.soton.ac.uk (Stevan Harnad) Date: Sun, 25 Feb 2001 15:53:31 +0000 (GMT) Subject: Survey of Users and Non-Users of Eprint Archives Message-ID: I would be very grateful if you could participate in a survey we are conducting on current users and non-users of Eprint Archives. http://www.eprints.org/survey/ The purpose of the survey is to determine who is and is not using such archives at this time, how they use them if they do, why they do not use them if they do not, and what features they would like to have added to them to make them more useful. (The survey is anonymous. Revealing your identity is optional and it will be kept confidential.) The survey consists of about web-based 72 questions, and comes in four versions: PHYSICISTS, ASTROPHYSICISTS, MATHEMATICIANS 1. arXiv Users 2. arXiv Non-Users COGNITIVE SCIENTISTS (Psychologists, Neuroscientists, Behavioral Biologists, Computer Scientists [AI/robotics/vision/speech/learning], Linguists, Philosophers) 3. CogPrints Users 4. CogPrints Non-Users OTHER DISCIPLINES: Please use either 2. or 4. http://www.eprints.org/survey/ Many thanks, Stevan Harnad harnad at cogsci.soton.ac.uk Professor of Cognitive Science harnad at princeton.edu Department of Electronics and phone: +44 23-80 592-582 Computer Science fax: +44 23-80 592-865 University of Southampton http://www.cogsci.soton.ac.uk/~harnad/ Highfield, Southampton http://www.princeton.edu/~harnad/ SO17 1BJ UNITED KINGDOM From: esann To: "Connectionists at cs.cmu.edu" References: From bogus@does.not.exist.com Mon Feb 26 08:10:40 2001 From: bogus@does.not.exist.com () Date: Mon, 26 Feb 2001 14:10:40 +0100 Subject: ESANN'2001 programme ( European Symposium on Artificial Neural Networks) Message-ID: ---------------------------------------------------- | | | ESANN'2001 | | | | 9th European Symposium | | on Artificial Neural Networks | | | | Bruges (Belgium) - April 25-26-27, 2001 | | | | Preliminary programme | ---------------------------------------------------- The preliminary programme of the ESANN'2001 conference is now available on the Web: http://www.dice.ucl.ac.be/esann For those of you who maintain WWW pages including lists of related ANN sites: we would appreciate if you could add the above URL to your list; thank you very much! We try as much as possible to avoid multiple sendings of this call for papers; however please apologize if you receive this e-mail twice, despite our precautions. For 9 years the ESANN conference has become a major event in the field of neural computation. ESANN is a human-size conference focusing on fundamental aspects of artificial neural networks (theory, models, algorithms, links with statistics, data analysis, biological background,...). The programme of the conference can be found at the URL http://www.dice.ucl.ac.be/esann, together with practical information about the conference venue, registration,... Other information can be obtained by sending an e-mail to esann at dice.ucl.ac.be . ===================================================== ESANN - European Symposium on Artificial Neural Networks http://www.dice.ucl.ac.be/esann * For submissions of papers, reviews,... Michel Verleysen Univ. Cath. de Louvain - Microelectronics Laboratory 3, pl. 
du Levant - B-1348 Louvain-la-Neuve - Belgium tel: +32 10 47 25 51 - fax: + 32 10 47 25 98 mailto:esann at dice.ucl.ac.be * Conference secretariat D facto conference services 24 av. L. Mommaerts - B-1140 Evere - Belgium tel: + 32 2 730 06 11 - fax: + 32 2 730 06 00 mailto:esann at dice.ucl.ac.be ===================================================== From mzib at ee.technion.ac.il Mon Feb 26 09:31:58 2001 From: mzib at ee.technion.ac.il (Michael Zibulevsky) Date: Mon, 26 Feb 2001 16:31:58 +0200 (IST) Subject: New paper on extraction of a single source from multichannel data Message-ID: Announcing a paper ... Title: Extraction of a single source from multichannel data using sparse decomposition Authors: M. Zibulevsky and Y.Y. Zeevi ABSTRACT: It was discovered recently that use of sparse decompositions in signal dictionaries improves dramatically quality of blind source separation. In this work we exploit sparse decomposition of a single source in order to extract it from the multidimensional sensor data, when in addition a rough template of the source is known. This leads to a convex optimization problem, which is solved by a Newton-type method. Complete and overcomplete dictionaries are considered. Simulations with synthetic evoked responses mixed into natural 122-channel MEG data show significant improvement in accuracy of signal restoration. URL of gzipped ps file: http://ie.technion.ac.il/~mcib/extr_onesrc2.ps.gz Contact: mzib at ee.technion.ac.il From campber at CLEMSON.EDU Mon Feb 26 17:51:36 2001 From: campber at CLEMSON.EDU (Robert L. Campbell) Date: Mon, 26 Feb 2001 17:51:36 -0500 Subject: Interactivist Summer Institute, Lehigh University, July 23-27, 2001 Message-ID: The Interactivist Summer Institute 2001 July 23-27, 2001 Lehigh University, Bethlehem, Pennsylvania, USA GENERAL INFORMATION (This Call for Participation, along with related information, can also be viewed at http://www.lehigh.edu/~interact/isi2001.html). It's happening: research threads in multiple fields scattered across the mind-sciences seem to be converging towards a point where the classical treatment of representation within the encodingist framework is felt as an impasse. A rethinking of the methods, concepts, arguments, facts, etc. is needed and, so it seems, is being found in the interactivist approach. From research in human cognition, motivation, and development, through consciousness, sociality, and language, to artificial intelligence, post-behaviorist cognitive robotics, and interface design, we are witnessing the appearance of projects where the assumptions of interactivism are embraced. More often then not, this is in an implicit manner, so that at a superficial level those projects (the problems they deal with, the methods they use) seem to be incommensurable. However, underneath, one can feel their interactivist gist. The time is right (and ripe) we felt, to articulate this "irrational" (in Feyerabendian sense) pressure for change at a programmatic level, and this is what we want to accomplish with the present workshop. The workshop will be preceded by a Summer School in Interactivism featuring several tutorials which are meant to provide the needed theoretical background, based mainly on Mark Bickhard and his collaborators' work. 
The intention is for this Institute to become a traditional annual meeting where those sharing the core ideas of interactivism will meet and discuss their work, try to reconstruct its historical roots, put forward current research in different fields that fits the interactivist framework, and define research topics for prospective graduate students. People working in philosophy of mind, linguistics, social sciences, artificial intelligence, cognitive robotics, and other fields related to the sciences of mind are invited to send their statement of interest for participation to the organizers (see details below). ORGANIZING COMMITTEE Mark Bickhard John C. Christopher Wayne Christensen Robert Campbell Georgi Stojanov Goran Trajkovski MAJOR THEMES * Foundations of Interactivism * Naturalism * Emergence * Process metaphysics * Cognition and Representation * Representation emergent in action systems * Dissolution of problems of skepticism, error, Chinese room, etc. * Concepts * Memory * Learning * Heuristic learning * Metaphor * Rationality and negative knowledge * Agents * Interaction * Motivation * Emotions * Autonomous agents * Persons * Development * Consciousness * Sociality * Language * Ethics * Social processes and realities ACADEMIC AND SCIENTIFIC CO-SPONSORS Lehigh University, Bethlehem, PA, USA Institute for Interactive Studies Cognitive Science Program Humanities Research Center SS Cyril and Methodius University, Skopje, Macedonia West Virginia University at Parkersburg CALL FOR PARTICIPATION Participation will be limited to 30 people and by invitation only; People wishing to participate should submit a short curriculum vitae and a statement of interest to Interactivist Summer Institute. Please include e-mail address and/or fax number, if available. Applications should be received by March 15, 2001. Notification of acceptance will be provided by April 15, 2001. The meeting will take place in the conference room The Governor's Suite, Iaccoca Hall (tentative). A small number of scholarships for partial financial support will be provided by the organizers for graduate students or postdocs. CALL FOR PAPERS If you are interested in the issues mentioned above and wish to share your thoughts and research results with like-minded people, please submit an extended abstract or full paper via email with attached files (in ASCII, RTF, or Word) to: Interactivist Summer Institute (interact at lehigh.edu) Abstracts and papers should be sent taking into account the following format: 1. Major theme of the paper, related to the major themes given above. 2. Paper title. 3. Extended abstract of 500 to 1500 words and/or paper drafts of 2000 to 5000 words, in English. 4. Author or co-authors with names, addresses, telephone number, fax number and e-mail address. All abstracts will be refereed by an independent panel of experts. The judgments of the referees will determine the list of papers to be presented at the conference. DEADLINES Applications: March 15 Submission of papers: March 15 Notification date: April 15 Receipt of registration fee: May 1 On campus housing reservation (see below): June 30 Off campus housing reservation (see below): June 22 CONFERENCE FEES Standard registration fee: $150 Student registration fee: $100 Checks should be made out to: Interactivist Summer Institute. Mail to: Mark H. 
Bickhard Interactivist Summer Institute 17 Memorial Drive East Bethlehem, PA 18015 USA For wire transfers: Wire address: First Union National Bank Funds Transfer Department Attention: NC0803 1525 West W.T. Harris Blvd. Charlotte, NC 28288-0803 ABA # 031201467 Account # 2100012444293 Account Name: Lehigh University For international wires, these additional identification numbers are required: CHIPS Participant #0509 Swift TID #PNBPUS33 You must include your name and identify that the transfer is for the Interactivist Summer Institute. HOUSING Housing is available both on campus and off campus. Off campus housing is with Comfort Suites, and is within easy walking distance of the main Lehigh campus. The rates are $80/night for a single and $85/night for a double. Please contact: Comfort Suites 120 W 3rd Bethlehem, PA 18015 USA 610-882-9700 On campus housing is available both air-conditioned (Trembly Park) and not air-conditioned (Gamma Phi Beta). For on campus housing, please fill out and return the Interactivist Summer Institute housing form. This can be obtained from the Web site as a PDF file or a Word file. TRAVEL The easiest way to get to Bethlehem is to fly into Lehigh Valley International Airport (known as ABE, from Allentown, Bethlehem, Easton - LVI is already taken by Las Vegas International Airport). There are direct flights from Chicago, for example, for those coming from the west, and also flights from the South (e.g., Atlanta). Flying into New York, particularly Newark Airport, also works well. There are buses to Bethlehem from Newark Airport and from the Port Authority Bus Terminal in Manhattan. So, from Kennedy or LaGuardia, you first go the Port Authority, and then get a bus to Bethlehem. The bus company is: Trans Bridge Lines 2012 Industrial Drive Bethlehem 610-868-6001 800-962-9135 The Industrial Drive terminal is the main bus terminal, and taxis are available to the Lehigh campus. Lehigh Valley Taxi: 610-867-6000 Quick Service Taxi: 610-434-8132 Airport Taxi Service: 610-231-2000 There is also a South Bethlehem terminal that is within walking distance of Comfort Suites and of campus (though it would a little long with luggage), but fewer buses make that stop. Philadelphia airport is closer than Newark airport, but getting to Bethlehem from there is harder than from Newark. You get to the Philadelphia bus station (probably by taxi, though there is a train to downtown Philadelphia), and then take a bus (Bieber Tours) to Bethlehem: it's roughly the equivalent in complication of coming through Kennedy airport -- Robert L. Campbell Professor, Psychology Brackett Hall 410A Clemson University Clemson, SC 29634-1355 USA phone (864) 656-4986 fax (864) 656-0358 http://hubcap.clemson.edu/~campber/index.html From frey at dendrite.uwaterloo.ca Tue Feb 27 13:11:39 2001 From: frey at dendrite.uwaterloo.ca (frey@dendrite.uwaterloo.ca) Date: Tue, 27 Feb 2001 13:11:39 -0500 Subject: postdoc advertisement Message-ID: <200102271811.NAA31868@dendrite.uwaterloo.ca> CALL FOR APPLICATIONS POSTDOCTORAL RESEARCH SCIENTIST Probabilistic Inference - Machine Learning - Decision Making Statistical Computing - Bayesian Theory & Applications Computer Vision - Coding - Speech Recognition - Bioinformatics Our group at the University of Toronto would like to hire one or more postdoctoral research scientists. The successful applicant(s) will work on theoretical and applied research in areas such as those listed above. 
Faculty members in our group and their interests are as follows:

Craig Boutilier  http://www.cs.toronto.edu/~cebly
  Markov decision processes, reinforcement learning. Probabilistic inference. Economic models of agency, combinatorial auctions. Preference elicitation, interactive optimization under uncertainty.

Brendan Frey  http://www.cs.toronto.edu/~frey
  Graphical models, machine learning, variational techniques, loopy belief propagation. Computer vision. Speech recognition. Iterative error-correcting decoding. SAR and MRI imaging. Bioinformatics.

Radford Neal  http://www.cs.toronto.edu/~radford/
  Bayesian modeling with neural networks, Gaussian processes, and mixtures. Markov chain Monte Carlo methods. Low density parity check codes. Empirical assessment of learning methods.

Jeffrey Rosenthal  http://markov.utstat.toronto.edu/jeff/
  Probability theory and stochastic processes. Markov chain Monte Carlo theory and methods. Convergence rates of Markov chains. Randomized algorithms. Random walks on groups.

Rich Zemel  http://www.cs.toronto.edu/~zemel
  Unsupervised learning, boosting. Perceptual learning, representations of visual motion, multisensory integration. Neural coding, probabilistic models of neural representations.

The group currently consists of the above faculty members, 2 postdoctoral researchers and 25 graduate students, and has joint projects with Microsoft Research, Xerox PARC, the University of Illinois at Urbana-Champaign, the University of British Columbia, the University of Waterloo, and Simon Fraser University.

Applicants should
* have a solid background in one or more of the areas described above
* have good scientific skills
* be good at writing software to implement and evaluate algorithms

Successful applicants who wish to do so will have the opportunity to apply to do sessional teaching in the departments of Computer Science, Statistics, or Electrical and Computer Engineering.

Applicants should EMAIL a CV, the email addresses of 3 references, and a short description of their research interests and goals as a postdoc (ascii format, < 500 words) to Brendan Frey at frey at cs.toronto.edu.

From xwu at gauss.Mines.EDU Tue Feb 27 12:35:44 2001
From: xwu at gauss.Mines.EDU (Xindong Wu)
Date: Tue, 27 Feb 2001 10:35:44 -0700 (MST)
Subject: Knowledge and Information Systems: 3(1) and 3(2), 2001
Message-ID: <200102271735.KAA05770@gauss.Mines.EDU>

Knowledge and Information Systems: An International Journal
-----------------------------------------------------------
ISSN: 0219-1377 (printed version)
ISSN: 0219-3116 (electronic version)
by Springer-Verlag
Home Page: http://kais.mines.edu/~kais/home.html
================================================

I. Volume 3, Number 1 (February 2001)
-------------------------------------

Regular Papers

- Parallel Data Mining for Association Rules on Shared-Memory Systems
  by Srinivasan Parthasarathy, Mohammed J. Zaki, Mitsunori Ogihara, and Wei Li
  URL: link.springer.de/link/service/journals/10115/bibs/1003001/10030001.htm
  or   link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030001.htm

- On Similarity Measures for Multimedia Database Applications
  by K. Selcuk Candan and Wen-Syan Li
  URL: link.springer.de/link/service/journals/10115/bibs/1003001/10030030.htm
  or   link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030030.htm

- Representing and Reasoning on Database Conceptual Schemas
  by Mohand-Said Hacid, Jean-Marc Petit and Farouk Toumani
  URL: link.springer.de/link/service/journals/10115/bibs/1003001/10030052.htm
  or   link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030052.htm

- Fuzzy User Modeling for Information Retrieval on the World Wide Web
  by Robert I. John and Gabrielle J. Mooney
  URL: link.springer.de/link/service/journals/10115/bibs/1003001/10030081.htm
  or   link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030081.htm

- An Artificial Network Simulating Cause-to-Effect Reasoning: Cancellation Interactions and Numerical Studies
  by L. Ben Romdhane, B. Ayeb, and S. Wang
  URL: link.springer.de/link/service/journals/10115/bibs/1003001/10030096.htm
  or   link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030096.htm

Short Papers

- Zipf's Law for Web Surfers
  by Mark Levene, Jose Borges and George Loizou
  URL: link.springer.de/link/service/journals/10115/bibs/1003001/10030120.htm
  or   link.springer-ny.com/link/service/journals/10115/bibs/1003001/10030120.htm

II. Volume 3, Number 2 (May 2001)
---------------------------------

Regular Papers

- Making Use of the Most Expressive Jumping Emerging Patterns for Classification
  by Jinyan Li, Guozhu Dong, and Kotagiri Ramamohanarao

- A Description Length Based Decision Criterion for Default Knowledge in the Ripple Down Rules Method
  by Takuya Wada, Tadashi Horiuchi, Hiroshi Motoda and Takashi Washio

- Multipass Algorithms for Mining Association Rules in Text Databases
  by J.D. Holt and S.M. Chung

- C-Net: A Method for Generating Non-Deterministic and Dynamic Multi-Variate Decision Trees
  by H.A. Abbass, M. Towsey, and G. Finn

- A Hybrid Fragmentation Approach for Distributed Deductive Database Systems
  by Seung-Jin Lim and Yiu-Kai Ng

- ActiveCBR: An Agent System that Integrates Case-based Reasoning and Active Databases
  by Sheng Li and Qiang Yang

Short Papers

- XML Indexing and Retrieval with a Hybrid Storage Model
  by Dongwook Shin

From se37 at cornell.edu Tue Feb 27 17:25:17 2001
From: se37 at cornell.edu (Shimon Edelman)
Date: Tue, 27 Feb 2001 17:25:17 -0500
Subject: a cross-disciplinary symposium on communication
Message-ID: 

----------------------------------------------------------------------
WHAT:  "From Signals to Structured Communication"
WHEN:  May 4 and 5
WHERE: Ithaca, NY
WEB:   http://kybele.psych.cornell.edu/~edelman/CogStud/May2001.html

The list of speakers at this Spring Symposium of the Cornell Cognitive Studies Program includes representatives from neurobiology, ethology, philosophy, linguistics, economics, game theory, and computer science. The emergence of a diverse, unconventional and exciting set of perspectives on communication at this event will be facilitated by focusing the presentations on certain common threads - notably, issues having to do with structure, compositional or otherwise - that run through the entire spectrum of research themes represented at the symposium.
----------------------------------------------------------------------

Shimon Edelman
Professor, Dept. of Psychology, 232 Uris Hall
Cornell University, Ithaca, NY 14853-7601, USA
Web: http://kybele.psych.cornell.edu/~edelman

From giacomo at ini.phys.ethz.ch Tue Feb 27 03:58:12 2001
From: giacomo at ini.phys.ethz.ch (Giacomo Indiveri)
Date: Tue, 27 Feb 2001 09:58:12 +0100
Subject: NEUROMORPHIC ENGINEERING WORKSHOP (second call)
Message-ID: <3A9B6C23.7A1A7D89@ini.phys.ethz.ch>

Please accept our apology for cross-postings. This is the second call for applications to the Telluride Workshop (the announcement is also available at http://www.ini.unizh.ch/telluride2000/tell2001_announcement.html).

----------------------------------------------------------------------
NEUROMORPHIC ENGINEERING WORKSHOP
Sunday, JULY 1 - Saturday, JULY 21, 2001
TELLURIDE, COLORADO
------------------------------------------------------------------------
Avis COHEN (University of Maryland)
Rodney DOUGLAS (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland)
Timmer HORIUCHI (University of Maryland)
Giacomo INDIVERI (Institute of Neuroinformatics, UNI/ETH Zurich, Switzerland)
Christof KOCH (California Institute of Technology)
Terrence SEJNOWSKI (Salk Institute and UCSD)
Shihab SHAMMA (University of Maryland)
------------------------------------------------------------------------

We invite applications for a three-week summer workshop that will be held in Telluride, Colorado from Sunday, July 1 to Saturday, July 21, 2001. The application deadline is March 7, 2001, and application instructions are described at the bottom of this document.

The 2000 summer workshop on "Neuromorphic Engineering", sponsored by the National Science Foundation, the Gatsby Foundation, the Whitaker Foundation, the Office of Naval Research, and the Center for Neuromorphic Systems Engineering at the California Institute of Technology, was an exciting event and a great success. A detailed report on the workshop is available at http://www.ini.unizh.ch/telluride2000. We strongly encourage interested parties to browse through the previous workshop web pages: http://www.ini.unizh.ch/telluride2000/

GOALS:

Carver Mead introduced the term "Neuromorphic Engineering" for a new field based on the design and fabrication of artificial neural systems, such as vision systems, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems. The goal of this workshop is to bring together young investigators and more established researchers from academia with their counterparts in industry and national laboratories, working on both neurobiological as well as engineering aspects of sensory systems and sensory-motor integration. The focus of the workshop will be on active participation, with demonstration systems and hands-on experience for all participants.

Neuromorphic engineering has a wide range of applications, from nonlinear adaptive control of complex systems to the design of smart sensors. Many of the fundamental principles in this field, such as the use of learning methods and the design of parallel hardware (with an emphasis on analog and asynchronous digital VLSI), are inspired by biological systems. However, existing applications are modest, and the challenge of scaling up from small artificial neural networks and designing completely autonomous systems at the levels achieved by biological systems lies ahead.
The assumption underlying this three-week workshop is that the next generation of neuromorphic systems would benefit from closer attention to the principles found through experimental and theoretical studies of real biological nervous systems as whole systems.

FORMAT:

The three-week summer workshop will include background lectures on systems neuroscience (in particular learning, oculo-motor and other motor systems, and attention), practical tutorials on analog VLSI design, small mobile robots (Koalas and Kheperas), hands-on projects, and special interest groups. Participants are expected to take part in, and if possible complete, at least one of the proposed projects (soon to be defined). They are furthermore encouraged to become involved in as many of the other proposed activities as interest and time allow.

There will be two lectures in the morning that cover issues that are important to the community in general. Because of the diverse range of backgrounds among the participants, the majority of these lectures will be tutorials rather than detailed reports of current research. These lectures will be given by invited speakers. Participants will be free to explore and play with whatever they choose in the afternoon. Projects and interest groups meet in the late afternoons and after dinner.

The analog VLSI practical tutorials will cover all aspects of analog VLSI design, simulation, layout, and testing during the three weeks of the workshop. The first week covers the basics of transistors and simple circuit design and simulation. This material is intended for participants who have no experience with analog VLSI. The second week will focus on design frames for silicon retinas, from the silicon compilation and layout of on-chip video scanners to building the peripheral boards necessary for interfacing analog VLSI retinas to video output monitors. Retina chips will be provided. The third week will feature sessions on floating gates, including lectures on the physics of tunneling and injection, and on inter-chip communication systems. We will also feature a tutorial on the use of small mobile robots, focusing on Koalas, as an ideal platform for vision, auditory and sensory-motor circuits.

Projects carried out during the workshop will be centered in a number of working groups, including:

* active vision
* audition
* olfaction
* motor control
* central pattern generator
* robotics, multichip communication
* analog VLSI
* learning

The active perception project group will emphasize vision and human sensory-motor coordination. Issues to be covered will include spatial localization and constancy, attention, motor planning, eye movements, and the use of visual motion information for motor control. Demonstrations will include a robot head active vision system consisting of a three-degree-of-freedom binocular camera system that is fully programmable.

The central pattern generator group will focus on small walking and undulating robots. It will look at characteristics and sources of parts for building robots, play with working examples of legged and segmented robots, and discuss CPGs and theories of nonlinear oscillators for locomotion (a toy sketch of the coupled-oscillator idea is given below). It will also explore the use of simple analog VLSI sensors for autonomous robots.

The robotics group will use rovers and working digital vision boards, as well as other possible sensors, to investigate issues of sensorimotor integration, navigation and learning.
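[Editor's note] For readers new to the CPG theme mentioned above, here is a minimal, purely illustrative sketch of the coupled-oscillator idea; it does not describe any particular Telluride project or chip, and every parameter value is invented for the example. A chain of phase oscillators with nearest-neighbour coupling and a fixed phase bias settles into a travelling wave of activity, the kind of pattern used to drive undulatory locomotion; setting the bias to zero would instead give in-phase, standing-wave-like activity.

    # Toy chain of coupled phase oscillators (illustrative only).
    import numpy as np

    n_seg = 8                      # number of body "segments" (arbitrary)
    freq = 1.0                     # intrinsic frequency of every oscillator (Hz)
    coupling = 2.0                 # nearest-neighbour coupling strength
    lag = 2 * np.pi / n_seg        # desired neighbour phase lag: one full wave along the body
    dt, steps = 0.001, 5000

    phase = np.random.uniform(0, 2 * np.pi, n_seg)
    for _ in range(steps):
        dphi = 2 * np.pi * freq * np.ones(n_seg)
        # ascending and descending coupling, each biased by the desired lag
        dphi[1:]  += coupling * np.sin(phase[:-1] - phase[1:] - lag)
        dphi[:-1] += coupling * np.sin(phase[1:] - phase[:-1] + lag)
        phase += dt * dphi

    # neighbouring phase differences settle near -lag, i.e. a travelling wave
    diffs = (np.diff(phase) + np.pi) % (2 * np.pi) - np.pi
    print(np.round(diffs, 2))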
The audition group aims to develop biologically plausible algorithms and aVLSI implementations of specific auditory tasks such as source localization and tracking, and sound pattern recognition. Projects will be integrated with visual and motor tasks in the context of a robot platform.

The multichip communication project group will use existing interchip communication interfaces to program small networks of artificial neurons to exhibit particular behaviors such as amplification, oscillation, and associative memory. Issues in multichip communication will be discussed.

LOCATION AND ARRANGEMENTS:

The workshop will take place in the small town of Telluride, 9000 feet high in Southwest Colorado, about a 6-hour drive (350 miles) from Denver. America West and United Airlines provide daily flights directly into Telluride. All facilities within the beautifully renovated public school building are fully accessible to participants with disabilities. Participants will be housed in ski condominiums, within walking distance of the school. Participants are expected to share condominiums.

The workshop is intended to be very informal and hands-on. Participants are not required to have had previous experience in analog VLSI circuit design, computational or machine vision, systems-level neurophysiology or modeling the brain at the systems level. However, we strongly encourage active researchers with relevant backgrounds from academia, industry and national laboratories to apply, in particular if they are prepared to work on specific projects, talk about their own work or bring demonstrations to Telluride (e.g. robots, chips, software).

Internet access will be provided. Technical staff present throughout the workshop will assist with software and hardware issues. We will have a network of workstations running UNIX and PCs running LINUX and Microsoft Windows. No cars are required. Bring hiking boots, warm clothes and a backpack, since Telluride is surrounded by beautiful mountains. Unless otherwise arranged with one of the organizers, we expect participants to stay for the entire duration of this three-week workshop.

FINANCIAL ARRANGEMENT:

Notification of acceptance will be mailed out around March 9, 2001. Participants are expected to pay a $275.00 workshop fee at that time in order to reserve a place in the workshop. The cost of a shared condominium will be covered for all academic participants, but upgrades to a private room will cost extra. Participants from national laboratories and industry are expected to pay for these condominiums. Travel reimbursement of up to $500 for US domestic travel and up to $800 for overseas travel will be possible if financial help is needed (please specify on the application).

HOW TO APPLY:

Applicants should be at the level of graduate students or above (i.e. postdoctoral fellows, faculty, research and engineering staff and the equivalent positions in industry and national laboratories). We actively encourage qualified women and minority candidates to apply.

Applications should include:
* First name, last name, valid email address.
* Curriculum vitae.
* One-page summary of background and interests relevant to the workshop.
* Description of special equipment needed for demonstrations that could be brought to the workshop.
* Two letters of recommendation.

Complete applications should be sent to:

Terrence Sejnowski
The Salk Institute
10010 North Torrey Pines Road
San Diego, CA 92037
email: telluride at salk.edu
FAX: (858) 587 0417

DEADLINE: March 7, 2001

Applicants will be notified by email around March 21, 2001.

From vercher at laps.univ-mrs.fr Tue Feb 27 06:28:15 2001
From: vercher at laps.univ-mrs.fr (Jean-Louis Vercher)
Date: Tue, 27 Feb 2001 12:28:15 +0100
Subject: Conference announcement (Marseille 2001)
Message-ID: 

SORRY FOR MULTIPLE POSTINGS
---------------------------

CONFERENCE ANNOUNCEMENT
3rd INTERNATIONAL CONFERENCE ON SENSORIMOTOR CONTROLS IN MEN AND MACHINES

We are pleased to announce the IIIrd Conference on Sensorimotor Controls in Men and Machines, which will take place on Friday 5 - Saturday 6 October 2001 at the Palais du Pharo, Marseille, France. The first conference was held in Berkeley in 1994 (Starkfest), the second in Chicago in 1997.

For details, including the list of speakers, and to obtain your registration form, visit the conference web site at http://www.laps.univ-mrs.fr/umr/workshop/index.html or contact the conference secretary:

Nathalie FENOUIL
mailto:workshop at laps.univ-mrs.fr
+33 [0] 491 17 22 50

Note that the deadline for abstract submission is April 31. Confirmation of acceptance will be forwarded one month after the deadline.

From rosi-ci0 at wpmail.paisley.ac.uk Tue Feb 27 11:15:56 2001
From: rosi-ci0 at wpmail.paisley.ac.uk (Roman Rosipal)
Date: Tue, 27 Feb 2001 16:15:56 +0000
Subject: New TR
Message-ID: 

Dear Connectionists,

The following TR is now available at my home page:

Kernel Partial Least Squares Regression in RKHS
Roman Rosipal and Leonard J Trejo

Abstract
A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the Kernel Partial Least Squares (PLS) regression model. Similar to Principal Components Regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the latent variables (components). However, in contrast to PCR, PLS creates the components by modeling the relationship between input and output variables while maintaining most of the input variables' information. PLS is considered to be useful in situations where the number of explanatory variables exceeds the number of observations and/or a high level of multicollinearity among those variables is assumed. Motivated by this fact, we will provide a Kernel PLS algorithm for the construction of non-linear regression models in possibly high-dimensional feature spaces. We give the theoretical description of the Kernel PLS algorithm and we experimentally compare the algorithm with the existing Kernel PCR and Kernel Ridge Regression techniques. We will demonstrate that on the data sets employed Kernel PLS achieves the same results but, in comparison to Kernel PCR, uses significantly fewer, qualitatively different components.

You can download a gzipped PostScript version from
http://cis.paisley.ac.uk/rosi-ci0/Papers/TR01_1.ps.gz

Any comments and remarks are very welcome.

Roman Rosipal
University of Paisley, CIS Department
Paisley, PA1 2BE, Scotland, UK
http://cis.paisley.ac.uk/rosi-ci0
e-mail: rosi-ci0 at paisley.ac.uk
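[Editor's note] To make the projection-onto-components idea in the abstract above concrete, here is a minimal numpy sketch in the spirit of kernel PLS. It is not the algorithm from the TR (see the paper for the exact derivation, deflation scheme and out-of-sample prediction); it only extracts a few latent score vectors from a centred Gram matrix with a NIPALS-style iteration and then regresses the outputs on those scores to report a training fit. The RBF kernel, the toy data and all parameter values are assumptions made up purely for illustration.

    # Minimal kernel-PLS-style sketch (training fit only); see the TR above
    # for the full algorithm.  Everything here is a toy.
    import numpy as np

    def rbf_kernel(X, width=1.0):
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    def centre(K):
        n = K.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n
        return J @ K @ J

    def kpls_scores(K, Y, n_comp=3, n_iter=50):
        """Extract orthonormal latent score vectors from a centred Gram matrix."""
        K, Y = K.copy(), Y.copy()
        scores = []
        for _ in range(n_comp):
            u = Y[:, :1].copy()
            for _ in range(n_iter):          # NIPALS-style inner loop
                t = K @ u
                t /= np.linalg.norm(t)
                u = Y @ (Y.T @ t)
                u /= np.linalg.norm(u)
            scores.append(t.ravel())
            P = np.eye(len(t)) - t @ t.T     # deflate K and Y by the extracted score
            K = P @ K @ P
            Y = P @ Y
        return np.column_stack(scores)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 5))
    Y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(60, 1))

    K = centre(rbf_kernel(X))
    T = kpls_scores(K, Y, n_comp=3)
    coef, *_ = np.linalg.lstsq(T, Y, rcond=None)  # regress outputs on latent scores
    resid = Y - T @ coef
    print("training R^2:", 1 - resid.var() / Y.var())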
From horwitzb at mail.nih.gov Wed Feb 28 13:50:19 2001
From: horwitzb at mail.nih.gov (Horwitz, Barry (NIDCD))
Date: Wed, 28 Feb 2001 13:50:19 -0500
Subject: postdoctoral position available
Message-ID: <45120BC2AC24D4119B7100508B9506D48F6A97@nihexchange5.nih.gov>

National Institute on Deafness and Other Communication Disorders
National Institutes of Health

Postdoctoral Fellowship in Neural Modeling of Human Functional Neuroimaging Data

A two-year postdoctoral fellowship under the supervision of Dr. Barry Horwitz is available immediately for developing and applying computational neuroscience modeling methods to in vivo human functional neuroimaging data, obtained from functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). PET and fMRI facilities are available to implement and test hypotheses derived from the modeling. Two areas of research are possible, depending on the background and interests of the fellow: (1) extending the method of structural equation modeling (path analysis) to deal with fMRI designs and to improve the use of this method for network analysis; (2) extending a large-scale, neurobiologically realistic computational model with the goal of understanding the relation between functional neuroimaging data and the underlying electrophysiological behavior of multiple interconnected neuronal populations. See Trends in Cognitive Sciences 3: 91-98, 1999 for examples of these types of research.

Knowledge of neural modeling or statistical techniques, and programming experience, are required. A PhD or MD degree is required. The position is in the Language Section, Voice, Speech and Language Branch, NIDCD, NIH, Bethesda, MD, USA.

For further information, contact: Dr. Barry Horwitz, Bldg. 10, Rm. 6C420, National Institutes of Health, Bethesda, MD 20892, USA. Tel. 301-594-7755; FAX: 301-480-5625; Email: horwitz at helix.nih.gov.

----------------------------------------------------------------------------
Barry Horwitz, Ph.D.
Senior Investigator
Language Section, Voice, Speech and Language Branch
National Institute on Deafness and other Communication Disorders
National Institutes of Health
Bldg. 10, Rm. 6C420 MSC 1591
Bethesda, MD 20892 USA
Tel. 301-594-7755
FAX 301-480-5625
horwitz at helix.nih.gov
http://www.nidcd.nih.gov/intram/scientists/horwitzb.htm

From kap-listman at wkap.nl Wed Feb 28 20:25:32 2001
From: kap-listman at wkap.nl (kap-listman@wkap.nl)
Date: Thu, 01 Mar 2001 02:25:32 +0100 (MET)
Subject: New Issue: Neural Processing Letters. Vol. 13, Issue 1
Message-ID: <200103010125.CAA08052@wkap.nl>

Kluwer ALERT, the free notification service from Kluwer Academic/PLENUM Publishers and Kluwer Law International
------------------------------------------------------------

Neural Processing Letters
ISSN 1370-4621
http://www.wkap.nl/issuetoc.htm/1370-4621+13+1+2001
Vol. 13, Issue 1, February 2001.
TITLE: The Neural Solids: For Optimization Problems
AUTHOR(S): Giansalvo Cirrincione, Maurizio Cirrincione
KEYWORD(S): constrained optimization, essential matrix decomposition, Hopfield networks, matrix decomposition, polar decomposition, self organization, structure from motion
PAGE(S): 1-15

TITLE: Efficient Vector Quantization Using the WTA-Rule with Activity Equalization
AUTHOR(S): Gunther Heidemann, Helge Ritter
KEYWORD(S): clustering, codebook generation, competitive learning, neural gas, unsupervised learning, vector quantization, winner takes all
PAGE(S): 17-30

TITLE: Probability Density Estimation Using Adaptive Activation Function Neurons
AUTHOR(S): Simone Fiori, Paolo Bucciarelli
KEYWORD(S): adaptive activation function neurons, cumulative distribution function, differential entropy, probability density function, stochastic gradient
PAGE(S): 31-42

TITLE: A Simple Neural Network Pruning Algorithm with Application to Filter Synthesis
AUTHOR(S): Kenji Suzuki, Isao Horiba, Noboru Sugie
KEYWORD(S): generalization ability, image enhancement, neural filter, optimal structure, redundancy removal, right function, signal processing
PAGE(S): 43-53

TITLE: An Annealed Chaotic Competitive Learning Network with Nonlinear Self-feedback and Its Application in Edge Detection
AUTHOR(S): Jzau-Sheng Lin, Ching-Tsorng Tsai, Jiann-Shu Lee
KEYWORD(S): annealed chaotic competitive learning network, chaotic dynamic, edge detection, competitive learning network, simulated annealing
PAGE(S): 55-69

TITLE: Storage Capacity of the Exponential Correlation Associative Memory
AUTHOR(S): Richard C. Wilson, Edwin R. Hancock
KEYWORD(S): exponential correlation associative memory, storage capacity
PAGE(S): 71-80

TITLE: Learning Algorithm and Retrieval Process for the Multiple Classes Random Neural Network Model
AUTHOR(S): Jose Aguilar
KEYWORD(S): color pattern recognition, learning algorithm, multiple classes random neural network, retrieval process
PAGE(S): 81-91

--------------------------------------------------------------
Thank you for your interest in Kluwer's books and journals.

NORTH, CENTRAL AND SOUTH AMERICA
Kluwer Academic Publishers
Order Department, PO Box 358
Accord Station, Hingham, MA 02018-0358 USA
Telephone (781) 871-6600
Fax (781) 681-9045
E-Mail: kluwer at wkap.com

EUROPE, ASIA AND AFRICA
Kluwer Academic Publishers
Distribution Center
PO Box 322
3300 AH Dordrecht
The Netherlands
Telephone 31-78-6392392
Fax 31-78-6546474
E-Mail: orderdept at wkap.nl
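[Editor's note] Returning to the NIDCD posting earlier in this digest: the structural equation modeling (path analysis) approach it mentions reduces, in the simplest recursive case with assumed anatomical connections and uncorrelated disturbances, to estimating path coefficients from the covariances among regional time series. Purely as an illustrative sketch (this is not Dr. Horwitz's method or code; the regions, connections and coefficients are invented), the snippet below recovers the path coefficients of a small three-region model by regressing each region on its assumed afferents, a standard consistent estimator in this simple recursive setting.

    # Toy path-analysis sketch: estimate path coefficients of a small recursive
    # "network" from simulated regional time series.  All values are made up.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500                                    # number of scans / time points

    # ground-truth recursive model:  A -> B,  A -> C,  B -> C
    A = rng.normal(size=n)
    B = 0.7 * A + 0.5 * rng.normal(size=n)
    C = 0.4 * A + 0.3 * B + 0.5 * rng.normal(size=n)

    def path_coeffs(target, parents):
        """OLS fit of one region on its assumed afferent regions."""
        X = np.column_stack(parents)
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        return beta

    print("A->B:", path_coeffs(B, [A]))           # expect roughly 0.7
    print("A->C, B->C:", path_coeffs(C, [A, B]))  # expect roughly 0.4, 0.3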