From ted.carnevale at yale.edu Sat May 1 13:21:36 1999 From: ted.carnevale at yale.edu (Ted Carnevale) Date: Sat, 01 May 1999 13:21:36 -0400 Subject: NEURON Summer Course Message-ID: <372B381F.1E4C@yale.edu> COURSE ANNOUNCEMENT What: "The NEURON Simulation Environment" (1999 NEURON Summer Course) When: Saturday, July 31, through Wednesday, August 4, 1999 Where: San Diego Supercomputer Center University of California at San Diego, CA Organizers: N.T. Carnevale and M.L. Hines Faculty includes: N.T. Carnevale, M.L. Hines, W.W. Lytton, and T.J. Sejnowski Description: This intensive hands-on course covers the design, construction, and use of models in the NEURON simulation environment. It is intended primarily for those who are concerned with models of biological neurons and neural networks that are closely linked to empirical observations, e.g. experimentalists who wish to incorporate modeling in their research plans, and theoreticians who are interested in the principles of biological computation. The course is designed to be useful and informative for registrants at all levels of experience, from those who are just beginning to those who are already quite familiar with NEURON or other simulation tools. Registration is limited to 20 and the deadline is Thursday, July 1, 1999. For more information see http://www.neuron.yale.edu/sdsc99/sdsc99.htm or contact Ted Carnevale Psychology Dept. Box 208205 Yale University New Haven CT 06520-8205 USA phone 203-432-7363 fax 203-432-7172 email ted.carnevale at yale.edu Supported in part by the National Science Foundation. This course is not sponsored by the University of California. --Ted From annimab at www.phil.gu.se Mon May 3 16:28:48 1999 From: annimab at www.phil.gu.se (ANNIMAB) Date: Mon, 3 May 1999 22:28:48 +0200 Subject: No subject Message-ID: Below you can find information about a new conference of interest for researchers in medicine, biology, statistics, AI and artificial neural networks. 
(Our apologies if you receive more than one copy of this message!) For more information about the conference see: http://www.phil.gu.se/annimab.html First announcement: ANNIMAB-1 an international conference on Artificial Neural Networks In Medicine And Biology Gothenburg, Sweden, May 13-16, 2000 Artificial neural network (ANN) techniques are currently being used for many data analysis and modelling tasks in clinical medicine as well as in theoretical biology, and the possible applications of ANNs in these fields are countless. The ANNIMAB-1 conference will summarise the state of the art, analyse the relations between ANN techniques and other available methods, and point to possible future biological and medical uses of ANNs. It will have three main themes: 1) Medical applications of artificial neural networks: for better diagnoses and outcome predictions from clinical and laboratory data, in the analysis of ECG and EEG signals, in medical image analysis, for the handling of medical databases, etc. 2) Uses of ANNs in biology outside clinical medicine: for example, in models of ecology and evolution, for data analysis in molecular biology, in simulations of cell signalling mechanisms, and (of course) in models of animal and human nervous systems and their capabilities. 3) Theoretical aspects: recent developments in ANN techniques, ANNs in relation to AI and to traditional statistical procedures, possible roles of ANNs in the medical decision process, etc. Hybrid systems and integrative approaches, such as those involving Bayesian belief nets, will receive special attention. Among the keynote speakers are: Wayne Getz (Berkeley); Teuvo Kohonen (Helsinki); Anders Lansner (Stockholm); Paulo Lisboa (Liverpool). The size of the conference, which starts at 2PM on Saturday, May 13 and ends at 3PM on Tuesday, May 16, will be limited to 250 participants. 
The second announcement and call for papers is scheduled for August 15, 1999, and the deadline for abstract submissions is October 15, 1999. ANNIMAB-S the Artificial Neural Networks In Medicine And Biology Society Dept of Philosophy, Gothenburg University Box 200, SE-405 30 Gothenburg, Sweden phone (+46) 31 773 5573 fax (+46) 31 773 5159 e-mail: annimab at www.phil.gu.se http://www.phil.gu.se/annimab.html From mitra at its.caltech.edu Tue May 4 02:38:27 1999 From: mitra at its.caltech.edu (Partha Mitra) Date: Mon, 3 May 1999 23:38:27 -0700 (PDT) Subject: No subject Message-ID: <199905040638.XAA27052@stucco.cco.caltech.edu> Workshop on Analysis of Neural Data ________________________________________________ Modern methods and open issues in the analysis and interpretation of time-series and imaging data in the neurosciences ___________________________________________________ >> 16 August - 28 August 1999 >> Marine Biological Laboratories - Woods Hole, MA ___________________________________________________ A working group of scientists committed to quantitative approaches to problems in neuroscience will focus their efforts on experimental and theoretical issues related to the analysis of large, multi-channel data sets. The motivation for the work group is based on issues that arise in two complementary areas critical to an understanding of brain function. The first involves advanced signal processing methods, particularly those appropriate for emerging multi-site recording techniques and noninvasive imaging techniques. The second involves the development of a calculus to study the dynamical behavior of nervous systems and the computations they perform. A distinguishing feature of the work group will be the close collaboration between experimentalists and theorists, particularly with regard to the analysis of data and the planning of experiments. 
The work group will have a small number of pedagogical lectures, supplemented by tutorials on relevant computational and mathematical techniques. This work group is a means to critically evaluate techniques for the processing of multi-channel data, of which imaging forms an important category. Such techniques are of fundamental importance for basic research and medical diagnostics. We have begun to establish a repository of these techniques to ensure the rapid dissemination of modern analytical techniques throughout the neuroscience community. The work group convenes on a yearly basis. In 1999, we will continue to focus on topics in the analysis of multivariate time series data, consisting of both continuous and point processes. In addition, we will have two specialized programs on neuronal instrumentation: * 21 August 1999 - Multisite recording of extracellular cortical potentials with Si-based probes. This is offered in collaboration with the Center for Neural Communication Technology at the University of Michigan. * 26 August 1999 - A comparison of analysis techniques for fMRI data. Participants: Twenty-five participants, both experimentalists and theorists. Experimentalists are specifically encouraged to bring data records to the work group; appropriate computational facilities will be provided. The work group will further take advantage of interested investigators concurrently present at the MBL. We encourage graduate students and postdoctoral fellows as well as senior researchers to apply. Participant Fee: $250. Support: National Institutes of Health - NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA, and NINDS. Organizers: David Kleinfeld (UCSD) and Partha P. Mitra (Bell Laboratories). Website: http://www.vis.caltech.edu/~WAND Application: Send a copy of your curriculum vitae, together with a cover letter that contains a brief (ca. 200 word) paragraph on why you wish to attend the work group to: Ms. Jean B. 
Ainge Bell Laboratories, Lucent Technologies 700 Mountain Avenue 1D-427 Murray Hill, NJ 07974 908-582-4702 (fax) or The MBL is an EEO AAI. Graduate students and postdoctoral fellows are encouraged to include a letter of support from their research advisor. Shared accommodations and board will be provided. Applications must be received by 21 May 1999. Participants will be notified by 7 June 1999. The Archives for Neurosciences can be found at: http://xxx.lanl.gov/archive/neuro-sys From tnatschl at igi.tu-graz.ac.at Tue May 4 05:30:06 1999 From: tnatschl at igi.tu-graz.ac.at (Thomas Natschlaeger) Date: Tue, 4 May 1999 11:30:06 +0200 (CEST) Subject: Fast analog computation with unreliable synapses Message-ID: Announcement of a new paper by Wolfgang Maass and Thomas Natschlaeger: "A model for fast analog computation based on unreliable synapses" Neural Computation, 1999, in press. Abstract: We investigate through theoretical analysis and computer simulations the consequences of unreliable synapses for fast analog computations in networks of spiking neurons, with analog variables encoded by the current firing activities of pools of spiking neurons. Our results suggest that the known unreliability of synaptic transmission may be viewed as a useful tool for analog computing, rather than as a ``bug'' in neuronal hardware. We also investigate computations on time series and Hebbian learning in this context of space-rate coding. 
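[Editorial note] The space-rate coding idea in the abstract above can be illustrated with a minimal sketch (an editorial illustration, not the authors' model): an analog value is encoded as the fraction of active neurons in a pool, each synapse transmits unreliably, and pool averaging still recovers the value. The pool size, release probability, and decoder below are assumptions made purely for illustration.

```python
import random

def encode_pool_activity(x, n_neurons):
    """Encode analog value x in [0, 1] as a spike pattern across a pool:
    each neuron fires independently with probability x (space-rate code)."""
    return [random.random() < x for _ in range(n_neurons)]

def transmit(spikes, release_prob):
    """Each spike crosses its synapse only with probability release_prob,
    modelling unreliable synaptic transmission."""
    return [s and (random.random() < release_prob) for s in spikes]

def decode(transmitted, n_neurons, release_prob):
    """Estimate the encoded analog value from the transmitted spikes."""
    return sum(transmitted) / (n_neurons * release_prob)

random.seed(0)
n, p = 10_000, 0.3          # pool size and synaptic release probability (assumed)
x = 0.6                     # analog value to transmit
estimate = decode(transmit(encode_pool_activity(x, n), p), n, p)
# pool averaging recovers x to within a few percent despite p < 1
```

The point of the sketch is that transmission noise averages out across the pool, so computation can stay fast and accurate even though individual synapses are unreliable.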
This paper is available as gzipped postscript file (26 pages, 211Kb) from http://www.cis.tu-graz.ac.at/igi/maass/#Publications (see #102) or http://www.cis.tu-graz.ac.at/igi/tnatschl/Publications.html Sincerely Thomas Natschlaeger ********************************************************* ** ** ** Thomas Natschlaeger ** ** Institute for Theoretical Computer Science ** ** Technische Universitaet Graz ** ** Klosterwiesgasse 32/2 ** ** A - 8010 Graz, Austria ** ** email: tnatschl at igi.tu-graz.ac.at ** ** www: http://www.cis.tu-graz.ac.at/igi/tnatschl/ ** ** Tel: ++43 316 873 5814 ** ** Fax: ++43 316 873 5805 ** ** ** ********************************************************* From oby at cs.tu-berlin.de Tue May 4 07:19:17 1999 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Tue, 4 May 1999 13:19:17 +0200 (MET DST) Subject: No subject Message-ID: <199905041119.NAA20539@pollux.cs.tu-berlin.de> Dear Connectionists, below please find abstract and preprint-location of a recent paper on modelling contrast adaptation in primary visual cortex. Cheers Klaus -------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ -------------------------------------------------------------------- -------------------------------------------------------------------- Contrast Adaptation and Infomax in Visual Cortical Neurons Peter Adorjan, Christian Piepenbrock, and Klaus Obermayer CS Department, Technical University of Berlin, Berlin, Germany In the primary visual cortex (V1) the contrast response function of many neurons saturates at high contrast and adapts depending on the visual stimulus. We propose that both effects--contrast saturation and adaptation--can be explained by a fast and a slow component in the synaptic dynamics. 
In our model the saturation is an effect of fast synaptic depression with a recovery time constant of about 200 ms. Fast synaptic depression leads to a contrast response function with a high gain for only a limited range of contrast values. Furthermore, we propose that slow adaptation of the transmitter release probability at the geniculocortical synapses is the underlying neural mechanism that accounts for contrast adaptation on a time scale of about 7 sec. For the functional role of contrast adaptation we hypothesize that it serves to achieve the best visual cortical representation of the geniculate input. This representation should maximize the mutual information between the cortical activity and the geniculocortical input by increasing the release probability in a low contrast environment. We derive an adaptation rule for the transmitter release probability based on this infomax principle. We show that changes in the transmitter release probability may compensate for changes in the variance of the geniculate inputs--an essential requirement for contrast adaptation. Also, we suggest that increasing the release probability in a low contrast environment is beneficial for signal extraction, because neurons remain sensitive only to an increase in the presynaptic activity if it is synchronous and, therefore, likely to be stimulus related. Our hypotheses are tested in numerical simulations of a network of integrate-and-fire neurons for one column of V1 using fast synaptic depression and slow synaptic adaptation. The simulations show that changing the synaptic release probability of the geniculocortical synapses is a better model for contrast adaptation than the adaptation of the synaptic weights: only in the case of changing the transmitter release probability does our model reproduce the experimental finding that the average membrane potential (DC component) adapts much more strongly than the stimulus modulated component (F1 component). 
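[Editorial note] The fast-depression mechanism described above can be sketched with a simple resource model (a hedged illustration in the spirit of standard depressing-synapse models, not the paper's own simulation; the parameter values and the regular spike train are assumptions):

```python
import math

def depressing_synapse_response(rate_hz, tau_rec=0.2, u=0.5, t_total=2.0):
    """Mean synaptic drive from a regular spike train through a depressing
    synapse: each spike consumes a fraction u of a resource r, which then
    recovers with time constant tau_rec (~200 ms, as in the abstract)."""
    if rate_hz == 0:
        return 0.0
    isi = 1.0 / rate_hz
    r, total = 1.0, 0.0
    for _ in range(int(t_total * rate_hz)):
        total += u * r                                   # amount transmitted this spike
        r -= u * r                                       # resource consumed
        r = 1.0 - (1.0 - r) * math.exp(-isi / tau_rec)   # exponential recovery
    return total / t_total                               # drive per second

low, high, very_high = (depressing_synapse_response(f) for f in (10, 50, 100))
# drive grows sublinearly: doubling the input rate from 50 to 100 Hz
# increases the output far less than going from 10 to 50 Hz did
```

This reproduces the qualitative saturation the abstract attributes to fast depression: the response has high gain only over a limited range of input (contrast) values.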
In the case of changing synaptic weights, however, the average membrane potential (DC) as well as the stimulus modulated component (F1 component) would adapt. Furthermore, changing the release probability at the recurrent cortical synapses cannot account for contrast adaptation, but could be responsible for establishing the oscillatory activity often observed in recordings from visual cortical cells. Rev. Neurosci., 1999, in press. Available at: http://ni.cs.tu-berlin.de/publications/ From ormoneit at stat.Stanford.EDU Tue May 4 13:45:46 1999 From: ormoneit at stat.Stanford.EDU (Dirk Ormoneit) Date: Tue, 4 May 1999 10:45:46 -0700 (PDT) Subject: New TR on Kernel-Based Reinforcement Learning Message-ID: <199905041745.KAA26003@rgmiller.Stanford.EDU> The following technical report is now available on-line at http://www-stat.stanford.edu/~ormoneit/tr-1999-8.ps Best, Dirk ------------------------------------------------------------------ KERNEL-BASED REINFORCEMENT LEARNING by Dirk Ormoneit and Saunak Sen Kernel-based methods have recently attracted increased attention in the machine learning literature as reliable tools to attack regression and classification tasks. In this work, we consider a kernel-based approach to reinforcement learning that will be shown to produce a consistent estimate of the true value function in a continuous Markov Decision Process. Typically, consistency cannot be obtained using parametric value function estimates such as neural networks. As further contributions, we derive the asymptotic distribution of the kernel-based estimate and establish optimal convergence rates. The asymptotic distribution is then used to derive a formula for the asymptotic bias inherent in the kernel-based approximation. Although reinforcement learning estimates are generally biased due to the maximum operator involved, this is, to our knowledge, the first theoretical result in this spirit. 
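[Editorial note] The flavor of kernel-based value estimation can be sketched as follows (an editorial single-action simplification, i.e. policy evaluation with Nadaraya-Watson-style kernel weights over sampled transitions; the report itself also handles the maximization over actions, and the toy chain and parameters here are made up):

```python
import math

def gaussian_kernel(x, xi, h):
    """Unnormalized Gaussian kernel with bandwidth h."""
    return math.exp(-((x - xi) ** 2) / (2 * h * h))

def kernel_value_iteration(transitions, gamma=0.9, h=0.3, sweeps=100):
    """Approximate the value function of a continuous-state Markov chain
    from sampled transitions (x, r, x_next): the value at any state is a
    kernel-weighted average over the values at sampled states, and each
    sampled value is backed up through its observed transition."""
    xs = [t[0] for t in transitions]
    v = [0.0] * len(transitions)          # value estimate at each sampled state
    for _ in range(sweeps):
        def v_hat(x, v=v):                # smooth the current estimates
            ws = [gaussian_kernel(x, xi, h) for xi in xs]
            return sum(w * vi for w, vi in zip(ws, v)) / sum(ws)
        v = [r + gamma * v_hat(x_next) for (_, r, x_next) in transitions]
    return v

# toy chain on [-1, 1]: reward 1 near state 0, and states drift toward 0
transitions = [(x / 10, 1.0 if abs(x) < 2 else 0.0, 0.8 * x / 10)
               for x in range(-10, 11)]
values = kernel_value_iteration(transitions)   # values[10] is the state x = 0
```

Because the kernel weights sum to one and gamma < 1, the backup is a contraction, so the sweeps converge; states near the rewarded region end up with the highest values.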
The suggested bias formulas may serve as the basis for bias correction techniques that can be used in practice to improve the estimate of the value function. -------------------------------------------- Dirk Ormoneit Department of Statistics, Room 206 Stanford University Stanford, CA 94305-4065 ph.: (650) 725-6148 fax: (650) 725-8977 ormoneit at stat.stanford.edu http://www-stat.stanford.edu/~ormoneit/ From reggia at cs.umd.edu Tue May 4 15:29:16 1999 From: reggia at cs.umd.edu (James A. Reggia) Date: Tue, 4 May 1999 15:29:16 -0400 (EDT) Subject: Postdoc position, computational neurosci., language Message-ID: <199905041929.PAA25542@avion.cs.umd.edu> (The following includes computational modeling related to the neuroscience of language and its disorders. Please direct questions etc. to Dr. Rita Berndt as indicated below.) Post-doctoral Fellowship in the Cognitive Neuroscience of Language and its Disorders Two-year post-doctoral fellowship available after July 1, 1999, at the University of Maryland School of Medicine, in Baltimore, Maryland. Training opportunities will provide experience in the application of contemporary research methods (including computational modeling, cognitive neuropsychology, event-related potentials and functional neuroimaging) to the topic of normal and disordered language processing. Applicants with doctoral degrees in related basic science areas (cognitive psychology, neuroscience, linguistics, computer science, etc.) and clinical disciplines (speech/language pathology; clinical neuropsychology) are invited to apply. Questions may be directed to rberndt at umaryland.edu. To apply, send HARD COPIES of C.V., names and addresses of three referees, and statement of research interests and career goals to: Rita S. Berndt, Ph.D., Department of Neurology, University of Maryland School of Medicine, 22 South Greene Street, Baltimore, Maryland 21201. Applications should be received by July 1, 1999, for full consideration. 
From KSTUEBER at holycross.edu Mon May 3 14:05:32 1999 From: KSTUEBER at holycross.edu (Karsten Stueber) Date: Mon, 03 May 1999 14:05:32 -0400 Subject: SPP program, housing and conference registration Message-ID: This e-mail contains information about the preliminary program, on-campus housing, and conference registration. It also contains a Conference Pre-registration form and an On-Campus Housing reservation form. On-Campus Housing is limited. You are therefore advised to reserve a room ASAP (at the latest by May 24, 1999). Please note also that the On-Campus Reservation and Conference Registration have to be sent to different addresses. For information about the finalized program and further travel information please check the SPP conference web page at http://www.hfac.uh.edu/cogsci/spp/wwwanlmt.htm PRELIMINARY PROGRAM OF THE 25th ANNIVERSARY MEETING OF THE SOCIETY FOR PHILOSOPHY AND PSYCHOLOGY (For the exact program, please check the SPP website in the first week of May) STANFORD UNIVERSITY JUNE 19-22, 1999 The first panel will be on Saturday, June 19, 1999 at 3:30pm. The last session will be on Tuesday morning, June 22, 1999. 
SYMPOSIA: Symposium I: Theory of Mind: Infants and Primates Confirmed speakers: Alison Gopnik (Psychology, UC Berkeley) Daniel Povinelli (New Iberia Research Center, USL) Symposium II: Frontiers in Cognitive Neuroscience Confirmed speakers: Gregory McCarthy (Brain Imaging & Analysis Center, Duke) Lynn Robertson (Martinez VA & UC Berkeley) Symposium III: "Then and Now in Philosophy and Psychology" Confirmed speakers: Hilary Putnam (Philosophy, Harvard) Roger Shepard (Psychology, Stanford) Symposium IV: Past, Present, and Future of SPP (Special Panel Discussion) Panelists: Patrick Suppes (Philosophy, Stanford) Daniel Dennett (Philosophy, Tufts) Stephen Stich (Philosophy, Rutgers) Kathleen Akins (Philosophy, Simon Fraser) Stevan Harnad (Cognitive Science, Southampton) INVITED SPEAKERS: Session I: Mechanism of Pain Allan Basbaum (Neuroscience, UCSF) [Other invited and symposium speakers unlisted due to pending confirmation] CONTRIBUTED SESSIONS A. Belief and Explanation Carol Slater (Alma College) No 'There' There: Ruth Millikan, Lloyd Morgan, and the Case of the Missing Indexicals Kristen Andrews and Peter Verbeek (University of Minnesota) Prediction, Explanation, and Folk Psychology B. Belief and Thought Eric Schwitzgebel (University of California, Riverside) In-Between Believing Lawrence A. Beyer (Stanford University) Do We Believe Only What We Take to Be True? C. Mind and Brain William Bechtel (Washington University, St. Louis) and Robert N. McCauley (Emory University) Heuristic Identity Theory (or Back to the Future): The Mind-Body Problem Against the Background of Research Strategies in Cognitive Neuroscience Max Velmans (University of London) How to Make Sense of the Causal Interactions Between Consciousness and the Brain Bruce Mangan (University of California, Berkeley) The Fallacy of Functional Exclusion D. Functions of the Senses Brian Keeley (Washington University, St. 
Louis / University of Northern Iowa) Making Sense of Modalities Alva Noe (University of California, Santa Cruz) What Change Blindness Really Teaches Us About Vision Bernard Baars (The Wright Institute) Criteria for Consciousness in the Brain: Methodological Implications of Recent Developments in Visual Neuroscience E. Cognition Jesse Prinz (Washington University, St. Louis) Mad Dogs and Englishmen: Concept Nativism Reconsidered Muhammad Ali Khalidi (American University) Two Models of Innateness James Blackmon, David Byrd, Robert Cummins, Pierre Poirier, Martin Roth (University of California, Davis) Systematicity and the Cognition of Structural Domains F. Representation and Pain Stephanie Beardman (Rutgers University) The Choice Between Actual and Remembered Pain Murat Aydede (University of Chicago) Pain Qualia and Representationalism William Robinson (Iowa State) Representationalism and Epiphenomenalism POSTER PRESENTATIONS Tim Bayne (University of Arizona) H. Looren de Jong (Vrije Universiteit, Amsterdam) Donald Dryden (Duke University) Sanford Goldberg (Grinnell College) Daniel Haybron (Rutgers University) David Hunter (Buffalo State University) Ariel Kernberg (University College, London) Stan Klein (UC-Berkeley) Uriah Kriegel (Brown University) John Kulvicki (University of Chicago) Justin Leiber (University of Houston) Ron Mallon (Rutgers University) Shaun Maxwell (Queens University, Canada) Lawrence Roberts (SUNY Binghamton) Teed Rockwell Peter Ross (Pomona College) James Taylor (Bowling Green State University) Charles Twardy (Indiana University) Ruediger Vaas (University of Stuttgart & University of Hohenheim) Adam Vinueza (University of Colorado) Jonathan Weinberg (Rutgers University) Josh Weisberg (City University of New York) Tadeusz Zawidzki (Washington University, St. 
Louis) Jing Zhu (University of Waterloo, Canada) Conference Pre-Registration Form Mail completed form with check or money order made payable to Society for Philosophy and Psychology, ASAP (It has to be received no later than June 8, 1999) to: Karsten Stueber, Secretary-Treasurer, SPP; Department of Philosophy; College of the Holy Cross; PO Box 137A; Worcester, MA 01610 Name Address Daytime Phone ( ) Fax( ) e-mail Institutional Affiliation Conference Registration Fee Member: $40 Nonmember: $60 Student: $15 Banquet and Presidential Address ($48 per person, including taxes and tips) $48 per person x #of persons $ August 1998-July 1999 SPP Membership Dues (New Members pay member conference registration fees) Regular Member: $25 Student: $5 Contributions to William James Graduate Student Award $ Total $ On Campus Accommodations: Rates: $51.00 per night for a single; $39.00 per person per night for shared. These rates include daily continental breakfast and weekly maid service (beds are made upon arrival but only common areas are cleaned each day). Linen and towels will be provided but you may wish to bring extra towels since we will not have daily maid service. Reservations: All rooms must be paid in advance. Only three-day packages may be reserved. That is, each on campus resident must pay, in advance, for (at least) three nights, regardless of intended check-in/check-out date. In order to reserve, please send a check for a single room on campus ($153.00) or a double room on campus ($117/person). In order to be assured a room, your check MUST be received no later than May 24, 1999. Please fill out the On Campus Housing registration form and make sure to indicate your name, address, phone number, gender, and roommate preference, if any. Roommate Preferences: If you have a preferred roommate, please indicate the person with whom you intend to room. If you wish to be assigned a roommate, please indicate your gender. 
Check-in Time: 1-3 at American Studies House in the Governor's Corner Complex. Late Check-in: Late arrivals will check in at the Summer Conference Office at the Elliot Program Center, near the Governor's Corner Complex. Check-out: By noon, Tuesday, June 22. Additional Nights: Additional nights may be arranged with Stanford Summer Conference Services at the rate of $43.75/night for singles and $32.00/night for doubles (no continental breakfast included). Those requesting additional nights will pay the Summer Conference Services directly for additional nights. If you wish additional nights, indicate so in writing and we will pass word on to them. Key Deposit: Each on campus resident will be required to leave a refundable $70.00 key deposit. Special Services: Participants or attendees needing special arrangements to accommodate a disability may request accommodations by contacting Kenneth Taylor. Requests should be made by May 15, 1999: phone 650-723-2547, e-mail taylor at turing.stanford.edu, fax 650-723-0985. Phone Services: Pay phones are available in each dorm. Individual rooms do not have phones, however. On Campus Parking: Daily parking permits for $2.00/day may be purchased at check-in and at conference registration. On Campus Housing Reservation Form Mail completed form with check or money order made payable to Society for Philosophy and Psychology, ASAP (It has to be received no later than May 24, 1999) to: Kenneth Taylor Department of Philosophy Stanford University Stanford, CA 94022. Name (Mr. or Ms.) 
Address Daytime Phone ( ) Fax( ) e-mail Institutional Affiliation Name of Preferred Roommate: Reservation for a single room on campus (three days) $153.00/person Reservation for a double room on campus (three days) $117/person Total: $ Alternate Accommodations: For those preferring to stay off campus, a small number of rooms have been blocked off at the Cardinal Hotel in downtown Palo Alto, about a mile from the center of campus, but close to many fine restaurants, bars, and shopping. To reserve a room at the Cardinal call: 650-323-5101. Rates: $107 + tax for both double and single rooms. The Cardinal Hotel is located at 235 Hamilton Ave, Palo Alto, California. Be sure to mention the Society for Philosophy and Psychology to receive the Stanford Rate. This block of rooms will be released as of May 19, 1999 and will be available only on a first-come, first-served basis thereafter. For additional hotels and other visitor information consult the following web page: http://www.stanford.edu/home/visitors/index.html Program Co-Chair: Guven Guzeldere guven at aas.duke.edu. Program Co-Chair: Stevan Harnad harnad at coglit.soton.ac.uk President (1999): Brian Cantwell Smith smithbc at indiana.edu Secretary-Treasurer: Karsten Stueber kstueber at holycross.edu Local Arrangements: Kenneth Taylor taylor at csli.stanford.edu From oby at cs.tu-berlin.de Wed May 5 06:06:28 1999 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Wed, 5 May 1999 12:06:28 +0200 (MET DST) Subject: faculty position Message-ID: <199905051006.MAA29310@pollux.cs.tu-berlin.de> Dear Connectionists, the CS department of the Technical University of Berlin solicits applications for a tenured faculty position (C3) in the area of Computer Vision. Although we encourage candidates from a variety of backgrounds to apply, one potential focus is pattern recognition and neural computation. 
For information about the department and its research groups you are welcome to visit our Web-pages at: http://www.cs.tu-berlin.de/cs/index-en.html Best wishes Klaus =========================================================================== FACULTY POSITION IN COMPUTER VISION CS-Department, Technical University of Berlin, Berlin, Germany The Department of Computer Science solicits applications for a tenured faculty position (salary level C3) in the area of image acquisition, processing, and understanding. The successful candidate is expected to join the department's undergraduate teaching programs, as well as the graduate education in the area of Computer Vision. The successful candidate is expected to teach courses in German after one year. Requirements: Ph.D. degree in computer science, electrical engineering, or neighboring fields; Habilitation or equivalent achievements; research experience in the field of computer vision; a strong publication record; teaching experience; track record in acquiring research grants. Experience in application areas like biomedicine, automation and control, or robotics is desirable. Please send applications to: Search Committee (Computer Vision) FR 5-1, Department of Computer Science Technical University of Berlin Franklinstrasse 28/29 10587 Berlin, Germany The Technical University of Berlin wants to increase the percentage of women on its faculty and strongly encourages applications from qualified individuals. Handicapped persons are also encouraged to apply and will be preferred given equal qualifications. ============================================================================ Prof. Dr. 
Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ From ml_conn at infrm.kiev.ua Wed May 5 11:39:50 1999 From: ml_conn at infrm.kiev.ua (Dmitri Rachkovskij) Date: Wed, 5 May 1999 17:39:50 +0200 (UKR) Subject: Connectionist symbol processing: any progress? References: Message-ID: Keywords: distributed representation, sparse coding, binary coding, binding, variable binding, thinning, representation of structure, structured representation, recursive representation, nested representation, compositional representation, connectionist symbol processing, associative-projective neural networks. Dear Colleagues, The following paper draft (abstract enclosed) inspired by the last year's debate is available at http://cogprints.soton.ac.uk/abs/comp/199904008 Dmitri A. Rachkovskij & Ernst M. Kussul "Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning". Comments are welcome! Thank you and best regards, Dmitri Rachkovskij Abstract: Distributed representations were often criticized as inappropriate for encoding of data with a complex structure. However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider procedures of Context-Dependent Thinning which were developed for representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures provide binding of items represented by sparse binary codevectors (with low probability of 1s). Such an encoding is biologically plausible and makes it possible to reach a high information capacity of distributed associative memory where the codevectors may be stored. 
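[Editorial note] The thinning idea can be illustrated with a minimal sketch of one simple variant (an editorial illustration only; the paper describes several procedures, and the permutation count, dimension, and density here are assumptions): superimpose the constituent codevectors by OR, then keep a 1 only where at least one of several fixed random permutations of the superposition also has a 1.

```python
import random

def random_sparse(n, density, rng):
    """Random binary codevector with the given density of 1s."""
    v = [0] * n
    for i in rng.sample(range(n), int(n * density)):
        v[i] = 1
    return v

def cdt_bind(vectors, n_perms, rng):
    """Simplified Context-Dependent Thinning: OR the constituents, then
    mask the result with the OR of n_perms fixed random permutations of
    itself; fewer permutations means stronger thinning (sparser result)."""
    n = len(vectors[0])
    z = [max(bits) for bits in zip(*vectors)]          # superposition (OR)
    mask = [0] * n
    for _ in range(n_perms):
        p = rng.sample(range(n), n)                    # a fixed random permutation
        for i in range(n):
            mask[i] |= z[p[i]]
    return [zi & mi for zi, mi in zip(z, mask)]

rng = random.Random(1)
n = 5000
x, y = random_sparse(n, 0.02, rng), random_sparse(n, 0.02, rng)
bound = cdt_bind([x, y], n_perms=17, rng=rng)          # 17 perms chosen so the
density = sum(bound) / n                               # result density ~ 0.02 again
overlap_x = sum(a & b for a, b in zip(bound, x))       # bound vector still
overlap_y = sum(a & b for a, b in zip(bound, y))       # overlaps its parts
```

The sketch shows the two properties the abstract emphasizes: the bound codevector keeps roughly the density of a single constituent, and it remains similar to the constituents themselves, so structural similarity can be read off from raw overlap.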
In contrast to known binding procedures, Context-Dependent Thinning maintains the same low density (or sparseness) of the bound codevector for a varied number of constituent codevectors. Moreover, a bound codevector is not only similar to another one with similar constituent codevectors (as in other schemes), but it is also similar to the constituent codevectors themselves. This makes it possible to estimate structural similarity simply from the overlap of codevectors, without retrieving the constituent codevectors. It also allows easy retrieval of the constituent codevectors. Examples of algorithmic and neural network implementations of the thinning procedures are considered. We also present representation examples of various types of nested structured data (propositions using role-filler and predicate-arguments representation, trees, directed acyclic graphs) using sparse codevectors of fixed dimension. Such representations may provide a fruitful alternative to the symbolic representations of traditional AI, as well as to the localist and microfeature-based connectionist representations. ************************************************************************* Dmitri A. Rachkovskij, Ph.D. Net: dar at infrm.kiev.ua Senior Researcher, V.M.Glushkov Cybernetics Center, Tel: 380 (44) 266-4119 Pr. Acad. Glushkova 40, Kiev 22, 252022, UKRAINE Fax: 380 (44) 266-1570 ************************************************************************* From jf218 at hermes.cam.ac.uk Wed May 5 04:39:31 1999 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Wed, 5 May 1999 09:39:31 +0100 (BST) Subject: Paper available In-Reply-To: <372808DF.8A5DDC14@syseng.anu.edu.au> Message-ID: Dear All, You can download the following paper (*.ps.gz) from my home page (address below) Variability of firing of Hodgkin-Huxley and FitzHugh-Nagumo neurons with stochastic synaptic input Phys. Rev. Lett. 
(in press) Abstract: The variability and mean of the firing rate of Hodgkin-Huxley and FitzHugh-Nagumo neurons subjected to random synaptic input are only weakly dependent on the level of inhibitory inputs, unlike integrate-and-fire neurons. For the latter model, substantial inhibitory input is essential to ensure output variability close to Poissonian firing. Jianfeng Feng The Babraham Institute Cambridge CB2 4AT UK http://www.cus.cam.ac.uk/~jf218 From jfgf at eng.cam.ac.uk Thu May 6 04:56:39 1999 From: jfgf at eng.cam.ac.uk (J.F. Gomes De Freitas) Date: Thu, 6 May 1999 09:56:39 +0100 (BST) Subject: Paper: Bayesian Support Vectors Message-ID: Dear colleagues A paper, to appear in NNSP99, on sequential Bayesian estimation techniques for support vectors is now available at: http://svr-www.eng.cam.ac.uk/~jfgf/publications.html As I am presently writing a longer version, I would very much appreciate your feedback, especially if you have any negative comments and/or if you can answer some of the questions I pose in the paper. ABSTRACT: In this paper, we derive an algorithm to train support vector machines sequentially. The algorithm makes use of the Kalman filter and is optimal in a Bayesian framework. It extends the support vector machine paradigm to applications involving real-time and non-stationary signal processing. It also provides a computationally efficient alternative to the problem of quadratic optimisation.
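The flavour of sequential Bayesian estimation with a Kalman-style update can be sketched with a generic recursive linear-Gaussian estimator (a minimal illustration only, not the paper's algorithm for support vector machines; all function names, variable names, and parameter values below are invented for this sketch):

```python
import numpy as np

def kalman_step(w, P, x, y, obs_noise=1.0):
    """One recursive Bayesian update of the weight posterior (mean w,
    covariance P) after observing input vector x with scalar target y."""
    x = np.asarray(x, dtype=float)
    S = x @ P @ x + obs_noise          # innovation variance
    K = P @ x / S                      # Kalman gain
    w = w + K * (y - x @ w)            # corrected posterior mean
    P = P - np.outer(K, x) @ P         # corrected posterior covariance
    return w, P

# Usage: recover the weights [2, -1] from noisy streaming data,
# one observation at a time.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
w, P = np.zeros(2), np.eye(2) * 10.0   # vague Gaussian prior
for _ in range(200):
    x = rng.normal(size=2)
    y = x @ true_w + rng.normal(scale=0.1)
    w, P = kalman_step(w, P, x, y, obs_noise=0.01)
```

Each observation refines the posterior mean and covariance in closed form, which is what makes sequential estimation attractive for real-time and non-stationary settings.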
Thanks Nando _______________________________________________________________________________ JFG de Freitas (Nando) Speech, Vision and Robotics Group Information Engineering Cambridge University CB2 1PZ England http://svr-www.eng.cam.ac.uk/~jfgf Tel (01223) 302323 (H) (01223) 332754 (W) _______________________________________________________________________________ From stephen at computing.dundee.ac.uk Thu May 6 05:14:08 1999 From: stephen at computing.dundee.ac.uk (Stephen McKenna) Date: Thu, 06 May 1999 10:14:08 +0100 Subject: PhD/CASE Studentships, University of Dundee Message-ID: <37315D60.6D30C949@computing.dundee.ac.uk> PhD studentships in the areas of face recognition and human action/behaviour recognition using computer vision are currently available in the Department of Applied Computing, University of Dundee. The department was awarded a 5A rating in the latest UK research assessment exercise. A CASE award is available in collaboration with NCR and provides an additional maintenance grant to complement the EPSRC studentship rate. Applicants should send CV and references to Dr Stephen McKenna, Department of Applied Computing, University of Dundee, Scotland. Informal enquiries to stephen at computing.dundee.ac.uk From tcp1 at leicester.ac.uk Thu May 6 07:18:40 1999 From: tcp1 at leicester.ac.uk (Tim .C. Pearce) Date: Thu, 6 May 1999 12:18:40 +0100 (BST) Subject: Studentship Opportunity Message-ID: <350B7A7964@violet.le.ac.uk> Postgraduate Studentship in Neuromorphic Engineering An opportunity exists for a studentship in the broad area of neuroinformatics and sensory information processing. 
A wide range of possible research topics will be considered relating to artificial and biological olfaction (smell), including but not limited to: optical chemical sensing for artificial olfaction, signal processing for chemical sensor arrays using classical or neuromorphic pattern recognition approaches, discrete electronic or silicon implementations of neurones for sensory processing, and/or data processing of electrophysiological data from the biological olfactory bulb in order to investigate the coding of odour information. The successful candidate will hold an upper second class honours degree or better in a relevant discipline, including all areas of engineering, computer science, mathematics, chemistry, physics or biology. Computer literacy and evidence of mathematical ability will be seen as a distinct advantage. The award will include full-time fees for higher degree registration and a maintenance grant of 8,000 pounds sterling p.a., for which students may be required to contribute to laboratory demonstration within the department (up to a maximum of 6 hours/week). For further details and informal discussions contact Dr. Tim Pearce, Tel: +44 (0) 116 223 1290, E-mail: t.c.pearce at le.ac.uk. Those interested should submit a CV and a single-page statement of potential research interests to Dr. Tim C. Pearce, Department of Engineering, University of Leicester, University Road, Leicester LE1 7RH, United Kingdom. In order to guarantee consideration, applications should be submitted by June 18th, 1999, although the studentship will remain open until a suitable candidate has been found. The University of Leicester is an equal opportunities employer. Regards, Tim. -- T.C.
Pearce, PhD URL: http://www.leicester.ac.uk/engineering/ Lecturer in Bioengineering E-mail: t.c.pearce at leicester.ac.uk Department of Engineering Tel: +44 (0)116 223 1290 University of Leicester Fax: +44 (0)116 252 2619 Leicester LE1 7RH Bioengineering, Transducers and United Kingdom Signal Processing Group From murase at synapse.fuis.fukui-u.ac.jp Thu May 6 04:28:40 1999 From: murase at synapse.fuis.fukui-u.ac.jp (kazuyuki murase) Date: Thu, 06 May 1999 17:28:40 +0900 Subject: Professor Positions in Japan Message-ID: <373152B8.5A09912E@synapse.fuis.fukui-u.ac.jp> Dear Sirs; I would like to announce the following three Professor and/or Associate Professor openings in the Department of Human and Artificial Intelligent Systems at Fukui University in Japan. Potential candidates are welcome to apply, and I would appreciate it if you would pass this on to colleagues who might be interested. Sincerely yours, Kazuyuki Murase Chairman Department of Human and Artificial Intelligent Systems (HARTs) Faculty of Engineering, Fukui University 3-9-1 Bunkyo, Fukui 910-8507, Japan. Phone: (+81) 776-27-8774, Fax: (+81) 776-27-8751 E-mail: murase at synapse.fuis.fukui-u.ac.jp THREE FACULTY POSITIONS AVAILABLE The Faculty of Engineering at Fukui University, a National University of Japan, seeks candidates for three professor and/or associate professor positions in the newly established Department of Human and Artificial Intelligent Systems (HARTs). The department aims to study and teach the fundamentals and applications of systems with intelligence. It consists of three academic units, the Basic Intelligent Systems, the Applied Intelligent Systems, and the Intelligent Systems for Human Aid, and is planned to have a total of twenty-two faculty members by the fiscal year of 2002. Candidates with a variety of backgrounds related to human and artificial intelligence as well as systems science and engineering are encouraged to apply.
The areas of research include: Intelligent Robotics, Intelligent Sensing, Cognitive Science, Emergent Systems, Automatic Control, Artificial Intelligence, AI Systems, Adaptive Learning and Autonomous Systems, Evolutionary Systems, Multi-agent Systems, and others. Applicants are expected to establish an independent and highly original research program. They are to teach graduate and undergraduate courses, in which students are mostly Japanese, and to supervise student research at the undergraduate, Master's and doctoral levels. Applicants should send a curriculum vitae, copies of publications, summaries of twelve representative publications, a statement of present and future research plans and teaching interests, and the names of two references by August 31, 1999, to Dr. Kazuyuki Murase, Department of Human and Artificial Intelligent Systems, Fukui University, 3-9-1 Bunkyo, Fukui 910-8507, Japan. For informal inquiries, phone: (0776) 27-8774, fax: (0776) 27-8751. E-mail: murase at synapse.fuis.fukui-u.ac.jp From herbert.jaeger at gmd.de Thu May 6 05:43:37 1999 From: herbert.jaeger at gmd.de (Herbert Jaeger) Date: Thu, 06 May 1999 11:43:37 +0200 Subject: New paper on stochastic time series modeling Message-ID: <37316377.D915FE60@gmd.de> Dear Connectionists, I would like to announce the paper, Herbert Jaeger, "Observable operator models for discrete stochastic time series", accepted for publication by Neural Computation Abstract: A widely used class of models for stochastic systems is that of hidden Markov models. Systems which can be modeled by hidden Markov models are a proper subclass of *linearly dependent processes*, a class of stochastic systems known from mathematical investigations carried out over the last four decades. This article provides a novel, simple characterization of linearly dependent processes, called observable operator models.
The mathematical properties of observable operator models lead to a constructive learning algorithm for the identification of linearly dependent processes. The core of the algorithm has a time complexity of O(N + n m^3), where N is the size of the training data, n is the number of distinguishable outcomes of observations, and m is the model state space dimension. A preprint of the paper is available electronically at ftp://ftp.gmd.de/GMD/ais/publications/1999/jaeger.99.neco.pdf (PDF format, 410 K) ftp://ftp.gmd.de/GMD/ais/publications/1999/jaeger.99.neco.ps.gz (g'zipped PostScript format, 700 K) I would warmly appreciate your comments! Sincerely, Herbert Jaeger ---------------------------------------------------------------- Dr. Herbert Jaeger Phone +49-2241-14-2253 German National Research Center Fax +49-2241-14-2384 for Information Technology (GMD) email herbert.jaeger at gmd.de AiS.BE Schloss Birlinghoven D-53754 Sankt Augustin, Germany http://www.gmd.de/People/Herbert.Jaeger/ From grb at neuralt.com Fri May 7 09:59:33 1999 From: grb at neuralt.com (George Bolt) Date: Fri, 7 May 1999 14:59:33 +0100 Subject: Job vacancy - Neural Technologies Limited Message-ID: Neural Scientist Neural Technologies Limited is the leading UK company working in the application and exploitation of neural computing and other advanced technologies across a wide range of industrial and commercial environments. Our continued growth has created a requirement for an applied Neural Scientist to join our highly motivated team to help in the development and deployment of practical advanced computing solutions on high-profile projects. Do you want to apply your neural computing skills to solve real-world problems? Neural Technologies can offer you this opportunity - just some of the areas we work in are: * Telecommunications - fraud, churn, etc. * Finance - credit scoring, risk management, instrument trading, etc.
* Marketing - modelling and analysis * Data Analysis and Visualisation - virtual reality * Etc. You will be expected to demonstrate not only high standards of professionalism but technical innovation second to none. Self-confidence, adaptability, proactivity and communication skills are as important as the technical skills. Required skills are: * Well versed in neural network and other advanced algorithm development and their practical application, with at least 2 years' applied knowledge of at least 2 of the following: * MLP, RBF, Decision Trees, etc. * Kohonen/SOM, LVQ, etc. * Rule induction and inferencing, case-based reasoning, etc. * Evolution, GA's, etc. * Optimisation * Experienced using MATLAB * Proven problem solving abilities and system design * Good mathematical background * Able to code in C or C++ within the PC environment Experience of the following would also be an advantage: * Knowledge of conventional statistics * Signal processing techniques (e.g. speech) * Application domains (credit scoring, fraud analysis, telecommunications, banking and finance) All candidates should be working at a practical research level or have extensive industrial experience. A keen view to the commercial realities of working within a small, but fast growing, company is required. Neural Technologies Limited operate a non-smoking policy.
Contact: Julie Naylor, Technical Administrator, Neural Technologies Limited, Bedford Road, PETERSFIELD, Hampshire GU32 3QA (UK) Fax: +44 (0) 1730 260466 Phone: +44 (0) 1730 260256 Email: techadmn at neuralt.com Website: http://www.neuralt.com George Bolt Director of Product Innovation Neural Technologies Cafe Neural: http://www.neuralt.com Tel: +44 (0) 1730 260 256 Fax: +44 (0) 1730 260 466 > ********** NOTE > Any views expressed in this message are those of the individual > sender, > except where the sender specifically states them to be the views of > Neural Technologies Limited > ********** > From shai at cs.Technion.AC.IL Sun May 9 05:25:38 1999 From: shai at cs.Technion.AC.IL (Shai Ben-David) Date: Sun, 9 May 1999 12:25:38 +0300 (IDT) Subject: COLT99 program Message-ID: <199905090925.MAA15305@cs.Technion.AC.IL> Twelfth Annual Conference on Computational Learning Theory University of California at Santa Cruz July 6-9, 1999 ======================================== A PRELIMINARY PROGRAM ======================================== Tuesday, July 6 --------------- Session 1 (9:00-10:30) --------- The Robustness of the p-norm Algorithms, Claudio Gentile and Nick Littlestone Minimax Regret under Log Loss for General Classes of Experts, Nicolo Cesa-Bianchi and Gabor Lugosi On Prediction of Individual Sequences Relative to a set of Experts, Neri Merhav and Tsachy Weissman Regret Bounds for Prediction Problems, Geoffrey J. Gordon Session 2 (11:00-12:00) --------- On theory revision with queries, Robert H. Sloan and Gyorgy Turan Estimating a mixture of two product distributions, Yoav Freund and Yishay Mansour An Apprentice Learning Model, Stephen S. Kwek Session 3 (2:00-3:00) --------- Uniform-Distribution Attribute Noise Learnability, Nader H. Bshouty and Jeffrey C. Jackson and Christino Tamon On Learning in the Presence of Unspecified Attribute Values, Nader H. Bshouty and David K. Wilson Learning Fixed-dimension Linear Thresholds From Fragmented Data, Paul W. 
Goldberg Tutorial 1 (3:30-5:30) --------- Boosting, Yoav Freund and Rob Schapire ++++++++++++++++++++++++++++++++++++++++ 19:00 - 21:00 RECEPTION +++++++++++++++++++++++++++++++++++++++++ Wednesday, July 7 ----------------- Invited Speaker --------------- TBA, David Shmoys (9:00-10:00) Session 4 (10:30 - 12:10) --------- An adaptive version of the boost-by-majority algorithm, Yoav Freund Drifting Games, Robert E. Schapire Additive Models, Boosting, and Inference for Generalized Divergences, John Lafferty Boosting as Entropy Projection, J. Kivinen and M. K. Warmuth Multiclass Learning, Boosting, and Error-Correcting Codes, Venkatesan Guruswami and Amit Sahai Session 5 (2:00-3:00) --------- Theoretical Analysis of a Class of Randomized Regularization Methods, Tong Zhang PAC-Bayesian Model Averaging, David McAllester Viewing all Models as `Probabilistic', Peter Grunwald Tutorial 2 (3:30- 5:30) ---------- Reinforcement Learning, Michael Kearns (?) and Yishay Mansour +++++++++++++++++++++++++++++++++++++++++ -------------- Thursday, July 8 ----------------- Session 6 (9-10:30) --------- Reinforcement Learning and Mistake Bounded Algorithms, Yishay Mansour Convergence analysis of temporal-difference learning algorithms, Vladislav Tadic Beating the Hold-Out, Avrim Blum and Adam Kalai and John Langford Microchoice Bounds and Self Bounding Learning Algorithms, John Langford and Avrim Blum Session 7 (11:00- 12:00) --------- Learning Specialist Decision Lists, Atsuyoshi Nakamura Linear Relations between Square-Loss and Kolmogorov Complexity, Yuri A. Kalnishkan Individual sequence prediction - upper bounds and application for complexity, Chamy Allenberg Session 8 (2:00- 3:00) ---------- Extensional Set Learning, S. A. 
Terwijn On a generalized notion of mistake bounds, Sanjay Jain and Arun Sharma On the intrinsic complexity of learning infinite objects from finite samples, Kinber and Papazian and Smith and Wiehagen +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Friday, July 9 -------------- Tutorial 3 (9:00-11:00) ---------- Large Margin Classification, Peter Bartlett, John Shawe-Taylor, and Bob Williamson Session 9 (11:30-12:10) --------- Covering Numbers for Support Vector Machines, Ying Guo and Peter L. Bartlett and John Shawe-Taylor and Robert C. Williamson Further Results on the Margin Distribution, John Shawe-Taylor and Nello Cristianini Session 10 (2:00- 3:40) ---------- Attribute Efficient PAC-learning of DNF with Membership Queries, Nader H. Bshouty and Jeffrey C. Jackson and Christino Tamon On PAC Learning Using Winnow, Perceptron, and a Perceptron-Like Algorithm, Rocco A. Servedio Extension of the PAC Framework to Finite and Countable Markov Chains, David Gamarnik Learning threshold functions with small weights using membership queries, E. Abboud, N. Agha, N.H. Bshouty, N. Radwan, F. Saleh Exact Learning of Unordered Tree Patterns From Queries, Thomas R. Amoth and Paul Cull and Prasad Tadepalli +++++++++++++++++++++++++++++++++++++++++ From marwan at ee.usyd.edu.au Mon May 10 07:47:00 1999 From: marwan at ee.usyd.edu.au (Marwan Jabri) Date: Mon, 10 May 1999 21:47:00 +1000 (EST) Subject: postdoc Message-ID: Please feel free to post... Post-Doctoral Fellow in Wavelets-Based Timeseries Analysis Computer Engineering Laboratory School of Electrical and Information Engineering The University of Sydney, Australia Applications are invited for a Post-Doctoral Fellow position funded by an Australian Research Council project grant, in collaboration with a financial engineering company. The two-year project aims at investigating wavelet preprocessing techniques and their applications to timeseries analysis and compression.
Applicants should have completed (or be about to complete) their PhD in an electrical, computer or related engineering or science discipline and have demonstrated research capacity in the area of timeseries analysis, machine learning or a related field. The fellow will be expected to work independently and to play a leading role in the project, co-supervising postgraduate students contributing to it. Knowledge of computational techniques in the neural computing area, and of their implementation in software, is an advantage. The appointment will initially be for a period of one year, renewable for another year subject to progress. Expected starting date: June 1999. Closing date: 21 May 1999 Salary range: A$ 41,620 - A$ 46,017 To apply, send a letter of application, CV and the names, fax and email of three referees to M. Jabri Tel (+61-2) 9351 2240, Fax (+61-2) 9351 7209, Email: marwan at sedal.usyd.edu.au From dhw at ptolemy.arc.nasa.gov Mon May 10 19:46:23 1999 From: dhw at ptolemy.arc.nasa.gov (David Wolpert) Date: Mon, 10 May 1999 16:46:23 -0700 (PDT) Subject: Job Announcement Message-ID: <199905102346.QAA16733@buson.arc.nasa.gov> Job opening in the Information Directorate at NASA Ames Research Center. Description: This position provides an opportunity to participate in research on Collective Intelligence (COIN), that is, the analysis of large, distributed artificial systems and the implementation of local strategies for augmenting their performance. In particular, this position involves implementing machine-learning-based COIN research on networks. Tasks will include: -- Building and maintaining models for simulating the behavior of networks, using the OPNET network simulator. Setting up and running experiments on the application of COIN technology to the simulated control of those networks. Creating and investigating new COIN technology in the domain of the OPNET simulations.
-- Writing programs and scripts to carry out data collection from the network simulations. Analyzing the data and creating plots from these experiments, using packages like Matlab or OPNET's built-in analysis tool. The research of the group is described at http://ic.arc.nasa.gov/ic/projects/bayes-group/index.html. Minimal Requirements: -- B.A./B.S. in Computer Science, Mathematics, Statistics, or related field. -- Extended knowledge of and experience with C and/or C++. -- Basic knowledge of UNIX. -- Interest in Artificial Intelligence / Machine Learning / Statistics and/or network theory. Preferred: -- Master's degree -- Experience with the OPNET network simulator. -- Experience with Perl or shell scripts. -- Experience with Matlab. -- Familiarity with reinforcement learning algorithms. -- U.S. citizen or permanent resident. Please direct responses, including a resume, to David Wolpert, Automated Learning Group NASA Ames Research Center MS 269-1, Moffett Field, CA 94035, USA dhw at ptolemy.arc.nasa.gov. Salary will be commensurate with experience. NASA/Ames is an equal opportunity employer. From robtag at dia.unisa.it Mon May 10 06:09:04 1999 From: robtag at dia.unisa.it (Tagliaferri Roberto) Date: Mon, 10 May 1999 12:09:04 +0200 Subject: Program WIRN Vietri '99: XI ITALIAN WORKSHOP ON NEURAL NETS Message-ID: <9905101009.AA30926@udsab> WIRN Vietri '99 XI ITALIAN WORKSHOP ON NEURAL NETS Preliminary Program (IIASS "E.R. Caianiello", 20-22 May 1999) http://dsi.ing.unifi.it/neural/siren/WIRN99/home_en.html Thursday 20 May 9.30 - 10.30 Use and Abuse of Neural Networks Niranjan M. (invited talk) 10.30 - 11.00 coffee break Models 11.00 - 11.20 Dynamics of On-line Learning in Radial Basis Function Neural Networks Marinaro M., Scarpetta S. 11.20 - 11.40 Online Learning with Adaptive Local Step Sizes Schraudolph N.N. 11.40 - 12.00 Continual Prediction using LSTM with Forget Gates Gers F.A., Schmidhuber J., Cummins F.
12.00 - 12.20 Polynomial Clustering Exhibit Statistical Estimation Abilities Fiori S., Burrascano P. 12.20-13.20 Theory, Implementation, and Applications of Support Vector Machines Pittore M., Verri A. (review talk) Poster Session 15.30 highlight spotting Friday 21 May 9.00-10.00 Computational Intelligence in Hydroinformatics: a Review Cicioni Gb., Masulli F. (review talk) Applications 10.00 - 10.20 A Neural Network Based Urban Environment Monitoring System Exploiting Previous Knowledge Simone G., Morabito F.C. 10.20 -10.40 A Combination of Tools: NLS and NN Estimation of the Consumers' Expenditure in Durable Goods. Determinants, Trend and Forecasting for the Motor Vehicles Sector D'Orio G. 10.40 - 11.10 coffee break Signal and Image Processing 11.10 - 11.30 A Feed-Forward Neural Network for Robust Segmentation of Color Images Amoroso C., Chella A., Morreale V., Storniolo P. 11.30 - 11.50 A Neural Network Based ARX Model of Virgo Noise Barone F., De Rosa R., Eleuteri A., Garufi F., Milano L., Tagliaferri R. Architectures and Algorithms 11.50 - 12.10 Building Neural and Logical Networks with Hamming Clustering Muselli M. 12.10 - 12.30 Training Semiparametric Support Vector Machines Mattera D., Palmieri F., Haykin S. 12.30 - 12.50 An Analog On-chip Learning Architecture Based on Weight Perturbation Algorithm and on Current Mode Translinear Circuits Valle M., Diotalevi F., Bo G.M., Biglieri E., Caviglia D.D. Caianiello Session 15.00-16.00 Title to be announced Taylor J. (invited talk) 16.00 - 16.30 the winner of the Caianiello prize 16.30-17.00 coffee break 17.00 SIREN general meeting 20.00 Social Dinner Saturday 22 May Special Session on Neural Networks in Economics (Co-chairs S. Giove and M. Salzano) 9.00 - 9.50 Neural graphs in the handling of economic and management problems Gil Aluja (invited talk) 9.50 - 10.40 Statistical Neural Networks and their Applications in Economics Lauro N.C., Davino C., Vistocco D.
(review talk) 10.40 - 11.10 coffee break 11.10 - 11.35 title to be announced von Altrock (invited talk) 11.35 -11.55 Neural Network for Economic Forecasting Salzano M. (review talk) 11.55 - 12.15 Fuzzy Local Algorithms for Time Series Analysis and Forecasting Giove S. (review talk) 12.15 - 12.35 Regional Economic Policy and Computational Economics Marino D. (review talk) 12.35 - 12.50 A Fuzzy Definition of Industrial District Bruni S., Facchinetti G., Paba S. 12.50 - 13.05 Estimating the Conditional Mean of Non-linear Time Series using Neural Networks Giordano F., Perna C. Papers in the Poster Session The N-SOBoS Model Frisone F., Morasso P.G. A Neural Network Approach to Detect Functional MRI Signal Frisone F., Morasso P.G., Vitali P., Rodriguez G., Pilot A., Sardanelli F., Rosa M. Interval Arithmetic Multilayer Perceptron as Possibility-Necessity Pattern Classifier Drago G.P., Ridella S. Inferring Understandable Rules through Digital Synthesis Muselli M., Liberati D. YANNS: Yet Another Neural Network Simulator d'Acierno A., Sansone S. A Multilayer Perceptron for Fast Interpolation of JPEG/MPEG Coded Images Carrato S. A General Assembly as Implementation of a Hebbian Rule in a Boolean Neural Network Lauria F.E., Milo M., Prevete R., Visco S. The Search for Spiculated Lesions in the CALMA Project: Status and Perspectives Marzulli V.M. An Experimental Comparison of Three PCA Neural Techniques Fiori S. Weightless Neural Networks for Face Recognition Lauria S., Mitchell R. Gesture Recognition using hybrid SOM/DHMM Corradini A., Boehme H.J., Gross H.M. Parameter Identification using Aspects - Application to the Human Cardiovascular System Asteroth A., Frings-Naberschulte J., Müller K. Development of Selectivity Maps in a BCM Network using Various Connectivity Schemes Remondini D., Castellani G.C., Bazzani A., Campanini R., Bersani F. A Novel Wavelet Filtering Method in SAR Image Classification by Neural Networks Simone G., Morabito F.C.
Real Time Neural Network Disruption Prediction in Tokamak Reactors Morabito F.C., Versaci M. The Automatic Detection of Microcalcification Clusters in the CALMA Project: Status and Perspectives Delogu P. Recursive Networks: an Overview of Theoretical Results Bianchini M., Gori M., Scarselli F. Local Wavelet Decomposition and its Application to Face Reconstruction Borghese N.A., Ferrari S., Piuri V. Harmony Theory and Binding Problem Pessa E., Pietronilla Penna M. Scale Based Clustering Optimization via Gravitational Law Imitation Frattale Mascioli F.M., Rizzi A., Scrocca G., Martinelli G. Signal Classification by Subspace Neural Networks Martinelli G., Di Giacomo M. On-line Quality Control of DC Permanent Magnet Motor using Neural Networks Solazzi M., Uncini A. Neural Networks for Spectral Analysis of Unevenly Sampled Data Tagliaferri R., Ciaramella A., Milano L., Barone F. Using the Hermite Regression Algorithm to Improve the Generalization Capability of a Neural Network Pilato G., Sorbello F., Vassallo G. Training Semiparametric Support Vector Machines Mattera D., Palmieri F., Haykin S. A Comparison among Clustering Techniques for Identifying Objects in Images Carrai P., Izzo G., Esposito A., Agarossi L. From Peter.Bartlett at syseng.anu.edu.au Tue May 11 03:47:20 1999 From: Peter.Bartlett at syseng.anu.edu.au (Peter Bartlett) Date: Tue, 11 May 1999 17:47:20 +1000 (EST) Subject: position in machine learning at ANU Message-ID: <199905110747.RAA26686@reid.anu.edu.au> The Machine Learning Group at the Australian National University is advertising a position in theoretical and experimental machine learning. It's a continuing research + graduate teaching position at academic level C ("Fellow" = assistant/associate professor). Closing date is June 4. See http://wwwrsise.anu.edu.au/ad.html#LevC_ML for the advertisement and details. -- Peter Bartlett. 
From malaka at eml.org Tue May 11 07:51:36 1999 From: malaka at eml.org (Rainer Malaka) Date: Tue, 11 May 1999 13:51:36 +0200 Subject: job openings at EML, Germany Message-ID: <373819C8.BBED9617@eml.org> Could you please forward the following announcement. The jobs are partially related to connectionism and might be of interest to people on the list. Best regards Rainer Malaka ######################################## Research Positions at the European Media Laboratory The Human Language Technology and the Personal Memory groups at the European Media Laboratory in Heidelberg are seeking several researchers to work in the areas of information retrieval, information extraction, domain ontology building, and human-computer interfaces. 1. A researcher with experience in terminological ontology building, knowledge representation languages, reasoning, and lexical / knowledge acquisition from corpora. The appointee will work in close collaboration with the Bio-Informatics (molecular biology) team at EML and the Department of Information Science of Tokyo University. We require someone who has a PhD in NLP or in CS with a demonstrated ability to do independent research. Preference will be given to applicants who can demonstrate practical abilities in the building of domain ontologies and who have a strong NLP background. 2. A researcher to work on natural language interfaces with excellent knowledge of object-oriented software development (e.g., Java) and XML. The appointee should have experience in GUI programming. Experience in dialog modeling and discourse structure would be an advantage. The successful candidate will be responsible for designing and implementing an interactive interface for information retrieval, database integration and ontological integration. The candidate will join an interdisciplinary team of computational linguists, computer scientists and domain experts who will use the designed interface for practical purposes. 3.
A researcher with strong interests in the area of NLP and in particular in statistical NLP. The candidate should have excellent programming skills. The candidate should be familiar with the theory and implementation of finite-state automata, finite-state transducers and robust parsing techniques. The candidate is expected to work on areas such as semantic information retrieval, document classification, and clustering. 4. A Computer Scientist with a strong background in one or more of the following areas: adaptive user interfaces, dialog management for human-computer interaction, context and situation modeling. We expect the candidates to have significant experience in object-oriented software development (e.g., Java), machine learning, and databases. Successful candidates should have a PhD or professional experience in the field. Successful candidates will join a multi-disciplinary group and will participate in projects that aim at building user-oriented computer systems such as mobile tourism information systems. These projects are embedded in a network of collaborations with national and international research partners from industry and academia. We offer competitive salaries, depending on professional experience and scientific achievements. The positions are available for 3 years, with the possibility of renewing the appointment depending on performance and availability of funding. EML is a newly established private research laboratory that primarily does contract research for the Klaus Tschira Foundation. It engages in research in the manifold uses of information technology, its primary interest being the development of new ways to increase the usefulness of such technology for the individual and for society. Scientists from many different disciplines and countries work together at the EML; in particular, there is a regular exchange between the EML and national and international institutions.
You will experience a challenging and stimulating international work environment here. In addition, the EML is located in one of the most beautiful old mansions of Heidelberg. Should you be interested in working with us, please send your application, including a full CV and relevant material attesting to your qualifications, by 11 June 1999 to our secretary, c/o Bärbel Mack, Schloss-Wolfsbrunnenweg 33, D-69118 Heidelberg, . For further information, please visit our web-site www.eml.org. From jairmail at ISI.EDU Tue May 11 16:33:44 1999 From: jairmail at ISI.EDU (Steve Minton) Date: Tue, 11 May 1999 13:33:44 -0700 (PDT) Subject: JAIR article: "Learning to Order Things" Message-ID: <199905112033.NAA29984@quark.isi.edu> Readers of this mailing list may be interested in the following article, which was just published by JAIR: Cohen, W.W., Schapire, R.E., and Singer, Y. (1999) "Learning to Order Things", Volume 10, pages 243-270. Available in PDF, PostScript and compressed PostScript. For quick access via your WWW browser, use this URL: http://www.jair.org/abstracts/cohen99a.html More detailed instructions are below. Abstract: There are many applications in which it is desirable to order rather than classify instances. Here we consider the problem of learning how to order instances given feedback in the form of preference judgments, i.e., statements to the effect that one instance should be ranked ahead of another. We outline a two-stage approach in which one first learns by conventional means a binary preference function indicating whether it is advisable to rank one instance before another. Here we consider an on-line algorithm for learning preference functions that is based on Freund and Schapire's 'Hedge' algorithm. In the second stage, new instances are ordered so as to maximize agreement with the learned preference function. We show that the problem of finding the ordering that agrees best with a learned preference function is NP-complete.
Nevertheless, we describe simple greedy algorithms that are guaranteed to find a good approximation. Finally, we show how metasearch can be formulated as an ordering problem, and present experimental results on learning a combination of 'search experts', each of which is a domain-specific query expansion strategy for a web search engine. The article is available via: -- comp.ai.jair.papers (also see comp.ai.jair.announce) -- World Wide Web: The URL for our World Wide Web server is http://www.jair.org/ For direct access to this article and related files try: http://www.jair.org/abstracts/cohen99a.html -- Anonymous FTP from either of the two sites below. Carnegie-Mellon University (USA): ftp://ftp.cs.cmu.edu/project/jair/volume10/cohen99a.ps The University of Genoa (Italy): ftp://ftp.mrg.dist.unige.it/pub/jair/pub/volume10/cohen99a.ps The compressed PostScript file is named cohen99a.ps.Z (229K) -- automated email. Send mail to jair at cs.cmu.edu or jair at ftp.mrg.dist.unige.it with the subject AUTORESPOND and our automailer will respond. To get the Postscript file, use the message body GET volume10/cohen99a.ps (Note: Your mailer might find this file too large to handle.) Only one file can be requested in each message. For more information about JAIR, visit our WWW or FTP sites, or send electronic mail to jair at cs.cmu.edu with the subject AUTORESPOND and the message body HELP, or contact jair-ed at ptolemy.arc.nasa.gov. From horn at alice.nc.huji.ac.il Thu May 13 07:22:55 1999 From: horn at alice.nc.huji.ac.il (David Horn) Date: Thu, 13 May 1999 14:22:55 +0300 Subject: new deadline NCST-99 Message-ID: Special Announcement ==================== postponement of deadline for submissions to the conference Neural Computation in Science and Technology ============================================ Place: Maale Hachamisha, Israel. Dates of the conference: October 10-13, 1999. Deadline for submission of contributions: postponed to June 15, 1999.
============== Due to a local strike, mail and email communications to and from Tel Aviv University were cut off during the past week, and are still down at present. As a result, we have decided to postpone the deadline for submission of contributions from May 15 to June 15. The conference will concentrate on modern issues in computational neuroscience as well as in applications of neural network techniques in science and technology. Currently confirmed speakers are: D.Amit, Y. Baram, M. Bialek, E. Domany, R. Douglas, G. Dreyfus, W. Gerstner, D. Golomb, M. Hasselmo, J. Hertz, D. Horn, N. Intrator, I. Kanter, W. Kinzel, R. Miikkulainen, K. Obermayer, E. Oja, E. Ruppin, I. Segev, T. Sejnowski, H. Siegelmann, H. Sompolinsky, N. Tishby, M. Tsodyks, V. Vapnik and D. Willshaw. In view of the current communication problem, abstracts of submitted papers should be sent to the interim email address horn at alice.nc.huji.ac.il. Four copies of full papers up to seven pages in length should be mailed to Prof. David Horn, NCST-99, School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978, Israel. Registration forms and further information will be available at the website http://neuron.tau.ac.il when electronic services resume. From szepes at sol.cc.u-szeged.hu Sat May 15 15:25:51 1999 From: szepes at sol.cc.u-szeged.hu (Szepesvari Csaba) Date: Sat, 15 May 1999 21:25:51 +0200 (MET DST) Subject: Paper available Message-ID: The following paper is available from http://victoria.mindmaker.hu/~szepes/papers/scann98.ps.gz Reinforcement Learning: Theory and Practice Cs. Szepesvári in Proceedings of the 2nd Slovak Conference on Artificial Neural Networks (SCANN'98). Nov. 10-12, 1998, Smolenice, Slovakia, pp. 29-39 (Ed: Marian Hrehus) We consider reinforcement learning methods for the solution of complex sequential optimization problems. In particular, the soundness of two methods proposed for the solution of partially observable problems will be shown.
The first method is a state-estimation scheme and requires mild {\em a priori} knowledge, while the second method assumes that a significant amount of abstract knowledge is available about the decision problem and uses this knowledge to set up a macro-hierarchy that turns the partially observable problem into another one which can already be handled using methods worked out for observable problems. This second method is also illustrated with some experiments on a real robot. -------------------------------------------------------------------- Csaba Szepesvari Head of Research Department Mindmaker Ltd. Budapest 1112 Konkoly-Thege Miklos u. 29-33. HUNGARY e-mail: szepes at mindmaker.hu WEB: http://victoria.mindmaker.hu/~szepes Phone: +36 1 395 9220/1205 (dial extension continuously) Fax: +36 1 395 9218 From piuri at elet.polimi.it Mon May 17 05:34:33 1999 From: piuri at elet.polimi.it (vincenzo piuri) Date: Mon, 17 May 1999 11:34:33 +0200 Subject: IJCNN'2000 - PRELIMINARY CALL FOR PAPERS Message-ID: <3.0.5.32.19990517113433.014e1100@elet.polimi.it> Dear Colleague, On behalf of the organizing committee I am glad to announce the IEEE-INNS-ENNS International Joint Conference on Neural Networks IJCNN'2000, to be held at the Grand Hotel Cernobbio, Como, Italy, on 24-27 July 2000. The conference is organized and sponsored by the IEEE Neural Network Council, in cooperation with the International Neural Network Society and the European Neural Network Society. The preliminary call for papers can be found at the conference web site http://www.ims.unico.it/2000ijcnn.html All information about the conference will be posted there. No printed mailings will be sent this year, and email announcements will be kept to a minimum to avoid bothering you. Therefore, stay tuned to the above web site! If any colleague or student of yours would like to be included in this emailing list, please forward this message to her/him; instructions for subscribing to the list are attached below.
Other events will be organized before and after the conference: they will be announced soon. Ciao and see you in Como! Vincenzo Piuri IJCNN'2000 Program Co-Chair for Europe ============================================================================= To subscribe to the list, send an email to listproc at ims.unico.it. The body of the message must contain ONLY the following line: SUBSCRIBE CONFERENCES lastname firstname where lastname and firstname must be replaced by your last and first name respectively. Do not put any signature or any other message: they will be ignored and may result in error messages. ============================================================================= Vincenzo Piuri Department of Electronics and Information, Politecnico di Milano piazza L. da Vinci 32, 20133 Milano, Italy phone +39-02-2399-3606 secretary +39-02-2399-3623 fax +39-02-2399-3411 email piuri at elet.polimi.it From jairmail at ISI.EDU Mon May 17 18:26:19 1999 From: jairmail at ISI.EDU (Steve Minton) Date: Mon, 17 May 1999 15:26:19 -0700 (PDT) Subject: JAIR article, "Variational Probabilistic Inference..." Message-ID: <199905172226.PAA21005@quark.isi.edu> JAIR is pleased to announce the publication of the following article, which may be of interest to readers of this mailing list: Jaakkola, T.S. and Jordan, M.I. (1999) "Variational Probabilistic Inference and the QMR-DT Network", Volume 10, pages 291-322. Available in HTML, PDF, PostScript and compressed PostScript. For quick access via your WWW browser, use this URL: http://www.jair.org/abstracts/jaakkola99a.html More detailed instructions are below. Abstract: We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search.
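The idea of a deterministic approximation to a marginal probability can be illustrated on a toy two-variable model. This is only an illustrative sketch of mean-field iteration, not the QMR-DT bound developed in the paper, and all function names here are ours:

```python
import math
from itertools import product

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def exact_marginal(w, b1, b2):
    """Exact P(x1 = 1) under p(x1, x2) proportional to
    exp(w*x1*x2 + b1*x1 + b2*x2), with x1, x2 in {0, 1}."""
    weight = lambda x1, x2: math.exp(w * x1 * x2 + b1 * x1 + b2 * x2)
    Z = sum(weight(x1, x2) for x1, x2 in product([0, 1], repeat=2))
    return sum(weight(1, x2) for x2 in [0, 1]) / Z

def mean_field_marginal(w, b1, b2, iters=100):
    """Deterministic mean-field approximation of P(x1 = 1): iterate the
    coupled fixed-point equations m1 = sigmoid(b1 + w*m2) and
    m2 = sigmoid(b2 + w*m1) instead of sampling or enumerating states."""
    m1 = m2 = 0.5
    for _ in range(iters):
        m1 = sigmoid(b1 + w * m2)
        m2 = sigmoid(b2 + w * m1)
    return m1
```

For w = 0 the two variables are independent and the mean-field fixed point is exact; for weak coupling it remains close, while never requiring enumeration over all joint states.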
We describe a variational approach to the problem of diagnostic inference in the `Quick Medical Reference' (QMR) network. The QMR network is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method. The article is available via: -- comp.ai.jair.papers (also see comp.ai.jair.announce) -- World Wide Web: The URL for our World Wide Web server is http://www.jair.org/ For direct access to this article and related files try: http://www.jair.org/abstracts/jaakkola99a.html -- Anonymous FTP from either of the two sites below. Carnegie-Mellon University (USA): ftp://ftp.cs.cmu.edu/project/jair/volume10/jaakkola99a.ps The University of Genoa (Italy): ftp://ftp.mrg.dist.unige.it/pub/jair/pub/volume10/jaakkola99a.ps The compressed PostScript file is named jaakkola99a.ps.Z (249K) -- automated email. Send mail to jair at cs.cmu.edu or jair at ftp.mrg.dist.unige.it with the subject AUTORESPOND and our automailer will respond. To get the Postscript file, use the message body GET volume10/jaakkola99a.ps (Note: Your mailer might find this file too large to handle.) Only one file can be requested in each message. For more information about JAIR, visit our WWW or FTP sites, or send electronic mail to jair at cs.cmu.edu with the subject AUTORESPOND and the message body HELP, or contact jair-ed at ptolemy.arc.nasa.gov. From oreilly at grey.colorado.edu Tue May 18 11:24:25 1999 From: oreilly at grey.colorado.edu (Randall C.
O'Reilly) Date: Tue, 18 May 1999 09:24:25 -0600 Subject: Postdoc Position Available Message-ID: <199905181524.JAA08370@grey.colorado.edu> A postdoctoral position is available starting immediately for someone interested in pursuing computational modeling approaches to the role of neuromodulation and/or prefrontal cortical function in cognition. The nature of the position is flexible, depending upon the individual's interest and expertise. Approaches can be focused at the neurobiological level (e.g., modeling detailed physiological characteristics of neuromodulatory systems, such as locus coeruleus and/or dopaminergic nuclei, or the circuitry of prefrontal cortex), or at the more systems/cognitive level (e.g., the nature of representations and/or the mechanisms involved in active maintenance of information within prefrontal cortex, and their role in working memory). The primary requirement for the position is a Ph.D. in the cognitive, computational, or neurosciences, and extensive experience with computational modeling work, either at the PDP/connectionist or detailed biophysical level. The postdoc is funded from a collaborative research grant involving Jonathan Cohen at the Department of Psychology and Center for the Study of Brain, Mind, and Behavior at Princeton University (http://www.csbmb.princeton.edu/ncc/jdc.html) and the Western Psychiatric Institute and Clinic at the University of Pittsburgh, and Randall O'Reilly at the Department of Psychology and Institute for Cognitive Science at the University of Colorado, Boulder (http://psych.colorado.edu/~oreilly). Further information about the resources and affiliated faculty at the possible sponsoring institutions and programs is available at: http://www.csbmb.princeton.edu, http://www.cnbc.cmu.edu, http://psych.colorado.edu/~oreilly/postdoc.html.
Interested individuals should send a curriculum vitae, representative publications, a statement of research interests, and three letters of reference via email to jdc at princeton.edu (please begin subject with the words ``Modeling Position'') or via snail mail to Jonathan D. Cohen, Department of Psychology, Green Hall, Princeton University, Princeton, NJ 08544. We will begin reviewing applications as they are received, continuing until the position is filled. Princeton University, the University of Pittsburgh, and the University of Colorado are all equal opportunity employers; minorities and women are encouraged to apply. From ericr at ee.usyd.edu.au Tue May 18 22:42:22 1999 From: ericr at ee.usyd.edu.au (Eric Ronco) Date: Wed, 19 May 1999 12:42:22 +1000 Subject: Human motor control modelling Message-ID: <3742250E.EADC9EBF@ee.usyd.edu.au> Hello, This is to let you know of a new paper submitted to NIPS99 which presents a new model of the human movement control system. Please see the abstract below for details. This article is available at http://www.ee.usyd.edu.au/~ericr/pub/EE99003.ps.gz Title: Open-Loop Intermittent Feedback Optimal Predictive Control: a human movement control model Abstract: This paper introduces the Open-Loop Intermittent Feedback Optimal predictive (OLIFO) controller as an alternative to human movement control models based on system inverse control. OLIFO has the advantages of being applicable to any system, not requiring a desired system trajectory, and naturally handling systems with time delays and constraints. Moreover, it shares important functional characteristics with the human movement control system. Its behaviour is illustrated through the control of a six-muscle human arm model. Comparable performances obtained with the OLIFO controller and actual subjects suggest the plausibility of this scheme.
Bye Eric -- Dr Eric Ronco, room 316 Tel: +61 2 9351 7680 School of Electrical Engineering Fax: +61 2 9351 5132 Bldg J13, Sydney University Email: ericr at ee.usyd.edu.au NSW 2006, Australia http://www.ee.usyd.edu.au/~ericr From gaj at psychology.nottingham.ac.uk Wed May 19 04:40:57 1999 From: gaj at psychology.nottingham.ac.uk (Gary Jones) Date: Wed, 19 May 1999 09:40:57 +0100 Subject: CogSci conference tutorials Message-ID: This may be of interest to the people on this mailing list. I have taken tutorials in three out of the four and found them to be very useful. The Twenty First Annual Meeting of the Cognitive Science Society will take place on August 19-21, 1999 at Simon Fraser University in Vancouver, B.C. The day before the conference, there will be an open workshop on teaching cognitive science and a tutorial programme. This year there are four tutorials on cognitive architectures, including Soar, ACT-R, Cogent, and PDP++. These tutorials are generally designed to introduce architectures to potential users. These tutorials do not teach the architectures completely, but provide enough background to understand models written in them and usually provide enough information for modellers to judge if the architecture is right for their problem. More information on the conference is available at . 
Gary Jones Psychology Department University of Nottingham Nottingham NG7 2RD England E-mail: gaj at Psychology.Nottingham.AC.UK Web: http://www.psychology.nottingham.ac.uk/staff/Gary.Jones/ From dph at cse.ucsc.edu Wed May 19 13:42:45 1999 From: dph at cse.ucsc.edu (David Helmbold) Date: Wed, 19 May 1999 10:42:45 -0700 Subject: COLT 99 registration information (plain text) Message-ID: <199905191742.KAA08500@sundance.cse.ucsc.edu> From lisa at cse.ucsc.edu Wed May 19 13:21:58 1999 From: lisa at cse.ucsc.edu (Lisa Weiss) Date: Wed, 19 May 1999 10:21:58 -0700 Subject: plain text file Message-ID: <199905191721.KAA13838@rio.cse.ucsc.edu> COLT '99 Twelfth ACM Conference on Computational Learning Theory Tuesday, July 6 through Friday, July 9, 1999 University of California, Santa Cruz, California The workshop will be held on campus, which is hidden away in the redwoods on the Pacific Coast of Northern California. The workshop is in cooperation with the ACM Special Interest Group on Algorithms and Computation Theory (SIGACT) and the ACM Special Interest Group on Artificial Intelligence (SIGART). 1. Flight tickets: San Jose Airport is the closest, about a 45 minute drive. San Francisco Airport (SFO) is about an hour and forty-five minutes away, but has slightly better flight connections. 2. Transportation from the airport to Santa Cruz: The first option is to rent a car and drive south from San Jose on Hwy 880, which becomes Hwy 17, or from San Francisco take either Hwy 280 or 101 to Hwy 17. When you get to Santa Cruz, take Route 1 (Mission St.) north. Turn right on Bay Street and follow the signs to UCSC. Commuters must purchase parking permits for $4.00/day M-F (parking is free Saturday and Sunday) from the information kiosk at the Main entrance to campus or the conference satellite office. Those staying on campus can pick up permits with their room keys. Various van services also connect Santa Cruz with the San Francisco and San Jose airports.
The Santa Cruz Airporter (831) 423-1214 (or (800) 497-4997 from anywhere) has regularly scheduled trips from either airport: every two hours from 9am until 11pm from San Jose International Airport ($30 each way), and every two hours from 8am until 10pm from SFO ($35 each way). ABC Transportation (831) 464-8893 ((800) 734-4313 from California (24hr.)) runs a private sedan service ($47 for one, $57 for two, $67 for three to six from San Jose Airport to UC Santa Cruz; $79 for one, $89 for two, and $99 for three to six from SFO to UCSC; additional $10 after 11:30 pm; additional $20 to meet an international flight) and will drop you off at your room. Book at least 24 hours in advance. 3. Conference and room registration: Please fill out the enclosed form and send it to us with your payment. It must be postmarked by June 1 and received by June 5 to obtain the early registration rate and guarantee the room. Conference housing is limited by the available space, and late registrants may need to seek off-campus accommodations. Your arrival: This year we will be at the Kresge apartments. Enter the campus at the Main Entrance, which is the intersection of High and Bay Streets. (Look for the COLT signs.) Bay St. turns into Coolidge Dr.; continue on this road, which becomes McLaughlin Dr., until you reach the stop sign at the T in the road, turn left onto Heller Dr., and then you will make a right turn into the Kresge East Apts parking lot. Housing registration will be at the Kresge East Apts parking lot from 2:00 to 4:00 pm on Monday. Keys, meal cards, parking permits, maps, and information about what to do in Santa Cruz will be available. The office will remain open until 10:00 pm for late arrivals. Arrivals after 10:00 pm: stop at the Main Entrance Kiosk and have the guard call the College Proctor, who will meet you at the Satellite Office and give you your housing materials. Please be prepared to show I.D. or you will not be permitted on campus. Problems?
Please go directly to the Kresge Satellite Office, or contact the Conference Director at (831) 459-2611. The Kresge College Conference Office is located in Apt. Building R11-Apt 1111. From the Kresge East Apts parking lot continue on Heller Dr. to the Porter College entrance, turn right into Porter College, follow the road around (curving right) to Kresge College. At Kresge College, park in the first lot on your left and look for signs for the Conference Office. Please do not park in spaces with posted restrictions at any time. In case of emergency, dial 911 from any campus phone. The weather in July is mostly sunny with occasional summer fog. Even though the air may be cool, the sun can be deceptively strong; those susceptible to sunburn should come prepared with sunblock. Bring T-shirts, slacks, shorts, and a sweater or light jacket, as it cools down at night. For information on the local bus routes and schedules, call the Metro Center at (831) 425-8600. Bring swimming trunks, tennis rackets, etc. You can get day passes for $5.00 (East Field House, Physical Education Office) to use the recreation facilities on campus. For questions about registration or accommodations, contact COLT'99, Computer Science Dept., UCSC, Santa Cruz, CA 95064. The e-mail address is colt99 at cse.ucsc.edu, and fax is (831)459-4829. For emergencies, call (831)459-2263. 4. General Conference Information: The Conference Registration will be at 8:30 Tuesday, outside the Porter dining hall. Late registrations will be at the same location during the technical sessions. All lectures will be in the Porter dining hall. A banquet will be held Tuesday from 6:30--8:00pm outside the Porter dining hall. The workshop has been organized to allow time for informal discussion and collaboration. COLT `99 CONFERENCE SCHEDULE ---------------------------- MONDAY, July 5 -------------- 2:00-4:00 pm, Housing Registration, Kresge East Apts Parking Lot. 
Note: All technical sessions will take place in the Porter Dining Hall. TUESDAY, July 6 --------------- SESSION 1: 9:00 - 10:30 9:00-9:30 The Robustness of the p-norm Algorithms Claudio Gentile and Nick Littlestone 9:30-10:00 Minimax Regret under Log Loss for General Classes of Experts Nicolo Cesa-Bianchi and Gabor Lugosi 10:00-10:30 On Prediction of Individual Sequences Relative to a set of Experts Neri Merhav and Tsachy Weissman 10:30 - 11:00 BREAK SESSION 2: 11:00 - 12:00 11:00-11:20 On Theory Revision with Queries Robert H. Sloan and Gyorgy Turan 11:20-11:40 Estimating a Mixture of Two Product Distributions Yoav Freund and Yishay Mansour 11:40-12:00 An Apprentice Learning Model Stephen S. Kwek 12:00 - 2:00 LUNCH SESSION 3: 2:00 - 3:00 2:00-2:20 Uniform-Distribution Attribute Noise Learnability Nader H. Bshouty, Jeffrey C. Jackson and Christino Tamon 2:20-2:40 On Learning in the Presence of Unspecified Attribute Values Nader H. Bshouty, and David K. Wilson 2:40-3:00 Learning Fixed-dimension Linear Thresholds from Fragmented Data Paul W. Goldberg 3:00 - 3:30 BREAK TUTORIAL 1: 3:30-5:30 3:30-5:30 Boosting Yoav Freund and Rob Schapire 7:30-9:30 RECEPTION - Kresge Town Hall Area WEDNESDAY, July 7 ----------------- 9:00-10:00 INVITED TALK: by David Shmoys Approximation Algorithms for Clustering Problems 10:00-10:30 BREAK SESSION 4: 10:30-12:10 10:30-10:50 An Adaptive Version of the Boost-by-majority Algorithm Yoav Freund 10:50-11:10 Drifting Games Robert E. Schapire 11:10-11:30 Additive Models, Boosting, and Inference for Generalized Divergences John Lafferty 11:30-11:50 Boosting as Entropy Projection J. Kivinen and M. K. 
Warmuth 11:50-12:10 Multiclass Learning, Boosting, and Error-Correcting Codes Venkatesan Guruswami and Amit Sahai 12:10-2:00 LUNCH SESSION 5: 2:00-3:00 2:00-2:20 Theoretical Analysis of a Class of Randomized Regularization Methods Tong Zhang 2:20-2:40 PAC-Bayesian Model Averaging David McAllester 2:40-3:00 Viewing all Models as `Probabilistic' Peter Grunwald 3:00-3:30 BREAK TUTORIAL 2: 3:30-5:30 3:30-5:30 Reinforcement Learning Michael Kearns and Yishay Mansour 6:30-8:00 BANQUET - Porter Dining Hall THURSDAY, July 8 ---------------- SESSION 6: 9:00-10:30 9:00-9:20 Reinforcement Learning and Mistake Bounded Algorithms Yishay Mansour 9:20-9:40 Convergence Analysis of Temporal-difference Learning Algorithms Vladislav Tadic 9:40-10:00 Beating the Hold-Out Avrim Blum, Adam Kalai and John Langford 10:00-10:20 Microchoice Bounds and Self Bounding Learning Algorithms John Langford and Avrim Blum 10:30-11:00 BREAK SESSION 7: 11:00-12:00 11:00-11:20 Learning Specialist Decision Lists Atsuyoshi Nakamura 11:20-11:40 Linear Relations between Square-Loss and Kolmogorov Complexity Yuri A. Kalnishkan 11:40-12:00 Individual Sequence Prediction - Upper Bounds and Application for Complexity Chamy Allenberg 12:00-2:00 LUNCH SESSION 8: 2:00-3:00 2:00-2:20 Extensional Set Learning S. A. Terwijn 2:20-2:40 On a Generalized Notion of Mistake Bounds Sanjay Jain and Arun Sharma 2:40-3:00 On the Intrinsic Complexity of Learning Infinite Objects from Finite Samples Kinber, Papazian, Smith, and Wiehagen 2:20-2:30 Concept Learning with Geometric Hypotheses David P. Dobkin and Dimitrios Gunopulos Friday, July 9 -------------- TUTORIAL 3: 9:00-11:00 9:00-11:00 Large Margin Classification Peter Bartlett, John Shawe-Taylor, and Bob Williamson 11:00-11:30 BREAK SESSION 9: 11:30-12:10 11:30-11:50 Covering Numbers for Support Vector Machines Ying Guo, Peter L. Bartlett, John Shawe-Taylor, and Robert C. 
Williamson 11:50-12:10 Further Results on the Margin Distribution John Shawe-Taylor and Nello Cristianini 12:10-2:00 LUNCH SESSION 10: 2:00-3:40 2:00 - 2:20 Attribute Efficient PAC-learning of DNF with Membership Queries Nader H. Bshouty and Jeffrey C. Jackson and Christino Tamon 2:20 - 2:40 On PAC Learning Using Winnow, Perceptron, and a Perceptron-Like Algorithm Rocco A. Servedio 2:40 - 3:00 Extension of the PAC Framework to Finite and Countable Markov Chains David Gamarnik 3:00 - 3:20 Learning Threshold Functions with Small Weights using Membership Queries E. Abboud, N. Agha, N.H. Bshouty, N. Radwan, F. Saleh 3:20 - 3:40 Exact Learning of Unordered Tree Patterns From Queries Thomas R. Amoth, Paul Cull, and Prasad Tadepalli 3:40 CONFERENCE ENDS REGISTRATION INFORMATION Please fill in the information needed for registration and accommodations. Make your payment by check or international money order, in U.S. dollars and payable through a U.S. bank, to UC Regents/COLT '99. Mail this form together with payment (by June 4, 1999 to avoid the late fee) to: COLT '99 Computer Science Department University of California Santa Cruz, California 95064 Payment may also be made by VISA or MC but will have a $15 surcharge and you can fax or email the form. Questions: e-mail colt99 at cse.ucsc.edu, fax (831)459-4829. Confirmations will be sent by e-mail. Anyone needing special arrangements to accommodate a disability should enclose a note with their registration. 
CONFERENCE REGISTRATION ======================= Name: ________________________________________________________________ Affiliation: _________________________________________________________ Address: _____________________________________________________________ City: _____________________________ State: ________ Zip: _________ Country: _____________________________ Telephone: _____________________________ Email address: __________________________ The registration fee includes a copy of the proceedings and the banquet dinner. ACM/SIG Members: $175 Non-Members: $240 Full time students: $100 Reg. Late Fee: $50 (rec'd after June 4) Extra banquet tickets: ____ (quantity) x $25 = _______________ How many in your party have dietary restrictions? Vegetarian: _____ Other: _____ SHIRT SIZE, please circle one: medium large x-large ACCOMMODATIONS AND DINING ------------------------- Accommodation fees are $67 per person for a double and $80 for a single per night at the Kresge Apartments. Cafeteria style breakfast (7:00 to 8:00am), lunch (12:00 to 1:00pm), and dinner (6:30 to 7:30pm) will be served in the College Eight Dining Hall. Doors close at the end of the time indicated, but dining may continue beyond this time. The first meal provided is dinner on the day of arrival and the last meal is lunch on the day you leave. NO REFUNDS can be given after June 7. Those with uncertain plans should make reservations at an off-campus hotel. Each attendee should pick one of the following options: PACKAGE #1: Mon., Tue., Wed., Thurs. nights: $268 double, $320 single. PACKAGE #2: Tue., Weds., Thurs. nights: $201 double, $240 single. OTHER housing arrangement. Each 4-person apartment has a living room, a kitchen, a common bathroom, and either four single separate rooms, two double rooms, or two single and one double room. 
We need the following information to make room assignments: Gender (M/F): __________ Smoker (Y/N): ___________ Roommate Preference: ________________________________ For shorter stays, longer stays, and other special requirements, you can get other accommodations through the Conference Office. Make reservations directly with them at (831) 459-2611, fax (831) 459-3422, and do this soon, as on-campus rooms for the summer fill up well in advance. Off-campus hotels include the Dream Inn (831) 426-4330 and the Ocean Pacific Lodge (831) 457-1234 or (800) 995-0289. AMOUNT ENCLOSED: Registration total _________________ VISA/MC $15 Charge _________________ Extra Banquet tickets _________________ Accommodations _________________ Mark Fulk Award* _________________ TOTAL _________________ * The optional Donation for the Mark Fulk Award is tax deductible in the U.S.A.; please see Carl Smith for a receipt. From A.van.Ooyen at nih.knaw.nl Wed May 19 14:39:38 1999 From: A.van.Ooyen at nih.knaw.nl (Arjen van Ooyen) Date: Wed, 19 May 1999 20:39:38 +0200 Subject: Model of Axonal Competition Message-ID: <37430569.736F@nih.knaw.nl> NEW PAPER: Competition for Neurotrophic Factor in the Development of Nerve Connections A. van Ooyen & D. J. Willshaw Proc. R. Soc. Lond. B Biol. Sci. (1999) 266: 883-892. Download full text from the following website: http://www.cns.ed.ac.uk/people/arjen/competition.html Or request a reprint of the paper version (don't forget to give your address): A.van.Ooyen at nih.knaw.nl ABSTRACT The development of nerve connections is thought to involve competition among axons for survival promoting factors, or neurotrophins, which are released by the cells that are innervated by the axons. Although the notion of competition is widely used within neurobiology, there is little understanding of the nature of the competitive process and the underlying mechanisms. We present a new theoretical model to analyse competition in the development of nerve connections.
According to the model, the precise manner in which neurotrophins regulate the growth of axons, in particular the growth of the amount of neurotrophin receptor, determines what patterns of target innervation can develop. The regulation of neurotrophin receptors is also involved in the degeneration and regeneration of connections. Competition in our model can be influenced by factors dependent on and independent of neuronal electrical activity. Our results point to the need to measure directly the specific form of the regulation by neurotrophins of their receptors. -- Arjen van Ooyen, Netherlands Institute for Brain Research, Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands. email: A.van.Ooyen at nih.knaw.nl website: http://www.cns.ed.ac.uk/people/arjen.html phone: +31.20.5665483 fax: +31.20.6961006 From llicht at ifi.unizh.ch Thu May 20 12:00:14 1999 From: llicht at ifi.unizh.ch (Lukas Lichtensteiger) Date: Thu, 20 May 1999 18:00:14 +0200 Subject: PhD position in Evolutionary Robotics/Artificial Life Message-ID: ---------------------------------------------------------------------- Position for a Ph.D. student in Evolutionary Robotics/Artificial Life at the AI Lab, University of Zurich ---------------------------------------------------------------------- A new PhD student position is open at the Artificial Intelligence Laboratory, Department of Information Technology, University of Zurich, Switzerland. Availability: Immediately or at earliest convenience. The position is dedicated to research in Evolutionary Robotics/Artificial Life. 
In particular, the research focus will be on one or several of the following topics: - the generation of simple artificial organisms capable of exhibiting behavior (in simulation and in the real world) - mechanisms of ontogenetic development - the interaction of microscopic (genome, cell) and macroscopic (behavior) processes - the investigation of the interdependence of environment, morphology, and control - the incorporation of biological insights at appropriate levels of abstraction into computer/robotic models Previous work on an "Artificial Evolutionary System" that implements pertinent biological concepts can serve as a starting point. This research is intended to bring together ideas from biology, engineering, and computer science in novel and productive ways. If these challenges attract your interest and if you would like to become a member of an international research team conducting transdisciplinary work, please submit a curriculum vitae, statement of research interests, and the names of three references to: Rolf Pfeifer, Director AI Lab Dept. of Information Technology University of Zurich Winterthurerstrasse 190 CH-8057 Zurich, Switzerland E-mail: pfeifer at ifi.unizh.ch Phone: +41 1 635 43 20/31 Fax: +41 1 635 68 09 Profile: Applicants should have a master's degree, or equivalent (e.g. diploma or "licence" (Lizentiat)), in one of the following areas: biology, neurobiology, computer science, electrical or mechanical engineering, physics, mathematics (or related disciplines). Ability to work in a strongly interdisciplinary team is expected. Tasks: The main task for the accepted candidate will be to conduct research towards his/her Ph.D. Additional tasks include support for classes organized by the AI-Lab. Financial: The salary will be according to the specification of the University of Zurich. Time prospect: The candidate is expected to complete his/her Ph.D. work within a maximum period of 4 years.
---------------------------------------------------------- Prof. Dr. Rolf Pfeifer Director, Artificial Intelligence Laboratory Department of Information Technology, University of Zurich Winterthurerstrasse 190, CH-8057 Zurich, Switzerland Phone: +41 1 635 43 20/31 Fax: +41 1 635 68 09 www.ifi.unizh.ch/~pfeifer ---------------------------------------------------------- From Zoubin at gatsby.ucl.ac.uk Fri May 21 13:14:23 1999 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Fri, 21 May 1999 18:14:23 +0100 (BST) Subject: Gatsby Unit Tutorial on Neural Computation Message-ID: <199905211714.SAA31263@cajal.gatsby.ucl.ac.uk> GATSBY UNIT TUTORIAL: NEURAL COMPUTATION Tuesday 31st August to Friday 3rd September 1999 University College London England http://www.gatsby.ucl.ac.uk/tutorial/tutorial.html The Gatsby Computational Neuroscience Unit at University College London is holding a four-day tutorial on Neural Computation from Tuesday 31st August to Friday 3rd September 1999. The tutorial is aimed at research students and postdoctoral researchers who are interested in and wish to learn more about neural computation and computational neuroscience. The tutorial is intended for researchers who are trying to understand how the brain computes, but it should also be of value to researchers who are using neural network methods for solving applied problems. The first part of the tutorial will describe techniques for learning and inference that have their roots in computer science, statistics, physics, engineering and dynamical systems. The second part will describe how these techniques have been applied to understanding computation in the brain. Tutorial Faculty: Geoff Hinton, Peter Dayan, Zoubin Ghahramani, Zhaoping Li, Hagai Attias, Sam Roweis, Emo Todorov and Carl van Vreeswijk Registration Fee: GBP 25.00. Includes course papers and light refreshments. 
Further information on the tutorial contents and on-line registration are available at: http://www.gatsby.ucl.ac.uk/tutorial/tutorial.html From NKasabov at infoscience.otago.ac.nz Sat May 22 17:26:17 1999 From: NKasabov at infoscience.otago.ac.nz (Nik Kasabov) Date: Sun, 23 May 1999 09:26:17 +1200 Subject: Three TRs and software on on-line learning and applications Message-ID: Dear colleagues, Three Technical Reports and software functions in MATLAB on evolving connectionist systems for on-line learning and their applications for on-line adaptive speech recognition and dynamic time series prediction are available from: http://divcom.otago.ac.nz/infoscience/kel/CBIIS.html (software/EFuNN) regards, Nik Kasabov -------------------------------------------- Prof. Dr. Nikola (Nik) Kasabov Department of Information Science University of Otago,P.O. Box 56,Dunedin New Zealand, email: nkasabov at otago.ac.nz phone:+64 3 479 8319, fax: +64 3 479 8311 http://divcom.otago.ac.nz:800/infosci/Staff/NikolaK.htm -------------------------------------------- TR99/02 N.Kasabov, Evolving Connectionist Systems for On-line, Knowledge-based Learning: Principles and Applications, TR99/02, Department of Information Science, University of Otago, New Zealand Abstract. The paper introduces evolving connectionist systems (ECOS) as an effective approach to building on-line, adaptive intelligent systems. ECOS evolve through incremental, hybrid (supervised/unsupervised), on-line learning. They can accommodate new input data, including new features, new classes, etc. through local element tuning. New connections and new neurons are created during the operation of the system. The ECOS framework is presented and illustrated on a particular type of evolving neural networks - evolving fuzzy neural networks (EFuNNs). EFuNNs can learn spatial-temporal sequences in an adaptive way through one pass learning. Rules can be inserted and extracted at any time of the system operation. 
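The one-pass, incremental learning that the TR99/02 abstract describes — new rule nodes created for novel inputs, existing nodes locally tuned — can be sketched as follows. This is a simplified illustration, not Kasabov's actual EFuNN algorithm: the Euclidean distance test, the threshold, and the learning rate are invented stand-ins for EFuNN's fuzzy membership machinery.

```python
import numpy as np

class EvolvingLayer:
    """One-pass, incremental node allocation in the spirit of an
    evolving connectionist system (simplified illustration)."""

    def __init__(self, threshold=0.5, lr=0.2):
        self.threshold = threshold  # distance above which a new node is created
        self.lr = lr                # local tuning rate for the winning node
        self.centers = []           # input-space prototypes ("rule nodes")
        self.outputs = []           # associated output values

    def learn_one(self, x, y):
        """Present (x, y) exactly once; either tune the nearest node or grow."""
        if self.centers:
            d = [np.linalg.norm(x - c) for c in self.centers]
            i = int(np.argmin(d))
            if d[i] <= self.threshold:
                # local element tuning: only the winning node moves
                self.centers[i] += self.lr * (x - self.centers[i])
                self.outputs[i] += self.lr * (y - self.outputs[i])
                return
        # novelty: allocate a new node centred on the example
        self.centers.append(np.array(x, dtype=float))
        self.outputs.append(float(y))

    def predict(self, x):
        d = [np.linalg.norm(x - c) for c in self.centers]
        return self.outputs[int(np.argmin(d))]

layer = EvolvingLayer()
for x, y in [([0.0, 0.0], 0.0), ([1.0, 1.0], 1.0), ([0.1, 0.0], 0.0)]:
    layer.learn_one(np.array(x), y)
print(len(layer.centers))  # third sample is close to the first node, so 2
```

Because learning is local, the layer can accommodate new classes at any time simply by growing a node, which is the property the abstract emphasizes for on-line operation.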
The characteristics of ECOS and EFuNNs are illustrated on several case studies that include: adaptive pattern classification; adaptive, phoneme-based spoken language recognition; adaptive dynamic time-series prediction; intelligent agents.
--------------------------------------------------------------------------------
TR99/03 N.Kasabov, M.Watts, Spatial-Temporal Adaptation in Evolving Fuzzy Neural Networks for On-line Adaptive Phoneme Recognition, TR99/03, Department of Information Science, University of Otago, New Zealand Abstract. The paper is a study on the spatial-temporal characteristics of evolving fuzzy neural network systems (EFuNNs) for on-line adaptive learning. These characteristics are important for the task of adaptive, speaker-independent spoken language recognition, where new pronunciations and new accents need to be learned in an on-line, adaptive mode. Experiments with EFuNNs, and also with multi-layer perceptrons and fuzzy neural networks (FuNNs), conducted on the whole set of 43 New Zealand English phonemes, show the superiority and the potential of EFuNNs when used for this task. Spatial allocation of nodes and their aggregation in EFuNNs allow for similarity preservation and similarity observation within the data for one phoneme and across phonemes, while subtle temporal variations within one phoneme's data can be learned and adjusted through temporal feedback connections. The experimental results support the claim that spatial-temporal organisation in EFuNNs can lead to a significant improvement in the recognition rate, especially for the diphthong and vowel phonemes in English, which in many cases are problematic for a system to learn and adjust in an on-line, adaptive way. 
--------------------------------------------------------------------------------
TR99/04 N.Kasabov and Q.Song, Dynamic Evolving Fuzzy Neural Networks with 'm-out-of-n' Activation Nodes for On-line Adaptive Systems, TR99/04, Department of Information Science, University of Otago, New Zealand Abstract. The paper introduces a new type of evolving fuzzy neural networks (EFuNNs), denoted mEFuNNs, for on-line learning, and their applications for dynamic time series analysis and prediction. At each time moment the output vector of an mEFuNN is calculated based on the m most activated rule nodes. Two approaches are proposed: (1) using weighted fuzzy rules of Zadeh-Mamdani type; (2) using Takagi-Sugeno fuzzy rules that utilise dynamically changing and adapting values for the inference parameters. It is shown that mEFuNNs can effectively learn complex temporal sequences in an adaptive way and outperform EFuNNs, ANFIS and other connectionist and hybrid connectionist models. The characteristics of the mEFuNNs are illustrated on two benchmark dynamic time series data sets, as well as on two real case studies for on-line adaptive control and decision making. Aggregation of rule nodes in evolved mEFuNNs can be achieved through the fuzzy C-means clustering algorithm, which is also illustrated on the benchmark data sets. The mEFuNNs trained and aggregated regularly in an on-line, self-organised mode perform as well as, or better than, the mEFuNNs that use the fuzzy C-means clustering algorithm for off-line rule node generation on the same data set. ------------------------------------------------------------------- From uwe.zimmer at gmd.gr.jp Tue May 25 06:58:37 1999 From: uwe.zimmer at gmd.gr.jp (Uwe R. 
Zimmer) Date: Tue, 25 May 1999 19:58:37 +0900 Subject: PhD thesis (Giovanni Indiveri) Message-ID: <374A825D.E9D23E81@gmd.gr.jp> Dear Colleagues, my PhD thesis "Modelling and Identification of Underwater Robotic Systems" is available in pdf format at the URL: http://www.gmd.gr.jp/JRL/publications.html#98 Please find in the following its abstract and table of contents. Best wishes, Giovanni Indiveri ABSTRACT Whatever strategy is pursued to design a control system or a state estimation filter for an underwater robotic system, knowledge of its identified model is very important. As far as ROVs are concerned, the results presented in this thesis suggest that low-cost, on-board sensor based identification is feasible: the detailed analysis of the residual least square costs and of the estimated parameter variances shows that a decoupled vehicle model can be successfully identified by swimming pool tests, provided that a suitable identification procedure is designed and implemented. A two-step identification procedure has been designed on the basis of: (i) the vehicle model structure, which has been analyzed in depth in the first part of this work, (ii) the type of available sensors and (iii) the actuator dynamics. First the drag coefficients are evaluated by constant speed tests, and afterwards, with these in hand, a sub-optimal sinusoidal input thrust is designed in order to identify the inertia parameters. Extensive experimental activity on the ROMEO ROV of CNR-IAN has shown the effectiveness of this approach. Moreover, it has been shown that standard unmanned underwater vehicle models may need, as for the ROMEO ROV, to take into account propeller-propeller and propeller-hull interactions, which have a highly relevant influence on the system dynamics (up to 50% efficiency loss in the applied thrust with respect to the nominal model). 
It has been shown that such phenomena can be correctly modelled by an efficiency parameter, and experimental results concerning its identification on a real system have been extensively analyzed. The estimated parameter variances are generally relatively low, especially for the drag coefficients, confirming the effectiveness of the adopted identification scheme. The surge drag coefficients have been estimated for two different vehicle payload configurations, i.e. carrying a plankton sampling device or a Doppler velocimeter (see chapter 4 for details), and the results show that in the considered surge velocity range (|u| < 1 m/s) the drag coefficients are different, but perhaps less than expected. Moreover, it has been shown that in the usual operating yaw rate range (< 10 deg/s) drag is better modeled by a simple linear term rather than by both a linear and a quadratic one. This is interesting as it suggests that the yaw-axis control system of a slow-motion open-frame ROV can be realized by standard linear control techniques. For a detailed description of the identification procedure and of the identification results for the ROMEO ROV, consult chapter 4. In the last part of this thesis the issue of planar motion control of a nonholonomic vehicle has been addressed. Inspired by the previous works of Casalino et al. and Aicardi et al. regarding a unicycle-like kinematic model, a novel globally asymptotically convergent smooth feedback control law for the point stabilization of a car-like robot has been developed. The resulting linear velocity does not change sign, curvature is bounded, and the target is asymptotically approached on a straight line. Applications to the control of underwater vehicles are discussed and extensive simulations are performed in order to analyze the algorithm's behaviour with respect to actuator saturation. 
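The linear-versus-quadratic yaw drag question discussed above comes down to comparing two least-squares fits of steady-state data. A minimal sketch, using invented synthetic numbers rather than the ROMEO ROV measurements:

```python
import numpy as np

# Synthetic steady-state yaw data (invented, not the ROMEO ROV's): at a
# constant yaw rate r the applied torque balances drag, tau = d1*r + d2*r*|r|.
r = np.linspace(-0.15, 0.15, 7)   # yaw rates in rad/s (within ~10 deg/s)
tau = 2.0 * r                      # here the "true" drag is purely linear

# Model 1: linear drag only, tau ~ d1*r
A1 = r[:, None]
d_lin, _, _, _ = np.linalg.lstsq(A1, tau, rcond=None)

# Model 2: linear + quadratic drag, tau ~ d1*r + d2*r*|r|
A2 = np.column_stack([r, r * np.abs(r)])
d_quad, _, _, _ = np.linalg.lstsq(A2, tau, rcond=None)

print(round(d_lin[0], 3))       # recovered linear coefficient: 2.0
print(abs(d_quad[1]) < 1e-6)    # quadratic term is negligible: True
```

With real, noisy data one would instead compare residual costs and estimated parameter variances of the two fits, which is the kind of analysis the thesis reports for concluding that the linear term suffices.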
It is analytically shown that convergence is achieved even in the presence of actuator saturation, and simulations are performed to evaluate the control law's performance with and without actuator saturation. Moreover, the generation of smooth paths having minimum square curvature, integrated over length, is addressed and solved with variational calculus in 3D for an arbitrary curve parametrization. The plane projections of such paths are shown to be least yaw drag energy paths for the 2D underwater motion of rigid bodies.
______________________________________________________________
--------------------------------------------------------------
TABLE OF CONTENTS
1 Introduction 9
  1.1 Motivations and Objectives 9
  1.2 Outline of the work 11
  1.3 Acknowledgments 12
2 Kinematics 13
  2.1 Vectors 13
    2.1.1 Vector notation 13
    2.1.2 Time derivatives of vectors 14
      Poisson Formula 15
      Velocity composition rules 17
    2.1.3 On useful vector operations properties 19
3 Dynamics 21
  3.1 Rigid body Newton-Euler equations 21
  3.2 Fluid forces and moments on a rigid body 26
    3.2.1 The Navier Stokes equation 26
    3.2.2 Viscous effects 28
      Viscous drag forces 28
      Lift forces 29
      Added mass effects 30
      On the properties of ideal fluids 30
      Dynamic pressure forces and moments on a rigid body 33
    3.2.4 Current effects 36
    3.2.5 Weight and buoyancy 37
  3.3 Underwater Remotely Operated Vehicles Model 37
    3.3.1 Thruster dynamics 38
    3.3.2 Overall ROV Model 40
  3.4 Underwater Manipulator Model 41
4 Identification 43
  4.1 Estimation approach 43
    4.1.1 Least Squares Technique 44
    4.1.2 Consistency and Efficiency 47
    4.1.3 On the normal distribution case 47
    4.1.4 Measurement variance estimation 49
  4.2 On board sensor based ROV identification 49
    4.2.1 Model structure 50
    4.2.2 Thruster model identification 54
    4.2.3 Off line velocity estimation 55
    4.2.4 Heave model identification 58
    4.2.5 Yaw model identification 70
    4.2.6 Surge model identification 84
    4.2.7 Sway model identification 89
    4.2.8 Inertia parameters identification 94
    4.2.9 Surge inertia parameter identification 97
    4.2.10 Yaw inertia parameter identification 100
  4.3 Summary 105
5 Motion control and path planning 107
  5.1 2D motion control of a nonholonomic vehicle 107
    5.1.1 A state feedback solution for the unicycle model 109
    5.1.2 A state feedback solution for a more general model 112
  5.2 Path Planning 126
    5.2.1 Curvature 128
    5.2.2 Planning criterion: a variational calculus approach 129
    5.2.3 Solution properties 135
    5.2.4 Solution examples 137
References 145
___________________________________________________________
Giovanni Indiveri, Dr. Visiting Researcher at GMD-Japan Research Laboratory, Kitakyushu, Japan. mailto:giovanni.indiveri at gmd.de URL: http://www.gmd.gr.jp voice +81 93 512 1566 /// fax +81 93 512 1588
___________________________________________________________
From Richard_Kempter at physik.tu-muenchen.de Tue May 25 10:30:05 1999 From: Richard_Kempter at physik.tu-muenchen.de (Richard Kempter) Date: Tue, 25 May 1999 16:30:05 +0200 Subject: Hebbian Learning and Spiking Neurons Message-ID: <199905251430.QAA18781@srv.cip.physik.tu-muenchen.de> The following paper has appeared in Physical Review E, 59:4498-4514, 1999 Hebbian Learning and Spiking Neurons by R. Kempter, W. Gerstner and J.L. van Hemmen Since we are out of reprints, copies of the paper are now available from http://diwww.epfl.ch/lami/team/gerstner/wg_pub.html Abstract: A correlation-based (``Hebbian'') learning rule at the spike level is formulated, mathematically analyzed, and compared to learning in a firing-rate description. As for spike coding, we take advantage of a ``learning window'' that describes the effect of the timing of pre- and postsynaptic spikes on synaptic weights. A differential equation for the learning dynamics is derived under the assumption that the time scales of learning and spiking dynamics can be separated. Formation of structured synapses is analyzed for a Poissonian neuron model which receives time-dependent stochastic input. 
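The ``learning window'' idea can be sketched numerically: a weight change whose sign and size depend on the delay between pre- and postsynaptic spikes. The exponential shape, amplitudes, and time constants below are illustrative choices, not the paper's fitted window:

```python
import numpy as np

def learning_window(s, a_plus=1.0, a_minus=1.0, tau_plus=10.0, tau_minus=10.0):
    """Weight change as a function of s = t_post - t_pre (ms).
    Pre-before-post (s > 0) potentiates; post-before-pre depresses.
    Shape and constants are illustrative, not the paper's fitted values."""
    s = np.asarray(s, dtype=float)
    return np.where(s > 0,
                    a_plus * np.exp(-s / tau_plus),
                    -a_minus * np.exp(s / tau_minus))

def total_weight_change(pre_spikes, post_spikes, eta=0.01):
    """Sum the window over all pre/post spike pairings, scaled by a
    small learning rate (the slow-learning assumption in the paper)."""
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dw += learning_window(t_post - t_pre)
    return eta * dw

# A presynaptic spike consistently 5 ms before each postsynaptic spike
# should strengthen the synapse:
pre = [10.0, 50.0, 90.0]
post = [15.0, 55.0, 95.0]
print(total_weight_change(pre, post) > 0)  # True
```

Reversing the roles of the two spike trains flips the sign of every pairing and hence of the total change, which is the timing sensitivity that has no counterpart in a pure firing-rate description.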
It is shown that correlations between input and output spikes tend to stabilize structure formation. With an appropriate choice of parameters, learning leads to an intrinsic normalization of the average weight and the output firing rates. Noise generates diffusion-like spreading of synaptic weights. From georgiou at csusb.edu Wed May 26 02:30:58 1999 From: georgiou at csusb.edu (georgiou@csusb.edu) Date: Tue, 25 May 1999 23:30:58 -0700 (PDT) Subject: CFP: 4th ICCIN (Feb. 27-Mar. 3, 2000) Message-ID: <199905260630.XAA17882@mail.csusb.edu> Call for Papers 4th International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE http://www.csci.csusb.edu/iccin Trump Taj Mahal Casino and Resort, Atlantic City, NJ USA February 27 -- March 3, 2000 Summary Submission Deadline: September 1, 1999 Conference Co-chairs: Subhash C. Kak, Louisiana State University Jeffrey P. Sutton, Harvard University This conference is part of the Fourth Joint Conference on Information Sciences. http://www.ee.duke.edu/JCIS/ Plenary Speakers include the following:
+------------------------------------------------------------------------+
|James Anderson   |Wolfgang Banzhaf  |B. Chandrasekaran|Lawrence J. Fogel|
|-----------------+------------------+-----------------+-----------------|
|Walter J. Freeman|David E. Goldberg |Irwin Goodman    |Stephen Grossberg|
|-----------------+------------------+-----------------+-----------------|
|Thomas S. Huang  |Janusz Kacprzyk   |A. C. Kak        |Subhash C. Kak   |
|-----------------+------------------+-----------------+-----------------|
|John Mordeson    |Kumpati S. Narendra|Anil Nerode     |Huang T. Nguyen  |
|-----------------+------------------+-----------------+-----------------|
|Jeffrey P. Sutton|Ron Yager         |                 |                 |
+------------------------------------------------------------------------+
Special Sessions on Quantum Computation. More to be added. 
Areas for which papers are sought include:
o Artificial Life
o Artificially Intelligent NNs
o Associative Memory
o Cognitive Science
o Computational Intelligence
o Efficiency/Robustness Comparisons
o Evolutionary Computation for Neural Networks
o Feature Extraction & Pattern Recognition
o Implementations (Electronic, Optical, Biochips)
o Intelligent Control
o Learning and Memory
o Neural Network Architectures
o Neurocognition
o Neurodynamics
o Optimization
o Parallel Computer Applications
o Theory of Evolutionary Computation
Summary Submission Deadline: September 1, 1999
Notification of authors upon review: November 1, 1999
December 1, 1999 - Deadline for invited sessions and exhibition proposals
Papers will be accepted based on summaries. A summary shall not exceed 4 pages of 10-point font, double-column, single-spaced text, with figures and tables included. For the Fourth ICCIN, send 3 copies of summaries to: George M. Georgiou Computer Science Department California State University San Bernardino, CA 92407-2397 U.S.A. georgiou at csci.csusb.edu From tgd at cs.orst.edu Wed May 26 10:38:57 1999 From: tgd at cs.orst.edu (Thomas G. Dietterich) Date: Wed, 26 May 1999 16:38:57 +0200 Subject: 2 papers on Hierarchical Reinforcement Learning Message-ID: <1966-Wed26May1999163857+0200-tgd@cs.orst.edu> The following two papers are available from the Computing Research Repository (CoRR) (http://xxx.lanl.gov/archive/cs/intro.html or its mirror sites). They can also be retrieved from the Reinforcement Learning Repository http://web.cps.msu.edu/rlr/ or from my home page: http://www.cs.orst.edu/~tgd/cv/pubs.html. Number: cs.LG/9905014 Title: Hierarchical Reinforcement Learning with the MAXQ Value Function Decomposition Authors: Thomas G. 
Dietterich Comments: 63 pages, 15 figures Subj-class: Learning ACM-class: I.2.6 This paper presents the MAXQ approach to hierarchical reinforcement learning based on decomposing the target Markov decision process (MDP) into a hierarchy of smaller MDPs and decomposing the value function of the target MDP into an additive combination of the value functions of the smaller MDPs. The paper defines the MAXQ hierarchy, proves formal results on its representational power, and establishes five conditions for the safe use of state abstractions. The paper presents an online model-free learning algorithm, MAXQ-Q, and proves that it converges with probability 1 to a kind of locally-optimal policy known as a recursively optimal policy, even in the presence of the five kinds of state abstraction. The paper evaluates the MAXQ representation and MAXQ-Q through a series of experiments in three domains and shows experimentally that MAXQ-Q (with state abstractions) converges to a recursively optimal policy much faster than flat Q learning. The fact that MAXQ learns a representation of the value function has an important benefit: it makes it possible to compute and execute an improved, non-hierarchical policy via a procedure similar to the policy improvement step of policy iteration. The paper demonstrates the effectiveness of this non-hierarchical execution experimentally. Finally, the paper concludes with a comparison to related work and a discussion of the design tradeoffs in hierarchical reinforcement learning. (168kb) Number: cs.LG/9905015 Title: State Abstraction in MAXQ Hierarchical Reinforcement Learning Authors: Thomas G. Dietterich Comments: 7 pages, 2 figures Subj-class: Learning ACM-class: I.2.6 Many researchers have explored methods for hierarchical reinforcement learning (RL) with temporal abstractions, in which abstract actions are defined that can perform many primitive actions before terminating. 
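The additive decomposition at the heart of MAXQ, Q(i, s, a) = V(a, s) + C(i, s, a), splits the value of invoking child subtask a inside parent task i at state s into the value earned while a runs plus the expected "completion" value of finishing i afterwards. A toy illustration (the subtask names and numbers below are invented, not from the papers):

```python
# Toy tables for the MAXQ decomposition Q(i, s, a) = V(a, s) + C(i, s, a).
# Subtask names and values are invented for illustration only.

V = {  # V(a, s): expected reward earned while subtask a executes from s
    ("navigate", "s0"): -2.0,
    ("pickup",   "s0"): -1.0,
}
C = {  # C(i, s, a): expected value of completing parent i after a finishes
    ("root", "s0", "navigate"): 5.0,
    ("root", "s0", "pickup"):   1.0,
}

def q(i, s, a):
    """Decomposed action value of doing child a inside parent i at state s."""
    return V[(a, s)] + C[(i, s, a)]

# A hierarchical policy greedily picks the child with the best decomposed value:
best = max(["navigate", "pickup"], key=lambda a: q("root", "s0", a))
print(best)  # navigate  (Q = -2 + 5 = 3  vs  -1 + 1 = 0)
```

Because V and C are stored per subtask rather than per flat state-action pair, each table can safely ignore state variables irrelevant to its subtask — which is exactly where the five state-abstraction conditions discussed above come in.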
However, little is known about learning with state abstractions, in which aspects of the state space are ignored. In previous work, we developed the MAXQ method for hierarchical RL. In this paper, we define five conditions under which state abstraction can be combined with the MAXQ value function decomposition. We prove that the MAXQ-Q learning algorithm converges under these conditions and show experimentally that state abstraction is important for the successful application of MAXQ-Q learning. (37kb) From mschmitt at lmi.ruhr-uni-bochum.de Wed May 26 08:01:17 1999 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Wed, 26 May 1999 14:01:17 +0200 Subject: Preprint: Complexity of learning for spiking neurons Message-ID: <374BE28C.D9678B93@lmi.ruhr-uni-bochum.de> Dear colleagues, the following paper has been accepted for publication in "Information and Computation" and is available from http://www.cis.tu-graz.ac.at/igi/maass/96.ps.gz or http://www.ruhr-uni-bochum.de/lmi/mschmitt/spikingneurons.ps.gz (24 pages, 136K). Regards, Michael Schmitt ------------------------------------------------------------ TITLE: On the Complexity of Learning for Spiking Neurons with Temporal Coding AUTHORS: Wolfgang Maass and Michael Schmitt ABSTRACT Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. In a network of spiking neurons a new set of parameters becomes relevant which has no counterpart in traditional neural network models: the time that a pulse needs to travel through a connection between two neurons (also known as delay of a connection). It is known that these delays are tuned in biological neural systems through a variety of mechanisms. In this article we consider the arguably most simple model for a spiking neuron, which can also easily be implemented in pulsed VLSI. 
We investigate the VC dimension of networks of spiking neurons where the delays are viewed as programmable parameters and we prove tight bounds for this VC dimension. Thus we get quantitative estimates for the diversity of functions that a network with fixed architecture can compute with different settings of its delays. In particular, it turns out that a network of spiking neurons with $k$ adjustable delays is able to compute a much richer class of functions than a threshold circuit with $k$ adjustable weights. The results also yield bounds for the number of training examples that an algorithm needs for tuning the delays of a network of spiking neurons. Results about the computational complexity of such algorithms are also given. -- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: ++49 234 700-3209, Fax: ++49 234 7094-465 From zhaoping at gatsby.ucl.ac.uk Mon May 24 07:07:24 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Mon, 24 May 1999 12:07:24 +0100 Subject: Paper available on Visual Segmentation by intracortical interactions in V1 Message-ID: <199905241107.MAA17223@vision.gatsby.ucl.ac.uk> Paper on Visual Segmentation by intracortical interactions in V1 Published in NETWORK: COMPUTATION IN NEURAL SYSTEMS, vol. 10, number 2, May 1999, pages 187-212 Available at the Online Service of the Journal --- http://www.iop.org/EJ/S Or on the website --- http://www.gatsby.ucl.ac.uk/~zhaoping/preattentivevision.html Title: Visual segmentation by contextual influences via intracortical interactions in primary visual cortex. Author: Zhaoping LI Abstract: Stimuli outside classical receptive fields have been shown to exert significant influence over the activities of neurons in primary visual cortex. We propose that contextual influences are used for pre-attentive visual segmentation. 
The difference between contextual influences near and far from region boundaries makes neural activities near region boundaries higher than elsewhere, making boundaries more salient for perceptual pop-out. The cortex thus computes {\it global} region boundaries by detecting the breakdown of homogeneity or translation invariance in the input, using {\it local} intra-cortical interactions mediated by the horizontal connections. This proposal is implemented in a biologically based model of V1, and demonstrated using examples of texture segmentation and figure-ground segregation. The model is also the first that performs texture or region segmentation in exactly the same neural circuit that solves the dual problem of the enhancement of contours, as is suggested by experimental observations. The computational framework in this model is simpler than previous approaches, making it implementable by V1 mechanisms, though higher level visual mechanisms are needed to refine its output. However, it easily handles a class of segmentation problems that are known to be tricky. Its behavior is compared with psychophysical and physiological data on segmentation, contour enhancement, contextual influences, and other phenomena such as asymmetry in visual search. From FYFE-CI0 at wpmail.paisley.ac.uk Fri May 28 04:55:25 1999 From: FYFE-CI0 at wpmail.paisley.ac.uk (COLIN FYFE) Date: Fri, 28 May 1999 08:55:25 +0000 Subject: Jobs in Computational Intelligence Message-ID: Applications are now sought for the following posts from those in the Computational Intelligence community: Chair of Computational Intelligence Department of Computing and Information Systems The University of Paisley, Paisley, Scotland. This senior post carries a salary of 36981 pa (award pending) and is an additional post intended to strengthen the Department's research group in this area. 
There are already 6 promoted academics in this area, which covers groups working in:
Artificial Neural Networks
Statistical Signal Processing
Applied Artificial Intelligence
Artificial Life
You should have a substantial track record of publications and be able to demonstrate leadership in research for this post. There are also three posts at the Lecturer/Senior Lecturer level. Candidates for the Senior Lecturer post should also be able to demonstrate substantial research in this area or in Information Systems. Senior Lecturer Salary Scale: 26146 to 33798 pa (award pending) Lecturer Salary Scale: 15885 to 28507 pa (award pending) Recruitment packs are available from the Personnel Office, University of Paisley, Paisley PA1 2BE tel: 0141 848 3940 Closing date is 16th June 1999. Informal enquiries may be made to Prof Malcolm Crowe, Head of Department, on 0141 848 3300 or to Colin Fyfe on 0141 848 3305 http://cis.paisley.ac.uk From gsiegle at sciences.sdsu.edu Mon May 31 01:12:30 1999 From: gsiegle at sciences.sdsu.edu (Greg Siegle) Date: Sun, 30 May 1999 22:12:30 -0700 (PDT) Subject: Dissertation available on attention in depression Message-ID: <199905310512.WAA26907@sciences.sdsu.edu> Dear Connectionists, The following dissertation is now available on-line at http://www.sci.sdsu.edu/CAL/greg/dissert/ Cognitive and Physiological Aspects of Attention to Personally Relevant Negative Information in Depression by Greg Jeremy Siegle Abstract Evidence suggests depressed individuals pay excessive attention to negative information. The current research investigates the nature and clinical implications of such attention biases. A computational neural network, reflecting interacting brain systems that identify emotional and nonemotional aspects of information, is described in which depression is identified with strongly learning certain negative information. 
The model's behavior suggested that depressed people are reminded of, and attend to, personally relevant negative information in response to many stimuli. Predictions for depressed and nondepressed individuals' reaction times, signal detection rates, and the time course of cognitive load in response to emotional stimuli were derived from the computational model. To evaluate these predictions, pupil dilations and reaction times were collected from 24 unmedicated depressed and 25 nondepressed adults in response to emotional lexical decision and valence identification tasks. Pupil dilation was used to index cognitive load. Mixed ANOVA planned contrasts were employed to evaluate predictions. In support of model-derived predictions, depressed individuals rated many stimuli as negative more often than nondepressed individuals did. The network's behavior suggested that depressed individuals would be quicker to say that negative words were negative than that positive words were positive, and that this difference would be reduced in nondepressed individuals. This prediction was supported empirically. Principal components analysis of pupil dilations revealed early attentional components (at or before reaction times) and late, possibly ruminative, components (peaking 2 and 4 seconds after reaction times). The computational model suggested that cognitive load, indexed by pupil dilation, would be highest for nondepressed individuals during early stages of attention but highest for depressed individuals during later stages of attention. This prediction was supported. Contrary to predictions, differences in depressed individuals' dilations to positive and negative stimuli were not detected. These data suggest depressed individuals may not initially attend to the content of presented information, but may quickly associate any incoming information with whatever made them depressed. Sustained attention to personally relevant negative information may characterize depressive attention biases. 
Targeting implicated cognitive and brain processes may improve interventions for depression. ----------------------------------------- Greg Siegle, Ph.D. San Diego State University / University of California, San Diego / University of Toronto www.sci.sdsu.edu/CAL/greg.html 416-979-4747 x2376 The Clarke Institute, 250 College St., Toronto, ON M5T 1R8 Canada Visit the Connectionist Models of Cognitive, Affective, Brain, and Behavioral Disorders website at www.sci.sdsu.edu/CAL/connectionist.models The second announcement and call for papers is scheduled for August 15, 1999, and the deadline for abstract submissions is October 15, 1999. ANNIMAB-S the Artificial Neural Networks In Medicine And Biology Society Dept of Philosophy, Gothenburg University Box 200, SE-405 30 Gothenburg, Sweden phone (+46) 31 773 5573 fax (+46) 31 773 5159 e-mail: annimab at www.phil.gu.se http://www.phil.gu.se/annimab.html From mitra at its.caltech.edu Tue May 4 02:38:27 1999 From: mitra at its.caltech.edu (Partha Mitra) Date: Mon, 3 May 1999 23:38:27 -0700 (PDT) Subject: No subject Message-ID: <199905040638.XAA27052@stucco.cco.caltech.edu> Workshop on Analysis of Neural Data ________________________________________________ Modern methods and open issues in the analysis and interpretation of time-series and imaging data in the neurosciences ___________________________________________________ >> 16 August - 28 August 1999 >> Marine Biological Laboratories - Woods Hole, MA ___________________________________________________ A working group of scientists committed to quantitative approaches to problems in neuroscience will focus their efforts on experimental and theoretical issues related to the analysis of large, multi-channel data sets. The motivation for the work group is based on issues that arise in two complementary areas critical to an understanding of brain function. 
The first involves advanced signal processing methods, particularly those appropriate for emerging multi-site recording techniques and noninvasive imaging techniques. The second involves the development of a calculus to study the dynamical behavior of nervous systems and the computations they perform. A distinguishing feature of the work group will be the close collaboration between experimentalists and theorists, particularly with regard to the analysis of data and the planning of experiments. The work group will have a small number of pedagogical lectures, supplemented by tutorials on relevant computational and mathematical techniques. This work group is a means to critically evaluate techniques for the processing of multi-channel data, of which imaging forms an important category. Such techniques are of fundamental importance for basic research and medical diagnostics. We have begun to establish a repository of these techniques to ensure the rapid dissemination of modern analytical techniques throughout the neuroscience community. The work group convenes on a yearly basis. In 1999, we will continue to focus on topics in the analysis of multivariate time series data, consisting of both continuous and point processes. In addition, we will have two specialized programs on neuronal instrumentation: * 21 August 1999 - Multisite recording of extracellular cortical potentials with Si-based probes. This is offered in collaboration with the Center for Neural Communication Technology at the University of Michigan. * 26 August 1999 - A comparison of analysis techniques for fMRI data. Participants: Twenty-five participants, both experimentalists and theorists. Experimentalists are specifically encouraged to bring data records to the work group; appropriate computational facilities will be provided. The work group will further take advantage of interested investigators concurrently present at the MBL. 
We encourage graduate students and postdoctoral fellows as well as senior researchers to apply. Participant Fee: $250. Support: National Institutes of Health - NIMH, NIA, NIAAA, NICHD/NCRR, NIDCD, NIDA, and NINDS. Organizers: David Kleinfeld (UCSD) and Partha P. Mitra (Bell Laboratories). Website: http://www.vis.caltech.edu/~WAND Application: Send a copy of your curriculum vitae, together with a cover letter that contains a brief (ca. 200 word) paragraph on why you wish to attend the work group to: Ms. Jean B. Ainge Bell Laboratories, Lucent Technologies 700 Mountain Avenue 1D-427 Murray Hill, NJ 07974 908-582-4702 (fax) or The MBL is an EEO AAI. Graduate students and postdoctoral fellows are encouraged to include a letter of support from their research advisor. Shared accommodations and board will be provided. Applications must be received by 21 May 1999. Participants will be notified by 7 June 1999. The Archives for Neurosciences can be found at: http://xxx.lanl.gov/archive/neuro-sys From tnatschl at igi.tu-graz.ac.at Tue May 4 05:30:06 1999 From: tnatschl at igi.tu-graz.ac.at (Thomas Natschlaeger) Date: Tue, 4 May 1999 11:30:06 +0200 (CEST) Subject: Fast analog computation with unreliable synapses Message-ID: Announcement of a new paper by Wolfgang Maass and Thomas Natschlaeger: "A model for fast analog computation based on unreliable synapses" Neural Computation, 1999, in press. Abstract: We investigate through theoretical analysis and computer simulations the consequences of unreliable synapses for fast analog computations in networks of spiking neurons, with analog variables encoded by the current firing activities of pools of spiking neurons. Our results suggest that the known unreliability of synaptic transmission may be viewed as a useful tool for analog computing, rather than as a ``bug'' in neuronal hardware. We also investigate computations on time series and Hebbian learning in this context of space-rate coding. 
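As background for the space-rate coding idea in the abstract above, here is a toy sketch (not the authors' model): an analog value is encoded as the fraction of neurons firing in a pool, each spike crosses a synapse only with some release probability, and the value is still recoverable downstream. The pool size, release probability, and trial counts below are arbitrary choices for the demonstration.

```python
# Toy illustration of space-rate coding through unreliable synapses:
# an analog value x is the fraction of pool neurons that fire in a window,
# and each spike is transmitted only with release probability p.
import numpy as np

rng = np.random.default_rng(1)
N, p = 1000, 0.3            # pool size, synaptic release probability

def transmit(x):
    """Encode x in [0,1] as N Bernoulli(x) spikes, pass each spike through
    an unreliable synapse, and decode from the count of released spikes."""
    spikes = rng.random(N) < x               # which pool neurons fire
    released = spikes & (rng.random(N) < p)  # stochastic synaptic release
    return released.sum() / (N * p)          # unbiased estimate of x

for x in (0.2, 0.5, 0.8):
    est = np.mean([transmit(x) for _ in range(200)])
    print(f"x = {x:.1f}  decoded about {est:.3f}")
```

The decoded value is unbiased despite the unreliable synapses; the release failures add variance rather than destroying the encoded quantity, which is the intuition behind treating unreliability as a tool rather than a bug.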
This paper is available as gzipped postscript file (26 pages, 211Kb) from http://www.cis.tu-graz.ac.at/igi/maass/#Publications (see #102) or http://www.cis.tu-graz.ac.at/igi/tnatschl/Publications.html Sincerely Thomas Natschlaeger ********************************************************* ** ** ** Thomas Natschlaeger ** ** Institute for Theoretical Computer Science ** ** Technische Universitaet Graz ** ** Klosterwiesgasse 32/2 ** ** A - 8010 Graz, Austria ** ** email: tnatschl at igi.tu-graz.ac.at ** ** www: http://www.cis.tu-graz.ac.at/igi/tnatschl/ ** ** Tel: ++43 316 873 5814 ** ** Fax: ++43 316 873 5805 ** ** ** ********************************************************* From oby at cs.tu-berlin.de Tue May 4 07:19:17 1999 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Tue, 4 May 1999 13:19:17 +0200 (MET DST) Subject: No subject Message-ID: <199905041119.NAA20539@pollux.cs.tu-berlin.de> Dear Connectionists, below please find abstract and preprint-location of a recent paper on modelling contrast adaptation in primary visual cortex. Cheers Klaus -------------------------------------------------------------------- Prof. Dr. Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ -------------------------------------------------------------------- -------------------------------------------------------------------- Contrast Adaptation and Infomax in Visual Cortical Neurons Peter Adorjan, Christian Piepenbrock, and Klaus Obermayer CS Department, Technical University of Berlin, Berlin, Germany In the primary visual cortex (V1) the contrast response function of many neurons saturates at high contrast and adapts depending on the visual stimulus. We propose that both effects--contrast saturation and adaptation--can be explained by a fast and a slow component in the synaptic dynamics. 
In our model the saturation is an effect of fast synaptic depression with a recovery time constant of about 200 ms. Fast synaptic depression leads to a contrast response function with a high gain for only a limited range of contrast values. Furthermore, we propose that slow adaptation of the transmitter release probability at the geniculocortical synapses is the underlying neural mechanism that accounts for contrast adaptation on a time scale of about 7 sec. For the functional role of contrast adaptation we hypothesize that it serves to achieve the best visual cortical representation of the geniculate input. This representation should maximize the mutual information between the cortical activity and the geniculocortical input by increasing the release probability in a low contrast environment. We derive an adaptation rule for the transmitter release probability based on this EM infomax principle. We show that changes in the transmitter release probability may compensate for changes in the variance of the geniculate inputs--an essential requirement for contrast adaptation. Also, we suggest that increasing the release probability in a low contrast environment is beneficial for signal extraction, because neurons remain sensitive only to an increase in the presynaptic activity if it is synchronous and, therefore, likely to be stimulus related. Our hypotheses are tested in numerical simulations of a network of integrate-and-fire neurons for one column of V1 using fast synaptic depression and slow synaptic adaptation. The simulations show that changing the synaptic release probability of the geniculocortical synapses is a better model for contrast adaptation than the adaptation of the synaptic weights: only in the case of changing the transmitter release probability does our model reproduce the experimental finding that the average membrane potential (DC component) adapts much more strongly than the stimulus modulated component (F1 component). 
In the case of changing synaptic weights, however, the average membrane potential (DC) as well as the stimulus modulated component (F1 component) would adapt. Furthermore, changing the release probability at the recurrent cortical synapses cannot account for contrast adaptation, but could be responsible for establishing the oscillatory activity often observed in recordings from visual cortical cells. Rev. Neurosci., 1999, in press. Available at: http://ni.cs.tu-berlin.de/publications/ From ormoneit at stat.Stanford.EDU Tue May 4 13:45:46 1999 From: ormoneit at stat.Stanford.EDU (Dirk Ormoneit) Date: Tue, 4 May 1999 10:45:46 -0700 (PDT) Subject: New TR on Kernel-Based Reinforcement Learning Message-ID: <199905041745.KAA26003@rgmiller.Stanford.EDU> The following technical report is now available on-line at http://www-stat.stanford.edu/~ormoneit/tr-1999-8.ps Best, Dirk ------------------------------------------------------------------ KERNEL-BASED REINFORCEMENT LEARNING by Dirk Ormoneit and Saunak Sen Kernel-based methods have recently attracted increased attention in the machine learning literature as reliable tools to attack regression and classification tasks. In this work, we consider a kernel-based approach to reinforcement learning that will be shown to produce a consistent estimate of the true value function in a continuous Markov Decision Process. Typically, consistency cannot be obtained using parametric value function estimates such as neural networks. As further contributions, we derive the asymptotic distribution of the kernel-based estimate and establish optimal convergence rates. The asymptotic distribution is then used to derive a formula for the asymptotic bias inherent in the kernel-based approximation. Although reinforcement learning is generally biased due to the maximum operator involved, this is, to our knowledge, the first theoretical result in this spirit. 
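To make the flavor of such a kernel-based value estimate concrete, here is a minimal policy-evaluation sketch: sampled transitions are smoothed with a normalized kernel and the Bellman backup is iterated to its fixed point. The Gaussian kernel, bandwidth, and toy Markov chain are our own illustrative assumptions, not details taken from the technical report.

```python
# Sketch of kernel-based value estimation (policy evaluation only) on
# sampled transitions from an invented 1-D chain; not the report's method.
import numpy as np

rng = np.random.default_rng(2)
gamma = 0.9                     # discount factor

# Sample transitions (s, r, s') from a toy chain on [0, 1]:
# the state drifts to the right and the reward equals the current state.
S = rng.random(300)                                        # sampled states
S2 = np.clip(S + 0.1 + 0.05 * rng.standard_normal(S.size), 0.0, 1.0)
R = S.copy()                                               # rewards

def weights(query, centers, h=0.1):
    """Row-normalized Gaussian kernel weights from queries to centers."""
    W = np.exp(-0.5 * ((query[:, None] - centers[None, :]) / h) ** 2)
    return W / W.sum(axis=1, keepdims=True)

W = weights(S, S)      # smooths backup targets over nearby transitions
W2 = weights(S2, S)    # evaluates the current estimate at successor states

# Kernel-based approximate value iteration for the fixed policy:
# V(s_i) <- sum_j W_ij (r_j + gamma * V(s'_j)), a gamma-contraction.
V = np.zeros(S.size)
for _ in range(200):
    V = W @ (R + gamma * (W2 @ V))

print(f"V near s=0: {V[np.argmin(S)]:.2f}, V near s=1: {V[np.argmax(S)]:.2f}")
```

Because the normalized kernel average is a convex combination, the update is a gamma-contraction and converges to a unique fixed point; consistency as the sample grows and the bandwidth shrinks is the kind of property the report studies formally.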
The suggested bias formulas may serve as the basis for bias correction techniques that can be used in practice to improve the estimate of the value function. -------------------------------------------- Dirk Ormoneit Department of Statistics, Room 206 Stanford University Stanford, CA 94305-4065 ph.: (650) 725-6148 fax: (650) 725-8977 ormoneit at stat.stanford.edu http://www-stat.stanford.edu/~ormoneit/ From reggia at cs.umd.edu Tue May 4 15:29:16 1999 From: reggia at cs.umd.edu (James A. Reggia) Date: Tue, 4 May 1999 15:29:16 -0400 (EDT) Subject: Postdoc position, computational neurosci., language Message-ID: <199905041929.PAA25542@avion.cs.umd.edu> (The following includes computational modeling related to the neuroscience of language and its disorders. Please direct questions etc. to Dr. Rita Berndt as indicated below.) Post-doctoral Fellowship in the Cognitive Neuroscience of Language and its Disorders Two-year post-doctoral fellowship available after July 1, 1999, at the University of Maryland School of Medicine, in Baltimore, Maryland. Training opportunities will provide experience in the application of contemporary research methods (including computational modeling, cognitive neuropsychology, event-related potentials and functional neuroimaging) to the topic of normal and disordered language processing. Applicants with doctoral degrees in related basic science areas (cognitive psychology, neuroscience, linguistics, computer science, etc.) and clinical disciplines (speech/language pathology; clinical neuropsychology) are invited to apply. Questions may be directed to rberndt at umaryland.edu. To apply, send HARD COPIES of C.V., names and addresses of three referees, and statement of research interests and career goals to: Rita S. Berndt, Ph.D., Department of Neurology, University of Maryland School of Medicine, 22 South Greene Street, Baltimore, Maryland 21201. Applications should be received by July 1, 1999, for full consideration. 
From KSTUEBER at holycross.edu Mon May 3 14:05:32 1999 From: KSTUEBER at holycross.edu (Karsten Stueber) Date: Mon, 03 May 1999 14:05:32 -0400 Subject: SPP program, housing and conference registration Message-ID: This e-mail contains information about the preliminary program, on-campus housing, and conference registration. It also contains a Conference Pre-registration form and an On-Campus Housing reservation form. On-Campus Housing is limited. You are therefore advised to reserve a room ASAP (no later than May 24, 1999). Please note also that the On-Campus Reservation and Conference Registration have to be sent to different addresses. For information about the finalized program and further travel information please check the SPP conference web page at http://www.hfac.uh.edu/cogsci/spp/wwwanlmt.htm PRELIMINARY PROGRAM OF THE 25th ANNIVERSARY MEETING OF THE SOCIETY FOR PHILOSOPHY AND PSYCHOLOGY (For the exact program, please check the SPP website in the first week of May) STANFORD UNIVERSITY JUNE 19-22, 1999 The first panel will be on Saturday, June 19, 1999 at 3:30pm. The last session will be on Tuesday morning, June 22, 1999. 
SYMPOSIA: Symposium I: Theory of Mind: Infants and Primates Confirmed speakers: Alison Gopnik (Psychology, UC Berkeley) Daniel Povinelli (New Iberia Research Center, USL) Symposium II: Frontiers in Cognitive Neuroscience Confirmed speakers: Gregory McCarthy (Brain Imaging & Analysis Center, Duke) Lynn Robertson (Martinez VA & UC Berkeley) Symposium III: "Then and Now in Philosophy and Psychology" Confirmed speakers: Hilary Putnam (Philosophy, Harvard) Roger Shepard (Psychology, Stanford) Symposium IV: Past, Present, and Future of SPP (Special Panel Discussion) Panelists: Patrick Suppes (Philosophy, Stanford) Daniel Dennett (Philosophy, Tufts) Stephen Stich (Philosophy, Rutgers) Kathleen Akins (Philosophy, Simon Fraser) Stevan Harnad (Cognitive Science, Southampton) INVITED SPEAKERS: Session I: Mechanism of Pain Allan Basbaum (Neuroscience, UCSF) [Other invited and symposium speakers unlisted due to pending confirmation] CONTRIBUTED SESSIONS A. Belief and Explanation Carol Slater (Alma College) No 'There' There: Ruth Millikan, Lloyd Morgan, and the Case of the Missing Indexicals Kristen Andrews and Peter Verbeek (University of Minnesota) Prediction, Explanation, and Folk Psychology B. Belief and Thought Eric Schwitzgebel (University of California, Riverside) In-Between Believing Lawrence A. Beyer (Stanford University) Do We Believe Only What We Take to Be True? C. Mind and Brain William Bechtel (Washington University, St. Louis) and Robert N. McCauley (Emory University) Heuristic Identity Theory (or Back to the Future): The Mind-Body Problem Against the Background of Research Strategies in Cognitive Neuroscience Max Velmans (University of London) How to Make Sense of the Causal Interactions Between Consciousness and the Brain Bruce Mangan (University of California, Berkeley) The Fallacy of Functional Exclusion D. Functions of the Senses Brian Keeley (Washington University, St. 
Louis / University of Northern Iowa) Making Sense of Modalities Alva Noe (University of California, Santa Cruz) What Change Blindness Really Teaches Us About Vision Bernard Baars (The Wright Institute) Criteria for Consciousness in the Brain: Methodological Implications of Recent Developments in Visual Neuroscience E. Cognition Jesse Prinz (Washington University, St. Louis) Mad Dogs and Englishmen: Concept Nativism Reconsidered Muhammad Ali Khalidi (American University) Two Models of Innateness James Blackmon, David Byrd, Robert Cummins, Pierre Poirier, Martin Roth (University of California, Davis) Systematicity and the Cognition of Structural Domains F. Representation and Pain Stephanie Beardman (Rutgers University) The Choice Between Actual and Remembered Pain Murat Aydede (University of Chicago) Pain Qualia and Representationalism William Robinson (Iowa State) Representationalism and Epiphenomenalism POSTER PRESENTATIONS Tim Bayne (University of Arizona) H. Looren de Jong (Vrije Universiteit, Amsterdam) Donald Dryden (Duke University) Sanford Goldberg (Grinnell College) Daniel Haybron (Rutgers University) David Hunter (Buffalo State University) Ariel Kernberg (University College, London) Stan Klein (UC-Berkeley) Uriah Kriegel (Brown University) John Kulvicki (University of Chicago) Justin Leiber (University of Houston) Ron Mallon (Rutgers University) Shaun Maxwell (Queen's University, Canada) Lawrence Roberts (SUNY Binghamton) Teed Rockwell Peter Ross (Pomona College) James Taylor (Bowling Green State University) Charles Twardy (Indiana University) Ruediger Vaas (University of Stuttgart & University of Hohenheim) Adam Vinueza (University of Colorado) Jonathan Weinberg (Rutgers University) Josh Weisberg (City University of New York) Tadeusz Zawidzki (Washington University, St. 
Louis) Jing Zhu (University of Waterloo, Canada) Conference Pre-Registration Form Mail completed form with check or money order made payable to Society for Philosophy and Psychology, ASAP (it must be received no later than June 8, 1999) to: Karsten Stueber, Secretary-Treasurer, SPP; Department of Philosophy; College of the Holy Cross; PO Box 137A; Worcester, MA 01610 Name Address Daytime Phone ( ) Fax ( ) e-mail Institutional Affiliation Conference Registration Fee Member: $40 Nonmember: $60 Student: $15 Banquet and Presidential Address ($48 per person, including taxes and tips) $48 per person x # of persons $ August 1998-July 1999 SPP Membership Dues (New Members pay member conference registration fees) Regular Member: $25 Student: $5 Contributions to William James Graduate Student Award $ Total $ On Campus Accommodations: Rates: $51.00 per night for a single; $39.00 per person per night for a shared room. These rates include daily continental breakfast and weekly maid service (beds are made upon arrival but only common areas are cleaned each day). Linen and towels will be provided, but you may wish to bring extra towels since there will not be daily maid service. Reservations: All rooms must be paid for in advance. Only three-day packages may be reserved. That is, each on-campus resident must pay, in advance, for (at least) three nights, regardless of intended check-in/check-out date. In order to reserve, please send a check for a single room on campus ($153.00) or a double room on campus ($117.00 per person). In order to be assured a room, your check MUST be received no later than May 24, 1999. Please fill out the On Campus Housing registration form and make sure to indicate your name, address, phone number, gender, and roommate preference, if any. Roommate Preferences: If you have a preferred roommate, please indicate the person with whom you intend to room. If you wish to be assigned a roommate, please indicate your gender. 
Check-in Time: 1-3 at American Studies House in the Governor's Corner Complex. Late Check-in: Late arrivals will check in at the Summer Conference Office at the Elliot Program Center, near the Governor's Corner Complex. Check-out: By noon, Tuesday, June 22. Additional Nights: Additional nights may be arranged with Stanford Summer Conference Services at the rate of $43.75/night for singles and $32.00/night for doubles (no continental breakfast included). Those requesting additional nights will pay Summer Conference Services directly for the additional nights. If you wish additional nights, indicate so in writing and we will pass word on to them. Key Deposit: Each on-campus resident will be required to leave a refundable $70.00 key deposit. Special Services: Participants or attendees needing special arrangements to accommodate a disability may request accommodations by contacting Kenneth Taylor. Requests should be made by May 15th, 1999. phone 650-723-2547. e-mail taylor at turing.stanford.edu, fax: 650-723-0985 Phone Services: Pay phones are available in each dorm. Individual rooms do not have phones, however. On Campus Parking: Daily parking permits for $2.00/day may be purchased at check-in and at conference registration. On Campus Housing Reservation Form Mail completed form with check or money order made payable to Society for Philosophy and Psychology, ASAP (it must be received no later than May 24, 1999) to: Kenneth Taylor Department of Philosophy Stanford University Stanford, CA 94022. Name (Mr. or Ms.) 
Address Daytime Phone ( ) Fax ( ) e-mail Institutional Affiliation Name of Preferred Roommate: Reservation for a single room on campus (three days) $153.00/person Reservation for a double room on campus (three days) $117/person Total: $ Alternate Accommodations: For those preferring to stay off campus, a small number of rooms have been blocked off at the Cardinal Hotel in downtown Palo Alto, about a mile from the center of campus but close to many fine restaurants, bars, and shopping. To reserve a room at the Cardinal call: 650-323-5101. Rates: $107 + tax for both double and single rooms. The Cardinal Hotel is located at 235 Hamilton Ave, Palo Alto, California. Be sure to mention the Society for Philosophy and Psychology to receive the Stanford rate. This block of rooms will be released as of May 19th, 1999 and will be available only on a first-come, first-served basis thereafter. For additional hotels and other visitor information consult the following web page: http://www.stanford.edu/home/visitors/index.html Program Co-Chair: Guven Guzeldere guven at aas.duke.edu. Program Co-Chair: Stevan Harnad harnad at coglit.soton.ac.uk President (1999): Brian Cantwell Smith smithbc at indiana.edu Secretary-Treasurer: Karsten Stueber kstueber at holycross.edu Local Arrangements: Kenneth Taylor taylor at csli.stanford.edu From oby at cs.tu-berlin.de Wed May 5 06:06:28 1999 From: oby at cs.tu-berlin.de (Klaus Obermayer) Date: Wed, 5 May 1999 12:06:28 +0200 (MET DST) Subject: faculty position Message-ID: <199905051006.MAA29310@pollux.cs.tu-berlin.de> Dear Connectionists, the CS department of the Technical University of Berlin solicits applications for a tenured faculty position (C3) in the area of Computer Vision. Although we encourage candidates from a variety of backgrounds to apply, one potential focus is pattern recognition and neural computation. 
For information about the department and its research groups you are welcome to visit our Web pages at: http://www.cs.tu-berlin.de/cs/index-en.html Best wishes Klaus =========================================================================== FACULTY POSITION IN COMPUTER VISION CS-Department, Technical University of Berlin, Berlin, Germany The Department of Computer Science solicits applications for a tenured faculty position (salary level C3) in the area of image acquisition, processing, and understanding. The successful candidate is expected to join the department's undergraduate teaching programs, as well as the graduate education in the area of Computer Vision, and will be expected to teach courses in German after one year. Requirements: Ph.D. degree in computer science, electrical engineering, or neighboring fields; Habilitation or equivalent achievements; research experience in the field of computer vision; a strong publication record; teaching experience; track record in acquiring research grants. Experience in application areas such as biomedicine, automation and control, or robotics is desirable. Please send applications to: Search Committee (Computer Vision) FR 5-1, Department of Computer Science Technical University of Berlin Franklinstrasse 28/29 10587 Berlin, Germany The Technical University of Berlin wants to increase the percentage of women on its faculty and strongly encourages applications from qualified individuals. Handicapped persons are also encouraged to apply and will be preferred given equal qualifications. ============================================================================ Prof. Dr. 
Klaus Obermayer phone: 49-30-314-73442 FR2-1, NI, Informatik 49-30-314-73120 Technische Universitaet Berlin fax: 49-30-314-73121 Franklinstrasse 28/29 e-mail: oby at cs.tu-berlin.de 10587 Berlin, Germany http://ni.cs.tu-berlin.de/ From ml_conn at infrm.kiev.ua Wed May 5 11:39:50 1999 From: ml_conn at infrm.kiev.ua (Dmitri Rachkovskij) Date: Wed, 5 May 1999 17:39:50 +0200 (UKR) Subject: Connectionist symbol processing: any progress? References: Message-ID: Keywords: distributed representation, sparse coding, binary coding, binding, variable binding, thinning, representation of structure, structured representation, recursive representation, nested representation, compositional representation, connectionist symbol processing, associative-projective neural networks. Dear Colleagues, The following paper draft (abstract enclosed), inspired by last year's debate, is available at http://cogprints.soton.ac.uk/abs/comp/199904008 Dmitri A. Rachkovskij & Ernst M. Kussul "Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning". Comments are welcome! Thank you and best regards, Dmitri Rachkovskij Abstract: Distributed representations have often been criticized as inappropriate for encoding data with a complex structure. However, Plate's Holographic Reduced Representations and Kanerva's Binary Spatter Codes are recent schemes that allow on-the-fly encoding of nested compositional structures by real-valued or dense binary vectors of fixed dimensionality. In this paper we consider procedures of Context-Dependent Thinning which were developed for the representation of complex hierarchical items in the architecture of Associative-Projective Neural Networks. These procedures provide binding of items represented by sparse binary codevectors (with a low probability of 1s). Such an encoding is biologically plausible and makes it possible to reach a high information capacity in the distributed associative memory where the codevectors may be stored. 
In contrast to known binding procedures, Context-Dependent Thinning maintains the same low density (or sparseness) of the bound codevector for a varied number of constituent codevectors. Besides, a bound codevector is not only similar to another one with similar constituent codevectors (as in other schemes), but it is also similar to the constituent codevectors themselves. This makes it possible to estimate structural similarity simply by the overlap of codevectors, without retrieval of the constituent codevectors. This also allows easy retrieval of the constituent codevectors. Examples of algorithmic and neural network implementations of the thinning procedures are considered. We also present representation examples of various types of nested structured data (propositions using role-filler and predicate-arguments representation, trees, directed acyclic graphs) using sparse codevectors of fixed dimension. Such representations may provide a fruitful alternative to the symbolic representations of traditional AI, as well as to the localist and microfeature-based connectionist representations. ************************************************************************* Dmitri A. Rachkovskij, Ph.D. Net: dar at infrm.kiev.ua Senior Researcher, V.M.Glushkov Cybernetics Center, Tel: 380 (44) 266-4119 Pr. Acad. Glushkova 40, Kiev 22, 252022, UKRAINE Fax: 380 (44) 266-1570 ************************************************************************* From jf218 at hermes.cam.ac.uk Wed May 5 04:39:31 1999 From: jf218 at hermes.cam.ac.uk (Dr J. Feng) Date: Wed, 5 May 1999 09:39:31 +0100 (BST) Subject: Paper available In-Reply-To: <372808DF.8A5DDC14@syseng.anu.edu.au> Message-ID: Dear All, You can download the following paper (*.ps.gz) from my home page (address below) Variability of firing of Hodgkin-Huxley and FitzHugh-Nagumo neurons with stochastic synaptic input Phys. Rev. Lett. 
(in press) Abstract: The variability and mean of the firing rate of Hodgkin-Huxley and FitzHugh-Nagumo neurons subjected to random synaptic input are only weakly dependent on the level of inhibitory inputs, unlike integrate-and-fire neurons. For the latter model, substantial inhibitory input is essential to ensure output variability close to Poissonian firing. Jianfeng Feng The Babraham Institute Cambridge CB2 4AT UK http://www.cus.cam.ac.uk/~jf218 From jfgf at eng.cam.ac.uk Thu May 6 04:56:39 1999 From: jfgf at eng.cam.ac.uk (J.F. Gomes De Freitas) Date: Thu, 6 May 1999 09:56:39 +0100 (BST) Subject: Paper: Bayesian Support Vectors Message-ID: Dear colleagues A paper, to appear in NNSP99, on sequential Bayesian estimation techniques for support vectors is now available at: http://svr-www.eng.cam.ac.uk/~jfgf/publications.html As I am presently writing a longer version, I would very much appreciate your feedback, especially if you have any negative comments and/or if you can answer some of the questions I pose in the paper. ABSTRACT: In this paper, we derive an algorithm to train support vector machines sequentially. The algorithm makes use of the Kalman filter and is optimal in a Bayesian framework. It extends the support vector machine paradigm to applications involving real-time and non-stationary signal processing. It also provides a computationally efficient alternative to the problem of quadratic optimisation. 
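As background for the sequential Bayesian estimation mentioned above, the sketch below shows the standard Kalman-style recursive update for a linear-in-features Gaussian model, processing one observation at a time. The feature map, noise level, and data are invented for illustration; this is the generic recursive Bayesian building block, not the paper's support-vector algorithm.

```python
# Recursive Bayesian (Kalman) update for y = w . phi(x) + noise, one
# observation at a time; an illustration of sequential estimation, not
# the NNSP99 paper's method.
import numpy as np

rng = np.random.default_rng(3)

def phi(x):
    return np.array([1.0, x, x * x])     # simple polynomial features

w_true = np.array([0.5, -1.0, 2.0])      # invented ground truth
sigma2 = 0.01                            # observation noise variance

# Gaussian prior: w ~ N(0, 10 I)
w = np.zeros(3)
P = 10.0 * np.eye(3)

for _ in range(200):                     # stream observations one at a time
    x = rng.uniform(-1, 1)
    y = w_true @ phi(x) + np.sqrt(sigma2) * rng.standard_normal()
    h = phi(x)
    # Kalman gain, then conjugate Gaussian posterior update
    k = P @ h / (h @ P @ h + sigma2)
    w = w + k * (y - h @ w)
    P = P - np.outer(k, h) @ P

print("posterior mean:", np.round(w, 2))
```

Each step costs only a few small matrix-vector products, which is what makes such recursive updates attractive for real-time and non-stationary settings compared with re-solving a batch quadratic program.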
Thanks Nando _______________________________________________________________________________ JFG de Freitas (Nando) Speech, Vision and Robotics Group Information Engineering Cambridge University CB2 1PZ England http://svr-www.eng.cam.ac.uk/~jfgf Tel (01223) 302323 (H) (01223) 332754 (W) _______________________________________________________________________________ From stephen at computing.dundee.ac.uk Thu May 6 05:14:08 1999 From: stephen at computing.dundee.ac.uk (Stephen McKenna) Date: Thu, 06 May 1999 10:14:08 +0100 Subject: PhD/CASE Studentships, University of Dundee Message-ID: <37315D60.6D30C949@computing.dundee.ac.uk> PhD studentships in the areas of face recognition and human action/behaviour recognition using computer vision are currently available in the Department of Applied Computing, University of Dundee. The department was awarded a 5A rating in the latest UK research assessment exercise. A CASE award is available in collaboration with NCR and provides an additional maintenance grant to complement the EPSRC studentship rate. Applicants should send CV and references to Dr Stephen McKenna, Department of Applied Computing, University of Dundee, Scotland. Informal enquiries to stephen at computing.dundee.ac.uk From tcp1 at leicester.ac.uk Thu May 6 07:18:40 1999 From: tcp1 at leicester.ac.uk (Tim .C. Pearce) Date: Thu, 6 May 1999 12:18:40 +0100 (BST) Subject: Studentship Opportunity Message-ID: <350B7A7964@violet.le.ac.uk> Postgraduate Studentship in Neuromorphic Engineering An opportunity exists for a studentship in the broad area of neuroinformatics and sensory information processing. 
A wide range of possible research topics will be considered relating to artificial and biological olfaction (smell), including but not limited to: optical chemical sensing for artificial olfaction, signal processing for chemical sensor arrays using classical or neuromorphic pattern recognition approaches, discrete electronic or silicon implementations of neurones for sensory processing, and/or processing of electrophysiological data from the biological olfactory bulb in order to investigate the coding of odour information. The successful candidate will hold an upper second class honours degree or better in a relevant discipline, including all areas of engineering, computer science, mathematics, chemistry, physics or biology. Computer literacy and evidence of mathematical ability will be seen as a distinct advantage. The award will include full-time fees for higher degree registration and a maintenance grant of 8,000 pounds sterling p.a., for which students may be required to contribute to laboratory demonstration within the department (up to a maximum of 6 hours/week). For further details and informal discussions contact Dr. Tim Pearce, Tel: +44 (0) 116 223 1290, E-mail: t.c.pearce at le.ac.uk. Those interested should submit a CV and a single-page statement of potential research interests to Dr. Tim C. Pearce, Department of Engineering, University of Leicester, University Road, Leicester LE1 7RH, United Kingdom. In order to guarantee consideration, applications should be submitted by June 18th, 1999, although the studentship will remain open until a suitable candidate has been found. The University of Leicester is an equal opportunities employer. Regards, Tim. -- T.C.
Pearce, PhD, Lecturer in Bioengineering, Bioengineering, Transducers and Signal Processing Group, Department of Engineering, University of Leicester, Leicester LE1 7RH, United Kingdom. URL: http://www.leicester.ac.uk/engineering/ E-mail: t.c.pearce at leicester.ac.uk Tel: +44 (0)116 223 1290 Fax: +44 (0)116 252 2619 From murase at synapse.fuis.fukui-u.ac.jp Thu May 6 04:28:40 1999 From: murase at synapse.fuis.fukui-u.ac.jp (kazuyuki murase) Date: Thu, 06 May 1999 17:28:40 +0900 Subject: Professor Positions in Japan Message-ID: <373152B8.5A09912E@synapse.fuis.fukui-u.ac.jp> Dear Sirs, I would like to announce the following openings of three Professor and/or Associate Professor positions in the Department of Human and Artificial Intelligent Systems at Fukui University in Japan. Potential candidates are welcome to apply, and I would also appreciate it if you would forward this to colleagues who might be interested. Sincerely yours, Kazuyuki Murase Chairman Department of Human and Artificial Intelligent Systems (HARTs) Faculty of Engineering, Fukui University 3-9-1 Bunkyo, Fukui 910-8507, Japan. Phone: (+81) 776-27-8774, Fax: (+81) 776-27-8751 E-mail: murase at synapse.fuis.fukui-u.ac.jp THREE FACULTY POSITIONS AVAILABLE The Faculty of Engineering at Fukui University, a National University of Japan, seeks candidates for three professor and/or associate professor positions in the newly established Department of Human and Artificial Intelligent Systems (HARTs). The department aims to study and teach the fundamentals and applications of intelligent systems. It consists of three academic units (Basic Intelligent Systems, Applied Intelligent Systems, and Intelligent Systems for Human Aid) and is planned to have a total of twenty-two faculty members by the fiscal year 2002. Candidates with a variety of backgrounds related to human and artificial intelligence as well as systems science and engineering are encouraged to apply.
The areas of research include: Intelligent Robotics, Intelligent Sensing, Cognitive Science, Emergent Systems, Automatic Control, Artificial Intelligence, AI Systems, Adaptive Learning and Autonomous Systems, Evolutionary Systems, Multi-agent Systems, and others. Applicants are expected to establish an independent, and highly original, research program. They are to teach graduate and undergraduate courses, in which students are mostly Japanese, and to supervise student research at the undergraduate, Master's and doctoral levels. Applicants should send a curriculum vitae, copies of publications, summaries of twelve representative publications, a statement of present and future research plans and teaching interests, and the names of two references by August 31, 1999, to Dr. Kazuyuki Murase, Department of Human and Artificial Intelligent Systems, Fukui University, 3-9-1 Bunkyo, Fukui 910-8507, Japan. For informal inquiries, phone: (0776) 27-8774, fax: (0776) 27-8751. E-mail: murase at synapse.fuis.fukui-u.ac.jp From herbert.jaeger at gmd.de Thu May 6 05:43:37 1999 From: herbert.jaeger at gmd.de (Herbert Jaeger) Date: Thu, 06 May 1999 11:43:37 +0200 Subject: New paper on stochastic time series modeling Message-ID: <37316377.D915FE60@gmd.de> Dear Connectionists, I would like to announce the paper Herbert Jaeger, "Observable operator models for discrete stochastic time series", accepted for publication by Neural Computation. Abstract: Hidden Markov models are a widely used class of models for stochastic systems. Systems which can be modeled by hidden Markov models are a proper subclass of *linearly dependent processes*, a class of stochastic systems known from mathematical investigations carried out over the last four decades. This article provides a novel, simple characterization of linearly dependent processes, called observable operator models.
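Concretely, an observable operator model assigns one matrix T_a to each observable symbol and evaluates the probability of a sequence by applying the operators in turn: P(a_1 ... a_k) = 1^T T_{a_k} ... T_{a_1} w_0. The following minimal sketch builds valid operators from a toy two-state HMM; the numbers are illustrative and not taken from the paper.

```python
import numpy as np
from itertools import product

# A toy 2-state HMM, used here only to construct valid observable operators.
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])          # row-stochastic state transitions
B = np.array([[0.9, 0.1],
              [0.3, 0.7]])          # B[i, a] = P(observe symbol a | state i)
w0 = np.array([0.5, 0.5])           # initial state distribution

# One operator per observable symbol: (T_a w)_j = sum_i A[i, j] B[i, a] w_i
T = [A.T @ np.diag(B[:, a]) for a in range(2)]

def seq_prob(seq):
    """P(a_1 ... a_k) = 1^T T_{a_k} ... T_{a_1} w_0."""
    w = w0
    for a in seq:
        w = T[a] @ w
    return w.sum()

# Probabilities of all length-3 sequences sum to one.
total = sum(seq_prob(s) for s in product([0, 1], repeat=3))
print(round(total, 10))  # → 1.0
```

The normalization check works because the operators sum to A.T, whose columns each sum to one, so summing over all sequences of a given length always yields probability one.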
The mathematical properties of observable operator models lead to a constructive learning algorithm for the identification of linearly dependent processes. The core of the algorithm has a time complexity of O(N + n m^3), where N is the size of the training data, n is the number of distinguishable outcomes of observations, and m is the model state space dimension. A preprint of the paper is available electronically at ftp://ftp.gmd.de/GMD/ais/publications/1999/jaeger.99.neco.pdf (PDF format, 410 K) ftp://ftp.gmd.de/GMD/ais/publications/1999/jaeger.99.neco.ps.gz (g'zipped PostScript format, 700 K) I would warmly appreciate your comments! Sincerely, Herbert Jaeger ---------------------------------------------------------------- Dr. Herbert Jaeger, German National Research Center for Information Technology (GMD), AiS.BE, Schloss Birlinghoven, D-53754 Sankt Augustin, Germany. Phone +49-2241-14-2253 Fax +49-2241-14-2384 email herbert.jaeger at gmd.de http://www.gmd.de/People/Herbert.Jaeger/ From grb at neuralt.com Fri May 7 09:59:33 1999 From: grb at neuralt.com (George Bolt) Date: Fri, 7 May 1999 14:59:33 +0100 Subject: Job vacancy - Neural Technologies Limited Message-ID: Neural Scientist Neural Technologies Limited is the leading UK company working in the application and exploitation of neural computing and other advanced technologies across a wide range of industrial and commercial environments. Our continued growth has led to the requirement for an applied Neural Scientist to join our highly motivated team to help in the development and deployment of practical advanced computing solutions on high-profile projects. Do you want to apply your neural computing skills to solve real-world problems? Neural Technologies can offer you this opportunity - just some of the areas we work in are: * Telecommunications - fraud, churn, etc. * Finance - credit scoring, risk management, instrument trading, etc.
* Marketing - modelling and analysis * Data Analysis and Visualisation - virtual reality * Etc. You will be expected to demonstrate not only high standards of professionalism but technical innovation second to none. Self-confidence, adaptability, proactivity and communication skills are as important as the technical skills. Required skills are: * Well versed in neural network and other advanced algorithm development and their practical application; should have at least 2 years' applied knowledge of at least 2 of the following: * MLP, RBF, Decision Trees, etc. * Kohonen/SOM, LVQ, etc. * Rule induction and inferencing, case-based reasoning, etc. * Evolution, GAs, etc. * Optimisation * Experienced using MATLAB * Proven problem-solving abilities and system design * Good mathematical background * Able to code in C or C++ within the PC environment Experience of the following would also be an advantage: * Knowledge of conventional statistics * Signal processing techniques (e.g. speech) * Application domains (credit scoring, fraud analysis, telecommunications, banking and finance) All candidates should be working at a practical research level or have extensive industrial experience. A keen view to the commercial realities of working within a small, but fast growing, company is required. Neural Technologies Limited operates a non-smoking policy.
Contact: Julie Naylor, Technical Administrator, Neural Technologies Limited, Bedford Road, PETERSFIELD, Hampshire GU32 3QA (UK) Fax: +44 (0) 1730 260466 Phone: +44 (0) 1730 260256 Email: techadmn at neuralt.com Website: http://www.neuralt.com George Bolt Director of Product Innovation Neural Technologies Cafe Neural: http://www.neuralt.com Tel: +44 (0) 1730 260 256 Fax: +44 (0) 1730 260 466 > ********** NOTE > Any views expressed in this message are those of the individual > sender, > except where the sender specifically states them to be the views of > Neural Technologies Limited > ********** > From shai at cs.Technion.AC.IL Sun May 9 05:25:38 1999 From: shai at cs.Technion.AC.IL (Shai Ben-David) Date: Sun, 9 May 1999 12:25:38 +0300 (IDT) Subject: COLT99 program Message-ID: <199905090925.MAA15305@cs.Technion.AC.IL> Twelfth Annual Conference on Computational Learning Theory University of California at Santa Cruz July 6-9, 1999 ======================================== A PRELIMINARY PROGRAM ======================================== Tuesday, July 6 --------------- Session 1 (9:00-10:30) --------- The Robustness of the p-norm Algorithms, Claudio Gentile and Nick Littlestone Minimax Regret under Log Loss for General Classes of Experts, Nicolo Cesa-Bianchi and Gabor Lugosi On Prediction of Individual Sequences Relative to a set of Experts, Neri Merhav and Tsachy Weissman Regret Bounds for Prediction Problems, Geoffrey J. Gordon Session 2 (11:00-12:00) --------- On theory revision with queries, Robert H. Sloan and Gyorgy Turan Estimating a mixture of two product distributions, Yoav Freund and Yishay Mansour An Apprentice Learning Model, Stephen S. Kwek Session 3 (2:00-3:00) --------- Uniform-Distribution Attribute Noise Learnability, Nader H. Bshouty and Jeffrey C. Jackson and Christino Tamon On Learning in the Presence of Unspecified Attribute Values, Nader H. Bshouty and David K. Wilson Learning Fixed-dimension Linear Thresholds From Fragmented Data, Paul W. 
Goldberg Tutorial 1 (3:30-5:30) --------- Boosting, Yoav Freund and Rob Schapire ++++++++++++++++++++++++++++++++++++++++ 19:00 - 21:00 RECEPTION +++++++++++++++++++++++++++++++++++++++++ Wednesday, July 7 ----------------- Invited Speaker --------------- TBA, David Shmoys (9:00-10:00) Session 4 (10:30 - 12:10) --------- An adaptive version of the boost-by-majority algorithm, Yoav Freund Drifting Games, Robert E. Schapire Additive Models, Boosting, and Inference for Generalized Divergences, John Lafferty Boosting as Entropy Projection, J. Kivinen and M. K. Warmuth Multiclass Learning, Boosting, and Error-Correcting Codes, Venkatesan Guruswami and Amit Sahai Session 5 (2:00-3:00) --------- Theoretical Analysis of a Class of Randomized Regularization Methods, Tong Zhang PAC-Bayesian Model Averaging, David McAllester Viewing all Models as `Probabilistic', Peter Grunwald Tutorial 2 (3:30- 5:30) ---------- Reinforcement Learning, Michael Kearns (?) and Yishay Mansour +++++++++++++++++++++++++++++++++++++++++ -------------- Thursday, July 8 ----------------- Session 6 (9-10:30) --------- Reinforcement Learning and Mistake Bounded Algorithms, Yishay Mansour Convergence analysis of temporal-difference learning algorithms, Vladislav Tadic Beating the Hold-Out, Avrim Blum and Adam Kalai and John Langford Microchoice Bounds and Self Bounding Learning Algorithms, John Langford and Avrim Blum Session 7 (11:00- 12:00) --------- Learning Specialist Decision Lists, Atsuyoshi Nakamura Linear Relations between Square-Loss and Kolmogorov Complexity, Yuri A. Kalnishkan Individual sequence prediction - upper bounds and application for complexity, Chamy Allenberg Session 8 (2:00- 3:00) ---------- Extensional Set Learning, S. A. 
Terwijn On a generalized notion of mistake bounds, Sanjay Jain and Arun Sharma On the intrinsic complexity of learning infinite objects from finite samples, Kinber and Papazian and Smith and Wiehagen +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Friday, July 9 -------------- Tutorial 3 (9:00-11:00) ---------- Large Margin Classification, Peter Bartlett, John Shawe-Taylor, and Bob Williamson Session 9 (11:30-12:10) --------- Covering Numbers for Support Vector Machines, Ying Guo and Peter L. Bartlett and John Shawe-Taylor and Robert C. Williamson Further Results on the Margin Distribution, John Shawe-Taylor and Nello Cristianini Session 10 (2:00- 3:40) ---------- Attribute Efficient PAC-learning of DNF with Membership Queries, Nader H. Bshouty and Jeffrey C. Jackson and Christino Tamon On PAC Learning Using Winnow, Perceptron, and a Perceptron-Like Algorithm, Rocco A. Servedio Extension of the PAC Framework to Finite and Countable Markov Chains, David Gamarnik Learning threshold functions with small weights using membership queries, E. Abboud, N. Agha, N.H. Bshouty, N. Radwan, F. Saleh Exact Learning of Unordered Tree Patterns From Queries, Thomas R. Amoth and Paul Cull and Prasad Tadepalli +++++++++++++++++++++++++++++++++++++++++ From marwan at ee.usyd.edu.au Mon May 10 07:47:00 1999 From: marwan at ee.usyd.edu.au (Marwan Jabri) Date: Mon, 10 May 1999 21:47:00 +1000 (EST) Subject: postdoc Message-ID: Please feel free to post... Post-Doctoral Fellow in Wavelets-Based Timeseries Analysis Computer Engineering Laboratory School of Electrical and Information Engineering The University of Sydney, Australia Applications are invited for a Post-Doctoral Fellow position funded by an Australian Research Council project grant, in collaboration with a financial engineering company. The two-year project aims at investigating wavelet preprocessing techniques and their applications to timeseries analysis and compression.
Applicants should have completed (or be about to complete) their PhD in an electrical, computer or related engineering or science discipline and have demonstrated research capacity in the area of timeseries analysis, machine learning or a related field. The fellow will be expected to work independently and to play a leading role in the project, co-supervising the postgraduate students contributing to it. Knowledge of computational techniques in the neural computing area, and their implementation in software, is an advantage. Appointment will be made initially for a period of one year, renewable for another year subject to progress. Expected starting date: June 1999. Closing date: 21 May 1999 Salary range: A$ 41,620 - A$ 46,017 To apply, send a letter of application, CV and the names, fax and email of three referees to M. Jabri Tel (+61-2) 9351 2240, Fax (+61-2) 9351 7209, Email: marwan at sedal.usyd.edu.au From dhw at ptolemy.arc.nasa.gov Mon May 10 19:46:23 1999 From: dhw at ptolemy.arc.nasa.gov (David Wolpert) Date: Mon, 10 May 1999 16:46:23 -0700 (PDT) Subject: Job Announcement Message-ID: <199905102346.QAA16733@buson.arc.nasa.gov> Job opening in the Information Directorate at NASA Ames Research Center. Description: This position provides an opportunity to participate in research on Collective Intelligence (COIN), that is, the analysis of large, distributed artificial systems and the implementation of local strategies for augmenting their performance. In particular, this position involves network-based implementations of COIN research involving machine learning. Tasks will include: -- Building and maintaining models for simulating the behavior of networks, using the OPNET network simulator. Setting up and running experiments on the application of COIN technology to the simulated control of those networks. Creating and investigating new COIN technology in the domain of the OPNET simulations.
-- Writing programs and scripts to carry out data collection from the network simulations. Analyzing the data and creating plots from these experiments, using packages like Matlab or OPNET's built-in analysis tool. The research of the group is described at http://ic.arc.nasa.gov/ic/projects/bayes-group/index.html. Minimal Requirements: -- B.A./B.S. in Computer Science, Mathematics, Statistics, or related field. -- Extended knowledge of and experience with C and/or C++. -- Basic knowledge of UNIX. -- Interest in Artificial Intelligence / Machine Learning / Statistics and/or network theory. Preferred: -- Masters Degree -- Experience with the OPNET network simulator. -- Experience with Perl or shell scripts. -- Experience with Matlab. -- Familiarity with reinforcement learning algorithms. -- U.S. citizen or permanent resident. Please direct responses, including a resume, to David Wolpert, Automated Learning Group NASA Ames Research Center MS 269-1, Moffett Field, CA 94035, USA dhw at ptolemy.arc.nasa.gov. Salary will be commensurate with experience. NASA/Ames is an equal opportunity employer. From robtag at dia.unisa.it Mon May 10 06:09:04 1999 From: robtag at dia.unisa.it (Tagliaferri Roberto) Date: Mon, 10 May 1999 12:09:04 +0200 Subject: Program WIRN Vietri '99: XI ITALIAN WORKSHOP ON NEURAL NETS Message-ID: <9905101009.AA30926@udsab> WIRN Vietri '99 XI ITALIAN WORKSHOP ON NEURAL NETS Preliminary Program (IIASS "E.R. Caianiello", 20-22 May 1999) http://dsi.ing.unifi.it/neural/siren/WIRN99/home_en.html Thursday 20 May 9.30 - 10.30 Use and Abuse of Neural Networks Niranjan M. (invited talk) 10.30 - 11.00 coffee break Models 11.00 - 11.20 Dynamics of On-line Learning in Radial Basis Function Neural Networks Marinaro M., Scarpetta S. 11.20 - 11.40 Online Learning with Adaptive Local Step Sizes Schraudolph N.N. 11.40 - 12.00 Continual Prediction using LSTM with Forget Gates Gers F.A., Schmidhuber J., Cummins F.
12.00 - 12.20 Polynomial Clustering Exhibit Statistical Estimation Abilities Fiori S., Burrascano P. 12.20 - 13.20 Theory, Implementation, and Applications of Support Vector Machines Pittore M., Verri A. (review talk) Poster Session 15.30 highlight spotting Friday 21 May 9.00 - 10.00 Computational Intelligence in Hydroinformatics: a Review Cicioni Gb., Masulli F. (review talk) Applications 10.00 - 10.20 A Neural Network Based Urban Environment Monitoring System Exploiting Previous Knowledge Simone G., Morabito F.C. 10.20 - 10.40 A Combination of Tools: NLS and NN Estimation of the Consumers' Expenditure in Durable Goods. Determinants, Trend and Forecasting for the Motor Vehicles Sector D'Orio G. 10.40 - 11.10 coffee break Signal and Image Processing 11.10 - 11.30 A Feed-Forward Neural Network for Robust Segmentation of Color Images Amoroso C., Chella A., Morreale V., Storniolo P. 11.30 - 11.50 A Neural Network Based ARX Model of Virgo Noise Barone F., De Rosa R., Eleuteri A., Garufi F., Milano L., Tagliaferri R. Architectures and Algorithms 11.50 - 12.10 Building Neural and Logical Networks with Hamming Clustering Muselli M. 12.10 - 12.30 Training Semiparametric Support Vector Machines Mattera D., Palmieri F., Haykin S. 12.30 - 12.50 An Analog On-chip Learning Architecture Based on Weight Perturbation Algorithm and on Current Mode Translinear Circuits Valle M., Diotalevi F., Bo G.M., Biglieri E., Caviglia D.D. Caianiello Session 15.00 - 16.00 Title to be announced Taylor J. (invited talk) 16.00 - 16.30 the winner of the Caianiello prize 16.30 - 17.00 coffee break 17.00 SIREN assembly 20.00 Social Dinner Saturday 22 May Special Session on Neural Networks in Economics (Co-chairs S. Giove and M. Salzano) 9.00 - 9.50 Neural graphs in the handling of economic and management problems Gil Aluja (invited talk) 9.50 - 10.40 Statistical Neural Networks and their Applications in Economics Lauro N.C., Davino C., Vistocco D.
(review talk) 10.40 - 11.10 coffee break 11.10 - 11.35 title to be announced von Altrock (invited talk) 11.35 - 11.55 Neural Network for Economic Forecasting Salzano M. (review talk) 11.55 - 12.15 Fuzzy Local Algorithms for Time Series Analysis and Forecasting Giove S. (review talk) 12.15 - 12.35 Regional Economic Policy and Computational Economics Marino D. (review talk) 12.35 - 12.50 A Fuzzy Definition of Industrial District Bruni S., Facchinetti G., Paba S. 12.50 - 13.05 Estimating the Conditional Mean of Non-linear Time Series using Neural Networks Giordano F., Perna C. Papers in the Poster Session The N-SOBoS Model Frisone F., Morasso P.G. A Neural Network Approach to Detect Functional MRI Signal Frisone F., Morasso P.G., Vitali P., Rodriguez G., Pilot A., Sardanelli F., Rosa M. Interval Arithmetic Multilayer Perceptron as Possibility-Necessity Pattern Classifier Drago G.P., Ridella S. Inferring Understandable Rules through Digital Synthesis Muselli M., Liberati D. YANNS: Yet Another Neural Network Simulator d'Acierno A., Sansone S. A Multilayer Perceptron for Fast Interpolation of JPEG/MPEG Coded Images Carrato S. A General Assembly as Implementation of a Hebbian Rule in a Boolean Neural Network Lauria F.E., Milo M., Prevete R., Visco S. The Search for Spiculated Lesions in the CALMA Project: Status and Perspectives Marzulli V.M. An Experimental Comparison of Three PCA Neural Techniques Fiori S. Weightless Neural Networks for Face Recognition Lauria S., Mitchell R. Gesture Recognition using hybrid SOM/DHMM Corradini A., Boehme H.J., Gross H.M. Parameter Identification using Aspects - Application to the Human Cardiovascular System Asteroth A., Frings-Naberschulte J., Müller K. Development of Selectivity Maps in a BCM Network using Various Connectivity Schemes Remondini D., Castellani G.C., Bazzani A., Campanini R., Bersani F. A Novel Wavelet Filtering Method in SAR Image Classification by Neural Networks Simone G., Morabito F.C.
Real Time Neural Network Disruption Prediction in Tokamak Reactors Morabito F.C., Versaci M. The Automatic Detection of Microcalcification Clusters in the CALMA Project: Status and Perspectives Delogu P. Recursive Networks: an Overview of Theoretical Results Bianchini M., Gori M., Scarselli F. Local Wavelet Decomposition and its Application to Face Reconstruction Borghese N.A., Ferrari S., Piuri V. Harmony Theory and Binding Problem Pessa E., Pietronilla Penna M. Scale Based Clustering Optimization via Gravitational Law Imitation Frattale Mascioli F.M., Rizzi A., Scrocca G., Martinelli G. Signal Classification by Subspace Neural Networks Martinelli G., Di Giacomo M. On-line Quality Control of DC Permanent Magnet Motor using Neural Networks Solazzi M., Uncini A. Neural Networks for Spectral Analysis of Unevenly Sampled Data Tagliaferri R., Ciaramella A., Milano L., Barone F. Using the Hermite Regression Algorithm to Improve the Generalization Capability of a Neural Network Pilato G., Sorbello F., Vassallo G. Training Semiparametric Support Vector Machines Mattera D., Palmieri F., Haykin S. A Comparison among Clustering Techniques for Identifying Objects in Images Carrai P., Izzo G., Esposito A., Agarossi L. From Peter.Bartlett at syseng.anu.edu.au Tue May 11 03:47:20 1999 From: Peter.Bartlett at syseng.anu.edu.au (Peter Bartlett) Date: Tue, 11 May 1999 17:47:20 +1000 (EST) Subject: position in machine learning at ANU Message-ID: <199905110747.RAA26686@reid.anu.edu.au> The Machine Learning Group at the Australian National University is advertising a position in theoretical and experimental machine learning. It's a continuing research + graduate teaching position at academic level C ("Fellow" = assistant/associate professor). Closing date is June 4. See http://wwwrsise.anu.edu.au/ad.html#LevC_ML for the advertisement and details. -- Peter Bartlett. 
From malaka at eml.org Tue May 11 07:51:36 1999 From: malaka at eml.org (Rainer Malaka) Date: Tue, 11 May 1999 13:51:36 +0200 Subject: job openings at EML, Germany Message-ID: <373819C8.BBED9617@eml.org> Could you please forward the following announcement. The jobs are partially related to connectionism and might be interesting to the people on the list. Best regards Rainer Malaka ######################################## Research Positions at the European Media Laboratory The Human Language Technology and the Personal Memory group at the European Media Laboratory in Heidelberg are seeking several researchers to work in the areas of information retrieval, information extraction, domain ontology building, and human computer interfaces. 1. A researcher with experience in terminological ontology building, knowledge representation languages, reasoning, lexical / knowledge acquisition from corpora. The appointee will work in close collaboration with the team of Bio-Informatics (molecular biology) of EML and the Department of Information Science of Tokyo University. We require someone who has a PhD in NLP or in CS with a demonstrated ability to do independent research. Preference will be given to applicants who can demonstrate practical abilities in the building of domain ontologies and who have a strong NLP background. 2. A researcher to work on natural language interfaces with excellent knowledge in object-oriented software development (e.g., Java) and XML. The appointee should have experience in GUI programming. Experience in dialog modeling and discourse structure would be an advantage. The successful candidate will be responsible for designing and implementing an interactive interface for information retrieval, database integration and ontological integration. The candidate will join an interdisciplinary team of computational linguists, computer scientists and domain experts who will use the designed interface for practical purposes. 3. 
A researcher with strong interests in the area of NLP and in particular in statistical NLP. The candidate should have excellent programming skills. The candidate should be familiar with the theory and implementation of finite-state automata, finite-state transducers and robust parsing techniques. The candidate is expected to work on areas such as semantic information retrieval, document classification, and clustering. 4. A Computer Scientist with a strong background in one or more of the following areas: adaptive user interfaces, dialog management for human-computer interaction, context and situation modeling. We expect the candidates to have significant experience in object-oriented software development (e.g., Java), machine learning, and databases. Successful candidates should have a PhD or professional experience in the field. Successful candidates will join a multi-disciplinary group and will participate in projects that aim at building user-oriented computer systems such as mobile tourism information systems. These projects are embedded in a network of collaborations with national and international research partners from industry and academia. We offer competitive salaries, depending on professional experience and scientific achievements. The positions are available for 3 years, with the possibility of renewing the appointment depending on performance and availability of funding. EML is a newly established private research laboratory that primarily does contract research for the Klaus Tschira Foundation. It engages in research into the manifold uses of information technology, its primary interest being the development of new ways to increase the usefulness of such technology for the individual and for society. Scientists from many different disciplines and countries work together at the EML; in particular, there is a regular exchange between the EML and national and international institutions.
You will experience a challenging and stimulating international work environment here. In addition, the EML is located in one of the most beautiful old mansions of Heidelberg. Should you be interested in working with us, please send your application including a full CV and relevant material attesting your qualifications by 11th of June 1999 to our secretary, c/o Bärbel Mack, Schloss-Wolfsbrunnenweg 33, D-69118 Heidelberg, . For further information, please contact our web-site www.eml.org. From jairmail at ISI.EDU Tue May 11 16:33:44 1999 From: jairmail at ISI.EDU (Steve Minton) Date: Tue, 11 May 1999 13:33:44 -0700 (PDT) Subject: JAIR article: "Learning to Order Things" Message-ID: <199905112033.NAA29984@quark.isi.edu> Readers of this mailing list may be interested in the following article, which was just published by JAIR: Cohen, W.W., Schapire, R.E., and Singer, Y. (1999) "Learning to Order Things", Volume 10, pages 243-270. Available in PDF, PostScript and compressed PostScript. For quick access via your WWW browser, use this URL: http://www.jair.org/abstracts/cohen99a.html More detailed instructions are below. Abstract: There are many applications in which it is desirable to order rather than classify instances. Here we consider the problem of learning how to order instances given feedback in the form of preference judgments, i.e., statements to the effect that one instance should be ranked ahead of another. We outline a two-stage approach in which one first learns by conventional means a binary preference function indicating whether it is advisable to rank one instance before another. Here we consider an on-line algorithm for learning preference functions that is based on Freund and Schapire's 'Hedge' algorithm. In the second stage, new instances are ordered so as to maximize agreement with the learned preference function. We show that the problem of finding the ordering that agrees best with a learned preference function is NP-complete.
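To make the second stage concrete, here is a hedged sketch of one greedy ordering rule studied for this kind of problem: repeatedly emit the item whose outgoing preference mass most exceeds its incoming mass. This is an illustration rather than the authors' exact algorithm, and the preference matrix below is invented.

```python
import numpy as np

def greedy_order(pref):
    """Greedy ordering from a pairwise preference matrix.

    pref[i, j] in [0, 1] is the learned degree to which item i should
    be ranked ahead of item j.  At each step, emit the remaining item
    whose outgoing-minus-incoming preference mass is largest.
    """
    remaining = list(range(pref.shape[0]))
    order = []
    while remaining:
        # potential = how strongly each item "beats" the rest
        pot = {v: sum(pref[v, u] - pref[u, v] for u in remaining if u != v)
               for v in remaining}
        best = max(remaining, key=lambda v: pot[v])
        order.append(best)
        remaining.remove(best)
    return order

# A consistent preference matrix: item 0 ahead of 1 ahead of 2.
pref = np.array([[0.0, 0.9, 0.8],
                 [0.1, 0.0, 0.7],
                 [0.2, 0.3, 0.0]])
print(greedy_order(pref))  # → [0, 1, 2]
```

On a consistent matrix like this one, the greedy rule recovers the intended total order; greedy heuristics of this kind matter precisely because exact maximization of agreement is NP-complete.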
Nevertheless, we describe simple greedy algorithms that are guaranteed to find a good approximation. Finally, we show how metasearch can be formulated as an ordering problem, and present experimental results on learning a combination of 'search experts', each of which is a domain-specific query expansion strategy for a web search engine. The article is available via: -- comp.ai.jair.papers (also see comp.ai.jair.announce) -- World Wide Web: The URL for our World Wide Web server is http://www.jair.org/ For direct access to this article and related files try: http://www.jair.org/abstracts/cohen99a.html -- Anonymous FTP from either of the two sites below. Carnegie-Mellon University (USA): ftp://ftp.cs.cmu.edu/project/jair/volume10/cohen99a.ps The University of Genoa (Italy): ftp://ftp.mrg.dist.unige.it/pub/jair/pub/volume10/cohen99a.ps The compressed PostScript file is named cohen99a.ps.Z (229K) -- automated email. Send mail to jair at cs.cmu.edu or jair at ftp.mrg.dist.unige.it with the subject AUTORESPOND and our automailer will respond. To get the Postscript file, use the message body GET volume10/cohen99a.ps (Note: Your mailer might find this file too large to handle.) Only one file can be requested in each message. For more information about JAIR, visit our WWW or FTP sites, or send electronic mail to jair at cs.cmu.edu with the subject AUTORESPOND and the message body HELP, or contact jair-ed at ptolemy.arc.nasa.gov. From horn at alice.nc.huji.ac.il Thu May 13 07:22:55 1999 From: horn at alice.nc.huji.ac.il (David Horn) Date: Thu, 13 May 1999 14:22:55 +0300 Subject: new deadline NCST-99 Message-ID: Special Announcement ==================== postponement of deadline for submissions to the conference Neural Computation in Science and Technology ============================================ Place: Maale Hachamisha, Israel. Dates of the conference: October 10-13, 1999. Deadline for submission of contributions: postponed to June 15, 1999.
============== Due to a local strike, mail and email communications to and from Tel Aviv University were cut off during the past week, and are still down at present. As a result, we have decided to postpone the deadline for submission of contributions from May 15 to June 15. The conference will concentrate on modern issues in computational neuroscience as well as in applications of neural network techniques in science and technology. Currently confirmed speakers are: D. Amit, Y. Baram, M. Bialek, E. Domany, R. Douglas, G. Dreyfus, W. Gerstner, D. Golomb, M. Hasselmo, J. Hertz, D. Horn, N. Intrator, I. Kanter, W. Kinzel, R. Miikkulainen, K. Obermayer, E. Oja, E. Ruppin, I. Segev, T. Sejnowski, H. Siegelmann, H. Sompolinsky, N. Tishby, M. Tsodyks, V. Vapnik and D. Willshaw. In view of the current communication problem, abstracts of submitted papers should be sent to the interim email address horn at alice.nc.huji.ac.il. Four copies of full papers up to seven pages in length should be mailed to Prof. David Horn, NCST-99, School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978, Israel. Registration forms and further information will be available at the website http://neuron.tau.ac.il when electronic services resume. From szepes at sol.cc.u-szeged.hu Sat May 15 15:25:51 1999 From: szepes at sol.cc.u-szeged.hu (Szepesvari Csaba) Date: Sat, 15 May 1999 21:25:51 +0200 (MET DST) Subject: Paper available Message-ID: The following paper is available from http://victoria.mindmaker.hu/~szepes/papers/scann98.ps.gz Reinforcement Learning: Theory and Practice Cs. Szepesvári in Proceedings of the 2nd Slovak Conference on Artificial Neural Networks (SCANN'98). Nov. 10-12, 1998, Smolenice, Slovakia, pp. 29-39 (Ed: Marian Hrehus) We consider reinforcement learning methods for the solution of complex sequential optimization problems. In particular, the soundness of two methods proposed for the solution of partially observable problems will be shown.
The first method is a state-estimation scheme and requires mild {\em a priori} knowledge, while the second method assumes that a significant amount of abstract knowledge is available about the decision problem and uses this knowledge to set up a macro-hierarchy that turns the partially observable problem into one that can be handled using methods worked out for observable problems. This second method is also illustrated with some experiments on a real robot. -------------------------------------------------------------------- Csaba Szepesvari Head of Research Department Mindmaker Ltd. Budapest 1112 Konkoly-Thege Miklos u. 29-33. HUNGARY e-mail: szepes at mindmaker.hu WEB: http://victoria.mindmaker.hu/~szepes Phone: +36 1 395 9220/1205 (dial extension continuously) Fax: +36 1 395 9218 From piuri at elet.polimi.it Mon May 17 05:34:33 1999 From: piuri at elet.polimi.it (vincenzo piuri) Date: Mon, 17 May 1999 11:34:33 +0200 Subject: IJCNN'2000 - PRELIMINARY CALL FOR PAPERS Message-ID: <3.0.5.32.19990517113433.014e1100@elet.polimi.it> Dear Colleague, On behalf of the organizing committee I am glad to announce the IEEE-INNS-ENNS International Joint Conference on Neural Networks IJCNN'2000, to be held at the Grand Hotel Cernobbio, Como, Italy, on 24-27 July 2000. The conference is organized and sponsored by the IEEE Neural Network Council, in cooperation with the International Neural Network Society and the European Neural Network Society. The preliminary call for papers can be found at the conference web site http://www.ims.unico.it/2000ijcnn.html All information about the conference will be posted there. No printed mailings will be sent this year, and emailings will be kept to a minimum to avoid bothering you. Therefore, stay tuned to the above web site! If any colleague or student of yours would like to be included in this emailing list, please forward this message to her/him: instructions for subscribing to the list are attached below.
Other events will be organized before and after the conference: they will be announced soon. Ciao and see you in Como! Vincenzo Piuri IJCNN'2000 Program Co-Chair for Europe ============================================================================= To subscribe to the list, send an email to listproc at ims.unico.it The body of the message must contain ONLY the following line SUBSCRIBE CONFERENCES lastname firstname where lastname and firstname must be replaced by your last and first name, respectively. Do not put any signature or any other message: they will be ignored and may result in error messages. ============================================================================= Vincenzo Piuri Department of Electronics and Information, Politecnico di Milano piazza L. da Vinci 32, 20133 Milano, Italy phone +39-02-2399-3606 secretary +39-02-2399-3623 fax +39-02-2399-3411 email piuri at elet.polimi.it From jairmail at ISI.EDU Mon May 17 18:26:19 1999 From: jairmail at ISI.EDU (Steve Minton) Date: Mon, 17 May 1999 15:26:19 -0700 (PDT) Subject: JAIR article, "Variational Probabilistic Inference..." Message-ID: <199905172226.PAA21005@quark.isi.edu> JAIR is pleased to announce the publication of the following article, which may be of interest to readers of this mailing list: Jaakkola, T.S. and Jordan, M.I. (1999) "Variational Probabilistic Inference and the QMR-DT Network", Volume 10, pages 291-322. Available in HTML, PDF, PostScript and compressed PostScript. For quick access via your WWW browser, use this URL: http://www.jair.org/abstracts/jaakkola99a.html More detailed instructions are below. Abstract: We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search.
We describe a variational approach to the problem of diagnostic inference in the `Quick Medical Reference' (QMR) network. The QMR network is a large-scale probabilistic graphical model built on statistical and expert knowledge. Exact probabilistic inference is infeasible in this model for all but a small set of cases. We evaluate our variational inference algorithm on a large set of diagnostic test cases, comparing the algorithm to a state-of-the-art stochastic sampling method. The article is available via: -- comp.ai.jair.papers (also see comp.ai.jair.announce) -- World Wide Web: The URL for our World Wide Web server is http://www.jair.org/ For direct access to this article and related files try: http://www.jair.org/abstracts/jaakkola99a.html -- Anonymous FTP from either of the two sites below. Carnegie-Mellon University (USA): ftp://ftp.cs.cmu.edu/project/jair/volume10/jaakkola99a.ps The University of Genoa (Italy): ftp://ftp.mrg.dist.unige.it/pub/jair/pub/volume10/jaakkola99a.ps The compressed PostScript file is named jaakkola99a.ps.Z (249K) -- automated email. Send mail to jair at cs.cmu.edu or jair at ftp.mrg.dist.unige.it with the subject AUTORESPOND and our automailer will respond. To get the PostScript file, use the message body GET volume10/jaakkola99a.ps (Note: Your mailer might find this file too large to handle.) Only one file can be requested in each message. For more information about JAIR, visit our WWW or FTP sites, or send electronic mail to jair at cs.cmu.edu with the subject AUTORESPOND and the message body HELP, or contact jair-ed at ptolemy.arc.nasa.gov. From oreilly at grey.colorado.edu Tue May 18 11:24:25 1999 From: oreilly at grey.colorado.edu (Randall C.
O'Reilly) Date: Tue, 18 May 1999 09:24:25 -0600 Subject: Postdoc Position Available Message-ID: <199905181524.JAA08370@grey.colorado.edu> A postdoctoral position is available starting immediately for someone interested in pursuing computational modeling approaches to the role of neuromodulation and/or prefrontal cortical function in cognition. The nature of the position is flexible, depending upon the individual's interest and expertise. Approaches can be focused at the neurobiological level (e.g., modeling detailed physiological characteristics of neuromodulatory systems, such as locus coeruleus and/or dopaminergic nuclei, or the circuitry of prefrontal cortex), or at the more systems/cognitive level (e.g., the nature of representations and/or the mechanisms involved in active maintenance of information within prefrontal cortex, and their role in working memory). The primary requirements for the position are a Ph.D. in the cognitive, computational, or neurosciences, and extensive experience with computational modeling work, either at the PDP/connectionist or detailed biophysical level. The postdoc is funded by a collaborative research grant involving Jonathan Cohen at the Department of Psychology and Center for the Study of Brain, Mind, and Behavior at Princeton University (http://www.csbmb.princeton.edu/ncc/jdc.html) and the Western Psychiatric Institute and Clinic at the University of Pittsburgh, and Randall O'Reilly at the Department of Psychology and Institute for Cognitive Science at the University of Colorado, Boulder (http://psych.colorado.edu/~oreilly). Further information about the resources and affiliated faculty at the possible sponsoring institutions and programs is available at: http://www.csbmb.princeton.edu, http://www.cnbc.cmu.edu, http://psych.colorado.edu/~oreilly/postdoc.html.
Interested individuals should send a curriculum vitae, representative publications, a statement of research interests, and three letters of reference via email to jdc at princeton.edu (please begin the subject with the words ``Modeling Position'') or via snail mail to Jonathan D. Cohen, Department of Psychology, Green Hall, Princeton University, Princeton, NJ 08544. We will begin reviewing applications as they are received, continuing until the position is filled. Princeton University, the University of Pittsburgh, and the University of Colorado are all equal opportunity employers; minorities and women are encouraged to apply. From ericr at ee.usyd.edu.au Tue May 18 22:42:22 1999 From: ericr at ee.usyd.edu.au (Eric Ronco) Date: Wed, 19 May 1999 12:42:22 +1000 Subject: Human motor control modelling Message-ID: <3742250E.EADC9EBF@ee.usyd.edu.au> Hello, This is to let you know of a new paper submitted to NIPS99 which presents a new model of the human movement control system. Please see the abstract below for details. This article is available at http://www.ee.usyd.edu.au/~ericr/pub/EE99003.ps.gz Title: Open-Loop Intermittent Feedback Optimal Predictive Control: a human movement control model Abstract: This paper introduces the Open-Loop Intermittent Feedback Optimal predictive (OLIFO) controller as an alternative to human movement control models based on system inverse control. OLIFO has the advantages of being applicable to any system, not requiring a desired system trajectory, and naturally handling systems with time delays and constraints. Moreover, it shares important functional characteristics with the human movement control system. Its behaviour is illustrated through the control of a six-muscle human arm model. The comparable performance obtained with the OLIFO controller and with actual subjects suggests the plausibility of this scheme.
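[Editor's note: as a rough sketch of the general "open-loop intermittent feedback" idea, and emphatically not the OLIFO controller from the paper, consider the toy below. The point-mass plant, cost weights, horizon length, and every name in the code are invented for illustration. At each feedback instant the controller observes the state, searches for the open-loop control sequence minimising a predicted cost over a short horizon, then executes that whole sequence without feedback until the next instant.]

```python
# Toy open-loop intermittent-feedback predictive control (illustrative only).
import itertools

DT = 0.1  # integration step

def simulate(x, v, controls):
    """Point mass on a line: each control is an acceleration held one step."""
    for u in controls:
        v += u * DT
        x += v * DT
    return x, v

def plan(x, v, target, horizon=4, candidates=(-1.0, 0.0, 1.0)):
    """Exhaustive search over open-loop sequences (feasible for tiny horizons)."""
    def cost(seq):
        xf, vf = simulate(x, v, seq)
        return (xf - target) ** 2 + 0.1 * vf ** 2  # reach target, end up slow
    return min(itertools.product(candidates, repeat=horizon), key=cost)

def run(x=0.0, v=0.0, target=1.0, cycles=5):
    """Each cycle is one intermittent feedback: observe, plan, act open loop."""
    for _ in range(cycles):
        seq = plan(x, v, target)   # plan from the newly observed state
        x, v = simulate(x, v, seq) # execute the whole plan without feedback
    return x, v

final_x, final_v = run()  # the point mass settles near the target
```

[For anything like a six-muscle arm, the planning step would of course be a proper optimisation rather than an exhaustive search, but the intermittent observe-plan-act cycle is the same.]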
Bye Eric -- Dr Eric Ronco, room 316 Tel: +61 2 9351 7680 School of Electrical Engineering Fax: +61 2 9351 5132 Bldg J13, Sydney University Email: ericr at ee.usyd.edu.au NSW 2006, Australia http://www.ee.usyd.edu.au/~ericr From gaj at psychology.nottingham.ac.uk Wed May 19 04:40:57 1999 From: gaj at psychology.nottingham.ac.uk (Gary Jones) Date: Wed, 19 May 1999 09:40:57 +0100 Subject: CogSci conference tutorials Message-ID: This may be of interest to the people on this mailing list. I have taken tutorials in three out of the four and found them to be very useful. The Twenty First Annual Meeting of the Cognitive Science Society will take place on August 19-21, 1999 at Simon Fraser University in Vancouver, B.C. The day before the conference, there will be an open workshop on teaching cognitive science and a tutorial programme. This year there are four tutorials on cognitive architectures, including Soar, ACT-R, Cogent, and PDP++. These tutorials are generally designed to introduce architectures to potential users. These tutorials do not teach the architectures completely, but provide enough background to understand models written in them and usually provide enough information for modellers to judge if the architecture is right for their problem. More information on the conference is available at . 
Gary Jones Psychology Department University of Nottingham Nottingham NG7 2RD England E-mail: gaj at Psychology.Nottingham.AC.UK Web: http://www.psychology.nottingham.ac.uk/staff/Gary.Jones/ From dph at cse.ucsc.edu Wed May 19 13:42:45 1999 From: dph at cse.ucsc.edu (David Helmbold) Date: Wed, 19 May 1999 10:42:45 -0700 Subject: COLT 99 registration information (plain text) Message-ID: <199905191742.KAA08500@sundance.cse.ucsc.edu> From lisa at cse.ucsc.edu Wed May 19 13:21:58 1999 From: lisa at cse.ucsc.edu (Lisa Weiss) Date: Wed, 19 May 1999 10:21:58 -0700 Subject: plain text file Message-ID: <199905191721.KAA13838@rio.cse.ucsc.edu> COLT '99 Twelfth ACM Conference on Computational Learning Theory Tuesday, July 6 through Friday, July 9, 1999 University of California, Santa Cruz, California The workshop will be held on campus, which is hidden away in the redwoods on the Pacific Coast of Northern California. The workshop is in cooperation with the ACM Special Interest Group on Algorithms and Computation Theory (SIGACT) and the ACM Special Interest Group on Artificial Intelligence (SIGART). 1. Flight tickets: San Jose Airport is the closest, about a 45 minute drive. San Francisco Airport (SFO) is about an hour and forty-five minutes away, but has slightly better flight connections. 2. Transportation from the airport to Santa Cruz: The first option is to rent a car and drive south from San Jose on Hwy 880, which becomes Hwy 17, or from San Francisco take either Hwy 280 or 101 to Hwy 17. When you get to Santa Cruz, take Route 1 (Mission St.) north. Turn right on Bay Street and follow the signs to UCSC. Commuters must purchase parking permits for $4.00/day M-F (parking is free Saturday and Sunday) from the information kiosk at the Main entrance to campus or the conference satellite office. Those staying on campus can pick up permits with their room keys. Various van services also connect Santa Cruz with the San Francisco and San Jose airports.
The Santa Cruz Airporter, (831) 423-1214 (or (800) 497-4997 from anywhere), has regularly scheduled trips from either airport: every two hours from 9am until 11pm from San Jose International Airport ($30 each way), and every two hours from 8am until 10pm from SFO ($35 each way). ABC Transportation (831) 464-8893 ((800) 734-4313 from California (24hr.)) runs a private sedan service ($47 for one, $57 for two, $67 for three to six from San Jose Airport to UC Santa Cruz, $79 for one, $89 for two, and $99 for three to six from SFO to UCSC, additional $10 after 11:30 pm, additional $20 to meet an international flight) and will drop you off at your room. Book at least 24 hours in advance. 3. Conference and room registration: Please fill out the enclosed form and send it to us with your payment. It must be postmarked by June 1 and received by June 5 to obtain the early registration rate and guarantee the room. Conference housing is limited by the available space, and late registrants may need to seek off-campus accommodations. Your arrival: This year we will be at the Kresge apartments. Enter the campus at the Main Entrance, which is the intersection of High and Bay Streets. (Look for the COLT signs.) Bay St. turns into Coolidge Dr., continue on this road, which becomes McLaughlin Dr., until you reach the stop sign at the T in the road, turn left onto Heller Dr., and then you will make a right turn into the Kresge East Apts parking lot. Housing registration will be at the Kresge East Apts parking lot from 2:00 to 4:00 pm on Monday. Keys, meal cards, parking permits, maps, and information about what to do in Santa Cruz will be available. The office will remain open until 10:00 pm for late arrivals. Arrivals after 10:00 pm: stop at the Main Entrance Kiosk and have the guard call the College Proctor, who will meet you at the Satellite Office and give you your housing materials. Please be prepared to show I.D. or you will not be permitted on campus. Problems?
Please go directly to the Kresge Satellite Office, or contact the Conference Director at (831) 459-2611. The Kresge College Conference Office is located in Apt. Building R11-Apt 1111. From the Kresge East Apts parking lot continue on Heller Dr. to the Porter College entrance, turn right into Porter College, follow the road around (curving right) to Kresge College. At Kresge College, park in the first lot on your left and look for signs for the Conference Office. Please do not park in spaces with posted restrictions at any time. In case of emergency, dial 911 from any campus phone. The weather in July is mostly sunny with occasional summer fog. Even though the air may be cool, the sun can be deceptively strong; those susceptible to sunburn should come prepared with sunblock. Bring T-shirts, slacks, shorts, and a sweater or light jacket, as it cools down at night. For information on the local bus routes and schedules, call the Metro Center at (831) 425-8600. Bring swimming trunks, tennis rackets, etc. You can get day passes for $5.00 (East Field House, Physical Education Office) to use the recreation facilities on campus. For questions about registration or accommodations, contact COLT'99, Computer Science Dept., UCSC, Santa Cruz, CA 95064. The e-mail address is colt99 at cse.ucsc.edu, and fax is (831)459-4829. For emergencies, call (831)459-2263. 4. General Conference Information: The Conference Registration will be at 8:30 Tuesday, outside the Porter dining hall. Late registrations will be at the same location during the technical sessions. All lectures will be in the Porter dining hall. A banquet will be held Tuesday from 6:30--8:00pm outside the Porter dining hall. The workshop has been organized to allow time for informal discussion and collaboration. COLT '99 CONFERENCE SCHEDULE ---------------------------- MONDAY, July 5 -------------- 2:00-4:00 pm, Housing Registration, Kresge East Apts Parking Lot.
Note: All technical sessions will take place in the Porter Dining Hall. TUESDAY, July 6 --------------- SESSION 1: 9:00 - 10:30 9:00-9:30 The Robustness of the p-norm Algorithms Claudio Gentile and Nick Littlestone 9:30-10:00 Minimax Regret under Log Loss for General Classes of Experts Nicolo Cesa-Bianchi and Gabor Lugosi 10:00-10:30 On Prediction of Individual Sequences Relative to a set of Experts Neri Merhav and Tsachy Weissman 10:30 - 11:00 BREAK SESSION 2: 11:00 - 12:00 11:00-11:20 On Theory Revision with Queries Robert H. Sloan and Gyorgy Turan 11:20-11:40 Estimating a Mixture of Two Product Distributions Yoav Freund and Yishay Mansour 11:40-12:00 An Apprentice Learning Model Stephen S. Kwek 12:00 - 2:00 LUNCH SESSION 3: 2:00 - 3:00 2:00-2:20 Uniform-Distribution Attribute Noise Learnability Nader H. Bshouty, Jeffrey C. Jackson and Christino Tamon 2:20-2:40 On Learning in the Presence of Unspecified Attribute Values Nader H. Bshouty, and David K. Wilson 2:40-3:00 Learning Fixed-dimension Linear Thresholds from Fragmented Data Paul W. Goldberg 3:00 - 3:30 BREAK TUTORIAL 1: 3:30-5:30 3:30-5:30 Boosting Yoav Freund and Rob Schapire 7:30-9:30 RECEPTION - Kresge Town Hall Area WEDNESDAY, July 7 ----------------- 9:00-10:00 INVITED TALK: by David Shmoys Approximation Algorithms for Clustering Problems 10:00-10:30 BREAK SESSION 4: 10:30-12:10 10:30-10:50 An Adaptive Version of the Boost-by-majority Algorithm Yoav Freund 10:50-11:10 Drifting Games Robert E. Schapire 11:10-11:30 Additive Models, Boosting, and Inference for Generalized Divergences John Lafferty 11:30-11:50 Boosting as Entropy Projection J. Kivinen and M. K. 
Warmuth 11:50-12:10 Multiclass Learning, Boosting, and Error-Correcting Codes Venkatesan Guruswami and Amit Sahai 12:10-2:00 LUNCH SESSION 5: 2:00-3:00 2:00-2:20 Theoretical Analysis of a Class of Randomized Regularization Methods Tong Zhang 2:20-2:40 PAC-Bayesian Model Averaging David McAllester 2:40-3:00 Viewing all Models as `Probabilistic' Peter Grunwald 3:00-3:30 BREAK TUTORIAL 2: 3:30-5:30 3:30-5:30 Reinforcement Learning Michael Kearns and Yishay Mansour 6:30-8:00 BANQUET - Porter Dining Hall THURSDAY, July 8 ---------------- SESSION 6: 9:00-10:30 9:00-9:20 Reinforcement Learning and Mistake Bounded Algorithms Yishay Mansour 9:20-9:40 Convergence Analysis of Temporal-difference Learning Algorithms Vladislav Tadic 9:40-10:00 Beating the Hold-Out Avrim Blum, Adam Kalai and John Langford 10:00-10:20 Microchoice Bounds and Self Bounding Learning Algorithms John Langford and Avrim Blum 10:30-11:00 BREAK SESSION 7: 11:00-12:00 11:00-11:20 Learning Specialist Decision Lists Atsuyoshi Nakamura 11:20-11:40 Linear Relations between Square-Loss and Kolmogorov Complexity Yuri A. Kalnishkan 11:40-12:00 Individual Sequence Prediction - Upper Bounds and Application for Complexity Chamy Allenberg 12:00-2:00 LUNCH SESSION 8: 2:00-3:00 2:00-2:20 Extensional Set Learning S. A. Terwijn 2:20-2:40 On a Generalized Notion of Mistake Bounds Sanjay Jain and Arun Sharma 2:40-3:00 On the Intrinsic Complexity of Learning Infinite Objects from Finite Samples Kinber, Papazian, Smith, and Wiehagen 2:20-2:30 Concept Learning with Geometric Hypotheses David P. Dobkin and Dimitrios Gunopulos Friday, July 9 -------------- TUTORIAL 3: 9:00-11:00 9:00-11:00 Large Margin Classification Peter Bartlett, John Shawe-Taylor, and Bob Williamson 11:00-11:30 BREAK SESSION 9: 11:30-12:10 11:30-11:50 Covering Numbers for Support Vector Machines Ying Guo, Peter L. Bartlett, John Shawe-Taylor, and Robert C. 
Williamson 11:50-12:10 Further Results on the Margin Distribution John Shawe-Taylor and Nello Cristianini 12:10-2:00 LUNCH SESSION 10: 2:00-3:40 2:00 - 2:20 Attribute Efficient PAC-learning of DNF with Membership Queries Nader H. Bshouty and Jeffrey C. Jackson and Christino Tamon 2:20 - 2:40 On PAC Learning Using Winnow, Perceptron, and a Perceptron-Like Algorithm Rocco A. Servedio 2:40 - 3:00 Extension of the PAC Framework to Finite and Countable Markov Chains David Gamarnik 3:00 - 3:20 Learning Threshold Functions with Small Weights using Membership Queries E. Abboud, N. Agha, N.H. Bshouty, N. Radwan, F. Saleh 3:20 - 3:40 Exact Learning of Unordered Tree Patterns From Queries Thomas R. Amoth, Paul Cull, and Prasad Tadepalli 3:40 CONFERENCE ENDS REGISTRATION INFORMATION Please fill in the information needed for registration and accommodations. Make your payment by check or international money order, in U.S. dollars and payable through a U.S. bank, to UC Regents/COLT '99. Mail this form together with payment (by June 4, 1999 to avoid the late fee) to: COLT '99 Computer Science Department University of California Santa Cruz, California 95064 Payment may also be made by VISA or MC, with a $15 surcharge; in that case you can fax or email the form. Questions: e-mail colt99 at cse.ucsc.edu, fax (831)459-4829. Confirmations will be sent by e-mail. Anyone needing special arrangements to accommodate a disability should enclose a note with their registration.
CONFERENCE REGISTRATION ======================= Name: ________________________________________________________________ Affiliation: _________________________________________________________ Address: _____________________________________________________________ City: _____________________________ State: ________ Zip: _________ Country: _____________________________ Telephone: _____________________________ Email address: __________________________ The registration fee includes a copy of the proceedings and the banquet dinner. ACM/SIG Members: $175 Non-Members: $240 Full time students: $100 Reg. Late Fee: $50 (rec'd after June 4) Extra banquet tickets: ____ (quantity) x $25 = _______________ How many in your party have dietary restrictions? Vegetarian: _____ Other: _____ SHIRT SIZE, please circle one: medium large x-large ACCOMMODATIONS AND DINING ------------------------- Accommodation fees are $67 per person for a double and $80 for a single per night at the Kresge Apartments. Cafeteria style breakfast (7:00 to 8:00am), lunch (12:00 to 1:00pm), and dinner (6:30 to 7:30pm) will be served in the College Eight Dining Hall. Doors close at the end of the time indicated, but dining may continue beyond this time. The first meal provided is dinner on the day of arrival and the last meal is lunch on the day you leave. NO REFUNDS can be given after June 7. Those with uncertain plans should make reservations at an off-campus hotel. Each attendee should pick one of the following options: PACKAGE #1: Mon., Tue., Wed., Thurs. nights: $268 double, $320 single. PACKAGE #2: Tue., Weds., Thurs. nights: $201 double, $240 single. OTHER housing arrangement. Each 4-person apartment has a living room, a kitchen, a common bathroom, and either four single separate rooms, two double rooms, or two single and one double room. 
We need the following information to make room assignments: Gender (M/F): __________ Smoker (Y/N): ___________ Roommate Preference: ________________________________ For shorter stays, longer stays, and other special requirements, you can get other accommodations through the Conference Office. Make reservations directly with them at (831) 459-2611, fax (831) 459-3422, and do this soon, as on-campus rooms for the summer fill up well in advance. Off-campus hotels include the Dream Inn (831) 426-4330 and the Ocean Pacific Lodge (831) 457-1234 or (800) 995-0289. AMOUNT ENCLOSED: Registration total _________________ VISA/MC $15 Charge _________________ Extra Banquet tickets _________________ Accommodations _________________ Mark Fulk Award* _________________ TOTAL _________________ * The optional donation for the Mark Fulk Award is tax deductible in the U.S.A.; please see Carl Smith for a receipt. From A.van.Ooyen at nih.knaw.nl Wed May 19 14:39:38 1999 From: A.van.Ooyen at nih.knaw.nl (Arjen van Ooyen) Date: Wed, 19 May 1999 20:39:38 +0200 Subject: Model of Axonal Competition Message-ID: <37430569.736F@nih.knaw.nl> NEW PAPER: Competition for Neurotrophic Factor in the Development of Nerve Connections A. van Ooyen & D. J. Willshaw Proc. R. Soc. Lond. B Biol. Sci. (1999) 266: 883-892. Download full text from the following website: http://www.cns.ed.ac.uk/people/arjen/competition.html Or request a reprint of the paper version (don't forget to give your address): A.van.Ooyen at nih.knaw.nl ABSTRACT The development of nerve connections is thought to involve competition among axons for survival promoting factors, or neurotrophins, which are released by the cells that are innervated by the axons. Although the notion of competition is widely used within neurobiology, there is little understanding of the nature of the competitive process and the underlying mechanisms. We present a new theoretical model to analyse competition in the development of nerve connections.
According to the model, the precise manner in which neurotrophins regulate the growth of axons, in particular the growth of the amount of neurotrophin receptor, determines what patterns of target innervation can develop. The regulation of neurotrophin receptors is also involved in the degeneration and regeneration of connections. Competition in our model can be influenced by factors dependent on and independent of neuronal electrical activity. Our results point to the need to measure directly the specific form of the regulation by neurotrophins of their receptors. -- Arjen van Ooyen, Netherlands Institute for Brain Research, Meibergdreef 33, 1105 AZ Amsterdam, The Netherlands. email: A.van.Ooyen at nih.knaw.nl website: http://www.cns.ed.ac.uk/people/arjen.html phone: +31.20.5665483 fax: +31.20.6961006 From llicht at ifi.unizh.ch Thu May 20 12:00:14 1999 From: llicht at ifi.unizh.ch (Lukas Lichtensteiger) Date: Thu, 20 May 1999 18:00:14 +0200 Subject: PhD position in Evolutionary Robotics/Artificial Life Message-ID: ---------------------------------------------------------------------- Position for a Ph.D. student in Evolutionary Robotics/Artificial Life at the AI Lab, University of Zurich ---------------------------------------------------------------------- A new PhD student position is open at the Artificial Intelligence Laboratory, Department of Information Technology, University of Zurich, Switzerland. Availability: Immediately or at earliest convenience. The position is dedicated to research in Evolutionary Robotics/Artificial Life. 
In particular, the research focus will be on one or several of the following topics: - the generation of simple artificial organisms capable of exhibiting behavior (in simulation and in the real world) - mechanisms of ontogenetic development - the interaction of microscopic (genome, cell) and macroscopic (behavior) processes - the investigation of the interdependence of environment, morphology, and control - the incorporation of biological insights at appropriate levels of abstraction into computer/robotic models Previous work on an "Artificial Evolutionary System" that implements pertinent biological concepts can serve as a starting point. This research is intended to bring together ideas from biology, engineering, and computer science in novel and productive ways. If these challenges attract your interest and if you would like to become a member of an international research team conducting transdisciplinary work, please submit a curriculum vitae, statement of research interests, and the names of three references to: Rolf Pfeifer, Director AI Lab Dept. of Information Technology University of Zurich Winterthurerstrasse 190 CH-8057 Zurich, Switzerland E-mail: pfeifer at ifi.unizh.ch Phone: +41 1 635 43 20/31 Fax: +41 1 635 68 09 Profile: Applicants should have a master's degree, or equivalent (e.g. diploma or "licence" (Lizentiat)), in one of the following areas: biology, neurobiology, computer science, electrical or mechanical engineering, physics, mathematics (or related disciplines). Ability to work in a strongly interdisciplinary team is expected. Tasks: The main task for the accepted candidate will be to conduct research towards his/her Ph.D. Additional tasks include support for classes organized by the AI-Lab. Financial: The salary will be according to the specification of the University of Zurich. Time prospect: The candidate is expected to complete his/her Ph.D. work within a period of maximum 4 years. 
---------------------------------------------------------- Prof. Dr. Rolf Pfeifer Director, Artificial Intelligence Laboratory Department of Information Technology, University of Zurich Winterthurerstrasse 190, CH-8057 Zurich, Switzerland Phone: +41 1 635 43 20/31 Fax: +41 1 635 68 09 www.ifi.unizh.ch/~pfeifer ---------------------------------------------------------- From Zoubin at gatsby.ucl.ac.uk Fri May 21 13:14:23 1999 From: Zoubin at gatsby.ucl.ac.uk (Zoubin Ghahramani) Date: Fri, 21 May 1999 18:14:23 +0100 (BST) Subject: Gatsby Unit Tutorial on Neural Computation Message-ID: <199905211714.SAA31263@cajal.gatsby.ucl.ac.uk> GATSBY UNIT TUTORIAL: NEURAL COMPUTATION Tuesday 31st August to Friday 3rd September 1999 University College London England http://www.gatsby.ucl.ac.uk/tutorial/tutorial.html The Gatsby Computational Neuroscience Unit at University College London is holding a four-day tutorial on Neural Computation from Tuesday 31st August to Friday 3rd September 1999. The tutorial is aimed at research students and postdoctoral researchers who are interested in and wish to learn more about neural computation and computational neuroscience. The tutorial is intended for researchers who are trying to understand how the brain computes, but it should also be of value to researchers who are using neural network methods for solving applied problems. The first part of the tutorial will describe techniques for learning and inference that have their roots in computer science, statistics, physics, engineering and dynamical systems. The second part will describe how these techniques have been applied to understanding computation in the brain. Tutorial Faculty: Geoff Hinton, Peter Dayan, Zoubin Ghahramani, Zhaoping Li, Hagai Attias, Sam Roweis, Emo Todorov and Carl van Vreeswijk Registration Fee: GBP 25.00. Includes course papers and light refreshments. 
Further information on the tutorial contents and on-line registration is available at: http://www.gatsby.ucl.ac.uk/tutorial/tutorial.html From NKasabov at infoscience.otago.ac.nz Sat May 22 17:26:17 1999 From: NKasabov at infoscience.otago.ac.nz (Nik Kasabov) Date: Sun, 23 May 1999 09:26:17 +1200 Subject: Three TRs and software on on-line learning and applications Message-ID: Dear colleagues, Three Technical Reports and software functions in MATLAB on evolving connectionist systems for on-line learning and their applications for on-line adaptive speech recognition and dynamic time series prediction are available from: http://divcom.otago.ac.nz/infoscience/kel/CBIIS.html (software/EFuNN) regards, Nik Kasabov -------------------------------------------- Prof. Dr. Nikola (Nik) Kasabov Department of Information Science University of Otago, P.O. Box 56, Dunedin, New Zealand, email: nkasabov at otago.ac.nz phone: +64 3 479 8319, fax: +64 3 479 8311 http://divcom.otago.ac.nz:800/infosci/Staff/NikolaK.htm -------------------------------------------- TR99/02 N.Kasabov, Evolving Connectionist Systems for On-line, Knowledge-based Learning: Principles and Applications, TR99/02, Department of Information Science, University of Otago, New Zealand Abstract. The paper introduces evolving connectionist systems (ECOS) as an effective approach to building on-line, adaptive intelligent systems. ECOS evolve through incremental, hybrid (supervised/unsupervised), on-line learning. They can accommodate new input data, including new features, new classes, etc. through local element tuning. New connections and new neurons are created during the operation of the system. The ECOS framework is presented and illustrated on a particular type of evolving neural networks - evolving fuzzy neural networks (EFuNNs). EFuNNs can learn spatial-temporal sequences in an adaptive way through one-pass learning. Rules can be inserted and extracted at any time during system operation.
The characteristics of ECOS and EFuNNs are illustrated on several case studies that include: adaptive pattern classification; adaptive, phoneme-based spoken language recognition; adaptive dynamic time-series prediction; intelligent agents. ---------------------------------------------------------------------------- TR99/03 N.Kasabov, M.Watts, Spatial-Temporal Adaptation in Evolving Fuzzy Neural Networks for On-line Adaptive Phoneme Recognition, TR99/03, Department of Information Science, University of Otago, New Zealand Abstract. The paper is a study on the spatial-temporal characteristics of evolving fuzzy neural network systems (EFuNNs) for on-line adaptive learning. These characteristics are important for the task of adaptive, speaker-independent spoken language recognition, where new pronunciations and new accents need to be learned in an on-line, adaptive mode. Experiments with EFuNNs, as well as with multi-layer perceptrons and fuzzy neural networks (FuNNs), conducted on the whole set of 43 New Zealand English phonemes, show the superiority and the potential of EFuNNs when used for the task. Spatial allocation of nodes and their aggregation in EFuNNs allow for similarity preservation and similarity observation within the data of one phoneme and across phonemes, while subtle temporal variations within the data of one phoneme can be learned and adjusted through temporal feedback connections. The experimental results support the claim that spatial-temporal organisation in EFuNNs can lead to a significant improvement in the recognition rate, especially for the diphthong and the vowel phonemes in English, which in many cases are problematic for a system to learn and adjust in an on-line, adaptive way.
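The core ECOS idea described in the abstracts above — accommodating new data through local element tuning and creating new nodes during operation — can be sketched as follows. This is an illustrative toy, not the published EFuNN algorithm: the similarity measure, thresholds, and update rule are all assumptions made for the sketch. A new rule node is grown when no existing node matches the input closely enough; otherwise only the winning node is tuned, in one pass.

```python
import numpy as np

class EvolvingRuleLayer:
    """Minimal one-pass, on-line rule-node evolution in the spirit of
    ECOS/EFuNN. Thresholds and updates are illustrative assumptions."""

    def __init__(self, sensitivity=0.7, lr=0.1):
        self.sensitivity = sensitivity  # min similarity to reuse a node
        self.lr = lr                    # local learning rate
        self.nodes = []                 # rule-node centres (input prototypes)
        self.outputs = []               # associated output vectors

    def _similarity(self, x, w):
        # 1 - normalised distance: a simple stand-in for fuzzy membership
        return 1.0 - np.linalg.norm(x - w) / (
            np.linalg.norm(x) + np.linalg.norm(w) + 1e-12)

    def learn_one(self, x, y):
        """Accommodate one (input, output) example in a single pass."""
        if self.nodes:
            sims = [self._similarity(x, w) for w in self.nodes]
            best = int(np.argmax(sims))
            if sims[best] >= self.sensitivity:
                # local element tuning: nudge only the winning node
                self.nodes[best] += self.lr * (x - self.nodes[best])
                self.outputs[best] += self.lr * (y - self.outputs[best])
                return best
        # no node is close enough: grow a new rule node
        self.nodes.append(np.array(x, dtype=float))
        self.outputs.append(np.array(y, dtype=float))
        return len(self.nodes) - 1

layer = EvolvingRuleLayer()
layer.learn_one(np.array([0.1, 0.9]), np.array([1.0]))   # first node
layer.learn_one(np.array([0.9, 0.1]), np.array([0.0]))   # dissimilar -> new node
layer.learn_one(np.array([0.12, 0.88]), np.array([1.0])) # similar -> tunes node 0
print(len(layer.nodes))  # -> 2
```

Because learning is local and incremental, such a layer can keep absorbing new classes and inputs during operation, which is the property the TR abstracts emphasise; the real EFuNN additionally maintains fuzzy membership functions and supports rule insertion and extraction.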
---------------------------------------------------------------------------- TR99/04 N.Kasabov and Q.Song, Dynamic Evolving Fuzzy Neural Networks with 'm-out-of-n' Activation Nodes for On-line Adaptive Systems, TR99/04, Department of Information Science, University of Otago, New Zealand Abstract. The paper introduces a new type of evolving fuzzy neural networks (EFuNNs), denoted as mEFuNNs, for on-line learning and their applications for dynamic time series analysis and prediction. At each time moment, the output vector of an mEFuNN is calculated based on the m most activated rule nodes. Two approaches are proposed: (1) using weighted fuzzy rules of Zadeh-Mamdani type; (2) using Takagi-Sugeno fuzzy rules that utilise dynamically changing and adapting values for the inference parameters. It is shown that the mEFuNNs can effectively learn complex temporal sequences in an adaptive way and outperform EFuNNs, ANFIS and other connectionist and hybrid connectionist models. The characteristics of the mEFuNNs are illustrated on two benchmark dynamic time series data sets, as well as on two real case studies for on-line adaptive control and decision making. Aggregation of rule nodes in evolved mEFuNNs can be achieved through the fuzzy C-means clustering algorithm, which is also illustrated on the benchmark data sets. The mEFuNNs that are regularly trained and aggregated in an on-line, self-organised mode perform as well as, or better than, the mEFuNNs that use the fuzzy C-means clustering algorithm for off-line rule node generation on the same data set. ------------------------------------------------------------------- From uwe.zimmer at gmd.gr.jp Tue May 25 06:58:37 1999 From: uwe.zimmer at gmd.gr.jp (Uwe R.
Zimmer) Date: Tue, 25 May 1999 19:58:37 +0900 Subject: PhD thesis (Giovanni Indiveri) Message-ID: <374A825D.E9D23E81@gmd.gr.jp> Dear Colleagues, my PhD thesis "Modelling and Identification of Underwater Robotic Systems" is available in pdf format at the URL: http://www.gmd.gr.jp/JRL/publications.html#98 Please find in the following its abstract and table of contents. Best wishes, Giovanni Indiveri ABSTRACT Whatever strategy is pursued to design a control system or a state estimation filter for an underwater robotic system, knowledge of its identified model is very important. As far as ROVs are concerned, the results presented in this thesis suggest that low-cost, on-board-sensor-based identification is feasible: the detailed analysis of the residual least-squares costs and of the estimated parameter variances shows that a decoupled vehicle model can be successfully identified by swimming-pool tests, provided that a suitable identification procedure is designed and implemented. A two-step identification procedure has been designed on the basis of: (i) the vehicle model structure, which is analyzed in depth in the first part of this work; (ii) the type of available sensors; and (iii) the actuator dynamics. First the drag coefficients are evaluated by constant-speed tests, and afterwards, with the aid of their knowledge, a sub-optimal sinusoidal input thrust is designed in order to identify the inertia parameters. Extensive experimental activity on the ROMEO ROV of CNR-IAN has shown the effectiveness of this approach. Moreover, it has been shown that standard unmanned underwater vehicle models may need, as for the ROMEO ROV, to take into account propeller-propeller and propeller-hull interactions, which have a highly relevant influence on the system dynamics (up to 50% efficiency loss in the applied thrust with respect to the nominal model).
It has been shown that such phenomena can be correctly modelled by an efficiency parameter, and experimental results concerning its identification on a real system have been extensively analyzed. The estimated parameter variances are generally relatively low, especially for the drag coefficients, confirming the effectiveness of the adopted identification scheme. The surge drag coefficients have been estimated for two different vehicle payload configurations, i.e. carrying a plankton sampling device or a Doppler velocimeter (see chapter 4 for details), and the results show that in the considered surge velocity range (|u| < 1 m/s) the drag coefficients are different, but perhaps less than expected. Moreover, it has been shown that in the usual operating yaw rate range (< 10 deg/s) drag is better modeled by a simple linear term rather than by both a linear and a quadratic one. This is interesting as it suggests that the control system for the yaw axis of a slow-motion, open-frame ROV can be realized with standard linear control techniques. For a detailed description of the identification procedure and of the identification results for the ROMEO ROV, consult chapter 4. In the last part of this thesis the issue of planar motion control of a nonholonomic vehicle has been addressed. Inspired by the previous works of Casalino et al. and Aicardi et al. regarding a unicycle-like kinematic model, a novel globally asymptotically convergent smooth feedback control law for the point stabilization of a car-like robot has been developed. The resulting linear velocity does not change sign, curvature is bounded, and the target is asymptotically approached on a straight line. Applications to the control of underwater vehicles are discussed and extensive simulations are performed in order to analyze the algorithm's behaviour with respect to actuator saturation.
It is analytically shown that convergence is achieved also in the presence of actuator saturation, and simulations are performed to evaluate the control law's performance with and without actuator saturation. Moreover, the generation of smooth paths having minimum square curvature, integrated over length, is addressed and solved with variational calculus in 3D for an arbitrary curve parametrization. The plane projections of such paths are shown to be least-yaw-drag-energy paths for the 2D underwater motion of rigid bodies.

______________________________________________________________
TABLE OF CONTENTS

1 Introduction 9
  1.1 Motivations and Objectives 9
  1.2 Outline of the work 11
  1.3 Acknowledgments 12
2 Kinematics 13
  2.1 Vectors 13
    2.1.1 Vector notation 13
    2.1.2 Time derivatives of vectors 14
      Poisson Formula 15
      Velocity composition rules 17
    2.1.3 On useful vector operations properties 19
3 Dynamics 21
  3.1 Rigid body Newton-Euler equations 21
  3.2 Fluid forces and moments on a rigid body 26
    3.2.1 The Navier Stokes equation 26
    3.2.2 Viscous effects 28
      Viscous drag forces 28
      Lift forces 29
    3.2.3 Added mass effects 30
      On the properties of ideal fluids 30
      Dynamic pressure forces and moments on a rigid body 33
    3.2.4 Current effects 36
    3.2.5 Weight and buoyancy 37
  3.3 Underwater Remotely Operated Vehicles Model 37
    3.3.1 Thruster dynamics 38
    3.3.2 Overall ROV Model 40
  3.4 Underwater Manipulator Model 41
4 Identification 43
  4.1 Estimation approach 43
    4.1.1 Least Squares Technique 44
    4.1.2 Consistency and Efficiency 47
    4.1.3 On the normal distribution case 47
    4.1.4 Measurement variance estimation 49
  4.2 On board sensor based ROV identification 49
    4.2.1 Model structure 50
    4.2.2 Thruster model identification 54
    4.2.3 Off line velocity estimation 55
    4.2.4 Heave model identification 58
    4.2.5 Yaw model identification 70
    4.2.6 Surge model identification 84
    4.2.7 Sway model identification 89
    4.2.8 Inertia parameters identification 94
    4.2.9 Surge inertia parameter identification 97
    4.2.10 Yaw inertia parameter identification 100
  4.3 Summary 105
5 Motion control and path planning 107
  5.1 2D motion control of a nonholonomic vehicle 107
    5.1.1 A state feedback solution for the unicycle model 109
    5.1.2 A state feedback solution for a more general model 112
  5.2 Path Planning 126
    5.2.1 Curvature 128
    5.2.2 Planning criterion: a variational calculus approach 129
    5.2.3 Solution properties 135
    5.2.4 Solution examples 137
References 145
___________________________________________________________

Giovanni Indiveri, Dr.
Visiting Researcher at GMD-Japan Research Laboratory, Kitakyushu, Japan.
mailto:giovanni.indiveri at gmd.de
URL: http://www.gmd.gr.jp
voice +81 93 512 1566 /// fax +81 93 512 1588
___________________________________________________________

From Richard_Kempter at physik.tu-muenchen.de Tue May 25 10:30:05 1999 From: Richard_Kempter at physik.tu-muenchen.de (Richard Kempter) Date: Tue, 25 May 1999 16:30:05 +0200 Subject: Hebbian Learning and Spiking Neurons Message-ID: <199905251430.QAA18781@srv.cip.physik.tu-muenchen.de> The following paper has appeared in Physical Review E, 59:4498-4514, 1999: Hebbian Learning and Spiking Neurons by R. Kempter, W. Gerstner and J.L. van Hemmen Since we are out of reprints, copies of the paper are now available from http://diwww.epfl.ch/lami/team/gerstner/wg_pub.html Abstract: A correlation-based (``Hebbian'') learning rule at the spike level is formulated, mathematically analyzed, and compared to learning in a firing-rate description. As for spike coding, we take advantage of a ``learning window'' that describes the effect of the timing of pre- and postsynaptic spikes on synaptic weights. A differential equation for the learning dynamics is derived under the assumption that the time scales of learning and spiking dynamics can be separated. Formation of structured synapses is analyzed for a Poissonian neuron model which receives time-dependent stochastic input.
It is shown that correlations between input and output spikes tend to stabilize structure formation. With an appropriate choice of parameters, learning leads to an intrinsic normalization of the average weight and the output firing rates. Noise generates diffusion-like spreading of synaptic weights. From georgiou at csusb.edu Wed May 26 02:30:58 1999 From: georgiou at csusb.edu (georgiou@csusb.edu) Date: Tue, 25 May 1999 23:30:58 -0700 (PDT) Subject: CFP: 4th ICCIN (Feb. 27-Mar. 3, 2000) Message-ID: <199905260630.XAA17882@mail.csusb.edu>

Call for Papers
4th International Conference on COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE
http://www.csci.csusb.edu/iccin
Trump Taj Mahal Casino and Resort, Atlantic City, NJ USA
February 27 -- March 3, 2000
Summary Submission Deadline: September 1, 1999

Conference Co-chairs: Subhash C. Kak, Louisiana State University; Jeffrey P. Sutton, Harvard University

This conference is part of the Fourth Joint Conference on Information Sciences. http://www.ee.duke.edu/JCIS/

Plenary Speakers include the following: James Anderson, Wolfgang Banzhaf, B. Chandrasekaran, Lawrence J. Fogel, Walter J. Freeman, David E. Goldberg, Irwin Goodman, Stephen Grossberg, Thomas S. Huang, Janusz Kacprzyk, A. C. Kak, Subhash C. Kak, John Mordeson, Kumpati S. Narendra, Anil Nerode, Hung T. Nguyen, Jeffrey P. Sutton, Ron Yager.

Special Sessions on Quantum Computation. More to be added.
Areas for which papers are sought include:

o Artificial Life
o Artificially Intelligent NNs
o Associative Memory
o Cognitive Science
o Computational Intelligence
o Efficiency/Robustness Comparisons
o Evolutionary Computation for Neural Networks
o Feature Extraction & Pattern Recognition
o Implementations (Electronic, Optical, Biochips)
o Intelligent Control
o Learning and Memory
o Neural Network Architectures
o Neurocognition
o Neurodynamics
o Optimization
o Parallel Computer Applications
o Theory of Evolutionary Computation

Summary Submission Deadline: September 1, 1999
Notification of authors upon review: November 1, 1999
Deadline for invited sessions and exhibition proposals: December 1, 1999

Papers will be accepted based on summaries. A summary shall not exceed 4 pages of 10-point font, double-column, single-spaced text, with figures and tables included. For the Fourth ICCIN, send 3 copies of summaries to:

George M. Georgiou
Computer Science Department
California State University
San Bernardino, CA 92407-2397 U.S.A.
georgiou at csci.csusb.edu

From tgd at cs.orst.edu Wed May 26 10:38:57 1999 From: tgd at cs.orst.edu (Thomas G. Dietterich) Date: Wed, 26 May 1999 16:38:57 +0200 Subject: 2 papers on Hierarchical Reinforcement Learning Message-ID: <1966-Wed26May1999163857+0200-tgd@cs.orst.edu> The following two papers are available from the Computing Research Repository (CoRR) (http://xxx.lanl.gov/archive/cs/intro.html or its mirror sites). They can also be retrieved from the Reinforcement Learning Repository http://web.cps.msu.edu/rlr/ or from my home page: http://www.cs.orst.edu/~tgd/cv/pubs.html. Number: cs.LG/9905014 Title: Hierarchical Reinforcement Learning with the MAXQ Value Function Decomposition Authors: Thomas G.
Dietterich Comments: 63 pages, 15 figures Subj-class: Learning ACM-class: I.2.6 This paper presents the MAXQ approach to hierarchical reinforcement learning based on decomposing the target Markov decision process (MDP) into a hierarchy of smaller MDPs and decomposing the value function of the target MDP into an additive combination of the value functions of the smaller MDPs. The paper defines the MAXQ hierarchy, proves formal results on its representational power, and establishes five conditions for the safe use of state abstractions. The paper presents an online model-free learning algorithm, MAXQ-Q, and proves that it converges with probability 1 to a kind of locally-optimal policy known as a recursively optimal policy, even in the presence of the five kinds of state abstraction. The paper evaluates the MAXQ representation and MAXQ-Q through a series of experiments in three domains and shows experimentally that MAXQ-Q (with state abstractions) converges to a recursively optimal policy much faster than flat Q learning. The fact that MAXQ learns a representation of the value function has an important benefit: it makes it possible to compute and execute an improved, non-hierarchical policy via a procedure similar to the policy improvement step of policy iteration. The paper demonstrates the effectiveness of this non-hierarchical execution experimentally. Finally, the paper concludes with a comparison to related work and a discussion of the design tradeoffs in hierarchical reinforcement learning. (168kb) Number: cs.LG/9905015 Title: State Abstraction in MAXQ Hierarchical Reinforcement Learning Authors: Thomas G. Dietterich Comments: 7 pages, 2 figures Subj-class: Learning ACM-class: I.2.6 Many researchers have explored methods for hierarchical reinforcement learning (RL) with temporal abstractions, in which abstract actions are defined that can perform many primitive actions before terminating.
However, little is known about learning with state abstractions, in which aspects of the state space are ignored. In previous work, we developed the MAXQ method for hierarchical RL. In this paper, we define five conditions under which state abstraction can be combined with the MAXQ value function decomposition. We prove that the MAXQ-Q learning algorithm converges under these conditions and show experimentally that state abstraction is important for the successful application of MAXQ-Q learning. (37kb) From mschmitt at lmi.ruhr-uni-bochum.de Wed May 26 08:01:17 1999 From: mschmitt at lmi.ruhr-uni-bochum.de (Michael Schmitt) Date: Wed, 26 May 1999 14:01:17 +0200 Subject: Preprint: Complexity of learning for spiking neurons Message-ID: <374BE28C.D9678B93@lmi.ruhr-uni-bochum.de> Dear colleagues, the following paper has been accepted for publication in "Information and Computation" and is available from http://www.cis.tu-graz.ac.at/igi/maass/96.ps.gz or http://www.ruhr-uni-bochum.de/lmi/mschmitt/spikingneurons.ps.gz (24 pages, 136K). Regards, Michael Schmitt ------------------------------------------------------------ TITLE: On the Complexity of Learning for Spiking Neurons with Temporal Coding AUTHORS: Wolfgang Maass and Michael Schmitt ABSTRACT Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. In a network of spiking neurons a new set of parameters becomes relevant which has no counterpart in traditional neural network models: the time that a pulse needs to travel through a connection between two neurons (also known as delay of a connection). It is known that these delays are tuned in biological neural systems through a variety of mechanisms. In this article we consider the arguably most simple model for a spiking neuron, which can also easily be implemented in pulsed VLSI. 
We investigate the VC dimension of networks of spiking neurons where the delays are viewed as programmable parameters and we prove tight bounds for this VC dimension. Thus we get quantitative estimates for the diversity of functions that a network with fixed architecture can compute with different settings of its delays. In particular, it turns out that a network of spiking neurons with $k$ adjustable delays is able to compute a much richer class of functions than a threshold circuit with $k$ adjustable weights. The results also yield bounds for the number of training examples that an algorithm needs for tuning the delays of a network of spiking neurons. Results about the computational complexity of such algorithms are also given. -- Michael Schmitt LS Mathematik & Informatik, Fakultaet fuer Mathematik Ruhr-Universitaet Bochum, D-44780 Bochum, Germany Phone: +49 234 700-3209, Fax: +49 234 7094-465 From zhaoping at gatsby.ucl.ac.uk Mon May 24 07:07:24 1999 From: zhaoping at gatsby.ucl.ac.uk (Dr Zhaoping Li) Date: Mon, 24 May 1999 12:07:24 +0100 Subject: Paper available on Visual Segmentation by intracortical interactions in V1 Message-ID: <199905241107.MAA17223@vision.gatsby.ucl.ac.uk> Paper on Visual Segmentation by intracortical interactions in V1 Published in NETWORK: COMPUTATION IN NEURAL SYSTEMS, vol. 10, number 2, May 1999, pages 187-212 Available at the Online Service of the Journal --- http://www.iop.org/EJ/S Or on the website --- http://www.gatsby.ucl.ac.uk/~zhaoping/preattentivevision.html Title: Visual segmentation by contextual influences via intracortical interactions in primary visual cortex. Author: Zhaoping LI Abstract: Stimuli outside classical receptive fields have been shown to exert significant influence over the activities of neurons in primary visual cortex. We propose that contextual influences are used for pre-attentive visual segmentation.
The difference between contextual influences near and far from region boundaries makes neural activities near region boundaries higher than elsewhere, making boundaries more salient for perceptual pop-out. The cortex thus computes {\it global} region boundaries by detecting the breakdown of homogeneity or translation invariance in the input, using {\it local} intra-cortical interactions mediated by the horizontal connections. This proposal is implemented in a biologically based model of V1, and demonstrated using examples of texture segmentation and figure-ground segregation. The model is also the first that performs texture or region segmentation in exactly the same neural circuit that solves the dual problem of the enhancement of contours, as is suggested by experimental observations. The computational framework in this model is simpler than previous approaches, making it implementable by V1 mechanisms, though higher level visual mechanisms are needed to refine its output. However, it easily handles a class of segmentation problems that are known to be tricky. Its behavior is compared with psychophysical and physiological data on segmentation, contour enhancement, contextual influences, and other phenomena such as asymmetry in visual search. From FYFE-CI0 at wpmail.paisley.ac.uk Fri May 28 04:55:25 1999 From: FYFE-CI0 at wpmail.paisley.ac.uk (COLIN FYFE) Date: Fri, 28 May 1999 08:55:25 +0000 Subject: Jobs in Computational Intelligence Message-ID: Applications are now sought for the following posts from those in the Computational Intelligence community: Chair of Computational Intelligence Department of Computing and Information Systems The University of Paisley, Paisley, Scotland. This senior post carries a salary of 36981 pa (award pending) and is an additional post intended to strengthen the Department's research group in this area. 
There are already 6 promoted academics in this area, which covers the groups involved in:

Artificial Neural Networks
Statistical Signal Processing
Applied Artificial Intelligence
Artificial Life

You should have a substantial track record of publications and be able to demonstrate leadership in research for this post. There are also three posts at the Lecturer/Senior Lecturer level. Candidates for the Senior Lecturer post should also be able to demonstrate substantial research in this area or in Information Systems.

Senior Lecturer Salary Scale: 26146 to 33798 pa (award pending)
Lecturer Salary Scale: 15885 to 28507 pa (award pending)

Recruitment packs are available from the Personnel Office, University of Paisley, Paisley PA1 2BE, tel: 0141 848 3940. Closing date is 16th June 1999. Informal enquiries may be made to Prof Malcolm Crowe, Head of Department, on 0141 848 3300 or to Colin Fyfe on 0141 848 3305. http://cis.paisley.ac.uk

From gsiegle at sciences.sdsu.edu Mon May 31 01:12:30 1999 From: gsiegle at sciences.sdsu.edu (Greg Siegle) Date: Sun, 30 May 1999 22:12:30 -0700 (PDT) Subject: Dissertation available on attention in depression Message-ID: <199905310512.WAA26907@sciences.sdsu.edu> Dear Connectionists, The following dissertation is now available on-line at http://www.sci.sdsu.edu/CAL/greg/dissert/ Cognitive and Physiological Aspects of Attention to Personally Relevant Negative Information in Depression by Greg Jeremy Siegle Abstract Evidence suggests depressed individuals pay excessive attention to negative information. The current research investigates the nature and clinical implications of such attention biases. A computational neural network, reflecting interacting brain systems that identify emotional and nonemotional aspects of information, is described in which depression is identified with strongly learning certain negative information.
The model's behavior suggested that depressed people are reminded of, and attend to, personally relevant negative information in response to many stimuli. Predictions for depressed and nondepressed individuals' reaction times, signal detection rates, and the time course of cognitive load in response to emotional stimuli were derived from the computational model. To evaluate these predictions, pupil dilations and reaction times were collected from 24 unmedicated depressed and 25 nondepressed adults in response to emotional lexical decision and valence identification tasks. Pupil dilation was used to index cognitive load. Mixed ANOVA planned contrasts were employed to evaluate predictions. In support of model-derived predictions, depressed individuals rated many more stimuli as negative than nondepressed individuals did. The network's behavior suggested that depressed individuals would be quicker to say that negative words were negative than that positive words were positive, and that this difference would be reduced in nondepressed individuals. This prediction was supported empirically. Principal components analysis of pupil dilations revealed early attentional components (at or before reaction times) and late, possibly ruminative, components (peaking 2 and 4 seconds after reaction times). The computational model suggested that cognitive load, indexed by pupil dilation, would be highest for nondepressed individuals during early stages of attention but highest for depressed individuals during later stages of attention. This prediction was supported. Contrary to predictions, differences in depressed individuals' dilations to positive and negative stimuli were not detected. These data suggest depressed individuals may not initially attend to the content of presented information, but may quickly associate any incoming information with whatever made them depressed. Sustained attention to personally relevant negative information may characterize depressive attention biases.
Targeting implicated cognitive and brain processes may improve interventions for depression. ----------------------------------------- Greg Siegle, Ph.D. San Diego State University / University of California, San Diego / University of Toronto www.sci.sdsu.edu/CAL/greg.html 416-979-4747 x2376 The Clarke Institute, 250 College St., Toronto, ON M5T 1R8 Canada Visit the Connectionist Models of Cognitive, Affective, Brain, and Behavioral Disorders website at www.sci.sdsu.edu/CAL/connectionist.models